At around 05:00 on a misty morning in early June, a group of engineers, computer scientists and researchers boarded a boat on the remote Scottish islands of Orkney. At around midnight the same day they docked back at the port they'd left that morning. In the intervening 19 hours a massive infrastructure project had been completed: a watertight data centre, comprising 864 servers across 12 racks, was lowered to the bottom of the ocean.
A giant barge, equipped with cranes, dragged the data centre out to its resting place. Once it was secured on a rock slab 117 feet below the surface, engineers used an underwater drone to connect it to a previously laid cable. "There's always the risk that there might be something wrong because we haven't seen the cable in a while," says Ben Cutler, who leads Microsoft's underwater data centre effort, dubbed Project Natick.
The purpose of the ambitious project? To potentially reduce the colossal amounts of energy needed for the world's increasingly greedy consumption of data. Microsoft has been working on Project Natick since 2014 and previously tested a small-scale submerged data centre in the Pacific Ocean for 105 days.
The new 40-foot-long structure uses compression technology from submarines and has been designed to sit at the bottom of the sea for up to five years. It will be operational for at least a year so Microsoft can see how it performs, and during this time engineers cannot physically access the capsule. If the technology inside breaks, there will be no hope of repair.
The point of submerging the data centre is simple: the cold sea water could help to reduce the cost of cooling the servers inside the metallic tube. The cable attached to the data centre provides power generated from renewable sources on land and also carries the internet connection back to shore. Microsoft says the data centre is as powerful as "several thousand high-end consumer PCs".
And as the world grows ever hungrier for data, companies are struggling to get a grip on its environmental impact. "It's not going away, it's getting bigger," says Colin Pattinson, the dean of Leeds Beckett University's School of Computing, Creative Technologies and Engineering. "Everything we produce, everything we create, is becoming more data intensive," he adds.
European Commission research says the ICT industry, which includes data centres, creates up to two per cent of global CO2 emissions. Data centres also have the fastest-growing carbon footprint of any area of the ICT sector, mainly due to the growth of cloud computing and overall internet use. The UK, Germany, France and the Netherlands host the biggest data centres in Europe, and it's estimated that by 2020 data centres in the EU will consume 104 terawatt-hours of energy per year - forecast to be around four per cent of all energy usage.
So, if data centres can be sunk to the bottom of the ocean to reduce the amount of electricity they need, can't we put them all down there? The short answer: no. There are more complex factors at play. Currently, Microsoft is unsure whether the latest phase of its experiment will be successful. Things could go wrong, servers could break down and the whole project could sink without a trace.
There are also considerations around environmental impact. Cutler says that during the first 105-day test phase the data centre emitted only a tiny amount of heat into the ocean, and he expects the same this time around. The heat being released is monitored with sensors, as is the noise the data centre creates. "If you looked at the phase one data centre we found that within days it was overrun with sealife," he says. "We had fish around us, crabs crawling all over the place." From cameras positioned around the Orkney data centre, he says, there are already "some very exotic looking creatures scooting around".
"In a sense, it's worth a try," says Pattinson, who has studied the environmental impact of data centres. "The most recent drives to reduce energy consumption have been to take advantage of climatic conditions." In the last decade, data centres have been built in countries with cooler climates, such as Iceland. "The data centres located in Scandinavia and Northern Europe are more energy efficient, mainly due to the cooler ambient conditions that facilitate economiser use," the European Commission research states.
Pattinson says anything that helps to reduce the energy consumption of data centres is a positive. A few years ago, techniques such as server virtualisation, free air cooling and the exploitation of cooler climates made a big difference - but those efficiency gains are getting smaller by the day.
So while there may not be a radical solution for cutting data centre energy use, emerging technologies might improve efficiency still further. One of the biggest areas for potential gains is using artificial intelligence and machine learning to better understand the data. Algorithms developed by Google's DeepMind have shown it is possible to reduce the energy used for cooling its data centres by as much as 40 per cent.
"Effectively what we're trying to do now is squeeze yet more savings out of the same basic tech," Pattinson adds. "We might reduce the rate of the increase but there will still be an increase in the energy demands that data centres create because of the volumes of data we're producing."
Microsoft's Cutler adds that the firm is also exploring using underwater data centres as artificial reefs - man-made structures that can be used to promote wildlife. Around the world, old trains, tanks and ships have been turned into artificial reefs to improve marine environments.
But data centres are big business, especially as the cloud and AI arms of tech's biggest firms grow. (Apple says all of its data centres are fully powered by renewable energy, Facebook is working on getting to 100 per cent renewable energy, Google will purchase 100 per cent renewable energy to match its data use in 2018 and Microsoft says all of its data centres are carbon neutral.) More than $20 billion was spent on buildings for data centres in 2017, according to research from real estate firm CBRE. And Microsoft has an economic case for attempting Project Natick.
"You could imagine a data centre in the ocean that could have more than one of these," Cutler says. "There could be a whole lot and they could be connected." He predicts it will be possible for Microsoft to manufacture the underwater data centres within 90 days and install them near cities. "Imagine we have a factory where we build these data centres and that's where the servers show up. They get outfitted, sealed into these containers, sit in inventory for a day, and then someone calls to say they need two megawatts of capacity off the coast of a country."
Beatrice Nicolas-Meunier, a project manager at engineering firm Naval Group which built the Natick data centre using submarine tech, says scaling isn't a problem. "On the technology part, we can go larger and bigger," she explains. Any potential issues come from transporting larger objects. She says the company would have to adopt specialist marine technology to move such giant data centres. "If Microsoft tomorrow wanted to develop a very large one we will have to use the same tools," Nicolas-Meunier explains. "The limitation is the tools around the deployment of such a vessel."
If everything goes to plan for Microsoft, Cutler imagines creating underwater data centres off the shores of countries around the world, adding that in most cases cities with large populations are only 200 kilometres from the shore. And moving data centres closer to cities could have another benefit: if data doesn't have so far to travel, connections run faster. But don't expect them to be added to lakes or other bodies of water. "The nice thing about the ocean is there's always a current," he says. "If I had a stagnant body of water, then I've got an issue: where is the heat going to go?"