by Gage Taylor
Microsoft Research’s New Experiences and Technologies wing (NExT) recently went public with Project Natick, a data center enclosed in a steel capsule designed to rest on the ocean floor. If developed at commercial scale, the concept could represent a major step forward for data storage: the capsules can be deployed quickly, they reduce consumer latency by sitting near the coasts, where most of the world’s population lives, and they save on the cooling costs that burden traditional server farms. The project may also answer the tech world’s growing energy demands, as Microsoft is attempting to pair the system with wind or hydropower generation. Because the capsule would draw its electricity from the ocean’s own renewable energy, no net energy would be added to the ocean, and so no overall heating would occur, a conclusion the early research supports.
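The latency argument comes down to simple physics: light in optical fiber travels at roughly two-thirds the speed of light in a vacuum, so every kilometer of distance adds propagation delay. The sketch below is a back-of-envelope check of that claim, using illustrative distances that are assumptions, not figures from the project, and ignoring routing and switching overhead.

```python
# Back-of-envelope propagation delay: best-case round-trip time over fiber.
# Distances below are illustrative assumptions, not Project Natick figures.

SPEED_IN_FIBER_M_S = 2.0e8  # light in fiber travels at roughly 2/3 of c

def round_trip_ms(distance_km: float) -> float:
    """Best-case round-trip time in milliseconds for a one-way distance."""
    return 2 * (distance_km * 1000) / SPEED_IN_FIBER_M_S * 1000

# An offshore capsule ~200 km from a coastal city vs. a server farm
# ~4,000 km inland:
coastal = round_trip_ms(200)
inland = round_trip_ms(4000)
print(f"coastal: {coastal:.0f} ms, inland: {inland:.0f} ms")
```

Even before queuing and processing delays, the distant data center starts tens of milliseconds behind, which is why proximity to coastal population centers matters.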
Coupled with the announcement were the initial reports from the project’s first test vessel, codenamed the “Leona Philpot” after a character from Microsoft’s Halo franchise, which spent 105 days submerged off the central California coast near San Luis Obispo. The trial proved more successful than anticipated. Because researchers worried about hardware failures, the capsule was outfitted with more than 100 sensors to measure pressure, motion, humidity, and other environmental conditions; everything went according to plan, and the capsule came back intact. The environmental impact, naturally one of the largest potential concerns with the project, appeared negligible: the system gave off “extremely” little heat, and the sound of its spinning drives and fans was drowned out by the clicking of shrimp swimming alongside it.
The research group has begun work on a system three times as large, built in collaboration with an as-yet-unannounced hydropower developer. Expected to begin next year, the trial will likely take place near Florida or in Northern Europe, where extensive ocean energy projects are already underway. Beyond the engineering required to keep servers running underwater, the project will demand server technology that doesn’t currently exist: the end goal is for each capsule to last twenty years, with servicing only every five, a lifespan current hardware simply couldn’t sustain. Microsoft’s announcement is optimistic on this point, stating that “with the end of Moore’s Law, the cadence at which servers are refreshed with new and improved hardware in the datacenter is likely to slow significantly” [http://natick.research.microsoft.com/]. This represents an opportunity to develop data centers that are both resilient and long-lived. Project Natick is still in its early stages, but something tells me this could be huge.
Markoff, John. “Microsoft Plumbs Ocean’s Depths to Test Underwater Data Center.” The New York Times, Feb. 1, 2016. [http://www.nytimes.com/2016/02/01/technology/microsoft-plumbs-oceans-depths-to-test-underwater-data-center.html?_r=0]
Paul, Ian. “Microsoft’s Audacious Project Natick Wants to Submerge Your Data in the Oceans.” PCWorld, Feb. 1, 2016. [http://www.pcworld.com/article/3027934/data-center-cloud/microsofts-project-natick-wants-to-submerge-your-data-in-the-oceans.html]