Data centers can be found in the most unlikely places: the icy tundra of Antarctica, the belly of a converted 19th-century church, a retrofitted nuclear bunker, even a 32-story colossus.

In 2009, Google received a patent for the idea of building data center platforms that would sit miles offshore. Picture an oil platform for compute and storage, the whole thing running on wind and solar power.

The jury is still out on the future of the data center. Will it become a modular shipping container sent out to sea, stacked one atop another (much like Google's idea)? Or will it become an enormous, ultra-efficient warehouse out in Nevada or in the solar goldfields of the Sahara?

Whatever happens, one thing is clear: we will need ever-increasing amounts of data storage and computing power.

The cloud may not have won every battle for outsourcing the data center—a significant number of enterprises still prefer to build their own—but it is winning the battle for convergence. Thanks to hybrid cloud innovations, the line between the corporate firewall and the massive scalability of tier 1 hosting providers is blurring.

Geography is becoming virtual, and data, regardless of where it lives, must scale massively and across multiple platforms. In this ecosystem, NoSQL, MySQL, and Oracle emerge as allies, not sworn enemies.

When Facebook recently announced that it would open source Presto, the distributed query engine it uses to analyze more than 300 petabytes of data for its one billion users, it signaled that open source will continue to play a vital role in the future of the data center.

Wherever they go, and whatever they look like, the data centers of the future will be super-efficient and push physical boundaries, but they will also be driven by open source innovation.
