Meta's Datacenters + Zuck's Security Detail
A deep dive into datacenter costs and tradeoffs in the race to dominate AI
Ahoy Metamates! 🚢 🏴☠️
This week we’ll talk about the infrastructure race to build and cool the data centers that will decide whether Meta’s AI capabilities will be superior in the long run.
But first, some tea on Mark’s life:
Who Keeps Mark Safe?
Meta is notorious for its security spend on Mark. Some of the costs make sense - for example, Mark always has to fly private and can never be on the same plane as most other Meta execs (SOX governance controls). But 16 bodyguards for one person? Really?
Meta spends $14M annually on Mark’s security (up from $10M last year, though down from a staggering $27M during COVID)
Mark’s movements are tracked in real time in a central control center, part of the Global Security Operations Center (GSOC) at Meta’s Menlo Park HQ (MPK)
Any Meta employee can take a tour of GSOC (last I checked)
In case you’re curious, Mark’s personal security detail setup is the brainchild of Tim Wenzel, a man with a fairly intense resume: from developing a diplomatic security program for “high-threat regions” at the Diplomatic Security Service at the US Dept. of State, to being a paramedic specializing in mass-casualty response, and for some reason also a phlebotomist?
It gets really intense - and drones are involved.
No one, and I truly mean NO ONE, was seriously thinking about drones for civilian use cases in 2015. Correction: Tim Wenzel was. He designed an entire drone system to fly over Mark’s house and keep him safe.
Some intense stuff we know about Mark’s security:
Meta pays for a Special Projects R&D lab focused entirely on coming up with ways to keep Mark safe
Mark’s home is built with ballistic materials and hardening methods and includes multiple safe rooms.
There’s a team focused on aerial drone detection & mitigation strategies
Tim went on to lead pretty critical parts of Meta - including securing IP around the globe. Insane that he started as a contractor basically retrofitting Mark’s house after the IPO.
The Race To Build Infra
Meta datacenter in Utah. Photo credit: Deseret News
All of Meta’s big bets center on its ability to massively scale its infrastructure - a non-trivial effort involving new chips, cables under the ocean, and telco partnerships.
It’s a delicate balance: analysts, Morgan Stanley in particular, were heavily resistant to new spend, citing Microsoft’s ability to run on its retrofitted setup without increasing costs. In response, Meta scaled back existing expansion plans, saving up to $7B by its own estimates.
In this month’s deep-dive series, we’ll cover the state of Meta’s infra, starting with data centers. I promise it won’t be boring:
Meta’s data center map and “Project Bigfoot”
How a small Dutch farming community fought Meta off
Why does no big tech company have a data center in Arkansas?
Why cooling matters - immersion vs air cooling strategies
👉 Next issue: chips, system design, cables in the ocean and competitive costs on building an AI empire.
Meta’s Datacenter Empire
The Prineville, Oregon data center
In 2010, Meta started building its own data centers and, unlike Amazon and Google, it’s been extra hush-hush about it. It’s a massive operation: each build takes about 1,000 construction workers on site at peak and creates ~100 new jobs to manage the facility once it’s complete.
18 data center campuses globally
Median build cost of ~$460 per sq ft (napkin math below)
~40 million square feet of space (equivalent to 616 football fields)
Meta’s lifetime data center investment has contributed $18.6B to US GDP
Source: Digital Real Estate Data 2022 (note: Mesa has since been scrapped)
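For a rough sense of scale, here’s the promised napkin math - mine, not Meta’s, and built on the approximate figures above, so treat the output as a ballpark:

```python
# Back-of-envelope on Meta's data center footprint, using the rough
# figures listed above (both are approximations, so this is ballpark only).
total_sqft = 40_000_000   # ~40 million square feet of space
cost_per_sqft = 460       # ~$460 median cost to build per sq ft

estimated_buildout = total_sqft * cost_per_sqft
print(f"Implied build-out cost: ~${estimated_buildout / 1e9:.1f}B")
# Implied build-out cost: ~$18.4B
```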
30% of all data centers in the world are in the US - and they have a massive NIMBY problem. No one, and I mean NO ONE, wants one in their area. While companies like Meta boast that they run 100% on renewable energy, there’s an emerging water crisis that no one seems to be talking about openly.
Data centers need water to cool - and much of the world is in a state of drought. An open-loop design for a large data center, researchers say, can gobble up anywhere between 1 million and 5 million gallons of water a day - as much as a town of 10,000 to 50,000 people.
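If you want to sanity-check that town comparison, the math is just division. The one assumption here is mine, not the researchers’: roughly 100 gallons per person per day of US residential water use.

```python
# Sanity check on the "town of 10,000 to 50,000 people" comparison above.
# Assumed (by me, not the researchers): ~100 gallons per person per day
# of residential water use.
GALLONS_PER_PERSON_PER_DAY = 100

for dc_gallons_per_day in (1_000_000, 5_000_000):
    people_equivalent = dc_gallons_per_day / GALLONS_PER_PERSON_PER_DAY
    print(f"{dc_gallons_per_day:>9,} gal/day ≈ a town of {people_equivalent:,.0f} people")

# 1,000,000 gal/day ≈ a town of 10,000 people
# 5,000,000 gal/day ≈ a town of 50,000 people
```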
NIMBY Vibez
In The Dalles, Ore., a local paper fought to unearth information revealing that a Google data center uses over a quarter of the city’s water.
In Los Lunas, N.M., farmers protested a decision by the city to allow a Meta data center to move into the area.
The tiny Dutch farming community of Zeewolde went nuts on Meta, in the civil European way.
“Project Bigfoot”
For nearly a year now, a mystery land buyer in Rosemount, MN has been upsetting the local population with plans to build a massive data center on University of Minnesota land. That “mystery buyer” happens to be Meta.
The locals are less than enthused:
alleged local resident on reddit
Others were slightly less laconic:
wordsmith, this one
That got me thinking - what factors into Meta’s decision to build a datacenter?
It’s a bit of a conundrum, really.
The logical thing to do would be to find a state with cooler weather, cheap land, and easy access to infrastructure. However, companies like Meta do literally the exact opposite - why? It’s all about cheap power and access to carbon-neutral energy (solar and wind). If you want to play datacenter bingo, here’s your card:
And, if you’re wondering why there’s literally not a single big tech company with an owned data center in Arkansas, here’s your answer:
TLDR: Data centers use a ton of water and create barely any local jobs = everyone is pissed off.
Stay cool, everyone
Cooling is actually essential to server performance - which is why I spent an entire section on it, naturally. And Meta’s making a bold new bet on a closed-loop cooling system to power its AI ambitions.
At the Open Compute Summit, Meta outlined a roadmap for a gradual shift to a water-cooled infrastructure, using cold plates to provide direct-to-chip cooling for AI workloads.
What does this mean in, like, English?
Source: Meta
Simple explanation: the winner in AI will be the company that can run more efficiently at scale and squeeze the most out of its existing resources, because standing up new data centers takes forever + 1 day.
Two different strategies to win on efficiency
Microsoft is trying to pull off immersion cooling, which is exactly what it sounds like and looks like the image below.
How it works: servers are immersed in a coolant fluid that boils off as the chips generate heat, carrying the heat away as it changes from liquid to vapor. The vapor then condenses back into liquid for reuse, all without a pump. More here.
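To make the physics a bit more concrete, here’s a minimal sketch of the heat balance behind it. The numbers are my own assumptions (a ~100 kW tank and a dielectric fluid with a latent heat around 100 kJ/kg), not Microsoft’s specs:

```python
# Two-phase immersion cooling, boiled down (pun intended): every kilogram
# of fluid that vaporizes carries away its latent heat, so the steady-state
# boil-off rate is simply heat load / latent heat.
# Assumed numbers (mine, not Microsoft's): ~100 kW tank, ~100 kJ/kg fluid.
heat_load_kw = 100.0            # server heat dumped into the tank (kJ/s)
latent_heat_kj_per_kg = 100.0   # energy absorbed per kg of fluid boiled off

boil_off_kg_per_s = heat_load_kw / latent_heat_kj_per_kg
print(f"Boil-off: ~{boil_off_kg_per_s:.1f} kg/s of vapor, "
      "all condensed and dripped back into the tank for reuse")
```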
Meta is using Air-Assisted Liquid Cooling (AALC), which is a fancy way of saying it will use cold plates and air cooling to cycle and cool water in a closed-loop system without evaporation.
Source: Meta. Screenshot depicts blue air pipe cooling system
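And here’s the single-phase version of that math for a cold-plate loop like AALC, again with numbers I’m assuming purely for illustration (roughly 1 kW of heat per accelerator and a 10 °C temperature rise across the plate), not anything Meta has published:

```python
# Single-phase cold-plate loop: Q = m_dot * c_p * delta_T.
# Assumed numbers (mine, not Meta's): ~1 kW per chip, 10 °C rise across the plate.
chip_heat_kw = 1.0     # heat pulled off one accelerator (kJ/s)
cp_water = 4.18        # specific heat of water, kJ/(kg*K)
delta_t_c = 10.0       # allowed temperature rise across the cold plate, in °C

flow_kg_per_s = chip_heat_kw / (cp_water * delta_t_c)
flow_l_per_min = flow_kg_per_s * 60   # 1 kg of water ≈ 1 liter

print(f"Required flow: ~{flow_l_per_min:.1f} L/min per chip")
# The same water just loops back through an air-cooled heat exchanger,
# so unlike evaporative designs, essentially none of it is consumed.
```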
Curious to learn more? Additional reading below:
The information provided herein is for general informational purposes only and is not intended to provide tax, legal, or investment advice and should not be construed as an offer to sell, a solicitation of an offer to buy, or a recommendation of any security by That's So Meta, its employees and affiliates, or any third-party. Any expressions of opinion or assumptions are for illustrative purposes only and are subject to change without notice. Past performance is not a guarantee of future results and the opinions presented herein should not be viewed as an indicator of future performance. Investing in securities involves risk. Loss of principal is possible. Third-party data has been obtained from sources we believe to be reliable; however, its accuracy, completeness, or reliability cannot be guaranteed.