Thanks for reading The Briefing, our nightly column where we break down the day’s news. If you like what you see, I encourage you to subscribe to our reporting here.
Greetings!
If you’ve got a ticket on the AI gravy train, times are good. That could be because you’re a supplier to Nvidia—and therefore enjoying booming business, as Nvidia CEO Jensen Huang pointed out in his speech on Monday at the company’s GTC convention. Or it could be because you’re a small data center operator picking up business from a tech giant looking for extra computing capacity.
Take Nebius, a Netherlands-based data center firm, whose stock rocketed 15% on Monday after it announced that Meta Platforms had agreed to spend up to $27 billion over several years renting its cloud capacity. This deal also demonstrates how Meta, despite planning to spend $125 billion on capital expenditures expanding its AI computing facilities this year, needs more capacity, right now. That is prompting it to tap outside cloud firms to supplement its own facilities.
Meta’s spending on outside cloud services perhaps deserves more scrutiny. Not that the company has hidden it: Meta warned in January that its operating expenses—which hit the income statement directly, unlike capital expenditures—will rise 40% this year to as much as $169 billion. While some of that is due to compensation costs for AI talent, as well as depreciation expenses for its own servers, the company prominently listed “third-party cloud spend” as a cause. (Translated into English, that’s what a company forks over to use a cloud service like Nebius, Amazon or Google Cloud.)
In the same vein, Meta revealed in a securities filing in late January that it had $131 billion of contractual commitments as of Dec. 31, “mostly related to third-party cloud capacity arrangements,” plus its own investments in servers and network infrastructure. A year earlier, its contractual commitments totaled just $32.8 billion, which it then said was mostly related to its own servers.
Why should anyone other than Nebius shareholders care about this shift? You don’t have to be a total nerd (although it helps) to know that companies have long debated the trade-offs of running their own data centers versus renting capacity in the public cloud. The consensus is that outside cloud services are costlier over time but don’t require the big up-front spending that building a data center entails.
Meta shareholders should certainly care about how the company is spending money on AI. Meta’s operating profit margins are already trending down—they fell to 41% in the fourth quarter of 2025 from 48% a year earlier. Wall Street analysts estimate the operating margin will fall to 34.8% for 2026 as a whole, according to S&P Global Market Intelligence. Now you know why Meta might be considering layoffs of up to 20% of the company, as Reuters reported last week. The cost of that extra cloud spending needs to come from somewhere.
Nvidia’s Trillion-Dollar Estimate
At one of Nvidia’s GTC conventions last year, Jensen Huang caused some confusion among analysts and reporters when he revealed that Nvidia was expecting $500 billion in revenue from sales of its latest AI chips, the Blackwell and the Rubin, between 2025 and 2026. That was a huge number for a company that in the year to January 2025 had reported just $130 billion in revenue.
But it wasn’t entirely clear what the $500 billion figure represented and precisely for what period. At the time, Morgan Stanley devoted a chunky section of its report on the GTC event to “parsing the $500 billion commentary,” as the bank put it. That number turned out to cover the calendar years 2025 through 2026, even though Nvidia doesn’t report on a calendar-year basis but instead on a fiscal year ending in January.
Despite the confusion, Huang was at it again on Monday, at the opening day of the company’s GTC convention in San Jose, Calif. In his two-hour-plus speech, the Nvidia CEO recalled that $500 billion statement and said he now anticipated “at least $1 trillion” in revenue from the Blackwell and Rubin chips. An accompanying slide showed that the number covered the period between 2025 and 2027 (presumably the calendar years, if it’s consistent with the previous projection). In other words, that’s an additional $500 billion, most of it presumably attributable to the extra year now included in the forecast. Woohoo! (Here’s more on GTC.)
In Other News
• Hua Hong Group, China’s second-largest chip manufacturer, has developed 7-nanometer chipmaking technology for AI chips, Reuters reported, citing four people familiar with the matter.
• Intuit’s founder and executive leadership team have canceled plans for future stock sales, the company said in a regulatory filing on Monday. At the same time, Intuit said it planned to “substantially accelerate” its remaining $3.5 billion worth of planned stock buybacks, having repurchased $1.8 billion worth of stock in the previous two quarters.
Today on The Information’s TITV
Check out today’s episode of TITV in which we speak with our Nvidia reporter about the significance of this week’s product announcements for the company.
Recommended Newsletter
Start your day with Applied AI, the newsletter from The Information that uncovers how leading businesses are leveraging AI to automate tasks across the board. Subscribe now for free to get it delivered straight to your inbox twice a week.
Join The Information at the New York Stock Exchange on Monday, April 27, to hear from top executives and investors on how the rapid buildout of AI is reshaping tech, finance and capital markets.