U.S. President Donald Trump delivers his State of the Union address during a Joint Session of Congress at the U.S. Capitol on Tuesday, Feb. 24, 2026, in Washington, D.C. (Andrew Harnik/Getty Images/TNS)
Donald Trump announced new agreements with technology companies to bear the energy costs of data centers by building their own power plants during his State of the Union address, delivered as the Florida Legislature advances data center regulations against the president’s wishes.
“Many Americans are also concerned that energy demand from AI data centers could unfairly drive up their electric utility bills,” Trump said. “We’re telling the major tech companies that they have the obligation to provide for their own power needs. They can build their own power plants as part of their factory, so that no one’s prices will go up.”
His administration has been pushing for months for data center regulation to be uniform across the country, but Florida is poised to defy him. Trump wrote in an executive order in December that his administration and Congress must act to create a “minimally burdensome national standard — not 50 discordant State ones.”
The Florida Legislature is eyeing new rules regulating AI data centers this session. It’s one of the few issues with bipartisan agreement. The Florida Senate is set to take up a bill Wednesday setting tariff and service requirements for data centers. A House committee advanced a companion bill Tuesday that has similar regulations, and also creates a five-mile boundary around schools and homes where companies can’t build data centers.
It’s not immediately clear how those regulations might apply to Trump’s announced agreement. The House version of the Legislature’s data center bill prevents local governments from issuing permits for the construction of data centers “or support facilities,” within certain boundaries.
Though House Speaker Daniel Perez, R-Miami, has publicly agreed with Trump that regulation should be left to the federal government, Gov. Ron DeSantis is pushing for state-level regulations.
Florida’s plans could put the state in Trump’s crosshairs. Under his December executive order, he threatened to limit federal grants to states that enforce laws that conflict with his policies.
Claire Heddles is the Miami Herald’s senior political correspondent. She previously covered national politics and Congress from Washington, D.C., at NOTUS. She’s also worked as a public radio reporter covering local government and education in East Tennessee and Jacksonville, Florida.
Indian conglomerate Adani Group said on Monday it would invest $100 billion over the next decade to build data centers specialized for AI across the country, a move that underscores India’s ambition to play a larger role in the global AI race.
The investment, which will run through 2035, is aimed at building renewable-energy-powered data centers designed to support AI workloads, the company said. It expects the plan to catalyze an additional $150 billion in related investments and result in a $250 billion AI infrastructure ecosystem in India over the decade.
The announcement coincides with India’s ongoing AI Impact Summit in New Delhi this week, where leaders from some of the world’s top AI companies, including OpenAI, Nvidia, Anthropic, Microsoft and Google, are meeting policymakers and industry executives.
Adani Group chairman Gautam Adani described the plan as a long-term bet on the convergence of energy and computing. “India will not be a mere consumer in the AI age,” he said, adding that the group aims to help build a domestic AI infrastructure base.
The plan is to build atop Adani’s own existing data-center platform and its partnerships with companies like Google and Microsoft. The conglomerate is developing large-scale AI data-center campuses in Visakhapatnam and Noida, and has plans for more facilities in Hyderabad and Pune. An expanded partnership with Walmart-owned Flipkart will focus on another AI data center.
Adani said the broader plan calls for deploying up to 5 gigawatts of data-center capacity. The company said the facilities will be developed as a unified system that would scale power generation and processing capacity in parallel.
The effort builds on AdaniConneX, a joint venture between Adani Enterprises and U.S.-based EdgeConneX, a developer and operator of data centers for hyperscale and enterprise customers. The JV, Adani said, has already developed about 2 gigawatts of data-center capacity across India.
Central to the strategy is Adani’s renewable-energy portfolio, which the group said will supply carbon-neutral power to the data centers. The company pointed to its 30-gigawatt Khavda renewable project in western India — more than 10 gigawatts of which is already operational — and said it plans to invest an additional $55 billion to expand renewable generation and battery energy storage over the coming years.
To reduce exposure to global supply-chain disruptions, Adani said it plans to co-invest in domestic manufacturing of critical components, such as transformers, power electronics and thermal management systems.
Adani did not respond to questions about how much of the $100 billion investment is already committed capital, how the spending will be phased over the coming years, and when the first large-scale AI workloads are expected to become operational.
Power, rather than compute, is fast becoming the limiting factor in scaling AI data centers. That shift has prompted Peak XV Partners to back C2i Semiconductors, an Indian startup building plug-and-play, system-level power solutions designed to cut energy losses and improve the economics of large-scale AI infrastructure.
C2i (which stands for control, conversion, and intelligence) has raised $15 million in a Series A round led by Peak XV Partners, with participation from Yali Deeptech and TDK Ventures, bringing the two-year-old startup’s total funding to $19 million.
The investment comes as data-center energy demand accelerates worldwide. Electricity consumption from data centers is projected to nearly triple by 2035, per a December 2025 report from BloombergNEF, while Goldman Sachs Research estimates data-center power demand could surge 175% by 2030 from 2023 levels — the equivalent of adding another top-10 power-consuming country.
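Those two projections are quoted on different baselines and horizons. A quick back-of-the-envelope conversion to annualized growth rates, using only the figures cited above (the function name and the arithmetic are our own, not from BloombergNEF or Goldman Sachs), shows they describe broadly similar trajectories:

```python
def implied_cagr(total_multiple: float, years: int) -> float:
    """Compound annual growth rate implied by total growth over a period."""
    return total_multiple ** (1 / years) - 1

# "Nearly triple by 2035": roughly 3x over about a decade.
print(round(implied_cagr(3.0, 10), 3))   # ~0.116, i.e. ~11.6% per year

# "Surge 175% by 2030 from 2023 levels": 2.75x over 7 years.
print(round(implied_cagr(2.75, 7), 3))   # ~0.155, i.e. ~15.5% per year
```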
Much of that strain comes not from generating electricity but from converting it efficiently inside data centers, where high-voltage power must be stepped down thousands of times before it reaches GPUs. This process currently wastes about 15% to 20% of energy, C2i’s co-founder and CTO Preetam Tadeparthy said in an interview.
“What used to be 400 volts has already moved to 800 volts, and will likely go higher,” Tadeparthy told TechCrunch.
Founded in 2024 by former Texas Instruments power executives Ram Anant, Vikram Gakhar, Preetam Tadeparthy, and Dattatreya Suryanarayana, along with Harsha S. B and Muthusubramanian N. V, C2i is redesigning power delivery as a single, plug-and-play “grid-to-GPU” system spanning the data-center bus to the processor itself.
C2i co-founders Vikram Gakhar, Preetam Tadeparthy, Ram Anant, and Dattatreya Suryanarayana (left to right). Image credits: C2i
By treating power conversion, control and packaging as an integrated platform, C2i estimates it can cut end-to-end losses by around 10% — roughly 100 kilowatts saved for every megawatt consumed — with knock-on effects for cooling costs, GPU utilization and overall data-center economics.
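The savings figure is straightforward to sanity-check: a 10% cut in end-to-end losses on a 1-megawatt load is 100 kilowatts. A minimal sketch of that arithmetic, using the article's illustrative numbers (the function name is our own):

```python
def conversion_savings_kw(load_mw: float, saved_fraction: float = 0.10) -> float:
    """Kilowatts saved per C2i's claimed ~10% cut in end-to-end losses."""
    return load_mw * 1000 * saved_fraction

# ~100 kW saved for every megawatt consumed, matching the article's figure.
print(conversion_savings_kw(1.0))  # 100.0
```

At the scale of a 1.4-gigawatt campus, the same fraction would be on the order of 140 megawatts, which is why even single-digit efficiency gains translate into the tens of billions of dollars Anandan describes below.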
“All that translates directly to total cost of ownership, revenue, and profitability,” Tadeparthy said.
For Peak XV Partners (which split from Sequoia Capital in 2023), the attraction lies in how power costs shape the economics of AI infrastructure at scale. Rajan Anandan, the venture firm’s managing director, told TechCrunch that after the upfront capital investment in servers and facilities, energy costs become the dominant ongoing expense for data centers, making even incremental efficiency gains highly valuable.
“If you can reduce energy costs by, call it, 10 to 30%, that’s like a huge number,” Anandan said. “You’re talking about tens of billions of dollars.”
The claims will be tested quickly. C2i expects its first two silicon designs to return from fabrication between April and June, after which the startup plans to validate performance with data-center operators and hyperscalers that have asked to review the data, according to Tadeparthy.
The Bengaluru-based startup has built a team of about 65 engineers and is setting up customer-facing operations in the U.S. and Taiwan as it prepares for early deployments.
Power delivery is one of the most entrenched parts of the data-center stack, long dominated by large incumbents with deep balance sheets and years-long qualification cycles. While many newer companies focus on improving individual components, redesigning power delivery end-to-end requires coordinating silicon, packaging, and system architecture simultaneously — a capital-intensive approach that few startups attempt and one that can take years to prove in production environments.
Anandan said the real question now is execution, noting that all startups face technology, market, and team risks when betting on how industries evolve. In C2i’s case, he said, the feedback loop should be relatively short. “We’ll know in the next six months,” said Anandan, pointing to upcoming silicon and early customer validation as the moment when the thesis will be tested.
The bet also reflects how India’s semiconductor design ecosystem has matured in recent years.
“The way you should look at semiconductors in India is, this is like 2008 e-commerce,” said Anandan. “It’s just getting started.”
He pointed to the depth of engineering talent — with a growing share of global chip designers based in the country — alongside government-backed design-linked incentives that have lowered the cost and risk of tape-outs, making it increasingly viable for startups to build globally competitive semiconductor products from India rather than operate only as captive design centers.
Whether those conditions translate into a globally competitive product will become clearer over the coming months, as C2i begins validating its system-level power solutions with customers.
With proposals for large-scale data centers spreading across Michigan, U.S. Senate candidate Abdul El-Sayed on Thursday released what he called “terms of engagement” aimed at protecting communities from higher utility bills, grid strain, and environmental harm.
El-Sayed, a progressive Democrat running in the 2026 Senate primary, said at least 15 data center projects have been proposed across the state in the past year, including a planned 1.4-gigawatt facility tied to Oracle and OpenAI. His campaign said a project of that size would consume more electricity than the entire city of Detroit.
“We’ve watched as data center projects have proliferated up and down our state, raising alarm and concern about the impacts on water resources, electric bills, and safety,” El-Sayed said in a statement. “That’s because our local utilities have bought off the politicians who are supposed to regulate them — and because there simply hasn’t been the leadership to take on powerful corporations. These terms of engagement represent the bare minimum that data center projects should be able to guarantee if they want to move into our communities.”
He argued that utility companies are pushing to fast-track approvals without adequate oversight, even as residents face rising rates and persistent reliability problems.
The plan targets investor-owned utilities such as DTE Energy and Consumers Energy, which El-Sayed said have a history of rate hikes without improvements in service. His campaign accused utilities and developers of “steamrolling” local governments and regulators as communities scramble to understand the long-term impacts of energy-hungry data centers.
Under El-Sayed’s “Our Communities, Our Terms” framework, data center projects would be required to meet a series of conditions before receiving approval:
No rate hikes: Data centers would be required to pay for their own energy demand, preventing costs from being passed on to residential ratepayers.
Community transparency: Local residents would have a meaningful role in approvals and in negotiating community benefits.
Energy reliability guarantees: Projects would need enforceable commitments to improve, not weaken, grid reliability, funded by data center revenues.
Jobs guarantees: Developers would face penalties if promised local jobs fail to materialize.
Water protection: Data centers would be required to use closed-loop cooling systems to limit water use and pollution.
Community benefits agreements: Binding agreements would be required to deliver tangible benefits, such as grid upgrades, buried power lines, and improvements to water infrastructure.
No clean-energy loopholes: Utilities would be barred from using data center demand as a justification to weaken Michigan’s clean-energy laws.
Enforceability: All commitments would have to include clear penalties for noncompliance.
El-Sayed is competing in the Democratic primary against U.S. Rep. Haley Stevens of Birmingham and state Sen. Mallory McMorrow of Royal Oak. His campaign said his opponents have supported tax exemptions for data center development without enforceable protections for ratepayers or the environment.
The campaign also emphasized that El-Sayed has never taken campaign contributions from utility companies that could benefit from rapid data center expansion.
A former Detroit health director and Wayne County health executive, El-Sayed has built his Senate run around challenging corporate power and prioritizing public health, affordability, and environmental protection. His campaign said the data center policy is part of a broader push to ensure that large infrastructure projects deliver measurable benefits to the communities that host them, rather than shifting costs onto residents.
On its own, each deal is dizzying in scale. But in aggregate, we see how Silicon Valley is moving heaven and earth to give OpenAI enough power to train and serve future versions of ChatGPT.
This week on Equity, Anthony Ha and I (Max Zeff) go beyond the headlines to break down what’s really going on in these AI infrastructure deals.
Rather conveniently, OpenAI also gave the world a glimpse this week of a power-intensive feature it could serve more broadly if it had access to more AI data centers.
The company launched Pulse — a new feature in ChatGPT that works overnight to deliver personalized morning briefings for users. The experience feels similar to a news app or a social feed — something you check first thing in the morning — but doesn’t have posts from other users or ads (yet).
Pulse is part of a new class of OpenAI products that work independently, even when users aren’t in the ChatGPT app. The company would like to deliver many more of these features and roll them out to free users, but it’s limited by the number of servers available to it. OpenAI said it can only offer Pulse to its $200-a-month Pro subscribers right now due to capacity constraints.
The real question is whether features like Pulse are worth the hundreds of billions of dollars being invested in AI data centers to support OpenAI. The feature looks cool and all, but that’s a tall order.
Watch the full episode to hear more about the massive AI infrastructure investments reshaping Silicon Valley, TikTok’s ownership saga, and the policy changes affecting tech’s biggest players.