ReportWire

Tag: NVIDIA

  • PGGM Investments Has $114.74 Million Position in NVIDIA Corporation $NVDA


    PGGM Investments reduced its holdings in NVIDIA Corporation (NASDAQ:NVDA) by 61.9% during the 2nd quarter, according to its most recent filing with the Securities and Exchange Commission. The fund owned 726,231 shares of the computer hardware maker’s stock after selling 1,177,849 shares during the period. NVIDIA accounts for about 2.0% of PGGM Investments’ portfolio, making the stock its 17th largest holding. PGGM Investments’ holdings in NVIDIA were worth $114,737,000 as of its most recent SEC filing.
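    The filing's arithmetic is internally consistent; a quick sketch, using only the share counts and dollar value quoted above (the per-share price is implied, not reported):

```python
# Figures from the PGGM filing summary above.
shares_after = 726_231
shares_sold = 1_177_849
shares_before = shares_after + shares_sold  # 1,904,080 shares at the start of Q2

reduction_pct = shares_sold / shares_before * 100
print(f"Stake reduced by {reduction_pct:.1f}%")  # matches the reported 61.9%

position_value = 114_737_000
print(f"Implied price per share: ${position_value / shares_after:,.2f}")
```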

    Several other hedge funds and other institutional investors have also recently added to or reduced their stakes in NVDA. Kathleen S. Wright Associates Inc. increased its stake in shares of NVIDIA by 169.3% during the 1st quarter. Kathleen S. Wright Associates Inc. now owns 404 shares of the computer hardware maker’s stock worth $44,000 after purchasing an additional 254 shares during the last quarter. Barnes Dennig Private Wealth Management LLC bought a new stake in shares of NVIDIA during the 1st quarter worth about $51,000. Bruce G. Allen Investments LLC boosted its holdings in shares of NVIDIA by 198.2% during the 1st quarter. Bruce G. Allen Investments LLC now owns 492 shares of the computer hardware maker’s stock worth $53,000 after buying an additional 327 shares during the period. Legend Financial Advisors Inc. bought a new stake in shares of NVIDIA during the 2nd quarter worth about $55,000. Finally, Campbell Capital Management Inc. boosted its holdings in shares of NVIDIA by 5,900.0% during the 1st quarter. Campbell Capital Management Inc. now owns 600 shares of the computer hardware maker’s stock worth $65,000 after buying an additional 590 shares during the period. 65.27% of the stock is currently owned by institutional investors and hedge funds.

    Analyst Upgrades and Downgrades

    Several analysts recently commented on NVDA shares. Truist Financial lifted their target price on shares of NVIDIA from $210.00 to $228.00 and gave the stock a “buy” rating in a report on Thursday, August 28th. Phillip Securities raised shares of NVIDIA from a “moderate buy” rating to a “strong-buy” rating in a report on Monday, July 14th. Wall Street Zen downgraded shares of NVIDIA from a “buy” rating to a “hold” rating in a report on Saturday, October 11th. Barclays reaffirmed an “overweight” rating and set a $240.00 target price (up from a previous $200.00) on shares of NVIDIA in a report on Monday, September 22nd. Finally, The Goldman Sachs Group set a $210.00 price objective on shares of NVIDIA and gave the company a “buy” rating in a report on Monday, October 6th. Four analysts have rated the stock with a Strong Buy rating, thirty-nine have given a Buy rating, three have assigned a Hold rating and one has given a Sell rating to the company’s stock. Based on data from MarketBeat, the company currently has a consensus rating of “Moderate Buy” and a consensus target price of $222.23.


    Insider Activity

    In other news, CEO Jen Hsun Huang sold 75,000 shares of the business’s stock in a transaction that occurred on Monday, August 4th. The shares were sold at an average price of $178.16, for a total value of $13,362,000.00. Following the completion of the transaction, the chief executive officer directly owned 73,523,225 shares in the company, valued at approximately $13,098,897,766. The trade was a 0.10% decrease in their position. The sale was disclosed in a filing with the Securities & Exchange Commission, which can be accessed through this hyperlink. Also, Director Harvey C. Jones sold 250,000 shares of the business’s stock in a transaction that occurred on Thursday, September 18th. The shares were sold at an average price of $176.21, for a total transaction of $44,052,500.00. Following the completion of the transaction, the director owned 7,183,280 shares of the company’s stock, valued at $1,265,765,768.80. The trade was a 3.36% decrease in their ownership of the stock. The disclosure for this sale can be found here. In the last ninety days, insiders sold 3,828,937 shares of company stock worth $680,708,026. 4.17% of the stock is currently owned by corporate insiders.
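    The reported sale value and percentage figures are consistent with each other; a small check, using only the numbers in the disclosure above:

```python
# CEO sale: 75,000 shares at an average price of $178.16.
shares_sold, avg_price = 75_000, 178.16
sale_value = shares_sold * avg_price
print(f"Sale value: ${sale_value:,.2f}")  # $13,362,000.00 as reported

# The 0.10% figure is the sale relative to the pre-sale position.
shares_after = 73_523_225
pct_of_position = shares_sold / (shares_after + shares_sold) * 100
print(f"Position reduced by {pct_of_position:.2f}%")
```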

    NVIDIA Price Performance

    Shares of NASDAQ:NVDA opened at $179.83 on Thursday. The firm has a 50 day moving average of $179.37 and a 200-day moving average of $151.96. The stock has a market capitalization of $4.37 trillion, a PE ratio of 51.23, a PEG ratio of 1.30 and a beta of 2.12. The company has a current ratio of 4.21, a quick ratio of 3.60 and a debt-to-equity ratio of 0.08. NVIDIA Corporation has a 12 month low of $86.62 and a 12 month high of $195.62.

    NVIDIA (NASDAQ:NVDA) last issued its earnings results on Wednesday, August 27th. The computer hardware maker reported $1.05 EPS for the quarter, beating the consensus estimate of $1.01 by $0.04. The company had revenue of $46.74 billion during the quarter, compared to analysts’ expectations of $45.65 billion. NVIDIA had a net margin of 52.41% and a return on equity of 101.74%. The business’s revenue for the quarter was up 55.6% compared to the same quarter last year. During the same period in the previous year, the company earned $0.68 EPS. NVIDIA has also issued EPS guidance for Q3 2026. On average, equities research analysts predict that NVIDIA Corporation will post 2.77 EPS for the current fiscal year.

    NVIDIA Dividend Announcement

    The company also recently announced a quarterly dividend, which was paid on Thursday, October 2nd. Stockholders of record on Thursday, September 11th were given a $0.01 dividend. The ex-dividend date was Thursday, September 11th. This represents a $0.04 annualized dividend and a dividend yield of 0.0%. NVIDIA’s payout ratio is presently 1.14%.
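    The quoted yield and payout ratio follow from the $0.04 annualized dividend together with the share price and P/E ratio given earlier; a rough reconciliation (trailing EPS here is implied from price and P/E, not reported directly):

```python
annual_dividend = 0.04
price = 179.83    # opening price quoted above
pe_ratio = 51.23  # P/E ratio quoted above

dividend_yield = annual_dividend / price * 100
print(f"Yield: {dividend_yield:.2f}%")  # ~0.02%, which rounds to the reported 0.0%

trailing_eps = price / pe_ratio  # ~$3.51 implied trailing EPS
payout_ratio = annual_dividend / trailing_eps * 100
print(f"Payout ratio: {payout_ratio:.2f}%")  # matches the reported 1.14%
```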

    NVIDIA Profile


    NVIDIA Corporation provides graphics and compute and networking solutions in the United States, Taiwan, China, Hong Kong, and internationally. The Graphics segment offers GeForce GPUs for gaming and PCs, the GeForce NOW game streaming service and related infrastructure, and solutions for gaming platforms; Quadro/NVIDIA RTX GPUs for enterprise workstation graphics; virtual GPU or vGPU software for cloud-based visual and virtual computing; automotive platforms for infotainment systems; and Omniverse software for building and operating metaverse and 3D internet applications.

    Recommended Stories

    Want to see what other hedge funds are holding NVDA? Visit HoldingsChannel.com to get the latest 13F filings and insider trades for NVIDIA Corporation (NASDAQ:NVDA).

    Institutional Ownership by Quarter for NVIDIA (NASDAQ:NVDA)





    ABMN Staff


  • Why Your Business Should Consider Nvidia’s New $4,000 Desktop Supercomputer


    Nvidia’s desktop supercomputer went on sale Wednesday, evoking FOMO from some tech enthusiasts who weren’t able to get their hands on one and raves from those who did. But the DGX Spark may not be a must-have for everyone in the AI world.

    First announced in January at CES, the DGX Spark was originally priced at $3,000 and was scheduled to be released in May. Things change in the tech world, though.

    The Spark was originally known as “Project DIGITS,” but got a name upgrade. The launch was also pushed back another five months, and during that time Nvidia increased the price to $3,999. (That’s still far less than the $129,000 Nvidia charged for the DGX-1 supercomputer in 2016.)

    That hasn’t reduced demand, however, as the device sold out almost immediately on Wednesday. Described as powerful enough to build complex AI models but small enough to fit on your desk, it’s a computer that could democratize AI beyond the corporate giants, broadening innovation.

    Curious about the DGX Spark or thinking about getting one when supplies are restocked? Here’s what you need to know.

    How can I order a DGX Spark?

    The Spark can be ordered online at nvidia.com, as well as from select partners and stores, including Micro Center and PNY. (The system is sold out on Nvidia’s website and the partner sites appear to be out of stock as well.)

    Who is the DGX Spark designed for?

    With its eye-popping specifications, the DGX Spark might seem like a dream machine for any power-user of PCs, but it’s best suited for researchers. (It won’t play video games and is entirely overpowered for web browsing.) Nvidia CEO Jensen Huang, when introducing the device, said that “placing an AI supercomputer on the desks of every data scientist, AI researcher and student empowers them to engage and shape the age of AI.”

    The company envisions the initial target audience will be twofold: Developers at large tech companies, who can create a working use case (in other words, a practical application of AI technology), which can then be scaled via data centers or cloud computing—or smaller developers who don’t have the finances to access those data centers, but still have ideas for new applications.

    Nvidia has its eyes on a broader audience in the long-term, though. At that same CES keynote, Huang noted that in the near future, anyone “who uses computers as a tool” will need their own personal AI supercomputer.

    What are the system specs of the DGX Spark?

    If you’re not super fluent in computer-speak, brace yourself. The Spark uses Nvidia’s GB10 Grace Blackwell (or, if you prefer, just Blackwell) GPU chip and has 128 GB of GPU memory. It boasts up to 4TB of NVMe SSD (solid state) storage. And Nvidia says it can deliver a petaflop of AI performance, which works out to a quadrillion calculations each second (technically, these calculations are called FLOPS, for “floating point operations per second”).

    For comparison, the fastest supercomputer in the world is El Capitan at the Lawrence Livermore National Laboratory. It is rated at 2.79 exaflops and is designed to “help researchers ensure the safety, security, and reliability of the nation’s nuclear stockpile in the absence of underground testing.” However, it’s not for sale. 
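    For a sense of scale, a petaflop is 10^15 floating point operations per second and an exaflop is 10^18, so the two machines compare as follows (a loose ratio only, since the figures are measured at different numeric precisions):

```python
spark_flops = 1e15          # DGX Spark: ~1 petaflop of AI performance
el_capitan_flops = 2.79e18  # El Capitan: 2.79 exaflops

ratio = el_capitan_flops / spark_flops
print(f"El Capitan delivers roughly {ratio:,.0f}x the flops of one DGX Spark")
```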

    What can all of the DGX Spark’s horsepower actually do?

    The Spark is capable of handling AI models that have as many as 200 billion parameters. That’s something that used to require access to data centers that were far, far beyond the budget of smaller developers. Making a more affordable system will let developers prototype, fine-tune and test complex AI models. And if a model is too big for a Spark to handle, the computer can be linked to another Spark to let researchers move forward with testing.

    Will there be other versions of the DGX Spark?

    Yes. Several Nvidia partners will make their own desktop supercomputers using the Nvidia GB10 Grace Blackwell Superchip, which powers the Spark. Among the PC companies that will offer these are Dell, Asus, Acer, HP, Lenovo, MSI and Gigabyte. None are currently available. However, the majority are expected later this year (with the caveat that some could slip to early 2026). Expect pricing roughly in line with the Spark’s $4,000.


    Chris Morris


  • BlackRock Joins Nvidia and Microsoft in a $40 Billion Move to Secure Data Centers


    The artificial intelligence sector keeps booming with yet another mega-deal.

    A group including BlackRock, Nvidia and Microsoft is buying Aligned Data Centers in an approximately $40 billion deal in an effort to expand next-generation cloud and artificial intelligence infrastructure.

    The acquisition comes amid a flurry of deals in recent months involving top AI developers that are flooding the booming AI sector with resources and money, and addressing resources — such as electricity and infrastructure — needed to support such technology.

    Last week it was revealed that semiconductor maker AMD will supply its chips to artificial intelligence company OpenAI as part of an agreement to team up on building AI infrastructure. OpenAI will also get the option to buy as much as a 10 percent stake in AMD, according to a joint statement announcing the deal.

    Last month, OpenAI and Nvidia announced a $100 billion partnership that will add at least 10 gigawatts of data center computing power.

    Aligned’s portfolio includes 50 campuses and more than 5 gigawatts of operational and planned capacity, including assets under development, mostly located across the U.S. and in Latin America. Some locations include northern Virginia; Chicago; Dallas; Ohio; Phoenix; Salt Lake City; Sao Paulo, Brazil; Queretaro, Mexico; and Santiago, Chile.

    Aligned, which is privately held, will continue to be led by CEO Andrew Schaap and keep its headquarters in Dallas.

    One of the sellers, Macquarie Asset Management, initially invested in Aligned in 2018. Ben Way, head of Macquarie Asset Management, said in a statement, “The scaling of Aligned Data Centers from two locations to 50 in seven years is representative of our approach to working with great companies and teams to support their rapid growth and deliver positive impact.”

    The transaction is the first deal for the investment consortium, which is named the Artificial Intelligence Infrastructure Partnership. The consortium has an initial target of mobilizing and deploying $30 billion of equity capital, with the potential of reaching $100 billion including debt.

    “AIP is positioned to meet the growing demand for the infrastructure required as AI continues to reshape the global economy,” BlackRock Chairman and CEO and AIP Chairman Larry Fink said in a statement. “This partnership is bringing together leading companies and mobilizing private capital to accelerate AI innovation and drive global economic growth and productivity.”

    The deal is expected to close in the first half of 2026.

    Shares of Nvidia rose about 1 percent in morning trading.


    Associated Press


  • You’ll Soon Be Hearing a Lot About AI ‘Supercomputers’ That Aren’t for You


    No, your new “AI PC” or “Copilot+ PC” is not great at running AI—at least, not with any amount of compute that matters. The real artificial intelligence is being processed behind the closed doors of sprawling data centers that are currently springing up all around the United States. Nvidia, which became a trillion-dollar company thanks to AI, is now asking you to stick an ounce of that cloud on your desk.

    Nvidia first announced its $4,000 DGX Spark AI compute machine, then dubbed “Project Digits,” during CES 2025. If you don’t remember the specifics, I don’t blame you. While CEO Jensen Huang was talking up the AI and graphics capabilities of the firm’s RTX 50-series GPUs, Spark was being pushed elsewhere on the floor as an at-home device built specifically for high-end AI workloads. The company went so far as to call it a “new class of computer” in its announcement post—despite the fact that it’s powered by a Blackwell chip, an architecture that shows up across several other Nvidia lines.

    I wonder if Elon Musk will stick the DGX Spark on his carpet like he did his federal gaming PC. © Nvidia

    The DGX Spark is set to start shipping on Wednesday. Nvidia’s usual partners like Acer, Asus, Lenovo, MSI, Dell, and Gigabyte are already lined up to put out their own versions of the device. You may not find those products when roaming the sparse halls of your nearest Best Buy, but Nvidia said it will ship them to Micro Center stores in the U.S. Nvidia made a big deal of handing out Sparks to major companies like OpenAI and Microsoft, as well as to Elon Musk at the Starbase headquarters in Texas. Maybe the billionaire founder of xAI will plug it in and use it to vibe code that AI-centric “Wokipedia” competitor—which he promised would help people “understand the universe.”

    You won’t use the ‘AI supercomputer’ for anything but AI

    Acer’s own version of Nvidia’s DGX Spark will support similar levels of AI compute in its pint-sized package. © Kyle Barr / Gizmodo

    I saw both Nvidia’s and Acer’s versions of the Spark at IFA 2025. They both looked like big, shiny mini PCs—but they weren’t actually running Windows. The Spark runs a customized, Ubuntu-based Linux distribution loaded with several of Nvidia’s tools for AI image models and large language models (LLMs). A 20-core, ARM-based CPU accompanies a Grace Blackwell GPU. If all that mattered was core count, the Spark would tie Nvidia’s GeForce RTX 5070, one of its lower-end GPUs.

    On its face, those specs don’t sound very “supercomputer.” But inside, you’ll find much more performance and power draw than a typical desktop PC. The 2.65-pound Spark box holds 128GB of system memory and 4TB of storage. Its Blackwell chip promises 1 petaflop of AI compute performance, many times the 170 teraflops Nvidia’s 2016 DGX-1 AI compute machine offered, though it’s worth noting that flops (floating point operations per second) are a rough metric. The Spark also draws just 240W, compared to the older model’s 3,200W.

    For another loose example of AI compute capability, the DGX Spark promises to perform around 1,000 TOPS, or trillions of operations per second. Though that comes in below the RTX 5090, which boasts 3,352 TOPS, it surpasses any equivalent PC of the same size, and its memory puts it over the edge for the sake of developing and designing the next chatbot. For comparison, Qualcomm’s upcoming Snapdragon X2 Elite Extreme chip is supposed to be much better for AI than its predecessor, thanks to a redesigned neural processing unit, but that can only claim 70 TOPS of AI performance. Your usual PC is still limited to running extremely low-end AI models or background AI tasks.
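    Laying the article’s compute figures side by side makes the comparison concrete (TOPS at different precisions aren’t strictly comparable, so treat these as rough, marketing-grade numbers):

```python
# Headline AI-compute figures quoted above, in TOPS (trillions of ops/sec).
tops = {
    "Nvidia DGX Spark": 1_000,
    "Nvidia RTX 5090": 3_352,
    "Qualcomm Snapdragon X2 Elite Extreme": 70,
}
for name, t in sorted(tops.items(), key=lambda kv: -kv[1]):
    print(f"{name:<38} {t:>5} TOPS")

# Performance per watt versus the 2016 DGX-1:
# ~1 petaflop at 240W against 170 teraflops at 3,200W.
spark_per_watt = 1e15 / 240
dgx1_per_watt = 170e12 / 3_200
print(f"Spark offers ~{spark_per_watt / dgx1_per_watt:.0f}x the compute per watt")
```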

    Oh, and all of that comes with a retail price of around $4,000. Don’t worry: Nvidia doesn’t expect every Joe Schmoe to buy one of these for the sake of running Windows 11 Recall. The DGX Spark is built for nascent AI developers, students, or, perhaps, curious AI dabblers who can afford to drop the equivalent of two $2,000 RTX 5090 GPUs to buy a specialized computer. Its real mission is to get more developers to create AI applications that people actually want to use—or, in Huang’s words, “the next wave of breakthroughs.” Hopefully, that’ll take the form of something beyond a chatbot interface promising to fix home decoration or remedy the abstract concept of loneliness.



    Kyle Barr


  • Is the AI Conveyor Belt of Capital About to Stop?


    The American economy is little more than a big bet on AI. Morgan Stanley investor Ruchir Sharma recently noted that money poured into AI investments now accounts for about 40% of the United States’ GDP growth in 2025, and AI companies are responsible for 80% of growth in American stocks. So how bad is it that the most recent major deals among AI giants, agreements that have driven up stock prices dramatically, look like a snake eating its own tail?

    In recent months, Nvidia announced that it would invest $100 billion into OpenAI, OpenAI announced that it would pay $300 billion to Oracle for computing power, and Oracle announced it would buy $40 billion worth of chips from Nvidia. It doesn’t take a flow chart to get the feeling that these firms are just moving money around between each other. But surely that’s not happening…right?

    It’s a little harder to get assurances of that than you might think. 


    Is it all round-tripping?

    Many of these agreements are, on their face, mutually beneficial. If everything is on the level, while these deals might be circular, they should be moving everything forward. Rishi Jaluria, an analyst at RBC Capital Markets, told Gizmodo that deals like these could result in a “less capacity-constrained world,” which would allow for faster development of models that could produce higher returns on investment.

    “The better models we have, the more we can realize a lot of these AI use cases that are on hold just because the technology isn’t powerful enough yet to handle it,” he said. “If that happens, and that can generate real [return on investment] for customers … that results in real cost savings, potentially new revenue generation opportunities, and that creates net benefits from a GDP perspective.”

    So as long as we keep having AI breakthroughs and these companies figure out how to monetize their products, everything should be fine. On the off chance that doesn’t happen, though? 

    “If that doesn’t happen, if there is no real enterprise AI adoption, then it’s all round-tripping,” Jaluria said.

    Round-tripping, generally speaking, refers to the unethical and typically illegal practice of making trades or transactions to artificially prop up a particular asset or company, making it look like it’s more valuable and in demand than it actually is. In this case, it would be tech companies that are trying to make it appear like they are more valuable than they actually are by announcing big deals with each other that move the stock price. 

    So what might suggest whether this money is actually accomplishing anything other than serving as hot air in a rapidly inflating bubble? Jaluria said he’s watching for faster developments of models, advancements in performance, and overall AI adoption. “If this leads to a step function change in the way enterprise is adopting and utilizing AI, that creates a benefit,” he said.

    Whether that is happening currently or not is kind of in the eye of the beholder. OpenAI has certainly shown advancements in its technology. The release of its Sora 2 video generation model has unleashed a fresh hell upon the world, used to generate significant amounts of copyright violations and misinformation. But the latest version of the company’s flagship model, GPT-5, underwhelmed and failed to live up to expectations when it was released in August. 

    Adoption rates of the technology are also a bit of a Rorschach test. The company boasts that 10% of the world is using ChatGPT, and nearly 80% of the business world says that it’s looking into how to utilize the technology. But the early adopters aren’t finding much utility. According to a survey from the Massachusetts Institute of Technology, 95% of companies that have tried to integrate generative AI tools into their operations have produced zero return on investment.

    Where these investments are generating a return is in the stock market. Which, frankly, does not quell concerns about these firms simply boosting one another’s bottom line.

    Take Oracle, for example. Last month, the cloud provider had a rough quarter by all traditional indicators. It missed on both its revenue and earnings projections, and its net income was flat year-over-year. And yet, the stock price soared. The reason: the company’s plump list of remaining performance obligations, contracted revenue that has been booked but not yet delivered. There, the company showed a massive amount of growth, a 359% increase from the year prior, with a projected $455 billion coming in.

    That money is not real yet. Nor is the growth the company has promised, claiming that its Oracle Cloud Infrastructure revenue would grow from under $20 billion to nearly $150 billion before the start of the 2030s. But all of it was sufficient for investors to drive up Oracle’s share price enough to slingshot CEO Larry Ellison into the top spot on the world’s richest person list, briefly leapfrogging Elon Musk. 

    Still from a promotion video of Sam Altman generated by OpenAI’s Sora 2. © OpenAI

    OpenAI is either the nexus point or the void at the center

    Most of this promised revenue will come from OpenAI, which made a commitment to purchase $300 billion worth of computing power from the company over five years. The clock on that contract doesn’t start until 2027, but assuming it actually happens, it would be one of the largest cloud computing deals in history.

    It’s also one of the most unlikely, just based on where the companies involved currently stand. In order to provide the compute that it has promised to OpenAI, Oracle will reportedly need to generate 4.5 gigawatts of power capacity, more than two Hoover Dams’ worth of power. On the other side of the deal, OpenAI will have to pay about $60 billion per year to foot the bill for the agreement. It currently generates about $10 billion in revenue, which, statistically speaking, is less than $60 billion.
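    The mismatch is easy to quantify from the figures above (a sketch only; it treats the $300 billion as spread evenly over the five-year term, which is a simplification):

```python
contract_total = 300e9  # Oracle deal, in dollars
term_years = 5
annual_payment = contract_total / term_years  # $60B per year

current_revenue = 10e9  # OpenAI's approximate annual revenue
print(f"Annual payment is {annual_payment / current_revenue:.0f}x OpenAI's current revenue")
```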

    You can see a similar circular shape to OpenAI’s recent deal with Nvidia rival AMD, too. The exact details of the agreement weren’t reported, but chipmaker AMD expects to generate tens of billions of dollars over the next half-decade as it sells its AI chips to OpenAI. As part of the agreement, OpenAI gets a swath of shares in AMD, with options to buy up to 10% of the company. Lucky for OpenAI, there’s really no better time to get your hands on some AMD shares than right before it announces a big AI-related deal. The company’s stock price surged by about 35% following the announcement. 

    With those two most recent deals on the books, OpenAI has agreed to more than $1 trillion worth of computing deals so far this year. That’s a lot for any company to spend, but it’s especially a lot for a still-private company that reports just $10 billion in projected revenue through 2025. Even by its most recent funding rounds, the company as a whole is currently valued at about $500 billion.

    Most of those deals have contingencies attached. For instance, Nvidia’s investment in OpenAI isn’t actually $100 billion, but an initial $10 billion for one gigawatt of data center capacity with the potential for $100 billion if 10 gigawatts are ultimately achieved. But the stock prices and valuations certainly seem to treat these deals as if they are set in stone. And OpenAI seems to be operating that way, too. The company claims that it’ll more than 10x its revenue in the next few years, and projects it’ll hit $129 billion annually by 2029.
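    Two numbers implied by these deals, as a back-of-envelope sketch rather than company guidance: the compound growth rate OpenAI’s 2029 projection requires, and what Nvidia’s staged commitment works out to per gigawatt:

```python
# OpenAI: roughly $10B revenue now, projecting $129B annually by 2029.
rev_now, rev_2029, years = 10e9, 129e9, 4
implied_cagr = (rev_2029 / rev_now) ** (1 / years) - 1
print(f"Implied revenue growth: {implied_cagr:.1%} per year compounded")

# Nvidia: up to $100B against 10 GW of data center capacity.
per_gigawatt = 100e9 / 10
print(f"Nvidia's commitment: ${per_gigawatt / 1e9:.0f}B per gigawatt")
```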

    Conveyor belts of capital

    That type of potentially inflated revenue figure is the kind of thing that makes some people think of the Dot Com bubble of the early 2000s, where we saw companies like Commerce One receive a $21 billion valuation despite barely having any revenue. But Peter Atwater, Adjunct Professor of Economics at William and Mary and President of consulting firm Financial Insyghts, sees a different reflection in the AI bubble: the housing market collapse. 

    “What we saw at the top of the mortgage market was all of these conveyor belts of capital, money flowing from one party to another party to another party. And what you started to see was that there were multiple points of relationship so that any participant in the system was then dependent on every other conveyor belt in the system working simultaneously to keep the system going,” he told Gizmodo. “In many ways, we’re seeing the same developing web of capital flows across the AI space.”

    This creates some obvious problems. The circular deals that, in theory, are wheels moving the whole thing forward all have to keep turning. If any of them stop, the whole thing stops, because they are all so interconnected that no failure is truly isolated. 

    Atwater said that the types of major, metric-contingent deals that have been dominating headlines in the AI space aren’t all that different from some of what was happening in the mortgage industry back in 2007, where some of the financial commitments required mortgages to meet certain conditions.

    “In the frenzy of a bubble, everyone overcommits. The purpose of overcommitting is to stake a claim in what you believe will be an intensely scarce commodity in the future. So you have buyers overcommit and you have sellers agreeing to overprovide as a result,” he explained. “What we find over and over is that commitments are among the first obligations to be cut off once conditions change, once confidence begins to fall.”

    Right now, there’s a stomach for those commitments. That isn’t guaranteed to be there in the future if all of these promised returns on investment don’t materialize. Atwater said that the market requires credit markets being willing to continue to extend massive sums of money to cover the agreements made, equity markets that value these transactions at “an extraordinary multiple,” and suppliers capable of delivering the promised products. There’s no guarantee that all of those factors will hold. 

    The math is already pretty tricky. As tech commentator Ed Zitron has pointed out, major firms like Microsoft, Meta, Tesla, Amazon, and Google have invested about $560 billion in AI infrastructure over the last two years. They’ve brought in a combined $35 billion in AI-related revenue. OpenAI’s commitments are even bigger, with returns that are arguably even smaller. 
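    Zitron’s figures imply a stark ratio; spelled out, using only the two totals quoted above:

```python
infrastructure_spend = 560e9  # big-tech AI capex over the last two years
ai_revenue = 35e9             # combined AI-related revenue over the same period
ratio = infrastructure_spend / ai_revenue
print(f"${ratio:.0f} invested for every $1 of AI revenue so far")
```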

    The company’s development and expansion of its services will rely in no small part on massive data center projects, which will require the same amount of energy to operate as New York City and San Diego combined—energy that currently isn’t even available. And, once again, there is no guarantee that the end product, once all of that energy is spent and data centers are built, will actually generate revenue.

    “Ultimately, if you do not have a consumer for the product, there will be no AI space because these companies can’t continue to do this for nothing. Listening to a lot of the calls in the last couple of weeks, there’s a clear open question as to how these companies are going to make money at this,” Atwater said.

    For the moment, everyone is seeing green, and hope springs eternal. As long as that is the case, no one will ask where the revenue is coming from. “Right now, the AI sector is operating in a forever mindset. They are acting as if they have a very long period of time under which they can figure this out and make money,” Atwater said. “As long as confidence is high, this entire ecosystem can offer fantasy. When confidence falls, they’re going to be expected to deliver real-term performance in a very short time frame.”

    Unfortunately, should that happen, it won’t just be these companies that bear the brunt of the failure. “You have to look at this as a larger ecosystem. To talk about AI today means we have to talk about the credit market. Wall Street and AI are a single beast,” Atwater said, warning that a very small number of firms currently have a major grasp on the whole of the American economy.

    Lots of investors are piling into the AI space, fearful of missing out on a market that seems like it can only go up. But few of them are looking at why those valuations and stock prices keep climbing, showing little curiosity as to what might happen if all of this money is just getting shifted around, artificially inflating the actual value of the companies they are betting on. 

    “‘Why?’,” Atwater said, “is the last question asked in a bull market.”


    AJ Dellinger


  • Nvidia’s AI empire: A look at its top startup investments | TechCrunch


    No company has capitalized on the AI revolution more dramatically than Nvidia. Its revenue, profitability, and cash reserves have skyrocketed since the introduction of ChatGPT over two years ago — and the many competitive generative AI services that have launched since. Its stock price has soared, making it a $4.5 trillion market cap company. 

    The world’s leading high-performance GPU maker has used its ballooning fortunes to significantly increase investments in startups, particularly in AI. 

    Nvidia has participated in 50 venture capital deals so far in 2025, already surpassing the 48 deals the company completed in all of 2024, according to PitchBook data. Note that these investments exclude those made by its formal corporate VC fund, NVentures, which also significantly increased its investment pace over that period. (PitchBook says NVentures engaged in 21 deals this year, compared to just one in 2022.)  

    Nvidia has stated that the goal of its corporate investing is to expand the AI ecosystem by backing startups it considers to be “game changers and market makers.”  

    Below is a list of startups that raised rounds exceeding $100 million since 2023 where Nvidia is a named participant, organized from the highest to lowest amount raised in the round. 

    This list shows just how far and wide Nvidia has spread its tentacles in the tech industry, beyond supplying its products. 

    The billion-dollar-round club

    OpenAI: Nvidia backed the ChatGPT maker for the first time in October 2024, reportedly writing a $100 million check as part of a colossal $6.6 billion round that valued the company at $157 billion. The chipmaker’s investment was dwarfed by OpenAI’s other backers, notably Thrive, which according to the New York Times invested $1.3 billion. While PitchBook data indicates Nvidia did not participate in OpenAI’s $40 billion funding round that closed in March, the chipmaker announced in September that it would invest up to $100 billion in the company over time, structured as a strategic partnership to deploy massive AI infrastructure. 

    xAI: In 2024, OpenAI tried to persuade its investors not to invest in any of its rivals. But Nvidia participated in the $6 billion round of Elon Musk’s xAI last December anyway. Nvidia will also invest up to $2 billion in the equity portion of xAI’s planned $20 billion funding round, Bloomberg reported, a deal structured to help xAI purchase more Nvidia gear. 

    Mistral AI: Nvidia invested in Mistral for the third time when the France-based large language model developer raised a €1.7 billion (about $2 billion) Series C at a €11.7 billion ($13.5 billion) post-money valuation in September.

    Reflection AI: In October, Nvidia led a $2 billion funding round for Reflection AI, a one-year-old startup, valuing the company at $8 billion. Reflection AI is positioning itself as a US-based competitor to Chinese DeepSeek, whose open-source large language model offers a less-expensive alternative to closed-source models from companies such as OpenAI and Anthropic. 

    Thinking Machines Lab: Nvidia was among a long list of investors who backed former OpenAI Chief Technology Officer Mira Murati’s Thinking Machines Lab’s $2 billion seed round. The funding, which was formally announced in July, valued the new AI startup at $12 billion. 

    Inflection: One of Nvidia’s first significant AI investments also had one of the more unusual (but increasingly common) outcomes. In June 2023, Nvidia was one of several lead investors in the $1.3 billion round for Inflection, a company co-founded by Mustafa Suleyman, the famed co-founder of DeepMind. Less than a year later, Microsoft hired Inflection’s founders, paying $620 million for a non-exclusive technology license and leaving the company with a significantly diminished workforce and a less defined future.

    Nscale: After the startup’s $1.1 billion round in September, Nvidia participated in Nscale’s $433 million SAFE funding in October, a deal structure (simple agreement for future equity) that secures future equity for investors. Nscale, which formed in 2023 after spinning out of Australian cryptocurrency mining company Arkon Energy, is building data centers in the UK and Norway for OpenAI’s Stargate project.

    Wayve: In May 2024, Nvidia participated in a $1.05 billion round for the U.K.-based startup, which is developing a self-learning system for autonomous driving. Nvidia is expected to invest an additional $500 million in Wayve, the startup told TechCrunch in September. Wayve is testing its vehicles in the U.K. and the San Francisco Bay Area.

    Figure AI: In September, Nvidia participated in Figure AI’s Series C funding round of over $1 billion, which valued the humanoid robotics startup at $39 billion. The chipmaker first invested in Figure in February 2024, when the company raised a $675 million Series B round at a $2.6 billion valuation.

    Scale AI: In May 2024, Nvidia joined Accel and other tech giants Amazon and Meta to invest $1 billion in Scale AI, which provides data-labeling services to companies for training AI models. The round valued the San Francisco-based company at nearly $14 billion. In June, Meta invested $14.3 billion for a 49% stake of Scale, and hired away the company’s co-founder and CEO Alexandr Wang, as well as several other key Scale employees. 

    The many-hundreds-of-millions-of-dollars club

    Commonwealth Fusion: The chipmaker participated in the nuclear fusion-energy startup’s $863 million funding round in August 2025. The deal, which also included investors like Google and Breakthrough Energy Ventures, valued the company at $3 billion.

    Crusoe: The startup, which is building data centers reportedly to be leased to Oracle, Microsoft, and OpenAI, raised $686 million in November 2024, according to an SEC filing. The investment was led by Founders Fund, and the long list of other investors included Nvidia.

    Cohere: The chipmaker has invested in enterprise large language model provider Cohere across multiple funding rounds, including the $500 million Series D, which closed in August, valuing Cohere at $6.8 billion. Nvidia first backed the Toronto-based startup in 2023. 

    Perplexity: Nvidia first invested in Perplexity in November 2023 and has participated in most of the subsequent funding rounds of the AI search engine startup, including the $500 million round closed in December 2024. The chipmaker participated in the company’s July funding round, which valued Perplexity at $18 billion. However, Nvidia did not join the startup’s subsequent $200 million fundraise in September, which boosted the company’s valuation to $20 billion, according to PitchBook data. 

    Poolside: In October 2024, the AI coding assistant startup Poolside announced it raised $500 million led by Bain Capital Ventures. Nvidia participated in the round, which valued the AI startup at $3 billion. 

    Lambda: AI cloud provider Lambda, which provides services for model training, raised a $480 million Series D at a reported $2.5 billion valuation in February. The round was co-led by SGW and Andra Capital, and joined by Nvidia, ARK Invest, and others. A significant part of Lambda’s business involves renting servers powered by Nvidia’s GPUs.

    CoreWeave: Although CoreWeave is now a public company rather than a startup, Nvidia invested in the GPU-cloud provider back when it was still one, in April 2023, when CoreWeave raised $221 million in funding. Nvidia remains a significant shareholder.

    Together AI: In February, Nvidia participated in the $305 million Series B of this company, which offers cloud-based infrastructure for building AI models. The round valued Together AI at $3.3 billion and was co-led by Prosperity7, a Saudi Arabian venture firm, and General Catalyst. Nvidia backed the company for the first time in 2023.  

    Firmus Technologies: In September, Firmus Technologies, the Singapore-based data center company, received A$330 million (approximately $215 million USD) in funding at a A$1.85 billion ($1.2 billion USD) valuation from investors, including Nvidia. Firmus is developing an energy-efficient ‘AI factory’ in Tasmania, an island state of Australia. The startup originally provided cooling technologies for Bitcoin mining. 

    Sakana AI: In September 2024, Nvidia invested in the Japan-based startup, which trains low-cost generative AI models using small datasets. The startup raised a massive Series A round of about $214 million at a valuation of $1.5 billion. 

    Nuro: In August, Nvidia participated in the $203 million funding round for the self-driving startup focused on delivery. The deal valued Nuro at $6 billion, a significant 30% drop from its peak $8.6 billion valuation in 2021.

    Imbue: The AI research lab that claims to be developing AI systems that can reason and code raised a $200 million round in September 2023 from investors, including Nvidia, Astera Institute, and former Cruise CEO Kyle Vogt. 

    Waabi: In June 2024, the autonomous trucking startup raised a $200 million Series B round co-led by existing investors Uber and Khosla Ventures. Other investors included Nvidia, Volvo Group Venture Capital, and Porsche Automobil Holding SE. 

    Deals of over $100 million

    Ayar Labs: In December, Nvidia invested in the $155 million round of Ayar Labs, a company developing optical interconnects to improve AI compute and power efficiency. This was the third time Nvidia backed the startup. 

    Kore.ai: The startup developing enterprise-focused AI chatbots raised $150 million in December of 2023. In addition to Nvidia, investors participating in the funding included FTV Capital, Vistara Growth, and Sweetwater Private Equity. 

    Sandbox AQ: In April, Nvidia, alongside Google, BNP Paribas, and others, invested $150 million in Sandbox AQ, a startup developing large quantitative models (LQMs) for handling complex numerical analysis and statistical calculations. The investment increased Sandbox AQ’s Series E round to $450 million and the company’s valuation to $5.75 billion. 

    Hippocratic AI: This startup, which is developing large language models for healthcare, announced in January that it raised a $141 million Series B at a valuation of $1.64 billion led by Kleiner Perkins. Nvidia participated in the round, along with returning investors Andreessen Horowitz, General Catalyst, and others. The company claims that its AI solutions can handle non-diagnostic patient-facing tasks such as pre-operating procedures, remote patient monitoring, and appointment preparation. 

    Weka: In May 2024, Nvidia invested in a $140 million round for AI-native data management platform Weka. The round valued the Silicon Valley company at $1.6 billion. 

    Runway: In April, Nvidia participated in Runway’s $308 million round, which was led by General Atlantic and valued the startup developing generative AI models for media production at $3.55 billion, according to PitchBook data. The chipmaker has been an investor in Runway since 2023.

    Bright Machines: In June 2024, Nvidia participated in a $126 million Series C of Bright Machines, a smart robotics and AI-driven software startup. 

    Enfabrica: In September 2023, Nvidia invested in networking chips designer Enfabrica’s $125 million Series B. Although the startup raised another $115 million in November, Nvidia didn’t participate in the round. 

    Reka AI: In July, the AI research lab Reka raised $110 million in a round that included Snowflake and Nvidia. The deal tripled the startup’s valuation to over $1 billion, according to Bloomberg.

    This post was first published in January 2025.

    Marina Temkin

  • Trump announces 130% tariffs on China. The global trade war just came roaring back

    (CNN) — President Donald Trump announced he will impose an additional 100% tariff on goods from China, on top of the 30% tariffs already in effect, starting November 1 or sooner. The threat is a massive escalation after months of a trade truce between the two nations.

    “The United States of America will impose a Tariff of 100% on China, over and above any Tariff that they are currently paying,” Trump said in a post on Truth Social Friday afternoon. “Also on November 1st, we will impose Export Controls on any and all critical software.”

    Trump’s announcement is tied to Beijing ramping up export controls on its critical rare earths, which are needed to produce many electronics. As a result, Trump appeared to call off a meeting with Chinese President Xi Jinping that was scheduled for later this month in South Korea.

    Trump’s initial message Friday, delivered via a Truth Social post in which he threatened “massive” new tariffs, was ill received by investors as fears set in of a spring déjà vu, when tariffs on Chinese goods soared to a stunning 145%. Markets closed sharply lower on Friday after Trump’s initial comments, with the Dow falling by 878 points, or 1.9%. The S&P 500 was down 2.7%, and the tech-heavy Nasdaq tumbled 3.5%.

    While Trump doesn’t always act on his threats, investors, consumers and businesses still have reason to worry.

    President Donald Trump is threatening to raise tariffs on Chinese goods shipped to the United States. Credit: Jessica Koscielniak / Reuters via CNN Newsource

    The two largest economies depend on each other

    The United States and China are the world’s two largest economies. Although Mexico has recently replaced China as the top source of foreign goods shipped to the United States, America depends on China for hundreds of billions of dollars’ worth of goods. Meanwhile, China is one of the top export markets for America.

    In particular, electronics, apparel and furniture are among the top goods the United States receives from China. Trump has pushed CEOs, especially in tech, to move production to the United States, but he’s softened his approach in recent months as business leaders have satisfied the president with announcements of hundreds of billions of dollars in investments in US manufacturing — even if they continue to make the bulk of their products overseas.

    Shortly after imposing minimum 145% tariffs on Chinese goods, an effective embargo on trade, Trump issued an exemption for electronics, making them subject to 20% tariffs instead. The move was, in many ways, an acknowledgment that the Trump administration understood the pain the president was inflicting on the US economy through his sky-high tariffs.

    Then, in May, US and Chinese officials further established the interdependence of trade by agreeing to lower tariffs on one another. China brought levies on American exports down to 10% from 125%, and the United States brought rates down to 30% from 145%.

    Both countries’ stock markets rallied as a result.

    It was only a matter of time

    Trump on Friday claimed trade hostility from China “came out of nowhere.” But in reality, it’s been bubbling up for months.

    For the United States, a critical part of trade agreements has been to ensure China will increase its supply of rare earth magnets. Yet despite several apparent breakthroughs, Trump has in recent months repeatedly accused China of violating the terms.

    Trump first responded by putting restrictions on sales of American technologies to China, including a key Nvidia AI chip. Many of these restrictions were later lifted.

    Then came the Trump administration’s announcement that it would soon impose fees on goods transported on Chinese-owned or -operated ships. China countered with a similar plan on American ships that took effect Friday.

    In short: Trump has already demonstrated there’s no limit to how high he’ll go with tariffs on China, and Xi has shown no mercy in how he chooses to retaliate.

    But Trump’s ability to continue to impose tariffs on a whim could soon end, pending the verdict in a landmark case kicking off in the Supreme Court next month. Xi, however, faces no such constraints.

    Elisabeth Buchwald and CNN

  • So, Is Intel Still Making Graphics Cards?

    There’s something strange going on at Intel. The company is looking to get leaner even as it builds up its chipmaking capabilities. The U.S. chip giant’s nascent venture into graphics cards did not go unappreciated by the PC-buying community, especially the budget-end Battlemage GPUs like the B580. Amid the hubbub of its big Panther Lake announcement, one thing seemed to be missing: a clear idea of its future plans for GPUs, with or without Nvidia’s aid.

    (Full disclosure: Intel invited me to its chipset fab in Phoenix, Ariz. Travel and lodging were paid by Intel, but Gizmodo did not guarantee any coverage as a condition of accepting the trip.)

    With the introduction of Panther Lake and updates to the XeSS upscaling software comes Intel’s new Xe3 graphics microarchitecture, which sits under the umbrella of the Arc B-Series (Battlemage). Intel said to expect better performance at lower wattages than its previous Arrow Lake H lineup and much better frame rates in games with the 12-Xe-core chip variants. But what about everything else? Intel said the next Arc family will be labeled Xe3P. No, not Xe4. Will it be a discrete GPU, aka the rumored “Celestial” or C-line of graphics cards?

    The next Xe3P graphics architecture will be a “significant architectural advancement” for Intel. It may or may not be a discrete GPU. © Intel

    Intel’s head of architecture, graphics, and software, Tom Petersen, told reporters in a roundtable Q&A that Panther Lake only hints at what the name implies. “Xe3P is a significant architectural advancement from where we are now,” he said. Whether that means it’s a whole family of products doesn’t much matter. It may still be called “Celestial,” however, more for the sake of continuity than anything.

    “Our naming is not great,” Petersen said. “If we knew then what we know now, we would name those things differently.”

    Even Intel doesn’t know what it will do with Nvidia

    Acer Panther Lake Swift 16 Laptop
    An Acer Swift 16 AI set to debut at the end of this year will include some variety of Panther Lake inside. © Kyle Barr / Gizmodo

    Intel has other things on its mind. The company needs you to know that its Fab 52 in Chandler, Ariz., which is producing the company’s new 18A process, is up and running. So much so that it strapped me and a host of other journalists and analysts into white bunny suits to inspect the place. Just to enter this temple to silicon, your body is wrapped head to toe in Gore-Tex waterproof layers, your eyes and feet are covered, and you start to blend in with everybody roving those floors. What can I tell you? Not much. How big is the fab floor where they make the chips, in square feet? “A shit ton,” or at least that’s what Intel spokesperson Thomas Hannaford was allowed to say. I couldn’t take pictures. I couldn’t tell you how big the lithography machines were. That would give some competitors an edge, or perhaps give the world an indication of how many chips Intel planned to ship, or so Intel claimed.

    As I stared up at the flying shuttle robots roving across the ceiling—looking like the two-pronged “Recognizer” vehicles out of a Tron movie—while they carried wafers to and fro along overhead rails, I could tell I was a resource for Intel’s mission statement. Fab 52 has been in the works since 2021. Since then, the person who started Intel on this mission for U.S. manufacturing, Pat Gelsinger, was pushed out as CEO, and the company went into a year-long spiral that culminated with President Donald Trump pushing the federal government to take a 10% stake in the company. Then, Nvidia came in with its Scrooge McDuck-sized moneybags ($5 billion, to be exact) to pump even more fuel into the chipmaker’s furnaces. Amid all the capital changing hands, Nvidia’s and Intel’s respective CEOs touted a new combo chip that would combine Team Blue’s CPU with Team Green’s GPU.

    Intel Fab Tour Panther Lake 2
    How big are both production floors of Fab 52? A “shit ton” of square feet. © Intel

    The fab is only as important as the chips made with it. And while I could sit here and wax lyrical about the company’s Clearwater Forest data center chips, the PC-buying public only cares about what’s going to end up in their desktop or laptop. Companies don’t like to talk about their futures, but from what Intel execs said last week, the company is still trying to figure out what the Nvidia partnership means.

    “It’s brand new,” Petersen said, referring to the still unknown chips it could make with Nvidia. “We don’t know all the answers to that. You’ll know more about that relatively soon. We’re still in the figure-it-all-out mode.”

    Kyle Barr

  • AI to Consume 12 Percent of Electricity, but There Are Caveats

    A new report paints a dire picture of the future in which electricity demand is surging and the transition to clean energy is still decades away.

    The global risk management provider DNV forecasts that global emissions will reach net zero only after 2090 and anticipates a temperature increase of roughly 2.2 degrees Celsius above preindustrial levels by 2100, though it cautions the increase could be higher. Furthermore, AI data centers are expected to consume about 12 percent of all electricity in North America as soon as 2040.

    “A casual observer might conclude that the energy transition is stalled or in reverse. That is most definitely not the case,” the report states. “Some aspects of the transition are supercharged and progressing rapidly, while other aspects of the transition have hit turbulence and are delayed.”

    There are, however, some caveats and reasons for optimism. DNV’s report notes, for example, that as soon as about 2060, carbon dioxide emissions are expected to fall by about 63 percent, with fossil fuels all but exiting the global energy mix. It also states that upheaval in U.S. policy will likely slow the clean energy transition, but not entirely derail it, largely because of China’s leadership in technological development and renewables buildout. 

    As for AI, the report anticipates data centers will account for an outsized chunk of electricity in North America, consuming some 16 percent overall, with that aforementioned 12 percent coming strictly from AI. Globally, data centers are expected to surge to consume some 5 percent of electricity by 2040, with 3 percent attributable specifically to AI. The report, however, anticipates that the initial exponential growth in power demand from AI will become linear over time, even as the “cognitive services” it provides grow exponentially. The energy efficiency of “leading [machine learning] hardware” has improved about 40 percent year-over-year, according to the report.
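    The flattening-demand argument can be sketched with a toy calculation (the numbers below are illustrative assumptions, not DNV’s model): power demand rises only as fast as workload growth outpaces efficiency gains, so 40-percent-a-year hardware improvements can hold demand flat even while delivered compute keeps growing.

    ```python
    # Illustrative sketch (assumed numbers, not from the DNV report): power demand
    # grows by the ratio of workload growth to efficiency improvement each year.

    def power_demand(years, workload_growth, efficiency_growth, base_power=1.0):
        """Relative power demand after `years`, given annual growth rates."""
        demand = base_power
        for _ in range(years):
            demand *= (1 + workload_growth) / (1 + efficiency_growth)
        return demand

    # Workload doubling yearly vs. 40% efficiency gains: demand still climbs,
    # to roughly 5.95x after five years.
    rising = power_demand(5, workload_growth=1.0, efficiency_growth=0.4)

    # Workload growth exactly matched by efficiency gains: demand stays flat
    # at 1.0, even though delivered compute has grown about fivefold.
    flat = power_demand(5, workload_growth=0.4, efficiency_growth=0.4)

    print(round(rising, 2), round(flat, 2))
    ```

    The crossover between those two regimes is one way to read the report’s claim that exponential power growth eventually turns linear.
    
    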

    During NYC Climate Week in late September, Nvidia’s head of sustainability Josh Parker joined panels to discuss AI and sustainability. On one, he argued it’s worth it to bring new energy online to fuel AI, if artificial intelligence is applied to accelerate innovation in sustainability, emissions reduction, and clean energy.

    “AI really can be—and will be if we use it properly—a fantastic solution to some of the biggest challenges that we’ve had in sustainability,” he said on a separate panel on the same subject. “AI is not only providing more performance per watt of energy, but it’s also more performance per liter of water. It’s more performance per ton of steel, more performance per chip, and across every metric you can think of.”

    Chloe Aiello

  • Even after Stargate, Oracle, Nvidia, and AMD, OpenAI has more big deals coming soon, Sam Altman says | TechCrunch

    At nearly the same moment as Nvidia CEO Jensen Huang was expressing surprise over OpenAI’s multibillion-dollar deal with competitor AMD — shortly after his company agreed to invest up to $100 billion into the AI model maker — Sam Altman was saying that more such deals are in the works.

    Huang appeared on CNBC’s Squawk Box on Wednesday. When asked if he knew about the AMD deal before it was announced, he answered, “Not really.”  

    As TechCrunch previously reported, OpenAI’s deal with AMD is unusual. AMD has agreed to grant OpenAI large tranches of AMD stock — up to 10% of the company over a period of years, contingent on factors like increases in stock price. In exchange, OpenAI will use and help develop the chipmaker’s next-generation AI GPUs. This makes OpenAI a shareholder in AMD.

    Nvidia’s deal is the reverse. Nvidia has invested in the AI model-making startup, making it a shareholder in OpenAI. 

    While OpenAI has been using Nvidia gear for years through cloud providers like Microsoft Azure, Oracle OCI, and CoreWeave, “This is the first time we’re going to sell directly to them,” Huang explained. He added that his company would still continue to supply gear to the cloud makers, too.

    These direct sales, which include AI gear beyond GPUs like systems and networking, are intended to “prepare” OpenAI for the day when it is its own “self-hosted hyperscaler,” Huang said. In other words, when it’s using its own data centers. 

    But Huang admits that OpenAI doesn’t “have the money yet” to pay for all of this gear. He estimated that each gigawatt of AI data center will cost OpenAI “$50 to $60 billion,” to cover everything from the land and power to the servers and equipment.   

    So far, in 2025, OpenAI has commissioned 10 gigawatts’ worth of U.S. facilities through its $500 billion Stargate deal with partners Oracle and SoftBank. (Plus, it penned a $300 billion cloud deal with Oracle.)

    Its partnership with Nvidia was for at least 10 gigawatts of AI data centers. Its partnership with AMD was for 6 gigawatts. Plus its “Stargate UK” partnership involves expanding data centers in the U.K., and it has other European commitments. By some estimates, OpenAI has this year inked $1 trillion worth of such deals.  
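    Those gigawatt figures can be roughly cross-checked against Huang’s $50-to-$60-billion-per-gigawatt estimate. This is a back-of-the-envelope sketch that assumes the deals don’t overlap (which likely overstates the total), not a definitive accounting:

    ```python
    # Disclosed gigawatt commitments from the article, priced at Huang's
    # estimate of $50-60 billion per gigawatt. Assumes no overlap between deals.
    commitments_gw = {
        "Stargate (Oracle/SoftBank)": 10,
        "Nvidia partnership": 10,
        "AMD partnership": 6,
    }

    total_gw = sum(commitments_gw.values())
    low, high = total_gw * 50, total_gw * 60  # in $ billions

    print(f"{total_gw} GW -> ${low}B to ${high}B")  # 26 GW -> $1300B to $1560B
    ```

    Even this crude tally lands in the same ballpark as the trillion-dollar estimates cited above.
    
    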

    Similar to the AMD deal, Nvidia’s deal has been criticized for being “circular,” Bloomberg reported. The critics say Nvidia is essentially underwriting OpenAI’s purchases, getting the AI startup’s stock for its efforts. 

    Altman to the world: Expect more

    As Huang was dissecting OpenAI’s infrastructure needs on CNBC, OpenAI CEO Sam Altman’s interview with Andreessen Horowitz’s a16z Podcast dropped.

    During the podcast, a16z co-founder Ben Horowitz told Altman that he’s “very impressed by deal structure improvement,” referring to these most recent deals. Andreessen Horowitz is an OpenAI investor, so it would be shocking if he wasn’t impressed. OpenAI has found a way to potentially obtain billions of dollars of equipment on someone else’s dime. Repeatedly. 

    When asked about these recent deals, Altman said, “You should expect much more from us in the coming months.” 

    Altman sees OpenAI’s future models and upcoming other products as so much more capable, thereby fueling so much more demand, that “we have decided that it is time to go make a very aggressive infrastructure bet,” he explained.  

    The problem is that OpenAI’s revenue today is nowhere near $1 trillion, though it is, by all accounts, growing rapidly, reportedly hitting $4.5 billion in the first half of 2025.

    Yet Altman obviously believes that eventually all of this investment will pay for itself. “I’ve never been more confident in the research road map in front of us and also the economic value that will come from using those [future] models.” 

    But, he said, OpenAI can’t get to all of that economic lushness on its own.

    “To make the bet at this scale, we kind of need the whole industry, or big chunk of the industry, to support it. And this is from the level of electrons to model distribution and all the stuff in between, which is a lot. So we’re going to partner with a lot of people,” Altman said, with more deals expected in the coming months.

    So stand by, tech industry. OpenAI is still wheeling and dealing.

    Julie Bort

  • Top analyst on concerns about Nvidia fueling an AI bubble: ‘We’ve seen this movie before. It was called Enron, Tyco’ | Fortune

    A top Wall Street analyst has sounded an alarm over the U.S. equity bull market, warning that its remarkable run is built on a precariously narrow foundation: a surge in spending on, and optimistic assumptions about, infrastructure for artificial intelligence (AI). This spending has fueled a boom in the shares of most of the so-called Magnificent 7 and a few dozen related businesses, which have now come to account for roughly 75% of the S&P 500’s returns since the rally of the last few years began.

    The commentary on September 29 by Morgan Stanley Wealth Management’s chief investment officer, Lisa Shalett, frames the current market boom as a “one-note narrative” almost entirely dependent on massive capital expenditures in generative AI, raising questions about its durability as economic and competitive risks start to mount. Shalett’s critique came squarely as some people in the AI field — and many financial commentators around Wall Street — were fretting at market exuberance and beginning to talk openly about a bubble.

    In an interview with Fortune, Shalett said she was “very concerned” about this theme in markets, saying her office’s view had broadened from a belief that the market would bid up only seven or 10 stocks to one encompassing roughly 40. “At the end of the day … this is not going to be pretty” if and when the generative AI capital expenditure story falters, she said.

    Shalett said she’s worried about a “Cisco moment” like when the dotcom bubble burst in 2000, referring to the networking company that was briefly the most valuable in the world before an 80% stock plunge. When asked how close we are to such a moment, Shalett said probably not in the next nine months, but very possibly in the next 24. Looking at the actual spending and the amount of capital coming into the space, “we’re a lot closer to the seventh inning than the first or second inning,” she said.

    ‘Starting to do what all ultimate bad actors do’

    Shalett’s comments centered on several recent multibillion-dollar deals to scale up data-center infrastructure. As notable substacker and former Atlantic writer Derek Thompson recently noted in a post titled “This is how the AI bubble will pop,” so much money is being spent to support AI’s energy-consumption needs that it’s the equivalent of a new Apollo space mission every 10 months. (Tech companies are spending roughly $400 billion this year alone on data-center infrastructure, while the Apollo program allocated about $300 billion in today’s dollars to get to the moon from the 1960s to the ’70s.)

    What’s more than a little concerning to Shalett is that one company alone, Nvidia—the most valuable company in the history of the world, with an over $4.5 trillion market cap—is at the center of a significant number of these deals. In September alone, Nvidia invested $100 billion in OpenAI in a massive deal, just days after pledging $5 billion to Intel (the Intel agreement was tied to chips, not data-center infrastructure, per se).

    Fortune‘s Jeremy Kahn reported in late September on significant concerns about “circular” financing, or Nvidia’s cash essentially being recycled throughout the AI industry. Shalett sees this as a major concern and a major sign that the business cycle is headed toward some kind of endgame. “The guy at the epicenter, Nvidia, is basically starting to do what all ultimate bad actors do in the final inning, which is extending financing, they’re buying their investors.”

    Shalett expanded on her concerns by saying that companies around Nvidia “are starting to become interwoven.” She noted that OpenAI is partially owned by Microsoft, but now Nvidia has also made an investment in the startup, while Oracle and AMD each have their own purchasing agreements with OpenAI. The data-center deal with tech giant Oracle comes with the “bad news,” Shalett notes, that it is “totally debt-financed.” OpenAI also struck a deal in October with chipmaker AMD that allows OpenAI to buy up to 10% of AMD. “Essentially, Nvidia’s main competitor is going to be partially owned by OpenAI, which is partially owned by Nvidia. So, Nvidia can ‘own’ a piece of its largest competitor. It is totally circular and increases systemic risk.”

    When reached for comment, a spokesperson for Nvidia said, “We do not require any of the companies we invest in to use Nvidia technology.”

    Nvidia CEO Jensen Huang discussed the OpenAI investment in an appearance on the Bg2 podcast with Brad Gerstner and Clark Tang on September 25, calling it an “opportunity to invest” and part of a partnership geared toward helping OpenAI build its own AI infrastructure. When asked about the allegation of circular financing in general and the Cisco precedent in particular, Huang talked about how OpenAI will fund the deal, arguing that it will have to be funded by OpenAI’s future revenues, or “offtake,” which he pointed out are “growing exponentially,” and by its future capital, whether raised through equity or debt. That will depend on investors’ confidence in OpenAI, he said, and beyond that, it’s “their company, it’s not my business. And of course, we have to stay very close to them to make sure that we build in support of their continued growth.”

    Shalett said that she and her team were “starting to watch” for signs of a bubble popping, highlighting the $300 billion deal OpenAI struck with Oracle roughly a week before its $100 billion data-center deal with Nvidia. Analysts at KeyBanc Capital Markets estimated that Oracle will have to borrow $100 billion of that amount—$25 billion a year for the next four years.

    “Every morning the opening screen on my Bloomberg is what’s going on with CDS spreads on Oracle debt,” Shalett said, referring to credit default swaps, the financial instrument that was obscure before the Great Financial Crisis but became infamous for the role it played in a global market meltdown. CDSs essentially serve as insurance for investors in case of insolvency by a market entity. “If people start getting worried about Oracle’s ability to pay,” Shalett said, “that’s gonna be an early indication to us that people are getting nervous.” She added that to her, all the indications point to the end of a cycle, and history is littered with cautionary tales from such times.

    Oracle did not respond to requests for comment.

    90% growth since the last bear market

    Since the October 2022 bear market bottom and the launch of ChatGPT, according to Shalett’s calculations, the S&P 500 has soared 90%, but most of these gains have come from a small group of stocks. The so-called “Magnificent Seven”—including high-profile names like Nvidia and Microsoft—plus another 34 AI data-center ecosystem companies are responsible for, as cited by Shalett and separately by JP Morgan Asset Management’s Michael Cembalest, about three-quarters of overall market returns, 80% of earnings growth, and a staggering 90% of capital spending growth in the index. By comparison, the other 493 names in the S&P 500 (the index minus the Magnificent Seven) are up just 25%—showing just how concentrated the rally has become.

    The so-called “hyperscaler” companies alone are now spending close to $400 billion annually on capex supporting AI infrastructure, Morgan Stanley Wealth Management calculated. The economic influence of AI capex is now immense, contributing an estimated 100 basis points—fully one percentage point—to second-quarter GDP growth, according to Morgan Stanley’s research. This pace outstrips the rate of underlying consumer spending growth by tenfold, underscoring its centrality to both market performance and broader economic data.

    “People conflate AI adoption, which is in the first inning, with the capex infrastructure buildout, which has been going full-out since 2022,” Shalett told Fortune. She cited concerns about the prominence of private equity and debt capital coming into play, as that “tends to produce bubbles, because it may be unspoken-for capacity.” In other words, people have money to burn and they’re throwing it at things that may not pay off.

    Shalett waved away macro theories about the labor market or the Federal Reserve. “We think that’s missing the forest for the trees because the forest is entirely rooted in this one story” about AI infrastructure. Morgan Stanley’s bull-case mid-2026 price target for the S&P 500 is an eye-popping 7,200, but Shalett highlights that even the most optimistic outlook admits that risk premiums, credit spreads, and market volatility do not seem to fully account for the vulnerabilities lurking beneath the AI-fueled advance.

    Shalett’s analysis suggests that AI capex maturity is approaching and some possible slowdowns are already visible. For instance, hyperscalers have already seen free-cash-flow growth turn negative, a sign that investment may have outpaced underlying technology returns. Strategas, an independent research firm, estimates that hyperscaler free cash flow is set to shrink by more than 16% over the next 12 months, putting pressure on lofty valuations and forcing investors to demand more discipline in how these funds are deployed.

    Shalett was asked about data centers’ disproportionate impact on GDP throughout 2025, which media blogger Rusty Foster of Today in Tabs described as: “Our economy might just be three AI data centers in a trench coat.” The Morgan Stanley exec said “That’s what makes this cycle so fragile,” adding that at some point, “we’re not gonna be building any data centers for a while.” After that, it’s just a question of whether you crash: “Do you have a mild 1991-92-style recession or does it really become bad?”

    A more bullish case

    Bank of America Research weighed in on the semiconductors sector in a Friday note, writing that vendor financing in the space, especially Nvidia’s $100 billion commitment to OpenAI, has been “raising eyebrows.” Nevertheless, the team, led by senior analyst Vivek Arya, argued that the deal is structured by performance and competitive need, rather than pure speculative frenzy.

    In an interview with Fortune, Arya explained why he wasn’t worried despite the “optics” being pretty obviously bad. “It’s very easy to say, ‘Oh, Nvidia is giving [OpenAI] money and they are buying chips with that money,’” and so on, but he argued the headlines are misleading about how much money is actually being spent, and the $100 billion sticker price on the OpenAI deal “scared everyone.” Noting that the deal has multiple tranches that will play out over several years, he said it’s not like Nvidia is “just handing a $100 billion check to OpenAI [and saying] you know, go have fun.”

    “Nvidia didn’t fund all of it,” Arya said of the wider generative AI capex boom. Citing public filings, Arya argued that Nvidia’s entire investment in the AI ecosystem is in fact less than $8 billion or so over the last 12 months, not such a large figure after all. And he’s still bullish on Nvidia and OpenAI, he added, because he sees them as the winners of this particular story. “We think they are going to be among the four or five ecosystems that come up. It’s not like Nvidia is going and investing in every one of those ecosystems, right? They’re only investing in one of those five, which is, of course, the most disruptive,” that being OpenAI.

    When asked about his own fears of a bubble, Arya sounded a calmer but strikingly similar tune to Shalett. “I’m extremely comfortable with what will happen in the next 12 months,” Arya said, “And I have high sense of optimism about what will happen in the next five years. But can there be periods of digestion in between? Yeah.” This, he explained, is the nature of any infrastructure cycle: “it’s not always up and to the right.” In other words, after the next nine months in Shalett’s opinion and the next year in Arya’s, the data-center buildout endgame could be in play. “When these data centers are built,” Arya said, “they are not built for today’s demand. They’re built with some anticipation of demand that will develop in the next, you know, 12 to 18 months. So, are they going to be 100% utilized all the time? No.”

    Rising worries about a bubble

    Some of the biggest names in tech and Wall Street were hedging hard about the possibility of a bubble on Friday. Goldman Sachs CEO David Solomon and Jeff Bezos, both speaking at a tech conference in Turin, Italy, said they were seeing the same patterns as Shalett. Solomon said the massive amounts of spending weren’t fundamentally different from other booms and busts. “There will be a lot of capital that was deployed that didn’t deliver returns,” he said. That’s no different from how investment works. “We just don’t know how that will play out.”

    Bezos characterized it as “kind of an industrial bubble,” arguing that the infrastructure would pay off for many years to come.

    OpenAI CEO Sam Altman, who got markets jittery in late August when he mentioned the B-word, was asked again to comment on the subject while touring (what else?) a giant new data center in Texas. “Between the 10 years we’ve already been operating and the many decades ahead of us, there will be booms and busts,” Altman said. “People will overinvest and lose money, and underinvest and lose a lot of revenue.”

    For his part, Cisco CEO John Chambers, one of the faces of the dotcom bubble, told the Associated Press on October 3 that he sees “a lot of tremendous optimism” about AI that is similar to the “irrational exuberance on a really large scale” that marked the internet age. It indicates a bubble to him, but only “a future bubble for certain companies. Is there going to be a train wreck? Yes, for those that aren’t able to translate the technology into a sustainable competitive advantage, how are you going to generate revenue after all the money you poured into it?”

    When asked whether the size of this potential bubble represents uncharted waters for the economy, especially considering the one-note nature of the long bull market, Shalett said Wall Streeters are always evaluating risk. But putting on her “American citizen hat,” she warned about the media consolidation that sees Oracle’s founder Larry Ellison also now playing a major role in TikTok (as part of a buying consortium of Trump-friendly billionaires) and Paramount in Hollywood and CBS News in New York (through his son, David Ellison, the media company’s new owner). Shalett said she’s worried about “groupthink” filtering into the functioning of markets. “That is not something that most of us have experienced in our lifetimes,” she said. “You stop factoring in risk premiums into markets, there is no bear case to anything.”

    [ad_2]

    Nick Lichtenberg

    Source link

  • AMD Inks Huge Compute Power Deal With OpenAI, Mirroring Nvidia’s Move

    [ad_1]

    OpenAI’s Sam Altman and AMD’s Lisa Su testify before the Senate on May 08, 2025 in Washington, DC. Photo by Chip Somodevilla/Getty Images

    Nvidia may be dominating the graphics processing unit (GPU) market right now, but its closest rival, AMD, is catching up. Today (Oct. 6), AMD announced a landmark collaboration with OpenAI that mirrors a recent deal between OpenAI and Nvidia. Under the agreement, AMD will deploy six gigawatts of computing power to OpenAI, which will in turn have the option to acquire up to 10 percent of AMD’s stock—a stake worth roughly $33 billion after the announcement sent AMD shares soaring 24 percent.

    The partnership gives OpenAI a critical boost in computing resources as it continues to roll out new A.I. models and tools. “This partnership is a major step in building the compute capacity needed to realize A.I.’s full potential,” OpenAI CEO Sam Altman said in a statement.

    OpenAI’s first one-gigawatt deployment is scheduled for the second half of 2026 and will use AMD’s MI450 chips. This initial rollout will coincide with a vesting schedule of AMD stock for OpenAI, allowing OpenAI to acquire up to 160 million shares as deployments scale to six gigawatts. The stock grant will vest based on OpenAI hitting technical and commercial milestones, and the full grant vests only if AMD’s stock reaches $600 per share. AMD shares currently trade at $204 apiece.
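    As a back-of-envelope check on those figures, the numbers cited above (160 million shares, a roughly $204 share price, and the $600 full-vesting level) line up with the ~$33 billion stake valuation reported at the announcement. A minimal sketch, using only the article’s figures and ignoring the milestone-based tranche structure:

    ```python
    # Back-of-envelope math on the AMD warrant figures cited above.
    # These numbers come from the article; the real vesting schedule is
    # milestone-based and is not modeled here.
    SHARES = 160_000_000          # maximum shares OpenAI can acquire
    PRICE_NOW = 204               # approximate AMD share price at announcement
    PRICE_FULL_VESTING = 600      # share price the full grant reportedly requires

    stake_now = SHARES * PRICE_NOW                 # value at today's price
    stake_full = SHARES * PRICE_FULL_VESTING       # value if AMD reaches $600

    print(f"Stake at current price: ${stake_now / 1e9:.1f}B")   # ~$32.6B, close to the ~$33B cited
    print(f"Stake at $600/share:    ${stake_full / 1e9:.1f}B")
    ```

    At $204 the stake is worth about $32.6 billion, matching the article’s “roughly $33 billion”; full vesting at $600 would nearly triple that.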

    The AMD partnership is the latest in a string of blockbuster A.I. deals. Nvidia recently announced its own long-term pact with OpenAI, pledging up to $100 billion in investments over the next decade. In return, OpenAI will obtain as much as 10 gigawatts of computing power from Nvidia’s systems.

    Global venture capital funding rose 38 percent year-over-year to $97 billion in the third quarter, according to Crunchbase, with nearly half of that money flowing into A.I. ventures. Analysts say the current boom evokes the early days of the internet.

    “We still believe we are in the early innings of this spending cycle,” said Dan Ives, an analyst with Wedbush Securities, in a client note. AMD’s new deal with OpenAI marks a “1996 moment” for the tech world, he added, likening today’s A.I. momentum to the foundational years of the tech economy.

    Nvidia’s shares slipped more than 1 percent today following AMD’s announcement, but the company still holds a commanding lead with more than 90 percent of the global GPU market. Nvidia’s early success in meeting A.I.-fueled GPU demand has propelled its market cap to $4.5 trillion and fueled $41 billion in data center revenue between May and July. AMD, in comparison, has a market cap of $334 billion and brought in $3.2 billion in data center revenue in its most recent quarter.

    Lisa Su, who has led AMD as CEO since 2014, is confident that the OpenAI deal will accelerate that growth. Her company has a “clear line of sight” to achieve tens of billions of dollars in data center revenue by 2027, Su told analysts today, adding that these numbers could grow even higher. “In addition to the OpenAI opportunity, and the very significant revenue addition there, we expect to generate well over $100 billion in the next several years,” she said.

    [ad_2]

    Alexandra Tremayne-Pengelly

    Source link

  • The billion-dollar infrastructure deals powering the AI boom | TechCrunch

    [ad_1]

    It takes a lot of computing power to run an AI product — and as the tech industry races to tap the power of AI models, there’s a parallel race underway to build the infrastructure that will power them. On a recent earnings call, Nvidia CEO Jensen Huang estimated that between $3 trillion and $4 trillion will be spent on AI infrastructure by the end of the decade — with much of that money coming from AI companies. Along the way, they’re placing immense strain on power grids and pushing the industry’s building capacity to its limit.

    Below, we’ve laid out everything we know about the biggest AI infrastructure projects, including major spending from Meta, Oracle, Microsoft, Google, and OpenAI. We’ll keep it updated as the boom continues and the numbers climb even higher.

    Microsoft’s $1 billion investment in OpenAI

    This is arguably the deal that kicked off the whole contemporary AI boom: In 2019, Microsoft made a $1 billion investment in a buzzy non-profit called OpenAI, known mostly for its association with Elon Musk. Crucially, the deal made Microsoft the exclusive cloud provider for OpenAI — and as the demands of model training became more intense, more of Microsoft’s investment started to come in the form of Azure cloud credit rather than cash.

    It was a great deal for both sides: Microsoft was able to claim more Azure sales, and OpenAI got more money for its biggest single expense. In the years that followed, Microsoft would build its investment up to nearly $14 billion — a move that is set to pay off enormously when OpenAI converts into a for-profit company.

    The partnership between the two companies has unwound more recently. In January, OpenAI announced it would no longer be using Microsoft’s cloud exclusively, instead giving the company a right of first refusal on future infrastructure demands but pursuing others if Azure couldn’t meet their needs. More recently, Microsoft began exploring other foundation models to power its AI products, establishing even more independence from the AI giant.

    OpenAI’s arrangement with Microsoft was so successful that it’s become a common practice for AI services to sign on with a particular cloud provider. Anthropic has received $8 billion in investment from Amazon, while making kernel-level modifications on the company’s hardware to make it better suited for AI training. Google Cloud has also signed on smaller AI companies like Lovable and Windsurf as “primary computing partners,” although those deals did not involve any investment. And even OpenAI has gone back to the well, receiving a $100 billion investment from Nvidia in September, giving it capacity to buy even more of the company’s GPUs.

    The rise of Oracle

    On June 30, 2025, Oracle revealed in an SEC filing that it had signed a $30 billion cloud services deal with an unnamed partner; this is more than the company’s cloud revenues for all of the previous fiscal year. OpenAI was eventually revealed as the partner, securing Oracle a spot alongside Google as one of OpenAI’s string of post-Microsoft hosting partners. Unsurprisingly, the company’s stock went shooting up.

    A few months later, it happened again. On September 10, Oracle revealed a five-year, $300 billion deal for compute power, set to begin in 2027. Oracle’s stock climbed even higher, briefly making founder Larry Ellison the richest man in the world. The sheer scale of the deal is stunning: OpenAI does not have $300 billion to spend, so the figure presumes immense growth for both companies, and more than a little faith.

    But before a single dollar is spent, the deal has already cemented Oracle as one of the leading AI infrastructure providers — and a financial force to be reckoned with.

    Building tomorrow’s hyperscale data centers

    For companies like Meta that already have significant legacy infrastructure, the story is more complicated — although equally expensive. Mark Zuckerberg has said that Meta plans to spend $600 billion on U.S. infrastructure through the end of 2028.

    In just the first half of 2025, the company spent $30 billion more than the previous year, driven largely by the company’s growing AI ambitions. Some of that spending goes toward big ticket cloud contracts, like a recent $10 billion deal with Google Cloud, but even more resources are being poured into two massive new data centers.

    A new 2,250-acre site in Louisiana, dubbed Hyperion, will cost an estimated $10 billion to build out and provide an estimated 5 gigawatts of compute power. Notably, the site includes an arrangement with a local nuclear power plant to handle the increased energy load. A smaller site in Ohio, called Prometheus, is expected to come online in 2026, powered by natural gas. 

    That kind of buildout comes with real environmental costs. Elon Musk’s xAI built its own hybrid data center and power-generation plant in South Memphis, Tennessee. The plant has quickly become one of the county’s largest emitters of smog-producing chemicals, thanks to a string of natural gas turbines that experts say violate the Clean Air Act.

    The Stargate moonshot

    Just two days after his second inauguration, President Trump announced a joint venture between SoftBank, OpenAI, and Oracle, meant to spend $500 billion building AI infrastructure in the United States. Named “Stargate” after the 1994 film, the project arrived with incredible amounts of hype, with Trump calling it “the largest AI infrastructure project in history.” Sam Altman seemed to agree, saying, “I think this will be the most important project of this era.”

    In broad strokes, the plan was for SoftBank to provide the funding, with Oracle handling the buildout with input from OpenAI. Overseeing it all was Trump, who promised to clear away any regulatory hurdles that might slow down the build. But there were doubts from the beginning, including from Elon Musk, Altman’s business rival, who claimed the project did not have the available funds.

    As the hype has died down, the project has lost some momentum. In August, Bloomberg reported that the partners were failing to reach consensus. Nonetheless, the project has moved forward with the construction of eight data centers in Abilene, Texas, with construction on the final building set to be finished by the end of 2026.

    This article was first published on September 22.

    [ad_2]

    Russell Brandom

    Source link

  • What’s behind the massive AI data center headlines? | TechCrunch

    [ad_1]

    Silicon Valley flooded the news this week with headlines about wild AI infrastructure investments.

    Nvidia said it would invest up to $100 billion in OpenAI. Then OpenAI said it would build out five more Stargate AI data centers with Oracle and SoftBank, adding gigawatts of new capacity online in the coming years. And it was later revealed that Oracle sold $18 billion in bonds to pay for these data centers.

    On their own, each deal is dizzying in scale. But in aggregate, we see how Silicon Valley is moving heaven and earth to give OpenAI enough power to train and serve future versions of ChatGPT.

    This week on Equity, Anthony Ha and I (Max Zeff) go beyond the headlines to break down what’s really going on in these AI infrastructure deals.

    Rather conveniently, OpenAI also gave the world a glimpse this week of a power-intensive feature it could serve more broadly if it had access to more AI data centers.

    The company launched Pulse — a new feature in ChatGPT that works overnight to deliver personalized morning briefings for users. The experience feels similar to a news app or a social feed — something you check first thing in the morning — but doesn’t have posts from other users or ads (yet).

    Pulse is part of a new class of OpenAI products that work independently, even when users aren’t in the ChatGPT app. The company would like to deliver a lot more of these features and roll them out to free users, but they’re limited by the number of computer servers available to them. OpenAI said it can only offer Pulse to its $200-a-month Pro subscribers right now due to capacity constraints.

    The real question is whether features like Pulse are worth the hundreds of billions of dollars being invested in AI data centers to support OpenAI. The feature looks cool and all, but that’s a tall order.

    Watch the full episode to hear more about the massive AI infrastructure investments reshaping Silicon Valley, TikTok’s ownership saga, and the policy changes affecting tech’s biggest players.

    [ad_2]

    Maxwell Zeff

    Source link

  • Nvidia teams up with Intel in $5B deal to shape AI future – MoneySense

    [ad_1]

    Nvidia CEO Jensen Huang called it “a fusion of two world-class platforms” that combines Intel’s strength in making conventional computer chips, known as CPUs, that power most laptops, with Nvidia’s focus on the specialized graphics chips that are critical for artificial intelligence. “This partnership is a recognition that computing has fundamentally changed,” Huang told reporters Thursday. “The era of accelerated and AI computing has arrived.”

    Intel shares jumped nearly 23%, its biggest one-day percentage gain since 1987. Nvidia shares added more than 3%.

    Nvidia deal and U.S. backing give Intel a much-needed boost

    For data centres, Intel will make custom chips that Nvidia will use in its AI infrastructure platforms. For personal computer products, Intel will build chips that integrate Nvidia technology.

    The agreement provides a lifeline for Intel, which was a Silicon Valley pioneer that enjoyed decades of growth as its processors powered the personal computer boom, but fell into a slump after missing the shift to the mobile computing era unleashed by the iPhone’s 2007 debut. Intel fell even farther behind in recent years amid the AI boom that’s propelled Nvidia into the world’s most valuable company. Intel lost nearly $19 billion last year and another $3.7 billion in the first six months of this year, and expects to slash its workforce by a quarter by the end of 2025.

    U.S. President Donald Trump’s administration stepped in last month to secure a 10% stake—433.3 million shares of non-voting stock priced at $20.47 apiece—making it one of Intel’s biggest shareholders. Federal officials said they invested in Intel in order to bolster U.S. technology and domestic manufacturing. Of Nvidia’s own Intel stake, Huang said “the Trump administration had no involvement in this partnership at all,” though it “would have been very supportive, of course.”

    Intel’s stock price surge Thursday pushed the total value of the U.S. government’s stake in Intel to $13.2 billion, a $2.5 billion increase from before Nvidia’s announcement.

    Canada’s best dividend stocks

    Nvidia–Intel pact a “game-changer” for U.S. tech

    Huang said Nvidia has been in talks with Intel for about a year. Intel CEO Lip-Bu Tan, who joined the press call with Huang on Thursday, said he’s been talking to Nvidia since he was named Intel’s new leader in March. “This is a very big, important milestone,” Tan said. “I call it a game-changing opportunity that we can work together.”

    The deal is “bullish for U.S. tech,” Wedbush Securities analyst Daniel Ives said in a client note. Ives said it brings Intel “front and center into the AI game” and, combined with the U.S. government stake, adds to “a golden few weeks for Intel after years of pain and frustration for investors.”

    Nvidia, meanwhile, has soared because its specialized chips are underpinning the AI boom. The chips, known as graphics processing units, or GPUs, are highly effective at developing powerful AI systems.

    Left out of the celebration Thursday was another U.S. chipmaking rival, Advanced Micro Devices. Shares in the leading maker of both GPUs and CPUs dropped slightly Thursday. AMD, Intel, and Nvidia are all headquartered in Santa Clara, California.

    Chip rivalry intensifies as China boosts Huawei and bans Nvidia

    The deal between Nvidia and Intel comes as China moves to be less dependent on U.S. semiconductor technology. This week, Chinese officials reportedly forbade several large domestic technology companies from purchasing Nvidia chips, and China-based Huawei announced that it was expanding its development of AI chips and manufacturing.

    While Nvidia and Intel will work together to develop new chips, a manufacturing deal has yet to be struck between the two. Nvidia’s potential access to Intel’s chip foundries poses a risk to Taiwan Semiconductor Manufacturing Company, which currently manufactures the tech giant’s flagship processors. Huang emphasized Thursday that both his company and Intel remain “very successful customers” of TSMC. Huang has been in Britain on a visit that coincides with Trump’s trip to the country, and he has been attending events with the president along with other Silicon Valley bigwigs.

    At a signing ceremony for a trans-Atlantic tech partnership on Thursday with British Prime Minister Keir Starmer, Trump mused that AI was “taking over the world.” “I’m looking at you guys. You’re taking over the world, Jensen,” Trump said. Huang and Trump also both attended a royal banquet, prompting the tech mogul to dish about the Windsor Castle event to Intel’s CEO in the seconds before their press event. “The cognac was excellent, but just not enough of it,” Huang told Tan. “I guess the cognac was from 1912.”


    [ad_2]

    The Associated Press

    Source link

  • Markets are selling off after Powell said six words investors don’t want to hear: ‘Equity prices are fairly highly valued’ | Fortune

    [ad_1]

    • Markets fell after Fed Chairman Jerome Powell warned that stocks are “highly valued.” U.S. stocks dropped, with tech leading losses on skepticism over Nvidia’s $100 billion OpenAI deal. Europe and U.K. markets opened lower.

    U.S. Federal Reserve Chairman Jerome Powell gave a speech in Rhode Island yesterday and, afterwards, was asked whether the Fed was keeping an eye on the markets. His reply contained six words that investors didn’t want to hear: “Equity prices are fairly highly valued.”

    The S&P 500 lost 0.55% on the day. Markets in the U.K. and Europe are all down this morning. The picture is mixed: Asia largely had a good day and U.S. futures are marginally up, so it’s not a tsunami.

    Powell’s remarks weren’t controversial. 

    Everyone knows that most major indexes have hit record highs this year. But it is clear that investors are wary of any sign that the Fed thinks “irrational exuberance”—as former Fed chair Alan Greenspan once called it—has kicked in. That would be a point at which the Fed could be expected to start raising interest rates in order to pierce an economic bubble. And that would be bad for stocks.

    Powell said: “We do look at overall financial conditions, and we ask ourselves whether our policies are affecting financial conditions in a way that is what we’re trying to achieve … But you’re right, by many measures, for example, equity prices are fairly highly valued.”

    UBS’s Paul Donovan interpreted it this way: “Powell apparently just wants investors’ confidence to be somewhat less certain.”

    One thing they are not confident about is tech stocks. The Nasdaq Composite lost nearly a full percentage point yesterday as traders expressed skepticism over Nvidia’s $100 billion investment in OpenAI. “There were as many questions as answers” about the deal, according to a note from Jim Reid and the team at Deutsche Bank this morning. A number of analysts are questioning how sustainable the AI boom is. Nasdaq futures are up this morning, premarket, however.

    Why are futures rising when the underlying indexes lost ground yesterday? Because the broad thrust of Powell’s speech contained worries about the softening labor market—which implies the Fed will stay on its rate-cutting path in the near-term.

    Here’s a snapshot of the markets ahead of the opening bell in New York this morning:

    • S&P 500 futures were up 0.17% this morning. The index closed down 0.55% in its last session.
    • STOXX Europe 600 was down 0.28% in early trading. 
    • The U.K.’s FTSE 100 was down 0.12% in early trading.
    • Japan’s Nikkei 225 was up 0.3%.
    • China’s CSI 300 was up 1.02%.
    • The South Korea KOSPI was down 0.4%.
    • India’s Nifty 50 was down 0.22% before the end of the session.
    • Bitcoin declined to $112.5K.

    [ad_2]

    Jim Edwards

    Source link

  • Build-A-Bear Workshop Outpaces Nvidia, Microsoft, Oracle | Entrepreneur

    [ad_1]

    Nvidia may be the most valuable company in the world, surging to a record-high $4.395 trillion market capitalization over the past few months, but when it comes to stock growth, one surprising company has it beat: Build-A-Bear Workshop.

    Build-A-Bear’s stock grew by more than 2,000% over the past five years, making it one of the top 20 companies in the world by share growth, per The Washington Post. Company shares are up over 60% year-to-date at the time of writing. According to Build-A-Bear’s earnings report for the second quarter ending August 2, total revenue hit $124.2 million, an 11% increase from the same period last year. It was the company’s most profitable second quarter in its history.

    Build-A-Bear’s stock growth beats the world’s biggest tech giants, such as Nvidia (surged by over 1,300% in the past five years, with shares up over 30% year-to-date); Microsoft (stock grew by 147% across the past five years); and Oracle (stock swelled 444% across the same time period).

    Related: How Labubu Outsold Barbie and Hot Wheels — and Will Help Parent Company Pop Mart Earn $4 Billion This Year

    At Build-A-Bear, customers stuff a plush toy, add a toy heart, and dress the stuffed animal. The company was founded in October 1997 in Saint Louis, Missouri, and the experience in stores has remained consistent since its founding.

    The company’s CEO, Sharon Price John, who took over in 2013, told CNBC that the process of making a bear is “a really emotional, memorable experience that creates a tremendous amount of equity.” The store’s in-person experience contributes to its resilience, even as other mall stores like Claire’s close hundreds of locations.

    Build-A-Bear Workshop in Denver, Colorado. Photo by Joe Amon/The Denver Post via Getty Images

    “Those strong feelings that consumers have for brands are very stretchable beyond just that one experience,” John told the outlet.

    University of Pennsylvania Marketing Professor Americus Reed told CNBC that the “ritualistic” process of creating a stuffed animal at Build-A-Bear creates a memorable experience that is “really hard to replicate.” Build-A-Bear creates a deeper connection with its customers, building a sense of loyalty, Reed explained.

    Related: The Lego Resale Market Is Reportedly Thriving — And Some Sets Can Fetch Over $15,000

    Zach Wray, a customer whose family has hundreds of bears, told The Washington Post that the experience of creating a stuffed animal is what keeps his kids coming back to Build-A-Bear.

    “They make it really special for the kids,” Wray told the outlet.

    Nostalgia also plays a role in the company’s growth. A survey released by Build-A-Bear earlier this month found that 92% of adults still have their childhood stuffed animal, and nearly 100% say that teddy bears are for all ages. Forty percent of Build-A-Bear’s customers are adults, not kids, according to The Washington Post.

    Build-A-Bear has 627 stores across 32 countries, 100 of which opened within the past two years. The company told The Washington Post that it plans to open 60 more locations this year, and that almost all of its stores in North America were profitable.

    Related: This Mom’s Side Hustle Selling a $600 Children’s Toy Became a Business Making Over $1 Million a Year: ‘There Is a Lot to Love’

    Sherin Shibu

  • NVIDIA is investing up to $100 billion in OpenAI to build 10 gigawatts of AI data centers

    NVIDIA will invest up to $100 billion in OpenAI as the ChatGPT maker sets out to build at least 10 gigawatts of AI data centers using NVIDIA chips and systems. The strategic partnership is gargantuan in scale. The 10-gigawatt buildout will require millions of NVIDIA GPUs to run OpenAI’s next-generation models. NVIDIA’s investment will be doled out progressively as each gigawatt comes online.

    The first phase of this plan is expected to come online in the second half of 2026 and will be built on NVIDIA’s Vera Rubin platform, which NVIDIA CEO Jensen Huang said will be a “big, big, huge step up” over the current-gen Blackwell chips.

    “NVIDIA and OpenAI have pushed each other for a decade, from the first DGX supercomputer to the breakthrough of ChatGPT,” Jensen Huang said in a statement announcing the letter of intent for the partnership. “Compute infrastructure will be the basis for the economy of the future, and we will utilize what we’re building with NVIDIA to both create new AI breakthroughs and empower people and businesses with them at scale,” said Sam Altman, CEO of OpenAI.

    NVIDIA has made a number of strategic investments lately, including in Intel, shortly after the U.S. government took a 10 percent stake in the American chipmaker. The company also recently agreed to license AI technology from startup Enfabrica and to hire its CEO and other key employees.

    OpenAI has also formed other strategic partnerships over the last few years, including a famously complicated relationship with Microsoft. This summer it struck a deal with Oracle to build out 4.5 gigawatts of data center capacity using more than 2 million chips. That deal was part of Stargate, the strategic partnership between SoftBank, OpenAI, NVIDIA, Oracle, Arm and Microsoft with a promise to spend $500 billion in the US on AI infrastructure.

    Andre Revilla

  • Nvidia to invest $100 Billion in OpenAI for data centers

    Nvidia Corp. will invest as much as $100 billion in OpenAI to support the construction of new data centers and other artificial intelligence infrastructure, a blockbuster deal that underscores booming demand for AI tools like ChatGPT and the computing power to make them run. The investment is intended to help OpenAI build data centers with a […]

    Bloomberg News
