AI Chip Contenders Face Daunting 'Moats'
Published on November 28, 2023 at 10:50PM
Barriers to entry in an industry dominated by TSMC and Nvidia are very high. From a report: In the drama that has just played out in Silicon Valley over the future of OpenAI, one side plot concerned an ambitious chip venture by its chief executive, Sam Altman. Before he was ousted from, and then reinstated to, the helm of the company, Altman had sought to raise as much as $100bn from investors in the Middle East and SoftBank founder Masayoshi Son to build a rival to sector giants Nvidia and Taiwan Semiconductor Manufacturing Co. This would be a vast undertaking, and one where $100bn may not go very far.

Given that the US chip designer and the Taiwanese chipmaker are critical to all things generative AI, Altman is unlikely to be the only one with hopes of taking them on. But the barriers to entry -- moats, in Silicon Valley parlance -- are formidable. Nvidia has about 95 per cent of the market for GPUs, or graphics processing units. These computer processors were originally designed for graphics but have become increasingly important in areas such as machine learning. TSMC has about 90 per cent of the world's advanced chip market. These businesses are lucrative: TSMC runs on gross margins of nearly 60 per cent, Nvidia on 74 per cent, and TSMC makes $76bn in sales a year. The impressive figures make it seem as though there is plenty of room for more contenders. A global shortage of Nvidia's AI chips makes the prospect of vertical integration yet more attractive: as the number of GPUs required to develop and train advanced AI models grows rapidly, the key to profitability for AI companies lies in having stable access to GPUs. [...]

It is one thing for companies to design customised chips. But Nvidia's profitability comes not from making chips cost-efficient but from providing a one-stop solution for a wide range of tasks and industries. For example, Nvidia's HGX H100 systems, which can go for about $300,000 each, are used to accelerate workloads for everything from financial applications to analytics. Coming up with a viable rival for the HGX H100 system, which is made up of 35,000 parts, would take much more than just designing a new chip. Nvidia has been developing GPUs for more than two decades, and that head start, which includes hardware and related software libraries, is protected by thousands of patents. Even setting aside the challenges of designing a new AI chip, manufacturing is where the real challenge lies.
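To put the quoted figures in concrete terms, here is a minimal back-of-the-envelope sketch in Python (not from the report); it only multiplies the reported revenue by the reported gross margin, and the helper name implied_gross_profit is illustrative rather than anything cited above:

# Back-of-the-envelope sketch: gross profit = revenue x gross margin.
# Uses only the figures cited in the summary (TSMC: ~$76bn sales, ~60% gross margin).
def implied_gross_profit(revenue_bn: float, gross_margin: float) -> float:
    return revenue_bn * gross_margin

print(f"TSMC implied gross profit: ~${implied_gross_profit(76, 0.60):.0f}bn a year")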
Read more of this story at Slashdot.