New generation of chips will drive the AI wave

High bandwidth memory is in rapidly rising demand to facilitate lightning-fast data transfers

{"text":[[{"start":11.6,"text":"Some of you will remember the days when downloading a movie meant a wait of over an hour. "},{"start":16.154,"text":"Today, the latest chips can transfer data equivalent to more than 160 full-HD movies in under one second. "}],[{"start":23.439999999999998,"text":"The artificial intelligence chip industry has developed a newfound appreciation for high bandwidth memory, or HBM, the technology behind such lightning-fast data transfers. "},{"start":33.132,"text":"Analysts had once deemed it unlikely to ever become commercially viable when it was first launched in 2013. "}],[{"start":39.86,"text":"But US chip designers Nvidia and AMD are breathing new life into the advanced technology that has now become a critical component in all AI chips. "},{"start":47.952,"text":"The most pressing issue for AI chipmakers is the ever-growing demand for more processing power and bandwidth requirements as companies rush to expand data centres and develop AI systems such as large language models. "}],[{"start":60.05,"text":"Meanwhile, data-heavy generative AI applications are pushing the performance limits of what conventional memory chips can offer. "},{"start":67.167,"text":"Faster data processing speeds and transfer rates require a larger number of chips that take up more physical space and consume more power. "}],[{"start":75.42,"text":"Until now, the traditional set-up has been the placing of chips side by side on a flat surface which are then connected with wiring and fuses. "},{"start":83.037,"text":"More chips mean slower communication through them and higher power consumption. "}],[{"start":87.53,"text":"HBM upends decades of chip industry convention by stacking multiple layers of chips on top of each other and uses cutting edge components, including a tiny circuit board thinner than a piece of paper, to pack chips much closer together in a three-dimensional shape. "}],[{"start":101.86,"text":"This improvement is critical to AI chipmakers as proximity between chips uses less energy — HBM uses about three-quarters less than traditional structures. "},{"start":110.727,"text":"Research has also shown HBM also provides as much as five times higher bandwidth and takes up less space — less than half the size of current offerings. "}],[{"start":119.92,"text":"While the technology is advanced, it is not new. "},{"start":123.174,"text":"AMD and South Korean chipmaker SK Hynix started working on HBM 15 years ago, when high performance chips were mostly used in the gaming sector. "}],[{"start":132.72,"text":"Critics at the time were sceptical that the performance boost would be worth the added costs. "},{"start":137.499,"text":"HBM uses more components, many of which are intricate and difficult to manufacture, compared with traditional chips. "},{"start":144.029,"text":"By 2015, two years into the launch, analysts expected HBM to be relegated to a tiny niche. "},{"start":150.247,"text":"Costs seemed too high for mass market use. "}],[{"start":153.3,"text":"They are still expensive today, costing at least five times more than standard memory chips. "},{"start":158.667,"text":"The difference now is that the AI chips that they go into fetch a steep price too. "},{"start":162.89700000000002,"text":"And the tech giants now have a much larger budget to spend on advanced chips than gamers did a decade ago. "}],[{"start":169.38000000000002,"text":"For now, just one company, SK Hynix, is able to mass produce the third generation of HBM products, the ones used in the latest AI chips. 
"},{"start":178.29700000000003,"text":"It has 50 per cent of the global market, while the rest of the market is held by two rivals, according to data from consultancy TrendForce. "},{"start":185.81400000000002,"text":"Samsung and Micron produce older generation HBMs and are set to release their latest versions in the coming months. "},{"start":191.81900000000002,"text":"They stand to gain a windfall from the high margin product alongside rapidly rising demand. "},{"start":196.87400000000002,"text":"In June, TrendForce forecast global demand for HBM would rise 60 per cent this year. "}],[{"start":203.40000000000003,"text":"The AI chip war is about to turbocharge that growth even further next year. "},{"start":207.85400000000004,"text":"AMD has just launched a new product with hopes to take on Nvidia. "},{"start":211.65900000000005,"text":"A global shortage coupled with strong demand for a more affordable alternative to Nvidia’s offerings means a lucrative opportunity for rivals able to offer chips with comparable specifications. "},{"start":221.32700000000003,"text":"But shaking Nvidia’s dominance, the key to which lies not just in the physical chip itself but its software ecosystem, which includes popular developer tools and programming models, is another story. "},{"start":231.95700000000002,"text":"Replicating that will take years. "}],[{"start":234.85000000000002,"text":"That is why hardware — and the number of HBMs squeezed into each new chip — will now become yet more important for contenders that want to take on Nvidia. "},{"start":243.02900000000002,"text":"Boosting memory capacity through the use of upgraded HBMs is one of the few ways to gain competitiveness in the short term. "},{"start":249.47200000000004,"text":"For example, AMD’s latest MI300 accelerator uses eight HBMs, more than the five or six in Nvidia’s products. "}],[{"start":258.02000000000004,"text":"Chips have historically been a cyclical industry prone to dramatic booms and busts. "},{"start":262.58700000000005,"text":"On its present course, a lasting increase in demand for new products should make future downturns less turbulent than in the past. "}],[{"start":269.03000000000003,"text":""}]],"url":"https://creatives.ftacademy.cn/album/135041-1702960980.mp3"}
