TrendForce released a report today stating that generative AI integrates AI technologies such as GANs, CLIP, Transformers, and diffusion models with pre-trained models and multimodal capabilities. Data, computing power, and algorithms are the three indispensable pillars of generative AI development.
The report pointed out that because generative AI must be trained on vast amounts of data, large numbers of high-performance GPUs are needed to shorten training time. Taking the GPT models behind ChatGPT as an example, their parameter counts grew from about 117 million (GPT-1, 2018) to roughly 175 billion (GPT-3, 2020).
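A quick back-of-envelope calculation makes the scale of that jump concrete. The sketch below uses the publicly reported parameter counts for GPT-1 (~117 million, 2018) and GPT-3 (~175 billion, 2020):

```python
# Back-of-envelope: growth in GPT parameter counts between 2018 and 2020.
# Publicly reported figures: GPT-1 ~117M parameters, GPT-3 ~175B parameters.
gpt1_params = 117e6   # GPT-1 (2018)
gpt3_params = 175e9   # GPT-3 (2020)

growth_factor = gpt3_params / gpt1_params
print(f"Parameter growth factor: {growth_factor:,.0f}x")  # ~1,496x in two years
```

A roughly 1,500-fold increase in model size in two years is what drives the report's conclusion that training demand can only be met by deploying GPUs in the tens of thousands.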
ChatGPT is a chatbot released by the artificial intelligence research laboratory OpenAI on November 30, 2022. It is an AI-powered natural language processing tool designed to produce human-like conversation in response to user prompts.
TrendForce estimates current GPU demand at about 20,000 units, rising to as many as 30,000 units for commercial deployment (the calculations in this article are based primarily on the NVIDIA A100). As generative AI development becomes a broad trend, it will drive a significant increase in GPU demand and benefit the related supply chains.
According to the report, the biggest beneficiary is NVIDIA, the leader in GPU chips. Its DGX A100, which delivers 5 petaFLOPS of AI computing performance, is almost the default choice for large-scale data analysis and AI-accelerated computing. AMD competes in this space with its Instinct MI100, MI200, and MI300 series chips.
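The 5 petaFLOPS figure quoted for the DGX A100 can be sanity-checked against NVIDIA's published specs: the system contains 8 A100 GPUs, each rated at 624 TFLOPS (BF16 with structured sparsity enabled). A minimal sketch, assuming those spec-sheet numbers:

```python
# Sanity check of the DGX A100's quoted 5 PFLOPS AI performance.
# Spec-sheet assumptions: 8x A100 GPUs per DGX A100 system,
# 624 TFLOPS each (BF16 with structured sparsity enabled).
gpus_per_system = 8
tflops_per_a100 = 624  # BF16, sparsity-enabled peak

total_pflops = gpus_per_system * tflops_per_a100 / 1000
print(f"DGX A100 aggregate: {total_pflops:.3f} PFLOPS")  # ~5 PFLOPS
```

Note that this is the sparsity-enabled marketing peak; dense BF16 throughput per A100 is half that (312 TFLOPS), so real-world training throughput is lower.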