1. Global digitalization continues to advance, and servers usher in an intergenerational upgrade
A server is generally composed of a CPU, memory and storage chips, a PCB motherboard, a power supply, a chassis, cooling, and other modules. Among these, the CPU is the core component and determines the server's computing performance. Each chip iteration drives an upgrade of the server platform and, in turn, upgrades of supporting components such as communication chips, the PCB motherboard, and DRAM.
(1) Server industry chain
The upstream of the server industry chain consists of component manufacturers and supporting software vendors. This segment is concentrated and dominated by companies from the United States, Japan, and South Korea, such as Intel and AMD. The midstream consists of server manufacturers, which procure the core components and then produce and sell servers according to downstream customers' requirements. Before 2012, downstream customers were mainly traditional large enterprises such as banks, governments, and telecom operators. In recent years, the explosive growth of computing power demand and the migration of storage and computing resources to the cloud have made cloud computing giants such as Amazon, Microsoft, and Google the main incremental customers.
(2) The upgrade history of the server platform
The CPU is the core component of the server and leads the upgrade of the server platform. The CPU connects to the rest of the system through a chipset composed of memory controllers, PCIe controllers, and I/O processing chips, which handle signal and data transmission. As the component that processes and computes data, the CPU must work in close coordination with the chipset, PCB, memory, and hard disks to deliver its best performance. Therefore, when the CPU architecture is upgraded, the supporting components also need to be upgraded in step, jointly driving the upgrade of the server platform.
Intel's and AMD's next-generation processors are ready, and the server platform upgrade is at hand. Servers can be divided into x86 and non-x86 servers; x86 servers have long accounted for more than 90% of both shipments and sales value. As the duopoly in x86 server CPUs, Intel and AMD hold almost the entire market and determine the platform's upgrade path. Intel released its fourth-generation Xeon processors on January 10, 2023, and AMD released its fourth-generation EPYC processors on November 10, 2022. With the launch of this new generation of high-performance processors, server-related supporting components will see new growth opportunities.
With the processor upgrade, server supporting components undergo major changes. Intel and AMD server processors are both moving toward faster processing and higher transmission rates, and the architecture upgrade brings two significant changes: the PCIe bus is upgraded from PCIe 4.0 to PCIe 5.0, and the memory modules are upgraded from DDR4 to DDR5. The higher bus standard raises the requirements for CCL and PCB: lower dielectric constant, lower dissipation factor, lower copper roughness, and higher layer counts. Under the JEDEC standard, DDR5 memory modules have higher specifications and more functions, and they add three supporting chips: the serial presence detect (SPD) hub, the temperature sensor (TS), and the power management IC (PMIC). As the penetration of the new generation of CPU platforms rises, related CCL, PCB, and memory interface chip manufacturers will benefit.
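As a rough sense of scale for these two transitions, the sketch below computes theoretical bandwidths from the standard signaling rates (PCIe 4.0 at 16 GT/s and PCIe 5.0 at 32 GT/s per lane with 128b/130b encoding; DDR4-3200 versus DDR5-4800 as representative speeds). These rates are general knowledge about the standards rather than figures taken from this report.

```python
# Back-of-the-envelope bandwidth comparison for the PCIe 4.0 -> 5.0 and
# DDR4 -> DDR5 transitions described above (illustrative figures only).

def pcie_bandwidth_gb_s(gt_per_s: float, lanes: int = 16) -> float:
    """Theoretical one-direction bandwidth of a PCIe link.

    PCIe 3.0 and later use 128b/130b encoding, so usable bits = raw bits * 128/130.
    """
    return gt_per_s * lanes * (128 / 130) / 8  # GT/s per lane -> GB/s

def ddr_bandwidth_gb_s(mt_per_s: float, bus_width_bits: int = 64) -> float:
    """Theoretical bandwidth of one DDR channel with a 64-bit data bus."""
    return mt_per_s * bus_width_bits / 8 / 1000  # MT/s -> GB/s

print(f"PCIe 4.0 x16: {pcie_bandwidth_gb_s(16):.1f} GB/s")              # ~31.5 GB/s
print(f"PCIe 5.0 x16: {pcie_bandwidth_gb_s(32):.1f} GB/s")              # ~63.0 GB/s
print(f"DDR4-3200   : {ddr_bandwidth_gb_s(3200):.1f} GB/s per channel")  # 25.6 GB/s
print(f"DDR5-4800   : {ddr_bandwidth_gb_s(4800):.1f} GB/s per channel")  # 38.4 GB/s (+50%)
```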
(3) Global digitalization drives the continuous expansion of data centers
The global digitalization process continues to advance, and the scale of data and computing power is growing rapidly. As digitalization progresses, emerging digital technology industries such as cloud computing, artificial intelligence, autonomous driving, and AR/VR are flourishing, and the total volume of global data has grown explosively. According to IDC forecasts, data newly created and replicated worldwide will grow at a compound annual rate of about 23% from 2021 to 2025, reaching 181 ZB in 2025. Global computing power is growing in step: in 2021, the total computing power of global computing devices reached 615 EFLOPS. Huawei GIV forecasts that by 2030 global general-purpose computing power will reach 3.3 ZFLOPS (FP32) and AI computing power will reach 105 ZFLOPS (FP16), an increase of 500 times.
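The 23% CAGR and the 181 ZB endpoint quoted above can be tied together with a simple compound-growth calculation; the sketch below back-solves the implied 2021 base, which is not stated in the text, purely as an arithmetic illustration.

```python
# Quick check of the compound-growth figures quoted above: given a 23% CAGR
# from 2021 to 2025 and a 2025 endpoint of 181 ZB, back out the implied 2021
# base and print the year-by-year trajectory (illustration only).

CAGR = 0.23
TARGET_2025_ZB = 181
YEARS = 2025 - 2021  # four compounding steps

implied_2021 = TARGET_2025_ZB / (1 + CAGR) ** YEARS
print(f"Implied 2021 data volume: {implied_2021:.0f} ZB")  # ~79 ZB

volume = implied_2021
for year in range(2021, 2026):
    print(f"{year}: {volume:.0f} ZB")
    volume *= 1 + CAGR
```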
Computing power has become a key production factor in the digital era. As global digitalization fully unfolds, computing power has become an important underpinning, and its enabling effect is already visible. According to the "2021-2022 Global Computing Power Index Assessment Report" released by the Tsinghua industry research institute and Inspur Information, each 1-point increase in the computing power index lifts the digital economy and GDP by 3.5‰ and 1.8‰ respectively, and this trend is expected to continue over 2021-2025. More and more countries recognize the macroeconomic importance of computing power; computing capital and physical capital complement each other in driving GDP growth. Computing power is bound to become a focal point of competition among countries in the future.
Cloud computing leaders keep increasing investment, data center construction continues unabated, and server shipments keep rising. As digitalization advances, demand for computing power continues to grow and the data center market keeps expanding. Cloud computing leaders at home and abroad, such as Amazon, Microsoft, and Google, have seen rapid growth in cloud revenue and have raised capital expenditure, mainly for data center infrastructure. Servers are the largest cost item in data center construction, so the server market is expanding rapidly.
2. ChatGPT triggers a wave of AI innovation, and demand for computing power and communications surges
(1) ChatGPT quickly went viral, and the inflection point of the AI industry has arrived
ChatGPT performs impressively, marking a major leap in AI intelligence. ChatGPT is a pre-trained model launched by the American artificial intelligence company OpenAI in November 2022. It can understand users' intent, hold multi-turn conversations, and give highly natural answers. Thanks to this performance, it gained roughly 100 million users in just two months; by comparison, TikTok, the most popular short-video app in history, took nine months to reach 100 million users. ChatGPT's level of intelligence has clearly been widely recognized, and an era in which AI works alongside people is no longer a dream.
ChatGPT greatly improves the efficiency of human-computer interaction, and the era of personal AI is approaching. Bill Gates called GPT the most important technological advance since the graphical user interface, and Jensen Huang called it the iPhone moment of artificial intelligence. In the 1980s, the graphical user interface let people operate a computer by clicking on icons with a mouse, dramatically lowering the barrier to using computers and ushering in the personal computer era. At the beginning of this century, the iPhone adopted touch-screen technology; users no longer interacted with phones through keypads, which greatly enriched human-computer interaction and opened the mobile internet era. ChatGPT lets people talk to machines directly: the machine can understand complex human intent and respond appropriately. Interaction is no longer limited to simple commands; people can converse fluently with machines and carry out complex interactions, and machines can assist humans with more work. Human productivity will improve greatly, and the personal AI era is coming. Historical experience suggests that each new interaction revolution produces a wave higher and faster than the last. The personal AI era is still in its infancy, and the changes it will bring to human society are immeasurable.
(2) Pre-trained large models are showing their potential, and the dawn of strong artificial intelligence is appearing
Pre-trained model parameter counts have grown enormously, and astonishing intelligence has emerged. From AlexNet to GPT, model parameters have grown from about 60 million to hundreds of billions, and the computing power required for training has increased by a factor of hundreds of thousands. Model capability has evolved from simple classification to complex multi-turn dialogue. The scale of an AI model is positively correlated with its performance, and as models grow larger, surprising intelligence emerges.
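To put the parameter figures in perspective, a minimal sketch of the implied growth ratio follows; the 175-billion-parameter value is the commonly cited GPT-3 figure and is an assumption here, not a number taken from this report.

```python
# Parameter-count growth implied by the text: from ~60 million (AlexNet) to
# hundreds of billions (GPT-3 class; 175 billion is the commonly cited figure
# and is used here as an assumption).
alexnet_params = 6e7
gpt3_class_params = 1.75e11

print(f"Parameter growth: {gpt3_class_params / alexnet_params:,.0f}x")  # ~2,900x

# Training compute grows much faster than parameter count alone, because the
# amount of training data grows alongside the model; this is why the text
# describes compute requirements rising by hundreds of thousands of times.
```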
ChatGPT has unlocked the enormous potential of large models, and a glimmer of strong artificial intelligence has appeared. ChatGPT is essentially a large pre-trained language model. Its excellent performance is mainly attributable to four factors: a large-scale pre-trained base model (GPT), instruction tuning, chain-of-thought (CoT) ability, and alignment with human preferences (RLHF). The large base model gives it the potential for emergent intelligence, which techniques such as instruction tuning, CoT, and RLHF then draw out, ultimately giving ChatGPT complex capabilities such as multi-turn dialogue, contextual understanding, and knowledge reasoning. ChatGPT has revealed the huge potential of large models to the world; large models may be a feasible path toward strong artificial intelligence.
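To make the listed techniques more concrete, here is a small illustrative sketch contrasting a plain instruction prompt with a chain-of-thought prompt; the prompt text is hypothetical and no real model API is called.

```python
# Illustrative only: instruction-style vs. chain-of-thought (CoT) prompts differ
# purely in how the input text is written; no real model API is invoked here.

instruction_prompt = (
    "Summarize the following paragraph in one sentence:\n"
    "<paragraph text>"
)

cot_prompt = (
    "Q: A rack holds 42 servers and a data hall has 120 racks. "
    "How many servers fit in the hall?\n"
    "A: Let's think step by step. 42 servers per rack times 120 racks "
    "is 42 * 120 = 5040 servers.\n"
    "Q: A hall has 80 racks with 36 servers each. How many servers fit?\n"
    "A: Let's think step by step."
)

# The CoT prompt includes a worked example so the model imitates the
# step-by-step reasoning pattern before producing its final answer; RLHF, by
# contrast, changes the model itself through training on human preference data.
print(instruction_prompt)
print(cot_prompt)
```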
(3) Tech giants accelerate their bets on the AI industry, and a new round of the AI arms race begins
OpenAI moved first, the tech giants are unwilling to fall behind and are accelerating their push into AI, and AI commercialization is speeding up. Since ChatGPT's release in November 2022, its disruptive chat experience has swept the world. Microsoft followed up with a multibillion-dollar investment in OpenAI, and with ChatGPT's help the new Bing has been transformed into chat-based search, opening a new round of competition among technology giants. Google's Bard and Baidu's Wenxin Yiyan (ERNIE Bot) have been launched, and Microsoft 365 Copilot made a striking debut. Domestic giants such as JD.com, Alibaba, Tencent, and Huawei have all deployed large models, and a large number of startups are flocking to the AI industry. AI commercialization will accelerate.
AI applications are springing up, and demand for computing power and communications is exploding. ChatGPT is only the beginning of large-scale AI applications; many more are on the way. They will generate massive amounts of data, much of it unstructured, which must be transmitted with greater bandwidth and processed with greater computing power. In addition, the computing power a single server can provide is limited; the computing power required by a single large model is supplied by thousands of servers connected through a network. Demand for AI servers and switches will therefore grow rapidly.
The AI era calls for AI servers. With the widespread adoption of AI technology, the serial processing architecture of the CPU can no longer meet the computing power needs of the AI era. GPUs, FPGAs, and other parallel computing chips are better suited to dense computation, and the AI server has emerged. An AI server is a heterogeneous server that may combine CPU + GPU, CPU + FPGA, CPU + TPU, CPU + ASIC, or CPU + multiple accelerator cards; the CPU + GPU architecture is currently the most widely used. AI servers do not differ much from ordinary servers in their component makeup; the main differences are: 1) larger memory capacity, to meet the real-time load growth of big data; 2) support for NVMe/PCIe SSDs, to meet the fast-storage needs of big data and model parameters; 3) network modules with higher bandwidth, to meet the high-speed data transmission needs between AI servers and with end users. As AI is widely adopted across industries, demand for computing power will grow exponentially and demand for AI servers will grow rapidly.
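As a rough illustration of the three differences just listed, the sketch below encodes them as a simple comparison between a general-purpose server and a CPU + GPU AI server; all numeric values are hypothetical placeholders rather than specifications from this report.

```python
# Illustrative comparison of a general-purpose server and a CPU+GPU AI server,
# reflecting the three differences listed above. All numeric values are
# hypothetical placeholders, not specifications from the report.
from dataclasses import dataclass, field

@dataclass
class ServerSpec:
    accelerators: list[str] = field(default_factory=list)  # heterogeneous compute
    memory_gb: int = 256                                    # 1) memory capacity
    storage: str = "SATA SSD"                               # 2) storage interface
    nic_bandwidth_gbps: int = 25                            # 3) network bandwidth

general_server = ServerSpec()
ai_server = ServerSpec(
    accelerators=["GPU"] * 8,       # CPU + multi-GPU is the common architecture
    memory_gb=2048,                 # larger memory for big-data workloads
    storage="NVMe (PCIe) SSD",      # faster storage for data and model parameters
    nic_bandwidth_gbps=400,         # higher-bandwidth network module
)

print(general_server)
print(ai_server)
```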
3. DDR5 penetration accelerates, and memory interface chips rise in both volume and price
(1) Upgrade of memory interface chips and supporting chips
Memory modules (commonly known as memory sticks) serve as the data staging area between the CPU and storage, playing an important role in data buffering and performance stability. Memory modules fall into four main categories: RDIMM (registered DIMM), LRDIMM (load-reduced DIMM), UDIMM (unbuffered DIMM), and SODIMM (small-outline DIMM). RDIMM and LRDIMM are mainly used in servers, while UDIMM and SODIMM are mainly used in desktops and laptops. Server memory modules have stricter requirements for speed, capacity, stability, and error correction, and carry higher value.
Memory interface chips improve the speed and stability of memory data access and are the necessary path through which the CPU accesses memory data. Memory interface chips fall into two types: the registering clock driver (RCD) and the data buffer (DB). The RCD buffers the address, command, and control signals from the memory controller, while the DB buffers the data signals between the memory controller and the memory chips. At present, memory interface chips are used only in server memory modules, providing an important guarantee of servers' large capacity, high speed, and high stability. RDIMMs use only the RCD chip, while LRDIMMs use the RCD together with DB chips, offer better performance, and are mainly used in high-end servers.
With each memory generation change, memory interface chips rise in both volume and price. Compared with the last sub-generation of DDR4 interface chips, the first sub-generation of DDR5 interface chips operates at a lower working voltage (1.1V), raises the transfer rate by 50%, further improves stability and reliability, and also carries higher value. DDR5 LRDIMMs adopt a "1+10" architecture (1 RCD and 10 DB chips), up from DDR4's "1+9" (1 RCD and 9 DB chips). In addition, in the DDR5 generation, UDIMMs and SODIMMs, which previously did not need signal buffering, must now be equipped with a clock driver (CKD) to improve the integrity and reliability of the clock signal; JEDEC is currently developing the standard for this product.
In addition to the changes in memory interface chips, DDR5 memory modules also need three types of supporting chips: one SPD chip, one PMIC, and two TS chips. At present, the server memory modules RDIMM and LRDIMM require all three types, while the desktop and laptop memory modules UDIMM and SODIMM need only two of them: one SPD chip and one PMIC.
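The chip counts described in the last two paragraphs can be summarized in a short sketch; it simply tabulates what the text states (the "1+10" DDR5 LRDIMM layout, the CKD expected for DDR5 UDIMM/SODIMM once the JEDEC standard is finalized, and the SPD/PMIC/TS supporting chips) and is not a complete bill of materials.

```python
# Chip complement per DDR5 module type, as described above (not a full BOM).
# RCD = registering clock driver, DB = data buffer, CKD = clock driver,
# SPD = serial presence detect hub, PMIC = power management IC, TS = temperature sensor.

ddr5_modules = {
    "RDIMM (server)":  {"RCD": 1, "DB": 0,  "CKD": 0, "SPD": 1, "PMIC": 1, "TS": 2},
    "LRDIMM (server)": {"RCD": 1, "DB": 10, "CKD": 0, "SPD": 1, "PMIC": 1, "TS": 2},  # "1+10"
    "UDIMM (desktop)": {"RCD": 0, "DB": 0,  "CKD": 1, "SPD": 1, "PMIC": 1, "TS": 0},  # CKD standard in progress
    "SODIMM (laptop)": {"RCD": 0, "DB": 0,  "CKD": 1, "SPD": 1, "PMIC": 1, "TS": 0},
}

# For comparison, the text notes that a DDR4 LRDIMM used a "1+9" interface-chip
# complement (1 RCD and 9 DB chips).

for module, chips in ddr5_modules.items():
    used = ", ".join(f"{count}x {chip}" for chip, count in chips.items() if count)
    print(f"{module}: {used}")
```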
(2) The server platform upgrade greatly boosts demand for DDR5 memory modules