The tiny chips we see everywhere embody the world’s most advanced technology. They span consumer electronics, automotive electronics, industrial automation, financial systems, defense, and the military, and have laid the foundation for the informatization and intelligent transformation of every industry. In a very real sense, the development of chips is changing the world.
The development of the chip can be traced back to the birth of the transistor. In 1947, William Shockley, John Bardeen, and Walter Brattain invented the world’s first transistor at Bell Labs in the United States, an achievement for which they jointly won the 1956 Nobel Prize in Physics. Before this, electronics relied on the vacuum tube. ENIAC, the first general-purpose electronic computer, completed in 1946, used 17,468 vacuum tubes, 7,200 crystal diodes, 10,000 capacitors, and some 500,000 soldered connections; this behemoth consumed 150 kilowatts of power, occupied about 150 square meters, and weighed 30 tons. If such discrete devices and their interconnections could instead be fabricated on a single substrate, size could be greatly reduced and reliability greatly improved. This was the original idea of the integrated circuit, and the advent of the transistor made it possible: the transistor replaced the function of the vacuum tube, was soon adopted in electronic computers, and shrank machines that once filled a room down to a few cabinets.
In 1958, Jack Kilby, then working at Texas Instruments, connected several transistors, resistors, and capacitors on a germanium (Ge) substrate and successfully built the world’s first integrated circuit. Crude as it looked, it proved far more efficient than circuits assembled from discrete components, and forty-two years later, in 2000, Kilby was awarded the Nobel Prize in Physics for the invention. A few months after Kilby’s germanium-based integrated circuit, Robert Noyce independently invented a silicon (Si)-based integrated circuit; most of today’s semiconductor applications are silicon-based integrated circuits.
The integrated circuit made all components part of a single structure, a major step toward miniaturization, low power consumption, intelligence, and high reliability in electronic equipment. A chip is an integrated circuit fabricated on a small piece of semiconductor wafer and sealed in a package, forming a miniature device with the required circuit functions.
Nowadays, with the continuous development of technology, the integration density of chips keeps rising. According to Moore’s Law, proposed by Gordon Moore in 1965, the number of transistors on a chip doubles approximately every 18 to 24 months. The manufacturing process has accordingly advanced from 0.5 microns through 0.35, 0.25, 0.18, 0.15, and 0.13 microns, then 90 nm, 65 nm, 45 nm, 32 nm, 28 nm, 22 nm, and 14 nm, to today’s 10 nm, 7 nm, and 5 nm nodes. In recent years, the industry has begun to confront the possible failure of Moore’s Law: as the density of circuits on a silicon wafer increases, complexity and error rates also grow exponentially, and scientists are looking for other ways to keep the trend alive.
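As a rough back-of-the-envelope illustration (a minimal sketch, not something stated in this text), Moore’s Law can be written as N(t) = N0 * 2^((t - t0) / T), where T is the doubling period. The Python snippet below projects transistor counts assuming a starting point of 2,300 transistors (the Intel 4004, released in 1971) and a two-year doubling period; both are illustrative assumptions.

    def transistors(year, n0=2300, year0=1971, doubling_years=2.0):
        """Projected count: N(t) = n0 * 2**((year - year0) / T)."""
        return n0 * 2 ** ((year - year0) / doubling_years)

    for year in (1971, 1989, 2007, 2019):
        print(year, f"{transistors(year):,.0f}")

    # 1971 ->          2,300
    # 1989 ->      1,177,600   (9 doublings)
    # 2007 ->    602,931,200   (18 doublings)
    # 2019 -> 38,587,596,800   (24 doublings)

Even this toy projection shows why the trend is hard to sustain: 24 doublings turn a few thousand transistors into tens of billions, which is roughly the scale of today’s leading-edge chips.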
Development history of processor chips
There are many types of chips, but broadly they divide into analog chips and digital chips. Analog chips measure and process signals from the physical world, such as images, sound, touch, temperature, and humidity. Digital chips include processors (CPU, GPU, MCU, DSP, etc.), memories (DRAM, NAND Flash, NOR Flash), and logic ICs (mobile-phone basebands, Ethernet chips), among others.
With today’s rapid development of electronic information technology, we can hardly do without PCs, mobile phones, tablets, digital cameras, automotive electronics, and home appliances. These products can interact with people because they all contain a particular kind of chip: the processor. Different application scenarios have produced different types of processors, which vary in computing speed, cost, architecture, and function.
AI chips
An AI chip can be understood as a processor chip aimed at AI applications, sitting at the intersection of the two fields of AI and processor design. With the rapid development of artificial intelligence, AI chips have become one of the hottest fields for investment. Besides established chip makers such as Intel, Nvidia, and ARM, Internet companies such as Google, Facebook, and Microsoft have also entered the AI chip market.
AI applications usually involve algorithms based on deep neural networks, applied to tasks such as image recognition, video retrieval, speech recognition, voiceprint detection, search-engine optimization, and autonomous driving. The two most critical capabilities are “training” and “inference”. Training is the process of learning features from massive amounts of data, which demands extremely high computing performance and accuracy; inference applies the trained model to new inputs. To support AI’s computing performance and accuracy, an ideal AI chip needs highly parallel processing capability, floating-point support for multiple data types, and memory bandwidth sufficient to feed massive data.
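To make the distinction between “training” and “inference” concrete, here is a minimal NumPy sketch of a tiny neural network learning the XOR function. Everything in it (the toy data, layer sizes, learning rate) is an illustrative assumption, not a description of any real AI chip workload. Training is the loop of repeated forward and backward passes, dominated by dense floating-point matrix multiplications, exactly the highly parallel arithmetic AI chips are built to accelerate; inference is a single forward pass with the learned weights.

    import numpy as np

    # Toy data: XOR. Training learns the weights; inference reuses them.
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    y = np.array([[0], [1], [1], [0]], dtype=float)

    rng = np.random.default_rng(0)
    W1, b1 = rng.normal(size=(2, 8)), np.zeros(8)
    W2, b2 = rng.normal(size=(8, 1)), np.zeros(1)

    def forward(x):
        """Inference: a pure forward pass (dense matrix math)."""
        h = np.tanh(x @ W1 + b1)
        p = 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))  # sigmoid output
        return p, h

    # Training: thousands of forward + backward passes over the data;
    # this is where the bulk of the floating-point work happens.
    lr = 0.5
    for step in range(5000):
        p, h = forward(X)
        g_out = (p - y) / len(X)            # cross-entropy gradient at output
        g_h = (g_out @ W2.T) * (1 - h**2)   # backprop through the tanh layer
        W2 -= lr * (h.T @ g_out); b2 -= lr * g_out.sum(axis=0)
        W1 -= lr * (X.T @ g_h);   b1 -= lr * g_h.sum(axis=0)

    print(forward(X)[0].round(2))  # after training: close to [[0],[1],[1],[0]]

Note that forward() reads but never modifies the weights. That is why inference can often run on cheaper, lower-precision hardware than training, and why many AI chips are specialized for one task or the other.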