The impact of artificial intelligence on the semiconductor industry

By Nigel Charig


Developments in AI are proving beneficial for semiconductor manufacturers, both by creating new market opportunities and by facilitating manufacturing process improvements. This article offers examples of both factors.

According to a report titled ‘Accenture Semiconductor Technology Vision 2019’, 77 percent of semiconductor executives surveyed said they have adopted AI in their business or are piloting the technology.
(Image: Adobe Stock)

Artificial intelligence (AI) comprises technologies ranging from machine learning to natural language processing. It allows computer systems to perform tasks that normally require human intelligence, such as visual perception or decision-making. And it is beginning to have a significant impact on the semiconductor industry, because it both creates new market opportunities and enables improvements to the semiconductor design and fabrication process.

Software to hardware shift creates market opportunities

For decades now, the architecture and software layers of the technology stack have dominated high tech, because of the important advances they have brought to PCs and mobile phones⁠—the game-changing innovations that have defined this era. However, the growth of AI could change this, as AI applications share a reliance on hardware as a core enabler of innovation.

According to a McKinsey article, ‘Artificial-intelligence hardware: New opportunities for semiconductor companies’, AI could allow semiconductor companies to capture 40 to 50 percent of the technology stack’s value. Storage is expected to see the highest growth, but the companies will capture the most value in compute, memory, and networking.

AI has made important advances through development of sophisticated machine-learning (ML) algorithms that can process large data sets, “learn” from experience and improve over time. The greatest leaps were realized by advances in deep learning (DL), a type of ML that can process a wider range of data, requires less data pre-processing by humans, and often produces more accurate results.
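As a toy illustration of what “learning from experience and improving over time” means in practice, the short sketch below fits a simple model on progressively more data and watches its error on held-out data shrink. The framework (scikit-learn) and the synthetic data are illustrative assumptions only; the article does not refer to any specific tool.

```python
# Toy learning-curve sketch: error on held-out data shrinks as the training
# set grows. scikit-learn and the synthetic data are illustrative assumptions.

import numpy as np
from sklearn.linear_model import Ridge
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(42)

# Synthetic regression task: y = 3*x0 - 2*x1 + noise.
X = rng.normal(size=(5000, 2))
y = 3 * X[:, 0] - 2 * X[:, 1] + rng.normal(scale=0.5, size=5000)

X_test, y_test = X[4000:], y[4000:]   # hold out the last 1000 samples

for n in (50, 200, 1000, 4000):
    model = Ridge().fit(X[:n], y[:n])
    mse = mean_squared_error(y_test, model.predict(X_test))
    print(f"trained on {n:4d} samples -> test MSE {mse:.3f}")
```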

As developers try to improve training and inference⁠—two key AI activities⁠—they often encounter roadblocks related to storage, memory, logic, and networking. By providing next-generation accelerator architectures, semiconductor companies could increase computational efficiency or facilitate the transfer of large data sets through memory and storage. For instance, specialized memory for AI has 4.5 times more bandwidth than traditional memory, making it much better suited to handling the vast stores of big data that AI applications require.
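To see why memory bandwidth becomes the roadblock, a rough roofline-style estimate helps: a layer’s attainable throughput is the lesser of the chip’s peak compute and its arithmetic intensity multiplied by memory bandwidth. The sketch below works through this with purely hypothetical hardware figures, borrowing only the “4.5 times more bandwidth” ratio from the text.

```python
# Rough roofline-style estimate: is an inference layer compute-bound or
# memory-bound? All hardware figures below are illustrative assumptions.

PEAK_FLOPS = 100e12                   # assumed accelerator peak: 100 TFLOP/s
BANDWIDTH_STD = 100e9                 # assumed conventional memory: 100 GB/s
BANDWIDTH_AI = 4.5 * BANDWIDTH_STD    # "4.5x" bandwidth ratio quoted in the text

def attainable_flops(flops, bytes_moved, peak_flops, bandwidth):
    """Attainable throughput = min(peak compute, arithmetic intensity * bandwidth)."""
    intensity = flops / bytes_moved   # FLOPs performed per byte moved
    return min(peak_flops, intensity * bandwidth)

# Hypothetical fully connected layer: 4096 x 4096 weights, batch size 1.
flops = 2 * 4096 * 4096               # multiply-accumulate operations
bytes_moved = 4096 * 4096 * 2         # FP16 weights streamed once from memory

for name, bw in [("standard memory", BANDWIDTH_STD), ("AI-optimised memory", BANDWIDTH_AI)]:
    t = attainable_flops(flops, bytes_moved, PEAK_FLOPS, bw)
    bound = "memory-bound" if t < PEAK_FLOPS else "compute-bound"
    print(f"{name}: ~{t / 1e12:.2f} TFLOP/s attainable ({bound})")
```

With these assumed numbers, a batch-of-one layer remains memory-bound even with the faster memory, which is why bandwidth improvements translate so directly into AI performance.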

With hardware serving as a differentiator in AI, semiconductor companies will find not only greater demand for their existing chips, but also opportunities for novel technologies in several areas. Specifically, these include:

Compute: There are opportunities in existing markets for parallel-processing accelerators such as GPUs or FPGAs, as well as potential for workload-specific AI accelerators.

Memory: There are currently opportunities for high-bandwidth memory and on-chip SRAM memory.

Storage: Growing data retention will potentially increase demand for existing storage systems, as well as for AI-optimized storage systems.

Non-volatile memory (NVM) is also emerging for roles in both memory and storage applications.

Networking: There are existing opportunities within data center infrastructures, with new possibilities for programmable switches and high-speed interconnect.

Using AI in semiconductor manufacturing

According to a report titled ‘Accenture Semiconductor Technology Vision 2019’, 77 percent of semiconductor executives surveyed said they have adopted AI in their business or are piloting the technology.

One application area is predictive maintenance, where AI and Big Data can be used to improve manufacturing yield and quality. For example, David Fried, CTO at Coventor, describes an evolutionary process of adding sensors to tools and moving towards AI to keep pace with ever-tighter manufacturing tolerances.

He notes that as fabrication moved from 200 mm to 300 mm wafers, more sensors appeared on tools to increase the available data, but any analysis tended to be reactive. If a fault occurred, standard characterization operations were performed to identify the failing process and then the specific item that had failed. Fault detection could then be added to that item to prevent future failures. However, this reactive mode of operation becomes increasingly difficult to sustain as processing complexity increases.
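As a rough illustration of what “adding fault detection to that item” can look like, the sketch below derives 3-sigma control limits from historical readings of a single, hypothetical tool sensor and flags any run that strays outside them. The sensor, the data, and the limits are invented for illustration; this is not a description of Coventor’s method.

```python
import numpy as np

# Minimal sketch of univariate fault detection on one tool sensor
# (e.g. a hypothetical chamber-pressure trace). Data and limits are illustrative.

rng = np.random.default_rng(0)
baseline = rng.normal(loc=50.0, scale=0.5, size=1000)   # readings from known-good runs

mean, std = baseline.mean(), baseline.std()
upper, lower = mean + 3 * std, mean - 3 * std            # classic 3-sigma control limits

def check_run(sensor_trace):
    """Flag a run if any reading falls outside the control limits."""
    excursions = (sensor_trace > upper) | (sensor_trace < lower)
    return bool(excursions.any()), int(excursions.sum())

new_run = rng.normal(loc=50.0, scale=0.5, size=200)
new_run[120:125] += 4.0                                   # inject a simulated excursion

faulty, count = check_run(new_run)
print(f"fault detected: {faulty}, excursion points: {count}")
```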


Yet equipment sensors allow analysis of data from both tool operations and wafer process status. Feedback ranges from which wafer is in which chamber to a robot arm’s current position, and much else besides. Multiplied across all of a fab’s equipment and its many different process types, this creates a massive big data challenge.

However, if this challenge can be overcome, so that the diverse data is accessible, harvestable, and capable of being rendered into a common format for processing, new possibilities emerge. Basic, real-time machine learning can begin, actively coupling electrical test and metrology data back to the tools and process steps that produced them. Trends and patterns start to emerge, and algorithms can be installed to guard against, or compensate for, deviation.
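One simple form that “guard against or compensate for deviation” can take is run-to-run control, where each metrology result feeds a small correction back into a process setpoint. The sketch below uses an exponentially weighted moving average (EWMA) controller with invented numbers; it is a generic textbook scheme, not a description of any particular fab’s implementation.

```python
# Minimal run-to-run control sketch: an EWMA controller nudges a process
# setpoint so a measured film thickness tracks its target. All numbers are
# illustrative assumptions, not taken from the article.

import random

TARGET = 100.0   # desired thickness (nm)
GAIN = 2.0       # assumed nm of thickness per unit of setpoint change
LAMBDA = 0.3     # EWMA weight: how strongly each new measurement updates the model

random.seed(1)
setpoint = 50.0
offset_estimate = 0.0   # EWMA estimate of the process disturbance

for run in range(1, 11):
    # "Process": true thickness drifts slowly and has measurement noise.
    drift = 0.2 * run
    measured = GAIN * setpoint + drift + random.gauss(0, 0.3)

    # Update the disturbance estimate from metrology/electrical-test feedback.
    error = measured - GAIN * setpoint
    offset_estimate = LAMBDA * error + (1 - LAMBDA) * offset_estimate

    # Compensate: choose the next setpoint so the predicted thickness hits target.
    setpoint = (TARGET - offset_estimate) / GAIN
    print(f"run {run:2d}: measured {measured:6.2f} nm, next setpoint {setpoint:6.2f}")
```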

Ultimately, though, the benefits could extend beyond this, through a journey from the simple machine learning that already exists to a bigger, broader, fully integrated data set. This will enable deeper involvement with AI and an understanding of the factory line’s objectives from yield, throughput, and device-performance perspectives. It will also allow fine-tuning of the entire production path.
