ELECTRONIC DESIGN AUTOMATION
The evolution of electronic design automation technology
Electronic design automation (EDA) tools are available to engineers working at the integrated circuit, printed circuit board, and system levels. This article looks at how EDA has contributed to electronic product development until now, and how it might continue to do so in the future.
The invention of the transistor in 1947 at Bell Laboratories spawned an explosion in electronics technology, and an industry that has profoundly changed our way of life. The evolution of ever-smaller and more powerful devices has proceeded at a relentless pace as companies compete to reach market first and maximize profits from new technology opportunities.
The size of the stakes, and the efforts that tech companies devote to becoming winners, are vividly illustrated in ‘The Soul of A New Machine’, Tracy Kidder’s Pulitzer Prize-winning nonfiction book of 1981. It describes the attempt by Data General in 1980 to develop a next-generation computer to prevent Digital Equipment Corporation (DEC) achieving domination of the new 32-bit minicomputer market. A key theme in the book is the tension between engineering quality and time to market. That tension persists, but it has been mitigated by the appearance and evolution of electronic design automation (EDA) – sometimes known as electronic computer-aided design (ECAD) – tools, which facilitate integrated circuit (IC), printed circuit board (PCB), and complete-system design.
Some EDA companies offer integrated packages that cover all aspects of design, from IC, through PCB, to system. Cadence, for example, offers a broad portfolio of tools to address an array of challenges related to custom IC, digital, IC package, and PCB design and system-level verification. These allow developers to meet their power, performance, and area targets, overcome mixed-signal design constraints, and achieve faster design closure. The role of EDA is especially critical in IC design, however, as modern ICs can comprise many billions of components. Below, we look at how IC EDA has evolved to its current form, and at some factors expected to influence its ongoing development.
Evolution of EDA
Prior to the development of EDA, integrated circuits were designed by hand and manually laid out. Some advanced shops used geometric software to generate tapes for a Gerber photoplotter, which produced a monochromatic exposure image, but even those copied digital recordings of mechanically drawn components. The process was fundamentally graphical, with the translation from electronics to graphics done manually. By the mid-1970s, developers had begun to automate circuit design in addition to drafting, and the first placement and routing tools were developed.
The next era began following the publication of "Introduction to VLSI Systems" by Carver Mead and Lynn Conway in 1980; this ground-breaking text advocated chip design with programming languages that compiled to silicon. The immediate result was a considerable increase in the complexity of the chips that could be designed, with improved access to design verification tools that used logic simulation. Often the chips were easier to lay out and more likely to function correctly, since their designs could be simulated more thoroughly prior to construction.
Although the languages and tools have evolved, this general approach of specifying the desired behavior in a textual programming language and letting the tools derive the detailed physical design remains the basis of digital IC design today.
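A key enabler of this approach was design verification through logic simulation: a netlist could be exercised against test inputs before any silicon was committed. The following sketch shows the idea in miniature; the netlist format, gate names, and half-adder example are illustrative inventions, not any real tool's input format.

```python
# Minimal sketch of gate-level logic simulation, the kind of pre-fabrication
# check early EDA tools automated. The netlist representation here is
# invented for illustration.

GATES = {
    "AND": lambda a, b: a & b,
    "OR":  lambda a, b: a | b,
    "XOR": lambda a, b: a ^ b,
}

def simulate(netlist, inputs):
    """Evaluate a combinational netlist given primary input bit values.

    netlist: ordered list of (output_net, gate_type, input_net_a, input_net_b)
    inputs:  dict mapping primary input net names to 0/1
    """
    nets = dict(inputs)
    for out, gate, a, b in netlist:
        nets[out] = GATES[gate](nets[a], nets[b])
    return nets

# A half adder described structurally: sum = a XOR b, carry = a AND b
half_adder = [
    ("sum",   "XOR", "a", "b"),
    ("carry", "AND", "a", "b"),
]

result = simulate(half_adder, {"a": 1, "b": 1})
# Adding 1 + 1 should give sum = 0 with carry = 1
```

Simulating every input combination of a design like this, before fabrication, is what made the larger chips of the VLSI era practical to verify.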
Current digital flows are extremely modular: front ends produce standardized design descriptions that compile into invocations of ‘cells’, without regard to the technology used to implement those cells. A cell implements a logic or other electronic function using a particular integrated circuit technology. Fabricators generally provide libraries of cells for their production processes, with simulation models that fit standard simulation tools. Analog EDA tools are far less modular, since many more functions are required, they interact more strongly, and the components are, in general, less ideal.
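The split between a technology-independent netlist and a fabricator's cell library can be sketched as follows. The cell names, areas, and delays below are invented for illustration and do not correspond to any real foundry library.

```python
# Illustrative sketch of technology mapping: binding generic logic
# functions to cells from a fabricator-supplied library. All names and
# numbers are invented.

CELL_LIBRARY = {
    # generic function -> (library cell name, area in um^2, typical delay in ns)
    "NAND2": ("NAND2_X1", 1.06, 0.03),
    "INV":   ("INV_X1",   0.53, 0.01),
    "DFF":   ("DFF_X1",   4.52, 0.09),
}

def map_netlist(generic_netlist):
    """Bind each generic function in a netlist to a library cell and
    accumulate total area -- a toy stand-in for technology mapping."""
    mapped, total_area = [], 0.0
    for function in generic_netlist:
        cell, area, _delay = CELL_LIBRARY[function]
        mapped.append(cell)
        total_area += area
    return mapped, total_area

cells, area = map_netlist(["NAND2", "NAND2", "INV", "DFF"])
```

The same generic netlist could be re-mapped to a different process simply by swapping the library, which is exactly the modularity the digital flow relies on.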
Future trends for EDA
The semiconductor industry is being asked to produce ever more complex integrated circuits (ICs) for several reasons. One influence is growing demand from automotive customers developing features such as Advanced Driver Assistance Systems (ADAS) and electric or hybrid vehicles. ADAS reduces driver distraction and workload by assisting drivers through its artificial intelligence – and its growth is stimulating demand for the complex ICs it runs on. That demand will be further stimulated by the appearance of self-driving cars on the market.
ADAS is just one of many applications driving the growth of artificial intelligence (AI) and its associated machine learning (ML) and deep learning (DL) technologies. Semiconductor manufacturers have to supply more complex ICs such as CPUs and GPUs with hundreds of cores, terabytes of memory, and multiple high-speed communication channels – and they require increasingly sophisticated EDA tools to help them.
Additionally, developers often identify a need to optimize their AI performance, without compromising on power, by building dedicated logic. Developing the right AI architecture for a given application requires EDA tools that can work at higher levels of abstraction. Mentor, a Siemens business, is seeing increased business in its Catapult HLS (High-Level Synthesis) technology from companies developing AI IP accelerators for their system-on-chip (SoC) designs.
This enables AI architects to develop their math code, translate it to C or SystemC, and see up front which parts of their algorithm should be implemented in hardware versus software. They can then converge on the ideal architecture much sooner than by going straight down to the register-transfer level (RTL).
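The starting point of such a flow is an executable algorithmic model whose hot loops are candidates for hardware. A minimal sketch of that kind of reference model is shown below, here in Python for brevity; it is illustrative only and is not tied to Catapult or any other specific HLS tool, where the translated version would be written in C or SystemC.

```python
# Toy algorithmic reference model of the kind an architect might prototype
# before deciding which loops to move into hardware via high-level
# synthesis. Illustrative only; not any tool's actual input.

def fir_filter(samples, coefficients):
    """Floating-point reference FIR filter: each output is the dot product
    of the coefficients with a sliding window of input samples."""
    taps = len(coefficients)
    out = []
    for n in range(len(samples) - taps + 1):
        acc = 0.0
        for k in range(taps):          # the inner MAC loop is the obvious
            acc += coefficients[k] * samples[n + k]  # hardware candidate
        out.append(acc)
    return out

# A two-tap moving average as a sanity check; in an HLS flow, outputs of
# the hardware candidate would be compared against this reference model.
reference = fir_filter([1, 2, 3, 4], [0.5, 0.5])
```

Keeping the reference model executable is what lets architects evaluate hardware/software splits quickly, before any RTL exists.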
While EDA can help with AI solution design, AI can equally be used to improve EDA tools. For the last several years, Mentor’s R&D staff has been integrating ML into their own EDA tools. The company currently has five tool offerings commercially available that leverage ML to help deliver better results, and deliver them more quickly.
ML is a natural fit for EDA because ML requires large volumes of data to be effective – and EDA produces data in large volumes. In fact, EDA data is so readily available that, when leveraging ML for EDA, the question becomes: which data sets can be leveraged effectively for which tool functions?
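As a toy illustration of the idea, a tool could learn from past runs to predict a design metric before an expensive step executes. The sketch below fits a least-squares line predicting routed wirelength from net fanout; the data points are invented, and real ML-in-EDA integrations use far richer features and models than a single regression.

```python
# Toy illustration of ML on EDA data: learn a linear relationship between
# net fanout and routed wirelength from (invented) past routing results,
# then use it to estimate wirelength before routing runs.

def fit_line(xs, ys):
    """Ordinary least squares for y = slope * x + intercept."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
             / sum((x - mean_x) ** 2 for x in xs))
    return slope, mean_y - slope * mean_x

# Hypothetical training data: (net fanout, routed wirelength in um)
fanout     = [2, 3, 5, 8, 13]
wirelength = [20, 31, 49, 82, 128]

slope, intercept = fit_line(fanout, wirelength)
predicted = slope * 6 + intercept  # estimated wirelength for a fanout-6 net
```

Even this trivial model captures the point of the article: the training data already exists as a by-product of running the tools, so the open question is which data sets pay off for which tool functions.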