Why is software-driven verification rising to prominence?

April 27, 2016

Chi-Ping Hsu

The system and semiconductor worlds are facing a transition.

Some system companies are doing their own semiconductor design. In fact, it is notable that all the leading smartphone companies design their own application processors.


At the same time, semiconductor companies have to create a large part of the software stack for each system on chip (SoC) since the software and silicon are intimately related. Both these trends mean that software and the SoC need to be designed in parallel.

Software represents the greatest cost and also the biggest bottleneck in SoC design. And, SoCs play an increasingly important role in many electronic systems. So it has become vital to ensure that every part of the system—from chip to package to board—is optimized and verified.

For example, an SoC intended for a smartphone has to run Android (with only one obvious exception). It doesn’t matter whether it is a smartphone company that is designing its own chips or an SoC company that sells standard products to other manufacturers. The requirements are very similar in either case, and Android simply has to run on the chip.

No company designing such a chip is going to tape out the design without first running the Android binary on a model of the chip. This is not just to ensure that the software runs. Other major characteristics, such as the effectiveness of the SoC power architecture or the thermal effects in different modes (making a call, playing a game, listening to an MP3 file) need to be measured, too. This is software-driven hardware verification.

It is no surprise that system companies doing their own semiconductor design employ more software engineers than semiconductor designers. The same holds for semiconductor companies that build SoCs: they, too, have more software engineers than chip designers.

Addressing Verification Challenges

Verification always requires a multi-faceted approach. Software-driven hardware verification, in fact, can only be used relatively late in the design cycle when enough of the design has been completed for the software to run.

Earlier, at the block level, verification can be done with simulation and verification IP (VIP), or with formal techniques, or even with FPGA prototyping. But, ultimately, when the design is approaching tapeout and most of the blocks exist, then the software needs to be run.

However, there is a major challenge. Booting Android, let alone running any application software once it is booted, requires billions of vectors. The SoC on which the software has to run may itself consist of billions of gates. This makes verification very time-consuming and complex, but it has to be done.

The cost in terms of both schedule and dollars is far too great to risk taping out an SoC where all the blocks have been verified but the ultimate system verification of running the software has not been done.

There are two key technologies for software-driven hardware verification:

The first is emulation. Emulators are relatively expensive, but they deliver what is sought: efficient and effective software-driven hardware verification. Over the years, emulation could sometimes be the weak link because of its lack of flexibility and the difficulty of getting a design into the system. Ten years ago, bring-up could literally take months, but the landscape is different now. Emulation tools can accept anything that RTL simulation can accept and compile it extremely fast, in minutes to hours.

The second key technology is the virtual platform. It allows a binary compiled for the SoC's processor to run on a fast software model of that processor, hosted on an ordinary workstation. The software load runs unmodified on this "model" of the processor.

The combination of emulation and the virtual platform working together is called hybrid emulation. The code binary (Android, let’s say) runs on the virtual platform, and the rest of the design can be compiled from RTL and then run on the emulation platform. The two parts are automatically linked together so that the processor can communicate with the rest of the chip.
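The partition described above can be pictured with a small, purely illustrative sketch. The class names (VirtualCPU, Transactor, EmulatedBlock) are hypothetical and do not correspond to any real tool's API; the point is only the split: software runs on a fast processor model, the rest of the design runs as compiled RTL on the emulator, and a transactor bridges bus traffic between the two.

```python
class EmulatedBlock:
    """Stands in for RTL compiled onto the emulator (e.g., a peripheral)."""
    def __init__(self):
        self.regs = {}  # register file, addressed by offset

    def write(self, addr, value):
        self.regs[addr] = value

    def read(self, addr):
        return self.regs.get(addr, 0)


class Transactor:
    """Bridges processor bus transactions into the emulated design."""
    def __init__(self, block):
        self.block = block

    def bus_write(self, addr, value):
        self.block.write(addr, value)

    def bus_read(self, addr):
        return self.block.read(addr)


class VirtualCPU:
    """Fast instruction-level processor model; the software 'binary'
    is modeled here as a Python callable that issues bus traffic."""
    def __init__(self, transactor):
        self.bus = transactor

    def run(self, program):
        return program(self.bus)


# Toy "software load": program a control register, then read it back
# through the transactor from the emulated block.
def firmware(bus):
    bus.bus_write(0x10, 0xA5)
    return bus.bus_read(0x10)
```

In a real hybrid-emulation flow the link between the two sides is set up automatically by the tools; this sketch only shows why the processor model and the emulated RTL can each run in the engine best suited to them while still communicating.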

Software-driven verification is not new, of course. But 2016 is going to be the year when it becomes more important. As the software component of a system grows, so does the need to ensure that the SoC runs all pre-existing software before it reaches the market.

The author is SVP & chief strategy officer for EDA products and technologies at Cadence Design Systems
