DAC Technical Review (Day 2)

On the second day of DAC, I attended a technology session, "Bridging pre-silicon verification and post-silicon validation", a user-track presentation on "An Open Database for the Open Verification Methodology", and the Synopsys VCS demo and verification luncheon. I also visited the booths of the following companies: Realintent, Aldec, IBM, Nextop, eVe, ExpertIO.

Bridging pre-silicon verification and post-silicon validation

This technology session featured a panel discussion on closing the gap between verification and validation. Verification and validation have two very different cultures, which arise from the limitations each side works under, and the industry faces the same problems we face at PMC: trade-offs of control versus speed and cost versus effort in testing. Since the test environments on the two sides are incompatible, it is a challenge to reproduce a problem from one side on the other.

The latest technology for closing the loop between validation and verification is Design for Debug (DFD). The idea of DFD is very simple: insert built-in waveform-probe and signal-poke logic into the silicon, so we get more controllability and observability during validation. DFD is very new, but the aim is to standardize and automate the flow, just like DFT. Simulators will have hooks into the DFD interface to recreate bugs found in validation, or to generate vectors and dump them to the device. It was interesting to see statistics showing that the industry's Rev A success rate has dropped, from 29% in 2002 to 28% in 2007, and that designs need more revisions on average; DFD can speed up the validation process and improve the turnaround between validation and verification. ClearBlue is a DFD tool, and its vendor claims the overhead is only a 1-2% area increase in the silicon; however, the Intel panelists cited a number as high as 10% for their in-house DFD solution. Users can trade off between increasing pin count and adding more probe memory to cope with the bandwidth requirements of the DFD interface.

An Open Database for the Open Verification Methodology

This presentation came out of a university research project. It is similar to what Peter proposed a few years ago: hook up a C++ SQL API to Specman and save all the coverage, or even packet data, to a MySQL database. It is a neat proof-of-concept exercise, but vManager already addresses this problem.
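
As a rough sketch of the concept, here is what the simulator-side hook could feed into such a database. The schema and function names are my own invention, and sqlite3 stands in for the MySQL backend just to keep the example self-contained:

```python
import sqlite3

# Hypothetical schema: one row per coverage bin, exported from the
# simulator at the end of each run.
conn = sqlite3.connect("coverage.db")
conn.execute("""
    CREATE TABLE IF NOT EXISTS coverage (
        run_id      TEXT,     -- simulation run identifier
        cover_group TEXT,     -- covergroup (or e cover struct) name
        bin_name    TEXT,     -- individual bin within the group
        hit_count   INTEGER   -- hits accumulated in this run
    )
""")

def record_bin(run_id, group, bin_name, hits):
    """Insert one coverage-bin sample; a simulator-side hook would call
    this (or its C++ equivalent) for every bin at the end of a test."""
    conn.execute("INSERT INTO coverage VALUES (?, ?, ?, ?)",
                 (run_id, group, bin_name, hits))

record_bin("regress_42", "pkt_cov", "len_64B", 17)
conn.commit()

# Reports then become plain SQL queries, e.g. total hits per bin:
for row in conn.execute("""SELECT cover_group, bin_name, SUM(hit_count)
                           FROM coverage GROUP BY cover_group, bin_name"""):
    print(row)
```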

Synopsys VCS demo and verification luncheon

The VCS demo was not really a demo; it was just marketing slides. However, I chatted with the VCS product manager for half an hour after the demo and managed to hear a few interesting things about VCS.

  1. Cadence has EP and VM for test planning; Synopsys just uses an Excel spreadsheet template. The spreadsheet sucks in the coverage data in XML format to report the test results.
  2. VCS multi-core is more advanced than I had expected. The user can partition the design along logical blocks (subsystems) and run each block on a different core to speed up the simulation. The Cadence multi-core solution does not partition the design; it merely moves some functions, like waveform dumping, assertion checking, and running the testbench, onto different cores. The catch is that each core checks out a VCS license.
  3. VCS has a new constraint resolver, but they use a different approach than igen: they don't break the generation down into ICFSs. It looks like there is more than one constraint-resolver algorithm out there. They claim the new constraint resolver is better than Cadence's, but they are only comparing against pgen; the VCS guy was not aware of igen.
  4. VCS finally supports uni-cov, which Cadence has supported since IES 8.2. They have a tool to merge uni-cov files in parallel, kinda like the NHL playoffs: merge the files in pairs, then merge the results, round by round, until one file is left. I think we can modify our coverage merge script to merge coverage in parallel the same way and avoid crashing; see the sketch after this list.
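
A minimal sketch of the playoff-style merge, assuming a hypothetical `merge_cov` command in place of whatever our actual merge script invokes:

```python
import subprocess
from multiprocessing import Pool

def merge_two(pair):
    """Merge two coverage files into one; 'merge_cov' stands in for
    our real coverage-merge command (hypothetical name)."""
    a, b = pair
    out = a + ".merged"
    subprocess.run(["merge_cov", "-o", out, a, b], check=True)
    return out

def playoff_merge(files):
    """Tournament-style merge: each round merges disjoint pairs in
    parallel, halving the file count until one merged file remains."""
    with Pool() as pool:
        while len(files) > 1:
            pairs = list(zip(files[0::2], files[1::2]))
            merged = pool.map(merge_two, pairs)  # one round, in parallel
            if len(files) % 2:                   # odd file out gets a bye
                merged.append(files[-1])
            files = merged
    return files[0]

# e.g. playoff_merge(glob.glob("regress/*.cov"))
```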

Realintent

This company offers static verification tools that run very early in the development cycle to catch bugs before a testbench exists. I had a demo with them and was able to play around with their tools. LINT is the HDL linting tool; other than having a flimsy GUI and integration with Verdi, I don't see any advantage over HAL. IIV is a tool for implied-intent verification, which analyzes the HDL code and checks it against 20 predefined checks. LINT catches syntax and combinational errors; IIV catches sequential errors, like dead states in a state machine. I don't think IIV is very useful, since the user cannot define custom checks, and the built-in checks only catch careless mistakes or the kind of silly logic errors co-op students make. XV is their tool for 'X'-propagation verification; it is still in beta. The tool reads the RTL code and generates a small Verilog testbench which pokes internal signals to 'X' and checks the propagation; you then run that small testbench on your normal simulator and see whether any 'X' propagates anywhere. I doubt the usefulness of this tool. Lastly, they have ABV for formal assertion checks, but they didn't have a demo set up; I suspect the tool isn't even a working beta. I am not very impressed by Realintent: if their tools work just as advertised, we will probably save a few days of debug time at the beginning of a project, and that's it. I remain skeptical of their claims.
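
From that description, the generated testbench boils down to forcing an internal signal to 'X' and flagging any 'X' on downstream signals. A minimal sketch of such a generator, assuming hypothetical module and signal names (XV's real output format was not shown in the demo):

```python
# Sketch of an X-propagation testbench generator in the spirit of XV,
# as described in the demo. The DUT and signal names are made up.
TEMPLATE = """module x_probe_tb;
  // Instantiate the design under test (hypothetical module).
  dut u_dut();

  initial begin
    #100;
    force u_dut.{victim} = 1'bx;   // poke an internal signal to 'X'
    #10;
    release u_dut.{victim};
    #1000;
    $finish;
  end

  // Flag any 'X' reaching an observed downstream signal.
  always @(u_dut.{observed})
    if (^u_dut.{observed} === 1'bx)
      $display("X propagated to {observed} at %0t", $time);
endmodule
"""

def gen_tb(victim, observed, path="x_probe_tb.v"):
    """Emit one small Verilog testbench per victim/observed pair,
    to be run on the normal simulator."""
    with open(path, "w") as f:
        f.write(TEMPLATE.format(victim=victim, observed=observed))

gen_tb("core.ctrl_state", "core.bus_grant")
```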

Aldec
They used to provide OEM simulators to Xilinx and other FPGA vendors; now they are selling the simulator standalone. It runs on Windows and Linux, comes with a lint tool, and supports assertion checking (but not formal analysis). The tool targets FPGA designs, since it probably won't be able to handle a 50M-gate ASIC design. The IDE GUI is simple and pretty, but lacks the features and strength of IES.

IBM
I went to the IBM booth to find out what's new in the DOORS and vManager integration. The IBM guy brought in Michael Mussey from Cadence, who oversees the vManager project, when he walked by us. In short, the integration is not working yet. On the planning front, DOORS will generate the vPlan file from the design spec via a RIF (Requirements Interchange Format) XML file, so verifiers only have to map the coverage and checkers in the vPlan. On the reporting front, Cadence is working on a script that takes the vPlan and UCM, generates a UCIF (universal coverage input file), and feeds it back to DOORS. Another potential application is using DOORS for the verification schedule, since DOORS has a plugin that talks to Microsoft Project. It looks like historical data is not saved in DOORS; it only reports the current view. Michael from Cadence mentioned that they are adding a MySQL backend to vManager to report historical data; I think we can look into using this new feature to replace our Excel spreadsheet. DOORS has bug-tracking tool integration as well: a confirmed bug report should automatically trigger a change request in the requirement spec. We may need to upgrade our PREP system to work with DOORS.

Nextop
Nextop is very interesting: it generates assertions (PSL or SVA) automatically by monitoring your simulation. It is a unique solution to the problem of who writes the assertions. Their tool analyzes how the signals are used in the simulation and generates a list of PSL or SVA statements as the properties of the block. The designer then has to go through the list (a few hundred of them) and categorize each one: either it should always hold true (an assertion), or it is only true because we haven't run enough stimulus (a coverage point). The testbench then incorporates the assertions and uses them for the rest of the simulation. My only concern is that their solution seems too good to be true, and I can't tell the quality of the auto-generated assertions from the demo; I would like to evaluate this tool and see whether the generated assertions are useful or just junk. The simulation overhead is about 10-15% when the tool is turned on to collect information. It only works at the block level at the moment, and the biggest design they have ever tried was only 12K lines of code. The designer is the weakest link in their flow, since the designer has to check and classify each generated assertion one by one.
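
The demo didn't reveal how the mining actually works, but the flavor is presumably something like the following sketch: scan a recorded trace for simple temporal relationships that are never violated, and propose each survivor as a candidate property. Everything here (the trace format, the req/ack rule, the signal names) is my own illustration, not Nextop's algorithm:

```python
# Toy assertion miner: propose "a is followed by b within N cycles"
# for every signal pair, keep only rules the trace never violates,
# and leave the assertion-vs-coverage call to the designer.
from itertools import permutations

def holds(trace, req, ack, window=4):
    """True if every pulse on `req` sees a pulse on `ack` within `window` cycles."""
    for t, cycle in enumerate(trace):
        if cycle[req] and not any(c[ack] for c in trace[t:t + window + 1]):
            return False
    return True

def mine(trace, signals, window=4):
    return [(a, b) for a, b in permutations(signals, 2)
            if holds(trace, a, b, window)]

# A recorded trace: one dict of signal values per clock cycle.
trace = [
    {"req": 1, "gnt": 0, "busy": 0},
    {"req": 0, "gnt": 1, "busy": 1},
    {"req": 0, "gnt": 0, "busy": 1},
    {"req": 1, "gnt": 1, "busy": 0},
]
for a, b in mine(trace, ["req", "gnt", "busy"]):
    # Emitted as SVA; the designer must still classify each candidate.
    print(f"assert property (@(posedge clk) {a} |-> ##[0:4] {b});")
```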

eVe
They make emulation boxes. They talked about testbench speed-up, so I was interested in their presentation, but it turns out they mean their box only supports synthesizable testbenches. They don't have any interface for the FPGA array to communicate with the simulator. They kept telling me that I can easily hook up the emulation box to the simulator by building custom PLI functions. Yeah, right. It looks like not many emulation boxes out there support simulation acceleration; it is probably only supported by the boxes from the big three simulator vendors.

ExpertIO
A verification IP vendor. They have Ethernet, SAS, SATA, and FC VIP. The offering looks good; the only problem is that the VIP is implemented in encrypted behavioral Verilog with a SystemVerilog wrapper to control the transactor. They refused to show me what the API of the VIP looks like unless PMC goes through the official NDA process. The only information I could get was a useless feature list, so I can't tell how easy or annoying their VIP would be to use.
