DAC Technical Review (Days 4 & 5)

The exhibition floor is closed on days 4 and 5. On day 4, I attended user track presentations on verification and a technical session on what input language to use for HLS. On day 5, I attended a workshop on software engineering using Agile software development techniques.

User track presentation on verification

The presentation Migrating HW/SW co-verification platforms from simulator to emulators outlines a few challenges in the flow: 1. compiling ASIC/IP vendor simulation models for the emulator; 2. generating the primary clock in emulation; 3. choosing between transaction-based VIP and emulator-specific transactor IP.

The presentation A Methodology for automatic generation of register bank RTL, related verification environment and firmware headers outlines a flow similar to our RDA flow. The difference between RDA and their flow is that they support IP-XACT as the register definition format and use Tcl/Java to generate the files. The register XML files are translated into Java classes, then the registers are read from the Java classes to generate the register RTL, vr_ad register files and firmware header files. Their flow does not support auto-generation of the backdoor path in vr_ad; neither does our RDA flow.
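
As a thought experiment, here is a minimal Python sketch of that kind of generator, assuming a made-up register XML schema (the real flow reads IP-XACT and emits RTL and vr_ad files as well as headers):

```python
# Toy register-generation sketch in the spirit of the flow above.
# The XML schema here is made up for illustration; a real flow would
# read IP-XACT and also emit register RTL and vr_ad files.
import xml.etree.ElementTree as ET

SPEC = """
<block name="demo" base="0x1000">
  <register name="CTRL"   offset="0x0"/>
  <register name="STATUS" offset="0x4"/>
</block>
"""

def gen_header(block):
    """Emit firmware #define lines for every register in the block."""
    base = int(block.get("base"), 16)
    lines = [f"/* auto-generated firmware header for {block.get('name')} */"]
    for reg in block.findall("register"):
        addr = base + int(reg.get("offset"), 16)
        lines.append(f"#define {block.get('name').upper()}_{reg.get('name')} 0x{addr:08X}")
    return "\n".join(lines)

block = ET.fromstring(SPEC)
print(gen_header(block))
```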

In the presentation Utilizing assertion synthesis to achieve an automated assertion-based verification methodology for a complex graphics chip designs, nVidia demonstrates the use of the Nextop tool to generate properties representing design functionality. The flow is pretty much the same as what's outlined at Nextop's booth. The presentation introduced a few new concepts. The first is the notion of assertion density, which measures the number of assertion properties required to define the functionality of the design. Then there is the difference between synthesized properties and synthesizable properties: the former refers to properties auto-generated using Nextop's flow, the latter to assertions that can run inside an emulation box. However, the specification mining is only as good as the simulation traces fed into the tool.

In the presentation A smart debugging strategy for billion-gate SOCs, Samsung presents a solution to a common problem we have in verification. When a simulation fails, we need the waveform to debug. On one hand, it takes time to rerun the simulation and dump all the waveforms; on the other hand, it takes up disk space and slows the simulation down if we dump all the waveforms in every simulation run. An approach to solve this problem is to save checkpoints during the simulation, then rerun the simulation and dump the waveform from the checkpoint closest to where the simulation failed. We attempted to implement a home-grown solution using ncsim's native save/restart function, but that function has been very buggy and very inefficient in terms of snapshot size. The presentation introduces a tool from System Centroid called SSi-1 to automate the flow. It is worth evaluating SSi-1 to see how well it solves the problem of dumping waveforms in re-simulation. The only concern is that System Centroid is a Korean company and most information on its website is written in Korean.
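
The core of such a flow is simple enough to sketch. Below is a minimal Python illustration, assuming snapshots are saved periodically and indexed by simulation time; the actual restore/rerun invocation depends on the simulator, so it is only a placeholder here:

```python
# Sketch of the checkpoint-and-rerun idea: given a failure time, pick the
# latest snapshot at or before it, then rerun from there with wave dumping
# enabled. The rerun command is a placeholder, not a real simulator CLI.
import bisect

def pick_checkpoint(checkpoint_times, fail_time):
    """Return the latest checkpoint taken at or before the failure time."""
    i = bisect.bisect_right(checkpoint_times, fail_time)
    return checkpoint_times[i - 1] if i else None

checkpoints = [0, 1_000_000, 2_000_000, 3_000_000]  # snapshot times in ns
fail_time = 2_750_000                               # where the sim failed

start = pick_checkpoint(checkpoints, fail_time)
print(f"restore snapshot @ {start} ns, enable wave dump, run to {fail_time} ns")
```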

In the presentation Bug hunting methodology using semi-formal verification techniques, ST Microelectronics introduces a way to combine formal verification with simulation. The idea is to invoke the formal engine at pre-defined search points during the simulation to limit the scope of the formal search space. The formal engine can be triggered when a certain RTL condition is met, at intervals, on FSM states or on coverage events.

What Input Language is the best choice for High-Level Synthesis (HLS)?
This is the most heated debate session at DAC. The panel invited speakers from Cadence, Mentor, Synfora, Bluespec, Forte and AutoESL for a showdown on their HLS methodologies. In this three-way debate, the languages of choice are C/C++, SystemC and SystemVerilog. All the speakers are biased one way or another because they represent companies that have invested millions of dollars in a particular language, so each naturally advocates that his language of choice is better than the others.

The benefit of using SystemVerilog over C++ or SystemC is that SV allows the designer to specify the hardware architecture. The weakness of SystemC and C++ is that they follow a sequential model and lack ways to specify concurrency. Architecture decisions cannot be left to the synthesis tool, since architecture is the first-order optimization; a C++ or SystemC HLS tool has to rely on compiler directives to specify the hardware architecture.

The benefit of C/C++ is that it is the only language used by both HW and SW developers. Algorithms are modeled in C/C++, which makes C/C++ the most natural input to an HLS tool. The modeler or SW developer does not need to learn a new language, and there is no need to translate the C/C++ code into another language. Using C/C++ can postpone defining the architecture by separating the layers of abstraction, or even postpone the decision on the HW/SW boundary.

SystemC is kind of halfway between C/C++ and SystemVerilog. Its advocates think it has the best of both worlds; others think it has the worst of both worlds. It provides limited language constructs to define timing information and concurrency. It can define a more accurate hardware architecture than C/C++, but it also carries the burden of a 40-year-old programming language that was not designed to describe hardware in the first place. However, SystemC is supported by Cadence, Mentor, Forte and NEC CyberWorkBench, the four biggest HLS tool vendors.

Agile Programming Techniques
I signed up for a full-day workshop on How to Write Better Software on day 5. The workshop is conducted by IBM internal software training consultants. IBM is huge on agile software development. Agile projects focus on four elements: stable, time-boxed short iterations; stakeholder feedback; self-directed teams; and a sustainable pace. The workshop introduced two agile methodologies, eXtreme Programming (XP) and Scrum.

In XP, there are 12 programming practices; the instructor did not go over all of them in the workshop. The major practices mentioned are: 1. lean software development, 2. test-driven development (TDD), 3. automation, 4. continuous integration/build and 5. pair programming. Lean software development applies value stream mapping to eliminate waste. TDD focuses on the ideas of unit testing and refactoring.

In Scrum, the project is divided into 2-4 week sprints. At the beginning of each sprint, there is a sprint planning meeting. The product owner determines the priority of all the user stories in the product backlog; the scrum team then picks the sprint backlog and commits to the sprint goal. The scrum master removes roadblocks for the scrum team. A user story describes functionality that will be useful to a stakeholder of the system. Within a sprint, the team should get everything done (coded, tested, documented) for the picked user stories. The scrum team conducts a short 15-minute daily scrum meeting to report progress since yesterday, the plan for today, and roadblocks that need to be resolved. At the end of the sprint, there is a sprint review meeting and a demo of the sprint goal. Unfinished user stories go back into the product backlog and have their priority re-evaluated.

Verification is a huge software project in its own right, as we have already created more lines of code than the RTL design. I think applying Agile programming techniques will help us improve the quality of our work. The workshop is just an introduction to Agile: it outlines what Agile is and its potential benefits, but leaves out details on the know-how. It would be nice to learn more about how to apply Agile in a verification setting, as our work is not exactly the same as software development projects at IBM. Moreover, knowing the principles of Agile is one thing; avoiding pitfalls during execution is another. There are many real-life problems that need to be sorted out to make an Agile project successful. The workshop did not talk about how to estimate a schedule with Agile given that planning is only done within each sprint, how to manage people within an Agile team, how to deal with free riders, how to deal with differences in skill levels, or how to deal with tasks that no one wants to work on.

Given that the workshop is a 3-day IBM internal training squeezed into 1 day, it is understandable that a lot of information was left out. However, I left the workshop unsatisfied; I expected to learn more about Agile from it.

DAC Technical Review (Day 2)

On the 2nd day of DAC, I attended a technology session, Bridging pre-silicon verification and post-silicon validation, a user track presentation on An Open Database for the Open Verification Methodology, and the Synopsys VCS demo and verification luncheon. I also visited the booths of the following companies: Realintent, Aldec, IBM, Nextop, EVE and ExpertIO.

Bridging pre-silicon verification and post-silicon validation

This technology session has a panel discussion on closing the gap between verification and validation. Verification and validation have two very different cultures, arising from the limitations of our work; the industry has the same problem we are facing at PMC. There is a trade-off between control and speed, and between cost and effort, in testing. Since the test environments on the two sides are incompatible, it is a challenge to duplicate a problem from one side on the other. The latest technology to close the loop between validation and verification is Design for Debug (DFD). The idea of DFD is very simple: insert built-in waveform probes and signal poke logic into the silicon, so we get more control and observability in validation. DFD is very new, but they are aiming to get the flow standardized and automated, just like DFT. Simulators will have hooks into the DFD interface to recreate bugs found in validation, or to generate vectors and dump them to the device. It is interesting to see statistics showing the industry's RevA success rate dropping, from 29% in 2002 to 28% in 2007, with designs needing more revs on average. DFD can speed up the validation process and give a better turnaround between validation and verification. ClearBlue is a DFD tool, and its vendor claims the overhead is a 1-2% area increase in the silicon; however, the Intel panel guest cited a number as high as 10% for their in-house DFD solution. Users can trade off between increasing the pin count and adding more probe memory to cope with the bandwidth requirement of the DFD interface.

An Open Database for the Open Verification Methodology

This presentation comes out of a university research project. It's like what Peter proposed a few years ago: hook up a C++ SQL API to Specman and save all the coverage, or even packet data, to a MySQL database. It is a neat proof-of-concept exercise, but vManager has already addressed this problem.
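
For illustration, here is a toy version of the idea, with Python's built-in sqlite3 standing in for the C++/MySQL stack so the sketch is self-contained; the schema and data are made up:

```python
# Toy coverage-to-SQL sketch: the schema and rows are invented, and sqlite3
# stands in for MySQL so the example runs without any server.
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE coverage (
    run_id TEXT, cover_group TEXT, bucket TEXT, hits INTEGER)""")

# In the real flow, a simulator callback would stream these rows out.
samples = [
    ("run_001", "pkt_len_cg", "len_64",   42),
    ("run_001", "pkt_len_cg", "len_1518",  3),
    ("run_002", "pkt_len_cg", "len_64",   17),
]
db.executemany("INSERT INTO coverage VALUES (?, ?, ?, ?)", samples)

# Total hits per bucket across runs: the kind of report vManager produces.
for bucket, hits in db.execute(
        "SELECT bucket, SUM(hits) FROM coverage GROUP BY bucket"):
    print(bucket, hits)
```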

Synopsys VCS demo and verification luncheon

The VCS demo is not really a demo; it's just marketing slides. However, I chatted with the VCS product manager for half an hour after the demo and managed to hear a few interesting things about VCS.

  1. Cadence has EP and VM for test planning; Synopsys just uses an Excel spreadsheet template.  The spreadsheet sucks in the coverage data in XML format to report the test results.
  2. VCS multi-core is more advanced than I had expected.  The user can partition the design along logical blocks (subsystems) and run each block on a different core to speed up the simulation.  The Cadence multi-core solution does not partition the design; it merely moves some functions, like waveform dumping, assertion checking and running the testbench, to a different core.  The catch is that each core checks out a VCS license.
  3. VCS has a new constraint solver, but they use a different approach than igen: they don't break down the generation into ICFS.  It looks like there is more than one constraint solver algorithm out there.  They claim the new constraint solver is better than Cadence's, but they are only comparing against pgen; the VCS guy is not aware of igen.
  4. VCS finally supports uni-cov, which Cadence has supported since IES 8.2.  They have a tool to merge uni-cov files in parallel, kind of like the NHL playoffs.  I think we can modify our coverage merge script to merge coverage in parallel to avoid crashing; a sketch of this playoff-style merge follows this list.
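
Here is a rough Python sketch of that playoff-style parallel merge. merge_two() is a stub; a real script would shell out to the vendor's merge utility at that point:

```python
# Playoff-style parallel coverage merge: pair the files up, merge each pair
# in a worker process, and repeat rounds until one merged file remains.
from multiprocessing import Pool

def merge_two(pair):
    a, b = pair
    out = f"({a}+{b})"
    # Real flow: subprocess.run([MERGE_TOOL, a, b, "-out", out])
    return out

def playoff_merge(files):
    while len(files) > 1:
        pairs = list(zip(files[::2], files[1::2]))
        bye = [files[-1]] if len(files) % 2 else []   # odd file gets a bye
        with Pool() as pool:
            files = pool.map(merge_two, pairs) + bye  # one round in parallel
    return files[0]

if __name__ == "__main__":
    print(playoff_merge([f"run{i}.cov" for i in range(5)]))
```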

Realintent

This company offers static verification tools that run very early in the development cycle to catch bugs before a testbench exists.  I had a demo with them and was able to play around with their tools.  LINT is the HDL linting tool; other than having a flimsy GUI and integration with Verdi, I don't see any advantage over HAL.  IIV is a tool for implied intent verification, which analyzes the HDL code and checks it against 20 predefined checks.  LINT catches syntax and combinational errors; IIV catches sequential errors, like a dead state in a state machine.  I don't think IIV is very useful, since the user cannot define custom checks; the built-in checks only catch careless mistakes or silly logical errors made by co-op students.  XV is their tool for 'X' propagation verification; it is still in beta.  The tool reads the RTL code and generates a small Verilog testbench which pokes internal signals to 'X' and checks the propagation.  It then runs that small testbench on your normal simulator to see whether any 'X' propagates anywhere.  I doubt the usefulness of this tool.  Lastly, they have ABV for formal assertion checks, but they don't have a demo set up; I suspect the tool is not even a working beta yet.  I am not very impressed by Realintent: if their tools work as advertised, we will probably save a few days of debug time at the beginning, and that's it.  I am always skeptical about such claims.
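
To make the XV mechanism concrete, here is a toy Python generator for that kind of X-injection testbench. The DUT, the signal names and the check are all made up; it only illustrates the poke-and-watch idea:

```python
# Toy illustration of the XV idea: emit a tiny Verilog testbench that forces
# one internal net to 'x' and watches an output for X propagation.
# Module and signal names here are invented for the sketch.
TB_TEMPLATE = """module xprop_tb;
  reg clk = 0;
  always #5 clk = ~clk;
  dut u_dut (.clk(clk));                // hypothetical DUT instance
  initial begin
    #100 force u_dut.{victim} = 1'bx;   // poke an internal net to X
    #100 release u_dut.{victim};
  end
  always @(posedge clk)
    if (^u_dut.{watch} === 1'bx)        // reduction XOR is x if any bit is x
      $display("X reached %s at %0t", "{watch}", $time);
endmodule
"""

print(TB_TEMPLATE.format(victim="ctrl_reg", watch="data_out"))
```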

Aldec
They used to provide OEM simulators to Xilinx and other FPGA vendors; now they are selling it as a standalone simulator. The simulator runs on Windows and Linux. It comes with a lint tool and supports assertion checking (but not formal analysis). This tool targets FPGA designs, since it probably won't be able to handle a 50M-gate ASIC design. The IDE GUI is simple and pretty, but lacks the features and strength of IES.

IBM
I went to the IBM booth to find out what's new in the DOORS and vManager integration. The IBM guy brought in Michael Mussey from Cadence, who oversees the vManager project, when he walked by us. In short, the integration is not working yet. On the planning front, DOORS will generate the vPlan file from the design spec via a RIF (Requirements Interchange Format) XML file; verifiers then only have to map the coverage and checkers in the vPlan. On the reporting front, Cadence is working on a script that takes the vPlan and UCM, generates a UCIF (universal coverage input file) and feeds it back to DOORS. Another potential application is to use DOORS for the verification schedule; DOORS has a plugin that talks to Microsoft Project. It looks like historical data is not saved in DOORS; DOORS only reports the current view. Michael from Cadence mentioned that they are adding a MySQL backend to vManager to report historical data. I think we can look into using this new feature to replace our Excel spreadsheet. DOORS has bug tracking tool integration as well: a confirmed bug report should automatically trigger a change request in the requirement spec. We may need to upgrade our PREP system to work with DOORS.

Nextop
Nextop is very interesting: it generates assertions (PSL or SVA) automatically by monitoring your simulation. It is a unique solution to the problem of who writes the assertions. Their tool analyzes how the signals are used in the simulation and generates a list of PSL or SVA statements as the properties of the block. The designer then has to go through the list (a few hundred of them) and categorize each property as something that should always hold true (an assertion) or something that is only true because we haven't run enough stimulus (a coverage point). The testbench then incorporates the assertions and uses them for the rest of the simulation. My only concern is that their solution seems too good to be true, and I can't tell the quality of the auto-generated assertions from the demo. I would like to evaluate this tool and see whether the generated assertions are useful or just junk. The simulation overhead is about 10-15% when the tool is turned on to collect information. Currently it only works at block level, and the biggest block they have ever tried has only 12K lines of code. The designer is the weakest link in their flow, since the designer has to check and classify each generated assertion one by one.
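
As a toy illustration of this kind of property mining, the Python sketch below scans a recorded trace and proposes one candidate req/ack property. The signal names and trace are made up, and, as in the real flow, a human still has to classify the result as an assertion or a stimulus gap:

```python
# Toy property miner: propose 'req is followed by ack within N cycles'
# from a trace, or report a counterexample if the pattern never holds.
def mine_req_ack(trace, max_lat=4):
    worst = 0
    for i, cycle in enumerate(trace):
        if cycle["req"]:
            window = trace[i + 1:i + 1 + max_lat]
            lat = next((j + 1 for j, c in enumerate(window) if c["ack"]), None)
            if lat is None:
                return None            # counterexample: no candidate property
            worst = max(worst, lat)    # tighten the bound to what was seen
    return f"assert property (req |-> ##[1:{worst}] ack);"

trace = [{"req": 1, "ack": 0}, {"req": 0, "ack": 1},
         {"req": 1, "ack": 0}, {"req": 0, "ack": 0}, {"req": 0, "ack": 1}]
print(mine_req_ack(trace))   # -> assert property (req |-> ##[1:2] ack);
```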

EVE
They make emulation boxes. They talked about testbench speed-up, so I was interested in their presentation, but it turns out they mean their box only supports synthesizable testbenches. They don't have any interface for the FPGA array to communicate with the simulator. They kept telling me that I can easily hook the emulation box up to the simulator by building custom PLI functions. Yeah, right. It looks like not many emulation boxes out there support simulation acceleration; it is probably only supported by the boxes from the big 3 simulator vendors.

ExpertIO
A verification IP vendor. They have Ethernet, SAS, SATA and FC VIP. The offering looks good, but the problem is that the VIP is implemented in encrypted behavioral Verilog with a SystemVerilog wrapper to control the transactor. They refused to show me what the API of the VIP looks like unless PMC goes through the official NDA process. The only information I could get is a useless feature list of their VIP; I can't tell how easy or annoying their VIP is to use.

DAC 2010 Technical Report (Day 1)

Today is the report of my first day at DAC. I signed up for a full-day technical workshop, Choosing Advanced Verification Methodology. After the workshop ended at 3:30pm, I managed to check out a few companies on the exhibition floor: Vennsa Technologies, Agnisys Inc and Veritools.

Advanced Verification Methodology

The workshop is smaller than I expected; there are only about 20 attendees. It started off with a keynote from Brian Bailey, a verification consultant, on the latest trends in verification. Assertions and ESL seem to be the themes of the day.

We finally see formal verification coming out of academic research, being put into use by the industry, and developing a good use model. There are 7-8 formal tool vendors in the market right now, but looking at historical data in the EDA industry, no matter the technology, the market is only big enough for 2 of them to survive.

ESL is the latest buzzword. The term has many different meanings, but basically it means where software and hardware come together. For verification, ESL means we are building reusable testbenches with different abstraction layers: starting from the top with a TLM model to verify the algorithm, then pushing down to verify the architecture, and then the RTL implementation at the very bottom. TLM 2.0 is the new industry standard and has pretty much swept aside all the proprietary prototypes from different vendors. TLM 2.0 still lacks synthesis support and has no hardware/software interface.

Currently, many people model ESL in SystemC, but both SystemC and SystemVerilog need a few more revisions to fully support ESL. The new ANSI concurrent C/C++ coming out this year may turn SystemC into an obsolete branch of C/C++. High-level synthesis, the C-to-RTL compiler, is almost an ESL enabler: it separates the architecture from the behavioral description. The shift from RTL coding to high-level synthesis would be as disruptive as the shift from schematic capture to RTL coding.

Constrained random generation is a challenge in ESL verification: current tools do not understand sequential depth and can't constrain across sequences. Functional coverage is broken; it is merely a set of isolated observations that do not necessarily reflect the verification progress. We need a different metric that provides a direct measure of verification closure.

In ESL development, management will be a new challenge. Now that we have to develop the hardware and software in the same development cycle, there will be conflicting schedules between the hardware team and the software team. Communication among the different teams, and clear interface management at the partition between the software and hardware implementations, is the key.

In the next few years, the speaker predicts, there will be more research and probably technology breakthroughs in these areas: specification capture in verification, sequential equivalence checking, intelligent testbenches, assertion synthesis and behavioral indexing.

After the keynote session came the customer panel. The panel guests are from Intel, ARM and nVidia; the ARM and nVidia guests are assertion experts, while the Intel guy is more on the ESL side. It is a Q&A session, but nothing special: the guests only talk about very generic things. They tell us what they do, but not how they do it.

Jasper has the next presentation, with nVidia and ARM giving customer testimony. They talked about their formal verification tool and introduced basic concepts like full proof, counterexample and partial proof. There are quite a few neat examples of formal verification: generating an input sequence for a given output trace, answering an urgent customer question on whether something is possible in the design, verifying deadlock/livelock, and checking 'X' propagation. Both ARM and nVidia have dedicated assertion teams, and they said that is important to the success of using assertions in verification.

Synopsys presents new updates to VCS simulation. They close the coverage-to-constraint loop with their Echo testbench technology; it is similar to what Cadence has, and it is limited to coverage that has a direct relationship to the constraints. VCS finally has multi-core support; I think Cadence has already had it since IES 8.2. We should look into using both technologies for Digi, and work with the CAD group to set up special LSF machines reserved for multi-core jobs.

TSMC talks about its Open Innovation Platform (OIP) for ESL verification. The virtual platform enables hardware/software co-simulation. The testbench is built with a top-down approach: start with an ESL TB to verify the algorithm, then an ESL SoC TB to verify the ESL model, then add cycle-accurate adapters to the ESL model, and finally the RTL testbench.

Mentor talks about ESL testbenches and presents a success story of TLM-to-RTL synthesis verification. They claim the high-level synthesis flow saved them lots of time because the same testbench is used at every abstraction level from top to bottom.

There is nothing new in Cadence's presentation; they just show how vPlan fits into the ESL flow.

Vennsa Technologies
It is a small company in Toronto, based on the research of a professor from U of Toronto. Their OnPoint debug tool is pretty neat. It is an add-on to the simulation that helps the designer pinpoint a bug. Once you have an assertion failure, you can fire up the OnPoint GUI; the tool back-traces the logic cone, narrows it down and suggests where the bug probably is. You can also start the back-trace from a given value on an output pin at a given time. I played with their demo for almost half an hour, and it is a very handy tool if it works as advertised. The idea behind the tool sounds solid and I think we should evaluate it.

Agnisys Inc
This company has two products: IVerifySpec, a web GUI replacement for vManager, and iDesignSpec, a half-baked solution similar to RDA.

IVerifySpec uses a SQL database to store the vPlan, but it does not support UCD directly; it has to translate the UCD to an XML offline and import it into the database. There are a few nice features in the GUI, like a heat map, a traceability matrix, and some charts and graphs that look like Google Analytics. However, their tool is very immature overall: it does not support multi-level hierarchy in the requirement specification, has no revision control, and data entry via the web interface is very tedious and user-unfriendly. I should simply ask Cadence to copy those nice reporting features into vManager.

iDesignSpec sounds good on paper, but the implementation is awkward. You enter the register specification in Word using a funny plug-in, and the plug-in then generates PDF, HTML, XML, VHDL, OVE and C++ files. Somehow it is the exact opposite of our RDA flow, where we enter our register description in XML and generate one thing at a time using scripts. The format of their Word plug-in is very ugly, and the code and PDF files generated by the plug-in are very primitive. I would say even our old ECBI generator is better than this tool. The only useful thing I learned from this presentation is that there are industry standards for register description, SystemRDL and IP-XACT. Maybe our RDA tool should support the industry standards as well.

Veritools

Their flagship product is Veritools Designer. It's basically a Debussy/Verdi clone: it can view schematics and waveforms and do source code debugging. They claim their tool is very fast and costs only 1/4 as much. I am always skeptical about those claims, and I don't like that they use their own waveform database format, which means the simulator has to dump another waveform through their PLI. The GUI is fast, but the design in the demo is not very big. The GUI is quite primitive compared to SimVision, and they can't beat the price of SimVision, which comes free with IES. I do agree SimVision is a bit slow, but I think investing in faster computers with bigger RAM can solve that problem. They have an add-on tool called Veritools Verifyer. This tool is kind of dumb: if there is an assertion failure, it reads in the waveform and lets you test changes to the assertion code without invoking the simulator. I don't think it is very useful. When an assertion fails, how often is it due to an RTL bug, and how often is it just a faulty assertion?

DAC 2010 – First Impression

This is the first time I am going to the Design Automation Conference (DAC), or any industry conference, and it is really an eye-opener for me. When I first started working at PMC during the dot-com bubble days, the company promised to send us to a conference every two years. Unfortunately, before my turn came, the bubble burst and the company was in survival mode for almost a decade. Finally, we are back on a growth track and the company has money to invest in developing its employees, with a budget to send us to industry conferences.

There are a few reasons I picked DAC. First, it is the biggest conference in the EDA industry. It has 3 days of exhibition and 2 more days of tutorials and workshops. You can see everything under the same roof: all the tool vendors, 3rd party IP providers, the big names and the new start-ups you have never heard of. Second, there are many workshop and tutorial sessions dedicated to verification, so I can learn what's new out there and what other companies are doing. In fact, there are so many interesting sessions that I could not see them all. Last, the conference is in Anaheim, right next to Disneyland. I am flying Pat down here, and we will spend the weekend after the conference as a mini-vacation.

The latest technology presented by the exhibitors is amazing, but I am equally amazed by the registration system. After you have registered online, you can pick up your badge at the registration desk. The process is very smooth: just scan the bar code and your card is printed right in front of you. The card has a built-in RF chip, so you no longer have to hand out business cards to exhibitors; they have a cell-phone-like device that scans your card and prints out your information automatically.

There are lots of freebies at DAC. It's only the first day of the exhibition and I have only covered 1/3 of the floor, yet I have already collected the following: 1 backpack, 3 T-shirts, 4 balls, 2 highlighter pens, a measuring tape, a battery-powered cell phone charger, a pair of wrist bands and a book, "The Ten Commandments for Effective Standards". Other than freebies, there is free beer. Last night we had the kick-off reception sponsored by Intel; tonight I went to the Denali party in Downtown Disney. Although the beer is bottomless, no one abuses the kind offer and gets drunk. It is an industry conference after all; you don't want to embarrass yourself in front of potential clients and employers. The industry is a small world after all.

I am looking forward to the rest of the week. I am going to write about the new technology I learn at DAC every day. Stay tuned.