Category Archives: Daily Scribble

My random thoughts of the day.

Chess Analysis #1

Inspired by my friend's blog, I joined chess.com through Facebook. I played a few blitz games with random people online. I found playing a live human is more fun than playing the computer, because the computer does not make blunders. My record is straight losses so far; this is the closest game in which I had a very good chance to win. I lost the game to a very stupid blunder. I am playing Black.

[pgn]
[Event "Live Chess"]
[Site "Chess.com"]
[Date "2010.07.25"]
[White "aligray"]
[Black "hevangel"]
[Result "1-0"]
[WhiteElo "1002"]
[BlackElo "1058"]
[TimeControl "100|0"]
[Termination "aligray won by resignation"]

1.e4 Nc6 2.Nf3 Nf6 3.Nc3 d6 4.d4 e5 5.d5 Ne7 6.Nb5 Bd7 7.Be3 c6 8.Nxa7 cxd5 9.Ng5 d4 10.Bc4 Be6
11.Bb5+ Bd7 12.Bd2 Bxb5 13.Nxb5 h6 14.Nf3 Nxe4 15.Qe2 Nxd2 16.Qxd2 Qa5 17.Qxa5 Rxa5 18.Nc7+ Kd7 19.b4 Ra8 20.Nb5 Nc6
21.a3 Be7 22.O-O f5 23.Rad1 Bf6 24.c3 g5 25.g3 g4 26.Nh4 Bxh4 27.gxh4 Rhg8 28.c4 f4 29.c5 dxc5 30.bxc5 g3
31.f3 Ke6 32.Nc7+ Kf5 33.Nxa8 Rxa8 34.Ra1 Re8 35.a4 e4 36.a5 e3 37.a6 Na7 38.axb7 e2 39.Rfb1 Nc6 40.Ra8 Rxa8
41.bxa8=Q
[/pgn]
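As an aside, the tag pairs at the top of the [pgn] block follow the standard PGN header syntax (PGN expects straight double quotes, which blog software tends to mangle into curly ones). A few lines of stdlib Python can pull the headers out; the snippet below hard-codes a subset of the tags from this game:

```python
import re

# A subset of the tag pairs from the game above, in standard PGN syntax
pgn_headers = '''[Event "Live Chess"]
[White "aligray"]
[Black "hevangel"]
[Result "1-0"]'''

# Each header line has the form [TagName "value"]
tags = dict(re.findall(r'\[(\w+) "([^"]*)"\]', pgn_headers))
print(tags["White"], "vs", tags["Black"], "->", tags["Result"])
```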

1.e4 Nc6
I did not use the standard response to the king pawn opening. Just wanted to try something new.

2.Nf3 Nf6
White did not push the queen pawn as expected. I made a bad move: my knight on f6 is exposed to the pawn. If the white pawn pushes to f5, it will be covered by the knight.

3.Nc3 d6
White developed the other knight. I moved the queen pawn to clear the way for the bishop.

4.d4 e5
White pushed the queen pawn. I pushed the king pawn to invite a gambit.

5.d5 Ne7
White pushed the queen pawn, eyeing my knight. I lost a tempo retreating the knight.

6.Nb5 Bd7
White pushed a lone knight forward. I moved the bishop to chase it.

7.Be3 c6
White developed the bishop, targeting a7. I moved the pawn to c6, targeting d5. Actually, moving b6 to guard against the bishop would have been a better idea.

8.Nxa7 cxd5
White took the a7 pawn with the knight. I would have been in more trouble if he had taken c6 first, or taken a7 with the bishop. His blunder let me exchange a side pawn for a center pawn, and his knight is trapped by my queen.

9.Ng5 d4
I wonder whether White tried to attack with the other knight or merely put it there to protect the e4 pawn. I should have moved my knight to c6 and the white knight would have been dead. Instead, I pushed my center pawn to pressure the white bishop and contest the center. I think I opened more fronts than I can handle.

10.Bc4 Be6
White's bishop attacked f7; I protected it with a bishop.

11.Bb5+ Bd7
White's bishop checked; I moved my bishop back to block.

12.Bd2 Bxb5
White retreated the bishop to d2. I exchanged bishop for bishop. I should have exchanged the knights instead and moved the side pawn toward the center.

13.Nxb5 h6
White's knight took my bishop and escaped. I moved the h-pawn to chase the other knight away.

14.Nf3 Nxe4
White's knight retreated. I took the unprotected pawn in the middle with my knight. It looks like I controlled the center of the board.

15.Qe2 Nxd2
White's queen chased my knight. I exchanged the knight for the bishop.

16.Qxd2 Qa5
White finished the exchange; I initiated a queen exchange.

17.Qxa5 Rxa5
White finished the exchange, and my rook is chasing the knight.

18.Nc7+ Kd7
Knight check. My King moved away.

19.b4 Ra8
White pushed the pawn to chase my rook. I blundered by moving the rook to a8.

20.Nb5 Nc6
White didn't see my blunder and pulled the knight back. I have left the knight wandering for too long; I should have planned for the endgame after the queen exchange. I moved my knight to threaten the unprotected b-pawn.

21.a3 Be7
White moved a pawn to protect the b-pawn. I moved my bishop to bring it to the front line.

22.O-O f5
White castled. I moved the f-pawn, but I should have moved the bishop to gain a tempo.

23.Rad1 Bf6
Firepower is building up at d4.

24.c3 g5
White moved the pawn to add more firepower at d4. I can't take the c3 pawn, or the rook will give a check. I tried to match with my other pawn and push forward.

25.g3 g4
White can't move any firepower away from d4, so he matched my other pawn. I should have moved the king away or invited a pawn exchange, but I tried to chase away his knight for no reason. In the endgame, the leading side should initiate exchanges to widen the lead.

26.Nh4 Bxh4
White seemed to make a suicidal move, and I exchanged a bishop for a knight.

27.gxh4 Rhg8
White's king defense is broken. It looked like I was going to win, and I became careless. I wasted a move using the rook to pin down the white king. I should have moved my king away from White's rook pin.

28.c4 f4
Both sides matched pawn pushes. Again, I should have chased away the white knight.

29.c5 dxc5
White initiated a pawn exchange.

30.bxc5 g3
We finished the exchange, and White now has two isolated pawns. I pushed the g-pawn.

31.f3 Ke6
White pushed the f-pawn past my g-pawn. I should have kept marching the g-pawn; it is safe since it is protected by the rook. Instead I moved my king and let White fork my king and rook. The tide turned against me.

32.Nc7+ Kf5
White capitalized on my blunder. I should have exchanged off that knight a long time ago.

33.Nxa8 Rxa8
Exchange of pieces

34.Ra1 Re8
White wants to march the a-pawn. I should have kept marching the g-pawn.

35.a4 e4
Why would I march the e pawn?

36.a5 e3
Keep marching.

37.a6 Na7
I am losing the march.

38.axb7 e2 39.Rfb1 Nc6 40.Ra8 Rxa8 41.bxa8=Q
I accepted my fate and resigned after White got a new queen.

Project Management Fundamental course

I challenged the PMP exam without having any project management training. I just studied the exam questions like I studied for any other public exam. When I got my PMP credential, I joked that I don't have any practical knowledge of being a good project manager, but I sure can talk like one.

After many painful project schedule slips, my company finally realized the importance of project management and suddenly jumped on the project management bandwagon. Everyone who has to lead something in the next project was sent to a 2-day project management training off-site in a nice hotel. Breakfast and lunch were included, but too bad we had to pay for our own parking.

The training was really useful, even to a certified PMP like myself. It was not sitting there all day flipping through boring slides; we learned project management through a series of exercises designed to get us familiar with project management concepts and introduce us to some best practices. We were divided into 4 groups and told to come up with a mock project to work on. The other groups simply used their existing projects for the exercise. My group was more creative: we came up with a fake project trying to find a better replacement for a tool everyone hates. It turns out a fake project is better than a real one. It freed us from personal bias and irrelevant technical details, so we could focus on the project management process without worrying about the work piled up ahead, and have some fun.

Everyone got a certificate after completing the two-day training, but I am the only person in the class who cares about that piece of paper. I can use the training toward the continuing education requirement for my PMP credential renewal: I get 16 PDU credits and the company pays for them. The training spent most of its time on the planning phase. I wish the company would give us another training on the project tracking and monitoring phase. Not only do we have problems planning our projects right, we also have problems keeping track of our progress throughout the project. I could also earn more PDU credits and get more free lunches!

DAC Technical Review (Day 4,5)

The exhibition floor was over by days 4 and 5. On day 4, I attended user track presentations on verification and a technical session on what input language is best for HLS. On day 5, I attended a workshop on software engineering using Agile software development techniques.

User track presentation on verification

The presentation Migrating HW/SW co-verification platforms from simulators to emulators outlines a few challenges in the flow: 1. compiling ASIC/IP vendor simulation models for the emulator; 2. generating the primary clock in emulation; 3. using transaction-based VIP versus emulator-specific transactor IP.

The presentation A methodology for automatic generation of register bank RTL, related verification environment and firmware headers outlines a flow similar to our RDA flow. The differences are that they support IP-XACT as the register definition format and use Tcl/Java to generate the files. The register XML files are translated into Java classes, then the registers are read from the Java classes to generate the register RTL, the vr_ad register files and the firmware header files. Their flow does not support auto-generation of backdoor paths in vr_ad; neither does our RDA flow.
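Whatever the implementation language, the XML-to-headers step they describe is easy to picture. Here is a minimal sketch in Python, using a made-up miniature register format rather than real IP-XACT, and emitting only the firmware header output (the RTL and vr_ad outputs would hang off the same parsed model):

```python
import xml.etree.ElementTree as ET

# Hypothetical miniature register definition; real flows use full IP-XACT
REG_XML = """
<block name="DMA" base="0x4000">
  <register name="CTRL"   offset="0x00"/>
  <register name="STATUS" offset="0x04"/>
</block>
"""

def gen_firmware_header(xml_text):
    """Emit C #define lines giving each register's absolute address."""
    block = ET.fromstring(xml_text)
    base = int(block.get("base"), 16)
    lines = []
    for reg in block.findall("register"):
        addr = base + int(reg.get("offset"), 16)
        lines.append("#define %s_%s 0x%04X" % (block.get("name"), reg.get("name"), addr))
    return "\n".join(lines)

print(gen_firmware_header(REG_XML))
```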

In the presentation Utilizing assertion synthesis to achieve an automated assertion-based verification methodology for complex graphics chip designs, nVidia demonstrates the use of the Nextop tool to generate properties representing design functionality. The flow is pretty much the same as what's outlined at Nextop's booth. The presentation introduced a few new concepts. First is the notion of assertion density, which measures the number of assertion properties required to define the functionality of the design. Then there is the difference between synthesized properties and synthesizable properties: the former are properties auto-generated using Nextop's flow; the latter are assertions able to run inside the emulation box. However, the specification mining is only as good as the simulation traces fed into the tool.

In the presentation A smart debugging strategy for billion-gate SoCs, Samsung presents a solution to a common problem we have in verification. When a simulation fails, we need the waveform to debug it. On one hand, it takes time to rerun the simulation and dump the full waveform; on the other hand, dumping the full waveform in every simulation run takes up disk space and slows the simulation down. One approach is to save checkpoints during the simulation, then rerun from the checkpoint closest to the failure and dump the waveform from there. We attempted a home-grown solution using ncsim's native save/restart function, but that function has been very buggy and very inefficient in terms of snapshot size. The presentation introduces a tool from System Centroid called SSi-1 to automate the flow. It is worth evaluating SSi-1 to see how well it solves the problem of dumping waveforms in re-simulation. The only concern is that System Centroid is a Korean company and most of the information on its website is written in Korean.
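The checkpoint-and-rerun idea is simple enough to sketch. The snippet below is illustrative only (not SSi-1 or ncsim behavior): it just picks the latest snapshot at or before the failure time, which is where re-simulation would restart with waveform dumping enabled:

```python
import bisect

def pick_checkpoint(checkpoints, fail_time):
    """Return the latest saved snapshot at or before the failure time.
    checkpoints: sorted list of simulation times where snapshots were saved."""
    i = bisect.bisect_right(checkpoints, fail_time) - 1
    return None if i < 0 else checkpoints[i]  # None: rerun from time 0

# Failure at 250us -> rerun from the 200us snapshot, dump waves from there on
saves = [0, 100_000, 200_000, 300_000]   # snapshot times in ns
print(pick_checkpoint(saves, 250_000))
```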

In the presentation Bug hunting methodology using semi-formal verification techniques, ST Microelectronics introduces a way to combine formal verification with simulation. The idea is to invoke the formal engine at pre-defined search points during the simulation, to limit the scope of the formal search space. The formal engine can be triggered when a certain RTL condition is met, on an interval, on an FSM state or on a coverage event.

What Input Language is the best choice for High-Level Synthesis (HLS)?
This was the most heated debate session at DAC. The panel invited speakers from Cadence, Mentor, Synfora, Bluespec, Forte and AutoESL for a showdown on their HLS methodologies. In this three-way debate, the languages of choice are C/C++, SystemC and SystemVerilog. All the speakers are biased one way or another because they represent companies that have invested millions of dollars in a particular language, so of course they advocate that their choice of language is better than the others.

The benefit of using SystemVerilog over C++ or SystemC is that SV allows the designer to specify the hardware architecture. The weakness of SystemC and C++ is that they follow a sequential model and lack ways to specify concurrency. The architecture decision cannot be made by the synthesis tool, since it is the first-order optimization; a C++ or SystemC HLS tool has to use compiler directives to specify the hardware architecture.

The benefit of C/C++ is that it is the only language used by both HW and SW developers. Algorithms are modeled in C/C++, which makes C/C++ the most natural input to an HLS tool. The modeler or SW developer does not need to learn a new language, and there is no need to translate the C/C++ code into another language. Using C/C++ can postpone defining the architecture by separating the layers of abstraction, or even postpone decisions on the HW/SW boundary.

SystemC is kind of halfway between C/C++ and SystemVerilog. Its advocates think it has the best of both worlds; others think it got the worst of both. It provides limited language constructs to define timing information and concurrency. It can define a more accurate hardware architecture than C/C++, but it also carries the burden of a 40-year-old programming language that was not designed to describe hardware in the first place. However, SystemC is supported by Cadence, Mentor, Forte and NEC CyberWorkBench, the four biggest HLS tool vendors.

Agile Programming Techniques
I signed up for a full-day workshop on How to Write Better Software on day 5. The workshop was conducted by IBM internal software training consultants. IBM is huge on agile software development. Agile projects focus on four elements: stable, time-boxed short iterations; stakeholder feedback; self-directed teams; and a sustainable pace. The workshop introduced two agile methodologies, eXtreme programming (XP) and Scrum.

In XP there are 12 programming practices; the instructor did not go over all of them in the workshop. The major practices mentioned were: 1. lean software development, 2. test-driven development (TDD), 3. automation, 4. continuous integration/build and 5. pair programming. Lean software development applies value stream mapping to eliminate waste. TDD focuses on the ideas of unit testing and refactoring.

In Scrum, the project is divided into 2-4 week sprints. At the beginning of a sprint there is a sprint planning meeting: the product owner determines the priority of all the user stories in the product backlog, then the scrum team picks the sprint backlog and commits to the sprint goal. The scrum master removes roadblocks for the scrum team. A user story describes functionality that will be useful to a stakeholder of the system. Within the sprint period, the team should get everything done (coded, tested, documented) for the picked user stories. The scrum team conducts a short 15-minute daily scrum meeting to report progress since yesterday, the plan for today and any roadblocks that need to be resolved. At the end of the sprint period there is a sprint review meeting and a demo of the sprint goal. Unfinished user stories are put back into the product backlog and their priority is re-evaluated.

Verification is a huge software project in its own right, as we have already written more lines of code than the RTL design. I think applying agile programming techniques would help us improve the quality of our work. The workshop was just an introduction to Agile: it outlines what Agile is and its potential benefits, but leaves out the details of the know-how. It would be nice to learn more about how to apply Agile in a verification setting, as our work is not exactly the same as software development projects at IBM. Moreover, knowing the principles of Agile is one thing; avoiding pitfalls during execution is another. There are many real-life problems to sort out to make an Agile project successful. The workshop did not talk about how to estimate a schedule with Agile given that planning is only done within each sprint, how to manage people within an Agile team, how to deal with free riders, how to handle differences in skill levels, or what to do with tasks that no one wants to work on.

Given that the workshop is a 3-day IBM internal training squeezed into 1 day, it is understandable that a lot of information was left out. However, I left the workshop unsatisfied; I expected to learn more about Agile.

DAC Technical Review (Day 3)

On the 3rd day of DAC, I went to the user track presentations on formal verification and checked out the booths of Onespin, Jasper, SpringSoft, Tuscany, AMIQ, Starnet, Forte Design Systems and CyberWorkBench.

User track presentation on formal verification

The user track presentations are where users of EDA tools present their work on how to make the best and most creative use of them.  There were about 40 presentations today, and about 10 of them related to verification.  I read the presentation posters and then talked to the presenters to exchange ideas.  Here are a few pointers I picked up:

  • Use a formal verification tool to aid code coverage closure.  For a line of code that is not yet covered, we can write a property for that line, feed it to the formal engine and ask it to come up with an input sequence that triggers the assertion.  The formal engine will either generate an input sequence or prove the line is unreachable.
  • Sun/Oracle has the idea of running the properties in simulation to keep formal and simulation in sync.  The trick is to have some “useless” signals in the DUT to qualify the assertion checks, to avoid tons of false negatives when the DUT is in an invalid state.
  • One presentation presents results showing that using formal verification early in the development cycle catches more bugs in FV.  Duh!
  • This is a good one.  In formal verification there are two types of properties: abstract properties, which are safe but incomplete, and constraint properties, which are unsafe but complete.  Choosing which type to use is a trade-off between finding counterexamples and getting a full proof of the design.
  • An exhaustive proof is difficult; it is more practical to limit the proof to some reasonable depth.
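The coverage-closure trick in the first bullet is mechanical enough to automate: each uncovered line becomes a reachability goal for the formal engine. A rough Python sketch, where the file/line/condition tuples and the property naming are invented purely for illustration:

```python
def cover_properties(uncovered):
    """uncovered: list of (file, line, condition) for code the simulator never hit.
    Emit one SVA cover property per line; the formal engine then either finds a
    trace reaching it, or proves the line unreachable (dead code)."""
    props = []
    for path, line, cond in uncovered:
        name = "cov_%s_L%d" % (path.replace(".", "_"), line)
        props.append("%s: cover property (@(posedge clk) %s);" % (name, cond))
    return props

# Hypothetical coverage hole: line 142 of dma_ctl.v was never exercised
holes = [("dma_ctl.v", 142, "state == FLUSH && fifo_empty")]
print("\n".join(cover_properties(holes)))
```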

Onespin

This company builds formal verification tools.  Their basic product is similar to IFV, except it has a better, easier-to-use GUI that allows users to do a lot more interaction and visualization.  Their flagship product is operational formal ABV: instead of defining basic cause-and-effect properties in ABV, the tool provides an assertion library that lets the user define operational properties.  The user then goes through iterations to get full coverage of the formal space with the aid of the tool.  The idea is to generate a set of properties that completely define the RTL.  I think the tool will work as advertised, because in the end it is a human who has to enter the missing properties.  However, I wonder what the use of a complete ABV definition of the RTL is.  The idea seems totally upside down.  I guess the point is that instead of auditing the implementation of the RTL code, the auditor should audit the complete properties of the code.

One thing I don't like about Onespin is that they have way too many products, and it's really confusing.  The flagship product has all the features, and the rest are just crippled versions with fancy marketing terms, simply to confuse users.  For example, the difference between two of the products is only the ability to load external property files versus requiring the properties to be in the same file as the RTL code.  I don't like this kind of marketing ploy that exists simply to milk more money from the customer.

Jasper

Jasper is THE formal verification tool vendor.   I spent almost 2 hours (and had my lunch) in their booth, walking through all the demos and trying out almost all the features in their product.  This is the tool of choice in many formal verification presentations at DAC.  The tool is much more user-friendly and powerful than IFV; IFV seems primitive compared to Jasper.  Jasper also has property libraries for different memory and FIFO models instead of just black-boxing them out.  It supports tunneling, so the user can steer the formal engine.  It comes with a lot more formal engines than IFV and offers very clever ways to get a counterexample or a proof.  Active Design uses the same formal engine but for a different application: if we have poorly documented legacy RTL, a new designer can use Active Design to generate properties of the RTL and understand exactly what the RTL can and cannot do.  Another benefit is that when we ask the designer whether the RTL can do such and such, we no longer have to take their word for it; the designer can generate a recipe to prove their design and answer our question.  Jasper has an account manager for PMC, and she knows Bob and Alan.  I think we really should try Jasper in Digi and get Bob to set up the tool for us.

SpringSoft

SpringSoft acquired Debussy.  Debussy and Verdi have not changed much, other than adding support for power-aware design and SystemVerilog.  Siloti is a very neat add-on to Debussy for waveform compression.   The idea is very neat: in simulation we really only need the input, output and flip-flop values in the waveform database, and the waveform viewer can calculate the values of the combinational signals on the fly.  The waveform database is only 25% of the original size.  Certitude is a tool to test the testbench.  It corrupts the DUT and checks whether the testbench fails.  If the testbench still passes when there is a corrupted signal in the DUT, there must be something wrong with the testbench.
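The essential-signals idea behind Siloti can be shown with a toy example: store only the inputs and flop outputs, and let the viewer recompute combinational nets on demand. The one-gate design below is made up purely to illustrate the principle:

```python
# Toy "essential signals only" waveform store: keep inputs and flop outputs,
# and recompute combinational nets on demand instead of storing them.
stored = {              # per-cycle values actually written to the database
    "a": [0, 1, 1, 0],  # primary input
    "b": [1, 1, 0, 1],  # primary input
    "q": [0, 0, 1, 0],  # flop output
}

def view(signal, cycle):
    """Waveform viewer: derive combinational values on the fly."""
    if signal == "and_out":           # not in the database: recompute from inputs
        return stored["a"][cycle] & stored["b"][cycle]
    return stored[signal][cycle]      # essential signal: read directly

print(view("and_out", 1))
```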

Tuscany

This company has only one product: a web GUI to display timing reports.  I like the GUI, even though I don't know much about timing reports.  I can see it solves the problem of keeping track of so many timing reports.

AMIQ

Another small company with only one product.  They have DVT, an IDE for Specman and SystemVerilog based on Eclipse, the open-source Java IDE.  The IDE works like Visual Studio: it has an editor, data browser, structure tree, keyword auto-complete, quick linting and hooks to launch Specrun, all under the same GUI.  It is a lot more user-friendly than editing e code with vi or Emacs.  They are working on a debug interface hooking into the simulation for the next release; it will work like gdb.  I highly recommend purchasing a few licenses (1k per seat, but I am sure Bob can negotiate a volume discount if we buy more) and giving them to the team to evaluate the product. I think we will see a productivity increase with the DVT IDE instantly.

Starnet

They are selling a VNC replacement that they claim is much faster than VNC.  I know CAD is evaluating some fast VNC-like software right now; maybe we should get CAD to try out this product as well.  We all know how painful it is to view waveforms in Bangalore via VNC.

Forte Design Systems and CyberWorkBench

Both companies sell high-level synthesis (HLS) tools that compile SystemC into RTL code.  It looks like HLS is finally here.  I don't have enough domain knowledge to evaluate the tools.  All I can tell is they have a nice GUI, and the generated RTL code is not very readable.  I asked about limitations on the SystemC code and the efficiency of the generated RTL, and I only got the typical marketing answers.  Too bad that both tools only work with SystemC; it would be nice if there were an HLS tool for behavioral SystemVerilog.

DAC Technical Review (Day 2)

On the 2nd day of DAC, I attended a technology session Bridging pre-silicon verification and post-silicon validation, a user track presentation on An Open Database for the Open Verification Methodology, and the Synopsys VCS demo and verification luncheon, and I visited the booths of the following companies: Realintent, Aldec, IBM, Nextop, eVe, ExpertIO.

Bridging pre-silicon verification and post-silicon validation

This technology session had a panel discussion on closing the gap between verification and validation. Verification and validation have two very different cultures, arising from the limitations of our work. The industry has the same problems we are facing at PMC: trade-offs between control vs. speed and cost vs. effort in testing. Since the test environments are incompatible between the two sides, it is a challenge to reproduce a problem from one side on the other. The latest technology to close the loop between validation and verification is Design for Debug (DFD). The idea of DFD is very simple: insert built-in waveform probes and signal poke logic into the silicon, so we get more control and observability in validation. DFD is very new, but they are aiming to standardize and automate the flow just like DFT. Simulators will have hooks into the DFD interface to recreate bugs found in validation, or to generate vectors and dump them to the device. It is interesting to see statistics showing the Rev A success rate has dropped in the industry, from 29% in 2002 to 28% in 2007, with more revisions on average. DFD can speed up the validation process and improve the turnaround between validation and verification. ClearBlue is a DFD tool, and they claim the overhead is a 1-2% area increase in the silicon; however, the Intel panel guest cited a number as high as 10% for their own in-house DFD solution. Users can trade off between increasing the pin count or adding more probe memory to cope with the bandwidth requirements of the DFD interface.

An Open Database for the Open Verification Methodology

This presentation comes out of a university research project. It's like what Peter proposed a few years ago: hook up a C++ SQL API with Specman and save all the coverage, or even packet data, to a MySQL database. It is a neat proof-of-concept exercise, but vManager has already addressed this question.

Synopsys VCS demo and verification luncheon

The VCS demo was not really a demo, just marketing slides. However, I chatted with the VCS product manager for half an hour after the demo and managed to hear a few interesting things about VCS.

  1. Cadence has EP and VM for test planning; Synopsys just uses an Excel spreadsheet template.  The spreadsheet sucks in the coverage data in XML format to report the test results.
  2. VCS multi-core is more advanced than I had expected.  The user can partition the design along logical blocks (subsystems) and run each block on a different core to speed up the simulation.  The Cadence multi-core solution does not partition the design; it merely moves some functions, like waveform dumping, assertion checking and running the testbench, to a different core.  The catch is that each core checks out a separate license.
  3. VCS has a new constraint resolver, but they use a different approach than igen: they don't break down the generation into ICFS.  It looks like there is more than one constraint resolver algorithm out there.  They claim the new constraint resolver is better than Cadence's, but they are only comparing to pgen; the VCS guy was not aware of igen.
  4. VCS finally supports uni-cov, which Cadence has supported since IES 8.2.  They have a tool to merge uni-cov files in parallel, kind of like the NHL playoffs.  I think we can modify our coverage merge script to merge coverage in parallel to avoid crashing.
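The playoff-style parallel merge in point 4 is easy to sketch. Assuming coverage files reduce to per-bin hit-count maps (a big simplification of real uni-cov data), each round merges independent pairs, so a real flow could farm the pair merges out to parallel jobs:

```python
def merge2(a, b):
    """Merge two coverage maps by summing hit counts per bin."""
    return {k: a.get(k, 0) + b.get(k, 0) for k in set(a) | set(b)}

def tournament_merge(runs):
    """Merge in rounds like a playoff bracket: each round halves the number of
    files, and the pair merges within a round are independent of each other."""
    while len(runs) > 1:
        nxt = [merge2(runs[i], runs[i + 1]) for i in range(0, len(runs) - 1, 2)]
        if len(runs) % 2:            # odd file out gets a bye to the next round
            nxt.append(runs[-1])
        runs = nxt
    return runs[0]

print(tournament_merge([{"binA": 1}, {"binA": 2}, {"binB": 1}]))
```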

Realintent

This company offers static verification tools that run very early in the development cycle to catch bugs before we have a testbench.  I had a demo with them and was able to play around with their tools.  LINT is the HDL linting tool; other than having a flimsy GUI and integrating with Verdi, I don't see any advantage over HAL.   IIV is a tool for implied intention verification, which analyzes the HDL code and checks it against 20 predefined checks.  LINT catches syntax errors and combinational errors; IIV catches sequential errors, like dead states in a state machine.  I don't think IIV is very useful, since the user cannot define custom checks; the built-in checks only catch careless mistakes or stupid logical errors made by co-op students.  XV is their tool for 'X' propagation verification; it is still in beta.  The tool reads the RTL code and generates a small Verilog testbench which pokes internal signals to 'X' and checks the propagation.  The tool then runs that small testbench on your normal simulator to see whether any 'X' propagates anywhere.  I doubt the usefulness of this tool.  Lastly, they have ABV for formal assertion checks, but they didn't have a demo set up; I suspect the tool is not even a working beta.  I am not very impressed by Realintent.  If their tools work as advertised, we will probably save a few days of debug time at the beginning, and that's it.  I am skeptical about their claims.

Aldec
They used to provide OEM simulators to Xilinx and other FPGA vendors; now they are selling it as a standalone simulator. The simulator runs on Windows and Linux. It comes with a lint tool and supports assertion checking (but not formal analysis). This tool targets FPGA designs, since it probably can't handle a 50M-gate ASIC design. The IDE GUI is simple and pretty, but lacks the features and strength of IES.

IBM
I went to the IBM booth to find out what's new in the DOORS and vManager integration. The IBM guy brought in Michael Mussey from Cadence, who oversees the vManager project, when he walked by us. In short, the integration is not working yet. On the planning front, DOORS will generate the vPlan file from the design spec via a RIF (requirement input format) XML file; verifiers only have to map the coverage and checkers in the vPlan. On the reporting front, Cadence is working on a script that takes the vPlan and UCM, generates a UCIF (universal coverage input file) and feeds it back to DOORS. Another potential application is using DOORS for the verification schedule; DOORS has a plugin that talks to Microsoft Project. It looks like historical data is not saved in DOORS; DOORS only reports the current view. Michael from Cadence mentioned that they are adding a MySQL backend to vManager to report historical data. I think we can look into using this new feature to replace our Excel spreadsheet. DOORS has bug tracking tool integration as well: a confirmed bug report should automatically trigger a change request in the requirement spec. We may need to upgrade our PREP system to work with DOORS.

Nextop
Nextop is very interesting. Their tool generates assertions (PSL or SVA) automatically by monitoring your simulation. It is a unique solution to the problem of who writes the assertions. The tool analyzes how the signals are used in the simulation and generates a list of PSL or SVA statements as the properties of the block. The designer then has to go through the list (a few hundred of them) and categorize each one: either it should always hold true (an assertion), or it is only true because we haven't run enough stimulus (a coverage point). The testbench then incorporates the assertions and uses them for the rest of the simulations. My only concern is that their solution seems too good to be true, and I can't tell the quality of the auto-generated assertions from the demo. I would like to evaluate this tool and see whether the generated assertions are useful or just junk. The simulation overhead is about 10-15% when the tool is turned on to collect information. Currently it only works at the block level, and the biggest block they have ever tried has only 12K lines of code. The designer is the weakest link in their flow, since the designer has to check and classify each generated assertion one by one.
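To make the "who writes the assertions" idea concrete, here is a toy mining pass in the same spirit (not Nextop's actual algorithm): scan simulated values and propose mutual-exclusion candidates, which a human must then triage into real assertions versus under-stimulated coverage holes:

```python
from itertools import combinations

def mine_mutex_candidates(trace):
    """Toy assertion mining: from simulated values, propose 'signals x and y
    are never both 1' candidates.  Like any trace mining, the output is only
    as good as the stimulus: short traces produce false candidates."""
    cands = []
    for x, y in combinations(sorted(trace), 2):
        if all(not (a and b) for a, b in zip(trace[x], trace[y])):
            cands.append("assert never (%s && %s)" % (x, y))
    return cands

# grant0/grant1 look mutually exclusive in this (short!) made-up trace
trace = {"grant0": [1, 0, 0, 1], "grant1": [0, 1, 0, 0], "req": [1, 1, 1, 1]}
print(mine_mutex_candidates(trace))
```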

eVe
They make emulation boxes. They talked about testbench speed-up, so I was interested in their presentation, but it turns out they mean their box only supports synthesizable testbenches. They don't have any interface for the FPGA array to communicate with the simulator. They kept telling me that I can easily hook up the emulation box to the simulator by building custom PLI functions. Yeah, right. It looks like there are not many emulation boxes out there that support simulation acceleration; probably it is only supported by the boxes from the big 3 simulator vendors.

ExpertIO
Verification IP vendor. They have Ethernet, SAS, SATA and FC VIP. The offering looks good, but the only problem is the VIP is implemented in encrypted behavioral Verilog with a SystemVerilog wrapper to control the transactor. They refused to show me what the API of the VIP looks like unless PMC goes through the official NDA process. The only information I could get is a useless feature list of their VIP; I can't tell how easy or annoying their VIP is to use.