Category Archives: Daily Scribble

My random thoughts of the day.

WordPress 3.0

I am using WordPress for my blog. WordPress just released 3.0 last month. I had been putting the upgrade on hold for a while since I was too busy at work and at home. I finally have some time to back up the database and the blog installation directory, and upgrade my WordPress installation.
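
For reference, the backup step is roughly this in script form, a minimal sketch assuming a typical WordPress setup: a MySQL database plus an installation directory. The database name, user and paths below are placeholders, and mysqldump is assumed to read the password from ~/.my.cnf.

    import subprocess, tarfile, time

    STAMP = time.strftime("%Y%m%d")

    # Dump the database (placeholder user/database names).
    with open(f"wp_db_{STAMP}.sql", "wb") as dump:
        subprocess.run(["mysqldump", "--user=wp_user", "wordpress_db"],
                       stdout=dump, check=True)

    # Archive the blog installation directory (placeholder path).
    with tarfile.open(f"wp_files_{STAMP}.tar.gz", "w:gz") as tar:
        tar.add("/var/www/wp", arcname=f"wp_{STAMP}")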

It is a push-button upgrade. The new version downloaded and installed on my server without any trouble, except that the Easy AdSense plug-in kept corrupting my layout. Everything works fine after disabling that plug-in. WordPress 3.0 is not much different from 2.9.x on the surface. The only new feature that I use is the Get Shortlink button. I am using a custom theme, so the new Twenty Ten theme with configurable header image and background is of no concern to me. The new custom taxonomies feature is very powerful, especially for creating pages with different page types, but I don't see any use for it in my blog. I haven't got around to setting up the new custom menu feature; I am using pages as my menu and I am not planning to change.

I upgraded my Atahualpa theme to the latest version that works with WordPress 3.0. I spent a few minutes cleaning up how pictures are displayed in the sidebar. I set up a different sidebar for the posts than the one used on the front page. The major change is adding the series plug-in. This plug-in groups multiple posts into a series and inserts series navigation links in the posts and in the sidebar. I uninstalled the Twitter plug-in, since I don't really use Twitter.

I think my blog is up to date and pretty good for now. I don't have plans for any major upgrade in the near future.

Chess Analysis #1

Inspired by my friend's blog, I joined chess.com on Facebook. I played a few blitz games with random people online. I found that playing a live human is more fun than playing a computer, because a computer does not make blunders. My record is straight losses so far, and this was the closest game, one I had a very good chance to win. I lost it to a very stupid blunder. I am playing Black.

1.e4 Nc6
I did not use the standard response to the king pawn opening. I just wanted to try something new.

2.Nf3 Nf6
White did not push the queen pawn as expected. I made a bad move: my Knight on f6 is exposed to the pawn. If the white pawn pushes to f5, it will be protected by the Knight.

3.Nc3 d6
White developed the other Knight. I moved the queen pawn to clear the way for the Bishop.

4.d4 e5
White pushed the queen pawn. I pushed the king pawn to invite a gambit.

5.d5 Ne7
White pushed the queen pawn, eyeing my Knight. I lost a tempo by retreating the Knight.

6.Nb5 Bd7
White pushed a lone Knight forward. I moved the Bishop to chase it.

7.Be3 c6
White developed the Bishop, targeting a7. I moved the pawn to c6, targeting d5. Actually, moving b6 to protect against the Bishop would have been a better idea.

8.Nxa7 cxd5
White took a7 with the Knight. I would have been in more trouble if he had taken c6 first or taken a7 with the Bishop. His blunder allowed me to exchange a side pawn for a center pawn, and his Knight is trapped by my Queen.

9.Ng5 d4
I wonder whether White was trying to attack with the other Knight or merely putting it there to protect the e4 pawn. I should have moved my Knight to c6, and the white Knight would be dead. Instead, I pushed my center pawn to pressure the white Bishop and claim the center. I think I opened more front lines than I can handle.

10.Bc4 Be6
The white Bishop attacked f7; I protected it with my Bishop.

11.Bb5+ Bd7
The white Bishop gave check; I moved my Bishop back to defend.

12.Bd2 Bxb5
White retreated the Bishop to d2; I exchanged Bishop for Bishop. I should have exchanged the Knights instead and moved the side pawn to the center.

13.Nxb5 h6
The white Knight took my Bishop and escaped. I moved the h pawn to chase the other Knight away.

14.Nf3 Nxe4
The white Knight retreated. I took the unprotected pawn in the middle with my Knight. It looks like I control the center of the board.

15.Qe2 Nxd2
The white Queen chased my Knight. I exchanged the Knight for the Bishop.

16.Qxd2 Qa5
White finished the exchange; I initiated a Queen exchange.

17.Qxa5 Rxa5
White finished the exchange, and my Rook chased the Knight.

18.Nc7+ Kd7
Knight check. My King moved away.

19.b4 Ra8
White pushed the pawn to chase my Rook. I made a blunder by moving the Rook to a8.

20.Nb5 Nc6
White didn't see my blunder and drew the Knight back. I had left that Knight wandering around for too long. I should have planned for the endgame after the Queen exchange. I moved my Knight to threaten the unprotected b pawn.

21.a3 Be7
White moved a pawn to protect the b pawn. I moved the Bishop to bring it to the front line.

22.O-O f5
White castled. I moved the f pawn, but I should have moved the Bishop to gain a tempo.

23.Rad1 Bf6
Firepower is building up at d4.

24.c3 g5
White moved the pawn to put more firepower on d4. I can't exchange the c3 pawn or the Rook will give a check. I tried to match the other pawn and push forward.

25.g3 g4
White can't move any firepower away from d4, so he matched the other pawn. I should have moved my King away or invited a pawn exchange, but I tried to chase away his Knight for no reason. In the endgame, the side that is ahead should initiate exchanges to widen its lead.

26.Nh4 Bxh4
White seemed to make a suicide move, and I exchanged a Bishop for a Knight.

27.gxh4 Rhg8
White's King defense is broken. It looked like I was going to win, and I became careless. I wasted a move using the Rook to pin the white King. I should have moved my King away from White's Rook pin.

28.c4 f4
Both sides matched pawns. Again, I should have chased away the white Knight.

29.c5 dxc5
White initiated an exchange of pawns.

30.bxc5 g3
We finished the exchange, and White now has two isolated pawns. I pushed the g pawn.

31.f3 Ke6
White pushed the f pawn past my g pawn. I should have kept marching the g pawn; it is safe since it is protected by the Rook. Instead, I moved my King and let White fork my King and Rook. The tide turned against me.

32.Nc7+ Kf5
White capitalized on my blunder. I should have exchanged that Knight a long time ago.

33.Nxa8 Rxa8
Exchange of pieces

34.Ra1 Re8
White wants to march the a pawn. I should have kept marching the g pawn.

35.a4 e4
Why would I march the e pawn?

36.a5 e3
Keep marching.

37.a6 Na7
I am losing the race.

38.axb7 e2 39.Rfb1 Nc6 40.Ra8 Rxa8 41.bxa8=Q
I accepted my fate and resigned after White got a new Queen.

Project Management Fundamentals Course

I challenged the PMP exam without any project management training; I just studied the exam questions like I would for any other public exam. When I got my PMP credential, I joked that I don't have the practical knowledge of a good project manager, but I sure can talk like one.

After many painful project schedule slips, my company finally realized the importance of project management and suddenly jumped on the project management bandwagon. Everyone who has to lead something in the next project was sent to a two-day project management training off-site in a nice hotel. Breakfast and lunch were included, but too bad we had to pay for our own parking.

The training was really useful, even to a certified PMP like myself. It was not sitting there all day long flipping through boring slides; we learned project management through a series of exercises designed to get us familiar with project management concepts and introduce us to some best practices. We were divided into 4 groups and told to come up with a mock project to work on. The other groups simply used their existing projects for the exercise. My group was more creative: we came up with a fake project, trying to get a better replacement for a tool everyone hates. It turns out a fake project works better than a real one. It frees us from the distraction of personal bias and irrelevant technical details, so we could focus on the project management process without worrying about the work piled ahead, and have some fun.

Everyone got a certificate after completing the two-day training, but I am the only person in the class who cares about that piece of paper. I can use the training toward the continuing education requirement for my PMP credential renewal: I get 16 PDU credits and the company pays for them. This training spent most of its time on the planning phase. I wish the company would give us another training for the tracking and monitoring phase. Not only do we have problems planning the project right, we also have problems keeping track of our progress throughout the project. I could also earn more PDU credits and have more free lunches!

DAC Technical Review (Day 4,5)

The exhibition floor was closed on days 4 and 5. On day 4, I attended the user track presentations on verification and a technical session on what input language is best for HLS. On day 5, I attended a workshop on software engineering using Agile software development techniques.

User track presentation on verification

The presentation Migrating HW/SW co-verification platforms from simulator to emulators outlined a few challenges in the flow: 1. compiling ASIC/IP vendor simulation models for the emulator; 2. generating the primary clock in emulation; 3. choosing between transaction-based VIP and emulator-specific transactor IP.

The presentation A Methodology for automatic generation of register bank RTL, related verification environment and firmware headers outlined a flow similar to our RDA flow. The difference is that they support IP-XACT for register definition and use tcl/Java to generate the files. The register XML files are translated into Java classes, then the registers are read from the Java classes to generate the register RTL, the vr_ad register files and the firmware header files. Their flow does not support auto-generation of the backdoor path in vr_ad; neither does our RDA flow.
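
To make the flow concrete, here is a toy sketch of the same idea in Python, with a made-up XML schema much simpler than IP-XACT; their actual flow goes through Java classes and also emits the register RTL and vr_ad files from the same tree.

    import xml.etree.ElementTree as ET

    XML = """
    <block name="dma" base="0x4000">
      <register name="CTRL"   offset="0x00"/>
      <register name="STATUS" offset="0x04"/>
    </block>
    """

    def firmware_header(xml_text):
        """Walk the register tree once and emit one backend, a C header."""
        block = ET.fromstring(xml_text)
        base = int(block.get("base"), 16)
        lines = [f"/* auto-generated from block '{block.get('name')}' */"]
        for reg in block.findall("register"):
            addr = base + int(reg.get("offset"), 16)
            lines.append(f"#define {block.get('name').upper()}_"
                         f"{reg.get('name')} 0x{addr:08X}")
        return "\n".join(lines)

    print(firmware_header(XML))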

In the presentation Utilizing assertion synthesis to achieve an automated assertion-based verification methodology for complex graphics chip designs, nVidia demonstrated the use of the Nextop tool to generate properties representing design functionality. The flow is pretty much the same as what's outlined at Nextop's booth. The presentation introduced a few new concepts. First is the notion of assertion density, which measures the number of assertion properties required to define the functionality of the design. Then there is the difference between synthesized and synthesizable properties: the former refers to properties auto-generated using Nextop's flow, the latter to assertions that are able to run inside the emulation box. However, the specification mining is only as good as the simulation traces fed into the tool.

In the presentation A smart debugging strategy for billion-gate SOCs, Samsung presented a solution to a common problem we have in verification. When a simulation fails, we need the waveform to debug. On one hand, it takes time to rerun the simulation and dump all the waveforms. On the other hand, dumping all the waveforms in every simulation run takes up disk space and slows the simulation down. One approach is to save checkpoints during the simulation, then rerun the simulation and dump the waveform from the checkpoint closest to where the simulation fails. We attempted to implement a home-grown solution using ncsim's native save/reload function, but save/reload has been very buggy and very inefficient in terms of snapshot size. The presentation introduced a tool from System Centroid called SSi-1 to automate the flow. It is worth evaluating SSi-1 to see how well it solves the problem of dumping waveforms in re-simulation. The only concern is that System Centroid is a Korean company and most information on its website is written in Korean.
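
The core of the idea fits in a few lines. This sketch assumes a hypothetical run_sim wrapper with -restore/-dump_from/-stop_at options and snapshots saved at a fixed interval; the real flow would drive the actual simulator commands instead.

    import subprocess

    SNAPSHOT_INTERVAL = 1_000_000   # ns between saved checkpoints (assumed)

    def closest_checkpoint(failure_time):
        """Latest checkpoint at or before the failure time."""
        return (failure_time // SNAPSHOT_INTERVAL) * SNAPSHOT_INTERVAL

    def replay_with_waves(failure_time):
        start = closest_checkpoint(failure_time)
        # Re-run only the window [start, failure_time], dumping waves this time.
        subprocess.run(["run_sim",                       # hypothetical wrapper
                        "-restore", f"snap_{start}.db",  # assumed snapshot name
                        "-dump_from", str(start),
                        "-stop_at", str(failure_time)],
                       check=True)

    replay_with_waves(3_456_789)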

In the presentation Bug hunting methodology using semi-formal verification techniques, ST Microelectronics introduced a way to combine formal verification with simulation. The idea is to invoke the formal engine at pre-defined search points during the simulation to limit the scope of the formal search space. The formal engine can be triggered when a certain RTL condition is met, on an interval, on an FSM state or on coverage.

What Input Language is the best choice for High-Level Synthesis (HLS)?
This was the most heated debate session at DAC. The panel invited speakers from Cadence, Mentor, Synfora, Bluespec, Forte and AutoESL for a showdown on their HLS methodologies. In this three-way debate, the languages of choice are C/C++, SystemC and SystemVerilog. All the speakers are biased one way or another because they represent companies that have invested millions of dollars in a particular language, so each naturally advocated that his language of choice is better than the others.

The benefit of using SystemVerilog over C++ or SystemC is that SV allows the designer to specify the hardware architecture. The weakness of SystemC and C++ is that they follow a sequential model and lack ways to specify concurrency. The architecture decision cannot be made by the synthesis tool, since it is the first-order optimization; a C++ or SystemC HLS tool has to use compiler directives to specify the hardware architecture.

The benefit of C/C++ is that it is the only language used by both HW and SW developers. Algorithms are modeled in C/C++, which makes C/C++ the most natural input to an HLS tool. The modeler or SW developer does not need to learn a new language, and there is no need to translate the C/C++ code into another language. Using C/C++ can postpone defining the architecture by separating the layers of abstraction, or even postpone the decision on the HW/SW boundary.

SystemC is halfway between C/C++ and SystemVerilog. Its advocates think it has the best of both worlds, but others think it got the worst of both worlds. It provides limited language constructs to define timing-related information and concurrency. It can define a more accurate hardware architecture than C/C++, but it also carries the burden of a 40-year-old programming language that was not designed to describe hardware implementation in the first place. However, SystemC is supported by Cadence, Mentor, Forte and NEC CyberWorkBench, the four biggest HLS tool vendors.

Agile Programming Techniques
I signed up for a full-day workshop, How to Write Better Software, on day 5. The workshop was conducted by IBM internal software training consultants. IBM is huge on agile software development. Agile projects focus on four elements: stable time-boxed short iterations, stakeholder feedback, self-directed teams and sustainable pace. The workshop introduced two agile methodologies, eXtreme Programming (XP) and Scrum.

In XP, there are 12 programming practices; the instructor did not go over all of them in the workshop. The major practices mentioned are: 1. lean software development, 2. test-driven development (TDD), 3. automation, 4. continuous integration/build and 5. pair programming. Lean software development applies value stream mapping to eliminate waste. TDD focuses on the ideas of unit testing and refactoring, as in the toy example below.
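
As a toy illustration of the TDD loop (my own example, not from the workshop): write the failing tests first, then the simplest code that passes them, then refactor with the tests as a safety net.

    import unittest

    def parity(word):
        """Even-parity bit of a 32-bit word, written to satisfy the tests."""
        return bin(word & 0xFFFFFFFF).count("1") % 2

    class TestParity(unittest.TestCase):
        def test_zero_has_even_parity(self):
            self.assertEqual(parity(0x0), 0)

        def test_single_bit_has_odd_parity(self):
            self.assertEqual(parity(0x8000), 1)

    if __name__ == "__main__":
        unittest.main()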

In Scrum, the project is divided into 2-4 week sprints. At the beginning of a sprint, there is a sprint planning meeting: the product owner determines the priority of all the user stories in the product backlog, then the scrum team picks the sprint backlog and commits to the sprint goal. The scrum master removes roadblocks for the scrum team. A user story describes functionality that will be useful to a stakeholder of the system. Within a sprint, the team should get everything done (coded, tested, documented) for the picked user stories. The scrum team conducts a short 15-minute daily scrum meeting to report the progress since yesterday, the plan for tomorrow and any roadblocks that need to be resolved. At the end of the sprint, there is a sprint review meeting and a demo of the sprint goal. Unfinished user stories are put back into the product backlog and their priorities re-evaluated.

Verification is a huge software project in its own right, as we already write more lines of code than the RTL design. I think applying Agile programming techniques will help us improve the quality of our work. The workshop is just an introduction to Agile; it outlines what Agile is and its potential benefits, but it leaves out the details of the know-how. It would be nice to learn more about how to apply Agile in a verification setting, as our work is not exactly the same as software development projects at IBM. Moreover, knowing the principles of Agile is one thing; avoiding pitfalls during execution is another. There are many real-life problems that need to be sorted out to make an Agile project successful. The workshop did not talk about how to estimate a schedule with Agile given that planning is only done within each sprint, how to manage people within an Agile team, how to deal with free riders, how to deal with differences in skill levels, or how to deal with tasks that no one wants to work on.

Given that the workshop is a 3-day IBM internal training squeezed into 1 day, it is understandable that a lot of information was left out. However, I left the workshop unsatisfied; I expected to learn more about Agile from it.

DAC Technical Review (Day 3)

On the 3rd day of DAC, I went to the user track presentations on formal verification and checked out the booths of Onespin, Jasper, SpringSoft, Tuscany, AMIQ, Starnet, Forte Design Systems and CyberWorkBench.

User track presentation on formal verification

The user track presentations are where users of EDA tools present their work on how to make the best and most creative use of them. There were about 40 presentations today, and about 10 of them were related to verification. I read the presentation posters and then talked to the presenters to exchange ideas. Here are a few pointers I picked up from the presentations:

  • Use a formal verification tool to aid the closing of code coverage. For a line of code that is not yet covered, we can write a property statement for that line, feed it to the formal engine and ask it to come up with an input sequence that will trigger the assertion. The formal engine will either generate an input sequence or prove the line is unreachable (see the sketch after this list).
  • Sun/Oracle has the idea of running the properties in simulation to keep formal and simulation in sync. The trick is to have some “useless” signals in the DUT to qualify the assertion checks, to avoid having tons of false negatives when the DUT is in an invalid state.
  • One presentation presented the result that using formal verification early in the development cycle catches more bugs in FV. Duh!
  • This is a good one. In formal verification, there are two types of properties: abstract properties, which are safe but incomplete, and constraint properties, which are unsafe but complete. The choice of property type is a trade-off between finding counterexamples and getting a full proof of the design.
  • An exhaustive proof is difficult; it is more practical to limit the proof to some reasonable depth.
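
Here is a minimal sketch of the coverage-closure trick from the first pointer. The report format and property template are my own assumptions; the output would be fed to the formal engine, which either finds a trace for each property or proves the line dead.

    def cover_properties(uncovered):
        """uncovered: (line_number, enabling_condition) pairs parsed from
        a code coverage report; emits one SVA cover property per line."""
        props = []
        for line_no, cond in uncovered:
            props.append(f"cov_line_{line_no}: cover property "
                         f"(@(posedge clk) {cond});")
        return "\n".join(props)

    # The formal engine then generates an input sequence reaching each
    # property, or proves the line unreachable (dead code).
    print(cover_properties([(42, "state == IDLE && req"), (57, "fifo_full")]))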

Onespin

This company builds formal verification tools. Their basic product is similar to IFV, except it has a better, easier-to-use GUI that allows users a lot more interaction and visualization. Their flagship product is operational formal ABV: instead of defining basic cause-reaction properties in ABV, the tool provides an assertion library that lets the user define operational properties. The user then iterates, with the aid of the tool, to get full coverage of the formal space. The idea is to generate a set of properties that completely defines the RTL. I think the tool will work as advertised, because in the end it is a human who has to enter the missing properties. However, I wonder what the use is of getting a complete ABV definition of the RTL. It seems the idea is totally upside down. I guess the point is that instead of auditing the implementation of the RTL code, the auditor should audit the complete properties of the code.

One thing I don't like about Onespin is that they have way too many products and it's really confusing. The flagship product has all the features, and the rest are just crippled versions with fancy marketing terms that simply confuse users. For example, the difference between two of the products is only the ability to load external property files versus requiring the properties to be in the same file as the RTL code. I don't really like this kind of marketing ploy that exists simply to milk more money from the customer.

Jasper

Jasper is THE formal verification tool vendor. I spent almost 2 hours (and had my lunch) in their booth, walking through all the demos and trying out almost all the features in their product. This is the tool of choice in many formal verification presentations at DAC. The tool is much more user friendly and powerful than IFV; IFV seems primitive compared to Jasper. Jasper also has property libraries for different memory and FIFO models instead of just black-boxing them out. It supports tunneling, so the user can steer the formal engine. It comes with a lot more formal engines than IFV and gives very clever ways to get a counterexample or a proof. Active Design uses the same formal engine but for a different application: if we have poorly documented legacy RTL, a new designer can use Active Design to generate properties of the RTL and understand exactly what the RTL can and cannot do. Another benefit is that when we ask the designers whether the RTL can do such and such, we no longer have to take their word for it; the designer can generate a recipe that proves the answer to our question. Jasper has an account manager for PMC and she knows Bob and Alan. I think we really should try Jasper in Digi and get Bob to set up the tool for us.

SpringSoft

SpringSoft acquired Novas, the maker of Debussy. Debussy and Verdi have not changed much, other than adding support for power-aware design and SystemVerilog. Siloti is a very neat add-on to Debussy for waveform compression. The idea is that in simulation we really only need the input, output and flip-flop values in the waveform database; the waveform viewer can calculate the values of the combinational signals on the fly. The waveform database is only 25% of the original size. Certitude is a tool to test the testbench. It corrupts the DUT and checks whether the testbench fails. If the testbench still passes when there is signal corruption in the DUT, there must be something wrong with the testbench.
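
A toy model of the essential-signal idea as I understand it (my own illustration, not the actual tool): only inputs and flop outputs are stored, and any combinational value is recomputed on demand from them.

    # Assumed toy design: combinational signals as functions of stored ones.
    COMB = {
        "sum":  lambda v: v["a"] ^ v["b"] ^ v["cin"],
        "cout": lambda v: (v["a"] & v["b"]) | (v["cin"] & (v["a"] ^ v["b"])),
    }

    def value_at(signal, stored_cycle):
        """stored_cycle holds essential (input/flop) values for one cycle."""
        if signal in stored_cycle:          # essential: read from the database
            return stored_cycle[signal]
        return COMB[signal](stored_cycle)   # derived: recompute on the fly

    cycle = {"a": 1, "b": 1, "cin": 0}      # only essentials live on disk
    print(value_at("cout", cycle))          # combinational value recomputed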

Tuscany

This company has only 1 product: a web GUI to display timing reports. I like the GUI, and even though I don't know much about timing reports, I can see it solves the problem of keeping track of so many of them.

AMIQ

Another small company with only 1 product. They have a DVT IDE for Specman and SystemVerilog based on Eclipse, the open source Java IDE. The IDE works like Visual Studio: it has an editor, data browser, structure tree, keyword auto-complete, quick linting and hooks to launch Specrun, all under the same GUI. It is a lot more user friendly compared to editing e code with vi or Emacs. They are working on a debug interface hooking into the simulation for the next release; it will work like gdb. I highly recommend purchasing a few licenses (1k per seat, but I am sure Bob can negotiate a volume discount if we buy more) and giving them out to the team to evaluate the product. I think we will see a productivity increase with the DVT IDE instantly.

Starnet

They are selling a VNC replacement that they claim is much faster than VNC. I know CAD is evaluating some fast VNC-like software right now. Maybe we should get CAD to try out this product as well. We all know how painful it is to view waveforms in Bangalore via VNC.

Forte Design Systems and CyberWorkBench

Both companies sell high-level synthesis (HLS) tools that compile SystemC into RTL code. It looks like HLS is finally here. I don't have enough domain knowledge to evaluate the tools. All I can tell is that they have nice GUIs and the generated RTL code is not very readable. I asked whether there are any limitations on the SystemC code and about the efficiency of the generated RTL, but I only got the typical marketing answers. Too bad that both tools only work with SystemC; it would be nice if there were HLS for behavioral SystemVerilog.

DAC Technical Review (Day 2)

On the 2nd day of DAC, I attended a technology session, Bridging pre-silicon verification and post-silicon validation; a user track presentation, An Open Database for the Open Verification Methodology; and the Synopsys VCS demo and verification luncheon. I also visited the booths of the following companies: Realintent, Aldec, IBM, Nextop, eVe and ExpertIO.

Bridging pre-silicon verification and post-silicon validation

This technology session had a panel discussion on closing the gap between verification and validation. Verification and validation have two very different cultures, arising from the limitations of each side's work, and the industry has the same problem we are facing at PMC: trade-offs between control vs. speed and cost vs. effort in testing. Since the test environments on the two sides are incompatible, it is a challenge to duplicate a problem from one side on the other. The latest technology to close the loop between validation and verification is Design for Debug (DFD). The idea of DFD is very simple: insert built-in waveform probes and signal-poke logic into the silicon, so we get more control and observability in validation. DFD is very new, but they are aiming to standardize and automate the flow just like DFT. Simulators will have hooks into the DFD interface to recreate bugs found in validation, or to generate vectors and dump them to the device. It is interesting to see the statistic that the industry's Rev A success rate dropped from 29% in 2002 to 28% in 2007, with more revisions needed on average. DFD can speed up the validation process and give better turnaround between validation and verification. ClearBlue is a DFD tool, and they claim the overhead is a 1-2% area increase in the silicon; however, the Intel panel guest cited a number as high as 10% for their own in-house DFD solution. Users can trade off between increasing the pin count and adding more probe memory to cope with the bandwidth requirement on the DFD interface.

An Open Database for the Open Verification Methodology

This presentation came out of a university research project. It's like what Peter had proposed a few years ago: hook up a C++ SQL API to Specman and save all the coverage, or even packet data, to a MySQL database. It is a neat proof-of-concept exercise, but vManager already addresses this question.
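
A toy version of the idea, using Python and the standard library's sqlite3 as a stand-in for the paper's C++/MySQL stack; the schema is my own assumption.

    import sqlite3

    conn = sqlite3.connect("coverage.db")
    conn.execute("""CREATE TABLE IF NOT EXISTS coverage (
        test TEXT, cover_group TEXT, bucket TEXT, hits INTEGER)""")

    def record(test, cover_group, bucket, hits):
        """One row per coverage bucket per test, written as the test ends."""
        conn.execute("INSERT INTO coverage VALUES (?, ?, ?, ?)",
                     (test, cover_group, bucket, hits))
        conn.commit()

    record("smoke_001", "pkt_len_cg", "len_64", 17)

    # Once the data is in SQL, closure queries become one-liners,
    # e.g. which buckets have never been hit across all tests:
    for row in conn.execute("SELECT cover_group, bucket FROM coverage "
                            "GROUP BY cover_group, bucket HAVING SUM(hits) = 0"):
        print("hole:", row)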

Synopsys VCS demo and verification luncheon

The VCS demo is not really a demo, just marketing slides. However, I chatted with the VCS product manager for half an hour after the demo and managed to hear a few interesting things about VCS.

  1. Cadence has EP and VM for test planning; Synopsys just uses an Excel spreadsheet template. The spreadsheet sucks in the coverage data in XML format to report the test results.
  2. VCS multi-core is more advanced than I had expected. The user can partition the design along logical blocks (subsystems) and run each block on a different core to speed up the simulation. The Cadence multi-core solution does not partition the design; it merely moves some functions, like waveform dumping, assertion checking and running the testbench, to a different core. The catch is that each core checks out a separate simulator license.
  3. VCS has a new constraint resolver, but they use a different approach than Igen: they don't break down the generation into ICFS. It looks like there is more than one constraint resolver algorithm out there. They claim the new constraint resolver is better than Cadence's, but they are only comparing against pgen; the VCS guy is not aware of igen.
  4. VCS finally supports uni-cov, which Cadence has supported since IES 8.2. They have a tool to merge uni-cov files in parallel, kinda like the NHL playoffs. I think we can modify our coverage merge script to merge coverage in parallel to avoid crashing (see the sketch after this list).
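
A sketch of the playoff-bracket merge from point 4: merge the files pairwise in rounds, so no single merge job has to swallow everything at once. merge_pair is a placeholder for whatever our real coverage merge script does with two files.

    from concurrent.futures import ProcessPoolExecutor

    def merge_pair(pair):
        a, b = pair
        merged = f"{a}+{b}"       # placeholder: invoke the real merge tool here
        print(f"merged {a} and {b} -> {merged}")
        return merged

    def playoff_merge(files):
        """Merge N files in ceil(log2(N)) rounds, each round in parallel."""
        while len(files) > 1:
            pairs = [files[i:i + 2] for i in range(0, len(files), 2)]
            byes = [p[0] for p in pairs if len(p) == 1]   # odd file advances
            pairs = [tuple(p) for p in pairs if len(p) == 2]
            with ProcessPoolExecutor() as pool:
                files = list(pool.map(merge_pair, pairs)) + byes
        return files[0]

    if __name__ == "__main__":
        print("final:", playoff_merge(["t1.cov", "t2.cov", "t3.cov", "t4.cov"]))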

Realintent

This company offers static verification tools that run very early in the development cycle to catch bugs before a testbench exists. I had a demo with them and was able to play around with their tools. LINT is the HDL linting tool; other than having a flimsy GUI and integration with Verdi, I don't see any advantage over HAL. IIV is a tool for implied intention verification, which analyzes the HDL code and checks it against 20 predefined checks. LINT catches syntax and combinational errors; IIV obviously catches sequential errors, like dead states in a state machine. I don't think IIV is very useful, since the user cannot define custom checks, and the built-in checks only catch careless mistakes or stupid logical errors made by co-op students. XV is their tool for 'X' propagation verification; it is still in beta. The tool reads the RTL code and generates a small Verilog testbench which pokes internal signals to 'X' and checks the propagation. It then runs that small testbench on your normal simulator to see whether any 'X' is propagated anywhere. I doubt the usefulness of this tool. Lastly, they have ABV for formal assertion checks, but they didn't have a demo set up; I suspect the tool is not even a working beta yet. I am not very impressed by Realintent. If their tools work as advertised, we will probably save a few days of debug time at the beginning, and that's it. I am always skeptical about their claims.

Aldec
They used to provide an OEM simulator to Xilinx and other FPGA vendors; now they are selling it as a standalone simulator. The simulator runs on Windows and Linux. It comes with a lint tool and supports assertion checking (but not formal analysis). This tool targets FPGA designs, since it probably won't be able to handle a 50-million-gate ASIC design. The IDE GUI is simple and pretty, but lacks the features and strength of IES.

IBM
I went to the IBM booth to find out what's new in the DOORS and vManager integration. The IBM guy brought in Michael Mussey from Cadence, who oversees the vManager project, when he walked by us. In short, the integration is not yet working. On the planning front, DOORS will generate the vPlan file from the design spec, via a RIF (requirement input format) XML file; verifiers only have to map the coverage and checkers in the vPlan. On the reporting front, Cadence is working on a script that takes the vPlan and UCM, generates a UCIF (universal coverage input file) and feeds it back to DOORS. Another potential application is using DOORS for the verification schedule: DOORS has a plugin that talks to Microsoft Project. It looks like historical data is not saved in DOORS; DOORS only reports the current view. Michael from Cadence mentioned that they are adding a MySQL backend to vManager to report historical data. I think we can look into using this new feature to replace our Excel spreadsheet. DOORS has bug tracking tool integration as well: a confirmed bug report should automatically trigger a change request in the requirement spec. We may need to upgrade our PREP system to work with DOORS.

Nextop
Nextop is very interesting. It generates assertions (PSL or SVA) automatically by monitoring your simulation. It is a unique solution to the problem of who writes the assertions. Their tool analyzes how the signals are used in the simulation and generates a list of PSL or SVA statements as the properties of the block. The designer then has to go through the list (a few hundred of them) and categorize each one: whether it should always hold true (an assertion), or whether it's only true because we haven't run enough stimulus (a coverage). The testbench then incorporates the assertions and uses them for the rest of the simulation. My only concern is that their solution seems too good to be true, and I can't tell the quality of the auto-generated assertions from the demo. I would like to evaluate this tool and see whether the generated assertions are useful or just junk. The simulation overhead is about 10-15% when the tool is turned on to collect information. Currently it only works at block level, and the biggest design they have ever tried has only 12K lines of code. The designer is the weakest link in their flow, since the designer has to check and classify each generated assertion one by one.
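
My own toy illustration of trace-based property mining in the spirit of their flow (not Nextop's actual algorithm): propose simple candidate invariants and keep only the ones no simulation cycle ever violates. Whether a survivor is a real assertion or just under-stimulated coverage is exactly the judgment call left to the designer.

    from itertools import combinations

    def mine_mutex_candidates(trace, signals):
        """trace: one {signal: 0/1} dict per cycle. Returns the signal
        pairs that were never high together in any observed cycle."""
        candidates = set(combinations(signals, 2))
        for cycle in trace:
            for a, b in list(candidates):
                if cycle[a] and cycle[b]:        # counterexample observed
                    candidates.discard((a, b))   # not an invariant
        return candidates

    trace = [
        {"grant0": 1, "grant1": 0, "busy": 1},
        {"grant0": 0, "grant1": 1, "busy": 1},
        {"grant0": 0, "grant1": 0, "busy": 0},
    ]
    for a, b in mine_mutex_candidates(trace, ["grant0", "grant1", "busy"]):
        print(f"candidate property: never ({a} && {b})")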

eVe
They make emulation boxes. They talked about testbench speed-up, so I was interested in their presentation, but it turns out they mean their box only supports synthesizable testbenches. They don't have any interface for the FPGA array to communicate with the simulator. They kept telling me that I can easily hook up the emulation box to the simulation by building custom PLI functions. Yeah, right. It looks like there are not many emulation boxes out there that support simulation acceleration; probably it is only supported by the boxes from the big 3 simulator vendors.

ExpertIO
A verification IP vendor. They have Ethernet, SAS, SATA and FC VIP. The offering looks good, but the problem is that the VIP is implemented in encrypted behavioral Verilog with a SystemVerilog wrapper to control the transactor. They refused to show me what the API of the VIP looks like unless PMC goes through the official NDA process. The only information I could get is a useless feature list of their VIP; I can't tell how easy or annoying their VIP is to use.

DAC 2010 Technical Report (Day 1)

Today is the report of my first day at DAC. I signed up for a full-day technical workshop, Choosing Advanced Verification Methodology. After the workshop ended at 3:30p, I managed to check out a few companies on the exhibition floor: Vennsa Technologies, Agnisys Inc and Veritools.

Advanced Verification Methodology

The workshop is smaller than I expected; there are only about 20 attendees. It started off with a keynote from Brian Bailey, a verification consultant, on the latest trends in verification. Assertions and ESL seem to be the themes of the day.

We finally see formal verification coming out of academic research, being put into use by the industry and developing a good use model. There are 7-8 formal tool vendors in the market right now, but looking at historical data in the EDA industry, no matter the technology, the market is only big enough for 2 to survive.

ESL is the latest buzzword. The word has many different meanings, but basically it means where software and hardware come together. For verification, ESL means we are building reusable testbenches with different abstraction layers: starting from the top with a TLM model to verify the algorithm, then pushing down to verify the architecture, and then the RTL implementation at the very bottom. TLM 2.0 is the new industry standard and has pretty much swept aside all the proprietary prototypes from different vendors. TLM 2.0 still lacks synthesis support and a hardware/software interface.

Currently, many people model ESL in SystemC, but both SystemC and SystemVerilog need a few more revisions to fully support ESL. The new ANSI concurrent C/C++ coming out this year may turn SystemC into an obsolete branch of C/C++. High-level synthesis, the C-to-RTL compiler, is almost an ESL enabler: it separates the architecture from the behavioral description. The shift from RTL coding to high-level synthesis would be as disruptive as the shift from schematic capture to RTL coding.

Constrained random generation is a challenge in ESL verification. Current tools do not understand sequential depth and can't constrain across sequences. Functional coverage is broken: it is merely isolated observations that do not necessarily reflect the verification progress. We need a different metric to provide a direct measure of verification closure.

In ESL development, management will be a new challenge. Now that we have to develop the hardware and software in the same development cycle, there will be conflicting schedules between the hardware team and the software team. Communication among the different teams and clear interface management at the partition between software and hardware implementation are the key.

In the next few years, the speaker predicts there will be more research and probably technology breakthroughs in these areas: specification capture in verification, sequential equivalence checking, intelligent testbenches, assertion synthesis and behavioral indexing.

After the keynote session came the customer panel. The panel guests were from Intel, ARM and nVidia. The ARM and nVidia guests are assertion experts; the Intel guy is more on ESL. It was a Q&A session, but nothing special; the guests only talked about very generic things. They tell us what they do, but they don't tell us how they do it.

Jasper had the next presentation, with nVidia and ARM giving customer testimony. They talked about their formal verification tool and introduced basic concepts like full proof, counterexample and partial proof. There were quite a few neat examples of formal verification, like generating an input sequence for a given output trace, answering an urgent customer question on whether something is possible in the design, verifying deadlock/livelock, and checking 'X' propagation. Both ARM and nVidia have dedicated assertion teams, and they said that is important to success in using assertions in verification.

Synopsys presented new updates to VCS simulation. They close the coverage-to-constraint loop with the Echo testbench technology. It is similar to what Cadence has, and it is limited to coverage that has a direct relationship to the constraints. VCS finally has multi-core support; I think Cadence already has it in IES 8.2. We should look into using both technologies for Digi, and work with the CAD group to set up special LSF machines reserved for multi-core jobs.

TSMC talked about its Open Innovation Platform (OIP) for ESL verification. The virtual platform enables hardware/software co-simulation. The testbench is built with a top-down approach: start with an ESL TB to verify the algorithm, then an ESL SoC TB to verify the ESL model, then add a cycle-accurate adapter to the ESL model, and finally the RTL testbench.

Mentor talked about ESL testbenches and presented a success story of TLM-to-RTL synthesis verification. They claim the high-level synthesis flow saved them lots of time and let them use the same testbench at different abstractions from top to bottom.

There was nothing new in Cadence's presentation. They just showed how vPlan fits into the ESL flow.

Vennsa Technologies
It is a small company in Toronto based on the research of a professor from U of Toronto. Their OnPoint debug tool is pretty neat. It is an add-on to the simulation that helps the designer pinpoint a bug. Once you have an assertion failure, you can fire up the OnPoint GUI. The tool back-traces the logic cone, narrows down the candidates and suggests where the bug probably is. You can also start the back-trace from a given value on an output pin at a given time. I played with their demo for almost half an hour, and it is a very handy tool if it works as advertised. The idea behind the tool is sound, and I think we should evaluate it.
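
A toy sketch of the back-tracing idea (my own illustration, not Vennsa's algorithm): starting from the failing signal, walk the fan-in graph backwards to collect the logic cone the bug must live in.

    # Assumed toy netlist: signal -> list of signals that drive it.
    FANIN = {
        "assert_fail": ["cmp_out"],
        "cmp_out": ["expected", "actual"],
        "actual": ["alu_out"],
        "alu_out": ["opa", "opb", "opcode"],
        "expected": [], "opa": [], "opb": [], "opcode": [],
    }

    def logic_cone(failing_signal):
        """Breadth-first backward traversal; returns signal -> depth."""
        cone, frontier = {failing_signal: 0}, [failing_signal]
        while frontier:
            sig = frontier.pop(0)
            for driver in FANIN[sig]:
                if driver not in cone:
                    cone[driver] = cone[sig] + 1
                    frontier.append(driver)
        return cone

    for sig, depth in sorted(logic_cone("assert_fail").items(),
                             key=lambda kv: kv[1]):
        print(f"suspect at depth {depth}: {sig}")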

Agnisys Inc
This company has two products: IVerifySpec, a web GUI replacement for vManager, and iDesignSpec, a half-baked solution similar to RDA.

IVerifySpec uses a SQL database to store the vPlan, but it does not support UCD directly; it has to translate the UCD to XML offline and import it into the database. There are a few nice features in the GUI, like a heat map, a traceability matrix, and some charts and graphs that look like Google Analytics. However, their tool is very immature overall: it does not support multi-level hierarchy in the requirement specification, there is no revision control, and data entry via the web interface is very tedious and user-unfriendly. I should simply ask Cadence to copy those nice reporting features into vManager.

iDesignSpec sounds good on paper, but the implementation is awkward. You enter the register specification in Word using some funny plug-in, and the plug-in then generates PDF, HTML, XML, VHDL, OVE and C++ files. Somehow it is the exact opposite of our RDA flow, where we enter our register description in XML and generate one thing at a time using scripts. The format of their Word plug-in is very ugly, and the code and PDF files generated by the plug-in are very primitive. I would say even our old ECBI generator is better than this tool. The only useful thing I learned from this presentation is that there are industry standards for register description: SystemRDL and IP-XACT. Maybe our RDA tool should support the industry standards as well.

Veritools

Their flagship product is Veritools Designer. It's basically a Debussy/Verdi clone: it can view schematics and waveforms and do source code debugging. They claim their tool is very fast and costs only a quarter of the price. I am always skeptical about those claims, and I don't like that they use their own waveform database format; it means the simulator has to dump another waveform through their PLI. The GUI is fast, but the design in the demo is not very big. The GUI is quite primitive compared to Simvision, and they can't beat the price of Simvision, which comes free with IES. I do agree Simvision is a bit slow, but I think investing in faster computers with bigger RAM can solve this problem. They have an add-on tool called Veritool Verifyer. This tool is kinda dumb: if there is an assertion failure, it reads in the waveform and lets you test changes to the assertion code without invoking the simulator. I don't think it is very useful. When an assertion fails, how often is it due to an RTL bug, and how often is it just a faulty assertion?

DAC 2010 – First Impression

This is the first time I am going to the Design Automation Conference (DAC), or any industry conference, and it is really an eye opener for me. When I first started working at PMC during the dot-com bubble days, the company promised to send us to a conference every two years. Unfortunately, before my turn to go, the bubble burst and the company was in survival mode for almost a decade. Finally, we are back on the growth track, and the company has money to invest in developing its employees and a budget to send us to industry conferences.

There are a few reasons I picked DAC. First, it is the biggest conference in the EDA industry, with 3 days of exhibition and 2 more days of tutorials and workshops. You can see everything under the same roof: all the tool vendors, 3rd party IP providers, the big names and new startups that you have never heard of. Second, there are many workshop and tutorial sessions dedicated to verification, so I can learn what's new out there and what other companies are doing. In fact, there are so many interesting sessions that I could not see them all. Last, the conference is in Anaheim, right next to Disneyland. I am flying Pat down here to spend a weekend after the conference as a mini-vacation.

The latest technology presented by the exhibitors is amazing, but I am equally amazed by the registration system. After you have registered online, you pick up your badge at the registration desk. The process is very smooth: just scan the bar code and your card is printed right in front of you. The card has a built-in RF chip, so you no longer have to hand out business cards to exhibitors; they have a cell-phone-like device that scans your card and prints out your information automatically.

There are lots of freebies at DAC. It's only the first day of the exhibition, and having covered only 1/3 of the floor I already got the following: 1 backpack, 3 T-shirts, 4 balls, 2 highlighter pens, a measuring tape, a battery-powered cell phone charger, a pair of waistbands and a book, “The ten commandments for effective standards”. Other than freebies, there is free beer. Last night we had the kick-off reception sponsored by Intel, and tonight I went to the Denali party in Downtown Disney. Although the beer is bottomless, no one abused the kind offer and got drunk. It is an industry conference after all; you don't want to embarrass yourself in front of potential clients and employers. The industry is a small world.

I am looking forward to the rest of the week. I am going to write about the new technology I learn at DAC every day. Stay tuned.

Sears Stars on Ice

I have watched figure skating on TV during the Olympic games, but this is the first time I am watching live figure skating. Olympic figure skating tickets are too expensive; Sears Stars on Ice features Olympic figure skaters, like Joannie Rochette, and it only costs $45. The skaters seem more relaxed in the show than in competition. I suspect they are saving their best for the competition and I only get to see the safe moves.

I never watched figure skating seriously; I have only watched the highlights on TV. Watching a live show confirmed my belief that figure skating is boring. The first few skaters are fun to watch, but soon the novelty wears off. The movements of all the skaters are more or less the same: just jumps, turns, spins and throws in different sequences. Watch one and you have pretty much watched them all. The most noticeable difference among the skaters is that some of them screwed up the landing and lost their balance. Figure skating is so boring that I slept through the second half of the show.

Sports with no objective measure of winners and losers, like figure skating, are usually boring. They rely on subjective measurements, like the aesthetic appeal to the judges, to determine the outcome, so you don't get any excitement out of the competition. In sports with an objective measure, it is easy to understand the athletes, as their actions are directly related to winning the game. In sports without one, the athletes' actions have no obvious purpose; the sport loses its meaning, and thus it is boring.

The 5th Bangalore Trip

This is my 5th trip to Bangalore. Instead of staying in the guest house, this time I am staying in a new hotel in Electronics City, which is only a 5-minute walk from the office. When I first went to Bangalore 2 years ago, there wasn't any business hotel close to the office, but now at least 4 new hotels have opened. My hotel is acceptable, but there are still many little things that drive me nuts in the usual Indian way. For example, the room has a do-not-disturb light outside, but somehow room service always ignores the sign. They will ring your doorbell at 7:30a to give you your morning newspaper, then ring it again at 8:00a to ask whether you have laundry for the day. Like a typical business hotel, the room comes with a comfortable bed and a large working desk, but there is no light over the desk; I have to move a pole light from the other side of the room to light up the desk area. For whatever stupid reason, the hotel decided to turn on the boiler only from 6a-10a and 6p-10p; if you want to take a shower in the afternoon or at night, sorry, you only have cold water. Given that most flights arrive in India after midnight and almost every traveler wants to take a hot shower first thing after checking into the hotel, it leaves a very bad first impression. I swear I will never go back to this hotel again.

Bangalore has changed a lot since my first visit 2 years ago. The toll bridge linking Electronics City to Bangalore is finally open. Now it only takes 15 minutes from the office to Forum Mall, compared to over 45 minutes in the past. Taking the bridge costs Rs45 a day; saving 30 minutes for $1 is definitely worth the price. However, I suspect the bridge won't last very long. They forgot to build a proper rain drainage system for it: when it rains, the center lanes are flooded with water as deep as a foot. The bridge has only been open for its first monsoon season, and the pillars already show signs of rusting and water damage. Our company even issued a warning to employees not to use the bridge during heavy rain.

It looks like the outsourcing business in Bangalore is doing well. I have met a lot more foreigners this time compared to 2 years ago. There are quite a number of Germans staying in my hotel, working for different companies. We love to exchange India stories, whining about how annoying some of the Indians are, and we all agree it sucks living in Bangalore. Many new non-Indian restaurants have opened in the last year. Japanese food is in fashion in Bangalore, with the number of Japanese restaurants tripled, and some 5-star hotels have started offering sushi at their Sunday lunch buffets. I am very skeptical about the idea of eating raw fish in India; I don't want to get food poisoning here. The number of western restaurants is booming, but the quality of service is deteriorating. In the past the service was supreme: there was always a waiter available when you needed one, and they called the customers Sir or Madam. Now, maybe because the restaurants are spoiled by western customers, the service is no different from what we get back home. Food and groceries are still cheap in Bangalore; although there is an annual 20% pay increase in the high-tech industry, the rest of the people are still very poor.

I see more traffic lights and fewer cows on the road. The city seems less chaotic, though I wonder if it is just my illusion because I am getting used to the chaos. I hired a cab via the hotel to take me to a nice restaurant every night, as usual. Somehow the drivers are clueless about the addresses, and they don't know how to read Google Maps. To make sure I could get to dinner on time, I jumped into the front seat and navigated for the driver. Since there are no street signs in Bangalore, telling the driver which street to turn on is useless. Google Maps comes up with a solution: in addition to the street name, it also lists any landmark at the street corner. The driving instructions in Google Maps tell you to turn right at the KFC or the BP gas station, pretty neat. I can't wait to see Google Street View come to Bangalore.