( DAC 03 Item 11 ) ----------------------------------------------- [ 01/20/04 ]
Subject: 0-in, Jasper, Real Intent
CROWDED HOUSE: Oddly enough this year the functional verification market
still hasn't seemed to thin out much. 0-in is the grand pappy leader here,
forging the assertion-based approach with everyone else doing a "me, too"
behind them. All of these tools pretty much work the same way. Feed
it your design plus your assertions of how your design is supposed to work.
After 10 trillion CPU cycles, the tool will find your broken states.
That is, these tools typically run on top of your NC or VCS licenses.
We just purchased 0-in. While we have access to all of their tools we
have only integrated 0-in Check (simulation assertions) into our design
flow. We use both VCS and NC-Verilog in our flow so we needed a tool
that would play nice with both simulators. That left both of the
standards efforts unavailable to us. 0-in had a deeper and more mature
library than either Verplex or Real Intent at the time of our
evaluation.
We are currently experiencing about a 10:1 ratio of False Firings to
Real Firings. That may be due to inexperience with the library. Only
time will tell.
The biggest current issue is performance. We are getting killed in the
elaboration phase making the overhead on small tests (under 30 min at
system) well over 100%. It is not nearly so bad (about 33%) on our
average tests (~2 hours). To be fair, this is without a lot of
intervention on 0-in's part because we have not prioritized fixing it
yet, so there have only been a few emails over this issue so far. On
the performance front, we also know of problems on our end. 0-in did not
have a good process to give us for transitioning from Block Test to Chip
Test, so many of the blocks have checkers on both inputs and outputs,
which means in some places we have twice as many as we need. They have
been very responsive to any bugs we have found (mainly in their parser).
We
still feel that 0-in will pay for itself, both in money and trouble, but
the jury is still out.
- Steven Jorgensen of Hewlett-Packard
I was impressed with what 0-in claims to be able to do, and we feel that
assertion-based verification is one of the most important techniques of
this year.
- [ An Anon Engineer ]
We found that 0-in is by no means a push-button tool and that it requires
significant investment of time and resources. But, 0-in Search did prove
its value by finding a bug that the designers thought would have been
"hard to find" with their extensive test benches.
- Kripa Venkatachalam of LSI Logic
0-in has a new way of measuring coverage of corner cases. I was
apparently behind the times in my last DAC report talking about "pair
arcs". They found that looking at the interactions of various state
machines sometimes created a lot of extra testing without finding any
extra bugs. They now have the user insert assertions, and have
introduced a new metric (the minimum number of cycles to get from
each state element to an assertion) to measure how well the assertions
cover the design.
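That coverage metric (the minimum number of cycles from each state element to an assertion) is essentially a shortest-path question over the design's fanout graph. Below is a toy Python sketch of the idea; the graph, the signal names, and the one-hop-per-cycle simplification are all my assumptions, not 0-in's actual algorithm:

```python
from collections import deque

def min_cycles_to_assertion(fanout, state_elements, assertions):
    """For each state element, BFS the fanout graph to find the minimum
    number of hops (a stand-in for cycles) to any node an assertion
    observes.  None means unreachable, i.e. the assertion set does not
    cover that state element at all."""
    result = {}
    for src in state_elements:
        seen = {src}
        queue = deque([(src, 0)])
        dist = None
        while queue:
            node, d = queue.popleft()
            if node in assertions:
                dist = d
                break
            for nxt in fanout.get(node, ()):
                if nxt not in seen:
                    seen.add(nxt)
                    queue.append((nxt, d + 1))
        result[src] = dist
    return result

# Hypothetical design: two flops feed a counter that an assertion
# watches; a third flop fans out to logic no assertion can see.
fanout = {"flop_a": ["counter"], "flop_b": ["counter"],
          "counter": ["assert_no_overflow"], "flop_c": ["dead_logic"]}
coverage = min_cycles_to_assertion(fanout,
                                   ["flop_a", "flop_b", "flop_c"],
                                   {"assert_no_overflow"})
# flop_a and flop_b are 2 hops from an assertion; flop_c is uncovered.
```

A low number means the assertion sits close to the state it guards; a None is a coverage hole worth a new assertion.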
0-in says their CheckerWare library, with more than 70 elements, is
easier to use than PSL/Sugar because it's basically verification IP for
larger elements like a FIFO. They also sell protocol monitors for PCI,
etc., but I think they cost extra. Their Search program takes a user-
supplied simulation and branches out some user-specified number of
cycles (maybe 10-15) to try to exercise assertions. Their Confirm
program is a traditional model checker and can go a couple of hundred
cycles from the initial state.
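To make the "verification IP for larger elements" idea concrete, here is a toy FIFO checker written as a Python scoreboard rather than as a CheckerWare element. The class and its error strings are invented for illustration; a real 0-in checker would be a library element instantiated alongside the Verilog design:

```python
class FifoChecker:
    """Toy scoreboard-style FIFO monitor: flags overflow, underflow,
    and data corruption (a pop that returns something other than the
    oldest pushed value)."""
    def __init__(self, depth):
        self.depth = depth
        self.model = []        # reference contents of the FIFO
        self.errors = []

    def push(self, value):
        if len(self.model) == self.depth:
            self.errors.append("overflow")
        else:
            self.model.append(value)

    def pop(self, observed):
        if not self.model:
            self.errors.append("underflow")
            return
        expected = self.model.pop(0)
        if observed != expected:
            self.errors.append(f"data: expected {expected}, got {observed}")

chk = FifoChecker(depth=2)
chk.push(10); chk.push(20); chk.push(30)   # third push overflows
chk.pop(10); chk.pop(99)                   # second pop returns bad data
# chk.errors is now ["overflow", "data: expected 20, got 99"]
```

The point of packaged checkers like this is that one instantiation buys you all three classes of check, instead of writing each low-level assertion by hand.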
Their tools use simulation to confirm any bugs they find, and use all
major simulators as well as Axis, and tie into Debussy and Verdi. They
also output their coverage for Verisity to maintain overall coverage
numbers. It almost sounds like the old guard is teaming up against
the PSL upstarts.
- John Weiland of Intrinsix
I spent a lot of time looking at 0-in (Checkerware) since sister
divisions are using them. But I have very strong reservations about
going to a non-open standard assertion system. From what I gathered our
designs (very computationally-intensive circuits) are not well suited for
static formal methods, so I didn't spend too much time looking at
Jasper/Real Intent, etc.
I am interested in "design checkers" like 0-in Checklist, BlackTie PDC,
and SpyGlass that do sanity and pre-synthesis checks on your design.
- Tomoo Taguchi of Hewlett Packard
We tried 0-in this year, but now we plan to use Cadence's PSL feature to
do static simulation with assertions, like 0-in Search.
- Ji Li of Via Tech
Also, I took 0-in's Checklist for a spin and have some opinions to
share. It had a problem with the same large case statements that
SpyGlass had problems with. In debugging the issue with Atrenta, I
found that DC can compile the case statement in seconds vs. tens of
minutes for both these tools -- I don't understand what the problem is.
Also, after spending a day with the tool, I haven't been able to run
the entire design through -- it keeps running out of memory.
Checklist uses formal methods to understand the design, so it shouldn't
produce false positives. But in at least two cases, it produced false
positives because it was "sloppy" and didn't look into details to
figure out that they were false positives. One was an arithmetic
overflow case where the designer had prepended a 1'b0 to the operand to
make sure that an overflow wouldn't occur, but Checklist just counted
bits on either side of the equals sign and produced a warning.
Another was it warned about a clock domain crossing with
synchronization that originated from a bussed register. Only one bit
of the register crosses to the other domain, but Checklist assumed that
the entire register was being synchronized.
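The overflow case can be decided with simple magnitude arithmetic rather than bare bit counting, which is presumably the reasoning Checklist skipped. This Python helper is my own illustration of that reasoning, not anything from the tool:

```python
def add_can_overflow(lhs_width, operand_widths):
    """Exact check for an unsigned add: the worst-case sum of the
    operands is sum(2**w - 1) over their value widths; the add can
    overflow iff that maximum needs more than lhs_width bits."""
    max_sum = sum((1 << w) - 1 for w in operand_widths)
    return max_sum >= (1 << lhs_width)

# A raw 8-bit + 8-bit add assigned to an 8-bit result can overflow.
assert add_can_overflow(8, [8, 8]) is True

# The designer's guard ({1'b0, a} + {1'b0, b} into a 9-bit result):
# prepending a zero widens the operands without raising their maximum
# values, and 255 + 255 = 510 fits in 9 bits, so overflow is impossible.
assert add_can_overflow(9, [8, 8]) is False
```

A checker that only counts declared widths sees 9-bit operands feeding a 9-bit result and warns anyway; tracking the operands' actual maximum values proves the guarded add safe.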
So my complaints about Checklist are that it isn't checking enough
before issuing a warning, and that it can't seem to run our entire
design (only 2M gates -- but SpyGlass had problems with it too, so it
might be something in our design that causes some problem with this
class of tools).
- [ An Anon Engineer ]
We purchased 0-in because they have a very robust library of checkers and
they work with both VCS and NC-Verilog, which no other library did except
OVL, which was inferior. We are currently only using their Check tool but
will be evaluating others as time goes on.
The Jasper Gold tool looks interesting but is very immature. I hammered
them because they had spent time developing a waveform viewer that did
exactly what Novas Debussy already did, and they thought it was cool. I
think they may be on the right track, but time will tell.
One of the projects shown in Jasper's DAC floor demo was for a chip that
at the time I knew there was a bug in because we had sampled it. I don't
know if Jasper found the bug or not.
I did not look at all the formal Model Checkers but the ones I did see
appeared to be heading in the direction that 0-in was already going.
Using simulation seeds and obtaining coverage information from checkers
are things 0-in had at least 2 years ago.
- [ An Anon Engineer ]
Our team had 0-in and BlackTie seminars in-house this year. So far no
decision has been made. I am somewhat curious about Jasper.
- [ An Anon Engineer ]
Any time you put Harry Foster and Doug Perry on the same formal
verification team you have got to pay attention. Looks like Jasper is
trying to make formal tools much faster.
- Clifford Cummings of Sunburst Design
The other interesting verification tools were the static formal
verification offerings from Jasper and 0-in. Jasper claims they offer
the first tool that does exhaustive verification on an RTL design with
properties and does not require a testbench. They claim their engine
has the ability to identify the state space that is not relevant to the
property being proved, reducing the time required to verify a property
and increasing the coverage to 100%.
Their properties are written in synthesizable Verilog. 0-in claims to
have the same capabilities as Jasper with the properties specified in
0-in format.
- [ An Anon Engineer ]
Jasper Design Engineering (jasper is a touchstone for gold) sells Jasper
Gold. This tool is strictly formal but is not limited to the perhaps
200 cycles that everyone was talking about at DAC this year. It somehow
pares down the state space to what it thinks is relevant, based on the
user requirements. The requirements are specified in Verilog. The tool
basically compares an RTL level design to a high level behavioral design.
They emphasized high level requirements rather than low level assertions.
- John Weiland of Intrinsix
0-in seem to have a pretty good story. One of the main strengths is a
small step to get into the tool. Designers find the idea of assertions
quite tractable, and more formal approaches get snuck in on the
coattails. Jasper have an interestingly different story and it'll either fly
or crash and burn - I think the jury is still out.
- Kevin Jones of Rambus
I met with 0-in, Real Intent, Verplex and Cadence, along with attending
the Friday tutorial on assertion based verification.
Although PSL/Sugar was discussed, everyone said that starting with OVL
was a good idea. It is "free" and "open source". PSL/Sugar will support
the OVL assertions, so that is also a good thing.
- Tom Paulson of QuickLogic
Real Intent has a very compelling story and sensible approach to using
assertions intelligently.
- [ An Anon Engineer ]
Real Intent is the easiest to learn and implement. We have been using it
successfully in all our designs for the last two years.
- Yuval Itkin of Metalink Ltd.