( DAC 02 Item 8 ) ----------------------------------------------- [ 9/10/02 ]
Subject: 0-In, Real Intent, @HDL, Averant, BlackTie, Ketchum, EDAptive
THE OLDE GUARD: In the sea of newbie verification companies desperately
vying for EDA users' attention (and dollars), there are 5.5 members of the
Old Guard in this niche. In order, they're 0-in, Real Intent, Averant,
BlackTie and @HDL. The 0.5 is Synopsys Ketchum -- which has been discussed
for almost 2 years now even though it's not been officially announced yet.
"New verification: ABV was the great buzz word everywhere on the floor.
Amazingly enough, two vendors (TransEDA and Cadence) already have Sugar
2.0 support. Unfortunately, neither of them has a formal tool in
place right now. Of all the formal tools, I liked Real Intent's Verix
the most. I don't think that the semi-formal approach (0-In) is the
right way to go. Averant is looking promising, too (especially if the
rumours about an alliance with Verisity come true). @HDL is Verilog
only (unfortunately -- I liked their debug environment) and they have
tight links to Synopsys -- one can argue whether this is good or bad. I
think it's bad."
- Raimund Soenning of Philips
"0-In: 0-In Check, 0-In Search.
Again we see "Assertion Based Verification", but what is really
worth emphasizing is the "dynamic formal verification".
Dynamic formal verification is a hybrid of simulation verification and
model checking. It works something like this: step forward with
simulation, choose "points" from which you start model checking. Do
the model checking around those points.
You end up having formal model checking "along" a simulation path.
Think of simulation like an arrow that goes straight to the target,
and model checking like circles that start on different points on that
arrow and "encircle" it with a new layer: the formal verification.
SAT solvers are perfect for this purpose, light and easy to use.
0-in? - Definitely an eye catcher! (and a bull's eye for that matter)."
- Monica Farkash of IBM
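Monica's description of dynamic formal verification is concrete enough to
sketch. The toy Python below is purely illustrative and not 0-In's engine:
a mod-8 counter stands in for the design, a hand-rolled breadth-first
search stands in for the SAT solver, and `bad` is the assertion under check.

```python
from collections import deque

def simulate(step, state, n_steps):
    """Follow one concrete input sequence: the 'arrow' of simulation."""
    trace = [state]
    for _ in range(n_steps):
        state = step(state)
        trace.append(state)
    return trace

def bounded_check(successors, start, bad, depth):
    """Model-check a small 'circle' around one simulated state:
    breadth-first search of ALL reachable states up to `depth`."""
    seen, frontier = {start}, deque([(start, 0)])
    while frontier:
        s, d = frontier.popleft()
        if bad(s):
            return s                     # counterexample state found
        if d < depth:
            for nxt in successors(s):
                if nxt not in seen:
                    seen.add(nxt)
                    frontier.append((nxt, d + 1))
    return None                          # no violation within `depth`

# Toy design: a mod-8 counter where one legal input jumps by 3.
step = lambda s: (s + 1) % 8                        # the one simulated path
successors = lambda s: [(s + 1) % 8, (s + 3) % 8]   # all legal inputs
bad = lambda s: s == 5                              # assertion: never hit 5

for point in simulate(step, 0, 4):                  # states 0,1,2,3,4
    cex = bounded_check(successors, point, bad, depth=2)
    if cex is not None:
        print(f"violation near simulated state {point}: reached {cex}")
        break
```

The simulated path alone (always +1, stopped at 4) never sees state 5, but
the bounded check launched from simulated state 1 finds the sequence
+1, +3 that does reach it -- the "circles around the arrow" picture.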
"In our environment, we have just 1 person developing 0-In checkers
independently, while the new and legacy Verilog monitors are developed
by a lot of people at different times. The checkers are developed
independent of our monitors. Our predefined checker directives greatly
simplify the effort of developing monitors. We choose to put all the
0-In checker directives into a single control file. Its concise
format makes the checker source file fast and easy to understand. It's
also a good reference document.
Although we use 0-In check extensively, it is the dynamic formal 0-In
Search strategy that initially drew us to 0-In. We've run 0-In Search
on several tricky blocks, and did find several tough bugs. 0-In Search
tends to find bugs that are missed by simulation. 0-In Check helps to
find bugs that can be caught by simulation."
- [ An Anon Engineer ]
"0-in
The grand-daddy of formal verification is still around, although with a
much smaller booth this time. 0-in came from the world of automatic
checking, where their tool would examine the RTL code, and look for all
kinds of common failures and report them. I believe that they have
gone through multiple versions of their tool.
0-In now claims 3 distinct types of checking:
1) Formal assertions
2) Simulation enhancement
3) Protocol checking
0-in has a bunch of built-in assertions that assume protocol knowledge.
The example of this that they like to give is for an arbitrator. You
can use the built-in assertion to check your arbitrator. You just give
it the inputs and outputs, and tell it the arbitration type, and it
will provide both formal and run-time checking.
What I don't like about 0-in is their insistence that the assertions be
embedded in the RTL code. I understand that they want to combine
design and verification, and I applaud the goal. Unfortunately, the
real world isn't always that clean and organized. People write code
without thinking about verification. Projects are started before a
methodology has been defined. It happens all the time and tools need
to be able to compensate for the real, dirty world where the code
sometimes gets ahead of the plans. Also, their assertions are in
0-In's own language. I don't believe that 0-In supports the emerging
standard assertion languages (yet).
What makes 0-in unique as far as I could tell, is that they admit that
assertions are not the only verification technique. They will connect
up to traditional simulations, and provide a "formal assist" to the
simulation. That means that for every state the simulator reaches,
0-in will use formal methods to explore the state-space around that
state looking for violations of any of their assertion statements.
This is the method they use to extend their ability to find violations
in larger designs.
Finally, 0-in provides protocol monitors in the form of assertion
statements for various protocols including PCI-X, SPI4, and a bunch of
others. I believe that these are simply a bunch of assertions to
check the validity of the bus operations."
- Andy Meyer of Zaiq Technologies
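The arbiter example Andy mentions is easy to make concrete. The sketch
below is a hypothetical run-time monitor for a fixed-priority arbiter,
written in Python for illustration -- it is not 0-In's directive syntax,
just the three rules such a built-in assertion would plausibly enforce.

```python
def check_priority_arbiter(reqs, gnts):
    """Run-time monitor for a fixed-priority arbiter (bit 0 = highest).
    `reqs`/`gnts` are per-cycle lists of request/grant bit-vectors.
    Returns (cycle, rule) for the first violation, or None if clean."""
    for cycle, (req, gnt) in enumerate(zip(reqs, gnts)):
        if sum(gnt) > 1:
            return (cycle, "more than one grant asserted")
        for i, g in enumerate(gnt):
            if g and not req[i]:
                return (cycle, f"grant {i} without a request")
            if g and any(req[:i]):
                return (cycle, f"grant {i} ignored higher-priority request")
    return None

# A 2-cycle trace for a 3-requester arbiter: cycle 1 grants requester 2
# even though requester 0 is also asking -- a priority violation.
reqs = [[0, 1, 0], [1, 0, 1]]
gnts = [[0, 1, 0], [0, 0, 1]]
print(check_priority_arbiter(reqs, gnts))
# prints: (1, 'grant 2 ignored higher-priority request')
```

A formal tool would prove the same three rules over all input sequences
rather than one recorded trace; the monitor form is what runs in simulation.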
"* attended 0-In Design Automation demo
This demo was probably the most impressive I have seen. In addition
to complex assertion checking, a Dynamic Model Checker is implemented
by insertion of special comments into your RTL code. A customer was
there who described how 0-In found some very obscure bugs in his
design. I would highly recommend that we get a demo of this tool. To
be fair, we should also look at what Verplex has to offer. I got a
very short demo and contact info."
- Ed Strauch of Cirrus Logic
"Real Intent
Yet another formal verification company. Like most of the others, they
have a set of pre-defined checks looking for multiply driven nets, and
a couple of unique tests, such as one that looks for dead code. After
the pre-defined checks are complete, the user must create the
assertions that the Real Intent tool will verify formally.
As with several other tools, this one can take properties that were
checked or derived in one module and see that they match the properties
of connected modules. This effectively checks the simple interconnect
issues between modules. This is good for basic bus communication, but
it will not be able to check that distributed state machines are
interoperating correctly."
- Andy Meyer of Zaiq Technologies
"We brought in Real Intent with a flourish of excitement about it
replacing simulations. What's actually happened is we use Implied
Intent as a super Lint tool, and no one goes to the trouble of writing
the assertions to use Express Intent. Designers indicate that Express
Intent is hard to use or not applicable to "real" designs."
- [ An Anon Engineer ]
"Real Intent: Verix
I would set this year's motto to "ASSERTIONS". Everybody does them.
So does Real Intent. They currently use Verilog/VHDL assertions.
They proudly declare to be the first in the industry to generate
automatic assertions. They have an automatic code analyzer that
generates assertions checking for: no bus contention, no array bounds
violations, no floating bus, all branches are entered, all memory
locations are initialized. If you have a SW background, think of
similar tools like Zero-Fault, Insure. I wonder why it took so
long to carry the SW ideas over into HW?
They use multiple engines for formal verification (including SAT).
It seems they were also the first to implement a formal hierarchical
flow. It works something like this: the tool "goes up" until it finds
the origin of a signal, and builds a partial model.
They also support clock-domain checking for multiple clocks.
Overall - Real Intent has more than just intentions. If they were the
first to think up those solutions then they must have real potential."
- Monica Farkash of IBM
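The idea of deriving assertions automatically from the code's own structure
can be shown with a toy generator. Everything below is hypothetical --
Verix analyzes the HDL itself, whereas this sketch is hand-fed a summary of
the design's arrays and buses just to show the kind of properties produced.

```python
def auto_assertions(arrays, drivers):
    """Emit the sort of properties an automatic analyzer derives from
    the design's own structure.  `arrays` maps array name -> declared
    size; `drivers` maps bus name -> number of potential drivers."""
    props = []
    for name, size in arrays.items():
        # array-bounds property: every index expression stays in range
        props.append(f"assert: index into {name} is always < {size}")
    for bus, n in drivers.items():
        if n > 1:
            # bus-contention property: drivers are mutually exclusive
            props.append(f"assert: at most one of {bus}'s {n} drivers enabled")
    return props

for p in auto_assertions({"fifo_mem": 16}, {"data_bus": 3}):
    print(p)
```

The point of automation is exactly what Monica notes: these properties cost
the designer nothing to write, the same way Insure-style SW tools instrument
array accesses without annotations.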
"Real Intent: Verix Tool
Competing product: @HDL's Verifier
Highlights
I attended both the hands-on tutorial session and the demo. Basically,
the Verix tool is a formal verification tool that is used for property
checking. The properties are embedded in the HDL code as comments or
derived automatically. The language used is called DAS, based on a
subset of Superlog that has been standardized by the Accellera
standardization committee. Real Intent will support any other
assertion languages endorsed by Accellera.
Verix supports two main kinds of assertions: automatic & user-defined.
Automatic assertions are a collection of predefined properties such as
array-bounds check, reachability analysis, proper clock-domain
crossing, and so on. User-defined assertions are those that the user
has to explicitly insert in the code.
Pros
- Supports user defined property spec
- Automatic testbench generation
- Will support Accellera standard
- Can automatically generate verilog monitors from properties
that can be used in simulation.
Cons
- The debug environment is rather primitive, not as flashy as
@HDL's tool.
While it is hard to say how well Verix works for real designs, they
claim to have about 20 customers."
- Ambar Sarkar of Paradigm Works, Inc.
"@HDL sells a tool that will automatically extract assertions from your
Verilog. They have both a bounded model checker and a true model
checker (understands "always" and "never") in the same tool. They are
preparing to support the Sugar standard assertion language."
- John Weiland of Intrinsix
"@HDL: @Verifier, @Design.
@HDL's guest at DAC, Stephen Meier (Synopsys), used the opportunity to
promote OpenVera. @HDL's tutorial was the best I saw to teach Assertion
Based Verification.
@HDL has a common environment for formal verification and simulation.
The verification flow can be followed in the book "The Art of
Verification with Vera". The main idea is you have automatic
generation of assertions (property extraction). You, as a user, can add
new assertions & constraints. These act as checkers during simulation
and act as constraints during FV. The stimuli generation is guided by
the checking results.
If you are interested in the Model Checking Flow (@Verifier), @HDL
has automatic property extraction, automatic design reduction, and a
hierarchical model checking solution. They use multiple solvers: BDD,
Bounded, Formal Trajectory, and can hook any engine you might have into
their environment. Their analysis tool is @Design.
Overall? It is a complete system, and we love complete solutions. It
has everything in it, formal, simulation, stimuli generation, analysis,
coverage ... maybe not each module is the brightest bulb on the tree,
but together, @HDL is enlightening."
- Monica Farkash of IBM
"@HDL Verifier + Designer
Currently they have no VHDL support.
They have an automatic property extraction tool. Properties may also
be written manually -- supported languages are OpenVera and Sugar.
The tools combine simulation and formal methods. On the simulator side
there is a strong link to Synopsys' VCS due to the OpenVera language. In
fact it is the only simulator supported so far.
@HDL offers great analysis and debug tools with its own waveform viewer
and source code debugger, as well as a schematic viewer (comparable to
Novas Debussy).
Side Note: there are rumours that @HDL will be acquired by Synopsys."
- Raimund Soenning of Philips
"We've looked at Averant in the past and we like their tool, but
again we can not get any designers to step up and use these tools."
- [ An Anon Engineer ]
"Averant sells a property checker that is bounded in time. They claim
Solidify is the most efficient tool on the market, it does a lot of
automatic checks (like checks for deadlock), and it is the easiest to
use because it's focused on block level designers rather than system
designers. They are teaming with Verisity. They like their language
better than Sugar so they currently don't plan to support it."
- John Weiland of Intrinsix
"Solidify APG seems to be quite comprehensive as it automatically
generated over 50k properties in 7 categories on one of our large
designs. The APG checks were done on the RTL that was lint-clean."
- Chandra Moturu of Hewlett-Packard
"Averant Solidify
Solidify is a static functional verification tool that exhaustively
verifies HDL designs by using properties/assertions and formal methods.
For defining properties Averant uses their own language called HPL
(Hardware Property Language) but they are going to support OVL from
Accellera as well. Properties can be embedded in the HDL code or be
kept in a separate file. They do have VHDL support! Solidify also
includes a set of automatic checks like set/reset checks, clock
gating & crossings, deadlock checks, bus contention, stuck-at
faults, ... It has a kind of linting to ensure that a property is
syntactically correct. For every failing property a waveform trace
is generated and also linked back into the HDL code (cross-probing).
It has some possibilities to 'tune' the performance by decomposing
complex properties into a set of simpler properties or by reducing
the cone of logic with constraints (as a very first constraint it just
needs the reset sequence). It also provides static coverage to
supplement the verification process with assertions.
Side note: Averant and Verisity are planning a partnership."
- Raimund Soenning of Philips
"4.1 Averant Solidify
Averant was founded in 1997 as HDAC, Inc. and renamed Averant, Inc. in
May 2000. Averant is headquartered in Santa Clara, CA, and has an
Eastern Region sales and support office near Boston. They have a large
number of customers (more than 27) including Cisco, ARM and nVidia.
Highlights
Solidify is their flagship tool. It's a static functional verification
tool that exhaustively verifies properties of Verilog or VHDL designs.
Solidify properties are developed using an HDL-like language called HPL
(Hardware Property Language) that leverages existing HDL coding
knowledge -- in other words, making it easy for designers to understand
and write properties. HPL can express temporal relationships, sequences
of events and verification loops, among others.
Solidify also contains a library of standard design checks that
automatically generate thousands of Solidify properties to find the
most common static errors in designs, including things like clock
gating and crossing, deadlocks and bus contention.
Averant has also licensed Verisity's 'e' verification language, and
they plan to add support for 'e' to Solidify.
Like other tools in this class, Solidify is attempting to reduce
functional simulation by static analysis of RTL code, using their
proprietary property language. Solidify processes the code using their
own native VHDL and Verilog compilers. Their GUI has a design browser,
shows the source code, properties and coverage analysis, as well as a
waveform of a counter-example if one exists.
I can certainly see the value in static property checking tools. These
seemed to be all the rage at DAC this year. However, I'm not sure
whose jobs this tool is going to make easier. HPL is designed to be
comfortable to the RTL designers. Now instead of just coding their
blocks, they have to think a little more about boundary cases and
coding in rules to detect them. If designers are the ones running
Solidify, then this will take time away from the actual coding. If the
RTL design with HPL checks is handed off to a verification engineer to
run, then there is a new loop created between the verifier and the
designer, in addition to the "standard" functional verification loop.
It's possible Solidify will find too many problems too early,
interrupting the designer from completing the first pass of a second
block, while the first block is being debugged. When the design
changes, Solidify will need to be run again (normally this kind of
functional verification would have taken place during regression).
Thus, a new tool that needs to run somewhat outside the normal
functional verification flow."
- Jim Reisert of Paradigm Works, Inc.
"Verplex BlackTie
BlackTie is a formal tool working with the assertions/constraints
defined with OVL or Sugar. It splits into two parts: what they call
PDC (pre-defined checks) and UDC (user-defined checks).
With PDC they check mainly 3 things in the design:
- Semantic consistency like X-assignments on signals, range checks,
overflow checks
- Structural consistency like bus floating, bus contention
(tri-states), set/reset checks
- Clock synchronization where you can specify your rules for proper
synchronization and clock relationships
For PDC you need to provide an initial sequence for resetting the
design in order to limit the search space.
UDCs are written by the users themselves and are intended to be used early
in the design cycle (parallel to RTL coding) and can check for example:
- 1-hot condition on inputs or state machines
- overflows
- assert never checks (e.g. two signals high at the same time)
The assertions can be controlled by severity level. Reports can be
generated on how often an assertion gets triggered. They are easy to
enable & disable. Synopsys pragmas can be respected as well.
Roadmap: in the future BlackTie will also provide
- Dead code check
- Extensive FSM check
- Coverage analysis
- Sugar monitor generation capability
They also say they'll add hierarchical management to BlackTie."
- [ An Anon Engineer ]
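For a flavor of what a UDC amounts to, here is a toy Python version of the
1-hot and assert-never checks listed above. This is illustrative only --
BlackTie takes such checks as OVL/Sugar assertions, not as Python.

```python
def one_hot(vec):
    """True iff exactly one bit of `vec` is set."""
    return sum(vec) == 1

def assert_never(trace, condition, label):
    """User-defined check: `condition` must hold on no cycle of `trace`.
    Returns a report string for the first offending cycle, if any."""
    for cycle, state in enumerate(trace):
        if condition(state):
            return f"{label} violated at cycle {cycle}"
    return f"{label}: clean"

# State-machine trace: cycle 2 has two bits of the one-hot register set.
trace = [[1, 0, 0], [0, 1, 0], [0, 1, 1]]
print(assert_never(trace, lambda s: not one_hot(s), "one-hot state reg"))
# prints: one-hot state reg violated at cycle 2
```

In simulation this runs over recorded cycles as above; a formal engine
would instead try to prove that no reachable cycle can satisfy `condition`.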
"I don't have a magic bullet on timing closure, but it sure seems to me
that using a property-checker (BlackTie or the likes) to check your
code, and then C for simulation of functionality, is the smart
answer. Now, I don't know if the tools are ready yet, but given
how much money & time the two could save, it seems like a no-brainer
vs. buying lots of VCS/Vera/whatever seats and writing sims for
months and months."
- [ An Anon Engineer ]
"Verplex has a family of equivalence checkers and assertion checkers.
One tool that is somewhat unusual can compare an extracted netlist to
Verilog or even RTL (I think Chrysalis, Innologic and Circuit Semantics
have something like this, too). This is always an issue when you are
designing a library - how do you confirm that the guy who designed your
cells didn't screw up? They are automating this process better for
some special cases (like FPGAs), and are coming out with tools for
custom memories and datapaths (like you would get from Synopsys Module
Compiler). Their BlackTie assertion checker has a lot of built-in
checks like one-hots staying one-hot, no tri-state contention, etc.,
and it can now do user-defined assertions as well. This was a big
battle a few years back. When model checking was first introduced, it
required the user to learn some obscure language, and it really didn't
catch on as a result. Recently, some vendors like Verplex have offered
a tool with a number of common built-in checks so that the users need
only click a button on a GUI. I think offering both types is a good
model.
Mentor says their FormalPro equivalence checker has better capacity
than competitors (they claim only a quarter of the memory usage for
the same size design). They also claim better debug capability, which
is really what separates equivalence checkers.
Circuit Semantics sells a tool that helps Mentor's equivalence checker
understand full custom circuitry; it takes the SPICE netlist and
extracts the logic function."
- John Weiland of Intrinsix
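The contract of an equivalence checker is simple to state even though the
engines (BDDs, SAT) are not. A brute-force Python sketch, workable only
for tiny cones of logic, with hypothetical stand-ins for the golden RTL
and the extracted netlist:

```python
from itertools import product

def equivalent(f, g, n_inputs):
    """Combinational equivalence by exhaustive enumeration: compare the
    two implementations on every input vector.  Real checkers use BDDs
    or SAT instead, but the contract is the same -- return a
    counterexample input vector, or None if the two cones match."""
    for bits in product([0, 1], repeat=n_inputs):
        if f(*bits) != g(*bits):
            return bits
    return None

rtl     = lambda a, b, c: (a and b) or c          # golden RTL
netlist = lambda a, b, c: (a or c) and (b or c)   # extracted gates
print(equivalent(rtl, netlist, 3))
# prints: None  (equal by distributivity of OR over AND)
```

The library-verification problem John raises is this same check applied
cell by cell: the extracted transistor netlist plays `g` against the
cell's declared logic function as `f`.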
"4.2.6 Synopsys FormalVERA (formerly known as Ketchum)
Key features:
- Advanced APG engine
- Translates specification into properties
- Semiformal test generation (analyses reachable & unreachable states)
Ketchum uses a 2-phase approach and keeps track of changes in the RTL
design. It has no source code browser!!! And it also does not feature
cross clock boundary checking!!! In my eyes Ketchum is a little bit
behind its rivals."
- Raimund Soenning of Philips
"Synopsys has a tool called Ketchum which, after looming on the
horizon for more than a year, has been renamed "FormalVera" and will
probably be released in the third quarter (maybe fourth). This tool
takes a description of properties of your design and automatically
generates testbenches to verify those properties. The goal is for
someone to be able to tack properties onto their design. Those
that can be definitely proved or disproved by formal techniques
will be reported as such. Those that can't will cause monitors to
be generated for simulation. This is a mix of formal verification
and semiformal test generation.
EDAptive Computing says their Vectorgen product is close to being
released. It sounds a lot like Synopsys Ketchum. You specify your
system function in Rosetta, and it creates test vectors automatically."
- John Weiland of Intrinsix
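The triage John describes -- settle what you can formally, fall back to
simulation monitors for the rest -- can be sketched generically. All
names below are hypothetical, not Synopsys' API; the "prover" is faked
with a lookup table.

```python
def triage(properties, prove, make_monitor):
    """Semiformal flow: try to settle each property formally; anything
    the prover cannot decide falls back to a simulation monitor."""
    proved, refuted, monitors = [], [], []
    for p in properties:
        verdict = prove(p)            # True / False / None (undecided)
        if verdict is True:
            proved.append(p)
        elif verdict is False:
            refuted.append(p)
        else:
            monitors.append(make_monitor(p))
    return proved, refuted, monitors

# Fake prover: a table of outcomes stands in for the formal engine.
verdicts = {"no_overflow": True,
            "fifo_never_full_and_empty": None,     # too hard to decide
            "grant_without_req": False}
prove = verdicts.get
make_monitor = lambda p: f"monitor({p})"

print(triage(list(verdicts), prove, make_monitor))
# prints: (['no_overflow'], ['grant_without_req'],
#          ['monitor(fifo_never_full_and_empty)'])
```

The value of the split is that every property gets *some* coverage: a
definitive formal answer where possible, a run-time checker otherwise.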
"EDAptive/UML/Rosetta
EDAptive is trying to convince us to move towards a new design
methodology. They have prepared a tool set that supports the complete
development of SOCs. It contains theorem provers, automatic HW
partitioning, etc.
The whole set is based on a language called Rosetta (a kind of UML if
you are familiar with it).
I thought the neatest feature of the tools is the ability to search for
designs that fit a description in a design pool (for HW reuse). Obviously
all
designs have to be written in Rosetta to start with. I will be the
last to argue that changes in our design methodology are not welcome,
and there is definitely value in their approach and tools.
Unfortunately it would mean a major change, and in this industry changes
are mostly incremental -- all of us unconsciously take care that
the increments are kept REALLY SMALL.
- Monica Farkash of IBM
"We had a home-brew version of Ketchum. The concept is really good. It
finds things like gated clocks and bad clock crossings early."
- John Webster of Intel