( DVcon 07 Item 14 ) -------------------------------------------- [ 05/24/07 ]
Subject: Jasper, Synopsys Magellan, Mentor 0-In, Cadence IFV, IBM RuleBase
JASPER ROARS -- Little tiny Jasper keeps up while competing against the evil
thousand-man sales armies of Cadence, Synopsys, and Mentor!
2005 - What do you think about "bug hunters" like 0-In, Jasper, @HDL,
Real Intent, Averant, or Synopsys Magellan or Atrenta PeriScope?
Does your company use such tools?
don't use : #################################### 72%
0-In : ### 7%
Synopsys Magellan : #### 8%
Jasper : ## 3%
Real Intent Verix : ## 5%
Averant Solidify : # 2%
@HDL : # 2%
Cadence BlackTie : 1%
IBM RuleBase : # 2%
Atrenta Periscope : 1%
Calypto : 1%
Avery Design : .5%
Notice the mistake I made in 2005 of not mentioning Cadence BlackTie in the
question. I think their 1% in 2005 is way too low for back then.
2007 - Does your project use formal "bug hunters"? If yes, whose
do you use? (e.g. Jasper, Mentor 0-in, Real Intent Verix,
Synopsys Magellan, Averant Solidify, Cadence ISV/IFV/BlackTie,
Avery, IBM RuleBase, or Atrenta PeriScope?)
don't use : ###################################### 74.5%
Mentor 0-In : ### 6.5%
Synopsys Magellan : #### 7.9%
Jasper : ## 4.6%
Real Intent Verix : 1.2%
Averant Solidify : 0.9%
Axiom : 0.5%
Cadence ISV/IFV/BlackTie : ### 6.21%
IBM RuleBase : # 2.0%
Atrenta Periscope : 0.5%
OneSpin 360MV : 0.8%
Avery Design : 0.1%
homebrew : 0.5%
Subtracting out all 72% of the "don't use" people for 2005 leaves:
0-In : ########################## 26%
Synopsys Magellan : ############################## 30%
Jasper : ########## 10%
Real Intent Verix : ################ 16%
Averant Solidify : ####### 7%
@HDL : ####### 7%
Cadence BlackTie : ##### 5%
IBM RuleBase : ####### 7%
Atrenta Periscope : ### 3%
Calypto : ### 3%
Avery Design : ## 2%
Subtracting out all 74.5% of the "don't use" people for 2007 leaves:
Mentor 0-In : ########################## 25.6%
Synopsys Magellan : ############################### 30.9%
Jasper : ################## 18.2%
Real Intent Verix : ##### 4.8%
Averant Solidify : #### 3.6%
Axiom : ## 1.8%
Cadence ISV/IFV/BlackTie : ######################## 24.2%
IBM RuleBase : ######## 7.9%
Atrenta Periscope : ## 1.8%
OneSpin 360MV : ### 3.0%
Avery Design : # 0.6%
homebrew : ## 2.4%
This time around the Cadence bug hunters get far better representation.
Problem is I don't have good 2005 Cadence bug hunter numbers that I can
compare to this present Cadence ISV/IFV/BlackTie 24.2%. Did it grow?
Did it shrink? Can't tell. :(
What is interesting is how the 0-in and Magellan numbers barely changed over
the past 2 years -- and I'm quite impressed with those Jasper numbers! Why?
Because Cadence/Synopsys/Mentor each have thousand-man walking, talking sales
armies selling their wares. Little old tiny Jasper *grew* from 10% to 18.2%
competing against those Big 3. It's like Luxembourg keeping up with France,
Germany, and the UK! Jasper's running with the big dogs! (And keeping up!)
For more understanding, I asked an additional question in 2007:
2007 - In your opinion are these formal "bug hunters" useful or
a waste-of-time? (Choose)
useful : ####### 14.2%
waste-of-time : ### 5.6%
no answer : ######################################## 80.2%
It was a confusing result. Most people (80%) didn't answer. In the end
the useful/waste numbers weren't useful, but the respondents' reasoning on
this one question was.
One new discovery I found this year was how so many of the customers were
"planning to use" or "looking at" or "evaluating" the individual brands of
bug hunters. I've never seen any other EDA segment that had so much buyer
tire-kicking involved!
9.) Does your project use formal "bug hunters"? (Yes/No) If yes,
whose do you use? (e.g. Jasper, Mentor 0-in, Real Intent Verix,
Synopsys Magellan, Averant Solidify, Cadence ISV/IFV/BlackTie,
Avery, IBM RuleBase, or Atrenta PeriScope?) In your opinion are
these formal "bug hunters" useful or a waste of time? (Choose)
Evaluating Synopsys Magellan and Cadence ISV.
No. Will be evaluating Cadence IFV soon.
Currently evaluating Mentor 0-in and Atrenta PeriScope to assess
their usefulness.
No. Evaluating Synopsys Magellan. Useful to co-exist with
traditional methodology.
No. Once we have an assertion based methodology in place, we may
evaluate Jasper, Mentor 0-in, and Real Intent Verix. I have had
both Mentor and Real Intent come to our site for presentations.
We use internal proprietary tools. Evaluating some vendor tools.
No. Cadence ISV/IFV under evaluation.
Not bought any yet... done some light evaluation.
Evaluating Cadence IFV.
No. Rumor is IFV and Jasper are OK, plan to eval some formal tool.
Have not used any in a project, although have evaluated Magellan.
We are currently looking at Jasper and Mentor 0-In.
We are looking at Magellan.
No, but we are looking into Jasper's tools.
No. Well, we have been looking at Magellan and Cadence IFV a bit.
I feel that these "bug hunters" are probably useful, but only if
the actual system/chip designers put in the effort to create the
assertions during design. Trying to add the assertions after the
fact (which is unfortunately what we sort of need to do, to try to
convince others how useful the "bug hunters" could be if they'd
only put in the up-front effort) is more of a waste of time.
No use, but looking at Mentor 0-in
Yes. Magellan now, but looking at other tools because Magellan
performance is not that great. We think formal tools and "bug
hunters" have their place in the verification flow. We found some
corner-case bugs with Magellan.
No, but are considering Mentor 0-in.
Mentor 0-in in past. Useful so far. Considering Magellan for
current project.
Magellan - but other tools are being considered as well. Yes, they
are useful.
We are investigating Jasper and may use it for production work.
No. Thinking about using Real Intent. Bug hunters are useful.
We're playing with 0-in, and will probably buy it.
The intent is to try Cadence IFV on the current project. It would
be a type of trial.
Will have one Magellan in house for current project to test it out.
I believe it will be very useful.
Future maybe Atrenta. Don't know.
We don't use it for now due to lack of budget, but probably will use
BlackTie in the future.
Not yet, will be using more moving forward, most likely Atrenta.
Planning to use Magellan.
NO, but planning to purchase Mentor 0-in.
Answer = Planning to use Avery Insight
Planning on Magellan, but no data yet.
No. Plan to use IFV. Not much use at this time. SoCs are
growing larger and these tools cannot handle such huge gate counts.
Not yet used... plans are there for Magellan use.
We have plans to use Synopsys Magellan moving forward. These are
useful tools.
Yes. IFV - planned. They are useful.
Starting with IFV tomorrow. No opinion yet, but it better be useful.
IBM RuleBase + Atrenta Periscope (starting)
Only starting. Like IFV (ease of use).
Some work started with Jasper.
Yes. Synopsys Magellan. My current project is just getting started
with it, so it is too early to report on its success, but Magellan
has proven useful on previous projects.
We are going to start using Atrenta Periscope (or Spyglass CDC). I
evaluated this tool for a bug we had in our design that took us a few
man-months to locate. It was due to a pulse stretch that resulted in
a small time for de-assertion. We were able to locate the problem
once we set up the tool correctly. We had to do a little tweaking,
but Atrenta has promised us an enhancement that will not need this
tweaking. Given the types of CDC issues I have seen, I think using
a tool like this is not a waste of time.
None as of yet. Starting to explore Magellan.
We use Cadence IFV - absolutely recommendable!
Yes, OneSpin 360MV; useful in special situations, e.g. ECO
Synopsys Magellan and Cadence ISV/IFV/BlackTie; I find them very useful.
Mentor 0-in, I think the jury is still out on this.
No, they are extra software to 'go wrong'.
Use OneSpin 360MV. They are useful in targeting certain pieces of
design, but need to be targeted carefully as they can take longer to
verify certain pieces of design than other techniques.
Averant Solidify. They do find real bugs. But you need to get up
to speed with them.
Yes. IFV. Useful.
Yes, OneSpin Solutions MV360; Very useful for certain verification
tasks! Again, why don't you mention OneSpin? They have been
around since last DAC!
No - they take a lot of time (more than conventional methods), but
when complete they are more exhaustive.
Yes. We use Jasper and 0-in. We used to use Magellan. These tools
are useful, but require a large investment in a dedicated staff and
careful choice of application area. Bugs found this way tend to cost
more per bug than bugs found through simulation, but these tools tend
to find obscure situations that would otherwise likely be missed.
Certainly these tools still have a long way to go to be truly mature.
0-in. Very useful for clock domain crossings and synchronization
mechanisms, which are hard to verify in simulations.
Yes - IBM RuleBase. Useful - wouldn't do another project without it!
Yes, occasionally. 0-in.
Moving from Solidify to Magellan (another cost reduction)
Limited use of Cadence ISV/IFV/BlackTie.
No. Maybe useful.
Mostly 0-in, some Jasper.
Sometimes, Jasper and Synopsys Magellan. Can be useful under certain
circumstances, often times, we do not have time to use them, though.
Used to use IBM RuleBase and still think it's the best, but now using
Real Intent Verix and ramping up on Synopsys Magellan. Cadence IFV is
seen as equivalent to Synopsys Magellan, perhaps easier to use, but
business process, not technical, resulted in Magellan over IFV.
We find good value in these tools.
Do not use. Impression is that they are useful but we don't have
resources to dedicate.
Yes. We used to use Solidify from Averant (a very good tool BTW). We
are now moving to Synopsys' Magellan (nowhere near as good, but it is
free with our current license agreement). I have found Solidify to
be very useful. Not sure about Magellan.
No, my project doesn't, but others had some luck with BlackTie.
No. I suspect these would be useful - more so than linters
We use Cadence IFV and Magellan from Synopsys. They are useful both
as bug hunters as well as for proving design correctness (though
capacity limitations are real show-stoppers for bigger modules).
No, waste of time.
Cadence BlackTie, some value when used by designer at block level.
We use the 0-In CDC analysis tool only. Very useful, although
somewhat painful.
Yes, OneSpin. Useful when backed up with "normal" functional
verification.
No. I've used RuleBase in the past, and it is an excellent tool for
bug-hunting.
We use Cadence's IFV tool, which is not just a bug hunter like some
of the others. You can get proofs out of IFV. I think formal
analysis has a great future. Many of our implementers rely only on
formal analysis for their module bring up. Their designs are often
bug-free when integrated into larger designs. Also, their designs
tend to be more parameterizable and thus more reusable. They also
have tremendous error checkers on their interfaces to ensure that
the communication between blocks follows the rules specified.
No, but we are interested.
0-in. Great for finding clock domain crossing problems.
Magellan has seen some use, but management doesn't think it's useful
so it's never been staffed on a full-time basis. We play with it
from time to time.
Jasper. Maybe a little Magellan because we got free licenses in a
package deal. Useful - found some pretty strange corner-casey bugs
that probably never would have been found otherwise.
IBM RuleBase (mostly waste of time)
Yes - Rulebase, Jasper. Useful on a small block basis. The tools are
still not mature enough for widespread use but nonetheless useful for
both designers and DV engineers. We currently use the JasperGold tool
on our current project and it has had some success with some blocks.
Occasionally use Synopsys Magellan. Not part of main flow.
We tried out Magellan but concluded that it was far too immature to
give a useful return on the time invested.
Waste of time. The amount of development required on constraints and
properties exceeds the amount to fully develop a Vera-based testbench.
Great at finding bugs you already found, though. ;^)
Not my present project. However, I have seen the use of Jasper and a
switch over to Magellan eventually in my organization.
Cadence IFV, mainly the automatic checks & they are definitely useful.
No formal bug hunters. We invested a great deal of time and effort
into an excellent simulation environment and methodology. We get
lots of bang for the buck with this "old fashioned" method.
We use Mentor 0-in to formally prove that some assertions cannot
fire, and then turn the assertions off in simulation to save the
simulation overhead of running the assertions. We tried to do
other formal proofs on whole blocks, but that has been difficult.
IBM Rulebase for IP verification. Very useful.
Limited Cadence IFV support. It is useful for small, new designs.
No. Waste of time; simulation and lint cannot be replaced.
We have access to 0-in. Very useful as long as the design team has
the time to integrate interface assertions to properly constrain
the block. Many times we don't have this luxury.
Yes. Cadence IFV/BlackTie. Useful.
No, good design practice should alleviate the need.
Yes. Magellan and 0-in. Tried Jasper for some proofs. Useful,
but they all require a 'guru' to drive.
Cadence ISV - very useful.
I like Cadence IFV and have used it the most, though I've checked out
Averant Solidify and really like it as well. It's got some great
features. Our company bought Synopsys Magellan, go figure. To be
fair, Magellan works well, and if you're doing your development in
VCS, you're probably better off with Magellan since it uses the same
interface, and has the same level of language support. Same goes for
using Cadence IFV and Incisive together.
Solidify from time to time. They are helpful provided time is
allotted to them.
No, I think they are waste of time.
Yes - we use 0-In. Evaluated all of them out there. Bug hunters
need resource investment, but they do find bugs. We had a
deadlock corner case that was found that was missed by months
of simulation.
No. Some people in our group tried a couple early in our project,
but we didn't have much success.
On this project, we are not. I have used Jasper, and liked the
concept, but found it too difficult to get useful results.
Yes, but not currently in our department. Magellan is used in
other departments.
Yes, Cadence IFV, Very Useful
Used Jasper, now using Magellan. Moderately useful for catching
the last 0.01% of bugs.
No, but in the past I have found them useful.
No, they were seen as waste of time in the past.
No. I think these tools are only useful if you're starting a fresh
project with no legacy whatsoever. For upgrade projects, the reality
is the schedule dictates how much time you have for verification.
Yes. Jasper, Mentor 0-in and Synopsys Magellan. Useful.
Yes, we use BlackTie.
IBM Rulebase. Useful BUT for very specific cases (e.g. bus protocol
checking).
We use 0-In. They are very useful. 0-In is really easy to configure
and its CDC checking tool is very valuable.
Mentor 0-in - very useful for finding weird clock crossing boundary
cases, etc. These tools are all a pain to use (finding real vs
spurious errors) but 0-in has many common cases covered already.
No we don't use them. Though I can imagine that re-writing a design
(or the spec, or intent) in another language can certainly uncover a
lot of things. For that purpose I have looked at SysML/UML, but there
is no tool support for those yet.
Magellan. Useful if done right. Waste of time if not set up
correctly. The tool needs to be used by people who know what
they are doing.
We have looked at Atrenta PeriScope and Jasper (some groups are using
both) but we are not using either in our current project.
Not currently. We used 0-in Search 5 years ago. It was not useful
at the time.
Magellan. In my opinion it is questionable whether the time spent
formally verifying a small block is better spent aggressively testing
through simulation. Formal has the same problems as simulation; just
in a different form. "Overconstrained in formal" equals "undertested
in simulation". "Violations with no assertion to catch them" equals
"violation with no functional check to catch them".
No. I suspect they are only as useful as the assumptions they
require as an input. Some pressure to look into them.
Magellan, Jasper. Useful.
We're using IBM RuleBase as a "complete" verification tool for IPs
which are simple and/or regular enough to be appropriate. Some
blocks don't get simulated at all at the IP level, and just a
simple simulation at SoC level.
Averant Solidify, Jasper JasperGold. VERY useful! But it helps
a lot if you know how to use them. ;-)
No. We definitely cannot afford these tools, and I don't think
I could make a case to get them.
No, but we would like to try these, if we can get some indication
from a third party that they are not a complete waste of time.
No. Usefulness depends completely on the design in question.
Cadence IFV. Useful, but sometimes dynamic sensitization is better
than static sensitization.
I have used 0-in and IFV. 0-in is faster, IFV is perhaps a little
easier to use. They're useful, but only after you take the time to
properly weed out all the false positives.
Yes. Homegrown. Useful.
Synopsys Magellan. It has been very complex and difficult to use.
Yes. Real Intent Verix. Useful if mature and working properly. That
doesn't seem to be the case with Verix.
Yes. Jasper. The formal tool is useful. It is very effective for
certain type of circuits.
No, they were seen as waste of time in the past.
Yes (occasionally). Jasper and Synopsys Magellan. Useful, but
extremely time consuming. We limit usage to special tough cases.
No, they need dedicated personnel, who normally cannot be spared
from other tasks.
Few projects used Cadence BlackTie. Mostly waste of time.
No. We spent time trying to get Magellan to run a couple years ago
and were never completely successful. Real Intent Verix was used
several years ago on a project, and that project was eventually
killed after two full mask sets for a basic lack of functionality.
However, that doesn't necessarily lay the blame on Verix since the
tool is only as good as its operator. Conceptually, formal bug
hunters are a good idea, but the software was too cumbersome at the
time we tried it.
OneSpin Solutions - they are ahead. If you mean property checking
as "bug hunters".
Yes, 0-in. Useful.
BlackTie, useful at the IP level (part of our quality gate).
Evaluated, found them to be a waste of time. Do not use in
current project.
We use Cadence IFV. I do not think this is a waste of time, since
it allows us to find and to fix bugs much quicker (i.e. to save
time in the always-too-long verification process).
No, I think they are a waste of time.
Real Intent Verix and Atrenta - prefer Verix due to ease of setup
and familiarity. Tool still needs to evolve to handle complex
ASIC sub-systems.
Not a lot. We use Magellan for some blocks that are simple in
formal and not simple in simulation.
Formal analysis is a waste of time. I have evaluated tools by
multiple vendors and they were all crap. They all have memory
capacity issues, require too much hand holding, are useless on
datapaths and useless beyond a certain number of states which
makes them impractical when designing real chips, under real
deadlines and with limited resources to boot...
No, we have not used these on this project. I believe they are
useful if run by the RTL-designers locally for block-level
assertion/property checking.
Cadence IFV; very useful on module level and maybe for connectivity
checking on top-level (interrupts,...)
No. Good, but hard to justify the cost before a major melt-down.
Yes. We are using the Jasper tools and have found them to be very
useful. We have modified our DV methodology to include more of the
Jasper formal analysis based on our early results.
Yes. We use Cadence IFV and Mentor 0-in. I personally think it is
a waste of time. The ROI is very low.
Cadence IFV. These tools are not just bug hunters. You can integrate
them in your overall verification flow and distribute the verification
task for certain features between static tools (more proper name) and
dynamic simulation.
Yes, Jasper, 0-in, Magellan
Yes, I use Verix. Very useful at the beginning of a design cycle.
No. I think they would be useful, but I've never used one.
Yes. Magellan and Jasper. Useful.
Yes, IBM Rulebase and Mentor 0-in.
Magellan. They are useful for small parts of the design, but the
state space for most designs is so very large that only small
portions gain advantage using formal methods.
NO, I'm wondering if they're useful as well.
Jasper. Mixed response. Probably don't know tool capabilities yet.
Mentor 0-in and Magellan. Great technology, especially for block-level
verification where designers can write some properties and verify the
control path while the simulation environment is getting developed.
Also, with formal verification earlier in the design verification
cycle, assertions get validated before being used in simulation.
Highly effective at catching corner-case bugs.
Yes. Real Intent Verix and BlackTie. Waste of time, I think.
We use 0-In formal tools. They are useful but not a "silver bullet".
Yes. Synopsys Magellan. Useful if used appropriately and sparingly.
Yes, IBM RuleBase, useful
Real Intent Verix and Cadence IFV. They have been useful for
pointing out specific types of problems.
We've tried 0-in and Magellan. Both are good tools with some value,
but neither offered enough bang for the buck in terms of tool cost
or engineering effort.
Looked at Jasper. Didn't see enough reward for time as we have
dedicated verification engineers maintaining a testbench environment
around VCS.
Yes, Cadence IFV, very useful, saves time by finding bugs at block
level before integration, and because block level effort can be
reused during simulations at system level.
No; use BlackTie but not as a bug hunter; waste of time
Yes -- 0-In -- would like to do more but have run out of time. They
are useful (have found bugs before).
No, not yet convinced of their value.
IBM RuleBase for model checking. Write our own functional properties
from the data sheet, and are not using any built-in libraries or
assertions template.
Yes. Tried Jasper. Still not there yet, but promising.
Real Intent Verix - for asynchronous clock domain crossings, and it
has been very useful!!! Real Intent Puretime - for verifying
multicycle paths, and it has been very useful!!! With both tools we
have identified & fixed problems that would have caused a "re-spin".
Yes. Mainly Jasper and Synopsys Magellan.
Synopsys Magellan, and Mentor 0-in. We have found these tools to be
useful when used in one of two roles. The first is as a designer tool
during the early block development. The second role is to help with
coverage closure, to be able to prove missing cases.
Mentor 0-in. This is useful, we found some bugs and Mentor 0-in has
quick and easy to use CheckerWare IP for common bus protocols and
design components like FIFO, etc.
Yes. Jasper. Jasper does a great job finding tough bugs in places
where formal tools are traditionally used. As expected, it finds
complex corner case bugs that wouldn't be found by other means. But
it also has an assertions development environment which helps write
and debug complex assertions.
I have used Averant Solidify and it found some very interesting bugs
starting from just a few assertions. I was very skeptical about these
tools until I experimented with Solidify and it instantly ruined my
confidence in a design that was supposed to be fairly well tested. ;-)
The counter example VCD waveforms are great for debugging.
No. We used formal tools on onesie-twosie blocks before. Though
useful in special cases, overall very bad ROI.
Synopsys Magellan. Very interesting indeed to work with, very useful
for corner case problems, but we're questioning the value of such
tools for the primary verification approach. It will probably remain
a secondary supplementary tool.
Not at present. We've been looking at them for years.