( DVcon 07 Item 15 ) -------------------------------------------- [ 05/24/07 ]
Subject: Magellan, 0-In, IFV/BlackTie, Jasper, RuleBase, Verix, Solidify
SHUFFLE & REDEAL -- Because so many people have requested it, here are
those same user comments about the specific bug hunters, clustered by
tool name. The 2007 mindshare data with all 74.5% of the "don't use"
people removed:
Mentor 0-In : ########################## 25.6%
Synopsys Magellan : ############################### 30.9%
Jasper : ################## 18.2%
Real Intent Verix : ##### 4.8%
Averant Solidify : #### 3.6%
Axiom : ## 1.8%
Cadence ISV/IFV/BlackTie : ######################## 24.2%
IBM RuleBase : ######## 7.9%
Atrenta Periscope : ## 1.8%
OneSpin 360MV : ### 3.0%
Avery Design : # 0.6%
homebrew : ## 2.4%
I forgot that one of the biggest strengths of this census is reading
the individual hands-on users' own words about their experiences.
Sorry.
9.) Does your project use formal "bug hunters"? (Yes/No) If yes,
whose do you use? (e.g. Jasper, Mentor 0-in, Real Intent Verix,
Synopsys Magellan, Averant Solidify, Cadence ISV/IFV/BlackTie,
Avery, IBM RuleBase, or Atrenta PeriScope?) In your opinion are
these formal "bug hunters" useful or a waste of time? (Choose)
No. Evaluating Synopsys Magellan. Useful to co-exist with
traditional methodology.
Have not used any in a project, although have evaluated Magellan.
We are looking at Magellan.
Yes. Magellan now, but looking at other tools because Magellan
performance is not that great. We think formal tools and "bug
hunters" have their place in the verification flow. We found some
corner-case bugs with Magellan.
Magellan - but other tools are being considered as well. Yes, they
are useful.
Will have one Magellan in house for current project to test it out.
I believe it will be very useful.
Planning to use Magellan.
Planning on Magellan, but no data yet.
Not yet used... plans are there for Magellan use.
We have plans to use Synopsys Magellan moving forward. These are
useful tools.
Yes. Synopsys Magellan. My current project is just getting started
with it, so it is too early to report on its success, but Magellan
has proven useful on previous projects.
None as of yet. Starting to explore Magellan.
Magellan has seen some use, but management doesn't think it's useful
so it's never been staffed on a full-time basis. We play with it
from time to time.
Occasionally use Synopsys Magellan. Not part of main flow.
We tried out Magellan but concluded that it was far too immature to
give a useful return on the time invested.
Not my present project. However, I have seen the use of Jasper and a
switch over to Magellan eventually in my organization.
Yes, but not currently in our department. Magellan is used in
other departments.
Magellan. Useful if done right. Waste of time if not set up
correctly. The tool needs to be used by people who know what
they are doing.
Magellan. In my opinion it is questionable whether the time spent
formally verifying a small block is better spent aggressively testing
through simulation. Formal has the same problems as simulation; just
in a different form. "Overconstrained in formal" equals "undertested
in simulation". "Violations with no assertion to catch them" equals
"violations with no functional check to catch them".
Synopsys Magellan. It has been very complex and difficult to use.
No. We spent time trying to get Magellan to run a couple years ago
and were never completely successful. Real Intent Verix was used
several years ago on a project, and that project was eventually
killed after two full mask sets for a basic lack of functionality.
However, that doesn't necessarily lay the blame on Verix since the
tool is only as good as its operator. Conceptually, formal bug
hunters are a good idea, but the software was too cumbersome at the
time we tried it.
Not a lot. We use Magellan for some blocks that are simple in
formal and not simple in simulation.
Magellan. They are useful for small parts of the design, but the
state space for most designs is so very large that only small
portions gain advantage using formal methods.
Yes. Synopsys Magellan. Useful if used appropriately and sparingly.
Synopsys Magellan. Very interesting indeed to work with, very useful
for corner case problems, but we're questioning the value of such
tools for the primary verification approach. It will probably remain
a secondary supplementary tool.
Synopsys Magellan and Cadence ISV/IFV/BlackTie; I find them very useful.
Evaluating Synopsys Magellan and Cadence ISV.
No. Well, we have been looking at Magellan and Cadence IFV a bit.
I feel that these "bug hunters" are probably useful, but only if
the actual system/chip designers put in the effort to create the
assertions during design. Trying to add the assertions after the
fact (which is unfortunately what we sort of need to do, to try to
convince others how useful the "bug hunters" could be if they'd
only put in the up-front effort) is more of a waste of time.
Magellan, Jasper. Useful.
Yes. Magellan and 0-in. Tried Jasper for some proofs. Useful,
but they all require a 'guru' to drive.
Synopsys Magellan, and Mentor 0-in. We have found these tools to be
useful when used in one of two roles. The first is as a designer tool
during the early block development. The second role is to help with
coverage closure, to be able to prove missing cases.
Mentor 0-in in past. Useful so far. Considering Magellan for
current project.
We've tried 0-in and Magellan. Both are good tools with some value,
but neither offered enough bang for the buck in terms of tool cost
or engineering effort.
Mentor 0-in and Magellan. Great technology, especially for block-level
verification where designers can write some properties and verify the
control path while the simulation environment is getting developed.
Also, with formal verification earlier in the design verification
cycle, assertions get validated before being used in simulation.
Highly effective at catching corner-case bugs.
No use, but looking at Mentor 0-in
No, but are considering Mentor 0-in.
Currently evaluating Mentor 0-in and Atrenta PeriScope to assess
their usefulness.
We're playing with 0-in, and will probably buy it.
NO, but planning to purchase Mentor 0-in.
Mentor 0-in, I think the jury is still out on this.
0-in. Very useful for clock domain crossings and synchronization
mechanisms, which are hard to verify in simulations.
Yes, occasionally. 0-in.
Mostly 0-in, some Jasper.
We use the 0-In CDC analysis tool only. Very useful, although
somewhat painful.
0-in. Great for finding clock domain crossing problems.
We use Mentor 0-in to formally prove that some assertions cannot
fire, and then turn the assertions off in simulation to save the
simulation overhead of running the assertions. We tried to do
other formal proofs on whole blocks, but that has been difficult.
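The prove-then-disable flow this respondent describes can be sketched
with a standard SystemVerilog concurrent assertion plus the $assertoff
system task; the module and signal names below are hypothetical, for
illustration only:

```systemverilog
// Hypothetical checker illustrating the prove-then-disable flow:
// the assertion is first proven unreachable by the formal tool,
// then switched off in simulation with $assertoff to avoid the
// runtime overhead of evaluating it on every clock.
module fifo_checker (
  input logic clk,
  input logic rst_n,
  input logic push,
  input logic full
);
  // Claim: a push never arrives while the FIFO is full.
  a_no_overflow: assert property (
    @(posedge clk) disable iff (!rst_n) push |-> !full
  );

  // Simulation only: once formal has proven a_no_overflow, run with
  // +formal_proven to disable the now-redundant assertion.
  initial begin
    if ($test$plusargs("formal_proven"))
      $assertoff(0, a_no_overflow);
  end
endmodule
```

The payoff is that the simulation farm no longer pays for checks that
formal has already discharged, while the assertions stay in the source
for future formal runs.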
We have access to 0-in. Very useful as long as the design team has
the time to integrate interface assertions to properly constrain
the block. Many times we don't have this luxury.
Yes - we use 0-In. Evaluated all of them out there. Bug hunters
need resource investment, but they do find bugs. We had a
deadlock corner case found that had been missed by months
of simulation.
We use 0-In. They are very useful. 0-In is really easy to configure
and its CDC checking tool is very valuable.
Mentor 0-in - very useful for finding weird clock crossing boundary
cases, etc. These tools are all a pain to use (finding real vs
spurious errors) but 0-in has many common cases covered already.
Not currently. We used 0-in Search 5 years ago. It was not useful
at the time.
I have used 0-in and IFV. 0-in is faster, IFV is perhaps a little
easier to use. They're useful, but only after you take the time to
properly weed out all the false positives.
Yes, 0-in. Useful.
We use 0-In formal tools. They are useful but not a "silver bullet".
Yes -- 0-In -- would like to do more but have run out of time. They
are useful (have found bugs before).
Mentor 0-in. This is useful; we found some bugs, and Mentor 0-in has
quick, easy-to-use CheckerWare IP for common bus protocols and
design components like FIFOs, etc.
Yes. We use Cadence IFV and Mentor 0-in. I personally think it is
a waste of time. The ROI is very low.
No. Will be evaluating Cadence IFV soon.
No. Cadence ISV/IFV under evaluation.
Evaluating Cadence IFV.
We don't use it for now due to lack of budget, but probably will use
BlackTie in the future.
The intent is to try Cadence IFV on the current project. It would
be a type of trial.
No. Plan to use IFV. Not much use at this time. SoCs are
growing larger and these tools cannot handle such huge gate counts.
Cadence BlackTie, some value when used by designer at block level.
Starting with IFV tomorrow. No opinion yet, but it better be useful.
Only starting. Like IFV (ease of use).
Cadence IFV. Useful, but sometimes dynamic sensitization is better
than static sensitization.
We use Cadence IFV - absolutely recommendable!
Limited use of Cadence ISV/IFV/BlackTie.
No; use BlackTie but not as a bug hunter; waste of time
Yes. IFV - planned. They are useful.
I like Cadence IFV and have used it the most, though I've checked out
Averant Solidify and really like it as well. It's got some great
features. Our company bought Synopsys Magellan, go figure. To be
fair, Magellan works well, and if you're doing your development in
VCS, you're probably better off with Magellan since it uses the same
interface, and has the same level of language support. Same goes for
using Cadence IFV and Incisive together.
Limited Cadence IFV support. It is useful for small, new designs.
Yes, Cadence IFV, Very Useful
We use Cadence's IFV tool, which is not just a bug hunter like some
of the others. You can get proofs out of IFV. I think formal
analysis has a great future. Many of our implementers rely only on
formal analysis for their module bring up. Their designs are often
bug-free when integrated into larger designs. Also, their designs
tend to be more parameterizable and thus more reusable. They also
have tremendous error checkers on their interfaces to ensure that
the communication between blocks follows the rules specified.
Cadence IFV, mainly the automatic checks & they are definitely useful.
Few projects used Cadence BlackTie. Mostly waste of time.
BlackTie, useful at the IP level (part of our quality gate).
No, my project doesn't, but others had some luck with BlackTie.
Yes. IFV. Useful.
Yes. Cadence IFV/BlackTie. Useful.
Cadence ISV - very useful.
Yes, we use BlackTie.
We use Cadence IFV. I do not think this is a waste of time, since
it allows us to find and to fix bugs much quicker (i.e. to save
time in the always-too-long verification process).
Cadence IFV; very useful on module level and maybe for connectivity
checking on top-level (interrupts,...)
Cadence IFV. These tools are not just bug hunters. You can integrate
them in your overall verification flow and distribute the verification
task for certain features between static tools (more proper name) and
dynamic simulation.
Yes, Cadence IFV, very useful, saves time by finding bugs at block
level before integration, and because block level effort can be
reused during simulations at system level.
We use Cadence IFV and Magellan from Synopsys. They are useful both
as bug hunters as well as for proving design correctness (though
capacity limitations are real show-stoppers for bigger modules).
No. Rumor is IFV and Jasper are OK, plan to eval some formal tool.
We are investigating Jasper and may use it for production work.
Some work started with Jasper.
No, but we are looking into Jasper's tools.
On this project, we are not. I have used Jasper, and liked the
concept, but found it too difficult to get useful results.
Yes. Jasper. The formal tool is useful. It is very effective for
certain types of circuits.
Looked at Jasper. Didn't see enough reward for time as we have
dedicated verification engineers maintaining a testbench environment
around VCS.
Yes. Tried Jasper. Still not there yet, but promising.
Yes. We are using the Jasper tools and have found them to be very
useful. We have modified our DV methodology to include more of the
Jasper formal analysis based on our early results.
Jasper. Mixed response. Probably don't know tool capabilities yet.
Yes. Jasper. Jasper does a great job finding tough bugs in places
where formal tools are traditionally used. As expected, it finds
complex corner case bugs that wouldn't be found by other means. But
it also has an assertions development environment which helps write
and debug complex assertions.
Yes. Magellan and Jasper. Useful.
Yes. Mainly Jasper and Synopsys Magellan.
Used Jasper, now using Magellan. Moderately useful for catching
the last 0.01% of bugs.
Jasper. Maybe a little Magellan because we got free licenses in a
package deal. Useful - found some pretty strange corner-casey bugs
that probably never would have been found otherwise.
Yes (occasionally). Jasper and Synopsys Magellan. Useful, but
extremely time consuming. We limit usage to special tough cases.
Sometimes, Jasper and Synopsys Magellan. They can be useful under
certain circumstances; oftentimes, though, we do not have time to
use them.
We are currently looking at Jasper and Mentor 0-In.
Yes. We use Jasper and 0-in. We used to use Magellan. These tools
are useful, but require a large investment in a dedicated staff and
careful choice of application area. Bugs found this way tend to cost
more per bug than bugs found through simulation, but these tools tend
to find obscure situations that would otherwise likely be missed.
Certainly these tools still have a long way to go to be truly mature.
Yes, Jasper, 0-in, Magellan
Yes. Jasper, Mentor 0-in and Synopsys Magellan. Useful.
No. Once we have an assertion based methodology in place, we may
evaluate Jasper, Mentor 0-in, and Real Intent Verix. I have had
both Mentor and Real Intent come to our site for presentations.
Yes - Rulebase, Jasper. Useful on a small block basis. The tools are
still not mature enough for widespread use but nonetheless useful for
both designers and DV engineers. We use the JasperGold tool on our
current project, and it has had some success with some blocks.
Yes - IBM RuleBase. Useful - wouldn't do another project without it!
IBM RuleBase (mostly waste of time)
IBM Rulebase for IP verification. Very useful.
No. I've used RuleBase in the past, and it is an excellent tool for
bug-hunting.
IBM Rulebase. Useful BUT for very specific cases (e.g. bus protocol
checking).
We're using IBM RuleBase as a "complete" verification tool for IPs
which are simple and/or regular enough to be appropriate. Some
blocks don't get simulated at all at the IP level, and just a
simple simulation at SoC level.
Yes, IBM RuleBase, useful
IBM RuleBase for model checking. We write our own functional
properties from the data sheet, and are not using any built-in
libraries or assertion templates.
Used to use IBM RuleBase and still think it's the best, but now using
Real Intent Verix and ramping up on Synopsys Magellan. Cadence IFV
seemed equivalent to Synopsys Magellan, perhaps easier to use, but
business process, not technical merit, resulted in Magellan over IFV.
We find good value in these tools.
Yes, IBM Rulebase and Mentor 0-in.
IBM RuleBase + Atrenta Periscope (starting)
Yes. Real Intent Verix. Useful if mature and working properly. That
doesn't seem to be the case with Verix.
Real Intent Verix - for asynchronous clock domain crossings, and it
has been very useful!!! Real Intent Puretime - for verifying
multicycle paths, and it has been very useful!!! With both tools we
have identified & fixed problems that would have caused a "re-spin".
No. Thinking about using Real Intent. Bug hunters are useful.
Yes, I use Verix. Very useful at the beginning of a design cycle.
Yes. Real Intent Verix and BlackTie. Waste of time, I think.
Real Intent Verix and Cadence IFV. They have been useful for
pointing out specific types of problems.
Real Intent Verix and Atrenta - prefer Verix due to ease of setup
and familiarity. Tool still needs to evolve to handle complex
ASIC sub-systems.
I have used Averant Solidify and it found some very interesting bugs
starting from just a few assertions. I was very skeptical about these
tools until I experimented with Solidify and it instantly ruined my
confidence in a design that was supposed to be fairly well tested. ;-)
The counter example VCD waveforms are great for debugging.
Averant Solidify. They do find real bugs. But you need to get up
to speed with them.
Solidify from time to time. They are helpful provided time is
allotted to them.
Averant Solidify, Jasper JasperGold. VERY useful! But it helps
a lot if you know how to use them. ;-)
Yes. We used to use Solidify from Averant (a very good tool BTW). We
are now moving to Synopsys' Magellan (nowhere near as good, but it is
free with our current license agreement). I have found Solidify to
be very useful. Not sure about Magellan.
Moving from Solidify to Magellan (another cost reduction)
Yes, OneSpin 360MV; useful in special situations, e.g. ECO
Use OneSpin 360MV. They are useful in targeting certain pieces of
the design, but need to be targeted carefully as they can take longer
to verify certain pieces of the design than other techniques.
Yes, OneSpin Solutions 360MV; very useful for certain verification
tasks! Again, why don't you mention OneSpin? They have been around
since last DAC!
Yes, OneSpin. Useful when backed up with "normal" functional
verification.
OneSpin's solution - they are ahead, if by "bug hunters" you mean
property checking.
Future maybe Atrenta. Don't know.
Not yet; will be using more moving forward, most likely Atrenta.
We are going to start using Atrenta Periscope (or Spyglass CDC). I
evaluated this tool for a bug we had in our design that took us a few
man-months to locate. It was due to a pulse stretch that resulted in
a small time for de-assertion. We were able to locate the problem
once we set up the tool correctly. We had to do a little tweaking,
but Atrenta has promised us an enhancement that will not need this
tweaking. Given the types of CDC issues I have seen, I think using
a tool like this is not a waste of time.
We have looked at Atrenta PeriScope and Jasper (some groups are using
both) but we are not using either in our current project.
Answer = Planning to use Avery Insight
Yes. Homegrown. Useful.
We use internal proprietary tools. Evaluating some vendor tools.
Not bought any yet... done some light evaluation.
No, they are extra software to 'go wrong'.
No - they take a lot of time (more than conventional methods), but
when complete they are more exhaustive.
No. Maybe useful.
Do not use. Impression is that they are useful but we don't have
resources to dedicate.
No. I suspect these would be useful - more so than linters.
No, waste of time.
No, but we are interested.
Waste of time. The amount of development required on constraints and
properties exceeds the effort to fully develop a Vera-based testbench.
Great at finding bugs you already found, though. ;^)
No formal bug hunters. We invested a great deal of time and effort
into an excellent simulation environment and methodology. We get
lots of bang for the buck with this "old fashioned" method.
No. Waste of time; simulation and lint cannot be replaced.
No, good design practice should alleviate the need.
No, I think they are waste of time.
No. Some people in our group tried a couple early in our project,
but we didn't have much success.
No, but in the past I have found them useful.
No, they were seen as waste of time in the past.
No. I think these tools are only useful if you're starting a fresh
project with no legacy whatsoever. For upgrade projects, the reality
is the schedule dictates how much time you have for verification.
No, we don't use them. Though I can imagine that re-writing a design
(or the spec, or intent) in another language can certainly uncover a
lot of things. For that purpose I have looked at SysML / UML, but
there is no tool support for those yet.
No. I suspect they are only as useful as the assumptions they
require as an input. Some pressure to look into them.
No. We definitely cannot afford these tools, and I don't think
I could make a case to get them.
No, but we would like to try these, if we can get some indication
from a third party that they are not a complete waste of time.
No. Usefulness depends completely on the design in question.
No, they need dedicated personnel who normally cannot be spared
from other tasks.
Evaluated, found them to be a waste of time. Do not use in
current project.
No, I think they are a waste of time.
Formal analysis is a waste of time. I have evaluated tools by
multiple vendors and they were all crap. They all have memory
capacity issues, require too much hand holding, are useless on
datapaths and useless beyond a certain number of states which
makes them impractical when designing real chips, under real
deadlines and with limited resources to boot...
No, we have not used these on this project. I believe they are
useful if run by the RTL-designers locally for block-level
assertion/property checking.
No. Good, but hard to justify the cost before a major melt-down.
No. I think they would be useful, but I've never used one.
NO, I'm wondering if they're useful as well.
No, not yet convinced of their value.
No. We used formal tools on onesie-twosie blocks before. Though
useful in special cases, overall very bad ROI.
Not at present. We've been looking at them for years.