( DVcon 05 Item 13 ) -------------------------------------------- [ 10/25/05 ]

Subject: 0-In, Jasper, Synopsys Magellan, Real Intent Verix, Averant, @HDL

0-In DOWN; MAGELLAN UP -- Most verification groups are still skeptical about
trusting their process to "bug hunters", with 72% this year saying they don't
use them.  Last year it was 70% non-users.  Everyone's waiting for someone
else to use bug hunters first.

  2004 - "What do you think about "bug hunters" like 0-in, Jasper, @HDL,
          Real Intent, Averant, or Synopsys Magellan (Ketchum)?  Does
          your company use such tools?"

                don't use :  ################################### 70%
                     0-In :  ####### 14%
        Synopsys Magellan :  ## 5%
                   Jasper :  ## 5%
        Real Intent Verix :  # 4%
         Averant Solidify :  # 3%
                     @HDL :  # 2%
         Cadence BlackTie :  1%

  2005 - "What do you think about "bug hunters" like 0-In, Jasper, @HDL,
          Real Intent, Averant, or Synopsys Magellan or Atrenta PeriScope?
          Does your company use such tools?"

                don't use :  #################################### 72%
                     0-In :  ### 7%
        Synopsys Magellan :  #### 8%
                   Jasper :  ## 3%
        Real Intent Verix :  ## 5%
         Averant Solidify :  # 2%
                     @HDL :  # 2%
         Cadence BlackTie :  1%
             IBM RuleBase :  # 2%
        Atrenta Periscope :  1%
                  Calypto :  1%
             Avery Design :  .5%

Subtracting out all 70% of the "don't use" people for 2004 leaves:

                0-In :  ################################################ 47%
   Synopsys Magellan :  ################ 16%
              Jasper :  ################ 16%
   Real Intent Verix :  ############# 13%
    Averant Solidify :  ########### 11%
                @HDL :  ######## 8%
    Cadence BlackTie :  ### 3%

Subtracting out all 72% of the "don't use" people for 2005 leaves:

                0-In :  ########################## 26%
   Synopsys Magellan :  ############################## 30%
              Jasper :  ########## 10%
   Real Intent Verix :  ################ 16%
    Averant Solidify :  ####### 7%
                @HDL :  ####### 7%
    Cadence BlackTie :  ##### 5%
        IBM RuleBase :  ####### 7%
   Atrenta Periscope :  ### 3%
             Calypto :  ### 3%
        Avery Design :  ## 2%

I was surprised to see that 50% 0-In drop.  The Magellan 2X rise wasn't news.
The Synopsys salesdroids have been pushing it big time.  But that 0-In drop
caught me off guard.  My guess is that it's Mentor/0-In acquisition pains.
Or maybe the fact that 0-In assertions are proprietary zapped them.  (0-In
assertions dropped 50% in favor of SVA and OVA assertions going up 2X in
2005.)  I don't know.  2006 will tell.

I didn't expect those IBM RuleBase numbers either.  I didn't mention RuleBase
in the question, yet it got 7% mindshare.


  Never used these tools.  From what I've read about Synopsys Magellan,
  it sounds like a great concept to me.  However, it will be several
  years before we venture down that road.

      - Tony Lanier of Harris Corp.


  Starting to play with Magellan.  Too early to tell if it's useful or a
  waste of time.  I expect it will be useful.

      - Matt Weber of Silicon Logic Engineering, LLP


  Using Averant.  Never going back to doing without.  Very user friendly,
  though not sure how strong their engine is compared to others.  Our
  latest interest: we have started investigating IBM RuleBase.

      - Elchanan Rappaport of Lynx Photonic Networks


  You have not mentioned the best - IBM RuleBase.  It is really great.
  I used it in my former company and it really finds bugs quickly.

      - [ An Anon Engineer ]


  Evaluating now.  No VHDL support in 0-In, @HDL.  Your list is missing
  IBM RuleBase and Cadence Incisive Static Verifier (Verplex BlackTie).

      - [ An Anon Engineer ]


  Magellan appears to be the most stable and capable; Atrenta Periscope
  is a bit young, but it could grow into something interesting.

  We cannot say much about 0-In (we will start an evaluation as soon as
  its VHDL support is released).

  However, the best tool (in terms of performance, at least) is still
  IBM's RuleBase.

  We have 6 verification engineers using Magellan and 15 verification
  engineers using IBM RuleBase via a pre-defined package of properties.

      - [ An Anon Engineer ]


  Have used IBM RuleBase for functional formal checking and was very
  impressed with its performance debugging complex control logic in
  the execution unit of a processor.  Absolutely useful for
  block-level control logic verification; it weeds out bugs much
  earlier than a directed/random method would.

      - [ An Anon Engineer ]


  We use Real Intent for advanced linting, Magellan for bug hunting
  and Jasper for full proof.

      - [ An Anon Engineer ]


  My company has been using Verix "Implied Intent" from Real Intent for
  more than 3 years.  We found quite a few bugs with it at the early
  design stages for many projects over those past 3 years.

      - Peter Chang of Sun Microsystems


  We use Verix and Spyglass for clock domain analysis.  Both are
  extremely valuable.

      - [ An Anon Engineer ]


  The only other tool we use beside our simulator is Verix from Real
  Intent.  I evaluated it twice on previous occasions and finally was
  able to purchase a copy.  None of our team members had used it before.
  They now appreciate having Verix around.  We are using the Implied
  Intent mode right now, which does not require any assertions.  We use
  it mainly after the RTL is coded, just before we start simulating.
  All the bugs we found would eventually have been found through
  simulation.  However, Verix finds them early and easily, saving us
  tons of time tracking down these problems.  It is definitely a
  time-saving tool worth the money.

  Using Verix and OVL assertions definitely increased our debugging
  productivity.  The OVL assertions pointed out problems way before the
  effect was seen and enabled us to pin-point the source of the problem
  right away.  These 2 techniques increased our productivity in a
  visible way.

      - J.D. Allegrucci of GiQuila Corp.
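
That "pin-point the source" benefit is the whole game with assertions.  A toy
sketch (illustrative Python only, not OVL's actual API) shows the idea: the
check fires at the exact cycle an illegal value first appears, instead of
letting corrupted data surface thousands of cycles downstream.

```python
def assert_onehot(signal: int, width: int, cycle: int) -> None:
    """Hypothetical one-hot assertion: fail at the first cycle the
    select bus holds anything but exactly one asserted bit."""
    if bin(signal & ((1 << width) - 1)).count("1") != 1:
        raise AssertionError(
            f"one-hot violation at cycle {cycle}: {signal:0{width}b}")

# Legal one-hot values pass silently...
for cycle, sel in enumerate([0b0001, 0b0010, 0b0100]):
    assert_onehot(sel, 4, cycle)

# ...while an illegal value is flagged at its own cycle (cycle 3 here),
# long before any downstream effect would be visible in a waveform.
try:
    assert_onehot(0b0110, 4, 3)
except AssertionError as e:
    print(e)  # one-hot violation at cycle 3: 0110
```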


  We recently tried Real Intent again, but it has very long run times
  on our RTL and was not able to find one simple bug (a parallel case
  violation) that we found late in the project.  In other words, a lot
  of hot air, but no dice.

      - Jean-Paul van Itegem of Philips Semiconductors


  We use the Real Intent Verix for clock domain checking.  It is useful.

      - Tom Paulson of Qlogic


  We are using the Real Intent CIV for clock domain crossing (CDC)
  analysis and have found it to be very useful.  Most CDC faults can't
  be found in simulation, and static analysis like CIV's is more
  comprehensive than a dynamic simulation-based methodology.  However,
  it is still necessary to check some assumptions, like signal
  stability and gray coding, in simulation.

      - [ An Anon Engineer ]
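
On that gray coding point: the property a CDC checker leans on is that a
multi-bit pointer sampled across clock domains must change by at most one bit
per cycle.  A minimal sketch in Python (illustrative only, not Real Intent's
tool) checks a sampled pointer trace for exactly that:

```python
def gray_encode(n: int) -> int:
    """Standard binary-to-Gray conversion."""
    return n ^ (n >> 1)

def is_cdc_safe(trace: list[int]) -> bool:
    """True if consecutive samples differ in at most one bit, so an
    asynchronous sampler can never capture a half-updated value."""
    return all(bin(a ^ b).count("1") <= 1
               for a, b in zip(trace, trace[1:]))

# A Gray-coded FIFO pointer passes the check...
print(is_cdc_safe([gray_encode(i) for i in range(8)]))   # True
# ...but a plain binary counter fails (e.g. 3 -> 4 flips three bits).
print(is_cdc_safe(list(range(8))))                       # False
```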


  We evaluated a couple bug hunters, but not thoroughly.  We get so much
  bang from Vera, code coverage and assertions, we seem to find all our
  corner case bugs (knock on wood!) without bug hunters.

      - Dan Joyce of Hewlett-Packard


  We have not yet seriously considered using proving tools like
  Magellan for the current project.

      - [ An Anon Engineer ]


  Magellan seemed interesting, but I don't think we're quite to that
  level yet.

      - Tom Mannos of Sandia National Labs


  We use 0-In.  We are just starting to use the formal tools.  I
  believe it is the next verification area that will give a boost
  in productivity.  I can easily see it replacing block level
  simulation in the next 5 years.

      - [ An Anon Engineer ]


  0-In is effective but has higher than expected sim time overhead.

      - [ An Anon Engineer ]


  We've used Avery Design.

      - Dan Steinberg of Integrated Device Technology


  0-In, especially for clock domain crossings.

      - [ An Anon Engineer ]


  We think they have promise, but are concerned about maturity.  We've
  looked at 0-In many times over the past few years, but have always
  backed off due to the way they made you use pragmas.  We think
  Magellan has promise due to their use of SVA.  We're evaluating
  Magellan right now, but I don't think we'll use it in anything
  official for a while.

      - Jonathan Craft of McData Corp.


  We had a lot of success with 0-In's Clock Crossing tool.

      - [ An Anon Engineer ]


  We are curious bystanders at the moment, most interested in 0-In,
  especially in the area of clock-crossing checks etc.

      - [ An Anon Engineer ]


  We don't consider Averant Solidify a bug hunter, but rather an
  independent way to prove correct behavior while authoring RTL.  That
  said, we also have teams using SFV to accomplish black box
  verification (bug hunting, if you will).  Bugs found at the point of
  authoring are hard to track, so I can't cite numbers, but the
  designers' feedback is that they like the tools.  In addition, with
  SFV we have definitely found bugs that otherwise would have survived
  all the way to silicon.

      - [ An Anon Engineer ]


  We use Solidify from Averant and it is extremely useful.

      - [ An Anon Engineer ]


  Using Magellan in combination with both OVA and SVA.  Best formal tool
  I ever saw!

      - Michael Roeder of National Semiconductor GmbH


  Have done some work with Magellan.

      - [ An Anon Engineer ]


  Currently evaluating with serious intent to buy.  Jasper targets
  verification with higher levels of design specification, which is
  attractive.  0-In understands the fundamental principles of formal
  verification better than any other company.  Magellan is relatively
  easy to use, especially for VCS customers.  It's still a toss-up
  which one wins here.

      - [ An Anon Engineer ]


  Have not looked at them.

      - Frank Vorstenbosch of Telecom Modus Ltd.


  We use Jasper.  It has proven itself a couple of times catching fatal
  bugs.  Definitely not a waste of time.

      - Dave Ferris of Tundra Semiconductor


  Several years ago we evaluated 0-In Search with no great success.
  The problem is that our product is a processor, therefore it is very
  hard for us to specify all the constraints that a formal verif tool
  needs.  It sounds like Jasper is ahead in the game.

      - [ An Anon Engineer ]


  0-In analysis is somewhat useful.

      - Harish Bharadwaj of LSI Logic


  We used 0-In in production on a recent ASIC.  It was helpful, but the
  value judgement (time/money spent vs. return) would be very close to
  the line between useful and a waste of time/money.

      - [ An Anon Engineer ]


  I tried 0-In briefly and found it's not suitable for my purpose.  I
  was trying to prove that certain behavior could never happen in an
  old gate-level design.  Even with the help of their FAE, we couldn't
  get the tool to either prove or disprove it.

      - [ An Anon Engineer ]


  Have used Jasper and found it very useful.  The only problem is it is
  very hard to use.  These "bug hunters" are for design engineers, not
  for verification engineers.

      - [ An Anon Engineer ]


  Potentially interesting and we keep looking, particularly at 0-In (since
  we use their other tools) and Jasper but we haven't found traction yet.

      - Kevin Jones of Rambus


  We've done some evaluations, but haven't purchased any of these.

      - [ An Anon Engineer ]


  We're trying Magellan.  We'll know in a year from now.

      - Christian Mautner of Integrated Device Technologies


  No experience.

      - Juan Carlos Diaz of Agere Systems


  Used @HDL.

      - [ An Anon Engineer ]


  We use @HDL, BlackTie and 0-In.  Definitely useful, especially if
  you run them early.  We find clock domain crossing extremely valuable.
  Conformal has the best debug environment and 0-In has the best
  fundamental checking technology.

      - [ An Anon Engineer ]


  @HDL used.  Very helpful.

      - George Matthew of SiNett Corp.


  @HDL clock analysis, synchronizer analysis.  They saved our butts
  numerous times.

      - Andrew Peebles of Cortina Systems


  Not used.  Briefly evaluated @HDL and Real Intent.  The @HDL tools was
  able to complete "automatic assertion checking", but Real Intent could
  not.  @HDL had the better GUI.  Budget/time killed an extensive eval.

      - [ An Anon Engineer ]


  We use Jasper.  It's useful.

      - [ An Anon Engineer ]


  Jasper.  Useful but a BIG effort to use.

      - [ An Anon Engineer ]


  A new guy (coming from Apple) in our group says that Jasper Gold is
  about as cool as cool gets.  As for me, it still seems like a tool
  looking for a problem.  It seems to me that if your code is that
  hard to test, it's time to hire some better designers.

      - [ An Anon Engineer ]


  We use Real Intent Verix and are evaluating replacing it with Atrenta
  Periscope due to their far superior debug environment.  I find 0-In
  and Jasper to be massive overkill and way too expensive.

      - [ An Anon Engineer ]


  Since we already use Spyglass, and were not entirely happy with its
  clock domain crossing checks, we looked into using Atrenta Periscope.
  Unfortunately Periscope also has some significant holes in its
  clock crossing analysis, and it requires a lot of user input to
  constrain the state space.  There's just too much manual work for
  Periscope to be useful at this time.  I do want to use a "bug-hunter"
  in my next project, so I'll take another look at Periscope in a few
  months.

      - [ An Anon Engineer ]


  We are using 0-In.  We also evaluated Magellan.  These have lots of
  potential, but they need thorough assertions and Vera-like checkers
  written first, and we have not developed those thoroughly enough yet.

      - [ An Anon Engineer ]


  Use BlackTie, looking at Cadence's ISV.
  Have also used Magellan.
  Have found several non-trivial bugs with BlackTie and Magellan.

      - [ An Anon Engineer ]


  No experience -- well some, but not with these tools and not enough
  to comment.

      - George Gorman of LSI Logic


  We are now beginning to try Synopsys Magellan.  I think it will be
  useful for our project.

      - Jiye Zhao, Chinese Academy of Sciences


  Magellan has been evaluated, but we have yet to see it in a live project.

      - [ An Anon Engineer ]


  No, we might want to try Magellan in the future.

      - Tuan Luong of Integrated Device Technologies


  We use Magellan.  The objective, catching unexpected corner case
  bugs, is a good one.  But most often the top level functional
  properties are neither proved nor disproved.  And sometimes after
  months of effort, it's still not one way or the other.

      - [ An Anon Engineer ]


  We have a Magellan license, but I have never used it.

      - John Stiles of Silicon Logic Engineering


  Our side uses Magellan.  We evaluated 0-In and selected it as the tool
  of choice in 2001.  But because of budget restrictions we never used
  it in projects.  Magellan caught up since then and works OK.

      - [ An Anon Engineer ]


  Tried 0-In and Magellan.  Neil Johnson recently dropped you a note on
  our Magellan experience.  As for 0-In, we did not spend enough time
  integrating their tools into our flow.  There was some clashing with
  our internal tools.  This was no fault of 0-In, but rather Altera not
  allocating enough resources to tackle the problem.

      - Ian Perryman of Altera


  Not yet, but will do so.  Will most likely look at Magellan or Jasper
  since we've heard good things about them -- However, we'll let others
  find the bugs in these tools first!

      - Rajen Ramchandani of Mindspeed Technologies


  We use Jasper.  But after attending DVcon & learning more about
  Magellan, I am kind of leaning towards it.  We are interested in
  evaluating Magellan because its hybrid architecture, with formal
  engines and VCS built in to verify design properties, sounds very
  interesting.

      - Azeez Chollampat of PLX Technology


  Magellan is a super tool for finding deep corner bugs.  But it's an
  extreme time muncher.  Takes a ton of time.

      - [ An Anon Engineer ]


  No use.

      - Michiel Vandenbroek of China Core Technology Ltd.


  Funny way to spend money.  I'd rather buy some good alcohol.

      - [ An Anon Engineer ]


  Not tried!

      - Karthik Kandasamy of Wipro


  We have evaluated most of them on the automatic properties (branch
  enable, pragmas, etc.).  However, there is no real interest from the
  design community in this kind of tool, mainly because of the
  difficulties in analyzing the results these tools produce.

      - Olivier Haller of STMicroelectronics


  I like Atrenta's Periscope the best.  It can do everything the others
  can, and it supports PSL assertions.  It's also fast on Linux.  This
  is helpful because you can do basic and directed verification without
  running a dynamic simulation.

      - [ An Anon Engineer ]


  We don't use these, but have an interest in FPGA bug-hunting logic
  analysis tools like Synplicity Identify or Xilinx ChipScope.  We
  currently use ChipScope and it is highly effective.

      - [ An Anon Engineer ]
Copyright 1991-2024 John Cooley.  All Rights Reserved.