( ESNUG 538 Item 3 ) -------------------------------------------- [03/28/14]

Subject: Atrenta frustrated by user's flawed eval of 7 constraints tools

> Our management decided we had to clean up and streamline our flow.  They
> required us to benchmark our 7 SDC constraints tools in order to pare it
> down to 1 or possibly 2 tools. ...  After those first cuts, we decided to
> proceed benchmarking Fishtail, Ausdia, Excellicon and Atrenta. ...
>
>     - "User evals Fishtail, Ausdia, Atrenta, Excellicon, Blue Pearl"
>        http://www.deepchip.com/items/0537-01.html


From: [ Manoj Bhatnagar of Atrenta ]

Hi, John,

I'm the Sr. Director of Atrenta's FAEs.  Let me state up front that
Atrenta fully supports independent user evaluations.  This user
provided a significant level of detail in both criteria and analysis,
and the industry as a whole benefits from competitive comparisons
like this one.

That said, there are a few concerns with the conclusions this user
reached about SpyGlass Constraints that I would like to address.

          ----    ----    ----    ----    ----    ----    ----

> Looking only at SDC verification, we found significant differences existed
> between them.  We classified them as follows:
>
>     1. SDC Linters - The SDC linters based on generic checkers use
>        older technology and are primarily rule-based, i.e., rules
>        are used to check the syntactical correctness of the SDC.
>        Atrenta Spyglass, Synopsys GCA, and Cadence CCD fall in this
>        category.  Fishtail Confirm for the most part also falls in
>        this category, although some parts of it overlap into the
>        SDC Analyzer category.
>
>     2. SDC Analyzers - These perform SDC linting but their primary
>        focus is to analyze the timing intent of the SDC on the
>        design using semi-formal techniques.  For example, our SDC
>        may be syntactically correct, but the resulting application
>        on the design is still incorrect.  These types of problems
>        are very hard to find and usually only show up during the
>        gate-level sims.  Excellicon Concert and Ausdia TimeVision
>        belong to this category.


Why do this?  The major players cover both of these domains.  SDC
Linting is the baseline entry point into SDC Verification.

Specifically, our SpyGlass Constraints provides several functions that fall
under the author's SDC Analyzers class - "primary focus is to analyze the
timing intent of the SDC on the design using semi-formal techniques".

- SpyGlass Clock Waveform Verification - Verifies (using formal
     techniques) that the waveforms of the clocks match the design's
     implementation intent

- SpyGlass Timing Exception Verification - Verifies (using formal
     techniques) False Path (FP) and Multi-Cycle Path (MCP) against
     design intent (see the snippet after this list)

   - SpyGlass Input/Output delays interacting with multiple clocks

   - SpyGlass Case Analysis propagation conflicts
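
To make the exception check concrete, here is a hypothetical snippet
(illustrative clock and port names, not from the user's designs) of
the class of bug it targets:

    # Syntactically legal SDC -- any linter's parser accepts it.
    create_clock -name clk_a -period 10 [get_ports clk_a]
    create_clock -name clk_b -period 12 [get_ports clk_b]

    # Semantically wrong if real data crosses clk_a -> clk_b: the
    # exception hides a genuine timing path, and the bug typically
    # surfaces only in gate-level sims.  Formal exception
    # verification checks this declaration against the design itself.
    set_false_path -from [get_clocks clk_a] -to [get_clocks clk_b]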

I don't see the relevance of classifying these tools as SDC Linters
versus SDC Analyzers.  It adds nothing to the benchmark.

          ----    ----    ----    ----    ----    ----    ----

Now I'd like to reply to the specific technical critiques that this
user lodged against SpyGlass Constraints on an item-by-item basis:

SDC VERIFICATION TOOLS:

  Noise:

       SpyGlass Constraints was dinged as the

          "noisiest of all tools, having over a 1000 rules which
           triggered multiple times".

GuideWare, our set of pre-defined methodologies, is delivered
       as part of SpyGlass Constraints.  The tool employs ~300 rules
       in total, and GuideWare recommends approximately 100 specific
       rules for constraints work.  For very large designs, our
       releases also provide a user option to suppress certain SDC
       parser messages that can arise from interoperability issues
       between different implementation tools.

  Multi-mode:

       The user gave two rival tools high marks for multi-mode and
       our tool a low mark.

Atrenta was the first in the industry to introduce multi-mode
       SDC analysis (2006).  In 2008 this was enhanced to support
       multi-mode coverage.  Our flow fully addresses common issues
       such as flops left unconstrained in all modes.
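
       For readers new to the problem, a generic two-mode sketch
       (hypothetical file, pin, and port names; not SpyGlass syntax):

           # func_mode.sdc
           create_clock -name pll_clk -period 1.25 \
               [get_pins u_pll/CLKOUT]
           set_case_analysis 0 [get_ports test_mode]

           # test_mode.sdc
           create_clock -name tck -period 40.0 [get_ports tck]
           set_case_analysis 1 [get_ports test_mode]

           # A flop reachable only when test_mode == 1 is
           # unconstrained in func_mode.sdc.  Multi-mode coverage
           # flags any flop left unconstrained in *all* modes.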

  Shell:

       The user said

           "All tools except SpyGlass had a nice Tcl shell".

Our entire SpyGlass Platform has full Tcl shell support.  The
       migration took several major releases, and the Tcl shell is
       now mainstream across all SpyGlass tools.
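
       As an example of what a Tcl shell buys the user -- ad-hoc
       queries against the loaded design and constraints.  The
       commands below are generic Synopsys-style Tcl shown for
       illustration only; the exact SpyGlass command set may differ:

           # List every clock with its period.
           foreach_in_collection c [all_clocks] {
               set n [get_attribute $c full_name]
               puts "$n period=[get_attribute $c period]"
           }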

SDC GENERATION:

  Promotion:

       The user said

           "... provides two methods of constraints promotion;
            push constraints to chip boundary, or add hierarchy
            to existing constraints.  Atrenta follows the
            latter approach".

SpyGlass Constraints does push constraints to the chip
       boundary when needed.  Also, SDC promotion in our tool is a
       single-step process.
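
       A hypothetical sketch of the two promotion styles (made-up
       instance, port, and clock names):

           # Block-level constraint at the core's own boundary:
           set_input_delay 0.4 -clock core_clk [get_ports data_in]

           # (a) Add hierarchy: the same intent re-expressed on
           #     the instance pin once the block sits in the chip.
           set_input_delay 0.4 -clock core_clk \
               [get_pins u_core/data_in]

           # (b) Push to the chip boundary: the intent moved to
           #     the top-level port feeding that pin, with the
           #     value re-budgeted for top-level path delay.
           set_input_delay 0.6 -clock core_clk \
               [get_ports chip_data_in]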

  Demotion:

Although the user gave SpyGlass Constraints a score for
       demotion, it is not currently a capability we offer.  (Huh?)
       We are assessing the market need for constraints demotion.
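
       For context, demotion generally means deriving block-level
       constraints from a signed-off top-level SDC so a block can be
       analyzed standalone.  A conceptual sketch with made-up names:

           # Top-level SDC, signed off at the chip boundary:
           create_clock -name sys_clk -period 2.0 [get_ports clk]
           set_input_delay 0.6 -clock sys_clk [get_ports din]

           # Demoted SDC for block u_core, re-expressed at the
           # block's own ports, with the delay re-budgeted for the
           # top-level logic in front of the block:
           create_clock -name sys_clk -period 2.0 \
               [get_ports core_clk]
           set_input_delay 0.9 -clock sys_clk [get_ports core_din]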

  Clock detection:

       The user said

           "... SpyGlass did not do well here.  The technology is
            based on simple tracing.  On one of our complex clocking
            designs, these two tools failed to identify some
            generated clocks".

       Since SpyGlass Constraints shares the same engine as our flagship
       SpyGlass CDC, I am perplexed by this conclusion.  ???
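
       For context, the kind of structure at issue (hypothetical
       names): a flop-based divider whose output must be detected
       and constrained as a generated clock -- miss it, and
       everything it clocks goes unconstrained:

           # Master clock at the chip boundary:
           create_clock -name fast_clk -period 2.0 [get_ports clk]

           # Divide-by-2 clock from a feedback flop; structural
           # tracing has to recognize u_div/q_reg/Q as a new
           # clock root:
           create_generated_clock -name slow_clk -divide_by 2 \
               -source [get_ports clk] [get_pins u_div/q_reg/Q]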

  Merged Mode SDC: 

Although the user gave SpyGlass Constraints a score for this,
       merged-mode SDC is currently under controlled release to only
       a few select Atrenta customers.  Our merged-mode support
       covers not only clocks, but also case analysis, I/O delays,
       FP, MCP and clock sense.
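
       A minimal sketch of what a merged-mode SDC has to capture,
       reusing the hypothetical mode files from the multi-mode note
       above (not actual SpyGlass output):

           # Both mode clocks now live in one file:
           create_clock -name pll_clk -period 1.25 \
               [get_pins u_pll/CLKOUT]
           create_clock -name tck -period 40.0 [get_ports tck]

           # The modes never coexist, so the merge must declare
           # the clocks mutually exclusive rather than time them
           # against each other:
           set_clock_groups -physically_exclusive \
               -group pll_clk -group tck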

  Management:

Missing from this eval is constraint equivalence.  Constraint
       equivalence ensures design intent is preserved when
       constraints change or move between levels of hierarchy.  In
       our experience this is an absolute must-have requirement for
       customers.  (A short illustration follows at the end of this
       item.)

       Our tool does:

           - Equivalence of two SDCs for the same design
           - Hierarchical Equivalence to ensure consistency of
             top vs. block constraints
           - Equivalence between SDCs for two LEC clean designs

       Yet this user doesn't even touch on constraint equivalence???
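
       To illustrate why a text diff is not enough, two hypothetical
       SDC fragments that look different but impose the same intent:

           # SDC "A": pairwise exceptions between async clocks
           set_false_path -from [get_clocks pll_clk] \
                          -to   [get_clocks tck]
           set_false_path -from [get_clocks tck] \
                          -to   [get_clocks pll_clk]

           # SDC "B": the same intent as a clock-group declaration
           set_clock_groups -asynchronous -group pll_clk -group tck

           # Line by line the files differ; constraint equivalence
           # checking proves they describe the same timing intent.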

I will close by reiterating that our FAEs will gladly support any
customer who wants to evaluate our tools -- because we know they'll
do well -- but also to ensure the evaluator has the latest rev and
the latest information on how best to use them.

But I must add that I was personally frustrated to see SpyGlass Constraints
unfairly dinged in an eval with so many holes and flaws in it.  This eval
was so messed up we even got positive scores for specific features that we
currently don't have.  That's just plain wrong.

    - Manoj Bhatnagar
      Atrenta                                    San Jose, CA

          ----    ----    ----    ----    ----    ----    ----

Related Articles

    "User benchmarks Fishtail, Ausdia, Atrenta, Excellicon, Blue Pearl"
     http://www.deepchip.com/items/0537-01.html

