( ESNUG 558 Item 7 ) ---------------------------------------------- [02/26/16]

Subject: Vigyan of Oski on DVcon'15, Aart, formal guidelines, goal-posting

  Hi, John,

  I'm following in the tradition of giving DeepChip our own internal
  DVcon'15 Trip Report in anticipation of the upcoming DVcon'16 next week.
       
      - Vigyan Singhal
        Oski Technology, Inc.                  Mountain View, CA

          ----    ----    ----    ----    ----    ----    ----

DVCON ATTENDANCE NUMBERS

   2006 :################################ 650
   2007 :################################### 707
   2008 :######################################## 802
   2009 :################################ 665
   2010 :############################## 625
   2011 :##################################### 749
   2012 :########################################## 834
   2013 :############################################ 883
   2014 :############################################ 879
   2015 :############################################### 934

DVcon saw nice growth this year -- 55 additional full conference attendees,
with 39% first-timers.  Many of the newcomers were from outside the United
States (China, Taiwan, Korea, and Europe), making DVcon a more global event.

I saw a lot of formal verification talk at DVcon this time:

      "Formal is the hottest topic at DVCon'15."

          - Michael Sanie, Synopsys verification VP

      "Simulation is dead."

          - Raik Brinkmann, OneSpin CEO

Aart de Geus cited static formal technology as a productivity gain driving
the "shift left" in SoC verification.  Many DVcon'15 follow-up articles
touched on the power of formal technology in some way.

        ----    ----    ----    ----    ----    ----    ----
        ----    ----    ----    ----    ----    ----    ----
        ----    ----    ----    ----    ----    ----    ----

AART'S KEYNOTE ADDRESS - Shift Left

"Shift left," and squeeze schedule more left, was the theme of Aart de Geus'
keynote speech at DVCon 2015.  To this end, Aart covered four stories on:
design, IP, verification, and software. 

DESIGN

Aart's introduction celebrated the 50-year anniversary of Moore's Law and
called the invention of the transistor the single biggest exponential
revolution in the history of mankind, surpassing the impact of books, steam
engines, and other human achievements.  There are still 10 more years to
continue this path toward denser, higher performance silicon with lower
power consumption.  Another 10x productivity gain achieved in silicon will
give a 100x impact in applications and software in other fields such as
automotive, finance, energy, and biology -- where exponential revolutions
are also happening to shape our existence on earth.

Aart went on to talk about FinFETs, considered impossible 7-8 years ago and
now driving exponential growth forward.  He showed a graph of "Active
Designs & Tape-Outs" at different technology nodes, saying that designs
will rest at 28nm for a while before crossing the bridge to FinFET, while
the most advanced work is already happening at 5nm.

AART'S "ONE SNPS FLOW" PITCH

Next, Aart talked about the need for one coherent system made up of point
tools that must work well together -- to be part of his design continuum
of early exploration, physical synthesis, sign-off in physical, physical
optimization, yield optimization, and lithography -- a complicated EDA
flow that reduces design uncertainty through monotonic convergence.
"In order to shift left, or optimize any design flow, predictability of
different paths, or the different pieces of the path, and interaction
between those, is now extremely important," said Aart.  This becomes
necessary because everything is highly interconnected and interdependent;
so we have not only the scale complexity, but also systemic complexity
that requires understanding all dimensions at the same time.

Despite the desire to shift left, Aart believed that project schedules
haven't changed that much in the last 15 years, but the content for that
schedule has grown massively.

IP REUSE

The next topic was IP.  "The single biggest productivity enhancer in the
last 15 years, maybe the last 30 years, is fundamentally IP reuse," Aart
said.  IP reuse increases the time your engineers spend on your product's
differentiation.  Aart advocated for commercial IP because commercial IP
will see 10x the verification compared to in-house IP.

Hard IP is not necessarily differentiated -- your HW engineers need to put
their man-hours into differentiating your chip.  You should buy as many
3rd party IP blocks as you can, so your engineers can focus on doing their
"value add" in your chip's overall design.

VERIFICATION

Verification is the largest consumer of design-hours in EDA because of the
immense state space involved -- as Aart put it, "beyond the infinite, to
the cube power."  Aart touched upon simulation, analog/mixed-signal,
verification IP, static and formal, coverage, emulation, and FPGA
prototyping as mechanisms that have advanced productivity and contributed
to the "shift left" in verification.

Aart came out against standalone point tools.  He instead advocated for one
single verification continuum (unified computation, or a single-vendor
verification platform), where all of one vendor's point tools are well
tuned together, as the biggest productivity opportunity.

HARDWARE GOING TO SOFTWARE

Aart bridged the discussion over to virtual prototyping: modeling key
hardware functionality in software for early testing of embedded software.
Although bug fixing in software has traditionally been far less costly
(numerous software patches) than bug fixing in hardware (which can mean
re-spins), Aart argued that software testing also needs to become more
systematic and rigorous by applying formal techniques.

In conclusion, Aart reemphasized that with the 10x push down to silicon, we
will see other 100x opportunities everywhere, which will change what
mankind is all about.  What's to come is unforeseeable.  However, as the
enablers of the shift left happening in design, IP, verification, and
software, we should be proud of our achievements and continue to
collaborate for an even greater shift left.


VIGYAN'S REACTION TO AART'S KEYNOTE

Aart is very good at delivering a meaningful message while at the same time
selling what Synopsys has to offer in each of the four areas mentioned.

        ----    ----    ----    ----    ----    ----    ----
        ----    ----    ----    ----    ----    ----    ----
        ----    ----    ----    ----    ----    ----    ----

THE ANNUAL DVCON DESIGN VS. VERIFICATION PANEL

Panelists:

  - Janick Bergeron, fellow, Synopsys
  - Harry Foster, chief scientist, Mentor Graphics
  - JL Gray, senior architect, Cadence
  - Ken Knowlson, principal engineer, Intel
  - Bernard Murphy, CTO, Atrenta

The moderator, Brian Bailey of SemiEngineering, opened with:

   I have been told that verification is an art rather than a science,
   and I have heard a number of reasons, but the top three are as follows:

      1) the verification problem is too large;
      2) nobody actually knows how to automate verification;
      3) It's good that chip verification isn't a science,
         because if it were a science, then all the good jobs
         in verification would have gone to China or India.

But much has changed since then.  Many of the designs we have today are
made from pre-verified blocks.  Many designs, such as those intended for
IoT edge devices, are simpler.  Many are variations of pre-verified
platforms, so design turnaround time for these is on the order of a couple
of months.  Some say that is actually too long; they are looking for weeks,
if not days, for complete design turnaround now.

    EDITOR'S NOTE: NOT exact quotes, but very shortened comments
    for brevity.  Otherwise we'd easily have 18 pages here!  - John

  Q. What progress has been made to reduce the amount of art in
     verification, and what else can be done?

JL Gray: There has been quite a lot of progress over the last 10-15 years.
Constrained random, functional coverage, verification plans.  It is
definitely better than in years past, when we had lots of directed tests
and hoped we hit all the cases.

Ken Knowlson: I am the token software person on this panel.  So I approach
this differently, more from what the validation community at Intel is asking
of my cohorts and me.  We want to see production firmware (and drivers) used
more for validation purposes.  We can't afford to continue to write what we
call "verification firmware" for complex chips.  It rivals the production
firmware and yet it doesn't exercise the silicon the same way production
firmware does.

Bernard Murphy: Increased awareness of static and formal verification as a
part of the flow -- not as an incidental "let's try this and see what it
tells us" experiment, but as a very necessary component.

There's a kind of a scary reality that verification isn't complete.  There
are bugs in the hardware.  There are bugs in the drivers.  And of course we
try to manage those down to a very non-critical level -- but we never
really know whether we've got this stuff completely flushed out.  And to
be honest, we never do.

Harry Foster: Constrained random, and formal.  In a study I recently
completed, use of formal apps grew by 62%, which is remarkable.  It's
basically grown faster than anything else in the EDA industry.

Janick Bergeron: I am disappointed to have been wrong about high-level
behavioral synthesis.  Other than some specialized algorithm-to-C projects,
having lots of design IP and VIP negated the need for most design groups
to build chips from the ground up with behavioral synthesis.

  Q. At what point can we trust the IP and not have to re-verify it?

Harry Foster: It is not like taking Lego blocks and putting them together.
When you start mixing IPs that interact with each other, you have to verify
at a higher level.  You can't just verify individual IPs to show they
actually work and assume they are going to work from a system perspective.
You are obligated to verify the interaction between IPs.

Bernard Murphy: I had an idea six months ago of how to do verification of
control systems -- clock and reset management, and power management.  I
said to one of our customers that we could abstract away all the IPs and
only look at the block that is controlling these clocks and resets, do
full verification on that piece, and not have to look inside the IPs at
all.

And the customer asked: "Could you look inside the IPs a little?  We have
some of the control logic down inside the IPs for the power management."
And so my idea eventually fell apart, precisely because you can't separate
chip functionality domains that cleanly.

JL Gray: You can't separate the domains that cleanly because when people
are designing and architecting these things, they are not considering that
you might need to separate them.

Brian Bailey: Lego was successful, as Aart said, because it has a very
simple and very rigid interface -- so maybe we are defining the systems
wrong by allowing too much variability in the interfaces.

Bernard Murphy: No, chip designers expect to be judged only on
functionality and PPA.  If they can get an edge by breaking down block
interface walls, they will -- and all of the companies that build the smart
phones, and what goes inside the smart phones, do precisely that.  They are
tinkering with power management at every level to save picowatts.  And they
won't stop just because it simplifies the verification task, unfortunately.

    EDITOR'S NOTE: Deleted 6 pages of pedantics about "what is art
    vs. a science?" and "complex vs. complicated" here.  - John

Harry Foster: So JL, is SystemVerilog complex or complicated?

JL Gray: Yes!  [laughter]

  Q. I hear a lot of bashing of SystemVerilog.  Is there a love-hate
     relationship with SystemVerilog?

Harry Foster: I think all languages are equally hated.  [laughter]

Justin Refice: That's a great answer!  It allows creativity, but at the
same time that creativity can send you down really bad rabbit holes.  I
believe JL had a comment about UVM: "Oh, I can use UVM to solve this
problem."  Well, UVM is the Legos.  And you build the Legos, but you
haven't actually solved the problem.

We always talk about "left shifting" and bringing SW into verification.  Or
bringing it into the system earlier -- but to Bernard's point, why can't we
"right shift" what the product team or verification team has created?  If
we've created an API to verify the interaction with this chip, well gee,
it'd be really nice if we could hand that API directly to the software team
and say: "Here you go! If you're really using this, we know it works! If
you're doing your own thing, we may have missed something, but we know we
didn't miss this API."

Ken Knowlson: So we are actually doing that with new IPs.  It doesn't work
so well with "legacy IPs", you know, IPs with existing drivers and stuff.
We are driving that from a validation perspective, but we are including the
software team in the design of those APIs, so it is something that is
reusable.  We are calling it the Hardware API or something like that.

Harry Foster: I have seen some companies that have done HW/SW concurrent
engineering quite well, and others that are really horrible at it.  If
management doesn't push it, well, you know the software guys are always
working on the last product anyway, and they don't have time to work on
the new one.  So really, accomplishing this is an organizational issue.

Janick Bergeron: I've seen HW/SW concurrent engineering work really well
with debugging.  Putting in certain verification features early on in a
project makes debug so much easier later in the project.

    EDITOR'S NOTE: Panel ends with 4 pages of talk about "writing
    beautiful code" and "aesthetic appreciation of code".  -John

        ----    ----    ----    ----    ----    ----    ----
        ----    ----    ----    ----    ----    ----    ----
        ----    ----    ----    ----    ----    ----    ----

DVCON'15 PAPERS

    Paper: 8 Design Guidelines for Formal Verification

   Author: Anamaya Sullerey of Juniper Networks

Anamaya's motivation for creating these guidelines came from his experience
that formal verification is very much dependent on design style.  A bad
chip design style can cause your formal proof not to converge -- and even a
not-so-formal-friendly design style will increase the effort needed to
create a good formal testbench.  This is not true for simulation.

These 8 design guidelines make your design formal verification friendly.

Guideline 1 - Use a functional design approach where each module in your
chip design hierarchy performs a well-defined function with minimal side
effects.  Don't use the more common event-driven (performing certain
actions in response to observed events) or data-driven (performing an
action based on observed data) approaches for RTL design.

Formal verification tools often run into capacity limitations due to the
exponential search space they need to traverse to prove a property.  As a
result, very large designs may not be verifiable with a single formal
testbench.  The functional design approach lets your formal verification
be done in a "divide and conquer" manner at the sub-block level.  That way,
you can use the "Assume-Guarantee" technique at your chip's sub-block
interfaces: the same interface properties serve as constraints into one
block and as checkers on the outputs of another.

Guideline 2 - Have clear, succinct interface definitions on the blocks and
sub-blocks of your chip.  Such interface definitions help in two ways:

  1. Clear interface definitions save time in formal testbench and
     property development.  They also save iterations between the
     formal verification engineers and designers needed to understand
     these interfaces.

  2. Since clean interfaces require a smaller number of properties to
     model, the state-space complexity for your checkers is reduced,
     which leads to better formal convergence.
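
For illustration (this valid/ready handshake interface is hypothetical, not
from the paper), a clean, succinct block boundary keeps the number of
properties small, whether they are used as constraints or as checkers:

    // Hypothetical clean block interface: a small, well-defined handshake
    // is cheap both to constrain and to check.
    interface pkt_if (input logic clk, rst_n);
      logic        valid;   // producer asserts when 'data' is meaningful
      logic        ready;   // consumer asserts when it can accept data
      logic [63:0] data;

      modport producer (output valid, data, input  ready);
      modport consumer (input  valid, data, output ready);

      // One short protocol property covers the whole boundary.
      a_hold_until_ready: assert property (@(posedge clk)
        disable iff (!rst_n) valid && !ready |=> valid && $stable(data));
    endinterface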

Guideline 3 - Treat state-space as a chip design consideration.
State-space is a function of the Cone of Influence (COI) -- all primary
inputs and logic affecting the checkers under verification -- and of the
connectivity within the COI.

Anamaya explained this point using an example of a seven-stage instruction
pipeline.  He said that even though Case 2 of his example has a larger COI
due to the amount of logic, Case 3 is more complex for formal due to the
connectivity within its COI.

Guideline 4 - Have a symmetrical design implementation.  The idea is that
formal tools exploit symmetry in your chip design, so your formal proof may
converge faster for a symmetrical design.  Since it may be hard to design
all blocks with complete symmetry, it is good to isolate any asymmetry
from the parts of the design that are largely symmetric.
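
A small sketch of the idea (the module names are hypothetical): keep the
identical lanes in one generate loop, and push any asymmetric special case
into its own module so it does not break the symmetry formal tools exploit:

    // Hypothetical per-lane block with trivial placeholder behavior.
    module lane (input logic clk, rst_n, req, output logic grant);
      always_ff @(posedge clk or negedge rst_n)
        if (!rst_n) grant <= 1'b0;
        else        grant <= req;
    endmodule

    // The N lanes stay perfectly symmetric.  Any special-case handling
    // (say, for a privileged lane 0) would live in a separate module.
    module lane_array #(parameter int N = 8)
      (input  logic         clk, rst_n,
       input  logic [N-1:0] req,
       output logic [N-1:0] grant);

      for (genvar i = 0; i < N; i++) begin : g_lane
        lane u_lane (.clk, .rst_n, .req(req[i]), .grant(grant[i]));
      end
    endmodule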

Guideline 5 - Have as many parameters as possible in the design.  Anamaya
considered this point very important.  A parameterized design can be scaled
down, by reducing the parameters, without affecting any key functionality.
This helps reduce the state-space your proof must cover to converge.
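
As a sketch of this (hypothetical module names), the same control logic can
be proven on a scaled-down instance simply by overriding the parameter in
the formal testbench:

    // Hypothetical parameterized FIFO occupancy counter (the DUT).
    // Data storage is omitted; only the control logic matters here.
    module fifo_ctrl #(parameter int DEPTH = 1024)
      (input  logic clk, rst_n, push, pop,
       output logic full, empty);
      logic [$clog2(DEPTH):0] count;
      always_ff @(posedge clk or negedge rst_n)
        if (!rst_n)            count <= '0;
        else if (push && !pop) count <= count + 1'b1;
        else if (pop && !push) count <= count - 1'b1;
      assign full  = (count == DEPTH);
      assign empty = (count == 0);
    endmodule

    // Formal testbench: same RTL, scaled-down parameter.  DEPTH=4 keeps
    // the control logic identical but shrinks the state space to explore.
    module fifo_ctrl_formal_tb (input logic clk, rst_n, push, pop);
      logic full, empty;
      fifo_ctrl #(.DEPTH(4)) dut (.*);

      a_never_full_and_empty: assert property
        (@(posedge clk) disable iff (!rst_n) !(full && empty));
    endmodule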

Guideline 6 - Have assertions for design invariants -- for example, "a
vector is one-hot" or "a grant is guaranteed within N cycles of any
request."  These help find some design bugs much earlier in the
verification cycle, and some formal tools treat them as helper assertions
to aid convergence on other assertions.
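
For example (with hypothetical signal names), the two invariants quoted
above could be written as:

    // Hypothetical design-invariant assertions placed alongside the RTL.
    module arb_invariants #(parameter int N = 8)
      (input logic       clk, rst_n, req, grant,
       input logic [3:0] grant_vec);

      // Invariant 1: the grant vector is one-hot (or all zero when idle).
      a_grant_onehot: assert property
        (@(posedge clk) disable iff (!rst_n) $onehot0(grant_vec));

      // Invariant 2: any request is granted within N cycles.
      a_grant_within_n: assert property
        (@(posedge clk) disable iff (!rst_n) req |-> ##[1:N] grant);
    endmodule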

Guideline 7 - Have a simpler RTL code structure.  Formal likes simple.

  1. Move or isolate any independent complex deep-state logic, like
     LFSRs and crypto functions, into separate modules.  This makes it
     easier to abstract the behavior of these modules.

  2. Instantiate memories outside of your logic.  This helps your
     formal tool estimate the size of the state-space.

  3. Avoid huge right-hand-side expressions in assignments.  This helps
     with cut-point insertion for partial proofs (see the sketch after
     this list).
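
A tiny illustration of point 3 (the code is hypothetical): naming the
intermediate terms gives a formal tool natural places to insert cut points
for partial proofs:

    module rhs_split (
      input  logic [7:0] a, b, c, d, e, f, g, h, vec,
      input  logic       sel,
      output logic [7:0] result);

      // Before (everything in one assignment, hard to cut-point):
      //   assign result = (a & b) | (c ^ d) |
      //                   (sel ? e + f : g - h) | {7'b0, |vec};

      // After: each term is a named net the tool can abstract on its own.
      logic [7:0] and_term, xor_term, arith_term, reduce_term;
      assign and_term    = a & b;
      assign xor_term    = c ^ d;
      assign arith_term  = sel ? (e + f) : (g - h);
      assign reduce_term = {7'b0, |vec};
      assign result      = and_term | xor_term | arith_term | reduce_term;
    endmodule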

Guideline 8 - Be sure to do error isolation in your chip.  Many designs
work by processing a large set of independent, symmetric contexts -- such
as network flows or cache lines.  In such designs, it's a common technique
to model a single context for formal verification; due to symmetry, a
proof of a property for this one modeled context then proves correctness
for all contexts.

The trouble comes with the non-modeled contexts.  Their inputs can take on
the whole state-space -- that is, even illegal inputs are allowed.  An
important requirement is that the non-modeled contexts must not affect the
modeled context.  You must have error isolation, or you will end up with
false failures.
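
A minimal sketch of that setup (all names hypothetical): a symbolic flow ID
selects the modeled context, only that flow's inputs are constrained to be
legal, and the checker verifies that corrupted beats on the other flows
never surface as errors on the modeled flow:

    // Hypothetical single-modeled-context testbench for a FLOWS-way design.
    module flow_isolation_tb #(parameter int FLOWS = 16)
      (input logic                     clk, rst_n,
       input logic                     in_valid,
       input logic [$clog2(FLOWS)-1:0] in_flow,
       input logic                     in_err,     // corrupted input beat
       input logic                     out_valid,
       input logic [$clog2(FLOWS)-1:0] out_flow,
       input logic                     out_err);

      // Symbolic, but stable, choice of the modeled context.
      logic [$clog2(FLOWS)-1:0] modeled_flow;
      m_stable_flow: assume property (@(posedge clk) $stable(modeled_flow));

      // Only the modeled flow is constrained to be legal; every other
      // flow may freely inject corrupted beats (illegal inputs allowed).
      m_modeled_legal: assume property (@(posedge clk) disable iff (!rst_n)
        (in_valid && in_flow == modeled_flow) |-> !in_err);

      // Error isolation: garbage on the non-modeled flows must never
      // show up as an error on the modeled flow's output.
      a_isolated: assert property (@(posedge clk) disable iff (!rst_n)
        (out_valid && out_flow == modeled_flow) |-> !out_err);
    endmodule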

Anamaya also presented ways to enforce these guidelines, and stressed
giving your chip designers specific examples of formal-friendly designs.
He closed his talk by saying that formal should always be a part of final
signoff.

        ----    ----    ----    ----    ----    ----    ----

    Paper: Every Cloud - Post-Silicon Bug Spurs Formal Verification Adoption

  Authors: Blaine Hsieh of Faraday Technology
           Stewart Li and Mark Eslinger of Mentor Graphics

Mark described a situation where their customer found a post-silicon bug
but could not reproduce it in simulation with their existing UVM test
vectors.  A post-silicon bug can be very expensive -- it can cause a
million-dollar loss per day if not fixed.  The team was trying everything
possible and decided to try formal.

Mark talked about the ABC's of formal:

  - A is for Assurance: correctness of design behavior is ensured
    by exhaustive formal proofs.
  - B is for Bug hunting: formal can help find and fix as many bugs
    as possible, including post-silicon bugs.
  - C is for Coverage closure: determining whether a coverage
    statement/bin/element is reachable or unreachable.

Unlike a Verilog/VHDL simulator, a formal tool's runtime goes up
exponentially with the cycle-depth of the analysis.  The challenge of
post-silicon debug is that the bug will almost surely lie beyond the proof
depth that formal tools can reach without using complexity-resolving
techniques.

In the paper, Mark described a technique called goal-posting, where cover
points are used to reach a certain depth with the formal tool -- then that
new point is used as the starting point to explore deeper into the design
state-space.  The following is a goal-posting example for verifying an
overflow condition in a FIFO.
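
What that might look like, as a rough sketch (the signals and properties
here are hypothetical, not taken from the paper): a cover property
establishes an "almost full" goal post, the tool's trace for that cover is
then used as the starting state for the next run, and an over-constraint
shrinks the remaining search toward the overflow target:

    // Hypothetical goal-posting setup for a FIFO overflow hunt.
    module fifo_goalpost #(parameter int DEPTH = 64)
      (input logic clk, rst_n, push, pop,
       input logic [$clog2(DEPTH):0] count,    // occupancy from the DUT
       input logic overflow_err);

      // Goal post 1: show the FIFO can get within one entry of full.
      // The trace for this cover seeds the initial state of the next run
      // (via the tool's save/restore or initial-state mechanism).
      c_almost_full: cover property (@(posedge clk) disable iff (!rst_n)
        count == DEPTH-1);

      // Over-constraint used while bug hunting: forbid pops, so the FIFO
      // can only fill up, shrinking the state-space toward the overflow.
      m_no_pop: assume property (@(posedge clk) disable iff (!rst_n) !pop);

      // Final target: the overflow itself, now only a few cycles away
      // from the goal-post state instead of hundreds.
      c_overflow: cover property (@(posedge clk) disable iff (!rst_n)
        push && (count == DEPTH) && overflow_err);
    endmodule
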
Since their design initialization was complex, the customer used simulation
traces to initialize the whole design to a starting point.  It was also
important that they used over-constraints to reduce the state-space when
bug hunting at the chip level.  They also used symmetry in the design to
verify only one output.  By using goal-posting, over-constrained inputs,
and simple properties, the customer was able to reproduce the bug.

Mark also shared some best practices.  With this success, formal was widely
adopted by the customer to verify other blocks -- such as the CPU bus
interface, MAC controller, and interrupt controller -- and it continued to
find some very deep bugs in their designs.

        ----    ----    ----    ----    ----    ----    ----
        ----    ----    ----    ----    ----    ----    ----
        ----    ----    ----    ----    ----    ----    ----

WHO ELSE EXHIBITED

This year DVcon'15 had 37 exhibitors, including each of the Big 3 plus 10
new booths.

                        Synopsys + EVE + SpringSoft
                        Cadence
                        Mentor Graphics

                        Agnisys
                        Aldec
                        Amiq EDA
                        Atrenta
                        Avery Design Systems
                        Blue Pearl Software
                        Breker Verification Systems
                        Calypto Design Systems
                      * Cliosoft
                        Dini Group
                        Doulos
                        EDACafe
                      * Magillem SA
                      * Mathworks
                        OneSpin Solutions GmbH
                        Oski Technology
                        Paradigm Works
                        Pro Design Electronic GmbH
                        Real Intent
                      * Rocketick
                        S2C
                        Semifore
                        Sibridge Technologies
                      * SmartDV Technologies
                      * Sunburst Design
                      * Sutherland HDL
                      * Test and Verification Solutions
                        Truechip Solutions
                      * VeriFast
                        Verific Design Automation
                        Verification Academy
                      * Verifyter
                      * Xpeerant
                        Zocalo Tech

  * -- first time showing at DVcon

Oski has been exhibiting at DVCon for several years.  DVCon 2015 was a
great show for us, as formal is an increasingly important topic in
verification.  The booth crawl with food was a great idea.  We plan to be
at DVCon for many years to come.

    - Vigyan Singhal
      Oski Technology, Inc.                      Mountain View, CA

         ----    ----    ----    ----    ----    ----    ----

Related Articles

    Bernard Murphy on DVcon'14, UVM, Lip-bu, SW vs. HW engineers, specs
    Industry Gadfly -- Prakash Narain's not-so-secret DVcon'13 Report
    Brett's quickie trip reports on both DVcon'12 and NASCUG'12 confs
    Brett Cline's DVCon/NASCUG'11 Trip Report & Wally's Keynote Address
