( ESNUG 589 Item 07 ) --------------------------------------------- [05/13/21]
Subject: How Amit and Ravi staged a (tiny) Pearl Harbor on Anirudh and Aart
SOLIDO'S ML FORMULA: Way before most of the other EDA companies slathered
this buzzword onto all their tools, Solido was one of the first to announce
"machine learning" in an EDA tool -- it was for library characterization
in 2017. (see ESNUG 576 #4) At the time, they claimed to have been
using ML for 12 years before they announced it.
In this 2019 survey, users mentioned *3* ML technologies from Mentor-Solido:
Solido MLChar Generator -- uses existing libs + target corners to generate
new libraries. It does both standard cells and memories.
"We were able to generate a new STD cells library of ~1000 cells using
MLChar Generate in only 3 hours. Characterizing such a library with
Liberate would normally take us at least one day."
Solido MLChar Analytics -- analyzes Liberty files to find and debug
characterization errors, and to understand trends in vendor-provided,
custom-built, or generated libraries.
"We use Liberate for characterization and then run Solido MLChar
Analytics. Solido finds characterization errors, such as mismatches
between CCS and NLDM."
"We do not try to find trends/errors using Synopsys SiliconSmart nor
Cadence Liberate, as those tools do not have a strong QA process
like Solido. Solido follows monotonicity trends for consistency."
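To make that "monotonicity" QA idea concrete, here's a minimal Python
sketch of the *kind* of trend check such a tool runs -- my illustration,
NOT Solido's actual algorithm. It assumes you've already pulled an NLDM
delay table out of your .lib (the numbers and cell name below are made
up; real Liberty parsing is a much bigger job):

    def is_monotonic_nondecreasing(row):
        # delay should not decrease as output load increases
        return all(a <= b for a, b in zip(row, row[1:]))

    def check_nldm_table(cell, arc, table):
        for i, row in enumerate(table):
            if not is_monotonic_nondecreasing(row):
                print(f"QA FLAG {cell}/{arc}: delay not monotonic "
                      f"vs load at slew index {i}: {row}")

    # hypothetical cell_rise table: delay (ns), rows = input slew,
    # columns = output load
    cell_rise = [
        [0.010, 0.021, 0.043, 0.088],
        [0.015, 0.027, 0.052, 0.049],  # dip at the end -> gets flagged
    ]
    check_nldm_table("NAND2_X1", "A->Y (rise)", cell_rise)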
Solido MLChar Transformer -- manipulates libs in all sorts of creative ways.
To wit -- copy, margin, merge, resize, transform, and trim are its commands.
"I estimate that using MLChar Transformer would save me about 90% of the
time compared with building and operating my own scripts. This is
because the Liberty structure is quite complex -- setting up the scripts
takes a VERY long time. Also, using scripts is not very reliable."
- Solido's 3-piggies of ML characterization is Best of 2019 #6b
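For a taste of why those DIY scripts take so long, here's a toy Python
sketch of just the "margin" operation -- pad every number inside a
Liberty values() group by 5%. This is my deliberately naive illustration
(the regex would choke on e-notation, templates, comments, etc. --
exactly the Liberty complexity the user above is complaining about):

    import re

    NUM = re.compile(r"[-+]?\d+\.?\d*")

    def margin_liberty(lib_text, factor=1.05):
        # multiply every number inside a values(...) group by 'factor'
        def scale(m):
            body = NUM.sub(lambda n: "%.6g" % (float(n.group()) * factor),
                           m.group(2))
            return m.group(1) + body + m.group(3)
        return re.sub(r"(values\s*\()([^)]*)(\))", scale, lib_text)

    demo = 'values ("0.011, 0.025", "0.047, 0.091");'
    print(margin_liberty(demo))
    # -> values ("0.01155, 0.02625", "0.04935, 0.09555");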
FIRST A LITTLE BACKGROUND: As you can see from the section I quoted above,
Solido does REALLY WELL at combining "machine learning" with the library
characterization business. (One of the reasons MENT/Siemens acquired
Solido is that lib char drives a boatload of SPICE licenses.)
And lib char has a colorful history that pulls in names you normally see
in other, far bigger segments of EDA. It's like a training ground for some
future big league EDA executives.
2003: Magma acquired Silicon Metrics for their SiliconSmart library
characterization tool. A guy named "Anirudh Devgan" was VP
and GM of the product.
2007: Synopsys releases Liberty NCX library characterization tool
2011: Cadence buys Altos for their Liberate lib char tool
2012: SNPS buys Magma, picking up SiliconSmart, which replaced Liberty NCX.
Anirudh doesn't join Synopsys, and instead goes to Cadence
2012: Mentor buys Z Circuit for Kronos library characterization tool
2014: Mentor buys Berkeley Design Automation and gets its CEO, a guy
named "Ravi Subramanian", to run its IC verification business
2016: Siemens buys Mentor
2017: Siemens under Ravi buys Solido, with Amit Gupta as its CEO, picking up
their Solido Characterization Suite that has machine learning
AND NOW THE NEWS:
2021: Siemens with Ravi/Amit buys Fractal, a small EDA start-up that
sells Crossfire, an alleged "IP validation tool"
WHY SHOULD ANYONE CARE?
Because Fractal's Crossfire "IP validation tool" has been horribly, horribly
mis-described and mis-marketed. That crappy description makes it sound like
it's some form of IP-focused linter, right? And because Solido bought it,
you'd think it's some sort of linter for cell libraries used in IP, right?
You couldn't be more wrong!
The truth is that the now-renamed CrossCheck is a funky "equivalency
checking-ish" tool that cross-checks connectivity and timing arcs across
very different formats (which they call "views") of a design to see if
they match or not.
That is, it compares
Foo.verilog vs Foo.lib vs Foo.lef vs Foo.gds vs Foo.spice vs Foo.atpg
to see if they all agree in terms of connectivity and timing arcs.
It's like a DRC/LVS/lint-ish equivalency checker for Verilog blocks all the
way to .lib then on to ATPG then on to GDS2 then on to SPICE...
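To make "cross-view" concrete, here's a minimal Python sketch of one such
check -- a pin presence/naming diff between a LEF and a .lib. The toy
extractors, regexes, and cell names are mine, purely illustrative;
Fractal's actual engine obviously goes far deeper than this:

    import re

    def pins_from_lef(lef_text):
        # toy: grab the names after PIN keywords in a LEF macro
        return set(re.findall(r"^\s*PIN\s+(\S+)", lef_text, re.M))

    def pins_from_lib(lib_text):
        # toy: grab the names from pin(...) groups in a Liberty file
        return set(re.findall(r"pin\s*\(\s*(\w+)\s*\)", lib_text))

    def cross_check_pins(cell, lef_pins, lib_pins):
        for p in sorted(lef_pins - lib_pins):
            print(f"{cell}: pin {p} is in LEF but missing from .lib")
        for p in sorted(lib_pins - lef_pins):
            print(f"{cell}: pin {p} is in .lib but missing from LEF")

    lef = "MACRO RAM256\n  PIN In_A\n  PIN CLK\nEND RAM256"
    lib = "cell (RAM256) { pin (InA) { } pin (CLK) { } }"
    cross_check_pins("RAM256", pins_from_lef(lef), pins_from_lib(lib))
    # -> flags In_A vs InA -- the naming mismatch described below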
Here are some typical "cross-view" checks it does. (Note the jumps between
strongly divergent Verilog/LEF/.lib/GDS2/SPICE/UPF/etc. file formats.)
- block/cell/pin presence: Do I have any missing pins in my
macro block LEF file? What about entire macro blocks missing
from my LEF file? Is the total number of pins consistent
between views? (Large macros may have hundreds of pins;
a common issue is pins missing from one of the views.)
- block/cell/pin correctness: Is the naming of my macros and pins
consistent between different views? E.g., a pin called In_A in
LEF and InA in .lib may cause issues unless a mapping
file is used.
- block/cell area consistency: Do the block and cell boundary areas
defined in my LEF, DEF, and final GDS match up to each other? Do
they match with the AREA field in my .lib?
- For larger blocks -- power domain consistency: Does the assigned
power domain of pins in my .lib match with the actual SPICE netlist?
E.g., if my .libs says "pin(Y12) { related_power_pin: vss_7 }",
does tracing through my SPICE netlist show that the pin actually
draws from vss_7? What about my associated UPF/CPF file -- does
the power domain info there match, too? (See the supply-tracing
sketch right after this list.)
- pin shape and sizes: Are the pin dimensions and locations consistent
between the different physical views, e.g. LEF, DEF and GDS2?
- block hierarchy tree: Is the hierarchy tree consistent across the
different views of my IP block? E.g., is my SPICE netlist flat while
my GDS and/or DEF is hierarchical?
- pin attributes consistency: Analog and clock pins are given special
attributes in LEF and .lib. Do these attributes match up? E.g. is
the pin mislabeled in one of the views?
- timing arc presence: Is the presence of timing arcs consistent between
my .lib, Verilog, and VHDL files? I.e., is my .lib missing timing arcs,
or does it have extra ones, compared to my Verilog and/or VHDL?
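Here's a minimal Python sketch of the supply-tracing idea behind the
power domain check above -- again my illustration only, with a made-up
netlist. Model the flattened SPICE netlist as a graph of net-to-net
connections and BFS from the pin's net to see which supply it actually
reaches:

    from collections import defaultdict, deque

    def reachable_supplies(edges, start, supplies):
        # BFS across net-to-net connections (e.g. through the
        # source/drain of transistors) from a pin's net
        graph = defaultdict(set)
        for a, b in edges:
            graph[a].add(b)
            graph[b].add(a)
        seen, todo = {start}, deque([start])
        while todo:
            for nxt in graph[todo.popleft()] - seen:
                seen.add(nxt)
                todo.append(nxt)
        return seen & supplies

    # made-up flattened netlist: pin Y12 reaches vss_7 through net n1
    edges = [("Y12", "n1"), ("n1", "vss_7"), ("vdd", "n9")]
    lib_says = "vss_7"  # from: pin(Y12) { related_power_pin: vss_7 }
    found = reachable_supplies(edges, "Y12", {"vss_7", "vss_3", "vdd"})
    if lib_says in found:
        print(f"OK: Y12 traces to {lib_says} in SPICE, matching .lib")
    else:
        print(f"MISMATCH: .lib says {lib_says}, trace reaches {found}")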
The list above is just a sample of the mostly connectivity cross checks --
CrossCheck does a similar family of timing arc cross checks as well.
(A sketch of one follows below.)
In all, CrossCheck does over 350 of these "cross-view" checks on production
blocks of up to 10 Terabytes.
And, no, CrossCheck is NOT a "physically aware" static timing analysis tool
like PrimeTime-SI or Tempus-SI -- because it's NOT doing timing delay calcs;
it's doing connectivity / timing arc checks across Verilog/GDS2/SPICE "views"
of the same design block.
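And to show what a timing arc presence check boils down to, here's a toy
Python sketch that diffs the (related_pin -> pin) arcs in a .lib against
the (in => out) paths in a Verilog specify block. The line-scanner and
regexes are mine and deliberately simplistic -- not Fractal's parser:

    import re

    def arcs_from_lib(lib_text):
        # toy scanner: remember the current pin() group, then pair it
        # with each related_pin to get (from_pin, to_pin) arcs
        arcs, pin = set(), None
        for line in lib_text.splitlines():
            m = re.search(r"pin\s*\(\s*(\w+)\s*\)", line)
            if m:
                pin = m.group(1)
            m = re.search(r'related_pin\s*:\s*"?(\w+)"?', line)
            if m and pin:
                arcs.add((m.group(1), pin))
        return arcs

    def arcs_from_specify(v_text):
        # toy: match Verilog specify paths like  (A => Y) = (0.1, 0.2);
        return set(re.findall(r"\(\s*(\w+)\s*=>\s*(\w+)\s*\)\s*=", v_text))

    lib = 'pin (Y) {\n timing () { related_pin : "A"; }\n' \
          ' timing () { related_pin : "B"; }\n}'
    ver = "specify\n (A => Y) = (0.1, 0.2);\nendspecify"
    for a, b in sorted(arcs_from_lib(lib) - arcs_from_specify(ver)):
        print(f"arc {a}->{b} is in the .lib but not the Verilog specify")
    for a, b in sorted(arcs_from_specify(ver) - arcs_from_lib(lib)):
        print(f"arc {a}->{b} is in the Verilog but not the .lib")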
---- ---- ---- ---- ---- ---- ----
CROSSCHECK IS A COMMUNITY EFFORT: The other very powerful aspect to this
funky DRC/LVS/lint-ish equivalency checker sign-off tool is that the R&D
guys at Fractal very wisely opened up the set of "cross-view" checks.
That is,
when they first started 11 years ago, they cooked up about 100 "cross-view"
checks on their own -- but over the years the foundries added checks, the
3rd party IP companies added checks, the fabless semi guys added checks,
the IDM guys added checks -- basically if you used CrossCheck you were
allowed to add new "cross-view" checks...
... and if a new check was useful, the Fractal guys would distribute it
to the entire CrossCheck user base. So now the count is ~350 and growing.
It's the CrossCheck user ecosystem that's improving the tool on an almost
daily basis -- not just the Fractal R&D guys in the Netherlands.
So someone comes up with a really weird new check that, say, looks at a
Verilog representation of a JTAG controller and how it impacts a SPICE
view of that same design -- if it's useful, they'll make sure you have
access to it!
(And I must tip my hat to Rene Donkers, Fractal CEO, for making this product
decision 11 years ago.)
---- ---- ---- ---- ---- ---- ----
WHY THIS KICKS ASS: Because this seemingly minor checking tool is used by
all the big design houses and fabs in both the Cadence and Synopsys flows.
And this gives Ravi yet another "choke point" sign-off tool that *everyone*
loves and that Aart and Anirudh *must* work with -- or Aart's and Anirudh's
customer bases are going to be very pissed off! -- sort of like how MENT
Calibre became the "choke point" DRC/LVS sign-off tool that CDNS and SNPS PnR
*must* play nice with -- or CDNS and SNPS are in trouble! :)
---- ---- ---- ---- ---- ---- ----
HOW IS THIS A "PEARL HARBOR"? Let me be very specific on this: Cooley is
calling this Fractal acquisition a tiny Pearl Harbor attack on both Cadence
and Synopsys. Emphasis on tiny.
A big Pearl Harbor attack was in 1997, when Mentor Graphics launched Calibre
to go directly against the Dracula-Vampire-Diva DRC/LVS quasi-monopoly Cadence
owned back then.
There was this little-known AE named Joe Sawicki who was traveling up and
down the East Coast daring existing DRC users to benchmark MENT's Calibre
"against all rivals!" (which was CDNS' Dracula-Vampire-Diva, of course.)
One spy tells me that first year (1997) Calibre DRC pulled in $800,000 in
initial sales (something to do with launching in November 1997 leaving only
2 months to "sell for the 1997 calendar year...")
Anyway, 1998-1999-2000-2001 saw Calibre go from being a small contender
in DRC/LVS to suddenly grabbing 60% to 70% of the total DRC market share.
According to EDAC numbers, world DRC/LVS sales totalled $2,337 million
in the last 4 quarters -- and a conservative 60% of this is $1.4 billion
in Calibre sales for Mentor/Siemens -- sales that Anirudh's Cadence did NOT
get because of that big 1997 Pearl Harbor attack on the DRC/LVS
quasi-monopoly Cadence used to have back then.
This Siemens Fractal deal is a tiny version of the Calibre story repeating
itself. CrossCheck (like Calibre) is a sign-off tool enjoying a "choke point"
position in the design flow that neither CDNS nor SNPS can ignore -- and it
can't be sued out of business (sorry, Aart), nor, with an 11-year lead, does
it appear CrossCheck can be "out-R&D-ed" by CDNS or SNPS R&D (sorry, Aart,
and sorry, Anirudh.)
Will the niche that Siemens CrossCheck fills be earning billions years from
now? I highly doubt it. But will it be part of a "checking" tool set that
forces Cadence and Synopsys customers to still be doing EDA deals with the
Siemens/MENT folks years from now? THAT I'd bet on.
That is, as much as Aart and Anirudh fantasize about locking customers into
a SNPS-only or CDNS-only tool flow, the fact that users will always need
*outside* 3rd party "checking" tools like Calibre or CrossCheck, means that
Aart's/Anirudh's "customer lock-in" fantasy will *never* become 100% true.
(Which is why a tiny Pearl Harbor is still a Pearl Harbor nonetheless.)
---- ---- ---- ---- ---- ---- ----
HOW THE $%^* DID THIS HAPPEN? On the business side I heard that Ravi/Amit
managed to buy Fractal for a ~3x multiple -- and that neither Cadence nor
Synopsys bid on it! WTF? Who the hell was asleep in the CDNS and SNPS
M&A departments to miss this one???
- John Cooley
DeepChip.com Holliston, MA
---- ---- ---- ---- ---- ---- ----
Related Articles:
Solido's 3-piggies of ML characterization tools is Best of 2019 #6b
User buzz on Siemens/Solido machine learning is #1 for Best of 2018
Amit on Solido machine learning, lib characterization, and Siri
Amit added 263 engineers on Library Characterization this year!
Solido ML, BDA Ravi, Tom Beckley SSS makes #2 for Best of 2017
Joe Sawicki on ML, Calibre, Solido, VC funding, and heuristics
Joe Sawicki smirks at Cadence Pegasus' 3 big critical DRC failings
Users say MENT Calibre is still DRC king was #3 in Best of 2017
Calibre scales 2,048 CPUs 16nm 700mm2 full chip DRC in 3.5 hours
Anirudh & Sawicki go at it over the recent CDNS Pegasus DRC launch
Anirudh's 19 jabs at Joe Sawicki's Calibre with his Pegasus launch