( ESNUG 472 Item 3 ) -------------------------------------------- [04/30/08]
Subject: ( ESNUG 470 #5 ) Paul warns on the dangers of SDF-based STA flows
> The reason I switched from PrimeTime was that it was too slow in running
> on our 2 million gate design. It would take me 35 minutes with PrimeTime
> to load my design database, then each time I added a constraint it took
> 5-10 minutes for my design database to be updated. To speed this up, I
> had used 10 PrimeTime licenses in parallel at the same time.
>
> In comparison, TimeCraft was really fast. It took ~5 minutes for it to
> load the same design database, instead of the 35 min it took PrimeTime!
> When I added a constraint, I got the updated results within 1-2 seconds.
>
> - Fei Yan
> VIA Telecom San Diego, CA
From: Paul Zimmer <paulzimmer=user domain=zimmerdesignservices not mom>
Hi, John,
His "timing is identical" remark made me suspicious, so I followed up with
Fei. It turns out Fei's using an SDF flow. This is turning PrimeTime into
a glorified spreadsheet.
I recommend that SDF flows be avoided, for two fundamental reasons that
manifest themselves in various ways:
1) MUCH data is lost when the timing is transferred via SDF. All you
get is min/max delays through gates and nets. No loading info, no
transition time, no aggressor/victim data, etc. This leads to the
following limitations:
a) You can't tell what CAUSES a particular delay (was it a slow
transition on the input, a slow clock transition, a loading
problem, a constraint problem, etc.)
b) You can't do any what-if fixes. You can't swap a cell and
see the effects without going all the way back to the netlist
and the SDF generation tool (or maybe back through layout!).
This means no dmsa_fix_hold, my new best friend.
c) You can't do path-based analysis. Every slew merge point will
be merged, forever and unalterably, by the SDF generation tool
based on the constraints it was given.
d) There are certain subtle situations where the tool needs to
know exactly where delay variation comes from (OCV vs. noise,
for example). I'm not sure that even an "SDF + incremental SDF"
flow does this correctly.
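To make point 1 concrete, here is a small illustrative SDF fragment (the instance
name, cell type, and delay values are invented for the example). A cell entry
carries only min:typ:max delay triples per timing arc -- the format simply has no
place for pin loading, transition times, or aggressor/victim coupling data:

```sdf
(CELL
  (CELLTYPE "AND2X1")
  (INSTANCE u_core/u_alu/U123)
  (DELAY
    (ABSOLUTE
      // rise and fall (min:typ:max) arc delays from pin A to pin Y --
      // this is the ONLY timing information the arc gets
      (IOPATH A Y (0.012:0.018:0.027) (0.015:0.021:0.031))
    )
  )
)
```

Everything that produced those numbers (input slew, output load, crosstalk,
derating) was collapsed away by whatever tool wrote the SDF.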
2) The SDF data has constraint effects hidden in it. Case analysis,
disable timing, certain clock commands, etc. will change the
generated SDF data. This means:
a) You must generate SDF for every *MODE* as well as every corner.
b) If you find a problem caused by a constraint, you may have to go
back through the SDF generation loop to be sure the data is
correct with the new constraint.
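As an illustration of point 2, consider a hypothetical constraint change in
PrimeTime Tcl (the port, cell, and pin names are made up). In a netlist-plus-
parasitics flow these are one-line updates that PT re-times in place; in an SDF
flow their effects were baked into the delay numbers when the SDF was written,
so the SDF has to be regenerated before the analysis can be trusted:

```tcl
# Force a mode pin to a constant. In an SDF flow, case analysis done at
# SDF-generation time already pruned/changed arcs in the written delays.
set_case_analysis 0 [get_ports test_mode]

# Break a false combinational arc through a mux. Again, an SDF file
# written before this command knows nothing about it.
set_disable_timing [get_cells u_mux] -from S -to Y
```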
With modern processes, the whole STA problem has gotten very complicated,
especially with signal integrity. I don't think you can separate the timing
calculations from the constraint checking any more. It all needs to be in
the same tool.
The days of using PT as a glorified spreadsheet ("abacus mode") are over.
- Paul Zimmer
Zimmer Design Services Roseville, CA