( ESNUG 538 Item 1 ) -------------------------------------------- [03/28/14]
Subject: Readers on ICC II, ATOP, CDNS EDI, upcharges, Z-Route, 24 months
ICC II SCOOP: So 4 days after I announced ICC II on DeepChip.com, exactly as
predicted, Aart de Geus announced ICC II in his SNUG'14 keynote. (Yay, me!)
From reader feedback, it appears the ICC II technical details that my spies
reported were very close, if not spot on. All of Aart's talk about "better
throughput" and multi-CPU stuff is clearly aimed at knocking down Atoptech.
But, to be fully honest, my spies didn't predict the strong "floorplanning"
spin that SNPS put in the ICC II launch. On cue, Imagination/STmicro/LSI
all gushed about how ICC II's super-fantabulous floorplanning sped up their
design process by 7X to 10X -- which means Aart is positioning ICC II to
also attack Cadence EDI's killer First Encounter floorplanning franchise.
Other than "requalify" and "ICC II is 24 months out" reactions, plus some
Z-Route tech talk, the most common reader comment now is: "How much will
this cost us?"
---- ---- ---- ---- ---- ---- ----
> - new data model. Old ICC had two separate internal data models pre-CTS
> and post-CTS. Basically PhysOpt plus Astro inside. The new ICC II has
> one data model that's a layer that unifies the two data models -- means
> a bit less memory use / minor speed-up because you don't duplicate data.
>
> - from http://www.deepchip.com/items/0537-10.html
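Purely to illustrate the idea in that quote -- this is my own toy sketch
in Python, with invented class names, not anything resembling actual ICC
or ICC II internals -- here is why carrying one design object instead of
separate pre-CTS and post-CTS copies saves memory:

    # Toy illustration only -- invented names, not Synopsys code.
    class SplitModelFlow:
        """Old-style flow: the post-CTS step keeps its own full copy."""
        def __init__(self, cells, nets):
            self.pre_cts = {"cells": list(cells), "nets": list(nets)}
            # Hand-off to the post-CTS engine duplicates the whole
            # design, then builds the clock tree on top of the copy.
            self.post_cts = {"cells": list(cells) + ["CLKBUF_1"],
                             "nets":  list(nets)  + ["clk_net_1"]}

    class UnifiedModelFlow:
        """ICC II-style idea: one design object, CTS edits it in place."""
        def __init__(self, cells, nets):
            self.cells, self.nets = list(cells), list(nets)
        def run_cts(self):
            self.cells.append("CLKBUF_1")  # no second copy of the design
            self.nets.append("clk_net_1")

    cells, nets = ["u0", "u1", "u2"], ["n0", "n1"]
    flow = UnifiedModelFlow(cells, nets)
    flow.run_cts()          # same object holds pre- and post-CTS state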
Hey John,
I wanted to add some background on your 'Scoop II' rumor concerning
ICC II which was officially announced today -- you have a good hit
rate on the rumors BTW.
1. This is about Atoptech. The only two tools I hear about
being used at 20 nm or 16 nm are ICC and ATOP. I see no
EDI and, sorry Mentor, ST does not count because they're
from France.
2. This is a premature launch. If you listen to the ATOP
folks, they always beat ICC technically. The only reason
they lose is because of price. And since ATOP has had
some recent Top 10 semiconductor company victories against
plain vanilla ICC, it's safe to say that ICC II is still
many months away from being ready for a competitive fight.
Aart's ICC II announcement does mention early customer
trials, but ICC II so far still has not caught up to ATOP.
Anirudh Devgan, who now has the whole IC physical design
charter at Cadence, plans to revamp First Encounter the
same way he did with Tempus - with internal R&D only, no
acquisitions - so do not expect Cadence to be a factor
for some time either.
3. It's about ICC's current problems. The announcement
indicates that ICC II addresses the litany of end user
complaints about ICC, some of which you already hit
on: memory footprint issues, can only handle 1 or 2
scenarios max at a time, weak clock tree routing, and
slow overall run time.
So I am not so sure that a new version of ICC is going to open up a
whole new round of benchmarking so much as try to build a fortress
against ATOP and keep Cadence pinned back -- both for First Encounter
and Tempus.
---- ---- ---- ---- ---- ---- ----
Hi, John,
The EE Times article on ICC II mentions the Magma Volcano database
concept. A concept is not source code. Because the Volcano data
structures fundamentally differ from Milkyway and .db, one simply
can't insert Hydra or Talus source code into ICC II.
It's like trying to insert FORTRAN source into C++ source.
You can take "concepts" in, but they require a full code rewrite.
---- ---- ---- ---- ---- ---- ----
We saw the ICC II demo live. It was all about floorplanning.
It placed blocks, moved blocks around, shaped blocks. Showed
design exploration at the block level only. I think this was
Magma Hydra. The database looked like Volcano.
They showed nothing else. No placer. No MCMM. No routing.
Told us more was to come for those aspects in Q1 of 2015.
---- ---- ---- ---- ---- ---- ----
---- ---- ---- ---- ---- ---- ----
---- ---- ---- ---- ---- ---- ----
> If this ICC II rumor is true, what's interesting is it creates a mass
> call for rebenchmarking of all P&R tools. That is, if your engineers
> have to spend time requalifying ICC II, why not also benchmark current
> revs of Atoptech, CDNS EDI, & MENT Olympus-SoC, too, since your staff
> will all be in benchmarking mode anyway?
>
> - from http://www.deepchip.com/items/0537-10.html
Yes, since Synopsys is marketing ICC II as a new product with lots
of new bits, by definition it will need to be tested and benchmarked,
if only against old ICC, and then purchased.
---- ---- ---- ---- ---- ---- ----
As you note, this event has opened the door for Dorado, ICscape and
MENT Olympus. MCMM benchmarks are also where ATOP clearly shines.
And for floorplanning, it makes the Tempus/First Encounter pair much
more effective (provided you can live in EDI).
---- ---- ---- ---- ---- ---- ----
Pardon the pun, but ICC II isn't ready for primetime yet.
We can't benchmark SW that's not even delivered yet.
---- ---- ---- ---- ---- ---- ----
---- ---- ---- ---- ---- ---- ----
---- ---- ---- ---- ---- ---- ----
> - old Z-Route. Still the same and will be kept, but doesn't Z-Route use
> an old/different routing model??? It still starts over with a new
> route instead of what was assumed in the pre-optimization stage.
> Convergence is still an issue.
>
> - from http://www.deepchip.com/items/0537-10.html
Hi, John,
You brought up the question of how it is possible to keep Z-Route
and still do all of the updates and performance improvements at the
ICC II database level. A couple of comments and guesses.
Both the Magma and the Avanti teams (now at Synopsys) have a lot of
experience re-writing place and route tools and databases from
scratch under duress -- all while preserving tool command
compatibility. :)
The 10X performance numbers and 5X memory reductions Synopsys is
describing for ICC II look largely like the type of improvements you
get from a solid database clean-up, making the tool thread-tolerant,
and making better use of disk vs. in-memory data structures.
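A hedged, generic illustration of what those last two items can look
like in practice -- nothing below is ICC II code, just the flavor of
trick the reader is describing, sketched in Python with a memory-mapped
array and a thread pool:

    # Generic sketch, not ICC II internals: (1) keep bulk, rarely-touched
    # data in a disk-backed (memory-mapped) array so it isn't all resident
    # in RAM, and (2) let several threads run read-only queries in parallel.
    import numpy as np
    from concurrent.futures import ThreadPoolExecutor

    N_CELLS = 1_000_000                      # pretend-size placement data

    # (1) Disk vs. in-memory: pages are faulted in only when touched.
    coords = np.memmap("cell_coords.bin", dtype=np.float32,
                       mode="w+", shape=(N_CELLS, 2))

    def bbox_of_region(lo, hi):
        """Read-only query; safe for many threads to run at once."""
        chunk = coords[lo:hi]
        return chunk.min(axis=0), chunk.max(axis=0)

    # (2) Thread-tolerant read path: split the query across 8 workers.
    step = N_CELLS // 8
    with ThreadPoolExecutor(max_workers=8) as pool:
        boxes = list(pool.map(lambda lo: bbox_of_region(lo, lo + step),
                              range(0, N_CELLS, step)))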
---- ---- ---- ---- ---- ---- ----
Hi, John,
Some of the floorplanning and clock tree algorithms look to have
been updated inside ICC II, but this is known territory. I am
hazarding a guess here, but Z-Route probably has its own smaller
run-time data structures that are extracted from the primary layout
database on-the-fly, with results then written back in. That is
the only way to keep the run-time memory footprint reasonable.
With some careful engineering, Z-Route could be reconnected to the
new db.
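To make that guess concrete -- the names and data layout below are
invented purely for illustration, nothing here is real Z-Route or
ICC II code -- the extract / route / write-back pattern looks roughly
like this:

    # Illustrative sketch of the pattern guessed at above (invented names).
    layout_db = {                            # stand-in for the primary DB
        "pins":      [{"net": "n1", "xy": (10, 20), "region": "block_A"}],
        "blockages": [],
        "tracks":    {"block_A": ["M2", "M3"]},
    }

    def extract_routing_view(db, region):
        """Pull only what the router needs: pins, blockages, track grid."""
        return {
            "pins":      [p for p in db["pins"] if p["region"] == region],
            "blockages": [b for b in db["blockages"] if b["region"] == region],
            "tracks":    db["tracks"][region],
        }

    def route(view):
        """Stand-in for the real router; returns wire shapes per net."""
        return {p["net"]: [("M2", p["xy"])] for p in view["pins"]}

    def write_back(db, wires):
        """Commit the routed shapes into the primary database."""
        db.setdefault("wires", {}).update(wires)

    view = extract_routing_view(layout_db, "block_A")   # small, on-the-fly
    write_back(layout_db, route(view))                  # results go back in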
Also, Z-Route has been through some very serious qualification work
at all of the advanced process nodes (as in actual wafer runs) at
TSMC/GF/UMC/Intel -- you do not want to lose any of that.
---- ---- ---- ---- ---- ---- ----
John,
Re-hosting Z-Route is no big deal as it was already on its own
internal data model interfaced to Milkyway. Since the router
doesn't change the netlist, there is no tight integration needed
to the ICC II data model and static timing.
10x faster throughput and 5x less memory, really?
---- ---- ---- ---- ---- ---- ----
---- ---- ---- ---- ---- ---- ----
---- ---- ---- ---- ---- ---- ----
> Rumor was ICC II was to be launched at DAC'13 in Austin, but it
> wasn't ready then. Apparently it is now. From what I've heard,
> ICC II has ...
>
> - from http://www.deepchip.com/items/0537-10.html
Hi, John,
What we saw at SNUG for ICC II was very prelim. Mostly slideware.
It's still in the very early stages.
---- ---- ---- ---- ---- ---- ----
Hi, John,
I was excited to see the ICC II launch. Read multiple articles
on it. There's a bewildering array of numbers in the press
release stories: 10x for this, 5x for that, 2x less of something.
I dug through it wondering -- when can I use ICC II? Then I ran
across this story quoting a Synopsys suit:
Saleem Haider expects to see 10 to 15 design starts this year
with ICC II, including the existing early access customers.
"We have 7 or so active designs in different stages of tapeout
using the partial ICC II systems," he said. "We have some
customers who have done a partition with the early ICC II
software and they are looking to proliferate to other designs,
so I think in 2015 we will blow past the 20-30 designs range."
- EEtimes.com on Saleem Haider of Synopsys (03/24/14)
Um, Ok. Here are my confusions:
A) I probably won't get to use ICC II any time soon. Right
now Synopsys FAEs are running it for customers on-site
in "taxi cab" mode. The plan is to go 24 months before
full deployment. Why announce 2 years early?
B) They claim 5X, which I won't see for 2 years. Every P&R
vendor claims a ~50% runtime reduction every year.
Shouldn't I expect to get at least a 4x runtime speed-up
in old ICC in 2 years without switching to the New Thing?
(Quick arithmetic check below.) Or does this mean R&D
work on old ICC is now done?
C) No one will say how much more ICC II will cost.
Please tell me my math is wrong!
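For what it's worth, the 4x figure in point B does follow from simple
compounding -- a quick back-of-the-envelope check, assuming "50% runtime
reduction per year" really means runtime halves each year:

    # Back-of-the-envelope check of point B (nothing vendor-specific):
    # if runtime halves every year, after 2 years the remaining runtime
    # is 0.5 ** 2 = 0.25 of today's -- i.e. a 4x speedup on plain old
    # ICC, versus the 5x ICC II promises to deliver 2 years from now.
    yearly_reduction = 0.50
    years = 2
    remaining_runtime = (1 - yearly_reduction) ** years       # 0.25
    print(f"implied old-ICC speedup: {1 / remaining_runtime:.0f}x")  # 4x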
---- ---- ---- ---- ---- ---- ----
ICC II is still way off. It's good to see Synopsys R&D being
proactive instead of waiting for the competition to nudge them.
---- ---- ---- ---- ---- ---- ----
How much will this cost us?
---- ---- ---- ---- ---- ---- ----
I see P.O. paperwork in my future.
---- ---- ---- ---- ---- ---- ----
Have you heard of any pricing on ICC II?
---- ---- ---- ---- ---- ---- ----
Hi, John,
So the key question you need to ask about IC Compiler II is: what
up-charge will customers be required to pay?
The whole financial reason Synopsys made this investment is
1) to protect its key P&R market
2) to make more money
Given that Synopsys ICC is already top of the pack with 70% market
share, what premium can Synopsys command for its replacement?
This same Prisoner's Dilemma faced Synopsys during the transition
from PhysOpt/Astro to ICC. While ICC was superior in many aspects,
the customer base was wary because we knew that SNPS Sales was
looking to increase sales with new technology.
From my perspective I have to say "yea" to Aart because he made the
investment in customer capital to create the next generation P&R.
If ICC II is indeed better then it will succeed over time. If
ICC II is not appreciably better as defined by the customer base
then it will be resisted both internally and externally.
---- ---- ---- ---- ---- ---- ----
I can't get any of my SNPS reps to price out ICC II for us.
---- ---- ---- ---- ---- ---- ----
I've already told my mgmnt to expect a call from Synopsys Sales.
---- ---- ---- ---- ---- ---- ----
Now how much?
---- ---- ---- ---- ---- ---- ----
Our CAD department cringes at the word "upcharge".
---- ---- ---- ---- ---- ---- ----
ICC and ICC II are two separate tools, yes?
Or do we buy ICC and then buy ICC II, too?
How much does the addition cost?
---- ---- ---- ---- ---- ---- ----
Isn't this what we pay support for? Shouldn't support cover this?
---- ---- ---- ---- ---- ---- ----
---- ---- ---- ---- ---- ---- ----
---- ---- ---- ---- ---- ---- ----
Surprised to see Aart cite a Magma pedigree to his new ICC.
Rajeev's R&D spread everywhere after the SNPS-LAVA merger.
Atoptech, EDI, and Olympus have similar Magma pedigrees.
---- ---- ---- ---- ---- ---- ----
I worked at LAVA when we split our R&D to be half BlastFusion and
half Talus development. It almost killed the company.
I hope Antun is ready. He is going to get a LOT of high-level
exec customer phone calls on exactly what's implemented in ICC I
(which users will get for free) and what's implemented in ICC II
(which users must pay extra for).
---- ---- ---- ---- ---- ---- ----
When I asked Synopsys last Friday morning if the ESNUG post made
their embargo invalid, it was clear I had touched a nerve in their
organization. They were absolutely adamant that the embargo was
still valid, and if I refused to promise to honor it, not only
would I not receive my pre-briefing later that day, I would never
receive another pre-briefing from Synopsys going forward. Ever.
- Peggy Aycinena, EDAcafe.com (03/24/14)
---- ---- ---- ---- ---- ---- ----