( DAC 01 Item 14 ) --------------------------------------------- [ 7/31/01 ]
Subject: Verisity Specman 'e', Synopsys Vera, Forte/Chronology RAVE
SOMETHING'S WRONG: At the LA 2000 DAC, the only DataQuest numbers available
for the Specman vs. Vera vs. RAVE battle were for the 1998 fiscal year.
Here are the actual numbers:
Fiscal Year 1998 Market Share
Total 1998 Market: $13.6 million
Verisity Specman & 'E' ####################################### 77.8%
Synopsys Vera ######### 18.5%
Chronology RAVE ## 3.9%
Naturally, at the time, Verisity went around bragging how they owned 78% of
their market. I don't blame them. I would, too, if I were them. Clearly
distressed, Synopsys appears to have contracted Ron Collett to do some
spontaneous market research, because a month later Ron put out a press
release with a study he had done just on Vera users. That press release
reported "VERA usage was five times greater than that of competing testbench
languages". The odd thing about this was Collett never mentioned HOW MANY
Vera users he found. His 5X claim holds true if he found 10 Vera users and
only 2 Specman users. Anyway, it took the heat off Vera for a bit, until
Gary Smith released the 1999 DataQuest numbers:
Fiscal Year 1999 Market Share
Total 1999 Market: $22.4 million
Verisity Specman & 'E' ################ 33%
Synopsys Vera ########################### 54%
Chronology RAVE ###### 13%
So it appeared that Synopsys had stolen this niche from Verisity. It got
more interesting if you did some math. Verisity made .778 x 13.6 = $10.6
million in 1998. It made .33 x 22.4 = $7.4 million in 1999. Verisity
not only lost market share, they also lost $3 million in revenue! Hmmm...
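The arithmetic above is easy to sanity-check; here is a quick Python sketch using the DataQuest figures quoted above:

```python
# Verisity revenue implied by the DataQuest market-share figures above.
rev_1998 = 0.778 * 13.6   # 77.8% of the $13.6M 1998 market
rev_1999 = 0.33 * 22.4    # 33% of the $22.4M 1999 market

print(f"1998: ${rev_1998:.1f}M")             # $10.6M
print(f"1999: ${rev_1999:.1f}M")             # $7.4M
print(f"drop: ${rev_1998 - rev_1999:.1f}M")  # roughly $3M
```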
Now it gets even more interesting. Janick Bergeron of Qualis Design (who
runs a 2,750-member mailing list on verification) released on July 7th
this survey data:
"In the period from May 15th to May 31st, what was your primary
implementation medium for your functional verification
testbenches?
No of % of No of % of
Tool Respondents Domains
--------------------------------------------------
BestBench: 21 3.5% 12 4.6%
C/C++: 23 3.8% 17 6.5%
RAVE: 10 1.7% 4 1.5%
Specman: 368 61.5% 110 42.3%
Superlog: 2 0.3% 2 0.8%
SystemC: 2 0.3% 2 0.8%
TestBuilder: 2 0.3% 2 0.8%
VERA: 68 11.4% 38 14.6%
VHDL: 43 7.2% 23 8.8%
Verilog: 48 8.0% 41 15.8%
other: 11 1.8% 9 3.5%
--------------------------------------------------
TOTAL: 598 260
Commercial tools only:
No of % of No of % of
Tool Respondents Domains
--------------------------------------------------
BestBench: 21 4.5% 12 7.2%
RAVE: 10 2.1% 4 2.4%
Specman: 368 78.5% 110 66.3%
Superlog: 2 0.4% 2 1.2%
VERA: 68 14.5% 38 22.9%
--------------------------------------------------
TOTAL: 469 166
Scientifically, I cannot conclude that Verisity has 79% of the HVL
market share. But qualitatively, I can only conclude that Verisity
is the current market leader."
- Janick Bergeron of Qualis Design
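The percentages in Janick's two tables can be cross-checked against the respondent counts; a quick sketch using the commercial-tools-only figures quoted above:

```python
# Respondent counts from the commercial-tools-only table above.
counts = {"BestBench": 21, "RAVE": 10, "Specman": 368,
          "Superlog": 2, "VERA": 68}
total = sum(counts.values())  # 469, matching the table's TOTAL line

for tool, n in counts.items():
    print(f"{tool}: {100 * n / total:.1f}%")  # Specman comes out at 78.5%
```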
I know and like Janick. He's a good guy. In my gut I don't think he cooked
the numbers in his survey -- and, more importantly, even though I didn't do
a count in my DAC trip report survey data, I did remember more pro-Specman
user e-mails than pro-Vera users e-mails. That is, Janick's data jives a lot
more with my experience than Gary Smith's and Ron Collett's data. Hmmm....
And, as usual, Forte/Chronology's RAVE is still plugging along with its 10%
market share.
"I think some of the Verification languages are useful, but only because
people do not know how to use Verilog and VHDL to design procedural
BFM's and test benches, which is easy to do. Most BFM's I see are file
driven, so these languages are better than file I/O, but not a lot
better than real procedural Verilog and VHDL tests. Vera and E take
advantage of the ignorance of people in using Verilog and VHDL
languages to the fullest."
- [ An Anon Engineer ]
"d) Forte: Chronology Quickbench. This company is a merger of
Chronology and the Giga Tools. Might be adding e and Vera
integration (as customers request). Uses RAVE as their HVL.
Rave is Perl-ish. Rave now has coverage constructs."
- Peet James, Qualis Design
"GigaScale Methodology & QuickBench by Forte Design
--------------------------------------------------
Forte's GigaScale Hub integrates HDL models, C/C++ models, transaction
generators, and monitors into a single environment. We have
created a similar environment using PLI routines."
- Henry So of Mobilygen, Inc.
"We own Forte's (Chronology's) Quickbench. We have been very happy
with the tool and even happier with the Quickbench support. They are
very attentive -- and probably need to be since they are trying to
compete with the monolithic Verisity.
Our designers really like Rave since it's PERL based. It nonetheless
has been a steep learning curve. After 6 months we are just now
tapping the power of the tool. We have also started developing a host
SCSI model. This seems to be one thing that is missing from this whole
methodology -- there is no company that develops and sells target and
host models for different protocols. Granted Verisity and Forte have
some models for the easy ones like AHB and some of the communications
protocols, but complex protocols like SCSI, S-ATA and PCI have been
their big stumbling blocks."
- [ An Anon Engineer ]
"Forte Systems:
Their new Gigascale Hub provides a nice framework for gluing HDL,
Rave, and C++ together for system level modeling. Their most
fascinating offering was a new simulation analysis tool called
Perspective. Perspective has some features found in other tools but
has some unique functionality not found elsewhere. One of Perspective's
features is Functional Coverage. It looked promising although it was
clearly a work in progress and lacked some of the sophistication found
in Specman's functional coverage language and visualization tools.
The really exciting part of Perspective was twofold.
1. Their transaction level visualization tool that offers the
designers the ability to really visualize & debug at the
transaction level. What they offer is more than what Specman
offers via integration with traditional wave form viewers and
even more interesting than the SignalScan Tx technology offered
by Cadence. I was impressed by this aspect.
2. Furthermore, they have temporal checking capabilities in this
tool. Signal level temporal expressions have been supported by
tools like Specman for a number of years but they are doing this
as well as raising the abstraction level of temporal checking to
the transaction level. They showed demos of using this feature
to do transaction level temporal checks on packets through a
router and also to do transaction level scoreboarding. This
concept is new and is beyond anything that is currently offered by
Verisity and the other testbench automation companies. I plan on
keeping an eye on this.
Forte looks to be an up and coming star in the EDA market and I expect
to see a lot of innovation from them in the coming years."
- Sean Smith of Cisco Systems
"One of our projects uses Rave (Quickbench). We think it does the job.
The other uses Specman, due to the preference of the manager. No
feedback on that."
- Paul Schnizlein, Agere Systems
"Spent only about a week looking at Verisity's Specman. My conclusion
was that in most of the designs I deal with, the added verification
environment complexity has little or no payback over direct Verilog
testbenches. However, I can foresee designs which have regular data
structures like network packets or processor instruction sets where
the Specman approach can provide some payback during verification."
- Tom Loftus, Intrinsix
"We have used Verisity on a few test projects. We love the concepts
and principles behind the tool. Anyone who spends time with Specman
ends up with a richer vocabulary of testing ideas. Ironically, using
Specman made us better C/Verilog implementers and we routinely use
constraint-based pseudo-random stimulus generation in our C code in
our co-simulations. Also, as a group, we do not segregate into Design
vs. Verification vs. Test Engineers. Perhaps this does not encourage
engineers to learn 3 languages (C, Verilog and Specman) vs. 2 languages
(C and Verilog). So, we are "warm" to Specman but it is not standard."
- Tom Coonan, Scientific Atlanta
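Coonan's group does this in C; purely as an illustration of what "constraint-based pseudo-random stimulus generation" means, here is a minimal Python sketch (the packet fields and constraints are invented for the example):

```python
import random

def gen_packet(rng):
    """Draw random packets, retrying until one satisfies the constraints."""
    while True:
        pkt = {"length": rng.randint(0, 1500),
               "kind": rng.choice(["data", "ctrl", "mgmt"])}
        # Constraints: no zero-length packets; control packets stay short.
        if pkt["length"] == 0:
            continue
        if pkt["kind"] == "ctrl" and pkt["length"] > 64:
            continue
        return pkt

rng = random.Random(42)           # seeded, so runs are reproducible
stimulus = [gen_packet(rng) for _ in range(100)]
```

The rejection loop is the crudest possible constraint solver; tools like Specman and Vera do this far more cleverly, but the idea is the same.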
"We evaluated Vera last year and thought it would be very useful. One
team said they would use Vera so we purchased some keys, but to date
no one has used it for more than learning/playing. We keep hearing
from groups that seem to need it, but maybe they can't afford the ramp
up time?"
- [ An Anon Engineer ]
"If we do anything it'll be Superlog based."
- Mike Carter of Mosaid Technologies
"Vera Future stuff: There is a Vera Book by some Cisco guys that is
coming out soon."
- Peet James, Qualis Design
"What's the difference between C++ with class libraries and Vera/E? Not
much, except C is compiled and Vera/E are interpreted (currently). We
have used Vera 5+ years and been happy with it."
- [ An Anon Engineer ]
"Synopsys' Vera has a fairly bad reputation here. We are impressed
with Specman's capabilities but not enough to buy it. We use TCL
instead, which is free."
- Carl Wakeland, Creative Advanced Technology Center
"I had a brief encounter with Specman and it was not a nice one. Since
the thing is object oriented, the testbench must be laid out very
carefully right from the beginning. If that is not the case, the whole
project may end up in the trash. Otherwise good idea and I see the
benefit of the tool. I am a big fan of random testbenches, but I
write them all in Verilog myself."
- [ An Anon Engineer ]
"I really like the capabilities of Vera in the area of object oriented
code and reentrant tasks. Vera is really giving Verilog designers C++
capabilities. However, it runs slower than Verilog. One of our
simulation environments actually spends 70% of its simulation cycles
in the Vera testbench and only 30% simulating the device under test.
Give me these capabilities with a testbench overhead of 10 - 15% and
I will be very happy."
- Dan Joyce, Compaq Computers
"I do not know how Vera compares because we bought Specman before
Synopsys bought Vera and Specman had a technological advantage. I
also am still receiving mixed messages from Synopsys on Vera. The
big downside to Specman is the cost. I need a Specman license, a
VCS/NC-Verilog (yes, we can use either) and a Denali per simulation.
This is very expensive. Currently this is the only reason I would
switch from Specman to Vera."
- [ An Anon Engineer ]
"We're evaluating them (can't say we're pioneers here.) Having to have
a Specman license per VCS license, though, makes it expensive for a
small company."
- Kris Monsen of Mobilygen Corp.
"Specman is the most powerful verification solution available today.
They continue to pioneer new features that everyone else is scrambling
to copy. New innovations in Vera are like reading Specman's feature
sheets from 1-2 years ago. All these tools represent big improvements
over HDL's and C-based approaches."
- Sean Smith, Cisco Systems
"No strong opinion here. I suspect we'll see islands of this stuff for
a while. Perhaps Superlog will be the answer."
- Paul Zimmer of Cisco Systems
"Looked at Specman and understand and have written some. We have gone
back to PERL to generate our testbenches. Consider that we must now
write in Verilog, VHDL, PERL, TCL, C. All we need is to maintain
knowledge in another language. E is not a trivial language, but for
our class of designs we can easily (more so than in E) create test
environments and run these tests."
- Dave Brier, Texas Instruments
"If you are doing 100% verification I think Verisity's Specman can be
useful. The verification language market suffers from the same problem
as the "C" market, namely fragmentation and too many vendors pushing
their own solution."
- Anders Nordstrom of Nortel
"I've used Vera for four years. My reasons for getting it were
1. Everyone makes mistakes. I think a particular person makes about
the same number of coding mistakes per line in any language. If
your code can be written in fewer lines, you get fewer bugs. That
means I get to the design bugs faster.
2. I can recruit from a large pool of people who can write in C.
3. My testbenches are as complex as the chip they test. With Vera
or C, they use only 1/2 the CPU that VCS uses on my design,
instead of an equal amount.
4. I did not want to roll my own PLI.
I understand that Verisity suggests that you write random tests, run
them with a coverage tool, and write directed tests to fill the holes.
That would scare me, unless I knew that they had the best coverage
tool on the planet."
- Jeff Deutch, Avici Systems
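The Verisity flow Deutch describes -- random tests, a coverage tool, then directed tests to fill the holes -- can be sketched roughly as follows (the coverage bins and stimulus here are invented for the example):

```python
import random

# Hypothetical coverage bins a verification plan might define.
BINS = ["read", "write", "read_after_write", "back_to_back_write"]

def run_random_tests(rng, n):
    """Run n random operations and record which coverage bins were hit."""
    hit, prev = set(), None
    for _ in range(n):
        op = rng.choice(["read", "write"])
        hit.add(op)
        if prev == "write" and op == "read":
            hit.add("read_after_write")
        if prev == "write" and op == "write":
            hit.add("back_to_back_write")
        prev = op
    return hit

rng = random.Random(1)
covered = run_random_tests(rng, 50)
holes = [b for b in BINS if b not in covered]
print("coverage holes to target with directed tests:", holes)
```

Deutch's worry is about the last line: the flow is only as good as the coverage model, since a hole that is never defined as a bin is never reported.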
"I find Specman and Vera to be useful and I believe they will play an
important part in higher-level verification for years to come. The
question on my mind is, will Superlog eventually replace the need for
a separate high-level verification language? Will Specman and Vera
take on high level design language constructs in the future?
I attended the Introduction to Vera and Advanced Vera presentations in
the Synopsys suites. The intro presentation was very good. The
advanced presentation was the same as the intro presentation except
that more time was spent discussing how Vera tied into other Synopsys
tools (I guess that was considered advanced). I was disappointed."
- Cliff Cummings of Sunburst Designs
"I think Specman and Vera are both very useful once your mindset is
moved over to their sweet spot. The issue for us is changing the
group mindset. This is the one area of DAC that drove me the
most, and we will take the next step with it. We intend on evaluating
Specman and Vera, then moving over to the winner."
- Phil Kuglin, Credence Systems Corp.
"It seems that Specman Elite can be incorporated into our verification
environment and speed up the block level verification. It should be
further evaluated using our designs."
- Henry So of Mobilygen, Inc.
"In these times there are no funds to buy additional Vera/Verisity point
tools. Also, while verification languages provide the verification
engineer with his/her own language that can increase productivity
and effectiveness, it is yet another language to learn. What is the
matter with Verilog??"
- Richard Lowry of Starburst Technologies
"We do not currently use these, but I have seen Specman used with great
success by our customers and at other companies. I am concerned about
cost, simulation performance, and overhead of learning "E". I noticed
that both Avery/VCK and Co-Design's Superlog environment provide useful
extensions for test development, including random tests, without having
to go through PLI. That would be my preference."
- [ An Anon Engineer ]
"They're a powerful means to automate boundary condition testing but "e"
apparently sucks!"
- Michael Hede, MindSpeed
"We currently use Verisity's Specman for verifying algorithm intensive
blocks. Tool seems to work very well. My only concern is that writing
E is a separate effort totally disconnected from early system level
verification."
- Vladimir Sindalovsky of Agere
"We find and fix bugs with both Specman and Vera. They beat just
writing in Verilog. But, why for heaven's sake doesn't Verilog do a
better job of supporting design verification? Looks like EDA vendors
may want to sell new tools rather than "enhance" old ones."
- John Szetela of AMD
"Synopsys' Vera is used here. But the users need to learn a new
language before they can create a testbench in Vera. This is stupid
in a way. But over a long development interval, it does help
auto-check our testbench.
The best solution is no new language for users. What they
should have to do is only some push-button-like configuration, I believe."
- Jeong-Fa Sheu of Acute Communications