( DAC'19 Item 10a ) ----------------------------------------------- [03/04/20]
Subject: Cadence Perspec PSS "smorgasbord" strategy is Best of 2019 #10a
PERSPEC TIES WITH INFACT: A big upset this year in the PSS Wars. For 2017
and 2018, Cadence Perspec easily beat out Mentor InFact with the users, with
Anirudh's Perspec ranking #1 and #2 those years.
http://www.deepchip.com/items/dac18-02a.html
http://www.deepchip.com/items/dac17-01a.html
But this year's user comment word count on PSS tools broke out to:
Cadence Perspec 2019: ################################# (2,145 words)
Mentor InFact 2019: ################################ (2,082 words)
Breker TrekSoC 2019: ## (119 words)
Putting both Perspec and InFact into a statistical tie for 1st place; and
poor Breker TrekSoC is left behind in a waaaaaay distant 3rd place. My
analysis is Perspec is still going very strong with the users -- but InFact
is moving up by doing a very different task that only *involves* PSS.
(See DAC'19 #10b for details.)
---- ---- ---- ---- ---- ---- ----
SHINY NEW THING: The holy grail of PSS -- the idea of a *portable* set of
tests used across every stage of chip design & implementation -- is very
alluring to chip engineers. Cadence Perspec users continue to do exactly that.
The sexy "new" thing that Anirudh's R&D added to his Perspec PSS this year
is a new native C API to drive VIP on UVM-based testbenches plus the
top-level SoC verification environment.
---- ---- ---- ---- ---- ---- ----
THE SMORGASBORD STRATEGY: In the user comments here you'll see Perspec offers
a smorgasbord of things it can do from pre-silicon SystemVerilog simulation
to post-silicon; all using one set of tests within one contiguous flow.
That is, Anirudh is taking the "we do it ALL" approach to PSS.
"Our current plan is to leverage Perspec's generated tests on
Xcelium and Palladium."
---- ---- ---- ---- ---- ---- ----
QUESTION ASKED:
Q: "What were the 3 or 4 most INTERESTING specific EDA tools
you've seen this year? WHY did they interest you?"
---- ---- ---- ---- ---- ---- ----
Cadence Perspec
Perspec lets us generate system-level scenarios based on a model of the
scenario state space in a standard language (PSS).
We rely on Accellera's PSS standard, and do not have any SLN legacy.
Our main interests for using Perspec are:
- Modeling complex system level scenarios on our C-centric
platform where we primarily develop directed tests.
- Perspec introduces randomization at the system level, which
would be tedious and time consuming to address with other
verification methodologies.
- Easing the modeling of complex scenarios on top of a UVM testbench.
- We want to reuse PSS models between our IP testbenches in UVM
and our top level C-centric platform.
Perspec also allows us to collect some functional coverage on the scenarios.
Modeling
Overall, I am expecting a 3x efficiency improvement in scenario modeling
with Perspec.
PSS modeling is quite straightforward, especially if you have some HVL
background. "Specmaniacs" will love its simplicity versus modeling
sequences with SV/UVM.
Perspec has a new PSS-compliant model library for common processor
actions and memory operations, with code examples for PSS modeling. We
have not yet used it, so we don't know if it's any good or not.
Other stuff we like about Perspec
- Native C API. We are using Perspec's native C API to drive our
VIP on UVM-based testbench and top level SoC verification.
- Portability. Our current plan is to leverage Perspec's generated
tests on Xcelium and Palladium. We do not yet have plans for FPGA
or post silicon.
- Constraint Solving. Perspec has a good constraint solving
mechanism, which is very close in flexibility to Specman. It
is missing path constraints at this stage.
- Indago debug integration. This might be useful but is not yet
widely used by our team.
I'd recommend Perspec, both for generating system-level C tests and for
driving complex sequences on top of a UVM testbench. The first tests can
be ready in only a few days.
---- ---- ---- ---- ---- ---- ----
We use Cadence Perspec for design verification.
One way we use Perspec is to verify, at an earlier stage, unexpected
coherency conditions that might occur in the system. Before we had Perspec,
reproducing these rare conditions required super long iterative Verilog
simulation tests in VCS. So, we could only realistically test for
these conditions during post-silicon software validation.
- With Perspec, we can now run those tests much earlier -- during
simulation.
- We run many scenarios in parallel at a low cost to compensate for
the very slow execution speed of a single VCS simulator.
- By simplifying the features and conditions we want to verify as
much as possible, we can achieve our desired result during Xcelium
simulation by running many short tests.
Test Portability and Reuse
Testing portability across different verification platforms is very
important to us. Through Perspec's SLN file, our DV team's Xcelium
verification information can be reused by our software validation team
using Palladium Z1, and by our product development team in a silicon
environment.
This PSS scenario portability is a great help when our teams using
different platforms need to share and analyze a particular scenario.
Perspec Library
It was very impressive to control PCIe VIP through the Perspec PCIe
library. In general, it is very difficult to control the operation of
external PCIe devices implemented as VIPs from C code running on the
CPU. However, the Perspec PCIe library provides DPI (Direct Programming
Interface) for controlling the external PCIe VIP (of course, only for
Cadence VIP). This allowed us to control the VIP to perform the
desired behavior at the desired time, thus creating various verification
scenarios for PCIe.
Bringing up the first tests
Perspec generates the test code according to the scenario described in
PSS. You then create a compilable file using the prepared system
initialization code. This file is compiled with the software
platform libraries you are using and built into an executable
image for the DUT.
All you need to do is fill in the Perspec-supplied initialization
template and link the Perspec actions to your
low-level APIs so that Perspec actions can access your hardware.
The time it takes depends on the architecture and complexity of your SW
platform. However, once this environment is set up, it is much faster
and easier to generate test code with various conditions than when
writing test code manually.
Safety Requirements
I strongly recommend using Perspec for reproducing specific target
conditions during pre-silicon level verification.
The need for higher performance and additional functionality has
increased the complexity of our coherency networks every year. This
added complexity always causes unexpected problems, such as resource
conflicts between multiple coherent masters when working concurrently.
- With mobile phones, these errors could be recovered with
a simple reboot.
- However, with high performance automotive SoCs these small
errors can be fatal to the driver.
In other words, electronic devices are no longer for convenience only. For
this reason, safety verification is very important to prevent errors caused
by unexpected conditions.
The PSS model reusability and constrained-random test generation made
it easy to generate tests with various conditions defined in safety
requirements.
PSS seems to be an optimized methodology for reproducing a specific
target condition. These advantages will make PSS essential for safety
verification.
We like that Perspec works with vManager and SimVision, and is becoming
more tightly integrated with Xcelium and Palladium.
---- ---- ---- ---- ---- ---- ----
Cadence Perspec. It supports the PSS standard very well and has
excellent graphical tools to help beginner and expert users alike.
Currently, we use Perspec for SoC tests running on cores in pre-silicon
and post-silicon environments.
We are working to adapt PSS for IPs to realize the full potential of
PSS.
---- ---- ---- ---- ---- ---- ----
Cadence Perspec
It has been over a year since PSS 1.0 was released. As users, what we
now need are easy-to-use tooling, libraries and examples to help with
adoption.
Perspec does a good job on the tooling side, no surprises there.
Ready-to-deploy PSS scenario models are equally important -- for
example, the PCI Express model. We see enormous productivity from such
standard PSS scenario models.
---- ---- ---- ---- ---- ---- ----
I've used Cadence Perspec for about a year now.
At the top level, here is how we use it.
- We create a model in Perspec, describing the components and
actions of the system we want to verify.
- We specify the test scenario.
- Perspec takes the model and the test scenario and generates a
solution that meets all the system constraints and the scenario
we requested.
- We generate the set of tests in a target language -- usually
C -- then compile and run them on our targets.
Portability
Perspec's model and tests work across different (Cadence) verification
tools/engines. This portability has value for us.
- Our main model is for emulation (Palladium).
- We also run the tests that Perspec generates across our virtual
platform.
- Plus, we plan to use the models on silicon.
We can also use Perspec in simulation, but that is less frequent. We
sometimes use simulation at the beginning of the project, before the
emulation and VIP models are ready.
Perspec vs. how we did it before.
It's difficult to quantify the time savings from Perspec, but I don't
think you can get to the same quality manually. We do have a team doing
manual tests, but they are focused on a specific feature or area.
Our project is an SoC. With Perspec we can write more interesting
scenarios that are more stressful, activating multiple libraries
simultaneously. We couldn't do as much with our hand-written
direct-test Verilog testbenches before.
This is because Perspec deals better with SOC complexity than writing
directed tests. E.g. with direct tests, you look at direct test A vs
direct test B, while the Perspec model understands A, B, and C together.
Perspec can handle the mix of different scenarios better. For example:
- If you have different IPs, Perspec can be used to test them
running all together.
- If you have multiple IPs and you want to activate them all
together, Perspec takes care of a lot of the details, such as
memory, CPU or other resource allocation.
It would be very difficult to manually come up with the tests needed to
mix these IPs together. You wouldn't necessarily even have thought to
mix them, plus doing so would take much more effort.
The result is better coverage.
Libraries
Perspec already has built-in support (e.g. libraries) that requires
very little upfront investment. Perspec has libraries for ARM, memory
modeling, system modeling, coherency, low power, and PCI Express. We
use all of them except PCI Express.
Then there are things you must invest in more heavily. For example, if
you have an outside IP that is proprietary or specific to a project.
The benefit of using Perspec libraries is they're tried and tested.
It's a good initial test suite to run whenever you want to test
something new, such as changes to the model or a new model release.
Debug
Without Perspec, you sometimes debug by adding print statements throughout
the code to show progress, i.e. when the tests get to a certain point.
Using Perspec, when it generates an action and the C code for it, it
automatically adds these debug hooks, and the prints appear in the text
files. The GUI also processes them and shows them graphically.
Perspec is also integrated with Cadence Indago; however, I don't use
Indago, as I prefer doing text-based debug.
Reduced Models
When doing verification, you don't always work on the full SoC all the
time. In particular, you don't always want to build the full SoC model
because it takes more emulation resources, which translates to money.
Also, for some features you need to bring up a model that has two
systems connected together (B2B model).
Creating legal tests for all the different model flavors (reduced, full,
B2B) is easy with Perspec, as Perspec has a good solution for describing
these different models, e.g. reduced vs. full.
It's only a matter of updating some tables describing what CPUs we have,
or updating tables that describe the memory setup. A lot of these
features to describe your model are already built in.
Coverage Analysis
Perspec hooks into Cadence vManager -- which we're used to -- so we get
a unified look and feel for all coverage analysis, for both functional
coverage and generation coverage.
We use vManager to run our regressions, but don't yet use it for
coverage.
Perspec's GUI
The Perspec GUI is very easy to use -- though because it has a lot of
new stuff to it, it takes time to master. Initially, when I was
debugging my generated test, I would open the text file and look inside
it, but now it is more efficient to use the GUI.
When you generate a solution for a scenario, you can look at the
solution in the graph and then search and highlight/color different
items in the solution, or color all actions per the CPU that runs the
action...
To help you understand what happened, when you select an action in the
solution, Perspec will point you to the generated code in the target
language. Also, understanding the scheduling or resource
constraints/dependencies in the solution is very easy as it is shown
in the solution graph.
An example: When I make changes to an existing model, I want to check
that it works correctly and that I'm not breaking things.
- I know interesting scenarios that should work, so I will take
a scenario and further constrain it, to confirm it works
correctly.
- I might change the way I model the memory, take a scenario and
look at two memory blocks to make sure it finds a solution
with each of them.
It is very easy to iterate on taking a scenario, adding / removing
constraints, generating a solution, checking the results and making any
needed fixes.
Constraint Solving
When you write a scenario, you can use Perspec's GUI to specify it,
or write it in text. Usually I use the GUI -- unless I am taking
something that already exists and just want to make a small
modification, in which case going with text may be better.
I'm told using the GUI to write the scenarios to generate constraints
should prevent a lot of manual errors. But I'm a text guy, so I can't
confirm that. Either way, if Perspec can't solve your constraints,
it gives you an error message.
Conclusion
I would definitely recommend Perspec to someone verifying an SoC with
multiple IPs.
---- ---- ---- ---- ---- ---- ----
1. JasperGold
2. Perspec
3. Protium
---- ---- ---- ---- ---- ---- ----
Related Articles
Cadence Perspec PSS "smorgasbord" strategy is Best of 2019 #10a
MENT InFact ties with CDNS Perspec for PSS is Best of 2019 #10b
And weak user turnout for Breker this year is Best of 2019 #10c