
Saturday, June 27, 2020

Arrays, Dynamic Arrays, Queues: One List to Rule them All


Randomizable lists are, of course, very important in modeling more-complex stimulus, and I've been working to support these within PyVSC recently. Thus far, PyVSC has attempted to stay as close as possible to both the feature set and the look and feel of the SystemVerilog features for modeling constraints and coverage. With randomizable lists, unlike other features, I've decided to diverge from SystemVerilog. Keep reading to learn a bit more about the capabilities of randomizable lists in PyVSC and the reasons for diverging from the SystemVerilog approach.

SystemVerilog: Three Lists with Different Capabilities
SystemVerilog is, of course, three or so languages in one. There's the synthesizable design subset used for capturing an RTL model of the design. There's the testbench subset that is an object-oriented language with classes, constraints, etc. There's also the assertion subset. These different subsets of the language have different requirements when it comes to data structures. These different requirements have led SystemVerilog to have three array- or list-like data structures:

Fixed-size arrays, as their name indicates, have a size specified as part of their declaration. A fixed-size array never changes size. Because the array size is captured as part of the declaration, methods that operate on fixed-size arrays can only operate on arrays of a single size.

The size of dynamic-size arrays can change across a simulation. The size of a dynamic-size array is specified when it is created using the new operator. Once a dynamic-size array instance has been created, the only way to change its size is to re-create it with another new call. Well, actually, there is one other way. Randomizing a dynamic-size array also changes the size.

The size of a queue is changed by calling methods. Elements can be appended to the list, removed, etc. A queue is also re-sized when it is randomized.


PyVSC: One List with Three Options
If you've done a bit of Python programming, you're well aware that Python has a single list type. Python's list is closest to SystemVerilog's queue data structure. My initial thought on supporting randomizable lists with PyVSC was just to create an equivalent to the list and be done. But then I thought a bit more about use models for arrays in verification. Each SystemVerilog array type represents a useful use model, but there's also another use model that I've never properly figured out how to easily represent in SystemVerilog. Fundamentally, there are three use cases for randomizable lists:
  • List with non-random elements
  • List with random elements, whose size is not random
  • List with random elements, whose size is random
When a list whose size is not randomizable is modified by appending or removing elements, its new size is preserved when the list is subsequently randomized.

Here are a few examples.

@vsc.randobj
class my_item_c(object):
    def __init__(self):
      self.my_l = vsc.rand_list_t(vsc.uint8_t(), 4)

The example above declares a list that initially contains four random elements.

@vsc.randobj
class my_item_c(object):
    def __init__(self):
      self.my_l = vsc.randsz_list_t(vsc.uint8_t())

    @vsc.constraint
    def my_l_c(self):
        self.my_l.size in vsc.rangelist((1,10))
The example above declares a list whose size will be randomized when the list is randomized. A list with randomized size must have a top-level constraint that specifies the maximum size of the list. Note that in this case the size of the list will be between 1 and 10.
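To build intuition for the semantics, a solver typically handles a random-size list by first choosing the size within the constrained range, then solving for that many elements. The toy sketch below illustrates the idea in plain Python (it is not the PyVSC implementation, and `randomize_sized_list` is an invented name):

```python
import random

def randomize_sized_list(min_sz, max_sz, elem_max=255):
    # First choose the size within the constrained range, then
    # randomize that many 8-bit elements.
    size = random.randint(min_sz, max_sz)
    return [random.randint(0, elem_max) for _ in range(size)]

my_l = randomize_sized_list(1, 10)
```

Every call produces a list with between 1 and 10 elements, mirroring the `size in vsc.rangelist((1,10))` constraint above.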

If you wish to use a list of non-random values in constraints, you must store those values in an attribute of type list_t. This allows PyVSC to properly capture the constraints.
@vsc.randobj
class my_item_c(object):
    def __init__(self):
      self.a = vsc.rand_uint8_t()
      self.my_l = vsc.list_t(vsc.uint8_t())

      for i in range(10):
          self.my_l.append(i)

    @vsc.constraint
    def a_c(self):
      self.a in self.my_l

it = my_item_c()
it.my_l.append(20)

with it.randomize_with(): 
      it.a == 20 

In the example above, the class contains a non-random list with values 0..9. After an instance of the class is created, the list is modified to also contain 20. Then we randomize the class with an additional constraint that a must be 20. This randomization will succeed because the my_l list does contain the value 20.

Using Lists in Foreach Constraints 

PyVSC now also supports the foreach constraint. By default, a foreach constraint provides a reference to each element of the array. 
@vsc.randobj
class my_s(object):
    def __init__(self):
        self.my_l = vsc.rand_list_t(vsc.uint8_t(), 4)

    @vsc.constraint
    def my_l_c(self):
        with vsc.foreach(self.my_l) as it:
            it < 10
In the example above, we constrain each element of the list to have a value less than 10. However, it can also be useful to have an index to use in computing values. The foreach construct allows the user to request that an index variable be provided instead.
@vsc.randobj
class my_s(object):
    def __init__(self):
        self.my_l = vsc.rand_list_t(vsc.uint8_t(), 4)

    @vsc.constraint
    def my_l_c(self):
        with vsc.foreach(self.my_l, idx=True) as i:
            self.my_l[i] < 10
The example above is semantically identical to the previous one. However, in this case we refer to elements of the list by their index. But what if we want both the index and the value iterator?
@vsc.randobj
class my_s(object):
    def __init__(self):
        self.my_l = vsc.rand_list_t(vsc.uint8_t(), 4)

    @vsc.constraint
    def my_l_c(self):
        with vsc.foreach(self.my_l, it=True, idx=True) as (i,it):
            it == (i+1)

Just specify both 'it=True' and 'idx=True', and both the index and the value-reference iterator will be provided.

One List to Rule them All
As of the 0.0.4 release (available now!) PyVSC supports lists of randomizable elements whose size is either fixed or variable with respect to randomization. Check it out and see how it helps in modeling more-complex verification scenarios in Python!

Disclaimer
The views and opinions expressed above are solely those of the author and do not represent those of my employer or any other party.






Saturday, April 25, 2020

Python Verification: Working with Coverage Data



Before jumping into this week's post, I wanted to offer a bit of an apology to my readers. I recently realized that, despite being a Google property, Blogger only notifies authors of comments for moderation if the author has specifically registered a 'moderator' email with the site. So, apologies to those of you that have commented on posts directly on the Blogger site and watched those comments hang out in limbo indefinitely. I should now receive notifications of new comments.

In my last post, we looked at modeling and sampling functional coverage in Python using the Python Verification Stimulus and Coverage (PyVSC) library. In that post, I showed how a textual coverage report could be generated to the console by calling an API. But there is much more that we want to do with functional coverage data. The key question is: how do we store and manipulate it?

Storing Coverage Data

There are two big motivations for storing coverage data. The first is that we often wish to aggregate coverage across a large number of tool runs. In order to do that, we need a way to persist the coverage data collected by each individual tool run. The second is that we want to run analysis on the collected and aggregated coverage data. We want a way to browse through the data interactively, and create nice-looking reports and charts.
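At its simplest, aggregation amounts to summing per-bin hit counts across runs. The sketch below illustrates the idea in plain Python (the bin names are invented, and this is not the PyUCIS merge implementation):

```python
from collections import Counter

def merge_coverage(*runs):
    # Sum per-bin hit counts collected by individual tool runs.
    merged = Counter()
    for run in runs:
        merged.update(run)
    return merged

run1 = {"a_cp.a_bins[0]": 3, "a_cp.a_bins[1]": 0}
run2 = {"a_cp.a_bins[0]": 1, "a_cp.a_bins[1]": 5}
total = merge_coverage(run1, run2)

# A bin counts as covered once its aggregate hit count is non-zero
covered = sum(1 for hits in total.values() if hits > 0)
```

Note that the second bin, missed by the first run, is covered in the aggregate -- exactly why merging across runs matters.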

Standard Coverage Models

Storing coverage data isn't much different than storing any other data. The first big question to answer is whether there is a standard way of representing the data, or whether we need to invent one. While I've certainly had fun in the past inventing new formats for representing and storing data, considering all the requirements and designing in appropriate features to represent all the key aspects of a given type of data is a time-consuming task -- certainly something that should be undertaken only as a last resort.

The good news is that there are several existing formats for representing coverage. The bad news is that the vast majority are focused on representing code coverage data (e.g., Cobertura), not functional coverage data. That said, there is one industry standard for representing both functional coverage and code coverage: the Accellera Unified Coverage Interoperability Standard (UCIS).

While UCIS defines several things, it doesn't define a standard database format. That said, what it does define is very useful. Specifically, it defines:
  • A data model for representing functional coverage, code coverage, and assertion coverage
  • A C-style API for accessing and modifying this data model
  • An XML interchange format to assist in moving data from one database implementation to another. In a pinch, the XML interchange format can even be used as a very simplistic database.
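To give a flavor of XML as a simplistic database, the sketch below round-trips bin hit counts through an XML string using only the standard library. The element and attribute names here are invented for illustration; they are not the actual UCIS interchange schema:

```python
import xml.etree.ElementTree as ET

# Serialize per-bin hit counts to XML, then read them back.
# Tag/attribute names are hypothetical, not the UCIS schema.
root = ET.Element("coverage")
for name, hits in [("a_bins[0]", 3), ("a_bins[1]", 5)]:
    ET.SubElement(root, "bin", name=name, hits=str(hits))
text = ET.tostring(root, encoding="unicode")

restored = {b.get("name"): int(b.get("hits"))
            for b in ET.fromstring(text).findall("bin")}
```

The real interchange format is considerably richer, but the write-then-reload flow is the same.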
Design is tough, so it's almost always most efficient to make use of the work of a committee of smart and capable people instead of starting over. UCIS is certainly not perfect. There are some "bugs" in the spec, and some internal inconsistencies. That said, it's far better than starting with a blank sheet of paper. The next challenge was adapting UCIS to Python.

PyUCIS Library

Much of my work recently has been in Python, so I wanted a way to work with the UCIS data model in Python. The PyUCIS library is a pure-Python library for working with the UCIS data model. A block diagram of the architecture is shown below. 


Front-End API

The core of the PyUCIS library is an implementation of the UCIS API. Remember that the API defined by the UCIS is a C-style API, while Python is much more object-oriented. I initially decided to implement just an object-oriented version of the UCIS API, but then realized that reusing existing code snippets written in C would be much harder without an implementation of the C-style API. Fortunately, building a C-style compatibility API on top of the object-oriented one was fairly straightforward.
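The layering can be sketched as follows. The class and function names below are illustrative stand-ins, not the real PyUCIS API, but they show how thin the C-style shim over an object-oriented core can be:

```python
# Object-oriented core: a scope object owning child scopes
class Scope:
    def __init__(self, name):
        self.name = name
        self.children = []

    def create_scope(self, name):
        child = Scope(name)
        self.children.append(child)
        return child

# C-style compatibility layer: free functions taking explicit
# handles, mirroring the ucis_* call style of the C API
def ucis_CreateScope(parent, name):
    return parent.create_scope(name)

db = Scope("root")
cg = ucis_CreateScope(db, "my_cg")
```

Code written against the C-style functions simply delegates to the object-oriented model underneath.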

Backend

The PyUCIS library uses a back-end to store the data being accessed via the front-end API. The PyUCIS library currently implements two back ends: an in-memory back-end, and an interface to existing C-API implementations of the UCIS API.

The in-memory back-end stores coverage data in Python data structures. While it's not possible to persist the data model directly, the contents can be saved to and restored from the XML interchange format specified by the UCIS.

The C-library back-end uses the Python ctypes library to call the UCIS C API as implemented by a tool-specific shared library. This allows PyUCIS to access data in databases implemented by tools that support UCIS.
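The binding pattern is standard ctypes usage: load the shared library, then declare each entry point's argument and return types. Since a vendor UCIS library isn't generally at hand, the sketch below demonstrates the same steps against libc; a real back-end would load the tool's UCIS library and declare the ucis_* entry points the same way:

```python
import ctypes
import ctypes.util

# Load a shared library and declare one entry point's signature --
# the same steps a UCIS back-end performs for the ucis_* functions
libc = ctypes.CDLL(ctypes.util.find_library("c"))
libc.abs.argtypes = [ctypes.c_int]
libc.abs.restype = ctypes.c_int

result = libc.abs(-42)
```

Declaring argtypes/restype up front lets ctypes marshal Python values to and from the C calling convention safely.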

While PyUCIS doesn't currently implement its own native database for storing coverage data, it's likely that it will in the future. Fortunately, Python provides an SQLite database as part of the core interpreter installation. Stay tuned here.
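A native store could start as simply as one table of bin hit counts. A minimal sqlite3 sketch (the schema is invented for illustration, not a planned PyUCIS design):

```python
import sqlite3

# In-memory database with one hypothetical table of bin hits
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE bins (name TEXT PRIMARY KEY, hits INTEGER)")
conn.executemany("INSERT INTO bins VALUES (?, ?)",
                 [("a_bins[0]", 3), ("a_bins[1]", 0)])

# Merging another run's data is then just a SQL update
conn.execute("UPDATE bins SET hits = hits + 2 WHERE name = 'a_bins[1]'")
hits = dict(conn.execute("SELECT name, hits FROM bins"))
```

One appeal of this direction is that merge and report operations become queries rather than hand-written tree walks.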

Built-in Apps

The final part of the PyUCIS library is a set of built-in apps. These are used to perform simple manipulations on the coverage data and to create outputs. Currently, PyUCIS contains only one built-in app: reading and writing the UCIS XML interchange format. That said, there are a couple more planned on the roadmap:
  • A merge app to combine data from multiple UCIS data models
  • A report app to produce a textual or HTML coverage report

PyUCIS Apps

The top layer of the PyUCIS architecture diagram consists of external applications that use the PyUCIS API. At the moment there is only one, and it's a proof of concept: PyUCIS Viewer, a Python Qt5-based GUI for viewing coverage data.


While the viewer is certainly primitive (and incomplete) at the moment, hopefully this provides some ideas for what can be done with the data accessed via the PyUCIS API.

Next Steps

PyUCIS is a pretty early-stage tool. I'm using it to save coverage data from the PyVSC library, and to produce some simple text coverage reports, but there's still quite a bit to do. As always, if you'd like to contribute to this or other projects, I'd welcome the help. 
In the next post, I'll return to the Python Verification Stimulus and Coverage (PyVSC) library to look at modeling constrained-random stimulus. Until then, stay safe!

Disclaimer
The views and opinions expressed above are solely those of the author and do not represent those of my employer or any other party.

Sunday, April 5, 2020

Python Verification Stimulus and Coverage: Data Types



In my last post, Modeling Random Stimulus and Functional Coverage in Python, I introduced a Python library for modeling random variables, constraints, and functional coverage. Starting with this post, I'll go through several aspects of the PyVSC library in greater detail. In this post, I'll cover the data types supported by PyVSC.

There are two reasons for doing this. For one thing, I think it's a useful way to describe the key features of the library (and hope you agree). The other reason is documentation. I don't do New Year's resolutions, but if I did one of mine this year would have been to do a better job of documenting my projects. For me, at least, documentation seems to be one of the hardest parts of a project -- or, at least, the easiest to defer and ignore. After coming back to a couple of my older projects and having to read code to figure out how to use them, I've decided that I need to invest more in documentation.

Fortunately, creating good documentation and making it readily available has gotten much easier. Sphinx does a great job of converting reStructuredText (RST) into nice-looking documentation. Read the Docs ensures that the latest and greatest version of the documentation is always just a click away. You can always find the latest PyVSC documentation here, and I'm investing more time in getting my other projects documented in the same way.

So, there you have it. My strategy is to introduce a set of PyVSC features in each of the next few posts. At the same time, I'll ensure the documentation for those features is in place. With that, let's dig in!

Verification Requires Being Specific with Datatypes

Increasingly, programming languages (looking at you, Python) are eager to separate the declaration of scalar data types from the way that they are represented. While C/C++, SystemVerilog, and Java all require the user to specify information about scalar data types -- width, sign, etc -- Python doesn't. An integer variable is as wide as it needs to be to hold the values the user wants to store in it. Furthermore, an integer variable doesn't have any notion of being signed or unsigned.

When verifying hardware, we need to be a bit more specific because we're working with designs that very much care about the representation of data types. The nets transferring data across a bus interface have a fixed width, and the data stored in registers has both a width and a sign. Consequently, the verification code we write must also be specific about the data it is sending and receiving from the design being verified.

So, when generating stimulus and collecting coverage, we definitely need to capture the width of each verification-centric variable, and whether it is signed or unsigned. With stimulus generation, there is one other piece of information that we need to track: whether the variable is randomized.
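The width/sign bookkeeping such a library has to perform can be sketched in plain Python: any assigned value must be wrapped to the declared width and sign, something Python's unbounded int won't do on its own. (This is a generic two's-complement sketch, not PyVSC's internal code.)

```python
def wrap(value, width, signed):
    # Truncate to 'width' bits, then reinterpret the sign bit
    # for signed fields (two's complement)
    value &= (1 << width) - 1
    if signed and value >= (1 << (width - 1)):
        value -= 1 << width
    return value

assert wrap(256, 8, False) == 0   # uint8 overflow wraps to 0
assert wrap(255, 8, True) == -1   # int8 sees 0xFF as -1
```

The same arithmetic lets verification code faithfully model what a fixed-width register in the design would hold.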

PyVSC and Scalar Datatypes

PyVSC uses specific data types for both constrained-random stimulus generation and for functional coverage collection. Like other randomization and coverage-collection frameworks, the use of specific data types provided by the library, instead of the language-provided built-in data types, serves two purposes. First, it allows the user to be sufficiently specific about the characteristics and meta-data of the data type. Second, it enables the library to capture expressions.
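Expression capture is the reason PyVSC fields can't simply be plain Python ints: a constraint statement like `self.a <= self.b` must build an expression node rather than evaluate to a bool. A stripped-down sketch of the technique (illustrative names, not the PyVSC internals):

```python
class Expr:
    # Expression-tree node recording an operator and its operands
    def __init__(self, op, lhs, rhs):
        self.op, self.lhs, self.rhs = op, lhs, rhs

class Field:
    def __init__(self, name):
        self.name = name

    # Overloaded operators return expression nodes instead of
    # evaluating, so the library can inspect constraints later
    def __le__(self, rhs):
        return Expr("<=", self, rhs)

    def __ne__(self, rhs):
        return Expr("!=", self, rhs)

a, b = Field("a"), Field("b")
e = a <= b
```

The captured tree is what ultimately gets handed to a constraint solver.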

Currently, PyVSC supports three core categories of data type:

  • Integer scalar -- specific-bitwidth, signed and unsigned, random and non-random
  • Enumerated -- random and non-random variables 
  • Class -- random and non-random instances of a randobj class
@vsc.randobj
class my_s(object):

    def __init__(self):
        self.a = vsc.rand_uint8_t()
        self.b = vsc.uint16_t(2)
        self.c = vsc.rand_int64_t()
PyVSC provides pre-defined data-type classes that roughly correspond to the standard data types defined by the stdint.h C/C++ header file. These data-type classes have widths that are multiples of 8 bits, and specify the sign and randomness of the variable.

Width | Signed | Random        | Non-Random
------+--------+---------------+-----------
8     | Y      | rand_int8_t   | int8_t
8     | N      | rand_uint8_t  | uint8_t
16    | Y      | rand_int16_t  | int16_t
16    | N      | rand_uint16_t | uint16_t
32    | Y      | rand_int32_t  | int32_t
32    | N      | rand_uint32_t | uint32_t
64    | Y      | rand_int64_t  | int64_t
64    | N      | rand_uint64_t | uint64_t
Just to keep things straightforward, PyVSC defines classes that capture all 16 combinations of width, sign, and randomness.

PyVSC also provides classes for capturing fields that have a width that is not a multiple of 8, or that is wider than 64 bits. 

First, an example:

@vsc.randobj
class my_s(object):

    def __init__(self):
        self.a = vsc.rand_int_t(27)
        self.b = vsc.rand_bit_t(12)


Signed | Random     | Non-Random
-------+------------+-----------
Y      | rand_int_t | int_t
N      | rand_bit_t | bit_t

The Data Types chapter of the documentation contains more examples and details on how all of these data types are used.


PyVSC and Composite Data Types

In true object-oriented fashion, PyVSC supports composing larger randomizable classes out of smaller randomizable classes. 

@vsc.randobj
class my_sub_s(object):
    def __init__(self):
        self.a = vsc.rand_uint8_t()
        self.b = vsc.rand_uint8_t()

@vsc.randobj
class my_s(object):

    def __init__(self):
        self.i1 = vsc.rand_attr(my_sub_s())
        self.i2 = vsc.attr(my_sub_s())

In these cases, it's important to specify whether the class-type attribute should be randomized when the containing class is randomized. Wrapping the attribute value in rand_attr causes the sub-attributes to be randomized. Leaving the field unwrapped, or wrapping it in attr, causes the sub-attributes not to be randomized.

Accessing Attribute Values

It's quite common in randomization and coverage frameworks (e.g., SystemC SCV, CRAVE) to use method calls to access the value of randomizable class attributes. This is because the randomizable attributes are, themselves, objects.
PyVSC provides get_val() and set_val() methods for each scalar datatype provided by the library. In addition, PyVSC implements operator overloading for randobj-decorated classes. In most cases, this means that special randomizable class attributes operate just like any other scalar Python class attribute.

Coming up Next

In the next blog post, we'll look at PyVSC's support for modeling, capturing, and saving functional coverage data. Until then, feel free to check out the PyVSC documentation on readthedocs.io. If you'd like to experiment with PyVSC, install the pyvsc package from pypi.org or check out the PyVSC repository on GitHub.


Disclaimer
The views and opinions expressed above are solely those of the author and do not represent those of my employer or any other party.

Friday, March 27, 2020

Modeling Random Stimulus and Functional Coverage in Python



If you've been following the blog, you've probably noticed that I've spent quite a bit of time over the last year learning and using Python. For several reasons, it's become my new favorite programming language. Until recently, I've mostly used Python as an implementation language. However, I've been curious as to how well Python works for implementing an embedded domain-specific language (eDSL). Most of my experience in this space has been with C++, so I was looking for an excuse to experiment.

In addition to using Python as a general programming language, I've also been using it as a testbench language for functional verification. I've spent time learning about the cocotb library that interfaces Python to a simulation engine, and in building an efficient task-based interface between Python and BFMs in simulation. With a decade or so of experience developing OVM and UVM testbench environments, one thing I found myself missing in Python was the constraint and functional coverage modeling features that SystemVerilog provides. Now, to be clear, there is at least one existing Python library that provides random stimulus in Python, but after evaluating its approach and supported features against commonly-used SystemVerilog features, I decided to proceed in a different direction.

Functional Verification and Constrained Random Stimulus
Constrained-random stimulus and functional coverage have become well-ingrained in functional verification practice over the last decade or more. SystemVerilog, of course, embeds this functionality in the language, while SystemC offers two libraries (SCV and CRAVE) for randomization. The Accellera Portable Test and Stimulus (PSS) language also specifies constrained-randomization and functional coverage features. These existing specifications all overlap on a few key features, though they also each have some unique features.

From a requirements perspective, I wanted to support a super-set of the constraint and functional coverage features from these existing sources to the extent possible. In addition to wanting to support as many useful features as possible, reuse was a key consideration. We're also beginning to see some open-source UVM-based libraries, such as the riscv-dv project from Google for generating RISC-V instruction streams, that include constraints and functional coverage. This library and others are implemented in SystemVerilog, of course, but could be translated to Python. The porting task is definitely eased if the constraints and coverage can simply be mechanically translated instead of being reworked and remodeled to target a different set of supported constructs.

Key Requirements
After a bit of investigation, I settled on the following requirements for my library, and decided to name it Python Verification Stimulus and Coverage (PyVSC).

  • Keep the user-visible modeling constructs as syntactically similar to SystemVerilog as possible and practical
  • Provide an underlying data model that can be programmatically processed to support static analysis, checking, and visualization of the user-specified constraints and coverage.
  • Allow users to capture simple features in natural syntax, while allowing them to programmatically build up the model for more-complex applications. 
  • Be able to take advantage of the availability and high performance of existing SMT solvers 

PyVSC Basics
The code below shows a simple example of capturing a class with random fields using the PyVSC library.
@vsc.randobj
class my_item_c(object):
    def __init__(self):
        self.a = vsc.rand_bit_t(8)
        self.b = vsc.rand_bit_t(8)

    @vsc.constraint
    def ab_c(self):
        self.a != 0
        self.a <= self.b
        self.b in vsc.rangelist(1,2,4,8)
Note that the class is identified as a randomizable class via the vsc.randobj decorator. Decorators, which Python supports as a first-class construct, enable classes and methods to be tagged as having special significance. They also allow additional functionality to be layered on. In this case, functions for randomization will be added to the class. The class inheritance hierarchy will also be altered slightly, to allow constraints and random fields to be configured after construction of a class instance.
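A stripped-down sketch of what such a class decorator does is shown below. This is not the actual PyVSC implementation -- real randomization invokes a constraint solver -- but it shows how a decorator layers a `randomize()` method onto a user-written class:

```python
import random

def randobj(cls):
    # Attach a randomize() method that reassigns the instance's
    # integer attributes -- a stand-in for real constraint solving
    def randomize(self):
        for name, val in list(vars(self).items()):
            if isinstance(val, int):
                setattr(self, name, random.randint(0, 255))
    cls.randomize = randomize
    return cls

@randobj
class my_item(object):
    def __init__(self):
        self.a = 0

it = my_item()
it.randomize()
```

The user's class definition stays clean; all the machinery is injected at decoration time.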

Class fields (both random and non-random) that will participate in randomization are declared using VSC types. This enables information on bit-width and randomizable status to be associated with class fields. 

Now, let's have a look at functional coverage.
@vsc.covergroup
class my_cg(object):

    def __init__(self):
        # Define the parameters accepted by the sample function
        self.with_sample(dict(
            it=my_item_c()
        ))

        self.a_cp = vsc.coverpoint(self.it.a, bins=dict(
            # Create 4 bins across the space 0..255
            a_bins=vsc.bin_array([4], [0, 255])
        ))
        self.b_cp = vsc.coverpoint(self.it.b, bins=dict(
            # Create one bin for each value (1,2,4,8)
            b_bins=vsc.bin_array([], 1, 2, 4, 8)
        ))
        self.ab_cross = vsc.cross([self.a_cp, self.b_cp])
A covergroup is a class decorated with the vsc.covergroup decorator. As with a randomizable class, the decorator implements proper construction order, and adds methods to the target class. 

There are several ways that coverage data can be provided to a covergroup class. In the example above, coverage data will be passed as parameters to the 'sample' method. In order to do this, we must specify the sample-method parameter names and types. This is done by calling the with_sample method and passing a Python dictionary with parameter name and type. 

Coverpoints and crosses are defined using the coverpoint and cross methods. Coverpoint bins are declared as shown above, and support the set of value bins supported by SystemVerilog -- individual bins containing individual values and ranges of values, and bin arrays containing values partitioned across the bins.
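Bin arrays like `bin_array([4], [0,255])` partition a value range into equal-sized bins. The arithmetic can be sketched as follows (a plain-Python illustration of the partitioning, not PyVSC's actual code):

```python
def partition_bins(n, lo, hi):
    # Split the inclusive range [lo, hi] into n roughly equal
    # sub-ranges, as a 4-bin array over 0..255 would be
    total = hi - lo + 1
    step = total // n
    bins = []
    for i in range(n):
        b_lo = lo + i * step
        b_hi = hi if i == n - 1 else b_lo + step - 1
        bins.append((b_lo, b_hi))
    return bins

bins = partition_bins(4, 0, 255)
# -> [(0, 63), (64, 127), (128, 191), (192, 255)]
```

Each sampled value then increments the hit count of whichever sub-range contains it.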

Okay, let's put it all together. The code below creates an instance of the covergroup, an instance of the randomizable class, randomizes the class, and samples the data in the covergroup.
# Create an instance of the covergroup
my_cg_i = my_cg()

# Create an instance of the item class
my_item_i = my_item_c()

# Randomize and sample coverage
for i in range(16):
    my_item_i.randomize()
    my_cg_i.sample(my_item_i)

# Now, randomize keeping b in the range [1,2]
for i in range(16):
    with my_item_i.randomize_with() as it:
        it.b in vsc.rangelist(1,2)
    my_cg_i.sample(my_item_i)

print("Coverage: %f%%" % my_cg_i.get_coverage())

The first randomization loop randomizes the class using constraints declared in the class. The second randomization loop adds an additional inline constraint.

Looking Forward
Over the next couple of posts, I'll go through the stimulus-generation and functional coverage features of PyVSC in more detail. Future posts will also tackle what we can do in terms of static analysis of constraint models, as well as what to do with functional coverage data once we've collected it. Until then, feel free to check out the early documentation on readthedocs.io, and have a look at the PyVSC project on GitHub.


Disclaimer
The views and opinions expressed above are solely those of the author and do not represent those of my employer or any other party.