RE: [sv-ec] Outstanding covergroup filtering (2506) questions

From: Swapnajit Chakraborti <swapnaj@cadence.com>
Date: Fri Apr 08 2011 - 04:18:47 PDT

Hi Scott, Gord,

I agree that when the expression uses covergroup arguments and/or other variable references, the closure can only be created at covergroup construction. What is not clear to me is how a simulator can defer its knowledge of the coverage space until after simulation. I am assuming that some post-simulation tool will evaluate the coverage space based on the closure set passed to it by the simulator, and since this happens after simulation, the coverage space will not be known during simulation. If that is the case, there will be various issues: illegal bin warnings cannot be generated, ignore bins cannot be applied correctly, and even sampling cannot happen. Please let me know if I am missing something here.
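To check my understanding of the closure semantics being proposed, here is a small Python model (purely illustrative; the names and the filter are hypothetical, and Python closures stand in for the values a simulator would capture at covergroup construction):

```python
# Illustrative model: a "with" filter closes over variable values
# captured at covergroup construction, not live references.

def make_filter(limit):
    # 'limit' is frozen into the closure at construction time,
    # analogous to sampling a variable when the covergroup is built.
    captured = limit
    return lambda value: value <= captured

limit = 4
f = make_filter(limit)   # closure created at "construction"
limit = 99               # later changes do not affect the filter
kept = [v for v in range(8) if f(v)]
print(kept)              # [0, 1, 2, 3, 4]
```

The point of the model is that once the closure set is captured, the filter result is fixed; the open question is only *when* some tool evaluates it.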

If you agree with the above, then I see two ways of solving this problem. One is to do a dummy simulation run just to create the closure sets, pass the closure-set data to the non-simulation tool, obtain the coverage space from it (much faster than the simulator could compute it), and then start a new simulation with this coverage-space data.

The other option is to pass the closure set to a non-simulation tool immediately (without waiting until the end of simulation); this tool would compute the coverage space faster than the simulator and return it to the simulator for its use. This avoids the dummy simulation run of the first option. I am not sure whether these points have already been discussed, hence my attempt to clarify them. Please let me know your comments in this regard.

The proposal doesn't discuss how this handshaking between the simulator and the non-simulator tool will happen, how the closure set will be passed, etc. Do you foresee a need to capture that aspect as part of this proposal? Or will it be left entirely tool-dependent and remain each vendor's USP?

Regarding the scope of declaration of the function that generates the queue elements, the ideal place would have been the cross scope, because the typedefs (CrossQueueType, etc.) are available there. As the innermost scope, it would give the function access to everything (class variables, covergroup arguments, and the cross types) without any need for the scope operator. But declaring it within the cross scope looks a bit cumbersome to me, because this would be an explicit declaration, unlike the implicit typedefs. Have we considered the next higher logical scope, the covergroup scope? Declaring it there also looks acceptable: it preserves some logical encapsulation and still gives the function access to everything it may need (though the typedefs would require the scope operator). Keeping the function declaration outside the covergroup scope is possibly the best approach, as it calls for the least change to the current syntax.

Regarding the order of distribution and filtering for fixed-size bins, I am in favor of distributing the values first and then applying the with-expression filter. We may also want to provide a new covergroup option so that the order can be controlled by the user.
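To make the difference between the two orderings concrete, here is a small Python sketch (purely illustrative: the value range, the four fixed-size bins, and the "keep values below 6" filter are all hypothetical):

```python
# Coverage space 0..15, distributed into 4 fixed-size bins,
# with a filter keeping only values below 6.

values = list(range(16))
keep = lambda v: v < 6
num_bins = 4

def distribute(vals, n):
    # split vals into n fixed-size bins (ceiling division for bin size)
    size = -(-len(vals) // n)
    return [vals[i * size:(i + 1) * size] for i in range(n)]

# Option A: distribute first, then filter within each bin.
# Bin boundaries stay stable, but some bins may end up empty.
a = [[v for v in b if keep(v)] for b in distribute(values, num_bins)]

# Option B: filter first, then distribute the surviving values.
# No values are lost to empty slots, but bin assignments shift
# whenever the filter expression changes.
b = distribute([v for v in values if keep(v)], num_bins)

print(a)  # [[0, 1, 2, 3], [4, 5], [], []]
print(b)  # [[0, 1], [2, 3], [4, 5], []]
```

The sketch shows the trade-off raised in item 2 below: Option A produces empty bins, while Option B reshuffles bin contents when the expression changes, which is what complicates coverage merging.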

Regds,
Swapnajit


-----Original Message-----
From: owner-sv-ec@eda.org [mailto:owner-sv-ec@eda.org] On Behalf Of Gordon Vreugdenhil
Sent: Tuesday, March 29, 2011 7:43 PM
To: Little Scott-B11206
Cc: sv-ec@eda.org; Fais Yaniv-RM96496
Subject: Re: [sv-ec] Outstanding covergroup filtering (2506) questions

That's correct; but that is true even for the other situations where the function relies on covergroup constructor values or instantiation parameters from an enclosing scope. Any non-local reference would require at least an elaboration (for parameter values) or simulation (for covergroup formals) in order to collect referenced values.

So unless you have a completely pure function (not referencing anything outside the function at all), a totally separate tool would still have to know some additional information.
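(In Python terms, the distinction can be sketched as follows; the functions and the threshold are hypothetical. A completely pure function can be evaluated by any tool from the source text alone, whereas one with non-local references also needs the captured values as input:)

```python
# Pure: depends only on its arguments; a separate tool can evaluate
# it from the function source alone.
def pure_filter(a, b):
    return a + b < 10

# Impure: references 'threshold' from an enclosing scope; a separate
# tool would also need the value in effect at covergroup construction
# to reproduce the result.
threshold = 10
def impure_filter(a, b):
    return a + b < threshold

print(pure_filter(3, 4))    # True
print(impure_filter(3, 4))  # True
threshold = 5
print(impure_filter(3, 4))  # False (result changed with no code change)
```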

Gord.

On 3/29/2011 7:08 AM, Little Scott-B11206 wrote:
> Hi Gord:
>
> Thanks for the clarification. I agree that the variable references wouldn't be live. They would be sampled at covergroup creation.
>
> My intention was to indicate that this behavior would be a departure from what is doable today. With current covergroups another tool (e.g., a coverage management tool) can determine the coverage space given the covergroup definition. If constructs like item 3 are used in a covergroup I believe that a simulator or simulator-type evaluation would be required to even calculate the coverage space. Is that correct?
>
> Thanks,
> Scott
>
>> -----Original Message-----
>> From: owner-sv-ec@eda.org [mailto:owner-sv-ec@eda.org] On Behalf Of
>> Gordon Vreugdenhil
>> Sent: Tuesday, March 29, 2011 8:52 AM
>> To: Little Scott-B11206
>> Cc: sv-ec@eda.org; Fais Yaniv-RM96496
>> Subject: Re: [sv-ec] Outstanding covergroup filtering (2506)
>> questions
>>
>> Scott, one clarification -- in (3) you suggest that a simulator would
>> be required to calculate the with expressions at sim time for
>> variable references. I don't think that is required. The simulator
>> would certainly need to capture the values of all referenced
>> variables at the time of covergroup construction and associate the
>> fixed set of values as the "closure"
>> (see http://en.wikipedia.org/wiki/Closure_%28computer_science%29)
>> of the function along with the covergroup.
>>
>> I didn't hear anyone suggest that variable references should be "live"
>> in the sense that changes after covergroup construction would impact
>> the function behavior.
>>
>> So as long as the simulator captured the values, the actual filtering
>> could still be done late if one wanted to do so.
>>
>> Gord.
>>
>>
>> On 3/28/2011 8:00 PM, Little Scott-B11206 wrote:
>>> Hi all:
>>>
> >> Mehdi asked me to summarize the primary outstanding issues in the
>> covergroup filtering proposal (2506). I will ask the Freescale user
>> community for their opinion on these issues. I hope that others will
>> do the same and discuss the results on the reflector. I will be
>> traveling and only be able to catch the end of the meeting on April
>> 11th. Hopefully we can resolve most of these issues over the
>> reflector.
>>> 1. Use model for with expressions. My understanding is that the
>> expected use model for the with expression is to reduce/shape the
>> coverage space like is done with ignore bins but using more powerful
>> methods than are allowed by the current ignore bins syntax. If this
>> is the case, does this imply that the solution should be an
>> enhancement to the ignore bins syntax and not something as general as
>> with expressions? Are there additional use models envisioned by the
>> user community that benefit from the power provided by with expressions?
>>> A note on current capabilities:
>>> ignore/illegal bins: these must be explicitly specified which is
>> tedious for large coverage spaces (ignore_bins are often generated
>> via scripts for large spaces). Ignore bins are removed after bin
>> distribution. See pg. 496 in P1800-2009 for an example.
>>
>>> iff: this does not affect the shape of the coverage space. It can
>> affect when coverage is collected.
>>> intersect: allows reduction of cross bins that contain values in a
>> given open range expression. This gives some power but again it is
>> very limiting for complex coverage shaping. See pg. 501 in P1800-2009
>> for an example.
>>> ||,&&: allows a reduction of cross bins by oring or anding
>> combinations of the bins involved in the cross. See pg. 501 in
>> P1800-
>> 2009 for an example. Again, this is not terribly powerful.
>>> 2. Based on the use model for a constructive/generative approach
>>> does
>> it make sense to apply the expression and then do bin distribution or
>> do bin distribution and then apply the filtering? In the first case
>> a change of the expression will likely result in very different bin
>> assignments. In the second approach there may be empty bins. How
>> would these artifacts complicate the coverage merging process?
>> Current covergroup constructs may exhibit these merging issues, but
>> we don't want to needlessly exacerbate the problem.
>>> 3. The proposal adds syntax to use a function to generate a queue of
>> elements that define the cross bins. The question is what will users
>> want to access in these functions? There are three levels of items
>> that may be accessed 1. compile time constants (parameters, etc.) 2.
>> const variable/arguments 3. general variable references (these would
>> be "sampled" at the instantiation of the covergroup). Which level of
>> "purity" do users desire in these functions? It should be noted that
>> as we proceed up the list the opportunity for tools to pre/post
>> calculate the coverage space is diminished. In fact, if the users
>> desire item 3 it would require a simulator to calculate coverage
>> spaces of this type.
>>> Thanks,
>>> Scott
>>>
>>>
>> --
>> --------------------------------------------------------------------
>> Gordon Vreugdenhil 503-685-0808
>> Model Technology (Mentor Graphics) gordonv@model.com
>>
>>
>> --
>> This message has been scanned for viruses and dangerous content by
>> MailScanner, and is believed to be clean.
>>
>

-- 
--------------------------------------------------------------------
Gordon Vreugdenhil                                503-685-0808
Model Technology (Mentor Graphics)                gordonv@model.com
Received on Fri Apr 8 04:53:27 2011

This archive was generated by hypermail 2.1.8 : Fri Apr 08 2011 - 04:53:36 PDT