>From: Greg Jaxon <Greg.Jaxon@synopsys.com>

> I think P1800 section 8.18 extends ?: to non-integral types.
> I too have some problems with that section's wording.
> One thing I find ambiguous is that it does not
> precisely specify a result type that is statically determinable
> independent of the value of the conditional expression.

You are right that the rules for the result type are not clearly
stated.  However, they are implied by the bulleted rules.  If either
or both of the expressions are of integral type, then the operation
proceeds as defined, so the result is of integral type.  You pretty
much have to assume the result is a generic vector, not a specific
integral type.

In all other cases, the expressions must be of the same type.  We
must assume that the result will be of that type, though this is not
actually stated.  The rules for enums are unclear; more on that later.

> Another problem is that my copy (which may be old) says the
> blending of unpacked types is done element-by-element.

That is what the new copies say also.

> I'm not clear on how to do this for tagged union types,

It isn't clearly specified, but I can come up with a scheme that
works.  First match the tags.  If they don't match, then use the
default initialization value for the entire union, and you are done;
you can't combine the elements of the union if the tags don't match.
If the tags do match, then that is the tag of the result.  And since
both unions are now known to hold the same member type, you can
recurse and start combining their contents.

> for structure types (which don't have "elements"),

It seems pretty obvious that you regard the members as the elements.

> or for multi-dimensional array types (which some view as having
> subarray-shaped elements of one fewer dimension, and others view as
> having a multitude of scalar elements).  Nor can I imagine any useful
> application for propagating an X effect with a coarse granularity.

Finer granularity gives less pessimistic results, but costs more.
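The bitwise blending and the tagged-union scheme I described above can
be sketched in Python as a model of the semantics.  All of the names
and encodings here are my own invention for illustration; this is not
the LRM's wording, just how I understand the behavior:

```python
# Illustrative model of 4-state bitwise blending under an unknown
# ?: condition, and of the tagged-union scheme described above.
# 4-state bits are modeled as the characters '0', '1', 'x', 'z'.

def blend_bit(a, b):
    """Blend two 4-state bits when the condition is unknown:
    equal bits survive, differing bits become 'x'."""
    return a if a == b else 'x'

def blend_vector(a, b):
    """Bitwise blending of two equal-width 4-state vectors."""
    assert len(a) == len(b)
    return ''.join(blend_bit(x, y) for x, y in zip(a, b))

def blend_tagged_union(u, v, default):
    """Tagged unions: if the tags differ, the result is the default
    initialization value for the whole union; if they match, keep the
    tag and recurse on the (now same-typed) contents."""
    (tag_u, val_u), (tag_v, val_v) = u, v
    if tag_u != tag_v:
        return default
    return (tag_u, blend_vector(val_u, val_v))

print(blend_vector('1010', '1001'))  # equal bits kept, rest -> 'x'
print(blend_tagged_union(('a', '1010'), ('b', '0000'), ('?', 'xxxx')))
```

With matching tags the contents blend bitwise; with mismatched tags
the entire union collapses to its default initialization value, since
the members cannot be meaningfully combined.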
> There is a clear interest in letting enum type info propagate
> up through the ?:, if no blending can happen.  Enum type info can
> be implicitly discarded at any later stage where it becomes irrelevant.

This is another case where the desire for strongly typed enums in
testbenches conflicts with the desire for accurate enum behavior in
the design itself (e.g. in state machines).

The behavior specified in the LRM for this is not entirely clear.
Applying the bulleted rules in 8.18 to two enums, neither is
technically an integral type.  So their types would have to match,
and the result would be of that type.  If the condition were unknown
and their values did not match, the result would be the default
initialization value for that type.  This interpretation produces an
enum result, which could then be assigned to an enum, in accordance
with strong typing.  If we assume that a design will use a 4-state
enum, then the default initialization value will be all-X.  This is
more pessimistic than a bitwise blending, but at least it is
pessimistic rather than optimistic.

However, I believe other LRM text disagrees with this interpretation.
Section 4.10.4 says that an enum used as part of an expression is
automatically cast to its base type.  So each operand has already
been cast from an enum to an integral type before the ?: is applied.
Therefore, the result with an unknown condition will be a bitwise
blending.  This is a more accurate result for the design, but is no
longer an enum.  Strong typing prevents direct assignment of the
result to an enum variable unless a cast is used.

> User-defined structure types must match to be equivalent, so that
> case is trivial.  There should be no "blending" effect between structures -
> they are all or nothing THEN or ELSE.  If a 4-state conditional presents
> an 'X, I'd be happy seeing a default or 'X-ful result.

As I said before, I assume that element-by-element combining was
intended to mean member-by-member combining for structs.
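The two readings I described for enums can be contrasted with a small
Python model.  The enum encodings here (IDLE = 2'b00, RUN = 2'b01) are
hypothetical, chosen only to show how the two interpretations diverge:

```python
# Reading 1: 8.18's matching-type rule applied to the enums
# themselves.  Unknown condition with differing values -> the
# default initialization value (all-X for a 4-state enum).
def strict_enum_blend(a, b, members, default_init):
    assert a in members and b in members
    return a if a == b else default_init

def blend_bit(x, y):
    """Blend two 4-state bits: equal bits survive, else 'x'."""
    return x if x == y else 'x'

# Reading 2: per 4.10.4, each enum operand is first cast to its
# base type, so ?: sees integral vectors and blends bitwise.  The
# result is integral, no longer an enum.
def base_type_blend(a, b):
    return ''.join(blend_bit(x, y) for x, y in zip(a, b))

# Hypothetical state encodings: IDLE = '00', RUN = '01'.
print(strict_enum_blend('00', '01', {'00', '01'}, 'xx'))  # all-X
print(base_type_blend('00', '01'))                        # bitwise
```

On this example the strict reading yields 'xx' while the base-type
reading yields '0x', which is exactly the "more pessimistic than a
bitwise blending" difference described above.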
I prefer all-or-nothing for efficiency, but I doubt that was the
intent.

> Finally, I'd like this section to say something like:
>
>     Note: Bitwise blending of signal values is not part of the
>     synthesizable subset because it presupposes the existence of
>     hardware "X" detection.

I don't see the need for this.  Presumably everything being
synthesized is 4-state, or synthesis already can't match simulation
behavior.  And if everything is 4-state, I believe the bitwise
blending is guaranteed to give pessimistic results.  For this
statement to be needed, there would have to be a case where the
blending could give a definite 0 or 1 while the hardware could give
the other value.  I believe the blending will not do this, but will
give an X instead.  Can you give a counter-example?

At any rate, this LRM does not specify the synthesizable subset of
the language.  Any such statement would belong in another standard or
sub-standard that does.

Steven Sharp
sharp@cadence.com

Received on Wed Feb 15 17:06:12 2006
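The pessimism claim above can be checked mechanically, at least for
single bits.  This is my own brute-force sketch, not anything from the
LRM: it enumerates the 2-state operand bits a mux could select between
and confirms that blending never produces a definite 0 or 1 when the
hardware could produce the other value:

```python
# Brute-force check: for 2-state operand bits, bitwise blending under
# an unknown condition is never optimistic.  A definite blended bit
# must agree with every outcome the hardware mux could produce.

def blend_bit(a, b):
    """Blend two bits when the condition is unknown."""
    return a if a == b else 'x'

for a in '01':
    for b in '01':
        blended = blend_bit(a, b)
        hardware_outcomes = {a, b}  # the mux picks one branch
        if blended in '01':
            # If blending gave a definite value, the hardware must
            # have no way to produce the opposite value.
            assert hardware_outcomes == {blended}

print("blending is never optimistic for single bits")
```

Whenever the branches disagree, the blend is 'x' rather than a
definite value, so a synthesized mux can never contradict it; this is
why I don't see a counter-example.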
This archive was generated by hypermail 2.1.8 : Wed Feb 15 2006 - 17:07:53 PST