hi - i have run into a bit of a problem, and i know this might not be the right alias for it, but i am asking for some advice.

i wanted to test out the feasibility of using the systemverilog 2-state data types. my main objective is to see whether i get better simulation performance; i am not trying to get rid of Xs in the design by simulating only 0s and 1s, and i am not trying to test real-life power-on set / reset environments either. i don't expect much of a performance boost; i am merely trying it as an exercise.

anyhow, i took an existing design in verilog-1995 + verilog-2001 + NTB + OVA + PLI (tf + acc) and am converting it (about 40% done so far) to systemverilog + SVTB + SVA + VPI. the design is not too big: it is a cluster (core + cache + memory) and is probably under 8-10M gates if you synthesize it. there are no bi-dis or tristate buses, and everything is behavioral RTL. there were some X-assignments, which i got rid of in the systemverilog conversion, and i also got rid of several casex statements (the p.s. at the end of this mail sketches the kind of cleanup i mean).

after i painstakingly transformed it, the simulation now hangs! how does one generally go about debugging a hang? i am putting in a lot of $display statements and narrowing it down (roughly the watchdog idea in the p.p.s. below), but are there any alternatives to that? i am looking for one that does not involve entering the CLI prompt of the simulator.

thanks.

-nasim
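
p.s. to give a concrete picture of the X-assignment / casex cleanup i mean, here is a sketch; the module, signal names and encodings are made up, not from the real design:

    // before (verilog-2001 style):
    //   reg [3:0] state;
    //   always @(posedge clk or negedge rst_n)
    //     if (!rst_n) state <= 4'bxxxx;     // X-assignment, removed in the conversion
    //     else casex (ctl)                  // casex, also removed
    //       4'b1xx1: state <= 4'd1;
    //       default: state <= 4'd0;
    //     endcase

    // after (systemverilog, 2-state), wrapped in a dummy module so it compiles on its own:
    module twostate_sketch (input bit clk, rst_n, input bit [3:0] ctl);
      bit [3:0] state;                       // 2-state type: comes up as 0, can never hold X/Z
      always_ff @(posedge clk or negedge rst_n)
        if (!rst_n)
          state <= 4'd0;                     // explicit reset value instead of 4'bxxxx
        else
          casez (ctl)                        // casez (or a fully enumerated case) instead of casex
            4'b1??1: state <= 4'd1;
            default: state <= 4'd0;
          endcase
    endmodule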
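
p.p.s. the $display-based narrowing i am trying looks roughly like the watchdog below; the module name, parameters and messages are all invented, and the delays are in whatever timescale units the testbench uses:

    module hang_watchdog #(parameter longint HEARTBEAT = 100_000,      // print interval
                           parameter longint TIMEOUT   = 50_000_000);  // hard kill

      // periodic heartbeat: if these keep printing but the test makes no progress,
      // time is still advancing and something is stuck waiting on an event/handshake;
      // if they stop entirely, the hang is a zero-delay loop at a single time step
      initial
        forever begin
          #HEARTBEAT;
          $display("[%0t] watchdog: still alive", $time);
        end

      // hard timeout so a hung batch run still exits and leaves a log to look at
      initial begin
        #TIMEOUT;
        $display("[%0t] watchdog: timeout, killing the simulation", $time);
        $finish;
      end
    endmodule

instantiating it in the top-level testbench next to the dut (or binding it in) means the log itself shows where simulation time stopped, without ever touching the simulator CLI.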