Discussion:
verilog inline delay style question
Lawrence Nospam
2004-06-09 08:26:39 UTC
Permalink
I have a coworker who uses a very annoying style
of writing Verilog. Unfortunately, he does this
for a reason, and I cannot think of a better way
to write it.

I hope someone can help me get rid of his style.

The bad style:

always @(posedge clk)
  reg_var <= #CtoD expression;

The reason he does this is that he (sometimes)
mixes behavioral code and gate-level code in
verification on a module-by-module basis.

The gate-level code has a clock tree with
different non-zero delays to its flops. The #CtoD
is needed so that the behavioral output comes after
the latest gate-level flop captures. This avoids
hold-time problems which don't happen in real life.
Without this, a mixed gate/behavioral simulation will FAIL.
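
(To make the failure concrete, here is a minimal,
self-contained sketch. The module and signal names
are mine, not his, and the 0.3ns clock-tree delay
is made up for illustration:)

`timescale 1ns / 100ps
module hold_demo;
  reg clk = 0;
  always #5 clk = ~clk;

  wire clk_late;
  assign #0.3 clk_late = clk;   // stand-in for clock-tree insertion delay

  reg d = 0;
  reg behav_q = 0, gate_q = 0;
  initial #12 d = 1;            // change data between clock edges

  // Behavioral flop. With no #CtoD, behav_q updates 0.3ns
  // BEFORE clk_late rises, so the flop below captures the
  // NEW value -- a hold failure real silicon never sees.
  // Uncomment the delay and gate_q gets the old value instead.
  always @(posedge clk)
    behav_q <= /* #1.0 */ d;

  // Stand-in for a gate-level flop on the skewed clock.
  always @(posedge clk_late)
    gate_q <= behav_q;

  initial begin
    #20 $display("gate_q = %b (1 means hold failure)", gate_q);
    $finish;
  end
endmodule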

Plus this lets him see something like gate delays
when simulating behavioral code.


I want those delays (strewn throughout ALL of his
code) to go away. Can anyone help me out?


Is there any way to do something, like a specparam
per file, or ANYTHING on a command-line, module, or
file basis, so that he can satisfy his need for a
delay from clock to data without cluttering the code?

(I want to stay with always blocks. I can't talk
him into instantiating a flop module with internal
delay.)

Thanks for any ideas.

Lawrence NoSpam
Manoj Vekaria
2004-06-10 05:42:48 UTC
Permalink
Hi !
Tell him to use the following style instead.

`define CtoD #1.0

always @(posedge clk)
  reg_var <= `CtoD expression;

Now for your simulations, you can redefine CtoD to be empty, like this:
`define CtoD

So your simulations will work without this delay.
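
To keep both definitions in one place instead of editing the
`define by hand, they can live behind an `ifdef in a shared
include file. A sketch -- the file name and the GATE_MIX_SIM
macro are my inventions:

// delays.vh -- `include'd by every behavioral file
`ifdef GATE_MIX_SIM
  // Mixed gate/behavioral runs: lag the slowest
  // gate-level flop's clock arrival.
  `define CtoD #1.0
`else
  // Pure behavioral runs: the delay disappears.
  `define CtoD
`endif

The mode can then be picked per run from the command line;
most compilers accept something like +define+GATE_MIX_SIM
(VCS, ModelSim/Questa) or -DGATE_MIX_SIM (Icarus), though the
exact flag is tool-specific. This also sidesteps the
macro-redefinition warnings some tools emit when the same
name is `define'd twice.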

Apart from this, the simulator you use may have some means to
ignore assignment delays like this.

-Manoj