"Jordan Dimitrov" <
[email protected]> wrote in message
news:
[email protected]
[...]
> assignment foo=0 is NOT executed by the simulator! It has been
> GIVEN to the simulator! Therefore there is no change in the
> value of foo, i.e. the
> always @(foo) bar = !foo;
> statement is never triggered.
A common misconception, but entirely wrong. Initialisation of
variables in Verilog-2001 is simply syntactic sugar for an initial
statement containing a zero-delay blocking assignment; it most
certainly IS "executed" by the simulator, in a nondeterministic
sequence with respect to any other time-zero activity, just
as Stephen Williams indicated.
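To make the desugaring concrete, here's a minimal sketch
(my own illustration, not text lifted from the standard):

    // This Verilog-2001 declaration...
    reg foo = 0;

    // ...is defined to behave exactly like this:
    reg foo;
    initial foo = 0;   // zero-delay blocking assignment, racing
                       // with every other time-zero process, and
                       // producing an event on foo at time 0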
Whether this was a wise choice by the designers of Verilog-2001
is quite another matter. It was done, as far as I am aware,
with the express intention of getting an event at time 0 on
signals that are so initialised.
> I believe, however that you do have a point here. The
> simulators schedule the threads as they encounter them in the
> source file.
Says who? This is a myth. It is of course possible that some simulators
may do this; equally, I can see many reasons why a simulator might not.
Any Verilog code that relies on any specific scheduling order is
*ipso facto* broken.
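For instance (a contrived sketch of mine, not from the original
thread), the value printed here depends entirely on which
time-zero process the simulator happens to run first:

    module race_demo;
       reg r;
       initial r = 1'b1;                // one time-zero process
       initial $display("r = %b", r);   // another: prints x or 1,
    endmodule                           // depending on the order

Code whose correctness hinges on which of those lines wins is
broken, whichever simulator you happen to run it on.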
[...]
> I believe that this problem can only be solved by using "old"
> and "new" values for all variables in a model,
Nonblocking assignment essentially does precisely this, and gives you
just the effect you require when modelling edge-triggered
synchronous logic.
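The classic illustration is a two-register swap; both right-hand
sides are sampled before either update lands (my sketch; clk is
assumed from surrounding context):

    reg a, b;
    always @(posedge clk) begin
       a <= b;   // both RHS values are read first ("old" values)...
       b <= a;   // ...and both updates are applied later in the
    end          // time step: a clean, race-free swap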
> i.e. initially
> the "old" value of foo is x, when an assignment is executed,
> the old value still stays at x and the assigned value goes
> into the "new" value of foo. At the end of a micro cycle,
> "new" replaces "old" and this avoids race conditions. It is
> important here to do this at the end of a micro cycle, i.e. we
> allow the value of a variable to change several times within a
> time step.
Yes. It's called "simulation cycles", or "delta cycles" in VHDL. Read
the Scheduling section of the Verilog LRM to understand it fully; please
don't guess. Verilog's scheduling model is a bit of a handful, but
its behaviour is pretty well defined by the LRM.
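To see several simulation cycles inside a single time step,
consider this sketch of mine (the #1 delay just keeps us clear
of the time-zero start-up race discussed above):

    reg a, b, c;
    always @(a) b = a;    // wakes in one simulation cycle...
    always @(b) c = b;    // ...which triggers another, still at
                          // the same simulation time
    initial #1 a = 1'b1;  // one change ripples through b, then c,
                          // before time advances past 1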
There *is* a simple solution, I suggest...
put the sensitivity list at the END of the always block...
not so pretty to code, but it works better for modelling
combinational logic.
reg foo, bar;

initial foo = 0;

always begin
   bar = !foo;   // compute the output first...
   @foo;         // ...then stall until foo changes
end
Now let's see what happens:
A) foo initialised first, then the always block runs
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
bar takes the value 1 at the end of time 0, as hoped.
The always block runs just once at time 0, gets the
right answer, and then stalls at @foo.
B) always block runs first, then foo initialised
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
First run of always gives you bar=1'bx because foo==1'bx.
The always block then stalls at @foo.
Then foo is initialised to 0 and there's an event on foo.
That releases the stalled always block, still at time 0;
so it runs again, and sets bar to 1 as required.
Spookily, this form of "always" block is exactly what
is provided by a VHDL process with a sensitivity list.
Hmmm. VHDL users NEVER, NEVER suffer this problem,
because "process with sensitivity list" and "signal
initialisation" were defined sensibly from the outset.
Not bad for a "$400 million mistake".
SystemVerilog changes the definition of variable
initialisation to match VHDL's, and (again as Stephen
pointed out) adds always_comb to provide a combinational
process with bottom-testing of the sensitivity list.
Just like VHDL :-)
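With always_comb the whole example collapses to this (my
SystemVerilog sketch):

    logic foo = 0;           // initialised before simulation
                             // starts; no time-zero event on foo
    logic bar;
    always_comb bar = !foo;  // guaranteed to execute once at time
                             // zero, then whenever foo changes

No sensitivity list to misplace, and no start-up race to lose.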
--
Jonathan Bromley, Consultant
DOULOS - Developing Design Know-how
VHDL * Verilog * SystemC * Perl * Tcl/Tk * Verification * Project Services
Doulos Ltd. Church Hatch, 22 Market Place, Ringwood, Hampshire, BH24 1AW, UK
Tel: +44 (0)1425 471223          mail: [email protected]
Fax: +44 (0)1425 471573          Web:  http://www.doulos.com