FPGA comp.arch.fpga newsgroup (usenet)

#1, 01-17-2004, 07:18 PM
Nick Suttora (Guest)
Simulation Speed when using Xilinx DCM

Is there any way to speed up simulation (functional or timing) when
simulating a Xilinx FPGA that uses the DCM? The DCM requires the
simulator resolution to be set in the ps range, and this really slows
down the simulation. I have reduced time delays in my design to reduce
the simulation time required; however, it still takes about 1 hour per
millisecond of simulation time. Other than getting a faster computer,
are there any other things that can be done to reduce the simulation
time? I have already removed high-frequency signals (clocks) from the
simulator waveform window and used variables where possible.
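
One workaround commonly used for this situation (not raised in the thread,
so take it as an assumption) is to replace the DCM with a simple behavioral
stand-in for functional simulation only, so the resolution can stay at ns.
A minimal VHDL sketch; the entity and port names below are illustrative,
not from any Xilinx library:

library ieee;
use ieee.std_logic_1164.all;

-- Hypothetical stand-in for the DCM, for functional simulation only.
-- Only the ports the design actually uses are modelled here.
entity dcm_stub is
  port (
    clkin  : in  std_logic;   -- board clock
    clk0   : out std_logic;   -- takes the place of the DCM's CLK0 output
    locked : out std_logic
  );
end entity dcm_stub;

architecture behav of dcm_stub is
begin
  clk0   <= clkin;                  -- pass the clock straight through; no ps-level phase detail
  locked <= '0', '1' after 100 ns;  -- pretend the DCM locks shortly after time zero
end architecture behav;

For timing (back-annotated) simulation the real DCM model and ps resolution
are still needed, so a stand-in like this only helps the functional runs.
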
#2, 01-18-2004, 08:26 PM
Mike Treseler (Guest)
Re: Simulation Speed when using Xilinx DCM

Nick Suttora wrote:

> the simulation time required; however, it still takes about 1 hour per
> millisecond of simulation time. Other than getting a faster computer,
> are there any other things that can be done to reduce the simulation
> time? I have already removed high-frequency signals (clocks) from the
> simulator waveform window and used variables where possible.

Consider simulating your source level before the gate level.
That's about ten times faster.

-- Mike Treseler
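
As a sketch of what the source-before-gates split suggested here can look
like in practice (assumed names only; nothing below is from the thread), a
VHDL configuration can bind the same testbench socket to either the RTL
source or the gate-level netlist:

-- Assumes a testbench entity my_tb whose architecture bench instantiates a
-- component my_design under the label dut, and that both an rtl and a
-- structure (netlist) architecture of my_design are compiled into work.
configuration tb_rtl of my_tb is
  for bench
    for dut : my_design
      use entity work.my_design(rtl);        -- fast source-level model
    end for;
  end for;
end configuration tb_rtl;

configuration tb_gates of my_tb is
  for bench
    for dut : my_design
      use entity work.my_design(structure);  -- slow netlist from the tools
    end for;
  end for;
end configuration tb_gates;

Day-to-day functional runs would use tb_rtl, keeping the slow gate-level
configuration for the final timing checks.
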

#3, 01-19-2004, 03:45 PM
Nick Suttora (Guest)
Re: Simulation Speed when using Xilinx DCM

Mike Treseler wrote:
> Consider simulating your source level before the gate level.
> That's about ten times faster.
>
> -- Mike Treseler


I am using a core in the part that was delivered as a gate-level
netlist, so I have no choice.
#4, 01-19-2004, 06:44 PM
Mike Treseler (Guest)
Re: Simulation Speed when using Xilinx DCM

Nick Suttora wrote:

>> Consider simulating your source level before the gate level.
>> That's about ten times faster.
>>
>> -- Mike Treseler

>
>
> I am using a core in the part that was delivered as a gate-level
> netlist, so I have no choice.


The choice is a tradeoff between the cost of getting the source
code and the cost of testing the core netlist as-is.

-- Mike Treseler

