I am still struggling with measuring time delays in the few tens of nanoseconds
using my XK-1 board running with a 400 MHz reference clock. I have now altered
my pulse generator to combine the original start/stop pulses into a single pulse
generated via an XOR gate.
If I feed this pulse into my board, which is now running only the pulse_input()
function shown in the code below, the minimum time delay I can see is about 80 ns.
I would like to get down to around 30-40 ns.
Code:
void pulse_input(in port start)//, streaming chanend c_start)
{
    int counter = 0;
    while (1) {
        while (peek(start) == 1) {  // busy-poll while the pulse is high
            test0 <: 1;
            counter++;
        }
        test0 <: 0;
        if (counter > 0) {
            // c_start <: counter;
            counter = 0;
        }
    }
}
The function busy-polls the input pin and, while the pulse is high, increments the
counter and sets an output port (test0) high for the logic analyzer. When the pulse
goes low the output port is set low. The commented-out line allows the data to be
sent to another thread for accumulation and display; it is commented out to keep
this example as simple as possible.
The attached screenshot shows the start/stop signals (Channels 0 and 1) and the
output of the XOR chip (Channel 2). Channel 3 shows the test0 signal.
There is a delay of around 80 ns before test0 goes high. It stays high for the
duration of the input pulse, which in this case is around 150 ns.
The problem starts when the pulse length is less than about 80 ns: in that case
pulse_input() never detects the pulse at all. At the borderline length, pulses are
detected only intermittently.
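For what it's worth, I had expected an event-driven version along these lines to react faster than busy-polling with peek(). This is an untested sketch; I am assuming test0 is passed in as an output port and that the reference timer ticks at 10 ns:

Code:
void pulse_input(in port start, out port test0)
{
    timer t;
    unsigned t_rise, t_fall;
    while (1) {
        // block until the pin goes high, then timestamp the rising edge
        start when pinseq(1) :> void;
        t :> t_rise;
        test0 <: 1;
        // block until the pin goes low again
        start when pinseq(0) :> void;
        t :> t_fall;
        test0 <: 0;
        // pulse width in reference-timer ticks (10 ns each at 100 MHz)
        unsigned width = t_fall - t_rise;
    }
}

I have not tried this yet, so I don't know whether it avoids the startup delay or whether the delay is inherent in the port logic.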
I don't understand why there should be this long delay between the input pulse and
the corresponding input seen by pulse_input(). I have looked at the generated
assembler and there does not seem to be anything odd there. Is there any more
documentation on port I/O than is contained in "Programming XC on XMOS Devices"?
Thanks again for any help.
John.