Changing AVB network latency

nick
Active Member
Posts: 42
Joined: Tue Jan 07, 2020 10:35 am

Changing AVB network latency

Post by nick »

Hi,
I'd like to know the latency introduced by an XMOS device running the AVB firmware.
In this post:
https://www.xcore.com/viewtopic.php?t=3189
it is stated that the XMOS supports Class A traffic (2 ms latency).

Is there a way to change this parameter?
I have an RME Digiface AVB, and in its controller software you can set the network latency down to 0.3125 ms.
Can I do a similar thing on the XMOS?


akp
XCore Expert
Posts: 578
Joined: Thu Nov 26, 2015 11:47 pm

Post by akp »

I haven't tried it, but I'd suggest modifying AVB_DEFAULT_PRESENTATION_TIME_DELAY_NS and looking into the source->presentation parameter for the talker. As for the listener, I am pretty sure that just depends on what the connected talker is doing when it sets its presentation time.
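
For illustration, here's a minimal sketch of what I mean, assuming the usual sw_avb convention that application overrides of the stack defaults live in the app's avb_conf.h (check where the define is actually picked up in your source tree):

/* avb_conf.h: hypothetical application override of the stack default.
 * The default corresponds to Class A (2 ms); this asks the talker to
 * stamp presentation times 1 ms ahead instead. The value is in ns. */
#define AVB_DEFAULT_PRESENTATION_TIME_DELAY_NS 1000000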

EDIT: Obviously if you reduce the talker presentation delay then you will have to be very careful about the hops in your network setup.
nick
Active Member
Posts: 42
Joined: Tue Jan 07, 2020 10:35 am

Post by nick »

Thank you for the suggestion. I'll try it as soon as possible.
I'm not worried about the hops because I only need a direct connection, or at most one switch.
akp
XCore Expert
Posts: 578
Joined: Thu Nov 26, 2015 11:47 pm

Post by akp »

Super. Looking forward to hearing the result, as I may need to modify it in the other direction and make it longer (I can live with higher latency, but my network setup is, let's say, unusual).
nick
Active Member
Posts: 42
Joined: Tue Jan 07, 2020 10:35 am

Post by nick »

Hi akp,
some good news: the parameter you mentioned (AVB_DEFAULT_PRESENTATION_TIME_DELAY_NS) does seem to modify the XMOS talker latency.

To test it I used the RME Digiface AVB (connected through USB3 to a Mac) with a network latency (talker) of 0.3125 ms, and an XMOS multichannel board.
On the Mac I ran software that performs a synchronous play-record. On the XMOS I connected (let's call it) TDM OUT line 1 to TDM IN line 1 (a hardware loopback).
I set the output buffer on the XMOS to 64 samples and in the Mac software to 32 samples (as low as I could go).

I did the play-record of a test track several times and then, comparing the original track with the recorded one, I measured the latency.
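In case it's useful, here is a sketch of one way to find the offset between the original and the recorded track by brute-force cross-correlation (plain C on synthetic buffers, WAV loading omitted; it just illustrates the idea, not exactly the tool I used):

#include <stdio.h>

/* Brute-force cross-correlation: returns the lag (in samples) at which
 * "recorded" best matches "original". Fine for short test tracks. */
static int estimate_delay(const float *original, const float *recorded,
                          int n, int max_lag)
{
    int best_lag = 0;
    double best_corr = -1e30;
    for (int lag = 0; lag <= max_lag; lag++) {
        double corr = 0.0;
        for (int i = 0; i + lag < n; i++)
            corr += (double)original[i] * (double)recorded[i + lag];
        if (corr > best_corr) {
            best_corr = corr;
            best_lag = lag;
        }
    }
    return best_lag;
}

int main(void)
{
    enum { N = 4800, TRUE_DELAY = 260 };   /* 0.1 s of audio at 48 kHz */
    float original[N] = {0}, recorded[N] = {0};

    /* Synthetic stand-in for the two tracks: a single click, and the
     * same click delayed by TRUE_DELAY samples. */
    original[100] = 1.0f;
    recorded[100 + TRUE_DELAY] = 1.0f;

    int lag = estimate_delay(original, recorded, N, 1000);
    printf("estimated delay = %d samples = %.3f ms at 48 kHz\n", lag, lag / 48.0);
    return 0;
}
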
I did two tests:
1) AVB_DEFAULT_PRESENTATION_TIME_DELAY_NS = 2 ms
2) AVB_DEFAULT_PRESENTATION_TIME_DELAY_NS = 1 ms

The first gave me a total latency ranging from 300 to 330 samples (at 48 kHz, from 6.25 ms to 6.875 ms).
The second gave me a total latency ranging from 240 to 280 samples (at 48 kHz, from 5 ms to 5.83 ms).
So it seems that the parameter is working.

But I have some doubts:
- Let's suppose that the in and out latencies are symmetrical, so we can divide my results by 2, obtaining about 3.25 ms and 2.5 ms. These values are higher than what I expected (the theoretical 2 ms or 1 ms). The explanation I came up with is the AVB-to-USB conversion performed by the RME. Is this correct?
- I obtained quite variable results (for example, in the second test I got values from 240 to 280 samples). Could this be related to the USB behaviour too?

Can you think of a better method to measure the latency?

Thank you again for your support.
akp
XCore Expert
Posts: 578
Joined: Thu Nov 26, 2015 11:47 pm

Post by akp »

I am glad you got some useful results. I am not sure how to improve your test, but if I think of something I will let you know. It does seem like you reduced your latency by 1 ms as expected, though.
ozel
Active Member
Posts: 45
Joined: Wed Sep 08, 2010 10:16 am

Post by ozel »

I'm not 100% clear on your way of measuring latency, and my practical AVB knowledge is a few years old now, but I would always suggest measuring the latency in a loopback configuration and "on the same side" of the interfaces.
In particular, by using a real oscilloscope and a simple function generator producing individual pulses, you can easily measure the round-trip delay and jitter from ADC input to DAC output.
On a Mac, Soundflower used to be a nice tool; on Linux, just configure ALSA/jackd appropriately to loop the input samples back to an audio output.

You could probably also try to loop back the Ethernet sample payload from RX to TX on the XMOS, but using a hardware oscilloscope is really the simplest way, rather than counting samples on the PC. If you switch an analog hardware scope to persistence display mode, you'll quickly see the minimum and maximum of the latency variation (i.e. the jitter) after a couple of looped-back pulses. The trigger must be set on the input pulse that goes into the ADC, and the timebase chosen such that you see the reappearing looped-back pulse on the DAC output, measured with a second scope probe.

It's safe to assume the XMOS latency and jitter are pretty deterministic (and pretty much in line with the presentation time delay, although there should be a practical minimum) in comparison to the USB interface and PC latency. AFAIK, with USB3 there are different ways to do synchronous audio, but I would assume there is, in any case, a PID loop on that RME interface that controls buffer levels and will affect the overall latency and jitter more than the XMOS AVB stack does with its synchronous Ethernet and ADCs/DACs or other digital audio interfaces besides USB.

I briefly looked at the RME Digiface AVB, as I'm really pleased that we're slowly seeing more devices, and got curious. The manual talks quite a bit about latency, but I assume you've already seen that.