
+++
title = "Time Delay"
author = ["Dehaeze Thomas"]
draft = false
+++

Tags :

## Phase induced by a time delay

A pure time delay can be modelled by a transfer function with constant (unity) magnitude and a phase lag that increases with frequency. This phase lag is proportional to both the time delay and the frequency:

\begin{equation} \phi(\omega) = -\omega \cdot T_s \end{equation}

with:

- \(\phi(\omega)\) the phase lag in rad
- \(\omega\) the frequency in rad/s
- \(T_s\) the time delay in s
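
As a quick numerical check (the delay and frequency values below are purely illustrative), a 1 ms delay at 100 Hz gives a lag of \(-36\) degrees:

```matlab
Ts  = 1e-3;          % Time delay [s] (illustrative value)
w   = 2*pi*100;      % Frequency [rad/s] (i.e. 100 Hz)
phi = -w*Ts*180/pi   % Phase lag [deg] => -36
```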

## Estimation of phase delay induced in sampled systems

Consider a feedback controller implemented numerically on a system with a sampling frequency \(F_s\).

The time delay associated with the limited sampling frequency \(F_s\) is \(T_s = 1/F_s\), and the induced phase lag is:

\begin{equation} \phi(\omega) = -\frac{\omega}{F_s} \end{equation}

with:

- \(\phi(\omega)\) the phase lag in rad
- \(\omega\) the frequency in rad/s
- \(F_s\) the sampling frequency in Hz

Some values are summarized in Table 1.

Table 1: Phase lag as a function of the frequency (relative to the sampling frequency \(F_s\))

| Frequency   | Phase lag [deg] |
|-------------|-----------------|
| \(F_s/100\) | -3.6            |
| \(F_s/10\)  | -36.0           |
| \(F_s/2\)   | -180.0          |
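
These values follow directly from \(\phi = -\omega/F_s = -2\pi f/F_s\), i.e. \(-360 \cdot f/F_s\) degrees; a one-line check in MATLAB (no toolbox needed):

```matlab
f_ratio = [1/100, 1/10, 1/2]; % Frequency as a fraction of Fs
phi_deg = -360*f_ratio        % Phase lag [deg] => -3.6, -36, -180
```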

This is the main reason to choose a sampling frequency much higher than the wanted feedback bandwidth: it limits the phase lag induced by the time delay at the crossover frequency. Having a sampling frequency 100 times larger than the crossover frequency is a good objective, as it keeps the induced phase lag at crossover down to \(-3.6\) degrees.

Take the example of a controller implemented with a sampling time of 0.1 ms (10 kHz sampling frequency).

```matlab
s = tf('s'); % Laplace variable (Control System Toolbox)

t_delay = 1e-4;            % Time delay [s]
G_delay = exp(-t_delay*s); % Pure time delay transfer function
```
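
A Bode plot of `G_delay` over the band of interest makes the lag visible (a minimal sketch, assuming the Control System Toolbox is available):

```matlab
% Phase of the pure delay from 10 Hz up to the Nyquist frequency (5 kHz)
bode(G_delay, {2*pi*10, 2*pi*5e3});
```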

The induced phase delay as a function of frequency is shown in Figure 1.

At the Nyquist frequency (5 kHz), the phase lag reaches \(-2\pi \cdot 5000 \cdot 10^{-4} = -\pi\ \text{rad}\), i.e. \(-180\) degrees.

{{< figure src="/ox-hugo/time_delay_induced_phase_lag.png" caption="<span class=\"figure-number\">Figure 1: </span>Phase lag induced by a time delay" >}}
