I am working on improving the performance of FreeDV on HF channels. As a first step I have been exploring the bit error patterns from the modem using some samples of a 1300 km HF radio path. These samples were kindly collected by Mark, VK5QI and Brenton, VK2MEV.
The FDMDV modem has 14 DQPSK data carriers. As I know the data that was transmitted, I can calculate the location of each bit error against time. Here are the bit error patterns for the 50W tx power sample, with the waterfall (spectrogram) of the same signal plotted below:
The red and blue lines indicate bit errors for the two bits modulated on each carrier. The carrier number (1 to 14) is on the left-hand axis. The x axis of both plots is time (500 bits per carrier, 10 seconds total).
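As I know the transmitted bits, finding the error locations is just an XOR of the transmitted and received bit streams. A minimal sketch in Python (the array names, shapes, and injected errors are my own illustration, not the actual FDMDV code):

```python
import numpy as np

# tx_bits and rx_bits have shape (num_carriers, num_symbols, 2): the two
# DQPSK bits per symbol for each of the 14 carriers.  Here we make up
# some data and inject regular errors on one carrier for illustration.
num_carriers, num_symbols = 14, 500
rng = np.random.default_rng(42)
tx_bits = rng.integers(0, 2, size=(num_carriers, num_symbols, 2))
rx_bits = tx_bits.copy()
rx_bits[7, ::50, 0] ^= 1            # flip every 50th bit 0 on carrier 8

# XOR gives a 1 wherever a bit was received in error
errors = tx_bits ^ rx_bits

# (carrier, symbol) error locations for bit 0, ready to scatter-plot
carrier_idx, symbol_idx = np.nonzero(errors[:, :, 0])
print(len(symbol_idx), "bit errors on bit 0")   # prints: 10 bit errors on bit 0
```

Scatter-plotting `symbol_idx` against `carrier_idx` (one colour per bit) gives plots like the ones above.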
At around 4-6 seconds on the waterfall we can see a fade in the top few carriers, with a corresponding burst of errors in the top few carriers of the bit error plot. Good.
However there are also two strange bit error effects. Firstly, the lowest few carriers have a permanently high bit error rate. The waterfall always shows a blue colour for these carriers, indicating a low power level. They are attenuated all of the time relative to the other carriers, as well as experiencing fades at 2-3 and 8-9 seconds. Normally for an HF channel the level (and hence the SNR and bit error rate) should go up and down as the channel evolves. This suggests something is attenuating the carriers at around 500Hz, possibly some analog (high pass?) filtering in the SSB transmitter. It can't be filtering in the receiver, as that would attenuate the signal and the noise equally, leaving the SNR and BER unchanged.
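The receiver-filtering argument comes down to a couple of lines of arithmetic: scaling the signal and the noise by the same factor leaves their ratio untouched. A toy example (the 10dB SNR and 6dB attenuation figures are arbitrary):

```python
import math

# Toy numbers: a carrier arriving at the receiver with 10 dB SNR
signal_power = 1.0
noise_power = 0.1
snr_db = 10 * math.log10(signal_power / noise_power)    # 10.0 dB

# A receive-side filter attenuating this carrier by 6 dB (arbitrary
# figure) scales the signal and the noise by the same factor...
attenuation = 10 ** (-6 / 10)
snr_after_db = 10 * math.log10((signal_power * attenuation) /
                               (noise_power * attenuation))

# ...so the SNR, and hence the BER, is unchanged
assert abs(snr_after_db - snr_db) < 1e-9
```

A transmit-side filter, by contrast, attenuates the signal before the channel noise is added, so it does reduce the SNR of those carriers.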
FreeDV has the ability to replay recorded samples from the SSB receiver. This gives an animated display of the spectrum and waterfall, which can show more information than a fixed image. Here is a screen shot of FreeDV replaying the 50W sample, also showing the lower tones being constantly attenuated:
In this case the waterfall is rotated (time on the vertical axis) compared to the waterfall plot above.
This sample has a centre frequency of 1200Hz. This has since been changed to 1500Hz, which I am hoping will fix the problem by moving the lower tones into a flatter region of the SSB radio passband. It does illustrate how important station set-up can be for digital modes: we really want all the carriers to have the same TX power. We also need a way to detect this sort of problem, otherwise we are introducing bit errors for no reason. Perhaps a “test frame” mode for FreeDV would help, so a friend can monitor the BER of each of your carriers while you adjust your station.
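A sketch of what such a test frame mode might look like: if the receiver knows the pseudo-random frame being transmitted, it can report a running BER for each carrier. The frame layout, bit ordering, and names below are my assumptions for illustration, not the actual FreeDV implementation:

```python
import numpy as np

NC = 14                      # number of carriers
BITS_PER_FRAME = 2 * NC      # two DQPSK bits per carrier per symbol

# Known pseudo-random test frame, shared by transmitter and receiver
rng = np.random.default_rng(1)
test_frame = rng.integers(0, 2, size=BITS_PER_FRAME)

def per_carrier_ber(rx_frames):
    """rx_frames: array of shape (num_frames, BITS_PER_FRAME).
    Assumes bits are ordered carrier by carrier: [c1b0, c1b1, c2b0, ...]."""
    errors = rx_frames ^ test_frame            # 1 where a bit differs
    errors = errors.reshape(-1, NC, 2)         # (frame, carrier, bit)
    return errors.mean(axis=(0, 2))            # BER for each carrier

# Example: 100 clean frames, then add noise to carrier 1 only
rx = np.tile(test_frame, (100, 1))
rx[:, 0] ^= rng.integers(0, 2, size=100)       # flips on carrier 1 (index 0)
print(per_carrier_ber(rx))                     # high BER on carrier 1 only
```

An attenuated carrier would stand out immediately as a stubbornly high entry in this per-carrier BER report while all the others sit near zero.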
The second strange effect can be observed in the bit error pattern for carrier 8. Bit errors occur at regular intervals, rather than with the random distribution we would expect. Between symbols 300 and 400 (2 seconds) I count 9 bit errors. Now, the FDM modulator waveform has a spiky nature; for example, here are 10 seconds of the modulator waveform:
This is because every now and again all of the carriers have the same phase, and they sum to a large amplitude spike. The large spikes occur at a rate of 9 every 2 seconds, matching the bit errors on carrier 8. This suggests we are over-driving the transmitter, causing distortion of the modem waveform and hence bit errors.
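The spikes are a peak-to-average power ratio (PAPR) problem, common to all multi-carrier waveforms. A small sketch of the worst case, using an illustrative 75Hz carrier spacing around 1500Hz (an assumption, not the actual FDMDV frequency plan): 14 equal-power carriers aligned in phase give a peak power 2 x 14 = 28 times (about 14.5dB above) the average.

```python
import numpy as np

fs = 8000                    # sample rate, Hz
nc = 14                      # number of carriers
# illustrative 75 Hz carrier spacing centred near 1500 Hz
freqs = 1500 + 75 * (np.arange(nc) - nc // 2)
t = np.arange(fs) / fs       # one second of samples

# All carriers starting in phase: at t = 0 every cosine is 1, so the
# amplitudes add to a single spike of height nc = 14
x = np.sum(np.cos(2 * np.pi * np.outer(freqs, t)), axis=0)

papr_db = 10 * np.log10(np.max(x ** 2) / np.mean(x ** 2))
print(f"peak/average power ratio: {papr_db:.1f} dB")   # ~14.5 dB
```

In practice the DQPSK data randomises the carrier phases, so full alignment is rare, but when it happens the transmitter's power amplifier must pass that peak cleanly or the waveform clips.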
Let's look at a sample generated using 18W of transmit power (plotted below with the corresponding waterfall). In this case there is less evidence of regularly spaced bit errors, which supports our theory that the TX was over-driven in the 50W sample. However we can still see the effects of attenuation in the low frequency carriers (a high bit error rate). The bursts of errors sweeping through one carrier after another are also more obvious, corresponding to diagonal stripes on the spectrogram.
The next step is to gather some more off air samples using the 1500Hz centre frequency and see if the bit error pattern for the low frequency carriers improves. I am also coding up tests for interleaving and experimenting with unequal error protection schemes.