Signal Processing Asked on January 4, 2022
My receiver contains AGC, timing recovery, and carrier recovery blocks. I am using Gardner timing recovery with a loop filter and a parabolic (Farrow) interpolation filter.
The input to the Farrow filter is the fractional delay, which I expect to be essentially constant after the preamble, with perhaps only small variations. However, the fractional delay does not settle within the preamble time.
Between two data bursts only a pure carrier is sent, so there are no transitions for the
timing recovery to work on. As a result, immediately after each data burst there are large variations in the fractional delay, which cause timing errors. Any suggestions?
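To make the failure mode concrete, here is a small sketch (not the asker's actual receiver, just an illustration of the Gardner error equation e[k] = x((k-1/2)T)·(x(kT) - x((k-1)T))): with modulated data the detector produces a consistent error, but on an unmodulated carrier every difference term x(kT) - x((k-1)T) vanishes, so nothing drives the loop and only noise remains.

```python
import numpy as np

def gardner_errors(x, sps):
    """Gardner timing errors from a real baseband signal sampled at
    sps samples per symbol (sps must be even)."""
    half = sps // 2
    errs = []
    for k in range(1, len(x) // sps):
        prev = x[(k - 1) * sps]
        curr = x[k * sps]
        mid = x[k * sps - half]          # half-symbol-spaced sample
        errs.append(mid * (curr - prev))
    return np.array(errs)

sps = 8
t = np.arange(40 * sps)

# Alternating BPSK symbols with sinc pulses and a 1-sample timing offset:
# plenty of transitions, so the TED produces a consistent nonzero error.
alt = sum((-1) ** k * np.sinc((t - 1) / sps - k) for k in range(40))
err_data = gardner_errors(alt, sps)

# "Pure carrier" between bursts, modeled as a constant baseband level:
# no transitions, so every Gardner error term is exactly zero.
const = np.ones_like(t, dtype=float)
err_idle = gardner_errors(const, sps)
```

With noise added to the idle segment, those zero terms become pure noise, which is exactly the random wander in the fractional delay described above.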
If your data isn't white, or with certain TEDs, there can be a lot of jitter.
If you're using a preamble, why not run a least-squares or grid-search timing estimate over it, and then heavily damp your NDA detector? Or don't even bother tracking: just lock the timing down until the next burst.
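A minimal sketch of the grid-search idea, under assumed names (`grid_search_timing`, `n_frac` are made up for illustration): hypothesize each fractional delay on a grid, interpolate symbol-rate samples at that delay, and keep the delay that maximizes correlation with the known preamble. Linear interpolation stands in here for the receiver's Farrow structure.

```python
import numpy as np

def grid_search_timing(rx, preamble, sps, n_frac=32):
    """Return the fractional delay (in symbols, in [0, 1)) that maximizes
    correlation of interpolated symbol-rate samples with the preamble."""
    n = len(preamble)
    base = np.arange(n) * sps            # nominal symbol instants
    best_tau, best_metric = 0.0, -np.inf
    for k in range(n_frac):
        tau = k / n_frac                 # fractional-delay hypothesis
        t = base + tau * sps
        i = t.astype(int)
        frac = t - i
        j = np.minimum(i + 1, len(rx) - 1)
        samples = rx[i] * (1.0 - frac) + rx[j] * frac   # linear interp
        metric = abs(np.vdot(preamble, samples))
        if metric > best_metric:
            best_metric, best_tau = metric, tau
    return best_tau

# Demo: band-limited preamble delayed by 3 samples (tau = 3/8 symbol).
rng = np.random.default_rng(0)
sps = 8
preamble = rng.choice([-1.0, 1.0], size=32)
m = np.arange(32 * sps + sps)
rx = sum(sym * np.sinc((m - 3) / sps - k) for k, sym in enumerate(preamble))
tau_hat = grid_search_timing(rx, preamble, sps)
```

After this one-shot estimate, the fractional delay fed to the Farrow interpolator can simply be held (or the NDA loop run with a very small bandwidth) until the next burst's preamble arrives.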
Answered by FourierFlux on January 4, 2022