I made an automatic antenna delay calibration program. It hill-climbs to find the antenna delay that produces a measured distance closest to the known true distance.
Each time it changes the antenna delay, it transmits the new value to the peer, which sets its own antenna delay to match.
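The loop is roughly this (a minimal Python sketch, not my actual firmware: `measure_distance()` is a fake stand-in for a real two-way-ranging exchange, and the delay units and distance-per-tick slope are illustrative numbers, not the radio's real ones):

```python
import random

TRUE_DISTANCE_M = 1.67  # known ground-truth distance to the peer

def measure_distance(antenna_delay):
    # Stand-in for a real ranging exchange: a too-small antenna delay
    # inflates the measured distance, plus some measurement noise.
    # The slope and center value here are made up for illustration.
    return TRUE_DISTANCE_M + (16436 - antenna_delay) * 0.00469 + random.gauss(0, 0.01)

def calibrate(start_delay=16000, step=8, iters=300):
    delay = start_delay
    best_err = abs(measure_distance(delay) - TRUE_DISTANCE_M)
    for _ in range(iters):
        candidate = delay + random.choice((-step, step))
        err = abs(measure_distance(candidate) - TRUE_DISTANCE_M)
        if err < best_err:  # keep the move only if the error shrank
            delay, best_err = candidate, err
    return delay, best_err
```

Both ends apply the same delay, so each step effectively moves the whole round-trip correction at once.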
It produced an odd result. In the plot, the vertical axis is antenna delay and the horizontal axis is measured distance in meters. I was trying to home in on a true distance of 1.67 meters, but there seem to be two sweet spots. Why is this?