Re: [V4Protocol] Noise and functionality
- Cortland, This is not a question of squelch in the conventional sense used in VHF/UHF radio. Let me explain the process a bit more.
First, the measurement of latency has no bearing on whether V4 works or not. It is simply a tool added to let users get a handle (at least a rough measurement) on the latency. The number measured is simply displayed; it is not used for anything except to notify the user that they may have a problem (most commonly the VOX or DLY setting on the transceiver or SignaLink set above minimum, something the software has no control over).
Second, think about the problem of measuring latency a bit. When you key the transceiver, the PTT blanks or mutes the receiver audio, so you have virtually zero audio coming from the receiver. Now you release the PTT. When do you know the receiver is again active, i.e. that the latency time has elapsed? The only thing you have to look at is the background noise from the receiver, and on HF this is never zero. In fact, as long as you have AGC on (not masked off with the RF gain control), the background noise audio level will be close to the normal received audio level (that is what AGC does!).
If someone asked you to measure this latency on an HF receiver, how would you do it? Maybe use a dual-trace scope with one trace showing the PTT and the other looking at the audio out of the receiver. That is exactly what the software is doing. So the receiver's background noise level is the only convenient vehicle you have for measuring latency with the DSP software. In practice, IF you have AGC on and IF you have your audio level set correctly (say about halfway or so into the green), it will ALWAYS measure the latency. At any rate, it is a convenience measurement and has nothing to do with the operation of V4.
73, Rick KN6KB
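The technique Rick describes — timing how long after PTT release the receiver's background noise returns — can be sketched roughly as below. This is a hypothetical illustration, not V4's actual code; the frame size, threshold factor, and function names are my assumptions.

```python
# Hypothetical sketch of measuring T>R latency from receiver background noise.
# Not V4's actual implementation; frame length and threshold factor are assumed.
import math

FRAME_MS = 10  # audio analysed in 10 ms frames (assumption)

def rms(frame):
    """Root-mean-square level of one frame of audio samples."""
    return math.sqrt(sum(s * s for s in frame) / len(frame))

def measure_latency(frames, quiet_rms, factor=3.0):
    """Milliseconds from PTT release (frames[0]) until the receiver audio
    rises clearly above the 'quiet' level measured while PTT was active.
    Returns None if the noise never returns (e.g. AGC off, level too low)."""
    threshold = quiet_rms * factor
    for i, frame in enumerate(frames):
        if rms(frame) > threshold:
            return i * FRAME_MS
    return None
```

Note that if the audio level is set too low (or AGC is off), the noise may never cross the threshold, which matches Rick's point that the measurement depends on proper level setup.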
- ___Original Message_________________________________________
From: Rick Muething <rmuething@...>
Date: Sun, 21 Aug 2011 Time: 10:30:57
>It is essential that you have sufficient receive “noise” to drive the receive
>level indicator on V4 well into the green area. It is the return of this “noise”
>after PTT is released (PTT blocks receiver output noise) that is used to tell
>when the receiver is again “alive”.
>Rick
Please pardon me for saying so, but is this the right way to do it? You
cannot rely on there being sufficient receiver/band noise to do anything
-- the overall noise level may be so low that the timer fails to trigger.
Surely the way to do it would be something like:
1. At the end of sending a block of data, switch from transmit to
receive and start the round-trip timer.
2. Listen for a recognizable character/block from the other station,
then stop the timer. The RTT can then be computed.
Or is this too simple?
- Ian, Believe it or not, I understand this pretty well! Measuring the round-trip time (which V4 also does, for computing and optimizing some of the other timing parameters) does NOT measure the T>R latency. In fact, the T>R latency is not even a component of the round-trip time. Think about this not-so-uncommon scenario:
V4 is set up with a modern transceiver using a SignaLink USB (which uses VOX keying), and the SignaLink keys the transmitter PTT. The time for the PTT release (AFTER the end of a wave file playing) is a function of the DLY control on the SignaLink, typically < 20 ms to as much as 5 or more seconds. (For an SDR radio the “DLY” might be a function of sample rate, buffer size, filter complexity, VAC cables used, etc.) If the SignaLink is set above the minimal delay, the PTT release is slowed, and this dramatically increases the T>R latency. But the round trip (from the end of DATA from the sending station to the first data from the replying station) is NOT affected by the sending station's PTT hold time.
If the T>R latency is too long, it will block the reception of all or part of the received data (because the station would still have the PTT active). So excessive T>R latency essentially makes the receiver “deaf” for a period of time, and there is no way the remote station could know how long that “deafness” lasts without some complex and time-consuming trial-and-error search repeated over several cycles. The easiest and most reliable approach is to agree on a maximum permissible T>R latency value and include it in the protocol spec. (This is done in virtually all HF ARQ protocols.) Then make sure each station complies with that max T>R latency value. Setting a max value is always a compromise, but it is necessary for compatibility: the longer the max T>R latency permitted, the more T>R>T overhead (timing guard bands) in the protocol, but the more compatible the protocol is with all popular hardware.
The spec in V4 is 250 ms, but the program has some guard band and should work up to 300 ms of T>R latency. That value is large enough that even SDR radios (with their increased latency) can operate with proper setup. A typical modern hardware radio may have a T>R latency of 30-50 ms, so you could lop off perhaps 300-400 ms from the protocol T>R>T cycle (< 10%) if you REQUIRED each station to meet that lower T>R latency spec... but those with older, slower radios or with SDR radios would probably be out of luck.
Remember, in V4 the T>R latency is not measured every cycle and is not even used in the timing calculations. It is measured and presented to the user during a call to aid in selecting or configuring the station to meet the maximum permissible latency value. For example, in the above scenario with a SignaLink or an SDR, if you called CQ or another station with the DLY control not fully CCW, or with the wrong setup on the SDR, the V4 program will warn the sender with the measured value and a red warning label on the TNC. We don't want to go back to the 1980s, where to set up a packet station and network you had to guess/enter/tweak half a dozen timing parameters and do trial and error to see if it was compatible with other radios on the network.
I have a Flex 3000 here and it works consistently measuring the T>R latency. The only requirements are:
1) You have to have AGC on (that is really good operating practice anyway, to keep the audio level near constant to the sound card).
2) You have to have the audio drive level sufficiently high (I say mid scale, but it still usually works in the “blue”). Again, this is part of normal sound card setup; without sufficient audio you will essentially be working with an 8- or 10-bit sound card instead of 16 bits.
3) You have to have MON off on the Flex setup (otherwise the V4 software can't compute the “quiet” threshold when the PTT is active).
73, Rick KN6KB
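The trade-off Rick describes can be put in back-of-envelope numbers. The sketch below is my illustration: the 300 ms and 50 ms figures come from the post, but the assumption that a protocol budgets one guard interval per T>R turnaround (two per ARQ cycle) is mine, which is why the result only roughly matches the 300-400 ms saving Rick cites.

```python
# Back-of-envelope sketch of the max-latency vs. overhead trade-off.
# Assumption (mine): two T>R turnarounds per ARQ cycle, each needing a
# guard interval equal to the maximum permitted T>R latency.

def turnaround_overhead(max_latency_ms, turnarounds_per_cycle=2):
    """Guard time a protocol must budget per ARQ cycle if every station
    may take up to max_latency_ms to go from transmit to receive."""
    return max_latency_ms * turnarounds_per_cycle

relaxed = turnaround_overhead(300)  # 600 ms/cycle: compatible with SDRs, SignaLink
strict = turnaround_overhead(50)    # 100 ms/cycle: fast hardware radios only
saving = relaxed - strict           # ballpark of the few hundred ms Rick cites
```

Under these assumptions, tightening the spec saves a few hundred milliseconds per cycle but excludes slower hardware, which is exactly the compromise described above.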
- ___Original Message_________________________________________
From: Rick Muething <rmuething@...>
Date: Tue, 23 Aug 2011 Time: 10:57:23
>Believe it or not I understand this pretty well! Measuring the round
>trip time (which V4 also does in terms of computing and optimizing some
>of the other timing parameters) does NOT measure the T>R latency. In
>fact the T>R Latency is not even a component of the round trip time.
>Think about this not so uncommon scenario:
Aha, now I understand what you're getting at! Many thanks Rick.
I stand (or sit) corrected. And thanks for the explanation. I had noticed that using *WINMOR TNC* I was often unable to produce enough noise to approach the green. That really is off-topic here, though.
- Cortland, Not that off topic. The receive levels of V4 and WINMOR are basically the same, and they are NOT set internally by either program. The received levels are set ONLY by:
1) The audio level into the sound card from the radio speaker or aux audio output.
2) The gain controls (pot and/or jumper) in the sound card interface (e.g. the SignaLink has both an RX pot and an internal jumper J2).
3) On some sound cards (if the driver for that sound card uses the Windows mixer), the mixer gain control FOR THAT SOUND CARD. See the Help on how to set this using the Windows Mixer. Note Windows may reset this automatically after some Windows updates (don't ask me why!).
The idea of setting the levels toward the middle of the range is simple. You have a 16-bit sound card. If you don't use at least half of the input range of the sound card, you lose dynamic range. Without getting the level toward the middle, you may be working with the equivalent of an 8- or 10-bit sound card. If you overdrive the sound card (into the RED), you will clip some of the waveforms, and that causes distortion in the DSP demodulator.
Rick KN6KB
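The "equivalent of an 8- or 10-bit sound card" point can be quantified: each halving of the peak input level costs roughly one bit of effective ADC resolution. The sketch below is my illustration of that arithmetic, not anything from V4 itself.

```python
# Sketch (my illustration): approximate usable ADC bits when the audio
# peaks at only a fraction of the sound card's full-scale input range.
import math

def effective_bits(peak_fraction, card_bits=16):
    """Usable bits when the signal peaks at peak_fraction of full scale
    (0 < peak_fraction <= 1.0). Each halving of level costs one bit."""
    return card_bits + math.log2(peak_fraction)

full = effective_bits(1.0)     # 16.0 bits at full scale
half = effective_bits(0.5)     # 15.0 bits at half scale
low = effective_bits(1 / 64)   # 10.0 bits: the "8- or 10-bit card" regime
```

So a signal peaking at only a few percent of the input range really does behave like a much coarser converter, which is why the posts keep stressing getting the level up into the green.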
- GM Rick, I don't know if it is my computer or something else, but I get very inconsistent numbers for latency in V4. They can vary by as much as a factor of 5 to 1 in just one ARQ CQ sequence. I understand that there would be some variation, but not by anywhere near 5 to 1. I am using a TS-2000 with a SLUSB sound card and a 2.3 GHz E machine. Oh yes, I have plenty of noise for the program to work with: background noise at approximately half the green line. Any thoughts? Dave K3GAU
- Dave, I really need to see the numbers. They are in the debug log file. Programming is nothing but details! 5:1 is a big spread, but 20 ms and 100 ms is 5:1 and entirely possible; 200 ms and 1000 ms would be truly unusual. Make sure your SignaLink DLY is at absolute minimum (CCW). You typically should see a number somewhere around 50-120 ms or so. Actually, anything consistently under 250 ms is fine (as I have mentioned before, this is a rough test mainly to catch the extremes in settings). The program does NOT use the measurement at all; it is simply there to warn the user of a setup or hardware configuration that could cause a problem due to excess latency. You might also check the AGC setting on the TS-2000. Setting it fast may reduce the latency (I haven't checked that yet, but it is affected by AGC speed on some radios). Rick KN6KB