
I have been working on creating a UART on an FPGA. I can successfully transmit and receive single characters typed in PuTTY. However, when I set my FPGA to constantly transmit a long sequence of "A", I sometimes end up with a sequence of "@" or some other characters until I reset the FPGA a few times.

I believe the UART on the computer loses track of the difference between the start bit and a zero. The delay between two consecutive "A" frames is ~30 µs (measured with a logic analyzer) and the baud rate is 115200 8N1.

Is there a minimum delay that must be maintained between two consecutive RS232 frames?

Ben N
Lord Loh.

3 Answers


As well as speed and number of data bits, I think the two ends must agree on the number of start bits, stop bits and parity bits.

See Asynchronous Serial Communication

[diagram: RS232 signal framing]

The above shows how characters are separated, but it has rather idealised rise and fall times; I believe a scope would show something more like what follows (note the inverted mark/space axis compared with the prior diagram).

[oscilloscope-style trace showing realistic rise and fall times]

Perhaps you should set the speed lower; maybe your FPGA isn't emitting a well-formed signal at higher speeds.

Also, RS232 is asynchronous; I believe that means the receiver is expected to synchronise its timing based on the start and stop bits.

  • A is binary 01000001
  • @ is binary 01000000

The difference is a matter of accurate timing: the two codes differ only in the least significant bit, the lone mark bit that sits between the start bit and a run of zeros. With inaccurate timing a receiver can count six zero bits instead of five while the space level (+3...15 V) is asserted, reading the frame as "@" rather than "A".
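To make the single-bit difference concrete, here is a small Python sketch of my own (assuming 8N1 framing with data sent LSB first, as in the question) of the on-the-wire bit sequence for each character:

```python
def frame_bits(ch):
    """On-the-wire 8N1 frame: start bit (0), 8 data bits LSB first, stop bit (1)."""
    data = [(ord(ch) >> i) & 1 for i in range(8)]
    return [0] + data + [1]

# 'A' (0x41): the LSB is the lone mark between the start bit and five zeros.
print(frame_bits('A'))  # [0, 1, 0, 0, 0, 0, 0, 1, 0, 1]
# '@' (0x40): identical except that lone mark has become a space.
print(frame_bits('@'))  # [0, 0, 0, 0, 0, 0, 0, 1, 0, 1]
```

If the receiver's timing is off by roughly one bit period near the start of the frame, that lone mark right after the start bit is the first bit to be misread.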

See Signal Timing and Signal Characteristics


Is there a minimum delay that must be maintained between two consecutive RS232 frames?

No, there is no such requirement (no min and no max) in EIA/RS232C.
The Start bit of the next character can immediately follow the Stop bit of a character.
Note that the line idles at the Marking state, which is the same level as the Stop bit.

It is interesting that you make no mention of the Stop bit in the character frame.

I believe the UART on the computer loses track of the difference between the start bit and a zero. The delay between two consecutive "A" frames is ~30 µs (measured with a logic analyzer)

You are using the wrong tool for this task! You should be using a 'scope. You cannot analyse a timing problem by viewing a sampled and sanitized rendition of the analog signal.
The difference between the Start bit and a zero is timing. The character frames are transmitted at an asynchronous rate. But the bits of the frame have to be clocked at the specified clock rate.
For a 115200 baud rate, that works out to 8.68 µsec for 1 bit time. For 8 data bits plus a Start bit and a Stop bit, the frame time is 86.8 µsec.
Your question implies that you have not bothered to look at the EIA/RS232C spec for minimum rise/fall times and when the signal is typically sampled. Interesting method for implementing HW.
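The timing arithmetic quoted above can be checked in a couple of lines (a sketch; the variable names are mine):

```python
baud = 115200
bit_time_us = 1e6 / baud                   # one bit period: ~8.68 us
frame_time_us = (1 + 8 + 1) * bit_time_us  # start + 8 data + stop: ~86.8 us
print(f"{bit_time_us:.2f} us/bit, {frame_time_us:.1f} us/frame")
```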

Perhaps you should also use a frequency counter to measure the baud rate generator at each end. A mismatch of a few percent can usually be tolerated. A mismatch could produce the symptoms you see.
How come framing errors are not reported by the receiver? Instead of just looking at output, maybe you need to review the stats of the serial port, i.e. /proc/tty/driver/...
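As a rough illustration of why only a few percent of mismatch is tolerable (a sketch of mine, under the usual assumption that the receiver resynchronises only on each start-bit edge and samples mid-bit):

```python
def sample_drift(mismatch_pct, bit_position=9.5):
    """Drift of the receiver's sample point, in bit times, by the time it
    samples the middle of the stop bit (position 9.5 in an 8N1 frame),
    for a given clock mismatch in percent."""
    return bit_position * mismatch_pct / 100.0

# Once the drift exceeds ~0.5 bit times, a sample lands in the wrong bit cell.
for pct in (1, 2, 5, 6):
    print(pct, round(sample_drift(pct), 3))
```

At 5% mismatch the last sample has drifted 0.475 bit times, right at the edge of failure; anything worse and the stop bit is sampled in the wrong cell, which is exactly when framing errors should appear.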

sawdust

I suspect that UARTs are still pretty much similar to the original ones. They used a clock at 16× the data rate to "sample" the data, versus the earlier analog scheme that used an edge-triggered oscillator. Using the sampling approach the UART could fairly accurately position its sample time in the middle of each bit, and could even take multiple samples to be a little more noise tolerant.
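Here is a minimal sketch of that 16× sampling idea (my own illustration, assuming a majority vote over three mid-bit samples, as many classic UARTs do):

```python
OVERSAMPLE = 16

def receive_byte(samples):
    """samples: line level at 16x the baud rate, index 0 aligned with the
    falling edge of the start bit. Returns the 8 data bits, LSB first."""
    bits = []
    for bit_index in range(1, 9):                    # skip the start bit
        mid = bit_index * OVERSAMPLE + OVERSAMPLE // 2
        votes = samples[mid - 1] + samples[mid] + samples[mid + 1]
        bits.append(1 if votes >= 2 else 0)          # majority of 3 samples
    return bits

# One clean 8N1 frame carrying 'A' (0x41), each bit held for 16 samples:
frame = [0, 1, 0, 0, 0, 0, 0, 1, 0, 1]
samples = [level for bit in frame for level in [bit] * OVERSAMPLE]
print(receive_byte(samples))  # [1, 0, 0, 0, 0, 0, 1, 0] -> 0x41
```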

Your description is unclear: in a recent comment you talk about "detecting a start bit", but you had implied earlier that you're TRANSMITTING and hence would have nothing to "detect".