Blocks on magnetic tape occurred when you stopped writing: the tape would stop, leaving about a ½ inch of tape with no useful information on it. Disks had similar gaps where the write heads were turned on or off.

Communication lines, starting with telegraph and teletypes, left the line in ‘marking state’ when there was nothing to send. This meant current was flowing in current-loop connections, which suited transmitters and receivers that were all wired in series. The teletype would begin the transmission of each character with a ‘start bit’. Each character was independently timed.
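The start-bit scheme above can be sketched in code. This is a minimal illustration of asynchronous (start-stop) framing, not anything from the original text: the function name and the choice of 8 data bits with two stop bits are my own assumptions for the example.

```python
# Sketch of asynchronous (start-stop) character framing.
# The idle line is 'marking' (logic 1); a start bit (0, 'spacing')
# tells the receiver to begin timing this one character on its own.

def frame_char(byte, data_bits=8, stop_bits=2):
    """Frame one character: start bit, data bits low bit first, stop bits."""
    bits = [0]                                           # start bit (space)
    bits += [(byte >> i) & 1 for i in range(data_bits)]  # low bit first
    bits += [1] * stop_bits                              # stop bits (mark)
    return bits

# 'A' = 0x41; the line returns to marking (1) after each character,
# which is why each character could be independently timed.
print(frame_char(0x41))   # -> [0, 1, 0, 0, 0, 0, 0, 1, 0, 1, 1]
```

Because the line rests at marking between characters, the receiver only has to hold timing accuracy across one character at a time.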

Early synchronous modems such as the AT&T 201 were full duplex and would send a continuous sequence of bits to maintain the clock when there was nothing to send. Normally the modem provided a bit-clock signal to the receiver. Certain patterns of ~25 bits of user data could put the line into a state where clock recovery failed.

With SDLC and HDLC, bit stuffing was introduced, and an arbitrary user bit stream could be sent; the bits were clocked by a signal from the sending modem to the user data source, and this signal was slightly irregular. Occasionally the sending modem would calculate that line-state transitions had become too few and insert one so that the receiving modem could maintain bit synchronization. The receiving modem duplicated that calculation and thus did not mistake the extra transition for a user bit. This was sometimes called bit stuffing. The end result for the user was the transmission of arbitrary delimited bit strings. In effect there was a special mark, called a ‘flag’ as I recall, that the user could put between bits and that would be conveyed by the receiving modem to the user. It was at this level that CRC error detection was frequently added. There was no character sync, but the bit stream was usually divided at higher levels into 8-bit octets. IBM characters were transmitted high bit first; others were transmitted low bit first. Flags were sent when there was no data to send.
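The transparency mechanism can be sketched as follows. This is a minimal sketch of the standard HDLC zero-bit-insertion rule (after five consecutive 1 bits, a 0 is inserted, so user data can never mimic the 01111110 flag sequence); the function names are mine, and the sketch assumes a payload-only stream with no flags or aborts inside it.

```python
# Sketch of HDLC-style bit stuffing (zero-bit insertion).
# The flag sequence 01111110 delimits frames; stuffing guarantees
# the payload never contains six consecutive 1 bits, so a flag
# is always recognizable as a delimiter.

FLAG = [0, 1, 1, 1, 1, 1, 1, 0]

def bit_stuff(bits):
    """Insert a 0 after every run of five consecutive 1 bits."""
    out, run = [], 0
    for b in bits:
        out.append(b)
        if b == 1:
            run += 1
            if run == 5:
                out.append(0)   # stuffed bit, invisible to the user
                run = 0
        else:
            run = 0
    return out

def bit_unstuff(bits):
    """Remove the 0 that follows each run of five 1 bits (payload only)."""
    out, run, drop_next = [], 0, False
    for b in bits:
        if drop_next:           # this is the stuffed 0; discard it
            drop_next = False
            run = 0
            continue
        out.append(b)
        if b == 1:
            run += 1
            if run == 5:
                drop_next = True
        else:
            run = 0
    return out

data = [1, 1, 1, 1, 1, 1, 0, 1, 1]     # six 1s would look like a flag
stuffed = bit_stuff(data)
print(stuffed)                          # -> [1, 1, 1, 1, 1, 0, 1, 0, 1, 1]
print(bit_unstuff(stuffed) == data)     # -> True
```

With this rule in place, flags can be sent back to back during idle time and the receiver can pick out frame boundaries unambiguously from an arbitrary user bit stream.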