The original ASCII code uses seven data bits per character to encode the lowercase and uppercase letters, the digits, punctuation, and some control characters. Newer encodings like ISO 8859-1 use eight bits per character and include additional symbols such as umlauts. Obviously, at least 26+26+10 = 62 different code words are required for the letters and digits alone. This still fits into six bits, but once punctuation is included, more than 64 code words are needed, so at least seven bits per data word are required. Therefore, custom encodings must be used when only six or five data bits per data word are selected. One typical example is the five-bit encoding historically used for teletype machines, where special control characters shifted between a letters mode and a figures mode.
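The word-length arithmetic above can be checked with a few lines of Python; the `bits_needed` helper is just an illustration, not part of the applet:

```python
import math

def bits_needed(symbols: int) -> int:
    """Smallest n such that 2**n code words cover the given symbol count."""
    return math.ceil(math.log2(symbols))

# 26 lowercase + 26 uppercase + 10 digits = 62 symbols: six bits suffice.
letters_and_digits = 26 + 26 + 10
print(bits_needed(letters_and_digits))      # 6

# Even a few punctuation marks push the total past the 64 available
# six-bit code words, forcing a seventh bit (as in ASCII).
print(bits_needed(letters_and_digits + 3))  # 7
```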
The demo sequence used in this applet reconfigures the 8251 to first use eight bits per word, then seven, then six, then five. The only way to do this is to write a command word with the internal-reset bit (D6) set to the 8251 chip, which resets the internal command state machine. Afterwards, a new mode selection followed by a new command instruction has to be written to the chip. The same data sequence of

0x55 0xaa 0x33 0x00

is then transmitted with each word-length setting, so that the resulting waveforms can be compared directly.
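The reset-then-reprogram sequence can be sketched as follows. The bit layout follows the 8251 data sheet (D3..D2 select the character length, D6 in a command word is the internal reset); the `write_control` callback is a hypothetical stand-in for a write to the chip's control register:

```python
# Hedged sketch of the 8251 reconfiguration sequence used by the demo.
# The write_control() parameter is an assumed helper, not a real API;
# the bit fields follow the 8251 mode/command instruction format.

CMD_INTERNAL_RESET = 0x40              # command word with D6 set

def mode_word(data_bits: int, stop_bits: int = 1, baud_factor: int = 16) -> int:
    """Build an asynchronous-mode word for the given character length."""
    length = {5: 0b00, 6: 0b01, 7: 0b10, 8: 0b11}[data_bits]  # D3..D2
    factor = {1: 0b01, 16: 0b10, 64: 0b11}[baud_factor]       # D1..D0
    stops  = {1: 0b01, 2: 0b11}[stop_bits]                    # D7..D6
    return (stops << 6) | (length << 2) | factor              # parity disabled

CMD_ENABLE = 0x05                      # TxEN (D0) and RxE (D2) set

def reconfigure(write_control, data_bits: int) -> None:
    """Reset the 8251, then program a new mode and command word."""
    write_control(CMD_INTERNAL_RESET)    # back to the mode-instruction state
    write_control(mode_word(data_bits))  # next write is taken as the mode word
    write_control(CMD_ENABLE)            # subsequent writes are command words
```

With these definitions, `mode_word(8)` yields `0x4E`, the classic "8 data bits, 1 stop bit, 16x clock, no parity" initialisation value.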
After the demo sequence has completed, the simulation is paused and the waveforms are redrawn. Zoom into the waveforms to verify the behaviour of the 8251. You can then click the "continue" button in the simulator control panel to resume the simulation and transmit additional characters, or reset the chip configuration to explore the 8251 further.
Run the applet | Run the editor (via Webstart)