Hexadecimal Digits
What symbols should be used for digits greater than nine?

Over half a century ago (when character codes grew from 6-bit BCD, naturally grouped as 3-bit octal digits, to 8-bit EBCDIC, naturally grouped as 4-bit hexadecimal nibbles), it was foolishly decided to employ letters of the alphabet, rather than creating new symbols for the digits beyond nine (or, perhaps, using entirely different glyphs for all sixteen of the hexadecimal digits!). When 7-bit ASCII was standardized (X3.4-1966), another opportunity was missed: the six code points beyond the ASCII code for '9' were wasted on punctuation, instead of simply being reserved for symbols representing the digits from ten to fifteen. At the time, many proposals were made for hexadecimal glyphs to replace the "ridiculous" practice of using the alphabet for digits.   [See, for example, Wikipedia-Hexadecimal#written_representations.]
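The cost of that missed opportunity is still visible in everyday code. Because punctuation occupies the six code points after '9' (0x3A through 0x3F), the values ten through fifteen had to be encoded starting at 'A' (0x41), and every routine that converts a hexadecimal digit must special-case the letters. A minimal sketch in C (the helper name hex_val is purely illustrative, not from any standard library):

    #include <stdio.h>

    /* Convert one hexadecimal digit character to its numeric value,
     * or return -1 for a non-digit.  Had X3.4-1966 placed the digits
     * ten..fifteen at 0x3A..0x3F, the first branch alone would
     * suffice; instead the alphabetic digits need extra cases. */
    int hex_val(char c)
    {
        if (c >= '0' && c <= '9') return c - '0';        /* 0x30..0x39 */
        if (c >= 'A' && c <= 'F') return c - 'A' + 10;   /* 0x41..0x46 */
        if (c >= 'a' && c <= 'f') return c - 'a' + 10;   /* 0x61..0x66 */
        return -1;
    }

    int main(void)
    {
        /* The gap between '9' and 'A' is the punctuation block. */
        printf("'9' = 0x%02X, ':' = 0x%02X, 'A' = 0x%02X\n", '9', ':', 'A');
        printf("hex_val('9') = %d, hex_val('C') = %d\n",
               hex_val('9'), hex_val('C'));
        return 0;
    }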

What makes this history even sadder and more ironic is that, for many centuries past, mathematicians and merchants had to labor under the burden of cannibalizing the first few letters of the alphabet to also represent the digits from one to nine. Not until roughly five hundred years ago did the practice of using separate symbols for the digits (along with yet another symbol for zero) spread around the world. Perhaps not so coincidentally, great progress then ensued in mathematics, science, invention, commerce, and many other areas.

However, none of the aforementioned[2] proposals for hexadecimal digits ever became commonplace. This regressive alphabetic/numeric overlay has plagued programming languages for decades, and the "ridiculous" ABCDEF ambiguities are now so deeply embedded that the mistake will likely never be corrected.
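One concrete form of the ambiguity: a token such as FACE could be read either as a name or as a number. This is why C-family languages need the 0x prefix, and why classic Intel-syntax assemblers require a leading zero (0FACEh) on hexadecimal literals that begin with a letter. A small C illustration (the identifier names are chosen purely for the demonstration):

    #include <stdio.h>

    int main(void)
    {
        int FACE = 1;          /* a perfectly legal identifier...      */
        int face = 0xFACE;     /* ...and the same letters as a number. */

        /* Without the 0x prefix the lexer could not tell the two
         * apart; octal and decimal literals, built from digit symbols
         * alone, never collide with names this way. */
        printf("FACE = %d, 0xFACE = %d\n", FACE, face);
        return 0;
    }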