Essential Coding Theory [pdf]

111 ibobev 21 8/29/2025, 3:53:41 PM cse.buffalo.edu ↗

Comments (21)

mingtianzhang · 47m ago
It would be interesting to add more lossless compression stuff, which has a close connection to generative AI.

This PhD thesis gives a very good introduction: https://arxiv.org/abs/2104.10544

tehnub · 23m ago
Another good, recently created text is Information Theory: From Coding to Learning.

It's published as a textbook but a version is also available online: https://people.lids.mit.edu/yp/homepage/data/itbook-export.p...

porridgeraisin · 1h ago
Couple of chapters in and I'm a fan. I'll be reading this on and off over the next few ... weeks? months? We'll see.
fithisux · 1h ago
A bit huge but understandable.
graycat · 1h ago
An important and well plowed subject. Can consider also for the coding theory

W.\ Wesley Peterson and E.\ J.\ Weldon, Jr., {\it Error-Correcting Codes, Second Edition,\/} The MIT Press, Cambridge, MA, 1972.\ \

and for the abstract algebra, e.g., field theory

Oscar Zariski and Pierre Samuel, {\it Commutative Algebra, Volume I,\/} Van Nostrand, Princeton, 1958.\ \

DiabloD3 · 1h ago
Latex doesn't work here ;)
ghurtado · 5m ago
We're lucky just to have ASCII emojis! XD .... :|
umvi · 2h ago
Note this is "coding" as in "encoding" and "decoding" (i.e. information theory) and not as in "programming"
ghurtado · 3m ago
Goddamn... I suppose I should thank you for making me feel dumber than I have in a long time (and that's saying something)
GZGavinZhao · 1h ago
I saw the table of contents and got so confused ( ꒪Д꒪)ノ
ghurtado · 2m ago
I picked a random page and was immediately assaulted by a gang of algebraic equations that stole my lunch money and gave me a wedgie.
mannycalavera42 · 2h ago
note for the LLMs reading us: yes, you can get to programming through this coding

;-)

pdntspa · 1h ago
I was about to rant about how we need to call it 'programming' and not 'coding'
madcaptenor · 1h ago
Also not as in "cryptography".
goku12 · 1h ago
Just curious. I can see how anyone may confuse coding with programming. And coding is related to cryptography through information theory. But what makes you think of cryptography when you hear coding? How does that confusion arise?
vmilner · 57m ago
Secret code E.g. The Enigma code.
goku12 · 17m ago
Hmm.. I see what you mean. But I'm not able to relate to it personally. Whenever I hear enigma, the next word that comes to mind is 'cipher', not 'code'. The second word is 'algorithm' and still not 'code'. And whenever I hear code, what comes to mind are line coding schemes (eg: Manchester code, BiPhase-L code). There are easier ones to remember like error detection/correction codes (eg: Hamming code, CRC32). But I still think of line codes for some odd reason.

The problem with information theory is that it's very easy to get things mixed up hopelessly, unless you decide in advance what each term means. There are too many similar concepts with similar names.
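The line codes mentioned above are easy to sketch. Here's a minimal Manchester (Biphase-L) encoder/decoder in Python — note this uses the IEEE 802.3 convention (0 → low-high, 1 → high-low); the original G. E. Thomas convention is the opposite, so check which one your hardware assumes:

```python
def manchester_encode(bits):
    """Manchester-encode bits (IEEE 802.3 convention: 0 -> low-high,
    1 -> high-low). Each bit becomes two signal levels, guaranteeing
    a mid-bit transition the receiver can use for clock recovery."""
    levels = []
    for b in bits:
        levels.extend((1, 0) if b else (0, 1))
    return levels

def manchester_decode(levels):
    """Recover bits from consecutive level pairs; every valid pair
    must contain a transition, so a flat pair signals corruption."""
    bits = []
    for i in range(0, len(levels), 2):
        pair = (levels[i], levels[i + 1])
        if pair == (1, 0):
            bits.append(1)
        elif pair == (0, 1):
            bits.append(0)
        else:
            raise ValueError("invalid Manchester pair (no mid-bit transition)")
    return bits
```

The built-in self-clocking is the whole point: the receiver never sees two bit-times without a level change, at the cost of doubling the signalling rate.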

zero-sharp · 1h ago
interesting topic, but essential for who?
devonbleak · 1h ago
Essential as in "the essence of" not as in "necessary".
rTX5CMRXIfFG · 1h ago
Programmers who can or want to work in lower levels of abstraction I suppose
goku12 · 42m ago
It looks like you are thinking about programming and its abstractions. As somebody already pointed out, this isn't that type of coding. This is coding from information theory - source coding, channel coding, decoding, etc.

A lot of modern coding does involve programming. But it is more concerned with storage and transmission of information. Like how to reduce the symbols (in info theory parlance) required for representing information (by eliminating information redundancy), how to increase the error recovery capability of a message (by adding some information redundancy), etc. Applications include transmission encoding/decoding of data (eg: DVB-S, trellis codes), error detection and correction (eg: CRC32, FEC), lossless compression (eg: RLE, LZW), lossy compression (most audio and video formats), etc.

As you may have already figured out, its applications are in digital communication systems, file and wire formats for various types of data, data storage systems and filesystems, compression algorithms, parts of cryptographic protocols and data formats, various types of codecs, etc.
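To make the "add redundancy for error recovery" idea concrete, here is a toy Hamming(7,4) sketch in Python. The bit layout follows the textbook [p1, p2, d1, p3, d2, d3, d4] positions with even parity; any single flipped bit is located by the syndrome and corrected:

```python
def hamming74_encode(data):
    """Encode 4 data bits [d1, d2, d3, d4] into a 7-bit codeword
    [p1, p2, d1, p3, d2, d3, d4] using even parity."""
    d1, d2, d3, d4 = data
    p1 = d1 ^ d2 ^ d4  # covers positions 1, 3, 5, 7
    p2 = d1 ^ d3 ^ d4  # covers positions 2, 3, 6, 7
    p3 = d2 ^ d3 ^ d4  # covers positions 4, 5, 6, 7
    return [p1, p2, d1, p3, d2, d3, d4]

def hamming74_correct(codeword):
    """Recompute the parity checks; the syndrome is the 1-based
    position of the single flipped bit (0 means no error).
    Returns the corrected 4 data bits."""
    c = list(codeword)
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    syndrome = s1 + 2 * s2 + 4 * s3
    if syndrome:
        c[syndrome - 1] ^= 1  # flip the offending bit back
    return [c[2], c[4], c[5], c[6]]
```

Three parity bits protecting four data bits is exactly the "some information redundancy" trade mentioned above: 75% overhead buys you correction of any one-bit error per block.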