## 6.1. WHAT’S THE BIG PICTURE?
The word “code” has lots of meanings in computer science. It’s often used to talk about programming, and a program can be referred to as “source code”. However, in this chapter (and the next three chapters), we will use it to talk about representing information in useful ways, such as secret codes. In the previous chapter we looked at using binary representations to store all kinds of data: numbers, text, images and more. But often simple binary representations aren’t so useful. Sometimes they take up too much space, sometimes small errors in the data can cause big problems, and sometimes we worry that someone else could easily read our messages. Most of the time all three of these things are a problem! The codes that we will look at overcome all of these problems, and are widely used for storing and transmitting important information.
The three main reasons that we use more complex representations of binary data are:
* **Compression:** this reduces the amount of space the data needs (for example, coding an audio file using MP3 compression can reduce it to well under 10% of its original size)
* **Encryption:** this changes the representation of data so that you need to have a “key” to unlock the message (for example, whenever your browser uses “https” instead of “http” to communicate with a website, encryption is being used to make sure that anyone eavesdropping on the connection can’t make any sense of the information)
* **Error Control:** this adds extra information to your data so that if there are minor failures in the storage device or transmission, it is possible to detect that the data has been corrupted, and even reconstruct the information (for example, every bar code has an extra check digit added to it so that if the bar code is scanned incorrectly at a checkout, the scanner gives a warning sound instead of charging you for the wrong product; a small sketch of how such a check digit can be calculated follows this list).
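To make the check digit idea concrete, here is a minimal Python sketch. It uses the weighting rule from the EAN-13 bar code standard (alternating weights of 1 and 3, with the check digit chosen so the weighted total is a multiple of 10); the 12-digit product number in it is made up purely for illustration.

```python
# Minimal sketch of a bar code check digit (EAN-13 style weighting).
# The first 12 digits carry the product number; the 13th is chosen so that
# the weighted sum of all 13 digits is a multiple of 10.

def check_digit(digits):
    """Compute the check digit for the first 12 digits of a bar code."""
    total = sum(d * (1 if i % 2 == 0 else 3) for i, d in enumerate(digits))
    return (10 - total % 10) % 10

def is_valid(code):
    """Return True if a full 13-digit code passes the check."""
    digits = [int(c) for c in code]
    return check_digit(digits[:12]) == digits[12]

# A made-up 12-digit product number; its check digit works out to 9.
product = [9, 4, 0, 0, 5, 0, 9, 8, 7, 6, 5, 4]
print(check_digit(product))        # 9
print(is_valid("9400509876549"))   # True: scanned correctly
print(is_valid("9400609876549"))   # False: one digit mis-read, error detected
```

With this rule, any single mis-read digit changes the weighted sum, so the scanner can tell that something went wrong, even though it can’t tell which digit was wrong; the difference between detecting and correcting errors is explored in the error control coding chapter.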
Often all three of these are applied to the same data; for example, a photo taken on a camera is often compressed using JPG, stored on the camera card with error correction, and stored on a backup disk with encryption so that if the disk was stolen the data couldn’t be accessed.
Without these forms of coding, digital devices would be very slow, have limited capacity, be unreliable, and be unable to keep your information private.
## 6.2. THE WHOLE STORY!
The idea of encoding data to make the representation more compact, robust or secure is centuries old, but the solid theory needed to support codes in the information age was developed in the 1940s — not surprisingly considering that technology played such an important role in World War II, where efficiency, reliability and secrecy were all very important. One of the most celebrated researchers in this area was Claude Shannon, who developed the field of “information theory”, which is all about how data can be represented effectively.
A key concept in Shannon’s work is a measure of information called “entropy”, which establishes mathematical limits on things like how much a file can be compressed, and how many extra bits must be added to a message to achieve a given level of reliability. While the idea of entropy is beyond the scope of this section, there are some fun games that give a taste of how you can measure information content by guessing what letter comes next; there’s an Unplugged activity called [Twenty Guesses](http://csunplugged.org/information-theory), and an [online game for guessing sentences](http://www.math.ucsd.edu/~crypto/java/ENTROPY).
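For the curious, the standard definition is short: for a source whose symbols occur with given probabilities, the entropy is the average number of bits per symbol that the best possible code could achieve.

```latex
% Shannon entropy: the average number of bits per symbol needed, at best,
% to encode a source whose symbols occur with probabilities p_1, p_2, ...
H = -\sum_{i} p_i \log_2 p_i
% Example: a fair coin (p = 0.5, 0.5) gives H = 1 bit per toss, so no
% lossless code can average fewer than 1 bit per toss; a heavily biased
% coin (p = 0.9, 0.1) gives H of about 0.47 bits, so its outcomes compress well.
```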
## 6.3. FURTHER READING
James Gleick’s book [The Information: A History, a Theory, a Flood](http://www.amazon.com/The-Information-History-Theory-Flood/dp/1400096235) provides an interesting view of the history of several areas relating to coding.
### 6.3.1. USEFUL LINKS
* A good collection of resources related to all three kinds of coding is available in the [Bletchley Park Codes Resources](http://www.cimt.plymouth.ac.uk/resources/codes/)
* [Entropy and information theory](http://en.wikipedia.org/wiki/Entropy_(information_theory))
* [History of information theory and its relationship to entropy in thermodynamics](http://en.wikipedia.org/wiki/History_of_entropy#Information_theory)
* [Timeline of information theory](http://en.wikipedia.org/wiki/Timeline_of_information_theory)
* [Shannon’s seminal work in information theory](http://en.wikipedia.org/wiki/A_Mathematical_Theory_of_Communication)