# How computers work

#1

Take a blurry-to-sharp approach to explaining how a computer works.

In a sentence: everything is a number. Sometimes a gajillion-digit number, but a giant number nevertheless.

Top-level:

• Everything can be seen as a number. (A color? An RGB value. A location? x,y coordinates. A letter? A position in the alphabet.)
• Once everything becomes numbers/data, you can use arithmetic (add, subtract, multiply, etc.) to turn some data into other data. Simple examples like Caesar ciphers, etc. We can literally add numbers like a calculator, or treat the numbers as representing something and change what they represent.
• A computer changes data from one format to another. Various peripherals (monitors, speakers, etc.) turn that data into a displayed image, or sound, etc.
• Programming is explaining how you want the data to change. At the lowest-level, arithmetic-style operations change the numbers into other numbers.
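The steps above can be sketched in a few lines: letters become numbers, arithmetic transforms the numbers, and the result can be read back as letters. This is a hypothetical illustration (the helper names `to_numbers`, `to_letters`, and `caesar_shift` are made up for this sketch), not any particular system's API.

```python
def to_numbers(text):
    """Map each letter to its position in the alphabet (A=0 ... Z=25)."""
    return [ord(c) - ord("A") for c in text]

def to_letters(numbers):
    """Map alphabet positions back to letters."""
    return "".join(chr(n + ord("A")) for n in numbers)

def caesar_shift(text, shift):
    """Shift every letter by `shift`, wrapping around the alphabet."""
    return to_letters([(n + shift) % 26 for n in to_numbers(text)])

print(to_numbers("HELLO"))        # → [7, 4, 11, 11, 14] — letters are just numbers
print(caesar_shift("HELLO", 3))   # → KHOOR — arithmetic changes what they represent
print(caesar_shift("KHOOR", -3))  # → HELLO — subtracting undoes it
```

The same "programming is explaining how data should change" idea in miniature: the program never manipulates "letters," only numbers, and the letter interpretation is layered on top.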

Mid-level:

• Programming languages (higher-level commands) vs. direct arithmetic commands (assembly).
• Binary (storing numbers in the simplest possible format, vs. as decimal numbers).
• Transistors, switches, etc. Not a literal “1” or “0”, but a state of charge being held, etc. A single “1” might need a semi-complex circuit to represent it.
• Think about storage of the data, efficiency shortcuts, etc.
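A minimal sketch of the binary point above: the same number viewed as decimal digits, as the binary digits the hardware stores, and as a list of individual on/off bits. (The choice of 202 and the 8-bit width are arbitrary for illustration.)

```python
n = 202  # an arbitrary example value

print(bin(n))  # → 0b11001010 — the base-2 digits the hardware stores

# Each bit is just an on/off switch; 8 of them here.
bits = [int(b) for b in format(n, "08b")]
print(bits)    # → [1, 1, 0, 0, 1, 0, 1, 0]

# Reassembling: each bit contributes its power of two.
value = sum(bit * 2**i for i, bit in enumerate(reversed(bits)))
print(value)   # → 202 — same number, different representation
```

The point for the notes: "binary" isn't a different kind of number, just the cheapest representation to build out of two-state switches.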

Low-level: