Code Year: Why You Should Learn to Code

You probably know about Code Year. Sponsored by Codecademy, it challenges people to learn how to program in 2012. The Codecademy website offers free online lessons in a variety of programming languages; the project has received attention in the press and got a large boost from a comment by New York Mayor Michael Bloomberg on Twitter: "My New Year's resolution is to learn to code with Codecademy in 2012! Join me."

Hundreds of thousands of people have joined Bloomberg. Even though my own Code Year was 30 years ago, I can still appreciate the appeal -- you'll learn how to write software to make your computer do new and wonderful things that you find valuable, instead of depending only on what others have done. That's empowering.

But there's more. By analogy, imagine that a university physics department is sponsoring a new course: "Billiards 101." They've put videos online and offer hands-on lessons in a nearby pool hall. You take the course, and you learn how to handle the cue, how to choose a good shot, how to position yourself for a shot, and how to execute it properly. Throughout the videos and the coaching, though, lessons about physics keep sneaking in: You're playing pool, but at the same time you're finding out about forces, momentum, rolling friction, angular velocity, and so forth.

A programming course can be something like this. Learning to program means learning about computer science, and some of the insights are more important than the skills.

Here's an example. You start by writing simple programs, and as you progress, you get involved with an online community of other programmers. Some of them are passionate about the computers they use -- Windows, Macintosh, or Linux -- in part because of what they can do on their platform (and not on the others). But you can all talk about the same programming problems and solutions as if there were no differences between the platforms. How can that be?

Alan Turing, the most important historical figure in computer science, explained this in the 1930s: We can make one computer behave as if it were another computer. It's theoretically possible to program the simplest general-purpose computer so that (if you're willing to give it as much time and as much storage space as it might need) it can solve any problem that the most powerful computer in the world can solve. Computers might look and behave very differently from each other, but at the most basic theoretical level they're all essentially the same. Computers are universal machines.
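
To make the idea concrete, here's a minimal sketch in Python (the article doesn't commit to a particular language, so that choice is an assumption). The Python program below behaves as a different, imaginary computer with just three instructions, an instruction set invented purely for this illustration:

```python
# A tiny made-up machine, simulated by Python: one computer
# behaving as if it were another. The instruction set ("add",
# "mul", "show") is invented purely for this illustration.
def run(program, value=0):
    for op, arg in program:
        if op == "add":
            value = value + arg
        elif op == "mul":
            value = value * arg
        elif op == "show":
            print(value)
    return value

# A program for the imaginary machine, written as plain data:
run([("add", 2), ("mul", 10), ("show", None)])  # prints 20
```

Give the simulated machine as much time and storage as it asks for and, in principle, it can compute anything the real one underneath it can.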

Here's another example. In the programs you write early on, you learn the kinds of things that a computer can do with information (for example, adding numbers together). You quickly move on to programs that manage information in more sophisticated ways (graphics, sound, and animation, perhaps), and you learn how your programs can repeat actions and make simple decisions. It's not always easy. You enter a few lines of code, and the system gives you a cryptic error message about "invalid syntax." Oh, you've left out a parenthesis -- a program running behind the scenes doesn't know how to interpret what you've typed.
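
Here's the sort of short program this stage might involve, again sketched in Python. It repeats an action with a loop and makes a simple decision with if/else; remove the closing parenthesis on the last print line and the interpreter rejects the whole file with exactly that kind of SyntaxError:

```python
# Repeat an action (a loop) and make a simple decision (if/else).
total = 0
for n in range(1, 11):
    if n % 2 == 0:
        print(n, "is even")
    else:
        print(n, "is odd")
    total = total + n
print("The sum is", total)
# Drop the closing parenthesis on the line above and Python
# refuses to run anything, reporting a SyntaxError.
```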

You come across similar error messages elsewhere in your dealings with computers, for example when a webpage refuses to accept your credit card number (you've typed it with spaces) or can't interpret a date because it must be entered in a specific format. Programs expect data as input that's formatted or structured in a particular way.
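
In code, that expectation is explicit. Below is a hypothetical Python sketch of both checks; the 16-digit rule and the date format are assumptions for illustration, not how any particular website actually validates input:

```python
from datetime import datetime

# A card number typed with spaces: normalize it before checking.
raw_card = "4111 1111 1111 1111"
card = raw_card.replace(" ", "")
if card.isdigit() and len(card) == 16:  # illustrative rule only
    print("Card number accepted")

# A date must match the exact format the program expects.
date = datetime.strptime("2012-01-15", "%Y-%m-%d")  # parses fine
# datetime.strptime("15/01/2012", "%Y-%m-%d") raises ValueError
print(date.year)
```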

Here's the interesting thing: Programs can treat other programs as data. We tend to think of data as numbers and text and such, information for programs to process. But "information" also includes descriptions of what computers do -- programs. You might have asked yourself at one time or another, "Am I spending too much time on this problem?" A bit of self-reflection, where you think about what you're doing rather than just doing it. Computers can do something analogous. It's part of their universality. (This was another important aspect of Turing's work.)
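
A few lines of Python make the point: the first line stores a program as ordinary text, and the last line hands that text to another program, the Python interpreter itself, to run:

```python
# A program held as ordinary data (a string)...
source = 'print("Hello from a program that was data a moment ago")'

# ...measured like data, then executed like a program.
print(len(source), "characters")
exec(source)
```

The same description of what a computer should do can be counted, copied, or inspected one moment and executed the next.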

As you learn more about programming, you'll encounter more and more of these insights. Keep an eye out for them. You'll be catching glimpses of the conceptual foundations of computing -- one of the foundations of our modern world.
