Back in elementary school, I discovered the World Wide Web and, more importantly, the language it consisted of: HTML. I spent my days after school learning how to build web pages, impressing both my friends and my family. Nearing the end of elementary school, however, I wanted to program the computers themselves, so I logged on to an early Google.com and began searching. I found that my computer came with QBASIC preinstalled, and I quickly became fascinated with programming. Before long, however, I wanted to move on to better things. My parents had purchased a variety of development books for me, ranging from web development with PHP to Visual Basic to C++. Back then, I thought of C++ as some uber-powerful language I would never even begin to conquer. As for Visual Basic, I spent about an hour with it before I got sick of it (such taste for a fifth-grader, I know). I picked up PHP easily, all the while hoping to eventually master the mysteries within the C++ book. Today, roughly a decade later, I know quite a few languages, and I have to say that mistakes were made: if I could go back, I would teach myself programming the right way.
There is something my AP Calculus teacher says to us quite often: while most jobs will only require you to know how to solve a particular problem, in high school it is most important to learn how and why that problem is solved. That is, how do the mathematics behind the problem work? A good example is solids of revolution, which we recently wrote a paper on, explaining the processes behind them in detail. Sure, it's easy to calculate the volume of a solid formed by revolving a curve around an axis, but our teacher's priority was to get us to understand why we could get the volume that way.
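For the curious, the "why" here is the disk method: each cross-section perpendicular to the axis of revolution is a disk of radius f(x), so its area is pi [f(x)]^2, and integrating those areas along the axis gives the volume:

```latex
V = \pi \int_a^b \left[f(x)\right]^2 \, dx
```

Memorizing that formula gets you the answer; seeing it as a sum of infinitely thin disks is the understanding my teacher was after.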
Soon after, I began to understand that the methodology used by my math teacher could (and should) be applied to learning how to program. Very often now I see younger people throwing questions out to the Internet, asking how they could learn to program. Most replies I see suggest that they start with something very high-level like Ruby or Python. Now, I am far from a master programmer, still learning many things myself, but I fear that the ways of the low-level hacker are becoming scarcer and scarcer.
High-level languages like Ruby and Python will teach you how to program using their syntax, but they won't teach you how a computer works on the inside, and I believe that is the key to learning programming. Just as teachers have long studied how children's minds work so that they can teach them new things, a new programmer must learn how a computer works in order to write good programs. Now, I'm certainly not suggesting that everyone start by learning Assembly, as that would just introduce platform-dependent terror. Rather, I think the ideal programming language one should start out with is none other than jolly old (ANSI) C.
In computer science, programs are written to solve problems. These days, the language used is largely determined by convenience, as more and more programmers have begun to take the powerful machines we have today for granted. Imagine what it was like just two or three decades ago, when programmers had very little to work with in terms of processing power and available memory. Sometimes, I wish we were still stuck in those days, partly for the nostalgia factor (granted, I was born in 1994, so the oldest hardware I've come into contact with was a Pentium machine), but also for the mindset programmers had back then. Programs were written to solve problems given the available hardware, and programmers had to be extra careful with their code to make sure it was efficient enough for the machines they had.
The C programming language is as close-to-the-metal as you can get without getting involved with Assembly instructions. As you learn how to program in C, you also learn how things are laid out in computer memory, as well as how your program is executed. Of course, Assembly will teach you these inner workings at an even more extreme level, and I do recommend exploring it at some point, but in the spirit of cross-platform development and the relative ease of understanding the language, C is the prime choice.
I’m participating on a FIRST FRC team this year, signing on with the programming team. There are two others working with me, one of whom has been with the team since freshman year (we’re seniors now, FWIW). The team had used LabVIEW up until this year, when I made the demand that C/C++ be used instead. This was my first time working with C on an embedded system. It was also apparent, however, that it was the first time the other two members had really worked with C/C++ at all. They read through the WPILib documentation (the API for interfacing with the robot) and began to code. Setting aside their issues in understanding the object-oriented programming paradigm (which I’ll forgive, given their newness), there was an obvious lack of understanding that this was an embedded system with limited memory available. The new operator was thrown about willy-nilly, without any clean-up in sight. Perhaps it was because they had previously been exposed to the Java examples and assumed that the “convenient” garbage collection carried over to C and C++. Really, it was an issue between learning how to program and learning what you’re programming.
I think I’m beginning to rant now, so I believe I should end this post here.
TL;DR: High-level languages make you lazy with regard to things like memory management and program-execution efficiency, whereas a low-level language like C will teach you what you’re programming, while you’re programming.