That Key Thing They Don’t Teach Beginning Programmers

posted in: ones and zeros

Disclaimer: I have taken a limited number of “intro programming” classes, being but a single person, though I have had my impression confirmed by others. If you teach/have taken a course that includes my “missing fundamental,” promote that excellentness in the comments at the bottom!

The #1 Most Fundamental Thing

I came to coding from a very, very liberal arts background. In fact, my Bachelor’s was through the Interdisciplinary program at my school. I literally made it up. A little bit from this major, a little bit from that (1).

The whole point was to connect different ideas from multiple disciplines into a cohesive, well-rounded course of study. Like making a piece of art, I took inspiration from multiple sources and moulded them into something new based on how it felt.

Programming doesn’t work like that.

Actually, code is pretty much the opposite of, for example, a collage, where you just kinda move bits of paper around until they feel right (2). That doesn’t fly in programming, unless maybe your goal is to see how many error messages you can get.

So what’s the most fundamental truth about coding that no one is explicitly teaching?

How to be methodical. Or, how important that is.

This, I believe, is a serious flaw in most curricula. I can scientifically confirm (based on my sample size of one) that some people have never needed to be methodical in the way that a programmer needs to be methodical, and therefore have not spent much time building that skill.

Explicitly teaching a systematic approach to beginner programmers should have a big impact on learning for only a little investment in teaching time.

A World of Pain

I don’t need to tell experienced programmers why being methodical is important. You all can skip this section if you like.

To a beginner, though, lacking that understanding might be holding them back from success, or at least be a huge source of frustration. It was for me.

I started my very first online programming course in the fall of 2012. I dropped out before it ended. Besides the really bad fit between me and the course (3), it never articulated any fundamental paradigms for how programmers need to think about and approach problem-solving (4).

I remember trying to debug a simple program I’d written, with my (senior engineer) mentor watching over my shoulder. I had zero idea what mistakes in syntax or logic I’d made, so I changed a couple things based on vague hunches that maybe I needed to change this or rearrange that, then tried to run it again.

“No no, you need to be more systematic in your debugging.”

This is the kind of advice that experts give. Totally true and potentially very useful, if you can figure out what the hell they’re talking about.

“What..? What? We don’t cover debugging until next week…”

As stated earlier, I didn’t make it through that class. For multiple reasons, I felt dumb and frustrated and incapable (it didn’t help that I wasn’t used to failing so frequently). And so, while I had previously enjoyed working through Learn Python the Hard Way (so good!), programming was no longer fun for me and I didn’t want to do it anymore.

Perhaps the worst result of this experience was that I thought I couldn’t be a programmer.

(The mentor, to his credit, totally disagreed with me on this point. But his assurances of the potential he’d seen in me were no match for my frustration.)

The Incredible Lightness of Systemization

A methodical approach takes a potentially painful exercise in problem-solving-failure and turns it into a fun, accessible puzzle.

A demonstration may be the best way to explain the process that I’ve adopted for my little scripts.

The following assignment comes from Programming for Everybody (don’t worry, I won’t give away the answer):

Write a program that prompts for a file name, then opens that file and reads through the file, looking for lines of the form:

X-DSPAM-Confidence:    0.8475

Count these lines and extract the floating point values from each of the lines and coput the average of those values and produce an output as shown below.

Ignoring the typo and potential for confusion there, this problem was the most complex assignment of the course up to that point. Potential for hyperventilation: High.

So what did I do? I broke it down into little chunks:

  1. prompts for a file name,
  2. then opens that file and
  3. reads through the file,
  4. looking for lines of the form: X-DSPAM-Confidence:    0.8475
  5. Count these lines and
  6. extract the floating point values from each of the lines and
  7. compute the average of those values and
  8. produce an output as shown below.

To my surprise (and disappointment), they gave us a head start. But the approach I was planning to take was:

Write just enough code to prompt for a file name, and then use a print statement to confirm that I had successfully completed that first tiny step.

Then, add the next smallest possible amount of functionality and use a print statement to confirm it worked.

Rinse and repeat.
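To show the rhythm of this approach without giving away the assignment’s answer, here’s the same write-a-tiny-step-then-print pattern applied to a made-up mini-task (the `Score:` lines and values below are invented purely for illustration):

```python
# A made-up mini-task (NOT the course assignment): find lines that
# start with "Score:", pull out the numbers, and average them.

# Step 1: get some input lines. Print to confirm this step works.
lines = ["Score: 3\n", "just a note\n", "Score: 5\n"]
print("got", len(lines), "lines")

# Step 2: pick out only the matching lines. Print to confirm.
matches = [ln for ln in lines if ln.startswith("Score:")]
print("matches:", matches)

# Step 3: extract the numbers from the matches. Print to confirm.
values = [float(ln.split(":")[1]) for ln in matches]
print("values:", values)

# Step 4: only now, with every previous step confirmed, do the math.
print("average:", sum(values) / len(values))
```

Each `print` is just scaffolding: once a step is confirmed working, its print can come out and the next tiny step goes in.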

Building Upon Success

Using this systematic approach of completing tiny bite-size pieces and confirming they work before moving on to the next piece, you know that a certain amount of your script is (probably) bug-free. And when you do get an error message, there’s a decent chance it came from whatever little bit you just added. This makes debugging SO. MUCH. EASIER.

Previously, one of my biggest frustrations had been the fear that I had more than one bug in a script, so if I made one change, I wouldn’t know whether it helped, because the other bug was still there mucking things up.

Another example of being systematic: when you’re debugging, make only one change at a time, so you know exactly what effect each change has on the program. That’s what my mentor was trying to tell me during my first attempt at learning to program.
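As a hypothetical illustration of the one-change-at-a-time rule, suppose an averaging loop is producing the wrong number (the bug below is invented for the example):

```python
# Buggy version: the average comes out as 3.0 instead of about 5.67.
values = [3.0, 5.0, 9.0]
total = 0.0
for v in values:
    total = v              # suspect line: is total really accumulating?
print("buggy average:", total / len(values))

# Hypothesis: total is being overwritten each pass, not accumulated.
# Make exactly ONE change (= becomes +=) and run it again:
total = 0.0
for v in values:
    total += v             # the single change
print("fixed average:", total / len(values))
```

Because only one thing changed between runs, the better output can be attributed to that change and nothing else.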

Simple? Simple. You just have to know you need to work this way.

A Terrible Story

One of my favorite activities in college was rock climbing. One day, my instructor told me the story of how people teaching climbing classes used to fall off of cliffs with alarming frequency. Falling off a cliff is bad bad bad, and many people were seriously injured or killed this way.

So what one simple thing completely ended this unfortunate trend?

They started telling instructors: lots of people are backing off of cliffs while explaining things to their students.

Done. Someone tells you that, and from that point on you will always tie yourself to something before you turn your back to the edge of a cliff.

Awareness of the problem was enough to end the problem entirely.

 

* * * * *

 

(1) This was, of course, back in the day when a respectable private university might only cost $22k per year. Nowadays it seems you can’t even consider a degree unless it will guarantee a well-paying job when you graduate. My heart goes out to those 18 year olds who have to pick a career before they even know what they like to do.

(2) I have been told that senior developers sometimes make choices based on the weather: “So why did you do it that way?” “Oh, I don’t know, because it’s beautiful outside and I was feeling spunky.” …Maybe. But I stand by my premise. “I think this function would look fabulous with a semi-colon… there.” No. Just – no.

(3) MITx assumes its students have recently taken advanced high school math classes, and that is true of all MIT freshmen. Retraining my brain to think like a programmer via math I hadn’t seen or thought of in 15 years just wasn’t going to end in happy fun times. Sorry MIT, you’re still an amazing school, but you and I just weren’t meant to be.

(4) Nor did it spend much time on the many common stumbling points for beginners. I loved Coursera’s Programming for Everybody because the professor takes extra time to cover them. Sometimes too much time, but it’s not his fault I’d already learned a lot of the material.