Kids should learn to code

Does a five-year-old need to learn how to code?

A couple of weeks ago I was interviewed by the BBC. In a fairly long phone call, I either rambled inanely or provided detailed and nuanced answers in context. That depends on your point of view.

Either way, obviously not a lot of it could make it into their story, as they really only needed a few quotes. So I thought I’d put more of what I said here.

The background for the story was the changes to the UK school curriculum which mean that all kids are being taught to code. And the basic premise for the piece was that, as we’re “entering an era when computers are actually beginning to teach themselves”, this is unnecessary and coding itself is becoming an outdated skill.

This is a summary of what I tried to say…

Learning to “code”

It’s useful to start with some context. When we talk about teaching kids to “code” we don’t just mean teaching them how to write lines of code – it’s broader than that. Some criticisms of this initiative seem to be arguing against five-year-olds needing to learn where to put semicolons, which is missing the point.

From what I’ve seen, it’s an umbrella term that covers a range of activities such as:

Logical thinking and problem solving

Teaching kids how to understand a description of a problem, identify a solution, and describe that solution by breaking it down into a series of steps.

As kids get older this can be framed as how to write an algorithm. But it’s something that can be started even at Faith’s age (6) and without needing to touch a keyboard. That’s not new – how many developers have had to answer the interview question “describe how to make a cup of tea”?

You don’t need to learn programming language syntax to start getting your head around this, and I would argue it’s a vital skill to develop in life, even if you don’t become a coder.
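(For readers who do already write code: here is a rough sketch of what that tea-making answer might look like once it’s written down, in Java since that’s what I code in these days. The step names are invented purely for the illustration; the point is just that an algorithm is nothing more than an ordered series of steps, with the odd decision along the way.)

```java
// A purely illustrative sketch: the "make a cup of tea" answer written as
// an ordered series of steps, which is all an algorithm really is.
public class MakeTea {

    public static void main(String[] args) {
        // Each step is explicit and in order, with one decision point -
        // exactly the kind of breakdown kids can do without touching a keyboard.
        boilWater();
        putTeabagInMug();
        pourWaterIntoMug();
        waitMinutes(3);
        removeTeabag();
        boolean wantsMilk = true; // an assumption made for the example
        if (wantsMilk) {
            addMilk();
        }
    }

    static void boilWater()        { System.out.println("Boil the kettle"); }
    static void putTeabagInMug()   { System.out.println("Put a teabag in the mug"); }
    static void pourWaterIntoMug() { System.out.println("Pour the boiling water into the mug"); }
    static void waitMinutes(int m) { System.out.println("Wait " + m + " minutes"); }
    static void removeTeabag()     { System.out.println("Take the teabag out"); }
    static void addMilk()          { System.out.println("Add milk"); }
}
```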

Technological creativity

We need to do more than teach children how to use the tools that they have today. We need to encourage an ethos from an early age that we don’t have to be passive users of technology.

It’s about teaching kids how to think about, and how to approach, technology. They don’t have to think of it as a black box that must be used as-is, but as something that they can remix and tweak and modify and change and create. It’s about an attitude of looking at technology as something they can make do what they want, as opposed to using it the way someone tells them they should.

This is what I love about running my Code Club. Instead of kids playing a random Flash game they find online, they can make a game themselves, the way they want it to be. If they want it to be faster, slower, bigger, smaller, a different colour, move differently: they are in control. It’s not fixed, they can make it do and behave the way they want it to. And if they realise that they can do that with technology, it’s a real light-bulb moment.

We need kids to have this mindset so they will grow up able to imagine the next wave of innovations. Saying that we don’t need this because we can delegate it to the computers we have today really feels to me to be missing the point. Cognitive computing holds exciting promise and potential but it does not mean “we won’t need to be creative any more, the computers will do that for us, too”.

Coding becoming “outdated”

Leaving aside this bigger picture, is coding itself a useful skill to learn? Is coding going to become outdated?

I don’t think so.

Part of this argument seemed to be “what is the point of teaching kids <insert-name-of-programming-language-here> when by the time they grow up it will be obsolete?”

Programming languages stick around longer than people think – there are people still making a living writing C and maintaining COBOL. (We’re normally after good Prolog people, too!)

But more importantly, a lot of what you learn in one language is transferable. Every time I’ve started working in a new programming language, I’ve built on the basic concepts I already knew from others. Maybe the language we teach children today won’t be the most widely used one by the time they’re older. But that doesn’t mean learning the underlying ideas will have been a waste of time.
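To make that concrete, here is a small throwaway example in Java. The task itself is invented, but the ideas in it (a counter, a loop, a condition) are exactly the things that carry over almost unchanged when you pick up C, JavaScript or most other languages.

```java
// A small illustration of how transferable the basics are: this is Java, but
// the same structure - a counter, a loop, an if-test - looks almost identical
// in C, JavaScript, Python and many other languages.
public class Transferable {
    public static void main(String[] args) {
        int total = 0;
        for (int i = 1; i <= 10; i++) {   // iterating over a range is the concept that transfers
            if (i % 2 == 0) {             // so is testing a condition
                total += i;
            }
        }
        System.out.println("Sum of even numbers from 1 to 10: " + total);
    }
}
```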

The argument also seemed to be that it’s not just any particular language, but coding in general, that will become obsolete. I’m not convinced by this.

What we mean by coding may be different in twenty years to what we mean today. In fact it probably will be. Coding will evolve. It always has, and I’m sure it will continue to.

Even just looking at my personal coding history, you can see that evolution. Writing in assembler (where I was moving data in and out of registers) was different to writing in C. And writing in C (where it wasn’t just about what I wanted it to do functionally, but also doing my own memory management) was different to my coding today in Java.

A big difference is in the level of abstraction. They all involved describing to the computer something that I wanted it to do. But the level of abstraction I’m able to use to describe it has changed.
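Here is a rough sketch of what I mean, in Java. The example itself is hypothetical, but the comments note what the same intent would have involved further down the stack.

```java
import java.util.ArrayList;
import java.util.List;

// A rough sketch of the abstraction point. In assembler I'd be moving data in
// and out of registers; in C I'd be doing my own memory management with malloc
// and free; in Java the same intent - "keep a growing list of names" - is
// described at a higher level and the runtime handles the memory for me.
public class Abstraction {
    public static void main(String[] args) {
        List<String> names = new ArrayList<>();  // no manual allocation or sizing
        names.add("Ada");
        names.add("Grace");
        names.add("Alan");
        System.out.println(names);               // no free() needed - garbage collection cleans up
    }
}
```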

I’m sure this is a trend that will continue. Future programming languages will give us ways to describe what we want at higher and higher levels of abstraction. And maybe that will look closer to natural language than what we have today (well-written Java is already closer to being readable by a layperson than assembler is). Maybe it will be something like a Controlled English language that feels more like describing what you want to another person.

But that won’t mean that coding has become obsolete, just that it will have evolved as it always has.

The need for people who can understand a problem, and describe to a computer how to solve it, will remain – whatever language they use, and whether or not that language looks like “code” as we understand it today.