Everybody is raving about coding, apparently because it’s what you do when you are serious about computers, and because it guarantees future employment.

Let’s start from the basics: that thing you do when you are serious about computers already has a name: it’s called understanding the world.

Code puts into program form one’s understanding of the world; a working program, then, is the expression of somebody having understood a piece of the world.

Whoever separates code from programming, and hence from the understanding of the world, can do so for only two reasons: ignorance or malice. The former is no vice, and is by far the most common reason; the latter is a fault, and a serious one.

First case: the ignorant’s coding

Those who mention “coding” out of ignorance or fashion know computers through movies: the inevitably white-young-male-socially-challenged, type-first-think-later hacker who thwarts WWIII or penetrates the bank/CIA/NSA/nuclear plant. If it’s a really recent movie, our guy may even reprogram a satellite’s orbit from a smartphone, while walking.

Unfortunately, this is the digital competence level of our ruling class, and of most K-12 teachers.

Computer science is not the code; it is the understanding of a problem that makes it possible to write that code. Code then brings its own difficulties (and not trivial ones), but there is still no such thing as “I write some code out of the blue and solve a problem”.

Before you code you must understand your problem, analyse it, formalise it (usually with techniques that themselves require long study); then you’ll need to find out whether the same problem has already been solved elsewhere, why it was solved that way and not differently, and whether that solution fits your context, because there are no new problems, and when it comes to building software, copying is a good thing. Of course you will also take into account any additional constraints from your specific context: time, budget, resources, available technologies, potential conflicts with other software already in use. Now you may have a solution you want to implement, and you will have to express it in terms that are absolutely precise, because it’s a machine you are talking to. And, you know, the machine is always right, because it does what you say, not what you mean.
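
A minimal sketch of that last point, in Python (a hypothetical example, not from the original essay): the machine will happily compute exactly what you said, not what you meant.

    # Intent: compute the average of three exam scores.
    scores = [90, 85, 77]

    # What was meant: sum(scores) / len(scores), i.e. 84.0.
    # What was said: operator precedence divides only the last score by 3.
    average = scores[0] + scores[1] + scores[2] / len(scores)

    print(average)  # 200.66..., not 84.0: the machine did exactly what it was told

The program runs without a single error message; it is simply a precise expression of the wrong idea.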

At this point you can finally concentrate only on code, and on the myriad problems it entails. Unless, of course, somebody up the hierarchy has heard the magic word “agile” and considers it his divine right to change his mind once a week, thereby trashing half of your work.

Focusing on “code” as if it were an end and not a means, the starting point and not the end point, is the worst way to acquaint kids with computer science. It gives them the idea that things are simple, if not trivial, even (another term I hate) fun. It taints them with the pernicious notion that there is no need to think or study, or to learn anything besides what the fingers must type, when anybody with even minimal exposure to writing software knows that behind a minute of typing hides an hour of study.

This happy-go-lucky, slapdash technological carelessness is an insult to kids’ intelligence, to their ability to learn, to the contribution they can bring in terms of creativity and invention: because even if problems are always the same, solutions, thank goodness, evolve. (As a matter of fact, it is also an insult to the great programmers who have built our digital world. They, and not their ego-driven marketing sidekicks, should be our real source of inspiration.)

Suggesting that one can write code without caring about anything else means failing to understand that technology matters far less than the way it is applied. It means refusing to acknowledge that implementing fine-sounding but fake solutions (storytelling, anyone?) in the hope that “the techies will take care of the details” is an exercise in futility.

There is no “technical level” that can be overlooked or blindly left to somebody else; we live in too complex a society for that.

And another thing: the key-puncher who does not see beyond his keyboard is the mirror image of the dumb manager who speaks in slogans and only cares about his own personal power, unloading his responsibilities onto his direct reports. Both characters have done enough damage, and have certainly had their day.

Programming (not just its coding sub-activity) is the fourth pillar of education, after reading, writing and numeracy.

Second case: the bastard’s coding

We said ignorance is no vice, but malice is. The insistence on talking about “coding” as if digital culture were limited to the manual skill of typing code is also typical of those who mean to produce piece-working key-punchers: low-skilled, low-cost, interchangeable workers. A twenty-first-century proletariat so bent on working that it does not notice that work is being kept artificially alive as a social tranquiliser.

Which brings us to the future of work. We are facing a change for which we are not prepared: the end of work or, rather, the end of the culture in which work is a requirement for survival. To acknowledge this would be to undermine the sacred axioms of our economic model, and that cannot be allowed.

Each technological revolution has profoundly changed the character of work, but to date, for each job made obsolete, another was created at more or less the same skill level. So if the other day you worked in the fields, yesterday you could work in a factory with no great need to learn much. You wrote commercial letters and did double-entry book-keeping by hand, and then more or less learned your way into doing the same with Excel or some management software. Not much of a leap; no one was really cut off.

But today the change is tangible. Robots can already do most manual or low-skill jobs cost-effectively. And as for intellectual work: summarising and writing news, legal counselling, a good share of clinical diagnosis and stock speculation have long been eaten into by software, and will soon be swallowed completely. This means that many professions, such as journalist, lawyer, general practitioner or stock trader, are the walking dead today. And the future comes every morning, fresh as a daisy.

On the other hand, skills in the digital realm require years of study and apprenticeship and life-long learning, and they are not for everybody. Certainly not for those who cannot go beyond a manual or low-skill job, or for those who managed to make a living by using the computer as little more than a typewriter. Certainly not for those who believe education is that thing you can be done with once you’re sixteen.

And that’s still not all: even jobs in the digital realm are decreasing dramatically, because no industry is immune from automation. I’m told the entire technical infrastructure of Google for the North American continent requires but a couple of dozen engineers. Foxconn is replacing Chinese workers (not exactly famous for their hourly wages) with a million robots that will cost even less and will work every second of their (very long) mechanical lives. Amazon, arguably the largest store in the world, hires ten thousand warehousemen. All of them robots.

The problem is not that work is disappearing. The problem is that the economy pushes to eliminate all those jobs that have survived through managerial ignorance, inertia or laxity, or because it was, after all, politically easier to keep people in the illusion that receiving a paycheck for a job a machine could do better and more cheaply could constitute a reasonable life plan. There is no future as a barista, bank teller, waiter, cab driver, truck driver, or in dozens of other jobs. And I am speaking of today. What about two years from now? Five years? Ten?

Robots work as warehousemen, mechanics and circuit assemblers today. Artificial intelligence flies planes and drives trains, manages logistics, sets inventory prices, manages hedge funds and handles customer service for your online vendor as you read these lines. And in this kind of world there are still those who come speaking of “coding”, as if typing blindly on a keyboard were The Way to who knows which magnificent destinies.

We wanted technology to free us from hunger, then from physical labour. It did. Today it is starting to free us from the idea that work is a universal requirement for survival. We need to decide what to do with our time, because we will have more and more of it, while there will be less and less work, for fewer and fewer people. How is anyone going to make a living if our economy and our culture are still based on the need of being in need? It’s high time we rethought them.

The market economy as we have known it has brought us this far. It will not take us any further.

To Bill Joy, Donald E. Knuth, Richard M. Stallman, Dave Winer.
