My family’s first computer was the Commodore VIC-20, billed by its pitchman, Star Trek’s William Shatner, as the ‘wonder computer of the 1980s’. I have many fond memories of this antiquated machine. I used to play games on it, loaded from cassette tapes that served as primitive storage devices. One of the cassettes we bought was a Pac-Man clone that my brother and I would play; instead of a yellow pie with a mouth, it used racing cars.
My most vivid memories are of the games whose code I typed in myself. While you could buy software for the VIC-20 (like the racecar game), a major way that people acquired software in those days was through computer code published in the pages of magazines. Want to play a fun skiing game? Then type out the computer program into your computer, line by line, and get to play it yourself. No purchase necessary. These programs were common then, but no longer. The tens of millions of lines of code that make up Microsoft Office won’t fit in a magazine. It would take shelves-worth of books.
Typing code into my computer brought me closer to the machine. I saw how bugs occurred – I have a distinct memory of that skiing program creating graphical gibberish on one side of the screen, until I fixed the text – and I also saw that there was a logic and texture to computer programs. Today’s computer programs are mysterious creations delivered whole to the user, but the old ones had a legible structure.
Later in the 1980s, my family abandoned Commodore for Apple, and I have used some kind of Macintosh ever since. That first Mac was something incredible to Childhood Me. I was entranced by the mouse, and the games, such as Cosmic Osmo and the Worlds Beyond the Mackerel, which offered rich, immersive realms that you could explore just by clicking. This computer could even speak, converting text to speech in an inhuman monotone that delighted my family. Whenever I re-watch the presentation Steve Jobs made introducing the Macintosh in 1984, I get chills, because the Mac was the first machine to make computing pleasurable. And yet, something was lost in my family’s rush to embrace the Mac’s wonders. We became more distant from the machine. That distance persists even today in the iPad, so slick and pristine that I don’t even know how its files are stored.
The US science fiction author Neal Stephenson wrote a sprawling essay called ‘In the Beginning… Was the Command Line’ (1999), later published as a short book, that is essentially a meditation on computing. Though outdated, it contains a great deal of wisdom. In it, Stephenson writes of four choices that customers had in the computer space.
The first is the ugly but ubiquitous Windows. The second is the beautiful but hermetically sealed Macintosh. The third is the incredibly powerful yet subcultural Unix. And the last is BeOS – more on this one in a minute. Stephenson was primarily analysing operating systems, the underlying software that allows users to make a machine do their bidding. The operating system manages how the files on your computer are arranged and retrieved, how memory is allotted to the different programs that are running, and often the visual interface that you see and click around in daily. According to Stephenson, the problem with Windows and Macintosh is that they discourage a closeness to the machine.
Want to write some commands in black text on a white screen? At the time Stephenson wrote his essay, I’m not sure it was even possible on a Macintosh. Even today, this requires some effort on a Mac. You need to look into the Applications folder, then delve inside Utilities to find the Terminal application, and you need to know how to write its commands correctly, or else you could cause havoc on your machine.
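For anyone who has never opened Terminal, this is roughly what that first conversation with the machine looks like (the folder names in the cautionary example are invented for illustration):

```shell
# A first, harmless exchange with the machine:
pwd        # print the directory you are standing in
ls ~       # list the files in your home folder

# Precision matters, though. A single stray space can turn a routine
# clean-up into havoc (do NOT run the second line):
#   rm -rf ./old-files     removes one folder
#   rm -rf . /old-files    tries to erase the whole current directory too
```

The commands are terse and unforgiving, which is exactly the trade Stephenson describes: more power, less hand-holding.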
Want to use a specific piece of software? Buy a shrink-wrapped version (or download it, as we do now) and double-click. Type-ins need not apply. But then there’s Unix. This operating system, which has ramified into different flavours, from Linux to FreeBSD, provides control and precision, but sometimes at the expense of the user-friendliness of the mainstream computer experiences.
And let’s not forget BeOS, an operating system from the 1990s that never caught on. BeOS was the happy medium for Stephenson: powerful and user-friendly, but with access to the underlying complexity, terminal and all. BeOS vanished in favour of the highly pasteurised experiences we have in the world of computing, but now there is a growing countercurrent.
Being closer to the computer can help you understand its power, its limitations, and the underlying logic of machines
We are currently awash in efforts to get people to learn how to program, or code. We have Codecademy and Code School. My mother, technically trained but without any previous programming knowledge, recently completed an hour of learning how to code through one of these programs. But to what end? Not all jobs will require coding, at least not yet. Rather, what we are going to need – as a society – is a certain amount of computational thinking in this increasingly technological world.
And in this way, computer programming is indeed the future. Programming can teach you a structured way of thinking, but it can also provide a way of recognising what is possible in the technological realm. If I tell you that a given app can do some specific task, then if you’ve coded – even a small amount – your computational training can help to tell you whether I’m spewing nonsense or not. Being closer to the computer and its activities can help you understand its power, its limitations, the range of and potential for glitches, and the underlying logic of machines.
And yet I have never learned how to drive using manual transmission. Neither have my parents. I grew up on automatics and find the idea of learning how to drive with a gear stick overly complicated, dangerous and somewhat pointless. I like having one foot free and not having to dwell on complicated matters of gear change. But when an evangelist for manual transmission speaks to me, I learn that I’m ceding control to a machine, that I’m no longer close to how the car works.
The reason I prize coding over the stick shift is cultural. When vehicles were paramount and car culture ascendant, they were the tools of freedom, the choice of tinkerers and the subject of pop songs. To not be aware of their details was to be ignorant in some general way. The more important a piece of technology is to our society, the more we need to understand it.
Which means that, the more important a technology is to our society, the more likely we are to notice its ‘visible seams’. The seams of our technologies are the aspects of our machines that suggest that beneath the shiny surfaces lie complicated innards. Seeing these seams is a hint of understanding what really makes them tick, but we are moving in a direction where designers are trying to hide them. Nowadays, we value the hermetically sealed, perfectly operating piece of technology. Apple tells us over and over: ‘It just works.’ This is infantilising, especially since these gleaming gadgets still have their glitches. Why should we have to rely on a priestly class of experts who are the sole inheritors of a permission to play?
The more we can play with a system, take it apart, tweak it, even make it fail, the more comfortable we become with a technology. We don’t view it as something foreign or strange, as solely the domain of experts, who overcharge us to fix our stuff, under threat of a voided warranty. We see how our machines work, creaky joints and all, and we can take a certain amount of intellectual ownership in them. And more importantly, we can see what sort of role they play in society. Knowing how a car uses gasoline, how its steering mechanism operates, or which precious metals go into a vehicle are all windows into larger issues involving energy, the environment and even commodity pricing.
So how do we get back to being close to our technology?
We need gateways. When my family switched to Macintoshes, we abandoned type-ins. We moved to buying the shrink-wrapped programs that no longer fill computer stores. One of the most quintessentially Mac programs that I used to play with as a kid was something called HyperCard.
HyperCard was this strange combination of programming language and exploratory environment. You could create virtual cards, stitch them together, and add buttons and icons that had specific functionality. You could make fun animations and cool sounds and even connect to other cards. The classic game Myst, for instance, was originally developed using HyperCard. HyperCard was like a prototypical series of web pages that all lived on your own computer, but it could also do pretty much anything else you wanted. For a kid who was beginning to explore computers, this visual authoring space was the perfect gateway to the machine.
One program I built with HyperCard was a rudimentary password generator: it could make a random string you could use as a password, but also had options to make the random passwords more pronounceable, and hence, more memorable over the long term. It was simple, but definitely ahead of its time, in my unstudied opinion. I tried to release it as shareware and made absolutely no money.
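The gist of that HyperCard toy can be sketched in a few lines of modern Python. This is my reconstruction of the idea, not the original HyperTalk: the function names and the alternating consonant-vowel trick for pronounceability are assumptions.

```python
import secrets
import string

CONSONANTS = "bcdfghjklmnpqrstvwz"
VOWELS = "aeiou"

def random_password(length=12):
    """A fully random string: strong, but hard to remember."""
    alphabet = string.ascii_letters + string.digits
    return "".join(secrets.choice(alphabet) for _ in range(length))

def pronounceable_password(length=12):
    """Alternate consonants and vowels so the result can be sounded out,
    and hence remembered -- e.g. something like 'vakotise'."""
    chars = []
    for i in range(length):
        pool = CONSONANTS if i % 2 == 0 else VOWELS
        chars.append(secrets.choice(pool))
    return "".join(chars)
```

A pronounceable password drawn from a smaller alphabet is weaker per character than a fully random one, which is the same trade-off my shareware made; you compensate by making it longer.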
The computer game designer Chaim Gingold calls these gateways ‘magic crayons’. Like the crayon in the children’s book Harold and the Purple Crayon (1955) that allows the four-year-old hero to draw objects that immediately take on reality, magic crayons are tools that, in Gingold’s words, ‘allow non-programmers to engage the procedural qualities of the digital medium and build dynamic things’. Even in the Apple world, commonly viewed as sterilised of messy code and computational innards, HyperCard allowed access to the complex powerhouse of the digital demesne. HyperCard provided me with the comfort to enter this world, giving me a hint of the possibilities of working under the hood.
If we see tablets and phones as polished slabs performing veritable feats of magic, something is lost
All complex systems that we interact with have different levels. In biology, we can zoom up from biochemical enzymes to mitochondria to cells to organs to whole creatures, with each level providing different layers of insight. As we abstract up from one level to the next, we lose fine-grained control and understanding, but we are also able to better comprehend the larger-level system. In computer software, we can move up from individual bits and bytes to assembly language to computer code to operating systems to the everyday user interface that allows us to click, drag and use Microsoft Office. Each successive level brings us more functionality, but it also takes us further away from the underlying logic of the machine.
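That ladder of levels can be made concrete with a toy illustration (Python, my own example): the same value is a bit pattern to the hardware, a byte in memory, and a character to the user-facing interface.

```python
# One value, seen at three levels of the same machine.
n = 72

print(format(n, "08b"))      # bits the hardware sees: 01001000
print(n.to_bytes(1, "big"))  # a byte in memory: b'H' (0x48)
print(chr(n))                # what the interface shows the user: H
```

Each representation is the same thing; what changes is how much of the machinery you are asked to think about.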
There will always be someone more hardcore, more technical than you. Developing a working knowledge of code doesn’t have to mean writing a program in binary, or something similarly difficult. We have tools and interfaces to prevent our lives from being miserable, even if it means we lose some of the mystique of our machines. But we should be able to see under the hood a little. If we see our tablets and phones as mere polished slabs of glass and metal, performing veritable feats of magic, and have little clue what is happening beneath the surface or in the digital sinews, something is lost.
Many years after abandoning our Commodore VIC-20, I found the machine and all its peripherals and manuals languishing in our basement. Among all the data cassettes and cords were some papers and notebooks with my father’s notes. I discovered that he hadn’t been simply buying programs or entering them in from magazines; he had also been writing his own programs, especially for me.
My father had built simple flashcard programs, as a way for me to learn about different topics. I don’t remember playing them, but this must have been my first experience with educational games, experiences that were later eclipsed by commercial games such as Number Munchers and Where in the World Is Carmen Sandiego? My dad never intended to sell his games; they were for our family alone. He was a computer user, but he was also a creator.