Who’s in Charge – Our Technology… or Us?
Digital technology is programmed. This makes it biased toward those with the capacity to write the code. In a digital age, we must learn how to make the software, or risk becoming the software. It is not too difficult or too late to learn the code behind the things we use—or at least to understand that there is code behind their interfaces. Otherwise, we are at the mercy of those who do the programming, the people paying them, or even the technology itself.
One of the US Air Force generals charged with building and protecting the Global Information Grid has a problem: recruitment. As the man in charge of many of the Air Force’s coolest computer toys, he has no problem attracting kids who want to fly drones, shoot lasers from satellites, or steer missiles into Persian Gulf terrorist camps from the safety of Shreveport. They’re lining up for those assignments. No, the general’s challenge is finding kids capable of programming these weapons systems—or even having the education, inclination, and mental discipline required to begin learning programming from scratch.
Raised on commercial video games that were themselves originally based on combat simulation technologies, these recruits have enviable reflexes and hand-eye coordination. They are terrific virtual pilots. Problem is, without an influx of new programmers capable of maintaining the code and fixing bugs—much less upgrading these systems and developing new technologies—the general cannot keep his operation at mission readiness. His last resort has been to give lectures at education conferences in which he pleads with high schools to put programming into their curriculums.
That’s right: America, the country that once put men on the moon, is now falling behind most developed and many developing nations in computer education. We do not teach programming in most public schools. Instead of teaching programming, most schools with computer literacy curriculums teach programs. Kids learn how to use popular spreadsheet, word processing, and browsing software so that they can operate effectively in the high-tech workplace. These basic skills may make them more employable for the entry-level cubicle jobs of today, but they will not help them adapt to the technologies of tomorrow.
Their bigger problem is that their entire orientation to computing will be from the perspective of users. When a kid is taught a piece of software as a subject, she’ll tend to think of it like any other thing she has to learn. Success means learning how to behave in the way the program needs her to. Digital technology becomes the immutable thing, while the student is the movable part, conforming to the needs of the program in order to get a good grade on the test.
Meanwhile, kids in other countries—from China to Iran—aren’t wasting their time learning how to use off-the-shelf commercial software packages. They are finding out how computers work. They learn computer languages, they write software and, yes, some of them are even taught the cryptography and other skills they need to breach Western cyber-security measures. According to the Air Force general, it’s just a matter of a generation before they’ve surpassed us.
While military superiority may not be everyone’s foremost goal, it can serve as a good indicator of our general competitiveness culturally and economically with the rest of the world. As we lose the ability to program the world’s computers, we lose the world’s computing business as well. This may not be a big deal to high-tech conglomerates who can as easily source their programming from New Delhi as New Hampshire. But it should be a big deal to us.
Instead, we see actual coding as some boring chore, a working class skill like bricklaying, which may as well be outsourced to some poor nation while our kids play and even design video games. We look at developing the plots and characters for a game as the interesting part, and the programming as the rote task better offloaded to people somewhere else. We lose sight of the fact that the programming—the code itself—is the place from which the most significant innovations emerge.
Okay, you say, so why don’t we just make sure there are a few students interested in this highly specialized area of coding so that we can keep up militarily and economically with everyone else? Just because a few of us need to know how to program, surely that doesn’t mean we all need to know programming, does it? We all know how to drive our cars, yet few of us know how our automobiles actually work, right?
True enough, but look where that’s gotten us: We spend an hour or two of what used to be free time operating a dangerous two-ton machine and, on average, a full workday paying to own and maintain it. Throughout the twentieth century, we remained blissfully ignorant of the real biases of automotive transportation. We approached our cars as consumers, through ads, rather than as engineers or, better, civic planners. We gladly surrendered our public streetcars to private automobiles, unaware of the real expenses involved. We surrendered our highway policy to a former General Motors chief, who became secretary of defense primarily for the purpose of making public roads suitable for private cars and spending public money on a highway system. We surrendered city and town life for the commuting suburbs, unaware that the bias of the automobile was to separate home from work. As a result, we couldn’t see that our national landscape was being altered to manufacture dependence on the automobile. We also missed the possibility that these vehicles could make the earth’s atmosphere unfit for human life, or that we would one day be fighting wars primarily to maintain the flow of oil required to keep them running.
So considering the biases of a technology before and during its implementation may not be so trivial after all. In the case of digital technology, it is even more important than usual. The automobile determined a whole lot about how we’d get from place to place, as well as how we would reorganize our physical environment to promote its use. Digital technology doesn’t merely convey our bodies, but ourselves. Our screens are the windows through which we are experiencing, organizing, and interpreting the world in which we live. They are also the interfaces through which we express who we are and what we believe to everyone else. They are fast becoming the boundaries of our perceptual and conceptual apparatus; the edge between our nervous systems and everyone else’s, our understanding of the world and the world itself.
If we don’t know how they work, we have no way of knowing what is really out there. We cannot truly communicate, because we have no idea how the media we are using bias the messages we are sending and receiving. Our senses and our thoughts are already clouded by our own misperceptions, prejudices, and confusion. Our digital tools add yet another layer of bias on top of that. But if we don’t know what their intended and accidental biases are, we don’t stand a chance of becoming coherent participants in the digital age. Programming is the sweet spot, the high leverage point in a digital society. If we don’t learn to program, we risk being programmed ourselves.
This is a chapter from Douglas Rushkoff’s new book, Program or Be Programmed. Rushkoff is a widely known and respected digital thinker and has written several important books on the role of technology in our lives.
by Douglas Rushkoff