originally printed in Gravitas magazine, Winter 1998 issue.
4400 words, plus notes.
by Rick McGinnis
NINE YEARS AGO, a great bronze safe once belonging to the French fantasist Jules Verne was opened by his great-grandson. Inside, under a pile of linen, he found a manuscript for a novel, written by his great-grandfather 125 years earlier. Verne's inventiveness and imagination were among the keys to his phenomenal success, and his tales of submarines, trips to the moon, and transcontinental flight were not only the prototypes for much of the science fiction to come, but also anticipated much of what we take for granted at the end of this turbulent, headlong century. Still, the novel—published here recently as Paris in the Twentieth Century—was rejected by his publisher as beyond credible: "No one today will believe your prophecy," wrote Pierre-Jules Hetzel to the writer.
Verne's novel is set in the Paris of 1960, in an age when free-market capitalism governs the world. Everything has been privatized, even education, and institutions are run for profit by share-issuing companies. Because science and technology have been the agents of phenomenal progress and productivity growth, little else interests the vast majority of the population. War—a notoriously unprofitable endeavor—has been eliminated, but so too have the humanities, and while music survives in a utilitarian function, centuries of literature crumble into dust on forgotten bookshelves.
Verne's world of automated manufacturing and agriculture, overcrowded cities,
and poorly paid service-sector employees is a crushing dystopia for his
protagonist, Michel Dufrenoy, a young man ill-suited for anything other
than the useless trade of poet. As he falls into the margins of his society,
and beyond, Michel's friends desperately wonder what kind of employment
he might find for himself in a society that values only capital and expediency.
As a writer, they suggest, he might find a position as a "stock
reporter...getting caught out every day in inevitable errors, prophesying
events with great aplomb, on the principle that if the prediction doesn't
come true, the prophet will be forgotten, and if it does, he will pride
himself on his perspicacity, overcoming rival companies for some banker's
greater profit...Will Michel ever consent to that?" As the story is a tragedy,
he will not, and so his fate is sealed.
While only sheer paranoia would persuade us that Verne's
vision of the future was wholly accurate, like many oracles he understood
enough to anticipate correctly at least one aspect of the world to come.
In a world where capital is the fuel of nearly every endeavor, the job
that Michel rejects is held today by thousands of people around the world,
under the name of stock analyst, fund manager, investment adviser, strategic
consultant and, most auspiciously, futurist. Moreover, as the future hasn't
transpired to be as monolithic as Verne envisioned, those employed in these
fields aren't nearly the toiling peons Michel Dufrenoy disdained to join.
At the end not only of a century but of a millennium, the future is a matter
of serious speculation, its trajectory and variables a matter of serious
conjecture, the ability to divine its path a valuable skill. The Future,
it seems, is big business.
"Lasher shrugged. 'Oh hell—prophecy's a thankless business, and history has a way of showing us what, in retrospect, are very logical solutions to awful messes.'"
—Kurt Vonnegut, Player Piano
VERNE WAS certainly not the first writer to speculate on the future. In a brief survey, we might begin with Plato's Republic, and continue along with More's Utopia, and Francis Bacon's The New Atlantis—a lineage of literature concerned with worlds to come. Verne's own century produced books like Samuel Butler's Erewhon, Edward Bellamy's Looking Backward, and William Morris's News From Nowhere. The larger part of H.G. Wells's work was concerned with the future, and with the increasing popularity, and later legitimacy, of sci-fi, the twentieth century has shown an unceasing fascination with the shape of the future.
And the dominant vision of this future has been dystopic. Orwell's 1984, Huxley's Brave New World, Yevgeny Zamyatin's We, and Vonnegut's Player Piano were among the books outside of sci-fi that have shaped our worst fears about the direction this world might take. As reading has declined, movies have taken over from literature, delivering the nightmarish scenarios of the Alien and Terminator films, and the highly influential Blade Runner. This would hardly be surprising, as the half-century since the Second World War has been dominated by the threat of nuclear apocalypse, a threat now joined by new visions of hell on earth as disease and environmental catastrophe are added to the list of wounds we seem to be inflicting on ourselves and the planet.
But this is only one half of the conventional wisdom. No person or society can live entirely without hope, and so there has always been a market for optimistic scenarios. Indeed, a recent wave of optimism has been sweeping through the business community, inspired by what seem to be remarkable changes in the way we communicate. Fortunes have been made not only by selling new goods in the marketplace, but even by selling the promise of a whole new marketplace. Companies have risen from insignificance to near-monopolistic importance in markets that didn't exist twenty years ago. Our very vocabulary has been changed as new words like software, modem and e-mail have been coined to describe what were once obscure technical concepts. The fetish of all this change is the computer.
Once an intimidating monster that threatened to supplant humanity, the merciless agent of redundancy in Vonnegut's Player Piano, the computer has become a ubiquitous item on desks and in homes. Fiction, though, has become a poor delineator of this "revolution," and instead we have racks of monthly computer magazines, business journals devoting acres of editorial space to the industry's new billionaires and their companies, and a new genre of literature that has sprung from a professional need to keep up with the changes and plot a course into the future. Michel Dufrenoy might not have applied for a position in this new world, either, but he might at least have appreciated the romance with which the "futurists" in these publications depict the coming century. In the work of even the driest of these writers, there is a palpable sense of anticipation, of Utopia finally within our grasp, while in others, there would seem to be nothing in Verne or Wells that isn't on the verge of being fulfilled by a beige-cased business appliance whose potential has been barely tapped.
If only it were so true, or so simple.
"The telephone, and the radio, and the movies that we know
May just be passing fancies, and in time may go..."
—George Gershwin, "Love is Here to Stay"
THE POPULAR LITERATURE of the modern computer began,
not with Victorian speculation on Charles Babbage's "Difference Engine"
or the prototypical science fiction that grew from Verne and Wells and
was found in magazines like Amazing Stories, but with an article
printed in the July 1945 issue of The Atlantic. The author, Dr.
Vannevar Bush, was a former president of MIT, and had just served as the
head of the wartime Office of Scientific Research and Development. His
essay, "As We May Think", was written in the closing months of the Second World
War, when cautious optimism began to re-assert itself after years of catastrophe.
Bush speculated on how wartime technological breakthroughs might enable
the construction of machines that might make labour more efficient, communication
more rapid, and the organization of vast amounts of data cohesive and comprehensive.
Mankind, he wrote, "has built a civilization so complex that he needs to
mechanize his records more fully if he is to push his experiment to its
logical conclusion and not merely become more bogged down part way there
by overtaxing his limited memory." Though restrained in tone, the message
was clear: with the aid of science, we have the means in our grasp to perfect ourselves.
Twenty years later, the machines Bush had described were in existence, in the service of government, science, and business. Huge and complex, maintained by teams of expensively-trained technicians, they had names like UNIVAC and EDSAC but were generically called computers, a name Bush couldn't have anticipated, much as he couldn't have known about the transistor that made the machines he imagined so much smaller and cheaper to build, thus making them commercially viable, if only for the well-financed. In May of 1964, The Atlantic published another article, by Martin Greenberger, an associate professor at MIT. "The Computers of Tomorrow" acknowledged its debt to Bush, and went on to predict computer-driven money markets, communication networks, and the "information utility". Greenberger's tone was far more rapt: "By 2000 AD man should have a much better comprehension of himself and his system, not because he will be innately any smarter than he is today, but because he will have learned to use imaginatively the most powerful amplifier of intelligence yet devised."
Thirty-one years later, Lou Gerstner, the chairman of IBM, makes the keynote address at Comdex, the computer industry's annual trade show. In the decades since Bush and Greenberger's articles, the computer has become all-pervasive, not only as an "information utility" but in the popular consciousness, spawning aggregate miles and months of massed commentary in the news media, and a parallel publishing industry devoted solely to the burgeoning developments in a technology that is smaller, cheaper, and more accessible than Bush or Greenberger could have imagined. Needless to say, Gerstner feels the powerful momentum behind himself and his industry, and his tone is commensurately inspired: "We will improve the world, and the way we work, the way we communicate, live and learn as people...We have grown, we have innovated, and we have prospered at a rate unsurpassed by any other. It's been an amazing, breathtaking ride. It can continue—and accelerate—if we remember that our future rests on how well we respond to the total needs of society and of our customers all around the world." Flushed with pride, and anticipating the deafening roar of applause, Gerstner lowers his voice and intones, "Thank you, and I hope you have a great day at Comdex."
At the end of the twentieth century, no other product
of our age is as synonymous with the prosperous, thriving future as the
computer. It has replaced the car, the airplane, the robot, and even the
spaceship as our fetish object. As an icon, it's almost universally recognizable:
a small screen perched on top of a box, with the rectangular grid of a
keyboard in the foreground. And yet, what are these elements but a jumble
of parts, bits of pre-existing technologies wired together around a poorly
understood package of hardware, animated by an even less-tangible impulse
of software? Like the technology itself, the culture that has sprung up
around it is a shotgun marriage of ideology and utility, uniquely a product
of its age, where discrete elements placed in proximity gain emotional
power and persuasiveness. Like the movies that are considered the art of
the age, computer culture relies on its speed to overcome the contradictions
and inconsistencies that would hobble it.
"It would be hopeless for revolutionaries to try to attack the system without using SOME modern technology. If nothing else they must use the communications media to spread their message. But they should use modern technology for only ONE purpose: to attack the technological system."
—Theodore Kaczynski, "Industrial Society and Its Future"
THERE ARE, OF COURSE, dissenting voices in the din of hype. Sven Birkerts's The Gutenberg Elegies: The Fate of Reading in an Electronic Age, published in 1994, lamented that computers, networks, and the promise of readily accessible "information" would swamp libraries, publishing, and the act of reading. Birkerts's argument is largely philosophical: "Somewhere we have gotten hold of the idea that the more all-embracing we can make our communications networks, the closer we will be to that partaking that we long for deep down. For change us as they will, our technologies have not yet eradicated that flame of desire not merely to be in touch, but to be, at least figuratively, embraced, known and valued not abstractly but in presence."
A year later, Clifford Stoll's Silicon Snake Oil made much the same argument from the perspective of an apostate technologist: "Perhaps our networked reality isn't a universal doorway to freedom. Might it be a distraction from reality? An ostrich hole to divert our attention and resources from social problems? A misuse of technology that encourages passive rather than active participation?" More recently, an essay by science fiction writer Bruce Sterling, "Unstable Networks," questioned the ability of the Internet to transcend mundane physical reality: "Isn't it far more likely that we'll get the Internet that we deserve? Cyberspace isn't a world all its own like Jupiter or Pluto, it's a fun-house mirror of the society that breeds it. Like most mirrors it shows whatever it's given: on any day, that's mostly human banality. Cyberspace is not a fairy realm of magical transformations. It's a realm of transformations all right, but since human beings aren't magical fairies you can pretty much scratch the magic and the fairy parts."
Still, these arguments are hardly about to negate the
blare of hype. In the case of Stoll and Birkerts, the arguments rely on
sentiment and perspective for their appeal, and on an idea of the past
that is hardly persuasive in an age that knows little of history other
than what mediated technological art forms like movies choose to depict.
They draw on the same liberal humanism as Gerstner's battle-cry to the
troops, and only offer the quixotic prospect of opposing "progress" as
a solution. Moreover, they are unable to speak to the concepts of "efficiency,"
"expediency," and "market-driven forces" that demand the technology and
fund the computer industry.
Ultimately, they can't compete with the romance of a
booming, prosperous future—the kind retailed by assorted journalists, academics,
professional futurists, and by magazines like Wired.
In July of 1997, Wired printed a cover story on "The
Long Boom": Since 1980, the magazine argued, the economy has both confounded
all expectations and offered a unique opportunity to those willing to partake.
Though it may be difficult to see now, "in the developed countries of the
West, new technology will lead to big productivity increases that will
cause high economic growth—actually, waves of technology will continue
to roll through the early part of the twenty-first century." Combined with
globalization and a new ethic of "openness," the process will be irresistible,
and will lead to a "civilization of civilizations".
"Openness" is, of course, the key to all of this: "A positive scenario can inspire us through what will inevitably be traumatic times ahead."
"So suspend your disbelief. Open up to the possibilities."
"I have seen the future, baby, it is murder."
—Leonard Cohen, "The Future"
THE ARTICLE'S AUTHORS, Peter Schwartz and Peter Leyden, describe a world of increasing decentralization, eroding government regulation, creeping affluence, ecological stability and even improvement, and international tension relieved by economic parity. Heaven on earth? Very nearly, provided it works. And what might stop it? They point to the threats: famine, nationalism, pollution, plague, crime, war, censorship, and "social and cultural backlash"—the antithesis of "openness". They even consider the possibility that "new technologies turn out to be a bust. They simply don't bring the expected productivity or the big economic boosts". Dark thoughts, indeed.
Six months later, Wired's January 1998 issue,
the fifth anniversary of the magazine, devotes itself to a theme: "Change
is Good". Evolution is impelling us forward, write Disney research
and development vice-president Danny Hillis and journalist Oliver Morton.
Technology is the handmaiden of evolution, says supply-side theorist George
Gilder. We are at the greatest summit of civilization ever, says Cato
Institute fellow Julian Simon, and there are greater heights to reach.
Government is becoming irrelevant, along with big media, say journalists
John Browning and Randall Rothenberg. Even the Third World will join us
on our triumphant march, say MIT professor Nicholas
Negroponte and Grateful Dead lyricist John Perry Barlow. The messianic
tone of the issue is merely that of Gerstner's Comdex address, amplified.
This time, the threats to the Long Boom are denied a seat
at the table, and the opinions are no longer offered as speculation but
as articles of faith. Change is Good, and Wired is about Change,
so Wired is Good: the morality is self-evident. The philosophy beneath
it is a fascinating marriage of free-market neo-liberal economics and millennial
mysticism, Darwin and Adam Smith, libertarianism and recovered communitarianism,
ecological Gaia theory and technofetishism—and it is meant for both apostate
and true believer.
The possibilities for parody and satire are endless, and
Wired knows it—part of its burgeoning empire is an on-line magazine, Suck,
that frequently targets dispensers of "Information Age" hype and billionaire
nerds. Which doesn't prevent Suck co-editor Joey Anuff from contributing
to the "Change is Good" testimonial issue. As a publishing house, Wired
sells pocket dictionaries of computer slang, profiles of "digerati," and
reprints of Marshall McLuhan, the magazine's "patron saint". Martin Greenberger's
"information utility" has sparked the "Information Revolution," and the
control of language is all-important, especially if one is promoting a
revolution whose battles are played out in the media, and whose advances
often seem difficult to articulate.
Nicholas Negroponte, for example, is fond of describing the change as happening on the level of "bits" versus "atoms"; that is, in the digital realm as opposed to the physical. Information, being composed largely of ideas, concepts, and reported facts, is a digital product, and trade in it will, by necessity, overcome borders, laws, tariffs, and even nationalities. It's persuasive rhetoric, especially for those already dealing in information: journalists, politicians, traders, academics, and artists. In the future, it is understood, access to information will be the criterion for determining social classes. With that in mind, we owe it to future generations to make the technology cheap and available. Some money will be made of course, but what a perfect marriage of commerce and social philanthropy!
Even a cogent book like Montreal writer Matthew Friedman's
Fuzzy Logic: Dispatches from the Information Revolution relies on the
meaning of "information" to be self-evident. Talking to Kenyan systems
administrator Crispin Sikuku, they agree that the next century's prime
resource is to be scrutinized and quantified.
As computers are the conduit of the "information superhighway,"
it's considered imperative that children have access to them, and that
schools and libraries acquire them. The largest single charitable use Bill
Gates has made of his fortune is a foundation to wire libraries and schools
in the United States, and last year he was asked by British prime minister
Tony Blair for his help in a similar
scheme in the U.K. To Gates, it's self-evident that computers will
stem the tide of falling educational standards in the West, and he devotes
a chapter of his book, The Road Ahead, to education: "When the information
highway is in operation, the texts of millions of books will be available.
A reader will be able to print the text, read it on-screen,
or even have it read in his choice of voices. He'll be able to ask questions.
It will be his tutor."
The idea that "millions of books" will be available to students is frequently invoked in defence of computer-aided education, in spite of the fact that, while many library systems have put their card catalogues on-line, it is both prohibitively expensive and unimaginably time-consuming for whole texts, never mind whole libraries, to be made available for downloading—and this doesn't even begin to address the issue of copyright or even the physical aspect of reading a book off of a computer screen. As Nicholas Negroponte admits: "Even where computers are omnipresent, the current interface is primitive—clumsy at best, and hardly something with which you might wish to curl up in bed."
Where computers have failed to improve educational standards,
it's because the equipment is old, and the interface inadequate, Gates
states. The solution is to spend more money on computers in schools. He
states emphatically that this will not replace skilled teachers. Never
mind that education has been cut back drastically just as a youthful demographic
boom is swarming through the system, or that another demographic boom—one
that can vote—is about to drain the other end of the public purse in the
form of pensions and health care. Never mind that the loudest accusation
aimed at the Internet is the range of material it hosts that would hardly
be welcome in the classroom. These are minor problems that can be solved
by an industry known for its ingenuity. Never mind that Gates is a leading
member of that industry, and stands to enrich himself, along with many
of his peers, from the school market. Could this be what he means by "Friction-Free Capitalism"?
"The writer sees the world as a jaded world devoid of recuperative power. In the past he has liked to think that man could pull out of his entanglements and start a new creative phase of human living. In the face of our universal inadequacy, that optimism has given place to a stoical cynicism. The old men behave for the most part meanly and disgustingly, and the young are spasmodic, foolish, and all too easily misled."
—H.G. Wells, Mind at the End of its Tether
THE ART OF THE LONG VIEW: Planning for the Future in an Uncertain World was published in 1991. Its author, Peter Schwartz, was a former futurist for SRI International (a California consultancy firm once associated with Stanford University) and later a futurist for Royal Dutch/Shell. Now president of the Global Business Network, Schwartz (who would later co-author Wired's "Long Boom") sought to share his expertise in scenario-building with those whose livelihoods depended on contingency planning. The book is a fascinating look at the process by which professional futurists generate planning scenarios for different companies, but it would be a mere "how-to" manual if Schwartz hadn't decided to put his cards on the table and outline what he thought would be the defining scenario of the next century: "The Global Teenager".
A huge demographic of children are passing through school systems all over the world. They are "connected," in Schwartz's opinion, through the network of cheap electronics and satellite dishes that make products of the entertainment industry, American or local, in demand everywhere. Schwartz admits that he, like many futurists and planners, learned of the power of demography from their own generation, the baby boom. But this demographic is polyglot and largely poor. While correctly imagining that this may tend to breed desperation and cynicism, even violent resentment against the prosperous, he states: "As a scenario-planner, this type of pessimistic image always inspires the question in me, What would have to take place for this image not to come true?" That goal in mind, he constructs a version of the future based on the best possible outcomes, which include liberalized immigration policies (implemented, of course, by the West), ever cheaper and faster communications technologies available globally (unimpeded by regulation), and the kind of urge to succeed associated with the aspirant middle classes of a generation ago. It's a world of "video-cafes" where teenagers go on teleconference "dates".
It's easy to make fun of Schwartz's optimal scenario,
but it comes from a deep need that is articulated by the book's afterword:
an open letter to his newborn son. "Your future will be filled with ample
challenges and inspiring goals," he assures the infant. "It will be a good
future, a future of good people."
Don Tapscott's Growing Up Digital, published just recently, is a book-length expansion
of these themes. Tapscott, a business
consultant whose specialty is advising executives on the impact of
new technology, began his book essentially by noticing how easily his children
were taking to the same computer that often frustrated him—like Schwartz,
he pulls out the trope about kids programming their parents' VCRs. Like
Schwartz, he roughly sketches the dire consequences of such a demographic
stymied in its attempt to fulfill the promises bundled with the technology,
but quickly reassures us that the necessary adjustments will be made. Even
more than Schwartz, he identifies the market potential of these children,
and like Lou Gerstner's "customers," feels that "our future rests on how
well we respond to (their) total needs."
Yet ultimately, Schwartz and Tapscott's vision of the
future is as sentimental as Sven Birkerts's and Clifford Stoll's reservations
about its direction. It would be just as intolerable for the future to
"...I could see that he was tired after talking so much, and I was not quite sure how tolerant he would be of any opinion that contradicted his own—especially when I remembered his sarcastic reference to the sort of person who is afraid of looking a fool if he cannot pick holes in other people's ideas. So I just made some polite remarks about the Utopian system, and thanked him for his interesting talk—after which I took his arm and led him to supper, saying: 'Well, I must think it over. Then perhaps we can meet again and discuss it at greater length.'"
—Thomas More, Utopia
MIT ECONOMIST Lester Thurow's 1996 book, The Future of Capitalism, is hardly a glowing encomium of optimism. He has little to say about the free-market system that isn't carefully phrased or deeply critical. There are big changes ahead, but the most he is willing to do is to describe their motive forces, which he imagines as moving like plate tectonics, shifting slowly but surely, occasionally with a sharp wrench like an earthquake. He sees few bright spots among the global economies, but feels that four countries should be singled out for their particular merit in constructing stable, workable systems: Hong Kong, Singapore, Taiwan, and South Korea.
With the collapse of the markets in Asia at the end of 1997, you might have expected a sort of mea culpa from Thurow. His article, "Asia: The Collapse and the Cure", published in the February 5th issue of the New York Review of Books, was just as careful an analysis of the situation as was The Future of Capitalism. But the gist was that the collapse, when it came, was unforeseeable except to insiders, whose implication guaranteed silence. "What is clear by now is that crashes are not set off by outside speculators who see the internal weaknesses and attack. The first investors who leave the market are always the local investors who have the best information."
The irony is profound: while markets communicate with
increasing speed, reacting faster than we could have imagined before, the
information that triggered them is already old, a relic of the past. The
information is there, but until we wire not only every office, home, and
school but, beyond that, every mind and conscience, we will not have access
to it at the peak of its value. Information that carries the promise of
the future inevitably becomes devalued currency by the time it reaches
us, as poor an investment as a collapsed share or a block of baht
or won. We would all love to live in the future, provided it is
a good and prosperous one, but in spite of it all we still merely inhabit
that sharp edge of the past known as the present. We would all love to
glimpse the future, but we lack the information.
©1998, 2001 Rick McGinnis, all rights reserved