
Confession of an Internet Tub Thumper

John Sloan

     I peaked in 1999, long before it all went to shit and we lost our minds.
     It started ten years earlier. I was a journalist working in university communications by
day and writing a column on microcomputers for the local newspaper by night.
     “You did this yourself?” asked the man at the print shop counter. Behind him was a
room-sized printing press. It smelled of ink and machine lubricant. The document he held, a
newsletter for a reporters’ association I belonged to, was created with desktop publishing
software on my home computer and printed on a quality laser printer at work.
     The print shop man seemed both impressed by the quality of the work and annoyed that
he might soon be out of a job, at least that part of the job that included design and typesetting for
brochures, newsletters, booklets, and flyers. It was an exciting time, bypassing an entire
industrial process to create publication-quality documents. I even did a term as an adjunct professor,
introducing desktop publishing in the print journalism program, back at my journalism graduate
school.
     There remained limits. The digital documents still needed to be locked in ink and paper
and reproduced hundreds of times over. That’s what had led me to the print shop. The mailing
process involved putting all those newsletters in manila envelopes and affixing sticky address
labels created on a dot matrix printer with a word processor feature called mail-merge. But by
the mid-90s, even those constraints would be breached when I was introduced to the Internet and
something called the World Wide Web.
     “Mind if I blow your mind for a minute?” asked Peter, a guy who worked across the hall
from my office at the university. He’d appeared in my office doorway one morning. I followed
him to his office, where he showed me the Web via Mosaic, a “Web browser” developed at the
University of Illinois’ National Center for Supercomputing Applications. In seconds we surfed
from an exhibit of Vatican Museum art treasures at the Library of Congress to ancient aboriginal
artifacts at an Australian museum. As promised, my mind was indeed blown. This was going to
change everything.
     “I have seen the future, and it is totally tubular and way gnarly,” I wrote in my column
that week.
     One thing that Peter and I had in common was working on Apple Macintosh computers.
Most folks were on lowly PCs. It was ridiculously easy to get on the Web from our Macs via the
campus network. We didn’t realize it, but we were already living in the future. Most people
would not see this kind of bandwidth to their homes or workplaces for another decade.
     I spent a fair amount of work time surfing the Web. Constructive play, I told myself. I
was exploring and learning. I learned to produce and share quality documents, and I didn’t even
need a laser printer. Anybody could read what I had created literally anywhere in the world.
Around this time, I read a book called being digital by MIT Media Lab co-founder Nicholas
Negroponte. Also got a subscription to Wired magazine.
     That did it. From then on, I was a tub-thumping evangelist for digitization. Everything
was going to move online. Ever greater parts of our lives were going to be digitally mediated.
Entire economies would be built, as Negroponte would say, with bits, not atoms. Welcome to
cyberspace, baby!
     I was given an amazing amount of access to the playground. The university’s IT
department was focused on the care of core systems running on various big iron servers and
mainframes. Microcomputers like the PC and the Mac were “toys”. They kicked Web
development over to our communications department, figuring we knew more about assembling
“pretty pictures and type” (i.e. publishing). They could be left to the serious work on serious
systems.
     In addition to doing the grunt work of coding pages in hypertext markup language
(HTML), I was put at the head of a campus committee to develop policies for the use of the
World Wide Web. We were laying the groundwork for how a university should function in
cyberspace. I also took every opportunity to proselytize for the Web in my newspaper column.
     The grunt work and the policy work came to fruition early in 1999 when we debuted a
new campus Web site and sent governance policies to the university senate. More than just a
static brochure, the home page featured always-fresh content: a daily news service provided by
my department and a campus-wide events calendar.
     This is just the beginning, I thought. The newly designed home page would be a portal to
a range of educational and administrative processes, the front gate to a digital university that
would functionally mirror the physical campus. But for this to happen, I needed departments to
commit to change, none more so than IT.
     “If you ask me, we should ban the World Wide Web,” said a professor in the
university’s computer science department. His high-performance computing projects were being
frustrated by everybody “playing” with the Web. Every morning when faculty, staff, and students
fired up their computers and read the news on the campus Web site, network performance would
drop off the table.
     IT had tossed the Web potato to us in the first place, but there was concern that we were
creating a monster. Remember, these people were focused on core systems in both the corporate
and academic realms. These systems required critical infrastructure to operate. The Web was a
bandwidth suck.
     Just as I was about to make the case for the Web as its chief onsite evangelist — the
would-be Webmaster of the university — I was taken off Web development and put on Y2K
preparedness. I hated it. I wanted to spread the good news but was condemned to focus on the
potentially bad. In retrospect, Y2K preparedness and Web development were both shining
examples of how not to effect organizational transformation.
     For both, you need general buy-in and, more importantly, a will to act across the
organization. You can’t just hang it on one guy to create a single document, be it a Web home
page or a Y2K guide. With the guide, I realized I wasn’t creating something to spark preparatory
action. I was checking a box for lawyers under the heading of “Due Diligence”.
     It was not the first or last time in my so-called career that I thought I’d lead the parade,
only to find that I was the clown with the shovel following the horses. The bitterness erupted in a
late-night meeting with my boss. Regrettable things were said. Words came out of me that had
never been spoken before. Words like “butt humped”. I actually pounded my fist on my desk.
That was something I never thought I would do, ever.
     The next day, I started looking for a new job. Never was there a better time to jump ship.
The dot-com boom was in full swing. Among the skills sought by the many fledgling online
startups was experience in putting words and pictures together. A lot of colleagues in the
traditional (dying) media made the jump that year.
     Unlike the companies many of my unfortunate ship-jumping colleagues landed at, the tech
research firm I went to survived the dot-com crash a year later. My new focus was counseling the same IT types that I
had struggled against. The revolution was coming whether they wanted it or not, and they would be
expected to support it. A lot of corporate bosses would just expect the infrastructure to be there,
and they would not take no for an answer.
     By 2005, we had a new term to strategize over. It was called Web 2.0. The Web was about
to become far more dynamic, interactive, and seamlessly connected with back-end systems —
more than just pretty pictures and typefaces. Once Web 2.0 took hold, I saw most of my dreams
of the 90s come true. More and more, we would live, work, and play in a digital realm. Our
reality was as much composed of bits as atoms.
     Was it inevitable that the digital dreams would become a nightmare? Throughout history,
revolutions have started with euphoria and a breath of freedom before it all goes south. Was it
the net or was it me? In a blur of a few years, I went from a youngish tech visionary to a
knuckle-dragging Luddite. The old man yelling at clouds, or was it the Cloud?
     “That’s gotta hurt!”
     There is a 1998 episode of Seinfeld where George Costanza thinks of a great line he
could have shouted out in a movie theater. He then looks for other theaters so he can watch the
movie again and shout his heckle at the appropriate moment. But he is upstaged by the
meaningless distraction of a guy with a laser pen.
     A lot of writers are like George. They think of the ideal line later and then create
entire universes, people, and scenarios just so they can deliver that line. These kinds of writers
are the quiet ones at parties, saying little but listening intently, filing it all away for later use.
     The modern Internet has little space for people who are wired this way, the people who
think, write, and publish in that order. The successful, trending contributors do very little
thinking or writing. Just publish as fast as possible. That’s OK, though, because they aren’t
trying to spark thought. Their aim is to bypass the higher brain functions and trigger base
emotions. Emotion drives engagement. Engagement makes money.
     The old-school writers still occasionally publish. But more often than not, the output is
ignored by the masses due to the meaningless distraction of millions of laser pens. Traditional
publishers — of books, magazines, and newspapers — have fought for relevance as the
eyeballs and business models have gone elsewhere. They have been replaced by platforms that
refuse to even see themselves as publishers. Publishers have accountability.
     With mass Web publishing, we were going to break the constraints of the corporate media
filters. But filters also catch a lot of shit. Without the filters, the shit flows free. We are drowning
in it. The modern corporate publishers (the platforms) not only refuse to filter shit, they
encourage its production.
     It was promising at first. The platforms were these open spaces where anybody could
connect with whoever they wanted and publish whatever they wanted, consuming each other’s
posts and tweets. The relentless drive to monetize your ever-shrinking attention span degraded
the platforms into polluted and polluting bias-confirming rage machines. This is a process the
Canadian writer Cory Doctorow calls enshittification.
     I should have seen it coming. I read McLuhan. I knew that new media re-wires our
brains. Not always for the better. I was suddenly old-fashioned. I always saw the computer
screen as the common portal to this brave new world. Portability and mobility would be provided
by the laptop and then the tablet computer. I didn’t consider what it would mean when the primary
access point became a handheld texting machine.
     I have become like the old man by the local river, trying to catch fish you dare not eat,
lamenting the loss of the good old days when the water ran sparkling cool and clear, when you
could drink it or go for a swim without risking toxic contamination. Why can’t we keep the nice
things?
     I have struggled to hang on to what is left of my mind, as so many around me have
clearly lost theirs. To the extent that people like me laid the groundwork for this world back in
1999, I apologize.
     But, man, was it ever fun!

John Sloan has been writing in London, Ontario, Canada for 35 years mainly as a journalist, columnist, and technology analyst. He chose this path due to Lou Grant, a degree in Journalism, and a need to eat. He lives in neither a rambling Victorian house nor a cozy flat. He does not own a cat.
