I consider myself a smart person. On occasion, regardless of what my friends and family might say, even clever. Yet when it comes to technology, I alternate in equal measure between feeling captivated and terrified.
You see, on the one hand, I drank the Kool-Aid when I was a teenager begging my parents for a Walkman (for those who don’t know, think an iPod, but fatter, and the music played on cassette tapes). Later, it was thrilling for once to be stuck in a 1995 L.A. traffic jam; yes, exciting to be immobilized on Sunset Blvd., because I had an excuse to call my mom in Michigan on my brick of a cell phone. Yeah, it cost $2 a minute, but who cared? I could actually have a conversation in my car?! Today, that’s a smartphone and a laptop.
On the other hand, I simply use new tech. As long as the on-button works, I’m good. I never understood the underlying concepts of how any of it works, and as a result, my foundational understanding is built on sawdust, and it gets harder to catch up every year. Where is the internet housed, anyway?
The problem is that technology doesn’t stop expanding, like a spilled pitcher of lemonade. It is an infinite spider web of steady innovations, applications, and accompanying changes in how we live and work. This has been true since the dawn of time, and no doubt it will continue until the lights go out on all of us.
That’s great. But what happens when the button is green on my device and nothing happens!? The old: “What the hell is wrong with this computer!?” Every time, it feels like a bite into an extra-sour lemon. Bitter.
“I just need it to work and I don’t understand it,” I heard a young woman say not long ago to a device technician. She was sitting next to me at the Apple store with a busted iPad. That’s brave, I thought. Me, I pretend I know what is happening. I don’t.
Herein lies the pathological terror part, tinged with a dash of insanity. Every single time I get a new device, I swear this time is going to be different. I am going to hunker down and learn the basics: take the class, watch a YouTube video, get up to speed. I start off strong the first day, but get overwhelmed in record time, like hours, and every time, I shut down. I cobble together the five settings I need to keep it moving and religiously maintain the idea of a deep dive on my to-do list, and just about every month of every year I copy it over to the next list. Sound familiar?
Underneath these shared experiences, constantly lurking, is the idea that some of us are meant to get it and others are not; that the world is divided into technocrats and technophobes. In other words, perhaps I’m just not that smart after all, because I’m always at a loss.
At work, I’ve systematically embraced monstrous changes over the past two decades. Back in 1995, when I was covering the O.J. Simpson criminal trial in downtown Los Angeles, we went live nightly on the back of a ten-ton truck, with several hundred yards of cable crisscrossing an abandoned lot across the street from the courthouse. By 2005, when I was covering Hurricane Katrina in New Orleans, we went live above the submerged Elysian Fields neighborhood on a computer. No truck and no cables.
Over ten years, technology changed how we worked in broadcast, and it wasn’t always fun. Along with the new tools came new skills. The core work of producing, reporting, shooting video, producing audio, and editing had gone from six people to, in some places like local news, one person. That’s a lot of jobs. Gone.
Technology radically simplified what was possible in television production, streamlining the workflow of shooting, editing, and broadcasting. Digital content changed everything, but it was a slow burn as the industry began to understand how monumental it was. Early on, executives across the networks used to discuss digital and the internet as a fad. Yes. They really did. I heard them.
Little time was spent looking ahead: investigating what the tools might be, what skills would be needed, and thus what coherent strategies could educate the workforce. Instead, jobs and the people in them slowly started disappearing. The rest of us adapted. Technology-driven changes have swept, and will continue to sweep, across every industry and sector.
And I’m not sure we in the news have done a great job of helping people understand these evolutions. Is it possible that somewhere between dystopian headlines screaming “robots will take your job” and raving top-ten lists daring one to live without a smartphone, media makers (meaning journalists, writers, creatives) have mostly skipped straight past nuance and landed on a binary technology narrative? I wonder.
Have we created a line that grows ever thicker, ever further dividing everyday consumers of technology from technocrats? Is it a refrain that maintains a steady beat of underlying fear, lurking in the hearts of many of us, that the world is changing incomprehensibly fast, leaving us in the dust, powerless to do a damn thing? Do tech vendors assume consumers know more or less than they do? Are we talking about technology through any and all lenses?
Can we provide greater context even in this bite-sized content world that we live in?
Ironically, I have worked in technology for the past five years, from angel investing to teaching at a tech-driven master’s program at New York University, the Interactive Telecommunications Program at the Tisch School of the Arts. I spend an enormous amount of time with people ranging from technologists to animators to producers working in virtual and augmented reality.
From these experiences I have learned a very important lesson: people make technology. What one person makes, another can learn to use, understand, and provide feedback on. We need each other, as it turns out.
Until a recent chance meeting with a UX designer in San Francisco, it never dawned on me that I wasn’t technically dumb, but that some of the tech is badly designed. (Wikipedia defines the field this way: “User experience design (UX, UXD, UED or XD) is the process of enhancing user satisfaction with a product by improving the usability, accessibility, and pleasure provided in people’s interaction with the product.”)
People make technology using fundamental concepts of mathematics, computer science, and hardware design. As it turns out, technology is similar to building a skyscraper. Each floor is built on the one before, and all of it depends on the foundation. My dad, the civil engineer, will be proud of me for making this kind of analogy. We now carry computers within our cellphones that can process heavy computations while whizzing through our email and making calls. Technology has gotten smaller and more powerful over the years; it is hardly recognizable from what was developed in the 1960s.
Shouldn’t there be a place to hear from everyday people about their technology: what they use, how they use it, how it has changed their professions or their lives? Are there stories to tell that give us greater context for what’s happening and what that means for today, and perhaps tomorrow? I mean an A-to-Z guide for everyone from artists to zookeepers. Because here is the truth. People make technology. What people make, other people can learn to understand, if only the very basics, one part at a time.