A celebration of Zoom's 50th
Half a century later, I can still sing the address: "Write Zoom. Z, double-O, M. Box 3-5-0. Boston, Mass. 0-2-1-3-4. Send it to Zoom!"
Read my essay/appreciation celebrating the 50th anniversary of the groundbreaking kids' show Zoom.
The seductive nostalgia of “Stranger Things”
Over at Salon, I wrote a piece called "Just like ’80s nerd heaven: The seductive nostalgia of ‘Stranger Things’ — and my unexpected ambivalence" that explores how Netflix's supernatural show raids my childhood pop-culture loves, from D&D to E.T., and I admit I felt conflicted.
Why I'm against online comment forums
Why am I against online comment forums? Increasingly, I see ad hominem attacks, “you’re a loser” name-calling, and Donald Trump-style playground insults, all of which now pass for grown-up debate in America. Read the rest over on WBUR's Cognoscenti.
My Failure Is Complete: I Fell for Star Wars Hype. Now, Can We Just Watch the Damned Movie?
‘Star Wars,’ And The Force It Awakened In Me
“Star Wars” and its sequels were touchstones, mind-bending fantasy movie experiences into which I poured my longings for escape, creativity and adventure. Read the rest of the essay here.
Why Facebook's Proposed "Dislike" Button Is a Bad Idea
Facebook may finally be getting a button that lets you quickly express something beyond a “like.” In this commentary for WBUR's Cognoscenti, I say "Thumbs Down On Facebook’s ‘Dislike’ Button," and propose something else entirely.
Help, My Computer Is Turning Me Into A Robot
The first three months of 2014 have given us three momentous milestones in technology. There was the 30th anniversary of the Macintosh personal computer back in January. Then came the 10th birthday of Facebook in February. March celebrated 25 years since the beginning of the World Wide Web.
These technologies have made us more connected, more adept, more independent and more informed. Seemingly overnight, they’ve become irreplaceable tools for the workplace and for leisure, allowing us to do things we’d previously never dreamed possible: send messages in the blink of an eye, search enormous databases from our homes and offices, and store vast amounts of information. Computers, social media and the Web have unleashed a powerful, creative DIY force. We are now our own secretaries, publishers and number-crunchers. We are indeed powerful.
But to what end?
Much has been written about technology’s downside. Largely, that critique centers on its de-socializing effects. The Internet and our smart devices distract and addict us. They tempt us to not “be present” in real-world space. I often feel these things to be true. But my take on the dehumanizing aspects of digital technology is somewhat different.
My fear is this: Has my trusty and seemingly innocuous MacBook Air made me more robot-like? Have our computers turned us into them?
Obscene income inequality is immoral
The current disparity between what executives make and what rank-and-file employees earn is nothing short of immoral. But sadly, the battle for improved pay equity across America’s workforce isn’t going to be won anytime soon.
In 2012, CEOs of S&P 500 companies made, on average, an astounding 354 times more than the average U.S. worker, according to the AFL-CIO. The ugly numbers continue. Of some 141 countries, the U.S. ranks fourth highest in “wealth inequality,” trailing only Russia, Ukraine and Lebanon.
But there is hope. Some companies — admittedly, a small minority — do voluntarily cap their top executives’ salaries. For example, Whole Foods Market won’t pay its CEO more than 19 times the company’s average annual wage.
You can read the rest of my column for WBUR's Cognoscenti here.
Put Down The Camera, Pick Up The Fork
Food — you know that stuff you put in your mouth, chew and swallow to stay alive? — has officially jumped the shark.
Remember when food was just something you ate?
I’m not exactly sure when it happened. Was it the molecular gastronomy craze of a few years ago, when chefs squirted popcorn and gumdrop foam over your duck breast, and suddenly preparing a meal became a science experiment?
Or was it the proliferation of celebrity cooking and food travel shows, wherein Anthony Bourdain and a crack team of Navy SEALs went undercover in Bangkok to bring back a rare mangosteen, then kept it alive in captivity back in his New York City walk-in cooler? (Slight exaggeration, but not so far from the truth.)
You can read the rest of my column for WBUR's Cognoscenti here.
All I needed to know about life I learned from “Dungeons & Dragons”
I was lucky enough to publish this piece on Salon.com, using the occasion of D&D's 40th anniversary this month to wax poetical about all the life lessons the game taught me.
Here's an excerpt:
I played a lot of D&D back in the 1970s and 1980s. After conquering me, D&D went on to transform geek culture. Not only had D&D invented a new genre of entertainment — the role-playing game — but it practically gave birth to interactive fiction and set the foundation for the modern video game industry. Into “Halo” or “Call of Duty”? You’re playing an incredibly sophisticated version of a D&D dungeon crawl.
After a long hiatus, I play the game again now, as a 47-year-old, mostly grown-up person. Today, with my +5 Goggles of Hindsight, I can see how D&D was subtly helping me come of age. Yes, it’s a fantasy game, and the whole enterprise is remarkably analog, powered by face-to-face banter, storytelling and copious Twizzlers and Doritos. But like any pursuit taken with seriousness (and the right dose of humor), Dungeons & Dragons is more than a mere game. Lessons can be applied to the human experience. In fact, all I really need to know about life I learned by playing D&D.