Most Will Get This Wrong

Rubik's Cube

Readers of a certain age will recall a period in the early 1980s when Rubik's Cube was all the rage (back when people still said things like "all the rage"). For a brief time, it was nearly impossible to go anywhere or do anything without being confronted with some reference to the puzzle or the Hungarian professor who invented it. The item has sold 350 million units to date and is easily the world's top-selling toy of its kind.

LinkedIn is the largest professional social network on the Internet, having grown to more than 260 million users worldwide since its launch in 2003. The first tentative steps into the world of business-oriented social media came with a set of expectations for participation, not unlike those dictating one's professional conduct at an in-person business function. For those who wished to expand their circle of contacts beyond conventional networking channels, the platform was a godsend.

"Solve if you are a genius" puzzle, with the answer being an upraised middle finger

Today, these two seemingly disparate streams have converged. As the lines between business and social networking have become increasingly blurred, the typical LinkedIn user can expect to see at least one puzzle of some sort in their daily feed. On Monday, it's a mathematical equation with the headline "Solve This If U R a Genius" affixed to the top. On Wednesday it might be a word-search puzzle, inviting us to join the 2,703 commenters who previously discovered the inspirational message hidden within. Some puzzles attempt to pry into the psychological makeup of the user, operating as a sort of abridged Myers-Briggs indicator. Other puzzles attempt to predict what sort of career we should pursue based on our first name.

Although seemingly innocuous fun, the choice to participate in a puzzle on someone's LinkedIn feed does raise hypothetical questions. Are we naturally conditioned to think better or worse of a co-worker who didn't know that 7+7/7+7×7-7=50 (or is it 56)? If a prospective employer researches a candidate's background on LinkedIn, do her chances of landing the job increase or decrease because she located the word "universe" in a word-search puzzle? Have our professional standards relaxed to the point where I could bring a Rubik's Cube to a job interview?*
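For the record, the reason these viral equations spark thousands of comments is that the two plausible answers come from two different reading orders: standard operator precedence (multiplication and division before addition and subtraction) gives 50, while naively evaluating strictly left to right gives 56. A quick sketch, using Python purely for illustration:

```python
# The viral puzzle: 7 + 7 / 7 + 7 * 7 - 7

# 1) Standard operator precedence: division and multiplication bind
#    tighter, so this is 7 + (7/7) + (7*7) - 7 = 7 + 1 + 49 - 7.
precedence_answer = 7 + 7 / 7 + 7 * 7 - 7
print(precedence_answer)  # 50.0

# 2) Naive left-to-right evaluation, the usual source of the
#    competing "viral" answer: ((((7+7)/7)+7)*7)-7.
tokens = [7, '+', 7, '/', 7, '+', 7, '*', 7, '-', 7]
result = tokens[0]
for op, num in zip(tokens[1::2], tokens[2::2]):
    if op == '+':
        result += num
    elif op == '-':
        result -= num
    elif op == '*':
        result *= num
    elif op == '/':
        result /= num
print(result)  # 56.0
```

Same seven sevens, two defensible-looking answers; the comment-section argument is baked into the notation.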

Cartoon with the caption "I would have hired you if you didn't post math problems on LinkedIn"

LinkedIn is now such an accepted component of doing business that we are constantly besieged with articles telling us which behaviors to avoid when using the product. Anything can be considered LinkedIn taboo, from failing to include a profile picture to having too few (or too many) personal recommendations. Still, there is little in the way of journalistic guidance when it comes to online puzzles. We want to believe (and the research seems to indicate) that a fun working environment is a more productive one. Most of us recognize, however, that there's a time and place for frivolity, and a time and place to be serious and get things done.

LinkedIn has experienced its share of backlash in recent years. Some users have cited the platform’s increasing failure to properly qualify prospective connections, a by-product of the “broad and shallow” recruiting model often employed by mass networking agencies. Others have accused LinkedIn advertisers of flooding inboxes with spam messages, saying the platform offers too little value for the cognitive investment. Kendra Eash of the New Yorker recently published a column of honest LinkedIn recommendations which, like all satire, hits just close enough to home to be brilliant:

“How can I sum up Judy in just one paragraph? I can’t, because she will probably rewrite it. A brilliant micromanager and leader of team anxiety, she never met a project she didn’t want to take over. Judy has inspired thousands of eye rolls during her time here, and anybody that’s going to work with her deserves to be warned in advance.”

LinkedIn is an easy target because we expect better from a professional networking website. The inclusion of puzzles on our feeds increasingly risks distancing us from our purpose for being there, disguising digital pablum as meaningful engagement. That said, the viral attraction of puzzles might have business value as a mechanism for screening talent. Already, headhunters have begun posting arithmetic problems (“If you can solve this, I might be interested in hiring you”) as a way to thin the field of prospective job candidates.

This is where things get uncomfortable: the realization that important, real-world decisions might be made on the basis of someone's performance on a simple (almost stupid), diversionary task. It's akin to hiring a finance manager because they can beat your top salesperson at Monopoly. How long before such dubious tactics are widely adopted by otherwise principled organizations, and what effect might this have on the future of business interaction?

*Personal caveat: I solved Rubik’s Cube when I was in sixth grade, but for some reason my clients and employers haven’t been impressed by this achievement.


Watch Me Quit

Marina Shifrin dancing in her quit video

In an era when U.S. unemployment figures typically range between five and eight percent (and were as high as ten percent in 2009), it might seem oddly inappropriate to celebrate the mundane act of quitting a job—especially when the event is made public at significant risk to one's reputation.

Yet, this is exactly what people are doing with their final hours at work. People are using company property and time to choreograph elaborate media happenings, recording them on video and posting them online for a global audience. Although we have no idea who these people are, what they do for a living or why they’re leaving, we are captivated by the spectacle.

Girl holding a sign that says "I quit"

It's not difficult to understand why "quit videos" are so magnetic. Nearly everyone who works for a living has been compelled, at some time or another, to seethe silently while enduring an employment scenario that left us wanting more. Perhaps it's a stupid boss who sets us on edge, or a noisy cubicle placed near the break room. Or it could be the vapid nature of most work environments, a landscape overrun with micromanaging leadership, indecipherable buzzword jargon, and organizations that incessantly bang us over the head about their "commitment to culture."

Most of us follow the rules of cordial professionalism when leaving our place of employment: we provide two weeks' notice, complete the required paperwork, and hand off any projects currently in-stream. We might even have a small party or a round of drinks with our coworkers. We promise to keep in touch, say nice things and move on. It's all very benign.

Increasingly, though, departing employees are electing to tender their resignations in the form of elaborately staged events. The sheer orchestration required to pull off these stunts is impressive: there’s the coffee shop barista who hired a barbershop quartet, the Renaissance Hotel employee accompanied by a marching band, an insurance salesman dressed as a banana, and the GoDaddy engineer who announced her new puppeteering career via Super Bowl commercial.

And sometimes, the employers fight back. When Marina Shifrin quit her job by dancing to a song by Kanye West in front of 18 million viewers, her viral legacy was cemented (including a number of copycats). However, her former company countered the attack with a video of their own, taking advantage of the opportunity to elevate their name in front of a newly expanded audience. Today, Ms. Shifrin writes for Glamour and tweets about orange juice, but one gets the sense that her fleeting celebrity will sustain the semblance of a career (for the time being, at least).

From the perspective of the departing employee, it’s difficult to pinpoint the intended benefit of quit videos. It could be the chance to become an Internet celebrity that’s so intoxicating, or it could be something more deeply psychological. Quitting a job is an inherently solitary exercise; everyone else is a part of something, and we’ve made a decision to extricate ourselves from it. Perhaps the mass exposure of fame provides a sense of immediacy, smoothing the transition from one social circle to another.

Factory workers sitting at tables doing menial labor

Still, one notable aspect is that most quit videos begin with a sincere explanation of what we're about to see. Sometimes the tone is almost apologetic, as if the creators are aware that what they're doing is borderline inappropriate but Darn it, I was mistreated and my story must be heard. Taken in this context, quit videos operate as a rejection of the flawed organizational dynamics found in many corporate ecosystems.

Smart, creative employees long ago stopped thinking of themselves merely as human capital. Perhaps this form of social media offers a greater good yet to be discovered: the actualization of the self, and a public opportunity to reclaim one’s dignity in the face of dehumanization. Whatever the rationale, Frederick Winslow Taylor is surely spinning in his grave.

Reading vs. Consuming

One can’t help but laugh when going through old magazines, especially those “best of” issues that boldly predict the most innovative trends to emerge in the near or distant future.

Cover of Smithsonian Magazine

In the August 2010 issue of Smithsonian, for example, the magazine celebrated its 40th anniversary by listing "40 Things You Need to Know About the Next 40 Years." The list includes automobiles that run on salt water, organs and body parts made to order, and world peace finally manifesting as the result of our global population reaching old age en masse.

Some of these predicted events may, indeed, turn out to be prescient. In fact, the last item on the list is already taking place: new habits in personal literacy. The proliferation of mobile technology, with its smooth surfaces requiring swipes and taps to operate, has created a modality in which reading has become more dexterous. Meanwhile, smaller screens minimize the amount of content that can be consumed in one session.

Reading and writing, like all activities, are subject to dynamic influences that shape how written material is created, distributed and received. In 15th-century Europe, only 1 in 20 males could read, and writing was an even rarer skill. The advent of the printing press allowed content to be mass-produced at greater volumes, allowing for less “scholarly” works to see publication (the first romance novel was published in 1740). When it comes to formulating a public aesthetic, technology is not neutral.

Words today have migrated from bound paper pulp to 4.5 billion tiny screens worldwide. We see them illuminated on darkened commuter trains. We gaze absentmindedly at them while standing in line at the supermarket. We sneak quick glances at them while stopped at a red light. We aren’t reading for knowledge or enrichment; we’re filling a moment until the next thing happens.

Illustration by Erik Carter for the New York Times Magazine. Screenshots from Vine users Alona Forsythe, Brandon Bowen, Dems, imanilindsay, MRose and Sionemaraschino.

But what are we reading that's so riveting that it removes us from the present moment? As it turns out, nothing. There is a whole industry around the idea of "borecore," a term used by Jenna Wortham in a New York Times Magazine article this month. She describes how video-sharing apps like Vine and Meerkat operate like a firehose, constantly releasing a voluminous stream of vapid, self-indulgent content more appropriate as time-filler than enrichment:

“Rather than killing time at the mall, in a Spencer’s Gifts or the food court, young people are filming themselves doing the incredibly mundane: goofing around in a backyard pool, lounging on basement couches, whatever; in other words, recording the minutiae of their lives and uploading it for not very many to see.”

Kindle Fire device

A similar trend is taking place with non-video content. We don't read the written word, as we do a book or periodical literature; we consume it. Screens are always on, and we never stop peeking at them. The digital material we consume is highly visual and interactive, requiring a series of finger gestures and precise tapping sequences. Pop-up windows with tiny "close" buttons and interstitial moving images compete for our attention, and every selection we make is recorded by someone, somewhere, for some data-driven purpose we'll never understand.

Reading off a screen requires a user to rapidly formulate a pattern of behavior that rewards distraction. Sitting down with a long narrative, told in a singular voice, just isn't as persuasive when we need a quick blast of information (such as checking a product's customer reviews while standing in the store, deciding whether to make the purchase) or just want to waste a few minutes at the airport until our flight is called.

People standing in line, looking at their smart phones.
Photo by Greg Battin of Autodesk University.

Some might argue that technology has even influenced the quality of content we choose to consume. At its “best,” reading from a tablet screen is no different than turning the pages of a book, because good writing will always transcend whatever medium by which it’s delivered. At “worst,” though, we scan words off a screen between the moments of our lives, like sneaking an unhealthy snack into our mouths when we think no one is looking.

Whatever our opinion on the quality of digital material being consumed, one cannot deny that our retrieval capacity has enhanced our role as content aggregators. We reside in constant, curatorial flux regarding the world around us, and the screen is the first place we look when we need answers. It’s also the vehicle of choice when we want to put something out into the world, even if the end in mind is a book printed on paper. Otherwise, this blog wouldn’t exist.