The pop culture catch-phrase generator and communications theorist Marshall McLuhan used the myth of Narcissus as a metaphor to explain the interactions between human beings and media in his groundbreaking work, Understanding Media: The Extensions of Man. Poor Narcissus was mesmerized by the beauty of his own image reflected back at him in the surface of the water, and McLuhan saw the myth as a perfect reflection of the way we all become fascinated by extensions of ourselves in any external material. Today we would likely find Narcissus drowning in a Styx of social media, and research suggests that in the age of the selfie, we are all likely to see ourselves reflected more and more in the media landscape. But, this is only part of the picture.
Perhaps we can put forward a more apt metaphor for the way we all wander through our media-saturated environment: self-obsessed, true, but also unable to truly engage with each other's viewpoints. Rather than Narcissus, perhaps we are Theseus navigating the Cretan labyrinth.
The myth is a simple one, and well known. The Athenian hero Theseus offered to slay the Minotaur, to which his countrymen were regularly being sacrificed, seven young men and seven maidens at a time, as demanded by Minos, the king of Crete. Accounts vary as to what led to this arrangement, but most agree that Minos' own daughter Ariadne fell in love with Theseus and offered him the key to maneuvering through the labyrinth where those sacrificed encountered the Minotaur; her gift was a ball of thread that would allow him to retrace his steps.
To begin applying this myth to the media context, we must start with the maze itself. The original labyrinth was constructed by Daedalus, perhaps most famous for the wings he fashioned to escape with his ill-fated son, Icarus, after the two were imprisoned to protect the secrets of his design. It is said that Daedalus did such a fine job of constructing his labyrinth that he himself was hard-pressed to find the exit, but even his creation shrinks in comparison to the digital maze we encounter today. Our brains encounter billions of bits of data every second, but various physical filtering mechanisms help to reduce this to a more manageable 2,000 bits per second. So how does our Theseus manage this digital maze? The answer is simple: he solicits the aid of Daedalus himself and constructs a maze built just for him. He does this through a process of further psychological filtering.
This process was explained in Umberto Eco’s novel The Mysterious Flame of Queen Loana. Eco’s protagonist, Yambo, suffers from a form of amnesia that leaves him devoid of any personal memories but with a near-encyclopedic knowledge of every book he’s ever read. In attempting to explain the narrator’s amnesia, the specialist Dr. Gratarolo reveals an important insight about why our perceptions of reality are selective: “if we had to record and store all the stimuli we encounter, our memory would be a bedlam. So we choose, we filter.” Then Dr. Gratarolo takes it a step further by explaining the specific mechanism that drives the filtering, telling the narrator: “You recorded the elements most directly associated with your emotions, your desires, your goals.”
The bulk of the book involves Yambo rediscovering himself by sifting meticulously through the various media he consumed and collected growing up, much of it propaganda-laden pulp from Mussolini’s Italy that he at some point became mature enough to question. This process grows tedious for readers along the way; I for one was reminded of the hours I spent growing up surrounded by the museum of hand-me-down toys from my older brothers, entertaining at times, but all worn and out of date. The stakes are rather high for Yambo himself, whose very identity is on the line, but readers sense the constructed nature of the situation. Still, the concept of the novel fascinates us as we join Yambo, who is now able to re-experience the process that most of us go through only once, and blindly to boot. In a sense, the collection of pre-digital media from his youth was both the original blueprint for Yambo's maze of personal identity and perspective and, later, the thread that allowed him to reenter that maze after amnesia had ejected him from it.
To apply this to the digital format, let's turn to a study done by FILTER, an EU-sponsored organization tasked with “analyzing hidden filtering mechanisms that hinder accessible, fair and affordable knowledge.” The researchers identified several layers of filters that users of the Internet experience, starting with those that exist within the user. The most basic are the sense perceptions that make certain web pages stand out to us. More important are our individual knowledge filters: they claim that “meaning is created when an individual absorbs information and merge[s] it with previous knowledge and insights,” a process shaped by education, values, and culture.
To expand our metaphor with the imagery of today's media, we can envisage our Theseus as a character in a video game. So far we have been choosing our level, picking our course, tweaking the settings, and arming our character. Now we can start the game. Let's get our Theseus exploring the virtual maze and doling out his thread. Several psychological mechanisms keep our character from straying far from his fellow Athenians or getting lost. The first we will consider is confirmation bias, a concept widely known in the research community as the bias towards results that confirm the hypothesis we are seeking to prove. Applied to everyday consumers of media, it means that we seek out the media outlets that promote viewpoints we agree with and avoid the channels we disagree with: if our Theseus is a conservative he watches Fox News; if a liberal, MSNBC. It seems reasonable to assume this effect is amplified on the Internet, where consumers are not flipping through channels but are only exposed to content they are actively seeking.
But don't forget that Theseus has a mission: he has to slay the Minotaur. And so, working in tandem with confirmation bias is a disconfirmation bias, the tendency to actively discredit any information that challenges our worldview (while accepting at face value anything we agree with). If our hero is a conservative who accidentally pauses on MSNBC, he becomes a “motivated skeptic” when evaluating the views he encounters there (while cheering uncritically along with O’Reilly’s latest rant). Haven't we all found ourselves yelling back at the television set when we don’t like what we hear? Yet we nod rather passively in agreement with our media confederates. Just like the ancient statues celebrating the mythic Theseus, cast frozen in his moment of triumphant struggle against the great beast, we are all forever engaged in this battle.
Consider the work of Taber and Lodge of Stony Brook University in New York, published in 2006. Studying participants’ reactions to a balanced list of pro and con arguments regarding either gun control or affirmative action, they found that participants were deeply influenced by a prior attitude effect, seeing the views they already agreed with as having the stronger arguments. More interesting was the resulting polarization effect: participants held more extreme views after reviewing a balanced list of arguments. These effects were only amplified among those with more background knowledge and sophistication regarding the issues.
What is most alarming about these results is that they occurred among participants who were reminded, even encouraged, to keep an open mind. These were political science students, experts in training. The researchers insist that despite their best efforts at objectivity, the participants were unable to control their biasing impulses and, further, were unaware of them.
American media has claimed as its hallmark of professionalism the concept of neutral reportage. The trap of American journalism is not partisanship but uncritical he-said/she-said, just-the-facts stories. This style of news coverage, while seemingly the most balanced, has several shortcomings, as NYU journalism professor Jay Rosen has contended for years through his blog PressThink and elsewhere. It too often leads the media to parrot without question whatever messages the powers that be may be peddling. False statements become newsworthy simply because of their source; the coverage in the run-up to the Iraq War is often cited as a prime example. And it is such pro and con reportage that triggers the psychological biases we have been concerned with here. A balanced report does not lead to a balanced outcome, but merely to an entrenchment of existing perspectives.
A recent study by the Pew Research Center supports this understanding of what's happening in online communications. By mapping Twitter chatter about political topics, the researchers commonly found what they labeled a "polarized crowd," meaning two distinct groups talking about the same thing, but relying on completely different sources of information with little communication between the groups. Pew found at least five other types of communications clusters, but generally none of them reflected persons of differing viewpoints engaging meaningfully with each other.
Thus, we modern humans find ourselves surrounded by a wide variety of interconnected media that bombards us with more information than our biology can bear. This forces us to filter the data stream down to bearable limits, a function of basic survival. Our psychology directs us to seek out the messages we agree with and avoid or attack those we don’t, and technology increasingly makes this possible. A Civil War buff no longer has to wait for the weekend reenactment; he can watch the documentary on YouTube, listen to a related podcast, download an article to his smartphone, and blog about it all while he should be working. You can just as easily replace Civil War buffs with Tea Partiers, 9/11 Truthers, Birthers, Tedsters, or Zeitgeisters.
The modern day Theseus is armed with a keyboard and a mouse, ready to do battle with any Minotaurs of opposition voices. Yet, he is forever anchored by the string in his other hand, reminded and reassured of his own worldview, tethered to it as he winds his way through the digital maze.
On the one hand, we could use this metaphor simply as a descriptive model of a phenomenon. But is there anything wrong with this image? This is where the walls of the maze start to shift on us. Although our Theseus has a hand in the construction of his own maze, the Daedalus of the digital age has a few more tricks. Remember, some of the filtering is done by our physical and psychological mechanisms, but now it is also accomplished by design: the design of corporate interests, and often without our knowing.
The design of the Internet itself revolves around additional filtering systems. Controversial data collection by Google and other online companies has aimed to enhance the user's experience by personalizing search results. According to Eli Pariser, author of The Filter Bubble and chief executive of Upworthy, Google bases this personalization on 57 signals, including location, computer type, and browser. If you search “Minotaur,” you may get an entirely different result than I would. The whole point is to give you more of what you want, faster. Other websites, like YouTube and Amazon, offer users suggestions based on their prior habits and the habits of those with similar interests. Advertisers have already jumped at the chance to individually target consumers based on algorithmic analysis of their habits and interests. Despite privacy concerns, most users will likely respond positively to these technologies. People are already choosing apps like Flipboard and Facebook's Paper to get news tailored to their viewpoints and interests. Who wouldn’t want to be ensconced in a world where everyone seems to be reading the same books you read, listening to the same music you like, and talking about the issues you are interested in?
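The "habits of those with similar interests" mechanism described above is, at its core, collaborative filtering. The following is a deliberately minimal sketch of the idea, with hypothetical users and viewing histories; no real service's algorithm works exactly this way.

```python
# Toy collaborative filtering: recommend what your most similar
# neighbour consumed that you haven't. All names are invented.

histories = {
    "alice": {"civil_war_doc", "history_podcast", "battle_reenactment"},
    "bob":   {"civil_war_doc", "history_podcast", "gettysburg_lecture"},
    "carol": {"cooking_show", "travel_vlog"},
}

def similarity(a, b):
    """Jaccard similarity between two users' viewing histories."""
    return len(a & b) / len(a | b) if a | b else 0.0

def recommend(user, histories):
    """Suggest items watched by the most similar other user."""
    others = [(similarity(histories[user], h), name)
              for name, h in histories.items() if name != user]
    _, nearest = max(others)
    # Everything the nearest neighbour watched that `user` hasn't.
    return sorted(histories[nearest] - histories[user])

print(recommend("alice", histories))  # → ['gettysburg_lecture']
```

The maze-building effect is visible even in this toy: Alice, the Civil War buff, is steered toward more Civil War material, never toward Carol's cooking shows.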
The real danger proposed here is that our individualized media mazes, first personally constructed and then technologically engineered, risk ensnaring us forever. The rise of social media, Facebook and Twitter in the US and around the world, but also Kakao Story in Korea and Weibo in China, enhances the effect of each of us isolated within our own little maze of friends, associates, and like-minded bloggers walled off within the web. Interest and involvement in the virtual online world of Second Life has remained steady; in fact, participation has grown in similar online communities, particularly gaming communities. As further such applications develop, we will all inhabit separate realities; even a visit to Google Earth will yield fragmented results as the information presented or promoted derives from your interests or your friends' experiences and recommendations. The development of Google Glass and other augmented reality applications suggests that these virtual walls and corridors will literally bleed into the physical world.
There have always been those who have difficulty separating reality from fantasy: the RPG-addicted teens or lonely soap-opera fans we view with a mix of pity and disgust, or perhaps even with fear in the extreme version portrayed by the homicidal fan in Stephen King’s Misery. However, plenty of research into perception and memory suggests that everyone's grip on reality is more malleable than we might imagine.
In the 1970s, Elizabeth Loftus began investigating the validity of memories in a series of groundbreaking experiments that revealed that eyewitnesses were susceptible to a misinformation effect. In the experiments, participants viewed films and slides of accidents. Loftus showed that their memories of these events could be significantly altered by false information embedded in the language of follow-up questioning. For example, questions that used the word “smashed” to describe the event led witnesses to remember broken glass that never existed far more often than if the question used words like “bumped” or “hit” instead. The research demonstrated that when false information was embedded as an assumed fact within a question, participants remembered colors and objects that never existed.
According to Gary Wells and Elizabeth Olson of Iowa State University (ISU), “eyewitness researchers have noted that mistaken identity rates can be surprisingly high and that eyewitnesses often express certainty when they mistakenly pick someone from a lineup.” In other words, people feel certain of their completely inaccurate memories.
Probably most relevant here is a recent study also from ISU. Participants in one experiment viewed an episode of the popular television drama 24. Half of the participants were immediately questioned about the show while the others were not. All participants were then shown an eight-minute summary of the episode that included false information. Those who were immediately questioned were significantly more likely to recall this misinformation when questioned a week later. In other words, those most consciously engaged with the material were the most easily misled.
This research exposes the immense power that the media has in shaping the public’s memory of important events. As breaking news is reported, journalists are pressed to begin creating a narrative, which is absorbed by the viewers. Even those who were directly involved absorb the messages, perhaps altering their memories of the event.
The need for immediacy in the 24-hour news cycle makes it inevitable that false information will end up being broadcast, whether intentionally or accidentally. Poking fun at the numerous examples of errors made by CNN has become a staple for The Daily Show's writing team. Such mistakes are only exacerbated by the constant repetition of the false information on stations desperate to fill broadcast minutes, which is then easily copy-and-pasted endlessly in various digital formats.
It has often been noted that regular viewers of Fox News were more likely to falsely believe that Saddam Hussein was responsible for the attacks of 9/11. Since confirmation bias leads us to believe our favorite talking heads uncritically, when false information is perpetuated in the vast blogosphere, people are likely not only to agree with their favorite opinions, but also to believe false facts supporting those opinions, and even to misremember actual events.
What we are faced with is a true labyrinth, complete with shifting walls, CCTV, and NSA backdoors. We now have a far more sophisticated understanding of how this phenomenon works just in time to see the effects of it accelerate in tandem with exponential technological growth. This is a game we have always been playing, spouting the rhetoric of our own tribe while sparring instinctively with the Minotaurs. But, the next gen version of the game offers a more thrilling, globally connected online experience with more levels of difficulty and more devious programming than Daedalus ever devised. Unlike the statues of the mythic Theseus, though, our digital hero is made of softer stuff, less solid and less real. GAME OVER. PLAY AGAIN?