Earlier this year, I came across an article in The Futurist called “Consumption 2.0” by Hugo Garcia, who correctly points out that consumers are increasingly seeking alternatives to ownership and the accumulation of property. Media libraries, for instance, are becoming relics of the past as cloud-based digital access services continue their meteoric rise in popularity. The Consumption 2.0 argument goes that kids today just aren’t really into owning things. They want the freedom to make easier and faster decisions, and impatiently expect the latest things to appear immediately, in the here and now. Add to this a culture of rapid obsolescence, and it’s easy to see why we’re losing our attachment to owning physical stuff. But let’s not kid ourselves: just because we own less stuff doesn’t mean that ownership and accumulation themselves are habits of a bygone era. In fact, as the Consumption 2.0 trend continues and we become increasingly reliant on rentals, pay-per-use, and the licensing of cloud-based content, ownership will become more important and lucrative than ever. While consumers are indeed seeking alternatives to ownership, nothing is really happening to ownership itself.
According to Garcia, consumers are seeking alternatives to ownership and the accumulation of property for a few reasons. First, overpopulation, crowding and urban density have begun to press on the Earth’s ecological limits. I believe that this has resulted in a psychological aversion to clutter and density, demonstrated by the revival of New Age ‘Zen’ lifestyles, the craze for techniques like yoga, and the design of popular interfaces based on the principle of clear, white space. In general, we strive for a kind of anti-Baroque aesthetic. Second, cultures impacted by Consumption 2.0 value immaterial traits like knowledge, information and reputation equal to or higher than physical things. We’re even accustomed to the idea that the money in our bank accounts is simply a number bouncing through servers. Third, in our work habits we increasingly resemble postmodern nomads, for whom “physical attachments impair mobility”. Fourth, in many places, including those which faced the brunt of the latest economic crisis, a kind of disillusionment has set in with respect to land value and the ownership of property. A kind of synthesis of all these reasons can be found in Japanese geki-sema share houses, also known affectionately as “Coffin Apartments”.
The celebrants of Consumption 2.0 – such as the author of the Futurist article mentioned above – are optimistic about this trend toward so-called “collaborative consumption”, and hail the emerging consumer behaviors as environmentally friendly and socially progressive alternatives to the traditional forms of ownership that were symbolic of a wasteful consumer society.
But rather than a ‘socially progressive’ trend whose end-goal would be the elimination of private property, the trend seems to be driven more by a mindless Zen-style urge to lose yourself, one increasingly being experienced by a cramped and restless Millennial generation. Further, some proponents of Consumption 2.0 use socially progressive terminology, but the trend itself retains – and amplifies – many of the structures that these individuals rail against. In fact, these ‘alternatives’ to ownership are actually serving to funnel tremendous amounts of wealth upwards into fewer and fewer hands. Private property may be abolished for us on the ground, but not for them in the cloud.
It’s important to remember that all those digital albums you’ve ‘purchased’ aren’t really albums at all, but non-transferable licenses for files that you can’t legally resell. Just this month – in the case of Capitol Records v. ReDigi – a federal judge in New York ruled against reselling copyrighted digital files. Without getting into the details here, the court found that reselling digital music files violates existing copyright laws.
As media libraries become a thing of the past and we become subscribers to cloud content, we are increasingly being transmitted content that is highly compressed at its source (i.e. the satellites that orbit the cloud). Satellite radio, for example, sounds like it’s being beamed through a tin can, and the so-called ‘HD’ picture quality offered by large cable providers is often atrociously compressed and pixelated. The decision makers (i.e. the owners of the licenses in the cloud) transmit content in the cheapest way possible. Compare satellite radio, which uses only 125 kHz per channel, to the 200 kHz per station of FM radio… It’s astounding how much worse the quality of these new access-based formats is.
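Taking the figures cited above at face value (real-world bandwidth allocations vary by provider and codec, so treat this purely as a back-of-the-envelope sketch), the gap works out as follows:

```python
# Back-of-the-envelope comparison using the per-channel figures cited above.
# These numbers come from the post itself; actual allocations vary by provider.
satellite_khz = 125  # cited bandwidth per satellite radio channel
fm_khz = 200         # cited bandwidth per FM radio station

reduction = (fm_khz - satellite_khz) / fm_khz
print(f"Satellite radio channels get {reduction:.1%} less bandwidth than FM stations.")
# prints "Satellite radio channels get 37.5% less bandwidth than FM stations."
```

A 37.5% cut in raw channel bandwidth doesn’t map one-to-one onto perceived audio quality, but it gives a sense of how much headroom the cloud owners have shaved off.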
I’m not normally nostalgic, but I’m going to miss used record stores and combing through second-hand shops for content. I resent that future generations likely won’t be able to squirrel their things away and nostalgically rediscover them tattered, scratched, dog-eared and covered in dust. Everything will be eternally available in the cloud, presented in whatever new format the cloud owner deems most economical.
Consumption 2.0 often comes off as ‘progressive’ when – behind the curtain – it’s really just an intensification of the status quo. I worry that as this trend builds momentum, the things we work hard to afford are becoming valueless licenses that cannot legally be exchanged or resold. I would seriously weigh the arguments for and against a trend that seems to be ushering us beyond ownership and the accumulation of property, but Consumption 2.0 really just seems like a clandestine move in the opposite direction.
As I pulled into the parking lot to see Harmony Korine’s Spring Breakers, I noticed flashing police lights and yellow caution tape, and learned that a man had been shot to death there only a few hours earlier. This was my gateway into what was – without a doubt – one of the more nihilistic films of the past decade.
In fact, for 94 minutes I couldn’t stop conjuring scenes from Menace II Society, where the protagonist’s grandpa asks him if he cares whether he lives or dies and he answers “I don’t know”, or when O-Dog, one of cinema’s great unrepentant nihilists, explains his lack of concern with shooting innocent children or seniors with a string of “Shit nigga, I’ll smoke anybody, I just don’t give a fuck. I don’t care who the fuck out there!”
There’s no way around it. Spring Breakers is going to be a misunderstood film. A quick glance at the IMDB user reviews reveals highly divergent comments ranging from “This is by far the worst movie I have ever had to sit through” to “One of my most compelling theater experiences”. But I don’t blame audiences for not getting it. We haven’t had a lot of practice.
It has been some time since the gritty heyday of cultural nihilism in the mid-1990s, when a large chunk of the film’s target audience would have been in diapers. In many ways, Spring Breakers seemed to me a throwback to those dark days, whose soundtrack was a mix-tape of Korn, Marilyn Manson’s ‘Antichrist Superstar’, Insane Clown Posse’s ‘Fuck the World’, DMX’s ‘Flesh of my Flesh, Blood of my Blood’, the music video for Soundgarden’s ‘Black Hole Sun’, films like Pulp Fiction, Natural Born Killers and The Doom Generation, and games like Doom II, Duke Nukem 3D and Postal, an era that would eventually culminate by spilling over into reality with the 1999 massacre at Columbine. It’s been a while since the kids have had a nauseating overload of the ol’ truly meaningless ultra-violence.
If the nihilist satire of the mid-to-late 90s was a warning, a threat of what was looming on the horizon, Spring Breakers finally holds up the mirror to a world that ignored all the warnings and laughed in the face of individuals like Karl Popper, who – at the wise old age of 91 – lamented the invasion of violent images injected into children who “adapt if constantly exposed to extreme situations”, images which act as the key driver of the “now evident deterioration of the Western world…a moral corruption of mankind” marked by the “growth of crime and the loss of normal feelings of living in a well ordered world”.
Spring Breakers offers up a glimpse of a culture that teeters on the brink of total failure and breakdown. It’s a confused state, a kind of neo-tribalism, replete with totems including a hollowed out baby doll bong, an animal carcass helmet, posters of Lil Wayne, and monitors beaming forth the hallucinogenic bright colors of My Little Pony. In other words, Korine succeeds masterfully in presenting a world that is in the process of internalizing the ‘morality’ of 4chan’s notorious /b/ image board.
The film plays out in the manner of a fairy tale, a kind of kamikaze death spiral down the vibrant, glowing, neon rabbit hole of popular culture. Every step of the journey into the new American Dream feels cartoonish, meaningless and directionless. But that’s precisely where Spring Breakers succeeds, in uncomfortable scenes like the one where James Franco’s character Alien – the drug-dealing embodiment of infantile and irresponsible gratuitousness – grotesquely sucks the barrel of a handgun as if he were performing fellatio on his neon-bikini-clad Spring Break “soul mates”.
The irony – and what makes Spring Breakers so delightfully subversive – is that the girls (who include in their ranks former Disney princesses) are driven by the highest and most respected ideals of their MTV pop culture: the carpe diem of #YOLO, the lolita-hedonism that underpins the forever-young gyrating ‘bitches’ and ‘hoes’ of rap culture, and the hypnotizing idol worship of cash wads and glimmering ‘bling’. A particularly memorable scene has one of the girls, upon seeing Alien’s bed lined with rows of bills, exclaiming something to the effect of “all this cash makes my pussy so wet”. Without being preachy, the film offers a forceful absurdist lifeline to the (clearly under-18) kids in the theater, viscerally demonstrating that the drunken, sweaty, infantile ideal of an eternal Spring Break is a nauseating nightmare.
One of our most pervasive conceptual oppositions is the one between the political left and right. In North America, the term ‘left’ is often used interchangeably with ‘liberal’; the term ‘right’ is often used interchangeably with ‘conservative’. But I dare you to try to trace the historical and conceptual lineage of this opposition. The terms themselves, which appear clear-cut, become mind-boggling and begin to unravel as soon as you seriously interrogate them. For example, how strange is it that a word like ‘neoliberal’ should designate the economic policies of political ‘right-conservatives’, while environmentalism, conservationism, and the ‘conservative’ protection of indigenous customs and ways of life are often understood as hallmarks of the political ‘left-liberals’? My head spins just thinking about it… While I am not a historian or a political scientist, over the next few posts I invite you to follow along as I do some casual online research to unravel the roots of our present-day notions of ‘left’ and ‘right’. In no way are these exploratory posts intended to be authoritative, and I want to assure you that I will proceed without allegiance to any predetermined ideological position. I would prefer to let the historical record speak for itself. We’ll get to the present day eventually, but it would only confuse us to begin here.
A journey through the past is often required to arrive at the origin of our ideas. Our current conceptual structures were built onto the edifice of existing ones, or have used them as jumping-off points. While there appears to be a single, clear historical point where the terms ‘left’ and ‘right’ took on their political connotations in the Modern world, I have begun to observe that this was not simply an act of historical contingency and chance. This clear historical point, which will be the focus of a later post, derives from the Assemblée Nationale in France, at the time of the French Revolution in 1789. Essentially, the nobles (in favor of continuity and stability – supporters of the church, king and constitution) sat to the right of the president of the assembly, a position of honor. The Third Estate (in favor of radical change – anti-church, anti-royalist, in favor of overthrowing the constitution) sat on the left. The question, of course, is why the nobles sat to the right of the president.
Well, hunting around the internet, I’ve found out that the act of sitting on the right side has had a long history, as has the opposition itself between left and right: “Right and Left have been invested with deep meanings throughout Western history, and indeed, in every other culture we know about as well. The Right is always good and straight and true and up, the Left tends to be the opposite. For example, the Maori, the indigenous people of New Zealand, have a very complex cosmology that ties many dimensions of life to the left and the right, as do many other peoples. In the bible, the elect sit on the right hand of God.”
The biblical record tells us much about the mindset of ancient peoples and is important to consider given the influence that Christianity has had – and continues to have – on our conceptual structures. I’ve come across an interesting reference in Ecclesiastes 10:2: “A wise man’s mind tends toward the right hand, a fool’s toward the left”. One biblical commentator notes that the right hand is “ordinarily the best exercised, strongest, and most ready, and the left the contrary”; the right symbolizes a sense of command, while the left a want of prudence and management. More generally, the dichotomy embodies the division between good and evil. Another biblical commentator notes that being at the left is being “at a loss for wisdom and understanding to direct him, when he has an affair of any moment upon his hand; which he goes about in an awkward manner, as left handed persons do, and has sinister ends in what he does; and he is to every good work reprobate and unfit, and seeks earth and earthly things, which lie to the left, and in all himself.” The ancients more generally identified things wise and prudent with the right hand, and things foolish with the left.
In Matthew 25:31–33, we again encounter the left/right dichotomy, this time with the assertion that “when the Son of Man comes in His glory, and all the angels with Him, then He will sit on His glorious throne. All the nations will be gathered before Him; and He will separate them from one another, as the shepherd separates the sheep from the goats; and He will put the sheep on His right, and the goats on the left.” The sheep on the right represent righteousness, virtue, honor and innocence, while the goats on the left represent wickedness. The left hand is referred to here as a mark of rejection, the right as a mark of eminence. Or consider Ephesians 1:20, where God is said to have “raised Christ from the dead and seated him at his right hand in the heavenly realms”; and again Psalm 110: “The Lord says to my Lord: ‘Sit at My right hand / Until I make Your enemies a footstool for Your feet.’” Or Acts 2:33, where Christ is said to sit at God’s right hand, which for the Hebrews was a position of power, honor and rank.
But the dichotomy of left and right is more general, and perhaps more ancient, than the Judeo-Christian biblical record. There is evidence that the opposition (like those between light/dark and male/female) is characteristic of human thought throughout many different early cultures. For more on this, I would direct you to a paper by Franco Fabbro called “Left and Right in the Bible from a Neuropsychological Perspective” (Brain and Cognition 24, 161–183, 1994). Basically, the world of primitive humans was dominated by basic oppositions. Among these, the left came to embody the negative values, perhaps because the perceived “incidence of ‘deviant’ individuals, i.e. left handers and ambidexters, was higher among individuals with specific diseases and mental problems.”
Fabbro gives a number of explanations and outlines a long tradition of this dichotomy among the Hebrews, the Semitic world, the ancient Arabs, Ugarites, Egyptians, Hittites, Greeks, and Romans. In India, the god Shiva was depicted as early as the 1st century CE with the right side of a man’s body and the left side of a woman’s. Studies of African cultures likewise reveal that the right hand is “preeminently the strongest, male, good, lively hand, the one which is used to eat and to offer food, and make presents, whereas the left hand is feeble, feminine, wicked, deathful, and used to take things away or to carry out the dirty actions, such as cleaning one’s anus or genitals…etc…”
And consider the following lines from Book VI of Virgil’s Aeneid (29–19 BC), which demonstrate the extent to which “in the Roman world, left and right are emblematic of endless beatitude and endless misery”:
Here is the place
Where the road forks: on the right side it goes
Past Dis’s walls, Elysium way,
Our way; but the leftward road will punish
Malefactors, taking them to Tartarus.
These ancient religious and primitive associations seem to be mirrored in the etymologies of our terms ‘left’ and ‘right’. Fabbro explains that “in most Indo-European languages the term ‘right’ originates from a single common root which has a large geographic extension and great stability over time”, whereas the term ‘left’ is “generally expressed by several different words with limited geographical extension, which apparently tend to constantly disappear and be replaced by new words.” This variety and instability surrounding the term ‘left’ “may be explained by the feeling of anxiety and aversion to the left side typical of the communities/cultures mentioned above.”
Etymologically, the word ‘left’ derives from the Old English ‘lyft’, which – as far back as 1200 CE – connoted weakness and foolishness. In fact, it may have replaced the Old English word ‘winestra’, literally meaning “friendlier”, a euphemism used superstitiously to avoid invoking the unlucky forces connected with the left side. The left side, referred to in Latin as ‘sinister’, meant harmful, unlucky and unfavorable. The Latin word ‘sinister’ derives from Greek influences, such as the practice of facing north while observing omens like the flight of birds. (For the Greeks, ravens were a favorable omen when they appeared on the right; the crow was unfavorable when seen on the left.) One Greek word for left, ‘skaios’, was used to “define something horrible or as an omen for diseases. The Latin ‘skaevus’ stood for the Western part of the world, which was conceived of as ill omened, inept and witless.” The term ‘left’ continues this association in tropes like “left bank”, “two left feet”, and “out in left field”.
Our word ‘right’ derives from the Old English ‘riht’, meaning good, proper, fitting, straight. This is evidenced cross-culturally: “In Norwegian, the word for right, høyre, literally means ‘higher’”. Further, in Hebrew, the root ‘ymn’ (meaning ‘right’) may be translated as “confidence”, and is associated with the word ‘meheyman’, which “stands for a person knowing how to do his job and conscientiously practicing a steady profession.”
As you can see, the story behind Left and Right appears to have begun long before the Assemblée Nationale and the French Revolution.
More to follow…
Rest assured, from time to time I do post about matters unrelated to Heideggerian critiques of Post-Humanism. As a gargantuan fan of horror movies, I’ve often marveled that so few are actually scary. Many are enjoyable for their campiness, goriness, or cleverness, but only a handful would cause the most hardened horror fan to second-guess watching them alone in a dark secluded cabin on a stormy evening. I want to stress that for full effect, you must watch these movies in pitch darkness. I could have waited till Halloween to post this, but I say that any time is a good time to be scared. I won’t include The Exorcist in my list, because it remains in a league of its own. Plus, almost everyone and their mummified mother has seen it.
10. Jacob’s Ladder (1990)
One of the most unsettling psychological thrillers, which also happens to feature some of the most diabolical demonic imagery ever raised from the cinematic abyss. In many ways, Jacob’s Ladder is a more mature, and scarier, version of more recent films like Silent Hill, which obviously borrowed heavily from it.
9. Rosemary’s Baby (1968)
There’s a real feeling of timeless doom pervading Roman Polanski’s Rosemary’s Baby that manages to wallpaper over its late-60s datedness. It’s a film that sticks with you well past its eerie ending. Note: your best bet is the Criterion Blu-ray, released around Halloween 2012.
8. The Haunting (1963)
Perhaps the most important element in ensuring a horror film is actually scary is its use of sound effects. Clever sound effects can transform a good haunted house film into a classic. The Wikipedia entry for The Haunting claims that some of the sounds featured in the movie are very low in the bass range, which can cause physical sensations at high volume. I wonder whether this use of the low bass range is now standard horror practice.
7. Event Horizon (1997)
I hope I’m not going to turn you away by including this black gem, which features Sam Neill clawing his own eyes out. It’s really a shame that big-budget, ultra-violent sci-fi horror has become an endangered species. While not a perfect film, Event Horizon fortunately features enough jump scares, whispering demons, and claustrophobic space tunnels to keep us scared in a time when horror had migrated away from sci-fi toward ‘torture porn’. Plus, you will get to see Sam Neill claw his own eyes out. “Where we’re going, we won’t need eyes to see.”
6. Martyrs (2008)
If Martyrs doesn’t rattle you, please send me an email, because I’d like to meet you. This nihilistic film – part of the so-called “New French Extremity” movement – is the definition of horror. Plus, if you’re a reader of Georges Bataille (or keep a tattered Marquis de Sade book under your pillow) you’ll be “ecstatic” about the second half.
5. Session 9 (2001)
There’s something off-putting about dark, spooky mental hospitals… It’s not just me, right? While this film seems to have fallen by the wayside, it is a very effective – if slow-moving by today’s standards – thriller. I might give it another watch before I visit the Waverly Hills Sanatorium this summer.
4. The Descent (2005)
It’s been a while since I saw this one in theaters, but I remember The Descent as a claustrophobic ‘trapped in the cave’ film that plays on our fear of dark, enclosed spaces…and flesh-eating eyeless monsters. It’s a clever, and nerve-racking, film that becomes increasingly scary as it goes on. It looks like a 3D re-release is in the works as well.
3. The Shining (1980)
If you’re here, you’ve probably seen this Stanley Kubrick classic. Unfortunately, with each passing year (as a result of an accumulating number of pop culture references) the film loses a little bit of its bloody bite.
2. Pet Sematary (1989)
This Stephen King adaptation really has it all: the creepy possessed kid, the ghoulish demented old crone, the secluded country house, etc… A film that many skip over because they assume it’s just another lousy King adaptation, about pets! It’s actually one of the darker and more serious King adaptations out there. If Aunt Zelda doesn’t make you at least think about crawling under the bed, you’re probably an emotionless sociopath. Avoid Pet Sematary 2, which is a total mess.
1. [Rec] (2007)
Like The Haunting (#8 on this list), my hunch is that a major reason [Rec] is so terrifying is its use of sound effects. This Spanish film has no musical score, and relies solely on ingeniously placed, and often jarring, sound effects. Consider the sound of the body that is thrown from the top floor of the apartment building early in the film. If you have surround sound and that doesn’t jar you, nothing will. Despite what its first 20 minutes suggest, this is not your usual shaky-cam nonsense. Plus, the sequel [Rec] 2 – which picks up right where [Rec] leaves off – is not half bad either…in fact it’s one of the better horror sequels out there. Avoid, however, [Rec] 3, which is an abysmal mess.
As our headfirst plunge into the digital future gains momentum, any astute critic should take note of the resistances that have emerged over the past year. In fact, while typing this post, I stumbled across a new article on CNET called “Google Glass: The Opposition Grows”, which discusses an emerging “anti-cyborg movement” that has sprung up in response to Google’s new augmented reality glasses.
But are these resistances just examples of the alarmist rhetoric that accompanies any technological revolution? To answer this question, it helps to remember how conceptually useful it can be to take a new idea or trend to its end – to an absurd point that may never come to pass – in order to see how dangerous the germinating idea may be. Every generation has critics who do just this, like the dystopian authors (Atwood, Orwell, Huxley, etc…) who magnify new trends and imagine how they might interact with the world if they were to become hegemonic.
It’s here that I want to discuss a March 2013 report by JWT Intelligence called “Embracing Analog: Why Physical is Hot”, which I eerily encountered on the same day I watched Season 1 of “Black Mirror”, a fantastic British TV show that offers alternate realities where familiar aspects of contemporary technological life (the hypnosis of real-time Twitter feeds; the disconnection from human interaction offered by the Microsoft Kinect and Nintendo Wii; the privacy issues that surround Facebook Timeline and Google Glass) have become the modus operandi of daily life. I say eerily because the JWT report echoes the dystopic intention of Black Mirror when it demonstrates that “Immersion in the digital world [the world of the black mirror] makes us more keenly aware of what’s unique about physical objects.”
The JWT report, “Embracing Analog”, echoes a longstanding position of Cybject: that as time passes we are increasingly clawing for ‘real people’ and ‘real things’, as demonstrated by phenomena like Instagram filters, iPhone speakers shaped like gramophones, the revival of interest in vinyl records, etc. While digital content is easier, faster, more convenient and cheaper, the report suggests that American adults find comfort in the physical world, and “romanticize the physical, ascribing more meaning to giving and receiving physical objects versus digital versions of the same things”. Basically, as has also been suggested on Cybject, digital content is devoid of the imperfections that give physical objects their personality.
I find the JWT report fascinating, and am entirely in agreement with its diagnosis, namely that there is an analog counter-trend underway. But I’m not so sure we have adequately understood why this counter-trend is underway, and I suspect that the authors of the report have only scratched the surface. They suggest that “when everything becomes digital and immediately available, one starts to yearn for the analog”, and explain this yearning as the result of an upset between the IQ (intellectual) and EQ (emotional) sides of our personality. Of course, in their schema IQ maps onto digital content, and EQ maps onto analog content. When we spend too long with digital content, we upset the balance. When our EQ is not being satisfied, the authors explain, we seek the “analog more than ever. We’re looking for more meaningful emotional experiences and connections. We’re seeking to re-balance our IQ and EQ states.”
“As alluring as the digital world may be, we’re beginning to realize its limits.”
The report is an excellent start, but I’m not sure the situation is as cut and dried as the authors let on. I’d like to suggest – and this might offer a very bleak picture – that we’re not quite as hardwired as the authors make out. What if we aren’t equipped with a kind of natural defensive reaction to the loss of our EQ, and wouldn’t miss the imperfections symbolized by analog content? What if the analog counter-trend underway is just simple nostalgia, or a type of transitory conservatism? Perhaps, more darkly, we’re more adaptable than the authors give us credit for, and our nature has no natural limits. If this is the case, we cannot sit back and wait for the ship to change course, but must chart the course ourselves.
“Black Mirror” exemplifies the analog counter-trend by taking a quickly growing digital trend, and providing a glimpse of how – once dominant – the trend could seriously conflict with our fleshy, desirous, bodies and our existing ethics and morals.
Episode 2, “15 Million Merits”, reflects a world where individuals live in isolated cells, bombarded by advertisements which they are financially penalized for closing their eyes to. Individuals’ lives consist of riding exercise bikes to power television monitors. When they have generated enough power, they can use their credits to audition on a 24/7 American Idol-like reality TV show watched by millions of howling avatars. Of course, this is an absurd future, but the protagonist, who has the misfortune of falling in love, has a very difficult time reconciling this emotion, which emerges from the reality of his body – very familiar to us in the year 2013 – with the insensitive, hyper-sexualized, parallel earthlings trapped in their vicious hyper-real circles. It’s an episode which yearns for flesh, bodies and the mystery of proximity.
Episode 3, “The Entire History of You”, offers a glimpse of a world very much like our own, except that people have their entire past stored on a chip; everything they experience through their eyes is recorded onto it. Another very familiar emotion, jealousy, is held up to the black mirror, and what emerges is a serious disjuncture. This is a future where one cannot lie, one cannot forget, and one cannot heal. The reason we are able to move on after a difficult chapter in our lives is that the vividness of experiences fades with time. The Google Glass-style technology ensures that events from the past exist on a timeline parallel with the present. And in this scenario, the familiar mechanisms we use to cope with jealousy or infidelity lead to madness. It’s an episode which yearns for the fleetingness of brain-based memory.
By offering these nightmarish technological scenarios, “Black Mirror” reminds us that we may not innately possess the natural balance, sensory ratios, soul, reality principle, etc…, to resist a hegemonic digital future. It puts the onus on us both to shape and adapt our culture, while deciding what is worth saving in our nature.