*Nocturne*, James Abbott McNeill Whistler

In 1999, an advert ran in movie theaters across the UK with the strapline: “Nobody forgets a good teacher.” I knew who my unforgettable teacher was: a woman who taught my class when I was in, I think, the equivalent of the third grade. She was memorable because she was a product of a different time. It wasn’t just a matter of her seniority, for that wasn’t unique among our teachers. It was her life experience. She would tell us stories about an exotic place she called Ceylon, the then-British colony of what is now Sri Lanka, where she’d grown up the daughter of a senior military figure.

She told tales of king cobras and wild elephants, but they are not what causes her to linger in my mind. Rather, I remember a talk she gave us before a trip to the nearby museum. We were lined up in the playground. She gave instructions on what is perhaps best termed “comportment.” She told us to walk in an orderly fashion, shoulders back, spines straight. She instructed us to keep an eye on our fellow pupils, and to give assistance to anyone falling out of line. We were told to be quiet, of course, and not to be in any way unruly. We were instructed always to stand aside to make way for an adult coming down the sidewalk: “Always be the first to step into the gutter.” This lady also insisted on confessing guilt and owning up when you’d done something wrong. If another boy was in tears because someone had stolen his football, or the class was being kept behind after a profanity was found etched into one of the desks, she always dealt differently with the guilty party if he simply owned up. She had no time for those who’d hide behind the group to try to avoid punishment. She assumed boisterous lads would always do this sort of thing. So rather than trying to stop them, she taught them to take responsibility for it and to face the consequences with some dignity.

She must have been brought up in the 1930s. Her morality presumably stemmed from her own parents and teachers, who would have been products of late Victorian and Edwardian England. Her instructions on how to comport oneself in life gave a glimpse into an old world. Her strange moral teachings carried an air of exotic mystery about them, like the aromatic spices of Ceylon itself. They were mysterious because we had no inkling why these convictions about personal comportment were so important. Her tutelage was only ever delivered as a fait accompli. She simply had a self-assured certainty about such things; they were, she would say, “proper” and “decent.” I doubt this lady of Ceylon is still alive, but if she is, one wonders what she would make of life today. There is little to guide people in how to conduct themselves, even less about bearing some responsibility for keeping others in order, and today’s discourse seems not to countenance values like decorum and discretion.

*****

The last years of my schooling were spent at a high school in an area of North London stereotypically associated with left-liberal bourgeois sentiment. Tony Blair lived just down the road before he moved to Number 10 Downing Street. The local authority that ran the schools pushed whatever progressive initiative was in favor at the time, and the style of education ran counter to that imparted by the lady of Ceylon. There were no uniforms, we called teachers by their first names, and smoking was permitted in the school grounds. Education was approached there as a means of teaching people to cope with adversity. Although the area has some serious wealth, it also has plenty of poverty. The kids in state schools like this one didn’t share the backgrounds of those pushing the progressive methods; the Blairs sent their children to an expensive school a few miles away on the west side of town. Because of the material deprivation among some of the pupils, and the volatile home lives that tend to accompany it, the teachers saw their job as helping the kids cope with life. This took shape through trying to help the pupils get whatever grades they could to secure a stable income through white-collar work.

Most of the kids who engaged with their schoolwork wanted to be lawyers or doctors. These two professions served as the archetype for the ‘struggling kid made good’ narrative of many films and TV dramas. Cue a scene with exhilarating music: footage of a young person writing away in a library, then an acceptance letter for a scholarship being opened as the soundtrack builds to a triumphant climax. Some of the wilier pupils would comment skeptically on the sheer number of budding doctors and lawyers at that unremarkable school. They knew these ambitions came from popular media. There had been a shift a decade or so earlier whose effects were visibly coming downstream by this point. Sometime in the 1980s, the sheer level of exposure to media had begun to spiral. VHS tapes, proliferating TV channels, even video games meant that these young minds had their identities mediated to them through technology before they’d actually started to live.

One can see how the progressive mindset, once peculiar to certain districts, has become much more widespread today. The dominant narrative there was about being a victim of adversity, and the forces behind the adversities were presented as intrinsically destructive. These forces were seen as determining one’s life. They were not challenges to be surmounted but things that must be pulled out at the root, which meant they stayed insurmountable. Deliverance from these forces was promised through sharing one’s oppression with others who share aspects of one’s identity. In this way, identity was mediated through a structuring account of one’s circumstances: oppression and liberation, privilege and disadvantage.

In a setting with real poverty, this was natural enough. A peculiar aspect of today’s education, however, is that this familiar plotline has become mainstream in bourgeois settings. It is now at work in provincial and suburban schools, where the identities at stake are untethered from material deprivation and social class. Perhaps this is because such accounts are deeply seductive. In their most pronounced forms, they lead to avoidance of personal responsibility. Rather than dealing with an immediate challenge, the forces behind some difficult scenario are classified into an ‘us and them’ structure, which leads to an evasion of the circumstances of one’s own life. There may also be a deeper and more subtle phenomenon occurring, whereby an assumed narrative threatens to displace personal identity. The specificity of an individual’s life-trajectory is in danger of being displaced by a story. Who one is then becomes not just inextricable from, but even indistinguishable from, some oppressed characteristic. Some of these characteristics can of course genuinely disadvantage someone, but treating them as fully determinative is destructive of personal identity.

The self-assured certainty of the teacher from Ceylon is rare today. Much of today’s discourse involves the firing of shots between different narrative structures, and the setting up of one particular narrative as more desirable than others. Stories have of course accompanied human life since early Homo sapiens assembled around the campfire after a day of hunting and gathering. But recent findings on the deep plasticity of the brain, combined with our now-permanent saturation in digital technology, suggest that a profound re-ordering of our neurological apparatus is taking place. It could be so surreptitiously embedded that people do not even see it. Technology so consumes the horizon that we no longer acknowledge it. It seems to exhibit its own sinister modus operandi. The machines will not countenance the defining characteristic of being human: each person’s unique specificity. We must be subsumed into code.

*****

Only two or three decades ago, life itself was primary, and the mediation of life through portrayals of other lives was secondary. Yes, of course, kids have read books and watched films for generations, but children’s literature was originally fantastically unrealistic, and films were not available on tap but in the movie theater. In contrast, take the common scenario whereby a four-year-old child plays games on a tablet. At this age, the brain is particularly tender and malleable, and will form its neurological patterns and structures in response to experience. Imagine this pliant, ever-adapting brain trying to steer the image of a bubble-shaped vehicle at high speed around a racing track. Consider how the neurological circuitry responds to the onslaught of inciting sounds like ‘ping!’ and little rounds of applause whenever he picks up a coin or an extra life. Think of how frenetically the scene flickers before him, with coins flying across the screen, little endorsement signs bobbing up and down when he does well, the snarl and growl of a baddy frustrated at being overtaken. Suddenly little ramps or raised bridges appear, causing the little bubble-car to fly upward and float in the sky. A lurid otherworld is revealed, panning out to infinity below. The child barely registers the saccharine, candy-colored sky before the unthinkable happens: crash! A brightly flashing tab saying ‘Next Game’ appears. Off we go again.

Another example is YouTube animations, whose viewing is potentially infinite thanks to algorithmically generated playlists. When you see young children exposed to the strange subgenres of this medium, it is shocking how they are stupefied into a deep trance. At first glance, it’s innocuous enough. Brightly colored characters dance around in simple patterns to nursery rhymes. It needn’t matter that they’re low-budget constructions, almost fully computer-generated in some cases. But their unrefined nature means that normal 2D and 3D perspectives are blurred and permeable. This subtly decenters you, as if you’re falling between the cracks of reality’s most fundamental structuring. Things get more disorienting when you try to follow the spatial coordinates of the scenes – as a giant laughing gnome rides a toy train through a tunnel, then flies through the sky before morphing into an unearthly scene where the characters dance in a circle. You feel weird, because these coordinates aren’t following the normal rules of space. Conflicting coordinates overlap and interlace, thrusting you into a topsy-turvy hall of mirrors. Eerily dispassionate synth-harmonies crudely mirror the basic rudiments of human emotion. The sounds intertwine with the coarsely psychedelic landscapes, which stretch to infinity. The eye adjusts to an endlessly rolling scene of blue hills and bright pink skies. A bright-red sun bounces up and down, and smiling purple clouds radiate out in steady, infinite streams.

An adult can sense the assumed author of these candy-colored vistas and marmalade skies — the global conglomerates accruing profit, somehow, behind the scenes. They have a level of direct access to the infant brain of which even the most accomplished hypnotist could only dream. Much of the viewing takes place with sound pumped directly into the cranium via headphones, deepening its hypnagogic effects. The wearing of headphones accompanies a broad range of adult pastimes too, of course. Emotional lives are narrativized by song as people get from A to B on public transportation, or work out in the gym. Each person on a treadmill is unwittingly pushing themselves between the cracks of the real, running blindly into the dream-like otherworld between 2D and 3D. They bounce their way along, chasing those serotonin-uptake endorsements (‘ping!’). The steady, bouncing pace synchronizes with the pulsing bright-red sun from which smiling purple clouds stream forth. There are many such examples. Portrayals of reality are displacing the real through technological mediation. Technology morphs neural wetware to sync it with the algorithms. The stupefied and happy human is the machine’s most vaunted bounty.

Plato famously condemned art as “imitation” (mimesis) of reality, particularly dramatic poetry. He highlighted the deleterious effects of narrative on actors and audiences, who would over-identify with narrative personas and lose sight of themselves. His basic point is that visual imitations of reality ensnare people in their dynamics, which is dangerous because of their oversimplification. Drama would displace the development of virtue. People would lapse into playing behind superficial masks rather than accept the tiresome business of acquiring (often unpleasant) knowledge about themselves. Narrative displaces the formation of character. Rather than dealing with the circumstances in which they find themselves, people narrativize them before they’ve really tackled them. One’s own life recedes behind the candy-colored horizon. Then you’re living in the far more palatable world of goodies and baddies. The fundamental truth at the center of being human is displaced. Aleksandr Solzhenitsyn spoke of a line between good and evil which passes through each and every human heart. This fundamental axis shows that the coordinates of human character are horribly complicated. Myopic evil coexists with the most noble goods. The algorithms have to simplify these multi-dimensional tectonics. The unrecognizable reality of who we are must be captured by a flattened-out form of spatial recognition.

A legend about early cinema offers a much cruder version of Plato’s fears, but it also indicates how technology intensifies visual portrayals of life. An 1896 film of a train pulling into a station, L’Arrivée d’un train en gare de La Ciotat, is said to have caused a stampede in the movie theater. There’s also Orson Welles’s radio dramatization of H. G. Wells’s The War of the Worlds, which is said nearly to have provoked a spontaneous evacuation of New York City. The same point is at play in the endless debate about video-game shoot-’em-ups, reignited yet again after the Christchurch shootings. The placement of the killer’s camera almost exactly replicated the perspective of the first home-console first-person shooters.

*****

The move toward a technological imitation of the first-person perspective is particularly threatening, for it gives a clear indication that one’s own sense of self – the ‘I’ as the subject of experience – is being replaced by characterization. The basic perspective of narrative is third-person. We watch a person as an observer. It doesn’t matter if the viewer has access to the dynamics of that person’s inner life, for insofar as those dynamics are narrated, they too are viewed as an observer would perceive them. They cannot actually be experienced in the first person; that experience belongs uniquely to you. First-person portrayals put the observer’s third-person perspective in that impenetrable place. People today, in the midst of spontaneous happenings and events, are not responding im-mediately to occurrences as they happen, but viewing their own experience as observers, unwittingly configuring their responses in line with an assumed audience whose approval they crave (ping! — round of applause). They’re falling into the crack between the first and third person.

This is impossible to prove, which shows that human subjectivity remains as mysterious today as the most far-flung lands were in the past. One indicator, however, is that the means of address in the third person – the third-person pronoun – has been presented as the sacred territory of a person’s identity. One’s inward identity is then indistinguishable from the way observers signify how they see that person. A saccharine parallel dimension of rainbow colors appears: the wokeworld, a world which lives in the cracks between the unavoidable truths of unexplainable realities. Narrative resolves, but life subverts. Those initiated into the theoretical bases of the wokeworld, particularly academics, take their place in the story. They’re not the baddies; they ensure they somehow transcend the scene. They take the place of the multinational conglomerates, perpetuating the tale so it pans out to infinity, while profiting handsomely from their audiences’ stupefaction.

*****

Awareness of a mass evasion of the development of character has grown apace in recent years, not least due to Jordan B. Peterson. Much of what he says is salient here: about the rediscovery of that practical wisdom which has been wished away as the fruit of European men, about the unavoidable differentiation of male and female archetypes in ancient traditions of human self-understanding, and so on. But even in Peterson’s work, the basic, unassailable reality of one’s own authentic first-person center is displaced. The framework has just switched from victim to victor. He tells his followers to treat themselves as if they were another person they’re responsible for helping – that is, to accept a fictional premise as real. He teaches people to compare themselves to who they were yesterday, not to others. He thus instructs people to observe themselves from the outside, and then reform themselves in line with how they’d like to see themselves. As a Jungian psychologist he understands the power of narrative. He’s right that what “we subjectively experience can be likened much more to a novel or a movie than to a scientific description of reality.” But his work is premised on the possibility that our subjective experience can be narratively reordered. The danger of displacing one with the other is therefore still present. He says “you must see yourself as a stranger – and then you must get to know yourself.” He even recommends having a literal conversation with oneself, thus straddling the first- and third-person perspectives: “Imagine that you are someone with whom you have to negotiate.” A new narrative emerges triumphant, so cue the exhilarating soundtrack: “Define who you are. Refine your personality. Choose your destination and articulate your Being.”

For Peterson, life is about apportioning meaning to one’s existence. There’s much sense in this, but his notion of meaning is too loose and broad to serve the purposes he has for it. If a guy breaks into your apartment brandishing a knife, there’s no time to ponder what the experience means. The pondering comes later. Maybe you’ll conclude you needed to discover untapped reserves of bravery, or that you were naïve about the human condition and needed a rude awakening. Then the unjustifiable reality of what happened has been resolved. The subjective ‘having’ of experience should always be prior to the apportioning of meaning, but not in the technological age. Now, interpretations of experience interlace with experience itself. People are turning into film critics of their own selves, lapsing into simplified dramatis personae. An exactly opposite stance is seen in old manuals for sacramental confession. Penitents were told simply to list their sins factually, with no contextual mitigations, no interpretations, no narrative resolution to make sense of what happened. Then they took responsibility for that internal axis of evil of which Solzhenitsyn speaks – however unfair and inexplicable it is.

My teacher who told tales about Ceylon might well find an ally in Peterson. He does offer to guide people in how to conduct themselves, and even matches her words exactly when it comes to personal comportment. But there is a key difference here, and it is Peterson himself who shows the depth of our narrative displacement by technology. Turning back the clock is a familiar mimetic trick, and thus no answer to this malady. Reality will not endure if the fundamental coordinates of time are toyed with. The Amish option is not an option here. What is required is not another battle between narratives, but a battle to keep narrative itself in its rightful, albeit still necessary, place. This place is secondary to the central place of individual uniqueness, the place that grounds one’s character, whence one can take responsibility.

At first glance, it seemed that today’s discourse is the polar opposite of that moral instruction of Ceylon. But maybe that’s not the case. Some of today’s high-octane memetics promise to push toward the triumph of the real. As bizarre new characters and phrases continue to emerge and metamorphose, so do the secondary voices which spell out how they should be interpreted. But the interpretation is one step removed from the reality. The images simply ‘are’, imparting an instant, undefined ‘sense’ which excludes any clumsy attempt to define their ‘meaning’ in protracted prose. That narrativizing causes the image itself to lose its traction – for that traction lives only in the flash of time that passes prior to interpretation. It’s not about turning the clock back, then, for it is the machine’s own speed and acceleration which promises to set things back into order. In this way, the new frontiers of the intermeshing of man and machine are precisely the place where that old wisdom of Ceylon can bear its exotic fruit of personal responsibility once again.

Jacob Phillips is an academic living in London. He tweets at @Counteredlogos.