POC are just like you and me. Sure, there are technical, mostly visual, differences between us. However, considered in the grand scheme of things, such differences are quite trivial.
Far from a weakness, this diversity is a strength; we all play a role in moving our democracy forward, and ensuring the public realm remains a lively and vibrant place. Of course, by POC, I am referring to People of Commentary.
POC are everywhere. Turn on the television and you’ll be greeted by POC. Scroll through any social media feed, and without much effort, you’ll find posts made by POC. Walk through the middle of London, and soon enough, you’ll sight chattering congregations of POC.
Given the apparent omnipresence of POC, one eventually begins to ask: where did they come from? Were there this many POC in Britain 50 years ago? Yes, I know I’m pushing my luck.
In all seriousness, the voices of commentators, self-described or not, for better or for worse, constitute a large chunk of public, especially political, discussion in Britain.
At the same time, we have witnessed a rapid decline of public intellectualism over recent decades. Indeed, the noted absence of intellectuals from public life is underscored by the fact that most people struggle to define what an intellectual actually is.
Many are inclined to believe that the British are, by their very essence, an anti-intellectual people. Distrustful of abstraction, they very much prefer a hodgepodge philosophy of empirical observation and sainted “Common Sense” – both of which, especially the latter, intellectuals supposedly and infamously disregard.
An immediate glance at ongoing matters would support this position. Despite the fundamental disagreements constituting the “Gender Wars”, it is clear that both sides consider Britain, thankfully or regrettably, uniquely resistant to transgenderism. In my view, this can be traced to our Anglo-Saxon forebears, who gradually stripped gendered words from our language, save for those which speak to the empirical (that is, biological-anatomical) distinction between men and women.
All this said, empiricism isn’t exactly synonymous with “anti-intellectualism”, just as the names Francis Bacon, Thomas Hobbes, David Hume, George Berkeley, or Edmund Burke rarely come to mind when discussing “anti-intellectuals”. We can safely assume that intellectuals primarily deal in ideas, but we can’t safely assume said ideas are purely rationalistic and abstract.
Herein lies the distinction: there’s a difference between contemporary “anti-intellectualism”, which has contributed to the explosive ascendancy of POC, and the “anti-intellectualism” which is distinctly “intellectual” in nature – pertaining to the limits, rather than the uselessness, of intellectualism-as-abstraction. As such, we should consider post-war anti-intellectualism a degeneration of a healthier and more measured position.
Without placing too much weight on the origins of Britain’s post-war anti-intellectualism, I would argue that this attitude can be attributed to the popularity of the ideas of George Orwell, conveyed by cultural osmosis rather than extensive reading; specifically, his preoccupation with ‘Ordinary People’ and the ways in which they differ from the class of ‘Intellectuals’ from whom Orwell sought to dissociate himself.
This is an excerpt from “Ides”.
The Internet as Mob Rule
The ancient Greeks believed political constitutions repeated in a pattern called kyklos (“cycle”). The idea first occurs in Plato’s Republic, is elaborated by Aristotle in his Politics, and reaches its apogee in Polybius’ Histories.
Unlike modern theorists of cyclical rise and fall of civilisations, such as Oswald Spengler, the kyklos doesn’t have a zenith or golden age. It’s rather a waxing and waning of stable society types, followed by unstable society types. What characterises a stable society is that the ruling class and citizens both strive towards the common good, conceived as the objective purpose of human beings, which results in their happiness and flourishing. Society becomes unstable when its members stop having the common good in mind, and instead strive after their selfish private interests to the detriment of other citizens.
Kyklos then presupposes several things. First, it isn’t culture specific. Its objectivist outlook means it applies equally to all political human groups, always and everywhere. Second, the engine that drives history is human virtue and vice, and not economics, class struggle, or war. These are secondary factors resulting from the characters of human beings. Healthy economies, contented class structures, well-won peace and just wars all result from virtuous people. Third, the stable government types are various. Kyklos defends neither monarchy, nor aristocracy nor a republic exclusively. It isn’t a Whiggish or utopian theory of history, that says if and only if a certain group are in power all will be well. Rather it claims that whatever group are in power, they must be virtuous to rule well. Vice immediately leads to disorder.
Simplifying in the extreme, the kyklos model runs as follows. Rule can be by one person, several, or many. When these rule for the common good, they are just, and are called monarchy, aristocracy and republic respectively. When they rule for their private interest to the detriment of society, they are called tyranny, oligarchy and democracy respectively.
It’s important to note that by “democracy” I don’t mean here a system of popular representation or voting. The virtuous form of this is called a polity or republic in classical thinking. In the latter, bonds of authority and specialised expertise remain. In the former, absolutely everything is sacrificed for the sake of equality of the masses (see below).
A good monarch rules with benevolence. His successors are unjust and become tyrants. The nobility removes them, creating an aristocratic state. These in turn degenerate into oligarchs as they grow decadent and self-interested and begin to oppress the poor. The people rise up and remove them, creating a republic where all citizens have a say. But the mass of citizens loses the bonds of political friendship, grows selfish, and the republic becomes a democracy. Democracy eventually deteriorates to a point where all bonds between people are gone, and we have a mob rule. The mob annihilates itself through infighting. One virtuous man seizes power, and we return to monarchy. The cycle begins anew.
With these preliminaries out of the way, I come to my point. I believe the present age we are forced to live through is highly ochlocratic. Of course, it’s not a pure mob rule since we have non-mob elites and a rule of law. I also think our age is oligarchic (dominated by elites swollen with pleasure). But it’s more ochlocratic, I contend, than it was a few centuries ago, and enough that mob behaviour characterises it.
The defining trait of unstable regimes, as I’ve just said, is vice. However, vice doesn’t just happen spontaneously as though people awake one morning deciding to be selfish, spoilt, and cruel. Evil people, as Aristotle notes, often believe they are good. Their fault is that they’ve mistaken something which is bad for what is good. For example, the man who hates the poor falsely believes money is the same as goodness. The man who mocks monks and sages for their abstinence believes all and only pleasure is good. Even when we know what is good for us, ingrained habit or upbringing might make the illusion of goodness overpowering. A lifetime of cake-gorging can condition one to the point it overrides the knowledge that sugar is bad for health.
I think the Spanish thinker Jose Ortega y Gasset in The Revolt of the Masses (1930) unwittingly echoes Plato when he points to the faults of the democratic “mass-man” of the twentieth century. All human societies need specialised minorities to function. The more demanding and specialised a field, the more those who do it will be a minority of the population. Further, all societies, to function, need sources of authority which aren’t decided by a majority vote. Modern democracy has created the illusion that the unspecialised mass is sovereign and has no reliance on anybody. It has achieved this mirage through artificial liberation: creating unnatural freedoms through constant government intervention and technocratic engineering.
This in turn has supported vices out of unthinking habit. The mass-man accepts his lack of qualifications and is proud of this absence. He isn’t one deluded about his knowledge. Quite the opposite. The mass-man is someone who openly declares he knows nothing but demands to be listened to anyway because he’s a member of the sacred demos. In short, according to Ortega y Gasset, the ideology of the mass-man is: “I’m ordinary and ignorant, and so I have more of a say than those who are specialised and learned.”
The internet is a democratic medium par excellence. This isn’t to say that its users are all egalitarian and individualist; rather, its very construction assumes egalitarian and individualist ideas, and these force themselves onto its users whether they are willing or not.
Here we can extend the criticisms that Neil Postman makes of television in Amusing Ourselves to Death (1985) to the web. On the internet, all information is available to everyone. Anyone can create it, and anyone can opine on it. The medium doesn’t distinguish by quality, so the greatest products of human civilisation sit alongside the basest, on the same shelf. There are no filters online for expertise or experience; indeed, any attempts to create such filters are decried as “gatekeeping”. As a result, the internet has no difficulty settings (to use a metaphor). Getting through the easier levels isn’t mandatory to reach the harder ones. You can skip ahead, so to speak, and mingle with the pros as their peer.
Someone might object here that I’m exaggerating, since online communities monitor themselves all the time. I can indeed post my amateur opinions onto an internet space for astrophysicists, but these will mock and exclude me once I become a nuisance. However, this isn’t an answer. The internet is built on the assumption of mass wisdom, and the only way to enforce hierarchies of value on it is by banding a mob together. The space around remains anarchic. Yes, there are communities of wise people online, but these exist in an ocean of communities of fools. The medium presents them all as equally valuable. Which communities grow powerful still depends on the wishes of the mass.
When the internet produces a rare fruit of quality, this is because by sheer accident, the wishes of the mass have corresponded to reality. It isn’t an in-built feature.
The result is that the internet functions like a classic mob regime or ochlocracy. The medium has no sensitivity to quality, but rather responds to will, provided enough people are behind it. Those who wield influence online do so because the mob will has selected them. They are our modern versions of Plato’s Athenian demagogues, or the rabble-rousers of the French Revolution. A mass of ignorant and desperate people swirls around equally ignorant and desperate demagogues who promise them whatever they want. Demagogues rise and fall as the mob is first enamoured and then bored of them. As the internet has grown to encompass our whole lives, this ochlocracy has spilt out into the real world.
In this space, truth entirely drops out. It’s a common fault of the ignorant to confuse desire with truth, since desires are often hotly felt and what is very vivid seems real. Our egalitarian internet machine is therefore wont to magnify desires rather than realities. And because it magnifies desires, these are ever more confused with reality, until mob wishes replace the common good of society. I believe a good example of this is how the online demagogue-mob relationship works. When internet personalities, especially political and social influencers, fall from grace, it’s usually because their followers realise they can no longer get what they want out of them (seldom do demagogue and mob cordially separate because each has become wiser). The power lies with the followers and not with their purported leader.
Which brings me back to kyklos. A classic Greek political cycle resets when a virtuous individual takes the reins from the mob and establishes a monarchy. He recreates justice through his personal goodness. This was more likely, I think, in ancient societies where religion, community and family were stronger, and so the pool of virtuous people never entirely depleted. If our ochlocratic internet is indeed a stage in a kyklos (or a component of an ochlocratic stage), and it ends, I think it will end with one demagogic idiocy imposing itself on the others by force.
A population conditioned by the internet to regard mass appeal as equivalent to truth will readily accept a technocratic whip provided it claims to issue from the general will. Which idiocy gains supremacy is a matter of which can capture the greater part of the mass in the least time, to form a generation in its own image. This is why I don’t think the current trend of the internet becoming more regulated and censored is good. The regulators and censors come from the same debased crop as those they regulate and censor.
Joel Coen’s The Tragedy of Macbeth: An Examination and Review
A new film adaptation of Shakespeare’s Scottish tragedy, Joel Coen’s 2021 The Tragedy of Macbeth is the director’s first production without his brother Ethan’s involvement. Released in select theaters on December 25, 2021, and then on Apple TV on January 14, 2022, the production has received positive critical reviews as well as awards for screen adaptation and cinematography, with many others still pending.
As with any movie review, I encourage readers who plan to see the film to do so before reading my take. While spoilers probably aren’t an issue here, I would not want to unduly influence one’s experience of Coen’s take on the play. Overall, though much of the text is omitted, some scenes are rearranged, and some roles are reduced while others are expanded, I found the adaptation to be a generally faithful one that only improved with subsequent viewings. Of course, the substance of the play is in the performances of Denzel Washington and Frances McDormand, but their presentation of Macbeth and Lady Macbeth is enhanced by both the production and the supporting performances.
Production: “where nothing, | But who knows nothing, is once seen to smile” —IV.3
The Tragedy of Macbeth’s best element is its focus on the psychology of the main characters, explored below. This focus succeeds in no small part due to the production’s minimalist aesthetic. Shot in black and white, the film utilizes light and shadow to downplay the external historical conflicts and emphasize the characters’ inner ones.
Though primarily shown through the performances, the psychological value conflicts of the characters are concretized by the adaptation’s intended aesthetic. In a 2020 Indiewire interview, composer and long-time Coen collaborator Carter Burwell said that Joel Coen filmed The Tragedy of Macbeth on sound stages, rather than on location, to focus more on the abstract elements of the play. “It’s more like a psychological reality,” said Burwell. “That said, it doesn’t seem stage-like either. Joel has compared it to German Expressionist film. You’re in a psychological world, and it’s pretty clear right from the beginning the way he’s shot it.”
This is made clear from the opening shots, which disorient the sense of up and down through clouds and fog that remain a key part of the staging throughout the adaptation. Furthermore, the bareness of Inverness Castle channels the focus to the key characters’ faces, while the use of odd camera angles, unreal shadows, and distorted distances reinforces how unnatural the play’s central tragic action is, if not to the downplayed world of Scotland, then certainly to the titular couple. Even when the scene leaves Inverness to show Ross and MacDuff discussing events near a ruined building at a crossroads (Act II.4), there is a sense that, besides the Old Man in the scene, Scotland is barren and empty.
The later shift to England, where Malcolm, MacDuff, and Ross plan to retake their homeland from now King Macbeth, further emphasizes this by being shot in an enclosed but bright and fertile wood. Although many of the historical elements of the scene are cut, including the contrast between Macbeth and Edward the Confessor and the mutual testing of mettle between Malcolm and MacDuff, the contrast in setting conveys the contrast between a country with a mad Macbeth at its head and the one that presumably would be under Malcolm. The effect was calming in a way I did not expect—an experience prepared by the consistency of the previous acts’ barren aesthetic.
Yet, even in the forested England, the narrow path wherein the scene takes place foreshadows the final scenes’ being shot in a narrow walkway between the parapets of Dunsinane, which gives the sense that, whether because of fate or choice rooted in character, the end of Macbeth’s tragic deed is inevitable. The explicit geographical distance between England and Scotland is obscured as the same wood becomes Birnam, and as, in the final scenes, the stone pillars of Dunsinane open into a background of forest. This, as well as the spectacular scene where the windows of the castle are blown inward by a storm of leaves, conveys the fact that Macbeth cannot remain isolated forever against the tragic justice brought by Malcolm and MacDuff, and Washington’s performance, which I’ll explore presently, consistently shows that the usurper has known it all along.
This is a brilliant, if subtle, triumph of Coen’s adaptation: it presents Duncan’s murder and the subsequent fallout as a result less of deterministic fate and prophecy and more of Macbeth’s own actions and thoughts in response to it—which, themselves, become more determined (“predestined” because “wilful”) as Macbeth further convinces himself that “Things bad begun make strong themselves by ill” (III.2).
Performances: “To find the mind’s construction in the face” —I.4
Film adaptations of Shakespeare can run the risk of focusing too closely on the actors’ faces, which can make keeping up with the language a chore even for experienced readers (I’m still scarred from the “How all occasions” speech from Branagh’s 1996 Hamlet); however, this is rarely, if ever, the case here, where the actors’ and actresses’ pacing and facial expressions combine with the cinematography to carry the audience along. Yet, before I give Washington and McDormand their well-deserved praise, I would like to explore the supporting roles.
In Coen’s adaptation, King Duncan is a king at war, and Brendan Gleeson plays the role well, with a fitting dourness. Unfortunately, this aspect of the interpretation was, in my opinion, one of its weakest. While the film generally aligns with the Shakespearean idea that a country under a usurper is disordered, the before-and-after of Duncan’s murder—which Coen chooses to show onscreen—is not delineated clearly enough to signal it as the tragic conflict that it is. Furthermore, though many of his lines are adulatory to Macbeth and his wife, Gleeson gives them with so somber a tone that one is left emotionally uninvested in Duncan by the time he is murdered.
Though this is consistent with the production’s overall austerity, it does not lend much to the unnaturalness of the king’s death. One feels Macbeth ought not kill him simply because he is called king (a fully right reason, in itself) rather than because of any real affection on the part of Macbeth and his wife for the man himself. However, though I have my qualms, this may have been the right choice for a production focused on the psychological elements of the plot; by downplaying the emotional connection between the Macbeths and Duncan (albeit itself profoundly psychological), Coen focuses on the effects of murder as an abstraction.
The scene after the murder and subsequent framing of the guards—the drunken porter scene—was the one I most looked forward to in the adaptation, as it is in every performance of Macbeth I see. The scene is the most apparent comic relief in the play, and it is placed in the moment where comic relief is paradoxically least appropriate and most needed (the subject of a planned future article). When I realized, between the first (ever) “Knock, knock! Who’s there?” and the second, that the drunk porter was none other than comic actor Stephen Root (Office Space, King of the Hill, Dodgeball), I knew the part was safe.
I was not disappointed. The drunken obliviousness of Root’s porter, coming from Inverness’s basement to let in MacDuff and Lennox, pontificating along the way on souls lately gone to perdition (unaware that his king has done the same just that night) before elaborating to the new guests upon the merits and pitfalls of drink, is outstanding. With the adaptation’s other removal of arguably inessential parts and lines, I’m relieved Coen kept as much of the role as he did.
One role that Coen expanded in ways I did not expect was that of Ross, played by Alex Hassell. By subsuming other minor roles into the character, Coen makes Ross into the unexpected thread that ties much of the plot together. He is still primarily a messenger, but, as with the Weird Sisters whose crow-like costuming his own resembles, he becomes an ambiguous figure by the expansion, embodying his line to Lady MacDuff that “cruel are the times, when we are traitors | And do not know ourselves” (IV.2). In Hassell’s excellent performance, Ross seems to know himself quite well; it is we, the audience, who do not know him, despite his expanded screentime. By the end, Ross was one of my favorite aspects of Coen’s adaptation.
The best part of The Tragedy of Macbeth is, of course, the joint performance by Washington and McDormand of Macbeth and Lady Macbeth. The beginning of the film finds the pair later in life, with presumably few mountains left to climb. Washington plays Macbeth as a man tired and introverted, which he communicates by often pausing before reacting to dialogue, as if doing so is an afterthought. By the time McDormand comes onscreen in the first of the film’s many corridor scenes mentioned above, her reading and responding to the letter sent by Macbeth has been primed well enough for us to understand her mixed ambition yet exasperation—as if the greatest obstacle is not the actual regicide but her husband’s hesitancy.
Throughout The Tragedy of Macbeth their respective introspection and ambition reverse, with Washington eventually playing the confirmed tyrant and McDormand the woman turned inward by madness. If anyone needed a reminder of Washington and McDormand’s respective abilities as actor and actress, one need only watch them portray the range of emotion and psychological depth contained in Shakespeare’s most infamous couple.
Conclusion: “With wit enough for thee”—IV.2
One way to judge a Shakespeare production is whether someone with little previous knowledge of the play and a moderate grasp of Shakespeare’s language would understand and become invested in the characters and story; I hazard one could do so with Coen’s adaptation. It does take liberties with scene placement, and the historical and religious elements are generally removed or reduced. However, although much of the psychology that Shakespeare includes in the other characters is cut, the minimalist production serves to highlight Washington and McDormand’s respective performances. The psychology of the two main characters—the backbone of the tragedy that so directly explores the nature of how thought and choice interact—is portrayed clearly and dynamically, and it is this that makes Joel Coen’s The Tragedy of Macbeth an excellent and, in my opinion, ultimately true-to-the-text adaptation of Shakespeare’s Macbeth.
Is it Possible to Live Without a Computer of Any Kind?
This article was originally published on 19th May 2021.
I am absolutely sick to death of computers. The blue light of a screen wakes me up in the morning, I stare at another computer on my desk for hours every day, I keep one in my pocket all the time and that familiar too-bright glow is the last thing I see before I close my eyes at night. Lockdown undoubtedly made the problem much, much worse. Last year, a nasty thought occurred to me: it might be the case that the majority of my memories for several months were synthetic. Most of the sights and sounds I’d experienced for a long time had been simulated – audio resonating out of a tinny phone speaker or video beamed into my eyes by a screen. Obviously I knew that my conscious brain could tell the difference between media and real life, but I began to wonder whether I could be so sure about my subconscious. In short, I began to suspect that I was going insane.
So, I asked myself if it was possible to live in the modern world without a computer of any kind – no smartphone, no laptop, and no TV (which I’m sure has a computer in it somewhere). Of course, it’s possible to survive without a computer, provided that you have an income independent of one, but that wasn’t really the question. The question was whether it’s possible to live a full life in a developed country without one.
Right away, upon getting rid of my computers, my social life ground to a halt. Unable to go to the pub or a club, I had used my phone to feel like I was still at least on the periphery of my friends’ lives while they were all miles away. This was hellish, but I realised that it was the real state of my life – my phone acted as a pacifier and my friendships were holograms. No longer built on a foundation of regularly shared experiences, they were freeze-dried by social media – preserved so that they could be revived at a later date. With lockdown over, though, this becomes less necessary. They can be reheated and my social life can be taken off digital life support. I would lose contact with some people but, as I said, these would only be the friendships kept perpetually in suspended animation.
These days large parts of education, too, take place online. It’s not uncommon now in universities, colleges and secondary schools for work and timetables to be found online or for information to be sent to pupils via internal email networks. Remote education during lockdown was no doubt made easier by the considerable infrastructure already in place.
Then there’s the question of music. No computers would mean a life lived in serene quiet; travelling and working without background sound to hum or tap one’s foot to. An inconvenience, maybe, but perhaps not altogether a negative one. Sir Roger Scruton spoke about the intrusion of mass-produced music into everyday life. Computer-produced tunes are played at a low level in shopping centres and restaurants, replacing the ambient hum and chatter of human life with banal pop music. Scruton believed that the proper role of music was to exalt life – to enhance and make clear our most heartfelt emotions. Music today, though, is designed to distract from the dullness of everyday life or paper over awkward silences at social events. He went so far as to say that pop consumption had an effect on the musical ear comparable to that of pornography on sex.
The largest barrier, however, is the use of the internet for work. Many companies use online services to organise things like shift rotas, pay and holidays and the entire professional world made the switch to email decades ago. How feasible is it to opt out of this? Short of becoming extremely skilled at something for which there is both very little supply and very high demand, and then working for a band of eccentrics willing to accommodate my niche lifestyle, I think it would be more or less impossible. Losing the computer would mean kissing the possibility of a career goodbye.
Lockdown has also sped up the erosion of physical infrastructure required to live life offline as well as accelerated our transformation into a ‘cashless society’. On average, 50 bank branches have closed every month since January 2015, with over 1000 branch closures across the country in the last year alone. It also seems to have wiped away the last remaining businesses that didn’t accept card payments. The high street, already kicking against the current for years, is presently being kept alive by Rishi Sunak’s magic money tree while Amazon records its best quarter for profits ever. It’s no mystery to anyone which way history will go.
I’m lucky that my parents were always instinctively suspicious of ‘screens’. I didn’t get a smartphone until a good way into secondary school and I got my first – and only – games console at the age of 16. I keenly remember getting a laptop for my birthday. I think my parents gave it to me in the hopes that I would become some kind of computing or coding genius – instead, I just played a lot of Sid Meier’s Civilization III. My dad would remind me that nothing on my computer was real, but that didn’t stop me getting addicted to games. If it weren’t for my parents’ strong interventions I would likely have developed a serious problem – sucked into the matrix and doomed to spend my youth in my bedroom with the blinds down.
All year this year I have wrestled with my media addiction but been unable to throw it off. I told my friends that I was taking a break from social media, I deactivated my Twitter account, I physically hid my phone from myself under my bed, and yet here I am, writing this on my laptop for an online publication. When I got rid of my phone I turned to my computer to fill the time. When I realised that the computer was no better I tore myself from it too… and spent more time watching TV. I tried reading – and made some progress – but the allure of instant reward always pulled me back.
I’m not a completely helpless creature, though. On several occasions I cast my digital shackles into the pit, only to find that I needed internet access for business that was more important than my Luddite hissy-fit. Once I opened the computer up for business, it was only a matter of time before I would be guiltily watching Netflix and checking my phone again. It’s too easy – I know all the shortcuts. I can be on my favourite time-absorbing website at any time in three or four keystrokes. Besides, getting rid of my devices meant losing contact with my friends (with whom contact was thin on the ground already). Unplugging meant really facing the horrific isolation of lockdown without dummy entertainment devices to distract me. I lasted a month, once. So determined was I to live in the 17th century that I went a good few weeks navigating my house and reading late at night by candlelight rather than turning on those hated LEDs.
And yet, the digital world is tightening around us all the time. Year on year, relics of our past are replaced with internet-enabled gadgets connected to a worldwide spider web of content that has us wrapped up like flies. Whenever I’ve mentioned this I’ve been met with derision and scorn and told to live my life in the woods. I don’t want to live alone in the woods – I want to live a happy and full life; the kind of life that everyone lived just fine until about the ’90s. I’m sick of the whirr and whine of my laptop, of my nerves being raw from overuse, of always keeping one ear open for a ‘ping’ or a ‘pop’ from my phone, and of the days lost mindlessly flicking from one app to the other. Computers have drastically changed the rhythm of life itself. Things used to take certain amounts of time and so they used to take place at certain hours of the day. They were impacted by things like distance and the weather. Now, so much can occur instantaneously irrespective of time or distance and independent from the physical world entirely. Put simply, less and less of life today takes place in real life.
The world of computers is all I’ve ever known and yet I find myself desperately clawing at the walls for a way out. It’s crazy to think that something so complex and expensive – a marvel of human engineering – can become so necessary in just a few decades. If I can’t get rid of my computers I’ll have to learn to diminish their roles in my life as best I can. This is easier said than done, though; as the digital revolution marches on and more and more of life is moved online, the digital demons I am struggling to keep at arm’s length grow bigger and hungrier.
I’m under no illusions that it’s possible to turn back the tide. Unfortunately the digital revolution, like the industrial and agricultural revolutions before it, will trade individual quality of life for collective power. As agricultural societies swallowed up hunter gatherers one by one before themselves being crushed by industrial societies, so those who would cling to an analogue way of life will find themselves overmatched, outcompeted and overwhelmed. Regardless, I will continue with my desperate, rearguard fight against history the same way the English romantics struggled against industrialisation. Hopeless my cause is, yes, but it’s beautiful all the same.