Picture the scene. Strings of tattered bunting, the concrete shaft of a half-built pillar. At the centre of it all, a pile of red and black folders, flanked by a pair of flagpoles bearing faded Union Jacks. A length of striped tape lies beside them on the floor like the shed integument of a snake, and everywhere you look you see road barriers, twisted, contorted, lopsided. If the setting were not the Towner Eastbourne art gallery, you’d think a car had crashed through it. And you probably wouldn’t blame the driver.
This isn’t the aftermath of a riot or the contents of a disorganised storage room. In fact it is Jesse Darling’s winning submission for the 2023 Turner Prize, one of the world’s most prestigious art awards. The prize was established to honour the ‘innovative and controversial’ works of J.M.W. Turner, although in the thirty-nine years since its inauguration, none of the winning submissions have evoked the sublime beauty of Turner’s paintings.
There are no facts when it comes to art, only opinions. The judges, who lauded the exhibition for ‘unsettl[ing] perceived notions of labour, class, Britishness and power’, seem to have glimpsed something profound beyond the shallow display of metal and tape. Or they may simply have considered it the least worst submission in a shortlist which included some accomplished but otherwise unremarkable charcoal sketches, an oratorio about the COVID-19 pandemic and a few pipes.
It may be that beauty lies in the eye of the beholder. But there is a distinction to be made between works of art whose beauty is universally accepted, and those which fail to find acclaim beyond a small demographic of urban, middle-class bohemians. According to a YouGov poll, an unsurprising 97% of the British public consider the Mona Lisa to be a work of art. That figure drops to 78% for Picasso’s Guernica, 41% for Jackson Pollock’s Number 5, and just 12% for Tracey Emin’s My Bed. That 12% of society, however, represents those among us most likely to work in art galleries and institutions, and to hold the most latitudinarian definition of ‘art’.
For ordinary people, the chief criterion for art is, and always has been, beauty. But in the 20th century the art world, like the other humanities, succumbed to the nebulous web of ‘discourse’, with a corresponding shift away from aesthetic merit and towards political ends. Pieces like Marcel Duchamp’s Fountain – an upturned porcelain urinal – proved that works of art could shoot to fame precisely on account of their capacity to disturb and agitate audiences. As the philosopher Roger Scruton described it, ‘According to many critics writing today a work of art justifies itself by announcing itself as a visitor from the future. The value of art is a shock value.’ That shock, fear and revulsion create more powerful reactions than the sense of joy, calm or awe one feels when looking at a Rossetti or a Caravaggio is an unfortunate fact of human nature, and it remains as true today as it was a hundred years ago. In much the same way that a news story about declining global poverty rates or deaths from malaria will receive less attention than one about melting ice caps or rising CO2 emissions, a truly beautiful artwork will receive less attention in the media than something which irks, irritates and offends.
In the opening chapter of The Picture of Dorian Gray, when Basil Hallward reveals the eponymous portrait to his friend Lord Henry, he confesses to feeling reservations about exhibiting the work despite it being, in Henry’s estimation, his masterpiece. ‘There is only one thing in life worse than being talked about,’ Lord Henry reprimands him, ‘and that is not being talked about.’ The sentiment is triply true in the age of social media. Indeed, the term ‘ragebait’ started appearing online in the months following Mark Leckey’s winning submission for the Turner Prize in 2008, an exhibition which featured a glow-in-the-dark stick figure and a naked mannequin on a toilet. Like Dorian Gray, the art world thirsts for attention; and like his portrait, the art itself grows ever more hideous.
The quickest route to attention is politics. At the award ceremony for this year’s Turner Prize, Darling pulled from his pocket a Palestinian flag, explaining: ‘Because there’s a genocide going on and I wanted to say something about it on the BBC.’ In his acceptance speech, he lambasted the late Margaret Thatcher for ‘pav[ing] the way for the greatest trick the Tories ever played, which is to convince working people in Britain that studying, self expression and what the broadsheet supplements describe as “culture” is only for certain people in Britain from certain socio-economic backgrounds. I just want to say don’t buy in, it’s for everyone’. The irony is that all the money in the world wouldn’t fix the problems currently afflicting the art scene. If the custodians of modern art want to democratise their vocation and make culture available to ordinary people, they should follow the example of Turner – and produce something worth looking at.
Charles’ Personal Rule: A Stable or Tyrannised England?
Within discussions of England’s political history, the most famous moments are widely known and debated – the Magna Carta of 1215 and the Cromwell Protectorate of the 1650s spring immediately to mind. However, the renewal of an almost-mediaeval style of monarchical absolutism in the 1630s has proven an overlooked and underappreciated period of historical interest. Indeed, Charles I’s rule without Parliament has faced an identity crisis amongst more recent historians – was it a period of stability or tyranny for the English people?
To consider the Personal Rule in sufficient depth, the years leading up to the dissolution of Charles’ Third Parliament in 1629 must first be understood. Having succeeded his father James I in 1625, Charles brought a personal style and vision of monarchy that would prove incompatible with the expectations of his Parliaments. Having enjoyed a strained but respectful relationship with James, MPs came to question Charles’ authority and choice of advisors in the years that followed. Indeed, it was Charles’ stubborn adherence to the doctrine of the Divine Right of Kings – he once wrote that “Princes are not bound to give account of their actions but to God alone” – that led him to regard compromise as defeat, and any pushback against him as a sign of disloyalty.
Constitutional tensions between King and Parliament proved the most contentious of all issues, especially regarding the King’s role in taxation. At war with Spain between 1625 and 1630 (and having just dissolved the 1626 Parliament), Charles was lacking in funds. Thus, he turned to non-parliamentary forms of revenue, most notably the Forced Loan of 1627 – declaring a ‘national emergency’, Charles demanded that his subjects make a gift of money to the Crown. Whilst the loan was theoretically optional, those who refused to pay were often imprisoned; a notable example is the Five Knights’ Case, in which five knights were imprisoned for refusing to pay (with the court ruling in Charles’ favour). This would eventually culminate in Charles’ signing of the Petition of Right (1628), which protected the people from non-Parliamentary taxation, as well as from other controversial powers that Charles chose to exercise, such as arrest without charge, martial law, and the billeting of troops.
The role played by George Villiers, the Duke of Buckingham, was another major factor contributing to Charles’ eventual dissolution of Parliament in 1629. Having dominated the court of Charles’ father, Buckingham came to enjoy a similar level of unrivalled influence over Charles as his de facto Foreign Minister. It was, however, in his position as Lord High Admiral that he further worsened Charles’ already negative view of Parliament. Responsible for both major foreign policy disasters of Charles’ early reign (Cadiz in 1625 and La Rochelle in 1627, expeditions which achieved nothing and cost between 5,000 and 10,000 lives), he was deemed by the MP Edward Coke to be “the cause of all our miseries”. The Duke’s influence over Charles’ religious views also proved highly controversial – at a time when anti-Calvinism was on the rise, championed by figures such as Richard Montague and his pamphlets, Buckingham encouraged the King to continue his support of the leading anti-Calvinist of the time, William Laud, at the York House Conference in 1626.
Charles remained heavily dependent on the counsel of Villiers until the Duke’s assassination in 1628, and it was in fact Parliament’s threat to impeach Buckingham that encouraged Charles to agree to the Petition of Right. Fundamentally, Buckingham’s poor decision-making brought serious criticism from MPs, and a King who believed this criticism to be Parliament overstepping the mark and questioning his choice of personnel.
By 1629, Charles viewed Parliament as a means of restricting his God-given powers – one that had attacked his decisions, provided him with essentially no subsidies, and forced him to accept the Petition of Right. Writing years later, in 1635, the King claimed that he would do “anything to avoid having another Parliament”. Amongst historians, the significance of this final dissolution is fiercely debated: some, such as Angela Anderson, do not see the move as unusual – there were, for example, seven years between two of James’ Parliaments (1614 and 1621), and at this point in English history “Parliaments were not an essential part of daily government”. On the other hand, figures like Jonathan Scott viewed the principle of officially governing without Parliament as new – indeed, the decision was made official by a royal proclamation.
Now free of Parliamentary constraints, the first major issue Charles faced was his lack of funds. Denied his usual means of taxation and in desperate need of upgrading the English navy, the King revived ancient taxes and levies, the most notable being Ship Money. Originally a wartime levy on coastal towns (to fund the building of fleets), Ship Money was extended to inland counties in 1635 and made an annual tax in 1636. This inclusion of inland towns was construed as a new tax imposed without parliamentary authorisation. For the nobility, Charles revived the Forest Laws (demanding landowners produce the deeds to their lands), as well as fines for breaching building regulations.
The public response to these new fiscal expedients was one of broad annoyance but general compliance. Indeed, between 1634 and 1638, 90% of the expected Ship Money revenue was collected, providing the King with over £1m in annual revenue by 1637. Despite this, the Earl of Warwick questioned its legality, and the clerical leadership referred to all of Charles’ tactics as “cruel, unjust and tyrannical taxes upon his subjects”. However, the most notable case of opposition to Ship Money was the John Hampden case of 1637. A gentleman who refused to pay, Hampden argued that England wasn’t at war and that the Ship Money writs gave subjects seven months to pay – enough time for Charles to call a new Parliament. Although the Crown won the case, it inspired wider opposition to Ship Money, such as the ‘tax revolt’ of 1639–40, involving non-cooperation from both citizens and tax officials. Opposing this view, however, stands Kevin Sharpe, who claimed that “before 1637, there is little evidence at least, that its [Ship Money’s] legality was widely questioned, and some suggestion that it was becoming more accepted”.
In terms of religion – both his personal views and his wider vision for the country – Charles had been an open supporter of Arminianism from as early as the mid-1620s, a movement within Protestantism that staunchly rejected the Calvinist teaching of predestination. As a result, the sweeping changes to English worship and Church government that the Personal Rule would oversee were unsurprisingly extremely controversial amongst his Calvinist subjects in all areas of the kingdom. In considering Charles’ religious aims and their consequences, we must focus on the impact of one man in particular: William Laud. Having given a sermon at the opening of Charles’ first Parliament in 1625, Laud spent the next near-decade climbing the ecclesiastical ladder; he was made Bishop of Bath and Wells in 1626, Bishop of London in 1629, and eventually Archbishop of Canterbury in 1633. Now 60 years old, Laud was unwilling to compromise on any of his planned reforms to the Church.
The overarching theme of the Laudian reforms was ‘the Beauty of Holiness’, which aimed to make churches beautiful, almost lavish, places of worship (Calvinist churches, by contrast, were mostly plain, so as not to detract from worship). This was achieved through the restoration of stained-glass windows, statues and carvings. Railings were added around altars, and priests began wearing vestments and bowing at the name of Jesus. The most controversial change to the church interior, however, proved to be the communion table, which was moved from the middle of the room to stand against the wall at the east end – a change “seen to be utterly offensive by most English Protestants as, along with Laudian ceremonialism generally, it represented a substantial step towards Catholicism. The whole programme was seen as a popish plot”.
Under Laud, the power and influence wielded by the Church also increased significantly – a clear example being the greater autonomy granted to Church courts. Church leaders also became ever more present as ministers and officials within Charles’ government, with the Bishop of London, William Juxon, appointed Lord Treasurer and First Lord of the Admiralty in 1636. Despite already having the full backing of the Crown, Laud was not one to accept dissent or criticism, and although the severity of his actions has been exaggerated by recent historians, they could at times be ruthless. The clearest example is the torture and imprisonment of his most vocal critics in 1637: the religious radicals William Prynne, Henry Burton and John Bastwick.
However successful the Laudian reforms may have been in England (and even that is debatable), Laud’s attempt to enforce uniformity on the Church of Scotland in the latter half of the 1630s would see the emergence of a united Scottish opposition to Charles, and eventually armed conflict with the King in the form of the Bishops’ Wars (1639 and 1640). The road to war was sparked by Charles’ introduction of a new Prayer Book in 1637, aimed at bringing English and Scottish religious practices closer together – a move that proved disastrous. Riots broke out across Edinburgh, the most notable in St Giles’ Cathedral (where the bishop had to protect himself by pointing loaded pistols at the furious congregation). This displeasure culminated in the National Covenant of 1638 – a declaration of allegiance which bound together Scottish nationalism and the Calvinist faith.
Any conclusion about the Laudian religious reforms very much hinges on the fact that, in terms of his and Charles’ objectives, they did overhaul the Calvinist systems of worship, the role of priests, Church government, and the physical appearance of churches. The response from the public, however, ranging from silent resentment to full-scale war, displays how damaging these reforms were to Charles’ relationship with his subjects – coupled with the influence wielded by his wife Henrietta Maria, public fears about Catholicism badly damaged Charles’ image, and made religion arguably the most intense issue of the Personal Rule. In judging Laud today, the historical debate is split: certain historians focus on his radical uprooting of the established system, with Patrick Collinson suggesting the Archbishop was “the greatest calamity ever visited upon the Church of England”, whereas others view Laud and Charles as pursuing an entirely reasonable goal – a more orderly and uniform church.
Much as the Personal Rule’s religious direction was defined by one individual, so too was its political direction – by Thomas Wentworth, later the Earl of Strafford. Serving as Lord Deputy of Ireland from 1632 to 1640, he set out with the aims of ‘civilising’ the Irish population, increasing revenue for the Crown, and challenging Irish titles to land – all under the umbrella term of ‘Thorough’, which aspired to concentrate power, crack down on opposition figures, and essentially preserve the absolutist nature of Charles’ rule during the 1630s.
Regarding Wentworth’s aims toward Irish Catholics, Ian Gentles’ 2007 work The English Revolution and the Wars in the Three Kingdoms argues that the friendships Wentworth maintained with Laud and with John Bramhall, the Bishop of Derry, “were a sign of his determination to Protestantize and Anglicize Ireland”. Devoted to a Catholic crackdown as soon as he reached Irish shores, Wentworth refused to recognise the legitimacy of Catholic officeholders in 1634, and reduced Catholic representation in Ireland’s Parliament by a third between 1634 and 1640 – this at a time when Catholics made up 90% of the country’s population. An even clearer indication of Wentworth’s hostility to Catholicism was his aggressive policy of land confiscation. Challenging Catholic property rights in Galway, Kilkenny and other counties, Wentworth would bully juries into returning verdicts favourable to the King, and even those Catholics who were granted their land back (albeit only three-quarters of it) were now required to make regular payments to the Crown. Wentworth’s enforcement of Charles’ religious priorities was further evidenced by his reaction to those in Ireland who signed the National Covenant. The accused were hauled before the Court of Castle Chamber (Ireland’s equivalent of the Star Chamber) and forced to renounce ‘their abominable Covenant’ as ‘seditious and traitorous’.
Seemingly in keeping with the other leading figures of the Personal Rule, Wentworth was notably tyrannical in his governing style. Sir Piers Crosby and Lord Esmonde were convicted of libel by the Court of Castle Chamber for accusing Wentworth of involvement in the death of Esmonde’s relative, and Lord Valentia was sentenced to death for “mutiny” – in fact, he had merely insulted the Earl.
In considering Wentworth as a political figure, it is very easy to view him as merely another tyrannical brute carrying out the orders of his King. Indeed, his time as Charles’ personal advisor (from 1639 onwards) certainly supports this view: he once told Charles that he was “loose and absolved from all rules of government” and was quick to advocate war with the Scots. However, Wentworth also saw great successes during his time in Ireland; he raised Crown revenue substantially by taking back Church lands, and purged the Irish Sea of pirates. Fundamentally, by the time of his execution in May 1641, Wentworth possessed a reputation amongst Parliamentarians very much like that of the Duke of Buckingham; both men came to wield tremendous influence over Charles, as well as great offices and positions.
In the areas considered thus far, opposition to the Personal Rule appears to have been a rare occurrence, especially in any organised or effective form. Indeed, Durston claims that the 1630s saw “few overt signs of domestic conflict or crisis”, viewing the period as altogether stable and prosperous. However, whilst certainly limited, the small amount of resistance can be seen as representing a far more widespread feeling of resentment amongst the English populace. Whilst many of the Crown’s actions received little pushback from the masses, the gentry, many of whom were becoming increasingly disaffected with the Personal Rule’s direction, gathered in opposition. Most notably, John Pym, the Earl of Warwick and other figures collaborated with the Scots to launch a dissident propaganda campaign criticising the King, as well as encouraging local opposition (which saw some success, such as the mobilisation of the Yorkshire militia). Charles’ effective use of the Star Chamber, however, ensured that opponents – usually those who voiced opposition to royal decisions – were swiftly dealt with.
The historiographical debate surrounding the Personal Rule, and the Caroline Era more broadly, was and continues to be dominated by Whig historians, who view Charles as foolish, malicious, and power-hungry, and his rule without Parliament as destabilising, tyrannical and a threat to the people of England. A key proponent of this view is S.R. Gardiner who, believing the King to have been ‘duplicitous and delusional’, coined an alternative term to ‘Personal Rule’ – the Eleven Years’ Tyranny. This position has survived into the latter half of the 20th Century, with Charles having been labelled by Barry Coward as “the most incompetent monarch of England since Henry VI”, and by Ronald Hutton, as “the worst king we have had since the Middle Ages”.
Recent decades, however, have seen the attempted rehabilitation of Charles’ image by Revisionist historians, the best known – and most controversial – being Kevin Sharpe. Responsible for the landmark study of the period, The Personal Rule of Charles I, published in 1992, Sharpe came to be Charles’ staunchest modern defender. In his view, the 1630s, far from a period of tyrannical oppression and public rebellion, were a decade of “peace and reformation”. During Charles’ time as an absolute monarch, his freedom from Parliamentary limits and regulations allowed him to achieve a great deal: Ship Money strengthened the Navy’s numbers, the Laudian reforms meant a more ordered and regulated national church, and Wentworth dramatically raised Irish revenue for the Crown – all this, and much more, without any real organised or overt opposition.
Understandably, the Sharpian view has received significant pushback, primarily for taking an overly optimistic view and selectively emphasising the Personal Rule’s positives. Encapsulating this criticism, David Smith wrote in 1998 that Sharpe’s “massively researched and beautifully sustained panorama of England during the 1630s … almost certainly underestimates the level of latent tension that existed by the end of the decade”. This has been built on by figures like Esther Cope: “while few explicitly challenged the government of Charles I on constitutional grounds, a greater number had experiences that made them anxious about the security of their heritage”.
It is worth noting, however, that a year before his death in 2011, Sharpe came to acknowledge the views of his fellow historians, conceding that Charles’ lack of political understanding had endangered the monarchy and that, by the end of the 1630s, the Personal Rule was indeed facing mounting and undeniable criticism from both Charles’ court and the public.
Sharpe’s unpopular perspective has been built upon by other historians, such as Mark Kishlansky. Publishing Charles I: An Abbreviated Life in 2014, Kishlansky viewed parliamentarian propaganda of the 1640s, along with centuries of consistent smears from historians, as having resulted in Charles being seen “as an idiot at best and a tyrant at worst”, labelling him “the most despised monarch in Britain’s historical memory”. Charles, however, had no real preparation for the throne – it was his older brother Henry who had always been the heir apparent. Moreover, once King, Charles faced stubborn and uncooperative Parliaments – by refusing to provide him with the necessary funding, for example, they forced him into the Forced Loan. Kishlansky does, however, concede the damage caused by Charles’ unbending belief in the Divine Right of Kings: “he banked too heavily on the sheer force of majesty”.
Charles’ personality, ideology and early life fundamentally meant an icy relationship with Parliament, one which grew into mutual distrust and eventual dissolution. The period of Personal Rule remains a highly debated topic within academic circles, with the recent arrival of Revisionism posing a challenge to the long-established negative view of the Caroline Era. Whether the King’s financial, religious and political actions were met with a merely discontented populace or with outright opposition, the identity crisis facing the period – tyranny or stability – has yet to be conclusively put to rest.
The Dangers of a Revolution in Reverse
“In conclusion, this is the great truth with which the French cannot be too greatly impressed: the restoration of the monarchy, what they call the counter-revolution, will not be a contrary revolution, but the contrary of the revolution.” – J. de Maistre, Considerations on France, R. A. Lebrun (Ed.), Cambridge, p. 105.
Imagine a prisoner digging an escape tunnel. For years, in desperation and longing for freedom, he’s picked at the stones by hand until his fingers are bleeding stumps. Suddenly he emerges and a rush of hope shoots through his veins. This subsides immediately. Before him is darkness. He had severely underestimated the size of the prison, and all this time he was merely tunnelling into another prisoner’s cell.
This situation, familiar to readers of Alexandre Dumas’ The Count of Monte Cristo, pertains, I think, to a figure Joseph de Maistre first identified in 1797, in the aftermath of the French Revolution: the reverse revolutionary. As far as I know, the only other thinker to have dwelt deeply on this character is the conservative Augusto del Noce in the twentieth century, and I shall draw from both to make my case.
First, to define ‘revolutionary’. I use the term to mean any view that seeks utopian salvation through political or social action, rejecting traditions of immaterial truth and demanding an abrupt discontinuity with the past. I don’t necessarily mean one that wants violent upheaval, though usually it does. It’s not the manner that defines a revolution but its content. These ideologies try to sever the link between politics and any truth outside of it. Truth becomes a socio-political creed. Eric Voegelin’s view that modern revolutionary thought is gnostic serves us here. Ancient Gnostics separated heaven from earth and sought heaven through esoteric spiritual knowledge. Modern Gnostics also separate heaven from earth, but banish heaven from the earth entirely and build paradises out of esoteric political knowledge, without reference to anything beyond it.
A reverse revolutionary is someone who begins as the staunchest conservative. The revolution has come and ruined the world he loves. He’s seen all that he holds good swept aside in a frenzy. Panic ensues, and then rage. What shall he do?
He sets about pushing back the revolution by what he thinks is a counter-revolution. Whatever the revolutionaries affirm, he’ll deny. Whatever nefarious plans they have, he’ll plan the opposite. Whenever they push, he’ll push back harder. But what he really does is create a contrary revolution. Instead of negating the revolution, he reverses it.
But what’s the difference exactly between negation and reversal? I think it’s the difference between partial and full denial of a revolutionary argument.
Jean-Jacques Rousseau, the Ur-revolutionary, thinks something like this:
“Man is born free but everywhere he’s in chains, so he must be born good and it’s society that makes him evil.”
There’s rather a lot here, but for simplicity’s sake it’s an argument with two parts. “Man is born free and everywhere he’s in chains” effectively means that humans are naturally equal, but everywhere unequal. Why are we unequal if nature makes us equal? Because “man is born good and it’s society that makes him evil”. That is, unjust social institutions have corrupted us, and prevent us from living as we would in a state of nature.
We can reverse or negate this position. A reversal would be something like this:
“Yes, man is born good, and society makes him evil. But it’s because by nature he’s unequal, and society is what makes him equal.”
In other words, we agree with Rousseau that society and its institutions are responsible for all injustice. However, we disagree with him that inequality is the problem. The problem is the opposite: equality. In the imagined state of nature, humans are unequal, and it’s society which has imposed an unnatural equality upon them.
If Rousseau’s original position is a sort of egalitarian primitivism, our reversed position is a sort of hierarchical primitivism. Were we to put the latter into practice, it would oppose the former, but create its own revolution to do so. It would resist with equal vehemence the status quo, but for the opposite reasons.
A negation, on the other hand, would read like this:
“Man isn’t born free and he isn’t everywhere in chains, so he isn’t born good, and society doesn’t make him evil.”
While the reversal inverts the premise, but keeps the conclusion, the negation says that the premise and conclusion are both false. It denies them both.
Fair enough, but why does this matter? Aren’t we just splitting hairs? It matters because reversing a revolution accepts part of its lie. One starts from this lie, then tries to produce from it the opposite effect to the one it has so far produced. But lies are at odds with reality, because only what’s true is real. By fighting lies with lies one risks ruining the world twice over instead of improving it. Further, since lies by definition don’t correspond to reality, a revolution in reverse is destined to fail. To accept a lie is to accept something which doesn’t exist, and carrying that lie through into political action means creating a delirium or fantasy. History testifies to the fleeting nature of such things.
To create a revolution in the opposite direction is tempting for those who want to protect themselves from a revolution but have unwittingly drunk from its well. It’s the reaction (in the political sense) of someone unwilling to reflect on the times he lives in or analyse himself as the product of a Zeitgeist. Someone who hasn’t thought that all ideas have a genealogy, and that those ideas he detests might be closer on the family tree than he suspects. The reverse revolutionary, in short, is someone who confuses the familiar with the truth.
Like water through coffee, a revolutionary idea only bursts forth once it has thoroughly saturated the culture. By that point it’s part of a wider background, framing all conversation and extremely difficult to think outside of, like the courtyard surrounding a prison that blocks any view of the distance. Robespierre and the Jacobins normalised political violence as a means of change with La Terreur, and La Terreur Blanche was their mirror. Marxism normalised crude materialism and a murderous utilitarian collectivism, and Nazism was its mirror. Indeed, to get Nazism one must simply reverse, point-by-point, every social creed of Marxism, keep the materialist worldview intact, then embed it in a Prussian context (A. Del Noce, The Crisis of Modernity, 2014, pp. 68-69).
Retreat into so-called centrism doesn’t protect against reverse revolution either. A mild and centrist ideology that opposes a harsh and radical one can still be a revolution in reverse if it shares the same underlying commitment to a revolutionary ideal. Recall that it’s not the manner but the content that defines a revolution. The reversal of a reductive political utopia must necessarily be another reductive political utopia. Thus, the economic liberal who opposes socialism by curing every ill with market forces is no less revolutionary than the socialist merely because he is a centrist. Lastly, wishing to achieve one’s aims through gradual change doesn’t make one less revolutionary, for a slow revolution is still a revolution.
In our day such reversals are coming thick and fast, as they must in an age of crisis and disintegration, though they lack the sophistication of even the crudest Victorian pamphlet. The disgraced and arrested influencer Andrew Tate is a reverse revolutionary of sorts. He accepts the radical feminist vision of the patriarchy as a grand male conspiracy to oppress womankind but considers this a good thing which must be reinstated. The result is a masculinist revolution parallel to the radical feminist one, where everything that feminist revolutionaries decry, Tate applauds. Any existing order which is neither feminist nor masculinist is the shared enemy of both.
In gnostic fashion, Tate has swapped the esoteric knowledge of the radical feminists for a masculine counterpart. One suspects that Tate, as a revolutionary, wouldn’t really care if the facts disproved his vision (just as radical feminists don’t), since a political goal has absorbed all reality and replaced truth itself.
I don’t have a simple solution to this problem. There’s no remedy for reverse revolutionaries other than humility, education, and careful thought. The wrathful desire for vengeance especially breeds such people because anger, frothing up, looks for a way to harm the enemy without asking what the tool is. Any tool will do, even if the enemy himself has made it. Perhaps this is why societies filled with wrath are prone to this error.
Maybe we should close with the words of Louis XVI awaiting execution in 1792, to his son the Dauphin: “I recommend to my Son, if he has the misfortune to become King, to remember that he owes himself entirely to his fellow citizens; that he must forget all hatred and resentment, and particularly all that relates to the misfortunes and afflictions that I endure.”
Barbie, Oppenheimer and Blue Sky Research
Barbie or Oppenheimer? Two words you would never have considered putting together in a sentence. For the biggest summer blockbuster showdown in decades, the memes write themselves.
In recent months (and years!), we’ve seen flop after flop, such as the new Indiana Jones and Flash films, with endless CGI superheroes and the merciless rehashing of recognised brands. The inability of film studios to recognise and attempt anything new has only led to continued damage to established and respected franchises.
This is due in part to a decline in film studios’ willingness to take risks on new pieces of intellectual property (something the studio A24 has excelled in), and a retreat into a ‘culturally bureaucratic’ system that neither rewards art nor generates anything remotely new, preferring instead to reward conscientious proceduralism.
Given this, there has been widespread speculation that films like Oppenheimer will ‘save’ cinema, with Christopher Nolan’s biographical adventure, based on the book American Prometheus (which I would highly recommend), being both highly anticipated and highly regarded.
I suspect, though, that cinema in its current format is too far gone to be saved. Nevertheless, I do believe that Oppenheimer will have long-term cultural effects, which should be recognised and welcomed by everyone.
In the past, there have been many films that, when made and consumed, have directly changed how we view topics and issues. Jaws gave generations of people a newfound fear of sharks, while the Shawshank Redemption provided many with the Platonic form of hope and salvation. I hope that Oppenheimer can and will become a film like this, because of what Robert Oppenheimer’s life (and by extension the Manhattan Project itself) represented.
As such, two things should come out of this film and re-enter the cultural sphere, filtering back down into our collective fears and dreams. The first is the existential fear of nuclear war (very pressing considering the Russo-Ukrainian War) and what this means for us as a species.
The second is Blue-Sky Research (BSR) and the power of problem solving. Although the Manhattan Project was not a ‘true’ example of BSR, it helped set the benchmark for science going forward.
Both factors should return to our collective consciousness, in our professional and private lives; they can only benefit us going forward.
I would encourage everyone to go out tonight, look at the night sky and say to yourself while looking at the stars: “this goes on forever”. In the same breath, look to the horizon and think to yourself: “this can end at any moment. We have the power to do all of this”.
Before watching Oppenheimer, I would highly encourage you to watch the ‘Charlie Dean Archives’ and their footage of atomic bombs from 1959. Not only is the footage astounding, it is a reminder that multiple generations have lived in fear of the invention; the idea and the consequences of the bomb have disturbed humanity for as long as it has existed.
In Britain, films like Threads played a similar role, entering the national unconscious, as did Barefoot Gen in Japan (a notorious and controversial film, but a must-watch), presenting the real-world effects of nuclear war, and the fear it invokes, through the eyes of young children.
In recent years, we have seemingly lost this fear. Indeed, we continue to overlook the fact this could all be over so quickly. We have forgotten or chosen to ignore the simple fact that we are closer than ever before to the end of the world.
The pro-war lobby within the West has continually played fast and loose with this fact, to the point that we find ourselves playing Russian roulette with an ever-decreasing number of chambers in our guns.
In the past, we have narrowly avoided nuclear conflict several times, and it has mostly been a question of luck that we have avoided the apocalypse. The downside of all this is that any use of the word ‘nuclear’ now conjures images of death and destruction – a shame, because nuclear energy could be our salvation in so many ways.
Additionally, we need to remember what fear is as a civilisation – fear in its most existential form. We have become too wedded to the belief that civilisation is permanent. We assume that this world and our society will always be here, when the reality is that all of it could be wiped out within a generation.
As dark as this sounds, we need bad things to happen, so that we can understand and appreciate the good that we do have, and so that good things might occur in the future. Car crashes need to happen, so we can learn to appreciate why we have seatbelts. We need people to remember why we fear things to ensure we do everything in our power to avoid such things from ever happening again.
Oppenheimer knew and understood this. Contrary to the memes, he knew what he had created and it haunted him till the end of his days. Oppenheimer mirrors Alfred Nobel and his invention of dynamite, albeit burdened with a far greater sense of dread.
I hope that with the release of Oppenheimer, we can truly begin to go back to understanding what nuclear weapons (and nuclear war) mean for us as a species. The fear that everything that has ever been built and conceived could be annihilated in one act.
We have become the gods of old; we can cause the earth to quake and great floods to occur and we must accept the responsibility that comes with this power now. We need to fear this power once more, especially our pathetic excuse for leadership.
In addition to fear, Oppenheimer will (hopefully) reintroduce BSR into our cultural zeitgeist – the noble quest of discovery and research. BSR can be defined as research without a clearly defined goal or immediately apparent real-world applications.
As I mentioned earlier, whilst the Manhattan Project was not a pure example of BSR, it gave scientists more freedom to pursue long-term, ‘high risk, high reward’ research, leading to a very significant breakthrough.
We need to understand the power of BSR. Moving forward, we must utilise its benefits to craft solutions to our major problems.
I would encourage everyone to read two pieces by Vannevar Bush: ‘Science, the Endless Frontier’, a government report, and ‘As We May Think’, an essay.
In both pieces, he makes a good argument for re-examining how we understand scientific development and research and calls for governmental support in such research. Ultimately, Bush’s work led to the creation of the National Science Foundation.
Government support for research and development played a vital role in successfully creating nuclear weapons before either the German or Japanese programmes could do so.
I believe it was Eric Weinstein who stated that the Manhattan Project was not really a physics achievement but rather an engineering one. Without taking away from the work of the theorists who worked on the project, I would argue that Weinstein is largely correct. I would add, however, that it was also a governmental (or ‘human’) achievement, built on the phenomenal work of various government-supported experimentalists.
The success of the Manhattan Project was built on several core conditions. Firstly, the project was launched by a major drive from a small group of highly intelligent and capable people (a start-up mentality). Secondly, it enjoyed full government support to achieve a particular goal. Thirdly, it was afforded near-unlimited resources by the government. Fourthly, the best minds were concentrated entirely on a single project.
These conditions mirror many of the tenets of BSR: great emphasis on government support, unlimited resources and manpower, and complete concentration on achieving a specific target. Under these conditions, we can see what great science looks like and how we might go back to achieving it.
Christopher Nolan has slightly over three hours to see if he can continue to make his mark on cinema and leave more than a respectable filmography in his wake. If he does, let’s hope it redirects our culture away from merely good science and back towards the pursuit of great civilisational achievements – something that has always involved, as a man with a blog once said, ‘weirdos and misfits with odd skills’.