Josh Allan

The Campaign for Scottish Independence is back to Square One

‘Within the next five years, in one form or another, break-up is likely to come about’. These are not the reflections of an unhappily married man, but a pronouncement on the fate of Great Britain, as prophesied by none other than Tom Nairn, widely considered the intellectual flagbearer for Scottish nationalism up until his death last year. Most notable for his book The Break-up of Britain, which foretells the dissolution of the United Kingdom as a consequence of imperial decline, Nairn, delivering one of his final interviews, felt confident enough to declare that the hour was finally at hand, despite being ‘usually cautious about predicting timelines’.

Nairn wasn’t alone in his prediction. According to a Survation poll, when asked ‘How likely or unlikely do you think it is that there will be another referendum on Scottish independence in the next five years?’, 65% of Scots were of the opinion that it was either ‘Very likely’ or ‘Quite likely’. Only 25% answered ‘Quite unlikely’ or ‘Very unlikely’. The 65% may indeed be correct; but if any such referendum is to take place within the given timeframe, Scottish nationalists will have to pray for a miracle to happen within the next week or so, seeing as the survey in question was conducted all the way back in October 2019. Nairn’s five-year prediction was made in 2020, which affords it slightly more leeway. Yet the odds that the Scottish National Party, which only months ago suffered a crushing defeat at the ballot box, will be able to persuade the British government to grant it a second vote by next year are slim to none. The truth is that the nationalist cause hasn’t looked more hopeless since the result of the original Scottish Independence referendum in 2014.

Even that episode, in which the UK faced its first truly existential threat since England and Scotland were united in 1707, ended on a modestly optimistic note for separatists. While the result of the vote was 55-45 in favour of Scotland remaining in the UK, the fact that Alex Salmond, the leader of the Scottish National Party, had been able to secure a referendum from the British government in the first place – and that the ‘Yes’ side, as the pro-Independence side became known, had come so close to winning – was taken by many as a sure sign that the Union was on its last legs. Worse still for the ‘No’ side, the fact that Independence was more popular among younger voters seemed to suggest that the end of the UK was not only inevitable but imminent. Led by liberal activists whose passionate appeals invoked the rhetoric of earlier Independence movements in former British colonies, the nationalists’ quest for freedom from the English yoke became an end goal for progressives; it was merely a matter of time before the pendulum swung left, thereby slicing in half the nation-state that birthed the modern world. 

Any doubts anyone may have harboured about the impending demise of the UK were surely dispelled by the political turbulence that followed the referendum – the general election a year later, the other referendum the year after that, if not the snap election the year after that – which to onlookers at home and abroad resembled the death throes of a spent entity. In particular, amidst the chaos that occurred in the wake of the UK’s vote to leave the European Union in 2016, the only thing anyone seemed able to predict with any certainty was that Scotland – which had voted 62-38 to remain in the EU – would jump ship at the soonest opportunity. (Around a third of Scottish voters even believed that there would be another referendum before the UK completed the Brexit proceedings.)

Indeed, for many unionists, the ostensible advantages of remaining within the European Union proved a deciding factor in their ‘No’ vote: a stronger economy, representation at the European Parliament, and easier travel to the continent. As part of the UK, Scotland was by extension part of the EU, and a newly independent Scotland would surely struggle to gain re-entry to a bloc which mandates a 3% deficit ratio for all member states, given that the nation’s deficit hovered around 10%. But the UK’s vote to leave the EU put a knot in that theoretical chain of consequences. Remaining in the UK became a guarantee that Scotland would be cut off from the EU, while Independence under the stewardship of the passionately pro-Europe Scottish National Party at least allowed for the possibility, however scant, of being ‘welcomed back into the EU with open arms as an independent country’, in the words of one MP.

Against this backdrop, even the most optimistic unionist would not have expected the status quo to hold. And yet, 10 years on from the 2014 referendum, Independence polls produce, on average, the same 55-45 split. So what happened?

As it turned out, while Britain’s decision to leave the EU predictably shored up support for the SNP, it also complicated the logistics of Independence. The smallest matryoshka in the set, Scotland would find itself not only isolated but vulnerable, separated by a hard border from its only contiguous neighbour. The country would have to decide between a customs union with England, Wales and Northern Ireland (which collectively constitute the greater part of Scotland’s trade) or with the much larger, but more distant, EU.

And while a number of sometime unionists found themselves suddenly on the ‘Yes’ side of the debate – including a contingent of high-profile figures, from the actors Ewan McGregor and John Hannah to the writers Andrew O’Hagan and John Burnside – a number of those who had voted ‘No’ in 2014 now found themselves siding with the so-called ‘Little Englanders’. These Eurosceptic defectors were broadly composed of what the commentator David Goodhart would go on to classify as ‘somewheres’: patriotic Scots, typically older and poorer, and defined by a profound attachment to the place they call home, as opposed to the cosmopolitan aloofness of ‘anywheres’. An understudied but significant section of society, Scottish ‘somewheres’ have been instrumental in preventing separatism from gaining a majority in the almost weekly polls, considering how many elderly unionist voters have died in the last ten years, and how many overwhelmingly pro-Independence Gen-Z voters have aged into the electorate.

Then there was COVID. As with every other political matter, the arrival of the coronavirus pandemic in early 2020 dramatically changed the nature of the Independence debate. Since pandemic response was a devolved matter, the four home nations took a competitive approach to dealing with the virus. While Prime Minister Boris Johnson was initially reluctant to impose lockdowns, First Minister Nicola Sturgeon lost no time in employing comparatively draconian measures, such as mask mandates and severe quarantine regulations. As a result, Scotland was able to boast a considerably lower death rate than England. Between January 2020 and June 2021, excess deaths in Scotland were only around 3% higher than average, compared with England, where they were 6% higher. This discrepancy suggested to many that Scotland was not only capable of handling its own affairs, but that government under the SNP was safer and more effective than direct rule from Westminster. For the first time, Independence polls consistently suggested that the ‘Yes’ side would win a hypothetical referendum.

But the momentum didn’t last. As the months passed, and particularly as Johnson was coerced into a stricter COVID policy, adopting many of the SNP’s own strategies, the gap in hospitalisation and death rates between each of the home nations narrowed to a pinpoint. Moreover, the UK had developed its own vaccine, and thanks to opting out of the EU’s vaccine rollout was able to conduct its vaccination campaign far more rapidly than any other European country.

From there it was all downhill for the SNP. In the summer of 2021, it transpired that the party had been misallocating donations and routinely lying about membership figures. Nicola Sturgeon, who had replaced Salmond as First Minister after the 2014 referendum, resigned in February 2023, citing gridlock around the Independence question. A month later Peter Murrell, Sturgeon’s husband and chief executive of the SNP, was arrested as part of a police investigation into the party’s finances, prompting the SNP’s auditors to resign as well.

Sturgeon was replaced as party leader by Humza Yousaf. But while the former First Minister had been able to command respect even from her adversaries, Yousaf, who lacked Sturgeon’s charisma and the prestige that comes with an established reputation, proved a far more divisive leader at a time when the party was crying out for unity. The fact that Yousaf only narrowly won the leadership contest, after his openly Presbyterian opponent had been slandered by members of her own coalition for espousing ‘extreme religious views’, did little to endear him to the Christian wing of his party. Perhaps most controversially, his decision to carry forward the Gender Recognition Reform Bill – which made it possible to change one’s legal gender on a whim by removing the requirement of a medical diagnosis – provoked derision from a public which still falls, for the most part, on the socially conservative side of the fence. Nor were progressives much impressed by his decision to end the SNP’s power-sharing agreement with the Scottish Greens.

Yousaf resigned after thirteen months in office and was replaced by John Swinney, one of the more moderate senior members of the party, but the damage to the SNP’s credibility had already been done. With less than two months to go until the UK-wide general election, Swinney had his work cut out for him when it came to convincing Scots to continue to put their trust in the scandal-ridden SNP. Independence polling returned to pre-COVID levels. For nationalists, however, such polls were less germane to the pursuit of a second referendum than the election polls, given that keeping a pro-Independence party at the helm in Holyrood was a prerequisite for secession. As long as the SNP commanded a majority of seats, there was a democratic case to be made for holding another referendum. This had been Sturgeon’s argument during the 2021 Scottish Parliament election campaign, which was, in her eyes, a ‘de facto referendum’. If the SNP were to end up with a majority of seats, she claimed, the party would have the permission of the Scottish people to begin Independence proceedings. (As it happened, they went on to win sixty-four seats – one short of a majority.)

The SNP held fewer seats in the House of Commons than in Holyrood, but the nature of Britain’s first-past-the-post electoral system meant they still enjoyed considerable overrepresentation. Still, if possessing forty-eight seats at Westminster was not enough to secure a referendum from the British government, then plummeting to a meagre nine seats in the 2024 general election killed the prospect stone dead. As the Labour party made sweeping gains in Scotland and the UK more widely, the SNP suffered the worst defeat in its 90-year history. In a speech following the election, a dour-faced Swinney acknowledged the need ‘to accept that we failed to convince people of the urgency of independence in this election campaign.’ The party ‘need[ed] to be healed and it need[ed] to heal its relationship with the people of Scotland’.

It is difficult to take note of something that isn’t there, which is perhaps why, as Ian MacWhirter noted in a piece for UnHerd, ‘It doesn’t seem to have fully dawned on the UK political establishment that the break-up of Britain, which seemed a real possibility only a few years ago, has evaporated’. The topic of Independence now rarely features in the news, and some of the biggest names associated with Scottish nationalism – not only Nairn and Burnside, but also Alasdair Gray, Sean Connery and Winifred Ewing – have passed away in the decade since the referendum. Meanwhile, younger Scots are turning away from nationalism in general, an unforeseen phenomenon which MacWhirter puts down to the fallout from Brexit, the Covid-19 pandemic and the war in Ukraine, a sequence of geopolitical shocks that has many Scots wondering whether the campaign to sacrifice the nation’s economy and security on the altar of identity might well be recklessly indulgent. In MacWhirter’s words, ‘The Union is probably safer now than at any time since the Jacobites waved their claymores 300 years ago.’

If the SNP hope to bring the dream of Independence back to life, it will not be enough to rely on generational shift and the goodwill of the British government. An uphill struggle awaits the Independence movement: extensive repairs to the party’s image, clarity on issues such as currency, retention of the monarchy and the route to EU membership, and – perhaps hardest of all – a clear case as to why Scotland would be better off as an independent nation. Until then, the Union looks set to enjoy a new lease of life.



Mania by Lionel Shriver (Book Review)

Lionel Shriver’s latest novel, Mania, imagines a world in which the concept of intelligence has become taboo. ‘Dumb’, ‘stupid’, ‘moronic’ and every other synonym that might adequately describe the mentally deficient have become unspeakable terms of offence, while IQ tests and entrance exams alike are outlawed on the grounds of elitism. Idiots are not a protected class, however, because the prevailing ideology posits that idiots simply don’t exist. In this egalitarian utopia, everyone is equally smart. To suggest anything to the contrary is to commit a hate crime punishable by professional ruin and social ostracism.

If this all sounds familiar, it’s because Mania is a pointed parody of the socio-political logic of what Shriver, in a recent piece for UnHerd, described as the ‘collective crazes’ of the last decade: transgenderism, #MeToo, Covid lockdowns and Black Lives Matter. Her journalism has tackled each of these movements individually and collectively, but Mania is her first work of fiction to deal with the twin forces of political correctness and cancel culture head on. It’s perhaps worth pointing out that an earlier novel, The Motion of the Body Through Space, featured as part of its subplot a diversity hire whose incompetence leads to the breakdown of the transport system in Hudson, New York – which landed Shriver in hot water during the book’s promotional tour. But critics will struggle to condemn Mania as offensive. For while the novel is implicitly critical of radical progressive politics, the Mental Parity movement is a squarely fictional creation. Even in the fragile political climate of 2024, the foolish remain fair game as an object of ridicule.

Mania’s characters are recognisable archetypes of any cowed and paranoid society. Plucky, witty and dangerously opinionated, Pearson Converse is one of Shriver’s most autobiographical protagonists, mirroring everything from the author’s overbearingly religious upbringing to the rebellious mentality it imprinted on her. Her defiance in the face of the Mental Parity movement makes Pearson a black sheep in polite society, but stems from a desire to protect her two eldest children, a pair of prodigies who in any other age would have a bright future lined up for them. It is the third child, Lucy, who, having grown up in an age in which Mental Parity has become the mainstream, constitutes an unlikely antagonist, blackmailing her mother and policing her language and behaviour. It is telling that Lucy’s ideological and cognitive equivalents throughout Mania are the teachers, politicians and television presenters, and that perhaps the only other thing they have in common is an unmerited power over those who dare to speak out.

But the real conflict that rages like a dynamo from Mania’s first pages to its dramatic conclusion is more nuanced, more complicated than a simple black-and-white battle between critical-thinking Davids and knuckle-dragging Goliaths. Despite Pearson’s career as a university professor, the book focuses less on the shadowy cabal of academics pulling the strings of Mental Parity than on those who are complicit with the regime, or merely undecided. It is complacency that drives a wedge between Pearson and her comparatively apolitical husband, Wade, whom she accuses of ‘sit[ting] this whole thing out on the sidelines, watching, or declining to watch.’ Far more sinister is the character of Emory, Pearson’s lifelong pal, whose position is not neutral but ambiguous. What makes Emory particularly villainous is not that she is a believer, but that she is a non-believer, prepared to manipulate the burgeoning climate of paranoia for her own gain, advancing her career as a talk-show host by producing disingenuous op-eds on microaggressions or thought crimes and thereby embodying, by Pearson’s account, ‘the intelligent face of stupid’.

As Emory rides the coattails of this movement, Pearson’s own career – not to mention her family life and reputation – begins to spiral. Her first brush with the tyrannical power of Mental Parity comes when she assigns her literature class a novel that the self-anointed censors have exorcised from the Western canon. The scene is reminiscent of the opening of last year’s American Fiction, in which Monk, a black professor, writes on the class blackboard the name of a Flannery O’Connor story, only for a blue-haired white girl to object that she finds the title – ‘The Artificial Nigger’ – offensive. Monk is laid off from his job as a consequence. Pearson doesn’t quite lose her job for assigning Dostoevsky’s The Idiot to her class, but the stunt earns her the resentment of colleagues and students both, as well as a stern warning. What leads to her eventual dismissal is her later deployment of the word ‘retard’ during a tirade in class. Typically, the scene is filmed by every student in the class and uploaded to the internet.

Pearson is not even safe within her own home, which she considers a sanctuary of normality – only for Lucy to report her to social services. As a result, Pearson is required to take a six-week Cerebral Acceptance and Semantic Sensitivity class, with the aim of weeding out elitist language from her vocabulary: 

Considering that ‘grasp’ could convey mastery some people lacked, we should instead ‘grip’ or ‘seize’ our coffee mugs. ‘Command’ could also mean an unjustifiable sense of intellectual dominion, so in a position of authority we should issue an ‘edict’ or ‘direction’. Admiring classifications such as ‘savvy’, ‘scholarly’ and ‘erudite’ couldn’t help but imply the existence of benighted characters who exhibited none of these qualities, so if we were hell-bent on acclaiming colleagues, we should keep to wholesome, simple – sorry, uncomplicated – compliments such as ‘I like you’ or ‘That is good.’

If the attempt to jettison every contaminated word in the English language seems like overkill, recall the institutional scramble only a couple of years ago, in which colleges across America issued ‘harmful language’ lists to students, singling out problematic obscenities such as ‘field’, ‘blackboard’, ‘straight’, ‘American’ and – you guessed it – ‘stupid’. Shriver herself conducted a highly entertaining takedown of this phenomenon for the Spectator. One gets the sense that this sterile dumbing down of the English language is what irks her most, since the straitjacket of minimally offensive newspeak could not be further from the vibrancy and elasticity of the author’s own style. The unfortunate fact for her enemies is that Shriver is one of the most capable writers around. Her insights are profound and her prose is lucid, every sentence an immaculately crafted marvel of colloquial lyricism.

There is a disconcerting familiarity to the events of Mania, which echo some of the more maddening episodes of the last few years. From Sherlock to Columbo, films and TV shows which are seen to promote the notion of ‘cleverness’ are taken off air and removed from circulation. And a campaign to rename the city of Voltaire gains traction, since the views espoused by the author of Candide are no longer in step with those of its residents. 

In a conversational aside we learn that the rest of the world thinks the West has lost its marbles. It’s clear that Shriver has borrowed liberally from the events and controversies that have defined the zeitgeist, but Mental Parity is a creation all her own. Indeed, the titular mania is such a powerful force that it has the effect of sidelining all other social justice movements. Anders Breivik receives public sympathy after murdering 69 members of the Norwegian Workers’ Youth League for exhibiting ‘less than spectacular intelligence’. Not only is the concept of Islamophobia absent from political discourse, but Western society’s fascination with race itself has become blessedly passé – to President Obama’s detriment. ‘Nobody gives a crap anymore about his being a black president,’ Emory states, when the Mental Parity movement is still in its infancy. ‘He’s a know-it-all president. It’s death.’ His replacement is the ‘impressively unimpressive’ Joe Biden, acclaimed for his ‘delectably leaden’ speaking style. But when even the doddering ineptitude of a potentially demented president proves insufficient to satisfy voters, the Democrats find a new champion in the form of Donald Trump. Across the pond, meanwhile, the UK’s decision to leave the EU becomes a win for progressivism, given the tendency of many Remainers to demonise Brexiteers as stupid.

The good thing is that this imagined mania is so much worse – and therefore more entertaining – than any of the real manias currently afflicting the Western world. Thanks to the Mental Parity movement, food produced in the US is no longer safe to eat, nearly all fatalities in the armed forces are caused by friendly fire and a brain drain has left America stunted, handing China and Russia the keys to world domination. 

But while Mania is funny, razor-sharp and extremely readable, it’s also eerily realistic. For the seeds of Mental Parity may already have been sown, and not just in the soil surrounding the R-word. Universities are increasingly eschewing standardised examinations, while columnists wage war against the very idea of meritocracy. What’s more, in a further affront to the English language, last month it was announced that a new version of Scrabble was being released with simplified rules, in order to make the game ‘more accessible for anyone who finds word games intimidating’. If Lionel Shriver’s alternative history becomes the actual future, this fine novel will be the first on the chopping block. Read it while you still can.



What is the Point of the Turner Prize?

Picture the scene. Strings of tattered bunting, the concrete shaft of a half-built pillar. At the centre of it all sits a pile of red and black folders, surmounted by a pair of flagpoles bearing faded Union Jacks. A length of striped tape lies beside them on the floor like the shed integument of a snake, and everywhere you look you see road barriers, twisted, contorted, lopsided. If it weren’t for the fact that the setting is the Towner Eastbourne art gallery, you’d think a car had crashed through it. And you probably wouldn’t blame the driver.

This isn’t the aftermath of a riot or the contents of a disorganised storage room. In fact it is Jesse Darling’s winning submission for the 2023 Turner Prize, one of the world’s most prestigious art awards. The prize was established to honour the ‘innovative and controversial’ works of J.M.W. Turner, although in the thirty-nine years since its inauguration, none of the winning submissions have evoked the sublime beauty of Turner’s paintings.

There are no facts when it comes to art, only opinions. The judges, who lauded the exhibition for ‘unsettl[ing] perceived notions of labour, class, Britishness and power’, seem to have glimpsed something profound beyond the shallow display of metal and tape. Or they may simply have considered it the least worst submission in a shortlist which included some accomplished but otherwise unremarkable charcoal sketches, an oratorio about the COVID-19 pandemic and a few pipes.

It may be that beauty lies in the eye of the beholder. But there is a distinction to be made between works of art whose beauty is universally accepted, and those which fail to find acclaim beyond a small demographic of urban, middle-class bohemians. According to a YouGov poll, an unsurprising 97% of the British public consider the Mona Lisa to be a work of art. That figure drops to 78% for Picasso’s Guernica, 41% for Jackson Pollock’s Number 5, and just 12% for Tracey Emin’s My Bed. That 12% of society, however, represents those among us most likely to work in art galleries and institutions, and to hold the most latitudinarian definition of ‘art’.

For ordinary people, the chief criterion for art is, and always has been, beauty. But in the 20th century the art world, like the other humanities, succumbed to the nebulous web of ‘discourse’, with a corresponding shift away from aesthetic merit and towards political ends. Pieces like Marcel Duchamp’s Fountain – an upturned porcelain urinal – proved that works of art could shoot to fame precisely on account of their capacity to disturb and agitate audiences. As the philosopher Roger Scruton described it, ‘According to many critics writing today a work of art justifies itself by announcing itself as a visitor from the future. The value of art is a shock value.’ That shock, fear and revulsion create more powerful reactions than the joy, calm or awe one feels when looking at a Rossetti or a Caravaggio is an unfortunate fact of human nature, and it remains as true today as it was a hundred years ago. In much the same way that a news story about declining global poverty rates or deaths from malaria will receive less attention than one about melting ice caps or rising CO2 emissions, a truly beautiful artwork will receive less attention in the media than something which irks, irritates and offends.

In the opening chapter of The Picture of Dorian Gray, when Basil Hallward reveals the eponymous portrait to his friend Lord Henry, he confesses to feeling reservations about exhibiting the work despite it being, in Henry’s estimation, his masterpiece. ‘There is only one thing in life worse than being talked about,’ Lord Henry reprimands him, ‘and that is not being talked about.’ The sentiment is triply true in the age of social media. Indeed, the term ‘ragebait’ started appearing online in the months following Mark Leckey’s winning submission for the Turner Prize in 2008, an exhibition which featured a glow-in-the-dark stick figure and a naked mannequin on a toilet. As with Dorian Gray’s portrait, the more the art world thirsts for attention, the more hideous the art itself becomes.

The quickest route to attention is politics. At the Turner Prize award ceremony, Darling pulled a Palestinian flag from his pocket, ‘Because there’s a genocide going on and I wanted to say something about it on the BBC.’ In his acceptance speech, he lambasted the late Margaret Thatcher for ‘pav[ing] the way for the greatest trick the Tories ever played, which is to convince working people in Britain that studying, self expression and what the broadsheet supplements describe as “culture” is only for certain people in Britain from certain socio-economic backgrounds. I just want to say don’t buy in, it’s for everyone’. The irony is that all the money in the world wouldn’t fix the problems currently afflicting the art scene. If the custodians of modern art want to democratise their vocation, and make culture available to ordinary people, they should follow the example of Turner – and produce something worth looking at.

