The Reality of Degree Regret 

It is now graduation season, when approximately 800,000 (mostly) young people up and down the country decide for once in their lives that it is worth dressing smartly and donning a cap and gown so that they can walk across a stage at their university, have their hands clasped by a ceremonial ‘academic’, and take photos with their parents. Graduation looked a little different for me as a married woman who still lives in my university city, but the concept remains the same. Graduates are encouraged to celebrate the start of their working lives by continuing in the exact same way that they have lived for the prior 21 years: by drinking, partying, and ‘doing what you love’ rather than taking responsibility for continuing their family’s and country’s legacy.

However, something I have noticed this year which contrasts with previous years is that graduates are starting to be a lot more honest about the reality of degree regret. For now, this sentiment is largely contained in semi-sarcastic social media posts and anonymous surveys, but I consider it a victory that the cult of education is slowly but surely starting to be criticised. CNBC found that in the US (where just over 50% of working-age people have a degree), a shocking 44% of job-seekers regret their degrees. Unsurprisingly, journalism, sociology, and liberal arts are the most regretted degrees (and lead to the lowest-paying jobs). A majority of jobseekers with degrees in these subjects said that if they could go back, they would study a different subject such as computer science or business. Even in the least regretted majors (computer science and engineering), only around 70% said that they would do the same degree if they could start again. Given that CNBC is hardly a network known to challenge prevailing narratives, we can assume that in reality the numbers are probably slightly higher.

A 2020 article detailed how Sixth Form and College students feel pressured to go to university, and found that 65% of graduates regret it. 47% said that they were not aware of the option of pursuing a degree apprenticeship, which demonstrates a staggering lack of information. Given how seriously educational institutions supposedly take their duty to prepare young people for their future, this appears to be a significant failure. Parental pressure is also a significant factor: 20% said that they did not believe their parents would have been supportive had they chosen an alternative such as a degree apprenticeship, apprenticeship, or work. This is understandable given that for our parents’ generation a degree truly was a mark of prestige and a ticket to the middle class, but due to credential inflation this is no longer the case. If anything, these students were wrong only on the matter of scale, as a survey of parents found that as many as 40% had a negative attitude towards alternative paths.

Reading this, you may think that I am totally against the idea of a university being a place to learn gloriously useless subjects for the sake of advancing knowledge that may in some very unlikely situations become useful to mankind. I am not: universities should be a place to conceptualise new ways the world could be, and a place where the best minds from around the world gather to genuinely push the frontiers of knowledge forward. What I object to is the idea that universities should be a 3-year holiday from the real world and from responsibilities towards family and community, a place to ‘find oneself’ rather than to find meaning in the outer world, a dating club, or a tool for social mobility. Nor do I object to taxpayer funding for research, provided it passes a meaningful evaluation of value for money and is not automatically covered under the cultish idea that any investment in education is inherently good.

In order to avoid the epidemic of degree regret that we are currently facing, we need to hugely reduce the number of students admitted to the most regretted courses. This is not with the aim of killing off said subjects, but of enhancing the education available to those remaining: surrounded by peers who genuinely share their interest, they will be able to derive more benefit from more advanced teaching and smaller classes. Additionally, we need to stop filling the gaps in our technical workforce with immigration and instead increase the number of academic and vocational training placements in fields such as computer science and engineering. As for the negative attitudes I described above, these will largely be fixed as the millennial generation, filled with degree regret, comes to occupy senior positions and reduces the stigma of not being a graduate within the workplace. By being honest about the nature of tomorrow’s job market, we can stop children from growing up thinking that walking across the stage in a gown guarantees a lifetime of prosperity.

On a rare personal note, having my hands clasped in congratulations for having wasted three years of my life did not feel like an achievement. It felt like an embarrassment to have to admit that four years ago, when I filled out UCAS applications to study politics, I was taken for a fool. I have not had my pre-existing biases challenged and my understanding of the world around me transformed by my degree, as promised. As an 18-year-old going into university, I knew that my criticisms of the world around me were ‘wrong’, and I was hoping that an education/indoctrination would ‘fix’ me. Obviously, given that three years later I am writing for the Mallard, this is not the case; all I have realised from my time here is that there are others out there, and my thoughts never needed fixing.



The Right in Academia and Politics

As a student at university, it’s easy to be aware that academia is dominated by the left; after all, it is the voices on the left we hear the most. Added to this is a Conservative Party that does not look very conservative at the moment and almost seems out of ideas – just take a look at the Conservative Party conference agendas for 2022 and 2023. But over the summer, two academic conferences of note took place, which should bring a glimmer of hope to conservative students.

The first, held at Churchill College at the University of Cambridge from the 6th to 7th July 2023, was on British Intellectual Conservatism: Past and Present. It was organised by ResPublica and the University of Public Service. The second, held in the House of Lords from the 14th to 15th September 2023, was on Margaret Thatcher: Her Life, Work, and Legacy. It had been organised by two research centres at the University of Hull: the Centre for Legislative Studies, led by Lord Norton of Louth, and the Centre for British Politics, led by Dr. Matt Beech.

The conferences naturally had different focuses, but as a participant at both – and having had time to reflect on them – I found four things in common. Both were full of enriching academic thought, were thought-provoking, provided a space to be reflective, and prompted thinking ahead to the future. In the current climate, when it looks as though the Conservative Party will be unsuccessful at the 2024 General Election, both conferences highlighted the need for a better vision.

The two conferences, each in its own way, provided a means to push back against the narrative that the right is out of ideas. The conference on British Intellectual Conservatism: Past and Present consisted of several panels, ranging from Conservatism Today to Free Speech and Conservatism. There were also two panels dedicated to two of the great leaders of the Conservative and Unionist Party: one on the Age of Churchill, another on the Age of Thatcher. All in all, the conference did exactly what its name said it would. A key focus was the work of Roger Scruton, bringing his ideas, which may have been forgotten, back to the forefront. There is much to be learnt from this conference.

At the conference on Margaret Thatcher, many ideas were shared. The main takeaway was that politicians today do not have a long-term vision. Many who praise Liz Truss and her allies say “she did what Thatcher did”, but what they fail to recognise and remember is that Thatcher spent many years developing her ideas with a team before those ideas became policy.

There are lessons to be learnt from the conferences. People, whatever their role in politics – whether they work in academia or policy, or aspire to be an elected representative – need to take a step back. There are many great people we can learn from, but the problem with the world today is that everyone is looking for the next great thing. The rivers of free-flowing conversation of ideas between conservative academics and politicians need to be opened up before anything else can happen.



The Dualism of Contemporary Archaeology

Time Team was an amazing programme. It was educational yet accessible, undeniably British and true to its discipline. Really, it was everything one could wish for from a television programme. Beyond the screens, it was just as successful in blasting a hole into the ivory tower of academic archaeology, the programme’s lack of gatekeeping and obfuscation opening a realm previously exclusive to university departments.

In 2006, its presenter claimed Time Team had published more reports on its excavations in the past decade than every university archaeology department combined, whilst criticising the shortcomings of such activity (or the relative lack thereof) within the academic establishment. From the passion of the assembled historians and archaeologists in each episode, it is not difficult to believe this may have been the case.

Broadcaster politics and the pursuit of demographic ‘relevance’ destroyed this British staple rather swiftly in the early 2010s. In retrospect, it was the last hurrah for British archaeology before it wholly sank into the same cultural strife engulfing the rest of modern academia. However, it would be unwise to discard contemporary archaeological activities as nonchalantly as one might do with the rest of the humanities. Due to its fundamental characteristics, a unique dualism now exists within the discipline which merits some attention.

Readers will likely be able to guess many of the negative consequences contemporary culture has imparted onto academic archaeology, chiefly since they are the same as in other humanities subjects. These are focussed on the interpretations made beyond excavations in published writing, thus in the part of the discipline which is closest to modern history’s general malfunctioning. Both disciplines suffer from a homogenous progressive politics amongst academics, so generally hegemonic outcomes of that sort are all but guaranteed for the foreseeable future.

This renders discussion of certain historical subjects completely taboo, even when the tangible evidence is revealed by archaeologists, lest academics be seen to support a supposedly ‘toxic’ mindset about the past or something similarly in contravention of their worldviews. History is neither a story of progress, nor a proof of progressive values’ precedence and inevitability, but such facts are amongst those ignored by a paradigm of deconstruction.

Perhaps more disturbingly, for people of any political persuasion, archaeology can prove civilisations are not immortal and can fall in a wave of decisive violence in the right conditions. The anonymous archaeologist Stone Age Herbalist covered the flaws of contemporary academic archaeology in greater detail in an article for UnHerd last year, which I certainly recommend.

As for the other side of contemporary archaeology, the discipline remains defined by recovering tangible evidence of the past from nature, in other words a bedrock of empirical objectivity. If one puts aside the declining rigour of historical interpretations, the basic role of an archaeologist remains vital for our understanding of the past and its transmission to future generations. The past archaeologists excavate is less ‘living’ than, for instance, an extant Tudor manor, but it still has ample potential to animate the mind about how our ancestors once lived, and it contributes a great deal to our verifiable knowledge of history.

A couple of examples worth praising are in order. First, the ancient city of Pompeii should be at least vaguely known to all readers. After all, it is one of the most impressive archaeological sites in the world, accompanied by the allure of its dramatic demise to Mount Vesuvius in 79 AD. About a third of the city remains unexcavated, but is generally off-limits from further work in favour of extensively conserving previously unearthed buildings.

However, excavations restarted over the last few years in areas last dug over a century ago have brought forth a wealth of new discoveries, and digs on a new insula (city block) to relieve pressure on exposed walls have been widely reported for a fresco depicting something reminiscent of a pizza. The story of Pompeii is only growing richer as a result of this new archaeology. Second, far fewer readers will know about cuneiform, let alone be able to read it. Globally, only a few hundred people can competently translate the oldest known form of writing, used by the Sumerians and Akkadians, whereas archaeologists have found some half a million clay tablets inscribed in cuneiform. Digitisation projects in recent years have aided the accessibility of these artefacts, but the scarcity of scholars yet hampers our recovery of that past.

Therefore, a paper published in May discussing a new project to translate Akkadian with neural machine translation might revolutionise our knowledge of ancient Mesopotamia. In essence, it uses similar technology to Google Translate to render Akkadian cuneiform as meaningful English phrases or sentences. Accuracy is far from ideal, as formulaic texts are translated with some skill by the program whereas literary texts are practically out of the question, but the prospect of substantial usefulness in the future exists without making Akkadian scholars redundant.

What does all this tell us? Contemporary archaeology derives an ultimately negative trajectory from its academic overseers, but this is indistinguishable from the rest of academia in the current deconstructive paradigm. Whilst the discipline is buttressed by an inherent tangibility and objectivity which still produces new discoveries, it should not be overlooked when discussing a restoration of academia towards renewed sense. Indeed, such a project will start from a better basis in archaeology than any other subject in the humanities.

I severely doubt archaeologists will begin grinding into dust artefacts which reinforce uncomfortable facts about human nature, but similar effects can be achieved more subtly by the custodial institutions. Archives and museums increasingly see their collections through a far more reductive and essentially monochromatic lens, trading in absolutes with all the nuance of a medieval executioner. These institutions have developed culturally modish tactics to bypass their natural approach to tangible history and enable presentism to abound, removal to storage being the bluntest. One should not need to remind readers how much poorer the world will be in terms of future maintained knowledge the longer this persists.

We are almost lucky contemporary archaeology possesses characteristics that cannot let transient perspectives fully dominate. They may become the only tools left in the humanities to fight the entropies of this age, until such time as we remake the others ourselves.


