My idea of happiness

A poem made out of the headlines to Adrian Chiles’s articles in the Guardian

I recently saw something

in a petrol station toilet

Southbound on the M1

that I can never unsee.

I spent an afternoon

writing my own name.

It was lovely

until I started overthinking it.

What is an app?

I honestly have no idea.

After a meeting that went on for hours,

I was finally told what it was all about.

I was being interviewed

for a job at MI5.

Do I really need to drink

almost 5 litres of water a day?

I haven’t got the bladder for it.

I almost downloaded a pebble-identifying app,

but some stones should be left unturned.

Would you pay £15,000 for a portrait of me?

Me neither.

My idea of happiness?

A strimmer and a bramble-choked path.

Gen Z and Me

This is a longer version of the piece I wrote on Generation Z for the London Review of Books:

On the walk from the car park to my university building sits a red telephone box, classic K6 model. The other day, out of curiosity, I pulled at its heavy cast iron door, stepped inside, and let the door thud behind me. It must be fifteen years, at least, since I last experienced that strange dampening of the sounds of the street and that smell of stale urine and old takeaways. For a moment, the phone box became a TARDIS and I was a homesick student ringing my parents again, harassed by the pips that demanded more coins and the lonely finality of the purring dial tone when it cut me off. It reminded me that I am a digital immigrant, raised in a clunkier, analogue age, when long-distance communication felt fragile, precarious, interruptible.

I was an early inhabitant of the online world. I remember using Netscape Navigator, one of the first web browsers, in a computer room at the University of Sussex in autumn 1994. I have been an Amazon customer (its website reminds me) since 1999, longer than most of the students I teach have been alive. With mobile devices, though, I was on the other side of the adoption curve. No message seemed so urgent to me that you had to carry round the mailbox in your pocket. I bowed to the inevitable in 2004, with one of those entry-level Nokias, all rounded plastic and chunky buttons. I still had it three years later when I saw that film of Steve Jobs at the Macworld Expo, showing the audience his new phone. No one will want to search the web on something as small and fiddly as that, I thought. It will never catch on.

Even now, I get so few messages on my garden-variety smartphone that I forget to charge it, or don’t look at it for days. Being mildly dyspraxic and very myopic, I prod at it ponderously and with great emphasis, as if expecting it not to respond. Often it punishes my lack of faith by disowning me, refusing to recognise my thumbprint and unlock itself. My students’ phones are often in their hand, usually on their person, and always within reach. They swipe, pinch and caress them like virtuosos.

Most of this year’s new crop of undergraduates were born between September 2003 and August 2004, the year Eats, Shoots & Leaves was published, the Hutton inquiry reported and Channel 4 aired the last episode of Friends. If you find this information as uncomputable as I do, you’re probably about my age. More significantly, these students were two or three years old when Jobs launched the iPhone. In Gen Z, Explained: The Art of Living in a Digital Age (University of Chicago Press), an anthropologist (Roberta Katz), a linguist (Sarah Ogilvie), a historian (Jane Shaw) and a sociologist (Linda Woodhead) try to understand this peer group of digital natives. They define Generation Z, also called Zoomers or post-millennials, as those born between 1995 and 2009. Even the oldest members of this group have no memory of a world without broadband.

The key fact for Katz et al. is that Gen Zers have had to navigate this new online reality without the aid of their mainly clueless elders, and have thus improvised their own rich and hard-to-penetrate subcultures. What they mostly like to do is collaborate in leaderless groups. They use digital tools to create shared documents, synch their calendars, write and read fan fiction, play games together, and organise real-world lift sharing, couch surfing and political activism.

They have devised an intricate language and etiquette for their online lives. They can quickly convey their pleasure or displeasure through memes – such as the ubiquitous Drake Yes/No meme, made up of two stills from the rapper’s video for ‘Hotline Bling’: in one he holds his hand up to his face in disgust, in the other he looks happy. They use emojis as ‘a softener and a social lubricant’, and bracket words with asterisks and tildes for emphasis and irony. Whether they write ‘k’ or ‘kk’ to mean ‘okay’ is charged with meaning. The first is purposely curt, especially if the sender has taken the trouble to override the default capitalisation, and still more so if they add a passive-aggressive full stop. The second is cheerful and casual, a no-sweat way to temper the brusqueness of the single letter.

These tonal shadings matter because post-millennials like to state clearly where they are coming from. Self-labelling, especially of fine-grained sexual and gendered identities, has become ‘an imperative that is impossible to escape’. They think it important to be themselves, to admit to their struggles and vulnerabilities, and to say what they mean. In the iGen Corpus, a digital data bank of seventy million words used by post-millennials and compiled by Ogilvie, words such as real, true, honest and fake occur far more often than in general language use.

The book’s findings are mostly based on interviews with students at three institutions: Stanford, Foothill Community College (a few miles from Stanford in northern California), and the University of Lancaster. In a world where so many things compete for their attention, these students worry about allocating their time efficiently. They dislike email, finding it laborious compared to texting and messaging. ‘If it’s a professor you don’t have a close relationship with, you have to say, hi professor whatever, I’m in your class or I’m interested in this blah blah blah,’ one student says. ‘You have to kind of frame it.’ Several of the students surveyed watch recorded lectures at triple speed – not just to save time, one of them says, but to help them concentrate.

All this is useful, if disconcerting, for a university teacher to get learnt. I was less convinced by the book’s basic premise: that the new technology so enculturates its young users that it has created entirely new ways of thinking and being. The book first emerged in a conversation between the authors on the Stanford campus in 2016, when they agreed that ‘incoming students were strikingly different from those from a few years before’. Gen Zers, they argue in their introduction, ‘are shaped by and encounter the world in a radically different way from those who know what life was like without the internet’.

The book’s title carries this sense of interpreting to non-initiates the behaviour of a separate tribe, albeit one whose habits are increasingly being adopted by other age groups. While I was reading it, a phrase of the child psychologist David Elkind’s sprang to mind: ‘cognitive aliens’. Elkind was discussing the work of the Swiss psychologist Jean Piaget, which revealed how contrarily young children see the world – believing, for instance, that the sun and moon follow them as they walk around, that anything that moves, from clouds to cars, is alive, and that dreams fly in through their window at night. For Elkind, Piaget’s work suggested that the main problem in education is communication. The child’s mind is not a tabula rasa but its own rival system for generating reality. Every middle-aged teacher has had a related fear that their students now dwell in an unreachable mental landscape. But the stories in Gen Z, Explained don’t always sustain its initial claim that post-millennials think and behave in very different ways.

Nearly all those interviewed for the book still say that their favourite mode of communication is ‘in person’. Every era thinks that its technology has changed everything utterly, but human instincts, after 300,000 years of evolution, must be pretty resilient. My students check their phones almost as often as they blink, but isn’t that just because we are inescapably social animals? I would check my phone all the time, too, if anyone ever sent me any messages.

Social networking and the smartphone do seem to have made young people more willing to make intimate feelings public. The students in Gen Z, Explained post pictures of their ‘depression meals’ (which can range from a comforting Deliveroo order to a mishmash of whatever food they can find) as a signal that they are feeling low. But they also make clear that this kind of sharing is made possible by distance. One interviewee says that he can post to strangers without ‘worrying that you’re adding some emotional toll to them … whereas your friends are sort of obligated to help you’. Post-millennials are perfectly aware of the boundaries between online and offline life; they just draw them in subtly different ways. A surprising finding in Gen Z, Explained is that it is now a common courtesy to ask permission from friends before posting a picture in which they appear. Those interviewed are also well-attuned to the paradox of having more voice than ever before online, while often feeling powerless IRL (‘in real life’) to change economic and political systems that seem ‘locked, inaccessible to them, and wrongheaded’.

Everyone looks like a maestro when they’re using technology you’re unversed in. If a time traveller from the 1990s arrived in the present, they would marvel at the effortless aplomb with which people of all ages manipulate their touchscreens, talk to their digital assistants and wave contactless cards at readers (unless they are Rishi Sunak, who finds the last one difficult). In a mere fifteen years, smartphones have become the central technology of daily life around the world. I had a colleague twenty years my senior who retired to Portugal and went wholly and impenitently offline, with not even a mobile number to reach him on. The audacity of it! In our age of hyper-connection, he might as well have sailed off the edge of the world.

In Generations: Does When You’re Born Shape Who You Are? (Atlantic Books), Bobby Duffy argues that our currently polarising discourse about generational difference is ‘a mixture of fabricated battles and tiresome clichés’. Duffy, director of the Policy Institute at King’s College London, likes to use an abundance of quantitative statistics and qualitative surveys to challenge common stereotypes and perceptions (his previous book was called The Perils of Perception). Generations thus avoids a charge that could be levelled at Gen Z, Explained – that its conclusions mostly rest on a selective and overeducated sample (although it does supplement its student interviews with a representative online survey of 2000 young adults in the US and UK). Duffy’s book is not as alive with anecdote and illustration as Gen Z, Explained, but it deploys a barrage of data to reveal the messier and more interesting reality behind popular myths.

It’s true, he says, that age has become more of a political dividing line, over issues such as Brexit, racial and gendered injustice and privilege, and climate change. It’s also true that social media’s silos can make it harder for generations to converse with each other across that line. Half of post-millennials use Snapchat, but only 1 per cent of baby boomers – although some apps, like Facebook and WhatsApp, do better at cutting across age divides. But our fractious politics and online squabbles have created a false impression of post-millennial woke warriors and baby-boomer reactionaries at war. Family links remain stronger than our links to our peers. Lockdown compliance among young people was high partly because they wanted to protect their parents and grandparents from the virus. If anything, as I see on open days and at graduations in the sweetly close bonds that students have with their parents, the generation gap has narrowed. Those interviewed in Gen Z, Explained say that they often call or message a parent – usually the mother – several times a day, or send them pictures, especially of meals.

Sociologists use three explanations for why people’s attitudes and behaviours change over time: period effects, lifecycle effects and cohort effects. Period effects are when change happens across all age groups, because of sweeping societal shifts. Lifecycle effects are when change happens because of the aging process, or in response to key events such as leaving home, becoming a parent or retiring. Cohort effects are when change happens because a generation is socialised by the same experiences. Duffy thinks that the current discussion of generations attributes too much to cohort effects and not enough to period and lifecycle effects.

Generations that seem atypical when they are young tend to revert to a familiar life course as they age. For instance, post-millennials are accused, like many cohorts before them, of being individualistic and materialistic. To the extent that this is true, it is a lifecycle effect, a youthful trait that people grow out of as they take on the responsibilities of work and family. Post-millennials are also around twice as likely to say they feel lonely than older people, but we need to remember that they are at a stage of life when socialising feels compulsory and isolation cuts deep.

When you factor in lifecycle and period effects, generational change looks more nuanced. My students will eventually have to make their truce with email, not just because I tiresomely insist on emailing them, but because it will remain the default form of communication in graduate employment. A cohort effect will become a lifecycle effect. What Gen Z, Explained claims as a cohort effect – the value young people place on being open and authentic – seems to me more of a period effect. In her book Family Secrets, the historian Deborah Cohen argues that a ‘modern age of confession’ has been slowly emerging in Britain since the 1930s, as attitudes towards divorce, illegitimacy, homosexuality, infidelity, mental disability and other aspects of life once kept as shameful secrets have changed. Transparent self-narration has come to be seen as the key to psychological well-being and a healthy public life. Generation Z’s attitudes are part of a long-term trend towards the valuing (even over-valuing) of emotional candour and empathetic connection.

One symptom of this trend is the irresistible rise of relatable, a word I have been trying for at least a decade to get students to stop writing in their essays. One day they all just started using it at once, as if there had been a meeting about it in my absence. Again and again, they commended a text, character or theme for being relatable. Easy to relate to, they meant. Relatable to what?, I would ask in the margin, perhaps too gruffly. I did not care for this voguish word, which seemed to demand that literature should always mirror our own lives, instead of being a portal into the implacable strangeness of other lives.

Needless to say, my war against relatable has ended in bitter defeat, with my pedants’ army routed and fleeing for the hills. Ogilvie’s iGen Corpus reveals much higher usage of this word among young people, but that is surely shifting. When I saw an interview with Patti Smith (born 1946) in which she described her song ‘Because the Night’ as ‘very relatable’, I knew the game was up. Fair enough. Language is always changing and young people are always at the vanguard. Anyone trying to counter the hegemony of relatable calls to mind the Grandpa Simpson meme used to mock baby boomers railing against change: Old man yells at cloud. One of these days I may even start using relatable myself. That is how language works, and how cohort effects become period effects.

The most profound recent generational change, for Duffy, has nothing to do with technology. It is the phenomenon of ‘delayed adulthood’. Key life stages, such as leaving home, getting a stable job and moving into a place of one’s own, are happening much later. This partly stems from people staying longer in education but mostly stems from the low wages, precarious employment, debt and housing problems created by austerity. An emblematic contemporary figure is the university graduate sleeping in the single bed of their childhood bedroom. Duffy quotes one 28-year-old who has moved back in with her parents: ‘It’s hard to feel like an adult when you’re living with the people who used to brush your teeth.’ The huge growth in private wealth compared to growth in income, largely down to the long housing boom, ensures that advantage and disadvantage will be passed down the generations. Duffy suggests that this betrayal of the intergenerational contract – the promise that each cohort will have a better life than the one before it – is ‘a key reason why people of all ages are more likely to question whether our economic and political systems are working’.

Those with an interest in maintaining the status quo, meanwhile, prefer to treat post-millennials as children. This involves much less effort than addressing the structural causes of delayed adulthood. If young people have the temerity to want a secure job and affordable housing, they are told to grow up and quit whingeing. If they can’t pay their heating bills, it is because they have frittered away their income on Starbucks and Netflix, having failed to learn the grown-up art of delayed gratification. And if they are consumed by woke identity politics and metropolitan Remainer attitudes, then they must have been force-fed these views by their university lecturers.

This now common idea of universities as indoctrination camps for impressionable young minds would not survive long in a university classroom. Why would my students pay attention to my views on Brexit, when I can’t even get them to stop using the word relatable? Teaching is an uncertain affair, full of such humility-inducing failures and miscues. Students have their own ideas about what is worth knowing and retaining, not because they are a tribe apart, but because each one of them is an adult human – unbiddable, unpredictable and, ultimately, indecipherable. My students are not relatable, and neither am I.

We expend so much anxious thought on generations because, as Duffy says, they are ‘interwoven with the fundamentals of human existence and societal change; while individuals are born, live and die, society flows on, changed a little or a lot by our cohort’s presence and then its absence’. It is salutary for people in positions of privilege, like me, to be discombobulated by change, to feel that those younger than us are becoming harder to reach as they pull the rug from under the reality we have helped shape. In his memoir Teacher Man, Frank McCourt writes about the thirty years he spent teaching English in New York high schools. The experience confirmed the truth of what his old professor of education had told him, that ‘it is the function of the young to get rid of their elders, to make room on the planet’. A teacher’s role is to pass something on and get out of the way – to make themselves dispensable.

Still, if you are a teacher of the humanities, you have to believe this: the journey from one brain to another may be the most difficult and circuitous in the universe, but there is still a basic commonality to human experience, and in a classroom you can search for that commonality together. Even cognitive aliens are, in Elkind’s words, ‘emotional countrymen’. If I need reassurance that this is true, I remind myself that I am in all essentials the same person as that homesick student in a phone box: stubborn, needy, self-absorbed, socially unconfident, intellectually arrogant. Since then, I have dumped many once cutting-edge bits of tech in landfill but, in the words of the Tracey Thorn song, the heart remains a child. Now I am an old(ish) man yelling at clouds. But nothing has really happened to me except the passing of time, and no one consulted me about that.

The BBC at 100

I wrote this for a TLS feature in which various contributors discussed what the BBC means to them:

The BBC makes no ideological sense in a post-Thatcherite world. Public goods have come to be seen as the servant of individual, rational consumers, who are meant to know what they like before they have seen or heard it. And what does British mean, anyway, in an era of devolution and digital fragmentation? A nation is an imagined and often imaginary community. Viewers in Lerwick or St Helier have always watched television with a different eye and ear to those in London, except during those years when they couldn’t watch it at all, because the transmitters did not reach them. And we are forced, by law, to pay for it all! In The Kingdom by the Sea Paul Theroux includes our willingness to pay the licence fee in a long list of what makes the British crazy, alongside their habit of wallpapering ceilings and putting little knitted bobble hats on soft-boiled eggs to keep them warm. That book was published in 1983, before the BBC’s free-market critics had gathered in force.

What can one say in the BBC’s defence? Only that the things that are truly precious in life can’t be fully audited or rationalized. I value it because it tells me that I am more than just a consumer, more than just part of a statistical aggregate of viewers or listeners. I am one member of a diasporic national community, loosely assembled in 20 million living rooms. That community is hard to put one’s finger on and easy to destroy. But precisely because it demands so little of those who belong to it, it can create a sense of commonality among people with little else in common.

Defenders of the BBC tend to shout about its news coverage, its flagship documentaries or its prestige dramas. I prefer to think of those programmes that just tick along in the background – The Sky at Night, Gardeners’ World, Farming Today, Choral Evensong, two-minute snatches of birdsong on Radio 4, documentaries about Hebridean islands on BBC Alba. Or of those radio presenters who got me through lockdown by talking away reassuringly in the corner of the room. Watching TV and listening to the radio are everyday, low-intensity and often private acts. It is easy for us to forget what a vital part of our lives they are, and how much the BBC would be missed if it were gone.

The plight of the humanities

I wrote this piece for Times Higher Education earlier this month:

In Samira Makhmalbaf’s film Blackboards (2000), a group of men trek through the mountainous regions of Iranian Kurdistan, stooped under the huge blackboards strapped to their backs. They are itinerant teachers looking for pupils, anyone who will pay them for lessons with money or food.

“Do you know how to write? Would you like to learn?” they say to everyone they meet in earnest, badgering tones. In mountain villages, where the locals hide inside their houses, they shout: “Open the windows. Answer me! I’ve come a long way to teach your children to read and write.” Everyone ignores them or tells them to go away.

As far as I know, teachers in Kurdistan do not roam around with blackboards on their backs, hawking their wares. The blackboards in the film are a conceit, a surreal and beautiful metaphor for the inbuilt asymmetry of education – for how often it rests on offering something as yet unquantifiable to an audience of the apathetic or unconvinced.

I am starting to wonder, as a humanities teacher in a UK university, if this will be my fate – wandering the streets with a laptop and data projector, offering up lectures on Shakespeare’s sonnets or the contemporary novel to random pedestrians. Every week brings news of more planned redundancies and course closures in my field. The hollowing-out of departments by voluntary severance and early retirement is happening more widely, under the radar. The cuts are coming mostly in the post-1992 universities and others outside the research elite.

What adds to the teachers’ desolation in Blackboards is that they are such terrible salesmen – so shrill and needy in their pitches to potential pupils. The humanities have a similar problem. We urgently need a better story that explains the worth of what we do. And yet, as the writer and psychotherapist Adam Phillips argues, often we fall back on idealised justifications of the humanities that “betray, in their vehemence and the nature of their claims, a lack of confidence”.

They make it seem, as Phillips says, as if we are defending a religion or a local hospital under threat of closure. We oversell the humanities as impactful, transformative and game-changing in ways that don’t sit well with the quiet, patient, accretive methods of the scholarship. Or we claim, without much evidence, that the humanities have cornered the market in “critical thinking”, or that studying them makes us more empathetic and humane. If I were a scientist, I would be irritated by the suggestion that my work did not inspire the same kind of emotional literacy. A geneticist or neuroscientist need value human life and consciousness no less for being able to glimpse their makeup in DNA’s double helix or the lump of jellified fat and protein inside our skulls.

We shouldn’t need to overclaim for the humanities like this, because they explore something essential about the human species. We are interpretive animals. “People live by narrative,” Boris Johnson said when being interviewed for a profile in The Atlantic last year. “Human beings are creatures of the imagination.” On this, at least, we can agree. Making shared meanings is almost as vital to us as our animal needs for food, water, shelter and sleep. Every human-made system – literature, art, music, religion, money, the law, the constellations of stars – demands that we swallow its story. The success of our species derives from our ability to weave these intersubjective webs of meaning.

The price we pay is to become overly immersed in these invented worlds, so that it is hard to break their spell and embrace other realities. In Marilynne Robinson’s words, “We live on a little island of the articulable, which we tend to mistake for reality itself.” The humanities show us how to read, with imaginative sympathy and watchful scepticism, the stories we tell ourselves. They thwart our tendency to impose a false neatness and coherence on our lives.

This process is sometimes awkward and uncomfortable. The cuts to humanities departments have coincided – and not coincidentally – with newspapers and politicians caricaturing these departments as hotbeds of woke ideas and resentment-filled identity politics. When the humanities question a society’s well-worn and consoling narratives about itself, it can unsettle and annoy people. Buying into collective meanings is a bit like riding a symbolic bicycle. If we think too much about working the pedals and keeping our balance, we fall off. Falling off a bicycle is painful and makes us look foolish. So most of the time we prefer not to explore too closely the meanings that undergird our activities. We just carry on pedalling.

It is hard to quantify what would be lost if the humanities weren’t there. They don’t come up with solutions like the sciences do, in the form of, say, new vaccines or alternative sources of energy. They can’t produce anything like the beautiful, elegant economy of a mathematical equation – the formula that explains so much with as little effort as possible. Science moves forward with these breakthroughs and discoveries; but the humanities are cumulative, not progressive. Every text examined in the humanities is trying to solve differently the riveting but essentially unsolvable puzzle of being human. Every text is as fascinatingly flawed, as infinitely granular, as limitlessly miscellaneous, as the human being who made it.

So the humanities pursue not solutions but the more careful elaboration of problems. They layer on meaning and erudition, continuing that conversation begun tens of thousands of years ago when Homo sapiens went deep into caves to blow ochre dye on the walls. The results are incremental and hard to measure. Scholars and students just gradually become better writers, readers and thinkers, and more subtle sense-makers of their own and other people’s lives. Bit by bit, the humanities deepen and enrich the act of collective meaning-making.

These slow-burn, incalculable effects mean that the loss of the humanities would be rather like the loss of habitat in the natural world – something profound and far-reaching that occurs piecemeal and unnoticeably, while our attention lies elsewhere. Most people don’t miss that wildflower meadow now that it has become a motorway, especially if they didn’t know the meadow was there in the first place. They didn’t notice the number of migrating birds or pollinating insects declining, because the birds and insects didn’t announce their departure and we were looking the other way. But something precious was lost, all the same.

Of course, people have foretold the death of the humanities for decades. “The humanities are at the cross-roads, at a crisis in their existence,” the historian J. H. Plumb wrote in his introduction to the Pelican book Crisis in the Humanities in 1964. The humanities survived, and indeed student enrolments remained healthy for the next half century. True, the share of students doing the humanities has been falling in UK universities since 2012, in favour of the sciences. This is probably a result of the relentless focus on STEM in schools, and austerity and higher tuition fees driving more career-specific choices. But the fall is not precipitous and might reverse at some point, as subjects go in and out of student fashion.

The problem is not some existential threat to the humanities in general, but to their future outside the elite institutions. What we are seeing now is the long playing out of the government’s decision in 2013 to end student number controls in England. Since this came into force in 2015, the high-ranking universities have made up shortfalls in humanities admissions – or over-recruited in an area seen as cheap to resource – by taking students who would have previously gone elsewhere.

As stories about closures and redundancies in the humanities have broken, there has been much criticism of the predatory student recruitment practices of the Russell Group. But what they are doing is wholly consistent with government policy: student fees and other market mechanisms should increase competition and curb “artificial demand”, even at the cost of the closure of courses and, perhaps, entire universities. Subjects like languages, literature and history risk becoming the new Classics – a luxury taught in what newspapers now routinely call the “good” universities (not elite, or even best, but good).

According to the rational choice economics that now dominates our public life, a university education is a “disutility” – the sacrifice of one’s time and convenience for money. What matters is not so much the learning itself but what it leads to. In the crudest metric, this means a job with a salary high enough to justify the expenditure of the tuition fees. The government’s definition of a “good” university course is one where the size of its fees correlates with the size of salary a graduate of that course can command.

This inevitably favours the elite universities, especially since, with a greatly increased stock of graduates, employers tend to use university rankings as a short cut when sifting job applications. Those who dismiss humanities courses at post-1992 universities as “low value” assume, or at least pretend to assume, that the marketisation of universities established its own natural hierarchy of winners and losers. All it did was favour the entrenched reputations of the elite institutions and the tendency for social hierarchies to be self-fulfilling.

The current political orthodoxy is that social mobility is best achieved by the meritocratic rationing of elite education – by getting more students from poor or modest backgrounds into Oxbridge and other Russell Group universities. There is no evidence that this approach is increasing social mobility. On the contrary, education has been a key driver of what sociologists call “effectively maintained inequality”: the tendency for more affluent families to strategise and over-exploit opportunities. Middle-class children have benefited from inherited social and cultural capital, private education or tutoring, their parents buying into good school catchment areas, and the unconscious rewarding of their social confidence and ease. This gives them a much better shot at attending the elite universities. Although participation in higher education has risen hugely in the UK since the early 1990s, it continues to be sharply stratified according to race and class. The much smaller increase in the number of working-class and black and ethnic minority students has been heavily concentrated in the post-1992 and non-Russell Group universities.

If the humanities are driven out of these universities, these are the students who will disproportionately miss out on them. They will be the ones steered away from the “low value” degrees towards learning a trade or doing a more vocational course. Meanwhile Oxbridge humanities graduates – including the prime minister, his former chief adviser Dominic Cummings and three members of Johnson’s current Cabinet (Stephen Barclay, Kwasi Kwarteng and Jacob Rees-Mogg) – continue to fill positions of power and influence. But then our hierarchised UK university system, as a former colleague once said to me, is the last bastion of socially acceptable snobbery.

Why does this unequal access to the humanities matter? It matters because the humanities are, ultimately, about hope and possibility. They began in the creative ferment of the Renaissance, with its rediscovery of the secular accomplishments of classical civilisation. The studia humanitatis lauded the fullness of human potential. The humanities show that people, with all their imperfections, fragilities and self-deceptions, are beautiful and irreplaceable beings of incalculable worth. They show that human beings are more than simply human capital. They show that every person has a deep reservoir of potential that we can’t begin to fathom until it is fulfilled, because it will somehow be tied up with this messy, uncontainable human impulse to make our lives meaningful. The humanities are not for everyone, but everyone should have an equal chance to study them.

As for the humanities scholars under threat of redundancy, they cannot easily retrain as computer coders or ballet dancers. For many, their working lives will be over prematurely. Those left standing will be unsure what the future holds – even if, like Robert Burns addressing the mouse, they “guess and fear”. They are becoming acquainted with the precarity that new PhDs and early career scholars suffer routinely.

This is especially unfortunate because, while the marketised university feeds off flux and uncertainty, humanities scholarship feeds off security and confidence. Humanities scholars know more than anyone that people only thrive inside self-spun webs of meaning. Humanities work requires huge investments of time and effort in getting to know one’s material intimately, keeping the faith that others will find this worthwhile and that it will form one more slender thread in the web of meaning. We have to keep telling ourselves that, just because what we do is slowly accruing and often too ineffable to turn into data, that doesn’t mean it doesn’t exist, or doesn’t matter. When we start to question that faith, it corrodes our focus, motivation and well-being.

All we can do is carry on with the work. In Emily St John Mandel’s 2014 novel Station Eleven, a nomadic troupe of actors perform Shakespeare to pockets of survivors from a global pandemic that has killed most of the world’s population. The lead caravan of their troupe bears the legend Because survival is insufficient.

I have taken to incanting this line, which Mandel got from an episode of Star Trek, as a justification for the humanities. It is not enough just to live. We need to know that our lives, with all their griefs and joys, are meaningful. Exploring life’s meanings more carefully could never be a waste of time, even if all the political mood music at the moment tells us otherwise. The humanities matter and we are right to keep believing in them. Survival is insufficient.

Housing is a human right

This is my review of Vicky Spratt’s book Tenants and Daniel Lavelle’s book Down and Out, which appeared in the TLS on 22 July:

Like access to clean, drinkable water, the right to adequate housing is recognized by the United Nations. In the UK, in the space of a generation, this right has been gradually eroded by stark housing inequalities. These inequalities are the product of political will, market ideology and the unintended consequences of both. The forty-year property boom has been so spectacular that many houses now earn more in equity than their occupants do in their jobs. Attempts to help those priced out of the market through optimistically named “affordable housing”, stamp-duty cuts and Help to Buy schemes have only tinkered at the edges of the problem or inflated the market further. Meanwhile, the selling off of council housing has pushed most tenants into the private rented sector, where they have few of the legal protections of tenure available in other European countries.

Two new books by Vicky Spratt and Daniel Lavelle address this intricate ecosystem of housing inequality and the ways in which it is reshaping Britain’s social and economic landscape. Both authors have lived at the sharp end of the problem. When Spratt was seven she learnt never to answer the door to bailiffs, but still her family lost their home. As a young adult she rented tiny box rooms in houses with damp and mould, and lost a fortune in deposits to dodgy landlords. She is now the i Paper’s housing correspondent, writing not about property hotspots and fantasy house hunts, but about the human costs of the housing crisis. Lavelle grew up in care, moving between special boarding schools, foster homes and children’s homes. After leaving university he was homeless for two years, living in tents and hostels, or sleeping on friends’ sofas. In 2019 he co-wrote the “Empty Doorway” series in the Guardian, recording the lives of homeless people who died on the streets.

Tenants is the more densely researched book, being based largely on interviews Spratt conducts with evicted tenants, grassroots activists, support workers and experts in housing law. Down and Out is rawer and more personal, combining Lavelle’s own story with those of the insecurely housed and homeless people he keeps in touch with from his time in care and in hostels. Spratt focuses mainly on tenants facing or experiencing eviction; Lavelle explores a twilight world of sofa-surfing, hostels, night shelters and rough sleeping. Taken together they make plain how paper-thin is the divide between the cheaper end of renting and being thrown out on to the street. They show that all it takes to be made homeless is to be surprised by illness, redundancy, a break-up or simply a landlord who decides on a whim that they want you out. Section 21 of the Housing Act 1988 allows private landlords to evict tenants at short notice without giving a reason. In the Queen’s Speech of December 2019 the government pledged to abolish these “no-fault” evictions, but it has yet to do so.

Both books consider the Thatcher era to be, in Spratt’s words, “ground zero for the mess we are in now”. The 1980 Housing Act gave millions of council house tenants the right to buy their homes, at market discounts of up to 50 per cent. As Spratt points out, this was no rocket boost for homeownership, which in England has increased only slightly from 56.6 per cent in 1980 to 64.6 per cent in 2020. More than 40 per cent of ex-council homes sold under Right to Buy are now owned by private landlords. For Spratt, the key driver of housing inequality was the political decision to outsource the rental sector to unqualified and unregulated individuals, private landlords, many of whom have neither the time nor the resources to manage their properties properly. Nearly half of housing benefit, about £10 billion a year, goes straight to them.

Lavelle, with typical pungency, calls Right to Buy “the greatest heist in modern history, a heist perpetrated under the guise of giving people a stake in public assets they already had a stake in”. In truth, as the more restrained Spratt concedes, Right to Buy was not an original Thatcherite policy. A less heavily discounted version of it appeared in the Labour Party’s election manifesto in 1959. The traditional Conservative policy of encouraging home ownership – Anthony Eden’s espousal in 1946 of a “nation-wide property-owning democracy” – gradually became a cross-party consensus in the postwar years. Labour retained plans for ambitious council house-building, but New Labour shelved them as it tailored its policies to existing homeowners. Between 1998 and 2010, 6,330 council homes were built, just over a third of the total built in 1990 alone, the last year of the Thatcher government.

The 2008 financial crash made things vastly worse in two ways. First, banks wanted bigger deposits and tightened affordability checks for mortgages. They ploughed money into buy-to-let mortgages, with investors being seen as a safer bet than first-time buyers. This pushed many more people into renting. Second, austerity made life more brutal for renters on low incomes. In 2010 George Osborne cut housing benefit and barred single people under thirty-five from claiming it to live in a place of their own. Cuts to local-authority budgets meant less money for hostels, shelters and drug and alcohol dependency services. Councils are now so strapped that they operate what Lavelle calls “a misery contest for housing, a sort of X Factor for the destitute”. While living in a tent along a bridle path in Saddleworth, Greater Manchester, he was told he was not a “priority need” because he did not present with any other vulnerabilities. He was “homeless, but not homeless enough”.

These two effects of the financial crash combined catastrophically with one non-effect: house prices and average rents carried on rising. Adding more renters to the mix drove up demand, and landlords put up prices accordingly. Each chapter of Spratt’s book is preceded with data on the sales and rental market in the area she is writing about, which underlines how hopeless the situation is for many. In Peckham, south London, for instance, the average price of a flat in 2021 was £450,865 and the average monthly rent for a one-bedroom home was £1,394.

Housing inequality bears out Claudius’s maxim that sorrows come “not single spies, but in battalions”. Its victims are invariably dealing with contributory and aggravating factors: casualized work, stagnating wages, welfare cuts and debt. Spratt compares the housing crisis to a virus that “infects its hosts and multiplies to make everything more difficult for them”. Living in cramped, ugly, broken-down surroundings is bad for anyone’s mental health. Damp and mould bring respiratory problems and other illnesses. Those in poor, overcrowded housing suffered most in lockdown, and were more likely to catch and spread Covid. Having to move all the time is stressful and makes it much harder to build support networks. Mindy Fullilove, an American professor of urban policy and health, calls this phenomenon “root shock”, after the trauma a plant experiences when it is moved carelessly to shallow soil.

The homeless people Lavelle speaks to struggle with three big problems: experience of abuse, mental illness, and drug and alcohol addiction. One reason that spice (a synthetic cannabinoid) is so popular on the streets and in hostels, Lavelle says, is that it makes time go quickly. He is open about his own problems. The victim in infancy of a family trauma that he can’t write about for legal reasons, he spent much of his childhood being excluded or expelled from school and attending special educational establishments. The psychiatrist who diagnosed attention deficit hyperactivity disorder when Lavelle was seven “didn’t need to strain his diagnostic skills too much”. He does not sound like the most compliant hostel dweller, getting into arguments with other residents and with the supervisor who, when Lavelle has the flu, refuses him Strepsils because they contain alcohol. At one point he punches a hole in a windscreen. “If being one’s own worst enemy was a sport, I’d be a Hall of Fame world champion”, he concedes. His point is that the people who end up in hostels and shelters are often hard to handle, but that this should not affect the support they receive. The NHS, after all, does not require character references before admitting you to A&E. Adequate housing is a human right, not a reward for good behaviour.

Lavelle’s bête noire is what he calls “philanthrocapitalism”: the voluntary sector, charities and private companies that have taken over homeless provision in the age of austerity. These organizations are self-regulating because they provide “support” rather than “personal care”. They are not monitored by the Care Quality Commission or Ofsted, even though those they support may be as young as sixteen. Nor do they have to comply with Freedom of Information requests. They can ask hostel residents to work for a meagre weekly allowance rather than the minimum wage, impose strict rules on their behaviour and evict them for minor breaches.

Both Spratt and Lavelle advocate the Housing First model pioneered in the 1990s by Sam Tsemberis, then a clinical psychologist at New York University. Housing First provides homeless people with their own home straightaway, with no preconditions. They are not required to move gradually up the ladder from a night shelter to more secure housing. They do not need to get a job first, or obey someone else’s house rules, or abstain from drugs and alcohol. Only when they have been housed are their other needs addressed. Tsemberis tells Lavelle that Housing First is more about providing treatment for addiction and mental health problems than about housing, but says “you can’t really talk about the treatment unless the person is housed, otherwise the whole conversation is only about survival”. Housing First has been successful in the American states and cities where it has been rolled out, as well as in Finland, the only EU country where homelessness is falling.

These books argue convincingly that investing in more social housing would benefit everyone, not just those who live in it. It would ease pressure on the rental sector and make private landlords compete with an alternative source of good-quality and secure homes. It would take the heat out of the housing market and start to counter the inheritocracy in which older people sitting on equity pass it on to their children. It would alleviate other forms of injustice, since housing inequality falls disproportionately on Black, Asian and minority ethnic tenants. It would make the kind of housing safety scandal exposed by the Grenfell Tower fire less likely. And it would reduce homelessness, which places immense strain on the police, the criminal justice system, the NHS and councils.

More subtly but profoundly, investing in social housing would soften, for millions of people, that repeated blow to their self-esteem that comes from being beholden to someone else. It is fatiguing and confidence-sapping to live with stained mattresses and broken shower curtains; to show potential buyers, your would-be evictors, around your home; to be fearful of provoking your landlord by making a fuss about repairs; to feel stuck in a limbo of enforced adolescence, waiting for grown-up life to begin. All this makes for what Spratt calls “an uneasy and constant refrain, your life sung to the tune of the privilege of others”. People waste hulking portions of their lives looking for rented rooms, dealing with bad landlords, extricating themselves from nightmare house shares and moving their stuff from one room to the next. What could be achieved with all that energy if it were expended more creatively?

Perhaps something is stirring. Spratt highlights the work of the community union ACORN and organizations such as Generation Rent and Safer Renting, which fight for tenants’ rights. She reports on eviction resistance bootcamps and renters fighting back against gentrifying regeneration schemes. In 2016 she fronted a successful campaign to get letting fees banned and deposits capped. Ultimately, though, her book leaves you with the sense that nothing much will change while the haves (homeowners and investors) outnumber the have-nots (renters and the homeless).

Housing inequality has barely been mentioned in recent election campaigns. Spratt is told that when David Cameron was prime minister, the phrase “housing crisis” was banned in government. In the early 2010s, while working as a junior producer on Newsnight, she suggested covering more housing stories, but her editor – homeowning and privately educated – told her that they were “just not that interesting”. The same kind of willed obliviousness allows newspapers to place the blame for the housing crisis on immigrants, or on young people who are buying too many avocados or espressos to save for a deposit. In the face of such simplistic explanations, these books enrich our impoverished sociological imagination. Their case studies are as bleakly memorable as Raymond Carver stories. A Brighton man sleeps in his work van at the height of the pandemic, after losing his flat just before the second lockdown. A woman evicted from her flat in Peckham, who has recently attempted suicide, is told that if she does not accept a flat in Croydon she will have made herself “intentionally homeless” and forfeited any right to support. When she asks how she is meant to get to work or take her daughter to school, the placement officer tells her to “get up earlier”. Lavelle spends a wet and freezing November night wandering around Oldham, making endless circuits of the shopping centre until it closes and “laughing maniacally about what a parody my life had become”, before huddling underneath a bridge.

Both books retell the story of Gyula Remes, the Hungarian national who, just before Christmas 2018, died in the underpass leading from the Houses of Parliament to Westminster Tube station. At forty-three, he was one year short of the mean age at death of a homeless person in England and Wales. This story received wide coverage because it seemed shocking that MPs could routinely walk past the effects of the austerity for which many of them had voted. But it is not really so shocking – because MPs are no different from the rest of us, we who avert our eyes from the daily disaster playing out in the pile of blankets on the other side of a pavement. These books succeed in reinserting a whole person into that human-shaped heap: someone with a name, a family, a life history that led them there and a body just as achy as ours would be if our bed were made of stone.

The Noel Edmonds Creed (a found poem)

I, as a follower of Noel Edmonds, believe all that Noel Edmonds believes.

I believe that disease is caused by negative energy.

I believe that death is just a word in the dictionary.

I believe that the most appropriate word is ‘departure’ because we are energy and you can’t create or destroy energy, you can only change its form.

I believe that we are surrounded by electro mist, fog and smog.

I believe that we are covering ourselves in the wrong sorts of electro-magnetism.

I believe that the biggest problem we have is not Ebola, it’s not Aids, it’s electro smog.

I believe that the Wi-Fi and all of the systems that we are introducing into our lives are destroying our own natural electro-magnetic fields. All you are is energy, remember that.

I believe that renewing the BBC’s charter would be an act as futile as giving medicine to a corpse.

I believe in Orbs. Orbs are little bundles of positive energy and they think they can move between 500 and 1,000 miles per hour.

I believe that there are two orbs that visit me. The two that I have are about the size of melons. One sits on my arm and the other is usually in the back of the shot, sitting just over my right shoulder.

I believe that you don’t live life, life lives you.

I believe that I wrote a wishlist of ambitions to the cosmos and, like a mail-order company, it delivered my wife, who was working as a make-up artist on Deal Or No Deal.

I believe that every single human being can achieve a perfect vibrational balance between their positive and negative energy.

I believe it is possible to retune people.

I believe that Ant and Dec are excellent presenters. They’ve been honest, they’ve plundered the House Party archive and created Takeaway. I don’t have a problem with that. I take it as a compliment.

I believe that all these things have been known about for a very long time.


The Nowhere Office

I reviewed Julia Hobsbawm’s The Nowhere Office and Jonathan Malesic’s The End of Burnout for the TLS in February:

Change, writes Julia Hobsbawm, happens “slowly and then all of a sudden completely”. What Hobsbawm calls the “Nowhere Office” – the hybrid workspace that floats between work and home – may seem like Covid’s gift to the world but it was long in the making. For her it is the culmination of trends that have been emerging since the 1980s, when office hours stopped being strictly nine-to-five and the search for an elusive work–life balance began. The pandemic “broke the last threads holding the embedded customs and practices together”.

The Nowhere Office is buoyant about this placeless workspace. Offices, Hobsbawm predicts, will no longer be run by a creed of unthinking presenteeism and will become places we visit for networking, collaboration and community-building. The rest of the work will be done at home or on the move. We will happily cut across different time zones, accessing our files anywhere via digital clouds and dividing up work across a seven-day week, carving out “air pockets of free time” rather than a two-day weekend. The main divide will be between the “hybrid haves” and the “hybrid have nots” – those who are able to move seamlessly between online and offline and those who are not.

Hobsbawm wants to “put the human back in the corporate machine”, and her instincts are all good. She understands that working from home can mean loneliness, isolation and the bleeding of work into our personal lives. And she concedes that “despite the apparent flexibility and freedoms, many inequalities remain and too many people still have to work too hard and too long”. But what if the apparent flexibility and freedoms are the problem? What if the nowhereness of work means that work ends up being everywhere, and we can never disengage from its demands? For Hobsbawm the solution is to give employees more choice and negotiate their consent. They must be disciplined in separating work from life, and their bosses must trust them to work unsupervised. “It will be obvious if people are working well”, she writes sunnily, announcing the end of “the age of being violently busy”.

The book is interspersed with interviews with practitioners and proponents of the Nowhere Office. Most of them are business leaders: chief strategy officers, brand presidents, digital entrepreneurs, investors. Their insights are worth having, even if Hobsbawm’s mimicry of their corporate-speak about “win-win models” and “siloed thinking” does little for her prose style. But one wonders if those lower down the corporate hierarchy might have a less heady take on the Nowhere Office.

According to Hobsbawm, these changes are unstoppable. The future is set fair and all we can do is catch up. “The desk is all but over as a built-in feature of office life”, she says. “Sofas, small theatres, spaces to convene and converse in will be ‘in’.” Her brisk verdicts on the new reality reminded me of that much-repeated formula online, declaring that some new phenomenon “is a thing now”. But why is it a thing, and should it be a thing? The future is neither uniform nor inevitable. It feels too soon to make bold calls, before the pandemic is even over, about what the workplace of the future will look like.

Hobsbawm summarily dismisses critics such as Josh Cohen, David Graeber and Sarah Jaffe as part of “an emergent purist camp” which holds that “work represents a failure of society, certainly of capitalism, and that work is essentially not an opportunity but a threat”. But these critics do not say that work is “pointless”, as she claims, only that a turbo-capitalist conception of work makes excessive and toxic demands on us. Their writing deserves to be engaged with rather than caricatured.

Hobsbawm would probably put Jonathan Malesic in the purist camp. But his acutely felt investigation of work burnout as an “ailment of the soul” makes his the more thought-provoking and substantial of these two books. Malesic is a recovering academic, a former professor of theology at a small Catholic college in Pennsylvania. Like many academics he began his career with unsustainably high ideals, believing he was “a citizen in the republic of letters”. He discovered that much of it was just a job, with unrewarding tasks, soul-sapping hassle, pointless politicking and fears of redundancy. His students, most of whom were studying theology as a core requirement, did not share his enthusiasms and spent his classes looking blank-faced and bored. Soon he was lying in bed for hours when he should have been working, repeatedly watching the video of the Peter Gabriel and Kate Bush song “Don’t Give Up” and self-medicating with ice cream and beer. After eleven years he gave up the tenure-track position he had worked so hard for. Alongside the sense of failure, he felt intense guilt that he had come to hate such a coveted and well-rewarded job.

As Malesic admits, burnout is something of a buzzword, “an often-empty signifier onto which we can project virtually any agenda”. Our vague definitions of it, and the lack of consensus on how to diagnose and measure it, raise the question of how much we really want to eradicate it. Diagnosing oneself with burnout can, after all, be self-flattering. To be burned out is to be a modern, a victim of the age, a martyr to one’s own high ideals. Burnout’s historical antecedents, the now-forgotten soul sicknesses of acedia, melancholia and neurasthenia, were similar sources of both pride and shame.

Malesic defines burnout usefully as “the experience of being pulled between expectation and reality at work”. We burn out not just because we are exhausted but because our hearts are broken. Our love for work, which we saw as the path to social status and spiritual flourishing, went unrequited. Even in the good times, work could not deliver all we asked of it, and these are not the good times. Aided by market deregulation, employers now see workers as a fixed cost to be reduced. Outsourcing, zero-hours and precarious work have expanded, while more hours are demanded of everyone. The funky offices of tech start-ups, with their games rooms and sleeping pods, are, Malesic writes, “designed to keep you at work forever”. The life hacks touted as burnout antidotes – mindfulness, getting more sleep, working smarter – are superstitions, “individual, symbolic actions that are disconnected from burnout’s real causes”.

Malesic visits an artisan pottery studio in Minnesota, a Dallas nonprofit doing anti-poverty work and several Benedictine monasteries, and spends time among artists with disabilities who cannot find paid work but who form richly supportive creative communities. He learns that work need not be the lodestar of our lives. To heal our burnout, we need to lower our expectations. Malesic now teaches writing part-time at a Dallas university, just one or two classes per semester. He no longer expects the life of the mind to be soul-nourishing and is a better and more patient teacher for it.

We need to see work as, well, work. But this does not mean that it should cease to matter. Malesic cites the French phrase “un travail de bénédictin” – a Benedictine labour – to describe a project that demands quiet, steady effort over a long time to bring it to fruition. This kind of work has little value in a world of annual pay reviews and key performance indicators. But a richly satisfying Benedictine labour can cure us of that self-lacerating cycle of looming deadlines and short-term goals that ultimately benefits only our paymasters.

These very different books have one perspective in common: they both see the pandemic as a chance for reflection and change. “Right now, we have a rare opportunity to rewrite our cultural expectations of work”, Jonathan Malesic writes, “and I hope we will.” So do I.

The Premonitions Bureau

I reviewed Sam Knight’s book The Premonitions Bureau for the TLS in May:

For most of human history, people have believed that we can see into the future. The Bible is filled with prophecies and premonitory dreams; the ancient Greeks put their faith in oracles and in destinies that no mortal being could swerve. “That which is fated cannot be fled”, warned Pindar. As Oedipus discovered, what is going to happen to us becomes what we choose to do.

The Premonitions Bureau, Sam Knight’s elegant and illuminating work of cultural history, transports us back to a mid-twentieth-century Britain still clinging to this faith in precognition – the extra-sensory perception of future events. Precognition, which hinted at “undiscovered reaches of physics and of the mind”, managed to escape the taint of the occult that clung to phenomena such as ghosts and ectoplasm. It teetered on the edges of scientific respectability.

In 1927, J. W. Dunne, an aeronautical engineer, published the bestselling book An Experiment with Time, which remained in print for more than half a century. In 1902, while serving in the Boer War, Dunne had dreamt of a volcano about to erupt on a French colonial island. A few weeks later, he got hold of a newspaper which reported that the eruption of Mont Pelée, on the French Caribbean island of Martinique, had killed 40,000 people. Dunne’s book was a thirty-year history of his own dreams and their intimations of the future. He explained it all with reference to the new fields of relativity theory and quantum mechanics, which theorized that time’s linearity was no simple matter. Dreams that predicted future happenings became known as “Dunne dreams”. On Dunne’s advice, many of his readers began leaving pencil and paper by their beds so they could write down their dreams on waking.

J. B. Priestley, in plays such as Time and the Conways (1937) and An Inspector Calls (1945), drew on Dunne’s work. Priestley also popularized Carl Jung’s theory of synchronicity, which suggested that events could be linked outside the normal logic of cause and effect, such as when a dream foretells an event in the waking world. In Time and the Conways, Alan Conway tells his sister Kay that the secret to life is that time is not monodirectional but eternally present, and that at any given moment we see only “a cross section of ourselves”.

At the heart of Knight’s story lies a remarkable character called John Barker. When we first meet him, in 1966, he is a forty-two-year-old psychiatrist working at Shelton Hospital in Shropshire, one of Britain’s sprawling and overcrowded mental institutions. Barker worked tirelessly to improve conditions at Shelton by phasing out the more brutal treatments, such as electroconvulsive therapy administered without drugs. But he was also frustrated with the professional timidity of his field. Fringe areas dismissed as psychic or paranormal were just waiting to be absorbed into mainstream science, he believed. He was a member of Britain’s Society for Psychical Research and fascinated by precognition.

The book begins with the event that galvanized Barker: the Aberfan disaster of October 21, 1966, when a coal-tip avalanche buried a primary school, killing 144 people, mostly children. The precarious-looking tips above Aberfan had long worried locals, and many spoke of having disturbing thoughts and visions before the disaster. Given how much Aberfan had pierced the national consciousness, Barker decided to ask the public if they had felt any presentiment of it. He contacted Peter Fairley, the science editor of the London Evening Standard, who agreed to publicize his appeal. Barker received seventy-six responses from what he called “percipients”. After prodding them for details and witnesses, he concluded that precognition was a common human trait, perhaps as common as left-handedness. He thought that a small subset of the population might experience “pre-disaster syndrome”, somewhat similar to the way in which twins were thought to feel each other’s pain remotely.

The problem was that, as with most similar evidence, the Aberfan data had been scientifically compromised by being collected after the event. So just before Christmas 1966, Barker and Fairley approached Charles Wintour, the Evening Standard’s editor, about setting up a “Premonitions Bureau”. For a year, the newspaper’s readers would be asked to send in their forebodings of unwelcome events, which would be collated and then compared with actual events. The Standard’s newsroom was soon inundated with letters and telephone calls.

Barker envisaged the Premonitions Bureau as a “central clearing house” for all portents of calamities, “a data bank for the nation’s dreams and visions”. This crowd-sourcing of the collective unconscious recalled the work of an earlier research organization, Mass Observation, which also made use of unpaid volunteers to create “weather maps of public feeling”. Barker hoped that the results would eventually be uploaded to a computer database, and that the Bureau would issue early warnings of potential disasters.

Barker and Fairley appeared often in newspapers, as well as on BBC2’s Late Night Line-Up. They also turned up with a group of percipients to be interviewed on ITV’s The Frost Programme, but were dropped mid-show, probably because David Frost worried how the group might come across. “‘Weirdos’ would be too strong a description,” Fairley wrote later, “but they were certainly different.” Fairley put his own raised profile to good use, going on to present ITV’s coverage of the moon landings.

The Bureau received hundreds of warnings, most of which proved, predictably, to be blind alleys or impossible to verify. On quiet mornings, Fairley would go through the letters pile in search of racing tips. Two respondents, though, had real staying power: Kathleen Middleton, a piano teacher from Edmonton, and Alan Hencher, a Post Office switchboard operator from Dagenham. They predicted a whole run of unfortunate events, including the Torrey Canyon oil spill, the death of a Russian cosmonaut on his re-entry to earth, the assassination of Robert Kennedy, and the Hither Green rail crash in which forty-nine people died. Distressingly for Barker, they both then foresaw his own death (which nicely sets up the end of the book).

Knight’s refreshing approach to his subject matter avoids being either too cynical or too credulous. “Premonitions are impossible, and they come true all the time”, he writes. He knows how hard it is for us storytelling animals to separate an event from the link we give it in a causal chain. A few weeks before their wedding, he and his wife saw three magpies, and “never asked for a test to confirm the sex of our daughter because we felt we had already been informed”.

Time is an arrow. The second law of thermodynamics rules that there is no way we can know about things before they happen. Entropy – the cup of tea that cools as you drink it, the leaves that fall in autumn, the lines that form on your forehead – is the concrete proof that time only runs forwards. And yet some contemporary theoretical physicists, such as Carlo Rovelli, suggest that the explanatory power of entropy, which makes sense of our lives and our deaths, has caused us to give it too much credence. Perhaps we only see the small part of reality where this rule holds. Knight feels no need to come down on one side or the other. Instead, he uses the theme of precognition to explore deep existential questions about time, causation and the meaning of life.

The Premonitions Bureau is full of lightly dispensed research, gathered from the archives of the Society for Psychical Research and interviews with the families and associates of the main characters. Knight’s method and tone will be familiar to those who have read his Guardian Long Reads on everyday subjects such as the British sandwich industry and Uber’s takeover of London, or his New Yorker “Letter from the UK”. He deploys two highly effective narrative techniques. The first is the deadpan drop of bits of stray information. We learn that a survivor of the Hither Green rail crash was the seventeen-year-old Robin Gibb, of the Bee Gees; that Barker was a keen surfer, although overweight and at least two decades older than his fellow longboard pioneers; that Fairley chased stories on a fold-up motorcycle and that only when he died did his widow and four children learn of his secret second family. As well as being weirdly fascinating, these facts add authenticating specificity to the story.

Knight’s second technique is the narrative handbrake turn, where the story veers off without warning, the significance of this new thread only emerging later. “In the 1690s, a young tutor named Martin Martin was commissioned to map and document life in the western islands of Scotland”, he might begin, out of the blue. Or: “One day in 1995, in the German cathedral city of Mainz, a fifty-one-year-old woman went to hospital …”. The creatively jarring juxtaposition of human voices and stories reminded me a little of Tales of a New Jerusalem, David Kynaston’s multi-volume history of postwar Britain. Knight, like Kynaston, leaves us with a sense of the stubborn strangeness of other people and of the recent past, without ever seeming condescending to either. Other people, his book reveals, are infinitely and incurably odd. Still, they might just be on to something.

Ten Writing Tips

During lockdown in the autumn of 2020, when we were teaching online, I posted a writing tip to our students every week. I thought I would post them here now in case anyone else finds them useful.

Tip 1: Start writing earlier

When you’re working on an essay or piece of coursework, start writing early on in the process. Don’t spend all your time on the reading and research and leave the writing until the last minute. For an English student, writing is your laboratory, your way of thinking – how you find out what you really want to say. Make sure you leave enough time for it.

Students sometimes get discouraged when they have written a first draft of their essay and it feels awkward or stilted. But that’s like saying your cake tastes awful when all you have done is mix some butter, eggs, flour and sugar in a bowl. You haven’t finished making it yet. Only when you have hacked your sentences into a basic shape can you see the many other things wrong with them. Only by putting the words into a semblance of order can you see how muddled they still are. An essay is too big and complex to hold entirely in your head, so you need to have the words in front of you to really think it through.

A defining quality of writing, as opposed to speaking, is that it can be redone. You can keep working on it until it’s ready. Writing is rewriting. ‘Writing,’ the American author Kurt Vonnegut said, ‘allows mediocre people who are patient and industrious to revise their stupidity, to edit themselves into something like intelligence.’ Not that I’m saying you’re mediocre. I’m just saying that the great thing about writing is that you can keep reworking it until you sound like the best, most perceptive and insightful version of yourself. And who wouldn’t want to spend time doing that?

Tip 2: Trust your ear

The best way to iron out mistakes and awkwardness in your writing is to read your work aloud. Trust your ear. Language is innately rhythmic and musical. Even the way you say your phone number to someone else has a rhythm, as you split it into two or three phrases. That is why we find the automated voices of satnavs and public address systems, with their random rise and fall, so alien. They don’t sound human because they don’t speak with human rhythms.

If you get the rhythm of your writing right, the other things tend to fall into place. Most people know more of the grammatical rules of writing than they think they do. You probably know where the subject and verb should go in a sentence, even if you can’t identify them. (Most people can’t.) You know the subject and verb go at that point in the sentence, and in that order, because it sounds right. If it sounds right, it’s probably grammatically right; if it sounds wrong, it’s probably grammatically wrong. You should certainly trust your ear more than the grammar check on MS Word, which is pretty useless.

You can test the flow and sense of your writing when you read your work aloud, because the ear is very sensitive to dissonance, in the same way that you can tell if a singer has hit a bum note, even if you don’t know what the note should be. Reading your work aloud slows you down (you read much quicker when you’re reading silently) so you’re more likely to notice if something sounds wrong. Reading aloud forces you to renotice what you have written.

There is an even better way. When you read your own writing aloud you already know what you meant, and you augment that meaning by accenting and stressing, speaking faster or slower, higher or lower – all ways of making your meaning clearer and reducing ambiguity. Better, if you can bear it, to get a friend to read out your sentences for you. If they stumble over a word or phrase, it might be a clue to revisit it.

Tip 3: Cut all unnecessary words

Which of these sentences sounds better to you?

  1. When I was a child, I used to have a terrible temper.
  2. As a child, I used to have a terrible temper.
  3. As a child I had a terrible temper.

I say the second is better than the first, and the third is best. You don’t need both when and used to, because they convey the same thing. And, come to think of it, you don’t need both as and used to either, because they too convey the same thing. The third sentence takes the least time and effort to read. Cutting unnecessary words always makes your writing cleaner and more elegant.

For instance, repeating a word in a sentence can sound clunky:

By choosing to narrate the novel in the first person the author makes the novel more vivid.

Better version: The use of the first person makes the novel more vivid.

The story of Cinderella is a well-known story.

Better: The story of Cinderella is well-known.

The book’s title establishes the theme of the book; the book’s first paragraph establishes the voice of the book.

Better: The book’s title establishes its theme; the first paragraph establishes its voice.

Also, do you really need all those vague qualifiers like very and rather, and do you need two vague adjectives when one would do?

This piece of writing is a very poignant and heartfelt one.

Better: The writing is heartfelt.

It’s easier for the reader to quickly grasp the meaning of your sentence if you cut all needless words:

Portraying the nature of people to be driven by violent instinct is present in many other novels.

Better: People driven by violent instinct appear in many other novels.

Most memoirs choose to mirror the strict chronological nature of life itself within the structure of their works, although this is not always the case. Some memoir writers choose to employ non-chronological structures.

Better: Most memoirs mirror the chronological nature of life in their structure, but not all.

So: write more words than you need and then go through your draft cutting the ones you don’t need. Just as your speech is full of ums and ers and repetitions, your first go at any piece of writing will be full of unnecessary verbiage.

A writer makes meaning not just by adding words but by taking them away. The playwright David Mamet said that ‘Omission is a form of creation.’ Cutting words is as creative an act as writing them. It often makes your meaning clearer to yourself. It’s a bit like being a sculptor, looking for the beautiful form hidden in that rough block of marble by chipping away at all the superfluous stone. Cutting words has this same creative quality. Sometimes it can liberate a meaning that you weren’t quite aware of but that was waiting there to be found.

Tip 4: Learn the power of the full stop

In the age of texting and social media, full stops are going out of fashion. The dialogic visual language of texting speech bubbles, pinging left and right on your phone, has little use for full stops. A single-line text needs no punctuation to show that it has ended. Instead of a full stop, you press send. Studies have shown that young people tend to interpret full stops in texts as curt or passive-aggressive.

But writing is not a speech-balloon text waiting on a response. The point of writing is to communicate in a way that does not require you to explain it any further. A sentence gives words a finished form that should need no clarification. It is its own small island of sense. So, with any kind of semi-formal writing addressed to people you don’t know well (such as the tutor marking your essay), the full stop, and where you decide to put it, are crucial.

Only when the full stop arrives can the meaning of a sentence be fulfilled. The full stop should be like a satisfying little click that moves your prose along slightly so that the next sentence can pick up where it left off. If you want to write well, learn to love the full stop. Love it above all other punctuation marks, and see it as the goal towards which all your words move. It is the most powerful punctuation mark: don’t forget to use it.

Tip 5: Don’t make your sentences any longer than they need to be

Last time, I wrote about full stops. Here is another reason why full stops are important: every sentence places a burden on the reader’s short-term memory. A sentence throws a thought into the air and leaves the reader vaguely dissatisfied or confused until that thought has come in to land. The reader has to hold all the sentence’s words in their head until the full stop arrives to close the circle of meaning. The full stop provides relief, allowing them to take a mental breath.

The longer your sentence is, the more the reader has to hold in their head and the more chance there is of something becoming mangled or unclear. This doesn’t mean you should avoid writing long sentences – I will discuss how useful they are in one of my later tips – but it probably means that, if in doubt, you should put a full stop in. A lot of student writing is full of sentences that are longer than they need to be.

When you’re writing a first draft, I suggest you start with short, simple sentences. If you start short like this, it’s easy to add detail and texture, and combine short sentences into longer, more complex ones. But if you start writing long, complicated sentences before you’ve worked out what you really think, then you will find them hard to take apart and simplify. Start simple and make it complex; don’t start convoluted and then have to unravel it all.

Tip 6: Vary your sentence length

The best way to make your writing sound fresh and musical is to vary the length of your sentences. Paragraphs tend to work well when they are a group of sentences of varied lengths. At the end of every sentence there is what’s called a cadence – a drop in pitch (whether you’re reading it aloud or silently) as the full stop arrives. This signals to the reader that the sentence, and the sentiment, are done. Varied sentence length makes for varied cadences. This makes writing breathe, move and sing.

Short and long sentences also do different things. Short sentences make key points or recap them, and trade in relatively straightforward statements about the world. Long ones take readers on a mental tour, list a whole series of things or stretch out a thought. Short sentences give the reader’s brain a bit of a rest; long ones give it an aerobic workout. Short sentences imply that the world is cut and dried; long ones restore its ragged edges. Short sentences are declarative and sure; long ones are conditional and conjectural. Vary your sentence length and you mirror the way the mind works, veering between conviction and nuance.

Vary the length of your sentences!

Trust me.

It works.

Tip 7: Put the important stuff at the end of the sentence

A good English sentence, however long it is, moves smoothly and easily towards its full stop. The best way to ensure this happens is to put the important stuff at the end. A sentence ordered like this feels more deliberate and memorable – just as, when you stop speaking, what sticks in your listener’s mind is the last thing you said.

Typically, the word or words at or near the start of a sentence are the subject. The words at the end of a sentence are typically the predicate: the main verb and its complements. The predicate adds new information that the next sentence may then comment on as a given. So the predicate often turns into the subject of the next sentence. Weak sentences break this given-new pattern. The subject is stronger than the predicate and the sentence ends with an unresounding phhtt.

If you write that something is an interesting factor to consider or should be borne in mind or is very relevant in today’s society, then your predicate is not saying much, because those things could be said about lots of things. I call these sentences pretending-to-care sentences. They turn up a lot in student essays, particularly in introductions, because you’ve essentially been assigned a task and told to come up with something to say about it. Here are a few examples:

  • Poems dealing with the theme of death include great works by many different authors.
  • The issue of gender equality appears in thousands of texts from different writers all around the world.
  • Each of these writers has something special and unique about them.
  • These stories, written in different time frames, touch on many different subjects.
  • Malcolm X produced potentially one of the most influential autobiographies to ever exist.
  • Throughout my essay, I will use a wide range of secondary sources, making my argument more objective.

There’s nothing drastically wrong with any of these sentences. But there are two problems with all of them: they don’t say very much, and they end flatly. Look at the second half of all these sentences: the predicate (touch on many different subjects, have something special and unique about them, include great works by many different authors etc.) could apply to lots of things.

Let’s have a go at fixing a couple of pretending-to-care sentences.

Childhood is a stage in life that everyone has experienced.

Better version: All of us were children once.

The theme of love is one which has reoccurred throughout various texts in the literary tradition since its very beginning.

Better: Love is a recurring literary theme.

The revised versions are better not just because they use fewer unnecessary words, but because they end strongly, with the key bit of information at the end of the sentence. If you do this, the full stop will arrive with a satisfying click.

Tip 8: Your writing must speak all on its own

First, some words from the author Verlyn Klinkenborg:

‘When called to the stand in the court of meaning, your sentences will get no coaching from you. They’ll say exactly what their words say, and if that makes you look ridiculous or confused, guess what? Sentences are always literal, no matter how much some writers abhor the idea of being literal. In fact, nothing good can begin to happen in a writer’s education until that sinks in. Your opinion of what your sentence means is always overruled by what your sentence literally says.’

Klinkenborg captures here what makes writing so hard. You have to arrange the words in such a way that they can be deciphered in your absence. In writing, meaning derives from just four things: syntax (the grammatical order of the words), word choice, punctuation and typography (that’s things like capital letters and italics). Part of you thinks that you will be able to hover over the reader’s shoulder as they read what you’ve written, saying ‘That’s not what I meant. This is what I really meant!’ You won’t. The only thing the reader can use to access your wonderful ideas is your words. Writing is made of marks on the page and nothing else.

In their book The Elements of Style, William Strunk and E.B. White advise: ‘When you say something, make sure you have said it. The chances of your having said it are only fair.’ When you write a first draft, it is very unlikely that you will have said exactly what you think you have said. That’s why you need to read your work over, read it aloud, redraft it, proofread it. Then you find out if you have said what you wanted to say.

Writing is a strange, cumbersome, artificial process. It takes a lot of work to make your words clear to the reader. The comic singer Neil Innes used to start his act with this line: ‘I’ve suffered for my art. Now it’s your turn.’ Don’t be like that. Don’t show the reader how tedious you found writing your essay by making them suffer as well. Writing should be an act of generosity, a gift from writer to reader. The gift is the work you’ve put in to make your meaning clear and your sentences a pleasure to read.

Tip 9: Avoid paragraphs of very different lengths

The paragraphs in your essay should not be of dramatically different lengths. That doesn’t mean they have to be exactly the same length. But if you have a two-page paragraph followed by one that is two sentences long, it’s a sign that you need to reshape your essay. There is no rule about how long a paragraph should be, although I don’t like to make mine longer than about 250 words.

A good, basic way of thinking about a paragraph is that it is a single idea, developed into an extended thought. You introduce your point at the start of the paragraph and spend the rest of it developing that point, using examples, supporting quotes, evidence, qualifications and counter-arguments. If your paragraph is only a sentence long, it either means that your idea needs to be developed further, or that it doesn’t merit a paragraph of its own. If your paragraph is two pages long, it means that it contains several ideas that each need their own smaller paragraph.

Paragraphs allow you to put similar material in your essay in the same place. A common phrase that occurs in student essays is ‘As previously mentioned’, or ‘As mentioned earlier’. In which case, why didn’t you also mention this point earlier, when you were talking about that subject? Put similar material in the same place in your essay.

The first and last sentences of each paragraph carry a lot of stress. They are a good way of nudging your argument along. Try making those sentences fairly short, so they can quickly introduce what’s to follow or wrap up a point.

Tip 10: Choose the right word

Be specific in your choice of words. It helps if you learn to be interested in word origins. Did you know that humility and humour are both linked etymologically to humus – the soil, the earth – and to a human, who is thus, linguistically, an earthling? Did you know that capricious originally referred to the behaviour of a typical goat (Capricorn being the sign of the goat)? Did you know that obscene originated in Greek drama as ob-skene, which means ‘offstage’? Did you know that immediately means ‘without any intervening medium’ – nothing comes between it?

If you know what a word’s origin is, you’re more likely to use that word appropriately. Try to avoid what I call ‘thesaurus words’, where you’re looking for an alternative to a word and find a synonym in the thesaurus facility on MS Word. No word means exactly the same as another one. The right word is rarely the longest, most complicated or most impressive-looking word. It’s just the word that perfectly fits what you want to say in that part of the sentence.

Be aware of what I call ‘crutch words’: the off-the-shelf words that you use a lot. For me, it is words like merely and simply. A common student crutch word is somewhat, often used wrongly. Another crutch word, to describe a book or fictional character, is relatable, an adjective that doesn’t mean much. Use the ‘find’ facility on MS Word to see if you use a particular word a lot. If you use lots of crutch words, your prose may sound muddy and dull.

Choosing the right words is hard, and our first efforts often sound slightly wrong or try-too-hard. The right word rarely comes to you immediately (‘without any intervening medium’). Go through your essay looking at every word, particularly the nouns and adjectives. Is that really the right word? Did I mean to say that? Can I come up with a more exact and informative way of describing this poem than emotional or poignant or relatable?

Good writers also tend to be interested in words themselves: their look, feel, shape and sound. ‘We must remember how wide the word “Iowa” is,’ the American writer William Gass once wrote. ‘We must bear in mind how some words are closed at both ends like “top” or are as open as “easy” or as huffed as “hush.” Some words click and others moan. Some grumble. Listen to the way the word “sister” is put together. Can you feel the blow which chops off the end of “clock”?’ Cultivate this kind of granular interest in words and they will – I promise – pay you back a hundredfold.

Delivering the Undeliverable: Teaching English in a University Today

Here is a free-access link for an article I wrote for the journal English about university English teaching. It is more timely than I would like. Every week now seems to bring more news of redundancies and course closures in university English departments. This piece is an attempt to address this reality without being too depressing: