The BBC at 100

I wrote this for a TLS feature in which various contributors discussed what the BBC means to them:

The BBC makes no ideological sense in a post-Thatcherite world. Public goods have come to be seen as the servant of individual, rational consumers, who are meant to know what they like before they have seen or heard it. And what does British mean, anyway, in an era of devolution and digital fragmentation? A nation is an imagined and often imaginary community. Viewers in Lerwick or St Helier have always watched television with a different eye and ear to those in London, except during those years when they couldn’t watch it at all, because the transmitters did not reach them. And we are forced, by law, to pay for it all! In The Kingdom by the Sea Paul Theroux includes our willingness to pay the licence fee in a long list of what makes the British crazy, alongside their habit of wallpapering ceilings and putting little knitted bobble hats on soft-boiled eggs to keep them warm. That book was published in 1983, before the BBC’s free-market critics had gathered in force.

What can one say in the BBC’s defence? Only that the things that are truly precious in life can’t be fully audited or rationalized. I value it because it tells me that I am more than just a consumer, more than just part of a statistical aggregate of viewers or listeners. I am one member of a diasporic national community, loosely assembled in 20 million living rooms. That community is hard to put one’s finger on and easy to destroy. But precisely because it demands so little of those who belong to it, it can create a sense of commonality among people with little else in common.

Defenders of the BBC tend to shout about its news coverage, its flagship documentaries or its prestige dramas. I prefer to think of those programmes that just tick along in the background – The Sky at Night, Gardeners’ World, Farming Today, Choral Evensong, two-minute snatches of birdsong on Radio 4, documentaries about Hebridean islands on BBC Alba. Or of those radio presenters who got me through lockdown by talking away reassuringly in the corner of the room. Watching TV and listening to the radio are everyday, low-intensity and often private acts. It is easy for us to forget what a vital part of our lives they are, and how much the BBC would be missed if it were gone.

The plight of the humanities

I wrote this piece for Times Higher Education earlier this month:

In Samira Makhmalbaf’s film Blackboards (2000), a group of men trek through the mountainous regions of Iranian Kurdistan, stooped under the huge blackboards strapped to their backs. They are itinerant teachers looking for pupils, anyone who will pay them for lessons with money or food.

“Do you know how to write? Would you like to learn?” they say to everyone they meet in earnest, badgering tones. In mountain villages, where the locals hide inside their houses, they shout: “Open the windows. Answer me! I’ve come a long way to teach your children to read and write.” Everyone ignores them or tells them to go away.

As far as I know, teachers in Kurdistan do not roam around with blackboards on their backs, hawking their wares. The blackboards in the film are a conceit, a surreal and beautiful metaphor for the inbuilt asymmetry of education – for how often it rests on offering something as yet unquantifiable to an audience of the apathetic or unconvinced.

I am starting to wonder, as a humanities teacher in a UK university, if this will be my fate – wandering the streets with a laptop and data projector, offering up lectures on Shakespeare’s sonnets or the contemporary novel to random pedestrians. Every week brings news of more planned redundancies and course closures in my field. The hollowing-out of departments by voluntary severance and early retirement is happening more widely, under the radar. The cuts are coming mostly in the post-1992 universities and others outside the research elite.

What adds to the teachers’ desolation in Blackboards is that they are such terrible salesmen – so shrill and needy in their pitches to potential pupils. The humanities have a similar problem. We urgently need a better story that explains the worth of what we do. And yet, as the writer and psychotherapist Adam Phillips argues, often we fall back on idealised justifications of the humanities that “betray, in their vehemence and the nature of their claims, a lack of confidence”.

They make it seem, as Phillips says, as if we are defending a religion or a local hospital under threat of closure. We oversell the humanities as impactful, transformative and game-changing in ways that don’t sit well with the quiet, patient, accretive methods of the scholarship. Or we claim, without much evidence, that the humanities have cornered the market in “critical thinking”, or that studying them makes us more empathetic and humane. If I were a scientist, I would be irritated by the suggestion that my work did not inspire the same kind of emotional literacy. A geneticist or neuroscientist need value human life and consciousness no less for being able to glimpse their makeup in DNA’s double helix or the lump of jellified fat and protein inside our skulls.

We shouldn’t need to overclaim for the humanities like this, because they explore something essential about the human species. We are interpretive animals. “People live by narrative,” Boris Johnson said when being interviewed for a profile in The Atlantic last year. “Human beings are creatures of the imagination.” On this, at least, we can agree. Making shared meanings is almost as vital to us as our animal needs for food, water, shelter and sleep. Every human-made system – literature, art, music, religion, money, the law, the constellations of stars – demands that we swallow its story. The success of our species derives from our ability to weave these intersubjective webs of meaning.

The price we pay is to become overly immersed in these invented worlds, so that it is hard to break their spell and embrace other realities. In Marilynne Robinson’s words, “We live on a little island of the articulable, which we tend to mistake for reality itself.” The humanities show us how to read, with imaginative sympathy and watchful scepticism, the stories we tell ourselves. They thwart our tendency to impose a false neatness and coherence on our lives.

This process is sometimes awkward and uncomfortable. The cuts to humanities departments have coincided – and not coincidentally – with newspapers and politicians caricaturing these departments as hotbeds of woke ideas and resentment-filled identity politics. When the humanities question a society’s well-worn and consoling narratives about itself, it can unsettle and annoy people. Buying into collective meanings is a bit like riding a symbolic bicycle. If we think too much about working the pedals and keeping our balance, we fall off. Falling off a bicycle is painful and makes us look foolish. So most of the time we prefer not to explore too closely the meanings that undergird our activities. We just carry on pedalling.

It is hard to quantify what would be lost if the humanities weren’t there. They don’t come up with solutions like the sciences do, in the form of, say, new vaccines or alternative sources of energy. They can’t produce anything like the beautiful, elegant economy of a mathematical equation – the formula that explains so much with as little effort as possible. Science moves forward with these breakthroughs and discoveries; but the humanities are cumulative, not progressive. Every text examined in the humanities is trying to solve differently the riveting but essentially unsolvable puzzle of being human. Every text is as fascinatingly flawed, as infinitely granular, as limitlessly miscellaneous, as the human being who made it.

So the humanities pursue not solutions but the more careful elaboration of problems. They layer on meaning and erudition, continuing that conversation begun tens of thousands of years ago when Homo sapiens went deep into caves to blow ochre dye on the walls. The results are incremental and hard to measure. Scholars and students just gradually become better writers, readers and thinkers, and more subtle sense-makers of their own and other people’s lives. Bit by bit, the humanities deepen and enrich the act of collective meaning-making.

These slow-burn, incalculable effects mean that the loss of the humanities would be rather like the loss of habitat in the natural world – something profound and far-reaching that occurs piecemeal and unnoticeably, while our attention lies elsewhere. Most people don’t miss that wildflower meadow now that it has become a motorway, especially if they didn’t know the meadow was there in the first place. They didn’t notice the number of migrating birds or pollinating insects declining, because the birds and insects didn’t announce their departure and we were looking the other way. But something precious was lost, all the same.

Of course, people have foretold the death of the humanities for decades. “The humanities are at the cross-roads, at a crisis in their existence,” the historian J. H. Plumb wrote in his introduction to the Pelican book Crisis in the Humanities in 1964. The humanities survived, and indeed student enrolments remained healthy for the next half century. True, the share of students doing the humanities has been falling in UK universities since 2012, in favour of the sciences. This is probably a result of the relentless focus on STEM in schools, and austerity and higher tuition fees driving more career-specific choices. But the fall is not precipitous and might reverse at some point, as subjects go in and out of student fashion.

The problem is not some existential threat to the humanities in general, but to their future outside the elite institutions. What we are seeing now is the long playing out of the government’s decision in 2013 to end student number controls in England. Since this came into force in 2015, the high-ranking universities have made up shortfalls in humanities admissions – or over-recruited in an area seen as cheap to resource – by taking students who would have previously gone elsewhere.

As stories about closures and redundancies in the humanities have broken, there has been much criticism of the predatory student recruitment practices of the Russell Group. But what they are doing is wholly consistent with government policy: student fees and other market mechanisms should increase competition and curb “artificial demand”, even at the cost of the closure of courses and, perhaps, entire universities. Subjects like languages, literature and history risk becoming the new Classics – a luxury taught in what newspapers now routinely call the “good” universities (not elite, or even best, but good).

According to the rational choice economics that now dominates our public life, a university education is a “disutility” – the sacrifice of one’s time and convenience for money. What matters is not so much the learning itself but what it leads to. In the crudest metric, this means a job with a salary high enough to justify the expenditure of the tuition fees. The government’s definition of a “good” university course is one where the size of its fees correlates with the size of salary a graduate of that course can command.

This inevitably favours the elite universities, especially since, with a greatly increased stock of graduates, employers tend to use university rankings as a short cut when sifting job applications. Those who dismiss humanities courses at post-1992 universities as “low value” assume, or at least pretend to assume, that the marketisation of universities established its own natural hierarchy of winners and losers. All it did was favour the entrenched reputations of the elite institutions and the tendency for social hierarchies to be self-fulfilling.

The current political orthodoxy is that social mobility is best achieved by the meritocratic rationing of elite education – by getting more students from poor or modest backgrounds into Oxbridge and other Russell Group universities. There is no evidence that this approach is increasing social mobility. On the contrary, education has been a key driver of what sociologists call “effectively maintained inequality”: the tendency for more affluent families to strategise and over-exploit opportunities. Middle-class children have benefited from inherited social and cultural capital, private education or tutoring, their parents buying into good school catchment areas, and the unconscious rewarding of their social confidence and ease. This gives them a much better shot at attending the elite universities. Although participation in higher education has risen hugely in the UK since the early 1990s, it continues to be sharply stratified according to race and class. The much smaller increase in the number of working-class and black and ethnic minority students has been heavily concentrated in the post-1992 and non-Russell Group universities.

If the humanities are driven out of these universities, these are the students who will disproportionately miss out on them. They will be the ones steered away from the “low value” degrees towards learning a trade or doing a more vocational course. Meanwhile Oxbridge humanities graduates – including the prime minister, his former chief adviser Dominic Cummings and three members of Johnson’s current Cabinet (Stephen Barclay, Kwasi Kwarteng and Jacob Rees-Mogg) – continue to fill positions of power and influence. But then our hierarchised UK university system, as a former colleague once said to me, is the last bastion of socially acceptable snobbery.

Why does this unequal access to the humanities matter? It matters because the humanities are, ultimately, about hope and possibility. They began in the creative ferment of the Renaissance, with its rediscovery of the secular accomplishments of classical civilisation. The studia humanitatis lauded the fullness of human potential. The humanities show that people, with all their imperfections, fragilities and self-deceptions, are beautiful and irreplaceable beings of incalculable worth. They show that human beings are more than simply human capital. They show that every person has a deep reservoir of potential that we can’t begin to fathom until it is fulfilled, because it will somehow be tied up with this messy, uncontainable human impulse to make our lives meaningful. The humanities are not for everyone, but everyone should have an equal chance to study them.

As for the humanities scholars under threat of redundancy, they cannot easily retrain as computer coders or ballet dancers. For many, their working lives will be over prematurely. Those left standing will be unsure what the future holds – even if, like Robert Burns addressing the mouse, they “guess and fear”. They are becoming acquainted with the precarity that new PhDs and early career scholars suffer routinely.

This is especially unfortunate because, while the marketised university feeds off flux and uncertainty, humanities scholarship feeds off security and confidence. Humanities scholars know more than anyone that people only thrive inside self-spun webs of meaning. Humanities work requires huge investments of time and effort in getting to know one’s material intimately, keeping the faith that others will find this worthwhile and that it will form one more slender thread in the web of meaning. We have to keep telling ourselves that, just because what we do is slowly accruing and often too ineffable to turn into data, that doesn’t mean it doesn’t exist, or doesn’t matter. When we start to question that faith, it corrodes our focus, motivation and well-being.

All we can do is carry on with the work. In Emily St John Mandel’s 2014 novel Station Eleven, a nomadic troupe of actors perform Shakespeare to pockets of survivors from a global pandemic that has killed most of the world’s population. The lead caravan of their troupe bears the legend Because survival is insufficient.

I have taken to incanting this line, which Mandel got from an episode of Star Trek, as a justification for the humanities. It is not enough just to live. We need to know that our lives, with all their griefs and joys, are meaningful. Exploring life’s meanings more carefully could never be a waste of time, even if all the political mood music at the moment tells us otherwise. The humanities matter and we are right to keep believing in them. Survival is insufficient.

Housing is a human right

This is my review of Vicky Spratt’s book Tenants, and Daniel Lavelle’s book Down and Out, that appeared in the TLS on 22 July:

Like access to clean, drinkable water, the right to adequate housing is recognized by the United Nations. In the UK, in the space of a generation, this right has been gradually eroded by stark housing inequalities. These inequalities are the product of political will, market ideology and the unintended consequences of both. The forty-year property boom has been so spectacular that many houses now earn more in equity than their occupants do in their jobs. Attempts to help those priced out of the market through optimistically named “affordable housing”, stamp-duty cuts and Help to Buy schemes have only tinkered at the edges of the problem or inflated the market further. Meanwhile, the selling off of council housing has pushed most tenants into the private rented sector, where they have few of the legal protections of tenure available in other European countries.

Two new books by Vicky Spratt and Daniel Lavelle address this intricate ecosystem of housing inequality and the ways in which it is reshaping Britain’s social and economic landscape. Both authors have lived at the sharp end of the problem. When Spratt was seven she learnt never to answer the door to bailiffs, but still her family lost their home. As a young adult she rented tiny box rooms in houses with damp and mould, and lost a fortune in deposits to dodgy landlords. She is now the i Paper’s housing correspondent, writing not about property hotspots and fantasy house hunts, but about the human costs of the housing crisis. Lavelle grew up in care, moving between special boarding schools, foster homes and children’s homes. After leaving university he was homeless for two years, living in tents and hostels, or sleeping on friends’ sofas. In 2019 he co-wrote the “Empty Doorway” series in the Guardian, recording the lives of homeless people who died on the streets.

Tenants is the more densely researched book, being based largely on interviews Spratt conducts with evicted tenants, grassroots activists, support workers and experts in housing law. Down and Out is rawer and more personal, combining Lavelle’s own story with those of the insecurely housed and homeless people he keeps in touch with from his time in care and in hostels. Spratt focuses mainly on tenants facing or experiencing eviction; Lavelle explores a twilight world of sofa-surfing, hostels, night shelters and rough sleeping. Taken together they make plain how paper-thin is the divide between the cheaper end of renting and being thrown out on to the street. They show that all it takes to be made homeless is to be surprised by illness, redundancy, a break-up or simply a landlord who decides on a whim that they want you out. Section 21 of the Housing Act 1988 allows private landlords to evict tenants at short notice without giving a reason. In the Queen’s Speech of December 2019, the government pledged to abolish these “no-fault” evictions, but it has yet to do so.

Both books consider the Thatcher era to be, in Spratt’s words, “ground zero for the mess we are in now”. The 1980 Housing Act gave millions of council house tenants the right to buy their homes, at market discounts of up to 50 per cent. As Spratt points out, this was no rocket boost for homeownership, which in England has increased only slightly from 56.6 per cent in 1980 to 64.6 per cent in 2020. More than 40 per cent of ex-council homes sold under Right to Buy are now owned by private landlords. For Spratt, the key driver of housing inequality was the political decision to outsource the rental sector to unqualified and unregulated individuals, private landlords, many of whom have neither the time nor the resources to manage their properties properly. Nearly half of housing benefit, about £10 billion a year, goes straight to them.

Lavelle, with typical pungency, calls Right to Buy “the greatest heist in modern history, a heist perpetrated under the guise of giving people a stake in public assets they already had a stake in”. In truth, as the more restrained Spratt concedes, Right to Buy was not an original Thatcherite policy. A less heavily discounted version of it appeared in the Labour Party’s election manifesto in 1959. The traditional Conservative policy of encouraging home ownership – Anthony Eden’s espousal in 1946 of a “nation-wide property-owning democracy” – gradually became a cross-party consensus in the postwar years. Labour retained plans for ambitious council house-building, but New Labour shelved them as it tailored its policies to existing homeowners. Between 1998 and 2010, 6,330 council homes were built, just over a third of the total built in 1990 alone, the last year of the Thatcher government.

The 2008 financial crash made things vastly worse in two ways. First, banks wanted bigger deposits and tightened affordability checks for mortgages. They ploughed money into buy-to-let mortgages, with investors being seen as a safer bet than first-time buyers. This pushed many more people into renting. Second, austerity made life more brutal for renters on low incomes. In 2010 George Osborne cut housing benefit and barred single people under thirty-five from claiming it to live in a place of their own. Cuts to local-authority budgets meant less money for hostels, shelters and drug and alcohol dependency services. Councils are now so strapped that they operate what Lavelle calls “a misery contest for housing, a sort of X Factor for the destitute”. While living in a tent along a bridle path in Saddleworth, Greater Manchester, he was told he was not a “priority need” because he did not present with any other vulnerabilities. He was “homeless, but not homeless enough”.

These two effects of the financial crash combined catastrophically with one non-effect: house prices and average rents carried on rising. Adding more renters to the mix drove up demand, and landlords put up prices accordingly. Each chapter of Spratt’s book is preceded with data on the sales and rental market in the area she is writing about, which underlines how hopeless the situation is for many. In Peckham, south London, for instance, the average price of a flat in 2021 was £450,865 and the average monthly rent for a one-bedroom home was £1,394.

Housing inequality bears out Claudius’s maxim that sorrows come “not single spies, but in battalions”. Its victims are invariably dealing with contributory and aggravating factors: casualized work, stagnating wages, welfare cuts and debt. Spratt compares the housing crisis to a virus that “infects its hosts and multiplies to make everything more difficult for them”. Living in cramped, ugly, broken-down surroundings is bad for anyone’s mental health. Damp and mould bring respiratory problems and other illnesses. Those in poor, overcrowded housing suffered most in lockdown, and were more likely to catch and spread Covid. Having to move all the time is stressful and makes it much harder to build support networks. Mindy Fullilove, an American professor of urban policy and health, calls this phenomenon “root shock”, after the trauma a plant experiences when it is moved carelessly to shallow soil.

The homeless people Lavelle speaks to struggle with three big problems: experience of abuse, mental illness, and drug and alcohol addiction. One reason that spice (a synthetic cannabinoid) is so popular on the streets and in hostels, Lavelle says, is that it makes time go quickly. He is open about his own problems. The victim in infancy of a family trauma that he can’t write about for legal reasons, he spent much of his childhood being excluded or expelled from school and attending special educational establishments. The psychiatrist who diagnosed attention deficit hyperactivity disorder when Lavelle was seven “didn’t need to strain his diagnostic skills too much”. He does not sound like the most compliant hostel dweller, getting into arguments with other residents and with the supervisor who, when Lavelle has the flu, refuses him Strepsils because they contain alcohol. At one point he punches a hole in a windscreen. “If being one’s own worst enemy was a sport, I’d be a Hall of Fame world champion”, he concedes. His point is that the people who end up in hostels and shelters are often hard to handle, but that this should not affect the support they receive. The NHS, after all, does not require character references before admitting you to A&E. Adequate housing is a human right, not a reward for good behaviour.

Lavelle’s bête noire is what he calls “philanthrocapitalism”: the voluntary sector, charities and private companies that have taken over homeless provision in the age of austerity. These organizations are self-regulating because they provide “support” rather than “personal care”. They are not monitored by the Care Quality Commission or Ofsted, even though those they support may be as young as sixteen. Nor do they have to comply with Freedom of Information requests. They can ask hostel residents to work for a meagre weekly allowance rather than the minimum wage, impose strict rules on their behaviour and evict them for minor breaches.

Both Spratt and Lavelle advocate the Housing First model pioneered in the 1990s by Sam Tsemberis, then a clinical psychologist at New York University. Housing First provides homeless people with their own home straightaway, with no preconditions. They are not required to move gradually up the ladder from a night shelter to more secure housing. They do not need to get a job first, or obey someone else’s house rules, or abstain from drugs and alcohol. Only when they have been housed are their other needs addressed. Tsemberis tells Lavelle that Housing First is more about providing treatment for addiction and mental health problems than about housing, but says “you can’t really talk about the treatment unless the person is housed, otherwise the whole conversation is only about survival”. Housing First has been successful in the American states and cities where it has been rolled out, as well as in Finland, the only EU country where homelessness is falling.

These books argue convincingly that investing in more social housing would benefit everyone, not just those who live in it. It would ease pressure on the rental sector and make private landlords compete with an alternative source of good-quality and secure homes. It would take the heat out of the housing market and start to counter the inheritocracy in which older people sitting on equity pass it on to their children. It would alleviate other forms of injustice, since housing inequality falls disproportionately on Black, Asian and minority ethnic tenants. It would make the kind of housing safety scandal exposed by the Grenfell Tower fire less likely. And it would reduce homelessness, which places immense strain on the police, the criminal justice system, the NHS and councils.

More subtly but profoundly, investing in social housing would soften, for millions of people, that repeated blow to their self-esteem that comes from being beholden to someone else. It is fatiguing and confidence-sapping to live with stained mattresses and broken shower curtains; to show potential buyers, your would-be evictors, around your home; to be fearful of provoking your landlord by making a fuss about repairs; to feel stuck in a limbo of enforced adolescence, waiting for grown-up life to begin. All this makes for what Spratt calls “an uneasy and constant refrain, your life sung to the tune of the privilege of others”. People waste hulking portions of their lives looking for rented rooms, dealing with bad landlords, extricating themselves from nightmare house shares and moving their stuff from one room to the next. What could be achieved with all that energy if it were expended more creatively?

Perhaps something is stirring. Spratt highlights the work of the community union ACORN and organizations such as Generation Rent and Safer Renting, which fight for tenants’ rights. She reports on eviction resistance bootcamps and renters fighting back against gentrifying regeneration schemes. In 2016 she fronted a successful campaign to get letting fees banned and deposits capped. Ultimately, though, her book leaves you with the sense that nothing much will change while the haves (homeowners and investors) outnumber the have-nots (renters and the homeless).

Housing inequality has barely been mentioned in recent election campaigns. Spratt is told that when David Cameron was prime minister, the phrase “housing crisis” was banned in government. In the early 2010s, while working as a junior producer on Newsnight, she suggested covering more housing stories, but her editor – homeowning and privately educated – told her that they were “just not that interesting”. The same kind of willed obliviousness allows newspapers to place the blame for the housing crisis on immigrants, or on young people who are buying too many avocados or espressos to save for a deposit. In the face of such simplistic explanations, these books enrich our impoverished sociological imagination. Their case studies are as bleakly memorable as Raymond Carver stories. A Brighton man sleeps in his work van at the height of the pandemic, after losing his flat just before the second lockdown. A woman evicted from her flat in Peckham, who has recently attempted suicide, is told that if she does not accept a flat in Croydon she will have made herself “intentionally homeless” and forfeited any right to support. When she asks how she is meant to get to work or take her daughter to school, the placement officer tells her to “get up earlier”. Lavelle spends a wet and freezing November night wandering around Oldham, making endless circuits of the shopping centre until it closes and “laughing maniacally about what a parody my life had become”, before huddling underneath a bridge.

Both books retell the story of Gyula Remes, the Hungarian national who, just before Christmas 2018, died in the underpass leading from the Houses of Parliament to Westminster Tube station. At forty-three, he was one year short of the mean age at death of a homeless person in England and Wales. This story received wide coverage because it seemed shocking that MPs could routinely walk past the effects of the austerity for which many of them had voted. But it is not really so shocking – because MPs are no different from the rest of us, we who avert our eyes from the daily disaster playing out in the pile of blankets on the other side of a pavement. These books succeed in reinserting a whole person into that human-shaped heap: someone with a name, a family, a life history that led them there and a body just as achy as ours would be if our bed were made of stone.

The Noel Edmonds Creed (a found poem)

I, as a follower of Noel Edmonds, believe all that Noel Edmonds believes.

I believe that disease is caused by negative energy.

I believe that death is just a word in the dictionary.

I believe that the most appropriate word is ‘departure’ because we are energy and you can’t create or destroy energy, you can only change its form.

I believe that we are surrounded by electro mist, fog and smog.

I believe that we are covering ourselves in the wrong sorts of electro-magnetism.

I believe that the biggest problem we have is not Ebola, it’s not Aids, it’s electro smog.

I believe that the Wi-Fi and all of the systems that we are introducing into our lives are destroying our own natural electro-magnetic fields. All you are is energy, remember that.

I believe that renewing the BBC’s charter would be an act as futile as giving medicine to a corpse.

I believe in Orbs. Orbs are little bundles of positive energy and they think they can move between 500 and 1,000 miles per hour.

I believe that there are two orbs that visit me. The two that I have are about the size of melons. One sits on my arm and the other is usually in the back of the shot, sitting just over my right shoulder.

I believe that you don’t live life, life lives you.

I believe that I wrote a wishlist of ambitions to the cosmos and, like a mail-order company, it delivered my wife, who was working as a make-up artist on Deal Or No Deal.

I believe that every single human being can achieve a perfect vibrational balance between their positive and negative energy.

I believe it is possible to retune people.

I believe that Ant and Dec are excellent presenters. They’ve been honest, they’ve plundered the House Party archive and created Takeaway. I don’t have a problem with that. I take it as a compliment.

I believe that all these things have been known about for a very long time.

Amen.

The Nowhere Office

I reviewed Julia Hobsbawm’s The Nowhere Office and Jonathan Malesic’s The End of Burnout for the TLS in February:

Change, writes Julia Hobsbawm, happens “slowly and then all of a sudden completely”. What Hobsbawm calls the “Nowhere Office” – the hybrid workspace that floats between work and home – may seem like Covid’s gift to the world but it was long in the making. For her it is the culmination of trends that have been emerging since the 1980s, when office hours stopped being strictly nine-to-five and the search for an elusive work–life balance began. The pandemic “broke the last threads holding the embedded customs and practices together”.

The Nowhere Office is buoyant about this placeless workspace. Offices, Hobsbawm predicts, will no longer be run by a creed of unthinking presenteeism and will become places we visit for networking, collaboration and community-building. The rest of the work will be done at home or on the move. We will happily cut across different time zones, accessing our files anywhere via digital clouds and dividing up work across a seven-day week, carving out “air pockets of free time” rather than a two-day weekend. The main divide will be between the “hybrid haves” and the “hybrid have nots” – those who are able to move seamlessly between online and offline and those who are not.

Hobsbawm wants to “put the human back in the corporate machine”, and her instincts are all good. She understands that working from home can mean loneliness, isolation and the bleeding of work into our personal lives. And she concedes that “despite the apparent flexibility and freedoms, many inequalities remain and too many people still have to work too hard and too long”. But what if the apparent flexibility and freedoms are the problem? What if the nowhereness of work means that work ends up being everywhere, and we can never disengage from its demands? For Hobsbawm the solution is to give employees more choice and negotiate their consent. They must be disciplined in separating work from life, and their bosses must trust them to work unsupervised. “It will be obvious if people are working well”, she writes sunnily, announcing the end of “the age of being violently busy”.

The book is interspersed with interviews with practitioners and proponents of the Nowhere Office. Most of them are business leaders: chief strategy officers, brand presidents, digital entrepreneurs, investors. Their insights are worth having, even if Hobsbawm’s mimicry of their corporate-speak about “win-win models” and “siloed thinking” does little for her prose style. But one wonders if those lower down the corporate hierarchy might have a less heady take on the Nowhere Office.

According to Hobsbawm, these changes are unstoppable. The future is set fair and all we can do is catch up. “The desk is all but over as a built-in feature of office life”, she says. “Sofas, small theatres, spaces to convene and converse in will be ‘in’.” Her brisk verdicts on the new reality reminded me of that much-repeated formula online, declaring that some new phenomenon “is a thing now”. But why is it a thing, and should it be a thing? The future is neither uniform nor inevitable. It feels too soon to make bold calls, before the pandemic is even over, about what the workplace of the future will look like.

Hobsbawm summarily dismisses critics such as Josh Cohen, David Graeber and Sarah Jaffe as part of “an emergent purist camp” which holds that “work represents a failure of society, certainly of capitalism, and that work is essentially not an opportunity but a threat”. But these critics do not say that work is “pointless”, as she claims, only that a turbo-capitalist conception of work makes excessive and toxic demands on us. Their writing deserves to be engaged with rather than caricatured.

Hobsbawm would probably put Jonathan Malesic in the purist camp. But his acutely felt investigation of work burnout as an “ailment of the soul” makes his the more thought-provoking and substantial of these two books. Malesic is a recovering academic, a former professor of theology at a small Catholic college in Pennsylvania. Like many academics he began his career with unsustainably high ideals, believing he was “a citizen in the republic of letters”. He discovered that much of it was just a job, with unrewarding tasks, soul-sapping hassle, pointless politicking and fears of redundancy. His students, most of whom were studying theology as a core requirement, did not share his enthusiasms and spent his classes looking blank-faced and bored. Soon he was lying in bed for hours when he should have been working, repeatedly watching the video of the Peter Gabriel and Kate Bush song “Don’t give up” and self-medicating with ice cream and beer. After eleven years he gave up the tenure-track position he had worked so hard for. Alongside the sense of failure, he felt intense guilt that he had come to hate such a coveted and well-rewarded job.

As Malesic admits, burnout is something of a buzzword, “an often-empty signifier onto which we can project virtually any agenda”. Our vague definitions of it, and the lack of consensus on how to diagnose and measure it, raise the question of how much we really want to eradicate it. Diagnosing oneself with burnout can, after all, be self-flattering. To be burned out is to be a modern, a victim of the age, a martyr to one’s own high ideals. Burnout’s historical antecedents, the now-forgotten soul sicknesses of acedia, melancholia and neurasthenia, were similar sources of both pride and shame.

Malesic defines burnout usefully as “the experience of being pulled between expectation and reality at work”. We burn out not just because we are exhausted but because our hearts are broken. Our love for work, which we saw as the path to social status and spiritual flourishing, went unrequited. Even in the good times, work could not deliver all we asked of it, and these are not the good times. Aided by market deregulation, employers now see workers as a fixed cost to be reduced. Outsourcing, zero-hours and precarious work have expanded, while more hours are demanded of everyone. The funky offices of tech start-ups, with their games rooms and sleeping pods, are, Malesic writes, “designed to keep you at work forever”. The life hacks touted as burnout antidotes – mindfulness, getting more sleep, working smarter – are superstitions, “individual, symbolic actions that are disconnected from burnout’s real causes”.

Malesic visits an artisan pottery studio in Minnesota, a Dallas nonprofit doing anti-poverty work and several Benedictine monasteries, and spends time among artists with disabilities who cannot find paid work but who form richly supportive creative communities. He learns that work need not be the lodestar of our lives. To heal our burnout, we need to lower our expectations. Malesic now teaches writing part-time at a Dallas university, just one or two classes per semester. He no longer expects the life of the mind to be soul-nourishing and is a better and more patient teacher for it.

We need to see work as, well, work. But this does not mean that it should cease to matter. Malesic cites the French phrase “un travail de bénédictin” – a Benedictine labour – to describe a project that demands quiet, steady effort over a long time to bring it to fruition. This kind of work has little value in a world of annual pay reviews and key performance indicators. But a richly satisfying Benedictine labour can cure us of that self-lacerating cycle of looming deadlines and short-term goals that ultimately benefits only our paymasters.

These very different books have one perspective in common: they both see the pandemic as a chance for reflection and change. “Right now, we have a rare opportunity to rewrite our cultural expectations of work”, Jonathan Malesic writes, “and I hope we will.” So do I.

The Premonitions Bureau

I reviewed Sam Knight’s book The Premonitions Bureau for the TLS in May:

For most of human history, people have believed that we can see into the future. The Bible is filled with prophecies and premonitory dreams; the ancient Greeks put their faith in oracles and in destinies that no mortal being could swerve. “That which is fated cannot be fled”, warned Pindar. As Oedipus discovered, what is going to happen to us becomes what we choose to do.

The Premonitions Bureau, Sam Knight’s elegant and illuminating work of cultural history, transports us back to a mid-twentieth-century Britain still clinging to this faith in precognition – the extra-sensory perception of future events. Precognition, which hinted at “undiscovered reaches of physics and of the mind”, managed to escape the taint of the occult that clung to phenomena such as ghosts and ectoplasm. It teetered on the edges of scientific respectability.

In 1927, J. W. Dunne, an aeronautical engineer, published the bestselling book An Experiment with Time, which remained in print for more than half a century. In 1902, while serving in the Boer War, Dunne had dreamt of a volcano about to erupt on a French colonial island. A few weeks later, he got hold of a newspaper which reported that the eruption of Mont Pelée, on the French Caribbean island of Martinique, had killed 40,000 people. Dunne’s book was a thirty-year history of his own dreams and their intimations of the future. He explained it all with reference to the new fields of relativity theory and quantum mechanics, which theorized that time’s linearity was no simple matter. Dreams that predicted future happenings became known as “Dunne dreams”. On Dunne’s advice, many of his readers began leaving pencil and paper by their beds so they could write down their dreams on waking.

J. B. Priestley, in plays such as Time and the Conways (1937) and An Inspector Calls (1945), drew on Dunne’s work. Priestley also popularized Carl Jung’s theory of synchronicity, which suggested that events could be linked outside the normal logic of cause and effect, such as when a dream foretells an event in the waking world. In Time and the Conways, Alan Conway tells his sister Kay that the secret to life is that time is not monodirectional but eternally present, and that at any given moment we see only “a cross section of ourselves”.

At the heart of Knight’s story lies a remarkable character called John Barker. When we first meet him, in 1966, he is a forty-two-year-old psychiatrist working at Shelton Hospital in Shropshire, one of Britain’s sprawling and overcrowded mental institutions. Barker worked tirelessly to improve conditions at Shelton by phasing out the more brutal treatments, such as electroconvulsive therapy administered without drugs. But he was also frustrated with the professional timidity of his field. Fringe areas dismissed as psychic or paranormal were just waiting to be absorbed into mainstream science, he believed. He was a member of Britain’s Society for Psychical Research and fascinated by precognition.

The book begins with the event that galvanized Barker: the Aberfan disaster of October 21, 1966, when a coal-tip avalanche buried a primary school, killing 144 people, mostly children. The precarious-looking tips above Aberfan had long worried locals, and many spoke of having disturbing thoughts and visions before the disaster. Given how much Aberfan had pierced the national consciousness, Barker decided to ask the public if they had felt any presentiment of it. He contacted Peter Fairley, the science editor of the London Evening Standard, who agreed to publicize his appeal. Barker received seventy-six responses from what he called “percipients”. After prodding them for details and witnesses, he concluded that precognition was a common human trait, perhaps as common as left-handedness. He thought that a small subset of the population might experience “pre-disaster syndrome”, somewhat similar to the way in which twins were thought to feel each other’s pain remotely.

The problem was that, as with most similar evidence, the Aberfan data had been scientifically compromised by being collected after the event. So just before Christmas 1966, Barker and Fairley approached Charles Wintour, the Evening Standard’s editor, about setting up a “Premonitions Bureau”. For a year, the newspaper’s readers would be asked to send in their forebodings of unwelcome events, which would be collated and then compared with actual events. The Standard’s newsroom was soon inundated with letters and telephone calls.

Barker envisaged the Premonitions Bureau as a “central clearing house” for all portents of calamities, “a data bank for the nation’s dreams and visions”. This crowd-sourcing of the collective unconscious recalled the work of an earlier research organization, Mass Observation, which also made use of unpaid volunteers to create “weather maps of public feeling”. Barker hoped that the results would eventually be uploaded to a computer database, and that the Bureau would issue early warnings of potential disasters.

Barker and Fairley appeared often in newspapers, as well as on BBC2’s Late Night Line-Up. They also turned up with a group of percipients to be interviewed on ITV’s The Frost Programme, but were dropped mid-show, probably because David Frost worried how the group might come across. “‘Weirdos’ would be too strong a description,” Fairley wrote later, “but they were certainly different.” Fairley put his own raised profile to good use, going on to present ITV’s coverage of the moon landings.

The Bureau received hundreds of warnings, most of which proved, predictably, to be blind alleys or impossible to verify. On quiet mornings, Fairley would go through the letters pile in search of racing tips. Two respondents, though, had real staying power: Kathleen Middleton, a piano teacher from Edmonton, and Alan Hencher, a Post Office switchboard operator from Dagenham. They predicted a whole run of unfortunate events, including the Torrey Canyon oil spill, the death of a Russian cosmonaut on his re-entry to earth, the assassination of Robert Kennedy, and the Hither Green rail crash in which forty-nine people died. Distressingly for Barker, they both then foresaw his own death (which nicely sets up the end of the book).

Knight’s refreshing approach to his subject matter avoids being either too cynical or too credulous. “Premonitions are impossible, and they come true all the time”, he writes. He knows how hard it is for us storytelling animals to separate an event from the link we give it in a causal chain. A few weeks before their wedding, he and his wife saw three magpies, and “never asked for a test to confirm the sex of our daughter because we felt we had already been informed”.

Time is an arrow. The second law of thermodynamics rules that there is no way we can know about things before they happen. Entropy – the cup of tea that cools as you drink it, the leaves that fall in autumn, the lines that form on your forehead – is the concrete proof that time only runs forwards. And yet some contemporary theoretical physicists, such as Carlo Rovelli, suggest that the explanatory power of entropy, which makes sense of our lives and our deaths, has caused us to give it too much credence. Perhaps we only see the small part of reality where this rule holds. Knight feels no need to come down on one side or the other. Instead, he uses the theme of precognition to explore deep existential questions about time, causation and the meaning of life.

The Premonitions Bureau is full of lightly dispensed research, gathered from the archives of the Society for Psychical Research and interviews with the families and associates of the main characters. Knight’s method and tone will be familiar to those who have read his Guardian Long Reads on everyday subjects such as the British sandwich industry and Uber’s takeover of London, or his New Yorker “Letter from the UK”. He deploys two highly effective narrative techniques. The first is the deadpan drop of bits of stray information. We learn that a survivor of the Hither Green rail crash was the seventeen-year-old Robin Gibb, of the Bee Gees; that Barker was a keen surfer, although overweight and at least two decades older than his fellow longboard pioneers; that Fairley chased stories on a fold-up motorcycle and that only when he died did his widow and four children learn of his secret second family. As well as being weirdly fascinating, these facts add authenticating specificity to the story.

Knight’s second technique is the narrative handbrake turn, where the story veers off without warning, the significance of this new thread only emerging later. “In the 1690s, a young tutor named Martin Martin was commissioned to map and document life in the western islands of Scotland”, he might begin, out of the blue. Or: “One day in 1995, in the German cathedral city of Mainz, a fifty-one-year-old woman went to hospital …”. The creatively jarring juxtaposition of human voices and stories reminded me a little of Tales of a New Jerusalem, David Kynaston’s multi-volume history of postwar Britain. Knight, like Kynaston, leaves us with a sense of the stubborn strangeness of other people and of the recent past, without ever seeming condescending to either. Other people, his book reveals, are infinitely and incurably odd. Still, they might just be on to something.

Ten Writing Tips

During lockdown in the autumn of 2020, when we were teaching online, I posted a writing tip to our students every week. I thought I would post them here now in case anyone else finds them useful.

Tip 1: Start writing earlier

When you’re working on an essay or piece of coursework, start writing early on in the process. Don’t spend all your time on the reading and research and leave the writing until the last minute. As an English student, writing is your laboratory, your way of thinking – how you find out what you really want to say. Make sure you leave enough time for it.

Students sometimes get discouraged when they have written a first draft of their essay and it feels awkward or stilted. But that’s like saying your cake tastes awful when all you have done is mix some butter, eggs, flour and sugar in a bowl. You haven’t finished making it yet. Only when you have hacked your sentences into a basic shape can you see the many other things wrong with them. Only by putting the words into a semblance of order can you see how muddled they still are. An essay is too big and complex to hold entirely in your head, so you need to have the words in front of you to really think it through.

A defining quality of writing, as opposed to speaking, is that it can be redone. You can keep working on it until it’s ready. Writing is rewriting. ‘Writing,’ the American author Kurt Vonnegut said, ‘allows mediocre people who are patient and industrious to revise their stupidity, to edit themselves into something like intelligence.’ Not that I’m saying you’re mediocre. I’m just saying that the great thing about writing is that you can keep reworking it until you sound like the best, most perceptive and insightful version of yourself. And who wouldn’t want to spend time doing that?

Tip 2: Trust your ear

The best way to iron out mistakes and awkwardness in your writing is to read your work aloud. Trust your ear. Language is innately rhythmic and musical. Even the way you say your phone number to someone else has a rhythm, as you split it into two or three phrases. That is why we find the automated voices of satnavs and public address systems, with their random rise and fall, so alien. They don’t sound human because they don’t speak with human rhythms.

If you get the rhythm of your writing right, the other things tend to fall into place. Most people know the grammatical rules of writing better than they think they do. You probably know where the subject and verb should go in a sentence, even if you can’t identify them. (Most people can’t.) You know the subject and verb go at that point in the sentence, and in that order, because it sounds right. If it sounds right, it’s probably grammatically right; if it sounds wrong, it’s probably grammatically wrong. You should certainly trust your ear more than the grammar check on MS Word, which is pretty useless.

You can test the flow and sense of your writing when you read your work aloud, because the ear is very sensitive to dissonance, in the same way that you can tell if a singer has hit a bum note, even if you don’t know what the note should be. Reading your work aloud slows you down (you read much quicker when you’re reading silently) so you’re more likely to notice if something sounds wrong. Reading aloud forces you to renotice what you have written.

There is an even better way. When you read your own writing aloud you already know what you meant, and you augment that meaning by accenting and stressing, speaking faster or slower, higher or lower – all ways of making your meaning clearer and reducing ambiguity. Better, if you can bear it, to get a friend to read out your sentences for you. If they stumble over a word or phrase, it might be a clue to revisit it.

Tip 3: Cut all unnecessary words

Which of these sentences sounds better to you?

  1. When I was a child, I used to have a terrible temper.
  2. As a child, I used to have a terrible temper.
  3. As a child I had a terrible temper.

I say the second is better than the first, and the third is best. You don’t need both when and used to, because they convey the same thing. And, come to think of it, you don’t need both as and used to either, because they too convey the same thing. The third sentence takes the least time and effort to read. Cutting unnecessary words always makes your writing cleaner and more elegant.

For instance, repeating a word in a sentence can sound clunky:

By choosing to narrate the novel in the first person the author makes the novel more vivid.

Better: The use of the first person makes the novel more vivid.

The story of Cinderella is a well-known story.

Better: The story of Cinderella is well-known.

The book’s title establishes the theme of the book; the book’s first paragraph establishes the voice of the book.

Better: The book’s title establishes its theme; the first paragraph establishes its voice.

Also, do you really need all those vague qualifiers like very and rather, and do you need two vague adjectives when one would do?

This piece of writing is a very poignant and heartfelt one.

Better: The writing is heartfelt.

It’s easier for the reader to quickly grasp the meaning of your sentence if you cut all needless words:

Portraying the nature of people to be driven by violent instinct is present in many other novels.

Better: People driven by violent instinct appear in many other novels.

Most memoirs choose to mirror the strict chronological nature of life itself within the structure of their works, although this is not always the case. Some memoir writers choose to employ non-chronological structures.

Better: Most memoirs mirror the chronological nature of life in their structure, but not all.

So: write more words than you need and then go through your draft cutting the ones you don’t need. Just as your speech is full of ums and ers and repetitions, your first go at any piece of writing will be full of unnecessary verbiage.

A writer makes meaning not just by adding words but by taking them away. The playwright David Mamet said that ‘Omission is a form of creation.’ Cutting words is as creative an act as writing them. It often makes your meaning clearer to yourself. It’s a bit like being a sculptor, looking for the beautiful form hidden in that rough block of marble by chipping away at all the superfluous stone. Cutting words has this same creative quality. Sometimes it can liberate a meaning that you weren’t quite aware of but that was waiting there to be found.

Tip 4: Learn the power of the full stop

In the age of texting and social media, full stops are going out of fashion. The dialogic visual language of texting speech bubbles, pinging left and right on your phone, has little use for full stops. A single-line text needs no punctuation to show that it has ended. Instead of a full stop, you press send. Studies have shown that young people tend to interpret full stops in texts as curt or passive-aggressive.

But writing is not a speech-balloon text waiting on a response. The point of writing is to communicate in a way that does not require you to explain it any further. A sentence gives words a finished form that should need no clarification. It is its own small island of sense. So, with any kind of semi-formal writing addressed to people you don’t know well (such as the tutor marking your essay), the full stop, and where you decide to put it, are crucial.

Only when the full stop arrives can the meaning of a sentence be fulfilled. The full stop should be like a satisfying little click that moves your prose along slightly so that the next sentence can pick up where it left off. If you want to write well, learn to love the full stop. Love it above all other punctuation marks, and see it as the goal towards which all your words move. It is the most powerful punctuation mark: don’t forget to use it.

Tip 5: Don’t make your sentences any longer than they need to be

Last time, I wrote about full stops. Here is another reason why full stops are important: every sentence places a burden on the reader’s short-term memory. A sentence throws a thought into the air and leaves the reader vaguely dissatisfied or confused until that thought has come in to land. The reader has to hold all the sentence’s words in their head until the full stop arrives to close the circle of meaning. The full stop provides relief, allowing them to take a mental breath.

The longer your sentence is, the more the reader has to hold in their head and the more chance there is of something becoming mangled or unclear. This doesn’t mean you should avoid writing long sentences – I will discuss how useful they are in one of my later tips – but it probably means that, if in doubt, you should put a full stop in. A lot of student writing is full of sentences that are longer than they need to be.

When you’re writing a first draft, I suggest you start with short, simple sentences. If you start short like this, it’s easy to add detail and texture, and combine short sentences into longer, more complex ones. But if you start writing long, complicated sentences before you’ve worked out what you really think, then you will find them hard to take apart and simplify. Start simple and make it complex; don’t start convoluted and then have to unravel it all.

Tip 6: Vary your sentence length

The best way to make your writing sound fresh and musical is to vary the length of your sentences. Paragraphs tend to work well when they contain sentences of varied lengths. At the end of every sentence there is what’s called a cadence – a drop in pitch (whether you’re reading it aloud or silently) as the full stop arrives. This signals to the reader that the sentence, and the sentiment, are done. Varied sentence length makes for varied cadences. This makes writing breathe, move and sing.

Short and long sentences also do different things. Short sentences make key points or recap them, and trade in relatively straightforward statements about the world. Long ones take readers on a mental tour, list a whole series of things or stretch out a thought. Short sentences give the reader’s brain a bit of a rest; long ones give it an aerobic workout. Short sentences imply that the world is cut and dried; long ones restore its ragged edges. Short sentences are declarative and sure; long ones are conditional and conjectural. Vary your sentence length and you mirror the way the mind works, veering between conviction and nuance.

Vary the length of your sentences!

Trust me.

It works.

Tip 7: Put the important stuff at the end of the sentence

A good English sentence, however long it is, moves smoothly and easily towards its full stop. The best way to ensure this happens is to put the important stuff at the end. A sentence ordered like this feels more deliberate and memorable – just as, when you stop speaking, what sticks in your listener’s mind is the last thing you said.

Typically, the word or words at or near the start of a sentence are the subject. The words at the end are typically the predicate: the main verb and its complements. The predicate adds new information that the next sentence may then comment on as a given. So the predicate of one sentence often turns into the subject of the next. Weak sentences break this given-new pattern: the subject is stronger than the predicate, and the sentence ends with an unresounding phhtt.

If you write that something is an interesting factor to consider or should be borne in mind or is very relevant in today’s society, then your predicate is not saying much, because it could be said of almost anything. I call these pretending-to-care sentences. They turn up a lot in student essays, particularly in introductions, because you’ve essentially been assigned a task and told to come up with something to say about it. Here are a few examples:

  • Poems dealing with the theme of death include great works by many different authors.
  • The issue of gender equality appears in thousands of texts from different writers all around the world.
  • Each of these writers has something special and unique about them.
  • These stories, written in different time frames, touch on many different subjects.
  • Malcolm X produced potentially one of the most influential autobiographies to ever exist.
  • Throughout my essay, I will use a wide range of secondary sources, making my argument more objective.

There’s nothing drastically wrong with any of these sentences. But there are two problems with all of them: they don’t say very much, and they end flatly. Look at the second half of all these sentences: the predicate (touch on many different subjects, have something special and unique about them, include great works by many different authors etc.) could apply to lots of things.

Let’s have a go at fixing a couple of pretending-to-care sentences.

Childhood is a stage in life that everyone has experienced.

Better version: All of us were children once.

The theme of love is one which has reoccurred throughout various texts in the literary tradition since its very beginning.

Better: Love is a recurring literary theme.

The rewritten versions are better not just because they cut unnecessary words, but because they end strongly, with the key information at the end of the sentence. If you do this, the full stop will arrive with a satisfying click.

Tip 8: Your writing must speak all on its own

First, some words from the author Verlyn Klinkenborg:

‘When called to the stand in the court of meaning, your sentences will get no coaching from you. They’ll say exactly what their words say, and if that makes you look ridiculous or confused, guess what? Sentences are always literal, no matter how much some writers abhor the idea of being literal. In fact, nothing good can begin to happen in a writer’s education until that sinks in. Your opinion of what your sentence means is always overruled by what your sentence literally says.’

Klinkenborg captures here what makes writing so hard. You have to arrange the words in such a way that they can be deciphered in your absence. In writing, meaning derives from just four things: syntax (the grammatical order of the words), word choice, punctuation and typography (that’s things like capital letters and italics). Part of you thinks that you will be able to hover over the reader’s shoulder as they read what you’ve written, saying ‘That’s not what I meant. This is what I really meant!’ You won’t. The only thing the reader can use to access your wonderful ideas is your words. Writing is made of marks on the page and nothing else.

In their book The Elements of Style, William Strunk and E.B. White advise: ‘When you say something, make sure you have said it. The chances of your having said it are only fair.’ When you write a first draft, it is very unlikely that you will have said exactly what you think you have said. That’s why you need to read your work over, read it aloud, redraft it, proofread it. Then you find out if you have said what you wanted to say.

Writing is a strange, cumbersome, artificial process. It takes a lot of work to make your words clear to the reader. The comic singer Neil Innes used to start his act with this line: ‘I’ve suffered for my art. Now it’s your turn.’ Don’t be like that. Don’t show the reader how tedious you found writing your essay by making them suffer as well. Writing should be an act of generosity, a gift from writer to reader. The gift is the work you’ve put in to make your meaning clear and your sentences a pleasure to read.

Tip 9: Avoid paragraphs of very different lengths

The paragraphs in your essay should not be of dramatically different lengths. That doesn’t mean they have to be exactly the same length. But if you have a two-page paragraph followed by one that is two sentences long, it’s a sign that you need to reshape your essay. There is no rule about how long a paragraph should be, although I don’t like to make mine longer than about 250 words.

A good, basic way of thinking about a paragraph is that it is a single idea, developed into an extended thought. You introduce your point at the start of the paragraph and spend the rest of it developing that point, using examples, supporting quotes, evidence, qualifications and counter-arguments. If your paragraph is only a sentence long, it either means that your idea needs to be developed further, or that it doesn’t merit a paragraph of its own. If your paragraph is two pages long, it means that it contains several ideas that each need their own smaller paragraph.

Paragraphs allow you to put similar material in your essay in the same place. Phrases like ‘As previously mentioned’ or ‘As mentioned earlier’ turn up a lot in student essays. In which case, why didn’t you make this point earlier, when you were discussing that subject? Put similar material in the same place in your essay.

The first and last sentences of each paragraph carry a lot of stress. They are a good way of nudging your argument along. Try making those sentences fairly short, so they can quickly introduce what’s to follow or wrap up a point.

Tip 10: Choose the right word

Be specific in your choice of words. It helps if you learn to be interested in word origins. Did you know that humility is linked etymologically to humus – the soil, the earth – and to a human, who is thus, linguistically, an earthling? Did you know that capricious originally referred to the behaviour of a typical goat (Capricorn being the sign of the goat)? Did you know that obscene may have originated in Greek drama as ob-skene, which means ‘offstage’? Did you know that immediately means ‘without any intervening medium’ – nothing comes in between?

If you know what a word’s origin is, you’re more likely to use that word appropriately. Try to avoid what I call ‘thesaurus words’, where you’re looking for an alternative to a word and find a synonym in the thesaurus facility on MS Word. No word means exactly the same as another one. The right word is rarely the longest, most complicated or most impressive-looking word. It’s just the word that perfectly fits what you want to say in that part of the sentence.

Be aware of what I call ‘crutch words’: the off-the-shelf words that you use a lot. For me, it is words like merely and simply. A common student crutch word is somewhat, often used wrongly. Another crutch word, to describe a book or fictional character, is relatable, an adjective that doesn’t mean much. Use the ‘find’ facility on MS Word to see if you use a particular word a lot. If you use lots of crutch words, your prose may sound muddy and dull.

Choosing the right words is hard, and our first efforts often sound slightly wrong or try-too-hard. The right word rarely comes to you immediately (‘without any intervening medium’). Go through your essay looking at every word, particularly the nouns and adjectives. Is that really the right word? Did I mean to say that? Can I come up with a more exact and informative way of describing this poem than emotional or poignant or relatable?

Good writers also tend to be interested in words themselves: their look, feel, shape and sound. ‘We must remember how wide the word “Iowa” is,’ the American writer William Gass once wrote. ‘We must bear in mind how some words are closed at both ends like “top” or are as open as “easy” or as huffed as “hush.” Some words click and others moan. Some grumble. Listen to the way the word “sister” is put together. Can you feel the blow which chops off the end of “clock”?’ Cultivate this kind of granular interest in words and they will – I promise – pay you back a hundredfold.

Delivering the Undeliverable: Teaching English in a University Today

Here is a free-access link for an article I wrote for the journal English about university English teaching. It is more timely than I would like. Every week now seems to bring more news of redundancies and course closures in university English departments. This piece is an attempt to address this reality without being too depressing:

https://academic.oup.com/english/advance-article/doi/10.1093/english/efac006/6609054?guestAccessKey=a1a55475-7c2a-47b8-be31-cb896c27c683

The Tinkerbell effect

I wrote this for Times Higher Education last week:

We all know the ideal. A university is not just another medium-sized corporation; it is a community of scholars, striving towards the common goals of learning and enlightenment. And we all know the many ways an actual university falls short of that ideal. Collegiality can evaporate in the heat of the job, with its daily irritations and power plays. The modern university, the American educator Clark Kerr once wrote, is just “a series of individual faculty entrepreneurs held together by a common grievance over parking”.

The managerialist ethos that pervades today’s universities doesn’t help. This ethos reduces human relationships to the incentivising logic and contractual obligations of a market. The problem isn’t the people – managers themselves can be well meaning and principled – but the system. Ultimately, managerialism does not believe in community, only in self-interested individuals completing tasks because they have been offered carrots or threatened with sticks. By dividing us up into cost centres, the managerialist university tries to isolate the ways in which the different parts contribute to the whole. Poorly performing areas, or those seen as a drain on resources, are put on the naughty step, or worse.

In this context, the rhetoric of the university as a community can feel like little more than message discipline, smoothing over dissent and critical thought. The language of corporate togetherness rings hollow at a time of casualisation, redundancies and unmanageable workloads.

Still, we keep believing. Collegiality responds to the Tinkerbell effect: the collective act of believing in it, sometimes in spite of the evidence, brings it into being. In the middle of this semester, we had a fire drill. When the alarm goes off, it opens up the building, decanting its dispersed human occupants on to the tarmac and lawn outside. The invisible life of the university is made visible. We stood coatless and shivering in the autumn air, huddled in little groups. I saw students I had only ever seen on Zoom, colleagues appointed since lockdown who I had never seen before, and others I had not seen for over a year, reassuringly unchanged. And I was reminded how much of a community is made by this mere fact of contiguity: passing each other in corridors, popping into offices, queueing up for the microwave.

These acts form part of what Katherine May calls “the ticking mechanics of the world, the incremental wealth of small gestures”, which “weaves the wider fabric that binds us”. As a shy and socially passive person, I rarely take the initiative in interactions, so I need these accidental encounters. I didn’t quite notice, while I was just trying to get through it, how much a year and a half of living online had messed with my head. I had to get well again before I knew how sick I was. After so many months of virtual working, these micro-expressions of the value of community feel like glugging down bottled hope.

Community is not some warm, bland, mushy thing. It is how complicated human beings learn to live alongside other complicated human beings – people who want desperately to be good but who are also self-absorbed, insecure, frustrated and afraid. Community is only ever a work in progress, rife with bugs and glitches. It is hard work.

That becomes particularly apparent at Christmas, as we try to find it in us to show peace and goodwill to people we find irritating and exhausting. The writer Loudon Wainwright Jr called Christmas “the annual crisis of love”. A university is a permanent crisis of love. But crises are what we struggle through because it’s worth getting to the other side – and because a university is a community or it’s nothing.

Managerial blah

I published this piece in Times Higher Education a few weeks ago:

There is a type of language that has become ubiquitous in academia in recent years. I call it managerial blah. You will recognise managerial blah if you’ve ever had to read it – or, God help you, had to write it. It is the official argot of the modern university, the way its actions are presented and explained.

How do you write managerial blah? First of all, you will need lots of abstract nouns. It helps if these can be used to signal things that we are all meant to approve of in some open-ended, ill-defined way, like leadership, excellence and quality. Mostly, though, you can rely on nouns that just refer to general categories into which other things fit, like framework, model, strategy, mechanism and portfolio.

This kind of noun-speak bears the traces of that traditional faith in word magic, the belief that chanting words like a spell could bring something into being, such as a cattle plague for one’s enemy or a good harvest for oneself. We flatter ourselves that, as enlightened moderns, we have left such primitive notions behind. But word magic survives today in curses, oaths – and nouns.

When you use a noun, you are claiming that the thing it refers to is real and durable enough to be named. The American writer and educator John Erskine wrote that a noun is “only a grappling iron to hitch your mind to the reader’s”. This grappling iron is especially useful when you are dealing with abstract notions that can’t be grasped by the senses. In managerial blah, nouns like esteem, value and gain become taken-for-granted things that, to the initiated, speak for themselves. The effect is amplified when you put two nouns together, such as performance indicator or service outcome. And even better if you can group them into threes: upskilling development opportunity, online delivery platform, workload resource allocation. Managerial blah loves these three-noun clusters because they ratchet up the nouniness, and thus the feeling that we are discussing something definite and unarguable. Knowledge Exchange Partnership is not just a noun, but three nouns – so it must be a thing. It lives!

These abstract nouns can then be paired up with intensifying adjectives such as dynamic, strategic, impactful, innovative and user-focused. In managerial blah these intensifiers have gone through a process that linguists call semantic bleaching. This means that their intensity has declined through overuse, until they are left as little more than placeholders in a sentence. They are so often paired with the same nouns that they form the tired couplings known as collocations. A collocation occurs when two or more words are habitually juxtaposed. So learning is always student-centred, procedures are always robust, competencies are always core and stakeholders are always key (there being no such thing, in managerial blah, as a minor stakeholder). Adverbs and participles can be collated into equally trite pairings. In managerial blah we are never just engaged but actively engaged, never just positioned but proactively positioned, never just committed but strongly committed.

OK, now you have to join up these stock-brick words and phrases into a clause. You will need at least one verb, but make sure it is a weak, connective one, such as facilitate, embed, enhance, refocus, reprioritise or rebalance. Try not to use more energetic verbs which would force you to attach agency to the subject of your sentence. That might involve you inadvertently constructing an argument that could be challenged, and you don’t want that.

You will find that you can cut down on verbs, anyway, by using lots of prepositions. Prepositions are small, harmless-looking words with many different meanings and functions. For the aspiring author of managerial blah they are helpfully ambiguous, allowing you to hint at connections between things without having to argue them through. You can use prepositions to staple-gun nouns together without worrying too much about verbal action. Managerial blah uses prepositions in weird, overemphasised ways, as if these little words are carrying more weight than they should. We will look at providing … This will help around our impact agenda … The Executive Deans will be leading on this.

If you’ve followed my instructions so far, you will have something resembling a complete clause in managerial blah. Now all you need to do is link it up with other clauses into sentences and paragraphs in a way that has no forward momentum at all. For instance, you can yoke interchangeable clauses together into one long sentence using just colons and semicolons. These form a sort of punctuational sticking plaster when the verbs are not strong enough to carry the reader through the sentence and into the next one. You can also group your sentences into lots of numbered sections and sub-sections. Numbering creates the illusion of structure and relieves you of the task of connecting one paragraph to the next with a developing argument. This list-like writing is lifeless – listless, in fact – because, unlike life, it has no propulsive energy. It arranges reality consecutively but randomly.

The torpidity of managerial blah sits awkwardly with its rhetoric, which sets so much store by perpetual movement. The prevailing tone is one of monodirectional certainty, with constant reference to continuous enhancement, direction of travel and, of course, going forward (or its now more common and vigorous variants, moving forward and driving forward). This all brings to mind the French theorist Henri Lefebvre’s definition of modernisation as “the movement which justifies its own existence merely by moving”.

In managerial blah, the phrase going forward is more than just a verbal tic. It encapsulates a particular way of looking at the world. Like the free market, the managerialist university believes in permanent revolution and endless growth. So it produces an infinite series of self-replenishing demands, urging everyone up a mountain whose summit will never be reached. University mission statements beseech us all to keep improving, to ceaselessly pursue quality, value or excellence. And how could the quest for such elusive nouns ever end?

But here’s the odd thing. Even as managerial blah exhorts us to move endlessly onwards, it is taking us on a journey that already knows its own end. In its linguistic universe, nothing truly new or surprising ever occurs. Points must be actioned, milestones must be met, deliverables must be delivered and inputs must have outputs. All eyes are on the satisficing completion of an algorithmic process, in a way that refuses to concede the possibility not only of failure but also of anything unforeseeable, unanticipated or serendipitous.

Managerial blah is an anonymous, depopulated language. It bears no trace of any inconveniently real human beings whose imperfections might mess up the system. It deals in processes and procedures, not people. It conjures up a metric-driven, quasi-rationalistic, artificially sealed-off world in which anything can be claimed and nothing can be seen or felt. No one ever says who did what to whom, or takes ownership or blame. The juggernaut just runs on, inevitably and under its own steam, although there might be issues and challenges along the way (never problems or failures, which might be pinned on people).

Good writing always has some give in it. It is open to dispute by a reality that the writer does not own and the reader might see differently. Managerial blah, by contrast, runs along a closed circuit that permits no response. Without any sense of voice or audience, it feels tone-locked, written by no one in particular to be read by no one in particular. It is anti-language, a weary run-through of the verbal motions.

Why does managerial blah get written? In part it is down to a banal and timeless truth: most people have no great facility with words. Writing with subtlety and precision is hard, so instead we default to off-the-shelf words and boilerplate phrases. Tying nouns together with weak verbal and prepositional knots is the simplest and quickest way to rustle up a sentence and achieve a superficial fluency. If good writing is hard to write and easy to read, then managerial blah is the reverse: a labour to read, a breeze to write.

Perhaps some of those who write managerial blah genuinely believe that, merely by gluing nouns together, they have communicated meaningfully with their readers. But surely, in a university, ignorance is no defence. Managerial blah is a crime against words by intelligent, well-educated and well-remunerated people who should know better. Writing well is hard, but not that hard. If you keep on producing this ugly and alienating language when so many people have told you how ugly and alienating it is, then your intellectual laziness is not an accident.

Writing with proper care and attention offers you no hiding place. The basic somebody-did-something structure of the plain English sentence allows your reader to weigh up how convincing you sound. When you use specific nouns and strong verbs to describe your actions, you have to think through the purposes and consequences of those actions. Managerial blah evades this obligation. It can thus make the cruellest and maddest realities seem sane and incontrovertible.

The sector is currently going through a traumatic cycle of redundancies, in response to the brutally competitive market created by the raising of tuition fees, the removal of student number controls, and government antipathy to “low value” arts and humanities courses. The language used to announce and explain these redundancies has been a masterclass in managerial blah: stale, vapid, self-validating and, of course, chock-full of nouns. Staff who receive letters informing them of their imminent dismissal are lectured about strategic priorities, changes to the staffing establishment, redundancy ordinances and university redeployment registers. And because lip service must now always be paid to mental health, they are then directed to Employee Assistance Programmes and Staff Wellbeing Portals. Students who are worried that their lecturers have all been sacked are assured that a student protection plan is in place. This nouny language is not even trying to illuminate. Its only purpose is to immunise itself against scrutiny and challenge.

Job cuts and other distressing developments are justified with the scariest two nouns in the lexicon of managerial blah: change management. Change is happening anyway, this omnipresent phrase suggests, and in a direction that has already been decided. All the rest of you can do is adapt or get off the bus. Change management is the shibboleth of a financialised capitalism that sees human capital as a faceless element in an inexorable process and a fixed cost to be reduced. The emotional costs of redundancy are immense. People who have given everything to adapt their teaching and support their students in a pandemic have now had their livelihoods taken away and, in many cases, their careers effectively ended. Naturally they feel great anxiety, anger and pain. Their colleagues who remain in post are left with survivor guilt and fear of worse to come. And all this human turmoil is hidden inside that insipid, unassailable phrase, change management.

Those of us in the humanities – the subjects most under threat of redundancy – are at least alert to how language shapes reality as well as reflecting it. Words are never simply a neutral, transparent container for meaning. They can clarify and elucidate or they can muddy and obscure. They can wake the mind up or anaesthetise it. They can polish reality so it gleams or hide it behind a rusty carapace of cliché, cant and sloganeering. Words are not just how we communicate; they are how we think. Managerial blah liberates the writer from the need to do either. It is a kind of literary lockjaw which stops you saying or thinking anything original or edifying.

There is a long tradition of making fun of management speak, but managerial blah is too dull even to poke fun at. It offers no purchase or ingress for the satirist or ironist. It just sleepwalks from one big noun to the next, sucking us into its vortex and boring us into submission. All the imaginative promise of words has been pulped into a lumpy noun gravy, neither liquid enough to flow nor solid enough to be forked. This noun gravy is tasteless but, should you swallow enough of it, noxious.

Words matter. They transform how we think and feel about each other and about our lives. We will never be able to see the world in more creative and fruitful ways if we are trapped inside a univocal vocabulary. Imagine how refreshing it would be to read an official university document that treated its reader like a human being, by trying to persuade them with defensible arguments, fine distinctions and honest doubts. We would live richer, more productive and more authentic working lives in academia if we cared more about words – which is why, now that I have told you how to write managerial blah, I hope you will ignore my advice.