The Noel Edmonds Creed (a found poem)

I, as a follower of Noel Edmonds, believe all that Noel Edmonds believes.

I believe that disease is caused by negative energy.

I believe that death is just a word in the dictionary.

I believe that the most appropriate word is ‘departure’ because we are energy and you can’t create or destroy energy, you can only change its form.

I believe that we are surrounded by electro mist, fog and smog.

I believe that we are covering ourselves in the wrong sorts of electro-magnetism.

I believe that the biggest problem we have is not Ebola, it’s not Aids, it’s electro smog.

I believe that the Wi-Fi and all of the systems that we are introducing into our lives are destroying our own natural electro-magnetic fields. All you are is energy, remember that.

I believe that renewing the BBC’s charter would be an act as futile as giving medicine to a corpse.

I believe in Orbs. Orbs are little bundles of positive energy and they think they can move between 500 and 1,000 miles per hour.

I believe that there are two orbs that visit me. The two that I have are about the size of melons. One sits on my arm and the other is usually in the back of the shot, sitting just over my right shoulder.

I believe that you don’t live life, life lives you.

I believe that I wrote a wishlist of ambitions to the cosmos and, like a mail-order company, it delivered my wife, who was working as a make-up artist on Deal Or No Deal.

I believe that every single human being can achieve a perfect vibrational balance between their positive and negative energy.

I believe it is possible to retune people.

I believe that Ant and Dec are excellent presenters. They’ve been honest, they’ve plundered the House Party archive and created Takeaway. I don’t have a problem with that. I take it as a compliment.

I believe that all these things have been known about for a very long time.

Amen.

The Nowhere Office

I reviewed Julia Hobsbawm’s The Nowhere Office and Jonathan Malesic’s The End of Burnout for the TLS in February:

Change, writes Julia Hobsbawm, happens “slowly and then all of a sudden completely”. What Hobsbawm calls the “Nowhere Office” – the hybrid workspace that floats between work and home – may seem like Covid’s gift to the world but it was long in the making. For her it is the culmination of trends that have been emerging since the 1980s, when office hours stopped being strictly nine-to-five and the search for an elusive work–life balance began. The pandemic “broke the last threads holding the embedded customs and practices together”.

The Nowhere Office is buoyant about this placeless workspace. Offices, Hobsbawm predicts, will no longer be run by a creed of unthinking presenteeism and will become places we visit for networking, collaboration and community-building. The rest of the work will be done at home or on the move. We will happily cut across different time zones, accessing our files anywhere via digital clouds and dividing up work across a seven-day week, carving out “air pockets of free time” rather than a two-day weekend. The main divide will be between the “hybrid haves” and the “hybrid have nots” – those who are able to move seamlessly between online and offline and those who are not.

Hobsbawm wants to “put the human back in the corporate machine”, and her instincts are all good. She understands that working from home can mean loneliness, isolation and the bleeding of work into our personal lives. And she concedes that “despite the apparent flexibility and freedoms, many inequalities remain and too many people still have to work too hard and too long”. But what if the apparent flexibility and freedoms are the problem? What if the nowhereness of work means that work ends up being everywhere, and we can never disengage from its demands? For Hobsbawm the solution is to give employees more choice and negotiate their consent. They must be disciplined in separating work from life, and their bosses must trust them to work unsupervised. “It will be obvious if people are working well”, she writes sunnily, announcing the end of “the age of being violently busy”.

The book is interspersed with interviews with practitioners and proponents of the Nowhere Office. Most of them are business leaders: chief strategy officers, brand presidents, digital entrepreneurs, investors. Their insights are worth having, even if Hobsbawm’s mimicry of their corporate-speak about “win-win models” and “siloed thinking” does little for her prose style. But one wonders if those lower down the corporate hierarchy might have a less heady take on the Nowhere Office.

According to Hobsbawm, these changes are unstoppable. The future is set fair and all we can do is catch up. “The desk is all but over as a built-in feature of office life”, she says. “Sofas, small theatres, spaces to convene and converse in will be ‘in’.” Her brisk verdicts on the new reality reminded me of that much-repeated formula online, declaring that some new phenomenon “is a thing now”. But why is it a thing, and should it be a thing? The future is neither uniform nor inevitable. It feels too soon to make bold calls, before the pandemic is even over, about what the workplace of the future will look like.

Hobsbawm summarily dismisses critics such as Josh Cohen, David Graeber and Sarah Jaffe as part of “an emergent purist camp” which holds that “work represents a failure of society, certainly of capitalism, and that work is essentially not an opportunity but a threat”. But these critics do not say that work is “pointless”, as she claims, only that a turbo-capitalist conception of work makes excessive and toxic demands on us. Their writing deserves to be engaged with rather than caricatured.

Hobsbawm would probably put Jonathan Malesic in the purist camp. But his acutely felt investigation of work burnout as an “ailment of the soul” makes his the more thought-provoking and substantial of these two books. Malesic is a recovering academic, a former professor of theology at a small Catholic college in Pennsylvania. Like many academics he began his career with unsustainably high ideals, believing he was “a citizen in the republic of letters”. He discovered that much of it was just a job, with unrewarding tasks, soul-sapping hassle, pointless politicking and fears of redundancy. His students, most of whom were studying theology as a core requirement, did not share his enthusiasms and spent his classes looking blank-faced and bored. Soon he was lying in bed for hours when he should have been working, repeatedly watching the video of the Peter Gabriel and Kate Bush song “Don’t give up” and self-medicating with ice cream and beer. After eleven years he gave up the tenure-track position he had worked so hard for. Alongside the sense of failure, he felt intense guilt that he had come to hate such a coveted and well-rewarded job.

As Malesic admits, burnout is something of a buzzword, “an often-empty signifier onto which we can project virtually any agenda”. Our vague definitions of it, and the lack of consensus on how to diagnose and measure it, raise the question of how much we really want to eradicate it. Diagnosing oneself with burnout can, after all, be self-flattering. To be burned out is to be a modern, a victim of the age, a martyr to one’s own high ideals. Burnout’s historical antecedents, the now-forgotten soul sicknesses of acedia, melancholia and neurasthenia, were similar sources of both pride and shame.

Malesic defines burnout usefully as “the experience of being pulled between expectation and reality at work”. We burn out not just because we are exhausted but because our hearts are broken. Our love for work, which we saw as the path to social status and spiritual flourishing, went unrequited. Even in the good times, work could not deliver all we asked of it, and these are not the good times. Aided by market deregulation, employers now see workers as a fixed cost to be reduced. Outsourcing, zero-hours contracts and precarious work have expanded, while more hours are demanded of everyone. The funky offices of tech start-ups, with their games rooms and sleeping pods, are, Malesic writes, “designed to keep you at work forever”. The life hacks touted as burnout antidotes – mindfulness, getting more sleep, working smarter – are superstitions, “individual, symbolic actions that are disconnected from burnout’s real causes”.

Malesic visits an artisan pottery studio in Minnesota, a Dallas nonprofit doing anti-poverty work and several Benedictine monasteries, and spends time among artists with disabilities who cannot find paid work but who form richly supportive creative communities. He learns that work need not be the lodestar of our lives. To heal our burnout, we need to lower our expectations. Malesic now teaches writing part-time at a Dallas university, just one or two classes per semester. He no longer expects the life of the mind to be soul-nourishing and is a better and more patient teacher for it.

We need to see work as, well, work. But this does not mean that it should cease to matter. Malesic cites the French phrase “un travail de bénédictin” – a Benedictine labour – to describe a project that demands quiet, steady effort over a long time to bring it to fruition. This kind of work has little value in a world of annual pay reviews and key performance indicators. But a richly satisfying Benedictine labour can cure us of that self-lacerating cycle of looming deadlines and short-term goals that ultimately benefits only our paymasters.

These very different books have one perspective in common: they both see the pandemic as a chance for reflection and change. “Right now, we have a rare opportunity to rewrite our cultural expectations of work”, Jonathan Malesic writes, “and I hope we will.” So do I.

The Premonitions Bureau

I reviewed Sam Knight’s book The Premonitions Bureau for the TLS in May:

For most of human history, people have believed that we can see into the future. The Bible is filled with prophecies and premonitory dreams; the ancient Greeks put their faith in oracles and in destinies that no mortal being could swerve. “That which is fated cannot be fled”, warned Pindar. As Oedipus discovered, what is going to happen to us becomes what we choose to do.

The Premonitions Bureau, Sam Knight’s elegant and illuminating work of cultural history, transports us back to a mid-twentieth-century Britain still clinging to this faith in precognition – the extra-sensory perception of future events. Precognition, which hinted at “undiscovered reaches of physics and of the mind”, managed to escape the taint of the occult that clung to phenomena such as ghosts and ectoplasm. It teetered on the edges of scientific respectability.

In 1927, J. W. Dunne, an aeronautical engineer, published the bestselling book An Experiment with Time, which remained in print for more than half a century. In 1902, while serving in the Boer War, Dunne had dreamt of a volcano about to erupt on a French colonial island. A few weeks later, he got hold of a newspaper which reported that the eruption of Mont Pelée, on the French Caribbean island of Martinique, had killed 40,000 people. Dunne’s book was a thirty-year history of his own dreams and their intimations of the future. He explained it all with reference to the new fields of relativity theory and quantum mechanics, which theorized that time’s linearity was no simple matter. Dreams that predicted future happenings became known as “Dunne dreams”. On Dunne’s advice, many of his readers began leaving pencil and paper by their beds so they could write down their dreams on waking.

J. B. Priestley, in plays such as Time and the Conways (1937) and An Inspector Calls (1945), drew on Dunne’s work. Priestley also popularized Carl Jung’s theory of synchronicity, which suggested that events could be linked outside the normal logic of cause and effect, such as when a dream foretells an event in the waking world. In Time and the Conways, Alan Conway tells his sister Kay that the secret to life is that time is not monodirectional but eternally present, and that at any given moment we see only “a cross section of ourselves”.

At the heart of Knight’s story lies a remarkable character called John Barker. When we first meet him, in 1966, he is a forty-two-year-old psychiatrist working at Shelton Hospital in Shropshire, one of Britain’s sprawling and overcrowded mental institutions. Barker worked tirelessly to improve conditions at Shelton by phasing out the more brutal treatments, such as electroconvulsive therapy administered without drugs. But he was also frustrated with the professional timidity of his field. Fringe areas dismissed as psychic or paranormal were just waiting to be absorbed into mainstream science, he believed. He was a member of Britain’s Society for Psychical Research and fascinated by precognition.

The book begins with the event that galvanized Barker: the Aberfan disaster of October 21, 1966, when a coal-tip avalanche buried a primary school, killing 144 people, mostly children. The precarious-looking tips above Aberfan had long worried locals, and many spoke of having disturbing thoughts and visions before the disaster. Given how much Aberfan had pierced the national consciousness, Barker decided to ask the public if they had felt any presentiment of it. He contacted Peter Fairley, the science editor of the London Evening Standard, who agreed to publicize his appeal. Barker received seventy-six responses from what he called “percipients”. After prodding them for details and witnesses, he concluded that precognition was a common human trait, perhaps as common as left-handedness. He thought that a small subset of the population might experience “pre-disaster syndrome”, somewhat similar to the way in which twins were thought to feel each other’s pain remotely.

The problem was that, as with most similar evidence, the Aberfan data had been scientifically compromised by being collected after the event. So just before Christmas 1966, Barker and Fairley approached Charles Wintour, the Evening Standard’s editor, about setting up a “Premonitions Bureau”. For a year, the newspaper’s readers would be asked to send in their forebodings of unwelcome events, which would be collated and then compared with actual events. The Standard’s newsroom was soon inundated with letters and telephone calls.

Barker envisaged the Premonitions Bureau as a “central clearing house” for all portents of calamities, “a data bank for the nation’s dreams and visions”. This crowd-sourcing of the collective unconscious recalled the work of an earlier research organization, Mass Observation, which also made use of unpaid volunteers to create “weather maps of public feeling”. Barker hoped that the results would eventually be uploaded to a computer database, and that the Bureau would issue early warnings of potential disasters.

Barker and Fairley appeared often in newspapers, as well as on BBC2’s Late Night Line-Up. They also turned up with a group of percipients to be interviewed on ITV’s The Frost Programme, but were dropped mid-show, probably because David Frost worried how the group might come across. “‘Weirdos’ would be too strong a description,” Fairley wrote later, “but they were certainly different.” Fairley put his own raised profile to good use, going on to present ITV’s coverage of the moon landings.

The Bureau received hundreds of warnings, most of which proved, predictably, to be blind alleys or impossible to verify. On quiet mornings, Fairley would go through the letters pile in search of racing tips. Two respondents, though, had real staying power: Kathleen Middleton, a piano teacher from Edmonton, and Alan Hencher, a Post Office switchboard operator from Dagenham. They predicted a whole run of unfortunate events, including the Torrey Canyon oil spill, the death of a Russian cosmonaut on his re-entry to earth, the assassination of Robert Kennedy, and the Hither Green rail crash in which forty-nine people died. Distressingly for Barker, they both then foresaw his own death (which nicely sets up the end of the book).

Knight’s refreshing approach to his subject matter avoids being either too cynical or too credulous. “Premonitions are impossible, and they come true all the time”, he writes. He knows how hard it is for us storytelling animals to separate an event from the link we give it in a causal chain. A few weeks before their wedding, he and his wife saw three magpies, and “never asked for a test to confirm the sex of our daughter because we felt we had already been informed”.

Time is an arrow. The second law of thermodynamics rules that there is no way we can know about things before they happen. Entropy – the cup of tea that cools as you drink it, the leaves that fall in autumn, the lines that form on your forehead – is the concrete proof that time only runs forwards. And yet some contemporary theoretical physicists, such as Carlo Rovelli, suggest that the explanatory power of entropy, which makes sense of our lives and our deaths, has caused us to give it too much credence. Perhaps we only see the small part of reality where this rule holds. Knight feels no need to come down on one side or the other. Instead, he uses the theme of precognition to explore deep existential questions about time, causation and the meaning of life.

The Premonitions Bureau is full of lightly dispensed research, gathered from the archives of the Society for Psychical Research and interviews with the families and associates of the main characters. Knight’s method and tone will be familiar to those who have read his Guardian Long Reads on everyday subjects such as the British sandwich industry and Uber’s takeover of London, or his New Yorker “Letter from the UK”. He deploys two highly effective narrative techniques. The first is the deadpan drop of bits of stray information. We learn that a survivor of the Hither Green rail crash was the seventeen-year-old Robin Gibb, of the Bee Gees; that Barker was a keen surfer, although overweight and at least two decades older than his fellow longboard pioneers; that Fairley chased stories on a fold-up motorcycle and that only when he died did his widow and four children learn of his secret second family. As well as being weirdly fascinating, these facts add authenticating specificity to the story.

Knight’s second technique is the narrative handbrake turn, where the story veers off without warning, the significance of this new thread only emerging later. “In the 1690s, a young tutor named Martin Martin was commissioned to map and document life in the western islands of Scotland”, he might begin, out of the blue. Or: “One day in 1995, in the German cathedral city of Mainz, a fifty-one-year-old woman went to hospital …”. The creatively jarring juxtaposition of human voices and stories reminded me a little of Tales of a New Jerusalem, David Kynaston’s multi-volume history of postwar Britain. Knight, like Kynaston, leaves us with a sense of the stubborn strangeness of other people and of the recent past, without ever seeming condescending to either. Other people, his book reveals, are infinitely and incurably odd. Still, they might just be on to something.

Ten Writing Tips

During lockdown in the autumn of 2020, when we were teaching online, I posted a writing tip to our students every week. I thought I would post them here now in case anyone else finds them useful.

Tip 1: Start writing earlier

When you’re working on an essay or piece of coursework, start writing early on in the process. Don’t spend all your time on the reading and research and leave the writing until the last minute. As an English student, you should treat writing as your laboratory, your way of thinking – how you find out what you really want to say. Make sure you leave enough time for it.

Students sometimes get discouraged when they have written a first draft of their essay and it feels awkward or stilted. But that’s like saying your cake tastes awful when all you have done is mix some butter, eggs, flour and sugar in a bowl. You haven’t finished making it yet. Only when you have hacked your sentences into a basic shape can you see the many other things wrong with them. Only by putting the words into a semblance of order can you see how muddled they still are. An essay is too big and complex to hold entirely in your head, so you need to have the words in front of you to really think it through.

A defining quality of writing, as opposed to speaking, is that it can be redone. You can keep working on it until it’s ready. Writing is rewriting. ‘Writing,’ the American author Kurt Vonnegut said, ‘allows mediocre people who are patient and industrious to revise their stupidity, to edit themselves into something like intelligence.’ Not that I’m saying you’re mediocre. I’m just saying that the great thing about writing is that you can keep reworking it until you sound like the best, most perceptive and insightful version of yourself. And who wouldn’t want to spend time doing that?

Tip 2: Trust your ear

The best way to iron out mistakes and awkwardness in your writing is to read your work aloud. Trust your ear. Language is innately rhythmic and musical. Even the way you say your phone number to someone else has a rhythm, as you split it into two or three phrases. That is why we find the automated voices of satnavs and public address systems, with their random rise and fall, so alien. They don’t sound human because they don’t speak with human rhythms.

If you get the rhythm of your writing right, the other things tend to fall into place. Most people know the grammatical rules of writing better than they think they do. You probably know where the subject and verb should go in a sentence, even if you can’t identify them. (Most people can’t.) You know the subject and verb go at that point in the sentence, and in that order, because it sounds right. If it sounds right, it’s probably grammatically right; if it sounds wrong, it’s probably grammatically wrong. You should certainly trust your ear more than the grammar check on MS Word, which is pretty useless.

You can test the flow and sense of your writing when you read your work aloud, because the ear is very sensitive to dissonance, in the same way that you can tell if a singer has hit a bum note, even if you don’t know what the note should be. Reading your work aloud slows you down (you read much quicker when you’re reading silently) so you’re more likely to notice if something sounds wrong. Reading aloud forces you to renotice what you have written.

There is an even better way. When you read your own writing aloud you already know what you meant, and you augment that meaning by accenting and stressing, speaking faster or slower, higher or lower – all ways of making your meaning clearer and reducing ambiguity. Better, if you can bear it, to get a friend to read out your sentences for you. If they stumble over a word or phrase, it might be a clue to revisit it.

Tip 3: Cut all unnecessary words

Which of these sentences sounds better to you?

  1. When I was a child, I used to have a terrible temper.
  2. As a child, I used to have a terrible temper.
  3. As a child I had a terrible temper.

I say the second is better than the first, and the third is best. You don’t need both when and used to, because they convey the same thing. And, come to think of it, you don’t need both as and used to either, because they too convey the same thing. The third sentence takes the least time and effort to read. Cutting unnecessary words always makes your writing cleaner and more elegant.

For instance, repeating a word in a sentence can sound clunky:

By choosing to narrate the novel in the first person the author makes the novel more vivid.

Better version: The use of the first person makes the novel more vivid.

The story of Cinderella is a well-known story.

Better: The story of Cinderella is well-known.

The book’s title establishes the theme of the book; the book’s first paragraph establishes the voice of the book.

The book’s title establishes its theme; the first paragraph establishes its voice.

Also, do you really need all those vague qualifiers like very and rather, and do you need two vague adjectives when one would do?

This piece of writing is a very poignant and heartfelt one.

The writing is heartfelt.

It’s easier for the reader to quickly grasp the meaning of your sentence if you cut all needless words:

Portraying the nature of people to be driven by violent instinct is present in many other novels.

People driven by violent instinct appear in many other novels.

Most memoirs choose to mirror the strict chronological nature of life itself within the structure of their works, although this is not always the case. Some memoir writers choose to employ non-chronological structures.

Most memoirs mirror the chronological nature of life in their structure, but not all.

So: write more words than you need and then go through your draft cutting the ones you don’t need. Just as your speech is full of ums and ers and repetitions, your first go at any piece of writing will be full of unnecessary verbiage.

A writer makes meaning not just by adding words but by taking them away. The playwright David Mamet said that ‘Omission is a form of creation.’ Cutting words is as creative an act as writing them. It often makes your meaning clearer to yourself. It’s a bit like being a sculptor, looking for the beautiful form hidden in that rough block of marble by chipping away at all the superfluous stone. Cutting words has this same creative quality. Sometimes it can liberate a meaning that you weren’t quite aware of but that was waiting there to be found.

Tip 4: Learn the power of the full stop

In the age of texting and social media, full stops are going out of fashion. The dialogic visual language of texting speech bubbles, pinging left and right on your phone, has little use for full stops. A single-line text needs no punctuation to show that it has ended. Instead of a full stop, you press send. Studies have shown that young people tend to interpret full stops in texts as curt or passive-aggressive.

But writing is not a speech-balloon text waiting on a response. The point of writing is to communicate in a way that does not require you to explain it any further. A sentence gives words a finished form that should need no clarification. It is its own small island of sense. So, with any kind of semi-formal writing addressed to people you don’t know well (such as the tutor marking your essay), the full stop, and where you decide to put it, are crucial.

Only when the full stop arrives can the meaning of a sentence be fulfilled. The full stop should be like a satisfying little click that moves your prose along slightly so that the next sentence can pick up where it left off. If you want to write well, learn to love the full stop. Love it above all other punctuation marks, and see it as the goal towards which all your words move. It is the most powerful punctuation mark: don’t forget to use it.

Tip 5: Don’t make your sentences any longer than they need to be

Last time, I wrote about full stops. Here is another reason why full stops are important: every sentence places a burden on the reader’s short-term memory. A sentence throws a thought into the air and leaves the reader vaguely dissatisfied or confused until that thought has come in to land. The reader has to hold all the sentence’s words in their head until the full stop arrives to close the circle of meaning. The full stop provides relief, allowing them to take a mental breath.

The longer your sentence is, the more the reader has to hold in their head and the more chance there is of something becoming mangled or unclear. This doesn’t mean you should avoid writing long sentences – I will discuss how useful they are in one of my later tips – but it probably means that, if in doubt, you should put a full stop in. A lot of student writing is full of sentences that are longer than they need to be.

When you’re writing a first draft, I suggest you start with short, simple sentences. If you start short like this, it’s easy to add detail and texture, and combine short sentences into longer, more complex ones. But if you start writing long, complicated sentences before you’ve worked out what you really think, then you will find them hard to take apart and simplify. Start simple and make it complex; don’t start convoluted and then have to unravel it all.

Tip 6: Vary your sentence length

The best way to make your writing sound fresh and musical is to vary the length of your sentences. Paragraphs tend to work well when they are a group of sentences of varied lengths. At the end of every sentence there is what’s called a cadence – a drop in pitch (whether you’re reading it aloud or silently) as the full stop arrives. This signals to the reader that the sentence, and the sentiment, are done. Varied sentence length makes for varied cadences. This makes writing breathe, move and sing.

Short and long sentences also do different things. Short sentences make key points or recap them, and trade in relatively straightforward statements about the world. Long ones take readers on a mental tour, list a whole series of things or stretch out a thought. Short sentences give the reader’s brain a bit of a rest; long ones give it an aerobic workout. Short sentences imply that the world is cut and dried; long ones restore its ragged edges. Short sentences are declarative and sure; long ones are conditional and conjectural. Vary your sentence length and you mirror the way the mind works, veering between conviction and nuance.

Vary the length of your sentences!

Trust me.

It works.

Tip 7: Put the important stuff at the end of the sentence

A good English sentence, however long it is, moves smoothly and easily towards its full stop. The best way to ensure this happens is to put the important stuff at the end. A sentence ordered like this feels more deliberate and memorable – just as, when you stop speaking, what sticks in your listener’s mind is the last thing you said.

Typically, the word or words at or near the start of a sentence are the subject. The words at the end of a sentence are typically the predicate: the main verb and its complements. The predicate adds new information that the next sentence may then comment on as a given. So the predicate often turns into the subject of the next sentence. Weak sentences break this given-new pattern. The subject is stronger than the predicate and the sentence ends with an unresounding phhtt.

If you write that something is an interesting factor to consider or should be borne in mind or is very relevant in today’s society, then your predicate is not saying much, because those things could be said about lots of things. I call these sentences pretending-to-care sentences. They turn up a lot in student essays, particularly in introductions, because you’ve essentially been given an assigned task and told to come up with something to say about it. Here are a few examples:

  • Poems dealing with the theme of death include great works by many different authors.
  • The issue of gender equality appears in thousands of texts from different writers all around the world.
  • Each of these writers has something special and unique about them.
  • These stories, written in different time frames, touch on many different subjects.
  • Malcolm X produced potentially one of the most influential autobiographies to ever exist.
  • Throughout my essay, I will use a wide range of secondary sources, making my argument more objective.

There’s nothing drastically wrong with any of these sentences. But there are two problems with all of them: they don’t say very much, and they end flatly. Look at the second half of all these sentences: the predicate (touch on many different subjects, have something special and unique about them, include great works by many different authors etc.) could apply to lots of things.

Let’s have a go at fixing a couple of pretending-to-care sentences.

Childhood is a stage in life that everyone has experienced.

Better version: All of us were children once.

The theme of love is one which has reoccurred throughout various texts in the literary tradition since its very beginning.

Better: Love is a recurring literary theme.

The revised versions are better not just because they use fewer unnecessary words, but because they end strongly, with the key bit of information at the end of the sentence. If you do this, the full stop will arrive with a satisfying click.

Tip 8: Your writing must speak all on its own

First, some words from the author Verlyn Klinkenborg:

‘When called to the stand in the court of meaning, your sentences will get no coaching from you. They’ll say exactly what their words say, and if that makes you look ridiculous or confused, guess what? Sentences are always literal, no matter how much some writers abhor the idea of being literal. In fact, nothing good can begin to happen in a writer’s education until that sinks in. Your opinion of what your sentence means is always overruled by what your sentence literally says.’

Klinkenborg captures here what makes writing so hard. You have to arrange the words in such a way that they can be deciphered in your absence. In writing, meaning derives from just four things: syntax (the grammatical order of the words), word choice, punctuation and typography (that’s things like capital letters and italics). Part of you thinks that you will be able to hover over the reader’s shoulder as they read what you’ve written, saying ‘That’s not what I meant. This is what I really meant!’. You won’t. The only thing the reader can use to access your wonderful ideas is your words. Writing is made of marks on the page and nothing else.

In their book The Elements of Style, William Strunk and E.B. White advise: ‘When you say something, make sure you have said it. The chances of your having said it are only fair.’ When you write a first draft, it is very unlikely that you will have said exactly what you think you have said. That’s why you need to read your work over, read it aloud, redraft it, proofread it. Then you find out if you have said what you wanted to say.

Writing is a strange, cumbersome, artificial process. It takes a lot of work to make your words clear to the reader. The comic singer Neil Innes used to start his act with this line: ‘I’ve suffered for my art. Now it’s your turn.’ Don’t be like that. Don’t show the reader how tedious you found writing your essay by making them suffer as well. Writing should be an act of generosity, a gift from writer to reader. The gift is the work you’ve put in to make your meaning clear and your sentences a pleasure to read.

Tip 9: Avoid paragraphs of very different lengths

The paragraphs in your essay should not be of dramatically different lengths. That doesn’t mean they have to be exactly the same length. But if you have a two-page paragraph followed by one that is two sentences long, it’s a sign that you need to reshape your essay. There is no rule about how long a paragraph should be, although I don’t like to make mine longer than about 250 words.

A good, basic way of thinking about a paragraph is that it is a single idea, developed into an extended thought. You introduce your point at the start of the paragraph and spend the rest of it developing that point, using examples, supporting quotes, evidence, qualifications and counter-arguments. If your paragraph is only a sentence long, it either means that your idea needs to be developed further, or that it doesn’t merit a paragraph of its own. If your paragraph is two pages long, it means that it contains several ideas that each need their own smaller paragraph.

Paragraphs allow you to put similar material in your essay in the same place. A phrase that occurs often in student essays is ‘As previously mentioned’ or ‘As mentioned earlier’. In which case, why didn’t you also mention this point earlier, when you were talking about that subject? Put similar material in the same place in your essay.

The first and last sentences of each paragraph carry a lot of stress. They are a good way of nudging your argument along. Try making those sentences fairly short, so they can quickly introduce what’s to follow or wrap up a point.

Tip 10: Choose the right word

Be specific in your choice of words. It helps if you learn to be interested in word origins. Did you know that humility and humour are both linked etymologically to humus – the soil, the earth – and to a human, who is thus, linguistically, an earthling? Did you know that capricious originally referred to the behaviour of a typical goat (Capricorn being the sign of the goat)? Did you know that obscene originated in Greek drama as ob-skene, which means ‘offstage’? Did you know that immediately means ‘without any intervening medium’ – nothing comes between it?

If you know what a word’s origin is, you’re more likely to use that word appropriately. Try to avoid what I call ‘thesaurus words’, where you’re looking for an alternative to a word and find a synonym in the thesaurus facility on MS Word. No word means exactly the same as another one. The right word is rarely the longest, most complicated or most impressive-looking word. It’s just the word that perfectly fits what you want to say in that part of the sentence.

Be aware of what I call ‘crutch words’: the off-the-shelf words that you use a lot. For me, it is words like merely and simply. A common student crutch word is somewhat, often used wrongly. Another crutch word, to describe a book or fictional character, is relatable, an adjective that doesn’t mean much. Use the ‘find’ facility on MS Word to see if you use a particular word a lot. If you use lots of crutch words, your prose may sound muddy and dull.

Choosing the right words is hard, and our first efforts often sound slightly wrong or try-too-hard. The right word rarely comes to you immediately (‘without any intervening medium’). Go through your essay looking at every word, particularly the nouns and adjectives. Is that really the right word? Did I mean to say that? Can I come up with a more exact and informative way of describing this poem than emotional or poignant or relatable?

Good writers also tend to be interested in words themselves: their look, feel, shape and sound. ‘We must remember how wide the word “Iowa” is,’ the American writer William Gass once wrote. ‘We must bear in mind how some words are closed at both ends like “top” or are as open as “easy” or as huffed as “hush.” Some words click and others moan. Some grumble. Listen to the way the word “sister” is put together. Can you feel the blow which chops off the end of “clock”?’ Cultivate this kind of granular interest in words and they will – I promise – pay you back a hundredfold.

Delivering the Undeliverable: Teaching English in a University Today

Here is a free-access link for an article I wrote for the journal English about university English teaching. It is more timely than I would like. Every week now seems to bring more news of redundancies and course closures in university English departments. This piece is an attempt to address this reality without being too depressing:

https://academic.oup.com/english/advance-article/doi/10.1093/english/efac006/6609054?guestAccessKey=a1a55475-7c2a-47b8-be31-cb896c27c683

The Tinkerbell effect

I wrote this for Times Higher Education last week:

We all know the ideal. A university is not just another medium-sized corporation; it is a community of scholars, striving towards the common goals of learning and enlightenment. And we all know the many ways an actual university falls short of that ideal. Collegiality can evaporate in the heat of the job, with its daily irritations and power plays. The modern university, the American educator Clark Kerr once wrote, is just “a series of individual faculty entrepreneurs held together by a common grievance over parking”.

The managerialist ethos that pervades today’s universities doesn’t help. This ethos reduces human relationships to the incentivising logic and contractual obligations of a market. The problem isn’t the people – managers themselves can be well meaning and principled – but the system. Ultimately, managerialism does not believe in community, only in self-interested individuals completing tasks because they have been offered carrots or threatened with sticks. By dividing us up into cost centres, the managerialist university tries to isolate the ways in which the different parts contribute to the whole. Poorly performing areas, or those seen as a drain on resources, are put on the naughty step, or worse.

In this context, the rhetoric of the university as a community can feel like little more than message discipline, smoothing over dissent and critical thought. The language of corporate togetherness rings hollow at a time of casualisation, redundancies and unmanageable workloads.

Still, we keep believing. Collegiality responds to the Tinkerbell effect: the collective act of believing in it, sometimes in spite of the evidence, brings it into being. In the middle of this semester, we had a fire drill. When the alarm goes off, it opens up the building, decanting its dispersed human occupants on to the tarmac and lawn outside. The invisible life of the university is made visible. We stood coatless and shivering in the autumn air, huddled in little groups. I saw students I had only ever seen on Zoom, colleagues appointed since lockdown who I had never seen before, and others I had not seen for over a year, reassuringly unchanged. And I was reminded how much of a community is made by this mere fact of contiguity: passing each other in corridors, popping into offices, queueing up for the microwave.

These acts form part of what Katherine May calls “the ticking mechanics of the world, the incremental wealth of small gestures”, which “weaves the wider fabric that binds us”. As a shy and socially passive person, I rarely take the initiative in interactions, so I need these accidental encounters. I didn’t quite notice, while I was just trying to get through it, how much a year and a half of living online had messed with my head. I had to get well again before I knew how sick I was. After so many months of virtual working, these micro-expressions of the value of community feel like glugging down bottled hope.

Community is not some warm, bland, mushy thing. It is how complicated human beings learn to live alongside other complicated human beings – people who want desperately to be good but who are also self-absorbed, insecure, frustrated and afraid. Community is only ever a work in progress, rife with bugs and glitches. It is hard work.

That becomes particularly apparent at Christmas, as we try to find it in us to show peace and goodwill to people we find irritating and exhausting. The writer Loudon Wainwright Jr called Christmas “the annual crisis of love”. A university is a permanent crisis of love. But crises are what we struggle through because it’s worth getting to the other side – and because a university is a community or it’s nothing.

Managerial blah

I published this piece in Times Higher Education a few weeks ago:

There is a type of language that has become ubiquitous in academia in recent years. I call it managerial blah. You will recognise managerial blah if you’ve ever had to read it – or, God help you, had to write it. It is the official argot of the modern university, the way its actions are presented and explained.

How do you write managerial blah? First of all, you will need lots of abstract nouns. It helps if these can be used to signal things that we are all meant to approve of in some open-ended, ill-defined way, like leadership, excellence and quality. Mostly, though, you can rely on nouns that just refer to general categories into which other things fit, like framework, model, strategy, mechanism and portfolio.

This kind of noun-speak bears the traces of that traditional faith in word magic, the belief that chanting words like a spell could bring something into being, such as a cattle plague for one’s enemy or a good harvest for oneself. We flatter ourselves that, as enlightened moderns, we have left such primitive notions behind. But word magic survives today in curses, oaths – and nouns.

When you use a noun, you are claiming that the thing it refers to is real and durable enough to be named. The American writer and educator John Erskine wrote that a noun is “only a grappling iron to hitch your mind to the reader’s”. This grappling iron is especially useful when you are dealing with abstract notions that can’t be grasped by the senses. In managerial blah, nouns like esteem, value and gain become taken-for-granted things that, to the initiated, speak for themselves. The effect is amplified when you put two nouns together, such as performance indicator or service outcome. And even better if you can group them into threes: upskilling development opportunity, online delivery platform, workload resource allocation. Managerial blah loves these three-noun clusters because they ratchet up the nouniness, and thus the feeling that we are discussing something definite and unarguable. Knowledge Exchange Partnership is not just a noun, but three nouns – so it must be a thing. It lives!

These abstract nouns can then be paired up with intensifying adjectives such as dynamic, strategic, impactful, innovative and user-focused. In managerial blah these intensifiers have gone through a process that linguists call semantic bleaching. This means that their intensity has declined through overuse, until they are left as little more than placeholders in a sentence. They are so often paired with the same nouns that they form the tired couplings known as collocations. A collocation occurs when two or more words are habitually juxtaposed. So learning is always student-centred, procedures are always robust, competencies are always core and stakeholders are always key (there being no such thing, in managerial blah, as a minor stakeholder). Adverbs and participles can be collated into equally trite pairings. In managerial blah we are never just engaged but actively engaged, never just positioned but proactively positioned, never just committed but strongly committed.

OK, now you have to join up these stock-brick words and phrases into a clause. You will need at least one verb, but make sure it is a weak, connective one, such as facilitate, embed, enhance, refocus, reprioritise or rebalance. Try not to use more energetic verbs which would force you to attach agency to the subject of your sentence. That might involve you inadvertently constructing an argument that could be challenged, and you don’t want that.

You will find that you can cut down on verbs, anyway, by using lots of prepositions. Prepositions are small, harmless-looking words with many different meanings and functions. For the aspiring author of managerial blah they are helpfully ambiguous, allowing you to hint at connections between things without having to argue them through. You can use prepositions to staple-gun nouns together without worrying too much about verbal action. Managerial blah uses prepositions in weird, overemphasised ways, as if these little words are carrying more weight than they should. We will look at providing … This will help around our impact agenda … The Executive Deans will be leading on this.

If you’ve followed my instructions so far, you will have something resembling a complete clause in managerial blah. Now all you need to do is link it up with other clauses into sentences and paragraphs in a way that has no forward momentum at all. For instance, you can yoke interchangeable clauses together into one long sentence using just colons and semicolons. These form a sort of punctuational sticking plaster when the verbs are not strong enough to carry the reader through the sentence and into the next one. You can also group your sentences into lots of numbered sections and sub-sections. Numbering creates the illusion of structure and relieves you of the task of connecting one paragraph to the next with a developing argument. This list-like writing is lifeless – listless, in fact – because, unlike life, it has no propulsive energy. It arranges reality consecutively but randomly.

The torpidity of managerial blah sits awkwardly with its rhetoric, which sets so much store by perpetual movement. The prevailing tone is one of monodirectional certainty, with constant reference to continuous enhancement, direction of travel and, of course, going forward (or its now more common and vigorous variants, moving forward and driving forward). This all brings to mind the French theorist Henri Lefebvre’s definition of modernisation as “the movement which justifies its own existence merely by moving”.

In managerial blah, the phrase going forward is more than just a verbal tic. It encapsulates a particular way of looking at the world. Like the free market, the managerialist university believes in permanent revolution and endless growth. So it produces an infinite series of self-replenishing demands, urging everyone up a mountain whose summit will never be reached. University mission statements beseech us all to keep improving, to ceaselessly pursue quality, value or excellence. And how could the quest for such elusive nouns ever end?

But here’s the odd thing. Even as managerial blah exhorts us to move endlessly onwards, it is taking us on a journey that already knows its own end. In its linguistic universe, nothing truly new or surprising ever occurs. Points must be actioned, milestones must be met, deliverables must be delivered and inputs must have outputs. All eyes are on the satisficing completion of an algorithmic process, in a way that refuses to concede the possibility not only of failure but also of anything unforeseeable, unanticipated or serendipitous.

Managerial blah is an anonymous, depopulated language. It bears no trace of any inconveniently real human beings whose imperfections might mess up the system. It deals in processes and procedures, not people. It conjures up a metric-driven, quasi-rationalistic, artificially sealed-off world in which anything can be claimed and nothing can be seen or felt. No one ever says who did what to whom, or takes ownership or blame. The juggernaut just runs on, inevitably and under its own steam, although there might be issues and challenges along the way (never problems or failures, which might be pinned on people).

Good writing always has some give in it. It is open to dispute by a reality that the writer does not own and the reader might see differently. Managerial blah, by contrast, runs along a closed circuit that permits no response. Without any sense of voice or audience, it feels tone-locked, written by no one in particular to be read by no one in particular. It is anti-language, a weary run-through of the verbal motions.

Why does managerial blah get written? In part it is down to a banal and timeless truth: most people have no great facility with words. Writing with subtlety and precision is hard, so instead we default to off-the-shelf words and boilerplate phrases. Tying nouns together with weak verbal and prepositional knots is the simplest and quickest way to rustle up a sentence and achieve a superficial fluency. If good writing is hard to write and easy to read, then managerial blah is the reverse: a labour to read, a breeze to write.

Perhaps some of those who write managerial blah genuinely believe that, merely by gluing nouns together, they have communicated meaningfully with their readers. But surely, in a university, ignorance is no defence. Managerial blah is a crime against words by intelligent, well-educated and well-remunerated people who should know better. Writing well is hard, but not that hard. If you keep on producing this ugly and alienating language when so many people have told you how ugly and alienating it is, then your intellectual laziness is not an accident.

Writing with proper care and attention offers you no hiding place. The basic somebody-did-something structure of the plain English sentence allows your reader to weigh up how convincing you sound. When you use specific nouns and strong verbs to describe your actions, you have to think through the purposes and consequences of those actions. Managerial blah evades this obligation. It can thus make the cruellest and maddest realities seem sane and incontrovertible.

The sector is currently going through a traumatic cycle of redundancies, in response to the brutally competitive market created by the raising of tuition fees, the removal of student number controls, and government antipathy to “low value” arts and humanities courses. The language used to announce and explain these redundancies has been a masterclass in managerial blah: stale, vapid, self-validating and, of course, chock-full of nouns. Staff who receive letters informing them of their imminent dismissal are lectured about strategic priorities, changes to the staffing establishment, redundancy ordinances and university redeployment registers. And because lip service must now always be paid to mental health, they are then directed to Employee Assistance Programmes and Staff Wellbeing Portals. Students who are worried that their lecturers have all been sacked are assured that a student protection plan is in place. This nouny language is not even trying to illuminate. Its only purpose is to immunise itself against scrutiny and challenge.

Job cuts and other distressing developments are justified with the scariest two nouns in the lexicon of managerial blah: change management. Change is happening anyway, this omnipresent phrase suggests, and in a direction that has already been decided. All the rest of you can do is adapt or get off the bus. Change management is the shibboleth of a financialised capitalism that sees human capital as a faceless element in an inexorable process and a fixed cost to be reduced. The emotional costs of redundancy are immense. People who have given everything to adapt their teaching and support their students in a pandemic have now had their livelihoods taken away and, in many cases, their careers effectively ended. Naturally they feel great anxiety, anger and pain. Their colleagues who remain in post are left with survivor guilt and fear of worse to come. And all this human turmoil is hidden inside that insipid, unassailable phrase, change management.

Those of us in the humanities – the subjects most at threat from redundancies – are at least alert to how language shapes reality as well as reflecting it. Words are never simply a neutral, transparent container for meaning. They can clarify and elucidate or they can muddy and obscure. They can wake the mind up or anaesthetise it. They can polish reality so it gleams or hide it behind a rusty carapace of cliché, cant and sloganeering. Words are not just how we communicate; they are how we think. Managerial blah liberates the writer from the need to do either. It is a kind of literary lockjaw which stops you saying or thinking anything original or edifying.

There is a long tradition of making fun of management speak, but managerial blah is too dull even to poke fun at. It offers no purchase or ingress for the satirist or ironist. It just sleepwalks from one big noun to the next, sucking us into its vortex and boring us into submission. All the imaginative promise of words has been pulped into a lumpy noun gravy, neither liquid enough to flow nor solid enough to be forked. This noun gravy is tasteless but, should you swallow enough of it, noxious.

Words matter. They transform how we think and feel about each other and about our lives. We will never be able to see the world in more creative and fruitful ways if we are trapped inside a univocal vocabulary. Imagine how refreshing it would be to read an official university document that treated its reader like a human being, by trying to persuade them with defensible arguments, fine distinctions and honest doubts. We would live richer, more productive and more authentic working lives in academia if we cared more about words – which is why, now that I have told you how to write managerial blah, I hope you will ignore my advice.

Why I no longer read anonymous comments

This is a slightly longer version of a piece I wrote for the Times Higher last week.

I have stopped reading anonymous comments by students on my module evaluation surveys. Unless I’m forced to, I won’t read them again. I understand the argument for anonymity. Anonymous feedback, delivered without polite hedging or fear of censure, can provide those in privileged roles with salutary information they might not hear face-to-face. But anonymity also has costs, and I no longer believe the benefits outweigh them.

I have never posted anonymous feedback in my life. When I fill in staff surveys, which isn’t often, I put my name at the bottom of any free-text comments I make. Perhaps this is vanity: why waste time on words that don’t have my name on them? But at least it means that I take responsibility for them – the credit and the blame. By affixing my name to my words, I am incentivised to care that they say precisely what I want them to say.

Our online lives have normalised anonymity. In You Are Not a Gadget: A Manifesto (2010), the technology writer Jaron Lanier argues that anonymity is now a congealed philosophy, an “immovable eternal architecture” built into the software. Participants in the early world wide web were extrovert and collegiate in their online identities. Web 2.0, with its shift to user-generated content, encouraged the use of pseudonyms and avatars as part of the crowdsourcing of information. What mattered in this new world were not the individuals who made it up, but the endless, collective generation of data, which could be exploited for advertising, surveillance and other purposes.

We have got used to providing free content online, by posting below-the-line comments, leaving feedback or updating our social media feeds. Even when we do put our names to this writing, our names aren’t that important. The writer is merely a content provider, a tiny part of the vast computational machine and its insatiable appetite for harvestable data.

For Lanier, this new culture has led to a “drive-by anonymity”. It empowers trolls, rewards snark and makes for “a generally unfriendly and unconstructive online world”. Distanced from others by the technology, we are more likely to forget that we are addressing complex, harassed, bruisable humans like ourselves.

Academics are at the luckier end of this problem. In some service industries, anonymous feedback can affect people’s pay and even their employment. If an Uber driver gets too many poor ratings, they are frozen out of the app that brings them new customers. In academia, poor feedback doesn’t usually have these drastic career consequences. We are also lucky that only a tiny number of students set out to be cruel or unkind. Such comments do get posted, though, and there is now a large body of research suggesting that negative feedback is aimed disproportionately at young, female and BAME lecturers.

Anonymous feedback has a more insidious aspect: it skews the whole nature of writing as communication between human beings. It is more likely to be dashed off and dispensed casually, probably in the middle of many other invitations to give feedback. It means far more to the reader than it does to the writer – which is the wrong way round.

One of Lanier’s suggestions for improving online culture is to post something that took you a hundred times longer to create than it will take to read. This usefully shifts a piece of writing’s centre of gravity. The producer of the words has more invested than the consumer. Those words have a better chance of saying something interesting and worthwhile.

Our culture’s appetite for computable information makes nuanced communication more difficult. “Writing has never been capitalism’s thing,” Gilles Deleuze and Félix Guattari argue in Anti-Oedipus: Capitalism and Schizophrenia (1972). Capitalism, they write, prefers “electric language” – words that can be processed, actioned and monetised. But words are not just containers for data. They possess an immense power to move, hurt, deceive, anger, enchant and cajole others.

Most of our students grew up with Web 2.0 and know no other reality. They are at ease with anonymity. But as an English lecturer, I am struck by how much this conflicts with what we try to teach them about good writing. We tell them that putting words into careful, considered order is hard, that they must keep rewriting until they sound like the best and most insightful version of themselves. We teach them that words cut through most deeply when they have a sense of voice and address, of being written by an irreducibly unique person to other irreducibly unique people.

We have learned during the pandemic that teaching does not thrive as a series of faceless interactions. Just as Zoom seminars are easier and more enriching to teach when students have their cameras on, I would much rather receive feedback from specific, identifiable people. I know this kind of feedback would be as flawed as all human communication – prone to misunderstandings, self-censorship and power imbalances. We would need to work hard to create a space in which students felt able to speak freely. And students would also need to spend time framing their comments with the right mix of directness and tact – but wouldn’t that be a good skill for them to learn? For all its difficulties, feedback with someone’s name on it still feels preferable to the asymmetry of anonymity, so subtly alienating for both writer and reader. That is why I no longer read anonymous comments.

The power of touch

This is a slightly longer version of the article I published in last week’s Observer:

When was the last time you touched someone you don’t live with? One day last March, probably; you’re not sure of the date. Did you shake hands with a new colleague at work? Did your coat brush against another commuter’s on the train? Did someone bump your elbow and mutter an apology when rushing past you on an escalator? If you’d known that was the last time you’d make contact with the body of a stranger, you’d have paid more attention.

And what about the 8.2 million British adults who live on their own? Many will have gone nearly a year now without so much as a pat on the arm from another person. Touch is the sense we take most for granted, but we miss it when it’s gone. Psychologists have a term for the feelings of deprivation and abandonment we experience: “skin hunger”.

“Skin hunger” is not a phrase I had come across before last year, nor a problem I ever imagined facing. I am a socially awkward, non-tactile person. I have looked on nervously as, over the last two decades, hugging has moved from being a marginal pursuit to a constant of British social life. A hug feels to me like an odd mix of the natural and the artful. It is natural because bodily contact is the first, endorphin-releasing language we learn as babies and share with other apes. But it is also artful, because it has to be silently synchronised with someone else, unlike a handshake, which can be offered and accepted asynchronously.

For the truly socially inept, even a handshake can be fiddly. I used to botch them all the time, offering the wrong hand (being left-handed didn’t help) or grabbing the other person’s fingers instead of their palm. Then, just as I had completed my long internship in handshaking, it began to lose currency and I had to hastily reskill in hugging. The best I could manage at first was a sort of bear-claw hold with my arms hanging limply down my huggee’s back. It must have been like trying to cuddle a scarecrow. I got better at it; I had to. Now I find that I really miss hugging people. I even miss those clumsy, mistimed hugs where you bang bones together and it goes on just slightly too long or not long enough. And “hunger” feels the right word for it, in the sense that your body lets your mind know that something is up, and fills it with a gnawing sense of absence.

Aristotle considered touch the lowliest sense. He looked down on it because it was found in all animals and it relied on mere proximity, not the higher human faculties of thought, memory and imagination. But one could just as easily say that touch is the highest sense, and for the same reasons. It is the basic animal instinct that lets us know we are alive in the world. It offers proof of the solidity of things other than ourselves.

Touch is our first sensation. The hand of a two-month-old human foetus will grasp when it feels something in its palm. A new-born baby will instinctively turn its head towards a touch on the cheek. All over the world, children play tag without having to learn how. The earliest forms of medicine drew on this human need to touch and be touched. The practice of healing massage emerged in India, China and Southeast Asia by the third millennium BCE, before spreading west. Asclepius, the Greek god of healing, cured people by touching them. The word surgeon originally meant hand healer, from the Greek for hand (kheir) and work (ergon). In the gospels, Jesus cures the sick with the laying on of hands.

In recent years the caring professions have revived this practice of healing through touch. The tender touch of others is now known to boost the immune system, lower blood pressure, decrease the level of stress hormones such as cortisol, and trigger the release of the same kind of opiates as painkilling drugs. Premature babies gain weight when rubbed lightly from head to foot. Massages reduce pain in pregnant women. People with dementia who are hugged and stroked are less prone to irritability and depression.

Our oldest myths speak of the lifegiving power of touch. In Homer’s Odyssey, Odysseus, visiting Hades, tries to hug his dead mother, Anticleia, so that they might “find a frigid comfort in shared tears”. But Anticleia is now a lifeless husk; she just slips through his arms like a hologram. Homer’s metaphor for the unbridgeable chasm between the living and the dead – a failed hug – feels newly resonant in the time of Covid. The Homeric underworld is a place of permanent lockdown, where the dead live on as unreachable, self-isolating ghosts.

Philip Pullman’s His Dark Materials trilogy echoes this scene in its last book, The Amber Spyglass. Lyra tries to hug her friend Roger in the world of the dead, but he passes “like cold smoke through her arms”. Pullman’s trilogy is a hymn to the materiality of the human body. It deliberately inverts the traditional Christian story, in which our eternal souls triumph over our flawed, sinful flesh. Pullman’s angels long to have bodies like humans, to feel the world through the senses. His human characters have daemons, physical manifestations of their souls, which means that they can hold themselves in their arms the way Lyra hugs her daemon Pan.

It is hard to read His Dark Materials now without thinking about how the pandemic has separated us from each other. The trilogy’s climax comes when Lyra and Will kiss and know each other with their bodies. But then they must part and return to their own worlds. They agree that at noon on every midsummer’s day they will both sit on a bench in Oxford’s Botanic Garden that exists in both their worlds. Lyra tells Will that if they ever meet again they’ll “cling together so tight that nothing and no one’ll ever tear us apart”.

The different worlds in Pullman’s work are divided by the thinnest of membranes. The strange new rituals of the past year have all been about trying to reach across such thin but absolute divides. Older couples stand in front gardens, waving at their grandchildren through windows and miming hugs. People embrace their relatives in care homes through “cuddle curtains”: plastic sheets with two pairs of sleeves, allowing them to hug without touching. In Zoom meetings, we smile and wave at the shapeshifting pixels on our screens because they resemble people we used to know and perhaps once touched.

The virus, by forcing us apart, reminds us of this inescapable fact: we live in our bodies. Maybe we had begun to forget this in a world that links us up in so many virtual, intangible ways. That miraculous piece of technology, the touchscreen, works through a desensitised, near-touchless touch. It smoothly responds to our prodding, pinching and swiping so that we may do our duty as good little online citizens, working, shopping and distracting ourselves endlessly. But as our fingers and thumbs glide across the uniform surface, there is no sensuality or responsiveness in the touch. For the skin hungry, this is thin gruel.

Touch is a universal language, but every culture has its own way of speaking it. In north Africa and the Middle East, men join their hands together in greeting, then kiss their own hands or hold them to the heart. The Congolese touch each other on the temples and kiss foreheads. In Tuvalu they sniff each other’s cheeks. Andaman islanders in the Bay of Bengal sit in each other’s laps and then, in farewell, lift the other person’s hand to their mouth and blow.

Britain, by contrast, has historically been a low-contact culture. One explanation for the rise of ballroom dancing in this country is that it gave shy strangers formal permission to hold each other. Studying the etiquette in a Bolton dance hall in 1938, the anthropologist Tom Harrisson noted that a man would ask a woman for a dance simply by touching her elbow and waiting for her to fall into his arms. This couple might dance the whole night without speaking, then go their separate ways.

In touch-deprived cultures, touching is no less important than in tactile ones. As we have learned over the past year, when people are starved of touch the slightest forms of contact become filled with meaning. The most charged moment in the film Brief Encounter (1945) comes when Laura (Celia Johnson) and Alec (Trevor Howard) can’t say goodbye properly, because an annoying acquaintance of Laura’s has gatecrashed their final farewell. So Alec softly squeezes her shoulder, a small gesture filled with doomed longing. A hesitant embrace can speak as potently as an ardent one. On 30 May 1953 Edmund Hillary and Tenzing Norgay arrived back at advance base camp after climbing Everest. According to the expedition leader, John Hunt, they were welcomed with “handshakes – even, I blush to say, hugs”.

In 1966 the psychologist Sidney Jourard conducted a field study of couples sat in coffee shops around the world. He found that in the Puerto Rican capital, San Juan, couples touched each other – by hand-holding, back stroking, hair caressing or knee-patting – an average of 180 times per hour. In Paris, it was 110 times; in Gainesville, Florida, it was twice; in London, never.

Jourard concluded that Americans and Britons lived under a “touch taboo”. In the US this even extended to barbers using electric scalp massagers strapped to their hands so they did not touch their customers’ heads. Jourard wondered if the large number of massage parlours in British and American cities betrayed a need not being met in normal relationships. Many American motel rooms were equipped with “Magic Fingers”, a device which, when fed a quarter, would slowly vibrate the bed. The machine, Jourard wrote, “has taken over another function of man – the loving and soothing caress”.

The new therapies that came out of California in the late 1960s sought to cure the English-speaking countries of their touchlessness. They prescribed generous doses of hugging. Bernard Gunther, of the Esalen Institute in Big Sur Hot Springs, taught full-body massage techniques as a path to sensory awakening. Some of Gunther’s more outré methods – mutual hair shampooing and the “Gunther hero sandwich” (a group of people spooning one another) – failed to catch on. But the massage therapists probably did help Britain and America become more tactile societies. By the 1980s, “Magic Fingers” machines had largely vanished from motel rooms.

In lockdown, the skin hungry have once again been forced to improvise inadequate technical fixes. They hug themselves, or hug pillows and duvets, or tuck in their bed blankets tightly at night. The robotics industry has tried to replicate the feel of human touch with Bluetooth-enabled “hug shirts” and silicone lips that allow you to hold and kiss someone remotely. But it’s not the same and never will be, however good the technology gets. Nothing substitutes for human touch.

As a teenager, the autistic writer and activist Temple Grandin longed to feel the pressure stimulation of a hug. Like many autistic people, though, she found being touched difficult. One day, visiting her aunt’s Arizona ranch, she saw cattle being put in a squeeze chute: a pen with compressing metal sides, which kept them calm while they were branded or castrated. Thus inspired, she made her own human “squeeze machine”. It had two wooden boards, upholstered with thick padding and joined by hinges. When she kneeled inside it and turned on an air compressor, it felt like being hugged. For Grandin, this was a useful staging post on the way to touching people. In her mid-twenties she learnt to shake hands. When she was sixty, her squeeze machine finally broke, and she didn’t bother to fix it. “I’m into hugging people now,” she said.

Real human touch is infinitely subtle and intricate – less a sense than a sensorium. Skin, which makes up nearly twenty per cent of our bodies, is our largest and most sensitive organ. An area of skin the size of a pound coin contains fifty nerve endings and three feet of blood vessels. The work of touch is done by sensory receptors, buried in the skin at different depths according to what kind of stimulus they detect, such as warmth, cold or pain. One of these receptors, the Pacinian corpuscle, responds to pressure and vibration. It can detect movements smaller than a millionth of a metre.

Everything we touch has its own specific shape, texture and firmness, its own special resistance to the pressure we place on it. Every hug feels different because everyone you hug takes up space in the world in a different way. No one else has quite the same contours, the same pleats and ripples in their clothes, the same warmth and weight, the same precise arrangement of flesh and bones. Your own body is a one-off, too. It folds into and nests with someone else’s in a way that no other body can.

“Sending hugs,” people say online – but you can’t send a hug. A virtual hug only whets the appetite for what you’re missing, just as looking at food when you’re hungry makes you hungrier. The feeling you’re trying to share in a hug is all wrapped up in its embodiment in space and time. A hug joins the physical and emotional so tightly together that you can’t tell them apart. The writer Pádraig Ó Tuama points out that an Irish way of saying hug is duine a theannadh le do chroí: to squeeze someone with your heart.

I wonder how it will feel when we can hug people again. Will we have to relearn the protocol, or will muscle memory kick in? Will our nerve endings have been deadened or hyper-sensitised by abstinence? Will we hug everyone too much and too hard, because our feeding habits have switched to feast-or-famine mode, like wolves who kill more than they can eat? One thing we do know now is that we are hardwired for touch. We were not meant to swerve away from each other in the street, or mime hugs through windows, or cuddle through walls of plastic. We were meant to hold people close, and feel the bones in their back and the rise and fall of their chests, and remind each other that we are warm bodies, still breathing, still alive.

On libraries

This article was published on Christmas Eve in the Times Higher (paywalled). I’m posting it here with some hesitation. When I uploaded the link on Twitter I received a number of angry tweets over Christmas from people who work in libraries. I think this may be because the headline suggested that I couldn’t wait for libraries to fully open again, when the piece is actually about looking forward to using libraries after the pandemic and had nothing to do with putting library workers at risk. Others felt that I had invisibilised the work of librarians in the article itself. That wasn’t my intention, but I don’t get involved in arguments on social media for all kinds of reasons. I’m just putting it up here for the record, and because I put a lot of effort into even short pieces like this. But please, as they say, don’t @ me – that’s happened enough already …

I have not been inside a library since March. University libraries are rationing their footfall with booking systems, shelves cordoned off with tape, and books available via click and collect. I have left our library to the students who need it more than me. We “library-cormorants”, as Coleridge called us, feel expelled from our nests.

I have long thought that the beating heart of a university is its library – the most welcoming and egalitarian space on campus. Here no locked rooms or timetables confine you. Go there to finish your essay, daydream, keep warm, watch a football game live on some bootleg online channel – no one minds. The library offers you free wifi, a workspace, reading matter, warmth, light and peace, not all of which everyone finds at home. I miss that mood of industrious quiet, captured in the sound of hundreds of students two-finger tapping at keyboards, like soft summer rain pattering on a tent.

I miss other libraries too. One day I will again sit in the reading room of the British Library, finding solace in the benign indifference of the readers around me, all of them politely pretending that I am invisible. A library offers, in Zadie Smith’s words, “an indoor public space in which you do not have to buy anything in order to stay”. The argument that we have less need of a library in a digital age errs, she writes, in seeing it “as a function rather than a plurality of individual spaces”. A library is a beautiful paradox: a public building that enshrines the private acts of reading, writing and thought.

I suppose the pandemic has taught us that, if needs must, we can survive without libraries. The history of scholarship is full of people who, when banished from them, make a virtue of the loss. “Lock up your libraries if you like,” wrote Virginia Woolf after being refused entry to a Cambridge college library, “but there is no gate, no lock, no bolt that you can set upon the freedom of my mind.” Woolf’s ambition was to write a history of English literature entirely from memory. Erich Auerbach, exiled in Istanbul during the second world war, was nearly reduced to this. Deprived of the European sources he needed, he focused instead, in his classic book Mimesis, on brilliant close readings of Western classics from Homer to Woolf. He ascribed its existence to “just this lack of a rich and specialised library”. Had he been able to wade through the vast secondary literature, he might never have got around to writing it.

The pull of libraries is as much emotional as practical. They offer the comforting illusion that in their unearthly calm and Dewey-Decimal order we will at last be able to quiet our neuroses, remake ourselves and return to the world revived and repaired. The scholarly life can be solitary and dispiriting. Doing research often feels like treading water. Libraries instil in us an emboldening sense of collective enterprise. They are, in David Attenborough’s words, “immense communal brains … extra-corporeal DNA, adjuncts to our genetical inheritance”. They make knowledge, in its wonderful but dizzying abstraction, concrete and tactile. I used to remind students that the books in the library are still known as “holdings”, and that sometimes it is nice to hold them. I am saving this advice for when it is once again Covid-compliant.

How cruel, then, that this quintessence of our collegiality, the shared space of a library – with its books thumbed by countless unnamed others, its hotdesking computer workstations, its large atria where hundreds breathe the same air – is now such a contagious and perilous place. As well as attacking our respiratory systems, the virus has attacked this idea of ourselves as social animals, who want to be near each other even if we might work more quickly and cost-effectively alone.

My nostalgia for libraries may be sentimental, but so what? Human beings like to anchor themselves in familiar places. In a famous essay, George Orwell rhapsodises about his ideal pub, “The Moon Under Water”. At the end Orwell admits what perceptive readers have already guessed: his perfect pub does not exist, being only an amalgamation of the best aspects of all the imperfect pubs he has known. On the same principle, in lockdown I have been imagining my perfect library. In my mind’s eye I see a single, large room panelled in dark wood, with soft pools of light emanating from green table lamps, and high-backed, padded chairs with sturdy armrests that miraculously fit under the desks. The bookshelves are as tall as buses, their upper limits accessed by ladders. On the mezzanine floor, reached by iron spiral steps, there are nooks and recesses where you may retreat further from the world. Plane trees rustle gently outside; the sunlight filters through them, and the high windows, to leave a lovely dappling effect on the parquet floor. The room has only a few other occupants, none of whom have annoying coughs or sniffles. This perfect library is small, but by some magically self-replenishing process has every single book I need.

John Betjeman reputedly said that in the event of Armageddon, he would head to the haberdashery department of Peter Jones department store in Sloane Square, “because nothing unpleasant could ever happen there”. I feel the same way about my perfect library. One day I hope to resume my search for it.