The Tinkerbell effect

I wrote this for Times Higher Education last week:

We all know the ideal. A university is not just another medium-sized corporation; it is a community of scholars, striving towards the common goals of learning and enlightenment. And we all know the many ways an actual university falls short of that ideal. Collegiality can evaporate in the heat of the job, with its daily irritations and power plays. The modern university, the American educator Clark Kerr once wrote, is just “a series of individual faculty entrepreneurs held together by a common grievance over parking”.

The managerialist ethos that pervades today’s universities doesn’t help. This ethos reduces human relationships to the incentivising logic and contractual obligations of a market. The problem isn’t the people – managers themselves can be well meaning and principled – but the system. Ultimately, managerialism does not believe in community, only in self-interested individuals completing tasks because they have been offered carrots or threatened with sticks. By dividing us up into cost centres, the managerialist university tries to isolate the ways in which the different parts contribute to the whole. Poorly performing areas, or those seen as a drain on resources, are put on the naughty step, or worse.

In this context, the rhetoric of the university as a community can feel like little more than message discipline, smoothing over dissent and critical thought. The language of corporate togetherness rings hollow at a time of casualisation, redundancies and unmanageable workloads.

Still, we keep believing. Collegiality is subject to the Tinkerbell effect: the collective act of believing in it, sometimes in spite of the evidence, brings it into being. In the middle of this semester, we had a fire drill. When the alarm goes off, it opens up the building, decanting its dispersed human occupants on to the tarmac and lawn outside. The invisible life of the university is made visible. We stood coatless and shivering in the autumn air, huddled in little groups. I saw students I had only ever seen on Zoom, colleagues appointed since lockdown who I had never seen before, and others I had not seen for over a year, reassuringly unchanged. And I was reminded how much of a community is made by this mere fact of contiguity: passing each other in corridors, popping into offices, queueing up for the microwave.

These acts form part of what Katherine May calls “the ticking mechanics of the world, the incremental wealth of small gestures”, which “weaves the wider fabric that binds us”. As a shy and socially passive person, I rarely take the initiative in interactions, so I need these accidental encounters. I didn’t quite notice, while I was just trying to get through it, how much a year and a half of living online had messed with my head. I had to get well again before I knew how sick I was. After so many months of virtual working, these micro-expressions of the value of community feel like glugging down bottled hope.

Community is not some warm, bland, mushy thing. It is how complicated human beings learn to live alongside other complicated human beings – people who want desperately to be good but who are also self-absorbed, insecure, frustrated and afraid. Community is only ever a work in progress, rife with bugs and glitches. It is hard work.

That becomes particularly apparent at Christmas, as we try to find it in us to show peace and goodwill to people we find irritating and exhausting. The writer Loudon Wainwright, Jr called Christmas “the annual crisis of love”. A university is a permanent crisis of love. But crises are what we struggle through because it’s worth getting to the other side – and because a university is a community or it’s nothing.

Managerial blah

I published this piece in Times Higher Education a few weeks ago:

There is a type of language that has become ubiquitous in academia in recent years. I call it managerial blah. You will recognise managerial blah if you’ve ever had to read it – or, God help you, had to write it. It is the official argot of the modern university, the way its actions are presented and explained.

How do you write managerial blah? First of all, you will need lots of abstract nouns. It helps if these can be used to signal things that we are all meant to approve of in some open-ended, ill-defined way, like leadership, excellence and quality. Mostly, though, you can rely on nouns that just refer to general categories into which other things fit, like framework, model, strategy, mechanism and portfolio.

This kind of noun-speak bears the traces of that traditional faith in word magic, the belief that chanting words like a spell could bring something into being, such as a cattle plague for one’s enemy or a good harvest for oneself. We flatter ourselves that, as enlightened moderns, we have left such primitive notions behind. But word magic survives today in curses, oaths – and nouns.

When you use a noun, you are claiming that the thing it refers to is real and durable enough to be named. The American writer and educator John Erskine wrote that a noun is “only a grappling iron to hitch your mind to the reader’s”. This grappling iron is especially useful when you are dealing with abstract notions that can’t be grasped by the senses. In managerial blah, nouns like esteem, value and gain become taken-for-granted things that, to the initiated, speak for themselves. The effect is amplified when you put two nouns together, such as performance indicator or service outcome. And even better if you can group them into threes: upskilling development opportunity, online delivery platform, workload resource allocation. Managerial blah loves these three-noun clusters because they ratchet up the nouniness, and thus the feeling that we are discussing something definite and unarguable. Knowledge Exchange Partnership is not just a noun, but three nouns – so it must be a thing. It lives!

These abstract nouns can then be paired up with intensifying adjectives such as dynamic, strategic, impactful, innovative and user-focused. In managerial blah these intensifiers have gone through a process that linguists call semantic bleaching. This means that their intensity has declined through overuse, until they are left as little more than placeholders in a sentence. They are so often paired with the same nouns that they form the tired couplings known as collocations. A collocation occurs when two or more words are habitually juxtaposed. So learning is always student-centred, procedures are always robust, competencies are always core and stakeholders are always key (there being no such thing, in managerial blah, as a minor stakeholder). Adverbs and participles can be collated into equally trite pairings. In managerial blah we are never just engaged but actively engaged, never just positioned but proactively positioned, never just committed but strongly committed.

OK, now you have to join up these stock-brick words and phrases into a clause. You will need at least one verb, but make sure it is a weak, connective one, such as facilitate, embed, enhance, refocus, reprioritise or rebalance. Try not to use more energetic verbs which would force you to attach agency to the subject of your sentence. That might involve you inadvertently constructing an argument that could be challenged, and you don’t want that.

You will find that you can cut down on verbs, anyway, by using lots of prepositions. Prepositions are small, harmless-looking words with many different meanings and functions. For the aspiring author of managerial blah they are helpfully ambiguous, allowing you to hint at connections between things without having to argue them through. You can use prepositions to staple-gun nouns together without worrying too much about verbal action. Managerial blah uses prepositions in weird, overemphasised ways, as if these little words are carrying more weight than they should. We will look at providing … This will help around our impact agenda … The Executive Deans will be leading on this.

If you’ve followed my instructions so far, you will have something resembling a complete clause in managerial blah. Now all you need to do is link it up with other clauses into sentences and paragraphs in a way that has no forward momentum at all. For instance, you can yoke interchangeable clauses together into one long sentence using just colons and semicolons. These form a sort of punctuational sticking plaster when the verbs are not strong enough to carry the reader through the sentence and into the next one. You can also group your sentences into lots of numbered sections and sub-sections. Numbering creates the illusion of structure and relieves you of the task of connecting one paragraph to the next with a developing argument. This list-like writing is lifeless – listless, in fact – because, unlike life, it has no propulsive energy. It arranges reality consecutively but randomly.

The torpidity of managerial blah sits awkwardly with its rhetoric, which sets so much store by perpetual movement. The prevailing tone is one of monodirectional certainty, with constant reference to continuous enhancement, direction of travel and, of course, going forward (or its now more common and vigorous variants, moving forward and driving forward). This all brings to mind the French theorist Henri Lefebvre’s definition of modernisation as “the movement which justifies its own existence merely by moving”.

In managerial blah, the phrase going forward is more than just a verbal tic. It encapsulates a particular way of looking at the world. Like the free market, the managerialist university believes in permanent revolution and endless growth. So it produces an infinite series of self-replenishing demands, urging everyone up a mountain whose summit will never be reached. University mission statements beseech us all to keep improving, to ceaselessly pursue quality, value or excellence. And how could the quest for such elusive nouns ever end?

But here’s the odd thing. Even as managerial blah exhorts us to move endlessly onwards, it is taking us on a journey that already knows its own end. In its linguistic universe, nothing truly new or surprising ever occurs. Points must be actioned, milestones must be met, deliverables must be delivered and inputs must have outputs. All eyes are on the satisficing completion of an algorithmic process, in a way that refuses to concede the possibility not only of failure but also of anything unforeseeable, unanticipated or serendipitous.

Managerial blah is an anonymous, depopulated language. It bears no trace of any inconveniently real human beings whose imperfections might mess up the system. It deals in processes and procedures, not people. It conjures up a metric-driven, quasi-rationalistic, artificially sealed-off world in which anything can be claimed and nothing can be seen or felt. No one ever says who did what to whom, or takes ownership or blame. The juggernaut just runs on, inevitably and under its own steam, although there might be issues and challenges along the way (never problems or failures, which might be pinned on people).

Good writing always has some give in it. It is open to dispute by a reality that the writer does not own and the reader might see differently. Managerial blah, by contrast, runs along a closed circuit that permits no response. Without any sense of voice or audience, it feels tone-locked, written by no one in particular to be read by no one in particular. It is anti-language, a weary run-through of the verbal motions.

Why does managerial blah get written? In part it is down to a banal and timeless truth: most people have no great facility with words. Writing with subtlety and precision is hard, so instead we default to off-the-shelf words and boilerplate phrases. Tying nouns together with weak verbal and prepositional knots is the simplest and quickest way to rustle up a sentence and achieve a superficial fluency. If good writing is hard to write and easy to read, then managerial blah is the reverse: a labour to read, a breeze to write.

Perhaps some of those who write managerial blah genuinely believe that, merely by gluing nouns together, they have communicated meaningfully with their readers. But surely, in a university, ignorance is no defence. Managerial blah is a crime against words by intelligent, well-educated and well-remunerated people who should know better. Writing well is hard, but not that hard. If you keep on producing this ugly and alienating language when so many people have told you how ugly and alienating it is, then your intellectual laziness is not an accident.

Writing with proper care and attention offers you no hiding place. The basic somebody-did-something structure of the plain English sentence allows your reader to weigh up how convincing you sound. When you use specific nouns and strong verbs to describe your actions, you have to think through the purposes and consequences of those actions. Managerial blah evades this obligation. It can thus make the cruellest and maddest realities seem sane and incontrovertible.

The sector is currently going through a traumatic cycle of redundancies, in response to the brutally competitive market created by the raising of tuition fees, the removal of student number controls, and government antipathy to “low value” arts and humanities courses. The language used to announce and explain these redundancies has been a masterclass in managerial blah: stale, vapid, self-validating and, of course, chock-full of nouns. Staff who receive letters informing them of their imminent dismissal are lectured about strategic priorities, changes to the staffing establishment, redundancy ordinances and university redeployment registers. And because lip service must now always be paid to mental health, they are then directed to Employee Assistance Programmes and Staff Wellbeing Portals. Students who are worried that their lecturers have all been sacked are assured that a student protection plan is in place. This nouny language is not even trying to illuminate. Its only purpose is to immunise itself against scrutiny and challenge.

Job cuts and other distressing developments are justified with the scariest two nouns in the lexicon of managerial blah: change management. Change is happening anyway, this omnipresent phrase suggests, and in a direction that has already been decided. All the rest of you can do is adapt or get off the bus. Change management is the shibboleth of a financialised capitalism that sees human capital as a faceless element in an inexorable process and a fixed cost to be reduced. The emotional costs of redundancy are immense. People who have given everything to adapt their teaching and support their students in a pandemic have now had their livelihoods taken away and, in many cases, their careers effectively ended. Naturally they feel great anxiety, anger and pain. Their colleagues who remain in post are left with survivor guilt and fear of worse to come. And all this human turmoil is hidden inside that insipid, unassailable phrase, change management.

Those of us in the humanities – the subjects most at threat from redundancies – are at least alert to how language shapes reality as well as reflecting it. Words are never simply a neutral, transparent container for meaning. They can clarify and elucidate or they can muddy and obscure. They can wake the mind up or anaesthetise it. They can polish reality so it gleams or hide it behind a rusty carapace of cliché, cant and sloganeering. Words are not just how we communicate; they are how we think. Managerial blah liberates the writer from the need to do either. It is a kind of literary lockjaw which stops you saying or thinking anything original or edifying.

There is a long tradition of making fun of management speak, but managerial blah is too dull even to poke fun at. It offers no purchase or ingress for the satirist or ironist. It just sleepwalks from one big noun to the next, sucking us into its vortex and boring us into submission. All the imaginative promise of words has been pulped into a lumpy noun gravy, neither liquid enough to flow nor solid enough to be forked. This noun gravy is tasteless but, should you swallow enough of it, noxious.

Words matter. They transform how we think and feel about each other and about our lives. We will never be able to see the world in more creative and fruitful ways if we are trapped inside a univocal vocabulary. Imagine how refreshing it would be to read an official university document that treated its reader like a human being, by trying to persuade them with defensible arguments, fine distinctions and honest doubts. We would live richer, more productive and more authentic working lives in academia if we cared more about words – which is why, now that I have told you how to write managerial blah, I hope you will ignore my advice.

Why I no longer read anonymous comments

This is a slightly longer version of a piece I wrote for the Times Higher last week.

I have stopped reading anonymous comments by students on my module evaluation surveys. Unless I’m forced to, I won’t read them again. I understand the argument for anonymity. Anonymous feedback, delivered without polite hedging or fear of censure, can provide those in privileged roles with salutary information they might not hear face-to-face. But anonymity also has costs, and I no longer believe the benefits outweigh them.

I have never posted anonymous feedback in my life. When I fill in staff surveys, which isn’t often, I put my name at the bottom of any free-text comments I make. Perhaps this is vanity: why waste time on words that don’t have my name on them? But at least it means that I take responsibility for them – the credit and the blame. By affixing my name to my words, I am incentivized to care that they say precisely what I want them to say.

Our online lives have normalized anonymity. In You Are Not a Gadget: A Manifesto (2010), the technology writer Jaron Lanier argues that anonymity is now a congealed philosophy, an “immovable eternal architecture” built into the software. Participants in the early world wide web were extrovert and collegiate in their online identities. Web 2.0, with its shift to user-generated content, encouraged the use of pseudonyms and avatars as part of the crowdsourcing of information. What mattered in this new world were not the individuals who made it up, but the endless, collective generation of data, which could be exploited for advertising, surveillance and other purposes.

We have got used to providing free content online, by posting below-the-line comments, leaving feedback or updating our social media feeds. Even when we do put our names to this writing, our names aren’t that important. The writer is merely a content provider, a tiny part of the vast computational machine and its insatiable appetite for harvestable data.

For Lanier, this new culture has led to a “drive-by anonymity”. It empowers trolls, rewards snark and makes for “a generally unfriendly and unconstructive online world”. Distanced from others by the technology, we are more likely to forget that we are addressing complex, harassed, bruisable humans like ourselves.

Academics are at the luckier end of this problem. In some service industries, anonymous feedback can affect people’s pay and even their employment. If an Uber driver gets too many poor ratings, they are frozen out of the app that brings them new customers. In academia, poor feedback doesn’t usually have these drastic career consequences. We are also lucky that only a tiny number of students set out to be cruel or unkind. Such comments do get posted, though, and there is now a large body of research suggesting that negative feedback is aimed disproportionately at young, female and BAME lecturers.

Anonymous feedback has a more insidious aspect: it skews the whole nature of writing as communication between human beings. It is more likely to be dashed off and dispensed casually, probably in the middle of many other invitations to give feedback. It means far more to the reader than it does to the writer – which is the wrong way round.

One of Lanier’s suggestions for improving online culture is to post something that took you a hundred times longer to create than it will take to read. This usefully shifts a piece of writing’s centre of gravity. The producer of the words has more invested than the consumer. Those words have a better chance of saying something interesting and worthwhile.

Our culture’s appetite for computable information makes nuanced communication more difficult. “Writing has never been capitalism’s thing,” Gilles Deleuze and Félix Guattari argue in Anti-Oedipus: Capitalism and Schizophrenia (1972). Capitalism, they write, prefers “electric language” – words that can be processed, actioned and monetized. But words are not just containers for data. They possess an immense power to move, hurt, deceive, anger, enchant and cajole others.

Most of our students grew up with Web 2.0 and know no other reality. They are at ease with anonymity. But as an English lecturer, I am struck by how much this conflicts with what we try to teach them about good writing. We tell them that putting words into careful, considered order is hard, that they must keep rewriting until they sound like the best and most insightful version of themselves. We teach them that words cut through most deeply when they have a sense of voice and address, of being written by an irreducibly unique person to other irreducibly unique people.

We have learned during the pandemic that teaching does not thrive as a series of faceless interactions. Just as Zoom seminars are easier and more enriching to teach when students have their cameras on, I would much rather receive feedback from specific, identifiable people. I know this kind of feedback would be as flawed as all human communication – prone to misunderstandings, self-censorship and power imbalances. We would need to work hard to create a space in which students felt able to speak freely. And students would also need to spend time framing their comments with the right mix of directness and tact – but wouldn’t that be a good skill for them to learn? For all its difficulties, feedback with someone’s name on it still feels preferable to the asymmetry of anonymity, so subtly alienating for both writer and reader. That is why I no longer read anonymous comments.

The power of touch

This is a slightly longer version of the article I published in last week’s Observer:

When was the last time you touched someone you don’t live with? One day last March, probably; you’re not sure of the date. Did you shake hands with a new colleague at work? Did your coat brush against another commuter’s on the train? Did someone bump your elbow and mutter an apology when rushing past you on an escalator? If you’d known that was the last time you’d make contact with the body of a stranger, you’d have paid more attention.

And what about the 8.2 million British adults who live on their own? Many will have gone nearly a year now without so much as a pat on the arm from another person. Touch is the sense we take most for granted, but we miss it when it’s gone. Psychologists have a term for the feelings of deprivation and abandonment we experience: “skin hunger”.

“Skin hunger” is not a phrase I had come across before last year, nor a problem I ever imagined facing. I am a socially awkward, non-tactile person. I have looked on nervously as, over the last two decades, hugging has moved from being a marginal pursuit to a constant of British social life. A hug feels to me like an odd mix of the natural and the artful. It is natural because bodily contact is the first, endorphin-releasing language we learn as babies and share with other apes. But it is also artful, because it has to be silently synchronised with someone else, unlike a handshake, which can be offered and accepted asynchronously.

For the truly socially inept, even a handshake can be fiddly. I used to botch them all the time, offering the wrong hand (being left-handed didn’t help) or grabbing the other person’s fingers instead of their palm. Then, just as I had completed my long internship in handshaking, it began to lose currency and I had to hastily reskill in hugging. The best I could manage at first was a sort of bear-claw hold, with my arms hanging limply down my huggee’s back. It must have been like trying to cuddle a scarecrow. I got better at it; I had to. Now I find that I really miss hugging people. I even miss those clumsy, mistimed hugs where you bang bones together and it goes on just slightly too long or not long enough. And “hunger” feels the right word for it, in the sense that your body lets your mind know that something is up, and fills it with a gnawing sense of absence.

Aristotle considered touch the lowliest sense. He looked down on it because it was found in all animals and it relied on mere proximity, not the higher human faculties of thought, memory and imagination. But one could just as easily say that touch is the highest sense, and for the same reasons. It is the basic animal instinct that lets us know we are alive in the world. It offers proof of the solidity of things other than ourselves.

Touch is our first sensation. The hand of a two-month-old human foetus will grasp when it feels something in its palm. A new-born baby will instinctively turn its head towards a touch on the cheek. All over the world, children play tag without having to learn how. The earliest forms of medicine drew on this human need to touch and be touched. The practice of healing massage emerged in India, China and Southeast Asia by the third millennium BCE, before spreading west. Asclepius, the Greek god of healing, cured people by touching them. The word surgeon originally meant hand healer, from the Greek for hand (kheir) and work (ergon). In the gospels, Jesus cures the sick with the laying on of hands.

In recent years the caring professions have revived this practice of healing through touch. The tender touch of others is now known to boost the immune system, lower blood pressure, decrease the level of stress hormones such as cortisol, and trigger the release of the same kind of opiates as painkilling drugs. Premature babies gain weight when rubbed lightly from head to foot. Massages reduce pain in pregnant women. People with dementia who are hugged and stroked are less prone to irritability and depression.

Our oldest myths speak of the lifegiving power of touch. In Homer’s Odyssey, Odysseus, visiting Hades, tries to hug his dead mother, Anticleia, so that they might “find a frigid comfort in shared tears”. But Anticleia is now a lifeless husk; she just slips through his arms like a hologram. Homer’s metaphor for the unbridgeable chasm between the living and the dead – a failed hug – feels newly resonant in the time of Covid. The Homeric underworld is a place of permanent lockdown, where the dead live on as unreachable, self-isolating ghosts.

Philip Pullman’s His Dark Materials trilogy echoes this scene in its last book, The Amber Spyglass. Lyra tries to hug her friend Roger in the world of the dead, but he passes “like cold smoke through her arms”. Pullman’s trilogy is a hymn to the materiality of the human body. It deliberately inverts the traditional Christian story, in which our eternal souls triumph over our flawed, sinful flesh. Pullman’s angels long to have bodies like humans, to feel the world through the senses. His human characters have daemons, physical manifestations of their souls, which means that they can hold themselves in their arms the way Lyra hugs her daemon Pan.

It is hard to read His Dark Materials now without thinking about how the pandemic has separated us from each other. The trilogy’s climax comes when Lyra and Will kiss and know each other with their bodies. But then they must part and return to their own worlds. They agree that at noon on every midsummer’s day they will both sit on a bench in Oxford’s Botanic Garden that exists in both their worlds. Lyra tells Will that if they ever meet again they’ll “cling together so tight that nothing and no one’ll ever tear us apart”.

The different worlds in Pullman’s work are divided by the thinnest of membranes. The strange new rituals of the past year have all been about trying to reach across such thin but absolute divides. Older couples stand in front gardens, waving at their grandchildren through windows and miming hugs. People embrace their relatives in care homes through “cuddle curtains”: plastic sheets with two pairs of sleeves, allowing them to hug without touching. In Zoom meetings, we smile and wave at the shapeshifting pixels on our screens because they resemble people we used to know and perhaps once touched.

The virus, by forcing us apart, reminds us of this inescapable fact: we live in our bodies. Maybe we had begun to forget this in a world that links us up in so many virtual, intangible ways. That miraculous piece of technology, the touchscreen, works through a desensitised, near-touchless touch. It smoothly responds to our prodding, pinching and swiping so that we may do our duty as good little online citizens, working, shopping and distracting ourselves endlessly. But as our fingers and thumbs glide across the uniform surface, there is no sensuality or responsiveness in the touch. For the skin hungry, this is thin gruel.

Touch is a universal language, but every culture has its own way of speaking it. In north Africa and the Middle East, men join their hands together in greeting, then kiss their own hands or hold them to the heart. The Congolese touch each other on the temples and kiss foreheads. In Tuvalu they sniff each other’s cheeks. Andaman islanders in the Bay of Bengal sit in each other’s laps and then, in farewell, lift the other person’s hand to their mouth and blow.

Britain, by contrast, has historically been a low-contact culture. One explanation for the rise of ballroom dancing in this country is that it gave shy strangers formal permission to hold each other. Studying the etiquette in a Bolton dance hall in 1938, the anthropologist Tom Harrisson noted that a man would ask a woman for a dance simply by touching her elbow and waiting for her to fall into his arms. This couple might dance the whole night without speaking, then go their separate ways.

In touch-deprived cultures, touching is no less important than in tactile ones. As we have learned over the past year, when people are starved of touch the slightest forms of contact become filled with meaning. The most charged moment in the film Brief Encounter (1945) comes when Laura (Celia Johnson) and Alec (Trevor Howard) can’t say goodbye properly, because an annoying acquaintance of Laura’s has gatecrashed their final farewell. So Alec softly squeezes her shoulder, a small gesture filled with doomed longing. A hesitant embrace can speak as potently as an ardent one. On 30 May 1953 Edmund Hillary and Tenzing Norgay arrived back at advance base camp after climbing Everest. According to the expedition leader, John Hunt, they were welcomed with “handshakes – even, I blush to say, hugs”.

In 1966 the psychologist Sidney Jourard conducted a field study of couples sitting in coffee shops around the world. He found that in the Puerto Rican capital, San Juan, couples touched each other – by hand-holding, back stroking, hair caressing or knee-patting – an average of 180 times per hour. In Paris, it was 110 times; in Gainesville, Florida, it was twice; in London, never.

Jourard concluded that Americans and Britons lived under a “touch taboo”. In the US this even extended to barbers using electric scalp massagers strapped to their hands so they did not touch their customers’ heads. Jourard wondered if the large number of massage parlours in British and American cities betrayed a need not being met in normal relationships. Many American motel rooms were equipped with “Magic Fingers”, a device which, when fed a quarter, would slowly vibrate the bed. The machine, Jourard wrote, “has taken over another function of man – the loving and soothing caress”.

The new therapies that came out of California in the late 1960s sought to cure the English-speaking countries of their touchlessness. They prescribed generous doses of hugging. Bernard Gunther, of the Esalen Institute in Big Sur Hot Springs, taught full-body massage techniques as a path to sensory awakening. Some of Gunther’s more outré methods – mutual hair shampooing and the “Gunther hero sandwich” (a group of people spooning one another) – failed to catch on. But the massage therapists probably did help Britain and America become more tactile societies. By the 1980s, “Magic Fingers” machines had largely vanished from motel rooms.

In lockdown, the skin hungry have once again been forced to improvise inadequate technical fixes. They hug themselves, or hug pillows and duvets, or tuck in their bed blankets tightly at night. The robotics industry has tried to replicate the feel of human touch with Bluetooth-enabled “hug shirts” and silicone lips that allow you to hold and kiss someone remotely. But it’s not the same and never will be, however good the technology gets. Nothing substitutes for human touch.

As a teenager, the autistic writer and activist Temple Grandin longed to feel the pressure stimulation of a hug. Like many autistic people, though, she found being touched difficult. One day, visiting her aunt’s Arizona ranch, she saw cattle being put in a squeeze chute: a pen with compressing metal sides, which kept them calm while they were branded or castrated. Thus inspired, she made her own human “squeeze machine”. It had two wooden boards, upholstered with thick padding and joined by hinges. When she kneeled inside it and turned on an air compressor, it felt like being hugged. For Grandin, this was a useful staging post on the way to touching people. In her mid-twenties she learnt to shake hands. When she was sixty, her squeeze machine finally broke, and she didn’t bother to fix it. “I’m into hugging people now,” she said.

Real human touch is infinitely subtle and intricate – less a sense than a sensorium. Skin, which makes up nearly twenty per cent of our bodies, is our largest and most sensitive organ. An area of skin the size of a pound coin contains fifty nerve endings and three feet of blood vessels. The work of touch is done by sensory receptors, buried in the skin at different depths according to what kind of stimulus they detect, such as warmth, cold or pain. One of these receptors, the Pacinian corpuscle, responds to pressure and vibration. It can detect movements smaller than a millionth of a metre.

Everything we touch has its own specific shape, texture and firmness, its own special resistance to the pressure we place on it. Every hug feels different because everyone you hug takes up space in the world in a different way. No one else has quite the same contours, the same pleats and ripples in their clothes, the same warmth and weight, the same precise arrangement of flesh and bones. Your own body is a one-off, too. It folds into and nests with someone else’s in a way that no other body can.

“Sending hugs,” people say online – but you can’t send a hug. A virtual hug only whets the appetite for what you’re missing, just as looking at food when you’re hungry makes you hungrier. The feeling you’re trying to share in a hug is all wrapped up in its embodiment in space and time. A hug joins the physical and emotional so tightly together that you can’t tell them apart. The writer Pádraig Ó Tuama points out that an Irish way of saying hug is duine a theannadh le do chroí: to squeeze someone with your heart.

I wonder how it will feel when we can hug people again. Will we have to relearn the protocol, or will muscle memory kick in? Will our nerve endings have been deadened or hyper-sensitised by abstinence? Will we hug everyone too much and too hard, because our feeding habits have switched to feast-or-famine mode, like wolves who kill more than they can eat? One thing we do know now is that we are hardwired for touch. We were not meant to swerve away from each other in the street, or mime hugs through windows, or cuddle through walls of plastic. We were meant to hold people close, and feel the bones in their back and the rise and fall of their chests, and remind each other that we are warm bodies, still breathing, still alive.

On libraries

This article was published on Christmas Eve in the Times Higher (paywalled). I’m posting it here with some hesitation. When I uploaded the link on Twitter I received a number of angry tweets over Christmas from people who work in libraries. I think this may be because the headline suggested that I couldn’t wait for libraries to fully open again, when the piece is actually about looking forward to using libraries after the pandemic and has nothing to do with putting library workers at risk. Others felt that I had invisibilised the work of librarians in the article itself. That wasn’t my intention, but I don’t get involved in arguments on social media for all kinds of reasons. I’m just putting it up here for the record, and because I put a lot of effort into even short pieces like this. But please, as they say, don’t @ me – that’s happened enough already …

I have not been inside a library since March. University libraries are rationing their footfall with booking systems, shelves cordoned off with tape, and books available via click and collect. I have left our library to the students who need it more than me. We “library-cormorants”, as Coleridge called us, feel expelled from our nests.

I have long thought that the beating heart of a university is its library – the most welcoming and egalitarian space on campus. Here no locked rooms or timetables confine you. Go there to finish your essay, daydream, keep warm, watch a football game live on some bootleg online channel – no one minds. The library offers you free wifi, a workspace, reading matter, warmth, light and peace, not all of which everyone finds at home. I miss that mood of industrious quiet, captured in the sound of hundreds of students two-finger tapping at keyboards, like soft summer rain pattering on a tent.

I miss other libraries too. One day I will again sit in the reading room of the British Library, finding solace in the benign indifference of the readers around me, all of them politely pretending that I am invisible. A library offers, in Zadie Smith’s words, “an indoor public space in which you do not have to buy anything in order to stay”. The argument that we have less need of a library in a digital age errs, she writes, in seeing it “as a function rather than a plurality of individual spaces”. A library is a beautiful paradox: a public building that enshrines the private acts of reading, writing and thought.

I suppose the pandemic has taught us that, if needs must, we can survive without libraries. The history of scholarship is full of people who, when banished from them, make a virtue of the loss. “Lock up your libraries if you like,” wrote Virginia Woolf after being refused entry to a Cambridge college library, “but there is no gate, no lock, no bolt that you can set upon the freedom of my mind.” Woolf’s ambition was to write a history of English literature entirely from memory. Erich Auerbach, exiled in Istanbul during the second world war, was nearly reduced to this. Deprived of the European sources he needed, he focused instead, in his classic book Mimesis, on brilliant close readings of Western classics from Homer to Woolf. He ascribed its existence to “just this lack of a rich and specialised library”. Had he been forced to wade through the vast secondary literature, he might never have got around to writing it.

The pull of libraries is as much emotional as practical. They offer the comforting illusion that in their unearthly calm and Dewey-Decimal order we will at last be able to quiet our neuroses, remake ourselves and return to the world revived and repaired. The scholarly life can be solitary and dispiriting. Doing research often feels like treading water. Libraries instil in us an emboldening sense of collective enterprise. They are, in David Attenborough’s words, “immense communal brains … extra-corporeal DNA, adjuncts to our genetical inheritance”. They make knowledge, in its wonderful but dizzying abstraction, concrete and tactile. I used to remind students that the books in the library are still known as “holdings”, and that sometimes it is nice to hold them. I am saving this advice for when it is once again Covid-compliant.

How cruel, then, that this quintessence of our collegiality, the shared space of a library – with its books thumbed by countless unnamed others, its hotdesking computer workstations, its large atria where hundreds breathe the same air – is now such a contagious and perilous place. As well as attacking our respiratory systems, the virus has attacked this idea of ourselves as social animals, who want to be near each other even if we might work more quickly and cost-effectively alone.

My nostalgia for libraries may be sentimental, but so what? Human beings like to anchor themselves in familiar places. In a famous essay, George Orwell rhapsodises about his ideal pub, “The Moon Under Water”. At the end Orwell admits what perceptive readers have already guessed: his perfect pub does not exist, being only an amalgamation of the best aspects of all the imperfect pubs he has known. On the same principle, in lockdown I have been imagining my perfect library. In my mind’s eye I see a single, large room panelled in dark wood, with soft pools of light emanating from green table lamps, and high-backed, padded chairs with sturdy armrests that miraculously fit under the desks. The bookshelves are as tall as buses, their upper limits accessed by ladders. On the mezzanine floor, reached by iron spiral steps, there are nooks and recesses where you may retreat further from the world. Plane trees rustle gently outside; the sunlight filters through them, and the high windows, to leave a lovely dappling effect on the parquet floor. The room has only a few other occupants, none of whom have annoying coughs or sniffles. This perfect library is small, but by some magically self-replenishing process has every single book I need.

John Betjeman reputedly said that in the event of Armageddon, he would head to the haberdashery department of Peter Jones department store in Sloane Square, “because nothing unpleasant could ever happen there”. I feel the same way about my perfect library. One day I hope to resume my search for it.  

On academic failure

I wrote this for the Times Higher about academic failure, awards blight and why I’m not a fan of the phrase ‘celebrating success’.

A recent phenomenon of academic life has been the proliferation of prizes and awards for staff. These often take the form of “vice-chancellor’s medals” for excellence in research, teaching, leadership or public impact. They are almost identical in style across institutions, as if a memo went round all the vice-chancellors to this effect. Any anthropologist of the modern workplace would find them an intriguing ritual. Yet their sudden ubiquity has passed without comment or critique. University managers have an all-purpose phrase to explain them: celebrating success. And who but a killjoy would be against celebrating success?

Well then, I am a killjoy. Not that I blame people for accepting prizes. Academia is short on affirmation, and we all like to feel appreciated. But there is something faintly infantilizing about these awards. The phrase celebrating success originated in prize-giving assemblies at primary schools. Many academics (including me) were the kind of children who had gold stars regularly stuck to their work by their teachers. Prizes appeal to this eager-to-please aspect of our characters, while gently badgering us into higher levels of performance. They see our jobs not as a contract with our employers but as a life-ruling passion in which the best of us go “above and beyond”.

The phrase celebrating success obscures the competitive nature of all prizes. It implies that the winners of these awards have not really competed for them. Success has just been benignly acknowledged from on high. The Dodo in Alice’s Adventures in Wonderland, when asked who has won the Caucus race, declares: “Everybody has won, and all must have prizes.” This being Wonderland, the Dodo is of course talking nonsense. Everyone can’t win a race, and if everyone gets a prize then the prize is meaningless. A prize is what economists call a positional good. Its value derives from its scarcity, the extent to which other people can’t have it. Every time you give someone a prize, you’re not giving it to everyone else. Every time you celebrate success you define what success is. Celebrating success turns the one aspect of collegiate life not invaded by market values – our relationship with our colleagues – into a competition.

Alongside this plethora of prizes has come a related development: a redemptive way of thinking about failure. Universities have bought heavily into the “failing well” movement. This movement emerged among American start-up entrepreneurs in the mid-2000s before being taken up by the self-help and personal growth industry. Its message is that we should own up to our failures and use them to learn and grow. Failure is simply a hurdle to overcome on our way to success. A new feature of graduation ceremonies is the honorary fellow appearing as an expert witness on the subject of failure. “Don’t be afraid to fail,” they say to the new graduates. “I’ve spent a lifetime failing, but it’s all been part of my journey to get here today.”

British universities have also followed the American example of running courses, aimed at staff and students, on coping with failure, beating impostor syndrome and developing resilience. I recently had to complete an online training module on “bouncebackability”. Alongside mildly sensible advice about conscious breathing, healthy eating and sleep hygiene, it urged you to “reboot your level of resilience” by adopting a “High Power posture” for “instant access to your feel good factors”.

The failing well movement too often succumbs to platitudinous positivity – the belief that every negative can be turned on its head. The day after Sir Peter Ratcliffe won the Nobel prize for medicine, one of his old rejection letters from the journal Nature began circulating on social media, posted with suitably boosterish comments. Believe in yourself, they said. Everyone else will catch up eventually. Success will find you in the end. Failure is the stepping stone to success.

Except that none of this is true. Believing in yourself will not always make people believe in you. Success will not always find you in the end. Failure is not always the stepping stone to success. The most concerted efforts misfire. In any race, most of us will be also-rans. Failure is always odds-on. It is statistical probability, basic maths, a numbers game – reversion to the norm.

This is especially true of an academic career, which is a long falling into failure. Most new PhDs don’t get shortlisted for jobs. Most grant applications fail. Most papers are rejected or abandoned before they are submitted. Most published work is ignored. The way to deal with this is to think about failure and success differently, not to assume that failure can be eradicated with the shallow certainties of positive thinking. Younger scholars who have failed in a flawed and iniquitous system are hardly helped by having to view their successful elders as shining beacons who persevered and got there in the end.

The failing well movement is a symptom of how the language and logic of the market have become the pervading aroma and undertaste of our lives. The market wants us to believe that everyone who works hard will be rewarded in the end – in which case the only cure for failure is to dust yourself down and start over again.

What can never be seen to have failed is the market itself. For the marketization of universities is a utopian project. It holds that anything can be solved with incentivizing competition and better performance measurement. If this utopian project falters, the blame must never be seen to lie with the project, only with the failure to realise it to the full. Marketization has failed, the utopians believe, because its values have not been spread with sufficient fervour and alacrity. Failed efforts must be redoubled. And so we continue our journey to the market-led utopia beyond the ever-receding horizon.

If marketization is the unchangeable given, then the only thing that can change is you. Your failures are not the fault of a failed system in which failure is distributed unequally. No, they are yours alone to solve, by acquiring that admirable quality, “bouncebackability”.

In 2010 Melanie Stefan, a neuroscientist at Caltech, proposed an idea in the journal Nature. Most of the research fellowships Stefan applied for she did not get – predictably enough, since they were much sought-after and so had low success rates. When she learnt that the Brazilian footballer Ronaldinho had been left out of the 2010 World Cup squad, she felt better about these failures. Ronaldinho had been one of Brazil’s stars in the previous two World Cups. Brazil’s World Cup squads are announced with some fanfare, so Ronaldinho’s failure to make the cut was very public. It made Stefan wonder why failure in sport is so conspicuous and academic failure so hidden. “As scientists, we construct a narrative of success that renders our setbacks invisible both to ourselves and to others,” she wrote. A scholarly career came to seem like the simple accretion of esteem indicators, with no sign of our statistically inevitable friend, failure.

Stefan suggested that scholars compile an alternative CV listing all the things they had failed at. It would be much longer than their normal CV, she warned, but it would give a truer picture of a scholar’s life. Thus inspired, Johannes Haushofer, a Princeton psychology professor, released his “CV of failures”. He arranged it under CV-like subheadings, such as “degree programs I did not get into”, “paper rejections from academic journals” and “awards and scholarships I did not get”. Haushofer’s CV of failures became a viral hit – garnering more attention, he said ruefully, than anything on his standard CV. 

A CV of failures is a sweet and generous idea. But still it relies on the redemptive arc that treats failure as something that can always be spun into success. CVs of failure tend to be produced by tenured scholars. They make their failures public to inspire their more precarious junior colleagues to shrug off disappointment and continue their ascent to the professional heights.

A true curriculum vitae would include not just the failures but the shards of uncompleted work that never got to the stage where they could fail. It would record that large part of our lives made up of false starts, wasted time, aimless worrying and fruitless moaning. And it would acknowledge that most of the useful things we do as academics are unrecordable, done when no one is looking, just to keep the collective enterprise ticking along. Academic life is a delicate ecosystem in which every part affects every other part. In a healthy ecosystem there is no such thing as individual failure or success. Earthworms are as indispensable as charismatic megafauna. Every living thing contributes to the general health of the habitat.

In a market, failure and success are easily measured. The main criterion is productivity: the rate of output per unit of input. But in the academic ecosystem, we can’t always identify what an input and an output are. Much of our work is stochastic: a randomly determined process with asynchronous and asymmetrical results. Teaching doesn’t easily slot into the market language of “delivery”, being a multi-stranded pursuit covering everything from scholarly expertise to social work. “A teacher affects eternity; he can never tell where his influence stops,” wrote Henry Adams. Often he can’t tell where it starts, either. Every lecturer has taught classes that fell flat for murky reasons that no amount of student feedback will disclose. Failure is inevitable in any activity where we interact with other human beings, who are as mysterious, complicated and unique as we are.

Academic research, meanwhile, can go on for months with little clear sign of progress. A scientific experiment may yield only null results, its elegant theory ruined by empirical reality, the evidence buried in lab books. An archaeological dig may unearth nothing for weeks but plastic wrappers and ring pulls. A day’s harvest of writing may yield a few salvageable sentences, if that. Even when finished, research is inherently incomplete. There is always another reference to check, another source to chase up, another theory to take on board. All scholarship is provisional and falsifiable, so someone can come along to point out the gaps in your reading or the holes in your argument.

None of my own academic failures has been any kind of spur to self-improvement. The only good they did was to throw me off the hamster wheel of institutional expectations. They forced me to face the blank days and dry seasons that, by the short-term and satisficing standards of the market, look like failure. They taught me that every academic career is incommensurable with any other and runs on its own tracks to its own destination.

Success divides us; failure unites us. “All losers are the heirs of those who lost before them,” writes Jack Halberstam in The Queer Art of Failure (2011). “Failure loves company.” The culture of celebrating success claims to be fostering collegiality – let’s celebrate success, everyone! – but actually undermines it. Handing out awards is no substitute for the knottier and more time-consuming task of talking and listening to colleagues and making them feel valued. Awards blight is the friendlier face of all the other inequalities created by marketization: insecure contracts, huge pay disparities, cuts in “uneconomic” areas, and a general fetishizing of overwork and competitive busyness.

Most academics do not thrive in such a competitive system. Academia is a gift economy, as defined by Lewis Hyde in his book The Gift (1983). It trades not in commodities, which lose value when they become second-hand, but in gifts that gain value as more people are allowed to hold them. This kind of gift can never be sold or stockpiled, but must be constantly given away. We refer to a work of scholarship as a “contribution” because it has to offer something to the group, not just accrue kudos for its author.

This is what makes the thing we most crave, the approval of our peers, so elusive. Scholarly prestige has to keep circulating; it can’t be hoarded, still less solidified into a medal or certificate. Nor can we ever predetermine the impact our work has on others. “All work is as seed sown,” wrote Thomas Carlyle. “Who shall compute what effects have been produced, and are still, and into deep Time, producing?” Many seeds are scattered; most fall on stones and never break bud. All we can do is keep the faith that our efforts will one day feed into the accumulated knowledge and wisdom of the world. This is the only success that lasts.

All cities are the same at dawn

(I wrote this for the TLS during the first lockdown.)

At around 10pm on April 3, the BBC journalist Dan Johnson filmed his drive through Central London to begin the night shift at Broadcasting House. The next day he posted the film on Twitter, speeded up, with Massive Attack’s “Unfinished Sympathy” as the backing track. On what should have been a busy Friday night, the car weaves unimpeded through an abandoned Whitehall, the vast emptiness of Trafalgar Square, a tourist-free Piccadilly Circus with the billboards lit up pointlessly, and the long sweep of Regent Street, its clean lines spoilt only by a few red buses and stray pedestrians. Belisha beacons blink, and traffic lights turn red, for almost no one. London is like a gigantic, unpeopled film set. In The Road to Oxiana Robert Byron writes that all cities are the same at dawn – that even Oxford Street looks as beautiful in its desolation as Venice. Empty cities are compelling because they bear the traces of our lives as inescapably social beings. Their silence speaks of the need for human connection. Absence makes visible what we usually fail to notice: our everyday life going on, like an orchestra without a conductor, while our minds are elsewhere. Johnson’s film received more than 10,000 likes, many people apparently finding it as mesmerizing, and unexpectedly moving, as I did.

Lockdown has laid bare the strangeness of the everyday. It has severed us from many of our routines, and coated those that survive with a deep glaze of oddness. A permanent message in the corner of the television screen orders us to “stay at home”. Every journey beyond the front door must be justified. A queue for the supermarket is elongated by two-metre gaps policed by upturned baskets or stripy tape. Once inside, we find that the shelves have been denuded of once banal and now treasured items such as dried pasta and toilet roll, and cashiers are shielded from us by Perspex screens. Freud would have called this the unheimlich: the troubling intrusion of the unfamiliar into the familiar.

In an English thesaurus, the word everyday is found alongside other words – dull, humdrum, workaday – which seem to dismiss it as unworthy of interest. The British prefer to look at their daily lives through the distorting lenses of irony and bathos, perhaps. In Germany, however, in the early 1920s, the critic Siegfried Kracauer began writing short essays for the Frankfurter Zeitung that gave the everyday the serious attention it deserves. Kracauer dwelt on the dead moments of city life. He saw how much of people’s time was consumed by queuing, commuting, waiting around in lobbies and labour exchanges, or the dull office work of switchboards and typing pools – mundane activities that, like Edgar Allan Poe’s purloined letter, hid themselves in plain sight. In The Salaried Masses (1930; translated, 1998), Kracauer argues that the lives of office workers of Berlin and Frankfurt are “more unknown than [those] of the primitive tribes at whose habits those same employees marvel in films”. We should rid ourselves of the notion that our lives are defined by major events, he says. We are “more deeply and lastingly influenced by the tiny catastrophes of which everyday existence is made up”. This existence escapes our attention because it feels anonymous and unowned – like a story with no narrator, plot or protagonist. In an essay on boredom, Kracauer calls the everyday “a life that belongs to no one and exhausts everyone”.

When I watched Johnson’s short film of London in lockdown, it felt like something, as Philip Larkin wrote of trees coming into leaf, almost being said. Everyday life had become briefly visible through the thick fog of habit. In Kracauer’s Theory of Film: The redemption of physical reality (1960), he writes that a film leaves its raw material intact, so its images look like “casual self-revelations”. This unstaged, authorless quality – amplified in Johnson’s footage by the fact that his dashboard camera had simply recorded the journey while he drove – leads us to re-notice everyday phenomena that “are part of us like our skin, and because we know them by heart we do not know them with the eye”. Film looks at the everyday afresh, with the unfiltered gaze of a child.

France has produced an especially rich body of writing on the invisible vie quotidienne – a phrase both more precise and more evocative than the English everyday life. (A character in Don DeLillo’s novel Underworld calls quotidian “a gorgeous Latinate word … that suggests the depth and reach of the commonplace”.) At the heart of this tradition lies Henri Lefebvre’s three-volume Critique of Everyday Life. Published between 1947 and 1981, Critique covers the Trente Glorieuses, the thirty-year postwar boom that transformed France from an agrarian into a modern consumer society. Lefebvre explores how a new culture of consumption promised to relieve the drudgery of daily life, tapping into the desires – for style, glamour, energy, abundance – that this drudgery failed to fulfil. (Jacques Tati’s film Mon Oncle, 1958, makes delicious visual comedy out of this promise, as the shambolic Monsieur Hulot wreaks gentle havoc in his sister’s ultra-modern, wipe-down, push-button home.) Consumer culture pledges to replace everyday life with lifestyle and public tedium with private pleasures. In the final volume of Critique, Lefebvre foresaw that we would one day be able to shop without ever leaving our homes.

And yet, he argues, the everyday remains. It is a “residual deposit”, a “great, disparate patchwork” that modernity “drags in its wake”. It is the awkward underside of the modern obsession with productivity and growth. Its tedium is unevenly dumped on the poor, but no one can wholly escape it. We will never break through the everyday to reach some more exalted plane of existence, for “man must be everyday, or he will not be at all”. We dismiss the everyday as marginal and boring when in truth it is unavoidable and freighted with meaning. It recedes from view even as it fills up our lives.

Lefebvre’s friend, the French novelist and essayist Georges Perec, coined the term infra-ordinary to describe the huge terrain of our lives that had become unseeable, like infrared light. The daily papers, Perec writes, “talk of everything except the daily”. Trains and buses only seem to exist when we are cursing them for not turning up; their absence has forced us to acknowledge them. As soon as we hang a picture in our house, we stop seeing it. We fixate on the exotic and ignore its opposite, which Perec calls the endotic. We sleepwalk through our lives and they unfold with the relentless logic of a dream.

In a series of essays, many of them collected in Species of Spaces (1974; translated, 1997), Perec makes inventories of his desk, apartment and neighbourhood. One piece records all the solids and liquids he ingests in a year. In An Attempt at Exhausting a Place in Paris (1975; translated, 2010) he sets out to record “what happens when nothing happens”. Seated by the windows of cafés in the Place Saint-Sulpice for a whole weekend, Perec notes down everything he sees: pigeons grooming themselves in the fountain; buses running eternally through their routes; a man stopping to greet the café’s dog. Perec instructs his readers to do likewise, by taking field notes on the contents of their cutlery drawers and the way cars are parked on their street. He tells them to “set about it more slowly, almost stupidly”, to write down “what is most obvious, most common, most colourless”. By observing the world flatly, like “court stenographers of reality”, they will unearth the infra-ordinary.

It occurs to me that my own lockdown is a long, unwanted Perecian experiment. The pandemic’s distancing and constraining effects have made me look at daily life anew. I observe the exact times that neighbours take their dogs for a walk, the slightly over-elaborate way that delivery drivers step back from doorsteps, the apologetic nod that passers-by offer as they swerve away from each other. I have become an anthropologist of the infra-ordinary, watching the world around me with the pained, excessive attention that Perec prescribed.

Behind Perec’s seemingly playful methods lay serious intent. He wanted to show us that daily life didn’t happen inevitably, like the earth turning on its axis and day following night. It was a spell that could be broken, a collective dream of ordinariness from which we were free to awake. In the everyday, the French theorist Maurice Blanchot wrote, “we are neither born nor do we die: hence the weight and the enigmatic force of everyday truth”. Daily life feels interminable, uninterruptible: a present with no past or future, as inevitable as rain. With our eyes on the news headlines, we tend to forget that the most significant changes are slow, incremental and unseen. They happen in our daily lives while we are looking the other way. In The Practice of Everyday Life (1980; translated, 1984), the historian and cultural theorist Michel de Certeau writes that “objects and words also have hollow places in which a past sleeps, as in the everyday acts of walking, eating, going to bed, in which ancient revolutions slumber”. Our daily routines feel eternal and without origin, but a social and cultural history lies buried inside them.

Many of these theorists were writing in moments of crisis, when the everyday’s semblance of normality was lifted like a veil. Kracauer wrote in Weimar Germany, as the country lurched from one political or economic calamity to another. Lefebvre began his work just after the Second World War, when the simplest matters of French daily life, such as the search for bread and fuel, were fraught and all-consuming. Certeau’s interest in the everyday arose out of the évènements of 1968, when, he wrote, “from everywhere emerged the treasures, either aslumber or tacit, of forever unspoken experiences”. The slogans of the May revolutionaries – Never work; Beauty is in the street; Beneath the paving stones, the beach! – took aim at a life wasted on strap-hanging and clockwatching. They urged people to wake up and realize that their boredom was not obligatory.

Our own crisis has torn a similar hole in the everyday. We have learnt that it is not some tedious distraction from the things that really matter, but the real substance of our lives. We have also learnt that its permanence is an illusion; it is more precarious than we thought. While we are in the middle of it, everything that happens in daily life feels both natural and necessary. A crisis allows us to view things from afar and see that this is not so. Some of us have begun to wonder. Was all that prodding insistently at laptops and mobile phones, all that rushing through station concourses gazing up anxiously at annunciator boards, strictly necessary? Or was it part of a cult of busyness and presenteeism, a fetishizing of activity for its own sake, a life propelled forwards by the fake urgency of email meeting reminders?

Our daily lives are a mixture of the habitual disguised as the essential and the essential disguised as the habitual. A “key worker” turns out to be someone whose job involves the vital maintenance and repair of the life that we barely acknowledge. These people care for us and keep us alive, drive lorries and stack shelves in the dead of night so that we may be fed and watered, and dispose of our detritus. We have been delivered a harsh tutorial in how much we depend on strangers doing unglamorous, low-paid work, on systems and infrastructures whose workings we don’t understand, and on the minutely synchronized routines and fragile supply lines of a just-in-time economy that is always one bottleneck away from anarchy. We can’t just opt out of our dependence on others; we make everyday life together. When the world returns to something like business as usual, will we use this new knowledge to reshape our lives and value more those who make them possible?

It would be nice to think so. But crises also make us long for a return to normality, where everyday life is mere background noise, a respite from self-analysis and existential doubt. We start to miss that cheering open-sesame buzz as we lay our swipe card on the entrance scanner at work, the gossipy huddle at the photocopier, even attending a proper meeting instead of those lonely online affairs full of oblong glimpses into the domestic lives of colleagues. The everyday only feels enslaving when you are stuck inside it. Lefebvre believed that the évènements of 1968 proved unsustainable because people got sick of the disruptions and privations of a country effectively shut down. They hankered after their unexciting but livable lives, preferring “boredom at zero point” to “the hazards of desire”. The daily grind that Parisians call métro-boulot-dodo (commute-work-sleep) had its compensations after all. In the end, perhaps we all want – to invert the famous curse – to live in uninteresting times.

What is a university now?

I wrote this for the TLS last week:

In normal times, autumn for me means new beginnings. With the first damp chill in the air and the leaves fading, and just as our avian summer guests are heading south, the students arrive in a wave, hugging each other and screeching like swifts returning from Africa. This self-replenishing tribe of mostly young, loose-limbed, voluble people makes me wonder where the years have gone. I have files on my computer older than most of them. Still, I find their eagerness catching. That first day of a university term feels like a fresh start: the blackboard wiped clean.

This seasonal migration is also potentially lethal. Hundreds of thousands of new students travel up and down motorways in parents’ cars and then flock together, exchanging germs. Timetabling software moves them around buildings in minutely synchronized mini-migrations, forming corridor bottlenecks on the hour. Just walk into a recently emptied classroom and smell the stale sweat and perfume in the humid air. Here is a convivial habitat for those vampiric beings, viruses, that thrive by leaping on and off other living bodies. Most university lecturers have had several iterations of freshers’ flu.

Gradually over the summer it dawned on me: come September, everything would have to be different. Universities would offer “blended learning” – more online teaching, fewer contact hours. Instead of the usual lively mingling, there would be face masks, one-way corridors and desks a regulation distance apart.

Meanwhile the news about universities was depressing. The huge market for students from China and India, who pay the higher international fees, had collapsed overnight. The Institute for Fiscal Studies warned that thirteen British universities or colleges were at risk of going bankrupt. Several universities asked staff to take pay cuts. Others announced closures of humanities degrees. Many lecturers on short-term contracts, who supply as much as a third of university teaching, were laid off before the end of the academic year. In July, Gavin Williamson’s Department for Education published a “restructuring regime”, outlining the conditions for universities seeking emergency loans. It amounted to a new higher education policy, including a sharp shift towards STEM and vocational courses, a threat to end funding for arts and humanities courses deemed poor value for money, and a warning that universities would not be saved from going bust.

Even in hard times, universities receive little sympathy. They inspire a persistent, low-level hostility in political and public life. As William Whyte argues in Redbrick: A social and architectural history of Britain’s civic universities (2015), we tend to focus on the university not as a place but as an ideal. This engenders, he argues, “a constant sense that the university is in crisis, failing to live up to this exalted, fixed, and fictive idea”. In recent years universities have been denounced as havens of hidebound practices, anti-market thought, smug Remainerism and woke politics – “left-wing madrassas”, in Toby Young’s words.

But a university is not any of these feverishly imagined things. It is, first of all, a building, or a group of buildings, made of bricks, glass, carpet tiles and dropped ceilings stuffed with pipes and cables. It houses not just students and lecturers but also office staff, cleaners, counsellors, caterers, librarians, accountants. Inside its classrooms you find people talking about contract law or King Lear, or singing in Gospel choirs, or rehearsing plays, or kneeling on prayer mats, or lying on yoga mats. The people and the buildings come together in millions of small acts that make up an intricate, evolving, collective organism. A university is as full of human virtues, quirks and flaws, and as difficult to summarize, as a small town.

In How Buildings Learn: What happens after they’re built (1995), Stewart Brand sings the praises of a sprawling, ramshackle edifice at MIT known only as Building 20. Building 20 was a temporary structure erected during the Second World War for radar research. By the time it was finally demolished in 1998, this unpromising space had housed groundbreaking research on linguistics, acoustics, microwaves, video games and high-speed photography. Its horizontal layout, with lots of corridors and a lone vending machine to which everyone gravitated, forced people to meet and share ideas. It was a low-rent environment, free from turf wars because the turf – leaky, draughty, dilapidated – wasn’t worth warring over. The nuclear physicist Jerrold Zacharias, working on the first atomic beam clock, simply cut holes in the floor to make room for his equipment. Just by existing, Building 20 made creative things happen.

What is a university when it is not a building? We now have some idea because, in March, universities stopped being physical spaces with flip-down lecture seats, polypropylene classroom chairs, acoustic panelling and laminated notices about fire assembly points. They turned into data packets passing through fibre-optic cables and wireless routers on their way to kitchen tables, back bedrooms and garden sheds. Lectures were recorded, webinars held, essays submitted and marked online.

It worked, more or less – but, speaking for myself, it was a desiccated and lonely business. I felt especially sorry for our final-year students, ending their university careers with a single click to submit their last assignment, perhaps after leaving a brief note to their tutor in the comments box (“I am aware that this file is called ‘nearly done’ but I assure you it is indeed finished”, said one of mine). And with that their student days were over. Into the cold winds of a Covid-19 job market they were decanted, without the warming rituals of farewell hugs and degree ceremonies.

What made the online university possible is that the past two decades have seen a gradual digitizing of teaching. The first baby step was PowerPoint, which British universities adopted fairly late. My hard drive tells me that I only started using it in 2003, the same year that the Yale professor Edward Tufte complained that PowerPoint presentations “too often resemble the school play: very loud, very slow, and very simple”. But PowerPoint had one big selling point: its bullet-point templates and off-the-peg designs allowed content to be easily slotted in. In a more market-led university system, it ironed out the individual idiosyncrasy of the lecturer and met basic standards of presentational competence. Lecture slides could also be added to VLEs, “virtual learning environments”: electronic portals full of teaching resources. Recently these resources have included not only slides but also lecture recordings. VLEs respond to the same demand as TV catch-up and streaming services: the individual consumer’s desire to access content asynchronously and at their convenience.

Used alongside face-to-face teaching, the new technology works fine. The sacred form of the hour-long, real-time lecture probably needed shaking up. This teaching method was invented before the printing press, when books and paper were scarce and texts had to be read aloud to be discussed. In his memoirs, Siegfried Sassoon describes his brief time studying law at Cambridge, dutifully attending “droning lectures” in which “note-taking seemed to be physical rather than mental exercise”. In Oxford Triumphant (1954), a recent graduate, Norman Longmate, argued that this medieval invention, the lecture, “has lingered on into the twentieth century to become the biggest time-waster in Oxford”. During one lecture, Longmate noticed a fellow student composing a sonnet and another sketching the woman next to him. The only student showing real concentration turned out to be filling in a pools coupon.

The golden age of universities never existed. I was a student in the dying days of full maintenance grants and light-touch government intervention in higher education. That world had too many inattentive, complacently dull lecturers who saw teaching us as an imposition. A purgative dose of student consumerism certainly seemed in order. Except for one recalcitrant detail: students are not consumers. They don’t pay for their degrees (if they did, their degree certificates would be worthless), but for their tuition. Students are assessed, marked and graded, which doesn’t happen to most consumers. Teaching is not a client-facing service but an inevitably hierarchical activity. It is also communal and collaborative. When catch-up and streaming services transformed our TV-watching habits, all that was lost was the diasporic community of live viewers scattered across millions of living rooms. But when students consume class material at their leisure, the agora of the classroom is impoverished. What was a shared pursuit becomes, in the student satisfaction survey, a statistical aggregate of individual preferences.

Every lecturer knows this routine: the first thing students do when they enter a classroom is plug their phones into the room’s available sockets. Like Bedouins carefully calibrating how far the water in their goatskin bags will stretch between wells, they are always on their way to or from a recharging point. I have come to think of classroom teaching as a corrective to their device-driven lives. A timetabled class is inescapably analogue. It can’t be watched at double speed (a common student hack with recorded lectures) or split into bite-sized chunks. It teaches them to be truly present in a room and to know that thoughts and words carry real weight when they come out of this concentrated bubble of shared attention.

All human communication is embodied. The headache you get after a day of Zoom meetings tells you as much. Even thinking burns calories. We are sensual and tactile animals. That is why recorded music has not killed off the concert, why fans gather in city squares to watch football matches on big screens when they could easily watch them at home, why friends prefer to see each other in person rather than on FaceTime. We engage most intensely not with avatars or talking-head rectangles but with the physical presence of other breathing bodies. Why should teaching be any different?

Teaching is not a commercial transaction but an innately human act. Unlike most animals we are born prematurely, with our brains and nervous systems still developing. It takes years for us to master even simple motor functions. So we rely on our elders to teach us what to do and how to live. This turns us into needy, imitative creatures, easily bruised by a mere glance from another person, or raised aloft by the barest nod of approval. Teaching depends on gesture, body language, eye contact, vocal tone – those barely noticeable things that make every conversation different. A good university class hinges on what Elizabethans called “lively turning” – surprising links, embellishments and leaps of thought, made in the moment. Talking to your laptop camera while recording a lecture isn’t the same, any more than reading lines is the same as live theatre.

University planners have begun to talk about the “sticky campus”: one with lots of social spaces so that students stick around before and after class. Talk about reinventing the wheel. Some of us remember the sticky campus as “the campus”. The plateglass universities that opened in the 1960s, such as York, Sussex and Lancaster, had very sticky campuses, partly by accident. They needed sites of at least 200 acres, and land prices in city centres were too high. So they were built on green fields out of town. The redbrick university student had often lived at home or in lodgings scattered around the city. But when my parents arrived at Lancaster in 1964 as part of its first cohort, they encountered a revival of the medieval ideal of the university as a self-sufficient society of scholars. After bed and breakfast in their digs, they were expected to spend their waking hours on campus.

Today’s students, many of whom live at home and subsidize their studies with paid work, do not have this luxury. But because their lives are more fragmented, it matters even more that the university offers them a sense of belonging and community. Online teaching is often sold as a way to give students flexibility and accessibility, with everything a click away. But it also throws them back on their own unequally allotted resources. One thing lockdown has revealed is how many students have no access to a computer or a quiet place to work at home. Anyone who teaches young people will also have spotted the symptoms of an epidemic of anxiety and depression. A common characteristic of a distressed student is that they live inside their own head – a whirring, wired mind that has become estranged from the shell of a body that they lug around. Routines and timetables help: getting enough sleep, eating regularly and well, and forming part of that ad hoc student community carved out of class time, corridor chats and coffee breaks. Students may be surgically attached to their phones, but that does not mean they should live their whole lives online, or want to.

University managers tend to be techno-optimists, attaching an incantatory magic to the word digital. The timetabled routines of a university can feel, by contrast, boringly old-school. And yet showing up at the same time every week is a vital life skill. It allows you to ride out the tedium, fatigue and loss of heart that come with any attempt to learn something difficult over time. The scaffolding of habit shores up the patient, incremental effort that real learning requires. A timetable is also a peg on which we hang our loyalty and commitment to others. In a lecture at Oxford in 2001, Margaret Drabble told a heartbreaking story about the novelist Angus Wilson, a professor of English at the University of East Anglia. Long retired, in poor health and living in the South of France, he would sometimes rise from his bed at night with a start and hurriedly collect a pile of papers, saying he had to “go to give a lecture”. His partner Tony Garrett would eventually convince him that there was no lecture to be given, and persuade him to go back to sleep.

I worry that this may soon be me. Will I ever lecture to a packed room again? I fear that the pandemic will accelerate an underlying trend: the reinvention of the university as a virtual, atomized, hollowed-out space. The government’s restructuring regime says that adjusting to a post-Covid world may mean “maximising the potential for digital and online learning that the crisis has revealed to increase accessibility”. Online teaching needs fewer staff, cuts overheads and has vast economies of scale – at least if it is done on the cheap. The digital university, necessitated by a public health emergency, may come to seem like an improvement on its labour-intensive predecessor.

What would be lost are those unquantifiable aspects of a university education that can’t be reduced to packageable, downloadable content. Students are not merely human capital but creative, cussed, non-algorithmic, irreducibly unique human beings. They need time and space to develop their particular gifts in ways that feel true to them and useful to others. The ancient Greeks called this educational ideal eudaimonia, or “human flourishing”. As a justification for the university, it is a line of defence that fell several trenches back, being hard to audit or compute and easily caricatured as woolly-minded. But most university teachers still subscribe to it in some samizdat form. They believe that, without recognition of the value of the university as a series of organic and serendipitous encounters, the narrow pursuit of market efficiency is likely to prove both joyless and self-defeating. They think of the university as a place, and they hope that, when all this is over, it will be one again.

Poem for a crisis

Never let a crisis go to waste.

Now is the perfect time

to repaint your dining-room chairs

organise your kitchen cupboards

colour-code your spice rack

clear out your desk drawers

and repair your bird box.

 

These are strange times

but even stranger

is why you’ve left it so long

to go through everything in your closet

and put it in piles:

sell, customise, upcycle.

 

When you walk up the stairs

pretend you are climbing the Pyramids.

Run a marathon on your patio.

You can knock up a barbell

from bamboo and biscuit tins filled with rice.

And while you’re at it

why not build a climbing wall in your garage

and then ride to Berlin on your exercise bike?

 

Make a delicious meal from your pantry staples:

soba noodles and furikake.

It’s amazing what you can do

with half a fennel bulb.

Can’t get hold of seeds?

Pick them out of a pepper.

 

Draw up a learning contract with your pets.

Now there’s no excuse

not to read Infinite Jest.

Teach yourself to code.

If not now, when?

 

Police are patrolling in your area

issuing on-the-spot fines

to anyone who hasn’t learnt to make sourdough bread.

This is the new normal:

Get used to it.

Reset your life.

Appraise you

We’ve had a long, long year together

Through the hard times and the good

I have to line-manage you baby

I have to appraise you like I should

 

We’ve had a long, long year together

Through the hard times and the good

I have to ask you about your training needs

I have to appraise you like I should

 

I have to appraise you

I have to appraise you

I have to appraise you like I should

 

We’ve had a long, long year together

Through the hard times and the good

I have to identify performance shortfall

I have to appraise you like I should

 

I have to appraise you

I have to appraise you

I have to appraise you like I should