How to Know a Person

I reviewed David Brooks’s How to Know a Person for the TLS:

David Brooks was raised in a Jewish family whose motto, he says, might have been “Think Yiddish, Act British.” He learned to be awkward and reserved around strangers, but also inherited a disputatious streak that saw argument as “a form of prayer”. The result, for many years, was disappointing conversations, stymied by his tendency to either clam up or overperform. More recently, though, he has been working hard on being less inhibited. Now he actively seeks out conversations with strangers and looks for ways to make those conversations an act of joint, gentle, enriching exploration.

We should all try to do this more, Brooks argues, in an increasingly fractious and divisive public sphere where millions feel invisible and excluded. Adopting a favourite word of Generation Z, he suggests that we have lost the ability to make others feel “seen” — to let them know that they have been understood. When beholding someone fully, he writes, we see “the richness of this particular human consciousness, the full symphony – how they perceive and create their life”. He agrees with Iris Murdoch that we can “grow by looking”.

The trouble is, most of us are really bad at it. Our default view of the world is naively realist. We assume that the way it appears to us is what others also see. Our brains, locked in what Brooks nicely calls the “dark, bony vault” of our skulls, offer us a highly edited, partial reading of reality. People with different life circumstances literally see different worlds. Even partners and intimate friends misread each other, because they lock in some earlier version of the other person, and that version stays set while the person doesn’t.

The solution, for Brooks, is not some vague exhortation to be empathetic and kind, but learning how to perform small, concrete social actions well. Some of his suggestions about how to do this – pay full attention to people, ask open-ended questions – are fairly obvious. Other advice is more original and useful. People aren’t specific enough when they tell stories, so probe them for details, making them “authors, not witnesses”. Paraphrase what they just said and pause to see if they agree with your summary. Break the momentum of difficult conversations by stepping back and asking them how you got to this overwrought state together.

I was less convinced by Brooks’s formulae for initiating deeper discussions. He claims that people actually love being asked the questions we fear might be too personal. But some of the examples he gives – “If the next five years is a chapter in your life, what is that chapter about?”, “What have you said yes to that you no longer really believe in?”, “What is the gift you currently hold in exile?” – would just leave me stumped.

More problematic still is Brooks’s eagerness to categorise people. In every crowd, he writes, there are Diminishers, who stereotype, ignore and make people feel unseen, and Illuminators, who shine their curiosity and care on others and make them feel lit up. Then there are the Essentialists who are guilty of “stacking”, using one thing about a person to make a series of further assumptions. And the Weavers, who build communities and drive civic life. And so on. We all know people who very roughly fit into such typologies. But can’t people also be different things at different times? I thought we were all supposed to be gloriously complicated and difficult to “see”?

Brooks justifies delineating these character traits with an analogy: just as a sommelier can judge a wine more subtly because they have a feel for qualities like “well structured” or “strong finish”, we’ll be able to see people more clearly with a better understanding of the qualities that make up their personality. He wants us to be “sommeliers of people”. But people are infinitely more complicated than wine, which is already complicated enough. This urge to label and classify sits awkwardly in a book which also argues that people often feel unseen “because somebody saw them not as an individual but just as someone in a category”.

Brooks is a chatty, likeable guide, although with an over-fondness for reheating the latest psychological research and peppering his prose with aphoristic quotations from authors who “had a famous saying” or “said it wonderfully”. I much preferred it when he suspended the supply of life hacks and wrote at more length about his own life and the people in it. One chapter covers a close friend’s depression and suicide. This chapter’s insight – that you shouldn’t try to coax someone out of depression but instead “create an atmosphere in which they can share their experience” – is not original. But it rings true because it has clearly emerged out of sustained, pained experience, rather than the urge to dispense “smart thinking” wisdom.

More Than a Game

I wrote this review of David Horspool’s More Than a Game: A History of How Sport Made Britain for the TLS.

In his classic book Anyone but England: An Outsider Looks at English Cricket (1994), Mike Marqusee writes about falling in love with the sport as an American newly arrived in the UK in the 1970s. Watching TV in the long hot summer of 1976, he saw the West Indies blow England away in the Test series, and became captivated by the spectacle: thirteen men in immaculate whites, moving in intricately choreographed patterns on perfectly mown grass as if obeying some ancient religious rite. “The change at the end of the over, when I first saw it, struck me as magical,” he writes. “It was so arbitrary, yet so precise, like a sorcerer’s trick.” This, Marqusee learns, is what makes cricket such a handy conveyance for its peculiar brand of racial and class politics: its pointless beauty draws us in first.

Britain is not the most cricket-mad, or football-mad, or golf-mad nation on the planet. But as David Horspool writes in this ambitiously conceived new history, “the sheer variety of games and the complex history of sport in Britain are unparalleled”. Britain has a unique talent for inventing and codifying sport and then sending it round the world as “an agent of empire and a spreader of soft power”. Meanwhile, on home turf, sport has generated endless social anxieties and panics, concerning its supposed encouragement of idleness, Sabbath-breaking, violence, unruliness, foul play, corruption and greed. These fears often bring to the fore the British genius for the botched compromise. When betting shops were finally legalised in 1961, they had to operate behind blacked-out windows. Rab Butler, the Home Secretary responsible for the legislation, later reflected that they had been so “intent on making betting shops as sad as possible, in order not to deprave the young, that they ended up more like undertakers’ premises”. In 1986 a concession arrived: they were allowed to serve hot drinks.

For Horspool, sport is “more than a game” because it has penetrated every area of British life and served as both a crucible for and a reflector of social change. In cricket’s perennial tension between the celebration of amateurism and style over professionalism and the pursuit of victory, for instance, we see both an illustration and an artful exaggeration of the British class system and its “fantasy of patrician beneficence”. Golf, as the greediest coloniser of green spaces of any sport, has been at the heart of battles over the future of the British landscape, first by monopolising common land and more recently by falling foul of environmentalists. Boxing has offered a way out of poverty for successive waves of immigrants but was also a pioneer of colour bars and other ugly forms of racism.

Each chapter takes on a different sport and is split into themed sections. There are illuminating five-page riffs on subjects such as boxing’s origins in London’s Jewish East End, football stadium chants, street bookmakers and Wimbledon dress codes. Sometimes this division into subheadings leads to an over-reliance on potted histories that don’t quite cohere into a thesis. Elsewhere, though, the non-linear structure allows Horspool to make suggestive connections across time. He traces many elements of the modern international sporting competition, for example, back to the medieval tournament: the heraldic badges on football players’ shirts, the semi-contained, ritualised violence, the elite competitors playing to a broad audience, and an ethos of fair play battling constantly with the urge to win.

Horspool has done much useful rooting around in old newspapers and sporting magazines and has a lovely eye for the informative detail. What, for instance, makes 1830 such a key date in the evolution of bowls, croquet and tennis? Because that was when the lawnmower was invented. Why is the preferred view of the “true” football fan not the self-evidently optimal one from the halfway line, but the distorted, foreshortened and often distant view from behind one of the goals? Because a century or more ago, the Glaswegian architect Archibald Leitch established the standard football stadium design, with posh grandstands and supporters’ terraces at each end. Why are the two statues in Britain in honour of tennis players (Fred Perry and Dorothy Round) easily outnumbered by those for racehorses (Frankel alone has four)? Because tennis, outside of Wimbledon fortnight, is a surprisingly minor sport in Britain; racing has always been huge. Victorian jockeys were the first sporting superstars.

Horspool’s rationale for writing this book is that a country that expends so much of its energy on sport should feel more acutely than it does “its absence from mainstream historiography”. But is it really so absent? There have been several single-volume histories of British sport before this one, as well as countless histories focusing on individual sports. Academic sports history is a long-established and thriving sub-discipline with several journals dedicated to it. More puzzlingly, Horspool cites many of these books and journals in his notes, yet engages with them only in passing. This skirting over the existing scholarship may explain why More Than a Game never quite turns its many fascinating vignettes and trenchant observations into an overarching argument. “Sport can only exist and develop in the wider environment in which it takes place,” he writes. “So the tensions of race, class, gender, geopolitics, money, identity and environment are always exposed or reflected in sport, which, in turn, reflects them back on to wider society.” It is hard to argue with that, but nor is it an original insight.

There is a far more interesting argument, I think, dotted through the book without ever being fully articulated. It is that sport is a kind of distorting mirror of society: it feeds into broader histories while spawning its own rich micro-histories. Sport is limitlessly weird. I did not know, for instance, that until the Covid pandemic put paid to them, racecourses all had saunas, where jockeys desperately sweated away the final few pounds before a race like exam-sitters doing last-minute revision. Sport follows its own sweet logic and is a capacious container of contradictions. How else could the urban game of professional cricket, with most of the county grounds located in cities, present itself as a world of “unblemished rustic amateurism”?

The history of sport is also the history of these strange subcultures that demand complete immersion in their symbolic universes. Looking at them with an outsider’s eye would break the spell and render them meaningless. There are plenty who remain immune to the spell and who, like Kipling dismissing “the flannelled fools at the wicket or the muddied oafs at the goals”, see it all as a silly distraction from more important things. A large number of Britons couldn’t care less about any sport, as is shown by the always surprisingly low recognition scores for sporting stars on the TV quiz show Pointless.

But that spell, when it works, is what compels fans to stand in the freezing rain to watch their side get beaten week after week, or spend fortunes on Sky Sports subscriptions and flat screen TVs the size of dining tables. And they do all this despite so often being treated with undisguised contempt by the powerful and monied people who run elite sports – for that is another common theme in this book, of rich men’s pursuits being opened up grudgingly to the rest of us.

“Fortunately,” Horspool concludes, “sport is also transcendent.” Sport, for many, is not a distraction from the real business of life; it is life itself. I wish this book had offered more of a sense of that life-giving joy of sport – which is, after all, what made it matter to the historical subjects who played and watched it. But More Than a Game rarely ventures onto the field of play or into the thick of the action. Take one passing comment, that the Welsh rugby union stars of the 1970s (Gareth Edwards, Barry John, J.P.R. Williams) provoke “gasps of admiration” when we watch replays of their darting runs on television. Horspool doesn’t elaborate on this with an account of those thrilling, snaking dashes for the try line. The half-choreographed dance of a team game, the pulse of collective movement, the luminous piece of individual virtuosity plucked from the jaws of contingency – these too are part of sport’s history, but we only glimpse them intermittently here. And yet sport would not have had nearly so much power to drive and refract social change – the history that Horspool tells with great aplomb – if that magic were missing.

Academic tribes

I wrote this for Times Higher Education a couple of weeks ago:

The people who work in universities fall into two tribes: tragedians and comedians. These tribes view each other with bewilderment across a seemingly unbridgeable mental and cultural divide.

What makes things worse is that the tribes are not named as such, and no one ever declares or even knows which tribe they belong to or why. All they know is that there is a whole bunch of people in universities with whom they have little in common. Perhaps, in the interests of collegiality, it would help if I introduced these tribes to each other.

In his pioneering work of ecocriticism, The Comedy of Survival (1974), Joseph Meeker discusses two literary-philosophical modes of thought that people have used to make sense of the fragilities and vagaries of life: tragedy and comedy.

In tragedy, the (almost always male) hero exists in a state of conflict with some force – nature, the gods, fate, death, his own self-defeating desires – that will ultimately overthrow him. He tries to escape his human limitations and fails. The audience feels both awe and pity for him as he follows this doomed project to its end. Tragedy concerns the human search for aggrandisement, the need to feel we matter in an uncaring, meaningless universe. It sees our efforts to transcend our animal nature as heroic, even if that heroism ends in death. In tragedy, the gods care enough about us to squash us when our pride and defiance anger them. Tragedy puffs up our species pride, our delusion that the universe gives a fig for human ideals.

Tragedy is a human invention – of the Athenians in the 5th century BC, and of cultures inspired by them, such as Elizabethan and Jacobean London. Comedy, by contrast, exists everywhere, even in nature. It grows artlessly out of life’s biological realities. Plants and animals live what Meeker calls “the comic way” in their pragmatism and pliability. Evolution for him is like a comedy, its aim being to allow life to thrive in all its myriad forms. Successful participants are not always those best able to destroy competitors, but those that can stay alive and grow.

Evolution favours variety over the ideal or optimal; it eliminates the unworkable and settles for the good enough. Healthy ecosystems maintain an equilibrium that avoids all-or-nothing rivalries and allows life to carry on. Similarly, the heroes (or antiheroes) of comedies may fall flat on their face, or lose their trousers, but they always survive. Comedy bids us respond to our human limitations with flexibility and compromise. The comic way seeks to enjoy and sustain life, because life is all there is.

In universities, the tragedians tend to occupy more senior positions, especially the upper echelons of management. They are not tragic heroes in the Greek or Shakespearean sense, of course: they don’t do dramatic things like stabbing their eyes out or dying in a pool of blood at the end of the fifth act. They don’t have Lear’s rages, or Othello’s jealousy, or Macbeth’s murderous lust for power.

No, they are tragedians because they think that humans can escape their fate as frail, mortal, imperfect beings. They put their faith in human-made ideals rather than natural laws. And they believe in chasing down these abstractions – usually given names like excellence, quality and innovation – endlessly.

In pursuit of them, they see universities not as delicately interconnected ecosystems but as places of rivalry and competition. They think not only that universities should compete with other universities, but also that individuals and groups should compete within universities for money, prizes and resources. A university, to a tragedian, is a collection of competing cost centres.

The admirable thing about tragedians is what genuine idealists, what true believers in the university, they are. They treat the marks of distinction and esteem that academia has devised – titles, honours, university rankings, assessment exercises, data-driven performance indicators – as if they had some absolute, unquestioned reality in the world. Theirs is a tragic vision because it coats these anthropocentric creations with the veneer of necessity. They cannot imagine any alternative to this hard-driving, metric-oriented world.

Tragedians get things done. They believe in constant activity, the embracing of growth and change, the zealous production (even overproduction) of research “outputs” and “impact”. Unsurprisingly, they have a much bigger carbon footprint than the comedians. They hoover up the air miles to attend international conferences and forge global partnerships, and they build shiny new campus buildings, often trumpeted as energy-efficient, even though the most sustainable solution would be to restore and refurbish old ones. (As eco-minded architects say, “The greenest building is the one that already exists.”)

Tragedians align perfectly with the values of the free market, because the free market is also a tragic concept. It, too, refuses to recognise our human limitations, because its aim is to go on forever, even at the expense of the destruction of the planet. It thinks of humans as mini-gods, unbound by the natural constraints that govern other living things. For Meeker, the tragic worldview lies at the heart of our environmental crisis. Its ultimate goal – infinite productivity, using only the finite reserves of energy on this earth – is impossible.

The university’s comedians, meanwhile, are baffled by all this busywork. They think that much of what goes on in academia – the ritualistic language, the unending meetings that generate more meetings, the power play and jostling for status and influence – is inherently comic. They believe that even the clever, able people who work in universities do foolish, irrational things. They know that underneath our carapace of smooth professional competence we remain what Shakespeare’s Falstaff calls “this foolish-compounded clay, man” – an error-prone, self-deluded, self-sabotaging animal. They see that much of what goes on in academia has no objective reality but happens within some closed circle of meaning we have drawn ourselves, then stepped inside and mistaken for the world.

The comedians tend to be lower down the institution’s food chain. They are the people on the ground, having to find concrete answers to the abstract visions of the tragedians, which looked so neat and plausible when summarised in bullet points and on spreadsheets. They are the office administrators who find workarounds for dysfunctional processes. Or the lecturers who try to convert the ineffable aspects of what they do into some diluted or semi-fictionalised form that bureaucratic and IT systems will recognise. They keep doing their job when they know that the software is hopelessly overengineered, the classroom tech doesn’t work, the website is unnavigable, the delivery strategy is undeliverable, and workload allocation models bear no relation to the work that they actually do.

Unwilling actors in this theatre of the absurd, they try to keep the show on the road in a collaborative, mutually tolerant way. They know that, as in a comedy, most things in academic life are a muddle that never gets resolved, but that the life of the university always goes on.

To be clear, comedians do not think that their work is absurd or meaningless. They are no less hard-working and dedicated than the tragedians. They only suggest that the meaning of university life can’t be found by running after intangible ideals. Instead they relish the detail and materiality of the everyday, the snatched moments where pleasure and purpose come together: the buzzy feeling after a good class, the discussion with a student that really seemed to cut through, the click of recognition in their own research when an idea or sentence falls into place. None of these things are measurable, and yet to a comedian they are what makes the work matter.

Recently, I helped to clear out some archived material from my department when we had to empty a storage room. We sorted everything into recycling or confidential waste sacks: handbooks from now defunct modules, old undergraduate dissertations that had to be chucked under data protection legislation, reams of papers from previous assessment exercises and revalidations, decades-old minutes of meetings. It was a salutary lesson in how the business of the university, so urgent and invested with meaning at the time, gets eaten away by the years. And yet it was impressive as well, all this evidence of forgotten labour, done just to keep things ticking along and ensure that the human comedy went on. 

To a tragedian, a comedian’s approach seems hopelessly woolly and unrigorous. The comedian replies that true rigour requires that we take account of subjectivity, and of how the most robust and consistent procedures are muddied by flawed, uncomputable, glitch-ridden humans. But surely, says the tragedian, this is all woefully lacking in ambition. We can’t just bumble along as we’ve always done! The comedian replies that they are not against change, just the pre-set, top-down, life-sapping variety enforced through “change management”. Comedy, they point out, is as adaptive and creative as the natural world. It lets a thousand flowers bloom, each flourishing in their own way. It recognises that our most imaginative and lasting achievements emerge out of an alertness to our messy, mercurial, fallible natures.

By now you may have guessed which tribe I belong to. I am a comedian. The tragic way has its uses and boasts many accomplishments. Humans are storytelling animals, and all work relies to some extent on the suspension of disbelief, a faith that it matters more than it actually does. We all need to believe in something.

Pushed too far, though, the tragic way becomes joyless. It makes people feel ashamed and guilty that they are not living up to its ideals – ideals that can never, in fact, be lived up to. For you will never produce enough outputs, never have enough impact, never secure enough funding, never be trained enough, never be innovative enough, never be excellent enough. The tragic way turns us into martyrs who see work as sacrifice and suffering. It explains much about why so many people in universities – even those in secure, well-paid jobs doing work they believe in and enjoy – feel stressed, harassed and miserable.

The other day, I was completing an online form on one of the many data management systems I am required to use as part of my job. I had to get a box to turn from red to green, to show that data had been provided to the satisfaction of the algorithm. I did not think the information being asked of me was useful or relevant. I typed “not applicable” into the box. The algorithm swiftly shot back that this answer had “an insufficient number of characters”. I typed “This is not applicable” into the box.

The box went green. The comic hero’s victories are small, but, as Meeker writes, “he lives in a world where only small victories are possible”. Oh yes, I thought, I am a comedian.

Class of 2023

I wrote this, about the graduating class of 2023, for Times Higher Education a couple of weeks ago.

The class of 2023 will soon graduate. When they walk across that stage in their black polyester gowns and mortar boards to shake the vice-chancellor’s hand, it will feel different from previous years. This cohort is unique – and, hopefully, will remain so.

When they began university back in September 2020, this celebratory moment seemed a long way off. I knew them then only as shifting pixels arranged into headshots on my laptop screen. I was quietly freaking out. Covid cases were rising again and most campus buildings were shut, obliging students to log in to Zoom classes from their family homes or the shared accommodation at which many had already arrived, expecting to be taught face-to-face.

When I clicked on the button to start my first seminar with the first years, I spent a horrible few minutes alone in the virtual room, looking at the webcam image of my tired, haunted face, wondering how it had all come to this. Then I heard the reassuring pings that signalled students waiting to be admitted. I let them in, and their faces popped up one by one. Some were in their bedrooms, with their posters and photo walls behind them; others were sat at kitchen tables filled with everyday clutter. They looked friendly, if a bit dazed, and ready to roll with this surreal state of affairs. Within a few minutes, I thought: “This is going to be OK; we can do this.”

As the weeks went by, we grew more at ease with the technology and each other. I had conversations with them about their lives that I would never have had in a classroom. One of them, newly branded a “key worker”, was getting up at 3am each day to stack shelves at Asda. Another had lost all his income because bar work had dried up. All were struggling gamely with the practicalities of the new normal: competing over the family PC with home-schooling siblings, running errands for grandparents, doing the shopping and cooking for flatmates with Covid who were isolating in their rooms.

The news was full of stories about students having Covid parties to catch the virus from each other and get it over with. But these young people were obeying the rules, sacrificing all the fun bits of a fresher’s life for the sake of those more vulnerable.

That autumn was grim, with no sign of a vaccine yet and the days getting gradually colder and darker. A weak, low-hanging sun poked grudgingly through the small window of the box-bedroom study where I worked. In a week of solitary screen work and anxious doomscrolling, my Zoom classes became the one thing I looked forward to. The class of 2023 kept me going.

All lecturers moan about students a little, an inevitable symptom of generational differences and the stresses of the job. Why won’t they answer their emails? Why won’t they do the reading? Why won’t they get off their phones? But I will always have a soft spot for the class of 2023, who I didn’t meet IRL (as they would say) for a whole year, when their habit of waving goodbye before logging out of Zoom morphed into sweet thank yous as they left the physical classroom.

Generation Z is often accused of being fragile and mollycoddled. In truth, this generation feels powerless in a world that has failed them. Often already working long hours doing emotionally draining work in the gig economy, they face years of debt and precarity. They have good reason to feel troubled and are literate in the language of mental health to explain their distress.

In response, they are always being urged to acquire that voguish inner quality, resilience. But resilience is not some universal rocket fuel that we can top up on, serviceable in all situations. Students are, like all human beings, an unfathomable mix of brittleness and strength, weakness and wilfulness. They might be phobic about something I can do as easily as blinking and then blithely bat off something else that fills me with nameless dread. People are weird like that.

I don’t want to go back to online teaching, ever. But in that strange interregnum I did learn something about the many-sided nature of resilience.

I dislike the flexed-biceps-emoji, “you’ve-got-this” culture of positive thinking that has overtaken universities in recent years. The self-help and personal growth industries, where all this originates, are wrong to insist that the world is always solvable and that even the worst experiences can be turned into opportunities. Not every adversity is a chance to learn and grow. And no, despite what every graduation ceremony speaker seems to tell you, you can’t just achieve your dreams by never giving up.

But, still, it is amazing what you can survive. And when faced with a real, full-bore crisis, this generation of supposed snowflakes just got on with it and coped better than I did.

So, graduating class of 2023, I doff my floppy felt hat to you. If you can get through a degree with all that going on, I have high hopes for you. Now, if you wouldn’t mind sorting out the world, we’d all be very grateful. Sorry we left it in such a mess.

Imagination

I reviewed Albert Read’s The Imagination Muscle for the TLS:

This book about the imagination begins, aptly enough, with a story about telling stories, inspired by a serendipitous find. Just out of university and doing a job he hated, selling advertising space from a Soho office, Albert Read was browsing at lunchtime in the second-hand bookshops on Charing Cross Road. He saw a book, The Secret Language of Film, by the French screenwriter Jean-Claude Carrière, and bought it on a whim. In it, he discovered that Carrière and his long-time collaborator, Luis Buñuel, would set each other a challenge at the end of a day on set, by spending half an hour inventing a story to tell to the other. This was their way, Carrière wrote, of “training the imagination, the muscle which makes the essential breakthroughs”. Ever since, Read has clung to this idea of the imagination as a muscle – something that can be stretched and developed “through the regular pulse of movement and exertion”. Engaging it fully might feel weird at first, as with “an unencountered glute muscle”, but it will become stronger and suppler when “prodded into the rhythm of action”.

Most of this book’s chapters are themed around some way of training this “imagination muscle”. The training tips seem fairly obvious: free yourself from unthinking habit by taking a different route to the train station on your way to work; adopt a “beginner’s mindset”; be at ease with risk; keep a commonplace book; seek out collaborations and have better conversations; make leaps across disciplines; embrace the “joyful absorption” of observation, by taking out a magnifying glass to study a yellow meadow ant. All sound ideas, but none of them very imaginative.

Gradually, though, a more original take emerges. The imagination works best, Read finds, when it meets the constraint of structure or has to otherwise rub up against the real world. Its sweet spot is the “meeting point of intelligence, acquired knowledge and observational wonder”. The trick is to stay within touching distance of the unconscious and respond to its “quiet intimations”, while giving them shape “under the limpid gaze of judgement”. The biggest creative leaps are made by refining and recombining established ideas, or running with the grain of what already exists. Palaeolithic cave painters worked with the contours and fissures of the rock – a bulge in the rock becoming a bison’s hump, say, or the swell of a muscle – to bring three-dimensional dynamism to their art. Read offers the analogy of a child making a mask by coating one side of a blown-up balloon with papier mâché. Sometimes, he writes, “the imagination requires a balloon around which it can drape its first tender ideas – initially too fragile to find solid form of their own”.

Read is the managing director of Condé Nast Britain, overseeing titles like GQ, Vanity Fair and Vogue. Imagination is a more unusual topic for someone of this background to fasten on than its more dynamic-sounding partner, creativity. Creativity, ubiquitous in the business lexicon, suggests agency and action; imagination carries suggestions of reality avoidance and unproductive daydreaming. From the evidence of this book, Read seems like the kind of line manager who would tolerate a bit of daydreaming in meetings. I like the sound of a boss who says that “ideas come where there is confidence and laughter” and who thinks that business is normally too focused on “the ratcheting up of expectation, the smoothing out of experience … and the unthinking quest for efficiency”.

I am not sure, though, that this book quite gets to the bottom of what the imagination is and how it works. Read dismisses evolutionary biologists who see the imagination as “merely a highly evolved survival mechanism” and who thus wish to “diminish or ignore its mystery”. I don’t see why this should be so: you can still be captivated by the mysteries of the human mind even if you know that they are all held within that three-pound lump of jellified fat and protein inside our skulls.

Read would presumably agree with Keats, who accused Newton of draining the rainbow of poetry by explaining it as a product of the refraction of white light. There are plenty of scientists discussed in this book, but they tend to be ones with side interests in the arts such as Albert Einstein (violin-playing), Richard Feynman (drawing) and Alexander Fleming (painting with bacteria). Read’s idea of the imagination is essentially a romantic-artistic one. The Romantics saw the imagination as a transforming force, able to both tap into the hidden energies of this world and conjure up alternative worlds – in Shelley’s words, “Forms more real than living man, / Nurslings of immortality.” For Read, too, the imagination is a tool that can unravel “the mystery, the harmony and the immensity of existence” while also serving as “the valve easing the crush of reality”. In service of this argument, Read falls back on the usual prodigiously gifted suspects: Newton watching the falling apple, Turner strapping himself to the mast of a ship, Einstein grasping the relativity of time just after waking up one morning, Leonardo conceiving the helicopter by observing the rotating descent of a maple tree seed.

Undercutting this – and the source of the book’s more interesting insights – is an idea of the imagination as something more ordinary and workaday. To imagine, in the mundane sense of being able to generate images not felt or experienced by the senses, is a basic human activity. Imagining, as Read writes in the opening chapter, is at the centre of all social life. He is tied to his family “not only by love, but by the imagined structure of marriage”, and he deposits his money in a bank because “I imagine that one day they will give it back”.

This notion of the imagination as something that knits our collective lives together could have been explored more. The novelist and critic Amitav Ghosh observed in his 2016 book The Great Derangement: Climate Change and the Unthinkable that “the climate crisis is also a crisis of culture and thus of the imagination”. Our stunted ecological imaginations make it hard for us to connect our everyday habits with rising sea levels and record temperatures. We demonize refugees and benefit claimants because of a failure to imagine the fullness of their humanity or the complex networks that connect us to them. Hate, as Graham Greene wrote in The Power and the Glory, is a failure of the imagination. Imagining other perspectives and realities is the basis of all democratic compromise, all care and concern, all attempts to conceive better ways to work and live.

Read mostly ignores these broader questions, focusing instead on life hacks, ways of training the imagination to make our individual lives more productive and fulfilling. I can see why his publisher liked the idea of the imagination as a muscle. It makes intuitive sense, and is readily explainable in a three-minute radio interview. But is the imagination really like a muscle? Most muscles, when exercised regularly, become stronger and more efficient at converting energy derived from chemical reactions into mechanical energy. I saw no firm proof in this book that exercising the imagination regularly makes you more imaginative. Rather, the evidence collected here suggests that the imagination works best when it is caught unawares, while the mind is semi-occupied, energized by a change in physical circumstances or enjoying a brief moment of stillness. Ideas come most freely to us in the grogginess of waking up, while having a shower, or when leaving a party alone and stepping out into a cold street.

My favourite story here is of Arthur Fry, who invented the Post-it Note while daydreaming during a dull sermon in a Minnesotan Presbyterian church in 1973, “his disengaged mind roaming the unconscious for connections, but still possessed of a sufficient self-awareness to pull a good idea up from the depths”. (I can confirm from first-hand experience that church services are a good space for creative reverie; more recently, dull university research seminars have served the same function.) Perhaps the answer, then, is to engineer more of these liminal, unguarded occasions, rather than to push the imagination like a muscle and feel the burn. Maybe that is what Read is really arguing, because the muscle metaphor doesn’t appear much beyond his opening chapter. By the end of the book, this metaphor feels less like an argument than a peg on which to hang a mixed bag of insights – most of them stimulating, some of them useful, not all of them imaginative.

Awe

I reviewed Dacher Keltner’s Awe: The Transformative Power of Everyday Wonder for the TLS:

Over the centuries, the word awe and its derivatives have suffered what linguists call semantic bleaching: overuse has weakened their intensity. In its earliest English uses, awe meant a feeling of terror or dread. From this kind of awe-full derived awful, an adjective attached to anything deemed unpleasant or abysmal from the late eighteenth century onwards. From the early fifteenth century, the terror and dread associated with awe came to be mixed with reverence or wonder, usually inspired by the divine. Awesome meant arousing this kind of awe, until it came to mean merely breathtaking, and finally turned into a vague term of approval, meaning “cool”. The other day, a student on one of my courses sweetly exclaimed “awesome!” when I gave him a copy of the module handbook.

Dacher Keltner wants to revive the old sense of awe as an intense, life-altering feeling, but not its earliest associations with terror or dread. Nor does he subscribe to the common belief that awe can be so overpowering as to deprive us of our critical faculties, leaving us prey to dogma and demagoguery. Keltner, a Professor of Psychology at the University of California, Berkeley, has conducted experiments which suggest that, in a state of awe, our thought is more probing, rigorous and energised. Awe, he writes, “awakens the better angels of our nature”.

Keltner defines awe as “the emotion we experience when we encounter vast mysteries that we don’t understand”. His book begins with a moving account of watching his brother Rolf die. As Rolf, terminally ill with colon cancer, took opiates to end his life, Keltner sensed “a force around his body pulling him away”. Watching life leave another person is an awe-inducing experience. It makes us aware of the great mystery and preciousness of being alive and the dignity and solidarity of death, and connects us with forces much larger than us.

Keltner’s work forms part of ongoing efforts in academic psychology, from the early 1990s onwards, to take emotions seriously. A long tradition in Western thought, running from Plato to Descartes and beyond, has seen emotions as base, bestial and sinful. Humanity’s highest achievement, in this tradition, is the intellect, which supposedly transcends and survives our animal, bodily selves. This newish sub-field of psychology suggests, to the contrary, that emotions are a crucial influence on our thoughts. They are the lens through which we perceive the world. Keltner’s research shows that awe, like other emotions, is grounded in bodily responses – not just the obvious phenomena like tears, goose bumps and hairs standing on end, but also the autonomous sensory meridian response (ASMR), a literal tingling of the spine, shoulders, neck and head.

Awe has less visible bodily effects, too. It triggers the release of oxytocin, the “love” hormone that helps us to trust and bond with others. And it activates the vagus nerve – a cluster of neurons running from the brain to the abdomen that regulate bodily functions, slowing our heart rate, aiding digestion and deepening breathing. Awe also reduces inflammation in the body, which is associated with depression, heart problems, cancer and autoimmune diseases. Awe, in other words, confirms Walt Whitman’s hunch that the soul follows “the beautiful laws of physiology”.

By concretising the experience of awe, Keltner’s book offers a valuable corrective to our hyper-individualist culture. Much of our current understanding of how human life and society operate is dominated by rational choice theory. This theory holds that people act mainly as self-interested individuals, and that they pursue this self-interest efficiently with the aim of maximising the utility of any outcome. Rational choice theorists focus on what psychologists call the default self: the self that considers itself a free agent, discrete and distinct from others and geared towards competitive advantage. The dominance of rational choice theory encourages us to see life and human relations instrumentally, in terms of clear, measurable and self-maximising outcomes. As Keltner says, this is why schools cut art, drama and music classes, and why creative approaches to pedagogy are replaced by teaching to the test.

But our default self is not all that we are. Awe quiets this self-maximising part of us, Keltner argues, and invites us “to collaborate, to open our minds to wonders, and to see the deep patterns of life”. A common prompt for feeling awe is witnessing what he calls “moral beauty”: other people’s courage, kindness or capacity for overcoming adversity. Awe puts our lives in perspective, making us feel small but still significant, a tiny piece of patchwork in a vast tapestry of existence. When we are awestruck, our life’s work seems “both less important than our default self makes it out to be and yet promising in purpose and possibility”.

I think I understand what Keltner means by awe, then, but I am not convinced that his research subjects have the same understanding of it as he does. With his Berkeley colleague Yang Bai, Keltner collected 2,600 stories of experiencing awe from people in 26 countries. He also interviewed people from many walks of life: basketball players, San Quentin prisoners, cellists, clerics. They and Keltner come up with such wildly disparate examples of awe that you wonder if they are talking about the same thing. These examples include watching a seven-year-old daughter play the tin whistle in front of 200 people, listening to lullabies, Mexican waves at football games, mosh pits, the Burning Man festival, the Cirque du Soleil, surfers riding 50-foot waves and, for one Swedish woman, witnessing her husband’s strength when moving furniture around the house. Keltner makes a point of saying that, in these stories of awe, “no one mentioned their laptop, Facebook, Apple Watch, or smartphone”. But many people do view the latest tech with something approaching awe, and I don’t see how different this is from the other examples he cites.

One problem here is that the word awe has been so semantically bleached. Keltner notes that Google Trends reveals a sharp rise in the use of the word since 1990. Such sharp rises are rarely accompanied by definitional precision. A second problem is that awe tends to evoke only a pre-verbal language of so-called vocal bursts, like “ooh” and “whoa”, that evolved in Homo sapiens prior to the emergence of words. When Keltner asks his subjects to describe their experiences, they fall back on boilerplate phrases like “it gives me the chills”.

This wouldn’t matter if Keltner were a precise enough writer to convey some sense of the ineffable. Instead, his prose style is garrulous and gushy, heavily italicised for emphasis and with browbeating paragraphs consisting of single, all-caps sentences such as “GET OUTDOORS”. There are awkward thumbnail sketches of the people he interviews: “Over Indian food, Frank ate sparsely, like the competitive miler he was at Harvard.” Kierkegaard is introduced as “the dour Danish philosopher”, which isn’t the descriptive adjective I would have chosen.

Like most books aimed at occupying the bookseller’s genre of “smart thinking”, this book is reducible to a single lesson that can be snappily summarised and serialised. Keltner’s single lesson unites a huge range of material, from the teachings of Julian of Norwich to anecdotes about being seated at a dinner next to Steven Spielberg, who tells Keltner that “we are all equal in awe”. The stories Keltner relates – of epiphanic moments experienced by doctors, army veterans, hospice workers and midwives – are not as powerful as they might be, because they come so thick and fast and are shoehorned into a somewhat artificial taxonomy of awe based on eight “wonders of life”, from nature (“wild awe”) to music.

There is lots of signposting and recapping, of the “to answer this question, we will …” and “we are nearing an end to our first section” kind. All this hand-holding can make it seem as if there is nothing to learn that Keltner doesn’t already know. “Twenty years into teaching happiness,” he writes on the second page, “I have an answer: FIND AWE.” This spoiler means that the book’s whole argument is baited and primed from the start. That isn’t to decry the two decades’ worth of work that Keltner has done on this subject, or his generous-spirited and carefully argued use of it. But awe is about embracing the mystery, and there was no mystery to this book at all.

Fans

I reviewed Michael Bond’s Fans: A Journey into the Psychology of Belonging for the TLS:

“Fannish devotion,” Michael Bond writes, “is a gregarious impulse.” As the subtitle of his book suggests, he is interested in fandom as a shared experience that gives people’s lives meaning and purpose. In the course of the book, he joins fans at online meetings and conventions and talks to collectors, cosplayers, fanzine writers and “aca-fans” (researchers who are fans of what they study) to try and understand what forms of belonging fandom takes.

Fans begins by framing fandom within the social identity theory pioneered by the psychologist Henri Tajfel. Tajfel’s experiments, conducted in the late 1960s, showed that people require minimal prompting to categorize themselves with others and to favour members of their own group over anyone else. Group identities are an inevitable fact of social life, one of the key ways we position ourselves in the world. Fandoms, Bond argues, suggest that in-groupishness does not have to lead to intolerance towards those outside the group. They offer “the pleasures of tribalism with less of the harm”.

After this opening chapter, the social theory takes something of a back seat and the book settles into a diverting tour around the more eccentric shores of fandom. “Fans,” Bond writes, “exist in places you would never think of looking.” He talks to one academic psychologist who is a “Brony”, one of the thousands of British and American men who gather in online communities to celebrate the characters in the toy line and media franchise, My Little Pony. These men are drawn to its themes of friendship and compassion, and enjoy upending expectations about the kinds of things men should be interested in. This same psychologist is also a “furry”, one of the large subgroup of cosplayers who like dressing up as anthropomorphic animal characters such as Bambi, Sonic the Hedgehog and the Lion King. He sometimes lectures dressed in his fursuit, and reports that student feedback is “largely positive”.

Fan cultures are bottom-up, relying on little or no input from the objects of their devotion. Their members are not just consumers of a product but advocates and campaigners for it. They can form formidable lobby groups, for instance, to challenge the cancellation of a favourite TV series (as NBC planned to do with Star Trek in 1967, until they received 115,893 letters in protest). They can also be highly effective political activists. When anti-government protestors in Thailand demonstrated in the streets in 2020, many of them dressed as characters from Hogwarts to symbolize their fight against injustice. Fans of the South Korean boy band BTS have used their online savvy to block anti-protest surveillance apps, derail racist social media campaigns and raise money for Black Lives Matter.

Fan fiction, almost none of which is shared beyond fan communities, deploys astonishing reserves of energy and creativity. The online fan-fiction repository, Archive of Our Own, has around five million registered users and more than nine and a half million works (a number that rose sharply during the Covid lockdowns), including over 10,000 versions of The Hobbit. A common ploy of fan-fiction authors is to move the action to the present: in one rewrite of War and Peace, for instance, Pierre takes to Reddit after his duel with Dolokhov to air his grievances. Another technique is “shipping”: inventing relationships between (usually same-sex) characters. The most common such couplings are Kirk/Spock, Holmes/Watson, Starsky/Hutch, and Harry Styles and Louis Tomlinson of One Direction. There is also a niche sub-genre detailing the love affair between Donald Trump and Shrek (“Trump suddenly felt strong arms surrounding him, steadying him and saving him”).

Bond argues that these fan subcultures are richly meaningful and, for the most part, healthy. As he points out, diehard sports fans are rarely assumed to be overly fixated or fanatical, in the way that sci-fi fans or cosplayers often are. Pathological obsession among fans gets disproportionate media attention, but is rare. Many fans do experience an illusion of intimacy with the object of their devotion. But psychological research suggests that people who form such “parasocial relationships” with imaginary characters or unavailable stars are often empathetic, and just as good at forming relationships in real life.

Parasocial relationships can offer the temporarily isolated a sense of belonging, a chance to dip their toes in the waters of sociability and “to darn the holes in their social fabric”. A woman whose husband had recently died of cancer, for instance, developed a crush on the singer Josh Groban that made her see how she might be able to love again. Cosplaying can also be a way of experimenting fruitfully with other identities. Several hundred furries had their personalities tested while they were “in fursona” (in their furry persona). They scored significantly higher than in their everyday personalities on factors such as agreeableness, emotional stability and openness to new experiences.

Sometimes Bond’s eagerness to see fans’ behaviour as normal can take him into uncomfortable areas. In a section on the thousands of fans of the Columbine High School killers Eric Harris and Dylan Klebold, who call themselves “Columbiners”, he argues that their motives are “surprisingly benign” and that “they mean no disrespect to the victims”. Most Columbiners, he says, identify not with Harris and Klebold’s crimes but with their sense of themselves as outsiders, though he does concede, in what should surely have been more than an aside, that “a tiny minority of their members aspire to be mass murderers”. He skates over this difficult territory with a bland formulation: “In serial-killer and school-shooter fandoms, there is little to celebrate, but there is a great deal to contemplate.”

This book would have been stronger with a tighter and more contemporary focus, perhaps on how fandom is changing in an online age. Its scope is wide, covering everything from Janeites (Jane Austen fans) to Japanese anime enthusiasts. One chapter centres on therians: people who believe that they are an animal trapped inside a human body. Bond says that therians are “on a quest for identity, meaning and a sense of belonging just like Janeites, Trekkies and Potterheads”. But are they “fans” of the bears, wolves or big cats with whom they identify? It seems a stretch.

The tone of Fans feels a little undecided, caught between sympathy and satire. Bond seems to be very good at talking to fans, winning their trust and drawing them out. But he also wants to find humour in their unconventionality, such as when he pokes gentle fun at the online meeting of the Richard III Society where “most were not yet up to speed with the finer points of video conferencing – the mute button seemed particularly challenging”. A former New Scientist editor and writer who has written other books on human behaviour, he is skilled at summarizing and synthesizing psychological theories. But this book seems poised somewhat uneasily between a serious scholarly study and a more personal journey around cultures of fandom.

In truth, there is a substantial body of scholarship in cultural studies and sociology produced since the 1990s already pointing out what Bond discovers: that fandom is a creative, dynamic and communal process. He skirts fairly lightly over this scholarship, preferring to talk to fan scholars over Zoom rather than delve too deeply into their works. The book succeeds best when it simply reports from the mostly unironic, beguilingly weird world of extreme fandom. Here it becomes a celebration of human idiosyncrasy and our talent for building shared meaning and solidarity out of the strangest material.

Noli timere

[Published in Hinterland magazine, issue 12 (2022)]

On 13 November 2020, a leaving do was held at No 10 Downing Street, in a room so crowded that people were perched on each other’s laps. Later that evening, Abba’s ‘The Winner Takes It All’ was heard coming from a party in the prime minister’s flat. On 11 December, No 10 took delivery of a new drinks fridge for the regular ‘wine-time Fridays’. On 18 December, around fifty people attended a Christmas party, with cheese and wine and the exchange of Secret Santa gifts.

Meanwhile my closest friend was dying. In those months, my only contact with her was a series of ever more desultory phone conversations, when she was either worn out and dreamy or high on steroids. At her funeral, in early February, I was one of exactly thirty mourners. Everyone wore masks and no one hugged. At the end we briefly milled around outside, before the next thirty mourners from the next funeral came out. It was so cold that, when I looked down at my hands, I saw that my knuckles were bleeding.

When news of the lockdown parties broke, the most enraged were the bereaved – the ones who had been forced to say their farewells through windows in care homes or on iPads, or watch funerals on live streams on their laptops. Those defending the parties didn’t understand, or pretended not to. People had been working hard, they said, and we needed to maintain morale.

*

Ever since our ancestors first sprinkled ochre on bodies 40,000 years ago, and buried them with favoured objects and adornments, we have needed rituals for the dying and the dead. Gathering round the bed to accompany those in their last hours on their voyage into the unknown, then cleansing and purifying the body, holding wakes, keening, eulogising, taking turns to shovel dirt into the grave, sitting shiva, breaking bread. The rituals wash over us and relieve us of the duty to think. They help to fill the void of unmeaning left when someone we love simply, shockingly, ceases to exist.

We share these rituals with other animals. One night in the summer of 1941, while watching a sett, the nature writer Brian Vesey-Fitzgerald saw a badger funeral. A sow and her son improvised a grave from an old rabbit’s warren, dragged and heaved an older male into it, then roofed it with earth. The whole ceremony, throughout which they howled and whimpered and touched noses, lasted seven hours. The scientist and conservationist Cynthia Moss, who has studied elephants in Kenya’s Amboseli National Park since 1972, has seen them covering dead members of their herd with leaves and branches and standing vigil, then returning much later to stroke their bones. The Roman author Aelian observed the same elephant rituals in the third century. The animal behaviourist Marc Bekoff once witnessed the funeral of a magpie hit by a car. Four birds stood silently over the body, then flew off and brought back grass, twigs and pine needles to place beside it, like a wreath. After bowing their heads for a few seconds, they flew off.

Some scientists think that calling these things ‘funerals’ is just mushy anthropomorphism. You can observe animal behaviour, they say, but you can’t prove what feelings lie behind it. The biologist E. O. Wilson noticed that when an ant dies, it lies ignored for two days. Then, when its body starts to release oleic acid, another ant carries it to a refuse pile of dead ants, the ant version of a graveyard. When Wilson applied oleic acid to a live ant’s body and returned it to an ant trail, that ant was also carried off on another’s back to the graveyard, struggling all the while.

I suppose a strict behaviourist would see the grieving rituals of all animals – perhaps even humans – like this, as a matter of chemical triggers and blind instinct. But I have seen a group of horses in a field with heads bowed over their dead comrade, and I know what I saw. Other animals can tell us something about why we have to say goodbye to those we have lost, even though we know it changes nothing. We need to be with our dying and our dead, and when we can’t be, it feels as if a hole has been rent in the fabric of the universe.

For months after my friend’s death, a line from a Paddy McAloon song, ‘The Old Magician’, kept popping into my head: Death is a lousy disappearing act. Things felt oddly dulled and affectless, as if the normal course of grief had stalled in the general surrealism of daily life in lockdown. Life went on, but laboriously. My brain felt like an old computer that more or less works, but that takes ages to boot up and keeps freezing because of all the old programs and temporary files running in the background. It occurred to me that at some point the computer would stop working altogether, and those feelings, whirring away uselessly underneath, would have to be faced.

*

We owe to Sigmund Freud the now common idea of grief as an arduous road we must walk undeviatingly along. Bereaved people, he writes in his essay ‘Mourning and melancholia’, cleave so tightly to the memory of their lost beloved that ‘a turning away from reality ensues’. Mourning demands Trauerarbeit or grief work: the hard labour of severing the ties that bind us to them. It means slowly conceding the truth that they are now, in Freud’s unforgiving phrase, a ‘non-existent object’.

I could hardly begin to think of my friend as not existing. For the last year of her life, I knew her the same way I knew almost everyone else – as a digital ghost, an incorporeal intelligence spirited through the air. And then, like a switch being tripped, she wasn’t even that. How easy it was for me to believe that she had just mislaid her phone, the one with the scuffed Cath Kidston case, or forgotten to charge it. Or that she was somewhere with dodgy reception and was wandering around in the garden holding it up, trying to pick up a single bar, and at any moment her thumbs would start dancing on its little screen, she would press send and her name would pop up again. Sorry for the radio silence, she would say – as people always say.

Even now, I still think an email might arrive with a friendly ping, and it will be her. Or that one of those little grunts when my phone shudders on silent might be her sending a text. Or that she might chip in to our WhatsApp group in the way that long-time lurkers suddenly and weirdly have something to say, and remind us with a jolt that they exist.

To Dr Freud, all this is just denial. Must do better with your grief work, he would say. See me after class.

*

Things carry on existing even when we can no longer see or hear them. We aren’t born with this knowledge; we need to learn it. The psychologist Jean Piaget gave the name ‘object permanence’ to this awareness that a thing might live on even when it is absent to us. Piaget observed the reactions of infants when their favourite toy was covered with a blanket. Babies think the toy has gone for good. They seem briefly puzzled or sad, but then quickly give up on it. From around eight months old, though, they start to realise that the toy is just hidden. This new-found knowledge overlaps with the first wave of separation anxiety. Once a child grasps object permanence, the world becomes a more complicated and scarier place. The child knows when someone isn’t there, but doesn’t know when, or if, they will return. A parent in the next room might as well be on the moon.

Infants develop ways of coping with separation anxiety. In September 1915, Freud was staying at his daughter Sophie’s house in Hamburg and was watching his 18-month-old grandson, Ernst, play a game of his own invention. Ernst would throw a wooden reel with a piece of string coiled around it out of his cot, exclaiming ‘Oo!’. When he yanked on the string to bring the reel back into view, he uttered a gleeful ‘Ah!’. Freud heard these sounds as infantile approximations of the German fort, ‘gone’, and da, ‘there’. In the fort-da game, Ernst was symbolically commuting an unhappy situation, in which he had no control over the presence of his mother, into a happy one in which he could call her up at will.

Later on, we learn a harder truth. Human objects are not, in fact, permanent, and sometimes they will leave, never to return.

*

Our modern faith is the quest for perfect connectivity. These days it is as near as the godless get to the promise of eternal life. When the telegraph and the telephone were invented, the Victorians saw them as the electrical equivalents of that then-voguish pseudo-science, telepathy. These inventions seemed to fulfil the same dream of contact with distant others. Our online world is the culmination of that dream. It offers up an antidote to the depressing laws of physics which say that a human body is time-limited and gravity-bound.

This faith is fuelled by the market’s unquenchable hunger for harvestable data. The online world cannot conceive that anything could end. There will always be another update or notification, another drop in the self-replenishing drip feed of gossip and comment. All you need do is keep scrolling, dragging down to refresh, searching for the dopamine hit, the virtual hug that comes from being liked and shared. The dead live on in their undeleted social media accounts, still flogging their CVs and freelance pitches, their holiday photos, their pictures of long-consumed meals about to be eaten, their thoughts on Brexit and vaccines and facemasks and Love Island. Everything online feels ongoing, as if death were a temporary bandwidth problem. Online, we think that things will be solved by saying them, by declaring our feelings and having them validated. Thank you for sharing, we say, because saying anything is always preferable to saying nothing.

In the early days of the first lockdown, a Second World War veteran in a care home was filmed in tears after being given a cushion with his late wife’s face on it. Thousands shared the film online. This man had been sleeping with a picture of his wife in a frame, so his carer, worried he might cut himself if the glass broke, had the cushion made. A kind thing to do, of course. But why did it need to be filmed so that the sight of him crying could go viral? A stranger’s tears allow us to think that something has been fixed. Click on the link, feel the warm glow of empathy by proxy, have a little cry yourself and then go back to your own life. Online, shedding and witnessing tears is seen as healthy, cathartic and semi-compulsory. But since when were tears ever a guarantor of sincerity or depth of feeling? Often, what provokes them has nothing to do with whatever is really making us sad. The other day, I cried when I couldn’t tear the cellophane off a box of tea bags.

*

In Wonderworks, Angus Fletcher explores the neuropsychology of grief. Almost all of us, he writes, feel that it’s wrong to stop grieving and move on with the rest of our lives. At the heart of this feeling lies guilt. Guilt’s function is to keep a check on our relationships with others and raise the alarm when rifts form. The death of a loved one scrambles this system. Our brains sense their absence and warn us to heal the rift. But how can we, when they are no longer there? This is why our ancestors created funeral rites, which offer gifts to the dead in the shape of words, music and formalised gesture. They help a little, but these rituals can also feel empty and … well, ritualistic. Their stock utterances and choreographed moves can never account for the infinite particularity, the limitless heterogeneity, of the person who has gone. And that person can no longer relieve our guilt by accepting the gift of remembrance anyway, and telling us not to worry.

In the middle of a pandemic, with death a grim statistic on the nightly news, I worried that the uniqueness of my friend, the stubbornly singular way she took up space in the world, was being lost in the weight of numbers. But I also disliked the idea of sharing her with strangers, of posting some online tribute that followers of ever-diminishing proximity to her could answer with comments, likes and emojis. Why should she have to fight for space in the jarring juxtapositions of a Twitter timeline, sandwiched between someone saying I’m so fucking angry and someone else saying I’ve been promoted? I didn’t want her to become part of the noise.

We need time, Fletcher says, to move beyond the necessary bromides and expedient clichés of memorials. We have to acknowledge that the person we have lost has left behind a human-shaped hole moulded to their precise dimensions, one that nothing and no one else will fill. Our brains must slowly absorb this brain-melting paradox of being human – that set against the billions of other selves who have lived and died, a single life doesn’t matter much, and yet it matters beyond words.

*

Grief today comes with a script, and the script says this: it will obliterate us and then, painfully, remake us. Its trauma will be worth withstanding because it will teach us something important about ourselves. Even grief, in other words, has been co-opted into the progress myths and redemptive arcs of the personal growth industry. But why should the worst pain of all be turned into an opportunity for self-improvement? Whatever doesn’t kill you makes you stronger, they say – even though a moment’s serious thought reveals this to be nonsense.

In All the Lives We Ever Lived, Katharine Smyth writes about the death of her father from bladder cancer at the age of 59. After this long-anticipated event, her days just felt ‘vague and muffled’ – not the required response in our culture of ‘grief worship’. Smyth wonders if we overinvest in the idea that grief floors us and changes us for ever. Instead, in its tedium and monotony, it ‘recalls to us our impotence, reminds us that our longing counts for nothing’.

The slightly shaming truth is that grief is an anti-climax. If this is grief work, I remember thinking in the weeks after my friend’s death, then it is the dullest desk job imaginable. The schedule isn’t onerous or stressful; I seem to spend most of my time clockwatching and staring out of the window. But the hours are long, there is no annual leave, and I don’t know when I can hand in my notice.

*

I look at my text conversations with her, laid out on my phone. A long daisy chain of words, saying where are you? and sorry I had to rush off and I’m running behind as usual! and are you ok? and I hope things are feeling a bit less shit. One of the designers of the first iPhone conceived this idea of putting all our conversations in a thread, with different-coloured speech bubbles to the right (me) and left (her). Every other phone manufacturer copied it and it became part of the invisible grammar shoring up our remote interactions.

The whole thread unspools like a two-hander, with both characters at first oblivious as to what’s coming next and then all too aware but desperate to talk about anything else. Here she is, after the diagnosis, sending me a picture of an injured oystercatcher she has rescued, convalescing in a cardboard-box nest in her garden, its long orange beak peeking out sadly from the opening torn out of one side. Or excitedly sharing a picture of her old Brownie Collector’s Badge, found in a clearout. Then come the texts that say she is feeling a bit rubbish, rallying a bit now. And sorry for going on and on, earlier. I could blame it on the drugs but you’ve known me too long.

Then my words are parried with a bald sorry, I can’t talk right now or can I call you later? When I got the first of those messages, my stomach dropped. She had never been so curt before; had someone stolen her phone? Then I twigged that it was an automated message, selected from a drop-down menu when she didn’t have the energy to pick up. At the end comes a salvo of messages from me, with no reply. The ping-pong pattern on my phone fools me into thinking that a message will appear on the left, with her initial, but it never does.

When online daters cut off contact with someone they have been seeing by simply ignoring their messages, it is called ‘ghosting’. What compounds the sense of abandonment, I assume, is that the speech-balloon format already implies a reply. When you are ghosted, it feels like someone turned away from you in mid-conversation and walked silently out of the room. Our age venerates interactivity. We think that every message deserves an answer, that no conversation need ever end. Back in the real world, though, plotlines peter out, conversations tail off and ends stay loose. Those deathless parting words we write and rewrite in our heads turn in reality into something bland and adamantly cheerful. The messages get briefer and more perfunctory and then, without warning, stop. But that’s OK, or should be. Life is not a TV police procedural where, if the ending feels like a cop-out or doesn’t tie up all the threads, you berate yourself for wasting twelve hours of your life. Our lives were precious anyway; they are not defined by the leaving of them.

*

Nowadays clinical psychologists don’t tend to think like Freud. Grief, they have found, is not some exam we have to resit repeatedly until we learn to accept reality. Nor does it parse itself into neat, self-contained steps, from denial to bargaining to acceptance. Grief has no universal symptoms and no obligatory stages to be got through, like levels of a computer game. Most of us turn out to be fairly resilient in the face of loss. It doesn’t tear us asunder. It just makes us feel wiped out and wobbly, and we wonder if this is really grief or we are doing it wrong. Grief has no script; we are all just making it up as we go along.

Grief is not an unwavering line we follow, a long lesson in letting go. Maybe some of us never accept that the person we are grieving for no longer exists, and we learn to live with not accepting it. This is not magical thinking. We know that they are utterly gone, and for ever – but there they still are, brightly and intensely alive in our heads. In many cultures, the living see the dead as vividly present. On the Mexican Day of the Dead, they welcome their departed, always alive in memory and spirit, back to the earth. Maybe these rituals are on to something. Maybe grief, as a universal human dilemma, is something that evolution has hard-wired us to be able to handle, and this is one of the workarounds we have devised.

Something imagined is still real, because the imagination is real. Most of what matters to us happens in our heads, in that supercomputer made of fat and protein between our ears. The unlived life is also life. I am forever conducting made-up conversations with others, endless rehearsals without a performance, or rehashings of exchanges that went wrong, trying to do it better next time, even when I know there will never be a next time. That doesn’t seem so different from the messages I compose to my friend in my head, which one day I might send zipping at the speed of light to that phone with the scuffed Cath Kidston case, which is probably still in some drawer somewhere, powered down but ready to sputter into life and pick up my unread messages. In our brains, synapses fire, chemicals react, electricity fizzes, new neural pathways form. What happens in our brains happens in the world, because our brains are part of the world.

*

At the end of August 2013, Seamus Heaney was leaving a restaurant with a friend in Dublin, when he stumbled on the steps and banged his head. He was admitted to hospital, where the doctors found a split aorta, requiring a serious operation. After Heaney left for the operating room, he sent his wife Marie a text. It contained the Latin words Noli timere. Do not be afraid.

Heaney died shortly afterwards, before the operation even took place. His son Michael revealed these last words during his funeral eulogy, inspiring a flurry of tears in the congregation. Heaney had loved Latin ever since, as a small boy, he heard his mother rhyme off the Latin prefixes and suffixes she’d learned at school. He was old enough to have attended, in the days before Vatican II, daily Latin masses at St Columb’s, the Catholic grammar school he went to on a scholarship, and it was in its classrooms, where Latin was taught in imitation of the English public schools, that his lifelong love of Virgil began. He admired the relentless logic of the language, its economy and precision, its neat conflation of analytical and emotional truth. Noli timere says with two words what English needs four to say, or, in the King James Bible, three: Be not afraid. Two words are easier to type when you’re being wheeled to theatre on a trolley.

Noli timere appears about seventy times in St Jerome’s Vulgate Bible. Often God says it, while trying to calm a human understandably freaked out at His presence. The angel says it to the shepherds before bringing them news of the birth of Jesus. Jesus says it to the disciples when he walks on water, and when he meets them after rising from the dead. The force of Noli timere derives from its blend of clear instruction and gentle assurance.

Heaney’s texted words weren’t as lapidary or final as all that. He was expected to recover from the operation, and often used Latin with his family as a private, joking language. His use of the singular form noli timere instead of the plural nolite timere suggests the message was personal, meant only for his wife. But the family, Michael later wrote, ‘seized on his final words as a kind of lifebuoy’. It seemed to them that he had captured ‘the swirl of emotion, uncertainty and fear he was facing at the end, and articulated it in a restrained yet inspiring way’. Noli timere was a last act of kindness, a spell to help those left behind to grieve.

In the days and weeks after Heaney’s death, Noli timere appeared in all the obituaries and tributes. The two words became a shorthand for the power of language to help us survive our losses; they did what Heaney had been doing for more than half a century: cutting cleanly through banality and piercing the heart with sturdy, well-shaped words. The graffiti artist Maser painted the words, in English, on a gable end wall in Portobello, Dublin (although an all-caps DON’T BE AFRAID in massive white letters is not perhaps as reassuring as it is trying to be). The phrase now speaks a little more forlornly in our fretful and fractious new world, from which Heaney was spared.

A few months after my friend died, this story about Heaney’s last words came into my head. It felt like a little chink of light to walk towards, a source of solace and hope. It made me see that a virtual goodbye could still be beautiful, that a message sent through the ether might mean even more for being so intangible and precarious. The written word can be an outstretched hand across the abyss; it can walk through walls.

*

The most common way of thinking about our online lives, even among people who spend most of their lives online, is that they are unreal. To be online is to be disembodied, reduced to eyes and fingertips, occupying some elusive other realm, made of air and vapour. The web is eating up our lives, we fear, and disgorging them as a waking dream, or fooling us into thinking that some better life is being lived elsewhere, just out of reach. We have gone down a rabbit hole of our own making. If it weren’t all so addictive, we would come to our senses, power down our devices and return to the three-dimensional sensorium of real life.

But that’s not right, is it? I mean, we should probably spend less time doomscrolling and hate-reading and getting pointlessly angry with strangers. And maybe we shouldn’t lie in the dark so much after midnight, kept awake by the flickering light of our phones and the adrenaline rush that comes from eavesdropping on the babble of other people’s egos begging to be affirmed. But still, our online lives are also our lives – extensions of our humanity, not some pitiable stand-in for it. Man is an animal, the anthropologist Clifford Geertz once wrote, suspended in webs of significance he himself has spun.

Like us, the online world is both physical and ethereal. It is made of wireless routers, modems with blinking lights, vast data centres in unmarked buildings full of humming hard drives and glass fibres inside copper tubes, and hundreds of thousands of miles of cable, buried alongside roads and railways and crossing ocean floors, occasionally nibbled at by sharks. And it is also made of our lusts and rages and fears and desires, which are more real to us than our livers and kidneys.

And what do we learn in this virtual world that is nowhere near as virtual as we think? Only that the deepest connections between us are the most fragile, because they are made of that filigree web of meaning and mutual care we all spin together. And we also learn that absence needn’t mean obliteration, that someone you can’t see or hear can still exist – that objects, even human objects, have permanence.

*

I wonder how all those people who had to say their final goodbyes on FaceTime are doing now. I imagine they were as angry as I was when the gaslighting sociopaths of Downing Street told them to draw a line and move on. I assume they felt the same guilt that I did about obeying the rules when the rulemakers didn’t. We feel guilty, as Elaine Pagels says, because it is more bearable than feeling helpless, than falling through an unending chasm of meaninglessness. It follows that guilt is assuaged when we find meaning again. So I hope those people have come round to the idea that their online partings were still meaningful, and that they did, after all, convey something profound about human love – that the ties that bind us are as tenuous and transient as life itself, and yet they are made of the strongest material in the universe.

After all those texts I sent with no reply, I did get a final message. She must have found, on her phone, one of those firms that send flowers in a slim cardboard package that fits through your letterbox. On the day she died, a dozen stems of solidago and alstroemeria arrived, with a card. All love to you my most amazing friend, it said. It seems unfair that men don’t get given things that smell nice. I wonder if, like me, she was thankful in the end for our soullessly algorithmic online world – the one that requires us only to swipe and prod a glass screen to magic up flower pickers and delivery drivers and have our words, tapped out with our thumbs, transcribed in cards with handwritten fonts. I finally threw the flowers out after a month, when all but a few of the petals had shrivelled and shed. But I still have the card – my own Noli timere.

My idea of happiness

A poem made out of the headlines to Adrian Chiles’s articles in the Guardian

I recently saw something

in a petrol station toilet

Southbound on the M1

that I can never unsee.

I spent an afternoon

writing my own name.

It was lovely

until I started overthinking it.

What is an app?

I honestly have no idea.

After a meeting that went on for hours,

I was finally told what it was all about.

I was being interviewed

for a job at MI5.

Do I really need to drink

almost 5 litres of water a day?

I haven’t got the bladder for it.

I almost downloaded a pebble-identifying app,

but some stones should be left unturned.

Would you pay £15,000 for a portrait of me?

Me neither.

My idea of happiness?

A strimmer and a bramble-choked path.

Gen Z and Me

This is a longer version of the piece I wrote on Generation Z for the London Review of Books:

On the walk from the car park to my university building sits a red telephone box, classic K6 model. The other day, out of curiosity, I pulled at its heavy cast iron door, stepped inside, and let the door thud behind me. It must be fifteen years, at least, since I last experienced that strange dampening of the sounds of the street and that smell of stale urine and old takeaways. For a moment, the phone box became a TARDIS and I was a homesick student ringing my parents again, harassed by the pips that demanded more coins and the lonely finality of the purring dial tone when it cut me off. It reminded me that I am a digital immigrant, raised in a clunkier, analogue age, when long-distance communication felt fragile, precarious, interruptible.

I was an early inhabitant of the online world. I remember using Netscape Navigator, one of the first web browsers, in a computer room at the University of Sussex in autumn 1994. I have been an Amazon customer (its website reminds me) since 1999, longer than most of the students I teach have been alive. With mobile devices, though, I was on the other side of the adoption curve. No message seemed so urgent to me that you had to carry round the mailbox in your pocket. I bowed to the inevitable in 2004, with one of those entry-level Nokias, all rounded plastic and chunky buttons. I still had it three years later when I saw that film of Steve Jobs at the Macworld Expo, showing the audience his new phone. No one will want to search the web on something as small and fiddly as that, I thought. It will never catch on.

Even now, I get so few messages on my garden-variety smartphone that I forget to charge it, or don’t look at it for days. Being mildly dyspraxic and very myopic, I prod at it ponderously and with great emphasis, as if expecting it not to respond. Often it punishes my lack of faith by disowning me, refusing to recognise my thumbprint and unlock itself. My students’ phones are often in their hand, usually on their person, and always within reach. They swipe, pinch and caress them like virtuosos.

Most of this year’s new crop of undergraduates were born between September 2003 and August 2004, the year Eats, Shoots & Leaves was published, the Hutton inquiry reported and Channel 4 aired the last episode of Friends. If you find this information as uncomputable as I do, you’re probably about my age. More significantly, these students were two or three years old when Jobs launched the iPhone. In Gen Z, Explained: The Art of Living in a Digital Age (University of Chicago Press), an anthropologist (Roberta Katz), a linguist (Sarah Ogilvie), a historian (Jane Shaw) and a sociologist (Linda Woodhead) try to understand this peer group of digital natives. They define Generation Z, also called Zoomers or post-millennials, as those born between 1995 and 2009. Even the oldest members of this group have no memory of a world without broadband.

The key fact for Katz et al. is that Gen Zers have had to navigate this new online reality without the aid of their mainly clueless elders, and have thus improvised their own rich and hard-to-penetrate subcultures. What they mostly like to do is collaborate in leaderless groups. They use digital tools to create shared documents, synch their calendars, write and read fan fiction, play games together, and organise real-world lift sharing, couch surfing and political activism.

They have devised an intricate language and etiquette for their online lives. They can quickly convey their pleasure or displeasure through memes – such as the ubiquitous Drake Yes/No meme, made up of two stills from the rapper’s video for ‘Hotline Bling’: in one he holds his hand up to his face in disgust, in the other he looks happy. They use emojis as ‘a softener and a social lubricant’, and bracket words with asterisks and tildes for emphasis and irony. Whether they write ‘k’ or ‘kk’ to mean ‘okay’ is charged with meaning. The first is purposely curt, especially if the sender has taken the trouble to override the default capitalisation, and still more so if they add a passive-aggressive full stop. The second is cheerful and casual, a no-sweat way to temper the brusqueness of the single letter.

These tonal shadings matter because post-millennials like to state clearly where they are coming from. Self-labelling, especially of fine-grained sexual and gendered identities, has become ‘an imperative that is impossible to escape’. They think it important to be themselves, to admit to their struggles and vulnerabilities, and to say what they mean. In the iGen Corpus, a digital data bank of seventy million words used by post-millennials and compiled by Ogilvie, words such as real, true, honest and fake occur far more often than in general language use.

The book’s findings are mostly based on interviews with students at three institutions: Stanford, Foothill Community College (a few miles from Stanford in northern California), and the University of Lancaster. In a world where so many things compete for their attention, these students worry about allocating their time efficiently. They dislike email, finding it laborious compared to texting and messaging. ‘If it’s a professor you don’t have a close relationship with, you have to say, hi professor whatever, I’m in your class or I’m interested in this blah blah blah,’ one student says. ‘You have to kind of frame it.’ Several of the students surveyed watch recorded lectures at triple speed – not just to save time, one of them says, but to help them concentrate.

All this is useful, if disconcerting, for a university teacher to get learnt. I was less convinced by the book’s basic premise: that the new technology so enculturates its young users that it has created entirely new ways of thinking and being. The book first emerged in a conversation between the authors on the Stanford campus in 2016, when they agreed that ‘incoming students were strikingly different from those from a few years before’. Gen Zers, they argue in their introduction, ‘are shaped by and encounter the world in a radically different way from those who know what life was like without the internet’.

The book’s title carries this sense of interpreting to non-initiates the behaviour of a separate tribe, albeit one whose habits are increasingly being adopted by other age groups. While I was reading it, a phrase of the child psychologist David Elkind’s sprang to mind: ‘cognitive aliens’. Elkind was discussing the work of the Swiss psychologist Jean Piaget, which revealed how contrarily young children see the world – believing, for instance, that the sun and moon follow them as they walk around, that anything that moves, from clouds to cars, is alive, and that dreams fly in through their window at night. For Elkind, Piaget’s work suggested that the main problem in education is communication. The child’s mind is not a tabula rasa but its own rival system for generating reality. Every middle-aged teacher has had a related fear that their students now dwell in an unreachable mental landscape. But the stories in Gen Z, Explained don’t always sustain its initial claim that post-millennials think and behave in very different ways.

Nearly all those interviewed for the book still say that their favourite mode of communication is ‘in person’. Every era thinks that its technology has changed everything utterly, but human instincts, after 300,000 years of evolution, must be pretty resilient. My students check their phones almost as often as they blink, but isn’t that just because we are inescapably social animals? I would check my phone all the time, too, if anyone ever sent me any messages.

Social networking and the smartphone do seem to have made young people more willing to make intimate feelings public. The students in Gen Z, Explained post pictures of their ‘depression meals’ (which can range from a comforting Deliveroo order to a mishmash of whatever food they can find) as a signal that they are feeling low. But they also make clear that this kind of sharing is made possible by distance. One interviewee says that he can post to strangers without ‘worrying that you’re adding some emotional toll to them … whereas your friends are sort of obligated to help you’. Post-millennials are perfectly aware of the boundaries between online and offline life; they just draw them in subtly different ways. A surprising finding in Gen Z, Explained is that it is now a common courtesy to ask permission from friends before posting a picture in which they appear. Those interviewed are also well-attuned to the paradox of having more voice than ever before online, while often feeling powerless IRL (‘in real life’) to change economic and political systems that seem ‘locked, inaccessible to them, and wrongheaded’.

Everyone looks like a maestro when they’re using technology you’re unversed in. If a time traveller from the 1990s arrived in the present, they would marvel at the effortless aplomb with which people of all ages manipulate their touchscreens, talk to their digital assistants and wave contactless cards at readers (unless they are Rishi Sunak, who finds the last one difficult). In a mere fifteen years, smartphones have become the central technology of daily life around the world. I had a colleague twenty years my senior who retired to Portugal and went wholly and impenitently offline, with not even a mobile number to reach him on. The audacity of it! In our age of hyper-connection, he might as well have sailed off the edge of the world.

In Generations: Does When You’re Born Shape Who You Are? (Atlantic Books), Bobby Duffy argues that our currently polarising discourse about generational difference is ‘a mixture of fabricated battles and tiresome clichés’. Duffy, director of the Policy Institute at King’s College London, likes to use an abundance of quantitative statistics and qualitative surveys to challenge common stereotypes and perceptions (his previous book was called The Perils of Perception). Generations thus avoids a charge that could be levelled at Gen Z, Explained – that its conclusions mostly rest on a selective and overeducated sample (although it does supplement its student interviews with a representative online survey of 2000 young adults in the US and UK). Duffy’s book is not as alive with anecdote and illustration as Gen Z, Explained, but it deploys a barrage of data to reveal the messier and more interesting reality behind popular myths.

It’s true, he says, that age has become more of a political dividing line, over issues such as Brexit, racial and gendered injustice and privilege, and climate change. It’s also true that social media’s silos can make it harder for generations to converse with each other across that line. Half of post-millennials use Snapchat, but only 1 per cent of baby boomers – although some apps, like Facebook and WhatsApp, do better at cutting across age divides. But our fractious politics and online squabbles have created a false impression of post-millennial woke warriors and baby-boomer reactionaries at war. Family links remain stronger than our links to our peers. Lockdown compliance among young people was high partly because they wanted to protect their parents and grandparents from the virus. If anything, as I see on open days and at graduations in the sweetly close bonds that students have with their parents, the generation gap has narrowed. Those interviewed in Gen Z, Explained say that they often call or message a parent – usually the mother – several times a day, or send them pictures, especially of meals.

Sociologists use three explanations for why people’s attitudes and behaviours change over time: period effects, lifecycle effects and cohort effects. Period effects are when change happens across all age groups, because of sweeping societal shifts. Lifecycle effects are when change happens because of the ageing process, or in response to key events such as leaving home, becoming a parent or retiring. Cohort effects are when change happens because a generation is socialised by the same experiences. Duffy thinks that the current discussion of generations attributes too much to cohort effects and not enough to period and lifecycle effects.

Generations that seem atypical when they are young tend to revert to a familiar life course as they age. For instance, post-millennials are accused, like many cohorts before them, of being individualistic and materialistic. To the extent that this is true, it is a lifecycle effect, a youthful trait that people grow out of as they take on the responsibilities of work and family. Post-millennials are also around twice as likely as older people to say they feel lonely, but we need to remember that they are at a stage of life when socialising feels compulsory and isolation cuts deep.

When you factor in lifecycle and period effects, generational change looks more nuanced. My students will eventually have to make their truce with email, not just because I tiresomely insist on emailing them, but because it will remain the default form of communication in graduate employment. A cohort effect will become a lifecycle effect. What Gen Z, Explained claims as a cohort effect – the value young people place on being open and authentic – seems to me more of a period effect. In her book Family Secrets, the historian Deborah Cohen argues that a ‘modern age of confession’ has been slowly emerging in Britain since the 1930s, as attitudes towards divorce, illegitimacy, homosexuality, infidelity, mental disability and other aspects of life once kept as shameful secrets have changed. Transparent self-narration has come to be seen as the key to psychological well-being and a healthy public life. Generation Z’s attitudes are part of a long-term trend towards the valuing (even over-valuing) of emotional candour and empathetic connection.

One symptom of this trend is the irresistible rise of relatable, a word I have been trying for at least a decade to get students to stop writing in their essays. One day they all just started using it at once, as if there had been a meeting about it in my absence. Again and again, they commended a text, character or theme for being relatable. Easy to relate to, they meant. Relatable to what? I would ask in the margin, perhaps too gruffly. I did not care for this voguish word, which seemed to demand that literature should always mirror our own lives, instead of being a portal into the implacable strangeness of other lives.

Needless to say, my war against relatable has ended in bitter defeat, with my pedants’ army routed and fleeing for the hills. Ogilvie’s iGen Corpus reveals much higher usage of this word among young people, but that is surely shifting. When I saw an interview with Patti Smith (born 1946) in which she described her song ‘Because the Night’ as ‘very relatable’, I knew the game was up. Fair enough. Language is always changing and young people are always at the vanguard. Anyone trying to counter the hegemony of relatable calls to mind the Grampa Simpson meme used to mock baby boomers railing against change: Old man yells at cloud. One of these days I may even start using relatable myself. That is how language works, and how cohort effects become period effects.

The most profound recent generational change, for Duffy, has nothing to do with technology. It is the phenomenon of ‘delayed adulthood’. Key life stages, such as leaving home, getting a stable job and moving into a place of one’s own, are happening much later. This partly stems from people staying longer in education but mostly stems from the low wages, precarious employment, debt and housing problems created by austerity. An emblematic contemporary figure is the university graduate sleeping in the single bed of their childhood bedroom. Duffy quotes one 28-year-old who has moved back in with her parents: ‘It’s hard to feel like an adult when you’re living with the people who used to brush your teeth.’ The huge growth in private wealth compared to growth in income, largely down to the long housing boom, ensures that advantage and disadvantage will be passed down the generations. Duffy suggests that this betrayal of the intergenerational contract – the promise that each cohort will have a better life than the one before it – is ‘a key reason why people of all ages are more likely to question whether our economic and political systems are working’.

Those with an interest in maintaining the status quo, meanwhile, prefer to treat post-millennials as children. This involves much less effort than addressing the structural causes of delayed adulthood. If young people have the temerity to want a secure job and affordable housing, they are told to grow up and quit whingeing. If they can’t pay their heating bills, it is because they have frittered away their income on Starbucks and Netflix, having failed to learn the grown-up art of delayed gratification. And if they are consumed by woke identity politics and metropolitan Remainer attitudes, then they must have been force-fed these views by their university lecturers.

This now common idea of universities as indoctrination camps for impressionable young minds would not survive long in a university classroom. Why would my students pay attention to my views on Brexit, when I can’t even get them to stop using the word relatable? Teaching is an uncertain affair, full of such humility-inducing failures and miscues. Students have their own ideas about what is worth knowing and retaining, not because they are a tribe apart, but because each one of them is an adult human – unbiddable, unpredictable and, ultimately, indecipherable. My students are not relatable, and neither am I.

We expend so much anxious thought on generations because, as Duffy says, they are ‘interwoven with the fundamentals of human existence and societal change; while individuals are born, live and die, society flows on, changed a little or a lot by our cohort’s presence and then its absence’. It is salutary for people in positions of privilege, like me, to be discombobulated by change, to feel that those younger than us are becoming harder to reach as they pull the rug from under the reality we have helped shape. In his memoir Teacher Man, Frank McCourt writes about the thirty years he spent teaching English in New York high schools. The experience confirmed the truth of what his old professor of education had told him, that ‘it is the function of the young to get rid of their elders, to make room on the planet’. A teacher’s role is to pass something on and get out of the way – to make themselves dispensable.

Still, if you are a teacher of the humanities, you have to believe this: the journey from one brain to another may be the most difficult and circuitous in the universe, but there is still a basic commonality to human experience, and in a classroom you can search for that commonality together. Even cognitive aliens are, in Elkind’s words, ‘emotional countrymen’. If I need reassurance that this is true, I remind myself that I am in all essentials the same person as that homesick student in a phone box: stubborn, needy, self-absorbed, socially unconfident, intellectually arrogant. Since then, I have dumped many once cutting-edge bits of tech in landfill but, in the words of the Tracey Thorn song, the heart remains a child. Now I am an old(ish) man yelling at clouds. But nothing has really happened to me except the passing of time, and no one consulted me about that.