Archive for the ‘Selected Articles’ Category

Tom Sturridge: Not Romantic

Sunday, May 1st, 2016

Tom Sturridge has a dual public identity — as an actor, a very serious one, and as Sienna Miller’s former boyfriend. The acting is going well, thanks to his appearance in the West End last year with Damian Lewis and John Goodman in American Buffalo and now to his performance as Henry VI in The Hollow Crown. This is the second instalment of the BBC’s blockbuster adaptation of Shakespeare’s history plays and will begin on Saturday on BBC2. Sturridge the actor is starting to be considered superb, potentially one of our best.

During the four years he was with Miller their every move was papped, printed, webbed and broadcast. They had a daughter, Marlowe, in 2012. Since they broke up last year there have been stories about them reuniting, but Sturridge is not going to confirm or deny any of this: “I feel I am being led into a conversation about my family and I do feel very protective about both those girls.”

He does say everything is amicable and that both are involved with their daughter’s upbringing. He also says he is looking for an apartment in New York because Miller is thinking of moving there and Marlowe is booked to go to school there in September. Meanwhile, he lives, well, nowhere. “I caught a train at 5am this morning from Dorset, where I’d been with my daughter. I’ll be sleeping at a place just behind Bayswater tonight. Because the move to New York is imminent, it doesn’t make any sense to get a place,” he says.

“You sound gypsyish.”

“That’s too romantic a word, I think.”

He denies he is romantic several times. He’s also single — “definitely”, he insists. Does he want to be single? “No, I’d love to be hanging out with some girl tonight, but I’m not.”

Sitting outside my favourite pub, the Uxbridge Arms in Notting Hill Gate, I see the first sign of his arrival — a trilby hat bobbing along behind parked cars. He’s slender despite drinking Guinness — three pints talking with me — but that may be offset by the Camel cigarettes he smokes. His face is extraordinary, a shifting combination of cold and vulnerable. Evidently he was born to play Henry VI, who is also, with bloody consequences, alternately cold and vulnerable.

He is largely an autodidact, having left school before A-levels. His parents are the director Charles Sturridge and the actress Phoebe Nicholls, who met when he was directing and she was acting in the television series Brideshead Revisited. He evidently adores them and they him, but that wasn’t enough to stop him dropping out of school (Winchester College and the Harrodian School) in what seems to have been a fit of absent-mindedness: “I was lazy, I didn’t turn up quite a lot. My parents were deeply concerned. They’re incredibly emotionally intelligent people and they went through the experience with me.”

To ensure that he didn’t stop learning he was sent to a strange but evidently life-saving meeting with a smart lady whose name he forgets. It worked. He reads all the time, spinning off a list of names — mainly early 20th-century European writers — and is eager to get a list of recommendations from me. That’s the point about intelligent autodidacts: they’re not jaded learners, they stay humble and retain a sense of wonder.

This separates Sturridge from the feted list of public school actors: Cumberbatch, Lewis, Redmayne. He is always starting anew, it will never be just a job, he won’t feel that a part is his by right.

He wasn’t pushed into acting and never thought about it. He didn’t see his mother on stage until his twenties and “she was incredible”. As a boy he appeared in his father’s Gulliver’s Travels on television, but all he remembers of it is talking to his dad “and there must have been a camera somewhere”. Then, during a summer holiday before he left school, he found himself having what he describes as “an incredibly romantic experience”. The great Hungarian director Istvan Szabo was looking for a boy to play Jeremy Irons’s son in his film Being Julia. Sturridge ended up spending the summer living alone in Budapest.

“I was allowed into an adult world in which the interrogation of my thought was part of creating what they were creating. It was sexy, interesting and without doubt it made me think this was a world I wanted to explore.”

What then ensued was an odd sort of stop-start career. He was cast as the young hero in the Hollywood sci-fi flick Jumper, but was dumped in favour of Hayden Christensen. It was thought that betting $100m on a pale, thin and slightly disturbing-looking English kid was not such a great idea. Soon after that he was cast in Richard Curtis’s The Boat That Rocked and last year he played Sergeant Troy in Far from the Madding Crowd. He has just shot A Storm in the Stars in which he plays Lord Byron, with Elle Fanning playing Mary Shelley.

In his own, odd, zigzagging way that all means he has pretty much arrived. Even his dad — who never says he likes anything unless he really likes it — has said that he likes Henry VI.

“Difficult part,” I say, “playing a weak and mad man.”

“Well, yes, unless you are mad and weak yourself,” he says, before settling down to dissect the word “weak”.

“He’s a boy. In the first film he is 17 and I think equating weakness with him is a dangerous direction to go in. He is the son of Henry V, the most heroic and powerful figure in the Shakespearean universe. I saw him as someone who was very young, investigating that position of power during his formative years. That doesn’t mean he was weak.”

As politics, The Hollow Crown makes The Thick of It look like starry-eyed idealism; as television it is a stunning, blood-soaked piece of drama. It’s Sturridge’s first Shakespeare part; he hopes it won’t be his last.

Given his thoughtful, impressionable and sensitive personality, it is astonishing that he survived mentally intact after the brain-curdlingly banal coverage of his relationship with Miller: endless online reports of him having a beard, not having a beard, wearing this, wearing that, having lunch, pushing a buggy, hanging out, hiding away. He was in danger of just becoming Miller’s bloke.

He looks wary when I bring it up; he suspects it is another raid on the Sienna mystery. “If you go on those websites I’m sure you can upset yourself, but it’s quite easy not to go on and I’m not really a well-known figure so I wasn’t really exposed to it.”

All of this, as he says himself, may be a performance. But, reviewing that performance, I’d say Sturridge, in spite of his years of relentless coverage, is an other-worldly figure. Many of his thoughts seem to revolve around a point he never quite reaches. His apparent uncertainty about himself — he never watches his own performances for fear of destroying his confidence — is, of course, an ideal condition for an actor who must, at work, always be somebody else. But, like “those girls”, it needs protecting. The last I see of him is his trilby hat bobbing along behind the cars.

Ageing and Forgetting

Sunday, May 1st, 2016

The End of Memory: A Natural History of Ageing and Alzheimer’s by Jay Ingram

When she died, Sister Mary of the School Sisters of Notre Dame had her brain removed and examined. It was full of plaques and tangles, the signs of advanced Alzheimer’s. Perhaps no surprise there, she was 101; except that, right up to her death, Sister Mary had no symptoms of the disease. Quite the reverse, she had remained smart and alert with memories largely intact until the end.

She was one of 678 subjects of the Nun Study, an American experiment begun in 1986. Many findings are still likely to emerge but two already stand out. First, there were other nuns, like Mary, who had all the physical symptoms of Alzheimer’s but none of the mental ones. Second, incipient Alzheimer’s may be detectable in the early twenties. It just so happened that some of these nuns had been asked to write autobiographical sketches when they were young. Close analysis of these revealed a correlation between low “idea density” — a technical term for overtones and meanings — in the prose and the onset of Alzheimer’s decades later.

These are, as we all now know, urgent matters. Alzheimer’s has supplanted cancer and heart disease as the great medical anxiety of the developed world — 10% of people over 60 and 50% over 85 will suffer some form of dementia, usually Alzheimer’s. This is an epidemic, though there are signs that its progress may not be quite as inexorable as we thought. A recent British study suggested there had been a 20% fall over the last two decades. Men in particular seemed to be less at risk, possibly because of the decline in smoking.

Jay Ingram, a celebrated Canadian science writer and broadcaster, is ahead of the curve here; he notes earlier evidence that dementia may be decreasing in Europe. In fact, he notes everything. This is a thorough, lucid report from the dementia front line. Ingram speaks from experience: his mother probably had Alzheimer’s — it was never diagnosed — and he helped look after an aunt who certainly did. He is now 71 and obviously at risk. This is, as a result, a very engaged book.

What emerges most forcibly is how little we know. Aloysius Alzheimer identified a very early onset version of the condition that was to bear his name when, in 1906, he examined the brain of one of his patients who had died in her fifties. He discovered it had been disrupted by “amyloid” plaques and “neurofibrillary” tangles. The assumption thereafter was that these were what caused Alzheimer’s. In fact, we can’t even say that. They may be the result of the condition and, as the Nun Study showed, they may not cause anything. Nevertheless, the theory that the plaques in particular are the culprits is now dominant and the drugs that companies are pursuing — without success so far — are plaque-preventing or plaque-busting drugs.

There is an even more fundamental problem: is dementia a disease at all? The single strangest medical fact about the modern world is the astounding pace of the increase in life expectancy: since 1840 in some countries it has increased at the rate of one year in every four. Again, we do not know the causes, but we do know that ageing populations are creating a care crisis. Improving the general health of the old is one solution, but it may be that nothing can be done for the ageing brain. Perhaps dementia is simply what happens to old brains — this was the belief before it was identified as a medical condition — in which case big increases in life expectancy may actually be very bad news indeed.

Ingram somehow manages to provide an entertaining guide to the oddities that Alzheimer’s research has uncovered. One possibly significant fact is that our closest relatives — the other primates — never get Alzheimer’s, however old they are. Perhaps dementia is the price we pay for the explosive growth in our brain power.

Then there is the peanut butter test. Patients were blindfolded and asked to close their mouths and one nostril. Fourteen grams of peanut butter were put before the open nostril and the patients had to identify the smell. If they couldn’t, the stuff was moved closer. The finding was that those with Alzheimer’s needed the peanut butter to be four inches closer than those without. You can, I suppose, try this at home but you can’t use peanut butter because now you’d know.

If you want to protect yourself against dementia, go back to the Sister Mary story. She lived a disciplined life with continuous human interactions. She had, as a result, a high degree of “cognitive reserve”, meaning her brain remained agile and could find new pathways to get round any deterioration. In other words, although she had no symptoms, she may have had Alzheimer’s, but she also had the weaponry to fight it off. Other evidence shows that people with poor education and skills are much more likely to contract dementia. So stay as mentally active as you can and keep learning.

This means, among other things, not watching television. One study found that Alzheimer’s sufferers spent 27% of their leisure hours watching TV against 18% for the non-afflicted.

If giving up smoking is, indeed, the cause of the fall in dementia rates in Britain, then stop it at once and do all the other obvious things — keeping slim, exercising — to preserve your general health. Eating vegetables but not, mysteriously, fruit reduces the risk. In fact the fruit exception may be something to do with the clear link between sugar and dementia. Some have gone so far as to call Alzheimer’s Type 3 diabetes, so close is the correlation.

Eating turmeric, as some say, may help, but don’t bet your brain on it. Avoid nitrosamines, found in cheese, hot dogs, smoked turkey and the like. There’s no direct evidence of the effect on humans, but lab animals did develop diabetes and dementia when fed low levels of the stuff.

I could go on; this is a book rich in strange signs and wonders. Even if you’re terrified of dementia and don’t want to hear any more about it, this is a good read. Ingram has a relaxed, informative style and a way with explanation that makes you feel that at least somebody knows and cares. Remember, above all, the nuns whose cloistered but sociable and hard-working lives may seem more attractive than ever as the years go by.

The Alzheimer Identity

Sunday, May 1st, 2016

In the future, there is a surgical cure for Alzheimer’s. The damaged parts of the brain are removed and what’s left is reconnected with structures of artificial neurons. The patient returns fully competent and intact. Almost. Memories are lost in the process. And if the memories are of your beloved, the very person who encouraged you to have the operation, what then?

That’s the question asked in Nick Payne’s new play, Elegy. Payne has form when it comes to putting science on the stage. Constellations (2012) dealt with quantum mechanics and multiple universes, 2014’s Incognito with neuroscience and free will.

“I tried writing naturalistic plays,” he says, “but I actually felt I wasn’t very good at doing that — there were others much more adept at understanding what a naturalistic play requires. I find those scientific ideas interesting because they could give you a structure that didn’t have to be naturalistic.”

Science in drama is not new. Arcadia, perhaps Tom Stoppard’s finest play, considers truth in the context of contemporary maths and physics. Michael Frayn’s Copenhagen is about a meeting between the great physicists who arrived at the most shocking — now widely accepted — interpretation of quantum theory. In novels, science has regularly been plundered, from HG Wells to Douglas Adams, for tragic or comic effect. In films, science has been used seriously — think Alex Garland’s Ex Machina — but more often as a jumping-off point for assorted space-opera mythologies.

Artists, like the rest of us, now have easier-than-ever access to science, thanks to the huge wave of popular scientific publishing that followed Richard Dawkins’s The Selfish Gene in 1976, and Stephen Hawking’s A Brief History of Time in 1988. These have injected great stories, wild ideas and, latterly, tricky moral questions into our imaginations.

Payne and his generation are taking all this a step further. But first there is the man himself and the extremely odd situation in which we find ourselves.

Payne, 34, is slightly puzzling. The writer in residence at the Donmar Warehouse has a geeky air, and the clothes of an urban hipster. He has an odd tic of going “Hmmm” or “HMMMM!” when people are talking, whether in affirmation or dissent is unclear, and he chuckles to himself at odd moments. Mind you, I think I’m also acting a bit strangely.

We are in the Donmar offices alongside Murray Shanahan (professor of cognitive robotics at Imperial College); an Apple desktop and a phone (through which Anil Seth, professor of cognitive and computational neuroscience at Sussex, communicates); and Josie Rourke (artistic director of the Donmar). Shanahan and Seth have advised Payne on his play. There are repeated failures of Seth’s internet connection, and conflicting agendas among the participants to add to the oddity of Payne’s chuckles and “Hmmm”s. In the midst of this strange huddle — quite intimate, so Seth can see us all on his screen — I find my questionings wandering in a faintly deranged manner.

Though weird science has often been staged, the subject of Payne’s play is new and very topical: Alzheimer’s. Increasing life expectancy and, perhaps, other causes of which we as yet know nothing have made this disease the presiding anxiety of the developed world. It is incurable and peculiarly brutal, as it progressively takes away the patient’s mind while leaving their physical presence intact. Harrowing as this may be, it is fertile ground for drama.

In 2014, for example, we had Florian Zeller’s hit play The Father, which used shifting characters and sets to take us inside the mind of the suffering patient. This is not a play for the faint-hearted — reading the script, I found myself wondering if there was a gas oven handy. Alzheimer’s, however, is not merely topical, it is a version of an old theme. The disintegration of identity is, after all, the subject of great plays all the way from King Lear to Who’s Afraid of Virginia Woolf?.

Rourke makes the connection between this tradition and Elegy. “Although it’s a play about the self, it’s mainly about a couple. Fundamentally, there’s a question — and it goes all the way back to the Renaissance — about what happens to the object of your love and what the nature of your love is, how far that will persist against grief. That’s as relevant to Shakespeare’s sonnets as it is to neuroscience.”

Elegy is a short three-hander — two women lovers, one with Alzheimer’s, and a doctor. Payne is vague about where the idea came from. “I had a tinyish window to write a piece, so I just jumped in, then retrospectively spoke to Murray and Anil. It was a mix of things coalescing. I read something about the mapping of the brain.”

One of the key things he learnt from the scientists is that memory is not what we think it is: a series of excisable clumps within the brain, like files on our computers. “In an earlier draft,” Payne says, “I had used a data analogy for memories, so the clinician talked in the play about memories as data files, and they got corrupted. When I spoke to Murray and Anil, they both said, ‘It doesn’t work like that.’”

Memories are, in fact, distributed all over the place. If you remember meeting someone, that memory will involve a mass of associations — weather, clothes, conversation, the coffee you had — linked to other incidents and associations. Memory, like the brain itself, is a garden of forking paths. This, among other things, is what makes Zeller’s play so clever: the father’s memories do not simply vanish, they break free of their moorings and fragment. This does not mean, however, that they cannot be manipulated.

“We are far from being able to do that [the surgery in Payne’s Elegy],” Shanahan says, “and I am not in the business of making predictions, but, looking just a little bit ahead, it’s entirely feasible.”

Seth goes a little further. “It has become possible to change specific memories that animals have. In humans, there has been some work showing that, using behavioural techniques, having people recall memories, then changing them, can cause selective amnesias. And there are drugs that will change the way people reconsolidate their memories. Looking ahead, there is work in the States with what are called hippocampal prostheses, which will help people lay down new memories when they have lost the ability to do so.”

In short, Elegy is not merely topical or even futuristic, it is right at the edge of science that is happening now — and, of course, an affliction that is happening now. No wonder, as Seth says of popular scientific explanation, “there is an insatiable appetite for this kind of thing”.

Rourke, the phrase-maker in our huddle, says this is all about the “futurity of the present”, which, at this particular moment, is a peculiarly urgent matter. “This is the condition of being alive, needing to think about these things. The point is not about science, the point is about subject matter and whether the artist is the right person to face it. The point I’m making about Nick is that this is part of the quality about being in your early thirties and trying to navigate the world. It’s part of everybody’s experience in a way it wasn’t 20 years ago.”

She also speaks of an “aesthetic fetish of the future”, which is all about revelling in fantasies of things to come that have no relevance or attachment to the present from which they emerge.

This is very smart. Imagined futures tend to look quaint quite quickly — most old sci-fi movies and TV shows are now watched cultishly, ironically or as exercises in style. Some, however — think of the original Invasion of the Body Snatchers — endure because they were so urgently engaged with their own present: in that case, anti-communist paranoia. We have moved from a phase in which science and technology were exciting but “out there” to one in which they are accelerating and “in here” — in our lives and, increasingly, in our minds.

Furthermore, they are increasingly pointing to a strange future, a “post-human” phase in which anatomical or neurological enhancements change us so fundamentally that we become, in effect, a different species. Elegy dramatises one aspect of this by showing how love, that most intimate and potent reality, can simply be cancelled in the name of a medical cure that seems eminently desirable, yet at the same time results in the most undesirable thing imaginable. It’s an old paradox. “We are never so defenceless against suffering as when we love,” Sigmund Freud said. But he also said: “Work and love, that’s all there is.”

In this context, I suggest to the huddle that, though the scientists may have helped Payne, he is also helping them by putting up a giant warning sign about their work. Everyone looks oddly uncomfortable with this. Payne modestly demurs and the scientists steer the conversation in the direction of that blandly unhelpful subject, “the public understanding of science”.

Payne aims at a middle way. “The play deliberately tries to present a particular version of the future where medicine has the desire to treat the illness above treating the holistic condition of the patient. And, yeah, the interventions are somewhat chilly, so I think there is an interesting thing about how medicine grapples with death and dying.”

He breaks off into a chortle, which this time, I point out, is a little oddly timed. He chortles even more. “Well, I’ve been thinking so much about death and dying, you’ve got to laugh, otherwise… I’m glad you see this as treating the science and the ethics around death and dying with respect, but also with a polite sort of question mark.”

In the end, what is at stake here, both in Alzheimer’s and in robotics and neuroscience, is the condition of the human self, the source of everything we know and make. In Alzheimer’s, the self does not vanish, as in death, but fragments — and, in doing so, raises questions about what it was in the first place.

“The self is not just one thing,” Seth says. “In dementia patients, some aspects may be intact — people still experience themselves as the centre of the subjective world and the source of their actions. There is a relevance here to how we respond to the increasing prevalence of dementia and Alzheimer’s, and we can do that as a society in a more sensitive and useful way by recognising that the self is richer than just the string of memories over time.”

The Wizard of Woz

Sunday, April 24th, 2016

Steve “Woz” Wozniak ambles into the room in a grey suit and blue and yellow Nike trainers. He is very short and round. Now 65, he is grey-bearded and neckless and has leg problems. Yet his bonhomie is unstoppable. He grins at everybody and talks and talks.

Enthralled, this meet-and-greet group listens and listens. To everybody at the Business Rocks convention in Manchester, he is the heart and soul of contemporary technology. An engineer of genius, he co-founded Apple with Steve Jobs and, entirely on his own, designed and built the Apple II. Every personal computer in the world is a descendant of that machine.

Unlike Jobs, he is neither difficult nor ambiguous and, unlike pretty much everybody else in Silicon Valley, he is not greedy. He is said to be worth $100m (about £70m), a piffling sum in an industry in which manhood is measured in billions. He gave a lot of his money away and, from the first, sold his shares cheaply to Apple employees. Last week he stuck his neck out by saying the company ought to pay its taxes like everyone else. “I don’t like the idea that Apple might be unfair,” he said.

He continues on this theme when we talk on stage at the convention. “I never started Apple for money,” he says. “I wanted to show off my engineering prowess. I didn’t want to be corrupted ever in my life. I thought this out when I was 20 years old. I’m not going to be corrupted to where I do things for the sake of money.”

Woz is pure; he’s a tech saint and, in Manchester, they flocked just to touch him — and get selfies, of course. He is like a fragment of the True Cross, a physical connection to the foundational narrative of their tech-soaked lives.

But there’s something odd about him, a strange uniformity of response. This may be something to do with his prosopagnosia — face blindness. It’s an affliction, he tells me, he shares with Brad Pitt. We step down after an hour on the stage together and at once he does not know who I am. Only when I speak does he recognise me.

So the grinning joviality may, in part, be a social defence but it’s also an aspect of his ideology. His management theory is, basically, make it fun. Jokes are sacraments and pranks — usually involving software or engineering hacks — are the stuff of the well-lived life.

It’s hard to imagine how this roly-poly funster formed any sort of partnership with Jobs, the narcissistic perfectionist and artist. Yet partners they were, for a time. This ended in the early 1980s, when Jobs’s marketing genius, rather than Woz’s engineering, became the Apple core. Woz left, but not quite. He has never stopped being an Apple employee, earning, it is said, an annual $120,000.

His main use of the company is the local Apple store. He says he lives in the one part of Silicon Valley with lousy broadband so he and his wife sneak down to the store at midnight and stay there until 6am, downloading movies and TV shows to watch at home.

Sadly, his continued employment does not mean he can leak anything to the Manchester geeks about what’s going on at Apple: “I’m so honest and I talk a lot so they’re scared to get me too close to the inner workings of the company, probably rightly so. But they won’t fire me.”

In the limbo of this not-quite-exile, Woz seems to have had a love-hate relationship with Apple, which he once described as “the bane of my life”. Stories have emerged that Woz doesn’t like this or that — most recently his suggestion that the Apple Watch was inauthentic, not a core Apple product. In Manchester, however, he’s having none of this, blaming the press for exaggerating things or brushing it all aside: “It’s a joke!”

This often makes him difficult to understand but none of it should detract from his genuine purity. He was taught two big things by his father, who, unknown to Woz, designed missiles during the Cold War: electronic engineering, and that America was the greatest country in the world. The former gave him his livelihood and a view of the world as fixable. The latter provided crushing disillusion.

In Manchester he said: “I thought, this thing [politics] is not for me and I’m not going to vote, ever, and I’ve never voted for anybody who could win. The government has wealth and power and I wish they had common sense and spent their money like Apple and Google do and earned money instead of wasting it.”

He’s veering towards Bernie Sanders’s socialism this time round but can sound very right-wing in his dislike of government. Pure he may be, but also confusing.

Never mind, let him be confusing: he is adored — not too strong a word — and nobody can ever question either the genius of the Apple II or his touching faith in engineering. Nor can anybody doubt the value of what is still a hacker’s and prankster’s view of the world — that it can be both fixed, often illicitly, and fun.

Forgetting who I am, Woz leaves the stage for more meetings. In one, a young would-be tech billionaire asks him: “What question should I ask you?”

“How do we not lose our specialness?” says Woz, afloat on an ocean of geek love.

Hurling Stuff at the Stars

Sunday, April 17th, 2016

Accelerating a human being from a standing start to 100m miles an hour in two minutes is unwise. Something not unlike strawberry jam glued to the inside of the spaceship would be all that was left of the gullible fool. This is one of the many ways in which Star Trek is not entirely realistic. It is also one of the many reasons we are not bound for the stars — unless some weird physics turns up with a shortcut.

The problem is that the universe is, by any reasonable standards, far too large — “vastly, hugely, mind-bogglingly big”, as Douglas Adams put it. The nearest stars to Earth, the trio known as Alpha Centauri, are about 4.2 light years (25 trillion miles) away, and there are 100bn more stars in our galaxy (and 200bn more galaxies that we can see).
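For the sceptical, that 25 trillion miles is easy to verify. A minimal Python check, assuming only the standard figure for the speed of light (a constant the article itself doesn’t quote):

```python
# Speed of light in miles per hour (standard approximate value)
LIGHT_SPEED_MPH = 670_616_629
HOURS_PER_YEAR = 24 * 365.25

# One light year, then the 4.2 light years to Alpha Centauri
light_year_miles = LIGHT_SPEED_MPH * HOURS_PER_YEAR
distance_miles = 4.2 * light_year_miles

print(f"{distance_miles / 1e12:.1f} trillion miles")  # 24.7 — i.e. "about 25 trillion"
```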

There are three possible reactions to this shocking state of affairs. One is anguish — “the eternal silence of these infinite spaces frightens me”, said Blaise Pascal. Then there is disdain — Peter Cook said he felt a great sense of his own significance when he looked at a starry sky. But some respond with something akin to irritation — “Let’s do it” is more or less what the Russian physicist and billionaire Yuri Milner said last week.

Milner, supported by Stephen Hawking, has put up $100m (£70m) to kickstart a $5bn-$10bn project — Breakthrough Starshot — that would accelerate lots of iPhone-sized objects in the direction of Alpha Centauri. Unlike humans, these objects should remain unharmed by the hyper-acceleration involved.

Everything about this project reeks of an ecstatic and mad grandeur. Lots of — perhaps all — billionaires go crazy. Some do so in a really bad way: one co-founder of PayPal, Peter Thiel, for example, turned himself into a Bond villain when he backed a plan to build floating cities in international waters where the cyber-elites could find fulfilment free of regulation (and probably get invaded by China). There is no poetry in that, but there is a great deal in this new symptom of Milner’s madness.

First there is the ticklish timing. It would take 20 years to build the system, at least another 20 years’ flight time and then 4.2 years more before the signal came back to Earth. Milner is 54, so he would need to live to 98 to see whether it had all worked — not impossible given the sort of healthcare he can afford with his $2.9bn fortune.
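The arithmetic behind that age is simple enough; a quick sketch using the article’s own figures:

```python
# Timeline from the article: build the system, fly there, signal home
build_years = 20
flight_years = 20      # "at least"
signal_years = 4.2     # light travel time back from Alpha Centauri

total_years = build_years + flight_years + signal_years
age_now = 54           # Milner today

print(f"{age_now + total_years:.1f}")  # 98.2 — hence needing to live to 98
```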

Then there is the astounding technology. It is clear Milner has made this announcement simply because we can do it. Probably.

The first thing required is a mighty bank of 10m lasers, each with a power output of 10kW. Then there would be pretty ordinary rockets that would deliver the little iPhones into space, where they would open their sails. The lasers would then be fired so as to converge into a single 100 gigawatt beam (equivalent to about 100 nuclear power stations) that would hit the sails and accelerate the iPhones. In two minutes they would be out of range of the lasers, but, all being well, by then they would be travelling at around 100m miles an hour.
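The two headline numbers in that paragraph can be sanity-checked in a few lines of Python; the comparison with Earth’s gravity at the end is my own addition, not the article’s:

```python
# Beam power: 10 million lasers at 10 kW each
lasers = 10_000_000
power_each_w = 10_000
beam_w = lasers * power_each_w
print(f"{beam_w / 1e9:.0f} GW")   # 100 GW, as stated

# Acceleration: a standing start to 100m mph in two minutes
MPH_TO_MS = 0.44704                # metres per second per mph
v = 100_000_000 * MPH_TO_MS        # final speed in m/s (roughly 15% of light speed)
a = v / 120                        # two minutes, in seconds
print(f"{a / 9.81:,.0f} g")        # tens of thousands of g — hence the strawberry jam
```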

The reasons this all now seems possible are developments in materials, electronics, miniaturisation and lasers. In particular, getting an object with enough power and instrumentation to send back information from the stars into a package the size of an iPhone is now feasible. Indeed, the marvellously named Zac Manchester at Harvard has raised more than £50,000 on Kickstarter to build satellites — KickSats he calls them — the size of postage stamps.

Furthermore, after the recent successes of unmanned missions within our solar system — the photos of Pluto, the delicate little craft that landed on a comet and so on — the idea of machine space exploration is now much more attractive than human. All those sci-fi space operas consistently failed to acknowledge the sheer inhospitality of space, which makes keeping humans alive, well and sane for anything more than a few months a quite absurdly expensive task. Hurling smartphones at the stars is a much more efficient idea.

Back on Earth this story has an interesting ancestry. Private-sector space is, albeit slowly, booming on the back of the mad money being made from the internet. Most notably the Amazon founder, Jeff Bezos, (worth $55bn) has Blue Origin, a project to develop new rocketry for manned flight. Elon Musk ($14bn), co-founder of PayPal, has SpaceX, which is already delivering payloads to the International Space Station. Earlier this month, after several attempts, it also pulled off the fantastically difficult feat of retrieving a first rocket stage by landing it vertically on a drone ship.

Musk is 44 — he was born a year before the last moon landing — Bezos is 52. Along with Milner these are technologically driven men for whom space can only have been a terrible disappointment. Government space failed to deliver. The euphoria of the beautiful Apollo missions was followed by the boredom and tragedies of the space shuttle era. A humiliating American dependence on Russian launchers ensued.

There is some faint irony in the fact that Milner is named after the first man in space, Yuri Gagarin, whose single Earth orbit fired the US determination to get to the moon before the commies did. April 12, the date of the Starshot announcement, was the 55th anniversary of that flight.

But there is more than disappointment at work here. There is a determination not to acknowledge the possibility of any human limitation in the face of those infinite spaces.

“The limit that confronts us now is the great void between us and the stars,” Stephen Hawking said at the press conference to announce the project, “but now we can transcend it . . . Today, we commit to this next great leap into the cosmos. Because we are human, and our nature is to fly.”

The tone — “great void”, “great leap”, “transcend”, “commit” — is not remotely scientific; it is, if anything, religious. Hawking is saying this project will go some way to fulfilling our destiny as creatures whose nature is to fly. It is way over the top and the idea that human nature includes the need to fly is bizarre. But then calls to fulfil our destiny, to submit to the demands of an imagined future, usually are, on closer examination, OTT and bizarre.

They are often dangerous as well, but not in this case. Milner’s madness has found fine and poetic expression in his plan to hyper-accelerate a flock of iPhones in the direction of Alpha Centauri. It is a much better use of money than yachts, babes, London property, Smythson notebooks — £240 for the Vahram Panama (sic) box set — dodgy financial instruments or even football clubs; and it is a much better deployment of middle-aged ego than Thiel’s lawless and stateless islands because, ultimately, a mad beauty is involved, both in the execution and the intention. And, though flying may not be an aspect of human nature, the pursuit of beauty most certainly is.

Justin Bieber’s Incorrect Hair

Sunday, April 10th, 2016

I recently saw Justin Bieber on a TV chat show in America and concluded he was a bit thick. Then I felt guilty and decided he was sick, not thick. Inventing a diagnosis on the spot, I concluded he was suffering from CIA (celebrity-induced autism). This afflicts people in their teens who have had a showbusiness “personality” dumped on them before they had a chance to attain personhood.

Up to the age of two a baby’s brain generates neuronal connections at the rate of 700 a second. After two the brain starts cropping these excess connections and it is not until the early twenties that the brain reaches its adult state. Bieber was discovered when he was 13; he became world famous at 15. Something went wrong with the cropping process.

Something else has gone wrong with Bieber’s cropping. He has acquired dreadlocks. This, it turns out, is forbidden in some quarters because it amounts to the “appropriation” of black culture by a white pop star.

So? Well, in America this word has gone viral. A production of Gilbert and Sullivan’s The Mikado was shut down because it had “appropriated” Japanese culture. In Maine some students have been banned from wearing sombreros because they belong to Mexican culture. Another CIA sufferer, Miley Cyrus, has been condemned, as has the reality TV star Kylie Jenner, for adopting cornrows. And so on. A video has emerged from a US campus of a white student being bullied because of his dreadlocks.

This is another startling addition to the ever-lengthening list of things we must be offended by. It follows a yoga class banned because of “cultural issues” in Canada and a New Orleans-themed summer ball in Oxford condemned as “nostalgia for an era of history steeped in racism”.

Ian McEwan was recently cornered on the subject of some annoyingly sane and perfectly humane remarks on transgender matters as, previously, was Germaine Greer. Last year the National Union of Students adopted a policy to stop gay men “appropriating” black culture.

Some of the current outrage is, admittedly, driven by nothing more than the need to feed the slavering maw of the internet — notably Twitter and Facebook — with ever more lethal doses of outrage. But, more alarmingly, it is also driven by academic terror at the prospect of being found on the wrong side of any argument with students.

Or academic glee. Appropriation was the subject of a bizarre, head-clutching — and, on closer examination, entirely nonsensical — item on BBC2’s Newsnight last week in which an academic, Emma Dabiri, fiercely defended those who were outraged by these cross-cultural exchanges. Black culture in particular may be “appreciated” but not “appropriated”, a distinction that, I fear, may further confuse poor Justin.

The high point of the show was when, from New York, Chimene Suleyman, a writer, told us that white students had defended dreads on the basis that Vikings and Celts had worn them. To this, Suleyman responded that this meant black culture was being extirpated “in favour of a group of people who haven’t existed for the past 1,000 years”.

The BBC should really look into the possibilities of a noxious drivel filter.

In fairness, Newsnight was right to take this subject seriously. The politics of outrage and identity (defining social roles entirely by whether people are black, gay, female, whatever) and the revelling in victimhood have lately taken an explicitly illiberal and Stalinist turn.

McEwan and the gay rights campaigner Peter Tatchell, who has found himself being labelled “transphobic”, are liberal men, defenders of tolerance, but suddenly, like one of Stalin’s bewildered loyalists, they found they had to take a bullet in the back of the head.

The first point to make here is that if you ban appropriation, more or less the entire culture vanishes in a puff of smoke: from the Beatles and the Stones (the blues) to Romanesque architecture (Byzantium) and Picasso (African art).

The second point is the stupidity of the assault on liberalism. This may give people of a certain age a shudder of déjà vu. The radicals of the 1960s and 1970s were, reasonably, angry about the Vietnam War but then, less reasonably, they moved on to liberal democracy.

They found succour in the thoughts of philosophers such as Herbert Marcuse who, I notice, has returned to haunt us. The good news is that he was brilliantly lampooned in the Coen brothers’ latest film, Hail, Caesar!; the bad news is that he is being adopted by a new generation of students.

The man was a prat who wrote, among other things, that freedom would come when we imposed “intolerance against movements from the right and toleration of movements from the left” or, to translate, you’re free if you agree with me.

Marcuse is now being used to argue against any distinction between speech and action, which is why preventing people speaking at universities — “no-platforming” — has become so commonplace.

Within reason, this distinction is critical to the stabilising function of liberal discourse.

Poor Justin’s hair now seems to have been dragged into the Marcusian dialectic. Supposedly his “do” victimises blacks but the only victim here is a bewildered Bieber.

People say, he said in response, that “you wanna be black and all that stuff, I’m like: it’s just my hair.”

Out of the mouths of babes and CIA-sufferers . . .

Justin Welby: Truth from a World of Lies

Sunday, April 10th, 2016

Sir Anthony Montague Browne’s hairbrushes are the detail that catches the eye. Do men, these days, have more than one? Do they have anything more than a comb and a tube of gel?

Montague Browne, a former RAF pilot and private secretary to Winston Churchill, died in 2013. His widow, Shelagh, kept the brushes in remembrance. His hair clung to them, and after a long, strange story his DNA was extracted, proving that he, not Gavin Welby, was the father of Justin Welby, Archbishop of Canterbury.

Welby reacted to the news with astounding grace. He had asked for the DNA test himself and was clear-sighted about what it might reveal. It was a fine, priestly and topical lesson to all those politicians who have been dodging and weaving over the Panama papers.

“There is no existential crisis,” he said, “and no resentment against anyone. My identity is founded in who I am in Christ.”

The story of the unearthing of Welby’s true paternity evokes the vanished world of postwar Britain, not just of men with multiple hairbrushes, but also of drunks and gamblers, rakish RAF heroes and racing drivers and of the potency of rural gossip.

It is the world of Anthony Powell’s 12-volume novel cycle A Dance to the Music of Time in which the variously damaged, privileged characters move through lives of infidelity and concealment, all lost in a world the war and its aftermath had made terrifying and uncertain. People had become opaque.

“One passes through the world,” wrote Powell, “knowing few, if any, of the important things about even the people with whom one has been from time to time in the closest intimacy.” He could have been writing about the story of Welby’s discovery of his true father.

Welby’s handling of it has also made it a story of a very Anglican mix of liberality, faith and realism.

The tale began to emerge two years ago from the borders of East Sussex and Kent where Charles Moore, the Daily Telegraph columnist and official biographer of Margaret Thatcher, heard it said by neighbours that Montague Browne was the true father of Welby, who had been confirmed as archbishop in February 2013.

Long before that, Welby’s apparent paternity was already a gossipy issue. Gavin Welby was a high-living fraud. He dated John F Kennedy’s sister and the actress Vanessa Redgrave, he was a Lloyd’s “name” and he was twice a losing Conservative party candidate.

His life was all founded on lies. He was an alcoholic whose father had sold quack remedies in America. Gavin imported whisky illegally and put it about that he had aristocratic connections. He lied about his wartime military rank; he was a lieutenant, not a captain. Jane, his wife and the archbishop’s mother, described his courtship as “bullying”. The marriage ended after three years and Gavin died alone in his Kensington flat in 1977.

The archbishop has spoken frankly about the difficulty of his upbringing by such a man, describing his childhood under the shadow of the alcoholism of both his father and mother as “messy”. We now know it was not his true father who was causing his pain. The irony is sharpened by the fact that Montague Browne was a man who embodied the Establishment eminence Gavin had so mendaciously craved.

The story is, first, that of Shelagh Mulligan, whose first husband died in a car crash. She had separated from her second, the racing driver Lance Macklin, when in the early 1960s, with a son and a daughter to support, she became the personal secretary of Churchill’s wife Clementine.

It was in that job she first met Montague Browne, a pilot decorated for his service in the Far East and now Churchill’s private secretary. He took the post in 1952 and stayed until the great man’s death in 1965.

He was clearly almost too good to be true; he was certainly a very Anthony Powell, if not a 1950s comedy film, character. Shelagh had been to see the movie Carlton-Browne of the FO and the next morning she ran into a man in a bow tie and a yellow waistcoat who introduced himself by saying: “My name is Anthony Montague Browne of the Foreign Office.” She laughed at the faintly preposterous encounter, but he must have seemed like a rock of stability to a young woman with such a tumultuous romantic history.

Montague Browne was married but, in Lady Churchill’s words, “très infidèle”, and he began an affair with Shelagh. He told her: “I have a daughter. And I’m told that I have a son . . . I shan’t tell you [who he is]. But you’ll find out one day.” After years of indecision he finally left his wife and married Shelagh in 1970.

In the same job Shelagh also got to know Jane Portal, niece of Sir Charles Portal, the wartime chief of the air staff. She was pretty and very attractive to men but also, as she admitted in a statement last week, an alcoholic. “I was already drinking heavily at times,” she says of the period of Gavin Welby’s courtship. She has not drunk since treatment in rehab in 1968.

“Although my recollection of events is patchy, I now recognise that during the days leading up to my very sudden marriage, and fuelled by a very large amount of alcohol on both sides, I went to bed with Anthony Montague Browne,” she said in a statement. She did not know that Justin was to be the result of that liaison: “It appears that the precautions taken at the time didn’t work.”

“I still recall,” she said, “our joy at his arrival. So this DNA evidence proving that Gavin was not Justin’s biological father, so many years after Gavin’s death, has come as an almost unbelievable shock.”

Justin himself had heard rumours that Gavin was not his father but he had discounted them. As far as he was concerned he seemed to be a honeymoon baby as he was born almost exactly nine months after Gavin and Jane married in haste.

After the divorce, Justin, who went on to Eton, spent much of his time with Gavin. Montague Browne said to Shelagh: “The poor child was left like a little football.” As Churchill might have said in the light of Justin’s later career, some football.

Shelagh noticed her husband kept in touch with Jane but prevented the two women from meeting. He was also interested in the progress of Justin. Suspicious, she challenged him and he “hotly denied” that he was the father. She did, however, discover evidence of his continued habit of infidelity that made her doubt these denials.

In 1975 Jane married Charles Williams, a former cricketer, banker, biographer and Labour peer. They are still together. The marriage created a rift between the Montague Brownes and Jane. Shelagh suspected that this was in part due to Anthony’s jealousy of Williams’s success as an author. He took pride in his time with Churchill, but he felt it had been downhill since, professing himself “unfulfilled”.

When Justin became archbishop, he began to make regular appearances on television. Shelagh was struck by his close resemblance to her husband. But even in old age he continued to deny it.

Finally, with Montague Browne now in a nursing home, he told Paddy Macklin, Shelagh’s son, that he would like to see Justin before he died.

Macklin called the archbishop and told him he thought Montague Browne was his father. “He did not seem terribly surprised. He sounded composed,” Macklin said. It sounds credibly Welbyish at that moment, but he also elsewhere professed himself astounded. In the event, Montague Browne died before Welby could visit him.

When Moore spoke to Welby about the story, he recognised that it would be a news story and said: “Well, let’s get a DNA test. Certainty is better than doubt.” And the truth finally emerged.

Moore asked Welby how the news affected him. “I am one of those people who processes things more subconsciously than consciously. This news is important and interesting. But this is by no means the most difficult thing that has happened to us.” Far from it. Welby and his wife, Caroline, lost their first child, Johanna, in a car crash when she was seven months old.

Coincidentally, I was speaking to somebody last week — an atheist — who had been present and intensely moved by a speech Welby made about his bereavement. “He spoke with such honesty,” he said, “it was incredibly thrilling.”

Welby succeeded the scholarly and saintly theologian and poet Rowan Williams as archbishop. Having been a banker and very much out in secular society, he was seen as a worldly choice to replace the brilliant but somewhat unworldly Williams. And so it proved. Welby revived the Anglican social conscience as he became the spokesman of the public outrage against the behaviour of payday lenders, modern-day slavery and the persecution of Christians in the Middle East.

Perhaps the difficulty of his upbringing at the hands of the bullying Gavin had made him even more worldly. He has always seemed to be involved with the lives of the people in ways that are unusual in holders of that high Establishment post. But the handling of this story has shown Welby — and therefore Anglicanism — at its very best. Like his church, he has shown himself able to make sane and gentle peace with the storms that afflict human lives.

He even responded to the awkwardness of having a newly extended family — a new half sister, Montague Browne’s daughter, Janie, and her son Guy — with grace and, as usual, a flawless command of tone.

“I would not wish to push myself on anyone. This is an entirely private matter. But once the dust has settled, if they wanted to meet me, yes, that would be very interesting,” he said.

The story is, as I said, reminiscent of Powell’s A Dance to the Music of Time in which the shady and damaged rich responded to a world that had freed itself of one conflict only to face the possibility of another much worse conflagration. Their uncertainties and those of the world made them increasingly difficult to understand.

But the Welby story introduces a new form of understanding that can shine a light on the consequences of that generation’s behaviour. Welby himself noticed the fact that now we can take those hairs on those brushes and, through DNA sampling, identify ourselves and our predecessors.

But can we really identify anything truly fundamental about ourselves from genetics? The only properly devout — and realistic — answer is no and here yet again Welby delivers: “Although there are elements of sadness, and even tragedy in my father’s case, this is a story of redemption and hope from a place of tumultuous difficulty and near despair in several lives.

“It is a testimony to the grace and power of Christ to liberate and redeem us, grace and power which is offered to every human being. I know that I find who I am in Jesus Christ, not in genetics, and my identity in him never changes.”

His faith and serenity are a long way from the rich, decadent, faithless postwar world that Powell described. It was a flashy but drab world in which “work and play merge[d] indistinguishably into a complex tissue of pleasure and tedium”. Gavin Welby was of that world, as in their various ways were all the characters in this distinctly novelish tale.

Such a world, of course, still exists but, just occasionally, someone such as Justin Portal Welby, the 105th Archbishop of Canterbury, emerges to tell us it needn’t.

Helen Mirren Fifteen Years Later

Sunday, March 27th, 2016

The last time I interviewed Helen Mirren was in 2001. We got drunk in a French restaurant near Victoria station, discovering, among other possibilities, the use of nuns as drug smugglers. Waiters dropped plates at the sight of her. She was very rude about the treatment of Hollywood stars and she said f*** a lot.

This time, I am in the Four Seasons in Beverly Hills, there’s only water to drink and she is trying to be diplomatic, even at one point stopping herself saying f***. Back then, there was just one last reprise of Prime Suspect (The Final Act) left, and she wasn’t to be The Queen until 2006.

Now she is Helen Mirren, screen goddess, heroine of older women, ageless L’Oréal lady and Dame Commander of the Order of the British Empire. In fact, the only things our meetings have in common are her legs.

“They’re not her legs, you know,” my wife called as I headed out for Victoria. This was sort of true. A Virgin Atlantic TV ad was running, and in one lingering shot, the gams in question were a model’s. She admitted this, saying her legs were like Gazza’s (Paul Gascoigne, football star, for our younger readers). Oddly, they are still named after a footballer.

“They’ve had various names, all footballers’. They were Kevin Keegans, now they’re Wayne Rooneys — just between me and my husband, really. I would have been a good footballer. One thing that was really, really sad about my era was that women weren’t allowed to play football. I’m built like a footballer, and I never liked any of those girly sports, hockey and netball.”

Here we are in Hospitality Suite 1518 (which is anything but hospitable). Mirren is wearing a slinky knee-length purple dress and her legs look footballer sturdy, but in a feminine way. Her hair is silky and silvery, and she wears a good deal less make-up than in the L’Oréal ads. She is 71 in July and looks exactly like the perfect glam role model for grannies everywhere.

“There should be a new word for that. I am becoming aware of the fact that there are a lot of women to whom I am a beacon of hope. It’s fantastic — yeah, we’re here and we’re available and we’re relevant, it’s OK.” (Her accent seems to be a touch grander, slightly more cut-glass than in 2001, as it was in that weird 2016 Super Bowl anti-drink-driving ad I saw soon after our meeting. But, OK, she’s earned grand and cut-glass.)

She has always been sceptical of the desperation with which people fight ageing. Back in 2001, she spoke of stars like Demi Moore getting up at four in the morning and going to the gym for five hours. “And here I am now, still not going to the gym!” she exclaims. “Except I do go to the gym occasionally. As you get older, it becomes a necessity. I’m still not a Hollywood actor, but maybe what has happened is that Hollywood has changed.”

I remind her that, last time, she said Hollywood actors were “infantilised by the community around them and, at the same time, disdained”. “There’s still truth in that, but what’s happened now is that a lot of actors are taking responsibility for their own material. In fact, with this film [Eye in the Sky, the subject of this encounter], Colin Firth is a producer, and it came to me from Colin.”

She also seems to be smitten with the latest generation of film actors. “Jennifer Lawrence! Saoirse Ronan! How can they be so self-possessed? And wonderful and beautiful. I don’t mean just physically beautiful — they are beautiful things in their self-possession, their intelligence and their ease. I’m just in awe of them.”

She is in two big current films, the other being Trumbo, in which she plays Hedda Hopper, the absurdly powerful gossip columnist who terrorised Hollywood from the mid-1930s until the late 1950s. She was the high-hatted showbiz attack dog for Senator Joe McCarthy’s commie hunt, one victim of which was the eponymous screenwriter Dalton Trumbo. She is the villain of the movie and, incontestably, the best thing in it. In one chilling scene, she confronts Louis B Mayer. She blackmails him because he once attempted to seduce her.

“She was aware of her power and not afraid of using it in what seems a not particularly feminine way, but, actually, it was a very feminine approach. I think women are hugely underestimated on that level. Certain women get away with it because they are underestimated. She was cartoonish, and deliberately so. Her hats weren’t elegant, they were ridiculous. She wanted people to know she was in the room.”

Yet she sets Hopper’s aggression and cruelty against the sexism of the age. “I suspect, where she says to Mayer, ‘You tried to f*** me, now I’m going to f*** you if I can’, that was true. There was a lot of sexual harassment, and she was angry about that. That’s my theory. She became so obsessive, there was something unbalanced about her.”

There’s also ambiguity about Colonel Katherine Powell, the part Mirren plays in Eye in the Sky. It’s a taut, documentary-like film about the moral complexity of a drone strike on terrorists in Kenya. Part of the complexity is that it’s a British operation using American kit and an American pilot — Breaking Bad’s Aaron Paul — based thousands of miles away.

Powell is the British officer trying to get the strike done, in the face of resistance from politicians, lawyers and even, at one crucial moment, Americans. After the film was made, Mirren discovered her part had originally been written for a man.

“I don’t think they changed a word of the dialogue. But I think having a woman in that role changed a lot. The director [Gavin Hood] said he didn’t want people to see a movie all about men and war, a bloke’s film. Casting me in that role emphasised that you really had to think about the morality and ethics of it. People say I was channelling my Jane Tennison a little bit.”

Every character is ambiguous, the point being that there are no right or wrong decisions. There are equally powerful arguments for and against the strike. But, to be honest, since on the “for” side are Mirren and the late, great Alan Rickman (who gets the best line right at the end), most people will be urging Aaron Paul to pull the trigger and launch his Hellfire.

Almost all the dialogue is through screens, and the film was shot in discrete segments in different countries — Mirren had to interact with crosses on an otherwise blank screen, but, of course, she pulls it off. I suggest there is something of Jane Tennison in this. In Prime Suspect, she just turned up and became Jane with miraculous consistency; in this, she becomes Powell without a human to react to.

The secret is to limit thinking time. “I do work to fulfil the requirements of a role. I did a lot of research on Hedda Hopper, to see what she looked and sounded like. But I don’t overthink things. I’ve become a great believer in the Gérard Depardieu school of acting — you just read the script and do what’s on the page.”

Next, she is doing a really starry Hollywood movie with Will Smith, Kate Winslet and Keira Knightley, called Collateral Beauty. (The title is a play on collateral damage, but it’s still an awful title.) This is not the thriller she would like to be doing — she misses them. I say, as the supreme older woman, she must be deluged with offers; she says she isn’t: very few good ones, anyway.

Mirren’s father was born in Russia. She agrees with me that she looks Russian, and adds that, in Russia, people assume she is, asking directions and so on. But culturally, she’s all English. The story of her departure for America in the early 1980s is that she left in a huff because she failed to win an Olivier award for Antony and Cleopatra with Michael Gambon. She was quoted as saying: “F*** it, that’s it, they don’t want me.” Now she says it was more to do with film offers and meeting the director Taylor Hackford. They lived together from 1986 and married in 1997.

She now has homes in LA and London. Her nephew and his family are in the US, and she has two stepchildren via Hackford, so, she says: “All my family lives in America.” But she will never give up British citizenship, though she might take dual citizenship because of the dread subject of “estate planning”.

She is enough of a Brit — perhaps because she is The Queen — to get quite passionate about a royal: Prince Charles. “Charles is going to be a great king, unfortunately not for very long. Not that I’m into kings and queens. I know him a little, but not well. I don’t understand why people attack him for being concerned enough about his country to write to people about it. When you read those letters — those black spider things — what you see is a very, very nice man who is concerned about architecture and the look of the country, and what a jolly decent chap he is.”

She sounds motherly about Charles, which raises the question: why didn’t she have children?

“I love children, they are so funny and so sweet, but I never wanted my own. I have never had a moment of regret about not having children. Well, I lie. When I watched the movie Parenthood, I sobbed for about 20 minutes afterwards.

“It was about the whole story of being a parent and how it never stops, even when you’re a grandparent. I realised I would never experience that, and for about 20 minutes, I sobbed for the loss of that and the fact that I never experienced it. Then I got over it and I was happy again.”

Her blood relatives are, as a result, few in number. She lost her brother, Peter, in 2002, while she was shooting Calendar Girls. He was, she said in 2001, “an adventurer in the Philippines and a sort of hanger-out with bar girls”.

Peter died of skin cancer. She tried to manage his care and treatment while running on and off the set to call the Philippines. “They don’t have skin cancer there. If he’d been in Australia, he’d be alive now.”

The mood has turned dark and depressed. I lead her out of it into the comedy of politics. She’s anti-Brexit and seems quite taken, in a crazy, surreal, 2001 kind of way, with Corbyn and Bernie Sanders.

“They’re slightly old school, sort of left, with the weird clothes that they wear and their hair and everything — and, suddenly, the rediscovery of that kind of political thinking.

“It seemed to be lost. It’s like the Welsh language — just as it’s about to be lost, people rediscover it.”

Daft but true. The Mirren of 2001 is not lost to the world.

I am preparing to leave Room 1518 when we discover we are on the same flight back to London that evening. Luckily, it’s British Airways, not Virgin, so she won’t have to unscrew her legs.

David Eagleman: Making the Blind See and the Deaf Hear

Sunday, March 27th, 2016

In the lobby of the Ritz-Carlton hotel near Jacksonville, Florida, people are gathering for a conference in the usual way – schmoozing, drinking and flirting. In the silent darkness of each of their brain the rationalising prefrontal cortex is being overwhelmed by the incoming signals. More primitive, brain regions are wildly overestimating the possibility of a sexual or a career-boosting encounter.
In the midst of this erotic/ambitious maelstrom I glimpse one man in a black tracksuit flicking at his phone. His prefrontal cortex is clearly in charge and he is looking for a picture of me so he can find me in this crowd. I introduce myself.
“David.”
“Bryan!”
This is David Eagleman. He is to neuroscience what Brian Cox is to physics, the friendly, enthusiastic face of an intimidating subject. Thanks to brain scanners, neuroscience is the most explosive scientific discipline of the moment. It is also the most intimate. It concerns you here now rather than, like physics, weird stuff happening trillions of miles away. Eagleman is in your head, whether you like it or not and, thanks to his ingenuity, excitable clarity and boundless optimism, you probably will.
Tomorrow he is giving a lecture to these horny and increasingly drunk conferencees. He hasn’t the faintest idea who they are – I tell him they’re something to do with the ‘hospitality’ business – but he does know they want to see his vest or, more accurately, his VEST. This garment may make the blind see, the deaf hear and introduce everybody to a whole new sensory world. But I’ll explain all that later.
We slip out of the rising din of primitive urges in the lobby – somebody has started playing a piano – on to the terrace and start talking. Our prefrontal cortexes, as opposed to everybody else’s, are fully in control, or, rather, they are for about 50 minutes at which point more ancient cerebral regions kick in to make us shiver with cold and, in the case of Eagleman’s brain, to signal urgently that he is starving.
Our lumbering bodies then carry the 3 pounds of hyperactive fat and water in our skulls over to the restaurant, where he devours a monstrous burger – he doesn’t care about food in detail, he just needs a lot. This is understandable: the man is a terrifyingly energetic calorie burner. He is currently in the midst of a career reorganisation – “My wife told me I’d be dropping from eight jobs to seven and she’s in favour of that.” She accuses him of having FOMO – fear of missing out.
“I don’t think it’s exactly FOMO but it’s a close cousin.”
“Trust me,” I tell him, “it’s FOMO.”
So here’s a brief roundup of his very FOMO achievements since he was born just under 45 years ago in New Mexico.
First of all, as a child he fell off a roof. This turned out, neuroscientifically, to be a significant achievement because he noticed that, as he fell, time seemed to slow down. Even though the fall was very brief, it felt much longer because of the rate at which his senses were taking in information.
Much later he tested this by using a fairground machine to drop himself and his students at Baylor College in Houston backwards 150 feet. This made them afraid enough to test whether fear really did slow time down. It didn’t. Eagleman’s popularity and accessibility are based, not least, on his talent for theatrical experiments.
Aged nineteen or twenty, he was for a time a stand-up comedian, claiming to have been “pretty good”. More to the point, his father was a psychiatrist who often dealt with murderers.
“I learned from my father something that, as a child, you don’t want to believe – that people are really different on the inside. We are all the sole inhabitants of our own planet. He would sit down with murderers and I wanted to believe that if you talked to them long enough you would find they were just like you… but in fact they were just completely different. People are a multidimensional space, they are completely different from one another.”
At college he pursued this mystery of other minds by majoring in British and American literature but he was also interested in physics and space.
“I felt a bit frustrated about space somehow. I grew up watching Carl Sagan’s Cosmos on TV. But studying it was always so distant. You could spend your life studying it but you’d never get there. But I felt we’d got the brain cornered. There’s three pounds of it right behind your eyes. It’s probably more complex than anything that’s out there, but at least we have it cornered, you can hold a brain in your hands.”
He still writes fiction and sees a link between literature and neuroscience.
“It’s a different angle on the problem of knowing ourselves. Literature is a great way to get there. It allows you to jump into different people’s heads and different points in time and space and try to understand what it is to be somebody else… Studying the brain is asking fundamentally who any of us are, why we believe the things we do and why we take the actions we do. They’re two angles of the same problem – know thyself.”
“The thing is,” he adds later, “we’re all trapped inside our own heads our whole life so we don’t know what it’s like to be somebody else.”
The theme of other minds – the utter indecipherability of other humans – keeps appearing in his conversation; it’s an itch he can’t quite scratch.
Now at Baylor College in Houston he directs the Perception and Action Lab and the Initiative on Neuroscience and Law. He has made a six-part TV series – The Brain with David Eagleman – which ran on BBC4. His latest book was based on the series. His previous ones were Why the Net Matters, Incognito: The Secret Lives of the Brain and Sum, a bestselling set of forty short stories about possible afterlives about which Stephen Fry famously raved on Twitter. He has just been in LA discussing his next TV series and he is now going to move to San Francisco to take up a post at Stanford and, he hopes, to finish his novel Eon, though he also has another brain book on the go called Live Wired. He is also a brilliant talker at TED events, but, these days, almost everybody is. He has also started a company, Neosensory, to commercialise that VEST.
He was voted one of Houston’s ‘most stylish men’. Houston, I tease him, is not exactly Paris. He points out indignantly that Houston is America’s third largest city and that the rather unstylish track suit is just what he wears on planes. Then he turns all coy on the subject – “I have nothing to say about that”.
He looks like a seventies rock star – he has thick, black hair, sideburns and a loud, sometimes booming voice. He seems to chew his words and there is always a laugh coming through. When he says ‘a lot’ it comes out as ‘a laaart’. His favourite word is “Interesting!”, which he sort of sings in a falling cadence.
He has a dog called Maya. The Veil of Maya is, in Hindu philosophy, the veil of illusion that conceals the real world from our eyes – exactly what our selective senses do. The entire family except for Sarah, his wife, is named with neuroscientific intent. He has two children, a 4-year-old son called Aristotle, or Ari, which is Hebrew for lion, and a 7-month-old daughter called Aviva, which is Hebrew for spring. Apart from reminding himself that he is Jewish – though he has never been observant – the children present him with a very intimate version of the deep neuroscientific problem of other minds.
“The challenge of raising children is this increasing separation. They are completely attached to you and dependent on you at first and then it’s just this road to independence and it hurts a bit. I remember the time when my wife and I first moved Ari’s crib into his own bedroom and her eyes filled with tears, it was just a moment of this separation that happens.”
Unexpectedly, he tells me he was walking through Jacksonville airport when he saw a father and daughter, clearly separated for some time, hugging, unable to let each other go. “It brought a tear to my eye,” he says, and it brings another as he tells me. Knowing the incompleteness of the world provided by our senses clearly does not protect us from the feelings it inspires.
Okay, so the child’s brain is where we should start with the Eagleman view of neuroscience. The roughly 86 billion neurons in Aviva’s brain are generating connections at the rate of 2 million per second and will continue to do so until she is two. Ari’s brain is already cropping these connections and will continue to do so until the whole process settles down in his mid-twenties. This cropping process is how we become who we are. Children and teenagers are odd, wildly imaginative, risk-taking or just downright annoying because their neurons are over-connected. Cropping these connections is how the brain edits the flood of incoming information from the senses down to something that works for a viable adult human mind.
“What the child is exposed to,” he says, “that’s what prunes the garden.”
Basically it’s use it or lose it. The child’s brain stops using connections that are of no use in its contacts with the world and they die; the others grow in strength. If you don’t go through this process you’re in trouble. In his TV show Eagleman has some heartbreaking sequences involving now grown people who were, in their early years, locked up in Romanian orphanages receiving the bare minimum of care. The rest of their lives are dominated by the fact that their brains did not fully synch with the world the rest of us inhabit.
Your brain takes what it needs to make a world. Locked in the darkness and silence of your skull, it doesn’t see, smell, hear, taste or touch anything, rather it makes a world that works for you out of the electrical impulses flowing from your eyes, ears, nose, tongue and skin. The real world smells of nothing, feels like nothing and is colourless and tasteless.
“The thing about it being colourless and odourless,” Eagleman explains, “is what we would find so strange about reality if we could somehow see it. We wouldn’t be us if we could see it. But also we would see that it’s much much bigger than we ever imagined. We see about one ten trillionth of the electro-magnetic spectrum and we call that visible light, but the rest of it is exactly the same stuff – radio waves, X-rays, gamma rays, microwaves. It’s all the same stuff just at different frequencies. It’s totally invisible to us, it will pass right through our bodies. If you could see the whole world it would be … God, it would just be such a different experience it would blow our little minds off.”
He describes the brain as “a general purpose computing device” which simply deals with what it is given by the peripherals – eyes, noses etc – provided by nature. But, unlike a computer, the brain has plasticity, it constantly adapts and changes according to both the inputs and its own condition.
Perhaps the most staggering – and most hopeful – evidence of this plasticity came from a US study of 1,100 nuns, monks and priests. Their calling meant they had similar and very stable lifestyles, and they could easily be tracked through their lives and beyond, when their brains were removed for study. In some cases, the researchers found that though, as he puts it, their brains had been “chewed up by Alzheimer’s disease”, they had shown no cognitive defects. Why?
“They were intellectually active, they dealt with other people, they had chores, they had responsibilities, they talked to each other. Other people is the hardest thing that we have to face in terms of exercising our brains and they live in these communities until their old age. Even as their brains were falling apart, they were able to establish new roadways between point A and point B.
“The problem is that once people retire they tend to do less exploration of the world, they just watch TV and stop reaching out and challenging themselves. I guess when people reach a certain age they think they’ve paid their dues and now they’re going to relax, but that’s problematic.”
The other point about brain plasticity – and this is where that vest comes in – is that we don’t need to regard the input systems – our senses – as fixed. All these systems do the same thing: they send electrical impulses to the brain. This means we can create different systems. Famously, one blind man was given some sight by a light sensor fixed to his tongue. The electrical impulses from this were, after a while, interpreted by the brain as visual images. So, potentially, we can cure deafness and blindness simply by bypassing the ears and eyes. Or we could invent entirely new senses. Eagleman imagines us being able to sense our Twitter feeds or stock market prices.
The vest the conferencees were keen to see was invented by Eagleman and his students at Baylor. It’s called a Versatile Extra-Sensory Transducer – hence VEST – but it really is a vest in that it is worn tightly against the body beneath the clothes. Little vibrating motors “convert data streams into dynamic patterns of vibration across the torso”. The brain can learn to interpret these vibrations as just about anything we like. The vest can, potentially, make the deaf hear, the blind see and investors and twitterers even more enervated than they already are.
“The brain accepts what you feed it. There are many kinds of peripheral devices used by animals; we can make our own peripheral devices and feed any kind of information into it. Our peripheral devices were just inherited on the long road of evolution, there’s nothing fundamental about them, we can take in any kind of data.”
The vest has now been “spun off” into a company called Neosensory, which, I would guess, is perfectly capable of being the next Google.
Oh, and he’s started, seemingly by accident, a new religion. It’s called Possibilianism and, frankly, it knocks scientific atheism out of the park.
“That’s where I stand in terms of those big questions about the world. It’s essentially that we know way too much to buy into anybody’s particular story and we know way too little to pretend we’ve got the answers, so when I walk into a bookstore and see people arguing one side or the other, it seems strange to me. So I’ve defined this position, which is just keeping one’s mind open in the possible space of what is going on in this strange cosmos.”
He gave one talk about it and it caught on worldwide. He’s seen papers on Possibilianism from Uganda, India and all over the US.
“My book Sum is essentially the Possibilian manifesto. It’s 40 completely different short stories about what this is all about. I could write 400 or, collectively, we could write 4,000. That’s the point of it – to shine a flashlight into the possibility space.”
Possibilianism mandates humility – we just don’t know so many things. His own humility includes scepticism about the current state of our knowledge. He regards fMRI machines, the primary brain-scanning tools that have fired current interest in neuroscience, as far too primitive – “I’m not that big a fan, in twenty years we’ll look back and guffaw.” He does this looking back from twenty years in the future thing a lot.
“Google seems to be enormous now and in charge of the internet but when we look back in twenty years it will be Schmoogle or something.” And on the state of physics, he says, “In twenty years we may have forgotten about quantum mechanics and we’ll have schwantum mechanics.” He is, in short, impatient to get there, wherever there is. The present is never enough, and FOMO can include fear of missing the future.
Above all, he doesn’t want to miss the future because he is pretty convinced it will be good. He is a straightforward American optimist.
“If you look at the abolition of slavery, women’s suffrage, gay marriage – these steps all point in the same direction. They don’t come about for free, they take a lot of work, but they’re all pointing towards increased equality. That’s why I believe it’s possible to know ourselves a little better as a species and not just act like gorillas.”
His book Why the Net Matters argued that the internet would make tyranny impossible in the future: people would simply know too much. It was published in 2010 and looks, after the failure of the net-driven Arab Spring, distinctly dated now. But he is clinging on to his hope.
“What’s useful about the net now and probably into the foreseeable future is that it prevents governments, it means that the citizens have a voice. I think in general it is a force that goes against the force of tyranny.”
But central to his optimism is that, in the future, we will know ourselves better through the insights of neuroscience. He believes, for example, that education will include neuroscientific insights into the rise of tyranny and violence – how people demonise out-groups using the same language, typically comparing them to cockroaches or rats.
“If you imagine a world in which it is just part of education to learn what happens when you dehumanise an out-group, and to ask how young men aged 18 to 25 can come to believe they are right and everybody else is wrong, then it would be more difficult for something like ISIS or tyrants to arise.”
He speaks of enormous moral progress and quotes Martin Luther King – “The arc of the moral universe is long but it bends towards justice.”
I point out to him that this, in fact, does not work. The psychoanalysis of Freud – one of his heroes – was inspired by the entirely rational idea that if you explained to people what was wrong with them, they would get better. Overwhelmingly they didn’t. And the one great truth about ISIS and their successors is that the coherence of their position arises from the very rejection of all the wisdom represented by people like optimistic American neuroscientists, who are, by definition, wrong about everything.
He looks a little unhappy about all this, feeling, perhaps, that I am trying to deny him the future he habitually sees as brighter and better than the present. I feel a little guilty, but, on the other hand, I know he won’t read any of this because he told me “I can’t read articles about myself for the same reason Woody Allen can’t watch his own movies.”
What Eagleman represents is a peculiarly American kind of artist whose media are fiction, non-fiction, TED talks, television, the internet, strange experiments and wild tales. Neuroscience is his ostensible theme, but, really, it’s the same theme as all art. As he describes it, it is the “great puzzle to figure out what the f*** we are doing here”. And, FOMO sufferer that he is, he really wants to solve that puzzle.
He wanted to follow up the giant burger with a pudding.
“How ya doin’, bud?” he asks, clearly hinting I should join him in another couple of thousand calories. But he is shamed by my austere order of a mint tea and he has one too. Back in the lobby the lower regions of the assembled brains are still running rings round the prefrontal cortices.
“They’re going to have terrible hangovers when you give your talk.”
He grins, shrugs and lopes off to his room to write his talk.

Simenon: The Artist of the Irresponsible

Sunday, March 13th, 2016

Detective Chief Inspector Jules Maigret of the Direction Régionale de Police Judiciaire de Paris, situated at 36 Quai des Orfèvres, is the perfect modern man. He has no faith — his creator, Georges Simenon, abandoned Catholicism in his teens — and no time for great ideas of philosophy or society. He lives a comfortable bourgeois life, but he is in that world, rather than of it. His mind is usually far from his loving wife’s amply stocked table. His real dwelling is on what Robert Browning called “the dangerous edge of things”, the moments when the ordinary man plummets out of his ordinary life into crime.

Maigret is unlike other great fictional detectives — Sherlock Holmes, Father Brown, Miss Marple — in that he does not solve puzzles. He is more like Raymond Chandler’s Philip Marlowe, but he has none of Marlowe’s moral anguish. Nor does he, or his author, indulge in Chandlerian poetic musings. My theory is that he has influenced much more modern detectives: I remember Steve McQueen in Bullitt (1968) making me think of Maigret, as did David Tennant in Broadchurch. Superficially they are quite different, but they share that sense of damaged cops being ever more damaged by the world they contemplate. Like Maigret, they are Christlike in absorbing the sins of the world, but, of course, there is no Christ for them.

For Maigret, evidence and the moral condition of society are secondary: all that matters is understanding the mind of the murderer. This is the most difficult task, perhaps an impossible one, because the criminal himself does not know his motives. He is “irresponsible”.

“My very first Maigrets,” Simenon said, “were imbued with the sense, which has always been with me, of man’s irresponsibility. This is never stated openly in my writings. But Maigret’s attitude to the criminal makes it quite clear.”

“Even in the Maigrets,” writes the philosopher John Gray, the greatest British explicator of Simenon, “the question is not why a crime was committed, but how the person who committed the crime departed from a settled routine of living, and the detective resolves the conundrum by imaginatively entering into the life of the suspect.”

So, in the first of two ITV Simenon films — Maigret Sets a Trap — Rowan Atkinson’s detective wanders through the bloody killings of Parisian women as if in a dream. His team wait on him and watch, knowing he is working in ways that are inaccessible or even offensive to ordinary cops. He gives the impression that, in some ultimate sense, he knows the murderer long before he meets him, and when he does meet him, the recognition is instant.

Atkinson is the latest in a distinguished line of TV and movie Maigrets. There was the still fondly remembered Rupert Davies from the 1960s TV series, and the more robust Michael Gambon in the 1990s. In France, there were three Maigret films starring the great Jean Gabin. No pressure, then, Rowan. But don’t worry, he pulls it off, not least by silently capturing the intensity of the unexpressed workings of the maestro’s mind.

Behind all this lies Simenon the man, one of the most successful authors of all time. He sold more than 600m books around the world. He wrote almost 500 and could knock off a Maigret in 10 days — eight days’ writing and two revising.

These astounding figures, and the fact that his most famous works are detective stories, have perhaps disguised from a few literary snobs that he is, without doubt, one of the greatest writers of our time, not least because, like Maigret, he is so perfectly of our time. André Gide was one of the earliest to point this out, and he has been followed by William Faulkner, Muriel Spark, Peter Ackroyd, PD James and John Banville.

The keeper of the flame is John Simenon, one of the three children of Simenon and his second wife, Denyse Ouimet. He was born in America and christened Jean, which swiftly became Johnny and now seems to have settled down to John. Though his first language was English, the family left America when he was young, and he now speaks English with a pronounced French accent.

He has spent his life working in the world of the media and intellectual property, but now, aged 67, he commits himself to his father, as, apparently, he has always done. In his memoir When I Was Old (great title!), Simenon, seemingly bewildered, writes of 10-year-old Johnny’s dogged devotion: “He has become a sort of disciple, which is disturbing. It’s a relation I’m not used to, and when I’m conscious of it, it bothers me.”

In a giant room in a Soho hotel, I remind John of this passage. He did indeed admire his father, but this account is, he feels, one of the exaggerations to which his father was prone — the most famous being his celebrated boast that he’d slept with 10,000 women. “I knew you would ask about that. He said that in a discussion about Casanova with Fellini. To take it literally is a bit of a jump, and I never did… My father is clearly a person who attracted the need to create a mythology around him.”

Nevertheless, devotion to the father runs in the family. Simenon was devoted to his, although, John points out, he was actually more like his mother. He also recalls that Simenon always said being a parent — he had four children — was his most important role in life. He told him, “Le métier d’homme est difficile” — being a man is a hard task — a wonderful manly message for a son. It was not, however, always hard for Simenon.

Whatever the actual number, he clocked up a large number of conquests, including, most famously, Josephine Baker, the black American dancer who became a huge celebrity and artists’ muse in Paris during the 1920s and 1930s. He was, in authorship and life, burdened with boundless energy. One symptom was women, another his pathological urge to move home, or just to move in general. John does a huge Gallic shrug when I mention this.

“I went through his diaries not long ago, trying to create a timeline of his life, and he was all over the place all the time. When I was born, in Tucson, he was going to LA twice a month. In those days, you had to have a car specially fitted for the journey. His whole life he was doing things like that.”

Another, darker aspect of the mythology is the suggestion that Simenon was a collaborator with the Germans during the war. The truth is that this has twice been dismissed by official investigations. All he seems to have been doing is getting on with his work with minimum interference.

“The facts are simple: he was a foreigner [Simenon was originally Belgian] in an occupied country and he had to report to the Germans every week. The collaboration is all part of the mythology. My father never went to Berlin, he never wrote a sentence in favour of the fascist regime and he never thought about them.”

The fact that Simenon was a foreigner in France is important. Though we tend to think of Maigret as the supremely French detective, John says he would have been completely different if the author had been born and brought up in France — “Not that he would have been a Belgian detective, just that he would have been different.”

Perhaps Simenon never lost the watchfulness, the anxiety and curiosity of the man abroad. The foreigner, the displaced and deracinated hero, is, of course, a central figure in modern literature; Maigret always feels like a stranger in a strange land.

Simenon was also a hypochondriac. A story he told about himself was that he had his blood pressure taken once when he embarked on a 10-day Maigret, and once when he had finished. It had usually fallen, he said. In fact, John says, he had his blood pressure taken all the time, not just when starting or finishing a book, and he introduced the American habit of annual check-ups for all his family. He needn’t have worried. He died in September 1989, aged 86, having burnt the candle at both ends and got away with it.

His later years had been darkened by the suicide of his daughter, Marie-Jo, in 1978, when she was 25. John can scarcely speak of this. He offers no explanation and clearly feels there is nothing to be said. “It affected him more than anything, not because he was Simenon, but because he was a human being.” Her death inspired one of his last autobiographical books, Intimate Memoirs.

Nevertheless, in old age he attained a degree of what John calls “tentative serenity”, a rest from his superhuman energy and his unremittingly bleak view of the human animal. He found peace in acceptance. “Maigret’s was a far from perfect world, but he finds his place in it. That’s the difference between Maigret and my father. My father finds a certain centre of gravity within that chaotic world. You felt he would have to struggle much harder to be able to find that serenity.”

Simenon’s reputation seems to fluctuate more than most. John says, a touch bitterly, that he was utterly dismissed in France in the 1960s. “But now I have the feeling people are definitely getting it much more intuitively and quickly when they are reading the romans durs [the “hard novels”, generally regarded as his finest].

“There’s a new generation discovering him. David Hare has adapted a late book, La Main, as The Red Barn, and that’s going to be on at the National Theatre. That was dismissed when it was first published.”

I ask him why we are now beginning to understand Simenon’s works. “There is a resonance, I think, that is taking place. It may have to do with the world we live in today. It’s not an easy world. We had an illusion in the 1960s. Now we are post-ideological, post-growth.” We live, perhaps, in the world of Maigret’s mind.

John, however, now seems to have found comfort and, perhaps, serenity in his immersion in the world and works of Simenon. He speaks often of his gratitude to writers such as Banville and, especially, Gray for offering him new critical pathways into that mighty imagination. Does he do all this as an act of love or, perhaps, payback for the father to whom he was devoted?

“No. I’m going to sound a little strange here. I’m really doing it for pleasure. People say his books are like a mirror offered to the reader, but that mirror is my father, and when I am reading, I am going deep into my own roots. How can I find that a burden, for Christ’s sake?”