The Atomic Human, page 5
This said:
‘Under the Command of General Eisenhower, Allied Naval Forces supported by strong Air Forces, began landing Allied Armies this morning on the Northern coast of France.’
This was how Fred found out that his unit was shipping out to France, while he was 80 miles away at home in Kenilworth.
Fred’s story highlights one of the challenges for the information topography. He was ordered to stay in Hampshire, but he didn’t know the motivation for the order. He was balancing his obligation to his family against his wider obligation to his army unit. Fred had made his own decision about what to do, but now the context had changed. He left Kenilworth immediately and spent the next two years fighting in France and Germany. When he did eventually get back to Kenilworth, his son, my father, didn’t know who Fred was. He kept asking my grandma, ‘When is that man going to leave?’
According to the logic of command, Eisenhower was in charge of Fred, but the devolution of authority meant that Fred could operate outside Eisenhower’s orders. Limited information bandwidth didn’t just undermine Fred’s understanding; it also restricted Eisenhower’s ability to control his men. You might think the answer is simple – servicemen like Fred should just obey orders – but it’s not quite that easy. Troops on the ground have information that their commander doesn’t have. Sometimes they need to follow their instinct rather than their orders. By devolving authority to troops rather than micromanaging their activities, the military can be more flexible.
I often think of this devolutionary step as being akin to riding a bicycle with no hands. A simple logical model for a bike would suggest that it is steered through the actions of the rider’s hands on the handlebars, but the reality is more nuanced. A professional cyclist steers not just with her hands but through her body’s position on the bicycle. She steers with her posture, using the weight of her body to guide the bicycle. If the distribution of weight is correct, the bicycle begins to steer itself.
A bicycle is more stable when the rider’s weight is towards the rear wheel. So, somewhat counterintuitively, it is easier to ride leaning back with no hands on the bars. Devolving responsibility is analogous. People become more confident when you empower and entrust them, but just like a rider leaning back on a bicycle, this empowerment can feel to the leader like a loss of control. Leaning back makes the bicycle more stable, but it makes any necessary interventions – braking or negotiating a tricky corner – harder because the rider’s hands are further from the controls than she would like. Devolution of power has similar challenges: it implies a different form of exercising authority.
When not building survival shelters in the Palm Springs desert, one of Jeff Bezos’s interests was machine learning. I met Jeff in Palm Springs when I was still a professor in Sheffield. The survival shelter construction was part of the MARS conference: a thought-space event focusing on space, robotics and machine learning with team-building activities like the shelter construction.
I had given a talk there about the challenges of data and artificial intelligence. Machine learning is programming a computer by providing it with examples. But the really big breakthroughs in machine learning that triggered international interest in artificial intelligence were achieved only when we acquired a lot of examples – many more examples than humans require to do the same job. The success of machine learning was driven by an internet-enabled information topography. With the internet, data was easier to acquire, but raw internet data is not always sufficient to teach the machine. Each example needed to be refined before the machine could consume it, and the new breakthroughs were coming only when the machine was shown millions of these refined examples.
This meant that large tech companies had a significant edge over their smaller rivals when using this new technology. My team at Sheffield had started a small company. It focused on how to do machine learning when examples are scarce. Either the talk went down well, or maybe Jeff was impressed by my ability to stay out of his way when he was building survival shelters, because within a year Amazon had acquired my team and Jeff had become my supreme commander.
It was seeing how Amazon operated to devolve authority that drew my attention to the challenges of leading a large company. When I joined, in 2016, it had a quarter of a million employees. By the time I left, in 2019, it had close to a million. Eisenhower had commanded over a million troops, many of whom were, like my grandfather Fred, waiting in English fields under their own canvas shelters for the invasion to start.
Jeff may have been hands-on when building shelters, but his approach to running his company was largely hands-off. Amazon has a corporate culture that teaches its employees how Jeff sees the company and its role and empowers them to make decisions in the light of that knowledge. I used to joke that it’s as if Amazon surgically implants a little Jeff Bezos inside your head. Whatever the circumstances you find yourself in, you have been trained to think how Jeff thinks. The hope is that you act as Jeff would act. Of course, Jeff does intervene in the company, but his interventions are calculated and strategic. They are the equivalent of putting the brakes on the corporate bicycle or steering around a tricky obstacle. The day-to-day management is devolved to the employees through their understanding of the company’s culture: mostly, the bicycle has been trained to steer itself. Amazon is managed like this because it’s run on too large a scale for Jeff to be involved in every important decision. It focuses on building a culture and training its employees in it. Then it trusts its employees to try and make the decision that Jeff would have made, had he been there. Like Cicero’s notion of art, music and literature cultivating our minds, Amazon cultivates its employees through training them in the company’s priorities.
In any large organization there is a tension between an individual’s autonomy and the advantages that come through wider coordination of their activities. The human’s locked-in condition provides bottlenecks: information cannot propagate instantaneously through an organization. When Bezos was still in charge at Amazon, he mostly steered the company through posture, enabling it to run itself, and much of the communication within the company was through the sharing of culture.
We have an expression in English: ‘It goes without saying’. We use it to suggest that the next piece of information to be exchanged is superfluous. For example: ‘It goes without saying that, after listening to the morning radio bulletin, Fred hightailed it back to Aldershot on the next available train.’ The use of ‘it goes without saying’ reflects the fact that Fred’s next actions are predictable. They are a direct result of his understanding of a wider common purpose. It is a purpose you can instinctively recognize, even though you know very little about Fred, other than knowing that he is a human living in a society with certain obligations. So, it goes without saying that’s what he did. Of course, I’m saying it anyway, because his actions are important in the context of the wider narrative we’re building.
So, Fred boarded the next train and headed off to war. As an isolated decision, this might not seem like rational behaviour, but when you know his obligations to his unit and to the wider society, the decision makes sense. Perhaps the only thing missing for you to be able to predict Fred’s actions is an understanding of his personal values. Well, now you know him a little better, because Fred got on a train to go to war.
Imagine you own a small shop and I come in and tell you what a lovely window display it has. Then, if I say how terrible it would be if someone threw a brick through the window, you may perceive what I’m saying as a threat and think I represent some form of protection racket. Although I have vocalized my hope that your window display will not suffer any damage, you may feel intimidated.
Why is that? It’s perfectly reasonable for me to hope silently that your shop window won’t be smashed: it goes without saying it would be a terrible shame if it was. By choosing to say it, by using our precious bandwidth to communicate information that should be obvious, I’m implying that the opposite may happen. This sophisticated form of communication relies on a shared understanding of common purpose and context. We expect each other to come to our own conclusions about appropriate actions given this context, we can second-guess what those actions should be, and we can allow for individuals to operate autonomously given our shared purpose. This is how we deal with our very limited bandwidth, and it’s also why it’s difficult to capture the nuance of human communication and actions without accounting for this context.
The common purpose in war is the defence of a nation or a people: having an adversary brings alignment among the people. But in peace we can also coordinate around common ideals. Recent advances in our AI capabilities have been developed by building neural networks which have read all digitally available human written knowledge. These networks, known as transformers, have consumed billions of documents and given the machine an ability to converse. This ability is built on an awareness of our context gathered from consuming our written works. The different roles we each play in society are intertwined in this context. In modern society, some of that context is given by our working life: we have defined a set of professions just as the ants and the bees have evolved to carry out different roles in their colonies.

As an academic, my profession conforms to a common set of ideals that informs my work. I started my academic career at Aston University in Birmingham, moved across to Cambridge, then north to Sheffield, on to Manchester, then returned to Sheffield and most recently to Cambridge. This range of universities shares a common feature: a sense of academic culture. Academics have their own mythology. We view ourselves as fiercely independent, objective truth-seekers. Our research is curiosity-driven and its impact is measured through the respect of our esteemed peers. We combine this with the education of the next generation, ensuring a ready supply of fertile minds to carry the baton forward. In practice, academic reality often departs from our mythology, because academics are also humans. We have individual autonomy and are subject to human weaknesses such as egotism. Our academic ideals are often sacrificed to our desire for recognition, funding and promotion. Despite this, the academic ideal still serves as a cultural regularizer. Even if very few of us fulfil it, it represents something to strive for. We know what we should do, even if we don’t always do it.
Just like in Top Trumps, individual academics may be better at certain aspects of the role than others. Some may be better lecturers, others may be better researchers and still others may be better mentors.
After two decades steeped in academic culture, when I left Sheffield to join Amazon in 2016 I had some adapting to do. While my caricature of Amazon’s corporate culture is that it places a little Jeff in your head, you won’t be surprised to hear that that’s not how the culture is presented to new recruits. What you are taught is a set of values that capture the essence of how the company wants you to behave. The core component is known as the ‘leadership principles’, and they are like the ten commandments of the company. But there are a couple of differences. First, there were fourteen of them at the time, and, secondly, they can change over time. For example, since I left Amazon, they have added two more, so now there are sixteen. One of the most important is ‘customer obsession’. The company likes to view the customer’s interests as being at the heart of all the decisions it makes. Another is ‘earn trust’, which describes how you build relationships with other employees and different teams and reflects the importance of trust in any system of devolved autonomy. Trust between individuals and groups is one way we overcome our bandwidth limitations. There are also other, more complex principles like ‘right a lot’, about how you assimilate information to make decisions.
The leadership principles give Amazon a set of common values, but the company recognizes that specific projects may need a different set of more focused ones. Teams are encouraged to come up with a list of ‘tenets’ for their projects. In Amazon parlance, tenets are values that individual teams design, a set of mini-commandments to represent how that team addresses problems. This allows for some adaptation of the corporate common purpose to better suit the needs of individual teams, so the values can be contextualized according to the circumstances the team finds itself in. The system of shared corporate values creates a culture that gives the context for individuals to make decisions. By teaching values, the company can devolve decision-making, trusting that individuals are capable of the nuance required to judge a particular situation. This topography helps Amazon deal with the communication barriers associated with locked-in intelligence. It gives Jeff a set of cultural levers through which he can steer his company.
Let’s contrast human communication with two machines engaged in similar conversations. Machines don’t have access to context or an understanding of common purpose, but they do have a great deal of bandwidth. So if a customerbot walked into a shop and suggested to a shopkeeperbot that it would be a shame if the shopkeeperbot’s window was broken, then the shopkeeperbot would not find it unusual that this information had been shared, because it would be only a small part of the myriad facts and figures the two machines are capable of exchanging in milliseconds.
When the Allies were about to invade France, the Eisenbot, on finding the Fredbot was out of position on the day of the planned invasion, could use its enormous bandwidth to send direct orders about exactly what the Fredbot should do to get back to its unit. The Eisenbot could directly order the Fredbot to report to a particular place while simultaneously giving the 160,000 troopbots crossing the Channel that morning their specific landing orders. Further, the Eisenbot could have compelled the Fredbot to remain in Aldershot. In a machine-dominated topography, a mechanistic topography, the Eisenbot would control everything, and the Fredbot and the other troopbots would not be autonomous but automatons.
As embodied humans, we cannot handle the quantity of communication this mechanistic topography would require. Instead, imbued with a sense of common purpose, we short-circuit our embodiment. We view the circumstances we are faced with, we understand our common purpose, we understand our fellow humans, we second-guess their behaviours and we choose how to act. Humans do not coordinate as directed automatons; we coordinate through devolved autonomy. Or at least that’s how it used to be.
I decided to start my career in neural networks when I was working on oil rigs in Liverpool Bay. It was 1996 and the internet was still in its infancy. I didn’t have a network of tech-world contacts I could ask for advice, so I headed to Foyles, a bookshop on Charing Cross Road in London, once the world’s largest. I found my way through the maze of shelves to the neural-network books. A fellow browser had a professorial look about him, so I asked him which book I should buy.
In the information topography, Foyles was providing an information hub where I could meet like-minded people and exchange understanding and opportunities, a place where information could be exchanged and redistributed. A bookshop becomes a hub because it stores information in the form of books and those books attract interested readers to congregate around its shelves. The internet has radically altered this information landscape, but twenty-five years ago, when I had to physically travel to Foyles, the information topography was closely tied to our physical topography.
Today we have networks of interconnected machines that can coordinate decision-making between them. This is the tectonic shift in the information topography we are faced with, and the early tremblings of this shift originate in the moment when Eisenhower decided the invasion of Europe should be launched.
On 5 June, Eisenhower was in conference with his staff. Tidal patterns dictated that a landing in France would have to be on 6 or 7 June. Allied forces had an information topography that shared orders and plans, and the military has a culture around how orders are followed. But plans were shared only on a need-to-know basis, which is why soldiers like Fred were not told when the invasion was due to begin. That afternoon, even Eisenhower didn’t know the invasion was about to begin.
The problem was the weather: storms would make an amphibious assault across the Channel difficult and prevent air forces from supporting the naval troops. The UK Met Office had predicted a possible break in the weather on 6 June, but this window was short and uncertain. Eisenhower was faced with a difficult decision. At a key moment during the conference he was handed a slip of paper, and he turned to the attendees and announced: ‘We go tomorrow.’
Later that day, Eisenhower reflected in a letter: ‘My decision to attack at this time and place was based upon the best information available.’ It certainly was, because that slip of paper contained Rommel’s direct orders from Adolf Hitler, his Supreme Commander in Berlin. Those orders told Rommel to hold back the German tank divisions from Normandy, as an attack was anticipated at Calais. The attack was expected five days after the one on the Normandy coast.1 Creating the phantom army in Kent had worked. Eisenhower had a window of opportunity to establish his forces in France.
How did Eisenhower come to know Rommel’s orders? The information was from the hub at the heart of the Allied information topography: Bletchley Park.
Eisenhower’s decision had millions of downstream effects. He didn’t have the capacity to follow up with each of his subordinates, but he had the confidence to know the organization around him would do the right thing. He could ‘lean back on the bicycle’, knowing it would respond to his posture as he willed it in this new direction. And so it was that Fred, who until that moment had represented a bent spoke, up in Kenilworth, could respond to Eisenhower’s command and straighten himself out.
Much of the Supreme Commander’s control is through posture, but there are still major decisions to be made, like the launch date for the invasion. To ensure those decisions are well informed, the command structure assimilates, sifts and summarizes information, bringing it to the attention of the leaders. This distillation process led to the secrets that were unpicked at Bletchley Park being presented to Eisenhower at the conference.
