The Misconception: You should focus on the successful if you wish to become successful.
The Truth: When failure becomes invisible, the difference between failure and success may also become invisible.
In New York City, in an apartment along the Hudson River, above trees reaching out over sidewalks and dogs pulling at leashes and conversations cut short to avoid parking tickets, a group of professional thinkers once gathered and completed equations that would both snuff and spare several hundred thousand human lives.
People walking by the apartment at the time had no idea that four stories above them some of the most important work in applied mathematics was tilting the scales of a global conflict as secret agents of the United States armed forces, arithmetical soldiers, engaged in statistical combat. Nor could people today know, as they open umbrellas and twist heels on cigarettes, that nearby, in an apartment overlooking Morningside Heights, one of those soldiers once effortlessly prevented the United States military from doing something incredibly stupid, something that could have changed the flags now flying in capitals around the world had he not caught it, something you do every day.
These masters of math moved their families across the country, some across an ocean, so they could work together. As they unpacked, the theaters in their new hometowns replaced posters for Citizen Kane with those for Casablanca, and the newspapers they unwrapped from photo frames and plates featured stories still unraveling the events at Pearl Harbor. Many still held positions at universities. Others left those sorts of jobs to think deeply in one of the many groups that worked for the armed forces, free of any other obligations aside from checking in on their families at night and feeding their brains during the day. All paused their careers and rushed to enlist so they could help crush Hitler, not with guns and brawn, but with integers and exponents.
The official name for the people inside the apartment was the Statistical Research Group, a cabal of geniuses assembled at the request of the White House and made up of people who would go on to compete for and win Nobel Prizes. The SRG was an extension of Columbia University, and they dealt mainly with statistical analysis. The Philadelphia Computing Section, another group made up entirely of women mathematicians, worked six days a week at the University of Pennsylvania on ballistics tables. Other groups with different specialties were tied to Harvard, Princeton, Brown and others, 11 in all, each a leaf at the end of a new branch of the government created to help defeat the Axis – the Department of War Math.
Actually…no. They were never officially known by such a deliciously sexy title. They were instead called the Applied Mathematics Panel, but they operated as if they were a department of war math.
The Department, ahem, the Panel, was created because the United States needed help. A surge of new technology had flooded into daily life, and the same wonders that years earlier drove ticket sales to the World’s Fair were now cracking open cities. Numbers and variables now massed into scenarios far too complex to solve with maps and binoculars. The military realized it faced problems that no soldier had ever confronted. No best practices yet existed for things like rockets and radar stations and aircraft carriers. The most advanced computational devices available were clunky experiments made of telephone switches or vacuum tubes. A calculator still looked like the mutant child of an old-fashioned cash register and a mechanical typewriter. If you wanted solutions to the newly unfathomable problems of modern combat you needed powerful number crunchers, and in 1941 the world’s most powerful number crunchers ran on toast and coffee.
Here is how it worked: Somewhere inside the vast machinery of war a commander would stumble into a problem. That commander would then send a request to the head of the Panel who would then assign the task to the group he thought would best be able to resolve the issue. Scientists in that group would then travel to Washington and meet with top military personnel and advisors and explain to them how they might go about solving the problem. It was like calling technical support, except you called a computational genius who then invented a new way of understanding the world through math in an effort to win a global conflict for control of the planet.
For instance, the Navy desperately needed to know the best possible pattern, or spread, of torpedoes to launch against large enemy ships. All they had to go on was a series of hastily taken, blurry, black-and-white photographs of turning Japanese war vessels. The Panel handed over the photos to one of its meat-based mainframes and asked it to report back when it had a solution. The warrior mathematicians solved the problem almost as soon as they saw it. Lord Kelvin, they told the Navy, had already worked out the calculations in 1887. Just look at the patterns in the waves, they explained, see how they fan out in curves like an unfurling fern? The spaces tell you everything; they give it all away. Work out the distance between the cusps of the bow waves and you’ll know how fast the ship is going. Lord Kelvin hadn’t worked out what to do if the ship was turning, but no problem, they said. The mathematicians scribbled on notepads and clacked on blackboards until they had both advanced the field and created a solution. They then measured wavelets on real ships and saw their math was sound. The Navy added a new weapon to its arsenal – the ability to accurately send a barrage of torpedoes into a turning ship based only on what you could divine from the patterns in the waves.
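The physics behind Kelvin's 1887 result can be sketched in a few lines. For deep-water gravity waves, wave speed depends only on wavelength, and the transverse crests of a ship's wake travel at the speed of the ship that made them – so measuring the spacing between crests reveals the speed. This is a minimal sketch of the straight-line case only; the Panel's extension to turning ships, and any specific numbers here, are illustrative, not their actual working.

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def speed_from_wake(wavelength_m: float) -> float:
    """Estimate a ship's speed from the spacing of its transverse wake crests.

    Deep-water gravity waves obey the dispersion relation
    c = sqrt(g * wavelength / (2 * pi)), and the transverse waves in a
    Kelvin wake move at the same speed as the ship generating them.
    """
    return math.sqrt(G * wavelength_m / (2 * math.pi))

# A ship leaving crests about 26 m apart is moving at roughly 6.4 m/s:
print(round(speed_from_wake(26.0), 2))  # 6.37
```

The key point is that the photograph alone suffices: wavelength is visible from above, and everything else follows from the dispersion relation.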
The devotion of the mathematical soldiers grew stronger as the war grew bloodier and they learned that the things they etched on hidden blackboards and jotted on guarded scraps of paper determined who would and would not return home to their families once the war was over. Leading brains in every scientific discipline had eagerly joined the fight, and although textbooks would eventually devote chapters to the work of the code breakers and the creators of the atomic bomb, there were many groups whose stories never made headlines that produced nothing more than weaponized equations. One story in particular was nearly lost forever. In it, a brilliant statistician named Abraham Wald saved countless lives by preventing a group of military commanders from committing a common human error, a mistake that you probably make every single day.
Colleagues described Wald as gentle and kind, and as a genius unsurpassed in his areas of expertise. His contributions, said one peer, had “produced a decisive turn in method and purpose” in the social sciences. Born in Hungary in 1902, the son of a Jewish baker, Wald spent his childhood studying equations, eventually working his way up through academia to become a graduate student at the University of Vienna where the great mathematician Karl Menger mentored him. He was the sort of student who offered suggestions on how to improve the books he was reading, and then saw to it that those suggestions were incorporated into later editions. His mentor would introduce Wald to problems that made experts in the field rub their beards, the sort of things with names like “stochastic difference equations” and the “betweenness among the ternary relations in metric space.” Wald would not only return within a month or so with the solution to such a problem but politely ask for another to solve. As he advanced the science of probability and statistics, his name became familiar to mathematicians in the United States where he eventually fled in 1938, reluctantly, as the Nazi threat grew. His family, all but a single brother, would later die in the extermination camp known as Auschwitz.
Soon after Wald arrived in the United States he joined the Applied Mathematics Panel and went to work with the team at Columbia stuffed in the secret apartment. His group looked for patterns and applied statistics to problems and situations too large and unwieldy for commanders to get their arms around. They turned the geometry of air combat into graphs and charts and they plotted the success rates of bomb sights and various tactics. As the war progressed, their efforts became focused on the most pressing problem of the war – keeping airplanes in the sky.
In some years of World War II, the chances of a member of a bomber crew making it through a tour of duty were about the same as calling heads in a coin toss and winning. As a member of a World War II bomber crew, you flew for hours above an entire nation that was hoping to murder you while you were suspended in the air, huge, visible from far away, and vulnerable from every direction above and below as bullets and flak streamed out to puncture you. “Ghosts already,” that’s how historian Kevin Wilson described World War II airmen. They expected to die because it always felt like the chances of surviving the next bombing run were about the same as running shirtless across a football field swarming with angry hornets and making it unharmed to the other side. You might make it across once, but if you kept running back and forth, eventually your luck would run out. Any advantage the mathematicians could provide, even a very small one, would make a big difference day after day, mission after mission.
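The coin-toss odds come from simple compounding: even a high per-mission survival rate shrinks fast when multiplied across a full tour. The figures below are round numbers chosen to illustrate the arithmetic – actual rates varied widely by year, theater, and aircraft.

```python
# Illustrative numbers only: a 96% chance of surviving any single
# mission, compounded over a 25-mission tour of duty.
per_mission_survival = 0.96
tour_length = 25

tour_survival = per_mission_survival ** tour_length
print(f"{tour_survival:.0%}")  # 36% -- not far from a coin toss
```

This is why even a tiny improvement from the mathematicians mattered: a single percentage point per mission, compounded over dozens of missions, shifts the tour odds substantially.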
As with the torpedo problem, the top brass explained what they knew, and the Panel presented the problem to Wald and his group. How, the Army Air Force asked, could they improve the odds of a bomber making it home? Military engineers explained to the statistician that they already knew the Allied bombers needed more armor, but the ground crews couldn’t just cover the planes like tanks, not if they wanted them to take off. The operational commanders asked for help figuring out the best places to add what little protection they could. It was here that Wald prevented the military from falling prey to survivorship bias, an error in perception that could have turned the tide of the war if left unnoticed and uncorrected. See if you can spot it.
The military looked at the bombers that had returned from enemy territory. They recorded where those planes had taken the most damage. Over and over again, they saw that the bullet holes tended to accumulate along the wings, around the tail gunner, and down the center of the body. Wings. Body. Tail gunner. Considering this information, where would you put the extra armor? Naturally, the commanders wanted to put the thicker protection where they could clearly see the most damage, where the holes clustered. But Wald said no, that would be precisely the wrong decision. Putting the armor there wouldn’t improve their chances at all.
Do you understand why it was a foolish idea? The mistake, which Wald saw instantly, was that the holes showed where the planes were strongest. The holes showed where a bomber could be shot and still survive the flight home, Wald explained. After all, here they were, holes and all. It was the planes that weren’t there that needed extra protection, and they had needed it in places that these planes had not. The holes in the surviving planes actually revealed the locations that needed the least additional armor. Look at where the survivors are unharmed, he said, and that’s where these bombers are most vulnerable; that’s where the planes that didn’t make it back were hit.
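A toy simulation makes the bias vivid. Assume, purely for illustration, that enemy fire strikes every section of a plane equally often, but that a hit to some sections is far more likely to bring the plane down. The section names and lethality figures below are invented for the sketch, not drawn from any wartime data.

```python
import random

random.seed(0)

# Assumed, illustrative probability that a single hit to each
# section downs the plane:
lethality = {"engine": 0.8, "cockpit": 0.7, "fuselage": 0.1, "wings": 0.15}

hits_overall = {s: 0 for s in lethality}
hits_on_survivors = {s: 0 for s in lethality}

for _ in range(100_000):
    # Each plane takes one hit in a uniformly random section.
    section = random.choice(list(lethality))
    hits_overall[section] += 1
    if random.random() > lethality[section]:  # plane survives the hit
        hits_on_survivors[section] += 1

# Hits are spread evenly across sections, yet among the planes that
# return, holes cluster in the fuselage and wings -- exactly the
# pattern the commanders saw on their airfields.
for s in lethality:
    print(s, hits_overall[s], hits_on_survivors[s])
```

The returning planes are a filtered sample: the engine hits are rare among survivors not because engines are rarely hit, but because planes hit there rarely come back to be counted.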
Taking survivorship bias into account, Wald went ahead and worked out how much damage each individual part of an airplane could take before it was destroyed – engine, ailerons, pilot, stabilizers, etc. – and then through a tangle of complicated equations he showed the commanders how likely it was that the average plane would get shot in those places in any given bombing run depending on the amount of resistance it faced. Those calculations are still in use today.
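The core of Wald's inversion can be sketched with Bayes' rule. If one assumes, as a simplification, that enemy fire strikes each section of the aircraft equally often, then the probability of surviving a hit to a section is proportional to the share of survivor bullet holes found there. The numbers below are hypothetical; Wald's real analysis was far more elaborate, handling multiple hits and varying fire intensity.

```python
# Hypothetical shares of bullet holes observed on returning bombers:
survivor_share = {"engine": 0.05, "cockpit": 0.08, "wings": 0.35, "fuselage": 0.52}

# Simplifying assumption: hits land on each section equally often.
# Then by Bayes' rule, P(survive | hit in section s) is proportional
# to the share of survivor holes found in s.
total = sum(survivor_share.values())
relative_survivability = {s: share / total for s, share in survivor_share.items()}

# The section that is nearly hole-free on survivors is the most lethal
# place to be hit -- and therefore where the armor should go.
most_vulnerable = min(relative_survivability, key=relative_survivability.get)
print(most_vulnerable)  # engine
```

Reading the data this way flips the commanders' intuition: the sparsest holes, not the densest, mark the sections to reinforce.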