on genomes, sexual attraction, and the foundations of mathematics.
With the kind of nonlinear shock to be expected from a Tarantino flick, I open with an excerpt from the New York Times review of the second book I will cover, A Billion Wicked Thoughts,
“The concentrated essence of this curious book is contained in its 11th chapter, which attempts to explain what the ‘Mona Lisa’ has in common with Chicken McNuggets, vampire novels and the concluding scene of most pornographic videos.”
Got your attention? Great, but we’ll get to that one in due course …
One in a Billion by Kathleen Gallagher and Mark Johnson tells the remarkable story of Nic Volker, the first person in history to receive a whole-genome analysis as a medical diagnostic tool. Gallagher and Johnson are reporters who initially covered the story in Pulitzer Prize-winning reporting for the Milwaukee Journal Sentinel, which was adapted for this utterly riveting book: part medical case study, part heart-wrenching family drama, and part scientific detective story.
In 2004, Nic Volker seemed to be a regular, happy two-year-old when his mother brought him to the emergency room of their local Wisconsin hospital for what she assumed was a simple abscess. This began a tormenting two-year saga of drastic swings in Nic’s health as one treatment after another failed to cure him, to the point where doctors had to concede that they had no idea what was wrong with Nic, that they did not know how to treat him, and that he was going to die.
Told in riveting parallel is the professional background of the team of scientists and doctors who would end up saving Nic’s life. As the Human Genome Project pushed on in the mid-90s, several prescient medical professionals were envisioning a future in which the results of this fundamentally scientific project could be industrialized, cheapened, and applied to medicine. One such group formed around the Medical College of Wisconsin under Howard Jacob. They sought to build on the work of the Human Genome Project and their own specialty in, of all things, rat genomics, to bring genomic science into medicine.
They envisioned a future in which genomic analyses are performed as routinely as blood pressure readings or urine samples, and in which couples are counseled on the suitability of having children based on the risk factors their children could inherit — a future we are far closer to as a result of the events described in this book.
Five years ahead of the original schedule, Jacob was approached by Nic’s doctors at the affiliated Children’s Hospital of Wisconsin: a whole-genome sequence was the last recourse to find out what was wrong with Nic, to use as a tool to craft a cure, and to save his life. In a truly historic event, the sequence was carried out and the source of the illness discovered. The reason nobody could work out what was wrong with Nic was that his illness was caused by a genetic disease that had never been seen before. No DNA from any animal, never mind any human being, had ever been found containing the single mutation responsible for Nic’s illness; he was one in a billion.
One in a Billion is in no way just a chronological retelling of a medical breakthrough. Arguably Gallagher and Johnson’s greatest asset is the ease with which they convey the story as being less about science than about people: the horrible stress inflicted upon Nic’s family, the optimism, ingenuity, and bravery of the medical team, the foresight of the academics whose work years before led to the Human Genome Project itself and all that followed. One unsettling episode recounts how Nic’s mother Amylynne elected for breast reduction surgery in order to be taken more seriously by the medical staff treating her son. Gallagher and Johnson do not let us forget that the events retold depend on the behavior of people, and on the institutions (academia, medicine, family, church) that mold their motivations. A pair of contrasting excerpts should reveal this craft:
“The boy knows where the operating room is. He knows the different floors of the hospital. The massive building, confusing to most adults, is home. What friends Nic has are here. His mom arranged hospital play dates. But friendships in the hospital are fleeting. Nic learns that friends always leave. They go home. Or they die.”
“Wound-cleaning days follow their own choreography. Mother and surgeon stand on opposite sides of the line at the entrance to the operating room. Amylynne passes her son to Arca [Nic’s surgeon]. Nic often wears his Batman cape and costume for the occasion, and Arca, knowing Nic loves it, addresses him as Batman.”
It is never explicitly stated, but it seems clear that Nic’s own character was necessary for his survival. It might seem callous to try to depict a child near death in any other way, but he comes across as incredibly brave, and anecdotes of his attitude moved me to tears at times. I challenge the reader not to cry at some of them.
If Nic is the damsel in distress in this story, and the deadly mutation the antagonist, then the heroes are undoubtedly the medical staff. Gallagher and Johnson’s humanization applies no less magically to such figures as Eric Lander and George Church, something of godfather figures in the field; Howard Jacob, Lander’s one-time protégé at MIT, who founded the group at Wisconsin and ultimately carried out the sequencing himself; David Margolis, the transplant doctor who pushed the desperate team for a medically sound reason to operate; Alan Mayer, Nic’s primary doctor, who took up Margolis’ challenge and petitioned for a whole-genome sequence (WGS); Liz Worthey and David Dimmock, the computational biologist and pediatric geneticist who arguably cracked the case; James Verbsky, the immunologist who designed the test to prove the medical accuracy of the sequencing’s findings; and many more.
Nic Volker was not saved because of the advance of science, an impersonal and ethereal force that exists outside society. Nic Volker was saved by scientists. Brave, intelligent, and compassionate scientists, who did not simply apply their knowledge — indeed, they had no knowledge. Not of this; not of what was wrong with Nic; at least not to begin with. It is probably less accurate to say that the advance of science saved Nic Volker than to say that saving Nic Volker advanced science. The decision to sequence Nic’s genome is described in one of the book’s most thrilling passages,
“Near the end of June, Mayer takes a look at Nic’s ileum. He can see the disease taking root. Now the doctor cannot sleep at night. He knows this child is going to die. And despite all of his knowledge and skill, he will be unable to do more than watch.
In desperation, he goes to see David Margolis, one of the transplant doctors. Mayer is now reduced to begging Margolis to approve a bone marrow transplant. The response is the same: no diagnosis, no transplant. It would be irresponsible to go ahead without comprehending the nature of the problem.
Mayer understands his colleague’s position. He is asking Margolis to assume a huge risk. If the transplant fails, Margolis becomes the doctor who killed Nic.
At the same time, Mayer is thoroughly frustrated. There are no other options on the table. We’ve done every test that can be done, Mayer explains to Margolis. What will it take? What more do you want?
Neither doctor will recall who said it. The idea is proposed as more of a joke than a possibility.
“What are we supposed to do now, sequence his genome?” ”
What may have started as a joke has since begun to transform medicine, and it is worth considering exactly how. I contend that whole-genome sequencing is not an incremental advance; it is not a treatment or cure for a disease. It represents a fundamental shift in the conception of what medicine is, in at least two related ways: the technological breakthrough itself, and the philosophical approach to the discipline this technology requires.
WGS and the resulting idea of genomic medicine rely on computers. But this does not reduce to using a fancy machine to get medical information. Every medical device in history has achieved that feat, from a CAT scan right down to a stethoscope: doctors acquire information about the patient that their senses alone could not grant them. But the doctors understand this information. They can interpret the result. Nobody can interpret a genome. It is a 3-billion-letter sentence from a 4-letter alphabet that totally escapes human comprehension. Even ‘knowing’, whatever we mean by this, what the genome says is not enough, because it takes an enormous amount of computation, comparing differences across enormous sets of genomes, to tell us anything worthwhile. This is not using a machine to get information; it is using a machine to tell us what the information means.
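To make the scale of that computation concrete, here is a toy sketch in Python (the sequences, positions, and ‘benign’ catalog are all invented for illustration, and real pipelines are vastly more complex) of the core operation: comparing a patient’s sequence against a reference and discarding differences the population already tells us are harmless.

```python
# Toy sketch: find where a patient's sequence differs from the
# reference, then keep only the differences not already catalogued
# as harmless in the population.
reference = "ACGTACGTACGT"
patient   = "ACGAACGAACGT"          # substitutions at positions 3 and 7
known_benign = {3}                  # position 3 is a common, harmless variant

variants = [i for i, (r, p) in enumerate(zip(reference, patient)) if r != p]
novel = [i for i in variants if i not in known_benign]

print(variants)   # [3, 7]
print(novel)      # [7]  <- the candidate worth a doctor's attention
```

The real task swaps these twelve letters for 3 billion, and the one-entry catalog for databases of millions of known variants, which is exactly why interpretation has to be delegated to machines.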
The reason this is required — the enormous and incomprehensible variance of information — leads to another novel difference. A very crude model of medicine prior to the twenty-first century could be as follows: human beings are machines that are supposed to work a certain way. There are common ways in which we deviate from the norm of ‘health’, and for many of these, we can engineer the deviation away. Unfortunately for many others, we do not know how to do this. Medicine is a subfield of engineering, with the human body the sole object of study.
This approach could be thought of as awfully normative. There is an idea of ‘health’, deviations from which can hopefully be engineered back to the norm. But we are now in the very early stages of personalized medicine. People do not have generic deviations from the norm of health; they have their own deviations from their own norms. In one sense, this does not change the engineering principle we seek to apply: we still want to find engineering solutions to the deviations. But in another sense, the traditional philosophy has been upturned: while many illnesses are widespread and their symptoms understood, it is not going to be possible to catalog every possible illness because there is an incomprehensible number of possibly dangerous mutations, not to mention combinations of mutations and their interaction with the environment.
An excellent Forbes article covering the publication of the Wisconsin team’s results raises the puzzle in a practical setting. “The hurdles to getting DNA sequencing from the laboratory to the hospital are huge. Regulators don’t know how to think about these machines. Are they more akin to MRI and PET scans or diagnostic tests? What should insurance companies pay for? The challenges might be insurmountable, if it weren’t for patients like Nicholas Volker — and the doctors who are determined to treat them.” After this turning point, we will less and less be able to match an illness to a chapter in a book and follow the instructions to fix the deviation; many illnesses will not have a chapter because, like Nic’s, they are effectively personal. We need computers to find them, and we will likely need computers to tell us what to do about them.
This raises profound ethical questions, which Gallagher and Johnson capture well, not least because Nic’s family were the first ever to face them. His mother Amylynne and father Sean had their genomes sequenced too, and it turned out that Amylynne was the originator of the deadly mutation. However, the mutation sits on one of her X chromosomes, and the faulty protein it encodes is compensated for by her functioning second X chromosome. Her daughters may unknowingly carry this mutation, but the sons of any woman carrying it have a 50% chance of developing the horrible illness.
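The arithmetic behind that 50% figure is classic X-linked inheritance, which a toy Monte Carlo sketch can verify (the genotype encoding here is invented purely for illustration):

```python
import random

# Toy Mendelian simulation:
# 'x' = X chromosome carrying the mutation, 'X' = normal X, 'Y' = Y.
MOTHER = ('x', 'X')   # a carrier like Amylynne: one mutated X, one normal X
FATHER = ('X', 'Y')   # an unaffected father

def child(mother, father):
    """Each parent passes one chromosome at random; an X from the father
    makes a daughter, a Y makes a son."""
    return random.choice(mother), random.choice(father)

random.seed(0)
trials = 100_000
sons = daughters = affected_sons = carrier_daughters = 0
for _ in range(trials):
    m, f = child(MOTHER, FATHER)
    if f == 'Y':                      # son: a single X, no backup copy
        sons += 1
        affected_sons += (m == 'x')
    else:                             # daughter: second X masks the mutation
        daughters += 1
        carrier_daughters += (m == 'x')

print(f"affected sons:     {affected_sons / sons:.2f}")           # ≈ 0.50
print(f"carrier daughters: {carrier_daughters / daughters:.2f}")  # ≈ 0.50
```

With a carrier mother and an unaffected father, roughly half the sons inherit the mutated X with no second copy to mask it, while the daughters who inherit it become silent carriers like their mother.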
Does this mean that Amylynne should never really have had children? Does it mean her daughters should be tested to find out if they carry the mutation? More general and more disturbing questions can be asked before even knowing the results of a WGS: if the sequence revealed a crippling illness that only develops later in life and for which there is no cure, should the patient be told this? These questions are still very much open, and will only become more pertinent as personalized medicine enters the mainstream.
These questions quickly cease to be medical in nature and become questions of social and moral philosophy. Ironically, we might think, the abundance of computation and data has served, ultimately, to make science more private and intimate than ever.
As a quick thought experiment, could we use similar tools to travel in the opposite direction? Could we take topics thought of as totally private and intimate, hence intractable to traditional methods of scientific inquiry, and approach them anew with the heft of intense computation? Ogi Ogas and Sai Gaddam think that we can. Their chosen private and intimate topic? Sexual Desire.
A Billion Wicked Thoughts has a very simple premise: it is impossible to do any kind of meaningful social study of sexual desire because it relies on information in people’s own minds, and they don’t want to talk about it. Or rather, it used to be impossible, but now we have the Internet. The reader will be forgiven for instinctively responding along the following lines: oh great. Porn. What an amazing advance for science!
Well, arguably yes, a great advance indeed. Not just porn as we might think of it traditionally, but the entire and unconstrained breadth of what people all over the world find to be erotic. All people — not just those who are in a position to volunteer for an academic research project — and also, given the solitary anonymity, what they really like, not what they tell some potentially pervy postgrad for course credit. Ogas and Gaddam explain by extrapolating from what they amusingly argue to have been the best pre-Internet attempt:
“In Gergen’s experiment, five young men and five young women entered a small room one at a time. They did not know one another before the experiment, and they were kept isolated before they entered the room. Once they entered, they were free to do whatever they liked. At the end of the experiment, the subjects left the room one at a time. But what made the experiment so interesting was the room itself. It was pitch-dark.
The subjects couldn’t see one another, they didn’t know one another, and they knew they would not learn one another’s identities after the experiment. In other words, they experienced complete and total anonymity. So what did these strangers do? At first they talked, but conversation soon slacked off. Then the touching began. Almost 90 percent of subjects touched someone else on purpose. More than half of the subjects hugged someone. A third of the subjects ended up kissing. One young man kissed five different girls. “As I was sitting Beth came up and we started to play touchy face and touchy body and we started to neck. We expressed it as showing love to each other. We decided to pass our love on and share it with other people. So we split up and Laurie took her place.” Hidden by anonymity, the participants freely expressed their desires. One man even offered to pay Gergen to be let back into the room. Almost 80 percent of the men and women reported feeling sexual excitement.
The Internet is like a much, much, much larger version of the Gergen experiment. Put a billion anonymous people in a virtually darkened room. See what they do when their desires are unleashed.”
What Ogas and Gaddam find is fascinating, and it is difficult to constrain oneself in choosing extracts to analyze. I am minded, moreover, not to ruin the best passages by robbing the reader of the patient buildup. Capturing the tone may do a better job than isolating the best findings or arguments. The following is typical in arguing against a kind of blank-slate socialization of sexual desire,
“For example, imagine a culture in which every prepubescent boy is encouraged to perform fellatio on an older teenager several times a week for three or four years, as part of a ritualistic initiation into adulthood. If social inputs determine whether the male brain finds men or women to be sexually attractive, then we might expect this would result in a society dominated by adult homosexuality, or at least bisexuality.
In fact, a society with such practices actually exists: the Sambia. These Papua New Guinea people are jungle horticulturalists who live in mountain hamlets. The Sambia believe that semen is the essence of manhood (sort of like Austin Powers’s mojo) and all Sambian boys must ingest quite a bit of it to become strong, masculine men. When the boys hit puberty and start to develop a manly physique, their elders say, “See? It’s working!” Now the adolescent boys get fellated by a new crop of prepubescent boys.
So what is the rate of homosexuality among adult Sambian men? Roughly 5 percent, about the same level of homosexuality found in Western societies. By the time a Sambian man reaches his twenties, he usually marries a Sambian woman. “They have pleasant memories of their youth,” reports the anthropologist Gilbert Herdt, who lived among the Sambia. “But their real lust is for women.” ”
Or this elaboration on the differences in the typical sexual cues for men and women,
“Though many women feel betrayed when their partners watch porn, they rarely feel that they are betraying their husbands by reading romance. In fact, in Janice Radway’s Reading the Romance, the women in a romance book discussion group insisted that reading romance improved their sex lives with their husbands. Even though the male brain can understand that a woman may be lost in an intimate fantasy world where she is emotionally and erotically connecting with a fictional male, as long as there are no real penises involved, men don’t tend to get jealous of a woman’s reading habits. However, men can get annoyed. “I think every girlfriend I’ve had has turned to me after watching a romantic movie and asked ‘Why can’t you be more like him?’” laments one young man. “I’ve never put on a Jenna Jameson movie and asked ‘Why can’t you be more like her?’” ”
Or splitting hairs on the differences in desire between gay and straight men,
“In many ways, gay men seem to live a sexual life that straight men can only fantasize about. If you’re heterosexual and male, imagine that at any moment, day or night, you could press a button on your iPhone to see an array of attractive women you’ve never met before. Then, you could pick the one that struck your fancy, and within five minutes, get together with her. Two out of three times, she will just want to perform fellatio on you, then leave. She’s delighted to do it, and it doesn’t cost you a penny. This may sound like an impossible fantasy, but this experience is actually available to gay men right now, through Grindr.
Grindr is an iPhone application that displays photos and profile information of all other men using Grindr who are within three thousand feet, making it easy to get in touch with potential sex partners. “I’ve met up with a guy in the backseat of his Lexus during my lunch hour,” says one thirty-one-year-old Grindr user who works in the Boston Financial District. Grindr is a technological innovation that facilitates the casual, anonymous sex that has long been the fantasy of straight men.
Of course, gay men — like straight men — also pursue a ‘mixed mating strategy’. Gay men look for romance and long-term monogamous relationships. Gay men get their hearts broken, get married, and raise children. Like straight men, gay men cheat on their partners and get enraged when their partners cheat on them. In Richard Lippa’s BBC survey, he found that the preferred traits in a partner clustered together based upon the gender of the subject — not by nationality and not by sexual orientation. In almost every way, the brain software of gay men appears to be identical to that of straight men.
But as we’ve seen, there are at least two crucial differences: gay men prefer masculinity, and many (but not all) gay men prefer the submissive role in sex. Both of these preferences are standard in the female brain. Something apparently causes two specific parts of the male desire software to ‘flip’ to female settings, while keeping all of the other male settings. There’s also a physical difference between gay men and straight men: gay men have longer penises.”
And finally, about nothing in particular, but because I couldn’t resist,
“Unfortunately, there is no research on the formation of uncued interests in humans. It’s difficult to imagine any ethics review board would approve an experiment that required adolescent men to masturbate while probing their testicles with paddles in the hope of instilling a permanent erotic fixation.”
A Billion Wicked Thoughts spans a range of unnerving humor and profound compassion very similar to that of One in a Billion. At a high enough level of abstraction, the motivation is the same: forget what you think you know, because it clearly isn’t helping — we need to look at the data. The range of off-the-cuff quotes for the provision of cultural context is as amusing as it is helpful, but I couldn’t help but notice the above-norm usage of comedians. Without meaning to scientifically explain humor for my readers (I’m still refining that theory, so maybe a later essay), I think this is because the job of a comedian is to make us laugh at everyday absurdities by placing them in unusual contexts, and hence sneakily in the cold light of rationality.
For all but the most sociopathic or (literally) uncivilized readers, I suspect the inevitable self-reflection the book brings about will be at first uncomfortable, but then probably also quite funny, for exactly this reason. Heck, this essay may even have triggered that reaction already (did I mention this is NSFW? Look, I just did). Looking at the data shows that actually we know nothing, and in fact, across the world, are almost certainly socialized to know nothing. And this effect is far greater than in clinical medicine (though the stakes there are admittedly higher), where presumably very few people, if any, have an innate bias against finding the truth.
The authors are careful never to mock nor judge, but merely to report. The introduction contains a bold and clear pronouncement that in discussing averages of enormous populations, there will be enormous variety within the group and the information cannot be used to say anything about any individual. This attitude is held to throughout and leads to a kind of humanistic plea in the book’s conclusion,
“The greatest hurdle to sexual harmony is ignorance of the fact that members of the other sex (and other sexual orientations) are fundamentally different from ourselves. We all instinctively feel that other people must be just like us. “It just seems so natural to like men,” insisted one thirty-year-old gay man when asked why he liked gay porn that featured straight men. “To be completely honest, I guess I believe that all guys must feel the same attraction to men that I do, but straight guys just repress these feelings. So when I see a straight guy having sex with other men, it feels like validation. It’s like — see, he’s just like me after all.”
Similarly, many straight men believe that, deep down, all women secretly yearn for casual, no-strings-attached sex with strangers. Many straight women believe men have been socialized to be aggressive and promiscuous — but hide a secret emotional life that, with the proper attention, will blossom into tenderness and monogamy. It’s hard for us to accept that other people’s most intimate desires are different from our own — and when confronted with this fact, we often dismiss their desires as deviant or dangerous or just plain hurtful. When literary scholar Janice Radway asked the women in a romance discussion group about male sexuality, the women reported that they did not want to adopt male standards; they wished that men would learn to adhere to theirs. Doubtless, most men feel the same way. By identifying and understanding one another’s sexual cues, we can develop greater comfort, confidence, and compassion; only then will we have an authentic opportunity to truly connect.
Some might argue that not all of our sexual cues should be indulged — that some should be ignored or repressed. Science can’t offer any moral prescription about which cues should be judged acceptable and unacceptable; but science does tell us that it’s difficult or impossible to modify men’s rigid cues, and even though women’s tastes are more plastic, it’s simply not possible to shut off the sleuthing of Miss Marple or her detectives. It’s also worth remembering that at various points in the twentieth century, the medical profession and mainstream society were in perfect agreement that certain sexual activities were unacceptable, including masturbation, oral sex, anal sex, adolescent make-out sessions, homosexuality, and interracial sex.”
The metaphor of software is used repeatedly in both A Billion Wicked Thoughts and One in a Billion. A naïve interpretation of this coincidence might infer a disturbing determinism creeping into our general understanding of human behavior. But I believe this would be a misunderstanding based on an oversimplification that the choice of metaphor unfortunately invites: while software is entirely deterministic, medical health and sexual desire are more subtle. It is true that the progress of human knowledge is revealing more and more physically causal mechanisms. This is incredibly useful in cases where we can pinpoint the cause of a defect, but these cases should not be extrapolated. It does not mean that every defect actually has a cause of only this or that nature. In many cases, what we consider a defect will not have a cause at all, of any nature.
But a deeper potential misunderstanding, that a reliance on the formalism of determinism and the metaphor of software invites, is to fail to distinguish between what can be known about individuals and what can be known about a population, and how they relate. A Billion Wicked Thoughts focuses on populations, and, where helpful or relevant, gives some indication of which proportions of the population seem to fit into what sub-categories. One in a Billion is of course only about Nic Volker. Knowledge of the population statistics was necessary for his diagnosis — the baseline from the Human Genome Project to identify mutations — but not sufficient. The HGP tells us nothing at all about Nic. We need the permission of the relevant parties (Nic’s mother in this case) to connect the two sources of information.
Medical health and sexual desire are amongst the most intimate facets of an individual’s being. It is therefore reassuring that a great deal can be said about a population that will be of interest or even of benefit to a randomly chosen individual. But of a specifically chosen individual, practically nothing whatsoever can be inferred from the population alone, without that individual’s consent for personalized data. This understanding strongly resists stereotyping; moreover, this resistance is rooted in a mathematical formalization that proves the irrationality of stereotyping. The foundational entity of this formalization is not of the nature of a point, a number, a set, etc., but is a random variable. It is precisely not deterministic. I will return to this idea shortly.
The books are alike in another, more socially oriented sense. Both detail the reactionary resistance in the respective scientific communities to the novel method developed. A caricature of the most extreme criticism leveled at both would be, this is not science at all! This is guessing!
Doctors made some wild guesses about something never encountered by a thinking person before, never mind subjected to the scientific method, Nic Volker got a bone marrow transplant, and hurray! He’s not dead! Two neuroscientists sifted through anonymous and uncontrolled-for web searches to infer the truth of several dubious theories of sexual desire they had probably already decided they liked the sound of. The New York Times review with which I opened this post contains the following two sentences which amount to as good an appraisal as any, “There is no sense in scrutinizing for consistency a farrago like this book. Its value lies elsewhere — not as a scientific tract, but as a cultural document.” Science is about establishing causal mechanisms in nature, not making probabilistic stabs at billion-dimensional datasets. These approaches are emphatically not scientific.
Or are they?
I chuckled when I conceived of the title of this essay, but the overlap of ‘a billion’ is no coincidence at all. Given the limitations already outlined, how else are these topics supposed to be studied ‘scientifically’ if not in exactly the ways they were — by accepting the reality of a billion variables and trying to make sensible inferences? Consider the deductive alternative in each case:
- Consider every possible genetic variation in turn and use our knowledge of biochemistry to deduce its effects. Keep going until we get what Nic Volker has.
- I don’t even know how to articulate the second option in a way that it would be clear what I am talking about. Establish by rigorous experimentation exactly how the human brain works, and then read the part about sexual desire?
But even this phrasing suggests that a superior ‘deductive’ alternative exists elsewhere in science, presumably championed and regularly practiced by the critics. While I wouldn’t go as far as calling A Billion Wicked Thoughts ‘science’, for the reasons outlined in the NYT, this ideal of pure deduction is far from the reality.
Science has always had a problem justifying its philosophical reliance on induction: induction doesn’t follow deductively from anything, and that seems troubling. The problem remained entirely open until Popper, working amid the debates of the Vienna Circle, consolidated various insights into what we now refer to as falsificationism.
Although Popper’s intellectual achievement was significant, I would suggest that there is an unaddressed axiom at the heart of the discomfort that Popper only half-resolved: that deduction is intellectually superior. But is it? Is it necessarily a problem that induction cannot be deduced? Why isn’t it a problem that deduction can’t be induced? An answer typical of freshman philosophy might suggest that deduction is intellectually sound whereas induction is plagued with the vagaries of faulty observations of reality. Popper gave us an intellectual painkiller to deal with this and get back to work. Deduction is clean, while induction is dirty. Deduction is precise; induction is messy.
But what does this really mean? I have so far presented it as little more than a question of taste. We might say more simply that deductive arguments model proper processes of thought and hence are indisputably sound, whereas inductive ones are fuzzy and their implications are unclear. It would seem to follow fairly naturally that deductive arguments ought to be more useful. You can’t go wrong! How wonderful!
But is that actually the case? Consider the following two strands of argument:
- Tolstoy was a genius
- Tolstoy can only truly be appreciated by geniuses
- No genius is without some eccentricity
- Tolstoy sang the blues
- Every eccentric blues singer is appreciated by some half-wit
- Eccentrics think they own the road
- There is always some half-wit who thinks he owns the road.
This chain of deductive reasoning is perfectly precise, perfectly clean, and indisputably sound. It is also barely comprehensible and mostly useless nonsense. An interesting conclusion to reach regarding a process we just declared to ‘model proper processes of thought’ …
Here is another:
- Watson phones Holmes in his office and states the burglar alarm in Holmes’s house is going off. Holmes prepares to rush home.
- Holmes recalls Watson is known to be a practical joker hence doubts his statement.
- Holmes phones Mrs Gibbon, another neighbor. She is tipsy and rants about crime, making Holmes think she has heard the alarm.
- Holmes remembers the alarm manual said it might have been triggered by an earthquake.
- Holmes realizes that if there had been an earthquake, it ought to be mentioned on the radio.
- Holmes turns on his radio to check.
This chain of reasoning seems to make perfect sense, and furthermore seems to ‘model proper processes of thought’ very accurately. And yet there is not an ounce of valid reasoning to be seen! It’s one random decision after another to no obvious end! What a fool Holmes must be!
This comparison is taken from The Dawning of the Age of Stochasticity, a wonderful paper by David Mumford, based on a talk he gave at the Mathematics Towards The Third Millennium Conference in Rome, in 1999. Mumford, a renowned Fields Medal-winning mathematician, argues that logic has been given undue prominence in the foundational understanding of mathematics and that in many ways and to many ends probabilistic approaches are in fact superior. He elaborates in the introduction that,
“all mathematics arises by abstracting some aspect of our experience and that, alongside the mathematics which arises from objects and their motions in the material world, formal logic arose, in the work of Aristotle, from observing thought itself. However, there can be other ways of abstracting the nature of our thinking process and one of these leads to probability and statistics.”
I strongly encourage the reader to have a look (link here) — although large portions will be inaccessible to readers without formal training in mathematics, several sections are in plain English, and are clearly and thoughtfully written. Section 2 expands on the quote above, giving more insight into how the various branches of mathematics emerge from the abstraction of basic human experiences. Section 3 compares the historical development of logic and statistics, arguing that the fact that statistics developed its depth more sporadically and largely later than logic by no means implies that it should be considered as ‘less foundational’, as it certainly seems to have been.
After a jaunt through some pretty hardcore math, we reach Section 7, where it really gets juicy. Mumford returns to “the modeling of thought as a computational process” and provides the comparison of deductive and inductive reasoning I quoted above (he calls them ‘logical’ and ‘statistical’ for reasons that should now be obvious given the context). He laments that while logic offers a system for formalizing deduction, “induction has been much harder to understand from a logical perspective”. You pretty much have to say something like that when discussing this stuff …
But he is exceptionally optimistic. He gives a brief explanation of how the ‘inductive’ reasoning in the example is actually perfectly formalizable as what is called ‘Bayesian inference’, and of how Bayesian inference seems far superior to any ‘logical’ approach at capturing the concept of ‘learning’ as opposed to ‘reasoning’, clearly a core component of modeling thought. He then extends the formalization to a discussion of the (then) state of the art in AI.
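To see what that formalization looks like in practice, here is a minimal sketch of the Holmes story as Bayesian updating. The network structure follows the example, but every probability below is my own invention for illustration:

```python
import itertools

# A toy Bayesian network for the Holmes story. Every number below is
# invented for illustration; only the structure follows the example.
P_BURGLARY = 0.001            # prior probability of a burglary tonight
P_EARTHQUAKE = 0.002          # prior probability of an earthquake tonight

def p_alarm(burglary, earthquake):
    """Chance the alarm sounds, given its two possible causes."""
    if burglary and earthquake:
        return 0.95
    if burglary:
        return 0.90
    if earthquake:
        return 0.30
    return 0.01

P_CALL_GIVEN_ALARM = 0.80     # Watson phones if the alarm sounds...
P_CALL_GIVEN_QUIET = 0.10     # ...but he is a known practical joker

def posterior_burglary(earthquake_known=None):
    """P(burglary | Watson's call), optionally also given the radio's verdict."""
    num = den = 0.0
    # Enumerate the joint distribution over (burglary, earthquake, alarm).
    for b, e, a in itertools.product([True, False], repeat=3):
        if earthquake_known is not None and e != earthquake_known:
            continue  # condition on what the radio told us
        p = P_BURGLARY if b else 1 - P_BURGLARY
        p *= P_EARTHQUAKE if e else 1 - P_EARTHQUAKE
        p *= p_alarm(b, e) if a else 1 - p_alarm(b, e)
        p *= P_CALL_GIVEN_ALARM if a else P_CALL_GIVEN_QUIET  # Watson did call
        den += p
        if b:
            num += p
    return num / den

print(posterior_burglary())                       # belief in a burglary after the call
print(posterior_burglary(earthquake_known=True))  # ...after the radio confirms a quake
```

Run it and the burglary posterior *drops* once the radio confirms an earthquake: the quake ‘explains away’ the alarm. That is precisely the step Holmes performs by instinct, and one that classical logic has no vocabulary for at all.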
(If you want to understand the difference between Bayesian statistics, which has been all the rage in computer science for 30 years and happens to make sense, and frequentist statistics, which has been all the rage in every other science for 100 years and happens to be idiotic, the always wonderful xkcd comic below explains pretty much everything.)
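For the record, the arithmetic behind the comic’s punchline takes about five lines. The detector lies only when its two dice both come up six; the prior on the sun having exploded is my own, deliberately generous, guess:

```python
# The detector lies only when two dice both come up six: probability 1/36.
# It has just announced that the sun exploded.
p_lie = 1 / 36
prior_exploded = 1e-9   # a deliberately generous prior that the sun just blew up

# Bayes: P(exploded | detector says so)
posterior = (prior_exploded * (1 - p_lie)) / (
    prior_exploded * (1 - p_lie) + (1 - prior_exploded) * p_lie
)
print(posterior)  # still vanishingly small: take the $50 bet

# The frequentist move: p-value = P(detector says so | sun is fine) = 1/36 < 0.05,
# therefore "reject the null hypothesis that the sun has not exploded".
print(p_lie < 0.05)  # True
```

The evidence is thirty-five times more likely under “exploded” than under “fine”, so the posterior does rise, from one in a billion to a few in a hundred million. The Bayesian bets accordingly; the frequentist publishes.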
Anyway, why is Mumford so intrigued by ‘modeling thought’? Interestingly for us, why he cared twenty years ago need have no bearing on why we ought to care today. As alluded to, he was intrigued because he believed that all mathematics derives from one or another process of ‘modeling thought’. If thought is better modeled by statistics than by logic, then sorry everybody, but statistics is now primal in the foundations of mathematics, which makes it universally important for applied mathematics, and therefore physics, and therefore science, and therefore basically everything we actually know about the world. I suppose the Set Theorists can always try to get jobs in the Philosophy department.
But today we care because, to grossly oversimplify for dramatic effect, ‘modeling thought’ is powering probably the most significant advances in AI in that subject’s entire history. Or, to jazz it up even further, it’s making the computers friggin awesome!
I should add as well that although Mumford was originally known and successful as an algebraic geometer, he has since become a world expert on neural nets, so it’s not like he was clueless about all of this and just stumbled into some deep truth that we have only later come to appreciate! Towards the end, he very modestly cautions that,
“all too often, various schools studying the problem of modeling thought have announced that they had the key and that the full solution of reproducing intelligent behaviour was just a matter of a few more years of research! As all these pronouncements in the past have flopped, I refrain from making any claims now except to say that the ideas just sketched seem to me on the right track”
And yet, if I could pick one line from the essay whose eventual significance I truly believe Mumford did not appreciate at the time of writing, it would be the following. In giving examples of ‘random variables’ in Section 4, he says:
“a doctor’s diagnosis can be viewed as a random sample from his posterior probability distribution on the state of your body, given the combinations of a) his personal experience, b) his knowledge from books, papers and other doctors, c) your case history, and d) your test results.”
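That sentence translates almost line-for-line into code. In this toy sketch the conditions, priors, and likelihoods are all invented, and the diagnosis is literally drawn as a random sample from the posterior rather than taken as the single best guess:

```python
import random

# A toy version of Mumford's doctor. The conditions and all numbers are
# invented; only the shape of the computation matters.
conditions = ["flu", "strep", "mono"]
prior = {"flu": 0.70, "strep": 0.25, "mono": 0.05}  # (a) + (b): experience, book knowledge
likelihood = {                                       # (c) + (d): P(evidence | condition),
    "flu":   0.20,                                   # i.e. how well this patient's history
    "strep": 0.60,                                   # and test results fit each condition
    "mono":  0.90,
}

# Bayes' rule: posterior ∝ prior × likelihood, then normalize.
unnorm = {c: prior[c] * likelihood[c] for c in conditions}
z = sum(unnorm.values())
posterior = {c: p / z for c, p in unnorm.items()}

# The diagnosis as "a random sample from his posterior probability distribution".
diagnosis = random.choices(conditions, weights=[posterior[c] for c in conditions])[0]
print(posterior)
print(diagnosis)
```

Notice how the evidence overturns the prior: flu starts as the overwhelming favourite, but the test results drag strep past it.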
Sounds pretty sensible. But can we square it with Nic Volker and Alan Mayer? Mayer certainly had a) and b), but they were of absolutely no use; in the early stages they were arguably a harmful distraction. c) and d) helped, but it was totally unclear how they would help until the life-or-death moment. They were certainly not used in this case the way they normally are. So was Mumford wrong? Should he stick to intellectual history and stay the hell away from patients?
Absolutely not: he was eerily correct. We must simply remember why this case was so unusual: it was the first time anything like it had ever been tried. It was remarkable that Nic Volker’s life was saved, but the lengths gone to are not a scalable solution for every patient.
An enormous amount of money and professional input went into Nic Volker’s case, far more than could feasibly be applied to every (potentially?) genetic illness. This was due not to the novelty or the urgency of the case (although these certainly helped justify the contributions) but to the requirement to draw on an incredibly diverse pool of knowledge. No single doctor had it all, and it is entirely unreasonable to expect that any single doctor ever will. As much as possible will need to be automated. To refer to Mumford’s list of inputs to the Bayesian analysis of the random variable of a patient’s health, no single doctor can possibly have enough personal experience to combine the other factors in a useful way. ‘Individual knowledge’ will become a kind of non sequitur in light of the gargantuan dataset comprising ‘your case history’ and ‘your test results’. Knowledge of how to gather the data and what to do with it, perhaps, and individual creativity and intuition, for sure, but not knowledge of the data itself. That task is now almost solely the responsibility of ever-advancing computers.
I may be looking for nerdy romance where it doesn’t exist, but I think there is a kind of poignancy to the fact that Mumford is being proved right almost entirely thanks to these advances in computing power. Computing as we know it rests on Claude Shannon completing the most important master’s thesis of all time, one whose essence fits on a napkin in a single sentence: classical logic can be done on electrical circuits. Computer Science grew out of a tiny corner of mathematical logic, and computers are living proof of logic’s incredible power.
And now they are being used to push the theoretical primacy of logic aside. And with logic will go the traditional role of the doctor, the study of sexual desire, and goodness knows how much else. Mumford himself barely holds back in this prediction, concluding his essay with its single boldest passage, penned, I emphatically repeat, in 1999:
“Probability and statistics will come to be viewed as the natural tools to use in mathematical as well as scientific modeling. The intellectual world as a whole will come to view logic as a beautiful elegant idealization but to view statistics as the standard way in which we reason and think.”
From seeking wicked thoughts to think, to reading the code of life to save a little boy, Mumford seems to have been absolutely correct.
It was quite the inference.
follow me on Twitter @allenf32