That is Edge's annual question for this year. Here is my sister Azra's response:
An obvious truth that is either being ignored or going unaddressed in cancer research is that mouse models do not mimic human disease well and are essentially worthless for drug development. We cured acute leukemia in mice in 1977 with drugs that we are still using today, in exactly the same dose and duration, in humans, with dreadful results. Imagine the artificiality of taking human tumor cells, growing them in lab dishes, transferring them to mice whose immune systems have been compromised so they cannot reject the implanted tumors, and then exposing these “xenografts” to drugs whose killing efficiency and toxicity profiles will then be applied to treat human cancers. The inherent pitfalls of such an entirely synthesized, non-natural model system have also plagued other disciplines.
A recent scientific paper showed that all 150 drugs tested at the cost of billions of dollars in human trials of sepsis failed because the drugs had been developed using mice. Unfortunately, what looks like sepsis in mice turned out to be very different from what sepsis is in humans. Coverage of this study by Gina Kolata in the New York Times incited a heated response from within the biomedical research community: “There is no basis for leveraging a niche piece of research to imply that mice are useless models for all human diseases.” They concluded by saying that, “The key is to construct the appropriate mouse models and design the experimental conditions that mirror the human situation.”
The problem is that there are no appropriate mouse models that can mimic the human situation. So why does the cancer research community continue to be dominated by the dysfunctional tradition of employing mouse models to test hypotheses for the development of new drugs?
I was also asked to participate, but my response didn't make the final cut. Oh, well. In any case, here it is if you want to read it:
The Current High School Science Curriculum
For decades, during their four years in high school, almost all Americans have taken at least a year-long course in each of the following subjects: biology, chemistry, and physics, in addition to several years of mathematics. Yet surveys repeatedly show dismaying levels of innumeracy and scientific illiteracy in American adults, as well as a shocking and depressing prevalence of anti-scientific beliefs in rubbish ranging from crystal healing to astrology to homeopathy to anti-vaccination skullduggery to young-Earth tomfoolery to mind-boggling conspiracy theories of every sort. Why?
The current science curriculum emphasizes learning facts about science at the expense of learning a scientific attitude toward the world. While it is admittedly essential to know things like the basic structure of atoms, how sodium metal and chlorine gas can combine to form common table salt, or how a human red blood cell transports oxygen from our lungs to the many tissues all over our bodies that need it, many of the scientific facts learned in high school are soon forgotten, especially by those who do not go on to study more science in college. In other words, what students learn in high school science classes ends up not being of much practical benefit to many, if not most, of them in their later lives.
What needs to be stressed in addition to facts is the major aspect of science which can be thought of as a struggle to overcome our innate tendencies toward false views of the world.
(Of course, there are often good evolutionary reasons for these tendencies, but they do not always serve us well in the modern world. As Steven Pinker once pointed out, our innate fear of snakes is not very useful in the environments most of us inhabit now; it would be much better for us to have an innate fear of not wearing seat belts!) Here is one obvious example: all of us tend to generalize from too little data, because it is in our nature to seek patterns, and to do so quickly, in real time. I am no exception, and I often catch myself doing something like bad-mouthing an airline to a friend because I flew on it twice and had a bad experience both times (which could easily be a coincidence), or deciding that the people of country X are rude and unfriendly based on a handful of unfortunate hostile encounters during a four-day visit to X. This inclination is so strong that I have even met doctors who will recommend a remedy for the common cold, never tested in any controlled manner, that their grandmother bequeathed to them and that they tried and now swear by, on the strength of their own anecdotal (and completely spurious) evidence. And so it is really very unfortunate that it is possible today to go through three years of high school science without knowing what a double-blind controlled trial is, without any understanding of basic statistical concepts, and without any idea of why these things are needed.
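The airline example is really just arithmetic: if some fixed fraction of flights is unpleasant, then two unpleasant flights in a row is exactly what coincidence predicts for a fair number of travelers. Here is a minimal sketch in Python, where the 10% "bad flight" rate is an assumption purely for illustration:

```python
import random

# Assumption for illustration only: 10% of flights on any airline are "bad",
# independently of which airline it is. How many travelers who try an
# airline exactly twice would see two bad flights purely by chance?
random.seed(1)

BAD_RATE = 0.10
travelers = 100_000

two_bad = sum(
    random.random() < BAD_RATE and random.random() < BAD_RATE
    for _ in range(travelers)
)

# The expected fraction is 0.1 * 0.1 = 0.01, i.e. about one traveler in a
# hundred condemns the airline on "evidence" that is pure coincidence.
print(f"Fraction with 2/2 bad flights: {two_bad / travelers:.3f}")
```

One traveler in a hundred is a lot of people telling anecdotes; the point is that the anecdote carries almost no information about the airline.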
What is required, in my opinion, is at least a two-year course in a subject we might call Applied Rationality. And we need this at the high school level because most people do not go on to college, so this is the last chance we as a society have to equip the majority of our citizens with conceptual tools they can use to their benefit for the rest of their lives. So what should the curriculum for such a course include? We can start by trying to combat some of the known frailties of the human mind. For example, we have notoriously bad instincts and intuitions when it comes to probabilities. This means that we tend to behave irrationally when faced with uncertainty, and we face uncertainty every day. Even imparting a basic understanding of probability and statistics would go a long way toward reducing illogic of the “this roulette wheel has come up black five times in a row, surely it is red's turn this time” variety and, among many other benefits, might reduce the morally degenerate tax on the poor and innumerate known as lotteries and gambling.
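The roulette fallacy is easy to check empirically: on a fair wheel, spins are independent, so the chance of red is the same whether or not the previous five spins came up black. A quick simulation, assuming a standard European wheel with 18 red, 18 black, and 1 green pocket:

```python
import random

POCKETS = ["red"] * 18 + ["black"] * 18 + ["green"]  # European wheel

def spin():
    """One independent spin of a fair wheel."""
    return random.choice(POCKETS)

def red_after_black_streak(trials=200_000, streak=5):
    """Compare P(red) overall with P(red | previous `streak` spins were black)."""
    spins = [spin() for _ in range(trials)]
    overall = sum(s == "red" for s in spins) / trials
    after_streak = [
        spins[i]
        for i in range(streak, trials)
        if all(s == "black" for s in spins[i - streak:i])
    ]
    conditional = sum(s == "red" for s in after_streak) / len(after_streak)
    return overall, conditional

random.seed(0)
overall, conditional = red_after_black_streak()
# Both estimates hover around 18/37 ≈ 0.486; the five blacks change nothing.
print(f"P(red) overall:        {overall:.3f}")
print(f"P(red) after 5 blacks: {conditional:.3f}")
```

The wheel has no memory; the only thing the streak changes is the gambler's confidence.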
Similarly, in the last five decades psychologists have learned an immense amount about other systematic weaknesses in the human cognitive apparatus. For example, the tendency to give extra weight to evidence that supports an already-held belief and to dismiss evidence against it is a universal feature of human psychology known as confirmation bias. Scores of such irrational biases have been identified, and they should be taught in the course I am proposing, along with practical methods of guarding against falling prey to them.
Another approach could be to identify and rank the most harmful anti-scientific beliefs (the prevalence of a belief in society weighted by the potential harm it causes) and then work backwards to see what can be taught to help overcome them.
We should perhaps even include a section on not giving too much weight to supposedly scientific studies in fields where such studies regularly yield contradictory advice, such as nutrition. We should also point students to conservative sources of information on such subjects, along with practical advice on how to, for example, avoid fatuous dietary trends and, in general, sift good information from bad. The details of such a course in Applied Rationality would obviously need to be worked out carefully for maximum benefit.
In any case, the current science curriculum in high school is clearly not working. It is time to try something new, making use of all the knowledge we have acquired in the past half-century about why people believe stupid things.