Breaking Barriers: On “Hidden Figures” by Margot Lee Shetterly

by Jonathan Kujawa

"Reduce your household duties! Women who are not afraid to roll up their sleeves and do jobs previously filled by men should call the Langley Memorial Aeronautical Laboratory." In 1935 the National Advisory Committee for Aeronautics (the arm of the US government dedicated to research and development in the new-fangled area of human flight) hired its first cohort of women computers. This was before calculations could be done effectively by machines. If you wanted equations solved and numbers crunched, you needed a person who was quick with numbers and deadly accurate. With a talent shortage, and with some reluctance, those in charge admitted that women might be up to the task. When the first women arrived, the male engineers were no doubt reassured by the fact that the women would only have to calculate whatever they were given and wouldn't have to worry their pretty little heads with the actual problem solving and thinking. The women more than held their own.

With the onset of World War II the Allies needed every possible advantage. It was clear that winning in the air was key to winning the war. Better, faster, more maneuverable planes were needed without delay. The NACA grew at an exponential rate and needed every clever person it could get its hands on. Word spread and soon black women were also applying for these positions. No surprise, since jobs at the NACA paid at least twice the salary of a school teacher, the next best option for well-educated black women.

Read more »

The Hit Aesthetic

by Misha Lepetic

"Wonder was the grace of the country."
~
George W.S. Trow

At a recent cocktail party, the conversation turned to conspiracy theorists and how to engage them. I offered a strategy that has served me fairly well in the past: I like to ask my interlocutor what information they would need to be exposed to in order to change their mind about their initial suspicion. To be clear, I think of this more as a litmus test for understanding whether a person has the capacity to change their mind on a given position, rather than an opening gambit leading to further argument and persuasion. Climate change is a good example: What fact or observation might lead a person to consider that global warming is happening, and that human economic activity is responsible for it? It is actually quite surprising how often people don't really have a standard of truth by which they might independently weigh the validity of their argument. Of course, in today's 'post-truth' world, I suspect that it is just as likely that I might be told that nothing can change a person's mind, since everything is lies and propaganda anyway.

I was pleased that another person at the party made an even better suggestion. She said that she would ask not only what would change a conspiracy theorist's mind, but from whom they would need to hear it. This vaults the act of interrogation from a context grounded purely in individualism and individuals' appeals to authority, to something distinctly more social. It also specifies the importance of not just facts, but from where those facts emanate. Because as much as we would like to believe ourselves independently reasoning beings, that we come to our conclusions through a rigorous and sacrosanct process of discernment, we are still very subject to having our opinions shaped by others. This may seem somewhat obvious, but in these times, when new ways of sensemaking are in high demand, I believe this provides an important opening.

Read more »

Crowdfunding Science and Tribefunding Science

by Jalees Rehman

Competition for government research grants to fund scientific research remains fierce in the United States. The budget of the National Institutes of Health (NIH), which constitutes the major source of funding for US biological and medical research, has increased only modestly during the past decade and is not even keeping up with inflation. This problem is compounded by the fact that more scientists are applying for grants now than one or two decades ago, forcing the NIH to impose strict cut-offs and only fund the top 10-20% of all submitted research proposals. Such competition ought to be good for the field because it could theoretically improve the quality of science. Unfortunately, it is nearly impossible to discern differences between excellent research grants. For example, if an institute of the NIH has a cut-off at the 13th percentile, then a grant proposal judged to be in the top 10% would receive funding but a proposal in the top 15% would end up not being funded. In an era when universities are also scaling back their financial support for research, an unfunded proposal could ultimately lead to the closure of a research laboratory and the dismissal of several members of a research team. Since the prospective assessment of a research proposal's scientific merits is somewhat subjective, it is quite possible that the budget constraints are creating cemeteries of brilliant ideas and concepts, a world of scientific what-ifs that are forever lost.

How do we scientists deal with these scenarios? Some of us keep soldiering on, writing one grant after the other. Others change and broaden the direction of their research, hoping that research proposals in other areas are more likely to receive the elusive scores that qualify for funding. Yet another approach is to submit research proposals to philanthropic foundations or non-profit organizations, but most of these organizations tend to focus on research which directly impacts human health. Receiving a foundation grant to study the fundamental mechanisms by which the internal clocks of plants coordinate with external timing cues such as sunlight, food and temperature, for example, would be quite challenging. One alternate source of research funding that is now emerging is "scientific crowdfunding", in which scientists use web platforms to present their proposed research project to the public and thus attract donations from a large number of supporters. The basic underlying idea is that instead of receiving a $50,000 research grant from one foundation or government agency, researchers may receive smaller donations from 10, 50 or even 100 supporters and thus finance their project.

Read more »

A Tree in Winter

by Brooks Riley

If I could hug a tree a day without seeming a complete idiot, I would. Trees matter to me now–how fast they grow, how full their crowns, how tall they are, how odd their leaves, how extraordinary their shapes, how thick their trunks, how nearby they are. This late interest has crept up on me, and taken hold in ways I am trying to understand.

It’s not as if I’ve taken leave of humankind, the animal kingdom—or my senses—to go live among the stately green giants. I haven’t given up all that for something else, far from it. But there are aspects of trees that seem to harmonize with what I need: Silence (I don’t need to communicate with them.); Design (The complexity of a living organism achieving its biological destiny is somehow reassuring.); Color (The range of hues, from green to orange to yellow to purple to pink, is a technicolor packaging triumph); Variety (The aesthetic intricacy of their bare black branches against a grey sky, or the hoarfrost that turns them white overnight like an old crone); Progress (Those bare branches look a lot like dendrites, reminding me that mine are still growing too); Stillness (They don’t have to move to be going somewhere.). The appeal of a tree is almost metaphoric, heralding a time when I too will fall silent, cease to move, and return to the same earth they already occupy. No hooded figure with a scythe will knock on my door. I’ll be knocking at the door of their kingdom when the time comes, a willing Philemon with or without my Baucis.

I don’t anthropomorphize trees the way Peter Wohlleben apparently does in his recent bestselling book. I am happy to learn that trees are just as social as we are, but this news has no bearing on my solitary appreciation of a tree.

Read more »

Surname Extinction

by Olivia Zhu

Every year, there comes a flood of articles regarding trends in baby names, accompanied by charts and historical analyses. I’ve been tickled to see my own first name enjoy rather significant increases in popularity over the past decade or so—congratulations to my parents for being trendsetters!

Yet, equally interesting—if not even more interesting—is the modeling of surname trends over time, and it was that problem that captivated my collaborator Nicole Flanary (Nicole is the 152nd most popular female baby name, by the way) and me. Surnames tell the stories of lineages, immigration, ethnic enclaves, feminism, assimilation, family planning, and more, whereas given names more typically reflect cultural fads. A study of surnames also offers up the idea of “surname extinction,” the fatalistically named phenomenon that British mathematicians Francis Galton and Henry William Watson modeled. In the 1870s, they explored the topic to determine whether aristocratic families might go extinct depending on the number of children they had—a process well-modeled since British high society at the time was fairly closed, homogeneous, and patrilineal.
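The Galton-Watson model is simple enough to simulate directly. Here is a minimal sketch in Python, assuming (purely for illustration) that each father passes the surname to a Poisson-distributed number of sons; the mean of 1.1 sons per father and the 20-generation horizon are illustrative parameters, not figures from Galton and Watson's work:

```python
import numpy as np

rng = np.random.default_rng(seed=0)

def surname_survives(mean_sons=1.1, generations=20):
    """Simulate one male lineage as a Galton-Watson branching process.

    Each current bearer of the surname fathers a Poisson(mean_sons) number
    of sons; the name is extinct once no bearers remain. Both parameters
    are illustrative assumptions, not historical estimates.
    """
    carriers = 1
    for _ in range(generations):
        if carriers == 0:
            return False  # no male heirs left: the surname has died out
        carriers = rng.poisson(mean_sons, size=carriers).sum()
    return carriers > 0

# Estimate the chance that a single lineage disappears within 20 generations.
trials = 10_000
extinct = sum(not surname_survives() for _ in range(trials))
print(f"Estimated 20-generation extinction probability: {extinct / trials:.1%}")
```

Even with a mean slightly above one son per father, most individual lineages eventually die out, which is precisely the effect Galton and Watson set out to quantify for aristocratic families.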

Galton and Watson might have found a few other societies interesting as well. Chinese, Korean, and Vietnamese populations are renowned for their lack of surname diversity—was there an extinction-style event at some point that eliminated names from the language altogether? Vietnam is a particularly interesting case, as 40% of the population share the same last name: Nguyen. By contrast, surname diversity and even inventiveness in other countries are also worth studying, especially since new last names may be easily and often added to the name pool.

Read more »

Studying the Liberal Arts while Muslim

by Shadab Zeest Hashmi

As the election season seemed more and more like being trapped in a carnival where the uncanny is orchestrated to play up primal fears, we witnessed language itself veering off into the realm of the irrational—not only because statements did not reflect facts or reflected only partial facts, or arguments lacked consistency, but because none of this mattered anymore: a sense of panic killed the need to seek the truth. The election machinery, with the media as its engine, successfully exploited anxiety to mute even the most basic assessment of language for truth-telling. If it were not for the few human rights groups circulating infographics (which bypass conventional language by presenting facts numerically) on social media, or sharing cell phone videos, or simply asking questions to expose the propaganda, all the cogs involved in manufacturing the “post-truth” age would have been even more opaque. Falling prey to fear-based propaganda isn’t uncommon in history, but when it happens to the populace of a leading superpower that prides itself on being a free-thinking democracy, one needs to ask how the populace has found itself primed for phobia. One of the many places to find the answer is academia, and my personal frame of reference is my alma mater, Reed College, which itself happens to be in the throes of political agitation (not unlike many other campuses across the country), and is doubly significant to me because my son is a current student there. How does Reed prepare students to make sense of the world in the post-truth age? How did I fare as a female Muslim international student?

Read more »

The Art of Wine: Part 1

by Dwight Furrow

Among the most striking developments in the art world in the past 150 years is the proliferation of objects that count as works of art. The term “art” is no longer appropriately applied only to paintings, sculpture, symphonic music, literature or theatre, but includes architecture, photographs, film and television, found objects, assorted musical genres, conceptual works, environments, etc. The Museum of Modern Art in New York proudly displays a Jaguar XKE roadster as a work of art. As Jacques Rancière writes regarding the modernist aesthetic that begins to emerge in the 18th Century:

“The aesthetic regime asserts the absolute singularity of art and, at the same time, destroys any pragmatic criterion for isolating this singularity. It simultaneously establishes the autonomy of art and the identity of its forms with the forms that life uses to shape itself.”*

Rancière argues that with the proliferation of objects that now count as art, contemporary art is neither autonomous from nor fully absorbed into everyday life but occupies a borderland between the everyday and the extraordinary that it is art's function to continually negotiate. Art is about having a certain kind of aesthetic experience; it is no longer about a particular kind of object.

Wine is among the most prevalent of everyday objects that have no function except to provide an aesthetic experience. And so the question naturally arises: Can wine be a work of art?

Read more »

Chapter One: The Compass of Regret

by Maniza Naqvi

To dance the dance, I did not dance, because at the end of the conference, my accompanying handbag, which contained all my documents, passports, credit cards and so forth, compelled me to sit frumpily, guarding it, instead of joining the sensuously swaying crowd. When I had the chance, I chose instead to sit tied to my belongings—an accumulation of things. Ah the regret.

Why you? Why You? Why you? I had asked myself earlier, marveling at my good fortune gleefully. I kept repeating the direction I was headed towards the land of a thousand stories: Aracataca, Aracataca, Aracataca. Each syllable slung against the roof of my mouth, crashing against each other on my tongue, creating a rhythm like a tin drum. I wanted to jump and dance. Oh sure. I was going just to a conference—but it was on the shores of Colombia very near Aracataca. And so I went pulled by the magnetic allure of it and the lore of the Sierra Nevada.

But, ah the regret. Nearly there, not really there, close, nearby. I did but glimpse it in that chance brief encounter with its beauty and its possibilities, a moment so very brief it nevertheless left me breathless. And when I left, it left me imagining it, wishing that I would return to travel it by river perhaps at a great age, and in love. Finally. And then, then, without a care in the world, I would dance.

Read more »

Why Germans Can Say Things No One Else Can

From The Book of Life:

We’re hugely dependent on language to help us express what we really think and feel. But some languages are better than others at crisply naming important sensations.

Germans have been geniuses at inventing long – or what are called ‘compound’ – words that elegantly put a finger on emotions that we all know, but that other languages require whole clumsy sentences or paragraphs to express.

Here is a small selection of the best of Germany’s extraordinary range of compound words:

1. Erklärungsnot

[Explanation-Distress]

Literally, a distress at not having an explanation. The perfect way to define what a partner might feel when they’re caught watching porn or are spotted in a restaurant with a hand they shouldn’t be holding. More grandly, Erklärungsnot is something we feel when we realise we don’t have any explanations for the big questions of life. It’s a word that defines existential angst as much as shame.

2. Futterneid

[Food-Envy]

The feeling when you’re eating with other people and realise that they’ve ordered something better off the menu that you’d be dying to eat yourself. Perhaps you were trying to be abstemious; now you’re just in agony. The word recognises that we spend most of our lives feeling we’ve ordered the wrong thing. And not just in restaurants.

More here.

In praise of, dare we say it, the media

From the Globe and Mail:

Mark Twain once complained about newspapers that use one half of their pages to tell readers how good the other half are. It’s a valid grievance; no paper ought to do it. But in a year that saw a boom in fake news, neo-Nazi sloganeering against the “Lügenpresse” and attacks on journalists by the president-elect of the United States, it is defensible for this little space to spend a minute celebrating, not our newspaper in particular, but a free and unbiased press in general.

Note the word “celebrating.” We could have said “defending,” but we aren’t going to play that game. The attacks on the media of the past year, from left and right, have been driven either by political operatives or opportunists. There is political gain to be had from whining ceaselessly that the “elite” media are biased against you, as Donald Trump and many others ritually do. There is also a solid business model in telling your readers that the mainstream press are lying to them, and that they should spend their money and time on the alternative that you just happen to own and operate. There is no point decrying these inevitabilities, and it is wrong to be censorial about them if one is committed to free speech.

Note in that first paragraph the word “unbiased.” There are undoubtedly readers who got to that contentious term and crumpled this page into a tightly wadded ball, carried it to the kitchen garbage pail and dropped it in with relish.

The charge of bias is a constant today, for reasons already stated, but also because there is no hiding the fact that newspapers and the people who write for them have a variety of leanings.

More here.

WHAT SCIENTIFIC TERM OR CONCEPT OUGHT TO BE MORE WIDELY KNOWN?

Azra Raza at Edge.org:

The Cancer Seed and Soil Hypothesis

One in two men and one in three women in the US will get cancer. Five decades after declaring war on the disease, we are still muddling our way rather blindly from the slash-poison-burn (surgery-chemo-radiation) strategies to newer approaches like targeted therapies, nanotechnology, and immunotherapies, which benefit only a handful of patients. Among other reasons for the slow progress, a major flaw is the study of cancer cells in isolation, which de-links the seed from its soil.

Stephen Paget was the first to propose, in 1889, that “seeds” (cancer cells) preferentially grew in the hospitable “soil” (microenvironment) of select organs. The cross-talk between seed and soil hypothesized by Paget indeed proved to be the case whenever the question was examined (such as in the elegant studies of Hart and Fidler in the 1980s). Yet consistent research combining studies of the seed and soil was not pursued, largely because, in the excitement generated by the molecular revolution and the discovery of oncogenes, the idea of creating animal models of human cancers appeared far more appealing. This led to the entire field of cancer research being hijacked by investigators studying animal models, xenografts and tissue culture cell lines in patently artificial conditions. The result of this de-contextualized approach, which is akin to looking for car keys under the lamppost because of the light instead of where they were lost a mile away, is nothing short of a tragedy for our cancer patients, whose pain and suffering some of us witness and try to alleviate on a daily basis.

More here. And my own answer to the same question can be seen here. Do browse all the excellent entries at Edge.

Why We Love to Blame 2016

Brian Gallagher in Nautilus:

You may have noticed it by now: the—I guess I’ll call it an impulse—to anthropomorphize “2016.” It began gradually. First, we objectified it, likening it to a disturbing film, a force of nature, broken hardware. As Slate put it:

In trying to wrap our heads around 2016’s all-reason-and-logic–defying onslaught of tragedy and absurdity, we objectified the year. We gave it a shape and form, likening it to a melodrama, a malfunctioning machine, an unstoppable meteor, anything to get some small grasp on the year’s surreal and hellish parade of events.

Then we subjectified 2016. We wrote letters to the year, chastising its bad behavior (for, among other things, offing beloved celebrities). John Oliver went further, detonating a large “2016” structure in an arena. A recent Atlantic article ran with the title “‘Fuck You, 2016’: On blaming a year for the things that happen in it.” But why are people blaming 2016 anyway? It could be that we’re anthropomorphizing the year to connect to it, and we need to connect to it because so many of our other connections are broken. In a study published in October, Jennifer A. Bartz, a psychologist at McGill University, and her colleagues described anthropomorphism as “a motivated process” that “reflect[s] the active search for potential sources of connection.” Bartz wanted to see if she could replicate, and extend, findings from a 2008 study by the University of Chicago social psychologist Nicholas Epley and colleagues, who claimed that socially disconnected people may “invent humanlike agents in their environment” to help feel reconnected. Those researchers, Bartz and her colleagues write, “found that lonely people (compared with nonlonely people) were more likely to ascribe humanlike traits (e.g., free will) to an alarm clock, battery charger, air purifier, and pillow.”

This year, with its Presidential election, seems to have offered many occasions for Americans to question their sense of belonging. Neil Gross, a sociologist at Colby College, wrote in the New York Times recently that many people have been wondering, “Is this America?” “It’s a telltale sign of collective trauma, a grasping for identity when the usual bases for community aren’t there any more,” he writes. “For progressives, moderates and ‘Never Trump’ Republicans, the political order they long took for granted—defined by polarization, yes, but also by a commitment to basic principles of democracy and decency—is suddenly gone.” A recent Pew report, titled “A Divided and Pessimistic Electorate,” illustrated this: “Beyond their disagreements over specific policy issues, voters who supported President-elect Donald Trump and Hillary Clinton also differed over the seriousness of a wide array of problems facing the nation, from immigration and crime to inequality and racism.”

More here.