The history of the Digital Revolution

Christina Pazzanese in the Harvard Gazette:

Isaacson ’74 is the best-selling author of landmark biographies of Jobs, Albert Einstein, and Benjamin Franklin. A former journalist who has headed CNN and Time magazine, Isaacson is currently CEO of the Aspen Institute, an educational and policy studies think tank in Washington, D.C., as well as a Harvard Overseer. He spoke with the Gazette about what he learned in his research and how truly lasting innovation is often found where our humanity meets our machinery.

GAZETTE: What drew you to a subject as complicated and fluid as the history of the Digital Revolution?

ISAACSON: I was always an electronics geek as a kid. I made ham radios and soldered circuits in the basement. My father and uncles were all electrical engineers. When I was head of digital media for Time Inc. in the early 1990s, as the Web was just being invented, I became interested in how the Internet came to be. When I interviewed Bill Gates, he convinced me that I should do a book not just about the Internet, but its connection to the rise of the personal computer. So I’ve been working on this for about 15 years. I put it aside when Steve Jobs asked me to do his biography, but that convinced me even more there was a need for a history of the Digital Age that explained how Steve Jobs became Steve Jobs.

GAZETTE: You’re known for your “Great Man”-style biographies, and yet this is a story about famous, seminal figures and the lesser-knowns who contributed to the Digital Revolution in some way. Can you tell me about that approach?

ISAACSON: The first book I did after college (with a friend) was about six not very famous individuals who worked as a team creating American foreign policy. It was called “The Wise Men.” Ever since then, I’ve done biographies. Those of us who write biographies know that to some extent we distort history. We make it seem like somebody in a garage or in a garret has a “light-bulb moment” and the world changes, when in fact creativity is a collaborative endeavor and a team sport. I wanted to get back to doing a book like “The Wise Men” to show how cultural forces and collaborative teams helped create the Internet and the computer.

More here.

Wednesday Poem

Destiny

They deliver the edicts of God
without delay
And are exempt from apprehension
from detention
And with their God-given
Petasus, Caduceus, and Talaria
ferry like bolts of lightning
unhindered between the tribunals
of Space and Time

The Messenger-Spirit
in human flesh
is assigned a dependable,
self-reliant, versatile,
thoroughly poet existence
upon its sojourn in life

It does not knock
or ring the bell
or telephone
When the Messenger-Spirit
comes to your door
though locked
It'll enter like an electric midwife
and deliver the message

There is no tell
throughout the ages
that a Messenger-Spirit
ever stumbled into darkness

by Gregory Corso

Gregory Corso French Translation

New French translation of Gregory Corso's poetry


The Metaphysical Club

From delanceyplace:

Robert Gould Shaw (portrayed by Matthew Broderick in the stirring 1989 film Glory) was a wealthy young Bostonian and a second lieutenant in the 2nd Massachusetts Infantry when he was approached by his father in late 1862 to take command of a new all-Black regiment, the 54th Massachusetts Infantry. At first he declined the offer, but after careful thought, he accepted the position. He and his troops were immortalized on July 18, 1863, when they assaulted Confederate Battery Wagner. As the unit hesitated in the face of fierce Confederate fire, Shaw led his men into battle by shouting, “Forward, Fifty-Fourth, forward!” As he led his men forward he was shot through the chest three times and died almost instantly. Years later, it was William James, viewed by some as the most brilliant American of the nineteenth century, who gave the dedication speech at the unveiling of a memorial in Shaw's honor. Note that the 1859 publication of Charles Darwin's On the Origin of Species formed the intellectual backdrop for that era, and thus is understandably present in James's speech:

“In 1897 the Commonwealth of Massachusetts erected a monument on Boston Common, designed by Augustus Saint-Gaudens and dedicated to Robert Gould Shaw, the man who had led the Fifty-Fourth and had died at Fort Wagner. William James was invited to deliver the oration at the unveiling. It is the finest of his speeches. Shaw had begun the war as a private in the Seventh New York Regiment, and was then commissioned an officer in the Second Massachusetts before accepting, in the winter of 1863, the colonelcy of the Fifty-Fourth, the so-called black regiment. Veterans of all Shaw's regiments were in the audience when James spoke. Shaw was being honored for having been a valiant soldier, James told them, but that was not what made him worthy of a memorial.
For the instinct to fight is bred into us through natural selection; it hardly needs monuments or speeches to be reinforced. 'The survivors of one successful massacre after another are the beings from whose loins we and all our contemporary races spring,' James said; ' … pugnacity is the virtue least in need of reinforcement by reflection.'

“What had made Shaw admirable, James explained, was not 'the common and gregarious courage' of going off to fight. It is 'that more lonely courage which he showed when he dropped his warm commission in the glorious Second to head your dubious fortunes, [soldiers] of the Fifty-fourth.' That lonely kind of courage (civic courage as we call it in peace-times) is the kind of valor to which the monuments of nations should most of all be reared. For the survival of the fittest has not bred it into the bone of human beings as it has bred military valor; and of five hundred of us who could storm a battery side by side with others, perhaps not one could be found who would risk his worldly fortunes all alone in resisting an enthroned abuse.

“A great nation is not saved by wars, James said; it is saved 'by acts without external picturesqueness; by speaking, writing, voting reasonably; by smiting corruption swiftly; by good temper between parties; by the people knowing true men when they see them, and preferring them as leaders to rabid partisans or empty quacks.' This is the behavior that monuments should honor.”

More here.

Antal Szerb’s journey by midnight

Julie Orringer at The Millions:

In August 1936, the thirty-six-year-old Hungarian writer Antal Szerb—acclaimed both for his fiction and for his influential History of Hungarian Literature—traveled to Italy for what he suspected would be the last time. The journey was a romantic farewell, the coda of a long obsession with the country, its art, its history, its people, its language, its ancient towns, and their narrow back streets. Szerb had lived in Italy as a young man, between 1924 and 1929, and the place had never relinquished its hold on him. “I initially wanted to go to Spain,” he wrote in his 1936 travel journal, “…but it occurred to me that I simply must go to Italy, while Italy remains where it is, and while going there is still possible. Who knows for how much longer that will be; indeed, for how much longer I, or any of us, will be able to go anywhere? The way events are moving, no one will be allowed to set foot outside his own country.” (He may as well have said, particularly no one of Jewish origin; though baptized a Catholic, he was the son of Jewish parents and was conscious of the growing threat Europe’s Jews faced.) His sense of urgency was, of course, prescient: within a few years a trip like the one he undertook in 1936 would indeed have become impossible. But the unusual combination of obsession, urgency, clear-eyed judgment, and foreboding that drew him to Italy helped to shape the brilliant and surprising work he produced on his return to Hungary: Journey by Moonlight, one of the most indelible novels of Szerb’s troubled century.

more here.

the notion of family

Jane Harris at Paris Review:

LaToya Frazier’s first monograph, The Notion of Family, documents the decline of Braddock, Pennsylvania—a once-prosperous steel-mill town that employed generations of African American workers—alongside the hardships of Frazier’s family, who grew up there. Issues of class and race underscore the mostly black-and-white photographs in the collection, which is arranged as a kind of family album: intimate, collaboratively produced portraits of Frazier and her mother in mirrors and on beds are presented with derelict scenes of collapsed buildings, vacant lots, and boarded-up stores.

Frazier provides short texts with each image—wistful snippets of memory and anecdote merge with facts and statistics. Illness is nearly a constant. As Laura Wexler points out in an accompanying essay, Braddock’s hospital, which eventually housed the town’s only restaurant and therefore became its de facto meeting place, “is as much or more a fixture in this album and this family than the school, the factory, the library, the market, the taxi stand, the pawnshop, or any other institution.”

more here.

How I Rewired My Brain to Become Fluent in Math

Barbara Oakley in Nautilus:

In the years since I received my doctorate, thousands of students have swept through my classrooms—students who have been reared in elementary school and high school to believe that understanding math through active discussion is the talisman of learning. If you can explain what you’ve learned to others, perhaps drawing them a picture, the thinking goes, you must understand it.

Japan has come to be seen as a much-admired and emulated exemplar of these active, “understanding-centered” teaching methods. But what’s often missing from the discussion is the rest of the story: Japan is also home of the Kumon method of teaching mathematics, which emphasizes memorization, repetition, and rote learning hand-in-hand with developing the child’s mastery over the material. This intense afterschool program, and others like it, is embraced by millions of parents in Japan and around the world who supplement their child’s participatory education with plenty of practice, repetition, and yes, intelligently designed rote learning, to allow them to gain hard-won fluency with the material.

In the United States, the emphasis on understanding sometimes seems to have replaced rather than complemented older teaching methods that scientists are—and have been—telling us work with the brain’s natural process to learn complex subjects like math and science.

The latest wave in educational reform in mathematics involves the Common Core—an attempt to set strong, uniform standards across the U.S., although critics are weighing in to say the standards fail by comparison with high-achieving countries. At least superficially, the standards seem to show a sensible perspective. They propose that in mathematics, students should gain equal facility in conceptual understanding, procedural skills and fluency, and application.

The devil, of course, lies in the details of implementation. In the current educational climate, memorization and repetition in the STEM disciplines (as opposed to in the study of language or music), are often seen as demeaning and a waste of time for students and teachers alike. Many teachers have long been taught that conceptual understanding in STEM trumps everything else. And indeed, it’s easier for teachers to induce students to discuss a mathematical subject (which, if done properly, can do much to help promote understanding) than it is for that teacher to tediously grade math homework. What this all means is that, despite the fact that procedural skills and fluency, along with application, are supposed to be given equal emphasis with conceptual understanding, all too often it doesn’t happen. Imparting a conceptual understanding reigns supreme—especially during precious class time.

More here.

Friends of Leo

Benjamin Aldes Wurgaft reviews Robert Howse's Leo Strauss: Man of Peace:

Robert Howse’s Leo Strauss: Man of Peace is an effort to rehabilitate Strauss without supporting those whom he terms the “Straussians.” Howse both defends Strauss against his caricature as a “cult figure of the right,” and argues that Strauss favored peace (while acknowledging that violent force is sometimes necessary) rather than bellicosity, as some Strauss detractors have claimed. Howse wrote his book with the benefit of hundreds of Strauss’s recorded lectures, placed online by the Leo Strauss Center at the University of Chicago. The availability of these lectures does much to increase the transparency of Strauss’s literary estate, particularly after so much opprobrium has been heaped on his name, and Howse is quite right to say that “The release of the recordings […] reflect[s] the shock therapy of the Iraq accusations on the Straussian cult.”[5] But beyond Howse’s meditations on Strauss, war, and peace, this thoughtful, inventive, and well-argued — if, as I will explain, sometimes uneven — book also makes an important contribution by inquiring into Strauss’s views on philosophers and political life. Howse frames this in the following terms: “What can we, as scholars and as citizens, learn from the dramatic encounter between philosophy and political violence in Strauss’s own thought?”

While Howse emphasizes violence, for Strauss the problem of how philosophers understand “political violence” — of how they guard against it both for their own good and for the benefit of their political communities — was effectively a sub-question beneath the larger question of the relationship between philosophy and politics, or philosophy and action in the world more broadly: how do philosophers relate to their non-philosophical neighbors? Strauss articulated his doubts about philosophers directly participating in governance — during a famous exchange with his friend, the great Hegel interpreter Alexandre Kojève — to which Howse attends with skill and care. Howse notes that the controversy between them really involved not only philosophy’s action-potential but also its fundamental meaning, which for Kojève was revealed by history’s unfolding. But doubts about the mixture of philosophy and politics had turned up early in Strauss’s career, roughly at the same time two of his illiberal influences — the jurist Carl Schmitt and the philosopher Martin Heidegger — joined the Nazi Party.

Howse’s “man of peace” discussion involves two central, intertwined claims: first, that Strauss was not a foe of liberalism, constitutionalism, or democracy, as he is commonly taken to be. Howse’s Strauss looks beyond the limiting polarization of liberalism and anti-liberalism, is willing to support constitutional democracy (just as the historical Strauss was, of course, happy to live within the constitutional democracy of the United States), and asks how philosophers might contribute to constitutional thought. Howse’s book thus demands to be read next to Steven Smith’s 2006 Reading Leo Strauss: Philosophy, Politics, Judaism, which presented Strauss as the best friend liberal political thought ever had, working not to attack liberalism but to shore up any weaknesses within liberal thought.[6]

Howse’s other claim is biographical: he argues that we should see Strauss’s development in terms of t’shuvah (Hebrew for “repentance”) performed for the youthful sin of illiberal and nihilistic thinking: he calls this “Strauss’s self-overcoming of anti-liberalism,” a form of surrogate repentance not through religious piety but by philosophical means.

More here.

The Arendt Wars Continue: Seyla Benhabib v. Richard Wolin

Corey Robin in Crooked Timber:

In the beginning, when the battle first broke out after the publication of Eichmann, the main issue of contention was Arendt’s treatment of the Jewish Councils. But now that most of that generation of survivors is gone, that issue has died down.

Now the main fault line of the battle is Arendt’s treatment of Eichmann’s anti-Semitism: whether she minimized it or not. And that issue, it seems to me, is very much tied up with the fate of Israel.

After all, if the claim could be made, however vulgarly (for this was not in fact Arendt’s point at all), that Ground Zero of modern anti-Semitism was not in fact anti-Semitic, what does that tell us about the presence and persistence of anti-Semitism in the contemporary world? Again, that was not in fact Arendt’s argument, but it’s been taken that way, and I can’t help but think that one of the reasons why the focus on Eichmann’s anti-Semitism plays the role now that it does (as opposed to when the book was originally published) has something to do with the legitimation crisis that Israel is currently undergoing.

But this is for a longer discussion at a later point, one that I plan to explore in more depth in a piece on the Arendt wars that I’ll be writing for a magazine.

Right now, I’m more interested in the battle between Seyla Benhabib and Richard Wolin that has broken out over the last few weeks in the pages of the New York Times and the Jewish Review of Books. Again, prompted by Stangneth’s book.

I’ve been hesitant to wade into this battle on this blog for a few reasons. First, I personally know both Seyla and Richard, who’s a colleague of mine at the CUNY Graduate Center. Though I tend to side with Seyla on the question of Arendt, I have a great deal of respect for Richard and his work. I like both of them, and don’t like getting into the middle of it. Second, as I said, I’ll be writing more on the Arendt wars in the future, and want to give myself some time and space to think about what they mean before I weigh in in public. And last, I don’t know that I have the stomach for the inevitable round of Seinsplaining I anticipate on the comment thread of this blog. Talk about Arendt, everyone thinks Heidegger, and lo and behold we have one after another thousand-word comment from Learned Men about matters that have little to do with the original post.

But there are two smaller issues that have come up in the exchange between Wolin and Benhabib that I did want to explore, in part because they are so small.

More here.

What to Call Her?

Jenny Diski in LRB (image from Wikimedia commons):

My experience with death has been minimal and to varying degrees distant. I have never been in the presence of anyone when they died. The likely ones, family deaths, the deaths of my father and mother, are remote in space and time. My father died when I was 19, somewhere else, and I was told of it by phone. In the case of my mother I didn’t even know she had died in the 1980s until my daughter found out eight years later. Between late 2010 and early 2011 there were two deaths: one a very elderly, long-time friend, Joan, and the other, sudden and tragic, a couple of months later, in 2011, my first husband, father of my daughter, and my oldest friend, Roger. Then, during the final quarter of 2013, there were two more deaths within a month of each other, neither of them really unexpected after years of frailty, but both, Doris Lessing and her son, Peter, having attachments of some complexity to each other, to my daughter and to me, going back even before I went at 15 to live in their house.

When she died last November at the age of 94, I’d known Doris for fifty years. In all that time, I’ve never managed to figure out a designation for her that properly and succinctly describes her role in my life, let alone my role in hers. We have the handy set of words to describe our nearest relations: mother, father, daughter, son, uncle, aunt, cousin, although that’s as far as it goes usually in contemporary Western society.

Doris wasn’t my mother. I didn’t meet her until she opened the door of her house after I had knocked on it to be allowed in to live with her. What should I call her to others? For several months I lived with Doris, worked in the office of a friend of hers and learned shorthand and typing. Then, after some effort, she persuaded my father to allow me to go back to school to do my O and A levels. As a punishment, he had vetoed further schooling after I was expelled – for climbing out of the first-floor bathroom window to go to a party in the town – from the progressive, co-ed boarding school that Camden Council had sent me to some years before. (‘We think you will be better living away from your mother for some of the time. Normally, we would send you to one of our schools for maladjusted children, but because your IQ is so high, we’re going to send you to a private school, St Christopher’s, which takes a few local authority cases like yours,’ the psychologists at University College Hospital had said to me, rather unpsychologically. I was 11.) My father relented and Doris sent me to a progressive day school.

At the new school, aged 16, as I tried to ease myself back into being a schoolgirl after my adventures in real life (working full time in a shoe shop, a grocery shop and then being a patient in a psychiatric hospital), I discovered I had to have some way of referring to the person I lived with to my classmates. It turned out that teenagers constantly refer to and complain about their parents and they use the regular handles. Not that I would, under the circumstances, have complained. But could I refer to Doris as my adoptive mother?

More here.

Don’t Spoil the Ending

Abigail Zuger in The New York Times:

Perhaps we should reform the medical profession by keeping the young and immortal out of it. Let’s bar medical school entry till age 50: Presumably that would fix our present bizarre disconnect between the army of doctors bent on preserving life and the tiny band able to accept death. Doctors would come equipped with the age-bred wisdom to understand the continuum, and they would demand a health care system that did likewise. That’s not happening any time soon. As things stand, though, at least we have the bittersweet pleasure of watching the occasional thoughtful defection from the mighty army to the little band.

Dr. Atul Gawande, possibly the most articulate defector yet, has made his considerable reputation primarily as a fix-it man. As a Harvard surgeon he patches up organs; as a longtime writer for The New Yorker he has both described and prescribed for many of our profession’s troubles. The recent widespread enthusiasm for checklists to minimize medical errors can be traced more or less directly to his pen.

Now Dr. Gawande (heading for 50) has turned his attention to mortality, otherwise known as the one big thing in medicine that cannot be fixed. In fact, the better doctors perform, the older, more enfeebled and more convincingly mortal our patients become. And someone should figure out how to take better care of all of them soon, because their friends, neighbors and children are at their wits’ end.

It is one thing to understand this helplessness, as most young doctors do, by watching the trials of patients and their families; as an observer Dr. Gawande has visited this territory before. It is quite another thing to be socked in the gut by age and infirmity unfolding in one’s own family — an experience that has to be the world’s finest postgraduate medical education.

Dr. Gawande completed that curriculum in three courses: his grandfather’s extraordinarily long and atypically happy old age, his wife’s grandmother’s extremely long and typically unhappy old age, and his own father’s struggle with age and illness. The grandfather lived to almost 110 years old in a small Indian village, surrounded by family members who cared for him and catered to his every whim. All was not idyllic — a patriarch’s prolonged survival can certainly play havoc with everyone’s financial expectations — but his was the kind of empowered aging to which most aspire. Instead, what they usually get is the slow entrapment experienced by Dr. Gawande’s grandmother-in-law, a self-sufficient New Englander whose horizons were increasingly hemmed in by the terrible dictates of “safety.” She was not safe to live alone, not safe to drive, not safe to manage her own finances — she was not safe to live at all, really, yet condemned to live on. A balance between a reasonably risk-free old age and one worth living is surpassingly difficult to devise; it is the rare institution or family that manages it, as Dr. Gawande’s extensive reporting makes clear.

More here.

What would Plato make of the modern world?

Joe Gelonesi in ABC Radio:

More than 2,500 years ago an urgent question arose: why should we matter to ourselves, or anyone else? The existential angst of the Axial Age unleashed a protean intellectual energy. Enter Socrates and his famed pupil. As Rebecca Goldstein sees it, there was no turning back. She tells Joe Gelonesi that if there was ever another place and time for Plato, it’s right here, right now.

Rebecca Goldstein is a fan of Plato. That might be an understatement. It’s been said that all of western philosophy is but a footnote to the Athenian, and this highly accomplished, Ivy League-trained philosopher doesn’t doubt it for one minute. For our crazy, mixed-up times, Goldstein has conducted what amounts to a 400-page thought experiment. It’s proved extraordinarily popular, beyond her expectations, and kept her busy for the better part of a year taking her Plato on a tour of a world hungry for answers, or at least the right questions. Her premise is simple: if Plato could come back, what would he make of it all? In the process, she hopes to prove that the philosophy-jeerers, as she calls them, have wrongly trumpeted a premature death for one of humankind’s most extraordinary enterprises.

Plato at the Googleplex is subtitled Why Philosophy Won’t Go Away. In it she handles Plato with deft hands, placing him in situations known all too well to us moderns: from clamorous cable talk shows to a brain imaging centre, where a neuroscientist declares to Plato that the game is up—science has solved the puzzle of free will. You can feel a soft wrath underneath these wacky fictional situations, from someone fed up with the crassness of a world desperate to move on to somewhere, anywhere, where doubt and uncertainty have evaporated. Goldstein, though, is not of the anti-science, touchy-feely kind. She understands string theory, evolution and genetics. She gets neuroscience too, sharing the stage on occasion with Antonio Damasio—he of impeccable mind-is-the-brain credentials. It’s just that for her, the technical explanation to life, the universe and everything won’t do on its own.

More here. (Thanks to Andrew Davies)

Randomness: the Ghost in the Machine?

by Yohan J. John

“Mine is a dizzying country in which the Lottery is a major element of reality; until this day, I have thought as little about it as about the conduct of the indecipherable gods or of my heart.”

The Lottery in Babylon by Jorge Luis Borges

In his classic short story The Lottery in Babylon, Borges invites us to imagine a culture that valorizes randomness, institutionalizing it in an official lottery that entangles itself with every aspect of life, and even non-life. By situating this culture in Babylon, Borges frees himself to conjure up an alien way of life. And yet, as with all great speculative fiction, Borges also seems to be holding a mirror up to nature — a funhouse mirror that warps and amplifies features that we can discern even in our own culture. In his evocative and succinct way, Borges is perhaps hinting that we continue to live in that dizzying country in which randomness is a major component of reality.

The Indecipherable Gods

How long has randomness been an element in the periodic table of ideas? For ancient people, chance was wrapped up with the concepts of fate and divine will. “Divination” comes from the Latin for “to be inspired by a god”. For the Romans, chance or luck was personified by the goddess Fortuna. To tell a person's fortune was to determine the hidden intentions of Lady Luck. The ancient Chinese used yarrow stalks, coins, and dice when consulting the 4000-year-old I Ching, or Book of Changes. Divination either led to, or co-evolved with, games of chance. The earliest known board game is Senet, which was played by ancient Egyptians as early as the 30th century BCE. The game seems to have involved casting two-sided tokens. A 5000-year-old backgammon set, complete with dice, was excavated at a site in Iran. Dice from 2000 BCE have also been found at sites that were part of the Indus Valley civilization [1].

Ancient peoples seem to have attached great meaning to chance events — even in the context of games. Confronted with the sheer unpredictability of nature, ancient people populated their pantheons with gods and demons who were capricious in the extreme. They seem to have believed that participating in chance events of their own invention could give them a glimpse into the otherwise inscrutable ways of divine beings [1]. Or perhaps they reasoned that they could become like gods through imitation of their ludicrous whims. The word “ludicrous”, incidentally, derives from the Latin root ludus, which means “game” or “play”. At some point in the past few hundred years, the word came to mean “ridiculous” — perhaps the Enlightenment made Europeans look unfavorably upon frivolity and play. There are streams within Hinduism, however, that preserve an echo of the ancient worldview — in some scriptures the universe is described as lila, or divine play. The gods, according to this view, engage in creation and destruction for fun or sport. In India the term lila did not pick up any connotations of ridiculousness: it is a well-known theological concept, as well as a popular name given to girls.

In the modern world randomness typically connotes the very opposite of divine will — outside the world of gambling and gaming, a random event is often described as meaningless, and therefore only a source of inconvenience or tragedy. The ancients may have confronted chance with a more cheery attitude than is common today, but there is no suggestion that they were able to translate any intuitions derived from gambling (or fortune-telling) into a mathematical theory of chance. This hole in ancient knowledge is striking, because much of the mathematics required to begin the study of randomness — simple arithmetic — was known to ancient cultures all over the world [2].

Order out of Chaos

The seeds of a mathematical approach to randomness were planted in the 16th century, when European thinkers realized that chance events and processes were not completely devoid of order. This realization contributed to the emergence of two related and complementary mathematical approaches: probability and statistics. The theory of probability allowed people to uncover patterns in controlled settings, such as games of chance. Statistics allowed people to uncover patterns in more natural, uncontrolled settings, such as mortality tables compiled for insurance purposes.
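
The kind of hidden order those early thinkers noticed can be illustrated with a few lines of code: the sum of two fair dice is not uniformly distributed, even though each die is, and repeated play makes the pattern visible. Below is a minimal sketch (the function names `theoretical_sum_probs` and `simulate_sum_freqs` are my own, purely illustrative) that compares the exact distribution, found by enumerating the 36 equally likely outcomes, with frequencies from a simulated run of rolls:

```python
import random
from collections import Counter

def theoretical_sum_probs():
    # Enumerate all 36 equally likely (die1, die2) outcomes and
    # count how many produce each sum from 2 to 12.
    counts = Counter(a + b for a in range(1, 7) for b in range(1, 7))
    return {s: c / 36 for s, c in counts.items()}

def simulate_sum_freqs(n_rolls, seed=0):
    # Roll two fair dice n_rolls times and record empirical frequencies.
    rng = random.Random(seed)
    counts = Counter(rng.randint(1, 6) + rng.randint(1, 6)
                     for _ in range(n_rolls))
    return {s: c / n_rolls for s, c in counts.items()}

if __name__ == "__main__":
    theory = theoretical_sum_probs()
    empirical = simulate_sum_freqs(100_000)
    for s in range(2, 13):
        print(f"sum {s:2d}: theory {theory[s]:.3f}  empirical {empirical.get(s, 0.0):.3f}")
```

A sum of 7 can be made six different ways, so it turns up about one roll in six, while 2 and 12 each appear only once in 36 rolls; this is exactly the sort of regularity in games of chance that the first probabilists worked out by counting cases.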

Read more »

Poem

FIRE TREE

Tips of his mustache whip braided,
a turbaned invader four centuries ago
carried Persian saplings in a caravan
across the Himalayas to Kashmir.

“Our chinar will last a thousand years,”
my grandfather said as rustling boughs
reigned above the tin roof of the house
where I was born a Scorpio at midnight.

Every fall each leaf burst into a flower.
We gathered the remains of dyes
to create our rustic fuel for winter,
sprinkling water on burning leaves,

palms brushing light ashes together.
I packed fragile coal in a clay pot
matted in painted wicker, my kangri,
cloaking it between my knees

under a loose mantle, my pharun.
The ashes warmed my bag of bones.
I flew to the future of other worlds,
returning years later to see my father,

sun-withered, sipping his morning tea
alone beside an amputated trunk.
Last night I dreamt I went to Kashmir again.
I was being rowed in an embroidered shikara

to the Garden of Rajas who had vanished,
and the garden was a sea of hell; the tin roof
collapsed, our fire tree submerged, and
barrenness had become a thousand things.

by Rafiq Kathwari, Winner of the 2013 Patrick Kavanagh Poetry Award

Perceptions: Avian aesthetics

Bowerbirds “make up the bird family Ptilonorhynchidae. They are renowned for their unique courtship behaviour, where males build a structure and decorate it with sticks and brightly coloured objects in an attempt to attract a mate.” From Wikipedia.

“To woo females, the males of 17 of the 20 known species of bowerbirds build structures—often resembling an arbor, or bower, with an artfully decorated platform. …

… evolutionary biologist Jared Diamond has called them “the most intriguingly human of birds.” These are birds that can build a hut that looks like a doll's house; they can arrange flowers, leaves, and mushrooms in such an artistic manner you'd be forgiven for thinking that Matisse was about to set up his easel; some can sing simultaneously both the male and female parts of another species' duet, and others easily imitate the raucous laugh of a kookaburra or the roar of a chain saw. Plus, they all dance.” From National Geographic, July 2010.

More here, and here.

Do check out the links … bowerbirds are completely awesome!

Thanks to Joyce Ramsey, the owner of “Bowerbird Mongo”, a store in Ypsilanti, MI.