Pandaemonium

PLUCKED FROM THE WEB #38

The latest (somewhat random) collection of recent essays and stories from around the web that have caught my eye and are worth plucking out to be re-read.


.

Why replacing politicians
with experts is a reckless idea

David Runciman, Guardian, 1 May 2018

Epistocracy remains the reckless idea. There are two dangers in particular. The first is that we set the bar too high in politics by insisting on looking for the best thing to do. Sometimes it is more important to avoid the worst. Even if democracy is often bad at coming up with the right answers, it is good at unpicking the wrong ones. Moreover, it is good at exposing people who think they always know best. Democratic politics assumes there is no settled answer to any question and it ensures that is the case by allowing everyone a vote, including the ignorant. The randomness of democracy – which remains its essential quality – protects us against getting stuck with truly bad ideas. It means that nothing will last for long, because something else will come along to disrupt it.

Epistocracy is flawed because of the second part of the word rather than the first – this is about power (kratos) as much as it is about knowledge (episteme). Fixing power to knowledge risks creating a monster that can’t be deflected from its course, even when it goes wrong – which it will, since no one and nothing is infallible. Not knowing the right answer is a great defence against people who believe that their knowledge makes them superior.

Brennan’s response to this argument (a version of which is made by David Estlund in his 2007 book Democratic Authority) is to turn it on its head. Since democracy is a form of kratos, too, he says, why aren’t we concerned about protecting individuals from the incompetence of the demos just as much as from the arrogance of the epistocrats? But these are not the same kinds of power. Ignorance and foolishness don’t oppress in the same way that knowledge and wisdom do, precisely because they are incompetent: the demos keeps changing its mind.

The democratic case against epistocracy is a version of the democratic case against pragmatic authoritarianism. You have to ask yourself where you’d rather be when things go wrong. Maybe things will go wrong quicker and more often in a democracy, but that is a different issue. Rather than thinking of democracy as the least worst form of politics, we could think of it as the best when at its worst. It is the difference between Winston Churchill’s famous dictum and a similar one from Alexis de Tocqueville a hundred years earlier that is less well-known but more apposite. More fires get started in a democracy, de Tocqueville said, but more fires get put out, too.

Read the full article in the Guardian.


.

The privilege predicament
Robert Boyers, The American Scholar, Spring 2018

Certainly it is not a simple matter to speak of privilege in the domain of race relations. A few years ago, I found myself embroiled in an argument at a symposium, where one speaker had referred to “white privilege” as a self-evident phenomenon. Was it really necessary, I asked, to point out that there is privilege and privilege, whiteness and whiteness? If my white colleague felt that she had a great deal to apologize for, and thought a public symposium a suitable occasion for a display of soul searching, that was well and good, so long as she did not also suggest that we must all follow her lead and all feel about our own so-called privilege exactly what she felt. Was it reasonable to suppose that whiteness confers, on everyone who claims it, comparable experiences and privileges? Was my own background as a working-class Jewish boy, growing up in a predominantly black community, remotely similar to the background or disposition of a white colleague who had never known privation, or had no contact at all with black children? Did it matter, thinking of ourselves simply as possessors of white privilege, that one of us had written extensively on race while the other had devoted herself to scholarly research on metaphysical poetry? Was it not the case, I asked, that what Claudia Rankine and Beth Loffreda call in The Racial Imaginary “the boundaries” of our “imaginative sympathy” had been drawn in drastically different ways? How could whiteness, or blackness, signify to us the same things?

To consider either of us primarily as white people, deliberately consigning to irrelevance everything that made us different from each other—and different from the kinds of white people who regard their whiteness as an endowment to be proud of—was to deny what was clearly most important about each of us. Rankine and Loffreda rightly challenge those who “argue that the imagination is or can be somehow free of race,” and they mock white writers “who make a prize of transcendence,” supposing that the imagination can be “ahistorical” or “postracial.” But to insist that elementary distinctions be made, as between one experience of race and another, would seem indispensable to a serious discussion of privilege.

Read the full article in the American Scholar.


.

The collapse of Europe’s mainstream centre left
Chris Bickerton, New Statesman, 1 May 2018

Historically, the left as a political force emerged from the confrontation between the wishes and desires of European workers and the inability of 19th-century laissez-faire liberalism to satisfy them. The incubator for left-wing ideas has long been the gap between what people want and what the social structure of capitalism can deliver.

Today’s parties of the left in Europe tend to be too socially deracinated and isolated to know what it is that people want. And given the complacency of many of the leading figures, it is not clear they are interested in finding out. The degree to which new movements are representative of their societies is crucial if they are to be a path for the renewal of the left. And so, the question for the political parties of the left is how they can reconnect with societies marked by intensifying individualism.

The left in Europe should not look back nostalgically at the “golden age” of social democracy and ask how that can be recreated. The world of the 1940s or 1950s was very different, as were the early years of the 20th century, when the emergence of mass politics was transforming political systems and giving rise to new social movements. Nor should the left hope to recreate this golden age at the European level, a wish that is more utopian than ever today.

The challenge of the present is to combine a renewal of the theoretical tools of left-wing social and political thought with the discovery of what people are thinking about and what it is that they want. And then to build a new political project at the national level, based on what changes to society would be required in order for those needs and desires to be met.

Read the full article in the New Statesman.


.

Windrush is just the tip of the iceberg – this is how anti-immigrant politics are normalised in Britain
Maya Goodfellow, Prospect, 2 May 2018

You don’t just arrive at the ‘hostile environment’ with one government. Theresa May, with a little help from New Labour, is its architect, but she didn’t build the base it’s constructed on. This is not just the product of the past eight years; it is the outcome of decades of a toxic anti-immigration politics that are deeply embedded in Britain.

Just look at the Windrush generation: lauded by the press in 2018 as national treasures, when they came to the UK as citizens from former colonies they were treated as anything but. They were met with politicians that wanted their labour but not their presence—and a public that complained that people of colour moving in next door would ruin whole neighbourhoods.

Fast forward sixty years, and the shape of xenophobia has changed but anti-migrant politics have remained remarkably similar. Politicians, the public and the press have continuously scapegoated migrants for our social ills – wrongly blaming them for low pay, crumbling public services and our social anxieties. ‘They’ are still a problem to be managed. It is not the specifics of who is in the Home Office but the normalisation of these politics that allows the hostile environment.

Cameras will turn elsewhere now that the cleanup is underway. But while the spillage might be mopped up, the source of the mess remains sitting on the shelf. On his first day in office, Amber Rudd’s replacement, Sajid Javid, announced the phrase the ‘hostile environment’ will be exchanged for the more palatable ‘compliant environment’. The name, he said, isn’t fitting with ‘our values as a country’. We can assume the policies, still in place, are.

Read the full article in Prospect.


.

Achilles and Ajax

Black Achilles
Tim Whitmarsh, Aeon, 9 May 2018

In the Odyssey, Athena is said to enhance Odysseus’ appearance magically: ‘He became black-skinned (melagkhroiēs) again, and the hairs became blue (kuaneai) around his chin.’ On two other occasions when she beautifies him, she is said to make his hair ‘woolly, similar in colour to the hyacinth flower’. Now, translating kuaneos (the root of the English ‘cyan’) as ‘blue’, as I have done here, is at first sight a bit silly: most translators take the word to mean ‘dark’. But given the usual colour of hyacinths, maybe – just maybe – he did have blue hair after all? Who knows; but here, certainly, is another example of just how alien the Homeric colour scheme is. To make matters worse, at one earlier point in the poem his hair is said to be xanthos, ie just like Achilles’; commentators sometimes take that to refer to grey grizzle (which is more evidence that xanthos doesn’t straightforwardly mean ‘blond’).

And what of ‘black-skinned’? Was Odysseus in fact black? Or was he (as Emily Wilson’s acclaimed new translation renders it) ‘tanned’? Once again, we can see how different translations prompt modern readers to envisage these characters in completely different ways. But to understand the Homeric text, we need to shed these modern associations. Odysseus’ blackness, like Achilles’ xanthos hair, isn’t intended to play to modern racial categories; rather, it carries with it ancient poetic associations. At another point in the Odyssey, we are told of Odysseus’ favourite companion Eurybates, who ‘was round-shouldered, black-skinned (melanokhroos), and curly-haired … Odysseus honoured him above his other comrades, because their minds worked in the same way.’ The last part is the crucial bit: their minds work in the same way, presumably, because Eurybates and Odysseus are both wily tricksters. And, indeed, we find the association between blackness and tricksiness elsewhere in early Greek thought.

Read the full article in Aeon.


.

The demise of the nation state
Rana Dasgupta, Guardian, 5 April 2018

Even if we wanted to restore what we once had, that moment is gone. The reason the nation state was able to deliver what achievements it did – and in some places they were spectacular – was that there was, for much of the 20th century, an authentic “fit” between politics, economy and information, all of which were organised at a national scale. National governments possessed actual powers to manage modern economic and ideological energies, and to turn them towards human – sometimes almost utopian – ends. But that era is over. After so many decades of globalisation, economics and information have successfully grown beyond the authority of national governments. Today, the distribution of planetary wealth and resources is largely uncontested by any political mechanism.

But to acknowledge this is to acknowledge the end of politics itself. And if we continue to think the administrative system we inherited from our ancestors allows for no innovation, we condemn ourselves to a long period of dwindling political and moral hope. Half a century has been spent building the global system on which we all now depend, and it is here to stay. Without political innovation, global capital and technology will rule us without any kind of democratic consultation, as naturally and indubitably as the rising oceans.

If we wish to rediscover a sense of political purpose in our era of global finance, big data, mass migration and ecological upheaval, we have to imagine political forms capable of operating at that same scale. The current political system must be supplemented with global financial regulations, certainly, and probably transnational political mechanisms, too. That is how we will complete this globalisation of ours, which today stands dangerously unfinished. Its economic and technological systems are dazzling indeed, but in order for it to serve the human community, it must be subordinated to an equally spectacular political infrastructure, which we have not even begun to conceive.

It will be objected, inevitably, that any alternative to the nation-state system is a utopian impossibility. But even the technological accomplishments of the last few decades seemed implausible before they arrived, and there are good reasons to be suspicious of those incumbent authorities who tell us that human beings are incapable of similar grandeur in the political realm. In fact, there have been many moments in history when politics was suddenly expanded to a new, previously inconceivable scale – including the creation of the nation state itself. And – as is becoming clearer every day – the real delusion is the belief that things can carry on as they are. The first step will be ceasing to pretend that there is no alternative. So let us begin by considering the scale of the current crisis.

Read the full article in the Guardian.


.

Are you in a BS job? In academe, you’re hardly alone
David Graeber, Chronicle of Higher Education, 6 May 2018

In most universities nowadays — and this seems to be true almost everywhere — academic staff find themselves spending less and less time studying, teaching, and writing about things, and more and more time measuring, assessing, discussing, and quantifying the way in which they study, teach, and write about things (or the way in which they propose to do so in the future; European universities, reportedly, now spend at least 1.4 billion euros [about 1.7 billion dollars] a year on failed grant applications). It’s gotten to the point where “admin” now takes up so much of most professors’ time that complaining about it is the default mode of socializing among academic colleagues; indeed, insisting on talking instead about one’s latest research project or course idea is considered somewhat rude….

What strikes me as insufficiently discussed is that this has happened at a time when the number of administrative-support staff in most universities has skyrocketed. Consider here some figures culled from Benjamin Ginsberg’s book The Fall of the Faculty (Oxford, 2011). In American universities from 1985 to 2005, the number of both students and faculty members went up by about half, the number of full-fledged administrative positions by 85 percent — and the number of administrative staff by 240 percent.

In theory, these are support-staff. They exist to make other peoples’ jobs easier. In the classic conception of the university, at least, they are there to save scholars the trouble of having to think about how to organize room assignments or authorize travel payments, allowing them to instead think great thoughts or grade papers. No doubt most support-staff still do perform such work. But if that were their primary role, then logically, when they double or triple in number, lecturers and researchers should have to do much less admin as a result. Instead they appear to be doing far more.

This is a conundrum. Let me suggest a solution. Support staff no longer mainly exist to support the faculty. In fact, not only are many of these newly created jobs in academic administration classic bullshit jobs, but it is the proliferation of these pointless jobs that is responsible for the bullshitization of real work — real work, here, defined not only as teaching and scholarship but also as actually useful administrative work in support of either. What’s more, it seems to me this is a direct effect of the death of the university, at least in its original medieval conception as a guild of self-organized scholars. Gayatri Spivak, a literary critic and university professor at Columbia, has observed that, in her student days, when people spoke of “the university,” it was assumed they were referring to the faculty. Nowadays it’s assumed they are referring to the administration. And this administration is increasingly modeling itself on corporate management.

Read the full article in the Chronicle of Higher Education.


.

Cambridge Analytica’s closure
is a pyrrhic victory for data privacy

Ivan Manokha, The Conversation, 3 May 2018

If anything, the furore surrounding Cambridge Analytica has only served to strengthen the distinction between the idea of covert data collection versus data collection that is seen as legitimate and acceptable. The routine gathering and monetisation of vast amounts of personal data – currently undertaken on a daily basis by various actors and digital platforms especially – has been normalised…

The outcry against Cambridge Analytica has not attempted to sanction, nor even to question, the existence of digital platforms and other actors which depend on the ever more extensive acquisition and monetisation of personal data. If anything, the Cambridge Analytica story has unintentionally contributed to the further normalisation of surveillance and the lack of privacy that comes with being an internet user nowadays.

Even the web pages of the sites that broke the story (the Observer and New York Times) allow dozens of third-party sites to obtain data from the browser of the user accessing the articles. It was 75 and 61 sites, respectively, the last time I checked using Firefox’s Lightbeam extension.

Many commentators have pointed to this new era of ‘surveillance capitalism’ as the problem. But these arguments imply that capitalism without surveillance is not only possible, but that it existed before the advent of new technology.

Yet surveillance has been absolutely fundamental to the functioning of capitalism from the start. Producers have always needed to gather some information about the nature of their markets, their suppliers of inputs, and about the economy in general. Surveillance has also been central to the wage-labour relationship. Employees are closely supervised and monitored to ensure the time they work matches up to the time for which they are paid.

Read the full article in the Conversation.


.

Human chromosomes

The Golden State killer is tracked through
a thicket of DNA, and experts shudder

Gina Kolata & Heather Murphy,
New York Times, 27 April 2018

Genetic testing services have become enormously popular with people looking for long-lost relatives or clues to hereditary diseases. Most never imagined that one day intimate pieces of their DNA could be mined to assist police detectives in criminal cases.

Even as scientific experts applauded this week’s arrest of the Golden State Killer suspect, Joseph James DeAngelo, 72, some expressed unease on Friday at reports that detectives in California had used a public genealogy database to identify him. Privacy and ethical issues glossed over in the public’s rush to embrace DNA databases are now glaringly apparent, they said.

‘This is really tough’, said Malia Fullerton, an ethicist at the University of Washington who studies DNA forensics. ‘He was a horrible man and it is good that he was identified, but does the end justify the means?’

Coming so quickly on the heels of the Cambridge Analytica scandal, in which Facebook data on more than 70 million users was shared without their permission, it is beginning to dawn on consumers that even their most intimate digital data — their genetic profiles — may be passed around in ways they never intended.

‘There is a whole generation that says, “I don’t really care about privacy”’, said Peter Neufeld, a co-founder of The Innocence Project, which uses DNA to exonerate people who were wrongly convicted. ‘And then they do, once there is a Cambridge Analytica. No one has thought about what are the possible consequences.’

Read the full article in the New York Times.


.

Islamic State assassin:
How I killed more than 100 people

BBC News, 4 May 2018

Khaled did not simply wake up in Raqqa to the smell of death and dust, and decide to become an assassin. He was sent a special invitation.

Six men were ordered to report to an airfield in Aleppo, in north-western Syria, where a French trainer would teach them to kill with pistols, silenced weapons, and sniper rifles. They learned to murder methodically, taking prisoners as their victims.

‘Our practice targets were detained soldiers from the regime’, he says. ‘They put them in a difficult place so you need a sniper to hit them. Or they send out a group of detainees and ask you to target one without hitting the others. Most of the time assassinations are done from a motorbike. You need another person to ride the bike and you sit behind him. You ride next to the target’s car – then you shoot him and he cannot escape.’

Khaled – not his real name – learned how to follow people. How to ‘buy’ targets he could not reach through those close to them. How to distract a convoy of cars, so a fellow assassin can pick off their mark.

It was a bloody, inhuman education. But in mid-2013, soon after the Syrian army retreated from Raqqa, it suited the leaders of Ahrar al-Sham – a hardline Islamist group striving to rule the northern city and eliminate its rivals.

Read the full article on BBC News.


.

Review of ‘Barbed-Wire Imperialism:
Britain’s Empire of Camps, 1876-1903’ by Aidan Forth

Mahon Murphy, LSE Review of Books, 1 May 2018

Histories of the development of concentration camps usually take their starting point as colonial military crises, with the Spanish use of ‘reconcentrados’ in Cuba in the 1890s being the most cited origin. While the term ‘concentration camp’ was coined during this conflict, the camps themselves did not correspond to what we might think of when we imagine them today. In Barbed-Wire Imperialism: Britain’s Empire of Camps, 1876-1903, Aidan Forth presents a broad history of the concentration camp during the late nineteenth century that maps their origin not in military conflict, but rather through their development as part of the British empire in accordance to Victorian ideals concerning the preservation of physical and moral health. At the same time, Forth addresses imperial security concerns raised by destitute and displaced populations who were considered socially, racially or politically suspect. The horrors of modern warfare developed in tandem with the birth of humanitarianism, and this is epitomised in the space of the camp.

Forth’s book intertwines the history of the concentration camps in both Boer Wars with experiences of famine and plague across the British empire to demonstrate that the connections between imperial practices and developments in Western culture rendered camps conceivable and feasible technologies in diverse but related circumstances. The same forces that created prisons and workhouses in Victorian Britain also fed into the creation of the colonial concentration camp. Forth traces the evolution of internment practices from their origins in the displacement of peasant farmers as industrial capitalism took hold in Britain, their spread to the colonial sphere and their modern global legacies. The scale of internment is shocking: in the final decades of the nineteenth century, Britain interned more than ten million men, women and children in camps during a series of colonial, military, medical and subsistence crises. These people were interned ‘for their own good’ and in the name of relief and humanity. Yet, camps also responded to metaphors of social danger and contagion, which dehumanised those who were detained.

Camps addressed the central question of imperial rule: how does a small contingent of Europeans occupy and survey vast landscapes and effectively manage the populations in these areas? Forth demonstrates the connection between camps and the ‘science of relief’. The practice of colonial famine policy had its origins in the Irish Famine of the late 1840s, which framed the agendas of the subsequent famines in India. While India did not have the permanent system of workhouses that operated in Ireland, new techniques developed that mirrored the empire’s attitudes towards colonial poverty. Famine camps emerged to distribute food, relief and, most importantly, discipline to India’s poor, performing much the same functions as their equivalents in the penal infrastructure in Britain and Ireland. In line with hardening attitudes towards race and poverty, the image of those affected by famine shifted from one of charity to suspicion. Famine wanderers became ‘able-bodied parasites’ and were seen as causing a law-and-order problem. For the imperial planners, the best method to deal with them was containment.

Read the full article in the LSE Review of Books.


.

The truth about Hans Asperger’s Nazi collusion
Simon Baron-Cohen, Nature, 8 May 2018

In digging anew into the deeper historical context of Asperger’s work, Sheffer fills in parts of the story anticipated in John Donvan and Caren Zucker’s history of autism, In a Different Key, which referred to Czech’s early findings. Sheffer reveals how the Nazi aim of engineering a society they deemed ‘pure’, by killing people they saw as unworthy of life, led directly to the Holocaust.

With insight and careful historical research, Sheffer uncovers how, under Hitler’s regime, psychiatry — previously based on compassion and empathy — became part of an effort to classify the population of Germany, Austria and beyond as ‘genetically’ fit or unfit. In the context of the ‘euthanasia’ killing programmes, psychiatrists and other physicians had to determine who would live and who would be murdered. It is in this context that diagnostic labels such as ‘autistic psychopathy’ (coined by Asperger) were created.

Sheffer lays out the evidence, from sources such as medical records and referral letters, showing that Asperger was complicit in this Nazi killing machine. He protected children he deemed intelligent. But he also referred several children to Vienna’s Am Spiegelgrund clinic, which he undoubtedly knew was a centre of ‘child euthanasia’, part of what was later called Aktion T4.

This was where the children whom Nazi practitioners labelled ‘genetically inferior’ were murdered, because they were seen as incapable of social conformity, or had physical or psychological conditions judged undesirable. Some were starved, others given lethal injections. Their deaths were recorded as due to factors such as pneumonia.

Sheffer argues that Asperger supported the Nazi goal of eliminating children who could not fit in with the Volk: the fascist ideal of a homogeneous Aryan people.

Read the full article in Nature.


.

Vermeer, The Astronomer

Radicalising the Enlightenment
Jonathan Israel, spiked review, May 2018

spiked review: While you note the international nature of the Enlightenment, why does the Dutch Republic, in particular its urban, commercial centres such as Amsterdam, play such a key role in the early Enlightenment? After all, Descartes himself heads there in the 1620s, and Cartesianism really takes off first in the republic. And later, of course, there’s Spinoza himself. What was it about the 17th-century Dutch states that was so conducive to the development and explosion of Enlightenment thought?

Jonathan Israel: Admittedly, it is no part of the traditional historiography of the Enlightenment to assign any particularly prominent place to the Netherlands in explaining the Western Enlightenment’s origins. But when one considers that the Dutch Republic was a unique society in the early modern Western world in several key respects one might expect to be closely connected with the Enlightenment’s origins, not least being the society with the largest and freest publishing industry in the 17th century, the conclusion that it was soon turns out to be an eminently logical one.

Firstly, toleration, freedom of expression and a willingness to consider different viewpoints were indisputably central features of the Enlightenment. But before the Glorious Revolution in Britain and parliament passing the Toleration Act (1689), every contemporary commentator agreed the Dutch Republic offered the most extensive and defined toleration available in Europe as a matter of policy, and the freest in allowing different religious (and irreligious) viewpoints to be published. It also had, as a matter of fact, the widest range of organised, established and recognised religious creeds, which some observers liked to compare and contrast – Calvinist, Lutheran, Catholic, Mennonite, Socinian, Collegiant, Jewish and Remonstrant – to be found in any European country, definitely eclipsing Britain in this respect at that time. Likewise, it offered an exceptionally broad freedom to philosophise. It was hardly an accident that Descartes and later Pierre Bayle chose to reside in the Netherlands and that John Locke did so, too, for several years while he was under a shadow in England, prior to 1688.

Secondly, because the Dutch Republic had the largest ‘carrying’ merchant fleet before the 18th century, and the largest volume of trade with Asia and Africa, familiarity with distant parts of the globe and the taste for collecting ‘rarities’, artefacts, manuscripts, coins, art objects, exotic plants and antiquities from distant parts provided a stronger, more obvious base than could be found in France, Italy or Germany, for example, for establishing private collections and museums functioning as a spur to early efforts in ethnography, anthropology, botany, geology and other social and exact sciences.

Thirdly, the hierarchical character of European society, placing nearly all higher positions in the hands of the aristocracy and courtiers, or else clergy, generally lent a socially enclosed, rather narrow character to political and social debate, as did the primacy of the crown and court. Being a republic with a relatively free press, that afforded more access to office-holding, diplomatic roles, and political life for non-nobles than other Western countries offered before 1789, tended to allow more scope than one found elsewhere for different points of view and for wide-ranging social and political criticism, including of monarchy, aristocracy and ecclesiastical authority.

Read the full article in spiked review.


.

Is this your image of the working class?
You need to update it

Tamara Draut, Guardian, 9 May 2018

As the manufacturing footprint in the working class has shrunk, so has the white male archetype that has historically defined the working class. Today’s working class is more female and racially diverse – with whites comprising less than 60% of the working class, down from nearly 9 out of 10 in 1970. Similarly, two-thirds of working-class women are in the paid labor market, up from less than half in 1970.

Put simply, the working class shifted from “making stuff” to “serving and caring for people” – a change that carried significant sociological baggage. The long-standing “others” in our society – women and people of color – became a much larger share of the non-college-educated workforce. And their marginalized status in our society carried over into the working class, facilitating the invisibility and devaluing of their work.

No longer shuttered away in a factory, today’s working class is interwoven into nearly every aspect of our lives. It’s the black woman in a caretaker’s smock wearing special comfort shoes and a name tag above her heart. It’s the white man in a uniform (which he had to pay for) who punches in each day and restocks the shelves of your favorite big-box giant. It’s the Latina home healthcare aide who cares for your mom, the janitor who empties your office wastebasket, the woman who rings up your groceries, and the crew who fix the bumpy freeway you take every day to work.

Yet despite how interwoven this new working class is in our lives, we don’t really know or hear much about it. Its members’ concerns don’t shape the national agenda or top the headlines in major newspapers. Their stories aren’t featured in sitcoms, dramas, or movies. Roseanne got a 21st century reboot, but no such luck for Good Times or Sanford and Son or Alice.

Read the full article in the Guardian.


.

The IQ trap: how the study of genetics
could transform education

Philip Ball, New Statesman, 16 April 2018

It’s sometimes said that the whole notion that intelligence has a genetic component is anathema to the liberals and left-wingers who dominate education. Young reliably depicts the extreme version here, saying ‘liberal educationalists… reject the idea that intelligence has a genetic basis [and] prefer to think of man as a tabula rasa, forged by society rather than nature’. He’s not alone, though. The psychologist Jill Boucher of City, University of London has lambasted what she calls ‘the unthinkingly self-righteous, hypocritical and ultimately damaging political correctness of those who deny that genetic inheritance contributes to academic achievement and hence social status’. Teach First’s suppression of Young’s article contributed to that impression: it was a clumsy and poorly motivated move. (The organisation has since apologised to Young.)

Despite this rhetoric, however, you’d be hard pushed to find a teacher who would question that children arrive at school with differing intrinsic aptitudes and abilities. Some kids pick things up in a flash, others struggle with the basics. This doesn’t mean it’s all in their genes: no one researching genes and intelligence denies that a child’s environment can play a big role in educational attainment. Of course kids with supportive, stimulating families and motivated peers have an advantage, while in some extreme cases the effects of trauma or malnutrition can compromise brain development. But the idea of the child as tabula rasa seems to be something of a straw man.

That’s backed up by a 2005 study by psychologist Robert Plomin of King’s College London, one of the leading experts on the genetic basis of intelligence, and his colleague Sheila Walker. They surveyed almost 2,000 primary school teachers and parents about their perceptions of genetic influence on a number of traits, including intelligence, and found that on the whole, both teachers and parents rated genetics as being just as important as the environment. This was despite the fact that 80 per cent of the teachers said there was no mention of genetics in their training. Plomin and Walker concluded that educators do seem to accept that genes influence intelligence.

Kathryn Asbury supports that view. When her PhD student Madeline Crosswaite investigated teachers’ beliefs about intelligence, Asbury says she found that ‘teachers, on average, believe that genetic factors are at least as important as environmental factors’ and say they are ‘open to a role for genetic information in education one day, and that they would like to know more’.

Why, then, has there been this insistence from conservative commentators that liberal educationalists are in denial? It’s just one reflection of how the whole discussion has become highly politicised as left versus right, political correctness versus realism. There’s more of that to come.

Read the full article in the New Statesman.


.

The very first animal appeared
amid an explosion of DNA

Carl Zimmer, New York Times, 4 May 2018

The animal kingdom is one of life’s great success stories — a collection of millions of species that swim, burrow, run and fly across the planet. All that diversity, from ladybugs to killer whales, evolved from a common ancestor that likely lived over 650 million years ago.

No one has found a fossil of the ur-animal, so we can’t say for sure what it looked like. But two scientists in Britain have done the next best thing. They’ve reconstructed its genome.

Their study, published in Nature Communications, offers an important clue to how the animal kingdom arose: with an evolutionary burst of new genes. These may have played a crucial part in transforming our single-celled ancestors into creatures with complex bodies made of many kinds of cells.

The new genes also proved to be remarkably durable. Of all the genes in the human genome, 55 percent were already present in the first animal. ‘The big surprise was how many of them there were’, said Jordi Paps, an evolutionary biologist at the University of Essex and co-author of the new study.

Read the full article in the New York Times.


.

Ex Machina

The humanist left must challenge
the rise of cyborg socialism

Jon Cruddas, New Statesman, 23 April 2018

In a previous era, in one of the great essays of the English left, Edward Thompson took aim at Louis Althusser and structuralism; he wrote: ‘Enchanted minds move through humourless, visionary fields, negotiate imaginary obstacles, slay mythical monsters (‘humanism’, ‘moralism’) perform tribal rites with the rehearsal of approved texts.’

Today, the fashionable left seeks to surrender humanism. What previous generations fought for and defended – from William Morris and George Lansbury to Thompson, Raymond Williams and the Independent Labour Party – is to be replaced with a decentred, plastic tech utopia.

Historically, humanist Marxists and ethical socialists retained a notion of human nature; without this, it was deemed impossible to establish an agenda for durable economic and social change. The left rejected determinism so that the human being could be reinserted back into history and the means by which lives are commodified could be resisted, rather than accelerated. This was considered the very essence of politics.

The three elements of this modern hybrid chronocentric left – its deterministic embrace of technology and abolition of the working class; its attachment to a specific vision of the cosmos and rejection of the nation state as a politics of land and territory; and its incipient transhumanism – refract into a political worldview and manifesto which is a world away from the everyday experiences of the people. In this new world, apart from a certain chronocentric group of mainly young men, everything else is presented as reactionary and parochial.

For the left, it appears a shift away from concerns regarding social justice and institution-building, towards a narcissistic concern with self and identity. This is the interface with modern identity liberalism – everything is fluid, change is immanent, we are individually all in transition. It also shares an almost fanatical approach to questions of progress and a disdain for history and tradition, or what Chesterton once called the ‘democracy of the dead’. Maybe the left should noisily discuss the quiet rise of cyborg socialism.

Read the full article in the New Statesman.


.

The ethics of experimenting with human brain tissue
Nita Farahany, et al, Nature, 25 April 2018

If researchers could create brain tissue in the laboratory that might appear to have conscious experiences or subjective phenomenal states, would that tissue deserve any of the protections routinely given to human or animal research subjects?

This question might seem outlandish. Certainly, today’s experimental models are far from having such capabilities. But various models are now being developed to better understand the human brain, including miniaturized, simplified versions of brain tissue grown in a dish from stem cells — brain organoids. And advances keep being made.

These models could provide a much more accurate representation of normal and abnormal human brain function and development than animal models can (although animal models will remain useful for many goals). In fact, the promise of brain surrogates is such that abandoning them seems itself unethical, given the vast amount of human suffering caused by neurological and psychiatric disorders, and given that most therapies for these diseases developed in animal models fail to work in people. Yet the closer the proxy gets to a functioning human brain, the more ethically problematic it becomes.

There is now a need for clear guidelines for research, albeit ones that can be adapted to new discoveries. This is the conclusion of many neuroscientists, stem-cell biologists, ethicists and philosophers — ourselves included — who gathered in the past year to explore the ethical dilemmas raised by brain organoids and related neuroscience tools.

Read the full article in Nature.


.

‘Haifa is essentially segregated’:
cracks appear in Israel’s capital of coexistence

Ian Black, Guardian, 19 April 2018

Education provides important insights. In Haifa, as elsewhere, Jewish and Arab children mostly attend separate schools. Many Arab children (the majority are Christians), study in fee-paying church schools, and a few dozen in Jewish ones. There are no Jews in Arab public schools, where standards are poor. The curricula are different too. ‘People want to stay within their own communities to speak in their native languages, have days off on their own holidays, and learn about their own history, culture and religion’, says Asaf Ron. ‘Assimilation through attending the other community’s schools is a free choice that almost no one chooses.’

The Yad beyad (‘Hand in Hand’) network of bilingual schools complains about long waiting lists and a struggle to secure municipal support. In its kindergarten in Hadar, Arab and Jewish six-year-olds sing songs and are captivated by nursery rhymes that interchange Hebrew and Arabic – a heartwarming but highly unusual sight. ‘The whole country is based on separation in a very profound way’, says Merav Ben-Nun, its community organiser.

Higher education is a different story. Haifa University is 40% Arab, and the Technion, the Israeli Institute of Technology, 23%, though Arab graduates are unlikely to find jobs in security-related industries. Arab students are younger than Jewish ones, who mostly spend up to three years from the age of 18 doing the compulsory military service from which the vast majority of Arabs are exempt. ‘Arab and Jewish students sit in the same classes but barely speak to each other’, notes Golani. National holidays – Holocaust Day, Memorial Day and Independence Day – feel especially awkward on campus…

The closure of Haifa’s Arab theatre, al-Midan (‘The Square’), is cited as an example: state funding was withdrawn after it staged a play about a Palestinian security prisoner. The defiant response was to create an autonomous crowdfunded alternative – al-Khashabi (‘The Stage’). Its Arabic-language performances are translated into English, but conspicuously not into Hebrew. ‘Independent Palestinian institutions do not believe in coexistence’, explains Al-Khashabi’s director, Bashar Murkus. ‘We believe in dialogue from a position of strength and independence.’ His colleague Khoulood Tannous flatly refuses even to use the ‘c’ word. ‘No one is shelling us here’, she adds. ‘It’s no Gaza, nor the West Bank. It’s mind games.’

Politician Ayman Odeh’s disapproving view is that influence should matter more than identity to Israel’s Palestinian minority, in Haifa and beyond, and that joint struggle is the key to a more equal future. ‘Arabs are developing autonomy at the expense of Arab-Jewish cooperation’, warns Sikkuy’s Shbita. Neither side harbours illusions about the other. ‘In Haifa it’s not hate, but there’s not too much love either’, is the stark conclusion of Omer Shaffer, a Jewish Technion postgraduate who was both moved and surprised when an Arab colleague told him to ‘take care’ when he went off to do a stint of reserve army duty at a checkpoint in the West Bank. ‘It’s pretty indifferent. We’ve found a way to ignore each other without killing each other.’

Read the full article in the Guardian.


.

Cell by cell, scientists map the genetic steps
as eggs become animals

Jordana Cepelewicz, Quanta Magazine, 26 April 2018

But three papers appearing today in Science are changing that, as they unveil work with major significance for the field of developmental biology. Using a combination of gene sequencing and mathematical methods, the researchers traced the patterns of gene expression in every cell in embryos of zebra fish and of Western clawed frogs through many stages of development during their first 24 hours.

The results revealed, at a previously impossible resolution and scale, the genetic and developmental trajectories that embryonic cells follow to their eventual fates in fully differentiated tissues. Surprising new insights emerged as well: Many biologists, for example, believed that embryonic cells always followed branching paths toward maturity that committed them irrevocably to certain fates. But the new data indicates that cells can, in effect, sometimes ‘loop back’ to follow a different path, and that cells with different developmental histories can sometimes end up as the same type of cell.

The powerful techniques used in these reports, according to experts in the field, mark a new frontier in the ability to study development, cell fates and disease. ‘Whatever tissue you’re interested in studying, there’s something in this data set that should be of interest to you’, said Berthold Göttgens, a molecular biologist at the University of Cambridge who did not participate in the research but has been doing similar work in mouse embryos. Just as the rise of genome sequencing studies put biology on a different footing, he said, ‘this kind of foundational data will stand the test of time. It’ll be a landmark people will go back to.’

‘There’s a whole universe of possibilities that data like this opens up’, said Alexander Schier, a cell biologist at Harvard University and an author on one of the studies. ‘Before, when we could only work with a few genes, or a few cells, or a few developmental stages, it was like we were seeing two or three stars. Now we can suddenly see an entire galaxy.’

Read the full article in Quanta Magazine.


.

National Memorial for Peace and Justice

A lynching memorial is opening.
The country has never seen anything like it.

Campbell Robertson, New York Times, 25 April 2018

The National Memorial for Peace and Justice, which opens Thursday on a six-acre site overlooking the Alabama State Capitol, is dedicated to the victims of American white supremacy. And it demands a reckoning with one of the nation’s least recognized atrocities: the lynching of thousands of black people in a decades-long campaign of racist terror.

At the center is a grim cloister, a walkway with 800 weathered steel columns, all hanging from a roof. Etched on each column is the name of an American county and the people who were lynched there, most listed by name, many simply as ‘unknown’. The columns meet you first at eye level, like the headstones that lynching victims were rarely given. But as you walk, the floor steadily descends; by the end, the columns are all dangling above, leaving you in the position of the callous spectators in old photographs of public lynchings.

The magnitude of the killing is harrowing, all the more so when paired with the circumstances of individual lynchings, some described in brief summaries along the walk: Parks Banks, lynched in Mississippi in 1922 for carrying a photograph of a white woman; Caleb Gadly, hanged in Kentucky in 1894 for ‘walking behind the wife of his white employer’; Mary Turner, who after denouncing her husband’s lynching by a rampaging white mob, was hung upside down, burned and then sliced open so that her unborn child fell to the ground.

There is nothing like it in the country. Which is the point.

‘Just seeing the names of all these people’, said Bryan Stevenson, the founder of the Equal Justice Initiative, the nonprofit organization behind the memorial. Many of them, he said, ‘have never been named in public’.

Read the full article in the New York Times.


.

Out of the armchair
Stephanie Wykstra, Aeon, 1 May 2018

Is experimental philosophy really philosophy? Knobe and some of his colleagues argue that it is. They describe the work as continuous with a long tradition of philosophers trying to understand the human mind, and point to the likes of Aristotle, David Hume and Friedrich Nietzsche as precedents. In their manifesto, Knobe and Nichols write:

It used to be a commonplace that the discipline of philosophy was deeply concerned with questions about the human condition. Philosophers thought about human beings and how their minds worked … On this traditional conception, it wasn’t particularly important to keep philosophy clearly distinct from psychology, history or political science … The new movement of experimental philosophy seeks to return to this traditional vision.

Some philosophers, even those who identify as part of the x-phi movement, disagree with this viewpoint. Machery, a fellow x-phi advocate, argues that even if, historically, philosophers used to engage in a huge range of intellectual endeavours, it doesn’t mean that studying all those things should now count as philosophy. There’s something lost, Machery thinks, if experimental philosophers start to resemble cognitive scientists more and more, and lose their focus on what has been of central interest in philosophy: analysing concepts. (According to Knobe’s recent analysis, only around 10 per cent of x-phi experiments over a period of five years were directly about conceptual analysis, as opposed to revealing new cognitive effects and discussing potential cognitive processes underlying them.) Machery concurs with Stich and other ‘negative programmers’ that trying to analyse concepts from the armchair is a poor method, because of the experimental evidence that judgments vary by demographic group. Instead, he argues in his book Philosophy Within Its Proper Bounds (2017), philosophers should make use of experiments as a way of clarifying and assessing important philosophical ideas.

A second kind of response comes from those who question the usefulness of eliciting intuitions from people outside of philosophy. For example, in his book Relativism and the Foundations of Philosophy (2009), Stephen Hales writes: ‘[I]ntuitions of professional philosophers are much more reliable than either those of inexperienced students or the “folk”.’ This response, dubbed the ‘expertise defence’, is generally made in response to the ‘negative programme’ in x-phi. The philosopher Jennifer Nado characterises the defence as insisting that experimental philosophy’s reliance on the conflicting intuitions of non-philosophers is ‘fundamentally misguided’, since ‘the intuitions of such persons are irrelevant’. There’s often an analogy drawn to other fields: we wouldn’t take the conflicting opinions of non-experts as a challenge to most scientific and mathematical claims. On the other hand, some philosophers – responding to the expertise defence – have questioned that analogy by asking what ‘philosophical expertise’ amounts to, and how we can tell that professional philosophers have it. (In some cases, philosophers have even run experiments on fellow philosophers, claiming that they are susceptible to various kinds of bias in their intuitions.)

A third response to the negative programme in x-phi has been to look more closely at ‘intuitions’ themselves. The British philosopher Timothy Williamson argues that those who attack traditional philosophy should define exactly what they mean by ‘intuition’. If an ‘intuition’ is just ‘how things seem to us’, he argues, then the critique of intuition leads to ‘global skepticism’, the position that we should withhold all kinds of judgments until they are proven to be widely shared. (This extreme conclusion is one that, Williamson takes it, x-phi practitioners would prefer to avoid.) And taking another tack, some philosophers, such as Herman Cappelen in his book Philosophy Without Intuitions (2012), claim that traditional philosophy doesn’t actually rely on intuitions at all (even though many traditional philosophers think that it does).

Read the full article in Aeon.


.

The last slave
Zora Neale Hurston, Vulture, 29 April 2018

Six years earlier, Hurston had tried to publish another book in dialect, this one a work of nonfiction called Barracoon. Before she turned to writing novels, she’d trained as a cultural anthropologist at Barnard under the famed father of the field, Franz Boas. He sent his student back south to interview people of African descent. (Hurston was raised in Eatonville, Florida, which wasn’t the ‘black backside’ of a white town, she once observed, but a place wholly inhabited and run by black people — her father was a three-term mayor.) She proved adept at the task, but, as she noted in her collection of folklore, Mules and Men, the job wasn’t always straightforward: ‘The best source is where there are the least outside influences and these people, usually underprivileged, are the shyest. They are most reluctant at times to reveal that which the soul lives by. And the Negro, in spite of his open-faced laughter, his seeming acquiescence, is particularly evasive … The Negro offers a feather-bed resistance, that is, we let the probe enter, but it never comes out.’

Barracoon is testament to her patient fieldwork. The book is based on three months of periodic interviews with a man named Cudjo Lewis — or Kossula, his original name — the last survivor of the last slave ship to land on American shores. Plying him with peaches and Virginia hams, watermelon and Bee Brand insect powder, Hurston drew out his story. Kossula had been captured at age 19 in an area now known as the country Benin by warriors from the neighboring Dahomian tribe, then marched to a stockade, or barracoon, on the West African coast. There, he and some 120 others were purchased and herded onto the Clotilda, captained by William Foster and commissioned by three Alabama brothers to make the 1860 voyage.

After surviving the Middle Passage, the captives were smuggled into Mobile under cover of darkness. By this time, the international slave trade had been illegal in the United States for 50 years, and the venture was rumored to have been inspired when one of the brothers, Timothy Meaher, bet he could pull it off without being ‘hanged’. (Indeed, no one was ever punished.) Cudjo worked as a slave on the docks of the Alabama River before being freed in 1865 and living for another 70 years: through Reconstruction, the resurgent oppression of Jim Crow rule, the beginning of the Depression.

When Hurston tried to get Barracoon published in 1931, she couldn’t find a taker. There was concern among ‘black intellectuals and political leaders’ that the book laid uncomfortably bare Africans’ involvement in the slave trade, according to novelist Alice Walker’s foreword to the book, which is finally being published in May. Walker is responsible for reintroducing the world to a forgotten Zora Neale Hurston, who’d died penniless and alone in 1960, in a 1975 Ms. magazine essay. As Walker writes, ‘Who would want to know, via a blow-by-blow account, how African chiefs deliberately set out to capture Africans from neighboring tribes, to provoke wars of conquest in order to capture for the slave trade. This is, make no mistake, a harrowing read.’

Read the full article in Vulture.


.

Did Einstein really say that?
Andrew Robinson, Nature, 30 April 2018

‘There appears to be a bottomless pit of quotable gems to be mined from Einstein’s enormous archives’, notes Alice Calaprice, editor of The Ultimate Quotable Einstein (2011); one detects a hint of despair. Indeed, Einstein might be the most quoted scientist in history. The website Wikiquote has many more entries for him than for Aristotle, Galileo Galilei, Isaac Newton, Charles Darwin or Stephen Hawking, and even than Einstein’s opinionated contemporaries Winston Churchill and George Bernard Shaw. But how much of this superabundance actually emanated from the physicist?…

Among the hundreds of quotes that Calaprice notes are misattributed to Einstein are many that are subtly debatable. Some are edited or paraphrased to sharpen or neaten the original. ‘Everything should be made as simple as possible, but no simpler’ might, says Calaprice, be a compressed version of lines from a 1933 lecture by Einstein: ‘It can scarcely be denied that the supreme goal of all theory is to make the irreducible basic elements as simple and as few as possible without having to surrender the adequate representation of a single datum of experience.’ More certain is the provenance of ‘The most incomprehensible thing about the Universe is that it is comprehensible’. That rewords a passage in a 1936 article in the Journal of the Franklin Institute: ‘The eternal mystery of the world is its comprehensibility … The fact that it is comprehensible is a miracle.’

Even ‘God does not play dice’, arguably Einstein’s most famous quote, isn’t quite his words. It derives from a letter written in German in December 1926 to his friend and sparring partner, theoretical physicist Max Born. It is published in the new volume of Einstein’s papers, in which the editors comment on its ‘varying translations’ since the 1920s. Theirs is: ‘Quantum mechanics … delivers much, but does not really bring us any closer to the secret of the Old One. I, at any rate, am convinced that He does not play dice.’ Einstein does not use the word ‘God’ (Gott) here, but ‘the Old One’ (Der Alte). This signifies a ‘personification of nature’, notes physicist and Nobel laureate Leon Lederman (author of The God Particle, 1993).

Read the full article in Nature.

.


The images are, from top down: Achilles and Ajax playing dice, from an amphora at the Vatican Museum; Human chromosomes; Johannes Vermeer’s ‘The Astronomer’; scene from ‘Ex Machina’;  Interior of the National Memorial for Peace and Justice (photo: Audra Melton for The New York Times).
