The latest (somewhat random) collection of recent essays and stories from around the web that have caught my eye and are worth plucking out to be re-read.
Truth? It’s not just about the facts
Julian Baggini, Times Literary Supplement, 21 September 2017
Truth is rarely, if ever, a simple matter of getting the facts straight. History, for example, certainly demands factual accuracy but that in itself is not enough. There is also the question of which facts are made salient and how they are understood. There is no factual disagreement, for example, about the European colonization of Australia between those who would like to see Captain Cook’s statue taken down and those who wouldn’t. The difference concerns which features of that history are given centre stage and whether they are celebrated, lamented or both. When people complain that official histories are untruthful, they are rarely claiming that brazen lies are being told. Rather they insist that important truths are being ignored or overlooked.
People’s interest in the truth is often a concern not with facts but with their meanings. The truth in a portrait, for example, is not necessarily a matter of realistic fidelity. It is rather about capturing something in the sitter that a more physically accurate picture or photograph could miss. This idea is captured in Picasso’s famous aphorism ‘Art is a lie that makes us realize truth’. This kind of truth is often explicitly contrasted with the factual variety. ‘There is a distinction between fact and truth’, claimed Lucian Freud. ‘Truth has an element of revelation about it. If something is true, it does more than strike one as merely being so.’ Freud’s definitions may not match those of philosophers, but his point is clear enough. The kind of truth that concerns him is that which reveals the hidden meaning of things, not facts one could look up in a reference book.
Once we prise open a distinction between truth-as-meaning and truth-as-fact, all sorts of ‘truths’ become possible beyond what can be established by reason and evidence alone. Contemporary religion has been good at exploiting this opportunity. In response to the charge that science has made religion redundant at best, demonstrably false at worst, many believers have retorted that religion is concerned with a different kind of truth from that of science and so cannot be falsified by it. Most famously, Stephen Jay Gould argued that while ‘Science tries to document the factual character of the natural world’, religion ‘operates in the equally important, but utterly different, realm of human purposes, meanings, and values’.
Many find this idea of ‘Non-Overlapping Magisteria’ of human inquiry attractive but keeping them apart is easier said than done. The religious tend to end up concerned with facts about the world as well as values. With Christians, I find this is usually made clear by the ‘empty tomb test’. When an articulate, theologically sophisticated believer starts expressing some version of the two magisteria view, one can ask: is it important for your faith that Christ’s tomb was found empty, and not because someone had sneaked his body out? It’s a rare Christian who says this doesn’t matter at all. Central to the faith of most is a supposed fact about a historical event, the everyday kind of truth which we are all concerned with.
Read the full article in the Times Literary Supplement.
Idylls of the liberal: The American dreams of Mark Lilla and Ta-Nehisi Coates
Asad Haider, Viewpoint, 11 September 2017
Ta-Nehisi Coates, however, illustrates the debilitating limits of what “identity politics” has now come to represent, something far from the radical and coalitional practice of the Combahee River Collective: a moralizing discourse which monopolizes the discussion of race, yet fails to propose either a coherent theory of racial oppression or a viable program for eliminating it. Coates deploys his considerable erudition and rhetorical flourish in service of sheer obfuscation – the story of whiteness as magic and Trump as sorcerer. Despite the gingerly placed historical references, in Coates’s telling whiteness has no history. It is a malevolent force which surges from the netherworld in moments which can only be identified by the intensity of Coates’s own feelings – the American Dream become Coates’s personal nightmare.
Coates goes as far as to make the extraordinary claim that before Trump, whiteness lay dormant—when in fact our very first president owned slaves while in office, the first of eight to do so (four more were slaveowners while not in office). That Coates goes on to be disingenuous should not surprise us. If whiteness is magic, it has no real historical specificity, no clearly identifiable social effects, no limits on its scope of action, and no structure which can be dismantled. In Coates’s legends, there is no point in resistance to evil. There are no moments in which whiteness is opposed. The noble heroes are all found wanting, perhaps to leave room for one of Coates’s preference in the future.
Yet we know, from reading the same history books as Coates, that there was resistance. And indeed, the complex evolution of white supremacy is marked by this resistance—first and foremost the resistance of slaves, who from Saint-Domingue to Virginia refused to accept the notion that one person could be the property of another.
Lilla also erases this history of resistance when he relies on a sentimental notion of ‘citizenship’, completely eliding the historical complexity of this category and its relationship to the class differentiation that accompanied the formation of modern nation-states. In fact, Lilla’s notion of citizenship is targeted specifically against socialists and Bernie Sanders-style progressives who would base politics on class. ‘Citizenship is not an identity in the way we currently use the term, but it provides one possible way of encouraging people to identify with one another’, Lilla writes. ‘There is good reason why progressives should stop framing their calls for economic justice in terms of class and start appealing instead to our shared citizenship.’ (Astoundingly, he adds in a footnote that if progressives ‘want to become a major force in American politics again’ they should ignore ‘the latest books from Verso Press’ and instead read Teddy Roosevelt.)
Read the full article in Viewpoint.
How the modern addiction to identity politics has fractured the left
Mark Lilla, New Statesman, 18 September 2017
Imagine a young student entering such an environment today – not your average student pursuing a career, but a recognisable campus type drawn to political questions. She is at the age when the quest for meaning begins and in a place where her curiosity could be directed outward towards the larger world she will have to find a place in. Instead, she is encouraged to plumb mainly herself, which seems an easier exercise. She will first be taught that understanding herself depends on exploring the different aspects of her identity, something she now discovers she has. An identity that, she also learns, has already been largely shaped for her by various social and political forces. This is an important lesson, from which she is likely to draw the conclusion that the aim of education is not progressively to become a self – the task of a lifetime, Kierkegaard thought – through engagement with the wider world. Rather, one engages with the world and particularly politics for the limited aim of understanding and affirming what one already is.
And so she begins. She takes classes in which she reads histories of the movements related to whatever she determines her identity to be, and reads authors who share that identity. (Given that this is also an age of sexual exploration, gender studies will hold a particular attraction.) In these courses she also discovers a surprising and heartening fact: that although she may come from a comfortable, middle-class background, her identity confers on her the status of one of history’s victims. This discovery may then inspire her to join a campus group that engages in movement work. The line between self-analysis and political action is now fully blurred. Her political interest will be genuine but circumscribed by the confines of her self-definition. Issues that penetrate those confines now take on looming importance and her position on them quickly becomes non-negotiable; those issues that don’t touch on her identity (economics, war and peace) are hardly perceived.
The more our student gets into the campus identity mindset, the more distrustful she becomes of the word ‘we’, a term her professors have told her is a universalist ruse used to cover up group differences and maintain the dominance of the privileged. And if she gets deeper into ‘identity theory’, she will even start to question the reality of the groups to which she thinks she belongs.
Read the full article in New Statesman.
Clickbait and impact: How academia has been hacked
Portia Roelofs & Max Gallien, LSE blogs, 19 September 2017
This article represents the culmination of broader trends in academia: from marketisation, to impact, to the promotion of artificially adversarial debate. Since the late 1990s, universities have been under pressure to operate more like businesses. Rather than existing in their own comfy bubble, politicians demanded that universities face the bracing winds of the market and earn their keep. Students became consumers, big companies increasingly set the agenda for publicly funded research, and academics were to be subject to the same accountability and incentives as, say, a call-centre worker. Academics have to publish. In order to rank articles against each other, the world of academia had to create a universal way of quantifying how good an article is: hence the citation index. Indexing platforms like Scopus, Web of Science and Google Scholar record how many other articles or books cite your article. The idea is that if a paper is good, it is worth talking about. The only thing is, citation rankings count positive and negative references equally.
But this style of quantifying how good an article is pales in comparison to what has been done under the ‘impact agenda’. Initially spurred by the desire for professors to reach out and engage with the world outside the ‘ivory tower’, impact came to be measured by blogs, page views, download stats, and tweets. Academia is replicating the structure of the mass media. Academic articles are now evaluated according to essentially the same metrics as Buzzfeed posts and Instagram selfies. In fact, the impact factor is an especially blunt example of online metrics: Reddit, YouTube, and Imgur at least allow users to up-vote or down-vote posts.
The result is to dilute the idea of impact to simple publicity. And as we all know, all publicity is good publicity. (It is worth noting that Gilley lists his ‘scholarly impact metrics’ on his CV above any of his publications.) And it’s deadly serious: how many likes your article gets is not simply a matter of vanity but is ingrained into the system of academic rewards and respect, whether applying for promotions, jobs, or research funding. If your job prospects depend on clicks, you’d be stupid not to write clickbait.
Read the full article on LSE Blogs.
A quick reminder of why colonialism was bad
Nathan J Robinson, Current Affairs, 14 September 2017
I suppose to those unfamiliar with the history, Gilley’s argument could appear superficially persuasive. But a moment’s examination of the record reveals why the case he makes is abhorrent. Gilley says he is simply asking for an unbiased assessment of the facts, that he just wants us to take off our ideological blinders and examine colonialism from an empirical perspective. But this is not what he has done. Instead, in his presentation of colonialism’s record, Gilley has deliberately excluded mention of every single atrocity committed by a colonial power. Instead of evaluating the colonial record empirically, he has distorted that record, concealing evidence of gross crimes against humanity. The result is not only unscholarly, but is morally tantamount to Holocaust denial.
First, Gilley says he is making a ‘case for colonialism,’ to rescue Western colonial history’s ‘bad name.’ But he restricts his examination to ‘the early nineteenth to mid-twentieth centuries.’ He does so because if he were to include the first 300 years of Western colonialism (i.e. the majority), it would be almost impossible to mount any kind of case that the endeavor benefited indigenous populations. The civilizations of the Americas were exterminated by colonialism, through disease, displacement, resource depletion, one-sided warfare, and outright massacre, and their populations suffered a ‘catastrophic collapse’. Since it is impossible to spin this as benefiting the inhabitants, Gilley avoids mentioning that it even happened. This, in itself, in an article defending ‘colonialism,’ should sufficiently prove that Gilley is unwilling to consider evidence that contradicts his case, by discussing ‘colonialism’ generally while selecting only the cases in which native populations were not extinguished.
Next, Gilley’s method of defending colonialism is through ‘cost-benefit analysis,’ in which the harms of colonialism are weighed against the ‘improvements in living conditions’ and better governance. (Gilley even proposes ‘greater business confidence’ as a potential benefit of a neo-colonial project.) He quotes his standard of measurement:
[I]n times and places where colonial rule had, on balance, a positive effect on training for self-government, material well-being, labor allocation choices, individual upward mobility, cross-cultural communication, and human dignity, compared to the situation that would likely have obtained absent European rule, then the case for colonialism is strong. Conversely, in times and places where the effects of foreign rule in these respects were, on balance, negative compared to a territory’s likely alternative past, then colonialism is morally indefensible.
We should observe here that this is a terrible way of evaluating colonialism. It is favored by colonialism’s apologists because it means that truly unspeakable harms can simply be ‘outweighed’ and thereby trivialized. We can see quickly how ludicrous this is: ‘Yes, we may have indiscriminately massacred 500 children, but we also opened a clinic that vaccinated enough children to save 501 lives, therefore ‘the case for colonialism is strong.’’ We don’t allow murderers to produce defenses like this, for good reason: you can’t get away with saying ‘Yes, I killed my wife, but I’m also a fireman.’ We must also be careful about using hypothetical counterfactuals: examining whether colonialism is ‘better than what would have happened in its absence.’ I’m reading Great Expectations at the moment, and so I’ll call this the ‘Pip’s sister defense’: Pip’s sister justifies her cruelty and physical abuse by constantly reminding Pip that if it were not for her, he would be in an even worse situation. It’s an argument frequently deployed by abusive and exploitative individuals in order to justify their acts. And the point is that whether or not it’s true is immaterial to the evaluation of the person’s crimes. Gilley and other colonial apologists, like the husband telling his wife that while she may not like being hit, she should remember who provides for her, try to exonerate colonial powers by suggesting that enough economic growth could somehow make a ‘strong case for colonialism’ even if there had been constant mass rape and torture. (By the way, I think even committed opponents of colonialism may sometimes fall into this trap. They may feel as if it is necessary to deny that colonialism ever brought any benefits—which, as Gilley points out, even Chinua Achebe doesn’t think. Instead, it’s important to point out that building power lines and opening a school doesn’t provide one with a license to rob and murder people. 
Furthermore, nobody should be surprised if performance on certain economic and political metrics did end up declining in the postcolonial era, since reconstructing a functioning country after decades or centuries of subjugation is… not easily done.)
Read the full article in Current Affairs.
Is retraction the new rebuttal?
Colleen Flaherty, Inside Higher Ed, 19 September 2017
This isn’t the first time scholars have called on a journal to retract a controversial article in recent months. In philosophy, division over calls for the journal Hypatia to retract a paper comparing transgenderism to transracialism led to the resignations of top editors and the suspension of the associate editorial board. More recently, the American Psychological Association’s Journal of Personality and Social Psychology re-reviewed, on ethical grounds, a previously accepted study on training a computer to recognize gay and straight faces.
In neither case was the article retracted (and in the latter case, it was mostly outside groups — not academics — that wanted the paper retracted). But are calls for retraction, not forceful rebuttals, becoming the new normal when it comes to disfavored research?
Justin Weinberg, an associate professor of philosophy at the University of South Carolina and editor of Daily Nous, a popular philosophy blog, recently wrote that he wasn’t an expert in Gilley’s case or field, but that ‘our default reaction to cases like this should not be ‘retract!’ but rather, “rebut!”’
As academics, he wrote, ‘we should try as much as possible to rely on the exchange of evidence and arguments, not (directly) on the numbers of people who agree with us, or the strength of their agreement.’ Supposing that Gilley’s article was peer reviewed but that arguments against it are largely correct, Weinberg asked, ‘How should those academics in a position to know these things respond? Is it by saying something tweetable that will convince lots of nonexperts to help them try to erase the article from history? That seems to be making use of inappropriate means towards an undesirable end. The history of academia is a history of mistakes — and learning from them. If Gilley’s article is full of mistakes, then the job of the experts is to point this out and help us learn from them, so people are less likely to make them again.’
Read the full article in Inside Higher Ed.
The forgotten victims of Agent Orange
Viet Thanh Nguyen & Richard Hughes, New York Times, 15 September 2017
Our government has acknowledged some of its responsibility to its veterans. In 2010, Secretary of Veterans Affairs Eric K. Shinseki added three Agent Orange-related diseases to the V.A.’s compensation list, and Congress allocated $13.3 billion to cover the costs. An enterprising Senate aide slipped in $12 million for Agent Orange relief in Vietnam, only a small portion of which was for health. These disparities in funding are unconscionable, as is the American government’s illogical refusal to acknowledge that Agent Orange has caused the same damage to the Vietnamese as it has to Americans.
Pham Van Truc is another Vietnamese victim of Agent Orange. With his crippled, birdlike limbs and patches of scaly skin, he had as his only blessing, it seemed, exceptionally devoted parents who cared for him, night and day, all 20 years of his life and who were devastated when he died in March. His mother, Nguyen Thi May, 66, had pleaded for a solution to just one of Mr. Truc’s afflictions, such as testicles that had not descended or the attendant pain unrelieved by ineffective medicines.
In cases like this, our government’s one concession to responsibility for the ravages of Agent Orange is environmental remediation. Over $100 million has been allocated to clean up the Da Nang airport, one of 28 ‘hot spots’ for defoliant contamination in Vietnam. By contrast, only $20 million has been allocated for victims.
The most common American bureaucratic excuse for this disparity is that a definitive connection between Agent Orange and the illnesses has not yet been made. But the evidence is overwhelming: Vietnamese soldiers, from both sides, with perfectly healthy children before going to fight, came home and sired offspring with deformities and horrific illnesses; villages repeatedly sprayed have exceptionally high birth-deformity rates; and our own Department of Veterans Affairs now lists 14 illnesses presumed to be related to Agent Orange.
Read the full article in the New York Times.
Europe’s ‘migrant hunters’
Jerome Tubiana, Foreign Affairs, 31 August 2017
‘To close Libya’s southern border is to close Europe’s southern border,’ Marco Minniti, Italy’s interior minister, said in April at a meeting in Rome with representatives of three cross-border Saharan tribes, the Tubu, Awlad Suleiman Arabs, and Tuareg. The leaders agreed to form a border force to stop migrants entering Libya from traveling to Europe, reportedly at the demand of, and under the prospect of money from, the Italian government. All three communities are interested in resolving the deadly conflicts that have beset the country since the fall of Colonel Muammar al-Qaddafi in 2011 and hope Italy will compensate them monetarily for their casualties (in tribal conflicts, a payment is needed to end a fight) as well as fund reconstruction and development of neglected southern Libya. Italy, of course, is keen on halting the flow of migrants reaching its shores and sees these Saharan groups, which have the potential to intervene before migrants even get to Libya, as plausible proxies.
Some tribal leaders in southern Libya – mostly Tubu and Tuareg – look favorably on Italy’s and Europe’s overtures and suggested that the EU should cooperate directly with local militias to secure the border. But their tribes largely benefit from smuggling migrants, and they also made clear this business will not stop unless development aid and compensation for the smugglers is provided. ‘The EU wants to use us against migrants and terrorism,’ a Tubu militia leader told me, off the record, on the sidelines of a meeting in the European Parliament last year. ‘But we have our own problems. What alternative can we propose to our youth, who live off trafficking?’
With or without the EU, some of the newly armed groups in Libya are selling themselves as migrant hunters. ‘We arrested more than 18,000 migrants,’ a militia chief told me, with a hauteur that reminded me of the anti-immigrant sentiment spreading across Europe. ‘We don’t want just to please the EU, we protect our youths and our territory!’
It seems rather reckless, however, in a largely stateless stretch of the Sahara, for Europe to empower militias as proxy border guards, some of whom are the very smugglers whose operations the EU is trying to thwart. The precedent in Sudan is not encouraging. Last year, Khartoum received funding from the EU that was intended to help it restrict outward migration. The best the government could do was redeploy at the Sudanese-Libyan border the notorious Rapid Support Forces, recruited among Darfur’s Janjaweed militias, which have wreaked havoc in the province since 2003. In due course, their leader, Brigadier General Dagalo, also known as ‘Hemeti,’ claimed to have arrested 20,000 migrants and then threatened to reopen the border if the EU did not pay an additional sum. The EU had already given Sudan and Niger 140 million euros each in 2016. And the Libyan rival factions are catching on, understanding well that the migrant crisis gives them a chance to blackmail European leaders worried about the success of far-right anti-immigrant groups in their elections. In February, with elections looming in the Netherlands and France, the EU made a deal with the Tripoli-based, internationally recognized Government of National Accord to keep migrants in Libya, on the model of its March 2016 agreement with Turkey, despite the fact that the GNA has little control over the country. In August, the GNA’s main rival, eastern Libya’s strongman Khalifa Haftar, claimed that blocking migrants at Libya’s southern borders would cost one billion euros a year over 20 years and asked France, his closest ally in Europe, to provide him with military equipment such as helicopters, drones, armored vehicles, and night vision goggles. Needless to say, Haftar did not get the equipment.
Read the full article in Foreign Affairs.
Africa gets its first major contemporary art museum – but challenges lie ahead
Antwaun Sargent, Artsy, 19 September 2017
The appearance of the museum being yet another white power grab in Africa is further exacerbated by the fact that the museum’s five trustees are white and the advisory board is co-chaired by David Green – the white British CEO of the V&A Waterfront, who funded a large part of the museum’s 500 million rand ($38 million) construction cost – and Jochen Zeitz himself. (When I spoke to Zeitz in Johannesburg recently, I asked him what he wanted to achieve with the museum. He simply said: ‘I hope many people from all over the world come see it.’)
A board meeting at Zeitz MOCAA may be like one convened at MoMA or the Whitney Museum, but unlike those august institutions, the curatorial staff, according to Coetzee, is representative of South Africa’s most recent census and is overwhelmingly black. The museum also has an endowed curatorial program that will train aspiring African curators, with the hopes that they return home to work specifically in the context of their communities.
Gallerists, curators, and artists I spoke to raised concerns about the museum’s centers of power. But all remained hopeful that, although Zeitz MOCAA’s leadership does not look like the public it will serve, the museum could be a cultural space unlike any seen on the continent. ‘When you look at the contemporary art scene, we have great artists but we don’t have platforms for them,’ said the director of the FNB Joburg Art Fair, Mandla Sibeko.
According to Sibeko, only 12 countries in Africa have at least one contemporary art gallery, so platforms like FNB Joburg and Zeitz MOCAA represent two of very few international opportunities for artists.
‘Look, if there is any institution that can support and house African art on the continent, I think it’s a very positive thing,’ said South African artist Robin Rhode before his performance at the fair. The celebrated Nigerian curator Bisi Silva was also tentatively enthusiastic. ‘We are all very excited about it, of course,’ she said, ‘but what we do definitely want to see is that it reaches out across the continent, and that’s something that’s sometimes not as easy from South Africa. I think that is going to be very important.’
The 100,000-square-foot museum’s inaugural exhibitions feature about 300 works of art that, at least to my Western eyes, are impressive. The 11 shows – ‘Wounded Negative,’ ‘LGBTQI+,’ ‘Material Value,’ and ‘States of Grace,’ to name a few – present an intergenerational mix of a few white but mostly black voices from across the continent, such as Gabrielle Goliath, Nandipha Mntambo, Mouna Karray, and Samson Kambalu.
Read the full article in Artsy.
When dissent became treason
Adam Hochschild, New York Review of Books, 28 September 2017
Anticommunism in this country, he points out, never had much to do with the Soviet Union. For one thing, it had already been sparked by the Paris Commune, decades before the Russian Revolution took place. ‘To-day there is not in our language…a more hateful word than Communism,’ thundered a professor at the Union Theological Seminary in 1878. For another thing, after the Revolution, anticommunists knew as little as American Communists about what was actually happening in Russia. The starry-eyed Communists were convinced it was paradise. The anticommunists found they could shock people if they portrayed the country as one ruled by ‘commissariats of free love’ where women had been nationalized along with private property and were passed out to men. Neither group had much incentive to investigate what life in that distant country was really like.
For a century or more, Fischer convincingly documents, the real enemy of American anticommunism was organized labor. Employers were the core of the anticommunist movement, but early on began building alliances. One was with the press (whose owners had their own fear of unions): as early as 1874 the New York Tribune was talking of how ‘Communists’ had smuggled into New York jewels stolen from Paris churches by members of the Commune, to finance the purchase of arms. That same year the Times spoke of a ‘Communist reign of terror’ wreaked by striking carpet weavers in Philadelphia. In 1887, Bradstreet’s decried as ‘communist’ the idea of the eight-hour workday.
The anticommunist alliance was joined by private detective agencies, which earned millions by infiltrating and suppressing unions. These rose to prominence in the late nineteenth century, and by the time of the Palmer Raids the three largest agencies employed 135,000 men. Meanwhile, starting in the 1870s, the nation’s police forces began using vagrancy arrests to clear city streets of potential troublemakers (New York made more than a million in a single year). Then they developed ‘red squads,’ whose officers’ jobs and promotions depended on finding communist conspiracies.
Read the full article in the New York Review of Books.
Emile Chabal, Aeon, 18 September 2017
The racialised model that underpinned the origins of the idea of the Anglo-Saxon in France has fallen into disrepute. Where before the most common adjectival use of Anglo-Saxon in French was in the phrase la race anglo-saxonne, such usages have become taboo since the 1970s. Today, we readily talk about national groups (‘British’, ‘French’) or regional entities (‘Europeans’, ‘South Asians’), but we would hesitate to contrast, as Maurras did in the 1910s, the ‘Anglo-Saxons’, the ‘Slavs’ and the ‘yellow races’.
So why do the French continue to use such a loaded term? The answer lies in its transformed meaning. Since the 1970s, its racial connotations have been buried, to be replaced by broader social, cultural and economic meanings. Two of these stand out: its use to describe the economic system of late capitalism and its importance in debates surrounding multiculturalism. There is still, obviously, a competitive element to the term. It still conjures up older, quasi-military connotations of the Anglo-Saxon, for example when de Gaulle used it in the 1960s to describe Anglo-American nuclear cooperation, and when contemporary European negotiators use it to describe British intransigence in the face of the European Union. But the term is now much more commonly applied to an amorphous sense of difference between France and the English-speaking world.
The resurgence of the term Anglo-Saxon in relation to late capitalism can be easily tracked in a variety of French publications. From the early 1990s onwards, journalists and editors begin to attach the adjective Anglo-Saxon to the words capitalisme (capitalism) and marché (market). A search in the archives of the staunchly Left-wing Le Monde diplomatique since 1978 turns up references to the ‘monetarist myopia that dominates the Anglo-Saxon world’ (in 1981); the ‘attempts on the part of Anglo-Saxon capitalism to achieve global hegemony’ (in 1982); and the ‘Anglo-Saxon capitalist model that is the privileged choice of multinationals’ (in 1992). These are complemented by frequent references to ‘Anglo-Saxon liberalism’, which was usually seen to be a dangerous affront to a French ‘social model’. As the 1990s wore on, this language became increasingly alarmist: commentators warned of the hegemony of an ‘Anglo-Saxon neo-liberalism’ or a predatory ‘Anglo-Saxon capitalism’, particularly in the wake of the public-sector strikes of 1995. Articles railed against the decision of the French government to fall in line with ‘the Anglo-Saxon ‘model’… the terrible consequences of which are now plain’ (in 1997).
The association between the Anglo-Saxon and capitalism was cemented not only in the pages of Left-wing magazines. At the time of the 2005 referendum, which saw the French reject a new constitutional settlement for Europe, the Anglo-Saxon also featured prominently in public debate. Even before the referendum, the Gaullist president Jacques Chirac sought to reassure voters that ‘a laissez-faire solution, in other words, a solution leading to a Europe pushed forward by an ultraliberal, Anglo-Saxon and Atlanticist tendency… is not what we want’. And, shortly after the referendum, the first line of a front-page editorial in the daily newspaper Le Monde placed the fear of the Anglo-Saxon at the heart of the campaign. As the editors put it: ‘The depth of the No vote … can be explained largely by a refusal of the “Anglo-Saxon model”.’ By the first decade of the 21st century, the Anglo-Saxon had become more than a rhetorical device; it was a political battleground. To take a stand for or against the Anglo-Saxon ‘way’ was to take sides in a pressing debate about the ethics of economic development.
Read the full article in Aeon.
America’s shameful history of voter suppression
Andrew Gumbel, Guardian, 13 September 2017
Not only was there a problem of reliability with the voting machines, it also became clear that the United States had never established an unequivocal right to vote; had never established an apolitical, professional class of election managers; and had no proper central electoral commission to set standards and lay down basic rules for everyone to follow, free of political interference.
In the absence of such a body, every jurisdiction was free to play fast and loose with the rules on everything from voter eligibility to whether or not to conduct recounts.
‘All these different systems in different counties with no accountability … it’s like the poorest village in Africa,’ the chair of South Africa’s Independent Electoral Commission, Brigalia Bam, later exclaimed on a follow-up tour of Florida on the eve of the 2004 presidential election.
Much of that dysfunction harks back to the country’s shameful racial history. To circumvent constitutional amendments passed in the wake of the civil war, southern states approved a slew of discriminatory laws and introduced literacy tests and good character tests (also adopted in parts of the north) that made it next to impossible for black voters to cast their ballots. James Vardaman, the despotic governor of Mississippi, admitted in 1890 that his state’s new constitution had ‘no other purpose than to eliminate the nigger from politics’.
Read the full article in the Guardian.
The making and the breaking of the legend of Robert E Lee
Eric Foner, New York Times, 28 August 2017
By the time the Civil War ended, with the Confederate president, Jefferson Davis, deeply unpopular, Lee had become the embodiment of the Southern cause. A generation later, he was a national hero. The 1890s and early 20th century witnessed the consolidation of white supremacy in the post-Reconstruction South and widespread acceptance in the North of Southern racial attitudes. A revised view of history accompanied these developments, including the triumph of what David Blight, in his influential book ‘Race and Reunion’ (2001), calls a ‘reconciliationist’ memory of the Civil War. The war came to be seen as a conflict in which both sides consisted of brave men fighting for noble principles — union in the case of the North, self-determination on the part of the South. This vision was reinforced by the ‘cult of Lincoln and Lee,’ each representing the noblest features of his society, each a figure Americans of all regions could look back on with pride. The memory of Lee, this newspaper wrote in 1890, was ‘the possession of the American people.’
Reconciliation excised slavery from a central role in the story, and the struggle for emancipation was now seen as a minor feature of the war. The Lost Cause, a romanticized vision of the Old South and Confederacy, gained adherents throughout the country. And who symbolized the Lost Cause more fully than Lee?
This outlook was also taken up by the Southern Agrarians, a group of writers who idealized the slave South as a bastion of manly virtue in contrast to the commercialism and individualism of the industrial North. At a time when traditional values appeared to be in retreat, character trumped political outlook, and character Lee had in spades. Frank Owsley, the most prominent historian among the Agrarians, called Lee ‘the soldier who walked with God.’ (Many early biographies directly compared Lee and Christ.) Moreover, with the influx of millions of Catholics and Jews from southern and eastern Europe alarming many Americans, Lee seemed to stand for a society where people of Anglo-Saxon stock controlled affairs.
Historians in the first decades of the 20th century offered scholarly legitimacy to this interpretation of the past, which justified the abrogation of the constitutional rights of Southern black citizens. At Columbia University, William A. Dunning and his students portrayed the granting of black suffrage during Reconstruction as a tragic mistake. The Progressive historians — Charles Beard and his disciples — taught that politics reflected the clash of class interests, not ideological differences. The Civil War, Beard wrote, should be understood as a transfer of national power from an agricultural ruling class in the South to the industrial bourgeoisie of the North; he could tell the entire story without mentioning slavery except in a footnote. In the 1920s and 1930s, a group of mostly Southern historians known as the revisionists went further, insisting that slavery was a benign institution that would have died out peacefully. A ‘blundering generation’ of politicians had stumbled into a needless war. But the true villains, as in Lee’s 1856 letter, were the abolitionists, whose reckless agitation poisoned sectional relations. This interpretation dominated teaching throughout the country, and reached a mass audience through films like ‘The Birth of a Nation,’ which glorified the Klan, and ‘Gone With the Wind,’ with its romantic depiction of slavery. The South, observers quipped, had lost the war but won the battle over its history.
Read the full article in the New York Times.
Keepers of the secrets
James Somers, Village Voice, 20 September 2017
The ‘backend’ of the New York public library system is a three-story building in Long Island City, a few blocks from the Court Square subway stop, that looks like an elementary school. The building says ‘BookOps’ on the facade, and sits next to a Tower TLC rental facility for livery drivers. It houses ‘technical services’ for the NYPL and Brooklyn Public Library; every new item destined for either library first comes through here to get cleaned up, bar-coded, and entered into the library database. Rare books that are falling apart, or old maps, are meticulously restored in industrial-grade laboratories on the third floor.
This is the home of the archival processing team, the organization that turns newly acquired archival collections — like Lou Reed’s collected papers and recordings, or jazz musician Sonny Rollins’s, both of which were acquired this year — into a resource that’s usable by researchers.
It used to be that papers were donated to libraries. Now, as often, major archives are sold, sometimes for millions of dollars. The Harry Ransom Center at the University of Texas at Austin, which is well-funded and ambitious, is said to have particularly driven up the price for the most sought-after collections, like David Foster Wallace’s papers.
When a collection arrives in Long Island City, the first step is to ‘stabilize’ it, as though it were a patient just arrived at the ER. One recently acquired collection — the archives of the New York Review of Books — had been sitting at the Navy Yard for twenty years. It was covered in oily dirt. The archivists who brought it here had to wear Tyvek suits and facemasks while unpacking it. There’s a room on the third floor called the ‘disaster recovery room,’ where, say, a mold infestation might be taken care of.
It is at this stage, too, that the not strictly archival material usually gets found in the filing cabinets. Lea Osborne, the head of the archival processing unit at the NYPL, told me that she has found dentures, homemade roller skates, a bottle of ginseng, and, so far in the Sonny Rollins archive, more than $8,000 in cash. (That gets returned to the donor.)
The real work of processing a collection, though, is intellectual. The goal is to make the files you’ve received findable by a researcher; and of course to make them findable, you have to know what’s in them. In the old days, this was slow work. Archivists would read most of the documents in a folder, taking note of them, rearranging the documents if they seemed disorganized. Their finding aids, the all-important database records that tell a researcher what’s in a given collection, were deeply hierarchical, with detail all the way down to individual pieces of paper in individual folders. You wouldn’t just get the ‘Ezra Pound letters’, you’d get something akin to ‘Ezra Pound letters, 1904–06, re Joyce’.
The explosion of paper made this approach unsustainable. ‘When the National Archives in Washington was created in 1934, it inherited an awesome backlog of about one million metres of federal records, with a growth rate of more than sixty thousand metres annually,’ the archival theorist Terry Cook wrote in a paper. ‘By 1943, under the expansion of the state to cope with the Great Depression and World War II, that growth rate had reached six hundred thousand metres annually.’
Read the full article in Village Voice.
Genetics spills secrets from Neanderthals’ lost history
Jordana Cepelewicz, Quanta Magazine, 18 September 2017
In 1856, three years before the publication of Charles Darwin’s On the Origin of Species, a group of miners uncovered human fossils in a limestone cave in the Neander Valley of northern Germany — what would later be named Neanderthal 1, the first specimen to be recognized as belonging to another, archaic species of human. We have been trying to understand as much as possible about our mysterious cousins ever since. To do so, experts have consulted two major lines of evidence: the hundreds of bones and stone tools found to date, scattered from Spain and England to the Altai Mountains, and, much more recently, genomic data and inferences drawn from statistical models.
But these approaches paint strikingly different pictures of what Neanderthal populations would have looked like. The archaeological record suggested that very roughly 150,000 individuals spanned Europe and Asia, living in small groups of 15 to 25 — and that their total numbers fluctuated greatly during the several climate cycles (which included harsh glacial periods) that occurred during the half a million years they inhabited Earth, before going extinct 40,000 years ago.
Genetic sequencing tells a different story. Some gene-based estimates put the Neanderthals’ effective population at a measly 1,000; others claim they hovered at a few thousand at most (one study, for example, calculated that there were effectively fewer than 3,500 females). Two hypotheses might account for these results: that the population was indeed that low, even at its peak, or that the population was perhaps larger but had been decreasing for a very long time. In either case, the Neanderthals were always on the decline; their extinction seemed to have been foretold from the beginning.
‘The fact that these two kinds of estimates don’t match is an issue we have yet to work out,’ said John Hawks, a paleoanthropologist at the University of Wisconsin-Madison.
A prison sentence ends. But the stigma doesn’t.
James Forman Jr, New York Times, 25 September 2017
But Harvard’s rejection of Ms. Jones (and my university, Yale, rejected her as well, though the reasons remain unclear) is more than that. It reveals the truth about why mass punishment persists and the lie we are telling ourselves about the possibility of redemption.
Here’s the thing about harsh justice in America. More and more people criticize it, but most eagerly shift the blame for who is responsible. I saw this repeatedly in California, where I just spent a year living and teaching. I lost count of the number of conversations I had with colleagues and friends about criminal justice in which somebody bemoaned the state of affairs in ‘the Trump states.’ I responded by bringing up the fact that California led the prison-building movement in the 1980s and ’90s, and would share stories about a visit to San Quentin prison, located just across the water from San Francisco, where I met dozens of men serving life sentences. Nobody from the Trump states put them there, or is keeping them there, I would say. That’s on California voters and their elected officials. That’s on you.
I suspect that the administrators and professors who helped block Ms. Jones’s admission are a lot like my friends in Connecticut and California. They consider themselves liberal, and they think mass incarceration is a problem. Somebody else’s problem. Blame the judges, prosecutors, legislators, police, probation officers, prison guards. Just not us.
What will the gatekeepers of privilege do when confronted with gold-star applicants who have a criminal record? Harvard’s answer — you can never outlive your crime — is an affront to a first-rate candidate and brings shame on those responsible.
Read the full article in the New York Times.
Provocative Nat Turner-inspired portraits fuel debate after their removal
Cara Ober, Hyperallergic, 8 September 2017
The Joy Cometh paintings are deliberately provocative and designed to challenge commonly accepted notions of history and race in the US. They are uncomfortably realistic and they inflame your emotions, no matter what your racial or familial history. While at Galerie Myrtis, these paintings were written about in multiple Baltimore publications, and, as far as I know, attracted little controversy and were widely praised. Perhaps this has something to do with the fact that Galerie Myrtis specializes in work by African-American artists, and that the gallery’s typical audience regularly sees and consumes contemporary art.
In late August, at Goucher College’s Rosenberg Gallery, the same paintings provoked a completely different response. A Goucher employee deemed the paintings offensive, stating that she shouldn’t have to look at black faces with nooses around their necks at work, leading to a mediated discussion between the employee, the gallery director, and AU professor and artist Zoë Charlton in which the employee expressed that these paintings made her work environment feel abusive and uncomfortable.
After he was informed of the complaint, Towns issued a statement and requested that the paintings be removed, but that taped squares remain where the paintings were originally hung. It’s unclear whether the college would have insisted upon the paintings’ removal, but it was an unfortunate possibility. Rather than forcing the college’s hand, Towns chose to remove his own works.
‘It has come to my attention that the work from my Joy Cometh in the Morning series has offended staff at Goucher College,’ says Towns in a statement placed in the gallery alongside the empty frames. ‘Though I am saddened to see the work go, I value Goucher’s Black employees’ concern. The intent of my work is to examine the breadth and complexity of American history, both good and bad. It is not to fetishize Black pain, nor to diminish it.’
Read the full article in Hyperallergic.
Laura Kipnis’s endless trial by Title IX
Jeannie Suk Gersen, New Yorker, 20 September 2017
Kipnis told me that she was surprised when Northwestern once again launched a formal Title IX investigation of her writing. (A spokesperson from Northwestern did not respond to a request for comment by press time.) Kipnis said that investigators presented her with a spreadsheet laying out dozens of quotations from her book, along with at least eighty written questions, such as ‘What do you mean by this statement?,’ ‘What is the source/are the sources for this information?,’ and ‘How do you respond to the allegation that this detail is not necessary to your argument and that its inclusion is evidence of retaliatory intent on your part?’ Kipnis chose not to answer any questions, following the standard advice of counsel defending the court case.
She did submit a statement saying that ‘these complaints seem like an attempt to bend the campus judicial system to punish someone whose work involves questioning the campus judicial system, just as bringing Title IX complaints over my first Chronicle essay attempted to do two years ago.’ In other words, the process was the punishment. Possible evidence of retaliatory purpose, she learned, included statements in the book that aggressively staked out her refusal to keep quiet, expressed in her trademark hyperbole. Her prior Title IX investigation, she writes, ‘has made me a little mad and possibly a little dangerous. . . . I mean, having been hauled up on complaints once, what do I have to lose? ‘Confidentiality’? ‘Conduct befitting a professor’? Kiss my ass. In other words, thank you to my accusers: unwitting collaborators, accidental muses.’ Also presented as possible evidence was her Facebook post quoting a book review – ’Kipnis doesn’t seem like the sort of enemy you’d want to attract, let alone help create’ – on which Kipnis had commented, ‘I love that’…
For many, Title IX has become synonymous with the imperative to address sexual assault among students. But Title IX can also be used to discourage disagreement, deter dissent, deflect scrutiny, or register disapproval of people whom colleagues find loathsome. The problem is not with Title IX itself, much less the generic capacity of any rule to be used as a pretext for unrelated ends. Rather, it is the growing tendency to try, in the words of Kipnis’s book, ‘to bend Title IX into an all-purpose bludgeon.’ This warping is made possible by ambiguous and undisciplined understandings—misunderstandings—of sexual harassment and its harms. Kipnis’s rebuke of common slippages and conflations, whereby ‘gropers become rapists and accusers become Survivors,’ anticipated a situation in which expression of her opinions about a sexual-harassment allegation could be sincerely perceived as an act of sexual harassment. Perhaps, in this environment, the complaint that followed the publication of ‘Unwanted Advances’ was inevitable.
Kipnis implied as much in ‘My Title IX Inquisition’: ‘by writing these sentences,’ she was ‘risking more’ complaints. That risk is now built into the professional life of those of us in universities who engage on subjects related to gender and sexuality. Like Kipnis, I routinely hear from teachers who say they are refraining from teaching and writing on such topics for fear of attracting Title IX complaints, which bring possibilities of termination, demotion, pay cuts, and tens of thousands of dollars in legal fees, especially for the swelling ranks of teachers who, unlike Kipnis and me, do not have tenure.
Read the full article in the New Yorker.
Nothing matters: how the invention of zero helped create modern mathematics
Ittay Weiss, The Conversation, 20 September 2017
Zero’s late arrival was partly a reflection of the negative views some cultures held about the concept of nothing. Western philosophy is plagued with grave misconceptions about nothingness and the mystical powers of language. The fifth century BC Greek thinker Parmenides proclaimed that nothing cannot exist, since to speak of something is to speak of something that exists. This Parmenidean approach kept prominent historical figures busy for a long while.
After the advent of Christianity, religious leaders in Europe argued that since God is in everything that exists, anything that represents nothing must be satanic. In an attempt to save humanity from the devil, they promptly banished zero from existence, though merchants continued secretly to use it.
By contrast, in Buddhism the concept of nothingness is not only devoid of any demonic possessions but is actually a central idea worthy of much study en route to nirvana. With such a mindset, having a mathematical representation for nothing was, well, nothing to fret over. In fact, the English word ‘zero’ is ultimately derived from the Sanskrit ‘sunyata’, which means nothingness and is a central concept in Buddhism.
So after zero finally emerged in ancient India, it took almost 1,000 years to take root in Europe, much longer than in China or the Middle East.
Read the full article in The Conversation.
Hyping the history of mathematics
Thony Christie, The Renaissance Mathematicus, 19 September 2017
All of the articles, which are basically clones of the original announcement, state quite clearly that this is a placeholder zero and not the number concept zero, and that there are earlier recorded symbols for placeholder zeros in both Babylonian and Mayan mathematics. Of course it was only in Indian mathematics that the placeholder zero developed into the number concept zero, of which the earliest evidence can be found in Brahmagupta’s Brahmasphuṭasiddhanta from the seventh century CE. However, this re-dating of the Bakhshali manuscript doesn’t actually bring us any closer to knowing when, why or how that conceptual shift, so important in the history of mathematics, took place. Does it in any way actually change the history of the zero concept within the history of mathematics? No, not really.
Historians of mathematics have known for a long time that the history of the zero concept within Indian culture doesn’t begin with Brahmagupta and that it was certainly preceded by a long, complex prehistory. They are well aware of zero concepts in Sanskrit linguistics and in Hindu philosophy that stretch back well before the turn of the millennium. In fact it is exactly this linguistic and philosophical acceptance of ‘nothing’ that historians assume enabled the Indian mathematicians to make the leap to the concept of a number signifying nothing, whereas the Greeks, with their philosophical rejection of the void, were unable to bridge the gap. Having a new earliest symbol in Indian mathematics for zero as a placeholder, as opposed to the earlier recorded words for the concept of nothingness, doesn’t actually change anything fundamental in our historical knowledge of the number concept of zero…
The hype that I have outlined here in the recent history of mathematics has unfortunately become the norm in all genres of history and in the historical sciences such as archaeology or palaeontology. New discoveries are not presented in a reasonable manner, putting them correctly into the context of state-of-the-art research in the given field, but are trumpeted out at a metaphorical 140 decibels with the claim that this is a sensational, discipline-redefining, unbelievable, unique (choose your own hyperbolic superlative) discovery. The context is, as above, very often misrepresented to make the new discovery seem more important, more significant, whatever. Everybody is struggling to make themselves heard above the clamour of all the other discovery announcements being made by the competition, thereby creating a totally false impression of how academia works and how it progresses. Can we please turn down the volume, cut out the hype and present the results of academic research in history in a manner appropriate to it, and not to the marketing of the latest Hollywood mega-bucks blockbuster?
How did dinosaurs evolve beaks and become birds?
Michael J Benton, The Conversation, 26 September 2017
Once you know that many dinosaurs had feathers, it seems much more obvious that they probably evolved into birds. But there’s still a big question. How did a set of dinosaurian jaws with abundant teeth (think T. rex) turn into the toothless jaws of modern birds, covered by a beak? Two things had to happen in this transition, suppression of the teeth and growth of the beak. Now new fossil evidence has shown how it happened.
In a new study, Shuo Wang from the Capital Normal University of Beijing and colleagues studied a series of dinosaur and early bird fossils to see the transition. They found that some dinosaurs evolved to lose their teeth as they got older and sprouted a small beak. Over time, this process happened earlier and earlier until eventually the animals emerged from their eggs with a fully formed beak.
The oldest birds actually had reptile-like teeth – for example Archaeopteryx from the late Jurassic period (150m years ago) and Sapeornis from the early Cretaceous (125m years ago). But other early birds had lost their teeth, such as Confuciusornis, also from the early Cretaceous.
Modern birds all lack teeth, except for the South American hoatzin, Opisthocomus, whose hatchlings have a small tooth that they use to help them escape from their egg and then shed. Developmental experiments in the 1980s showed that modern birds could probably generate teeth if their jaw tissue was artificially stimulated with the right molecules. This suggests their ancestors at some point grew teeth naturally.
Read the full article in The Conversation.
Do immigrants ‘steal’ jobs in South Africa? What the data tell us
Raphael Chaskalson, Ground Up, 18 September 2017
On 1 May 2008, riots broke out against foreign nationals in Alexandra township, a densely populated settlement some two miles east of Sandton, sub-Saharan Africa’s financial centre. The iconic image of Mozambican national Ernesto Nhamuave being burnt alive shocked South African audiences and quickly spread across the world. Within two weeks, ‘xenophobic violence’ had spread across South Africa. Those targeted were largely foreign African and South Asian nationals. After two weeks, more than 60 people had been killed and more than 100,000 displaced.
The ‘charges’ against immigrants are typical of anti-immigrant sentiment globally: a 2010 survey by the Southern African Migration Programme found that 60% of South Africans believe immigrants ‘take jobs’, whilst 55% believe that they worsen crime.
But in fact we know very little about the actual labour market effects of immigration to South Africa. The best source of useful demographic information about immigrants in South Africa is the census, the most recent of which was conducted in 2011. There are, of course, problems with using these data. Demographic trends may well have changed in the six years since 2011. There is also a risk that immigrants may dodge census surveyors if they are scared of being victimised by the authorities, so that the data may underestimate immigration to South Africa. But, taking the data at face value, we can glean useful demographic features of immigrants compared to South African-born workers…
Based on the evidence we have available, we can conclude with reasonable confidence that immigrants do not take jobs from South Africans overall – in fact, a best-case scenario suggests that they are creating a small number of jobs where they settle. However, when we home in on particular job categories, we do see a small, negative employment effect for workers in better-skilled job categories. If we believe that immigrants are systematically under-represented in the census data, this may reflect a larger negative effect in reality.
In any case, the predicted effects on both wages and employment, taken at face value, are extremely small.
Read the full article in Ground Up.
The races of Europe: Construction of national identities in the social sciences, 1839-1939
Ian Stewart, Reviews in History, 22 September 2017
In part one, McMahon is largely concerned with establishing a geography of racial classification in Europe, where the three ‘core’ nations of Britain, France, and Germany spurred most racial research. However, as the 19th century wore on, classificatory race science expanded outwards from this core, though the three nations remained dominant. Here the author should be commended for going beyond the normal comparative framework of Britain, France, and Germany, instead aiming to provide a true European picture – supported by case studies of the refreshingly novel choices of Ireland, Romania, and Poland. Most intriguingly, as outlined in chapter two, McMahon has constructed his geography through a quantitative database of 126 source texts, from what he identifies as the ‘elite’ of racial classifiers. Aware of the subjective imperfection of his method and of quantitative limitations more generally, McMahon does not rely on the database overmuch. This may be because it seems to prop up findings already more or less known, or at least suspected. For example, few familiar with the field will be surprised that Paris is confirmed by the data to have been the 19th-century racial classification capital (p. 50). More interesting is McMahon’s use of transnational comparison to contribute to the Sonderweg debate, suggesting that German anthropology was more monogenist and egalitarian than the contemporary (Robert Knox-inspired) London Anthropological Society or the Parisian anthropologists (p. 57). The geographical picture – made clearer by the useful inclusion of 24 maps in the appendix – established early on helps to frame McMahon’s findings through the rest of the book.
A timeline for race science is also established in part one, with an obvious increase in the complexity of racial classification as well as its adaptability through different intellectual paradigms. The most significant change within race science was the shift in the middle of the 19th century from the long-established ethno-linguistic understandings of race to more hard and fast physiological distinctions between nations. Here the established geography repeatedly provides the context for a clash between the waning currents of Enlightened universalism and local national imperatives. Although the community of racial scientists remained internationally focused throughout the 19th century, by the fin de siècle any notion of a common project had collapsed as nationalism overshadowed scientific objectivity. However, for superior races there had to be inferior ones, so to some extent racial classification was always transnational and comparative. In the 20th century European geographical classifications came to replace long-established racial delineations. For example, ‘Nordicism’ encompassed the Germanic-speaking peoples as the Indo-European, or Aryan, idea was set aside in these countries (pp. 171–2).
McMahon explores the relationship between race science and politics in the fourth chapter of the work, foregrounded as a ‘central theme’ in his introduction (pp. 1–2). An obvious factor here is that scientists more or less relied on politicians – via the state – to fund their research, which had implications for the racial narratives they created and against which nations they directed them. But most scientists were also nationalists to varying degrees and so a tension between local imperatives and international scholarly objectivity was also evident across Europe. As the 19th century wore on, universalism mattered less and less as nationalist dimensions took over. The most striking example included here came when the French anthropologist Paul Broca found that his data forced him into the inconvenient conclusion that the majority of Frenchmen were brachycephalic rather than dolichocephalic (round-headed as opposed to long, narrow-headed), the opposite of common assumptions. What could Broca do? Fudge the data? Or reorder craniological hierarchies? Remarkably, he chose the latter, turning the established sequence of craniological prestige on its (broad) head, so that brachycephalic became perceived as the most prestigious skull shape in French racial science. This had wide ramifications as the Slavs were also thought brachycephalic. The French therefore suddenly found they had Celto-Slav relatives, at the same time as the great power politics of France and Russia aligned.
Read the full article in Reviews in History.
Paleoart: the evolution of dinosaur paintings, from watercolours to Soviet visions
Tom Holland, New Statesman, 18 September 2017
Few genres of art were more authentically representative of the industrial age than portrayals of the prehistoric past. As the artist Walton Ford puts it in his preface: ‘This is a book brimming with images born in the heat of startling discovery, urgent works of first contact and of handcrafted time travel.’
As such, they are images not just of prehistoric life, but of how different people at different times have imagined prehistoric life. Hence, perhaps, why the earliest illustrations compiled in the book tend to be the most agitated and unsettling of all. They are the expressions of an entire upheaval in sensibility, of the shock felt by complacent humanity at the discovery of just how immense were the cycles of geological time, and of how brutal had been the repeated cullings of creatures that were now only to be found entombed in rock.
‘Prehistory,’ as Lescaze puts it, ‘could not help but engender uncomfortable musings on a benevolent God’s capacity to annihilate entire species.’ A shadow of the apocalyptic hung over the earliest works of paleoart. Volcanoes exploded, oceans seethed, beast preyed on beast. In Duria Antiquior, such was the terror of one plesiosaur that the wretched animal was shown voiding proto-coprolites on to the sea floor.
Read the full article in the New Statesman.
I’m Indian. Can I write black characters?
Thrity Umrigar, New York Times, 14 September 2017
In retrospect, it seems incredible I didn’t anticipate the questions.
My seventh novel, ‘Everybody’s Son’ — about an affluent white couple, their adopted black son, and his search for identity and reconciliation with his past — came to me in a flash of inspiration. I wrote the story in a white heat, in about four months.
So I was unprepared for the question that interviewers kept asking me about the book: Why, and how, had I chosen to write from the perspective of an African-American protagonist? I hadn’t expected this line of inquiry because, although race and racial identity are central preoccupations of the book, I saw Anton not just as a black character, but as a singular, distinctive character born of my imagination and efforts.
I soon realized I had been naïve. While I might define myself as an American writer, I grew up in India. That means, to many, I’ll always be an Indian-American writer, with all the freight that the hyphen carries.
The assumption by agents, editors and readers was that I would continue writing novels featuring Indian characters or set in India — as I did in my first six novels — even though I have not lived there for over 30 years.
Read the full article in the New York Times.
The images are, from top down: Congo Belge II by T Kalema; the interior of Zeitz MOCAA (photo by Iwan Baan in dezeen); the statue of Robert E Lee in Charlottesville; ‘Shall It Declare Thy Truth’ by Stephen Towns, part of his ‘The Joy Cometh’ series; and ‘Inostrancevia devouring a Pareiasaurus’ by Alexei Petrovich Bystrow.