“Many of those engaged in these early struggles and projects have sustained strong supportive networks. I have benefited hugely from these”
Professor Matt Cook
As a gay academic working on queer themes in history, my feelings of comfort and belonging owe a lot to the emergence of new areas of scholarship, to my discovery of community among colleagues and students – and to good timing.
I began my postgraduate studies in the mid‑1990s, just as work on gender and sexuality had gained some credibility and was even fashionable in some places – not least at Queen Mary University of London, where I found myself. By the time I emerged with my PhD in 2000, much ground had already been laid and my specialism was not the impediment to gaining an academic post that it had been for the preceding generation. There was a growing sense that explorations of sexuality had a real significance to broader understandings of society, culture and politics – past and present.
In the 1970s and 1980s, the scholars in the UK who inspired me – Jeffrey Weeks, Lynne Segal and Sheila Rowbotham among them – wrote much of their early work outside the university sector or against the grain of the jobs they were being paid for. They were nurtured instead by political and community networks arising from women’s and gay liberation, from the Gay Left collective and also from the History Workshop movement and journal (which, from its inception, had taken gender and sexuality – and those working beyond the academy – seriously). Such scholars had to argue that women’s and gay history were not marginal or peripheral areas of study and had a place in university departments. Once hired, some of them (including those I’ve mentioned) faced overt disdain or were “benignly” expected to focus on other things seen as more significant.
There was some notable resistance to this marginalisation. At the University of Sussex in 1991, Alan Sinfield and Jonathan Dollimore established the Sexual Dissidence master’s programme, exploring history, literature, and post-structuralist and queer theory. It felt especially urgent in the context of the Aids crisis, Clause 28 (which prevented UK local councils from “promoting homosexuality”) and a broader homophobic backlash. Unsurprisingly, it was derided as insignificant, trendy (an insult in this context) and part of a “Loony Left” agenda. But, tellingly, the programme is still running 25 years on.
Read the original Times Higher Education article here
Many of those engaged in these struggles and projects have sustained strong supportive networks. I have benefited hugely from these. Research and teaching projects have meanwhile allowed me to work with LGBT community groups and with archive and museum professionals – giving me sustaining anchor points outside academia.
At Birkbeck, University of London – my institutional home for the past 10 years – I have found further communities. One is a history department with a collective commitment to wide-ranging historical work (and the intersections that it fosters). Another is with colleagues brought together through the Birkbeck Interdisciplinary Gender and Sexuality research centre. A third is with students whose engagement with their studies has often been underpinned by much more direct experiences of discrimination and marginalisation than I have had to deal with. Being a white, middle-class man has made me an insider in more ways than my queerness has set me apart.
Matt Cook is professor of modern history at Birkbeck, University of London and the author, most recently, of Queer Domesticities: Homosexuality and Home Life in Twentieth-Century London (2014).
I recently returned from a trip to Southern Italy. Apart from enjoying the delights of Neapolitan pizza (3 stars), the Bay of Naples (4 stars) and Pompeii (5 stars), I also went right down to the heel of Italy on a linguistic fact-finding mission, starting in the lovely Baroque town of Lecce.
Grecia Salentina – like the smaller area of Bovesia down in the toe – comprises nine villages where, intriguingly, it has been claimed that a form of Greek (written in the Roman alphabet) may have been spoken since the 8th century BC. Others say that the Greek spoken there was brought over by refugee settlers in Byzantine times; yet others claim that at least in its current form, it has more recent origins, dating to the 19th century.
Even discounting the more ancient origins which are claimed, it is intriguing that a linguistic minority should have survived so long in this context. Having failed to find any easily accessible and up-to-date sociolinguistic studies, I wanted to carry out a quick recce, and if possible hear this dialect for myself. I therefore went round all nine villages (one of them incidentally called Calimera, or ‘good day’ in Greek), looking for evidence of Greek both in the visual (‘linguistic landscape’) sense and for potential speakers.
There was plenty of evidence in the visual sphere: street signs, shop names (some even in the Greek alphabet), explanations on various monuments – even a fully fledged parish magazine trilingual in Modern Greek, Italian and Griko. There were also some clear culinary connections, probably dating back centuries: ‘chorta’ or wild greens, boiled and served as a salad in Greece, were also on the menu here, as was twice-baked bread as found in every Greek bakery.
But what of the active linguistic scene? Italian was standardised late in the 19th century and regional dialects are still widely spoken. As in Naples, in this area many locals do not speak standard Italian among themselves.
Like other Italian dialects, Neapolitan and Salentino varieties are being eaten away by the spread of the standard variety but they are still noticeably active in the local population. Our taxi driver in Naples, assailed from all sides by motorbike riders cutting in on him – a local pastime – opened his window and screamed with ferocious irony at one of them: ‘Ha raggiu! Ha raggiu!’ (‘You are right! You are right!’).
The Italian form, ‘Ha ragione’, simply would not have carried the same impact, savour or street cred. So, like many other linguistic situations, the Southern Italian one is as multilayered as the local lasagne. If Greek was there to be found, it would be vying not only with Italian but with a range of local dialects. Indeed this may have contributed to its decline, since an alternative ‘in-group’ variety, closer to the standard, was also available.
‘Relic’ languages and NORMS
But what was the evidence of the ‘Griko’ dialect actually being spoken? As all sociolinguists will know, the best hope of finding speakers of ‘relic’ languages is by interviewing ‘NORMS’ – non-mobile, older rural males. Fortunately for me, one of the principal pastimes of the ‘norms’ in Mediterranean countries is hanging out in the cafe with their friends, sipping a coffee or an alcoholic beverage, flicking their worry beads round (in Greece), and toothlessly commenting on the world going by. I therefore approached and spoke to a number of elderly gentlemen in their seventies or eighties in these villages.
I told them I was carrying out a linguistic study and was interested in whether any of them spoke Griko. All were friendly and interested, but none (save one) offered to produce any words of Griko. Their near-universal opinion, whichever village you were in, was that far more people spoke it in the next door village than in their own. In fact, on reflection, they thought it was indeed still widely spoken – only definitely somewhere else.
They also universally claimed it had been the normal means of communication between their parents, but that the latter had not passed it on to them. Finally, I was given the details of someone who definitely spoke it in Castrignano de’ Greci, and an appointment was made for me to meet him. I also spoke to a young family who said that certain schools taught Griko since the Italian government had declared it to be a regional language of Italy, but only as an extra-curricular ‘add-on’ on a par with folk dancing, and mainly through songs. There has therefore been a revival of sorts through this policy, and perhaps a positive change in attitudes, as Manuela Pellegrino’s doctorate at UCL recently showed, but there is Vesuvius to climb before this translates into active usage.
Sadness and elation
Professor Penelope Gardner-Chloros
When I arrived in Castrignano, my 94-year-old host and his wife could not have been more charming. He had written poetry extensively in Griko and had won prizes for it in the 1970s and 1990s. He proudly allowed himself to be recorded reading it out, occasionally checking my understanding as a Modern Greek speaker.
In spontaneous speech he did not appear to be really fluent any more – his wife was not a speaker, and at 94, there was no-one else much left to speak to. Even a mother tongue atrophies through long disuse. But he could respond appropriately to my questions as to what his mother would have said in Griko in various circumstances, the dialect being close enough to Modern Greek, despite many borrowings and much general influence from various types of Italian, for all this to be understandable to me.
I left with a signed and dedicated copy of his Griko poetry anthology, and a feeling of sadness mixed with elation: elation to have spoken to one of the last native speakers of a language, and recorded a small piece of European history; and sadness that if I go there again, there may be no-one left to record…not even if I go to the next-door village.
Frustrated by a lack of political interest in conservation issues, Dr Adrian Cooper, former Associate Research Fellow in Birkbeck’s Department of Geography, has been developing an innovative and successful approach to community-based conservation in Suffolk.
Felixstowe’s Community Nature Reserve was born out of my frustration with politicians during the 2015 General Election debate. None of them mentioned the catastrophic decline in bee and other wildlife populations. Clearly, local grass roots action was needed.
I started talking and listening to people from local government and the local community about what might be possible, and gathering a small team of volunteers. Most people understood that wildlife populations in Felixstowe were falling, and they wanted to help, but they simply did not know how. It also became clear that getting hold of a single plot of land for any kind of nature reserve project in the Felixstowe area would take too long, and would be too complicated.
Participation in this initiative had to be as simple as possible. First, I re-defined what a nature reserve could be. Instead of it being one area of land, I suggested that local gardeners and allotment owners only had to allocate three square yards of their gardens or allotments for wildlife-friendly plants, ponds and insect lodges, and we could then aim for 1,666 people to take part. That combination would give us a total area of 5,000 square yards – the size of a football pitch.
In this way, we are developing a “community nature reserve” composed of many pieces of private land, but between which insects, birds and other wildlife can fly and develop sustainable biodiversity.
Creating our new nature reserve
With my partner Dawn Holden, I started a Facebook page, on which we advise local people about appropriate wildlife-friendly plants. I also wrote articles for our local advertiser magazines and gave an interview to our local community TV station and BBC Radio Suffolk. We were thrilled with the early take-up of our ideas, and at the time of writing, we know that 207 people have bought and planted at least one of the plants we have recommended. But the good news hasn’t stopped there.
Where are we now?
Thanks to Facebook, we’ve had enquiries from people all over the UK, asking about how we set ourselves up, and how the initiative has developed. BBC presenter Chris Packham found out about us, and his tweets to his 145,000 Twitter followers have produced a small avalanche of enquiries about our work and achievements.
In the Leicestershire villages of Cosby and Burbage, people decided to copy our model to develop their own community nature reserves. So now there is the Cosby Community Nature Reserve, and the Burbage Community Nature Reserve. That’s why I wanted to write this blog – to inspire and help other communities to take responsibility for their local conservation in a way that means everyone can get involved. Even window box owners are encouraged to take part – after all, they can grow herbs, crocuses, snowdrops and much else. So, no one is excluded.
During the first three months of this year, we’ve recruited lots more volunteers and received some wonderful new ideas, such as the organisation of a plant-swap opportunity, to keep the cost of buying and growing wildlife friendly plants as low as possible.
We’ve also started to work alongside Suffolk Wildlife Trust’s community projects officer to help with their grassroots conservation initiatives and to raise our profile. As a result, this month, April 2016, we hope to help the Trust to raise awareness of falling populations of swifts, and what people can do to help. In September, we plan to help the Trust raise awareness of local hedgehog populations.
The most important lesson we can offer groups who may wish to start their own community nature reserve is to listen to as many local people as possible. Be patient. Don’t rush on to Facebook until your local community feels comfortable with what you plan to do.
The next lesson is to keep listening, so fresh ideas from the community can be fed into Facebook and other social media as often as possible. We like to use Streetlife.com because it’s a great way to get discussions going among local people who otherwise might not get involved in community engagement.
Finally, we recommend using as many different types of local media as possible to spread the message. We have used Facebook, Streetlife.com, LinkedIn (including multiple LinkedIn posts), local magazines, our community radio and TV station, BBC Radio Suffolk and Twitter. For more information, have a look at our Facebook page.
On Sunday evening, Boris Johnson, with the zeal of a convert or the scheming of a Machiavellian, decided to join the ‘Outers’. Here are three reasons why it doesn’t matter:
Reason 1: Boris isn’t that popular. Remember, Heineken isn’t that strong. I’m intrigued by the poll in the Evening Standard that claimed ‘he could be a game-changer in the historic vote’ as ‘one in three people regard him as “important” to deciding whether they vote In or Out’. Putting aside exactly what ‘important’ means, the statistics are revealing. 32% of those asked said Boris could be ‘important’, but a full 28% said Theresa May’s and George Osborne’s views were important – only four percentage points behind Boris (and 23%, by the way, identified Stuart Rose as ‘important’ too). So if, as the report claimed, Boris could ‘partly’ cancel out Cameron’s influence, presumably May and Osborne could do the same to Johnson? Boris’ position as ‘the most popular politician’ is often cited, though his reach to UKIP voters is probably rather unnecessary – and it looks like Nicola Sturgeon pipped Boris in the popularity stakes at least once.
Reason 2: Boris doesn’t do arguments. As Janan Ganesh argues in the FT, ‘voters like Mr Johnson. But they like Judi Dench too. Liking someone and deferring to their judgment on a serious question are different things’. As a number of people have argued, what the Leave campaign needs, above all, is a serious alternative vision, equivalent to the Scottish Yes campaign’s positive, mobilising narrative. Boris hangs hilariously from aerial slides but he doesn’t really do ideas or arguments, just quips and ‘mishaps’. Cameron’s speech last night in Parliament was perhaps a taste of the gravitas, clarity and seriousness the Remain campaign will deploy. Judging by his question in Parliament, Boris’ rejoinder will be about ‘sovereignty’, a word on which not even constitutional lawyers agree. And there is no nuance or wriggle room in a vote to leave.
Reason 3: Boris doesn’t do teams and messages. Being the Mayor of London is (or was) the perfect job for Boris, where he can be a maverick and a loose cannon, able to rail against everyone and everything. His record when part of an organised group – in the shadow cabinet, for example – is much less glittering, given his tendency to be rather egocentric or, as one unkind review put it, a gold medal egomaniac. How will he fare as part of an organised group with a message and a ‘line to take’?
Boris cites his great hero Winston Churchill. However, for most of the 1930s Churchill, a similarly gold medal level egotist, entangled himself in a series of failed and doomed campaigns, from the cross-party ‘arms and the covenant’ rearmament initiative (which he almost wrecked), to supporting Edward VIII and a bizarre solo effort to stop Indian independence. Churchill was very much, and very often, on the wrong side of history, and only his later struggle against appeasement saved him.
Last night, Michael Crick quoted an unhappy MP who spoke of another Churchill: Winston Churchill’s father, Randolph. He was also a famous politician, gifted, witty and talked about as a future Prime Minister in the 1880s and 1890s. Randolph had, as Winston wrote of his father, ‘the showman’s knack of drawing public attention to everything he said or did’. Why did his career end? Boris take note – he gambled and took sides against his own party and leader on a fundamental debate in British politics. And lost, never to return.
This post was contributed by Rob Singh, Professor of Politics at Birkbeck. Prof Singh’s new book, ‘After Obama: Renewing American Leadership, Restoring Global Order’ will be published by Cambridge University Press in May.
This article was originally published in Prospect on 16 February.
With the death of Justice Antonin Scalia on 13 February, the United States Supreme Court became a central issue in the raucous 2016 presidential campaign. While President Obama has stated his intent to nominate the next justice, Senate Majority Leader Mitch McConnell has argued that Scalia should not be replaced until after the presidential election — and nominees must be confirmed by the currently Republican-held Senate. These competing claims show how the Court now reflects and reinforces the broader partisan polarisation in Washington.
Justice Antonin Scalia (photo: Steve Petteway, Supreme Court of the United States; public domain, via Wikimedia Commons)
On decisions from gun control to campaign finance, the court over the last decade has pursued an outspokenly conservative agenda. But other key rulings—such as upholding the Affordable Care Act and the right to same sex marriage—have also grievously disappointed traditionalists. With the remaining eight justices now split between four progressives and four conservatives, Scalia’s replacement could potentially reshape constitutional law for years to come.
A man of acerbic wit and often scathing venom, Scalia developed an approach to constitutional interpretation—originalism—that many found coherent and compelling (a whole book, Scalia Dissents, was even dedicated to his disagreements with prevailing opinion). In a democracy, how can a Court legitimately strike down the laws passed by the Congress and signed by the president? Originalism offered a simple solution: rather than consider what the writers of laws, or of particular constitutional clauses, intended the law to mean, judges should instead interpret these in terms of how the text was commonly understood at the time it was adopted. That adherence to the values of others seemed to limit the dangers of judges writing their own views into law. It had the happily convenient benefit, to Scalia, of also yielding reliably conservative policy outcomes. But three problems plagued the path Scalia paved, which he never convincingly resolved.
First, the practical outcomes of Scalia’s philosophy are widely regarded as repugnant to contemporary moral values. Take Maryland v Craig (1990), where the Court upheld a state law allowing a victim of child sex abuse to testify over CCTV rather than in court, in the presence of her abuser. Scalia dissented, arguing that the Sixth Amendment provides that in “all criminal prosecutions the accused shall enjoy the right… to be confronted with the witnesses against him.”
The only things that had changed since 1791, he argued, were society’s “so-called sensitivity to psychic trauma” and the judgment of where the proper balance lay between assuring conviction of child abusers and acquittal of those falsely accused of abuse. At the same time, in supporting states’ rights to enact statutes rooted in “moral disapproval,” Scalia opposed striking down laws criminalising gay sex in 2003. Relying on “tradition” and popular sentiment to thwart progress, he selectively transformed the Bill of Rights from a safeguard against majoritarianism into another expression of it.
But beyond specific rulings was a second, broader problem. Central to Scalia’s judicial philosophy was an inherent contradiction: would the original framers of the Constitution whom he so venerated have prescribed an originalist approach? Compelling evidence suggests otherwise. Not only is the language of the document notoriously ambiguous and vague, deliberately open to competing and evolving interpretations, but the Framers expressly rejected freezing the fledgling republic in the conditions of 1787. Iconic figures such as Thomas Jefferson even expected new generations to rewrite the Constitution anew.
Thirdly, in decisions such as District of Columbia v. Heller (2008), in which Scalia wrote the majority opinion, the Court hardly exemplified a conservative role: for the first time in American history it announced an individualist reading of the Second Amendment, ruling that an individual is entitled to keep a firearm for private purposes, such as self-defence, and that the Amendment does not apply only to the rights of groups such as militias. The result of this ruling was a litigation bonanza centred on exactly which gun regulations offend a citizen’s right to own firearms. But if the US survived more than two centuries without the Second Amendment ever conferring such a right, when did this change, and why?
Originalists used to criticise the Court’s progressive rulings of the 1960s and 1970s, when the liberal Justices exercised “raw judicial power” by “inventing” new constitutional rights that weren’t explicitly in the Constitution. Now, the same charge can be levelled at the conservatives, whose recent embrace of judicial activism often appears less philosophical rationale than political rationalisation.
To be fair, Scalia did frequently abide by his own strictures to act as a judge rather than a legislator, not least on First Amendment cases such as flag desecration, where his reading of free expression trumped his affront at unpatriotic acts such as burning the Stars and Stripes. But it is difficult to disassociate his embrace of originalism from his finding in its cold but confused logic a way to oppose every progressive advance from reproductive rights to affirmative action.
George W Bush declined the opportunity to elevate Scalia to the Chief Justiceship in 2005, but Republican presidential candidates have already solemnly vowed to appoint “another Scalia” to the Court, should they be sworn into office in 2017. The chances of that are increasingly slim. With the Court’s future direction now a key issue in the presidential election, several vulnerable Republican Senators facing uphill battles for re-election in swing states such as Wisconsin and Illinois, and the Grand Old Party likely to seem nakedly partisan in obstructing a new Obama judicial nominee from even coming to a vote, Scalia seems likely to remain a magnet for controversy in death as well as life.
It would be mildly ironic if Scalia’s passing, and controversial legacy, hampered the prospect of a more conservative direction in constitutional law by helping to energise the Democratic Party base and costing the GOP the White House and/or the Senate. And even more ironic that the remainder of this year’s contentious argument over the Court will itself test the proposition of whether a Constitution designed in and for the 18th century is still fit for purpose in the 21st, or more resembles a noble piece of paper housed in the National Archives.
Rob Singh is Professor of Politics at Birkbeck. His new book, ‘After Obama: Renewing American Leadership, Restoring Global Order’ will be published by Cambridge University Press in May. Prof Singh recently appeared on an episode of BBC Radio 4’s The Long View which focused on ‘Donald Trump and the Politics of Celebrity’
Buffoon. Joke. Jerk. Those are just some of the descriptions of the current front-runner for the Republican Party nomination for president of the United States. From his fellow Republicans, that is. Beyond the party, Donald J. Trump has been lambasted as a bigot, misogynist, and racist. Yet none of this has seemingly hampered the popular appeal of his quixotic quest for the White House.
Should we take the Trump phenomenon seriously? The answer is, emphatically, yes. Laugh at or loathe him, Trump has been the Heineken candidate, reaching parts of the electorate no other candidate can reach. And whilst it remains to be seen whether he can translate his support in the polls into votes, Trump already dominates 2016 in singular fashion. There exists no precedent in the modern era for a political novice setting the agenda so consistently that the media focuses in Pavlovian fashion on whatever subjects Trump raises. From stopping illegal immigration through a ‘beautiful’ great wall with Mexico to a moratorium on all Muslims entering the US, no-one has commanded attention like the New Yorker. Moreover, not only have other Republicans felt compelled to follow his lead but even President Obama’s final State of the Union was essentially an extended rejoinder to the Donald.
So, what underlies the success? Anger, authenticity, media savvy, populism, and timing.
An unapologetically redemptive force
First, most Americans think their country is on the wrong track. Among white working class Americans – the core Trump constituency – stagnant wages, real income decline, and loss of a once-dominant status in a nation transforming economically and culturally underlie disillusion. For Americans regarding ‘their’ country as in need of taking back and among those fearing the US is in terminal decline – polarised and gridlocked at home, discounted and challenged for primacy abroad – Trump represents an unapologetically redemptive force: a visceral, primal scream from the heart of white American nationalism.
Second, Americans broadly view their government as ineffective and their political system as corrupt. Running for Washington by running against it, on a platform of incoherent but potently opaque policy positions, Trump embodies the outsider like no one else for those wanting to change Washington. Moreover, uniquely, his personal fortune insulates him from charges that he can be ‘bought’ by vested interests. When Trump talks about knowing how to work the system as a businessman, he is credible. Add to that an outspoken willingness to speak directly, bluntly and without fear of causing offence, and millions of Americans view the Donald as a truth teller. Like businessmen in politics before him, Trump promises that what he did for himself he can do for America, and that ordinary Americans will once more partake of the increasingly elusive American Dream.
Social media mogul
Third, Trump has exploited his formidable media knowledge with astonishing shrewdness. Outrageous statements, outlandish claims and telling personal insults – seemingly spontaneous but carefully pre-planned and road-tested – compel ratings. Social media abets the creation of an alternative reality and echo chamber from which the distrusted mainstream media are excluded. Disintermediation – cutting out the middle man – compounds Trump’s celebrity status to forge what his 5 million Twitter supporters perceive as a personal link to their politically incorrect champion.
Fourth, Trump – for whom id, not ideology, is all – upends conservative orthodoxy. A New York native who was for most of his life pro-choice on abortion, pro-gun control and a donor to Democrats, Trump is no staid Mitt Romney. In rejecting free trade deals and ‘stoopid’ Middle East wars, pledging to make allies from Saudi Arabia to South Korea pay for US protection, committing to punitive taxes on Wall Street and preserving entitlement programmes for the average Joe, Trump’s anti-elitism is scrambling a party establishment fearful of an anti-government populism it unleashed but cannot control.
Finally, if Obama won the presidency in 2008 as the ‘un-Bush’, what more vivid an antithesis to the current lame duck could be imagined than Trump? After seven years of the most polarising presidency since Richard Nixon, Trump promises to restore the art of the deal – something the US Constitution mandates for successful governing, and AWOL since 2009 – at home and abroad alike.
Can Trump triumph?
Can Trump prevail in the Republican demolition derby? The odds are still against him. After all, most Republicans do not support him and he has been first in national polls in large part because the ‘establishment’ vote has been so fragmented among Marco Rubio, Jeb Bush, John Kasich and Chris Christie. But if Trump can win or come second to Ted Cruz in the Iowa caucus, and then top the New Hampshire and South Carolina polls, the prospects of him securing the nomination are 50-50 at worst. By the time of the Republican Party convention in Cleveland, Ohio in July, if not well in advance, no one may be laughing other than the Donald.
As with almost everything about David Bowie, no one is sure exactly what his politics were. The Mirror claims he turned down an OBE and a knighthood in the 2000s. In 1977 he is quoted as saying ‘the more I travel and the less sure I am about exactly which political philosophies are commendable’. Nevertheless, many have seen ways in which Bowie’s career could provide lessons for how we do politics.
But Bowie was not apolitical. In the 1970s Bowie challenged entrenched gender and sexuality stereotypes at a time when few would. Jarvis Cocker has said how Bowie sent out the message that it was OK to be different, while the Mirror speaks of how the singer’s ‘radical, gender-busting personas turned traditional conservative views upside down and widened what was acceptable in society’. He also wrote about the world around him, describing events from the space race to divided Berlin (the German Foreign Ministry today publicly thanked him for helping to bring down the Berlin Wall).
At the same time, his championing of different cultures pushed all sorts of new ideas into society – look over his top 100 books, covering everything from a memoir of Stalin’s Gulags to Viz magazine. He popularised a whole kaleidoscope of new sounds and visions to new audiences, from German electronic music to soul, while also experimenting with what people insist on calling ‘world music’. And his message reached a huge, diverse number of people.
In this way, David Bowie was a very political animal, in the same way that Elvis Presley or the Beatles were. None of them urged ‘revolution’ or told people how to vote. Elvis was rather conservative, John Lennon asked to be counted ‘out’ of the revolution (or maybe ‘in’ – he wasn’t sure) and David Bowie was too wide-ranging or elliptical to join any one party. But like these other musical legends, in challenging convention, the Man Who Fell to Earth tore down barriers and opened up new worlds. David Bowie made people think differently about the world around them. And that is very political.
When will the historians of twentieth-century Europe accept that their century has ended? The violent attacks in Paris on the night of 13 November serve to confirm what we should already have known: that the populations of Europe have moved on from the politics of the twentieth century, and it is time for the historians to do so too.
Read the original article on the Reluctant Internationalists blog
Of course, in the aftermath of traumatic events, historians delve rapidly into their store-cupboard of analogies and precedents. And there are many which can be drawn upon for such purposes. Violence by small militant groups composed predominantly of immigrants from specific ethnic backgrounds has, after all, a considerable lineage in twentieth-century Europe. The various revolutionary and counter-revolutionary movements that proliferated in the former territories of the Habsburg and Tsarist empires at the end of the First World War, the militant Jewish Communist groups who played such a role in the anti-fascist movements and the wartime Resistance groups in the 1930s and 1940s, and the FLN militants of Algerian origin who were active in France in the 1950s and 1960s, are all examples of how political violence has often been generated in Europe by marginalized ethnic and religious minorities, who derived their legitimation from the perceived repression by state authorities.
And yet none of these models really has much purchase for understanding the various incidents which, from the train bombings of Madrid in 2004 to the events in Paris, have become part of Europe’s contemporary present. In part, of course, this is because European history is no longer, if it ever was, self-contained: this violence draws its inspiration from elsewhere, and from different histories. But there is also a broader and more disconcerting reality. The radicalized militants who have generated this violence feel no affinity with these precedents. Indeed, one suspects that they know little or nothing (and care even less) about Europe’s past.
This is a cause for some modesty on the part of historians. We inhabit a present which owes little to “our” past. The twentieth-century history of Europe has come to an end. Everybody can choose their terminus date of preference, be it the reunification of Europe after 1989, the impact of the neo-liberal reforms of the 1990s, or the attacks on the Twin Towers on 9/11 and their subsequent imitators in Europe. But, wherever you choose to stick the frontier post between past and present, it is impossible to ignore the sense that European history has not so much ended as turned into a new configuration. For contemporary historians, to misquote L.P. Hartley, the present is another country, and they do things differently there.
Quite why that should be so is a question which probably demands an answer on a rather grand scale. But the more immediate challenge for historians of Europe is to develop frameworks for understanding the evolutions of the present, which are more relevant than reworkings of our all-too-familiar stories of the crises of the 1930s and 1940s. The history of the twenty-first century has to start somewhere, and the events of the last year have given us plenty of raw material to work from. War in Ukraine, the rise of new populist forces of right and left (or both), the demands for revision of national sovereignty, the arrival of large numbers of migrants fleeing war and economic deprivation, and the impact of new forms of political violence constitute a formidable agenda which demands a response more substantial than the overused language of crisis.
27 October 2015, Migrants are led by German Federal Police to an emergency accommodation centre in Wegscheid, southern Germany (Armin Weigel/ dpa via AP)
Crisis is of course a term that historians conventionally deploy to describe the demise of the old and the difficult birth of the new. The first is certainly highly visible in present events, as manifested by the collapse of a certain way of managing Europe, as well as the retreat of pre-existing political elites in the face of economic pressures and the demands of angry and exasperated voters. Of course, they will not go quietly. The logics of austerity economics and of national security justified by the supposed internal and external threats to European populations provide plenty of means for state authorities to seek to impose their discipline on their populations. But state authority is not what it used to be. One of the more tangible consequences of the last twenty years has been the hollowing out of much of the former trappings of state power and of national politics. In an era when communication has become primarily electronic, and national borders have become largely notional, state authority no longer has the same centrality in the history of twenty-first century Europe.
Part of the challenge of a history of the present is therefore to appreciate, if not fully to understand, the fluidity of boundaries of any kind. We inhabit a new cosmopolitanism, as reflected in the global character of many of Europe’s major cities, but also in the flexibility of identities, be they national, political, ethnic, or indeed religious. Journalists investigating the backgrounds of the authors of the Paris attacks have appeared surprised to discover that they were products of the banlieues of Paris and of Strasbourg, who amidst the chaotic years of their early adulthood travelled without any great sense of purpose to Syria, from where they returned equipped with a cocktail of animus, bravado and perhaps a superficial understanding of some elements of Islam. And yet that surely is what one would expect: militants are made not born, and the manner of their making well illustrates the fluidity of identities among those many Europeans whose lives have been rendered fragile by economic changes, the dislocation of social structures, and the retreat of structures of state provision.
In order to understand this, the most appropriate template is not the twentieth century, with its explosion in state power and totalizing ideological visions, but its predecessor. Looking at Europe’s present-day cities, one cannot but be reminded of the chaotic immigrant cities of Europe in the nineteenth century, and their worlds of neighbourhoods, ethnic self-help structures, and an almost total absence of state authority. Zola, it seems, has never been so topical; but other aspects of Europe’s present-day history seem also to recall the Europe of the mid-nineteenth century. The impact of vast economic forces beyond the control of any public authority, the pressure of migrant masses on a pre-existing population, and sudden surges of political support for charismatic individuals or for rhetorics of national liberation (and of xenophobia) seem much more akin to the Europe of the 1840s and the 1850s than to the Europe of Adenauer, de Gaulle, Thatcher, Kohl and Mitterrand.
However, to replace one set of analogies with another borrowed from the previous century is not sufficient. A history of Europe’s twenty-first century has to identify the building blocks of the new. Some elements of this are incontrovertible: the new precariousness of living standards caused by economic change and untrammelled market forces, and the consequent replacement of the disciplined interaction of socio-economic interest groups by a new and much more volatile politics of economic opportunity and grievance. But other elements appear much less clear-cut. Is Europe moving left or right? Will the migrants of 2015 be integrated into a new and more multi-cultural Central Europe, or will they provoke a descent into forms of ethnic essentialism?
Above all, where, in the end, will state authority be discovered to reside? One of the most striking features of Europe since the final decades of the twentieth century has been the demise of those hierarchical organizational charts of government which used to characterise political-science textbooks. Power is now more dispersed and also more opaque, shared between a plethora of regional, national and supra-national institutions, but also secreted away in institutions such as central banks and security structures that are impervious to democratic control or even public scrutiny. None of that means that we are about to experience new forms of authoritarianism; the populations of Europe have, one suspects, moved beyond the stage when they would submit to the disciplines of states of emergency and military coups. Moreover, for all of the seriousness with which leaders have gathered to consider Europe’s overlapping current crises, one of the most striking features of their discussions has been the relative absence of effective tools of power. Military force – other than the spectacular acts of aerial bombing in Libya, Iraq and Syria – has almost disappeared; national economic policy-making has been transferred to central banks and the power of the markets; and even the routine ability to keep track of the movements of populations appears to have been largely eroded.
From the streets of Molenbeek to the beaches of Lesbos, it is the limits of the capacity of the state which has been more apparent than its strength. Perhaps that presages a new 1848, but more significant is the way that the state has lost, or surrendered, its twentieth-century role as the grand manager of European life. What will replace it forms part of the still uncertain nature of the history of the European present.
The Reluctant Internationalists project explores the history of international collaboration and the ambitions of medical professionals, politicians, generals, diplomats and policy-makers in twentieth-century Europe. This four-year project, funded by Birkbeck researcher Dr Jessica Reinisch’s Wellcome Trust Investigator Award, examines the origins, consequences and lasting legacies of such policies.
It seems that we now have a terrible enemy who cannot be named – or rather whose naming causes a major headache. For many months now, on hearing the term ‘Isis’ we have not thought about a certain Egyptian goddess, or about the river that flows through Oxford. The name is now indelibly associated with one of the most evil organisations of modern times, which adopted the acronym of ‘Islamic State in Iraq and Syria’ (or sometimes ‘Islamic State in Iraq and the Levant’, ISIL).
Because of fears that calling the organisation by this name may legitimise it, some politicians and other public figures in the West have been calling it ‘so-called’ Islamic State, using the verbal equivalent of the two-fingered “scare quotes” gesture which has become part of our gestural vocabulary. There is also a move to call it ‘Daesh’ instead, another acronym which the organisation itself is said not to like.
This is how we in the West shake our tiny fists at the evil monster, even though, to paraphrase Shakespeare, a terrorist by any other name would smell as rank. Ah, the power of names! How can a simple label mean so much? And how do names acquire their symbolic power?
The power of names
The power of names is a recurrent theme in religions, myths and fairy tales. For example, how to name God is an important question in many religions. In Exodus 3:13-15, Moses asks God who he should say has sent him, and God replies ‘I am who I am’. It is as if God himself did not want to share an actual name with humans, because knowing someone’s name gives you a certain power over them.
In the Odyssey, after Odysseus has got the Cyclops drunk, he tells the monster that his name is ‘Nobody’. When Odysseus later blinds the Cyclops in order to escape his clutches, the monster’s fellow Cyclopes come to their brother’s rescue, but when they ask who has hurt him, the Cyclops replies ‘Nobody’, and the other Cyclopes understandably lose interest and shuffle off.
Names are important also in traditional fairy tales. Those brought up on Grimm’s tales will remember Rumpelstiltskin, the evil dwarf who loses his power to harm people if they guess his name. And then of course there is Harry Potter, and the villain Voldemort, who cannot be named for fear of conjuring him up. He is known throughout as ‘He Who Must Not Be Named’.
Marking changes in style, identity and allegiance
But back to ‘so-called’ Islamic State. Radicalised European or American Jihadist fighters change their names from their bland-sounding European ones to Arab ones that make them sound like the holy fighters they profess to be, the change of names marking a clear change in identity. If they return and are deradicalised, their names change back too. Many of us with less sinister motives make minor or greater changes to our names to mark changes in style, identity or allegiance.
We encourage or discourage nicknames and abbreviations at different stages in our life, reflecting how we wish to be seen – remember for example when Kate became Catherine? When we marry, we may change – or resolutely not change – our surnames – if of course this is allowed or encouraged in our culture (in some countries, such as Iceland, the issue does not arise and names are not enmeshed in a patriarchal system).
In Britain we might use our second name rather than our first; but this would not work in a country such as Russia where the second name is invariably a patronymic, i.e. your father’s first name, regardless of your gender. You may even be known by different names in different places – such as Jack in town and Ernest in the country.
Magic? Perhaps not. But certainly another facet of the power of words.
In this post, Dr Shin considers the agreement reached at the Paris climate conference last week, and points to a longer history of tensions between international and national attempts to control energy.
After intensive and tough negotiations, the COP 21 climate conference in Paris finally reached an agreement on 12 December 2015. Political leaders hailed the agreement as a historic turning point, an agreement in which the global community now shares the recognition that climate change is a threat to human existence, an enormous challenge that has to be tackled with an internationally coordinated system. The Paris pact aims to keep global temperature rise well below 2.0°C, while officially acknowledging the more ambitious target of 1.5°C. It also envisions a world with zero carbon emissions in the latter half of this century. The new global climate deal is to be implemented through reviews of individual countries’ performance every five years – measured against their voluntary pledges (Intended Nationally Determined Contributions, or INDCs) – and by intensifying efforts to mitigate climate change over the coming years.
As much as it is a landmark event in terms of the world coming to embrace a common goal, we need to see the Paris agreement as a headlight illuminating a rough road ahead of us. Furthermore, the agreement’s reference to the latter part of this century might make it sound as though we have plenty of time, but this is not the case. Looking at how things have changed in the past shows that, when dealing with a global problem, decades are a very short unit of time.
Half a century ago, in 1965, energy was already an international topic, though not for its environmental implications. The world was only beginning to realise, mostly at the national level, the environmental harm caused by human activity. It was still a burgeoning recognition expressed, for example, in the US President’s Science Advisory Committee report Restoring the Quality of Our Environment (1965). The report opened with the statement, ‘Ours is a nation of affluence’, but 1965 was a year when affluence and scarcity formed a curious mix. One of its manifestations was the American Northeast blackout in November 1965, evidence that energy affluence came with disruptions, shortages and the fear of losing power.
The blackout, which lasted for more than ten hours on a Tuesday evening, affected over thirty million people in a country that had come to depend for the major part of its normal life on electrical power. What is interesting is not just how Americans experienced and responded to the sudden deprivation of electricity, but also how outsiders saw the event. On the other side of the Pacific Ocean, in Japan, which still regarded the USA as the model for its economic development and standard of living, news of the Northeast blackout was received with a mixture of surprise and admiration. An article described it as an event caused by ‘a blind spot of the hypermodern city’.[i] The mechanised cities that came to a halt during the power outage were seen as proof of the extent to which energy-using technology penetrated into American life. The blackout was therefore seen as a sign of affluence, the level of energy civilisation the Japanese aspired to achieve.
The British, less impressed by the American hypermodern, asked themselves the question: ‘Could it happen here?’ The answer of the UK Central Electricity Generating Board (CEGB) was that, thanks to Britain’s integrated power supply system, it was ‘impossible to visualise a similar situation arising in this country’.[ii] Despite the CEGB’s confident claim, there were a number of blackouts in the UK at the time. Less than a week after the American blackout, on 15 November 1965, a power outage occurred, affecting London, the Home Counties, Birmingham, Leeds, Nottingham, Derby, Chesterfield and most of East Anglia. The CEGB blamed the exceptionally cold weather that coincided with the operation to overhaul much of its equipment.[iii] The Guardian’s chief editor, Mark Arnold-Forster, exonerated the CEGB by pointing out that the real culprits were ‘millions of customers [who] felt cold at once and switched on direct-acting electric fires’.[iv] Indeed, a sudden imbalance of supply and demand had been the cause of a number of blackouts since the 1940s, including a power outage on Christmas Day 1962.[v]
The Guardian, 27 December 1962.
Energy experts in 1960s Europe were well aware that demand was not the obedient follower of supply. In the spring of 1965, the Committee on Electric Power of the UN Economic Commission for Europe held a symposium in Istanbul to consider the challenge of meeting the rapidly growing electricity demand.[vi] The attendance of 215 representatives from twenty-one countries demonstrated that those countries were facing a similar problem. However, although the problem was shared, their approaches differed. While the USSR delegate referred to a central committee to allocate power to different classes of consumer in times of emergency, the UK delegate – ever so inclined to soft persuasion – presented a paper on how to control load growth using consumer advisory services. While the problem was discussed at an international forum, the solution was sought at the national level. What could be described as the common recognition then was that the problem caused by power demand ‘does not arise only today, but exists at all times’, as expressed by the Turkish chairman.[vii] This amounted to an admission that there would be no future in which everyone’s need for power is fully satisfied at all times.
In July the same year, in Bangkok, another international meeting was held by the UN Economic Commission for Asia and the Far East (ECAFE, later to become ESCAP). The focus of ECAFE’s working group meeting was the development of energy resources – particularly electric power – and how to exploit them for industrialising the ECAFE region that included major developing countries such as China, India, Japan, South Korea, the Philippines, Vietnam and Iran.[viii] Some of the ECAFE countries were already experiencing the demand problem discussed by European countries in Istanbul. The demand for power was constantly outstripping supply in developing countries too, but this was simply put down to insufficient capacity. What is striking about the Bangkok conference is how easily the longing for more energy could overshadow other important issues such as balancing supply capacity and demand. Another topic that received only a passing reference was the depletion of fossil fuels, even though the Asian energy experts must have been familiar with King Hubbert’s ‘peak oil’ theory, first presented in 1956, which warned that US oil production would reach its peak around 1965–1970.
ECAFE Electric Power Experts’ Tour in India, 1956. United Nations Photo
The two meetings in Istanbul and Bangkok demonstrate that different priorities had a profound effect on which problems were selected for attention and which were obscured. Fifty years on from the two UN meetings, environmental issues are now centre stage at international forums where developed and developing countries participate equally in negotiations. In this way, there have been major changes. Nevertheless, there are uncanny parallels between the situation today and that of 1965. Energy disruption still haunts developed and developing countries; power outages arising from technical problems, human errors and natural disasters abound, and the renewable transition has now been added to the list of disruption causes. For instance, the UK’s attempt to abandon all of its coal-fired plants has narrowed the electricity supply margin available to satisfy the nation’s demand. In early November 2015, the National Grid had to appeal to business users to reduce energy consumption to avert a wide-scale disconnection. In Germany, a record number of consumers were disconnected because they could not pay their electricity bills, which had been inflated due to added subsidies for renewable energy. Blackouts have yet to be eliminated in developed countries; they are still alive and kicking. In developing countries, including those that have already achieved significant levels of development, the appetite for energy is unabated. More than anything, the belief that greater energy use leads to greater economic growth remains so strong that it is obscuring other important issues and sacrificing the global environment for future generations.
Read the original post on The Reluctant Internationalists project site
A brief look at the events of 1965 and 2015 tells us that fifty years has turned out to be far from sufficient to balance our needs and desires for energy with the resources and capacity we have. Over the same period, we have failed to manage our power demands, which has led to severe damage to the global environment. With the coming of a more rigorous emissions control regime, the problem of managing energy demands will become more acute. In addition, as the Paris meeting highlighted, the fundamental divide between energy haves and have-nots has changed very little in the past fifty years, and this is the situation we have to deal with in the coming decades. Aligning our goals is one thing, aligning our acts and deeds is another, and the latter is usually more difficult. Against the scale of these challenges, several decades amount to but a short space of time. This means that we must equip ourselves with ever-increasing determination and will in order to press through the long and rough terrain in the decades to come.
[viii]Economic Commission for Asia and the Far East,The Role and Application of Electric Power in the Industrialization of Asia and the Far East(UN, 1965). A recent review of the ECAFE’s early history is Ikuto Yamaguchi, ‘The Development and Activities of the Economic Commission for Asia and the Far East (ECAFE), 1947–65’, in S. Akita, G. Krozewski and S. Watanabe (eds.),The Transformation of the International Order of Asia(Routledge, 2014).
Dr Hiroki Shin is Co-Investigator of the AHRC-funded ‘Material Cultures of Energy’ project (PI: Prof Frank Trentmann), based at Birkbeck College, University of London. www.bbk.ac.uk/mce