Dickensian delight: Our ‘serial’ fascination with the afterlife of Charles’s characters

This post was contributed by Dr Ben Winyard, digital publications officer at Birkbeck, University of London. Dr Winyard has been a co-organiser of Birkbeck’s Dickens Day event since 2005, and is one of the organisers behind the current Dickens reading project at the College.

BBC drama Dickensian (image copyright Premier)

Is there a word for that familiar feeling of sadness or melancholia that accompanies finishing a novel? Perhaps there is a ferociously lengthy compound noun in German, or an elegant Japanese word with multiple, elusive meanings that can’t be fully encompassed by a solitary English word. A quick, unscientific search on Google reveals fascinating discussions on sites such as Reddit about this emotional state and the various words that might describe it: sadness, ennui, nostalgia, regret, catharsis, homesickness, mourning, separation anxiety, and the delightful but somewhat toxic-sounding ‘book hangover’.

Another suggestion is the term ‘limerence’, used in psychoanalytic theory to describe an invasive sexual and emotional obsession with a person or object – an infatuation or crush, in more demotic idiom. This feels a little too histrionic for such a quiet, fleeting but pervasive feeling. Imagining a curiously powerful parental bond, Dickens described David Copperfield as his ‘favourite child’, which suggests the intense feelings of attachment readers (and writers) can develop for fictional creations.

Do Dickens’s books deliver more noxious ‘book hangovers’? After all, Dickens was keen throughout his writing career to evoke feeling in his readers, meaning that his forceful use of sentimentality and melodrama, to induce laughter and tears in rapid succession (what Dickens described as his ‘streaky bacon’ approach), might be regarded as a particularly heady and intoxicating form of emotional pummelling. Dickens’s work provokes powerful feelings and his readers are famous for their attachment to the author and to his works. Dickens’s sentimental mode entices, coaxes and even coerces us to be affected by its depictions; it is a form of aesthetic and imaginative self-projection. Indeed, the shared, collective experience of feeling is what often brings us together as a community of Dickens enthusiasts.

It is also worth remembering that Dickens’s original readers would have encountered multiple hiatuses, as they read the novels serially in weekly or monthly instalments, which may have provoked feelings of frustration, anticipation, excitement and longing. A novel’s plot may be thrillingly propulsive, providing a forward momentum that, when halted, generates an exasperated thirst to traverse the ‘empty’ space in-between as quickly as possible. These manifold mini or false endings – which sometimes took the form of cliffhangers, but were more often simply breaks in the narrative – are similar to the final ending of the novel, in that they represent spaces that evoke fantasy and speculation. Just as the serialised instalments represent only temporary cessations that are potentially bridged by longing-filled fantasy, the end of a Dickens novel may similarly rouse imaginative speculation and fancy about the afterlives of the characters.

A scene from BBC drama Dickensian, featuring Stephen Rea in the role of Inspector Bucket (image copyright Premier)

Adaptation, reimagining, pastiche and outright bootlegging

Dickens was himself no stranger to this phenomenon. The exceptional success of his first novel, The Pickwick Papers, serialised in 1836–37, stimulated a veritable industry of adaptations, pastiches, rip-offs and continuations. One of the most famous was Pickwick Abroad; or, The Tour in France (1837–38) by George W. M. Reynolds, a hugely successful radical journalist whom Dickens intensely disliked. In this decidedly rough-and-ready sequel, the Pickwick Club ventures into France, where crass national stereotypes and risqué adventures abound. In an era before international copyright law – for which he campaigned vociferously – Dickens witnessed the multiple imaginative afterlives of his stories and characters on stage and in unlicensed prequels and sequels.

There was also a bustling trade in Dickensian souvenirs featuring his characters, including illustrations, porcelain figures, china plates, Toby jugs, keepsake boxes, and miscellaneous other household items and collectibles. Interestingly, in his short-lived journal Master Humphrey’s Clock, which he wrote and edited alone between 1840 and 1841, Dickens acknowledged and indulged readers’ desire for afterlives and new adventures for their favourite characters by reintroducing the hugely popular Mr Pickwick and Samuel Weller. We can also sense in Dickens himself the irresistible urge to resurrect characters he evidently longed to spend time with again – just as many of his readers did.

A digital Dickens afterlife

More recently, Birkbeck’s inventive and successful Twitter retelling of Our Mutual Friend, which saw dozens of people tweet as characters in this multitudinous novel, provided an outlet for Dickens readers to reengage with, and extend the afterlives of, their favourite characters. Many tweeters were unafraid to present their characters in decidedly modern, updated terms, which meant that, while the novel’s plot remained essentially the same, many characters took on new aspects, had new adventures and relationships, and occupied more imaginative space than in the original work.

At the most recent Dickens Day (October 2015), Professor Holly Furneaux, an alumna of Birkbeck who is now based at Cardiff University, delivered a fascinating paper on Dickensian fan fiction online, which is forging communities and providing avenues for original, and even erotic, (re)engagements with popular Dickensian characters. Furneaux demonstrated the particular popularity of the triangular relationship between Mortimer Lightwood, Eugene Wrayburn and Lizzie Hexam in Our Mutual Friend, which many online Dickens fan fiction writers reimagined more capaciously, with space within Eugene and Lizzie’s marriage for Mortimer.

Dickensian – Goading the stuffy old gatekeepers

Given the powerful attachment of Dickens’s readers to his works and the long history of adaptation, reimagining, pastiche and outright bootlegging of Dickens’s work, Tony Jordan’s Dickensian feels less of an oddity or a provocation than it may first appear. In this twenty-part TV serial, we enter a fantasy Victorian world in which many of Dickens’s characters, from several of the novels, coexist. Thus, Inspector Bucket (Bleak House, 1852–53) is investigating the demise of Jacob Marley (A Christmas Carol, 1843), with the forensic assistance of Mr Venus (Our Mutual Friend, 1864–65). Accompanying these fantastic mash-ups is Jordan’s reimagining of backstories and subplots in the novels; thus, Honoria Barbary is embarking upon a relationship with Captain James Hawdon that readers of Bleak House know is doomed. And in a delightful nod to the queer affiliations that Professor Furneaux has observed in online fan fiction and other literary and non-literary sources, Arthur Havisham is hopelessly in love with Meriwether Compeyson, the dastardly seducer he has appointed to marry and defraud his sister, Amelia (who will become the bitter, decrepit Miss Havisham of Great Expectations, 1860–61). The first episode of Dickensian thus presented a delightful, uncanny guessing game, as Dickens fans scrambled to identify all of the characters and connect them to their extant stories within the novels.

Jordan himself is keen to present his work as goading the stuffy old gatekeepers of Dickens’s legacy, irreverently insisting in an interview with the Daily Telegraph that ‘he knew the changes risked “p––ing off the Dickens community”’. In actuality, Jordan’s work taps into a rich seam of readerly fantasy about Dickens’s imaginative worlds that has been amply mined by authors, playwrights, filmmakers and TV executives. Furthermore, the intense relationship between Dickens and his readers, and the love and affection readers have felt, and continue to feel, for Dickens, his fictional world and the people who inhabit it, have all been objects of intense scholarly scrutiny and analysis.

Charles Dickens – By Jeremiah Gurney (Heritage Auction Gallery) [Public domain], via Wikimedia Commons

It is a truism, inevitably encountered by all Dickens fans and scholars, that ‘were Dickens alive today, he would write for a soap opera’. Jordan, who famously wrote for EastEnders (and, less famously, for the much-slated ’90s camp classic Eldorado), has been keen to emphasise that Dickens’s serialised fiction prefigured the episodic melodrama of the contemporary soap opera. In a recent interview with The Big Issue, Jordan observes that,

“People can be far too reverential. We mustn’t forget that Dickens was a populist writer who wrote for the masses. He wrote episodically, trying to flog magazines every month. He was sensationalist and did cliffhangers way before soap operas. He and Wilkie Collins said their secret was to make them cry, make them laugh and make them wait. That is everything I did in my EastEnders career. It is that deferment of gratification.”

The originality of Dickensian lies in its audacious bringing together of numerous characters at once, but one of the impulses that has powered this – the desire to make characters live again and to imagine them into new aspects, new stories and new worlds – accompanied Dickens’s fiction from its earliest moments. In the spaces in-between and after instalments, we find that readers’ emotional engagement gives birth to new and alternative stories, trajectories and lives, all of which demonstrate the enduring power of Dickens’s fiction.


Justice Scalia wasn’t just immoral—he was philosophically confused

This post was contributed by Rob Singh, Professor of Politics at Birkbeck. Prof Singh’s new book, ‘After Obama: Renewing American Leadership, Restoring Global Order’ will be published by Cambridge University Press in May.

This article was originally published in Prospect on 16 February.

With the death of Justice Antonin Scalia on 13th February, the United States Supreme Court became a central issue in the raucous 2016 presidential campaign. While President Obama has stated his intent to nominate the next justice, Senate Majority Leader Mitch McConnell has argued that Scalia should not be replaced until after the presidential election — and nominees must be confirmed by the currently Republican-held Senate. These competing claims show how the Court now reflects and reinforces the broader partisan polarisation in Washington.

Justice Antonin Scalia (By Steve Petteway, photographer, Supreme Court of the United States[1] (See [2]) [Public domain], via Wikimedia Commons)

In decisions ranging from gun control to campaign finance, the Court over the last decade has pursued an outspokenly conservative agenda. But other key rulings—such as upholding the Affordable Care Act and the right to same-sex marriage—have also grievously disappointed traditionalists. With the remaining eight justices now split between four progressives and four conservatives, Scalia’s replacement could potentially reshape constitutional law for years to come.

A man of acerbic wit and often scathing venom, Scalia developed an approach to constitutional interpretation—originalism—that many found coherent and compelling (a whole book, Scalia Dissents, was even dedicated to his disagreements with prevailing opinion). In a democracy, how can a Court legitimately strike down the laws passed by the Congress and signed by the president? Originalism offered a simple solution: rather than consider what the writers of laws, or of particular constitutional clauses, intended the law to mean, judges should instead interpret these in terms of how the text was commonly understood at the time it was adopted. That adherence to the values of others seemed to limit the dangers of judges writing their own views into law. It had the happily convenient benefit, to Scalia, of also yielding reliably conservative policy outcomes. But three problems plagued the path Scalia paved, which he never convincingly resolved.

First, the practical outcomes of Scalia’s philosophy are widely regarded as repugnant to contemporary moral values. Take Maryland v Craig (1990), where the Court upheld a state law allowing a victim of child sex abuse to testify over CCTV rather than in court, in the presence of her abuser. Scalia dissented, arguing that the Sixth Amendment provides that in “all criminal prosecutions the accused shall enjoy the right… to be confronted with the witnesses against him.”

The only things that had changed since 1791, he argued, were society’s “so-called sensitivity to psychic trauma” and the judgment of where the proper balance lay between assuring conviction of child abusers and acquittal of those falsely accused of abuse. At the same time, in supporting states’ rights to enact statutes rooted in “moral disapproval,” Scalia opposed striking down laws criminalising gay sex in 2003. Relying on “tradition” and popular sentiment to thwart progress, he selectively transformed the Bill of Rights from a safeguard against majoritarianism into another expression of it.

But beyond specific rulings was a second, broader problem. Central to Scalia’s judicial philosophy was an inherent contradiction: would the original framers of the Constitution whom he so venerated have prescribed an originalist approach? Compelling evidence suggests otherwise. Not only is the language of the document notoriously ambiguous and vague, deliberately open to competing and evolving interpretations, but the Framers expressly rejected freezing the fledgling republic in the conditions of 1787. Iconic figures such as Thomas Jefferson even expected new generations to rewrite the Constitution anew.

Thirdly, in decisions such as District of Columbia v. Heller (2008), for which Scalia wrote the majority opinion, the Court hardly exemplified a conservative role: for the first time in American history, it announced an individualist reading of the Second Amendment, ruling that an individual is entitled to keep a firearm for private purposes, such as self-defence, and that the Amendment does not apply only to groups such as militias. The result of this ruling was a litigation bonanza centred on exactly which gun regulations offend a citizen’s right to own firearms. But if the US survived more than two centuries without the Second Amendment ever conferring such a right, when did this change, and why?

Originalists used to criticise the Court’s progressive rulings of the 1960s and 1970s, when the liberal Justices exercised “raw judicial power” by “inventing” new constitutional rights that weren’t explicitly in the Constitution. Now, the same charge can be levelled at the conservatives, whose recent embrace of judicial activism often appears less philosophical rationale than political rationalisation.

Read the original article in Prospect

To be fair, Scalia did frequently abide by his own strictures to act as a judge rather than a legislator, not least on First Amendment cases such as flag desecration, where his reading of free expression trumped his affront at unpatriotic acts such as burning the Stars and Stripes. But it is difficult to disassociate his embrace of originalism from his finding in its cold but confused logic a way to oppose every progressive advance from reproductive rights to affirmative action.

George W Bush declined the opportunity to elevate Scalia to the Chief Justiceship in 2005, but Republican presidential candidates have already solemnly avowed to appoint “another Scalia” to the Court, should they be sworn into office in 2017. The chances of that are increasingly slim. With the Court’s future direction now a key issue in the presidential election, several vulnerable Republican Senators facing uphill battles for re-election in swing states such as Wisconsin and Illinois, and the Grand Old Party likely to seem nakedly partisan in obstructing a new Obama judicial nominee from even coming to a vote, Scalia seems likely to remain a magnet for controversy in death as well as life.

It would be mildly ironic if Scalia’s passing and controversial legacy were to hamper the prospect of a more conservative direction in constitutional law by helping to energise the Democratic Party base and costing the GOP the White House and/or the Senate. And it is even more ironic that the remainder of this year’s contentious argument over the Court will itself test the proposition of whether a Constitution designed in and for the 18th century is still fit for purpose in the 21st, or more resembles a noble piece of paper housed in the National Archives.


The Presidential Apprentice? Taking Trump Seriously

Rob Singh is Professor of Politics at Birkbeck. His new book, ‘After Obama: Renewing American Leadership, Restoring Global Order’, will be published by Cambridge University Press in May. Prof Singh recently appeared on an episode of BBC Radio 4’s The Long View which focused on ‘Donald Trump and the Politics of Celebrity’.

Donald Trump at the Citizens United Freedom Summit in Greenville, South Carolina, May 2015 (photo by Michael Vadon)

Buffoon. Joke. Jerk. Those are just some of the descriptions of the current front-runner for the Republican Party nomination for president of the United States. From his fellow Republicans, that is. Beyond the party, Donald J. Trump has been lambasted as a bigot, misogynist, and racist. Yet none of this has seemingly hampered the popular appeal of his quixotic quest for the White House.

Should we take the Trump phenomenon seriously? The answer is, emphatically, yes. Laugh at or loathe him, Trump has been the Heineken candidate, reaching parts of the electorate no other candidate can reach. And whilst it remains to be seen whether he can translate his support in the polls into votes, Trump already dominates 2016 in singular fashion. There exists no precedent in the modern era for a political novice setting the agenda so consistently that the media focuses in Pavlovian fashion on whatever subjects Trump raises. From stopping illegal immigration through a ‘beautiful’ great wall with Mexico to a moratorium on all Muslims entering the US, no-one has commanded attention like the New Yorker. Moreover, not only have other Republicans felt compelled to follow his lead but even President Obama’s final State of the Union was essentially an extended rejoinder to the Donald.

So, what underlies the success? Anger, authenticity, media savvy, populism, and timing.

An unapologetically redemptive force

First, most Americans think their country is on the wrong track. Among white working-class Americans – the core Trump constituency – stagnant wages, declining real incomes, and the loss of a once-dominant status in a nation transforming economically and culturally underlie a deep disillusionment. For Americans who regard ‘their’ country as in need of taking back, and for those fearing the US is in terminal decline – polarised and gridlocked at home, discounted and challenged for primacy abroad – Trump represents an unapologetically redemptive force: a visceral, primal scream from the heart of white American nationalism.

Second, Americans broadly view their government as ineffective and their political system as corrupt. Running for Washington by running against it, on a platform of incoherent but potently opaque policy positions, Trump embodies the outsider like no-one else for those wanting to change Washington. Moreover, uniquely, his personal fortune insulates him from charges that he can be ‘bought’ by vested interests. When Trump talks about knowing how to work the system as a businessman, he is credible. Add to that an outspoken willingness to speak directly, bluntly and without fear of causing offence, and millions of Americans view the Donald as a truth-teller. Like businessmen in politics before him, Trump promises that what he did for himself he can do for America, and that ordinary Americans will once more partake of the increasingly elusive American Dream.

Social media mogul

Third, Trump has exploited his formidable media knowledge with astonishing shrewdness. Outrageous statements, outlandish claims and telling personal insults – seemingly spontaneous but carefully pre-planned and road-tested – compel ratings. Social media abets the creation of an alternative reality and echo chamber from which the distrusted mainstream media are excluded. Disintermediation – cutting out the middle man – compounds Trump’s celebrity status to forge what his 5 million Twitter supporters perceive as a personal link to their politically incorrect champion.

Fourth, Trump – for whom id, not ideology, is all – upends conservative orthodoxy. A New York native who was for most of his life pro-choice on abortion, pro-gun control and a donor to Democrats, Trump is no staid Mitt Romney. In rejecting free trade deals and ‘stoopid’ Middle East wars, pledging to make allies from Saudi Arabia to South Korea pay for US protection, committing to punitive taxes on Wall Street and preserving entitlement programmes for the average Joe, Trump’s anti-elitism is scrambling a party establishment fearful of an anti-government populism it unleashed but cannot control.

Finally, if Obama won the presidency in 2008 as the ‘un-Bush’, what more vivid an antithesis to the current lame duck could be imagined than Trump? After seven years of the most polarising presidency since Richard Nixon, Trump promises to restore the art of the deal – something the US Constitution mandates for successful governing, and AWOL since 2009 – at home and abroad alike.

Can Trump triumph?

Can Trump prevail in the Republican demolition derby? The odds are still against him. After all, most Republicans do not support him and he has been first in national polls in large part because the ‘establishment’ vote has been so fragmented among Marco Rubio, Jeb Bush, John Kasich and Chris Christie. But if Trump can win or come second to Ted Cruz in the Iowa caucus, and then top the New Hampshire and South Carolina polls, the prospects of him securing the nomination are 50-50 at worst. By the time of the Republican Party convention in Cleveland, Ohio in July, if not well in advance, no one may be laughing other than the Donald.


The Politics of David Bowie

This post was contributed by Dr Benjamin Worthy, lecturer in Birkbeck’s Department of Politics. This blog was originally posted on the 10 Gower Street blog on 11 January 2016.

Picture courtesy of Alex Womersley

As with almost everything about David Bowie, no one is sure exactly what his politics were. The Mirror claims he turned down an OBE and a knighthood in the 2000s. In 1977 he is quoted as saying ‘the more I travel, the less sure I am about exactly which political philosophies are commendable’. Nevertheless, many have seen ways in which Bowie’s career could provide lessons for how we do politics.

David Bowie rarely indulged directly in politics or political slogans. His lyrics seemed to deal obliquely with it across his career – from ‘Now the workers have struck for fame’ in ‘Life on Mars?’ and his 1996 song ‘Seven Years in Tibet’, to the album Diamond Dogs, based on George Orwell’s 1984. However, direct ‘interventions’ seemed rare and a little unclear, as with his plea for the union and Scotland to vote No to independence in 2014, sent via Kate Moss, or his rather entertaining acceptance of a Brit award in 1996 from a young Tony Blair. This didn’t, of course, stop his fans, who seem, on the whole, left-wing (and also fans of Scrabble, Patrick Moore and Monty Python, according to YouGov).

But Bowie was not apolitical. In the 1970s Bowie challenged entrenched gender and sexuality stereotypes at a time when few would. Jarvis Cocker has said how Bowie sent out the message that it was OK to be different, while the Mirror speaks of how the singer’s ‘radical, gender-busting personas turned traditional conservative views upside down and widened what was acceptable in society’. He also wrote about the world around him, describing events from the space race to divided Berlin (the German Foreign Ministry today publicly thanked him for helping to bring down the Berlin Wall).

At the same time, his championing of different cultures pushed all sorts of new ideas into society – look over his top 100 books, covering everything from a memoir of Stalin’s Gulags to Viz magazine. He popularised a whole kaleidoscope of new sounds and visions to new audiences, from German electronic music to soul, while also experimenting with what people insist on calling ‘world music’. And his message reached a huge, diverse number of people.

In this way, David Bowie was a very political animal, in the same way that Elvis Presley or the Beatles were. None of them urged ‘revolution’ or told people how to vote. Elvis was rather conservative, John Lennon asked to be counted ‘out’ of the revolution (or maybe ‘in’ – he wasn’t sure) and David Bowie was too wide-ranging or elliptical to join any one party. But like these other musical legends, in challenging convention, the Man Who Fell to Earth tore down barriers and opened up new worlds. David Bowie made people think differently about the world around them. And that is very political.


The State of Europe

This post was contributed by Professor Martin Conway for the college’s Reluctant Internationalists project. This first appeared on the project’s blog on 4 January 2016.

When will the historians of twentieth-century Europe accept that their century has ended? The violent attacks in Paris on the night of 13 November serve to confirm what we should already have known: that the populations of Europe have moved on from the politics of the twentieth century, and it is time for the historians to do so too.

Read the original article on the Reluctant Internationalists blog

Of course, in the aftermath of traumatic events, historians delve rapidly into their store-cupboard of analogies and precedents. And there are many which can be drawn upon for such purposes. Violence by small militant groups composed predominantly of immigrants from specific ethnic backgrounds has, after all, a considerable lineage in twentieth-century Europe. The various revolutionary and counter-revolutionary movements that proliferated in the former territories of the Habsburg and Tsarist empires at the end of the First World War, the militant Jewish Communist groups who played such a role in the anti-fascist movements and the wartime Resistance groups in the 1930s and 1940s, and the FLN militants of Algerian origin who were active in France in the 1950s and 1960s, are all examples of how political violence has often been generated in Europe by marginalized ethnic and religious minorities, who derived their legitimation from the perceived repression by state authorities.

And yet none of these models really has much purchase for understanding the various incidents which, from the train bombings of Madrid in 2004 to the events in Paris, have become part of Europe’s contemporary present. In part, of course, this is because European history is no longer, if it ever was, self-contained: this violence draws its inspiration from elsewhere, and from different histories. But there is also a broader and more disconcerting reality. The radicalized militants who have generated this violence feel no affinity with these precedents. Indeed, one suspects that they know little or nothing (and care even less) about Europe’s past history.

This is a cause for some modesty on the part of historians. We inhabit a present which owes little to “our” past. The twentieth-century history of Europe has come to an end. Everybody can choose their terminus date of preference, be it the reunification of Europe after 1989, the impact of the neo-liberal reforms of the 1990s, or the attacks on the Twin Towers on 9/11 and their subsequent imitators in Europe. But, wherever you choose to stick the frontier post between past and present, it is impossible to ignore the sense that European history has not so much ended as turned into a new configuration. For contemporary historians, to misquote L.P. Hartley, the present is another country, and they do things differently there.

Quite why that should be so is a question which probably demands an answer on a rather grand scale. But the more immediate challenge for historians of Europe is to develop frameworks for understanding the evolutions of the present, which are more relevant than reworkings of our all-too-familiar stories of the crises of the 1930s and 1940s. The history of the twenty-first century has to start somewhere, and the events of the last year have given us plenty of raw material to work from. War in Ukraine, the rise of new populist forces of right and left (or both), the demands for revision of national sovereignty, the arrival of large numbers of migrants fleeing war and economic deprivation, and the impact of new forms of political violence constitute a formidable agenda which demands a response more substantial than the overused language of crisis.

27 October 2015: Migrants are led by German Federal Police to an emergency accommodation centre in Wegscheid, southern Germany (Armin Weigel/dpa via AP)

Crisis is of course a term that historians conventionally deploy to describe the demise of the old and the difficult birth of the new. The first is certainly highly visible in present events, as manifested by the collapse of a certain way of managing Europe, as well as the retreat of pre-existing political elites in the face of economic pressures and the demands of angry and exasperated voters. Of course, they will not go quietly. The logics of austerity economics and of national security, justified by the supposed internal and external threats to European populations, provide plenty of means for state authorities to seek to impose their discipline on their populations. But state authority is not what it used to be. One of the more tangible consequences of the last twenty years has been the hollowing out of many of the former trappings of state power and of national politics. In an era when communication has become primarily electronic, and national borders have become largely notional, state authority no longer has the same centrality in the history of twenty-first-century Europe.

Part of the challenge of a history of the present is therefore to appreciate, if not fully to understand, the fluidity of boundaries of any kind. We inhabit a new cosmopolitanism, as reflected in the global character of many of Europe’s major cities, but also in the flexibility of identities, be they national, political, ethnic, or indeed religious. Journalists investigating the backgrounds of the authors of the Paris attacks have appeared surprised to discover that they were products of the banlieues of Paris and of Strasbourg, who amidst the chaotic years of their early adulthood travelled without any great sense of purpose to Syria, from where they returned equipped with a cocktail of animus, bravado and perhaps a superficial understanding of some elements of Islam. And yet that surely is what one would expect: militants are made, not born, and the manner of their making well illustrates the fluidity of identities among those many Europeans whose lives have been rendered fragile by economic changes, the dislocation of social structures, and the retreat of structures of state provision.

In order to understand this, the most appropriate template is not the twentieth century, with its explosion in state power and totalizing ideological visions, but its predecessor. Looking at Europe’s present-day cities, one cannot but be reminded of the chaotic immigrant cities of Europe in the nineteenth century, and their worlds of neighbourhoods, ethnic self-help structures, and an almost total absence of state authority. Zola, it seems, has never been so topical; but other aspects of Europe’s present-day history seem also to recall the Europe of the mid-nineteenth century. The impact of vast economic forces beyond the control of any public authority, the pressure of migrant masses on a pre-existing population, and sudden surges of political support for charismatic individuals or for rhetorics of national liberation (and of xenophobia) seem much more akin to the Europe of the 1840s and the 1850s than they do to the Europe of Adenauer, de Gaulle, Thatcher, Kohl and Mitterrand.

However, to replace one set of analogies with another borrowed from the previous century is not sufficient. A history of Europe’s twenty-first century has to identify the building blocks of the new. Some elements of this are incontrovertible: the new precariousness of living standards caused by economic change and untrammelled market forces, and the consequent replacement of the disciplined interaction of socio-economic interest groups by a new and much more volatile politics of economic opportunity and grievance. But other elements appear much less clear-cut. Is Europe moving left or right? Will the migrants of 2015 be integrated into a new and more multi-cultural Central Europe, or will they provoke a descent into forms of ethnic essentialism?

Above all, where, in the end, will state authority be discovered to reside? One of the most striking features of Europe since the final decades of the twentieth century has been the demise of those hierarchical organizational charts of government which used to characterise political-science textbooks. Power is now more dispersed and also more opaque, shared between a plethora of regional, national and supra-national institutions, but also secreted away in institutions such as central banks and security structures that are impervious to democratic control or even public scrutiny. None of that means that we are about to experience new forms of authoritarianism; the populations of Europe have, one suspects, moved beyond the stage when they would submit to the disciplines of states of emergency and military coups. Moreover, for all of the seriousness with which leaders have gathered to consider Europe’s overlapping current crises, one of the most striking features of their discussions has been the relative absence of effective tools of power. Military force – other than the spectacular acts of aerial bombing in Libya, Iraq and Syria – has almost disappeared; national economic policy-making has been transferred to central banks and the power of the markets; and even the routine ability to keep track of the movements of populations appears to have been largely eroded.

From the streets of Molenbeek to the beaches of Lesbos, it is the limits of the capacity of the state which have been more apparent than its strength. Perhaps that presages a new 1848, but more significant is the way that the state has lost, or surrendered, its twentieth-century role as the grand manager of European life. What will replace it forms part of the still uncertain nature of the history of the European present.

The Reluctant Internationalists project investigates the history of international collaboration and the ambitions of medical professionals, politicians, generals, diplomats and policy-makers in twentieth-century Europe. This four-year project, funded by Birkbeck researcher Dr Jessica Reinisch’s Wellcome Trust Investigator Award, examines the origins of such policies, their consequences and lasting legacies.


Who remembers Aylan Kurdi now?

This post was contributed by Dr Nadine El-Enany, lecturer in Birkbeck’s School of Law. This first appeared on Media Diversified on 4 January 2016.

Moments of Mourn held in memory of Aylan Kurdi and other refugees

Who remembers Aylan Kurdi now? The photograph of the Syrian toddler that so galvanised Europe’s public over the question of refugees seems a distant memory now. Is it that a genuine concern for the wellbeing of refugees has merely been displaced by other political priorities in the minds of Europeans? Or is it that the basis for the mass outpouring of grief and the acts of generosity and solidarity that followed the publication of the photo was always fickle, contingent upon white Europeans’ limited capacity to humanise the other?

What was it about the photo of Aylan Kurdi that so stirred Europe’s public over the question of refugees? Aylan Kurdi was by no means the first child to drown en route to Europe and since his death, more than 70 children have lost their lives off the Greek coast. Since 1993, more than 22,394 people are known to have died attempting to enter Europe. The actual figure is likely to be much higher. The blood on the hands of Europe’s fortress-makers had long dried before Kurdi’s body washed up on a Turkish beach in September. How did it come about that white Europeans were able, all of a sudden, to humanise the body of a refugee, let alone the body of a Muslim?

What did white Europeans see when they looked at the photo of Aylan Kurdi? They saw their own sons and nephews in the photo, aptly illustrated by the #CouldBeMyChild hashtag which was trending on Twitter following the discovery of Kurdi’s body. The image was of course particularly potent in depicting the tragic end of a life so young. But was there something else about this particular child that enabled his humanisation by white people, when so many others had died before?

Perhaps it was the innocence evoked by the body of a light-skinned child that enabled the temporary, fleeting awakening among white Europeans to a refugee movement that long preceded the media spotlight on that photo. The news has moved on, but the situation persists and grows more desperate daily. According to the International Organisation for Migration, an estimated 700,000 people arrived at Europe’s borders between January and October 2015, compared with 280,000 for the entirety of 2014. Refugees fleeing persecution and war in Syria have been trying to reach Europe since 2012. The majority remain in neighbouring countries in the region, with only 10% of those fleeing Syria seeking protection in Europe. Many have perished along the way together with refugees from Afghanistan and Iraq.

Those white Europeans with a new penchant for carrying #RefugeesWelcome tote bags are unlikely to be amenable to the argument that this is the result of an awakening of their conscience made possible by the coincidentally fair hue of Aylan Kurdi’s skin. Yet research has shown that the extent to which white people feel empathy and humanise others correlates with implicit racial biases, with negative stereotyping of those with darker skin corresponding to a lower level of empathy shown for them. Feelings of empathy are known to encourage cooperation and assistance between human beings, while an absence of identification with the suffering of others can lead to violence and abuse, both characteristics of Europe’s militarised border regime.

What of the refugees who do not evoke in the mind of the white European an image of their own offspring? The images of black African bodies washed up on the shores of Europe’s Mediterranean beaches last spring did not prompt an equivalent outpouring of compassion and charitable action. What of the bearded male refugee? What of the woman in the hijab or burka? What of their dark-skinned children? These coded images of Muslims inhibit their humanisation. The Islamophobia that thrives in European societies today means that, rather than compassion, they elicit feelings of apprehension and fear.

In the wake of the Paris attacks, the British newspaper the Daily Mail published a cartoon depicting racialised images of Muslims crossing Europe’s borders along with rats. Poland reneged on its refugee quota agreement following the attacks and more than half of all US state governors have refused to accept Syrian refugees. Meanwhile, Australia declared its policy was to focus its protection efforts on Christian Syrians. Each of these decisions is reproductive of Islamophobia in buying into the idea that Muslims are associated with terror by virtue of being Muslim.

Read the original article on Media Diversified

Rather than acknowledge the racism endemic in European societies, many white Europeans prefer to see the dehumanisation of refugees as merely an expression of anti-migrant sentiment, or different values, or viewpoints in what is presented as a fair debate on migration. In what amounts to a dangerous apologia for racism, Slavoj Zizek has categorised the claims of anti-immigrant populists as being about the “protection of our way of life” and argued that the claim Europeans lack empathy for the suffering of others is “merely the obverse of…anti-immigrant brutality”. In a move demonstrative of his attachment of negative stereotypes to refugees, Zizek insists that it be “made clear” to them that they are to “respect the laws and social norms of European states” which entails “No tolerance of religious, sexist or ethnic violence”, as though sexist, racist and religious violence were not fundamental aspects of European life.

While Aylan Kurdi’s light skin colour may have allowed white Europeans to humanise him and partake in large-scale charity-giving, petition-signing and demonstrations, their children could not of course have met Aylan Kurdi’s end. It was, after all, the ancestors of the white Europeans tweeting selfies taken with their babies as they headed for their nearest #RefugeesWelcome march who colonised the lands from which these desperate people come. And it is white Europeans occupying positions of power and privilege today who continue to give orders for bombs to be dropped on their homes.

Absent from the #CouldBeMyChild hashtag was an understanding of the specificity of colonial histories and present imperial wars and the way in which these structurally determine positions of power and privilege as between white people and people of colour. Refugees are here, their bodies washing up on European beaches, because white Europeans were, and continue to be, there.


Addressing the skills gap through partnerships between education and business

This post was contributed by Elena Georgalla, Work Readiness Programme Officer at Birkbeck. Elena’s article follows the recent roundtable discussion (hosted by the college’s Careers and Employability team) which explored cross-industry perspectives on the skills gap and social mobility.

Universities and businesses alike suffer from the skills gap. Working more closely together could have transformative potential.

Mind The Gap Logo by rrward on DeviantArt

The perceived growing gulf between the skills and abilities the workforce offers today and the skills and abilities businesses consider crucial to their success – the so-called skills gap – is old news to the UK job market. Although its severity and extent remain highly contested, often distorted by politically-loaded debates on immigration and Tier 2 visas, the overwhelming consensus among employers is that there is a deficit in graduates’ ability to communicate effectively, to solve problems creatively, to think critically, to work collaboratively and to adapt to changing priorities.

Further to these “soft skill” shortages, businesses report that job seekers also lack the technical “hard” skills associated with specific jobs, including, most alarmingly, key digital skills. The latter has given rise to a wide range of public and private sector initiatives to inspire more young people to take up STEM subjects. As universities and businesses alike are affected by the skills gap, joining forces could have transformative potential.

Universities and employability

Education has been quick to receive the blame. Despite the UK’s massive expansion in university education – 2015 saw a record number of undergraduates admitted to British universities – no parallel increase in skills has occurred. Universities are at a watershed; their perceived value is diminishing as they face challenges intensified by burdensome tuition fees and a policy shift that favours apprenticeships for school-leavers as the remedy for the apparent skills malaise.

At a recent roundtable discussion organised by Birkbeck, University of London, chaired by Prof Philip Powell, Pro-Vice Master for Innovation, senior decision makers and university recruitment managers from some of the UK’s largest graduate employers questioned the role of university education as a sufficient indicator of a candidate’s potential. More often than not, they would favour a strong track record of work experience over academic achievement.

Are universities then in danger of becoming redundant if the norm of a university degree being the golden ticket to employment no longer stands? Is work experience a better indicator of ability than an undergraduate degree, let alone a postgraduate one? Should more school-leavers consider alternative routes to employment, such as well-remunerated apprenticeships with clear progression paths, rather than three or four years of study followed by many years of paying off student debts and no correlated career outcomes?

It’s time for universities and employers to come together.

Addressing the skills gap

A growing number of universities have been responding to these questions by establishing direct partnerships with businesses, taking action in a variety of ways to address the skills gap while avoiding the bleak scenario of the overqualified, unemployed graduate.

Indeed, the best universities for graduate employment have one thing in common: strong employer presence on campus. This, in tandem with academic excellence and playing to the strengths of each institution, appears to be a good recipe for success. This was certainly the overwhelming view of our roundtable participants who admitted that, empirically, the most successful candidates come from universities that excel at building partnerships with employers, increasing employer presence on campus, embedding employability in the overall student experience, and crucially, working with businesses to design academic curricula.

Apprenticeships have a key part to play in this model; there is large scope for universities to work with employers to establish high-quality degree apprenticeships that allow students to gain a university qualification and invaluable (paid) work experience. Birkbeck, being London’s original evening-only university, is currently exploring a day-apprenticeship/evening-study model. Overall, as employers demand more from their graduates, with the modern job market increasingly requiring employees to be forward-thinking, problem-solving and entrepreneurial, it is clear that constructive dialogue, ground-breaking initiatives and a common, mutually-reinforcing approach between universities and business are the best solution.

There is an abundance of case studies demonstrating the importance and success rate of such university-business synergies. But to be truly successful, such partnerships need to go beyond the usual talent scouting and guidance on dealing with interviews and over-demanding assessment centres. They also need to focus on issues that can bring about genuine long-term change: social mobility, diversity and dealing with the chronic lack of women in technology. Such a focus is to the advantage of both sides, because failing to tackle these issues will only compound the skills gap, as fewer people are able to reach their full potential. Whatever the relationship that universities and businesses build, it ought to be reciprocal, mutually-reinforcing and sustainable, and must speak to the needs of both sides.

Work Readiness: J.P. Morgan case study

At Birkbeck, we partnered with J.P. Morgan to launch the Work Readiness Programme, an initiative targeted specifically at students from underrepresented backgrounds, which aims to enhance social mobility and promote diversity in the job market, specifically in technology. Birkbeck prides itself on the diversity of our student body and on our reputation as an inclusive institution. J.P. Morgan has identified work readiness and social mobility as its top community engagement priorities. At the same time, Birkbeck has a campus in East London, home to J.P. Morgan’s largest UK operation.

The reasons to join forces are evident. Since its launch, the programme has been very well received among students and employers alike. On the one hand, employers see the value the programme adds to their efforts to become more inclusive and diversify their workforce; it speaks to real needs. On the other hand, our students have benefitted immensely from interacting with employers on campus and creating their own informal networks, whilst it has given our Career Service the opportunity to veer away from dull traditional career support limited to CV checks and “I’m not sure what I would like to do” conversations. In addition, we are soon to announce a partnership with TechCity UK, which aspires to bridge the digital skills gap.

A collaborative approach

University education has a key role to play in driving the success of UK business. But in order to remain relevant, institutions must adapt. Universities are discovering the importance of increasing their collaboration with businesses beyond applied academic research and into preparing graduates for the world of work.

This, of course, does not absolve employers from their responsibility to provide training and opportunities for up-skilling. As emphasised, collaborative approaches should bear fruit for both sides. Most crucially, both sides should be willing to break the mould and innovate. Ideally, such partnerships should be focused around creating flagship initiatives, with a clear manifesto, well-defined aims and objectives, a robust structure and a progression plan.

Equally, there are many areas that still remain to be explored. Employer engagement in education provision (course development and delivery) for example, is still in its infancy and there is a lot more to be understood and to be implemented. Degree apprenticeships are another area with great potential to transform not only the university experience but also how people progress from education into work.

Finally, the true potential of employer-education partnerships does not solely lie in their ability to nurture super-graduates who are client-friendly, critical thinkers and also experts in Excel, Python and Matlab; it rests in the recognition of the potential to create real societal change and opportunities for everyone.


Naming the unspeakable: ‘So-called’ Isis and Harry Potter syndrome

This post was contributed by Professor Penelope Gardner-Chloros, Department of Applied Linguistics and Communication

It seems that we now have a terrible enemy who cannot be named – or rather whose naming causes a major headache. For many months now, on hearing the term ‘Isis’ we have not thought about a certain Egyptian goddess, or about the river that flows through Oxford. The name is now indelibly associated with one of the most evil organisations of modern times, which adopted the acronym of ‘Islamic State in Iraq and Syria’ (or sometimes ‘Islamic State in Iraq and the Levant’, ISIL).

Because of fears that calling the organisation by this name may legitimise it, some politicians and other public figures in the West have been calling it ‘so-called’ Islamic State, using the verbal equivalent of the two-fingered “scare quotes” gesture which has become part of our gestural vocabulary. There is also a move to call it ‘Daesh’ instead, another acronym which the organisation itself is said not to like.

This is how we in the West shake our tiny fists at the evil monster, even though, to paraphrase Shakespeare, a terrorist by any other name would smell as rank. Ah, the power of names! How can a simple label mean so much? And how do names acquire their symbolic power?

The power of names

The power of names is a recurrent theme in religions, myths and fairy tales. How to name God, for example, is an important issue in many religions. In Exodus 3:13-15, Moses asks God who he should say has sent him, and God replies ‘I am who I am’. It is as if God himself did not want to share an actual name with humans, because knowing someone’s name gives you a certain power over them.

In the Odyssey, after Odysseus has got the Cyclops drunk, he tells the monster that his name is ‘Nobody’. When Odysseus later blinds the Cyclops in order to escape his clutches, the other Cyclopes come to their brother’s rescue, but when they ask who has hurt him, the Cyclops replies ‘Nobody’ and they understandably lose interest and shuffle off.

Names are important also in traditional fairy tales. Those brought up on Grimm’s tales will remember Rumpelstiltskin, the evil dwarf who loses his power to harm people if they guess his name. And then of course there is Harry Potter, and the villain Voldemort, who cannot be named for fear of conjuring him up. He is known throughout as ‘He Who Must Not Be Named’.

Marking changes in style, identity and allegiance

But back to ‘so-called’ Islamic State. Radicalised European or American Jihadist fighters change their names from their bland-sounding European ones to Arab ones that make them sound like the holy fighters they profess to be, the change of names marking a clear change in identity. If they return and are deradicalised, their names change back too. Many of us with less sinister motives make minor or greater changes to our names to mark changes in style, identity or allegiance.

We encourage or discourage nicknames and abbreviations at different stages in our life, reflecting how we wish to be seen – remember, for example, when Kate became Catherine? When we marry, we may change – or resolutely not change – our surnames, if of course this is allowed or encouraged in our culture (in some countries, such as Iceland, the issue does not arise and names are not enmeshed in a patriarchal system).

In Britain we might use our second name rather than our first; but this would not work in a country such as Russia where the second name is invariably a patronymic, i.e. your father’s first name, regardless of your gender. You may even be known by different names in different places – such as Jack in town and Ernest in the country.

Magic? Perhaps not. But certainly another facet of the power of words.


Santa’s Job: An Occupational Health Psychology Perspective

This December, Kevin Teoh (Department of Organizational Psychology, Centre for Sustainable Working Life) makes some light-hearted observations on the psychosocial working conditions of Santa Claus.

Santa Claus has a tough job (By Jonathan G Meath [CC BY-SA 2.5 (http://creativecommons.org/licenses/by-sa/2.5)], via Wikimedia Commons)

Santa Claus sighs as he reviews his list of kids who have been naughty, and then goes over those who have been marked nice. The increasing global population means the number of children on his list grows with each passing year. Currently, it’s estimated to contain the names of between 152 and 526 million children (Bump, 2011; Svan, 2009), meaning a lot of presents to sort out and deliver. This is concerning, as there is ample evidence demonstrating that high workloads are linked with poorer health and lower job satisfaction (Goetz et al., 2013; Ree et al., 2014). Gosh, a sick and unhappy Santa, we wouldn’t want that.

Santa’s demanding job

As Christmas approaches and work intensifies, Santa’s standard 9-5 hours, five days a week, gradually extend into the evenings and weekends, increasing the number of hours worked. The seasonal nature of the work faced by Santa and his team exists in other industries as well. Accountants during tax filing season go through a similar increase in their working hours, which has a detrimental impact on their health and work-life balance (Sweeney & Summers, 2002). Putting aside the amazing feat of delivering presents around the world on December 25th, cognitive functioning after more than 24 hours of continuous wakefulness is similar to having a blood alcohol concentration level that is over the legal limit (Dawson & Reid, 1998). If we are concerned for the safety and wellbeing of Santa, perhaps he shouldn’t be operating his sleigh under such conditions.

While the job of Santa himself is likely to be very secure, I wonder whether his crew of elves and reindeer experiences the precarious working conditions that many seasonal workers do. Unfortunately, across Europe the high prevalence of temporary contracts among such workers not only increases job insecurity; temporary workers also often have fewer employment rights, perform more hazardous jobs, have poorer working conditions and are paid less (Hesselink et al., 2015). But surely, given his charitable nature, Santa must be about as close to the best employer you will find?

Taking control of your work environment

Santa has little influence over the fact that the busy festive season peaks at the end of December. This isn't desirable, considering how important control at work is for worker happiness and health. However, the reality of many jobs is the presence of external factors beyond a person's control. To manage this, job crafting has emerged, with growing support, as an approach that encourages workers to alter those aspects of their jobs that they can (Wrzesniewski & Dutton, 2001). We actually see Santa himself do this in trying to manage his big deadline. While many countries see December 25th as the day Santa visits with presents, Santa has staggered the dates on which he visits different countries. For example, he distributes gifts in the Netherlands on the 5th of December (as Sinterklaas), before moving on to Germany, Switzerland and neighbouring countries the next day. On the 18th, you will find him as St. Nicholas in Ukraine, while on the 6th of January Father Frost gives out gifts to many children of Russian Orthodox background.

In addition, we see that Santa has crafted part of the job for himself and delegated aspects of the role to others. Across the world we see Santa as the bearer of gifts and happiness; in many cultures, however, Santa partners with local representatives who handle issues relating to discipline and punishment. This delegated work often involves beating misbehaving children or carrying them away in sacks, and is performed by Santa's assistants Krampus (Austria and Germany), Schmutzli (Switzerland) and the Zwarte Pieten (Belgium and the Netherlands), amongst others. It is not clear why he has crafted his job in this way. It could be to manage the overwhelming workload, or perhaps it's an aspect of the role he does not feel comfortable with, or even competent at. Regardless, it seems to contribute to Santa's success.

Why is Santa, Santa?

Considering the points above, what motivates Santa to work through such difficult conditions? He is likely to be eligible for retirement, and while he may be doing it for the fame, it is unlikely that the role provides a strong financial incentive. It is far more likely that Santa draws meaning and purpose from his job. We know that individuals working or volunteering with charitable and religious organisations are motivated by their values and their propensity for prosocial behaviour (Cnaan et al., 1993). Furthermore, having a sense of purpose and meaning at work is positively linked with better work and general wellbeing, engagement and performance (Shuck & Rose, 2013). Focusing specifically on Santa, two studies involving groups of Santa Clauses (Fletcher & Low, 2008; Hancock, 2013) found that these actors frequently perceived authenticity in their role as Santa. The job was not just about the money; it was driven by a sense that they were doing something worthwhile, bringing happiness to children and making Christmas a magical experience for them.

From a distance, it seems that Santa has most things under control. Yes, it is not a perfect working environment, but Santa has taken charge of it, moving deadlines and empowering partners to work with him where possible. He appears to be very much in touch with why he is doing this job, which provides meaning and purpose to his role. There is still scope to improve: a better understanding of the demands would help to develop and target resources relevant to Santa. Listening to and appreciating Santa is also imperative. After all, if we don't support and believe in Santa, how can we expect Santa to continually believe in himself?

*A longer version of this article first appeared in the European Academy of Occupational Health Psychology Newsletter (2015, Volume 12, Issue 2).

References

  • Bump, P. (2011, December 14). Santa's Christmas Eve Workload, Calculated. The Atlantic.
  • Cnaan, R.A., Kasternakis, A., & Wineburg, R.J. (1993). Religious people, religious congregations, and volunteerism in human services: Is there a link? Nonprofit and Voluntary Sector Quarterly, 22, 133-151.
  • Dawson, D. & Reid, K. (1997). Fatigue, alcohol and performance impairment. Nature, 388, 235.
  • Fletcher, R. & Low, D. (2008). Emotional Labour and Santa Claus. ANZMAC 2008, December 1-3, Melbourne, Australia.
  • Goetz, K., Musselmann, B., Szecsenyi, J., & Joos, S. (2013). The influence of workload and health behaviour on job satisfaction of General Practitioners. Family Medicine, 45, 2, 95-101.
  • Hancock, P.G. (2013). 'Being Santa Claus': The pursuit of recognition in interactive service work. Work, Employment & Society, 27, 6, 1001-1020.
  • Hesselink, J.K., Verbiest, S., & Goudswaard, A. (2015). Temporary workers. OSH Wiki: European Agency for Safety and Health at Work.
  • Ree, E., Odeen, M., Eriksen, H.R., Indahl, A., Ihlebæk, C., Hetland, J., & Harris, A. (2014). Subjective health complaints and self-rated health: Are expectancies more important than socioeconomic status and workload? International Journal of Behavioral Medicine, 21, 3, 411-420.
  • Shuck, B. & Rose, K. (2013). Reframing employee engagement within the context of meaning and purpose: Implications for HRD. Advances in Developing Human Resources, 15, 4, 341-355.
  • Svan, K. (2009). Santa Claus at Risk. Faculty of Science, University of Gothenburg.
  • Sweeney, J.T. & Summers, S.L. (2002). The effect of the busy season workload on public accountants' job burnout. Behavioral Research in Accounting, 14, 1, 223-245.
  • Wrzesniewski, A. & Dutton, J.E. (2001). Crafting a job: Revisioning employees as active crafters of their work. Academy of Management Review, 26, 2, 179-201.

The Paris Climate Agreement and the Year 1965: How Much Can We Achieve in 50 Years (Or Less)?

This post was contributed by Dr Hiroki Shin, postdoctoral research assistant in Birkbeck's Department of History, Classics and Archaeology, and a writer for the College's Reluctant Internationalists project. It first appeared on the project's blog on 17 December 2015.

In this post, Dr Shin considers the agreement reached at the Paris climate conference last week, and points to a longer history of tensions between international and national attempts to control energy.

****

After intensive and tough negotiations, the COP 21 climate conference in Paris finally reached an agreement on 12 December 2015. Political leaders hailed the agreement as a historic turning point: the global community now shares the recognition that climate change is a threat to human existence, an enormous challenge that has to be tackled through an internationally coordinated system. The Paris pact aims to keep the global temperature rise well below 2.0C, while officially acknowledging the more ambitious target of 1.5C. It also envisions a world with net-zero carbon emissions in the latter half of this century. The new global climate deal is to be implemented through reviews of individual countries' performance every five years – measured against their voluntary pledges (Intended Nationally Determined Contributions, or INDCs) – and through intensifying efforts to mitigate climate change over the coming years.

As much as it is a landmark event in terms of the world coming to embrace a common goal, we need to see the Paris agreement as a headlight illuminating a rough road ahead of us. Furthermore, the agreement’s reference to the latter part of this century might make it sound as though we have plenty of time, but this is not the case. Looking at how things have changed in the past shows that, when dealing with a global problem, decades are a very short unit of time.

Half a century ago, in 1965, energy was already an international topic, though not for its environmental implications. The world was only beginning to realise, mostly at the national level, the environmental harm caused by human activity. It was still a burgeoning recognition expressed, for example, in the US President's Science Advisory Committee report Restoring the Quality of Our Environment (1965). The report opened with the statement, 'Ours is a nation of affluence', but 1965 was a year when affluence and scarcity formed a curious mix. One of its manifestations was the American Northeast blackout in November 1965, evidence that energy affluence came with disruptions, shortages and the fear of losing power.

The blackout, which lasted for more than ten hours on a Tuesday evening, affected over thirty million people in a country that had come to depend on electrical power for the major part of its normal life. What is interesting is not just how Americans experienced and responded to the sudden deprivation of electricity, but also how outsiders saw the event. On the other side of the Pacific, in Japan, which still regarded the USA as the model for its economic development and standard of living, news of the Northeast blackout was received with a mixture of surprise and admiration. One article described it as an event caused by 'a blind spot of the hypermodern city'.[i] The mechanised cities that came to a halt during the power outage were seen as proof of the extent to which energy-using technology had penetrated American life. The blackout was therefore seen as a sign of affluence, the level of energy civilisation the Japanese aspired to achieve.

The British, less impressed by the American hypermodern, asked themselves the question: 'Could it happen here?' The answer of the UK Central Electricity Generating Board (CEGB) was that, thanks to Britain's integrated power supply system, it was 'impossible to visualise a similar situation arising in this country'.[ii] Despite the CEGB's confident claim, there were a number of blackouts in the UK at the time. Less than a week after the American blackout, on 15 November 1965, a power outage occurred, affecting London, the Home Counties, Birmingham, Leeds, Nottingham, Derby, Chesterfield and most of East Anglia. The CEGB blamed the exceptionally cold weather that coincided with the operation to overhaul much of its equipment.[iii] The Guardian journalist Mark Arnold-Forster exonerated the CEGB by pointing out that the real culprits were 'millions of customers [who] felt cold at once and switched on direct-acting electric fires'.[iv] Indeed, a sudden imbalance of supply and demand had been the cause of a number of blackouts since the 1940s, including a power outage on Christmas Day 1962.[v]

The Guardian, 27 December 1962.

Energy experts in 1960s Europe were well aware that demand was not the obedient follower of supply. In the spring of 1965, the Committee on Electric Power of the UN Economic Commission for Europe held a symposium in Istanbul to consider the challenge of meeting rapidly growing electricity demand.[vi] The attendance of 215 representatives from twenty-one countries demonstrated that those countries were facing a similar problem. However, although the problem was shared, their approaches differed. While the USSR delegate referred to a central committee that allocated power to different classes of consumer in times of emergency, the UK delegate – ever so inclined to soft persuasion – presented a paper on how to control load growth using consumer advisory services. While the problem was discussed at an international forum, the solution was sought at the national level. What could be described as the common recognition was that the problem posed by power demand 'does not arise only today, but exists at all times', as the Turkish chairman put it.[vii] This amounted to an admission that there would be no future in which everyone's need for power would be fully satisfied at all times.

In July the same year, in Bangkok, another international meeting was held by the UN Economic Commission for Asia and the Far East (ECAFE, later to become ESCAP). The focus of ECAFE's working group meeting was the development of energy resources – particularly electric power – and how to exploit them for industrialising the ECAFE region, which included major developing countries such as China, India, Japan, South Korea, the Philippines, Vietnam and Iran.[viii] Some of the ECAFE countries were already experiencing the demand problem discussed by the European countries in Istanbul. Demand for power was constantly outstripping supply in developing countries too, but this was simply taken as a sign of insufficient capacity. What is striking about the Bangkok conference is how easily the longing for more energy could overshadow other important issues, such as balancing supply capacity and demand. Another topic that received only a passing reference was the depletion of fossil fuels, even though the Asian energy experts must have been familiar with M. King Hubbert's 'peak oil' theory, first presented in 1956, which warned that US oil production would peak around 1965–1970.

ECAFE Electric Power Experts’ Tour in India, 1956. United Nations Photo

The two meetings in Istanbul and Bangkok demonstrate how differing priorities had a profound effect on which problems were selected for attention and which were obscured. Fifty years on from those two UN meetings, environmental issues are now centre stage at international forums where developed and developing countries participate equally in negotiations. In this respect, there have been major changes. Nevertheless, there are uncanny parallels between the situation today and that of 1965. Energy disruption still haunts developed and developing countries alike; power outages arising from technical problems, human error and natural disasters abound, and the renewable transition has now been added to the list of causes. For instance, the UK's attempt to abandon all of its coal-fired plants has narrowed the margin by which electricity supply meets the nation's demand: in early November 2015, the National Grid had to appeal to business users to reduce energy consumption to avert a wide-scale disconnection. In Germany, a record number of consumers were disconnected because they could not pay their electricity bills, which had been inflated by the subsidies added for renewable energy. Blackouts have yet to be eliminated in developed countries; they are still alive and kicking. In developing countries, including those that have already achieved significant levels of development, the appetite for energy is unabated. More than anything, the belief that greater energy use leads to greater economic growth remains so strong that it is obscuring other important issues and sacrificing the global environment for future generations.

Read the original post on The Reluctant Internationalists project site

A brief look at the events of 1965 and 2015 tells us that fifty years has turned out to be far from sufficient to balance our needs and desires for energy against the resources and capacity we have. Over the same period, we have failed to manage our power demands, and that failure has led to severe damage to the global environment. With the coming of a more rigorous emissions control regime, the problem of managing energy demand will become more acute. In addition, as the Paris meeting highlighted, the fundamental divide between energy haves and have-nots has changed very little in the past fifty years, and this is the situation we have to deal with in the coming decades. Aligning our goals is one thing; aligning our acts and deeds is another, and the latter is usually more difficult. Measured against these challenges, several decades are but a short space of time. We must therefore equip ourselves with ever-increasing determination and will in order to press through the long and rough terrain of the decades to come.

[i] Asahi Journal, 9 January 1966, p.88.

[ii] The Guardian, 11 November 1965.

[iii] The Guardian, 16 November 1965.

[iv] The Guardian, 18 November 1965.

[v] The Guardian, 27 December 1962.

[vi] Economic Commission for Europe, Symposium on Special Problems in Meeting Rapidly-Growing Requirements for Electric Power (UN, 1966).

[vii] Ibid., p. 23.

[viii] Economic Commission for Asia and the Far East, The Role and Application of Electric Power in the Industrialization of Asia and the Far East (UN, 1965). A recent review of ECAFE's early history is Ikuto Yamaguchi, 'The Development and Activities of the Economic Commission for Asia and the Far East (ECAFE), 1947–65', in S. Akita, G. Krozewski and S. Watanabe (eds.), The Transformation of the International Order of Asia (Routledge, 2014).

Dr Hiroki Shin is Co-Investigator of the AHRC-funded ‘Material Cultures of Energy’ project (PI: Prof Frank Trentmann), based at Birkbeck College, University of London. www.bbk.ac.uk/mce
