What are the origins of the Pride March?

Although this year’s Pride March has been cancelled, we wish to highlight and celebrate the history of the annual celebration. In this blog, Rebekah Bonaparte, Communications Officer at Birkbeck, explores the radical roots of the annual Pride March.

June usually marks Pride Month. The streets of London and many UK towns and cities are adorned with the iconic Pride rainbow, as thousands would usually turn out in celebration of the lesbian, gay, bisexual, transgender and queer (LGBTQ) community.

Many will now be familiar with the rainbow flag that has become increasingly visible throughout the month of June. The Pride logo can be seen on the websites of corporations and organisations as the internationally recognised event has become increasingly mainstream. But what are the origins of the Pride march?

The Stonewall Inn

Although there had been groups campaigning for the rights of the LGBTQ community to be recognised before the 1960s, the Stonewall Uprising is thought of as an important moment in the fight for gay rights in the US and beyond.

The uprising began when New York police officers raided the Stonewall Inn bar on 28 June 1969. Police raids of gay and lesbian bars were commonplace at this time, and this instance proved to be the catalyst for an outpouring of fury amongst the LGBTQ+ community, who were continually targeted by the police. Stormé DeLarverie, a lesbian woman thought to be one of the first to fight back at Stonewall, insisted that the events so often labelled ‘riots’ were in fact “a rebellion.”

Six days of protests followed the raid on the Stonewall Inn and figures such as Marsha P. Johnson and Sylvia Rivera emerged as leaders of the revitalised movement.

The following year, the Christopher Street Liberation Day Umbrella Committee held its first march, initially called ‘Christopher Street Liberation Day’, to commemorate the Stonewall uprising and promote cohesion amongst the LGBTQ community. Today, the Stonewall Inn is considered a national landmark and LGBTQ+ Pride marches are held across the world in June.

Pride in London

In 1970, two British activists, Aubrey Walter and Bob Mellors, founded the Gay Liberation Front in a basement of the London School of Economics. Walter and Mellors were said to have been inspired by the Black Panthers, whose Revolutionary People’s Convention they attended that year, as well as by the various liberation movements taking place all over the world. At the time, homosexuality had been only partially decriminalised in the UK and homophobia was widely accepted.

The Gay Liberation Front in London held its first Pride rally on 1 July 1972 (the closest Saturday to the Stonewall anniversary) and continued to host annual rallies until the event took on more of a carnival character in the 1990s. In 1996 it was renamed Lesbian, Gay, Bisexual and Transgender Pride. The march was thought of as a display of solidarity and self-acceptance, but also as a vehicle to drive social change and challenge injustice.

The Pride March has been held in London and across the UK ever since. It is characterised by its carnival spirit, and by offering a safe space for members of the LGBTQ community to assert their identities and achievements. In recent years it has become increasingly mainstream, with corporations and organisations capitalising on the annual celebration, and some believe it has become far removed from its radical roots.

#YouMeUsWe

The organisation Pride in London was set up in 2004 and has been arranging the march since. Unfortunately, this month’s Pride event had to be cancelled, but the organisation has announced its virtual campaign, #YouMeUsWe, which calls on members of the community to practise allyship and challenge instances of discrimination and marginalisation.

Pride remains a visible reminder of the continuing struggle for LGBTQ+ rights across the world, and a source of hope and jubilation for many.


Cancel the Window-Cleaning Contract!

Professor Jerry White, Professor of Modern London History at Birkbeck, recounts how the College fared during the Second World War. This blog is part of the 200th-anniversary series, marking the founding of the College and the 75th anniversary of Victory in Europe Day.

Bomb damage to Birkbeck Library

Bomb damage to Birkbeck Library. The area around Birkbeck College was bombed during the air-raid of 10-11 May 1941. The resultant fire destroyed the Library. Image courtesy of Birkbeck History collection.

Most of London University shut down on the declaration of war in September 1939. The headquarters at Senate House was taken over by the Ministry of Information and most colleges were evacuated (like much of the BBC, many government departments and most of London’s hospitals) to areas thought to be less vulnerable to bombing. University College shifted to Aberystwyth and elsewhere in Wales, King’s to Bristol, LSE and Bedford to Cambridge, and so on. Birkbeck, its London roots deeper than any of its sister colleges and so unable to be useful to Londoners if sent to the country, resolved to close on the outbreak of war and for a time did so. But the war failed to open with a bang and in the absence of air attack, or apparently any likelihood of bombing for the immediate future, Birkbeck reopened at the end of October 1939. Indeed, it didn’t merely reopen but expanded its offer: for the first time, extensive daytime teaching was made available for those London students unable to follow their chosen university colleges out of the capital. And despite the blackout, a wide range of evening teaching also resumed.

Birkbeck was not yet at its present Bloomsbury site. That building contract had been let but work had to stop in July 1939 because of the uncertain international situation – contractors were given more pressing projects to work on, both civil defence and industrial – and in fact the new college would not be completed and occupied till 1951. So Birkbeck was still in its late-Victorian location in Breams and Rolls Buildings, straddling the City and Holborn boundary west of Fetter Lane, incidentally sharing a party wall with the Daily Mirror building. It had some near misses during the main blitz of 1940-41 and narrowly escaped total destruction in the great City fire raid of 29 December 1940, which opened a view – never before seen – of St Paul’s from the college windows. From that time on all places of work had to arrange a fireguard of staff to be in the building at night time to deal with incendiaries and raise the fire brigade if necessary. There followed nearly three-and-a-half years of relative quiet, with sporadic bombing of London and the Baby Blitz of early 1944 rarely troubling the college and its work. But Birkbeck would nearly meet its nemesis from a V1 flying bomb (or doodle-bug) at 3.07am on 19 July 1944.

Dr A. Graham was a member of the college fireguard that night, on the 1-3am watch.

I wakened Jackson [the College accountant] to do the 3-5am spell…. We were saying a few words to one another when we heard The Daily Mirror alarm go. Suddenly the bomb, which had merely been a near one until that second … dived without its engine stopping. Its noise increased enormously; Jackson and I looked at one another in silence; and I remember wondering what was going to happen next. What did happen was all over before we realised it had happened … a gigantic roar from the engine of the bomb, not the noise of an explosion, but a vast clattering of material falling and breaking, a great puff of blast and soot all over the room, and then utter quiet. Massey [another fire watcher] raised his head from the bed where he had been asleep and asked what all that was….

As the dust settled Graham climbed over the flattened metal doors of the College and went into the street. The first thing he heard was footsteps coming at a run up Breams Buildings. It was a Metropolitan police constable: ‘he called backwards into the darkness… “It’s all right, George, it’s in the City”’; satisfying himself that there were no urgent casualties, he promptly disappeared. Troup Horne, the College secretary from 1919 to 1952, was also one of the fireguard but, not wanted till 5am, was in a makeshift bed in his office: ‘At 3.06am I was awakened by a doodle overhead. Thinking we were for it, I pulled a sheet over my head to keep the plaster out of my remaining hairs; and five seconds later the damned thing went pop.’ Horne was found ‘covered from head to foot with soot, dust, and thousands of fragments of broken glass and other bits scattered from the partition which separated the general office from his room.’ His chief assistant, Phyllis Costello, was also sleeping in the College that night and was frequently part of the fireguard. She rushed to see if he was injured and was greeted by Horne instructing, ‘Cancel the window-cleaning contract’.

Indeed, there were no windows left anywhere in the College. For some time after, a witticism coined in Fleet Street during the main Blitz was Birkbeck’s watchword: ‘We have no panes, dear mother, now.’*

*Edward Farmer (1809?-1876), ‘The Collier’s Dying Child’: ‘I have no pain, dear mother, now.’ All the information used here comes from E.H. Warmington, A History of Birkbeck College University of London During the Second World War 1939-1945, published by Birkbeck in 1954.


Maths for the Masses

In this blog, Ciarán O’Donohue, a PhD student in the Department of History, Classics and Archaeology, discusses the decision to teach mathematics to the first students of the Mechanics Institute. This is part of the 200th-anniversary blog series that celebrates the College’s bicentenary in 2023.

The Massacre of Peterloo

The Massacre of Peterloo. The commander is saying “Down with ’em! Chop ’em down my brave boys; give them no quarter! They want to take our Beef & Pudding from us – & remember the more you kill the less poor rates you’ll have to pay so go on Lads show your courage & your Loyalty!”

Many of us will be familiar with the common questioning of why certain concepts are taught in our schools. Mathematics, and especially its most intricate systems, is often first to face the firing squad. It is not unusual to hear someone discussing education ask: “Why are we not taught about credit, loans, and tax? I’m never going to use Pythagoras’s Theorem!” Certainly, when the subject of mathematics is brought up, the utility of algebra and theorems is often jovially dismissed as unimportant.

Two centuries ago, the picture was very different. The question of whether mathematics would be useful or dangerous knowledge to teach to the working class was debated extremely seriously. In November 1823, the same month that the London Mechanics’ Institution (which has now come to be named Birkbeck, University of London) was founded, Bell’s Weekly Messenger seized upon the propriety of teaching maths to London’s lower orders, lamenting that “the unhappy scepticism in France has been justly ascribed to this cause.” The implication was that teaching maths to the wider populace had caused them to question the order of society, and had directly contributed to the French Revolution and its aftermath. Pertinently, this was an order which the British government had spent a fortune, not to mention the lives of hundreds of thousands of British subjects, to restore in the Napoleonic Wars.

A revolution in Britain itself was still palpably feared in the 1820s, and its spectre was made more haunting by the Peterloo massacre of August 1819, just four years before this particular article was written. And so a war of words was waged over the foundation of our College and over which subjects were appropriate for it to teach.

The idea of teaching London’s working classes mathematics filled many with visceral dread. It was believed this would cause them, like France’s peasants, to become questioning, eventually seeking proof for statements which they had hitherto blindly accepted.

The teaching of mathematics to mechanics, then, was considered by many to be socially, politically, and morally dangerous. Not only might it turn them into a questioning multitude, unwilling to simply accept what they’re told, it might also make them question the very structure of society and push for a semblance of equality. For critics, both outcomes could readily lead to revolution.

Henry Brougham, one of the founders of the College, believed that this catastrophe could be averted by teaching a reified body of knowledge, including a simplified version of mathematics. Writing of geometry, Brougham argued that, rather than “go through the whole steps of that beautiful system, by which the most general and remote truths are connected with the few simple definitions and axioms” it would be sufficient (and indeed safer) if the masses were to learn only the practical operations and general utility of geometry.

Similarly, many religious supporters of extending mathematical education to the mechanics believed that it would make people more religious, not less, if only it were taught in the right way. As God was believed to have created the world, the logic and order inherent in mathematical systems were held to show traces of his hand at work. An appreciation of mathematics and its traceable, systematic connections would thus create a renewed appreciation of God, not to mention of the order of the world as divinely ordained.

Likewise, moralists perceived more benefits than drawbacks in teaching the mechanics mathematics. The issue for them was not whether the mechanics were to learn or read, but rather what. The key point was that the mechanics were already largely literate, and the rise of cheap literature, especially of the sentimental and pornographic varieties, preoccupied the minds of moralists and industrialists.

As the lower orders were believed to be motivated primarily by sensuality, learning mathematics was presented as a salve to degeneracy; a way to occupy their time with higher minded pursuits and strengthen their characters against wanton immorality.

Perhaps most worrying was the growing and uncontrollable availability of radical political writings. This more than anything was likely to upset the current order of society. The perceived and highly theoretical disadvantages of a mathematical education were thus infinitely preferable to such a realistic and allegedly growing threat. It was believed that the teaching of mathematics and science through a dedicated course of study, undertaken in the evenings, might reduce the time and energy the working man would have to devote to reading political tracts, let alone to political activism.

It is, however, worth noting that, although many mechanics were literate, and most had rudimentary mathematical skills, the wider debate was far removed from the reality. Many mechanics required far more elementary lessons in mathematics before the advanced classes could even be attempted. Although mathematics and science initially formed the centre of the curriculum at Birkbeck in the 1820s, by 1830 the reality of need had been discovered: advanced classes had been removed altogether, and instruction in elementary arithmetic was given to vast numbers of members. This remained the case for much of the next 30 years.

How far, then, the raging debates about the inclusion of mathematics in the curricula of new centres for working-class education impacted their trajectory is still a topic for debate.


Lillian Penson: the first PhD in the University of London

Lillian Margery Penson was the first person in the University of London to be awarded a PhD. In this blog, Joanna Bourke discusses the life and achievements of Penson. This blog is part of a series that celebrates 200 years since Birkbeck was established, as well as International Women’s Day on Sunday 8 March.

Lillian Margery Penson

Lillian Margery Penson © Royal Holloway College, RHC-BC.PH, 1.1, Archives, Royal Holloway University of London

Lillian Margery Penson (1896-1963) was an outstanding scholar and university administrator. She was the first person (of any sex) in the University of London to be awarded a PhD; she was the first woman to become a Professor of History at any British university; and she was the first woman in the UK and Commonwealth to become a vice-chancellor of a university, at the age of only 52. She owed her undergraduate and doctoral education to the History Department at Birkbeck.

Opinions about her were divided. Was she the “foremost woman in the academic life of our day” (The Scotsman), a “remarkable woman” (The Times), and someone who exuded “charm, tolerance, and a sense of humour”? Or was she an “imperious grande dame”, “très autoritaire”, and “too trenchant”? The answer is probably “a mixture”. Although Penson “could on occasion be brusque and even intimidating”, she “had a happy knack of getting to know people quickly”, was “an excellent judge of wine and loved good company”, and projected “a wealth of genuine kindness”. In other words, Penson was probably trapped in that familiar double-bind experienced by powerful women in male-dominated fields: she was admired for her intellect and determination, yet disparaged as a woman for possessing those same traits. One newspaper report on the achievements of “the professor” even referred to Penson using the masculine pronoun: “he”.

Who was Penson? She was born in Islington on 18 July 1896. Her father worked as a wholesale dairy manager and her family were of the Plymouth Brethren persuasion. Indeed, one colleague observed that the “marks of a puritanical upbringing were never effaced” and her “belief in work and duty” meant that she was always made uncomfortable by “flippant talk”. She never married.

From her youth, Penson was intrigued by diplomatic history, colonial policy, and foreign affairs. Her intellectual talents were obvious. In 1917, at the age of 21 years, she graduated from Birkbeck with a BA in History (first class). The war was at its height, so she joined the Ministry of National Service as a junior administrative officer (1917-18) before moving to the war trade intelligence department (1918-19). At the end of the war, Penson returned to her studies of history at Birkbeck and became, in 1921, the first person in the University of London to be awarded a PhD.

Penson’s achievement was even more remarkable because of her gender. After all, throughout the period from 1921 to 1990, only one-fifth of PhD students in history were female. Penson was also young. The average age for history students to complete their doctorates was their mid-30s; Penson was only 25 years old. Birkbeck immediately offered her a job as a part-time lecturer, during which time she also taught part-time at the East London Technical College, now Queen Mary University of London. In 1925, she was given a full-time lecturing post at Birkbeck.

More notably, she was the first female Vice-Chancellor of a university in the UK and the Commonwealth. Indeed, the second female vice-chancellor would not be appointed for another 27 years (this was Dr Alice Rosemary Murray, who was appointed Vice-Chancellor of Cambridge in 1975). In 1948, the University of Cambridge agreed to award degrees to women; the last time it had tried this (in 1897), there had been a riot. That year, the Queen, Myra Hess, and Penson became the first women to be awarded honorary Cambridge degrees (in Penson’s case, an LL.D or Doctor of Laws). The Scotsman decreed Penson’s academic and administrative talents to be “unsurpassed even in the annals of that great institution”.

Many of the values that Penson promoted were those at the heart of the Birkbeck mission. She spoke eloquently on the need to offer university education for “virtually all comers”, with no restriction based on religion, race, or sex. She was keen to insist that the job of the university teacher was to “do something more than impose upon the memories of our students masses of detailed information”.

As with many powerful women, she has largely been forgotten. After her death, a University of London Dame Lillian Penson fund was established to provide travel money for scholars engaged in research in the universities of the Commonwealth, especially Khartoum, Malta, the West Indies, and new universities in African countries. This fund seems to have disappeared. All that remains is a bricks-and-mortar legacy in the shape of Lillian Penson Hall, which still exists next to Paddington Station in Talbot Square, providing accommodation for over 300 students.


The Bonnart Trust PhD Scholarship

Zehra Miah is a Bonnart Scholar who is currently undertaking a PhD on the experiences of Turkish immigrants in London from 1971 to 1991. In this blog, she shares what it was like applying for the scholarship and how it has allowed her to pursue her project full-time.

Pictured: Idris Sinmaz (Zehra’s grandfather) came to London from Istanbul in 1971 to work in the restaurant of his landlord’s son. His two sons and wife joined him in 1973, his married daughter stayed in Turkey. This image was taken in 1980, by which time Idris had opened his own restaurant, Abant on Kingsland High Street in Dalston. Abant is a lake in his hometown of Bolu, Turkey.

Freddie Bonnart-Braunthal founded the Bonnart Trust to fund research aimed at tackling the causes and consequences of intolerance. Largely inspired by his own experiences of leaving Vienna in 1935 and of being branded an enemy alien and interned in the UK, he wanted to provide funding for scholars, such as myself, to explore these topics and to use their findings to help make a more tolerant and equal world.

When considering embarking on a PhD, one of the main hurdles, once you have written your proposal, met with a supervisor, perhaps even had an interview and secured a place, is how to pay for it! My own story is that I had returned to study as a mature student with three young children and a full-time job as an Executive Assistant. I had studied for my BA and MA at Birkbeck part-time and decided that if I was going to do a PhD then I wanted it to be all or nothing, so I applied for a full-time place. Starting the PhD meant the loss of my salary, not only for me but also for my family; even cobbling together the fees would have been a struggle. In short, without the Bonnart Trust seeing value in my research and awarding me the scholarship, I would, at best, be pushing through a part-time PhD or, more likely, have made the decision to take a different career path.

As a prospective student, you will already know from the institutions that you have applied to that, whilst there is not an awful lot of funding about, the offer differs from place to place, with every university having vastly different application processes. If you have chosen to study at Birkbeck, or you are considering it, and your research area fits within the remit of the Trust, namely addressing diversity and inclusivity or social justice and equality, then I would urge you to consider applying for this fantastic scholarship.

My research considers whether ethnic, religious and racial labels have helped or hindered the Turkish-speaking minorities in London between 1971 and 1999. When I read the guidelines and spoke to my supervisors (Professor David Feldman and Dr Julia Laite), it was clear that the Bonnart Trust Scholarship was most closely aligned with my research interests. I previously held a fees scholarship for my Master’s at Birkbeck, and one thing I was not aware of at the time was quite how many people I would meet and collaborate with, and how many opportunities to present come along, when you hold a scholarship.

These opportunities are worth just as much as the funding, which is full fees, an annual stipend, and a research allowance of up to £1,000. The scholarship is open to the entire School of Social Sciences, History and Philosophy so it is competitive, but I can honestly say that I thoroughly enjoyed the process. The form had very specific questions (as do all funding applications, and helpfully they all ask different things with different word counts). For the Bonnart Trust Scholarship I had to succinctly answer a number of questions about my research in general: why it was important, what sort of influence outside of academia I hoped for, and the possibilities it might offer to help address some of the Trust’s aims; no section allowed more than 250 words.

I am naturally a ‘better in the room’ sort of person, so when I was shortlisted and invited to interview I knew that this was my opportunity to demonstrate just how important I felt my research was. I can understand, though, that interviews can be a bit daunting; my interview for the scholarship involved a panel comprising a linguist, two historians and a political scientist (one of whom was a past Bonnart Scholar). I had lots of great advice, but there are two key points I want to share. Firstly, you are the expert and you love your project, but spend some time considering what could go wrong and what the challenges might be. Secondly, be ready to address every member of the panel, even if they are outside your discipline; find one thing within your research to engage with them on. They aren’t there to catch you out; they simply want to hear that you have thought through your ideas.

I was in Prague Castle when I got the email informing me that I had been successful, and I am so grateful that the funding is allowing me to carry out this work. Since starting my PhD I have had numerous opportunities to meet Bonnart Scholars working in other disciplines. Next term there is the annual Bonnart Trust research seminar, which will, I hope, be a great forum to meet more people who are interested in what I do and who are doing interesting things themselves; they are now my peers, colleagues and maybe even my future employers!

I would urge anyone who feels that their research aligns with the Trust’s mission to take a look at the website, have a read of the current and previous projects and see where you fit – and then apply!

Applications are now open for the Bonnart Trust PhD Scholarship and will close on 31 January 2020.


The deep historically rooted misperceptions revealed by Brexit

Dr Jessica Reinisch, Reader in Modern European History, discusses the ahistorical narrative around the UK and its European neighbours that is shaping Brexit. 

History has been part of the Brexit madness from the start. It’s hardly news that thinking about things that happened in the past is often directly shaped by perceived priorities in the present, but something rather more one-sided has been going on with history under Brexit. From the small group of Eurosceptic historians around David Abulafia and their problematic claims about Britain’s past, to the Tory MP Daniel Kawczynski’s wilful ignorance of the role of Marshall Aid in post-war Britain: history has never been far away from the Brexit politics of today.

Since the UK referendum campaigns, politicians have tried to bolster their support of ‘Leave’ with claims about history and arguments about the present in the light of the past. Some academic historians and a range of history buffs have been eager to oblige these efforts. For them, what Abulafia called “a historical perspective” involves rifling through the past for evidence that the Brexit project is valid, desirable, perhaps even inevitable. As their once marginalised opinions have become mainstream and their confidence has grown, Brexit-supporting historians have set the pace of historical debate, if we can call it that – or rather the proclamation of more or less uncontextualized (or simply false) statements about the past, followed by an often bewildered chorus of disquiet from historians at large.

The claims put forward by the ‘Historians for Britain’ were quickly rebuffed by historians across the UK, who presented plenty of evidence that undermined the supposed exceptionality and benevolence of the British path and its immutable separation from the rest of Europe. “Britain’s past”, they pointed out, was “neither so exalted nor so unique.” In a more recent piece in the New Statesman, Richard Evans, never one to run from a good fight, lists a catalogue of examples where today’s Brexiteers have manipulated and distorted the past to fit their political agenda. But these efforts notwithstanding, the politics of Brexit has spawned what Simon Jenkins has fittingly called “yah-boo history: binary storytelling charged with fake emotion, sucked dry of fact or balance.” Jenkins’ comment made reference to Labour’s John McDonnell, though as Richard Evans shows, the Tory Brexiteers’ list of abuses of history is rather more substantial and significant.

The problem isn’t so much that apparently everyone feels entitled to serendipitously dip into the past for findings to support whatever they believe in; it is rather that much of this history is so very un- and anti-historical. History has become a caricature of parochial dreams, nostalgias and made-up analogies to prop up binary political choices. At stake is the nature, direction and meaning of British history and Britain’s place in the world. But just as important is the question of whether history can really be scaled down to an apparently singular ‘British’ vs ‘European’ position.

It is high time for historians to restore complexity and take seriously the variety of geographical and historical vantage points which can bring to light very different timelines and priorities. A roundtable debate published by Contemporary European History has tried to do just that. It asked 19 historians of Europe working within very different national and historiographical traditions to reflect on the historical significance and contexts that gave rise to Brexit, and within which it should be understood. The result is a palette of pertinent historical contexts in which Brexit has made an appearance and can be analysed. As a consequence, some of the certainties appearing in the Brexiteers’ version of history suddenly seem much less certain.

The upshot of this roundtable cannot be easily reduced to a political headline, and that is precisely the point. Serious history rarely works that way. As the contributors show, the prospect of Brexit has revealed deep historically-rooted misperceptions between the UK and its European neighbours; Brexit in this sense is a process of stripping away dusty historical delusions about national paths and those of neighbouring countries. The essays demonstrate that Brexit has to be understood in the context of a long history of British claims about the uniqueness of the UK’s past. Europeans at times recognised British history as a model to be emulated. But they periodically also challenged the applicability of the British yardstick to their own national exceptionalisms, or pointed to an equally long history of connections between the UK, the British Empire and the European continent. The present debates about British history and its place in Europe and the world alter readings of the past, in some cases significantly. History is, once more, being rewritten in the light of Brexit.

This article first appeared on the Cambridge Core blog.


Yes, the monuments should fall

This article was contributed by Dr Joel McKim, from Birkbeck’s Department of Film, Media and Cultural Studies.

Lee Park, Charlottesville, VA

Writing in the 1930s, the Austrian writer Robert Musil famously noted that despite their attempted grandeur, there is nothing as invisible as a monument: “They are impregnated with something that repels attention, causing the glance to roll right off, like water droplets off an oilcloth, without even pausing for a moment.” It’s difficult to reconcile Musil’s observation with what we’ve witnessed from afar over the past week – a nation seemingly ripping itself apart, a statue of Robert E. Lee situated at the centre of the conflict. Michael Taussig has, more recently, suggested an important adjustment to Musil’s theory, arguing that it’s not until a monument is destroyed (or is slated for removal) that it succeeds in drawing our attention. The monument, Taussig reminds us, is often the first symbolic target in times of struggle. “With defacement,” he writes, “the statue moves from an excess of invisibility to an excess of visibility.”

The confederate statues in America and the dilemma over what to do with them became extremely visible this past week. It’s a discussion that has actually been taking place for some time now, with the removal in April and May of a number of monuments in New Orleans (including statues of Lee and Jefferson Davis, the president of the Confederacy) being a recent flashpoint. And there are of course many global and historical precedents to this debate, including the removal of racist and imperial icons in South Africa over the past several years and the precarious fate of Soviet-era statuary (see for example the excellent Disgraced Monuments, by our own Laura Mulvey and Mark Lewis). Decisions over what to do with symbols of past shame or troubling history also extend to the realm of preservation. Germany and Austria have recently been debating whether several architectural sites connected to the history of National Socialism, including the Nuremberg rally ground and the birthplace of Adolf Hitler, should be preserved, destroyed, or simply left to decay.

Apart from the abhorrent, but hopefully small, faction who see these symbols as worthy of veneration, another argument for keeping confederate monuments in place surfaces frequently from an apparently more benign viewpoint. We’ve all heard some variation of this position expressed over the past several days: “These monuments are an important reminder of our difficult and troubling history.” Or, “These statues help us to educate ourselves about what we have overcome.” Or, “If we destroy the past we will be doomed to repeat it.” While perhaps well-meaning, I believe this line of argument is misguided in a number of ways. I think it’s fundamentally mistaken in its understanding of both the social dynamic and the cultural history of monuments. Let me explain why.

Firstly, if monuments do have a significant educational purpose (and even this is questionable), it is certainly naïve to think this is the only mode by which they function. Rather than serving as references to the figure or event of the history they depict, public monuments communicate far more about the collective sentiments of our current period and the period in which they were erected. They express, in other words, rather than simply educate. The majority of confederate monuments, as New Orleans mayor Mitch Landrieu reminds us, were constructed not at the time of the civil war, but long afterwards, during moments of resurgence in pro-confederate sentiment and white backlash against black civil rights, such as the Southern Redemption period. They were much less a marker of a tragic but completed chapter in the nation’s history than an expression of a renewed commitment to the cultural values of the losing side. Nicholas Mirzoeff points out that the monument to Robert E. Lee in Charlottesville was completed in 1924 and reflects a period of intense KKK organizing in the area. That these monuments can still function today as rallying points for ethnic nationalists and white supremacists, rather than as neutral transmitters of a distant history, should be self-evident after this week’s events. Could whatever nominal educational value these monuments possess ever justify the continued emboldening role they play for these groups, or the genuine pain and distress they cause to so many who are forced to encounter them in public space? Ask a member of the black community if they are in need of a statue of Robert E. Lee to teach them about the history and continued impact of slavery and discrimination in America.

The second reason I think anxieties of the “destruction of history” type are misguided is that they don’t adequately recognize the always provisional and malleable nature of monuments and memorials. Far from being permanent or stable markers of history, monuments are perpetually being altered, moved, re-interpreted and reconsidered. They are contentious and contingent objects. The memorial landscape is continually in a process of adaptation. As Kirk Savage claims in his insightful history of the National Mall in Washington, “The history of commemoration is . . . a history of change and transformation.” Even the Lincoln Memorial, the most monumental of American memory sites, is an example of adaptation according to Savage. Its adoption as a symbol for the black civil rights movement occurred despite, rather than because of, its intended design – the planners deliberately downplayed Lincoln’s role in the abolition of slavery. Artist Krzysztof Wodiczko’s protest-oriented projections onto existing statues are another important example of how the struggle to determine a monument’s meaning may continue long after its construction. Some of the most powerful monuments and memorial proposals of the past few decades have incorporated an element of self-destruction or suspicion into their own form: from the Gerzes’ monument against fascism, which gradually disappears into the earth, to Maya Lin’s deliberately non-monumental Vietnam Veterans Memorial, to Horst Hoheisel’s proposal to blow up the Brandenburg Gate as a memorial for the murdered Jews of Europe. In short, public monuments change; their lifespan is not and probably shouldn’t be infinite. We don’t owe them that. The debate and conflict surrounding the removal of confederate monuments is obviously a clear indication that America is also currently undergoing a process of significant change. While the events of the past week have been worrying and sickening, I am heartened by the number of courageous people committed to ensuring that this change is a move forward, rather than a regression.

Dr Joel McKim is Lecturer in Media and Cultural Studies and the Director of the Vasari Research Centre for Art and Technology. His book Architecture, Media and Memory: Facing Complexity in a Post-9/11 New York is forthcoming in 2018 from Bloomsbury.

Further information


Bad Habits? France’s ‘Burkini ban’ in Historical Perspective

This article was written by Dr Carmen Mangion from Birkbeck’s Department of History, Classics and Archaeology. The article was originally published on the History Workshop Online blog.

The ‘burkini ban’ issued by 30 French beach towns at the end of July 2016 sparked a media frenzy. Town mayors saw the burkini, the full-body swimsuit favoured by some Muslim women as a means of maintaining modesty while enjoying the sea, as a symbol of Islamic extremism and a threat to ‘good morals and secularism’. France’s 1905 law separating Church and State embraces laïcité (a secularism in public affairs which prohibits religious expression), meant to limit religion’s influence on its citizens while still allowing freedom of religion; it originated as a means of eliminating the influence of the Catholic Church. Following ministerial criticism, France’s top administrative court investigated the ‘burkini ban’, ruling in late August that it violated basic freedoms.

Nuns at the beach (Facebook/Izzeddin Elzir)

Amidst this furore, Italian Imam Izzedin Elzir’s image of nuns on the beach in their religious habits triggered an international media response. The image, appearing across social media and in outlets as prominent as the New York Times, implied the hypocrisy of a ban targeting Muslims and ignoring Christians. The photos were ironic on two counts:

First, some French mayors were emphatic that nuns in habits were also forbidden on beaches.

Second, and more apposite to this blog post, both the media and ‘ordinary’ citizens seem to be unaware that the ‘nun’ on the streets of Paris (and elsewhere) once sparked a similar outrage.

The historical context was of course different (it always is), but the indignation and the drive to control women’s appearance were just as virulent. Such outrage was not limited to France, but as the ‘burkini ban’ was initiated by the French, it seems appropriate to begin with this bit of French history.

The French revolution of the 1790s, with its cry of liberté, égalité, fraternité was not such a good thing for Catholic nuns. The nun, in her religious habit, became a symbol of the Catholic Church’s role in upholding the inequities and injustices of the ruling classes within France. Catholic nuns, then fully habited, were visible on the streets of Paris as educators, nurses and providers of social welfare, and became targets of anti-clerical outrage. The republican political regime set French nuns ‘free’ from their lifelong vows of poverty, chastity, obedience and their religious habit. It closed convents and confiscated their property. Some members of religious communities weren’t so willing to be set free, however, and were imprisoned. They were told to remove their religious habits (made illegal in 1792) and instead wear secular garb. Nuns including the Carmelites of Compiègne and the Daughters of Charity of Arras were executed for refusing to take the oath of loyalty to the Constitution.

French citizens also made known their umbrage against nuns. One 1791 print representing revolutionary anticlericalism, La Discipline patriotique ou le fanatisme corrigé (‘The patriotic discipline or fanaticism corrected’), showed the market women of Paris’s les Halles disrobing and thrashing the religious fanaticism out of a group of nuns. Such disciplining of women’s bodies was both salacious and violent.


The patriotic discipline or Fanaticism corrected (La Discipline patriotique ou le fanatisme corrigée) (Image: BnF/Gallica)

This urge to control the religious ‘fanaticism’ of women and monitor their clothing choices was not only a French issue; it had earlier incarnations. The dissolution of the monasteries in England in the mid-sixteenth century also ‘freed’ nuns and monks from their vows — and their property. English women’s response was to form English convents in exile, many in France and Belgium. In the 1790s English nuns fled, often surreptitiously, back to England. But penal laws restricting Catholic practices were still in effect and English bishops initially discouraged nuns from wearing their religious habit. English citizens too showed their indignation at female religious life by throwing epithets and stones at nuns; the Salford house of the Daughters of Charity of St Vincent de Paul was set alight in 1847. Similar events happened in the United States: most notoriously, the Ursuline convent in Charlestown, Massachusetts was burned down by anti-nun rioters in 1834. In the Netherlands, in Spain, in Belgium, in Germany and more recently in Eastern and Central Europe, nuns were also targeted. Women in religious clothing were (and are) easy targets of vitriol and violence.

So burkinied Muslim women and habited Catholic nuns have far more in common than one might think. The nun’s religious habit, like the burkini, has links to a religious identity that runs counter to cultural norms. Critics say that women in burkinis challenge the French secular way of life. History shows that the habited nun likewise challenged both a republican version of Frenchness and an English version of Englishness.

Within this context, the burkini furore illustrates two points.

First, it is yet another disappointing reminder that women’s bodies and appearances remain far too often more relevant (and newsworthy) than women’s intellects and voices. Clothing regulations are an excuse to control women and to divert attention from more substantive issues. They are a means of enforcing a societal version of femininity that doesn’t allow for difference. Women choosing to wear religious dress (or dress associated with religious affiliation) should not be stigmatised.

Second, by focusing on the burkini, we forget the more salient issue of figuring out how diverse people can live together peacefully. It is the social, economic and political factors that need attention: cultural inclusion, high unemployment and participation in civic life. Criminalising what women wear on the beach doesn’t even come close to addressing these issues.

Further Reading:

  • Carmen Mangion, ‘Avoiding “rash and Imprudent measures”: English Nuns in Revolutionary Paris, 1789-1801’ in Communities, Culture and Identity: The English Convents in Exile, 1600-1800 edited by Caroline Bowden and James E. Kelly (Ashgate, 2013), pp. 247-63.
  • Gemma Betros, ‘Liberty, Citizenship and the Suppression of Female Religious Communities in France, 1789-90’, Women’s History Review, 18 (2009), 311–36.
  • For a robust comparison of nineteenth-century American nativism to the politics of Islam see José Casanova, ‘The Politics of Nativism Islam in Europe, Catholicism in the United States’, Philosophy & Social Criticism, 38 (2012), 485–95. A short and accessible version of this essay can be found here.

Hard right, soft power: fascist regimes and the battle for hearts and minds

This article was written by Dr David Brydan, a postdoctoral researcher in the Department of History, Classics and Archaeology and on Birkbeck’s Reluctant Internationalists project. It was originally published on The Conversation.

A new global “soft power” ranking recently reported that the democratic states of North America and Western Europe were the most successful at achieving their diplomatic objectives “through attraction and persuasion”.

Countries such as the US, the UK, Germany and Canada, the report claimed, are able to promote their influence through language, education, culture and the media, rather than having to rely on traditional forms of military or diplomatic “hard power”.

The notion of soft power has also returned to prominence in Britain since the Brexit vote, with competing claims that leaving Europe will either damage Britain’s reputation abroad or increase the importance of soft power to British diplomacy.

Although the term “soft power” was popularised by the political scientist Joseph Nye in the 1980s, the practice of states attempting to exert influence through their values and culture goes back much further. Despite what the current soft power list would suggest, it has never been solely the preserve of liberal or democratic states. The Soviet Union, for example, went to great efforts to promote its image to intellectuals and elites abroad through organisations such as VOKS (All-Union Society for Cultural Relations with Foreign Countries).

Perhaps more surprisingly, right-wing authoritarian and fascist states also used soft power strategies to spread their power and influence abroad during the first half of the 20th century. Alongside their aggressive and expansionist foreign policies, Hitler’s Germany, Mussolini’s Italy and other authoritarian states used the arts, science, and culture to further their diplomatic goals.

‘New Europe’

Prior to World War II, these efforts were primarily focused on strengthening ties between the fascist powers. The 1930s, for example, witnessed intensive cultural exchanges between fascist Italy and Nazi Germany. Although these efforts were shaped by the ideology of their respective regimes, they also built on pre-fascist traditions of cultural diplomacy. In the aftermath of World War I, Weimar Germany had become adept at promoting its influence through cultural exchanges in order to counter its diplomatic isolation. After 1933, the Nazi regime was able to shape Weimar-era cultural organisations and relationships to its own purpose.

Leni Riefenstahl, Hitler’s film-maker. Bundesarchiv Bild, CC BY

This authoritarian cultural diplomacy reached its peak during World War II, when Nazi Germany attempted to apply a veneer of legitimacy to its military conquests by promoting the idea of a “New Europe” or “New European Order”. Although Hitler was personally sceptical about such efforts, Joseph Goebbels and others within the Nazi regime saw the “New Europe” as a way to gain support. Nazi propaganda promoted the idea of “European civilization” united against the threat of “Asiatic bolshevism” posed by the Soviet Union and its allies.

As seen in Poland: a Nazi anti-Bolshevik poster

Given the lack of genuine political cooperation within Nazi-occupied Europe, these efforts relied heavily on cultural exchange. The period from the Axis invasion of the Soviet Union in June 1941 until the latter stages of 1943 witnessed an explosion of “European” and “international” events organised under Nazi auspices. They brought together right-wing elites from across the continent – from women’s groups, social policy experts and scientists to singers, dancers and fashion designers.

All of these initiatives, however, faced a common set of problems. Chief among them was the challenge of formulating a model of international cultural collaboration which was distinct from the kind of pre-war liberal internationalism which the fascist states had so violently rejected. The Nazi-dominated European Writers’ Union, for example, attempted to promote a vision of “völkisch” European literature rooted in national, agrarian cultures which it contrasted to the modernist cosmopolitanism of its Parisian-led liberal predecessors. But as a result, complained one Italian participant, the union’s events became “a little world of the literary village, of country poets and provincial writers, a fair for the benefit of obscure men, or a festival of the ‘unknown writer’”.

Deutschland über alles

Despite the language of European cooperation and solidarity which surrounded these organisations, they were ultimately based on Nazi military supremacy. The Nazis’ hierarchical view of European races and cultures prompted resentment even among their closest foreign allies.

Jesse Owens after disproving Nazi race theory at the Berlin Olympics, 1936. Bundesarchiv, Bild, CC BY-SA

These tensions, combined with the practical constraints on wartime travel and the rapid deterioration of Axis military fortunes from 1943 onwards, meant that most of these new organisations were both ineffective and short-lived. But for a brief period they succeeded in bringing together a surprisingly wide range of individuals committed to the idea of a new, authoritarian era of European unity.

Echoes of the cultural “New Europe” lived on after 1945. The Franco regime, for example, relied on cultural diplomacy to overcome the international isolation it faced. The Women’s section of the Spanish fascist party, the Falange, organised “choir and dance” groups which toured the world during the 1940s and 1950s, travelling from Wales to West Africa to promote an unthreatening image of Franco’s Spain through regional folk dances and songs.

But the far-right’s golden age of authoritarian soft power ended with the defeat of the Axis powers. The appeal of fascist culture was fundamentally undermined by post-war revelations about Nazi genocide, death camps and war crimes. At the other end of the political spectrum, continued Soviet efforts to attract support from abroad were hampered by the invasion of Hungary in 1956 and the crushing of the Prague Spring in 1968.

This does not mean that authoritarian soft power has been consigned to history. Both Russia and China made the top 30 of the most recent global ranking, with Russia in particular leading the way in promoting its agenda abroad through both mainstream and social media.

The new wave of populist movements sweeping Europe and the United States often also put the promotion of national cultures at the core of their programmes. France’s Front National, for example, advocates the increased promotion of the French language abroad on the grounds that “language and power go hand-in-hand”. We may well see the emergence of authoritarian soft power re-imagined in the 21st century.



The Iraq War, Brexit and Imperial blowback

This post was contributed by Dr Nadine El-Enany, lecturer at Birkbeck’s School of Law. Here, Dr El-Enany shares her personal thoughts on the historical context of the EU referendum, and the British vote to leave. This post first appeared on Truthout on Wednesday 6 July 2016.

The Union Jack, the flag of the UK

Brexit is a disaster we can only understand in the context of Britain’s imperial exploits. A Bullingdon boy (Oxford frat boy) gamble has thrown Britain into the deepest political and economic crisis since the second world war and has made minority groups across the UK vulnerable to racist and xenophobic hatred and violence.

People of colour, in particular those in the global South, know all too well what it is to be at the receiving end of the British establishment’s divisive top-down interventions. Scapegoating migrants is a divisive tool favoured by successive governments, but the British establishment’s divide and rule tactic was honed much further afield in the course of its colonial exploits. Britain has a long history of invading, exploiting, enslaving and murdering vast numbers of people, crimes for which it has never been held accountable.

Brexit

While the British Empire may be a thing of the past, British imperialism is not. This month the Chilcot inquiry reported on the role of Tony Blair’s government in the 2003 invasion of Iraq, which resulted in the deaths of nearly half a million Iraqis and the destabilization of the region, for which its inhabitants continue to pay the price. It is no coincidence that the Blairite wing of the Labour Party, amidst the Brexit chaos, launched a coup against their current leader, Jeremy Corbyn, who was set to call for Blair to be put on trial for war crimes.

The referendum that resulted in a 52 percent vote in favour of Britain leaving the EU was initiated by the Conservative government. Shortly after the result was announced, it became clear that the leaders of the Brexit campaign had not wanted this result. Boris Johnson MP appeared ashen-faced at a press conference. He had neither expected nor wanted to win the referendum. He only wanted to be next in line for Number 10 Downing Street. David Cameron, who had led the Remain campaign, resigned as Prime Minister immediately. He had called the referendum in a bid to keep the Conservative Party together, without sparing a thought for the lives that would be destroyed if the bet did not pay off. His gamble backfired, as did Boris Johnson’s. Michael Gove MP, who had been Johnson’s right-hand man in the Leave campaign, betrayed him within days of the result, announcing he would be running for Prime Minister, thereby ending Johnson’s bid to lead the country.

This series of events has thrown the Conservative Party into disarray, the very outcome Cameron had wanted to avoid. Nigel Farage, who stoked up unprecedented levels of racist hate and deserves much of the credit for the Brexit win, resigned as leader of the UK Independence Party on Monday, saying he “wants his life back.”

As political leaders jump ship in the wake of the Brexit vote, reports have emerged of a Britain divided, of a traumatized population, grieving and suffering the onset of depression. There is talk of the need for reconciliation in a country where communities and families have been divided. Alongside this, there are expressions of anger and demands for the British establishment to be held accountable for the outcome of the referendum.

There is no doubt that the feelings of anger and loss in the wake of Brexit are real, but where is our collective sense of outrage in the face of the establishment’s divisive and destructive actions elsewhere? After all, as Diamond Ashiagbor has argued, the deregulatory reforms entailed in the austerity policies imposed on EU countries after the 2007 financial crisis, with disastrous consequences including cuts to vital welfare services, are “medicine first trialled on the global South since the 70s”. Ashiagbor notes that “European states are experiencing this as a category error, in part because they have not been on the receiving end of such policies”, which are all too familiar in the global South.

Brexit is the fruit of empire

In the week following the announcement of the referendum results, two news items probably escaped most people’s attention. The UK Supreme Court delivered a ruling that further impedes the prospect of the Chagos Islanders returning to the home from which they were forcibly removed in 1971 by the colonial British government as part of a deal to allow the US to establish a military base on the largest island, Diego Garcia.

Also in the news last week were reports of 94-year-old Kenyan Nelson Njao Munyaka, who testified in the High Court about killings by British soldiers that he witnessed under British colonial rule in the 1950s. Munyaka is one of 40,000 Kenyans suing the British government over injuries and loss suffered in the course of its repression of the Mau Mau independence movement. Munyaka spoke of witnessing the shooting of his workmates, being made to carry their corpses, and the flashbacks he suffers of the physical and verbal assaults he endured at the hands of British soldiers.

Brexit is not only nostalgia for empire — it is also the fruit of empire. Britain is reaping what it sowed. The legacies of British imperialism have never been addressed, including that of racism. British colonial rule saw the exploitation of peoples, their subjugation on the basis of race, a system that was maintained through the brutal and systematic violence of the colonial authorities.

The prevalence of structural and institutional racism in Britain today made it fertile ground for the effectiveness of the Brexit campaign’s racist and dehumanizing rhetoric of “taking back control” and reaching “breaking point.” This rhetoric is entirely divorced from an understanding of British colonial history, including the country’s recent imperial exploits, which have destabilized and exploited regions and set in motion the migration of today.

Islamophobia powered the Blair-Bush war machine, allowing the lie to be peddled that only the Arab world produces brutal despots, and that the lives of nearly half a million Iraqis are an acceptable price to pay for Britain to be the closest ally of the world’s superpower. Just as the political leaders who called the EU referendum along with those who led the Leave campaign did so with no plan in place for the aftermath, so did the Bush-Blair coalition embark on the 2003 invasion of Iraq with catastrophic consequences. Thirteen years on, Iraqis continue to feel viscerally the trauma of war and the pain of their divided society. Only this week, another suicide bombing in a busy market place took the lives of more than 200 people.

Read Dr Nadine El-Enany’s original blog post at Truthout

The British establishment does not care to learn lessons from the past. Recall its thoughtless and entirely self-interested military intervention in Libya in 2011, which has left the country in a war-torn state of violence and chaos, a hotbed for ISIS. But we can learn lessons — lessons that might help the left build solidarity and resist repression in more productive ways. We can begin by understanding Brexit instability and our feelings of loss and fear in the context of longstanding and far-reaching oppression elsewhere. As for privileged Remainers with power and influence, they are disingenuous not to accept a large slice of responsibility for the outcome of the EU referendum. From New Labour’s redefining of the Left as “extreme centre,” to Labour’s “austerity lite,” to their support for imperial wars and the mainstream media’s marginalization of left voices and people of color, and their denial of racism, they oiled the wheels of the Brexit battle bus. It is no use for the powerful liberal mainstream to cry crocodile tears now. They would do better to recognize their role in creating the conditions for the sort of racism that propelled the Brexit campaign to victory.

Note: This post represents the views of the author and not those of Birkbeck, University of London

(Copyright, Truthout.org. Reprinted with permission)

Find out more
