Occupational Psychology at Birkbeck: the early years

This post by Gerry Randell, Emeritus Professor of Organisational Behaviour, University of Bradford, was originally published in 2009.

Birkbeck Occupational Psychology: staff and students in October 1958

The first master’s students in Occupational Psychology in Britain graduated from Birkbeck 50 years ago this October: I was one of them.

A postgraduate diploma in industrial and commercial psychology had been on the statutes of the University of London since the 1920s, mainly at the instigation of the National Institute of Industrial Psychology and taught by and tailored to the Institute’s staff. Alec Rodger had been on the staff of the NIIP in the 30s and had risen to be Head of Vocational Guidance. In the early years of the war, most of the NIIP staff were drafted into the services, mainly to work on personnel selection. Alec became the Senior Psychologist for the Admiralty. After the war he was appointed Reader in Occupational Psychology (a term he invented) at Birkbeck and set about resuscitating the old diploma course. He published an article in Occupational Psychology in 1952 describing and explaining the curriculum for the new ‘Postgraduate Diploma in Occupational Psychology’ that he had just established. It is probable that the first students on this course were young NIIP staff and Alec’s friends. One of them was Peter Cavanagh, whom Alec had spotted as someone who had scored particularly well on the Navy’s selection tests and had somehow arranged for him to be allocated to the Senior Psychologist’s Department. Subsequently Peter joined Alec at Birkbeck as his first Lecturer in Occupational Psychology.

A diploma, not being an attractive qualification for budding occupational psychologists, was not pulling in the students in the early 50s, so Alec then set about manoeuvring for it to become a master’s and recruiting students on the strength of that. He happened to be the undergraduate External Examiner for psychology at Nottingham at that time and persuaded two of the students he examined, Peter Henderson and me, to sign up for the two-year part-time course that was to become an MSc/MA. When we turned up at Birkbeck in October 1956 there was a third student on the course, Russell Wicks from UCL. There was also a ‘visitor’ – Mrs Hussein from India – who would be ‘sitting in’; over the years Alec was very welcoming to ‘visitors’ from all over the world. We assembled in room 408 on the top floor of the college from 6 to 9pm on Tuesdays, Wednesdays and Thursdays.

During the year, the Diploma was turned into a Master’s degree, so the three of us had to re-register and look forward to an extra year of attendance! In 1957 eight new students enrolled and joined in the lectures/discussions with us; in 1958 a further nine enrolled. After submitting our dissertations in September, eight of us graduated in October 1959, Professor Leslie Hearnshaw of Liverpool being the External Examiner. Of the three of us in cohort 1, Russell went on to teach at Surrey, Peter to Queen’s Belfast, and I stayed on at Birkbeck as Alec’s first Assistant Lecturer.


Twenty years of network learning

Malcolm Ballantyne reflects on how this unique model of blended learning developed at Birkbeck.

Picture of Organizational Psychology students in 1958 and 2008.

As far as we know, the MScs in Organizational Psychology and Organizational Behaviour were the first degree courses in the UK to require students to interact online. So why did it happen at Birkbeck? As is often the case, there was no single reason: three factors contributed.

First, in the mid-1980s, the whole of Birkbeck faced a financial crisis. There was a change in the funding formula for part-time students that assumed that a university’s core funding should be based on full-time student numbers and that part-time students were a marginal additional cost. It didn’t work for Birkbeck and for a while it looked as if the whole College might close. The matter was resolved, but only for undergraduate students. Postgraduate departments had, in effect, to double the number of students. In Occupational Psychology (as it then was) we quickly realised this meant extending our catchment area beyond London.

Quite coincidentally, the Psychological Services Division of the Manpower Services Commission approached the Department, asking if we could develop a distance learning version of the MSc for their psychologists who were based throughout the country. If so, they could support the necessary curriculum development.

Lastly, I had been experimenting with on-line tutorials on my second-year module ‘People and Advanced Technology’. The technological support for this was very crude but I had actually done it as early as 1981. Looking back, I think I was the person who needed persuading the most but we brought these three factors together and the result was Network Learning.

A story which hasn’t been told is how, as an occupational psychologist, I was running on-line tutorials in the early 1980s. I came to psychology relatively late in my career; I didn’t get my BSc until I was 30. For the first ten years of my working life, I worked as a television technician for the BBC. I quickly discovered that a technical career was not for me, but the work was interesting, and I became absorbed by the experience of technological change. Between 1960 and 1970, the original 405-line television system was replaced by the 625-line system, colour television was introduced and, less obviously but more profoundly, valves were replaced first by transistors and then by the first generation of silicon chips. The work I and my colleagues did was transformed dramatically.

Having got my degree, I then worked as a psychologist for British Steel and saw even more dramatic effects of technological change on heavy industry with essentially heavy manual jobs becoming mechanised and computer-controlled.

And so, in 1974, to Birkbeck, as one of the two last lecturers to be taken on by Alec Rodger – Leonie Sugarman and I were interviewed on the same day. I covered ergonomics and work design and, when we redesigned the course in 1976, I started my second year module on the effects of changing technology on people’s working lives.

In the summer of that year, the College very generously supported me in attending a NATO ‘Advanced Study Institute’ – two weeks in Greece working with some of the world’s top human-computer interaction specialists – and it was here that I first became familiar with the work being done on the organizational impact of IT. This also led, three years later, to my being invited to join a British Library-funded project in which we aimed to replicate the production of an academic journal on-line. The software we used was an early computer conferencing system called Notepad which, incidentally, gave me access to e-mail for the first time. In 1979, there weren’t many people to send messages to.

In 1978, a very influential book on computer mediated communication was published, Hiltz & Turoff’s ‘The Network Nation’. This described how computer conferencing systems had first been created and, significantly from my point of view, raised the possibility of interactive learning systems. I had to try it and persuaded Brian Shackel, the director of the BL project, to allow me to register my 1981 students on Notepad. It was immensely difficult. The computer was at the University of Birmingham and there weren’t that many dial-up terminals at Birkbeck. The telephone system was quite unreliable in those days but we actually managed to make contact and run some on-line discussions.

Following this experience, I applied for funding for more reliable technology. The feedback I received for these unsuccessful bids suggested that what I was proposing wasn’t really understood. So, with the arrival of the first Birkbeck VAX computer, I wrote a system myself – OPECCS, the Occupational Psychology Experimental Computer Conferencing System. It was very simple – and by this time we had e-mail in the form of VAXmail – but it worked quite well and I think it must have been around this time that my colleagues became aware of what I was doing.

So, why has Network Learning been so successful? It’s difficult to be certain, but my own feeling is that it was because we had a clear philosophy from the start. We started with the assumption that what made the Birkbeck approach distinctive was the opportunity for students to meet face-to-face and discuss things informally – allowing for what John Dewey called ‘collateral learning’, learning which is neither planned nor intended but which nevertheless happens and is significant. In taking on students from a wider geographical spread, the face-to-face element would have to be less frequent but should remain an essential part of the process. The purpose of the technology was to maintain continuity of discussion between these face-to-face meetings. This was quite unlike the Open University’s approach, where the process was seen as a distance learning experience and the technology was an additional, and optional, means of support.

More recent ideas, particularly from knowledge management, would support the Birkbeck approach. The debate on ‘stickiness’ and ‘leakiness’ of knowledge in organisations (why is it so difficult to get ‘best practice’ transferred across an organization while the company’s best guarded secrets disappear out of the back door to one’s competitors?) recognises the importance of face-to-face contact in the transfer of tacit knowledge. Even Microsoft, determined to operate its R&D function in Washington State, eventually had to relocate to Silicon Valley. I’ve never been able to understand those who maintain that for ‘true distance learning’ there must be no face-to-face contact.

I left the Department at the time that Network Learning was starting in earnest. We had one year of a pilot with Manpower Services Commission psychologists as students – it was shaky but it worked. Today, it’s wonderful to see the success that has followed.


What are the origins of the Pride March?

Although this year’s Pride March has been cancelled, we wish to highlight and celebrate the history of the annual celebration. In this blog, Rebekah Bonaparte, Communications Officer at Birkbeck, explores the radical roots of the annual Pride March.

June usually marks Pride Month. The streets of London and many UK towns and cities are adorned with the famous Pride rainbow, as thousands would usually turn out in celebration of the lesbian, gay, bisexual, transgender and queer (LGBTQ) community.

Many will now be familiar with the rainbow flag that has become increasingly visible throughout the month of June. The Pride logo can be seen on the websites of corporations and organisations as the internationally recognised event has become increasingly mainstream. But what are the origins of the Pride march?

The Stonewall Inn

Although there had been groups campaigning for the rights of the LGBTQ community to be recognised before the 1960s, the Stonewall Uprising is thought of as an important moment in the fight for gay rights in the US and beyond.

The uprising began when New York police officers raided the Stonewall Inn bar on 28 June 1969. Police raids of gay and lesbian bars were commonplace at this time, and this instance proved to be the catalyst for an outpouring of fury amongst the LGBTQ+ community, who were continually targeted by the police. A lesbian woman, Stormé DeLarverie, who is thought to be one of the first to fight back at Stonewall, insisted that what is often labelled a ‘riot’ was “a rebellion.”

Six days of protests followed the raid on the Stonewall Inn and figures such as Marsha P. Johnson and Sylvia Rivera emerged as leaders of the revitalised movement.

The following year, the Christopher Street Liberation Day Umbrella Committee held its first march, called ‘Christopher Street Liberation Day’, to commemorate the Stonewall uprising and promote cohesion amongst the LGBTQ community. Today, the Stonewall Inn is considered a national landmark and LGBTQ+ Pride marches are held across the world in June.

Pride in London

In 1970 two British activists, Aubrey Walter and Bob Mellors, founded the Gay Liberation Front in a basement of the London School of Economics. Walter and Mellors were said to have been inspired by the Black Panthers (that year they attended the Black Panthers’ Revolutionary People’s Convention), as well as by the various liberation movements taking place all over the world. At the time in the UK, homosexuality had only been partially decriminalised and homophobia was widely accepted.

The Gay Liberation Front in London held its first Pride rally on 1 July 1972 (the closest Saturday to the Stonewall anniversary) and continued to host annual rallies until the event became more of a carnival in the 1990s. In 1996 it was renamed Lesbian, Gay, Bisexual and Transgender Pride. The march was thought of as a display of solidarity and self-acceptance, but also as a vehicle to drive social change and challenge injustice.

The Pride March has been held in London and across the UK since. It is characterised by its carnival spirit, offering a safe space for members of the LGBTQ community to assert their identities and celebrate their achievements. In recent years it has become increasingly mainstream, with corporations and organisations capitalising on the annual celebration, and some believe it has become far removed from its radical roots.

#YouMeUsWe

The organisation Pride in London was set up in 2004 and has been arranging the march since. Unfortunately, this month’s Pride event had to be cancelled, but the organisation has announced its virtual campaign, #YouMeUsWe, which calls on members of the community to practise allyship and challenge instances of discrimination and marginalisation.

Pride remains a visual reminder of the continued struggle for LGBTQ+ rights across the world, and a source of hope and jubilation for many.


Cancel the Window-Cleaning Contract!

Professor Jerry White, Professor of Modern London History at Birkbeck, recounts how the College fared during the Second World War. This blog is part of the 200th-anniversary series, marking the founding of the College and the 75th anniversary of Victory in Europe Day.

Bomb damage to Birkbeck Library

Bomb damage to Birkbeck Library. The area around Birkbeck College was bombed during the air-raid of 10-11 May 1941. The resultant fire destroyed the Library. Image courtesy of Birkbeck History collection.

Most of London University shut down on the declaration of war in September 1939. The headquarters at Senate House was taken over by the Ministry of Information and most colleges were evacuated (like much of the BBC, many government departments and most of London’s hospitals) to areas thought to be less vulnerable to bombing. University College shifted to Aberystwyth and elsewhere in Wales, King’s to Bristol, LSE and Bedford to Cambridge, and so on. Birkbeck, its London roots deeper than any of its sister colleges and so unable to be useful to Londoners if sent to the country, resolved to close on the outbreak of war and for a time did so. But the war failed to open with a bang and in the absence of air attack, or apparently any likelihood of bombing for the immediate future, Birkbeck reopened at the end of October 1939. Indeed, it didn’t merely reopen but expanded its offer: for the first time, extensive daytime teaching was made available for those London students unable to follow their chosen university colleges out of the capital. And despite the blackout, a wide range of evening teaching also resumed.

Birkbeck was not yet at its present Bloomsbury site. That building contract had been let but work had to stop in July 1939 because of the uncertain international situation – contractors were given more pressing projects to work on, both civil defence and industrial – and in fact the new college would not be completed and occupied till 1951. So Birkbeck was still in its late-Victorian location in Breams and Rolls Buildings, straddling the City and Holborn boundary west of Fetter Lane, incidentally sharing a party wall with the Daily Mirror building. It had some near misses during the main blitz of 1940-41 and narrowly escaped total destruction in the great City fire raid of 29 December 1940, which opened a view – never before seen – of St Paul’s from the college windows. From that time on all places of work had to arrange a fireguard of staff to be in the building at night time to deal with incendiaries and raise the fire brigade if necessary. There followed nearly three-and-a-half years of relative quiet, with sporadic bombing of London and the Baby Blitz of early 1944 rarely troubling the college and its work. But Birkbeck would nearly meet its nemesis from a V1 flying bomb (or doodle-bug) at 3.07am on 19 July 1944.

Dr A. Graham was a member of the college fireguard that night, on the 1-3am watch.

I wakened Jackson [the College accountant] to do the 3-5am spell…. We were saying a few words to one another when we heard The Daily Mirror alarm go. Suddenly the bomb, which had merely been a near one until that second … dived without its engine stopping. Its noise increased enormously; Jackson and I looked at one another in silence; and I remember wondering what was going to happen next. What did happen was all over before we realised it had happened … a gigantic roar from the engine of the bomb, not the noise of an explosion, but a vast clattering of material falling and breaking, a great puff of blast and soot all over the room, and then utter quiet. Massey [another fire watcher] raised his head from the bed where he had been asleep and asked what all that was….

As the dust settled Graham climbed over the flattened metal doors of the College and went into the street. The first thing he heard was footsteps coming at a run up Breams Buildings. It was a Metropolitan police constable: ‘he called backwards into the darkness… “It’s all right, George, it’s in the City”’; satisfying himself there were no urgent casualties he promptly disappeared. Troup Horne, the College secretary from 1919-1952, was also one of the fireguard but, not wanted till 5am, was in a makeshift bed in his office: ‘At 3.06am I was awakened by a doodle overhead. Thinking we were for it, I pulled a sheet over my head to keep the plaster out of my remaining hairs; and five seconds later the damned thing went pop.’ Horne was found ‘covered from head to foot with soot, dust, and thousands of fragments of broken glass and other bits scattered from the partition which separated the general office from his room.’ His chief assistant, Phyllis Costello, was also sleeping in the College that night and was frequently part of the fireguard. She rushed to see if he was injured and was greeted by Horne instructing, ‘Cancel the window-cleaning contract’.

Indeed, there were no windows left anywhere in the College. For some time after, a witticism coined in Fleet Street during the main Blitz was Birkbeck’s watchword: ‘We have no panes, dear mother, now.’*

*Edward Farmer (1809?-1876), ‘The Collier’s Dying Child’: ‘I have no pain, dear mother, now.’ All the information used here comes from E.H. Warmington, A History of Birkbeck College University of London During the Second World War 1939-1945, published by Birkbeck in 1954.


Maths for the Masses

In this blog, Ciarán O’Donohue, a PhD student in the Department of History, Classics and Archaeology, discusses the decision to teach mathematics to the first students of the Mechanics’ Institution. This is part of the 200th-anniversary blog series that celebrates the College’s bicentenary in 2023.

The Massacre of Peterloo

The Massacre of Peterloo. The commander is saying “Down with ’em! Chop ’em down my brave boys; give them no quarter! They want to take our Beef & Pudding from us – & remember the more you kill the less poor rates you’ll have to pay so go on Lads show your courage & your Loyalty!”

Many of us will be familiar with the common questioning of why certain concepts are taught in our schools. Mathematics, and especially its most intricate systems, is often the first to face the firing squad. It is not unusual to hear someone discussing education ask: “Why are we not taught about credit, loans, and tax? I’m never going to use Pythagoras’s Theorem!” Certainly, when the subject of mathematics is brought up, the utility of algebra and theorems is often jovially dismissed as unimportant.

Two centuries ago, the picture was very different. The question of whether mathematics would be useful or dangerous knowledge to teach to the working class was debated extremely seriously. In November 1823, the same month that the London Mechanics’ Institution was founded (which has now come to be named Birkbeck, University of London), Bell’s Weekly Messenger seized upon the propriety of teaching maths to London’s lower orders, lamenting that “the unhappy scepticism in France has been justly ascribed to this cause.” The implication was that teaching maths to the wider populace had caused them to question the order of society, and had directly contributed to the French Revolution and its aftermath. Pertinently, this was an order which the British government had spent a fortune, not to mention the lives of hundreds of thousands of British subjects in the Napoleonic Wars, to restore.

A revolution in Britain itself was still palpably feared in the 1820s, and its spectre was made more haunting by the Peterloo Massacre of August 1819, just four years before this particular article was written. And so a war of words was waged over the foundation of our College and over which subjects it would be appropriate to teach.

The idea of teaching London’s working classes mathematics filled many with visceral dread. It was believed this would cause them, like France’s peasants, to become questioning, eventually seeking proof for statements which they had hitherto blindly accepted.

The teaching of mathematics to mechanics, then, was considered by many to be socially, politically, and morally dangerous. Not only might it turn them into a questioning multitude, unwilling to simply accept what they’re told, it might also make them question the very structure of society and push for a semblance of equality. For critics, both outcomes could readily lead to revolution.

Henry Brougham, one of the founders of the College, believed that this catastrophe could be averted by teaching a reified body of knowledge, including a simplified version of mathematics. Writing of geometry, Brougham argued that, rather than “go through the whole steps of that beautiful system, by which the most general and remote truths are connected with the few simple definitions and axioms” it would be sufficient (and indeed safer) if the masses were to learn only the practical operations and general utility of geometry.

Similarly, many religious supporters of extending mathematical education to the mechanics believed that it would make people more religious, not less, if only it were taught in the right way. As God was believed to have created the world, the logic and order inherent in mathematical systems was held to show traces of his hand at work. An appreciation of mathematics and its traceable, systematic connections would thus create a renewed appreciation of God, not to mention of the order of the world as divinely ordained.

Likewise, moralists perceived more benefits than drawbacks in teaching the mechanics mathematics. The issue for them was not whether the mechanics were to learn or read, but rather what. The point was that the mechanics were already largely literate. The rise of cheap literature, especially of the sentimental and pornographic varieties, preoccupied the minds of moralists and industrialists.

As the lower orders were believed to be motivated primarily by sensuality, learning mathematics was presented as a salve to degeneracy; a way to occupy their time with higher minded pursuits and strengthen their characters against wanton immorality.

Perhaps most worrying was the growing and uncontrollable availability of radical political writings. This more than anything was likely to upset the current order of society. The perceived and highly theoretical disadvantages of a mathematical education were thus infinitely preferable to such a realistic and allegedly growing threat. It was believed that the teaching of mathematics and science through a dedicated course of study, undertaken in the evenings, might reduce the time and energy the working man would have to devote to reading political tracts, let alone to political activism.

It is, however, worth noting that, although many mechanics were literate, and most had rudimentary mathematical skills, the wider debate was far removed from the reality. Many mechanics required far more elementary lessons in mathematics before the advanced classes could even be attempted. Although mathematics and science initially formed the centre of the curriculum at Birkbeck in the 1820s, by 1830 the reality of the need had been discovered: advanced classes had been removed altogether, and instruction in elementary arithmetic was given to vast numbers of members. This was to continue to be the reality for much of the next 30 years.

How far, then, the raging debates about the inclusion of mathematics in the curricula of new centres for working-class education impacted their trajectory is still a topic for debate.


Lillian Penson: the first PhD in the University of London

Lillian Margery Penson was the first person in the University of London to be awarded a PhD. In this blog, Joanna Bourke discusses her life and achievements. It is part of a series that celebrates 200 years since Birkbeck was established and International Women’s Day on Sunday 8 March.

Lillian Margery Penson

Lillian Margery Penson © Royal Holloway College, RHC-BC.PH, 1.1, Archives, Royal Holloway, University of London

Lillian Margery Penson (1896-1963) was an outstanding scholar and university administrator. She was the first person (of any sex) in the University of London to be awarded a PhD; she was the first woman to become a Professor of History at any British university; and she was the first woman in the UK and Commonwealth to become a vice-chancellor of a university, at the age of only 52. She owed her undergraduate and doctoral education to the History Department at Birkbeck.

Opinions about her were divided. Was she the “foremost woman in the academic life of our day” (The Scotsman), a “remarkable woman” (The Times), and someone who exuded “charm, tolerance, and a sense of humour”? Or was she an “imperious grande dame”, “très autoritaire”, and “too trenchant”? The answer is probably “a mixture”. Although Penson “could on occasion be brusque and even intimidating”, she “had a happy knack of getting to know people quickly”, was “an excellent judge of wine and loved good company”, and projected “a wealth of genuine kindness”. In other words, Penson was probably trapped in that familiar double-bind experienced by powerful women in male-dominated fields: she was admired for her intellect and determination, yet disparaged as a woman for possessing those same traits. One newspaper report on the achievements of “the professor” even referred to Penson using the masculine pronoun: “he”.

Who was Penson? She was born in Islington on 18 July 1896. Her father worked as a wholesale dairy manager and her family were of the Plymouth Brethren persuasion. Indeed, one colleague observed that the “marks of a puritanical upbringing were never effaced” and her “belief in work and duty” meant that she was always made uncomfortable by “flippant talk”. She never married.

From her youth, Penson was intrigued by diplomatic history, colonial policy, and foreign affairs. Her intellectual talents were obvious. In 1917, at the age of 21 years, she graduated from Birkbeck with a BA in History (first class). The war was at its height, so she joined the Ministry of National Service as a junior administrative officer (1917-18) before moving to the war trade intelligence department (1918-19). At the end of the war, Penson returned to her studies of history at Birkbeck and became, in 1921, the first person in the University of London to be awarded a PhD.

Penson’s achievement was even more remarkable because of her gender. After all, throughout the period from 1921 to 1990, only one-fifth of PhD students in history were female. Penson was also young. The average age for history students to complete their doctorates was their mid-30s; Penson was only 25 years old. Birkbeck immediately offered her a job as a part-time lecturer, during which time she also taught part-time at the East London Technical College, now Queen Mary University of London. In 1925, she was given a full-time lecturing post at Birkbeck.

More notably, she was the first female Vice-Chancellor of a university in the UK and the Commonwealth. Indeed, the second female vice-chancellor would not be appointed for another 27 years (Dr Alice Rosemary Murray, appointed Vice-Chancellor of Cambridge in 1975). In 1948, the University of Cambridge agreed to award degrees to women. The last time it had tried this (in 1897), there had been a riot. In 1948, however, the Queen, Myra Hess, and Penson became the first women to be awarded honorary Cambridge degrees (in Penson’s case, an LL.D, or Doctor of Laws). The Scotsman decreed Penson’s academic and administrative talents to be “unsurpassed even in the annals of that great institution”.

Many of the values that Penson promoted were those at the heart of the Birkbeck mission. She spoke eloquently on the need to offer university education for “virtually all comers”, with no restriction based on religion, race, or sex. She was keen to insist that the job of the university teacher was to “do something more than impose upon the memories of our students masses of detailed information”.

As with many powerful women, she has largely been forgotten. After her death, a University of London Dame Lillian Penson Fund was established to provide travel money for scholars engaged in research in the universities of the Commonwealth, especially Khartoum, Malta, the West Indies, and the new universities in African countries. This fund seems to have disappeared. All that remains is a bricks-and-mortar legacy in the shape of Lillian Penson Hall, which still stands next to Paddington Station in Talbot Square, providing accommodation for over 300 students.


The Bonnart Trust PhD Scholarship

Zehra Miah is a Bonnart Scholar who is currently undertaking a PhD on the experiences of Turkish immigrants in London from 1971 to 1991. In this blog, she shares what it was like applying for the scholarship and how it has allowed her to pursue her project full-time.

Pictured: Idris Sinmaz (Zehra’s grandfather) came to London from Istanbul in 1971 to work in the restaurant of his landlord’s son. His two sons and wife joined him in 1973, his married daughter stayed in Turkey. This image was taken in 1980, by which time Idris had opened his own restaurant, Abant on Kingsland High Street in Dalston. Abant is a lake in his hometown of Bolu, Turkey.

Freddie Bonnart-Braunthal founded the Bonnart Trust to fund research aimed at tackling the causes and consequences of intolerance. Largely inspired by his own experiences of leaving Vienna in 1935 and being branded an enemy alien and interned in the UK, he wanted to provide funding for scholars, such as myself, to explore these topics and to use their findings to help make a more tolerant and equal world.

When you are considering embarking on a PhD, one of the main hurdles (once you have written your proposal, met with a supervisor, perhaps even had an interview and secured a place) is how to pay for it! My own story is that I had returned to study as a mature student with three young children and a full-time job as an Executive Assistant. I had studied for my BA and MA at Birkbeck part-time and decided that if I was going to do a PhD I wanted it to be all or nothing, so I applied for a full-time place. Starting the PhD meant the loss of my salary, not only for me but for my family, and even cobbling together the fees would have been a struggle. In short, without the Bonnart Trust seeing value in my research and awarding me the scholarship, I would, at best, be pushing through a part-time PhD or, more likely, have decided to take a different career path.

As a prospective student, you will already know from the institutions you have applied to that there is not an awful lot of funding about, and that the offer differs from place to place, with every university having a vastly different application process. If you have chosen to study at Birkbeck, or you are considering it and your research area fits within the remit of the Trust, namely diversity and inclusivity or social justice and equality, then I would urge you to consider applying for this fantastic scholarship.

My research considers whether ethnic, religious and racial labels helped or hindered the Turkish-speaking minorities in London between 1971 and 1999. When I read the guidelines and spoke to my supervisors (Professor David Feldman and Dr Julia Laite), it was clear that the Bonnart Trust Scholarship was closely aligned with my research interests. I previously held a fees scholarship for my Master’s at Birkbeck, and one thing I was not aware of at the time is quite how many people you meet and collaborate with, and how many opportunities to present your work come along, when you hold a scholarship.

These opportunities are worth just as much as the funding, which covers full fees, an annual stipend, and a research allowance of up to £1,000. The scholarship is open to the entire School of Social Sciences, History and Philosophy, so it is competitive, but I can honestly say that I thoroughly enjoyed the process. The form had very specific questions (as do all funding applications, and helpfully they all ask different things with different word counts). For the Bonnart Trust Scholarship I had to answer, succinctly, a number of questions about my research in general: why it was important, what sort of influence outside academia I hoped for, and the possibilities it might offer to help address some of the Trust’s aims; no section allowed more than 250 words.

I am naturally a ‘better in the room’ sort of person, so when I was shortlisted and invited to interview I knew that this was my opportunity to demonstrate just how important I felt my research was. I can understand, though, that interviews can be daunting: mine involved a panel comprising a linguist, two historians and a political scientist (one of whom was a past Bonnart Scholar). I had lots of great advice, but there are two key points I want to share. Firstly, you are the expert and you love your project, but spend some time considering what could go wrong and what the challenges might be. Secondly, be ready to address every member of the panel, even those outside your discipline; find one thing within your research to engage each of them on. They aren’t there to catch you out; they simply want to hear that you have thought through your ideas.

I was in Prague Castle when I got the email informing me that I had been successful, and I am so grateful that the funding is allowing me to carry out this work. Since starting my PhD I have had numerous opportunities to meet Bonnart Scholars working in other disciplines. Next term there is the annual Bonnart Trust research seminar, which will, I hope, be a great forum to meet more people who are interested in what I do and are doing interesting things themselves; they are now my peers, colleagues and maybe even my future employers!

I would urge anyone who feels that their research aligns with the Trust’s mission to take a look at the website, have a read of the current and previous projects and see where you fit – and then apply!

Applications are now open for the Bonnart Trust PhD Scholarship and will close on 31 January 2020.


The deep historically rooted misperceptions revealed by Brexit

Dr Jessica Reinisch, Reader in Modern European History, discusses the ahistorical narrative around the UK and its European neighbours that is shaping Brexit. 

History has been part of the Brexit madness from the start. It’s hardly news that thinking about things that happened in the past is often directly shaped by perceived priorities in the present, but something rather more one-sided has been going on with history under Brexit. From the small group of Eurosceptic historians around David Abulafia and their problematic claims about Britain’s past, to the Tory MP Daniel Kawczynski’s wilful ignorance of the role of Marshall Aid in post-war Britain: history has never been far away from the Brexit politics of today.

Since the UK referendum campaigns, politicians have tried to bolster their support of ‘Leave’ with claims about history and arguments about the present in the light of the past. Some academic historians and a range of history buffs have been eager to oblige these efforts. For them, what Abulafia called “a historical perspective” involves rifling through the past for evidence that the Brexit project is valid, desirable, perhaps even inevitable. As their once marginalised opinions have become mainstream and their confidence has grown, Brexit-supporting historians have set the pace of historical debate, if we can call it that – or rather the proclamation of more or less uncontextualized (or simply false) statements about the past, followed by an often bewildered chorus of disquiet from historians at large.

The claims put forward by the ‘Historians for Britain’ were quickly rebuffed by historians across the UK, who presented plenty of evidence that undermined the supposed exceptionality and benevolence of the British path and its immutable separation from the rest of Europe. “Britain’s past”, they pointed out, was “neither so exalted nor so unique.” In a more recent piece in the New Statesman, Richard Evans, never one to run from a good fight, lists a catalogue of examples where today’s Brexiteers have manipulated and distorted the past to fit their political agenda. But these efforts notwithstanding, the politics of Brexit has spawned what Simon Jenkins has fittingly called “yah-boo history: binary storytelling charged with fake emotion, sucked dry of fact or balance.” Jenkins’ comment made reference to Labour’s John McDonnell, though as Richard Evans shows, the Tory Brexiteers’ list of abuses of history is rather more substantial and significant.

The problem isn’t so much that apparently everyone feels entitled to serendipitously dip into the past for findings to support whatever they believe in; it is rather that much of this history is so very un- and anti-historical. History has become a caricature of parochial dreams, nostalgias and made-up analogies to prop up binary political choices. At stake is the nature, direction and meaning of British history and Britain’s place in the world. But just as important is the question of whether history can really be scaled down to an apparently singular ‘British’ vs ‘European’ position.

It is high time for historians to restore complexity and take seriously the variety of geographical and historical vantage points which can bring to light very different timelines and priorities. A roundtable debate published by Contemporary European History has tried to do just that: it asked 19 historians of Europe, working within very different national and historiographical traditions, to reflect on the historical significance and contexts that gave rise to Brexit and within which it should be understood. The result is a palette of pertinent historical contexts in which Brexit has made an appearance and can be analysed, and some of the certainties appearing in the Brexiteers’ version of history suddenly seem much less certain.

The upshot of this roundtable cannot be easily reduced to a political headline, and that is precisely the point. Serious history rarely works that way. As the contributors show, the prospect of Brexit has revealed deep historically-rooted misperceptions between the UK and its European neighbours; Brexit in this sense is a process of stripping away dusty historical delusions about national paths and those of neighbouring countries. The essays demonstrate that Brexit has to be understood in the context of a long history of British claims about the uniqueness of the UK’s past. Europeans at times recognised British history as a model to be emulated. But they periodically also challenged the applicability of the British yardstick to their own national exceptionalisms, or pointed to an equally long history of connections between the UK, the British Empire and the European continent. The present debates about British history and its place in Europe and the world alter readings of the past, in some cases significantly. History is, once more, being rewritten in the light of Brexit.

This article first appeared on the Cambridge Core blog


Yes, the monuments should fall

This article was contributed by Dr Joel McKim, from Birkbeck’s Department of Film, Media and Cultural Studies.

Lee Park, Charlottesville, VA

Writing in the 1930s, the Austrian writer Robert Musil famously noted that despite their attempted grandeur, there is nothing as invisible as a monument: “They are impregnated with something that repels attention, causing the glance to roll right off, like water droplets off an oilcloth, without even pausing for a moment.” It’s difficult to reconcile Musil’s observation with what we’ve witnessed from afar over the past week – a nation seemingly ripping itself apart, a statue of Robert E. Lee situated at the centre of the conflict.  Michael Taussig has, more recently, suggested an important adjustment to Musil’s theory arguing that it’s not until a monument is destroyed (or is slated for removal) that it succeeds in drawing our attention. The monument, Taussig reminds us, is often the first symbolic target in times of struggle.  “With defacement,” he writes, “the statue moves from an excess of invisibility to an excess of visibility.”

The confederate statues in America and the dilemma over what to do with them became extremely visible this past week. It’s a discussion that has actually been taking place for some time now, with the removal in April and May of a number of monuments in New Orleans (including statues of Lee and Jefferson Davis, the president of the Confederacy) being a recent flashpoint. And there are of course many global and historical precedents to this debate, including the removal of racist and imperial icons in South Africa over the past several years and the precarious fate of Soviet-era statuary (see, for example, the excellent Disgraced Monuments, by our own Laura Mulvey and Mark Lewis). Decisions over what to do with symbols of past shame or troubling history also extend to the realm of preservation. Germany and Austria have recently been debating whether several architectural sites connected to the history of National Socialism, including the Nuremberg rally ground and the birthplace of Adolf Hitler, should be preserved, destroyed, or simply left to decay.

Apart from the abhorrent, but hopefully small faction who sees these symbols as worthy of veneration, another argument for keeping confederate monuments in place surfaces frequently from an apparently more benign viewpoint. We’ve all heard some variation of this position expressed over the past several days: “These monuments are an important reminder of our difficult and troubling history.” Or, “These statues help us to educate ourselves about what we have overcome.” Or, “If we destroy the past we will be doomed to repeat it.” While perhaps well meaning, I believe this line of argument is misguided in a number of ways. I think it’s fundamentally mistaken in its understanding of both the social dynamic and cultural history of monuments. Let me explain why.

Firstly, if monuments do have a significant educational purpose (and even this is questionable), it is certainly naïve to think this is the only mode by which they function. Rather than serving as references to the figure or event of the history they depict, public monuments communicate far more about the collective sentiments of our current period and the period in which they were erected. They express, in other words, rather than simply educate. The majority of confederate monuments, as New Orleans mayor Mitch Landrieu reminds us, were constructed not at the time of the civil war, but long afterwards, during moments of resurgence in pro-confederate sentiment and white backlash against black civil rights, such as the Southern Redemption period. They were much less a marker of a tragic but completed chapter in the nation’s history than an expression of a renewed commitment to the cultural values of the losing side. Nicholas Mirzoeff points out that the monument to Robert E. Lee in Charlottesville was completed in 1924 and reflects a period of intense KKK organizing in the area. That these monuments can still function today as rallying points for ethnic nationalists and white supremacists, rather than as neutral transmitters of a distant history, should be self-evident after this week’s events. Could whatever nominal educational value these monuments possess ever justify the continued emboldening role they play for these groups, or the genuine pain and distress they cause to so many who are forced to encounter them in public space? Ask a member of the black community if they are in need of a statue of Robert E. Lee to teach them about the history and continued impact of slavery and discrimination in America.

The second reason I think anxieties of the “destruction of history” type are misguided is that they don’t adequately recognize the always provisional and malleable nature of monuments and memorials. Far from being permanent or stable markers of history, monuments are perpetually being altered, moved, re-interpreted and reconsidered. They are contentious and contingent objects. The memorial landscape is continually in a process of adaptation. As Kirk Savage claims in his insightful history of the National Mall in Washington, “The history of commemoration is . . . a history of change and transformation.” Even the Lincoln Memorial, the most monumental of American memory sites, is an example of adaptation according to Savage. Its adoption as a symbol for the black civil rights movement occurred despite, rather than because of, its intended design – the planners deliberately downplayed Lincoln’s role in the abolition of slavery. Artist Krzysztof Wodiczko’s protest-oriented projections onto existing statues are another important example of how the struggle to determine a monument’s meaning may continue long after its construction. Some of the most powerful monuments and memorial proposals of the past few decades have incorporated an element of self-destruction or suspicion into their own form: from the Gerzes’ monument against fascism that disappears into the earth, to Maya Lin’s deliberately non-monumental Vietnam Veterans Memorial, to Horst Hoheisel’s proposal to blow up the Brandenburg Gate as a memorial for the murdered Jews of Europe. In short, public monuments change; their lifespan is not and probably shouldn’t be infinite. We don’t owe them that. The debate and conflict surrounding the removal of confederate monuments is obviously a clear indication that America is also currently undergoing a process of significant change. While the events of the past week have been worrying and sickening, I am heartened by the number of courageous people committed to ensuring that this change is a move forward, rather than a regression.

Dr Joel McKim is Lecturer in Media and Cultural Studies and the Director of the Vasari Research Centre for Art and Technology. His book Architecture, Media and Memory: Facing Complexity in a Post-9/11 New York is forthcoming in 2018 from Bloomsbury.


Bad Habits? France’s ‘Burkini ban’ in Historical Perspective

This article was written by Dr Carmen Mangion from Birkbeck’s Department of History, Classics and Archaeology. The article was originally published on the History Workshop Online blog.

The ‘burkini ban’ issued by 30 French beach towns at the end of July 2016 sparked a media frenzy. Town mayors saw the burkini, the full-body swimsuit favoured by some Muslim women as a means of maintaining modesty while enjoying the sea, as a symbol of Islamic extremism and a threat to ‘good morals and secularism’. France’s 1905 law separates Church and State and embraces laïcité (a secularism in public affairs which prohibits religious expression), meant to limit religion’s influence on its citizens while still allowing freedom of religion; it originated as a means of eliminating the influence of the Catholic Church. Following ministerial criticism, France’s top administrative court investigated the ‘burkini ban’, ruling in late August that it violated basic freedoms.

Nuns at the beach (Facebook/Izzeddin Elzir)

Amidst this furore, Italian Imam Izzedin Elzir’s image of nuns on the beach in their religious habits triggered an international media response. The image, appearing across social media and in outlets as prominent as the New York Times, implied the hypocrisy of a ban targeting Muslims and ignoring Christians. The photos were ironic on two counts:

First, some French mayors were emphatic that nuns in habits were also forbidden on beaches.

Second, and more apposite to this blog post, both the media and ‘ordinary’ citizens seem to be unaware that the ‘nun’ on the streets of Paris (and elsewhere) once sparked a similar outrage.

The historical context was of course different (it always is), but the indignation and the drive to control women’s appearance was just as virulent. Such outrage was not limited to France, but as the ‘burkini ban’ was initiated by the French, it seems appropriate to begin with this bit of French history.

The French revolution of the 1790s, with its cry of liberté, égalité, fraternité was not such a good thing for Catholic nuns. The nun, in her religious habit, became a symbol of the Catholic Church’s role in upholding the inequities and injustices of the ruling classes within France. Catholic nuns, then fully habited, were visible on the streets of Paris as educators, nurses and providers of social welfare, and became targets of anti-clerical outrage. The republican political regime set French nuns ‘free’ from their lifelong vows of poverty, chastity, obedience and their religious habit. It closed convents and confiscated their property. Some members of religious communities weren’t so willing to be set free, however, and were imprisoned. They were told to remove their religious habits (made illegal in 1792) and instead wear secular garb. Nuns including the Carmelites of Compiègne and the Daughters of Charity of Arras were executed for refusing to take the oath of loyalty to the Constitution.

French citizens also made their umbrage against nuns known. One 1791 print representing revolutionary anticlericalism, La Discipline patriotique ou le fanatisme corrigé (‘The patriotic discipline, or fanaticism corrected’), showed the market women of Paris’s Les Halles disrobing and thrashing the religious fanaticism out of a group of nuns. Such disciplining of women’s bodies was both salacious and violent.


The patriotic discipline, or fanaticism corrected (La Discipline patriotique ou le fanatisme corrigé) (Image: BnF/Gallica)

This urge to control the religious ‘fanaticism’ of women and monitor their clothing choices was not only a French issue; it had earlier incarnations. The dissolution of the monasteries in England in the mid-sixteenth century also ‘freed’ nuns and monks from their vows — and their property. English women’s response was to form English convents in exile, many in France and Belgium. In the 1790s English nuns fled, often surreptitiously, back to England. But penal laws restricting Catholic practices were still in effect, and English bishops initially discouraged nuns from wearing their religious habit. English citizens too showed their indignation at female religious life by throwing epithets and stones at nuns; the Salford house of the Daughters of Charity of St Vincent de Paul was set alight in 1847. Similar events happened in the United States: most notoriously, the Ursuline convent in Charlestown, Massachusetts was burned down in the 1830s by anti-nun rioters. In the Netherlands, in Spain, in Belgium, in Germany and more recently in Eastern and Central Europe, nuns were also targeted. Women in religious clothing were (and are) easy targets of vitriol and violence.

So burkinied Muslim women and habited Catholic nuns have far more in common than one might think. The nun’s religious habit, like the burkini, has links to religious identity as counter to cultural norms. Critics say that women in burkinis challenge the French secular way of life. History shows that the habited nun also challenged both a republican version of Frenchness and also an English version of Englishness.

Within this context, the burkini furore illustrates two points.

First, it is yet another disappointing reminder that women’s bodies and appearances remain far too often more relevant (and newsworthy) than women’s intellects and voices. Clothing regulations are an excuse to control women and to divert attention from more substantive issues. They are a means of enforcing a societal version of femininity that doesn’t allow for difference. Women choosing to wear religious dress (or dress associated with religious affiliation) should not be stigmatised.

Second, by focusing on the burkini, we forget the more salient issue of figuring out how diverse people can live together peacefully. It is the social, economic and political factors that need attention: cultural inclusion, high unemployment and participation in civic life. Criminalising what women wear on the beach doesn’t even come close to addressing these issues.

Further Reading:

  • Carmen Mangion, ‘Avoiding “rash and Imprudent measures”: English Nuns in Revolutionary Paris, 1789-1801’ in Communities, Culture and Identity: The English Convents in Exile, 1600-1800 edited by Caroline Bowden and James E. Kelly (Ashgate, 2013), pp. 247-63.
  • Gemma Betros, ‘Liberty, Citizenship and the Suppression of Female Religious Communities in France, 1789-90’, Women’s History Review, 18 (2009), 311–36.
  • For a robust comparison of nineteenth-century American nativism to the politics of Islam see José Casanova, ‘The Politics of Nativism Islam in Europe, Catholicism in the United States’, Philosophy & Social Criticism, 38 (2012), 485–95. A short and accessible version of this essay can be found here.