The surprising impact of innovation on reducing climate change

New research by the Department of Management’s Dr Fred A. Yamoah and colleagues explores the relationship between innovation input, governance and carbon dioxide emissions.

Picture of a wind farm

There is no doubt that the humanitarian and economic impact of climate change is a matter for global concern. However, prior research tells us that it is emerging and developing economies that are likely to be hit hardest by the impact of global warming.

In their 2019 report, the Intergovernmental Panel on Climate Change (IPCC) found that emerging and developing economies, with their heavy reliance on agriculture, forestry and tourism, were more at risk from the adverse impact of climate change than more developed economies. Indeed, the IPCC found that every one-degree centigrade increase in temperature would lead to a 1.3% drop in economic growth in an emerging economy.

What role does innovation play in the fight against climate change?

Typically, the fate of countries in this position has been viewed somewhat fatalistically, with little known about what can be done to mitigate the damage caused by the poor climate choices of more developed countries. However, since innovative technologies are known to have a positive impact on climate change factors by conserving energy and reducing emissions, we wanted to know whether increased innovation input could support developing economies in the fight against climate change.

Our study involved an analysis of data from the World Bank database on 29 emerging countries over the period from 1990 to 2018. My colleagues Godfred Adjapong Afrifa, Gloria Appiah (both Kent Business School), Ishmael Tingbani (Bournemouth University) and I examined whether investment in cutting-edge technologies could help address climate change problems in emerging economies, and how this relationship is supported or mitigated by governance factors.

The impact of governance

Why is it important to consider governance alongside innovation and climate change? First of all, it is good for business: stakeholder theory tells us that organisations that please their stakeholders by following ethical norms of fairness, trustworthiness and respect are likely to see improved overall performance in the long term.

When it comes to climate change targets, governments and international governing bodies such as the EU or ECOWAS are among the most critical stakeholders, as they are more likely to take a long term view and possess the necessary regulatory powers to ensure best practices are upheld.

How innovation benefits emerging economies

The introduction of innovative technologies and practices can benefit emerging economies in a number of ways. For farmers, genetic technologies can develop resilient crops that adapt to environmental challenges in agriculture. New technologies also typically conserve energy and reduce harmful fuel emissions.

Looking at the data, our results suggest that emerging countries with high innovative competencies can substantially reduce climate change problems: a 10% increase in investment in cutting-edge technology is associated with an approximately 26.8% reduction in climate change problems.

While these findings show the dramatic impact of innovation on mitigating the negative effects of climate change, it is important to note that the positive results were moderated by governance factors: the quality of governance influences countries’ investment in innovative technologies towards curbing environmental damage.

Contrary to the typically deterministic view of climate change, our results suggest that emerging economies’ innovation efforts could have a significant impact on national and global success in the fight against climate change.


The economics of public sector employment

Our Dr Pedro Gomes has been researching public employment for nearly fifteen years. He shares why it is so important to understand how the public sector works and the key findings from his research.

Public employment is a significant consideration in any national economy. In developed countries, public employment makes up 15-30 percent of total employment and represents the large majority of government consumption. In the US, for example, the government spends 60 percent more on general government employees than on the purchase of intermediate goods and services.

The public sector also operates according to different rules than the rest of the economy, as governments do not face the same competitive forces, nor have the same objectives as private sector firms.

Considering that the public sector is responsible for delivering many key services in our society, from education to healthcare, it is essential to have a good understanding of how its employment operates. The recent COVID-19 pandemic has again put focus on the importance of having a modern public sector, with a workforce prepared to face difficult, unpredictable and unlikely crises, while its aftermath of high public debt also puts emphasis on the costs of the public sector workforce.

Below are three of the key findings from my research into this area.

Governments hire disproportionately more educated workers

In the paper Public Employment Redux, my colleagues Pietro Garibaldi, Thepthida Sopraseuth and I explore the phenomenon whereby governments hire more educated workers than the private sector.

We noticed that governments hire very few workers with low qualifications. In the US, for example, one third of workers have a master’s or a PhD qualification, and one third of these work for the government. We documented empirical evidence for this education bias in the US, UK, France and Spain.

There are a few different explanations for this trend:

  • The government needs more educated workers to provide its highly technical goods and services, such as healthcare, education and the judicial system.
  • Higher educated workers take more of a wage penalty to work in the public sector, so are relatively less expensive to hire.
  • Public sector jobs that require low qualifications pay more than similar level jobs in the private sector, so they attract workers with more qualifications.

Within our model, we found that the public sector’s need for technological skills was the main driver of the disproportionate representation of educated workers, but that wage setting and excess underemployment explain 12-15 percent of the education bias.

Unlike other sectors, the government is able to set wages more freely, as the cost is financed from tax revenue. If the government chooses to pay very high wages, too many people will choose a skilled role in the public sector as their first choice. However, if wages are too low, too few workers will want to join the government.

In reality, a balance is needed, so the government can always attract the workers it needs, without leading to underemployment in the public sector.

Nepotism in hiring practices allows friends and family to ‘jump the queue’ for government roles

Public sector hires are often based on nepotism: Scoppa (2009) found that the probability of working in the public sector is 44% higher for individuals whose parents also work in the public sector, while Colonnelli et al. (2020) found that politically connected individuals in Brazil enjoy easier access to public sector jobs.

In my research into this topic with Andri Chassamboulli, we suggest that workers can use their connections to find jobs in the public sector faster. We created a search and matching model with private and public sectors to test this theory.

Surprisingly, we encountered some positive side effects of nepotistic practices. Conditional on high public sector wages, our findings suggest that hiring through connections reduces unemployment, as people who do not have connections will instead find roles in the private sector. Conversely, if the government sets the optimal wage for the successful running of the public sector, nepotism is reduced.

We conclude, therefore, that nepotism is a symptom of a problem in the public sector, rather than the disease, and the problem is created when wages are set too high.

Women prefer working in the public sector

In most countries, the public sector hires disproportionately more women than men. My colleague Zoë Kuehn and I developed a model to try to make sense of this imbalance.

Our findings show that the gender imbalance in the public sector is driven by supply, meaning that women self-select into public sector work more than men. One explanation for this is that the type of job carried out by the government, such as healthcare and education, is coincidentally the type of work preferred by women. However, even discounting these sectors, women’s public employment remains 20-25% higher than men’s.

This remaining imbalance can be explained by the different characteristics of public and private employment. The gender wage gap and working hours are both reduced in the public sector, making this an attractive choice for women who may be factoring family commitments alongside work opportunities in their choice of employment. Alongside reduced working hours, the public sector offers additional benefits such as more sick days, flexible hours and employer-provided childcare, ensuring an overall better work-life balance in the public sector.

 


Decision making under uncertainty: Ambiguity preferences

David Schröder, Associate Professor in Finance at Birkbeck’s Department of Economics, Mathematics and Statistics, and Elisa Cavatorta, Associate Professor in the Department of Political Economy at King’s College London, have developed a questionnaire to measure how members of the public make decisions under uncertainty. Take the survey online to find out your ambiguity preferences.

Cartoon of a figure at a crossroads

This article was originally posted on the Research Outreach blog and is licensed under a Creative Commons Attribution 4.0 International License.

Every decision and action that we take in life is associated with a degree of uncertainty; whether we cross a road, what we invest our money in, what career we follow and the thousands of other decisions that we make on a daily basis. Over the years, economists and psychologists have studied different factors that affect how individuals make decisions under uncertainty so that they can better understand what drives the behaviours that we see in the world.

One of the underlying factors that explain individual behaviour under uncertainty is the different degree of tolerance anyone has for situations of uncertainty, in other words their individual preferences. To apply the behavioural models proposed in the scientific literature, it is important to accurately measure these preferences driving our behaviour. Elisa Cavatorta, Associate Professor at the Department of Political Economy at King’s College London, and David Schröder, Associate Professor in Finance at Birkbeck, University of London, noticed that existing approaches to measure uncertainty attitudes are overly complex and therefore rarely used to measure preferences outside economic laboratories. To improve the ability to measure uncertainty attitudes and make their measurement more accessible, they designed a new questionnaire to facilitate the assessment of the preferences guiding everyday decision making under uncertainty.

Understanding risk and ambiguity

The most common factor that people associate with decision making under uncertainty is risk, and how tolerant an individual is towards risky scenarios. In a risk-based situation, we have a sense of the likelihood of the different outcomes that our decisions could deliver. For example, if you roll a fair dice, you know that you have a one in six chance of getting a specific number. Likewise, outcomes of recurring situations may involve known likelihoods: if your parent cooks their usual signature dish, you know the chances that it tastes delicious.

In many situations, however, there is an additional degree of uncertainty about the potential outcomes of our decisions and actions. For example, if you hear that the dice that you are about to use has a flaw that means it will not roll fairly, you can no longer accurately predict the likelihood of rolling the number four. If a stranger cooks for you, the probability that the dish is delicious can be vague. Following the work of Knight (1921) and Ellsberg (1961), this degree of uncertainty over vague or unknown probabilities is referred to as ambiguity. Different people have a different “taste” for the lack of accurate information about the probabilities of given outcomes and will respond differently.

Our preferences towards ambiguity guide the decisions that we make under uncertainty. There isn’t an optimal decision that fits all. Optimal decisions for everyone depend on one’s own preferences. If we can accurately measure ambiguity preferences, then we have a powerful insight into human behaviour, that is, how people make choices subject to limited information.

Measuring ambiguity preferences

Traditionally, ambiguity attitudes have been measured within a controlled economic laboratory environment. Ambiguity tests have focused on specific decision tasks involving known and unknown probabilities, often complex to understand and requiring lengthy explanations. This method has produced very accurate results; however, the complexity of these tasks makes them impractical to roll out on a large enough scale to understand the decision-making behaviours that we see in the general population.

Elisa Cavatorta and David Schröder researched ambiguity preferences in great detail. They started from the results in a laboratory setting, but were motivated to find a more practical way of measuring ambiguity preferences outside of these experiments. They knew that a questionnaire conducted online or by telephone would be a far more practical mechanism for collecting data from much larger groups of participants, and more useful for researchers who conduct field studies.

Survey design

Cavatorta and Schröder have designed a simple survey questionnaire, which accurately measures ambiguity preferences. Their work has been inspired by various studies that recommended using surveys to measure other economic preferences. In their 2019 paper, they develop a measurement for ambiguity preferences, adding to existing ones designed to elicit preferences for risk, trust and impatience.

The research team developed their questionnaire using a sample of 121 students from various colleges of the University of London. The challenge was to find the combination of survey questions that would most accurately predict ambiguity preferences elicited with the well-established approach in the laboratory setting. The research team selected around 50 candidate survey questions of various types. Some of these questions are short thought experiments in which participants make choices in hypothetical games (e.g. choosing between an option with an unknown, i.e. ambiguous, probability and one with a known, i.e. risky, probability). Others are attitudinal questions from the psychology literature, in which participants assess how much they like or dislike a situation.

The researchers considered the predictive power of all combinations of the candidate questions. Using a selection process that evaluates all possible combinations and then selects the best predictors is a data-driven method that minimises bias in the selection process. The result is a five-item survey questionnaire that provides an individual ambiguity preference score which correlates well with the score that would be obtained in a laboratory setting. This means the survey questionnaire is an accurate substitute when measurement in the laboratory is impractical or unavailable.
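The selection idea described above, evaluating every subset of candidate questions against the laboratory benchmark and keeping the best predictor, can be sketched in a few lines of code. This is a minimal illustration only: the variable names, subset scoring rule (a simple mean of responses) and simulated data are assumptions for the sketch, not the authors’ actual items, data or estimation method.

```python
# Illustrative sketch of exhaustive subset selection: from a pool of
# candidate survey items, evaluate every subset of a fixed size and keep
# the one whose (equally weighted) score correlates best with the
# laboratory-elicited ambiguity preference. All data here is simulated.
from itertools import combinations
import random

random.seed(0)

N_PARTICIPANTS = 40
N_CANDIDATES = 8    # the real study screened around 50 candidate questions
SUBSET_SIZE = 5     # the final questionnaire has five items

# Simulated data: each participant has a lab-elicited ambiguity score and
# noisy responses to each candidate question.
lab_score = [random.gauss(0, 1) for _ in range(N_PARTICIPANTS)]
responses = [[random.gauss(lab_score[i], 1.5) for _ in range(N_CANDIDATES)]
             for i in range(N_PARTICIPANTS)]

def pearson(xs, ys):
    """Pearson correlation between two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

best_subset, best_r = None, -1.0
for subset in combinations(range(N_CANDIDATES), SUBSET_SIZE):
    # Score each participant as the mean of their answers to this subset.
    scores = [sum(responses[i][q] for q in subset) / SUBSET_SIZE
              for i in range(N_PARTICIPANTS)]
    r = pearson(scores, lab_score)
    if r > best_r:
        best_subset, best_r = subset, r

print("best subset of questions:", best_subset)
print("correlation with lab score:", round(best_r, 2))
```

Because every subset is scored against the same benchmark, the search involves no researcher discretion about which items to try, which is the sense in which the method minimises selection bias.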

Possible uses of the new measurement

The professors recommend their measurement whenever incentivised experiments in a laboratory are not feasible: for example, when researchers need to gather the ambiguity preferences of a large number of participants, in field studies, or in scenarios where time or money is limited.

This questionnaire provides the opportunity to conduct large-scale studies into the impact of ambiguity preferences on economic and social behaviour. Given the uncertainty surrounding many decisions in everyday life, applications of the measurement can be wide-ranging. The current pandemic demonstrates that people have different preferences for ambiguity, and that these guide different reactions and health behaviours. Another application concerns the financial services industry: the industry has traditionally focused on risk preferences when recommending the most suitable investment options to its clients. Risk preference assessments help us to understand one element of what makes an investment a good match for an individual. However, investments often involve unknown risks (i.e. ambiguity), so the measurement can assist financial services professionals to better tailor their product recommendations to the client’s tastes and needs.


Caring about Homecare

Caroline Wiemar and Kerry Harman from the Centre for Social Change and Transformation in Higher Education discuss the ongoing challenges faced by homecare workers since the homecare sector was privatised in the 1990s.

A carer with an elderly person in a care home

Photo courtesy of Matthias Zomer

The invisibility of women’s work has been documented by feminist scholars for decades (see DeVault, 2014) and here we are in 2020 and, for paid homecare workers in the UK (and many other countries), the situation has not improved. Indeed, things have gotten a lot worse. While the COVID-19 crisis has drawn attention to the importance of ‘key workers’, particularly those employed in the care sector, proposed government immigration policy which prevents ‘low skilled’ workers entering the UK, including care workers, makes the weekly ‘clap for our carers’ feel like shallow rhetoric. Indeed, Hayes and Walters point to ‘the exploitation of care workers for political profit’ during the pandemic.

So what do we know about these homecare workers that, until quite recently, have been largely invisible? Annual reports on adult social care workforce data provide an overview of the workforce in England. Approximately 295,000 care jobs are in care home services with nursing; another 305,000 jobs are in care-only home services; and the majority of jobs are in homecare, with 520,000 employed in this work. In other words, the provision of paid care is a major industry in the UK. Of the homecare workforce, approximately 50% were employed on zero-hours contracts, 84% were female, the average worker was 43 years old, 83% were British, 7% were EU (non-British) and 9% were non-EU. Across the care sector, there are large variations in ethnicity by region, with London having the most diverse workforce (67% BAME) and the North East the least diverse (96% white). So homecare workers are likely to be more mature women, on precarious employment contracts, and Black or from a minority ethnic background if they work in London and white if they work in the North East.

While the outbreak of COVID-19 has contributed to a recognition of the ‘crisis in care’, a number of reports over many years indicate the homecare sector was in crisis well before the pandemic (BBC Panorama, 2019; Gardiner, 2015; Holmes, 2016; Koehler, 2014; UNISON, 2016). A shift to the outsourcing of this work to the private sector by local authorities during the 1990s had resulted in a race to the bottom in terms of hourly rates of pay and overall employment conditions for homecare workers (Hayes, 2017). This is exacerbated by an aggressive tendering process which often forces smaller, local agencies to eventually close their doors. The experience of working in the sector and changes that have taken place since the 1990s is provided in the following account by one of the authors:

I started working in the care sector 31 years ago when I got a part-time job as a ‘home help’ with the local council. My role was to help elderly disabled people in their own homes and to maintain their independence by doing shopping, laundry, housework, getting medications. The pay and conditions were good, with paid annual leave and sick pay. It was a satisfying job to strike up a relationship with the people I helped, hearing their stories of the past. I had time to have a conversation with them, which they enjoyed as sometimes I was the only person they might see that week. Then after a few years we were renamed ‘homecare workers’. With this title came changes – service users’ times were cut and they started to charge for their care. We had to do more in less time.

When the council outsourced homecare we were transferred to a non-profit organisation and we all had to take a pay cut. Our hours were cut, as well as sick pay and annual leave. If we did not take these cuts we did not have a job. You keep going because the vulnerable need your assistance. It’s not their fault we now work for less than previously. Then the non-profit organisation lost the contract and we were transferred over to a profit-making company. I cared for a lady called Edna for just over ten years and she saw the changes with me. Edna had no family and I became her family. I used to get half an hour in the morning to give her a bath, dry her, help her dress, give her a drink, breakfast and medication. I used to go in earlier, just so I didn’t have to rush, as I knew I could not do all that in the time I had been given. We would have our conversation while I was carrying out my tasks. I would do all the things she no longer could because she was hard of hearing, like making phone calls. I’d organise appointments to doctors, hospital, medications and go with her in my own time. I’d make sure she had food, clean clothes – all things we able people take for granted. Over the years carers have lost pay, conditions, working hours and time to care.

Homecare is a low paid job and carers are not recognised for what they do. All I ever wanted was to have time to care, to give the person that I care for their dignity and independence – make them feel valued as a person and that they matter. Carers are everything to our service users – we are carers, nurse, secretary, friend, relative, the go-to person who can sort everything out. Most of it is done in our own time. Sadly, my Edna passed away. She was classed as a vulnerable adult, but how vulnerable did she have to be to get the time and care she should have had? How long can carers go on giving their all and not being recognised and respected, on low pay and zero-hours contracts? Carers look after the vulnerable but who looks after the carers?

(also listen to Caroline at a recent ‘How might we recognise the value of homecare provision?’ event at Birkbeck)

The ongoing ‘crisis in care’ resulting from the privatisation of the care sector since the mid-1990s points to the urgency of public policy interventions, backed by the resources to enable local authorities to bring homecare services back in-house. This would make it possible for fair wages to be paid and better working conditions for homecare workers across the country. Public policy interventions would also make it easier for trade unions to organise care workers, which is extremely challenging in the private care sector.

Another possible solution to the crisis in care in the UK has been a call for the professionalisation of the sector, and this is usually accompanied by proposals for training and development. However, will more training and development get to what we believe is the heart of the problem, which is the ongoing failure to attend to the often embodied skills and knowing that homecare workers have developed in and through their everyday practices and experience at work? Indeed, many training and development programmes are underpinned by the same set of assumptions on what counts as ‘good care’ and who knows about ‘good care’ that work to make the everyday knowing in practice of homecare workers invisible. As Wiemar points out above, carers are also ‘nurse, secretary, friend, relative, the go to person who can sort everything out’, and this is not ‘low skilled’ work.

During 2018/19, a participatory project with homecare workers was started in two boroughs in London called the ‘Invisible work, invisible knowledges?’ project. The authors met during that project. The purpose of the project was to make contact with homecare workers and find out more about their everyday experiences at work as part of a planned larger project on ‘Reimagining care’. One of the authors met with 13 homecare workers overall, in either individual or small group meetings, and the conversation usually started with: ‘Can you tell me what happens during a normal day at work? Is there such a thing as a ‘normal’ day?’ She was interested in hearing from homecare workers about what they actually do and, as part of these conversations, the homecare workers would often talk about the challenges they experience in their daily work. The resounding problem identified by care workers was the lack of time in the Care Plans[1] they are given to complete their work in a way that enables the people they care for to be treated with dignity and respect. This has resulted in many homecare workers providing additional hours of unpaid care to provide a level of care to care recipients that they consider adequate. As one care worker said, ‘If you see that there’s no food in the fridge, are you going to let someone go hungry?’ This is a reminder that, sometimes, care workers are the only point of contact that care recipients have with the outside world.

Another issue raised was the precarity of homecare workers’ employment contracts. The majority of care workers in London are employed by private agencies, with a large percentage on zero-hours contracts. Many care workers spoke about contracts that had eventually dwindled to very few hours of work each week and the need to look for work elsewhere. A reduction in weekly hours was often connected with concerns the care workers had raised about the welfare of their clients or about their own working conditions. This is an issue that has been raised recently by the MP for Nottingham East, Nadia Whittome.

One outcome from the first stage of the project has been establishing a core group of homecare workers who are interested in documenting their embodied skills and knowledges which are so often overlooked. A crucial aspect of the research is recognising these workers as active producers of knowledge on care rather than passive recipients of knowledge produced in the academy and it is for this reason that homecare workers must be paid as co-researchers on the project. We are hoping the research will contribute to changing the ways care is able to be imagined as well as more democratic processes for developing policy on care, which includes homecare workers getting a seat at the policy making table.

To find out more about the ‘Reimagining Care’ project contact Kerry Harman.

References

DeVault, M. L. (2014). Mapping Invisible Work: Conceptual Tools for Social Justice Projects. Sociological Forum, 29(4), 775-790. doi:10.1111/socf.12119

[1] These are the plans which are put together, usually by an Occupational Therapist, after conducting an assessment with the person requiring care. They specify how many visits per day are required, the duration of each visit and the key activities to be undertaken at each visit.

 
