Rapport or Respect?

Trainee qualitative researchers, learning the most popular research method of interviewing, are routinely taught to use their interpersonal skills to create rapport with participants. This has been questioned for the last 20 years by Jean Duncombe and Julie Jessop. They ask how ethical it is for researchers to fake friendship as a means to the end of gathering data.

On the one hand, it is common for us to use interpersonal skills to help get what we want from others in our day-to-day lives. This applies whether we want a loan from a credit agency, a prescription from the doctor, or a response to a complaint – in a multitude of situations, presenting our most polite and friendly selves can help to get the results we want. So it is arguable that it makes sense to use these everyday methods in research too.

On the other hand, research encounters are rather different from everyday encounters. This applies particularly to qualitative research, where a researcher may spend a considerable period of time giving a participant their undivided attention. This is an unusual and often welcome experience for participants, who frequently describe it in positive terms such as ‘therapeutic’, ‘cathartic’ or ‘a treat’.

Many of the people we want things from in day-to-day life are either providing us with goods and services, so that a transactional element is built into the encounter, or are already in a personal relationship with us through kinship, friendship or community membership. So the rapport we build in those situations already has a clear basis which is mutually understood. This does not apply within the research encounter, where we are usually asking participants to give us their time and information in exchange for a potential benefit to an imagined future population. (I considered the extent to which this is ethical in my recent post on the LSE Impact Blog.) Also, despite all the efforts to secure informed consent, we know that people generally agree to participate in research for their own reasons rather than ours. And where that reason is to get a little human company and kindness, which is lacking from their own lives, the practice of building rapport begins to appear even more suspect.

Imagine you are, let us say, living on minimal welfare benefits with a chronic condition which makes it difficult for you to leave the house. You have lost touch with the friends you used to have when you could go out to work, and your family live far away. You suffer from anxiety and you are very lonely. The carers who come in three times a day are brisk and professional; they don’t have time to chat, and you don’t want to hold them up because you know they are always under pressure. Then a researcher calls, saying she is doing an evaluation of the care you receive, and asking if she can visit you to ask a few questions. You are delighted because it’s been years since you had a visitor and she sounds so kind and friendly on the phone. When she visits, you tell her all sorts of things about yourself and your life. She seems really interested, and laughs at your jokes, and tells you a few things about her own life in return. You haven’t felt this good in years. When she has asked all her questions, you ask one of your own: please will she visit you again? She looks at the floor and says she would like to, but she can’t promise, because between work and her children she doesn’t have much free time. You would like to suggest she brings her children with her, but you know a ‘no’ when you hear one, so you let her go, wait for the front door to close, and listen to the emptiness of your home and your life.

Duncombe and Jessop point out that these problems are multiplied in longitudinal research, where the boundaries between real and faked friendship can become much more blurred. They share experiences of participants beginning to treat them as friends, and the discomfort that arises when they don’t reciprocate. I have had similar experiences, and I’m sure many other qualitative and mixed-methods researchers have too. It is interesting to consider this Euro-Western approach in the light of the very different Indigenous approach, in which research is deemed to be ethical when it serves to maintain and develop existing relationships. Looked at in this way, our Euro-Western approach of creating and then dropping relationships to further our research purposes seems potentially abusive.

The EU-funded TRUST project developed a Global Code of Conduct for Research in Resource-Poor Settings. It was based on four values elicited from research conducted with a wide variety of people around the world: respect, fairness, honesty and care. The aim was to combat ‘ethics dumping’, where research deemed unethical in a higher-income country is conducted, instead, in a lower-income country where research is not governed by a regulatory system. I would argue that these values should also apply wherever research is done by a researcher with more social capital than some or all of their participants. In the vignette above, the researcher was not entirely honest, and did not show care in response to the participant’s request – for example, she could have signposted them to a local befriending service. This could be described as ‘friendship dumping’.

When you think about it, a researcher using their interpersonal skills to create rapport with a participant as a means to an end is actually quite manipulative. This might be more defensible when we are ‘studying sideways’ or ‘studying up’, but even then it seems questionable. Showing respect for participants would be a more creditable aim, especially if it were combined with fairness, honesty and care.

The next post on this blog will be in September. You can follow the blog, above, to get my posts in your inbox.

This blog and the videos on my YouTube channel are funded by my beloved Patrons. Patrons receive exclusive content and various rewards, depending on their level of support, such as access to my special private Patreon-only blog posts, bi-monthly Q&A sessions on Zoom, free e-book downloads and signed copies of my books. Patrons can also suggest topics for my blogs and videos. If you want to support me by becoming a Patron click here. Whilst ongoing support would be fantastic you can make a one-time donation instead, through the PayPal button on this blog, if that works better for you. If you are not able to support me financially, please consider reviewing any of my books you have read – even a single-line review on Amazon or Goodreads is a huge help – or sharing a link to my work on social media. Thank you!

Methods for Lived Experience Research

Note: This post was first published on the SRA blog in November 2021 and is reproduced here with the kind permission of the author and SRA.

In this blog post, Kimberley Neve, a researcher at the Centre for Food Policy at City, University of London, outlines different methods for capturing ‘lived experience’. Lived experience means the actual, specific ways in which people experience something – in this case food: access to food, food poverty, food quality, food allergies and much more. Kimberley and other researchers at the Centre for Food Policy specialising in qualitative methods have produced a Brief to give an overview of the range of methods you can use when researching people’s lived experience of ‘food environments’. Food environments are the spaces in which we make all our decisions about food – what to eat, where to buy it, when and with whom to eat it.

Using qualitative methods to influence policy

As researchers we want our work to have impact. We also want to know that it resonates with people and reflects not only the experiences of the research participants but also, in some way, those of the general population. For our research to have a positive impact, effective communication with policy-makers, both locally and nationally, is vital. Despite the potential of qualitative methods to inform policy that is effective and equitable for the people it is designed to help, the number of qualitative studies used as evidence for policy remains modest compared with the number of quantitative studies.

We wanted to raise the profile of qualitative research methods among both policy-makers and food environment researchers by demonstrating the range of potential methods and their benefits (and drawbacks), with a focus on how using them can help inform policy. These methods can be utilised in a wide range of research areas – for example local transport, access to outdoor space, or neighbourhood crime – providing in-depth insights into people’s lived experiences and practices that can explain how or why people act the way they do.

In our Centre for Food Policy Research Brief (the ‘Brief’) we initially mapped existing studies capturing the lived experience of food environments, categorising methods and relevant case studies. Following this, we consulted with members of our Community of Practice – experts in qualitative research and food environments – for feedback prior to final edits.

What are the qualitative methods you can use?

The Brief is not an exhaustive list of the qualitative methods available; however, we’ve tried to capture the main methods you can use. For the scope of the Brief, we didn’t include quantitative methods but of course recognise their vital role.

Often, combining quantitative and qualitative methods can yield the most valuable insights.

To make the overview as useful as possible, we categorised the methods in the following way:

  • Group 1 – Exploring experiences, perceptions, beliefs, practices and social networks;
  • Group 2 – Observing practices in situ;
  • Group 3 – Designing policy and interventions drawing on the lived experience of participants.

Which method should you use for your research?

Typically, you’ll benefit from combining methods to suit your research context. For example, visual methods and observation tend to be accompanied by individual or group interviews to provide a more in-depth exploration. In the full Brief you’ll find an overview of qualitative methods with the key benefits and potential limitations of each. Assuming you know all about individual interviews and focus group discussions already, here is a selection of other methods less frequently used in research projects.

Group 1: Visual methods

This includes photo elicitation, creative arts (where participants create artwork such as drawings, videos or theatre), concept mapping (pile sorting, ranking, mental mapping) and timelines. One study in the US used photo elicitation in urban neighbourhoods to identify community-level actions to improve urban environments in relation to health. The researchers found that not all food outlets affected health in the same way, and that contextual factors such as crime and safety influenced how people accessed food, which had implications for community-level policy.

  • PROS – Group 1 methods work particularly well with young participants or where there are language barriers, as views can be expressed more directly and simply. Participants may also be more willing to share information visually and images can provide insights that may not have been accessible via specific questioning.
  • CONS – Visual data can be difficult to interpret in a way that fully represents the participant perspective, and there is a potential for photographs to be seen as reflections of reality, rather than subjective perceptions that provide insights into reality. Participants could also misunderstand the objective and take photos that do not help to answer the research question.

Group 1: Geospatial methods

Geospatial methods often combine mapping with photography and/or GPS to create visual data that can then be discussed in one-to-one interviews or focus group discussions for more insights. Methods include spatial mapping, geonarratives and geotagged photography. These methods are relatively new to the food environment literature; however, they have been used very effectively to explore how people engage with their environment in general, for example in their encounters with green space.

  • PROS – Similar to visual methods, geospatial methods can engage participants in a more creative way and encourage them to share information more openly. They also allow participants to share their knowledge as experts on their own food environments. These methods provide rich insights into the connections between space and place, particularly if combined with interviews or focus groups.
  • CONS – Geotagging requires specific technology that may be expensive and difficult to operate. There are also ethical considerations with mapping someone’s location – when and how this data is collected, stored and used are important factors to specify during the research design.

Group 2: Observation

This involves observing participant behaviour with methods such as go-along tours, transect walks and community group observation. Unlike in non-participant observation (below), the researcher talks to the participants during the activity about what their actions and interactions mean to them. For instance, during a go-along tour in a supermarket (shop-along), the researcher might ask about the thought process behind the decision to purchase a product. Transect walks are go-along tours with the addition of creating a map of the resources, constraints and opportunities in the local food environment.

In a UK study, go-along interviews were used to explore which changes to supermarket environments would support healthier food practices. A key insight from this research was that varied individual responses to the supermarket environment in low-income neighbourhoods are mediated by differing levels of individual agency. The researchers concluded that interventions should emphasise factors that increase agency in order to change how people buy food.

  • PROS – Insights into the practical aspects of daily life and routines can be captured interactively with the participant and explored in more detail with further questioning. Power imbalances in research are addressed as participants take more control of the research process.
  • CONS – The researcher’s presence may impact how participants behave or move around spaces, for instance by influencing what they buy in a shop-along tour. It is also quite time-intensive to organise and participate in.

Group 2: Non-participant observation

This is where participants are watched from a distance, for instance by video, with little or no interaction with the researcher. This method was used as part of a focused ethnographic study in Kenya along with interviews and cognitive mapping. The aim of the study was to inform policies for improving infant and young children’s nutrition practices. Among other insights, a key finding for policy was that future interventions must consider various aspects of food insecurity to improve conditions in practice.

  • PROS – You can get insights into ‘real’ individual actions, such as shopping or eating practices, without the researcher’s presence influencing the actions. Features of everyday life that may otherwise not be mentioned can be recorded and explored with further questioning. The researcher can also complete a log to provide contextual insights that can explain practices from a more objective viewpoint.
  • CONS – Observation alone, without a follow-up interview or discussion, means the researcher is unable to dig into the reasons underpinning the actions, so the interpretation of the situation can be subjective.

Group 3: Photovoice, co-design, co-creation, systems mapping, group model building

The third group of methods was particularly difficult to classify, as terminology and meanings often overlapped (for instance with co-creation and co-design). These methods place the participant at the centre of the research and actively engage communities affected by policy decisions (at neighbourhood, city, county or country level) in the research process. Participants are encouraged to draw on their own experiences, expertise and knowledge of their food environments to think about and propose change, so that policies resulting from the research are relevant and context-specific and, as a result, have the potential to be more sustainable.

An example of effective group model building can be seen in a study in the US, where community-based workshops took place with a diverse group of chain and local food outlet owners, residents, neighbourhood organisations, and city agencies. Action ideas were discussed for interventions to promote healthy food access, including funding new stores that stock healthy food options and building the capacity for sourcing local produce in stores.

  • PROS – For all of the methods in Group 3, the ‘hands-on’ nature of the research enables participants to generate information and share knowledge on their own terms. Outputs, such as policy recommendations, are created together with participants, following an in-depth research process, so that they are effective in the local context.
  • CONS – These methods all run the risk of being perceived as tokenistic by participants if engagement is not meaningful and genuine.

In brief

Decisions about which methods to select to study lived experience depend on the purpose of the study (i.e. guided by a specific research question), the local context, the time and resources available, and the benefits and limitations of each method. Recently, the COVID-19 pandemic has accelerated the possibilities of using digital tools and technology as key facilitators for remote research.

As researchers, we not only need to engage participants and design research projects that will yield useful insights; we also have to translate our findings so that these insights can inform the design of effective and equitable policy. By using a range of methods, a more comprehensive and detailed overview can be communicated. Visual materials and stories are particularly effective ways for qualitative researchers to communicate their findings to policy-makers and make a refreshing addition to the more common interviews and focus groups.

Acknowledgements

This blog post is based on the work produced by all authors credited in the full Brief: Understanding Lived Experience of Food Environments to Inform Policy: An Overview of Research Methods

Author biography

Kimberley Neve is a Researcher at the Centre for Food Policy, City, University of London. She works as part of the Obesity Policy Research Unit, investigating people’s lived experiences of food environments to inform policy in areas such as infant feeding and weight management. Kimberley is a Registered Associate Nutritionist with a Master’s in Global Public Health Nutrition.

Rethinking Vulnerability and Sensitivity

Research ethics committees are very concerned with the potential vulnerability and sensitivity of research participants. So far, so laudable – but I don’t think they show their concern in particularly useful ways. Gaining formal approval from a research ethics committee is a hoop many researchers have to jump through, but then the real work of ethics begins.

For most research ethics committees, vulnerability is an attribute of some groups and not others. Groups who may be deemed vulnerable include children, older people, or adults with learning disabilities. These categories are specified by UKRI, which oversees government-funded research in the UK. But if you look at this in more detail, it doesn’t stand up. Take children. Say a competent 14-year-old is a young carer for their single parent, who lives with severe and enduring mental health problems and drinks alcohol all day. Which of those two people might be better able to give informed consent to the child taking part in research? Conversely, people are not necessarily vulnerable because they are older. President Biden is 79 and I can’t imagine him being seen as vulnerable. Learning disabilities don’t necessarily make people vulnerable either, as some of my dyslexic friends would no doubt agree.

Vulnerability is not an attribute: it is a state we all move into and out of in different ways. The start of the Covid-19 pandemic made this abundantly clear. Quite suddenly we were all vulnerable to illness, perhaps death; to increased anxiety; to fear for loved ones who fell sick; to bereavement. Heads of state were no safer than ordinary people living in apartments or suburbs, and researchers were every bit as vulnerable as their participants. Perhaps one small positive side-effect of the pandemic is this: we can see more clearly that we are all vulnerable to changing circumstances resulting in trouble or trauma. This does not mean we are all vulnerable all the time – but that any of us may be, or may become, vulnerable at any time. As researchers, I think it is essential for us to be aware of this, and to be ready to face and manage it when it occurs.

Vulnerability and sensitivity have something in common. Just as it is not possible to predict from group membership who is and is not vulnerable, so it is not possible to predict who will and will not be upset by a topic. Of course some topics are likely to be upsetting: female genital mutilation, suicide, sex work, and so on. If we are investigating evidently sensitive topics like these, we need to put whatever precautions we can in place to make the experience as safe as possible for our participants, and for ourselves. But we cannot be sure that everyone will find these topics equally sensitive; there are people who can take such topics in their stride.

Conversely, some people may be upset by apparently innocuous topics. Suppose a market researcher is investigating people’s perceptions of homewares. In one interview, the researcher asks their question about teapots, and realises their participant is struggling to hold back tears. The participant explains that the last gift ever given to them by their beloved mother, who died exactly one year ago, was a teapot. Perfectly plausible; impossible to foresee.

So, we can’t always predict everything everyone will be sensitive about, and we shouldn’t pretend we can. But, again, we need to equip ourselves with the mental and emotional intelligence and dexterity to be able to deal with the unexpected. Because if there is one thing we can predict, it is that at times we will face the unpredictable.

Indigenous Research Methods: Another Reading List

I thought it was time to share more of the books from my shelves. As with my previous post on this topic, this post is a reflection of my personal collection, built from the recommendations of students, colleagues and people on social media, as well as my own explorations. The more I have read and worked with Indigenous scholarship, the more convinced I have become of the importance of including these perspectives in my own work wherever they are relevant. I am glad to be able to use my own power, such as it is, to amplify the voices of scholars who are much more marginalised than me.

Books on Indigenous research methods are very different from books on Euro-Western research methods. Books on Euro-Western research methods are akin to recipe books: combine these things, like this, and you will probably get that result, unless some contextual factor gets in the way. Books on Indigenous research methods don’t start with what to do and how to do it, they start with stories, and thinking, and sharing, and knowing, and learning. One key difference is that Indigenous research is designed to serve existing relationships, and if it is not likely to at least maintain and ideally strengthen those relationships, it is not deemed to be worth conducting. In the Euro-Western paradigm, we teach novice qualitative researchers to ‘create rapport’ with participants, to put them at ease – in effect, to make instrumental use of our friendship skills to obtain information from people we may not ever see again. Euro-Western researchers have begun to question how ethical this is. Indigenous researchers offer us some unmissable clues to the answer.

I am not, and I will never be, an expert on Indigenous research. Since my book on research ethics came out – with its subtitle of ‘Euro-Western and Indigenous Perspectives’ – I have received several invitations to speak about Indigenous research and to peer-review journal articles written by Indigenous scholars. I always refuse the first, and I only accept the second if the journal editor can assure me that the other reviewers will be Indigenous scholars (which, to date, no journal editor has been able to do). As a white English person I already have too much power in this post-colonial arena. I do not plan ever to use any of it to set myself above or take advantage of the Indigenous scholars who have taught me, and are teaching me, so much through their writings.

These books could be described as more theoretical than practical but, in the words of Kurt Lewin, the inventor of action research, ‘There is nothing as practical as a good theory.’ Lewin was a Jewish German psychologist who immigrated to the US as an adult in 1933, so he had experienced and understood oppression. He was also, perhaps as a result, much more interested in applied research which could make a positive difference to social problems than in research that might generate knowledge for its own sake. In the Indigenous research literature this distinction is not relevant, made or discussed, because knowledge is conceptualised as collectively owned, in contrast to the Euro-Western paradigm where knowledge is conceptualised as a form of individual property.

I could say a lot more about the similarities and differences I perceive, but I need to get to the books! The first is Talkin’ Up To The White Woman: Indigenous Women and Feminism by Aileen Moreton-Robinson, Professor of Indigenous Research at RMIT in Melbourne, Australia. This was recommended by various people on social media, and I didn’t get around to buying a copy until last year, but I’m not sorry because I got the 20th anniversary edition with a new preface. It is a book of relevance to every white woman and anyone who uses feminist theory. Although it was written over 20 years ago, it is still highly, urgently topical. The author explains how white women dominate the feminist agenda; invites us to notice and interrogate our white privilege; and suggests we need to figure out how to give up some of that privilege in the interests of greater equality – which, after all, is where feminism came in.

Syed Farid Alatas is Professor of Sociology at the National University of Singapore. His book Alternative Discourses in Asian Social Science: Responses to Eurocentrism points out how and why Euro-Western social science doesn’t fit with Asian realities. The book covers the whole of Asia and all of the social sciences, and – despite its title – argues that alternative discourses alone are not enough, particularly if they are created in the same mould as the Euro-Western social science discourses so prevalent in Asian universities. Alatas explains in forensic detail how Asian academies are still colonised by Western approaches and curricula. He calls for a ‘liberating discourse’ which will help to popularise Asian ideas and perspectives.

Antonia Darder is a Puerto Rican and American scholar, artist, poet and activist. She has edited a collection called Decolonizing Interpretive Research: A Subaltern Methodology for Social Change. The foreword, by Linda Tuhiwai Smith, notes that ‘dominant theories … have spectacularly failed to transform the lives of subaltern communities and have instead reinforced privilege and inequalities across all developed and developing countries’ (p xii). In her introduction, Darder points out that an insistence on empirical evidence is a colonialist approach and, in close alignment with Alatas, calls for a reversal of privilege to foreground Indigenous philosophies and approaches.

Applying Indigenous Research Methods: Storying with Peoples and Communities is edited by Indigenous American scholars Sweeney Windchief and Timothy San Pedro. The editors begin by acknowledging that there is more in the literature about what Indigenous research methods are, and why, than about how they can be applied. This book sets out to correct that imbalance – and says quite clearly on the back cover that it is designed for use and teaching across Indigenous studies and education. Any Euro-Western researcher who is looking for methodological novelty they can use in their own work will not find that here. What they will find instead are inspiring stories of how research can be when it is understood and conducted holistically in and for communities of people who share a system of values which have been developed and tested over millennia.

Indigenous Canadian scholars Deborah McGregor, Jean-Paul Restoule and Rochelle Johnston have edited Indigenous Research: Theories, Practices, and Relationships. This also focuses on how Indigenous research is conducted in practice and includes inspiring stories to demonstrate some ways this has been done.

Shawn Wilson, Andrea Breen and Lindsay DuPré have edited Research and Reconciliation: Unsettling Ways of Knowing through Indigenous Relationships. The editors are two Indigenous researchers and one white settler. They explain the troubled complexity of the concept of reconciliation, which means different things to different people and can be co-opted for colonialist purposes. The editors are overtly working towards twin purposes of creating intellectual discomfort in some arenas and, in others, creating and protecting spaces for researchers to work as authentically as possible. And, again, the contributions are inspiring stories – though sadly, unlike all the others, this book doesn’t have an index.

There are more links between the last three books than their presentation of stories. These books seem to speak to each other, the stories intertwining and sometimes disagreeing, going back and forth and around again but always making progress. Like a conversation. And they are all very readable, written with dialogue and storytelling, poetry and images.

Lastly I am going to mention again a book I covered in my previous post: Indigenous Research Methodologies by Professor Bagele Chilisa from the University of Botswana. I am mentioning this book again because the second edition is now out and well worth buying and reading, even if you already have the first edition.

What Is Data?

Last week, in the context of some work I’m doing for a client, I was trying to find something someone had written in answer to the question: what is data? I looked around online, and in my library of methods books, and I couldn’t find anything except some definitions.

The definitions included:

  • Factual information used as a basis for reasoning or calculation (Merriam-Webster)
  • Information, especially facts or numbers, collected to be used to help with making decisions (Cambridge English Dictionary)
  • Individual facts, statistics, or items of information, often numeric (Wikipedia)

Data is also, demonstrably, a word, and a character in Star Trek. So far, so inconclusive. Yet people talk and write about data all the time: in the media, in books and journals, in conversations and meetings. And they use it to refer to many things other than facts or numbers. Data may be anything from a piece of human tissue to the movement of the stars.

Euro-Western researchers conventionally speak and write of ‘collecting’ data. And indeed some data can be collected. If you want to research beach littering, you can go and collect all the litter from one or more beaches, and then use that litter as data for analysis. If you want to know what differences there may be in how print media describes people of different genders, you can collect relevant extracts from a bunch of articles and then use those extracts as data for analysis. So this is valid in some cases. However, if you plan to research lived experience by collecting data, you are effectively viewing people as repositories of data which can be transferred to researchers on request, and viewing researchers as people who possess no data themselves and so need to take it from others. Clearly neither of these positions is accurate.

Some Euro-Western researchers speak and write of ‘constructing’ data. This refers to the generation of data as a creative act, such as through keeping a diary for a specified length of time, taking photographs during a walking interview, or making a collective collage in a focus group. Even conventional interview or focus group data can be viewed as being constructed by researcher and participant(s) together.

Autoethnographers and embodiment researchers privilege data from their own lived experience, though often they also use data collected from, or constructed with, others. But for these researchers, their own sensory experiences, thoughts, emotions, memories and desires are all potential data.

For Indigenous researchers, all of these and more can be used as data, which is often co-constructed with the researcher and all participants working together in a group. This is done in whatever way is appropriate for the researcher’s and participants’ culture. Māori research data is co-constructed through reflective self-aware seminars. In the Mmogo method from southern Africa, objects with symbolic and socially constructed meanings are co-constructed from familiar cultural items such as clay, grass stalks, cloth and colourful buttons, during the research process, to serve as data (Chilisa 2020:223-4, 243). Indigenous researchers in America, Canada and Australia use oral history, stories and artworks as data (Lambert 2014:29-35).

All of this tells us that data is not purely facts and numbers, as the definitions would have us believe. Conversely, we could conclude from the examples above that pretty much anything can be data. This does not mean anything can be data for any research project. You’re not likely to find a cure for disease by collecting bus timetables, or identify the best way to plan a new town by making inukshuk. But bus timetables could be very useful for research into public transport systems, and making inukshuk could be integral to Indigenous research into the knowledge and belief systems of Arctic peoples.

Data can be documents or tattoos, poems or maps, artefacts or photographs – the list is very, very long. And of course a research project may use different kinds of data, which could be collected, or constructed, or some of each. The question we need to ask ourselves, at the start of any research project, is: what kind(s) of data are most likely to help us answer our research question, within its unique context including any constraints of budget and/or timescale? In the end, for some projects, the answer will be facts, or numbers, or both. But if we assume this from the start, we close off all sorts of potentially interesting and useful options.

Ethics Codes and Guidelines

Last month I was involved in the final review meeting for the PRO-RES project. This is a project funded by the European Commission to create an ethics framework for all non-medical researchers. I worked on this project from 2018 to 2021: I have written about the experience here, and about some of the resources we created and curated here.

One key resource is a collection of research ethics codes and guidelines. We also conducted five case studies of very different approaches to developing and implementing codes and guidelines. These were from:

  • The International Network of Governmental Science Advice (INGSA)
  • The United Kingdom Research Integrity Office (UKRIO) and the Association of Research Managers and Administrators in the UK (ARMA)
  • The Social Research Association (SRA)
  • The Estonian Code of Conduct for Research Integrity
  • The Croatian Agency for Personal Data Protection

INGSA has around 6000 members from more than 100 countries, and they are not just government science advisors (as the name suggests) but a much wider group. INGSA acts as an informal network of key actors who help to build evidence and provide advice for policy-makers. It works to ensure that the evidence used by its members is scientifically robust and ethically sound. Its global and transdisciplinary work is too complex and multi-faceted to be managed through a written ethics code or guideline. Instead, it focuses on training advisors to identify robust and ethical evidence.

UKRIO and ARMA worked together to create a common framework for ethics support and review for UK universities and other research organisations. The aim was to support best practice and common standards, and the framework was co-produced by ethicists, research ethics committee chairs, and representatives of universities, research funders and learned societies. The framework was published in 2020, is explicit and detailed, and is freely available online. It is now being used by many universities and research organisations.

The SRA has recently updated its ethical guidelines, which are widely used by researchers from a range of sectors. The SRA is a small charity run by volunteers, and the update was also done by volunteers, which meant it took quite a long time. The pandemic slowed the process even more. In retrospect, they would have benefited from paying someone to do the initial drafting with input from a group of volunteers. They considered looking for another organisation’s guidelines to adopt, but decided that could be just as difficult and might prove impossible. So they pressed on and finished the job. The guidelines were published in early 2021 and are freely available online.

The Estonian case study researched the process leading to, and following, the signing of the national Estonian Code of Conduct for Research Integrity in 2017. The process of developing and signing the code took 18 months and involved universities and research organisations, plus consultations with partners from research and development institutions and with the wider public. After the code was signed, the process of implementation began, with debates around committees for research integrity and different universities applying the code in different ways. The Estonian Research Council and the Estonian Ministry of Education and Science are reorganising relevant legislation to align with the code, and monitoring its implementation.

The Croatian case study focused on personal data protection in academic and research institutions throughout the country, before and after the EU’s General Data Protection Regulation (GDPR) came into force in 2018. The number of reported personal data breaches in Croatia increased dramatically after the implementation of GDPR, but very few of these related to research. Hundreds of data protection officers across Croatia were found to have little knowledge of personal data protection or its relationship with ethics. Ethical issues around personal data protection were also found to be problematic at EU level. Each of these aspects of the case study was written up in open access journal articles.

These case studies may seem quite disparate but, collectively, they offer some useful lessons. First, when creating frameworks for ethics and integrity in research, there is a clear need to balance ethical ideals with what is possible in practice. Second, being prescriptive is not possible because of the constant changes to research contexts and wider society. Third, delegating responsibility for ethics to a specialised team such as a research ethics committee leads to compliance, not engagement. (I have written more about this elsewhere.) Fourth, sanctions and incentives can help to deepen commitment, but they are only appropriate for some discrete elements of research ethics, such as GDPR compliance.

I also found it interesting to observe the discussions during the PRO-RES project. I learned that a number of ethicists yearn for a common ethics guide or code: ‘one code to rule them all, one code to bind them,’ as I sometimes enjoyed misquoting. I also learned that institutions, organisations, nations and other groups feel a strong need to develop their own code, with nuances and emphases that reflect their own ethos and vision. The PRO-RES project initially aimed to create a common framework for all non-medical researchers. And indeed it has done so, though how widely the framework will be taken up and used remains to be seen.

A central part of the framework is the PRO-RES Accord, a concise statement of ethical principles which was widely consulted on during the PRO-RES project. Over 1000 people, across Europe and beyond, gave feedback on draft versions before the accord was finalised. Signing the accord means you agree to abide by its principles; endorsing the accord means you commend its principles and will strive to promote them. Anyone can download, sign, and/or endorse the accord, either as an individual or on behalf of an organisation. Perhaps you would like to do so yourself.

Asset-Based Research

For me, one of the greatest developments in research methods so far this century is the genesis and expansion of asset-based research.

Up to the end of the last century, research was almost entirely based on deficits. What we studied were problems, lacks, difficulties, deficiencies, gaps. This is understandable: people generally do research to try to improve matters, so starting with something that needs improvement makes sense. However, we were missing a big trick.

Around the turn of the century, psychologists Martin Seligman and Mihaly Csikszentmihalyi founded the positive psychology movement. Before then psychologists had mostly studied topics such as memory loss, criminal and deviant behaviour, attachment disorders, psychopathology and the like. The positive psychology movement chose to study topics such as happiness, resilience, well-being and so on, to find out what we can learn from people who are flourishing and how we might be able to extend some of that to others.

Organisational researchers David Cooperrider and Suresh Srivastva were taking a similar approach. They developed the method of Appreciative Inquiry which begins by looking at what an organisation does well and is proud of, and then considers how it can improve in the light of its successes. And researchers from various disciplines around the world have been drawing on Amartya Sen’s capabilities approach to consider what Indigenous and other marginalised people can and do contribute to their communities.

Asset-based research is also beginning to be used in other fields, including Autism research. I am proud to have made a small contribution to this process myself, through a journal article Aimee Grant and I wrote which was published in Contemporary Social Science last month. The article is called Considering the Autistic advantage in qualitative research: the strengths of Autistic researchers. Much Autism-related research has been conducted by neurotypical people based on a view of Autistic people as deficient. By contrast, in our article, Aimee and I demonstrate that Autistic people like us have a lot to offer to qualitative research teams. We have also formulated some guidance, for teams with a mix of neurotypical and neurodiverse people, to facilitate effective inclusive working.

I am delighted to say the article is open access so you can all read it! I am also delighted that it has generated a lot of interest, with over 2,500 views in its first three weeks. And I feel proud to have been able to make this contribution within nine months of my own Autism diagnosis. Though I should acknowledge that I couldn’t have done it without Aimee, who was an excellent collaborator. Also, we had fantastic support from the journal editors and the anonymous reviewers. If you are looking for a home for an article on researcher experiences and research methods, or would like to propose a special issue, I would encourage you to consider Contemporary Social Science. It is the journal of the Academy of Social Sciences, of which I am a Fellow, but you don’t need any links to the Academy to submit work to the journal. They publish four issues a year, of which only one is open access at present, but that may change in time.

Anyway, if you find our article helpful or interesting – or disagree with the points we make, because all reasoned debate is useful – then please let us know, either here in the comments or over on Twitter.

The Research Trajectory

I have been talking about the research trajectory with my students for years. I describe my conception of this trajectory, which I call ‘The Helen Kara Inverted Bell Curve Of Research’. I often use this conception to help me explain why it is usually not a good idea, when data analysis is challenging, to decide that all problems will be solved by throwing in a few extra methods – gathering more data, reading a new body of literature, and so on.

It occurred to me that manifesting the image I see in my head might be entertaining for me (it was!) and perhaps useful for others. So I made a graphic. Here it is. Does it resonate with you?

The Helen Kara Inverted Bell Curve Of Research

Why Secondary Data Should Come First

The argument put forward in this post has been brewing in my mind – and being put into practice in my research work – since some time before COVID-19 appeared in our midst. The pandemic has accentuated the point I want to make.

Essentially, my argument is this: researchers should make as much use of secondary data as possible before we even think about gathering any primary data.

Most novice researchers are taught that new research requires primary data; that original research requires data gathered for the purpose by the researcher or the research team. Most research ethics committees focus most of their efforts on protecting participants. We need to change this. I believe we should be teaching novice researchers that new/original research requires existing data to be used in new ways, and primary data should be gathered only if absolutely necessary. I would like to see research ethics committees not only asking what researchers are doing to ensure the safety and wellbeing of participants, but also requiring a statement of the work that has been done using secondary data to try to answer the research question(s), and a clear rationale for the need to go and bother people for more information.

I believe working in this way would benefit researchers, participants, and research itself. For researchers, gathering primary data can be lots of fun and is also fraught with difficulty. Carefully planned recruitment methods may not work; response rates can be low; interviewees often say what they want to say rather than answering researchers’ questions directly. For participants, research fatigue is real. Research itself would receive more respect if we made better and fuller use of data, and shouted about that, rather than gathering data we never use (or worse, reclassifying stolen sacred artefacts and human remains as ‘data’ and refusing to return them to their communities of origin because of their ‘scientific importance’ – but that’s another story).

Some people think of secondary data as quantitative: government statistics, health prevalence data, census findings, and so on. But there is lots of qualitative secondary data too, such as historical data, archival data, and web pages current and past. Mainstream and social media provide huge quantities of secondary data (though with social media there are a number of important ethical considerations which are beyond the scope of this post).

Of course secondary data isn’t a panacea. There is so much data available these days that it can be hard to find what you need, particularly as it will have been gathered by people with different priorities from yours. Also, it’s frustrating when you find what you need but you can’t access it because it’s behind a paywall or it has an obstructive gatekeeper. Comparison can be difficult when different researchers, organisations, and countries gather similar information in different ways. It can be hard to understand, or detect any mistakes in, data you didn’t gather yourself, particularly if it is in large, complicated datasets. Information about how or why data was gathered or analysed is not always available, which can leave you unsure of the quality of that data.

On the plus side, the internet allows quick, easy, free access to innumerable quantitative and qualitative datasets, containing humongous amounts of data. Much of this has been collected and presented by professional research teams with considerable expertise. There is scope for historical, longitudinal, and cross-cultural perspectives, way beyond anything you could possibly achieve through primary data gathering. Working with secondary data can save researchers a great deal of time at the data gathering stage, which means more time available for analysis and reporting. And, ethically, using secondary data reduces the burden on potential participants, and re-use of data honours the contribution of previous participants.

There are lots of resources available on using quantitative secondary data. I’m also happy to report that there is now an excellent resource on using qualitative secondary data: Qualitative Secondary Analysis, a recent collection of really good chapters by forward-thinking researchers edited by Kahryn Hughes and Anna Tarrant. The book includes some innovative methods, interesting theoretical approaches, and lots of guidance on the ethics of working with secondary data.

Some people think that working with secondary data has no ethical implications. This is so wrong it couldn’t be wronger. For a start, it is essential to ensure that informed consent for re-use has been obtained. If it hasn’t, either obtain such consent or don’t use the data. Then there are debates about how ethical it is to do research using secondary data about groups of people, or communities, without the involvement of representatives from those groups or communities. Also, working with secondary data can be stressful and upsetting for researchers – imagine if you were working with historical data about the Holocaust, or (as Kylie Smith does) archival data about racism in psychiatric practice in mid-20th century America. Reading about distressing topics day after day can be harmful to our emotional and mental health, and so to our physical health as well.

These are just a few of the ethical issues we need to consider in working with secondary data. Again, it is beyond the scope of this post to cover them all. So working with secondary data isn’t an easy option; although it is different from working with primary data, it can be just as complex. I believe novice researchers should learn how to find and use secondary data, in ethical ways, before they learn anything about primary data gathering and analysis.

The Personal Is Empirical

Human beings are natural researchers: exploring, seeking and comparing data, testing, evaluating, drawing conclusions. We do this all our lives. One of our first research methods, when we are only a few months old, is to put everything in our mouths. By the time we are a few years old we are asking incessant questions. We are programmed to investigate. As we get older, our methods get more sophisticated – and if we train as a professional researcher, they become more systematic, too.

Do you know the roots of the word ‘empirical’? It is derived from the Greek word ‘empeirikos’, meaning ‘experienced’. It means something verifiable by experiment or experience. So, the personal is empirical.

Autoethnographers know this already. For a generation now autoethnographers have been ‘utilizing personal stories for scholarly purposes’ (Chang 2008:10). Some have put too much emphasis on the personal stories and not enough on the scholarly purposes, leading to accusations of self-indulgence, navel-gazing, and irrelevance. More, though, have worked to link their personal experience with other data and wider narratives, theory, evidence, policy, and practice, in a systematic and rigorous way.

Embodied researchers also know that the personal is empirical. They focus on the physical, sensory dimensions of experience, as part of the data they collect. This subverts the conventional view of scholarly work as entirely cerebral – or, as the embodied researchers would have it, ‘disembodied research’. Embodied research is also open to accusations of self-interest and irrelevance. Yet embodied researchers point out that no research can in fact be disembodied. Even sitting still and thinking is a physical activity; the brain with which you think forms part of your body.

Other researchers draw on the personal in other ways. In my work on creative research methods, I have been astonished by the number of people who combine their artistic skills, or their writing talents, or their aptitude for making, or their technological savvy, or some other personal attribute, with their research. This usually results in enrichment and often innovation, yet even now working in these ways can feel like swimming against the tide. The way we try to contain knowledge in silos, and reify specialisation, is not the norm in human history. It is not long since nobody thought it strange for someone to be both weaver and astronomer, doctor and poet, musician and engineer. Why have we forgotten that ‘the more diverse someone’s knowledge, the more likely they are to be able to identify and implement creative solutions to problems’ (Kara 2020:11)?

Musing on all of this, I came up with the phrase ‘the personal is empirical’. I tried it out on a group of students last month and it went down well. Then, like a good scholar, I checked to see whether anyone else had used the phrase already. It was used by one US academic, most recently around 15 years ago. She was a feminist too and I guess for her, as for me, the generation of this phrase was influenced by the old feminist mantra that ‘the personal is political’. Nobody owned that phrase, and nobody owns this one either – you’re free to use it if you wish.

In fact, it would be great if you did. Because we need more people to understand that ‘knowledge is worth having, no matter where it originates’ (Kara 2020:11) – whether that is in the body, or someone’s wider life experience, or in a test tube, or an encounter with a book, or a conversation, or an animated film. As a species, as inhabitants of planet Earth, we have a plethora of problems to solve. We cannot afford to reject knowledge, or create hierarchies of knowledge; we need to value everyone’s expertise. And their experience. And experiments, and evidence, and theories – the whole lot. In fact, it is all empirical, but nobody will argue if you talk about empirical experiments or empirical evidence. The personal is empirical? That’s more provocative. So take this toy I have given you, my dear ones; take it and play!
