Why Peer Reviewing Is More Difficult These Days

I have been a peer reviewer of journal articles for the last eight years. I documented my first peer review, in late 2014, on this blog. Peer reviewing has never seemed easy to me – and I don’t think it should. Reviewing original work by other scholars is bound to be intellectually and emotionally demanding. But I feel as if peer reviewing has become more difficult, even over the comparatively short time I have been involved. There are several reasons for this, and I will focus on three of them here: hoaxes, malpractice and complexity.

Academic hoaxes pre-date my reviewing experience. In 2005, three US-based doctoral students in computer science, Jeremy Stribling, Max Krohn and Dan Aguayo, created SCIgen: a computer program which can generate whole computer science research articles, including graphs, figures and citations, that look credible but are in fact nonsensical. Many SCIgen-generated articles have been accepted by, and published in, peer-reviewed conference proceedings and journals.

And such hoaxes are not limited to computer science. In 2017–18, three scholars based in the UK and US – James Lindsay, Helen Pluckrose and Peter Boghossian – wrote 20 fake articles using social science jargon. They were able to get several of these articles published in academic journals, even though some of them promoted morally questionable acts. The aim of these three scholars was apparently to highlight what they saw as poor-quality work in some areas of the social sciences. However, I am not sure this intended end justifies the questionable means of duping reviewers and editors into publishing bogus research.

Sadly, though, it seems that academic journals are regularly duped into publishing bogus research by researchers themselves. Retraction Watch, based in the US, has been keeping track of retracted journal articles for the last 12 years. Some articles are retracted because their authors made honest mistakes. But the Retraction Watch database lists a lot of other reasons for retraction, including falsification or fabrication of data, and falsification, fabrication or manipulation of images or results. And the numbers are staggering. At the time of writing, there are over 1,500 articles listed on the database as retracted due to the falsification and/or fabrication of data, and over 1,000 due to the manipulation of images. Also, the database only includes those articles in which fabrication, falsification or manipulation have been detected and reported. By its own admission, Retraction Watch is biased towards the life sciences, so problematic journal articles in other sectors will be even less visible.

A number of people make it their business to find and publicise these problematic articles. One, Elisabeth Bik, even does so under her own name. Others use pseudonyms such as Clare Francis, Smut Clyde, Cheshire, and TigerBB8.

Bik specialises in identifying manipulated images, and has found through empirical research that their prevalence is increasing. She has a particular talent for pattern recognition, though. It is of course useful to know that images may be manipulated, and Bik regularly shares examples on social media and elsewhere, which can help others understand what to look for. Even so, spotting manipulated images can be difficult for the average, harassed, unpaid peer reviewer. And catching fabricated or falsified images, data or results may be almost impossible without inside information. The strict word limits imposed by most journals can work against reviewers here: these restrictions mean researchers are used to some aspects of their processes receiving a cursory mention at best, and this can enable cheating to pass undetected.

When reviewing goes wrong, the consequences can be disastrous. One recent controversy concerned a published article promoting a morally questionable act; I am deliberately not using any of its keywords in this post. There are some particularly interesting aspects of this case. It is not the first article to be published that features morally questionable acts. I have read the article; it is well written, and I can see how a peer reviewer could regard it as worthy of publication – as its own peer reviewers did. The problem, for me, lay in the background of the author, who promotes morally questionable acts outside of academia. He may have written the article in the hope that publication would lend legitimacy to his actions. Even if he did not, publication might be perceived to confer such legitimacy, which could cause reputational damage to the publisher and the university concerned.

So, the article you are reviewing may be a hoax, and/or may contain data, images, and/or results that have been manipulated, fabricated or falsified in ways that are difficult or impossible to detect, and/or may have been written by someone with a dodgy agenda. But that’s not all. Academic work – and, indeed, the world around us – is becoming more complex. More research is transdisciplinary, pushes methodological boundaries, is multi-lingual, and so on. The process of peer review was devised when people worked in neat, tidy, single disciplines and fields. In that landscape people could act as experts on other people’s work in its entirety. These days that is not so easy. Topics such as sustainability, the climate crisis, and food security transcend disciplines and methods. This means that hardly anyone can be an expert on a whole article any more, so pre-publication peer review on its own is no longer enough. Yet it is still being used as if it were.

This means we need not only peer review before publication, but also after publication. Luckily there is a tool for this: PubPeer, a website where you can comment on published journal articles, anonymously if you wish. This enables researchers with inside information to whistleblow without risking the loss of their jobs. Also, you can use PubPeer to check articles you are intending to cite, to make sure nobody has raised any concerns about the work you want to use. At the moment PubPeer focuses mostly on laboratory and clinical research, but there is also (not surprisingly) some computer science. In fact PubPeer can be used for any published journal article as long as the article has a recognisable ID such as a DOI. Also, there is a PubPeer browser plugin which enables PubPeer comments to be visible on other websites besides PubPeer itself.

This blog and the videos on my YouTube channel are funded by my beloved Patrons. Patrons receive exclusive content and various rewards, depending on their level of support, such as access to my special private Patreon-only blog posts, bi-monthly Q&A sessions on Zoom, free e-book downloads and signed copies of my books. Patrons can also suggest topics for my blogs and videos. If you want to support me by becoming a Patron click here. Whilst ongoing support would be fantastic you can make a one-time donation instead, through the PayPal button on this blog, if that works better for you. If you are not able to support me financially, please consider reviewing any of my books you have read – even a single-line review on Amazon or Goodreads is a huge help – or sharing a link to my work on social media. Thank you!

Rapport or Respect?

Trainee qualitative researchers, learning the most popular research method of interviewing, are routinely taught to use their interpersonal skills to create rapport with participants. This has been questioned for the last 20 years by Jean Duncombe and Julie Jessop, who ask how ethical it is for researchers to fake friendship as a means to the end of gathering data.

On the one hand, we all commonly use interpersonal skills to help us get what we want from others in our day-to-day lives. Whether we want a loan from a credit agency, a prescription from the doctor, or a response to a complaint, in a multitude of situations presenting our most polite and friendly selves can help to get the results we want. So it is arguable that it makes sense also to use these everyday methods in research.

On the other hand, research encounters are rather different from everyday encounters. This applies particularly to qualitative research where a researcher may spend a considerable period of time giving a participant their undivided attention. This is an unusual and often a welcome experience for participants, who often describe it in positive terms such as ‘therapeutic’, ‘cathartic’ or ‘a treat’.

Many of the people we want things from in day-to-day life are either providing us with goods and services, so that a transactional element is built into the encounter, or are already in a personal relationship with us through kinship, friendship or community membership. So the rapport we build in those situations already has a clear basis which is mutually understood. This does not apply within the research encounter, where we are usually asking participants to give us their time and information in exchange for a potential benefit to an imagined future population. (I considered the extent to which this is ethical in my recent post on the LSE Impact Blog.) Also, despite all the efforts to secure informed consent, we know that people generally agree to participate in research for their own reasons rather than ours. And where that reason is to get a little human company and kindness, which is lacking from their own lives, the practice of building rapport begins to appear even more suspect.

Imagine you are, let us say, living on minimal welfare benefits with a chronic condition which makes it difficult for you to leave the house. You have lost touch with the friends you used to have when you could go out to work, and your family live far away. You suffer from anxiety and you are very lonely. The carers who come in three times a day are brisk and professional; they don’t have time to chat, and you don’t want to hold them up because you know they are always under pressure. Then a researcher calls, saying she is doing an evaluation of the care you receive, and asking if she can visit you to ask a few questions. You are delighted because it’s been years since you had a visitor and she sounds so kind and friendly on the phone. When she visits, you tell her all sorts of things about yourself and your life. She seems really interested, and laughs at your jokes, and tells you a few things about her own life in return. You haven’t felt this good in years. When she has asked all her questions, you ask one of your own: please will she visit you again? She looks at the floor and says she would like to, but she can’t promise, because between work and her children she doesn’t have much free time. You would like to suggest she brings her children with her, but you know a ‘no’ when you hear one, so you let her go, wait for the front door to close, and listen to the emptiness of your home and your life.

Duncombe and Jessop point out that these problems are multiplied in longitudinal research, where the boundaries between real and faked friendship can become much more blurred. They share experiences of participants beginning to treat them as friends, and the discomfort that arises when they don’t reciprocate. I have had similar experiences, and I’m sure many other qualitative and mixed-methods researchers have too. It is interesting to consider this Euro-Western approach in the light of the very different Indigenous approach, in which research is deemed to be ethical when it serves to maintain and develop existing relationships. Looked at in this way, our Euro-Western approach of creating and then dropping relationships to further our research purposes seems potentially abusive.

The EU-funded TRUST project developed a Global Code of Conduct for Research in Resource-Poor Settings. It was based on four values elicited through research with a wide variety of people around the world: respect, fairness, honesty and care. The aim was to combat ‘ethics dumping’, where research deemed unethical in a higher-income country is conducted, instead, in a lower-income country where research is not governed by a regulatory system. I would argue that these values should also apply where research is done by a researcher with more social capital than some or all of their participants. In the vignette above, the researcher was not entirely honest and did not show care in response to the participant’s request, e.g. by signposting them to a local befriending service. This could be described as ‘friendship dumping’.

When you think about it, researchers using their interpersonal skills to create rapport with a participant as a means to an end is actually quite manipulative. This might be more defensible when we are ‘studying sideways’ or ‘studying up’, but even then it seems questionable. Showing respect for participants would be a more creditable aim, especially if it was combined with fairness, honesty and care.

The next post on this blog will be in September. You can follow the blog, above, to get my posts in your inbox.


The Value and Limitations of Lived Experience

At times I have been hired for my ‘lived experience’, either as a carer for people with mental health problems or as a disabled person myself. I have also worked in research teams with people who have other kinds of ‘lived experience’, such as parenting children under five or living with addiction. I am not particularly keen on the phrase ‘lived experience’, because as far as I can tell all human experience is lived experience. I prefer ‘experts by experience’.

However, I also think the concept is flawed. Being an expert by experience is not like being an expert in domestic plumbing, or millinery or research ethics. For a start, the categories provided for experts by experience are incredibly broad. ‘Disability’ is a huge category. I am Autistic and I live with fibromyalgia and asthma. That qualifies me as an expert by experience – but I am no expert in the experiences of Deaf people, or stroke survivors, or people with Tourette’s syndrome, or many, many others. ‘Addiction’ is another huge category, covering street and pharmaceutical drugs, alcohol, shopping, sex and so on. Someone who is addicted to alcohol will not be an expert in the experiences of someone who is addicted to heroin or gambling. I could give you equivalent examples for mental health carers, the parents of young children, and any other category of ‘expert by experience’ you care to name.

Also, I often observe – and have experienced – experts by experience being required to subordinate their experience-based expertise to expertise conferred in other ways, such as through education or employment, and/or to organisational constraints. I have heard of situations where research ethics committees discounted expertise based on experience (which was no fun at all for the researchers concerned). And I have other forms of expertise myself, developed through education and employment; my experience shows that these are valued more highly than my expertise by experience. I earn more with them, for one thing. All this leads me to understand that expertise by experience is treated as being worth less than other forms of expertise.

I should also acknowledge that I have witnessed several situations where third sector organisations passed over a capable and qualified candidate to recruit an employee with lived experience. This might look like organisations valuing expertise from lived experience more highly than other forms of expertise, but in each case the story did not end well. Recruitment is one thing, retention is quite another. Recruiting someone who is not able to do the job, and then not providing the adaptations and support they need to become able to do the job, is a costly form of box-ticking. And I don’t mean only financial costs; failed employment leads to enormous emotional and mental health costs too.

Another thing I have observed – and not only post-recruitment – is much less support and development being available for experts by experience than for other kinds of experts. I have mentioned payment, which may be in the form of a voucher, or travel expenses and a sandwich lunch; once in a while a reasonable amount of actual money. Sometimes there is a helpful booklet or a little bit of training. I have never seen any sign of experts by experience being permitted, let alone encouraged, to develop other forms of expertise.

This is just one example of the ‘us and them’ aspect of experts by experience. In the early 2000s I did a lot of work with Sure Start, a New Labour initiative involving partnership working in areas of deprivation to provide multi-agency one-stop-shop support for parents and children under the age of five. My role was to support partnerships in their early stages so I spent a lot of time sitting around tables with groups including nursery educators, midwives, health visitors, Home-Start managers, and other such professionals. They would talk about ‘the parents’, meaning the people who would be using the services once they were set up. It felt very much as though they were othering their potential service users. I would ask, ‘How many of the people round this table are parents?’ Inevitably some were; often most. Then I would facilitate a discussion about how the lived experience of the parent-professionals could inform the work of the partnership. This made some of the professionals uncomfortable at times. I’m not sorry.

As a researcher, part of my job is to separate and categorise information to help me find useful links and patterns. But this separation and categorising work is temporary, for the purpose of discovery. Separating and categorising people is inevitable – at least for English speakers, because of how the language works – but it always carries the potential for othering. In my lived experience, experts by experience are often on the receiving end. It is not a pleasant place to be, when you are allowed to be involved so far and no further, when others always have the final say.

Everyone is an expert on something, whether that is cleaning a house or conducting an orchestra, plastering a wall or piloting a battleship. I wish experience-based expertise was valued as highly as education-based or employment-based expertise. I think it has every bit as much value and I hope, one day, this will be fully recognised.


Ethics Is Expensive

A while ago I turned down some potentially lucrative work on ethical grounds. I was approached by a global company I will call SubSidTech because it is a wholly-owned subsidiary company of one of the Big Five (Alphabet, Amazon, Apple, Meta and Microsoft). SubSidTech wanted help with creative research methods, and I was tempted, because I could have charged them a high fee and they might well have flown me to interesting places. But we didn’t get that far.

I turned down the work because I know that SubSidTech’s parent company works in some ways I consider to be unethical. I explained this to SubSidTech, politely; they sent a cordial email back thanking me for my candour and assuring me that they respected my views. I would have been very glad of the money. But I know turning the work down was, for me, the right thing to do.

It got me thinking, though, about the costs of acting ethically. Let’s start with consumption. I try to shop as ethically as I can: wherever possible I buy from companies with good policies and practices; I try to buy fairly traded and environmentally friendly products; I do what I can to avoid perpetuating cruelty to humans or animals. But living this way is often more expensive. For example, my phone is a Fairphone 3. The people who make this phone are paid a living wage, it is partly made from recycled plastic, and I can repair it myself with component parts available online at reasonable prices. It doesn’t have the built-in obsolescence of many mobile phones. But it was not cheap.

Sometimes being ethical can save money. I often buy second-hand clothes from online marketplaces or charity shops. But usually there is a premium to be paid for ethical consumption. And with costs rising as steeply as they are at present, I find myself rethinking a lot of my previously automatic choices. I love organic butter. It tastes like the butter of my country childhood, it’s not full of hormones, and it’s good for the planet – but its price has risen by 17% in recent weeks, and non-organic butter is cheaper. I don’t know how long I can maintain my ethical shopping preferences because, although I am not on a low income, my income is not rising. (It does go up and down a bit, but the average profit from my business over the last five years has been £24,964 per year; I can pay myself most of that.) And people who live on low incomes or welfare benefits have much more limited options for shopping ethically. The impact of the global financial squeeze on ethical consumption practices is already being recognised.

There is also a cost to doing research ethically. Taking the time to do proper participatory or other inequality-tackling research; paying or otherwise recompensing participants; providing suitable aftercare – these all cost more money, time and commitment than funders are used to funding or researchers are used to providing. Completing an ethics application form has a sizeable time cost, though some of the work done will save time later on. But there is still a time overhead, unless you are the kind of researcher who, having received their formal ethical approval, declares that they have ‘done ethics’ and will now get on with their research. And if you’re not that kind of researcher, if you aim to think and act ethically throughout your research work, then that also comes with a time cost and in some cases a financial cost too.

Because of the costs of acting ethically, we end up having to make compromises. Due to the rising cost of living I am consuming less of the ethically produced goods I like to eat and wear and use. My current choice is to consume less, rather than to buy unethically produced goods; this is a mark of privilege, and may have to change again in time. Perhaps there will also come a point where I cannot choose to turn down work from companies whose practices I regard as unethical. I hope not – but I know that, as for most people, if I need the money badly enough I will take any work I can get. But when it comes to research ethics, I plan to stand my ground. This is easier because someone else is paying the bill, most of the people I work for and with understand the purpose and value of research ethics, and often I can influence the ethical aspects of the research I conduct or support. That doesn’t mean research ethics is compromise-free – there are often compromises to be made where ethics is concerned. But I am happy to work in a profession where ethics, albeit expensive, is taken as seriously as I take it in my personal life.


Methods for Lived Experience Research

Note: This post was first published on the SRA blog in November 2021 and is reproduced here with the kind permission of the author and SRA.

In this blog post, Kimberley Neve, a researcher at the Centre for Food Policy at City, University of London, outlines different methods for capturing ‘lived experience’. Lived experience means the actual, specific ways in which people experience something – in this case food: access to food, food poverty, food quality, food allergies and many others. Kimberley and other researchers at the Centre for Food Policy specialising in qualitative methods have produced a Brief giving an overview of the range of methods you can use when researching people’s lived experience of ‘food environments’. Food environments are the spaces in which we make all our decisions about food: what to eat, where to buy it, when and with whom to eat it.

Using qualitative methods to influence policy

As researchers we want our work to have impact. We also want to know that it resonates with people and reflects not only the experiences of the research participants, but also those of the general population in some way. For our research to have a positive impact, effective communication with policy-makers, both locally and nationally, is vital. Despite the potential of qualitative methods to inform policy that is effective and equitable for the people it is designed to help, the number of qualitative studies used as evidence for policy remains modest compared with quantitative studies.

We wanted to raise the profile of qualitative research methods among both policy-makers and food environment researchers by demonstrating the range of potential methods and their benefits (and drawbacks), with a focus on how using them can help inform policy. These methods can be utilised in a wide range of research areas – for example local transport, access to outdoor space or crime in local areas – providing in-depth insights into people’s lived experiences and practices that can explain how or why people act the way they do.

In our Centre for Food Policy Research Brief (the ‘Brief’) we initially mapped existing studies capturing the lived experience of food environments, categorising methods and relevant case studies. Following this, we consulted with members of our Community of Practice – experts in qualitative research and food environments – for feedback prior to final edits.

What are the qualitative methods you can use?

The Brief is not an exhaustive list of the qualitative methods available; however, we’ve tried to capture the main methods you can use. For the scope of the Brief, we didn’t include quantitative methods but of course recognise their vital role.

Often, combining quantitative and qualitative methods can yield the most valuable insights.

To make the overview as useful as possible, we categorised the methods in the following way:

  • Group 1 – Exploring experiences, perceptions, beliefs, practices and social networks;
  • Group 2 – Observing practices in situ;
  • Group 3 – Designing policy and interventions drawing on the lived experience of participants.

Which method should you use for your research?

Typically, you’ll benefit from combining methods to suit your research context. For example, visual methods and observation tend to be accompanied by individual or group interviews to provide more in-depth exploration. In the full Brief you’ll find an overview of qualitative methods with the key benefits and potential limitations of each. Assuming you know all about individual interviews and focus group discussions already, here is a selection of other methods less frequently used in research projects.

Group 1: Visual methods

This includes photo elicitation, creative arts (where participants create artwork such as drawings, videos or theatre), concept mapping (pile sorting, ranking, mental mapping) and timelines. One study in the US used photo elicitation in urban neighbourhoods to identify community-level actions to improve urban environments in relation to health. The study allowed the researchers to identify that not all food outlets affected health in the same way, and that contextual factors such as crime and safety influenced how people accessed food, which had implications for community-level policy.

  • PROS – Group 1 methods work particularly well with young participants or where there are language barriers, as views can be expressed more directly and simply. Participants may also be more willing to share information visually and images can provide insights that may not have been accessible via specific questioning.
  • CONS – Visual data can be difficult to interpret in a way that fully represents the participant perspective, and there is a potential for photographs to be seen as reflections of reality, rather than subjective perceptions that provide insights into reality. Participants could also misunderstand the objective and take photos that do not help to answer the research question.

Group 1: Geospatial methods

Geospatial methods often combine mapping with photography and/or GPS to create visual data that can then be discussed in one-to-one interviews or focus group discussions for more insights. Methods include spatial mapping, geonarratives and geotagged photography. These methods are relatively new to the food environment literature; however, they have been used very effectively to explore how people engage with their environment in general, for example in their green space encounters.

  • PROS – Like visual methods, geospatial methods can work well to engage participants in a more creative way and encourage them to share information more openly. They also allow participants to share their knowledge as experts on their own food environments. These methods provide insightful data on the connections between space and place, particularly if combined with interviews or focus groups.
  • CONS – Geotagging requires specific technology that may be expensive and difficult to operate. There are also ethical considerations with mapping someone’s location – when and how this data is collected, stored and used are important factors to specify during the research design.

Group 2: Observation

This involves observing participant behaviour with methods such as go-along tours, transect walks and community group observation. Unlike with non-participant observation (below), the researcher talks to the participants during the activity about what their actions and interactions mean to them. For instance, during a go-along tour in a supermarket (shop-along), the researcher might ask for the thought process behind the decision to purchase a product. Transect walks are go-along tours with the addition of creating a map of the local food environment resources, constraints and opportunities.

In a UK study, go-along interviews were used to explore which changes to supermarket environments would support healthier food practices. A key insight from this research was that varied individual responses to the supermarket environment in low-income neighbourhoods are mediated by differing levels of individual agency. Interventions should include an emphasis on factors that increase agency in order to change how people buy food.

  • PROS – Insights into the practical aspects of daily life and routines can be captured interactively with the participant and explored in more detail with further questioning. Power imbalances in research are addressed as participants take more control of the research process.
  • CONS – The researcher’s presence may affect how participants behave or move around spaces, for instance by influencing what they buy during a shop-along tour. These methods are also time-intensive to organise and conduct.

Group 2: Non-participant observation

This is where participants are watched from a distance, for instance by video, with little or no interaction with the researcher. This method was used as part of a focused ethnographic study in Kenya along with interviews and cognitive mapping. The aim of the study was to inform policies for improving infant and young children’s nutrition practices. Among other insights, a key finding for policy was that future interventions must consider various aspects of food insecurity to improve conditions in practice.

  • PROS – You can get insights into ‘real’ individual actions, such as shopping or eating practices, without the researcher’s presence influencing the actions. Features of everyday life that may otherwise not be mentioned can be recorded and explored with further questioning. The researcher can also complete a log to provide contextual insights that can explain practices from a more objective viewpoint.
  • CONS – Observation alone, without a follow-up interview or discussion, means the researcher is unable to dig into the reasons underpinning the actions, so the interpretation of the situation can be subjective.

Group 3: Photovoice, co-design, co-creation, systems mapping, group model building

The third group of methods were particularly difficult to classify, as terminology and meanings often overlapped (for instance with co-creation and co-design). These methods place the participant at the centre of the research process and actively engage communities affected by policy decisions (at a neighbourhood, city, county, country level) in the research process. Participants are encouraged to draw on their own experiences, expertise and knowledge of their food environments to think about and propose change, so that policies resulting from the research are relevant and context-specific, and as a result have the potential to be more sustainable.

An example of effective group model building can be seen in a study in the US, where community-based workshops took place with a diverse group of chain and local food outlet owners, residents, neighbourhood organisations and city agencies. Action ideas were discussed for interventions to promote healthy food access, including funding new stores that stock healthy food options and building stores’ capacity to source local produce.

  • PROS – For all of the methods in Group 3, the ‘hands-on’ nature of research enables participants to generate information and share knowledge on their own terms. Outputs, such as policy recommendations, are created together with the participants to be effective in their local context following an in-depth research process.
  • CONS – These methods all run the risk of being perceived as tokenistic by participants if engagement is not meaningful and genuine.

In brief

Decisions about which methods to select to study lived experience depend on the purpose of the study (i.e. guided by a specific research question), the local context, the time and resources available, and the benefits and limitations of each method. Recently, the COVID-19 pandemic has accelerated the possibilities of using digital tools and technology as key facilitators for remote research.

As researchers, we not only need to engage participants and design research projects that will yield useful insights; we also have to translate our findings so that these insights can inform the design of effective and equitable policy. By using a range of methods, a more comprehensive and detailed overview can be communicated. Visual materials and stories are particularly effective ways for qualitative researchers to communicate their findings to policy-makers and make a refreshing addition to the more common interviews and focus groups.


This blog was written based on the work produced by all authors credited in the full Brief: Understanding Lived Experience of Food Environments to Inform Policy: An Overview of Research Methods

Author biography

Kimberley Neve is a Researcher at the Centre for Food Policy, City, University of London. She works as part of the Obesity Policy Research Unit, investigating people’s lived experiences of food environments to inform policy in areas such as infant feeding and weight management. Kimberley is a Registered Associate Nutritionist with a Masters in Global Public Health Nutrition.

Rethinking Vulnerability and Sensitivity

Research ethics committees are very concerned with the potential vulnerability and sensitivity of research participants. So far, so laudable – but I don’t think they show their concern in particularly useful ways. Gaining formal approval from a research ethics committee is a hoop many researchers have to jump through, but then the real work of ethics begins.

For most research ethics committees, vulnerability is an attribute of some groups and not others. Groups who may be deemed to be vulnerable include children, older people, or adults with learning disabilities. These categories are specified by UKRI, which oversees government-funded research in the UK. But if you look at this in more detail, it doesn’t stand up. Take children. Say a competent 14-year-old is a young carer for their single parent, who lives with severe and enduring mental health problems and drinks alcohol all day. Which of those two people might be better able to give informed consent to the child taking part in research? Conversely, people are not necessarily vulnerable because they are older. President Biden is 79 and I can’t imagine him being seen as vulnerable. Learning disabilities don’t necessarily make people vulnerable either, as some of my dyslexic friends would no doubt agree.

Vulnerability is not an attribute, it is a state we all move into and out of in different ways. The start of the Covid-19 pandemic made this abundantly clear. Quite suddenly we were all vulnerable to illness, perhaps death; to increased anxiety; to fear for loved ones who fell sick; to bereavement. Heads of state were no safer than ordinary people living in apartments or suburbs, and researchers were every bit as vulnerable as their participants. Perhaps one small positive side-effect of the pandemic is this: we can see more clearly that we are all vulnerable to changing circumstances resulting in trouble or trauma. Which does not mean we are all vulnerable all the time – but that any of us may be, or may become, vulnerable at any time. As researchers, I think it is essential for us to be aware of this, and ready to face and manage it when it occurs.

Vulnerability and sensitivity have something in common. Just as it is not possible to predict from group membership who is and is not vulnerable, so it is not possible to predict who will and will not be upset by a topic. Of course some topics are likely to be upsetting: female genital mutilation, suicide, sex work, and so on. And we need to put whatever precautions we can in place if we are investigating topics like these, that are evidently sensitive: to make the experience as safe as possible for our participants, and for ourselves. But we cannot be sure that everyone will find these topics equally sensitive; there are people who can take such topics in their stride.

Conversely, some people may be upset by apparently innocuous topics. Suppose a market researcher is investigating people’s perceptions of homewares. In one interview, the researcher asks their question about teapots, and realises their participant is struggling to hold back tears. The participant explains that the last gift ever given to them by their beloved mother, who died exactly one year ago, was a teapot. Perfectly plausible; impossible to foresee.

So, we can’t always predict everything everyone will be sensitive about, and we shouldn’t pretend we can. But, again, we need to equip ourselves with the mental and emotional intelligence and dexterity to be able to deal with the unexpected. Because if there is one thing we can predict, it is that at times we will face the unpredictable.

This blog, the monthly #CRMethodsChat on Twitter, and the videos on my YouTube channel, are funded by my beloved Patrons. Patrons receive exclusive content and various rewards, depending on their level of support, such as access to my special private Patreon-only blog posts, bi-monthly Q&A sessions on Zoom, free e-book downloads and signed copies of my books. Patrons can also suggest topics for my blogs and videos. If you want to support me by becoming a Patron click here. Whilst ongoing support would be fantastic you can make a one-time donation instead, through the PayPal button on this blog, if that works better for you. If you are not able to support me financially, please consider reviewing any of my books you have read – even a single-line review on Amazon or Goodreads is a huge help – or sharing a link to my work on social media. Thank you!

How To Find A Collaborator

The question to ask first is, when might you want to find a collaborator? Some work needs to be done alone, such as most doctoral research. Some work is sometimes best done alone, such as writing an opinion piece for a high-profile blog. But some work definitely needs to be done in collaboration. Most research benefits from collaboration. When I am commissioned to do a piece of research alone or with one other colleague, I always recommend that the commissioner set up a small group of relevant people to advise and steer the research project. And writing often benefits from collaboration too. In fact academic writing is always more or less collaborative: even if only one person is named as the author, the work will have been influenced by other scholars, colleagues, reviewers, editors – the list is long. And if more than one author is named, the work is likely to have benefited from the sustained engagement of more than one person.

Some work really needs collaborators. Three colleagues and I wrote Creative Research Methods in Education, and it was a better book, as a result, than it would have been if only two or three of us had worked on the project. I often receive requests to collaborate with others on research, or writing, or both. Sometimes they are from friends or colleagues, and I always consider those carefully. Narelle Lemon from Swinburne University in Melbourne, Australia, suggested we work together on the education book when we first met in person. Sometimes requests to collaborate come from people I don’t know. The reception those requests get varies, depending on what the person is proposing and how they put that across. If the email is from a free email provider such as Gmail, with lots of spelling mistakes, asking me to collaborate on research to help prove that hemlock cures cancer – and to contribute to the funding of that research – I will reach swiftly for the delete button. Conversely, if the email is from an organisational address, well written, and asking me to collaborate on work that is within one of my areas of expertise, I will respond – and if the enquirer mentions that they have a budget, I am likely to respond positively.

The best collaboration request I have had from a stranger came from Richard Phillips of Sheffield University. His initial message, in July 2018, simply said: “Dear Helen, I would like to explore the possibility of involving you in a workshop on creative writing and social research, and have a budget for this. It would be great to hear from you and discuss. Thanks, Richard.” Short, to the point, and very interesting indeed. I emailed straight back, and in his reply he told me he liked my book on creative research methods. Better and better! We spoke a couple of days later, met a couple of weeks after that, ran the workshop in November 2018, and our book on Creative Writing for Social Research was published in January 2021.

If you want to find a collaborator, the most important thing is to do your homework. If you want someone to co-write a journal article about the role of manicures in ex-convict rehabilitation, you need to find someone who shares that niche interest. And when you do find someone who seems suitable, make sure your potential collaborator likes to write; not everyone does. There should be no need to introduce yourself, because the person you are contacting should be able to find information about you online; if they can’t, they are much less likely to agree to collaborate with you.

Overall, people are more likely to agree to collaborate if you are their peer or above, the work you are proposing is within their areas of interest, and you have a budget. If you have nothing but passion for a project, it is still worth asking suitable people if they are willing to collaborate, but be prepared for rejection. Also, please be aware that offering to collaborate for free could put you at risk of being exploited. However much you care about an issue, it is equally important to take care of yourself.


Indigenous Research Methods: Another Reading List

I thought it was time to share more of the books from my shelves. As with my previous post on this topic, this post is a reflection of my personal collection, built from the recommendations of students, colleagues and people on social media, as well as my own explorations. The more I have read and worked with Indigenous scholarship, the more convinced I have become of the importance of including these perspectives in my own work wherever they are relevant. I am glad to be able to use my own power, such as it is, to amplify the voices of scholars who are much more marginalised than me.

Books on Indigenous research methods are very different from books on Euro-Western research methods. Books on Euro-Western research methods are akin to recipe books: combine these things, like this, and you will probably get that result, unless some contextual factor gets in the way. Books on Indigenous research methods don’t start with what to do and how to do it, they start with stories, and thinking, and sharing, and knowing, and learning. One key difference is that Indigenous research is designed to serve existing relationships, and if it is not likely to at least maintain and ideally strengthen those relationships, it is not deemed to be worth conducting. In the Euro-Western paradigm, we teach novice qualitative researchers to ‘create rapport’ with participants, to put them at ease – in effect, to make instrumental use of our friendship skills to obtain information from people we may not ever see again. Euro-Western researchers have begun to question how ethical this is. Indigenous researchers offer us some unmissable clues to the answer.

I am not, and I will never be, an expert on Indigenous research. Since my book on research ethics came out – with its subtitle of ‘Euro-Western and Indigenous Perspectives’ – I have received several invitations to speak about Indigenous research and to peer-review journal articles written by Indigenous scholars. I always refuse the first, and I only accept the second if the journal editor can assure me that the other reviewers will be Indigenous scholars (which, to date, no journal editor has been able to do). As a white English person I already have too much power in this post-colonial arena. I do not plan ever to use any of it to set myself above or take advantage of the Indigenous scholars who have taught me, and are teaching me, so much through their writings.

These books could be described as more theoretical than practical but, in the words of Kurt Lewin, the inventor of action research, ‘There is nothing as practical as a good theory.’ Lewin was a Jewish German psychologist who immigrated to the US as an adult in 1933, so he had experienced and understood oppression. He was also, perhaps as a result, much more interested in applied research which could make a positive difference to social problems than in research that might generate knowledge for its own sake. In the Indigenous research literature this distinction is not relevant, made or discussed, because knowledge is conceptualised as collectively owned, in contrast to the Euro-Western paradigm where knowledge is conceptualised as a form of individual property.

I could say a lot more about the similarities and differences I perceive, but I need to get to the books! The first is Talkin’ Up To The White Woman: Indigenous Women and Feminism by Aileen Moreton-Robinson, Professor of Indigenous Research at RMIT in Melbourne, Australia. This was recommended by various people on social media, and I didn’t get around to buying a copy until last year, but I’m not sorry because I got the 20th anniversary edition with a new preface. It is a book of relevance to every white woman and anyone who uses feminist theory. Although it was written over 20 years ago, it is still highly, urgently topical. The author explains how white women dominate the feminist agenda; invites us to notice and interrogate our white privilege; and suggests we need to figure out how to give up some of that privilege in the interests of greater equality – which, after all, is where feminism came in.

Syed Farid Alatas is Professor of Sociology at the National University of Singapore. His book Alternative Discourses in Asian Social Science: Responses to Eurocentrism points out how and why Euro-Western social science doesn’t fit with Asian realities. The book covers the whole of Asia and all of the social sciences, and – despite its title – argues that alternative discourses alone are not enough, particularly if they are created in the same mould as the Euro-Western social science discourses so prevalent in Asian universities. Alatas explains in forensic detail how Asian academies are still colonised by Western approaches and curricula. He calls for a ‘liberating discourse’ which will help to popularise Asian ideas and perspectives.

Antonia Darder is a Puerto Rican and American scholar, artist, poet and activist. She has edited a collection called Decolonizing Interpretive Research: A Subaltern Methodology for Social Change. The foreword, by Linda Tuhiwai Smith, notes that ‘dominant theories … have spectacularly failed to transform the lives of subaltern communities and have instead reinforced privilege and inequalities across all developed and developing countries’ (p xii). In her introduction, Darder points out that an insistence on empirical evidence is a colonialist approach and, in close alignment with Alatas, calls for a reversal of privilege to foreground Indigenous philosophies and approaches.

Applying Indigenous Research Methods: Storying with Peoples and Communities is edited by Indigenous American scholars Sweeney Windchief and Timothy San Pedro. The editors begin by acknowledging that there is more in the literature about what Indigenous research methods are, and why, than about how they can be applied. This book sets out to correct that imbalance – and says quite clearly on the back cover that it is designed for use and teaching across Indigenous studies and education. Any Euro-Western researcher who is looking for methodological novelty they can use in their own work will not find that here. What they will find instead are inspiring stories of how research can be when it is understood and conducted holistically in and for communities of people who share a system of values which have been developed and tested over millennia.

Indigenous Canadian scholars Deborah McGregor, Jean-Paul Restoule and Rochelle Johnston have edited Indigenous Research: Theories, Practices, and Relationships. This also focuses on how Indigenous research is conducted in practice and includes inspiring stories to demonstrate some ways this has been done.

Shawn Wilson, Andrea Breen and Lindsay DuPré have edited Research and Reconciliation: Unsettling Ways of Knowing through Indigenous Relationships. The editors are two Indigenous researchers and one white settler. They explain the troubled complexity of the concept of reconciliation, which means different things to different people and can be co-opted for colonialist purposes. The editors are overtly working towards twin purposes of creating intellectual discomfort in some arenas and, in others, creating and protecting spaces for researchers to work as authentically as possible. And, again, the contributions are inspiring stories – though sadly, unlike all the others, this book doesn’t have an index.

There are more links between the last three books than their presentation of stories. These books seem to speak to each other, the stories intertwining and sometimes disagreeing, going back and forth and around again but always making progress. Like a conversation. And they are all very readable, written with dialogue and storytelling, poetry and images.

Lastly I am going to mention again a book I covered in my previous post: Indigenous Research Methodologies by Professor Bagele Chilisa from the University of Botswana. I am mentioning this book again because the second edition is now out and well worth buying and reading, even if you already have the first edition.


Data Dreaming

Inspired by my last post on What is data?, a researcher – who needs to remain anonymous – has written this guest post for my blog.

As an interdisciplinary researcher working in arts/health/humanities contexts, I am interested in the language used to discuss data: terms such as ‘rich’ and ‘noisy’ refer to ‘evidence’ that is complex or messy. Data can take many forms as Helen Kara’s blog (and books) articulate, and can also carry different values. The power practices played out between qualitative and quantitative research paradigms are also evident in the history of arts-based practice research as a poor relation to written outputs. We are on a long journey towards recognition and understanding of arts paradigms in terms of audits, funding and, most importantly, knowledge.

Last Thursday (17 March 2022), academics in receipt of grants from UK research councils were busy submitting their annual outputs to Researchfish, a reporting system for the outputs of grant-funded projects. The research leads are required to complete online forms with details of all the material that has been produced in association with the grant. Reports are required while the research is ongoing and for five years after funding has ended. For arts-based researchers, this exercise can feel like putting a square peg in a round hole, due to the scientific bias of the reporting format and categories. Even the section on the impact narrative seems to offer limited opportunity to discuss how research can positively impact individuals; I found myself ticking the ‘other’ box rather too frequently after wrestling with the different categories offered on the form. I even wondered whether the timing of Helen Kara’s blog addressing the vexed issue of ‘What is data?’ had been deliberate, or a happy/unhappy accident, in view of the deadline that day for the Researchfish audit.

Fishing completed, I returned to my emails to find an urgent message about one of the funded projects I’d just reported on. This research grant was in its final year and involved a team of arts practitioners facilitating creative workshops to explore questions about adolescent identities and mental health. A query had been raised by the funder during an audit of expenditure and I was informed that a consumables cost had been removed as it was deemed ineligible due to not being ‘directly related to the research being carried out’. The items identified were tote bags and their contents: journals, badges, craft materials and sensory tools (fidget toys).

The justification we provided was that the items were being used to support the practical workshops in schools and were part of the data collection. Participants used the journals during the workshop, responding to prompts and tasks through writing or drawing (giving us insights into their thoughts, feelings, experiences through creative processes); hence these were an important source of data contributing to our analysis. The bags contained pens, badges (used for communication preferences as well as names), arts materials for making activities and what are known as ‘fidget or stim toys’ (for sensory play/stimming). These ensured participants had access to the same set of resources, which is important for parity and inclusion. The stim toys were particularly valuable and popular with our neurodivergent participants, enabling the researchers and teachers to understand more about the role of stimming for this population (regulating emotion, facilitating focus, supporting processing). This was also important to creating a sense of group identity as the stim tools were something the participants used to interact with each other as well as individually. One participant described the resource as ‘my little bag of heaven’. The impact narrative for this project referred to a headteacher describing it as ‘changing lives’ due to the impact on individuals and the school as a whole.

There is pleasure and joy through the learning co-produced in these rich interdisciplinary research environments; the activities can produce tacit knowledge and felt understanding, the ‘moments of being’ Virginia Woolf describes, in which we perceive a new reality working in the arts/science interface. However, the query about the research rationale for these materials (and their relevance to the data) reminds me of Virginia Woolf’s fishing analogy in her essay ‘Professions for Women’ and her description of a young girl writing in contexts where a dominant authority stifles the work of an/other:

The image that comes to my mind when I think of this girl is the image of a fisherman lying sunk in dreams on the verge of a deep lake with a rod held out over the water. She was letting her imagination sweep unchecked round every rock and cranny of the world that lies submerged in the depths of our unconscious being. Now came the experience, the experience that I believe to be far commoner with women writers than with men. The line raced through the girl’s fingers. Her imagination had rushed away. It had sought the pools, the depths, the dark places where the largest fish slumber. And then there was a smash. There was an explosion. There was foam and confusion. The imagination had dashed itself against something hard. The girl was roused from her dream. She was indeed in a state of the most acute and difficult distress. To speak without figure she had thought of something, something about the body, about the passions which it was unfitting for her as a woman to say.

Arts practices are embodied research approaches, requiring arts materials to ‘probe the dark places where the largest fish slumber’. I can only dream of a future heaven where this is no longer ‘unfitting’ for us as researchers to say, but instead is understood and valued as data.

What Is Data?

Last week, in the context of some work I’m doing for a client, I was trying to find something someone had written in answer to the question: what is data? I looked around online, and in my library of methods books, and I couldn’t find anything except some definitions.

The definitions included:

  • Factual information used as a basis for reasoning or calculation (Merriam-Webster)
  • Information, especially facts or numbers, collected to be used to help with making decisions (Cambridge English Dictionary)
  • Individual facts, statistics, or items of information, often numeric (Wikipedia)

Data is also, demonstrably, a word, and a character in Star Trek. So far, so inconclusive. Yet people talk and write about data all the time: in the media, in books and journals, in conversations and meetings. And they use it to refer to many other things than facts or numbers. Data may be anything from a piece of human tissue to the movement of the stars.

Euro-Western researchers conventionally speak and write of ‘collecting’ data. And indeed some data can be collected. If you want to research beach littering, you can go and collect all the litter from one or more beaches, and then use that litter as data for analysis. If you want to know what differences there may be in how print media describes people of different genders, you can collect relevant extracts from a bunch of articles and then use those extracts as data for analysis. So this is valid in some cases. However, if you plan to research lived experience by collecting data, you are effectively viewing people as repositories of data which can be transferred to researchers on request, and viewing researchers as people who possess no data themselves so need to take it from others. Clearly neither of these positions is accurate.

Some Euro-Western researchers speak and write of ‘constructing’ data. This refers to the generation of data as a creative act, such as through keeping a diary for a specified length of time, taking photographs during a walking interview, or making a collective collage in a focus group. Even conventional interview or focus group data can be viewed as being constructed by researcher and participant(s) together.

Autoethnographers and embodiment researchers privilege data from their own lived experience, though often they also use data collected from, or constructed with, others. But for these researchers, their own sensory experiences, thoughts, emotions, memories and desires are all potential data.

For Indigenous researchers, all of these and more can be used as data, which is often co-constructed by the researcher and all participants working together in a group. This is done in whatever way is appropriate for the researcher’s and participants’ culture. Māori research data is co-constructed through reflective, self-aware seminars. In the Mmogo method from southern Africa, objects with symbolic and socially constructed meanings are co-constructed from familiar cultural items such as clay, grass stalks, cloth and colourful buttons, during the research process, to serve as data (Chilisa 2020: 223–4, 243). Indigenous researchers in America, Canada and Australia use oral history, stories and artworks as data (Lambert 2014: 29–35).

All of this tells us that data is not purely facts and numbers, as the definitions would have us believe. Conversely, we could conclude from the examples above that pretty much anything can be data. This does not mean anything can be data for any research project. You’re not likely to find a cure for disease by collecting bus timetables, or identify the best way to plan a new town by making inukshuk. But bus timetables could be very useful for research into public transport systems, and making inukshuk could be integral to Indigenous research into the knowledge and belief systems of Arctic peoples.

Data can be documents or tattoos, poems or maps, artefacts or photographs – the list is very, very long. And of course a research project may use different kinds of data, which could be collected, or constructed, or some of each. The question we need to ask ourselves, at the start of any research project, is: what kind(s) of data are most likely to help us answer our research question, within its unique context including any constraints of budget and/or timescale? In the end, for some projects, the answer will be facts, or numbers, or both. But if we assume this from the start, we close off all sorts of potentially interesting and useful options.

This blog, the monthly #CRMethodsChat on Twitter, and the videos on my YouTube channel, are funded by my beloved Patrons. Patrons receive exclusive content and various rewards, depending on their level of support, such as access to my special private Patreon-only blog posts, bi-monthly Q&A sessions on Zoom, free e-book downloads and signed copies of my books. Patrons can also suggest topics for my blogs and videos. If you want to support me by becoming a Patron, click here. Whilst ongoing support would be fantastic, you can make a one-time donation instead, through the PayPal button on this blog, if that works better for you. If you are not able to support me financially, please consider reviewing any of my books you have read – even a single-line review on Amazon or Goodreads is a huge help – or sharing a link to my work on social media. Thank you!