Conventional research methods are good methods. Creative research methods, in themselves, are not better than conventional research methods. Sometimes all you need is to do some interviews and, if that’s the case, there’s not much point deciding to design an app and ask participants to use it to create multi-media data. But I have argued for many years that it is worth knowing about as many methods as you can, because that gives you a better chance of answering your research questions. Methods are tools, and the more tools we have in our toolboxes – within reason – the better equipped we are to do the work we need to do.
Several client meetings recently have gone like this:
Client: We need to use more research methods, not just surveys and interviews. Can you help?
Me: Yes indeed I can. I think methods X, Y and Z might suit you.
Client: But it will take us time to learn those methods and we don’t have any spare time.
Me: 🤦♀️
In these situations it is my job to find helpful arguments that will encourage my clients to find the time they need for the work they want to do. Here are four of the main arguments I use in this situation.
1. You will get better quality data.
In study after study after study using creative methods, the authors report being absolutely sure they have richer, more useful data than they would have been able to obtain using conventional methods. Of course there is a difficulty here that any researcher will recognise: no control group. Even so, the sheer number of times this appears in the literature, from sources independent of each other and with experience of using both conventional and creative methods, suggests that there is some truth in the assertion.
2. Funders and commissioners often appreciate a more creative approach these days.
A sensible and well thought through creative approach can help your work to stand out from the crowd. After all, there will be lots of other people who think they can’t find the time to learn about the creative methods that might help them to do their work more effectively. And this means that funders and commissioners will read lots of applications recommending surveys, interviews, and focus groups. If your application recommends collage, digital storytelling, and poetic analysis – OK there is no guarantee of success, but it should at least pique the readers’ interest and be more memorable than most.
3. After the initial set-up stage, some creative methods can save you time.
This applies particularly to creative methods that give participants a high level of control over creating data. These may be low tech, such as diaries, or high tech, such as apps. Getting participants to keep a diary is potentially a big win, with lots of data being generated with little or no researcher involvement. It’s a good idea to provide some structure, e.g. asking participants to answer three questions each week, or to record their reflections on a particular issue on one weekday and one weekend day – whatever works for your research project. And diaries may be written, or audio-recorded, or even drawn or stitched. Using apps in research can be expensive, especially if you need to commission a bespoke app, but can also have big potential advantages. For many participants, apps are user-friendly (though not for all, so you need to offer an analogue alternative too). And data generated using an app is immediately available to the researchers for analysis. So, for both of these methods and many others besides, there is a chunk of work to be done in setting up the method, but once that is done, they really can save you time in the long run.
4. Creative methods can be more ethical.
Please note I am definitely not saying creative methods are more ethical. But they can be, and where they are, this is an argument worth making. For example, some creative methods of gathering data can facilitate the involvement of participants in the initial phase of data analysis. Enhanced interviewing is one such method, where the interview can include questions about participants’ interpretations of the photos they have taken, or the artefact they have brought, or whatever is being used to enhance the interviews. Creative methods of presentation can be more engaging for audiences, and help them to understand more fully and remember better the messages you convey. There are plenty of other such examples of ways in which creative methods can support and augment researchers’ ethical work.
So those are the four main arguments I use. If you know of others, please share them in the comments.
When I wrote my doctoral thesis, nearly 20 years ago now, I wanted to write it creatively. I was already a professional writer and I could see the potential for creative approaches to help me communicate the points I needed to make. Also, I gathered data in the form of stories, so to me it made sense that my thesis should be made up of stories too. But my supervisors were resistant. After some discussions, they allowed me to write one chapter creatively, as long as I wrote the rest of my thesis in a conventional style.
The difference between then and now is that back in the mid-2000s, the literature on writing creatively in academia was very limited. Laurel Richardson’s seminal Fields of Play was available, but it was on its own at that time; there was no body of literature from which to build a rationale for using creative techniques in academic writing. And of course that was exactly what I needed to do to reassure my supervisors about the merits of my intended approach.
But now there is such a body of literature! In this post I share four particularly useful books, all published in the last couple of years. Also, they are all well referenced, so you can use them to find other literature, if you wish. Then you can create a cogent, evidence-based argument for using creative techniques in writing your doctoral dissertation or thesis.
I also want to recommend Fields of Play. Although it was written late in the last century, it is still highly relevant today. Laurel Richardson dismantles the rationale for the norms of conventional academic writing such as passive voice and authorial authority. Then she creates a new rationale for using fiction techniques, poetry, drama and other creative approaches in academic writing. And she practises what she preaches within the text, to excellent effect.
Reimagining Doctoral Writing (University Press of Colorado, April 2022) is edited by Cecile Badenhorst, Brittany Amell and James Burford. This edited collection is all about doctoral writing. Authors come from around the world, and they investigate doctoral writing from a range of perspectives and in a range of contexts. They also consider some potential futures of doctoral writing. This book is available as an open access ebook through the WAC Clearinghouse.
Doing Rebellious Research: In and Beyond the Academy (Brill, May 2022) is edited by Pam Burnard, Elizabeth Mackinlay, David Rousell and Tatjana Dragovic. This edited collection has four parts. The second part is called ‘Rebellious Writings Written Differently: A Manifesto’. It contains seven chapters and a set of reflective questions, and overall is designed to encourage and inspire a radical approach to academic communication.
Refining Your Academic Writing (Routledge, December 2022) is by Pat Thomson. This short book treats revision not as a boring mechanical process but as a creative, imaginative craft. It is part of the Insider Guides to Success in Academia series which, in the interests of full disclosure, I should point out is co-edited by Pat and me. But I am recommending this book here, not simply because it’s in our series, but because it is as useful and radical as the others in this post.
Creative Writing for Social Research (Policy Press, January 2021) is by Richard Phillips and me, with 14 tremendous contributors who put the principles set out in the book into practice. We have received excellent feedback on this book, such as: ‘The text is well written and engaging… I would recommend this book to all qualitative researchers.’ Thank you Ruthi Margulis for your heartwarming review in Research Matters (Dec 2021, p 13), the quarterly magazine for researchers published by the Social Research Association.
These books are in general a pleasure to read. They are well written and full of ideas, encouragement, and inspiration. And it’s not only the books – if you want more personalised support with your thesis writing, you can always come on one of my writing retreats (if there are still places available). Whatever resources you draw on, I wish you joy of your doctoral writing.
This blog and the videos on my YouTube channel are funded by my beloved Patrons. Patrons receive exclusive content and various rewards, depending on their level of support, such as access to my special private Patreon-only blog posts, bi-monthly Q&A sessions on Zoom, free e-book downloads and signed copies of my books. Patrons can also suggest topics for my blogs and videos. If you want to support me by becoming a Patron click here. Whilst ongoing support would be fantastic you can make a one-time donation instead, through the PayPal button on this blog, if that works better for you. If you are not able to support me financially, please consider reviewing any of my books you have read – even a single-line review on Amazon or Goodreads is a huge help – or sharing a link to my work on social media. Thank you!
I have an exciting new venture to share with you. For the last couple of years I have been working with Policy Press on a new series of short, affordable books on creative research methods in practice. And we have just gone public! The first book is on its way: Photovoice Reimagined by Nicole Brown. And there are several more books in the pipeline. Two are being written right now – one on fiction in research, and one on phenomenography – and four other book proposals are under review.
I wanted to edit this series because there are no such books available to help researchers learn in detail about why, when, and how to use a new research method. There are several books giving an overview of creative research methods, within or across academic disciplines; some sole-authored, some edited collections. These are useful texts but they do not generally offer enough depth of information to enable readers to try out the methods for themselves with confidence. The main rationale for this new series is to do just that.
One of the hardest things to sort out was the design for the covers and webpage. That took months and a lot of emails, discussions, and meetings (most of which I didn’t need to attend, thank goodness). We almost agreed on some covers and then the sales and marketing people at Policy Press said the designs weren’t good enough. They were absolutely right. So we went back to the actual drawing board and started again. I am so pleased with the final result. I think hot air balloons are a delightful combination of science and art, innovation and exploration and adventure – just like creative research methods. (Let’s not focus too closely on the ‘hot air’ part, OK?!) Also Policy Press likes to have a Bristol element to their designs, and Bristol holds an annual International Balloon Fiesta – Europe’s largest event of its kind – so the design works from that viewpoint too.
I am so happy to be able to tell you about this new book series. And if you would like to propose a book for the series, do get in touch!
A document can be defined as ‘written, graphical or pictorial matter, or a combination of these types of content, in order to transmit or store information or meaning’ (p 11).
So documents have a range of purposes, and can come in a wide variety of forms and formats: digital or hard copy; reports, letters, emails, social media posts, forms, meeting minutes, web pages, leaflets, shopping lists; and so on. Documents are rarely just containers of information; they are also tools for people to use in the world. Documents are used for purposes such as communication (letters, emails etc), or enforcement (legislation and legal judgements), or to make something happen (a child’s birthday present wish-list or an adult’s last will and testament).
Documents can be rich sources of data for research. They may be collected, from libraries, the internet, archives etc, or constructed, such as when a researcher asks participants to keep a diary of relevant events for a specific time period. Collected documents are secondary data, and using secondary data where possible is an ethical approach to research, because it reduces the burden of primary data collection for participants and for researchers.
There are many ways to analyse documentary data: thematic analysis, content analysis, discourse analysis, narrative analysis and metaphor analysis are just a few. And documents are being used as data for research across a wide range of disciplines and fields: psychology, ecology, education, health, technology, linguistics and many others too. Innovative work is being done with documents in research all around the world.
What does not yet exist is an edited collection of chapters to give a sense of the breadth and depth of possibilities offered to research by documents. So I am delighted that Aimee Grant has invited me to co-edit just such a book, which we intend to be a showcase for some of the excellent work being done with documents by researchers worldwide. We formulated our call for proposals last week; the deadline is midday BST on 24 April 2023. Please help us to spread the word!
We all have biases and prejudices that affect our lives in many ways, from the choices we make to our interactions with others. And of course our biases and prejudices can affect our research work too. We can never completely escape from our biases and prejudices, but there are a number of steps we can take to mitigate their impact. Here are ten of the most useful.
1. Get as much good quality information as you can.
The less information you have, the more space there is for biases and prejudices to operate. Ideally, seek information from reputable sources that is backed up by other reputable sources. Of course in some research areas, at the frontiers of knowledge, there is little to be found – but there will be foundational information to build pioneering research on, and again this needs to be demonstrably solid and trustworthy.
2. Use structures to help you think.
Structures, such as checklists, can bring rigour to your thinking. They should be predetermined and tested. One structure I use frequently is the eight criteria identified by Sarah Tracy for assessing the quality of qualitative research. These criteria were themselves developed from a systematic analysis of debates on quality in the qualitative research literature – exactly the kind of demonstrably solid foundational information I referred to in Tip 1 above.
3. Take steps to mitigate the effects of your emotions.
Our emotions are always with us and they inevitably affect our work. We need to be aware of our feelings so we can take the necessary steps to ensure they are not unduly influencing our decisions. Where emotional influence is unavoidable, we should be open about this in our reporting.
4. Seek the opinions of others.
Other people are often better at spotting our biases and prejudices than we are ourselves. It can be useful to talk through your work with someone you trust to give you an honest opinion. Ask for their views about where your biases and prejudices lie, and how they might be affecting your research.
5. Value scepticism.
Remember, if it looks too good to be true, it probably is. Of course it is possible to overdo scepticism: doubting the accuracy of every single thing is annoying for others and bad for your own mental health. But scepticism in the form of truly critical thinking can be a useful counterbalance to bias and prejudice.
6. Flip the viewpoint.
This involves conducting thought experiments and is particularly useful for debiasing during analytic work. If you think your data is pointing towards a conclusion that group X needs intervention Y, try imagining the opposite. What if group X didn’t need intervention Y? Or what if group X needed intervention M rather than intervention Y? This may sound fanciful, even pointless, yet I recommend that you give it a try. It can be a really useful way to shed light on your findings.
7. Consider accountability.
Who are you accountable to? What would they think of your work? It won’t just be one group of people, so think this through for each group: participants, participants’ families, participants’ community members, colleagues, superiors, maybe funders, your family, your friends… Try to see your work as each group would see it, and consider what that tells you.
8. Use mindfulness.
Bias and prejudice can creep in when you think and work fast. There are incentives in most people’s working lives to think and work fast, but deliberately slowing our thinking can be a very useful guard against bias and prejudice.
9. Practise reflexivity.
Reflexivity involves carefully and critically examining the influences on our work, such as our characters, institutions, identities and experiences. There is no set way to do this, except that it should not become an end in itself; it should serve our research work, or it risks becoming self-indulgent. Working reflexively involves asking ourselves questions such as: Why am I doing this research? What and whose purposes does it serve? Why do some aspects of my research work please or trouble me? And so on.
10. Read work by people who are not like you.
I cannot stress this enough. Learn about others’ views. Read work by people of different genders, ages, ethnicities, cultures, religions/beliefs, political persuasions. Find out how the world looks to them. And this loops us right back to Tip 1 above, because gathering more information about people who are not like us helps to dispel any biases and prejudices we hold about them.
Do you have any other tips for debiasing work? If so, please pop them in the comments.
I have been a peer reviewer of journal articles for the last eight years. I documented my first peer review, in late 2014, on this blog. Peer reviewing has never seemed easy to me – and I don’t think it should. Reviewing original work by other scholars is bound to be intellectually and emotionally demanding. But I feel as if peer reviewing has become more difficult, even over the comparatively short time I have been involved. There are several reasons for this, and I will focus on three of them here: hoaxes, malpractice and complexity.
Academic hoaxes pre-date my reviewing experience. In 2005, three US-based doctoral students in computer science – Jeremy Stribling, Max Krohn and Dan Aguayo – created SCIgen, a computer program that generates whole computer science journal articles, complete with graphs, figures and citations, which look credible but are in fact nonsensical. A lot of articles generated by SCIgen have been accepted by, and published in, academic journals, despite going through peer review.
And such hoaxes are not limited to computer science. In 2017–18, three scholars – James Lindsay, Helen Pluckrose and Peter Boghossian – wrote 20 fake articles using social science jargon. They were able to get several of these articles published in academic journals, even though some of them promoted morally questionable acts. The aim of these three scholars was apparently to highlight what they saw as poor quality work in some areas of the social sciences. However, I am not sure this intended end justifies the questionable means of duping reviewers and editors into publishing bogus research.
Sadly, though, it seems that academic journals are regularly duped into publishing bogus research by researchers themselves. Retraction Watch, based in the US, has been keeping track of retracted journal articles for the last 12 years. Some articles are retracted because their authors made honest mistakes. But the Retraction Watch database lists a lot of other reasons for retraction, including falsification or fabrication of data, and falsification, fabrication or manipulation of images or results. And the numbers are staggering. At the time of writing, there are over 1,500 articles listed on the database as retracted due to the falsification and/or fabrication of data, and over 1,000 due to the manipulation of images. Also, the database only includes those articles in which fabrication, falsification or manipulation have been detected and reported. By its own admission, Retraction Watch is biased towards the life sciences, so problematic journal articles in other sectors will be even less visible.
A bunch of people make it their business to find and publicise these problematic articles. One even does it under her own name: Elisabeth Bik. Others use pseudonyms such as Clare Francis, Smut Clyde, Cheshire, and TigerBB8.
Bik specialises in identifying manipulated images, and has found through empirical research that their prevalence is increasing. However, Bik has a particular talent for pattern recognition which most of us do not share. Of course it is useful to know that images may be manipulated, and Bik regularly shares examples on social media and elsewhere which can help others understand what to look for. But even so, spotting manipulated images can be difficult for the average, harassed, unpaid peer reviewer. And catching fabricated or falsified images, data or results may be almost impossible without inside information. The strict word limits imposed by most journals can also work against detection here: researchers are used to some aspects of their processes receiving a cursory mention at best, and this can enable cheating to pass undetected.
When reviewing goes wrong, consequences can be disastrous. The link is to a recent controversy about a published article promoting a morally questionable act. I am not using any of its keywords in this article. I think there are some particularly interesting aspects of this case. It is not the first article to be published that features morally questionable acts. I have read the article; it is well written, and I can see how a peer reviewer could regard it as worthy of publication – as its own peer reviewers did. The problem, for me, lay in the background of the author who promotes morally questionable acts outside of academia. He may have written this article in the hope that publication would lend legitimacy to his actions. Even if he did not, publication might be perceived to confer such legitimacy, which could cause reputational damage to the publisher and the university concerned.
So, the article you are reviewing may be a hoax, and/or may contain data, images, and/or results that have been manipulated, fabricated or falsified, in ways that are difficult or impossible to detect, and/or may have been written by someone with a dodgy agenda. But that’s not all. Academic work – and, indeed, the world around us – is becoming more complex. More research is transdisciplinary, pushes methodological boundaries, is multi-lingual, and so on. The process of peer review was devised when people worked in neat, tidy, single disciplines and fields. In that landscape people could act as experts on other people’s work in its entirety. These days that is not so easy. Topics such as sustainability, the climate crisis, and food security transcend disciplines and methods. This means that, increasingly, no single reviewer can be an expert on a whole article, so peer review as originally conceived no longer fits the task. Yet it is still being used.
This means we need not only peer review before publication, but also after publication. Luckily there is a tool for this: PubPeer, a website where you can comment on published journal articles, anonymously if you wish. This enables researchers with inside information to whistleblow without risking the loss of their jobs. Also, you can use PubPeer to check articles you are intending to cite, to make sure nobody has raised any concerns about the work you want to use. At the moment PubPeer focuses mostly on laboratory and clinical research, but there is also (not surprisingly) some computer science. In fact PubPeer can be used for any published journal article as long as the article has a recognisable ID such as a DOI. Also, there is a PubPeer browser plugin which enables PubPeer comments to be visible on other websites besides PubPeer itself.
Trainee qualitative researchers, learning the most popular research method of interviewing, are routinely taught to use their interpersonal skills to create rapport with participants. For the last 20 years Jean Duncombe and Julie Jessop have questioned this practice, asking how ethical it is for researchers to fake friendship as a means to the end of gathering data.
On the one hand, we all use our interpersonal skills to help us get what we want from others in our day-to-day lives. This applies whether we want a loan from a credit agency, a prescription from the doctor, or a response to a complaint – in a multitude of situations, presenting our most polite and friendly selves can help to get the results we want. So it is arguable that it makes sense to use these everyday methods in research too.
On the other hand, research encounters are rather different from everyday encounters. This applies particularly to qualitative research where a researcher may spend a considerable period of time giving a participant their undivided attention. This is an unusual and often a welcome experience for participants, who often describe it in positive terms such as ‘therapeutic’, ‘cathartic’ or ‘a treat’.
Many of the people we want things from in day-to-day life are either providing us with goods and services, so that a transactional element is built into the encounter, or are already in a personal relationship with us through kinship, friendship or community membership. So the rapport we build in those situations already has a clear basis which is mutually understood. This does not apply within the research encounter, where we are usually asking participants to give us their time and information in exchange for a potential benefit to an imagined future population. (I considered the extent to which this is ethical in my recent post on the LSE Impact Blog.) Also, despite all the efforts to secure informed consent, we know that people generally agree to participate in research for their own reasons rather than ours. And where that reason is to get a little human company and kindness, which is lacking from their own lives, the practice of building rapport begins to appear even more suspect.
Imagine you are, let us say, living on minimal welfare benefits with a chronic condition which makes it difficult for you to leave the house. You have lost touch with the friends you used to have when you could go out to work, and your family live far away. You suffer from anxiety and you are very lonely. The carers who come in three times a day are brisk and professional; they don’t have time to chat, and you don’t want to hold them up because you know they are always under pressure. Then a researcher calls, saying she is doing an evaluation of the care you receive, and asking if she can visit you to ask a few questions. You are delighted because it’s been years since you had a visitor and she sounds so kind and friendly on the phone. When she visits, you tell her all sorts of things about yourself and your life. She seems really interested, and laughs at your jokes, and tells you a few things about her own life in return. You haven’t felt this good in years. When she has asked all her questions, you ask one of your own: please will she visit you again? She looks at the floor and says she would like to, but she can’t promise, because between work and her children she doesn’t have much free time. You would like to suggest she brings her children with her, but you know a ‘no’ when you hear one, so you let her go, wait for the front door to close, and listen to the emptiness of your home and your life.
Duncombe and Jessop point out that these problems are multiplied in longitudinal research, where the boundaries between real and faked friendship can become much more blurred. They share experiences of participants beginning to treat them as friends, and the discomfort that arises when they don’t reciprocate. I have had similar experiences, and I’m sure many other qualitative and mixed-methods researchers have too. It is interesting to consider this Euro-Western approach in the light of the very different Indigenous approach, in which research is deemed to be ethical when it serves to maintain and develop existing relationships. Looked at in this way, our Euro-Western approach of creating and then dropping relationships to further our research purposes seems potentially abusive.
The EU-funded TRUST project developed a Global Code of Conduct for Research in Resource-Poor Settings. It was based on four values elicited from research they did with a wide variety of people around the world: respect, fairness, honesty and care. The aim was to combat ‘ethics dumping’, where research deemed unethical in a higher-income country is conducted, instead, in a lower-income country where research is not governed by a regulatory system. I would argue that these values should also apply where research is done by a researcher with more social capital than some or all of their participants. In the vignette above, the researcher was not entirely honest and did not show care in response to the participant’s request, e.g. by signposting them to a local befriending service. This could be described as ‘friendship dumping’.
When you think about it, using interpersonal skills to create rapport with a participant purely as a means to an end is actually quite manipulative. This might be more defensible when we are ‘studying sideways’ or ‘studying up’, but even then it seems questionable. Showing respect for participants would be a more creditable aim, especially if it was combined with fairness, honesty and care.
The next post on this blog will be in September. You can follow the blog, above, to get my posts in your inbox.
At times I have been hired for my ‘lived experience’, either as a carer for people with mental health problems or as a disabled person myself. I have also worked in research teams with people who have other kinds of ‘lived experience’, such as parenting children under five or living with addiction. I am not particularly keen on the phrase ‘lived experience’, because as far as I can tell all human experience is lived experience. I prefer ‘experts by experience’.
However, I also think the concept is flawed. Being an expert by experience is not like being an expert in domestic plumbing, or millinery or research ethics. For a start, the categories provided for experts by experience are incredibly broad. ‘Disability’ is a huge category. I am Autistic and I live with fibromyalgia and asthma. That qualifies me as an expert by experience – but I am no expert in the experiences of Deaf people, or stroke survivors, or people with Tourette’s syndrome, or many, many others. ‘Addiction’ is another huge category, covering street and pharmaceutical drugs, alcohol, shopping, sex and so on. Someone who is addicted to alcohol will not be an expert in the experiences of someone who is addicted to heroin or gambling. I could give you equivalent examples for mental health carers, the parents of young children, and any other category of ‘expert by experience’ you care to name.
Also, I often observe – and have experienced – experts by experience being required to subordinate their experience-based expertise to expertise conferred in other ways, such as through education or employment, and/or to organisational constraints. I have heard of situations where research ethics committees discounted expertise based on experience (which was no fun at all for the researchers concerned). And I have other forms of expertise myself, developed through education and employment; my experience shows that these are valued more highly than my expertise by experience. I earn more with them, for one thing. This all leads me to conclude that expertise by experience is treated as being worth less than other forms of expertise.
I should also acknowledge that I have witnessed several situations where third sector organisations passed over a capable and qualified candidate to recruit an employee with lived experience. This might look like organisations valuing expertise from lived experience more highly than other forms of expertise, but in each case the story did not end well. Recruitment is one thing, retention is quite another. Recruiting someone who is not able to do the job, and then not providing the adaptations and support they need to become able to do the job, is a costly form of box-ticking. And I don’t mean only financial costs; failed employment leads to enormous emotional and mental health costs too.
Another thing I have observed – and not only post-recruitment – is much less support and development being available for experts by experience than for other kinds of experts. I have mentioned payment, which may be in the form of a voucher, or travel expenses and a sandwich lunch; once in a while a reasonable amount of actual money. Sometimes there is a helpful booklet or a little bit of training. I have never seen any sign of experts by experience being permitted, let alone encouraged, to develop other forms of expertise.
This is just one example of the ‘us and them’ aspect of experts by experience. In the early 2000s I did a lot of work with Sure Start, a New Labour initiative involving partnership working in areas of deprivation to provide multi-agency one-stop-shop support for parents and children under the age of five. My role was to support partnerships in their early stages so I spent a lot of time sitting around tables with groups including nursery educators, midwives, health visitors, Home-Start managers, and other such professionals. They would talk about ‘the parents’, meaning the people who would be using the services once they were set up. It felt very much as though they were othering their potential service users. I would ask, ‘How many of the people round this table are parents?’ Inevitably some were; often most. Then I would facilitate a discussion about how the lived experience of the parent-professionals could inform the work of the partnership. This made some of the professionals uncomfortable at times. I’m not sorry.
As a researcher, part of my job is to separate and categorise information to help me find useful links and patterns. But this separation and categorising work is temporary, for the purpose of discovery. Separating and categorising people is inevitable, at least for people using English because of how the language works – but this always carries the potential for othering. In my lived experience, experts by experience are often on the receiving end. It is not a pleasant place to be, when you are allowed to be involved so far and no further, when others always have the final say.
Everyone is an expert on something, whether that is cleaning a house or conducting an orchestra, plastering a wall or piloting a battleship. I wish experience-based expertise was valued as highly as education-based or employment-based expertise. I think it has every bit as much value and I hope, one day, this will be fully recognised.
A while ago I turned down some potentially lucrative work on ethical grounds. I was approached by a global company I will call SubSidTech because it is a wholly-owned subsidiary company of one of the Big Five (Alphabet, Amazon, Apple, Meta and Microsoft). SubSidTech wanted help with creative research methods, and I was tempted, because I could have charged them a high fee and they might well have flown me to interesting places. But we didn’t get that far.
I turned down the work because I know that SubSidTech’s parent company works in some ways I consider to be unethical. I explained this to SubSidTech, politely; they sent a cordial email back thanking me for my candour and assuring me that they respected my views. I would have been very glad of the money. But I know turning the work down was, for me, the right thing to do.
It got me thinking, though, about the costs of acting ethically. Let’s start with consumption. I try to shop as ethically as I can: wherever possible I buy from companies with good policies and practices; I try to buy fairly traded and environmentally friendly products; I do what I can to avoid perpetuating cruelty to humans or animals. But living this way is often more expensive. For example, my phone is a Fairphone 3. The people who make this phone are paid a living wage, it is partly made from recycled plastic, and I can repair it myself with component parts available online at reasonable prices. It doesn’t have the built-in obsolescence of many mobile phones. But it was not cheap.
Sometimes being ethical can save money. I often buy second-hand clothes from online marketplaces or charity shops. But usually there is a premium to be paid for ethical consumption. And with costs rising as steeply as they are at present, I find myself rethinking a lot of my previously automatic choices. I love organic butter. It tastes like the butter of my country childhood, it’s not full of hormones, and it’s good for the planet – but its price has risen by 17% in recent weeks, and non-organic butter is cheaper. I don’t know how long I can maintain my ethical shopping preferences because, although I am not on a low income, my income is not rising. (It does go up and down a bit, but the average profit from my business over the last five years has been £24,964 per year; I can pay myself most of that.) And people who live on low incomes or welfare benefits have much more limited options for shopping ethically. The impact of the global financial squeeze on ethical consumption practices is already being recognised.
There is also a cost to doing research ethically. Taking the time to do proper participatory or other inequality-tackling research; paying or otherwise recompensing participants; providing suitable aftercare – these all cost more money, time and commitment than funders are used to funding or researchers are used to providing. Completing an ethics application form has a sizeable time cost, though some of the work done will save time later on. But there is still a time overhead, unless you are the kind of researcher who, having received their formal ethical approval, declares that they have ‘done ethics’ and will now get on with their research. And if you’re not that kind of researcher, if you aim to think and act ethically throughout your research work, then that also comes with a time cost and in some cases a financial cost too.
Because of the costs of acting ethically, we end up having to make compromises. Due to the rising cost of living I am consuming less of the ethically produced goods I like to eat and wear and use. My current choice is to consume less, rather than to buy unethically produced goods; this is a mark of privilege, and may have to change again in time. Perhaps there will also come a point where I cannot choose to turn down work from companies whose practices I regard as unethical. I hope not – but I know that, as for most people, if I need the money badly enough I will take any work I can get. But when it comes to research ethics, I plan to stand my ground. This is easier because someone else is paying the bill, most of the people I work for and with understand the purpose and value of research ethics, and often I can influence the ethical aspects of the research I conduct or support. That doesn’t mean research ethics is compromise-free – there are often compromises to be made where ethics is concerned. But I am happy to work in a profession where ethics, albeit expensive, is taken as seriously as I take it in my personal life.
Note: This post was first published on the SRA blog in November 2021 and is reproduced here with the kind permission of the author and SRA.
In this blog post, Kimberley Neve, a researcher at the Centre for Food Policy at City, University of London, outlines different methods for capturing ‘lived experience’. Lived experience means the actual, specific ways in which people experience something – in this case food: access to food, food poverty, food quality, food allergies and many others. Kimberley and other researchers at the Centre for Food Policy specialising in qualitative methods have produced a Brief giving an overview of the range of methods you can use when researching people’s lived experience of ‘food environments’. Food environments are the space in which we make all our decisions about food – what to eat, where to buy it, when and with whom to eat it.
Using qualitative methods to influence policy
As researchers we want our work to have impact. We also want to know that it resonates with people and reflects not only the experiences of the research participants, but also of the general population in some way. For our research to have a positive impact, effective communication with policy-makers, both locally and nationally, is vital. Despite the potential of qualitative methods to inform policy that is effective and equitable for the people it is designed to help, the number of qualitative studies used as evidence for policy remains modest compared to quantitative studies.
We wanted to raise the profile of qualitative research methods among both policy-makers and food environment researchers by demonstrating the range of potential methods and their benefits (and drawbacks), with a focus on how using them can help inform policy. These methods can be utilised in a wide range of research areas – for example local transport, access to outdoor space or crime in local areas – providing in-depth insights into people’s lived experiences and practices that can explain how or why people act the way they do.
In our Centre for Food Policy Research Brief (the ‘Brief’) we initially mapped existing studies capturing the lived experience of food environments, categorising methods and relevant case studies. Following this, we consulted with members of our Community of Practice – experts in qualitative research and food environments – for feedback prior to final edits.
What are the qualitative methods you can use?
The Brief is not an exhaustive list of the qualitative methods available; however, we’ve tried to capture the main methods you can use. For the scope of the Brief, we didn’t include quantitative methods but of course recognise their vital role.
Often, combining quantitative and qualitative methods can yield the most valuable insights.
To make the overview as useful as possible, we categorised the methods in the following way:
Group 1 – Exploring experiences, perceptions, beliefs, practices and social networks;
Group 2 – Observing practices in situ;
Group 3 – Designing policy and interventions drawing on the lived experience of participants.
Which method should you use for your research?
Typically, you will benefit from combining methods to suit your research context. For example, visual methods and observation tend to be accompanied by individual or group interviews to provide a more in-depth exploration. In the full Brief you’ll find an overview of qualitative methods with the key benefits and potential limitations of each. Assuming you know all about individual interviews and focus group discussions already, here is a selection of other methods less frequently used in research projects.
Group 1: Visual methods
This includes photo elicitation, creative arts (where participants create artwork such as drawings, videos or theatre), concept mapping (pile sorting, ranking, mental mapping) and timelines. One study in the US used photo elicitation in urban neighbourhoods to identify community-level actions to improve urban environments in relation to health. The study allowed the researchers to identify that not all food outlets affected health in the same way, and that contextual factors such as crime and safety influenced how people accessed food, which had implications for community-level policy.
PROS – Group 1 methods work particularly well with young participants or where there are language barriers, as views can be expressed more directly and simply. Participants may also be more willing to share information visually and images can provide insights that may not have been accessible via specific questioning.
CONS – Visual data can be difficult to interpret in a way that fully represents the participant perspective, and there is a potential for photographs to be seen as reflections of reality, rather than subjective perceptions that provide insights into reality. Participants could also misunderstand the objective and take photos that do not help to answer the research question.
Group 1: Geospatial methods
Geospatial methods often combine mapping with photography and/or GPS to create visual data that can then be discussed in one-to-one interviews or focus group discussions for more insights. Methods include spatial mapping, geonarratives and geotagged photography. These methods are relatively new to the food environment literature; however, they have been used very effectively to explore how people engage with their environment in general, for example in their green space encounters.
PROS – Similar to visual methods, geospatial methods can work well to engage participants in a way that is more creative and encourage them to share information more openly. They also allow for participants to share their knowledge as experts of their own food environments. These methods provide insightful data into the connections between space and place, particularly if combined with interviews or focus groups.
CONS – Geotagging requires specific technology that may be expensive and difficult to operate. There are also ethical considerations with mapping someone’s location – when and how this data is collected, stored and used are important factors to specify during the research design.
Group 2: Observation
This involves observing participant behaviour with methods such as go-along tours, transect walks and community group observation. Unlike with non-participant observation (below), the researcher talks to the participants during the activity about what their actions and interactions mean to them. For instance, during a go-along tour in a supermarket (shop-along), the researcher might ask for the thought process behind the decision to purchase a product. Transect walks are go-along tours with the addition of creating a map of the local food environment resources, constraints and opportunities.
In a UK study, go-along interviews were used to explore which changes to supermarket environments would support healthier food practices. A key insight from this research was that varied individual responses to the supermarket environment in low-income neighbourhoods are mediated by differing levels of individual agency. Interventions should include an emphasis on factors that increase agency in order to change how people buy food.
PROS – Insights into the practical aspects of daily life and routines can be captured interactively with the participant and explored in more detail with further questioning. Power imbalances in research are addressed as participants take more control of the research process.
CONS – The researcher’s presence may impact how participants behave or move around spaces, for instance by influencing what they buy in a shop-along tour. It is also quite time-intensive to organise and participate in.
Group 2: Non-participant observation
This is where participants are watched from a distance, for instance by video, with little or no interaction with the researcher. This method was used as part of a focused ethnographic study in Kenya along with interviews and cognitive mapping. The aim of the study was to inform policies for improving infant and young children’s nutrition practices. Among other insights, a key finding for policy was that future interventions must consider various aspects of food insecurity to improve conditions in practice.
PROS – You can get insights into ‘real’ individual actions, such as shopping or eating practices, without the researcher’s presence influencing the actions. Features of everyday life that may otherwise not be mentioned can be recorded and explored with further questioning. The researcher can also complete a log to provide contextual insights that can explain practices from a more objective viewpoint.
CONS – Observation alone, without a follow-up interview or discussion, means the researcher is unable to dig into the reasons underpinning the actions, so the interpretation of the situation can be subjective.
Group 3: Photovoice, co-design, co-creation, systems mapping, group model building
The third group of methods were particularly difficult to classify, as terminology and meanings often overlapped (for instance with co-creation and co-design). These methods place the participant at the centre of the research process and actively engage communities affected by policy decisions (at a neighbourhood, city, county, country level) in the research process. Participants are encouraged to draw on their own experiences, expertise and knowledge of their food environments to think about and propose change, so that policies resulting from the research are relevant and context-specific, and as a result have the potential to be more sustainable.
An example of effective group model building can be seen in a study in the US, where community-based workshops took place with a diverse group of chain and local food outlet owners, residents, neighbourhood organisations, and city agencies. Action ideas were discussed for interventions to promote healthy food access, including funding new stores that stock healthy food options and building the capacity for sourcing local produce in stores.
PROS – For all of the methods in Group 3, the ‘hands-on’ nature of research enables participants to generate information and share knowledge on their own terms. Outputs, such as policy recommendations, are created together with the participants to be effective in their local context following an in-depth research process.
CONS – These methods all run the risk of being perceived as tokenistic by participants if engagement is not meaningful and genuine.
In brief
Decisions about which methods to select to study lived experience depend on the purpose of the study (i.e. guided by a specific research question), the local context, the time and resources available, and the benefits and limitations of each method. Recently, the COVID-19 pandemic has accelerated the possibilities of using digital tools and technology as key facilitators for remote research.
As researchers, we not only need to engage participants and design research projects that will yield useful insights; we also have to translate our findings so that these insights can inform the design of effective and equitable policy. By using a range of methods, a more comprehensive and detailed overview can be communicated. Visual materials and stories are particularly effective ways for qualitative researchers to communicate their findings to policy-makers and make a refreshing addition to the more common interviews and focus groups.
Kimberley Neve is a Researcher at the Centre for Food Policy, City, University of London. She works as part of the Obesity Policy Research Unit, investigating people’s lived experiences of food environments to inform policy in areas such as infant feeding and weight management. Kimberley is a Registered Associate Nutritionist with a Masters in Global Public Health Nutrition.