I recently had the opportunity to take part in an asynchronous online focus group. So, I did; not least because I was curious to know what it would be like. I found it a rather odd experience. I had a few problems with the tech to start with, which was a bit annoying but is not unusual. I managed to get it sorted in the end – my pop-up blockers were to blame – but I did come close to abandoning the whole exercise through frustration at having to email support people rather than doing what I needed to do in the group. I’m not a techie, but I understand that it can be difficult to create a platform which works seamlessly on any type of hardware – laptop, tablet, mobile etc – and in any browser. So, tolerance may be required for participating in research online.
Once I got into the online environment, I found a series of intriguing questions to work through. Others had already responded to some of the questions so I could take their responses into account. (I don’t think I was supposed to be able to see them until I had answered each question myself, but I could see them, so I read them before formulating my own answers.) Even so, it didn’t feel at all like a group. I have facilitated many in-person focus groups and the interactions between group members are definitely a big part of the process; so much so that some researchers have chosen to analyse these as well as the transcript. Maybe if there had been more responses and exchanges it might have felt more like a group discussion, but I think it would still have felt like quite a solitary, albeit interesting, endeavour.
I think part of why it didn’t feel much like a group was the amount of reading and viewing required. The focus group didn’t only have questions to answer, but also text, videos, and diagrams to digest in between each question. Also, there were points where to give a full answer, I would have needed to stop and read a couple of journal articles and/or book chapters, and/or take a walk to think about the issue. But I didn’t because of time.
This focus group had 10 discussion topics, most of which included at least a dozen questions. In theory, we could choose a topic to focus on, but in practice, I found I had no option but to work through all of the questions from the start (though it is entirely possible that this was due to my technological incompetence). As a result, I spent more time feeling a sense of urgency to get through all the many questions than happily engaging with the interesting material presented. It took me almost three hours to work through the questions at speed. I skip-read some of the text and skipped almost all the videos. I started to watch one in an area I was particularly interested in but then saw that it was 17 minutes long and decided I couldn’t allocate that much time. I tried to start another but the software asked for access to my camera and microphone so I said no because of the security risks. If I had engaged with everything as thoroughly as the researcher no doubt wanted me to – and as I would have liked to myself, if my time was unlimited – I think it would have taken me at least a full day to work through all the materials and answer all the questions. And when I did eventually get to the end, it was just the end. After all that work I would have liked a ‘thank you’ message at the very least, and ideally a big burst of fireworks on the screen! Though I expect the researcher didn’t do that because they were only expecting participants to focus on one or two topics.
The group was online for a couple of months and the researcher included various messages encouraging members to come back and respond to others’ input. I can see why this would be useful for the research, but I couldn’t see much – if any – evidence of people doing that, even though my own contributions were made closer to the end than the start of the operational period. Also, I didn’t go back and add further responses myself. I felt as if I should, but I didn’t get around to it. There was no option to receive email alerts when a new answer was posted, which might have helped, though everyone’s inboxes are overstuffed so if that option had been available I might well not have taken it up, or taken it up and then deleted the emails without reading them.
I could see that the researcher had worked hard to try to provide a good online environment in which their expert participants could engage with specialised material. Even so, alternative approaches could have reduced the burden: asking fewer questions, or separating the sections into different “focus groups” in different online spaces and asking people to participate in one or more of those groups in accordance with their interests, preferences, and capacities. Also, I think for participation which is so complex and time-consuming, there should really be an incentive, though I know not everyone has a budget for such things.
Although I found it quite onerous, participation was useful because it provided some insight into the potential impacts of this method on a participant. That gave me some ideas about what to do and not do if I ever want to use asynchronous online focus groups myself, or if I am mentoring someone who wants to use this method. It was also useful because the researcher who set it up was doing their best to conduct a complex and important piece of research which is likely to end up helping a lot of people. Although aspects of the experience were frustrating at times, my interest in methods makes those aspects interesting to me too, in retrospect. So overall I think it was time well spent.
The first book in the series, Photovoice Reimagined by Nicole Brown, is published today. There are three others scheduled for publication this year. Fiction and Research, by Becky Tipper and Leah Gilman, will be published in July; Doing Phenomenography, by Amanda Taylor-Beswick and Eva Hornung, will follow in September; and Encountering The World with i-Docs, by Ella Harris, will be available in December. Three more are currently in the writing phase, two proposals are out for review, and I am in discussions with eight or nine other authors or teams of authors about possible future publications. Potential topics under development include enhanced interviewing, poetic inquiry and decolonisation, sandboxing, using comics in research, creative sonic research methods, zines in the research encounter, mapping, journey mapping, inclusive creative fieldwork, creative evaluation, visual scribing, urban exploration, visual methods in practice and emoji coding.
I decided to edit this series because I knew there were not enough publication opportunities for people writing about creative research methods. That meant students and researchers wanting to learn more about these kinds of methods were struggling to find relevant information. The books in the series are short, practical how-to books, designed to help researchers learn enough to try out the methods for themselves.
This kind of initiative also helps to establish the legitimacy of creative research methods. Now, in the first half of the 21st century, creative research methods are following a similar trajectory to that of qualitative methods in the second half of the 20th century. It may surprise you to know that economists began adopting qualitative methods as early as the 1960s. After much debate, psychologists began using qualitative methods in the 1980s and engineers joined in in the 2010s. Other disciplines also expanded their methodological repertoires and, as a result, academic journals publishing qualitative research were set up for areas of study formerly thought of as quantitative. For example, the journal Qualitative Health Research was founded in 1991, though Qualitative Psychology was not set up until 2013.
At present, creative research methods are perhaps most firmly established in the discipline of education, I suspect because it is such a creative profession. But I am seeing creative methods being used and promoted in a very wide range of disciplines, such as facilities management, health and the politics of fashion. This is reflected in the doctoral students I teach on courses for the National Centre for Research Methods, doctoral training partnerships and universities. Students come to learn about creative methods from arts and humanities and social sciences disciplines. So far, so unsurprising. But I also get engineers, physicists, business students, computer scientists – all sorts in fact.
In the Euro-Western world we think of creative research methods as new. However, the work of Indigenous methods experts such as Bagele Chilisa from Botswana, Margaret Kovach from Canada and Linda Tuhiwai Smith from New Zealand shows us that creative methods are in fact very old indeed – tens of thousands of years old, in some cases, so very much older than the ‘scientific method’ which has only dominated research in the Euro-Western world for the last few centuries. ‘Older’ does not necessarily equal ‘better’, but in this case I think it does. The scientific method has its place but is not the be-all and end-all of research. Creative methods are more likely to treat people holistically, take context into account and produce rich data and analyses. The scientific method assumes a level of universal consistency and uniformity, while creative methods make space for individual particularities.
Creative Research Methods in Practice is a small but tangible step on our journey away from the dominance of positivism and post-positivism. These stances emphasise objectivity, which is unachievable, and usually consider experiments to be the ideal form of research. Again, there is a place for experimental methods, but there is also a role in research for all sorts of creative methods, from participatory approaches to autoethnography, board games to computer games, apps to zines. And these are the kinds of methods I aim to showcase in the series. If you would like to write a book for this series, please do get in touch.
The Bloomsbury Handbook of Creative Research Methods, published last month, is at present only available in hardback at a recommended retail price of £140, or as an ebook at £126. Regular readers will know that I have ranted on this blog before about the iniquitous prices charged by some academic publishers, and advocated working with not-for-profit university presses. So, it is reasonable to ask me, as some people have: why did I agree to edit this expensive book for Bloomsbury?
The backstory is this: Maria Brauzzi, an editor at Bloomsbury who I did not know, emailed me in late 2021 to invite me to edit a Handbook of Creative Research Methods for them. At the time I had started work on editing a creative data analysis book for Policy Press with Dawn Mannay and Ali Roy, and chapter proposals were landing in my inbox. We received over 60 proposals, most of which were good. We had originally intended to produce a normal-sized book with around 12 chapters, but with so many good proposals to choose from, Policy Press agreed to produce a Handbook of Creative Data Analysis with around 30 chapters. (I’m delighted to say that is now in production and will be published in early September.)
Even so, selecting the chapters to include in the Policy Press Handbook was tough. Then I had a brainwave! I hadn’t replied to Maria at Bloomsbury because I couldn’t decide whether to accept her invitation. So, I emailed back and told her I had too many good proposals to fit into the Handbook I was doing with Policy Press, and asked whether I could pivot some of them into the Handbook she wanted to commission for Bloomsbury. She said ‘yes!’ so I ended up being sole editor of one Handbook and lead editor of another at the same time.
I do not recommend this course of action unless you have, as I had then (and I’m glad to say, have again now), a solid, competent, and reliable support worker or other assistant. I could not have edited this Handbook without my support worker’s help. But editing it meant I was able to offer publishing opportunities to people who deserved them, including some people from marginalised groups. I’m glad I could do that, even though it meant working for a publisher who screws royalties down to the bone, lower than any of my other publishers, while earning a massive profit by selling books at prices that most people can’t afford.
So, to redress the balance a tiny little bit, I am offering a free copy of the Handbook to one of my blog followers. If you’re not a follower yet, you should be able to see a ‘Follow Blog Via Email’ notice with space to enter your email address. Any blog follower who wants a chance of a free copy needs to comment below and check back here a week after this blog has been posted to see who has won. My support worker will put all the names in a hat and pick one at random, then add a comment stating who will receive the free copy. I will post a book to that person, wherever they are in the world.
Congratulations to Lucia 🎉 our winner of the prize draw for a free copy of The Bloomsbury Handbook of Creative Research Methods!
I am delighted to announce the publication of this book which I have edited. I am sorry it is very expensive – I hope Bloomsbury will produce a paperback in due course, and in the meantime, you should be able to get hold of a copy if you have access to an academic library.
The book has 22 chapters in nine sections with 2-4 chapters per section. The first section is an overview with chapters on creative research methods and ethics, creative research methods in the geo-political south, digital tools for creative data analysis, and human geography and creative methods. The other sections are on narrative inquiry, poetic analysis, visual methods, creating visual art, participatory textiles, embodied performative methods, participants as experts, and creative collaboration. I chose these divisions. The content of the book is so rich that there are many other ways I could have divided the chapters. For example, I could have had a section on digital methods, or one on multi-modal methods, or one on feminist research. I made the choices I did with two key aims: first, to make the book flow as well as possible from start to finish, and second, to highlight some of the key points that were coming through in the chapters. Of course this book is in no sense exhaustive, but it does provide some useful insight into the scope and range of creative methods in the 2020s.
The authors come from Australia, Canada, Belgium, India, Ireland, Nepal, the UK and the US, and include doctoral students, independent researchers, practice-based researchers and senior professors. Each chapter is excellent, important, and potentially useful for researchers. They all tell previously untold stories. Perhaps because of my interest in research ethics, Caroline Aldridge’s chapter seems particularly important to me. It highlights some of the barriers that can still face researchers wanting to use creative methods. Caroline is a former social worker and a bereaved mother whose son died as a result of mental illness. She wanted to investigate how other similarly bereaved parents experienced professional and organisational responses and investigations following their child’s death. Caroline worked with potential participants, via a private Facebook group, to co-create a research design which used participatory textiles. These would include a mixed-media quilt co-created with participants, plus researcher-created mixed-media visual vignettes. Both are tried and tested techniques. Caroline did this work carefully, respectfully, and ethically, using all her trauma-sensitive professional social work and insider researcher skills. Then her proposed approach, with all its supporting evidence, was rejected by her university’s research ethics committee. They wanted her to use more conventional methods where the researcher retains more power and the participants are simply providers of data. This left Caroline with a choice of doing her work ethically while disobeying the ethics committee, or obeying the ethics committee and, paradoxically, doing less ethical research. She made a third and very difficult choice and, with considerable sadness, suspended her doctoral research. Many researchers have faced similar dilemmas but they are rarely reflected in the literature. I am grateful to Caroline for agreeing to write a rather different chapter than she had originally proposed, because I think these stories need to be heard.
The overview chapters are important too. Su-ming Khoo, from the National University of Ireland, explores the relationships between creativity, art and science, with an unflinching look at the dark side of creativity, and demonstrates the place of creativity in ethical decision-making as well as research methods. Bibek Dahal and Suresh Gautam, from Nepal, show us where the differences and similarities lie in creative research methods in the geo-political North and South of the world. Christina Silver, Sarah L. Bulloch and Michelle Salmona, from the UK and Australia, outline the role of computer-assisted qualitative data analysis software packages in creative data analysis. Nadia von Benzon from the UK traces the development and use of creative research methods in geography, and considers some ways in which creative research methods transcend disciplinary boundaries. Taken together, these lenses – ethical, global, digital and disciplinary – tell us a lot about where the field of creative research methods is at present.
Overall the book gives us a good insight into a global field in which people are reviewing and developing methods, identifying new ethical difficulties and finding ways to overcome them, making good use of technology, and working across disciplines. It also shows that creative research methods can equally be local, manual, and applicable within single disciplines. And it clearly demonstrates that creative methods are not only useful for gathering data but can also be useful at every point from research design to dissemination.
I would love to know which chapter (or chapters) of this book seems most important to you, and why. Perhaps you could tell me in a comment.
I have an exciting new venture to share with you. For the last couple of years I have been working with Policy Press on a new series of short affordable books on creative research methods in practice. And we have just gone public! The first book is on its way: Photovoice Reimagined by Nicole Brown. And there are several more books in the pipeline. Two are being written right now – one on fiction in research, and one on phenomenography – and four other book proposals are under review.
I wanted to edit this series because there are no such books available to help researchers learn in detail about why, when, and how to use a new research method. There are several books giving an overview of creative research methods, within or across academic disciplines; some sole-authored, some edited collections. These are useful texts but they do not generally offer enough depth of information to enable readers to try out the methods for themselves with confidence. The main rationale for this new series is to do just that.
One of the hardest things to sort out was the design for the covers and webpage. That took months and a lot of emails, discussions, and meetings (most of which I didn’t need to attend, thank goodness). We almost agreed on some covers and then the sales and marketing people at Policy Press said the designs weren’t good enough. They were absolutely right. So we went back to the actual drawing board and started again. I am so pleased with the final result. I think hot air balloons are a delightful combination of science and art, innovation and exploration and adventure – just like creative research methods. (Let’s not focus too closely on the ‘hot air’ part, OK?!) Also Policy Press likes to have a Bristol element to their designs, and Bristol holds an annual International Balloon Fiesta – Europe’s largest event of its kind – so the design works from that viewpoint too.
I am so happy to be able to tell you about this new book series. And if you would like to propose a book for the series, do get in touch!
This blog and the videos on my YouTube channel are funded by my beloved Patrons. Patrons receive exclusive content and various rewards, depending on their level of support, such as access to my special private Patreon-only blog posts, bi-monthly Q&A sessions on Zoom, free e-book downloads and signed copies of my books. Patrons can also suggest topics for my blogs and videos. If you want to support me by becoming a Patron click here. Whilst ongoing support would be fantastic, you can make a one-time donation instead, through the PayPal button on this blog, if that works better for you. If you are not able to support me financially, please consider reviewing any of my books you have read – even a single-line review on Amazon or Goodreads is a huge help – or sharing a link to my work on social media. Thank you!
A document has been defined as ‘written, graphical or pictorial matter, or a combination of these types of content, in order to transmit or store information or meaning’ (p 11).
So documents have a range of purposes, and can come in a wide variety of forms and formats: digital or hard copy; reports, letters, emails, social media posts, forms, meeting minutes, web pages, leaflets, shopping lists; and so on. Documents are rarely just containers of information; they are also tools for people to use in the world. Documents are used for purposes such as communication (letters, emails etc), or enforcement (legislation and legal judgements), or to make something happen (a child’s birthday present wish-list or an adult’s last will and testament).
Documents can be rich sources of data for research. They may be collected, from libraries, the internet, archives etc, or constructed, such as when a researcher asks participants to keep a diary of relevant events for a specific time period. Collected documents are secondary data, and using secondary data where possible is an ethical approach to research, because it reduces the burden of primary data collection for participants and for researchers.
There are many ways to analyse documentary data: thematic analysis, content analysis, discourse analysis, narrative analysis and metaphor analysis are just a few. And documents are being used as data for research across a wide range of disciplines and fields: psychology, ecology, education, health, technology, linguistics and many others too. Innovative work is being done with documents in research all around the world.
What does not yet exist is an edited collection of chapters to give a sense of the breadth and depth of possibilities offered to research by documents. So I am delighted that Aimee Grant has invited me to co-edit just such a book, in which we intend to showcase some of the excellent work being done with documents by researchers worldwide. We formulated our call for proposals last week; the deadline is midday BST on 24 April 2023. Please help us to spread the word!
We all have biases and prejudices that affect our lives in many ways, from the choices we make to our interactions with others. And of course our biases and prejudices can affect our research work too. We can never completely escape from our biases and prejudices, but there are a number of steps we can take to mitigate their impact. Here are ten of the most useful.
1. Get as much good quality information as you can.
The less information you have, the more space there is for biases and prejudices to operate. Ideally, seek information from reputable sources that is backed up by other reputable sources. Of course in some research areas, at the frontiers of knowledge, there is little to be found – but there will be foundational information to build pioneering research on, and again this needs to be demonstrably solid and trustworthy.
2. Use structures to help you think.
Structures, such as checklists, can bring rigour to your thinking. They should be predetermined and tested. One structure I use frequently is the eight criteria identified by Sarah Tracy for assessing the quality of qualitative research. These criteria were themselves developed from a systematic analysis of debates on quality in the qualitative research literature – exactly the kind of demonstrably solid foundational information I referred to in Tip 1 above.
3. Take steps to mitigate the effects of your emotions.
Our emotions are always with us and they inevitably affect our work. We need to be aware of our feelings so we can take the necessary steps to ensure they are not unduly influencing our decisions. Where emotional influence is unavoidable, we should be open about this in our reporting.
4. Seek the opinions of others.
Other people are often better at spotting our biases and prejudices than we are ourselves. It can be useful to talk through your work with someone you trust to give you an honest opinion. Ask for their views about where your biases and prejudices lie, and how they might be affecting your research.
5. Value scepticism.
Remember, if it looks too good to be true, it probably is. Of course it is possible to overdo scepticism: doubting the accuracy of every single thing is annoying for others and bad for your own mental health. But scepticism in the form of truly critical thinking can be a useful counterbalance to bias and prejudice.
6. Flip the viewpoint.
This involves conducting thought experiments and is particularly useful for debiasing during analytic work. If you think your data is pointing towards a conclusion that group X needs intervention Y, try imagining the opposite. What if group X didn’t need intervention Y? Or what if group X needed intervention M rather than intervention Y? This may sound fanciful, even pointless, yet I recommend that you give it a try. It can be a really useful way to shed light on your findings.
7. Consider accountability.
Who are you accountable to? What would they think of your work? It won’t just be one group of people, so think this through for each group: participants, participants’ families, participants’ community members, colleagues, superiors, maybe funders, your family, your friends… Try to see your work as each group would see it, and consider what that tells you.
8. Use mindfulness.
Bias and prejudice can creep in when you think and work fast. There are incentives in most people’s working lives to think and work fast, but deliberately slowing our thinking can be a very useful guard against bias and prejudice.
9. Practise reflexivity.
Reflexivity involves carefully and critically examining the influences on our work, such as our characters, institutions, identities and experiences. There is no set way to do this, except that it should not become an end in itself; it should serve our research work, or it risks becoming self-indulgent. Working reflexively involves asking ourselves questions such as: Why am I doing this research? What and whose purposes does it serve? Why do some aspects of my research work please or trouble me? And so on.
10. Read work by people who are not like you.
I cannot stress this enough. Learn about others’ views. Read work by people of different genders, ages, ethnicities, cultures, religions/beliefs, political persuasions. Find out how the world looks to them. And this loops us right back to Tip 1 above, because gathering more information about people who are not like us helps to dispel any biases and prejudices we hold about them.
Do you have any other tips for debiasing work? If so, please pop them in the comments.
Trainee qualitative researchers, learning interviewing – the most popular research method – are routinely taught to use their interpersonal skills to create rapport with participants. For the last 20 years this practice has been questioned by Jean Duncombe and Julie Jessop, who ask how ethical it is for researchers to fake friendship as a means to the end of gathering data.
On the one hand, we commonly use interpersonal skills to help us get what we want from others in our day-to-day lives. This applies whether we want a loan from a credit agency, a prescription from the doctor, or a response to a complaint – in a multitude of situations, presenting our most polite and friendly selves can help to get the results we want. So it is arguable that it makes sense to use these everyday methods in research too.
On the other hand, research encounters are rather different from everyday encounters. This applies particularly to qualitative research where a researcher may spend a considerable period of time giving a participant their undivided attention. This is an unusual and often a welcome experience for participants, who often describe it in positive terms such as ‘therapeutic’, ‘cathartic’ or ‘a treat’.
Many of the people we want things from in day-to-day life are either providing us with goods and services, so that a transactional element is built into the encounter, or are already in a personal relationship with us through kinship, friendship or community membership. So the rapport we build in those situations already has a clear basis which is mutually understood. This does not apply within the research encounter, where we are usually asking participants to give us their time and information in exchange for a potential benefit to an imagined future population. (I considered the extent to which this is ethical in my recent post on the LSE Impact Blog.) Also, despite all the efforts to secure informed consent, we know that people generally agree to participate in research for their own reasons rather than ours. And where that reason is to get a little human company and kindness, which is lacking from their own lives, the practice of building rapport begins to appear even more suspect.
Imagine you are, let us say, living on minimal welfare benefits with a chronic condition which makes it difficult for you to leave the house. You have lost touch with the friends you used to have when you could go out to work, and your family live far away. You suffer from anxiety and you are very lonely. The carers who come in three times a day are brisk and professional; they don’t have time to chat, and you don’t want to hold them up because you know they are always under pressure. Then a researcher calls, saying she is doing an evaluation of the care you receive, and asking if she can visit you to ask a few questions. You are delighted because it’s been years since you had a visitor and she sounds so kind and friendly on the phone. When she visits, you tell her all sorts of things about yourself and your life. She seems really interested, and laughs at your jokes, and tells you a few things about her own life in return. You haven’t felt this good in years. When she has asked all her questions, you ask one of your own: please will she visit you again? She looks at the floor and says she would like to, but she can’t promise, because between work and her children she doesn’t have much free time. You would like to suggest she brings her children with her, but you know a ‘no’ when you hear one, so you let her go, wait for the front door to close, and listen to the emptiness of your home and your life.
Duncombe and Jessop point out that these problems are multiplied in longitudinal research, where the boundaries between real and faked friendship can become much more blurred. They share experiences of participants beginning to treat them as friends, and the discomfort that arises when they don’t reciprocate. I have had similar experiences, and I’m sure many other qualitative and mixed-methods researchers have too. It is interesting to consider this Euro-Western approach in the light of the very different Indigenous approach, in which research is deemed to be ethical when it serves to maintain and develop existing relationships. Looked at in this way, our Euro-Western approach of creating and then dropping relationships to further our research purposes seems potentially abusive.
The EU-funded TRUST project developed a Global Code of Conduct for Research in Resource-Poor Settings. It was based on four values elicited from research they did with a wide variety of people around the world: respect, fairness, honesty and care. The aim was to combat ‘ethics dumping’, where research deemed unethical in a higher-income country is conducted, instead, in a lower-income country where research is not governed by a regulatory system. I would argue that these values should also apply where research is done by a researcher with more social capital than some or all of their participants. In the vignette above, the researcher was not entirely honest and did not show care in response to the participant’s request, e.g. by signposting them to a local befriending service. This could be described as ‘friendship dumping’.
When you think about it, researchers using their interpersonal skills to create rapport with a participant as a means to an end is actually quite manipulative. This might be more defensible when we are ‘studying sideways’ or ‘studying up’, but even then it seems questionable. Showing respect for participants would be a more creditable aim, especially if it was combined with fairness, honesty and care.
The next post on this blog will be in September. You can follow the blog, above, to get my posts in your inbox.
Note: This post was first published on the SRA blog in November 2021 and is reproduced here with the kind permission of the author and SRA.
In this blog post, Kimberley Neve, a researcher at the Centre for Food Policy at City, University of London, outlines different methods for capturing ‘lived experience’. Lived experience means the actual, specific ways in which people experience something, in this case food – access to food, food poverty, food quality, food allergies and many others. Kimberley and other researchers at the Centre for Food Policy specialising in qualitative methods have produced a Brief to give an overview of the range of methods you can use when researching people’s lived experience of ‘food environments’. Food environments are the spaces in which we make all our decisions about food – what to eat, where to buy it, when and with whom to eat it.
Using qualitative methods to influence policy
As researchers we want our work to have impact. We also want to know that it resonates with people and reflects not only the experiences of the research participants, but also of the general population in some way. For our research to have a positive impact, effective communication with policy-makers, both locally and nationally, is vital. Despite the potential of qualitative methods to inform policy that is effective and equitable for the people it is designed to help, the number of qualitative studies used as evidence for policy remains modest compared to quantitative studies.
We wanted to raise the profile of qualitative research methods among both policy-makers and food environment researchers by demonstrating the range of potential methods and their benefits (and drawbacks), with a focus on how using them can help inform policy. These methods can be utilised in a wide range of research areas – for example local transport, access to outdoor space or crime in local areas – providing in-depth insights into people’s lived experiences and practices that can explain how or why people act the way they do.
In our Centre for Food Policy Research Brief (the ‘Brief’) we initially mapped existing studies capturing the lived experience of food environments, categorising methods and relevant case studies. Following this, we consulted with members of our Community of Practice – experts in qualitative research and food environments – for feedback prior to final edits.
What are the qualitative methods you can use?
The Brief is not an exhaustive list of the qualitative methods available; however, we’ve tried to capture the main methods you can use. For the scope of the Brief, we didn’t include quantitative methods but of course recognise their vital role.
Often, combining quantitative and qualitative methods can yield the most valuable insights.
To make the overview as useful as possible, we categorised the methods in the following way:
Group 1 – Exploring experiences, perceptions, beliefs, practices and social networks;
Group 2 – Observing practices in situ;
Group 3 – Designing policy and interventions drawing on the lived experience of participants.
Which method should you use for your research?
Typically, you’ll benefit from combining methods to suit your research context. For example, visual methods and observation tend to be accompanied by individual or group interviews to provide a more in-depth exploration. In the full Brief you’ll find an overview of qualitative methods with the key benefits and potential limitations of each. Assuming you know all about individual interviews and focus group discussions already, here is a selection of other methods less frequently used in research projects.
Group 1: Visual methods
This includes photo elicitation, creative arts (where participants create artwork such as drawings, videos or theatre), concept mapping (pile sorting, ranking, mental mapping) and timelines. One study in the US used photo elicitation in urban neighbourhoods to identify community-level actions to improve urban environments in relation to health. The study allowed the researchers to identify that not all food outlets affected health in the same way, and that contextual factors such as crime and safety influenced how people accessed food, which had implications for community-level policy.
PROS – Group 1 methods work particularly well with young participants or where there are language barriers, as views can be expressed more directly and simply. Participants may also be more willing to share information visually and images can provide insights that may not have been accessible via specific questioning.
CONS – Visual data can be difficult to interpret in a way that fully represents the participant perspective, and there is a potential for photographs to be seen as reflections of reality, rather than subjective perceptions that provide insights into reality. Participants could also misunderstand the objective and take photos that do not help to answer the research question.
Group 1: Geospatial methods
Geospatial methods often combine mapping with photography and/or GPS to create visual data that can then be discussed in one-to-one interviews or focus group discussions for more insights. Methods include spatial mapping, geonarratives and geotagged photography. These methods are relatively new to the food environment literature; however, they have been used very effectively to explore how people engage with their environment in general, for example in their green space encounters.
PROS – Similar to visual methods, geospatial methods can work well to engage participants in a more creative way and encourage them to share information more openly. They also allow participants to share their knowledge as experts on their own food environments. These methods provide insightful data on the connections between space and place, particularly if combined with interviews or focus groups.
CONS – Geotagging requires specific technology that may be expensive and difficult to operate. There are also ethical considerations with mapping someone’s location – when and how this data is collected, stored and used are important factors to specify during the research design.
Group 2: Observation
This involves observing participant behaviour with methods such as go-along tours, transect walks and community group observation. Unlike with non-participant observation (below), the researcher talks to the participants during the activity about what their actions and interactions mean to them. For instance, during a go-along tour in a supermarket (shop-along), the researcher might ask for the thought process behind the decision to purchase a product. Transect walks are go-along tours with the addition of creating a map of the local food environment resources, constraints and opportunities.
In a UK study, go-along interviews were used to explore which changes to supermarket environments would support healthier food practices. A key insight from this research was that varied individual responses to the supermarket environment in low-income neighbourhoods are mediated by differing levels of individual agency. Interventions should include an emphasis on factors that increase agency in order to change how people buy food.
PROS – Insights into the practical aspects of daily life and routines can be captured interactively with the participant and explored in more detail with further questioning. Power imbalances in research are addressed as participants take more control of the research process.
CONS – The researcher’s presence may impact how participants behave or move around spaces, for instance by influencing what they buy in a shop-along tour. It is also quite time-intensive to organise and participate in.
Group 2: Non-participant observation
This is where participants are watched from a distance, for instance by video, with little or no interaction with the researcher. This method was used as part of a focused ethnographic study in Kenya along with interviews and cognitive mapping. The aim of the study was to inform policies for improving infant and young children’s nutrition practices. Among other insights, a key finding for policy was that future interventions must consider various aspects of food insecurity to improve conditions in practice.
PROS – You can get insights into ‘real’ individual actions, such as shopping or eating practices, without the researcher’s presence influencing the actions. Features of everyday life that may otherwise not be mentioned can be recorded and explored with further questioning. The researcher can also complete a log to provide contextual insights that can explain practices from a more objective viewpoint.
CONS – Observation alone, without a follow-up interview or discussion, means the researcher is unable to dig into the reasons underpinning the actions, so the interpretation of the situation can be subjective.
Group 3: Photovoice, co-design, co-creation, systems mapping, group model building
The third group of methods were particularly difficult to classify, as terminology and meanings often overlapped (for instance with co-creation and co-design). These methods place the participant at the centre of the research process and actively engage communities affected by policy decisions (at a neighbourhood, city, county, country level) in the research process. Participants are encouraged to draw on their own experiences, expertise and knowledge of their food environments to think about and propose change, so that policies resulting from the research are relevant and context-specific, and as a result have the potential to be more sustainable.
An example of effective group model building can be seen in a study in the US, where community-based workshops took place with a diverse group of chain and local food outlet owners, residents, neighbourhood organisations, and city agencies. Action ideas were discussed for interventions to promote healthy food access, including funding new stores that stock healthy food options and building the capacity for sourcing local produce in stores.
PROS – For all of the methods in Group 3, the ‘hands-on’ nature of research enables participants to generate information and share knowledge on their own terms. Outputs, such as policy recommendations, are created together with the participants to be effective in their local context following an in-depth research process.
CONS – These methods all run the risk of being perceived as tokenistic by participants if engagement is not meaningful and genuine.
In brief
Decisions about which methods to select to study lived experience depend on the purpose of the study (i.e. guided by a specific research question), the local context, the time and resources available, and the benefits and limitations of each method. Recently, the COVID-19 pandemic has accelerated the possibilities of using digital tools and technology as key facilitators for remote research.
As researchers, we not only need to engage participants and design research projects that will yield useful insights; we also have to translate our findings so that these insights can inform the design of effective and equitable policy. By using a range of methods, a more comprehensive and detailed overview can be communicated. Visual materials and stories are particularly effective ways for qualitative researchers to communicate their findings to policy-makers and make a refreshing addition to the more common interviews and focus groups.
Kimberley Neve is a Researcher at the Centre for Food Policy, City, University of London. She works as part of the Obesity Policy Research Unit, investigating people’s lived experiences of food environments to inform policy in areas such as infant feeding and weight management. Kimberley is a Registered Associate Nutritionist with a Masters in Global Public Health Nutrition.
Research ethics committees are very concerned with the potential vulnerability and sensitivity of research participants. So far, so laudable – but I don’t think they show their concern in particularly useful ways. Gaining formal approval from a research ethics committee is a hoop many researchers have to jump through, but then the real work of ethics begins.
For most research ethics committees, vulnerability is an attribute of some groups and not others. Groups who may be deemed to be vulnerable include children, older people, or adults with learning disabilities. These categories are specified by UKRI, which oversees government-funded research in the UK. But if you look at this in more detail, it doesn’t stand up. Take children. Say a competent 14-year-old is a young carer for their single parent, who lives with severe and enduring mental health problems and drinks alcohol all day. Which of those two people might be better able to give informed consent to the child taking part in research? Conversely, people are not necessarily vulnerable because they are older. President Biden is 79 and I can’t imagine him being seen as vulnerable. Learning disabilities don’t necessarily make people vulnerable either, as some of my dyslexic friends would no doubt agree.
Vulnerability is not an attribute, it is a state we all move into and out of in different ways. The start of the Covid-19 pandemic made this abundantly clear. Quite suddenly we were all vulnerable to illness, perhaps death; to increased anxiety; to fear for loved ones who fell sick; to bereavement. Heads of state were no safer than ordinary people living in apartments or suburbs, and researchers were every bit as vulnerable as their participants. Perhaps one small positive side-effect of the pandemic is this: we can see more clearly that we are all vulnerable to changing circumstances resulting in trouble or trauma. Which does not mean we are all vulnerable all the time – but that any of us may be, or may become, vulnerable at any time. As researchers, I think it is essential for us to be aware of this, and ready to face and manage it when it occurs.
Vulnerability and sensitivity have something in common. Just as it is not possible to predict from group membership who is and is not vulnerable, so it is not possible to predict who will and will not be upset by a topic. Of course some topics are likely to be upsetting: female genital mutilation, suicide, sex work, and so on. And if we are investigating evidently sensitive topics like these, we need to put whatever precautions we can in place to make the experience as safe as possible for our participants, and for ourselves. But we cannot be sure that everyone will find these topics equally sensitive; there are people who can take such topics in their stride.
Conversely, some people may be upset by apparently innocuous topics. Suppose a market researcher is investigating people’s perceptions of homewares. In one interview, the researcher asks their question about teapots, and realises their participant is struggling to hold back tears. The participant explains that the last gift ever given to them by their beloved mother, who died exactly one year ago, was a teapot. Perfectly plausible; impossible to foresee.
So, we can’t always predict everything everyone will be sensitive about, and we shouldn’t pretend we can. But, again, we need to equip ourselves with the mental and emotional intelligence and dexterity to be able to deal with the unexpected. Because if there is one thing we can predict, it is that at times we will face the unpredictable.
This blog, the monthly #CRMethodsChat on Twitter, and the videos on my YouTube channel are funded by my beloved Patrons. Patrons receive exclusive content and various rewards, depending on their level of support, such as access to my special private Patreon-only blog posts, bi-monthly Q&A sessions on Zoom, free e-book downloads and signed copies of my books. Patrons can also suggest topics for my blogs and videos. If you want to support me by becoming a Patron click here. Whilst ongoing support would be fantastic, you can make a one-time donation instead, through the PayPal button on this blog, if that works better for you. If you are not able to support me financially, please consider reviewing any of my books you have read – even a single-line review on Amazon or Goodreads is a huge help – or sharing a link to my work on social media. Thank you!