Research Methods Books By Women Of Colour

Last week I was tagged in a tweet asking a very interesting question: which research methods books by women of colour would people recommend?

I thought of a couple of responses immediately, then another the next morning. I also decided to write this blog post because I knew there was more I could say.

Disclaimer: this isn’t a ‘best of’ or a full review; this is simply what is on my shelves in my personal research methods library. I have found these books through social media, peer reviews, bibliographies, and recommendations. Between them they cover a wide range of methods and topics: qualitative, quantitative and multi-modal research; arts-based methods and technology; decolonizing methods and Indigenous research; various disciplinary topics; and a lot of ethics.

‘Why to’ books

These books make a case for doing research in certain ethical ways. Let’s start with a classic: Decolonizing Methodologies by Linda Tuhiwai Smith. I read this ground-breaking book during my Master’s in Social Research Methods around the turn of the century, and bought the second edition when it came out in 2012. This little paperback is remarkably comprehensive and full of wisdom.

Building on the work of Smith: Decolonizing Educational Research: From Ownership To Answerability by Leigh Patel (2016). This is a thoughtful, passionate clarion call for education research to focus on learning.

Building on both of the above: Decolonizing Interpretive Research: A Subaltern Methodology for Social Change, edited by Antonia Darder (2019). Interpretive research prioritises philosophical and methodological ways of understanding society. While this book is quite conceptual, its use of multiple voices provides a depth of insight into the importance of the points it makes. Also, if you have read Smith and Patel before you get to this book, it will make more sense.

‘How to’ books

Heewon Chang’s Autoethnography as Method (2008) is a book I frequently recommend to students. It is readable, practical, and clear. Autoethnography is sometimes criticized as self-indulgent and navel-gazing, but if you do it Chang’s way, it won’t be. Also, autoethnography has a key role to play in these pandemic times.

Pranee Liamputtong’s Performing Qualitative Cross-Cultural Research (2010) is another classic. It is great on cultural sensitivity and gives lots of really helpful examples. Every researcher should read this book unless they’re absolutely sure they are doing monocultural research – and even then they would probably learn something useful.

Caroline Lenette’s Arts-Based Methods in Refugee Research: Creating Sanctuary (2019) is more specialist, yet has a lot to offer to anyone interested in arts-based methods. She pays particular attention to the methods of digital storytelling, photography, community music, and participatory video.

Indigenous methodologies

Indigenous Methodologies: Characteristics, Conversations, and Contexts (2009) is by Margaret Kovach from Saskatchewan in Canada. This very readable book includes conversations with six Indigenous thinkers which contribute an interesting diversity of ideas and experiences.

Indigenous Research Methodologies (2019 – 2nd edition) by Bagele Chilisa from Botswana in Africa is another classic. It is also very readable and comprehensive.

The first disciplinary book I found on Indigenous methodologies is by Lori Lambert: Research for Indigenous Survival: Indigenous Research Methodologies in the Behavioral Sciences (2014). Lambert is from the US and, like Kovach, includes other voices in her work. However, the other voices in Lambert’s book are of people from Indigenous communities, in Canada, the US and Australia, who are subject to research. As is common with Indigenous research texts, Lambert’s book is very readable.

Maggie Walter from Tasmania is lead author of Indigenous Statistics: A Quantitative Research Methodology (2013) with Chris Andersen from Canada (who is a man, but I guess he can’t help that, and evidently he was happy for Walter to be first author so good for him). If you’re quant-averse, don’t worry; this is not about how to do sums, it’s about which sums are worth doing and why. And, again, it’s very readable.

Edited collections

These are both edited by men, but are on relevant topics and include chapters by women of colour. The first is White Logic, White Methods: Racism and Methodology (2008) edited by Tukufu Zuberi and Eduardo Bonilla-Silva. Two-fifths of chapters are written or co-written by women of colour.

The second is Research Justice: Methodologies for Social Change (2012) edited by Andrew Jolivétte. Only two chapters in this book are by men, the other 14 are by women (including Antonia Darder and Linda Smith). I reviewed this book for the LSE book review blog back in 2015.

Other relevant topics

While these books are not directly about research methods, they are on topics which are so relevant to researchers that I will include them here.

Algorithms of Oppression: How Search Engines Reinforce Racism (2018), by Safiya Noble, is a passionately and beautifully argued book about why algorithms are not neutral and the impact that has on society. Researchers use search engines all the time and we need to know about this stuff.

Race After Technology (2019) by Ruha Benjamin builds on and expands Noble’s work. She demonstrates that advances in technology are lauded as objective and progressive, but in fact they reproduce and reinforce existing inequalities. Crucially, she includes a chapter on practical ways to counter this dissonance.

Sway: Unravelling Unconscious Bias (2020) by Pragya Agarwal helps us to understand and challenge our own unconscious biases. Any researcher concerned about ethics would benefit from reading this book.

In fact, any researcher concerned about ethics would benefit from reading any of the books listed here. Although the word ‘ethics’ doesn’t appear in any of the titles, each of these books points the way towards a more ethical research practice.

This is certainly not a comprehensive list of methods and other research-relevant books (and chapters) by women of colour. If you have other suggestions to make, please add them in the comments.

This is a simulpost with the blog of the Social Research Association of the UK and Ireland.

This blog, and the monthly #CRMethodsChat on Twitter, is funded by my beloved patrons. It takes me at least one working day per month to post here each week and run the Twitterchat. At the time of writing I’m receiving funding from Patrons of $52 per month. If you think a day of my time is worth more than $52 – you can help! Ongoing support would be fantastic but you can also make a one-time donation through the PayPal button on this blog if that works better for you. Support from Patrons and donors also enables me to keep this blog ad-free. If you are not able to support me financially, please consider reviewing any of my books you have read – even a single-line review on Amazon or Goodreads is a huge help – or sharing a link to my work on social media. Thank you!

Research methods to consider in a pandemic

Since lockdown began, researchers have been discussing how best to change our methods. Of the ‘big three’ – questionnaires, interviews, and focus groups – only questionnaires are still being used in much the same way. There are no face-to-face interviews or focus groups, though interviews can still be held by telephone and both can be done online. However, doing research online comes with new ethical problems. Some organisations are forbidding the use of Zoom because it has had serious security problems, while others are promoting the use of Jitsi because it is open source.

I’ve been thinking about appropriate methods and I have come up with three options I think are particularly worth considering at this time: documentary research, autoethnography, and digital methods. These are all comparatively new approaches and each offers scope for considerable creativity. Documentary research seems to be the oldest; I understand that its first textbook, A Matter of Record by UK academic John Scott, was published in 1990. Autoethnography was devised by US academic Carolyn Ellis in the 1990s, and digital methods have developed as technological devices have become more available to more people through the 21st century.

Documentary research is also called document research or document analysis. Interest in this approach has been growing recently, with two books published in the last two years in the UK alone. The first is Doing Excellent Social Research With Documents (2018) by Aimee Grant (with a gracious foreword by John Scott). The second is Documentary Research in the Social Sciences (2019) by Malcolm Tight. These books demonstrate that documents can be used as data in a wide range of research projects. Of course some documents are only available in hard copy, such as those held in archives or personal collections, but a large and growing number of documents are freely available online. A range of analytic techniques can be used when working with documents, such as content analysis, thematic analysis, or narrative analysis.
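For readers who like to see what this can look like in practice, here is a minimal, purely illustrative sketch in Python of a very simple keyword-based content analysis of a folder of plain-text documents. The folder name and the keyword categories are invented placeholders, not a recommendation of any particular coding frame; real content analysis involves far more careful development and testing of categories.

```python
# A minimal, illustrative sketch of simple content analysis on documents:
# count how often each researcher-defined category (here, a keyword list)
# appears across a folder of plain-text files. The folder path and the
# keywords are hypothetical placeholders.
from collections import Counter
from pathlib import Path
import re

CATEGORIES = {
    "isolation": ["isolation", "lonely", "alone"],
    "support": ["support", "help", "community"],
}

def count_categories(text: str) -> Counter:
    words = re.findall(r"[a-z']+", text.lower())
    counts = Counter()
    for category, keywords in CATEGORIES.items():
        counts[category] = sum(words.count(k) for k in keywords)
    return counts

totals = Counter()
for path in Path("documents").glob("*.txt"):  # hypothetical folder of documents
    totals += count_categories(path.read_text(encoding="utf-8"))

for category, count in totals.most_common():
    print(f"{category}: {count}")
```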

Autoethnography is ethnography written by, about, and through the researcher’s self (just as autobiography is biography written by its subject). In some quarters autoethnography has a bad reputation as self-indulgent navel-gazing. And of course, like all research methods, it can be poorly used – but when used well it has great potential for insight. I am seeing signs that there are going to be a lot of COVID-19 autoethnographies, so I would recommend steering away from this, but there may well be other aspects of your life that could become a fruitful basis for research. Using autoethnography well requires the researcher to make careful judgements about how much of their self to include in the research as data, what other data to gather, and how to analyse all of that data. Also, good autoethnography is likely to have a clear theoretical perspective and implications for policy and/or practice. Texts I would recommend here are Autoethnography as Method (2008) by Korean-American academic Heewon Chang, and Evocative Autoethnography (2016) by US academics Arthur Bochner and Carolyn Ellis.

Digital research or digital methods are terms that have come to encompass a wide range of methods united by their dependence on technology. Although this is the newest of the three approaches I’m covering today, it is also the most complex and changeable. Many pre-digital research methods can be adapted for use in digital ways, and the digital environment also enables the development of new research methods. Documentary research in lockdown will be mostly, if not entirely, digital, and there is also scope for digital autoethnography. Texts I would recommend, again both from the UK, are Understanding Research in the Digital Age by Sarah Quinton and Nina Reynolds, and Doing Digital Methods by Richard Rogers. One thing to remember when doing digital research is that inequalities also exist in the digital environment; it is not a neutral space. I can recommend a couple of texts on this topic too, both from the US: Algorithms of Oppression by Safiya Noble, and Race After Technology by Ruha Benjamin.

Doing research in a pandemic also requires considerable thought about ethics. I have long argued that ethical considerations should start at the research question, and I believe that is even more crucial at present. Does this research need doing – or does it need doing now, in the middle of a global collective trauma? If not, then don’t do that research, or postpone it until life is easier. Alternatively, you may be doing urgent research to help combat COVID-19, or important research that will go towards a qualification, or have some other good reason. In which case, fine, and the next ethical question is: how can my research be done in a way that places the least burden on others? The methods introduced above all offer scope for conducting empirical research without requiring much input from other people. Right now, everyone is upset; many are worried about their health, income, housing, and/or loved ones; increasing numbers are recently bereaved. Therefore everyone is vulnerable, and so needs more care and kindness than usual. This includes potential participants and it also includes researchers. We need to choose our methods with great care for us all.


 

Writing Is A Research Method

It has always struck me as odd that people don’t recognise writing as a research method. I doubt there is a single piece of formal research in the Euro-Western world which doesn’t involve writing. Yes, we can make all our reports with video, but those videos need scripting and that requires words. As researchers, writing is one way in which we exercise our power. You may not think of yourself or your writing as powerful, yet writing is an act of power in the world. I was reminded recently by a colleague that my words on this blog are powerful. I’d forgotten. It’s easy to forget, but we need to remember.

Writing, in Euro-Western research, is universal. It’s the one method used regularly by both quantitative and qualitative researchers. Perhaps that’s why it isn’t recognised as a method, because it unites us rather than dividing us. But it is a method, and I would argue that it is a qualitative method. We can’t do research without writing, and how we write affects the ways our work is understood and used by other people.

I’ve been interested in the terminology around the COVID-19 pandemic, which I think provides a useful example. Last week I wrote a post about self-isolation. Following a lot of travelling the previous week I’ve been voluntarily staying at home, seeing only my partner and a couple of delivery people. One friend challenged my use of the term ‘self-isolation’, saying that in their view I was doing social distancing because I wasn’t sleeping separately and staying 2m away from my partner or using separate washing facilities, and I was still taking deliveries in person. I could see their point, though I know others are using the term ‘self-isolation’ in the same way as me. My view of social distancing is that it is more about literally keeping our distance from each other in public places. But these are new terms and we’re all trying to figure this whole thing out while it’s happening.

However, neither of these is a particularly lovely term, and I have appreciated the appearance of alternatives. The first I saw was, I think, a Facebook post taken from Instagram (I can’t remember who generated either post now – my apologies; if it was you or you know who it was, please comment below and I’ll edit to credit). The post suggested that we’re not doing social distancing, we’re doing physical distancing for social solidarity. I really liked that concept. Then yesterday Leo Varadkar, Taoiseach of Ireland (and a doctor), spoke of cocooning, and I heard that Americans were talking of ‘shelter in place’.

While I have no evidence for this beyond my own reactions, I suspect that more positive terms are likely to lead to more acceptance. Asking someone to isolate themselves has connotations of loneliness, sadness, and prison (which also has associations with the term ‘lockdown’ currently in use around the world). Physical distancing sounds easier and more accurate than social distancing, and coupling it with social solidarity makes it feel stronger and more righteous. Cocooning makes me think of cosiness and warmth, plus it rhymes (or almost) with other gentle words like soothing and crooning. Asking someone to shelter in place has connotations of home, familiarity, and safety.

As researchers, we often have new information to impart and we sometimes arrive at new concepts which need to be named. There are a whole bunch of words and phrases for us to choose from in writing each new sentence. The words and phrases we use can make a great deal of difference to how our work is received. This means we need to take care in choosing our words and phrases, and in putting them together to make sentences, and in putting sentences together to make paragraphs. These tiny laborious steps are like the strokes of an artist’s brush or the stitches from a crafter’s needle: the beating heart of the writer’s art.


Ten Top Tips For Managing Your Own Research

When someone mentions research methods, what do you think of? Questionnaires? Interviews? Focus groups? Ways of doing research online? Do you only think of data gathering, or do you think of methods of planning research, analysing data, presenting and disseminating findings?

Research methods is a huge and growing field with many books and innumerable journal articles offering useful information. But nobody talks about methods for managing your own research. Perhaps you’re doing postgraduate research in academia or workplace research such as an evaluation. Even if you’re a fully funded full-time doctoral student, research is not all you do. Research has to fit in with the rest of your life and all its domestic work, family needs, other paid or voluntary work, hobbies, exercise, and so on.

Nobody talks about the methods for doing this kind of personal research management. Or, at least, not many people. I said quite a lot about it in my book Research and Evaluation for Busy Students and Practitioners. Petra Boynton also addresses it in her book The Research Companion. But I haven’t seen it mentioned anywhere else (if you have, please let us know in the comments). So here are ten top tips:

  1. Plan everything. Lots of books will tell you how to plan your research project. What they don’t say is that you also need to plan for the changes to your life and work which will result from you taking on the research. How will your research affect your other commitments? What do you need to do to minimise the impact of your research on your other commitments and vice versa? Build in contingency time for unforeseen events.
  2. Manage your time carefully. Use your plan to help you. Break down the main tasks into monthly, weekly and daily to-do lists. Review these regularly.
  3. Learn to work productively in short bursts. It may seem counter-intuitive, but most people get more done this way than by setting aside whole days to work on a project.
  4. Use time when your mind is under-occupied, e.g. when you’re waiting in a queue or doing repetitive household tasks, to think about and solve problems related to your research.
  5. Seek support from your family. Make sure they know about your research and understand its importance to you.
  6. Seek support from colleagues, managers, tutors etc, whether your work is paid or unpaid. Make sure they know about your research and understand its importance in your life.
  7. Don’t cut corners in ways that could damage your health. Eat sensibly, take exercise, get enough sleep and rest.
  8. Take breaks. At least three short breaks in each day, one day off in each week, and four weeks off in each year.
  9. Don’t beat yourself up if things go wrong. Be kind to yourself and learn what you can from the experience. Then re-group, re-plan, and set off again.
  10. Reward yourself appropriately for milestones reached and successes achieved.

In my view, these are as much research methods as questionnaires and interviews. Learning to use them involves acquiring tacit knowledge. I’ve been on a mission to convert tacit knowledge to explicit knowledge ever since I started writing for professionals. This blog post is part of that process. If you have other tips, please add them in the comments.


How Do Research Methods Affect Results?

Last week, for reasons best known to one of my clients, I was reading a bunch of systematic reviews and meta-analyses. A systematic review is a way of assessing a whole lot of research at once. A researcher picks a topic, say the effectiveness of befriending services in reducing the isolation of housebound people, then searches all the databases they can for relevant research. That usually yields tens of thousands of results, which of course is far more than anyone can read, so the researcher has to devise inclusion and/or exclusion criteria. Some of these may be about the quality of the research. Does it have a good enough sample size? Is the methodology robust? And some may be about the topic. Would the researcher include research into befriending services for people who have learning disabilities but are not housebound? Would they include research into befriending services for people in prison?
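To make the screening step concrete, here is a small, hypothetical sketch in Python of applying inclusion and exclusion criteria to search results. The records, field names, and thresholds are all invented for illustration; real screening also involves human judgement on titles, abstracts, and full texts.

```python
# Illustrative sketch of screening search results against inclusion and
# exclusion criteria in a systematic review. All records, fields, and
# thresholds below are invented for illustration.
records = [
    {"title": "Befriending for housebound older adults", "sample_size": 120,
     "population": "housebound", "design": "RCT"},
    {"title": "Peer support in prisons", "sample_size": 45,
     "population": "prison", "design": "qualitative"},
]

MIN_SAMPLE = 50                        # example quality criterion
INCLUDED_POPULATIONS = {"housebound"}  # example topic criterion

def meets_criteria(record: dict) -> bool:
    return (record["sample_size"] >= MIN_SAMPLE
            and record["population"] in INCLUDED_POPULATIONS)

included = [r for r in records if meets_criteria(r)]
excluded = [r for r in records if not meets_criteria(r)]
print(f"Included: {len(included)}, excluded: {len(excluded)}")
```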

These decisions are not always easy to make. Researcher discretion is variable and fallible, and this means that systematic reviews themselves can vary in quality. One thing they almost all have in common, though, is a despairing paragraph about the tremendous variability of the research they have assessed and a plea to other researchers to work more carefully and consistently.

One of the systematic reviews I read last week reported an earlier meta-analysis on the same topic. A meta-analysis is similar to a systematic review but uses statistical techniques to assess the combined numerical results of the studies, and may even re-analyse data if available. The report of the meta-analysis I read, in the systematic review, contained a sentence which jumped out at me: ‘…differences in study design explained much of the heterogeneity [in findings], with studies using randomised designs showing weaker results.’
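For anyone curious about what ‘combining numerical results’ involves, here is a minimal sketch of the core arithmetic of a fixed-effect meta-analysis, using inverse-variance weighting. The study names, effect sizes, and standard errors are invented; real meta-analyses also assess heterogeneity and often use random-effects models.

```python
# Illustrative sketch of a fixed-effect meta-analysis: each study's effect
# estimate is weighted by the inverse of its variance, and the pooled effect
# is the weighted average. All numbers are invented for illustration.
import math

studies = [
    {"name": "Study A", "effect": 0.40, "se": 0.10},
    {"name": "Study B", "effect": 0.15, "se": 0.08},
    {"name": "Study C", "effect": 0.30, "se": 0.20},
]

weights = [1 / s["se"] ** 2 for s in studies]  # inverse-variance weights
pooled = sum(w * s["effect"] for w, s in zip(weights, studies)) / sum(weights)
pooled_se = math.sqrt(1 / sum(weights))

print(f"Pooled effect: {pooled:.3f} "
      f"(95% CI {pooled - 1.96 * pooled_se:.3f} to {pooled + 1.96 * pooled_se:.3f})")
```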

Randomised designs are at the top of the hierarchy of evidence. The theory behind the hierarchy of evidence is that the methods at the top are free from bias. I don’t subscribe to this theory. I think all research methods are subject to bias, and different methods are subject to different biases. For example, take the randomised controlled trial or RCT. This is an experimental design where participants are randomly assigned to the treatment or intervention group (i.e. they receive some kind of service) or to the control group (i.e. they don’t). This design assumes that random allocation alone can iron out all the differences between people. It also assumes that the treatment/intervention/service is the only factor that changes in people’s lives. Clearly, each of those may not in fact be the case.
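A small simulation can illustrate the first of those assumptions: with small samples, random allocation alone does not guarantee that the treatment and control groups are balanced on background characteristics. All the numbers below are invented for illustration.

```python
# Illustrative simulation: randomly allocate 20 participants to two groups
# of 10, many times over, and see how unbalanced the groups can be on a
# background characteristic (here, age). Numbers are invented.
import random
import statistics

random.seed(1)
imbalances = []
for _ in range(1000):                                  # 1000 simulated small trials
    ages = [random.gauss(70, 10) for _ in range(20)]   # 20 participants
    random.shuffle(ages)
    treatment, control = ages[:10], ages[10:]          # random allocation
    imbalances.append(abs(statistics.mean(treatment) - statistics.mean(control)))

print(f"Average difference in mean age between groups: {statistics.mean(imbalances):.1f} years")
print(f"Largest difference seen: {max(imbalances):.1f} years")
```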

Now don’t get me wrong, I’m not anti-RCTs. After all, every research method is based on assumptions, and in the right context an RCT is a great tool. But I am against bias in favour of any particular method per se. And the sentence in the systematic review stood out for me because I know the current UK Government is heavily biased towards randomised designs. It got me wondering, do randomised designs always show weaker results? If so, is that because the method is more robust – or less? And does the UK Government, which is anti-public spending, prefer randomised designs because they show weaker results, and therefore are less likely to lead to conclusions that investment is needed?

And that got me thinking we really don’t know enough about how research methods influence research results. I went looking for work on this and found none, just the occasional assertion that methods do affect results. Which seems like common sense… but how do they? Does the systematic review I read hold a clue, or is it a red herring? The authors didn’t say any more on the subject.

We can’t always do an RCT, even when the context means it would be useful, because (for example) in some circumstances it would be unethical to withhold provision of a treatment/intervention/service. So what about other methods? Do we understand the implications of asking a survey question that a participant has never thought about and doesn’t care about – or cares about a great deal? I know that taking part in an interview or focus group can lead people to think and feel in ways they would not otherwise have done. What impact does that have on our research? Can we trust participants to tell us the truth, or at least something useful?

This is troubling me and I have more questions than answers. I fear I may be up an epistemological creek without an ontological paddle. But I think that bias in favour of – or against – a particular research method, without good evidence of its benefits and disadvantages, is poor research practice. And it’s not only the positivists who are subject to this. Advocates of participatory research are every bit as biased, albeit in the opposite direction. The way some participatory researchers write, you’d think their research caused bluebirds to sing and rainbows to gleam and all to be well in the world.

It seems to me that we all need to be more discerning about method. And that’s not easy when there are so many available, and a plethora of arguments about what works in which circumstances. So I think we may need to go meta here and do some research on the research. But ‘further research needed’ is a very researcher-y way of thinking, and I’m a researcher, so… does my bias look big in this?

Rowan the Rigorous Research Rabbit

I have more exciting news! But first: why is this blog like a bus stop? Because you wait ages for posts with exciting news, then two come along in quick succession!

Further to last week’s comics extravaganza, this week I would like to present an animation I wrote, which has been animated by clever and diligent students at Staffordshire University. The idea for this came from Alke Gröppel-Wegener. She and I were chatting over lunch late last year and I told her about the comic I was writing. “Why don’t you write an animation, too?” she asked. “Because I don’t know how to write an animation and I don’t know any animators,” I said, thinking that was a fairly conclusive argument. But Alke brushed away my objections with a flick of her hand, explained there were student animators at the University, and proclaimed her conviction that of course I could write an animation if I tried.

So I went home and asked the internet how to write an animation and whaddaya know, it knew. I was also very lucky to work with Laura Weston, a knowledgeable and gifted tutor who downloaded segments of her brain into mine on demand. And as for the animators – well!

Laura told me I’d be working with third-years. She helped me to put together a brief, mainly by reining me in when I got all enthusiastic about over-complicating things, and then she publicised the brief. To begin with we asked for character sketches and received several submissions. It was so hard to choose between them that I ended up asking two people to work together – and then discovered that (a) Kalina Kolchevska and Kiefer Bray were first-years and (b) they were already good friends and happy to collaborate. They did a great job creating our hero Rowan and his evil nemesis Cavil the Carrot Fly.

Then I went to meet with a group of third-years, had a chat with them about the freelance lifestyle, and explained that I wanted to put a team together to create the animation. I am so pleased that Carolann Dalziel, aka Caz, volunteered to be the producer, because she did an amazing job. I am also very pleased that Aimee Carter volunteered to direct. I would have been happy with whoever wanted to work on the project, but I am honestly delighted to have had two women working with me as the animation industry is so male-dominated. (I’m also delighted that my comic was illustrated by a woman because that industry is too.)

The rest of the team included artistic director and lead animator Janine Perkins, sound technician and background artist Cameron Jones, Aneesa Malik and David Trotter who drew the storyboards, and Kiefer Bray and Ash Michaelson who worked as junior animators. They have all done such a terrific job that the animation looks very professional. I went to the end-of-year degree show at Staffordshire University earlier this month, where the animation was first shown to the public, and it got excellent feedback.

It is of course about research methods: in particular, how to choose a research question. This is something that troubles students year after year, all around the world. Caz and Aimee, Kalina and Kiefer, Aneesa and David, Janine and Cameron and Ash and I all hope that the animation we have made will help students through this knotty problem. Check it out and see what you think. It’s only one minute long.

Conversation With A Purpose

I have exciting news! This has been a long time in the planning and making, and has come to fruition in part thanks to the support of my beloved patrons. The inspiration came almost two years ago, at one of the pedagogy sessions of the 2016 Research Methods Festival. Research colleagues from the UK’s National Centre for Research Methods, where I am a Visiting Fellow, talked about the difficulty in bridging the gap between classroom and practice when teaching research methods. It occurred to me then that comics and graphic novels could have a useful role to play here, and I vowed to do what I could to make that happen.

Today, I am glad to launch my first research methods comic online. It’s called Conversation With A Purpose and it tells the story of a student’s first real-life interview. I wrote the words, but I couldn’t have made a comic without a collaborator, because I can draw the curtains but that’s about all. My colleague and friend Dr Katy Vigurs put me in touch with Gareth Cowlin who teaches on the Cartoon and Comic Arts degree course at Staffordshire University. I presented his students with a brief, and was lucky enough to recruit the very talented Sophie Jackson to create the artwork for the comic. Sophie is not only a highly skilled artist, she is also a joy to work with, so the entire project was a delight from start to finish.

The in-person launch happened last Friday night at Show and Tell, Staffordshire University’s 2018 art and design degree show. I also launched another creative teaching aid at the show, but you’ll have to wait till next week to find out about that! People’s feedback on the comic was very positive, though I wasn’t surprised because we had already received terrific testimonials from a couple of eminent scholars.

And you know the best part of all? You can download the comic, Conversation With A Purpose, and you will find instructions for printing it here. It will look best if you have a colour printer, though it should also work in monochrome. The comic includes discussion questions for use in the classroom.

Please enjoy, use, and share our comic. And if you would like to help me create more resources like this, please consider joining my patrons. I love producing free stuff to help students and teachers but, as an independent researcher with no guaranteed salary, my resources are very limited. This is where every single supporter makes a real difference.

Why Not Include Theory?

Last week I wrote a post about how to choose a research method. It received a fair amount of approval on social media, and a very interesting response from @leenie48 from Brisbane, Australia, with a couple of contributions from @DrNomyn. I’ve tidied up our exchange a little; it actually ended up in two threads over several hours, so wasn’t as neat as it seems here. I was travelling and in and out of meetings so undoubtedly didn’t give it the attention it deserved. I couldn’t embed the tweets without tedious repetition, so have typed out most of the discussion; our timelines are accessible if anyone feels the need to verify. Here goes:

EH: Your post suggests one can jump from rq to method choice with no consideration of theory. I disagree.

HK: I teach, and write for, students at different levels. Here in the UK master’s students in many subjects have to do research with no consideration or knowledge of theory.

EH: Perhaps it might be useful to point out advice is for specific readers. Bit sick of having to explain to new phd students that this kind of advice is not for them!

HK: You’re right, and I am sorry for causing you so much inconvenience. I’ll re-tag all my blog posts, though that will take a while as there’s a sizeable archive.

HK: That seems unnecessarily pejorative. I don’t regard practice-based master’s research as ‘pretend’, but as a learning opportunity for students. Commissioned research and practice-based research are professional rather than academic. Not wrong, simply different.

EH: Then why not include theory?

HK: I’ve explained why I didn’t include it in my blog post, so I’m not sure what you’re asking here?

And that’s where the discussion ended, with me confused as @leenie48’s question was on the other thread. Having put this into a single conversation, though, for the purposes of this post, it makes more sense. I think @leenie48 was asking why not include theory in master’s-level or practice-based research.

My conversation with @leenie48 might lead the uninitiated reader to believe that theory is a homogeneous ‘thing’. Not so. Theory is multiple and multifaceted. There are formal and informal theories; social and scientific theories; grand and engaged theories; Euro-Western and Southern theories. These are oppositional theory labels; there are also aligned options such as post-colonial and Indigenous theories.

I studied a module on social theory for my MSc in Social Research Methods, and used hermeneutic theory (a grand-ish formal Euro-Western social theory) for my PhD. Yet I don’t think I understood what theory is for, i.e. how it can be used as a lens to help us look at our subjects of study, until well after I’d finished my doctoral work.

If you’re doing academic research, theory can be very useful. Some, like @leenie48, may argue that it is essential. It is certainly a powerful counter when you’re playing the academic game. Yet theory is, like everything, value-laden. At present, in the UK, the French social theorist Bourdieu is so fashionable that the British Sociological Association is often spoken of, tongue in cheek, as the Bourdieu Sociological Association. At the other extreme, social theories from the Southern hemisphere are often ignored or unknown. So I would argue that if we are to include theory, we need to engage with the attributes of the theory or theories on which we wish to draw, and give a rationale for our choice. I find it frustrating that so much of academia seems to regard any use of theory as acceptable as long as there is use of theory, rather than questioning why a particular theory is being used.

This kind of engagement and rationale-building takes time and a certain amount of academic expertise. If you’re doing research for more practical reasons, such as to obtain a master’s degree, evaluate a service, or assess the training needs of an organisation’s staff, theory is a luxury. These kinds of research are done with minimal resources to achieve specific ends. I don’t think this is, as @leenie48 would have it, ‘pretend research’. For sure it’s not aiming to contribute to the global body of knowledge, but I can see the point in working to discover particular information that will enable certain people to move forward in useful ways.

I have still to tackle two other points raised by @leenie48: the ‘methodology vs method’ question, and the issue of writing for master’s students vs doctoral students on this blog and elsewhere. So that’s my next two blog posts sorted out then!

How To Choose A Research Method

Because you do things in a sensible order, you have your research question, right? Good. It’s very important to have that first. The method (or methods) you choose should be the one (or ones) most likely to help you answer your question. You can’t figure out which methods are most likely to help if you don’t yet know what your question is. So if you’re actually not sure of your question, stop reading this RIGHT NOW and go settle your question, then come back and carry on reading.

OK, now you definitely have a question. You’ll probably have an idea of what kind of research method may be most help. (If you don’t, I recommend you get into the research methods literature, such as this book.) Let’s say you think your question could be answered most usefully by doing a bunch of interviews. Your next step is to think through the pros and cons of interviews as thoroughly as you can. You may find it helps to read relevant excerpts from the research methods literature. For example, here is a breakdown of the pros and cons of interviewing, taken from page 143 of my book Research and Evaluation for Busy Students and Practitioners: A Time-Saving Guide (2nd edn; Policy Press, 2017).

| Pros | Cons |
| --- | --- |
| Interviews yield rich data | Interviews are time-consuming for researchers and participants |
| Face-to-face interviews let the interviewer include observational elements, e.g. from the participant’s appearance or body language, that are not available with other methods | The researcher’s interpretation of the data from a face-to-face interview may be affected by the quality of the rapport they developed with their participant |
| Interviews can be conducted by telephone, which saves time and costs and increases anonymity | Not everyone is comfortable using the telephone, and it can be harder to create a rapport over the phone than in person |
| An interview equivalent can be conducted by email, which avoids transcription and so saves time and money; this also helps in reaching some groups of people e.g. those with severe hearing impairment | Conducting an ‘interview’ by email can make it more difficult to follow up interesting answers with supplementary questions |
| Interviewers can follow up interesting answers with supplementary questions | Interviewers’ input can influence participants’ answers |
| Unstructured interviews can be particularly useful at the exploratory stage of a research project | Unstructured interviews run the risks of missing important issues or degenerating into a general chat |
| Semi-structured interviews allow participants to participate in setting the research agenda, which may be more politically acceptable, lead to more useful data, or both | Semi-structured interviews make it harder to compare data from different individuals or groups |
| Structured interviews enable clearer comparison of data from different individuals or groups | Structured interviews require the question designer to be able to consider all the issues that are relevant to the participants |
| Recording data enables exact reproduction of someone’s words and pauses | Transcribing interview data is time-consuming and expensive |

This kind of thinking will help you to decide on your research method. Also, you will need to be pragmatic. For example, if you have a very tight deadline and no time or budget for transcription, then interviewing is not a good idea however much it might fit the research question. In such a case you would need to consider other methods. My book contains similar ‘pros and cons’ tables for using secondary data, questionnaires, focus groups, documents as data, observational data, visual data, and collecting data online. Of course this is not an exhaustive list, and if you’re considering using, say, mobile methods, soundscapes, or ethnography, you might need to construct a ‘pros and cons’ list of your own. To do this, you would need to read, watch videos, and talk to people with more knowledge about the method of interest.

Once you have established the pros and cons of the method, these need to be weighed against pragmatic considerations of available time, money, and other resources. This assessment will be different for each research project, in its own context; there are no hard-and-fast rules. But however you do an assessment like this, your results will always be better than those of someone who uses the method that first springs to mind. For sure you may end up using the method you first thought of, in which case you might say to me, ‘Helen, what is the point of doing all that thinking?’ The point is you’ll be making a considered and informed decision to use the method. That means you’ll be able to justify your decision to readers, reviewers, tutors, supervisors, managers, examiners, or whoever else has an interest in your work.
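If you like to make that weighing-up explicit, here is one purely hypothetical way of sketching it in Python: score each candidate method against the criteria that matter for your project, weight the criteria, and compare totals. The methods, criteria, weights, and scores below are all invented; the value lies in the considered comparison and the justification you can give, not in the numbers themselves.

```python
# A hypothetical sketch of weighing candidate methods against project-specific
# criteria. Methods, criteria, weights, and scores are invented placeholders.
criteria_weights = {"fit_with_question": 3, "time_available": 2, "budget": 1}

candidate_methods = {
    "interviews":    {"fit_with_question": 5, "time_available": 2, "budget": 2},
    "questionnaire": {"fit_with_question": 3, "time_available": 5, "budget": 4},
    "documents":     {"fit_with_question": 4, "time_available": 4, "budget": 5},
}

def weighted_score(scores: dict) -> int:
    # Multiply each criterion score by its weight and sum the results.
    return sum(criteria_weights[c] * scores[c] for c in criteria_weights)

for method, scores in sorted(candidate_methods.items(),
                             key=lambda item: weighted_score(item[1]),
                             reverse=True):
    print(f"{method}: {weighted_score(scores)}")
```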

Why Research Participants Rock

I wrote last week about the creative methods Roxanne Persaud and I used in our research into diversity and inclusion at Queen Mary University of London last year. One of those was screenplay writing, which we thought would be particularly useful if it depicted an interaction between a student and a very inclusive lecturer, or between a student and a less inclusive lecturer.

I love to work with screenplay writing. I use play script writing too, sometimes, though less often. With play script writing, you’re bound by theatre rules, so everything has to happen in one room, with minimal special effects. This can be really helpful when you’re researching something that happens in a specific place such as a parent and toddler group or a team sport. Screenplay, though, is more flexible: you can cut from private to public space, or include an army of mermaids if you wish. Also, screenplay writing offers more scope for descriptions of settings and characters, which, from a researcher’s point of view, can provide very useful data.

Especially when participants do their own thing! Our screenplay-writing participants largely ignored our suggestions about interactions between students and lecturers. Instead, we learned about a south Asian woman, the first in her family to go to university, who was lonely, isolated, and struggling to cope. We found out about a non-binary student’s experience of homophobia, sexism and violence in different places on campus. We saw how difficult it can be for Muslim students to join in with student life when alcohol plays a central role. Scenes like these gave us a much richer picture of facets of student inclusion and exclusion than we would have had if our participants had kept to their brief.

Other researchers using creative techniques have found this too. For example, Shamser Sinha and Les Back did collaborative research with young migrants in London. One participant, whom they call Dorothy, wanted to use a camera, but wasn’t sure what to capture. Sinha suggested exploring how her immigration status affected where she went and what she could buy. Instead, Dorothy went sightseeing, and took pictures of Buckingham Palace. The stories she told about what this place and experience meant to her enriched the researchers’ perceptions of migrant life, not just the ‘aggrieved’ life they were initially interested in, but ‘her free life’ (Sinha and Back 2013:483).

Katy Vigurs aimed to use photo-elicitation to explore different generations’ perceptions of the English village where they lived. She worked with a ladies’ choir, a running club, and a youth project. Vigurs asked her participants to take pictures that would show how they saw and experienced their community. The runners did as she asked. The singers, who were older, took a few photos and also, unprompted, provided old photographs of village events and landmarks, old and new newspaper cuttings, photocopied and hand-drawn maps of the area with added annotations, and long written narratives about their perceptions and experiences of the village. The young people also took some photos, mostly of each other, but then spent a couple of hours with a map of the village, tracing the routes they used and talking with the researcher about where and how they spent time. Rather than standard photo-elicitation, this became ‘co-created mixed-media elicitation’ as Vigurs puts it (Vigurs and Kara 2016:520) (yes, I am the second author of this article, but all the research and much of the writing is hers). Again, this provided insights for the researcher that she could not have found using the method she originally planned.

Research ethics committees might frown on this level of flexibility. I would argue that it is more ethical than the traditional prescriptive approach to research. Our participants have knowledge and ideas and creativity to share. They don’t need us to teach them how to interact and work with others. In fact, our participants have a great deal to teach us, if we are only willing to listen and learn.