The Ethics of Research Evidence

Like so many of the terms used in research, ‘evidence’ has no single agreed meaning. Nor does there seem to be much consensus about what constitutes good or reliable evidence. The differing approaches of other professions may confuse the picture. For example, evidence that would convince a judge to hand down a life sentence would be dismissed by many researchers as anecdote.

Given that evidence is such a slippery, contentious topic, how can researchers begin to address its ethical aspects? A working definition might help: evidence is ‘information or data that people select to help them answer questions’ (Knight 2004:1). Using that definition, we can look at the ethical aspects of our relationship with evidence: how we choose, use, and apply the evidence we gather and construct.

Evidence is often talked and written about as though it is something neutral that simply exists, like a brick or a table, to be used by researchers at will. Knight’s definition is helpful because it highlights the fact that researchers select the evidence they use. Evidence, in the form of facts or artefacts, is neither ethical nor unethical. But in the process of selection, there is always room for bias, and that is where ethical considerations come into play.

To choose evidence ethically, I would argue that first you need to recognise the role of choice in the process, and the associated potential for bias. Then you need to consider some key questions, such as:

  • What is the question you want to answer?
  • What are your existing thoughts and feelings about that topic?
  • How might they affect your choices about evidence?
  • What can you do to make those choices open and defensible?

The aim is to be able to demonstrate that you have chosen the information or data you intend to define as ‘evidence’ in as ethical a way as possible.

Once you have chosen your evidence, you need to use it ethically within the research process. This means subjecting all your evidence to rigorous analysis, interpreting your findings accurately, and reporting in ways that will communicate effectively with your audiences. These are some of the key responsibilities of ethical researchers.

Research is a process that converts evidence into research evidence. It starts with the information or data that researchers choose to use as evidence, which may be anything from statistics to artworks. Then, through the process of (one would hope) diligent research, that evidence becomes research evidence. Whether and how research evidence is applied in the wider world is the third ethical aspect.

Sadly, there is a great deal of evidence that evidence is not applied well, or not applied at all. Most professional researchers have tales to tell of evidence being buried by research funders or commissioners. This seems particularly likely where findings conflict with political or money-making ambitions. In some sectors, such as third sector evaluation, this is widespread (Fiennes 2014). How can anyone make an evidence-based decision if the evidence collected by researchers has not been converted into evidence they can use?

The use of research evidence is often beyond the control of researchers. One practical action a researcher can take is to suggest a dissemination plan at the outset. This can be regarded as ethical, because such a plan should increase the likelihood of research evidence being used. But it could also be regarded as manipulative: using the initial excitement around a new project to persuade people to sign up to a plan they might later regret.

It seems that ethics and evidence are uneasy bedfellows. Again, Knight tries to help us here, by suggesting that research evidence should be used by people with expertise. This raises a further, pertinent question: what is the ethics of expertise? I will address that next week.

A version of this article was originally published in ‘Research Matters’, the quarterly newsletter for members of the UK and Ireland Social Research Association.

Why Research Participants Rock

I wrote last week about the creative methods Roxanne Persaud and I used in our research into diversity and inclusion at Queen Mary University of London last year. One of those was screenplay writing, which we thought would be particularly useful if it depicted an interaction between a student and a very inclusive lecturer, or between a student and a less inclusive lecturer.

I love to work with screenplay writing. I use play script writing too, sometimes, though less often. With play script writing, you’re bound by theatre rules, so everything has to happen in one room, with minimal special effects. This can be really helpful when you’re researching something that happens in a specific place such as a parent and toddler group or a team sport. Screenplay, though, is more flexible: you can cut from private to public space, or include an army of mermaids if you wish. Also, screenplay writing offers more scope for descriptions of settings and characters, which, from a researcher’s point of view, can provide very useful data.

Especially when participants do their own thing! Our screenplay-writing participants largely ignored our suggestions about interactions between students and lecturers. Instead, we learned about a south Asian woman, the first in her family to go to university, who was lonely, isolated, and struggling to cope. We found out about a non-binary student’s experience of homophobia, sexism and violence in different places on campus. We saw how difficult it can be for Muslim students to join in with student life when alcohol plays a central role. Scenes like these gave us a much richer picture of facets of student inclusion and exclusion than we would have had if our participants had kept to their brief.

Other researchers using creative techniques have found this too. For example, Shamser Sinha and Les Back did collaborative research with young migrants in London. One participant, whom they call Dorothy, wanted to use a camera, but wasn’t sure what to capture. Sinha suggested exploring how her immigration status affected where she went and what she could buy. Instead, Dorothy went sightseeing, and took pictures of Buckingham Palace. The stories she told about what this place and experience meant to her enriched the researchers’ perceptions of migrant life, not just the ‘aggrieved’ life they were initially interested in, but ‘her free life’ (Sinha and Back 2013:483).

Katy Vigurs aimed to use photo-elicitation to explore different generations’ perceptions of the English village where they lived. She worked with a ladies’ choir, a running club, and a youth project. Vigurs asked her participants to take pictures that would show how they saw and experienced their community. The runners did as she asked. The singers, who were older, took a few photos and also, unprompted, provided old photographs of village events and landmarks, old and new newspaper cuttings, photocopied and hand-drawn maps of the area with added annotations, and long written narratives about their perceptions and experiences of the village. The young people also took some photos, mostly of each other, but then spent a couple of hours with a map of the village, tracing the routes they used and talking with the researcher about where and how they spent time. Rather than standard photo-elicitation, this became ‘co-created mixed-media elicitation’ as Vigurs puts it (Vigurs and Kara 2016:520) (yes, I am the second author of this article, but all the research and much of the writing is hers). Again, this provided insights for the researcher that she could not have found using the method she originally planned.

Research ethics committees might frown on this level of flexibility. I would argue that it is more ethical than the traditional prescriptive approach to research. Our participants have knowledge and ideas and creativity to share. They don’t need us to teach them how to interact and work with others. In fact, our participants have a great deal to teach us, if we are only willing to listen and learn.

Creative Research In Practice

It’s not often I get to share an output from the commissioned research I do. Sometimes clients don’t want to share publicly for reasons of confidentiality, and sometimes there are other reasons they don’t publish. As a commissioned researcher, I can’t publish the work someone else has paid for without their agreement. But I’m glad to say that Queen Mary University of London (QMUL) has published the full report of the research I did for them last year with my colleague Roxanne Persaud.

The research question was: How can QMUL improve students’ experience with respect to the inclusivity of their teaching, learning, and curricula? The original brief focused on the protected characteristics covered by the UK Equality Act 2010: age, disability, gender reassignment, marriage and civil partnership, pregnancy and maternity, race, religion and belief, and sexual orientation. Roxanne and I advised QMUL to take a more holistic approach to inclusivity, as the protected characteristics don’t cover some factors that we know can lead to discrimination and disadvantage, such as socioeconomic status and caring responsibilities. We recommended Appreciative Inquiry as a methodological framework, because it doesn’t start from a deficit perspective emphasising problems and complaints, but focuses on what an organisation does well and what it could do better. (It doesn’t ignore or sideline problems and complaints, either; it simply starts from the standpoint that there are assets to build on.)  And of course we suggested creative techniques, particularly for data-gathering and sense-making, alongside more conventional methods.

Roxanne and I were both keen to do this piece of work because we share an interest in diversity and inclusion. Neither of us had worked with QMUL before and we weren’t sure whether they would appreciate our approach to their brief. Sometimes commissioners want to recruit people who will do exactly what they specify. Even so, I’d rather say how I think a piece of work needs to be done; if the commissioner doesn’t want it done that way, then I don’t want the job.

QMUL shortlisted six sets of applicants. The interview was rigorous. Roxanne and I came out feeling we’d done ourselves justice, but with no clue as to whether we might have got the work or not. But we did!

The research was overseen by a Task & Finish group, made up of staff from different departments, who approved the methods we had put forward. We conducted a targeted literature review to identify key issues and best practice for inclusivity in the UK and overseas, and set the research in an institutional, societal, and theoretical context. The theoretical perspectives we used began with the theory of intersectionality developed by the law professor Kimberlé Crenshaw, which we then built on using the diffraction methodology of the physicist and social theorist Karen Barad. These two theories together provided a binocular lens for looking at a very complex phenomenon.

The timescale for the research was tight, and data gathering collided with Ramadan, exams, and the summer holidays. So, not surprisingly, we struggled with recruitment, despite strenuous efforts by us and by helpful colleagues at QMUL. We were able to involve 17 staff and 22 students from a wide range of departments. We conducted semi-structured telephone interviews with the staff, and gave students the option of participating in face-to-face interviews or group discussions using creative methods. These methods included:

  • The life-sized lecturer: an outline figure on a large sheet of paper, with a label indicating what kind of person they are, e.g. ‘a typical QMUL lecturer’ and ‘an ideally inclusive lecturer’, which students could write and draw on.
  • Sticker maps: a map of organisational inclusivity, which we developed for QMUL, on which students could place small green stickers to indicate areas of good practice and small red stickers to indicate areas for further improvement.
  • Empathy maps: tools to help participants consider how other students or staff in different situations think and feel; what they might see, say, and do; and where they might experience ‘pain or gain’ with respect to inclusive learning.
  • Screenplay writing: a very short screenplay depicting an interaction between a student and a very inclusive lecturer, or between a student and a less inclusive lecturer. The screenplay would include dialogue and might also include information about characters’ attributes, the setting, and so on.

We generated over 50,000 words of data, which we imported into NVivo. Roxanne and I spent a day working together on emergent data coding, discussing excerpts from different interviews and group sessions, with the aim of extracting maximum richness. Then I finished the coding and carried out a thematic analysis while Roxanne finished the literature review.

We wrote a draft report, and then had two ‘review and refine’ meetings for sense-making, which were attended by 24 people. The first meeting was with members of the Task & Finish group, and the second was an open meeting, for participants and other interested people. We presented the draft findings, and put up sheets on the walls listing 37 key factors identified in the draft report. We gave participants three sticky stars to use to indicate their top priorities, and 10 sticky dots to use to indicate where they would allocate resources. People took the resource allocation incredibly seriously, and it was interesting to see how collaboratively they worked on this. I heard people saying things like, ‘That’s important, but it’s already got five dots on, so I’m going to put another one here.’ I wish I could have recorded all their conversations! We did collect some further data at these meetings, including touch-typed notes of group discussions and information about the relative frequency of occurrence, and importance, of the 37 key factors. All of this data was synthesised together with the previously collected data in the final report and its recommendations.

The comparatively small number of participants was a limitation, though we did include people from all faculties and most schools, and we certainly collected enough data for a solid qualitative study. We would have liked some quantitative data too, but the real limitation was that most of the people we reached were already concerned about inclusivity. We didn’t reach enough people to be able to say with certainty whether this was, or was not, the case more widely at QMUL. Also, while none of our participants disagreed unduly with our methodology or methods, others at QMUL may have done so. In a university including physicists, mathematicians, engineers, social scientists, artists, doctors, dentists and lawyers, among others, it seems highly unlikely that anyone could come up with an approach to research that would receive universal approval.

Yet I’m proud of this research. It’s not perfect – for example, I’ve realised, in the course of writing this blog post, that we didn’t explicitly include the research question in the research report! But its title is Inclusive Curricula, Teaching, and Learning: Adaptive Strategies for Inclusivity, which seems clear enough. I’m sure there are other ways it could be improved. But I’m really happy with the central features: the methodology, the methods, and the flexibility Roxanne and I offered to our client.

Evaluating excellence in arts-based research: a case study

This article first appeared in Funding Insight on 16 June 2016 and is reproduced with kind permission of Research Professional. For more articles like this, visit www.researchprofessional.com.

I recently wrote on this topic citing the work of Sarah J Tracy from Arizona State University, who developed a set of eight criteria for assessing the quality of arts-based and other forms of qualitative and mixed-methods research. Now I propose to apply those criteria to an example of arts-based research, to find out how they can work in practice.

The research example I have chosen is by Jennifer Lapum and her colleagues from Toronto in Canada, who investigated 16 patients’ experiences of open-heart surgery. Their work is methodologically interesting because they used arts-based techniques, not only for data generation, but also for data analysis and dissemination. They published an account of their work in Qualitative Inquiry which I will interrogate here.

Lapum gathered narrative data from two interviews with post-operative patients, one while they were still in hospital and the other some weeks after returning home. Patients also kept journals between the two interviews. She then put together a multi-disciplinary team of people, including artists, researchers, designers, and medical staff, and they spent a year doing arts-based analysis of the patients’ stories. This included metaphor analysis, poetic inquiry, sketching, concept mapping, and construction of photographic images. The team then developed an installation, covering 1,739 square feet, with seven sections representing the seven stages of a patient’s journey. These sections were arranged along a labyrinthine route, with the operating room at the centre, all hung with textile compositions incorporating poems and photographic images that had been generated at the analytic stage. Further dissemination via a short video on YouTube gives some idea of how it would be to visit this installation.

So how does this research fit with Tracy’s eight criteria? First we ask: is the research topic worthy? I would argue that in this case the answer is yes. Open-heart surgery must be a daunting prospect, even though the rewards can be immense. Lapum’s work offers potential patients and carers some insight into the journey they may take, and offers medical and other relevant staff an increased understanding of patients’ experiences. This is likely to improve outcomes for patients.

Second, is this project richly rigorous? The sample size is small, but the data was carefully constructed. Also, the analytic process was extremely thorough, with a multi-disciplinary team spending a year working with the data. Therefore I would conclude that this criterion has been met.

Do we have sincerity? Is the research reflexive, honest, and transparent? The published article is quite explicit about the methods used, and credits several people who have been involved with the process. The article asserts that the research was reflexive, though the article itself is not, nor do the writers outline all the decisions they took in the course of analysis and dissemination. Space in a journal article is limited, of course, but there is no mention of what was left out and why. So the research as presented here is sincere up to a point, but there is scope for more reflexivity and transparency.

What about credibility? There is certainly thick description and multiplicity of voices and perspectives in this research. Also, while the research team did not include participants as such, contributions were made by ‘knowledge users’ including cardiovascular health practitioners and former heart surgery patients. So, in Tracy’s terms, this research is definitely credible.

The next criterion is resonance. The installation certainly had aesthetic merit. It was generalisable to some extent: certainly to heart surgery patients and practitioners from other geographic locations, and perhaps to patients and practitioners of other kinds of major organ surgery. And it was also transferable: ‘we found people of diverse backgrounds not only resonated with the work but were also able to consider the application of these ideas to their lives and/or professional field’ (Lapum et al 2012:221). So, yes, it was resonant.

Did this research make a significant contribution? It evidently extended the knowledge, and may have improved the practice, of the research team. The project was methodologically unusual, and explicitly aimed to engage the audience’s aesthetic and emotional faculties, as well as their intellectual abilities, in responding to the research findings. However, there is no report of the installation’s impact on its audience; again, this may be due to lack of space. So I would argue that this criterion was met, and the research may in fact have made a more significant contribution than we can discern from one journal article.

How ethical was the research? The article does not mention ethics, though it seems inevitable that the research must have received formal ethical approval. The level of thought and care applied to the research suggests that it was ethical, though this is implicit rather than explicit. But, once again, this may be due to space constraints.

And finally, does the research have meaningful coherence? The article tells an engaging and comprehensible story, so yes, it does.

It is perhaps unfair to judge a long and complex research project on the basis of a single journal article of just a few thousand words. Lapum and her colleagues have published several articles about their research; to make a full judgement I should really read them all. However, if the authors had carried out an analysis of their article based on Tracy’s criteria, they might have chosen to add a sentence or two about what they left out, a paragraph or two on reflexivity, a short description of the impact of the installation on its audience, and some information about ethics. The article as it stands is excellent; with these amendments, it could have been outstanding. This demonstrates that Tracy’s criteria are useful for assessing not only research itself, but also reports of research.

The Importance Of Creative Research Methods

Last Thursday, Friday and Saturday I was privileged to facilitate the inaugural Creative Research Methods summer school run by Keele University’s Cultural Animation and Social Innovation Centre (CASIC) working with the New Vic Theatre in nearby Newcastle-under-Lyme. Around 40 people came, travelling from America and South Africa, Sweden and Poland, no doubt other countries I’ve forgotten, and all around the UK.

On the first two mornings we were lucky enough to get to work in the theatre’s auditorium, a wonderful space with plenty of room to move around and interact with people in all sorts of ways. On the first day we used pipecleaners to model journeys both literal and metaphorical, and on the second day we explored issues of power in research using Open Space Technology.

For the first two afternoons, we crossed the car park to the theatre’s Workspace rehearsal room, another great space – with a balcony! On the first afternoon we learned about cultural animation, used buttons to create community maps, then added frames and artefacts to help us come up with research questions. Then we devised and performed creative group presentations – that was so much fun! On the second afternoon we mapped pathways through participation in universities, using flip chart paper, coloured Post-It notes and pens, pipecleaners and tape – by now the creative juices were really flowing.

On the third day we were at the beautiful Keele campus, where (as it was a Saturday) we could use some of the university’s technology facilities: the KAVE for virtual reality and gaming, the Claus Moser studio for soundscapes, and the Turing Lab to make digital circuits. In the afternoon we focused on creative academic writing, hearing about ethnography as advocacy for the animals who are often invisible in social research, and geopoetics, before doing a geopoetics exercise.

We crammed in a great deal, yet there was so much else we could have included. Perhaps the richest part of the summer school was its discussions: between any two people, or a group, or all of us together. I was delighted and astonished by the calibre of the students: an enormously intelligent, creative, dynamic bunch; it was an honour to spend three days in their company.

I love to teach creative research methods, and I’m looking forward to my next gig this Friday at LSE for the National Centre for Research Methods (fully booked I’m afraid). I find a lot of my teaching involves giving people permission to work creatively – or perhaps enabling them to give themselves permission – and advising people on how to convince supervisors and ethics committees that it is legitimate to take a creative approach to research. There is a long hard fight ahead to convince people in certain quarters that useful knowledge exists beyond the bounds of academic convention. In this fight, we are on the same side as Indigenous researchers around the world who find their methodologies are sidelined or ridiculed by the academy. Anishnabe researcher Kathy Absolon, in conversation with Plains Cree and Salteaux researcher Margaret Kovach, said this:

If you go on a water walk or quest, that is your methodology. I was reflecting when you were talking about yours [methodology]. If I said I am doing my PhD and my methodology is my dreams, and I am going to go on a fast every year, and after that fast I had somebody come and visit me and talk to me about my fast and take [teachings] with them. I wouldn’t propose that because I wouldn’t want that to [be] measured. I know that is Indigenous methodologies, but I wouldn’t propose it as a methodology within a mainstream setting because I don’t want them to have the power to say that that’s not research. But it is. (Absolon in Kovach 2009:152-3)

There is a parallel here with creative research in the Euro-Western paradigm, where supervisors, ethics committees, journal editors and reviewers, and others have the power to say ‘this is not research’ to people who know perfectly well that their textile art, ice-skating, or poetry, is indeed research. Patricia Leavy has written eloquently of ‘the ache of false separation’ that some people feel when required to keep their art separate from their research work (2010:240).

Some people have said to me that one reason I can write the books I write is that I’m not an academic. As an independent researcher, I have much less power than many academics, in many ways. But I do have the power to say ‘this is research’, and to collect the evidence that this is research, and put it in a scholarly book, so that other people can cite that work, which helps to convince doubting/frightened/threatened supervisors and others. And I will stand with Indigenous researchers, though their methods are not my methods, because I recognise that knowledge comes from more places and in more ways in this complex and beautiful world than those I can access myself.

Still it feels lonely sometimes. So having the opportunity to spend three days with a group of lively-minded people, who are not only open to this but engaging with it, excited by it, and pushing its boundaries in fascinating ways, was an absolute delight.

Indigenous Research Methods: A Reading List

Last week I wrote about challenging the dominance of English in writing for research and academia. That theme is also relevant to this post, though here it’s more about challenging Euro-Western epistemologies and methods than the English language itself. Over the last year I have built a personal library of books about, or relevant to, my investigation of Indigenous research methods and ethics. The point of this, for me, is to bring these methods into my scholarship, alongside creative and conventional methods, as appropriate. The point is not to become an ‘expert’ on Indigenous research; for a white British person, that is not, should not be, an option. At the start of this work, I worried about being extractive, but I found comfort in the words of Margaret Kovach, an Indigenous researcher from Saskatchewan in Canada, who encourages non-Indigenous scholars to help make space for Indigenous methodologies and assess their value on their own terms. This is what I am trying to do.

For those who are new to this topic, ‘Indigenous’ denotes the native peoples of colonised lands, such as Aboriginal Australians or Inuit Alaskans, while ‘indigenous’ denotes the native peoples of non-colonised lands. So I am an indigenous Brit who will never be an Indigenous researcher. Some people described as Indigenous are unhappy with the term because they feel that it makes them seem like one homogeneous group, whereas in fact there is tremendous diversity. For example, there are hundreds of tribal and language groupings in Australia alone. However, as it is the term most commonly used in the literature, I’m sticking with it for now.

The first book is the foundational Decolonizing Methodologies by Linda Tuhiwai Smith, a Maori researcher from New Zealand. In fact I bought the first edition of this soon after it came out in 1999, the year I began my MSc in Social Research Methods. The second edition came out in 2012. This book shows how research was used as a tool of imperialism to help subjugate colonised peoples through, among other things, complete disregard for Indigenous knowledges and Indigenous peoples’ own research methods. It highlights the value of these knowledges and methods, and calls for research to be linked explicitly with social justice.

Shawn Wilson is an Opaskwayak Cree researcher from Canada who has also lived and worked with Indigenous peoples in Alaska and Australia, as well as spending time with Indigenous peoples in New Zealand, Morocco, and elsewhere. His book, Research Is Ceremony: Indigenous Research Methods (2008), is based on his doctoral research and describes a paradigm shared by Indigenous researchers in Canada and Australia. It’s not easy to get hold of; I tracked down a Canadian bookseller who seems to have bought up the last available copies, and I fear it may be going out of print, which would be a great shame as it is readable and insightful. UPDATE: The publisher emailed me in January 2018 to say it’s not out of print (hurrah!) and it is now available through the link above.

Margaret Kovach is a Plains Cree and Salteaux researcher from Canada whose Indigenous Methodologies: Characteristics, Conversations, and Contexts came out in 2009. Her book covers epistemologies, methods, and ethics. It is a work of considerable scholarship that is also accessible and full of wisdom.

Bagele Chilisa is a Professor at the University of Botswana. Her book Indigenous Research Methodologies (2012) gives an uncompromising and international account of some of the theories, epistemologies, ontologies and methods used by Indigenous researchers. While no book on this subject could be completely comprehensive, Chilisa makes a good job of showing the diversity, as well as some of the commonalities, of Indigenous methodology.

Donna Mertens from the US, Fiona Cram from New Zealand, and Bagele Chilisa have edited a collection called Indigenous Pathways into Social Research: Voices of a New Generation (2013). They have contributions from Indigenous researchers from all around the world: Vanuatu, Mexico, Cameroon, Hawai’i, Alaska, Papua New Guinea, and many other places. These are fascinating accounts, highlighting personal, political, and ethical challenges, and how they have been overcome. They also say a lot about Indigenous methodologies around the world.

Also in 2013, Maggie Walter, a trawlwoolway researcher from Tasmania, and Chris Andersen, a Métis researcher from Canada, brought out Indigenous Statistics: A Quantitative Research Methodology. This book demonstrates the pervasiveness of Euro-Western thought in the construction of statistical research, using national censuses for illustration. It offers a framework for Indigenous quantitative research, nayri kati or ‘good numbers’, which places an Indigenous standpoint at the centre. There is a short video online of Maggie Walter talking about Indigenous quantitative research.

Lori Lambert is a Mi’kmaq researcher from north-eastern Canada who has also worked with Indigenous peoples from Montana, US; northern Manitoba, Canada; and Queensland, Australia. Her book, Research for Indigenous Survival: Indigenous Research Methodologies in the Behavioral Sciences, was published in 2014. To the best of my knowledge, this is the first book to position Indigenous methods within a Euro-Western disciplinary category. Like other Canadian writers, such as Wilson and Kovach (above), Lambert includes the voices of people she has worked with alongside her own in her narrative.

Another essential text, though not specifically about research methods, is Southern Theory by Australian academic Raewyn Connell (2009). This book is subtitled ‘The global dynamics of knowledge in social science’ and in my view is essential reading for anyone engaging with social theory. During my MSc, I was taught social theory as the preserve of dead white men, and I am sure this is still being taught in many Euro-Western universities today. Connell’s book gives the lie to this approach.

This list is not exhaustive; it is just my personal library. One limitation is that I can’t afford expensive books. While I was writing this blog post, I had a message from my friend and colleague Roxanne Persaud, alerting me to Susan Strega and Leslie Brown’s edited collection Research as Resistance: Revisiting Critical, Indigenous, and Anti-Oppressive Practices (2nd edn 2015). I would love to read this book, but even the paperback is over £60 which puts it out of my reach.

These books are not comfortable reads for Euro-Western scholars, but they are hugely important. We need to know how research has been, and is, misused by Euro-Western cultures in order to learn how to use it better. Indigenous scholars are extraordinarily generous in their assessment of the potential value of Euro-Western methodologies, even those methodologies that have been instrumental in stealing their lands and their cultures and traumatising generations of their peoples. Yet most Euro-Western researchers either ignore Indigenous research entirely, or conclude that Indigenous peoples must have picked up a few tricks from the colonisers. I’m not sure which is worse. Indigenous research methods pre-date Euro-Western research methods by tens of thousands of years, and there is a great deal that Euro-Western researchers can learn from these approaches.

The Variety Of Indie Research Work

One of the things I love about being an independent researcher is the sheer variety of projects I work on and tasks I might do in a day. Yesterday, I was only in the office for the afternoon, yet I worked on at least seven different things. Here’s what I did.

First, I checked Twitter, and found a tweet with a link to a blog post I wrote about an event that is part of a project I’m working on with and for the forensic science community. This is a new departure for me, in that I haven’t worked with forensic scientists before, though the work itself is straightforward. I’m supporting a small group of people with research to identify the best way to create a repository for good quality student research data, and it’s surprisingly interesting. So I retweeted the tweet.

Second, I dealt with the morning’s emails. The arrival of a purchase order I’d been waiting for weeks to receive – hurrah! I formulated the invoice and sent it off to the client. Then some correspondence about the creative research methods summer school I’m facilitating at Keele in early July – just three weeks away now, so the planning is hotting up (and there are still some places left if you’d like to join us – it’ll be informative and fun). The most interesting email was a blog post from Naomi Barnes, an Australian education scholar who is considering what it means to be a white educator in the Australian school system. This chimes with the work I am doing on my next book, so I left a comment and tweeted the link.

While on Twitter, I got side-tracked by a tweet announcing #AuthorsForGrenfell, an initiative set up by authors for authors to donate items for auction to raise funds for the Red Cross London Fire Relief Fund to help survivors of the Grenfell Tower fire. I’d been wanting to help: my father is a Londoner, I have always had family in London, I lived in London myself from 1982-1997, and one member of my family is working in the tower right now to recover bodies. So it feels very close to home. But I’m not in a position to give lots of money, so I was delighted to find this option which I hope will enable me to raise more money than I could give myself. I have offered one copy of each of my books plus a Skype consultation with each one. My items aren’t yet up on the site, but I hope they will be soon because bidding is open already. If you’re one of my wealthy readers, please go over there and make a bid!

Then I spent some time researching aftercare for data. Yes, indeed there is such a thing. So far I’ve come up with two ways to take care of your data after your project is finished: secure storage and open publication. They are of course diametrically opposed, and which you choose depends on the nature of your data. Open publication is the ethical choice in most cases, enabling your data to be reused and cited, increasing your visibility as a researcher, and reducing the overall burden on potential research participants. In some cases, though, personal or commercial sensitivities will require secure storage of data. There may be other ways to take care of data after the end of a project, and I’ll be on the lookout for those as I work on my next book.

By now it was 6 pm so I did a last trawl of the emails, and found one from Sage Publishing with a link to a Dropbox folder containing 20 research methods case studies for me to review. They publish these cases online as part of their Methodspace website. I like this work: it’s flexible enough to fit around other commitments and, like other kinds of review, it tests my knowledge of research methods while also helping me to stay up to date. Best of all, unlike other kinds of review, Sage pay for my expertise. So I downloaded all the documents, checked and signed the contract, and emailed it back with a ‘thank you’. By then it was 6.30 pm and time to go home.

As the old saying goes, variety is the spice of life. I certainly like the flavour it gives to my work. Some days I work on a single project all day; those days are fun too. Yesterday I worked in my own office, today I’m out at meetings locally, tomorrow I’m off to London. It’s always ‘all change’ and I wouldn’t have it any other way.

Let’s Talk About Research Misconduct

Research misconduct is on the rise, certainly within hard science subjects, quite possibly elsewhere. Researchers around the world are inventing data, falsifying findings, and plagiarising the work of others. Part of this is due to the pressure on some researchers to publish their findings in academic journals. There is also career-related pressure on researchers to conduct accurate polls, produce statistically significant results, and get answers to questions, among other things. Some clients, managers, funders and publishers have a low tolerance for findings that chime with common sense or the familiar conclusion of ‘more research is needed’. They may expect researchers to produce interesting or novel findings that will direct action or support change.

Publishers are working to counteract misconduct in a variety of ways. Plagiarism detection software is now routinely used by most big publishers. Also, journal articles can be retracted (i.e. de-published) and this is on the increase, most commonly as a result of fraud. However, the effectiveness of retraction is questionable. The US organisation Retraction Watch has a ‘leaderboard’ of researchers with the most retracted papers, some of whom have had more papers retracted than you or I will ever write, which suggests that retraction of a paper – even for fraud – does not necessarily discredit a researcher or prevent them from working.

Some research misconduct can have devastating effects on people, organisations, and professions. People may lose their jobs, be stripped of prizes or honours, and be prosecuted in criminal courts. Organisations lose money, such as the cost of wasted research, disciplinary hearings, and recruitment to fill vacancies left by fraudulent researchers. And whole professions can suffer, as misconduct slows progress based on research. For example, in 2012 the Journal of Medical Ethics published a study showing that thousands of patients had been treated on the basis of research published in papers that were subsequently retracted. Retraction Watch shows that some papers receive hundreds of citations even after they have been retracted, which suggests that retraction may not be communicated effectively.

Yet even the potentially devastating consequences of misconduct are clearly not much of a deterrent – and in many cases may not occur at all. Let’s examine a case in more detail. Hwang Woo-Suk is a researcher from South Korea. In the early 2000s he was widely regarded as an eminent scientist. Then in 2006 he was found to have faked much of his research, and he admitted fraud. Hwang’s funding was withdrawn, criminal charges were laid against him, and in 2009 he received a suspended prison sentence. Yet he continued to work as a researcher (albeit in a different specialism) and to contribute to publications as a named author.

Closer to home, a survey of over 2,700 medical researchers published by the British Medical Journal in 2012 found that one in seven had ‘witnessed colleagues intentionally altering or fabricating data during their research or for the purposes of publication’. Given the pressures on researchers, perhaps this is not surprising – though it is deeply shocking.

The examples given in this article are from hard science rather than social research. Evidence of misconduct in social research is hard to find, so it would be tempting to conclude that it happens less and perhaps that social researchers are somehow more ethical and virtuous than other researchers. I feel very wary about making such assumptions. It is also possible that social research is less open about misconduct than other related disciplines, or that it’s easier to get away with misconduct in social research.

So what is the answer? Ethics books, seminars, conferences etc frequently exhort individual researchers to think and act ethically, but I’m not sure this provides sufficient safeguards. Should we watch each other, as well as ourselves? Maybe we should, at least up to a point. Working collaboratively can be a useful guard against unethical practice – but many researchers work alone or unsupervised. I don’t think formal ethical approval is much help here, either; it is certainly no safeguard against falsifying findings or plagiarism. Perhaps all we can do at present is to maintain awareness of the potential for, and dangers of, misconduct.

A version of this article was originally published in ‘Research Matters’, the quarterly newsletter for members of the UK and Ireland Social Research Association.

How Independent is an Indie Researcher?

I have always loved being independent. My parents like to tell the story of the time when, soon after I learned to walk, they took me for a picnic in a local park. My father put me down on the grass, and I got to my feet and toddled away. My mother looked anxious, and my father said, reassuringly, ‘She won’t go far.’ But his confidence was misplaced, because I headed determinedly off into the wide green yonder, and he had to do a quick sprint to bring me back before I came to grief.

When I began researching, I called myself a freelance researcher, or a consultant researcher. I didn’t start calling myself an independent researcher until Immy Holloway told me I should, at a terrific research methods conference in Bournemouth in 2006. (The same conference where I met the incomparable Ken and Mary Gergen, as a result of which they kindly wrote the foreword for my creative research methods book.) As soon as Immy suggested the phrase, I took to it immediately. It seemed to suit.

I love working independently. Particularly at the moment, when I’m mostly home-office-based and writing – though after a few weeks I’ll be pleased to have the meetings and teaching that are scheduled then. But for now, I’m really happy sitting alone at my desk, looking out at the garden growing into spring, listening to the birdsong and the squeals of next door’s children on their trampoline, and writing this blog post.

You know, though, I’ve been thinking recently that despite being officially an independent researcher, I’m actually very dependent. For example, I am completely dependent on others for my income. If nobody chooses me, or not enough people choose me, to do available work, I will go under – particularly as there is so little research funding for which indies can apply. Also, I often need to ask for favours, from small (please can I put your name down as a referee for this research tender?) to large (please will you write a foreword for my book?). As an independent writer, I am dependent on readers for reviews, whether official written ones on websites or in journals, or unofficial verbal ones – the coveted ‘word of mouth’ (at least, it’s coveted if the words are complimentary). More worryingly, I am also dependent on readers to help get my books translated into other languages. My publisher tells me that this usually happens when a bilingual academic makes a proposal to a non-English publisher and offers to support the translation. I am only fluent in English, and although I have good international networks, they’re mostly in English-speaking countries. Unlike institution-based scholars, I have never been able to afford to go to a conference outside the UK where I might make contacts with bilingual academics who could help with translations, perhaps in return for other favours. As a result, I know very few people who I can ask to help with translations. (If you know anyone in the social sciences, arts, or humanities who might help, do tell me please!)

I remember when my supervisor and I were planning my viva. I knew who I wanted for my external examiner, but my supervisor over-ruled me, because she didn’t know the person I wanted, and she did know someone else who she thought would be good (and was). She said she was sure he would do it because he owed her a favour. I have learned since then that a lot of academia seems to work through giving and calling in favours. In such an environment it feels odd to call myself ‘independent’.

The book I’m writing is on research ethics. In the Indigenous research paradigm, reciprocity between researchers and participants is a key ethical principle. However, in the Euro-Western paradigm, researchers have found that attempting such reciprocity where there is an imbalance of power is difficult and can even have dangerous consequences (Israel 2015:137-8). I can’t find much work on reciprocity between academics, and what I can find addresses reciprocity between countries or disciplines and doesn’t say much about power imbalances. I haven’t found anything about reciprocity across the walls of the academy, where there is undoubtedly a power imbalance. I’m glad to say that, in my own experience at least, academics have mostly been courteous and often generous with their help and support for my work, even though, as an indie, I can’t reciprocate in all the same ways that I could if I was based in an institution. This potentially makes me even more dependent, because I have less to offer than salaried mid-career academics. As I progress in my work, will this power imbalance grow? Will it adversely affect the reciprocity on which my entire career depends? Or am I needlessly worrying about something because it feels insecure, when in fact it doesn’t really matter?

Mixed-Methods Data Analysis

Following my post last month about using concentric circles for gathering research data, I had a question from a reader. Nieky van Veggel asked me, “How would I analyse the outcomes of this method?” This is a good question and, like many good questions, it has more than one answer.

First, you can do quantitative analysis: counting and measuring. If you have the participant at the centre, you can count the number of people, agencies, or whatever it is that they have drawn or placed around the concentric circles. Then, whichever version of the method you use, you can measure the distance, or distances, between the fixed central point and the drawn or placed point(s) chosen by the participant. Once you have the raw numerical data from your counting and/or measurement, you can use statistical calculations as appropriate to your sample size and sampling technique.
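
To show what that raw numerical data might look like, here is a minimal sketch in Python. The names, distances, and transcription format are all my own assumptions for illustration, not part of the method itself; it simply assumes each participant’s completed circles have been transcribed as a list of items with their measured distances from the centre.

```python
# Minimal sketch: hypothetical transcription of concentric-circles data as
# (label, distance from centre in cm) pairs, one list per participant.
from statistics import mean, median, stdev

participants = {
    "P01": [("mother", 2.0), ("best friend", 3.5), ("GP", 9.0)],
    "P02": [("partner", 1.5), ("social worker", 6.0)],
}

# Per-participant counts and distance summaries
for pid, items in participants.items():
    distances = [d for _, d in items]
    print(f"{pid}: {len(items)} items, mean distance {mean(distances):.1f} cm, "
          f"median {median(distances):.1f} cm")

# Pooled across the whole sample, ready for whatever statistics suit your
# sample size and sampling technique
all_distances = [d for items in participants.values() for _, d in items]
print(f"Whole sample: {len(all_distances)} items, "
      f"mean {mean(all_distances):.1f} cm, sd {stdev(all_distances):.1f} cm")
```

With a small purposive sample, descriptive statistics like these are usually as far as it makes sense to go.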

Second, you can do qualitative analysis. You can look at the types of relationships depicted and sort those into categories and themes. You can cross-tabulate relationships with other participant attributes, e.g. age or gender. You can also cross-tabulate with any other data you have collected to see if there is a relationship.
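
As a small illustration of the cross-tabulation idea, here is a sketch using the pandas library, with entirely made-up data: relationship categories down the rows, a participant attribute such as age group across the columns, and counts in the cells.

```python
# Minimal sketch: cross-tabulating relationship categories against age group
# using hypothetical coded data.
import pandas as pd

df = pd.DataFrame({
    "participant":  ["P01", "P01", "P02", "P03", "P03"],
    "age_group":    ["18-25", "18-25", "26-40", "26-40", "41-60"],
    "relationship": ["family", "friend", "professional", "family", "professional"],
})

# Rows = relationship category, columns = age group, cells = counts
print(pd.crosstab(df["relationship"], df["age_group"]))
```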

Third, you can do both. Then you can synthesise your qualitative and quantitative analyses – or, at least, you can try. There are too many ways of synthesising data to give full details in a blog post, but you can find more information, references, and examples on pages 106-109 of my book on creative research methods. This post is designed to give you an overview of the subject.

Data synthesis, or data integration as it is also known, can be useful in a number of ways. For example, it can be used to triangulate your data, or to enrich your analysis, and it can yield results which could not be obtained through the analysis of any single dataset. The findings of each single dataset will help to answer your research questions up to a point, but bringing those findings together may give a fuller explanatory narrative. However, integrating findings from different datasets can be one of the most challenging aspects of mixed-methods data analysis. Therefore, it makes sense to have a rationale for doing this, rather than trying to do it for its own sake.

Broadly, when you try to integrate your data, one of three things can happen:

  1. The findings from the different datasets agree. Sadly this is not as common as you might think.
  2. The findings from the different datasets agree in some respects but not in others. This is probably the most common outcome, and requires hard thinking and more analysis to try to resolve the disagreements as far as possible, with further research required where resolution cannot be reached.
  3. The findings from the different datasets do not agree at all. This almost certainly indicates a need for further research – which is not always a palatable message for research managers, commissioners, or funders.

When you write up your data integration process and findings, you need to show how each element relates to the others. The danger with this is it can make your article or report rather ‘methods-heavy’, so be concise where you can.

Australian researchers Reesa Sorin, Tamara Brooks and Ute Haring did some research into children’s understandings of their physical environment. In the process, they developed an analytical procedure using three different methods to analyse a dataset made up of children’s artworks and stories. They began with a quantitative technique: content analysis. This involved identifying the main features of children’s drawings and putting them into categories such as animals, houses and trees. Then they counted the number and frequency of items in each category, reasoning that the more frequently something appeared, the more meaningful it was to children. The other two methods were qualitative. One was interpretive analysis, in which they identified more categories, this time based on the presentation of each drawing, its mood, and the messages in the story the child had told about their drawing. The other qualitative method was developmental analysis, which suggests that stages in the development of children’s artworks can be correlated with their ages. So the content analysis outlined the features of the drawings, the interpretive analysis added depth by showing multiple meanings, and the developmental analysis added ages and stages. The researchers concluded that this combination of analytic methods can ‘provide deep insights into young children’s understandings’ (Sorin, Brooks and Haring 2012: 29).
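
The counting step in that kind of content analysis is straightforward to sketch in code. The example below is hypothetical and is not the procedure Sorin, Brooks and Haring describe; it simply tallies how often each feature category appears across a set of coded drawings, on the reasoning that more frequent means more meaningful.

```python
# Minimal sketch: tallying feature categories across coded drawings
# (hypothetical codes, one list per child's drawing).
from collections import Counter

coded_drawings = [
    ["house", "tree", "dog"],
    ["tree", "tree", "sun"],
    ["house", "cat", "tree"],
]

category_counts = Counter(
    feature for drawing in coded_drawings for feature in drawing
)
for category, count in category_counts.most_common():
    print(category, count)
```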

Data analysis is at the core of our interpretive work as researchers, yet it is rarely discussed and often misunderstood. You can’t learn how to analyse data from a blog post, but it may help you to figure out what some of your current questions are. And I hope, Nieky van Veggel, that this post will provide a step on the way to ticking off another item on your impossible list. Good luck!