Why Secondary Data Should Come First

The argument put forward in this post has been brewing in my mind – and being put into practice in my research work – since some time before COVID-19 appeared in our midst. The pandemic has accentuated the point I want to make.

Essentially, my argument is this: researchers should make as much use of secondary data as possible before we even think about gathering any primary data.

Most novice researchers are taught that new research requires primary data; that original research requires data gathered for the purpose by the researcher or the research team. Research ethics committees, similarly, focus most of their efforts on protecting participants. We need to change this. I believe we should be teaching novice researchers that new/original research requires existing data to be used in new ways, and that primary data should be gathered only if absolutely necessary. I would like to see research ethics committees not only asking what researchers are doing to ensure the safety and wellbeing of participants, but also requiring a statement of the work that has been done using secondary data to try to answer the research question(s), and a clear rationale for the need to go and bother people for more information.

I believe working in this way would benefit researchers, participants, and research itself. For researchers, gathering primary data can be lots of fun and is also fraught with difficulty. Carefully planned recruitment methods may not work; response rates can be low; interviewees often say what they want to say rather than answering researchers’ questions directly. For participants, research fatigue is real. Research itself would receive more respect if we made better and fuller use of data, and shouted about that, rather than gathering data we never use (or worse, reclassifying stolen sacred artefacts and human remains as ‘data’ and refusing to return them to their communities of origin because of their ‘scientific importance’ – but that’s another story).

Some people think of secondary data as quantitative: government statistics, health prevalence data, census findings, and so on. But there is lots of qualitative secondary data too, such as historical data, archival data, and web pages current and past. Mainstream and social media provide huge quantities of secondary data (though with social media there are a number of important ethical considerations which are beyond the scope of this post).

Of course secondary data isn’t a panacea. There is so much data available these days that it can be hard to find what you need, particularly as it will have been gathered by people with different priorities from yours. Also, it’s frustrating when you find what you need but you can’t access it because it’s behind a paywall or it has an obstructive gatekeeper. Comparison can be difficult when different researchers, organisations, and countries gather similar information in different ways. It can be hard to understand, or detect any mistakes in, data you didn’t gather yourself, particularly if it is in large, complicated datasets. Information about how or why data was gathered or analysed is not always available, which can leave you unsure of the quality of that data.

On the plus side, the internet allows quick, easy, free access to innumerable quantitative and qualitative datasets, containing humongous amounts of data. Much of this has been collected and presented by professional research teams with considerable expertise. There is scope for historical, longitudinal, and cross-cultural perspectives, way beyond anything you could possibly achieve through primary data gathering. Working with secondary data can save researchers a great deal of time at the data gathering stage, which means more time available for analysis and reporting. And, ethically, using secondary data reduces the burden on potential participants, and re-use of data honours the contribution of previous participants.

There are lots of resources available on using quantitative secondary data. I’m also happy to report that there is now an excellent resource on using qualitative secondary data: Qualitative Secondary Analysis, a recent collection of really good chapters by forward-thinking researchers edited by Kahryn Hughes and Anna Tarrant. The book includes some innovative methods, interesting theoretical approaches, and lots of guidance on the ethics of working with secondary data.

Some people think that working with secondary data has no ethical implications. This is so wrong it couldn’t be wronger. For a start, it is essential to ensure that informed consent for re-use has been obtained. If it hasn’t, either obtain such consent or don’t use the data. Then there are debates about how ethical it is to do research using secondary data about groups of people, or communities, without the involvement of representatives from those groups or communities. Also, working with secondary data can be stressful and upsetting for researchers – imagine if you were working with historical data about the Holocaust, or (as Kylie Smith does) archival data about racism in psychiatric practice in mid-20th century America. Reading about distressing topics day after day can be harmful to our emotional and mental health, and so to our physical health as well.

These are just a few of the ethical issues we need to consider in working with secondary data. Again, it is beyond the scope of this post to cover them all. So working with secondary data isn’t an easy option; although it is different from working with primary data, it can be just as complex. I believe novice researchers should learn how to find and use secondary data, in ethical ways, before they learn anything about primary data gathering and analysis.

The Personal Is Empirical

Human beings are natural researchers: exploring, seeking and comparing data, testing, evaluating, drawing conclusions. We do this all our lives. One of our first research methods, when we are only a few months old, is to put everything in our mouths. By the time we are a few years old we are asking incessant questions. We are programmed to investigate. As we get older, our methods get more sophisticated – and if we train as a professional researcher, they become more systematic, too.

Do you know the roots of the word ‘empirical’? It is derived from the Greek word ‘empeirikos’, meaning ‘experienced’. It means something verifiable by experiment or experience. So, the personal is empirical.

Autoethnographers know this already. For a generation now autoethnographers have been ‘utilizing personal stories for scholarly purposes’ (Chang 2008:10). Some have put too much emphasis on the personal stories and not enough on the scholarly purposes, leading to accusations of self-indulgence, navel-gazing, and irrelevance. More, though, have worked to link their personal experience with other data and wider narratives, theory, evidence, policy, and practice, in a systematic and rigorous way.

Embodied researchers also know that the personal is empirical. They focus on the physical, sensory dimensions of experience, as part of the data they collect. This subverts the conventional view of scholarly work as entirely cerebral – or, as the embodied researchers would have it, ‘disembodied research’. Embodied research is also open to accusations of self-interest and irrelevance. Yet embodied researchers point out that no research can in fact be disembodied. Even sitting still and thinking is a physical activity; the brain with which you think forms part of your body.

Other researchers draw on the personal in other ways. In my work on creative research methods, I have been astonished by the number of people who combine their artistic skills, or their writing talents, or their aptitude for making, or their technological savvy, or some other personal attribute with their research. This usually results in enrichment and often innovation, yet even now working in these ways can feel like swimming against the tide. The way we try to contain knowledge in silos, and reify specialisation, is not the norm in human history. It is not long since nobody thought it strange for someone to be both weaver and astronomer, doctor and poet, musician and engineer. Why have we forgotten that ‘the more diverse someone’s knowledge, the more likely they are to be able to identify and implement creative solutions to problems’? (Kara 2020:11).

Musing on all of this, I came up with the phrase ‘the personal is empirical’. I tried it out on a group of students last month and it went down well. Then, like a good scholar, I checked to see whether anyone else had used the phrase already. It was used by one US academic, most recently around 15 years ago. She was a feminist too and I guess for her, as for me, the generation of this phrase was influenced by the old feminist mantra that ‘the personal is political’. Nobody owned that phrase, and nobody owns this one either – you’re free to use it if you wish.

In fact, it would be great if you did. Because we need more people to understand that ‘knowledge is worth having, no matter where it originates’ (Kara 2020:11) – whether that is in the body, or someone’s wider life experience, or in a test tube, or an encounter with a book, or a conversation, or an animated film. As a species, as inhabitants of planet Earth, we have a plethora of problems to solve. We cannot afford to reject knowledge, or create hierarchies of knowledge; we need to value everyone’s expertise. And their experience. And experiments, and evidence, and theories – the whole lot. In fact, it is all empirical, but nobody will argue if you talk about empirical experiments or empirical evidence. The personal is empirical? That’s more provocative. So take this toy I have given you, my dear ones; take it and play!

This blog, and the monthly #CRMethodsChat on Twitter, is funded by my beloved patrons. It takes me at least one working day per month to post here each week and run the Twitterchat. At the time of writing I’m receiving funding from Patrons of $70 per month. If you think a day of my time is worth more than $70 – you can help! Ongoing support would be fantastic but you can also make a one-time donation through the PayPal button on this blog if that works better for you. Support from Patrons and donors also enables me to keep this blog ad-free. If you are not able to support me financially, please consider reviewing any of my books you have read – even a single-line review on Amazon or Goodreads is a huge help – or sharing a link to my work on social media. Thank you!

Research and Stories, Part 2

My recent post Research Is All About Stories got a big reaction on the socials. I encouraged people who tweeted me to add their comments to the blog, which several of them did. They made some really useful points that I’m going to amplify in this post. Also on Twitter Hoda Wassif recommended The Science of Storytelling by Will Storr which I am now reading. It’s an excellent book and quite an eye-opener, even to someone who has been interested in stories and storytelling for many years.

In my last post I said that stories are used all around the world, and I stand by that, but I have learned from Storr’s book that there are cultural differences in the types of stories which are told. Stories told in Europe (and therefore, by extension, stories told by European settlers and their descendants in the US and Canada) generally focus on a courageous individual who can create change, and have a clearly defined ending. Stories told in China usually focus on a group or community, involve multiple perspectives, and have an ambiguous ending which the reader can figure out as they please. European readers take pleasure in a story’s resolution; Chinese readers take pleasure in deciding on their preferred solution to narrative puzzles.

Of course it’s not quite that simple. There are elements of ambiguity to the ending of some European stories, and I would suspect there are elements of resolution to the ending of some Chinese stories. And other cultures treat stories differently again. The Indigenous writer Jo-ann Archibald/Q’um Q’um Xiiem, in her book Indigenous Storywork, tells us that in the oral traditions of Indigenous peoples, stories are used for many purposes, such as education, entertainment, healing, ritual, community, and spirituality. A storyteller will select a story for a particular occasion and reason, and will tell it in their own way, as honestly and clearly as they can. The listener is expected to listen fully, engaging their emotions as well as their cognition, and visualising scenes and interactions.

The key point for us, as researchers, is to understand that if we are using stories with participants and/or audiences from a variety of cultures, they may have a different understanding of what constitutes ‘story’ and what stories are for. We need to know about this if we are to do our work effectively.

In response to my last post on stories, Pauline Ridley helpfully questioned my assertion that ‘we all do know, when we read or hear or watch a narrative, whether it tells a truth’. She pointed out that ‘Unfamiliar stories, outside the listener’s experience, may take longer to penetrate before they ring true.’ This chimes with the information I have gathered about the different ways in which stories are told and used within different cultures. I should know better by now than to treat anything as widespread as stories as a single homogenous category, but clearly I have some way to go!

Damian Milton and Olumide Adisa on Twitter, and Hala Ghanem on the blog, all made the important point that we need to consider who is telling a story and whose stories are being told – and heard, and acted upon. Storytellers have power, and for some years researchers thought a good, ethical, use of our power was to use our stories to ‘give voice’ to marginalised people. More recently we have begun to see this as paternalistic and to recognise that others’ voices are not ours to bestow. Marginalised people already have perfectly good voices, which researchers might usefully amplify at times, by helping to ensure those voices are heard by people in power.

One reason stories are useful for research is that a story poses and investigates a question. So does a research project, albeit in a different way, but the parallel is clear. Stories are useful for research in a multitude of ways: in funding applications, as data, in reports and presentations, among others. I’m not sure it would be possible to complete a research project without involving a story somewhere, somehow. Anyway, I wouldn’t want to try. My human brain is hardwired to create stories; I would rather recognise and acknowledge this, and work with it rather than against it. Bring on the stories!


Research Is All About Stories

My assertion that research is all about stories is probably less divisive and controversial now than it was 15 years ago when I was finishing my PhD. Still, I’m sure there are plenty of researchers who would disagree. Let me put my case and see whether I can convince some of them to come over to the fun side.

Stories are a key part of how human beings interact. To the best of my knowledge, there is no human community or culture in the world which does not use stories to communicate. We also use stories for entertainment – skilled oral storytellers and story singers have been popular entertainers since time immemorial, and the huge popularity of more recent media such as books and films speaks for itself.

I have argued earlier on this blog that stories are also valuable for learning. Communication and learning are central to research, and there is a role for entertainment, too. So we can see that stories might be a good fit. But, Helen, you might be saying at this point, shouldn’t research be about facts and the truth? Well now, let’s think a little about truth. In the English oral storytelling tradition, a teller will sometimes close a story with a short rhyme:

The dreamer awakes, the shadow goes by,
I told you a tale, my tale is a lie.
But heed to me closely, fair maiden, proud youth,
My tale is a lie – what it tells is the truth.

In a journal article I wrote with Lucy Pickering on the ethics of presentation, we said something very similar in a more academic way. Drawing on the work of Bakan and others, we distinguished between ‘literal’ truth and ‘real’, or authentic, truth. The former deals with facts; the latter deals more with feelings, with what ‘rings true’, to use a metaphor whose source seems lost to history. Blacksmiths? Musicians? Campanologists? Who knows? But we all do know, when we read or hear or watch a narrative, whether it tells a truth.

Lucy Pickering and I argued that research needs an appropriate balance of literal and authentic truth. That balance will shift between topics and disciplines, but there always needs to be some of each. Even in the most quantitative research, a story is still necessary; the researcher can’t simply present pages and pages of tables, calculations, graphs and charts without a written narrative directing the reader to the salient points – how this calculation was chosen, why that outlier is important, the implications of the significance level for practice and policy.

Scholars of story Louise Phillips and Tracey Bunda, in their excellent book Research Through, With And As Storying, suggest that stories can be experienced as theories. I agree with this, and would extend it to suggest that theories can be experienced as stories. In fact I could go further and say that theories used and/or developed by researchers, whether formal or informal, are stories: stories about how the world can be shaped and about how we see the world.

In Unflattening, Nick Sousanis describes stories as ‘that most human of activities, the framing of experience to give it meaning’ (p 95). Which is exactly what researchers do, especially if they are using qualitative techniques.

Asking ourselves the question, “What’s the story here?” can be helpful at many points in research work. We should have a clear story to tell of why we are doing our research, and another to explain what the research is about. When we come to report on our research, whatever the medium – written, presented remotely, presented in person, video, animation, multi-media, whatever – we should be using stories. Stories are engaging, informative, and memorable. Surely that’s exactly what we want our research to be.


History, Truth, Research and Choices

I didn’t get on too well with history at school. It was all about kings and queens and battles, people and events I couldn’t identify with. I enjoyed historical novels if they were about times that had relevance for me, e.g. the first world war (in which my maternal grandfather fought) or the second world war (in which my paternal grandfather fought). But in general I preferred the contemporary world I knew, and books and films set there.

In the late 1980s I discovered revisionist history. I loved The Women’s History of the World by Rosalind Miles (later rebranded as Who Cooked the Last Supper?), which was an eye-opening book, clever, funny, and a welcome counterpoint to all the male-dominated history I’d read. I was fascinated by Peter Fryer’s books Black People in the British Empire, which demonstrated that the British empire was based on exploitation and oppression, and Staying Power: The History of Black People in Britain, which showed how Black people had been present and influential in British society for two thousand years. (The link is to a recent edition of this book with a new foreword by Gary Younge – if you haven’t come across it and you’re interested, I would recommend a read.)

More recently I have read Inglorious Empire: What the British did to India by Shashi Tharoor (2017), An Indigenous Peoples’ History of the United States by Roxanne Dunbar-Ortiz (2014), and The Inconvenient Indian by Thomas King (2013). I would recommend each of these books for their perspective, dignified approach, and eloquent writing.

At the start of lockdown, some kind neighbours along my street set up a book exchange for our community outside their house. A few weeks ago I found a copy of The American Future by Simon Schama, a high-profile and respected British academic historian, award-winning writer and broadcaster. This book has four sections:

  1. American War (civil war, World War Two, Vietnam)
  2. American Fervour (religion – mostly Judeo-Christian)
  3. What is an American? (immigration, primarily of Germans, other Europeans, Mexicans and Chinese people)
  4. American Plenty (shift in mindset from infinite to finite availability of land and resources)

With my new awareness of the position of Indigenous peoples in the US, thanks to the work of Roxanne Dunbar-Ortiz and Thomas King, I wondered what Schama said on the subject. The subtitle of his book is A History From The Founding Fathers to Barack Obama, which didn’t fill me with optimism. And sure enough, Indigenous people barely feature in sections 1-3. There is a brief acknowledgement in the prologue on page 14 that ‘Native American tribes’ in Iowa might have had a different viewpoint from ‘Canadian troopers’ on whether Iowa had ever experienced war. There is a brief mention on page 114 that in the late nineteenth century, the army was involved in ‘finishing off Native Americans’. There are other such passing mentions – until section 4, pages 316-330, a subsection called ‘White Path 1801-1823’, which tells the story of the Cherokee people in Tennessee. Schama evidently attempts a reasonably even-handed approach: he acknowledges the Cherokee perspective and recognises at least some of the injustice done to them through broken promises, land grabs and forced relocations. He describes president-to-be Andrew Jackson as ‘unexpectedly brutal’ and says that ‘extinction’ [of Indigenous peoples] ‘was an actual policy determined by actual men’ (322). Schama also describes Jackson as ‘the ethnic cleanser of the first democratic age’ (326).

The story of American history from the late 18th century to the present day is told very differently by Dunbar-Ortiz. She acknowledges Jackson as ‘the implementer of the final solution for the Indigenous peoples east of the Mississippi’ (96). She points out that ‘In the 1990s, the term “ethnic cleansing” became a useful descriptive term for genocide.’ (9) And she identifies ‘four distinct periods’ where documented policies of genocide were created by US administrations. The first is the ‘Jacksonian era of forced removal’, and then ‘the California gold rush in Northern California; the post-Civil War era of the so-called Indian wars in the Great Plains; and the 1950s termination period’ (9).

Having already read Dunbar-Ortiz and King, I found that Schama’s telling of the story involves a lot of erasure of Indigenous peoples. And sometimes, due to his narrative choices, his writing seems quite tone deaf. ‘The dream of American plenty for the ordinary man was born from Andrew Jackson’s determination to evict tens of thousands of Indians – Chickasaw, Choctaw, Seminole and Creek as well as Cherokee – from the only homelands they had ever known, because they happened to be in the way.’ (323) Recognition of Andrew Jackson’s atrocities doesn’t hide the division Schama draws between ‘the ordinary man’ and ‘Indians’. That raises a whole bunch of ugly questions. He doesn’t engage with any of them.

Dunbar-Ortiz writes about the impact of history itself as its scholars work to protect ‘the origin myth’ of the Founding Fathers and independence. That origin myth ‘embraces genocide’ (2) which is ‘often accompanied by an assumption of disappearance’ (xiii). I see this in Schama’s engaging, entertaining, readable writing: the overall message is that some Indigenous people were badly treated, a long time ago, in a sub-plot to the major storyline of independence and democracy in a nation of immigrants. A Spectator review on the back of the book reinforces this point by claiming that Schama is ‘weaving the immediate present with [America’s] earliest history’. That ‘earliest history’ is somewhere around 1775. Dunbar-Ortiz, meticulously and forensically, establishes the existence of sophisticated societies and cultures in America thousands of years ago.

Schama’s book was first published in 2008, Dunbar-Ortiz’s in 2014 – but most of her sources are pre-2008, so they would also have been available to him. It is both fascinating and nauseating to read these two very different accounts of what is ostensibly the same history. The authors have completely different perspectives and narratives. And this, for me, is the key learning point. When we conduct research or scholarly work, we bring a perspective and we choose a narrative. Dunbar-Ortiz is open about this, talking about starting a dozen times before she settled on a narrative, and outlining where she sits within relevant debates around Native American scholarship (xii-xiii). Schama simply launches into an authoritative tale.

The narratives selected by researchers and scholars both reveal and conceal. It is not possible to tell everything that could be told. With this comes huge responsibility. We need to tell the most important, most necessary stories – but that in itself raises new questions. Most important and necessary to whom, for what, and why? Which other stories could we tell? How do we know those stories are not every bit as important and necessary? With the story we choose to tell, how can we acknowledge what we are leaving out as well as what we are focusing on?

This is a complex business and there are no easy answers because each case will be different. What is essential is to be aware of the issues and to use our authorial power as wisely as we can.


In Praise Of Not Knowing


Perhaps we learned to dread saying ‘I don’t know’ at school, where it didn’t impress our teachers. Or maybe it’s the human desire for certainty and predictability which makes an ‘I don’t know’ so unwelcome. Anyway, we are supposed to know things, us researchers and scholars. There have been times when, in a conversation, someone has used an acronym or referenced a person or organisation I hadn’t heard of, and I have nodded wisely while making frantic mental notes to look it up later. Indeed there is a small industry making money out of this tendency by publishing Bluffer’s Guides to a variety of topics such as social media, wine, cycling and Brexit. These Guides are apparently designed to amuse, inform, and enable the reader to hold her own in any conversation on the subject. By their very existence they discourage the use of ‘I don’t know’.

But think about it: do you know someone who always has an answer for everything? I have met several people like that in my life. Aren’t they annoying?

I think an honest ‘I don’t know’ has a lot going for it. For a start, I think it is useful to acknowledge to ourselves when we don’t know something. Then we can find out, either consciously or subconsciously. I had an email recently from someone I care about, asking for my help in solving a personal problem. As I read the email, I saw that their problem was quite complicated, and realised I didn’t have a ready answer. I finished reading and turned to a different task. Half an hour later I read the email again – and this time I was able to formulate a response. The part I think of as my ‘back brain’ had been working on the problem while I was otherwise occupied, and had come up with a solution. I love it when this happens. It’s where we get the phrase ‘sleep on it’ – if you fall asleep at night thinking of a problem you need to solve, you may well wake up in the morning with a solution in your mind.

I also think it is useful to acknowledge to other people when there is something we don’t know. In the conversations where someone talks about something I haven’t come across, these days I ask them what the acronym means or who the person or organisation is and why they’re relevant to our conversation. This enables better quality communication and discussion. I also own up to not knowing when I’m teaching. I often teach doctoral students who are, by definition, clever and knowledgeable people. This means they sometimes ask me questions to which I don’t know the answer, and for which there is no Bluffer’s Guide – and anyway, trying a bluff on a room full of doctoral students would not be a good idea. So I say, ‘I don’t know,’ and add, ‘but maybe someone else here does?’ And, very often, they do.

I wonder whether part of the problem with ‘I don’t know’ is that acknowledging it, to ourselves or to others, takes some confidence. Confidence that we can find out; confidence that others won’t think badly of us… Actually, it seems to me that many people respect you more if you are honest about what you don’t know, because then they can have more faith in what you claim you do know.

Having said that, it is also important to be flexible about what you know, to allow for the possibility of change. I knew some things when I wrote the first edition of my book on creative research methods in 2015. Then I learned more, including some things that contradicted parts of what I knew before, so in my second edition I acknowledged and explained these changes of mind. I don’t think this invalidates my work. Nobody can know everything, and what we know changes with time as we learn more, just as we learned that the earth is round, not flat, and that fatal diseases can be eradicated with vaccines. ‘I didn’t know that’ is part of the ‘I don’t know’ family, and just as valuable.

Not knowing something is the foundation for research, because we do research to generate new knowledge. Students sometimes say to me, ‘I don’t know if I’m doing my research right.’ I say, ‘If that’s how you feel, you probably are doing it right.’ Then they look at me like Luke Skywalker looks at Yoda when he has just said something particularly cryptic, so I tell them all research is built on uncertainty; if they already knew whatever it is they want to find out, there would be no point in doing their research in the first place.

Perhaps the hardest part is the way all of our lives are currently built on uncertainty. When and how will this pandemic end? Who will be alive when it does? What will the world be like? Of course, knowing the future was always an illusion, but our plans were usually enacted, which made the illusion seem real. Now it may feel pointless even to make a plan. More and more people are talking about our current predicament as ‘the new normal’, and I recognise in this an understandable reaching for certainty. But not much is normal about the way we are currently living, and we may find we can deal with that better if we embrace the uncertainty and face up to what we don’t know.

This blog, and the monthly #CRMethodsChat on Twitter, is funded by my beloved patrons. It takes me at least one working day per month to post here each week and run the Twitterchat. At the time of writing I’m receiving funding from Patrons of $55 per month. If you think a day of my time is worth more than $55 – you can help! Ongoing support would be fantastic but you can also make a one-time donation through the PayPal button on this blog if that works better for you. Support from Patrons and donors also enables me to keep this blog ad-free. If you are not able to support me financially, please consider reviewing any of my books you have read – even a single-line review on Amazon or Goodreads is a huge help – or sharing a link to my work on social media. Thank you!

Research Work In Lockdown

This week’s blog is a short podcast produced by the lovely people from Policy Press and featuring an interview with me about research work in lockdown. The podcast was featured earlier this week on the Policy Press blog, Transforming Society, which is well worth a follow, not only for its excellent content but also because followers get a hefty 35% discount on all Policy Press books. You can listen to the podcast on Soundcloud, Apple Podcasts or Spotify – just click on the link for the platform you prefer.

Writing Is A Research Method

It has always struck me as odd that people don’t recognise writing as a research method. I doubt there is a single piece of formal research in the Euro-Western world which doesn’t involve writing. Yes, we could make all our reports as videos, but those videos would need scripting, and that requires words. Writing is also one of the ways in which we, as researchers, exercise our power. You may not think of yourself or your writing as powerful, yet writing is an act of power in the world. I was reminded recently by a colleague that my words on this blog are powerful. I’d forgotten. It’s easy to forget, but we need to remember.

Writing, in Euro-Western research, is universal. It’s the one method used regularly by both quantitative and qualitative researchers. Perhaps that’s why it isn’t recognised as a method, because it unites us rather than dividing us. But it is a method, and I would argue that it is a qualitative method. We can’t do research without writing, and how we write affects the ways our work is understood and used by other people.

I’ve been interested in the terminology around the COVID-19 pandemic, which I think provides a useful example. Last week I wrote a post about self-isolation. Following a lot of travelling the previous week, I’ve been voluntarily staying at home, seeing only my partner and a couple of delivery people. One friend challenged my use of the term ‘self-isolation’, saying that in their view I was doing social distancing because I wasn’t sleeping separately, staying 2m away from my partner, or using separate washing facilities, and I was still taking deliveries in person. I could see their point, though I know others are using the term ‘self-isolation’ in the same way as me. My view of social distancing is that it is more about literally keeping our distance from each other in public places. But these are new terms and we’re all trying to figure this whole thing out while it’s happening.

However, neither is a particularly lovely term, and I have appreciated the appearance of alternatives. The first I saw was, I think, a Facebook post taken from Instagram (I can’t remember who generated either post now – my apologies; if it was you, or you know who it was, please comment below and I’ll edit to credit). The post suggested that we’re not doing social distancing, we’re doing physical distancing for social solidarity. I really liked that concept. Then yesterday Leo Varadkar, Taoiseach of Ireland (and a doctor), spoke of cocooning, and I heard that Americans were talking of ‘shelter in place’.

While I have no evidence for this beyond my own reactions, I suspect that more positive terms are likely to lead to more acceptance. Asking someone to isolate themselves has connotations of loneliness, sadness, and prison (which also has associations with the term ‘lockdown’ currently in use around the world). Physical distancing sounds easier and more accurate than social distancing, and coupling it with social solidarity makes it feel stronger and more righteous. Cocooning makes me think of cosiness and warmth, plus it rhymes (or almost) with other gentle words like soothing and crooning. Asking someone to shelter in place has connotations of home, familiarity, and safety.

As researchers, we often have new information to impart and we sometimes arrive at new concepts which need to be named. There are a whole bunch of words and phrases for us to choose from in writing each new sentence. The words and phrases we use can make a great deal of difference to how our work is received. This means we need to take care in choosing our words and phrases, and in putting them together to make sentences, and in putting sentences together to make paragraphs. These tiny laborious steps are like the strokes of an artist’s brush or the stitches from a crafter’s needle: the beating heart of the writer’s art.

The Power Of Naming

When I first learned about research, as a student of Social Psychology at the London School of Economics in the early 1980s, the people we collected data from were called ‘subjects’. They were subject to our research, and subjects of our research; we were (told we were) the objective neutral researchers with the power to collect and analyse data. That power came from knowing how to do those things: special, arcane knowledge available only to insiders, i.e. those with enough educational capital.

By the time I got back into research, around the turn of the century, researchers had begun to acknowledge that positivism might not be the only game in town. The terminology had moved on and ‘subjects’ were becoming more widely known as ‘participants’. We felt good about this: instead of subjecting people to our whims, we would let them join in with our research (up to a point, mostly defined by us). How kind.

I’m beginning to think it’s time for another shift. I’m enjoying the way some researchers are being creative here, such as Alistair Roy with his ‘tour guides’. However, while that term works well for Roy who conducts walking interviews with marginalised young men in cities, it’s not universally applicable. So I’m wondering about… contributors?

I also think it might be time to rethink ‘data’. The word is drawn from Latin: ‘data’ is the plural of ‘datum’, meaning ‘something given’. Yet more often data is something researchers take and keep. The ability to classify things as ‘data’ has enabled serious abuses, some of which are still ongoing today. For example, in her magisterial book An Indigenous Peoples’ History of the United States, Roxanne Dunbar-Ortiz demonstrates that Euro-Western researchers retain the human remains and burial offerings of millions of Indigenous people by classifying them as ‘data’. For Indigenous peoples, these remains and offerings are sacred, yet Euro-Western researchers continue to ignore their requests for the return of their sacred objects, using ‘science’ as the reason. On this basis it might make sense to reword ‘data’ as ‘loot’ or ‘swag’.

Another option would be to refer to people who provide information for research as ‘people’ and to the information they provide as ‘information’. I’m in favour of this because it has a levelling quality, especially if we researchers also refer to ourselves as ‘people’. It saves us from the irregular verb effect: I am a researcher, you are a participant, they are users of research.

All this is still researcher-led, though, so potentially paternalistic (or, in my case, perhaps maternalistic?!). A further option could be to let people who contribute to research decide how to define both their roles and what they offer to the process.

Some readers may regard all this as quibbling over semantics. However, given the strength of the relationship between language and thought, it seems to me important to consider these issues. Names have power: power to identify and classify. When we name individuals, roles, groups, artefacts, we are saying something about how we see the world. As always we need to use this power with care.

Australasian Research Ethics

Systems of research ethics regulation differ around the world. Some countries have no research ethics regulation system at all. Others may have a system, but if so it is available only in their home language, so people like me who speak and read only English are unable to study it (Israel 2015: 45). The main English-speaking countries tend to have formal systems of research ethics regulation, which grew out of biomedical research in response to ethical crises such as Nuremberg and Tuskegee. These are usually implemented through research ethics committees or their equivalents, such as institutional review boards in the US.

One big difference in Australasia is that work on research ethics by and for Indigenous communities seems to be further ahead in Australia and New Zealand than in any other continental region as a whole. Australia has the Guidelines for Ethical Research in Australian Indigenous Studies produced by the Australian Institute of Aboriginal and Torres Strait Islander Studies (AIATSIS). AIATSIS is a statutory organisation, set up by white settlers in the 1960s and governed by a Council, with the first Aboriginal Council member joining in 1970. The Council is now predominantly made up of Aboriginal people and Torres Strait Islanders. The latest edition of the Guidelines is dated 2012 but they are under review at the time of writing. In New Zealand, Māori people with experience from research ethics committees came together to write Te Ara Tika, a document offering guidelines for Māori research ethics published in 2010. These kinds of guidelines help Indigenous peoples to claim their right of research sovereignty, i.e. control over the conduct of and participation in research that affects them. However, they are not necessarily aligned with each other, or with other systems of ethical governance for research that may exist in the same jurisdictions. This may hamper collaborative or multi-area research and lead to increased separation rather than reconciliation between peoples (Ríos, Dion and Leonard 2018).

So it’s a complex and fascinating picture. I am fortunate to be working on a project at present with three experts in Australasian research ethics: Gary Allen, Mark Israel, and Colin Thomson. (The sharp-eyed among you may notice that I cited Israel in the first paragraph above. He has written a rather good book on research ethics subtitled Beyond Regulatory Compliance and now in its second edition.) Together they are the senior consultants of the Australasian Human Research Ethics Consultancy (AHRECS), established in 2007 to provide expert consultancy services around research ethics in Australasia and Asia-Pacific. AHRECS also works with Indigenous consultants from both Australia and New Zealand, one of the latter being Barry Smith who is a co-author of Te Ara Tika.

The amount of expertise in AHRECS is enormous. Better still, they share some of this expertise with anyone who signs up for their free monthly e-newsletter on research ethics (and I can confirm from experience that they don’t spam you). Link here (scroll down, it’s on the right). Their blog provides a useful archive and they accept guest posts on relevant topics; I have just written one for them on The Ethics of Evaluation Research. So you get two for the price of one this week!
