
Perhaps we learned to dread saying ‘I don’t know’ at school, where it didn’t impress our teachers. Or maybe it’s the human desire for certainty and predictability that makes an ‘I don’t know’ so unwelcome. Anyway, we researchers and scholars are supposed to know things. There have been times when, in conversation, someone has used an acronym or referenced a person or organisation I hadn’t heard of, and I have nodded wisely while making frantic mental notes to look it up later. Indeed, there is a small industry making money out of this tendency by publishing Bluffer’s Guides to a variety of topics such as social media, wine, cycling and Brexit. These Guides are apparently designed to amuse, inform, and enable the reader to hold her own in any conversation on the subject. By their very existence they discourage the use of ‘I don’t know’.
But think about it: do you know someone who always has an answer for everything? I have met several people like that in my life. Aren’t they annoying?
I think an honest ‘I don’t know’ has a lot going for it. For a start, I think it is useful to acknowledge to ourselves when we don’t know something. Then we can find out, either consciously or subconsciously. I had an email recently from someone I care about, asking for my help in solving a personal problem. As I read the email, I saw that their problem was quite complicated, and realised I didn’t have a ready answer. I finished reading and turned to a different task. Half an hour later I read the email again – and this time I was able to formulate a response. The part I think of as my ‘back brain’ had been working on the problem while I was otherwise occupied, and had come up with a solution. I love it when this happens. It’s where we get the phrase ‘sleep on it’ – if you fall asleep at night thinking of a problem you need to solve, you may well wake up in the morning with a solution in your mind.
I also think it is useful to acknowledge to other people when there is something we don’t know. These days, when someone in a conversation mentions something I haven’t come across, I ask them what the acronym means, or who the person or organisation is and why they’re relevant to our conversation. This enables better-quality communication and discussion. I also own up to not knowing when I’m teaching. I often teach doctoral students who are, by definition, clever and knowledgeable people. This means they sometimes ask me questions to which I don’t know the answer, and for which there is no Bluffer’s Guide – and anyway, trying a bluff on a room full of doctoral students would not be a good idea. So I say, ‘I don’t know,’ and add, ‘but maybe someone else here does?’ And, very often, they do.
I wonder whether part of the problem with ‘I don’t know’ is that acknowledging it, to ourselves or to others, takes some confidence. Confidence that we can find out; confidence that others won’t think badly of us… Actually, it seems to me that many people respect you more if you are honest about what you don’t know, because then they can have more faith in what you claim you do know.
Having said that, it is also important to be flexible about what you know, to allow for the possibility of change. I knew some things when I wrote the first edition of my book on creative research methods in 2015. Then I learned more, including some things that contradicted parts of what I knew before, so in my second edition I acknowledged and explained these changes of mind. I don’t think this invalidates my work. Nobody can know everything, and what we know changes with time as we learn more, just as we learned that the earth is round, not flat, and that fatal diseases such as smallpox can be eradicated with vaccines. ‘I didn’t know that’ is part of the ‘I don’t know’ family, and just as valuable.
Not knowing something is the foundation of research, because we do research to discover new knowledge. Students sometimes say to me, ‘I don’t know if I’m doing my research right.’ I say, ‘If that’s how you feel, you probably are doing it right.’ Then they look at me the way Luke Skywalker looks at Yoda when Yoda has just said something particularly cryptic, so I tell them all research is built on uncertainty; if they already knew whatever it is they want to find out, there would be no point in doing their research in the first place.
Perhaps the hardest part is the way all of our lives are currently built on uncertainty. When and how will this pandemic end? Who will be alive when it does? What will the world be like? Of course, knowing the future was always an illusion, but our plans often worked out as intended, which made the illusion seem real. Now it may feel pointless even to make a plan. More and more people are talking about our current predicament as ‘the new normal’, and I recognise in this an understandable reaching for certainty. But not much is normal about the way we are currently living, and we may find we can deal with that better if we embrace the uncertainty and face up to what we don’t know.
This blog, and the monthly #CRMethodsChat on Twitter, is funded by my beloved Patrons. Posting here each week and running the Twitterchat takes me at least one working day per month. At the time of writing I’m receiving funding of $55 per month from my Patrons. If you think a day of my time is worth more than $55, you can help! Ongoing support would be fantastic, but you can also make a one-time donation through the PayPal button on this blog if that works better for you. Support from Patrons and donors also enables me to keep this blog ad-free. If you are not able to support me financially, please consider reviewing any of my books you have read – even a single-line review on Amazon or Goodreads is a huge help – or sharing a link to my work on social media. Thank you!
It has always struck me as odd that people don’t recognise writing as a research method. I doubt there is a single piece of formal research in the Euro-Western world that doesn’t involve writing. Yes, we could present all our reports as videos, but those videos would need scripting, and that requires words. Writing is one of the ways in which we, as researchers, exercise our power. You may not think of yourself or your writing as powerful, yet writing is an act of power in the world. I was reminded recently by a colleague that my words on this blog are powerful. I’d forgotten. It’s easy to forget, but we need to remember.
When I first learned about research, as a student of Social Psychology at the London School of Economics in the early 1980s, the people we collected data from were called ‘subjects’. They were subject to our research, and subjects of our research; we were (told we were) the objective neutral researchers with the power to collect and analyse data. That power came from knowing how to do those things: special, arcane knowledge available only to insiders, i.e. those with enough educational capital.
Systems of research ethics regulation differ around the world. Some countries have no research ethics regulation system at all. Others may have a system but, if they do, it is only available in their home language, so people like me who only speak and read English are unable to study that system.
When someone mentions research methods, what do you think of? Questionnaires? Interviews? Focus groups? Ways of doing research online? Do you only think of data gathering, or do you think of methods of planning research, analysing data, presenting and disseminating findings?
Have you noticed how people seem to be getting offended about the strangest things? For example, there has been controversy this month over two songs that are regularly played in English-speaking countries at this time of year. The first is Baby It’s Cold Outside, a duet between two people (usually a man and a woman, though the lyrics are not gender-specific). It was written by Frank Loesser in 1944 to sing with his wife as a party trick. One character is persuading a slightly reluctant other to stay in the warm rather than go out into the winter weather. It’s flirtatious and funny.


Last week, for reasons best known to one of my clients, I was reading a bunch of systematic reviews and meta-analyses. A systematic review is a way of assessing a whole lot of research at once. A researcher picks a topic, say the effectiveness of befriending services in reducing the isolation of housebound people, then searches all the databases they can for relevant research. That usually yields tens of thousands of results, which of course is far more than anyone can read, so the researcher has to devise inclusion and/or exclusion criteria. Some of these may be about the quality of the research. Does it have a good enough sample size? Is the methodology robust? And some may be about the topic. Would the researcher include research into befriending services for people who have learning disabilities but are not housebound? Would they include research into befriending services for people in prison?
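Purely as an illustration (this sketch is not from the original post), here is roughly what a first automated pass over that screening step might look like in Python. The record fields, topic terms and sample-size threshold are all invented for the example; in a real review the criteria would come from a registered protocol and the screening decisions would be made, and usually double-checked, by human reviewers.

```python
# Minimal illustrative sketch of first-pass screening for a systematic review.
# All fields, criteria and thresholds below are assumptions for the example,
# not taken from any real review protocol.

from dataclasses import dataclass


@dataclass
class StudyRecord:
    title: str
    abstract: str
    sample_size: int
    population: str  # e.g. "housebound older adults", "prisoners"


# Hypothetical inclusion/exclusion criteria
MIN_SAMPLE_SIZE = 30
INCLUDE_TOPICS = ("befriending", "social isolation")
EXCLUDE_POPULATIONS = ("prisoners",)


def passes_screening(record: StudyRecord) -> bool:
    """Return True if a record meets the (assumed) inclusion criteria."""
    text = f"{record.title} {record.abstract}".lower()
    on_topic = any(term in text for term in INCLUDE_TOPICS)
    big_enough = record.sample_size >= MIN_SAMPLE_SIZE
    excluded = record.population.lower() in EXCLUDE_POPULATIONS
    return on_topic and big_enough and not excluded


if __name__ == "__main__":
    records = [
        StudyRecord("Befriending and isolation", "RCT of a befriending service",
                    120, "housebound older adults"),
        StudyRecord("Peer support in prisons", "Befriending scheme evaluation",
                    80, "prisoners"),
    ]
    shortlisted = [r for r in records if passes_screening(r)]
    print(f"{len(shortlisted)} of {len(records)} records passed first-pass screening")
```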