4 tips on running research campaigns with impact from CNN’s Head of Content Insight
By Duncan Smith

Dr. Hamish McPharlin started out as a junior researcher at Decipher Media Consultancy, and worked his way to Director of Research, running market research studies for brands, agencies and publishers. “But at the back of my mind, I always had this unsettling feeling,” he reveals. “Every time I delivered a study I would think about how my clients had to live and die by the research I was providing, whereas I was just handing it over and moving on to the next one.”
Looking to better understand how research is used once it leaves his hands, he joined BBC Global News as Head of Insight. Now Head of Content Insight at CNN, Hamish shares his tips on running research campaigns with impact.
1. Dump the data dump
Once I moved to the publisher side, it quickly became evident that internal stakeholders won’t care about data unless it’s clear what they are supposed to do with it. They are busy and the last thing they need is you creating more things for them to look at without a compelling reason for them to take action. You want them to be interested every time you have something to say.
So, how do you present your data in a compelling way that drives action? To me, there are two key stages, and producing the data is only the first. If you create 50 slides on every single angle you've discovered, or share a massive data table with 10 interesting areas highlighted, you've actually only completed half of the job.
What must follow is your attempt to investigate and truly understand what the data is saying, and what the business must do in response. Producing data is easy—understanding what it is telling you, and how it should compel your business to take action is the harder, and much more important bit. Your final delivery to your stakeholder may be as simple as a couple of sentences, but it has the weight of all that data, time and analysis behind it. I tend to presume that if you produce 10 ideas, none will get actioned, but if you produce 1 or 2, they will.
Don’t be in love with your creation. Do whatever it takes to get to the essence or synthesis of what you’re trying to say. There are a lot of talented analysts who can produce very impressive charts that carry a lot of information. However, I increasingly find myself simplifying things as much as I can. That could mean designing an extremely simple graphic where 90% of the data isn’t even included. When you’re zoomed right in on the key insight, it’s much more effective.
At CNN, our internal insight slides almost always follow a simple template: one sentence explaining what happened and one sentence saying what to do about it. And that’s it. It’s harder than it sounds. As researchers, often the insight we discover is complicated and littered with caveats; getting it down to a couple of sentences is a real challenge. But this process forces us to really understand what we are saying. If you can’t write down that second sentence, you need to ask yourself: why are you creating this slide?
It takes a little more effort to push yourself like this but it’s very healthy. We need to take on the responsibility rather than pass the buck to our stakeholders.
2. Plan to fail
Research teams don’t really have revenue targets, but I do believe we’re still expected to show a return on investment. Often we don’t think about it like that, because most of the time it can’t really be tracked back to a sale, but it is still true. When a company signs off budget for a new thought leadership study, the business will expect to be able to market that research. It’s down to the research team to produce a great study that can be used.
Essentially, our reputation as a team within the company rides on it, which gives rise to the uncomfortable pressure of your hypothesis being right. The key to managing this is to build your study around exploring a topic from a number of angles, rather than putting all your eggs into one basket by trying to focus on proving that ‘A is better than B’.
It’s OK to have a hypothesis in mind but I put a lot of contingencies into my studies; additional questions, multi-method techniques, demographic profiling. I cram them in! What are the other four or five possible results that might reveal themselves? How can I construct my study to deliver on those findings if they are true? What new questions can be added to the survey to explore more deeply the nature of the ‘engagement’ that we think we’re going to see? Would adding a verbatim section or supplementing with qualitative interviews help us answer the question of ‘why’ we’re seeing what we’re seeing?
And demographics are key if your top level findings are inconclusive. Perhaps your research on a new ad format doesn’t show a difference in engagement but what if you dig below the surface? How does it perform amongst young people? Business leaders? Heavy users? Build depth into your methodology from the outset, and never assume your initial idea is going to carry the day.
When you’re sitting at the end of your fieldwork with a dataset and no way to go back and do it again, this will allow you to turn the ‘Rubik’s Cube’ a few more times to find the detail that might lie beneath the surface.
3. Create an all-star cast
Digging through all the data to find answers is such a thrilling thing to do. It’s like being an archaeologist. I’ve got a little brush and I get to dig and see what new truth emerges that no one has seen before.
But if any study that I produce looks impressive, that was never me alone. That was me working with a group of people, constantly challenging each other. You need those people around you. Conceiving and standing up a study is complicated; there are a lot of moving parts, and the devil is always in the details. Having people involved means that someone may spot a problem or suggest an idea that you didn't think of. I once had a study where a single suggestion from a colleague in the final week ended up producing the centrepiece insight of the whole study.
4. Be a methodology mixologist
When I was a junior researcher, I won a bid to do a big study into the future of media consumption. When I went to the kick-off meeting, my client told me, "We liked you guys. But we also liked some other folks too, and we want you to work together on this one."
To young me, it sounded difficult and alarming to be working with a competitor. But it was an incredible experience. Two companies with different strengths got to meet each other, bring our methods and ideas to the table, and work side-by-side towards a common goal. By combining our strengths (research design, recruitment, creative stimulus, data analysis) with their strengths (biometric lab-based fieldwork), we were able to produce something better than the sum of its parts. In addition, I learned a huge amount by exposing myself to new ideas and approaches, and ended up commissioning the same company years later when I was a client.
As a client, that experience taught me to be bold and creative with methodologies; I have also briefed multiple companies to work together, and this has yielded amazing results. The key is to look for complementary strengths: qual and quant, tech and lab work, creativity and science.
I like the idea of multiple research companies collaborating on a project. Media and advertising is awash with studies that repeatedly use a handful of methodologies. Combining companies and their approaches to a question can lead to something unique and special. Often it means companies working together that might consider themselves competitors, but I tend to believe bringing together specialist expertise is a great way to unearth genuinely new insights.