Content marketing can get attention for your company. Publishing original research can earn something much more elusive: authority.
Producing original research signals that your brand has insights about your industry that no one else owns. Given how hard it is to say something truly new and compelling with content marketing, it’s a wonder more people don’t use original research.
But not all research projects are created equal, and over the years, I’ve learned to spot research inexperience. These eight signals tell me whether a study was designed and produced competently.
1. Tell stories; don’t take inventory
In my 15-plus years of writing and publishing research, this is the No. 1 thing people get wrong: When undertaking a research project, they heed the siren’s call – “Let’s find out all the things!”
After all, when spending all this time and money on a survey, why not ask … everything? The problem is (a) participants won’t complete all of the survey and (b) reading through all those things is downright boring.
Get focused. For example, a broad study about artificial intelligence wouldn’t work well, but research on the challenges of adopting AI tech in health care would. Narrow the area of study so you can extract meaningful, never-before-published insights.
2. Clearly state methodology
A competent study – whether survey-based or otherwise – publishes a clear methodology. It includes the sample size, how respondents were recruited, and demographic summaries relevant to the study (e.g., gender, years of experience, role, geography). These details help your reader gut-check whether your findings are worthwhile.
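One concrete way readers gut-check a methodology is the margin of error implied by the sample size. The article doesn’t prescribe a formula, but the standard approximation for a survey proportion is a short calculation (this sketch assumes a 95% confidence level and the worst-case proportion of 50%):

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Approximate margin of error for a survey proportion.

    n -- number of respondents
    p -- observed proportion (0.5 is the worst case)
    z -- z-score for the confidence level (1.96 for 95%)
    """
    return z * math.sqrt(p * (1 - p) / n)

# A 400-person sample gives roughly a +/- 4.9 point margin;
# 1,000 respondents tighten it to about +/- 3.1 points.
print(round(margin_of_error(400) * 100, 1))
print(round(margin_of_error(1000) * 100, 1))
```

If a report claims a two-point difference between groups on a 200-person sample, this quick check shows the margin of error swamps the finding.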
3. Sample competently or be transparent about potential bias
How did you source respondents for your survey? Did you use a panel and try to get an accurate sampling based on the underlying population? While ideal, this approach isn’t always cost-effective. For niche groups, panels can cost well over $50 per complete survey, a price many companies simply can’t pay.
Sampling your own audience is free and effective, but you should be transparent about the biases that may surface based on that audience. For example, if you’re determining how SEO-savvy new business owners are, polling members of an SEO tech platform means the results include opinions from people who are more advanced in the subject.
4. Ask questions that don’t push a point of view
You think you’re being sneaky, but I see you. Don’t design questions to push your product or service.
The funny thing: If you ask a patently self-promotional question, your survey takers will kick your butt. I was once involved in a study where the client asked a barely disguised question about people’s preference for their product or their competitors’. The smart survey takers smelled a rat and chose the more “primitive” product from the lineup. It was a bit of a “&%$# you and the horse you rode in on.” It made me giggle.
5. Work with a report writer who speaks data
Not all good writers have the skills to report your research findings. You want a writer-analyst, not just a writer – someone who won’t merely regurgitate the basic facts of the survey but will put those findings in context and explain why they’re fascinating. Experienced journalists are excellent analysts, and I seek them out for tough research-reporting assignments.
6. Tie to relevant, timely, and properly cited third-party research
Excellent research reports tie in other research data to make their cases. For example, if you do a research study about telemedicine in health care, your audience will respect the findings even more if you mention other respectable research organizations uncovering similar issues. But for the love of all things nerdy, please don’t quote a third-party study if it’s more than two years old. And quite frankly, in pandemic times, I’m loath to quote studies more than three months old. Be smart about what data are truly relevant.
7. Charts and graphs should be able to stand alone
This is a personal pet peeve, but it earns a place on this list. If you publish a blog post or any type of report with original charts and graphs, each one should be able to stand alone. In other words, if someone published just your chart image, it should contain all the information a viewer needs to make sense of that insight – the key finding, the question asked, the sample size, and the source.
Find out more about designing templates for charts and graphs in this article I wrote for the Content Marketing Institute.
8. Design visuals for clarity, not sparkle
We all get taken in by cool visualizations. Which of us hasn’t swooned over Edward Tufte’s books? (OK, maybe not all of us, but I know I’m not the only one.) But let’s get real; few (I daresay none) of us need to channel Beautiful Evidence to publish thought leadership research. Your mantra should be KISS (keep it simple, stupid).
Always design for clarity, and don’t be afraid of the good ol’ bar or line chart if you’re inexperienced. My favorite book for chart and graph design is The Wall Street Journal’s Guide to Information Graphics. There is time, young grasshopper, to graduate to spider charts and heat maps.
Respect the signals
Writing all that is a little painful because I sound like a research scold, but these early signals indicate whether a research project is being run competently. I share them with a tremendous amount of humility, having made plenty of mistakes in my years of designing research.
What would you add to the list of good research hallmarks?
Cover image by Joseph Kalinowski/Content Marketing Institute