Wash, Rinse, Repeat: Structured Ideation Leads to Consistently Higher Win Rates

Get more wins and bigger lifts with a reliable toolkit of CRO ideation methods that scales with your program.

A call for better ideas

Ideas are a dime a dozen. It’s true. And it’s also true that the less confidence you had in a winning idea (or, conversely, the more confidence you had in a losing one), the more valuable the practice of experimentation becomes. Put another way, the more surprising the experiment result, the higher your ROI.

Those of us who have been doing this for a while know that the whole argument for experimentation rests on the fact that we, as marketers, product managers, and designers, have very poor intuitions about what will and will not work. Take these statistics, for instance:

  • In an HBR meta-analysis of 6,375 A/B/n tests run on Optimizely’s platform in 2018, only 10.7% of tests reached 90% statistical significance.
  • Google ran approximately 12,000 randomized experiments in 2009, with only about 10% of them leading to business change. (Source)

I’m going to go ahead and pass judgment here: as CROs, we can and should be doing a lot better than a 10% win rate. (Not to brag, but our clients are crushing it by these standards.)
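
If you want to gut-check your own program’s numbers, here’s a minimal sketch of one way to tally a win rate, assuming a simple two-sided, two-proportion z-test and a 90% confidence bar. The helper function and the figures are illustrative only; they aren’t how the studies above were run, and they’re no substitute for your testing tool’s stats engine.

```python
# Illustrative only: a toy test log and a simple z-test, not a production stats engine.
from math import sqrt
from statistics import NormalDist

def beat_control(conv_a, n_a, conv_b, n_b, confidence=0.90):
    """True if the variant converted higher than control AND a two-sided
    two-proportion z-test is significant at the given confidence level."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return p_b > p_a and p_value < (1 - confidence)

# Hypothetical results: (control conversions, control visitors, variant conversions, variant visitors)
tests = [(480, 10_000, 540, 10_000), (300, 8_000, 310, 8_000), (120, 5_000, 160, 5_000)]
wins = sum(beat_control(*t) for t in tests)
print(f"Win rate: {wins}/{len(tests)} = {wins / len(tests):.0%}")
```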

Do experience and talent improve intuitions?

When I first got my start in conversion rate optimization, I had the sense that somewhere out there were people who were “conversion experts,” who knew exactly what would and wouldn’t work for users. Through the years, I have studied the principles of effective design, engaged actively in communities of individuals sharing their successes and failures, and personally witnessed the results of hundreds of experiments. Here, years later, I would bet that my intuitions are only marginally better than those of a random person off the street.

I’ve come to believe, as have many of my peers, that best practices are bogus and cannot be applied with impunity. Simply put, I’ve seen too many of them strike out to put much stock in them, no matter how popular they seem.


Occasionally, an individual with a precocious talent for UX comes along. But even the savants commonly overlook things and fail to anticipate how users will behave. What’s more, these individuals aren’t scalable. You can’t build a success model based on the talents of one individual; rather, we need systematic success.

I consider myself a conversion expert, not because I know how a specific user interface should be designed for greatest impact, but because I know how to reliably and efficiently identify ideas that will optimize that interface better than most intuitions would. (Oh yeah, and I also know how to test those ideas!)

The best sources for optimization ideas

At Search Discovery, we turn to the following toolkit of methods to get at the best ideas. While these aren’t the only sources of quality ideas, these are ones that we’ve found to be accessible, affordable, and reliable for most businesses.

User testing – Relative to 20 years ago, user research is now pretty cheap, fast, and easy. All things considered, unmoderated, remote user testing with a well-screened audience is one of the best sources of optimization hypotheses available. It cuts through your biases and very quickly gives you empathy for the users of your experience.

While there is a (growing) cost to it, and there are plenty of traps to avoid (leading the tester, poor audience screening, “lab setting” influence, overreacting to rare events), you’ll be hard-pressed to find a better way to understand what users struggle with and what they like and dislike.

Comparative analysis – While it’s tempting to ogle at whatever your top competitors are doing, it’s a bad idea to just copy their tactics. However, canvassing a larger competitive set, as well as a few best-in-class peers from adjacent categories, can provide you with some powerful insights. We tend to focus on a few targeted dimensions at a time, such as the use of imagery, value propositions, design/layout, CTAs, navigation, and persuasion tactics. This gives us a manageable workload while ensuring we get a concrete understanding of general competitor practices. It also tends to reveal the language (words/design/holistic experience) that consumers are accustomed to seeing and may expect to find on your site as well.


Ideation workshops – No one is a stranger to brainstorming sessions, and almost as many of us have been disappointed by them. Done properly, group ideation work brings forward a broad range of perspectives and creativity. (Hint: it requires rules, focus, and skilled moderation.) This knocks availability bias on its hindquarters and creates an environment of true collaboration and synergy.


We take many principles from Design Thinking into our workshops and sometimes use the other methods here as a springboard to identify the problems that an ideation session will attempt to solve.

Heuristic analysis – What’s a heuristic? We view heuristics as any established principles, theories, models/frameworks, and so-called best practices that can be applied to web pages and user flows. Heuristics may be grounded in psychology, neuroscience, persuasion, branding, or general marketing/sales, or in the expertise of industry thought leaders. The key is taking a heuristic to the experience to generate ideas, rather than the other way around: generating an idea and then seeking a heuristic to defend it.

I like heuristics because they help me expand and focus my natural thought process while analyzing a page. For example: “What features on this page might cause anxiety or anticipation?” Or, “How might the decoy effect be relevant here?”

Digital behavioral data analysis – This is the work of a digital analyst, but one skilled at combining data with UX to understand and hypothesize about human behavior and user experience. The work may involve core digital analytics tools such as Google Analytics, supporting tools such as Decibel Insight, or raw data tables queried with SQL or explored in Tableau, but it always ties back to the actual user interface and experience.

Often, data analysis accompanies ideas harvested from other methods, both to size the opportunities and to validate a hypothesis about behavior. In fact, behavioral data analysis often starts with a hypothesis about a behavior and ends with a hypothesis about how to shape that behavior differently.
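
To make the sizing piece concrete, here’s a minimal sketch built on a made-up funnel; the step names, counts, and assumed lift are purely illustrative, not a prescription for how the analysis should be run.

```python
# Illustrative only: hypothetical funnel counts; swap in numbers from your analytics tool.
import pandas as pd

funnel = pd.DataFrame({
    "step": ["product_page", "add_to_cart", "checkout", "purchase"],
    "visitors": [100_000, 22_000, 9_000, 5_400],
})
funnel["step_conversion"] = funnel["visitors"] / funnel["visitors"].shift(1)
funnel["drop_off"] = 1 - funnel["step_conversion"]

# Size a hypothesis: if a test lifted add_to_cart -> checkout by 5% (relative),
# how many extra purchases would flow through, all else held constant?
assumed_relative_lift = 0.05
extra_checkouts = funnel.loc[2, "visitors"] * assumed_relative_lift
extra_purchases = extra_checkouts * (funnel.loc[3, "visitors"] / funnel.loc[2, "visitors"])

print(funnel)
print(f"Estimated additional purchases per period: {extra_purchases:.0f}")
```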

Voice of Customer (VOC) analysis – In my humble opinion, every single person working in a marketing or product discipline should be required to read or listen to verbatim words from a customer on a weekly basis. It’s far too easy to become a detached armchair philosopher about what the customer needs, wants, and is experiencing; hearing it directly from the source gives you both a factual basis for your perspective and empathy for what customers experience.

But VOC analysis isn’t always qualitative. The analysis piece may come from finding trends and insights in large data sets, such as those produced by surveys, support tickets, transcripts from chats or phone calls, and the like.
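
As a trivial illustration of that quantitative side, the sketch below counts recurring terms across a handful of made-up support tickets. Real VOC analysis goes far deeper than word counts, but even a simple frequency pass can start to surface themes.

```python
# Illustrative only: made-up tickets and a crude stopword list.
import re
from collections import Counter

tickets = [
    "I couldn't find the return policy before checkout",
    "Checkout froze when I tried to apply a promo code",
    "Where is the return shipping label?",
]
stopwords = {"i", "the", "a", "to", "is", "when", "before", "where", "couldn't", "my"}

words = (word
         for ticket in tickets
         for word in re.findall(r"[a-z']+", ticket.lower())
         if word not in stopwords)
print(Counter(words).most_common(5))
```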

VOC may be solicited, as with pop-up site surveys, or unsolicited, as with product reviews. You usually don’t have to dig too deep to find issues that are both critical and salient, but it can be hard to home in on issues caused by, or directly related to, a specific area of the website or app.

Structured ideation methods are mucho work. Why bother?

Availability bias – when we reach for ideas, we tend to pull down the first things that come to mind, which makes for a very small consideration set. What makes us think it’s the best? The reality is that generating ideas through a deliberate process explores a far greater breadth and depth of possibilities. It’s like looking for gold by turning over a lot of earth rather than just scanning the surface of the dirt.

Expertise bias – the more you know about a thing, the less capable you become of seeing that thing through the eyes of someone who doesn’t know much about it. Most of the people you’re looking to win over have very little familiarity with your website, whereas you have been looking at it every day for the past year. It’s actually very difficult for you to form an accurate conception of what things look like to new visitors.

Improving your RBI – we want to improve both the frequency and the magnitude of our winning experiments, kind of like the baseball metric of runs batted in (RBI). That means being more innovative and taking risks, which takes more thought and effort. It means rapidly discarding ideas that show little promise through early validation methods. It means starting with the very best ideas: working smarter, not just harder.

How to make it a program

What we see is that when ideation is treated as a tool in the toolbox to be pulled out only when needed, it often goes underused. It’s very easy to neglect and deprioritize, especially when doing it means sacrificing test execution. There should be a constant influx of new ideas into an optimization program to ensure that you’re working on the best opportunities at any given time.

The best programs have an established calendar and roadmap of ideation activities, which helps ensure that ideation is carried out consistently and that approaches are diversified in both method and optimization target. They meet regularly to revise the roadmap and to review results of ideation activities. They keep track of where ideas come from and trace test results back to idea sources. This helps inform prioritization of ideation activities down the road.
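
As a rough illustration of what tracing results back to idea sources can look like, here’s a minimal sketch over a hypothetical test log; the source labels, outcomes, and lift figures are invented for the example, but the roll-up is the whole idea.

```python
# Illustrative only: a hypothetical test log with invented sources and lifts.
import pandas as pd

tests = pd.DataFrame({
    "idea_source": ["user_testing", "voc", "user_testing", "heuristic", "workshop", "voc"],
    "won": [True, False, True, False, True, True],
    "lift_pct": [4.2, 0.0, 2.1, -0.8, 6.5, 1.7],
})
by_source = tests.groupby("idea_source").agg(
    tests=("won", "size"),
    win_rate=("won", "mean"),
    avg_lift=("lift_pct", "mean"),
)
print(by_source.sort_values("win_rate", ascending=False))
```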

What's it going to take?

Accountability – someone needs to be accountable for bringing new ideas in the door. It should be an expectation and a priority in their regular work schedule.

Resources – it’s going to take time, and likely budget, to do this. User testing software and services, for example, are not free, and they are critical.

Diverse skill sets – some great ideas will only emerge from skillful data exploration and analysis. Effective, unbiased user testing is not intuitive to everyone and benefits from training and experience. Moderating brainstorming sessions effectively is astonishingly more difficult than most believe it to be. It’s rare to find one person who can navigate all of these waters well.

Collaboration – great ideas can come from anywhere, but simply having an idea box of sorts will not be enough to draw them out. You will have to involve other teams in the process, whether that means rotating groups into ideation sessions, or going to teams that collect and analyze voice of customer information. It will take a lot of different people to cover all the bases.

And one more thing

It can be nice to have an outside perspective bringing ideas to the table. I learned this when I ran my own brand-side program and used an agency purely for ideation. An agency can be objective, suggesting things that you can’t (or wouldn’t) due to any number of internal factors, and it has the benefit of seeing what works and what doesn’t across many businesses and industries. An agency isn’t the only way to get these benefits, but the outside view is impactful.

If you want help improving your win rate with better ideas, get in touch!
