OPTIMIZATION:
TOOL SELECTION & IMPLEMENTATION
WE DELIVER:
- Subject Matter Expert guidance from industry technology leaders
- A vendor evaluation customized to your organization’s unique needs
- Technology expertise across all leading platforms and technologies, as well as several up-and-coming ones
- A focus on making sure your test platform and its integrations work well, so you can focus on what you should test instead of worrying about what you can test
SUCCESS STORIES
- Released first production dashboards within 90 days
- Built a center of data excellence
- Instilled an experimentation culture and optimization capability
- Drove 11% growth in registrations
- 388 net new contacts, opportunity valued at nearly $7.7MM
- Potential $1MM in cost savings on media spend
- 5.4% growth in YoY donations
OUR APPROACH
STEP 1:
Our approach to Vendor Evaluations starts with understanding your unique requirements and your existing technology landscape.
- Business Requirements: We interview you and your key stakeholders and partners including IT to ensure we have a full understanding of your technology landscape and needs.
- Criticality Assessment: Not all requirements are equal. Just as we prioritize our experiments so the most impactful tests launch first, so too must we rank the requirements captured for your program, from nice-to-have to critical need.
- Capabilities Fitness Assessment: We combine documentation review and vendor interviews to assess each vendor against all of your requirements, scoring each from 1 (not available) to 5 (built-in capability, best in class).
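The assessments above can be sketched as a simple weighted scoring matrix. This is a minimal illustration, not our actual evaluation model: the requirement names, criticality weights (assumed 1 = nice-to-have, 3 = critical need), and vendor names are all hypothetical, and only the 1-to-5 capability scale comes from the description above.

```python
# Hypothetical sketch of a criticality-weighted vendor scoring matrix.

# Assumed criticality weights: 1 = nice-to-have ... 3 = critical need
requirements = {
    "server_side_testing": 3,
    "analytics_integration": 2,
    "audience_targeting": 3,
    "self_serve_reporting": 1,
}

# Capability scores per vendor, 1 (not available) to 5 (built-in, best in class)
vendor_scores = {
    "Vendor A": {"server_side_testing": 5, "analytics_integration": 4,
                 "audience_targeting": 3, "self_serve_reporting": 2},
    "Vendor B": {"server_side_testing": 2, "analytics_integration": 5,
                 "audience_targeting": 4, "self_serve_reporting": 5},
}

def weighted_score(scores, weights):
    """Criticality-weighted average, still on the 1-5 scale."""
    total_weight = sum(weights.values())
    return sum(scores[req] * w for req, w in weights.items()) / total_weight

for vendor, scores in sorted(vendor_scores.items()):
    print(f"{vendor}: {weighted_score(scores, requirements):.2f}")
```

Weighting by criticality keeps a vendor from winning on a pile of nice-to-haves while missing a critical need.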
STEP 2:
STEP 3:
ALTERNATELY:
LATEST INSIGHTS:

How to A/B Test Statistical Intuition via Simulation. Bonus! Free A/B Test Simulation Tool Included!
Developing a solid intuition about the statistical concepts that drive the interpretation of a basic A/B test is a powerful thing. Explore an A/B test simulator to lock in that intuition!

A Cranium-Expanding Machine Learning Discussion with Matt Gershoff
We recap Matt Gershoff’s machine learning discussion with the Test and Learn Community. Read more to see if you’re ready to invest in machine learning!

How (and Why!) To Monitor Your Tests
Common Mistakes When Running A/B Tests: there are a few. When running a test, there are multiple places where one can misstep and put the test, and the program, in jeopardy. Peeking at your test: if you have set up a beautiful fixed-horizon A/B test, but then you peek for results and call it…

Should We Be Calling It CRO?
We discuss the maturation of CRO, what to focus on when optimizing, and the skills required for success. See how we can help your CRO process!