Quantifying the Impact of Your Optimization Program
Whether your optimization program is just getting started with a single employee involved, or has been in place for years and spans multiple teams across your organization, it is vital that the program's impact is measured and communicated broadly.
Why, you may ask? Well, there are a few important reasons.
First, it ensures continued funding and support for the optimization program, which near-sighted organizations can see as expendable. Optimization programs carry an opportunity cost in the time and money required to keep them running, so showing that the program's impact far exceeds that cost is essential. We've seen optimization programs have their funding severely reduced and, in some cases, get completely dismantled. It came as a shock to us, but at the start of the 2020 pandemic, optimization teams were among the very first to be cut as organizations looked to reduce costs. If only those organizations had known the revenue impact they were throwing away with those decisions.
Second, funding and support will lead to more test ideas, more insights, and more revenue: the more you invest, the greater your returns. The "hotter" your optimization program becomes in the organization, the more people will want to join in the fun, and word spreads quickly. We've seen this happen both from the agency side and while running optimization programs on the client side, and it can be really cool to see and participate in.
Last and certainly not least, the customer – your customer – always wins in the end because of the improved user experience that your optimization program provides. Aside from the quantitative impact we discuss below, there is certainly a qualitative impact associated with your optimization program. Every optimization test, regardless of whether it produced a statistically significant winner, can provide organizational learnings about the site experience and user behavior. These learnings can be highly impactful when applied to the customer experience. For example, you may run a test that shows no lift in bottom-line revenue, yet discover that users from a certain geolocation responded very well to the test treatment; that knowledge can quickly jump-start a Personalization program for that segment.
There are several quantitative models that can be used to measure the impact of your optimization program, and we will focus on one of them today. Our experience driving optimization programs over the last two decades has taught us that this is the model people understand and support the most. It's fairly straightforward, and you won't lose anyone in the details. At a high level, you take the revenue lift associated with each statistically significant test winner and project that revenue impact out over a set time period, applying a decaying coefficient to account for visitors getting used to the test treatment (the "newness effect" wears off) and for other changes to the site that could affect the long-term performance of the winning treatment variation (WTV), such as new product launches, promotions, and overall site functionality changes.
For each individual optimization test run on the site, you will want to collect the following data:
Date range that the test ran
Annual visits and orders for the page the test ran on
Visits for the Control group and WTV
Orders for the Control group and WTV
Revenue for the Control group and WTV
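To make the steps concrete, we'll carry a purely hypothetical example through them: suppose the tested page receives 1,000,000 annual visits and 30,000 annual orders, and the test split traffic evenly, with the Control group seeing 100,000 visits, 3,000 orders, and $270,000 in revenue, and the WTV seeing 100,000 visits, 3,300 orders, and $297,000 in revenue.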
Step 1 - Calculate the projected orders for the WTV
Current Annual Visits * WTV Conversion Rate (CVR), where WTV CVR = WTV Orders / WTV Visits
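For our hypothetical page: the WTV converted 3,300 of its 100,000 visits, a 3.3% CVR, so 1,000,000 * 3.3% = 33,000 projected orders.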
Step 2 - Determine the yearly incremental orders due to the WTV
Projected Orders from Step 1 – Current Annual Orders
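For our hypothetical page: 33,000 - 30,000 = 3,000 incremental orders.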
Step 3 - Calculate Average Order Value (AOV) for the WTV
WTV Revenue / WTV Orders
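For our hypothetical page: $297,000 / 3,300 = $90.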
Step 4 - Calculate Incremental Yearly Revenue
Yearly Incremental Orders for WTV * AOV for WTV
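For our hypothetical page: 3,000 * $90 = $270,000.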
Step 5 - Account for Decay
Incremental Yearly Revenue * Decay Coefficient (here, 0.75)
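For our hypothetical page: $270,000 * 0.75 = $202,500.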
Step 6 - Optional Monthly Decay
Alternatively, you could calculate the above on a monthly basis and apply a monthly decay, such as 2.5% per month.
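If you'd rather script the calculation, here is a minimal Python sketch of Steps 1 through 5, plus one possible interpretation of the optional monthly decay in Step 6. The function names are our own, and the inputs are the hypothetical numbers from above, not real test results:

```python
# A minimal sketch of the model above. Function names are our own and all
# inputs below are hypothetical example numbers, not real test results.

def annual_test_impact(
    annual_visits: int,    # annual visits to the tested page
    annual_orders: int,    # annual orders from the tested page
    wtv_visits: int,       # visits in the winning treatment variation (WTV)
    wtv_orders: int,       # orders in the WTV
    wtv_revenue: float,    # revenue attributed to the WTV
    decay: float = 0.75,   # Step 5 decay coefficient
) -> float:
    """Estimated decayed annual revenue impact of one winning test."""
    wtv_cvr = wtv_orders / wtv_visits                      # WTV conversion rate
    projected_orders = annual_visits * wtv_cvr             # Step 1
    incremental_orders = projected_orders - annual_orders  # Step 2
    aov = wtv_revenue / wtv_orders                         # Step 3
    incremental_revenue = incremental_orders * aov         # Step 4
    return incremental_revenue * decay                     # Step 5


def monthly_decayed_impact(
    incremental_yearly_revenue: float,
    monthly_decay: float = 0.025,  # Step 6's example rate of 2.5% per month
    months: int = 12,
) -> float:
    """One interpretation of Step 6: spread the yearly lift evenly across
    the months and compound the monthly decay."""
    monthly = incremental_yearly_revenue / months
    return sum(monthly * (1 - monthly_decay) ** m for m in range(months))


# Hypothetical example from above: $202,500 decayed annual impact.
impact = annual_test_impact(1_000_000, 30_000, 100_000, 3_300, 297_000.0)
print(f"Decayed annual impact: ${impact:,.0f}")
```

Summing this estimate across every winning test, e.g. sum(annual_test_impact(*t) for t in winning_tests), gives the cumulative program total discussed next.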
You can now sum all of the estimated annual revenue impacts for each test winner to get a cumulative total for your optimization program. This value can be taken to your leadership team on a monthly basis to show program impact. In a follow-up blog post, we'll talk about gamification theory and how you can make this process fun and get everyone involved!
NOTE: You will want to validate the performance of the WTV over time. Site changes and seasonal traffic will cause its performance to vary; just because the WTV showed a 5.5% lift in revenue during the three weeks the test ran doesn't mean the same performance will continue in the long term. Performance will fluctuate for the reasons above, along with technology changes and customer churn. Validating the WTV can be done in a few different ways. One method is to implement a continual holdout of, say, 5% of traffic, while the other 95% of site traffic receives the WTV experience. The two experiences can then be compared in your analytics platform, assuming you have properly integrated your optimization and web analytics data. Keep in mind that a 5% holdout has an opportunity cost, because the holdout group will not receive the incremental revenue lift associated with the WTV. Another method is to rerun the same test six months later. The control experience will be different in this second iteration compared to the original test, but you will obtain data showing whether the WTV is still outperforming the control experience or is now underperforming. The second run could also fail to reach significance, leaving the test inconclusive.
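For the continual-holdout method, the comparison itself is simple once the data is exported. Here is a rough sketch assuming you can pull visits and revenue per experience from your analytics platform; all numbers are hypothetical, and a real validation should also include a statistical significance test:

```python
# Hypothetical analytics exports for the two experiences; a real validation
# should also include a statistical significance test.
holdout = {"visits": 50_000, "revenue": 135_000.0}  # 5% holdout (control experience)
wtv = {"visits": 950_000, "revenue": 2_700_000.0}   # 95% receiving the WTV

# Compare revenue per visit (RPV) between the WTV and the holdout.
rpv_holdout = holdout["revenue"] / holdout["visits"]
rpv_wtv = wtv["revenue"] / wtv["visits"]
lift = (rpv_wtv - rpv_holdout) / rpv_holdout

print(f"WTV RPV lift over holdout: {lift:.1%}")  # ~5.3%
```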
Hopefully, you now have a good understanding of how to place a quantitative value on your optimization program, and a powerful new tool in your optimization tool belt. Valuing your program is critical if you want to maintain funding and support and increase participation in test ideation. Best of luck!!