

We Test Everything: Getting Mixed Signals


Sometimes A/B split-test results raise questions that can only be answered with more testing, as in the following case:

Alchemy Worx conducted an A/B split test on behalf of finishing tools merchant Level 5 that pitted emails with product-specific creative against creative featuring a broader category.

The product-specific creative featured a nine-inch roller with the headline “Apply More Mud, Save More Time.” The category email featured multiple compound rollers and an identical headline.

The category email drove 27% fewer clicks and 40% less revenue, but 37% more conversions.

“This test presents seemingly contradictory results,” says Allan Levy, CEO of Alchemy Worx. “Viewed separately, clicks and revenue would lead one to believe the product creative was not just the winner, but the decisive winner.

“Conversions viewed alone would lead us decisively in the opposite direction,” he adds. “This is an example of why it’s so important to measure every metric available.”
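To see how these three metrics can point in opposite directions, here is a quick sketch. The baseline numbers for the product email (10,000 clicks, $50,000 revenue, 1,000 conversions) are hypothetical, chosen only to match the reported percentage differences; nothing here comes from the actual test data:

```python
# Hypothetical baseline for the product-specific email.
product = {"clicks": 10_000, "revenue": 50_000.0, "conversions": 1_000}

# Apply the reported differences to derive the category email's numbers:
# 27% fewer clicks, 40% less revenue, 37% more conversions.
category = {
    "clicks": round(product["clicks"] * (1 - 0.27)),
    "revenue": product["revenue"] * (1 - 0.40),
    "conversions": round(product["conversions"] * 1.37),
}

def derived_rates(v):
    """Per-click conversion rate and revenue per conversion."""
    return {
        "conv_rate": v["conversions"] / v["clicks"],
        "rev_per_conv": v["revenue"] / v["conversions"],
    }

for name, variant in [("product", product), ("category", category)]:
    r = derived_rates(variant)
    print(f"{name}: conversion rate {r['conv_rate']:.1%}, "
          f"revenue per conversion ${r['rev_per_conv']:.2f}")
```

Under these assumed baselines, the category email converts clickers at roughly 18.8% versus 10% for the product email, but each conversion brings in far less revenue (about $21.90 versus $50.00). Fewer, cheaper clicks that convert more often can still lose on total revenue, which is exactly the kind of contradiction the quote describes.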

When we see results like these, it’s helpful to try to understand the mindset of the consumer. That allows us to think about next steps. In this example, we asked:

  • Does this indicate that Level 5 customers are more focused on specific usage? Should we make sure we always suggest specific products?
  • Would we see the same results if we tested this in a different category?
  • Could the test results have been swayed by external factors we weren’t aware of?

Based on the results and our discussion, we agreed on some logical next tests that would help our email campaigns get smarter, which is the goal of every test we run:

  • We need to retest our original hypothesis using a different category.
  • We need to retest in this category again, with a different roller.
  • We need to make sure our retest is on a different day of the week, and in a different month/season, so we can minimize potential external factors.

Want to know how our retests turned out? We’ll share more in a future “We Test Everything” post.

At Alchemy Worx, we test everything. You should, too. For more information on how to partner with Alchemy Worx, contact us.
