Posts Tagged "Experimentation"

Using YoY/MoM conversion rate goals as targets can backfire

A common exercise for product teams at the end of each year is goal setting and revision. We often see conversion rate goals and objectives set like: "Increase the conversion rate from 6.7% to 7.4%." When goals measure absolute…


Presentation: How to avoid 5 testing pitfalls & run trustworthy experiments (Web Analytics Wednesday Melbourne, 2019-11-06)

Earlier this month, I gave a talk to the local Web Analytics Wednesday group in Melbourne on running A/B tests and trustworthy experimentation. It features some of our biggest mistakes in split testing and the simple methods we use to…


Why we build experiments in an IDE rather than a WYSIWYG

Experiments built entirely within SaaS platforms’ web interfaces often take longer and require unnecessary busy-work. This article explores the reasons we would rather build experiments in an IDE and how Mojito supports this approach, so you don’t have to touch…


Why an A/B testing tool should form an experiments layer over your site

There’s a reason tag managers are now the de facto standard for tag deployment. Before tag managers, you’d embed tags directly into your application. It could take weeks or months to deploy them inside large, monolithic apps… Meanwhile, you’d be shifting…


Introducing Mojito: Mint Metrics’ open-source split testing tool

Update: We have just launched our documentation site for Mojito here. We’re excited to open source Mojito – the experimentation stack we’ve used to run well over 500 experiments for Mint Metrics’ clients. It’s a fully source-controlled experimentation stack for…


Track Optimizely, VWO & Mojito tests into Google Optimize

You’ve probably audited your Google Analytics setup and validated that the data roughly matches the data in your CRM, etc. (bonus points if you perform this QA process regularly). But who regularly audits the data quality of Optimizely / VWO / Convert.com…


Why purpose-built analytics tools beat Optimizely / VWO’s A/B test tracking

We typically find that relying solely on Optimizely, VWO or Convert.com’s A/B test tracking has numerous hidden costs: restrictive analytics capabilities, worse site performance, and increased compliance obligations that compromise your data sovereignty. In our experience, analytics tools like GA…


Why you need error tracking & handling in your split tests

Building large, complex experiments introduces new logic, new code and sometimes new bugs. But most A/B testing tools don’t perform error tracking or handling for you. So when you launch your experiment and it tanks… …did your awesome new idea…
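The full post isn't shown here, but the core idea can be sketched in a few lines: wrap each variant's code in a guard so runtime errors are reported rather than silently skewing results. This is a minimal sketch with hypothetical names (`runTreatment`, `trackEvent`), not the post's actual implementation; `trackEvent` stands in for whatever your analytics layer provides.

```javascript
// Hypothetical sketch: run a variant's changes defensively.
// If the variant code throws, report the error and signal the caller
// to fall back to the control experience instead of tanking silently.
function runTreatment(applyChanges, trackEvent) {
  try {
    applyChanges(); // the variant's DOM changes / new logic
    trackEvent('recipe_applied');
  } catch (err) {
    // Track the failure so a broken variant shows up in your data,
    // rather than masquerading as a losing idea.
    trackEvent('recipe_error', { message: String(err && err.message) });
    return false; // caller can revert to the control
  }
  return true;
}
```

With a guard like this, a tanking experiment can be diagnosed: either the idea genuinely lost, or `recipe_error` events reveal the variant never ran correctly for some users.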


How to reduce your A/B testing tool’s page speed impact

Client-side A/B testing tools get criticised for loading huge chunks of JS synchronously in the head (rightfully so). Despite the speed impact, these tools can deliver far more value through the experiments they enable. And luckily, we can help manage the…
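One common way to manage that speed impact — a sketch of the general technique, not necessarily the post's exact method — is to load the testing tool asynchronously and cap how long the page waits for it, failing open to the untested page if the script is slow. The function name and URL below are hypothetical.

```javascript
// Hypothetical sketch: load a client-side testing tool asynchronously,
// hiding the page briefly to avoid variant flicker, but revealing it
// after a timeout so a slow tool can't block the page indefinitely.
function loadTestingTool(src, timeoutMs) {
  document.documentElement.style.visibility = 'hidden';
  var reveal = function () {
    document.documentElement.style.visibility = '';
  };
  var timer = setTimeout(reveal, timeoutMs); // fail open if the tool is slow

  var script = document.createElement('script');
  script.src = src;
  script.async = true; // never block HTML parsing
  script.onload = script.onerror = function () {
    clearTimeout(timer);
    reveal(); // tool loaded (or failed) -- show the page either way
  };
  document.head.appendChild(script);
}
```

The trade-off is between flicker (showing the control before the variant applies) and speed (blocking render on a third-party script); the timeout bounds the worst case.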
