Every eCommerce blog and agency considers mini carts a best practice (including us). They provide visual confirmation that a product has been added to the cart and, more importantly, they keep the user on the product page, making it easier to find more products. Still, we had never seen data proving that a mini cart is a must-have. So we created a plan to get some answers using qualitative and quantitative data.
Using Optimizely, we set up a test. When customers in test group A added products to their cart, the mini cart activated and gave them visual confirmation that their product was ready for purchase. Visitors in test group B experienced a more traditional shopping funnel: after they added an item to their cart, the site redirected them to the cart page. From a usability standpoint, redirecting visitors to the cart seems counterintuitive for a site that averages 2.5 units per order.
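The core of a split test like this is assigning each visitor to a variant consistently, so the same person never bounces between experiences. Here is a minimal sketch of deterministic hash-based bucketing, similar in spirit to what testing tools handle for you; the names (`assign_variant`, `visitor_id`) are our own illustration, not Optimizely's API.

```python
import hashlib

def assign_variant(visitor_id: str, experiment: str = "mini-cart-test") -> str:
    """Hash the visitor ID so the same visitor always gets the same variant."""
    digest = hashlib.md5(f"{experiment}:{visitor_id}".encode()).hexdigest()
    bucket = int(digest, 16) % 100  # a stable number in 0-99 for this visitor
    # Variant A: mini cart appears, visitor stays on the product page.
    # Variant B: visitor is redirected to the cart page after add-to-cart.
    return "A" if bucket < 50 else "B"

# The same visitor ID maps to the same variant on every page load.
print(assign_variant("visitor-123"))
```

Because the assignment is derived from a hash rather than a coin flip stored server-side, it needs no database lookup and splits traffic roughly 50/50 across many visitors.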
Naturally, we hypothesized that the mini cart would provide the best experience for customers. It was reliable, functional, and fit in with the design. The A/B test would confirm that the mini cart helped conversion rate, and the user testing would tell us why it won by revealing the pain points caused by its absence. Our primary purpose was not to see if the mini cart would help conversion, but why.
In this case, split testing revealed that our mini cart had zero effect on conversion rate or revenue per visitor. User testing revealed that neither the mini cart nor its absence created pain points. No one cared that we had built a reliable, functional, and attractive mini cart; none of the testers even mentioned it. In other words, we were wrong.
Often in the world of conversion testing, we focus too much on the data and forget to ask real people what they think of the site. Tools like UserTesting.com make getting that feedback much easier. We observed that users in test group B (redirected to the cart page) were happy to leave the cart and continue exploring the site. They were so interested in seeing more patterns and product names that navigating back was not a nuisance at all.
We designed Scoutbags.com to immerse visitors in a full brand experience by showcasing what makes Scout products unique: patterns and colors. The site engaged users so well that, in this instance, adding a mini cart did little to improve the experience, though it did not detract from it either. However, other clients have seen a different outcome, with increased average order value and revenue after implementing a mini cart.
The question remains: will implementing a mini cart on YOUR site increase conversions? Give user testing and A/B testing a shot, and let us know how it goes. Or contact us to set up the test for you!