Amazon Adds Alexa A/B Testing Service to Increase Customer Engagement



Amazon is launching the Alexa Skill A/B testing service. The new service allows skill builders to design A/B experiments with the goal of maximising in-skill purchases, repeat visits, and the number of dialogs per session.

“With the new A/B skills testing service, we were able to design and deploy an A/B test in a little under two hours,” said James Holland, lead voice developer at Vocala. “We could analyse the results of our experiment through a dashboard. After a few weeks, we were clearly able to see that the longer prompt was over 15 per cent more effective in driving paid conversions.”

The A/B testing service automates several facets of experimentation, from randomising customers and routing them to the control and treatment versions of a skill to displaying experiment-related analytics on a dashboard.
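The article does not describe how Amazon's service implements the randomisation under the hood. As a general illustration only, deterministic bucketing is often done by hashing a stable customer identifier together with an experiment ID, so the same customer always lands in the same variant. The sketch below (TypeScript, with a hypothetical `assignVariant` helper and an even 50/50 split) shows the idea; it is not Amazon's API.

```typescript
import { createHash } from "node:crypto";

type Variant = "control" | "treatment";

// Hypothetical illustration of deterministic customer randomisation:
// hash a stable user identifier with the experiment ID so the same
// customer is always routed to the same variant of the skill.
function assignVariant(
  userId: string,
  experimentId: string,
  treatmentShare = 0.5
): Variant {
  const digest = createHash("sha256")
    .update(`${experimentId}:${userId}`)
    .digest();
  // Map the first four bytes of the digest onto [0, 1).
  const bucket = digest.readUInt32BE(0) / 0x100000000;
  return bucket < treatmentShare ? "treatment" : "control";
}

// Example: decide which experience to serve for this customer.
const variant = assignVariant("amzn1.ask.account.EXAMPLE", "prompt-length-test");
console.log(`Serving the ${variant} experience`);
```

In the hosted service this assignment, along with the routing and dashboard analytics, is handled for the skill builder; the sketch only illustrates the stable per-customer split.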

“Ultimately, the A/B testing service allows us to get a holistic understanding of customer behaviour,” said Daniel Mittendorf, CTO at Beyto, a voice development agency.

“We have developed two versions of Stream Player,” said Mittendorf. “The newer version is built using the Alexa Presentation Language (APL). It allows people to switch channels within a session, as opposed to an older experience, where the session is terminated after eight seconds. However, the newer version of the skill suffers from a crucial drawback. Because utterances are sent to the skill and not the device, customers cannot turn the volume up and down as easily as they could with the older version.”
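Neither version of Beyto's code is shown in the article. As a minimal sketch of the session behaviour Mittendorf describes, assuming a Node.js skill built with ask-sdk-core and a hypothetical PlayChannelIntent, keeping the session open is what allows follow-up utterances such as a channel switch to be sent back to the skill rather than the interaction simply ending.

```typescript
import * as Alexa from "ask-sdk-core";

// Hypothetical sketch (not Beyto's actual code): a handler that asks
// Alexa to keep the session open so in-session utterances such as
// "switch to channel two" are routed back to the skill.
const PlayChannelIntentHandler: Alexa.RequestHandler = {
  canHandle(handlerInput) {
    return Alexa.getRequestType(handlerInput.requestEnvelope) === "IntentRequest"
      && Alexa.getIntentName(handlerInput.requestEnvelope) === "PlayChannelIntent";
  },
  handle(handlerInput) {
    return handlerInput.responseBuilder
      .speak("Now playing channel one. You can switch channels at any time.")
      // false asks Alexa to keep the session open so the next utterance
      // is sent to the skill; ending the session instead would behave
      // like the older experience, where the interaction simply stops.
      .withShouldEndSession(false)
      .getResponse();
  },
};

export const handler = Alexa.SkillBuilders.custom()
  .addRequestHandlers(PlayChannelIntentHandler)
  .lambda();
```

The intent name and prompts here are placeholders; the point is only the contrast between an experience that keeps the session alive for in-session commands and one that ends after the response.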

Mittendorf wanted to test whether the original or APL-based experience drove higher customer retention.

“We were able to launch an A/B experiment in less than an hour,” he said. “Within three weeks we could see that the APL version of the skill drove seven per cent greater engagement and also increased the retention rate. The A/B Skills Testing feature allowed us to arrive at an important decision in terms of how we should be thinking about investing our limited development resources. Going forward, we will invest our time on the APL version of the skill.”

