
How "a simple experiment" brought Obama $ 60 million [translation]
On October 21, 2010, President Barack Obama visited Palo Alto. To raise money for his campaign, a dinner was organized with a price of admission of $30,400. That is an extremely effective way to raise funds, but it only works if you can invite the president. How do you raise funds when all you have is a website that no one but your grandmother has seen? That was exactly the situation we found ourselves in back in 2007, when Obama entered the presidential race trailing his closest rival by double digits.

In 2008 I worked as Director of Analytics for the Obama campaign, and my job was to make decisions based on data analysis. We started with one simple experiment in December 2007. From that experiment we concluded that every visitor to our site was a unique "opportunity to raise funds" and, more importantly, that by seizing this opportunity through optimization and A/B testing we could bring in tens of millions of dollars.
The experiment tested two sections of our page: the media section (an image or a video) and the call-to-action (the sign-up button).

For the experiment we used Google Website Optimizer and ran a full factorial multivariate test; in plain terms, we tested every combination of button and media content. We had 4 button variants and 6 media variants (3 images and 3 videos), which gave 24 (4 × 6) combinations to test. Each visitor to the landing page was randomly shown one of the combinations, and we tracked how many visitors left their email address.
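For illustration, here is a minimal Python sketch of how such a full factorial assignment might work. The variant names are hypothetical placeholders (the post only names the "Learn more" button, the family photo, and "Sam's video" explicitly), and the tracking is deliberately simplified:

```python
import itertools
import random

# Hypothetical variant names -- placeholders, not the campaign's actual list.
BUTTONS = ["Sign up", "Learn more", "Join us now", "Sign up now"]
MEDIA = ["Family photo", "Image B", "Image C",
         "Video A", "Video B", "Sam's video"]

# Full factorial design: every button crossed with every media item,
# 4 * 6 = 24 combinations in total.
COMBINATIONS = list(itertools.product(BUTTONS, MEDIA))
assert len(COMBINATIONS) == 24

views = {combo: 0 for combo in COMBINATIONS}
signups = {combo: 0 for combo in COMBINATIONS}


def assign_combination(visitor_id: str) -> tuple[str, str]:
    """Map a visitor to one of the 24 combinations at random,
    but deterministically, so a returning visitor sees the same page."""
    return random.Random(visitor_id).choice(COMBINATIONS)


def record_view(visitor_id: str) -> tuple[str, str]:
    combo = assign_combination(visitor_id)
    views[combo] += 1
    return combo


def record_signup(visitor_id: str) -> None:
    signups[assign_combination(visitor_id)] += 1
```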

Before you look at the results, try to guess: which button and which piece of media do you think performed best?
Results
The metric we used to measure the success of the experiment was the sign-up rate: the number of people who subscribed divided by the number of people who saw a particular version of the page. A total of 310,382 visitors reached the landing page during the experiment, which means roughly 13,000 people saw each variant.
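As a quick sanity check on those numbers (visitor total and variant count taken from the text above):

```python
# Sign-up rate = people who subscribed / people who saw a given variant.
total_visitors = 310_382
n_combinations = 24

visitors_per_variant = total_visitors / n_combinations
print(round(visitors_per_variant))  # ~12,933, i.e. roughly 13,000


def signup_rate(signups: int, views: int) -> float:
    """The conversion metric used throughout the experiment."""
    return signups / views
```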
Here are the sign-up rates we observed for each individual option:

And here are the rates we observed for the combinations of options:

Winner
The best combination of button and media content turned out to be "Combination 11", consisting of the "Learn more" button and the family photo:

Before we started the experiment, the people working on the campaign favored "Sam's Video" (the last one in the list shown above). Had we not run the experiment, we would most likely have put that video on our landing page. That seemingly small choice could have been a costly mistake, because, as it turned out, every one of the videos converted worse than any of the images.
The winning variant converted 11.6% of visitors into subscribers, versus 8.26% for the original page, which means we collected 40.6% more sign-ups. Let's see where that figure leads. If we assume the improvement held up for the rest of the campaign, we can estimate its cumulative effect. Roughly 10 million people signed up through the landing page over the course of the campaign. Had we skipped the experiment and kept the original page, that number would have been about 7.1 million, i.e. 2.9 million fewer email addresses.
Emails inviting subscribers to become volunteers converted about 10% of them, so those extra 2,880,000 addresses gave us an additional 288,000 volunteers, far more than we had originally planned. On average, each email address acquired through our landing page donated $21 over the course of the campaign. In total, the additional 2,880,000 addresses on our list translated into an additional $60 million in donations.
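Here is a small sketch reproducing the arithmetic from the two paragraphs above; the tiny gap between the computed ~40.4% and the article's 40.6% comes from the conversion rates being rounded:

```python
baseline_rate = 0.0826   # original page's sign-up conversion
winner_rate = 0.116      # "Combination 11"

uplift = winner_rate / baseline_rate - 1
print(f"relative improvement: {uplift:.1%}")         # ~40.4%

total_signups = 10_000_000                           # sign-ups over the whole campaign
signups_without_test = total_signups / (1 + uplift)
extra_emails = total_signups - signups_without_test
print(f"emails gained: {extra_emails:,.0f}")         # ~2.88 million

volunteer_rate = 0.10    # share of subscribers who became volunteers
avg_donation = 21        # dollars donated per email address, on average
print(f"extra volunteers: {extra_emails * volunteer_rate:,.0f}")  # ~288,000
print(f"extra donations: ${extra_emails * avg_donation:,.0f}")    # ~$60 million
```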
Lessons learned
- Every visitor to your website is an opportunity. Don't waste it: use optimization and A/B testing to gain an extra advantage.
- Test your assumptions. Everyone on the campaign liked the videos, yet the videos performed worse than the images. We would never have known this had we not tested our assumptions.
- Experiment early and experiment often. We launched this experiment in December 2007, and it kept paying off right through the end of the presidential campaign. Because the first experiment was so effective, we kept running similar experiments across the whole site throughout the campaign.
The translation was prepared by the team at Generate.club, the first service in Russia to optimize landing pages automatically. We will be happy to answer any questions.
P.S. This article is a translation of a story about how A/B testing helped Barack Obama's election campaign. Unfortunately, due to an obscure Habr bug, there was no way to indicate this when creating the publication. Sorry for the confusion!