A/B testing results that surprised experts: intuition sometimes fails

    When choosing an interface, a site design, contact-collection tools, or conversion boosters, website owners often make the same mistake: they don't test the alternatives. When building a site, we tend to trust the opinion of so-called experts, or read helpful articles that unanimously claim, for example, that photos of happy people on a landing page increase conversion. Is it worth taking expert opinion on faith? And should every suggestion to test different options be answered with "this is exactly the interface that some proverbial Ivan Ivanovich told us to build"? Since site building and web design became professions in their own right, each of these areas has accumulated stereotypes that most site creators follow. Why is it worth checking even the design that you, or your expert friend, consider the most effective, down to its smallest details, with A/B testing? Because in usability there are almost no solutions that work equally well for every site. Don't believe it? Under the cut are cases where testing even the seemingly obvious best options revealed that they did nothing for conversion at all.

    This article, based on material by Justin Rondeau, is meant to dispel some myths about which interface elements increase conversion, and to show the need for A/B testing of absolutely every element a site visitor encounters. The article is written in the first person.

    What I like about split testing is its ability to shock even the most experienced testers. Sometimes even the most thoroughly researched hypotheses turn out to be wrong.

    This is the main reason why companies should test absolutely everything, from the copy of a commercial offer to the page design, instead of relying on intuition and personal preference. In this post I will share some cases that provoked huge differences of opinion among members of the WhichTestWon community, or whose results absolutely shocked not only our editorial staff but also the TestingAwards judges.
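    Declaring a "winner" in cases like these comes down to a simple statistical check. As a minimal sketch (my own illustration with made-up numbers, not data from the article), a two-proportion z-test tells you whether the difference between two variants is likely to be real:

```python
from math import erf, sqrt

def ab_significance(conv_a, total_a, conv_b, total_b):
    """Two-proportion z-test: is the difference between the
    conversion rates of variants A and B likely to be real?"""
    p_a = conv_a / total_a
    p_b = conv_b / total_b
    # Pooled rate under the null hypothesis of "no difference"
    pooled = (conv_a + conv_b) / (total_a + total_b)
    se = sqrt(pooled * (1 - pooled) * (1 / total_a + 1 / total_b))
    z = (p_b - p_a) / se
    # Two-sided p-value via the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical numbers: 120/2400 conversions for A vs 160/2400 for B
z, p = ab_significance(120, 2400, 160, 2400)
print(f"z = {z:.2f}, p = {p:.3f}")
```

    A p-value below 0.05 is the conventional threshold for calling the lift significant; dedicated tools (or a stats library) do the same arithmetic under the hood.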

    Social proof is not everything

    Can you guess which of the two versions of the form with one input field will generate more subscribers to the web design blog?

    71% of WhichTestWon community members decided that the first version (the one on the left) would perform better because it carried social proof. When I run the same test with a live audience at conferences, as a rule 90-95% of respondents pick the version that displays the subscriber count.

    However, they are all wrong. The right-hand version (the one without the subscriber count) was completed by 122% more users than the version with the social-proof element. In a world where everyone has seen Facebook and Twitter accounts with enormous followings, 14,000 subscribers do not look convincing enough to motivate potential customers to act.

    We usually see companies add a social-proof element without testing it first, because "social media experts" keep insisting that it increases conversion. Fortunately, in our example the team did not skip the testing; otherwise they would have lost more than half of their subscribers.

    Don't get me wrong, social proof is very valuable. It helps fight visitor distrust and raises brand prestige. The difficulty lies in choosing where the social-proof statistics are best placed and which figures are worth sharing. What pays off more: the subscriber count, the number of Facebook likes, awards, or all of it at once? The answer is simple: test it. Never act blindly; you cannot know where that will lead.

    Icons are trendy, but they can hurt conversion

    Icons have made a triumphant return to web design. In general they are quite useful, especially as markers in stylized lists of product categories. The Build.com team decided to find out whether icons would make effective navigation tools ... and the results surprised them a lot.
    Here are two options for the top of the page that they tested (with and without icons):

    In this case, the icons represented the different product categories on the site's pages. The company believed that convenient navigation to the most visited sections would help increase sales. They were mistaken: the version without icons produced a 21% increase in sales.
    Why? It seems to us that although the icons made the navigation prettier, they visually overloaded the interface and confused users.

    Testing trust seals on contact forms

    The stumbling block of any lead-generation page is the subscription form itself. It is very important to determine the optimal number of fields, choose the right design, and add privacy-policy badges to increase user confidence.

    Such methods are well known and every marketer understands them ... and that is exactly why 74% of the WhichTestWon team picked the wrong winner in this A/B test.

    The form without the TRUSTe logo was completed by 12.6% more users. Yes, the "subscribe" button was shrunk to make room for the TRUSTe logo, but it seems to us that the logo itself was the reason for the lower conversion.

    User confidence plays a key role in raising conversion rates; the point, however, is to place trust-building elements in the right place at the right time.

    In this particular case, the TRUSTe logo appeared in the wrong place at the wrong time, and the privacy seal did nothing to build trust. Visitors are used to seeing this element in the shopping cart, not on a subscription form. It is likely that many of them subconsciously began to suspect that money was about to be charged.

    Instead of a logo vouching for the service's security, a small text link to the privacy-policy page could have been used.

    An example of such a subscription form, built with the Witget service:

    But you can only find out through testing.

    If you find it hard to figure out how to motivate potential customers to leave their contact details, we recommend the article "10 Useful Tips: Under What Sauce to Collect Site Visitors' Contacts".

    Remember, context is the key to everything!

    If the conversion rate drops, should you add a photo of a happy person?

    Conversion rate optimization specialists and designers love photos of happy people. A face is the first thing that catches the eye when you visit a site; numerous eye-tracking studies only confirm this.

    However, photos of human faces can also be very distracting. So before adding a photo to your page, we recommend testing the decision to make sure the photo does not compete with the headline and the call to action.

    Here is an example:

    The form without a picture was completed by 24% more users. Admittedly, the test was not perfectly clean: the copy in the second variant was slightly changed, but nothing more. Still, an imperfect test is better than none. And this is neither the first nor the last test to show that removing faces from a page can increase conversion.

    By the way, it is good that the company used a photo of a real employee rather than yet another stock shot. Stock and staged photographs are much less effective than those showing real clients, employees, and so on.

    Perhaps the most surprising part: at the time of this split test, HubSpot was about to make a photo of a person mandatory on every one of its landing pages. At some level that makes sense; they had found that some pages perform better with a photo. But the rule does not hold for all pages. Fortunately, the test sowed a grain of doubt, and the company changed its page-design requirements.

    So, before creating a landing page or writing design requirements, be sure to test ... you can't imagine how many conversions you might be missing!

    Video preview: emphasize the product or the person?

    Here is another case that clearly illustrates the question of using human faces. Before you are two versions of a video page: in the first, the video preview shows people; in the second, only the product.

    The Autodesk page version with a person on the video cover received 50% fewer video views. Nothing but the preview image had changed. I am not against photos of people on websites; I only urge you to test all your decisions first!

    Of course, the testing team was shocked by the results. In the end, they conducted a user survey to figure it out. The answers showed that potential Autodesk buyers are interested in seeing how the product works, rather than listening to other people talking about the product.

    All this boils down to one thing: you need to know your audience, and remember that solutions tested by others are not suitable for everyone.

    To summarize

    Testing leaders have been puzzled by results this unexpected before, and it will happen again. The trick is to figure out what to do once the test results have diverged from your hypotheses and assumptions. Your next steps may include re-evaluating specific things: hypotheses, technology, traffic sources, devices, and so on.

    In any case, even if the test failed, or your very understanding of what your site visitors want turned out to be wrong, you have learned something valuable that you can use in the future.

    Remember, testing is a long process, and we can only approach the ideal after passing through a series of successes and failures.
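    Part of why testing takes so long is sample size: small lifts over low baseline rates need a lot of traffic before a winner can be called. As a back-of-the-envelope sketch (my own illustration, not from the article) using the standard two-proportion power formula:

```python
from math import ceil

def sample_size_per_variant(base_rate, relative_lift):
    """Rough per-variant sample size needed to detect a given
    relative lift with ~95% confidence and ~80% power
    (z = 1.96 and 0.84, the usual normal quantiles)."""
    p1 = base_rate
    p2 = base_rate * (1 + relative_lift)
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    return ceil((1.96 + 0.84) ** 2 * variance / (p2 - p1) ** 2)

# Detecting a +20% lift over a 5% baseline takes thousands of
# visitors per variant, so a fair test cannot be cut short.
print(sample_size_per_variant(0.05, 0.20))
```

    Stopping a test early, before it reaches a sample of roughly this size, is one of the most common ways to "discover" a winner that isn't there.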

    Keep on testing! Success requires checking a great many options, so it is no surprise that we often lose conversions where we least expect it.

    Source: https://vwo.com/blog/ab-testing-results-that-surprised-experts/

    By the way, if you are looking for testers for your site and don't know where and how to find them, we covered this in one of the previous articles on our blog, "6 Available Usability Testing Tools for Websites". Follow our blog posts!


    Have you ever been surprised by the results of A/B testing?

    • 29.4% (121 votes) Yes, and more than once
    • 12.4% (51 votes) It has happened, but rarely
    • 12.4% (51 votes) No, my hypotheses are almost always confirmed
    • 45.7% (188 votes) No, I have never done A/B testing
