Conversion rate optimization is an endless mission for digital marketers. Collecting high-quality leads for sales requires intentional strategies for optimizing marketing campaigns. And this starts and ends with making sure your web forms are doing their job.
One of the best ways to confirm your web forms are leading to top conversions is through A/B testing. If you’re not using Formstack’s Form A/B Testing feature, you should check it out! This advanced feature allows you to test variations of your forms to find out what resonates with your target market.
Here at Formstack, we love using this feature to enhance forms across our site. And we’ve gained a ton of valuable insights from all the tests we’ve run.
Here are three things we’ve learned from our own use of Form A/B Testing:
#1: Best practices for web forms don’t always win.
One of the form best practices we hear all the time is that shorter forms (or those with fewer fields) are better. We’ve even preached this ourselves a time or two. But, in truth, a longer form that allows you to collect more information doesn’t always hurt your conversion rate.
Case in point: We ran an A/B test to gauge the impact of adding a few fields to our demo request form, and we saw no significant change in submissions. We expected submissions to drop with the longer form, but the “shorter forms lead to higher conversions” best practice didn’t hold up. In this case, we were able to gather more qualifying information without damaging our conversion rate.
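If you want to check for yourself whether a difference in submissions is statistically significant, a two-proportion z-test is a common approach. Here’s a minimal sketch in Python; the visitor and submission counts below are made up for illustration, not Formstack’s actual numbers:

```python
from math import sqrt, erf

def two_proportion_z_test(conv_a, total_a, conv_b, total_b):
    """Two-sided z-test comparing the conversion rates of two form variations."""
    p_a = conv_a / total_a  # conversion rate of variation A
    p_b = conv_b / total_b  # conversion rate of variation B
    # Pooled conversion rate under the null hypothesis (no real difference)
    p_pool = (conv_a + conv_b) / (total_a + total_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / total_a + 1 / total_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal distribution
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical numbers: 2,000 visitors each; 120 submissions on the short
# form vs. 112 on the longer form
z, p = two_proportion_z_test(120, 2000, 112, 2000)
print(f"z = {z:.2f}, p = {p:.3f}")
```

With these example numbers, p comes out well above the conventional 0.05 threshold, so you’d conclude the drop could easily be noise rather than a real effect of the longer form.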
#2: Fewer submissions or fields completed do not always indicate a losing test.
When you’re running an A/B test, it’s important to have a clear objective in mind. Are you testing for more overall submissions or more qualified submissions? The distinction is important. You won’t know which variation of your form is the winner in an A/B test if you’re not clear about your testing objective.
Case in point: We recently decided to test one checkbox field on our webinar registration form to see if a change in verbiage would help us attract more qualified leads. The end result was that the updated form with more specific verbiage brought in higher-quality leads, while the original form with vaguer verbiage brought in a higher volume of leads. Because our goal was to bring in more qualified leads, the updated version of the form was the winner—even though that form brought in a smaller percentage of leads.
#3: What works for one form type will not always work for another.
As mentioned previously, marketers often believe that shorter forms lead to higher conversions. And while we know that’s not always true, it can be true for certain form types. According to the 2015 Form Conversion Report, “The type of form you use directly impacts the number of fields it’s safe to include.”
Case in point: We just finished up an A/B test on our trial signup form to see if removing a couple of non-required fields would increase our signup rate, and we saw a big increase in conversions with the shorter form. In contrast to the demo request form test mentioned earlier, removing fields led to a lift in submissions here. This is likely because those signing up for a trial are considering a bigger commitment than those requesting a demo, so potential trialers experience more form friction when it comes to giving up personal information.
Test Your Forms for More Conversions
These are just a few of the insights we’ve gleaned from our in-house use of the Form A/B Testing feature. Want more information on testing your web forms? Click below to download our interactive e-book “Testing Your Way to a High-Converting Form.”