If you’re not convinced Facebook is going all in on video, check out this little stat:
The company just announced that it plans to invest over one billion dollars in its video platform over the next year!
At this point, it’s pretty safe to say video is here to stay, and by now you may have even seen it dominate the space!
This investment and other recent product launches, such as the 'Watch' service, lend further credibility to advertisers' claims that Facebook video ads cost less, reach more people, and generate a better ROI than their counterparts.
Emeric and I are always very intrigued by the chatter coming from these “experts” and we can’t help but put their claims to the test.
In AdEspresso’s Facebook group, we kept seeing posts like the one below.
At Agorapulse, we’ve heard claims like this, but we didn’t really have the data to back them up.
After all, video advertising has never really been a big thing for us. Up until around December 2016, we only used around 5% of our total budget on video advertising.
But, after seeing the explosive growth in Facebook videos over the past couple of years, we decided to give it a shot.
Here’s what we did:
In the rest of this post, we'll share the experiment setup, the ads we ran, the results, and our key takeaways.
In this experiment, we wanted to be very clear on what metrics we would evaluate to determine the success of the test. So, we decided on the three most important attributes in our campaign and measured them accordingly:
The number of paying users topped the list, as this refers to users who are now paying for our tool.
Next, the number of free trials is also important, as our customers typically try our tool for 14 days before converting to paid subscribers.
Last but not least, we also decided to consider the number of leads collected. This refers to people who signed up to use one of our free tools or subscribed to our blog. These users will continue to receive engagement emails and may sign up at a later time.
With these goals in mind, this is our hypothesis:
Facebook Video Ads will not get better results than Image Link Ads.
The reason is that we continue to see positive results from standard image link ads, and while video engagement is on the rise, we were doubtful that views would turn into meaningful conversions.
To make it easy for you to see what we did, we broke out our campaigns into a few sections below.
We used AdEspresso to create 2 campaigns: one with a promotional video, and the other with a photo. Both campaigns linked to the same blog post mentioned above.
Next, we created 1 ad set in each campaign. In each ad set, the settings were nearly identical:
Let’s discuss both the targeting and placement briefly.
To make this test as effective as possible, website visitors should really only see 1 of the 2 ads.
In other words, the person who viewed the photo ad should not be able to view the video ad and vice versa.
Otherwise, the results may be skewed, as there would be some overlap in the audiences.
However, unless you have a HUGE budget to test with, and Facebook’s help, it is very difficult to prevent this from happening.
So, we took the next best option available to us and excluded those who viewed the video for at least 3 seconds from the photo ad set.
By doing this, people who viewed the video would not see the photo ad. Unless of course, the photo ad was delivered to them before they viewed our video ad.
Note: Unfortunately, this is the biggest flaw of our experimental design. Despite this, we wanted to see if there would be a significant difference between the two ad formats. Please take this into consideration when you're looking at our results.
As you can see above, we chose to run our ads only on the desktop newsfeed.
This is because visitors cannot sign up for a free trial on mobile devices. So if we ran ads on mobile placements, visitors interested in a free trial would have to visit our website on their desktops and sign up.
The process of going from mobile to desktop makes it difficult for us to track conversions. For this reason, we only advertised on desktop and not on mobile.
Ad Creative & Copy:
To get statistically significant split test results, we tested only a couple of variations and kept the ad copy and other ad elements the same for both.
Here’s what the video link ad looks like:
Here’s an example of the photo link ad:
Here’s what the blog post looks like:
We were able to achieve some pretty awesome results during our experiment. The following results are based on a 28-day click and 1-day view attribution window, as it is usually best to evaluate results on 28-day click attribution.
We picked up 31 new leads and 40 free trials.
The video ad brought in 26 free trials, while the photo ad only brought in 14.
The video ad was the sole ad that led to a paid subscription worth $199/month.
From the results, it seems like video ads are working for us!
To validate our conclusion that the video ad outperformed the photo ad, we ran the results through a statistical significance test.
While the 1 signup vs. 0 signups is a big deal to us, it fails statistical testing because the sample size is too small.
But if we analyze reach against the number of free trials, the difference is quite significant, as shown below.
These are great results, and they help us conclude that the Facebook ads using videos outperformed those using photos.
But, just to make sure we've poked all the holes in this that we can, let's also analyze the number of link clicks vs. free trials.
Again, our test results show that the ads using videos outperformed the ads using photos.
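If you want to run the same kind of check on your own campaigns, here's a minimal sketch of a two-proportion z-test in Python. The free-trial counts (26 vs. 14) come from our results above, but the reach figures are illustrative placeholders, since the significance of the result depends on your actual reach numbers.

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Z-statistic comparing two conversion rates, using a pooled variance."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se

# Free-trial counts are from the experiment; reach values are illustrative.
video_trials, video_reach = 26, 10_000
photo_trials, photo_reach = 14, 10_000

z = two_proportion_z(video_trials, video_reach, photo_trials, photo_reach)
print(round(z, 2))  # |z| > 1.96 would indicate significance at the 95% level
```

Plug in your own reach and conversion counts; with small conversion numbers like ours, even a 2x difference in trials can hover near the significance threshold.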
We have confidence moving forward that video ads will result in more free trials and signups.
We learned a few things along the way, and what's the point of an experiment if we don't share some takeaways with you?
As you can see, only 1 of the 40 free trials turned into a paying subscriber during the 28-day period.
When we paused the campaign, it had 40 free trials but no purchases. That dissuaded us from continuing to run the campaign, as we thought that something was surely wrong.
But I guess we were (gladly) wrong! We should have exercised greater patience and let it run longer – perhaps until we hit 100 subscriptions to get a more accurate picture.
Next, as you can see above, we ran only 2 creatives – a video versus a single image.
The results were not ideal either.
Our average link CTR was as low as 0.34%. Had we split tested more copy, video thumbnails vs. images, video lengths, etc., we might have increased the CTR to at least 1% while keeping conversion rates similar.
Doing that alone would have cut our cost per free trial by 66%!
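The 66% figure follows directly from the CTR ratio: at a fixed budget and a fixed conversion rate, cost per trial scales inversely with CTR. A quick back-of-the-envelope check (the ~$20 cost per trial is from our numbers; the rest is arithmetic):

```python
ctr_now, ctr_target = 0.0034, 0.01   # observed CTR vs. hoped-for CTR
cost_per_trial_now = 20.0            # roughly what a free trial cost us

# With impressions and conversion rate held constant, clicks (and therefore
# trials) scale linearly with CTR, so cost per trial scales inversely with it.
cost_per_trial_target = cost_per_trial_now * ctr_now / ctr_target
reduction = 1 - cost_per_trial_target / cost_per_trial_now

print(f"${cost_per_trial_target:.2f} per trial, a {reduction:.0%} reduction")
```

In other words, tripling the CTR cuts the cost per trial to roughly a third, all else being equal.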
Facebook recently recommended at least 50 conversions a week for its system to learn more about your ideal audiences.
Given that a free trial costs us roughly $20, we should be spending $1,000/campaign in a week!
Perhaps that would have given us better results and lower cost per free trial over time.
If we had to re-run this experiment, we would set aside $6,000 to $8,000 for it, or the equivalent of $200/campaign/day.
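The budget math behind those numbers works out as follows (the 50-conversions-per-week guideline and the ~$20 cost per trial are from above; the 15-20 day run length is our assumption for bracketing the total):

```python
cost_per_trial = 20.0        # approximate cost per free trial
conversions_per_week = 50    # Facebook's recommended weekly minimum

weekly_budget = conversions_per_week * cost_per_trial  # $1,000 per campaign
daily_budget = 200           # rounded up from weekly_budget / 7 (~$143)
campaigns = 2                # one video campaign + one photo campaign

# An assumed run length of 15-20 days brackets the $6,000-$8,000 estimate.
low = daily_budget * campaigns * 15
high = daily_budget * campaigns * 20
print(weekly_budget, low, high)
```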
From the experiment, we determined that the ad format – video vs. photo – doesn't affect ad performance as much as we anticipated. But we're really glad we ran with it. The data we collected shows us what might perform better in future campaigns and gives us the insights necessary to refine them further.
So, what are you waiting for? Start testing today and take your campaigns to the next level.