Are Facebook Video Ads Superior to Other Ad Formats?

December 6, 2017 • By Jason HJH.



If you're not convinced Facebook is going all in on video, check out this little stat:

The company just announced that it plans to invest over one billion dollars in its video platform over the next year!

At this point, it’s pretty safe to say video is here to stay, and by now you may have even seen it dominate the space!

This investment and other recent product releases, such as the Watch service, lend further credibility to advertisers' claims that Facebook video ads cost less, reach more people, and generate a better ROI than other formats.

Emeric and I are always very intrigued by the chatter coming from these “experts” and we can’t help but put their claims to the test.

In AdEspresso’s Facebook group, we kept seeing posts like the one below.

Video Ad Feedback

At Agorapulse, we’ve heard claims like this, but we didn’t really have the data to back them up.

After all, video advertising has never really been a big thing for us. Up until around December 2016, we spent only around 5% of our total ad budget on video.

But, after seeing the explosive growth in Facebook video over the past couple of years, we decided to give it a shot.

Here’s what we did:

  • We used this blog post as the destination of our experiment.
  • We created an audience of people who visited our English website in the last 30 days, excluding our trial and paying users.
  • We created 2 nearly identical campaigns – one using a video and another using a photo.

Facebook Ad Spend

In the rest of this post, we'll share the experiment setup, the ads we ran, the results, and our key takeaways.

The Experiment

In this experiment, we wanted to be very clear on what metrics we would evaluate to determine the success of the test. So, we decided on the three most important attributes in our campaign and measured them accordingly:

  1. Number of Paying Users (the most important)
  2. Number of Free Trials
  3. Number of Leads

The number of paying users topped the list, as this refers to people who are now paying for our tool.

Next, the number of free trials is also important, as our customers typically try our tool for 14 days before converting into paid subscribers.

Last but not least, we also decided to consider the number of leads collected. This refers to people who signed up to use one of our free tools or subscribed to our blog. These users will continue to receive engagement emails and may sign up at a later time.

With these goals in mind, this is our hypothesis:

Facebook Video Ads will not get better results than Image Link Ads.

Facebook Video Ads

The reason is that we continue to see positive results from standard image link ads, and while video engagement is on the rise, we were doubtful that views would turn into meaningful conversions.

How We Set Up The Experiment

To make it easy for you to see what we did, we broke out our campaigns into a few sections below.

  1. Number of Campaigns
  2. How We Named the Ads
  3. Ad Set Settings

— Number of Campaigns:

We used AdEspresso to create 2 campaigns: one with a promotional video, and the other with a photo. Both campaigns linked to the same blog post mentioned above.

— How We Named The Ads:

  • Photo link ad: SMLabs – Test 1 – Link ad – Desktop newsfeed – Website conversions
  • Video link ad: SMLabs – Test 1 – Video ad – Desktop newsfeed – Website conversions

— Ad Set Settings:

Next, we created 1 ad set in each campaign. In each ad set, the settings were nearly identical:

  • Targeting: English website visitors in last 30 days, excluding trial and paying users
  • Budget: $20/day
  • Conversion window: 28-day click
  • Placement: Desktop newsfeed
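
To make the parallel setup easier to see, here's a minimal sketch of the two ad sets as plain Python configuration. This is purely illustrative: the dictionary keys are our own labels, not Facebook Marketing API fields.

```python
# Illustrative only: a plain-Python summary of the two ad sets we compared.
# The keys are our own labels, NOT Facebook Marketing API fields; the point
# is that the only intended difference between the ad sets is the creative.

BASE_AD_SET = {
    "targeting": "English website visitors, last 30 days, "
                 "excluding trial and paying users",
    "daily_budget_usd": 20,
    "conversion_window": "28-day click",
    "placement": "desktop newsfeed",
    "objective": "website conversions",
}

ad_sets = {
    "SMLabs - Test 1 - Link ad":  {**BASE_AD_SET, "creative": "photo link ad"},
    "SMLabs - Test 1 - Video ad": {**BASE_AD_SET, "creative": "video link ad"},
}
```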

Let’s discuss both the targeting and placement briefly.

Targeting:

To make this test as effective as possible, website visitors should really only see 1 of the 2 ads.

In other words, the person who viewed the photo ad should not be able to view the video ad and vice versa.

Otherwise, the results may be skewed, as there would be some overlap in the audiences.

However, unless you have a HUGE budget to test with, and Facebook’s help, it is very difficult to prevent this from happening.

So, we took the next best option available to us and excluded those who viewed the video for at least 3 seconds from the photo ad set.

By doing this, people who viewed the video would not see the photo ad, unless, of course, the photo ad had already been delivered to them before they viewed our video ad.

Note: Unfortunately, this is the biggest flaw in our experimental design. Despite this, we wanted to see if there would be a significant difference between the two ad formats. Please take this into consideration when you're looking at our results.
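
To make that exclusion logic concrete, here's a tiny conceptual sketch with made-up visitor IDs. These audiences actually live inside Facebook as custom audiences; plain Python sets are used here only to show what gets excluded from what.

```python
# Conceptual sketch of the audience logic, using made-up visitor IDs.
site_visitors_30d = {"u1", "u2", "u3", "u4", "u5", "u6"}  # English site, last 30 days
trial_or_paying   = {"u5", "u6"}                          # excluded from both ad sets
video_viewers_3s  = {"u2", "u3"}                          # watched >= 3s of the video ad

video_audience = site_visitors_30d - trial_or_paying
photo_audience = site_visitors_30d - trial_or_paying - video_viewers_3s

print(sorted(video_audience))  # ['u1', 'u2', 'u3', 'u4']
print(sorted(photo_audience))  # ['u1', 'u4']
# The remaining flaw: someone in photo_audience can still see the photo ad
# first and only later watch (and be excluded because of) the video ad.
```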

Placement:

Facebook Ad Placements

As you can see above, we chose to run our ads only on the desktop newsfeed.

This is because visitors cannot sign up for a free trial on mobile devices. So if we ran ads on mobile placements, visitors interested in a free trial would have to visit our website on their desktops and sign up.

The process of going from mobile to desktop makes it difficult for us to track conversions. For this reason, we only advertised on desktop and not on mobile.

Ad Creative & Copy:

To get statistically significant split test results, we tested only a couple of variations and kept the ad copy and other ad elements the same for both.

Here’s what the video link ad looks like:

Agorapulse Facebook Video Ad

Here’s an example of the photo link ad:

Agorapulse Photo Link Ad Facebook Video Ads Test

Here’s what the blog post looks like:

Agorapulse Blog Post

Our Results

We were able to achieve some pretty awesome results during our experiment. The following results are based on a 28-day click, 1-day view attribution window, as it is usually best to base your results on 28-day click attribution.

We picked up 31 new leads and 40 free trials.

The video ad brought in 26 free trials, while the photo ad only brought in 14.

The video ad was the sole ad that led to a paid subscription worth $199/month.

Facebook Video Ads

From the results, it seems like video ads are working for us!

Facebook Ad Results

To validate our conclusion that the video ad outperformed the photo ad, we ran the numbers through a statistical significance test.

While the 1 signup vs. 0 signups is a big deal to us, it fails the statistical test because the numbers are simply too low.

But if we analyze reach against the number of free trials, the difference is quite significant, as shown below.

facebook video ad stats

These are great results, and they help us conclude that the Facebook ads using videos outperformed those using photos.

But, just to make sure we've poked all the holes in this that we can, let's analyze the number of link clicks vs. free trials.

Facebook ad link clicks

Again, we find that our test results show that Facebook ads using videos outperformed ads using photos.
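
If you'd like to run the same kind of check on your own campaigns, a standard two-proportion z-test is all it takes. The sketch below uses only Python's standard library; the free-trial counts are the ones reported above, but the reach figures are hypothetical placeholders, since ours only appear in the screenshots.

```python
import math

def two_proportion_ztest(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Free-trial counts come from our results; the reach figures below are
# HYPOTHETICAL placeholders -- substitute your own reach or link clicks.
video_trials, video_reach = 26, 9_000
photo_trials, photo_reach = 14, 9_500

z, p = two_proportion_ztest(video_trials, video_reach, photo_trials, photo_reach)
print(f"z = {z:.2f}, p = {p:.3f}")  # p < 0.05 would indicate a significant difference
```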

We have confidence moving forward that video ads will result in more free trials and signups.

Insights

We learned a few things along the way, and, well, what's the point of an experiment if we don't share some takeaways with you?

Takeaway #1: Exercise patience when running campaigns with secondary conversions.

As you can see, only 1 of the 40 free trials turned into a paying subscriber during the 28-day period.

When we paused the campaign, it had 40 free trials but no purchases. That dissuaded us from continuing to run the campaign, as we thought that something was surely wrong.

But I guess we were (gladly) wrong! We should have exercised greater patience and let it run longer – perhaps until we hit 100 subscriptions to get a more accurate picture.

Takeaway #2: Split test more creative and move up the CTRs and CVRs.

Next, as you can see above, we ran only 2 creatives – a video versus a single image.

The results were not ideal either.

Facebook Ad Low CTRs

Our average link CTR was as low as 0.34%. Had we split tested more copy, video thumbnails vs. images, video lengths, etc., we might have pushed the CTR up to at least 1% while keeping conversion rates similar.

Doing that alone would have cut our cost per free trial by 66%!
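
As a rough sanity check on that 66% figure: with the same CPM and the same landing-page conversion rate, cost per free trial scales inversely with link CTR, so moving from 0.34% to 1% cuts it by roughly two-thirds.

```python
# Back-of-the-envelope check on Takeaway #2 (same CPM and conversion
# rate assumed; only the link CTR changes).
ctr_observed = 0.0034   # our average link CTR (0.34%)
ctr_target   = 0.01     # the CTR we think better creative could reach (1%)

# Clicks, and therefore free trials, scale linearly with CTR for a fixed
# spend, so cost per free trial scales with 1 / CTR.
reduction = 1 - ctr_observed / ctr_target
print(f"Cost per free trial drops by ~{reduction:.0%}")  # ~66%
```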

Takeaway #3: Use Facebook’s Optimization algorithm to your advantage.

Facebook recently recommended at least 50 conversions a week for its system to learn more about your ideal audiences.

Given that a free trial costs us roughly $20, we should be spending $1,000/campaign in a week!

Perhaps that would have given us better results and lower cost per free trial over time.

If we were to re-run this experiment, we would set aside $6,000 to $8,000 for it, or the equivalent of $200/campaign/day.
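
For the curious, here's the arithmetic behind those figures; the 15-to-20-day run length is our assumption to reconcile the total budget with the daily amount.

```python
# Budget arithmetic for Takeaway #3. The run length is an assumption
# used to reconcile the total budget with the daily figure.
cost_per_trial = 20              # roughly what a free trial costs us ($)
conversions_per_week = 50        # Facebook's recommended weekly minimum
weekly_budget = cost_per_trial * conversions_per_week
print(f"${weekly_budget:,}/campaign/week")       # $1,000/campaign/week

campaigns, daily_budget = 2, 200                 # $200/campaign/day
for days in (15, 20):
    print(f"{days} days: ${campaigns * daily_budget * days:,}")  # $6,000 / $8,000
```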

Conclusion:

From the experiment, we determined that the ad format (video vs. photo) doesn't affect performance as much as we anticipated. But we're really glad we ran with it. The data we collected showed us what might perform better and gives us the insights we need to further refine future campaigns.

So, what are you waiting for? Start testing today and take your campaigns to the next level.
