Has this ever happened to you before?
You have a great product that customers love and one day, someone influential writes about it.
What would you do?
Would you pay to promote it?
Last year, we started promoting such articles among our website visitors.
Then an idea came to mind – what if we could replicate the same article on our site, then drive traffic from a Facebook Ad to it?
This way, we’d not only get to toot our own horn, but also retarget all of these people!
Image credit: http://clubwealth.com
So we designed an experiment to find out if this was indeed a good idea:
Here’s a quick look at how both campaigns did:
As you can see, the original article generated 50% more free trials than the replica!
But did the free trials turn into subscriptions and if so, which Facebook Ad campaign was more profitable?
We’ll share the final results with you later, but first let’s talk a bit about the experiment.
Whenever your product gets featured in the media or by an influential person, there is an opportunity to get the word out and establish your brand as an authority.
We believe that these organic mentions (social proof) are one of the best forms of testimonials for any business.
That’s why we consider promoting these articles a key part of our growth strategy.
However, it is difficult to know who has read them.
When you promote these third-party articles, you cannot track the people who have clicked on them unless you use a tool like Snip.ly.
And this makes it impossible for us to retarget these people to sign up for free trials. Yes, that means we’re missing a big opportunity!
So we thought of replicating the article on our site. This would allow us to promote a 3rd party mention of our product, yet track and retarget these people.
The best of both worlds!
To see if there is a difference between both approaches, we decided to conduct this experiment with the following hypothesis:
There are 4 things we want to highlight in our Facebook Ad Campaign set-up:
For this experiment, we wanted to reach people who already know about Agorapulse. So we retargeted people who had visited our website within the last 30 days.
We also targeted people worldwide, excluding Spanish-, Portuguese-, and French-speaking countries, as those audiences may find the English post less relevant due to language issues.
We also excluded the “Manager” app, which includes all free trial users and paying subscribers.
This was done to avoid contaminating the results since we only want to track new free trials and subscribers from the ad.
The ad is simple. Since we are split testing between two different articles, we used the same ad in both Facebook Ad campaigns:
For the first campaign, we sent users to the original article:
Then we replicated the article using Instapage and created a second campaign for it:
Both articles have identical content except for 3 things:
Now let’s take a look at the rest of the set-up.
One peculiar thing we did in this experiment was to optimize for link clicks instead of free trials.
Facebook recommends achieving a minimum of 50 conversions a week before using the conversion objective.
Since we didn’t fit the criteria, we decided to optimize for link clicks instead.
Apart from that, we used the automatic bidding system and spent $15 a day for each campaign.
Which article do you think generated more paying subscribers from the Facebook Ads?
It’s the original article!
Not only did the original article generate 50% more free trials, it was also the only campaign to bring in subscribers: three, at $99/month each.
This means that the Facebook Ad campaign will break even in 1.5 months and make profits thereafter. Sweet!
This got us thinking: if we ran this experiment or a similar experiment again, would we see the same results?
To find out, we have to see if the results are statistically significant.
If math is not your friend, it’s alright! All you need is this calculator.
It’s like a virtual fortune teller – tell it something and it will show you the outcome!
You’ll also want to listen to the podcast for further insights on how we verified this data with our own data scientist!
So we entered the number of visits to each site and the free trials into the tool (A represents the replica page, B the original article):
Boom, the results were generated immediately!
As you can see, the original article converted 54% better than the replica.
However, the test only reached 90% significance, leaving a 10% chance that the difference is due to random variation rather than a real effect.
In other words, if we ran the experiment again, there is roughly a 1 in 10 chance that we would not see a similar result.
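For the curious, calculators like this typically run something along the lines of a two-proportion z-test under the hood. Here’s a minimal sketch of that test in Python; the visitor and trial counts below are made-up placeholders, not our actual campaign numbers:

```python
from math import erf, sqrt

def ab_significance(visitors_a, conversions_a, visitors_b, conversions_b):
    """Two-proportion z-test.

    Returns each variant's conversion rate and the one-sided confidence
    that B genuinely converts better than A.
    """
    p_a = conversions_a / visitors_a
    p_b = conversions_b / visitors_b
    # Pooled conversion rate under the null hypothesis (no real difference)
    p_pool = (conversions_a + conversions_b) / (visitors_a + visitors_b)
    # Standard error of the difference between the two proportions
    se = sqrt(p_pool * (1 - p_pool) * (1 / visitors_a + 1 / visitors_b))
    z = (p_b - p_a) / se
    # One-sided confidence via the standard normal CDF
    confidence = 0.5 * (1 + erf(z / sqrt(2)))
    return p_a, p_b, confidence

# Hypothetical numbers: A (replica) converts 2.0%, B (original) 3.1%
p_a, p_b, confidence = ab_significance(1000, 20, 1000, 31)
print(f"A: {p_a:.1%}, B: {p_b:.1%}, confidence B > A: {confidence:.0%}")
```

With these placeholder numbers the confidence comes out around 94%, which is how a calculator would report “statistically significant at 90% but not at 95%.”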
What about the conversion of free trials into subscribers? Would we likely see the same results again if we re-ran the experiment?
We entered the number of free trials and subscriptions respectively and here’s what we saw:
(A represents the replica page, B the original article)
The results are positive!
This means that if we were to repeat the same test 100 times, we would likely get the same outcome 97 of those 100 times.
Truth is, we thought the replicated article would outperform the original.
It has no menu, no side bars, and nothing which distracts the reader.
So these results were surprising to us!
But here’s what our data scientist had to say:
In the replicated article, we credited iag.Me, the original publisher.
However, doing so probably suggests to our readers that they’re about to read something that may be biased.
Why else would we be promoting it?
So compared to the original article, we may have lost many visitors before they even finished reading the post.
This would explain why fewer people took up free trials – because they did not even get to the point in the article where our product was mentioned.
Earlier, I mentioned that one of the key differences between the original and replicated articles was the presence of social proof.
On the original article, you can see that it has been shared 4.4K times on social media!
Social proof is a powerful force that can influence one’s choice.
That’s why nearly every new blog promotes its articles as hard as it can to look credible.
Compare a third-party article shared over 4.4K times to a replicated article with no social proof: which is more believable?
Although we launched this experiment with the aim of tracking visitors and re-targeting them in the future, we have yet to do so.
Once we do, the results may not be the same.
We may just see far cheaper, downstream free trials and subscribers from the people who visited the replicated site.
All we need to do is to present them with the deal and encourage them to try our product out.
Do you think this would change the results entirely?
We think it might 🙂
We will update again once we rerun the experiment and add the retargeting campaign. Bookmark and check this space again!
The results we had from this experiment were encouraging — so we decided to duplicate the same campaign with the original article and launch it as a fresh campaign.
We spent an additional $3,862.62 and generated 163 free trials at $23.70 each (well within our target) and turned 11 of those free trials into paid subscriptions!
Those subscriptions were worth an average of $119 a month, and the campaign would break even in about 3 months and turn a profit after that.
To say the least, we are happy with how the campaign is going!
What do you think about this experiment? Is there something you would have done differently?