Chapter 17: Case study: GlobalBrand

Our model is not just a theory, so we wanted to give you an example of how we have used it in the past. The best thing about our model is that it scales: whether you're a one-man show or you have 50,000 employees, it will work.

GlobalBrand Case Study (Stage 1)

Once upon a time there was a company called GlobalBrand (this is why we sign non-disclosure agreements, people). They were sending millions of emails and SMS messages per month to their global audience, pretty much in the name of branding and communications. Yes, they had goals, and yes, they had objectives, but they were only measuring the success or failure of the work they were doing on open rates and click-through rates.

This was where it ended, and that, we thought, was a big problem.

Their goal was to reduce unnecessary costs while keeping the communications channels open and running so they could justify their operational expenses; otherwise they would have to reduce headcount. They had a $5 million cost-saving target over six months, with a healthy communication level being about 5 million comms sent out per week.

Bear in mind that in any given week the company would send 400-500 different campaigns. Some of them were lifecycle-based communications and some were calendar-based launches. The difference between the two was that lifecycle communications were sent periodically as part of a long-term programme, while calendar communications were sent when specific GlobalBrand products launched. With this many campaigns, numbering in the millions of emails sent per month, where do you start to optimise?

What we needed was a benchmark.

Benchmarking (Stage 2)

The first thing we did was look at GlobalBrand's benchmarks. Actually, when we looked, they didn't have any, so we went to work to create them. That was not as easy as it sounds, because each lifecycle phase needed different benchmarks. When GlobalBrand attracted a new customer, the first lifecycle communication they received was a welcome message. Then there was a honeymoon period where the consumer was still trying to figure everything out and eagerly accepted all incoming communications (typically the first three months). Then came a Reality Phase where different kinds of messages were sent (3-12 months). Finally there was a phase that GlobalBrand called Re-Purchase, in which they wanted the customer to upgrade, set from 12-18 months.

The calendar communications also had totally different open rates and CTRs. Anyway, GlobalBrand did categorise what they were doing, so we segmented each campaign into its lifecycle or calendar group and then ran a single standard deviation across each group's open rates and click-through rates.

This gave us our initial traffic light mechanism, and we could quickly point out the good and bad performers in terms of open rate and CTR across each segment. But that didn't go far enough: in order to determine what the impact of success or failure was, we had to introduce costs into the equation.
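
To make that mechanism concrete, here is a minimal sketch of how a one-standard-deviation benchmark and traffic light could be computed per segment. The segment names follow the lifecycle phases described above, but the campaign rates and the code structure are illustrative assumptions, not GlobalBrand's actual data.

```python
from statistics import mean, stdev

# Illustrative campaign data: (lifecycle segment, open rate); the segment names
# follow the phases described above, the rates themselves are invented.
campaigns = [
    ("welcome",   0.52), ("welcome",   0.50), ("welcome",   0.49), ("welcome",   0.31),
    ("honeymoon", 0.42), ("honeymoon", 0.35), ("honeymoon", 0.47), ("honeymoon", 0.40),
    ("reality",   0.26), ("reality",   0.31), ("reality",   0.15), ("reality",   0.24),
]

def traffic_light(value, segment_values):
    """Green above mean + 1 std dev, red below mean - 1 std dev, yellow in between."""
    m, s = mean(segment_values), stdev(segment_values)
    if value > m + s:
        return "green"
    if value < m - s:
        return "red"
    return "yellow"

# Benchmark each campaign's open rate against the other campaigns in its own
# segment; the same approach applies to click-through rates.
for segment in ("welcome", "honeymoon", "reality"):
    rates = [r for s, r in campaigns if s == segment]
    for s, r in campaigns:
        if s == segment:
            print(f"{segment:10s} open rate {r:.2f} -> {traffic_light(r, rates)}")
```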

How much did it cost?

Some campaigns sent only 1,000 emails and some sent 500,000, so the raw metrics tended to be misleading. You might have a small campaign that fired 1,000 emails with a 50% open rate and a 30% click-through rate and only get 150 clicks. The campaign with 500,000 emails sent might have only a 30% open rate and a 50% CTR, but that meant 75,000 clicks. From a pure open-rate point of view the first campaign would be at the green end of our benchmark, while the second campaign, which provided much more impact to the business, might be considered a yellow light in terms of open rate.

We also had a cultural problem: volume sent across all channels was the key message (branding and engagement being the goals), but little thought was given in any given country to the cost, because "Business Central" (a kind of global support function) handled the costs, so in many cases local campaign managers didn't have to.

We needed to figure out the price of doing all this stuff in order to measure the impact on the business better than a simple open rate or click-through rate could.

So we worked out the operational expense of sending an email and the expense of sending an SMS. With email, again, it depended on whether you were sending a lifecycle mail or a calendar mail, as the costs were different in each case; calendar emails typically carried more variable costs (like agency fees).

GlobalBrand had 25,000 people working for it, each averaging 2,080 hours a year, and its total fixed costs were $2,000,000,000. That is 25,000 × 2,080 = 52,000,000 hours, so the average hourly cost to the company was 2,000,000,000 / 52,000,000 = $38.46. Each calendar campaign also cost an average of $10,000 to produce.

GlobalBrand had 70 full-time staff working 43 hours a week (on average, with overtime), meaning a weekly staff cost of (70 × 43) × 38.46 = $115,765. They were sending approximately 3.5 million emails a week, so each email cost them 115,765 / 3,500,000 = $0.033 to send.

The calendar emails were slightly more expensive per send, as typically only 500,000 were sent per week and each campaign added an extra $10,000 on average in variable costs. Far fewer people worked on each calendar campaign (approximately seven), so the weekly cost was (7 × 43) × 38.46 = $11,576, plus the $10,000, making $21,576. That meant the calendar cost per send was 21,576 / 500,000 = $0.043 per email.

SMS usually had five full-time people plus operator (media) costs, which depended on the country and varied from 1 cent to 5 cents per SMS sent. So the weekly staff cost was (5 × 43) × 38.46 = $8,269, to which the media costs were added, all divided by the volume sent, which was typically 1.5 million a week. On average it worked out at about $0.034 to send an SMS.
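
As a sanity check on the arithmetic, here is a short sketch that reproduces the per-send figures from the numbers quoted above. The variable names are ours, and the roughly 2.9-cent average SMS media fee is inferred from the $0.034 total rather than stated in the text.

```python
# Reproduce the cost-per-send arithmetic above, using the figures quoted in the text.
FIXED_COSTS    = 2_000_000_000     # total fixed costs per year ($)
EMPLOYEES      = 25_000
HOURS_PER_YEAR = 2_080
HOURLY_RATE    = FIXED_COSTS / (EMPLOYEES * HOURS_PER_YEAR)   # ≈ $38.46 per hour

HOURS_PER_WEEK = 43                # average working week including overtime

def weekly_staff_cost(headcount):
    """Weekly staff cost for a team at the average hourly rate."""
    return headcount * HOURS_PER_WEEK * HOURLY_RATE

# Email: 70 staff, ~3.5M sends per week.
email = weekly_staff_cost(70) / 3_500_000                     # ≈ $0.033 per send

# Calendar email: ~7 staff, ~$10,000 variable cost per campaign, ~500K sends per week.
calendar_email = (weekly_staff_cost(7) + 10_000) / 500_000    # ≈ $0.043 per send

# SMS: 5 staff, ~1.5M sends per week, plus operator fees of 1-5 cents per message;
# the ~2.9-cent average media fee is inferred from the $0.034 total quoted above.
sms = weekly_staff_cost(5) / 1_500_000 + 0.029                # ≈ $0.034 per send

print(f"hourly rate ${HOURLY_RATE:.2f}, email ${email:.3f}, "
      f"calendar ${calendar_email:.3f}, SMS ${sms:.3f}")
```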

Immediately this gave the business a price point. They knew that calendar emails were the most expensive and lifecycle emails were the cheapest.

The example above demonstrates the power of using CPC (cost per click) as the success indicator rather than open rates and click-through rates. When they sent 1,000 emails at a cost of $0.033 per email, it cost them $33. When they sent 500,000 at the same price, it was $16,500. Transposed to CPC (using the click numbers from the earlier example), that means 33 / 150 = $0.22 CPC and 16,500 / 75,000 = $0.22 CPC. In other words, the two campaigns were performing identically from a cost perspective.
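
The same comparison takes only a few lines, using the $0.033 cost per send and the click figures from the earlier example (a minimal sketch, not anything GlobalBrand actually ran):

```python
COST_PER_SEND = 0.033   # lifecycle email cost per send, from the figures above

def cpc(emails_sent, clicks):
    """Cost per click: total send cost divided by the clicks generated."""
    return (emails_sent * COST_PER_SEND) / clicks

print(cpc(1_000, 150))       # small campaign:  $33 / 150 clicks     ≈ $0.22
print(cpc(500_000, 75_000))  # large campaign:  $16,500 / 75,000 clicks = $0.22
```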

Red, yellow, green (Stage 3)

We used a CPC of $3 or more as our red light, meaning those campaigns would simply be stopped; $1-3 as our yellow light; and less than a dollar as our green light. By doing this we achieved two things. Firstly, we simplified our optimisation methods: we no longer started our analysis with opens and click-through rates, we looked at CPC, because by default a low CPC means a good ratio of clicks to the site for the money spent. Secondly, we changed the culture to be very cost conscious. Where before people were only looking at volume, they were now looking at volume and cost.
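
Here is a minimal sketch of that CPC traffic light, using the thresholds above; the campaign names and figures are invented for illustration.

```python
def cpc_light(cpc):
    """Traffic light on cost per click: reds get stopped, greens keep running."""
    if cpc >= 3.0:
        return "red"     # stop the campaign
    if cpc >= 1.0:
        return "yellow"  # scrutinise carefully
    return "green"       # keep running

# Illustrative campaigns: (name, total send cost $, clicks generated).
for name, cost, clicks in [("newsletter A", 33, 150),
                           ("product launch B", 1_800, 900),
                           ("SMS push C", 3_500, 1_000)]:
    print(f"{name}: ${cost / clicks:.2f} CPC -> {cpc_light(cost / clicks)}")
```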

This significantly changed the way campaigns were run. Anything that cost more than $1 CPC was suddenly scrutinised very carefully. We started to see markets that consistently performed poorly, so their budgets were cut until they brought their spending under control. We started to see campaigns that consistently outperformed the others, and those were ramped up. We saw that SMS was a big cost sink in many of the ways it was being used; some of the SMS campaigns resulted in no clicks at all, so those campaigns were dropped. On other campaigns we used 5% tests across copy and content to gauge the response against our benchmarks before rolling out the changes that worked.

All of these changes combined resulted in $22 million in savings for GlobalBrand over a six-month period, a fantastic result in terms of the opportunities to use that money elsewhere.

GlobalBrand continues to use the method above to this day, and has now lowered its bar to $1-2 being a yellow light, having eliminated the campaigns that performed worse than that. The important thing is that it hasn't stopped sending emails or SMS messages; it just sends the ones people want to read, more cost efficiently.

If you liked this chapter, please recommend it to others.

