Adam Lofting, Digital Analyst at WWF
In our line of work, we love stories that bring to life the opportunity to increase conversion rates and raise more money for our important causes.
Here's a personal favourite of mine.
How a one-hour split test raised £68,000 for WWF-UK
While I was working at WWF-UK, we were raising funds for an emergency tiger campaign. Tiger adoptions were typically promoted at an asking price of £3 per month, but online we also offered a one-time payment option starting at £36 to adopt a tiger for a year.
Regular donations were prompted at:
£3 or £5 or £10 or £Other (where other is a free input box)
One-off payments were prompted at:
£36 or £60 or £120 or £Other (i.e. the regular monthly amounts multiplied by 12)
The majority of our gifts were regular donations, but when we looked at the data for the one-off adoption payments, the vast majority of donors were selecting the £36 radio button. We suspected that the gaps at this scale made the increments look too dramatic to a typical donor (the jump from £36 to £60 feels bigger than the jump from £3 to £5), so we tested donation prompts of £40, £50 and £60 for the one-off amounts. A user could still donate the minimum amount of £36 using the "Other" input box.
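For anyone setting up something similar, here's a minimal sketch of what the split amounts to. The amounts come from the story above, but the function and the hash-based bucketing are illustrative assumptions, not WWF-UK's actual code:

```python
import hashlib

CONTROL = [36, 60, 120]  # the original one-off prompts
VARIANT = [40, 50, 60]   # the tested prompts; £36 still possible via "Other"

def prompts_for(visitor_id: str) -> list[int]:
    """Deterministically assign a visitor to one arm of a 50/50 split."""
    bucket = int(hashlib.sha256(visitor_id.encode()).hexdigest(), 16) % 2
    return CONTROL if bucket == 0 else VARIANT
```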
By the standards of online optimisation tests, this one was easy to run. It took less than an hour of work to set up the test, analyse the results and change our site when the test was finished.
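The analysis itself boils down to comparing the average gift between the two arms and checking that the difference isn't noise. Here is a sketch of that check, with made-up gift amounts and scipy's two-sample t-test standing in for whatever tool you have to hand:

```python
from statistics import mean
from scipy import stats

# Gift amounts per arm (illustrative numbers, not the real WWF-UK data)
control_gifts = [36, 36, 36, 60, 36, 120, 36]
variant_gifts = [40, 50, 40, 60, 40, 50, 40]

uplift = mean(variant_gifts) - mean(control_gifts)
# Welch's t-test: does the difference in average gift look real?
t_stat, p_value = stats.ttest_ind(variant_gifts, control_gifts, equal_var=False)
print(f"uplift: £{uplift:.2f}, p = {p_value:.3f}")
```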
The result
The new price prompts for the one-off donation amounts increased the average gift value for one-off adoptions by £3.40, which added up to a £68,000 increase in annual income. A sum like that could restore 1,360 hectares of grassland habitat where tigers live and hunt their prey.
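As a quick sanity check on those figures (assuming, as the story implies, that the £68,000 relates to one-off adoption income alone), the uplift corresponds to roughly 20,000 one-off gifts a year:

```python
avg_gift_uplift = 3.40       # £ added to the average one-off gift
annual_uplift = 68_000       # £ added to annual income
print(round(annual_uplift / avg_gift_uplift))  # ~20,000 gifts a year
```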
For an hour of techy/webby work, that's a significant contribution to the organisational mission, and understandably it's a story people like to hear. Everyone from your supporters through to your board of trustees will enjoy this sort of story, so it makes sense that we love these stories and share them. I've shared this one many times.
The risks of sharing simple split-test wins
But, as with all the stories we share, there is a risk. Most of our news today is filled with riveting tales that lack the information we actually need to make informed decisions. When we share our conversion rate optimisation stories, let's be careful not to leave out the information that people really need to replicate the success.
I thought it might be useful to tell you about the story behind the story above, and behind every optimisation test I have run.
The story behind the story
At WWF, the price prompts on our donation asks were sacrosanct. At least that's how they felt to me when I started working there. They were the product of years of fundraising learning and you'll notice they fit with the style of most fundraising asks from most charities. They are like that because they work. On my first day working there, if I'd suggested changing these price prompts, the answer would have been no. Not that I had any inkling at that point that this was a good test to run.
This optimisation test happened because it had the support of the fundraising team, the web team and the relevant management. It had this support because we had been running lots of smaller tests over the preceding year. And we'd been running them well. We had been documenting and sharing our process and our rationale, and building organisational support with more and more significant tests.
Why we were able to run this test
It was only because we had good data on our existing donation amounts that we could identify this opportunity for improving one-off gift amounts. Weeks, months or maybe even years of work by many people had to happen before the one hour of work that gave us the great web-testing story above.
I can put my name to bits of this organisational development, and to some of the new processes we set up, but it's not my work overall. The real credit goes to the person who managed to include the words "multivariate testing and optimisation" in the job description when they were recruiting for my role. That was my permission to test things.
Sometimes I had to fight for the time to test, because although the people who recruited for the role understood the value of testing, it was only part of my job, and big and urgent projects often trump slow and important ones.
The secret to building a testing culture
But by continuing to document results and share them with the wider team, we slowly moved from fighting for the time to run tests to fielding so many test requests from across the organisation that we built up a significant backlog of ideas.
All in all, it took at least a year of work, including technical, process and internal negotiating effort, to reach the point where a one-hour tweak to the website could increase our annual income by £68,000. I hope that's the genuinely useful story to hear about optimising your website.
I don't want to belittle individual case studies; as this story about spending £200 per year to keep your towels warm shows, a good story can be a catalyst for change. But the best use of individual testing stories is to normalise the ongoing process with the rest of your organisation.
The real first step to split testing
To optimise your fundraising conversion rate, don't start by worrying about the colour of your call-to-action button. Instead, do this:
Make testing a part of someone's job. In writing. In the job description.
Set specific goals for the number of tests to run each year.
Document everything, or you'll find counter-intuitive findings being undone by well-meaning others.
Make your results freely and easily available to anyone in the organisation. This is science and you need to be peer-reviewed.
Review your tests regularly (weekly or fortnightly) and continually plan what you will test next.
If you do all that, you'll be able to answer your own questions about what to test and what tools to use.