Thursday, 26 November 2009

How can we make our supporters more active online?

Following the release of a UK e-campaigning benchmark study, Duane from fairsay asked a group of practitioners the following question:

"By now, many of you have dug into the 2009 eCampaigning Review (http://fairsay.com/ecr09). You will thus have noticed that there are a few areas where *many organisations are doing very poorly*. *Why do you think that is?*

- 50%+ of supporters had not taken any action in the last 18 months for almost half the organisations
- 70% of *active* supporters had only taken 1 action in the last 18 months"

The discussion on the list was interesting, ranging from "the issue is capacity" to "the issue is strategy as well".

Below is my contribution to the discussion, based on my experience over the past couple of years ...

--------------------

Due to the nature of Duane's research, we are in danger of falling into the trap of siloing people as 'Campaigners'. If we look at our supporters in this way, then yes, it is a problem if they only take ONE action in 18 months.

But we know that it is entirely possible that a supporter will engage in other ways over the years - by attending an event, filling in an action card, donating, purchasing merchandise or a virtual gift, running a marathon, donating in memory of someone, filling in an order form, etc, etc...

Now this points to a few related and common problems in charity communications:

- no integration between online and offline databases, and therefore no 360-degree view of a supporter. In other words, one person could be interacting with an organisation both online and offline, but because the databases holding this data are not integrated, and/or there is not enough data to de-dupe records (see the sketch at the end of this post), that individual might appear inactive. Or that individual could appear as two, three or four different people, depending on how many disparate databases their data is held on.

- because we lack a full view of how a supporter interacts with an organisation, it is hard and time-consuming to decide what kind of retention strategy should be developed. We could therefore be communicating with a supporter who has only taken one online action as if they were an 'inactive supporter', simply because we are unaware of their offline, or non-campaigning online, activities.

- silos in the organisation - to create an overall, corporate supporter development strategy which offers people a number of ways to engage, different teams in an organisation need to collaborate on its development. That can be hard, because every team feels a little bit that they 'own' THEIR Campaigners, or Donors, or Teachers, etc, etc - basically, the fact that one supporter could be all three is hard to grasp.

- going back to e-campaigning - as Andrew said, "Not every campaign is well suited to public mobilization". We still do it because, in our lobbying efforts, we need to show that there is public/voter support for an issue. But sometimes we need to create campaigns with the sole purpose of engaging less active lists, recruiting a specific age group, or retaining very active campaigners. And not every Campaigns and Policy team buys into this.

These issues are much harder to resolve in bigger organisations and working in a small organisation could actually be an advantage.
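
To make the de-duplication point concrete, here is a minimal sketch of matching supporter records across an online and an offline database by normalised email address. The field names, record layout and matching rule are assumptions for illustration only - real de-duping usually also needs fuzzier matching on name, address, postcode and so on.

```python
# Minimal sketch: spotting the same supporter across an online and an
# offline database by normalised email address. Field names and the
# matching rule are illustrative assumptions; real de-duping usually
# also needs fuzzy matching on name, address, postcode, etc.

def normalise(email):
    return email.strip().lower()

online_db = [
    {"email": "Jane.Doe@example.org", "last_campaign_action": "2009-03-01"},
]
offline_db = [
    {"email": "jane.doe@example.org ", "last_donation": "2009-10-15"},
]

offline_by_email = {normalise(r["email"]): r for r in offline_db}

for record in online_db:
    match = offline_by_email.get(normalise(record["email"]))
    if match:
        # Without this join, Jane looks like a one-off campaigner online and
        # an anonymous donor offline - two partial views of one person.
        print("Same supporter:", record["email"], "and", match["email"])
```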

Wednesday, 25 November 2009

Evaluating and Monitoring (impact of) digital communications

Monitoring and Evaluating Online Projects

With online communications we can measure and evaluate a lot more than with offline communications. The trick is to understand which stats are relevant and usable, what they tell us, and what we can learn from them. What do we mean by 'impact' in the world of digital communications, and how do we evaluate it?


Metrics:
Open rates, click-through rates
Conversion rate
Page views
Visits
Email sign-ups
Referrers

What can we evaluate online?
- how many people came to the website
- how many times people loaded (viewed) a page
- where they went after they viewed the page
- where they came from, and what they searched for to end up on that page
- we can even analyse the behaviour of a specific segment of people (for example, everyone who visited a particular page, or came from a specific location)
- how many people successfully completed a journey we've set through the website (see the sketch below)
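
The last two bullets - segments and journeys - can be answered from raw page-view records. Below is a rough, assumed illustration of the idea; the record format is made up, and in practice an analytics package (Google Analytics, for example) reports this for you.

```python
# Rough sketch: "how many people completed the journey we set?" answered
# from raw page-view records. The record structure is an assumed,
# simplified format; a real analytics tool would report this directly.

from collections import defaultdict

# (visitor_id, page, referrer) - one row per page view
page_views = [
    ("v1", "/appeal", "google.com"),
    ("v1", "/donate", "/appeal"),
    ("v1", "/thanks", "/donate"),
    ("v2", "/appeal", "facebook.com"),
    ("v3", "/appeal", "google.com"),
    ("v3", "/donate", "/appeal"),
]

journey = ["/appeal", "/donate", "/thanks"]  # the path we want visitors to take

pages_by_visitor = defaultdict(set)
for visitor, page, _referrer in page_views:
    pages_by_visitor[visitor].add(page)

started = [v for v, pages in pages_by_visitor.items() if journey[0] in pages]
completed = [v for v, pages in pages_by_visitor.items()
             if all(step in pages for step in journey)]

print(f"{len(started)} visitors started the journey, {len(completed)} completed it")
# Segmenting works the same way: filter page_views by referrer, location,
# etc. before counting.
```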

email

- how many people opened an email (HTML emails only)
- how many people clicked on the links
- how many people who opened the email finished the journey (for example, donated or took a campaigning action) - see the sketch below
etc, etc
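
To put numbers on that email journey, here is a minimal funnel sketch. The figures are invented, and exact metric definitions vary between tools - for example, some report click-through rates against delivered rather than opened emails.

```python
# Minimal sketch of an email funnel: delivered -> opened -> clicked -> completed.
# All figures are invented; metric definitions vary between email tools.

funnel = [
    ("delivered", 10_000),
    ("opened",     2_300),  # open tracking only works for HTML emails
    ("clicked",      640),
    ("completed",     95),  # e.g. donated or took the campaigning action
]

for (stage, count), (prev_stage, prev_count) in zip(funnel[1:], funnel):
    rate = 100 * count / prev_count
    print(f"{stage}: {count} ({rate:.1f}% of {prev_stage})")
```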

Based on this data we can get a good idea of how successful a specific piece of content is, which positioning of elements on a web page leads to higher conversion rates, and so on.

Impact

In order to measure impact, you need to be clear about your communication strategy and how it contributes to the impact of your initiative.

So old-school comms planning is key, and I always try to do it in an iterative way with the teams I work with:

1. When you identify objectives (why are you producing a piece of communication?), immediately think about how you'll know you've achieved them. Can you measure this – do you have internal systems set up to do it? If not, have a think again.

For example, a typical objective would be "To educate our supporters about XYZ". How will you measure this? By the number of email sign-ups? The number of packs distributed? An online survey of your supporters before and after? None of these answers is wrong, but each clearly sets out what reaching a specific objective would mean for your organisation.

By asking these questions you identify the RESPONSE you need from your supporters.

2. Who is your audience? What are they like? Are they likely to respond in the way you want them to? If yes, great; if not, maybe you need to rethink your objectives and therefore the response you are asking for.

How do you set your targets?

The best way is to learn from the past or from baseline studies.
Alternatively, you can evaluate by monitoring trends – for example, a "% increase" or "more of ….". But then you need a baseline which shows what the current situation is – so you have something to compare against.
If you have no baseline, compare yourself to benchmarks from the same sector (such as the e-benchmarks study below).
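
As a trivial, invented example of what comparing against a baseline looks like in practice:

```python
# Trivial sketch: turning a baseline into a target and checking progress.
# All figures are invented.

baseline_signups = 1_200   # email sign-ups over the previous 12 months
target_increase = 0.20     # aim for a 20% increase
target_signups = baseline_signups * (1 + target_increase)   # 1,440

actual_signups = 1_380
change = 100 * (actual_signups - baseline_signups) / baseline_signups

print(f"Target: {target_signups:.0f} sign-ups ({target_increase:.0%} increase)")
print(f"Actual: {actual_signups} sign-ups ({change:.1f}% increase so far)")
```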

Baselines
E-benchmarks study (US): http://www.e-benchmarksstudy.com/
Performance benchmarks (UK): www.fairsay.com