Archive for the ‘Marketing data management’ Category

Percassity Perspectives

Wednesday, July 21st, 2010

Latest edition of my company newsletter.

Issue 4, July 2010. News round-up includes a renaissance for direct mail, and the latest on data protection and cookies. We also take a look at the best marketing operations news on the web, question the existence of the “marketing all-rounder”, and debate the internal vs external database resource question.

When to stop flogging a dead horse

Tuesday, April 13th, 2010

There’s a strong tendency when planning a data selection for a forthcoming campaign or programme to pull as much data as possible in order to maximise the reach of the activity and the corresponding response. This is nearly always self-defeating, however, not least when it means using every record that meets your selection criteria, regardless of how long ago it was collected or when any kind of response was last received. Even if such data is not obviously out of date, there are many reasons to exclude it from ongoing activity.

Although this is mostly an issue for email activity rather than the relatively more expensive direct mail, it’s still applicable to both. The greater cost involved with DM creates a natural incentive to fine-tune selections ahead of launching a campaign. Even so, it’s extraordinary how poorly targeted such activity can often still be, with the obvious parameter of data age not taken into account.

The seemingly next-to-nothing cost of email, though, makes it easy to think that there is no impact to using all available data, but as we all know (albeit don’t necessarily acknowledge) this is not the case. Diligent email marketers will of course remove bounced email addresses from their lists in order to maintain a clean database and eliminate records known to be no longer active (although not always, see Email bounces and database updates). And it goes without saying that opt-outs and unsubscribes must be removed in order to maintain privacy compliance. Other than that, if you’ve got a usable record, use it, right?

Well, an obvious effect of taking this approach is to actually diminish your percentage open rates, since the opens that you do achieve will be diluted by all those disengaged recipients. Now you might be thinking that this is just damned lies and statistics, since the overall number of opens isn’t changed by the total number of recipients. If you’re monitoring these metrics however, they will be giving you a false, and unnecessarily pessimistic, impression. It will be much harder to achieve improvements due to the dead weight of those recipients who are never going to look at what you send them.

Continuing to market to an artificially inflated list also obscures the number of people you’re actually reaching. The absolute open and click rates are crucial of course, but continuing to hope that non-responsive recipients will at some point come to life again may mask deeper issues with your database. Perhaps you should be looking for fresh subscribers or prospects via external data acquisition or increased social media activity to encourage opt-in. (Don’t just rush out and rent a list though – see the point on Data acquisition in my recent post How to take advantage of a recovery.)

How then should you go about honing your list selection when preparing a new campaign? Well obviously it goes without saying that your activity should be carefully targeted at individuals meeting relevant criteria across role, industry, interest, behaviour and so on. A quick and easy way to eliminate the unresponsive element of your database however is to apply a filter I and others often refer to as “recency” (accepting this is a made-up word!). This is by no means rocket science, but takes a little discipline and good data management. Put simply, those individuals in your database that have not responded or interacted in any way for a defined period of time, usually 2-3 years, should be excluded from activity going forwards. Even if their email address is still in use they’re simply never going to respond and are just skewing your results as discussed. The minuscule possibility that they will respond in the future is just not worth the negative impact of continuing to include these recipients in your activity.

The trick here of course is the ability to effectively determine who these non-responders are. You will need the outcomes of your email and other direct activity to be fed back to your database in order to readily make a selection based on these criteria. As well as email opens and clicks, you should also take into account website log-in if applicable, event attendance, purchase (obviously) and any other behaviour you can identify and track. Increasingly, this might include social media activity, such as Twitter or Facebook. It’s quite possible that lack of actual response to email doesn’t mean lack of interest, but you need to demonstrate this, not just make an assumption. The ability to make this part of your selection criteria clearly needs to be a “production” capability, built-in to your marketing operations, and not a hugely labour intensive task for every campaign execution.

It’s worth noting also that the lack of response to marketing activity could itself be used as a trigger for some other kind of follow-up, particularly for high value contacts. If a past customer or senior-level prospect has stopped responding, a quick call using a low-cost resource (i.e. not an expensive Inside Sales rep) to check their status could be worthwhile. Maybe the contact has left and been replaced, changed roles or allowed your company to fall off their radar. You might be able to re-engage, but if not, move on.

Recency should be a field in your database that is constantly calculated based on all the criteria outlined above, which can be readily included in a selection. Just to make the point, this is completely different from “last edit date”, which can often be set when a record in a database is merely viewed, regardless of whether a real change was made or activity performed by the contact. Implementing this simple addition to your campaign selection will have an instant, positive effect on your marketing metrics and save you from flogging dead horses.
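To make the idea concrete, here’s a minimal sketch of how a recency filter might be computed. It assumes a hypothetical structure where each contact carries a list of interaction dates (opens, clicks, log-ins, purchases and so on) fed back from your activity; the field and function names are illustrative, not from any particular system.

```python
from datetime import datetime, timedelta

# Illustrative threshold: exclude contacts silent for ~3 years
RECENCY_WINDOW = timedelta(days=365 * 3)

def last_interaction(interactions):
    """Most recent interaction date for a contact, or None if there are none."""
    return max(interactions) if interactions else None

def is_recent(interactions, today=None):
    """True if the contact has interacted within the recency window."""
    today = today or datetime.today()
    last = last_interaction(interactions)
    return last is not None and (today - last) <= RECENCY_WINDOW

def select_for_campaign(contacts, today=None):
    """Filter a campaign selection down to 'recent' contacts only.

    contacts is a hypothetical mapping of contact id -> interaction dates.
    """
    return {cid: ints for cid, ints in contacts.items()
            if is_recent(ints, today)}
```

Note that the recency date derives only from genuine contact behaviour, never from record edits, which is exactly the distinction from “last edit date” made above.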

How to take advantage of a recovery

Monday, February 8th, 2010

So, 2010 is well underway, and hopefully the difficulties of 2008/9 are slipping behind us. As business begins to pick up again, and budgets and activity levels are restored, everyone looks forward to getting back to business as usual.

Or perhaps not – perhaps there’s a better way.

Many Marketing departments shed staff last year, and although I’m not advocating a jobless recovery, there may be smarter ways of undertaking the activities some of those people performed, rather than just throwing bodies at the routine challenges marketing encounters. Here are a few things to think about doing differently this year.

  • Data acquisition – When obtaining targeted contacts for marketing activity take a long term approach. Renting a list for a tactical campaign that’s coming up will not be successful; ongoing activity is the key. Spend time researching the right data source (see Business data and Sales prospecting tools on our Resources page) and if bespoke contact discovery is necessary, leave enough time. This also makes investing the necessary time and effort in properly handling the data more worthwhile: load the list into your database/campaign system, flag the source, track outgoing activity and record response (see point below). This allows the effectiveness of the acquired data to be measured much more readily.
  • Proactive data quality management – Avoid “a quick check of the data” being the last thing that happens before campaign execution. Data quality is an ongoing task and leaving it to the last moment will mean it’s always a panic activity that never gets done properly. Ideally, you should implement a true data quality programme and a suitable solution to monitor and maintain data (see previous posts Data quality – a vision and 6 data quality solution requirements). At the minimum though, use one of the many (not necessarily expensive) tools to identify issues on a routine basis and fix them as you go along. (See Data quality tools and consultancies on the Resources page.)
  • Joined up response management – Campaign execution, whether direct mail or email, is often carried out by external vendors, which is understandable. They can pull landing pages and micro-sites together quickly and easily, where perhaps building such facilities into your main website is onerous and time consuming. However, campaign reporting should take place within your existing processes so that it’s a business-as-usual activity, not an exceptional process that only a couple of people understand. If you are hosting your own landing pages, the same principle applies of course. Hopefully capturing such responses directly to your marketing database is relatively straightforward (many systems have web-to-lead functions), but if it has to be manual, so be it. This investment in time will pay off when it comes to reporting and tracking.
  • Skills – Consider the expertise that is really required as activity levels rise and how best to obtain it. Rather than re-employ generalists, identify two or three step change projects and employ temporary specialists or agencies to get those changes achieved using what was, previously, salary budget. Once these programmes have been completed, review the skills you need before determining the types of roles required and taking on new permanent staff. Use this as an opportunity also to do some testing before deciding where to focus new spend. Again, this isn’t to discourage creating jobs for unemployed marketers, but experimenting and testing actually creates gainful activity that will bring the recovery forward, without requiring companies to commit too soon whilst it remains tentative.
  • Sales and Marketing database integration – Strive to ensure that your marketing system and the system your Sales team are using are linked together as closely as possible. Leads, once qualified, should appear directly in your SFA (sales force automation) system, not as spreadsheets or emails sent to Reps. Even better, share contact data between the two systems so that changes in either are immediately available to everyone. This should hopefully also help with tracking leads once they have been supplied to Sales, and eventually measuring the outcome of marketing activity.

As the recovery takes hold, let’s hope that marketing departments start hiring again, and put all that talent to work on creating effective campaigns.

With thanks to Kate Mayfield of Data & Mash for contributing to this post.

Alternative approaches to subject line personalisation

Tuesday, October 27th, 2009

Over coffee with a client’s VP of Marketing last month, we came up with an idea for customising (or customizing!) subject lines in email marketing. It’s well known that subject lines are a key determinant of open rates and every good campaign should involve the testing of different variations to establish which one performs best. Considerations often revolve around personalisation or length (with regard to whether shorter or longer is better), but we got to thinking that quirky or straight might also have an effect.

Some people, we concluded, might quite enjoy an email subject like “Have lunch on us whilst we talk about our stuff!”, whereas others may prefer a more serious tone along the lines of “Learn the benefits of our products over lunch”. This could be tested over a sequence of campaign executions and the individual open rate for each recipient recorded to see whether they tended to respond better to one type of line or another. This implied preference could then be recorded within the email or marketing database and utilised as a customisation parameter in future activity.

Of course any number of other factors could influence an individual’s open rate so ongoing monitoring and adjustment would be needed to ensure peak effectiveness. Just an idea though, and I set it free here for your consideration. If you give it a go, let me know how you get on!
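As a rough sketch of how the implied preference might be derived, the tally below records opens per subject-line style per recipient and only declares a preference once a minimum number of sends of each style has gone out. The style labels, the data shape and the threshold are all hypothetical choices for illustration.

```python
from collections import defaultdict

def infer_preferences(send_log, min_sends=3):
    """Infer each recipient's implied subject-line style preference.

    send_log is a hypothetical list of (recipient, style, opened) tuples,
    where style is "quirky" or "straight" and opened is a boolean.
    A preference is only recorded once both styles have been sent at
    least min_sends times, to avoid reading too much into one send.
    """
    opens = defaultdict(lambda: defaultdict(int))
    sends = defaultdict(lambda: defaultdict(int))
    for recipient, style, opened in send_log:
        sends[recipient][style] += 1
        if opened:
            opens[recipient][style] += 1
    prefs = {}
    for recipient, styles in sends.items():
        if all(styles.get(s, 0) >= min_sends for s in ("quirky", "straight")):
            rate = {s: opens[recipient][s] / styles[s] for s in styles}
            prefs[recipient] = max(rate, key=rate.get)
    return prefs
```

The resulting preference could then be stored against the contact record and used as a customisation parameter, with ongoing re-evaluation as fresh sends accumulate.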

Email bounces and database updates

Friday, August 28th, 2009

Commencing an engagement earlier in the summer with a company for which I had previously worked, I was issued with an Exchange account for internal communications whilst on-site. Not surprisingly, my external email address was the same as it had been when I was employed there, since it adopted a standard format comprising my first and surname together with the company’s domain. What did surprise me though, eighteen months after leaving the company, was the steady stream of emails I began to receive from lists to which I had been subscribed before I left.

Now perhaps I should have diligently ensured, before moving on, that I had unsubscribed from these lists or informed their senders of my change of address. The reality though is that this is often harder than it seems, between keeping track of the lists to which you have subscribed and knowing how to advise your new details. It’s usually not the highest priority when moving on either.

These emails sent to my old address would certainly have been bouncing back to the originator for quite some time. The failure, or conscious decision, by these senders not to process these bounces and use them as an opportunity to update their databases is astonishing. Across the entirety of their databases and subscriber lists, given the rate of decay of business data, these senders must experience significant volumes of email delivery failures.

Just as with spam, it’s tempting to dismiss such considerations on the grounds that the cost of continuing to send to dead addresses is minimal, the effort of doing something about it substantial and the overall impact negligible. This is not the case however, and persisting in sending to bounced addresses can lead to deliverability issues and represents a missed opportunity for database management.

Repeatedly sending to non-existent addresses and incurring the bounce back messages this generates gets noticed and can lead to being placed on spam offender lists. This could cause all email to be blocked by spam filters with obvious dire consequences for campaign effectiveness. You may not even know that there is a problem, except for the rather disappointing response rates.

Failing to update marketing databases with bounced addresses also means that the opportunity to track the fact that the record itself may be invalid is also lost. If other activity is being driven from the database, such as DM, then significant cost can be incurred sending to contacts who are no longer there. Acting on email bounces also offers the opportunity to proactively update the database. If an individual represented a high value contact (someone in a senior position or a frequent purchaser), perhaps it’s worth a call to establish where they’ve moved in order to re-establish contact or identify a replacement?
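A bounce-handling routine along these lines needn’t be elaborate. The sketch below folds a batch of bounce notifications back into the database, invalidating addresses on a hard bounce, tolerating a few soft bounces, and flagging high value contacts for follow-up. The field names and thresholds are purely illustrative.

```python
SOFT_BOUNCE_LIMIT = 3  # illustrative: tolerate a few transient failures

def process_bounces(database, bounces):
    """Update contact records in place from (email, bounce_type) events.

    database is a hypothetical list of contact dicts keyed by "email";
    bounce_type is "hard" (address doesn't exist) or "soft" (transient).
    """
    by_email = {rec["email"]: rec for rec in database}
    for email, bounce_type in bounces:
        rec = by_email.get(email)
        if rec is None:
            continue  # bounce for an address we no longer hold
        if bounce_type == "hard":
            rec["email_valid"] = False
            # High value contact? Flag for a call to find the replacement.
            rec["review"] = rec.get("high_value", False)
        else:
            rec["soft_bounces"] = rec.get("soft_bounces", 0) + 1
            if rec["soft_bounces"] >= SOFT_BOUNCE_LIMIT:
                rec["email_valid"] = False
    return database
```

The key point is that the bounce doesn’t just suppress the email address; it becomes a signal on the record itself that other channels, such as DM, can act on.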

I’m not complaining that I’m receiving some of these emails again, and it may even be to some of the senders’ benefit in the end. But the likelihood of this situation arising is tiny and the potential negative impact significant. There’s no excuse for bad practice.

Eleven Steps to kick off your CRM system project

Monday, July 6th, 2009

We’ve run many marketing automation projects over the years, both large and small. Here’s a simplified version of the methodology we use, and some hints around getting your project up and running!

1. Project Feasibility – an informal review to scope the potential project and set some expectations. This process might be no more than a short internal meeting, but at this stage you’ll not only be able to roughly size the project, but you’ll also have a good handle on the costs you’re currently incurring. Look at the organisation’s current levels of marketing activity, not only in departments carrying out marketing, but also Sales and other functions. Try and come up with some metrics such as spend (internal, number of activities, overall number of touches) and the programme objectives: customer acquisition, retention, up-sell/cross-sell. Don’t forget softer marketing activities such as newsletters sent by product or customer service groups.

You also need to get a rough idea of the data available; again, don’t forget to look outside the main marketing teams as well as internally. This usually means Sales, Finance (if there is no data warehouse), Customer Services and product management teams.

Add into the mix your organisation’s future needs, growth strategies, new products, desired improvement in customer experiences, structural acquisitions, as well as any predictable internal factors around people or structural changes.

Activity + Costs + Data Resource + Business Objectives are the inputs you’ll need to outline the project scope.

2. Initiate Project – you might have an internal project initiation process or it might be a more informal set of actions. But any successful project will need most of these components in place:

  • Business Buy-in – Your project is going to need or catalyse change in your business. Now is the time to get your directors or SVP’s on board. And don’t forget to keep up a dialogue with the guys in IT!
  • Project Champion (Board) – Someone with a stake in the project’s success and with enough political weight to fight your corner for resource and support
  • Project Manager – A good PM combines a detailed technical understanding with the oleaginous charm of a diplomat and the motivational skills of Madame Whiplash! They can be either from marketing or from IT or both! At times it’s going to be a full time job, so make sure they have the bandwidth.
  • Success Definition – Develop meaningful indicators of success; these might include reduction in costs, improvement in productivity or trends in conversion costs. Keep them simple (at least what you share with the business) and realistic.

3. 1st Stage Requirements Definition and Data Audit Documentation

  • A high quality piece of work at this stage is vital to the success of the project; investment in time here will be repaid many fold by a successful implementation. By the time you start writing the cheques, it’s too late to find out that what is being delivered doesn’t meet your needs.
  • Clearly prioritise all key features; essential/desirable/optional. On any requirements document the nice-to-haves tend to take up the same amount of space as the need-to-haves.
  • Think about phasing; it’s likely any substantial project will be delivered (and paid for) in a number of stages. Prioritise key deliverables, but you also need to work out the optimum structure to meet operational constraints.
  • Identify any internal process changes needed; this is another area that is easy to overlook or underestimate. Does this need to be a vendor deliverable, or can the business handle it themselves?
  • The Data Audit doesn’t need to be exhaustive at this stage, but you need to have a very good handle on the inputs the system will need, file layouts where applicable, approximate record quantities, and source system dependencies. In any complex organisation it’s easy to underestimate the number of data sources needed for build and production. On one recent project the estimate was 18. The real number once an exhaustive process was complete? 61!

4. RFI/RFP to vendors (and internal Technology Group) – You may or may not have an internal IT resource who feel they can deliver a marketing automation/CRM project. One way to cut through the politics of this is to ask them to respond like the other vendors – make sure they price internal IT resources realistically.

5. Response evaluation and contract negotiation

  • Allow plenty of time for this stage; there’s nothing like seeing the figures on the table to focus the mind, and the vendor will be looking to safeguard their position. A successful negotiation will allow both parties to apportion the risk.
  • Usually there will be a significant up-front cost for development. A guaranteed contract term will allow the vendor to amortise the development costs over the period of the contract.

6. Project Plan and Timeline setting – Make this realistic but not too long. You need to be able to keep the momentum going, but it’s not great to forever be announcing delays. Try and structure the project to allow early wins; for example you may not need every single data feed to start gaining value from a single customer view.

7. Detailed Requirements and Data discovery

  • This should be a straightforward process if you’ve got a good requirements document, but the vendor should respond to your functional prioritisation, allowing you to make informed choices before agreeing the statement of work.
  • Allow plenty of engagement time for Data discovery. You’ve probably lived with this data for a long period of time, but any external consultant or specialist is starting from scratch. You’ll also have to make knowledgeable internal data specialists available to the vendor; if you’ve got complete documentation on all internal systems and feeds, congratulations – that’s a first!

8. Development – Ensure configuration and customisation adhere to the agreed requirements and specification, without suffering from scope-creep (constant additions to the original functionality). Any such development should be minimised and every process or function scrutinised to gauge its real priority and whether “out of the box” functionality will suffice. Conduct regular review sessions with key stakeholders to demonstrate functionality and ensure it is on track.

9. Implementation and migration – Develop data migration and cut-over alongside functional development. Ensuring the right data is available in the new system from day one is critical and users will be unforgiving if it is not. Many CRM implementations fail due to data issues, including data quality. Will you migrate all data from legacy systems, or apply rules and filters? What is the data model of the new system compared to previous ones, and will there need to be a mapping process?

10. Training and ‘Go-Live’ – Don’t overlook training and plan well in advance of go-live. Avoid the temptation to just let users loose on a new system to learn it for themselves, but develop a proper training programme, with hands-on usage (even if it’s a late beta version) and plenty of exercises and review sessions. Aim to have training deliverables available (documentation, process guides or screen tutorials). Run post go-live sessions to recap key functions and answer any questions on general functionality arising as users start utilising the system.

11. Evaluation and On-going development – Conduct reviews to ensure the system is delivering the required functionality. Survey users for their opinion on usability, how much they’re using the system and any key missing functions. Does it make their job easier? Put aside resources to make enhancements post go-live – don’t expect the job to be complete at this stage.

On good form

Wednesday, April 29th, 2009

I wanted to briefly mention a great new resource for anyone involved in online data collection, brought to us by international data quality and addressing guru, Graham Rhind. “Better data quality from your web form” is a free download ebook in pdf format that is designed to help achieve effective international name and address Internet data collection. In the spirit of full disclosure I should mention that Graham asked me to take a look at the book before he published it and as such I can say it’s an invaluable source of information.

Exhibiting Graham’s customary thorough and comprehensive coverage of the topic, the book includes guidance on name and address capture, use of pick-lists and other form elements, usability and data validation. Long-standing readers of my blog will know that web forms are something of a hot topic for me and I hope this book will help curb some of the worst examples of bad practice out there!

The book is available for download from Graham’s site, and whilst you’re there you should take a look at the wealth of additional information he makes available.

7 reasons for real time data updates

Thursday, February 12th, 2009

Previously, (see The secret to CRM & Marketing data management?) I’ve written about the benefits and hazards of creating independent marketing databases, and in particular the questions that need to be asked before taking such an approach. I’m currently involved in a debate over the long term approach that should be taken to the management of marketing data, and where it should reside, which raises some of these issues.

Take the real life example of a campaign automation system that is synchronised with a sales force automation (SFA) solution via a real time data adapter. Changes made to customer and prospect contact data in either system are exchanged almost immediately, together with leads and status updates. When it works, it’s fabulous, providing a real time view of data in either system, ensuring Sales and Marketing are seeing the same picture, whilst enabling them to use the best-of-breed system most appropriate to their respective requirements.

A new CMO and the prevailing economic conditions though have led to questioning whether marketing data should continue to be managed in-house, rather than outsourcing to a marketing service provider. In reviewing the options for outsourcing however, one of the first issues (of many) that arises is how, if at all, should sales and marketing data integration be maintained?

Most out-sourced or hosted solutions tend to rely on much less sophisticated and timely batch data transfers, via ftp or similar mechanism, which are a long way from the real time synchronisation currently enjoyed. Is moving to such a mechanism and the attendant loss of immediacy important? “This is a really worrying trend,” says Shane Redding of business to business digital and data marketing consultancy Cyance. “It is disappointing to see companies make a backward step of this kind, which in my opinion is usually the result of not making the next step of really using the real time data in anger which then demonstrates the return on investment.”

Shane and I are very much in agreement, and here’s why.

  1. Sales and Marketing users don’t, and shouldn’t need to, understand the intricacies of data integration. They just want to know that data in one system is available in the other; a Sales rep entering a new contact in the SFA system wants to know their prospect is available for marketing activity. It invokes much greater confidence if this transfer is immediate, without having to know about or understand overnight batch updates. Once control is lost, users feel disconnected and reduce their ownership of the process, leading to a rapid deterioration in data quality.
  2. The sooner changes made to a record in either system are replicated, the less chance there is of subsequent changes to the same record in the other system being made before the data is transferred, leading to potential anomalies or corruption. This is particularly the case where records are merged or changes are made to many fields at the same time.
  3. Marketing-generated leads need to be transferred to Sales promptly. Research shows that timely lead follow-up is one of the biggest determinants of successful lead conversion. If a lead or response relates to an existing contact or customer, Sales should be made aware as soon as possible, allowing a rep to handle their account in the most appropriate way.
  4. Best-of-breed marketing practices, such as trigger marketing based on response or other events, require good data integration. Explaining such requirements away by saying “we don’t need to do that” won’t cut it. Your competitors are doing it.
  5. Business is moving ever faster. It is expected that data changes are available immediately, especially between Sales and Marketing systems. Reverting to a batch system is a backwards step that fails to lay the foundations for modern and forward-thinking marketing capability.
  6. System development and testing are substantially quicker and easier if changes in one system are reflected in the other almost straight away, rather than having to wait to see if configuration changes are working as intended.
  7. Much of the complexity in data synchronisation lies in the business rules for handling updates, conflicts, mappings and referential integrity. Once these rules are in place, why not transfer data more frequently, reducing the volume and complexity of batch updates when they occur?
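Point 7 is worth illustrating. Below is a deliberately simple, hypothetical sketch of one such business rule: resolving a conflicting field update between the two systems by timestamp (“last write wins”). Real synchronisation adapters apply far richer rules than this, but the argument stands either way: once the rules exist, they work just as well applied per-change as per-batch.

```python
def merge_field(marketing_value, sales_value):
    """Each value is a (data, modified_at) pair; keep the newer one.

    Ties go to the marketing side here - an arbitrary illustrative choice.
    """
    return marketing_value if marketing_value[1] >= sales_value[1] else sales_value

def merge_record(marketing_rec, sales_rec):
    """Merge two versions of the same contact, field by field."""
    merged = {}
    for field in set(marketing_rec) | set(sales_rec):
        if field not in marketing_rec:
            merged[field] = sales_rec[field]
        elif field not in sales_rec:
            merged[field] = marketing_rec[field]
        else:
            merged[field] = merge_field(marketing_rec[field], sales_rec[field])
    return merged
```

Running a rule like this per change, as each edit happens, also shrinks the window in which the kind of conflicting edits described in point 2 can accumulate.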

Marketing shouldn’t be ashamed to stand up for genuine business requirements, with demonstrable benefits. Don’t let internal politics or external suppliers tell you otherwise!

With thanks to Shane Redding for contributing to this post.

Top 10 tips for sourcing marketing technology

Tuesday, February 3rd, 2009

Just about this time last year, I outlined a fairly personal set of 6 crucial marketing automation system requirements that it was particularly important to us to see incorporated into the system we were about to deploy. Business 2 Business Marketing magazine’s online companion site has just published an alternative take on marketing technology requirements which I thought complemented mine. Rather broader in scope, point four “Pilot your technology” particularly resonated with me, given its message of testing and phased roll-out, rather than taking an all-or-nothing approach (see “Testing, testing, testing” for my thoughts in this area.)

After the problems we’ve been experiencing this year, having rushed into production with key elements untested, this sentiment is particularly pertinent. Don’t let yourself think “I’m sure it’ll be alright” – if it can go wrong, it probably will!

The third rail – sales order processing databases

Thursday, October 9th, 2008

I’ve written a lot about integrating sales and marketing databases (posts too numerous to provide links – search on “integration” in the sidebar), but so far I haven’t mentioned the third source in the marketing data ecosystem – order processing systems. Order processing systems are where the sales orders that leads and opportunities (hopefully!) eventually turn into are captured, invoices created and, ultimately, customer status established. It may also be known as an enterprise resource planning (ERP) system, and also handle financials, human resources and other functions (possibly even CRM).

The reason these systems are important within a marketing operations context is because they are generally the system of record regarding whether an organisation is a customer or not, and what their purchase history is if so. Although the sales and marketing systems should have a view of completed opportunities and closed deals, there is inevitably a disconnect from what was supposed to have been sold and what was actually booked. Put starkly, once the deal is clinched, Sales’ enthusiasm for making sure it is accurately reflected in the SFA system wanes considerably; commissions are likely to be calculated based on what the order processing system says.

Care needs to be given to designing order processing links though. Here are some considerations:

  • Is the feed uni- or bi-directional? In other words, does the marketing database just receive updates of customer status and possibly purchase history? Such feeds are often one-way, as the owner of the order system will jealously guard their data integrity – not unreasonably, as it represents the “real” customer database for the company. However, if there is no feedback mechanism, then it may not be possible to correct issues with the data, such as missing address elements, inconsistent country values or duplicates.
  • How does the order system handle accounts and organisations? As a result of the different imperatives of ordering systems (delivery, invoicing, credit accounts), data is frequently held in a way that is inconsistent with that of the marketing database. If different departments of the same organisation, for instance, have made separate purchases, the order system may create separate records which will be perceived by the marketing database as duplicates. Take care in removing these duplicates from the marketing database however; not only might they simply turn up again with the next order system update, but you will lose the account number reference in the marketing database which might be a crucial external reference.
  • What purchase history data is available? If the feed is at “account” level (which may not be the same as unique organisations) it may include most recent order, invoice or contract date. That might be enough to derive a “customer” status, such as having ordered within a specified time frame or are within a maintenance contract, but may not include any information on what was ordered. On the other hand, you might be faced with a feed of every order or invoice, which is considerably more challenging to integrate.
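To illustrate the account-level case, here’s a minimal sketch of deriving a “customer” status from the sort of feed fields described above: last order date and, where applicable, a maintenance contract end date. The window and field names are hypothetical.

```python
from datetime import date, timedelta

# Illustrative threshold: an order in the last two years keeps you a customer
CUSTOMER_WINDOW = timedelta(days=365 * 2)

def customer_status(last_order, contract_end, today=None):
    """Derive 'customer' or 'lapsed' from account-level feed fields.

    last_order: date of the most recent order, or None if unknown.
    contract_end: maintenance contract end date, or None if no contract.
    """
    today = today or date.today()
    if contract_end is not None and contract_end >= today:
        return "customer"  # live maintenance contract
    if last_order is not None and (today - last_order) <= CUSTOMER_WINDOW:
        return "customer"  # ordered within the window
    return "lapsed"
```

A derived field like this is usually all the marketing database needs from an account-level feed; the full order-by-order history can stay in the order processing system.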

Unlike the third rail of an electric railway, which you should never touch in order to avoid electric shock, the order processing system is generally avoided even though it’s a crucial source of marketing data. Which isn’t to say you won’t get a shock if you try to integrate it!