Archive for April, 2009

On good form

Wednesday, April 29th, 2009

I wanted to briefly mention a great new resource for anyone involved in online data collection, brought to us by international data quality and addressing guru Graham Rhind. “Better data quality from your web form” is a free ebook, available as a PDF download, designed to help you collect international name and address data effectively over the Internet. In the spirit of full disclosure I should mention that Graham asked me to take a look at the book before he published it, so I can say from first-hand reading that it’s an invaluable source of information.

Exhibiting Graham’s customary thorough and comprehensive coverage of the topic, the book includes guidance on name and address capture, use of pick-lists and other form elements, usability and data validation. Long-standing readers of my blog will know that web forms are something of a hot topic for me, and I hope this book will help curb some of the worst examples of bad practice out there!
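To give a flavour of why this matters, here is a minimal, purely illustrative sketch (my own, not a recipe from Graham’s book) contrasting the over-strict postcode checks that trip up international visitors with a more forgiving approach that captures the data and leaves verification to downstream reference checks. The function names and length limits are assumptions for the example.

    import re

    def strict_zip_check(postcode: str) -> bool:
        """Over-strict anti-pattern: assumes every postcode is a 5-digit US ZIP."""
        return re.fullmatch(r"\d{5}", postcode.strip()) is not None

    def lenient_postal_check(postcode: str) -> bool:
        """Friendlier to international data: accept any short alphanumeric value
        and leave true validation to country-specific reference data."""
        cleaned = postcode.strip()
        return 0 < len(cleaned) <= 10 and any(ch.isalnum() for ch in cleaned)

    for value in ["90210", "SW1A 1AA", "1017 BT", "75008"]:
        print(value, strict_zip_check(value), lenient_postal_check(value))

The strict check rejects two of the four perfectly valid postcodes above (and only accepts the Paris one by accident); the lenient check accepts them all, which is usually the lesser evil at the point of capture.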

The book is available for download from Graham’s site, and whilst you’re there you should take a look at the wealth of additional information he makes available.

Discipline and touch control

Wednesday, April 29th, 2009

A point raised at the IDM Emerging Digital Trends seminar the week before last, a recent conversation and an article in the April issue of Database Marketing magazine all highlighted an issue I thought worth recounting here.

Presenting on the use of web analytics, David Walmsley, Head of Web Selling at John Lewis, talked about customer segmentation and tailoring communications based on behaviour and purchase history. He made the observation that by creating such segments, messaging content and frequency could be tailored appropriately to recipients, increasing relevance and effectiveness.

Separately, I was speaking to the former head of database marketing at a US Midwestern publishing company. He recounted the tale of having finally made inroads with his campaign marketing colleagues, persuading them to adopt a segmentation strategy. This was aimed at helping to reduce the over-touch problems they were facing, where some individuals were receiving as many as one email per day, such was the volume of activity. This of course led to drastically falling response rates, as the blizzard of email simply went ignored. (Interestingly, even opt-out rates weren’t that high, such was the level of disengagement among recipients.) “We need to improve targeting and reduce touch volume,” said my exasperated contact.
“No problem,” came the response, “we’ll just stop emailing the bottom, least valuable segment – that should cut volume by 10%!” The observation that this would make no difference to the top segment (those receiving an email a day), went unheeded…

In his piece in Database Marketing, Warwick Beresford-Jones wrote about “optimisation”, making the point that a given individual can be contacted only so many times before they become unresponsive, and that those touches should be used wisely to achieve best value. “Without optimisation, your best customers are generally over-contacted and your second and third best customers are under-contacted,” states Beresford-Jones. “There is a point in the year when you actually start to annoy your best customers and this impacts directly on campaign profitability.”

Back at the IDM seminar, David Walmsley highlighted this temptation of sending “one extra” email to the top segment when the weekly sales numbers aren’t quite reaching target. I asked him how this temptation should be avoided. “It takes discipline,” suggested Walmsley, adding that taking a short term approach ultimately leads to lost value within your customer base. How should this discipline be engendered though, and in particular how can pressure from senior management in tough trading conditions be resisted?

The answer, as Beresford-Jones amply illustrates in his article, is to have the numbers to hand to support your case. This means being able to demonstrate the return from a given piece of activity and ideally the campaign cost savings made by reducing segment sizes whilst maintaining targeting focus. Point to falling response rates (and opt-out rates where appropriate) as evidence that contact fatigue has set in.
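By way of illustration only (my own sketch with invented numbers, not anything from Beresford-Jones’s article or the John Lewis presentation), the snippet below shows the sort of simple per-segment summary that supports the case: emails per customer alongside response rate, with a crude flag where heavy contact coincides with a collapsed response rate.

    from collections import defaultdict

    # Hypothetical touch history: one row per customer -> (segment, emails sent, responses)
    touch_history = [
        ("top", 30, 1), ("top", 28, 0), ("top", 25, 2),
        ("mid", 8, 1), ("mid", 6, 1),
        ("low", 2, 0), ("low", 1, 0),
    ]

    stats = defaultdict(lambda: {"customers": 0, "emails": 0, "responses": 0})
    for segment, emails, responses in touch_history:
        stats[segment]["customers"] += 1
        stats[segment]["emails"] += emails
        stats[segment]["responses"] += responses

    for segment, s in stats.items():
        touches = s["emails"] / s["customers"]
        rate = s["responses"] / s["emails"]
        fatigue = touches > 20 and rate < 0.05   # arbitrary illustrative thresholds
        print(f"{segment}: {touches:.0f} emails per customer, response rate {rate:.1%}"
              + ("  <- likely contact fatigue" if fatigue else ""))

With numbers like these to hand, “send one extra email to the top segment” becomes a proposal that has to argue against the evidence rather than against a feeling.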

The ease of executing online communications is such that over-touching, even with the best of intentions, is all too easy. Even the pushiest retail sales person would be unlikely to follow you around the store asking every few steps whether you wanted to buy something (certainly not at John Lewis!). Yet that’s what our customers’ inboxes can feel like at times. A little discipline never did anyone any harm and this is no exception.

Salesforce.com, analytics, email marketing and financials – it’s all in the cloud

Monday, April 13th, 2009

The Salesforce.com customer conference in the UK this year took the form of CloudForce, a complimentary day of sessions and vendor showcases, held at London’s ExCeL exhibition centre last week. It’s no revelation that Salesforce.com have long since moved on from simply being a salesforce automation developer. Today, they position themselves as “Force.com”, promoting the benefits of cloud computing – multi-tenanted, internet-based computing platforms – that obviate the need to install software. Indeed, the “no software” message, and attendant logo of the word “software” with a line through it, was repeated at every opportunity. An amusing moment came when Paul Cheesbrough, CIO of The Telegraph Media Group, made reference to “your software” when joining Salesforce.com CEO, Marc Benioff, on stage during the main session. “Your platform I mean,” Cheesbrough quickly corrected himself, “there is no software.”
“Thank you,” replied Benioff.
“I saw it in your eyes!” quipped Cheesbrough.

The AppExchange marketplace that forms part of Salesforce.com offers a plethora of opportunities to expand the functionality of the base product. However, the ready integration capabilities of Salesforce.com and the Force.com application platform enable new possibilities, some of which I thought noteworthy. Force.com is particularly interesting, as it opens up the platform beyond sales and customer service management to one that allows developers to create their own applications running on the Salesforce.com cloud infrastructure. To developers, Force.com represents the opportunity to deliver solutions based around the software-as-a-service ethos without having to build the delivery infrastructure themselves. Adopters of these solutions, for whom not having to install software and maintain their own IT infrastructure is appealing, gain access to applications that meet their requirements but might not otherwise have made it to this delivery mechanism.

One such example is a complete accounting application from financial software developers Coda, called Coda2go. Based around their on-premise solution, Coda2go runs entirely on the Force.com platform and integrates closely with Salesforce.com itself. I wrote recently about the considerations of integrating sales order processing within the sales and marketing “data ecosystem”, where I made reference to the point at which an Opportunity is closed and an order booked. With Coda2go, this process, together with the resulting invoicing, is practically a one-click undertaking. Once the Opportunity is ready to be booked as a sales order, a step that would typically involve manually switching to a different system, all of the order details are picked up from Salesforce.com and transferred to Coda2go, invoices are created and the rest of the accounting process is put in train. I can’t speak to how good a financials solution Coda2go is, but this looks pretty neat!

Closer to marketing home, Cognos (now part of IBM) and QlikTech were offering Salesforce.com-enabled versions of their analytics solutions. As well as enabling more sophisticated analysis, visual representation and dashboards than native Salesforce.com, these solutions work across multiple data sources, holding out the prospect of unified marketing and sales reporting and analysis. By joining marketing data such as campaign execution, responses and leads with converted opportunities and closed deals, the nirvana of true, operational marketing effectiveness reporting comes a step closer. Of course a variety of process implications still need to be considered, but at least data visibility is improved.
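As a purely illustrative sketch of that joined-up view (my own example with made-up column names and figures, not any vendor’s actual data model), the snippet below joins campaign responses to closed-won opportunities to summarise wins and revenue by campaign:

    import pandas as pd

    # Hypothetical extracts: marketing responses and sales opportunities
    responses = pd.DataFrame({
        "lead_id": [1, 2, 3, 4],
        "campaign": ["Spring email", "Spring email", "Webinar", "Webinar"],
    })
    opportunities = pd.DataFrame({
        "lead_id": [1, 3],
        "stage": ["Closed Won", "Closed Won"],
        "amount": [12000, 8500],
    })

    # Join marketing activity to sales outcomes on a shared lead identifier
    report = responses.merge(opportunities, on="lead_id", how="left")
    summary = report.groupby("campaign").agg(
        responses=("lead_id", "count"),
        deals_won=("stage", lambda s: (s == "Closed Won").sum()),
        revenue=("amount", "sum"),
    )
    print(summary)

The hard part in practice is, of course, maintaining that shared identifier and the process discipline behind it, rather than the join itself.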

Finally, and firmly within the marketing realm, a couple of email campaign solutions and a data collection system caught my eye. Genius.com and ExactTarget both offer solutions for creating and despatching marketing emails with all the features you would expect, including HTML templates, personalisation, tracking and reporting. Naturally, this is integrated with Salesforce.com in terms of data management and reporting, making straightforward but relatively sophisticated email marketing very easy. Clicktools allows the creation of surveys, landing pages and forms, enabling rapid generation of marketing response mechanisms, as well as questionnaires and so on. Between all of these solutions, it seems possible that best-of-breed marketing campaigns consisting of outbound email and rich landing pages with response tracking can be created relatively easily and inexpensively, without needing full-scale and costly marketing automation solutions.

So, there you have my quick round-up of highlights from CloudForce ’09, all without reference to meteorology or having my head in the clouds. Doh! Too late.

How data quality equals more revenue

Thursday, April 2nd, 2009

Writing in his “Optimize Your Data Quality” blog recently, Jan-Erik Ingvaldsen of data quality solution developer Omikron referenced an article on destinationCRM.com about a piece of research that’s a must-have for anyone building their data quality business case.

In their recent research study “The Impact of Bad Data on Demand Creation”, sales and marketing advisory firm SiriusDecisions assert that following best practices in data quality led directly to a 66 percent increase in revenue. Whilst I’ve outlined some generic business case drivers in the past (see “Building a data quality business case”), this is the kind of quantitative study that can really grab C-level attention when you’re trying to justify investment in data quality. The research outlines how addressing quality issues early on in the data life-cycle has an almost exponential benefit in cost efficiency and highlights the importance of collaboration in driving quality improvements.
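To make the “earlier is cheaper” point concrete with purely hypothetical figures (mine, not numbers from the SiriusDecisions study), suppose a bad record costs £1 to prevent at capture, £10 to cleanse later and £100 in downstream consequences if never fixed:

    # Hypothetical unit costs per bad record at each stage of the data life-cycle
    cost_prevent_at_capture = 1      # e.g. validation on the web form
    cost_cleanse_downstream = 10     # e.g. batch de-duplication and correction
    cost_of_failure = 100            # e.g. returned mail, lost deals, wasted effort

    bad_records = 5000
    print("Prevent at capture:", bad_records * cost_prevent_at_capture)
    print("Cleanse downstream:", bad_records * cost_cleanse_downstream)
    print("Do nothing:        ", bad_records * cost_of_failure)

Whatever the real ratios turn out to be in a given organisation, it is the shape of that curve that matters for the business case.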

“It is something that your organization simply can’t afford not to do,” says SiriusDecisions’ senior director of research, Jonathan Block. No argument here!