Archive for the ‘Analytics’ Category

Discipline and touch control

Wednesday, April 29th, 2009

A point raised at the IDM Emerging Digital Trends seminar the week before last, a recent conversation and an article in the April issue of Database Marketing magazine all highlighted an issue I thought worth recounting here.

Presenting on the use of web analytics, David Walmsley, Head of Web Selling at John Lewis, talked about customer segmentation and tailoring communications based on behaviour and purchase history. He made the observation that by creating such segments, messaging content and frequency could be tailored appropriately to recipients, increasing relevance and effectiveness.

Separately, I was speaking to the former head of database marketing at a US Mid-Western publishing company. He recounted the tale of finally making inroads with his campaign marketing colleagues, persuading them to adopt a segmentation strategy. This was aimed at helping to reduce the over-touch problems they were facing, where some individuals were receiving as many as one email per day, such was their volume of activity. This of course led to drastically falling response rates, as the blizzard of email simply went ignored. (Interestingly, even opt-out rates weren’t that high, such was the level of disengagement among recipients.) “We need to improve targeting and reduce touch volume,” said my exasperated contact.
“No problem,” came the response, “we’ll just stop emailing the bottom, least valuable segment – that should cut volume by 10%!” The observation that this would make no difference to the top segment (those receiving an email a day) went unheeded…

In his piece in Database Marketing, Warwick Beresford-Jones wrote about “optimisation”, making the point that a given individual can be contacted only so many times before they become unresponsive, and that those touches should be used wisely to achieve best value. “Without optimisation, your best customers are generally over-contacted and your second and third best customers are under-contacted,” states Beresford-Jones. “There is a point in the year when you actually start to annoy your best customers and this impacts directly on campaign profitability.”

Back at the IDM seminar, David Walmsley highlighted this temptation of sending “one extra” email to the top segment when the weekly sales numbers aren’t quite reaching target. I asked him how this temptation should be avoided. “It takes discipline,” suggested Walmsley, adding that taking a short term approach ultimately leads to lost value within your customer base. How should this discipline be engendered though, and in particular how can pressure from senior management in tough trading conditions be resisted?

The answer, as Beresford-Jones amply illustrates in his article, is to have the numbers to hand to support your case. This means being able to demonstrate the return from a given piece of activity and ideally the campaign cost savings made by reducing segment sizes whilst maintaining targeting focus. Point to falling response rates (and opt-out rates where appropriate) as evidence that contact fatigue has set in.
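To make that case tangible, here is a minimal sketch in Python, assuming a flat send log with hypothetical column names, of how response rate by monthly touch count might be tabulated to show where contact fatigue sets in:

```python
import pandas as pd

# Hypothetical send log: one row per email sent, flagging whether the
# recipient responded. Column names are illustrative, not a real schema.
sends = pd.DataFrame({
    "contact_id": [1, 1, 1, 1, 1, 2, 2, 3, 3, 3, 3, 3, 3, 4],
    "month":      ["2009-03"] * 14,
    "responded":  [0, 0, 0, 0, 1, 1, 0, 0, 0, 0, 0, 0, 0, 1],
})

# Touches and responses per contact per month.
touches = (sends.groupby(["contact_id", "month"])
                .agg(touch_count=("responded", "size"),
                     responses=("responded", "sum"))
                .reset_index())

# Band contacts by how often they were touched, then compare response rates.
touches["band"] = pd.cut(touches["touch_count"],
                         bins=[0, 2, 4, 8, 31],
                         labels=["1-2", "3-4", "5-8", "9+"])

fatigue = touches.groupby("band", observed=True)[["responses", "touch_count"]].sum()
fatigue["response_rate"] = fatigue["responses"] / fatigue["touch_count"]

# A response rate that falls as touch count rises is the contact-fatigue evidence.
print(fatigue)
```

If the rate in the higher-frequency bands is visibly lower, that is the number to have to hand when the “one extra” email is proposed.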

The ease of executing online communications is such that over-touching, even with the best of intentions, is all too easy. Even the pushiest retail sales person would be unlikely to follow you around the store asking every few steps whether you wanted to buy something (certainly not at John Lewis!). Yet that’s what our customers’ inboxes can feel like at times. A little discipline never did anyone any harm and this is no exception.

Salesforce.com, analytics, email marketing and financials – it’s all in the cloud

Monday, April 13th, 2009

The Salesforce.com customer conference in the UK this year took the form of CloudForce, a complimentary day of sessions and vendor showcases, held at London’s ExCeL exhibition centre last week. It’s no revelation that Salesforce.com have long since moved on from simply being a salesforce automation developer. Today, they position themselves as “Force.com”, promoting the benefits of cloud computing – multi-tenanted, internet-based computing platforms – that obviate the need to install software. Indeed, the “no software” message, and attendant logo of the word “software” with a line through it, was repeated at every opportunity. An amusing moment came when Paul Cheesbrough, CIO of The Telegraph Media Group, made reference to “your software” when joining Salesforce.com CEO, Marc Benioff, on stage during the main session. “Your platform I mean,” Cheesbrough quickly corrected himself, “there is no software.”
“Thank you,” replied Benioff.
“I saw it in your eyes!” quipped Cheesbrough.

The AppExchange platform that forms part of Salesforce.com offers a plethora of opportunities to expand the functionality of the base product. However, the ready integration capabilities of Salesforce.com and the Force.com application platform enable new possibilities, some of which I thought noteworthy. Force.com is particularly interesting, as it opens up the platform beyond Sales and Customer Service management to one that allows developers to create their own applications running on the Salesforce.com cloud infrastructure. To developers, Force.com represents the opportunity to deliver solutions based around the software-as-a-service ethos, without having to build the delivery infrastructure themselves. Adopters of these solutions, for whom not having to install software and maintain their own IT infrastructure is appealing, gain access to applications meeting their requirements that might not otherwise have made it to this delivery mechanism.

One such example is a complete accounting application from financial software developers Coda, called Coda2go. Based around their on-premise solution, Coda2go runs entirely on the Force.com platform and integrates closely with Salesforce.com itself. I wrote recently about the considerations of integrating sales order processing within the sales and marketing “data ecosystem”, where I made reference to the point at which an Opportunity is closed and an order booked. With Coda2go, this process, together with the resulting invoicing, is practically a one-click undertaking. Once the Opportunity is ready to be booked as a sales order, which would typically involve manually switching to a different system, all of the order details are picked up from Salesforce.com and transferred to Coda2go, invoices are created and the rest of the accounting process is put in train. I can’t speak to how good a financials solution Coda2go is, but this looks pretty neat!

Closer to marketing home, Cognos (now part of IBM) and QlikTech were offering Salesforce.com enabled versions of their analytics solutions. As well as enabling more sophisticated analysis, visual representation and dashboards than native Salesforce.com, these solutions will work across multiple data sources, holding out the prospect of unified marketing and sales reporting and analysis. Joining marketing data such as campaign execution, response and leads with converted opportunities and closed deals, the nirvana of true, operational marketing effectiveness reporting comes a step closer. Of course a variety of process implications still need to be considered, but at least data visibility is improved.
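As an illustration of what that unified view might look like, here is a minimal sketch, using hypothetical extracts and column names rather than either vendor’s actual data model, that joins campaign responses to closed opportunities for a single marketing-to-revenue report:

```python
import pandas as pd

# Hypothetical extracts: campaign responses from the marketing system and
# opportunities from Salesforce.com, sharing a contact identifier.
responses = pd.DataFrame({
    "contact_id": [101, 102, 103, 104, 105],
    "campaign":   ["Spring email", "Spring email", "Webinar", "Webinar", "Webinar"],
})
opportunities = pd.DataFrame({
    "contact_id": [101, 103, 105],
    "stage":      ["Closed Won", "Closed Won", "Closed Lost"],
    "amount":     [12000, 8000, 5000],
})

# Join responses to won opportunities via the contact, then roll up by campaign
# so marketing activity and resulting revenue appear in one report.
won = opportunities[opportunities["stage"] == "Closed Won"]
joined = responses.merge(won, on="contact_id", how="left")

report = (joined.groupby("campaign")
                .agg(responses=("contact_id", "size"),
                     wins=("amount", "count"),
                     revenue=("amount", "sum")))
print(report)
```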

Finally, and firmly within the marketing realm, a couple of email campaign solutions and a data collection system caught my eye. Genius.com and ExactTarget both offer solutions for creating and despatching marketing emails with all the features you would expect, including HTML templates, personalisation, tracking and reporting. Naturally, this is integrated with Salesforce.com in terms of data management and reporting, making straightforward but relatively sophisticated email marketing very easy. Clicktools allows the creation of surveys, landing pages and forms, enabling rapid generation of marketing response mechanisms, as well as questionnaires and so on. Between all of these solutions, it seems possible that best-of-breed marketing campaigns consisting of outbound email and rich landing pages with response tracking can be created relatively easily and inexpensively, without needing full-scale and costly marketing automation solutions.

So, there you have my quick round-up of highlights from CloudForce ’09, all without reference to meteorology or having my head in the clouds. Doh! Too late.

Tackling the lead tracking disconnect

Tuesday, November 13th, 2007

As we construct the requirements for our new SFA/CRM system, the issue of lead definition and tracking rears its head again (see The broken Salesforce.com leads model). Having largely succeeded in establishing the principle that responses from the same individual are always linked to that person’s record, so that the individual lies at the heart of the lead, the next discussion revolves around tracking through to opportunity. The problem here is essentially twofold:

  • How is it possible to judge which response is the one that led to the creation of an opportunity?
  • How is the influence of multiple contacts on an opportunity tracked?

In other words, how is marketing effectiveness measured, and ultimately return on marketing investment determined? Clearly it’s important that we can systematically link a lead to an opportunity within the database, which makes it easier to draw a line from response to a piece of business. What’s crucial though is maintaining the ability to link all responses to an opportunity, and undertake analysis around how responses influence opportunity creation. It could well be that a webinar is a good way of moving qualified leads through to opportunity conversion and into the pipeline, but what prompted the webinar attendance itself? It could be a whitepaper download or a software evaluation, or specific examples of these types of activity; is one topic more popular and likely to bring about a conversion?
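As a concrete starting point, here is a minimal sketch, with hypothetical tables and column names, that attaches every prior response to an opportunity via the shared contact and shares the opportunity value equally across those touches:

```python
import pandas as pd

# Hypothetical response history and opportunities, both keyed on contact_id.
responses = pd.DataFrame({
    "contact_id":    [7, 7, 7, 9],
    "response_type": ["whitepaper", "evaluation", "webinar", "webinar"],
    "response_date": pd.to_datetime(["2007-09-01", "2007-10-05",
                                     "2007-10-20", "2007-10-22"]),
})
opportunities = pd.DataFrame({
    "opportunity_id": ["OPP-1", "OPP-2"],
    "contact_id":     [7, 9],
    "created_date":   pd.to_datetime(["2007-11-01", "2007-11-02"]),
    "value":          [50000, 20000],
})

# Attach every response that preceded the opportunity via the shared contact,
# then share the opportunity value equally across those touches.
touches = opportunities.merge(responses, on="contact_id")
touches = touches[touches["response_date"] <= touches["created_date"]]
touches["credit"] = (touches["value"]
                     / touches.groupby("opportunity_id")["value"].transform("size"))

# Which activities accumulate the most credit on the path to an opportunity?
influence = touches.groupby("response_type")["credit"].sum().sort_values(ascending=False)
print(influence)
```

Equal weighting is the crudest possible model, of course, but it at least surfaces which activities keep appearing on the path to conversion.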

This is where an analytical approach comes into play. Rather than simply connecting the dots of response, lead generation and opportunity, modelling techniques should allow less obvious links to be made. It’s still crucial that responses are reliably linked to individuals, as this is the linking point (contacts are attached to opportunities, therefore creating the link with response). This is where data quality plays such a crucial role, especially in matching an incoming response or new contact creation to existing contacts, such as an event registration or where a Sales rep enters a new individual. Failing to recognise that a new response is from an existing individual, who may already have a response history, will destroy the ability to undertake this kind of analysis and produce meaningful results. (Use the search function at the top of the side bar on the right to look for various past entries on data quality.)
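A minimal sketch of that matching step, with hypothetical field names and deliberately simple rules, might look like this:

```python
import pandas as pd

# Hypothetical contact master and an incoming response that needs matching
# before a duplicate record (and a fragmented response history) is created.
contacts = pd.DataFrame({
    "contact_id": [1, 2, 3],
    "email":      ["j.smith@example.com", "a.jones@example.com", "p.brown@example.com"],
    "last_name":  ["Smith", "Jones", "Brown"],
    "company":    ["Acme", "Globex", "Initech"],
})

def match_contact(response: dict):
    """Return an existing contact_id, or None if the response looks genuinely new."""
    email = response["email"].strip().lower()
    hit = contacts[contacts["email"].str.lower() == email]
    if not hit.empty:
        return int(hit.iloc[0]["contact_id"])
    # Fallback: same surname at the same company is a likely duplicate,
    # better routed to manual review than auto-created as a new contact.
    hit = contacts[(contacts["last_name"].str.lower() == response["last_name"].lower())
                   & (contacts["company"].str.lower() == response["company"].lower())]
    return int(hit.iloc[0]["contact_id"]) if not hit.empty else None

# An event registration with slightly different formatting still resolves to contact 1.
print(match_contact({"email": " J.Smith@Example.com ",
                     "last_name": "Smith", "company": "Acme"}))
```

In practice this would sit behind fuzzier matching and manual review queues, but even this much keeps the response history attached to one individual.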

I’m hopeful that this functionality will be adopted in our new system and that we’ll be able to make the investment in the related analytics to maximise the benefit. The usual, considerable user education requirements remain, especially among our Sales colleagues, to ensure that human action doesn’t scupper these best laid plans. The biggest problem is Marketing-created opportunities being disregarded and a brand new one created by the rep when they commence working on it. The challenge here, I think, is to ensure that it’s easier to use an existing opportunity than to set up a new one. After all, the former should be less effort, which is usually a good message with which to start!

Institute of Direct Marketing Data Council Summit

Saturday, March 3rd, 2007

This week saw the first Institute of Direct Marketing Data Council Summit, a day of presentations from such luminaries as Sean Kelly, Steve Wills and Huw Davis along with practitioners from BP, the AA and others. Themed “Data Management Strategies that Create Competitive Advantage”, the conference was intended to address issues such as building competitive advantage through customer intelligence and insight, using data to improve the customer experience and demonstrating the value of data to the board.

Chair for the day Ian Lovett, of data consultancy Blue Sheep, opened with a stern warning to the direct marketing profession that the growing consumer perception of environmental damage and intrusion caused by wasteful direct mail was creating the political will to introduce ever greater restrictions on privacy and data use, such as requiring opt-in for all marketing. Better targeting and management of data quality were needed to demonstrate that direct marketing is a responsible and considerate discipline that can be trusted with personal data. “Love your data,” said Ian: “Clean it, use it and don’t abuse it!”

Other themes running through the day were the idea that marketing has failed to keep up with the technology available to it, the growing recognition of the strategic value of data and, a topic close to my heart, the creation of central insight departments in marketing organisations. Presenting a retail segmentation case study, Sean Kelly suggested that the failure to create a marketing intelligence capability based on the latest technology, prior to operational capability, is the single greatest reason for CRM failure. He likened it to having the ability to talk but without a brain to control what to say! Peter Mouncey from the Cranfield University School of Management echoed this, saying that organisations’ data strategies lag behind their CRM strategies, adding that the two must be aligned to be effective.

Rosemary Albinson from BP and Steve Willis (yep, the guy I quote on my homepage) both commented that marketers must become comfortable with the hard data of marketing results and spend time working in insight in order to progress to the boardroom. At the same time, the scale of the task should be acknowledged; explaining marketing analytics to an accomplished scientist, Rosemary Albinson was told that it seemed more complicated than his own field of climatology! Based on experience from his Customer Insight Forum, Steve Willis also outlined his thinking on the management of insight and his vision for a dedicated function led by an Insight Director, a position he says is becoming increasingly common. Christine Bailey, from Cranfield University School of Management, also commented that it helps to put a central insight team in place.

There were a few different definitions of insight on offer, from Steve Willis’ “embedded knowledge” to Christine Bailey’s multiple sources of actionable customer data. And although he couldn’t be there himself, former GE CEO Jack Welch was quoted as saying (and here I paraphrase) that competitive advantage is derived from the ability to learn faster and act faster than the competition.

The day was closed out by Huw Davis, always entertaining and worth listening to. He talked about the opportunities and challenges regarding international data strategies, particularly in the developing markets of China, India and elsewhere. Clearly the data infrastructure in those countries isn’t quite what we’re used to, but the opportunities for direct marketing are immense. Huw is also about to launch a new analytics business utilising lower cost analyst resource in Asia with a UK account team – you read it here first!

All told, quite an interesting day that reinforced some of my thoughts in this area and provided some interesting tips on data quality programmes and data warehouse projects. I’d better get back to immersing myself in marketing insight – next stop the boardroom!