Archive for the ‘Marketing operations’ Category

When to stop flogging a dead horse

Tuesday, April 13th, 2010

There’s a strong tendency when planning a data selection for a forthcoming campaign or programme to pull in as much as possible in order to maximise the reach of the activity and the corresponding response. This is nearly always self-defeating, however, not least when it means using every record meeting your selection criteria regardless of how long ago it was collected or when any kind of response was last received. Even if such data is not obviously out of date, there are good reasons to exclude it from ongoing activity.

Although this is more likely to be an issue with email activity than with relatively more expensive direct mail, it applies to both. The greater cost involved with DM creates a natural incentive to fine-tune selections ahead of launching a campaign. Even so, it’s extraordinary how poorly targeted such activity can often still be, with the obvious parameter of data age not taken into account.

The seemingly next-to-nothing cost of email, though, makes it easy to think that there is no impact to using all available data, but as we all know (albeit don’t necessarily acknowledge) this is not the case. Diligent email marketers will of course remove bounced email addresses from their lists in order to maintain a clean database and eliminate records known to be no longer active (although not always, see Email bounces and database updates). And it goes without saying that opt-outs and unsubscribes must be removed in order to maintain privacy compliance. Other than that, if you’ve got a usable record, use it, right?

Well, an obvious effect of taking this approach is actually to diminish your percentage open rates, since the opens that you do achieve will be diluted by all those disengaged recipients. Now you might be thinking that this is just damned lies and statistics, since the overall number of opens isn’t changed by the total number of recipients. If you’re monitoring these metrics, however, they will be giving you a false, and unnecessarily pessimistic, impression. It will be much harder to achieve improvements due to the dead weight of those recipients who are never going to look at what you send them.

Continuing to market to an artificially inflated list also obscures the number of people you’re actually reaching. The absolute open and click rates are crucial of course, but continuing to hope that non-responsive recipients will at some point come to life again may mask deeper issues with your database. Perhaps you should be looking for fresh subscribers or prospects via external data acquisition or increased social media activity to encourage opt-in. (Don’t just rush out and rent a list though – see the point on Data acquisition in my recent post How to take advantage of a recovery.)

How then should you go about honing your list selection when preparing a new campaign? It goes without saying that your activity should be carefully targeted at individuals meeting relevant criteria across role, industry, interest, behaviour and so on. A quick and easy way to eliminate the unresponsive element of your database, however, is to apply a filter I and others often refer to as “recency” (accepting this is a made-up word!). This is by no means rocket science, but it takes a little discipline and good data management. Put simply, those individuals in your database who have not responded or interacted in any way for a defined period of time, usually 2-3 years, should be excluded from activity going forwards. Even if their email address is still in use, they’re simply never going to respond and are just skewing your results as discussed. The minuscule possibility that they will respond in the future is just not worth the negative impact of continuing to include these recipients in your activity.

The trick here of course is the ability to determine effectively who these non-responders are. You will need the outcomes of your email and other direct activity to be fed back to your database in order to readily make a selection based on these criteria. As well as email opens and clicks, you should also take into account website log-ins if applicable, event attendance, purchases (obviously) and any other behaviour you can identify and track. Increasingly, this might include social media activity, such as Twitter or Facebook. It’s quite possible that lack of actual response to email doesn’t mean lack of interest, but you need to demonstrate this, not just make an assumption. The ability to make this part of your selection criteria clearly needs to be a “production” capability, built into your marketing operations, and not a hugely labour-intensive task for every campaign execution.

It’s worth noting also that the lack of response to marketing activity could itself be used as a trigger for some other kind of follow-up, particularly for high value contacts. If a past customer or senior-level prospect has stopped responding, a quick call using a low-cost resource (i.e. not an expensive Inside Sales rep) to check their status could be worthwhile. Maybe the contact has left and been replaced, changed roles or allowed your company to fall off their radar. You might be able to re-engage, but if not, move on.

Recency should be a field in your database, constantly recalculated from all the criteria outlined above, so that it can be readily included in any selection. Just to make the point, this is completely different from “last edit date”, which can often be set when a record in a database is merely viewed, regardless of whether a real change was made or any activity was performed by the contact. Implementing this simple addition to your campaign selection will have an instant, positive effect on your marketing metrics and save you from flogging dead horses.
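
To make the recency idea a little more concrete, here is a minimal sketch in Python of how such a field might be calculated and applied. The activity log, the field names and the three-year cut-off are all illustrative assumptions, not a prescription for any particular database or campaign tool.

    from datetime import datetime, timedelta

    # Hypothetical activity log: any tracked interaction counts towards recency
    # (email opens/clicks, website log-ins, event attendance, purchases, etc.)
    activities = [
        {"contact_id": 101, "type": "email_open",  "date": datetime(2009, 11, 2)},
        {"contact_id": 101, "type": "purchase",    "date": datetime(2008, 6, 17)},
        {"contact_id": 102, "type": "email_click", "date": datetime(2006, 3, 9)},
    ]

    RECENCY_WINDOW = timedelta(days=3 * 365)  # roughly the 2-3 year threshold discussed above

    def last_interaction_dates(activity_log):
        """Return the most recent interaction date per contact."""
        latest = {}
        for record in activity_log:
            contact = record["contact_id"]
            if contact not in latest or record["date"] > latest[contact]:
                latest[contact] = record["date"]
        return latest

    def select_active_contacts(contact_ids, activity_log, as_of=None):
        """Exclude contacts with no tracked interaction inside the recency window."""
        as_of = as_of or datetime.now()
        latest = last_interaction_dates(activity_log)
        return [c for c in contact_ids
                if c in latest and as_of - latest[c] <= RECENCY_WINDOW]

    print(select_active_contacts([101, 102, 103], activities, as_of=datetime(2010, 4, 13)))
    # [101] - contact 102 last interacted in 2006, and 103 has no recorded activity at all

In a production set-up the same calculation would of course run inside the database or campaign tool rather than in a one-off script, but the logic is no more complicated than this.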

External marketing service provider or internal database?

Friday, March 12th, 2010

A recent discussion in the pages of Database Marketing magazine regarding the merits of in-house versus out-sourced data management was reassuringly familiar. I’ve been involved in many debates as to the best approach over the years, with no definitive answer being reached. It depends, of course, on the circumstances of the organisation and its marketing requirements.

It may be possible, with a database that doesn’t involve too many feeds or updates, to hold it externally and undertake batch cleansing. Increasingly though, the proliferation of data sources, update frequency and links to other systems means such a stand-alone approach isn’t feasible. In addition, data cleansing can’t be viewed as simply an occasional process, but as one of continuous improvement.

This doesn’t necessarily dictate an in-house or external solution, but whatever the solution is, it must be able to integrate with other corporate systems and data sources. Enquiries captured on the website need to be stored in the marketing database for inclusion in ongoing nurturing activity, campaign outcomes fed back for tracking and measurement, and qualified leads passed to Sales, with the eventual results recorded for ROI analysis.

Marketing systems and processes must increasingly be integrated with the wider enterprise and both MSPs and solution vendors must ensure this is what they are delivering.

How to take advantage of a recovery

Monday, February 8th, 2010

So, 2010 is well underway, and hopefully the difficulties of 2008/9 are slipping behind us. As business begins to pick up again, and budgets and activity levels are restored, everyone looks forward to getting back to business as usual.

Or perhaps not – perhaps there’s a better way.

Many Marketing departments shed staff last year, and although I’m not advocating a jobless recovery, there may be smarter ways of handling the activities some of those people used to undertake, rather than just throwing bodies at the routine challenges encountered by marketing. Here are a few things to think about doing differently this year.

  • Data acquisition – When obtaining targeted contacts for marketing activity, take a long-term approach. Renting a list for a tactical campaign that’s coming up will not be successful; ongoing activity is the key. Spend time researching the right data source (see Business data and Sales prospecting tools on our Resources page) and if bespoke contact discovery is necessary, leave enough time. This also makes investing the necessary time and effort in properly handling the data more worthwhile: load the list into your database/campaign system, flag the source, track outgoing activity and record response (see point below). This allows the effectiveness of the acquired data to be measured much more readily.
  • Proactive data quality management – Avoid “a quick check of the data” being the last thing that happens before campaign execution. Data quality is an ongoing task and leaving it to the last moment will mean it’s always a panic activity that never gets done properly. Ideally, you should implement a true data quality programme and a suitable solution to monitor and maintain data (see previous posts Data quality – a vision and 6 data quality solution requirements). At the minimum though, use one of the many (not necessarily expensive) tools to identify issues on a routine basis and fix them as you go along. (See Data quality tools and consultancies on the Resources page.)
  • Joined up response management – Campaign execution, whether direct mail or email, is often carried out by external vendors, which is understandable. They can pull landing pages and micro-sites together quickly and easily, where perhaps building such facilities into your main website is onerous and time consuming. However, campaign reporting should take place within your existing processes so that it’s a business-as-usual activity, not an exceptional process that only a couple of people understand. If you are hosting your own landing pages, the same principle applies of course. Hopefully capturing such responses directly to your marketing database is relatively straightforward (many systems have web-to-lead functions; see the sketch after this list), but if it has to be manual, so be it. This investment in time will pay off when it comes to reporting and tracking.
  • Skills – Consider the expertise that is really required as activity levels rise and how best to obtain it. Rather than re-employ generalists, identify two or three step change projects and employ temporary specialists or agencies to get those changes achieved using what was, previously, salary budget. Once these programmes have been completed, review the skills you need before determining the types of roles required and taking on new permanent staff. Use this as an opportunity also to do some testing before deciding where to focus new spend. Again, this isn’t to discourage creating jobs for unemployed marketers, but experimenting and testing actually creates gainful activity that will bring the recovery forward, without requiring companies to commit too soon whilst it remains tentative.
  • Sales and Marketing database integration – Strive to ensure that your marketing system and the system your Sales team are using are linked together as closely as possible. Leads, once qualified, should appear directly in your SFA (sales force automation) system, not as spreadsheets or emails sent to Reps. Even better, share contact data between the two systems so that changes in either are immediately available to everyone. This should hopefully also help with tracking leads once they have been supplied to Sales, and eventually measuring the outcome of marketing activity.
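
To make the “joined up response management” point above more tangible, here is a minimal sketch, assuming a hypothetical landing-page handler that writes each response straight into the marketing database with its source flagged. The table and field names are invented purely for illustration; many campaign systems provide an equivalent web-to-lead capability out of the box.

    import sqlite3
    from datetime import datetime

    # Hypothetical marketing database: responses are stored with their source flagged,
    # so acquired-list effectiveness and campaign reporting become routine queries.
    db = sqlite3.connect(":memory:")
    db.execute("""CREATE TABLE responses (
        email TEXT, campaign_id TEXT, source TEXT, responded_at TEXT)""")

    def record_response(email, campaign_id, source):
        """Capture a landing-page submission directly into the marketing database."""
        db.execute("INSERT INTO responses VALUES (?, ?, ?, ?)",
                   (email, campaign_id, source, datetime.now().isoformat()))
        db.commit()

    # e.g. a form post from a campaign landing page
    record_response("jane@example.com", "Q2-2010-webinar", source="rented_list_A")

    # Reporting by data source then falls out of a simple query
    for source, count in db.execute(
            "SELECT source, COUNT(*) FROM responses GROUP BY source"):
        print(source, count)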

As the recovery takes hold, let’s hope that marketing departments start hiring again, and put all that talent to work on creating effective campaigns.

With thanks to Kate Mayfield of Data & Mash for contributing to this post.

Data quality is for life not just for Christmas

Thursday, December 10th, 2009

As Christmas rushes towards us, we’re once again reminded that those considering pets as gifts must keep in mind the ongoing responsibility they represent: “A dog is for life, not just for Christmas”. In considering this recently, I was struck that the adage could similarly be applied to data quality (without meaning to trivialise the original message). Data quality is not a one off exercise, a gift to users or marketing campaigns, but an ongoing commitment that requires management buy-in and appropriate resourcing.

It’s well known that data decays rapidly, particularly in B2B which must contend with individuals being promoted, changing jobs, moving companies and so on, together with mergers, acquisitions, wind-ups and more. I often refer to this as the “Data Half Life”, the period of time it takes for half of a database or list to become out-of-date, which can be two years or fewer. It’s this fact that makes data quality maintenance an ongoing task and not simply something that can be done once, ahead of a big campaign or new system implementation.
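
Just to illustrate the arithmetic behind the half-life idea: if half of a list decays in two years, the fraction still accurate after t years is 0.5 raised to the power t/2. The short sketch below simply does that sum, with the two-year figure taken purely as an illustration.

    # Illustrative "data half life" arithmetic: fraction of a list still accurate
    # after a given number of years, assuming a two-year half life for illustration.
    HALF_LIFE_YEARS = 2.0

    def fraction_still_accurate(years):
        return 0.5 ** (years / HALF_LIFE_YEARS)

    for years in (1, 2, 3, 5):
        print(f"After {years} year(s): {fraction_still_accurate(years):.0%} still accurate")
    # After 1 year: 71%, 2 years: 50%, 3 years: 35%, 5 years: 18%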

Yet time and again, I’m asked how best to “clean up” a database in exactly such a situation, or I hear of efforts to do so. I’m not saying such an undertaking shouldn’t be made – it’s certainly better to do so than not – but the effort and expense is substantially wasted if it’s conducted on an ad hoc or piecemeal basis. Data immediately starts to decay again, as contacts move, addresses change, new records are added and inevitable duplicates created, standardisation rules disregarded, fields not properly completed and other issues creep in. Very soon the data is in the same state as it was before “the big clean” took place.

It’s tempting then to suggest undertaking a batch cleanse on a regular basis, recognising these problems and trying to stay on top of them. Depending on the nature of your database, this could well be a viable approach, and might be quite cost effective, particularly if you contract a bureau or data management supplier on an annual basis, say. Unless your database is a relatively static campaign management system that can be taken offline whilst such an operation is undertaken – which could be several days – this approach presents its own issues. Considerations here include what to do with data that changes in the system whilst it is away being cleansed, how to extract and reload updates, and how to handle the merging of any identified duplicates.

Far better though is an approach to data quality management that builds quality into the heart of an organisation’s processes and operations. Something along the lines of what I outlined here some time ago, which incorporates Gartner’s “data quality firewall” concept. (This suggests that just as a network firewall should protect against unwanted intrusion, a data quality firewall should prevent bad data from reaching an organisation’s systems.) Ideally, one of the growing number of data quality software platforms should be deployed in order to create a framework for this environment (recognising that neither the issue nor the solution is solely one of technology). Competition in this area continues to erode the cost of such solutions, and indeed open-source vendor Talend even offer a version of their Talend Open Studio product as a free download.
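
As a sketch of what a “firewall” check might look like in practice – my own simplified illustration, not Gartner’s definition or any vendor’s implementation – imagine incoming records being validated before they are allowed anywhere near the database:

    import re

    # Illustrative "data quality firewall": validate incoming records before they
    # reach the marketing database. The rules and field names are hypothetical.
    EMAIL_PATTERN = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")
    REQUIRED_FIELDS = ("first_name", "last_name", "email", "company")

    def firewall_check(record):
        """Return a list of reasons to quarantine the record; empty means it may pass."""
        problems = [f"missing {field}" for field in REQUIRED_FIELDS if not record.get(field)]
        email = record.get("email", "")
        if email and not EMAIL_PATTERN.match(email):
            problems.append("malformed email address")
        return problems

    incoming = [
        {"first_name": "Ann", "last_name": "Lee", "email": "ann@example.com", "company": "Acme"},
        {"first_name": "Bob", "last_name": "", "email": "not-an-email", "company": "Widgets Ltd"},
    ]

    passed = [r for r in incoming if not firewall_check(r)]
    quarantined = [(r, firewall_check(r)) for r in incoming if firewall_check(r)]
    print(len(passed), "record(s) passed;", len(quarantined), "quarantined")

The real value comes from making checks like these part of every feed and entry point, rather than a one-off script.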

Adopting this way of managing data quality is a longer term play that may lack the one-off satisfaction of a quick clean-up and require maintenance, nurturing and care long after the initial “gift” of project approval is received. But just like a dog, this is a gift that will keep on giving in terms of operational effectiveness and business agility, making rapid and flexible campaign execution a reality and not a chore.

What IT needs to do for Marketing

Monday, October 5th, 2009

It’s well known that Sales and Marketing are the cats and dogs of many companies (or dogs and cats, I’m not trying to start a debate about which is which in this post!), constantly fighting with each other and falling out. But what about Marketing and IT? Technology is crucial to most marketers and we turn to our IT colleagues for solutions to help us manage customer lifecycle, campaign execution and many other aspects of marketing activities. Alongside systems deemed business critical in finance and operations though, Marketing is often de-prioritised and left to fend for itself.

IT’s response to requests from Marketing for additional resource often revolves around the need to focus on “core functions”, but what are these functions? Clearly IT has many demands placed on it from across any business. Systems relating to financial management and service delivery will always occupy a high-profile position, ahead of those merely generating and tracking demand for a company’s products and services. The tendency among IT organisations is to want to retain ownership of as much as possible, define everything as a project and then submit every initiative to a review board for approval.

Marketing’s requirements are often much simpler than this, and the rising prevalence of hosted and software-as-a-service solutions means these needs can be met in a much lighter-touch way. IT’s role then becomes that of creating an environment where these solutions can be rapidly selected and deployed, undertaking integration (often only a configuration task) where necessary. Core IT skills such as requirements definition, vendor assessment and selection, and project management are still invaluable, but IT is relieved of the heavy lifting of creating the environment for a new system and handling the fine detail of implementation.

Clearly the arguments in favour of outsourcing are well rehearsed, but Sales and Marketing represent a particularly good fit for this approach. IT’s “core function” can then become enablement, and the growing contingent of highly capable, technically literate marketing operations professionals can take it from there. There’s no reason that Marketing and IT can’t play nicely; now how to achieve the same result with Sales…

Eleven Steps to kick off your CRM system project

Monday, July 6th, 2009

We’ve run many marketing automation projects over the years, both large and small. Here’s a simplified version of the methodology we use, and some hints around getting your project up and running!

1. Project Feasibility – an informal review to scope the potential project and set some expectations. This process might be no more than a short internal meeting, but at this stage you’ll not only be able to roughly size the project, you’ll also have a good handle on the costs you’re currently incurring. Look at the organisation’s current levels of marketing activity, not only in departments carrying out marketing, but also in Sales and other functions. Try to come up with some metrics – internal spend, number of activities, overall number of touches – and the programme objectives: customer acquisition, retention, up-sell/cross-sell. Don’t forget softer marketing activities such as newsletters sent by product or customer service groups.

You also need to get a rough idea of the data available; again, don’t forget to look beyond the main marketing teams to the rest of the organisation. This usually means Sales, Finance (if there is no data warehouse), Customer Services and product management teams.

Add into the mix your organisation’s future needs, growth strategies, new products, desired improvement in customer experiences, structural acquisitions, as well as any predictable internal factors around people or structural changes.

Activity + Costs + Data Resource + Business Objectives are the inputs you’ll need to outline the project scope.

2. Initiate Project – you might have an internal project initiation process or it might be a more informal set of actions. But any successful project will need most of these components in place:

  • Business Buy-in – Your project is going to need or catalyse change in your business. Now is the time to get your directors or SVPs on board. And don’t forget to keep up a dialogue with the guys in IT!
  • Project Champion (Board) – Someone with a stake in the project’s success and with enough political weight to fight your corner for resource and support
  • Project Manager – A good PM combines a detailed technical understanding with the oleaginous charm of a diplomat and the motivational skills of Madame Whiplash! They can be either from marketing or from IT or both! At times it’s going to be a full time job, so make sure they have the bandwidth.
  • Success Definition – Develop meaningful indicators of success; these might include reduction in costs, improvement in productivity or trends in conversion costs. Keep them simple (at least what you share with the business) and realistic.

3. 1st Stage Requirements Definition and Data Audit Documentation

  • A high quality piece of work at this stage is vital to the success of the project; investment in time here will be repaid many times over by a successful implementation. Once you start writing the cheques, it’s too late to find out that what is being delivered doesn’t meet your needs.
  • Clearly prioritise all key features: essential/desirable/optional. On any requirements document the nice-to-haves tend to take up the same amount of space as the need-to-haves.
  • Think about phasing; it’s likely any substantial project will be delivered (and paid for) in a number of stages. Prioritise key deliverables, but you also need to work out the optimum structure to meet operational constraints.
  • Identify any internal process changes needed; this is another area that is easy to overlook or underestimate. Do these need to be vendor deliverables or can the business handle them itself?
  • The data audit doesn’t need to be exhaustive at this stage, but you need to have a very good handle on the inputs the system will need, file layouts where applicable, approximate record quantities, and source system dependencies. In any complex organisation it’s easy to underestimate the number of data sources needed for build and production. On one recent project the estimate was 18. The real number once an exhaustive process was complete? 61!

4. RFI/RFP to vendors (and internal Technology Group) – You may or may not have an internal IT resource who feel they can deliver a Marketing automation/CM project. One way to cut through the politics of this is to ask them to respond like the other vendors – make sure they price internal IT resources realistically.

5. Response evaluation and contract negotiation

  • Allow plenty of time for this stage; there’s nothing like seeing the figures on the table to focus the mind, and the vendor will be looking to safeguard their position. A successful negotiation will allow both parties to apportion the risk.
  • Usually there will be a significant up-front cost for development. A guaranteed contract term will allow the vendor to amortise the development costs over the period of the contract.

6. Project Plan and Timeline setting – Make this realistic but not too long. You need to be able to keep the momentum going, but it’s not great to forever be announcing delays. Try to structure the project to allow early wins; for example, you may not need every single data feed to start gaining value from a single customer view.

7. Detailed Requirements and Data discovery

  • This should be a straightforward process if you’ve got a good requirements document, but the vendor should respond to your functional prioritisation, allowing you to make informed choices before agreeing the statement of work.
  • Allow plenty of engagement time for Data discovery. You’ve probably lived with this data for a long period of time, but any external consultant or specialist is starting from scratch. You’ll also have to make knowledgeable internal data specialists available to the vendor; if you’ve got complete documentation on all internal systems and feeds, congratulations – that’s a first!

8. Development – Ensure configuration and customisation adhere to the agreed requirements and specification, without suffering from scope-creep (constant additions to the original functionality). Any such development should be minimised and every process or function scrutinised to gauge its real priority and whether “out of the box” functionality will suffice. Conduct regular review sessions with key stakeholders to demonstrate functionality and ensure it is on track.

9. Implementation and migration – Develop data migration and cut-over alongside functional development. Ensuring the right data is available in the new system from day one is critical, and users will be unforgiving if it is not. Many CRM implementations fail due to data issues, including data quality. Will you migrate all data from legacy systems, or apply rules and filters? What is the data model of the new system compared with the previous ones, and will there need to be a mapping process?
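
By way of illustration only, the mapping step from a legacy schema onto the new system’s data model can be expressed very simply; the field names here are hypothetical, and a real migration will also need transformation and de-duplication rules on top.

    # Hypothetical field mapping from a legacy schema to the new CRM's data model.
    FIELD_MAP = {
        "cust_name":  "account_name",
        "cust_email": "email",
        "tel_no":     "phone",
        "src_code":   "lead_source",
    }

    def migrate_record(legacy_record):
        """Map a legacy record onto the new schema, dropping unmapped fields."""
        return {new: legacy_record[old]
                for old, new in FIELD_MAP.items() if old in legacy_record}

    legacy = {"cust_name": "Acme Ltd", "cust_email": "info@acme.example", "legacy_flag": "Y"}
    print(migrate_record(legacy))
    # {'account_name': 'Acme Ltd', 'email': 'info@acme.example'}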

10. Training and ‘Go-Live’ – Don’t overlook training; plan it well in advance of go-live. Avoid the temptation to just let users loose on a new system to learn it for themselves; instead, develop a proper training programme, with hands-on usage (even if it’s a late beta version) and plenty of exercises and review sessions. Aim to have training deliverables available (documentation, process guides or screen tutorials). Run post go-live sessions to recap key functions and answer any questions on general functionality arising as users start using the system.

11. Evaluation and On-going development – Conduct reviews to ensure the system is delivering the required functionality. Survey users for their opinion on usability, how much they’re using the system and any key missing functions. Does it make their job easier? Put aside resources to make enhancements post go-live – don’t expect the job to be complete at this stage.

Salesforce.com, analytics, email marketing and financials – it’s all in the cloud

Monday, April 13th, 2009

The Salesforce.com customer conference in the UK this year took the form of CloudForce, a complimentary day of sessions and vendor showcases, held at London’s ExCel exhibition centre last week. It’s no revelation that Salesforce.com have long since moved on from simply being a salesforce automation developer. Today, they position themselves as “Force.com”, promoting the benefits of cloud computing – multi-tenanted, internet-based computing platforms – that obviate the need to install software. Indeed, the “no software” message, and attendant logo of the word “software” with a line through it, was repeated at every opportunity. An amusing moment came when Paul Cheesbrough, CIO of The Telegraph Media Group, made reference to “your software” when joining Salesforce.com CEO Marc Benioff on stage during the main session. “Your platform I mean,” Cheesbrough quickly corrected himself, “there is no software.”
“Thank you,” replied Benioff.
“I saw it in your eyes!” quipped Cheesbrough.

The AppExchange platform that forms part of Salesforce.com offers a plethora of opportunities to expand the functionality of the base product. However, the ready integration capabilities of Salesforce.com and the Force.com application platform enable new possibilities, some of which I thought noteworthy. Force.com is particularly interesting, as it opens up the platform beyond Sales and Customer Service management to one that allows developers to create their own applications running on the Salesforce.com cloud infrastructure. To developers, Force.com represents the opportunity to deliver solutions based around the software-as-a-service ethos, without having to build the delivery infrastructure themselves. Adopters of these solutions, for whom not having to install software and maintain their own IT infrastructure is appealing, gain access to applications meeting their requirements that might not otherwise have made it to this delivery mechanism.

One such example is a complete accounting application from financial software developers Coda, called Coda2go. Based around their on-premise solution, Coda2go runs entirely on the Force.com platform and integrates closely with Salesforce.com itself. I wrote recently about the considerations of integrating sales order processing within the sales and marketing “data ecosystem”, where I made reference to the point at which an Opportunity is closed and an order booked. With Coda2go, this process, together with the resulting invoicing, is practically a one-click undertaking. Once the Opportunity is ready to be booked as a sales order – something that would typically involve manually switching to a different system – all of the order details are picked up from Salesforce.com, transferred to Coda2go, invoices created and the rest of the accounting process put in train. I can’t speak to how good a financials solution Coda2go is, but this looks pretty neat!

Closer to marketing home, Cognos (now part of IBM) and QlikTech were offering Salesforce.com-enabled versions of their analytics solutions. As well as enabling more sophisticated analysis, visual representation and dashboards than native Salesforce.com, these solutions will work across multiple data sources, holding out the prospect of unified marketing and sales reporting and analysis. By joining marketing data such as campaign execution, response and leads with converted opportunities and closed deals, the nirvana of true, operational marketing effectiveness reporting comes a step closer. Of course a variety of process implications still need to be considered, but at least data visibility is improved.
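
As a toy illustration of that join – nothing to do with how Cognos or QlikTech actually implement it – linking campaign responses to closed deals by contact could look as simple as this, using entirely invented data:

    # Toy illustration of joining campaign responses to closed deals by contact,
    # the kind of unified view these analytics tools hold out. All data is invented.
    responses = [
        {"contact": "ann@example.com", "campaign": "spring-09-email"},
        {"contact": "bob@example.com", "campaign": "spring-09-email"},
    ]
    closed_deals = [
        {"contact": "ann@example.com", "opportunity": "OPP-123", "value": 25000},
    ]

    deals_by_contact = {deal["contact"]: deal for deal in closed_deals}
    for response in responses:
        deal = deals_by_contact.get(response["contact"])
        outcome = (f"closed {deal['opportunity']} for {deal['value']}"
                   if deal else "no deal yet")
        print(response["campaign"], response["contact"], "->", outcome)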

Finally, and firmly within the marketing realm, a couple of email campaign solutions and a data collection system caught my eye. Genius.com and ExactTarget both offer solutions for creating and despatching marketing emails with all the features you would expect, including HTML templates, personalisation, tracking and reporting. Naturally, this is integrated with Salesforce.com in terms of data management and reporting, making straightforward but relatively sophisticated email marketing very easy. Clicktools allows the creation of surveys, landing pages and forms, enabling rapid generation of marketing response mechanisms, as well as questionnaires and so on. Between all of these solutions, it seems possible that best-of-breed marketing campaigns consisting of outbound email and rich landing pages with response tracking can be created relatively easily and inexpensively, without needing full-scale and costly marketing automation solutions.

So, there you have my quick round-up of highlights from CloudForce ’09, all without reference to meteorology or having my head in the clouds. Doh! Too late.

Can we learn permission marketing from Generation Y?

Monday, March 30th, 2009

This week saw the annual IDM Lunch taking place once again, an opportunity for members to meet, catch up and discuss current issues over lunch, followed by a keynote address. The calibre of the speakers is always high and this year was no exception, with “worldwide business and technology strategist and best-selling author” (according to the IDM) Don Tapscott occupying the slot this time.

Tapscott’s presentation set out to highlight some of the reasons to embrace rather than disdain “Generation Y”, to whom the Internet is second nature. Rupert Murdoch described them as “digital natives”, in contrast to those of a slightly older disposition, for whom the Internet arrived at a later stage in life, making them “digital immigrants”. This generation are “bathed in bits” and have a completely different approach to media consumption and social interaction. This of course is characterised by Facebook, Twitter and MySpace, but also, critics assert, by a lack of attention span, insularity and general dumbing down.

Tapscott rejects this description though, and given that he is a Professor of Management at the Joseph L. Rotman School of Management, University of Toronto, among other roles, and has recently completed a $4m research programme in this area, I guess he should know what he’s talking about. The general thrust of Tapscott’s counter-argument was that far from leading to the atrophy of the skills needed in modern business, online technologies foster them. The collaboration, team work and leadership engendered and developed online create individuals far more likely to be effective knowledge workers in the future.

Tapscott also highlighted Gen Y’s attitude to email, and a memorable way of characterising it. Email is regarded as a more formal means of communication than instant messaging or social media sites; in other words, something for the oldies to use! Though not a new observation, Tapscott’s research turned up the following gem: when today’s teenagers were asked when they would use email, the response was “when writing a thank you letter to my friend’s parents for having me to dinner.” The art of letter writing may well be on borrowed time…

You can enjoy the rest of Tapscott’s observations in more detail by reading his latest book, Grown Up Digital: How the Net Generation is Changing Your World, so I won’t dwell further here. The other element of his presentation which interested me though was actually his opening gambit. Demonstrating that great minds think alike (I had suggested this very question to my neighbour at the table only moments earlier), he asked for a show of hands as to how many Twitter users were in the audience. Of perhaps the couple of hundred delegates, about a third professed to using Twitter, which was a little higher than I might have expected from a relatively senior audience. (Although given the IDM’s strap-line “Digital, Direct, Data”, perhaps this was just the digital contingent.)

Now, I confess I’m not on Twitter, though it’s on my list of things to do. This result, however, somewhat supports my assertion that few of the people I’d like to speak to use the service, making my presence a little futile. That said, I’m not closed off to it, and I was only recently enthusiastically assured of its great utility by an industry colleague (you know who you are!). In view of the upcoming generation ensconced in this technology though, marketers surely need to take these channels seriously, and start learning how to use them to the mutual benefit of organisations and those they wish to influence. This is a similar situation to the early web, when companies built websites with a limited understanding of what they hoped to achieve. That approach has the danger of being self-fulfilling, but the web didn’t turn out too badly!

What interests me, to bring this back on topic, are the operational implications of these technologies. How can they be effectively integrated into marketing processes, measured and justified? Or is this counter to the ethos of Web 2.0, where such mercenary and quantitative thinking is counter-culture? It would seem a shame if so, as Twitter’s “follow my Tweets” approach strikes me as the ultimate in permission marketing. Where’s Seth Godin when you need him? (Well, try here, here or here!)

Killer slogans vs. operational excellence

Wednesday, March 4th, 2009

Another snippet in support of getting the basics right, this time from well respected Cass Business School Honorary Professor of Marketing Metrics, Robert Shaw. In a letter to the London Financial Times recently, he criticised over-reliance on branding at the expense of executional considerations. In response to an article regarding strikingly similar new slogans from Pepsi and Coke, he said “it is the operationally excellent marketers that have a big competitive advantage over their wasteful, slogan-obsessed rivals.”

I mean no disrespect to my branding colleagues when I say I’m not going to argue with that!

Better marketing operations management

Friday, January 9th, 2009

I recently stumbled across quite a nice piece entitled Six steps to better marketing operations management, which talks about the importance of efficient marketing processes and executional excellence, themes regular readers here will recognise. Marketing operations is considered in its widest definition, encompassing planning, budgeting and resource management, which is actually quite refreshing when you’re used to thinking in the narrower context of data and process. The article is actually quite old, having been posted in June 2005, but is no less relevant today, and as such I thought it worth bringing to attention here.