Archive for the ‘Activity’ Category

Inbound is the new outbound in marketing engagement

Monday, January 17th, 2011

This post is a slightly longer version of a piece I recently contributed to Database Marketing magazine’s annual What lies ahead feature, published in their January edition.

Since its inception as a discipline, Marketing has essentially consisted of sending out messages to the marketplace, promoting products and services. This process has certainly become increasingly sophisticated, with direct and data-driven marketing holding out the promise of better targeting and more personalisation. Digital marketing has driven this development forwards, with greater segmentation, customisation and the ability to more reliably track and measure marketing outcomes.

However, the phenomena of search marketing, social media and other forms of user-generated content are leading to a reversal of this process. Consumers (in both the B2C and B2B sense) are increasingly rejecting this marketing-out scenario, leading to the falling effectiveness of email and other direct marketing communications. Instead, they are seeking out companies and their products, based on search results, referrals and online “buzz”. This is nothing new, with search optimisation and social media presence having represented key marketing priorities for some time now.

The emphasis, though, is moving to the question of how to engage those responding to marketing stimuli (whether “pushed” or more organic) in order to initiate the relationship that is crucial to successful marketing. This is where inbound marketing steps in, and where I believe significant development is taking place in focus and sophistication. Marketing is used to sending out communications with content customised for specific groups of recipients, but increasingly the same experience is now expected by website visitors.

Based on previous visits, IP look-up and online profiles, website visitors can experience content relevant to and optimised for them, increasing the effectiveness of a brand’s web presence. The introduction by Facebook earlier this year of Instant Personalisation is an example of this trend, and according to Mark Zuckerberg we are moving from a web that doesn’t know who we are to one where the web knows exactly who we are.

And it’s data that drives the ability to provide and track this experience between visits and web locations. As database marketing and web analytics necessarily converge and evolve, the next generation of direct marketing will get underway.

Database and digital marketing: two sides of the same coin?

Friday, July 9th, 2010

Earlier in the year I commented on a series of posts on the excellent Customer Experience Matrix blog maintained by marketing technology analyst and consultant David Raab, comparing and contrasting database and digital marketing. I’ve been meaning to highlight the posts here for a while as I think they’re thought provoking and important to the future of these two disciplines, and I’ve finally got around to it. If I’m being honest, I’m not sure (as I make clear in my comment) what the answer is here, and there’s a good deal of evolution yet to come. Still, David’s posts make interesting reading which I wanted to share.

In Unica and Alterian Lead Database Marketers to the Digital Promised Land, David reports on a recent Unica acquisition and a survey released by Alterian, making the observation that longer-standing database marketing vendors “have failed to adapt to the new world of digital marketing”. Although David contends that this doesn’t apply to Unica and Alterian, he cites a statistic from Alterian’s study that 61% of marketers do not integrate Web analytics with other customer data, to support his overall position. “Marketers are eagerly moving from classic direct marketing to digital…but still lack the skills and resources to do it effectively,” he says.

David’s follow-up post, Can Database Marketers Learn Digital Tricks? goes into more detail. He comments on the relative measurability of the two disciplines, particularly concerning “addressable individuals” and the greater comfort that database marketing has in working with existing, well-known customers rather than prospects about which less information is available. Finally, in Clarifying the Differences Between Database and Digital Marketing, David outlines a detailed comparison of key aspects of both database and digital marketing, highlighting common characteristics and key differences.

“There’s no reason the same organization or individual can’t master both database and digital marketing,” David concludes, but “it will take conscious effort to address the differences and fill the gaps that they imply.” A crucial topic that necessitates further discussion…

External marketing service provider or internal database?

Friday, March 12th, 2010

A recent discussion in the pages of Database Marketing magazine regarding the merits of in-house versus out-sourced data management was reassuringly familiar. I’ve been involved in many debates as to the best approach over the years, with no definitive answer being reached. It depends, of course, on the circumstances of the organisation and its marketing requirements.

It may be possible, with a database that doesn’t involve too many feeds or updates, to hold it externally and undertake batch cleansing. Increasingly though, the proliferation of data sources, update frequency and links to other systems means such a stand-alone approach isn’t feasible. In addition, data cleansing can’t be viewed as simply an occasional process, but as one of continuous improvement.

This doesn’t necessarily dictate an in-house or external solution, but whatever the solution is, it must be able to integrate with other corporate systems and data sources. Enquiries captured on the website need to be stored in the marketing database for inclusion in ongoing nurturing activity, campaign outcomes fed back for tracking and measurement and qualified leads passed to Sales, with the eventual results recorded for ROI analysis.

Marketing systems and processes must increasingly be integrated with the wider enterprise and both MSPs and solution vendors must ensure this is what they are delivering.

Following the clues to a successful sale

Friday, January 15th, 2010

One of the common traps into which Marketing often falls is to treat a lead as a one-off event in isolation from anything that has gone before or after. This results in every response from a given individual being treated as a new lead, rather than as a package of interest in a company’s products and services.

But leads are like clues in a detective story. In a criminal investigation, Police are often said to be “following multiple lines of enquiry” – in other words, following-up on leads. These leads, or clues, whilst separate from each other, may all point to the same end result. In our case, this is a successful sale, but by themselves each clue may not be enough to solve the mystery. Whitepaper downloads and webinar attendances may not mean much by themselves, but put together they point to an interest in a specific solution or a particularly pressing need.
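The detective analogy suggests a concrete mechanism: rather than raising a fresh lead for every event, aggregate each contact’s activities into a single score and only hand over to Sales once the combined evidence is strong enough. A minimal sketch in Python; the activity weights and qualification threshold here are purely illustrative, not taken from any particular system:

```python
from collections import defaultdict

# Hypothetical weights for common lead activities; real values would be
# tuned against your own conversion data.
ACTIVITY_SCORES = {
    "whitepaper_download": 10,
    "webinar_attendance": 15,
    "pricing_page_visit": 25,
    "email_click": 5,
}

QUALIFICATION_THRESHOLD = 40  # assumed hand-off point to Sales

def qualify_leads(activities):
    """Aggregate individual 'clues' per contact into a single score."""
    scores = defaultdict(int)
    for contact, activity in activities:
        scores[contact] += ACTIVITY_SCORES.get(activity, 0)
    # Only contacts whose combined evidence crosses the threshold go to Sales
    return {c: s for c, s in scores.items() if s >= QUALIFICATION_THRESHOLD}

activities = [
    ("alice@example.com", "whitepaper_download"),
    ("alice@example.com", "webinar_attendance"),
    ("alice@example.com", "pricing_page_visit"),
    ("bob@example.com", "email_click"),
]
print(qualify_leads(activities))  # alice qualifies (50); bob (5) does not
```

The point is that a single download scores below the threshold, but the same contact’s downloads, webinars and page visits together make a qualified lead.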

This is where good lead management becomes crucial, and where so many software solutions fall short. Many systems treat leads as separate, unrelated events and make no effort to tie them all together and present the evidence as a whole. It’s little wonder then that Sales are driven to distraction with a stream of seemingly trivial clues, whilst not being able to see the big picture. At the same time, vital evidence is overlooked – leads go to waste without being followed-up.

We owe it to ourselves to recognise the shortcomings of the tools we have available and address these problems. Otherwise, marketing investment will continue to go to waste and fail to deliver the results expected.

Data quality is for life not just for Christmas

Thursday, December 10th, 2009

As Christmas rushes towards us, we’re once again reminded that those considering pets as gifts must keep in mind the ongoing responsibility they represent: “A dog is for life, not just for Christmas”. In considering this recently, I was struck that the adage could similarly be applied to data quality (without meaning to trivialise the original message). Data quality is not a one-off exercise, a gift to users or marketing campaigns, but an ongoing commitment that requires management buy-in and appropriate resourcing.

It’s well known that data decays rapidly, particularly in B2B which must contend with individuals being promoted, changing jobs, moving companies and so on, together with mergers, acquisitions, wind-ups and more. I often refer to this as the “Data Half Life”, the period of time it takes for half of a database or list to become out-of-date, which can be two years or fewer. It’s this fact that makes data quality maintenance an ongoing task and not simply something that can be done once, ahead of a big campaign or new system implementation.
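The “Data Half Life” can be modelled as simple exponential decay: if half the records go stale every H years, the fraction still current after t years is 0.5^(t/H). A quick sketch using the two-year half-life figure mentioned above:

```python
def fraction_current(t_years, half_life_years=2.0):
    """Fraction of records still accurate after t_years, assuming
    exponential decay with the given half-life."""
    return 0.5 ** (t_years / half_life_years)

# With a two-year half-life, a one-off clean loses its value quickly
for years in (0.5, 1, 2, 3):
    print(f"after {years} years: {fraction_current(years):.0%} still current")
```

Run it and the argument makes itself: well before the next “big clean” would normally be scheduled, a substantial share of the database is already out of date.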

Yet time and again, I’m asked how best to “clean-up” a database in exactly such a situation, or I hear of efforts to do so. I’m not saying such an undertaking shouldn’t be made (it’s certainly better to do so than not), but the effort and expense is substantially wasted if it’s conducted on an ad hoc or piecemeal basis. Data immediately starts to decay, as contacts move, addresses change, new records are added and inevitable duplicates created, standardisation rules disregarded, fields not properly completed and other issues creep in. Very soon the data is in the same state as it was before “the big clean” took place.

It’s tempting to suggest undertaking a batch cleanse on a regular basis then, recognising these problems and trying to stay on top of them. Depending on the nature of your database, this could well be a viable approach, and might be quite cost effective, particularly if you contract a bureau or data management supplier on an annual basis, say. Unless your database is a relatively static campaign management system that can be taken offline whilst such an operation is undertaken – which could take several days – this approach presents its own issues. Considerations here include what to do with data that changes in the system whilst it’s away being cleansed, and how to extract and reload updates whilst handling the merging of any identified duplicates.

Far better though is an approach to data quality management that builds quality into the heart of an organisation’s processes and operations. Something along the lines that I outlined here some time ago and which incorporates Gartner’s “data quality firewall” concept. (This suggests that just as a network firewall should protect against unwanted intrusion, a data quality firewall should prevent bad data from reaching an organisation’s systems.) Ideally, one of the growing number of data quality software platforms should be deployed in order to create a framework for this environment (recognising that neither the issue nor the solution is solely one of technology). Competition in this area continues to erode the cost of such solutions, and indeed open-source vendor Talend even offer a version of their Talend Open Studio product as a free download.
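At its simplest, a data quality firewall means validating every inbound record before it reaches the database, quarantining anything that fails. A minimal sketch; the field names and rules below are illustrative only, not those of any particular platform:

```python
import re

def validate_record(record):
    """Return a list of problems; an empty list means the record may pass.
    The field names and rules here are illustrative only."""
    problems = []
    if not record.get("company"):
        problems.append("missing company name")
    email = record.get("email", "")
    if not re.fullmatch(r"[^@\s]+@[^@\s]+\.[^@\s]+", email):
        problems.append(f"invalid email: {email!r}")
    postcode = record.get("postcode", "").strip().upper()
    if postcode and not re.fullmatch(r"[A-Z0-9 ]{5,8}", postcode):
        problems.append(f"suspect postcode: {postcode!r}")
    return problems

# Records with problems are quarantined for review rather than loaded
incoming = [
    {"company": "Acme Ltd", "email": "jo@acme.example", "postcode": "AB1 2CD"},
    {"company": "", "email": "not-an-email"},
]
clean = [r for r in incoming if not validate_record(r)]
print(len(clean))  # 1
```

The same checks that a bureau would apply in an annual batch cleanse run continuously at the point of entry, so bad data never accumulates in the first place.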

Adopting this way of managing data quality is a longer term play that may lack the one-off satisfaction of a quick clean-up and require maintenance, nurturing and care long after the initial “gift” of project approval is received. But just like a dog, this is a gift that will keep on giving in terms of operational effectiveness and business agility, making rapid and flexible campaign execution a reality and not a chore.

What IT needs to do for Marketing

Monday, October 5th, 2009

It’s well known that Sales and Marketing are the cats and dogs of many companies (or dogs and cats, I’m not trying to start a debate about which is which in this post!), constantly fighting with each other and falling out. But what about Marketing and IT? Technology is crucial to most marketers and we turn to our IT colleagues for solutions to help us manage customer lifecycle, campaign execution and many other aspects of marketing activities. Alongside systems deemed business critical in finance and operations though, Marketing is often de-prioritised and left to fend for itself.

IT’s response to requests from Marketing for additional resource often revolves around their need to focus on “core functions”, but what are these functions? Clearly IT has many demands placed on it from across any business. Systems relating to financial management and service delivery will always occupy a higher-profile position than those merely generating and tracking demand for a company’s products and services. The tendency among IT organisations is to want to retain ownership of as much as possible, define everything as a project and then submit every initiative to a review board for approval.

Marketing’s requirements are often much simpler than this, and the rising prevalence of hosted and software-as-a-service solutions means these needs can be met in a much lighter-touch way. IT’s role then becomes that of creating an environment where these solutions can be rapidly selected and deployed, undertaking integration (often only a configuration task) where necessary. Core IT skills such as requirements definition, vendor assessment and selection, and project management are still invaluable, but IT is relieved of the heavy lifting of creating the environment for a new system and handling the fine detail of implementation.

Clearly the arguments in favour of outsourcing are well rehearsed, but Sales and Marketing represent a particularly good fit for this approach. IT’s “core function” can then become enablement, and the growing contingent of highly capable, technically literate marketing operations professionals can take it from there. There’s no reason that Marketing and IT can’t play nicely; now how to achieve the same result with Sales…

Eleven Steps to kick off your CRM system project

Monday, July 6th, 2009

We’ve run many marketing automation projects over the years, both large and small. Here’s a simplified version of the methodology we use, and some hints around getting your project up and running!

1. Project Feasibility – an informal review to scope the potential project and set some expectations. This process might be no more than a short internal meeting, but at this stage you’ll not only be able to roughly size the project, but you’ll also have a good handle on the costs you’re currently incurring. Look at the organisation’s current levels of marketing activity, not only in departments carrying out marketing, but also Sales and other functions. Try and come up with some metrics such as spend (internal, number of activities, overall number of touches), and the programme objectives: customer acquisition, retention, up-sell/cross-sell. Don’t forget softer marketing activities such as newsletters sent by product or customer service groups.

You also need to get a rough idea of the data available; again, don’t forget to look outside the main marketing teams as well as within them. This usually means Sales, Finance (if there is no data warehouse), Customer Services and product management teams.

Add into the mix your organisation’s future needs, growth strategies, new products, desired improvement in customer experiences, structural acquisitions, as well as any predictable internal factors around people or structural changes.

Activity + Costs + Data Resource + Business Objectives are the inputs you’ll need to outline the project scope.

2. Initiate Project – you might have an internal project initiation process or it might be a more informal set of actions. But any successful project will need most of these components in place:

  • Business Buy-in – Your project is going to need or catalyse change in your business. Now is the time to get your directors or SVPs on board. And don’t forget to keep up a dialogue with the guys in IT!
  • Project Champion (Board) – Someone with a stake in the project’s success and with enough political weight to fight your corner for resource and support
  • Project Manager – A good PM combines a detailed technical understanding with the oleaginous charm of a diplomat and the motivational skills of Madame Whiplash! They can be either from marketing or from IT or both! At times it’s going to be a full time job, so make sure they have the bandwidth.
  • Success Definition – Develop meaningful indicators of success; these might include reduction in costs, improvement in productivity or trends in conversion costs. Keep them simple (at least what you share with the business) and realistic.

3. 1st Stage Requirements Definition and Data Audit Documentation

  • A high quality piece of work at this stage is vital to the success of the project; investment in time here will be repaid many fold by a successful implementation. Once you start writing the cheques, it’s too late to find out that what is being delivered doesn’t meet your needs.
  • Clearly prioritise all key features; essential/desirable/optional. On any requirements document the nice-to-haves tend to take up the same amount of space as the need-to-haves.
  • Think about phasing; it’s likely any substantial project will be delivered (and paid for) in a number of stages. Prioritise key deliverables, but also work out the optimum structure to meet operational constraints.
  • Identify any internal process changes needed; this is another area that is easy to overlook or underestimate. Does this need to be a vendor deliverable, or can the business handle it themselves?
  • The Data Audit doesn’t need to be exhaustive at this stage, but you need to have a very good handle on the inputs the system will need, file layouts where applicable, approximate record quantities, and source system dependencies. In any complex organisation it’s easy to underestimate the number of data sources needed for build and production. On one recent project the estimate was 18. The real number once an exhaustive process was complete? 61!

4. RFI/RFP to vendors (and internal Technology Group) – You may or may not have an internal IT resource who feel they can deliver a marketing automation/CRM project. One way to cut through the politics of this is to ask them to respond like the other vendors – and make sure they price internal IT resources realistically.

5. Response evaluation and contract negotiation

  • Allow plenty of time for this stage; there’s nothing like seeing the figures on the table to focus the mind, and the vendor will be looking to safeguard their position. A successful negotiation will allow both parties to apportion the risk.
  • Usually there will be a significant up-front cost for development. A guaranteed contract term will allow the vendor to amortise the development costs over the period of the contract.

6. Project Plan and Timeline setting – Make this realistic but not too long. You need to be able to keep the momentum going, but it’s not great to be forever announcing delays. Try and structure the project to allow early wins; for example, you may not need every single data feed to start gaining value from a single customer view.

7. Detailed Requirements and Data discovery

  • This should be a straightforward process if you’ve got a good requirements document, but the vendor should respond to your functional prioritisation, allowing you to make informed choices before agreeing the statement of work.
  • Allow plenty of engagement time for Data discovery. You’ve probably lived with this data for a long period of time, but any external consultant or specialist is starting from scratch. You’ll also have to make knowledgeable internal data specialists available to the vendor; if you’ve got complete documentation on all internal systems and feeds, congratulations – that’s a first!

8. Development – Ensure configuration and customisation adhere to the agreed requirements and specification, without suffering from scope-creep (constant additions to the original functionality). Any such development should be minimised and every process or function scrutinised to gauge its real priority and whether “out of the box” functionality will suffice. Conduct regular review sessions with key stakeholders to demonstrate functionality and ensure it is on track.

9. Implementation and migration – Develop data migration and cut-over alongside functional development. Ensuring the right data is available in the new system from day one is critical, and users will be unforgiving if it is not. Many CRM implementations fail due to data issues, including data quality. Will you migrate all data from legacy systems, or apply rules and filters? How does the data model of the new system compare to previous ones, and will there need to be a mapping process?

10. Training and ‘Go-Live’ – Don’t overlook training; plan it well in advance of go-live. Avoid the temptation to just let users loose on a new system to learn it for themselves; instead develop a proper training programme, with hands-on usage (even if it’s a late beta version) and plenty of exercises and review sessions. Aim to have training deliverables available (documentation, process guides or screen tutorials). Run post go-live sessions to recap key functions and answer any questions on general functionality arising as users start utilising the system.

11. Evaluation and On-going development – Conduct reviews to ensure the system is delivering the required functionality. Survey users for their opinions on usability, how much they’re using the system and any key missing functions. Does it make their job easier? Put aside resources to make enhancements post go-live – don’t expect the job to be complete at this stage.
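The mapping question raised in step 9 is often best made explicit in code or configuration well before cut-over. A minimal sketch of a legacy-to-new-model field map; all field names on both sides are hypothetical:

```python
# Illustrative mapping from a legacy schema to the new system's fields;
# the names on both sides are hypothetical.
FIELD_MAP = {
    "cust_nm": "account_name",
    "contact_email": "email",
    "last_order_dt": "last_purchase_date",
}

def migrate(legacy_record):
    """Map a legacy record to the new data model, dropping unmapped
    fields and empty values (a simple example of a migration filter)."""
    migrated = {}
    for old_field, value in legacy_record.items():
        new_field = FIELD_MAP.get(old_field)
        if new_field is not None and value not in (None, ""):
            migrated[new_field] = value
    return migrated

legacy = {"cust_nm": "Acme Ltd", "contact_email": "", "fax": "0123"}
print(migrate(legacy))  # {'account_name': 'Acme Ltd'}
```

Even a table this simple forces the right questions early: which legacy fields have no home in the new model, and which new fields have no source?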

Discipline and touch control

Wednesday, April 29th, 2009

A point raised at the IDM Emerging Digital Trends seminar the week before last, a recent conversation and an article in the April issue of Database Marketing magazine all highlighted an issue I thought worth recounting here.

Presenting on the use of web analytics, David Walmsley, Head of Web Selling at John Lewis talked about customer segmentation and tailoring communications based on behaviour and purchase history. He made the observation that by creating such segments, messaging content and frequency could be tailored appropriately to recipients, increasing relevance and effectiveness.

Separately, I was speaking to the former head of database marketing at a US Mid-Western publishing company. He recounted the tale of having finally made inroads with his marketing campaigns colleagues in persuading them to adopt a segmentation strategy. This was aimed at helping to reduce the over-touch problems they were facing, where some individuals were receiving as many as one email per day, such was their volume of activity. This of course led to drastically falling response rates, as the blizzard of email simply went ignored. (Interestingly, even opt-out rates weren’t that high, such was the level of disengagement among recipients). “We need to improve targeting and reduce touch volume”, said my exasperated contact.
“No problem,” came the response, “we’ll just stop emailing the bottom, least valuable segment – that should cut volume by 10%!” The observation that this would make no difference to the top segment (those receiving an email a day), went unheeded…

In his piece in Database Marketing, Warwick Beresford-Jones wrote about “optimisation”, making the point that a given individual can be contacted only so many times before they become unresponsive, and that those touches should be used wisely to achieve best value. “Without optimisation, your best customers are generally over-contacted and your second and third best customers are under-contacted,” states Beresford-Jones. “There is a point in the year when you actually start to annoy your best customers and this impacts directly on campaign profitability.”
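One simple way to operationalise Beresford-Jones’s point is a per-contact annual touch cap: clip the over-contacted segments and redirect the freed touches to the under-contacted ones. The cap and segment numbers below are purely illustrative:

```python
ANNUAL_TOUCH_CAP = 24  # assumed saturation point per contact per year

def rebalance(planned):
    """Clip over-contacted segments at the cap and report the spare
    capacity freed up for under-contacted segments. Illustrative only."""
    adjusted, freed = {}, 0
    for segment, touches in planned.items():
        adjusted[segment] = min(touches, ANNUAL_TOUCH_CAP)
        freed += max(0, touches - ANNUAL_TOUCH_CAP)
    return adjusted, freed

# "best" is getting an email a week; the cap frees 28 touches a year
planned = {"best": 52, "second": 12, "third": 6}
adjusted, freed = rebalance(planned)
print(adjusted, freed)
```

Crucially, this addresses the top segment directly, unlike the publisher’s proposal above of simply dropping the bottom segment.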

Back at the IDM seminar, David Walmsley highlighted this temptation of sending “one extra” email to the top segment when the weekly sales numbers aren’t quite reaching target. I asked him how this temptation should be avoided. “It takes discipline,” suggested Walmsley, adding that taking a short term approach ultimately leads to lost value within your customer base. How should this discipline be engendered though, and in particular how can pressure from senior management in tough trading conditions be resisted?

The answer, as Beresford-Jones amply illustrates in his article, is to have the numbers to hand to support your case. This means being able to demonstrate the return from a given piece of activity and ideally the campaign cost savings made by reducing segment sizes whilst maintaining targeting focus. Point to falling response rates (and opt-out rates where appropriate) as evidence that contact fatigue has set in.

The ease of executing online communications is such that over-touching, even with the best of intentions, is all too easy. Even the pushiest retail sales person would be unlikely to follow you around the store asking every few steps whether you wanted to buy something (certainly not at John Lewis!). Yet that’s what our customers’ inboxes can feel like at times. A little discipline never did anyone any harm and this is no exception.

Salesforce.com, analytics, email marketing and financials – it’s all in the cloud

Monday, April 13th, 2009

The salesforce.com customer conference in the UK this year took the form of CloudForce, a complimentary day of sessions and vendor showcases, held at London’s ExCeL exhibition centre last week. It’s no revelation that salesforce.com have long since moved on from simply being a salesforce automation developer. Today, they promote the benefits of cloud computing – multi-tenanted, internet based computing platforms – that obviate the need to install software. Indeed, the “no software” message, and attendant logo of the word “software” with a line through it, was repeated at every opportunity. An amusing moment came when Paul Cheesbrough, CIO of The Telegraph Media Group, made reference to “your software” when joining salesforce.com CEO, Marc Benioff, on stage during the main session. “Your platform I mean,” Cheesbrough quickly corrected himself, “there is no software.”
“Thank you,” replied Benioff.
“I saw it in your eyes!” quipped Cheesbrough.

The AppExchange platform that forms part of salesforce.com offers a plethora of opportunities to expand the functionality of the base product. However, the ready integration capabilities of salesforce.com and the Force.com application platform enable new possibilities, some of which I thought noteworthy. Force.com is particularly interesting, as it opens up the platform beyond Sales and Customer Service management to one that allows developers to create their own applications running on the salesforce.com cloud infrastructure. To developers, Force.com represents the opportunity to deliver solutions based around the software-as-a-service ethos, without having to build the delivery infrastructure themselves. Adopters of these solutions, for whom not having to install software and maintain their own IT infrastructure is appealing, gain access to applications meeting their requirements that might not otherwise have made it to this delivery mechanism.

One such example is a complete accounting application from financial software developers Coda, called Coda2go. Based around their on-premise solution, Coda2go runs entirely on the Force.com platform and integrates closely with salesforce.com itself. I wrote recently about the considerations of integrating sales order processing within the sales and marketing “data ecosystem”, where I made reference to the point at which an Opportunity is closed and an order booked. With Coda2go, this process, together with the resulting invoicing, is practically a one-click undertaking. Once the Opportunity is ready to be booked as a sales order, which would typically involve manually switching to a different system, all of the order details are picked up from salesforce.com, transferred to Coda2go, invoices created and the rest of the accounting process put in train. I can’t speak to how good a financials solution Coda2go is, but this looks pretty neat!

Closer to marketing home, Cognos (now part of IBM) and QlikTech were offering salesforce.com-enabled versions of their analytics solutions. As well as enabling more sophisticated analysis, visual representation and dashboards than salesforce.com offers natively, these solutions will work across multiple data sources, holding out the prospect of unified marketing and sales reporting and analysis. Joining marketing data such as campaign execution, response and leads with converted opportunities and closed deals, the nirvana of true, operational marketing effectiveness reporting comes a step closer. Of course a variety of process implications still need to be considered, but at least data visibility is improved.

Finally, and firmly within the marketing realm, a couple of email campaign solutions and a data collection system caught my eye. ExactTarget, among others, offers solutions for creating and despatching marketing emails with all the features you would expect, including HTML templates, personalisation, tracking and reporting. Naturally, this is integrated with salesforce.com in terms of data management and reporting, making straightforward but relatively sophisticated email marketing very easy. Clicktools allows the creation of surveys, landing pages and forms, enabling rapid generation of marketing response mechanisms, as well as questionnaires and so on. Between all of these solutions, it seems possible that best-of-breed marketing campaigns consisting of outbound email and rich landing pages with response tracking can be created relatively easily and inexpensively, without needing full scale and costly marketing automation solutions.

So, there you have my quick round-up of highlights from CloudForce ’09, all without reference to meteorology or having my head in the clouds. Doh! Too late.

7 reasons for real time data updates

Thursday, February 12th, 2009

Previously (see The secret to CRM & Marketing data management?) I’ve written about the benefits and hazards of creating independent marketing databases, and in particular the questions that need to be asked before taking such an approach. I’m currently involved in a debate over the long term approach that should be taken to the management of marketing data, and where it should reside, which raises some of these issues.

Take the real life example of a campaign automation system that is synchronised with a sales force automation (SFA) solution via a real time data adapter. Changes made to customer and prospect contact data in either system are exchanged almost immediately, together with leads and status updates. When it works, it’s fabulous, providing a real time view of data in either system, ensuring Sales and Marketing are seeing the same picture, whilst enabling them to use the best-of-breed system most appropriate to their respective requirements.

A new CMO and the prevailing economic conditions, though, have led to questions over whether marketing data should continue to be managed in-house, rather than outsourced to a marketing service provider. In reviewing the options for outsourcing, however, one of the first issues (of many) that arises is how, if at all, sales and marketing data integration should be maintained.

Most out-sourced or hosted solutions tend to rely on much less sophisticated and timely batch data transfers, via FTP or a similar mechanism, which are a long way from the real time synchronisation currently enjoyed. Is the loss of immediacy that comes with moving to such a mechanism important? “This is a really worrying trend,” says Shane Redding of business to business digital and data marketing consultancy Cyance. “It is disappointing to see companies make a backward step of this kind, which in my opinion is usually the result of not making the next step of really using the real time data in anger which then demonstrates the return on investment.”

Shane and I are very much in agreement, and here’s why.

  1. Sales and Marketing users don’t, and shouldn’t need to, understand the intricacies of data integration. They just want to know that data in one system is available in the other; a Sales rep entering a new contact in the SFA system wants to know their prospect is available for marketing activity. It inspires much greater confidence if this transfer is immediate, without users having to know about or understand overnight batch updates. Once that confidence is lost, users feel disconnected and reduce their ownership of the process, leading to a rapid deterioration in data quality.
  2. The sooner changes made to a record in either system are replicated, the less chance there is of subsequent changes to the same record in the other system being made before the data is transferred, leading to potential anomalies or corruption. This is particularly the case where records are merged or changes are made to many fields at the same time.
  3. Marketing-generated leads need to be transferred to Sales promptly. Research shows that timely lead follow-up is one of the biggest determinants of successful lead conversion. If a lead or response relates to an existing contact or customer, Sales should be made aware as soon as possible, allowing a rep to handle their account in the most appropriate way.
  4. Best-of-breed marketing practices, such as trigger marketing based on response or other events, require good data integration. Explaining such requirements away by saying “we don’t need to do that” won’t cut it. Your competitors are doing it.
  5. Business is moving ever faster. It is expected that data changes are available immediately, especially between Sales and Marketing systems. Reverting to a batch system is a backwards step that fails to lay the foundations for modern and forward-thinking marketing capability.
  6. System development and testing are substantially quicker and easier if changes in one system are reflected in the other almost straight away, rather than having to wait to see if configuration changes are working as intended.
  7. Much of the complexity in data synchronisation lies in the business rules for handling updates, conflicts, mappings and referential integrity. Once these rules are in place, why not transfer data more frequently, reducing the volume and complexity of batch updates when they occur?
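Point 2 above – the conflict window – can be illustrated with a minimal last-write-wins merge, a common if simplistic resolution rule; real adapters apply far richer field-level business logic, and the record shape used here is an assumption for the sketch:

```python
from datetime import datetime, timedelta

def merge_record(record_a, record_b):
    """Resolve a two-way conflict by taking each field from whichever
    system touched it most recently (last-write-wins).
    Records are {field: (value, last_modified)} - an assumed shape."""
    merged = {}
    for field in set(record_a) | set(record_b):
        candidates = [r[field] for r in (record_a, record_b) if field in r]
        merged[field] = max(candidates, key=lambda v: v[1])
    return {f: v[0] for f, v in merged.items()}

t0 = datetime(2009, 2, 1, 9, 0)
sfa = {"phone": ("020 7000 0000", t0), "title": ("Manager", t0)}
campaign = {"phone": ("020 7111 1111", t0 + timedelta(hours=2))}
print(merge_record(sfa, campaign))
# phone comes from the campaign system (newer); title from the SFA system
```

The longer the batch window, the more fields accumulate divergent edits on both sides, and the more often even a sensible rule like this has to throw one side’s change away.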

Marketing shouldn’t be ashamed to stand up for genuine business requirements, with demonstrable benefits. Don’t let internal politics or external suppliers tell you otherwise!

With thanks to Shane Redding for contributing to this post.