Archive for the ‘Activity’ Category

Testing, testing, testing

Tuesday, July 8th, 2008

Unfortunately, implementation of the new Aprimo marketing automation solution (see 6 crucial marketing automation system requirements) has not been proceeding smoothly, with a number of functionality and reliability issues recurring as we try to bed the system in. Hopefully these will be resolved soon, but the experience has highlighted the importance of testing before going into production with any system. This goes double (if not ten times) when deploying brand-new software, as has been the case for part of our implementation.

When the professional services and consultants’ clocks are ticking at the rate of tens of thousands of dollars a week, it’s tempting to cut them loose and go live without completing testing as rigorously or comprehensively as warranted. This is a false economy, though, and as with many things, it’s easy to launch in haste and regret it at leisure. The three most important considerations of any system launch? Testing, testing, testing!

6 crucial marketing automation system requirements

Wednesday, February 13th, 2008

Go-live of our new marketing automation system, Aprimo Enterprise, is rapidly approaching. I’ve just returned from a week’s user acceptance testing in Bedford, Massachusetts, and general training is planned over the next few weeks. The launch will be the culmination of a significant project, involving the collection of requirements from the many business units across the company, integration of over 600 existing website data capture forms and implementation of relatively complex lead routing rules. Whilst these requirements have long been documented in great detail, here are some of the key elements that should be considered vital for most marketing automation solutions.

  1. SFA integration – customer and prospect data, together with leads, are the lifeblood of sales and marketing activities and need to be shared with the utmost efficiency. As such, a robust connection between the marketing system and our chosen SFA solution, Salesforce.com, is crucial (a rough sketch of what pushing a lead across can look like follows this list). Perhaps surprisingly, this is something that the existing, in-house developed system already has, so replicating this functionality is a given if progress is to be made. (I say surprisingly because so many organisations struggle with this, maintaining separate sales and marketing data “islands” and creating huge issues for ensuring the latest data is shared and properly utilised.)
  2. Data import – loading external data, including event delegates, purchased lists and enquiries from third party websites, into a marketing database is an ongoing activity, making the ability to do so quickly and easily a significant operational benefit. This should ideally be as flexible as possible, in terms of file layouts and matching criteria, together with data quality maintenance such as address standardisation and general validation. A bare-bones import loop is sketched after this list.
  3. Segmentation – it goes without saying that creating campaign execution queries needs to be as straightforward as possible. This functionality should extend, though, to the flexible creation of cells or sub-segments, including intelligent “cascading” of contacts based on prioritised selection criteria (sketched after this list). This is at the heart of customised and relevant, as opposed to one-size-fits-all, marketing activity.
  4. Email execution – email is inevitably the most utilised communications channel in business-to-business marketing, necessitating capable and flexible execution. This should include support for templates and standard elements (logos, graphics etc.) together with personalisation and customisation, taking advantage of the segmentation capabilities to drive relevant communications (a toy merge-field example follows the list).
  5. Form handling – all website data capture should ultimately flow back into the marketing database, so minimal, if any, manual intervention should be required. In addition, though, outbound marcoms (specifically email, but ideally print direct mail too) are frequently likely to link back to a landing page or microsite with a data capture mechanism. Again, this should be seamlessly integrated, and in the case of a form arrived at via an email click-through, must be pre-populated (it’s infuriating to have to key in all your details again when you’ve just received an email from a company purporting to know who you are!). One way of wiring this up is sketched after the list.
  6. Database access – often overlooked, the ability to query and modify individual contact and organisation records within a marketing database is hugely beneficial. It’s tempting to think that all data will be extracted en masse as a list or email broadcast selection and never needs to be dealt with individually. This is rarely the case though, and as the inevitable issues and queries arise, with minor updates and verifications needed, the ability to quickly look up individual records makes life much easier.
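
For point 1, here is a back-of-an-envelope illustration of what pushing a lead across to Salesforce.com can look like. This is purely a sketch using Salesforce’s REST interface and the third-party Python requests library; our actual connection is a packaged vendor integration rather than hand-rolled code, and the API version and token handling shown are assumptions.

```python
import requests  # third-party HTTP library, assumed available


def push_lead(instance_url, access_token, lead):
    """Create a Lead record via the Salesforce REST API.
    LastName and Company are the fields Salesforce requires."""
    response = requests.post(
        f"{instance_url}/services/data/v52.0/sobjects/Lead",
        headers={
            "Authorization": f"Bearer {access_token}",
            "Content-Type": "application/json",
        },
        json=lead,
    )
    response.raise_for_status()
    return response.json()["id"]  # id of the newly created record


# push_lead("https://example.my.salesforce.com", token,
#           {"LastName": "Smith", "Company": "Acme Corp",
#            "Email": "j.smith@example.com"})
```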
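
Point 2 in code: a minimal sketch of an import step, assuming a simple CSV layout and email address as the sole matching criterion. A real implementation would of course support configurable layouts, fuzzier matching and address standardisation.

```python
import csv


def normalise(value):
    """Trim and lower-case a field for matching purposes."""
    return (value or "").strip().lower()


def import_contacts(path, existing_emails):
    """Load a delegate or purchased list, skipping records that
    fail basic validation or already exist (matched on email)."""
    loaded, skipped = [], []
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            email = normalise(row.get("email"))
            if "@" not in email:
                skipped.append((row, "invalid email"))
            elif email in existing_emails:
                skipped.append((row, "duplicate"))
            else:
                existing_emails.add(email)
                loaded.append(row)
    return loaded, skipped
```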
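
The “cascading” behaviour in point 3 is easier to show than describe: cells are evaluated in priority order and a contact lands in the first one it qualifies for, which keeps the cells mutually exclusive. The field names here are illustrative only.

```python
# Each cell is (name, predicate); list order encodes priority, so a
# contact "cascades" into the first cell it matches and is excluded
# from all later ones.
cells = [
    ("existing_customer_uk", lambda c: c["is_customer"] and c["country"] == "GB"),
    ("existing_customer",    lambda c: c["is_customer"]),
    ("hot_prospect",         lambda c: c["lead_score"] >= 50),
    ("everyone_else",        lambda c: True),  # catch-all cell
]


def assign_cell(contact):
    for name, matches in cells:
        if matches(contact):
            return name


contact = {"is_customer": False, "country": "GB", "lead_score": 72}
print(assign_cell(contact))  # -> "hot_prospect"
```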
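
The personalisation in point 4 boils down to merge fields. Here is a toy version using Python’s standard string.Template; real systems, Aprimo included, have their own template syntax, so treat the field names as placeholders.

```python
from string import Template

# A shared template combining standard elements with merge fields.
body = Template(
    "Dear $first_name,\n\n"
    "As one of our $segment contacts, we thought you'd like...\n"
)

for recipient in [{"first_name": "Jane", "segment": "storage"},
                  {"first_name": "Ravi", "segment": "networking"}]:
    print(body.substitute(recipient))
```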
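
Finally, the pre-population complaint in point 5. One common approach, sketched below with hypothetical parameter names, is to embed a contact identifier in the email’s click-through URL and have the landing page resolve it back to a record before rendering the form.

```python
from urllib.parse import parse_qs, urlencode, urlparse


def tracked_link(base_url, contact_id):
    """Build the click-through URL embedded in the email;
    'cid' is an illustrative parameter name."""
    return f"{base_url}?{urlencode({'cid': contact_id})}"


def prefill_form(url, lookup):
    """On the landing page, resolve the identifier back to a
    contact record so the form renders with details filled in."""
    cid = parse_qs(urlparse(url).query).get("cid", [None])[0]
    return lookup.get(cid, {})


contacts = {"12345": {"first_name": "Jane", "company": "Acme Corp"}}
url = tracked_link("https://example.com/whitepaper", "12345")
print(prefill_form(url, contacts))  # -> {'first_name': 'Jane', ...}
```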

This is by no means a comprehensive set of requirements, of course, and vast amounts of detail lie behind even these points. They should be the “table stakes”, though, for any serious enterprise marketing automation solution. Anything else is cutting corners!

6 data quality solution requirements

Wednesday, July 18th, 2007

Further to the “vision” I outlined last month (see recent post “Data quality – a vision”), the RFP for our data quality system has been released to vendors. Whilst data quality is by no means purely a technology problem, or one that technology alone can solve, I do believe that a good software solution can create a platform around which the right processes can be built to achieve and maintain better data quality. These are the key requirements to which I’ve asked the vendor short list to respond. (A list of data quality suppliers is available on my main website.)

  1. Data profiling/validation rule generation – appropriate analysis of existing data structures and content in order to determine rules for ongoing data validation and exceptions reporting (a bare-bones profiling pass is sketched after this list).
  2. Initial database address standardisation and de-duplication – perform initial address standardisation to local postal authority conventions, appending an address quality score to each record. Conduct organisation-level de-duplication on the standardised data at country and site level; once organisations have been de-duped, conduct an individual-level de-dupe within each organisation (this two-pass approach is sketched after the list).
  3. Operational data processing – ongoing ad hoc data loads from internal and external sources, requiring address standardisation and merge/append (i.e. de-duplication) processing for loading to the main database. Monitoring and reporting of data validity and rule compliance.
  4. Monitoring and maintenance – proactive identification of data quality issues resulting from invalid data loads or user updates. Present data requiring review/correction to appropriate users in order that amendments can be made and then prepared for loading back into the central database.
  5. Profiling and metrics – ongoing data quality metrics (consistency, completeness, frequency counts, scoring) and intervention reporting (duplicates identified and removed, automated validity amendments, manual corrections) based on set rules. Presentation via a “dashboard”-type report for easy review.
  6. Online data capture – real-time validation, standardisation and enhancement of data captured via web-based forms, including contact name and job title, email, telephone number and other elements. Apply formatting to all data (capitalisation etc.) and to telephone numbers (local presentation conventions). Process captured data to be merge/appended to the main database (see the cleansing sketch following this list).
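
To make point 1 concrete, here is a bare-bones profiling pass over a set of records: completeness and frequency counts per column are usually the raw material for deciding which validation rules are worth writing. A sketch only, with deliberately naive field handling.

```python
from collections import Counter


def profile(rows, columns):
    """Completeness and top-value counts per column."""
    stats = {}
    for col in columns:
        values = [str(r.get(col) or "").strip() for r in rows]
        filled = [v for v in values if v]
        stats[col] = {
            "completeness": len(filled) / len(values) if values else 0.0,
            "top_values": Counter(filled).most_common(5),
        }
    return stats


# profile(records, ["country", "job_title", "email"])
```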
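
Point 2’s two-pass de-duplication, sketched below: records are first grouped into organisations on a crude match key (a stand-in for proper address standardisation and fuzzy matching), then individuals are de-duped within each organisation. The field names are assumptions.

```python
from collections import defaultdict


def org_key(rec):
    """Illustrative organisation match key: country plus crudely
    normalised name and postcode."""
    name = rec["org_name"].lower().replace("ltd", "").replace("inc", "").strip()
    return (rec["country"], name, rec["postcode"].replace(" ", "").upper())


def dedupe(records):
    orgs = defaultdict(list)
    for rec in records:
        orgs[org_key(rec)].append(rec)  # pass 1: organisation level
    survivors = []
    for members in orgs.values():
        seen = set()
        for rec in members:  # pass 2: individuals within each org
            person = (rec["first_name"].lower(), rec["last_name"].lower())
            if person not in seen:
                seen.add(person)
                survivors.append(rec)
    return survivors
```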
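
And a sketch of the real-time cleansing in point 6. The rules below are simplistic stand-ins for a proper validation service, but they show the shape of it: validate, standardise formatting, then pass the record on for merge/append.

```python
import re


def clean_capture(form):
    """Validate and standardise a web-form capture before it is
    merge/appended to the main database."""
    errors = []
    email = form.get("email", "").strip().lower()
    if not re.fullmatch(r"[^@\s]+@[^@\s]+\.[^@\s]+", email):
        errors.append("email")
    first_name = form.get("first_name", "").strip().title()  # capitalisation
    # Keep digits and a leading "+" only; a real service would apply
    # local presentation conventions per country.
    phone = re.sub(r"[^\d+]", "", form.get("phone", ""))
    return {"email": email, "first_name": first_name, "phone": phone}, errors


# clean_capture({"email": "Jane@Example.COM ", "first_name": "jane",
#                "phone": "+44 (0)20 7946 0000"})
```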

Whilst I’m waiting for proposals from the vendors, the next step is to develop the business case for the project itself, to which I’ll return here soon.