ICP Blog

DAM Rockstar 3: Think ecosystem, not platforms - Tech & Integration

I was looking back at the techy questions that came up during the ‘Ask the Rockstar’ session, so I thought I’d use my turn in charge of the blog to expand a little on why, for programmes to succeed, Digital Asset Management must be about so much more than a single application.

Tom Sloan, Director, Marketing & Technology Consultant

I’ll start by proposing that, although a DAM tool is typically a single application, the Practice of DAM within an organisation is not a tool. It is a strategic concept which is so much more than an application that stores assets.

The principles that underpin the successful management of digital assets must be maintained, governed and monitored on a regular basis to provide assurance that the organisation’s DAM practice stays aligned with its strategy across the entire content eco-system. DAM leaders & practitioners are now, rightly, taking the bold step of taking charge of the entire content lifecycle, both upstream (ideation, creation, approval) and downstream (distribution, transformation, re-assembly, publication, measurement). This will include implementation of new technology, integration between technologies, process redesign and governance.

The proliferation of channels, the increased quantity of content required, and the ever-expanding importance of e-commerce and social commerce have necessitated an eco-system of applications, as no one piece of technology can do it all. Technical integration provides both time efficiency and data consistency, whereas manual migration and misaligned data structures often cause more problems than they solve.

If you are starting from scratch, the view now is that DAM projects shouldn’t really exist independently. They should instead be a component of a Content Eco-System Programme focussing on end-to-end asset lifecycle processes. They should be facilitated by technology integrations and benefit from overarching governance structures, operational models and change management at the programme level, not just during an implementation, but as a core capability to support continuous innovation and evolution.

The non-cyborg, mostly human me knows that, in the real world, businesses today are rarely in a position to do all of this. That might be because they already rely on parts of the eco-system: existing applications or processes which are deemed fit for purpose, or business-critical workflows that need to be protected from wholesale change. That’s totally understandable. In fact, the most successful programmes will mark out a much more gradual roadmap towards an integrated content eco-system, driven by short-, medium- and long-term business outcomes, with an appropriate business case supporting every integration along the way. This evolves the content eco-system (and the remit of DAM practitioners with it) whilst remembering to demonstrate ROI through measurable metrics like cost savings and process efficiency.

The key thought here is that, as DAM practitioners, we shouldn’t be afraid of making the case for the Practice of Digital Asset Management across an entire content eco-system made up of multiple applications, and proposing a holistic set of system-wide policies (data, governance, operations & change) to complement some nifty, time-saving and user-pleasing integrations.

So, how best to move to a content eco-system approach instead of a more siloed set of independent software projects? 

Here are a few things to consider which I hope also address some of the questions raised in the Henry Stewart ‘Ask the Rockstar’ session.

1. Establish programme-wide governance and discussion forums, bringing in your experts on different systems and processes

Cross-functional governance groups, which include stakeholders representing different applications teamed with business stakeholders representing different stages of the product & asset lifecycles, ensure that a holistic approach to the programme is maintained. Decisions which have upstream and downstream impacts can be de-risked and aligned upfront, resulting in clearer C-Suite relationships as well as clearer communications to the wider business.

There are also a lot of items which can and should be discussed at a much broader level than that of a single application. Examples here are cross-application asset visibility, security and user permissions, as well as how user account creation and deletion are managed. Where possible, access to eco-system applications needs to be built into onboarding and offboarding processes. A model for external access should be agreed as centrally as possible to avoid challenges like having multiple logins and passwords to remember, and different password expiry dates for each application.
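To make that concrete, here is a minimal sketch (in Python) of what a single joiner/leaver routine might look like when access is managed centrally. The admin endpoints and payloads are hypothetical stand-ins for real vendor APIs; ideally an SSO or identity provider does this for you.

```python
# A minimal sketch of tying eco-system access to one joiner/leaver process.
# The per-application admin endpoints below are hypothetical stand-ins for
# real vendor APIs (or, better, a central SSO / identity provider).
import requests

ADMIN_APIS = {
    "DAM": "https://dam.example.com/api/users",
    "PIM": "https://pim.example.com/api/users",
    "CMS": "https://cms.example.com/api/users",
}

def offboard_user(email: str, token: str) -> dict:
    """Disable a leaver's account in every eco-system application in one pass."""
    headers = {"Authorization": f"Bearer {token}"}
    results = {}
    for app, url in ADMIN_APIS.items():
        resp = requests.patch(
            f"{url}/{email}", json={"status": "disabled"}, headers=headers, timeout=30
        )
        results[app] = resp.status_code
    return results
```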

2. Purposefully establish the role and scope of each application within the eco-system

Each application should have a clear purpose for the business and the user community. The capabilities of each technology should be made clear at the programme governance level so that any current or future needs from anywhere in the content eco-system can be discussed. Where possible, existing solutions should be leveraged to satisfy those needs before additional software is procured or significant customisation of an application is given the go-ahead. A good example here is DRM tools: many DAM applications can provide sufficient rights functionality, but if a significant proportion of your assets require appended rights information beyond content rights / licenses / usage restrictions, then an integration with a DRM tool may be required. Talent, model and music rights are good examples, as are businesses where royalty payments are part of the model.

Other steps can be taken to encourage adoption of the right tools at the right time. It may be necessary, for example, to limit functionality in downstream applications to ensure assets and data are being managed and quality assured in the appropriate systems. If you’re leveraging an integration between DAM & CMS, don’t let people upload directly to the CMS, bypassing the quality control and metadata requirements which support increased re-use of that content in other channels.

Finally, and more specifically in a DAM context, you should be clear about what should and shouldn’t be stored in your DAM. Asset re-usability is often a KPI for DAM implementation, and restricting visibility for large numbers of assets, or having a very complex user permission model and rules, can actively work against that metric. It’s always worth challenging permission requirements, and in cases where heavy restrictions are being insisted upon, challenging the reasons for storing those assets in an enterprise platform in the first place!

3. Get broad agreement on what within your business constitutes ‘Master Data’ and how this can be kept in sync across multiple applications

Master Data are the key metadata items which should be universally important and constant across multiple applications. Typically, this might refer to Category / Brand / Sub-Brand type information, Geographies / Markets, or even Asset Types. When new values for any of these fields are added, they need to be selectable in your applications and appear across the eco-system simultaneously. This requires a carefully thought-out process for keeping the fields in sync, and for mapping any legacy data from an old field to a new one in multiple systems if necessary. This doesn’t always require technology if there is a well-governed and well-executed set of processes in place for doing it.
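As an illustration, here is a minimal sketch of a scripted sync for one master data field. The endpoints, field names and payloads are hypothetical; a real implementation would use each vendor’s own taxonomy or metadata API (or a dedicated master data management tool), but the shape of the process is the same.

```python
# A minimal sketch of keeping one master data list (here, Brand) in sync across
# eco-system applications. All endpoints and payloads are hypothetical.
import requests

MASTER_BRANDS = ["Acme", "Acme Pro", "Acme Kids"]  # the agreed source of truth

# Hypothetical admin endpoints for each application's Brand field
APPLICATIONS = {
    "DAM": "https://dam.example.com/api/fields/brand/options",
    "PIM": "https://pim.example.com/api/attributes/brand/values",
    "CMS": "https://cms.example.com/api/taxonomies/brand/terms",
}

def sync_brand_values(token: str) -> None:
    """Push the master Brand list to every application and report any drift."""
    headers = {"Authorization": f"Bearer {token}"}
    for app, url in APPLICATIONS.items():
        current = set(requests.get(url, headers=headers, timeout=30).json())
        missing = [b for b in MASTER_BRANDS if b not in current]
        orphaned = sorted(current - set(MASTER_BRANDS))
        for value in missing:
            requests.post(url, json={"value": value}, headers=headers, timeout=30)
        if orphaned:
            # Don't delete automatically: legacy values may still be mapped to assets
            print(f"{app}: review orphaned values {orphaned}")
```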

4. Leverage two-way integrations to enrich user & search experience in each platform, but ensure that the content or data is only editable or managed in the ‘system of record’ for that content type

The data that exists in different eco-system applications can be used to enrich experiences in others. For example, assets may be required to appear in your PIM alongside the relevant product records, but it would also be very helpful to pull the product data from PIM and index it alongside the asset metadata in DAM. This can provide an additional or enhanced search experience where DAM users can also browse, navigate, search or group content according to product data. Importantly, the product data from PIM shouldn’t be editable in the DAM, and vice versa.
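A minimal sketch of that kind of enrichment is below, assuming hypothetical PIM and DAM REST endpoints. The point it illustrates is simply that the data flows one way: from the system of record (PIM) into read-only fields in the DAM.

```python
# A minimal sketch of enriching a DAM asset record with product data pulled
# from PIM, with PIM remaining the system of record. Endpoints, field names
# and the "read_only_fields" convention are all hypothetical.
import requests

PIM_URL = "https://pim.example.com/api/products"
DAM_URL = "https://dam.example.com/api/assets"

def enrich_asset_from_pim(asset_id: str, sku: str, token: str) -> None:
    headers = {"Authorization": f"Bearer {token}"}

    # Read product data from the system of record (PIM)
    product = requests.get(f"{PIM_URL}/{sku}", headers=headers, timeout=30).json()

    # Copy only the agreed fields into read-only DAM metadata, so DAM users can
    # search and browse by product data but can never edit it there
    payload = {
        "metadata": {
            "product_name": product["name"],
            "category": product["category"],
            "markets": product["markets"],
        },
        "read_only_fields": ["product_name", "category", "markets"],
    }
    requests.patch(f"{DAM_URL}/{asset_id}", json=payload, headers=headers, timeout=30)
```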

5. Stay as ‘Out-of-the-box’ as possible!

The integration capabilities of any MarTech application should be key criteria for most customers during the procurement process. Where possible, lessons should be learned from eco-system pioneers of the past who, during their DAM 1.0 or even 2.0 journeys, were forced to customise a product in order to address specific user stories or requirements. This may have solved some short-term problems, but the result is that these platforms quickly moved away from the simplest upgrade path set out by the product vendor, making it much trickier to keep them updated and to derive the benefits of the vendor’s product improvements.

6. Use automation smartly, but don’t over-estimate its benefits at this stage

Leveraging transformation of digital assets as they move downstream can remove the manual production effort required to ensure channel optimisation, so this is certainly an automation to consider. It relies on having solid asset specifications briefed and quality controlled during the creative process and ingestion steps within the content eco-system.
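As a simple illustration, here is a sketch of rule-driven rendition generation from an approved master file, using Python and the Pillow imaging library. The channel specifications are invented for the example rather than recommendations.

```python
# A minimal sketch of automated channel renditions: derive downstream variants
# from a quality-controlled master asset using per-channel specifications.
from pathlib import Path
from PIL import Image  # pip install Pillow

# Illustrative specs only; real values come from your channel briefs
CHANNEL_SPECS = {
    "ecommerce_hero": {"size": (2000, 2000), "format": "JPEG", "quality": 90},
    "social_square": {"size": (1080, 1080), "format": "JPEG", "quality": 85},
    "web_thumbnail": {"size": (400, 400), "format": "WEBP", "quality": 80},
}

def generate_renditions(master_path: str, out_dir: str) -> list:
    """Create one rendition per channel from a single approved master file."""
    out = Path(out_dir)
    out.mkdir(parents=True, exist_ok=True)
    renditions = []
    with Image.open(master_path) as master:
        for channel, spec in CHANNEL_SPECS.items():
            copy = master.copy()
            copy.thumbnail(spec["size"])  # fit within the box, keep aspect ratio
            target = out / f"{Path(master_path).stem}_{channel}.{spec['format'].lower()}"
            copy.convert("RGB").save(target, spec["format"], quality=spec["quality"])
            renditions.append(target)
    return renditions
```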

Today, Artificial Intelligence is being used to provide very basic metadata enrichment. It has potential to do more, but mainly around content curation and discovery. AI or Machine Learning investment is better deployed in content analytics / data aggregation & insight rather than in DAM (for now at least). Crowdsourcing metadata sounds great, but a crowd won’t instinctively tag your images the way your users would search for them; that knowledge sits internally and needs to be understood! It’s also a pain to implement, manage & QA!

7. Do change management programme-wide, and continuously

Running multiple change programmes across different components of the eco-system can lead to mixed messaging and user confusion about the strategic objectives of the programme. Of course, the detailed training and onboarding for individual applications might need to look different or be done separately, but having these change management initiatives and deliverables ladder up into a single broader programme will really help end-user understanding, limit the number of separate communications and ensure a consistent approach across the end-to-end user and content journey.

It’s important to remember that although user numbers might remain fairly consistent after the launch of an application, its user base will often be constantly changing, with users leaving the business and new agencies coming on board. Access to training and knowledge sharing therefore needs to be constantly available; otherwise adoption could drop off and data quality could suffer after launch, leading users to unfairly level blame at the technology itself.

By Tom Sloan, Director, Marketing & Technology Consultant

 

Here are the topics in the ICP experts blog series, and who’s weighing in on each:

1: Getting stakeholder buy-in to a winning plan: Victor Lebon, CEO, EMEA & APAC 

2: Plan beyond the tech; Adopting and Adapting: Jing Wang, Consultant 

3: Think ecosystem, not platforms: Tom Sloan, Director, Marketing Technology - this blog

4: Find the balance between metadata structure and flexibility: Sara James, Sr. Marketing Technology Consultant 

5: People and resources make a difference: Elliott Brown, Group Business Director