This is the sixth post in my series on beliefs the DAM community has developed over time that need to be reassessed.

In this post, I'll address the changing role of DAM, from an archival repository to a living, breathing, critical business application.

"DAM is all about getting and holding assets."

To be fair, I think most folks would agree this idea fell out of fashion a few years ago, but there is further change afoot in the scope of DAM that I want to bring to your attention.

Let's start by looking back to the beginning of DAM. DAM started out as a repository for final assets. An archive. That was DAM 1.0, and many companies and vendors never graduated from there (even when they know more is possible and desirable, they don't always see how to get there).

The next wave of DAM, DAM 2.0, was all about connecting to upstream collection and downstream distribution points. Think about upload portals, download portals, and connections to web content management (WCM) systems, social sites, and more.

There is still plenty of innovation happening in DAM 2.0 – it's not over by any means. For example, most downstream systems like WCMs have pretty rudimentary integrations that allow a page author in the WCM to open a panel and select assets from the DAM, which are copied into the WCM and then served to the web through whatever mechanism that WCM uses. This means content is duplicated, and the performance of the content on the website – views, engagement, and so on – is disconnected from the asset in the DAM. That's both a waste of storage and a missed opportunity to understand asset performance in the context of all the other metadata available in the DAM. How is one asset performing compared to the cost of creating it? Compared to assets from a different creator or agency? It's really hard to answer questions like these if the chain of custody of the asset is broken.
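To make that concrete, here's a minimal sketch of what the copy-based handoff looks like. The interfaces and names are made up for illustration, not any real vendor's API, but they show where the chain of custody breaks:

```typescript
// Hypothetical sketch of a copy-based DAM-to-WCM handoff (illustrative types only).
// The asset binary is duplicated into the WCM's own storage, so delivery metrics
// accrue against the WCM copy and never flow back to the DAM record.

interface DamAsset {
  id: string;
  filename: string;
  binary: Uint8Array;
}

interface WcmAssetStore {
  // Stores the bytes and returns a WCM-local URL.
  save(filename: string, binary: Uint8Array): Promise<string>;
}

async function copyAssetIntoWcm(asset: DamAsset, store: WcmAssetStore): Promise<string> {
  // The WCM now owns its own copy of the bytes...
  const wcmUrl = await store.save(asset.filename, asset.binary);

  // ...and the only link back to the DAM is, at best, a loosely kept ID.
  // Views and engagement are measured against wcmUrl, not asset.id, so the DAM
  // can't relate performance to cost, creator, agency, or other metadata.
  return wcmUrl;
}
```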

A better integration is one where the author in the WCM opens a panel and selects assets from the DAM that are referenced by URL in the WCM but published directly from the DAM to a content delivery network, rather than replicated in the WCM. This means big storage savings, and more importantly, the DAM gets pinged every time the asset is served to a viewer, so performance is much easier to track.
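Here's a rough sketch of the same handoff done by reference. The endpoint and field names are assumptions for illustration, not a specific product's API:

```typescript
// Hypothetical sketch of a reference-based integration: the WCM stores only a
// DAM-issued delivery URL, and the DAM (or its CDN edge) records each request
// against the original asset ID.

interface DamDeliveryReference {
  assetId: string;
  deliveryUrl: string; // e.g. a CDN URL minted and managed by the DAM
}

// What the WCM persists on the page: a reference, not a copy of the bytes.
function embedAssetReference(ref: DamDeliveryReference): string {
  return `<img src="${ref.deliveryUrl}" data-dam-asset-id="${ref.assetId}" alt="" />`;
}

// On the DAM side, every delivery can be logged against the asset, so
// performance data sits next to cost, creator, and the rest of the metadata.
async function recordDelivery(assetId: string, context: { referrer?: string }): Promise<void> {
  await fetch("https://dam.example.com/api/asset-events", { // illustrative endpoint
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      assetId,
      event: "served",
      referrer: context.referrer,
      at: new Date().toISOString(),
    }),
  });
}
```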

An even more sophisticated integration would place a dynamic asset on the page in the WCM – when the page is loaded by a viewer, the WCM would search the DAM for the most appropriate asset given the context of the viewer – age, location, known preferences, session history, etc. If that sounds a lot like the programmatic examples I shared earlier in this series, you're right!
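As a rough illustration, here's what that context-driven lookup might look like. The endpoint and query shape are purely hypothetical; the point is that the asset choice happens at request time, using whatever the WCM knows about the viewer:

```typescript
// Hypothetical sketch of a dynamic asset slot: at page load, the WCM asks the
// DAM's search API for the best-matching asset given the viewer's context.

interface ViewerContext {
  locale: string;
  ageBand?: string;
  interests?: string[];
  sessionPages?: string[];
}

interface RankedAsset {
  assetId: string;
  deliveryUrl: string;
  score: number;
}

async function resolveDynamicSlot(slotId: string, viewer: ViewerContext): Promise<RankedAsset> {
  const response = await fetch("https://dam.example.com/api/assets/search", { // illustrative endpoint
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ slot: slotId, context: viewer, limit: 1 }),
  });

  // The DAM returns assets ranked for this viewer; the WCM renders the top one.
  const [best] = (await response.json()) as RankedAsset[];
  return best;
}
```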


But the biggest thrust of innovation in DAM today is in DAM 3.0, which adds the part in the middle – the creation, transformation, and interpretation of content, and the insights drawn from it, throughout the content supply chain. This shift is necessary because the throughput of content is a major bottleneck to meeting consumer expectations of freshness and relevance, as I discussed in part 3. DAM 3.0 is all about having an end-to-end digital content supply chain that empowers people to do what people do well – making creative choices, establishing the goals of the system, and building relationships across the relevant players in the supply chain – while handing off as many of the manual, routine operations as possible to computer systems.

Next up, in part 7 of this series we'll dig into artificial intelligence and the role it can play in DAM, which goes waaaay beyond autotagging.

If you're enjoying this series, I'd love to chat with you about DAM trends. Please get in touch!