This is the seventh post in my series on beliefs the DAM community has developed over time that need to be reassessed.
In this post I’ll focus on the potential for one critical technology that’s sweeping the software world – artificial intelligence – to assist DAM users throughout the campaign lifecycle.
“Autotagging is a gimmick, so AI doesn’t really have a role in DAM.”
It’s true that there’s been a lot of hype about using artificial intelligence services to auto-tag images, and also a lot of folks throwing cold water on the idea. The truth is somewhere in the middle: there are clear use cases where it can deliver a lot of value, like user-generated content arriving in volumes too large to be tagged by humans. But for many specialized vocabularies, the general-purpose public cloud services aren’t there yet, even though they’re getting better by the day.
But whatever your feelings about autotagging, machine learning and analytics can do so much more for your DAM efforts. This post offers a preview of eight other ways this technology can accelerate and simplify the asset lifecycle; you can expect to see these ideas emerge in products in 2018. (And I won’t use the word autotagging again in this post – I promise!)
For the sake of this post, let’s follow a simple eight-step campaign development process:
1. Project creation: An intelligent system should take your historical performance on the steps of a creative process into account when suggesting timeframes for deliverables. High-performing people are usually also overconfident in their ability to hit deadlines – in fact, calibration research suggests your odds of hitting a deadline you’re 90% confident about are no better than even. Oops! Computers don’t suffer from this cognitive bias, so they can help anchor our expectations to reality.
2. Photoshoot: If you’re working with an agency or freelancer that’s uploading content to your system, why should you have to go to the trouble of getting IT to grant them credentials and set up a drop folder for them? Your system should be smart enough to connect their contact information to the relevant projects, so they can upload to a generic hot folder and the system can auto-route that content to the right place.
3. Search and shot selection: As a marketing manager reviewing uploaded content, you’ll find some content that really inspires you. The DAM can suggest similar content from the repository that you should also consider – keeping usage rights and permissions in mind, of course.
4. Ad creative: As the creative production process gets underway, the DAM can help the creative manager assign retouching, layout, or other tasks with historical insight in mind – if you’re on a short deadline, for example, it can push content to the folks on the team with a history of working most quickly.
5. Creative review & approval: Now it’s time to review content. For complex content, like a video with a lot of brand placement, multiple cast members, voice-over talent, music sources, and stock footage, the usage rights review can be very laborious. For other content, like a stock photo, it might be very simple. A smart DAM can derive usage rights from uploaded contracts – read by natural language processing algorithms and mapped against the intended territory, media type, and dates of the campaign you’re working on – or pull them from an integrated rights management system, so it knows what rights you have. But this isn’t necessarily perfect, especially when the rights are complex. This is where AI comes in: by assessing the level of complexity and risk, an AI-enabled DAM can route highly complex or risky content to human review while clearing simpler content against usage rights automatically, saving significant time and manual effort.
6. Marketing review & approval: Now our content is reviewed, approved, and live. Our celebrity spokesperson puts his foot in his mouth, precipitating a brand crisis. We have to scramble to figure out where all the content he appears in has been placed so we can quickly take it down or swap it out. A DAM with strong compound asset reporting should be able to identify where we’ve published assets that include this person, and an integrated AI service like Google Vision can further help by using an image search to find all matching images on the public web. (Google Vision is primarily known for the AI task I promised not to mention again…did you know it will also crawl the web?)
7. Brand portal: We’ve also published content from the campaign to our brand portal for our retail partner’s use. When she searches the portal, we can suggest additional appropriate content she may not have thought to look for.
8. Archiving: Finally, the campaign is over and we want to archive it. It’s expensive and unnecessary to keep all our campaigns in low-latency, ultra-high-availability storage. Today, you might set a generic tiering policy: keep final production assets in medium-latency storage, and move the work-in-progress archive – content that’s useful for reusing a special effect, a music track, or a cut scene – to lower-availability, higher-latency, much less expensive storage. But maintaining such a policy, and getting it right, isn’t easy. A smart system can look at usage patterns and automatically identify the content that’s likely to be requested again. For example, when you kick off a spring photoshoot campaign, the system might elevate previous spring shoots from the lower storage tier to the higher one, making them more easily available for inspiration and reuse.
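The deadline calibration idea in step 1 can be sketched very simply: instead of trusting a self-reported estimate, scale it by how much similar past tasks actually overran their estimates. This is a minimal illustration, not any particular product’s algorithm; the task history and the 90% target are hypothetical.

```python
def calibrated_estimate(history, new_estimate_days, percentile=90):
    """Adjust a self-reported estimate using historical overrun ratios.

    history: list of (estimated_days, actual_days) pairs from past,
    similar tasks. Returns the new estimate scaled so that roughly
    `percentile` percent of past tasks would have finished within it.
    """
    ratios = sorted(actual / est for est, actual in history)
    # Pick the overrun ratio at the requested percentile.
    idx = min(len(ratios) - 1, int(len(ratios) * percentile / 100))
    return new_estimate_days * ratios[idx]

# Hypothetical history: retouching tasks routinely ran ~1.5x over estimate.
history = [(5, 7), (3, 5), (4, 6), (2, 3), (6, 9)]
print(calibrated_estimate(history, new_estimate_days=4))
```

Here a 4-day estimate is stretched to roughly 6.7 days – the system anchors the schedule to what actually happened last time, not to our 90%-confident optimism.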
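The rights-review triage in step 5 comes down to a complexity score: count the rights-bearing elements in an asset and route anything above a threshold to a human. The field names, weights, and threshold below are all hypothetical – a sketch of the routing logic, not a real rights engine.

```python
def route_for_rights_review(asset, threshold=3):
    """Route an asset to human or automatic rights review based on a
    simple complexity score (hypothetical fields and weights)."""
    score = (
        len(asset.get("cast", []))            # each cast member adds risk
        + len(asset.get("music_sources", []))  # licensed music adds risk
        + len(asset.get("stock_footage", []))  # stock clips add risk
        + (2 if asset.get("voice_over") else 0)  # VO talent weighs heavier
    )
    return "human_review" if score >= threshold else "auto_clear"

# A brand video with cast, music, and voice-over is complex...
video = {"cast": ["lead", "extra"], "music_sources": ["track_a"], "voice_over": True}
# ...while a single stock photo is simple.
photo = {"stock_footage": []}
print(route_for_rights_review(video))  # routed to a person
print(route_for_rights_review(photo))  # cleared automatically
```

In a real system the score would come from the NLP-extracted contract terms themselves, but the routing decision has the same shape.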
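The tiering behavior in step 8 can be sketched as a policy function: finals stay warm, work-in-progress goes cold, but anything tagged like an active campaign gets elevated for likely reuse. Tier names, asset fields, and rules here are invented for illustration.

```python
def tier_for(asset, active_campaign_tags):
    """Pick a storage tier (hypothetical rules): final assets stay warm,
    work-in-progress goes cold -- unless the asset matches an active
    campaign's tags, in which case it is elevated for likely reuse."""
    if active_campaign_tags & set(asset["tags"]):
        return "hot"   # likely to be requested again soon
    if asset["type"] == "final":
        return "warm"  # medium latency, medium cost
    return "cold"      # WIP archive: cheap, high latency

# Kicking off a spring photoshoot elevates last year's spring shoot,
# even though it's work-in-progress material.
spring_2017 = {"tags": ["spring", "photoshoot"], "type": "wip"}
q4_banner = {"tags": ["holiday"], "type": "final"}
print(tier_for(spring_2017, {"spring"}))  # elevated from cold
print(tier_for(q4_banner, {"spring"}))    # stays warm
```

The “smart” part in practice is learning which tags and usage patterns predict reuse, rather than hard-coding the rule as this sketch does.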
So those are eight examples of AI and advanced analytics helping DAM users and administrators accelerate every step of the content lifecycle!
In my next post – the last in the series – I’ll tackle the false belief that, with legacy enterprise DAMs increasingly outmoded, the only escape is to simpler, department-level solutions.
- In part 6, I wrote about the changing role of DAM, from an archival repository to a living, breathing, critical business application.