In some cases, organizations have unique file naming conventions, but file names are usually created by people, which more often than not yields not-so-unique file names. One person names a file one way and another person names the exact same file another way because they use it differently or in a different place. This demonstrates a lack of consistency and governance, and it happens far too often. This is especially true if you are not using a Digital Asset Management (DAM) solution with clear guidelines and stopgaps built into the workflow to catch this sort of thing.
We end up with collections of assets which may have:
Many similar file names (are these true redundancies or simply the result of a poor file naming convention?)
Some assets with the same file names (whether or not the content is the same throughout a folder structure)
Every asset with a different file name, even though some may be exactly the same, content-wise
Assets simply copied multiple times across folder structures (which makes sorting by file name out of the question)
So here is the dilemma. What do you do with exact duplicates?
Throw away all your IP and start over? (Not wise)
Painstakingly open and look through each and every asset, comparing and contrasting them one at a time? (Some people do this, as painful, time-consuming and expensive as it sounds, since it may even take a subject matter expert to review some assets)
Have a computer “just do something about it”?
Use a file browser to make the review process go faster? (Getting warmer, but we can actually do better than visually checking each asset, with or without metadata)
And we have not even begun to discuss different versions of the same assets and different file types (different file extensions).
What would you do?
Some DAM solutions will look for matching file names and catch those during the upload process to the DAM (based on matching file names, regardless of whether the actual content is duplicated or not, as described earlier).
There is an even better way…
Enter the world of algorithms. Yes, an algorithm is complex code, but do not worry: that complexity can be nicely packaged into easy-to-use, very powerful tools for data deduplication (also referred to as ‘deduping’ or ‘dedupe’ for short). The algorithm does a bit-by-bit pass over each asset, regardless of file name, and creates a checksum. A checksum is a string of letters and numbers (alphanumeric) which acts like a fingerprint, unique to that asset. An exact duplicate (truly identical, beyond any visual comparison) will produce the same checksum. If two assets have the same checksum, they are exact duplicates.
How does it work with assets?
Add a period in a text file
Move a line in a graphic
Clip an audio file
Color correct a photograph
Making any of these changes alters the bits which make up the asset, and that will yield a different checksum. If two assets have exactly the same arrangement of bits, they will have the same checksum.
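To see this in action, here is a minimal sketch using Python's standard-library hashlib (the strings are illustrative stand-ins for assets, not real files): adding a single period yields a completely different checksum, while a byte-for-byte copy yields an identical one.

```python
import hashlib

# Two byte strings differing only by one added period,
# like the "add a period in a text file" example above.
original = b"The quick brown fox jumps over the lazy dog"
edited = b"The quick brown fox jumps over the lazy dog."

print(hashlib.md5(original).hexdigest())  # 9e107d9d372bb6826bd81d3542a419d6
print(hashlib.md5(edited).hexdigest())    # e4d909c290d0fb1ca068ffaddf22cbd0

# A byte-for-byte copy always produces the identical checksum.
copy = bytes(original)
assert hashlib.md5(copy).hexdigest() == hashlib.md5(original).hexdigest()
```

One changed byte is enough to scramble the entire fingerprint, which is exactly what makes checksums useful for spotting true duplicates.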
So how accurate is it?
One algorithm called MD5 produces 128-bit checksums, so there are 2 to the 128th power possible values (roughly 3.4 times 10 to the 38th power, which is well beyond 1 octillion, or 10 to the 27th power: 1,000,000,000,000,000,000,000,000,000). That should be accurate enough for quite a while, don’t you think? Read on. This gets better.
Regardless of the operating system (OS) or the computer (PC/Mac/whatever) you use to run this algorithm, MD5 can catch exact duplicate assets, and when it does, it will produce the exact same checksum or fingerprint.
I briefly mentioned MD5 during a metadata webinar and people got really excited.
Where did MD5 come from?
MD5 was originally designed in 1991 by MIT’s Ron Rivest as a cryptographic hash function producing 128-bit hash values. While the security community has since moved to other algorithms due to known weaknesses, MD5 was long a standard among software vendors for verifying that a download arrived exactly as intended (not tampered with). MD5 has since been replaced by SHA-256 as a U.S. national standard. Shying away from 512-bit algorithms (such as SHA-512), which are even more taxing on a system, MD5 is still one of the commonly used data deduplication methods. Note that MD5 is not recommended for SSL, password security or any security purpose today. We are talking about using MD5 just for data deduplication here, not security.
How common is this tool found?
An MD5 command is built into Unix-like machines (md5 in Apple’s Terminal application on a Mac, md5sum on Linux). There are plenty of PC programs which use MD5 (or SHA-256), available online for free or a nominal fee. Some DAM systems ship with MD5. Others ship with a less powerful algorithm called CRC32, which is a 32-bit hash.
What does an MD5 checksum look like?
5d41402abc4b2a76b9719d911017c592
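As it happens, that sample checksum is simply the MD5 of the five-letter string “hello”, which anyone can verify in a couple of lines of Python:

```python
import hashlib

# The sample checksum above is the MD5 fingerprint of "hello".
print(hashlib.md5(b"hello").hexdigest())  # 5d41402abc4b2a76b9719d911017c592
```

This also illustrates the point made later: MD5 works on any string of bytes, whether or not it lives in a file.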
To technology folks, this is exciting stuff with major potential. For the rest of us, there is no need to run away; just understand that a DAM should be able to create, read and compare these values. A DAM should also be able to report on them along with the rest of the metadata for every asset.
What are the benefits of MD5?
We can run MD5 on a collection of assets (in a DAM or not) and compare the checksums. If any checksums match, you have just found duplicate assets. Several MD5 tools do this comparison for you. Handle duplicates however your organization deems fit, in a systematic manner. Just be aware of where the assets were intended to be used, particularly if the file names do not match.
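As a sketch of how such a comparison might work outside of any particular DAM, the following Python walks a folder (the "assets" path is just a placeholder), checksums every file in chunks so large assets do not exhaust memory, and groups files that share an MD5:

```python
import hashlib
from collections import defaultdict
from pathlib import Path

def md5_of(path, chunk_size=1 << 20):
    """Checksum a file in 1 MB chunks so large assets stay memory-friendly."""
    digest = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

def find_duplicates(root):
    """Group every file under `root` by MD5; groups of 2+ are exact duplicates."""
    by_checksum = defaultdict(list)
    for path in Path(root).rglob("*"):
        if path.is_file():
            by_checksum[md5_of(path)].append(path)
    return {h: paths for h, paths in by_checksum.items() if len(paths) > 1}

# "assets" is a hypothetical folder name; point this at any directory.
for checksum, paths in find_duplicates("assets").items():
    print(checksum, "->", [str(p) for p in paths])
```

Note that this only reports the duplicate groups; deciding which copy to keep is still a policy call for your organization.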
We could also search for assets using the checksums (even as metadata on a per asset basis in the DAM if you assign a field to it) to reduce duplicates.
We could ask a DAM vendor to compare all checksums in the DAM (one per asset) against any uploads.
It is very common to have many duplicate assets within an organization. Some organizations have run MD5 on their assets and reduced duplicate assets by over 80%.
MD5 can even work on a string of text (outside of a file) to verify if it is the same as another string of text.
This can reduce storage on servers of any duplicates. Why would we want to store exact duplicates repeatedly?
MD5 runs on any operating system and any computer which can handle the checksum function.
What are the risks behind MD5?
If embedded metadata (metadata stored inside the asset itself) is edited differently between two otherwise duplicate assets, they will produce different checksums (duplicate not found).
Layered masks not visible to the naked eye may throw MD5 off if one asset has a layered mask and another asset with duplicate content does not (duplicate not found).
Collisions may happen. MD5 is no longer recommended for any security needs. SHA-256 trumps MD5.
The MD5 tool may tax your system performance while creating and comparing checksums for a collection of assets. SHA-256 is even more taxing on a system. SHA-1 may be less taxing, but can also have collisions (not good for security).
This will not necessarily identify nor eliminate every duplicate, but MD5 can help address most of them.
People may continue creating and acquiring duplicate assets, but deduplication in a DAM system will act as a stopgap against additional duplicates being introduced to the DAM.
How to use MD5 on assets?
You could…
Run MD5 on all assets already in the DAM (dedupe existing DAM assets)
Run MD5 on all assets to be uploaded on an ongoing basis and compare those checksums to the checksums of assets already existing in the DAM (dedupe all asset uploads against existing DAM assets)**
**Note this may, depending on the DAM system, require either:
A configuration of the DAM (varies among DAM systems) if it already exists as a feature
A customization to the DAM for this process to be automated upon upload (if it is not an available feature to the DAM system)
A manual effort prior to upload to the DAM, or even outside of the DAM (which may catch fewer duplicates if neither the customization nor the configuration is available).
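To make the upload-time stopgap concrete, a pre-upload check might look like the sketch below. The checksum set and the should_upload helper are hypothetical names standing in for whatever report or API your particular DAM exposes:

```python
import hashlib

# Hypothetical: MD5 checksums exported from the DAM's metadata report,
# one per asset already in the system.
existing_checksums = {
    "9e107d9d372bb6826bd81d3542a419d6",
    "5d41402abc4b2a76b9719d911017c592",
}

def should_upload(file_bytes, existing):
    """Checksum the candidate file; block the upload if the DAM already has it."""
    checksum = hashlib.md5(file_bytes).hexdigest()
    return checksum not in existing, checksum

ok, checksum = should_upload(b"brand new asset", existing_checksums)
print("upload" if ok else "skip duplicate", checksum)
```

Whether this runs as a DAM configuration, a customization, or a manual pre-upload script depends on the DAM system, as noted above.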
Where can we find more information about data deduplication?
Google it.
Ask DAM vendor(s) whether they have data deduplication methods in their DAM system. Many people (including vendors) may not be aware of the need for data deduplication. If your DAM vendor does not have it, ask for data deduplication to be part of their roadmap of upcoming improvements, with accompanying documentation. The more people ask, the more likely the vendor will add it to their roadmap.
Let us know when you are ready for assistance in deduplicating your digital assets for your business or consulting for your Digital Asset Management needs.
Some people actually ask themselves this question and wonder why they can’t go back to their old ways of doing business.
Sure, you can. You can also do the following…
“We can find all my final assets really easily because I have them all right here on my desktop.”
We have:
Final1
Final02
FinalFinal
FinalFinalFinal
ReallyFinal
LastFinal
Extrafinal
Superfinal
SuperduperFinal
ExtraLargeFinal
Final_with_cheese
AlmostFinished_really_Iswear
OK, I might have a little problem with version control and file naming conventions.
Yes, a DAM can have version control to take care of this little problem a few of us might have.
“We store all our files on our own desktops.”
A desktop is just another silo. Who else can see your assets on your desktop? Can you find all your assets on your own desktop? What happens to these assets when you lose your laptop or get a new computer?
“We can keep all our assets on shared drives.” Yeah, those are so searchable, right? As long as your perfectly crafted file names say everything about every asset you’ll ever need to know. Oh, wait. Shared drives are not truly searchable to the asset level beyond a so-called unique filename.
“We have unique file names for every asset.” File names are created by humans and meant for humans outside of a DAM. Many DAM systems do not care what your file names are, as long as they are not 250 characters long and filled with spaces and special characters. Scary sounding, huh? Some organizations had some of these “unique file names“ before getting a DAM. You know who you are. Some DAM solutions assign a unique identifier to each and every asset uploaded/imported to the DAM, which turns the file name into just another piece of metadata.
“We can keep assets on CDs or external drives so we can share them easily.” You must like burning money if you are still using CDs or DVDs today. Where is the latest version? Which CD is that on again? Or do you need to burn another set of CDs for the latest version of assets? External drives (regardless of how big or small) can get lost, dropped or corrupted very easily. External drives have the same version control issue as CDs, even if backed up regularly. How often is a new version created by someone else? How do you ship these to external clients? That’s free, right?
“I will just email the asset around to everyone.” Are you planning to fill up every one of those people’s inboxes with high volumes of data? And each one will back up that data, multiplied by how many people? What are the file size limits for your email attachments? 5 MB? 10 MB? Some email accounts do not even accept attachments, for fear of viruses. Will you re-send the asset to each person every time they need to see it again? Wow, that is a lot of email data repeated over and over again, isn’t it? With a DAM, you could simply send a link to the asset (not email the whole asset) to whoever needs it, whether they just need to preview it or download it, based on permissions set by the sender. Let us weigh the options again: email attachments over and over again vs. a link to an asset in the DAM, which can be updated as needed.
“We’ll just FTP the assets to the person who needs them.” That is secure, right? No one else can see the FTP server or add to it either, right? And where is the version control on an FTP server? Oops.
What have we learned so far?
Use a DAM for assets
Associate metadata with each asset in a DAM so you can search and find it again
Version control with a DAM
Distribute assets with a DAM
Prosper and save some money with a DAM.
Let me know if you have any other brilliant ideas on why you should not store assets in a DAM. I would love to share them with readers.