A new standard for metadata will make things better for music
Every digital music file includes important information encoded within its bits’n’bytes, known as metadata. This identifies many different attributes of the song: artist, song title, track number, album, composers, label, lyrics, publishers, and (if the label is smart) a unique identifier called the ISRC (International Standard Recording Code).
The importance of proper metadata cannot be overstated. If you can’t identify each individual track with absolute certainty, it’s impossible to ensure that the right people get paid.
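To make that concrete, here’s a minimal sketch in Python of the kind of metadata a track carries and why the ISRC works as the unambiguous key where text fields don’t. (The field names and example values are hypothetical, chosen purely for illustration.)

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class TrackMetadata:
    # Descriptive fields: useful for people, but messy for machine matching
    artist: str
    title: str
    album: str
    track_number: int
    label: str
    # The ISRC: a 12-character code that is globally unique per recording
    isrc: str

# The same recording, as two different services might describe it
a = TrackMetadata("The Beatles", "Let It Be", "Let It Be", 6,
                  "Apple", "GBXYZ7000123")          # made-up ISRC for illustration
b = TrackMetadata("Beatles, The", "Let It Be (Remastered)", "Let It Be", 6,
                  "Apple Records", "GBXYZ7000123")

# The text fields disagree, so naive string matching sees two different tracks...
print(a.artist == b.artist, a.title == b.title)  # False False
# ...but the ISRC identifies them as one recording, so royalties can route correctly.
print(a.isrc == b.isrc)                          # True
```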
The problem is that the metadata standards to date haven’t worked very well. The recorded music industry lacks universal buy-in for what’s been in use, so the burden has fallen to third parties like Gracenote and AllMusic. The trouble is that each of these third parties approaches things slightly differently. And because they charge for their services, not everyone wants to pay. The result is a massively fragmented metadata situation.
This will just not do. That’s where an organization called DDEX comes in.
Earlier this month, DDEX presented something called MEAD (Media Enrichment And Description), a new process for metadata that should make music easier to identify and find, and thus for the right people to get paid the correct amounts.
Getting everyone to agree to the same standards is a near-impossible task. But the thing that’s bringing everyone together is streaming music through smart speakers. If someone gives the command “Play the new Marilyn Manson,” there has to be a way for the magic behind smart speakers to find that exact song. That cannot be done without proper and correct metadata.
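To see why, here’s a hedged sketch of what resolving that voice command might look like under the hood (a toy catalog and made-up field names, not any real smart-speaker API). It only returns the right track if the catalog’s artist names and release dates are consistent:

```python
from datetime import date

# Toy catalog; in reality this is millions of label-supplied rows.
# Note the inconsistent artist strings: exactly the metadata problem at issue.
catalog = [
    {"artist": "Marilyn Manson", "title": "WE ARE CHAOS", "released": date(2020, 9, 11)},
    {"artist": "MARILYN MANSON", "title": "Heaven Upside Down", "released": date(2017, 10, 6)},
    {"artist": "Marilyn  Manson", "title": "The Pale Emperor", "released": date(2015, 1, 20)},
]

def play_newest(artist_query: str):
    # Normalize casing and whitespace to paper over dirty metadata;
    # with properly standardized data this step wouldn't be guesswork.
    def norm(s: str) -> str:
        return " ".join(s.lower().split())
    matches = [t for t in catalog if norm(t["artist"]) == norm(artist_query)]
    if not matches:
        return None  # bad metadata ends here: "Sorry, I couldn't find that."
    return max(matches, key=lambda t: t["released"])

print(play_newest("Marilyn Manson")["title"])  # WE ARE CHAOS
```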
The new MEAD proposal provides for carefully structured and standardized metadata for music. This is good news for everyone involved. It’ll take a couple of years for MEAD to be fully adopted, but in the digital world, we have to move towards good data hygiene.
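DDEX’s existing standards are delivered as XML, so enriched metadata under MEAD would travel as structured, machine-readable records. As a purely illustrative sketch (the element names below are invented for this example, not the actual MEAD schema), building such a record programmatically might look like this:

```python
import xml.etree.ElementTree as ET

# Hypothetical enrichment record: tags and structure are illustrative only,
# not the real MEAD XML schema.
track = ET.Element("Track", Isrc="GBXYZ7000123")  # made-up ISRC
ET.SubElement(track, "Title").text = "Let It Be"
ET.SubElement(track, "DisplayArtist").text = "The Beatles"
# The "enrichment" idea: descriptive fields beyond the basics,
# so search engines and voice assistants can actually find the track.
enrichment = ET.SubElement(track, "Enrichment")
ET.SubElement(enrichment, "Mood").text = "Reflective"
ET.SubElement(enrichment, "Tempo", Unit="bpm").text = "72"

print(ET.tostring(track, encoding="unicode"))
```

The point isn’t the exact tags; it’s that every service would read and write the same structure, instead of each third party inventing its own.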
Read more at Forbes.