Media Asset Management (MAM) was a much talked about subject at NAB, and to a lesser extent so was analytics, with Adobe in particular showing off some neat visualisations of content performance based on their “Pass” technology. What stood out for me, however, was the conspicuous lack of any vision for how the broadcaster of the future will merge these different data sets to create value. For posterity, here are my thoughts on the subject.
The need
Broadcasters need to be efficient. This is, sadly, a fact. No one in the industry can afford to be as (relatively) profligate as a dotcom or a big FMCG.
To enable increased operational efficiency, broadcasters also need to start understanding a lot more about their customers. More than just the basic CRM data that a pay provider collects or the (basically bunk) data from entities like BARB, which collect small samples of user-reported data and extrapolate to wide audiences.
They need this information to provide better reporting to ad-buyers, particularly in the multi-platform world they participate in. They also need it to make better programming decisions. Like it or not, programme commissioning is inherently risky and uncertain. I strongly believe that some of that risk can be managed with a statistical approach, and that although there is art in the relationships held by commissioners, their talent-spotting eyes are not the exclusive path to success.
Today’s issue
This shouldn’t be seen as an attack on the broadcast industry, but it can be a little bit siloed at times! The MAM solutions that have been, or are about to be, deployed at broadcasters are pretty good at orchestrating content production and distribution workflows, but they are a little proprietary in the way they operate. Similarly, audience analytics is a profession in its own right, and its information tends to be maintained in similarly proprietary systems. ERP and finance systems that tell a user how efficiently physical and human resources are being deployed represent a third silo of data.
Someone trying to work out whether a particular production technique leads to a more efficient return on investment on a particular channel to market, by addressing a profitable customer group, would need to somehow model data from multiple separate systems. Even then, there’d likely be rather a lot of assumptions needed to bolt everything together.
A lesson from the telecoms industry
It’s simple on paper and probably quite tricky in practice, but it feels as though what’s needed is an enterprise data mart that sits over the top of the different data sets to create a single view of the customer. Those of you who, like me, have a mobile telecoms background will recognise this as broadly similar to the data marts that helped telecoms companies make more informed marketing ROI decisions in the latter part of the last decade.
A lot of the information required to make this happen is already available operationally. For example, metadata schemas contain detailed information on the talent in a show, the amount of time they’re on screen, and the programme’s technical and editorial make-up.
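To make the data mart idea concrete, here is a minimal sketch of what a single joined view across the three silos might look like. Every table, column name and figure here is invented for illustration; a real implementation would sit on extracts from the actual MAM, audience analytics and finance systems.

```python
import pandas as pd

# Hypothetical extracts from the three silos described above.
mam = pd.DataFrame({
    "programme_id": ["P1", "P2"],
    "genre": ["drama", "factual"],
    "on_screen_talent_minutes": [42, 18],
})
audience = pd.DataFrame({
    "programme_id": ["P1", "P2"],
    "platform": ["linear", "vod"],
    "avg_audience": [1_200_000, 350_000],
})
finance = pd.DataFrame({
    "programme_id": ["P1", "P2"],
    "production_cost": [500_000, 120_000],
})

# The "data mart" view: one row per programme, all three silos joined.
mart = mam.merge(audience, on="programme_id").merge(finance, on="programme_id")

# A question that is awkward across silos becomes a one-liner:
mart["cost_per_viewer"] = mart["production_cost"] / mart["avg_audience"]
print(mart[["programme_id", "genre", "cost_per_viewer"]])
```

The point is not the join itself but that, once the silos share a common key (here a programme identifier), questions spanning production, audience and cost stop requiring bespoke modelling exercises.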
I can see one hurdle. On the production side, this type of system would require broadcasters who also create content to start taking a much more consistent, numbers-based approach to operational planning, particularly to link content to its production cost. In my experience, although robust planning processes are followed, they tend to focus on the eventual costs and returns of things rather than the units of effort that make them up. For example, a camera team can either cost $50,000 for a production, or it can cost ([cost of camera] + [cost of labour]) * [number of days]. Simplistic, but powerful if you understand your whole cost base at that level and can compare it to the success of the output...
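The unit-of-effort formula above can be sketched in a few lines. The day rates below are invented purely so the arithmetic lands on the same $50,000 headline figure; the point is that the decomposed form is comparable across productions while the lump sum is not.

```python
def resource_cost(equipment_day_rate: float, labour_day_rate: float, days: int) -> float:
    """Cost of one resource line: (cost of equipment + cost of labour) * number of days."""
    return (equipment_day_rate + labour_day_rate) * days

# A camera team booked for 20 days at illustrative day rates...
camera_team = resource_cost(equipment_day_rate=1_000, labour_day_rate=1_500, days=20)
print(camera_team)  # the same $50,000 as the lump sum, but now decomposable
```

With costs held at this granularity, comparing the cost base of one production against the measured success of its output becomes a query rather than a finance exercise.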
...which is why I wonder why no one is yet offering a technology that would make this kind of facility possible for a broadcaster. The telecoms companies I worked for developed their own and then plugged tools like Business Objects, SAS and Unica into them to enable analytics and campaign management to take place. I doubt there’s appetite in the TV industry to do the same. So perhaps there’s an opportunity here for someone.