Media Factory or Customised Curation? Content's future at IBC2018

Themed around ‘the search for growth’, IBC2018 will showcase AI-assisted workflows, the status of IP infrastructure and the future of content.

AI-assisted workflows at IBC2018

Entering IBC2018, it’s clear that broadcasters are emerging from a period of initial, high-level exploratory discussions about IP to one in which detailed requirements and challenges are being actively investigated. The core protocols for IP video workflows are certainly there in SMPTE ST 2110, but it’s the next level of detail, largely based on feedback from early adopters, that will further inform the standards bodies and help vendors continue to advance IP workflows.

Broadcasters want a managed, seamless migration to IP with little to no risk. They don’t want to schedule a hard cut-over date and hope for the best. They want time to test their infrastructure, running legacy systems in parallel where possible, so that risk is minimised.

That’s why, worldwide, rollout of uncompressed IP is still very much in its infancy. There have been some SMPTE ST 2022-6-based installations replacing SDI, and even a few based on the newer SMPTE ST 2110. Unless they are building greenfield sites, broadcasters will seek hybrid solutions to transition operations to IP over the next few years.

Expect talk among CTOs of the merits of cloud-based microservices. This concept breaks playout down into its constituent parts, enabling broadcasters to pick and choose variables like data centres, bit rates and templates. Such an approach should make launching a channel as easy as opening an app on a mobile phone.
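The pick-and-choose idea can be sketched in a few lines. This is a hypothetical illustration only: the `ChannelSpec` fields, the `launch_channel` function and all values are invented for the example, not taken from any real playout product.

```python
from dataclasses import dataclass

# Hypothetical illustration: a channel assembled from independently
# configurable parts, the way a microservices playout stack might
# expose them. Names and values are invented for this sketch.
@dataclass
class ChannelSpec:
    name: str
    data_centre: str   # region where the playout service would run
    bitrate_kbps: int  # output encode bitrate
    template: str      # branding/graphics template

def launch_channel(spec: ChannelSpec) -> dict:
    """Pretend deployment: in a real system each field would select a
    microservice configuration; here we just echo the choices back."""
    return {
        "channel": spec.name,
        "playout": f"deployed in {spec.data_centre}",
        "encode": f"{spec.bitrate_kbps} kbps",
        "graphics": spec.template,
    }

pop_up = ChannelSpec("Sports Pop-Up", "eu-west", 8000, "sport-lower-thirds")
print(launch_channel(pop_up))
```

The point of the sketch is the shape of the interface: a channel launch reduces to choosing a handful of parameters, rather than commissioning a bespoke playout chain.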

One buzzword that will be everywhere on the show floor is Artificial Intelligence (or Augmented Intelligence, or even Intelligence Amplification, depending on how far you believe an AI can ‘think’). The technology is beginning to prove its mettle with practical examples of production and cost efficiency. Fast-turnaround, high-volume content like sports and news is a logical target for AI-facilitated products.

Linked to this is the increasingly viable concept of just-in-time content assembly. Here, small segments are created from the live event and used quickly. Catch-up and on-demand follow the linear programme with an ever-shorter delay. Sports is the prime example: a tennis match can be available on demand within minutes of its end. At Wimbledon this year, an AI workflow automated virtually the entire process of creating and publishing two-minute highlight reels online.
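The assembly step itself is conceptually simple. A minimal sketch, assuming segments have already been cut and tagged during the live event (the segment data and the 120-second cap are invented for the example, not details of the Wimbledon workflow):

```python
# Hypothetical sketch of just-in-time highlights assembly: segments
# cut during a live event are strung together into a short reel as
# soon as the event ends. All timings here are invented.
segments = [
    {"label": "break point", "start": 312.0, "end": 327.5},
    {"label": "set point",   "start": 2480.0, "end": 2492.0},
    {"label": "match point", "start": 4101.0, "end": 4118.0},
]

def build_reel(segments, max_duration=120.0):
    """Take segments in order until the reel reaches max_duration seconds."""
    reel, total = [], 0.0
    for seg in segments:
        length = seg["end"] - seg["start"]
        if total + length > max_duration:
            break
        reel.append(seg["label"])
        total += length
    return reel, total

reel, duration = build_reel(segments)
print(reel, round(duration, 1))
```

The hard part in practice is upstream of this loop: detecting and tagging the segments in the first place, which is where the AI does its work.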

By breaking down a piece of media into separate objects, attaching meaning to them, and describing how they can be rearranged, a programme can change to reflect the context of an individual viewer.

The BBC, for one, thinks this approach (which it calls object-based broadcasting) has the potential to transform the way content is created and consumed: bringing efficiencies and creative flexibility to production teams, and enabling them to deliver to every member of the audience a personalised feed that understands individual viewing habits.

All these trends hook into a new stage of media industrialisation or commoditisation, in which media is broken down to be reassembled as in a factory. It is up to the industry whether this leads to bland, cookie-cutter content or to fresh and challenging forms of curated editorial creation.

Posted by Adrian Pennington, technology journalist. August 28, 2018.