This year at IBC, TinkerList, alongside Singular.live, Techex, Grass Valley, and ZIXI, took part in the Accelerator Media Innovation Programme, which helps the media industry find solutions to complex challenges. Rather than simply discussing the issues, the Programme focuses on hands-on experimentation, i.e. learning through tangible application. One of the standout projects presented at IBC2023 was Gallery Agnostic Live Media Production, which brought together thought leaders, innovative vendors, and cloud-based technologies that are redefining how live shows are produced and delivered. The project explored how gallery-agnostic live broadcasting can be implemented.
What is Gallery Agnostic Live Broadcasting?
Challenges in Media Production
Progress in the media industry is all about a forward-thinking approach to new challenges. The Agnostic Live Media Production Accelerator Project stands at the forefront of revolutionary media production. The primary goal is to streamline the complex process of preparing and delivering live shows. Oftentimes, “the tech stack follows the workflow, not the other way around”, says Aaron Nuytemans, our Head of Business Development.
Those tech stacks are multi-vendor, multi-format, and a mix of on-premise and cloud. By contrast, one common interface – in this case, the cloud-based rundown – would allow producers to quickly adapt to venue changes, different budgets, various OB trucks, different types of shows, and so on. In other words, a common workflow puts more power in the hands of producers, making production smoother and more flexible and resulting in greater creative freedom. And this is exactly what the project set out to change.
The Goal of the Project
Since production companies nowadays run ever-growing tech stacks, a big question arises: how can we control all those devices and software from one common interface?
The project aimed to explore whether a newer, more innovative approach could be found to the traditional problems of the production environment. How can we create commonality among workflows, and, more importantly, what should sit at the center of that workflow? How can we let technical and editorial teams switch easily between on-prem and cloud without changing their workflows? And how can we guarantee a smooth production process in an emergency, such as a disaster recovery scenario?
Aaron says the goal was to create “an interface, fully cloud-based, where journalists can prepare their show and drop in all the assets they want to see appear on the screen. All they know is — if I drop a video here, it will be sent to the correct device and appear during the live show. This means we made making shows fully device agnostic and automated”.
Solutions and Results
This project demonstrated that it is indeed possible to implement a unified workflow that connects to and controls any device in any gallery via API. This approach shifts the control layer from multi-vendor, multi-format tech estates to a common, adaptable workflow, i.e. the rundown.
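To make the idea of a rundown as the control layer concrete, here is a minimal sketch of that pattern in Python. Every class and method name below is hypothetical and purely illustrative; the article does not describe the project's actual interfaces.

```python
from abc import ABC, abstractmethod

# Hypothetical sketch: the rundown talks to one common interface,
# and per-vendor adapters hide the device-specific APIs behind it.

class GalleryDevice(ABC):
    """Common control surface that any gallery device adapter implements."""

    @abstractmethod
    def play(self, asset_id: str) -> None:
        ...

class OnPremPlayoutServer(GalleryDevice):
    """Imaginary adapter for an on-premise playout server."""

    def play(self, asset_id: str) -> None:
        print(f"on-prem server: rolling {asset_id}")

class CloudPlayoutService(GalleryDevice):
    """Imaginary adapter for a cloud playout service."""

    def play(self, asset_id: str) -> None:
        print(f"cloud service: rolling {asset_id}")

class Rundown:
    """The rundown only knows the common interface, never the vendor."""

    def __init__(self, device: GalleryDevice) -> None:
        self.device = device
        self.items: list[str] = []

    def add_item(self, asset_id: str) -> None:
        self.items.append(asset_id)

    def run(self) -> None:
        for asset_id in self.items:
            self.device.play(asset_id)

# The same rundown drives either backend without any workflow change:
rundown = Rundown(CloudPlayoutService())
rundown.add_item("opening-titles")
rundown.run()
```

Swapping `CloudPlayoutService()` for `OnPremPlayoutServer()` is the whole migration from the producer's point of view, which is the essence of being gallery agnostic.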
Gallery and Device Agnostic Broadcast Production
The feasibility of such gallery- and device-agnostic production was proven by building three different automated, controlled setups: a fully on-premise studio, a fully cloud-based studio, and a remote production run from a single laptop. The experiment successfully demonstrated that a variety of tech stacks in different galleries can be controlled while maintaining a consistent workflow.
Agnostic media production is no longer a thing of the future but, rather, a very tangible reality.
Beyond IBC2023: Shaping Tomorrow’s Media Landscape
The results of the project highlight a significant shift in the role of producers. No longer constrained by the tech stack, they can rely on a consistent workflow regardless of the backend technologies. This shift allows producers to work more efficiently and intelligently, focusing on creativity rather than technicalities.
Standardized API Framework for gallery-agnostic live broadcasting
The project underscored the need for a standardized API framework to guarantee the interoperability and connectivity of devices. Such an open, structured framework can reduce the need for heavy integration or transcoding, making media production more efficient.
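One way to picture what such a standardized framework buys you is a common command envelope that every device understands, with a thin translation step per vendor instead of a heavy integration. The field names and vendor payloads below are assumptions for illustration only, not part of any published standard.

```python
# Hypothetical sketch of a standardized command envelope.
# STANDARD_FIELDS and both payload shapes are invented for this example.

STANDARD_FIELDS = {"device_type", "action", "asset_id"}

def validate(command: dict) -> dict:
    """Reject commands that don't match the common schema."""
    missing = STANDARD_FIELDS - command.keys()
    if missing:
        raise ValueError(f"missing fields: {sorted(missing)}")
    return command

def to_vendor_payload(command: dict) -> dict:
    """Translate one normalized command into a vendor-specific payload.

    In a real framework each vendor would register its own translator;
    two imaginary device APIs illustrate the idea here.
    """
    command = validate(command)
    if command["device_type"] == "graphics":
        # Imaginary graphics-device API
        return {"op": "TAKE", "template": command["asset_id"]}
    # Imaginary playout-server API
    return {"cmd": command["action"], "clip": command["asset_id"]}

payload = to_vendor_payload(
    {"device_type": "playout", "action": "play", "asset_id": "vt-042"}
)
```

The rundown only ever emits the normalized form; the translators absorb vendor differences, which is what makes heavy point-to-point integration unnecessary.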
However, the change does not stop here. As part of this Accelerator Project, an open call was made to broadcasters and vendors at IBC to continue exploring this initiative into IBC2024 and foster quicker adoption of Gallery and Device Agnostic Live Media Production.
You can watch the full presentation from IBC2023 here.