TL;DR
- There’s no turning back the clock on remote distributed workflows. The pressures to cut costs point in only one direction: the cloud. Can live production workflows be developed so that broadcasters access all the benefits of cloud with no drawbacks in quality?
- SMPTE and Portland-based Port 9 Labs think so, and aim to make the industry aware that such an approach is possible.
- Applying broadcast standard frame-accurate timing in the cloud with minimal latency remains a challenge but not an insoluble one.
READ MORE: Next Generation Live Production in the Cloud (SMPTE)
SMPTE’s master plan for moving broadcast production into the age of IP didn’t anticipate COVID (how could it?) and the cracks are beginning to show.
The pandemic has so accelerated the use of remote distributed video over the internet and of cloud-based technology that grounding live broadcast studio workflows in specifications that mirror the gold standards of SDI may outlive its usefulness even before the industry has fully transitioned.
It was long thought that producing live programming in the cloud would be compromised at best, if not downright impossible, yet that is exactly what some corners of the industry, including SMPTE itself, are now contemplating.
One of the innovators pioneering this change is Mike Coleman, a veteran broadcast engineer and co-founder of Portland, Oregon-based Port 9 Labs.
In an excellent blog post written by Michael Goldman for SMPTE and in a public demonstration of Port 9’s proposed cloud-based live switching technology, Coleman explains how SMPTE is working to develop new architectures for live remote video broadcasting — in the cloud.
He argues that the industry has by necessity begun moving parts of its production equation to the cloud, but that this is to a large extent piecemeal.
“If you examine how a cloud-native service would be built, it would be radically different than the architectures you are seeing for these lift-and-shift kinds of things. In other words, for now, there is a big disconnect,” he says.
Coleman admits it is still early in the company’s development of its own cloud-native production architecture, and that the wider industry will, of necessity, be slow to evolve in that direction, but he nevertheless believes such a transition is inevitable.
“Right now, we have lots of lift-and-shift going on,” he explains. “That means people are moving existing ground-based solutions into the cloud. Since the [pandemic], people have been under a lot of pressure to take what they already have on the ground and incrementally change it to somehow make it work in the cloud. But they are starting to realize their limitations, and the industry is starting to understand it needs to adapt.”
Coleman believes it is now possible to build IP-based media systems that can be used via public cloud services and says his company has had success moving uncompressed video on multiple public cloud systems using multi-flow UDP (User Datagram Protocol).
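To make that concrete, here is a minimal sketch of what multi-flow UDP transport can look like, assuming a striping scheme invented purely for illustration: one uncompressed frame is split into small datagrams and spread round-robin across several UDP sockets, each datagram carrying a header so the far end can reassemble the frame. Port 9 has not published its transport details, so every name, address and number below is hypothetical.

```python
# Illustrative sketch only: Port 9's actual transport is not public. This shows
# the general idea of "multi-flow UDP" -- striping one uncompressed video frame
# across several parallel UDP sockets (flows) so no single flow has to carry
# the full bitrate, and tagging each datagram so a receiver could reassemble it.
import socket
import struct

DEST = ("127.0.0.1", 50000)      # loopback here; a cloud receiver in practice
NUM_FLOWS = 4                    # number of parallel UDP flows (assumed)
CHUNK = 1200                     # payload bytes per datagram (fits a typical MTU)

# One socket per flow; distinct ephemeral source ports let the network hash
# each flow onto a different path inside a cloud region.
flows = [socket.socket(socket.AF_INET, socket.SOCK_DGRAM) for _ in range(NUM_FLOWS)]

def send_frame(frame_number: int, frame_bytes: bytes) -> None:
    """Stripe one frame across all flows, round-robin, with a small header."""
    total_chunks = (len(frame_bytes) + CHUNK - 1) // CHUNK
    for i in range(total_chunks):
        payload = frame_bytes[i * CHUNK:(i + 1) * CHUNK]
        # Header: frame number, chunk index, total chunks (network byte order).
        header = struct.pack("!IHH", frame_number, i, total_chunks)
        flows[i % NUM_FLOWS].sendto(header + payload, DEST)

# Example: a dummy payload roughly the size of a 10-bit 4:2:2 1080p frame.
send_frame(0, bytes(5_184_000))
```

Running several flows in parallel matters because cloud networks tend to hash each flow onto a different path, so no single flow has to sustain the multi-gigabit rate of uncompressed video on its own.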
“Cloud IP media services would be managed as SaaS [software as a service]. Broadcasters would control the programming from anywhere they choose, but the underlying service will be maintained by the service provider,” is how he and his colleagues describe it in a separate article written for the SMPTE Motion Imaging Journal.
“It’s definitely an over-the-horizon thing and will likely take many years to get there,” Coleman says. “But, in our opinion, cloud architecture, if done correctly, would be totally different from how things are done on the ground, since the whole point obviously would be to leverage the strengths of the cloud.”
A number of critical issues need to be addressed, chief among them broadcasters’ concerns about quality of image and of synchronization, both of which are fundamental to the SMPTE ST 2110 family of standards.
Coleman says it should be possible to maintain quality by working with compressed media in the cloud and effectively only using uncompressed media at the point of transmission (or perhaps even rendered at display if edge compute is built out).
He picks out NDI — once anathema to broadcast engineer purists — as a robust and proven solution for sharing lightly compressed AV and metadata across IP networks.
“Generally, it is pretty good for its purpose and pretty easy to move up into the cloud, but even so, the video quality isn’t quite up to modern broadcast standards since it still requires 8-bit compressed video,” Coleman says. “Studios typically would prefer to compose video in the highest possible quality and then use compression later only for the transport phase.”
He thinks this is still a hybrid of the “lift and shift” approach and therefore not ideal. A better solution, to Coleman, is Grass Valley’s AMPP, “which is more cloud-native but still kind of in the middle between lift-and-shift and where we think it has to go.”
Coleman says one key to creating a true cloud-native architecture for broadcasters to use when producing live content involves approaching the concept of an IP-based workflow differently by taking an “asynchronous rather than a synchronous approach.”
“Today, in an IP-based studio, like with most IP-based things, you need extremely tight timing,” he explains. “Everything has to be synchronous using the PTP (Precision Time Protocol) to [synchronize all devices on a computer network]. In the cloud that is really hard to do and we have begun to realize you don’t need to do it, because you typically have a huge amount of bandwidth and tons of CPU available in the cloud [when using a major cloud provider]. So, instead, we want to work in an asynchronous model, only synchronized on the edge if you need it to be.”
He says Port 9 is working on an architecture that works without being synchronous because everything is time-stamped: “We call this having a looser time model so that we can work on uncompressed video in the cloud.”
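Port 9 has not published how its time-stamped model is implemented, but a minimal sketch of the general idea, under assumptions of our own (per-frame capture timestamps, a small hold-back window, and reordering only at the output edge), might look like this:

```python
# Minimal sketch of the "looser time model" idea, with assumed details: each
# frame carries an origination timestamp, arrivals may be late or out of order,
# and alignment happens only at the point of output (the "edge") rather than
# being enforced end-to-end with PTP. Names and buffer sizes are hypothetical.
import heapq
import time
from dataclasses import dataclass, field

@dataclass(order=True)
class Frame:
    timestamp: float                      # capture time stamped at the source
    source: str = field(compare=False)    # e.g. "cam1", "cam2"
    payload: bytes = field(compare=False, default=b"")

class EdgeAligner:
    """Reorders asynchronously arriving frames by timestamp before output."""
    def __init__(self, hold_back: float = 0.200):
        self.hold_back = hold_back        # how long to wait for stragglers (s)
        self.heap: list[Frame] = []

    def ingest(self, frame: Frame) -> None:
        heapq.heappush(self.heap, frame)  # arrival order does not matter

    def pop_ready(self, now: float) -> list[Frame]:
        """Release frames whose timestamp is older than now minus hold_back."""
        ready = []
        while self.heap and self.heap[0].timestamp <= now - self.hold_back:
            ready.append(heapq.heappop(self.heap))
        return ready

# Usage: frames from two cameras arrive out of order but are emitted in order.
aligner = EdgeAligner()
t0 = time.time()
aligner.ingest(Frame(t0 + 0.040, "cam2"))
aligner.ingest(Frame(t0 + 0.000, "cam1"))
for f in aligner.pop_ready(now=t0 + 0.300):
    print(f.source, round(f.timestamp - t0, 3))
```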
Another problem is egress: transferring material, particularly data-heavy media, out of the cloud. The transfer itself isn’t a problem per se, but the cost is.
“Cloud providers will charge you a lot of money in terms of data transfer fees,” Coleman says. “Therefore, typically, you do not want to send your uncompressed video back down to your facility on the ground. Our solution for that is to send only proxies down to the ground — that’s where we would use compression. Broadcasters are already very familiar with using proxies in their workflows.”
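The article does not say what format those proxies take, so the following is only an illustration of the pattern: keep the full-quality master in the cloud and egress a small compressed proxy, here generated with ffmpeg using codec and bitrate choices that are entirely assumed.

```python
# Illustrative only: Port 9's proxy path is not described in detail. This shows
# the general pattern of keeping the full-quality media in the cloud and sending
# only a small compressed proxy to the ground, using ffmpeg (assumed installed)
# to turn a cloud-side master file into a low-bitrate H.264 proxy.
import subprocess

def make_proxy(source_path: str, proxy_path: str) -> None:
    subprocess.run(
        [
            "ffmpeg", "-y",
            "-i", source_path,            # full-quality master stays in the cloud
            "-vf", "scale=960:540",       # quarter-resolution picture for review
            "-c:v", "libx264",
            "-b:v", "2M",                 # a few Mbps instead of multiple Gbps
            "-c:a", "aac", "-b:a", "128k",
            proxy_path,                   # only this file is egressed to ground
        ],
        check=True,
    )

make_proxy("master_uncompressed.mov", "proxy_for_ground.mp4")
```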
He says that SMPTE ST-2110 “is simply too tight in terms of timing” to work as a formal standard for live media production in the cloud, but adds that the Video Services Forum (VSF) is already at work with its Cloud to Ground/Ground to Cloud (CGGC) working group, which launched in 2020.
Among other things, that working group is examining whether a common set of requirements for ground-to-cloud and cloud-to-ground data transfers would be necessary, and how best to establish a common technical approach for such transfers.
Coleman adds that the working group “is embracing the idea of the timing model being looser in the cloud, so in my opinion, they are moving in the right direction by focusing on the data plane, or data transfer area.”
All of this has quite a way to travel before it would ever become ubiquitous in the wider industry. For now, he says the primary initial goal is to simply “sensitize people so that they can become aware that something like this is possible.”
“Broadcasters are still in this process of continuing to try incremental changes to their workflows in order to keep working as they move into the cloud,” he says. “What I’m saying is that an incremental approach won’t ever get you where you want to go. At some point, you have to make a big break. Before they can make that big break, they have to understand how it could work using [a cloud-native process]. I expect there may be a transition period of about five years before broadcasters are really using the cloud the way it ought to be used for live production. But I do think it is inevitable that it will happen.”