What’s the difference between artificial intelligence and machine learning? (Is there a difference?) How can AI/ML help the media industry work smarter, not harder today — and how might that play out in the future?
NAB CTO and EVP Sam Matheny and Renard Jenkins, senior vice president of production integration and creative technology at WarnerMedia, consider these AI/ML questions and others as part of NAB Amplify’s “Hey, Sam!” Q&A series, tackling the issues of who’s who and what’s what in technology and M&E.
Jenkins says he defines machine learning as “the process of modeling and then training a system to do a specific task.” ML requires algorithms, which “have an expectation and they have a trigger and they have an outcome.”
Artificial intelligence, he says, “is sort of the evolution from machine learning into something where the machine itself can actually begin to predict and make decisions on its own.” With AI, the algorithm does not have a single predetermined outcome; its output may change based on the different triggers in the scenario.
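As a loose illustration of that framing — not anything from the interview itself — the toy sketch below contrasts a fixed trigger-and-outcome rule with a system that revises its own decision threshold as it observes more inputs. All names, thresholds, and numbers are invented.

```python
# Toy sketch of the distinction Jenkins draws (names and thresholds invented):
# a fixed rule maps a trigger to one predetermined outcome, while an adaptive
# system changes its own decision boundary as the observed triggers change.

def fixed_rule(audio_level: float) -> str:
    """Automation in the trigger -> predetermined-outcome sense."""
    return "mute" if audio_level < 0.10 else "pass"

class AdaptiveGate:
    """A system whose outcome can shift as it observes more data."""

    def __init__(self, initial_threshold: float = 0.10) -> None:
        self.threshold = initial_threshold
        self.history: list[float] = []

    def decide(self, audio_level: float) -> str:
        self.history.append(audio_level)
        # Re-derive the threshold from everything seen so far.
        self.threshold = 0.5 * (sum(self.history) / len(self.history))
        return "mute" if audio_level < self.threshold else "pass"

if __name__ == "__main__":
    gate = AdaptiveGate()
    for level in (0.05, 0.20, 0.60, 0.08):
        print(fixed_rule(level), gate.decide(level))
```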
Often, Jenkins says, when AI is used as a buzzword, the correct term would technically be machine learning.
For example, when radio stations started using automation in the 1970s, they were using machine learning to train a device to emulate a radio station’s programmer.
In terms of understanding the relationship between AI and ML, Jenkins says, “Machine learning begot artificial intelligence, in a sense. You know, machine learning has to be there in order for artificial intelligence to develop.”
What are some examples of AI and ML in media today?
Virtual characters are an example of AI in action in 2022. Humans input certain characteristics, and the algorithm creates a new persona, complete with a unique personality and outlook; applications such as GPT-3 can then hold conversations with these virtual characters.
Jenkins participated in a demonstration of this technology at WarnerMedia, and he admits, “to be honest, it kind of creeped me out that this thing began to answer me in sort of really, really interesting ways.”
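A minimal sketch of what such a setup might look like, assuming the GPT-3-era OpenAI Python library — the persona text, engine name, and parameters here are purely illustrative, not WarnerMedia’s actual implementation, and the current OpenAI SDK uses a different interface:

```python
# Illustrative only: a persona-driven chat turn using the legacy (GPT-3-era)
# OpenAI completions API. Persona, engine, and parameters are placeholders.
import openai

openai.api_key = "YOUR_API_KEY"  # placeholder

PERSONA = (
    "You are 'Ada', a virtual character: curious, dry-witted, "
    "and fond of old radio dramas."
)

def ask(character_prompt: str, user_line: str) -> str:
    """Send one conversational turn to the model and return its reply."""
    prompt = f"{character_prompt}\nHuman: {user_line}\nAda:"
    response = openai.Completion.create(
        engine="text-davinci-002",   # GPT-3-era engine name
        prompt=prompt,
        max_tokens=80,
        temperature=0.8,             # higher = more "personality"
        stop=["Human:"],
    )
    return response.choices[0].text.strip()

print(ask(PERSONA, "What do you think about tomorrow's broadcast?"))
```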
Additionally, ML is simplifying supply chains for broadcasters, eliminating mundane tasks previously performed by individual employees. Archiving content by category and tagging it with keywords, for example, “creates an automated way for you to move the small stuff outta the way.”
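As a rough illustration only — the keyword list, categories, and folder layout below are made up, not a real broadcaster workflow — an archival pass like that can be as simple as tagging assets from their descriptions and routing routine material automatically:

```python
# Illustrative only: auto-tag archive assets from their descriptions and
# route routine, fully tagged items out of the review queue.
# Keywords, categories, and paths are invented for the sketch.
from pathlib import Path
import shutil

KEYWORD_MAP = {
    "weather": ["forecast", "radar", "storm"],
    "sports": ["game", "score", "highlight"],
    "traffic": ["commute", "accident", "roadwork"],
}

def tag_asset(description: str) -> list[str]:
    """Return every category whose keywords appear in the description."""
    text = description.lower()
    return [cat for cat, words in KEYWORD_MAP.items()
            if any(w in text for w in words)]

def archive_if_routine(clip: Path, description: str, archive_root: Path) -> list[str]:
    """Move fully tagged, routine clips straight into the archive."""
    tags = tag_asset(description)
    if tags:  # routine content: file it and get it out of the way
        dest = archive_root / tags[0]
        dest.mkdir(parents=True, exist_ok=True)
        shutil.move(str(clip), dest / clip.name)
    return tags  # untagged clips stay put for a human to review
```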
Another current example is algorithms being trained to “edit sports highlights based off of the action that’s taking place within the frame.”
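One common way to approximate that idea — a sketch, not the system Jenkins describes, which would use trained models rather than raw pixel motion — is to score in-frame activity with simple frame differencing (here via OpenCV) and keep the highest-activity stretches:

```python
# Sketch of action-based highlight selection using frame differencing.
# Thresholds and window sizes are arbitrary; a production system would rely
# on trained models rather than raw pixel motion.
import cv2
import numpy as np

def activity_per_frame(video_path: str) -> list[float]:
    """Score each frame by how much it differs from the previous one."""
    cap = cv2.VideoCapture(video_path)
    scores, prev = [], None
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        if prev is not None:
            scores.append(float(np.mean(cv2.absdiff(gray, prev))))
        prev = gray
    cap.release()
    return scores

def pick_highlights(scores: list[float], window: int = 150, top_k: int = 3) -> list[int]:
    """Return the start indices of the top_k most active windows."""
    sums = [sum(scores[i:i + window])
            for i in range(max(1, len(scores) - window))]
    return sorted(range(len(sums)), key=lambda i: sums[i], reverse=True)[:top_k]
```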
What’s real now, and what will be real?
“Machine learning has been real for quite some time,” Jenkins says, while “AI, I think, is really coming into its own now.” It’s advanced beyond “fancy machine learning.”
Some data scientists have said that even though they created these algorithms, they can no longer predict the information that will be generated. That signals, Jenkins says, “that we have entered into a different phase from machine learning into true artificial intelligence.”
Many of our AI and ML expectations are grounded in sci-fi movies or graphic novels, and there’s also an element of “science follows art, and it starts to look to create some of those things that we’ve seen,” Jenkins explains. He cites the example of Star Trek’s flip-open communicators, which long predated the Motorola Razr. “With artificial intelligence, the hope is that we’re creating something that can be used for good, and something that can help enhance our lives, much like you see on The Jetsons.”
Want more? The INTELLIGENT CONTENT pillar at NAB Show covers all things AI and ML, data and personalization.
Of course, there are also ethical concerns at play. Just because we can do something, should we? AI ethics are a bit of a minefield, and some algorithms already perform in ways that exacerbate inequality or create new problems, or at least expose bias baked into their data or training.
Facial recognition is a prominent example of this concern, because these systems frequently import racial biases. Résumé-screening tools intended to correct gender disparities have also been undermined by their trainers’ biases.
“We have to make sure that as you’re creating these algorithms, and as you’re creating your data sets that you remove as much bias as you possibly can,” Jenkins says. However, he cautions, “I don’t think that we can ever get to a fully unbiased space because the information is still being put in by human beings. There’s still unconscious bias.” Recognizing that fact is crucial to responsible use of AI.
What’s the timeline for media companies to implement AI IRL?
Jenkins expects “this is gonna be one of those things that sort of happens organically.” There won’t be a light switch to turn on artificial intelligence. Rather, “individuals will see opportunities where they can actually make their lives a little bit better through the use of this technology.”
In post production, for example, colorists may start implementing automation for color correction of dailies so that they better match what directors expect during the review process. The actual “artistry and the creativity that goes into the color work would then be done by the artist themselves based off of what this automated process could do.”
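For a rough sense of what that first automated pass could look like, the sketch below uses scikit-image’s histogram matching as a stand-in for a real color pipeline; the file names are placeholders, and no actual grading workflow is implied.

```python
# Sketch: roughly match a daily's color distribution to a reference frame
# before a colorist does the real creative work. Uses scikit-image
# histogram matching as a stand-in for a production tool; file names
# are placeholders.
from skimage import io
from skimage.exposure import match_histograms

daily = io.imread("daily_shot.png")            # placeholder file names
reference = io.imread("approved_reference.png")

# Match each color channel of the daily to the reference frame.
balanced = match_histograms(daily, reference, channel_axis=-1)

io.imsave("daily_shot_balanced.png", balanced.astype("uint8"))
```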
In the realm of audiobooks, conversations have started around the ethics of automated voices and their potential to help those with disabilities.
HEY, SAM! DEMYSTIFYING M&E TECHNOLOGY TRENDS:
From Artificial Intelligence and Machine Learning to 5G and cloud production, NAB CTO and EVP Sam Matheny tackles the issues of who’s who and what’s what in technology and M&E. Learn all about the technologies that are shaping our rapidly transforming industry in NAB Amplify’s “Hey, Sam!” Q&A series:
- Hey, Sam! What Is Cloud Production?
- Hey, Sam! What Is Hybrid Radio?
- Hey, Sam! What Do I Need To Know About 5G?
- Hey, Sam! How Is Streaming Impacting the TV and Video Marketplace?