Artificial intelligence – adding value or just another buzzword?
By Craig Bury, CTO of Three Media
Incorporating advances in technology to sell a product is nothing new. The music and broadcast equipment sectors have used innovations like Dolby Surround and 4K to promote products to consumers and professionals alike. In both instances the benefits of the technologies are tangible or can be explained easily. Sometimes, however, the technological advance in question is so specialised or arcane that it becomes a mere buzzword bordering on the meaningless.
This is certainly the case with invented terms such as orchestration, end-to-end system and even “the cloud.” These terminologies mean everything and nothing at the same time, which can obscure the actual merits of the underlying concepts. The same indignity is currently befalling two long-established and perfectly legitimate, different but aligned methodologies: artificial intelligence (AI) and machine learning (ML). Attend any exhibition or look through a trade magazine and you’re sure to come across systems or products that claim to have AI and/or ML at their core.
The problem is that there is a degree of bandwagon jumping in all of this. In reality, very few products outside dedicated AI systems have true functional AI within them, and while some might include a form of ML, it may not be fully implemented. When manufacturers that do not have AI/ML as a core competency make unfounded claims as a selling point, it’s little wonder that the market reacts with cynicism.
All of which is unfortunate because AI and ML have a long history of being associated with some pretty impressive technological advancements. They also offer a lot to the broadcast industry in terms of improving and automating processes. Sadly, that potential is often overshadowed by the hype and over-reliance on the power of the terms themselves.
In trying to understand what AI is – and more to the point, what it isn’t – we should take a look at some of its early development. Throughout history, scientists have attempted to produce mechanised procedures that emulate human thought, ranging from calculating machines to sentient robotic beings. The term itself did not appear until 1956, when computer scientist John McCarthy, widely regarded as the father of artificial intelligence, coined it for a conference on the subject. The first AI-type programs were also created in the 1950s, including Arthur Samuel’s checkers-playing program at IBM.
AI is a sub-specialty of computer science that seeks to emulate human thought and enable computers to perform tasks that would usually require human intelligence. While it has specific applications of its own, it is largely used as an umbrella term for other, more specialised procedures and techniques. These capabilities include robotics, deep learning (another buzzword in broadcast technology), intelligent control and data mining. Furthermore, ML and artificial neural networks are sub-sets of AI and have crossed over into other disciplines, including video and audio processing, because of their ability to control mundane but necessary tasks, such as analysing large amounts of data to create programs and operating routines.
The term machine learning was introduced in 1959 by Arthur Samuel, the IBM computer scientist who developed the aforementioned checkers-playing program. The concept relates to algorithms and statistical models used in computer systems to carry out particular functions by drawing on patterns and inference rather than fixed rules. Artificial neural networks, like ML in general, do not work on the rule-based approach of classical AI systems. Instead, they mimic how neurons in the human brain work by forming a series of self-modifying connections. Neural networks do, however, have to be ‘trained’ on example data to accomplish set tasks, which is where they differ from the looser statistical approach of broader ML.
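The difference between fixed rules and learned patterns can be sketched in a few lines of Python. This is an illustrative toy, not code from any real product: instead of an engineer hard-coding what counts as which category, the “model” simply counts which words appear in labelled example messages and classifies a new message by those statistics.

```python
from collections import Counter

# Toy illustration of pattern-based classification (no explicit rules).
# The labels and messages below are invented examples.
training_data = [
    ("audio loudness out of range", "real_failure"),
    ("video black frames detected", "real_failure"),
    ("brief silence at programme boundary", "false_positive"),
    ("expected black between segments", "false_positive"),
]

# "Training": count how often each word appears per class.
word_counts = {"real_failure": Counter(), "false_positive": Counter()}
for message, label in training_data:
    word_counts[label].update(message.split())

def classify(message):
    """Score each class by summed word counts learned from the examples."""
    scores = {
        label: sum(counts[word] for word in message.split())
        for label, counts in word_counts.items()
    }
    return max(scores, key=scores.get)

print(classify("black frames detected in video"))  # → real_failure
```

Nothing in `classify` was written by hand to recognise a particular message; the behaviour comes entirely from the training examples, which is the essence of the pattern-and-inference approach described above.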
Most likely because of this perceived complexity, the concept of ML is not pushed as much as that of AI. It is rarely used as a standalone term and is more often lumped together with AI. Perhaps ML is less ‘sexy’ than AI, which is certainly more relatable than an artificial neural network. The way we think of the terms “artificial intelligence” and “machine learning” may also give us clues to the way we perceive the aligned technologies. Intelligence is something we humans regard as one of our own qualities, while machine learning sounds like something meant for a lower form of ‘robotic’ being. Even though it came out almost 20 years ago, in 2001, Steven Spielberg’s film A.I. Artificial Intelligence explores this idea through a robot that looks, walks and talks like a young boy, emphasising its humanity rather than its nature as a mere learning machine.
Because of the general lack of understanding of what AI is and does – combined with the ‘fatigue’ it can cause when trying to discuss it – companies like Three Media are sometimes viewed with scepticism when they promote the capability as part of their product offerings. Three Media has AI within its simulation modelling and optimisation capability, which was designed in conjunction with Imperial College London, but we don’t heavily promote it. We’ve taken this stance because the buzzword “AI” has saturated the conversation and is not always taken seriously in this context.
It is somewhat ironic that ML is not pushed as much, because it is easier to implement and quicker to develop; it’s just not as relatable. Despite this lower profile, it is capable of increasing efficiency considerably. In light of this, Three Media is preparing to implement ML within its XEN:Pipeline product suite. Initial implementations will introduce logic for Auto QC by ‘teaching’ the system to recognise false positives and to automatically re-trigger a workflow when there is a known failure message.
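The Auto QC behaviour described above might be sketched as follows. All names here are hypothetical – this is not the XEN:Pipeline API – and the point is only to show the two behaviours the text describes: suppressing messages the system has been taught are false positives, and automatically re-triggering a workflow on a known failure message.

```python
# Hypothetical sketch of the described Auto QC logic; all names invented.

# Known transient failure messages that warrant an automatic re-run.
KNOWN_TRANSIENT_FAILURES = {"storage timeout", "licence server unreachable"}

# Messages the system has been 'taught' (e.g. confirmed by operators
# over time) to treat as harmless false positives.
LEARNED_FALSE_POSITIVES = {"silence at programme boundary"}

def handle_qc_result(message, retrigger_workflow):
    """Decide what to do with a QC failure message."""
    if message in LEARNED_FALSE_POSITIVES:
        return "pass"          # suppress: learned to be benign
    if message in KNOWN_TRANSIENT_FAILURES:
        retrigger_workflow()   # automatically re-run the workflow
        return "retriggered"
    return "escalate"          # genuinely unknown: route to an operator

runs = []
print(handle_qc_result("storage timeout", lambda: runs.append(1)))  # → retriggered
```

In a fuller ML implementation, the `LEARNED_FALSE_POSITIVES` set would of course be replaced by a trained classifier rather than a hand-maintained list; the dispatch logic stays the same.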
If AI and ML do not make an impact on people’s imaginations, then perhaps it is time to come up with new terms that are both ‘sexy’ and self-explanatory. Referring to these capabilities as “Cognitive Systems” is a possibility, although cognition is perhaps a little too generous as it is defined as the mental process of acquiring knowledge and understanding. System intelligence, intelligent systems and human emulation could all be applied to AI/ML situations, but would they catch on? Are they buzzword worthy? We probably need AI to help us come up with the answer.
Ultimately, it’s not important what the underlying technology is called; what matters is how intelligently it is designed, how it works and what value it creates for the customer. Everyone needs to look past the obsession with buzzwords. What counts is how a device or system, with all of its features, benefits your organisation by anticipating needs and growing as your requirements evolve. Now that’s intelligent.