LAS VEGAS — The 2018 NAB Show is closed and in the books, with attendance down about 10% from 2017 as just over 93,000 made the annual trek to Las Vegas. First impressions seemed to confirm the reduced attendance, at least in the Central and North Halls, where traffic seemed lighter than in the past. For 2018, NAB shifted many of the mainstream players back to South Upper and South Lower, and the South Halls generally seemed jammed with people strolling to find the major players, at least from the broadcast equipment and content creation standpoint.
For 2018, standout trends centered on the many evolving cloud-based workflows (including content and asset management, playout, and processing); emerging applications and solutions for Internet Protocol (IP) infrastructures based on the new SMPTE ST 2110 standards; the availability of UHD/4K components (from the absurdly inexpensive to the traditionally priced); practices for building High Dynamic Range (HDR) workflows that let users generate both SDR and HDR deliverables; and last but not least, the true arrival of AR/VR and AI/ML, our new set of two-letter acronyms for virtual reality and artificial intelligence.
IP IS HERE AND A REALITY
Perhaps the most evident of all the new technologies is the SDI game changer: real-time video networking over IP.
From the IP-transition perspective, this was the first NAB since the adoption of the SMPTE ST 2110 standards for managed professional media networks. Dozens of manufacturers brought new IP-centric products to the show. At the IP Showcase alone, more than 50 vendors showed interconnected IP-related products interoperating according to the standards SMPTE produced over the previous 18-24 months. Countless other vendors also touted “IP,” regardless of what they meant by it.
For this industry, “IP” is the new buzzword and the new direction, much as “cloud” was only a few short years ago. Yet IP takes on many different forms: compressed video, media workflows, the carriage of information technology and, of course, those hot new entries steeped in the production and transmission of real-time, full-bandwidth, uncompressed audio-video (and metadata) in a networking environment. Without a doubt, IP seems destined to eventually replace SDI, yet many hurdles remain before full IP adoption is complete.
A collective endeavor of the many manufacturers contributing to this transition was on display in the IP Showcase, found at the rear of the Central Hall. This year’s exhibit doubled in size over the 2017 NAB Show, reaching nearly 3,000 square feet. It featured a fully functioning, all-IP video production control room staffed by volunteers, including students from Toronto’s Ryerson University, and presentations on IP technologies and applications were streamed live over the NAB channel.
Sponsored by the NAB and composed of trade and standards representatives from organizations including AIMS, VSF, AMWA, IABM, EBU, SMPTE and more, the showcase once again demonstrated working examples of the new IP video standards plus integration of the NMOS interface specifications. This year the showcase exhibits were arranged as an educational environment, letting visitors see and learn about the advanced capabilities of IP for professional video.
Potential IP adopters saw how 53 manufacturers addressed software-defined networking (SDN) alongside new tools aimed at diagnostics and operational management for IP implementations. Records showed some 1,030 attendees were scanned into the system as visitors.
AR/VR & MORE
Rippling down from January’s Consumer Electronics Show, held in this same location, was the enormous prominence of VR/AR (virtual and augmented reality) and AI/ML (artificial intelligence and machine learning). Throughout the show there were sessions and evolving products that support the industry’s new needs to create, manage, and deliver content to these emerging platforms. These cross-platform technologies are opening new doors, ones that are creating immersive and interactive media across social media and transmedia.
One of those “new era” production modes is eSports: the transformation of gaming into a real-time, arena-based live event. eSports attracts inventive players and is in turn changing production techniques that may show promise for aspiring new venues. This gaming environment (already attracting more than 40 million fans) could add far-reaching opportunities for existing and future stadiums and arenas, especially when those venues are not hosting major league sporting events. eSports combines gaming and live “reality” television for both OTA and OTT, and pushes them into social media in a real-time domain. Look for many new programming opportunities across all forms of mobile communications and in-home entertainment.
MORE MOBILE VIDEO TRAFFIC
The biggest booth presence at NAB was Amazon — does this say something about the oncoming change? Here are some thoughts to ponder:
Program content production, ranging from long-form to user-generated short form, continues to explode, driving the technologies forward and the costs to produce that content downward. How that content will be consumed was a central undertone at NAB. According to Facebook’s Daniel Danker, “Fifty percent of all internet traffic is now delivered to mobile devices” and is “expected to be up to 75% in five years.”
One out of every five videos is “live” streaming. In August 2017, Facebook introduced “Watch,” a new platform for shows on Facebook. Watch is now available on mobile, on desktop and laptop, and in Facebook TV apps. Shows are made up of episodes — live or recorded — and follow a theme or storyline. And this is not where the story ends.
This, and dozens of similar stories, may indeed help drive the growth of the internet upward and outward. The NAB Show clearly showed this transition moving faster than ever, and the change is global; consider that 73% of homes in Sweden have full-time 100 Mbps internet. All the service providers see this only accelerating the full adoption of mobile video communications and technologies like ATSC 3.0.
UHD, HDR & HFR
Continued emphasis on UHD was echoed by the adoption of high dynamic range (HDR) and high frame rate (HFR). Several companies exhibited various means and methods to accommodate both the higher-resolution characteristics of UHDTV (aka “4K”) and the wider color gamut (WCG) perceptual capabilities found in HDR. Adding a new dimension, camera manufacturers and production solution providers alike are now making strong inroads into HFR video and the ability to produce both HDR and SDR (standard dynamic range, aka “plain HD video”) in a simultaneous workflow.
Producing HDR and SDR in concert with one another comes with steeper challenges than when broadcasters moved from SD to HD video, or from stereo to surround. The complexities of doing both, so that meaningful and proper images reach the consumer, were evident at NAB as potential creators and users sought to understand which comes first, the HDR or the SDR, creating a sort of chicken-and-egg perspective.
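One concrete piece of that HDR puzzle is the transfer function itself. The sketch below implements the SMPTE ST 2084 perceptual quantizer (PQ) EOTF used by HDR10-class systems, converting a nonlinear code value into absolute display light; it is a minimal illustration of the curve, not a production color pipeline:

```python
# SMPTE ST 2084 (PQ) EOTF: nonlinear signal E' in [0, 1] -> luminance in cd/m^2.
# Constants are the rational values given in the ST 2084 specification.

M1 = 2610 / 16384          # 0.1593017578125
M2 = 2523 / 4096 * 128     # 78.84375
C1 = 3424 / 4096           # 0.8359375
C2 = 2413 / 4096 * 32      # 18.8515625
C3 = 2392 / 4096 * 32      # 18.6875

def pq_eotf(signal: float) -> float:
    """Convert a PQ-coded value (0..1) to absolute luminance in nits."""
    ep = signal ** (1 / M2)
    num = max(ep - C1, 0.0)
    den = C2 - C3 * ep
    return 10000.0 * (num / den) ** (1 / M1)

print(pq_eotf(0.0))   # 0.0 nits (black)
print(pq_eotf(1.0))   # 10000.0 nits (PQ peak)
```

Because the curve allocates code values to absolute luminance rather than to a relative gamma, deriving an SDR version from an HDR master (or vice versa) requires explicit tone mapping rather than a simple gain change, which is a large part of the workflow debate described above.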
The next-generation OTA broadcast standard is complete, driving several traditional broadcast companies to develop products aimed at the initial rollouts. ATSC 3.0 is a game-changing standard designed to deliver better video and audio quality, not only over-the-air (OTA) but also over-the-top (OTT).
Technology-wise, ATSC 3.0 is an IP-based transmission standard designed around a five-layer stack akin to the seven-layer OSI model used in IP networking. The layered model allows for easy technology replacement and substitution as new features or developmental advances roll out. And ATSC 3.0 may have far-reaching capabilities.
Audio improvements for ATSC 3.0 will be remarkable. Dolby AC-4, the new “next generation audio” (NGA) format, will include three audio element formats: channel-based (like we have today with mono, stereo and surround); object-based (used for immersive audio mixes), which carries audio signals plus positioning metadata for customized audio programming; and scene-based, a sort of soundfield snapshot from a higher-order ambisonic (i.e., full-sphere surround sound) source that can position audio above and below the listener.
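The three element formats above can be pictured as a simple data model. The class and field names below are purely illustrative, invented for this sketch; they are not AC-4 bitstream syntax:

```python
# Illustrative data model for the three NGA audio element formats described
# above. All names and fields here are hypothetical, not AC-4 syntax.
from dataclasses import dataclass
from enum import Enum
from typing import Optional, Tuple

class ElementFormat(Enum):
    CHANNEL_BASED = "channel"   # today's mono, stereo and surround beds
    OBJECT_BASED = "object"     # audio signal + positioning metadata
    SCENE_BASED = "scene"       # ambisonic soundfield snapshot

@dataclass
class AudioElement:
    format: ElementFormat
    label: str
    # Object-based elements carry positioning metadata (x, y, z here).
    position: Optional[Tuple[float, float, float]] = None

program = [
    AudioElement(ElementFormat.CHANNEL_BASED, "5.1 bed"),
    AudioElement(ElementFormat.OBJECT_BASED, "commentary", position=(0.0, 1.0, 0.2)),
    AudioElement(ElementFormat.SCENE_BASED, "crowd ambience"),
]
print([e.format.value for e in program])  # ['channel', 'object', 'scene']
```

Modeling the mix as a list of typed elements is what enables the customization AC-4 promises: a receiver can, for instance, raise or relocate the object-based commentary element without touching the bed or the soundfield.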
WHAT MIGHT 2019 BRING?
Given the significant emphasis on software, virtualization, the cloud and the evolving flavors of audio and video imaging, plus where the internet will really take us, it is too early to predict where we’ll be at the 2019 show. One thing is for sure, though: change will continue and the industry will surely adapt. Stay tuned to see what survives and what becomes reality.
Karl Paulsen is CTO at Diversified and a SMPTE Fellow. He is a frequent contributor to TV Technology, focusing on emerging technologies and workflows for the industry. Contact Karl at firstname.lastname@example.org.