Gisbert Hochguertel: We expect changes in terms of speed and performance

Ikegami: HDK-X500 Portable 3-CMOS HD Camera at IBC 2024

Gisbert Hochguertel, Sales & Marketing / Broadcast & Prof. Video Division, Ikegami Electronics (Europe) GmbH, in the TKT1957 survey «The year 2030: AI or engineer?».

  1. How will the broadcasting industry and broadcast technologies change in the next 5 years?
  2. If we model the world of broadcasting and broadcast tech in 2030, what role will AI play?
  3. How will AI change your business segment?
  4. Which professions will AI displace in the broadcasting technology industry by 2030?

1. For Ikegami, as a traditional manufacturer of studio and portable system camera equipment for live production, we do expect a change in terms of speed and operability, but no longer in terms of resolution or overall picture quality.

The introduction of 4K/UHD almost a decade ago, together with HDR (High Dynamic Range), has brought more than enough visual quality into the TV world. This technology has already become commonplace within the TV industry, although the majority of viewers at home are still waiting for those high-quality signals to reach their home TV sets.

For general viewing applications, more resolution makes no sense for the human eye and only increases data rates, effort and cost. The focus of broadcast development within the coming 5 years will be on transmission speed and operability.

Remote Production is a key technology here, and ST-2110 will be the game changer. Modern compression algorithms such as JPEG-XS will make it possible to use existing IT bandwidth cleverly, so that many sources can be bundled for transmission. This will increase speed and efficiency drastically, while at the same time the content produced remotely can be shared among different locations for editing and post-production. This will open up new ways of programme production that are not yet familiar in the broadcast industry today.
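A back-of-the-envelope sketch illustrates the bundling argument. The resolutions, bit depth and the roughly 10:1 JPEG-XS compression ratio below are assumptions chosen for illustration only, not Ikegami's figures:

```python
# Rough estimate: how many camera feeds fit on one IP link once JPEG-XS
# compression is applied to ST 2110-style streams. All numbers are assumptions.

def uncompressed_mbps(width, height, fps, bits_per_pixel=20):
    """Approximate uncompressed bandwidth in Mbit/s (4:2:2, 10-bit ~ 20 bpp)."""
    return width * height * fps * bits_per_pixel / 1e6

def feeds_per_link(link_gbps, width, height, fps, compression_ratio, overhead=0.10):
    """How many compressed feeds fit on a link, leaving headroom for IP overhead."""
    usable_mbps = link_gbps * 1000 * (1 - overhead)
    per_feed = uncompressed_mbps(width, height, fps) / compression_ratio
    return int(usable_mbps // per_feed), per_feed

if __name__ == "__main__":
    formats = {"1080p50": (1920, 1080, 50), "2160p50 (UHD)": (3840, 2160, 50)}
    for label, (w, h, fps) in formats.items():
        count, per_feed = feeds_per_link(10, w, h, fps, compression_ratio=10)
        print(f"{label}: ~{per_feed:.0f} Mbit/s per feed at ~10:1 "
              f"-> roughly {count} feeds on a 10 GbE link")
```

Under these assumptions, dozens of HD feeds or around ten UHD feeds share a single 10 GbE link, which is the kind of bundling that makes remote production over existing IT infrastructure attractive.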

Talking about operability, we expect the front-end systems (i.e. all kinds of cameras) to become smaller, lighter and easier to operate through clever technologies for more accurate exposure and focusing. Box cameras are already in high demand today, and we can expect this trend to continue. A well-trained camera operator will always be necessary to frame the best state-of-the-art picture, but in many cases unmanned cameras on robotic heads, installed all over the live venue, will replace the camera operator in the long term. At that point, AI could take over control of the robotic camera by automatically following objects (e.g. PTZ movements controlled by AI).
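As an illustration of the principle behind such AI-driven following, a simple proportional controller can turn a detected subject's offset from the image centre into pan/tilt speed commands. The detector and the camera-control call in this sketch are placeholders, not any specific product's API:

```python
# Minimal sketch: keep a detected subject centred by converting its offset
# from the image centre into normalised pan/tilt speeds (-1..1).

def follow_step(frame_width, frame_height, bbox, gain=0.8, deadband=0.03):
    """Return pan/tilt speeds that recentre the subject given its bounding box."""
    x, y, w, h = bbox                          # top-left corner plus size, in pixels
    err_x = (x + w / 2) / frame_width - 0.5    # horizontal offset from centre
    err_y = (y + h / 2) / frame_height - 0.5   # vertical offset from centre
    pan = gain * err_x if abs(err_x) > deadband else 0.0
    tilt = -gain * err_y if abs(err_y) > deadband else 0.0  # image y grows downwards
    clamp = lambda v: max(-1.0, min(1.0, v))
    return clamp(pan), clamp(tilt)

# Hypothetical control loop (detector and PTZ protocol are assumed, not real APIs):
# while streaming:
#     bbox = detect_subject(latest_frame)   # e.g. a person-detection model
#     pan, tilt = follow_step(1920, 1080, bbox)
#     send_ptz_speed(pan, tilt)             # camera-specific control command
```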

2. Whether we like it or not, we have to accept that AI will play an ever-increasing role in live production (the segment I'm talking about here and where Ikegami has its expertise). Looking at the current trend towards robotic camera heads, it is easy to imagine software that automatically follows persons or objects and keeps them accurately in focus. A fully automated live production of, for example, a football game would thus be possible… But this is just the beginning. An AI system could learn to follow fouls or offsides, or could even learn how to frame portraits or other scenery.

3. Looking back at the history of live production, we see that some decades ago events were covered with far fewer cameras. A football game, for example, was shot from a limited number of camera positions, sometimes only 3 to 5 cameras. At the time, this was fine and accepted.

Nowadays, spectators at home expect to see the event from all possible angles (e.g. SpiderCam, behind-the-goal cam, cameras in the changing rooms etc.), whether in real time or at high frame rates. The number of cameras on site has increased drastically, with mini and box cameras playing an ever more important role. Most of these additional camera positions are unmanned and as such offer very high potential to be connected to and controlled by AI.

For Ikegami, as a high-quality camera manufacturer, this means developing cameras that satisfy customer demands and offer all requested features (e.g. resolution, HFR, HDR, size, weight, IP connectivity, compression technologies…) within the product portfolio. The cameras will need to be flexibly adaptable (by software) and integrated into a wider range of products (hardware compatibility), with an open structure (interface) to allow AI integration.

4. Since framing pictures and setting the lights is always creative work, we do not expect directors of photography to become obsolete. The creative process of shooting pictures will still require a human being in control of the entire process. Thinking about the job of a vision engineer (e.g. color matching of cameras before and during on-air time), we can easily imagine this job being partially or even fully taken over by AI.
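A deliberately simplified sketch of what partial automation of camera matching might look like: deriving per-channel gain corrections from a neutral reference patch (e.g. a grey card) seen by both cameras. Real vision engineering involves far more (gamma, matrix, skin tones), so this is only an assumed, illustrative reduction of the task:

```python
import numpy as np

def channel_gains(reference_patch: np.ndarray, camera_patch: np.ndarray) -> np.ndarray:
    """Gains that bring the camera's mean R, G, B on the patch to the reference's."""
    ref_mean = reference_patch.reshape(-1, 3).mean(axis=0)
    cam_mean = camera_patch.reshape(-1, 3).mean(axis=0)
    return ref_mean / np.maximum(cam_mean, 1e-6)   # avoid division by zero

def apply_gains(frame: np.ndarray, gains: np.ndarray) -> np.ndarray:
    """Apply the per-channel gains to a full frame (float values assumed in 0..1)."""
    return np.clip(frame * gains, 0.0, 1.0)
```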

The job of the switcher operator could also be influenced by AI. With more and more camera signals being fed into the switcher, we can easily imagine the operator no longer being able to keep track of the multitude of signals. In such a case, an AI could automatically suggest the decisive signals to the switcher operator.

In a nutshell, wherever predictable routines prevail in the picture or processes can be automated, the use of AI can already be imagined today.

However, creative processes such as image composition, framing and lighting design will continue to require human beings as the decisive factor – at least for the coming 5 years.

All the opinions of industry leaders can be seen in the survey “Year 2030: AI or Engineer?”.
