How to Shoot a Low-Budget Reality Show — Technical Architecture, PTZ, and 24/7 Operations

Low-budget reality production is always a balance between project economics and technical reliability: minimal redundancy, round-the-clock shooting, daily broadcast, and a continuous YouTube stream. In this case study, Oleg Mikhonosha analyzes the low-budget reality production model through the example of House, a project produced for Georgian broadcaster Rustavi 2. He breaks down the project architecture, equipment choices, and PTZ camera workflows, and explains why the key factor in reality production is not the equipment, but control.

Oleg Mikhonosha, Technical Operation Manager (TOM), on the configuration of House for Rustavi 2, choosing Panasonic over Sony, PTZ control challenges, and developing custom software.

Portions of this article were published in TFT 1957 | TV & Film Technology Magazine, April 2026, Issue 792.

Project and Production Model

The project House: 16 participants live in an isolated environment, with weekly voting leading to elimination or restrictions.

Four years ago, the format aired as `Prime House` on Rustavi 2. The new version is titled `House`. While the concept remained the same, the production model changed. Previously filmed in a villa, this time the project was produced inside a TV studio, where a fully functional living environment was built: bedrooms, a living room, a kitchen, and shared spaces.

The duration was 3 months. Initially commissioned for 2 months, the project was extended following strong ratings. The show aired daily at 23:00 on Rustavi 2, alongside a near-continuous YouTube livestream.

From the technical side, the project was managed by two Technical Operation Managers — Oleg Mikhonosha and Valeriy Vergilesov. Their responsibility was maintaining the stability of the entire technical infrastructure 24/7.

The workload was organized in shifts: one day on-site by Oleg, the next by Valeriy. A typical working day was around 12 hours, approximately from 10:00 to 22:00. However, the schedule was adjusted depending on in-show events such as parties, competitions, live host segments, and unexpected tasks.

In addition to the TOMs, the production structure included three rotating technical teams, each comprising:

  • PTZ operator.
  • Shift director.
  • Sound engineer.
  • Editor.
  • Producer.
  • Two mobile operators with ENG cameras.

This setup ensured the continuous operation of the project with ongoing recording and live transmission.

Technical Operation Managers Oleg Mikhonosha and Valeriy Vergilesov.

What Defines Reality Shows — and What Stays the Same

From a technical perspective, Oleg Mikhonosha argues that most reality shows are built on the same core model: “almost all reality formats are more or less identical.” At their core, they involve a group of participants and continuous observation of their daily lives and interactions.

The differences between projects are not defined by production principles, but by implementation parameters:

  • Location scale.
  • Number of cameras.
  • Size of the technical team.
  • Budget level.
  • Broadcaster requirements.

Reality shows are often perceived as confined to a single location, but this is not a strict rule. In many productions, the main set is combined with field shooting: ENG crews capture individual segments, competitions, or activities outside the primary location. This adds complexity to both production and technical architecture, but does not change the core principle — continuous observation.

Optimal Number of Participants

From a production efficiency standpoint, the optimal number of participants is around 16.

“This is the level at which editors and loggers can realistically track everyone. If there are more participants, attention becomes diluted — both for the team and the audience.”

Increasing the number of participants leads to higher editing workload, reduced narrative focus, and weaker audience engagement. Reality production is not just about capturing footage — it is about maintaining viewer focus on specific characters.

Post-Production Logic: Logging, Editing, and Broadcast Delay

One of the key roles in reality production is the logger — a specialist who tracks events in real time. They maintain a log (typically in Excel), marking participant actions with precise time code.

Example:

10:00 — conflict between participants.
12:35 — discussion of voting.
18:20 — emotional moment.

These logs form the foundation of the editorial structure. Based on them, the editor builds the narrative of the episode.
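The logging workflow described above is essentially an append-only, timecoded event table. A minimal sketch of such a log, with CSV standing in for the Excel sheet the loggers actually used (column names and sample entries are illustrative):

```python
import csv

LOG_FIELDS = ["timecode", "participants", "event"]  # illustrative columns

def log_event(path, timecode, participants, event):
    """Append one timecoded entry to the running day log."""
    with open(path, "a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=LOG_FIELDS)
        if f.tell() == 0:                  # fresh file: write the header row first
            writer.writeheader()
        writer.writerow({"timecode": timecode,
                         "participants": "; ".join(participants),
                         "event": event})

log_event("day_log.csv", "10:00:00:00", ["A", "B"], "conflict between participants")
log_event("day_log.csv", "12:35:00:00", ["C", "D"], "discussion of voting")
```

Because every entry carries a timecode, the editor can jump straight from a log row to the matching point in the recorded media.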

“I always give credit to the editors. They assemble a full episode in less than 24 hours.”

In a low-budget daily production model, a delay of approximately 24 hours is typical. Footage is recorded during the day, edited overnight, and broadcast the following day. The YouTube stream may run in a limited format — around 8–10 hours per day. Key dramatic moments are often excluded from the livestream to preserve suspense for the TV broadcast. This is not a technical limitation, but a programming strategy — managing audience engagement.

“In large-scale reality shows, the delay can be several months. For example, Paradise Hotel in Mexico — filmed in November, aired in April.”

Such formats typically involve:

  • Large camera setups (60+ cameras).
  • Multi-layered editing workflows.
  • Full post-production processing.
  • Narrative restructuring of the material.

This is no longer daily television, but a large-scale production with a long production cycle.

Storytelling: Improvisation or Script?

Storytelling is a central question in the reality format. According to Oleg Mikhonosha, most productions are based on real events, but the level of intervention depends on the country and its television culture.

“In some markets, producers actively amplify conflicts and introduce additional dramatic elements. This increases the level of action and helps hold audience attention.”

In this project, the participants were not random individuals but public figures — bloggers, vocalists, and media personalities. This inherently created tension and delivered ratings without the need for artificial conflict enhancement. Casting well-known participants is a strategic decision: the audience engages faster compared to shows with unknown personalities that need to be developed over the course of a season.

The Role of the TOM in Low-Budget Reality: Reliability and Risk Management

The primary function of the Technical Operation Manager was to ensure system reliability. Responsibilities included:

  • Monitoring video signal stability.
  • Maintaining PTZ camera operation.
  • Rapid troubleshooting.
  • Rebooting equipment in case of system freezes.
  • Coordinating technical teams during incidents.

The project relied on Panasonic PTZ cameras operating in near-continuous mode.

“Under continuous operation for more than a week, Panasonic PTZ cameras can occasionally lose autofocus. This is a typical technical behavior under prolonged load. The solution is simple — a hard reboot. A full camera restart. But operators usually don’t handle this — it falls under the responsibility of the technical manager.”

These “invisible” technical processes are what ensure stable broadcast operations in a format where recording never stops.
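The “hard reboot” described here can be scripted rather than done by hand. Panasonic AW-series PTZ cameras expose an HTTP CGI control interface; the sketch below power-cycles one camera through it, assuming the AW command set's #O0 (standby) / #O1 (power on) commands. The IP address and cooldown value are hypothetical:

```python
import time
import urllib.request

def aw_cmd_url(ip: str, cmd: str) -> str:
    """Build a Panasonic AW-series CGI command URL ('#' is URL-encoded as %23)."""
    return f"http://{ip}/cgi-bin/aw_ptz?cmd=%23{cmd}&res=1"

def power_cycle(ip: str, cooldown_s: float = 10.0) -> None:
    """Hard-restart one camera: standby (#O0), wait, then power back on (#O1)."""
    urllib.request.urlopen(aw_cmd_url(ip, "O0"), timeout=5)
    time.sleep(cooldown_s)
    urllib.request.urlopen(aw_cmd_url(ip, "O1"), timeout=5)

# power_cycle("10.0.0.21")  # hypothetical camera address; not executed here
```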

The project fell into the category of low-budget production, which meant:

  • Limited equipment inventory.
  • Minimal redundancy.
  • Strict shift scheduling.
  • High dependency on human factors.

In such conditions, the role of the TOM becomes critical. Without backup systems, even a short failure directly impacts both broadcast and streaming.

Equipment and Rental Model

The equipment for the project was rented by the broadcaster from NEP. In addition to technical oversight, Oleg’s responsibilities included:

  • Negotiating the technical configuration.
  • Selecting equipment models.
  • Aligning with budget constraints.
  • Adapting technical solutions to fit the project’s financial framework.

Equipment selection was driven by three main factors:

  • Project requirements.
  • Team experience.
  • Budget limitations.

“There are different camera models with varying sensitivity, dynamic range, and performance in infrared mode. Ideally, you could deploy a higher-end setup, but the budget sets clear limits. So you have to find the right balance.”

As a result, the team built a working configuration that ensured operational stability, met broadcast requirements, and stayed within the project’s financial constraints.

Panasonic vs Sony: Reliability, Cost, and Product Evolution

A logical question: if Panasonic PTZ cameras have limitations under prolonged operation, why choose this brand? According to Oleg Mikhonosha, the issue is isolated and does not affect overall system reliability:

“Panasonic only has one minor issue — during very long continuous operation, autofocus can drift. Everything else works reliably. For reality production, it’s one of the most dependable options.”

Today, the PTZ market for broadcast reality is effectively split between Panasonic and Sony. These two manufacturers provide:

  • Stable IP control.
  • Reliable mechanical performance.
  • Consistent color reproduction.
  • Integration into studio workflows.

The main reason for choosing Panasonic is project economics. Sony’s latest-generation PTZ cameras (FR series and the newer AM7) are priced in the range of approximately €12,000–14,000 per unit. Panasonic’s UE series in comparable configurations is around €5,000–6,000.

With a setup of 14 cameras, the difference becomes significant:

  • 14 Panasonic cameras ≈ €70,000–84,000.
  • 14 Sony cameras ≈ €168,000–196,000.

That is a difference of more than €100,000 on the video layer alone.

These figures refer to equipment cost, not rental pricing. However, rental rates are directly influenced by equipment value, making this a critical factor for low-budget productions.

Sony spent a long period without updating its PTZ lineup, followed by a sharp transition to the FR series with interchangeable lenses. This represents a technological leap — but also a price jump. The newer AM7 series, introduced about a year ago, also sits in a high price segment. For large international productions, this is justified. For local reality formats, not always.

The PTZ Market: Control as the Bottleneck

According to Oleg, the main challenge is not optics or sensors:

“The biggest issue is control. It’s very difficult to build software that allows fast and intuitive operation of more than 10 cameras simultaneously.”

In a reality show environment, operators must instantly:

  • Switch presets.
  • Reframe shots.
  • Adjust exposure.
  • Manage multiple locations.

If the interface is overloaded or poorly designed, reaction time drops. This is why Panasonic and Sony remain dominant — they offer mature control ecosystems. Interest in PTZ continues to grow across the industry. At IBC and other international trade shows, nearly every manufacturer presents PTZ solutions. However, not all are ready for real-world multi-camera reality production.
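The speed problem described above is ultimately a fan-out problem: one operator action must reach many cameras at once. A minimal sketch of that fan-out using Panasonic's AW CGI protocol, where #R00 recalls preset 1, #R01 preset 2, and so on (camera names and IP addresses are hypothetical):

```python
# Hypothetical camera map: room name -> IP on the PTZ control network
CAMERAS = {"kitchen": "10.0.0.11", "living": "10.0.0.12", "bedroom1": "10.0.0.13"}

def preset_cmd(ip: str, preset: int) -> str:
    """Panasonic AW preset recall: #R00 = preset 1, #R01 = preset 2, ..."""
    return f"http://{ip}/cgi-bin/aw_ptz?cmd=%23R{preset - 1:02d}&res=1"

def recall_everywhere(preset: int) -> dict[str, str]:
    """One operator action -> one command per camera.
    A real controller would fire these HTTP GETs concurrently;
    here we only build the URLs."""
    return {name: preset_cmd(ip, preset) for name, ip in CAMERAS.items()}
```

With more than 10 cameras, the hard part is not building these commands but presenting them to the operator quickly — which is exactly where interface design becomes the bottleneck.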

For manufacturers, the key factors today are not only technical specs, but also:

  • Integration into IP-based infrastructure.
  • Simplified user interfaces.
  • Flexible APIs.
  • Stability under 24/7 operation.

Oleg notes that he has developed a clear set of technical requirements for Sony PTZ systems that could make them even more suitable for production environments.

PTZ Experience: Since 2008

PTZ-based production for reality shows is not new for the team. The first project of this kind dates back to 2008 with the format `Hell’s Kitchen`. At that time, Sony BRC series cameras (BRC300 / BRC400) were used. Control was less convenient, but the scale was significantly larger.

In 2008, approximately 30 PTZ cameras were deployed on set. In the current project, the setup included 14 Panasonic UE70 PTZ cameras and 2 broadcast cameras. The reduction in camera count is explained by a smaller location and tighter budget constraints.

Technical Architecture: Cameras, Control Room, and Signal Delivery

The technical configuration was highly pragmatic. As a low-budget reality show, the equipment was selected based on reliability, availability, and ease of maintenance. The video system was built around Panasonic UE70 PTZ cameras. Control was handled via the standard Panasonic RP120 controller. Signal routing was based on the Blackmagic Constellation 2 M/E switcher.

Recording was managed through Metus, with storage on a Synology system with 80 TB capacity. The control room followed a practical European model, adapted for budget production.

The monitoring setup included:

  • Multiviewers with all camera feeds.
  • A quad view with recording streams.
  • Return feed from the broadcaster.

The return feed was required because the project periodically went live on air. As a result, the team simultaneously monitored internal recording, the YouTube stream, and the television broadcast.

The signal from the set was transmitted via fiber to the central routing facility. Further packaging into RTMP or SRT was handled by the broadcaster’s infrastructure. The responsibility of the on-site technical team was to ensure stable signal delivery without loss or interruption.

Signal Architecture, Recording, and Delivery to Air

The video signal path started from the PTZ cameras, passed through routing, AUX selection, and parallel distribution for recording and output.

The logic was straightforward:

  • One stable base signal chain.
  • Minimal signal conversions.
  • Multiple storage points to keep at least one week of content readily accessible.

Recording was done via Metus in XDCAM 50, 1080i50, with three recording channels running simultaneously.

Content was stored in multiple locations:

  • On the Metus system.
  • On the Synology storage array.
  • On a mirrored backup storage system.

This approach reduced dependency on a single node and allowed the production to continue through technical incidents without interruption.

Synchronization was kept simple. LTC timecode was used as the base, fed into Metus and simultaneously into a separate audio recorder. Over the course of the season, rare desynchronization events of a few frames were observed, attributed not to timecode issues but to buffering on specific computers. The standard solution was a quick system reboot.
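A desync of “a few frames” is easy to quantify when every device reads the same LTC feed. A small sketch, assuming 25-frame timecode (matching the 1080i50 recording format):

```python
FPS = 25  # 1080i50 -> 25-frame LTC

def tc_to_frames(tc: str) -> int:
    """Convert HH:MM:SS:FF timecode to an absolute frame count."""
    h, m, s, f = (int(x) for x in tc.split(":"))
    return ((h * 60 + m) * 60 + s) * FPS + f

def drift_frames(tc_a: str, tc_b: str) -> int:
    """Offset between two recorders that should be reading identical LTC."""
    return tc_to_frames(tc_a) - tc_to_frames(tc_b)

drift_frames("10:00:00:12", "10:00:00:09")  # a 3-frame desync
```

Comparing the timecode burned into two recordings of the same moment immediately shows whether the drift came from the LTC chain (constant offset) or from buffering on one machine (offset that appears mid-session).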

Network and Storage: Functional Segmentation and the “Hot Window”

The network was segmented by function.

  • PTZ control operated on a 100 Mbps network.
  • No NDI or RTMP streams were transmitted from the set.
  • A separate 10 Gb network was used for recording and storage.

Within the same environment:

  • A 10 Gb fiber connection linked the set to post-production.
  • Workstations had 10 Gb access to shared storage.

No redundancy was implemented for network links or switches. The strategy prioritized simplicity and maintainability: in a budget model, it is often faster to restore a failed node than to maintain complex redundancy systems.

Daily data volume averaged approximately 1.6–1.8 TB. The storage strategy included:

  • At least one week of “hot” storage across three locations.
  • Regular cleanup after the hot window filled up.
  • A retention period of around one month on primary storage.

This was sufficient to support editing cycles and re-edits within the production workflow.
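The figures above can be cross-checked from the recording format alone: XDCAM 50 carries roughly a 50 Mbit/s video essence, and three channels running around the clock land almost exactly at the lower end of the stated 1.6–1.8 TB daily range (audio and container overhead account for the rest). A back-of-the-envelope sketch:

```python
MBIT = 1_000_000
channels = 3                      # simultaneous Metus recording channels
bitrate_bps = 50 * MBIT           # XDCAM 50 video essence
seconds_per_day = 24 * 3600

tb_per_day = channels * bitrate_bps * seconds_per_day / 8 / 1e12   # ~1.62 TB/day

hot_week_tb = tb_per_day * 7 * 3     # one "hot" week, mirrored across three locations
month_primary_tb = tb_per_day * 30   # ~1-month retention on primary storage

print(round(tb_per_day, 2), round(hot_week_tb, 1), round(month_primary_tb, 1))
```

Both retention tiers fit comfortably inside the 80 TB Synology array, which is why the simple three-location strategy worked without dedicated archive hardware.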

Audio and Control: Tablet Interface, Monitoring, and Night Sound

The show featured 16 participants, but the number of audio channels was significantly higher. In addition to personal lavalier microphones, each room was equipped with ambient microphones. Sennheiser MKH50 microphones were used.

Night scenes presented a separate technical challenge. When participants went to sleep, filming continued in infrared mode. At that point, lavalier microphones often lost power or were switched off. The primary audio in these situations was captured by the fixed ambient microphones. For reality production, this is critical, as key dramatic moments often occur outside structured daytime activity.

An additional layer of control was provided by a system developed by a Danish engineer, designed to manage both cameras and audio paths. In essence, this was an interface layer built on top of standard hardware.

The PTZ operator worked not only with the RP120 controller, but also with a tablet displaying visual presets. For example:

Camera 5.
Preset 1 — Wide shot.
Preset 2 — Medium shot.
Preset 3 — Close-up.

In the standard RP120, presets are triggered by buttons without visual references, requiring memorization. The tablet interface added visual clarity and significantly increased speed, which is essential in a reality format where reactions must be instantaneous. The same logic applies to audio control. Producers and editors used tablets with participant images. To route a specific participant to headphones, it was enough to tap on their photo. This simplified workflow reduced the risk of errors.
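Under the hood, the tap-a-photo workflow is a lookup from participant to audio channel plus a routing command. A minimal sketch of that mapping (participant identifiers, channel numbers, and the command string are all hypothetical; the real system drove the console hardware):

```python
# Hypothetical mapping: participant tile on the tablet -> wireless mic channel
MIC_CHANNEL = {"participant_01": 1, "participant_02": 2, "participant_03": 3}

monitored = []   # channels currently routed to the producer's headphones

def on_photo_tap(participant: str) -> str:
    """Translate a tap on a participant's photo into a monitor-routing command."""
    ch = MIC_CHANNEL[participant]
    if ch not in monitored:
        monitored.append(ch)
    return f"route ch{ch} -> producer_phones"
```

The value of the tablet layer is exactly this indirection: the producer thinks in faces, not channel numbers, and the software does the translation.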

Audio, RF, and Control: Ensuring 24/7 Continuity

From an audio perspective, the project was more complex than the number of participants might suggest. At peak, the system handled:

  • 16 wireless channels.
  • 10 ambient channels.

The core audio infrastructure included:

  • Wisycom wireless systems.
  • Yamaha QL1 mixing console.

RF coordination was a critical area and was managed centrally.

The primary setup included:

  • Wisycom MRK-960 receivers.
  • Wisycom MTP-40 transmitters.

Transmitters are rated up to 50 mW, but the project operated at 30 mW as a standard working level. The frequency range was 470–560 MHz. Intermodulation was calculated using Wisycom's standard software, which generated a stable frequency plan and ensured predictable performance in a dense RF environment.
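The check the manufacturer's coordination software performs can be illustrated in miniature. For any two carriers f1 and f2, the dominant third-order intermodulation products fall at 2·f1 − f2 and 2·f2 − f1; a frequency plan is usable when no carrier sits on top of such a product. A simplified sketch (the guard band and example frequencies are illustrative; real tools also handle higher-order products and external signals):

```python
def im3_products(freqs_mhz):
    """Third-order intermod products for every transmitter pair: 2*f1 - f2."""
    return {2 * a - b for a in freqs_mhz for b in freqs_mhz if a != b}

def plan_is_clean(freqs_mhz, guard_mhz=0.3):
    """True if no carrier falls within the guard band of any IM3 product."""
    return all(abs(f - p) >= guard_mhz
               for f in freqs_mhz for p in im3_products(freqs_mhz))

plan_is_clean([470.0, 470.8, 472.0])   # well spread: products miss the carriers
plan_is_clean([470.1, 470.5, 471.1])   # products land next to carriers: rejected
```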

Additionally:

  • Two in-ear transmitters were used for presenters.
  • Their frequency plan was coordinated to avoid interference with the broadcaster.

This was particularly important as the studio was located approximately 50 meters from active broadcast infrastructure.

Control and monitoring for producers were not limited to hardware. The project implemented a system-level control interface that standardized operational modes:

  • Day mode.
  • Evening mode.
  • Night mode.
  • Infrared mode.

Each mode could be applied individually to every camera. Shading and exposure adjustments were handled via the Panasonic controller, integrated into the overall control logic. Tally was intentionally disabled so that participants would not know which camera was live.
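Conceptually, each operational mode is a named bundle of camera settings that can be applied to any single camera. A schematic sketch (the settings and values are illustrative, not the project's actual shading presets):

```python
# Illustrative per-mode settings: gain in dB, plus IR-cut filter position
MODES = {
    "day":      {"gain_db": 0,  "ir_filter": "in"},
    "evening":  {"gain_db": 6,  "ir_filter": "in"},
    "night":    {"gain_db": 12, "ir_filter": "in"},
    "infrared": {"gain_db": 12, "ir_filter": "out"},  # IR-cut removed for IR shooting
}

camera_state = {}  # camera name -> currently applied settings

def apply_mode(camera: str, mode: str) -> dict:
    """Apply one operational mode to a single camera (state only, no device I/O)."""
    camera_state[camera] = dict(MODES[mode])
    return camera_state[camera]
```

Keeping modes as data rather than hand-dialed settings is what makes it practical to move 14 cameras between day, night, and infrared states without per-camera mistakes.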

Night and Infrared Workflow: “We See Everything Without Changing Behavior”

Night scenes were treated as a separate technical layer. Filming continued in infrared mode: IR lamps were used in the rooms, and cameras switched to infrared operation via internal filters, enabling full image capture while maintaining complete darkness for participants.

Incidents, Maintainability, and Growth Areas

In daily operation, failures were rarely related to major systems. The most frequent issues involved components that are constantly handled — microphone mounts and connectors. Soldering and restoring these elements became routine tasks.

Among system-level issues, occasional desynchronization was observed. The fix itself was simple, but it was constrained by production logic: intervention could only take place after a scene was completed, in order to preserve continuity for both recording and editing.

If the budget were increased by 20 percent, the priorities would be:

  • Upgrade from Panasonic UE70 to UE100 for improved sensitivity and lower noise in low-light scenes.
  • Replace the Wisycom MRK-960 with the MRK-16 to significantly reduce hardware footprint and failure points.
  • Increase the number of cameras, as a single camera per room limits directing flexibility and makes participants more “aware” of framing.

What Defines the Stability of a Reality Production

Regardless of country or budget, reality production is built on three core elements:

  • A controllable multi-camera system.
  • Efficient logging and editorial workflows.
  • A well-designed broadcast delay model.

Technology ensures recording. Storytelling is created in the edit. This is where technical and creative processes intersect.

A low-budget reality show can be successfully executed on a relatively compact technical base, provided there is:

  • An experienced engineering team.
  • Properly selected equipment.
  • A well-organized control room.
  • A manageable signal architecture.

The key factor is not the scale of the equipment, but the competence of technical management.

  • Reality production does not require “exotic” equipment.
  • The balance between cost and functionality is critical.
  • The role of the TOM is central in a budget model.
  • PTZ remains the optimal solution for 24/7 formats.

“In reality shows, it’s not the most expensive equipment that wins — it’s the right technical architecture and the experience of the team.”

 
