In just the past few years, the broadcast
environment has evolved rapidly. The move to fiber-based transport and
server-based playback systems has added a layer of complexity that did
not exist when mechanical tape machines delivering analog video over
coax were the norm.
In the past, “master control” was run through RS-422 machine-control
automation hardwired to a central control station, usually located in the
middle of the facility. The “on-air” signal itself was delivered over coax
cable and was also switched at the operations center. Signals from satellite
feeds, remote sites and microwave links were routed and switched as required
to keep the newsroom, sports department and entertainment division
operational. The master control operator watched the equipment and knew what
changes were being made within the environment because they were physically
located in the same space.
The introduction of digital formats made server-based
storage the preferred method of signal delivery. Other traditional broadcast
studio devices began using computer protocols for automation as well, and
RS-422 machine control gave way to IP. There was one problem: the operator
could no longer see physical changes being made to the system. Without proper
monitoring, changes made at a remote location could have serious detrimental
effects on scheduled programming.
Broadcast facilities are expanding because of the
need for server storage and the proliferation of automated and controllable
devices. The industry standards body, the Society of Motion Picture and
Television Engineers (SMPTE), is finalizing an IP standard for field
production devices such as cameras and lenses that will allow a single
Category 6 cable to carry high-definition video/audio and power in a mobile
application. Who would have thought a true plug-and-play camera for broadcast
applications would be possible?
The proliferation of digital formats and control
allows major networks with newsrooms and studios in various locations to
connect via IP networks. This lets them easily transport data (video) and
control remote devices as if they were located within the prime facility. For
remote applications, it ensures that a device can be brought online and
documented.
An Automated Infrastructure Management (AIM)
system provides real-time visibility into physical-layer connectivity. It
assures the operator that the path required for the on-air signal is available
and functional. If a path is not available as requested, an AIM system
expedites diagnosis and minimizes troubleshooting time.
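As an illustration only (not the Quareo API), the sketch below models physical-layer connectivity as a simple set of documented patch links and walks a requested on-air path end to end, reporting the first missing segment so troubleshooting can start at the right rack. All port names and link data are hypothetical.

```python
# Hypothetical sketch of an AIM-style path check.
# The link records and port names are invented for illustration;
# a real AIM system learns this state from managed connectivity hardware.

# Documented physical-layer links: (port A) <-> (port B)
links = {
    ("playout-srv-01/out1", "patch-A/p04"),
    ("patch-A/p04", "router-1/in12"),
    ("router-1/out3", "patch-B/p07"),
    # ("patch-B/p07", "master-control/in2"),  # segment currently unplugged
}

def normalize(link):
    # Treat links as bidirectional by sorting the two endpoints.
    return tuple(sorted(link))

connected = {normalize(l) for l in links}

def check_path(required_segments):
    """Verify every segment of the requested on-air path is patched.

    Returns (True, None) if the path is complete, otherwise
    (False, first_missing_segment) so a technician knows where to look.
    """
    for segment in required_segments:
        if normalize(segment) not in connected:
            return False, segment
    return True, None

on_air_path = [
    ("playout-srv-01/out1", "patch-A/p04"),
    ("patch-A/p04", "router-1/in12"),
    ("router-1/out3", "patch-B/p07"),
    ("patch-B/p07", "master-control/in2"),
]

ok, missing = check_path(on_air_path)
print("path ready" if ok else f"path broken at: {missing[0]} <-> {missing[1]}")
```

In this toy example the check stops at the unplugged segment and names both ports, which is the kind of real-time answer an AIM system gives the operator before a scheduled program goes to air.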
Devices are constantly being added to the system, and
the equipment database traditionally lags behind these additions. An AIM
system tracks these changes in real time, making them available online for
immediate recall. Another important capability provided by some AIM systems,
such as CommScope’s Quareo solution, is the ability to track the insertion
count of a fiber connector. Technicians new to fiber might not be aware that
fiber patch cords should be replaced after the recommended insertion count has
been reached. This is critical in the broadcast environment, where patching is
more frequent than in a traditional data closet. Identification of fiber
splice locations, rack elevation planning and bill-of-materials generation are
a few of the many other useful applications of an AIM system in the broadcast
facility.
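For illustration, here is a minimal sketch of the insertion-count idea. It assumes a hypothetical 500-mating-cycle replacement threshold (actual ratings vary by connector type and by the manufacturer’s guidance); it is not Quareo code, just a way to picture the bookkeeping an AIM system automates.

```python
from dataclasses import dataclass

# Hypothetical replacement threshold; real guidance depends on the
# connector type and the manufacturer's rated mating cycles.
RECOMMENDED_MAX_INSERTIONS = 500

@dataclass
class FiberPatchCord:
    label: str
    insertions: int = 0

    def record_insertion(self) -> None:
        """Increment the mating-cycle count each time the cord is patched."""
        self.insertions += 1

    @property
    def needs_replacement(self) -> bool:
        return self.insertions >= RECOMMENDED_MAX_INSERTIONS

# Example: a frequently repatched cord in a broadcast plant
cord = FiberPatchCord(label="PP-A/p04 <-> RTR-1/in12", insertions=498)
cord.record_insertion()
cord.record_insertion()

if cord.needs_replacement:
    print(f"{cord.label}: {cord.insertions} insertions - schedule replacement")
```

The point is not the arithmetic but the automation: because the AIM system already sees every patching event, the count is kept without anyone remembering to log it.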
During the upcoming NAB Show in Las Vegas (April 16-21, 2016), I will be discussing some of these issues.
Please join me on April 19, 2016, for my presentation “Why Manage the Physical Layer” at 3:00 pm PT. On the same day, I will be part of the
“Simplifying the Migration from Copper to Fiber” panel discussion at
5:00 pm PT.
I hope you will join me.