There is opportunity in the wireless industry for new power solutions for the cell site architectures deployed today. In particular, improvements can be made in powering remote radio units (RRUs) more efficiently and cost-effectively. When discussing this opportunity in a previous blog post, I used the term “voltage drop.” Here I will explain the term more clearly, along with the central role it plays in the challenge of supplying power efficiently to RRUs.

Voltage drop is a key consideration for radio planners and plays a central role in the challenge of supplying power efficiently to RRUs at cell sites. Voltage drop is the amount of electric voltage lost as current moves through a circuit, largely to the resistance of the conductors. It’s similar to the concept of signal loss in the RF path: by the very nature of moving through a medium, some of the signal is lost. Voltage itself can be thought of as water pressure in a hose; the longer the hose, the weaker the pressure at the far end because of friction and other losses along the way.

The challenge with voltage drop in RRU deployments relates to the medium used to deliver the electricity, typically copper power cable. At most cell sites, power is supplied at -48 volts of direct current (VDC). The minimum input voltage for most RRUs is typically around -38 VDC, which means the maximum allowable voltage drop from the power supply to the RRU is 10 volts (the difference between -48 and -38).
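The budget arithmetic above is simple enough to sketch in a few lines of Python. The -48 VDC and -38 VDC figures come from the post; everything here works with voltage magnitudes.

```python
# Voltage-drop budget: how much drop the cable is allowed to cause.
# Figures from the post: -48 VDC supply, ~-38 VDC minimum RRU input.
supply_voltage = 48.0   # magnitude of the -48 VDC site supply
min_rru_input = 38.0    # magnitude of the typical minimum RRU input

max_voltage_drop = supply_voltage - min_rru_input
print(f"Maximum allowable drop: {max_voltage_drop:.0f} V")  # 10 V
```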

But a network operator doesn’t want to use a cable that actually drops the full 10 volts. They want a buffer so that the voltage arriving at the RRU never falls below its minimum, regardless of fluctuations in the power supply or the load. So they want a power cable that drops only 6 or 7 volts, not the full 10.
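One way to see what that buffer buys is to turn the budget into a maximum cable length. This is an illustrative sketch, not a planning tool: the 10 A load current and the approximate 8 AWG copper resistance are assumed figures, and real designs would also account for temperature and connector losses.

```python
# Illustrative only: longest one-way run that stays inside a 6 V
# design budget. Load current and conductor resistance are assumptions.
budget_v = 6.0                   # design budget instead of the full 10 V
current_a = 10.0                 # hypothetical RRU load current
r_ohm_per_ft = 0.628 / 1000.0    # approx. 8 AWG copper, ohms per foot

# Current flows out and back, so the loop is twice the one-way length.
max_one_way_ft = budget_v / (current_a * r_ohm_per_ft * 2)
print(f"Max one-way run: {max_one_way_ft:.0f} ft")  # roughly 478 ft here
```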

The way to reduce voltage drop is to use a heavier-gauge cable with a thicker copper conductor. That works to deliver the right amount of power to the RRU, but thicker copper cable is, of course, more expensive: more material is being used, it costs more to ship, and it is more difficult to install.
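The gauge trade-off can be made concrete with standard approximate resistances for solid copper at 20 °C. The load current and run length below are assumptions chosen for illustration; only the per-1000-ft resistance figures are standard values.

```python
# Sketch (assumed load and length): voltage drop vs. conductor gauge.
# Resistances are standard approximate values for copper at 20 C.
OHMS_PER_1000FT = {"10 AWG": 0.999, "8 AWG": 0.628, "6 AWG": 0.395}

current_a = 10.0           # hypothetical RRU load current
one_way_ft = 300.0         # hypothetical run from shelter to RRU
loop_ft = 2 * one_way_ft   # both conductors carry the current

for gauge, r in OHMS_PER_1000FT.items():
    drop = current_a * (r / 1000.0) * loop_ft  # V = I * R
    print(f"{gauge}: {drop:.2f} V drop")
```

With these assumptions, 10 AWG drops about 6 V over the run while 6 AWG drops under 2.5 V, which is exactly the copper-versus-drop trade the post describes.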

What’s the solution?

One solution, when the voltage drop in the power cable would be too high, is to deliver higher voltage and therefore lower current. We have seen 120 volts of alternating current (VAC) used, converted back down to -48 VDC at the RRU by an AC/DC converter. This works, but it is very inefficient: converting DC to AC in the shelter with a power inverter and then back to DC at the RRU can generate as much as 15 percent additional power loss per RRU.
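The reason higher voltage helps is that cable loss scales with the square of the current. A rough sketch, with an assumed RRU power draw and loop resistance (the 15 percent conversion-loss figure is the one cited above):

```python
# Why a higher-voltage feed cuts cable loss: for a fixed power draw,
# I = P / V, and cable loss is I^2 * R. Figures are illustrative.
rru_power_w = 480.0          # hypothetical RRU power draw
cable_resistance_ohm = 0.5   # hypothetical loop resistance

for volts in (48.0, 120.0):
    current = rru_power_w / volts
    cable_loss = current ** 2 * cable_resistance_ohm
    print(f"{volts:g} V feed: {current:.1f} A, {cable_loss:.1f} W lost in cable")

# But the DC->AC->DC round trip can cost up to ~15% per RRU (per the post):
conversion_loss = 0.15 * rru_power_w
print(f"Up to ~{conversion_loss:.0f} W potential conversion loss")
```

With these numbers the 120 V feed cuts cable loss from 50 W to 8 W, but the conversion penalty can eat that saving and more, which is the inefficiency the post calls out.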

When it comes to the power cable itself, operators have two options:

1. Use cable types of different gauges to meet voltage drop requirements for different installations. The benefit is that the right cable can be deployed for the power needs and the distance to the RRU. But it is a much less efficient system to stock, limits economies of scale and complicates roll-out schedules.
2. Use one cable gauge for all installations. This method is simpler to stock and deploy than the above approach, but some sites will not need the conductor diameter of the cable deployed. The amount of copper used is overkill in many installations, meaning costs are higher than they need to be.

None of these approaches is ideal in terms of efficiency and standardization. The industry is still waiting for a solution that delivers the right amount of power to RRUs in an efficient manner. Stay tuned.