That’s the question posed by one of our customers. I thought it was an interesting one because it brings up a whole lot of questions. But first of all,
No you can’t.
A DIRECTV power inserter puts out 29 volts, and I don’t know of any amplified antenna that needs anywhere near that much. Voltage is one of several ways to describe electricity, and it’s easy to mix the terms up. This tutorial tries to succeed where others have failed. Without getting too far into the weeds, let’s say that the power supply determines how many volts are supplied, and the device determines how many amps it draws. If you supply far more volts than a device is designed to handle, you’ll burn it out.
Most amplified antennas need about 5 volts to do their jobs. A DIRECTV power inserter supplies nearly six times that, so connecting one will simply destroy the antenna’s amplifier.
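If you want to put numbers on that mismatch, here’s a quick sketch in Python. The 29-volt and 5-volt figures come from the discussion above; the exact tolerance of any particular antenna is an assumption.

```python
# Rough illustration of the voltage mismatch described above.
# 29 V from the power inserter vs. the ~5 V a typical amplified
# antenna expects: almost six times its rated supply, far outside
# any reasonable tolerance, so the amplifier would likely be destroyed.

antenna_rated_volts = 5.0   # typical amplified antenna supply (from the article)
inserter_volts = 29.0       # DIRECTV power inserter output (from the article)

overvoltage_ratio = inserter_volts / antenna_rated_volts
print(f"Overvoltage: {overvoltage_ratio:.1f}x the rated supply")  # 5.8x
```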
But the discussion is more interesting than that.
It brings up the fact that antennas don’t actually need power to work. At least they don’t need power from you to work. The person asking this question thought they did.
Amplified antennas need power only because the built-in amplifier does. The antenna element itself doesn’t, and many amplified antennas will still work without being plugged in. They just won’t amplify the signal.
This all happens because radio waves (and television and satellite broadcasts) are wireless electricity. Not a lot of electricity, but enough to do the job. It’s a freaky subject and there’s a lot of physics involved. I’ll try to make it simple to understand, but that also means my explanation won’t be 100% accurate to the nth degree, so don’t troll me.
Broadcasting is an interesting thing. You take a signal and amplify it millions of times over, until its power is measured in the hundreds of thousands of watts. Put all that electricity into a tall metal tower and it will actually leave the metal and jump into the air, heading off in all directions at roughly the speed of light.
As the signal travels, it spreads out in all directions, so the strength reaching any one point falls off with the square of the distance. And this happens fast. Within a few hundred feet of the tower, that 100,000-watt signal delivers just a few watts to your location. Miles away, it’s a few thousandths of a watt.
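To see how fast that drop-off happens, here’s a back-of-the-envelope sketch using the inverse-square law. It assumes the power spreads evenly in all directions; real broadcast antennas shape their pattern, so treat these as order-of-magnitude numbers only.

```python
import math

def power_density(p_watts: float, r_meters: float) -> float:
    """Watts per square meter at distance r, assuming the power
    spreads evenly over a sphere: S = P / (4 * pi * r^2)."""
    return p_watts / (4 * math.pi * r_meters ** 2)

p = 100_000.0                 # a 100,000-watt broadcast
for miles in (0.1, 1, 10, 50):
    r = miles * 1609.34       # miles -> meters
    print(f"{miles:>4} mi: {power_density(p, r):.2e} W/m^2")
```

Going ten times farther away cuts the power hitting each square meter by a factor of one hundred, which is why distance matters so much for reception.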
That extremely weak electrical signal doesn’t really “want” to travel through the air. If the signal hits something more conductive than air, it will travel into that instead. So, eventually that electrical signal gets to another metal pole — your antenna — where it is transferred and starts going down a metal wire.
This whole time, the electrical signal has kept some of its key properties. Even though it gets weaker, it still keeps the same qualities as the original signal. If the original signal had waves roughly 1 foot long, the weaker signal will too. If the original signal had digital data in it, the weaker signal will too.
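That “length” is the wavelength, and it’s set entirely by the broadcast frequency (wavelength = speed of light ÷ frequency). Frequency doesn’t change as a signal weakens, which is why the wavelength survives the trip. A quick sketch, using the rough edges of the US UHF TV band:

```python
# Wavelength = c / f. Frequency doesn't change as a signal weakens,
# so neither does wavelength. US UHF TV sits roughly at 470-608 MHz.

C = 299_792_458.0  # speed of light, m/s

def wavelength_ft(freq_mhz: float) -> float:
    meters = C / (freq_mhz * 1e6)
    return meters * 3.28084  # meters -> feet

print(wavelength_ft(470.0))  # ~2.1 ft
print(wavelength_ft(608.0))  # ~1.6 ft
```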
That’s the magic of broadcasting. By the time the signal gets through your antenna and down into your TV, it could be 1/10,000th of a watt. (For comparison, a typical LED light bulb uses around 10 watts.) That’s still strong enough for a tuner to decode what’s in it and show that signal on your TV.
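Numbers that small get awkward to write out, which is why RF engineers usually express signal power in dBm, decibels relative to one milliwatt. Here’s the conversion, applied to the figures above:

```python
import math

# dBm = 10 * log10(power in milliwatts)

def watts_to_dbm(watts: float) -> float:
    return 10 * math.log10(watts * 1000)

print(watts_to_dbm(1e-4))     # 1/10,000th of a watt -> -10.0 dBm
print(watts_to_dbm(100_000))  # a 100,000-watt transmitter -> 80.0 dBm
```

On this scale, a 90 dB difference between transmitter and receiver is a factor of a billion, all compressed into two small, readable numbers.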
You have the power
It all happens without a power source because the signal itself is the power source. In the earliest days of radio, people listened on crystal sets, and the broadcast signal alone was strong enough to drive a pair of headphones. Today we plug in our TVs and radios because we want loud sound and vivid pictures, which demand far more than the tiny amount of electricity in the broadcast signal.
It’s all pretty amazing, right?