Distribute tuner 2 out plus antenna

carter69

Active SatelliteGuys Member
Original poster
Jul 8, 2005
16
0
I currently have tuner 2's output from a ViP622 going upstairs into a 3-way splitter feeding 3 rooms on channel 75. That works great.

What I would like to do is add an antenna in my attic. To minimize running new cable to the 3 TVs upstairs and the TV/ViP622 downstairs, I was wondering if I could utilize the existing cable that I am using right now from tuner #2.

What types of splitters or combiners might be needed?
 
If you run the line to the back of the 622, use a standard splitter with one side going into the OTA antenna input of the unit, then use another standard splitter in reverse to combine the second side of the OTA coax with the RF output. That should do it. Make sure you have the receiver set to analog, but you will have to change the TV setting to analog as well, and change the channel number set on the 622 to match the two up, as analog will not let you use a channel number higher than 69.
 


This process makes you a mini TV station, as you are coupling the TV2 output into the antenna. Yes, the power level is pretty low, but it's both technically illegal and can also let your neighbors see what you are watching. :eek:

You can fix this by putting either a filter (one that notches out the TV2 output frequency) or an amplifier between the antenna and the splitter it's connected to (or an isolator, but those can be hard to find at TV frequencies). The amplifier works because it only has gain in one direction--in the opposite direction it has a large loss. If you use the amplifier, you need to be a bit careful that the output signal isn't so large that it overloads your TV inputs--this can be fixed with an attenuator (after the amplifier) if necessary.
 

You couldn't pick that signal up 3 feet away from the receiver with an antenna anyway. The neighbors aren't going to see anything.
 

This is an interesting assertion. Have you ever tried this? Let's take a look at this from a theoretical perspective, to see if it makes sense.

If you assume an output level from the DVR of 20 dBmV (or about -27 dBm), loss in the cable to the antenna of 10 dB, and loss in the splitters of about 10 dB, you get -47 dBm at the antenna.

If you then assume combined (transmit & receive) antenna gains of 6 dB, receiver sensitivity of -90 dBm (which requires a noise figure of about 17 dB--with a good preamp the sensitivity could be -104 dBm or lower), and feedline loss at the neighbor of another 10 dB (but no feedline loss at the neighbor with the preamp), you could withstand 39 dB of loss to the neighbor (or 63 dB to a neighbor with a preamp).
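For anyone who wants to follow the arithmetic, the link budget above can be sketched in a few lines of Python. Every input number here is the post's stated assumption, not a measurement:

```python
# Link-budget sketch using the assumed values from the post above.

dvr_out_dbm = -27.0        # DVR RF output: 20 dBmV, about -27 dBm in a 75-ohm system
cable_loss_db = 10.0       # loss in the coax run to the antenna
splitter_loss_db = 10.0    # combined splitter losses

power_at_antenna_dbm = dvr_out_dbm - cable_loss_db - splitter_loss_db
print(power_at_antenna_dbm)        # -47.0 dBm at the antenna

antenna_gains_db = 6.0             # transmit + receive antenna gains combined
neighbor_feedline_db = 10.0        # neighbor's feedline loss (no-preamp case)
sensitivity_dbm = -90.0            # receiver sensitivity without a preamp
sensitivity_preamp_dbm = -104.0    # sensitivity with a good preamp

# Allowable path loss = power at antenna + gains - feedline loss - sensitivity.
margin_db = power_at_antenna_dbm + antenna_gains_db - neighbor_feedline_db - sensitivity_dbm
print(margin_db)                   # 39.0 dB of allowable path loss

# With a preamp right at the neighbor's antenna, no feedline loss applies.
margin_preamp_db = power_at_antenna_dbm + antenna_gains_db - sensitivity_preamp_dbm
print(margin_preamp_db)            # 63.0 dB of allowable path loss
```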

The Free Space Loss Calculator shows that at channel 3, 39 dB of loss is about 0.02 miles, or about 100 feet. For 63 dB of allowable loss, it's about 0.34 miles. At TV channel 60, the distances are 10 and 150 feet.
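Those distances can be sanity-checked with the standard free-space loss formula. The carrier frequencies below (channel 3 ≈ 61.25 MHz, channel 60 ≈ 747.25 MHz, the NTSC visual carriers) are my assumptions; the calculator may use slightly different values:

```python
import math

def fspl_db(distance_m: float, freq_mhz: float) -> float:
    """Free-space path loss in dB (Friis formula; distance in meters)."""
    return 20 * math.log10(distance_m / 1000) + 20 * math.log10(freq_mhz) + 32.45

def distance_for_loss(loss_db: float, freq_mhz: float) -> float:
    """Distance in meters at which free-space loss reaches loss_db."""
    km = 10 ** ((loss_db - 32.45 - 20 * math.log10(freq_mhz)) / 20)
    return km * 1000.0

FEET_PER_METER = 3.28084
for label, mhz in (("channel 3", 61.25), ("channel 60", 747.25)):
    for loss in (39.0, 63.0):
        d_ft = distance_for_loss(loss, mhz) * FEET_PER_METER
        print(f"{label}: {loss} dB of free-space loss at about {d_ft:.0f} feet")
```

The results land close to the figures quoted above: roughly 100 feet and 0.34 miles at channel 3, and roughly 10 and 150 feet at channel 60.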

So if you are using TV channel 60, and your neighbors don't have preamps on their antennas, no one is going to see anything. Or if you all have directional antennas, and they are never pointed at one another, you might be okay (depending on the antenna patterns). But if your neighbors have good preamps, and you are using a signal at channel 3, people as far as 1/3 of a mile could see what you are watching (or somewhat less, depending on the antennas used, and how they are pointed).
 
My experience has been that the signal is greatly weakened going from one output to the other output on the splitter (IN to OUT or OUTs to IN are fine), but even so I can pick up a very weak signal on my portable TV.
 
carter,
I tried what you are suggesting, and I could not get the receiver to see any of the OTA channels. Too many splitters, too much signal loss.

I put a Radio Shack booster inline with the OTA input, and it worked. I personally didn't like using that, so I just ran a separate cable from the antenna, which was its own pain in the arse.

My picture quality on TV2 is a bit degraded as well, I think because of the increased number of splits.
 
I combined the OTA and TV2 out into one cable, then split it out to 2 TVs. Not the best quality, but it works. I have to save up for a second receiver.
 

If Dish receivers were capable of broadcasting the TV2 signal over the air, I wouldn't waste the time running RG6 through the house. I've tried it; it doesn't work.
 
If you then assume combined (transmit & receive) antenna gains of 6 dB, ...
It is a fallacy to combine two unrelated factors to manufacture a new argument. Conventional rooftop TV antennas have very poor transmission properties.

Is there more leakage than acceptable? Probably, but you're not spewing broadband like a bad CATV connection does; just two channels that don't conflict with the existing OTA environment.

Is there a better way? I'd suggest putting a unidirectional amplifier on the antenna cable coming into the initial two-way splitter. In theory, this would substantially limit the backfeed and have a side benefit of providing more signal to work with.
 
