I finally debugged why my cable service was so poor. Long story short, an inexplicable 7dB drop in the incoming line, a bad arrangement of splitters, and another unexplained 7dB drop someplace in the house.
Now for the long story:
My troubles started when Time Warner Cable required me to install mini cable boxes to see the full set of channels that I had purchased. I went from a slightly grainy picture to a clear picture on some channels and intermittent digital encoding artifacts on others. In most cases, slightly grainy was a marked improvement over digital encoding artifacts. In fact, for some channels the result was essentially unwatchable - particularly channel 3 (CBS) and channel 4 (PBS).
Two days ago, one television refused to show anything (the screen just said “Searching for Channels”). I started debugging by bypassing the box, and that worked, indicating that the cable wasn’t broken. I then swapped equipment with another room, and the problem stayed with the television rather than following the box.
Remembering that pressing “info” on the remote control would put the box in a debug mode, I found that the working television showed -19.44 dB, and the failing television showed -20 dB.
Putting a signal booster on the working television addressed the digital artifact problem. Putting the same signal booster on the failing television didn’t help.
Working assumptions at this point: with non-digital signals, picture quality degrades linearly with signal strength. With digital signals, viewability is more of a binary quality, and at -20dB the box simply refuses to show anything.
And there is a point below which there isn’t enough signal left to be boosted.
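These working assumptions can be sketched as a toy model. The floor, ceiling, and -20 dB threshold below are my guesses from the two measurements above, not anything Time Warner documents:

```python
def analog_quality(level_db, floor_db=-30.0, ceiling_db=0.0):
    """Analog picture degrades gradually as the level drops (assumed model)."""
    frac = (level_db - floor_db) / (ceiling_db - floor_db)
    return max(0.0, min(1.0, frac))

def digital_viewable(level_db, threshold_db=-20.0):
    """Digital is all-or-nothing: at or below the (assumed) threshold, no picture."""
    return level_db > threshold_db

print(digital_viewable(-19.44))  # True: the box still locks on, grainy or not
print(digital_viewable(-20.0))   # False: "Searching for Channels"
```

This matches what I observed: the working television at -19.44 dB was barely above the cliff, and the failing one at -20 dB was just over it.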
Tracing the line back: it comes in from the street to a box; in that box there are both a 2-way splitter and a 4-way splitter; it then goes under the house and is split one final time before reaching the two televisions in question. I suspect that the final splitter was added by the builder rather than the cable company. Similarly, I suspect that the extra 2-way splitter was added when we built a detached garage with a room on the second floor.
I then tested the signal strength at the box (before any splitters), and I found +5.6 dB.
Based on this video, Time Warner should be providing me 10 to 15 dB. So the signal strength is about a quarter of what I should be getting. And I should be striving to get between 0 and 5 dB to each television.
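Since dB is a logarithmic scale, the shortfall converts to a linear power ratio with 10^(dB/10). A quick sketch of how far 5.6 dB falls short of the 10 to 15 dB range:

```python
def db_to_power_ratio(db):
    """Convert a dB difference to a linear power ratio."""
    return 10 ** (db / 10)

measured = 5.6
for recommended in (10.0, 15.0):
    shortfall = measured - recommended
    ratio = db_to_power_ratio(shortfall)
    print(f"{shortfall:+.1f} dB -> {ratio:.2f}x the recommended power")
```

Against 10 dB that is about a third of the power, and against 15 dB about a ninth, so “about a quarter” is in the right ballpark.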
The first splitter cut that in half, and the second splitter cut that by a factor of 4. The third splitter cut that by a factor of 2. And signal loss along the line should be on the order of another factor of 2.
That sounds like a lot, but in dB terms it’s about 18 dB of loss. Starting with 5.6 dB and subtracting 18 leaves -12.4 dB; I am measuring about 7 dB less than that.
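The loss budget can be tallied directly in dB. The per-splitter figures below are typical insertion losses (roughly 3.5 dB per 2-way leg, roughly 7 dB per 4-way leg), which are my assumptions rather than measured values:

```python
losses_db = {
    "2-way splitter (street box)": 3.5,
    "4-way splitter (street box)": 7.0,
    "2-way splitter (under house)": 3.5,
    "cable run": 3.5,  # rough guess for loss along the in-house wiring
}

start_db = 5.6          # measured at the box, before any splitters
measured_at_tv = -19.44  # what the working box reported

total_loss = sum(losses_db.values())
expected = start_db - total_loss

print(f"total loss:  {total_loss:.1f} dB")                  # ~17.5 dB, close to the 18 above
print(f"expected:    {expected:.1f} dB")                    # ~-11.9 dB
print(f"unexplained: {expected - measured_at_tv:.1f} dB")   # the ~7 dB gap
```

The arithmetic works because dB is logarithmic: each halving of power is about 3 dB, so multiplying loss factors becomes adding dB values.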
Looking back at my splitters, the first splitter fed half of the strength to one line. Tracing down that line, I found that it went to my cable modem. While that’s clearly dear to me, I suspect this ordering dates from when I had problems with my cable modem dropping signal. I have since replaced the modem.
Reordering the splitters so that three lines go through only the 4-way splitter, and the two (analog-only) lines go through both, means that three televisions get twice the signal they did before, and the cable modem gets half.
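In dB terms, the reordering gives each of those televisions about +3 dB and costs the modem about -3 dB, since a factor-of-2 power change is 10·log₁₀(2) ≈ 3 dB. A quick check:

```python
import math

def factor_to_db(factor):
    """Express a linear power ratio in dB."""
    return 10 * math.log10(factor)

# Three TVs skip the extra 2-way splitter: twice the power.
print(f"TVs:   {factor_to_db(2):+.1f} dB")    # about +3 dB
# The cable modem now goes through it instead: half the power.
print(f"modem: {factor_to_db(0.5):+.1f} dB")  # about -3 dB
```

Starting from -19.44 dB, +3 dB alone wouldn’t reach positive territory, which is why the booster in the next step was still needed.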
Furthermore, replacing the final splitter with a signal booster means that the two televisions that were having problems now have positive signal strength.
So far, no problem with Internet, and the immediate problem with my TV service has been addressed.
Then again, I just got a notice today that four more channels will require a cable box, which leads to the following question:
If Time Warner Cable is moving towards Digital Only service, shouldn’t they be providing enough signal strength to drive all of the devices in the house?