The trick of course, as Ken said, is to keep track of normalization judiciously (or have a friend who works at the cable company come over one night for some beer and his dB meter). You always want to assume that whatever signal strength Dish, the cable company, Apple, etc. decided to output from their equipment is meant to be delivered to your final equipment. So if you used 2 splitters to get a cable to your TV, you are now -6 dB down, or at about 1/4 of the original signal strength.
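Quick sanity check on that math, as a little Python sketch of my own (nothing from my actual setup, just the arithmetic): dB is a log scale for power, so each 2-way split at roughly -3 dB halves the power, and two of them leave about a quarter.

    # dB is a log scale for power: ratio = 10 ** (dB / 10)
    def db_to_power_ratio(db: float) -> float:
        """Convert a dB gain/loss to a linear power ratio."""
        return 10 ** (db / 10)

    print(db_to_power_ratio(-3))   # ~0.50 -> one 2-way splitter, about half power
    print(db_to_power_ratio(-6))   # ~0.25 -> two splitters, about a quarter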
What I do is take whatever signal strength he reads direct from the disconnected cable and normalize that to zero (0) dB in my system. That's of course a baseline I use in my head; for all intents and purposes I mark that cable end as 0 dB. I then have a lower-powered amp that brings the baseline up +10 dB for starters. Then for every splitter, I have stickers after them on the cables that tell me I'm at +7, +4, etc. When I get to minus-something, I'll insert a high-end (from Worthington Distribution) very low noise 15 dB amp (never above that, as they distort above 15 dB), then add the +15 to whatever is coming in. So if I had -3 coming in, now I have +12 coming out, and I keep track again from there. I never allow any final outbound terminating feed to drop below +3 to +9 dB, depending on the run.
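If it helps, here's that bookkeeping as a rough Python sketch (the stage names and loss figures are just illustrative; I'm treating each 2-way split as roughly -3 dB, as above):

    SPLITTER_2WAY_LOSS = -3.0   # treating each 2-way splitter port as roughly -3 dB
    BASELINE_AMP       = 10.0   # the +10 dB amp right after the 0 dB cable end
    LOW_NOISE_AMP      = 15.0   # the +15 dB low-noise amp (never more than 15)

    def walk_run(stages):
        """Track the running level through a list of (label, dB change) stages."""
        level = 0.0                          # disconnected cable end normalized to 0 dB
        for label, change in stages:
            level += change
            print(f"  {label}: {change:+.1f} dB -> running level {level:+.1f} dB")
            if level < 0:
                level += LOW_NOISE_AMP       # went minus-something: drop in a +15 dB amp
                print(f"  inserted +15 dB low-noise amp -> {level:+.1f} dB")
        return level

    # Example: baseline amp, then four 2-way splits on the way to one outlet.
    walk_run([
        ("baseline amp",   BASELINE_AMP),
        ("2-way splitter", SPLITTER_2WAY_LOSS),
        ("2-way splitter", SPLITTER_2WAY_LOSS),
        ("2-way splitter", SPLITTER_2WAY_LOSS),
        ("2-way splitter", SPLITTER_2WAY_LOSS),
    ])

That walk prints the same kind of running values that end up on the stickers (+7, +4, +1, ...) until the level goes negative and the +15 dB amp kicks it back up.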
I use +3 for any runs within the basement or up to the 1st floor.
+6 for farther runs, like to the back screened-in porch or the outlet on the patio.
+9 goes up to the 2nd floor (much longer runs), and if a feed looks overdriven, a simple 3 dB or 6 dB drop-in attenuator brings the signal back down to normalized quality.
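As a rough sketch of how those targets and the drop-in pads fit together (the pad-picking rule here is just my own simplification of what I do by eye):

    TARGETS = {
        "basement / 1st floor": 3.0,   # +3 dB
        "porch / patio":        6.0,   # +6 dB
        "2nd floor":            9.0,   # +9 dB
    }

    PADS = [0.0, 3.0, 6.0]   # no pad, or a 3 dB / 6 dB drop insert

    def pick_pad(level_at_outlet: float, destination: str) -> float:
        """Pick the drop-in pad that lands an overdriven outlet closest to its target."""
        target = TARGETS[destination]
        return min(PADS, key=lambda pad: abs((level_at_outlet - pad) - target))

    # Example: a 2nd-floor outlet reading +14 dB -> a 6 dB pad gets it near the +9 target.
    print(pick_pad(14.0, "2nd floor"))   # 6.0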
I use dual-shield RG6 for short runs and for runs to small non-HD sets like the kitchen, porch, laundry room, etc.
Quad-shield RG6 for all runs to the 'real' HDTVs.
So I'm always at +just-a-little-something at the back of every consuming device. It's easy to shed signal at that point.