Poor Nvidia TV Output

Forums - Discs & Movies - Poor Nvidia TV Output 

Existing Posts
I've seen bad cards, but an underpowered output is not likely on a mass-produced video card. Check the minimum system requirements to make sure your system is compatible.
I think the card's output must be underpowered compared to stand-alone devices, because like I said above, I did try the cable on my digital terminal and it worked perfectly.

Dag nabbit.
I don't know the exact length where problems begin. Like I said, the problem is shielding. Bad shielding lets the signal bleed out, and lets interference from outside electrical sources in.
I was thinking it was the cables too, because I was using a 20-footer like you were, but when I switched to a shorter one (I think it was 6 foot) the problem was still there. Maybe a better-shielded one is the answer, though. I may try it out.
Jonny "Me You" wrote: So back in late November I went and bought a shiny new Dell with an NVidia 6800 graphics card and figured it would be neat to hook it up to my TV so I could watch stuff full screen that way, or edit my home videos in a more professional way, etc.

I finally got the cables the other day and guess what?

The graphics card is emitting a huge rolling bar of distortion that makes the image nearly unwatchable. It's hooked via S-Video directly into the TV (a 27-inch Sony WEGA), so it's not a Macrovision thing.

At first I thought it was the cable, but then I plugged it into my digital cable terminal and it worked perfectly.

Anyone else have a similar problem? How did you fix it? I can't believe that NVIDIA's TV output is really that bad.


My first suggestion, and the easiest, would be to update your drivers. If this was already suggested, I apologize.

I think this may be the problem, especially if you have a long S-Video cable. That type of cable is very lossy at lengths over 8-10 ft or so because it's carrying a video signal, so you need a high-quality cable. I know this because I had the exact same problem many years ago with my first computer DVD player. I bought a standard 20 ft S-Video cable at a computer show and had the same bars on my TV screen that you do. After pulling some hair out I went and bought a Monster cable, which was very costly, and the problem was gone. It's mostly because of the shielding on the cable: standard-quality S-Video cables have poor shielding.
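The arithmetic behind that advice is simple: cable attenuation grows roughly linearly with length, so a 20 ft run loses several times what a 6 ft run does. A minimal sketch, with the per-foot loss figures being illustrative assumptions rather than measured specs for any real cable:

```python
# Why cable length matters: total signal loss (dB) scales with length.
# The dB/ft figures are assumed for illustration, not measured specs.

def cable_loss_db(length_ft, loss_db_per_ft):
    """Total attenuation over a cable run of the given length."""
    return length_ft * loss_db_per_ft

CHEAP_CABLE = 0.5    # assumed dB/ft for a poorly shielded cable
QUALITY_CABLE = 0.1  # assumed dB/ft for a well-shielded cable

for length in (6, 20):
    print(f"{length:2d} ft  cheap: {cable_loss_db(length, CHEAP_CABLE):4.1f} dB"
          f"  quality: {cable_loss_db(length, QUALITY_CABLE):4.1f} dB")
```

The point of the sketch is just that a well-shielded 20 ft cable can lose less than a cheap 6 ft one, which matches the poster's experience with the Monster cable.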

Acoustic Research, which I think is still available at retail stores, or Blue Jeans Cable on the internet. Both would be a better buy than Monster Cable. I think Radio Shack carries Monster Cables now; if it doesn't work you can always return it.
You might want to lower the resolution too.
Here's a little update. I found where to change the timings and refresh rates, but none of the custom modes made much of a difference. I also downloaded a little utility called TV Tool, but it didn't help either.
I had a thought later on that it might be the refresh rate too, but there isn't anything in the driver controls that lets me change it. Currently I have the thing set to North American NTSC and 1024 x 768 resolution. I'll play with the res and see if that helps.

Thankfully, the drivers split the displays like I'm using a second monitor, so nothing I do affects my main LCD.
It's your refresh rate, homie. A normal TV can't handle that much refreshing. Keep lowering it about 5 points at a time till your problems go bye-bye. You probably have it set somewhere around 85 right now.
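The "lower it 5 at a time" advice can be sketched as a simple ladder of candidate rates. NTSC fields arrive at roughly 59.94 per second, so 60 Hz is the usual safe target; the 85 Hz starting point is the poster's guess, not a confirmed setting:

```python
# Candidate refresh rates to try in the driver panel, stepping down 5 Hz
# at a time from a typical PC setting toward NTSC's ~59.94 Hz field rate.

START_HZ = 85      # assumed current desktop setting (per the post above)
NTSC_SAFE_HZ = 60  # closest common mode to NTSC's 59.94 Hz

candidates = list(range(START_HZ, NTSC_SAFE_HZ - 1, -5))
print(candidates)  # try each until the rolling bar disappears
```

Each value would be tried by hand in the graphics card's control panel; the list just makes the stopping point explicit.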
My Toshiba laptop has an Nvidia Go5700 and it works fine, mind you it is a bit older. Does your TV have different modes, i.e. PAL, SECAM, etc.? Maybe try changing this, but it is a long shot. I assume you have changed refresh rates, etc. in the control panel of the graphics card to match the TV?
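A mismatched TV standard would by itself produce an unstable or rolling picture, because each standard fixes both the field rate and the line count. The standard broadcast figures, for reference:

```python
# Broadcast parameters for the three analog TV standards mentioned above.
# A standard/field-rate mismatch between card output and TV causes
# rolling or unstable pictures on its own.

TV_STANDARDS = {
    "NTSC":  {"fields_per_sec": 59.94, "lines": 525},
    "PAL":   {"fields_per_sec": 50.0,  "lines": 625},
    "SECAM": {"fields_per_sec": 50.0,  "lines": 625},
}

# A North American Sony WEGA expects NTSC, which matches the poster's
# "North American NTSC" driver setting.
print(TV_STANDARDS["NTSC"])
```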