Change to 1080i (interlaced instead of progressive)

Post » Fri Feb 04, 2011 6:35 am

Hello,

I've got a problem: my TV doesn't support progressive scan at 1920x1080 pixels,
so I have to use 1080i (interlaced) at 30 Hz for that resolution...
How can I change the video settings to 1080i / interlaced mode in Fallout: New Vegas?

Thanks in advance!
djimi
 
Posts: 3519
Joined: Mon Oct 23, 2006 6:44 am

Post » Fri Feb 04, 2011 2:00 am

Normally a game only uses the resolutions that your display options list as valid for your display.

Check your display drivers. For example, if you have Catalyst, see if 1080p is listed or enabled. If so, see about removing it. This should stop programs from using that resolution/mode. Normally your display driver will query your display panel and detect which modes are available.

If you can select 1080p as a desktop display mode (which you indicate will not work on your display), then that mode must be listed in your display drivers as valid, for some reason.
NAkeshIa BENNETT
 
Posts: 3519
Joined: Fri Jun 16, 2006 12:23 pm

Post » Fri Feb 04, 2011 6:08 am

My display driver lists both modes, 1080p and 1080i, but only 1080i works on my TV...
With 1080 progressive I get green pixels all over the screen because the TV doesn't support progressive scan.

Windows and the display are set to 1080i @ 30 Hz and that works fine, so Fallout should work normally too, but it doesn't...
It switches its fullscreen resolution to 1080p and 60 Hz when I select 1920x1080 in the video settings; that's the problem.
I can't set progressive/interlaced or the refresh rate in Fallout's video settings.

By the way, I've got an nVidia 8800 GTS 512 MB, and the TV is connected via a DVI dual-link to HDMI cable (HDCP is supported).
Other games and Full HD movies work fine...
loste juliana
 
Posts: 3417
Joined: Sun Mar 18, 2007 7:37 pm

Post » Thu Feb 03, 2011 11:46 pm

Is your TV actually a plasma or an LCD? I ask because even when a plasma or LCD TV lists itself as 1080i compatible, that almost always means it deinterlaces and scales the image to fit its native resolution of 1280x720 (or 1366x768). If it will handle that resolution at 60 Hz, then that's what you should use. 720p is going to look better (sharper than its number of scanlines would suggest compared to 1080i) and suffer from far less motion blur.

You can choose a custom resolution in Catalyst Control Center or in the nVidia Control Panel that will output 1080i, but that will be useless here, because game settings override the default resolution when they launch. You could try running in windowed mode, but that would probably not look or perform particularly well.

You could also try editing your ini file to force a refresh rate of 30 Hz, but the output would not be interlaced; it would still be progressive scan.
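For reference, that edit would go in FalloutPrefs.ini (in Documents\My Games\FalloutNV), under the [Display] section. A rough sketch of what it might look like is below; iSize W and iSize H should already be in your file, but the iRefreshRate key is a guess on my part, so check whether your FalloutPrefs.ini actually uses that name before adding it:

[Display]
iSize W=1920
iSize H=1080
; iRefreshRate is an assumption - verify the key name in your own FalloutPrefs.ini
iRefreshRate=30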

Also, other games that work because they let you set them to 1080 at 30 Hz are not necessarily outputting a properly interlaced image.

Why do you want to run at 1080i instead of 720p though?
JD bernal
 
Posts: 3450
Joined: Sun Sep 02, 2007 8:10 am

Post » Fri Feb 04, 2011 5:10 am

It's an LCD, and it's designed for good PC compatibility and PC resolutions... except for Full HD (1920x1080 pixels @ 30 Hz).
Other resolutions smaller than 1920x1080 work fine with progressive scan; I don't get it... that's weird :(

Why should I switch to 720p if I have the chance to get 50% more pixels?

Why does Fallout switch to 1080p when Windows and everything else is running at 1080i?
The resolution is 1920x1080 pixels in both cases; only the scan method is wrong in the Fallout settings...
k a t e
 
Posts: 3378
Joined: Fri Jan 19, 2007 9:00 am

Post » Fri Feb 04, 2011 2:59 am

Because that's not how it really works at all for your LCD.

The native resolution (the actual number of pixels) of your LCD is only equivalent to 720p. It does not actually have the capability to present a 1920x1080 image. This is why I asked what type of TV it was: a "1080i" LCD TV that is not capable of displaying 1080p at any refresh rate is actually deinterlacing and scaling the image to fit its native resolution.

In this case you will not only get a better frame rate by choosing the native resolution of 720p, but also a sharper image without the loss of detail caused by scaling. In fact, each frame of 720p content on a 720p-native LCD will contain more pixels taken directly from the original image than a frame of scaled 1080i will. You may still prefer the resulting scale of objects on your desktop in 1080i (for desktop use), but for games you gain nothing by choosing 1080i on an LCD whose true native resolution is 720p.
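To put rough numbers on that (assuming a 1280x720 native panel): a 720p frame is 1280 x 720 = 921,600 pixels, every one of which can map 1:1 onto the panel. A 1080i field is 1920 x 540 = 1,036,800 samples, but the TV has to deinterlace two fields and then rescale the 1920x1080 result down to 1280x720, so effectively none of those samples reach a panel pixel unaltered.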
quinnnn
 
Posts: 3503
Joined: Sat Mar 03, 2007 1:11 pm

Post » Thu Feb 03, 2011 11:06 pm

As I said, by choosing the native 720p resolution you will get not only a better frame rate but also a sharper image. You may still prefer the resulting scale of objects on your desktop in 1080i, but for games you gain nothing by choosing 1080i on an LCD whose true native resolution is 720p. If it were a CRT, 1080i might actually be the optimal setting, since a CRT with a maximum resolution of 1080i has to neither deinterlace nor scale such an image.
Dominic Vaughan
 
Posts: 3531
Joined: Mon May 14, 2007 1:47 pm

Post » Fri Feb 04, 2011 7:33 am

No, it's a 1080p-native LCD... it's just that the DVI-DL to HDMI connection doesn't work with 1080p...

Let me try to explain:

The nVidia driver lists "resolution 1920x1080 - 1080p (native)" for my TV, but if I activate it, I get green pixels all over the screen...
Those pixels disappear when I change to "resolution 1920x1080 - 1080i".
That's the part I don't understand :/
Jonathan Windmon
 
Posts: 3410
Joined: Wed Oct 10, 2007 12:23 pm

Post » Fri Feb 04, 2011 12:43 am

No, it's a 1080p-native LCD... it's just that the DVI-DL to HDMI connection doesn't work with 1080p...


I thought you said it was incapable of displaying a 1080p image? What's the model? I can check real quick; it's actually a good thing to know for certain. Basically all LCDs are progressive-scan displays and have to deinterlace any interlaced image you send them. If the TV can accept a 1080p signal, it will be best to figure out how to get that to work, since it will look considerably better unless the refresh rate for that mode is abnormally low.

The green pixel thing is strange. It's possible that you'll need to choose a custom resolution or a different refresh rate for that to work (maybe slightly more or less than exactly 1920 by 1080, and maybe less than 60 Hz). My LCD TV is "1080p" native, but I had to choose a slightly different resolution and play with the refresh rate to get it to display properly. It will display a 1080p image properly from any other source, of course, except my... nVidia graphics card. So yeah, if you give me the model number I can look up some of the exact technical specs and see if any of my buddies over in the AV forums have experience using it as a monitor.
Ryan Lutz
 
Posts: 3465
Joined: Sun Sep 09, 2007 12:39 pm

Post » Thu Feb 03, 2011 4:42 pm

Graphics card: nVidia GeForce 8800 GTS 512 MB (running the newest driver)

TV: http://www.ciao.co.uk/Productinformation/Philips_32PFL9604H__7167370
Nitol Ahmed
 
Posts: 3321
Joined: Thu May 03, 2007 7:35 am

Post » Fri Feb 04, 2011 5:52 am

OK, actually this might be easier than I thought. That is my roommate's TV too, so I'll drag it out here, try out some settings, and pull out his manual.

I guess that kind of surprised me, since he bought it at a yard sale and I don't think it's even available in our market (the US). I think the couple who sold it bought it when they lived in Austria.

EDIT:

OK, well, I've noticed something disappointing after trying a few different cables and playing around with some settings. It was disappointing because it worked perfectly without any real fiddling, and I was kind of hoping to have an excuse to keep this dual-HDTV setup going a little longer. The TV is fully capable of 1080p @ 60 Hz over a DVI-D (single link) to HDMI cable. At 1080p you only need a single-link DVI cable, but using a dual-link cable shouldn't be causing the problems you're experiencing. It's possible that the cable is too long or poorly shielded, though you would think that would not matter with a digital signal.

I'm using a GTX 260 and connecting to an Austrian-market 32PFL9604H. The total cable length is about 2 m, and it is a single-link DVI-D to HDMI adapter cable. I'm not using a dongle or an adapter endpiece connected to another cable.

If you see a relatively short DVI-D single-link to HDMI cable that is inexpensive and seems well shielded, it might be worth trying, because the TV does seem able to handle 1080p @ 60 Hz. Something with a solid, heavy ferrite choke (those big heavy ring sections near the ends of cables) should do the trick; it doesn't have to be a name-brand cable. Before doing that, though, maybe just try tightening the connection to the graphics card a little and making sure the power cable doesn't wrap around it or something. If it does, I suppose you could get some interference in the form of overtones of the 60 Hz power signal. I would really expect you to have more problems, though, if you were seeing interference or poor transport of a digital signal.

Heck, I didn't actually have to adjust any settings besides setting the resolution in the nVidia Control Panel to 1080p native: 1080p @ 60 Hz. It works better at that setting than my own screen does.

But yeah, if you can't get 1080p to work (it really should), then a deinterlaced 1080i image will probably end up looking better than an upscaled 720p image.

Oh, I almost forgot: I had to turn off scaling on the TV before it would display the image properly. It wasn't giving me green spots, but it wasn't fitting the screen properly either. The option for that is brought up by pressing Option, then selecting Picture format and "Unscaled".

Frank Firefly
 
Posts: 3429
Joined: Sun Aug 19, 2007 9:34 am

Post » Fri Feb 04, 2011 4:17 am

So what should I do now with Fallout?

The TV has an option specialised for use with a PC...
It's just a "smart setting" that saves all the relevant settings (unscaled, etc.) for the best use with a PC in one selection.
The TV setup isn't the problem.

I think it is the DVI-DL to HDMI cable. It's about 15 meters, but I bought a premium cable
made especially for home entertainment systems, so in theory it should be 99% certain to work.
I should add that I know a lot about cables and so on, so you can assume that the cable isn't [censored] ;)

Let me sum this up:

The best option for me is 1080 interlaced @ 30 Hz, isn't it?
(The TV can handle 100 Hz; is that possible from a PC?)

How can I set up Fallout so that it works well?
kiss my weasel
 
Posts: 3221
Joined: Tue Feb 20, 2007 9:08 am

Post » Thu Feb 03, 2011 5:42 pm

I mean the best option is still 1080p, if you can punch things hard enough that they know to do what they're told, but I guess that might not be the most elegant solution. I think part of the problem is that your system has the normally correct display mode capabilities listed for your monitor. I can't figure out how to exclude some of those, so when Fallout NV attempts to output at 60 Hz, DirectX, Windows, and your graphics drivers all oblige and attempt to do so.

I've tried to search for a way to get Fallout to accept a 1080i output mode, but the best I've been able to find is ATI users complaining that they are getting only 1080i despite choosing otherwise. That's always the way of things, isn't it? If I come across something (maybe forcing the monitor settings not to list 1080p as an allowable output), I'll post it.

I would not suggest setting the refresh rate as high as 100 Hz, though, because your games are going to look best when you can render them consistently at close to the refresh rate. The cable might be an issue: the manual warns against using anything longer than 5 m, particularly for DVI-HDMI connections from a PC.

I'm sure the cable is a good cable, but if it's that long and most of it is coiled up, it's basically a giant inductor sitting behind your computer. 15 m is right about where the potential interference and signal degradation can begin to overwhelm the built-in error correction in HDMI signalling. Even then, 15 m should still be doable, but a coiled cable that long sitting amongst a bunch of power cables and electronics might give you enough loss to produce noticeable artifacts. You might just have to uncoil the cable or put a ferrite choke on it, but a shorter cable might be an easier solution; since you already know about cables, I won't explain why you can get away with cheaper cables for short runs. Anyway, I'm using a similar card and an identical TV (or was, until my roommate got back) and it worked over a $12 2 m cable. It's also pretty cheap because it's only single link, which is all you need for 1080p, since at that resolution, with the HDMI input on the TV, only the single-link wires need to be live anyway.
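For what it's worth, the rough numbers as I understand the DVI spec: 1080p at 60 Hz needs a pixel clock of about 148.5 MHz, and a single TMDS link is rated up to 165 MHz, so one link covers it with room to spare; the second link in a dual-link cable only comes into play for higher resolutions such as 2560x1600.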

Can you maybe uncoil the cable a bit and have it lie a little further from some of the power cables? I'm not about to suggest you wrap it in tinfoil, but if you did, I would not accuse you of being crazy for trying to keep out the game-ruining signals THEY are beaming into your TV.
keri seymour
 
Posts: 3361
Joined: Thu Oct 19, 2006 4:09 am

Post » Fri Feb 04, 2011 4:48 am

ATI/AMD cards have an "HDTV Support" tab to force 1080p60 (plus other options) for TVs "that are reporting incomplete or incorrect capabilities in their EDID information", to quote ATI/AMD.

It's been a while since I've used nVidia drivers, but don't they have a similar feature?
Kristina Campbell
 
Posts: 3512
Joined: Sun Oct 15, 2006 7:08 am

Post » Thu Feb 03, 2011 5:22 pm

I can't find it in the nVidia Control Panel. I mean, I can force it on the desktop, but I cannot force a specific output mode in all applications. I have done that in the past with my ATI cards, though; I was trying to find the similar option for an nVidia card. My only guess is that ATI cards have it because they have been used more in HTPCs, maybe?
Lexy Corpsey
 
Posts: 3448
Joined: Tue Jun 27, 2006 12:39 am

Post » Fri Feb 04, 2011 6:46 am

Thank you for your help :)

I can't uncoil the cable because my TV is nearly 15 m away...
Rob
 
Posts: 3448
Joined: Fri Jul 13, 2007 12:26 am

