Valhalla Legends Forums Archive | Advanced Programming | Re: A driver to add display modes?

Myndfyr
Is it possible to create a Windows filter driver to add additional display modes to my display?

I have a Samsung 42" plasma TV that's 16:9 and has a max pixel height of 768.  However, when plugged into a computer, it states that the best text quality is at 1024x768.  But that's not 16:9.

I'd like to force it to 1360x768 because I don't really care about text quality - I want it for occasional gaming and mostly media center related-functions.

Is this possible?  Are there any ways around it beyond just an intermediate driver?
May 20, 2007, 9:43 AM
rob
You should be able to modify the registry values directly to change the graphics card's resolution, even if the resolution isn't an option in the standard graphics control panel.

Open Regedit.
Expand down to HKEY_CURRENT_CONFIG\System\CurrentControlSet\Control\VIDEO\{GUID for your primary video card}\0000
DefaultSettings.XResolution holds the horizontal (x) resolution.
DefaultSettings.YResolution holds the vertical (y) resolution.

You may also see multiple subkeys under the key for your video card.  I believe these are settings that may be used when multiple monitors have been connected to the PC.
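A minimal .reg sketch of the change described above. The GUID shown is a placeholder, not a real one — use the key that actually exists under VIDEO on your machine, and export it first as a backup:

```reg
Windows Registry Editor Version 5.00

; Placeholder GUID -- substitute the subkey for your own primary video card.
; DWORD values are hexadecimal: 0x550 = 1360, 0x300 = 768.
[HKEY_CURRENT_CONFIG\System\CurrentControlSet\Control\VIDEO\{00000000-0000-0000-0000-000000000000}\0000]
"DefaultSettings.XResolution"=dword:00000550
"DefaultSettings.YResolution"=dword:00000300
```

Double-clicking the .reg file merges the values; the new resolution is applied on the next mode set or reboot, assuming the card and driver accept it.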
May 20, 2007, 10:08 AM
Quarantine
You could use a program called PowerStrip to create your own Display driver or something of that nature. I mucked around with it but fixed my problem before I had to use it.
May 20, 2007, 1:12 PM
Skywing
Custom modes are typically specified via a monitor INF - does your TV manufacturer provide one?

Are you using DVI or analog TV out?

Regardless, this is not something that (typically) requires code changes.
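A hedged sketch of the relevant pieces of such a monitor INF — the section names, provider string, and especially the timing ranges below are illustrative placeholders, not Samsung's values; the horizontal (kHz) and vertical (Hz) ranges must match the TV's actual limits or you risk feeding it a signal it can't handle:

```ini
; Illustrative monitor INF fragment -- names and timing ranges are placeholders.
[Version]
Signature="$WINDOWS NT$"
Class=Monitor
; ClassGUID below is the standard Monitor device class GUID.
ClassGUID={4d36e96e-e325-11ce-bfc1-08002be10318}
Provider=%Provider%

[ExampleTV.AddReg]
; Format: HKR,"MODES\<X>,<Y>",Mode1,,"<hsync kHz range>,<vsync Hz range>,<sync polarities>"
HKR,"MODES\1360,768",Mode1,,"30.0-70.0,56.0-75.0,+,+"
HKR,,MaxResolution,,"1360,768"

[Strings]
Provider="Example"
```

Installing the INF through Device Manager (Update Driver on the monitor device) tells Windows which modes the display claims to support, which is usually all that's needed — no driver code.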
May 20, 2007, 5:03 PM
Myndfyr
Right now I'm using a VGA cable.  I could, however, use DVI (I've got an 8800GTX attached to the TV right now).

To my knowledge, the TV manufacturer does not provide a custom .inf file.  Is it terribly complex to create one?
May 20, 2007, 11:09 PM
JoeTheOdd
On the "Settings" tab of Display Properties, click Advanced. On the "Monitor" tab of the dialog that pops up, about halfway down, uncheck "Hide modes that this monitor cannot display."

I obviously can't afford such a huge screen, so I have no way of checking whether that'll work for you, but it allowed me to run 1920x1440 on my 19" monitor, which reported a maximum of (I think?) 1200x1024. Of course, you're liable to blow your screen up, but a screen that big should protect itself.
May 21, 2007, 4:02 AM
Myndfyr
That doesn't work.
May 21, 2007, 6:56 AM
Barabajagal
You can't set display modes without three factors:
3) Software that will set it [such as the display properties dialog].
2) Drivers that will send the settings to the card.
1) The actual abilities of the card's firmware and hardware [most important].

You may want to try out the Omega Drivers. If the drivers themselves don't help you, look at their INF files and see whether you can enable the settings manually through testing. I was able to overclock my Radeon 9550 by 250%, running 1280x768 at 85 Hz on my wide-screen projection television. Unfortunately, it ended up damaging my card, and now I can't play 3D games at full screen.
May 21, 2007, 8:31 AM