Nvidia _Program Settings_ for X3:TC
Moderators: timon37, Moderators for English X Forum
-
- Posts: 1414
- Joined: Thu, 7. Jul 05, 05:17
Hi Andi2,
While we have you looking at Nvidia-related issues, do you have any idea why the X3 engine seems to have so much poorer minimum framerates on modern Nvidia hardware? I think this might be a major contributing factor to threads like this:
X3:TC Min-frame Rate Issue Thread
This issue has been present since the G80 was released and has been remarked upon by reviewers for Reunion as well as TC. It was not an issue with the GeForce 7 series and earlier Nvidia hardware, as demonstrated by the second link.
A few examples with benchmarks:
X3:TC benchmarks
X3 Reunion benchmarks
-
- Posts: 295
- Joined: Mon, 19. Jan 09, 17:31
D3D has a variable render-ahead. In Nvidia drivers I even think that's what it's called, "frames to render ahead" or something similar. Set that to 1 for double-buffering, 2 for triple-buffering, and higher for something that has no real equivalent but will also cause lag, as your controls will be several frames removed from the action on the screen.
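The lag effect is easy to see in a toy model. Below is a hedged sketch in plain Python (no driver API involved; the function name and the model are illustrative only): each submitted frame is tagged with the frame number whose input it used, and a deeper render-ahead queue means every displayed frame carries correspondingly older input.

```python
from collections import deque

def display_latency(queue_depth, frames=10):
    """Toy model of a render-ahead queue: the CPU submits frames tagged
    with the frame number whose input they used; the GPU displays a frame
    only once the queue holds more than queue_depth entries."""
    queue = deque()
    latencies = []
    for frame in range(frames):
        queue.append(frame)              # CPU submits a frame using this frame's input
        if len(queue) > queue_depth:     # queue full: GPU displays the oldest frame
            submitted = queue.popleft()
            latencies.append(frame - submitted)  # frames between input and display
    return latencies

print(display_latency(1))  # every displayed frame carries input that is 1 frame old
print(display_latency(3))  # with a 3-deep queue, controls lag 3 frames behind
```

Which matches the point above: raising the render-ahead value keeps the GPU fed but moves your controls further behind the action on screen.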
/ Per
(Core i7 920 2.67GHz, 6GB 3chan DDR3 1066MHz, HD4870 512MB, Vista Enterprise 64)
-
- Posts: 295
- Joined: Mon, 19. Jan 09, 17:31
What else would render the frames?
The front buffer is the screen you're seeing now. The "double" is because there's a second buffer being rendered to that'll be displayed in the next frame. The "triple" is for having a second extra buffer to render to, so that there is always a full frame ready to swap in for the current one.
At least under Direct3D, this is only used when VSYNC is on.
Not sure if there's a special setting for how many frames to render ahead, or how many buffers to use under DirectX on Nvidia hardware if it isn't affected by the render-ahead setting; sorry.
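As a back-of-the-envelope illustration of why that spare buffer matters under vsync, here is a hedged sketch in plain Python (not a real graphics API; the timing model is a deliberate simplification): with only two buffers, a frame that takes slightly longer than one refresh interval forces the renderer to stall for the next whole interval, while a third buffer lets rendering continue so the display rate only falls to whichever is slower, the renderer or the refresh rate.

```python
import math

def effective_fps(render_time, refresh_interval, triple=False):
    """Displayed frames per second under vsync, in a simplified model.

    Double buffering: after finishing a frame, the renderer must wait for
    the next refresh tick to get its buffer back, so the frame cadence
    rounds up to whole refresh intervals.
    Triple buffering: the spare buffer lets rendering continue at once;
    each refresh simply shows the newest completed frame.
    """
    if triple:
        return min(1.0 / render_time, 1.0 / refresh_interval)
    intervals = math.ceil(render_time / refresh_interval)
    return 1.0 / (intervals * refresh_interval)

# 60 Hz monitor, frames that take 20 ms to render:
print(round(effective_fps(0.020, 1 / 60)))               # 30 (double buffering)
print(round(effective_fps(0.020, 1 / 60, triple=True)))  # 50 (triple buffering)
```

So in this model a 20 ms frame locks a double-buffered, vsynced game to 30 fps, while triple buffering keeps it near 50 fps.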
/ Per
(Core i7 920 2.67GHz, 6GB 3chan DDR3 1066MHz, HD4870 512MB, Vista Enterprise 64)
-
- Posts: 3008
- Joined: Wed, 6. Nov 02, 20:31
andi2 wrote: Multi-Display/mixed-GPU acceleration: Single Display performance mode
I have 2 monitors and only use the primary for playing games (i.e. not "span" mode) unless the game specifically uses the 2nd monitor (Supreme Commander, World in Conflict etc.). I find single display performance mode improves my fps in at least some games. It's possible it reduces it in SupCom or WiC, mind.
This may improve performance but is also driver version dependent; I can't recall what driver version it was first added in. But using "Multiple display performance mode" and making sure to run the game on the "primary" screen should also give the same fps.
So you have to test it and see if the fps drops with multi-monitor setups.
Math problems? Call 0800-[(10x)(13i)^2]-[sin(xy)/2.362x]
-
- EGOSOFT
- Posts: 227
- Joined: Wed, 6. Nov 02, 20:31
gregs4163 wrote: How do you turn on Triple Buffering in X3? I thought it was a D3D game and triple buffering was for OpenGL games? I use nHancer and triple buffering is noted for OpenGL only. I would love this because I always use Vsync.
There is no setting, since X3's default is vsync on, and it will then use triple buffering. Just make sure you don't override the vsync setting via the driver panel.
As noted, what happens is that you have 3 framebuffers: one is shown on screen, one is rendered to, and one holds the "next" best full image. If the actual rendered frame is completed before the second, already finished frame can be shown on the monitor, it can be swapped in, which means you don't need to wait for the monitor to release the shown buffer.
Since nowadays we have lots of GPU memory, the extra framebuffer doesn't hurt memory that much. Vsync + triple buffering should be the default for all games nowadays anyway.
As a note on the "frames to render ahead" feature: this means that the driver can buffer up to 3 frames, which helps multithreaded code and also some internal optimizations. In some very rare cases lowering it to 1-2 frames or raising it to 4-5 "may" improve performance. But in general this can be left at the default all the time.
-
- Posts: 50
- Joined: Tue, 21. Oct 08, 16:37
Sorry, what I meant to say was that I switched Vsync to "application" in the driver control, so the driver was not forcing it on but let the application control it, and I still get tearing. I can see it most when there is a large planet taking up most of the background. When I force it on in the driver it goes away. I have plenty of graphics horsepower and video memory, so that isn't the problem: 9800 GT w/1024 MB RAM (650 MHz core, 1600 MHz shaders).
-
- Posts: 59
- Joined: Sun, 25. Jan 09, 06:01
NVidia has added X3TC Settings
Based on research by NVidia, in cooperation with Egosoft, they have added an X3TC profile to their drivers. This profile enables SLI by default, along with many other enhancements that weren't available before without customization. To see this profile:
- Go to the NVidia Control Panel
- Turn on Advanced Settings
- Click on Manage 3D Settings
- Click on Program Settings
- Uncheck the box "Show Only Programs Found on This Computer"
- Drop down the programs menu under section one and find X3TC (the demo is listed as well)
NOTE to SLI users: I find it best to use a monitoring app such as Everest Ultimate to monitor my temps and fan speeds. This way I know if I'm getting the most out of my dual-card config. Each time you update your NVidia drivers, you have to go into the control panel, select 'Set SLI' in the left pane, then click the radio button for "Enable SLI". Also, note that PhysX should be enabled as well. After much research I found that PhysX GPU processing does NOT take over one of your cards. Rather, it manages the resources efficiently and allocates a portion of each GPU for PhysX only when necessary.
Thanks,
shawn