Hi Guys,
I wrote a short game in C# with Managed DirectX, using the Direct3D interface.
On my computer and graphics card, the FPS is governed by the screen refresh rate: I get 85 FPS when using PresentationInterval = Default.
When I use PresentationInterval = Immediate, it jumps to 300 FPS!
On other computers, PresentationInterval = Default does not work at all: I can't see the graphics. Why is that?
On those machines the refresh rate is not being used to synchronize the program either. I want it to always run with PresentationInterval = Default, so the screen refresh rate paces the updates instead of rendering at 300 FPS.
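For reference, here is roughly how I set things up (a trimmed-down sketch, not my exact code; "this" stands for my render form and the rest of the window setup is omitted):

using Microsoft.DirectX.Direct3D;

// Simplified device setup for the sketch.
PresentParameters presentParams = new PresentParameters();
presentParams.Windowed = true;
presentParams.SwapEffect = SwapEffect.Discard;

// Default should sync Present() with the monitor refresh (85 FPS here);
// Immediate presents as fast as possible, which is where the 300 FPS comes from.
presentParams.PresentationInterval = PresentInterval.Default;

Device device = new Device(0, DeviceType.Hardware, this,
                           CreateFlags.SoftwareVertexProcessing, presentParams);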
Thanks, LDV