
This page contains archived drivers

for UT, Rune and Deus Ex.

This page is a copy of the original by Chris Dohnal.

I put it here because those pages have been lost to time.

Every file can be found on my website, in the downloads section, or Click here

Enhanced OpenGL renderer for Unreal Tournament, Rune and Deus Ex

Last updated August 28, 2008

I added some enhancements to the OpenGL Renderer for Unreal Tournament. A Win32/x86 binary and the source code are available on this page.

The settings page documents some of the options.

The latest stable OpenGL renderer is version 3.4: (111 KB).

The previous OpenGL renderer is version 3.2: (48 KB).

It's just a new OpenGLDrv.dll as noted in the Installation instructions section.

The latest stable Direct3D8 renderer is version 1.4: (102 KB).

A previous Direct3D8 renderer is version 1.2: (41 KB).

It's a couple of new files that go in the UnrealTournament\System directory. To use it, change video drivers in UT, select show all devices, and then select Direct3D8 Support. If the OpenGL renderer doesn't work right on your system for whatever reason (video driver problems or some other issue), you can try using this one instead. It has mostly the same feature set as the updated OpenGL renderer. Note that this Direct3D8 renderer doesn't support selection in the editor, so consider it unsupported and not fully functional as far as use with the editor is concerned.
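If you'd rather switch renderers by editing UnrealTournament.ini directly instead of using the in-game menu, the active device is set in the [Engine.Engine] section. The device class name below follows the stock naming convention and is my assumption for this renderer, so verify it against the files in your System directory:

```ini
[Engine.Engine]
GameRenderDevice=D3D8Drv.D3D8RenderDevice
```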

For Deus Ex (works with Deus Ex version 1112fm):

OpenGL renderer version 1.8 for Deus Ex: (111 KB).

Direct3D8 renderer version 1.3 for Deus Ex: (102 KB).

For Rune (works with Rune version 1.07 or compatible):

OpenGL renderer version 1.2 for Rune: (112 KB).

Direct3D8 renderer version 1.2 for Rune: (103 KB).

Latest news

No news.


Finished a D3D9 renderer. Its feature set is very similar to the most recent OpenGL renderer. Like the D3D8 renderer, it doesn't support selection in the editor.

- Fixed a minor bug with masked texture blending and OneXBlending disabled that's present in the latest D3D8 renderer.

- All pixel shaders are version 2.0 (if UseFragmentProgram is enabled). Many could easily be done with 1.x, but that wouldn't offer much over fixed function unless the most complicated pixel shaders are in use.

- Added SceneNodeHack to all builds and set to enabled by default.

- Picks up SinglePassDetail paths for base texture only plus detail texture. Latest OpenGL renderer has this, but latest D3D8 renderer does not.

Version 1.0 for UT: (106 KB).

Version 1.0 for Deus Ex (works with Deus Ex version 1112fm): (106 KB).

Version 1.0 for Rune (works with Rune version 1.07 or compatible): (107 KB).

The source code package for this version of the D3D9 renderer is (64 KB). It contains MSVC8 project files. If rebuilding the source with this compiler, make sure to use service pack 1 to avoid a code generation bug with some of the SSE code. Also make sure to apply the UTGLR_NO_APP_MALLOC changes to the copy of UnFile.h that comes with the headers in the Core/Inc directory to avoid problems with certain debug features and sstream class usage.


New D3D8 renderer builds to fix a bug related to internal texture tracking that can occur when handling a lost D3D device. This bug is unlikely to cause any noticeable problems in previous builds unless you're using single pass fog or single pass detail texture modes. If it does occur in a previous build, a flush command from the console will fix things.

Version 1.4 for UT: (102 KB).

Version 1.3 for Deus Ex (works with Deus Ex version 1112fm): (102 KB).

Version 1.2 for Rune (works with Rune version 1.07 or compatible): (103 KB).

The source code package for this version of the D3D8 renderer is (56 KB). It contains MSVC8 project files. If rebuilding the source with this compiler, make sure to use service pack 1 to avoid a code generation bug with some of the SSE code. Also make sure to apply the UTGLR_NO_APP_MALLOC changes to the copy of UnFile.h that comes with the headers in the Core/Inc directory to avoid problems with certain debug features and sstream class usage.


Built a new experimental Deus Ex renderer (beta page). It contains a few fixes, features, and minor improvements.


I think some modifications will need to be made to the HDR mode in the not-a-joke joke beta renderer. Not entirely unexpected. It's still unfortunate that, instead of getting HDR for free by pulling lighting into the fragment program, I have to look at adding extra calculations in an attempt to keep things from looking too different in various cases. HDR is supposed to make things look brighter when there are bright lights, but without the extra calculations some of those bright lights might end up far brighter than they're supposed to be without much other noticeable effect. Specular could be a problem without more detailed material properties built into the engine and levels. I think I'll look into significantly limiting specular while trying not to break anything else too badly in this mode. Hopefully this mode can still work to a lesser extent in cases where it's supposed to, including allowing bright colored lights to have a greater impact on the color of the meshes they illuminate.

I'm not sure when I'll get a chance to look at this code again, so it may be best to leave PixelLightHDR disabled in most cases for now. I think this should leave things looking very similar to the existing lighting in most cases. It probably won't look too much different in screenshots unless looking at certain cases where the vertex lighting doesn't work so well. It should make things look a bit smoother when lit objects are moving. I'm hoping it won't be a major problem in cases where it does occur, but in some cases certain objects could end up with darker corners or edges that might look better if lit up a little bit more.

If you want the same lighting, but have lots of mesh polys on screen, the new mesh path without pixel lighting enabled might be worth testing. Although it's not designed to be a pixel perfect match, as long as there aren't any bugs (possibly due to less common cases I missed and didn't modify it to handle), it should be close enough that the differences won't be noticeable unless doing detailed compares on screenshots. I haven't taken much time to try to benchmark the new mesh path across various cases, but if you're CPU limited with lots of mesh polys on the screen, it should be faster.

As some already know, UT and Deus Ex aren't the fastest when it comes to handling high poly counts. They do a lot of per vertex processing on the CPU. Even if there's no fog, this still ends up taking around 50 instructions per mesh vertex to figure out on the CPU side. I didn't look at optimizing this case yet, but there are a few other far more significant cases taken care of in the new mesh code along with the more streamlined renderer interface it's using. There's also some SSE code in the new mesh code that should help a bit. Unlike the SSE/SSE2 code in the renderer, some of this SSE code should make more of a difference. Fast approximate reciprocal square roots and floating point compares are good.


April 1st. Time for that experimental Deus Ex renderer with some new per pixel lighting features. It's on the beta page. Requires a video card with advanced feature support that doesn't exist yet. No, that part's not right. A video card with OpenGL vertex program and fragment program support should be enough (DirectX 9 class video hardware). Remember that this one is an experimental beta build. It may not run smoothly and has a much higher probability of having rendering bugs in certain areas.


Although I'm not actively working on this project anymore, I've still been a bit short on time over the past few months. Since I messed up the projection matrix in previous D3D8 renderers, I'd recommend only using the latest version 1.2, though I'll leave the version 1.1 link around for a while.

Although it might not break anything in UT, it's also best to use the latest OpenGL and Direct3D8 renderers because of the color clamping fix I added to the BufferTileQuads path in these. The only thing I currently know it breaks with the old code is coronas going multi-color in Unreal when getting close enough to them, but there's always some chance it might break something else too.

I noticed one map where enabling ZRangeHack partially breaks skybox rendering, though it's not too major in this case. If you have a 24-bit z-buffer maximum, which is very likely these days, it's a choice between enabling this option or having far away decals flicker due to z-fighting with an OpenGL renderer or the Direct3D8 renderer. Enabling this option also partially breaks wireframe first person weapon rendering, but I don't consider that a major problem.

There are a few settings that get mixed in with the device specific renderer settings but are controlled exclusively by higher level code (probably render.dll). Unless going entirely for speed rather than quality and features, you may want to enable some or all of these:

- HighDetailActors

- Coronas

- ShinySurfaces

- VolumetricLighting

These will all come up disabled with a new D3D8 renderer install, or if you delete all settings in the .ini file to get defaults back, etc.
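For example, enabling all four in UnrealTournament.ini looks like the fragment below. The section name matches the D3D8 renderer and is an assumption on my part; these options live in whichever section belongs to your active render device:

```ini
[D3D8Drv.D3D8RenderDevice]
HighDetailActors=True
Coronas=True
ShinySurfaces=True
VolumetricLighting=True
```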

If you have problems with slowdowns when larger animated textures are in view (often water or flame textures) and you don't mind losing the animation, you can try enabling NoFractalAnim in the Display section of Advanced Options. Large per frame texture uploads may be slow on some systems and disabling this feature can avoid them.


I built another Glide renderer with a few more minor efficiency updates. I also added code to support wireframe (only tested on a Voodoo2, not an original Voodoo, so I hope there are no clipping related problems). The file is available for download, as is the updated source code.


I think I found out why some coronas display an incorrect faintly visible box in areas where they should be transparent in OpenGL and D3D, but not in Glide. The texture GenFX.LensFlar.3 contains the RGB color (1, 1, 1) instead of (0, 0, 0) in areas where it should be transparent. I believe this problem gets corrected by another bug in the Glide renderer. The fast x86 version of the appFloor() function in UT ends up subtracting 1 from every other integer (probably would have been good to bias it a little to move the error case to very slightly under every other integer). In the case of the lens flare texture in question, this turns all those 1s in the palette into 0s, which avoids the problem. It also means each palette entry only selects from 7 bits instead of 8 bits in this case, though in some quick tests I ran, I couldn't notice any perceptible difference due to this sometimes lost low bit. In cases where the full range of a color channel isn't used, the incorrect rounding is far less likely to make as much of a difference (though still some limited cases it could frequently change things a little). This is based on my analysis of the OpenUT Glide renderer source, which is a little out of date, but there's a very good chance this part of it is still the same in the UT Glide renderer.

Since I was looking through the Glide renderer source in more detail anyway, I added a few tweaks to it and built a new one. The OpenUT Glide renderer doesn't contain line drawing code, and I didn't try to add it, so some render debug modes won't work right with it (wireframe models won't draw). Hopefully this doesn't break anything critical. It also won't draw detailed debug stats. Other than this, I didn't notice anything different, though I also didn't spend a lot of time looking. So, if you're interested in trying a tweaked Glide renderer, I built one that should be a little faster. Make sure to keep a backup of the original renderer, especially since I know this one doesn't implement a couple of things (that are hopefully not really needed). I'm not sure how useful it'll be in 2006..., but perhaps there are still a number of Voodoo3s out there in older systems. I tested it a little on my old Voodoo2 and it seemed to work okay. I'm not sure I'm interested in spending any more time on this one, but if you notice any obvious bugs you can send me the info. Hopefully it would be something minor or easy to fix; otherwise I'd probably just decide not to fix it and go back to the original one.


The simple workaround I tried for timer related problems has other not so minor side effects, so I won't be releasing it. So, for multiple reasons, UT is likely to have trouble on a significant number of systems with Pentium M or Athlon 64 X2 processors, and possibly some other less common configurations. There are a couple ways I might be able to write something that can avoid these timer related problems externally, but I'd rather not. These potential solutions are more difficult to implement and/or potentially slightly unreliable compared to the proper fix. The proper solution is fairly easy to implement, but it's core engine stuff so you should refer this one to UTPG or Epic. The timer APIs to use are supported all the way back through Win95 and WinNT 3.1 so there wouldn't be any need for special dynamic linking to preserve backwards compatibility.

I discovered something else of interest while experimenting with the timer workaround. It looks like parts of UT's typically always on internal profiling code may be using the rdtsc instruction a bit too much when it comes to running efficiently on P4s. This instruction is very slow on P4s, and although I'd need to do some better tests with a reliable external time source to know for sure, I have good reason to believe that this code may slow down the frame rate by 5%+ in various demos I use for testing. What I can say for sure is that getting rid of a couple of close proximity rdtsc instructions in the lower half renderer side of the buffered actor triangle code path was able to improve the frame rate by a little over 1% on my P4 2.8C in select frames with a lot of mesh triangles (I usually use 6-7 bots on screen and nearby for these tests). If I ever do another renderer release, these instructions are gone in a couple places since it's fair to consider these areas a subset of other profiling times for the buffered paths. This still leaves a lot of likely similar slow cases elsewhere however. These profiling times aren't essential during normal game operation, so it's just a matter of having a simple flag/setting to disable them (if not doing better updates that might not make this so significant...). If an existing flag already used in the current code were split, this could be implemented with no additional overhead compared to what's already there.
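The flag/setting idea above can be sketched as follows. This is an assumed shape, not UT's actual profiling code; the flag name and struct are hypothetical. The point is that when the flag is off, the slow rdtsc instruction is never executed at all:

```cpp
#include <cstdint>
#if defined(_MSC_VER)
  #include <intrin.h>
  #define HAVE_RDTSC 1
#elif defined(__i386__) || defined(__x86_64__)
  #include <x86intrin.h>
  #define HAVE_RDTSC 1
#else
  #include <chrono>
#endif

// Hypothetical disable flag; off during normal game operation.
static bool GProfileEnabled = false;

static inline uint64_t ReadCycles()
{
#ifdef HAVE_RDTSC
    return __rdtsc();   // the slow-on-P4 instruction, gated by the flag
#else
    // Portable fallback so this sketch builds on non-x86 targets.
    return (uint64_t)std::chrono::steady_clock::now().time_since_epoch().count();
#endif
}

// Scoped cycle counter: accumulates elapsed cycles into a total,
// but skips both rdtsc reads entirely when profiling is disabled.
struct FScopedCycleCounter {
    uint64_t  Start = 0;
    uint64_t& Accum;
    explicit FScopedCycleCounter(uint64_t& InAccum) : Accum(InAccum) {
        if (GProfileEnabled) Start = ReadCycles();
    }
    ~FScopedCycleCounter() {
        if (GProfileEnabled) Accum += ReadCycles() - Start;
    }
};
```

With the flag tested once per scope, the disabled case costs a predictable branch instead of two rdtsc executions, which is the overhead reduction being described.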


Numerous versions of ATI's drivers up to and including the current version 5.8 have issues that may affect gamma correction for UT in OpenGL mode, and various other games and applications. This isn't anything I can fix. If gamma correction doesn't work, try moving the in game gamma slider back and forth as this will sometimes fix it. Switching out of full screen mode and back again may also sometimes work around the problem.

Now that I added a second monitor, ATI's gamma correction problems are even worse. Their control panel reports that the hotkeys to adjust gamma for full screen 3D don't work with the desktop extended to another monitor. I disabled the second monitor and the hotkeys still didn't work with D3D games where I had seen them working before. I never had any luck with them working with various OpenGL applications the few times I'd tried. So, the hotkeys don't work with anything I use now and yet their drivers still mess with SetDeviceGammaRamp.

ATI drivers have had glitchy hardware gamma ramp support for around a year and a half now (since version 3.10 I believe). You can ask ATI if they ever intend to fix it.

A detail texture issue that comes up every once in a while may be due to a hardware bug in ATI R300 family chipsets. This is the one that may look similar to a mipmap line when using bilinear filtering, but is actually something different. I'm not going to take the time to try to prove this one in the case of UT, but I wouldn't be surprised if this issue only shows up on ATI R300 and later hardware (unless avoiding it by using single pass detail texture mode).

ATI has other driver bugs and issues. Although they took care of the major stability problems a while ago, their OpenGL support continues to be weak or broken in various areas.

I created a list of bugs I won't fix because they're not in the renderer. Some of the things that end up on this list may be due to video driver bugs. Most are due to things elsewhere in the game engine code. Only the lower half of the renderer was open sourced. There are things in the upper half of the renderer that I can't really fix. Other issues may be caused by other code in the game engine that isn't in the upper half of the renderer, but as long as it's not in the lower half of the renderer, it's probably nothing I can fix.


Various things.

Broken TruForm support in the renderer

TruForm support in the renderer is broken for a few reasons. Consider it an experimental and incomplete feature right now. There is no easy fix for the problem with player models where it doesn't look good when enabled. There are two other fixes that could be made to other parts of the game engine code that could correct two other outstanding issues.

Higher level rendering code clips actor polygons to the edge of the screen. This destroys information contained in normals needed to implement TruForm correctly. This will lead to polygons that cross the edge of the screen having minor to potentially severe graphical corruption. This is trivial to fix, but the code that does it isn't in the part of the renderer that was open sourced, so it's nothing I can fix right now. Also, with many video card/driver combinations, letting the driver or hardware deal with clipping polygons that are partially clipped by the edge of the screen should speed things up a little.

Once actor triangles make it to DrawGouraudPolygon in the renderer, there's no good way to tell if they're from a player actor that should have TruForm applied or some other actor that should not have TruForm applied. It would probably be fairly easy for higher level rendering code to use a new PolyFlags bit to tag triangles from objects that should have TruForm applied if enabled. This could reliably eliminate problems with weapons and other objects that look bad with TruForm applied. Note that the TruFormMinVertices setting attempts to solve these problems, but cannot do so reliably, and while it can fix some cases, it will also break others.

Linux builds

I never seem to hear any good news about attempts to build the updated renderer on Linux. Unfortunately, I can only provide limited help in this area. I do try to keep the code cross platform friendly, but it's unlikely that I will be able to attempt to build it on other platforms anytime in the near future. I know the current code won't compile as is with gcc, but I expect that only minor syntax fixups and removal of basic non-essential features will be enough to make the updates I added both compile cleanly and work correctly.

The first major step in attempting to build the updated renderer on Linux is to make sure you can build the original renderer code before I added any updates to it. This will ensure that there are not any major existing issues before going forward and attempting to use the new code. If any problems are encountered in this stage, it's not really anything I can help with much because I don't have a local build environment for this platform, didn't write this code, etc. If there's some problem like maybe the ut432 header files don't quite match the current Linux version of UT and cause problems because of this, that falls into the I didn't break it and I can't fix it category.

You'll need to use a compatible version of gcc. Unfortunately, ABI changes mean you will almost certainly not be able to use a newer version of gcc (unless the rest of the game were compiled with it of course).

You can easily ifdef out the SSE code I added because whatever compiler you use probably won't support it. This is not a major loss since the SSE code I added only provides very minor speedups.

There's a chance there will be problems with the sstream code I used for the debug stream when using an older gcc and/or older libraries. Although it requires numerous changes to remove it, this code is non-essential, and the changes should be simple.

There are good reasons to try to get an updated renderer working on Linux if you're running UT natively there. Besides just being very obsolete at this point, the original OpenGL renderer code does contain a couple of more major design/implementation issues that would be good to have fixed.


I may have fixed the problem with UnrealEd not restoring gamma on exit, but I still need to review the changes to make sure they have little risk of causing new problems. It doesn't help that ATI's drivers still do odd stuff with gamma correction. They're getting close to a year of breaking things in this area to various degrees. There's something odd about their installer for 4.9 too, as I had to temporarily rename my bin directory to prevent it from failing.


I built a new version of SetGamma that fixes various minor problems. It's a simple command line utility program that adjusts the hardware gamma ramp on the primary display adapter. A shortcut that sends it the -reset option can be used to reset the hardware gamma ramp to 1.0 after a crash that prevents it from being restored.

Some of the old news gets moved to the News Archive page.


- Additional options are documented in the [New options] section.

Installation instructions

Go to your UnrealTournament\System directory. Make a backup of your old OpenGLDrv.dll in case the new one doesn't work. Then put the new OpenGLDrv.dll in your UnrealTournament\System directory. This one contains a number of optimizations that should improve performance over the base UT 4.36 OpenGL renderer. It also contains a number of new options, which are described further down on this page.

OpenGLDrv.dll for Win32/x86: (111 KB)

OpenGLDrv.dll for Win32/x86: (49 KB)

OpenGLDrv.dll for Win32/x86: (48 KB)

OpenGLDrv.dll for Win32/x86: (48 KB)

OpenGLDrv.dll for Win32/x86: (48 KB)

OpenGLDrv.dll for Win32/x86: (48 KB)

OpenGLDrv.dll for Win32/x86: (46 KB)

Source code: (110 KB)

Source code: (117 KB)

Source code: (115 KB)

Source code: (110 KB)

Source code: (110 KB)

Source code: (95 KB)

Source code: (94 KB)

Notes about the source code

The source code has been modified extensively. Although I did not try to break Linux support completely, I did add some Windows specific code. Feel free to email me at if you need any help getting it to build on Linux. Make sure to add the NO_UNICODE_OS_SUPPORT define when building it on Win32.

The source code package only contains .cpp and .h files from the OpenGL\Src subdirectory, which is where my changes are. You will need to get the 432 headers from Epic to be able to build it. You can download these from the Unreal Technology Downloads page.

For version 1.2 and up, I had to remove the operator new and delete overrides to make the new C++ debug functions work. I included a copy of the modified UnFile.h with the proper ifdefs. I just have it pass things through to malloc and free instead. I believe the problem may be with the overrides not handling 0 byte allocations as malloc and new do.
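A minimal sketch of that pass-through idea follows. This is the assumed shape of the change, not the literal UnFile.h diff: the global operator new/delete overrides simply forward to malloc/free, taking care that a 0-byte request still yields a unique, freeable pointer (as both malloc and new require). The macro is defined here just for the sketch; a full version would also cover the array forms.

```cpp
#define UTGLR_NO_APP_MALLOC 1   // defined here just for this sketch

#include <cstddef>
#include <cstdlib>
#include <new>

#ifdef UTGLR_NO_APP_MALLOC
// Pass-through override: lets the C++ debug runtime and <sstream>
// behave normally instead of going through the app's allocator.
void* operator new(std::size_t Size)
{
    // malloc(0) may return a null pointer, but operator new must not,
    // so round 0-byte requests up to 1 byte.
    void* Ptr = std::malloc(Size ? Size : 1);
    if (!Ptr) throw std::bad_alloc();
    return Ptr;
}

void operator delete(void* Ptr) noexcept
{
    std::free(Ptr);
}
#endif
```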



New options

This enhanced UT OpenGL renderer supports some new options. They go in the [OpenGLDrv.OpenGLRenderDevice] section of your UnrealTournament.ini file. Most options are documented on the settings page.
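As an illustration, the section can look like the fragment below. The option names are ones mentioned on this page; the values shown are just examples, and the defaults may differ between renderer versions, so check the settings page before copying anything:

```ini
[OpenGLDrv.OpenGLRenderDevice]
UseFragmentProgram=True
OneXBlending=True
SceneNodeHack=True
ZRangeHack=False
TruFormMinVertices=0
```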


I'd like to thank Epic Games for releasing the source code to the UT OpenGL renderer, which made adding these updates to it possible.

NitroGL for the original TruForm renderer modification. Initial experimental TruForm code is based on these modifications.

Leonhard Gruenschloss for help with implementing and testing additional TruForm related updates and new Deus Ex specific code.

Copyright 2002-2008 Chris Dohnal