bugfix(smudge): Fix Microwave Heat Haze blackout on forced AA#2374
xezon merged 7 commits into TheSuperHackers:main
Conversation
Co-authored-by: stm <14291421+stephanmeesters@users.noreply.github.com>
| Filename | Overview |
|---|---|
| Core/GameEngineDevice/Source/W3DDevice/GameClient/W3DShaderManager.cpp | Core of the fix: preRender now permanently returns FALSE (disabling RTT for the default filter), MSAA detection added to init(), and startRenderToTexture gains a permanent-disable failsafe on SetRenderTarget failure; one minor concern around m_oldRenderSurface being released in the permanent-disable path of startRenderToTexture while m_currentFilter may be stale, but endRenderToTexture is correctly guarded by m_renderingToTexture. |
| Core/GameEngineDevice/Source/W3DDevice/GameClient/W3DSmudge.cpp | Removes USE_COPY_RECTS conditional compilation, unconditionally uses the CopyRects path; testHardwareSupport now reports SMUDGE_SUPPORT_YES when RTT is disabled globally if m_backgroundTexture is valid; render() acquires backBuffer and background at the top with proper null-check early returns; resource lifetimes appear correct across all return paths. |
| Core/GameEngineDevice/Include/W3DDevice/GameClient/W3DSmudge.h | m_backgroundTexture member unconditionally declared (USE_COPY_RECTS guard removed), consistent with making the CopyRects path always active. |
| Core/GameEngineDevice/Source/W3DDevice/GameClient/Water/W3DWater.cpp | Local SAFE_RELEASE macro definition removed as it is now centralised in WWCommon.h. |
| Core/Libraries/Source/WWVegas/WWLib/WWCommon.h | SAFE_RELEASE macro promoted to this shared header with a proper #ifndef guard to avoid redefinition; uses nullptr correctly. |
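As a rough illustration of the promoted macro (not the verbatim `WWCommon.h` contents; the exact expansion in the header may differ), a shared `SAFE_RELEASE` with a redefinition guard and `nullptr` might look like:

```cpp
// Hypothetical sketch of a centralised SAFE_RELEASE macro with a
// redefinition guard, as described in the table above.
#ifndef SAFE_RELEASE
#define SAFE_RELEASE(p)         \
    do {                        \
        if ((p) != nullptr) {   \
            (p)->Release();     \
            (p) = nullptr;      \
        }                       \
    } while (0)
#endif
```

The `#ifndef` guard lets translation units that previously defined their own copy (such as `W3DWater.cpp`) compile unchanged while they are migrated to the shared header.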
Sequence Diagram
```mermaid
sequenceDiagram
    participant V as W3DView
    participant F as ScreenDefaultFilter
    participant S as W3DSmudgeManager
    participant D as D3D8 Device
    Note over V,D: OLD PATH - RTT broken under forced MSAA
    V->>F: filterPreRender
    F->>D: startRenderToTexture - SetRenderTarget with MSAA depth
    Note over D: Silent depth corruption, black screen
    V->>D: doRender into RTT texture
    V->>F: filterPostRender - endRenderToTexture, blit fullscreen
    V->>S: render smudges over blit
    Note over V,D: NEW PATH - CopyRects always safe
    V->>F: filterPreRender
    F-->>V: return FALSE, RTT never started
    V->>D: doRender directly to back buffer
    Note over V,F: filterPostRender is NOT called
    V->>S: render
    S->>S: testHardwareSupport - SMUDGE_SUPPORT_YES via m_backgroundTexture
    S->>D: Get back buffer surface
    S->>S: background Copy from back buffer
    S->>D: draw smudge quads using copied background texture
```
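The "new path" steps above can be reduced to a self-contained sketch. The `ISurface`/`IDevice` types below are minimal stand-ins for `IDirect3DSurface8`/`IDirect3DDevice8`, and `captureBackground` is a hypothetical helper; only the call sequence (get the back buffer, then copy it) mirrors the real code in `W3DSmudge.cpp`.

```cpp
struct ISurface {
    int contents;                        // stand-in for pixel data
};

struct IDevice {
    ISurface backBuffer{42};             // stand-in back buffer
    // Stand-ins for IDirect3DDevice8::GetBackBuffer / CopyRects.
    bool GetBackBuffer(ISurface **out) { *out = &backBuffer; return true; }
    bool CopyRects(const ISurface *src, ISurface *dst) {
        dst->contents = src->contents;   // whole-surface copy
        return true;
    }
};

bool captureBackground(IDevice *dev, ISurface *background)
{
    ISurface *backBuffer = nullptr;
    if (!dev->GetBackBuffer(&backBuffer) || backBuffer == nullptr)
        return false;                    // early out: nothing acquired yet
    // Copying the already-resolved back buffer never rebinds the render
    // target, so a non-MSAA surface is never paired with a secretly-MSAA
    // depth buffer. This is why the CopyRects path stays safe under
    // driver-forced MSAA where the RTT path breaks.
    return dev->CopyRects(backBuffer, background);
}
```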
Path: Core/GameEngineDevice/Source/W3DDevice/GameClient/W3DShaderManager.cpp
Line: 183-187
Comment:
**`postRender` is now permanently dead code**
`ScreenDefaultFilter::preRender` always returns `FALSE`, so `W3DView.cpp` gates `filterPostRender` behind `if (preRenderResult)` and this function is never invoked. Despite that, it still calls `W3DShaderManager::endRenderToTexture()` and asserts on the result. Since it can never be reached through the normal filter dispatch, the assertion will never fire — but the function body is now misleading for future maintainers, and it references the RTT texture that is no longer populated for this filter. Consider either removing the body (returning `false` immediately with a comment) or removing the function entirely if no derived class relies on it.
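A self-contained sketch of the "return `false` immediately with a comment" option suggested above. `Bool`/`FALSE` and the method signature are stand-ins for the engine's actual declarations, not the real `W3DShaderManager.cpp` code:

```cpp
typedef int Bool;                        // stand-in for the engine typedef
enum { FALSE = 0, TRUE = 1 };

struct ScreenDefaultFilter {
    Bool postRender(Bool &doExtraRender)
    {
        // Dead by design: preRender() always returns FALSE, and W3DView
        // only dispatches filterPostRender when preRender returned TRUE.
        // Return immediately instead of calling endRenderToTexture() on
        // an RTT texture that is no longer populated for this filter.
        doExtraRender = FALSE;
        return FALSE;
    }
};
```

Keeping a stub rather than deleting the method preserves the virtual dispatch surface in case a derived class still overrides it.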
Last reviewed commit: f7ffb6a
Co-authored-by: greptile-apps[bot] <165735046+greptile-apps[bot]@users.noreply.github.com>
Maybe we can accurately tell if MSAA was forced with some of the changes in #1073 like below, and use that in place of

Do you know if the RTT path is much faster when MSAA is off compared to copyRects? Is it worth the extra complexity to support both?
To find this bug I ran a benchmark, but the test behaved differently on different vendors. Is a benchmark needed here?
I tried benchmarking them, but the game lacks robust benchmarking tools. Using CapFrameX with a DX8 bridge means we’re mostly benchmarking the wrapper’s translation overhead; there’s too much fluctuation between runs to see a real difference between RTT and copyRects. It might be worth keeping both around in case we move to a native DX9 base where RTT isn't broken?
If you're up for it you could give #2202 a try; add a zone in the smudge code, then do a test run on a replay or something that's kinda reproducible. Tracy can compare histograms of zones between two different captures.
I actually tried to set this up by bringing the Tracy profiling (#2202) into this branch to compare them. Looking at the traces, the overall render zone copy is almost invisible below PassDefault in both pathways. The smudge rendering is incredibly fast either way. I added the detail zones and did a direct comparison between the RTT and CopyRects (smudge copy) pathways. A true 1:1 benchmark would require reverting to a pre-safety commit, reapplying Tracy, and resolving build conflicts, which I've partially done but found overly time intensive. Given the minimal frame time impact, I think CopyRects is sufficient here, but if you have ideas for a cleaner benchmark setup, I'm happy to iterate. These are the results:
I will take a look at this soon.

Edit: D3DFormat doesn't appear to report accurately either, so it's not useful for detecting forced MSAA. Maybe you can use the surface mismatch that's throwing API errors as a detector (if it's important to detect this... maybe not).
I can confirm that it also fixes the Smudge on my machine (AMD Video Card). Very nice.
I did a very basic performance comparison between before and after fix with 35 Microwave tanks in an otherwise empty scene.
Before: ~230 FPS
After: ~230 FPS
I don't think we need to worry about performance for now. The Microwave tank is also not a unit that is spammed much.
Title needs updating because it is not an Nvidia-specific bug.
So this could do with a follow-up to remove all of the RTT code from the smudge filter.





This PR aims to resolve the black screen bug caused by driver-forced MSAA, continuing and refining the work started in #1073.
The Bug: When users force MSAA via their driver control panel, the driver secretly upgrades the depth buffer to be multisampled. However, the Direct3D 8 API still reports MultiSampleType=NONE for created textures. When ScreenDefaultFilter attempts to use Render-To-Texture (RTT), it binds a non-MSAA texture to this secretly-MSAA depth buffer. This surface mismatch is a D3D API violation that silently breaks depth testing, resulting in a black screen.
This PR fixes the issue by permanently disabling the RTT path inside ScreenDefaultFilter::preRender (falling back to the CopyRects smudge path).
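The shape of that fix, reduced to a self-contained sketch. `Bool`/`FALSE` and the `skipRender` parameter are stand-ins for the engine's actual types and signature; the real method lives in `W3DShaderManager.cpp`:

```cpp
typedef int Bool;                        // stand-in for the engine typedef
enum { FALSE = 0, TRUE = 1 };

struct ScreenDefaultFilter {
    Bool preRender(Bool &skipRender)
    {
        // Never start render-to-texture: the scene is drawn directly to
        // the back buffer, so the mismatched SetRenderTarget call (a
        // non-MSAA texture paired with a driver-upgraded MSAA depth
        // buffer) is never issued.
        skipRender = FALSE;              // still render the scene normally
        return FALSE;                    // also keeps filterPostRender from running
    }
};
```

Returning `FALSE` here is what routes smudge rendering onto the CopyRects background-capture path instead.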