
@aras-p
Created November 1, 2012 16:47
// Macs with Radeon HD cards (2400-4850) have bugs when using depthstencil
// texture as both depth buffer for testing and reading as a texture. Happens
// at least on 10.5.8 to 10.6.4. Same happens on Radeon X1600 as well, so we
// just do the workaround for all Radeons.
//
// Also, 10.6.4 graphics driver update broke regular sampling of depth stencil
// textures. Some words from Apple's xxxx xxxxxxx:
// "What I suspect is happening is that ATI doesn't handle sampling from depth
// textures correctly if Hi-Z/Z-compression is on AND that texture has no
// mipmapping. I want to see if we can force the driver to alloc mipmaps for
// you. Typically we check a couple things to determine whether to allocate:
// whether the app has requested mipmaps, whether the sampling mode is
// a mipmap filter, if mipmaps have been manually specified, or if MAX_LEVELS
// has been changed from the default value."
// Setting TEXTURE_MAX_LEVEL to zero before creating the texture fixes this
// particular problem; but sampling of the texture while using it for depth
// testing is still broken.
#if UNITY_OSX
if (hasRenderTargetStencil && isRadeon)
{
	printf_console("GL: buggy packed depth stencil; Deferred rendering will use slower rendering path\n");
	gl.buggyPackedDepthStencil = true;
	buggyTextureBothColorAndDepth = true;
}
#endif
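The snippet does not show how `isRadeon` is computed. A minimal sketch of how such a flag is typically derived is a substring check against the GL renderer string; `RendererIsRadeon` is a hypothetical helper, not part of the original code:

```c
#include <string.h>

/* Hypothetical helper: returns 1 if the renderer string looks like any
   ATI/AMD Radeon part. The comment above applies the workaround to all
   Radeons (HD 2400-4850 and X1600 alike), so matching the "Radeon"
   substring is sufficient for this sketch. */
static int RendererIsRadeon(const char* renderer)
{
    return renderer != NULL && strstr(renderer, "Radeon") != NULL;
}
```

In real detection code the string would come from `glGetString(GL_RENDERER)` on the active context. The mipmap workaround the comment describes would then amount to calling `glTexParameteri` with `GL_TEXTURE_MAX_LEVEL` set to 0 on the depth-stencil texture before specifying its storage, so the driver never tries to allocate or sample mip levels for it.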