Just got a new laptop: a System76 Galago UltraPro with Intel Iris Pro graphics. So far, despite some shortcomings, I’m liking it. Programs spring into action, and Dota 2 runs with all the eye candy. But Second Life refuses to load any shaders, and as a result the screen looks like it’s from ten years ago.
So I had to roll my own viewer. After messing with stuff for two days, I’ve finally got Windlight atmospheric shaders working. Here’s how:
(Note: I use Kokua as my primary viewer, so this guide might not match the official release 100%, but most of it applies to all viewers.)
First you need to get the source and compile it successfully. Instructions are available on the Second Life wiki, although they aren’t very clear.
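Roughly, it boils down to something like this (a sketch only – the repository URL is a placeholder for whichever viewer you’re building, and the package list is incomplete, so check the wiki for the authoritative set):

# Build prerequisites (partial; see the wiki for the full list)
sudo apt-get install mercurial cmake build-essential
# Clone the viewer source; the URL here is a placeholder
hg clone https://example.org/kokua kokua
cd kokua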
The first problem is that gcc 4.8, the default on Ubuntu 13.10, complains that an “array subscript is above array bounds [-Werror=array-bounds]”. This is most likely a false positive, so use gcc 4.7 instead:
CC=gcc-4.7 CXX=g++-4.7 AUTOBUILD_PLATFORM_OVERRIDE='linux64' kokua-autobuild/bin/autobuild configure -c RelWithDebInfoOS -- -DLL_TESTS=OFF
(Remove AUTOBUILD_PLATFORM_OVERRIDE='linux64' if you’re building the official viewer – it doesn’t support 64-bit Linux that well.)
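If gcc 4.7 isn’t installed yet, it’s available in the standard Ubuntu 13.10 repositories:

sudo apt-get install gcc-4.7 g++-4.7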
gcc 4.7 still complains about conversions that truncate precision, so add explicit casts:
diff -r 4581f8365e7d indra/newview/llworld.cpp
--- a/indra/newview/llworld.cpp	Tue Mar 25 18:54:59 2014 -0500
+++ b/indra/newview/llworld.cpp	Sat Apr 05 19:59:19 2014 -0400
@@ -1427,8 +1427,8 @@
 	center_y = min_y + (wy >> 1);
 
 	S32 add_boundary[4] = {
-		512 - (max_x - (rwidth - 256) - region_x),
-		512 - (max_y - (rwidth - 256) - region_y),
+		512 - (max_x - (rwidth - 256) - (S32)region_x),
+		512 - (max_y - (rwidth - 256) - (S32)region_y),
 		512 - ((S32)region_x - min_x),
 		512 - ((S32)region_y - min_y) };
Now the viewer compiles. But this is only the beginning – we still need to make a few changes.
First, a few shaders in Second Life require some shader extensions. But per the OpenGL spec, #extension must come before all non-preprocessor directives… and yes, that includes comments! Other drivers silently accept those shaders, but Mesa considers them a bug in SL (see https://bugs.freedesktop.org/show_bug.cgi?id=69226).
The proper way would be to read each shader file and move all of its #extension directives to the front of the shader. For now, I’ll just use this hack, which works for non-deferred shaders:
diff -r 4581f8365e7d indra/llrender/llshadermgr.cpp
--- a/indra/llrender/llshadermgr.cpp	Tue Mar 25 18:54:59 2014 -0500
+++ b/indra/llrender/llshadermgr.cpp	Sat Apr 05 19:59:19 2014 -0400
@@ -629,8 +629,8 @@
 		text[count++] = strdup("#version 130\n");
 
 		//some implementations of GLSL 1.30 require integer precision be explicitly declared
-		text[count++] = strdup("precision mediump int;\n");
-		text[count++] = strdup("precision highp float;\n");
+		//text[count++] = strdup("precision mediump int;\n");
+		//text[count++] = strdup("precision highp float;\n");
 	}
 	else
 	{ //set version to 400
@@ -859,6 +859,7 @@
Then we need to edit the shaders themselves to move the #extension directive above the comments:
diff -r 4581f8365e7d indra/newview/app_settings/shaders/class1/effects/glowExtractF.glsl
--- a/indra/newview/app_settings/shaders/class1/effects/glowExtractF.glsl	Tue Mar 25 18:54:59 2014 -0500
+++ b/indra/newview/app_settings/shaders/class1/effects/glowExtractF.glsl	Sat Apr 05 19:59:19 2014 -0400
@@ -1,3 +1,5 @@
+#extension GL_ARB_texture_rectangle : enable
+
 /**
  * @file glowExtractF.glsl
  *
@@ -23,7 +25,7 @@
  * $/LicenseInfo$
  */
 
-#extension GL_ARB_texture_rectangle : enable
+
 #ifdef DEFINE_GL_FRAGCOLOR
 out vec4 frag_color;
diff -r 4581f8365e7d indra/newview/app_settings/shaders/class1/interface/downsampleDepthRectF.glsl
--- a/indra/newview/app_settings/shaders/class1/interface/downsampleDepthRectF.glsl	Tue Mar 25 18:54:59 2014 -0500
+++ b/indra/newview/app_settings/shaders/class1/interface/downsampleDepthRectF.glsl	Sat Apr 05 19:59:19 2014 -0400
@@ -1,3 +1,4 @@
+#extension GL_ARB_texture_rectangle : enable
 /**
  * @file debugF.glsl
  *
@@ -23,7 +24,7 @@
  * $/LicenseInfo$
  */
 
-#extension GL_ARB_texture_rectangle : enable
+
 #ifdef DEFINE_GL_FRAGCOLOR
 out vec4 frag_color;
diff -r 4581f8365e7d indra/newview/app_settings/shaders/class1/interface/glowcombineF.glsl
--- a/indra/newview/app_settings/shaders/class1/interface/glowcombineF.glsl	Tue Mar 25 18:54:59 2014 -0500
+++ b/indra/newview/app_settings/shaders/class1/interface/glowcombineF.glsl	Sat Apr 05 19:59:19 2014 -0400
@@ -1,3 +1,4 @@
+#extension GL_ARB_texture_rectangle : enable
 /**
  * @file glowcombineF.glsl
  *
@@ -29,7 +30,7 @@
 #define frag_color gl_FragColor
 #endif
 
-#extension GL_ARB_texture_rectangle : enable
+
 uniform sampler2D glowMap;
 uniform sampler2DRect screenMap;
diff -r 4581f8365e7d indra/newview/app_settings/shaders/class1/interface/splattexturerectF.glsl
--- a/indra/newview/app_settings/shaders/class1/interface/splattexturerectF.glsl	Tue Mar 25 18:54:59 2014 -0500
+++ b/indra/newview/app_settings/shaders/class1/interface/splattexturerectF.glsl	Sat Apr 05 19:59:19 2014 -0400
@@ -1,3 +1,4 @@
+#extension GL_ARB_texture_rectangle : enable
 /**
  * @file splattexturerectF.glsl
  *
@@ -23,7 +24,7 @@
  * $/LicenseInfo$
  */
 
-#extension GL_ARB_texture_rectangle : enable
+
 #ifdef DEFINE_GL_FRAGCOLOR
 out vec4 frag_color;
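If you’d rather not repeat that edit by hand for every affected file, a small script can hoist the directives for you (a sketch only: the glob is an assumption, and like the manual edits it simply puts #extension above the license comment – this is safe because the viewer prepends #version at runtime, as seen in llshadermgr.cpp above):

# Move every #extension line to the top of each shader file.
for f in indra/newview/app_settings/shaders/class1/*/*.glsl; do
    { grep '^[[:space:]]*#extension' "$f"
      grep -v '^[[:space:]]*#extension' "$f"; } > "$f.tmp" && mv "$f.tmp" "$f"
done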
Next, we need to edit the launcher script; otherwise libGL won’t be able to find the graphics drivers.
diff -r 4581f8365e7d indra/newview/linux_tools/wrapper.sh
--- a/indra/newview/linux_tools/wrapper.sh	Tue Mar 25 18:54:59 2014 -0500
+++ b/indra/newview/linux_tools/wrapper.sh	Sat Apr 05 19:59:19 2014 -0400
@@ -65,8 +65,11 @@
 	MULTIARCH_ERR=$?
 	if [ $MULTIARCH_ERR -eq 0 ]; then
 	    echo 'Multi-arch support detected.'
-	    MULTIARCH_GL_DRIVERS="/usr/lib/${I386_MULTIARCH}/dri"
+	    AMD64_MULTIARCH="$(dpkg-architecture -aamd64 -qDEB_HOST_MULTIARCH 2>/dev/null)"
+
+	    MULTIARCH_GL_DRIVERS="/usr/lib/${AMD64_MULTIARCH}/dri:/usr/lib/${I386_MULTIARCH}/dri"
 	    export LIBGL_DRIVERS_PATH="${LIBGL_DRIVERS_PATH}:${MULTIARCH_GL_DRIVERS}:/usr/lib64/dri:/usr/lib32/dri:/usr/lib/dri"
+	    echo ${LIBGL_DRIVERS_PATH}
 	else
 	    export LIBGL_DRIVERS_PATH="${LIBGL_DRIVERS_PATH}:/usr/lib64/dri:/usr/lib32/dri:/usr/lib/dri"
 	fi
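To sanity-check the result, you can ask dpkg for the multiarch triplet the script will use, and let Mesa’s own debug switch show what libGL is doing (adjust the launcher name to whatever your build produces):

$ dpkg-architecture -aamd64 -qDEB_HOST_MULTIARCH
x86_64-linux-gnu
$ LIBGL_DEBUG=verbose ./kokua 2>&1 | grep -i libgl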
Almost done! Now we need to mark Mesa as “supported” so the viewer will enable shaders:
diff -r 4581f8365e7d indra/newview/gpu_table.txt
--- a/indra/newview/gpu_table.txt	Tue Mar 25 18:54:59 2014 -0500
+++ b/indra/newview/gpu_table.txt	Sat Apr 05 21:03:49 2014 -0400
@@ -345,7 +345,7 @@
 Intel B45/B43				.*Intel.*B4.*						1	1	1	2.1
 Intel 3D-Analyze			.*Intel.*3D-Analyze.*				2	1	0	0
 Matrox						.*Matrox.*							0	0	0	0
-Mesa						.*Mesa.*							1	0	1	3
+Mesa						.*Mesa.*							1	1	1	3
 Gallium						.*Gallium.*							1	1	1	2.1
 NVIDIA GeForce Pre-Release	.*NVIDIA .*GeForce[ ]Pre-Release.*	2	1	1	3.3
 NVIDIA D1xP1				.*NVIDIA .*D1[0-4]P1.*				0	0	0	0
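The second numeric column is the “supported” flag. To confirm your driver will actually match the Mesa row rather than one of the Intel rows, check the renderer string Mesa reports (glxinfo comes from the mesa-utils package):

$ sudo apt-get install mesa-utils
$ glxinfo | grep 'OpenGL renderer'

If “Mesa” shows up in that string, the row we just edited is the one that should apply.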
Turn on cube maps too, or you’ll hit an assertion failure when you enable transparent water:
diff -r 4581f8365e7d indra/newview/featuretable_linux.txt
--- a/indra/newview/featuretable_linux.txt	Tue Mar 25 18:54:59 2014 -0500
+++ b/indra/newview/featuretable_linux.txt	Sat Apr 05 21:03:49 2014 -0400
@@ -441,7 +441,7 @@
 list Intel
 RenderAnisotropic			1	0
 // Avoid some Intel crashes on Linux
-RenderCubeMap				0	0
+RenderCubeMap				1	0
 RenderFSAASamples			1	0
 
 list GeForce2
Finally, if you’re building Kokua, you might need to comment out these lines so that the Preferences window can open:
diff -r 4581f8365e7d indra/newview/llfloaterpreference.cpp
--- a/indra/newview/llfloaterpreference.cpp	Tue Mar 25 18:54:59 2014 -0500
+++ b/indra/newview/llfloaterpreference.cpp	Sat Apr 05 21:03:49 2014 -0400
@@ -600,9 +600,9 @@
 
 void LLFloaterPreference::onShowStreamMetadataChanged()
 {
-	BOOL enable = gSavedSettings.getBOOL("ShowStreamMetadata");
-	
-	getChild<LLCheckBoxCtrl>("ShowStreamName")->setEnabled(enable);
+//	BOOL enable = gSavedSettings.getBOOL("ShowStreamMetadata");
+//	
+//	getChild<LLCheckBoxCtrl>("ShowStreamName")->setEnabled(enable);
 }
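With all the edits in place, rebuild with the same settings as the configure step and launch the viewer (the flags below just mirror that earlier command):

CC=gcc-4.7 CXX=g++-4.7 AUTOBUILD_PLATFORM_OVERRIDE='linux64' kokua-autobuild/bin/autobuild build -c RelWithDebInfoOS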