How can I use UnityShaderCompiler.exe?
In C:\Program Files\Unity\Editor\Data\Tools there is UnityShaderCompiler.exe, which is used by Unity for compiling shaders. Is it possible to use this by itself?
I've found that if I set up a TCP server, at, say, port 4567, then run
UnityShaderCompiler.exe "C:\Program Files\Unity\Editor\Data\Tools" "C:\Users\Me\Desktop\shadercompiler.log" "4567"
It will connect to my server, and sit waiting for requests. However, every request that I am trying to send to it seems to crash it. What kinds of requests does it expect?
I found this, but it only works for Unity 4.5, so I might end up repeating what they did except with 5.4 instead. I'm interested if anyone else has already done this though, or knows an alternate answer to my question.
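Roughly, the server side I'm testing with is nothing more than a bare TcpListener along these lines (a stripped-down sketch, not my exact code):
using System;
using System.Net;
using System.Net.Sockets;

class CompilerHost
{
    static void Main()
    {
        // Listen on the (arbitrary) port that gets passed to UnityShaderCompiler.exe
        var listener = new TcpListener(IPAddress.Loopback, 4567);
        listener.Start();
        Console.WriteLine("Waiting for UnityShaderCompiler.exe to connect on port 4567...");

        // Launch UnityShaderCompiler.exe with this port as its third argument;
        // it connects here and then sits waiting for requests on this stream.
        using (TcpClient compiler = listener.AcceptTcpClient())
        using (NetworkStream stream = compiler.GetStream())
        {
            // ... this is where the requests that keep crashing it get written ...
        }
    }
}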
Just to be clear, what exactly are you trying to use it for?
I'd like to be able to compile shaders (preferably compute shaders as well) during runtime
Answer by Bunny83 · Jun 19, 2016 at 01:09 AM
Well, your actual use case is still not clear. First of all, you should be aware that UnityShaderCompiler.exe belongs to the Unity editor and most likely isn't allowed to ship with your product. It's not part of the engine.
Next, the Unity engine itself doesn't support loading any kind of shader code dynamically, neither as high-level source code nor as compiled code. So I'm not sure what you think you might use that program (UnityShaderCompiler) for.
About the protocol: it seems to be a proprietary, message-based protocol built on top of a TCP stream. Every message is initiated with a magic byte sequence which looks like:
E6 C4 02 0C
followed by a 4-byte length field in little endian, followed by the actual message data. Which message is expected next seems to vary depending on the current command.
Commands seem to be prefixed with "c:" (without the quotes). Some commands I have encountered are:
c:getPlatforms
c:preprocess
c:compileSnippet
c:disassembleShader
Some commands expect further messages to follow. For example, c:preprocess seems to expect 4 additional messages. So after sending c:preprocess, Unity sends:
The whole shader file as text
The asset directory (i.e. "Assets/SomeFolder" where the shader in question is located in "SomeFolder")
The shader include directory (usually: "C:/Program Files/Unity/Editor/Data/CGIncludes")
and finally a 1 byte message which contains the ASCII character "0" (hex 0x30). This might be some kind of setting / flag / ???
Again, each of those messages is initiated with the magic 4-byte sequence, followed by a 4-byte length which indicates the length of the payload data.
Each command (once all of its parts are sent) is usually answered by several messages from the ShaderCompiler, and the exact structure of those messages differs for each command.
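If you want to experiment with it yourself, a message could be framed roughly like this in C#. This is just a sketch based on the observations above: the text encoding, the payload contents and the shader file name "MyShader.shader" are assumptions, so treat it as a starting point rather than a spec.
using System;
using System.IO;
using System.Text;

static class ShaderCompilerMessage
{
    // Magic byte sequence that starts every message
    static readonly byte[] Magic = { 0xE6, 0xC4, 0x02, 0x0C };

    // Frame one message: magic sequence, 4-byte little-endian length, then the payload
    public static void Send(Stream stream, byte[] payload)
    {
        stream.Write(Magic, 0, Magic.Length);
        byte[] length = BitConverter.GetBytes(payload.Length); // little endian on x86/x64
        stream.Write(length, 0, 4);
        stream.Write(payload, 0, payload.Length);
    }

    // Convenience overload for text payloads (the encoding is a guess)
    public static void Send(Stream stream, string text)
    {
        Send(stream, Encoding.UTF8.GetBytes(text));
    }
}
Sending a c:preprocess request would then be the command message followed by its four payload messages, for example:
// "stream" is the NetworkStream connected to UnityShaderCompiler.exe
ShaderCompilerMessage.Send(stream, "c:preprocess");
ShaderCompilerMessage.Send(stream, File.ReadAllText("MyShader.shader"));             // whole shader file as text
ShaderCompilerMessage.Send(stream, "Assets/SomeFolder");                             // asset directory of the shader
ShaderCompilerMessage.Send(stream, "C:/Program Files/Unity/Editor/Data/CGIncludes"); // shader include directory
ShaderCompilerMessage.Send(stream, "0");                                             // the 1-byte "0" flag message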
You can simply create a proxy program that sits between Unity and UnityShaderCompiler.exe. I wrote such a general-purpose program a long time ago: it's simply a TCP server and a TCP client which you can start / connect separately from each other, with an option to simply forward everything to the other side. If you write your own proxy, you can put it between Unity and the UnityShaderCompiler by following these steps on a Windows machine:
Start Unity.
Open the Task Manager and check the Details tab. Make sure the "Command line" column is displayed. Find UnityShaderCompiler.exe and check its command line; there you can see which port Unity opened for this session. The port is different each time you restart Unity.
Start your proxy and let it run its server on an arbitrary port (for example 10000).
Start another UnityShaderCompiler manually and pass it the server port of your proxy program (10000 in this example).
In the Task Manager, kill the original UnityShaderCompiler.exe that Unity started in the background.
Quickly let your proxy (client) connect to the same port that Unity used initially.
Now when Unity wants to use the UnityShaderCompiler.exe, all that traffic goes through your proxy so you can log and examine the traffic.
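A bare-bones version of such a forwarding proxy could look something like this. It's just a sketch: the Unity-side port below is a placeholder you'd replace with the one you read from the Task Manager, and real code should dump the forwarded bytes to a file instead of only counting them.
using System;
using System.Net;
using System.Net.Sockets;
using System.Threading.Tasks;

class ShaderCompilerProxy
{
    static void Main()
    {
        int proxyPort = 10000; // the port you pass to the manually started UnityShaderCompiler.exe
        int unityPort = 50000; // placeholder: the port Unity's own instance used (see Task Manager)

        // Server side: wait for the manually started UnityShaderCompiler.exe to connect
        var listener = new TcpListener(IPAddress.Loopback, proxyPort);
        listener.Start();
        TcpClient compiler = listener.AcceptTcpClient();

        // Client side: connect to the port Unity opened, after killing the compiler Unity started itself
        TcpClient unity = new TcpClient("127.0.0.1", unityPort);

        // Forward traffic in both directions and log what passes through
        Task up = Forward(unity.GetStream(), compiler.GetStream(), "Unity -> Compiler");
        Task down = Forward(compiler.GetStream(), unity.GetStream(), "Compiler -> Unity");
        Task.WaitAll(up, down);
    }

    static async Task Forward(NetworkStream from, NetworkStream to, string tag)
    {
        byte[] buffer = new byte[4096];
        int read;
        while ((read = await from.ReadAsync(buffer, 0, buffer.Length)) > 0)
        {
            Console.WriteLine("{0}: {1} bytes", tag, read); // dump / parse the bytes here
            await to.WriteAsync(buffer, 0, read);
        }
    }
}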
Thank you so much! This was what I was looking for =).
Well, your actual use case is still not clear.
Specifically, I'm making a 3D fractal viewer in VR that does raycasting in real time on a compute shader. I'd like to allow the user to write their own distance estimators, then be able to use my underlying code (raycasting, ambient occlusion, etc.) to render it. I'd like to have a standalone client to do this because the Unity editor actually introduces quite a bit of lag if you have certain menus open, and it doesn't make much sense to use when the 3D scene isn't relevant. Does that make more sense?
First of all, you should be aware that UnityShaderCompiler.exe belongs to the Unity editor and most likely isn't allowed to ship with your product. It's not part of the engine.
Hmm. Unfortunately that makes sense. Could anyone else confirm this?
If that's true I could ask my users to install Unity themselves, but that seems like too much of a hassle (also it has to be the right version). I'll probably use an external plugin with a built-in compiler and blit the resulting texture to the camera then.
Next, the Unity engine itself doesn't support loading any kind of shader code dynamically, neither as high-level source code nor as compiled code. So I'm not sure what you think you might use that program (UnityShaderCompiler) for.
Oh. This is good to know. An external plugin makes sense then I suppose.
Yea, so I ended up just making a "pseudo OpenGL context" that runs at the same time as my Unity program. It compiles shaders, runs them, and then sends the data back to Unity for rendering.
It's a hack but works great. Thanks again, because this saved me a lot of time :)
Answer by Dani Phye · Jun 20, 2016 at 11:48 PM
Okay so this is relevant enough to the question that I figure it is worth sharing.
I still wanted to be able to compile shaders at runtime, so I made an OpenGL context that runs under the hood (it's "headless", i.e. not attached to a window), renders to a texture, and then transfers the data back to Unity. The cool part is that I now have the full functionality of any OpenGL call in Unity, even if my project itself is using, say, DirectX11. It uses OpenGL.Net, which allows all of this code to work without any external plugins.
Here's the code. After dropping OpenGL.Net.dll into your Assets/Plugins folder, attach this script to some game object and press Play. Then, if you look at its unityTexture, it will display a texture that is the result of running a native compute shader :) (It should be some gradient thing.)
using UnityEngine;
using OpenGL;
using System.Runtime.InteropServices;
using System;
using System.Text;
using System.IO;

public class NativeCompute : MonoBehaviour {

    public Texture2D unityTexture;

    // From http://www.pinvoke.net
    [DllImport("user32.dll")]
    static extern IntPtr GetActiveWindow();

    [DllImport("user32.dll")]
    static extern IntPtr GetDC(IntPtr hWnd);

    [DllImport("user32.dll")]
    static extern bool ReleaseDC(IntPtr hWnd, IntPtr hDC);
    void MakeTempContext()
    {
        // OpenGl context without window, from http://stackoverflow.com/questions/576896/can-you-create-opengl-context-without-opening-a-window
        // We'll just use the active window and then not perturb it
        hwnd = GetActiveWindow();
        Wgl.PIXELFORMATDESCRIPTOR pfd = new Wgl.PIXELFORMATDESCRIPTOR();
        // Get the device context
        hdc = GetDC(hwnd);
        // Set the pixel format for the DC
        pfd.nSize = (short)Marshal.SizeOf(typeof(Wgl.PIXELFORMATDESCRIPTOR));
        pfd.nVersion = 1;
        pfd.dwFlags = Wgl.PixelFormatDescriptorFlags.SupportOpenGL;
        pfd.iPixelType = Wgl.PFD_TYPE_RGBA;
        pfd.cColorBits = 32;
        pfd.cDepthBits = 24;
        int iFormat = Wgl.ChoosePixelFormat(hdc, ref pfd);
        Wgl.SetPixelFormat(hdc, iFormat, ref pfd);
        // Create our temporary OpenGL context
        hrc = Wgl.CreateContext(hdc);
        Wgl.MakeCurrent(hdc, hrc);
    }

    IntPtr hwnd;
    IntPtr hdc;
    IntPtr hrc;

    void DeleteTempContext()
    {
        // We have made our actual context, we can get rid of our temporary one now
        Wgl.MakeCurrent(IntPtr.Zero, IntPtr.Zero);
        Wgl.DeleteContext(hrc);
    }
    uint CompileComputeShader(string shaderText, out string infoLog)
    {
        // Create shader
        uint shader = Gl.CreateShader(Gl.COMPUTE_SHADER);
        // Set shader source code
        Gl.ShaderSource(shader, new string[] { shaderText });
        // Compile shader
        Gl.CompileShader(shader);
        // Check for compilation errors, false (0) if error, true (1) if success
        int didCompile;
        Gl.GetShader(shader, Gl.COMPILE_STATUS, out didCompile);
        if (didCompile == Gl.FALSE)
        {
            int infoLogLength;
            Gl.GetShader(shader, Gl.INFO_LOG_LENGTH, out infoLogLength);
            StringBuilder resultInfo = new StringBuilder(infoLogLength);
            int gotLength;
            Gl.GetShaderInfoLog(shader, infoLogLength, out gotLength, resultInfo);
            infoLog = "Compilation error:\n" + resultInfo.ToString();
            return 0;
        }
        // Create shader program that holds our compute shader
        uint program = Gl.CreateProgram();
        // Attach our shader
        Gl.AttachShader(program, shader);
        // Link the program together
        Gl.LinkProgram(program);
        // Check for link errors
        int didLink;
        Gl.GetProgram(program, Gl.LINK_STATUS, out didLink);
        if (didLink == Gl.FALSE)
        {
            int infoLogLength;
            Gl.GetProgram(program, Gl.INFO_LOG_LENGTH, out infoLogLength);
            StringBuilder resultInfo = new StringBuilder(infoLogLength);
            int gotLength;
            Gl.GetProgramInfoLog(program, infoLogLength, out gotLength, resultInfo);
            infoLog = "Link error:\n" + resultInfo.ToString();
            return 0;
        }
        // No errors were found, compiled successfully!
        infoLog = "";
        return program;
    }
    void Start()
    {
        // Make temporary context
        MakeTempContext();

        // Use OpenGL 4.3, min OpenGL needed for compute shaders
        int[] attribs = new int[]
        {
            Wgl.CONTEXT_MAJOR_VERSION_ARB, 4,
            Wgl.CONTEXT_MINOR_VERSION_ARB, 3,
            0
        };
        // Make actual context
        // We need to make our temporary one first because OpenGL is weird and
        // doesn't let you call this unless you are in an active context already
        IntPtr other = Wgl.CreateContextAttribsARB(hdc, IntPtr.Zero, attribs);
        // Delete temporary context
        DeleteTempContext();
        // Set our actual context to be used
        Wgl.MakeCurrent(hdc, other);

        // Put shaders in ProjectRoot/Shaders (make a new folder named that)
        // If these shaders are in Assets, Unity will get upset because they aren't in its format
        string shaderFolder = Application.dataPath + "/../Shaders/";

        // Compile our compute shader and test for errors
        string infoLog;
        uint testProgram = CompileComputeShader(File.ReadAllText(shaderFolder + "test.compute"), out infoLog);
        if (testProgram == 0)
        {
            Debug.Log("Failed to compile compute shader: " + infoLog);
            return;
        }

        // Create the texture we will be rendering to
        int textureWidth = 512;
        int textureHeight = 512;
        uint texture = Gl.GenTexture();
        Gl.BindTexture(TextureTarget.Texture2d, texture);
        Gl.TexParameter(TextureTarget.Texture2d, TextureParameterName.TextureMagFilter, Gl.NEAREST);
        Gl.TexParameter(TextureTarget.Texture2d, TextureParameterName.TextureMinFilter, Gl.NEAREST);
        Gl.TexParameter(TextureTarget.Texture2d, TextureParameterName.TextureWrapS, Gl.CLAMP_TO_EDGE);
        Gl.TexParameter(TextureTarget.Texture2d, TextureParameterName.TextureWrapT, Gl.CLAMP_TO_EDGE);
        Gl.TexImage2D(TextureTarget.Texture2d, 0, Gl.RGBA8, textureWidth, textureHeight, 0, PixelFormat.Bgra, PixelType.UnsignedByte, IntPtr.Zero);

        // Create framebuffer
        uint framebuffer = Gl.GenFramebuffer();
        Gl.BindFramebuffer(Gl.FRAMEBUFFER, framebuffer);
        Gl.FramebufferTexture2D(Gl.FRAMEBUFFER, Gl.COLOR_ATTACHMENT0, Gl.TEXTURE_2D, texture, 0);

        // Create renderbuffer
        uint depthRenderbuffer = Gl.GenRenderbuffer();
        Gl.BindRenderbuffer(Gl.RENDERBUFFER, depthRenderbuffer);
        Gl.RenderbufferStorage(Gl.RENDERBUFFER, Gl.DEPTH_COMPONENT24, textureWidth, textureHeight);
        // Attach renderbuffer to framebuffer
        Gl.FramebufferRenderbuffer(Gl.FRAMEBUFFER, Gl.DEPTH_ATTACHMENT, Gl.RENDERBUFFER, depthRenderbuffer);

        // Make sure we successfully attached it
        int framebufferStatus = Gl.CheckFramebufferStatus(Gl.FRAMEBUFFER);
        if (framebufferStatus != Gl.FRAMEBUFFER_COMPLETE)
        {
            Debug.Log("Failed to create framebuffer");
            return;
        }

        // Clear texture to black. This isn't actually needed since we are going to overwrite every pixel
        Gl.ClearColor(0, 0, 0, 0);
        Gl.Clear(ClearBufferMask.ColorBufferBit | ClearBufferMask.DepthBufferBit);

        // Tell OpenGl that we are going to be using our program soon
        Gl.UseProgram(testProgram);
        // Bind our texture to the destTex uniform variable in the shader
        // (uniform means that it gets values from some non-shader code)
        uint unit = 0;
        int location = Gl.GetUniformLocation(testProgram, "destTex");
        Gl.Uniform1(location, unit);
        Gl.BindImageTexture(unit, texture, 0, false, 0, Gl.READ_WRITE, Gl.RGBA8);

        // Run our compute shader with one work group per pixel. This is not optimal but works as an example
        Gl.DispatchCompute((uint)textureWidth, (uint)textureHeight, 1);
        // Wait until our compute shader is done
        Gl.MemoryBarrier(Gl.SHADER_IMAGE_ACCESS_BARRIER_BIT);

        // Now we can retrieve the data, we want to store it in a Color32[] to be sent to our Unity texture
        Color32[] pixels = new Color32[textureWidth * textureHeight];
        // Get the IntPtr to our array
        GCHandle pinnedArray = GCHandle.Alloc(pixels, GCHandleType.Pinned);
        IntPtr pixelArrayPointer = pinnedArray.AddrOfPinnedObject();
        // Read our pixels. This writes directly to our Color32[] array as desired
        Gl.ReadPixels(0, 0, textureWidth, textureHeight, PixelFormat.Bgra, PixelType.UnsignedByte, pixelArrayPointer);
        // Free our IntPtr since we are done with it
        pinnedArray.Free();

        // Create our Unity texture. Remember that SetPixels32 only works when you are using TextureFormat.ARGB32
        unityTexture = new Texture2D(textureWidth, textureHeight, TextureFormat.ARGB32, false);
        unityTexture.filterMode = FilterMode.Point;
        // Set the contents of our new texture to the retrieved texture pixels
        unityTexture.SetPixels32(pixels);
        unityTexture.Apply();

        // Unbind framebuffer
        Gl.BindFramebuffer(Gl.FRAMEBUFFER, 0);
        // Delete texture
        Gl.DeleteTextures(1, texture);
        // Delete renderbuffer
        Gl.DeleteRenderbuffers(1, depthRenderbuffer);
        // Delete framebuffer
        Gl.DeleteFramebuffers(1, framebuffer);

        // Delete OpenGl context because we are done
        // You shouldn't do this until you are done using native OpenGL,
        // for example it might make sense to put this in OnApplicationQuit
        Wgl.MakeCurrent(IntPtr.Zero, IntPtr.Zero);
        Wgl.DeleteContext(other);
        ReleaseDC(hwnd, hdc);
    }
}
And here's the compute shader. Name it test.compute and put it in ProjectRoot/Shaders (make a new folder with that name); if native shaders are in Assets, Unity will get upset because they aren't in its format.
#version 430

layout(local_size_x = 1, local_size_y = 1) in;
layout(rgba32f) uniform image2D destTex;

void main() {
    ivec2 pos = ivec2(gl_GlobalInvocationID.xy);
    imageStore(destTex, pos, vec4(sin(pos.x/512.0), sin(pos.y/512.0), cos(pos.x/512.0), 1.0));
}
You forgot to call ReleaseDC for the device context that you created with GetDC when you're done.
You might want to separate your temporary render context creation from your device context creation, since the device context has to live longer than your temporary render context.
Oh! Thank you! I have updated that now. If I've missed anything else feel free to let me know.
Yea, there are some organizational things like that that I have done now in my own code. I just posted this code as a "minimal example" of sorts, if that makes sense.