Shader inconsistency between OpenGL and DirectX
Hi, I modified a shader I found on the forums to render a mesh that consists only of vertices (MeshTopology.Points). I want to be able to define, via the shader, the size at which each vertex is rendered on screen.
I think the shader itself is correct: changing its PointSize value changes the appearance of the vertices on screen. The problem is that this only works on Windows when Unity is running in DirectX mode, not on any platform that uses OpenGL. I tried the same Windows machine with -force-opengl, a Mac, and an Android tablet. On each of these platforms the vertices are rendered as single pixels, and the PointSize value has no effect.
Does anyone have any clues why this is happening, or an idea how to resolve the issue? I'm quite new to shaders in Unity, so I'd appreciate any suggestions.
Best, Stefan
Shader "Custom/PC2" {
    // Based on http://forum.unity3d.com/threads/176317-Point-Sprite-automatic-texture-coords
    Properties {
        _PointSize ("PointSize", Float) = 1
    }
    SubShader {
        Tags { "Queue"="Transparent" "IgnoreProjector"="True" "RenderType"="Transparent" }
        LOD 200
        Pass {
            Cull Off
            ZWrite On
            Blend SrcAlpha OneMinusSrcAlpha

            CGPROGRAM
            #pragma exclude_renderers flash
            #pragma vertex vert
            #pragma fragment frag

            struct appdata {
                float4 pos : POSITION;
                fixed4 color : COLOR;
            };

            struct v2f {
                float4 pos : SV_POSITION;
                float size : PSIZE;   // point size output; not honored by every API/driver
                fixed4 color : COLOR;
            };

            float _PointSize;

            v2f vert (appdata v) {
                v2f o;
                o.pos = mul(UNITY_MATRIX_MVP, v.pos);
                o.size = _PointSize;
                o.color = v.color;
                return o;
            }

            half4 frag (v2f i) : COLOR0 {
                return i.color;
            }
            ENDCG
        }
    }
}
Answer by Nico de Poel · Oct 02, 2013 at 01:30 PM
The problem is that for PSIZE to work in a vertex shader on OpenGL, the GL_VERTEX_PROGRAM_POINT_SIZE capability must be enabled with glEnable(). Unity doesn't do that by default, even though Direct3D always has the equivalent behavior enabled, so enabling it for OpenGL would bring the two APIs to functional parity. I posted a feature request for this on the Unity Feedback forum once, but never got any response.
In any case, I figured out a workaround for this by using P/Invoke to call glEnable directly before rendering. Here is the script that I ended up with:
#if UNITY_STANDALONE
#define IMPORT_GLENABLE
#endif

using UnityEngine;
using System;
using System.Runtime.InteropServices;

public class EnablePointSize : MonoBehaviour
{
    const UInt32 GL_VERTEX_PROGRAM_POINT_SIZE = 0x8642;

    const string LibGLPath =
#if UNITY_STANDALONE_WIN
        "opengl32.dll";
#elif UNITY_STANDALONE_OSX
        "/System/Library/Frameworks/OpenGL.framework/OpenGL";
#elif UNITY_STANDALONE_LINUX
        "libGL";  // Untested on Linux; this may not be correct
#else
        null;     // OpenGL ES platforms don't require this feature
#endif

#if IMPORT_GLENABLE
    [DllImport(LibGLPath)]
    public static extern void glEnable(UInt32 cap);

    private bool mIsOpenGL;

    void Start()
    {
        mIsOpenGL = SystemInfo.graphicsDeviceVersion.Contains("OpenGL");
    }

    void OnPreRender()
    {
        // Only touch the GL library when we're actually rendering with OpenGL.
        if (mIsOpenGL)
            glEnable(GL_VERTEX_PROGRAM_POINT_SIZE);
    }
#endif
}
You can simply add this script to a Camera game object and it should work.
It's tested and working on both Windows and Mac; Linux should work as well but remains untested. As you can see, I also added a check for whether we're actually running in OpenGL mode, so we don't inadvertently load the OpenGL library when running in Direct3D.
I also tested it for OpenGL ES on Android and iOS, but it looks like GL_VERTEX_PROGRAM_POINT_SIZE does not exist there, and while setting the PSIZE register did have some effect, it caused all sorts of random glitchiness. I'll have to investigate further to see if there's some way to get this working on OpenGL ES.
Thank you a lot, Nico. Your solution works for me on a Mac! And don't spend too much time on OpenGL ES. I've seen the same random artifacts there and filed a bug report on it. I got confirmation that they could reproduce the issue, so I hope they will take care of it soon.
Hi there, I am now trying to display point sprites using DirectX 11, and it seems the issue is the same: I also have to set m_pDirect3DDevice->SetRenderState(D3DRS_POINTSCALEENABLE, true). Is there a way to do this for DirectX as well in Unity 4? Cheers, Mat.
That's a bit more difficult as you can't easily invoke C++ code from C#. What you could do is create a small native plugin that implements the UnitySetGraphicsDevice API to get the D3D device handle when the renderer is initialized, and then invoke SetRenderState on it to enable point scaling. See Unity's docs on the Low-level Native Plugin Interface for more details.
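A minimal sketch of such a native plugin, assuming the Unity 4-era low-level plugin interface: the UnitySetGraphicsDevice entry point is from the old plugin docs, but the enum values (kGfxRendererD3D9 = 1, kGfxDeviceEventInitialize = 0) are assumptions taken from the plugin headers of that era and should be checked against your Unity version before use.

```cpp
// Hypothetical Unity 4 native rendering plugin (sketch, not a drop-in file).
// Unity calls UnitySetGraphicsDevice when the graphics device is created,
// handing us the raw device pointer so we can change render states on it.
#if defined(_WIN32)
#include <d3d9.h>
#define EXPORT_API __declspec(dllexport)
#else
#define EXPORT_API   // non-Windows builds compile this to a harmless no-op
#endif

extern "C" void EXPORT_API UnitySetGraphicsDevice(void* device, int deviceType, int eventType)
{
#if defined(_WIN32)
    // Assumed values from the Unity 4 plugin headers -- verify before shipping.
    const int kGfxRendererD3D9 = 1;
    const int kGfxDeviceEventInitialize = 0;

    if (device != nullptr &&
        deviceType == kGfxRendererD3D9 &&
        eventType == kGfxDeviceEventInitialize)
    {
        // Enable distance-based point scaling, the D3D9 render state
        // mentioned above. (D3D11 has no equivalent fixed-function state.)
        IDirect3DDevice9* d3d = static_cast<IDirect3DDevice9*>(device);
        d3d->SetRenderState(D3DRS_POINTSCALEENABLE, TRUE);
    }
#else
    (void)device; (void)deviceType; (void)eventType;
#endif
}
```

Build it as a DLL, drop it into Assets/Plugins, and Unity will call the entry point automatically; no C# changes are needed beyond loading the plugin.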
Answer by Metal867 · Jul 26, 2019 at 12:26 PM
Hello, I'm having trouble with this. I put in the code exactly as you did, and it doesn't change the point size. Can you help me? Thanks.
Update: I found the problem. Apparently DirectX 11 no longer supports per-vertex point sizes: although it can render the points in my mesh, it doesn't change their size, leaving them at size 1. It's a shame, because the project I've been making only works in DirectX 11, so I have to give up on this approach and try another one I've been working on for a while.