People with Intel graphics cards cannot play my game. What is the solution?
Anyone who is not using an Intel graphics card is able to play my Unity game without a problem: http://i.imgur.com/3RtNYIR.png
However, anyone with an Intel graphics card reports that they can only see black models on a blue background: http://i.imgur.com/uLWJDUc.png
Is this a known issue? What is the solution?
Are you using DirectX 11 features where the fallback shader is just vertex-lit? Have you tried exporting the game with the quality dialog enabled, letting your players step down to lower settings until it works, and seeing if that makes a difference?
DirectX 11 is disabled in my game.
Yes, I've shared a build of the game with the quality dialog enabled. Even at the lowest settings, it's still a blue background with black, unlit models.
I did find one useful piece of information: if I disable all occlusion culling, people with Intel graphics cards can see the game normally. However, shipping the game without any occlusion culling is far from ideal.
With this new information in mind, any further suggestions?
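Since disabling occlusion culling fixes the rendering, one hedged workaround (not from the thread, just a sketch) would be to turn it off only when an Intel GPU is detected, so everyone else keeps the performance benefit. This assumes a single main camera; the class name is hypothetical:

```csharp
using UnityEngine;

// Attach to the camera. Disables occlusion culling only on Intel GPUs,
// leaving it enabled for everyone else.
public class IntelCullingWorkaround : MonoBehaviour
{
    void Awake()
    {
        // SystemInfo.graphicsDeviceName reports the GPU name,
        // e.g. "Intel(R) HD Graphics 4000".
        if (SystemInfo.graphicsDeviceName.ToLower().Contains("intel"))
        {
            GetComponent<Camera>().useOcclusionCulling = false;
        }
    }
}
```

This treats the symptom rather than the cause, but it at least keeps occlusion culling for the unaffected players while the underlying driver issue is investigated.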
Answer by hexagonius · Feb 17, 2015 at 07:15 AM
Have you tried assigning a default Diffuse shader to at least one affected model to see if it reappears?
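The suggestion above can be tested at runtime with a small diagnostic script (a sketch, assuming the legacy built-in "Diffuse" shader available in Unity of this era; the class name is hypothetical):

```csharp
using UnityEngine;

// Attach to one affected model. Swaps its material to the built-in
// Diffuse shader at startup, to check whether the black rendering
// is caused by the original shader failing on Intel hardware.
public class DiffuseShaderTest : MonoBehaviour
{
    void Start()
    {
        var rend = GetComponent<Renderer>();
        if (rend != null)
        {
            // Shader.Find locates a built-in or project shader by name.
            rend.material.shader = Shader.Find("Diffuse");
        }
    }
}
```

If the model renders correctly with Diffuse, the problem is likely in the original shader (or its fallback) rather than in occlusion culling itself.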