Question by Ted 1 · Oct 11, 2011 at 07:38 PM · shader · mesh · gpu · generated · perlin

GPU Generated Mesh?

Hello! I was wondering: is it possible to generate meshes just using shaders? Or deform already generated meshes with their wireframe? Can you calculate Perlin noise using them? Thanks in advance! Any answer is greatly appreciated!


3 Replies

Best Answer

Answer by CHPedersen · Oct 12, 2011 at 06:46 AM

It is possible to generate a mesh using only shaders, but not from nothing. Here's why (simplified version):

Shaders are programs that are executed at key positions in the graphics pipeline. The pipeline is a series of steps on the graphics card through which all geometry goes. Geometry starts out as vertices ordered into meshes (as you know), and the first steps involve "primitive assembly", that is, the assembly of vertices into the polygons they're a part of. After this point, the vertex shader executes, optionally transforming each vertex further by changing its color, position, UV coordinates, or a wide range of other things.

When the vertex shader has finished executing on every vertex of a model, the model is rasterized into screen-space candidates for pixel updates to the frame buffer, the so-called "fragments" of the model. The reason they're not pixels yet but only candidates is that they might not get written to the frame buffer (they might fail a depth test, for instance, because they're obscured by some other object). This is where the fragment shader kicks in - just like the vertex shader operates on every vertex of a model, the fragment shader operates on every fragment of the rasterized model, after it has "become pixels", to put it bluntly.

But the fragments passed to the fragment shader do not necessarily have to stem from just the geometry that originally entered the graphics card (i.e. a mesh generated on the CPU side). With DirectX 10/OpenGL 3.2, a type of shader that can create new geometry at the GPU level was introduced to the pipeline. This is the so-called "Geometry Shader", and it executes after the vertex shader. Its input is a whole primitive (a point, line, or triangle) from the mesh currently being rendered, and it can generate new geometry based on that input, so it effectively changes what is rasterized into fragments at the GPU level. That's the key point in the answer to your question: yes, you can generate a mesh on the graphics card, but it requires that you write a Geometry Shader, and, as far as I know, that you pass something to it to begin with - I don't think it executes unless something is already passing through the pipeline.

You can't generate new geometry with just vertex and fragment programs, because as explained, they operate on vertices that are sent through the pipeline from the CPU side, and the fragments that result from the rasterization of these vertices.
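For illustration, a minimal geometry-shader sketch in HLSL (not from the original answer; the struct, semantics, and extrusion amount are illustrative) that passes each input triangle through and emits a second copy pushed out along the vertex normals:

```hlsl
// Sketch only: one triangle in, two triangles out.
struct v2g {
    float4 pos    : SV_POSITION;
    float3 normal : NORMAL;
};

[maxvertexcount(6)]
void geom(triangle v2g input[3], inout TriangleStream<v2g> stream)
{
    // Pass the original triangle through unchanged.
    for (int i = 0; i < 3; i++)
        stream.Append(input[i]);
    stream.RestartStrip();

    // Emit a second triangle, extruded along each vertex normal.
    for (int j = 0; j < 3; j++) {
        v2g v = input[j];
        v.pos.xyz += v.normal * 0.1;  // arbitrary illustrative offset
        stream.Append(v);
    }
    stream.RestartStrip();
}
```

The key point the sketch shows is that the geometry shader only ever *amplifies* primitives already flowing through the pipeline - it never runs on nothing.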

Your other two questions are easy. :)

Yes, you can deform already generated meshes by their wireframe; that is what the vertex shader does when you change the position of the vertices passed to it.

Yes, you can calculate Perlin noise using shaders, see this link for an example. There are other examples if you google for it a bit.
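A common shader-side stand-in for true Perlin gradient noise is hash-based value noise, which needs no textures or lookup tables. A sketch in Cg/HLSL (the hash constants are the widely used sine-hash trick; all names are illustrative):

```hlsl
// Cheap 2D value noise sketch - not true Perlin gradient noise,
// but the same lattice-and-interpolate idea.
float hash(float2 p)
{
    // Pseudo-random value in [0,1) derived from the lattice point.
    return frac(sin(dot(p, float2(12.9898, 78.233))) * 43758.5453);
}

float valueNoise(float2 p)
{
    float2 i = floor(p);                   // lattice cell
    float2 f = frac(p);                    // position within cell
    f = f * f * (3.0 - 2.0 * f);           // smoothstep fade curve
    float a = hash(i);
    float b = hash(i + float2(1, 0));
    float c = hash(i + float2(0, 1));
    float d = hash(i + float2(1, 1));
    return lerp(lerp(a, b, f.x), lerp(c, d, f.x), f.y);
}
```

A function like this can be called from either a vertex or a fragment program, e.g. to displace vertices or to shade a procedural texture.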

Ted 1 · Oct 12, 2011 at 04:27 PM

Thank you very much for your answers! I've been googling quite a lot, but I haven't found any answers as good and precise as yours! Thank you! Just one more question:

In the following example, it doesn't actually change the wireframe either... so if I add a MeshCollider, for instance, it will collide with the old mesh... Is this a wrong example? Would I have to use that thing with o.position? Sorry, but I am new to shader scripting :)

 Shader "Example/Normal Extrusion" {
 Properties {
   _MainTex ("Texture", 2D) = "white" {}
   _Amount ("Extrusion Amount", Range(-1,1)) = 0.5
 }
 SubShader {
   Tags { "RenderType" = "Opaque" }
   CGPROGRAM
   #pragma surface surf Lambert vertex:vert
   struct Input {
       float2 uv_MainTex;
   };
   float _Amount;
   void vert (inout appdata_full v) {
       v.vertex.xyz += v.normal * _Amount;
   }
   sampler2D _MainTex;
   void surf (Input IN, inout SurfaceOutput o) {
       o.Albedo = tex2D (_MainTex, IN.uv_MainTex).rgb;
   }
   ENDCG
 }
 Fallback "Diffuse"
 }

Thanks again!

CHPedersen · Oct 12, 2011 at 05:54 PM

To be perfectly honest with you, the reason I know how the pipeline works at a basic, theoretical level is because I'm studying it myself these days. ;) I'm not that great at GPU programming either, I'm just reading through nvidia's own book on Cg, available online here: http://developer.nvidia.com/node/76 The first few chapters explain the pipeline. But I think I can answer your question anyway. :)

You're right that extruding vertices at the GPU level will leave the MeshCollider unchanged. This is because of the boundary between the CPU and the GPU in terms of data availability: data that appears or gets calculated on the GPU side exists only in the graphics card's video memory; it isn't available to the CPU, so once you've sent something there for processing, there's no going back. This means your MeshCollider (a CPU thing) has no knowledge of what the graphics card is doing to your poor vertices on the GPU side. Therefore, it stays the same shape and size as the CPU-side Mesh it is attached to.

There are workarounds to get this to work anyway, of course. The value you're extruding your mesh by is set on the CPU side (it's the Material property called "_Amount" in that example). So, our goal is to be able to change the size of the mesh collider by the same relative amount as the GPU is going to change the actual Mesh in the vertex shader. I don't think you can change the size of a MeshCollider because it depends on the mesh, but you can change the size of a primitive-based collider (capsule, box, etc.) just fine, so perhaps you can approximate the collider with a set of primitive colliders instead? Sort of like what they do to the M4A1 on this page:

http://unity3d.com/support/documentation/Components/class-BoxCollider.html

If you really must use a MeshCollider, you could have two meshes, copies of each other, in exactly the same position, attach the collider to one of them, and disable its renderer. Then you can render one and mess with it on the GPU, and use the value "_Amount" to resize the other, invisible one, causing its MeshCollider to resize itself with it.
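That two-mesh workaround might be sketched in Unity C# roughly as follows (the class and field names are hypothetical; it assumes the material exposes the "_Amount" property from the shader above and mirrors its normal extrusion on the CPU):

```csharp
using UnityEngine;

// Sketch: one mesh is rendered and extruded on the GPU; this script
// resizes an invisible copy on the CPU so its MeshCollider matches.
public class ExtrudedCollider : MonoBehaviour
{
    public Material extrusionMaterial;  // material using "_Amount"
    public MeshCollider meshCollider;   // collider on the invisible copy

    Mesh colliderMesh;
    Vector3[] baseVerts, normals;

    void Start()
    {
        // Work on a copy so the shared asset stays untouched.
        colliderMesh = Instantiate(meshCollider.sharedMesh);
        baseVerts = colliderMesh.vertices;
        normals   = colliderMesh.normals;
    }

    void Update()
    {
        float amount = extrusionMaterial.GetFloat("_Amount");
        var verts = new Vector3[baseVerts.Length];
        for (int i = 0; i < verts.Length; i++)
            verts[i] = baseVerts[i] + normals[i] * amount; // mirror vert()
        colliderMesh.vertices = verts;
        colliderMesh.RecalculateBounds();
        meshCollider.sharedMesh = null;          // force the collider
        meshCollider.sharedMesh = colliderMesh;  // to rebuild itself
    }
}
```

Rebuilding a MeshCollider every frame is costly, so in practice you'd only do this when "_Amount" actually changes.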

CHPedersen · Oct 12, 2011 at 05:56 PM

As an added note, just for the sake of correctness: I mentioned in my answer that the vertex shader executes after primitive assembly. It doesn't. Primitive assembly comes after vertex shading. :)

Ted 1 · Oct 13, 2011 at 12:17 PM

Sorry for the late reply. Yes, but what if I deform it using Perlin noise? How would I pass that to the CPU? Each vertex has a different value, and I really need to avoid calculating anything expensive on the CPU. So I would need to get their new positions calculated on the GPU and then store them into an array (I have no idea how to do this, though :[ ). Is it even possible? I want to generate terrains, so unfortunately I can't use Box Colliders and such... Thanks for your help, once again!

CHPedersen · Oct 13, 2011 at 01:49 PM

If this is for terrain generation, you should be fine calculating this on the CPU, I think? Unless the terrain's vertices get morphed and move around all the time, a terrain is something static that it's ok to have the CPU pre-calculate before starting the game, I should think.

If I have misunderstood, and the terrain changes all the time and must be deformed in real time, then I think what you want is a displacement map. It's mentioned in this article, too:

http://www.ozone3d.net/tutorials/vertex_displacement_mapping_p03.php

It's basically a texture but, instead of the fragment shader sampling it to determine the color of a fragment, the vertex shader does lookups in it to determine how to change the position of vertices. Is that close to what you're looking for?
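In Unity's Cg, such a vertex-stage lookup might look like this (illustrative names, not from the article; `tex2Dlod` is used because the vertex stage cannot pick mip levels automatically, and vertex texture fetch requires `#pragma target 3.0`):

```hlsl
// Sketch of displacement mapping in the vertex stage.
sampler2D _DispTex;    // heightmap; hypothetical property name
float _DispAmount;     // world-space displacement scale

void vert (inout appdata_full v)
{
    // Explicit LOD 0 lookup: no automatic mip selection here.
    float h = tex2Dlod(_DispTex, float4(v.texcoord.xy, 0, 0)).r;
    v.vertex.xyz += v.normal * h * _DispAmount;
}
```

This is the same structure as the normal-extrusion example above, except the extrusion amount per vertex now comes from a texture instead of a single uniform.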


Answer by MountDoomTeam · Sep 03, 2013 at 03:29 AM

Here is an example of a mesh generated using DX11, including some source code:

http://johannes.gotlen.se/blog/

sasantv · Sep 20, 2016 at 09:56 PM

Your blog needs authentication now. Perhaps it did not when you posted this, but anyhow I am very interested to see the source code for generating a mesh with DX11.

Bunny83 (to sasantv) · Sep 20, 2016 at 10:21 PM

MountDoomTeam was last online on March 08, so it's unlikely you'll get a response any time soon. Maybe he has email notification enabled...

sasantv (to Bunny83) · Sep 20, 2016 at 11:05 PM

You are right. But at least I tried!


Answer by Jeff-Kesselman · Jul 27, 2015 at 03:20 AM

Actually, with DX11 it is now possible to create all geometry on the GPU: we create a vertex buffer with a Compute Shader and push it into the render pipeline.

And if you use a ComputeShader, you can pull the resulting mesh back from the GPU to create or modify a mesh collider with it. Be forewarned however that pushing data to and from the GPU is expensive.
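That readback path might be sketched like this in Unity C# (the kernel and buffer names are illustrative; `ComputeBuffer.GetData` is the blocking GPU-to-CPU copy being warned about):

```csharp
using UnityEngine;

// Sketch: run a compute shader that writes vertex positions into a
// buffer, read them back to the CPU, and rebuild a MeshCollider.
public class GpuMeshReadback : MonoBehaviour
{
    public ComputeShader meshCompute;   // hypothetical compute asset
    public int vertexCount = 65536;     // must be a multiple of 64 here

    void BuildCollider(Mesh targetMesh, MeshCollider collider)
    {
        // One float3 per vertex.
        var buffer = new ComputeBuffer(vertexCount, sizeof(float) * 3);
        int kernel = meshCompute.FindKernel("GenerateVerts");
        meshCompute.SetBuffer(kernel, "_Vertices", buffer);
        meshCompute.Dispatch(kernel, vertexCount / 64, 1, 1);

        var verts = new Vector3[vertexCount];
        buffer.GetData(verts);          // expensive: stalls until the GPU
        buffer.Release();               // finishes and the copy completes

        targetMesh.vertices = verts;
        targetMesh.RecalculateBounds();
        collider.sharedMesh = null;     // force the collider to rebuild
        collider.sharedMesh = targetMesh;
    }
}
```

Because `GetData` stalls the pipeline, this is something to do once (or rarely), not per frame.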



