I'm thinking about a Unity port of my iOS app "[Prism HD][1]".
The fundamental rendering task each frame is this:
1. Compute a whole bunch of line segments to simulate optical effects.
" many" = maybe 10,000 or so. each segment has x & y data plus a u & v texture coordinate.
for what it's worth, the simulation is 2D, not 3D.
2. Draw them.
So I have two questions:
1. If C# turns out not to be fast enough to crunch 10,000 rays at 60 Hz, would I be better off doing the math in C (that's how I do it on iOS), or should I learn how to use compute shaders? I'm only interested in targeting iOS, Android, and maybe desktop, so I don't need it to work in WebGL. There aren't many compute shader examples for Unity.
2. How do I draw these things? I'm super lazy. CommandBuffers seem pretty mesh-oriented, and this isn't a mesh. I'd prefer not to convert each line segment to a quad, because that at least doubles the number of vertices that have to go over the bus. Again, is a shader the way? I've never stepped outside the fixed-function pipeline.
[1]: https://itunes.apple.com/us/app/prism-hd/id439622311?mt=8