Support / Re: getSphere high RAM usage
« on: September 27, 2012, 12:21:41 pm »
OK, will do. Thanks, Egon.
uniform float u_time;
uniform vec3 u_centerPosition;
attribute float a_lifetime;
attribute vec3 a_startPosition;
attribute vec3 a_endPosition;
varying float v_lifetime;

void main()
{
    if ( u_time <= a_lifetime )
    {
        // Particle is alive: move it from its start position along
        // a_endPosition as time advances, offset by the emitter center.
        gl_Position.xyz = a_startPosition + ( u_time * a_endPosition );
        gl_Position.xyz += u_centerPosition;
        gl_Position.w = 1.0;
    }
    else
    {
        // Particle is dead: park it far off-screen.
        gl_Position = vec4( -1000, -1000, 0, 0 );
    }

    // Remaining life, clamped to [0,1]; drives the point sprite size
    // here and the fade-out in the fragment shader.
    v_lifetime = 1.0 - ( u_time / a_lifetime );
    v_lifetime = clamp( v_lifetime, 0.0, 1.0 );
    gl_PointSize = ( v_lifetime * v_lifetime ) * 40.0;
}
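For reference, a matching fragment shader for this point-sprite particle setup would look roughly like the sketch below. It is an assumption, not code from this thread: the uniform names u_color and s_texture are hypothetical, and it simply samples a particle texture via gl_PointCoord and fades alpha with v_lifetime.

precision mediump float;
uniform vec4 u_color;
uniform sampler2D s_texture;
varying float v_lifetime;

void main()
{
    // gl_PointCoord gives per-sprite texture coordinates for point primitives.
    vec4 texColor = texture2D( s_texture, gl_PointCoord );
    gl_FragColor = u_color * texColor;
    // Fade the particle out as it approaches the end of its lifetime.
    gl_FragColor.a *= v_lifetime;
}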
I tried it myself after remembering that I got this fatal crash with my old Radeon 5870. Now, with the GTX 680, it works. I got the same memory error and fixed it in the same way... strange. The rendered image looked good, but performance is p*** poor: from a Core i7 @ 4 GHz, 16 GB RAM, and a GTX 680, I would expect more than a maximum of 7 fps in my RPG thingy at 800×480. In the dungeons it slowed down to 4 fps, maybe because of the parallax mapping shader I'm using. So while it worked technically, it's far too slow IMHO, at least when using OpenGL 2.0. It might be better with 1.x.
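To illustrate why parallax mapping could drag the dungeon scenes down: even the simplest offset variant adds a per-fragment height lookup plus a dependent texture read, which adds up across a full 800×480 frame. A minimal sketch follows; all names (s_diffuse, s_height, u_scale, the varyings) are hypothetical and assume the vertex shader supplies a tangent-space view direction, so this is not the poster's actual shader.

precision mediump float;
uniform sampler2D s_diffuse;
uniform sampler2D s_height;   // height map driving the parallax offset
uniform float u_scale;        // parallax strength, e.g. 0.04
varying vec2 v_texCoord;
varying vec3 v_viewDirTS;     // view direction in tangent space

void main()
{
    vec3 viewDir = normalize( v_viewDirTS );
    // One texture fetch just to compute the offset...
    float height = texture2D( s_height, v_texCoord ).r;
    vec2 offset = viewDir.xy * ( height * u_scale );
    // ...then a dependent read at the shifted coordinate, every fragment.
    gl_FragColor = texture2D( s_diffuse, v_texCoord + offset );
}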