So I've been focusing on this "normalized iteration count algorithm" and I've run into a small mental hurdle sorting out what needs to be calculated. I've been trying to find the best method of coloring the points on the complex plane, and I've come across some very interesting ideas. I've recently started making a small fractal app in Javascript using the famous Mandelbrot bulb ($z = z^2 + c$). Why is it that the outlines of the object are black !?!

This render was my very first attempt at zooming into this particular fractal, and I didn't play around tweaking the distance limit (the "joined threshold") for deciding when points are valid for use in the normal calculation, the result being that on the edges of overlap the normals are calculated as being almost perpendicular to the line of sight. The other reason I didn't tinker with the value is that the black outlines do actually help in this case, since it's purely diffuse lighting without shadows, and if I used the "correct" joined threshold the depth perception of the render would vanish (without using animation).

In the prototype UF formula I used, the fractal is calculated in the global section and the normals are calculated directly from the z-buffer, using the centre point and the 4 adjacent points, giving 4 triangles. The 4 adjacent points are tested for validity, i.e. they're invalid if there's no surface or if the adjacent point is further than a given threshold away from the central point. The valid (up to four) (non-unit) normals are summed and the result is normalised.

« Last Edit: October 06, 2009, 12:43:55 AM by David Makin »

If the shape were green it would look like a vegetable ! hmm, yummy yummy.
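The z-buffer normal scheme described above can be sketched in Javascript roughly as follows. This is a minimal sketch, not the original UF formula: the names (`normalAt`, `joinedThreshold`) are my own, and treating pixel coordinates directly as eye-space x/y is a simplification (the real formula would unproject through the camera).

```javascript
// Cross product of two 3-vectors: gives a (non-unit) normal for a triangle
// spanned by the two edge vectors.
function cross(a, b) {
  return [
    a[1] * b[2] - a[2] * b[1],
    a[2] * b[0] - a[0] * b[2],
    a[0] * b[1] - a[1] * b[0],
  ];
}

// Estimate the surface normal at pixel (x, y). zbuf[y][x] holds the hit
// distance along the ray, or Infinity where no surface was found.
// Each of the 4 adjacent points is valid only if it hit a surface and its
// depth is within joinedThreshold of the centre depth; the (non-unit)
// normals of the up-to-4 valid triangles are summed and the sum normalised.
function normalAt(zbuf, x, y, joinedThreshold) {
  const zc = zbuf[y][x];
  if (!isFinite(zc)) return null; // no surface at the centre point

  // Offsets to the 4 neighbours, in order: right, up, left, down.
  const offs = [[1, 0], [0, -1], [-1, 0], [0, 1]];
  const pts = offs.map(([dx, dy]) => {
    const z = (zbuf[y + dy] || [])[x + dx];
    if (z === undefined || !isFinite(z) ||
        Math.abs(z - zc) > joinedThreshold) {
      return null; // invalid neighbour: off-screen, no surface, or too far
    }
    return [x + dx, y + dy, z];
  });

  // Sum the normals of the triangles formed by the centre point and each
  // consecutive pair of valid neighbours (4 triangles when all are valid).
  const c = [x, y, zc];
  let n = [0, 0, 0];
  let found = false;
  for (let i = 0; i < 4; i++) {
    const a = pts[i], b = pts[(i + 1) % 4];
    if (!a || !b) continue;
    const t = cross(
      [a[0] - c[0], a[1] - c[1], a[2] - c[2]],
      [b[0] - c[0], b[1] - c[1], b[2] - c[2]]
    );
    n = [n[0] + t[0], n[1] + t[1], n[2] + t[2]];
    found = true;
  }
  if (!found) return null; // no valid triangle at all
  const len = Math.hypot(n[0], n[1], n[2]);
  return [n[0] / len, n[1] / len, n[2] / len];
}
```

Note that summing the raw (non-unit) triangle normals before normalising weights each triangle by its area, which damps the contribution of degenerate slivers at depth discontinuities.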
Assume the position on the ray is "vp + alpha*d", where vp is the viewpoint and d is our normalised direction vector for the viewing ray. We want to scan along the ray segment from a start point dictated by our front cutting plane (giving us a starting value for alpha) to an end point dictated by the back cutting plane (giving us an end value for alpha). We use an array "maxd" for storing the maximum distance to step at each iteration level.

alpha = starting alpha (as dictated by the front clip)
Set all maxd[n] (n from 0 to maxiter) to a maximum step distance value (usually a user parameter)
Iterate point p (set i to the number of iterations done here and remember the final values for computing the smooth iteration)
Compute the smooth iteration value for the first iteration, s0
Compute point p = vp + (alpha + abit)*d - where abit is a user parameter, say 1e-10
Compute the smooth iteration value for the second iteration, s1
Compute the directional DE = 1.0/(1.0 + (1/abit)*abs(s1 - s0))
If DE > solid threshold - the solid threshold is a user parameter, say 1e-4

EDIT: Please note the "****ADDED" section - I checked and it is actually necessary to conditionally store the DE value found as the maximum for the next iteration depth.

EDIT #2: (August 28th 2009) The code above has now been modified such that if the first iteration does not reach maxiter but the second does, then we test to see if it's the first point on the ray, in which case we exit with "solid found"; if it's not the first point then we initiate the binary search.

EDIT #3: (October 5th 2009) Corrected the rather alarming oversight of not assigning the newly calculated step distance to the "step" variable when not in the binary search - I even made the same mistake in my original implemented version! Also I've optimised the code such that smooth iteration values are only calculated when absolutely necessary, and modified the binary search section slightly.
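The stepping loop above can be sketched in Javascript. This is a minimal sketch under stated assumptions, not the author's implementation: I substitute a 2D Mandelbrot smooth-iteration function for the 3D fractal so the example is self-contained, and I omit the binary-search refinement, the per-iteration-depth maxd array, and the maxiter special cases from the edits; `scanRay`, `smoothIter`, and all parameter defaults are illustrative names and values.

```javascript
const MAXITER = 256;
const BAILOUT = 4; // escape radius

// Smooth (normalized) iteration count for z = z^2 + c at c = (cx, cy).
// Returns MAXITER for points treated as "inside"/solid.
function smoothIter(cx, cy) {
  let zx = 0, zy = 0, i = 0;
  while (i < MAXITER && zx * zx + zy * zy <= BAILOUT * BAILOUT) {
    const t = zx * zx - zy * zy + cx;
    zy = 2 * zx * zy + cy;
    zx = t;
    i++;
  }
  if (i === MAXITER) return MAXITER;
  // Normalized iteration count: i + 1 - log2(log|z|)
  const r = Math.sqrt(zx * zx + zy * zy);
  return i + 1 - Math.log2(Math.log(r));
}

// Scan along the ray p = vp + alpha*d from alphaStart (front clip) to
// alphaEnd (back clip), choosing each step from the directional DE.
// Returns the alpha of the hit, or null if the ray misses.
function scanRay(vp, d, alphaStart, alphaEnd, opts) {
  const { abit = 1e-10, solidThreshold = 1e-4, maxStep = 0.1 } = opts || {};
  let alpha = alphaStart;
  while (alpha < alphaEnd) {
    // Smooth iteration value at p and at a point "abit" further along.
    const s0 = smoothIter(vp[0] + alpha * d[0], vp[1] + alpha * d[1]);
    const s1 = smoothIter(vp[0] + (alpha + abit) * d[0],
                          vp[1] + (alpha + abit) * d[1]);
    if (s0 === MAXITER && s1 === MAXITER) return alpha; // solid found
    // Directional DE = 1.0/(1.0 + (1/abit)*abs(s1 - s0))
    const de = 1.0 / (1.0 + (1 / abit) * Math.abs(s1 - s0));
    if (de <= solidThreshold) return alpha; // close enough: surface hit
    alpha += Math.min(de, maxStep); // step on, clamped to the max step size
  }
  return null; // left the clip region without hitting the set
}
```

The idea is that |s1 - s0| / abit approximates the directional derivative of the smooth iteration value along the ray, which blows up near the set boundary, so the DE collapses toward zero exactly where small steps are needed.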