Many thanks Prof. for the great lecture with beautiful illustrations
I love the idea of superlinear convergence. It also shows up in the secant method (alpha = 1.618) and in Muller's method and inverse quadratic interpolation (alpha = 1.84 for both). Of those, though, only the secant method yields interesting fractal results, and this method, successive parabolic interpolation (Jarratt's method), shows interesting results similar to the secant method's. These methods have what I like to call "subquadratic convergence". Like Muller's method and inverse quadratic interpolation, it needs three points, since it relies on a Lagrange interpolating polynomial. There also seems to be a relationship between the order of convergence and how interesting the fractal structures are: the higher the order, the faster the method converges, but the less interesting the fractals, while lower orders yield more interesting results. By the way, is there a way to share my fractal works with you?
Your post on Durand-Kerner is being marked as spam by YouTube (probably because of the links), but I did take a look at the fractals that you and others were sharing on Fractalforums.
@@OscarVeliz This shows that any root-finding method can be represented as a fractal, though some are more difficult than others, like Jenkins-Traub, bisection, regula falsi, and Lehmer-Schur.
@@OscarVeliz Have you tried Ultra Fractal?
I haven't although I understand it is quite capable. My fractal programs (found on GitHub) are written using gnuplot.
@@OscarVeliz Would it be possible to convert some of your code to .ufm files (the format Ultra Fractal uses for fractal formulas)? Although you may need to get used to the format first.
Thank You!
Does anyone know of an unconstrained, derivative-free (gradient-free) optimization technique for more than one variable?
Pattern Search. I have it in the queue but there are at least three or four topics ahead of it.
@@OscarVeliz I also tried the Nelder-Mead method in Ultra Fractal, and it worked; it did not generate any notable fractal pattern, but it did show the minima of the function. What would be the order of convergence of this method?
Neat that you were able to get it working on Ultra. I do not know the order. A quick search left me with the impression that it is an open question.
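For anyone who wants to poke at Nelder-Mead outside Ultra Fractal, here is a minimal simplex sketch (my own, using the standard reflect/expand/contract/shrink coefficients 1, 2, 0.5, 0.5; the initial step and the quadratic test function are arbitrary illustration choices):

```python
def nelder_mead(f, x0, step=0.5, tol=1e-10, max_iter=1000):
    """Minimal Nelder-Mead simplex method for minimizing f over n variables."""
    n = len(x0)
    # Initial simplex: x0 plus one point perturbed along each axis
    simplex = [list(x0)]
    for i in range(n):
        p = list(x0)
        p[i] += step
        simplex.append(p)
    for _ in range(max_iter):
        simplex.sort(key=f)
        best, worst = simplex[0], simplex[-1]
        if abs(f(worst) - f(best)) < tol:
            break
        # Centroid of every vertex except the worst
        centroid = [sum(p[i] for p in simplex[:-1]) / n for i in range(n)]
        reflect = [c + (c - w) for c, w in zip(centroid, worst)]
        if f(reflect) < f(best):
            # Reflection is the new best: try going further
            expand = [c + 2 * (c - w) for c, w in zip(centroid, worst)]
            simplex[-1] = expand if f(expand) < f(reflect) else reflect
        elif f(reflect) < f(simplex[-2]):
            simplex[-1] = reflect
        else:
            # Contract toward the centroid; if even that fails, shrink
            contract = [c + 0.5 * (w - c) for c, w in zip(centroid, worst)]
            if f(contract) < f(worst):
                simplex[-1] = contract
            else:
                simplex = [best] + [[b + 0.5 * (x - b)
                                     for b, x in zip(best, q)]
                                    for q in simplex[1:]]
    return min(simplex, key=f)

xmin = nelder_mead(lambda v: (v[0] - 3) ** 2 + (v[1] + 1) ** 2, [0.0, 0.0])
print(xmin)  # near the minimum at (3, -1)
```

This matches the comment above: it finds minima using only function comparisons, with no derivative information anywhere, which is also part of why pinning down a convergence order for it is hard.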