Simon Wardley has made a compelling case for the "commoditization" of software functionality, as described by his Wardley maps. Commoditization means that over time functionality starts as research, then becomes available to a few, then widely available, then cheap, then free.
The best example of this is the relational database. PostgreSQL started out as a research project and is now essentially free, and, by the way, awesome. I have laid out this argument in more detail here. Even putting aside the hardware improvements that enable this, it is a significant gift to anyone who builds or uses software, though of course most people don't think about it, just as most of us take cheap electricity for granted. Another example is regular expression engines, which Perl pioneered (in my view) and which have now been so widely copied that they are ubiquitous and free.
The Free Software movement lubricates and reinforces this process, or perhaps causes it outright; you decide.
There is another piece of computing functionality that is being commoditized but is not yet ripe: numerical optimization. Basically, if you have a function, you want to be able to find an input that minimizes (or maximizes) it. Sounds simple, right?
And, in fact, it is, if the function is of one variable, perfectly smooth and differentiable, and you know roughly where the minimum is. In that case, you can use a Newton-Raphson solver, applied to the derivative, to find the minimum, and it works like a charm.
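Since this idea comes up repeatedly below, here is a minimal from-scratch sketch of what such a solver does, in plain JavaScript. The function names and the toy objective are mine; this is not the interface of any of the libraries mentioned later.

```javascript
// Minimal one-variable Newton-Raphson minimizer: it looks for a zero of the
// derivative f'(x), using the second derivative f''(x) as the slope.
function newtonMinimize(fprime, fsecond, x0, tol = 1e-10, maxIter = 50) {
  let x = x0;
  for (let i = 0; i < maxIter; i++) {
    const step = fprime(x) / fsecond(x); // Newton step on f'
    x -= step;
    if (Math.abs(step) < tol) return x;  // converged
  }
  return x; // may not have converged
}

// Example: f(x) = (x - 3)^2 + 1, whose minimum is at x = 3.
console.log(newtonMinimize(x => 2 * (x - 3), () => 2, 0)); // → 3
```

For a quadratic like this, one Newton step lands exactly on the minimum, which is why the method works like a charm on smooth, well-behaved functions.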
If, however, the function is spiky, or not differentiable, or has several minima, or is subject to constraints, the problem is harder, and has been the subject of much research for the last 40 years. Entire courses are taught on this subject, and some people devote their entire careers to improving our ability to solve it.
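To see why several minima are a problem, consider a sketch (a toy example of my own) in which the same Newton iteration lands on different minima depending on where it starts:

```javascript
// f(x) = x^4 - 2x^2 has two minima, at x = -1 and x = +1.
// A purely local method converges to whichever one the starting point favors,
// which is why multimodal problems need extra machinery (multi-start, etc.).
const fprime = x => 4 * x ** 3 - 4 * x; // f'(x)
const fsecond = x => 12 * x ** 2 - 4;   // f''(x)

function newtonFromStart(x0, iters = 50) {
  let x = x0;
  for (let i = 0; i < iters; i++) x -= fprime(x) / fsecond(x);
  return x;
}

console.log(newtonFromStart(2));  // ≈ +1
console.log(newtonFromStart(-2)); // ≈ -1
```

Neither answer is wrong locally, but only a global strategy can tell you which minimum (if either) is the one you actually wanted.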
This is fitting, because to a working programmer this is a magic, all-purpose piece of functionality. You can use it to do many, many things. I'd rather have a regular expression engine — but I also want a good, free, standardized optimization engine that more or less solves problems that are not too horrendous, without my having to learn or re-learn all about numerical optimization. The time is past due for that to be available to us. In 10 years this should be built into every programming language, as it already seems to be in Matlab and Mathematica.
We should all thank the authors of this junk pile of free optimization code; it is from such mounds of detritus that useful machines can be cobbled together. So my criticism should be taken only as criticism measured against a perfect world, not of the people who have gone out of their way to contribute what they have.
Someone really needs to pull all of this together. Whenever I write something like that, I am painfully reminded that "someone" means "anyone but me". A truly noble Public Inventor would polish up the junk pile, put it in a box, write an excellent manual, create a great set of tests, create an even greater set of worked example problems, and put a ribbon on it. I'm not going to do that; but perhaps this essay will be a start. Please take what follows as just a starting point, perhaps for others to build on.
I found this Newton-Raphson solver to be simple as pie and functional for my purposes: https://github.com/scijs/newton-raphson-method. I use it in my project “Untwisting the Tetrahelix”: https://pubinv.github.io/tetrahelix/.
I first tried using the excellent project fmin: https://github.com/benfred/fmin. fmin is the best documented of the projects I have reviewed; it has awesome example projects with animations. It has automated tests, although they could be strengthened. Unfortunately, my problem has constraints.
In general, you can take the constraints on an optimization problem and turn them into "penalty functions": extra terms that raise the objective value to steer the optimizer away from solutions that violate the constraints. This only halfway works, some of the time, and requires a lot of careful crafting. In particular, my attempt to do it for fmin violates the "Wolfe conditions", and fmin gives me no way around that.
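As a sketch of the penalty idea (a hypothetical toy problem, not my actual constrained problem and not fmin's API): minimize f(x) = (x − 5)² subject to x ≤ 2, whose constrained minimum is at x = 2.

```javascript
// Objective and constraint: minimize (x - 5)^2 subject to x <= 2.
const f = x => (x - 5) ** 2;

// Quadratic penalty: zero where the constraint holds, grows with violation.
const penalty = x => Math.max(0, x - 2) ** 2;

// Penalized objective; mu controls how harshly violations are punished.
const penalized = (x, mu) => f(x) + mu * penalty(x);

// Crude grid search, just to keep the sketch self-contained;
// a real solver (fmin, optimization-js, ...) would go here instead.
function argmin(g, lo, hi, steps = 100000) {
  let best = lo, bestVal = g(lo);
  for (let i = 1; i <= steps; i++) {
    const x = lo + (i * (hi - lo)) / steps;
    const v = g(x);
    if (v < bestVal) { best = x; bestVal = v; }
  }
  return best;
}

// As mu grows, the penalized minimum approaches the true constrained one, x = 2:
console.log(argmin(x => penalized(x, 1), 0, 10));   // ≈ 3.5
console.log(argmin(x => penalized(x, 100), 0, 10)); // ≈ 2.03
```

The catch is that the inner solver only ever sees the penalized objective, and a badly scaled mu can make that objective ill-conditioned or break line-search assumptions such as the Wolfe conditions.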
I tried this code: https://github.com/iaroslav-ai/optimization-js to solve a much more difficult problem that I call Actuator Net Optimization: https://pubinv.github.io/actnetopt/. It worked very well. On the basis of this I would recommend the “optimization-js” package.
However, it has no tests, and certainly no beautiful animations like fmin's.
It would be nice to have a "meta-project" combining the two repos: giving the user a palette of strategies to deploy, unifying the programmatic interface, and providing tests and animations for all the algorithms. If anyone is excited about taking on this project, please contact me.
In the interest of sharing early and often, I'm going to publish this right now rather than submit it to any magazines. I hope to update it as I learn more. It would be a great pleasure to me if someone responded by explaining that this problem has already been solved, or by initiating a serious project to accelerate the commoditization of numerical optimization.