An Efficient Heuristic for
Multidimensional Non-Linear Parameter Estimation

This website presents an efficient gradient-free algorithm designed to handle "hard" optimisation problems involving "highly" non-linear functions with up to a few hundred parameters (N < 200) in an unbounded space, relying on a "smart" adaptive local search.

This algorithm combines recent ideas in optimisation and has proven to be an extremely efficient heuristic on many optimisation problems, especially in modelling and parameter estimation.

You are invited to try it on your problems through this web API.

Try now (free API + example code)

Use Case Examples:

Precise Curve Fitting

  • Interpolations / Regressions
  • Non-linear functions / Differential Equations (PDE)

Example (N=3): find a, b, c such that
f(x) = a(x^b)(1-x)^c interpolates 3 points up to 10^-24 precision.
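As a sketch of how such a problem can be expressed as a score function to minimise (the three data points below are made up for illustration, generated from a=1, b=2, c=3):

```python
import numpy as np

# Three (x, y) points to interpolate -- illustrative values generated
# from f(x) = x**2 * (1-x)**3, i.e. a=1, b=2, c=3.
points = [(0.2, 0.02048), (0.5, 0.03125), (0.8, 0.00512)]

def score(params):
    """Sum of squared residuals of f(x) = a * x**b * (1-x)**c at the points.

    Driving this score down to ~1e-24 means the curve passes through
    each point to roughly 1e-12 absolute precision.
    """
    a, b, c = params
    return sum((a * x**b * (1 - x)**c - y)**2 for x, y in points)
```

With three points and three free parameters an exact interpolation exists, so the global minimum of this score is (numerically) zero.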

Geometrical Problems


Example (N=22): find an 11-point polygon that covers a given 2D shape
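One way to cast this as a score function is to sample the target shape and count the samples left uncovered by the polygon. A minimal sketch, assuming the shape is the unit disk (the shape, sampling scheme, and scoring are illustrative choices, not the algorithm's own formulation):

```python
import numpy as np

def point_in_polygon(px, py, verts):
    """Standard ray-casting point-in-polygon test."""
    inside = False
    n = len(verts)
    for i in range(n):
        x1, y1 = verts[i]
        x2, y2 = verts[(i + 1) % n]
        if (y1 > py) != (y2 > py):  # edge crosses the horizontal ray
            if px < x1 + (py - y1) * (x2 - x1) / (y2 - y1):
                inside = not inside
    return inside

# Sample points of the shape to cover -- here the unit disk, for illustration.
rng = np.random.default_rng(0)
r = np.sqrt(rng.random(500))
t = 2 * np.pi * rng.random(500)
shape_pts = np.column_stack([r * np.cos(t), r * np.sin(t)])

def score(params):
    """Fraction of shape samples NOT covered by the 11-vertex polygon.

    `params` is a flat vector of 22 numbers: (x0, y0, ..., x10, y10).
    A score of 0 means every sampled point lies inside the polygon.
    """
    verts = np.asarray(params).reshape(11, 2)
    misses = sum(not point_in_polygon(x, y, verts) for x, y in shape_pts)
    return misses / len(shape_pts)
```

A penalty on the polygon's area could be added to the score to favour tight covers rather than arbitrarily large ones.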

Optimisation of Simulations


Example (N=70): Optimise the weights of a neural network
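As an illustration of how network training becomes a black-box score, here is a sketch with a tiny 1-23-1 tanh network, which has exactly 70 weights; the architecture and the toy sine-regression target are made-up choices, not the network from the example above:

```python
import numpy as np

def nn_forward(params, x):
    """Tiny 1-23-1 tanh network: 23+23 hidden weights/biases, 23+1 output."""
    params = np.asarray(params)
    w1 = params[:23]          # hidden-layer weights
    b1 = params[23:46]        # hidden-layer biases
    w2 = params[46:69]        # output weights
    b2 = params[69]           # output bias
    h = np.tanh(np.outer(x, w1) + b1)   # shape (len(x), 23)
    return h @ w2 + b2

# Toy regression target -- fitting sin on [0, 2*pi], for illustration.
xs = np.linspace(0, 2 * np.pi, 50)
ys = np.sin(xs)

def score(params):
    """Mean squared error of the network on the toy data (N = 70)."""
    return float(np.mean((nn_forward(params, xs) - ys) ** 2))
```

Since the score only requires forward passes, no gradients or backpropagation are needed, which is exactly the setting a gradient-free optimiser handles.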

An executable with full functionality, together with consulting sessions to help with your specific parameter estimation problems, will be released soon (and possibly a small subscription fee for the API if the AWS bill gets too high).

You are encouraged to leave your email at the bottom of the page to get information about new developments (especially if you are going to use this algorithm regularly).

Web API:

(Limited to N <= 16 because of limitations with AWS.)
Send a mail to request the executable with full functionality.

Step 1: Set up the Minimisation

Get Initial Key And Params

The number of parameters to optimise is given by the length of the Initial Point.

The number of points to evaluate at each iteration is then given by the algorithm.


Initial Point:

Initial search space radius:

Key on first line and Params on second line
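Since the response carries the Key on the first line and the Params on the second, it can be parsed with a few lines of Python (the sample response below is made up for illustration):

```python
def parse_response(text):
    """Split a two-line API response into (key, params).

    Line 1 is the key; line 2 is a nested list such as
    '[[1.0, 2.0], [3.0, 4.0]]', parsed here into a list of float rows.
    """
    key, raw = text.split('\n')[:2]
    params = [[float(w) for w in row.split(', ')] for row in raw[2:-2].split('], [')]
    return key, params

# Hypothetical two-line response, for illustration only.
sample = "abc123\n[[1.0, 2.0], [3.0, 4.0]]"
key, params = parse_response(sample)
# key == 'abc123'; params == [[1.0, 2.0], [3.0, 4.0]]
```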

Step 2: Iteratively Perform the Minimisation

Feed the Scores (or Ranks) of the given Params, then get the New Params and a New Key:



Get only the Params associated with a key:


Example Code in Python:

import numpy as np
import requests

## Useful functions ##
def Optimy_GetKeyAndInitialParams(InitialPoint, InitialSearchSpaceSize):
  # Endpoint name inferred from the function name; the response carries
  # the Key on the first line and the Params on the second line.
  R = requests.get('http://api.optimy.io/getkeyandinitialparams?initialpoint='
                   + str(InitialPoint) + '&initialsearchspacesize=' + str(InitialSearchSpaceSize)).text
  SR = R.split('\n')
  return SR[0], [[float(w) for w in x.split(', ')] for x in SR[1][2:-2].split('], [')]

def Optimy_FeedScoresAndGetNewParams(MyOptimyKey, scores):
  R = requests.get('http://api.optimy.io/feedscoresandgetnewparams?key=' + MyOptimyKey
                   + '&scores=' + str(scores.tolist())).text
  SR = R.split('\n')
  return SR[0], [[float(w) for w in x.split(', ')] for x in SR[1][2:-2].split('], [')]

## Problem Definition ##
def FunctionToOptimise(a):
  return (a[0]-3)**2 + np.cos(a[0]*a[1]*a[2]) + np.exp(-a[2])

## Optimisation with the Optimy API ##
# Illustrative starting point and search radius for this 3-parameter problem.
MyKey, Params = Optimy_GetKeyAndInitialParams([0.0, 0.0, 0.0], 1.0)
for i in range(100):
  scores = np.apply_along_axis(FunctionToOptimise, 1, np.asarray(Params))
  print("** Iteration " + str(i) + " **")
  print("MinValue: " + str(scores[np.argmin(scores)]))
  print("MinParameters: " + str(Params[np.argmin(scores)]))
  MyKey, Params = Optimy_FeedScoresAndGetNewParams(MyKey, scores)

Similar code is currently available in JavaScript, provided you can bypass the cross-domain request limitation through your own server.
Request it by mail.

Send a message


Suggestion? Bug? Interested in buying the upcoming executable?

Other inquiries about Non-Linear Parameter Estimation?

Send a mail and we'll do our best to answer.

Receive Updates

Leave us your email to be informed of updates and new features of this app.