Optimy.io
An Efficient Heuristic for
Multidimensional Non-Linear Parameter Estimation

This website presents an efficient gradient-free algorithm designed to handle "hard" optimisation problems involving "highly" non-linear functions with up to a few hundred parameters (N < 200) in an unbounded space, relying on a "smart" adaptive local search.

This algorithm combines several recent ideas in optimisation and has proven to be an extremely efficient heuristic on many optimisation problems, especially in modelling and parameter estimation.

You are invited to try it on your problems through this web API.

Try now (free API + example code)


Use Case Examples:



Precise Curve Fitting

  • Interpolations / Regressions
  • Non-linear functions / Differential equations (PDEs)

Example (N=3): find a, b, c such that
f(x) = a(x^b)(1-x)^c interpolates 3 given points to a precision of 10^-24.
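A minimal sketch of how such a fit could be scored for the algorithm (the three data points below are hypothetical placeholders, not the ones used in the demo):

import numpy as np

# Three hypothetical data points (x_i, y_i) to interpolate -- placeholders only.
X = np.array([0.2, 0.5, 0.8])
Y = np.array([0.15, 0.40, 0.22])

def fit_error(params):
  # Sum of squared residuals of f(x) = a * x**b * (1 - x)**c at the data points;
  # the optimiser drives this score towards 0.
  a, b, c = params
  residuals = a * X**b * (1 - X)**c - Y
  return np.sum(residuals**2)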


Geometrical Problems


Example (N=22): find an 11-vertex polygon that covers a given 2D shape.
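One way such a problem could be scored (a sketch, not the exact objective used in the demo): flatten the 11 vertex coordinates into the 22 parameters and penalise sampled points of the target shape that fall outside the polygon. The unit-disk target below is an illustrative assumption.

import numpy as np
from matplotlib.path import Path

# Hypothetical target shape: points sampled inside the unit disk.
rng = np.random.default_rng(0)
pts = rng.uniform(-1, 1, size=(2000, 2))
shape_points = pts[np.hypot(pts[:, 0], pts[:, 1]) <= 1.0]

def coverage_error(params):
  # params is a flat vector of length 22: (x1, y1, ..., x11, y11).
  # Score = fraction of shape points left uncovered by the 11-vertex polygon.
  polygon = Path(np.asarray(params).reshape(11, 2))
  covered = polygon.contains_points(shape_points)
  return 1.0 - covered.mean()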

Optimisation of Simulations


Example (N=70): optimise the weights of a neural network.
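A sketch of how a case like this could be set up: flatten the weights of a small network into a single parameter vector and let the score be the training loss. The toy data and the 4-10-1 architecture below (61 weights rather than the 70 of the demo, whose exact architecture is not given) are illustrative assumptions.

import numpy as np

# Hypothetical toy data set: 4 inputs -> 1 binary target.
rng = np.random.default_rng(1)
X = rng.normal(size=(50, 4))
Y = (X.sum(axis=1) > 0).astype(float)

def nn_loss(params):
  # Mean squared error of a 4-10-1 network whose weights and biases are
  # unpacked from the flat parameter vector (length 4*10 + 10 + 10 + 1 = 61).
  w = np.asarray(params)
  W1 = w[:40].reshape(4, 10)
  b1 = w[40:50]
  W2 = w[50:60].reshape(10, 1)
  b2 = w[60]
  hidden = np.tanh(X @ W1 + b1)
  output = hidden @ W2 + b2
  return np.mean((output.ravel() - Y)**2)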



An executable with full functionality, together with consulting sessions to help with your specific parameter estimation problems, will be available for purchase soon (and a small subscription fee for the API may be introduced if the AWS bill gets too high).

You are encouraged to leave your email at the bottom of the page to receive information about new developments (especially if you are going to use this algorithm regularly).




Web API:

(Limited to N <= 16 because of AWS constraints.)
Send an email to request the executable with full functionality.

Step 1: Set up the Minimisation

Get Initial Key And Params

api.optimy.io/getkeyandinitialparams?IG=[InitialCoord1,InitialCoord2,InitialCoord3]&IC=InitialSearchSpaceRadius
The number of parameters to optimise is given by the length of the Initial Point (IG).

The number of points to evaluate at each iteration is then given by the algorithm.



Example

Initial Point: e.g. [0,-1,1]

Initial search space radius: e.g. 0.2

The response contains the Key on the first line and the Params (the list of parameter vectors to evaluate) on the second line.
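A minimal Python sketch of this first call, using the same initial point and radius as in the full example further down:

import requests

# Step 1: request a key and the first batch of parameter vectors to evaluate.
response = requests.get('http://api.optimy.io/getkeyandinitialparams?IG=[0,-1,1]&IC=0.2').text
key, params_line = response.split('\n')[:2]
print('Key: ' + key)
print('Params to evaluate: ' + params_line)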

Step 2: Iteratively Perform the Minimisation

Feed Scores or Ranks of given params, then get New Params and New Key:

 api.optimy.io/feedscoresandgetnewparams?key=Key_Obtained_Above&scores=[score1,score2,score3]
or
 api.optimy.io/feedscoresandgetnewparams?key=Key_Obtained_Above&ranks=[rank1,rank2,rank3]

(Additionally:)

Get only the params associated with a key:

api.optimy.io/feedscoresandgetnewparams?key=Key_Obtained_Above
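The full Python example below uses the scores route; for the ranks route, a feedback call could look like the following sketch (it assumes rank 1 marks the best, i.e. lowest-score, point -- the exact ranking convention is not documented here):

import numpy as np
import requests

def feed_ranks_and_get_new_params(key, scores):
  # Convert raw scores to ranks (1 = best under the assumption above),
  # feed them back, and return the new key and the new parameter batch line.
  ranks = (np.argsort(np.argsort(scores)) + 1).tolist()
  url = ('http://api.optimy.io/feedscoresandgetnewparams?key=' + key
         + '&ranks=' + str(ranks))
  response = requests.get(url).text
  new_key, params_line = response.split('\n')[:2]
  return new_key, params_line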



Example Code in Python:

import numpy as np
import requests

## Useful functions ##
def Optimy_GetKeyAndInitialParams(InitialPoint, InitialSearchSpaceSize):
  # Step 1: get a key and the first batch of parameter vectors to evaluate.
  R = requests.get('http://api.optimy.io/getkeyandinitialparams?IG=' + str(InitialPoint) + '&IC=' + str(InitialSearchSpaceSize)).text
  SR = R.split('\n')
  # First line of the response: key; second line: list of parameter vectors.
  return SR[0], [[float(w) for w in x.split(', ')] for x in SR[1][2:-2].split('], [')]

def Optimy_FeedScoresAndGetNewParams(MyOptimyKey, scores):
  # Step 2: feed the scores of the current batch, get a new key and a new batch.
  R = requests.get('http://api.optimy.io/feedscoresandgetnewparams?key=' + MyOptimyKey + '&scores=' + str(scores.tolist())).text
  SR = R.split('\n')
  return SR[0], [[float(w) for w in x.split(', ')] for x in SR[1][2:-2].split('], [')]

## Problem Definition ##
def FunctionToOptimise(a):
  # Objective to minimise; a is one parameter vector [a[0], a[1], a[2]].
  return (a[0]-3)**2 + np.cos(a[0]*a[1]*a[2]) + np.exp(-a[2])

InitialPoint=[0,-1,1]
InitialSearchSpaceSize=0.2

## Optimisation w/ Optimy API ##
MyKey, Params = Optimy_GetKeyAndInitialParams(InitialPoint, InitialSearchSpaceSize)
for i in range(100):
  # Evaluate the objective on every parameter vector proposed by the API.
  scores = np.apply_along_axis(FunctionToOptimise, 1, np.asarray(Params))
  print("** Iteration " + str(i) + " **")
  print("MinValue: " + str(scores[np.argmin(scores)]))
  print("MinParameters: " + str(Params[np.argmin(scores)]))
  MyKey, Params = Optimy_FeedScoresAndGetNewParams(MyKey, scores)

Similar code is currently available in JavaScript, provided you can work around cross-origin (CORS) request restrictions through your own server.
Request it by email.

Send a message


contact@optimy.io


Suggestions? Bugs? Interested in buying the upcoming executable?

Other inquiries about non-linear parameter estimation?

Send an email and we'll do our best to answer.

Receive Updates


Leave us your email to be informed of updates and new features of this app.