Optimization in mixed search domains with continuous, discrete, and categorical parameters

Before starting this tutorial, we suggest reading documentation#getstarted, which provides an example of data communication between your client code and Indie Solver.

You need a registered account to communicate with the solver. If you are a student or a researcher, don't forget to contact us at contact@indiesolver.com after registering so that we can upgrade your account to the standard subscription plan, which is free for academic researchers. This will allow you to tackle more challenging optimization problems.

The source code of this tutorial is available for download and includes two files: Mixed1.py and IndieSolver.py. The former is specific to this tutorial, while the latter is our compact wrapper code for communicating with Indie Solver. Feel free to inspect the internals of the latter to convince yourself that our code is safe to use.

We consider a simple optimization problem with 4 parameters of different types. One parameter, x_1, is categorical and thus takes values from a set of possible alternatives; here, x_1 can be "A", "B", or "C" (any other string values could be used as well). Two other variables, x_2 and x_3, are continuous and constrained to vary in [0, 1] (it usually makes sense to normalize parameter values, but any other range can be used). The fourth variable, x_4, is discrete: its possible values are not continuous but differ by some user-defined step factor. For instance, setting this step factor to 1 makes x_4 represent an integer. Optionally, one can suggest initial parameter values to be considered. The following code defines the search space and the single objective function to be minimized.
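A minimal sketch of such a problem definition is given below. The dictionary schema (field names such as "goal", "min", "max", "step", and "initial_value"), the problem name "mixed1", and the [0, 20] range for x_4 are illustrative assumptions; only the parameter names, their types, the [0, 1] bounds, and the metric name "obj1" follow from this tutorial.

```python
# Sketch of the problem description; the field names below are assumptions
# made for illustration and may differ from the actual schema.
problem = {
    "problem_name": "mixed1",
    "metrics": [{"name": "obj1", "goal": "minimize"}],
    "parameters": [
        {"name": "x_1", "type": "categorical",
         "values": ["A", "B", "C"], "initial_value": "A"},
        {"name": "x_2", "type": "continuous",
         "min": 0.0, "max": 1.0, "initial_value": 0.5},
        {"name": "x_3", "type": "continuous",
         "min": 0.0, "max": 1.0, "initial_value": 0.5},
        # The [0, 20] range for x_4 is an arbitrary choice for this sketch;
        # step = 1 makes x_4 an integer-valued parameter.
        {"name": "x_4", "type": "discrete",
         "min": 0, "max": 20, "step": 1, "initial_value": 10},
    ],
}
```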

We formulate our problem as follows

f(x) = (x_4 - 7)^2 + \begin{cases} 0.2 + (x_2 - 0.3456789)^2 & \text{if } x_1 = \text{"A"} \\ 0.1 + (x_2 - 0.3456789)^2 + (x_3 - 0.3456789)^2 & \text{if } x_1 = \text{"B"} \\ (x_2 - 0.456789)^2 + (x_3 - x_2 - 0.234567)^2 & \text{if } x_1 = \text{"C"} \end{cases}

where i) the categorical parameter x_1 has a strong impact on the search space, ii) the optimal values of x_2 are somewhat comparable under different values of x_1, iii) x_3 is active only under certain values of x_1, iv) x_4 is independent of x_1 and has a strong impact, and v) there is a simple linkage between x_2 and x_3 which makes the problem non-separable. The optimum f(x^*) = 0 is attained at x^* = ["C", 0.456789, 0.691356, 7].

Candidate solutions will be evaluated in a function called "evaluate", which implements f(x).
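A sketch of this function follows, under the assumption that a solution is a dictionary with "parameters" and "metrics" entries (the actual layout may differ):

```python
def evaluate(solution):
    # Read parameter values; the "parameters"/"metrics" dictionary layout
    # of a solution object is an assumption for illustration.
    x1 = solution["parameters"]["x_1"]
    x2 = solution["parameters"]["x_2"]
    x3 = solution["parameters"]["x_3"]
    x4 = solution["parameters"]["x_4"]
    # f(x) as defined above.
    value = (x4 - 7) ** 2
    if x1 == "A":
        value += 0.2 + (x2 - 0.3456789) ** 2
    elif x1 == "B":
        value += 0.1 + (x2 - 0.3456789) ** 2 + (x3 - 0.3456789) ** 2
    else:  # x1 == "C"
        value += (x2 - 0.456789) ** 2 + (x3 - x2 - 0.234567) ** 2
    solution["metrics"] = {"obj1": value}
    return solution
```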


In order to communicate with Indie Solver, you should initialize a worker and tell it to connect to a particular IP address and port while providing your secret authentication token. All of this information is available on your profile page. Then, you can create a problem according to your previously defined "problem" object. Note that every time you create a problem with some "problem_name", the server overwrites all information stored for that "problem_name". Therefore, to continue optimizing an already existing problem, ask for its description by calling "ask_problem_description" and set it on your worker by calling "set_problem", as shown in the last two commented-out lines below:
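A sketch of these steps is shown below. The constructor signature, the method name "create_problem", and the connection details are illustrative assumptions; "ask_problem_description" and "set_problem" are the calls named above.

```python
import IndieSolver

# Placeholders: copy the real IP address, port, and token from your profile
# page. The class name and constructor signature are assumptions.
worker = IndieSolver.IndieSolver("YOUR_SERVER_IP", 8080, "YOUR_TOKEN")
worker.create_problem(problem)

# To continue optimizing an already existing problem instead:
# problem = worker.ask_problem_description("mixed1")
# worker.set_problem(problem)
```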

The following JSON request describing the problem to be created will be sent to Indie Solver:
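Assuming the problem dictionary defined earlier, the request could look roughly as follows; the envelope (e.g., the "request_type" field) is an illustrative assumption:

```json
{
    "request_type": "create_problem",
    "problem": {
        "problem_name": "mixed1",
        "metrics": [{"name": "obj1", "goal": "minimize"}],
        "parameters": [
            {"name": "x_1", "type": "categorical", "values": ["A", "B", "C"], "initial_value": "A"},
            {"name": "x_2", "type": "continuous", "min": 0.0, "max": 1.0, "initial_value": 0.5},
            {"name": "x_3", "type": "continuous", "min": 0.0, "max": 1.0, "initial_value": 0.5},
            {"name": "x_4", "type": "discrete", "min": 0, "max": 20, "step": 1, "initial_value": 10}
        ]
    }
}
```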

Your worker can ask for solutions iteratively, one after another, by calling "ask_new_solutions". The reply to this call includes the solutions to be evaluated. In case the server detected errors, check the status of the reply in order to avoid unnecessary and potentially expensive calls to the objective function. Once some solutions are evaluated, communicate the results to the server by calling "tell_metrics".
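A sketch of such a loop, assuming the reply is a dictionary with a "status" field and a list of "solutions", and that "ask_new_solutions" accepts the number of requested solutions:

```python
for iteration in range(100):
    # Request one solution per call; the argument is an assumption.
    reply = worker.ask_new_solutions(1)
    # Check the reply status before evaluating, so that server-side errors
    # do not trigger unnecessary (and potentially expensive) evaluations.
    if reply.get("status", "success") != "success":
        print("Server returned an error:", reply)
        break
    evaluated = [evaluate(s) for s in reply["solutions"]]
    worker.tell_metrics(evaluated)
```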


The loop given above will request and evaluate solutions 100 times. If you are on our free subscription plan, only 80 function evaluations will be performed because of the limit of at most 20x the number of parameters function evaluations per problem (20 × 4 = 80 here). If you are a researcher, contact us for a free extension of this limit to 1000x, which corresponds to 4000 function evaluations on this problem.

The entire code of this tutorial example is given below:
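Assembling the sketches above, under the same assumptions about the wrapper's API and the solution and reply formats:

```python
import IndieSolver

# Problem description; field names are assumptions made for illustration.
problem = {
    "problem_name": "mixed1",
    "metrics": [{"name": "obj1", "goal": "minimize"}],
    "parameters": [
        {"name": "x_1", "type": "categorical",
         "values": ["A", "B", "C"], "initial_value": "A"},
        {"name": "x_2", "type": "continuous",
         "min": 0.0, "max": 1.0, "initial_value": 0.5},
        {"name": "x_3", "type": "continuous",
         "min": 0.0, "max": 1.0, "initial_value": 0.5},
        {"name": "x_4", "type": "discrete",
         "min": 0, "max": 20, "step": 1, "initial_value": 10},
    ],
}

def evaluate(solution):
    # Implements f(x) as formulated above; the solution layout is assumed.
    x1 = solution["parameters"]["x_1"]
    x2 = solution["parameters"]["x_2"]
    x3 = solution["parameters"]["x_3"]
    x4 = solution["parameters"]["x_4"]
    value = (x4 - 7) ** 2
    if x1 == "A":
        value += 0.2 + (x2 - 0.3456789) ** 2
    elif x1 == "B":
        value += 0.1 + (x2 - 0.3456789) ** 2 + (x3 - 0.3456789) ** 2
    else:  # x1 == "C"
        value += (x2 - 0.456789) ** 2 + (x3 - x2 - 0.234567) ** 2
    solution["metrics"] = {"obj1": value}
    return solution

# Placeholders: copy the real IP address, port, and token from your profile page.
worker = IndieSolver.IndieSolver("YOUR_SERVER_IP", 8080, "YOUR_TOKEN")
worker.create_problem(problem)
# To continue an already existing problem instead:
# problem = worker.ask_problem_description("mixed1")
# worker.set_problem(problem)

for iteration in range(100):
    reply = worker.ask_new_solutions(1)
    if reply.get("status", "success") != "success":
        print("Server returned an error:", reply)
        break
    evaluated = [evaluate(s) for s in reply["solutions"]]
    worker.tell_metrics(evaluated)
```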


All evaluated solutions can be viewed in your dashboard. Select the problem of interest as shown below.

You can verify the correctness of your problem description.

The evolution of the objective function value over the number of function evaluations is shown below. You can modify what is shown on each axis.

Importantly, you can apply different transformations (e.g., a log10() transformation of the y-axis) to better visualize your data. To view a solution of interest, double-click on it and it will appear in the text box below.

Similarly, the evolution of parameter values can be visualized as shown below.

We select "Show metrics and parameters" for the following parallel coordinates figure. You can change colors and move individual coordinates to re-order them.

We select a range of interest for "obj1" to make the corresponding solutions appear in the table below the figure.

All evaluated solutions are stored in the table below.

The results suggest that the algorithm successfully approaches the optimum. To compare the solver to random search, we run both algorithms with 100 random initializations (relevant only for Indie Solver) and present the median results below.

Random search can be competitive for low-dimensional problems, for problems without any global structure, and when only very small computational budgets are available. Yet even on our toy problem with only 4 variables, random search is outperformed very quickly.

In this introductory tutorial we demonstrated how to solve a simple optimization problem and view its solutions in the dashboard. You may need to make a few adjustments to the problem description to match your actual problem at hand. Have a look at our other tutorials and let us know at contact@indiesolver.com if you need help setting up your experiment.