Global optimization in Python with scipy.optimize

Einblick Content Team - March 1st, 2023

In data analysis, finding the global minimum of a function is a common task. However, it can be challenging to find the optimal solution due to the presence of multiple local minima. The scipy library provides a convenient method for global optimization called scipy.optimize.basinhopping(). This function implements the basin-hopping algorithm, which uses a combination of local optimization methods and stochastic jumps to find the global minimum of a given function.
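To make the idea concrete, here is a simplified, illustrative sketch of a basin-hopping loop (our own toy version, not scipy's implementation). It alternates a local minimization with a random jump and a Metropolis-style acceptance test; the function name and defaults below are ours, chosen only for illustration.

# A minimal sketch of the basin-hopping idea (not scipy's implementation):
# repeatedly perturb the current point, run a local minimizer, and accept
# the new point with a Metropolis-style criterion.
import numpy as np
from scipy.optimize import minimize

def basinhopping_sketch(func, x0, niter=100, stepsize=0.5, T=1.0, rng=None):
    rng = np.random.default_rng(rng)
    x = np.atleast_1d(np.asarray(x0, dtype=float))
    res = minimize(func, x, method="BFGS")      # initial local minimization
    x, fx = res.x, res.fun
    best_x, best_f = x, fx
    for _ in range(niter):
        # stochastic jump: perturb the current minimum
        x_trial = x + rng.uniform(-stepsize, stepsize, size=x.shape)
        res = minimize(func, x_trial, method="BFGS")
        # always accept improvements; sometimes accept worse points
        # so the search can escape the current basin
        if res.fun < fx or rng.random() < np.exp(-(res.fun - fx) / T):
            x, fx = res.x, res.fun
        if fx < best_f:
            best_x, best_f = x, fx
    return best_x, best_f

scipy's implementation layers additional refinements, such as adaptive step sizing, on top of this basic loop.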

In this tutorial, we provide an example of using the scipy.optimize.basinhopping() function to find the global minimum of a one-dimensional multimodal function. The complete code is walked through step by step below:

1. Import libraries

# Import the necessary libraries
import numpy as np
import scipy.optimize as opt

2. Define objective function and initial guess

We converted the following function into Python, and gave an initial guess of x = -2 for the location of the global minimum.

f(x) = -10\cos(\pi x - 2.2) + (x + 1.5)x
# Define objective function
def f(x):
    return -10*np.cos(np.pi*x - 2.2) + (x + 1.5)*x

# Set initial guess
x0 = [-2]

3. Setup and call `basinhopping()` function

In this case, we use the BFGS (Broyden-Fletcher-Goldfarb-Shanno) method for the local, unconstrained minimization step, passed to `basinhopping()` via `minimizer_kwargs`. scipy implements a number of other local minimizers as well.

We have also specified how many basin-hopping iterations to run via the `niter` argument.

# Set up args for basinhopping and call function
minimizer_kwargs = {"method": "BFGS"}
optimization_algorithm = opt.basinhopping(f, x0, minimizer_kwargs=minimizer_kwargs, niter=200)
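The behavior can be tuned further. As a sketch (argument names assume scipy's documented basinhopping signature; check the documentation for your installed version), we could instead use a bounded local minimizer, enlarge the random jumps, and fix the seed for reproducibility:

# Sketch: a bounded local minimizer, larger jumps, and a fixed seed
# (argument names assume scipy's documented basinhopping signature)
minimizer_kwargs_bounded = {"method": "L-BFGS-B", "bounds": [(-10, 10)]}
result_bounded = opt.basinhopping(
    f,
    x0,
    minimizer_kwargs=minimizer_kwargs_bounded,
    niter=200,
    stepsize=1.0,   # typical size of the random displacement between hops
    seed=42,        # makes the sequence of random jumps reproducible
)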

4. Print results

print("1-D function")
print(optimization_algorithm.message[0])

# Save results
optimized_x = optimization_algorithm.x
optimized_fun = optimization_algorithm.fun

# Print results
print("Optimized x: ", optimized_x)
print("Optimized function value: ", optimized_fun)

Output:

1-D function
requested number of basinhopping iterations completed successfully
Optimized x:  [-1.28879778]
Optimized function value:  -10.26631244852453

From the results, stored in the message, x, and fun attributes, we can see that the algorithm detected the global minimum at:

x \approx -1.289 \newline f(x) \approx -10.266
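Beyond message, x, and fun, the returned OptimizeResult exposes further diagnostics (attribute names as documented by scipy; exact availability may depend on your version), for example:

# Additional diagnostics on the basinhopping result (a sketch)
print("Basin-hopping iterations run:", optimization_algorithm.nit)
print("Objective function evaluations:", optimization_algorithm.nfev)
print("Best local minimization result:", optimization_algorithm.lowest_optimization_result.x)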

BONUS. Plot the function and check the results

To check that the algorithm has not gotten stuck in a local minimum, we can plot the function using the following code:

import seaborn as sns
import matplotlib.pyplot as plt
sns.set_theme()

# Generate data for the objective function graph
X = np.arange(-10, 10, 0.2)
y = [f(x) for x in X]

# Draw crosshair lines through the detected global minimum
plt.vlines(x=optimized_x, ymin=-10, ymax=125, colors='red')
plt.hlines(y=optimized_fun, xmin=-10, xmax=10, colors='red')

# Plot the objective function and mark the minimum
plt.plot(X, y)
plt.plot(optimized_x, optimized_fun, 'o', color="black")
plt.show()

Output:

Plot of the global minimum of the multimodal function
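As a quick numeric cross-check, we can also compare the basin-hopping result against the minimum over the same grid used for plotting (a coarse check only, since the grid spacing is 0.2):

# Coarse grid-based sanity check using the X and y arrays defined above
grid_min_index = np.argmin(y)
print("Grid minimum near x =", X[grid_min_index], "with f(x) =", y[grid_min_index])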

About

Einblick is an AI-native data science platform that provides data teams with an agile workflow to swiftly explore data, build predictive models, and deploy data apps. Founded in 2020, Einblick was developed based on six years of research at MIT and Brown University. Einblick is funded by Amplify Partners, Flybridge, Samsung Next, Dell Technologies Capital, and Intel Capital. For more information, please visit www.einblick.ai and follow us on LinkedIn and Twitter.
