In data analysis, finding the global minimum of a function is a common task. It can be challenging, however, because many functions have multiple local minima in which a purely local optimizer can get stuck. The scipy library provides a convenient method for global optimization: scipy.optimize.basinhopping(). This function implements the basin-hopping algorithm, which combines local optimization with stochastic jumps to search for the global minimum of a given function.
In this tutorial, we provide an example of using the scipy.optimize.basinhopping() function to find the global minimum of a one-dimensional multimodal function. The entire code is shown below.
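Before reaching for the library call, it helps to see the algorithm's skeleton. Below is a minimal, hand-rolled sketch of the basin-hopping loop applied to the same objective used later in this tutorial. The names `toy_basinhopping`, `n_hops`, and `step` are our own, and for simplicity it uses greedy acceptance instead of the Metropolis criterion that SciPy's implementation uses:

```python
import numpy as np
import scipy.optimize as opt

def f(x):
    x = np.atleast_1d(x)[0]  # accept a scalar or a 1-element array
    return -10*np.cos(np.pi*x - 2.2) + (x + 1.5)*x

def toy_basinhopping(func, x0, n_hops=200, step=2.0, rng=None):
    """Sketch of basin-hopping: local minimize, random hop, keep the best."""
    rng = np.random.default_rng(rng)
    best = opt.minimize(func, [x0], method="BFGS")
    for _ in range(n_hops):
        # Hop: perturb the current best point, then minimize locally again.
        start = best.x[0] + rng.uniform(-step, step)
        trial = opt.minimize(func, [start], method="BFGS")
        if trial.fun < best.fun:  # greedy acceptance for simplicity
            best = trial
    return best

result = toy_basinhopping(f, -2.0, rng=0)
print(result.x[0], result.fun)
```

SciPy's version adds a temperature-based acceptance test and adaptive step sizes, but the hop-then-minimize structure is the same.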
1. Import libraries
# Import the necessary libraries
import numpy as np
import scipy.optimize as opt
2. Define objective function and initial guess
We define the objective function f(x) = -10cos(πx - 2.2) + (x + 1.5)x in Python, and give an initial guess of x = -2 for the location of the global minimum.
# Define objective function
def f(x):
    return -10*np.cos(np.pi*x - 2.2) + (x + 1.5)*x
# Set initial guess
x0 = [-2]
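To see why a global method is needed here, note that a purely local minimizer converges to whichever basin its search falls into. The starting points below are our own illustrative choices, and this variant of f returns a scalar, which opt.minimize expects:

```python
import numpy as np
import scipy.optimize as opt

def f(x):
    x = np.atleast_1d(x)[0]  # return a scalar, as minimize() expects
    return -10*np.cos(np.pi*x - 2.2) + (x + 1.5)*x

# A local minimizer's answer depends on where it starts.
results = []
for start in [-2.0, 3.0, 8.0]:
    res = opt.minimize(f, [start], method="BFGS")
    results.append(res)
    print(f"start={start:5.1f}  ->  x={res.x[0]:8.4f}  f={res.fun:9.4f}")
```

Each run ends at a stationary point of f, but not necessarily the same one, and not necessarily the global minimum; basin-hopping automates restarting from perturbed points.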
3. Set up and call the `basinhopping()` function
For the local minimization step, we use the BFGS (Broyden-Fletcher-Goldfarb-Shanno) method for unconstrained minimization, passed via minimizer_kwargs; scipy implements a number of other local methods as well. We also specify how many basin-hopping iterations to run via the niter argument.
# Set up args for basinhopping and call function
minimizer_kwargs = {"method": "BFGS"}
optimization_algorithm = opt.basinhopping(f, x0, minimizer_kwargs=minimizer_kwargs, niter=200)
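Basin-hopping is stochastic, so repeated runs can land on different results for harder problems. Recent SciPy versions let you fix the random state via the `seed` argument (newer releases also accept `rng`); the following is a sketch assuming a version that supports `seed`:

```python
import numpy as np
import scipy.optimize as opt

def f(x):
    x = np.atleast_1d(x)[0]
    return -10*np.cos(np.pi*x - 2.2) + (x + 1.5)*x

# With a fixed seed, the sequence of random hops is identical
# across runs, so the two results match.
kwargs = dict(minimizer_kwargs={"method": "BFGS"}, niter=50, seed=42)
run1 = opt.basinhopping(f, [-2], **kwargs)
run2 = opt.basinhopping(f, [-2], **kwargs)
print(np.allclose(run1.x, run2.x))
```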
4. Print results
print("1-D function")
print(optimization_algorithm.message[0])
# Save results
optimized_x = optimization_algorithm.x
optimized_fun = optimization_algorithm.fun
# Print results
print("Optimized x: ", optimized_x)
print("Optimized function value: ", optimized_fun)
Output:
1-D function
requested number of basinhopping iterations completed successfully
Optimized x: [-1.28879778]
Optimized function value: -10.26631244852453
From the results, stored in the message, x, and fun attributes of the returned object, we can see that the algorithm located the global minimum at x ≈ -1.2888, where f(x) ≈ -10.2663.
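As a quick numerical sanity check (this grid scan is our own addition, not part of `basinhopping()`), we can evaluate f on a dense grid and confirm that the smallest sampled value agrees with the optimizer's answer:

```python
import numpy as np

def f(x):
    return -10*np.cos(np.pi*x - 2.2) + (x + 1.5)*x

# Evaluate f on a dense grid over [-10, 10) and find the smallest value.
X = np.arange(-10, 10, 0.001)
y = f(X)
i = np.argmin(y)
print(f"grid minimum: x = {X[i]:.4f}, f = {y[i]:.4f}")
```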
BONUS. Plot the function and check the results
To check that the optimizer has not gotten stuck in a local minimum, we can plot the function using the following code:
import seaborn as sns
import matplotlib.pyplot as plt
sns.set_theme()
# Generate data for the objective function graph
X = np.arange(-10, 10, 0.2)
y = f(X)
# Mark the global minimum with crosshairs
plt.vlines(x=optimized_x, ymin=-10, ymax=125, colors='red')
plt.hlines(y=optimized_fun, xmin=-10, xmax=10, colors='red')
plt.plot(X, y)
plt.plot(optimized_x, optimized_fun, 'o', color="black")
plt.show()
Output: a plot of f(x) over [-10, 10], with the detected global minimum marked at the intersection of the red lines.
About
Einblick is an AI-native data science platform that provides data teams with an agile workflow to swiftly explore data, build predictive models, and deploy data apps. Founded in 2020, Einblick was developed based on six years of research at MIT and Brown University. Einblick is funded by Amplify Partners, Flybridge, Samsung Next, Dell Technologies Capital, and Intel Capital. For more information, please visit www.einblick.ai and follow us on LinkedIn and Twitter.