Optimization involves finding the minimum or maximum of a function, or fitting a model to data. While NumPy provides basic tools for finding extrema in arrays, SciPy is the go-to library for more complex optimization tasks.
Finding Extrema with NumPy
If you have discrete data in an array, you can find the minimum and maximum values and their indices.
import numpy as np

data = np.array([10, 5, 2, 8, 12, 3])

# Minimum and maximum values
min_val = np.min(data)
max_val = np.max(data)

# Indices of minimum and maximum
min_idx = np.argmin(data)
max_idx = np.argmax(data)

print(f"Min: {min_val} at index {min_idx}")
print(f"Max: {max_val} at index {max_idx}")
Min: 2 at index 2
Max: 12 at index 4
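The same functions work on multi-dimensional arrays via the axis argument. As a minimal sketch with a hypothetical 2x3 array (not from the example above), taking the extrema of each row:

# Hypothetical example: extrema along rows of a 2D array
readings = np.array([[10, 5, 2],
                     [8, 12, 3]])

row_min = np.min(readings, axis=1)         # minimum of each row -> [2, 3]
row_min_idx = np.argmin(readings, axis=1)  # column index of each row minimum -> [2, 2]

print(row_min, row_min_idx)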
Curve Fitting with SciPy
A common task in experimental mechanics is fitting a model function to measured data points. SciPy's curve_fit does this by non-linear least squares, returning the best-fit parameters and their covariance matrix.
from scipy.optimize import curve_fit

# Define the model function (e.g., linear)
def model(x, a, b):
    return a * x + b

# Generate some noisy data
x_data = np.linspace(0, 10, 10)
y_data = 2.5 * x_data + 1.0 + np.random.normal(0, 0.5, size=len(x_data))

# Fit the model to the data
popt, pcov = curve_fit(model, x_data, y_data)

print(f"Fitted parameters: a={popt[0]:.2f}, b={popt[1]:.2f}")
Fitted parameters: a=2.43, b=1.16
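The covariance matrix pcov can also be used to estimate how well the parameters are determined: the square roots of its diagonal entries are the one-standard-deviation uncertainties of the fitted parameters. A minimal sketch, continuing from the fit above:

# 1-sigma uncertainties of the fitted parameters from the covariance matrix
perr = np.sqrt(np.diag(pcov))
print(f"Uncertainties: da={perr[0]:.2f}, db={perr[1]:.2f}")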
Minimization with SciPy
To find the minimum of a mathematical function \(f(x)\), SciPy provides scipy.optimize.minimize, which takes the function and an initial guess and searches for a local minimum numerically.
from scipy.optimize import minimize

# Define the function to minimize: f(x) = (x - 3)^2 + 5
def f(x):
    return (x - 3)**2 + 5

# Initial guess
x0 = 0

# Minimize
result = minimize(f, x0)

print(f"Minimum found at x = {result.x[0]:.2f}")
print(f"Minimum value = {result.fun:.2f}")
Minimum found at x = 3.00
Minimum value = 5.00
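minimize is not limited to one variable: the objective can take a parameter vector, and the initial guess becomes an array. A minimal sketch with a hypothetical two-variable quadratic (not part of the example above):

# Hypothetical two-variable function: g(x, y) = (x - 1)^2 + (y + 2)^2
def g(p):
    x, y = p
    return (x - 1)**2 + (y + 2)**2

res = minimize(g, x0=np.array([0.0, 0.0]))
print(res.x)  # expected to be close to [1, -2]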
Implementing a Bracketing Method
Sometimes it is useful to implement a simple optimization algorithm yourself to understand how it works. Here is an implementation of the Golden Section Search, a bracketing method for finding the minimum of a unimodal function: it repeatedly shrinks the interval [a, b], keeping the subinterval that must contain the minimum, with the two interior sample points spaced by the golden ratio.
def golden_section_search(f, a, b, tol=1e-5):
    """Finds the minimum of f in the interval [a, b] using Golden Section Search."""
    gr = (np.sqrt(5) + 1) / 2  # Golden ratio
    c = b - (b - a) / gr
    d = a + (b - a) / gr
    while abs(b - a) > tol:
        if f(c) < f(d):
            b = d
        else:
            a = c
        # Recompute sample points
        c = b - (b - a) / gr
        d = a + (b - a) / gr
    return (b + a) / 2

# Test the function
# Minimize f(x) = (x - 3)^2 + 5
min_x = golden_section_search(f, 0, 6)
print(f"Golden Section Search found minimum at x = {min_x:.5f}")
Golden Section Search found minimum at x = 3.00000
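As a sanity check, SciPy ships the same idea in scipy.optimize.minimize_scalar with the 'golden' method. A minimal sketch, assuming the f(x) = (x - 3)^2 + 5 defined above is still in scope:

from scipy.optimize import minimize_scalar

# Cross-check the hand-written search against SciPy's golden-section implementation
res = minimize_scalar(f, bracket=(0, 6), method='golden')
print(f"SciPy golden method found x = {res.x:.5f}")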