
Optimisation methods

14 papers · 6 followers

About this topic
Optimisation methods are mathematical techniques used to identify the best solution or outcome from a set of feasible alternatives, often subject to constraints. These methods are applied across various fields, including operations research, engineering, and economics, to enhance efficiency, reduce costs, or improve performance in decision-making processes.

Key research themes

1. How can population-based metaheuristic algorithms effectively balance exploration and exploitation in complex optimization problems?

This research area focuses on developing and improving population-based metaheuristic optimization algorithms to effectively navigate complex, nonlinear, and multimodal search spaces. A central challenge is achieving a suitable balance between exploration (diversifying search across the solution space) and exploitation (intensifying search near promising areas). Efficient balancing improves convergence speed, avoids local optima, and enhances solution quality for a broad class of problems, including engineering design and real-world applications.

Key finding: Introduces a metaphor-free population-based optimizer, RUN, which utilizes the Runge-Kutta method's slope variations to guide search, incorporating two active phases of exploration and exploitation. The algorithm includes an…
Key finding: Proposes a Two-Stage Optimization (TSO) algorithm that updates each population member in two iterative steps, leveraging a selected group of good members for position updates. This novel update mechanism enhances both global…
Key finding: Develops a hybrid algorithm H-JTLBO combining JAYA and teaching-learning-based optimization to leverage the global search of JAYA and the local refinement capabilities of TLBO. This hybridization compensates for individual…
Key finding: Enhances gradient-based optimization (GBO) by incorporating a modified inertia weight and novel parameter tuning to accelerate convergence and escape local optima. The improved GBO algorithm maintains a balance between global…
Key finding: Introduces C-EGPSO, an advanced particle swarm optimization variant where each particle individually adapts weighting parameters through evolutionary game theory to improve convergence behavior and diversity. This…
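The exploration/exploitation trade-off these papers address can be illustrated with a minimal particle swarm optimizer in which a linearly decaying inertia weight shifts the swarm from wide exploration to local refinement. This is a generic textbook sketch, not the RUN, TSO, H-JTLBO, or C-EGPSO algorithm described above; all parameter values here are illustrative assumptions.

```python
import numpy as np

def pso(f, dim, n_particles=30, iters=200, bounds=(-5.0, 5.0), seed=0):
    """Minimal particle swarm optimizer (illustrative only).

    A linearly decaying inertia weight w moves the swarm from
    exploration (large w, large steps) to exploitation (small w,
    refinement around the best-known positions).
    """
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    x = rng.uniform(lo, hi, (n_particles, dim))   # positions
    v = np.zeros_like(x)                          # velocities
    pbest = x.copy()                              # personal best positions
    pbest_val = np.apply_along_axis(f, 1, x)
    g = pbest[np.argmin(pbest_val)].copy()        # global best position
    for t in range(iters):
        w = 0.9 - 0.5 * t / iters                 # inertia: explore -> exploit
        r1, r2 = rng.random((2, n_particles, dim))
        v = w * v + 2.0 * r1 * (pbest - x) + 2.0 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)
        vals = np.apply_along_axis(f, 1, x)
        better = vals < pbest_val
        pbest[better], pbest_val[better] = x[better], vals[better]
        g = pbest[np.argmin(pbest_val)].copy()
    return g, float(pbest_val.min())

best, val = pso(lambda z: float(np.sum(z**2)), dim=5)   # sphere test function
```

On the smooth, unimodal sphere function the decaying inertia weight alone is enough for convergence; the papers above target the harder multimodal case, where the update rule itself must be redesigned.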

2. What are best practices and key considerations in implementing optimization methods in statistical computing environments like R?

This theme centers on methodological insights and software implementation aspects of nonlinear parameter estimation and optimization methods within open-source statistical computing platforms, particularly the R environment. It addresses challenges of user-friendliness, default method choices, evolution and maintenance of optimization packages, the integration of legacy and modern algorithms, and developing a coherent, best-practice framework to enhance reliability and extensibility of optimization tools in applied statistics and scientific computing.

Key finding: Provides a critical analysis of the nonlinear optimization tools available in R, emphasizing the need for clear, consistent default methods accessible to novices, while supporting experimental flexibility for developers. It…
Key finding: Offers conceptual grounding on modeling and simulation within optimization, emphasizing distinctions between fundamental, empirical, and hybrid models and the importance of variable classification (input, decision,…
Key finding: Conducts a comprehensive qualitative review of optimization literature over the past decade focused on operations research, identifying trends in problem formulations (unconstrained, constrained), historical methodological…
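The concern about sensible default methods applies to any statistical computing environment. The sketch below uses Python's scipy.optimize as a stand-in for R's optim(), which is what these papers actually discuss, to show how a platform's default solver and an explicitly chosen derivative-free method can be compared on the same problem:

```python
import numpy as np
from scipy.optimize import minimize, rosen

x0 = np.array([-1.2, 1.0])
# With no `method` argument, minimize() falls back to BFGS for an
# unconstrained smooth problem; R's optim() instead defaults to
# Nelder-Mead, so the "default" behavior differs across platforms.
res_default = minimize(rosen, x0)
res_nm = minimize(rosen, x0, method="Nelder-Mead")   # derivative-free simplex
```

Both runs reach the Rosenbrock minimum at (1, 1), but a user relying on the default gets a gradient-based method in one environment and a simplex method in the other, which is exactly the consistency problem the R-focused analysis raises.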

3. How do mathematical and computational advances address specific challenges in large-scale and nonlinear constrained optimization problems in engineering and applied sciences?

This theme explores advances in mathematical optimization techniques tailored for large-scale, nonlinear, and constrained problems prevalent in engineering applications, including numerical PDEs, linear programming interior point methods, batch plant design, and resource optimization in construction project management. It focuses on algorithmic innovations such as improved gradient methods, preconditioning strategies, and hybrid deterministic/stochastic approaches to enhance computational efficiency, convergence guarantees, and practical applicability in complex systems.

Key finding: Presents an up-to-date, rigorous treatment of optimization theory and numerical strategies focused on modern computational requirements, including handling piecewise smooth and noisy objective functions through gradient-only…
Key finding: Develops and implements efficient preconditioning techniques for solving the indefinite augmented KKT systems arising in large-scale linear programming interior point methods. By exploiting sparsity and using null-space…
Key finding: Extends affine conjugate Newton methods to nonconvex minimization problems arising from nonlinear elastomechanics PDEs, proposing three algorithmic variants implemented with adaptive multilevel finite element discretizations…
Key finding: Evaluates and compares deterministic mathematical programming techniques (DICOPT++, SBB) and a genetic algorithm on benchmark batch plant design MINLP problems of increasing complexity. The study identifies SBB as the best…
Key finding: Analyzes resource-leveling optimization challenges in construction project management using project management software such as MS Project and Primavera. Discussing approaches for handling work, material, and cost resources…
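Several of the findings above concern preconditioned Krylov solvers for the normal-equations systems of the form AD^2A^T that interior point methods produce at every iteration. A minimal sketch, assuming a synthetic random sparse A and a badly scaled diagonal D^2, applies SciPy's conjugate gradient with a simple Jacobi (diagonal) preconditioner; the papers' actual preconditioners are far more sophisticated:

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import cg, LinearOperator

rng = np.random.default_rng(1)
m, n = 50, 200
A = sp.random(m, n, density=0.2, format="csr", random_state=1)
d2 = rng.uniform(1e-2, 1e2, n)                 # badly scaled diagonal of D^2
# Normal matrix A D^2 A^T, with a tiny regularization so it is safely SPD.
N = (A @ sp.diags(d2) @ A.T).toarray() + 1e-6 * np.eye(m)
b = rng.standard_normal(m)

diag = np.diag(N).copy()
M = LinearOperator((m, m), matvec=lambda v: v / diag)   # Jacobi preconditioner
x, info = cg(N, b, M=M, maxiter=5000)                   # info == 0 on convergence
```

Jacobi preconditioning only undoes diagonal scaling; the basis, deflated, and constraint preconditioners surveyed in this theme exist precisely because the scaling D^2 degenerates as the interior point iterates approach the boundary.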

All papers in Optimisation methods

Due to their large variety of applications, complex optimisation problems induced a great effort to develop efficient solution techniques, dealing with both continuous and discrete variables involved in non-linear functions. But among the…
We discuss the use of the preconditioned conjugate gradients (CG) method for solving the reduced KKT systems arising in interior point algorithms for linear programming. The (indefinite) augmented system form of this linear system has a number of…
This paper develops a novel particle swarm optimiser algorithm. The focus of this study is how to improve the performance of the classical particle swarm optimisation approach, i.e., how to enhance its convergence speed and capacity to…
Preconditioned iterative methods provide an effective alternative to direct methods for the solution of the KKT linear systems arising in Interior Point algorithms, especially when large-scale problems are considered. We analyze the…
In February 1979 a note by L. G. Khachiyan indicated how an ellipsoid method for linear programming can be implemented in polynomial time. This result has caused great excitement and stimulated a flood of technical papers. Ordinarily…
Responding to the international calls for high energy performance buildings like nearly-zero energy buildings (nZEB), recent years have seen significant growth in energy-saving and energy-supply measures in the building sector. A detailed…
In this paper, we address the preconditioned iterative solution of the saddle-point linear systems arising from the (regularized) Interior Point method applied to linear and quadratic convex programming problems, typically of large scale…
Momentum Iterative Hessian Sketch (M-IHS) techniques, a group of solvers for large scale regularized linear Least Squares (LS) problems, are proposed and analyzed in detail. Proposed M-IHS techniques are obtained by incorporating the…
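The Hessian-sketching idea behind these solvers can be sketched in a few lines: approximate A^T A once from a random projection of A, then iterate Newton-like steps with the exact gradient. This is plain iterative Hessian sketch without the momentum acceleration the paper proposes, and every dimension and constant below is an illustrative assumption:

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, m = 5000, 50, 1000           # tall LS problem; m is the sketch size
A = rng.standard_normal((n, d))
b = A @ rng.standard_normal(d) + 0.01 * rng.standard_normal(n)

# Sketch A once; the sketched Hessian replaces A^T A in every refinement
# step, while the gradient A^T (b - A x) is computed exactly.
SA = (rng.standard_normal((m, n)) / np.sqrt(m)) @ A
H = SA.T @ SA                      # m x d sketch gives a d x d Hessian proxy
x = np.zeros(d)
for _ in range(25):
    x += np.linalg.solve(H, A.T @ (b - A @ x))

x_ls, *_ = np.linalg.lstsq(A, b, rcond=None)   # dense reference solution
```

Because the gradient is exact, the fixed point of the iteration is the true least-squares solution; the sketch only affects the contraction rate, which is what momentum terms are introduced to improve.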
We review the use of block diagonal and block lower/upper triangular splittings for constructing iterative methods and preconditioners for solving stabilized saddle point problems. We introduce new variants of these splittings and obtain…
Given A := {a_1, …, a_m} ⊂ R^d whose affine hull is R^d, we study the problems of computing an approximate rounding of the convex hull of A and an approximation to the minimum volume enclosing ellipsoid of A. In the case of…
We study the problem of computing a (1 + ε)-approximation to the minimum volume enclosing ellipsoid of a given point set S = {p_1, p_2, …, p_n} ⊆ R^d. Based on a simple, initial volume approximation method, we propose a modification…
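A classical first-order method for this problem is Khachiyan's barycentric coordinate-ascent algorithm, which (1 + ε)-approximation schemes like the one above refine. The sketch below is the textbook version, not the modification proposed in the paper:

```python
import numpy as np

def mvee(points, tol=1e-3):
    """Khachiyan's coordinate-ascent algorithm for an approximate
    minimum volume enclosing ellipsoid {x : (x-c)^T A (x-c) <= 1}."""
    n, d = points.shape
    Q = np.vstack([points.T, np.ones(n)])     # lift points to homogeneous coords
    u = np.full(n, 1.0 / n)                   # weights on the points
    while True:
        X = Q @ np.diag(u) @ Q.T              # (d+1) x (d+1) moment matrix
        M = np.einsum("ji,jk,ki->i", Q, np.linalg.inv(X), Q)
        j = int(np.argmax(M))
        err = M[j] - d - 1.0                  # optimality gap
        if err <= tol:
            break
        step = err / ((d + 1.0) * (M[j] - 1.0))
        u *= 1.0 - step                       # shift weight toward the
        u[j] += step                          # most-violating point
    c = points.T @ u                          # ellipsoid center
    A = np.linalg.inv(points.T @ np.diag(u) @ points - np.outer(c, c)) / d
    return A, c

pts = np.random.default_rng(0).standard_normal((100, 2))
A, c = mvee(pts)
```

Each iteration moves weight toward the point that currently violates optimality the most, so the per-iteration cost is dominated by a (d+1)-dimensional inverse; the approximation papers above focus on reducing the number of such iterations needed for a given ε.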
Presented here is a fast method that combines curve matching techniques with a surface matching algorithm to estimate the positioning and respective matching error for the joining of three-dimensional fragmented objects. Furthermore, this…
Interior point methods (IPMs) have proven to be an efficient way of solving quadratic programming problems in predictive control. A linear system of equations needs to be solved in each iteration of an IPM. The ill-conditioning of this…
We propose a class of deflated preconditioners to address the problem of efficiently solving the systems of the normal equations (NE) arising at each step of the interior point method in constrained optimization. As is well known, the…
In this article, Momentum Iterative Hessian Sketch (M-IHS) techniques, a group of solvers for large scale linear Least Squares (LS) problems, are proposed and analyzed in detail. The proposed techniques are obtained by incorporating the…
The implementation of a linear programming interior point solver is described that is based on iterative linear algebra. The linear systems are preconditioned by a basis matrix, which is updated from one interior point iteration to the…
A dual logarithmic barrier method for solving large, sparse semidefinite programs is proposed in this paper. The method avoids any explicit use of the primal variable X and therefore is well-suited to problems with a sparse dual matrix S…
Recently, Resende and Veiga [31] have proposed an efficient implementation of the Dual Affine (DA) interior-point algorithm for the solution of linear transportation models with integer costs and right-hand side coefficients. This…
Systems of linear equations with "normal" matrices of the form AD^2A^T are a key ingredient in the computation of search directions for interior-point algorithms. In this article, we establish that a well-known basis preconditioner for…
We propose a method for solving a Hermitian positive definite linear system Ax = b, where A is an explicit sparse matrix (real or complex). A sparse approximate right inverse M is computed and replaced by M̃ = (M + M^H)/2, which is used as…
We propose a framework for building preconditioners for sequences of linear systems of the form (A + Δ_k)x_k = b_k, where A is symmetric positive semidefinite and Δ_k is diagonal positive semidefinite. Such sequences arise in several…
Large-scale optimization problems that seek sparse solutions have become ubiquitous. They are routinely solved with various specialized first-order methods. Although such methods are often fast, they usually struggle with not-so-well…
One of the most efficient interior-point methods for some classes of block-angular structured problems solves the normal equations by a combination of Cholesky factorizations and preconditioned conjugate gradient for, respectively, the…
The paper extends affine conjugate Newton methods from convex to nonconvex minimization, with particular emphasis on PDE problems originating from compressible hyperelasticity. Based on well-known schemes from finite dimensional nonlinear…
Several issues concerning an analysis of large and sparse linear programming problems prior to solving them with an interior point based optimizer are addressed in this paper. Three types of presolve procedures are distinguished. Routines…
In this paper we analyze a class of approximate constraint preconditioners in the acceleration of Krylov subspace methods for the solution of reduced Newton systems arising in optimization with interior point methods. We propose a dynamic…
Every Newton step in an interior-point method for optimization requires a solution of a symmetric indefinite system of linear equations. Most of today's codes apply direct solution methods to perform this task. The use of logarithmic…
Optimal process design often requires the solution of mixed integer non-linear programming problems. Optimization procedures must be robust and efficient if they are to be incorporated in automated design systems. For heat integrated…
Nature-inspired algorithms are proving to be very successful on complex optimisation problems. A new algorithm, inspired by the way plants, and in particular the strawberry plant, propagate is presented. The algorithm is explained, tested…
Given an arbitrary set A ⊂ R^n, we know that there exists an ellipsoid E which provides an n-rounding of the set A, i.e. (1/n)E ⊆ conv(A) ⊆ E. The minimum-volume ellipsoid that encloses the set A provides such an ellipsoid and will be…
We show the linear convergence of a simple first-order algorithm for the minimum-volume enclosing ellipsoid problem and its dual, the D-optimal design problem of statistics. Computational tests confirm the attractive features of this…