This chapter discusses a generalization of the expected improvement used in Bayesian global optimization to the multicriteria optimization domain, where the goal is to find an approximation to the Pareto front. The expected hypervolume improvement (EHVI) measures improvement as the gain in dominated hypervolume relative to a given approximation to the Pareto front. We review known properties of the EHVI and its applications in practice, and propose a new exact algorithm for computing the EHVI. The new algorithm has asymptotically optimal time complexity O(n log n), improving on existing computation schemes by a factor of n / log n. This shows that this measure, at least for a small number of objective functions, can be computed as fast as the simpler measures of multicriteria expected improvement considered in recent years.
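The two quantities the abstract builds on can be sketched compactly: the dominated hypervolume of a 2-D front, and the EHVI of a candidate point with a Gaussian predictive distribution. The sketch below estimates the EHVI by Monte Carlo sampling; this is not the O(n log n) exact algorithm the chapter proposes, only a minimal illustration of the definition, and all function names (`hv2d`, `ehvi_mc`) are hypothetical.

```python
import random

def hv2d(points, ref):
    """Dominated hypervolume of a 2-D minimization front w.r.t. a reference point."""
    # keep only points that dominate the reference, sorted by the first objective
    pts = sorted(p for p in points if p[0] < ref[0] and p[1] < ref[1])
    vol, prev_y = 0.0, ref[1]
    for x, y in pts:
        if y < prev_y:                      # skip dominated points
            vol += (ref[0] - x) * (prev_y - y)
            prev_y = y
    return vol

def ehvi_mc(front, ref, mu, sigma, n=20000, seed=1):
    """Monte Carlo estimate of the expected hypervolume improvement of a
    candidate whose objective vector is N(mu, diag(sigma^2)) distributed."""
    random.seed(seed)
    base = hv2d(front, ref)
    total = 0.0
    for _ in range(n):
        sample = tuple(random.gauss(m, s) for m, s in zip(mu, sigma))
        total += max(0.0, hv2d(front + [sample], ref) - base)
    return total / n
```

For instance, with front `[(1, 1)]`, reference `(2, 2)`, and a nearly deterministic prediction at `(0.5, 0.5)`, the estimate approaches the exact improvement of 1.25.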
The hypervolume indicator (HVI) is frequently used to assess the quality of finite Pareto-front approximation sets in multi-objective optimization. Approximation sets that maximize the HVI value typically lie on the Pareto front, with their points spread across it; the distribution is particularly dense at the boundaries of the Pareto front and at knee points. Recent work has introduced generalizations of the HVI, and these indicators are the subject of this chapter. In particular, we consider questions of computational complexity and of the distribution of points in approximation sets optimized with respect to these indicators, as well as the question of which decision-maker preferences are expressed by the choice of a particular indicator.
Studies in computational intelligence, Jun 2, 2019
Bayesian Global Optimization (BGO), also referred to as Bayesian Optimization or Efficient Global Optimization (EGO), uses statistical models, typically Gaussian process regression, to approximate an expensive objective function. Based on this prediction, an infill criterion is formulated that takes into account the predicted value and its variance, and BGO adds a new point at the position where this infill criterion attains its optimum. In this chapter, we review different ways to formulate such infill criteria. A focus is on approaches that measure improvement using integrals or statistical moments of a probability distribution over the non-dominated space, including the probability of improvement, the expected hypervolume improvement, and upper quantiles of the hypervolume improvement. These criteria require the solution of non-linear integrals. Besides summarizing progress in the computation of such integrals, we present new, efficient procedures for the high-dimensional expected improvement and probability of improvement. Moreover, the chapter summarizes the main properties of these infill criteria, including continuity and differentiability as well as monotonicity properties in the variance and mean value; the latter are necessary for constructing global optimization algorithms for non-convex problems.
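In the single-objective case, the two best-known infill criteria mentioned above have closed forms. A minimal sketch, assuming a Gaussian prediction N(mu, sigma^2) at a candidate point and the current best observed value `f_best` for a minimization problem:

```python
import math

def norm_pdf(z):
    """Standard normal density."""
    return math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)

def norm_cdf(z):
    """Standard normal cumulative distribution, via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def expected_improvement(mu, sigma, f_best):
    """Closed-form EI for minimization: E[max(0, f_best - Y)], Y ~ N(mu, sigma^2)."""
    if sigma <= 0.0:
        return max(0.0, f_best - mu)
    z = (f_best - mu) / sigma
    return (f_best - mu) * norm_cdf(z) + sigma * norm_pdf(z)

def probability_of_improvement(mu, sigma, f_best):
    """P(Y < f_best) for Y ~ N(mu, sigma^2)."""
    if sigma <= 0.0:
        return 1.0 if mu < f_best else 0.0
    return norm_cdf((f_best - mu) / sigma)
```

BGO then proposes the next evaluation at the maximizer of such a criterion over the search space; the multi-objective criteria discussed in the chapter replace the scalar improvement `f_best - Y` with an improvement over the non-dominated space.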
By a multi-objective optimization problem (MOP), also known as a vector optimization problem, we mean the problem of simultaneously optimizing a finite set of real-valued functions with a common domain. The object of interest in multi-objective optimization is the so-called Pareto front (PF). The indicator-based approach to solving multi-objective optimization problems has become very popular. Indicators are used, among other purposes, to compare the quality of approximation sets to PFs produced by one or several algorithms. Among these, the R2 indicator has attracted widespread interest, as it is relatively frugal in its use of computational resources compared to other indicators. We study the expected improvement of this indicator, given an approximation set to the PF and a probability density function of a predictive distribution of objective function vectors. The improvement of this indicator is defined as follows: the R2 indicator is evaluated on the given approximation set of the PF with a point in the image of the feasible set added, the R2 indicator is evaluated on the given approximation set alone, and the latter is subtracted from the former; the resulting difference is the R2-improvement of the chosen point with respect to the given approximation set. The expected improvement is the mean of the improvement over the image of the feasible set with respect to the given pdf. For two-dimensional MOPs, we derive a formula for the expected improvement with respect to a probability density function of a predictive distribution of objective function vectors.
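The R2-improvement defined in the abstract can be illustrated directly. The sketch below uses the common unary R2 variant with Chebyshev scalarizations over a finite set of weight vectors and a utopian point (a standard formulation, assumed here rather than taken from the paper); since lower R2 is better, the improvement of a candidate `p` is the decrease in R2 when `p` is added. Function names are hypothetical.

```python
def simplex_weights(k):
    """k evenly spaced weight vectors on the 2-D probability simplex."""
    return [(i / (k - 1), 1.0 - i / (k - 1)) for i in range(k)]

def r2(A, weights, utopian):
    """Unary R2 indicator (lower is better) with Chebyshev scalarizations."""
    total = 0.0
    for w in weights:
        total += min(
            max(wj * (aj - uj) for wj, aj, uj in zip(w, a, utopian))
            for a in A
        )
    return total / len(weights)

def r2_improvement(A, p, weights, utopian):
    """R2 gain from adding candidate point p to approximation set A (>= 0)."""
    return r2(A, weights, utopian) - r2(A + [p], weights, utopian)
```

For example, adding the point `(2, 2)` to the set `{(1, 3), (3, 1)}` with utopian point `(0, 0)` and three weight vectors lowers R2 from 3.5/3 to 1, so its R2-improvement is 1/6; the expected improvement then averages this quantity over the predictive distribution of the candidate's objective vector.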
Recently, the Hypervolume Newton Method (HVN) has been proposed as a fast and precise indicator-based method for solving unconstrained bi-objective optimization problems. The HVN is defined on the space of (vectorized) fixed-cardinality sets of decision-space vectors for a given multi-objective optimization problem (MOP) and seeks to maximize the hypervolume indicator by adopting the Newton–Raphson method for deterministic numerical optimization. To extend its scope to non-convex optimization problems, the HVN method was hybridized with a multi-objective evolutionary algorithm (MOEA), which resulted in a competitive solver for continuous unconstrained bi-objective optimization problems. In this paper, we extend the HVN to constrained MOPs with, in principle, any number of objectives. As in the original variant, the first- and second-order derivatives of the involved functions have to be given either analytically or numerically. We demonstrate the applicability …
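The core of the method is the classical Newton–Raphson iteration the abstract refers to, applied to the (here stand-in) objective being maximized: x ← x − H(x)⁻¹ ∇f(x), using the gradient and Hessian supplied analytically or numerically. A minimal two-variable sketch on a toy concave function, not the hypervolume indicator itself:

```python
def newton_max_2d(grad, hess, x, steps=10):
    """Undamped Newton-Raphson iteration for maximizing a C^2 function of two
    variables: repeatedly solve H(x) d = grad(x) and step x <- x - d."""
    for _ in range(steps):
        g1, g2 = grad(x)
        (h11, h12), (h21, h22) = hess(x)
        det = h11 * h22 - h12 * h21
        # solve the 2x2 system H d = g by Cramer's rule
        d1 = (g1 * h22 - g2 * h12) / det
        d2 = (h11 * g2 - h21 * g1) / det
        x = (x[0] - d1, x[1] - d2)
    return x

# toy concave stand-in for an indicator: f(x, y) = -(x - 1)^2 - 2 * (y + 0.5)^2
grad = lambda p: (-2.0 * (p[0] - 1.0), -4.0 * (p[1] + 0.5))
hess = lambda p: ((-2.0, 0.0), (0.0, -4.0))
```

On this quadratic the iteration reaches the maximizer (1, −0.5) in a single step; the HVN applies the same scheme to the hypervolume of a whole fixed-cardinality set, so the iterate is the concatenation of all set members and the Hessian couples neighboring points.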