Exploration of Gaussian Mixture Model for Data Driven Multiobjective Optimization Under Uncertainty

Purnima, Mishra and Mitra, Kishalay (2019) Exploration of Gaussian Mixture Model for Data Driven Multiobjective Optimization Under Uncertainty. Masters thesis, Indian Institute of Technology Hyderabad.

Mtech_Thesis_TD1423_2019.pdf (3MB)
Restricted to Repository staff only until 28 June 2021.

Abstract

In optimization, a certain number of objectives are optimized such that all given constraints are satisfied. When uncertainties are present in the data or the model used to calculate the objective functions and constraints, the problem falls under the category of optimization under uncertainty. Among the many methods suggested in the literature, robust optimization is regarded as one of the most efficient ways to solve such problems. One formulation in robust optimization involves computing statistical moments of the uncertain objective functions and constraints, such as the expectation and standard deviation, by sampling the uncertain parameter space. Generally, only a very limited number of data points are available in the uncertain space for computing these moments, which calls for generating more points in the uncertain space so that the moments can be estimated more accurately. The sampling methods commonly used for this purpose, such as box sampling, diamond sampling and polygon sampling, are not very accurate when the data points are scattered: they generate samples outside the desired space and thereby lead to inaccurate solutions. In this work, the Gaussian mixture model (GMM) is proposed to sample data points from the uncertain space more accurately. Using the GMM, the uncertain dataset is divided into clusters, and points are sampled within these clusters, leaving no room for samples outside these regions. An improvement is also proposed in the learning of the GMM parameters, which is conventionally performed by the expectation maximization algorithm. This algorithm follows a classical derivative-based approach, in which the Lagrange function formed from the likelihood is differentiated with respect to the model parameters and equated to zero, so it is limited to converging to a local optimum. A derivative-free evolutionary approach, the genetic algorithm, is therefore utilized to increase the probability of converging to the global optimum. A case study on Himmelblau's function is chosen to demonstrate the efficacy of the proposed approach in terms of the use of an efficient global optimizer and sampling efficiency.
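The following is a minimal sketch of the sampling idea described in the abstract: fit a Gaussian mixture to a small set of scattered uncertain data points, draw many new samples from the fitted mixture so they remain concentrated in the clustered regions (unlike box-type sampling), and use those samples to estimate the expectation and standard deviation of Himmelblau's function. The libraries (NumPy, scikit-learn), the synthetic data and all parameter values are illustrative assumptions, not the thesis implementation.

    import numpy as np
    from sklearn.mixture import GaussianMixture


    def himmelblau(x, y):
        """Himmelblau's test function, the case study named in the abstract."""
        return (x**2 + y - 11) ** 2 + (x + y**2 - 7) ** 2


    rng = np.random.default_rng(0)

    # Stand-in for the few scattered data points available in the
    # uncertain parameter space (two separated clusters).
    data = np.vstack([
        rng.normal([3.0, 2.0], 0.3, size=(15, 2)),
        rng.normal([-2.8, 3.1], 0.3, size=(15, 2)),
    ])

    # Fit a two-component GMM (scikit-learn uses EM for the fitting).
    gmm = GaussianMixture(n_components=2, random_state=0).fit(data)

    # Draw many more samples from the fitted mixture; they stay inside
    # the clustered regions instead of filling an enclosing box.
    samples, _ = gmm.sample(5000)

    # Statistical moments of the uncertain objective, as used in a
    # robust (mean / standard deviation) formulation.
    f = himmelblau(samples[:, 0], samples[:, 1])
    print(f"E[f]  ~= {f.mean():.2f}")
    print(f"sd[f] ~= {f.std():.2f}")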
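And here is a minimal sketch of the derivative-free alternative to EM that the abstract mentions: treat the GMM log-likelihood as a black-box objective and maximize it with an evolutionary optimizer. SciPy's differential_evolution is used here only as a stand-in for the genetic algorithm employed in the thesis; the one-dimensional, two-component mixture, the parameterization and the bounds are all illustrative assumptions.

    import numpy as np
    from scipy.optimize import differential_evolution
    from scipy.stats import norm

    rng = np.random.default_rng(1)
    # Synthetic 1-D "uncertain" data drawn from two overlapping components.
    data = np.concatenate([rng.normal(-2.0, 0.5, 80), rng.normal(1.5, 0.8, 120)])


    def neg_log_likelihood(theta):
        """Negative log-likelihood of a two-component 1-D GMM.

        theta = [logit_w, mu1, mu2, log_s1, log_s2]; the weight is squashed
        with a sigmoid and the standard deviations exponentiated so every
        point of the search space corresponds to a valid mixture.
        """
        w = 1.0 / (1.0 + np.exp(-theta[0]))
        mu1, mu2 = theta[1], theta[2]
        s1, s2 = np.exp(theta[3]), np.exp(theta[4])
        mix = w * norm.pdf(data, mu1, s1) + (1 - w) * norm.pdf(data, mu2, s2)
        return -np.sum(np.log(mix + 1e-300))


    bounds = [(-5, 5), (-5, 5), (-5, 5), (-3, 2), (-3, 2)]
    result = differential_evolution(neg_log_likelihood, bounds, seed=0, tol=1e-8)
    print("best parameters:", result.x)
    print("max log-likelihood:", -result.fun)

Because the evolutionary search maintains a population of candidate parameter vectors rather than following the gradient of the likelihood, it is less prone to getting trapped in the local optima that limit the classical EM update.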

IITH Creators: Mitra, Kishalay (ORCiD: http://orcid.org/0000-0001-5660-6878)
Item Type: Thesis (Masters)
Uncontrolled Keywords: Optimization Under Uncertainty, Robust Optimization, Gaussian Mixture Models, Genetic Algorithm
Subjects: Electrical Engineering
Divisions: Department of Electrical Engineering
Depositing User: Team Library
Date Deposited: 28 Jun 2019 10:20
Last Modified: 28 Jun 2019 10:20
URI: http://raiithold.iith.ac.in/id/eprint/5588
Publisher URL:
Related URLs:
