
A derivative-free approximate gradient sampling algorithm for finite minimax problems

File: ubc_2012_spring_nutini_julie.pdf (739.5 KB, Adobe Portable Document Format)
Title: A derivative-free approximate gradient sampling algorithm for finite minimax problems
Author: Nutini, Julie Ann
Degree: Master of Science - MSc
Program: Mathematics
Copyright Date: 2012
Issue Date: 2012-04-23
Publisher: University of British Columbia
Abstract: Mathematical optimization is the process of minimizing (or maximizing) a function. An algorithm is used to optimize a function when the minimum cannot be found by hand, or finding the minimum by hand is inefficient. The minimum of a function is a critical point and corresponds to a gradient (derivative) of 0. Thus, optimization algorithms commonly require gradient calculations. When gradient information of the objective function is unavailable, unreliable or ‘expensive’ in terms of computation time, a derivative-free optimization algorithm is ideal. As the name suggests, derivative-free optimization algorithms do not require gradient calculations. In this thesis, we present a derivative-free optimization algorithm for finite minimax problems. Structurally, a finite minimax problem minimizes the maximum taken over a finite set of functions. We focus on the finite minimax problem due to its frequent appearance in real-world applications. We present convergence results for a regular and a robust version of our algorithm, showing in both cases that either the function is unbounded below (the minimum is −∞) or we have found a critical point. Theoretical results are explored for stopping conditions. Additionally, theoretical and numerical results are presented for three examples of approximate gradients that can be used in our algorithm: the simplex gradient, the centered simplex gradient and the Gupal estimate of the gradient of the Steklov averaged function. A performance comparison is made between the regular and robust algorithm, the three approximate gradients, and the regular and robust stopping conditions. Finally, an application in seismic retrofitting is discussed.
Affiliation: Science, Faculty of
URI: http://hdl.handle.net/2429/42200
Scholarly Level: Graduate
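The abstract describes computing a simplex gradient, one of the derivative-free gradient approximations used in the thesis, for a finite minimax objective F(x) = max_i f_i(x). The sketch below is illustrative only: the toy functions, names, and step size are assumptions, not taken from the thesis.

```python
import numpy as np

def finite_minimax(x):
    """Toy finite minimax objective: F(x) = max of two smooth functions."""
    f1 = x[0]**2 + x[1]**2
    f2 = (x[0] - 1.0)**2 + x[1]**2
    return max(f1, f2)

def simplex_gradient(F, x0, h=1e-3):
    """Approximate the gradient of F at x0 using only function values.

    Builds the canonical simplex x0, x0 + h*e_1, ..., x0 + h*e_n and
    solves S^T g = delta_F, where the columns of S are the simplex
    directions x_i - x0 and delta_F holds the function differences.
    """
    n = len(x0)
    S = h * np.eye(n)                                   # columns x_i - x0
    dF = np.array([F(x0 + S[:, i]) - F(x0) for i in range(n)])
    return np.linalg.solve(S.T, dF)

# Near a point where the active function is unique, the simplex gradient
# approximates the true gradient (here, grad f1 = (2x, 2y) = (4, 2)).
g = simplex_gradient(finite_minimax, np.array([2.0, 1.0]))
```

This is the non-centered variant; the thesis also analyzes a centered simplex gradient (using points on both sides of x0) and the Gupal estimate of the gradient of the Steklov averaged function.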

