





Convex Optimization with Computational Errors




Availability: Normally available within 15 days


Price: €88.98
NICEPRICE: €84.53
Discount: 5%



This product qualifies for FREE SHIPPING when the Corriere Veloce (express courier) option is selected at checkout.


Also payable with App18 Bonus Cultura and Carta Docenti.





Details

Genre: Book
Language: English
Publisher: Springer
Publication date: 02/2020
Edition: 1st ed. 2020





Table of Contents

Preface.- 1. Introduction.- 2. Subgradient Projection Algorithm.- 3. The Mirror Descent Algorithm.- 4. Gradient Algorithm with a Smooth Objective Function.- 5. An Extension of the Gradient Algorithm.- 6. Continuous Subgradient Method.- 7. An Optimization Problem with a Composite Objective Function.- 8. A Zero-Sum Game with Two Players.- 9. A PDA-Based Method for Convex Optimization.- 10. Minimization of Quasiconvex Functions.- 11. Minimization of Sharp Weakly Convex Functions.- 12. A Projected Subgradient Method for Nonsmooth Problems.- References.- Index.





Description

The book is devoted to the study of approximate solutions of optimization problems in the presence of computational errors. It contains a number of results on the convergence behavior of algorithms in a Hilbert space, which are known to be important tools for solving optimization problems. The research presented here continues and further develops the author's book Numerical Optimization with Computational Errors (Springer, 2016). Both books study algorithms while taking into account the computational errors that are always present in practice. The main goal is, for a known computational error, to determine what approximate solution can be obtained and how many iterations are needed to obtain it.
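One standard way to make this question precise (an illustrative formalization, not necessarily the one adopted in the book) is to allow every quantity computed at iteration k to be off by at most \delta:

\[
  x_{k+1} = P_C\bigl(x_k - a_k \xi_k\bigr) + e_k,
  \qquad \xi_k \in \partial f(x_k) + B(0,\delta),
  \qquad \|e_k\| \le \delta,
\]

and then to ask for the best accuracy \varepsilon(\delta) such that \min_{k \le N} f(x_k) \le \inf_C f + \varepsilon(\delta), together with the number of iterations N(\delta) needed to attain it.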

The main difference between this new book and the 2016 book is that the present book takes into consideration the fact that, for every algorithm, each iteration consists of several steps, and that the computational errors of different steps are, in general, different. This fact, which was not taken into account in the previous book, is indeed important in practice. For example, the subgradient projection algorithm consists of two steps: the first is the calculation of a subgradient of the objective function, while in the second a projection onto the feasible set is computed. Each of these two steps carries a computational error, and the two errors are, in general, different.
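As a schematic illustration of this two-step structure, the sketch below implements one iteration of a subgradient projection method in which each step is perturbed by its own error term. The names (noisy_subgradient_projection_step, delta_1, delta_2) and the additive-noise error model are illustrative assumptions, not the book's exact formalization.

import numpy as np

def noisy_subgradient_projection_step(x, subgradient_oracle, project,
                                      step_size, delta_1, delta_2, rng):
    # Step 1: compute a subgradient of the objective, perturbed by an
    # error of magnitude roughly delta_1.
    g = subgradient_oracle(x) + delta_1 * rng.standard_normal(x.shape)
    # Step 2: project onto the feasible set, perturbed by a different
    # error of magnitude roughly delta_2.
    return project(x - step_size * g) + delta_2 * rng.standard_normal(x.shape)

# Usage: minimize f(x) = ||x||_1 over the Euclidean unit ball.
rng = np.random.default_rng(0)
x = np.ones(5)
for k in range(200):
    x = noisy_subgradient_projection_step(
        x,
        subgradient_oracle=np.sign,                          # a subgradient of ||.||_1
        project=lambda y: y / max(1.0, np.linalg.norm(y)),   # projection onto the unit ball
        step_size=0.1 / (k + 1) ** 0.5,
        delta_1=1e-3,  # subgradient-step error
        delta_2=1e-4,  # projection-step error (different, as discussed above)
        rng=rng)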

It may happen that the feasible set is simple while the objective function is complicated. As a result, the computational error made when calculating the projection is essentially smaller than the error made when calculating the subgradient. Clearly, the opposite case is possible as well. Another feature of this book is the study of a number of important algorithms that have appeared recently in the literature and are not discussed in the previous book.

This monograph contains 12 chapters. Chapter 1 is an introduction. In Chapter 2 we study the subgradient projection algorithm for the minimization of convex and nonsmooth functions. We generalize the results of [NOCE] and establish results which have no prototype in [NOCE]. In Chapter 3 we analyze the mirror descent algorithm for the minimization of convex and nonsmooth functions in the presence of computational errors. For this algorithm each iteration consists of two steps: the first is the calculation of a subgradient of the objective function, while in the second an auxiliary minimization problem is solved over the set of feasible points. Each of these two steps carries a computational error (a schematic sketch is given after this overview). We generalize the results of [NOCE] and establish results which have no prototype in [NOCE]. In Chapter 4 we analyze the projected gradient algorithm with a smooth objective function in the presence of computational errors. In Chapter 5 we consider an algorithm which is an extension of the projected gradient algorithm used for solving linear inverse problems arising in signal/image processing. In Chapter 6 we study the continuous subgradient method and the continuous subgradient projection algorithm for the minimization of convex nonsmooth functions and for computing the saddle points of convex-concave functions, in the presence of computational errors. None of the results of this chapter has a prototype in [NOCE].

In Chapters 7-12 we analyze several algorithms, not considered in [NOCE], in the presence of computational errors. Again, each step of an iteration carries a computational error, and we take into account that these errors are, in general, different. An optimization problem with a composite objective function is studied in Chapter 7. A zero-sum game with two players is considered in Chapter 8. A predicted-decrease-approximation-based method for constrained convex optimization is used in Chapter 9. Chapter 10 is devoted to the minimization of quasiconvex functions, Chapter 11 to the minimization of sharp weakly convex functions, and Chapter 12 to a projected subgradient method for nonsmooth problems.
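For the two-step mirror descent iteration described above, here is a minimal sketch, assuming the classical entropic setup on the probability simplex, where the auxiliary problem has a closed-form solution (the multiplicative-weights update). The names and the additive error model are again illustrative assumptions, not the book's construction.

import numpy as np

def noisy_mirror_descent_step(x, subgradient_oracle, step_size,
                              delta_1, delta_2, rng):
    # Step 1: inexact subgradient of the objective.
    g = subgradient_oracle(x) + delta_1 * rng.standard_normal(x.shape)
    # Step 2: inexact solution of the auxiliary problem
    #   min_{y in simplex} step_size * <g, y> + KL(y, x),
    # whose exact solution is the multiplicative-weights update.
    y = x * np.exp(-step_size * g)
    y /= y.sum()
    # Model the error of the auxiliary solver, then restore feasibility.
    y = np.abs(y + delta_2 * rng.standard_normal(x.shape))
    return y / y.sum()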




Author

Alexander J. Zaslavski is a professor in the Department of Mathematics, Technion-Israel Institute of Technology, Haifa, Israel.







Additional Information

ISBN: 9783030378219

Condition: New
Series: Springer Optimization and Its Applications
Dimensions: 235 x 155 mm; weight 723 g
Format: Hardcover
Illustrations: 150 illustrations, black and white
Pages (Arabic numerals): 360
Pages (Roman numerals): xi






