Optimization methods for achieving high diffraction efficiency with perfect electric conducting gratings

04/04/2020
by Rubén Aylwin, et al.

This work presents the implementation, analysis, and convergence study of first- and second-order optimization methods applied to one-dimensional periodic gratings. Using boundary integral equations and shape derivatives, the profile of a grating (taken to be a perfect electric conductor) is optimized to maximize the diffraction efficiency of a given diffraction mode. We provide a thorough comparison of two optimization methods: a first-order method based on gradient descent and a second-order approach based on Newton iteration. For the latter, two variants are explored: in the first, the usual Newton step is modified by replacing the eigenvalues in the spectral decomposition of the Hessian with their absolute values, in order to cope with non-convexity; in the second, a modified Newton method is considered to reduce the computational time required to assemble the Hessian. Numerical examples are provided to validate our claims.
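The absolute-value spectral modification mentioned above can be sketched in a few lines. The following is a minimal NumPy illustration, not the authors' implementation; the function name and the small eigenvalue floor are our own assumptions. Replacing each Hessian eigenvalue by its absolute value yields a positive-definite surrogate, so the resulting step is always a descent direction even when the Hessian is indefinite:

```python
import numpy as np

def modified_newton_step(grad, hess):
    """Newton-type step using |H| in place of H (hypothetical sketch)."""
    # Spectral decomposition H = Q diag(lam) Q^T (hess assumed symmetric).
    lam, Q = np.linalg.eigh(hess)
    # Replace eigenvalues by their absolute values; floor them to avoid
    # division by (near-)zero eigenvalues.
    lam_abs = np.maximum(np.abs(lam), 1e-12)
    # Solve |H| d = -grad via the eigendecomposition.
    return -Q @ ((Q.T @ grad) / lam_abs)

# Example: f(x, y) = x^2 - y^2 has an indefinite Hessian diag(2, -2).
g = np.array([2.0, -2.0])      # gradient at (1, 1)
H = np.diag([2.0, -2.0])
d = modified_newton_step(g, H)  # -> [-1.,  1.], and g @ d < 0 (descent)
```

A plain Newton step with this indefinite Hessian would move toward the saddle point; with the absolute-value modification the step remains a descent direction for the objective.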


Related research

- SPAN: A Stochastic Projected Approximate Newton Method (02/10/2020)
  Second-order optimization methods have desirable convergence properties...
- Modified online Newton step based on element wise multiplication (04/11/2019)
  The second order method as Newton Step is a suitable technique in Online...
- Exact Stochastic Second Order Deep Learning (04/08/2021)
  Optimization in Deep Learning is mainly dominated by first-order methods...
- A Unifying Framework for Convergence Analysis of Approximate Newton Methods (02/27/2017)
  Many machine learning models are reformulated as optimization problems. ...
- Apollo: An Adaptive Parameter-wise Diagonal Quasi-Newton Method for Nonconvex Stochastic Optimization (09/28/2020)
  In this paper, we introduce Apollo, a quasi-Newton method for nonconvex ...
- A survey of deep learning optimizers-first and second order methods (11/28/2022)
  Deep Learning optimization involves minimizing a high-dimensional loss f...
- The Hypervolume Indicator Hessian Matrix: Analytical Expression, Computational Time Complexity, and Sparsity (11/08/2022)
  The problem of approximating the Pareto front of a multiobjective optimi...
