Conjugate gradient methods form a class of iterative algorithms that are highly effective for solving large‐scale unconstrained optimisation problems. They achieve efficiency by constructing ...
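As an illustration of the idea in the snippet above, here is a minimal pure-Python sketch of the conjugate gradient iteration applied to the quadratic f(x) = ½xᵀAx − bᵀx (equivalently, solving Ax = b for symmetric positive definite A). The matrix, vector, and function names are my own for the example; they are not taken from any of the papers listed here.

```python
def cg(A, b, tol=1e-10, max_iter=100):
    """Conjugate gradient for A x = b, A symmetric positive definite.

    Small-scale, list-based sketch; large-scale codes would use
    matrix-free products and preconditioning instead.
    """
    n = len(b)
    x = [0.0] * n
    r = b[:]                      # residual r = b - A x (with x = 0)
    p = r[:]                      # first search direction = residual
    rs_old = sum(ri * ri for ri in r)
    for _ in range(max_iter):
        Ap = [sum(A[i][j] * p[j] for j in range(n)) for i in range(n)]
        alpha = rs_old / sum(p[i] * Ap[i] for i in range(n))
        x = [x[i] + alpha * p[i] for i in range(n)]
        r = [r[i] - alpha * Ap[i] for i in range(n)]
        rs_new = sum(ri * ri for ri in r)
        if rs_new ** 0.5 < tol:
            break
        beta = rs_new / rs_old    # makes the new direction A-conjugate
        p = [r[i] + beta * p[i] for i in range(n)]
        rs_old = rs_new
    return x

A = [[4.0, 1.0], [1.0, 3.0]]
b = [1.0, 2.0]
x = cg(A, b)                      # exact solution is (1/11, 7/11)
```

For an n-dimensional quadratic, CG terminates in at most n iterations in exact arithmetic; the efficiency claim in the snippet comes from each iteration needing only one matrix-vector product and a few vector operations.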
In this paper we test different conjugate gradient (CG) methods for solving large-scale unconstrained optimization problems. The methods are divided into two groups: the first group includes five basic ...
Description: Survey of computational tools for solving constrained and unconstrained nonlinear optimization problems. Emphasis on algorithmic strategies and characteristic structures of nonlinear ...
CSE Core Courses are classified into six areas: Introduction to CSE, Computational Mathematics, High Performance Computing, Intelligent Computing, Scientific Visualization, and Computational ...
Studies linear and nonlinear programming, the simplex method, duality, sensitivity, transportation and network flow problems, some constrained and unconstrained optimization theory, and the ...
See "Nonlinear Optimization and Related Subroutines" for a listing of all NLP subroutines. See Chapter 11, "Nonlinear Optimization Examples," for a description of the inputs to and outputs of all NLP ...
There are three SAS procedures that enable you to do maximum likelihood estimation of parameters in an arbitrary model with a likelihood function that you define: PROC MODEL, PROC NLP, and PROC IML.
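The snippet above refers to SAS procedures; as a language-neutral sketch of the same workflow (not PROC NLP syntax), the following Python fragment defines a user-written log-likelihood and maximizes it numerically. The data values, function names, and the choice of a Gaussian model with known unit variance are all my own assumptions for illustration.

```python
import math

data = [1.2, 0.8, 1.5, 0.9, 1.1]

def neg_loglik(mu):
    # Negative Gaussian log-likelihood with sigma fixed at 1:
    # maximizing the likelihood = minimizing this sum.
    return sum(0.5 * (x - mu) ** 2 + 0.5 * math.log(2 * math.pi)
               for x in data)

def grad(mu):
    # d/dmu of neg_loglik
    return sum(mu - x for x in data)

# Newton's method; the second derivative is constant (= len(data)),
# so the iteration lands on the MLE (the sample mean) in one step.
mu = 0.0
for _ in range(50):
    mu -= grad(mu) / len(data)
```

For this model the maximum likelihood estimate of mu is the sample mean, which is what the iteration recovers; in PROC NLP, PROC MODEL, or PROC IML the same pattern applies with the likelihood expressed in SAS syntax.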
When solving the general smooth nonlinear and possibly nonconvex optimization problem involving equality and/or inequality constraints, an approximate first-order critical point of accuracy ϵ can be ...
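For readers unfamiliar with the terminology in the snippet above, an approximate first-order critical point of accuracy ϵ is, in the unconstrained case, a point whose gradient is small in norm; my notation below is one standard convention, not necessarily the one used in the cited work:

```latex
\|\nabla f(x)\| \le \epsilon
```

In the constrained case the analogous condition is that the Karush-Kuhn-Tucker (KKT) residuals — stationarity of the Lagrangian, feasibility, and complementarity — are each satisfied to within ϵ.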