Compare different optimization techniques.
Usually, the more you know about the function (gradients, Hessians, etc.) and the higher-order the optimization technique (interpolation methods, quasi-Newton, Newton's method), the fewer function evaluations are needed to find an optimum.
Let's say I have a function that is very costly to evaluate.
What technique should be used to find an optimum with as few function evaluations as possible?
Is there any chart on the internet comparing different optimization methods in terms of "number of objective function evaluations" vs. "complexity and cost of the intermediate steps"?
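For concreteness, here is a minimal sketch of the kind of comparison I mean (the solver choices, the starting point, and the Rosenbrock test function are illustrative assumptions, not part of the problem): it counts how many objective evaluations a derivative-free, a quasi-Newton, and a Newton-type method need in SciPy.

# Sketch: count objective evaluations for solvers that use increasing
# amounts of derivative information.
import numpy as np
from scipy.optimize import minimize, rosen, rosen_der, rosen_hess

x0 = np.array([-1.2, 1.0])  # standard Rosenbrock starting point

runs = {
    # derivative-free: only function values are used
    "Nelder-Mead": dict(method="Nelder-Mead"),
    # quasi-Newton: uses the analytic gradient, approximates the Hessian
    "BFGS": dict(method="BFGS", jac=rosen_der),
    # Newton-type: uses gradient and Hessian
    "Newton-CG": dict(method="Newton-CG", jac=rosen_der, hess=rosen_hess),
}

for name, kwargs in runs.items():
    res = minimize(rosen, x0, **kwargs)
    print(f"{name:12s}  f evals: {res.nfev:4d}   f* = {res.fun:.2e}")

On a typical run one would expect the derivative-free solver to need the most objective evaluations and the Newton-type solver the fewest, which is exactly the trade-off the question asks about.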
optimization computational-complexity

asked Nov 19 at 10:38 by Alex Ozerov
Which optimization algorithm to use depends on the structure of your problem: is it convex? Is the objective function differentiable? Are there any constraints? Is the objective function a sum of a large number of terms?
– littleO
Nov 19 at 10:46
I don't have any particular problem; the question is theoretical. How should I choose an optimization technique so as to minimize the number of objective function evaluations? For both convex and general functions, differentiable and non-differentiable alike.
– Alex Ozerov
Nov 19 at 10:58
There's not a simple answer to this question. I think the answer would require giving an overview of all optimization algorithms and describing the types of problems for which the algorithms are most effective. For small or medium sized unconstrained problems where the objective function is smooth, Newton's method is often a good choice. For very large scale smooth unconstrained problems, accelerated gradient descent is often a good choice. If the objective function is a sum of many terms, some variant of stochastic gradient descent might be good. The list of problem types goes on.
– littleO
Nov 19 at 11:13
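To make the "accelerated gradient descent" suggestion concrete, here is a minimal Nesterov-style sketch on a smooth convex quadratic; the problem size, the 1/L step size, and the iteration budget are illustrative assumptions rather than prescriptions from the comment.

# Sketch: Nesterov accelerated gradient descent for
# f(x) = 0.5 * x^T A x - b^T x with A symmetric positive definite.
import numpy as np

rng = np.random.default_rng(0)
n = 50
M = rng.standard_normal((n, n))
A = M.T @ M + np.eye(n)          # symmetric positive definite, so f is smooth and convex
b = rng.standard_normal(n)

def grad(x):                     # gradient of the quadratic objective
    return A @ x - b

L = np.linalg.eigvalsh(A)[-1]    # Lipschitz constant of the gradient (largest eigenvalue)

x = y = np.zeros(n)
t = 1.0
for _ in range(200):
    x_new = y - grad(y) / L                      # gradient step from the extrapolated point
    t_new = (1 + np.sqrt(1 + 4 * t ** 2)) / 2
    y = x_new + (t - 1) / t_new * (x_new - x)    # Nesterov momentum / extrapolation
    x, t = x_new, t_new

print("gradient norm at solution:", np.linalg.norm(grad(x)))

Each iteration costs only one gradient evaluation and no Hessian, which is why this family of methods scales to very large smooth problems.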
Various surrogate function approaches, such as kriging
– Johan Löfberg
Nov 19 at 18:29
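A minimal sketch of the surrogate (kriging) idea using a Gaussian-process model: fit the model to a few expensive evaluations, then pick the next point by minimizing a cheap acquisition function on the surrogate instead of the true objective. The scikit-learn Matern kernel, the lower-confidence-bound acquisition, and the evaluation budget below are illustrative assumptions.

# Sketch: Gaussian-process (kriging) surrogate optimization of an
# expensive 1-D objective with a small evaluation budget.
import numpy as np
from scipy.optimize import minimize
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

def expensive_f(x):                                 # stand-in for a costly objective
    return float(np.sin(3 * x[0]) + 0.1 * x[0] ** 2)

rng = np.random.default_rng(0)
bounds = [(-3.0, 3.0)]
X = np.linspace(-3, 3, 5).reshape(-1, 1)            # small initial design
y = np.array([expensive_f(x) for x in X])

gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)

for _ in range(10):                                 # 10 further expensive evaluations
    gp.fit(X, y)

    def lcb(x):                                     # lower confidence bound (cheap to evaluate)
        mu, sigma = gp.predict(x.reshape(1, -1), return_std=True)
        return float(mu[0] - 2.0 * sigma[0])

    # minimize the cheap acquisition from a few random starts
    starts = rng.uniform(-3, 3, size=(5, 1))
    cands = [minimize(lcb, s, bounds=bounds) for s in starts]
    x_next = min(cands, key=lambda r: r.fun).x

    X = np.vstack([X, x_next])
    y = np.append(y, expensive_f(x_next))

print("best point:", X[np.argmin(y)], "best value:", y.min())

All the optimization effort is spent on the surrogate, so the number of true objective evaluations stays equal to the small fixed budget.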