Neural network as a nonlinear system?












I defined a very simple neural network with $2$ inputs, $1$ hidden layer with $2$ nodes, and one output node.
For each input pattern $\vec{x} \in \mathbb{R} \times \mathbb{R}$
and associated output $o \in \mathbb{R}$, the resulting nonlinear equation is:



$wo_{0}\,\sigma(x_0 Wi_{00} + x_1 Wi_{10}) + wo_{1}\,\sigma(x_0 Wi_{01} + x_1 Wi_{11}) = o$



where $Wi$ is the $2 \times 2$ weight matrix of the input connections, with each element $Wi_{jk} \in \mathbb{R}$, $\sigma(x)=\frac{1}{1+\exp(-x)}$ is the logistic activation, and $\vec{wo}$, with $wo_{i} \in \mathbb{R}$, is the weight vector of the two output connections into the output node.



Given a dataset of $n$ (pattern, output) examples, there will be $n$ nonlinear equations.
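
For concreteness, here is a minimal C sketch of the residual of one such equation (the array layout and the names sigma/residual are only illustrative, not taken from any particular implementation):

    #include <math.h>

    /* logistic activation */
    static double sigma(double z) {
        return 1.0 / (1.0 + exp(-z));
    }

    /* residual of one equation: network output minus the target o.
       Wi is the 2x2 matrix of input weights, wo the two output weights. */
    double residual(const double Wi[2][2], const double wo[2],
                    const double x[2], double o) {
        double h0 = sigma(x[0] * Wi[0][0] + x[1] * Wi[1][0]);
        double h1 = sigma(x[0] * Wi[0][1] + x[1] * Wi[1][1]);
        return wo[0] * h0 + wo[1] * h1 - o;
    }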



I'm asking how to find the solutions of those nonlinear systems, as an alternative method to solve the learning problem, without backpropagation. I've implemented an optimizer for the stated problem. If someone is interested I can provide the relevant C sources (email: fportera2@gmail.com).
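
Whether or not an exact solution exists, a standard way to attack the system numerically is to minimize the sum of squared residuals, which is zero precisely at a solution. A minimal sketch (not the optimizer mentioned above), reusing residual() from the sketch earlier:

    /* prototype of the residual sketched earlier */
    double residual(const double Wi[2][2], const double wo[2],
                    const double x[2], double o);

    /* sum of squared residuals over the n training examples;
       a solution of the nonlinear system is a global minimizer with value 0 */
    double sse(const double Wi[2][2], const double wo[2],
               const double X[][2], const double O[], int n) {
        double s = 0.0;
        for (int i = 0; i < n; ++i) {
            double r = residual(Wi, wo, X[i], O[i]);
            s += r * r;
        }
        return s;
    }

Since the network has only $6$ free weights, the system is square for $n = 6$ and generically overdetermined for $n > 6$, in which case an exact root need not exist and the least-squares formulation is the natural fallback. Any nonlinear least-squares or derivative-free minimizer (Nelder-Mead, random search, or the genetic-algorithm idea raised in the comments below) can then be applied to sse().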










nonlinear-system






edited Nov 30 '18 at 7:55







Filippo Portera

















asked Nov 29 '18 at 21:27









Filippo Portera

112
















  • Genetic algorithms?
    – N74
    Nov 29 '18 at 21:46










  • I would like to implement a GA method. Are you sure it will perform better than the gradient method, if I may ask?
    – Filippo Portera
    Dec 2 '18 at 18:49










  • I am sure that, for the simple problem you are facing, backpropagation is the best way. But you asked for an alternative method, and genetic algorithms can span bigger parameter spaces and find global optima instead of getting locked into a local one.
    – N74
    Dec 2 '18 at 22:28
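
N74 suggests genetic algorithms in the comments above. Purely as an illustration (this is not code from either commenter; the population size, mutation rate, and number of generations below are arbitrary guesses), a steady-state GA over the flattened weight vector could look roughly like this:

    #include <stdlib.h>
    #include <math.h>

    #define NW   6        /* 4 input weights + 2 output weights */
    #define POP  40       /* population size (arbitrary) */
    #define GENS 20000    /* number of offspring evaluated (arbitrary) */

    static double sigma(double z) { return 1.0 / (1.0 + exp(-z)); }

    /* chromosome layout: w[0..3] = Wi row-major, w[4..5] = wo */
    static double sse(const double *w, const double X[][2], const double *O, int n) {
        double s = 0.0;
        for (int i = 0; i < n; ++i) {
            double h0 = sigma(X[i][0] * w[0] + X[i][1] * w[2]);
            double h1 = sigma(X[i][0] * w[1] + X[i][1] * w[3]);
            double r  = w[4] * h0 + w[5] * h1 - O[i];
            s += r * r;
        }
        return s;
    }

    static double urand(void) { return 2.0 * rand() / RAND_MAX - 1.0; }  /* in [-1, 1] */

    /* steady-state GA: tournament selection, uniform crossover,
       occasional mutation, replacement of the worst individual */
    void ga_train(double best[NW], const double X[][2], const double *O, int n) {
        double pop[POP][NW], fit[POP];
        for (int i = 0; i < POP; ++i) {
            for (int k = 0; k < NW; ++k) pop[i][k] = urand();
            fit[i] = sse(pop[i], X, O, n);
        }
        for (int g = 0; g < GENS; ++g) {
            int a = rand() % POP, b = rand() % POP;
            int p1 = (fit[a] < fit[b]) ? a : b;          /* 2-way tournament */
            a = rand() % POP; b = rand() % POP;
            int p2 = (fit[a] < fit[b]) ? a : b;
            double child[NW];
            for (int k = 0; k < NW; ++k) {
                child[k] = (rand() % 2) ? pop[p1][k] : pop[p2][k];   /* uniform crossover */
                if (rand() % 10 == 0) child[k] += 0.1 * urand();     /* mutation */
            }
            int worst = 0;
            for (int i = 1; i < POP; ++i) if (fit[i] > fit[worst]) worst = i;
            double f = sse(child, X, O, n);
            if (f < fit[worst]) {                        /* keep the child if it improves */
                for (int k = 0; k < NW; ++k) pop[worst][k] = child[k];
                fit[worst] = f;
            }
        }
        int bi = 0;                                      /* return the best weights found */
        for (int i = 1; i < POP; ++i) if (fit[i] < fit[bi]) bi = i;
        for (int k = 0; k < NW; ++k) best[k] = pop[bi][k];
    }

As N74 notes, for a network this small such a search is unlikely to beat gradient descent, but it needs nothing beyond residual evaluations and no derivatives.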

















