What is the intuition behind the negative exponential ? in linear logic?
The positive exponential $!$ has a very satisfying interpretation in terms of the standard resource interpretation of linear logic. Given a resource $a$, we know that $!a$ means an infinite supply of $a$. Or, stated more concretely in terms of the connectives of linear logic: $!a \equiv\; !a \otimes a$. My question is, under this same interpretation of atomic propositions $a$ in linear logic as resources, how are we to interpret $?a$?
The best interpretation that I've seen so far seems to be that $?a$ "consumes $a$'s", but what does that mean concretely in terms of the logic? Is there an analog of the formula $!a \equiv\; !a \otimes a$ for $?$?
Unfortunately, the usual sequent rules for linear logic are not very helpful to my intuition. I think the two-sided sequent presentation of linear logic is the most intuitive so far -- $!$ reintroduces weakening/contraction on the left side of a sequent, and $?$ allows for the same on the right side of a sequent (as evidenced by the rules $!W$, $?W$, $!C$, $?C$, $!D$, and $?D$ in the presentation here), but what are we to make of the rules $?L$ and $!R$?
So far I have not been able to find an adequate or complete explanation of either the resource interpretation of $?$ in general, or even of the meaning of the rules $?L$ and $!R$, whether with regard to the resource interpretation or to Girard's original motivations for linear logic. I was curious whether anyone could point me towards better explanations or resources than I have been able to find so far.
logic soft-question intuition proof-theory linear-logic
edited Apr 19 '18 at 23:57 by Taroccoesbrocco
asked Jun 21 '16 at 2:29 by Nathan BeDell
2 Answers
The key to understanding the negative exponential is first understanding the interpretation of negation in linear logic. Unlike in classical logic, where negation roughly indicates the absence of something (i.e. the absence of truth), this does not carry over to the resource interpretation of linear logic. $A^{\perp}$ is best thought of not as the absence of an $A$, but as the demand for an $A$. And hence, when we have in our possession both an $A$ and an $A^\perp$, we obtain $1$, or the absence of resources. In other words, $A \otimes A^\perp \multimap 1$.
Now that we have this, recall that $?A$ may be defined dually in terms of $!$ as $?A =\; !A^\perp$. Thus, in the resource interpretation of linear logic, $?A$ means infinite potential demand for the resource $A$.
answered Jun 29 '16 at 21:39 by Nathan BeDell
In the last paragraph of your answer, the proposed definition of ? needs another negation; it should read $?A=(!(A^\bot))^\bot$.
– Andreas Blass
Jun 29 '16 at 22:07
Ah, yes. Thanks for the correction. This complicates my interpretation then, I'll have to think about this more...
– Nathan BeDell
Jun 29 '16 at 22:56
@Sintrastes Did you find a better interpretation?
– Boris E.
May 20 '17 at 18:02
The way I think of $!A$ is as a choice of how many $A$'s you want to receive, rather than simply an infinite number of them. You could write it as an additive disjunction like so:
$$!A \approx 1 \mathbin{\&} (A) \mathbin{\&} (A\otimes A) \mathbin{\&} (A\otimes A\otimes A) \mathbin{\&} \dots$$
The dual of that would be:
$$\DeclareMathOperator{\parr}{\unicode{8523}}
?A = (!A^\bot)^\bot \approx \bot\oplus(A)\oplus(A\parr A)\oplus(A\parr A\parr A)\oplus\dots$$
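To spell out that dualization step (a sketch of my own, not from the answer, using the standard De Morgan laws of classical linear logic: $1^\bot \equiv \bot$, $(A \mathbin{\&} B)^\bot \equiv A^\bot \oplus B^\bot$, and $(A \otimes B)^\bot \equiv A^\bot \parr B^\bot$):
$$?A \;=\; (!(A^\bot))^\bot \;\approx\; \bigl(1 \mathbin{\&} A^\bot \mathbin{\&} (A^\bot\otimes A^\bot) \mathbin{\&} \dots\bigr)^\bot \;=\; \bot \oplus A \oplus (A\parr A) \oplus (A\parr A\parr A) \oplus \dots$$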
So $?A$ is something like an unknown number (zero or more) of $A$'s that must be used in parallel, in the way that $\parr$ requires.
Edit: This interpretation also explains a lot of the exponential sequent rules, which start to look very similar to those for the additives. I'll just show the right-hand rules, as described by the Stanford Encyclopedia of Philosophy:
Weakening: If you can get $\Gamma$, you can also produce some number (i.e. zero) of $B$.
$$\cfrac{\Delta\vdash\Gamma}{\Delta\vdash\; ?B,\Gamma}$$
Dereliction: If you can get $B$, you can instead produce some number (i.e. one) of $B$.
$$\cfrac{\Delta\vdash B,\Gamma}{\Delta\vdash\; ?B,\Gamma}$$
Contraction: If you have an unknown quantity of $B$ and another unknown quantity of $B$, you can join them into one unknown quantity.
$$\cfrac{\Delta\vdash\; ?B,?B,\Gamma}{\Delta\vdash\; ?B,\Gamma}$$
Promotion: If you can use unlimited $\Delta$ to produce some number of $\Gamma$, and you can also produce a single $B$, then you can produce any amount of $B$ that you want.
$$\cfrac{!\Delta\vdash B,?\Gamma}{!\Delta\vdash\; !B,?\Gamma}$$
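As a small worked example (my own addition, not from the answer; it uses the identity axiom and the usual $\otimes$ rule alongside the dereliction and contraction rules above, with $\Delta$ taken empty so the sequents are one-sided), here is the derivation corresponding to "from an unlimited supply of $A$ you can get two $A$'s", i.e. $!A \vdash A \otimes A$:
$$\cfrac{\cfrac{\cfrac{\vdash A^\bot, A}{\vdash\; ?A^\bot, A}\;?D \qquad \cfrac{\vdash A^\bot, A}{\vdash\; ?A^\bot, A}\;?D}{\vdash\; ?A^\bot,\; ?A^\bot,\; A\otimes A}\;\otimes}{\vdash\; ?A^\bot,\; A\otimes A}\;?C$$
Read bottom-up: the single $?A^\bot$ (an unspecified demand for $A$'s) is split by contraction into two demands, each of which is then discharged by one copy of $A$.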
answered Dec 15 '18 at 17:03 (edited Dec 15 '18 at 18:02) by Woofmao