Check if an estimation is unbiased?












Assume that we calculate the expected value of some measurements, $x=\dfrac{x_1 + x_2 + x_3 + x_4}{4}$. What if we don't include $x_3$ and $x_4$, but instead use $x_2$ in place of $x_3$ and $x_4$? Then we get the following expression: $v=\dfrac{x_1 + x_2 + x_2 + x_2}{4}$.

How do I know if $v$ is an unbiased estimation of $x$?

I am not sure how to approach this problem; any ideas are appreciated!

statistics estimation-theory parameter-estimation

asked Dec 6 '15 at 19:55, edited Dec 7 '15 at 11:55 – dumble24
  • So $v$ is the same thing as $x$? If that's not what you meant, then you need to clarify your question. – Michael Hardy, Dec 6 '15 at 20:07

  • I assumed here that $x_k$ are random variables. – manofbear, Dec 6 '15 at 20:35

  • $x$ is the expected value of a random variable. – dumble24, Dec 6 '15 at 21:05

  • $(x_1+x_2+x_3+x_4)/4$ calculates the expected value if a random variable has exactly four equiprobable possible outcomes $x_1, x_2, x_3, x_4$. Alternatively, if $x_1, x_2, x_3, x_4$ denote 4 independent draws from some probability distribution, then $(x_1+x_2+x_3+x_4)/4$ is an estimator of the expected value, but it is not the actual expected value! To be precise, this is an important distinction. – Matthew Gunn, Dec 7 '15 at 23:12


















2 Answers
[EDIT: Assumed $x_k$ are random variables.]

We say $v$ is an unbiased estimator of the random variable $x$ if $E[v]=E[x]$, where $E[\cdot]$ denotes the expectation of a random variable.

Recall that expectation is a linear operator, i.e. if $X$ and $Y$ are random variables and $a,b$ are constants, then $E[aX+bY]=aE[X]+bE[Y]$. So we get $E[x]=\frac{1}{4}(E[x_1]+E[x_2]+E[x_3]+E[x_4])$ and $E[v]=\frac{1}{4}(E[x_1]+3E[x_2])$. Notice that $E[x]=E[v]$ is equivalent to $E[x]-E[v]=0$.

So $v$ is an unbiased estimator if $E[x]-E[v]=0 \Leftrightarrow E[x_3]+E[x_4]-2E[x_2]=0.$

– manofbear (answered Dec 6 '15 at 20:02, edited Dec 6 '15 at 20:36)
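The condition $E[x_3]+E[x_4]-2E[x_2]=0$ can also be checked numerically. Below is a minimal Monte Carlo sketch (the normal distributions and the specific means are arbitrary assumptions for illustration): when all $x_k$ share the same mean, the averages of $x$ and $v$ agree; when $E[x_2]$ differs from $E[x_3]$ and $E[x_4]$, they do not.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200_000  # number of simulated measurement sets per scenario

def compare(means):
    """Simulate x = (x1+x2+x3+x4)/4 and v = (x1+3*x2)/4 with x_k ~ N(means[k], 1)."""
    x_k = rng.normal(loc=means, scale=1.0, size=(n, 4))
    x = x_k.mean(axis=1)                 # the full four-term average
    v = (x_k[:, 0] + 3 * x_k[:, 1]) / 4  # x_2 reused in place of x_3 and x_4
    print(f"means={means}:  E[x] ~ {x.mean():.3f},  E[v] ~ {v.mean():.3f}")

compare([5.0, 5.0, 5.0, 5.0])  # E[x3]+E[x4]-2E[x2] = 0 -> the two averages agree
compare([5.0, 2.0, 5.0, 5.0])  # E[x3]+E[x4]-2E[x2] = 6 -> the two averages differ
```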
  • How do we decide whether $v$ is unbiased using $E[x_3]+E[x_4]-2E[x_2]=0$ when $x_2$, $x_3$ and $x_4$ are not defined? – dumble24, Dec 6 '15 at 21:01

  • What is this, if $E[v] = E[x]$?! Why are you taking expectations on the right-hand side? – Matthew Gunn, Dec 7 '15 at 10:14

  • Almost certainly dumble24 is in a classical statistics environment and is estimating some parameter, not another random variable. – Matthew Gunn, Dec 7 '15 at 10:24

  • Edited the question, added some more context. – dumble24, Dec 7 '15 at 11:58
Let $\theta$ be some parameter. Let $X$ be an estimator.

$X$ is called an unbiased estimator for $\theta$ if $E[X] = \theta$.

Note that $X$ is a random variable (or random vector) while $\theta$ would be a scalar (or vector).

Example

Let's say $x_1$ and $x_2$ are random variables with $E[x_1] = E[x_2] = \mu$. Then the estimator $y = \frac{1}{5} x_1 + \frac{4}{5} x_2$ would be an unbiased estimate of $\mu$, since $E[y] = \mu$.

– Matthew Gunn (answered Dec 7 '15 at 9:57, edited Dec 7 '15 at 10:17)
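As a quick numerical companion to the example above, here is a sketch that draws many pairs $(x_1, x_2)$ with a common mean $\mu$ and averages the estimator $y$; the exponential distribution and the value $\mu = 3$ are arbitrary assumptions used only to make the script runnable.

```python
import numpy as np

rng = np.random.default_rng(2)
mu = 3.0     # common mean of x_1 and x_2 (arbitrary choice)
n = 500_000  # number of simulated pairs

# Independent draws with E[x_1] = E[x_2] = mu; the exponential
# distribution is an arbitrary choice, only its mean matters here.
x1 = rng.exponential(scale=mu, size=n)
x2 = rng.exponential(scale=mu, size=n)

y = x1 / 5 + 4 * x2 / 5  # the estimator from the example
print(f"average of y over {n} draws: {y.mean():.3f}  (target mu = {mu})")
```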
  • Whoever modded this down was either sloppy or doesn't know what he/she is talking about. – Matthew Gunn, Dec 8 '15 at 3:59










