Expectation of inverse of sum of positive iid variables
Let $(X_i)_i$ be a sequence of iid positive variables with mean $1$ and variance $\sigma^2$. Let $\bar X_n = \frac{\sum_{i=1}^n X_i}{n}$.
My question is: can we bound $\mathbb{E}(1/\bar X_n)$ as a function of $\sigma$ and $n$?
There seems to be a strategy that may work based on the Taylor expansion, but I'm not sure:
- about the hypotheses that need to be met;
- whether it works in this case; and
- whether we can say something definite about $\bar X_n$, or whether we need to use the central limit theorem and can only say this for the normal approximation.
More details about the Taylor expansion. According to this Wikipedia article,
$$\mathbb{E}(f(X)) \approx f(\mu_X) + \frac{f''(\mu_X)}{2}\sigma_X^2$$
So in my case it would give something like:
$$\mathbb{E}(1/\bar X_n) \approx 1 + \frac{\sigma^2}{4n}$$
I'm trying to find a formal proof of a similar result, or hypotheses under which it works. Maybe references?
Thanks
EDIT: if needed, we can consider that the $(X_i)_i$ are discrete: there exist $v_1<\cdots<v_K$ such that $\mathbb{P}(X=v_k)=p_k$ and $\sum_k p_k = 1$. In this case we know that $\bar X_n \geq v_1$. Although I believe something can be said in the general case.
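As a numerical sanity check on the Taylor approach (my own sketch, not part of the question): for Gamma-distributed $X_i$ the target expectation has a closed form, because a sum of $n$ iid $\Gamma(k,\theta)$ variables is $\Gamma(nk,\theta)$ and $\mathbb{E}(1/G) = \frac{1}{b(a-1)}$ for $G \sim \Gamma(a,b)$ with $a > 1$. The shape/scale values below are assumptions chosen so that the mean is $1$:

```python
import random
import statistics

# Assumed setup (not from the question): X_i ~ Gamma(shape=4, scale=1/4),
# so E(X_i) = 1 and sigma^2 = Var(X_i) = 1/4.
SHAPE, SCALE = 4.0, 0.25

def mc_estimate(n, reps, rng):
    """Monte Carlo estimate of E(1/Xbar_n)."""
    vals = []
    for _ in range(reps):
        xbar = sum(rng.gammavariate(SHAPE, SCALE) for _ in range(n)) / n
        vals.append(1.0 / xbar)
    return statistics.fmean(vals)

def exact(n):
    """A sum of n iid Gamma(k, theta) variables is Gamma(n*k, theta), so
    Xbar_n ~ Gamma(n*SHAPE, SCALE/n) and E(1/Gamma(a, b)) = 1/(b*(a-1))."""
    a, b = n * SHAPE, SCALE / n
    return 1.0 / (b * (a - 1.0))

rng = random.Random(0)
n = 20
est, ex = mc_estimate(n, reps=100_000, rng=rng), exact(n)
print(est, ex)
```

For this family the simulation and the closed form agree to Monte Carlo error, which at least shows the gap $\mathbb{E}(1/\bar X_n) - 1$ decaying like $O(1/n)$ here.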
PS: this is almost a cross-post of this on Math.SE.
variance expected-value iid
asked yesterday by Gopi (new contributor); edited yesterday
– Ferdi (yesterday): Hello Gopi, thank you for your question. If this is ALMOST a cross-post your question is welcome here, but if it is a cross-post you should instead put a bounty on the post on Math.SE.
– Gopi (yesterday): It's almost: the question is different (bounds on the expectation vs. convergence speed). If I get the answer on MSE it probably won't help me with this question, but I've referenced it because someone who has the answer to one likely has the answer to both.
– Ertxiem (yesterday): Perhaps Markov's inequality or Chebyshev's inequality would be useful.
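Chebyshev's inequality applied to $\bar X_n$ gives $P(|\bar X_n - 1| \ge t) \le \frac{\sigma^2}{n t^2}$, which limits how much mass $\bar X_n$ can place near $0$. A quick empirical check (my own sketch, with an assumed Uniform(0, 2) distribution for the $X_i$, so the mean is $1$ and $\sigma^2 = 1/3$):

```python
import random

SIGMA2 = 1.0 / 3.0  # variance of Uniform(0, 2)

def tail_freq(n, t, reps, rng):
    """Empirical P(|Xbar_n - 1| >= t) for X_i ~ Uniform(0, 2)."""
    hits = 0
    for _ in range(reps):
        xbar = sum(rng.uniform(0.0, 2.0) for _ in range(n)) / n
        if abs(xbar - 1.0) >= t:
            hits += 1
    return hits / reps

rng = random.Random(0)
n, t = 10, 0.5
freq = tail_freq(n, t, reps=100_000, rng=rng)
bound = SIGMA2 / (n * t * t)
print(freq, bound)  # the empirical frequency stays below the Chebyshev bound
```

Note that such a tail bound controls $P(\bar X_n < 1 - t)$ but not, by itself, the expectation of $1/\bar X_n$, which is the sticking point discussed in the answers.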
– whuber♦ (5 hours ago): Without the discreteness criterion, the answer is no, because any finite bound based on $\sigma^2, n$ will be exceeded when the underlying distribution is a suitable mixture of a $\Gamma\left(\frac{1}{n+1},\, n+1\right)$ distribution with some other distribution. The presence of this Gamma component ensures the sum of $n$ iid values will have a Gamma component with shape less than $1$, whose PDF diverges at $0$, forcing the reciprocal sum to have infinite expectation. This demonstrates the hopelessness of using a Taylor expansion in the analysis.
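The divergence mechanism in this comment can be made explicit. For $S \sim \Gamma(a, b)$ (shape $a$, scale $b$), the reciprocal moment is

$$\mathbb{E}\!\left(\frac{1}{S}\right) = \int_0^\infty \frac{1}{s}\cdot\frac{s^{a-1}e^{-s/b}}{\Gamma(a)\,b^a}\,ds = \frac{1}{b\,(a-1)} \quad \text{for } a > 1,$$

while for $a \le 1$ the integrand behaves like $s^{a-2}$ near $0$ and the integral diverges. A sum containing a Gamma component with shape below $1$ inherits this divergence, no matter how small the mixture weight.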
2 Answers
You cannot bound that expectation in terms of $\sigma, n$. That's because there is the distinct possibility that the expectation does not exist at all (or is $\infty$). See "I've heard that ratios or inverses of random variables often are problematic, in not having expectations. Why is that?". If the conditions given there are fulfilled for the density of $X_1$, they will also be fulfilled for the density of $\bar X_n$. If densities do not exist but probability mass functions do, things are simpler, since your assumptions prohibit a probability atom at zero; a probability density, however, can still be positive at zero even if $P(X > 0) = 1$.
For a useful bound you will at least need to restrict the common distribution of $X_1, \dotsc, X_n$ much more.
EDIT
After your new information, and with $v_1 > 0$, the expectation of $1/\bar X_n$ certainly exists (irrespective of whether $K$ is finite). And, since the function $x \mapsto 1/x$ is convex for $x > 0$, we can use Jensen's inequality to conclude that $\DeclareMathOperator{\E}{\mathbb{E}} \E(1/\bar X_n) \ge 1/\E(\bar X_n)$.
– kjetil b halvorsen (answered yesterday, edited yesterday)
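Jensen's lower bound is easy to illustrate numerically. A minimal sketch (mine, not from the answer), with an assumed Uniform(0.5, 1.5) distribution so the mean is $1$ and $\bar X_n$ is bounded away from $0$:

```python
import random
import statistics

def mc_inverse_mean(n, reps, rng):
    """Monte Carlo estimate of E(1/Xbar_n) for X_i ~ Uniform(0.5, 1.5)."""
    vals = []
    for _ in range(reps):
        xbar = sum(rng.uniform(0.5, 1.5) for _ in range(n)) / n
        vals.append(1.0 / xbar)
    return statistics.fmean(vals)

rng = random.Random(0)
est = mc_inverse_mean(n=5, reps=100_000, rng=rng)
# Jensen: E(1/Xbar_n) >= 1/E(Xbar_n) = 1, and strictly greater here,
# since 1/x is strictly convex and Xbar_n is non-degenerate.
print(est)
```

The support being bounded away from zero is what makes the expectation finite at all; Jensen then supplies the direction of the bias, which the simulation shows as an estimate strictly above $1$.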
– Gopi (yesterday): But the thing is that this is really not a general case but a very specific case: it's very unlikely that there is probability mass near $0$ (we can even evaluate it with Markov's inequality): $\bar X_n$ is centered around $1$ and has a variance of $\sigma^2/n$.
– kjetil b halvorsen (yesterday): But even a tiny (but positive) probability close to zero can lead to the expectation of the inverse being $\infty$. Can you rule out that possibility?
– Gopi (yesterday): In my case I can indeed (the distribution of the $X_i$ is discrete). I'll add this hypothesis to the question. I'm still interested in the general case from a theoretical perspective; I believe something can still be said there (or I'd be interested in a counterexample to your statement).
– kjetil b halvorsen (yesterday): Information about discreteness is really important! And, on what support? If positive integers, $P(X=0)=0$ and $\mu=1$ is really restrictive ...
– Gopi (yesterday): Obviously the support is not the positive integers ;). The support is a set of rational numbers.
I think I have the gist of it.
$f(x)=1/x$ is infinitely differentiable at $1$, so Taylor's theorem tells us:
there exists $\varepsilon>0$ such that $f(x) = f(1) + f'(1)(x-1) + \frac{f''(1)(x-1)^2}{2} + \frac{f'''(\varepsilon)(x-1)^2}{2}$.
In our case, if $X_i$ belongs to the domain $[v_1; +\infty[$, then $\bar X_n$ has the same domain and we have $\varepsilon \geq v_1$.
Hence
$\mathbb{E}(1/\bar X_n) = \mathbb{E}\left(1 - (\bar X_n-1) + \frac{(\bar X_n-1)^2}{4} + \frac{f'''(\varepsilon)(\bar X_n-1)^2}{2}\right)$, and
\begin{align*}
\mathbb{E}(1/\bar X_n) &= 1 + \frac{f'''(\varepsilon)\,\mathbb{E}\left((\bar X_n-1)^2\right)}{2} = 1 + \frac{V(\bar X_n)}{4} - \frac{V(\bar X_n)}{12\varepsilon^4}
\end{align*}
and hence
$$1 + \frac{\sigma^2}{4n} - \frac{\sigma^2}{12 v_1^4 n} \leq \mathbb{E}(1/\bar X_n) \leq 1 + \frac{\sigma^2}{4n}.$$
For the case where the $X_i$ do not admit a minimum but have an unlimited number of moments, one can do a similar transformation using the full Taylor expansion:
\begin{align*}
\mathbb{E}(1/\bar X_n) &= \sum_{i=0}^{+\infty} \frac{f^{(i)}(1)}{i!}\mathbb{E}\left((\bar X_n-1)^i\right)\\
&= \sum_{i=0}^{+\infty} \frac{(-1)^i\, i!}{i!}\mathbb{E}\left((\bar X_n-1)^i\right)
\end{align*}
Now if we can say something about the $k^{\text{th}}$ moment of $\bar X_n$ being $O(1/n^{k/2})$, this validates that $\mathbb{E}(1/\bar X_n) \approx 1 + \frac{\sigma^2}{4n}$.
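The bounded-below discrete setting is easy to probe numerically. A sketch (my own, not part of the answer), with an assumed two-point distribution $X_i \in \{0.5, 1.5\}$, each with probability $1/2$, so $\mu = 1$, $\sigma^2 = 0.25$, and $v_1 = 0.5$; it checks that the gap $\mathbb{E}(1/\bar X_n) - 1$ shrinks at roughly the $O(1/n)$ rate the Taylor argument predicts:

```python
import random
import statistics

def gap(n, reps, rng):
    """Monte Carlo estimate of E(1/Xbar_n) - 1 for X_i uniform on {0.5, 1.5}."""
    vals = []
    for _ in range(reps):
        xbar = sum(rng.choice((0.5, 1.5)) for _ in range(n)) / n
        vals.append(1.0 / xbar)
    return statistics.fmean(vals) - 1.0

rng = random.Random(1)
g10 = gap(n=10, reps=100_000, rng=rng)
g40 = gap(n=40, reps=100_000, rng=rng)
# If the gap is Theta(1/n), quadrupling n should shrink it roughly fourfold.
print(g10, g40, g10 / g40)
```

This only tests the rate, not the constant in front of $1/n$, which is exactly what the formal bounds above try to pin down.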
– Gopi (new contributor)
– Gopi (yesterday): Turns out that $\bar X_n$ does admit $n$ moments and that they are of the form $O(n^{-p/2})$: arxiv.org/pdf/1105.6283.pdf
$begingroup$
You cannot bound that expectation in $sigma, n$. That's because there is the distinct possibility that the expectation do not exist at all (or, is $infty$.) See I've heard that ratios or inverses of random variables often are problematic, in not having expectations. Why is that?. If the conditions given there is fulfilled for the density of $X_1$, it will so be for the density of $barX_n$. If densities do not exist, but probability mass functions do, it is simpler, since your assumptions prohibit a probability atom at zero, but a probability density can still be positive at zero even if $P(X >0)=1$.
For a useful bound you will at least need to restrict the common distribution of $X_1, dotsc, X_n$ much more.
EDIT
After your new information, and with $v_1>0$, the expectation of $1/barX_n$ certainly will exist (irrespective if $K$ is finite or not.) And, since the function $xmapsto 1/x$ is convex for $x>0$, we can use the Jensen Inequality to conclude that $DeclareMathOperatorEmathbbEE 1/barX_n ge 1/E barX_n$.
$endgroup$
$begingroup$
But the thinig is that this is really not a general case but a very specific case: it's very unlikely that there is a probability mass near 0 (we can even evaluate it with Markov's inequality): $barX_n$ is centered around 1 and has a variance of $sigma^2 / n$.
$endgroup$
– Gopi
yesterday
1
$begingroup$
But even a tiny (but positive) probability close to zero can lead to expectation of inverse being $infty$. Can you rule out that possibility?
$endgroup$
– kjetil b halvorsen
yesterday
$begingroup$
In my case I can indeed (the distribution of $X_i$ is discrete). I'll add this hypothesis to the question. I'm still interested from a theoretical perspective by the general case, I believe something can still be said in this case (or I'm interested by a counter example that would show your statement).
$endgroup$
– Gopi
yesterday
1
$begingroup$
Information about discreteness is really important! And, n what support? If positive integers, $P(X=0)=0$ and $mu=1$ is really restrictive ...
$endgroup$
– kjetil b halvorsen
yesterday
1
$begingroup$
Obviously the support is not positive integers ;). The support is the set of rational numbers.
$endgroup$
– Gopi
yesterday
|
show 2 more comments
$begingroup$
You cannot bound that expectation in $sigma, n$. That's because there is the distinct possibility that the expectation do not exist at all (or, is $infty$.) See I've heard that ratios or inverses of random variables often are problematic, in not having expectations. Why is that?. If the conditions given there is fulfilled for the density of $X_1$, it will so be for the density of $barX_n$. If densities do not exist, but probability mass functions do, it is simpler, since your assumptions prohibit a probability atom at zero, but a probability density can still be positive at zero even if $P(X >0)=1$.
For a useful bound you will at least need to restrict the common distribution of $X_1, dotsc, X_n$ much more.
EDIT
After your new information, and with $v_1>0$, the expectation of $1/barX_n$ certainly will exist (irrespective if $K$ is finite or not.) And, since the function $xmapsto 1/x$ is convex for $x>0$, we can use the Jensen Inequality to conclude that $DeclareMathOperatorEmathbbEE 1/barX_n ge 1/E barX_n$.
$endgroup$
$begingroup$
But the thinig is that this is really not a general case but a very specific case: it's very unlikely that there is a probability mass near 0 (we can even evaluate it with Markov's inequality): $barX_n$ is centered around 1 and has a variance of $sigma^2 / n$.
$endgroup$
– Gopi
yesterday
1
$begingroup$
But even a tiny (but positive) probability close to zero can lead to expectation of inverse being $infty$. Can you rule out that possibility?
$endgroup$
– kjetil b halvorsen
yesterday
$begingroup$
In my case I can indeed (the distribution of $X_i$ is discrete). I'll add this hypothesis to the question. I'm still interested from a theoretical perspective by the general case, I believe something can still be said in this case (or I'm interested by a counter example that would show your statement).
$endgroup$
– Gopi
yesterday
1
$begingroup$
Information about discreteness is really important! And, n what support? If positive integers, $P(X=0)=0$ and $mu=1$ is really restrictive ...
$endgroup$
– kjetil b halvorsen
yesterday
1
$begingroup$
Obviously the support is not positive integers ;). The support is the set of rational numbers.
$endgroup$
– Gopi
yesterday
|
show 2 more comments
$begingroup$
You cannot bound that expectation in $sigma, n$. That's because there is the distinct possibility that the expectation do not exist at all (or, is $infty$.) See I've heard that ratios or inverses of random variables often are problematic, in not having expectations. Why is that?. If the conditions given there is fulfilled for the density of $X_1$, it will so be for the density of $barX_n$. If densities do not exist, but probability mass functions do, it is simpler, since your assumptions prohibit a probability atom at zero, but a probability density can still be positive at zero even if $P(X >0)=1$.
For a useful bound you will at least need to restrict the common distribution of $X_1, dotsc, X_n$ much more.
EDIT
After your new information, and with $v_1>0$, the expectation of $1/barX_n$ certainly will exist (irrespective if $K$ is finite or not.) And, since the function $xmapsto 1/x$ is convex for $x>0$, we can use the Jensen Inequality to conclude that $DeclareMathOperatorEmathbbEE 1/barX_n ge 1/E barX_n$.
$endgroup$
You cannot bound that expectation in $sigma, n$. That's because there is the distinct possibility that the expectation do not exist at all (or, is $infty$.) See I've heard that ratios or inverses of random variables often are problematic, in not having expectations. Why is that?. If the conditions given there is fulfilled for the density of $X_1$, it will so be for the density of $barX_n$. If densities do not exist, but probability mass functions do, it is simpler, since your assumptions prohibit a probability atom at zero, but a probability density can still be positive at zero even if $P(X >0)=1$.
For a useful bound you will at least need to restrict the common distribution of $X_1, dotsc, X_n$ much more.
EDIT
After your new information, and with $v_1>0$, the expectation of $1/barX_n$ certainly will exist (irrespective if $K$ is finite or not.) And, since the function $xmapsto 1/x$ is convex for $x>0$, we can use the Jensen Inequality to conclude that $DeclareMathOperatorEmathbbEE 1/barX_n ge 1/E barX_n$.
edited yesterday
answered yesterday
kjetil b halvorsenkjetil b halvorsen
31.4k984225
31.4k984225
$begingroup$
But the thinig is that this is really not a general case but a very specific case: it's very unlikely that there is a probability mass near 0 (we can even evaluate it with Markov's inequality): $barX_n$ is centered around 1 and has a variance of $sigma^2 / n$.
$endgroup$
– Gopi
yesterday
1
$begingroup$
But even a tiny (but positive) probability close to zero can lead to expectation of inverse being $infty$. Can you rule out that possibility?
$endgroup$
– kjetil b halvorsen
yesterday
$begingroup$
In my case I can indeed (the distribution of $X_i$ is discrete). I'll add this hypothesis to the question. I'm still interested from a theoretical perspective by the general case, I believe something can still be said in this case (or I'm interested by a counter example that would show your statement).
$endgroup$
– Gopi
yesterday
1
$begingroup$
Information about discreteness is really important! And, n what support? If positive integers, $P(X=0)=0$ and $mu=1$ is really restrictive ...
$endgroup$
– kjetil b halvorsen
yesterday
1
$begingroup$
Obviously the support is not positive integers ;). The support is the set of rational numbers.
$endgroup$
– Gopi
yesterday
|
show 2 more comments
$begingroup$
But the thinig is that this is really not a general case but a very specific case: it's very unlikely that there is a probability mass near 0 (we can even evaluate it with Markov's inequality): $barX_n$ is centered around 1 and has a variance of $sigma^2 / n$.
$endgroup$
– Gopi
yesterday
1
$begingroup$
But even a tiny (but positive) probability close to zero can lead to expectation of inverse being $infty$. Can you rule out that possibility?
$endgroup$
– kjetil b halvorsen
yesterday
$begingroup$
In my case I can indeed (the distribution of $X_i$ is discrete). I'll add this hypothesis to the question. I'm still interested from a theoretical perspective by the general case, I believe something can still be said in this case (or I'm interested by a counter example that would show your statement).
$endgroup$
– Gopi
yesterday
1
$begingroup$
Information about discreteness is really important! And, n what support? If positive integers, $P(X=0)=0$ and $mu=1$ is really restrictive ...
$endgroup$
– kjetil b halvorsen
yesterday
1
$begingroup$
Obviously the support is not positive integers ;). The support is the set of rational numbers.
$endgroup$
– Gopi
yesterday
$begingroup$
But the thinig is that this is really not a general case but a very specific case: it's very unlikely that there is a probability mass near 0 (we can even evaluate it with Markov's inequality): $barX_n$ is centered around 1 and has a variance of $sigma^2 / n$.
$endgroup$
– Gopi
yesterday
$begingroup$
But the thinig is that this is really not a general case but a very specific case: it's very unlikely that there is a probability mass near 0 (we can even evaluate it with Markov's inequality): $barX_n$ is centered around 1 and has a variance of $sigma^2 / n$.
$endgroup$
– Gopi
yesterday
1
1
$begingroup$
But even a tiny (but positive) probability close to zero can lead to expectation of inverse being $infty$. Can you rule out that possibility?
$endgroup$
– kjetil b halvorsen
yesterday
$begingroup$
But even a tiny (but positive) probability close to zero can lead to expectation of inverse being $infty$. Can you rule out that possibility?
$endgroup$
– kjetil b halvorsen
yesterday
$begingroup$
In my case I can indeed (the distribution of $X_i$ is discrete). I'll add this hypothesis to the question. I'm still interested from a theoretical perspective by the general case, I believe something can still be said in this case (or I'm interested by a counter example that would show your statement).
$endgroup$
– Gopi
yesterday
$begingroup$
In my case I can indeed (the distribution of $X_i$ is discrete). I'll add this hypothesis to the question. I'm still interested from a theoretical perspective by the general case, I believe something can still be said in this case (or I'm interested by a counter example that would show your statement).
$endgroup$
– Gopi
yesterday
1
1
$begingroup$
Information about discreteness is really important! And, n what support? If positive integers, $P(X=0)=0$ and $mu=1$ is really restrictive ...
$endgroup$
– kjetil b halvorsen
yesterday
$begingroup$
Information about discreteness is really important! And, n what support? If positive integers, $P(X=0)=0$ and $mu=1$ is really restrictive ...
$endgroup$
– kjetil b halvorsen
yesterday
1
1
$begingroup$
Obviously the support is not positive integers ;). The support is the set of rational numbers.
$endgroup$
– Gopi
yesterday
$begingroup$
Obviously the support is not positive integers ;). The support is the set of rational numbers.
$endgroup$
– Gopi
yesterday
|
show 2 more comments
$begingroup$
I think I have the gist of it.
Since $f(x)=1/x$ is infinitely differentiable at $1$, Taylor's theorem with the Lagrange form of the remainder tells us that for every $x>0$ there exists $\varepsilon$ between $1$ and $x$ such that
$$f(x) = f(1) + f'(1)(x-1) + \frac{f''(1)}{2}(x-1)^2 + \frac{f'''(\varepsilon)}{6}(x-1)^3 = 1 - (x-1) + (x-1)^2 - \frac{(x-1)^3}{\varepsilon^4}.$$
In our case, if the $X_i$ take values in $[v_1; +\infty[$, then $\bar{X}_n$ takes values in the same interval, and $\varepsilon \geq v_1$ (note that $v_1 \leq 1$, since $\mathbb{E}(X_i)=1$).
Hence
$$\mathbb{E}(1/\bar{X}_n) = \mathbb{E}\left(1 - (\bar{X}_n-1) + (\bar{X}_n-1)^2 - \frac{(\bar{X}_n-1)^3}{\varepsilon^4}\right),$$
and, using $\mathbb{E}(\bar{X}_n-1)=0$ and $V(\bar{X}_n)=\sigma^2/n$,
\begin{align*}
\mathbb{E}(1/\bar{X}_n) &= 1 + \frac{\sigma^2}{n} - \mathbb{E}\left(\frac{(\bar{X}_n-1)^3}{\varepsilon^4}\right).
\end{align*}
Since $0 < 1/\varepsilon^4 \leq 1/v_1^4$, the remainder term is at most $\mathbb{E}(|\bar{X}_n-1|^3)/v_1^4$ in absolute value, and hence
$$1 + \frac{\sigma^2}{n} - \frac{\mathbb{E}(|\bar{X}_n-1|^3)}{v_1^4} \leq \mathbb{E}(1/\bar{X}_n) \leq 1 + \frac{\sigma^2}{n} + \frac{\mathbb{E}(|\bar{X}_n-1|^3)}{v_1^4}.$$
For the case where the $X_i$ do not admit a minimum but have moments of every order, one can do a similar computation using the full Taylor expansion, noting that $f^{(i)}(1) = (-1)^i\, i!$:
\begin{align*}
\mathbb{E}(1/\bar{X}_n) &= \sum_{i=0}^{+\infty} \frac{f^{(i)}(1)}{i!}\,\mathbb{E}\left((\bar{X}_n-1)^i\right)\\
&= \sum_{i=0}^{+\infty} (-1)^i\, \mathbb{E}\left((\bar{X}_n-1)^i\right).
\end{align*}
Now if we can show that the $k^{\text{th}}$ central moment of $\bar{X}_n$ is $O(1/n^{\lceil k/2 \rceil})$, this validates that $\mathbb{E}(1/\bar{X}_n) \approx 1 + \frac{\sigma^2}{n}$.
$endgroup$
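The approximation can be sanity-checked by simulation. Below is a minimal Monte Carlo sketch, assuming a hypothetical two-point distribution ($X$ equal to $0.5$ or $1.5$ with equal probability, so $\mu = 1$, $\sigma^2 = 0.25$, $v_1 = 0.5 > 0$; this example distribution is an illustrative choice, not from the thread), comparing a simulated $\mathbb{E}(1/\bar{X}_n)$ with the standard first-order (delta-method) prediction $1 + \sigma^2/n$ for $\mu = 1$:

```python
import random

random.seed(0)

# Hypothetical example distribution (an assumption for illustration):
# X is 0.5 or 1.5 with equal probability, so E(X) = 1, Var(X) = 0.25,
# and the support minimum is v1 = 0.5 > 0.
n = 50          # sample size behind each mean
reps = 50_000   # number of simulated means
sigma2 = 0.25

acc = 0.0
for _ in range(reps):
    xbar = sum(random.choice((0.5, 1.5)) for _ in range(n)) / n
    acc += 1.0 / xbar
estimate = acc / reps            # Monte Carlo estimate of E(1/Xbar_n)

prediction = 1 + sigma2 / n      # first-order (delta-method) prediction
print(estimate, prediction)
```

For these settings the estimate should land within about $10^{-3}$ of the prediction, consistent with a bias of order $\sigma^2/n$ rather than anything larger.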
$begingroup$
Turns out that $\bar{X}_n$ does admit moments of every order and that the $p^{\text{th}}$ central moment is of the form $O(n^{-p/2})$: arxiv.org/pdf/1105.6283.pdf
$endgroup$
– Gopi
yesterday
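That moment scaling can also be eyeballed numerically. A minimal sketch, again assuming a hypothetical two-point distribution on $\{0.5, 1.5\}$ with mean $1$ (an illustrative choice, not from the thread): if the 4th central moment of $\bar{X}_n$ is $O(n^{-2})$, quadrupling $n$ should shrink it by roughly $4^2 = 16$.

```python
import random

random.seed(1)

def central_moment4(n, reps=100_000):
    """Monte Carlo estimate of E[(Xbar_n - 1)^4], where Xbar_n is the mean
    of n draws from a hypothetical two-point distribution on {0.5, 1.5}
    (mean 1, variance 0.25) -- an assumption for illustration."""
    acc = 0.0
    for _ in range(reps):
        xbar = sum(random.choice((0.5, 1.5)) for _ in range(n)) / n
        acc += (xbar - 1.0) ** 4
    return acc / reps

m4_small = central_moment4(10)
m4_large = central_moment4(40)   # n quadrupled
ratio = m4_small / m4_large      # near 4^2 = 16 if the moment is O(1/n^2)
print(ratio)
```

The exact ratio sits slightly below $16$ because $\mathbb{E}[(\bar{X}_n-1)^4] = \mu_4/n^3 + 3(n-1)\sigma^4/n^3$, whose leading term is $3\sigma^4/n^2$.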
edited 7 hours ago
answered yesterday
Gopi
New contributor
Gopi is a new contributor. Be nice, and check out our Code of Conduct.
Thanks for contributing an answer to Cross Validated!
expected-value, iid, variance
$begingroup$
Hello Gopi, thank you for your question. If this is ALMOST a cross-post, your question is welcome here; but if it is a cross-post, you should instead put a bounty on the post on Math.SE.
$endgroup$
– Ferdi
yesterday
$begingroup$
It's almost a cross-post: the question is different (bounds on the expectation vs. convergence speed). If I get the answer on MSE it probably won't help me with this question, but I've referenced it because someone who has the answer to one likely has the answer to both.
$endgroup$
– Gopi
yesterday
$begingroup$
Perhaps Markov's inequality or Chebyshev's inequality are useful.
$endgroup$
– Ertxiem
yesterday
$begingroup$
Without the discreteness criterion, the answer is no, because any finite bound based on $\sigma^2, n$ will be exceeded when the underlying distribution is a suitable mixture of a $\Gamma\left(\frac{1}{n+1},\, n+1\right)$ distribution with some other distribution. The presence of this Gamma component assures the sum of $n$ iid values will have a Gamma component with shape less than $1$, whose PDF diverges at $0$, forcing the reciprocal of the sum to have infinite expectation. This demonstrates the hopelessness of using a Taylor expansion in the analysis.
$endgroup$
– whuber♦
5 hours ago
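whuber's divergence mechanism can be checked directly: a $\Gamma(a, 1)$ component with shape $a < 1$ has density behaving like $x^{a-1}/\Gamma(a)$ near $0$, so the integral defining $\mathbb{E}(1/S)$ behaves like $\int_0 x^{a-2}\,dx = \infty$. A minimal numerical sketch (shape $a = 1/2$ and the integration cutoffs are illustrative choices, not from the thread) computes the truncated integral and watches it blow up as the lower cutoff shrinks:

```python
import math

def truncated_mean_reciprocal(eps, shape=0.5, steps=4000):
    """Midpoint-rule estimate of E[1/S ; eps < S < 50] for S ~ Gamma(shape, 1),
    i.e. the integral of (1/x) * x**(shape - 1) * exp(-x) / Gamma(shape)
    over (eps, 50), computed on a geometric grid via the substitution x = exp(t)."""
    lo, hi = math.log(eps), math.log(50.0)
    h = (hi - lo) / steps
    total = 0.0
    for i in range(steps):
        x = math.exp(lo + (i + 0.5) * h)               # midpoint in t-space
        total += x ** (shape - 1.0) * math.exp(-x) * h  # (1/x)*x^(a-1)*e^-x * (x dt)
    return total / math.gamma(shape)

for eps in (1e-2, 1e-4, 1e-6):
    print(eps, truncated_mean_reciprocal(eps))
```

Each 100-fold decrease in the cutoff multiplies the truncated integral by roughly $10$ (it grows like $\varepsilon^{-1/2}$): the numerical signature of $\mathbb{E}(1/S) = \infty$, so any finite Taylor-based bound must fail for such mixtures.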