



Expectation of inverse of sum of positive iid variables





























Let $(X_i)_i$ be a sequence of iid positive variables with mean $1$ and variance $\sigma^2$. Let $\bar{X}_n = \frac{\sum_{i=1}^n X_i}{n}$.



My question is: can we bound $\mathbb{E}(1/\bar{X}_n)$ as a function of $\sigma$ and $n$?



There seems to be a strategy that may work based on the Taylor expansion, but

  • I'm not sure about the hypotheses that need to be met;

  • whether it works in this case; and

  • whether we can say something definite about $\bar{X}_n$, or whether we need to use the central limit theorem and can only say this for the normal approximation.

More details about the Taylor expansion. According to this Wikipedia article,
$$\mathbb{E}(f(X)) \approx f(\mu_X) + \frac{f''(\mu_X)}{2}\sigma_X^2$$



So in my case it would give something like:
$$\mathbb{E}(1/\bar{X}_n) \approx 1 + \frac{\sigma^2}{4n}$$
I'm trying to find a formal proof of a similar result, or the hypotheses under which it holds. Maybe references?
Thanks



EDIT: if needed, we can assume that the $(X_i)_i$ are discrete: there exist $v_1<\cdots<v_K$ such that $\mathbb{P}(X=v_k)=p_k$ and $\sum_k p_k = 1$. In this case we know that $\bar{X}_n \geq v_1$. I believe something can still be said in the general case, though.



PS: this is almost a cross-post of this on Math.SE.
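A quick Monte Carlo sketch of the quantity in question (illustrative only; the Gamma model and all parameter values are hypothetical choices, picked because this family gives a closed-form reference value):

import numpy as np

# Monte Carlo estimate of E[1/Xbar_n] for one concrete positive law:
# X_i ~ Gamma(k, theta) with k*theta = 1 (mean 1) and k*theta^2 = sigma^2,
# i.e. k = 1/sigma^2 and theta = sigma^2.
rng = np.random.default_rng(0)
sigma2, n, reps = 0.25, 50, 100_000
k, theta = 1.0 / sigma2, sigma2

xbar = rng.gamma(shape=k, scale=theta, size=(reps, n)).mean(axis=1)
mc = (1.0 / xbar).mean()

# For this Gamma family, S = n*Xbar_n ~ Gamma(n*k, theta) and
# E[1/S] = 1/(theta*(n*k - 1)), hence E[1/Xbar_n] = n/(n - sigma^2) exactly.
exact = n / (n - sigma2)

print(f"Monte Carlo estimate          : {mc:.6f}")
print(f"Exact value (Gamma case)      : {exact:.6f}")
print(f"1 + sigma^2/n (2nd-order term): {1 + sigma2 / n:.6f}")
print(f"1 + sigma^2/(4n) (the post)   : {1 + sigma2 / (4 * n):.6f}")

Comparing the printed values against the candidate approximations shows how far the Taylor-style guess is from the truth for this particular choice of distribution.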





















  • Hello Gopi, thank you for your question. If this is ALMOST a cross-post your question is welcome here, but if it is a cross-post you should rather put a bounty on the post on Math.SE. – Ferdi (yesterday)

  • It's almost: the question is different (bounds on the expectation vs. convergence speed). If I get the answer on Math.SE it probably won't help me with this question, but I've referenced it because it's likely that someone who has the answer to one has the answer to both. – Gopi (yesterday)

  • Perhaps Markov's inequality or Chebyshev's inequality are useful. – Ertxiem (yesterday)

  • Without the discreteness criterion, the answer is no, because any finite bound based on $\sigma^2, n$ will be exceeded when the underlying distribution is a suitable mixture of a $\Gamma\left(\frac{1}{n+1}, (n+1)\right)$ distribution with some other distribution. The presence of this Gamma component assures the sum of $n$ iid values will have a Gamma component with shape less than $1$, whose PDF diverges at $0$, forcing the reciprocal sum to have infinite expectation. This demonstrates the hopelessness of using a Taylor expansion in the analysis. – whuber (5 hours ago)
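To illustrate whuber's comment numerically, here is a small sketch (my own; the shape value and the cutoffs are arbitrary). For a Gamma distribution with shape $a<1$ the density behaves like $x^{a-1}$ near $0$, so $\int (1/x) f(x)\,dx$ diverges; the partial integrals below grow without bound as the lower limit shrinks.

import numpy as np
from scipy.stats import gamma
from scipy.integrate import quad

# For Gamma with shape a < 1, pdf(x)/x ~ x^(a-2) near 0, which is not
# integrable at 0. Watch the truncated integral blow up as eps -> 0.
a = 0.5  # example shape < 1
f = lambda x: gamma.pdf(x, a) / x

for eps in [1e-2, 1e-4, 1e-6, 1e-8]:
    # integrate piecewise on log-spaced panels for numerical robustness
    edges = np.logspace(np.log10(eps), 0, 30)
    val = sum(quad(f, lo, hi)[0] for lo, hi in zip(edges[:-1], edges[1:]))
    val += quad(f, 1.0, np.inf)[0]
    print(f"integral of pdf(x)/x from {eps:.0e}: {val:.1f}")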
















Tags: variance, expected-value, iid






asked yesterday by Gopi




2 Answers































You cannot bound that expectation in $\sigma, n$. That's because there is the distinct possibility that the expectation does not exist at all (or is $\infty$). See I've heard that ratios or inverses of random variables often are problematic, in not having expectations. Why is that?. If the conditions given there are fulfilled for the density of $X_1$, they will also be for the density of $\bar{X}_n$. If densities do not exist, but probability mass functions do, it is simpler, since your assumptions prohibit a probability atom at zero; but a probability density can still be positive at zero even if $P(X>0)=1$.



For a useful bound you will at least need to restrict the common distribution of $X_1, \dotsc, X_n$ much more.



EDIT

After your new information, and with $v_1>0$, the expectation of $1/\bar{X}_n$ certainly will exist (irrespective of whether $K$ is finite or not). And, since the function $x \mapsto 1/x$ is convex for $x>0$, we can use the Jensen inequality to conclude that $\mathbb{E}[1/\bar{X}_n] \ge 1/\mathbb{E}[\bar{X}_n]$.
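A minimal simulation sketch of this Jensen lower bound, assuming a made-up discrete distribution with $v_1>0$ and mean $1$ (all values here are hypothetical):

import numpy as np

# Discrete X with support v_1 < ... < v_K, all positive, and mean 1,
# so Jensen gives E[1/Xbar_n] >= 1/E[Xbar_n] = 1.
values = np.array([0.5, 1.0, 1.5])    # v_k (hypothetical)
probs = np.array([0.25, 0.5, 0.25])   # p_k, sums to 1, mean is 1

rng = np.random.default_rng(0)
n, reps = 20, 100_000
x = rng.choice(values, p=probs, size=(reps, n))
lhs = (1 / x.mean(axis=1)).mean()
print(f"E[1/Xbar_n] ~ {lhs:.5f}  >=  1/E[Xbar_n] = 1")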






answered yesterday by kjetil b halvorsen (edited yesterday)












  • But the thing is that this is really not a general case but a very specific case: it's very unlikely that there is probability mass near 0 (we can even evaluate it with Markov's inequality): $\bar{X}_n$ is centered around $1$ and has a variance of $\sigma^2/n$. – Gopi (yesterday)

  • But even a tiny (but positive) probability close to zero can lead to the expectation of the inverse being $\infty$. Can you rule out that possibility? – kjetil b halvorsen (yesterday)

  • In my case I can indeed (the distribution of $X_i$ is discrete). I'll add this hypothesis to the question. I'm still interested, from a theoretical perspective, in the general case; I believe something can still be said there (or I'm interested in a counterexample that would show your statement). – Gopi (yesterday)

  • Information about discreteness is really important! And, on what support? If positive integers, $P(X=0)=0$ and $\mu=1$ is really restrictive ... – kjetil b halvorsen (yesterday)

  • Obviously the support is not positive integers ;). The support is the set of rational numbers. – Gopi (yesterday)































I think I have the gist of it.
Given that $f(x)=1/x$ is infinitely differentiable at $1$, Taylor's theorem tells us:

There exists $\varepsilon>0$ such that $f(x) = f(1) + f'(1)(x-1) + \frac{f''(1)(x-1)^2}{2} + \frac{f'''(\varepsilon)(x-1)^2}{2}$.

In our case, if $X_i$ takes values in the domain $[v_1, +\infty)$, then $\bar{X}_n$ has the same domain and we have $\varepsilon \geq v_1$.

Hence
$\mathbb{E}(1/\bar{X}_n) = \mathbb{E}\left(1 - (\bar{X}_n-1) + \frac{(\bar{X}_n-1)^2}{4} + \frac{f'''(\varepsilon)(\bar{X}_n-1)^2}{2}\right)$, and
\begin{align*}
\mathbb{E}(1/\bar{X}_n) &= 1 + \frac{f'''(\varepsilon)\,\mathbb{E}\left((\bar{X}_n-1)^2\right)}{2} = 1 + \frac{V(\bar{X}_n)}{4} - \frac{V(\bar{X}_n)}{12\,\varepsilon^4}
\end{align*}

and hence
$$1 + \frac{\sigma^2}{4n} - \frac{\sigma^2}{12\,v_1^4\,n} \leq \mathbb{E}(1/\bar{X}_n) \leq 1 + \frac{\sigma^2}{4n}.$$

For the case where the $X_i$ do not admit a minimum but have an unlimited number of moments, one can do a similar transformation using the full Taylor expansion:

\begin{align*}
\mathbb{E}(1/\bar{X}_n) &= \sum_{i=0}^{+\infty} \frac{f^{(i)}(1)}{i!}\,\mathbb{E}\left((\bar{X}_n-1)^i\right)\\
&= \sum_{i=0}^{+\infty} \frac{(-1)^i\, i!}{i!}\,\mathbb{E}\left((\bar{X}_n-1)^i\right)
\end{align*}

Now if we can say something about the $k^{\text{th}}$ moment of $\tilde{X}_n$ being $O(1/n^{k/2})$, this validates that $\mathbb{E}(1/\bar{X}_n) \approx 1 + \frac{\sigma^2}{4n}$.
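A short sketch to probe the two-sided bound claimed above on a concrete example, assuming the same kind of hypothetical discrete distribution as in the question's EDIT (support values, probabilities, and $n$ are arbitrary choices); whether the bounds actually hold for a given distribution is exactly what such a simulation tests:

import numpy as np

# Hypothetical discrete X with minimum v_1 > 0 and mean 1.
values = np.array([0.5, 1.0, 1.5])
probs = np.array([0.25, 0.5, 0.25])
v1 = values[0]
sigma2 = float(np.sum(probs * (values - 1.0) ** 2))  # variance (mean is 1)

rng = np.random.default_rng(1)
n, reps = 50, 200_000
xbar = rng.choice(values, p=probs, size=(reps, n)).mean(axis=1)
mc = (1 / xbar).mean()

# The bounds as claimed in this answer.
lower = 1 + sigma2 / (4 * n) - sigma2 / (12 * v1**4 * n)
upper = 1 + sigma2 / (4 * n)
print(f"claimed lower bound : {lower:.6f}")
print(f"Monte Carlo estimate: {mc:.6f}")
print(f"claimed upper bound : {upper:.6f}")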






answered yesterday by Gopi (edited 7 hours ago)












  • Turns out that $\bar{X}_n$ does admit $n$ moments and that they are of the form $O(n^{-p/2})$: arxiv.org/pdf/1105.6283.pdf – Gopi (yesterday)









