Questions tagged [moment-generating-functions]

For questions relating to moment-generating functions (m.g.f.), which are a way to find moments such as the mean $(\mu)$ and the variance $(\sigma^2)$. Finding the m.g.f. of a discrete random variable involves summation; for continuous random variables, integration is used.
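A minimal worked example of the definitions above (standard formulas, not tied to any particular question below): for a random variable $X$, the m.g.f. is
$$M_X(t) = \mathbb{E}\!\left[e^{tX}\right] = \begin{cases} \sum_x e^{tx}\,P(X = x) & \text{discrete case (summation)} \\ \int_{-\infty}^{\infty} e^{tx}\, f_X(x)\,dx & \text{continuous case (integration)} \end{cases}$$
and, when $M_X$ is finite on a neighborhood of $0$, moments are recovered by differentiation: $\mathbb{E}[X^n] = M_X^{(n)}(0)$, so in particular $\mu = M_X'(0)$ and $\sigma^2 = M_X''(0) - \left(M_X'(0)\right)^2$.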

1 vote
1 answer
84 views

An upper bound for the moment generating function $f(t) = \mathbb{E}[e^{tX}]$ when the mean, variance and upper bound of $X$ are given

Suppose a real random variable $X$ has an upper bound $c > 0$ but has no lower bound. $\mathbb{E}[X] = 0$ and $\mathbb{E}[X^2] = \sigma^2$ ($\mathbb{E}[]$ denotes the expectation of random ...
user31587575
4 votes
2 answers
115 views

A random variable $X$ is sub-Gaussian iff $\exists C>0: \mathbb{E}[e^{Xt}]\le C\exp(Ct^2)$

This is exercise 1.1.4 from Tao's "Topics in Random Matrix Theory". It asks the reader to prove that a real-valued random variable $X$ is sub-Gaussian iff there exists $C>0$ such that $\...
Daigaku no Baku
0 votes
1 answer
124 views

Can two non-independent random variables be split into independent ones?

Let $(X,Y)$ be a random vector with joint moment generating function $$M(t_1,t_2) = \frac{1}{(1-(t_1+t_2))(1-t_2)}$$ Let $Z=X+Y$. Then, Var(Z) is equal to: (IIT JAM MS 2021, Q21) Using $M_{X+Y}(t) = ...
Starlight
1 vote
0 answers
42 views

Moments of a sum of iid random variables with mean 0 and variance 1

The method of moments Wikipedia page says that odd moments of the sum $S_n$ of iid random variables $X_i$ (each mean 0 and variance 1) vanish. While even moments are all finite and have closed form, ...
Andras Vanyolos
2 votes
0 answers
59 views

What is the most efficient algorithm for computing $E[x_1 x_2 \cdots x_n]$ in a multivariate normal distribution? [closed]

I am working with a multivariate normal distribution $\mathbf{x} = [x_1, x_2, \ldots, x_n] \sim \mathcal{N}(\mathbf{\mu}, \mathbf{\Sigma})$, and I need to compute the expectation $E[x_1 x_2 \cdots x_n]...
cloudmath
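For context on the question above (a standard identity for the multivariate normal joint m.g.f.; this is a sketch, not the asker's method): the expectation can be read off as a mixed partial derivative of the joint m.g.f.,
$$\mathbb{E}[x_1 x_2 \cdots x_n] = \left.\frac{\partial^n}{\partial t_1 \,\partial t_2 \cdots \partial t_n}\, M_{\mathbf{x}}(\mathbf{t})\right|_{\mathbf{t}=0}, \qquad M_{\mathbf{x}}(\mathbf{t}) = \exp\!\left(\mathbf{t}^\top \boldsymbol{\mu} + \tfrac{1}{2}\,\mathbf{t}^\top \boldsymbol{\Sigma}\,\mathbf{t}\right),$$
and in the zero-mean case this reduces to Isserlis' theorem: a sum over all pairings of the variables of products of the corresponding covariances (zero when $n$ is odd).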
2 votes
1 answer
245 views

What does the Chernoff bound want to point out? Intuition

e.g. Chebyshev's bound $P(|X-\mu| \geq a) \leq \frac{\sigma^2}{a^2}$ gives an upper bound on the probability that $X$ deviates from its mean (in absolute value) by more than a certain amount. I can'...
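For reference on the intuition asked about above (the standard derivation, not the accepted answer): the Chernoff bound applies Markov's inequality to $e^{tX}$ rather than to $(X-\mu)^2$, which brings the m.g.f. into play and lets one optimize over $t$:
$$P(X \geq a) = P\!\left(e^{tX} \geq e^{ta}\right) \leq e^{-ta}\,\mathbb{E}\!\left[e^{tX}\right] = e^{-ta}\, M_X(t) \quad \text{for every } t > 0, \qquad \text{so} \quad P(X \geq a) \leq \inf_{t>0}\, e^{-ta}\, M_X(t).$$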
0 votes
0 answers
48 views

On the finiteness of moments

Consider a random variable $X$ with full support and assume $$ \mathbb{E}(\max\{0,\exp(\beta X)\})^p<+\infty $$ for some $\beta>0$ and $p>0$. Does this imply that $$ \mathbb{E}(\max\{0,X\})^...
Star
3 votes
1 answer
236 views

Proof that a moment generating function is unique

I am trying to understand the proof of why the Moment Generating Function (https://en.wikipedia.org/wiki/Moment-generating_function) of a random variable is unique. For example, consider the ...
stats_noob
1 vote
1 answer
70 views

Moment Generating Function of a Product of Random Variables

Suppose $X,Y$ are independent random variables, with respective moment generating functions (MGFs) $M_X(t)$ and $M_Y(t)$. It is known that the MGF of $XY$ is given by $$\int_{-\infty}^{\infty} \int_{-\...
Hello
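For context on the question above (a standard conditioning step, assuming $X$ and $Y$ are independent with densities and that the relevant expectations exist; a sketch, not the asker's full expression): conditioning on $Y$ rewrites the double integral as an expectation of the m.g.f. of $X$ at a random argument,
$$M_{XY}(t) = \mathbb{E}\!\left[e^{tXY}\right] = \mathbb{E}_Y\!\left[\,\mathbb{E}_X\!\left[e^{(tY)X} \mid Y\right]\right] = \mathbb{E}_Y\!\left[M_X(tY)\right] = \int_{-\infty}^{\infty} M_X(ty)\, f_Y(y)\,dy.$$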
4 votes
0 answers
81 views

What "methods" does a multivariate Moment-Generating Function provide?

I don't think the Probability and Random Variables course I studied really touched on multivariate Moment-Generating Functions at all, and Wikipedia appears surprisingly silent on this question. The ...
user10478
1 vote
0 answers
59 views

Proof of central limit theorem by moment generating functions and the assumption on $X_i$

I was reading the proof of the CLT by MGF in this answer. Let $Y_i$ be i.i.d. random variables with mean 0 and variance 1; in the original answer, by Taylor expansion we have $$M_{Y_1}(s) = E[\exp(...
taylor
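For context on the question above (the standard MGF argument, assuming $M_{Y_1}$ is finite in a neighborhood of $0$, which is exactly the assumption the question concerns): the second-order Taylor expansion combines with independence to give
$$M_{Y_1}(s) = 1 + \tfrac{s^2}{2} + o(s^2) \quad (s \to 0), \qquad M_{\frac{1}{\sqrt{n}}\sum_{i=1}^{n} Y_i}(s) = \left[M_{Y_1}\!\left(\tfrac{s}{\sqrt{n}}\right)\right]^n = \left[1 + \tfrac{s^2}{2n} + o\!\left(\tfrac{1}{n}\right)\right]^n \longrightarrow e^{s^2/2},$$
the m.g.f. of the standard normal.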
2 votes
1 answer
56 views

Identifying a moment generating function.

I am working on the following problem, and I am having trouble understanding why (a) is not a moment-generating function, other than that it doesn't satisfy the general form of the MGF (i.e. $E[e^{tX}]$), and that (...
Harry Lofi
-1 votes
1 answer
112 views

Moment generating function of the Borel distribution [closed]

Calculate the moment generating function of the Borel distribution given by $P(X=x; \mu) = \frac{e^{-\mu x} (\mu x)^{x-1}}{x!}$ with $\mu \in (0,1)$ and $x = 1, 2, 3,\ldots$.
André Ferrari Castanheiro
2 votes
1 answer
76 views

Compute $\mathbb{E}\left[\tau^2\right]$ when $\tau=\inf\{t>0|W_t-W_0\geq +a\textrm{ or }W_t-W_0\leq -b\}$

Let $W$ be a Brownian motion, $a>0$ and $b>0$ real constants. Let $\tau$ be the stopping time $$ \tau=\inf\left\{t\geq 0:W_t-W_0\geq +a\textrm{ or }W_t-W_0\leq -b\right\} $$ I need to compute $\...
AlmostSureUser
6 votes
0 answers
152 views

What is the expectation of this power series of random variables?

Let $(a_k)_{k\in\mathbb{N}}$ be iid random variables distributed uniformly in $(-1, 1)$, and consider, for some fixed $r \in [0, 1)$, the limiting random variable $$ X = \lim_{n \to \infty} \sum_{k=0}^...
smalldog
