| dbo:description
|
- continuous probability distribution (en)
|
| dbo:thumbnail
| |
| dbo:wikiPageExternalLink
| |
| dbo:wikiPageWikiLink
| |
| dbp:border
| |
| dbp:mathStatement
|
- If $X_1, X_2$ are subgaussian and independent, then $X_1 + X_2$ is subgaussian, with $\Vert X_1 + X_2 \Vert^2_{vp} \le \Vert X_1 \Vert^2_{vp} + \Vert X_2 \Vert^2_{vp}$. (en)
- If $X_1, \dots, X_n$ are subgaussian, with $\Vert X_i \Vert^2_{vp} = \sigma_i^2$, then $\left\Vert \sum_i X_i \right\Vert^2_{vp} \le \left( \sum_i \sigma_i \right)^2$. (en)
- If $X$ is subgaussian, then $\Pr(X \ge t) \le \exp\left(-\frac{t^2}{2 \Vert X \Vert^2_{vp}}\right)$. (en)
- If $\mathbb{E}[X] = 0$, and $\Pr(|X| \ge t) \le C e^{-ct^2}$ for all $t > 0$, then $\mathbb{E}\left[e^{sX}\right] \le e^{C's^2}$ where $C'$ depends on $C, c$ only. (en)
- If $f \colon \mathbb{R}^n \to \mathbb{R}$ is $L$-Lipschitz, and $X$ is a standard gaussian vector, then $f(X)$ concentrates around its expectation at a rate $\Pr(f(X) - \mathbb{E}[f(X)] \ge t) \le e^{-2t^2/(\pi^2 L^2)}$, and similarly for the other tail. (en)
- If $X_1, \dots, X_n$ are subgaussians, with $\Vert X_i \Vert^2_{vp} \le \sigma^2$, then $\mathbb{E}\left[\max_i X_i\right] \le \sigma \sqrt{2 \ln n}$. Further, the bound is sharp, since when $X_1, \dots, X_n$ are IID samples of $\mathcal{N}(0, \sigma^2)$ we have $\mathbb{E}\left[\max_i X_i\right] = (1 - o(1))\, \sigma \sqrt{2 \ln n}$; see the numerical sketch after this list. (en)
- where $C'$ depends only on $C$. (en)
- * If $X$ is subgaussian, and $c > 0$, then $cX$ is subgaussian, with $\Vert cX \Vert_{vp} = c \Vert X \Vert_{vp}$ and $\Vert cX \Vert_{\psi_2} = c \Vert X \Vert_{\psi_2}$.
* If $X, Y$ are subgaussian, then $\Vert X + Y \Vert_{\psi_2} \le \Vert X \Vert_{\psi_2} + \Vert Y \Vert_{\psi_2}$
* If $X$ is subgaussian, then $\Pr(|X| \ge t) \le 2 e^{-t^2/\Vert X \Vert^2_{\psi_2}}$ for all $t \ge 0$ (en)
- If $X$ is a random vector in $\mathbb{R}^n$, such that $\Vert \langle X, v \rangle \Vert^2_{vp} \le \sigma^2$ for all $v$ on the unit sphere $S^{n-1}$, then $\mathbb{E}\left[\Vert X \Vert_2\right] \lesssim \sigma \sqrt{n}$. For any $\delta > 0$, with probability at least $1 - \delta$, $\Vert X \Vert_2 \lesssim \sigma\left(\sqrt{n} + \sqrt{\ln(1/\delta)}\right)$. (en)
- Linear sums of subgaussian random variables are subgaussian. (en)
- $X_1, \dots, X_n$ are independent random variables with the same upper subgaussian tail: $\Pr(X_i \ge t) \le e^{-t^2/C}$ for all $t \ge 0$. Also, $\mathbb{E}[X_i] = 0$; then for any unit vector $v \in S^{n-1}$, the linear sum $\sum_i v_i X_i$ has a subgaussian tail: $\Pr\left(\sum_i v_i X_i \ge t\right) \le e^{-t^2/C'}$. (en)
- Fix a finite set of vectors $v_1, \dots, v_n$. If $X$ is a random vector, such that each $\Vert \langle X, v_i \rangle \Vert^2_{vp} \le \sigma^2$, then the above 4 inequalities hold, with $\sup_{v \in K} \langle X, v \rangle$ replacing $\max_i X_i$. Here, $K$ is the convex polytope hulled by the vectors $v_1, \dots, v_n$. (en)
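As a quick numerical illustration of the maximum bound above, here is a minimal sketch (not from the source; NumPy and the values of sigma, n, trials are assumptions chosen for illustration) comparing a Monte Carlo estimate of $\mathbb{E}[\max_i X_i]$ for IID $\mathcal{N}(0, \sigma^2)$ samples, the case in which the bound is sharp, against $\sigma\sqrt{2 \ln n}$:

```python
import numpy as np

# Sketch: empirically check E[max_i X_i] <= sigma * sqrt(2 ln n) for IID N(0, sigma^2).
# sigma, n, trials are illustrative choices, not values from the source.
rng = np.random.default_rng(0)
sigma, n, trials = 1.0, 10_000, 500
samples = rng.normal(0.0, sigma, size=(trials, n))
empirical = samples.max(axis=1).mean()    # Monte Carlo estimate of E[max_i X_i]
bound = sigma * np.sqrt(2 * np.log(n))    # subgaussian maximum bound
print(f"empirical E[max] ~ {empirical:.3f} <= bound {bound:.3f}")
```

For $n = 10^4$ the bound evaluates to about $4.29$, while the empirical mean of the maximum sits somewhat below it, consistent with sharpness holding only in the $n \to \infty$ limit.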
|
| dbp:name
|
- Corollary (en)
- Basic properties (en)
- Independent subgaussian sum bound (en)
- Partial converse (en)
- Subgaussian deviation bound (en)
- Gaussian concentration inequality for Lipschitz functions (en)
|
| dbp:note
|
- Exercise 2.5.10 (en)
- Matoušek 2008, Lemma 2.2 (en)
- Matoušek 2008, Lemma 2.4 (en)
- Tao 2012, Theorem 2.1.12. (en)
- over a convex polytope (en)
- over a finite set (en)
- subgaussian random vectors (en)
|
| dbp:proof
|
- For any $t > 0$: $\Pr(X \ge t) \le e^{-st}\,\mathbb{E}\left[e^{sX}\right] \le e^{-st + s^2\sigma^2/2}$. This is a standard proof structure for proving Chernoff-like bounds for sub-Gaussian variables; the Markov-plus-optimization step is written out after this list. For the second equation, it suffices to prove the case with one variable and zero mean, then use the union bound. First by Markov, $\Pr(X \ge t) = \Pr\left(e^{sX} \ge e^{st}\right) \le e^{-st}\,\mathbb{E}\left[e^{sX}\right]$, then by definition of variance proxy, $\mathbb{E}\left[e^{sX}\right] \le e^{s^2\sigma^2/2}$, and then optimize at $s = t/\sigma^2$. (en)
- By the Chernoff bound, $\Pr(X_i \ge t) \le e^{-t^2/(2\sigma^2)}$ for each $i$. Now apply the union bound: $\Pr\left(\max_i X_i \ge t\right) \le n\, e^{-t^2/(2\sigma^2)}$. (en)
- By the triangle inequality, $\Vert X + Y \Vert_{\psi_2} \le \Vert X \Vert_{\psi_2} + \Vert Y \Vert_{\psi_2}$. Now we have a bound on the subgaussian norm of $X + Y$. By the equivalence of the $\psi_2$-norm and variance-proxy definitions of subgaussianity, we have that $X + Y$ is subgaussian. (en)
- Let $F$ be the CDF of $X$. The proof splits the integral of the MGF into two halves, one with $x \le 0$ and one with $x > 0$, and bounds each one respectively.
Since $e^{sx} \le 1$ for $x \le 0$, the first half is at most $1$.
For the second term, upper bound it by a summation over unit intervals:
$\int_0^\infty e^{sx}\,dF(x) \le \sum_{k=0}^\infty e^{s(k+1)}\,\Pr(X \ge k) \le \sum_{k=0}^\infty C\, e^{s(k+1) - ck^2}.$
When $k \le s/c$, for any such $k$, the exponent $s(k+1) - ck^2$ is maximized near $k = s/(2c)$, so these terms contribute at most $e^{O(s^2/c + s)}$. When $k > s/c$, by drawing out the curve of $k \mapsto e^{s(k+1) - ck^2}$, and plotting out the summation, we find that the remaining terms sum to $O(1)$. Now verify that the total is at most $e^{C's^2}$, where $C'$ depends on $C, c$ only. (en)
- By shifting and scaling, it suffices to prove the case where $L = 1$, and $\mathbb{E}[f(X)] = 0$.
Since every 1-Lipschitz function is uniformly approximable by 1-Lipschitz smooth functions, it suffices to prove it for 1-Lipschitz smooth functions.
Now it remains to bound the cumulant generating function $\ln \mathbb{E}\left[e^{\lambda f(X)}\right]$.
To exploit the Lipschitzness, we introduce $Y$, an independent copy of $X$, then by Jensen, $\mathbb{E}\left[e^{\lambda f(X)}\right] \le \mathbb{E}\left[e^{\lambda (f(X) - f(Y))}\right]$.
By the circular symmetry of gaussian variables, we introduce $X_\theta := Y \cos\theta + X \sin\theta$. This has the benefit that its derivative $X_\theta' = -Y \sin\theta + X \cos\theta$ is independent of it. Since $f(X) - f(Y) = \int_0^{\pi/2} \frac{d}{d\theta} f(X_\theta)\, d\theta = \int_0^{\pi/2} \langle \nabla f(X_\theta), X_\theta' \rangle\, d\theta$, Jensen's inequality gives $e^{\lambda(f(X) - f(Y))} \le \frac{2}{\pi} \int_0^{\pi/2} e^{\frac{\pi\lambda}{2} \langle \nabla f(X_\theta), X_\theta' \rangle}\, d\theta$.
Now take its expectation. The expectation within the integral is over the joint distribution of $(X_\theta, X_\theta')$, but since the joint distribution of $(X_\theta, X_\theta')$ is exactly the same as that of $(X, Y)$ for every $\theta$, we have $\mathbb{E}\left[e^{\lambda(f(X) - f(Y))}\right] \le \frac{2}{\pi} \int_0^{\pi/2} \mathbb{E}\left[e^{\frac{\pi\lambda}{2} \langle \nabla f(X), Y \rangle}\right] d\theta$.
Conditional on $X$, the quantity $\langle \nabla f(X), Y \rangle$ is normally distributed, with variance $\Vert \nabla f(X) \Vert^2 \le 1$, so $\mathbb{E}\left[e^{\frac{\pi\lambda}{2} \langle \nabla f(X), Y \rangle}\right] \le e^{\pi^2 \lambda^2 / 8}$.
Thus, we have $\mathbb{E}\left[e^{\lambda f(X)}\right] \le e^{\pi^2 \lambda^2 / 8}$, so $f(X)$ is subgaussian with variance proxy $\pi^2/4$, giving the tail bound $\Pr(f(X) \ge t) \le e^{-2t^2/\pi^2}$. (en)
- If independent, then use that the cumulant generating function of independent random variables is additive. That is, $K_{X_1 + \dots + X_n}(s) = \sum_i K_{X_i}(s) \le \frac{1}{2} s^2 \sum_i \sigma_i^2$.
If not independent, then by Hölder's inequality, for any $p_1, \dots, p_n$ with $\sum_i 1/p_i = 1$ we have $\mathbb{E}\left[e^{s \sum_i X_i}\right] \le \prod_i \left(\mathbb{E}\left[e^{s p_i X_i}\right]\right)^{1/p_i} \le \exp\left(\frac{1}{2} s^2 \sum_i p_i \sigma_i^2\right)$.
Solving the optimization problem
$\min\left\{ \sum_i p_i \sigma_i^2 : \sum_i 1/p_i = 1 \right\} = \left(\sum_i \sigma_i\right)^2$, we obtain the result. (en)
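The Markov-plus-optimization step used in the first proof, written out in full (a standard computation consistent with, not quoted from, the source; $\sigma^2$ denotes the variance proxy of the zero-mean variable $X$):

```latex
\begin{align*}
\Pr(X \ge t)
  &= \Pr\!\left(e^{sX} \ge e^{st}\right)
   \le e^{-st}\,\mathbb{E}\!\left[e^{sX}\right]
   && \text{(Markov's inequality, any } s > 0\text{)} \\
  &\le \exp\!\left(-st + \tfrac{1}{2}s^{2}\sigma^{2}\right)
   && \text{(variance-proxy bound on the MGF)} \\
  &= \exp\!\left(-\frac{t^{2}}{2\sigma^{2}}\right)
   && \text{(exponent minimized at } s = t/\sigma^{2}\text{)}.
\end{align*}
```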
|
| dbp:style
| |
| dbp:ta
| |
| dbp:title
| |
| dbp:wikiPageUsesTemplate
| |
| dct:subject
| |
| rdfs:label
|
- Sub-Gaussian distribution (en)
- Sub-гауссівский розподіл (uk)
|
| owl:sameAs
| |
| prov:wasDerivedFrom
| |
| foaf:depiction
| |
| foaf:isPrimaryTopicOf
| |
| is dbo:wikiPageRedirects of
| |
| is dbo:wikiPageWikiLink of
| |
| is foaf:primaryTopic of
| |