Questions tagged [parameter-estimation]

Questions about parameter estimation. Estimation theory is a branch of statistics that deals with estimating the values of parameters based on measured/empirical data that has a random component. (Def: http://en.m.wikipedia.org/wiki/Estimation_theory)

The parameters describe an underlying physical setting in such a way that their value affects the distribution of the measured data. An estimator attempts to approximate the unknown parameters using the measurements.

1176 questions
20 votes, 4 answers

Maximum likelihood estimation of $a,b$ for a uniform distribution on $[a,b]$

I'm supposed to calculate the MLEs for $a$ and $b$ from a random sample $(X_1,\ldots,X_n)$ drawn from a uniform distribution on $[a,b]$. But the likelihood function $\mathcal{L}(a,b)=\frac{1}{(b-a)^n}$ is constant, so how do I find a maximum? Would…
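A quick way to see the resolution: the likelihood equals $1/(b-a)^n$ only on the region where $a\le\min_i X_i$ and $b\ge\max_i X_i$ (and is zero otherwise), so it is maximized by shrinking the interval as far as the data allow, giving $\hat a=\min_i X_i$, $\hat b=\max_i X_i$. A minimal numerical sketch of that idea (the sample below is invented for illustration):

```python
# Minimal sketch: MLE of (a, b) for Uniform[a, b] data.
# The likelihood (b - a)^(-n) on {a <= min(x), b >= max(x)} is maximized
# by making the interval as short as the data permit, so
# a_hat = min(x_i), b_hat = max(x_i).
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(2.0, 5.0, size=200)   # invented sample with true a=2, b=5

a_hat, b_hat = x.min(), x.max()
print(a_hat, b_hat)                   # close to, but inside, [2, 5]
```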
12 votes, 2 answers

simple example of recursive least squares (RLS)

I'm vaguely familiar with recursive least squares algorithms; all the information about them I can find is in the general form with vector parameters and measurements. Can someone point me towards a very simple example with numerical data, e.g. $y =…
Jason S
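For readers looking for the kind of toy example this question asks about, here is a hedged sketch of scalar recursive least squares for an assumed model $y_k=\theta u_k+\text{noise}$; the data, forgetting factor, and initial values are all invented for illustration:

```python
# Sketch of scalar recursive least squares (RLS) for y_k = theta * u_k + noise,
# with forgetting factor lam = 1 (i.e. ordinary least squares, computed recursively).
import numpy as np

rng = np.random.default_rng(1)
theta_true = 2.5
u = rng.normal(size=50)                      # regressors
y = theta_true * u + 0.1 * rng.normal(size=50)

theta, P, lam = 0.0, 1e3, 1.0                # initial guess, large initial "covariance"
for uk, yk in zip(u, y):
    K = P * uk / (lam + uk * P * uk)         # gain
    theta += K * (yk - uk * theta)           # correct estimate using the prediction error
    P = (P - K * uk * P) / lam               # update covariance
print(theta)                                  # ≈ 2.5
```

The same recursion with vectors in place of scalars recovers the general form the question mentions.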
10 votes, 1 answer

Estimating Parameter - What is the qualitative difference between MLE fitting and Least Squares CDF fitting?

Given a parametric pdf $f(x;\lambda)$ and a set of data $\{ x_k \}_{k=1}^n$, here are two ways of formulating a problem of selecting an optimal parameter vector $\lambda^*$ to fit to the data. The first is maximum likelihood estimation (MLE):…
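To make the qualitative contrast concrete, here is a small illustrative sketch (the exponential sample and the optimizer bounds are assumptions) fitting the same data once by maximum likelihood and once by least squares on the CDF:

```python
# MLE maximizes sum(log f(x_k; lam)); the CDF fit minimizes the squared distance
# between the model CDF and the empirical CDF at the sorted data points.
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(2)
x = np.sort(rng.exponential(scale=1 / 3.0, size=500))    # invented data, true rate 3
n = len(x)

lam_mle = 1.0 / x.mean()                                  # closed-form MLE for the rate

ecdf = (np.arange(1, n + 1) - 0.5) / n                    # empirical CDF heights
obj = lambda lam: np.sum((1 - np.exp(-lam * x) - ecdf) ** 2)
lam_cdf = minimize_scalar(obj, bounds=(1e-3, 20), method="bounded").x

print(lam_mle, lam_cdf)                                   # both near 3, generally not equal
```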
8 votes, 1 answer

Minimum variance unbiased estimator for scale parameter of a certain gamma distribution

Let $X_1, X_2, \ldots, X_n$ be a random sample from a distribution with p.d.f. $$f(x;\theta)=\theta^2 x e^{-x\theta};\quad 0<x<\infty,\ \theta>0.$$ Obtain the minimum variance unbiased estimator of $\theta$ and examine whether it is attained. MY WORK: Using MLE I…
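One standard route, offered here as a sketch rather than the worked answer: $T=\sum_i X_i$ is a complete sufficient statistic, and $(2n-1)/T$ is unbiased for $\theta$, so Lehmann–Scheffé points to it as the UMVUE. A Monte Carlo spot-check of the unbiasedness, with invented values of $\theta$ and $n$:

```python
# Monte Carlo check (not a proof) that (2n - 1) / sum(X) is unbiased for theta
# when X_i has density f(x; theta) = theta^2 x e^(-x theta), i.e. Gamma(shape=2, rate=theta).
import numpy as np

rng = np.random.default_rng(3)
theta, n, reps = 1.7, 10, 200_000
x = rng.gamma(shape=2.0, scale=1.0 / theta, size=(reps, n))   # scale = 1 / rate
est = (2 * n - 1) / x.sum(axis=1)
print(est.mean())                                             # ≈ 1.7
```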
7 votes, 1 answer

Improper Uniform Prior Distribution

In Bayesian inference, choosing the prior distribution is an important step. When choosing a prior, we use whatever prior knowledge we have to decide which distribution best suits the problem. By holding to Laplace…
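A small illustration of why improper flat priors are often tolerated: a flat prior on a normal mean still produces a proper posterior, $N(\bar x, \sigma^2/n)$, and a very diffuse proper prior gives essentially the same answer. The numbers below are assumptions chosen only to show the comparison:

```python
# Sketch: an improper flat prior on the mean mu of N(mu, sigma^2) (sigma known)
# still yields a proper posterior, N(xbar, sigma^2 / n); a very diffuse proper
# normal prior gives nearly the same posterior mean.
import numpy as np

rng = np.random.default_rng(4)
sigma, mu_true = 2.0, 1.3
x = rng.normal(mu_true, sigma, size=40)
n, xbar = len(x), x.mean()

post_mean_flat, post_sd_flat = xbar, sigma / np.sqrt(n)   # flat-prior posterior

tau = 1e6                                                  # nearly flat conjugate prior N(0, tau^2)
w = (n / sigma**2) / (n / sigma**2 + 1 / tau**2)
post_mean_proper = w * xbar
print(post_mean_flat, post_mean_proper, post_sd_flat)
```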
7 votes, 1 answer

MLE for Uniform $(0,\theta)$

I am a bit confused about the derivation of the MLE for Uniform$(0,\theta)$. I understand that $L(\theta)={\theta}^{-n}$ is a decreasing function and that to find the MLE we want to maximize the likelihood function. What is confusing me is that if a function…
7 votes, 1 answer

Finding UMVUE of $\theta$ when the underlying distribution is exponential distribution

Hi, I'm solving some exercise problems in my text, "A Course in Mathematical Statistics". I'm in the chapter "Point estimation" now, and I want to find a UMVUE of $\theta$ where $X_1,\ldots,X_n$ are i.i.d. random variables with the p.d.f. $f(x;…
6 votes, 2 answers

Parameter estimation for Stochastic differential equation

I have a process $X(t)$ defined on some finite time horizon $[0,T]$ and I know that my process satisfies the following SDE: $dX(t)=\mu X_t\,dt + \sigma X_t\,dB_t$, where $B$ is a standard Brownian motion. In particular I'm assuming that both the…
sigmatau
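As a rough illustration of what estimation for this geometric-Brownian-motion model typically looks like, here is a sketch using the fact that the log-increments are i.i.d. $N((\mu-\sigma^2/2)\Delta t,\ \sigma^2\Delta t)$; all numerical settings are invented, and it also shows the usual caveat that $\sigma$ is far easier to pin down than $\mu$ over a fixed horizon:

```python
# Sketch: estimating mu and sigma of dX_t = mu X_t dt + sigma X_t dB_t
# from a discretely observed path, via the log-increments.
import numpy as np

rng = np.random.default_rng(5)
mu, sigma, dt, n = 0.2, 0.5, 1 / 252, 252 * 40            # long path to tame the drift error
z = rng.normal(size=n)
logx = np.cumsum((mu - 0.5 * sigma**2) * dt + sigma * np.sqrt(dt) * z)

d = np.diff(np.concatenate(([0.0], logx)))                 # log-increments
sigma_hat = d.std(ddof=1) / np.sqrt(dt)
mu_hat = d.mean() / dt + 0.5 * sigma_hat**2
print(mu_hat, sigma_hat)                                    # sigma well estimated; mu only slowly
```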
6 votes, 1 answer

MLE (Maximum Likelihood Estimator) of Beta Distribution

Let $X_1,\ldots,X_n$ be i.i.d. random variables with a common density function given by: $f(x\mid\theta)=\theta x^{\theta-1}$ for $x\in[0,1]$ and $\theta>0$. Clearly this is a $\operatorname{BETA}(\theta,1)$ distribution. Calculate the maximum…
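Sketch of the closed form this question is after: the log-likelihood is $n\log\theta+(\theta-1)\sum_i\log X_i$, so setting its derivative to zero gives $\hat\theta=-n/\sum_i\log X_i$. A quick numerical check with an invented true $\theta$:

```python
# MLE for f(x | theta) = theta * x^(theta - 1) on [0, 1], i.e. Beta(theta, 1):
# log L = n log(theta) + (theta - 1) * sum(log x)  =>  theta_hat = -n / sum(log x).
import numpy as np

rng = np.random.default_rng(6)
theta = 4.0
x = rng.beta(theta, 1.0, size=1000)        # Beta(theta, 1) sample
theta_hat = -len(x) / np.log(x).sum()
print(theta_hat)                            # ≈ 4
```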
6 votes, 1 answer

Asymptotic distribution for MLE of exponential distribution

Let $X$ have an exponential distribution with parameter $\theta$ (pdf is $f(x, \theta) = \theta e^{-\theta x}$). I already found that the MLE for $\theta$ after $n$ observations is $$\hat{\theta}_{MLE} = \bar{X}^{-1} =…
njaja
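For context, the usual conclusion is $\sqrt{n}\,(\hat\theta-\theta)\xrightarrow{d}N(0,\theta^2)$, since the Fisher information of the exponential rate is $1/\theta^2$. A Monte Carlo sketch consistent with that (the sample size, replication count, and $\theta$ are arbitrary choices):

```python
# Check that sqrt(n) * (theta_hat - theta) is approximately N(0, theta^2)
# for theta_hat = 1 / mean(X), with X_i ~ Exponential(rate = theta).
import numpy as np

rng = np.random.default_rng(7)
theta, n, reps = 2.0, 500, 10_000
x = rng.exponential(scale=1 / theta, size=(reps, n))
theta_hat = 1.0 / x.mean(axis=1)
z = np.sqrt(n) * (theta_hat - theta)
print(z.mean(), z.std())                   # ≈ 0 and ≈ theta = 2
```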
5 votes, 1 answer

Estimating a Gaussian distribution from a GMM

Suppose that we have a Gaussian mixture model (GMM) in n-dimensional space: $$P_1(x) = \sum_{i=1}^{C}\pi(c_i)\mathcal{N}(\mu_i,\Sigma_i)$$ We want to estimate a single Gaussian distribution from this set. $$P_2(x) = \mathcal{N}(\mu,\Sigma)$$ For…
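One common answer is moment matching: take $\mu=\sum_i\pi_i\mu_i$ and $\Sigma=\sum_i\pi_i\big(\Sigma_i+(\mu_i-\mu)(\mu_i-\mu)^\top\big)$, which for a Gaussian target also minimizes $\mathrm{KL}(P_1\,\|\,P_2)$. A small sketch with invented mixture parameters:

```python
# Collapse a Gaussian mixture to the single Gaussian matching its first two moments.
import numpy as np

pis = np.array([0.3, 0.7])                          # mixture weights (invented)
mus = np.array([[0.0, 0.0], [2.0, 1.0]])            # component means
Sigmas = np.array([np.eye(2), np.diag([0.5, 2.0])]) # component covariances

mu = (pis[:, None] * mus).sum(axis=0)
diffs = mus - mu
Sigma = sum(p * (S + np.outer(d, d)) for p, S, d in zip(pis, Sigmas, diffs))
print(mu, Sigma, sep="\n")
```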
5 votes, 2 answers

Sufficient statistic for normal distribution with known variance.

Let $X$ be from a normal distribution $N(\theta,1)$. a) Find a sufficient statistic for $\theta$. b) Is $S_n^2$ a sufficient statistic for $\theta$? My answers: For part a), since the joint p.d.f. is $\frac{1}{(2\pi)^{n/2}}e^{-\frac{1}{…
clarkson
5 votes, 1 answer

Fisher information in a one-parameter exponential family.

We define the one-parameter exponential family of distribution functions as those whose pmf/pdf can be written as $$\exp\{c(\theta)T(x) + d(\theta) + s(x)\}$$ I would like to show that if $c$ is twice differentiable with a positive derivative and…
5 votes, 2 answers

Efficiency of $\hat{\theta}_{MLE}$ from $\operatorname{Beta}(\theta,1)$

I am working on a problem which asks me to discuss the efficiency of the MLE $\hat{\theta}$ given that $X_1,\ldots,X_n \sim_{iid} \operatorname{Beta}(\theta,1) $. I was able to deduce that $$\hat{\theta} = \frac{n}{-\sum_{i=1}^n \ln X_i}$$ and that…
hyg17