[Community Question] Statistics: Can a summation be transferred into the denominator?

One of our users asked:

I include a bit of introduction, even though my main question is more mathematical. I was tasked with finding the maximum-likelihood estimate of $\theta$ in $$\mathrm P(X>x) = \left(\frac ax \right)^\theta,$$ where $X$ is a random variable and $x$ is a value that variable can take. Writing $F(x) = \mathrm P(X>x)$ for the survival function, the probability density function is $\newcommand\diff[2]{\frac{\mathrm d#1}{\mathrm d#2}}f(x) = -\diff Fx=\frac{\theta a^\theta}{x^{\theta + 1}}$. Maximising the log-likelihood of a single observation, $l = \ln\theta + \theta \ln a - (\theta + 1)\ln x$, gives $\hat\theta(x_i) = \frac 1{\ln x_i - \ln a}$, where the hat indicates that $\hat\theta$ is an estimate of $\theta$ based on the data sample. Now, the answer is supposed to be $$\hat\theta = \frac 1{\overline {\ln x} - \ln a},$$ where the bar denotes the sample average: $\overline{\ln x} = \frac 1n \sum_i \ln x_i$. I am stumped as to how to get this answer directly from $\hat\theta(x_i)$.

Does $$\frac 1n \sum_i \frac 1{\ln x_i - \ln a} = \frac 1{\overline {\ln x} - \ln a}\qquad ?$$

I think $\frac 1n \sum_i \frac 1{\hat\theta(x_i)} = \frac 1n\sum_i (\ln x_i - \ln a) =\overline{\ln x} - \ln a = \widehat {\frac 1\theta} \implies \hat\theta = \frac 1{\overline{\ln x} - \ln a}$, but is this the only way to show the above?
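For what it's worth, a quick numerical sanity check (with made-up values for $a$ and the sample, so purely illustrative) suggests that averaging the per-observation estimates does *not* reproduce the right-hand side in general, whereas maximising the joint likelihood of all $n$ observations does:

```python
import math

a = 1.0
xs = [2.0, 3.0, 5.0]  # hypothetical sample values, not from the original post
n = len(xs)
log_a = math.log(a)
mean_log_x = sum(math.log(x) for x in xs) / n

# Average of the per-observation estimates 1/(ln x_i - ln a)
avg_of_estimates = sum(1.0 / (math.log(x) - log_a) for x in xs) / n

# Proposed answer: 1 / (mean(ln x) - ln a)
theta_hat = 1.0 / (mean_log_x - log_a)

# MLE from the joint log-likelihood of all n observations:
#   l(theta) = n ln(theta) + n*theta*ln(a) - (theta + 1) * sum(ln x_i)
#   l'(theta) = 0  =>  theta = n / (sum(ln x_i) - n ln a)
joint_mle = n / (sum(math.log(x) for x in xs) - n * log_a)

print(avg_of_estimates)  # differs from theta_hat
print(theta_hat)
print(joint_mle)         # equals theta_hat exactly (same formula, rearranged)
```

So the displayed identity fails because the average of reciprocals is not the reciprocal of the average; the intended route is to maximise the joint likelihood, not to average single-observation estimates.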
