[Community Question] Calculus: Asymptotic behavior of $\sum\limits_{n=0}^{\infty}x^{b^n}$

One of our users asked:

This follows my previous post here, where Song proved that $\forall b>1,\lim\limits_{x\to 1^{-}}\frac{1}{\ln(1-x)}\sum\limits_{n=0}^{\infty}x^{b^n}=-\frac{1}{\ln(b)}$, that is to say: $$\forall b>1,\sum\limits_{n=0}^{\infty}x^{b^n}=-\log_b(1-x)+o_{x\to1^-}\left(\log_b(1-x)\right)$$ (Here $o_{x\to1^-}\left(\log_b(1-x)\right)$ denotes a function that is asymptotically smaller than $\log_b(1-x)$ when $x\to1^{-}$, that is to say whose quotient by $\log_b(1-x)$ converges to $0$ as $x\to1^{-}$; see little-o notation.)
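For context, Song's limit is easy to check numerically. Here is a minimal Python sketch (the helper name `partial_sum` and the truncation tolerance `tol` are my own illustrative choices, not part of the argument):

```python
import math

def partial_sum(x, b, tol=1e-18):
    """Sum x^(b^n) over n >= 0, truncating once the terms drop below tol;
    the terms decay double-exponentially once b^n exceeds 1/(-ln x)."""
    s, n = 0.0, 0
    while True:
        term = x ** (b ** n)
        if term < tol:
            return s
        s += term
        n += 1

b = 2.0
print(f"target: -1/ln(b) = {-1 / math.log(b):.6f}")
for x in (0.9, 0.99, 0.999, 0.9999, 0.99999):
    print(f"x = {x}:  ratio = {partial_sum(x, b) / math.log(1 - x):.6f}")
```

As $x$ gets closer to $1$, the printed ratio approaches $-1/\ln(2)\approx-1.442695$, as the theorem predicts.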

So we have a first asymptotic approximation of $\sum\limits_{n=0}^{\infty}x^{b^n}$.

I now want to take it one step further and refine the asymptotic behaviour, by proving a stronger result which I conjecture to be true (backed by numerical simulations; a sketch of such a check is given below): $$\sum\limits_{n=0}^{\infty}x^{b^n}=-\log_b(1-x)+O_{x\to1^-}\left(1\right)$$

(Here $O_{x\to1^-}\left(1\right)$ denotes a function that is asymptotically bounded when $x\to1^-$; see big-O notation.)

In other words, we want to go from:

"this sum is $-\log_b(1-x)$ + something that is asymptotically smaller than $\log_b(1-x)$ when $x\to1^-$"

to:

"this sum is $-\log_b(1-x)$ + something that is asymptotically bounded when $x\to1^-$"

And since $\log_b(1-x)$ diverges to $-\infty$ when $x\to1^-$, this is indeed a much more precise evaluation of the asymptotic behaviour!
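Here is the kind of numerical experiment behind the conjecture, as a Python sketch (the helper name `f_of_eps` and the tolerance `tol` are illustrative choices, with $b=2$ for concreteness; writing $\varepsilon=1-x$ avoids cancellation when computing $\log_b(1-x)$):

```python
import math

def f_of_eps(eps, b, tol=1e-18):
    """Evaluate f(x) = sum_{n>=0} x^(b^n) + log_b(1-x) at x = 1 - eps,
    truncating the sum once its terms drop below tol."""
    x = 1.0 - eps
    s, n = 0.0, 0
    while True:
        term = x ** (b ** n)
        if term < tol:
            break
        s += term
        n += 1
    return s + math.log(eps) / math.log(b)  # log_b(1-x) = ln(eps)/ln(b)

b = 2.0
for k in range(1, 13):
    # bounded output across k supports the O(1) conjecture (no proof, of course)
    print(f"1 - x = 1e-{k:02d}:  f(x) = {f_of_eps(10.0 ** -k, b):+.4f}")
```

The printed values stay within a small window even as $1-x$ shrinks by twelve orders of magnitude, which is what suggests the $O_{x\to1^-}(1)$ bound.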

Now, the way to go would be to show that $\sum\limits_{n=0}^{\infty}x^{b^n}+\log_b(1-x)$ is asymptotically bounded when $x\to1^-$, that is to say that $\exists M>0, \exists x_0\in\left(0,1\right) \text{ such that }\forall x\in\left[x_0,1\right), \left|\sum\limits_{n=0}^{\infty}x^{b^n}+\log_b(1-x)\right|\leqslant M$.
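One reformulation that might be a useful first step (a sketch of a standard substitution, not a solution): setting $t=-\ln x$, so that $x\to1^-$ corresponds to $t\to0^+$ and $x^{b^n}=e^{-tb^n}$, we can write $$\sum\limits_{n=0}^{\infty}x^{b^n}+\log_b(1-x)=\left(\sum\limits_{n=0}^{\infty}e^{-tb^n}-\log_b\frac{1}{t}\right)+\log_b\frac{1-x}{t}$$ and since $\frac{1-x}{t}\to1$ when $x\to1^-$, the last term tends to $0$; the conjecture is therefore equivalent to the boundedness of $\sum\limits_{n=0}^{\infty}e^{-tb^n}-\log_b\frac{1}{t}$ when $t\to0^+$.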

And to be honest, I'm kind of stuck. Any suggestions?

