Wednesday 25 September 2019

calculus - How to determine with certainty that a function has no elementary antiderivative?




Given an expression such as $f(x) = x^x$, is it possible to provide a thorough and rigorous proof that there is no function $F(x)$ (expressible in terms of known algebraic and transcendental functions) such that $ \frac{d}{dx}F(x) = f(x)$? In other words, how can you rigorously prove that $f(x)$ does not have an elementary antiderivative?


Answer



To get some background and references you may start with this SE thread.



Concerning your specific question about $z^z$, here is an extract from a sci.math answer by Matthew P Wiener:



"Finally, we consider the case $I(z^z)$.



So this time, let $F=C(z,l)(t)$, the field of rational functions in $z,l,t$, where $l=\log z\,$ and $t=\exp(z\,l)=z^z$. Note that $z,l,t$ are algebraically independent. (Choose some appropriate domain of definition.) Then $t'=(1+l)t$, so for $a=t$ in the above situation, the partial fraction analysis (of the sort done in the previous posts) shows that the only possibility is for $v=wt+\cdots$ to be the source of the $t$ term on the left, with $w$ in $C(z,l)$.



So this means, equating $t$ coefficients, $1=w'+(l+1)w$. This is a first order ODE, whose solution is $w=\frac{I(z^z)}{z^z}$. So we must prove that no such $w$ exists in $C(z,l)$. So suppose (as in one of Ray Steiner's posts) $w=\frac PQ$, with $P,Q$ in $C[z,l]$ and having no common factors. Then $z^z=\left(z^z\cdot \frac PQ\right)'=z^z\cdot\frac{[(1+l)PQ+P'Q-PQ']}{Q^2}$, or $Q^2=(1+l)PQ+P'Q-PQ'$.



So $Q|Q'$, meaning $Q$ is a constant, which we may assume to be one. So we have it down to $P'+P+lP=1$.




Let $P=\sum_{i=0}^n P_i l^i$, with $P_i \in C[z]$ for $i=0,\ldots,n$. But then in our equation, there's a dangling $P_n l^{n+1}$ term, a contradiction."



$$-$$



For future reference, here is a complete re-transcript of Matthew P Wiener's $1997$ sci.math article (converted to $\LaTeX$ by myself: feel free to fix it!).
A neat French translation by Denis Feldmann is available at his homepage.






What's the antiderivative of $\ e^{-x^2}\ $? of $\ \frac{\sin(x)}x\ $? of $\ x^x\ $?







These, and some similar problems, can't be done.



More precisely, consider the notion of "elementary function". These are the functions that can be expressed in terms of exponentials and logarithms, via the usual algebraic processes, including the solving (with or without radicals) of polynomials. Since the trigonometric functions and their inverses can be expressed in terms of exponentials and logarithms using the complex numbers $\mathbb{C}$, these too are elementary.



The elementary functions are, so to speak, the "precalculus functions".



Then there is a theorem that says certain elementary functions do not have an elementary antiderivative. They still have antiderivatives, but "they can't be done". The more common ones get their own names. Up to some scaling factors, "$\mathrm{erf}$" is the antiderivative of $e^{-x^2}$ and "$\mathrm{Si}$" is the antiderivative of $\frac{\sin(x)}x$, and so on.
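
These named antiderivatives are easy to see in a computer algebra system. A brief illustration (this aside assumes SymPy and is not part of the original article):

```python
# Hedged illustration, assuming SymPy: the "named" non-elementary
# antiderivatives appear directly, while x**x comes back unevaluated.
from sympy import symbols, integrate, exp, sin

x = symbols('x')
print(integrate(exp(-x**2), x))  # sqrt(pi)*erf(x)/2, the scaling factor noted above
print(integrate(sin(x)/x, x))    # Si(x)
print(integrate(x**x, x))        # Integral(x**x, x): returned unevaluated
```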






For those with a little bit of undergraduate algebra, we sketch a proof of these, and a few others, using the notion of a differential field. These are fields $(F,+,\cdot,1,0)$ equipped with a derivation, that is, a unary operator $'$ satisfying $(a+b)'=a'+b'$ and $(a\cdot b)'=a\cdot b'+a'\cdot b$. Given a differential field $F$, there is a subfield $\mathrm{Con}(F)=\{a:a'=0\}$, called the constants of $F$. We let $I(f)$ denote an antiderivative; we ignore the $+C$'s.
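
As a quick sanity check (my aside, assuming SymPy), $\frac d{dz}$ on rational functions is a derivation in exactly this sense, with the scalars among its constants:

```python
# Minimal sketch: d/dz on C(z) satisfies the two derivation axioms.
from sympy import symbols, diff, simplify

z = symbols('z')
a, b = 1/(z**2 + 1), (z - 3)/z
assert simplify(diff(a + b, z) - (diff(a, z) + diff(b, z))) == 0    # additivity
assert simplify(diff(a*b, z) - (a*diff(b, z) + diff(a, z)*b)) == 0  # Leibniz rule
assert diff(7, z) == 0  # scalars lie in Con(C(z))
```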




Most examples in practice are subfields of $M$, the meromorphic functions on $\mathbb{C}$ (or some domain). Because of uniqueness of analytic extensions, one rarely has to specify the precise domain.



Given differential fields $F$ and $G$, with $F$ a subfield of $G$, one calls $G$ an algebraic extension of $F$ if $G$ is a finite field extension of $F$.



One calls $G$ a logarithmic extension of $F$ if $G=F(t)$ for some transcendental
$t$ that satisfies $t'=\dfrac{s'}s$, some $s$ in $F$. We may think of $t$ as $\;\log s$, but note that we are not actually talking about a logarithm function on $F$.
We simply have a new element with the right derivative. Other "logarithms" would have to be adjoined as needed.



Similarly, one calls $G$ an exponential extension of $F$ if $G=F(t)$ for some transcendental $t$ that satisfies $t'=t.s'$, some $s$ in $F$. Again, we may think of $t$ as $\;\exp s$, but there is no actual exponential function on $F$.




Finally, we call $G$ an elementary differential extension of $F$ if there is a finite chain of subfields from $F$ to $G$, each an algebraic, logarithmic, or exponential extension of the next smaller field.



The following theorem, in the special case of $M$, is due to Liouville.
The algebraic generality is due to Rosenlicht. More powerful theorems have been proven by Risch, Davenport, and others, and are at the heart of symbolic integration packages.
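
As an aside (not part of the article), SymPy ships a partial implementation of the Risch algorithm, and it can return an integral that is provably non-elementary:

```python
# Hedged aside, assuming SymPy: risch_integrate can certify non-elementarity.
from sympy import symbols, exp
from sympy.integrals.risch import risch_integrate, NonElementaryIntegral

x = symbols('x')
result = risch_integrate(exp(x**2), x)
print(result)  # displayed as Integral(exp(x**2), x)
print(isinstance(result, NonElementaryIntegral))  # True: proven non-elementary
```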



A short proof, accessible to those with a solid background in undergraduate algebra, can be found in Rosenlicht's AMM paper (see references). It is probably easier to master its applications first, which often use similar techniques, and then learn the proof.







MAIN THEOREM. Let $F,G$ be differential fields, let $a$ be in $F$, let $y$ be in $G$, and suppose $y'=a$, $G$ is an elementary differential extension field of $F$, and $\mathrm{Con}(F)=\mathrm{Con}(G)$. Then there exist $c_1,\ldots,c_n \in \mathrm{Con}(F)$ and $u_1,\ldots,u_n, v\in F$ such that



$$a = c_1\frac{u_1'}{u_1}+ \cdots + c_n\frac{u_n'}{u_n}+ v'$$



That is, the only functions that have elementary antiderivatives are the ones that have this very specific form. In words, elementary integrals always consist of a function at the same algebraic "complexity" level as the starting function (the $v$), along with the logarithms of functions at the same algebraic "complexity" level (the $u_i$'s).






This is a very useful theorem for proving non-integrability. Because this topic is of interest, but it is only written up in bits and pieces, I give numerous examples. (Since the original version of this FAQ from way back when, two how-to-work-it write-ups have appeared. See Fitt & Hoare and Marchisotto & Zakeri in the references.)




In the usual case, $F,G$ are subfields of $M$, so $\mathrm{Con}(F)=\mathrm{Con}(G)$ always holds, both being $\mathbb{C}$. As a side comment, we remark that this equality is necessary.
Over $\mathbb{R}(x)$, $\frac 1{1+x^2}$ has an elementary antiderivative, but none of the above form: $\arctan(x)=\frac 1{2i}\log\frac{1+ix}{1-ix}$ needs the new constant $i$.



We first apply this theorem to the case of integrating $f\cdot\exp(g)$, with $f$ and $g$ rational functions. If $g$ is constant, this is just a constant multiple of $f$, which can be integrated via partial fractions. So assume $g$ is nonconstant. Let $t=\exp(g)$, so $t'=g't$.
Since $g$ is not constant, it has a pole somewhere (perhaps out at infinity), so $\exp(g)$ has an essential singularity, and thus $t$ is transcendental over $C(z)$. Let $F=C(z)(t)$, and let $G$ be an elementary differential extension containing an antiderivative for $f\cdot t$.



Then Liouville's theorem applies, so we can write



$$f\cdot t = c_1\frac{u_1'}{u_1} + \cdots + c_n \frac{u_n'}{u_n} + v'$$




with the $c_i$ constants and the $u_i$ and $v$ in $F$. Each $u_i$ is a ratio of two $C(z)[t]$ polynomials, $\dfrac UV$ say. But $\dfrac {(U/V)'}{U/V}=\dfrac {U'}U-\dfrac{V'}V$ (quotient rule), so we may rewrite the above and assume each $u_i$ is in $C(z)[t]$.
And if any $u_i=U\cdot V$ factors, then $\dfrac{(U\cdot V)'}{(U\cdot V)}=\dfrac {U'}U+\dfrac {V'}V$ and so we can further assume each $u_i$ is irreducible over $C(z)$.
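
Both logarithmic-derivative identities are easy to verify symbolically; here is a spot check with concrete $U,V$ (my illustration, not from the article):

```python
# Check (U/V)'/(U/V) = U'/U - V'/V and (U*V)'/(U*V) = U'/U + V'/V.
from sympy import symbols, diff, simplify

z = symbols('z')
U, V = z**2 + 1, z - 3
logd = lambda f: diff(f, z)/f  # logarithmic derivative
assert simplify(logd(U/V) - (logd(U) - logd(V))) == 0
assert simplify(logd(U*V) - (logd(U) + logd(V))) == 0
```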



What does a typical $\frac {u'}u$ look like? For example, consider the case of $u$ quadratic in $t$. If $A,B,C$ are rational functions in $C(z)$, then $A',B',C'$ are also rational functions in $C(z)$ and



\begin{align}
\frac {(A\cdot t^2+B\cdot t+C)'}{A\cdot t^2+B\cdot t+C} &= \frac{A'\cdot t^2 + 2At(g't) + B'\cdot t + B\cdot(g't) + C'}{A\cdot t^2 + B\cdot t + C}\\
&= \frac{(A'+2Ag')\cdot t^2 + (B'+Bg')\cdot t + C'}{A\cdot t^2 + B\cdot t + C}\\
\end{align}




(Note that contrary to the usual situation, the degree of a polynomial in $t$ stays the same after differentiation. That is because we are taking derivatives with respect to $z$, not $t$. If we write this out explicitly, we get $(t^n)' = \exp(ng)' = ng'\cdot \exp(ng) = ng'\cdot t^n$.)



In general, each $\frac {u'}u$ is a ratio of polynomials of the same degree. We can, by doing one step of a long division, also write it as $D+\frac Ru$, for some $D \in C(z)$ and $R \in C(z)[t]$, with $\deg(R)<\deg(u)$.



By taking partial fractions, we can write $v$ as a sum of a $C(z)[t]$ polynomial and some fractions $\frac P{Q^n}$ with $\deg(P)<\deg(Q)$, $Q$ irreducible, and each $P,Q \in C(z)[t]$. Thus $v'$ will be a polynomial plus partial-fraction-like terms.



Somehow, this is supposed to come out to just $f\cdot t$. By the uniqueness of partial fraction decompositions, all terms other than multiples of $t$ add up to $0$. Only the polynomial part of $v$ can contribute to $f\cdot t$, and this must be a monomial over $C(z)$. So $f\cdot t=(h\cdot t)'$, for some rational $h$. (The temptation to assert $v=h\cdot t$ here is incorrect, as there could be some $C(z)$ term, cancelled by $\frac {u'}u$ terms. We only need to identify the terms in $v$ that contribute to $f\cdot t$, so this does not matter.)




Summarizing, if $f\cdot \exp(g)$ has an elementary antiderivative, with $f$ and $g$ rational functions and $g$ nonconstant, then it is of the form $h\cdot \exp(g)$, with $h$ rational.



We work out particular examples, of this and related applications. A bracketed function can be reduced to the specified example by a change of variables.



$\quad\boxed{\displaystyle\exp\bigl(z^2\bigr)}$ $\quad\left[\sqrt{z}\cdot\exp(z),\frac{\exp(z)}{\sqrt{z}}\right]$



Let $h\cdot \exp\bigl(z^2\bigr)$ be its antiderivative. Then $h'+2zh=1$.
Solving this ODE gives $h=\exp(-z^2)\cdot I\left(\exp\bigl(z^2\bigr)\right)$, which has no pole (except perhaps at infinity), so $h$, if rational, must be a polynomial. But the derivative of $h$ cannot cancel the leading power of $2zh$, contradiction.
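
The degree bookkeeping behind that contradiction can be spot-checked symbolically (a sketch with a generic cubic, assuming SymPy; the same count holds in every degree):

```python
# For a polynomial h of degree n, 2*z*h has degree n+1 while h' has degree
# n-1, so h' + 2*z*h = 1 forces an impossible cancellation of the top term.
from sympy import symbols, diff, Poly

z, a0, a1, a2, a3 = symbols('z a0 a1 a2 a3')
h = a0 + a1*z + a2*z**2 + a3*z**3            # generic cubic, as an example
print(Poly(diff(h, z) + 2*z*h, z).degree())  # 4 = deg(h) + 1, never 0
```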



$\quad\boxed{\displaystyle\frac{\exp(z)}z}$ $\quad\left[\exp(\exp(z)),\frac 1{\log(z)}\right]$




Let $h\cdot \exp(z)$ be an antiderivative. Then $h'+h=\frac 1z$. I know of two quick ways to prove that $h$ is not rational.



One can explicitly solve the first order ODE (getting $\exp(-z)\cdot I\left(\frac{\exp(z)}z\right)$), and then notice that the solution has a logarithmic singularity at zero.
For example, $h(z)\to\infty$ but $\sqrt{z}\cdot h(z)\to 0$ as $z\to 0$. No rational function does this.



Or one can assume $h$ has a partial fraction decomposition. Obviously no
$h'$ term will give $\frac 1z$, so $\frac 1z$ must be present in $h$ already. But $\left(\frac 1z\right)'=-\frac 1{z^2}$,
and this is part of $h'$. So there is a $\frac 1{z^2}$ in $h$ to cancel this. But $\left(\frac 1{z^2}\right)'$ is $-\frac 2{z^3}$, and this is again part of $h'$. And again, something in $h$ cancels this, etc etc etc. This infinite regression is impossible.
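
Either way, the verdict can be cross-checked mechanically: SymPy's partial Risch implementation certifies the bracketed variant $\frac 1{\log(z)}$ (an aside assuming SymPy, not part of the article):

```python
# 1/log(z) reduces to exp(z)/z under z -> log(z); the Risch machinery
# certifies that its integral is non-elementary.
from sympy import symbols, log
from sympy.integrals.risch import risch_integrate, NonElementaryIntegral

z = symbols('z')
print(isinstance(risch_integrate(1/log(z), z), NonElementaryIntegral))  # True
```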




$\quad\boxed{\displaystyle\frac {\sin(z)}z}$ $\quad[\sin(\exp(z))]$



$\quad\boxed{\displaystyle\sin\bigl(z^2\bigr)}$ $\quad\left[\sqrt{z}\sin(z),\frac{\sin(z)}{\sqrt{z}}\right]$



Since $\sin(z)=\frac 1{2i}[\exp(iz)-\exp(-iz)]$, we merely rework the above $f\cdot \exp(g)$ result. Let $f$ be rational, let $t=\exp(iz)$ (so $\frac {t'}t=i$) and let $T=\exp(iz^2)$ (so $\frac{T'}T=2iz$) and we want an antiderivative of either $\frac 1{2i}f\cdot\left(t-\frac 1t\right)$ or $\frac 1{2i}f\cdot(T-\frac 1T)$. For the former, the same partial fraction results still apply in identifying $\frac 1{2i}f\cdot t=(h\cdot t)'=(h'+ih)\cdot t$, which can't happen for $f=\frac 1z$, as above. In the case of $f\cdot\sin\bigl(z^2\bigr)$, we want $\frac 1{2i}f\cdot T=(h\cdot T)'=(h'+2izh)\cdot T$, and again, this can't happen for $f=1$, as above.



Although done, we push this analysis further in the $f\cdot \sin(z)$ case, as there are extra terms hanging around. This time around, the conclusion gives an additional $\frac kt$ term inside $v$, so we have $-\frac 1{2i}\frac ft=\left(\frac kt\right)'=\frac{k'-ik}t$.
So the antiderivative of $\frac 1{2i}f\cdot\left(t-\frac 1t\right)$ is $h\cdot t+\frac kt$.



If $f$ is even and real, then $h$ and $k$ (like $t=\exp(iz)$ and $\frac 1t=\exp(-iz)$) are parity flips of each other, so (as expected) the antiderivative is even.
Letting $C=\cos(z)$, $S=\sin(z)$, $h=H+iF$ and $k=K+iG$, the real (and only) part of the antiderivative of $f\cdot\sin(z)$ is $(HC-FS)+(KC+GS)=(H+K)C+(G-F)S$.
So over the reals, we find that the antiderivative of (rational even)$\cdot\sin(x)$ is of the form (rational even)$\cdot\cos(x)+$(rational odd)$\cdot\sin(x)$.




A similar result holds for (odd)$\cdot\sin(x)$, (even)$\cdot\cos(x)$, (odd)$\cdot\cos(x)$.
And since a rational function is the sum of its (rational) even and odd parts, (rational)$\cdot\sin$ integrates to (rational)$\cdot\sin$ + (rational)$\cdot\cos$, or not at all.
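
A concrete instance of the pattern (my illustration): integrating (odd)$\cdot\sin$ lands exactly in the predicted (odd)$\cdot\cos$ + (even)$\cdot\sin$ shape.

```python
# (odd)*sin integrates to (odd)*cos + (even)*sin, matching the parity analysis.
from sympy import symbols, sin, integrate

x = symbols('x')
print(integrate(x*sin(x), x))  # -x*cos(x) + sin(x)
```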



Let's backtrack, and apply this to $\dfrac {\sin(x)}x$ directly, using reals only.
If it has an elementary antiderivative, it must be of the form $E\cdot S+O\cdot C$.
Taking derivatives gives $(E'-O)\cdot S+(E+O')\cdot C$. As with partial fractions, we have a unique $R(x)[S,C]$ representation here (this is a bit tricky, as $S^2=1-C^2$: this step can be proven directly or via solving for $t, \frac 1t$ coefficients over $C$). So $E'-O=\frac 1x$ and $E+O'=0$, or $O''+O=-\frac 1x$.
Expressing $O$ in partial fraction form, it is clear only $(-\frac 1x)$ in $O$ can contribute a $-\frac 1x$. So there is a $-\frac 2{x^3}$ term in $O''$, so there is a $\frac 2{x^3}$ term in $O$ to cancel it, and so on, an infinite regress. Hence, there is no such rational $O$.
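
The regress can be made tangible by truncating the partial fraction ansatz and watching the resulting linear system fail (an illustrative sketch assuming SymPy; every truncation depth $N$ gives the same empty answer):

```python
# Try O = c1/x + ... + cN/x**N in O'' + O = -1/x and match coefficients:
# the linear system is inconsistent, mirroring the infinite regress.
from sympy import symbols, together, numer, Poly, linsolve

x = symbols('x')
N = 6
c = symbols(f'c1:{N+1}')
O = sum(ci/x**(i + 1) for i, ci in enumerate(c))
eqs = Poly(numer(together(O.diff(x, 2) + O + 1/x)), x).all_coeffs()
print(linsolve(eqs, *c))  # EmptySet: no rational O exists
```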



$\quad\boxed{\displaystyle\frac{\arcsin(z)}z}$ $\quad[z\cdot\tan(z)]$




We consider the case where $F=C(z,Z)(t)$ as a subfield of the meromorphic functions on some domain, where $z$ is the identity function, $Z=\sqrt{1-z^2}$, and $t=\arcsin z$. Then $Z'=-\frac zZ$, and $t'=\frac 1Z$. We ask in the main theorem result if this can happen with $a=\frac tz$ and some field $G$. $t$ is transcendental over $C(z,Z)$, since it has infinitely many branch points.



So we consider the more general situation of $f(z)\cdot \arcsin(z)$ where $f(z)$ is rational in $z$ and $\sqrt{1-z^2}$. By letting $z=\frac {2w}{1+w^2}$, note that members of $C(z,Z)$ are always elementarily integrable.
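
For instance (an aside assuming SymPy, not from the article), such integrands do come out elementary:

```python
# Members of C(z, sqrt(1-z**2)) integrate in elementary terms; the
# substitution z = 2*w/(1 + w**2) rationalizes them.
from sympy import symbols, sqrt, integrate

z = symbols('z')
print(integrate(sqrt(1 - z**2), z))  # z*sqrt(1 - z**2)/2 + asin(z)/2
```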



Because $x^2+y^2-1$ is irreducible, $C[x,y]/(x^2+y^2-1)$ is an integral domain, $C(z,Z)$ is isomorphic to its field of quotients in the obvious manner, and $C(z,Z)[t]$ is a UFD whose field of quotients is amenable to partial fraction analysis in the variable $t$. What follows takes place at times in various $z$-algebraic extensions of $C(z,Z)$ (which may not have unique factorization), but the terms must combine to give something in $C(z,Z)(t)$, where partial fraction decompositions are unique, and hence the $t$ term will be as claimed.



Thus, if we can integrate $f(z)\cdot\arcsin(z)$, the main theorem gives $f\cdot t = \sum_i c_i\frac{u_i'}{u_i} + v'$.



The $u$ terms can, by logarithmic differentiation in the appropriate algebraic extension field (recall that roots are analytic functions of the coefficients, and $t$ is transcendental over $C(z,Z)$), be assumed to all be linear $t+r$, with $r$ algebraic over $z$. Then $\frac {u'}u=\frac {1/Z+r'}{t+r}$. When we combine such terms back in $C(z,Z)$, they don't form a $t$ term (nor any higher power of $t$, nor a constant).



Partial fraction decomposition of $v$ gives us a polynomial in $t$, with coefficients in $C(z,Z)$, plus multiples of powers of linear $t$ terms.
The latter don't contribute to a $t$ term, as above.



If the polynomial is linear or quadratic, say $v=g\cdot t^2 + h\cdot t + k$, then $v'=g'\cdot t^2 + \left(\frac{2g}Z+h'\right)\cdot t + \left(\frac hZ+k'\right)$. Nothing can cancel the $g'$, so $g$ is just a constant $c$. Then $\frac {2c}Z+h'=f$, or $I(f\cdot t)=2c\cdot I\left(\frac tZ\right)+I(h'\cdot t)=c\cdot t^2+I(h'\cdot t)$, since $\frac tZ=t\,t'$. The $I(h'\cdot t)$ can be integrated by parts. So the antiderivative works out to $c\cdot(\arcsin(z))^2 + h(z)\cdot \arcsin(z) - I\left(\frac{h(z)}{\sqrt{1-z^2}}\right)$, and as observed above, the latter is elementary.
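
As a quick check of this shape (my illustration, assuming SymPy): for $f=1$ the analysis gives $c=0$ and $h=z$, so the antiderivative should be $z\cdot\arcsin(z)-I\left(\frac z{\sqrt{1-z^2}}\right)=z\cdot\arcsin(z)+\sqrt{1-z^2}$, and indeed:

```python
# The f = 1 specialization of c*asin(z)**2 + h(z)*asin(z) - I(h/sqrt(1-z**2)).
from sympy import symbols, asin, integrate

z = symbols('z')
print(integrate(asin(z), z))  # z*asin(z) + sqrt(1 - z**2)
```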



If the polynomial is cubic or higher, let $v=A\cdot t^n+B\cdot t^{n-1}+\cdots$; then $v'=A'\cdot t^n + \left(\frac{nA}Z+B'\right)\cdot t^{n-1} +\cdots$, so $A$ must be a constant $c$. But then $\frac{nc}Z+B'=0$, so $B=-nct$ plus a constant, contradicting $B$ being in $C(z,Z)$.




In particular, since $\frac 1z + \frac c{\sqrt{1-z^2}}$ does not have a rational in "$z$ and/or $\sqrt{1-z^2}$" antiderivative, $\frac {\arcsin(z)}z$ does not have an elementary integral.



$\quad\boxed{\displaystyle z^z}$



In this case, let $F=C(z,l)(t)$, the field of rational functions in $z,l,t$, where $l=\log z$ and $t=\exp(z\,l)=z^z$. Note that $z,l,t$ are algebraically independent. (Choose some appropriate domain of definition.) Then $t'=(1+l)t$, so for $a=t$ in the above situation, the partial fraction analysis (of the sort done in the previous posts) shows that the only possibility is for $v=wt+\cdots$ to be the source of the $t$ term on the left, with $w$ in $C(z,l)$.



So this means, equating $t$ coefficients, $1=w'+(l+1)w$. This is a first order ODE, whose solution is $w=\frac{I(z^z)}{z^z}$. So we must prove that no such $w$ exists in $C(z,l)$. So suppose (as in one of Ray Steiner's posts) $w=P/Q$, with $P,Q$ in $C[z,l]$ and having no common factors. Then $z^z= \left(z^z\cdot \frac PQ\right)'=z^z\cdot\frac{[(1+l)PQ+P'Q-PQ']}{Q^2}$, or $Q^2=(1+l)PQ+P'Q-PQ'$.



So $Q|Q'$, meaning $Q$ is a constant, which we may assume to be one. So we have it down to $P'+P+lP=1$.




Let $P=\sum_{i=0}^n P_i l^i$, with $P_i \in C[z]$ for $i=0,\ldots,n$. But then in our equation, there's a dangling $P_n l^{n+1}$ term, a contradiction.
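
The dangling-term argument admits a direct symbolic spot check (a sketch with a quadratic $P$, assuming SymPy; the top coefficient survives in every degree):

```python
# Model the derivation on C(z, l) by D(f) = df/dz + (1/z)*df/dl, mirroring
# l = log(z).  For P of degree n in l, the l**(n+1) coefficient of
# P' + P + l*P is P_n, which the right-hand side 1 cannot supply.
from sympy import symbols, Function, diff, expand

z, l = symbols('z l')
P0, P1, P2 = [Function(f'P{i}')(z) for i in range(3)]
P = P0 + P1*l + P2*l**2
D = lambda f: diff(f, z) + diff(f, l)/z
expr = expand(D(P) + P + l*P)
print(expr.coeff(l, 3))  # P2(z): the dangling top term, forcing P2 = 0
```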






On a slight tangent, this theorem of Liouville will not tell you that Bessel functions are not elementary, since they are defined by second order ODEs. This can be proven using differential Galois theory. A variant of the above theorem of Liouville, with a different normal form, does show however that $J_0$ cannot be integrated in terms of elementary methods augmented with Bessel functions.






What follows is a fairly complete sketch of the proof of the Main Theorem.
First, I just state some easy (if you've had Galois Theory 101) lemmas.




Throughout the lemmas $F$ is a differential field, and $t$ is transcendental over $F$.




  • Lemma $1$: If $K$ is an algebraic extension field of $F$, then there exists a unique way to extend the derivation map from $F$ to $K$ so as to make $K$ into a differential field.

  • Lemma $2$: If $K=F(t)$ is a differential field with derivation extending $F$'s, and $t'$ is in $F$, then for any polynomial $f(t)$ in $F[t]$, $f(t)'$ is a polynomial in $F[t]$ of the same degree (if the leading coefficient is not in $\mathrm{Con}(F)$) or of degree one less (if the leading coefficient is in $\mathrm{Con}(F)$).

  • Lemma $3$: If $K=F(t)$ is a differential field with derivation extending $F$'s, and $\frac{t'}t$ is in $F$, then for any $a$ in $F$, $n$ a positive integer, there exists $h$ in $F$ such that $(a\cdot t^n)'=h\cdot t^n$. More generally, if $f(t)$ is any polynomial in $F[t]$, then $f(t)'$ is of the same degree as $f(t)$, and is a multiple of $f(t)$ iff $f(t)$ is a monomial.




These are all fairly elementary. For example, $(a\cdot t^n)'=\bigl(a'+na\,\frac {t'}t\bigr)\cdot t^n$ in lemma $3$. The final 'iff' in lemma $3$ is where transcendence of $t$ comes in. Lemma $1$ in the usual case of subfields of $M$ is an easy consequence of the implicit function theorem.
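
A small symbolic spot check of lemma $2$ (my illustration with $t=\log z$, so $t'=\frac 1z$ lies in $C(z)$):

```python
# Lemma 2 with t = log(z): a non-constant leading coefficient preserves the
# degree under differentiation; a constant leading coefficient drops it by one.
from sympy import symbols, log, diff, Poly

z, T = symbols('z T')  # T stands in for the transcendental t = log(z)
deg_after_diff = lambda e: Poly(diff(e, z).subs(log(z), T), T).degree()
print(deg_after_diff(z*log(z)**2))  # 2: leading coefficient z is not constant
print(deg_after_diff(3*log(z)**2))  # 1: constant leading coefficient
```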






MAIN THEOREM. Let $F,G$ be differential fields, let $a$ be in $F$, let $y$ be in $G$, and suppose $y'=a$, $G$ is an elementary differential extension field of $F$, and $\mathrm{Con}(F)=\mathrm{Con}(G)$. Then there exist $c_1,\ldots,c_n \in \mathrm{Con}(F)$ and $u_1,\ldots,u_n, v\in F$ such that



$$(*)\quad a = c_1\frac{u_1'}{u_1}+ \cdots + c_n\frac{u_n'}{u_n}+ v'$$



In other words, the only functions that have elementary antiderivatives are the ones that have this very specific form.







Proof:



By assumption there exists a finite chain of fields connecting $F$ to $G$ such that the extension from one field to the next is given by performing an algebraic, logarithmic, or exponential extension. We show that if the form $(*)$ can be satisfied with values in $F_2$, and $F_2$ is one of the three kinds of allowable extensions of $F_1$, then the form $(*)$ can be satisfied in $F_1$. The form $(*)$ is obviously satisfied in $G$: let all the $c$'s be $0$, the $u$'s be $1$, and let $v$ be the original $y$ for which $y'=a$. Thus, if the form $(*)$ can be pulled down one field, we will be able to pull it down to $F$, and the theorem holds.



So we may assume without loss of generality that $G=F(t)$.





  • Case $1$ : $t$ is algebraic over $F$. Say $t$ is of degree $k$. Then there are polynomials $U_i$ and $V$ such that $U_i(t)=u_i$ and $V(t)=v$. So we have $$a = c_1 \frac{U_1(t)'}{U_1(t)} +\cdots + c_n \frac{ U_n(t)'}{U_n(t)} + V(t)'$$ Now, by the uniqueness of extensions of derivations in the algebraic case, we may replace $t$ by any of its conjugates $t_1,\cdots, t_k,$ and the same equation holds. In other words, because $a$ is in $F$, it is fixed under the Galois automorphisms. Summing up over the conjugates, and converting the $\frac {U'}U$ terms into products using logarithmic differentiation, we have $$k a = c_1 \frac{[U_1(t_1)\times\cdots\times U_1(t_k)]'}{U_1(t_1)\times \cdots \times U_1(t_k)}+ \cdots + [V(t_1)+\cdots +V(t_k)]'$$ But the expressions in $[\cdots]$ are symmetric polynomials in the $t_i$, and as they are polynomials with coefficients in $F$, the resulting expressions are in $F$. So dividing by $k$ gives us $(*)$ holding in $F$.


  • Case $2$ : $t$ is logarithmic over $F$. Because of logarithmic differentiation we may assume that the $u$'s are monic, irreducible in $t$, and distinct.
    Furthermore, we may assume $v$ has been decomposed into partial fractions.
    The fractions can only be of the form $\dfrac f{g^j}$, where $\deg(f)<\deg(g)$ and $g$ is monic irreducible. The fact that no terms outside of $F$ appear on the left hand side of $(*)$, namely just $a$ appears, means a lot of cancellation must be occurring.



    Let $t'=\dfrac{s'}s$, for some $s$ in $F$. If $f(t)$ is monic in $F[t]$, then $f(t)'$ is also in $F[t]$, of one less degree. Thus $f(t)$ does not divide $f(t)'$. In particular, all the $\dfrac{u'}u$ terms are in lowest terms already. The $\dfrac f{g^j}$ terms in $v$ produce a $g^{j+1}$ denominator contribution in $v'$ of the form $-jf\dfrac{g'}{g^{j+1}}$.
    But $g$ doesn't divide $fg'$, so no cancellation occurs there. No $\dfrac{u'}u$ term can cancel it either, as the $u$'s are irreducible, and no term with denominator $g^{j+1}$ appears in $a$, because $a$ is a member of $F$. Thus no $\dfrac f{g^j}$ term occurs at all in $v$.
    But then none of the $u$'s can be outside of $F$, since nothing can cancel them. (Remember the $u$'s are distinct, monic, and irreducible.) Thus each of the $u$'s is in $F$ already, and $v$ is a polynomial. But $v' = a -$ expression in $u$'s, so $v'$ is in $F$ also. Thus $v = b t + c$ for some $b$ in $\mathrm{Con}(F)$, $c$ in $F$, by lemma $2$. Then $$a= c_1 \frac{u_1'}{u_1} +\cdots + c_n\frac{u_n'}{u_n} + b \frac{s'}s + c'$$ is the desired form. So case $2$ holds.


  • Case $3$ : $t$ is exponential over $F$. So let $\dfrac {t'}t=s'$ for some $s$ in $F$. As in case $2$ above, we may assume all the $u$'s are monic, irreducible, and distinct, and put $v$ in partial fraction decomposition form. Indeed the argument is identical to case $2$ until we try to conclude what form $v$ has. Here lemma $3$ tells us that $v$ is a finite sum of terms $b\cdot t^j$ where each coefficient is in $F$. Each of the $u$'s is also in $F$, with the possible exception that one of them may be $t$. Thus every $\dfrac {u'}u$ term is in $F$, so again we conclude $v'$ is in $F$. By lemma $3$, $v$ is in $F$. So if every $u$ is in $F$, $a$ is in the desired form. Otherwise, if one of the $u$'s, say $u_n$, is actually $t$, then $\dfrac{u_n'}{u_n}=s'$ and $$a = c_1\frac{u_1'}{u_1} + \cdots + c_{n-1}\frac{u_{n-1}'}{u_{n-1}} + (c_n s + v)'$$ is the desired form. So case $3$ holds.








References:



A. D. Fitt & G. T. Q. Hoare, "The closed-form integration of arbitrary functions", Mathematical Gazette (1993), pp 227-236.
I. Kaplansky, An Introduction to Differential Algebra (Hermann, 1957).
E. R. Kolchin, Differential Algebra and Algebraic Groups (Academic Press, 1973).
A. R. Magid, Lectures on Differential Galois Theory (AMS, 1994).
E. Marchisotto & G. Zakeri, "An invitation to integration in finite terms", College Mathematics Journal (1994), pp 295-308.
J. F. Ritt, Integration in Finite Terms (Columbia, 1948).
J. F. Ritt, Differential Algebra (AMS, 1950).
M. Rosenlicht, "Liouville's theorem on functions with elementary integrals", Pacific Journal of Mathematics (1968), pp 153-161.
M. Rosenlicht, "Integration in finite terms", American Mathematical Monthly (1972), pp 963-972.
G. N. Watson, A Treatise on the Theory of Bessel Functions (Cambridge, 1962).



-Matthew P Wiener


How to find $\lim_{h\rightarrow 0}\frac{\sin(ha)}{h}$ without lhopital rule? I know when I use lhopital I easy get $$ \lim_{h\rightarrow 0}...