# Differentiation Under Integral Sign and Error Function

• ## Differentiation under Integral Sign, DUIS

• ### Introduction

Not all integrals can be evaluated by the standard analytical techniques, such as integration by substitution, by parts, or by partial fractions. Various other methods have been devised for integrals that resist these techniques, and DUIS is one of them.

A definite integral of this class can be treated as an integral of a function of a variable $x$ and a parameter $t$, and can then be evaluated using DUIS.

• ### Theorem I

Let $f(x,t)$ be a function of a single variable $x$ and a parameter $t$, with $f$ and $\frac {\partial f}{\partial t}$ continuous on the region of integration. Then,

${\frac {d}{dt} \int \limits_a^b f(x,t) dx = \int \limits_a^b \frac {\partial }{\partial t} f(x,t) dx}$
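The two sides of the theorem can be compared numerically for a sample integrand. Below is a minimal Python sketch; the integrand $e^{-tx^2}$, the interval $[0,1]$, and the midpoint rule are illustrative choices, not from the text:

```python
import math

def midpoint_integral(f, a, b, n=10_000):
    """Approximate the definite integral of f over [a, b] by the midpoint rule."""
    h = (b - a) / n
    return sum(f(a + (i + 0.5) * h) for i in range(n)) * h

t, dt = 1.5, 1e-5

# Illustrative integrand: f(x, t) = exp(-t x^2), so df/dt = -x^2 exp(-t x^2)
I = lambda t: midpoint_integral(lambda x: math.exp(-t * x**2), 0.0, 1.0)

# Left side: d/dt of the integral, via a central difference in t
lhs = (I(t + dt) - I(t - dt)) / (2 * dt)

# Right side: integral of the partial derivative with respect to t
rhs = midpoint_integral(lambda x: -x**2 * math.exp(-t * x**2), 0.0, 1.0)

print(lhs, rhs)
```

The two printed values agree to several decimal places, as the theorem predicts.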

Note that, after differentiating, we sometimes need to integrate the result with respect to $t$ again in order to recover the original integral. (This is done, for example, when there are two parameters.)
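For instance (a classical illustration, not taken from the text), consider

${I(t) = \int \limits_0^1 \frac {x^t - 1}{\ln x} dx}$

Differentiating under the integral sign,

${\frac {dI}{dt} = \int \limits_0^1 \frac {\partial }{\partial t} \left( \frac {x^t - 1}{\ln x} \right) dx = \int \limits_0^1 x^t dx = \frac {1}{t+1}}$

Integrating back w.r.t. $t$ gives ${I(t) = \ln (t+1) + c}$, and since $I(0) = 0$, we get $c = 0$. Hence ${I(t) = \ln (t+1)}$.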

• ### Theorem II (Leibniz Rule)

Theorem I is a special case of Theorem II. Here, the limits of integration are not constants, but functions of the parameter $t$.

${\frac {d}{dt} \int \limits_{h(t)}^{g(t)} f(x,t) dx = \int \limits_{h(t)}^{g(t)} \frac {\partial }{\partial t} f(x,t) dx + f[g(t),t] \frac {d \ g(t)}{dt} - f[h(t),t] \frac {d \ h(t)}{dt}}$
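The Leibniz rule can likewise be checked numerically. In this Python sketch the choices $f(x,t) = \sin(tx)$, $h(t) = t$, $g(t) = t^2$ are illustrative, not from the text:

```python
import math

def midpoint_integral(f, a, b, n=20_000):
    """Approximate the definite integral of f over [a, b] by the midpoint rule."""
    h = (b - a) / n
    return sum(f(a + (i + 0.5) * h) for i in range(n)) * h

# Illustrative choices: f(x, t) = sin(t x), h(t) = t, g(t) = t^2
f = lambda x, t: math.sin(t * x)
t, dt = 1.3, 1e-5

# Left side: central difference of F(t) = integral of sin(t x) from t to t^2
F = lambda t: midpoint_integral(lambda x: f(x, t), t, t**2)
lhs = (F(t + dt) - F(t - dt)) / (2 * dt)

# Right side of the Leibniz rule: d/dt(sin(t x)) = x cos(t x),
# g'(t) = 2t, h'(t) = 1
rhs = (midpoint_integral(lambda x: x * math.cos(t * x), t, t**2)
       + f(t**2, t) * 2 * t    # f[g(t), t] * g'(t)
       - f(t, t) * 1)          # f[h(t), t] * h'(t)

print(lhs, rhs)
```

Both sides agree closely, including the boundary terms contributed by the moving limits.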

• ## Error Function

• ### Introduction

The error function takes the form of a definite integral whose limits contain the independent variable $x$. It is defined as follows:

${erf(x) = \frac {1}{\sqrt {\pi}} \int \limits_{-x}^{x} e^{-u^2}du}$

Since the integrand is even, ${\int \limits_{-x}^{x} = 2 \int \limits_{0}^{x}}$, and this reduces to

${erf(x) = \frac {2}{\sqrt {\pi}} \int \limits_{0}^{x} e^{-u^2}du}$
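Python's standard library ships this function as `math.erf`, so integrating the definition numerically lets us reproduce it. A minimal sketch (the midpoint rule and the sample points are illustrative choices):

```python
import math

def midpoint_integral(f, a, b, n=10_000):
    """Approximate the definite integral of f over [a, b] by the midpoint rule."""
    h = (b - a) / n
    return sum(f(a + (i + 0.5) * h) for i in range(n)) * h

def erf_from_definition(x):
    """erf(x) = (2 / sqrt(pi)) * integral of exp(-u^2) from 0 to x."""
    return 2.0 / math.sqrt(math.pi) * midpoint_integral(lambda u: math.exp(-u**2), 0.0, x)

for x in (0.5, 1.0, 2.0):
    print(x, erf_from_definition(x), math.erf(x))
```

The numerically integrated values match the library's `math.erf` to high accuracy.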

The name "error function" has an interesting origin.

The function appears in probability, statistics and partial differential equations describing diffusion. In statistics, when a population parameter (the mean $\mu$ or the variance $\sigma^2$) is unknown, it is estimated from sample data. The difference between the actual and the estimated parameter, i.e. the error, is typically modelled as normally distributed. The error function then gives the probability that this error lies in $[-x,x]$ (precisely so for a normal variable with mean $0$ and variance $\frac{1}{2}$).

In probability theory, the normal distribution $\mathcal {N} (\mu, \sigma^2)$ is given by the probability density function

${f(x) = \frac {1}{\sigma \sqrt {2 \pi}}\, e^{-\frac {(x - \mu)^2}{2 \sigma^2}}}$

(Try tracing this curve $f(x)$! The article on curve tracing might be useful.)

• ### Properties

I) $erf(0)= 0$

II) $erf ( \infty) = 1$

III) By substituting $u^2 = t$ (so that $2u \, du = dt$, i.e. $du = \frac {dt}{2 \sqrt t}$), an alternative form of $erf(x)$ is obtained, given by

${erf (x) = \frac {1}{\sqrt \pi} \int \limits_0^{x^2} e^{-t} \frac {1}{\sqrt t} dt}$

IV) Complementary error function is given by

${erf_c(x) = \frac {2}{\sqrt \pi} \int \limits_x^{\infty} e^{-u^2} du}$

Thus, since ${\frac {2}{\sqrt \pi} \int \limits_0^{\infty} e^{-u^2} du = 1}$, we get ${erf(x) + erf_c(x)= 1}$

V) ${erf (-x) = - erf (x)}$. It is an odd function of $x$.
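Properties I, II, IV and V can be confirmed directly with the standard library's `math.erf` and `math.erfc` (the latter corresponds to $erf_c$ above). A small Python sketch; the sample point $x = 0.7$ is an arbitrary choice:

```python
import math

x = 0.7

print(math.erf(0.0))               # I)  erf(0) = 0
print(math.erf(10.0))              # II) erf(x) -> 1 as x -> infinity
print(math.erf(x) + math.erfc(x))  # IV) erf(x) + erfc(x) = 1
print(math.erf(-x), -math.erf(x))  # V)  erf is an odd function
```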

NOTE : Since $x$ appears as a parameter in the definite integral defining $erf (x)$ (and also in its limits), the rules of DUIS are used to solve problems on definite integration involving it.