What is a Derivative and a Partial Derivative? How to find the derivative of an arbitrary function?

Part 1: ‘First-Order Differentiation’

The derivative is a concept that some of us have heard of but never understood, and some of us may never have encountered at all. Yes, the derivative, also known as the ratio of infinitesimally small changes.

Even though the English name “derivative” suggests something derived from something else, that is not really what the concept denotes. A more accurate name would be “differential ratio,” that is, a ratio of small differences.

So, what exactly is this ratio of differences, the derivative? The derivative is the ratio of the change in a variable that depends on another variable to an infinitesimally small change in the variable it depends on.


For example, let’s consider a variable \(y\) whose change we are observing. Let there also be a variable \(x\) that influences \(y\). The question is: when \(x\) undergoes an infinitesimally small change at a given point, how large is the corresponding change in \(y\) relative to it? To understand this, we write an equation for \(y\).

\begin{equation} \label{eq:myequ}
y=f(x)
\end{equation}

If \(x\) changes infinitesimally, the relationship between the changes in equation \eqref{eq:myequ} can be written as:

\begin{equation} \label{eq:limitsequation}
y+ \lim_{\Delta y \to 0}\Delta y=\lim_{\Delta x \to 0} f(x+\Delta x)
\end{equation}

If the variables \(dy\) and \(dx\) are introduced to represent the infinitesimally small changes of the variables, the relationship between these infinitesimal changes at a specific point

\begin{equation} \label{eq:difeq}
y+dy =f(x+dx)
\end{equation}

can be written. This means that the infinitesimally small change in \(y\) is related to the infinitesimally small change in \(x\) through the function; in other words, this is the equation of infinitesimally small differences.

To give an example, let’s consider the equation \(y=x^2\). If we want to express this equation in terms of infinitesimally small differences:

\begin{equation} \label{eq:diff}
y+dy =f(x+dx)=(x+dx)^2
\end{equation}

From here

\begin{equation} \label{eq:diff2}
y+dy =x^2+2xdx+(dx)^2
\end{equation}

can be found. Using \(y=x^2\):

\begin{equation} \label{eq:diff3}
dy =2xdx+(dx)^2
\end{equation}

is obtained. At this point, a specific trick needs to be applied: since the square of a number close to zero is far closer to zero still, it can be neglected in comparison with the other terms. For instance, with \(x=1\) and \(dx=10^{-6}\), the first-order term \(2xdx=2\times 10^{-6}\), while \((dx)^2=10^{-12}\) is a million times smaller. In other words, treating \((dx)^2\) as equal to zero:
\begin{equation} \label{eq:diff4}
dy =2xdx
\end{equation}

from here, if we want to find the derivative, which is the ratio of infinitesimally small differences,

\begin{equation} \label{eq:diff5}
\frac{dy}{dx} =2x
\end{equation}

This is the derivative of \(x^2\), that is, the ratio of infinitesimally small differences.
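As a quick numerical sanity check (a minimal Python sketch added here, not part of the original derivation), the ratio of small differences for \(f(x)=x^2\) indeed approaches \(2x\) as the step \(dx\) is made smaller:

```python
# Difference quotient of f(x) = x^2 at a sample point: as dx shrinks,
# the ratio (f(x + dx) - f(x)) / dx approaches the derivative 2x.

def f(x):
    return x ** 2

x = 3.0  # arbitrary sample point; the exact derivative there is 2*x = 6
for dx in (1e-1, 1e-3, 1e-6):
    ratio = (f(x + dx) - f(x)) / dx  # the ratio of small differences
    print(f"dx = {dx:.0e}  ->  dy/dx ~ {ratio:.7f}")

# The printed ratios approach 6.0; the leftover error is of order dx, i.e.
# exactly the (dx)^2 term that was neglected in the derivation above.
```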

Now, let us find the ratio of infinitesimally small differences for a function that depends on more than one variable.

Suppose the variable we are tracking, \(y\), depends on several variables; its value is given by:

\begin{equation} \label{eq:diff6}
y=f(x_1,x_2,x_3,\ldots,x_n)\end{equation}

so that it depends on \(n\) different variables. If we write this equation in terms of infinitesimal differences:

\begin{equation} \label{eq:diff7}
y+dy=f(x_1+dx_1,x_2+dx_2,x_3+dx_3,\ldots,x_n+dx_n)\end{equation}

is obtained. Here, if we define \(\frac{\partial{y}}{\partial{x_i}}\) as the ratio of the change in \(y\) to the change in \(x_i\) when only \(x_i\) varies, and \(\frac{\partial {y}}{\partial{x_i}}dx_i\) as the infinitesimal change in \(y\) corresponding to an infinitesimal change in \(x_i\) alone:

\begin{equation} \label{eq:diff8}
y+dy=y+\frac{\partial{y}}{\partial{x_1}}dx_1+\frac{\partial{y}}{\partial{x_2}}dx_2+\frac{\partial{y}}{\partial{x_3}}dx_3+\cdots+\frac{\partial{y}}{\partial{x_n}}dx_n\end{equation}

Cancelling \(y\) from both sides of this equation of changes, the exact differential

\begin{equation} \label{eq:diff9}
dy=\frac{\partial{y}}{\partial{x_1}}dx_1+\frac{\partial{y}}{\partial{x_2}}dx_2+\frac{\partial{y}}{\partial{x_3}}dx_3+\cdots+\frac{\partial{y}}{\partial{x_n}}dx_n\end{equation}

is found. This is the total differential of \(y\), also called the exact differential.

The equality given in \eqref{eq:diff4} is an example in which \(y\) depends solely on \(x\); it is likewise the total differential of \(y\).
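Before the worked example below, here is a minimal numerical sketch (Python) of the total differential in \eqref{eq:diff9}. The two-variable function used here is an arbitrary illustration chosen for this sketch; it is not taken from the text.

```python
# Total differential, eq. (diff9): dy ~ (partial y / partial x1) dx1
#                                     + (partial y / partial x2) dx2.
# The function below is an arbitrary illustration for this sketch.

def y(x1, x2):
    return x1 ** 3 * x2 + x2 ** 2

def dy_dx1(x1, x2):   # partial derivative with respect to x1: 3 x1^2 x2
    return 3 * x1 ** 2 * x2

def dy_dx2(x1, x2):   # partial derivative with respect to x2: x1^3 + 2 x2
    return x1 ** 3 + 2 * x2

x1, x2 = 1.5, 0.7            # arbitrary base point
dx1, dx2 = 1e-4, -2e-4       # small changes in each variable

exact_change = y(x1 + dx1, x2 + dx2) - y(x1, x2)
total_diff = dy_dx1(x1, x2) * dx1 + dy_dx2(x1, x2) * dx2

print(exact_change, total_diff)  # the two numbers agree to first order
```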

To give an example of a multivariable total differential, let \(y\) be:

\begin{equation} \label{eq:diff10}
y=x^2 z+x\sin(z)\end{equation}

If we want to find the total differential of \(y\) here,

\begin{equation} \label{eq:diff11}
y+dy=(x+dx)^2 (z+dz)+(x+dx)\sin(z+dz)\end{equation}

is obtained. From here,

\begin{gather}\label{eq:diff1x}\begin{aligned} y+dy&=(x^2+2xdx+(dx)^2) (z+dz)+(x+dx)\sin(z+dz) \\\quad&=[zx^2+2xzdx+z(dx)^2 +x^2dz+2xdzdx+dz(dx)^2]\\&\quad+[x\sin(z+dz)+dx\sin(z+dz)]\end{aligned}\end{gather}

is found. Here, if the products and squares of infinitesimal changes are considered zero,

\begin{equation}\label{eq:diff13}\begin{aligned} y+dy&=(zx^2+2xzdx+x^2dz )+ (x\sin(z+dz)+dx\sin(z+dz))\end{aligned}\end{equation}

From the trigonometric sum formulas,

\begin{equation}\label{eq:diff14} \sin(z+dz)=\sin(z)\cos(dz)+\cos(z)\sin(dz)\end{equation}

The equality is found. Here, for an infinitesimal \(dz\), \(\cos(dz)\) is equal to 1 and \(\sin(dz)\) is equal to \(dz\) (note that for small angles the sine is approximately equal to the angle itself); therefore

\begin{equation}\label{eq:diff16}\sin(z+dz)=\sin(z)+\cos(z)dz\end{equation}

is found. Using this equality, the expression \eqref{eq:diff13} is:

\begin{equation}\begin{aligned}\label{eq:diff17}y+dy&=[zx^2+2xzdx +x^2dz]\\&\quad+ [x(\sin(z)+\cos(z)dz)+dx(\sin(z)+\cos(z)dz)]\end{aligned}\end{equation}

By rearranging this,

\begin{equation}\begin{aligned}\label{eq:diff18}y+dy&=[zx^2+2xzdx +x^2dz]\\&\quad+ [x\sin(z)+x\cos(z)dz+dx\sin(z)+\cos(z)dzdx]\end{aligned}\end{equation}

is obtained. By again neglecting the product of infinitesimals (the \(\cos(z)dzdx\) term) and factoring out \(dx\) and \(dz\), ultimately,

\begin{equation}\begin{aligned}\label{eq:diff19}y+dy&=zx^2+x\sin(z)+[2xz+\sin(z)]dx \\&\quad+[x^2+x\cos(z)]dz\end{aligned}\end{equation}

Using \eqref{eq:diff10},

\begin{equation}\label{eq:diff20}dy=[2xz+\sin(z)]dx +[x^2+x\cos(z)]dz\end{equation}

is obtained. This is the total differential of \(y\). The partial rates of change, that is, the partial derivatives, are:

\begin{equation}\label{eq:diff21}\begin{aligned}\frac{\partial y}{\partial x}&=2xz+\sin(z) \\ \frac{\partial y}{\partial z}&=x^2+x\cos(z)\end{aligned}\end{equation}

is found.
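As a check on \eqref{eq:diff20} and \eqref{eq:diff21}, here is a short symbolic sketch using SymPy (assuming SymPy is available; this check is an addition, not part of the original derivation):

```python
# Symbolic check of the partial derivatives in eq. (diff21) for y = x^2 z + x sin(z).
import sympy as sp

x, z = sp.symbols('x z')
y = x**2 * z + x * sp.sin(z)

dy_dx = sp.diff(y, x)   # partial derivative with respect to x
dy_dz = sp.diff(y, z)   # partial derivative with respect to z

print(dy_dx)  # 2*x*z + sin(z)
print(dy_dz)  # x**2 + x*cos(z)
```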

Apart from this, in the most general case \(x\) and \(z\) may themselves depend on other variables. This brings us to the concept commonly referred to as the chain rule. For example, let \(x = f(t)\) and \(z = g(u)\). Then we have:

\begin{equation}\begin{aligned}\label{eq:diff25}dx&=\frac{\partial f}{\partial t}dt\\dz&=\frac{\partial g}{\partial u}du\end{aligned}\end{equation}

If we substitute these into \eqref{eq:diff9}, then

\begin{equation}\label{eq:diff22}\begin{aligned}dy=\underbrace{\frac{\partial y}{\partial x}\frac{\partial x}{\partial t}}_{\textstyle{ \frac{\partial y}{\partial t}}\mathstrut }dt+\underbrace{\frac{\partial y}{\partial z}\frac{\partial z}{\partial u}}_{\textstyle\frac{\partial y}{\partial u}\mathstrut }du\end{aligned}\end{equation}

the relation known as the chain rule is obtained. For example, using the result obtained in \eqref{eq:diff21}:

\begin{equation}\begin{aligned}\label{eq:diff23}dy=[2f(t)g(u)+\sin(g(u))]\frac{\partial f}{\partial t}dt +[f(t)^2+f(t)\cos(g(u))]\frac{\partial g}{\partial u}du\end{aligned}\end{equation}

is obtained. From here, the partial derivatives

\begin{equation}\begin{aligned}\label{eq:diff24}\frac{\partial y}{\partial t}&=[2f(t)g(u)+\sin(g(u))]\frac{\partial f}{\partial t}\\ \frac{\partial y}{\partial u}&=[f(t)^2+f(t)\cos(g(u))]\frac{\partial g}{\partial u}\end{aligned}\end{equation}

are obtained. A short symbolic check of this chain-rule result is sketched below. Go to the second page for second-order partial derivatives.
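Here is a minimal symbolic sketch of the chain-rule result above (Python with SymPy). The concrete choices \(f(t)=t^2\) and \(g(u)=\sin(u)\) are hypothetical, picked only to make the check runnable; they are not part of the article.

```python
# Chain-rule check: with x = f(t), z = g(u) and y = x^2 z + x sin(z),
# differentiating the composed expression directly should match
# (dy/dx)(dx/dt) and (dy/dz)(dz/du) from eq. (diff24).
import sympy as sp

t, u = sp.symbols('t u')
f = t**2          # hypothetical choice for x = f(t)
g = sp.sin(u)     # hypothetical choice for z = g(u)

y = f**2 * g + f * sp.sin(g)   # y = x^2 z + x sin(z) with x, z substituted

# Direct partial derivatives of the composed expression...
dy_dt_direct = sp.diff(y, t)
dy_du_direct = sp.diff(y, u)

# ...compared against the chain-rule form.
dy_dt_chain = (2*f*g + sp.sin(g)) * sp.diff(f, t)
dy_du_chain = (f**2 + f*sp.cos(g)) * sp.diff(g, u)

print(sp.simplify(dy_dt_direct - dy_dt_chain))  # 0
print(sp.simplify(dy_du_direct - dy_du_chain))  # 0
```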
