Calculus (literally 'small pebble') is the mathematical study of continuous change, in the same way that geometry is the study of shape and algebra is the study of generalizations of arithmetic operations. It has two main branches - Differential Calculus and Integral Calculus - related to each other by the fundamental theorems of calculus. Both make use of the fundamental notions of convergence of infinite sequences and infinite series to a well-defined limit. Calculus as we know it today was developed independently by Isaac Newton and Gottfried Wilhelm Leibniz in the 17th century.

Calculus is a vast topic, and this is only an overview refresher. If you are interested, you can check out textbooks that cover the topics in detail; Thomas' Calculus by George B. Thomas and Joel Hass is one such interesting text.

Each aspect of calculus is immensely useful in different branches of science, engineering and finance.

This is the fundamental concept that forms the basis for all of calculus. What exactly does it mean? In simple words, limits deal with values that are not equal but "almost" equal to some target value. A simple example often quoted to demonstrate this concept is this:

For a real variable x, what is the value of x/x? We may be tempted to say 1, but that is not quite so. x/x is 1 for all values of x except 0; its value is not defined at 0, because 0/0 is not defined. But for any other number it is defined, and the value is 1. It is 1 for x = 1 or 0.1 or 0.00001 or 0.00000000001 ... for any number, however small. Thus, for any value of x that is "almost" zero, x/x is 1. So, although it is not defined at 0,

`lim`_{ x → 0 }x/x = 1

That is, "the limit of x/x as x tends to 0 is 1".
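This is easy to verify numerically. A minimal sketch in Python (the sample values of x are arbitrary):

```python
# x/x is undefined at x = 0, but for any nonzero x, however small,
# the ratio is exactly 1 - so the limit as x tends to 0 is 1.
for x in [0.1, 0.00001, 0.00000000001]:
    print(x, x / x)   # prints 1.0 for every sample
```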

The example above was pretty straightforward. But life is not always so simple! We often deal with functions that are not continuous. Consider, for example, a complicated function that describes the toss of a coin, mapping the details of the throw to the outcome head or tail. This cannot be a continuous function; it must have a step. How do we calculate limits in such a case?

Here, the two-sided limit may not exist. But it may still be possible to get the left and right limits independently. In general, the limit exists at a point if and only if the left and right limits both exist and are equal:

` lim`_{ x → a }f(x) = M
if and only if
lim_{ x → a- }f(x) = lim_{ x → a+ }f(x) = M
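The idea can be sketched numerically with a step function; `step` below is a made-up stand-in for any discontinuous function such as the coin toss:

```python
# A function with a jump at 0: the left and right limits disagree,
# so the two-sided limit at 0 does not exist.
def step(x):
    return 0.0 if x < 0 else 1.0

lefts = [step(-h) for h in (0.1, 0.01, 0.001)]    # approach 0 from the left
rights = [step(h) for h in (0.1, 0.01, 0.001)]    # approach 0 from the right
print(lefts, rights)   # left values stay at 0.0, right values at 1.0
```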

In general, it is possible that the limit of a function at a given value is not defined.

This is an interesting concept that emerged out of the theory of limits. Division by 0 is not defined in algebra. But what does the theory of limits say about it? What is the limit of 1/x as x tends to 0? We can easily see that the value of 1/x increases as x decreases toward 0. Moreover, the closer to zero we get, the faster 1/x grows: as x becomes "almost zero", the value of 1/x becomes larger than any given number. That unbounded growth is what we denote by infinity. In formal terms,

` lim`_{ x → 0 }1/x = ∞

Now how about the left and right limits? Are they equal? From the left, 1/x decreases without bound toward negative infinity. One way to make the two agree is to twist the number line around into a number circle, so that negative infinity and positive infinity coincide.

So what is the big deal about defining a crazy new number? Does it have any meaning anywhere? Yes, it has a lot of meaning in calculus itself. As we go deeper, we will come across several theorems that depend upon this infinity, and we will often see applications of the limit as n tends to infinity.
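A quick numeric sketch of this 1/x behavior (the sample values of x are arbitrary):

```python
# As x shrinks toward 0 from the right, 1/x grows beyond any bound.
for x in [0.1, 0.001, 0.00001]:
    print(x, 1 / x)   # each value is far larger than the last
```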

Some important theorems about limits are useful when working with calculus. For a point a in the domain of functions f(x) and g(x), a constant k and a natural number n:

- Suppose that f and g are functions such that f(x) = g(x) for all x in some open interval containing a, except possibly at a itself. Then

` lim`_{ x → a }f(x) = lim_{ x → a }g(x)

- The limit of a constant function is the constant itself.

` lim`_{ x → a }k = k

- A constant multiplied by the limit of a function is the limit of the constant multiplied by the function

` lim`_{ x → a }k . f(x) = k . lim_{ x → a }f(x)

- The limit of a sum is the sum of the limits

` lim`_{ x → a }(f(x) + g(x)) = lim_{ x → a }f(x) + lim_{ x → a }g(x)

- The limit of difference is the difference of the limits

` lim`_{ x → a }(f(x) - g(x)) = lim_{ x → a }f(x) - lim_{ x → a }g(x)

- The limit of a product is the product of limits

` lim`_{ x → a }(f(x) . g(x)) = lim_{ x → a }f(x) . lim_{ x → a }g(x)

- The limit of a ratio is the ratio of the limits, provided the limit of the denominator is not 0

` lim`_{ x → a }(f(x) / g(x)) = lim_{ x → a }f(x) / lim_{ x → a }g(x)

- The limit of a power is the power of the limit - if it is defined.

` lim`_{ x → a } (f(x))^{n} = (lim_{ x → a }f(x))^{n}

- If one function is greater than or equal to another near a, then so is its limit.

` f(x) >= g(x) for all x≠a, => lim`_{ x → a }f(x) >= lim_{ x → a }g(x)

- Squeeze Theorem: if we have three functions f, g and h such that f(x) <= g(x) <= h(x) for all x near a, and the limits of f(x) and h(x) at a are equal, then the limit of g(x) at a has to be the same.

` lim`_{ x → a }f(x) = lim_{ x → a }h(x) = M => lim_{ x → a }g(x) = M
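The Squeeze Theorem can be sketched numerically. Assuming the classic example g(x) = x²·sin(1/x), squeezed between f(x) = -x² and h(x) = x² near 0:

```python
import math

# g(x) = x^2 * sin(1/x) oscillates, but is trapped between -x^2 and x^2;
# both bounds tend to 0 as x -> 0, so g must tend to 0 as well.
def g(x):
    return x * x * math.sin(1 / x)

for x in [0.1, 0.01, 0.001]:
    assert -x * x <= g(x) <= x * x   # the squeeze holds at each sample
    print(x, g(x))                   # values shrink toward 0
```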

As the name suggests, differential calculus is all about differences. How does the function change? What is the direction of change? Does it increase or decrease with the input parameter? Or is it constant?

Change cannot be measured at a single point; by definition we need an interval to define "change". But what if we want to know how the function is changing at a given point? The derivative of the function helps us with this.

Essentially, the derivative of a function is the limit of the change in the function over an interval, as the width of the interval tends to 0.

` f'(x) = lim`_{ h → 0 }(f(x+h) - f(x)) / h

For this limit to exist, the function must be continuous at that point (though continuity alone is not enough). Intuitively, we can imagine the derivative as the slope of the tangent at that point.
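The definition translates directly into a numeric approximation. A sketch, assuming f(x) = x² as the example (its exact derivative at x = 3 is 6):

```python
# Approximate the derivative with the difference quotient for shrinking h.
def diff_quotient(f, x, h):
    return (f(x + h) - f(x)) / h

f = lambda x: x * x
for h in [0.1, 0.01, 0.001]:
    print(h, diff_quotient(f, 3.0, h))   # approaches 6 as h shrinks
```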

For functions f, g, h, constants b and k, and natural number n:

- Derivative of the constant function is 0

` f(x) = b => f'(x) = 0`

- Derivative of identity function is 1

` f(x) = x => f'(x) = 1`

- Derivative of constant multiple of a function is the multiple of the derivative

` g(x) = k.f(x) => g'(x) = k.f'(x)`

- Derivative of the n^{th} power of x is n times the (n-1)^{th} power of x

` f(x) = x`^{n} => f'(x) = n.x^{n-1}

- The derivative of an exponential function is the value of the function multiplied by its own derivative at 0

` f(x) = b`^{x} => f'(x) = b^{x}.f'(0)
f(x) = e^{x} => f'(x) = e^{x}
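That factor f'(0) is the natural logarithm of the base, ln b; e is special precisely because ln e = 1. A numeric sketch for b = 2:

```python
import math

# The difference quotient of b**x at x = 0 approaches ln(b).
b = 2.0
for h in [0.1, 0.001, 0.00001]:
    print(h, (b ** h - 1) / h)   # approaches ln(2), about 0.6931
print(math.log(b))
```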

- Sum of derivatives is derivative of the sum

` (f + g)'(x) = f'(x) + g'(x)`

- Derivative of product of functions.

` (fg)'(x) = f(x).g'(x) + g(x).f'(x)`

- Derivative of quotient of functions

` (f/g)'(x) = (f'(x).g(x) - f(x).g'(x)) / (g(x))`^{2}

- Chain Rule

` f(x) = g(h(x)) => f'(x) = g'(h(x)).h'(x)`
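A quick numeric check of the chain rule, assuming g(u) = u³ and h(x) = 2x + 1 as the example pair:

```python
# f(x) = g(h(x)) = (2x + 1)**3; the chain rule predicts
# f'(x) = g'(h(x)) * h'(x) = 3*(2x + 1)**2 * 2.
def f(x):
    return (2 * x + 1) ** 3

def f_prime(x):
    return 3 * (2 * x + 1) ** 2 * 2

eps = 1e-6
x0 = 1.5
numeric = (f(x0 + eps) - f(x0)) / eps
print(numeric, f_prime(x0))   # both are close to 96
```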

As we saw above, the derivative of a function is essentially yet another function of x. This function can have a derivative as well - giving us another function. This gives us higher order derivatives. The derivative of the derivative is called the second order derivative - f''(x). Its derivative is called the third order derivative... leading us to the n^{th} order derivative f^{(n)}(x).

The derivative of a function is an indicator of how the function is moving. If it is positive, the function is increasing. But it says little about how much the function will increase. Will it continue to increase? Faster or slower? Will it decrease after some time? All these questions can be important when analyzing a function.

To answer these questions, we look at the higher order derivatives. In fact, for well-behaved (analytic) functions, we can recover the value of the function just from its derivatives at a single point:

` f(x) = f(0) + x.f'(0) + x`^{2}.f''(0) / 2! + ... + x^{n}.f^{(n)}(0) / n! + ...

Remember that n! (n factorial) is the product of all natural numbers up to n. This is the Taylor series of f around 0. A special case is f(x) = e^{x}, where every derivative at 0 is 1.
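This series is easy to test for e^{x}. A sketch of the partial sums at x = 1 (the number of terms is an arbitrary choice):

```python
import math

# Partial sums of sum(x**n / n!) converge to e**x; here x = 1, so
# the sums approach Euler's number e, about 2.71828.
def exp_series(x, terms):
    return sum(x ** n / math.factorial(n) for n in range(terms))

print(exp_series(1.0, 5), exp_series(1.0, 15), math.e)
```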

A point where the derivative is 0 is called a critical point. Here, we have four possibilities:

- Constant function
- Local Minimum
- Local Maximum
- Inflection

A constant function always has a derivative of 0. So there is nothing special about this particular point. Hence we ignore that case.

But if it is not a constant function, we have two scenarios: either the function is taking a U turn, or it passes through a simple inflection. If it is a U turn, it could be a local minimum or maximum. Now how do we identify which? We can check by looking at the second order derivative.

If the second order derivative is positive, the first derivative is increasing. Since the first derivative is 0 at this point, it was negative just before - implying that the function was decreasing. Now that the first derivative has crossed 0, it will be positive - implying that the function will start increasing. Thus, we are at a local minimum. Local, because it need not be the absolute minimum; there may be other minima that are lower than this one.

Similarly, for a negative second order derivative, we can conclude that it is a local maximum. But what if the second order derivative is also 0? Then the test is inconclusive: the point may be an inflection (as for x^{3} at 0), but it could still be a minimum or maximum (as for x^{4} at 0), so higher derivatives must be examined.
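The second-derivative test is easy to apply programmatically. A sketch, assuming f(x) = x³ - 3x as an example, with critical points at x = ±1 (f'(x) = 3x² - 3, f''(x) = 6x):

```python
# Classify critical points by the sign of the second derivative.
def f_second(x):
    return 6 * x

for x in (-1.0, 1.0):
    kind = "local maximum" if f_second(x) < 0 else "local minimum"
    print(x, kind)   # x = -1 is a local maximum, x = 1 a local minimum
```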

So far we have seen functions that have only one input parameter. In real life, we rarely have an outcome that depends upon a single input; most real life functions take multiple input variables. How do we work with derivatives of such functions? To visualize this, we need to imagine an (n+1)-dimensional space, where n is the number of input parameters. The function is a surface in this (n+1)-dimensional space - defined by the value of the function at each possible value of the input.

Here, the output value potentially changes with a change in any of the input variables. In such a scenario, we consider partial derivatives: the derivative of the function with respect to one variable, while the others are held constant.

A simple example:

` f(x, y) = x`^{2} + xy + 3y^{3}

Here, we can calculate the partial derivative with respect to x and the partial derivative with respect to y at (x, y):

` f'`_{x}(x, y) = 2x + y
f'_{y}(x, y) = x + 9y^{2}
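A numeric check of these partial derivatives, nudging one variable at a time (the point (2, 1) is an arbitrary choice):

```python
# f(x, y) = x^2 + x*y + 3*y^3; at (2, 1) the partials should be
# f_x = 2x + y = 5 and f_y = x + 9y^2 = 11.
def f(x, y):
    return x ** 2 + x * y + 3 * y ** 3

h = 1e-6
x0, y0 = 2.0, 1.0
fx = (f(x0 + h, y0) - f(x0, y0)) / h   # vary x, hold y fixed
fy = (f(x0, y0 + h) - f(x0, y0)) / h   # vary y, hold x fixed
print(fx, fy)   # close to 5 and 11
```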

Integrals are often called anti-derivatives, because integration can be considered the reverse of differentiation: if f'(x) is the derivative of the function f(x), then f(x) is the integral of f'(x) - the indefinite integral.

` g(x) = f'(x) => f(x) = ∫g(x)dx + c`

Note that we always add a constant c to the anti-derivative. That is to remind us that a constant was eliminated when we obtained the derivative of the function.

(Remember that derivative of a constant is 0 - hence f(x) = g(x) + c => f'(x) = g'(x). We need to get that constant back when we get the anti-derivative.)

But that is a limited way of looking at integrals. Integrals have their own identity and are not just the reverse of derivatives. Intuitively, an integral can be considered to be the area under a given function's curve.

To elaborate further, suppose we have to calculate the area under a constant function f(x) = c, from 0 to x. We just multiply c * x. That gives us the area of the rectangle under it.

But if the function is not a simple horizontal line, we have to do more. We can approximate the area with thin rectangles under the curve. The approximation loses some accuracy, but the loss is related to the width of the rectangles we take: the thinner the rectangles, the smaller the loss. And as the width of the rectangles tends to 0, the loss tends to 0 as well. Thus,

` ∫`_{ 0 }^{ x }f(t)dt = lim_{ n → ∞ }∑_{ i=1 }^{ n } (x/n).f(i.x/n)
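A sketch of this rectangle approximation, assuming f(x) = x² on [0, 1] as the example (the exact area is 1/3):

```python
# Sum n thin rectangles under the curve; thinner rectangles (larger n)
# give a better approximation of the area.
def riemann(f, a, b, n):
    width = (b - a) / n
    return sum(f(a + i * width) * width for i in range(n))

for n in [10, 100, 10000]:
    print(n, riemann(lambda x: x * x, 0.0, 1.0, n))   # approaches 1/3
```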

Thus, in a way, derivatives are about differences and division, while integrals are about sums and multiplication.

Indefinite integrals refer to generic functions. But definite integrals are defined between two end points.

Essentially, ∫f(x)dx is a function of x - say F(x). The definite integral from a to b is defined as F(b) - F(a). Note that the result of a definite integral is not a function of x; there is no x in F(b) - F(a).
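A small sketch of a definite integral computed via F(b) - F(a), assuming f(x) = x² with antiderivative F(x) = x³/3 as the example:

```python
# The definite integral of x**2 from 1 to 2 is F(2) - F(1) = 8/3 - 1/3 = 7/3.
def F(x):
    return x ** 3 / 3

a, b = 1.0, 2.0
print(F(b) - F(a))   # 7/3, about 2.3333
```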

Here, a and b can take various different values. The common ones are (0, x), (0, 1), (0, ∞), (-∞, ∞).

Thus, we have two fundamental theorems of calculus: