# PPOL564 | Data Science 1: Foundations

## Concepts Covered

• Build up a conceptual understanding of the derivative.
• Derive the derivative computationally and analytically.
In [1]:
import numpy as np

# Bokeh for interactive plots
from bokeh.plotting import figure, output_notebook, show
from bokeh.layouts import row
output_notebook()

def plot(w=950,h=500,title=''):
    '''Wrapper function to ease starting a new plot.
    '''
    p = figure(plot_width=w,plot_height=h,title=title,toolbar_location="below")
    p.xaxis.axis_label = 'X'
    p.yaxis.axis_label = 'f(X)'
    return p


# Functions as Mappings

$$f: x \mapsto y$$

In [2]:
def f(x):
    return 3*x

def g(x):
    return x**2

def h(x):
    return np.cos(x)*100


Each maps $x$ to $y$ in a different fashion.

In [3]:
x = [.15,3,10]
print("\nf:",x[0],"|-->",f(x[0]))
print("f:",x[1],"|-->",f(x[1]))
print("f:",x[2],"|-->",f(x[2]))

print("\ng:",x[0],"|-->",g(x[0]))
print("g:",x[1],"|-->",g(x[1]))
print("g:",x[2],"|-->",g(x[2]))

print("\nh:",x[0],"|-->",h(x[0]))
print("h:",x[1],"|-->",h(x[1]))
print("h:",x[2],"|-->",h(x[2]))

f: 0.15 |--> 0.44999999999999996
f: 3 |--> 9
f: 10 |--> 30

g: 0.15 |--> 0.0225
g: 3 |--> 9
g: 10 |--> 100

h: 0.15 |--> 98.87710779360422
h: 3 |--> -98.99924966004454
h: 10 |--> -83.90715290764524


When we plot these mappings, distinct functional forms emerge.

In [4]:
x = np.arange(-5, 15, .1)
p = plot()
p.line(x,f(x),line_width=3)
p.line(x,g(x),line_width=3,color="green")
p.line(x,h(x),line_width=3,color="orange")
show(p)


# Continuous vs. Discontinuous Functions

In [5]:
def f(x):
    return 10*np.sin(x)

def h(x):
    return np.abs(x)

# Define our function piecewise...
def g1(x):
    y = np.zeros_like(x)
    y[np.where(x>=0)] = 5 + -1*x[np.where(x>=0)] + -.2*x[np.where(x>=0)]**2 + .01*x[np.where(x>=0)]**3
    y[np.where(y==0)] = np.nan  # mask the unused piece so it isn't drawn
    return y

def g2(x):
    y = np.zeros_like(x)
    y[np.where(x<0)] = x[np.where(x<0)]**2
    y[np.where(y==0)] = np.nan  # mask the unused piece so it isn't drawn
    return y
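As an aside (not in the original notebook), NumPy's `np.piecewise` offers a more compact way to write functions defined in pieces. A minimal sketch mirroring the two branches of `g1` and `g2` above, combined into one hypothetical function `g_piecewise`:

```python
import numpy as np

def g_piecewise(x):
    # Evaluate the cubic branch where x >= 0 and the parabola where x < 0.
    return np.piecewise(
        x.astype(float),
        [x >= 0, x < 0],
        [lambda v: 5 - v - .2*v**2 + .01*v**3,  # the g1 branch
         lambda v: v**2]                        # the g2 branch
    )

x = np.arange(-5, 5, .1)
y = g_piecewise(x)
```

Note that this fills in values for every `x` rather than masking with `np.nan`, so it is best suited when you want one connected (if kinked) curve rather than two visually separate pieces.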

In [6]:
x = np.arange(-5, 5, .1)
p = plot()
p.line(x,f(x),line_width=3)
p.line(x,h(x),line_width=3,color="green")
p.line(x,g1(x),line_width=3,color="orange")
p.line(x,g2(x),line_width=3,color="orange")
show(p)


# Rate of Change

In [7]:
def f(x):
    return 3*x

In [8]:
x1 = 1
h = 1 # some change
x2 = x1 + h

print(f'''
As x moves from {x1} to {x2}
The mapping moves from {f(x1)} to {f(x2)}
''')

As x moves from 1 to 2
The mapping moves from 3 to 6



We can calculate the rate at which things change by dividing the change in the "rise" (y-axis) by the change in the "run" (x-axis).

$$m = \frac{f(x_2) - f(x_1)}{x_2 - x_1}$$

In [9]:
(f(x2) - f(x1))/(x2 - x1)

Out[9]:
3.0

We can observe these discrete changes visually.

In [10]:
# plot
x = np.arange(-5, 15, .1)
p = plot()
p.line(x,f(x),line_width=3,alpha=.5)

# How much does a one unit change in x change y?

p.scatter(0,f(0),color='black',size=6)
p.scatter(1,f(1),color='black',size=6)

# Rise (red): the change along the y-axis
p.line([0,0],[0,f(1)],color='red',line_width=3,alpha=.5)
# Run (black): the change along the x-axis
p.line([1,0],[f(1),f(1)],color='black',line_width=3,alpha=.5)

show(p)


For a linear function, it doesn't matter how big the difference between the two points is. The rate of change is always the same (i.e., it's constant).

In [11]:
h = 100 # A larger change
print((f(x1 + h) - f(x1))/(h))

h = .00001 # A smaller change
print((f(x1 + h) - f(x1))/(h))

3.0
3.000000000019653
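The trailing digits in the second result (3.000000000019653 rather than exactly 3.0) are floating-point round-off from dividing by a tiny change. A common refinement, not used in this notebook, is the central difference, which nudges symmetrically around the point and tends to be more accurate:

```python
def central_diff(func, x, h=.00001):
    # Symmetric nudge: (f(x+h) - f(x-h)) / (2h)
    return (func(x + h) - func(x - h)) / (2*h)

def f(x):
    return 3*x

def g(x):
    return x**2

print(central_diff(f, 1))  # ~3.0
print(central_diff(g, 3))  # ~6.0, the exact slope of x^2 at x = 3
```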


Note the subtle shift in notation.

$$\frac{f(x_2) - f(x_1)}{x_2 - x_1} \to \frac{f(x_1+h) - f(x_1)}{h}$$

Let's now extend this to a non-linear function.

In [12]:
def g(x):
    return x**2


In addition, let's build a simple function that computes this discrete change.

In [13]:
def change(x,nudge=1):
    '''
    Take slight changes in x ("nudges") and see the discrete difference in y
    '''
    return (g(x+nudge)-g(x))/nudge

for i in np.arange(-5,5):
    print(f'''{i} goes in and {g(i)} comes out. The degree of change is {change(i)}''')

-5 goes in and 25 comes out. The degree of change is -9.0
-4 goes in and 16 comes out. The degree of change is -7.0
-3 goes in and 9 comes out. The degree of change is -5.0
-2 goes in and 4 comes out. The degree of change is -3.0
-1 goes in and 1 comes out. The degree of change is -1.0
0 goes in and 0 comes out. The degree of change is 1.0
1 goes in and 1 comes out. The degree of change is 3.0
2 goes in and 4 comes out. The degree of change is 5.0
3 goes in and 9 comes out. The degree of change is 7.0
4 goes in and 16 comes out. The degree of change is 9.0


The degree of change is different for different inputs of $x$. When we plot the function, we can see clearly why this is.

In [14]:
x = np.arange(-5,5+.1,.1)
p = plot()
p.line(x,g(x),line_width=3)
show(p)


Let's see if we can plot the rate at which things change, using our change function. Moreover, let's see what happens as we take smaller and smaller "nudges" (changes in x).

In [15]:
for i in [4,2,1,.5,.01,.001,.00001]:
p.line(x,change(x,nudge=i),color="red",line_width=3,alpha=.2)
show(p)


This idea of making our nudges smaller and smaller is known as taking a limit. That is, as our changes approach 0 (i.e. get smaller and smaller), we get a closer and closer approximation of the rate of change at a given point.

$$\frac{f(x_1+.00001) - f(x_1)}{.00001}$$

$$\frac{f(x_1+.00000000001) - f(x_1)}{.00000000001}$$

$$\lim_{h \to 0} \frac{f(x_1+h) - f(x_1)}{h} = \frac{d y}{d x} = f'(x)$$

Ultimately we converge on the "Instantaneous rate of change": the rate of change at a specific point.
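To watch this convergence numerically (an illustrative sketch, not a cell from the original notebook), we can evaluate the difference quotient for $g(x) = x^2$ at $x = 3$ with ever smaller values of $h$:

```python
def g(x):
    return x**2

x1 = 3  # evaluate the rate of change at x = 3
for h in [1, .1, .01, .001, .0001]:
    approx = (g(x1 + h) - g(x1)) / h
    print(f"h = {h:<7} approximation = {approx}")
# The approximations shrink toward 6, the instantaneous rate of change at x = 3.
```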

### Calculating the derivative

$$\lim_{h \to 0} \frac{g(x+h) - g(x)}{h}$$

$$\lim_{h \to 0} \frac{ (x+h)^2 - x^2}{h}$$

$$\lim_{h \to 0} \frac{ x^2 + 2hx + h^2 - x^2}{h}$$

$$\lim_{h \to 0} \frac{ 2hx + h^2 }{h}$$

$$\lim_{h \to 0} 2x + h$$

$$g'(x) = 2x$$

In [16]:
def g1(x):
    return 2*x

In [17]:
p.line(x,g1(x),line_width=3,color='black',line_dash='dashed')
show(p)


This maps onto what we had before! Naturally a closed-form solution is always preferable, but it is quite interesting that we can approximate the rate of change by plugging a range of values into our computer.
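One way to check that agreement directly (a quick sanity check, not part of the original notebook): compare the nudge-based approximation against the closed-form $2x$ at a few points.

```python
import numpy as np

def g(x):
    return x**2

def change(x, nudge=.00001):
    # Difference quotient with a tiny nudge
    return (g(x + nudge) - g(x)) / nudge

xs = np.array([-3.0, 0.0, 2.5, 7.0])
print(np.allclose(change(xs), 2*xs, atol=1e-3))  # True
```

The approximation is off by exactly the nudge (the quotient works out to $2x + h$), so shrinking the nudge shrinks the error.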

In [18]:
def deriv(x,func,nudge=.001):
    '''
    More generic derivative function
    '''
    return (func(x + nudge) - func(x))/nudge


Does this work on a trickier function? Let's redefine $h(x)$ as $\sin x$ and apply our simple derivative function to it.

In [19]:
def h(x):
    return np.sin(x)

In [20]:
p = plot()
p.line(x,h(x),line_width=3)
p.line(x,deriv(x,h,nudge=.0001),color="red",line_width=3,alpha=.3,line_dash='dashed')
show(p)


What does the derivative look like? The $\cos x$ function (which is in fact the derivative of $\sin x$)!
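We can confirm this numerically rather than just visually. A quick check (my addition, reusing the `deriv` idea from above) that the approximated derivative of $\sin x$ tracks $\cos x$ across the whole plotted range:

```python
import numpy as np

def deriv(x, func, nudge=.0001):
    # Difference quotient applied elementwise across an array of x values
    return (func(x + nudge) - func(x)) / nudge

x = np.arange(-5, 5, .1)
print(np.allclose(deriv(x, np.sin), np.cos(x), atol=1e-3))  # True
```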

In [21]:
p.line(x,np.cos(x),color="purple",line_width=8,alpha=.1)
show(p)


As we can see, computational approximation can be pretty useful, at the very least for building intuition about what we are doing, and even, sometimes, for checking our math!