Hypothesis test: did the conversion rate change?
Computing the likelihood of a time series
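Each day t we observe c_t conversions out of n_t visitors. Treating the days as independent binomial draws with conversion rates theta_t (this is the model the code below implements), the log-likelihood of the whole series is

\log L(\theta) = \sum_{t=1}^{T} \log \mathrm{Binomial}(c_t \mid n_t, \theta_t)
              = \sum_{t=1}^{T} \left[ \log \binom{n_t}{c_t} + c_t \log \theta_t + (n_t - c_t)\log(1 - \theta_t) \right]

which is exactly the sum of binom.logpmf terms computed below. Working in log space avoids the numerical underflow you would get from multiplying many small probabilities directly.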
Computing the likelihood with Python
from pylab import *              # provides array(), exp(), sum(), full(), etc.
from scipy.stats import binom

n = array([1000.] * 20)          # 1,000 visitors observed on each of 20 days
c = array([51, 40, 51, 41, 44, 39, 54, 41, 61, 52,
           65, 58, 44, 49, 34, 39, 24, 28, 36, 43])   # conversions observed each day
theta = array([0.05] * 20)       # hypothesized conversion rate: a constant 5% every day
def log_likelihood(n, c, theta):
    # Sum of the per-day binomial log-probabilities of c conversions out of n visitors
    return sum(binom.logpmf(c, n, theta))

def likelihood(n, c, theta):
    # Exponentiate the log-likelihood; the raw likelihood can be vanishingly small
    return exp(log_likelihood(n, c, theta))
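As a quick usage sketch (the numeric output is not reproduced here), the helpers above can be applied directly to the example arrays:

ll = log_likelihood(n, c, theta)   # log-probability of the observed counts under a constant 5% rate
print(ll)                          # the log of a small probability, so negative
print(likelihood(n, c, theta))     # exp of the above; tiny, being a product of many probabilities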
In [1]: from pylab import *
In [2]: from scipy.stats import binom
In [3]: def log_likelihood(n, c, theta):
   ...:     return sum(binom.logpmf(c, n, theta))
   ...:
In [4]: def bayesian_jump_detector(n, c, base_cr=0.05, null_prior=0.98, post_jump_cr=0.03):
   ...:     """Returns a posterior describing our beliefs about whether a jump
   ...:     in the conversion rate occurred, and if so when.
   ...:
   ...:     The first return value is the probability that the null hypothesis (no jump)
   ...:     is true; the second is an array giving the probability of a jump at each time.
   ...:     """
   ...:     theta = full(n.shape, base_cr)
   ...:     likelihood = zeros(shape=(n.shape[0] + 1,), dtype=float)  # first element represents the probability of no jump
   ...:
   ...:     likelihood[0] = null_prior                        # prior probability of the null hypothesis
   ...:     likelihood[1:] = (1.0 - null_prior) / n.shape[0]  # remaining prior mass, split evenly over possible jump times
   ...:
   ...:     likelihood[0] = likelihood[0] * exp(log_likelihood(n, c, theta))
   ...:     for i in range(n.shape[0]):
   ...:         theta[:] = base_cr
   ...:         theta[i:] = post_jump_cr
   ...:         likelihood[i+1] = likelihood[i+1] * exp(log_likelihood(n, c, theta))
   ...:     likelihood /= sum(likelihood)  # normalize: prior times likelihood becomes the posterior
   ...:     return (likelihood[0], likelihood[1:])
   ...:
In [5]: n = full((20,), 100)
...: c = binom(100, 0.05).rvs(20) #No jump in CR occurs
...:
...: bayesian_jump_detector(n, c, null_prior=0.99)
...:
Out[5]:
(0.99909917827083117,
array([ 1.76896708e-09, 1.84711683e-09, 5.58550880e-09,
5.83226645e-09, 1.03635573e-08, 3.13384293e-08,
1.92289232e-08, 9.89510066e-08, 5.09196569e-07,
1.07887083e-07, 9.44781935e-07, 1.40796089e-05,
8.63909656e-06, 7.56537499e-05, 3.89310137e-04,
1.40370734e-04, 5.06124581e-05, 8.99350387e-05,
3.24272250e-05, 9.80569000e-05]))
In [6]: n = full((20,), 100)
...: c = binom(100, 0.05).rvs(20)
...: c[13:] = binom(100, 0.03).rvs(7) #Jump occurs at t=13
...:
...: bayesian_jump_detector(n, c, null_prior=0.99)
...:
Out[6]:
(0.42826882885196343,
array([ 0.05374295, 0.09549772, 0.02023378, 0.0359541 , 0.03754249,
0.03920105, 0.02405334, 0.01475887, 0.02622555, 0.0466011 ,
0.04865985, 0.05080955, 0.0311762 , 0.01912938, 0.01997448,
0.00423213, 0.00259679, 0.00093631, 0.00019838, 0.00020715]))
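With no real jump, the detector leaves almost all of its posterior mass (about 0.999) on the null hypothesis; after the simulated drop at t=13, the null probability falls to roughly 0.43 and the remaining mass is spread across the candidate jump times. To inspect that spread visually, one option is to plot the second return value; the following is just an illustrative sketch reusing the pylab functions already imported above:

null_prob, jump_posterior = bayesian_jump_detector(n, c, null_prior=0.99)
bar(arange(len(jump_posterior)), jump_posterior)   # one bar per candidate jump time
xlabel('time step')
ylabel('posterior probability of a jump at t')
show()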