
rubenszmm@gmail.com
http://github.com/RubensZimbres




NAIVE BAYES
P(h|D) = P(D|h) . P(h) / P(D)

BAYES OPTIMAL CLASSIFIER
arg max_vj Σ_hi P(vj|hi) . P(hi|D)

MIXTURE OF GAUSSIANS / ANOMALY DETECTION
p(x) = (1 / (σ . √(2π))) . e^(-(x - μ)² / (2σ²))
combined mean of two samples: μ_AB = (n_A . μ_A + n_B . μ_B) / (n_A + n_B)
flag x as anomalous when p(x) falls below a threshold ε (e.g. 0.50)

NAIVE BAYES CLASSIFIER
v_NB = arg max_vj P(vj) . Π_i P(ai|vj)
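A minimal Python sketch of the Gaussian density and the anomaly test above; the threshold value passed as `eps` is hypothetical, not a value from the sheet:

```python
import math

def gaussian_pdf(x, mu, sigma):
    """Gaussian density p(x) = 1/(sigma*sqrt(2*pi)) * exp(-(x-mu)^2 / (2*sigma^2))."""
    return math.exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

def is_anomaly(x, mu, sigma, eps):
    """Flag x as anomalous when its density falls below the threshold eps."""
    return gaussian_pdf(x, mu, sigma) < eps

print(round(gaussian_pdf(0.0, 0.0, 1.0), 4))  # 0.3989 -- peak of the standard normal
print(is_anomaly(5.0, 0.0, 1.0, 1e-3))        # True -- far from the mean
```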

BAYES MAP (maximum a posteriori)
h_MAP = arg max_h P(D|h) . P(h)

MAXIMUM LIKELIHOOD
h_ML = arg max_h P(D|h)

EM ALGORITHM (mixture of Gaussians)
E-step: E[z_ij] = p(x_i|μ_j) / Σ_n p(x_i|μ_n)
M-step: μ_j = Σ_i E[z_ij] . x_i / Σ_i E[z_ij]


TOTAL PROBABILITY
P(B) = Σ_i P(B|A_i) . P(A_i)

LAPLACE ESTIMATE (small samples)
P = (n_c + 0.5) / (n + 1)


BAYESIAN NETWORKS
the joint distribution factorizes over the graph:
P(X₁, ..., Xₙ) = Π_i P(Xᵢ | parents(Xᵢ))
each node stores a conditional probability table, e.g. P(Y = 1 | X₁ = 1, X₂ = 0)

LIMITS
f'(x) = lim_{h→0} [f(x + h) - f(x)] / h

CHAIN RULE
[f(g(x))]' = f'(g(x)) . g'(x)
solve f(x), apply in g(x)

DERIVATIVES
d/dx xⁿ = n . xⁿ⁻¹

VARIANCE
s² = Σ(x - x̄)² / (n - 1)

PRODUCT RULE
(f . g)' = f' . g + f . g'

STANDARD DEVIATION
s = √( Σ(x - x̄)² / (n - 1) )

COVARIANCE
cov(x, y) = Σ(x - x̄) . (y - ȳ) / (n - 1)

SUM OF SQUARED ERRORS
SSE = Σ(y - ŷ)²
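The variance and covariance formulas, sketched in plain Python on a made-up sample:

```python
import math

def variance(xs):
    """Sample variance: sum((x - mean)^2) / (n - 1)."""
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

def covariance(xs, ys):
    """Sample covariance: sum((x - x_bar) * (y - y_bar)) / (n - 1)."""
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (len(xs) - 1)

xs = [1, 2, 3, 4, 5]
print(variance(xs))                      # 2.5
print(math.sqrt(variance(xs)))           # standard deviation
print(covariance(xs, [2, 4, 6, 8, 10]))  # 5.0 -- y = 2x doubles the spread
```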

COST FUNCTION
J(θ) = (1/2m) . Σ_i (h_θ(x_i) - y_i)²

CONFIDENCE INTERVAL
x̄ ± 1.96 . s / √n

NUMBER OF EXAMPLES (PAC bound)
m ≥ (1/ε) . (ln|H| + ln(1/δ))

CHI SQUARED
χ² = Σ (observed - expected)² / expected


R SQUARED
R² = 1 - Σ(y - ŷ)² / Σ(y - ȳ)²

MARKOV CHAINS
the state distribution evolves by the transition matrix T:
π_{t+1} = π_t . T

LOSS
loss = bias² + variance + noise


K NEAREST NEIGHBOR
d(x_i, x_j) = √( Σ_r (a_r(x_i) - a_r(x_j))² )

LINEAR REGRESSION
b = (n . Σxy - Σx . Σy) / (n . Σx² - (Σx)²)
a = ȳ - b . x̄
ŷ = a + b . x

WEIGHTED NEAREST NEIGHBOR
f(x_q) = Σ_i w_i . f(x_i) / Σ_i w_i
w_i = 1 / d(x_q, x_i)²
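A sketch of the least-squares slope and intercept above; the sample points are invented so the fit recovers the line exactly:

```python
def least_squares(xs, ys):
    """Slope b = (n*Sxy - Sx*Sy) / (n*Sxx - Sx^2); intercept a = y_bar - b*x_bar."""
    n = len(xs)
    sx, sy = sum(xs), sum(ys)
    sxy = sum(x * y for x, y in zip(xs, ys))
    sxx = sum(x * x for x in xs)
    b = (n * sxy - sx * sy) / (n * sxx - sx ** 2)
    a = sy / n - b * sx / n
    return a, b

# Points on the line y = 2x + 1 recover a = 1, b = 2 exactly.
a, b = least_squares([0, 1, 2, 3], [1, 3, 5, 7])
print(a, b)  # 1.0 2.0
```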

LOGISTIC REGRESSION
p = 1 / (1 + e^-(a + b.x))
cost = -[ y . log(p) + (1 - y) . log(1 - p) ]
classify as y = 0 if p < 0.5, y = 1 otherwise

PRINCIPAL COMPONENTS ANALYSIS
1. standardize: z = (x - μ) / σ
2. covariance matrix: C = (1/(n - 1)) . Zᵀ . Z
3. eigenvectors/eigenvalues of C give the components
4. project: Y = Z . [v₁ ... v_k]
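The sigmoid and the logistic cost in Python, assuming the linear score z = a + b.x has already been computed:

```python
import math

def sigmoid(z):
    """p = 1 / (1 + e^-z), with z = a + b*x."""
    return 1.0 / (1.0 + math.exp(-z))

def log_loss(y, p):
    """Logistic regression cost: -[y*log(p) + (1-y)*log(1-p)]."""
    return -(y * math.log(p) + (1 - y) * math.log(1 - p))

print(sigmoid(0))                  # 0.5 -- the decision boundary
print(round(log_loss(1, 0.9), 4))  # 0.1054 -- small loss for a confident correct prediction
```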

ENTROPY
H(X) = -Σ_x p(x) . log p(x)

JOINT ENTROPY
H(X, Y) = -Σ_{x,y} p(x, y) . log p(x, y)

CONDITIONAL ENTROPY
H(Y|X) = -Σ_{x,y} p(x, y) . log p(y|x)

DECISION TREES
H(S) = -Σ_{i=1..c} p_i . log p_i
information gain: Gain(S, A) = H(S) - Σ_v (|S_v| / |S|) . H(S_v)

MUTUAL INFORMATION
I(X; Y) = H(Y) - H(Y|X)
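A sketch of entropy and information gain (log base 2); the `splits` argument and its (weight, probs) layout are an assumption of this sketch:

```python
import math

def entropy(probs):
    """H = -sum(p * log2(p)) over a class distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def information_gain(parent_probs, splits):
    """Gain = H(S) - sum(|Sv|/|S| * H(Sv)); splits = [(weight, probs), ...]."""
    return entropy(parent_probs) - sum(w * entropy(p) for w, p in splits)

print(entropy([0.5, 0.5]))  # 1.0 bit for a fair coin
# A split that separates the classes perfectly recovers the full entropy:
print(information_gain([0.5, 0.5], [(0.5, [1.0]), (0.5, [1.0])]))  # 1.0
```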


RULE INDUCTION (FOIL-style gain)
Gain = t . [ log(p₁/(p₁ + n₁)) - log(p₀/(p₀ + n₀)) ]

RULE VOTE
weight = accuracy . coverage

EIGENVECTOR CENTRALITY = PAGE RANK
PR(u) = (1 - d)/N + d . Σ_{v→u} PR(v) / L(v)
where d = damping factor and L(v) = number of outgoing links of v
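The PageRank recursion can be iterated directly; the three-page graph here is invented for illustration:

```python
def pagerank(links, d=0.85, iters=50):
    """PR(u) = (1-d)/N + d * sum(PR(v)/L(v)) over pages v linking to u."""
    n = len(links)
    pr = {u: 1.0 / n for u in links}
    for _ in range(iters):
        new = {}
        for u in links:
            incoming = sum(pr[v] / len(links[v]) for v in links if u in links[v])
            new[u] = (1 - d) / n + d * incoming
        pr = new
    return pr

# Toy graph: A <-> B, and C also links to A.
ranks = pagerank({"A": ["B"], "B": ["A"], "C": ["A"]})
print(max(ranks, key=ranks.get))  # A -- it receives the most incoming weight
```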
RATING
r̂_ui = r̄_u + k . Σ_v sim(u, v) . (r_vi - r̄_v)

BATCH GRADIENT DESCENT
θ_j := θ_j - α . (1/m) . Σ_i (h_θ(x_i) - y_i) . x_ij

STOCHASTIC GRADIENT DESCENT
θ_j := θ_j - α . (h_θ(x_i) - y_i) . x_ij

SIMILARITY (Pearson correlation)
sim(a, b) = Σ_p (r_ap - r̄_a)(r_bp - r̄_b) / ( √Σ_p (r_ap - r̄_a)² . √Σ_p (r_bp - r̄_b)² )
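One batch gradient-descent update for a two-parameter linear hypothesis; the data and learning rate are invented for illustration:

```python
def batch_gd_step(theta, xs, ys, alpha):
    """One batch update: theta_j -= alpha * (1/m) * sum((h(x) - y) * x_j)."""
    m = len(xs)
    t0, t1 = theta
    g0 = sum((t0 + t1 * x - y) for x, y in zip(xs, ys)) / m
    g1 = sum((t0 + t1 * x - y) * x for x, y in zip(xs, ys)) / m
    return (t0 - alpha * g0, t1 - alpha * g1)

theta = (0.0, 0.0)
xs, ys = [0, 1, 2, 3], [1, 3, 5, 7]  # data on the line y = 2x + 1
for _ in range(5000):
    theta = batch_gd_step(theta, xs, ys, 0.05)
print(round(theta[0], 3), round(theta[1], 3))  # approaches (1.0, 2.0)
```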




CONTENT-BASED RECOMMENDATION
score an item by the similarity between the item profile and the user profile (e.g. cosine of TF-IDF weight vectors)

NEURAL NETWORKS
net = Σ_{i=1..n} w_i . x_i + b

COLLABORATIVE FILTERING
pred(a, p) = r̄_a + Σ_b sim(a, b) . (r_bp - r̄_b) / Σ_b |sim(a, b)|

LOGIT
log( p / (1 - p) ) = a + b.x

SOFTMAX NORMALIZATION
σ(z)_j = e^(z_j) / Σ_k e^(z_k)
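The softmax formula in Python; subtracting max(z) is a standard numerical-stability trick, not something stated in the sheet:

```python
import math

def softmax(zs):
    """sigma(z)_j = e^(z_j) / sum_k e^(z_k), shifted by max(z) for stability."""
    m = max(zs)
    exps = [math.exp(z - m) for z in zs]
    s = sum(exps)
    return [e / s for e in exps]

probs = softmax([1.0, 2.0, 3.0])
print(round(sum(probs), 6))     # 1.0 -- the outputs form a distribution
print(probs.index(max(probs)))  # 2 -- the largest logit wins
```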


CROSS ENTROPY
H(p, q) = -Σ_x p(x) . log q(x)

PERCEPTRON
o = sign( Σ_{i=1..n} w_i . x_i + b )

LOSS
loss = L(y, f(x))

PERCEPTRON TRAINING
w_i := w_i + Δw_i
Δw_i = η . (t - o) . x_i

L2 REGULARIZATION
cost = error + (λ/2) . Σ w²

ERROR FOR A SIGMOID
δ = o . (1 - o) . (t - o)
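The perceptron training rule above, run on an invented, linearly separable AND dataset:

```python
def train_perceptron(data, eta=0.1, epochs=20):
    """Perceptron rule: w_i += eta * (t - o) * x_i, with o = step(w . x + b)."""
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for x, t in data:
            o = 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0
            for i in range(2):
                w[i] += eta * (t - o) * x[i]
            b += eta * (t - o)
    return w, b

# Linearly separable AND function.
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train_perceptron(data)
print([1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0 for (x, _) in data])  # [0, 0, 0, 1]
```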




SIGMOID
σ(x) = 1 / (1 + e^-(a + b.x))

AVOID OVERFIT NEURAL NETWORKS (L2)
J = Σ(y - ŷ)² / 2 + F . Σ w²
where F = penalty

RADIAL BASIS FUNCTION
K(x, x') = e^( -(x - x')² / (2σ²) )


BACKPROPAGATION
output unit: δ_k = o_k . (1 - o_k) . (t_k - o_k)
hidden unit: δ_h = o_h . (1 - o_h) . Σ_k w_kh . δ_k
update: w_ji := w_ji + η . δ_j . x_ji

MOMENTUM
Δw_ji(t) = η . δ_j . x_ji + M . Δw_ji(t - 1)
where M = momentum

NESTEROV
v_t = μ . v_{t-1} - η . ∇f(θ + μ . v_{t-1})

ADAGRAD
θ := θ - ( η / √(G_t + ε) ) . g_t, with g_t = ∇J(θ)

ADADELTA
Δθ_t = -( RMS[Δθ]_{t-1} / RMS[g]_t ) . g_t

NEURAL NETWORKS COST FUNCTION
J(θ) = -(1/m) . Σ_{i=1..m} Σ_{k=1..K} [ y_ik . log(h_θ(x_i))_k + (1 - y_ik) . log(1 - (h_θ(x_i))_k) ] + (λ/2m) . Σ θ²

MOMENTUM
v_t = μ . v_{t-1} + η . ∇J(θ)
θ := θ - v_t

RMSprop
E[g²]_t = 0.9 . E[g²]_{t-1} + 0.1 . g_t²
θ := θ - ( η / √(E[g²]_t + ε) ) . g_t

ADAM
m_t = β₁ . m_{t-1} + (1 - β₁) . g_t
v_t = β₂ . v_{t-1} + (1 - β₂) . g_t²
m̂_t = m_t / (1 - β₁ᵗ)
v̂_t = v_t / (1 - β₂ᵗ)
θ := θ - η . m̂_t / (√v̂_t + ε)
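A single-parameter sketch of the Adam update above; the quadratic objective is invented for illustration:

```python
import math

def adam_step(theta, grad, m, v, t, eta=0.001, b1=0.9, b2=0.999, eps=1e-8):
    """One Adam update for a single parameter."""
    m = b1 * m + (1 - b1) * grad       # first-moment estimate
    v = b2 * v + (1 - b2) * grad ** 2  # second-moment estimate
    m_hat = m / (1 - b1 ** t)          # bias correction
    v_hat = v / (1 - b2 ** t)
    theta = theta - eta * m_hat / (math.sqrt(v_hat) + eps)
    return theta, m, v

# Minimize f(theta) = theta^2 (gradient 2*theta) from theta = 1.
theta, m, v = 1.0, 0.0, 0.0
for t in range(1, 2001):
    theta, m, v = adam_step(theta, 2 * theta, m, v, t, eta=0.01)
print(abs(theta) < 0.1)  # True
```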
SUPPORT VECTOR REGRESSION
f(x) = Σ_i (α_i - α_i*) . K(x_i, x) + b

SUPPORT VECTOR MACHINES
margin = 2 / ||w||  →  arg min ||w||² / 2
subject to y_i . (w . x_i + b) ≥ 1
dual constraint: Σ_i α_i . y_i = 0
kernel trick: K(x_i, x_j) = φ(x_i) . φ(x_j)
RIDGE REGRESSION - L2 REGULARIZATION
β = (Xᵀ.X + λ.I)⁻¹ . Xᵀ.y
cost = Σ(y - ŷ)² + λ . Σ β²

LASSO REGRESSION - L1 REGULARIZATION
cost = Σ(y - ŷ)² + λ . Σ |β|
model: y = a + b.x + ε

GEOMETRIC MEAN
GM(1, 2, 4) = ³√(1 . 2 . 4) = 2

MEDIAN
middle value of the ordered sample (mean of the two middle values when n is even)

SKEWNESS
skewness ≈ 3 . (mean - median) / s
|skewness| < 1 → approximately symmetric

KOLMOGOROV-SMIRNOV
distribution is normal if sig > .05

t TEST
t = (x̄₁ - x̄₂) / √( s₁²/n₁ + s₂²/n₂ )

t TEST, 2 SAMPLES
significant difference if sig < .05; Levene's test checks equality of variances

NON-PARAMETRIC
when the t-test normality assumption fails, use the Mann-Whitney U test, sig < .05

CRONBACH'S ALPHA
acceptable if α > .60-.70

ANOVA (3+ groups)
F = MS_between / MS_within, sig < .05

ARITHMETIC MEAN
x̄ = Σx / n



TOLERANCE
tolerance = 1 - R²; tolerance > .1 indicates no multicollinearity

DISCRIMINANT ANALYSIS
Box's M: sig < .05 rejects H0 (equal covariance matrices)
Wilks' lambda: sig < .05
assumes X₁ ~ N(μ₁, σ₁) and X₂ ~ N(μ₂, σ₂):
p(x) = (1 / (σ . √(2π))) . e^(-(x - μ)² / (2σ²))
pooled mean: μ_AB = (n_A . μ_A + n_B . μ_B) / (n_A + n_B)

VARIANCE INFLATION FACTOR
VIF < 10

ENTER METHOD
15+ cases / variable



STEPWISE METHOD
50+ cases / variable

ERROR MARGIN
margin = 1.96 . s / √n

VARIABLE SELECTION / ACCURACY
F test (e.g. F = 47), sig < .05
confidence interval ~ p value

MISSING DATA
delete the variable if > 15% missing

HYPOTHESES TESTING
reject H0 if p value < .05







TRANSFORMATION
OK when applied to the same variable

MAHALANOBIS DISTANCE
D² = (x - μ)ᵀ . S⁻¹ . (x - μ); values < 4 suggest no outlier

MULTICOLLINEARITY
correlation > .90
VIF < 10
tolerance > .1

MANHATTAN DISTANCE (L1)
d = |x₁ - x₂| + |y₁ - y₂|

NET PRESENT VALUE
PV = FV . (1 + r)⁻ᵗ
NPV = Σ_t CF_t / (1 + r)ᵗ

SUM OF SQUARES (explained)
R² = SS_regression / SS_total = 1 - SS_residual / SS_total
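The Manhattan distance and NPV formulas in Python; the point pair and cashflow series are invented:

```python
def manhattan(p, q):
    """L1 distance: sum of absolute coordinate differences."""
    return sum(abs(a - b) for a, b in zip(p, q))

def npv(rate, cashflows):
    """NPV = sum(CF_t / (1 + rate)^t), with t = 0 for the initial cashflow."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

print(manhattan((1, 2), (4, 6)))           # 7 -- |1-4| + |2-6|
print(round(npv(0.10, [-100, 60, 60]), 2)) # 4.13
```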

MARKOV DECISION PROCESS
V(s) = R(s) + γ . max_a Σ_s' P(s'|s, a) . V(s')
π(s) = arg max_a Σ_s' P(s'|s, a) . V(s')
Q(s, a) = R(s) + γ . Σ_s' P(s'|s, a) . max_a' Q(s', a')
Q-learning update: Q(s, a) := r + γ . max_a' Q(s', a')

STANDARD ERROR ESTIMATE (SEE)
SEE = √( Σ(y - ŷ)² / (n - 2) )
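The Bellman update above, iterated to a fixed point on an invented two-state chain:

```python
def value_iteration(states, actions, P, R, gamma=0.9, iters=100):
    """V(s) = R(s) + gamma * max_a sum_s' P[s][a][s'] * V(s')."""
    V = {s: 0.0 for s in states}
    for _ in range(iters):
        V = {s: R[s] + gamma * max(sum(p * V[s2] for s2, p in P[s][a].items())
                                   for a in actions)
             for s in states}
    return V

# Two-state toy chain: "stay" keeps the state, "move" switches it.
states, actions = ["A", "B"], ["stay", "move"]
P = {"A": {"stay": {"A": 1.0}, "move": {"B": 1.0}},
     "B": {"stay": {"B": 1.0}, "move": {"A": 1.0}}}
R = {"A": 0.0, "B": 1.0}
V = value_iteration(states, actions, P, R)
print(V["B"] > V["A"])  # True -- the rewarding state is worth more
```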

PROBABILITY (coins)
P(A) = favorable outcomes / total outcomes

COMPLEMENTARY EVENT
P(Ā) = 1 - P(A)

FREQUENTIST
P(A) = lim_{n→∞} n_A / n

MARGINAL PROBABILITY
P(X = x) = Σ_y P(X = x, Y = y)

AXIOMATIC
P(A) ≥ 0
P(Ω) = 1
P(A ∪ B) = P(A) + P(B) for disjoint A, B

PROBABILITY OF A AND B
P(A|B) = P(A ∩ B) / P(B)

PROBABILITY THEOREMS / CONDITIONAL PROBABILITY
UNION = A or B
mutually exclusive: P(A ∪ B) = P(A) + P(B)
not mutually exclusive: P(A ∪ B) = P(A) + P(B) - P(A ∩ B)
three events: P(A ∪ B ∪ C) = P(A) + P(B) + P(C) - P(A ∩ B) - P(A ∩ C) - P(B ∩ C) + P(A ∩ B ∩ C)
independent: P(A ∩ B) = P(A) . P(B)

BAYES (52 cards, cancer)
P(A|B) = P(B|A) . P(A) / P(B)

INTEGRALS
∫ xⁿ dx = xⁿ⁺¹ / (n + 1) + C
example: ∫₁² x² dx = 2³/3 - 1³/3 = 7/3

BINOMIAL DISTRIBUTION (0/1, success)
P(k) = C(n, k) . pᵏ . (1 - p)ⁿ⁻ᵏ
mean = n . p
variance = n . p . (1 - p)

PRODUCT RULE
(u . v)' = u' . v + u . v'

CHAIN RULE
[f(g(x))]' = f'(g(x)) . g'(x)
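The binomial probability in Python (`math.comb` gives C(n, k)); the examples are invented:

```python
from math import comb

def binomial_pmf(k, n, p):
    """P(k successes in n trials) = C(n,k) * p^k * (1-p)^(n-k)."""
    return comb(n, k) * p ** k * (1 - p) ** (n - k)

print(binomial_pmf(1, 2, 0.5))  # 0.5 -- one head in two fair flips
# The probabilities over all k sum to 1:
print(round(sum(binomial_pmf(k, 10, 0.3) for k in range(11)), 6))  # 1.0
```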

INTEGRATION
∫ c dx = c . x + C

TOTAL PROBABILITY (urns)
P(B) = Σ_i P(A_i) . P(B|A_i)

DIFFERENTIATION
f'(x) = lim_{h→0} [f(x + h) - f(x)] / h
derivative of a constant = 0

PROBABILITY OF k SUCCESSES IN n TRIALS
P(k) = C(n, k) . pᵏ . (1 - p)ⁿ⁻ᵏ

LINEAR ALGEBRA

ADDITION
[1 2]   [2 2]   [3 4]
[4 3] + [5 3] = [9 6]

SCALAR MULTIPLY
    [2 2]   [6  6]
3 . [5 3] = [15 9]

MATRIX-VECTOR MULTIPLICATION (rows x columns; columns of A = rows of the vector)
[1 2 3] [1]   [5]
[1 4 5].[2] = [9]
[0 3 2] [0]   [6]
OR as a combination of the columns:
1.[1 1 0] + 2.[2 4 3] + 0.[3 5 2] = [5 9 6]

MATRIX MULTIPLICATION
[1 2 3] [0 3]   [8  24]
[0 4 5].[1 3] = [14 37]
        [2 5]
row vector x matrix:
          [1 2 3]
[1 2 0] . [4 5 6] = [9 12 15]
          [7 8 9]

ELIMINATION (elementary matrix on the left)
[ 1 0 0] [1 2 1]   [1 2  1]
[-3 1 0].[3 8 1] = [0 2 -2]
[ 0 0 1] [0 4 1]   [0 4  1]

IMPORTANT
matrix x matrix: columns of A = rows of B
A_{m,n} . B_{n,p} = C_{m,p}
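The worked products above can be checked with a small pure-Python multiplier:

```python
def matmul(A, B):
    """Rows of A times columns of B; requires len(A[0]) == len(B)."""
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)] for row in A]

# The 2x3 times 3x2 example gives a 2x2:
print(matmul([[1, 2, 3], [0, 4, 5]], [[0, 3], [1, 3], [2, 5]]))  # [[8, 24], [14, 37]]

# The matrix-vector example, with the vector written as a column:
print(matmul([[1, 2, 3], [1, 4, 5], [0, 3, 2]], [[1], [2], [0]]))  # [[5], [9], [6]]
```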

PERMUTATION
LEFT = exchange rows:
[0 1] [a b]   [c d]
[1 0].[c d] = [a b]
RIGHT = exchange columns:
[a b] [0 1]   [b a]
[c d].[1 0] = [d c]

PROPERTIES
not commutative: A.B ≠ B.A
associative: A.(B.C) = (A.B).C
inverse (square matrices only): A⁻¹.A = A.A⁻¹ = I

IDENTITY
[1 0]   [1 0 0]
[0 1] , [0 1 0]
        [0 0 1]



DETERMINANT
|1 3|
|4 2| = 1.2 - 3.4 = -10

3x3 (rule of Sarrus, repeating the first two columns):
|1 4 7| 1 4
|2 5 8| 2 5 = 1.5.9 + 4.8.3 + 7.2.6 - 7.5.3 - 1.8.6 - 4.2.9 = 0
|3 6 9| 3 6

DIAGONAL
[2 0 0]
[0 2 0]
[0 0 2]

TRANSPOSE
           [1 4]
[1 2 3]ᵀ = [2 5]
[4 5 6]    [3 6]

PRICE ELASTICITY OF DEMAND (midpoint formula)
E = ( (Q₂ - Q₁) / (Q₂ + Q₁) ) . ( (P₂ + P₁) / (P₂ - P₁) )
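The 2x2 determinant and the rule of Sarrus, checked on the sheet's own matrices:

```python
def det2(m):
    """2x2 determinant: ad - bc."""
    return m[0][0] * m[1][1] - m[0][1] * m[1][0]

def det3(m):
    """3x3 determinant by the rule of Sarrus."""
    return (m[0][0]*m[1][1]*m[2][2] + m[0][1]*m[1][2]*m[2][0] + m[0][2]*m[1][0]*m[2][1]
            - m[0][2]*m[1][1]*m[2][0] - m[0][0]*m[1][2]*m[2][1] - m[0][1]*m[1][0]*m[2][2])

print(det2([[1, 3], [4, 2]]))                   # -10 -- 1*2 - 3*4
print(det3([[1, 4, 7], [2, 5, 8], [3, 6, 9]]))  # 0 -- the columns are linearly dependent
```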
