
Information Theory, Inference, and Learning Algorithms

If you are looking for a very good book about machine learning, have a look at the free online book Information Theory, Inference, and Learning Algorithms by David J. C. MacKay.


Popular posts from this blog

how to make HCL and G graphs, and on-the-fly composition of HCL and G for KALDI

Well, I had to do something again ;-) The task is to generate/create/update a decoding graph for KALDI on the fly. In my case, I aim at changing the G (grammar) in the context of a dialogue system. One could generate a new HCLG, but this would take a lot of time, as it involves FST determinization, epsilon removal, minimization, etc. Therefore, I tried to use on-the-fly composition of a statically prepared HCL and G. At first I struggled with it, but later I made it work; see https://github.com/jpuigcerver/kaldi-decoders/issues/1

Here is a short summary. In the end, I managed to get LabelLookAheadMatcher to work. It is mostly based on the code and examples in opendcd, e.g. https://github.com/opendcd/opendcd/blob/master/script/makegraphotf.sh . First, here is how I build and prepare the HCL and G. Please note that OpenFST must be compiled with --enable-lookahead-fsts, see http://www.openfst.org/twiki/bin/view/FST/ReadMe .

#--------------- fstdeterminize ${lang}/...
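The preview is truncated here, so the full recipe is missing. As a rough sketch of the approach it describes, loosely following opendcd's makegraphotf.sh and the lookahead tools shipped with OpenFST, the core steps could look roughly like the commands below; the file names (hcl.fst, g.fst, g.irelabel) are placeholders, not the post's actual paths, and the exact flags should be checked against your OpenFST build.

# Sketch only, not the post's exact commands. Assumes OpenFST was built
# with --enable-lookahead-fsts and that hcl.fst and g.fst already exist.

# Convert the static HCL into an output-label lookahead FST and save the
# relabeling pairs that G's input labels have to be mapped through.
fstconvert --fst_type=olabel_lookahead \
           --save_relabel_opairs=g.irelabel \
           hcl.fst hcl_lookahead.fst

# Relabel G's input symbols accordingly and sort its arcs for composition.
fstrelabel --relabel_ipairs=g.irelabel g.fst \
  | fstarcsort --sort_type=ilabel > g_relabeled.fst

# A decoder that supports lookahead composition can then compose
# hcl_lookahead.fst with g_relabeled.fst on the fly instead of using a
# statically built HCLG.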

Viterbi Algorithm in C++ and using STL

To practice my C++ and STL skills, I implemented the Viterbi algorithm example from the Wikipedia page: http://en.wikipedia.org/wiki/Viterbi_algorithm . The original algorithm was implemented in Python; I reimplemented the example in C++ using the STL (mainly the vector and map classes). This code is in the public domain, so use it as you want. The complete solution for MS Visual C++ 2008 can be found at http://filip.jurcicek.googlepages.com/ViterbiSTL.rar

// ViterbiSTL.cpp : a C++ and STL implementation of the Wikipedia example
// Wikipedia: http://en.wikipedia.org/wiki/Viterbi_algorithm#A_concrete_example
// It is as accurate an implementation as was possible
#include "stdafx.h"
#include "string"
#include "vector"
#include "map"
#include "iostream"

using namespace std;

//states = ('Rainy', 'Sunny')
//
//observations = ('walk', 'shop', 'clean')
//
//start_probability = {'Rainy': 0.6...

kaldi editing nnet3 chain model - adding a softmax layer on top of the chain output

I had to do one more thing: edit a trained kaldi nnet3 chain model and add a softmax layer on top of the chain output. The reason for this is to get "probability"-like output directly from the chain model.

First, let's look at the nnet structure:

nnet3-am-info final.mdl
input-dim: 20
ivector-dim: -1
num-pdfs: 6105
prior-dimension: 0
# Nnet info follows.
left-context: 15
right-context: 15
num-parameters: 15499085
modulus: 1
input-node name=input dim=20
component-node name=L0_fixaffine component=L0_fixaffine input=Append(Offset(input, -1), input, Offset(input, 1)) input-dim=60 output-dim=60
component-node name=Tdnn_0_affine component=Tdnn_0_affine input=L0_fixaffine input-dim=60 output-dim=625
component-node name=Tdnn_0_relu component=Tdnn_0_relu input=Tdnn_0_affine input-dim=625 output-dim=625
component-node name=Tdnn_0_renorm component=Tdnn_0_renorm input=Tdnn_0_relu input-dim=625 output-dim=625
component-node name=Tdnn_1_affine component=Tdnn_1_affi...
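The preview is cut off here, so the post's actual edit commands are not shown. As an illustration of one way such an edit can be done with standard nnet3 command-line tools, here is a minimal, hedged sketch: it extracts the raw nnet, appends a SoftmaxComponent fed by the node that currently drives the chain output, and adds a new output node. The file and node names (final.mdl, output.affine, output-softmax) are assumptions, not taken from the post; the real input node for the softmax has to be read off the nnet3-am-info output above.

# Sketch only: graft a softmax output onto a trained chain model.
# All names are placeholders; check the real node names with nnet3-am-info.

# 1) Pull the raw nnet out of the acoustic model.
nnet3-am-copy --raw=true final.mdl final.raw

# 2) Config that adds a softmax component and a new output node on top of
#    the node assumed to feed the existing chain output (output.affine).
cat > add_softmax.config <<EOF
component name=softmax type=SoftmaxComponent dim=6105
component-node name=softmax component=softmax input=output.affine
output-node name=output-softmax input=softmax
EOF

# 3) Append the new component and nodes to the existing raw nnet.
nnet3-init final.raw add_softmax.config final_softmax.raw

# final_softmax.raw now contains the original chain output plus a new
# output node named output-softmax that emits "probability"-like values.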