Dolphin whistles example -> move to the frequency domain and use a Sakoe-Chiba band to constrain the dynamic time warping path.
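A minimal sketch of that pipeline in Python, assuming NumPy/SciPy are available: compute a log-spectrogram for each clip, then run DTW with the warping path restricted to a Sakoe-Chiba band. The toy signals, frame sizes, and band radius are illustrative assumptions, not from the original notes.

```python
import numpy as np
from scipy.signal import spectrogram


def dtw_sakoe_chiba(a, b, radius):
    """DTW between two feature sequences (frames x features),
    restricting the warping path to |i - j| <= radius (Sakoe-Chiba band)."""
    n, m = len(a), len(b)
    radius = max(radius, abs(n - m))          # keep the end point reachable
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(max(1, i - radius), min(m, i + radius) + 1):
            d = np.linalg.norm(a[i - 1] - b[j - 1])
            cost[i, j] = d + min(cost[i - 1, j],      # insertion
                                 cost[i, j - 1],      # deletion
                                 cost[i - 1, j - 1])  # match
    return cost[n, m]


def whistle_features(signal, fs):
    """Log-magnitude spectrogram frames used as DTW features."""
    _, _, sxx = spectrogram(signal, fs=fs, nperseg=256, noverlap=128)
    return np.log(sxx + 1e-10).T              # shape: (frames, freq_bins)


if __name__ == "__main__":
    fs = 48000                                 # assumed sample rate
    rng = np.random.default_rng(0)
    clip_a = rng.standard_normal(fs)           # stand-ins for two whistle clips
    clip_b = rng.standard_normal(fs)
    dist = dtw_sakoe_chiba(whistle_features(clip_a, fs),
                           whistle_features(clip_b, fs),
                           radius=10)
    print(f"DTW distance: {dist:.2f}")
```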
Viterbi Path - most probable path through the trellis
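A hedged Python sketch of Viterbi decoding over a small discrete HMM, computed in log space; the toy start/transition/emission probabilities are made up for illustration.

```python
import numpy as np


def viterbi(obs, log_start, log_trans, log_emit):
    """Return the most probable state sequence for the observation indices."""
    n_states = log_trans.shape[0]
    T = len(obs)
    delta = np.empty((T, n_states))            # best log-prob ending in each state
    backptr = np.empty((T, n_states), dtype=int)
    delta[0] = log_start + log_emit[:, obs[0]]
    for t in range(1, T):
        scores = delta[t - 1][:, None] + log_trans   # (prev_state, cur_state)
        backptr[t] = scores.argmax(axis=0)
        delta[t] = scores.max(axis=0) + log_emit[:, obs[t]]
    # Backtrace from the best final state
    path = [int(delta[-1].argmax())]
    for t in range(T - 1, 0, -1):
        path.append(int(backptr[t, path[-1]]))
    return path[::-1]


if __name__ == "__main__":
    log_start = np.log([0.6, 0.4])
    log_trans = np.log([[0.7, 0.3], [0.4, 0.6]])
    log_emit = np.log([[0.5, 0.4, 0.1], [0.1, 0.3, 0.6]])
    print(viterbi([0, 1, 2, 2], log_start, log_trans, log_emit))
```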
Hidden Markov Model training notes:
try many different topologies
use cross-validation (see the sketch after the signal-name examples below)
split independent signals that share the same logical meaning into separate models rather than forcing one model to cover both (analogous to coarticulation of phonemes):
s1_telem_pos_updt_3M22_Zircon
s2_telem_pos_updt_3M22_Zircon
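A sketch of the "try many topologies, cross-validate" advice, assuming the hmmlearn and scikit-learn libraries (neither is named in the notes) and reducing "topology" to the number of hidden states; a left-to-right structure could also be imposed via the transition matrix. The random sequences stand in for telemetry signals like the two above.

```python
import numpy as np
from hmmlearn import hmm
from sklearn.model_selection import KFold


def cv_log_likelihood(sequences, n_states, n_splits=3, seed=0):
    """Mean held-out log-likelihood for a GaussianHMM with n_states states."""
    kf = KFold(n_splits=n_splits, shuffle=True, random_state=seed)
    scores = []
    for train_idx, test_idx in kf.split(np.arange(len(sequences))):
        train = [sequences[i] for i in train_idx]
        test = [sequences[i] for i in test_idx]
        model = hmm.GaussianHMM(n_components=n_states, covariance_type="diag",
                                n_iter=50, random_state=seed)
        model.fit(np.vstack(train), lengths=[len(s) for s in train])
        scores.append(model.score(np.vstack(test),
                                  lengths=[len(s) for s in test]))
    return float(np.mean(scores))


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Toy stand-ins for recorded signal sequences (100 frames, 4 features each)
    sequences = [rng.standard_normal((100, 4)) for _ in range(12)]
    for n_states in (2, 3, 5, 8):              # candidate topologies
        print(n_states, cv_log_likelihood(sequences, n_states))
```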
Putting these signal networks into active memory is prohibitive, so we use stochastic beam search instead. To deal with real-world environments, we can also use Baum-Welch (error/2), statistical grammars (error/4), state tying, and segmental boosting (which may be coming to HTK via GA Tech) to help with context-dependent training.
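To make the Baum-Welch reference concrete, here is a minimal NumPy sketch of one scaled forward-backward pass plus re-estimation for a discrete-observation HMM; stochastic beam search, state tying, statistical grammars, and segmental boosting are not shown, and all parameters are toy values, not from the notes.

```python
import numpy as np


def baum_welch_step(obs, start, trans, emit):
    """One scaled forward-backward (E-step) plus re-estimation (M-step)."""
    T, n = len(obs), len(start)
    alpha = np.zeros((T, n))
    beta = np.zeros((T, n))
    scale = np.zeros(T)

    # Forward pass with per-step scaling to avoid underflow
    alpha[0] = start * emit[:, obs[0]]
    scale[0] = alpha[0].sum()
    alpha[0] /= scale[0]
    for t in range(1, T):
        alpha[t] = (alpha[t - 1] @ trans) * emit[:, obs[t]]
        scale[t] = alpha[t].sum()
        alpha[t] /= scale[t]

    # Backward pass reusing the same scaling factors
    beta[-1] = 1.0
    for t in range(T - 2, -1, -1):
        beta[t] = (trans @ (emit[:, obs[t + 1]] * beta[t + 1])) / scale[t + 1]

    # E-step: state occupancy (gamma) and transition (xi) posteriors
    gamma = alpha * beta
    gamma /= gamma.sum(axis=1, keepdims=True)
    xi = (alpha[:-1, :, None] * trans[None] *
          (emit[:, obs[1:]].T * beta[1:])[:, None, :])
    xi /= xi.sum(axis=(1, 2), keepdims=True)

    # M-step: re-estimate start, transition, and emission probabilities
    new_start = gamma[0]
    new_trans = xi.sum(axis=0) / gamma[:-1].sum(axis=0)[:, None]
    new_emit = np.zeros_like(emit)
    for k in range(emit.shape[1]):
        new_emit[:, k] = gamma[obs == k].sum(axis=0)
    new_emit /= gamma.sum(axis=0)[:, None]
    return new_start, new_trans, new_emit


if __name__ == "__main__":
    obs = np.array([0, 1, 2, 1, 0, 2, 2, 1])   # toy symbol sequence
    start = np.array([0.6, 0.4])
    trans = np.array([[0.7, 0.3], [0.4, 0.6]])
    emit = np.array([[0.5, 0.4, 0.1], [0.1, 0.3, 0.6]])
    for _ in range(5):
        start, trans, emit = baum_welch_step(obs, start, trans, emit)
    print(np.round(trans, 3))
```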
Researchers at Google are combining HMMs with deep belief networks and finding better results.