Dolphin whistles example -> Move to the frequency domain and use Sakoe-Chiba bounds to perform dynamic time warping.
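As a rough sketch of that idea (not the actual dolphin-whistle pipeline), the code below aligns two short 1-D frequency tracks with dynamic time warping while a Sakoe-Chiba band keeps the warping path near the diagonal; the band width and the toy sequences are made up for illustration.

import numpy as np

def dtw_sakoe_chiba(x, y, band=3):
    """DTW distance between two 1-D feature sequences, with a Sakoe-Chiba
    band limiting how far the warping path may stray from the diagonal.
    `band` must be at least |len(x) - len(y)| for a path to exist."""
    n, m = len(x), len(y)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        # only cells within `band` of the diagonal are considered
        lo = max(1, i - band)
        hi = min(m, i + band)
        for j in range(lo, hi + 1):
            d = abs(x[i - 1] - y[j - 1])
            cost[i, j] = d + min(cost[i - 1, j],      # insertion
                                 cost[i, j - 1],      # deletion
                                 cost[i - 1, j - 1])  # match
    return cost[n, m]

# Toy example: two "dominant-frequency" tracks of different lengths.
a = np.array([1.0, 1.2, 3.0, 4.0, 4.1])
b = np.array([1.0, 3.1, 3.9, 4.0])
print(dtw_sakoe_chiba(a, b, band=2))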
Viterbi Path - most probable path through the trellis
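A minimal Viterbi sketch in log space, assuming a small discrete-emission HMM; the transition and emission tables below are illustrative only.

import numpy as np

def viterbi(obs, log_pi, log_A, log_B):
    """Most probable state path through the trellis.
    obs:    sequence of observation indices
    log_pi: (S,)   log initial state probabilities
    log_A:  (S,S)  log transition probabilities
    log_B:  (S,O)  log emission probabilities
    """
    S, T = len(log_pi), len(obs)
    delta = np.zeros((T, S))            # best log-prob of any path ending in state s at time t
    back = np.zeros((T, S), dtype=int)  # backpointers
    delta[0] = log_pi + log_B[:, obs[0]]
    for t in range(1, T):
        scores = delta[t - 1][:, None] + log_A   # (from-state, to-state)
        back[t] = scores.argmax(axis=0)
        delta[t] = scores.max(axis=0) + log_B[:, obs[t]]
    # trace back the best path
    path = [int(delta[-1].argmax())]
    for t in range(T - 1, 0, -1):
        path.append(int(back[t][path[-1]]))
    return path[::-1]

# Illustrative 2-state, 2-symbol model.
log_pi = np.log([0.6, 0.4])
log_A  = np.log([[0.7, 0.3],
                 [0.4, 0.6]])
log_B  = np.log([[0.9, 0.1],
                 [0.2, 0.8]])
print(viterbi([0, 1, 1, 0], log_pi, log_A, log_B))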
Hidden Markov Models training notes:
try many different topologies
use cross-validation
split independent signals that share the same logical meaning into separate models rather than forcing a single model to cover both (analogous to coarticulation of phonemes); see the sketch after the example signal names below:
s1_telem_pos_updt_3M22_Zircon
s2_telem_pos_updt_3M22_Zircon
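One way to act on the topology and cross-validation points is to train a candidate model per topology per signal variant and compare held-out log-likelihood. A minimal sketch, assuming the hmmlearn library and synthetic stand-ins for the s1/s2 telemetry signals (HTK, which these notes reference, would be the more traditional tool):

import numpy as np
from hmmlearn import hmm

rng = np.random.default_rng(0)

def toy_sequences(n_seqs, length, offset):
    """Synthetic stand-ins for streams like s1/s2_telem_pos_updt_3M22_Zircon."""
    return [rng.normal(offset, 1.0, size=(length, 2)) for _ in range(n_seqs)]

def fit_and_score(train_seqs, test_seqs, n_states):
    """Train one GaussianHMM topology and return held-out log-likelihood."""
    model = hmm.GaussianHMM(n_components=n_states, covariance_type="diag",
                            n_iter=20, random_state=0)
    model.fit(np.vstack(train_seqs), lengths=[len(s) for s in train_seqs])
    return model.score(np.vstack(test_seqs), lengths=[len(s) for s in test_seqs])

# Separate models per signal variant, several candidate topologies each.
for name, offset in [("s1_telem_pos_updt_3M22_Zircon", 0.0),
                     ("s2_telem_pos_updt_3M22_Zircon", 2.0)]:
    seqs = toy_sequences(10, 50, offset)
    train, test = seqs[:7], seqs[7:]      # simple hold-out split
    for n_states in (3, 5, 8):            # candidate topologies
        print(name, n_states, fit_and_score(train, test, n_states))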
Putting these signal networks into active memory is prohibitive, so stochastic beam search can be used instead (a simple beam-pruning sketch follows below). Also, to deal with real-world environments, we can use Baum-Welch re-estimation (error/2), statistical grammars (error/4), state tying, and segmental boosting (which may be coming to HTK via GA Tech) to help overcome context-training issues.
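The beam-search idea can be illustrated with a beam-pruned Viterbi-style decoder that keeps only the top-K hypotheses alive at each frame, so the full trellis never has to be held in memory. This sketch uses deterministic top-K pruning and a made-up two-state model; a stochastic beam would sample the survivors instead of always keeping the top-K.

import numpy as np

def beam_decode(obs, log_pi, log_A, log_B, beam_width=2):
    """Viterbi-style decoding that keeps only the best `beam_width`
    hypotheses active per frame instead of the whole trellis."""
    S = len(log_pi)
    # each hypothesis: (log probability, state path so far)
    beam = [(log_pi[s] + log_B[s, obs[0]], [s]) for s in range(S)]
    beam = sorted(beam, reverse=True)[:beam_width]
    for o in obs[1:]:
        candidates = []
        for score, path in beam:
            s_prev = path[-1]
            for s in range(S):
                candidates.append((score + log_A[s_prev, s] + log_B[s, o],
                                   path + [s]))
        beam = sorted(candidates, reverse=True)[:beam_width]  # prune
    return beam[0]  # best surviving hypothesis

# Same illustrative model as the Viterbi sketch above.
log_pi = np.log([0.6, 0.4])
log_A  = np.log([[0.7, 0.3],
                 [0.4, 0.6]])
log_B  = np.log([[0.9, 0.1],
                 [0.2, 0.8]])
print(beam_decode([0, 1, 1, 0], log_pi, log_A, log_B, beam_width=2))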
Researchers at Google are combining HMMs with deep belief networks and reporting better results.
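The usual hybrid arrangement is to let a neural network score each frame (state posteriors) while the HMM supplies the temporal structure. A minimal sketch of that idea, with a single linear-plus-softmax layer standing in for the deep belief network and Viterbi-style decoding reused on top; every number and the tiny architecture here are illustrative assumptions, not Google's system.

import numpy as np

rng = np.random.default_rng(0)

def frame_posteriors(frames, W, b):
    """Toy stand-in for a deep network: one linear layer + softmax
    producing per-frame HMM-state posteriors."""
    logits = frames @ W + b
    e = np.exp(logits - logits.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

def hybrid_decode(frames, W, b, log_A, log_prior):
    """Hybrid decoding: network posteriors divided by state priors act as
    scaled emission likelihoods inside a Viterbi pass over the HMM."""
    log_emit = np.log(frame_posteriors(frames, W, b)) - log_prior  # (T, S)
    T, S = log_emit.shape
    delta = log_emit[0].copy()
    back = np.zeros((T, S), dtype=int)
    for t in range(1, T):
        scores = delta[:, None] + log_A
        back[t] = scores.argmax(axis=0)
        delta = scores.max(axis=0) + log_emit[t]
    path = [int(delta.argmax())]
    for t in range(T - 1, 0, -1):
        path.append(int(back[t][path[-1]]))
    return path[::-1]

# Illustrative 2-state model over 4-dimensional feature frames.
W = rng.normal(size=(4, 2))
b = np.zeros(2)
frames = rng.normal(size=(6, 4))
log_A = np.log([[0.8, 0.2], [0.2, 0.8]])
log_prior = np.log([0.5, 0.5])
print(hybrid_decode(frames, W, b, log_A, log_prior))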