Also, tools such as the Peristimulus Time Histogram (PSTH) can be used at this point to study variations in firing rate, as done in \cite{22170970}.

\subsection{Complexity reduction}

Although the large number of inputs provided by USEAs is a great benefit for data collection and precision, such a quantity of data can be a heavy burden on processors and their algorithms. It is therefore important to use intelligent methods to discard data that can be considered redundant. In this sense, algorithms such as Principal Component Analysis (PCA) come in very handy. PCA is broadly known in statistics as a transformation of a dataset that extracts orthogonal, mutually uncorrelated components, ordered by the variance they capture -- the leading ones being the ``principal'' components. This way, signals acquired from a 100-electrode array can be reduced to far fewer elements, helping to decrease processing time.

\subsection{Decoding}

Finally, this is the step believed to hold the greatest potential for algorithm implementation. Initially, the Kalman Filter played a major role here, due to its predict-and-correct fashion. Nowadays, algorithms seem to have followed this path, extending the Kalman Filter with non-linear variants.
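To make the pre-processing step concrete, the PSTH computation mentioned above can be sketched as follows. This is a minimal illustration, not the implementation used in the cited work; the function name \texttt{psth} and its window/bin parameters are our own choices.

```python
import numpy as np

def psth(spike_times, stim_onsets, window=(-0.1, 0.5), bin_width=0.01):
    """Peristimulus time histogram: trial-averaged firing rate around a stimulus.

    spike_times: 1-D array of spike timestamps (seconds)
    stim_onsets: 1-D array of stimulus onset times (seconds)
    Returns (bin_centers, rate_hz), where rate_hz is spikes/s averaged over trials.
    """
    n_bins = int(round((window[1] - window[0]) / bin_width))
    edges = np.linspace(window[0], window[1], n_bins + 1)
    counts = np.zeros(n_bins)
    for t0 in stim_onsets:
        aligned = spike_times - t0               # re-align spikes to this onset
        counts += np.histogram(aligned, bins=edges)[0]
    rate = counts / (len(stim_onsets) * bin_width)  # normalize to spikes/s per trial
    centers = edges[:-1] + bin_width / 2
    return centers, rate
```

Peaks in the returned rate reveal how firing frequency varies around stimulus onset, which is the variation the pre-processing stage is meant to expose.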
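The complexity-reduction step described above can be sketched with a small PCA routine. This is a generic eigendecomposition-based sketch, assuming the input is a samples-by-channels matrix of recorded features; the name \texttt{pca\_reduce} is illustrative.

```python
import numpy as np

def pca_reduce(X, n_components):
    """Project multichannel data onto its leading principal components.

    X: (n_samples, n_channels) array, e.g. features from a 100-electrode USEA.
    Returns the (n_samples, n_components) low-dimensional projection.
    """
    Xc = X - X.mean(axis=0)                   # center each channel
    cov = Xc.T @ Xc / (len(Xc) - 1)           # channel covariance matrix
    eigvals, eigvecs = np.linalg.eigh(cov)    # eigenvalues in ascending order
    top = eigvecs[:, ::-1][:, :n_components]  # leading (principal) directions
    return Xc @ top
```

Because the principal directions are orthogonal eigenvectors of the covariance matrix, the projected channels are mutually uncorrelated, and keeping only the leading components discards the redundant variance while shrinking a 100-channel recording to a handful of signals.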
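The predict-and-correct fashion of the Kalman Filter mentioned in the decoding step can be sketched as below: the state (e.g. limb kinematics) is first predicted from a linear model, then corrected by each new vector of neural observations. The model matrices and the function name \texttt{kalman\_decode} are illustrative, not taken from any specific decoder in the literature.

```python
import numpy as np

def kalman_decode(Y, A, W, H, Q, x0, P0):
    """Linear Kalman-filter decoder.

    State model:       x_t = A @ x_{t-1} + w,  w ~ N(0, W)
    Observation model: y_t = H @ x_t     + q,  q ~ N(0, Q)
    Y: (T, n_units) neural observations.
    Returns the (T, n_state) sequence of decoded state estimates.
    """
    x, P = x0, P0
    out = []
    for y in Y:
        # predict: propagate state and uncertainty through the model
        x = A @ x
        P = A @ P @ A.T + W
        # correct: weigh the new observation by the Kalman gain
        S = H @ P @ H.T + Q
        K = P @ H.T @ np.linalg.inv(S)
        x = x + K @ (y - H @ x)
        P = (np.eye(len(x)) - K @ H) @ P
        out.append(x)
    return np.array(out)
```

The non-linear extensions noted above keep this same two-phase loop but replace the linear propagation and correction with linearized or sampled versions of non-linear models.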