Abstract:
Time series analysis gives us a window into past events and a basis for predicting the future. It has long been known that many natural processes exhibit a long-memory property, characterized by the Hurst parameter H. The main goal of this project is to extract the significant information contained in large, correlated multivariate time series using information-entropic measures.
The data was projected onto its principal components (via PCA), which capture the directions of maximum variance, and information-entropic measures were then computed on these projections. In this thesis
we study how the information entropy of the top principal components (PCs) varies with H, and find that as H increases, the net information entropy of the top PCs decreases, indicating that a growing
share of the total variance is concentrated in the top PCs as H increases.
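The link between entropy and variance concentration described above can be sketched as follows. This is a minimal illustration, assuming the entropic measure is the Shannon entropy of the normalized PC variance spectrum (the thesis may use a different or more elaborate measure); the function name `spectral_entropy` and the synthetic data are hypothetical.

```python
import numpy as np

def spectral_entropy(X):
    """Shannon entropy of the normalized PCA variance spectrum.

    Lower entropy means the variance is more concentrated
    in the top principal components.
    """
    Xc = X - X.mean(axis=0)
    # Singular values of the centered data give the PC variances
    s = np.linalg.svd(Xc, compute_uv=False)
    lam = s**2 / (X.shape[0] - 1)
    p = lam / lam.sum()          # normalized variance spectrum
    p = p[p > 0]
    return -np.sum(p * np.log(p))

rng = np.random.default_rng(0)

# Weakly correlated data: variance spread over many PCs -> high entropy
flat = rng.standard_normal((500, 10))

# Strongly correlated data: one dominant PC -> low entropy
common = rng.standard_normal((500, 1))
peaked = common @ np.ones((1, 10)) + 0.1 * rng.standard_normal((500, 10))

print(spectral_entropy(flat))    # close to log(10), the maximum
print(spectral_entropy(peaked))  # much smaller
```

In this toy setting, the strongly correlated data set plays the role of a high-H series: its entropy is far below that of the uncorrelated data, mirroring the decrease in entropy reported above.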