Please use this identifier to cite or link to this item:
http://dr.iiserpune.ac.in:8080/xmlui/handle/123456789/6390
Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.advisor | Kumar, Bipin | en_US |
dc.contributor.advisor | Chattopadhyay, Rajib | en_US |
dc.contributor.author | ABHISHEK, NAMIT | en_US |
dc.date.accessioned | 2021-11-25T03:41:39Z | - |
dc.date.available | 2021-11-25T03:41:39Z | - |
dc.date.issued | 2021-08 | en_US |
dc.identifier.citation | 50 | en_US |
dc.identifier.uri | http://dr.iiserpune.ac.in:8080/xmlui/handle/123456789/6390 | - |
dc.description.abstract | In this work, we use convolutional recurrent neural network-based architectures involving ConvLSTM and ConvGRU for precipitation forecasting over the Indian region. We first compare the direct and iterative approaches to forecasting and find the iterative approach better suited to our model. We use multiple variables, such as specific humidity, orography, soil moisture and surface pressure, as input features to better capture the underlying dynamics. We also analyse the forecasts over different homogeneous rainfall regions. Finally, we compare the ConvLSTM- and ConvGRU-based models and find that they perform similarly, with ConvGRU being faster. | en_US |
dc.language.iso | en | en_US |
dc.subject | Machine learning | en_US |
dc.subject | Convolutional Recurrent Neural Networks | en_US |
dc.subject | Medium Range Precipitation Forecasting | en_US |
dc.subject | Spatiotemporal Forecasting | en_US |
dc.subject | Deep Learning | en_US |
dc.title | Application of Convolutional Recurrent Neural Network in Precipitation Forecasting | en_US |
dc.type | Thesis | en_US |
dc.type.degree | BS-MS | en_US |
dc.contributor.department | Dept. of Data Science | en_US |
dc.contributor.registration | 20161023 | en_US |
Appears in Collections: | MS THESES |
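The abstract above describes an iterative rollout with a ConvGRU and several auxiliary input variables. The following is a minimal, hypothetical PyTorch sketch of what such a setup could look like; the `ConvGRUCell` and `IterativeForecaster` classes, channel counts, kernel size, grid size, and the choice to hold non-precipitation channels at their last observed values are all illustrative assumptions, not the thesis's actual architecture or code.

```python
# Hypothetical sketch of an iterative ConvGRU forecaster, assuming the model
# predicts the next precipitation frame and feeds it back as input.
import torch
import torch.nn as nn


class ConvGRUCell(nn.Module):
    """Single ConvGRU cell: GRU gating with convolutions in place of dense layers."""

    def __init__(self, in_channels: int, hidden_channels: int, kernel_size: int = 3):
        super().__init__()
        padding = kernel_size // 2
        # Update (z) and reset (r) gates, computed jointly from [input, hidden].
        self.gates = nn.Conv2d(in_channels + hidden_channels,
                               2 * hidden_channels, kernel_size, padding=padding)
        # Candidate hidden state, computed from [input, r * hidden].
        self.cand = nn.Conv2d(in_channels + hidden_channels,
                              hidden_channels, kernel_size, padding=padding)

    def forward(self, x, h):
        zr = torch.sigmoid(self.gates(torch.cat([x, h], dim=1)))
        z, r = zr.chunk(2, dim=1)
        h_tilde = torch.tanh(self.cand(torch.cat([x, r * h], dim=1)))
        return (1 - z) * h + z * h_tilde


class IterativeForecaster(nn.Module):
    """Encodes the observed sequence, then rolls forward one step at a time,
    feeding each predicted frame back in (the 'iterative' approach)."""

    def __init__(self, in_channels: int, hidden_channels: int = 32):
        super().__init__()
        self.cell = ConvGRUCell(in_channels, hidden_channels)
        self.head = nn.Conv2d(hidden_channels, 1, kernel_size=1)  # precipitation map

    def forward(self, seq, n_steps: int):
        # seq: (batch, time, channels, height, width)
        b, t, c, hgt, wid = seq.shape
        h = torch.zeros(b, self.cell.cand.out_channels, hgt, wid, device=seq.device)
        for i in range(t):                      # encode observed frames
            h = self.cell(seq[:, i], h)
        preds = []
        x = seq[:, -1]
        for _ in range(n_steps):                # iterative rollout
            h = self.cell(x, h)
            y = self.head(h)                    # next-step precipitation
            preds.append(y)
            # Feed the prediction back; here the auxiliary input channels
            # are naively held at their last observed values (an assumption).
            x = torch.cat([y, x[:, 1:]], dim=1)
        return torch.stack(preds, dim=1)        # (batch, n_steps, 1, H, W)


# Example: 5 input channels (e.g. rainfall, specific humidity, orography,
# soil moisture, surface pressure) over a 10-step history on a 64x64 grid.
model = IterativeForecaster(in_channels=5)
history = torch.randn(2, 10, 5, 64, 64)
forecast = model(history, n_steps=3)
print(forecast.shape)  # torch.Size([2, 3, 1, 64, 64])
```

Swapping the gated cell for a ConvLSTM variant (which adds a separate cell state) would leave the iterative rollout loop unchanged, which is one reason the two architectures are easy to compare head-to-head.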
Files in This Item:
File | Description | Size | Format
---|---|---|---
Thesis_20161023.pdf | | 5.37 MB | Adobe PDF