Please use this identifier to cite or link to this item:
http://dr.iiserpune.ac.in:8080/xmlui/handle/123456789/4717
Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.advisor | MADHUSUDHAN, M.S. | en_US |
dc.contributor.author | BHAGDIKAR, TANAYAA | en_US |
dc.date.accessioned | 2020-06-16T06:12:21Z | - |
dc.date.available | 2020-06-16T06:12:21Z | - |
dc.date.issued | 2020-06 | en_US |
dc.identifier.uri | http://dr.iiserpune.ac.in:8080/xmlui/handle/123456789/4717 | - |
dc.description.abstract | The aim of our project was to predict the depth of every amino acid in a given query sequence. Residue depth is an important parameter for studying various other properties of a protein. We used neural networks for this prediction, having earlier tried and failed with statistical methods. We extracted all of our features from the query sequence alone and used these as inputs to the network. We then trained several networks, tweaking the input features, the number of nodes in each hidden layer, and the number of hidden layers, to improve accuracy. Our final model had negligible accuracy: the features that can be extracted from the query sequence alone are far too weakly correlated with residue depth. Adding constraints from energy and structure would probably have led to a better prediction. | en_US |
dc.language.iso | en | en_US |
dc.subject | Amino Acids | en_US |
dc.subject | Residue depth | en_US |
dc.subject | Neural Networks | en_US |
dc.subject | 2020 | en_US |
dc.title | Predicting the depths of Amino Acids using Neural Networks | en_US |
dc.type | Thesis | en_US |
dc.type.degree | BS-MS | en_US |
dc.contributor.department | Dept. of Biology | en_US |
dc.contributor.registration | 20151144 | en_US |
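The abstract above outlines a sequence-only approach: per-residue features are extracted from the query sequence and fed to a feed-forward network that regresses residue depth. Below is a minimal sketch of that kind of pipeline, assuming a sliding-window one-hot encoding, a two-hidden-layer scikit-learn `MLPRegressor`, and placeholder depth labels; the thesis does not specify its actual features, architecture, or training data, so all of these choices are illustrative.

```python
# Minimal sketch of a sequence-only residue-depth regressor.
# The window size, layer sizes, and toy labels are assumptions for
# illustration, not the thesis's actual setup.
import numpy as np
from sklearn.neural_network import MLPRegressor

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"
AA_INDEX = {aa: i for i, aa in enumerate(AMINO_ACIDS)}

def window_features(sequence, window=7):
    """One-hot encode a sliding window of residues centred on each position.
    Positions that fall off either end of the sequence stay all-zero."""
    half = window // 2
    n = len(sequence)
    feats = np.zeros((n, window * len(AMINO_ACIDS)))
    for i in range(n):
        for j in range(-half, half + 1):
            k = i + j
            if 0 <= k < n and sequence[k] in AA_INDEX:
                feats[i, (j + half) * len(AMINO_ACIDS) + AA_INDEX[sequence[k]]] = 1.0
    return feats

# Toy training data: a sequence paired with per-residue depths (Angstroms).
# Real training would use depths computed from solved structures; these
# random labels are placeholders so the sketch runs end to end.
train_seq = "MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ"
rng = np.random.default_rng(0)
train_depth = rng.uniform(0.0, 8.0, size=len(train_seq))

X = window_features(train_seq)
model = MLPRegressor(hidden_layer_sizes=(64, 32), max_iter=2000, random_state=0)
model.fit(X, train_depth)

# Predict a depth for every residue of a new query sequence.
query = "MADEEKLPPGWEKRMSRSSGRVYYFNHITNASQ"
predicted = model.predict(window_features(query))
print(predicted[:5])  # predicted depths for the first five residues
```

As the abstract notes, features derived from the sequence alone correlate only weakly with depth, so a model of this kind is expected to perform poorly without structural or energetic inputs.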
Appears in Collections: | MS THESES |
Files in This Item:
File | Description | Size | Format
---|---|---|---
MSThesis_20151144.pdf | MS Thesis | 1.19 MB | Adobe PDF
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.