The Asymptotic information rate function in coding theory

dc.contributor.advisor KAIPA, KRISHNA en_US
dc.contributor.author GAIKWAD, AJINKYA RAMDAS en_US
dc.date.accessioned 2019-05-29T10:03:33Z
dc.date.available 2019-05-29T10:03:33Z
dc.date.issued 2019-04 en_US
dc.identifier.uri http://dr.iiserpune.ac.in:8080/xmlui/handle/123456789/3037
dc.description.abstract Transmission of information through an unreliable communication channel introduces noise into the data. An error-correcting code introduces redundancy into the data so that it can be recovered later. Error-correcting codes are widely used for reliable storage on CDs, DVDs, SSDs, and in many other settings. A good code must have both high error-correcting capacity and high efficiency, measured respectively by the relative minimum distance and the rate of transmission; however, these two requirements are mutually conflicting. The trade-off between them is studied through the question of finding the largest size of a code with minimum distance at least d. In general, it is hard to compute the exact value of this quantity, so good upper and lower bounds for it are important. The question becomes more tractable by defining the asymptotic rate function, the asymptotic version of this quantity, which captures the trade-off between distance and rate (see the sketch following this record). This in turn raises further mathematical questions about properties of the asymptotic rate function, such as convexity and differentiability. The distance enumerator polynomial determines the distance distribution of a code, and binomial moments provide an alternative way of working with the distance enumerator polynomial that helps in finding bounds on the size of a code. The most intuitive upper bound on the size of a code is the sphere-packing or Hamming bound, and codes attaining this bound are called perfect codes. The existence of perfect codes is a very interesting problem that is still unresolved in the case of double-error-correcting perfect codes. In this thesis, we approach this problem by generalizing the differential equation derived by Lloyd in [5] for the binary case, in order to find necessary conditions for the existence of perfect codes. We also present a different proof of the non-linear MacWilliams identities using binomial moments. Bounds on the size of a code are discussed in Chapter 1 and are later used to derive bounds and properties of the asymptotic rate function. We study the recent improvement of the Elias bound on the asymptotic rate function due to K. Kaipa. In Chapter 4, we discuss the open problem of the ∪-convexity property of the asymptotic rate function. en_US
dc.language.iso en en_US
dc.subject 2019
dc.subject Perfect codes en_US
dc.subject Binomial moments en_US
dc.subject Asymptotic rate function en_US
dc.subject Lloyd's theorem en_US
dc.subject MacWilliams identities en_US
dc.title The Asymptotic information rate function in coding theory en_US
dc.type Thesis en_US
dc.type.degree BS-MS en_US
dc.contributor.department Dept. of Mathematics en_US
dc.contributor.registration 20141164 en_US
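The abstract above refers to several standard coding-theory quantities. As a point of reference, here is a minimal sketch of how they are usually defined; the symbols A_q(n,d) and \alpha_q(\delta) below are assumed notation and need not match the notation used in the thesis itself.

\[
  % Largest size of a q-ary code of length n with minimum distance at least d
  % (the name A_q(n,d) is an assumption, not taken from the thesis).
  A_q(n,d) = \max\bigl\{\, |C| : C \subseteq \mathcal{A}^n,\ |\mathcal{A}| = q,\ d(C) \ge d \,\bigr\}
\]
\[
  % Asymptotic rate function: normalized logarithmic growth of A_q(n, \delta n).
  \alpha_q(\delta) = \limsup_{n \to \infty} \frac{1}{n} \log_q A_q\bigl(n, \lfloor \delta n \rfloor\bigr),
  \qquad 0 \le \delta \le 1
\]
\[
  % Sphere-packing (Hamming) bound, with t = \lfloor (d-1)/2 \rfloor correctable errors;
  % codes attaining equality are the perfect codes discussed in the thesis.
  A_q(n,d) \le \frac{q^n}{\sum_{i=0}^{t} \binom{n}{i}(q-1)^i},
  \qquad t = \Bigl\lfloor \frac{d-1}{2} \Bigr\rfloor
\]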


This item appears in the following Collection(s)

  • MS THESES [1705]
    Thesis submitted to IISER Pune in partial fulfilment of the requirements for the BS-MS Dual Degree Programme/MSc. Programme/MS-Exit Programme
