Please use this identifier to cite or link to this item: http://repository.kln.ac.lk/handle/123456789/5436
Full metadata record
DC Field: Value (Language)
dc.contributor.author: Kumara, K.H.
dc.contributor.author: Dias, N.G.J.
dc.contributor.author: Sirisena, H.
dc.date.accessioned: 2015-02-27T05:29:21Z
dc.date.available: 2015-02-27T05:29:21Z
dc.date.issued: 2007
dc.identifier: Statistics and Computer Science (en_US)
dc.identifier.citation: Kumara, K.H., Dias, N.G.J. and Sirisena, H. (2007). Automatic Segmentation Of Given Set Of Sinhala Text Into Syllables For Speech Synthesis, Journal of Science, University of Kelaniya, 3: 53-62. (en_US)
dc.identifier.uri: http://www.kln.ac.lk/science/web/journals/vol3/03-05.pdf
dc.identifier.uri: http://repository.kln.ac.lk/handle/123456789/5436
dc.description.abstract: A dictionary-based automatic syllabification tool is presented for speech synthesis in the Sinhala language. The tool also provides frequency distributions of vowels, consonants and syllables for a given set of Sinhala text. A method of determining syllable boundaries is described: detecting the syllable boundaries of a given Sinhala sentence proceeds in four main phases, each illustrated with examples. Rules for the automatic segmentation of words into syllables are derived from a dictionary, and an algorithm implementing these rules is given, which uses the dictionary together with an accurate mark-up of the syllable boundaries. (en_US)
dc.language.iso: en (en_US)
dc.publisher: University of Kelaniya (en_US)
dc.subject: Speech Synthesis (en_US)
dc.subject: Text to Speech (TTS) (en_US)
dc.subject: Phonetic (en_US)
dc.subject: Vowels (en_US)
dc.subject: Consonants (en_US)
dc.subject: Syllables (en_US)
dc.subject: algorithm (en_US)
dc.title: Automatic Segmentation Of Given Set Of Sinhala Text Into Syllables For Speech Synthesis (en_US)
dc.type: Article (en_US)
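As a rough illustration of the dictionary-lookup approach the abstract describes, the sketch below segments words by looking them up in a hand-made mini-dictionary of marked syllable boundaries and tallies syllable frequencies. The romanized entries and function names are hypothetical placeholders, not taken from the paper, which works on Sinhala script with a full lexicon and boundary mark-up.

```python
from collections import Counter

# Hypothetical mini-dictionary mapping a (romanized) word to its
# marked-up syllable boundaries; the real tool uses a Sinhala lexicon.
SYLLABLE_DICT = {
    "amma": ["am", "ma"],
    "gedara": ["ge", "da", "ra"],
}

def syllabify(sentence):
    """Segment each word via dictionary lookup.

    Words missing from the dictionary are kept whole; the paper's
    rule-based phases would handle such out-of-dictionary words.
    """
    syllables = []
    for word in sentence.lower().split():
        syllables.extend(SYLLABLE_DICT.get(word, [word]))
    return syllables

def syllable_frequencies(text):
    """Frequency distribution of syllables in the given text."""
    return Counter(syllabify(text))
```

For example, `syllabify("amma gedara")` yields `["am", "ma", "ge", "da", "ra"]`, and `syllable_frequencies` counts how often each syllable occurs across the text, mirroring the frequency distributions the tool reports for vowels, consonants and syllables.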
Appears in Collections: Volume 03 - 2007

Files in This Item:
File: 03-05.pdf (Size: 188.21 kB; Format: Adobe PDF)


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.