KICACT 2017
Permanent URI for this collection: http://repository.kln.ac.lk/handle/123456789/17369
Item 5G Wireless Communication over Heterogeneous Networks: Solutions for Hardware and Software Fallacies (Faculty of Computing and Technology, University of Kelaniya, Sri Lanka, 2017) Abeysinghe, A.

The proliferating use of mobile communications has created the need to develop networks able to cater to higher bandwidth and speeds and to support a plethora of upcoming technologies. The introduction of 5G networks in a heterogeneous network architecture has been chosen as a viable solution to persistent issues in current implementations. However, these network designs suffer from several fundamental software and hardware pitfalls associated with design problems: cell optimization, schemes for simultaneous base station association, and cooperation between tiers in the architecture. Therefore, this research focuses on resolving these software and hardware fallacies for the successful implementation of the proposed 5G networks. A main software drawback in current networks is the persistence of lazy caching schemes. As shown in Figure 1, user requests are currently often matched to arbitrary locations without the use of pre-enabled caching mechanisms. To overcome this issue, proactive caching could be implemented, in which base stations (BS) identify external clients possessing cached information and create dynamic Device-to-Device (D2D) connections. As shown in Figure 2, significant improvements in successful requests could be achieved under both high and low load, as users are efficiently matched to potential targets. A main drawback in mobile network hardware design is energy consumption that rises in proportion to increasing user requests. To overcome this issue, cell zooming could be introduced to 5G implementations. As shown in Figure 3, the central cell could use an algorithmic approach to identify the network request density around it and zoom its range in or out to serve it efficiently, while neighbouring cells could be switched off for a predetermined interval, and vice versa. T-tests carried out under this paradigm showed that significant cost savings through efficient use of energy in these cells could be achieved with this solution.
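The cell-zooming step described above can be illustrated with a small sketch. The following Python snippet is a toy illustration only, not the author's algorithm: the request-density thresholds, radius limits and the sleep rule are all assumptions made for the example.

```python
# Toy sketch of the cell-zooming idea; thresholds, radii and the sleep rule
# below are illustrative assumptions, not the paper's algorithm.

def zoom_cells(cells, low_load=20, high_load=80, min_radius=0.5):
    """cells: list of dicts with 'radius' (km) and 'requests' (current load)."""
    for cell in cells:
        if cell["requests"] > high_load:
            # Dense traffic: zoom in so that neighbouring cells share the load.
            cell["radius"] = max(min_radius, cell["radius"] * 0.8)
            cell["sleeping"] = False
        elif cell["requests"] < low_load:
            # Sparse traffic: mark the cell to sleep for a predetermined
            # interval so a zoomed-out neighbour can cover its area.
            cell["sleeping"] = True
        else:
            cell["sleeping"] = False
    return cells

print(zoom_cells([{"radius": 1.0, "requests": 95},
                  {"radius": 1.0, "requests": 5}]))
```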
Item Accessing a Moodle based Learning Management System and Exam Performance by Medical Students: A Retrospective Analysis (Faculty of Computing and Technology, University of Kelaniya, Sri Lanka, 2017) Hettiarachchi, W.G.; Hettige, S.; Ediriweera, E.P.D.S.; Chandrathilake, M.N.; de Silva, N.R.

Considering the high computer literacy of students and the need to be student-centred, the MBBS programme of the University of Kelaniya introduced a Moodle-based learning management system (LMS) as a supplementary component to its blended approach to curriculum delivery. Accordingly, the LMS is a parallel component to the curriculum delivered face-to-face. Each learning module in the curriculum is represented in the LMS through lecture notes, PowerPoint presentations, web links and assignments. The broad aim of this study is to determine the relationship between the extent of learners' interaction with the LMS and their exam performance. First-year medical students (172 from the 2016 intake and 166 from the 2017 intake) at the University of Kelaniya were considered for the analysis. Students' access to the LMS during the first two modules of the MBBS curriculum and the results of the first continuous assessment were compared. Total LMS access for each student during the two modules was calculated by counting the number of views in the course log. The relevant LMS course logs were downloaded and filtered to extract the details pertaining to the students in the above two modules in both batches. Total access during the two modules was calculated for each individual student and used for the analysis. Continuous assessment results ranged from grades A to F, and sequential numerical marks in descending order from 6 to 1 were assigned to denote grades A to F. Total LMS access with respect to assessment grade was visualized using boxplots, and medians with interquartile ranges (IQR) were calculated. The association between LMS access and assessment grade was investigated. Statistical analysis was done in R. The median (IQR) LMS access of the students was 43.0 (12.25–72.0) views. The number of students in each grade and the corresponding median (IQR) LMS access were as follows:

Grade   Students   Median LMS access (IQR)
A       3          110.0 (102.0–113.0)
B       41         51.0 (21.0–76.0)
C       117        49.0 (16.0–76.0)
D       122        39.0 (11.0–64.75)
E       51         29.0 (6.0–59.5)
F       4          6.5 (3.0–16.0)

There was a significant correlation between LMS access and results grade (rho = 0.2, P < 0.01). Students with grade A showed significantly higher LMS access compared to the rest of the groups. There was no difference in LMS access between students with grades B and C, C and D, D and E, D and F, or E and F. However, grade B showed significantly higher LMS access compared to grades D, E and F, and grade C showed significantly higher LMS access compared to grades E and F. The findings demonstrate that students' interaction with the LMS was significantly associated with their performance in the examination. The learning management system has a positive impact on student performance.
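The reported association is a rank correlation between access counts and grade codes. The analysis in the abstract was done in R; the short Python sketch below illustrates the same kind of test with scipy, using placeholder data rather than the study's records.

```python
# Sketch of the grade-vs-LMS-access association; the original analysis was
# done in R, and the access counts and grades below are placeholders.
from scipy.stats import spearmanr

grade_codes = {"A": 6, "B": 5, "C": 4, "D": 3, "E": 2, "F": 1}   # A..F -> 6..1

lms_access = [110, 51, 49, 39, 29, 7, 76, 12]          # views per student
grades = ["A", "B", "C", "D", "E", "F", "B", "E"]      # assessment grades

rho, p_value = spearmanr(lms_access, [grade_codes[g] for g in grades])
print(f"Spearman rho = {rho:.2f}, p = {p_value:.3f}")
```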
Item Analysis of Road Traffic Accidents Using Data Mining (Faculty of Computing and Technology, University of Kelaniya, Sri Lanka, 2017) Liyanaarachchi, K.L.P.P.; Charles, E.Y.A.

Accidents happen unexpectedly and unintentionally, typically resulting in damage, injury or fatalities. Data mining is the extraction of implicit, previously unknown, and potentially useful information from data collected for various purposes. The main objective of this research is to identify accurate and useful patterns that exist in road traffic accident data using data mining techniques. It is believed that these patterns can be utilized to take measures to reduce the number of accidents or their severity. As part of this research, details of accidents that occurred in the Colombo district in 2015 were collected from the Traffic Headquarters, Colombo, Sri Lanka. A data set of 9487 accident incidents, each described by 55 features, was created from the collected data. The data consist of four types of accidents, namely Fatal (154), Grievous (877), Non-Grievous (2028) and Vehicle damage only (6428). There are quite a few published studies on traffic accident analysis using data mining methods, and in most of them the J48 classifier has produced higher accuracy than other methods. So far, no such study has been reported on accidents on Sri Lankan roads. A correlation analysis was performed on the data set, and as a result 10 attributes were removed. In this study, the J48 decision tree classifier was used in two ways. In the first, all four types of accidents were considered; the decision tree built with 70% of the data achieved an average accuracy of 71.4687%. In the second analysis, the Fatal, Grievous and Non-Grievous types were combined into one class named Injured. This approach was taken to reduce the effect of the Vehicle damage only class, which makes up around 68% of the total data. The decision tree built with these merged classes achieved an accuracy of 78.7288% using tenfold cross-validation. The decision tree was converted into 20 rules, which can predict the type of accident based on the identified attribute values. The results were found to be helpful in identifying the factors influencing traffic accidents and can be further analyzed to find more subtle reasons or situations that cause accidents.
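J48 is Weka's implementation of the C4.5 decision tree algorithm. As a rough analogue of the two analyses above, the Python sketch below uses scikit-learn's DecisionTreeClassifier; the feature vectors and labels are invented placeholders, not the Colombo accident records.

```python
# Rough analogue of the two-stage J48 analysis; scikit-learn's decision tree
# stands in for Weka's J48 (C4.5), and the data below are placeholders.
from sklearn.model_selection import train_test_split, cross_val_score
from sklearn.tree import DecisionTreeClassifier

X = [[1, 0, 3], [0, 1, 5], [1, 1, 2], [0, 0, 4],
     [1, 0, 1], [0, 1, 3], [1, 1, 5], [0, 0, 2]]          # encoded attributes
y = ["Fatal", "Grievous", "Non-Grievous", "VehicleDamage",
     "Fatal", "Grievous", "Non-Grievous", "VehicleDamage"]

# First analysis: all four classes with a 70/30 train/test split.
X_tr, X_te, y_tr, y_te = train_test_split(X, y, train_size=0.7, random_state=1)
tree = DecisionTreeClassifier().fit(X_tr, y_tr)
print("4-class accuracy:", tree.score(X_te, y_te))

# Second analysis: merge the three injury classes into "Injured" and
# cross-validate (the study used tenfold; 2 folds fit this toy data set).
y_merged = ["Injured" if label != "VehicleDamage" else label for label in y]
print("2-class CV accuracy:",
      cross_val_score(DecisionTreeClassifier(), X, y_merged, cv=2).mean())
```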
Item Android Shopping Cart Application (ASCA) (Faculty of Computing and Technology, University of Kelaniya, Sri Lanka, 2017) Bandara, K.; Wijegunasekara, M.C.

Due to the busy lifestyle of people in today's society, it has become more convenient for them to buy all their daily shopping items in one place, and shopping for grocery items in a supermarket has become a common activity in Sri Lanka. The major problems customers face when shopping for grocery items in a supermarket are the effort they have to put in and the time-consuming tasks they face almost every day. One such difficulty is the need to visit the supermarket frequently to buy day-to-day items. When buying goods, a walk around the shop to select the necessary products is usually unavoidable, and even after buying, customers need to stand in long queues at the counters to make payments. Therefore, using modern technology to build a suitable system to solve such problems is valuable. There are two main approaches to solving this problem: a web shopping cart application and a mobile application. Today, as most people always carry smartphones with them, every business needs its own application for mobile users. This research project has two major parts: the mobile application and a website that acts as a content management system. Using the mobile application, customers can buy online and have the products delivered to their home through the shop's delivery service, or they can send an order confirmation and collect the ordered items by paying at the shop. The mobile application was developed using Android Studio. The client side of the application is designed as a website for the supermarket owners to manage the online database that stores the content for the mobile application. To measure the effectiveness of the implementation, questionnaires were distributed to a population of 50 people who buy their daily groceries in a supermarket and own an Android smartphone. According to the analysis, 32% of the people strongly agreed and 48% agreed that traditional shopping will be superseded by online shopping in the near future, and only 6% disagreed with this idea. Half of the population agreed and 16% strongly agreed that the fact that only credit card holders can buy products online is a major drawback of a shopping cart application. As future enhancements, the application will be developed to run on mobile operating systems other than Android. Currently only the bank portal and a link to connect with PayPal are designed, and the payment gateway is to be developed further. The client side can also be created as a mobile application. In conclusion, the result of this research project is a user-friendly mobile application that runs on the Android operating system, together with a content management system developed as a website to interact with the database. ASCA succeeded in developing an online mobile shopping cart that addresses the current problems of customers who buy their daily grocery items in a supermarket.
Item Anti-Eavesdropping Data Safety Framework for Highly Secured Enterprise Networks (Faculty of Computing and Technology, University of Kelaniya, Sri Lanka, 2017) Imran, A.A.; Shafana, A.R.F.

The rapid advancement of the Internet has paved the way for several malicious intrusions that allow information to be accessed without proper authorization and privileges. Thus, the integrity, privacy and confidentiality of data and information are readily lost. One such malicious intrusion is eavesdropping through man-in-the-middle attacks, which often exploit backdoors in encryption and jeopardize the security of billions of devices and their communication. The Hardware Security Module (HSM) has been the leading anti-eavesdropping device since its introduction, as HSMs were specifically built to create a tamper-resistant environment for cryptographic processes. Thus, HSMs are widely used by military and security forces to obtain heightened security and to preserve the privacy of critical data processing. However, the high cost, the small number of HSM vendors, and the practical difficulties in operation and maintenance have made HSMs less prevalent in enterprise networks. Therefore, there is an immense need for a mechanism that is equally competent to the functionality of an HSM in withstanding pernicious attacks and unauthorized surveillance of communication, but at a low cost. Since the HSM has a proven track record of performance and tamper resistance, this paper aims to virtualize the functionality of the expensive HSM. The development of the anti-eavesdropping Data Safety Framework can be described as follows. Through a thorough review of the literature, the key features of the HSM have been studied; it is proposed to implement them as software comprising the HSM's entire functionality, using VMware as the virtualization platform. For the virtually developed HSM, Pretty Good Privacy (PGP), a low-cost privacy-ensuring program, will be used for encryption processes. Next, a Virtual Private Network (VPN) has to be created as the environment in which the simulated software will function; the private network thus built serves as the test enterprise network in this case. The HSM-authorized network system will be managed by an Observer Management Server in order to provide additional benefits such as temporary decryption keys. Once the intended simulated software (the HSM) and its operating environment (the VPN acting as the enterprise network) are built, ethical hacking tools will be used to evaluate the robustness and performance of the simulated software, which will then be tested for interoperability on an enterprise network. Several security policies and security tools exist today; however, security breaches still happen prevalently by bypassing the underlying security mechanisms. This study has proposed the implementation of virtual hardware based on the HSM, which has a proven track record of robust security, as an anti-eavesdropping data safety framework. The simulated software can support enterprise networks in preserving the privacy, confidentiality and security of data communication.
Item Applying Intelligent Speed Adaptation to a Road Safety Mobile Application – DriverSafeMode (Faculty of Computing and Technology, University of Kelaniya, Sri Lanka, 2017) Perera, W.S.C.; Dias, N.G.J.

During the last decades, Sri Lanka has experienced highly accelerated growth in motorized transportation along with the rapid urbanization brought by economic development. However, increasing motorization has also placed a significant burden on people's health in the form of an uncontrolled growth rate of road accidents and fatalities. We have focused on excess speed and mobile distraction, two major factors that cause the majority of road accidents. Exceeding the speed limit, which is enforced under traffic law, increases both the risk of a road crash and the severity of the injuries by reducing the ability to judge forthcoming events. Use of mobile phones distracts a driver visually, physically and cognitively. These factors are largely preventable, but remain unaddressed due to the lack of adequate mechanisms in existing road safety plans in Sri Lanka. Especially in rural areas, roads are poorly maintained, which has led to faded, hidden or foliage-obscured speed limit signs and the absence of appropriate signs at vulnerable locations (schools, hospitals). Existing plans also lack alert systems to discourage drivers from using phones while driving. The proposed application uses Advisory Intelligent Speed Adaptation (ISA) to ensure drivers' compliance with legally enforced speed limits by informing the driver of the vehicle speed along with the speed limits and giving feedback. Many ISA systems have been deployed using various methods such as GPS, transponders, compasses, speed sensors and map matching, based on the native traffic infrastructures of other countries. The Google Fused Location Provider API web service was used, combined with the smartphone's GPS sensor, to obtain continuous geolocation points (latitude, longitude). The distance between two location points was calculated using the Haversine formula, and the vehicle speed was calculated from this distance and the time spent between the two location updates. The Google Maps Geocoding API was used to obtain the type of road on which the driver is driving. Accepted speed limits were stored in a cloud-hosted database by road type and vehicle type. The application connects to the database to obtain the accepted speed limit whenever a new road type is detected, compares the real-time speed against the speed limit, and initiates audio and visual alerts when the vehicle speed exceeds the limit. The Google Places API was used to identify schools and hospitals within 100 m and to inform the driver using audio and visual alerts. The application uses the in-built GSM service to reject incoming calls and the in-built notification service to mute distracting notifications. A test trial was carried out to evaluate the accuracy of the speed detection. The mean speed from the test vehicle's speedometer was 14.4122 km/h (standard deviation = 14.85891) and that from the application was 13.7488 km/h (standard deviation = 14.31279). An independent-sample t-test showed that the speed values of the test vehicle and the application are not significantly different at the 5% level of significance. The experiences of 30 randomly selected test drivers were evaluated: 80% of light-motor-vehicle test drivers stated that the application is very effective, while 10% of heavy-motor-vehicle drivers and 20% of tricycle test drivers found it difficult to perceive the audio alerts due to noisy surroundings. The evaluations show that the proposed system can have a direct and positive effect on road safety in Sri Lanka as expected.
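The speed estimate above follows directly from the Haversine distance between consecutive GPS fixes divided by the time between them. A minimal Python sketch is given below; the coordinates and the 10-second interval are placeholders.

```python
# Minimal sketch of the Haversine-based speed estimate between two GPS fixes;
# the coordinates and the time interval below are placeholders.
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two (lat, lon) points."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = (sin((lat2 - lat1) / 2) ** 2
         + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371.0 * asin(sqrt(a))        # mean Earth radius ~ 6371 km

def speed_kmph(fix1, fix2, seconds):
    """Speed implied by moving from fix1 to fix2 in `seconds` seconds."""
    return haversine_km(*fix1, *fix2) / (seconds / 3600.0)

fix_a = (6.9271, 79.8612)     # two consecutive location updates (placeholders)
fix_b = (6.9280, 79.8630)
print(f"{speed_kmph(fix_a, fix_b, 10):.1f} km/h over a 10 s interval")
```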
Item Arduino Based Home Automation and Security System (Faculty of Computing and Technology, University of Kelaniya, Sri Lanka, 2017) Ranasinghe, R.A.N.N.; Weerasinghe, K.G.H.

Home security systems are designed to minimize the risk of break-ins and protect families from crime. In modern society, homeowners are interested in having Internet-based home security and automation systems with a reliable method for secure remote access. The Arduino Based Home Automation and Security system can be seen as a new approach that helps to deliver a cheaper and more secure system for modern society. The developed system is a low-cost and flexible home appliance control and home environment monitoring system. It employs an embedded micro web server on an Arduino Mega 2560, and IP connectivity is established via an Ethernet shield for accessing and controlling devices and home appliances remotely. All components can be controlled through a web application (SmartHomeWeb) or through an Android app (SmartHome). For the development process, Arduino, Android, video streaming (for monitoring the home environment) and cloud-based technologies have been used. The Arduino with an Ethernet shield is a modern programmable device that can be used together with a mobile phone and the Internet. This framework is intended to help satisfy the needs of the elderly and the disabled at home. The fundamental control system uses an Ethernet shield that gives wireless access to smartphones and any web browser. The security system covers the whole surroundings of the home: the outdoor part provides movement detection (via a PIR sensor) and a security camera viewer. Using this system, homeowners can control appliances (switch them on or off) and check the status of home appliances from anywhere via the web application or the Android app. Any intruder attack is detected by the sensor, and an email is sent to the homeowner with the detection time. Furthermore, homeowners can monitor the home area through the security camera, using video streaming technology, from any place. The system is designed to control electrical devices throughout the house with ease of installation, ease of use and a cost-effective design and implementation. At the end of the development process, evaluation was done using several testing methods with the help of other familiar users, and the system was deployed with enhanced processes and performance. It can be concluded that the system has user-friendly interfaces, is easy to maintain and is cost effective. As future enhancements, this project can be extended to automatic doors, automatic floor lighting, the garage, etc. A Wi-Fi shield could also be used to connect the electronic devices wirelessly.

Item Artificial Neural Network based Emotions Recognition System for Tamil Speech (Faculty of Computing and Technology, University of Kelaniya, Sri Lanka, 2017) Paranthaman, D.; Thirukumaran, S.

Emotion has become an important part of communication between humans and machines, so the detection of emotions has become an important task in pattern recognition through Artificial Neural Networks (ANN). Human emotions can be detected from physiological measurements, facial expressions and speech. Since humans show different expressions for a particular emotion when speaking, emotions can be quantified. An English speech dataset with descriptions of each emotional context is available in the Emotional Prosody Speech and Transcripts corpus of the Linguistic Data Consortium (LDC). The main objective of this project is an ANN-based approach to Tamil speech emotion recognition, analyzing four basic emotions (sad, angry, happy and neutral) using mid-term features. Tamil speeches with the four emotions were recorded by males and females using the software Cubase. The duration was set to three seconds with a sampling frequency of 44.1 kHz, as this is the logical and default choice for most digital audio material. For the simulations, the recorded speech samples were categorized into different datasets, with 40 samples in each. Preprocessing, consisting of sampling, normalization and segmentation, is performed on the speech signals. In the sampling process, the analog signals are converted into digital signals; each speech sentence is then normalized to ensure that all sentences are in the same volume range; and finally the signals are separated into frames in the segmentation process. The mid-term features, such as speech rate, energy, pitch and Mel Frequency Cepstral Coefficients (MFCC), are then extracted from the speech signals, and mean and variance values are calculated from the extracted features. To create the emotion classifier, these statistical results are fed as an input matrix, together with the related emotion target matrix, for training, validation and testing. The classifier runs the neural network back-propagation algorithm to recognize completely new samples from the Tamil speech datasets, each of which consists of different combinations of speech sentences with different emotions. The new speech samples are then used to determine the emotion recognition rate using a confusion matrix. In conclusion, the selected mid-term features of Tamil speech signals classify the four emotions with an overall accuracy of 83.45%. Thus, the selected mid-term features are shown to be good representations of emotion for Tamil speech signals and correctly recognize Tamil speech emotions using an ANN. Input gathered from a group of experienced drama artists, whose voices carry strong emotional expression, would help to increase the accuracy of the dataset.
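The feature-extraction and classification pipeline described above can be sketched roughly in Python. In this illustration, librosa's MFCC and energy features stand in for the paper's mid-term feature set and scikit-learn's MLP classifier stands in for the back-propagation network; the file names and labels are placeholders.

```python
# Rough sketch of the emotion-recognition pipeline; librosa features and a
# scikit-learn MLP (back-propagation) stand in for the paper's feature set
# and network, and the file names and labels below are placeholders.
import numpy as np
import librosa
from sklearn.neural_network import MLPClassifier

def midterm_features(path):
    """Mean and variance of 13 MFCCs plus short-term energy for one clip."""
    y, sr = librosa.load(path, sr=44100, duration=3.0)
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13)
    energy = librosa.feature.rms(y=y)
    return np.concatenate([mfcc.mean(axis=1), mfcc.var(axis=1),
                           [energy.mean(), energy.var()]])

files = ["sad_01.wav", "angry_01.wav", "happy_01.wav", "neutral_01.wav"]
labels = ["sad", "angry", "happy", "neutral"]

X = np.array([midterm_features(f) for f in files])
clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000).fit(X, labels)
print(clf.predict([midterm_features("new_clip.wav")]))
```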
Item Artificial Neural Network based New Hybrid Approach for Forecasting Electricity Demands in Sri Lanka (Faculty of Computing and Technology, University of Kelaniya, Sri Lanka, 2017) Rathnayaka, R.M.K.T.; Seneviratna, D.M.K.N.

Electricity generation and forecasting play a significant role in enhancing national economic growth. They have a direct impact on both individuals' standards of living and industrial development; in particular, electricity is a prerequisite for industrialization, farming and residential requirements. As a result, most countries allocate a considerable amount of their annual budget to power generation and forecasting. The main objective of this study is to analyze electricity demand in Sri Lanka using a newly proposed combined hybrid approach based on Artificial Neural Networks. The methodology is carried out as follows. In the first phase, the electricity demand of Sri Lanka is forecast using the autoregressive integrated moving average (ARIMA) and Artificial Neural Network (ANN) approaches separately. In the next stage, the newly proposed combined ANN and ARIMA (ANN-ARIMA) approach is applied. According to the Akaike Information Criterion, Schwarz Information Criterion and Hannan-Quinn Criterion results, ARIMA(0,1,1) (R-squared: 45%, Durbin-Watson statistic: 2.32) and ARIMA(1,1,1) (R-squared: 55%, Durbin-Watson statistic: 2.03) are the best models for forecasting electricity production and electricity consumption, respectively, under the linear framework. As a next step, the proposed ANN-ARIMA hybrid methodology is applied to forecast the non-linear component using MATLAB training algorithms. Furthermore, the model selection results concluded that a back-propagation neural network (BPNN) with a 1-4-1 architecture and a learning rate of 0.06, and a BPNN with a 1-2-1 architecture and a learning rate of 0.04, give the best one-step-ahead forecasts for electricity production and electricity consumption, respectively. According to the empirical results, the electricity production and consumption curves followed parallel trends up to 1995; after 1995, however, the consumption rate has been increasing rapidly relative to the production rate. If this continues until 2020, it will create distortions in Sri Lanka's future. This study is therefore an early warning for the government, and new energy sources must be introduced and connected to the national power grid as early as possible.
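A common way to build such an ARIMA-ANN hybrid is to fit the linear ARIMA model first and then train a small neural network on its residuals, adding the two one-step-ahead predictions. The Python sketch below illustrates that pattern under those assumptions; the original work used MATLAB, and the series here is synthetic.

```python
# Sketch of an ARIMA + ANN residual hybrid; the original study used MATLAB,
# the series below is synthetic, and the lag/architecture choices are
# illustrative assumptions.
import numpy as np
from statsmodels.tsa.arima.model import ARIMA
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
series = np.cumsum(rng.normal(1.0, 0.5, 200))       # stand-in demand series

# Linear part: ARIMA(0,1,1), as selected for electricity production.
arima_fit = ARIMA(series, order=(0, 1, 1)).fit()
residuals = arima_fit.resid

# Non-linear part: a small back-propagation network on lagged residuals.
X = residuals[:-1].reshape(-1, 1)                   # one lag as input
y = residuals[1:]
ann = MLPRegressor(hidden_layer_sizes=(4,), max_iter=5000).fit(X, y)

# One-step-ahead hybrid forecast = ARIMA forecast + predicted residual.
next_linear = arima_fit.forecast(steps=1)[0]
next_resid = ann.predict([[residuals[-1]]])[0]
print("hybrid one-step forecast:", next_linear + next_resid)
```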
Item Automated Characters Recognition and Family Relationship Extraction (Faculty of Computing and Technology, University of Kelaniya, Sri Lanka, 2017) Bajracharya, A.; Shrestha, S.; Upadhyaya, S.; Shrawan, B.K.; Shakya, S.

"Automated characters recognition and family relationship extraction" is an application of Natural Language Processing (NLP) that identifies characters in a story and determines the family relationships among them. It uses specialized computer programs to identify entities, classify them, extract characters and determine the relationships between them. This work follows the basic steps of NLP, i.e. tokenization, POS tagging and sentence parsing, followed by pronoun resolution implemented with various algorithms, and finally extracts entities and the relations among them. So far, we have successfully resolved pronouns in simple sentences by resolving noun phrases with a recursive tree-generation algorithm and thereby extracting relations between noun phrases (NPs). The basic approach of this project is to perform tokenization and POS tagging first. Then the sentence, which is a recursive composition of noun phrases, verb phrases and prepositional phrases, is parsed and a recursive tree is generated. The tree is traversed to determine each noun phrase, which is replaced by the entity object of that particular noun phrase. Pronoun resolution is a core NLP task that comes in several forms; here, coreference resolution has been used. After all pronouns have been resolved, the relationships are finally extracted from the story by comparing the relation ID of each entity. Given a simple story, entities are extracted and the relationships are determined. Understanding the NLP approach and implementing it to showcase its use, with results as accurate as possible, is the main theme of this project, and this paper can act as a base for further work.
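The tokenize, POS-tag and parse-NP steps described above can be sketched with NLTK in Python. The sentence and the chunk grammar below are illustrative only and this is not the authors' own parser; NLTK's punkt and tagger models must be downloaded before running.

```python
# Sketch of the tokenize -> POS-tag -> NP-parse pipeline using NLTK; the
# sentence and chunk grammar are illustrative, not the authors' parser.
# Requires: nltk.download('punkt'); nltk.download('averaged_perceptron_tagger')
import nltk

sentence = "John is the father of Mary and she loves him."

tokens = nltk.word_tokenize(sentence)        # tokenization
tagged = nltk.pos_tag(tokens)                # POS tagging

# A tiny grammar that groups determiners/adjectives/nouns into NP chunks.
grammar = "NP: {<DT>?<JJ>*<NN.*>+}"
tree = nltk.RegexpParser(grammar).parse(tagged)

noun_phrases = [" ".join(word for word, tag in subtree.leaves())
                for subtree in tree.subtrees() if subtree.label() == "NP"]
print(noun_phrases)   # candidate entities for pronoun/relation resolution
```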
Item Automated Financial Management System with an Android Application (Faculty of Computing and Technology, University of Kelaniya, Sri Lanka, 2017) Gunathilaka, M.D.T.; Weerakoon, W.A.C.

A financial company that offers both daily and monthly loans to its customers had to scale up its operations efficiently to fulfil growing demand. The solution was to introduce an automated financial management system by automating the company's manual processes. The main objective of this project was to develop a main system with an Android application for use in the company. The main system includes features to input, edit, update and store details such as customers, collectors, packages and loans. It is also required to send messages to collectors' smartphones or tablets, update the system database using incoming messages from collectors' devices, and generate reports. The Android application includes features to manage a database on the smartphone or tablet, update the device database using incoming messages from the main system, send messages to update the system database, and generate an invoice through a Bluetooth printer. The project therefore has two parts. The first part was to design and construct the main system located in the head office, implemented using Java Standard Edition. Using the main system, the owner or manager can handle the activities carried out inside the office. When a loan is issued to a customer, the details are stored in the main system and sent to the field collector's mobile phone via SMS using a GSM modem; details received via SMS from field collectors are likewise used to update the MySQL system database. The second part was to build the Android application, which runs on the Dalvik virtual machine on the Linux kernel, for the field collectors' mobile phones. The application is automatically updated with the details received from the main system in the head office. While field collectors are collecting loan payments, they can access the details through the application and print an invoice for the customer using a Bluetooth printer. The collection details are stored on the device and sent to the main system via SMS. The two parts are connected through the mobile network. Since the Android application has to be used in areas with little or no Internet access, online solutions could not be provided; however, the main system and the application can later be upgraded with software agents using platforms such as JADE or JaCa. System testing was conducted by colleagues using about 100 test cases, and customer acceptance testing was conducted according to criteria defined by the company, which demonstrated the completeness and functionality of the entire system. Finally, with the automated system, the company was able to improve its performance by saving human and physical resources and removing unnecessary queues in the head office.
Item Automotive Windshield Wiper Blade from Incorporating RSS/Skim Rubber Blend in the EPDM Formulation (Faculty of Computing and Technology, University of Kelaniya, Sri Lanka, 2017) Weerakoon, A.D.

Ethylene-propylene-diene rubber (EPDM) is one of the most highly demanded synthetic rubbers for manufacturing automotive windshield wiper blades. For sulphur vulcanization of EPDM, powerful synergistic accelerator systems are needed to cope with the low level of unsaturation in the EPDM rubber. Blending natural rubber in the form of ribbed smoked sheets (RSS) with skim rubber is an obvious method of taking advantage of the relatively low price of skim rubber whilst minimizing its disadvantages, not least its possible variability. In addition, the vulcanization characteristics of skim rubber are better due to the higher amount of non-rubber ingredients in skim, which act as co-accelerator activators for the vulcanization reaction; hence skim rubber is used to increase the rate of vulcanization of SBR. Similarly, the non-rubber ingredients present in skim rubber can act as co-accelerator activators for the vulcanization of EPDM. The optimum amount of skim rubber that can be used as an additive to improve the cure characteristics and physico-mechanical properties of EPDM lies at around 5 phr with a TMTD/TBBS accelerator system. This research investigated the potential of EPDM/RSS/skim rubber composites for manufacturing automobile windshield wiper blades. The blend ratios of the EPDM/RSS/skim rubber composites were 30/70/0, 30/65/5, 30/60/10, 30/55/15, 30/50/20 and 30/45/25, and the composites were prepared according to the ISO 4097-1980 (E) formulation with a TMTD/TBBS accelerator system. The optimum cure time decreased progressively from the first composite to the last. The torque difference of the composites increased with skim rubber content up to 15 phr and decreased thereafter, while scorch safety increased gradually with increasing skim rubber content. The cure characteristics beyond the 30/55/15 ratio imply a decrease in the crosslink density of the composites due to a reduced number of crosslinking sites. The highest hardness value (61 IRHD) and the lowest compression set (27%) were also obtained with the 30/55/15 composite. The overall results show that even though the amount of non-rubber ingredients increases with skim rubber content, the number of crosslinking sites available for sulphur vulcanization decreases. This study concludes that the EPDM/RSS/skim rubber composite at the 30/55/15 ratio can be used to manufacture automotive windshield wiper blades.

Item Challenges in Implementing ERP Systems in Small Medium Manufacturing Companies in Sri Lanka (Faculty of Computing and Technology, University of Kelaniya, Sri Lanka, 2017) Yasotha, R.; Ramramanan, L.

There are numerous information systems available in the market that manufacturing organizations can select for implementation. When many information systems are manually integrated for management reporting, there are high risks to the accuracy of information. ERP is an information system with the inbuilt capacity to integrate many functional areas and provide meaningful information to management. This paper describes the experiences of a small-to-medium-sized, growing roofing manufacturing company in Sri Lanka: the problems it faced in implementing an ERP system and how they were overcome. Small and medium-sized manufacturing companies in Sri Lanka do not normally have electronic information systems in all parts of the business process; some processes, such as production, operate outside the information system. Therefore, it is very important to predefine what level of integration is to be done, which related parties are to be consulted and what level of management information is required. The success of an ERP implementation partially depends on the selection of a suitable ERP system compatible with the company's business processes and on the capability of the implementation partner to map those standardized business processes into the ERP by conducting business process re-engineering (BPR). This manufacturing company has many automated manufacturing plants with Programmable Logic Controller (PLC) versions dating from 1960 to 2013. When attempting to integrate these PLCs into the ERP system, the company faced many problems, leading up to plant modifications; finally, the company decided to implement the ERP while postponing the PLC integration. A well-tested, bug-free and minimally customized SAP B1 system was implemented, with progress monitored through several log books. A big-bang approach was followed, with a short-term parallel run of the legacy system. More importantly, top management support and motivation for change management fuelled the success of the SAP B1 implementation. This paper presents the experience gained from the planning to the implementation stages of SAP B1, which may be relevant to small and medium manufacturing companies in Sri Lanka.
Item Coal Fly Ash as an Alternative Substrate to Replace River Sand in Cement Mortar Mixture (Faculty of Computing and Technology, University of Kelaniya, Sri Lanka, 2017) Jayaweera, N.J.S.T.; De Silva, P.; Herath, H.M.P.I.K.; Jayasinghe, G.Y.

Coal is the most extensively used primary source of energy, accounting globally for 25% of total energy consumption. The global generation of coal fly ash (CFA) is estimated to be above 6×10^8 Mg per annum, and its recycling rate is rather low (15%). Sri Lanka is also facing major economic and environmental problems in disposing of CFA from the Norochcholai thermal power plant, and part of the disposed CFA is being used as a raw material for cement production. However, CFA with high loss-on-ignition (LOI) values cannot be used for blending with cement, so this study was designed to investigate the potential utilization of high-LOI CFA as an alternative substrate to river sand in cement mortar preparation. Compressive strength (CS), water demand (WD), moisture content (MC), initial setting time (IST) and final setting time (FST) were examined to select the most suitable mixing ratio of CFA and river sand. Treatments were prepared in accordance with SLS ISO 1253-107: part 2-2008, with 30 replicates for LOI and MC. Treatments were defined by the percentage of CFA added to sand: T1 = 0 (control), T2 = 5%, T3 = 10%, T4 = 12%, T5 = 15%, T6 = 18%, T7 = 20% and T8 = 25%. Four replicates per treatment at three ages (one day, 1D; seven days, 7D; and twenty-eight days, 28D) were tested for mortar CS in accordance with SLS ISO 679:2008. The initial and final setting times of the cement-CFA mixture were determined in accordance with SLS ISO 9597:2008(E) for the eight treatments. The results showed that high-LOI CFA can be used as an alternative substrate to sand up to 20%. The average CS values at 1D, 7D and 28D for the control treatment were 16.8 MPa, 41.3 MPa and 51.3 MPa respectively. The highest CS at 1D (21.9 MPa) and 28D (71.1 MPa) was given by the 10% CFA treatment, but the highest seven-day CS (50.1 MPa) was given by the 12% CFA treatment. Each treatment was significantly different from the other treatments. The mean CS values of T2, T3, T4, T5, T6 and T7 were not significantly different from the mean of the control treatment, while T8 (25% CFA and 75% sand) was significantly different from the control. The R² between WD and CFA percentage obtained by regression analysis was 93.2%, showing a strong relationship between them. The R² values for IST versus WD and FST versus WD were 97.7% and 94.8% respectively, which also showed strong relationships with WD. Hence, it can be concluded that increasing the CFA percentage up to 20% increased WD, IST and FST.

Item Conflict Categorization of ERP Implementations in Asia Pacific Region (Faculty of Computing and Technology, University of Kelaniya, Sri Lanka, 2017) Herath, H.M.P.S.; Rajakaruna, J.P.

An Enterprise Resource Planning (ERP) system is an integrated software system, typically offered by a vendor as a package, that supports the seamless integration of all the information flowing through business processes, business intelligence, business integrations, collaborations, etc. This research is intended to discuss complications in ERP implementations in the Asia Pacific (APAC) region from the client, vendor, implementer, consultant and project management perspectives. The objective of this research-in-progress paper is to develop clear visibility of the categories of conflicts in ERP projects in multicultural environments. Categorization of ERP implementation-related conflicts would allow better preparation for successful project implementation and delivery. This is the first step in a journey to consolidate the literature on the conflicts associated with ERP projects, and it also seeks to uplift the understanding of conflict and its effective management in the APAC region. Our research questions are: "Can we categorize ERP project related conflicts?" and, if so, "What are the categories of conflicts in relation to ERP implementation in the APAC region?" Alsulami (2013), in "Consolidating Understanding of ERP Conflicts: A Dialectic Perspective" (Computer Science and Information Systems Faculty, Umm Al-Qura University), categorised ERP project conflicts in the Australian experience into two groups: technical-related and process-related. However, thirteen business cases in Sri Lanka, India and Malaysia show that conflicts can be categorised as people-related, technology-related and methodology-related. These findings can be effectively used by ERP implementers, vendors, consultants, project managers and researchers in their respective projects.
Item De-Identification for Privacy Protection in Audio Contents (Faculty of Computing and Technology, University of Kelaniya, Sri Lanka, 2017) Induruwa, K.G.; Pallewatta, A.P.

Among the different forms of audio data, the author limits the scope of this research to privacy protection in the voice content of speakers, because voice generally conveys intelligence such as gender and emotion, and it differs from speaker to speaker. De-identification of voice may bring numerous advantages, such as preserving the privacy of speakers during communication, maintaining the confidentiality of inquirers who conduct critical investigations, and improving the clarity of voice signals used in airport/aviation communication by standardizing the voices of pilots and air traffic controllers. Though advanced voice encryption methods are available to degrade the intelligibility of speech, they do not directly address speaker de-identification. This research project aims at de-identification of voice signals while preserving the intelligibility of the speech during communication. (Figure: designed GUI for mono LPC spectra of original and de-identified voice signals.) In this project, the de-identification process was done in three stages, of which the last two are irreversible. First, in the frequency normalization stage, the pitch of the original signal is changed, slightly de-identifying the voice in the frequency domain. Second, the 12 LPC (Linear Predictive Coding) coefficient values of the subject's original voice signal are subtracted from the 12 coefficient values of a reference sample voice signal, so the features are slightly moderated by this stage. In the third stage, the features are destroyed again by shuffling the LPC coefficients randomly within three categories. This whole process is therefore expected to preserve a higher level of privacy. A test carried out using 15 male and 15 female voice samples produced degrees of de-identification of 10% and 20%, which can be accepted as a very satisfactory result.
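The second and third stages (LPC subtraction and coefficient shuffling) can be illustrated with a rough Python sketch. Here librosa's LPC routine stands in for the 12-coefficient analysis, and the reference file and the grouping into three categories are assumptions; this is not the authors' implementation.

```python
# Rough illustration of the LPC-based de-identification stages; librosa's LPC
# stands in for the 12-coefficient analysis, and the file names and the three
# coefficient groups are assumptions, not the authors' implementation.
import numpy as np
import librosa

def lpc12(path):
    y, _ = librosa.load(path, sr=44100)
    return librosa.lpc(y, order=12)      # 13 values: leading 1 + 12 coefficients

subject = lpc12("subject_voice.wav")     # placeholder file names
reference = lpc12("reference_voice.wav")

# Stage 2: moderate the features by subtracting the subject's coefficients
# from the reference coefficients.
modified = reference - subject

# Stage 3: destroy remaining speaker traits by shuffling the coefficients
# randomly within three groups (the grouping is an assumed choice here).
rng = np.random.default_rng()
groups = np.array_split(modified[1:], 3)         # keep the leading term fixed
shuffled = np.concatenate([rng.permutation(g) for g in groups])
deidentified = np.concatenate([[modified[0]], shuffled])
print(deidentified)
```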
Item Design and Development of a Dashboard for a Real-Time Anomaly Detection System (Faculty of Computing and Technology, University of Kelaniya, Sri Lanka, 2017) Korala, H.C.; Weerasooriya, G.N.R.; Udantha, M.; Dias, G.

Web logs contain a wealth of undiscovered information on user activities, and if analyzed properly they can be utilized for many purposes. Identifying malicious attacks and producing a daily summary of user activities are examples of the valuable information that can be extracted from these log files. Many tools and algorithms have been developed to extract information from log files, but on most occasions they fail to present this information in a way that lets users make decisions in real time. This paper presents a novel approach to designing and developing a dashboard for a real-time anomaly detection system, using open-source tools to process complex events in real time, big data tools to batch-process stored data, and modern dashboard development techniques. The system accepts web log files as input; they are first cleaned by a preprocessing unit and then published as events to WSO2's Complex Event Processor (CEP), which identifies and filters special patterns and summarises them using a set of user-specified rules. If an anomaly is detected, an alert or warning is displayed on the widget-based dashboard in real time. Furthermore, every event stream that reaches the CEP is forwarded to WSO2's Data Analytics Server via the Thrift protocol, and that data is saved in a Cassandra big data database for further batch processing, which is used for drill-down purposes. A widget-based dashboard was developed using modern dashboard concepts and web technologies to display information such as daily summaries and possible security breaches in an interactive way, allowing system administrators to make operational decisions immediately based on the information provided. Moreover, users can drill down into and analyze historical security breach information and can customize the dashboard according to their preferences. The evaluation techniques used fall under two criteria: evaluation against well-established standards and evaluation by external expert review. Evaluation for security was done against the standard set by the PCI Security Standards Council, and evaluation of the dashboard was carried out against the dashboard standards defined by Oracle, which describe best practices in developing an effective dashboard. The external expert review was conducted with people who have prior experience of dashboards in different contexts: ten expert evaluators from different areas of expertise (system administrators, UX engineers and QA engineers) took part, and a score-based model was used to determine how efficiently the dashboard lets users view and drill into information. Based on the evaluation results, the dashboard meets international dashboard design standards and well-established security standards, and provides a good user experience for users in different functional areas.
Item Detecting and Classifying Vehicles in Video Streams of Homogeneous and Heterogeneous Traffic Environments Using Gaussian Mixture Model (Faculty of Computing and Technology, University of Kelaniya, Sri Lanka, 2017) Jayathilake, M.V.M.; Jayalal, S.G.V.S.; Rajapakse, R.A.C.P.

Traffic and transportation play an important part in modern national economies, and efficient use of transportation infrastructure leads to huge economic benefits. Traffic can be classified into two main categories: homogeneous traffic and heterogeneous traffic. In transportation engineering, sufficient, reliable and diverse traffic data is necessary for effective planning, operations, research and professional practice. Even though Intelligent Transport Systems are used to address this need, they are not yet fully successful. Many technologies have been developed to collect different types of traffic data, but traditional data collection technologies have several drawbacks; video-based traffic analysis, on the other hand, has become popular. Computer vision techniques are used for detecting and classifying vehicles in traffic videos. These technologies are highly beneficial as they can provide more information about the traffic parameters, are easy to install and maintain, and offer wide-range operation. In computer vision, the vehicle detection process has two main steps: Hypothesis Generation (HG) and Hypothesis Verification (HV). Background subtraction is a popular method used in HG; several algorithms are used for background subtraction, and the Gaussian Mixture Model is one of them. These methods are generally used in homogeneous traffic situations. The objective of this study is to detect and classify vehicles in homogeneous and heterogeneous traffic video streams using the Gaussian Mixture Model. The study was conducted using an experimental method, for which several sets of road traffic videos were collected. One set was collected at off-peak time, i.e. 9.00 am to 10.00 am, when the behaviour of the traffic is similar to a homogeneous traffic environment. The other set was collected from 7.00 am to 8.30 am, when road traffic has no order and the traffic density is high, similar to a heterogeneous traffic environment. After grey-scaling and noise reduction, the videos were submitted to an algorithm based on the Gaussian Mixture Model, implemented in MATLAB. Vehicles were classified as large, medium and small, and the manual observation results were compared with the experimental results. Accurate results were observed in homogeneous traffic conditions, but the results in heterogeneous traffic conditions were less accurate. The Gaussian Mixture Model can therefore be used to detect vehicles in homogeneous traffic conditions successfully, but it needs to be improved for heterogeneous traffic conditions.
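The GMM-based background subtraction and size-based classification can be sketched in Python with OpenCV's MOG2 subtractor. The original implementation was in MATLAB, and the video path and contour-area thresholds below are illustrative assumptions rather than the study's values.

```python
# Sketch of GMM background subtraction with size-based vehicle classification;
# the original study used MATLAB, and the video path and area thresholds here
# are illustrative assumptions.
import cv2

cap = cv2.VideoCapture("traffic.mp4")                # placeholder video file
subtractor = cv2.createBackgroundSubtractorMOG2(history=500, varThreshold=16)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)   # grey-scaling
    blurred = cv2.GaussianBlur(gray, (5, 5), 0)      # noise reduction
    mask = subtractor.apply(blurred)                 # GMM foreground mask

    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    for contour in contours:
        area = cv2.contourArea(contour)
        if area < 500:                               # ignore small noise blobs
            continue
        label = ("small" if area < 2000
                 else "medium" if area < 8000 else "large")
        x, y, w, h = cv2.boundingRect(contour)
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
        cv2.putText(frame, label, (x, y - 5),
                    cv2.FONT_HERSHEY_SIMPLEX, 0.5, (0, 255, 0), 1)

cap.release()
```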
Item Dynamic Human Workflow handling by PL/SQL (Faculty of Computing and Technology, University of Kelaniya, Sri Lanka, 2017) Ranasinghe, A.N.; Dasanayaka, D.M.N.K.

Human task assignment is a predominant operation in the organizational problem-solving process, and it can change rapidly from situation to situation. Current workflow handling systems are less adaptable and less customizable, regardless of whether the workflows are manual or automated. This study introduces a series of algorithms written in SQL to handle human task workflows by executing XML-based objects. The solution consists of entity objects, each of which can be either a single person or a group of people. The two types of entity objects are connected to each other by relationships, and the relationship from one entity to another holds the actions that the first entity can perform. The action taken by one entity decides the proceeding path of the workflow. Each entity object has a property called status, which can be true, false or null and which represents whether it is available for execution. SQL scripts were developed to handle the workflow components written in XML format, which provides a better opportunity to gather information about each entity and relationship object through graphical user interfaces. As its first step, the SQL script converts the XML data into a SQL table format, which provides a better way to handle the gathered information. The Backward Process is used whenever a component is executed: the previous components from that level are checked for status values, and the status is set to false whenever a component without any status value is found. This prevents the execution of unwanted branches of the workflow and speeds up execution, because only components with true or null status values are considered at execution time. Reset Workflow is used to reset the status values of objects from the directed object onwards when the workflow path leads back to a previous level of the workflow; it ensures that previously taken actions do not affect the next execution cycle. The Execute an Action method handles the actions taken for a particular object and decides the proceeding path of the workflow hierarchy according to the action taken; it then invokes the Execute Next Component method to move along the selected path. The Execute Next Component method checks all the other objects related to the object that invoked the method and executes the logical operations, based on the action taken by an entity, in order to decide the proceeding path of the workflow. The method given in the study was tested by integrating it into an existing system, where it showed the capability to execute complex workflows accurately. In contrast to a manual workflow engine, this architecture is efficient and effective in business processes, as it can increase the performance of organizational workflow allocation. Instead of using a separate application, this solution can be integrated with an existing system, since it is very adaptable and customizable. An approach to handling scheduled tasks is identified as a major future direction for the study, and performance can also be improved further.

Item Dynamic Mechanical and Thermal Properties of Natural Rubber Latex Films Filled with Surface Modified Silica (Faculty of Computing and Technology, University of Kelaniya, Sri Lanka, 2017) Somaratne, M.C.W.; Silva, S.N.H.; Liyanage, N.M.V.K.; Walpalage, S.

Natural rubber latex (NRL) is used in numerous fields due to its outstanding properties, such as excellent elasticity and an essentially eco-friendly nature. However, the existing tensile and tear strength of NRL is not sufficient for extra-thin film products; therefore, scientists work on reinforcing NRL with silica filler. The hydrophilic nature of silica particles, however, works against their compatibility with hydrophobic rubber. Hence, surface modification of silica is essential to convert the hydrophilic surface into a hydrophobic one, and this has been successfully accomplished using hydrophilic polymers. The modified filler has succeeded in reinforcing NRL films in preceding studies. Beyond reinforcement, properties such as low stiffness and mechanical and thermal stability also play a major role in enhancing the quality of a consumable thin-film product, and a Dynamic Mechanical Thermal Analyzer (DMTA) is widely used to investigate them. The present study focuses on such properties of NRL cast films reinforced with silica filler. The surface modification of the silica particles and the preparation of the modified-filler (8 phr) cast films were carried out as per our team's previously reported methods, and the films filled with modified (8M) and unmodified (8U) filler, together with unfilled (STD) films, were analyzed using the tension and dual-cantilever modes of the DMTA instrument to investigate their thermal and mechanical properties as a function of temperature. The storage modulus and tan delta curves obtained show the energy dissipation throughout the rubber film as the temperature increases. The tan delta curves shown in Figure 1 indicate that the lowest tan delta peak value is given by the modified-filler rubber film (8M), revealing that 8M has a higher energy-dissipating ability. The rate of decrease in storage modulus is also low in the 8M sample in the phase transition region from the glassy state to the viscoelastic state, which indicates improved interfacial interactions between the modified silica and the rubber matrix. The higher energy-dissipating ability reveals that the foremost rubbery properties are sustained, whilst the improved interfacial interactions reveal the reinforcing effect of the modified filler in the rubber film.