Optimized Protocol for IOT's Data Transmission Using Machine Learning


dc.contributor.author M.Awais Ali, 01-133162-084
dc.contributor.author Aizaz, 01-133162-007
dc.date.accessioned 2022-04-13T06:34:15Z
dc.date.available 2022-04-13T06:34:15Z
dc.date.issued 2020
dc.identifier.uri http://hdl.handle.net/123456789/12606
dc.description Supervised by: Ammara Nasim en_US
dc.description.abstract Internet of Things (IoT) based sensor networks have gained popularity in recent years and have become vital for supporting high-data-rate real-time applications. By the end of 2020, there are projected to be 31 billion IoT devices worldwide. To achieve efficient data transmission, each IoT node has to understand the recent time and spectral features of the channel in order to increase throughput. The literature proposes methods such as channel allocation and channel quality measurement protocols for multi-channel sensor networks. To the best of our knowledge, few protocols can adapt and learn with respect to changing channel attributes in an IoT network so as to maximize data transmission and channel throughput. We propose an automated, self-learning, adaptable protocol that transmits multi-user data efficiently by recognizing the frequency and time features of the channel. The proposed protocol is novel in that it adapts itself to changing network dynamics such as network density and the amount of data to be transmitted. This is achieved by constantly extracting well-defined features from the network; the best channel is then selected according to its time and spectral characteristics using these features. Each node is equipped with a non-linear support vector machine classification model with a Gaussian radial basis kernel function to decide between time-based and frequency-based partitioning. The protocol shows promising results under increasing network density, uses the bandwidth efficiently, and achieves better data transmission. en_US
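
The abstract names a non-linear SVM with a Gaussian radial basis (RBF) kernel that decides, per node, between time-based and frequency-based channel partitioning from extracted network features. The following is a minimal sketch of that decision step in Python with scikit-learn; the feature set, the labelling rule, and the hyperparameters are illustrative assumptions, since the record does not specify them.

```python
# Minimal sketch of the partitioning decision described in the abstract.
# Assumed (hypothetical) features per node snapshot: node density, queued
# data volume, time-domain channel stability, frequency-domain channel
# stability. The real protocol's features and training data are not given.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Synthetic training set: 200 feature snapshots in [0, 1]^4.
# Labels: 0 = time-based partitioning, 1 = frequency-based partitioning.
X = rng.uniform(0.0, 1.0, size=(200, 4))
# Illustrative labelling rule: prefer frequency partitioning when the
# spectral characteristics are more stable than the temporal ones.
y = (X[:, 3] > X[:, 2]).astype(int)

# Non-linear SVM with a Gaussian RBF kernel, as named in the abstract;
# gamma and C are library defaults here, not the authors' settings.
clf = SVC(kernel="rbf", gamma="scale", C=1.0)
clf.fit(X, y)

# At run time each node classifies its current feature snapshot and
# selects the corresponding channel-partitioning scheme.
snapshot = np.array([[0.6, 0.3, 0.2, 0.8]])
scheme = "frequency-based" if clf.predict(snapshot)[0] == 1 else "time-based"
print(f"Selected partitioning: {scheme}")
```

Note that this only mirrors the classification step; the feature-extraction and channel-selection logic around it would be protocol-specific.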
dc.language.iso en en_US
dc.publisher Bahria University Engineering School en_US
dc.relation.ispartofseries BEE;P-1636
dc.subject Electrical Engineering en_US
dc.title Optimized Protocol for IOT's Data Transmission Using Machine Learning en_US
dc.type Project Reports en_US

