2021 International Conference on Innovation and Intelligence for Informatics, Computing, and Technologies (3ICT) Program
Wednesday, September 29
Wednesday, September 29 9:30 - 9:35 (Asia/Bahrain)
Wednesday, September 29 9:35 - 9:40 (Asia/Bahrain)
Wednesday, September 29 9:40 - 9:50 (Asia/Bahrain)
Wednesday, September 29 9:50 - 10:00 (Asia/Bahrain)
Wednesday, September 29 10:00 - 10:40 (Asia/Bahrain)
Computing is being transformed into a model consisting of services delivered in a manner similar to utilities such as water, electricity, gas, and telephony. In such a model, users access services based on their requirements, without regard to where the services are hosted or how they are delivered. The Cloud computing paradigm has turned this vision of "computing utilities" into a reality. It offers infrastructure, platform, and software as services, made available to consumers as subscription-oriented services. Cloud application platforms need to offer (1) APIs and tools for rapid creation of elastic applications and (2) a runtime system for seamless deployment of applications on geographically distributed computing infrastructure. The Internet of Things (IoT) paradigm enables seamless integration of the cyber and physical worlds and opens up opportunities for creating new classes of applications for domains such as smart cities and smart healthcare. The emerging Fog/Edge computing paradigm extends the Cloud computing model to edge resources for latency-sensitive IoT applications, with seamless integration of network-wide resources all the way from the edge to the Cloud.
This keynote presentation will cover (a) the 21st century vision of computing and the various IT paradigms promising to deliver the vision of computing utilities; (b) an innovative architecture for creating elastic Clouds integrating edge resources and managed Clouds; (c) Aneka 5G, a Cloud Application Platform for rapid development of Cloud/Big Data applications and their deployment on private/public Clouds with resource provisioning driven by SLAs; (d) a novel FogBus software framework with Blockchain-based data-integrity management for facilitating end-to-end IoT-Fog/Edge-Cloud integration for execution of sensitive IoT applications; (e) experimental results on deploying Cloud and Big Data/IoT applications in engineering, health care (e.g., COVID-19), deep learning/artificial intelligence (AI), satellite image processing, natural language processing (mining COVID-19 research literature for new insights), and smart cities on elastic Clouds; and (f) directions for delivering our 21st century vision, along with pathways for future research in Cloud and Edge/Fog computing.
Wednesday, September 29 10:40 - 13:00 (Asia/Bahrain)
Wednesday, September 29 13:00 - 14:40 (Asia/Bahrain)
- 13:00 Remaining Useful Life Prediction of Turbofan Engine using Long-Short Term Memory
- The aero-engine is a crucial component of the aircraft that provides thrust for the plane. To ensure the safety of the aircraft, it is vital to estimate the remaining useful life (RUL) of the engine. Over the past decades, research on Prognostic Health Management (PHM) has gained popularity in engineering because of machinery faults. The failure of machinery systems can cause many incidents, such as delays or increased operating costs. Thus, to monitor the reliability and safety of an engineering system, improve maximum operating availability, and reduce maintenance cost, RUL is used to predict the future performance of the machinery and prevent faults. In this study, we propose a model for RUL estimation based on Long Short-Term Memory (LSTM), which can fully exploit sensor sequence information and reveal hidden patterns in sensor data. The proposed LSTM model achieved an accuracy of 0.978 and an F1-score of 0.960, while the regression model performance was evaluated using three evaluation metrics: mean absolute error (MAE), coefficient of determination (R2), and recall. The results achieved for MAE, R2, and recall were 12, 0.7856, and 1, respectively.
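As an illustration of how such sensor sequences are typically prepared for an LSTM RUL model, the sketch below windows a run-to-failure trajectory and caps the RUL target; the window length and RUL cap are illustrative assumptions, not parameters taken from the paper.

```python
# Sliding-window preparation of sensor sequences for RUL regression.
# Window length and the piecewise-linear RUL cap are illustrative
# assumptions, not settings from the paper.

def make_windows(sensor_rows, window=5, rul_cap=130):
    """sensor_rows: per-cycle sensor vectors for one engine, ordered
    from first cycle to failure. Returns (X, y) where each X[i] is a
    window of consecutive cycles and y[i] is the capped RUL at the
    window's last cycle."""
    n = len(sensor_rows)
    X, y = [], []
    for end in range(window, n + 1):
        X.append(sensor_rows[end - window:end])
        rul = n - end          # cycles remaining after this window
        y.append(min(rul, rul_cap))
    return X, y

# Toy engine trajectory: 8 cycles, 2 sensor readings per cycle.
rows = [[float(c), float(c) * 0.5] for c in range(8)]
X, y = make_windows(rows, window=3)
```

Each (X[i], y[i]) pair would then feed the LSTM as one training sample.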
- 13:20 A Homomorphic Cloud Framework for Big Data Analytics Based on Elliptic Curve Cryptography
- Homomorphic Encryption (HE) is a sophisticated and powerful cryptographic system that can preserve the privacy of data both at rest and during processing and computation. All the computations needed by the user or the provider can be done on the encrypted data without any need to decrypt it. However, HE has overheads such as large key sizes and long ciphertexts, and as a result long execution times. This paper proposes a novel solution for big data analytics based on clustering and Elliptic Curve Cryptography (ECC). The Extremely Distributed Clustering (EDC) technique has been used to divide big data into several subsets across cloud computing nodes. Different clustering techniques were investigated, and it was found that using hybrid techniques can improve the performance and efficiency of big data analytics while data is protected and privacy is preserved using ECC.
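The paper's ECC-based scheme is not reproduced here, but the additive group structure that EC-based homomorphic constructions rely on can be sketched on a small textbook curve (y^2 = x^3 + 2x + 2 over F_17 with base point G = (5, 1)): combining the points k1*G and k2*G yields (k1 + k2)*G without ever revealing k1 or k2.

```python
# Toy elliptic-curve arithmetic illustrating the additive property
# underlying EC-based homomorphic schemes. Curve and base point are a
# standard textbook example, not the paper's parameters.
P = 17
A = 2  # curve: y^2 = x^3 + 2x + 2 over F_17

def inv(x):
    return pow(x, P - 2, P)     # modular inverse via Fermat

def ec_add(p1, p2):
    """Group law; None represents the point at infinity."""
    if p1 is None:
        return p2
    if p2 is None:
        return p1
    (x1, y1), (x2, y2) = p1, p2
    if x1 == x2 and (y1 + y2) % P == 0:
        return None
    if p1 == p2:
        s = (3 * x1 * x1 + A) * inv(2 * y1) % P
    else:
        s = (y2 - y1) * inv(x2 - x1) % P
    x3 = (s * s - x1 - x2) % P
    return (x3, (s * (x1 - x3) - y1) % P)

def ec_mul(k, pt):
    """Double-and-add scalar multiplication."""
    result = None
    while k:
        if k & 1:
            result = ec_add(result, pt)
        pt = ec_add(pt, pt)
        k >>= 1
    return result

G = (5, 1)
# Additive structure exploited by EC-based homomorphic schemes:
lhs = ec_add(ec_mul(3, G), ec_mul(4, G))
rhs = ec_mul(7, G)
```

Real deployments use standardized curves with ~256-bit primes; the tiny field here only makes the group law easy to follow.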
- 13:40 Modified Version of the Cumulative Sum Statistical Analysis Method
- A cumulative sum (CUSUM) is a method used to identify significant changes in data trends using the cumulative sum of deviations from a predefined target value. The current process can only detect changes in horizontal trends and relies on visual inspection. The paper presents a modified method to detect upward and downward trends using a data transformation algorithm. The algorithm automates the detection process and removes the need for visual inspection. The technique enhances the ability of stakeholders to anticipate fluctuations in large data sets and make informed decisions.
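A minimal sketch of the idea, assuming a simple least-squares detrending as the data transformation (the paper's exact transformation algorithm is not reproduced): once a linear trend is removed, a CUSUM against a zero target can flag shifts in sloped series, not only horizontal ones.

```python
# CUSUM of deviations from a target, plus a linear-detrending
# transform. The detrending step is an illustrative assumption, not
# the paper's published algorithm.

def cusum(data, target):
    """Cumulative sum of deviations from a predefined target value."""
    total, out = 0.0, []
    for x in data:
        total += x - target
        out.append(total)
    return out

def detrend(data):
    """Remove a least-squares linear trend so CUSUM can detect
    changes in sloped (upward/downward) series."""
    n = len(data)
    xs = range(n)
    mx = sum(xs) / n
    my = sum(data) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, data)) / \
            sum((x - mx) ** 2 for x in xs)
    return [y - (my + slope * (x - mx)) for x, y in zip(xs, data)]

series = [1.0, 2.0, 3.0, 4.0, 5.0]   # pure upward trend
flat = detrend(series)               # residuals are ~0
c = cusum(flat, target=0.0)          # no spurious change signalled
```

A genuine level shift superimposed on the trend would survive detrending and show up as a drift in the CUSUM.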
- 14:00 Hyper Parameter Tuned Ensemble Approach for Gestational Diabetes Prediction
- Diabetes Mellitus is commonly found in human beings around the world and is one of the serious diseases that causes boundless suffering among patients. There are numerous reasons for the prevalence of this disease. It is worth considering the predictions carried out earlier in this respect, since diabetes is a non-communicable disease that greatly affects people's health nowadays, making them vulnerable to COVID-19. This is why existing medicinal practices in most hospitals collect patients' life history or disease records, used for diagnosing diabetes through various medical tests followed by proper treatment. Machine learning provides an immense contribution to the healthcare sector. For this research, the Pima Indians Diabetes Dataset, obtained from the University of California, Irvine (UCI) machine learning repository and comprising 768 patients' details with nine attributes, was chosen for a comprehensive investigation of this grave and widespread problem in the health sector. Of the 768 patients, 268 were recorded as positive for the disease while 500 were recorded as negative. The recorded data were fed into supervised machine learning techniques such as Support Vector Machine (SVM), Naïve Bayes (NB), Decision Tree (DT), Artificial Neural Networks (ANN), Linear Discriminant Analysis (LDA), Logistic Regression (LR), and k-nearest neighbors (k-NN). Bagging and boosting techniques such as Random Forest (RF), Extreme Gradient Boosting (XGBoost), LightGBM, and CatBoost were also taken into consideration. Finally, by considering the classifiers with the highest accuracies, an ensemble model was developed from SVM, CatBoost, and RF to predict diabetes mellitus. The model achieved an accuracy of 86.15%.
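A hard-voting ensemble of the kind described can be sketched as follows; the stand-in classifiers below are toy threshold rules on a single feature, not the tuned SVM, CatBoost, and RF models of the paper.

```python
# Minimal hard-voting ensemble sketch. The "classifiers" here are
# illustrative threshold functions; in the paper the voters are tuned
# SVM, CatBoost and Random Forest models.
from collections import Counter

def majority_vote(classifiers, sample):
    """Return the class label predicted by most classifiers."""
    votes = [clf(sample) for clf in classifiers]
    return Counter(votes).most_common(1)[0][0]

# Illustrative stand-ins mapping a glucose-like feature to 0/1.
clf_a = lambda s: 1 if s["glucose"] > 140 else 0
clf_b = lambda s: 1 if s["glucose"] > 120 else 0
clf_c = lambda s: 1 if s["glucose"] > 160 else 0

pred = majority_vote([clf_a, clf_b, clf_c], {"glucose": 150})
```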
- 14:20 Developing A Predictive Model for Diabetes Using Data Mining Techniques
- According to a World Health Organization (WHO) survey from 2018, diabetes mellitus is one of the rapidly developing chronic life-threatening illnesses, affecting 422 million people worldwide. Early diagnosis of diabetes is often preferred for a clinically significant outcome because of its long asymptomatic period. Data science approaches have the potential to benefit other research fields; tools that rely heavily on Data Mining (DM) techniques can be used to forecast diabetes effectively. In this article, three DM methods are used to investigate the early detection of diabetes: Naïve Bayes (NB), Logistic Regression (LR), and Random Forest (RF). The experimental results of this research study showed that RF achieved the highest accuracy compared to the other techniques.
- 13:00 Stylometric Analysis of Writing Patterns Using Artificial Neural Networks
- Plagiarism checkers have been widely used to verify the authenticity of dissertation/project submissions. However, when non-verbatim plagiarism or online examinations are considered, this practice is not the best solution. In this work, we propose a better authentication system for online examinations that analyses the submitted text's stylometry for a match with the writing pattern of the author who submitted the text. The writing pattern is analyzed over many indicators (i.e., features of one's writing style). This model extracts 27 such features and stores them as the writing pattern of an individual. Stylometric analysis is a better approach to verifying a document's authorship: rather than checking for plagiarism, it verifies whether the document was written by a particular individual, and hence completely shuts down the possibility of using text converters or translators. This paper also includes a brief comparative analysis of some simpler algorithms for the same problem statement. These algorithms yield results that vary in precision and accuracy, and drawing a conclusion from the comparison shows that the best approach to this problem is Artificial Neural Networks.
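A sketch of the kind of indicator extraction described, assuming three simple illustrative features (the actual system extracts 27):

```python
# Toy stylometric feature extraction. The three indicators below are
# illustrative assumptions; the described system uses 27 features.
import re

def stylometric_features(text):
    """Return a small writing-pattern fingerprint for a text."""
    words = re.findall(r"[A-Za-z']+", text)
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    return {
        "avg_word_len": sum(len(w) for w in words) / len(words),
        "avg_sentence_len": len(words) / len(sentences),
        "comma_rate": text.count(",") / len(words),
    }

f = stylometric_features("I came, I saw. I conquered the city!")
```

Vectors like this, computed per author, are what a downstream ANN would be trained to match against a new submission.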
- 13:20 Cracked Wall Image Classification Based on Deep Neural Network Using Visibility Graph Features
- Visibility graphs are graphs created by using the relations of objects with each other depending on their visibility. Today, visibility graphs are used quite frequently in signal processing applications. In this study, cracked and non-cracked wall images taken from a dataset were classified by a deep neural network based on visibility graph properties. In the proposed method, histograms of the images are first obtained. Each resulting histogram is then expressed as a visibility graph. A feature vector for each image is created from the maximum clique and maximum degree features of the obtained visibility graphs. Deep neural network training is then performed with the created feature vectors. The classification success of the proposed method on the images reserved for testing is 99%.
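The natural visibility graph construction on a histogram can be sketched as follows; only the maximum-degree feature is computed here (the paper also uses the maximum clique, omitted for brevity).

```python
# Natural visibility graph of a histogram: bins i and j are connected
# when the straight line between their bar tops clears every bar in
# between. Max degree is one of the two graph features the paper uses.

def visibility_edges(h):
    edges = []
    n = len(h)
    for i in range(n):
        for j in range(i + 1, n):
            visible = all(
                h[k] < h[i] + (h[j] - h[i]) * (k - i) / (j - i)
                for k in range(i + 1, j)
            )
            if visible:
                edges.append((i, j))
    return edges

def max_degree(edges, n):
    deg = [0] * n
    for i, j in edges:
        deg[i] += 1
        deg[j] += 1
    return max(deg)

h = [3, 1, 2, 1, 4]          # toy 5-bin histogram
edges = visibility_edges(h)
```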
- 13:40 An Effective Cost-Sensitive Convolutional Neural Network for Network Traffic Classification
- The volume and density of computer network traffic are increasing dramatically with technological advancement, and this has led to the emergence of various new protocols. Analyzing the huge amount of data in large business networks has become important for the owners of those networks. The majority of newly developed applications need guaranteed network services, while some traditional applications may work well enough without a specific service level; therefore, the performance requirements of future internet traffic will rise. Increasing pressure on the performance of computer networks requires addressing several issues, such as maintaining the scalability of new service architectures, establishing control protocols for routing, and distributing information to identified traffic streams. The main concern is flow and traffic detection mechanisms to help establish traffic control policies. A cost-sensitive deep learning approach for encrypted traffic classification is proposed in this research to counter the effect of the class imbalance problem on low-frequency traffic data detection. The developed model attains a high level of performance, particularly for low-frequency traffic data, and outperformed other traffic classification methods.
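The cost-sensitive idea can be sketched as inverse-frequency class weighting of the loss, one common (assumed) instantiation; the class counts and predicted probabilities below are toy values, not the paper's traffic data.

```python
# Inverse-frequency class weighting: rare (low-frequency) traffic
# classes contribute more to the training loss. This is an assumed,
# common instantiation of cost sensitivity, not the paper's exact loss.
from math import log

def class_weights(counts):
    """weight(c) = total / (n_classes * count(c))."""
    total = sum(counts.values())
    return {c: total / (len(counts) * n) for c, n in counts.items()}

def weighted_loss(prob_true, label, weights):
    """Class-weighted negative log-likelihood for one sample."""
    return -weights[label] * log(prob_true)

weights = class_weights({"http": 900, "rare": 100})
# The same predicted probability costs more when the true class is rare:
loss_rare = weighted_loss(0.5, "rare", weights)
loss_http = weighted_loss(0.5, "http", weights)
```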
- 14:00 Urban Sound Classification Using DNN, CNN & LSTM: A Comparative Approach
- Like air pollution, sound pollution has grown to be a major concern for city residents, designers, and developers. Detecting and recognizing sound types and sources in cities, suburban areas, or any environment has become a necessity for quality of life as well as security. In recent years, researchers have explored many models using Convolutional Neural Networks (CNN), Long Short-Term Memory (LSTM) neural networks, and different combinations of these techniques, which produced promising results when combined with spectrogram images, or variations thereof, to classify urban sounds. This research compares the performance of a DNN as a baseline with CNN and LSTM models in classifying urban sound using Mel-scale cepstral (MEL) spectrum images, with the open-source library Librosa used for sound processing. The models' performance was evaluated on the UrbanSound8k dataset. The CNN model underperformed both the base model and the LSTM model, with an accuracy of 87.15% and an F1 score of 85.63%. In contrast, the LSTM model showed better accuracy on test data, with 90.15% accuracy and an F1 score of 90.15%.
- 14:20 Cardiovascular Diseases Classification Via Machine Learning Systems
- Heart disease patient classification is one of the most important keys in cardiovascular disease diagnosis. Researchers have used several data mining methods to support healthcare specialists in disease analysis. This research studied diverse supervised machine learning systems for heart disease data classification: Decision Tree (DT), Artificial Neural Network (ANN), Naïve Bayes (NB), and Support Vector Machine (SVM) classifiers, applied over two heart disease datasets from the UCI machine learning repository. Results showed that the ANN classifier outperformed the three other classifiers with the highest accuracy rate.
- 13:00 Socio-technical Challenges in the Implementation of Smart City
- The smart city concept is a solution to many problems that we face in our day-to-day life. Many countries have started adopting prototypes to solve daily challenges. Still, the term smart city does not have any universally accepted definition. It is a ubiquitous term whose definition varies from person to person, city to city, and country to country. The lack of a common definition leaves the term in a chaotic state. This paper presents the socio-technical challenges encountered during the implementation of smart city plans. It analyzes smart city execution problems in the context of developed and underdeveloped countries.
- 13:20 Expected Load Loss and Investment Cost of Four Power Plants Reliability Study Investigation
- Electric power systems are studied in order to reach a reliable system. The present research considers reaching a reliable system for different cases of the power system units under study. Four multi-unit power plants are considered, with different capacity ratings, different forced outage rate values, and different numbers of power units. The binomial distribution is applied to the selected power units: the probabilities are calculated, and the expected load loss (ELL) is then determined for the four selected power plants. From the obtained results, the most reliable power plant is found and ranked by priority. In addition, models for the four power plants are developed using curve fitting on the output results. The corresponding electric power units were ranked based on the priority of the expected load loss results when compared together. The present model's expected load loss provides a practical tool for quantitative analysis of power plant units. Finally, the investment cost is calculated for the four power plants.
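The binomial ELL computation described can be sketched as follows for identical units; the capacities, forced outage rate, and load level below are illustrative numbers, not the paper's plant data.

```python
# Binomial model of unit availability and the expected load loss (ELL).
# Unit count, capacity, forced outage rate (FOR) and load are
# illustrative values, not the plant data from the paper.
from math import comb

def outage_probs(n_units, q):
    """P[k units on forced outage] for identical units with FOR q."""
    return [comb(n_units, k) * q**k * (1 - q)**(n_units - k)
            for k in range(n_units + 1)]

def expected_load_loss(n_units, unit_cap, q, load):
    """Sum over outage states of P[state] * unserved load."""
    ell = 0.0
    for k, p in enumerate(outage_probs(n_units, q)):
        available = (n_units - k) * unit_cap
        ell += p * max(0.0, load - available)
    return ell

# Example: 4 units of 25 MW, forced outage rate 0.02, 80 MW load.
ell = expected_load_loss(4, 25.0, 0.02, 80.0)
```

Repeating this for plants with different unit counts, capacities, and FOR values gives the ELL ranking the abstract describes.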
- 13:40 Analysis of the impact of faults in a photovoltaic generator
- Throughout this article, we identify an electrical model based on Bishop's model that mathematically represents the actual behavior of a PV cell, a key element in the production of a PV generator (PVG). It turns out that the performance of a PVG is intrinsically linked to operating, manufacturing, and environmental conditions. The nonlinear model that we retained, owing to the atypical behavior of the semiconductors used in the manufacture of the panels, reveals faults requiring efficient diagnosis. We present some simulation results under Matlab that characterize the impact and influence of certain PVG faults on the performance of the studied system. The good operating prospects resulting from the simulation results are very encouraging for further work in the field of fault diagnosis and prognosis.
- 14:00 Automatic Human Fall Detection Using Multiple Tri-axial Accelerometers
- Accurately detecting falls of elderly people at an early stage is vital for providing early alerts and avoiding serious injury. Towards this purpose, data from multiple tri-axial accelerometers have been used to uncover falls based on an unsupervised monitoring procedure. Specifically, this paper introduces a one-class support vector machine (OCSVM) scheme for human fall detection. The main motivation behind the use of OCSVM is that it is a distribution-free learning model that can separate nonlinear features in an unsupervised way, without the need for labeled data. The proposed OCSVM scheme was evaluated on fall detection databases from the University of Rzeszów. Three other promising classification algorithms, Mean Shift, Expectation-Maximization, and k-means, were also assessed on the same datasets, and their detection performances were compared with those obtained by the OCSVM algorithm. The results showed that the OCSVM scheme outperformed the other methods.
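Upstream of a one-class model, tri-axial accelerometer windows are commonly reduced to a signal-magnitude vector (SMV) whose peaks mark impacts; the sketch below shows this feature step with an illustrative threshold, not a trained OCSVM boundary.

```python
# Signal-magnitude-vector (SMV) feature often computed before feeding
# accelerometer windows to a one-class model. The 2.5 g threshold is
# an illustrative assumption, not a learned decision boundary.
from math import sqrt

def smv(samples):
    """Magnitude sqrt(x^2 + y^2 + z^2) per (x, y, z) sample, in g."""
    return [sqrt(x * x + y * y + z * z) for x, y, z in samples]

def peak_exceeds(samples, threshold=2.5):
    """Flag a window whose peak SMV exceeds a candidate impact level."""
    return max(smv(samples)) > threshold

walking = [(0.1, 0.2, 1.0), (0.0, 0.3, 0.9)]   # near 1 g throughout
fall = [(0.1, 0.2, 1.0), (2.0, 2.0, 1.5)]      # impact spike
```

An OCSVM would replace the fixed threshold with a boundary learned from normal-activity windows only.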
- 14:20 Household Disposal of Medications as a Pathway of Environmental Contamination in the Kingdom of Bahrain
- Background: Unused and expired medications are continuously disposed of through toilets, drains, and household trash. This is potentially dangerous and polluting, posing risks to public health and the environment. Objective: This study investigated public awareness in the Kingdom of Bahrain regarding contamination of the environment by pharmaceutical waste and assessed patterns of household medication disposal as well as factors influencing the chosen disposal practice. Methods: A cross-sectional study was designed, using a self-administered online questionnaire that was distributed publicly, through social media platforms, to people living in Bahrain and aged above 18 years. Results: The questionnaire was completed by a total of 450 participants, of whom 421 were Bahrainis (93.6%) and 29 were non-Bahrainis (6.4%). Almost two-thirds (60.9%) of the participants had good knowledge regarding environmental contamination by pharmaceutical waste. The majority (73.3%) of the participants discarded leftover medications in the household trash, and only 12.0% returned them to the pharmacy. More than three-quarters (79.6%) of the participants did not check whether a disposal method was mentioned on the medication's packaging. Interestingly, most of the participants (85.1%) declared themselves willing to participate in pharmaceutical waste minimization programs if applied in the Kingdom of Bahrain. Conclusion: Environmental contamination by pharmaceutical waste can be considerably reduced by improving public awareness of household disposal of medications and stimulating willingness to participate in pharmaceutical waste management interventions if established in the future.
- 13:00 Automated Detection and Control System Using Computer Vision based Video Analytics: Case COVID-19
- The classification of COVID-19 as a global pandemic has led researchers and scientists to design solutions to reduce the speed of the spread of the virus. This paper presents a novel detection and control system using Computer Vision based video analytics that helps reduce the speed of the spread of the virus by recognizing people and detecting masks. The system also uses body temperature and user biometrics to control access inside closed environments, to reduce the spread of COVID-19. The proposed system can identify a person who wants to access an indoor environment and track their movement. The system can control the door of the main entrance, an elevator, or any access zone, and generate audio notifications to alert users to put on their masks. The implementation results show that the proposed system has the advantage of high precision and ensures a safe environment while preserving the benefits of being low cost and modular.
- 13:20 Low Cost Sensor Based Hand Washing Solution for COVID-19 Prevention
- One of the most effective preventive measures for infectious diseases such as the COVID-19 pandemic is frequent hand washing. In developing countries, there are several low-cost touch-free hand washing solutions involving foot-operated mechanisms. However, the use of embedded processors in the design of automatic electronic systems to provide convenience and smarter solutions has in recent times gained unique attention globally. In this paper, we employed an Arduino-based microcontroller as processor and ultrasonic distance sensors to implement a touch-free hand washing mechanism. The microcontroller processes received sensor signals and sends the desired command signals to operate two DC motors. A DC pump and a servo motor are used to facilitate simple yet effective dispensing of water and soap, respectively, without any physical contact with the user. The simulation of the developed system was performed with Proteus. The system was also experimentally verified to meet the desired design requirements.
- 13:40 A Secure Internet of Healthcare Things for tackling COVID-19
- Tracking and monitoring systems have gained increasing attention with the rise of the COVID-19 pandemic. This paper provides a brief overview of the IoT and the cognitive capabilities of the Cognitive Internet of Medical Things (CIoMT) and the Internet of Healthcare Things (IoHT) for COVID-19. Then, Xilinx Vivado is used to evaluate various builds of SIMON, a lightweight cryptography technique, as part of securing these systems. The performance indicators in this paper are LUTs, frequency, IOBs, and throughput-to-slice (TP/slice). Moreover, the Cadence Genus tool is used to estimate battery life, area, and gate count.
- 14:00 Communication Limitations During COVID-19 Pandemic: Challenges and Solutions for Public Sector in The Kingdom of Bahrain
- Ineffective communication is a prevalent issue that affects organizations' continuity. Due to the COVID-19 pandemic, organizations have been facing increasing communication challenges, which have resulted in reduced productivity and increased employee frustration. The study aimed to explore the impact of communication limitations during the COVID-19 pandemic on public organizations in the Kingdom of Bahrain. Regarding organization size, the findings reported that most respondents (53.1%) belong to small organizations (fewer than 3,000 employees). During the pandemic, the most popular mode of online communication was Zoom/MS Teams (40.7%), followed by other internet applications (29.6%). Common challenges faced by government employees across public organizations included a change in working hours from regular to shift duty, or extended working hours with no annual leave, which made it difficult to stay motivated. Organizational Network Analysis (ONA) software is suggested as a tool that can provide powerful insights into formal and informal communication channels with clear visualization. Designing continuous training programs for public sector employees that utilize available communication tools and technology aids would increase efficiency. Moreover, communication skills training must be given intensively at all levels of employees, adhering to the "new normal" of working remotely.
- 14:20 Continuity of Project's Follow-up Practice During COVID-19: Identifying Predictors and Challenges
- The importance of studying organizations' continuity of follow-up mechanisms is raised by the absence of research on follow-up mechanisms, especially during sudden pandemics. Therefore, this study investigates the continuity of the follow-up mechanisms organizations use to monitor project progress and accomplishment, and explores the predictors, problems, and challenges of managing remote working. Follow-up is the monitoring and evaluation of project progress against standards to enable management to make decisions for interventions towards project completion through team communication. Findings show that continuity of follow-up practice during COVID-19 is influenced by remote monitoring challenges and organizational compliance with pandemic restrictions (R2 = 0.35). Organizational compliance with pandemic restrictions is a function of three determinants related to the organization's behavior regarding monitoring structure, internal policies, and communication and resource facilities (R2 = 0.54). The researchers used a mixed-methods approach consisting of quantitative and qualitative methods. A survey was randomly distributed to an achievable sample of 158 respondents, followed by interviews with twelve decision-makers, including managers and executives in the selected organizations. The study suggests more technological tools and applications for improving follow-up performance and overcoming remote monitoring challenges.
- 13:00 Development of Vision-Based Autonomous UAV for Railway Tracking
- Railway transportation is commonly used today because it is a cheap and safe mode of transportation. Defects in railways have a significant impact on safety. In recent years, automatic inspection systems have been recommended for fast and reliable inspection of railway defects. Automatic inspection systems are based on collecting images from the railway and examining these images with various techniques. Image collection can be done in various ways; UAVs can be used for defect detection in terms of speed and cost. This study presents an autonomous cruise control mechanism for tracking the railway and collecting images using a UAV. It is important to develop a suitable controller, as it directly affects the overall response time, accuracy, and rail tracking performance of the UAV. In this study, the rail lines were detected using a Gabor filter on the images taken from the UAV. The right and left rails were found by forming two different Gabor kernels. Then, the vanishing point was found from the intersection of the obtained rail lines on the horizon. The obtained vanishing point was followed by the UAV at a constant speed using a PID controller. Experimental results have shown that the UAV successfully follows the railway while keeping it centered.
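The vanishing-point-plus-PID loop can be sketched as follows; the line coefficients, image width, and PID gains are illustrative assumptions, not the flight-tested values.

```python
# Vanishing-point estimate from two detected rail lines and one PID
# correction step for yaw. Line coefficients and PID gains are
# illustrative assumptions, not the paper's tuned values.

def intersect(l1, l2):
    """Intersection of lines given as (a, b, c) with a*x + b*y + c = 0."""
    a1, b1, c1 = l1
    a2, b2, c2 = l2
    det = a1 * b2 - a2 * b1
    if det == 0:
        return None                      # lines parallel in the image
    x = (b1 * c2 - b2 * c1) / det
    y = (c1 * a2 - c2 * a1) / det
    return x, y

class PID:
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev = 0.0

    def step(self, error, dt):
        self.integral += error * dt
        deriv = (error - self.prev) / dt
        self.prev = error
        return self.kp * error + self.ki * self.integral + self.kd * deriv

# Left rail x - y = 0 and right rail x + y = 320 meet at (160, 160).
vp = intersect((1.0, -1.0, 0.0), (1.0, 1.0, -320.0))
pid = PID(kp=0.5, ki=0.0, kd=0.1)
error = vp[0] - 160.0            # offset from assumed image centre x = 160
yaw_cmd = pid.step(error, dt=0.05)
```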
- 13:20 Implementations of Two Answer Submission Methods for Reducing Errors in Android Programming Learning Assistance System
- Recently, Android programming education has become important in university IT departments due to strong market demand. The Android Programming Learning Assistance System (APLAS) is a self-learning platform for Android programming. It provides assignments to students where the correctness of their answers can be checked automatically using unit testing tools through a web application. For each assignment in APLAS, students need to submit multiple files with the correct configuration as one project. However, many students make errors in the current implementation, which submits the files of an Android Studio project one by one. In this paper, we implement two submission methods for reducing errors in APLAS. The zip-file method submits the Android project as a single zipped file. The GitHub-link method synchronizes the files in the Android project through the URL of the GitHub project. For evaluation, we asked 40 students to solve three assignments in APLAS and submit the answer files by freely choosing one method. The results show that the zip-file method was the most popular and the GitHub-link method produced the fewest errors.
- 13:40 Genetic Algorithm for Feature Selection in Predicting Repurchase Intention from Online Reviews
- This paper proposes a methodology to predict repurchase intention based on reviews and the customer's stated intention. However, there is a large number of words in the reviews, and using those words as features in the prediction model tends to decrease the accuracy of the model and cause overfitting. A methodology based on a Genetic Algorithm is proposed to improve the feature selection iteratively. Each chromosome is encoded as a set of randomly selected indices of words in the vocabulary. The fitness of a chromosome is measured as the accuracy of a Decision Tree prediction model using the selected features (i.e., words). The Decision Tree model also provides feature importance values, which are used to rearrange the genes so that the crossover procedure ensures important genes are passed to the offspring. For mutation, information about the tendency rank of the features is used to alter a gene. Therefore, the crossover and mutation procedures are not merely combining and modifying the chromosomes. The proposed methodology is applied to two data sets. For both data sets, the prediction accuracy of the proposed methodology is significantly higher than the baseline, i.e., random selection.
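The GA loop described can be sketched as follows; the fitness and importance functions below are toy stand-ins for the paper's Decision Tree accuracy and feature-importance values.

```python
# Skeleton of GA-based feature selection: chromosomes are sets of word
# indices, fitness is a TOY stand-in for Decision Tree accuracy, and
# crossover keeps genes ranked by a stand-in "importance". All
# specifics here are illustrative assumptions.
import random

VOCAB_SIZE = 20
USEFUL = {1, 3, 5, 7}           # toy ground truth: informative words

def fitness(chrom):
    """Stand-in for Decision Tree accuracy on the selected words."""
    return len(set(chrom) & USEFUL) / len(USEFUL)

def crossover(parent_a, parent_b, k):
    """Keep the higher-'importance' genes from both parents, mimicking
    the importance-based gene reordering described in the paper."""
    pool = sorted(set(parent_a) | set(parent_b),
                  key=lambda g: g in USEFUL, reverse=True)
    return pool[:k]

def evolve(generations=30, pop_size=8, k=4, seed=0):
    rng = random.Random(seed)
    pop = [rng.sample(range(VOCAB_SIZE), k) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        child = crossover(pop[0], pop[1], k)
        # Mutation: swap one gene for a word not yet selected.
        if rng.random() < 0.5:
            child = child[:]
            child[rng.randrange(k)] = rng.choice(
                [w for w in range(VOCAB_SIZE) if w not in child])
        pop[-1] = child             # replace the worst chromosome
    return max(pop, key=fitness)

best = evolve()
```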
- 14:00 Steps Towards Successful Artificial Intelligence Integration in the Era of Industry 4.0
- In the era of Industry 4.0, the rapid emergence of cloud computing and the Internet of Things (IoT) has supplied organizations and businesses with explosive growth of data. Artificial Intelligence (AI) and Machine Learning (ML) have become hot topics that attract organizations, industry, businesses, and academia at an extensive level. In this article, AI is defined briefly, along with its value and dimensions from the perspective of Industry 4.0. An exploratory qualitative study is conducted to understand the phenomenon of interest. Specifically, major challenges are discussed together with possible solutions. The explored AI aspects will be helpful for contributing to the success of AI projects.
- 14:20 Development of a mobile based birth and funeral event planning application in Bahrain
- In recent decades, the demand for event planning and management with specialty niches has increased enormously. Event planning is an important process for the successful execution of an event, as clients struggle to find suitable event managers and service providers for their events. Classification of events is a wide subject in which managerial expertise differs for different event types; thus, understanding the event type is crucial for efficient collection of clients' requirements, aiming for client satisfaction. This research aims at developing a mobile application, "Emotive Events", acting as an event planner that manages time and budget for private events, including funerals (considering multiple religions) and birth parties, and addresses many features, such as checklist creation, guest lists, religion-based services, online payments, and collaboration of service providers/services, that most existing applications lack in one place. The theoretical framework is based on a systematic literature review (SLR) and web content analysis (CA) techniques. For the mobile application development, an Agile system development life cycle (SDLC) methodology was employed, in which two questionnaires were conducted: one for collecting user and application requirements, and the other for the mobile application's evaluation. The research outcomes recognized the shortcomings of currently used event planning applications and informed the requirements of the "Emotive Events" mobile application, which achieved 86.23% acceptance in terms of usability.
- 13:00 Cat-Dog Face Detector Based on YOLOv5
- Object detection is a common research topic in many fields. In particular, objects that are close together are difficult to detect. Cats and dogs comprise many breeds, and these breeds resemble one another and, in some cases, breeds of the other class. It is therefore difficult to distinguish the faces of cats and dogs, especially for some breeds. The study uses the YOLO family of algorithms, which offer very high sensitivity and speed in numerous object detection challenges. The Oxford pets dataset, consisting of approximately 3600 images covering 37 different cat/dog classes, is utilized for training and testing. We propose a method based on YOLOv5 to find cats and dogs, evaluating the YOLOv5 algorithm with different parameters. Four different models are compared and evaluated. Experiments demonstrate that the YOLOv5 models achieve successful results for this task: the mAP of YOLOv5l is 94.1, demonstrating the efficacy of YOLOv5-based cat/dog detection.
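Detection quality in such studies is reported as mAP, which is built on the Intersection-over-Union (IoU) between predicted and ground-truth boxes. The paper does not list its evaluation code; the following is a minimal, self-contained sketch of the standard IoU computation:

```python
def iou(box_a, box_b):
    """Intersection-over-Union of two axis-aligned boxes given as (x1, y1, x2, y2)."""
    # Corners of the intersection rectangle.
    x1 = max(box_a[0], box_b[0])
    y1 = max(box_a[1], box_b[1])
    x2 = min(box_a[2], box_b[2])
    y2 = min(box_a[3], box_b[3])
    inter = max(0, x2 - x1) * max(0, y2 - y1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    return inter / (area_a + area_b - inter)

# A predicted face box overlapping a ground-truth box:
print(iou((0, 0, 10, 10), (5, 5, 15, 15)))  # 25 / 175 ≈ 0.1429
```

At a chosen IoU threshold (commonly 0.5), each prediction is counted as a true or false positive, and mAP averages the resulting precision over recall levels and classes.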
- 13:20 Traffic sign recognition and distance estimation with YOLOv3 model
- Due to the rapid increase in the number of vehicles, road casualties are rising even on highly sophisticated roadways, reflecting the natural limitations of humans in following traffic rules. To avoid lethal circumstances, assistive driving vehicles have been introduced, containing systems that guide drivers in different traffic situations. Traffic sign recognition systems play a crucial role in such vehicles. Earlier systems were based on hand-crafted sign characteristics and two-stage detectors; owing to accuracy and real-time constraints, systems built on these bases are not used in real-time applications. In this paper, we present a system that can recognize traffic signs and estimate their distance from the vehicle under non-ideal lighting as well as varying climatic conditions. Our work implements YOLOv3 (a deep convolutional network based on an end-to-end detection algorithm) for traffic sign recognition and segmentation. The model is trained on the GTSRB dataset and achieves an accuracy of about 98.5% for the recognition task in different real-time scenarios. Furthermore, an efficient heuristic-based approach estimates the distance between each traffic sign and a monocular camera (placed in the vehicle) at every instant.
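Heuristic distance estimation from a monocular camera is typically based on the pinhole model: a sign of known physical height H that appears h pixels tall at focal length f (in pixels) is roughly f·H/h metres away. The paper does not publish its exact heuristic, so the sketch below uses hypothetical calibration values:

```python
def sign_distance_m(focal_px, real_height_m, bbox_height_px):
    """Pinhole-camera estimate: distance = f * H / h."""
    return focal_px * real_height_m / bbox_height_px

# Hypothetical values: 700 px focal length, 0.6 m sign,
# detection bounding box 42 px tall.
print(round(sign_distance_m(700, 0.6, 42), 2))  # 10.0
```

In practice f comes from camera calibration, H from the traffic-sign standard, and h from the detector's bounding box, so the estimate degrades with box jitter at long range.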
- 13:40 Twin Support Vector Machines for Thalassemia Classification
- Thalassemia is an incurable blood disorder inherited from parents with a family history of the disease. It causes abnormalities in blood cells, specifically in protein composition such as hemoglobin, and has spread across the Mediterranean and through Indonesia due to the migration of people. Early detection to diagnose thalassemia is necessary to prevent the disease from being passed to the next generation. This study analyzes the impact of machine learning on medical diagnosis and on disease detection methods based on clinical history. Several previous studies have applied machine learning classification to early screening for thalassemia and showed strong performance, with accuracy beyond 90%. The data used here are laboratory blood-test results obtained from Harapan Kita Children and Women's Hospital, Jakarta, Indonesia. Twin Support Vector Machines (TSVM), a technique inspired by Support Vector Machines (SVM) that finds two non-parallel hyperplanes to solve binary classification problems, is used in this study with three commonly used kernels from previous studies: linear, polynomial, and radial basis function (RBF). The results show that RBF TSVM gave the best results, with 99.32%, 99.75%, and 99.24% average accuracy, precision, and F1 score, respectively, while polynomial TSVM, despite otherwise lower results, achieved a 99.79% average recall. TSVM is recommended for future studies to facilitate medical diagnosis based on the clinical history of other diseases.
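The three kernels compared in the study have standard closed forms. The sketch below shows them in plain Python; the gamma, degree, and offset values are illustrative, not the paper's tuned parameters, and the TSVM optimization itself (fitting two non-parallel hyperplanes via two smaller QP problems) is not reproduced here:

```python
import math

def linear(x, z):
    """Linear kernel: plain dot product."""
    return sum(a * b for a, b in zip(x, z))

def polynomial(x, z, degree=3, c=1.0):
    """Polynomial kernel: (x.z + c)^degree."""
    return (linear(x, z) + c) ** degree

def rbf(x, z, gamma=0.5):
    """RBF kernel: exp(-gamma * ||x - z||^2); equals 1 when x == z."""
    sq_dist = sum((a - b) ** 2 for a, b in zip(x, z))
    return math.exp(-gamma * sq_dist)

print(linear([1, 2], [3, 4]), rbf([1.0, 2.0], [1.0, 2.0]))  # 11 1.0
```

The RBF kernel's locality (similarity decays with distance) is a common reason it outperforms the other two on clinical feature vectors, consistent with the reported results.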
- 14:00 Segmentation of Benign and Malign lesions on skin images using UNet
- Skin cancer is among the cancers that require early diagnosis, and melanoma is a deadly type of skin cancer. Computer-aided systems can detect findings in medical examinations that human perception cannot recognize, and these findings can help clinicians make an early diagnosis; hence the need for such systems has increased. In this study, a deep learning-based method that segments melanoma in color images taken from dermoscopy devices is proposed. The method uses the ISIC 2017 (International Skin Imaging Collaboration) database, which contains 1403 training and 597 test images, and is based on preprocessing followed by the U-Net architecture. Gaussian and Difference of Gaussians (DoG) filters are used in the preprocessing stage, aiming to make the skin images more suitable before U-Net. The segmentation performed with the original data reached a training success rate of 95-96%, and a high similarity coefficient was obtained; training on the preprocessed data reached an accuracy of 85-86%.
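A Difference of Gaussians filter is simply the difference of two normalized Gaussian kernels, so it sums to zero and responds only where intensity changes, which helps emphasize lesion borders. A minimal 1D sketch (the paper applies 2D filters to dermoscopy images; the sigma values here are illustrative):

```python
import math

def gaussian_kernel(sigma, radius):
    """Normalized 1D Gaussian kernel of length 2*radius + 1."""
    k = [math.exp(-(i * i) / (2 * sigma * sigma)) for i in range(-radius, radius + 1)]
    s = sum(k)
    return [v / s for v in k]

def dog_kernel(sigma1, sigma2, radius):
    """Difference of two normalized Gaussians; its weights sum to zero."""
    g1 = gaussian_kernel(sigma1, radius)
    g2 = gaussian_kernel(sigma2, radius)
    return [a - b for a, b in zip(g1, g2)]

def convolve(signal, kernel):
    """1D convolution with edge clamping at the borders."""
    r = len(kernel) // 2
    out = []
    for i in range(len(signal)):
        acc = 0.0
        for j, w in enumerate(kernel):
            idx = min(max(i + j - r, 0), len(signal) - 1)
            acc += w * signal[idx]
        out.append(acc)
    return out

# Flat skin region -> zero response; an intensity edge -> strong response.
step = [0.0] * 8 + [1.0] * 8
response = convolve(step, dog_kernel(1.0, 2.0, 4))
```

Because the kernel sums to zero, uniform regions are suppressed entirely, leaving the network a cleaner boundary signal.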
- 14:20 Automated Mapping of Environmental Higher Education Ranking Systems Indicators to SDGs Indicators using Natural Language Processing and Document Similarity
- To evaluate environmental higher education ranking systems (ESHERSs) and determine their efficiency in measuring environmental sustainability, we treat the problem as a classification task. This study benchmarks three ESHERSs: UI GreenMetric, the Times Higher Education Impact Rankings, and STARS (Sustainability Tracking, Assessment & Rating System) by AASHE (the Association for the Advancement of Sustainability in Higher Education). First, we recruited a group of experts who mapped the ESHERS indicators to the SDG indicators. Then, we used NLP techniques to classify (map) the ESHERS indicators to the SDG indicators. Since most ESHERS and SDG indicators are short texts, we applied the query expansion technique to make the NLP techniques more effective: each ESHERS or SDG indicator together with its expanded text represents a document, with the expanded text taken from the indicator descriptions, forming the corpus for our study. We then used document similarity to compute the similarity between every pair of corpus documents, applying several different similarity measures, and used a voting system to map the ESHERS indicators to the SDG indicators. The proposed system automatically mapped the underlying ranking systems' indicators to the UN SDGs with 99% accuracy compared to the expert mapping.
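The core mapping step, similarity between expanded indicator texts followed by an argmax over SDG indicators, can be sketched with plain term-frequency vectors and cosine similarity (the SDG texts below are hypothetical stand-ins for the real expanded descriptions):

```python
import math
from collections import Counter

def cosine(doc_a, doc_b):
    """Cosine similarity of two texts as bag-of-words term-frequency vectors."""
    a, b = Counter(doc_a.lower().split()), Counter(doc_b.lower().split())
    dot = sum(a[t] * b[t] for t in a)
    norm = (math.sqrt(sum(v * v for v in a.values()))
            * math.sqrt(sum(v * v for v in b.values())))
    return dot / norm if norm else 0.0

def map_indicator(indicator_text, sdg_indicators):
    """Map an expanded ESHERS indicator to the most similar SDG indicator."""
    return max(sdg_indicators, key=lambda s: cosine(indicator_text, sdg_indicators[s]))

sdgs = {  # hypothetical expanded SDG indicator texts
    "SDG 6.4": "water use efficiency and freshwater withdrawal",
    "SDG 7.2": "renewable energy share in total energy consumption",
}
print(map_indicator("campus renewable energy consumption share", sdgs))  # SDG 7.2
```

The paper's voting stage then repeats this mapping under several similarity measures and keeps the SDG indicator chosen most often.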
Wednesday, September 29 14:40 - 16:00 (Asia/Bahrain)
Wednesday, September 29 16:00 - 17:20 (Asia/Bahrain)
- 16:00 Autoregressive and neural network models: a comparative study with linearly lagged series
- Time series analysis, such as stock price forecasting, is an important part of financial research. In this regard, autoregressive (AR) and neural network (NN) models offer contrasting approaches to time series modeling. Although AR models remain widely used, NN models and their variant, long short-term memory (LSTM) networks, have grown in popularity. In this paper, we compare the performance of AR, NN, and LSTM models in forecasting linearly lagged time series. To test the models, we carry out extensive numerical experiments based on simulated data. The results of the experiments reveal that despite the inherent advantage of AR models in modeling linearly lagged data, NN models perform just as well, if not better, than AR models. Furthermore, the NN models outperform LSTMs on the same data. We find that a simple multi-layer perceptron can achieve highly accurate out-of-sample forecasts. The study shows that NN models perform well even in the case of linearly lagged time series.
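For linearly lagged data, the AR baseline can be fitted in closed form. The sketch below simulates an AR(2) series and recovers its lag coefficients via the 2x2 normal equations; the coefficients, noise level, and sample size are illustrative, not the paper's experimental setup:

```python
import random

random.seed(0)
a1, a2 = 0.6, 0.3                      # true lag coefficients (stationary choice)
y = [0.0, 0.0]
for _ in range(5000):
    y.append(a1 * y[-1] + a2 * y[-2] + random.gauss(0, 1))

# Least-squares AR(2) fit: solve the 2x2 normal equations directly.
rows = [(y[t - 1], y[t - 2], y[t]) for t in range(2, len(y))]
s11 = sum(x1 * x1 for x1, x2, t in rows)
s12 = sum(x1 * x2 for x1, x2, t in rows)
s22 = sum(x2 * x2 for x1, x2, t in rows)
b1 = sum(x1 * t for x1, x2, t in rows)
b2 = sum(x2 * t for x1, x2, t in rows)
det = s11 * s22 - s12 * s12
a1_hat = (b1 * s22 - b2 * s12) / det
a2_hat = (b2 * s11 - b1 * s12) / det
print(round(a1_hat, 2), round(a2_hat, 2))  # close to 0.6 and 0.3
```

An MLP trained on the same (y[t-1], y[t-2]) → y[t] pairs must learn this linear map implicitly, which is the comparison the paper carries out.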
- 16:20 Autocorrelation for time series with linear trend
- The autocorrelation function (ACF) is a fundamental concept in time series analysis, including financial forecasting. In this note, we investigate the properties of the sample ACF for a time series with a linear trend. In particular, we show that the sample ACF of such a series approaches 1 for all lags as the number of time steps increases. The theoretical results are supported by numerical experiments. Our result helps researchers better understand ACF patterns and make correct ARMA model selections.
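The claim can be checked directly from the sample ACF definition, r_k = Σ(y_t − ȳ)(y_{t+k} − ȳ) / Σ(y_t − ȳ)²: for the pure trend y_t = t, r_k climbs toward 1 as the series lengthens. A minimal numerical check:

```python
def sample_acf(y, k):
    """Sample autocorrelation at lag k."""
    n = len(y)
    mean = sum(y) / n
    num = sum((y[t] - mean) * (y[t + k] - mean) for t in range(n - k))
    den = sum((v - mean) ** 2 for v in y)
    return num / den

for n in (50, 500, 5000):
    trend = list(range(n))               # y_t = t, a pure linear trend
    print(n, round(sample_acf(trend, 5), 4))
```

The printed lag-5 autocorrelation rises with n, approaching 1, which is why a slowly decaying, near-unit ACF is the standard signature of a trending (non-stationary) series.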
- 16:40 Car Accident Severity Classification Using Machine Learning
- Car accidents have always been a terrible and extremely dangerous phenomenon, causing the loss of many lives. Delays in needed medical treatment for injuries at accident locations put lives at risk. In this work, machine learning was used to predict the severity of accidents that occurred in the United Kingdom between 2005 and 2014. Combining this AI solution with systems that report to the relevant authorities when accidents occur will preserve more lives, as the medical support dispatched to an accident location can depend on the severity of the accident. Several machine learning models were used, including Support Vector Machine (SVM), Artificial Neural Network (ANN), and Random Forest (RF). The best accuracy, 83.9%, was achieved by the RF model.
- 17:00 Defeating the Credit Card Scams Through Machine Learning Algorithms
- Credit card fraud is a major problem that is not going away; it is growing, and it surged during the COVID-19 pandemic as more transactions became cashless. Credit card frauds are very difficult to distinguish because the characteristics of legitimate and fraudulent transactions are very similar. The performance of various machine learning (ML)-based credit card fraud recognition schemes is significantly affected by the datasets processed, including the variables collected and the ML mechanism used. One possible way to counter this problem is to apply ML algorithms such as Support Vector Machine (SVM), K-nearest neighbor (KNN), Naive Bayes, and logistic regression. This research work compares the performance of the aforementioned ML models and their impact on credit card scam detection, especially in situations with imbalanced datasets.
- 16:00 A Heuristic and Decentralized Approach to Intelligent Traffic Management for Low-Density Road Junctions
- Road intersections are good locations for implementing a smart and adaptive traffic management system that responds to the current traffic situation. To allow these junctions to regulate vehicular flow in a coordinated manner, accurate and up-to-date data exchange between infrastructure and intelligent vehicles must be considered. To address the traffic management problem of an urban section, this study proposes a traffic management scheme for two-way, four-road-segment, low-density intersections (TMS-2W4R) that enables real-time information exchange between intelligent vehicles and a roadside unit (RSU) to facilitate an adaptive and smooth transition of vehicles through an intersection. TMS-2W4R receives vehicular direction demands and satisfies them by utilizing an index coding-based transmission scheme. Our simulation results show that applying TMS-2W4R at every RSU can improve road map data dissemination, along with its direction requests, by 71% in terms of transmitted data size. Through our analysis, we have also shown the adaptability and flexibility of TMS-2W4R with respect to the current traffic scenario at the junction.
- 16:20 The Effect of Redefining the Role of the Public Sector on Stakeholder Satisfaction: BENAYAT as a case study
- The BENAYAT system streamlines the delivery of building permit services by redefining the government of Bahrain's role from service provider to regulator. Through a re-engineered, automated, and transparent system, the redefinition process applied through a transformational policy provides maximum tangible benefits. This study aims to determine how satisfied related stakeholders are with the transformational policy's operational effectiveness and the BENAYAT system's technological application. This research employed an online survey to gather the opinions of public sector engineers, private consultant firms, real-estate developers, and investors in Bahrain. The study found no correlation between knowledge of the transformational policy and the operational effectiveness of the same policy or its expected benefits. However, there was a significant relationship between the technology used to implement the transformational policy and the expected benefits and operational efficiency of the building permit process. Additionally, the expected benefits of the transformational policy had a significant correlation with the operational effectiveness, and so did the operational effectiveness with the overall satisfaction of the stakeholders. Furthermore, this study suggests that the BENAYAT system be better promoted in order to raise awareness and train stakeholders to use it professionally.
- 17:00 Fuzzy Logic based Recommendation System: Crafts to Clients Suggestion
- With the population explosion and the increasing number of different types of buildings, the need to quickly find proper craftsmen for new installations and/or repairs has also increased. People usually seek close, reasonably priced, and professional craftsmen, but searching for them is not easy given growing populations and the lack of related information. In this work, we propose a fuzzy logic-based recommendation system embedded within a web-based database application. The system enables clients (customers) to search for craftsmen manually and can also suggest craftsmen to clients according to their professionalism and proximity, ranked in descending order. Since manually searching for artisans is not an easy task, the recommendation system suggests the most suitable artisans to clients according to their needs. The experimental tests at the end of the work compare the time needed to contact craftsmen using the traditional approach with the proposed recommendation system.
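A fuzzy ranking of this kind can be sketched with triangular membership functions for closeness and professionalism, combined with the minimum operator as the fuzzy AND. The membership breakpoints and craftsman records below are hypothetical, for illustration only:

```python
def tri(x, a, b, c):
    """Triangular membership function peaking at b, zero outside (a, c)."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def score(craftsman):
    close = tri(craftsman["km"], -1, 0, 20)       # nearer -> higher membership
    skilled = tri(craftsman["rating"], 0, 5, 6)   # higher rating -> higher membership
    return min(close, skilled)                    # fuzzy AND (minimum)

craftsmen = [  # hypothetical records
    {"name": "A", "km": 2, "rating": 4.5},
    {"name": "B", "km": 15, "rating": 5.0},
    {"name": "C", "km": 25, "rating": 3.0},
]
ranked = sorted(craftsmen, key=score, reverse=True)
print([c["name"] for c in ranked])  # ['A', 'B', 'C']
```

The min-aggregation means a craftsman is only recommended strongly when both criteria are satisfied; a sum or product aggregation would trade them off instead.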
- 16:00 On The Authentication Method Based On Visual Cryptography
- In this work, we build, implement, and study a new authentication approach based on visual cryptography. In the authentication system, the user combines the image stored in the smartphone's memory (the first shadow) with the image stored in the memory of a stationary computer (the second shadow), reads off the password, and enters it in the input field. It is well known that it is theoretically impossible to obtain information about the secret from a single shadow; this underpins the scheme's resistance to the loss or theft of one shadow. Experiments were carried out to study the performance and practical applicability of the scheme. Based on the results, we determine a set of hard-to-recognize characters and, consequently, a set of admissible (easily recognizable) characters. The experiments also identified the smartphone characteristics that affect successful recognition of the secret image. We outline ways to further improve the system to reduce the probability of errors.
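The two-shadow principle can be illustrated with an XOR-based (2,2) sharing scheme: one share is uniformly random, so alone it reveals nothing about the secret, while combining both shares restores it exactly. This is a simplified binary analogue for illustration, not the paper's actual visual cryptography construction:

```python
import random

def make_shares(secret_bits, rng=random.Random(42)):
    """(2,2) XOR sharing: share1 is uniform random, share2 = secret XOR share1."""
    share1 = [rng.randint(0, 1) for _ in secret_bits]
    share2 = [s ^ r for s, r in zip(secret_bits, share1)]
    return share1, share2

def combine(share1, share2):
    """Stacking the two shadows recovers the secret bit-by-bit."""
    return [a ^ b for a, b in zip(share1, share2)]

secret = [1, 0, 1, 1, 0, 0, 1, 0]   # one row of a binary password image
s1, s2 = make_shares(secret)
print(combine(s1, s2) == secret)  # True
```

In classical visual cryptography the combination is done optically by stacking transparencies (an OR-like operation on subpixel patterns) rather than by XOR, but the security argument is the same: each shadow alone is statistically independent of the secret.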
- 16:20 AI-Based Anomaly and Data Posing Classification in Mobile Crowd Sensing
- Mobile Crowd Sensing (MCS) has become a popular paradigm for collecting sensed data. MCS is vulnerable to many types of threats and faces many challenges; trustworthiness is one of the main ones, as attackers aim to inject faulty data to corrupt the system or waste its resources. To obtain trusted sensed data, MCS organizers must ensure that no malicious users are contributing. Faulty sensor readings in MCS can be due to sensor failure or malicious behavior: attackers seek to degrade system performance and reduce workers' reputations by injecting false data. This paper evaluates different machine learning algorithms for classifying received sensed data as true, from a faulty sensor, or from attacker behavior. These algorithms are Decision Tree (DT), Support Vector Machine (SVM), and Random Forest (RF). Results were compared based on accuracy, precision, recall, F1 score, and the confusion matrix. Among all classifiers, RF shows the highest accuracy, 97.9%, in classifying the received sensed data and detecting the data poisoning attack.
- 16:40 A new approach to detect next generation of malware based on machine learning
- Nowadays, malware attacks target different kinds of devices, such as IoT devices, mobiles, servers, and even the cloud. They cause hardware damage and financial losses, especially for big companies, and represent a serious issue for cybersecurity specialists. In this paper, we propose a new approach to detecting unknown malware families based on machine learning classification and a visualization technique. Each malware binary is converted to a grayscale image, and a GIST descriptor of each image is used as input to the machine learning model. For the malware classification part, we use three machine learning algorithms; these classifiers are efficient, with the highest precision reaching 98%. After training, testing, and evaluating the models, we simulate two new malware families. We do not expect good predictions, since the models have not seen these families; our goal is rather to analyze the behavior of our classifiers in the case of a new family. Finally, we propose a filter-based approach to decide whether a classification is normal or the sample is zero-day malware.
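The first step, turning a malware binary into a grayscale image, amounts to reading the file as one byte per pixel with a fixed row width. A minimal sketch (the row width and sample bytes are illustrative; the GIST descriptor stage is not reproduced here):

```python
def bytes_to_grayscale(data, width=16):
    """Interpret a binary as one byte (0-255) per pixel, fixed row width,
    zero-padding the final row."""
    pad = (-len(data)) % width
    padded = data + bytes(pad)
    return [list(padded[i:i + width]) for i in range(0, len(padded), width)]

# 36 hypothetical bytes of a PE file header, repeated:
img = bytes_to_grayscale(b"MZ\x90\x00" * 9, width=16)
print(len(img), len(img[0]))  # 3 16
```

Because code sections, packed data, and padding produce visually distinct textures, family-specific patterns survive this encoding, which is what the texture descriptor then summarizes for the classifier.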
- 17:00 VPN Remote Access OSPF-based VPN Security Vulnerabilities and Counter Measurements
- During the COVID-19 pandemic, the number of clients using Virtual Private Networks (VPNs) dramatically increased. Consequently, VPN vulnerabilities have become target points to be exploited by attackers. Studies have been released to defend against such attacks with the purpose of securing VPNs; nevertheless, highly sophisticated attacks still target VPNs to compromise the critical data being communicated. VPN servers use protocols to secure connections with clients, but these protocols are still targeted, specifically with Denial-of-Service (DoS) attacks. This paper analyzes and treats the vulnerability of the key negotiation process in both the main mode and the aggressive mode of the Internet Key Exchange (IKE) protocol in IP Security (IPsec) VPNs. We demonstrate experiments of a DoS attack based on Open Shortest Path First (OSPF) adjacent route spoofing. Thereafter, we propose a method to tackle those attacks by deploying Suricata as an Intrusion Detection System (IDS) to defend the VPN against DoS attacks.
- 16:00 Reviewing Dynamic Feature Location Techniques: Basic Elements and Challenges
- Dynamic Feature Location Techniques (DFLTs) seek to relate software functionalities, known as software features, to the relevant source code based on execution traces. These techniques help developers comprehend a software system in order to perform activities such as software maintenance and code refactoring. For instance, locating the source code that corresponds to a software feature by analysing the execution trace supports developers in comprehending the application at runtime, and monitoring the system's actual behaviour increases the accuracy of locating features in the source code as well as the traceability between them. This work presents a systematic literature review of several works on the DFLT topic. An illustration shows their common structural approach and idealizes the adopted solution on an industrial software application. Moreover, we review their characteristics, advantages, and disadvantages, and offer a possible framework that researchers and developers could use to choose the correct DFLT for a given problem. Our survey reveals that no single existing DFLT is designed to solve all problems for all software applications. Therefore, we examine possible directions for standardizing DFLT architecture to improve the accuracy of mapping software functionalities to the related software artifacts based on execution traces.
- 16:20 Fault Diagnosis of Open Switch Failure in Voltage Source Inverter Using Average and RMS Phase Voltages
- For a voltage source inverter (VSI) in a grid-connected renewable energy system, several failure modes can occur. In this paper, a new technique is proposed and implemented to detect and localize open switch faults in a three-phase VSI. The proposed technique utilizes a series of measurements of both the average and RMS phase voltages at the output terminals of the inverter. Using MATLAB/Simulink, the proposed fault diagnosis system is designed and tested for each type of open switch fault, in an attempt to detect the presence of both single and double open switch faults. The study covers a total of 16 fault cases, including the normal operation condition. The method is tested for robustness and reliability against load variation, transient conditions, and noisy signals added directly to the phase voltages under test. Advantages and limitations are discussed according to several important criteria, such as effectiveness, detection time, implementation effort, tuning effort, and the diagnostic variables required. Finally, after identifying the faulty phase(s), remedial measures are employed, including a rapid and safe turn-off of the power switches and a fail-safe reconfiguration operation to avoid more dangerous failures.
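The two diagnostic variables are cheap to compute: over one period, a healthy phase voltage has near-zero average and an RMS of A/√2, while a missing half-wave from an open switch shifts the average away from zero. A simplified numerical illustration, with an ideal sine standing in for the measured inverter output:

```python
import math

def avg_and_rms(samples):
    """Average and RMS of one period of sampled phase voltage."""
    n = len(samples)
    avg = sum(samples) / n
    rms = math.sqrt(sum(v * v for v in samples) / n)
    return avg, rms

N, A = 1000, 1.0
healthy = [A * math.sin(2 * math.pi * k / N) for k in range(N)]
faulty = [max(v, 0.0) for v in healthy]   # open switch blocks the negative half-wave

print(avg_and_rms(healthy))  # average ~0, RMS ~0.707
print(avg_and_rms(faulty))   # non-zero average flags the faulty phase
```

Comparing each phase's average against zero localizes the faulty leg, while the drop in RMS distinguishes single from double open-switch cases.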
- 16:40 B2C Mapping Based On A Fuzzy Logic Recommender System
- Saving time and effort when searching for products and services is becoming an important issue in people's daily lives, supported by new technologies such as advanced search engines, targeted advertising, and recommendation systems. Restaurant services have lately improved through the use of such technologies, and customers search for the optimal services restaurants provide. This work enhances the process of searching for the best restaurants offering particular meals by implementing a fuzzy-based recommendation system. Clients can search for restaurants that satisfy their desires, and the search technique developed in this work helps customers obtain a list of the best-ranked restaurants matching their search criteria. The experimental tests conducted in the work show a promising reduction in both time and effort compared to ordinary searching for optimal meals.
- 17:00 Optimized PID-Sliding Mode Controller Based Predictor Design
- In this paper, a robust control scheme of a time-delayed process has been elaborated through a predictor structure. The control technique is constructed based on an optimized proportional-integral-derivative first-order sliding mode control (SMC) technique combined with Smith-Predictor. The optimization of the control parameters is addressed with genetic algorithms (GA). The stability of the closed-loop system is analysed in the sense of the Lyapunov theorem. The compensation results are investigated with various scenarios. The control performances are evaluated in terms of tracking precision, transient response, and disturbance rejection capability.
- 16:00 Visualizing Ruwah Related Data By Interactive Graph
- With continuous achievements in information technology and its applications in different fields, huge amounts of data are generated daily, which makes searching for specific data items a time- and effort-consuming process. Several techniques are implemented to seek information, such as search engines and information generation centers, and requesting data from historical warehouses is a common routine, since extracting knowledge from historical repositories is needed in several daily-life applications. The Arabic language has many historical repositories in the form of literary periodicals and books; Prophet Mohammad's (PBUH) sayings are one such important historical source for knowledge extraction. These sayings were collected and verified by a set of Muslim scholars, of whom Al-Bukhari was a famous one. This work visualizes the narrators of the Prophet's (PBUH) sayings as an interactive graph covering both narrator-related information and the sayings themselves. Moreover, a set of graph centrality measures is computed to quantify the importance of each narrator in the narration process. The conducted experimental tests demonstrate the value of the interactive graph versus manual searching of Ahadith.
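One common centrality measure for such graphs is degree centrality: a node's connection count divided by the maximum possible. A minimal sketch on a toy narration graph (the links below are hypothetical, for illustration only):

```python
def degree_centrality(edges):
    """Degree centrality: each node's share of its possible connections."""
    nodes = {n for e in edges for n in e}
    deg = {n: 0 for n in nodes}
    for a, b in edges:
        deg[a] += 1
        deg[b] += 1
    n = len(nodes)
    return {v: deg[v] / (n - 1) for v in nodes}

# Hypothetical narration links between narrators:
edges = [("Abu Hurairah", "Hammam"), ("Abu Hurairah", "Said"),
         ("Abu Hurairah", "A'raj"), ("Said", "Zuhri")]
c = degree_centrality(edges)
print(max(c, key=c.get))  # Abu Hurairah
```

Other measures mentioned in the interactive-graph literature (betweenness, closeness, eigenvector centrality) rank narrators by their position in narration chains rather than raw connection counts.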
- 16:20 Constructing Global Researchers Network Using Google Scholar Profiles for Collaborator Recommendation Systems
- Researchers like to collaborate with other researchers to share their knowledge, experience, and resources. By selecting appropriate collaborators, researchers can obtain accurate output, publish more high-quality papers, and gain higher recognition within their research community. However, selecting appropriate collaborators is a challenging task, so researchers have proposed collaborator recommendation systems (CRS) to address it. A network-based recommendation system is one type of CRS, and constructing an effective network is very important in such systems. Existing network construction approaches use co-author lists or publications to generate the network; however, this is unsuitable for some researchers, such as junior researchers and undergraduate students, who lack co-authors and publications. In this paper, we propose a Google Scholar (GS)-based approach to constructing a researchers network. Common co-authors, similarity of areas of interest, citation rate, and the number of co-authored publications between two researchers are extracted from GS to construct the network. Empirical work with our prototype system shows the efficacy of the presented technique.
- 16:40 A New Approach for Labelling XML Data
- Extensible Markup Language (XML) has become a key technology for transferring data over the internet, and updating and retrieving large amounts of XML data is a very active research field. XML labelling schemes play an important role in handling XML data efficiently and robustly, and many have been proposed; nevertheless, existing schemes have limitations. Therefore, this paper develops a new method for labelling XML documents. The approach clusters XML data, dividing the nodes of an XML document into groups and labelling them accordingly. Two existing labelling schemes, the level-based labelling scheme (LLS) and the Dewey labelling scheme, were chosen to label the clusters and their nodes. The data model and mechanism of the proposed scheme have been developed, and the proposed scheme, together with the two labelling schemes used to build it, has been implemented.
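Under the Dewey scheme, each node's label extends its parent's label with the child's position, so a label encodes the full ancestor path and makes ancestor/descendant tests a prefix check. A minimal sketch using Python's standard XML parser (the sample document is illustrative; the paper's clustering and LLS stages are not reproduced):

```python
import xml.etree.ElementTree as ET

def dewey_labels(element, prefix="1"):
    """Assign Dewey labels: a child's label extends its parent's by its position."""
    labels = {prefix: element.tag}
    for i, child in enumerate(element, start=1):
        labels.update(dewey_labels(child, f"{prefix}.{i}"))
    return labels

doc = ET.fromstring(
    "<library><book><title/><author/></book><book><title/></book></library>")
for label, tag in sorted(dewey_labels(doc).items()):
    print(label, tag)   # e.g. 1.1.2 is the author inside the first book
```

The known drawback motivating newer schemes is relabelling cost: inserting a node shifts the positional components of every following sibling's subtree, which clustering-based approaches aim to localize.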
- 17:00 Development of a Bahraini job seeking web based portal for UOB-IS graduates
- The discipline of information systems (IS) at the University of Bahrain (UOB) is characterized by its dynamism and diversity in employing diverse, up-to-date information technologies to achieve the strategic goals and dynamic operational objectives of organizations. However, UOB-IS graduates and IS employers alike face difficulties in job seeking and provision: seekers hardly find the job offerings that best match their degree qualifications and skills, while employers hardly find the best candidates for the required job positions. This project develops a Bahraini web-based job offering portal (J4IS) that acts as an automated recruiter for job seekers and employers, providing an efficient and professional medium of communication and a time- and cost-reduction tool for seeking and providing suitable jobs for both sides. The theoretical framework is based on a systematic literature review (SLR) and web content analysis (CA) techniques. For the portal's development, a six-phase system development life cycle (SDLC) methodology was employed with three questionnaires: two for collecting user and system requirements and a third for evaluating the web portal. The research outcomes characterized the Bahraini workforce and the qualifications and skills of UOB-IS graduates, identified the shortcomings of commercial online job portals and their causes, and subsequently informed the user (seeker and employer) requirements of the developed J4IS, which achieved 94% usability acceptance.
Wednesday, September 29 17:20 - 18:00 (Asia/Bahrain)
Wednesday, September 29 18:00 - 18:30 (Asia/Bahrain)
Internet of Things (IoT) deployments offer a much higher value proposition if these can function in the context of smart buildings. Such advanced information and communication technology (ICT) applications in commercial buildings, schools, libraries, shopping centers, etc. offer low cost but highly effective monitoring and control opportunities. Sensors deployed in key locations can monitor the building environment in real-time, collect information for intelligent decision making, and facilitate various services. An IoT sensor platform has been developed that provides a unified communication platform which can integrate information from disparate sources and provide one control hierarchy. It is a powerful, low-cost, open-architecture software platform that can monitor and control major electrical loads (e.g., HVAC, lighting and plug loads), as well as solar PV systems, energy storage units and other IoT sensors in commercial buildings. The platform can provide new or legacy buildings with a building automation system (BAS) or connect with existing BAS systems in large and small commercial buildings. This platform leverages machine learning algorithms to draw insights from a deployed building's historical operating data and occupant preferences to save energy (kWh) while increasing occupant comfort. This also allows buildings to reduce peak demand (kW) through direct communication with utilities using demand response protocols such as openADR.
Thursday, September 30
Thursday, September 30 10:00 - 11:20 (Asia/Bahrain)
- 10:00 Index Coding-based Data Exchange in a Vehicular Network Junction
- Traffic lights play a crucial role in maintaining the smooth transition of vehicles from one road segment to another. They have evolved from time-based controllers to adaptive, density-based traffic regulators, and can now even provide more advanced capabilities such as data processing, storage, and computation. In this work, we investigate data dissemination at an intersection with the traffic light acting as an intelligent roadside unit (RSU) that can receive and broadcast demanded environment data from/to its nearby vehicles. Information exchange is implemented using an index coding-based transmission scheme that optimizes the number of transmissions and the transmitted data size while satisfying vehicular demands depending on data availability. Assuming various mobility scenarios and incorporating the presence or absence of stored information, two transmission schemes are compared. We evaluate single-junction data exchange between vehicles and infrastructure in terms of data map throughput, message security, number of broadcast transmissions, and amount of consumed wireless bandwidth. Our simulation results highlight the advantage of the index coding-based transmission scheme over the conventional broadcasting technique across the performance metrics used in the study.
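The gain from index coding comes from combining packets across vehicles' side information: when each of two vehicles already holds what the other wants, one XOR-coded broadcast replaces two plain transmissions. A minimal sketch of this classic two-receiver case (packet contents are hypothetical):

```python
def xor(a, b):
    """Byte-wise XOR of two equal-length packets."""
    return bytes(x ^ y for x, y in zip(a, b))

# Vehicle 1 already holds packet B and wants A;
# vehicle 2 already holds packet A and wants B.
packet_a = b"map-east"
packet_b = b"map-west"

coded = xor(packet_a, packet_b)           # the RSU broadcasts ONE coded packet

print(xor(coded, packet_b) == packet_a)   # vehicle 1 decodes A -> True
print(xor(coded, packet_a) == packet_b)   # vehicle 2 decodes B -> True
```

Generalizing to many vehicles, the RSU picks which packets to combine based on each receiver's cached side information, which is what drives the reduction in broadcast count and transmitted data size reported in the paper.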
- 10:20 1-trit Ternary Multiplier and Adder Designs Using Ternary Multiplexers and Unary Operators
- This work proposes designs for a 1-trit TMUL (Ternary Multiplier) and THA (Ternary Half-Adder) using TMUXs (Ternary Multiplexers) and unary operators. The goal of the proposed designs is to minimize energy consumption in nanoscale embedded circuits to improve their battery usage. To achieve this, several techniques are used: 32-nm CNTFET transistors, Multiple-Valued Logic (MVL), two voltage supplies (Vdd, Vdd/2), TMUXs, and unary operators to reduce the number of transistors and the PDP (Power Delay Product). Extensive HSPICE simulations under different Process, Voltage, Temperature (PVT) conditions and noise effects are performed. The obtained results show improvements in PDP, robustness to process variations, and noise tolerance with respect to recent similar designs.
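At the behavioural level, the 1-trit half-adder and multiplier reduce to simple base-3 arithmetic on unbalanced trits {0, 1, 2}; a logic-only sketch (not the CNTFET transistor-level design):

```python
def trit_half_adder(a, b):
    """1-trit half adder: returns (sum trit, carry trit) in unbalanced ternary."""
    s = a + b
    return s % 3, s // 3

def trit_multiplier(a, b):
    """1-trit multiplier: returns (product trit, carry trit)."""
    p = a * b
    return p % 3, p // 3

# e.g. 2 * 2 = 4 = 1*3 + 1  ->  product trit 1, carry trit 1
assert trit_multiplier(2, 2) == (1, 1)
# e.g. 2 + 2 = 4            ->  sum trit 1, carry trit 1
assert trit_half_adder(2, 2) == (1, 1)
```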
- 10:40 Future Micro Hydro Power: Generation of Hydroelectricity and IoT based Monitoring System
- This paper focuses on Future Micro Hydro Power: the generation of hydroelectricity and its monitoring system. The world is moving towards technological advancement day by day, so the need for energy will surge further in the coming days. However, adequate electricity supply has still not been ensured in poor and developing countries, and it has become essential in the Industry 4.0 era. The proposed 'Future Micro Hydro Power' device generates energy by exploiting the small water sources (e.g., washroom, kitchen) in multi-storied buildings. A massive amount of water is used in a house every day, and water taps are used not only in homes but in all modern buildings. We demonstrate how hydropower can be generated from these tiny water sources and how this power can run a house. The user is able to monitor the amount of energy produced and use it if desired. The cost of the devices is much lower, and their performance much higher, than alternatives. Data collected after an experimental installation demonstrates the system's outstanding efficiency.
- 11:00 A Fuzzy-based Clustering and Data Collection for Internet of Things based Wireless Sensor Networks
- The Internet of Things (IoT) has become an integral part of our daily lives, made possible by billions of connected devices. Small and cheap sensors, together with the ubiquity of wireless networks, form IoT deployments of every kind and size. The huge number of sensors that can exchange various data among themselves makes them a basic building block of IoT, but their major issue is resource constraints. A wireless sensor network (WSN) mainly uses a large set of sensors inside an IoT deployment to collect data independently and transmit it to the cloud via a coordinator/gateway. As sensor nodes are resource-constrained devices, energy conservation is the main objective in most such networks. This paper proposes an energy-efficient data collection mechanism using dynamic clustering routing. A soft computing technique (Fuzzy Inference System) is used for dynamic clustering with three important network parameters: residual energy, node density, and packets generated. A Wi-Fi-enabled mobile coordinator router (MCR) collects data from each cluster head (CH) directly. After gathering data from any CH, it instantly transmits the received data to the gateway using the IEEE 802.11 protocol for further processing and applications. Based on the requirement and application, the gateway sends these data to the IoT cloud. Extensive simulations of the proposed scheme show better performance on different parameters, mainly with reference to network lifetime, average data delivery, energy depletion, and average delay.
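The fuzzy cluster-head selection can be sketched with a toy single-rule inference: a node's chance of becoming CH grows with residual energy and node density and falls with packets generated. The membership functions and node values below are illustrative, not the paper's actual fuzzy inference system:

```python
def tri(x, a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def ch_chance(energy, density, packets):
    """Toy fuzzy score; inputs assumed normalised to [0, 1]."""
    high_e = tri(energy, 0.0, 1.0, 2.0)   # rises with residual energy
    high_d = tri(density, 0.0, 1.0, 2.0)  # rises with node density
    low_p = tri(packets, -1.0, 0.0, 1.0)  # falls with packets generated
    # Single aggregated rule: fuzzy AND (min) of the three memberships
    return min(high_e, high_d, low_p)

# (energy, density, packets) per candidate node -- invented values
nodes = {"n1": (0.9, 0.8, 0.2), "n2": (0.4, 0.9, 0.7)}
head = max(nodes, key=lambda n: ch_chance(*nodes[n]))
assert head == "n1"  # high energy, low traffic wins
```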
- 10:00 Virtual Dietitian: A Nutrition Knowledge-Based System Using Forward Chaining Algorithm
- The association between nutrition and health has been repeatedly established by nutrition science and evidence-based practice. Nevertheless, inadequate nutrition is still prevalent among Filipino households. As a response to this public health issue, a nutrition system called Virtual Dietitian (VD) was conceived. Through a mixed-methods approach, VD was beta tested via a user study and the System Usability Scale (SUS) by six information technology experts and six registered dietitians. Participants performed the standardized tasks with a mean completion rate of 85% in a mean time of 106.2 seconds, and graded the SUS with a mean score of 83.4 (excellent). Although the prototype successfully exhibited the potential of VD as a nutrition system, qualitative feedback from experts revealed some modifications that need to be made before proceeding to the next phase of the study. Healthcare professionals delivered feedback on the correctness of processes and meal plan generation, while the information technology experts pointed out the deficiencies of VD from a technical perspective (e.g., web standards, layout and design, functionality, navigation, usability). With this beta evaluation, an overview of the true experience gained by end users while using VD was obtained without undergoing the whole project lifecycle. The experts' feedback, which will be used in the next phase, is beneficial for ensuring that the final version of VD will be correct, useful, and valid.
- 10:20 Implementation of Web-based Interactive Learning Platform for User Interface Design in Android Programming Learning Assistance System
- With the popularity of Android smartphone devices, the demand for Android application programmers has been increasing rapidly. Due to this, mobile application programming for Android smartphones has become one of the important subjects in software development fields. In Android applications, the user interface plays an important role in making them interactive. Based on previous studies, the Android Programming Learning Assistance System (APLAS) has been used as a self-learning platform for Android programming studies. In this paper, we propose a web-based platform in APLAS for learning user interface design interactively using XML code. The 15 learning topics provided as learning materials cover UI layout and application resources. Students receive an assignment to solve in each topic, and the answer can be validated automatically on the server. To evaluate the effectiveness of this platform, we asked 40 students in an IT department in Indonesia to solve all topics. As a result, all students successfully solved the 15 provided topics on schedule. They also gave positive comments regarding the ease of the learning process, which confirms the feasibility of our proposal to support learning UI design using XML.
- 10:40 Crafting the Digital Competence Behavior among Female Students in Developing Countries Context
- Technology integration at a mass level can help emerging economies fill skill gaps by enhancing digital competence in young people, particularly female students. This study aims to understand and model the predictors of digital competence among female students. A research framework was therefore hypothesized, enlisting personal innovativeness, facilitating conditions, and social influence as predictors of the behavioral intention to become digitally competent. An online survey was conducted to collect data from female students in two developing economies, Pakistan and Bangladesh. The survey yielded 254 responses: 130 from Pakistan and 124 from Bangladesh. Structural equation modeling analysis was conducted to examine the behavior predictors. The results show that, for the collective responses, all predictors were significant for behavioral intention, while in the Bangladesh scenario social influence had no impact on female students' intention to become digitally competent. The results will help stakeholders understand the predicaments of digital competence among females in developing countries.
- 11:00 An Implementation and Evaluation of Basic Data Storage Topic for Content Provider Stage in Android Programming Learning Assistance System
- Recently, Android-based smartphones have become the market leader among mobile devices, and the need for Android application developers has increased significantly. Following this trend, many schools and universities have made Android application programming a mandatory subject for IT students. In Android applications, utilizing data is necessary to make applications dynamic and interactive. Based on previous studies, the Android Programming Learning Assistance System (APLAS) has been applied as a self-learning platform for Android programming studies. In this paper, we implement the Basic Data Storage learning topic in APLAS as the first topic in the Content Provider stage (second stage). It offers learning materials and assignments for the Model-View-ViewModel (MVVM) architecture with data binding, the use of Shared Preferences, and storing data in internal storage. An evaluation with 50 students of an IT department showed that all students solved all assignments successfully. Given the predominance of positive comments from them, the effectiveness of this learning topic in supporting Android programming learning was confirmed.
- 10:00 Classification of flower species using CNN models, Subspace Discriminant, and NCA
- Flowers have an important place in human life, appearing at every stage of it. People want to identify the types of flowers they come across even in daily life; however, due to the large number of flower species, recognizing them is difficult. In this study, we used deep learning methods, which have recently been widely adopted in different fields, to overcome these difficulties. We used three different deep learning models. In the first stage, we performed classification using the pre-trained EfficientNet-b0, MobileNetV2, and AlexNet architectures. In the second step, we extracted feature maps of the images in the dataset using these three pre-trained models. We then optimized these features using the NCA dimensionality reduction method to save time and cost, and classified the optimized features with the Subspace Discriminant classifier. In the final stage, we combined the features obtained from the three pre-trained architectures, optimized the combined features with NCA, and again classified them with the Subspace Discriminant classifier. The highest accuracy achieved by the three pre-trained architectures alone was 83.67%, while the proposed hybrid method reached 94%, showing that our proposed model is successful.
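The fusion-and-reduction pipeline can be sketched as follows, with tiny made-up feature vectors and a simple variance ranking standing in for the actual NCA step:

```python
# Toy feature vectors one might extract from three pretrained backbones
# (real vectors would have hundreds of dimensions each; values invented)
feats_a = [[0.1, 0.9], [0.2, 0.8]]   # e.g. EfficientNet-b0 features, 2 images
feats_b = [[0.5, 0.5], [0.5, 0.6]]   # e.g. MobileNetV2 features
feats_c = [[0.9, 0.0], [0.8, 0.1]]   # e.g. AlexNet features

# Step 1: feature-level fusion by concatenation per image
fused = [a + b + c for a, b, c in zip(feats_a, feats_b, feats_c)]
assert len(fused[0]) == 6

# Step 2 (stand-in for NCA): keep the k most informative dimensions,
# here ranked by variance across images
def variance(col):
    m = sum(col) / len(col)
    return sum((v - m) ** 2 for v in col) / len(col)

k = 3
cols = list(zip(*fused))
keep = sorted(range(len(cols)), key=lambda i: variance(cols[i]), reverse=True)[:k]
reduced = [[row[i] for i in sorted(keep)] for row in fused]
assert len(reduced[0]) == k  # reduced vectors go to the final classifier
```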
- 10:20 Malicious URL Detection using Multilayer CNN
- Due to developing Internet-based technologies, the number of online domains and URLs is increasing globally. In parallel, several cybersecurity threats and phishing attacks are encountered while accessing these websites. Accessing a malicious webpage can cause serious harm to the physical system: data loss, privacy breaches, credential theft, and many other security threats arise when an Internet user clicks a malicious URL. Several defence and detection strategies have been proposed in previous research, but these works used traditional classifiers, which are not adequate because the number of URLs is huge and URL patterns change over time, making the correlation between old and new patterns almost impossible to find. Hence, this paper proposes malicious URL detection using a multilayer Convolutional Neural Network (CNN). The proposed model first considers a one-layer CNN; then, to improve accuracy, a two-layer CNN is used. The results illustrate that malicious website detection accuracy improves from 89% to 91% when the model uses two layers of CNN.
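Before a character-level CNN of this kind can be applied, each URL must be mapped to a fixed-length integer sequence; a minimal sketch of that preprocessing (the alphabet and maximum length are illustrative choices, not the paper's):

```python
# Character-level encoding of URLs, as typically fed to a 1-D CNN
ALPHABET = "abcdefghijklmnopqrstuvwxyz0123456789-._~:/?#[]@!$&'()*+,;=%"
CHAR_TO_IDX = {c: i + 1 for i, c in enumerate(ALPHABET)}  # 0 reserved for padding

def encode_url(url, max_len=20):
    """Map a URL to a fixed-length sequence of integer indices,
    truncating long URLs and right-padding short ones with 0."""
    ids = [CHAR_TO_IDX.get(c, 0) for c in url.lower()[:max_len]]
    return ids + [0] * (max_len - len(ids))

x = encode_url("http://evil.io/a")
assert len(x) == 20
assert x[-1] == 0  # padded tail
```

The embedding and convolution layers then operate on these index sequences, so the network learns directly from raw URL characters rather than hand-crafted features.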
- 10:40 Forecasting Demand Using ARIMA Model and LSTM Neural Network: a Case of Detergent Manufacturing Industry
- Generating reliable and meaningful product demand predictions is an open challenge in the industrial environment. Demand forecasting is still an active avenue of research since it significantly affects business profitability because of uncertainties related to demand predictability, high product variety, and supply fluctuation. This paper deals with a practical real-life case study of a leading international company. In particular, we investigate demand forecasting for the industrial production of household detergents. To tackle this challenging problem over medium- to long-term prediction horizons, we propose two different techniques: (i) a traditional statistical approach, the AutoRegressive Integrated Moving Average (ARIMA) model, and (ii) an artificial neural network based on the Long Short-Term Memory (LSTM) algorithm. We empirically assess and compare these approaches on real data sets. Numerical experiments attest to the competitiveness of the results obtained for household detergents and cleaning products. Furthermore, the results reveal that deep learning models have better overall performance than traditional statistical techniques: the average percentage error of the LSTM algorithm is 22%, compared with 34% for ARIMA, showing the better forecasting accuracy of the LSTM prediction model.
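The two models are compared via average percentage error; a minimal sketch of that comparison on made-up demand figures (the forecasts below are hypothetical, not the paper's results):

```python
def mape(actual, forecast):
    """Mean absolute percentage error, the metric used to compare models."""
    return 100.0 * sum(abs(a - f) / a for a, f in zip(actual, forecast)) / len(actual)

# Hypothetical monthly demand and two competing forecasts
actual   = [100.0, 120.0, 110.0]
lstm_fc  = [ 95.0, 125.0, 108.0]
arima_fc = [ 80.0, 140.0, 130.0]

# The model with the lower MAPE is the better forecaster
assert mape(actual, lstm_fc) < mape(actual, arima_fc)
```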
- 11:00 FitNet: A deep neural network driven architecture for real time posture rectification
- This paper presents a methodology for real-time pose estimation, which is expected to mitigate many orthopaedic problems caused by incorrect posture. A vast array of reported problems are known to arise from maintaining a wrong posture for a prolonged period while exercising or performing yoga. Several developments have addressed this issue, yet a major drawback was the presumption that a person exercising, performing yoga, or attending a gym session will keep the camera facing a fixed, pre-determined direction. Our approach to this problem deals with precise ROI detection, correct identification of human body joints, and tracking of body motion, all in real time. A major step towards the solution is determining the angular separation between the joints and comparing it with the desired angles. Another important facet of the methodology is analysing the performance of the deep neural architecture under different camera positions, a major bottleneck for many models intended to track a person's posture in real time. All these operations are performed efficiently, with an appropriate trade-off between time complexity and performance metrics. The result is a robust feedback-based support system that performs significantly better than the state-of-the-art algorithm thanks to a precise transformation of the input color space, contributing to the field of orthopaedics by providing a feasible way to avoid body strain and unnecessary pressure on joints during exercise.
- 10:00 Design and Fabrication of Rectangular Microstrip Antenna with Various Flexible Substrates
- Although microstrip antennas are small, light, practical, and cheap to manufacture, it is exceedingly difficult to obtain the most suitable electrical parameters, such as resonance frequency, bandwidth, return loss, gain, efficiency, and standing wave ratio. To achieve this, researchers try different physical structures and apply optimization techniques to them in order to obtain the most suitable radiation power and pattern in different sizes and materials. Especially at high frequencies, the dielectric properties of the material used can change all the parameters of a microstrip antenna and greatly affect its performance. The purpose of this study is to investigate the impact of the physical structure and electrical properties of various flexible substrate materials and to identify the most suitable one. For this purpose, textile-based wearable rectangular microstrip antennas were designed on three different, widely used resonant frequency bands with substrates of felt, photo paper, and fiberglass, and their performance was examined. The proposed antennas on felt, photographic paper, and fiberglass substrates were designed and manufactured. The feed line, radiating plane, and ground plane were formed using conductive (copper) tape. The operating frequency range of the antenna was chosen between 2 GHz and 10 GHz, and a simulated gain of 5.31 dB was obtained. The measured S11 results are in good agreement with the simulated ones. The proposed antenna allows continuous monitoring of patients at high risk of cancer.
- 10:20 Indoor Localization of a Multi-Storey Residential Household using Multiple WiFi Signals
- Tracking individuals, equipment, store locations, and floor level in a multi-story building becomes accessible through the implementation of an indoor positioning system. In this experimental study, we implement a multi-story indoor localization scheme by utilizing multiple WiFi Received Signal Strength Indicator (RSSI) signals installed in various locations of a three-floor residential household. Initially, our work focuses on static target locations spaced one meter apart and captures RSSI readings from four WiFi routers coming from different floors. These RSSI readings are stored in a database of fingerprints. To localize an indoor target, the cross-correlation between the offline and online (captured by a smartphone with a developed RSSI-capturing application) RSSI readings is calculated. Our empirical results have shown a 90% rate of correctly localizing a static indoor location when using only the average of a three-minute time series of RSSI values. We captured the WiFi RSSI values every 200 ms and present the localization utilizing the Time Reversal Resonating Strength (TRRS) concept.
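The fingerprint-matching step can be sketched as picking the stored reference point whose RSSI vector best correlates with the live scan. The fingerprints below are invented, and plain Pearson correlation stands in for the TRRS computation:

```python
def pearson(x, y):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x) ** 0.5
    vy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (vx * vy)

# Offline fingerprints: mean RSSI (dBm) from four routers at each reference point
fingerprints = {
    "floor1_hall": [-40, -65, -70, -80],
    "floor2_bed":  [-70, -45, -60, -75],
    "floor3_roof": [-85, -75, -50, -55],
}

def localize(online_rssi):
    """Pick the reference point whose fingerprint best matches the live scan."""
    return max(fingerprints, key=lambda p: pearson(fingerprints[p], online_rssi))

# A live scan near the floor-1 hall resembles that fingerprint most closely
assert localize([-42, -66, -72, -78]) == "floor1_hall"
```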
- 10:40 A hybrid beamforming Massive MIMO system for 5G: Performance assessment study
- Recently, massive MIMO (multiple-input multiple-output) technology has been employed to improve the coverage and capacity of 5G cellular networks. This paper investigates the impact of different factors on the performance of a hybrid beamforming MIMO system for 5G. Such a study is essential for finding a suitable configuration when designing a hybrid MIMO system for 5G. Specifically, we investigate the effect of changing several variables, including the number of users, the number of transmitter and receiver antennas, the type of modulation used, and noisy data, on the performance of a hybrid MIMO system for 5G. Furthermore, the bit error rate (BER) is computed under the different scenarios considered and used as an indicator of effectiveness. Finally, simulations are conducted to accomplish this study. This assessment provides relevant information about the performance of a hybrid MIMO system for 5G under different circumstances.
- 11:00 COCOSO-based Network Interface Selection Algorithm for Heterogeneous Wireless Networks
- Network Interface Selection (NIS) aims to connect user equipment to the best available network in heterogeneous wireless network (HWN) environments. NIS is one of the main current issues in HWNs and has raised great scientific interest in the last few years. Multi-attribute decision-making (MADM) methods are the most common approaches applied to the NIS problem, as they are easy to understand, usable in real scenarios, and able to rank networks quickly. In this paper, we apply, for the first time, the Combined Compromise Solution (COCOSO) to model and solve the network interface selection problem. Simulation results show that our proposed approach outperforms TOPSIS and SAW in terms of reducing the rank reversal problem and meeting QoS requirements.
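As a minimal illustration of the MADM style of selection, here is the SAW baseline mentioned above: score each candidate network by a weighted sum of normalised attributes and pick the best. The attribute values and weights are made up, and COCOSO itself adds further aggregation steps beyond this:

```python
# Candidate networks described by attributes already normalised so that
# higher is better (benefit/cost normalisation applied beforehand)
networks = {
    "WiFi": {"bandwidth": 0.9, "cost": 0.8, "delay": 0.4},
    "LTE":  {"bandwidth": 0.6, "cost": 0.4, "delay": 0.9},
    "5G":   {"bandwidth": 1.0, "cost": 0.3, "delay": 1.0},
}
weights = {"bandwidth": 0.5, "cost": 0.2, "delay": 0.3}  # application preferences

def saw_score(attrs):
    """Simple Additive Weighting: weighted sum of normalised attribute values."""
    return sum(weights[k] * v for k, v in attrs.items())

ranking = sorted(networks, key=lambda n: saw_score(networks[n]), reverse=True)
best = ranking[0]  # the interface the user equipment would attach to
```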
- 10:00 MinkowRadon: Multi-Object Tracking Using Radon Transformation and Minkowski Distance
- The latest trend in multiple object tracking (MOT) is to utilize deep learning to improve tracking performance. With all advanced models such as R-CNN, YOLO, SSD, and RetinaNet, there is always a time-accuracy trade-off that constrains computer vision advancement. Since it is not trivial to solve these challenges with end-to-end deep learning models, new strategies that enhance the aforementioned models are valuable. In this paper we introduce a novel Radon-transformation-based framework that takes advantage of color space conversion and maps the MOT problem into the signal domain using the Radon transformation. Afterwards, the Minkowski distance between sequences of signals is used to estimate the objects' locations. An adaptive Region of Interest (ROI) and thresholding criteria are adopted to ensure the stability of the tracker. We experimentally demonstrate on two public benchmarks that the proposed method achieves a significant performance improvement in both Multiple Object Tracking Accuracy (MOTA) and ID F1 (IDF1) with respect to the previous state of the art.
- 10:20 Predicting the Health Impacts of Commuting Using EEG Signal Based on Intelligent Approach
- Commuting to work is an everyday activity for many people and can have a significant effect on health. Regular commuting can be a cause of chronic stress, which is linked to poor mental health, high blood pressure, elevated heart rate, and exhaustion. This research investigates the neurophysiological and psychological impact of commuting in real time by analyzing brain waves and applying machine learning. The participants were healthy volunteers with a mean age of 30 years. Portable electroencephalogram (EEG) data were acquired as a measure of stress level from each participant using a non-invasive NeuroSky MindWave headset during 5 continuous activities on their commute to work. This approach allowed effects to be measured both during and following the commute. The results indicate that, whether the commute was short or long, the alpha band of the bio-signal exceeded the beta band when participants were in a calm or relaxed state, whereas the beta band was higher than the alpha band when participants were stressed by their commute. In modelling the cognitive and semantic processes underlying social behavior, most recent research projects are still based on individuals, while our research addresses groups as a complete cohort. This study recorded the experience of commuters with a special focus on the use and limitations of emerging computing technologies in telehealth sensors.
- 10:40 Fingerprint Measurements-Based Human Height Estimation
- Biometric prediction tasks, such as age and weight estimation, have gained a lot of interest in the last decades. This paper investigates human height estimation based on fingerprint measurements (fingerprint width, length, area, and circumference) and fingertip temperature. Linear regression analysis and optimization techniques are performed to find the best coefficients of the non-linear equations used. A piecewise function is proposed to estimate human height based on gender, fingerprint width, and fingerprint circumference. The analysis was performed on a dataset of 200 participants (117 males, 83 females).
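A piecewise estimator of this shape might be sketched as follows; the branch structure mirrors the idea of gender-dependent equations, but the coefficients are invented, not the fitted ones from the paper:

```python
def estimate_height_cm(gender, fp_width_mm, fp_circumference_mm):
    """Toy gender-dependent piecewise height estimator.
    Coefficients are illustrative placeholders, NOT the paper's fit."""
    if gender == "male":
        return 120.0 + 2.5 * fp_width_mm + 0.4 * fp_circumference_mm
    return 115.0 + 2.3 * fp_width_mm + 0.4 * fp_circumference_mm

h = estimate_height_cm("male", 14.0, 48.0)
assert 150.0 < h < 200.0  # a plausible adult height range
```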
- 11:00 Design and Implementation of Intelligent Socializing 3D Humanoid Robot
- Social intelligence in robots is a relatively new concept. In many application areas and circumstances where robots must communicate and work with other robots or people, social and interaction capabilities have become more important. This paper presents the design and implementation of an intelligent socializing 3D humanoid robot called "RUBEX". The robot was implemented by integrating different technologies and parts, including 3D printing, electronic and mechanical components, and various AI and machine learning algorithms. RUBEX offers engaging, rich, and friendly dialogue and interaction with an appearance that resembles a human. In designing the robot head, a 3D printer was used to manufacture a human-like face, and servo motors and sensors are used to control facial emotions and interaction. The robot was trained to greet people upon recognizing them, to interact with them, and to detect their emotions and communicate accordingly. The intelligent socializing 3D humanoid robot was implemented, tested, and validated successfully, and proved to be a product that could be manufactured at large scale in the future.
Thursday, September 30 11:20 - 12:30 (Asia/Bahrain)
Thursday, September 30 12:30 - 13:00 (Asia/Bahrain)
Thursday, September 30 13:00 - 14:40 (Asia/Bahrain)
- 13:00 AI and Machine Learning Techniques in the Development of Intelligent Tutoring System: A review
- An Intelligent Tutoring System (ITS) is important in education because it provides one-to-one personalized teaching assistance to learners as they learn how to solve problems through guidance and prompt feedback. An ITS is one application of Artificial Intelligence (AI) in education: it provides a smart learning environment for students without intervention from a teacher. An ITS's primary goal is to support learners in obtaining domain-specific intellectual knowledge in a practical and productive manner through the use of different computing technologies. This paper presents a comprehensive survey of previous research on ITSs that utilize various AI and Machine Learning (ML) techniques. It gives an overview of ITSs, their architecture, and some existing examples. In addition, it highlights and summarizes current research efforts and obstacles to ITSs using AI, as well as some future opportunities. This study shows the importance of AI and ML in ITS development. It is noticeable that researchers focus most on Reinforcement Learning (RL), Artificial Neural Networks (ANN), clustering, Bayesian Network (BN), and Fuzzy Logic (FL) approaches.
- 13:20 Cartoonize Images using TinyML Strategies with Transfer Learning
- Gradual advancements in deep neural networks have ultimately led to the development of specialised hardware for running such networks with superior performance. However, much consumer-grade hardware and many Single Board Computers (SBCs) used in embedded scenarios are not (yet) computationally efficient enough to execute even an already-trained model within feasible limits. In this paper, we therefore focus on applying Post-Training Quantization (PTQ) strategies to mitigate the great computational demands arising from the evolution of complex models. As the title implies, the original model incorporates a Generative Adversarial Network to generate cartoonized versions of real-world input images. This model, in its original state, takes nearly twice as much time to produce its output on a single-threaded workload. Our solution quantizes the pre-trained model from 32-bit floating-point values to a minimum of 8-bit integer values, with the addition of transfer learning in general. Our testing shows that PTQ allowed the model to be compressed to a smaller size than the original, making it ready to be deployed in resource-constrained environments. In addition, a significant increase in the inference engine's processing performance was observed on general-purpose hardware.
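The core PTQ step, mapping 32-bit floats to 8-bit integers via a scale and zero point, can be sketched per-tensor as follows (a generic affine scheme, not the exact quantizer used in the paper):

```python
def quantize_int8(weights):
    """Affine post-training quantization of float weights to int8 (per-tensor)."""
    lo, hi = min(weights), max(weights)
    scale = (hi - lo) / 255.0 or 1.0           # guard against constant tensors
    zero_point = round(-128 - lo / scale)      # int that maps back near lo
    q = [max(-128, min(127, round(w / scale) + zero_point)) for w in weights]
    return q, scale, zero_point

def dequantize(q, scale, zero_point):
    """Recover approximate float weights from the int8 representation."""
    return [(v - zero_point) * scale for v in q]

w = [-1.0, -0.25, 0.0, 0.5, 1.0]
q, s, z = quantize_int8(w)
w_hat = dequantize(q, s, z)
# Reconstruction stays within one quantization step of the originals
assert all(abs(a - b) <= s for a, b in zip(w, w_hat))
```

Each weight now occupies 1 byte instead of 4, which is where the roughly 4x model-size compression of int8 PTQ comes from.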
- 13:40 Oil Spill Detection System in the Arabian Gulf Region: An Azure Machine-Learning Approach
- Locating oil spills is a crucial part of effective marine contamination management. In this paper, we address the detection of oil spills in the Arabian Gulf region by leveraging a Machine-Learning (ML) workflow on a cloud-based computing platform: Microsoft Azure Machine-Learning Service (Custom Vision). Our workflow comprises a virtual machine, a database, and four modules (an information collection module, a detection model, an application module, and a decision module). The adequacy of the proposed workflow is assessed on SAR imagery of the targeted region. Qualitative and quantitative analysis shows that the proposed algorithm can detect oil spill occurrences with an accuracy of 90.5%.
- 14:00 Photovoltaic Solar Power Plant Maintenance Management based on IoT and Machine Learning
- Photovoltaic solar energy requires novel algorithms to ensure suitable maintenance management. A supervisory control and data acquisition (SCADA) system, combined with machine learning techniques, is required to obtain reliable information about the real state of photovoltaic systems. This paper introduces an Internet of Things platform for photovoltaic maintenance management based on classification algorithms that detect patterns where the performance ratio decreases significantly in time series. A real case study is presented with SCADA data from a photovoltaic solar plant located in Spain. The classification algorithms employed are Shapelets and K-nearest neighbors. The results prove the robust performance of both algorithms in pattern recognition, while K-nearest neighbors is preferable for implementation on the Internet of Things platform due to its reduced execution time. The application of the platform developed in this paper improves photovoltaic maintenance management by detecting performance ratio reductions.
- 14:20 Machine Learning techniques implemented in IoT platform for fault detection in photovoltaic panels
- Novel condition monitoring systems and data analysis methods are required to support and enhance the information obtained from supervisory control and data acquisition systems. The application of advanced condition monitoring systems based on thermographic cameras embedded in unmanned aerial vehicles is a challenge in maintenance. This paper presents an Internet of Things platform to detect faults by analyzing thermal images acquired by aerial inspections. A combination of two artificial neural networks is applied to detect regions with faults in photovoltaic solar panels, providing high accuracy. A real case study is presented with thermograms from two different photovoltaic solar plants. The analysis is performed on the platform, and the results show an average accuracy of 93% for hot spot detection.
- 13:00 Time-Series Forecasting of COVID-19 Cases Using Stacked Long Short-Term Memory Networks
- The extent of the COVID-19 pandemic has devastated world economies and claimed millions of lives. Timely and accurate information, such as time-series forecasts, is crucial for governments, healthcare systems, decision-makers, and policy-implementers in managing the disease's progression. Given the potential value of early knowledge to save countless lives, this research investigated and compared the capabilities and robustness of sophisticated deep learning models against traditional time-series forecasting methods. The results show that Stacked Long Short-Term Memory Networks (SLSTM) outperform the Exponential Smoothing (ES) and Autoregressive Integrated Moving Average (ARIMA) models for a 15-day forecast horizon. SLSTM attained a collective mean accuracy of 92.17% (confirmed cases) and 82.31% (death cases) using 419 days of historical data, from March 6, 2020 to April 28, 2021, for four countries: the Philippines, the United States, India, and Brazil.
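Before a stacked LSTM can be trained on such data, the daily case series is typically cut into sliding windows of inputs and targets; a sketch of that preparation step (window sizes and case figures are illustrative):

```python
def make_windows(series, lookback, horizon):
    """Turn a daily case series into (input window, target) pairs for a
    sequence model such as a stacked LSTM."""
    xs, ys = [], []
    for i in range(len(series) - lookback - horizon + 1):
        xs.append(series[i : i + lookback])                      # model input
        ys.append(series[i + lookback : i + lookback + horizon])  # forecast target
    return xs, ys

# Hypothetical daily confirmed-case counts
daily_cases = [100, 120, 150, 170, 200, 240, 260, 300]
X, y = make_windows(daily_cases, lookback=3, horizon=2)
assert X[0] == [100, 120, 150] and y[0] == [170, 200]
assert len(X) == len(daily_cases) - 3 - 2 + 1
```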
- 13:20 Lockdown Duration-based Impact Analysis Model of Residential Community Mobility Change and COVID-19 Cases in Malaysia
- Many governments around the world imposed national lockdowns restricting community mobility to slow down the COVID-19 infection rate before the whole healthcare system became overwhelmed. However, studies on the impact of lockdown and its duration are still lacking in the literature. This paper attempts to investigate and measure the impact of lockdown on community mobility patterns in residential areas and how it correlates with new COVID-19 cases nationwide. Time series and correlation analysis were adopted as the research protocol, utilizing Google community mobility report (CMR) data and the national daily new cases reported by the Ministry of Health. The findings show that the impact of lockdown starts to show a negative correlation at day 5 and reaches the optimal score between 13 and 28 days of duration. This paper contributes a metric for measuring the impact of lockdowns of varied duration. While the result cannot be generalized to other contexts, the measurement method from this study has potential application for state governments or municipalities targeting specific localities to develop more resilient policies and strategies that balance the impact of lockdown on the life and livelihood of the community.
- 13:40 Development of COVID-19 mRNA Vaccine Degradation Prediction System
- The threatening coronavirus, declared a global pandemic, has shaken not only public health but also society, the economy, and every walk of life. Measures have been taken to stifle the spread, and one of the best is to prevent the contagion of the SARS-CoV-2 virus among uninfected populations. Vaccination is one such precaution, and among all vaccine types, the mRNA vaccine, which shows high effectiveness without side effects, is the most preferable candidate. However, degradation has been its biggest drawback to implementation. This study therefore develops prediction models specifically to predict the degradation rate of COVID-19 mRNA vaccines. Three machine learning algorithms, Linear Regression (LR), Light Gradient Boosting Machine (LGBM), and Random Forest (RF), are proposed for the development of 12 models. A dataset comprising thousands of RNA molecules with degradation rates at each position, extracted from the Eterna platform, is pre-processed and encoded with label encoding before being loaded into the algorithms. The results show that the LGBM-based model trained with auxiliary bpps features and encoded with label encoding method 1 performs best (RMSE = 0.24466), followed by the same LGBM-based model encoded with label encoding method 2, with a difference of only 0.00003 from the top model. The RF-based model also performs well (RMSE = 0.25302) even without the bpps features used by the best LGBM-based model, and is worth cultivating further for prediction studies on the degradation rate of COVID-19 mRNA vaccines.
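The label-encoding step mentioned above maps each RNA sequence/structure character to an integer before training. The mapping below is an illustrative assumption, not the paper's exact "method 1"/"method 2" schemes:

```python
def build_encoder(symbols):
    # assign each distinct symbol a stable integer label (sorted for determinism)
    return {s: i for i, s in enumerate(sorted(set(symbols)))}

def label_encode(sequence, encoder):
    # replace every character with its integer label
    return [encoder[ch] for ch in sequence]

sequence = "GGAAAAGCUCUAAU"                 # illustrative RNA fragment
encoder = build_encoder(sequence)           # {'A': 0, 'C': 1, 'G': 2, 'U': 3}
print(label_encode(sequence, encoder))
```

Tree-based models such as LGBM and RF can consume these integer codes directly, which is one reason label encoding (rather than one-hot) is a reasonable choice here.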
- 14:00 The role of Internet of Things, Blockchain, Artificial Intelligence, and Big Data Technologies in Healthcare to Prevent the Spread of the COVID-19
- The spread of the new coronavirus pandemic (COVID-19) has led to a major crisis in the economic and health sectors, requiring a prompt response from medical personnel, health organizations, scientists, and the government sector. Globally, healthcare institutions have been affected greatly and unexpectedly by the COVID-19 pandemic, which has put current healthcare systems under tremendous pressure and stretched their capabilities and resources to the maximum in providing medical services to the infected. In this global health emergency, and given limited healthcare resources, quick and innovative solutions have become a necessity. As a result, using new technologies to combat COVID-19 and meet the pandemic's specific requirements, such as detecting, monitoring, diagnosing, screening, surveillance, tracking, and raising awareness, has become unavoidable. The focus of this research is to understand how the healthcare system uses these new technologies to fight the pandemic. This paper provides a guideline for practitioners on the benefits and application areas of Artificial Intelligence, Internet of Things, Blockchain, and Big Data technologies in the healthcare industry for facing the crisis caused by the pandemic. A detailed analysis of the strengths, weaknesses, opportunities, and threats of a thorough implementation of these technologies has been conducted. The paper also addresses the obstacles to adopting these technologies in healthcare systems and makes recommendations for future studies. It assists researchers, experts, and readers in recognizing how technology is aiding the management of coronavirus infection in a synergistic manner, and underlines the need for these techniques in present and future emergencies.
- 14:20 Deep Ensemble Approaches for Classification of COVID-19 in Chest X-Ray Images
- The COVID-19 pandemic has severely crippled the healthcare industry as a whole. Efficient screening techniques are crucial to suppress the escalation of the disease. Medical image analysis of chest X-rays has recently become increasingly important in radiology examination and the screening of infected patients. Studies have shown that deep CNN models can help in the diagnosis of this infection by automatically classifying chest X-ray images as infected or not. Ensembling these deep CNN architectures can further improve performance by reducing the generalisation error compared to a single model. This paper presents different ensemble learning approaches to synergize the features extracted by deep CNN models and improve the classification. These automatic classification approaches can be used by radiologists to help identify infected chest X-rays and support the screening process.
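One common ensemble approach of the kind the paper explores is soft voting: average the class probabilities predicted by several CNNs for one X-ray, then take the argmax. A minimal sketch of the averaging step only, with made-up model outputs:

```python
def soft_vote(probabilities):
    # probabilities: one [p_normal, p_covid] pair per model
    n = len(probabilities)
    avg = [sum(p[i] for p in probabilities) / n
           for i in range(len(probabilities[0]))]
    return avg.index(max(avg)), avg

# three models disagree on one image; the ensemble resolves the prediction
model_outputs = [[0.30, 0.70], [0.60, 0.40], [0.20, 0.80]]
label, averaged = soft_vote(model_outputs)
print(label, averaged)  # label 1 ("covid"), since the averaged p_covid is higher
```

Averaging reduces the variance of any single model's errors, which is the generalisation-error reduction the abstract refers to.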
- 13:00 Design and Simulation of AES S-Box Towards Data Security in Video Surveillance Using IP Core Generator
- Broadcasting applications such as video surveillance systems use High Definition (HD) videos. The use of high-resolution videos significantly increases the data volume of video coding standards such as High-Efficiency Video Coding (HEVC) and Advanced Video Coding (AVC), which increases the challenge of storing, processing, encrypting, and transmitting these data over different communication channels. Video compression standards use state-of-the-art techniques to compress raw video sequences more efficiently; such techniques require high computational complexity and memory utilization. With the emergence of HEVC and video surveillance systems, many security risks arise, such as man-in-the-middle attacks and unauthorized disclosure. Such risks can be mitigated by encrypting the HEVC traffic. The most widely used encryption algorithm is the Advanced Encryption Standard (AES). Most of the computational complexity in hardware-implemented AES is due to the S-box, or sub-byte, operation, because it is a non-linear structure that needs many resources. The proposed AES S-box ROM design considers the latest HEVC used for homeland-security video surveillance systems. This paper presents different designs for efficient VHDL ROM implementations of the AES S-box using the IP core generator, ROM components, and Functions, all supported by Xilinx. The IP core generator has a Block Memory Generator (BMG) component in its library. The S-box IP core ROM is implemented using single-port block memory. The S-box lookup table is used to fill the ROM via the .coe file provided during initialization of the IP core ROM. The address is 8 bits wide to index the 256 entries, and the data stored in the ROM are likewise 8 bits wide. The whole design is synthesized using Xilinx ISE Design Suite 14.7, while ModelSim (version 10.4a) is used for simulation.
The proposed IP core ROM design shows better memory utilization than the non-IP core ROM design, making it more suitable for memory-intensive applications. The proposed design is suitable for FPGA ROM implementation. Hardware complexity, frequency, memory utilization, and delay are presented in this paper.
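The 256 byte values that fill the S-box ROM (e.g. when emitting a .coe initialization file) come from the standard AES construction: the multiplicative inverse in GF(2^8) followed by a fixed affine transform. A short sketch generating the table, independent of the paper's VHDL design:

```python
def gf_mul(a, b):
    # carry-less multiplication modulo the AES polynomial x^8 + x^4 + x^3 + x + 1
    p = 0
    for _ in range(8):
        if b & 1:
            p ^= a
        carry = a & 0x80
        a = (a << 1) & 0xFF
        if carry:
            a ^= 0x1B
        b >>= 1
    return p

def gf_inverse(a):
    if a == 0:
        return 0  # 0 maps to 0 by convention
    return next(c for c in range(1, 256) if gf_mul(a, c) == 1)

def rotl8(x, n):
    # rotate an 8-bit value left by n positions
    return ((x << n) | (x >> (8 - n))) & 0xFF

def sbox_entry(a):
    # affine transform over GF(2) applied to the multiplicative inverse
    b = gf_inverse(a)
    return b ^ rotl8(b, 1) ^ rotl8(b, 2) ^ rotl8(b, 3) ^ rotl8(b, 4) ^ 0x63

sbox = [sbox_entry(a) for a in range(256)]
print(hex(sbox[0x00]), hex(sbox[0x01]))  # standard values 0x63 and 0x7c
```

Hardware designs store this table in ROM precisely because recomputing the non-linear inverse per byte is what makes the S-box the most expensive part of AES.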
- 13:20 Exploring Blockchain-enabled Intelligent IoT Architecture
- Internet of Things (IoT) architecture, despite its strong functionality and compatibility with numerous smart devices, is limited by its vulnerability to security threats. To overcome this limitation, attempts to introduce blockchain and Artificial Intelligence (AI) to improve IoT architecture have been gaining traction in the past few years. While a significant number of iterations have been made in this regard, the complexity of the integration process has made it difficult to identify best practices that are suitable across different applications. This study analyses the issues and limitations of integrating blockchain and AI in an IoT architecture by looking at different iterations and implementations to arrive at a clear picture of existing trends, research limitations, and challenges. The overall results indicate a positive trajectory, as the integration of IoT, blockchain, and AI has been successful across various implementations. While the extent of blockchain integration of different components depends upon the purpose of the system, the caveat is that there are possible issues involving increased complexity, compatibility, and efficiency. The use of AI algorithms has been instrumental in filling in the gaps and improving the overall efficiency of such systems.
- 13:40 Blockchain in Healthcare: Attacks and Countermeasures
- Blockchain technology is a collection of records called blocks, characterized by several features including transparency, meaning that everyone in the network can view all information stored in the blockchain, and decentralization, meaning that no single authority controls it. The technology is used in many areas, including digital identities, digital auctions, and cryptocurrencies. Blockchain plays a large role in healthcare through its many services: it helps healthcare researchers decode the genetic code, securely transmits patient medical records, and manages the drug supply chain. All blockchain-based systems are exposed to various attacks, which threaten the security of the information and data stored on them, and blockchain-based healthcare systems are among these. This paper focuses on recent attacks on, and countermeasures for, blockchain technology in the healthcare field. Several expected attacks are discussed, and countermeasures that can prevent or reduce the risk of these attacks are presented.
- 14:00 On the Implementation of Access Control in Ethereum Blockchain
- Access control is a main component of blockchain systems. It is usually implemented in smart contracts and defines the security policy; in other words, it determines who can access a protected resource in the network. In this paper, we present a review of the major implementations of access control on the Ethereum platform, which are based on the RBAC (Role-Based Access Control) model. Implementations must take into account the whole RBAC process, that is, user-role assignment and permission assignment. Three implementations currently exist and are described and compared in this work according to several features: RBAC-SC, Smart Policies, and OpenZeppelin contracts.
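The RBAC process the review covers (user-role assignment, role-permission assignment, access check) can be sketched off-chain in a few lines; the compared contracts encode the same relations in smart-contract storage. Role and permission names below are hypothetical examples, not taken from any of the three implementations:

```python
class RBAC:
    def __init__(self):
        self.user_roles = {}   # user -> set of roles
        self.role_perms = {}   # role -> set of permissions

    def assign_role(self, user, role):
        # RBAC step 1: user-role assignment
        self.user_roles.setdefault(user, set()).add(role)

    def grant_permission(self, role, permission):
        # RBAC step 2: role-permission assignment
        self.role_perms.setdefault(role, set()).add(permission)

    def can(self, user, permission):
        # access check: the user holds some role carrying the permission
        return any(permission in self.role_perms.get(role, set())
                   for role in self.user_roles.get(user, set()))

acl = RBAC()
acl.grant_permission("admin", "update_contract")
acl.assign_role("0xAb5801a7", "admin")
print(acl.can("0xAb5801a7", "update_contract"))  # True
print(acl.can("0xDeadBeef", "update_contract"))  # False
```

On Ethereum the same mappings live in contract storage and the check runs in a modifier before a protected function executes.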
- 14:20 Decentralised Blockchain-based Solutions for Electronic Healthcare Record with Interacting Social Networking Components
- The healthcare industry is a conjunction of monolithic applications based on neutral and time-stamped diagnostics. Hence, it is particularly interested in decentralised technologies to achieve global and social-driven electronic health records. Blockchain is a decentralised networking system where multiple copies of immutable data records are distributed and validated among different independent nodes. While traditionally used to create currently famous digital cryptocurrencies like Bitcoin, its worldwide fever helped to extend its applications beyond the financial industry. However, given how fast completely new technologies evolve, healthcare stakeholders are struggling to find the correct usage of blockchain ecosystem developments. Our goal is to provide a summary of the state-of-the-art efforts in the development of decentralised blockchain-based solutions for electronic healthcare records, in which social networking and wearable technology data is considered in order to provide patient-driven diagnostics -- proven to be critical in the current SARS-CoV-2 pandemic. In this paper, we present a comprehensive guide on blockchain technologies competing in the healthcare market, including emerging blockchain-based initiatives and trends. Additionally, we discuss the main caveats of blockchain applications development in this industry based on practical experience.
- 13:00 Deep And Machine Learning Towards Pneumonia And Asthma Detection
- Machine learning is a branch of artificial intelligence widely used in the medical field for the analysis of high-dimensional medical data and the early detection of certain dangerous diseases. Lung diseases continue to be one of the leading causes of death worldwide. Early and accurate prediction of lung diseases has become a primary necessity to save patients' lives and facilitate doctors' work. This paper focuses on the prediction of chest diseases such as pneumonia and asthma using machine and deep learning techniques, namely the K-Nearest Neighbors (KNN) and Deep Neural Network (DNN) methods. These approaches are evaluated on a private dataset from the pulmonary diseases department of Diyarbakir hospital, Turkey, consisting of 212 samples, each characterized by 38 input features. The results obtained show the effectiveness of these methods, especially KNN, and confirm that they can be used effectively in the detection of pulmonary diseases.
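The KNN method that performs best in the study reduces to distance-based voting. A pure-Python sketch over toy two-feature vectors (illustrative, not the 38-feature hospital dataset):

```python
import math
from collections import Counter

def euclidean(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def knn_predict(train, query, k=3):
    # train: list of (feature_vector, label) pairs;
    # vote among the k samples closest to the query
    neighbours = sorted(train, key=lambda s: euclidean(s[0], query))[:k]
    votes = Counter(label for _, label in neighbours)
    return votes.most_common(1)[0][0]

train = [([1.0, 1.0], "asthma"), ([1.2, 0.9], "asthma"),
         ([4.0, 4.2], "pneumonia"), ([4.1, 3.9], "pneumonia"),
         ([3.8, 4.0], "pneumonia")]
print(knn_predict(train, [4.0, 4.0], k=3))  # -> pneumonia
```

With real clinical features, the inputs would normally be scaled first so no single feature dominates the distance.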
- 13:20 Quality Categorisation of Corn (Zea mays) Seed using Feature-Based Classifier and Deep Learning on Digital Images
- The corn yield improvement program aims to attain continuous national self-sufficiency. The program needs to be supported by the availability of food resources, including high-quality corn seeds. In corn seed production, grading is one of the factors that affect seed quality. The grading process is conducted manually through workers' visual observations, which tends to be subjective and ineffective; some corn seed factories use sieve machines to grade by seed size. In this paper, an imaging-based classification system is proposed to grade BIMA-20 URI Hybrid corn seeds into two classes, categorised as good and bad. Three different methods are studied, based respectively on (1) shape, colour, and size features, (2) seed roundness, and (3) a deep learning approach. Image data are acquired in groups of five corn kernels, and region-of-interest (ROI) segmentation is performed to select every single seed from the group image. Feature values are then extracted from each single-seed image and used as classification parameters. The F1 score of the proposed classification system, roundness differentiation, and model training performance are used to show the categorisation capability. The deep learning approach achieved the best F1 score among the proposed techniques; the best value, 0.983, is obtained with the ResNet-50 implementation. In separate observations, Method 6 (Size and Colour), Method 7 (Size, Shape, and Colour), Roundness, and ResNet-50 are the best models of their respective method groups. These methods reach F1 scores above 0.9, except for the roundness parameter, whose F1 score is 0.854; additional parameters might be required by the roundness-based method to improve its final performance.
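The F1 score used to compare the grading methods above is the harmonic mean of precision and recall over the good/bad labels. A short sketch; the counts are illustrative, not the paper's confusion matrix:

```python
def f1_score(tp, fp, fn):
    # precision: fraction of predicted positives that are correct
    precision = tp / (tp + fp)
    # recall: fraction of actual positives that are found
    recall = tp / (tp + fn)
    return 2 * precision * recall / (precision + recall)

# e.g. 59 good seeds found, 1 false alarm, 1 missed
print(round(f1_score(tp=59, fp=1, fn=1), 3))  # 0.983
```

F1 is a sensible choice here because it penalizes both letting bad seeds through (false positives) and rejecting good ones (false negatives).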
- 13:40 A Novel Deep Learning Based Intrusion Detection System: Software Defined Network
- Over the last few years, Software Defined Networking (SDN) has brought forward a potential software-based networking framework that allows network programming to operate alongside the overall network management system. Tracking data to the data centre is more effective with this new method. It prevents security flaws from introducing new threats into the network, since these vulnerabilities reveal themselves only at the time of OpenFlow packet transmission via a centralized system and symmetric controller. A study was conducted of a Deep Learning (DL) based approach proposed for implementation on SDN. A Deep Neural Network model is used to monitor network activity for both regular and anomalous data transfer to check for malicious traffic. Publicly accessible IDS datasets, KDD-CUP99, NSL-KDD, and UNSW-NB15, are used to determine the possible behaviour of security flaws. The study explores SDN security and IDS with respect to security concerns and reports a very high and acceptable accuracy rate.
- 14:00 Eye-Tracking Analysis with Deep Learning Method
- The eyes are a rich source of information about mental activities as well as providing the perception of the outside world. Because they cannot be consciously controlled, they can reveal unique characteristics such as preferences and intentions. For this reason, eye-tracking technology is widely used in medicine, gaming, and commercial applications. In this study, the type of text being read was estimated by analyzing eye movements during daily reading activity. Deep learning approaches were preferred because machine learning approaches had previously yielded insufficient results. Multiplexing was performed on a dataset with 52 features collected from 20 participants (10 males, 10 females), yielding 627 samples from the original 20 recordings. By creating visual representations (spectrograms) of the data produced in sufficient numbers and processing them with deep learning architectures, a good success rate of 97.88% was achieved with AlexNet. While the best values for news and text types were obtained with AlexNet and ResNet101, better results were produced with ResNet18 and ResNet50 for comedy, which has high visual content. It was also noticed that the success rate for women was higher on documents with visual content.
- 14:20 Chronic Kidney Disease Classification Using Machine Learning Classifiers
- One of the important health problems is chronic kidney disease (CKD). It is increasing every day as a result of poor eating habits, insufficient water consumption, and lack of health awareness. CKD is a condition in which the kidneys are damaged over time due to a variety of causes, resulting in a progressive and irreversible loss of kidney function. Technological advancements such as Machine Learning (ML) have a significant influence on the health sector by giving more accurate detection and successful treatment of many chronic diseases. This research paper explores a variety of ML techniques to predict kidney disease. The objective is to identify which ML classifier could be the best for detecting and predicting CKD. In this work, seven ML approaches are used to predict kidney disease, namely Gaussian Naïve Bayes (GNB), Decision Tree (DT), K-Nearest Neighbor (KNN), Random Forest (RF), Logistic Regression (LR), AdaBoost, and Gradient Boosting. RF and GNB scored better than the other classifiers, with 100% accuracy.
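Gaussian Naïve Bayes, one of the paper's two top classifiers, fits per-class feature means and variances and compares log-likelihoods at prediction time. A minimal sketch over toy two-feature data (illustrative, not the CKD dataset):

```python
import math
from collections import defaultdict

def fit(samples):
    # samples: list of (features, label); returns per-class (means, variances, prior)
    grouped = defaultdict(list)
    for features, label in samples:
        grouped[label].append(features)
    stats = {}
    for label, rows in grouped.items():
        n = len(rows)
        means = [sum(col) / n for col in zip(*rows)]
        varis = [sum((x - m) ** 2 for x in col) / n + 1e-9  # smoothing
                 for col, m in zip(zip(*rows), means)]
        stats[label] = (means, varis, n / len(samples))
    return stats

def predict(stats, features):
    def log_post(label):
        means, varis, prior = stats[label]
        ll = math.log(prior)
        for x, m, v in zip(features, means, varis):
            # Gaussian log-density, features assumed conditionally independent
            ll += -0.5 * math.log(2 * math.pi * v) - (x - m) ** 2 / (2 * v)
        return ll
    return max(stats, key=log_post)

# hypothetical (serum creatinine, blood pressure) pairs
data = [([1.2, 140], "ckd"), ([1.4, 150], "ckd"),
        ([0.8, 95], "notckd"), ([0.9, 100], "notckd")]
model = fit(data)
print(predict(model, [1.3, 145]))  # -> ckd
```

The "naïve" independence assumption makes training a single pass over the data, which is why GNB remains competitive on small tabular datasets like the one described.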
- 13:00 An Intensity Estimation Application Based on Website Microservice Logs
- The number of digital platforms that use cloud systems with microservice architectures increases day by day. By using public cloud systems efficiently, costs and expenses can be significantly reduced. This study determines the resources necessary for a website by examining user activities for cloud resource management. A successful estimating system is essential for adjusting the price/performance balance of resource management. In this study, more than 1.5 million user logs with 18 different features were collected, and SVM RBF and decision tree forest were applied to the data. The study shows that the SVM RBF method modeled the service rush time with an approximately 95% success rate, revealing that a sound cloud resource management system can provide significant economic benefit by adjusting the number of resources according to rush-time predictions.
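The RBF kernel at the heart of the SVM used above measures similarity between log-feature vectors as k(x, y) = exp(-γ‖x − y‖²). A minimal sketch; the gamma value and feature vectors are illustrative assumptions:

```python
import math

def rbf_kernel(x, y, gamma=0.5):
    # similarity decays exponentially with squared Euclidean distance
    sq_dist = sum((a - b) ** 2 for a, b in zip(x, y))
    return math.exp(-gamma * sq_dist)

# identical feature vectors give similarity 1; distant ones decay toward 0
print(rbf_kernel([1.0, 2.0], [1.0, 2.0]))   # 1.0
print(rbf_kernel([0.0, 0.0], [3.0, 4.0]))   # tiny value, near 0
```

The kernel lets the SVM separate rush-time from off-peak log patterns non-linearly without explicitly constructing high-dimensional features.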
- 13:20 Performance Monitoring of MongoDB on Varied Cluster Configuration: An Experimental Approach
- Data has become one of the most valuable assets in today's digital world. The nature of the generated data varies from structured to semi-structured and even completely unstructured data items. Like many other verticals, the educational domain is also generating a huge volume of data with high variation. The main objective is to store and retrieve data in optimal time while efficiently utilizing resources and maintaining security. The evolution of different "Not only SQL" (NoSQL) data models to manage this huge volume and variety of data has helped to facilitate many powerful applications like Facebook, Instagram, WhatsApp, etc. NoSQL data models are provided by many vendors as installation-based services and cloud-based services as well. MongoDB is one of the popular document-oriented NoSQL data models. In this paper, our objective is to analyze, in different aspects, the performance of MongoDB on an installation-based clustering system and on a cloud-based clustering system. We have deployed the methodology for handling the stated semi-structured big data with a case study of a Visitors' Registrar. The performance of Mongo Query Language (MQL) execution on the two platforms is analyzed.
- 13:40 An Alternate Switch Selection for Fault Tolerant Load Administration and VM Migration in Fog Enabled Cloud Datacenter
- Users of cloud-fog can access elastic clusters of available Virtual Machines (VMs) for their data-processing needs. Lower hardware consumption by individuals, in addition to VM technologies, was attained by introducing the Fog-IoT setup and employing cloud resources. Better recovery of failed services requires a VM-based infrastructure, and service accuracy for a provider's virtual Domain Controller (DC) can be achieved through a dedicated routing resolution. In case of node failure, it is very difficult to decide which, and how many, VMs should be chosen for migration to keep up the accuracy assurance at the failed node; the selected VM can restrict the number of VMs that should be migrated. Choosing one or multiple promising VMs to migrate in order to decrease the load on the given cloud-fog resources can be an issue. This work proposes Alternate Switch Identification and Fault Tolerant Load Administration (AsI_FTLA) for implementing the cloud-fog data-center infrastructure during VM migration via an improved Virtual Network (VN) recovery method. A linear integer programming model is imposed to study path traffic, examining every associated numerical factor to select the most favorable VM through the best route. The new VM migration is then established by the alternate switch identification algorithm, and routing is achieved. CloudSim was employed to study the performance of the proposed AsI_FTLA system. Simulation results show considerable improvements in average resource and storage utilization and throughput, and a reduction in total execution time, compared with existing strategies.
- 14:00 Quality of Life Integrated Framework: Perspective of Cloud Computing Usage
- This research aims to measure the impact of cloud computing on people's quality of life in the Kingdom of Bahrain and to identify factors that could affect people's intention to use cloud computing services. An online survey was used to collect primary data and was distributed to a random sample of 443 respondents in the Kingdom of Bahrain; the achieved sample comprised 394 respondents representing people of different ages and educational levels. The researchers adapted selected factors from the diffusion of innovation (DOI) theory, including relative advantage, complexity, and compatibility, in addition to quality-of-life factors consisting of education, healthcare, well-being, and entertainment. These factors are used to establish the framework of this research. The research is limited to examining only the factors proposed in the framework; also, as a consequence of the current coronavirus (COVID-19) situation, data collection was restricted to a quantitative approach using an online survey. Findings show that administrability of cloud computing usage is the factor with the greatest impact on people's quality of life, and more specifically on people's education.
- 14:20 Verified Framework for Distributed Processing Cost Reduction using Excess Cloud Resources
- Distributed computing is one of the important technologies for processing big data. A distributed computing system links many computing devices together to process data and can be leveraged using cloud resources. Pricing for booking cloud resources varies, and leasing redundant resources is less expensive, with some drawbacks. In this paper, a verified framework for distributed processing cost reduction using excess cloud resources is presented. The framework is based on using redundant resources via cloud services and is verified through simulation. As a result of implementing the framework, it was found that the use of excess cloud resources reduces the cost of implementing a distributed computing system by 67% compared to using on-demand cloud resources.
- 13:00 Support Vector Machine and Decision Tree-Based Elective Course Suggestion System: A Case Study
- Nowadays, online education has become widespread, and the search for new techniques has begun to increase. The high number of quotas in university education in Turkey increases the number of students per instructor. Due to the large number of students, it is difficult for a student to receive a good education under an advisor's guidance and to choose the courses appropriate for his or her field. In this study, a suggestion system is proposed for directing elective course choices by analyzing the previous courses taken by university students. Which courses would be beneficial to choose and which would not is presented through a web interface in which Support Vector Machines and decision trees are used. In a pilot study of the developed model conducted in the Computer Engineering department, an average success of 76% was achieved on the test datasets. This success shows that the system can examine a student's compulsory courses and suggest elective courses suitable for his or her field that he or she will like.
- 13:20 An Effective Hybrid Approach Based on Machine Learning Techniques for Auto-Translation: Japanese to English
- In recent years, machine learning techniques have been able to perform tasks previously thought impossible or impractical, such as image classification and natural language translation, allowing the automation of tasks previously thought possible only for humans. This research work aims to test a naïve post-processing grammar correction method using a Long Short-Term Memory neural network to rearrange translated sentences from Subject-Object-Verb to Subject-Verb-Object order. Machine learning based techniques are used to translate works in an automated fashion rather than manually, and to post-process translations to increase sentiment and grammar accuracy. The implementation of the proposed methodology uses a bounding-box object detection model, an optical character recognition model, and a natural language processing model to fully translate manga without human intervention. The grammar correction experimentation addresses a common problem when machines translate between two natural languages that use different word orders, in this case from Japanese Subject-Object-Verb to English Subject-Verb-Object. For this experimentation, two sequence-to-sequence Long Short-Term Memory neural networks were developed, a character-level model and a word-level model using word embedding, to reorder English sentences from Subject-Object-Verb to Subject-Verb-Object. The results showed that the methodology works in practice and can automate the translation process successfully.
- 13:40 Training Time Optimization Through Adaptive Learning Strategy
- Digital Learning is rapidly evolving and adapting to new learning needs. In every field of daily life, training is a fundamental asset for achieving any goal. Modern e-learning systems aim to make learning quick and effective. Training courses are often delivered sequentially, and much time is wasted because learners must attend lessons on topics they already master. This research aims to demonstrate that an Adaptive Learning Strategy can optimise training by drastically reducing the throughput time of the learning path, avoiding wasted time while maintaining a high level of learner engagement. These goals are reached using a learning management system platform and an adaptive learning algorithm on a modular course to build and deliver personalised learning paths that recognise the prior knowledge of each user. Thanks to an Adaptive Learning Strategy, learners will optimise their training, achieving the learning goals in a shorter time, and will not have to attend topics for which they have already demonstrated a complete knowledge level.
- 14:00 Effectiveness of Online Teaching During COVID-19
- Considering the challenges of online teaching in the current scenario of the COVID-19 pandemic, this study aimed to analyze the effect of student participation, teachers' skills and strategies, teacher training, teaching domain, and teaching perception on the effectiveness of online teaching. Primary data were used to test the proposed model; data were collected through emails using convenience sampling from university teachers in Pakistan. The structural equation modelling technique was applied through SmartPLS (v.3) to analyze the research model. Findings indicate that student participation, teachers' skills and strategies, teacher training, teaching domain, and teaching perception have a significant positive effect on the effectiveness of online teaching. Hence, this study recommends that universities focus on individual teachers' needs to make online teaching more effective.
- 14:20 A Survey on E-learning Methods and Effectiveness in Public Bahrain Schools during the COVID-19 pandemic
- Educational organizations have used e-learning as an alternative to traditional learning during the COVID-19 pandemic and the need for social distancing. This paper presents the e-learning methods used during the COVID-19 pandemic in public Bahrain schools and determines the positive and negative effects of the e-learning system. The research was conducted using a sample of 522 students from different age groups and different schools to measure the level of e-learning performance. The study showed that most students believe the effectiveness of e-learning in providing academic requirements during the pandemic is high. On the other hand, some obstacles affect the level of e-learning productivity, and plans must be developed to overcome them.
Thursday, September 30 14:40 - 16:00 (Asia/Bahrain)
Thursday, September 30 16:00 - 17:40 (Asia/Bahrain)
- 16:00 A Comparative Review of Energy Management Controllers in Building
- In this paper, the concept of a building's energy and power demand is highlighted, and an overview of all the recent possible control strategies available in the literature is classified and presented. The paper also illustrates the sub-types of each strategy, aiming to clarify the differences between them. Moreover, a detailed table of prioritized indicators is suggested for finally selecting the best-fitting energy management control strategy.
- 16:20 A Review of Model Predictive Control Strategy for Managing Building's Energy
- In this paper, the concept of building energy management is highlighted. An overview of the new enhanced hierarchized hybrid model predictive controller for buildings is discussed. The contribution of this paper concerns shedding light on an enhanced model predictive control strategy which is the hierarchized hybrid made up of multi-layers and time scales. It contains complete details about its developed typologies, methods as well as the architecture of the hierarchized hybrid predictive model.
- 16:40 A Prototype to Produce an Integrated GIS-SUE Map
- Subsurface Utility Engineering (SUE) is an international model for presenting and preparing maps and engineering drawings that contain underground utilities. SUE classifies the accuracy of subsurface features into four categories based on their data collection method, which indicates the degree of confidence in their actual location; however, SUE does not provide attributes for spatial analysis. On the other hand, the Geographic Information System (GIS) is a primary tool to manage, operate, and maintain utilities, but GIS-utility maps do not provide the accuracy levels of subsurface utility features. This article aims to design a systematic prototype approach to produce a GIS-utility map that supports the quality levels of subsurface features based on SUE standards. The research methodology is to design two prototype algorithms: the first produces a GIS-SUE integrated map by adding the SUE standards to the GIS-utility map, and the second converts a classical SUE map to a GIS-SUE integrated map. These prototypes were designed in the high-level Unified Modeling Language (UML); the UML charts could be translated to any programming language depending on available software and budget. The data used in this research were obtained from actual fieldwork done with the Ministry of Works in Bahrain, and the produced prototypes are the main results of this research. Upon implementation and execution, the proposed algorithms would represent a future framework for subsurface utility mapping systems, the backbone of future urban infrastructure. The new system can be used to facilitate spatial analysis on the integrated map, and the proposed outcome of this integration might be a new decision support system that helps the user determine the cost and time to operate or maintain existing utilities. Finally, some subsurface point features were mapped using MATLAB code (point feature type).
The tabular data were then represented as a digital spatial map using ArcGIS software. Developing a mathematical-statistical model for accuracy assessment remains a challenge.
- 17:00 A benchmark of GRU and LSTM networks for short-term electric load forecasting
- Recently, electric power systems have been modernised through integration with distributed energy systems that have intermittent characteristics. Herein, short-term electric load forecasting (STLF), which covers hour-, day-, or week-ahead predictions of electric loads, is a crucial piece of the modern power system puzzle, whose complexity has become more and more sophisticated owing to the incorporation of microgrids and smart grids. Due to the nonlinear nature of electric loads and the uncertainties in modern power systems, deep learning algorithms are frequently applied to the STLF problem, an arduous challenge affected by several factors. In this paper, gated recurrent unit (GRU) and long short-term memory (LSTM) networks are implemented to forecast hour-ahead electric loads of a large hospital complex located in Adana, Turkey. Overall results of the benchmark of GRU and LSTM networks for STLF revealed that GRU networks performed better in terms of mean absolute percentage error (MAPE) by 7.8% and computational time by 15.5% in comparison with LSTM networks.
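The hour-ahead forecasting setup and the MAPE metric used in this benchmark can be sketched as follows (a minimal illustration; the windowing length and data are assumptions, not the authors' configuration):

```python
import numpy as np

def make_windows(series, lookback):
    """Frame a load series for hour-ahead forecasting: each sample is
    `lookback` past hourly loads and the target is the next hour's load."""
    X = np.array([series[i:i + lookback] for i in range(len(series) - lookback)])
    y = np.array(series[lookback:])
    return X, y

def mape(actual, forecast):
    """Mean absolute percentage error, the metric used to compare GRU and LSTM."""
    actual = np.asarray(actual, dtype=float)
    forecast = np.asarray(forecast, dtype=float)
    return 100.0 * np.mean(np.abs((actual - forecast) / actual))
```

Either a GRU or an LSTM network would then be trained on the (X, y) pairs, and the model with the lower MAPE on held-out hours preferred.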
- 16:00 Artificial Intelligence Composer
- In this study, classical music has been investigated, mainly based on pieces by the well-known composers Mozart and Beethoven, and an AI composer based on Markov chains and RNNs has been proposed. AI is an efficient tool in science and technology for many specific applications, including the music field. The database was collected from 25 classical music sheets, with the notes separated into two groups: right hand and left hand. The database includes the notes together with their frequencies and durations. The transition probability of each note was calculated; after random selection of the first note, the following notes were generated by means of the transition matrix. According to the results, both methods show an adequate level of quality in the notes generated by the AI composer. The authors recommend using Markov chains when a simple but efficient tool fits the design criteria.
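The note-generation procedure described above — estimate transition probabilities, pick a random first note, then walk the transition matrix — can be sketched as follows (a minimal illustration with made-up note names, not the authors' implementation):

```python
import random
from collections import defaultdict

def build_transition_matrix(notes):
    """Count how often each note follows another and normalise the
    counts into transition probabilities."""
    counts = defaultdict(lambda: defaultdict(int))
    for cur, nxt in zip(notes, notes[1:]):
        counts[cur][nxt] += 1
    matrix = {}
    for cur, nxts in counts.items():
        total = sum(nxts.values())
        matrix[cur] = {nxt: c / total for nxt, c in nxts.items()}
    return matrix

def generate(matrix, length, seed=None):
    """Select the first note at random, then draw each following note
    from the transition matrix."""
    rng = random.Random(seed)
    note = rng.choice(list(matrix))
    melody = [note]
    for _ in range(length - 1):
        nxts = matrix.get(note)
        if not nxts:  # dead end: restart from a random note
            note = rng.choice(list(matrix))
        else:
            note = rng.choices(list(nxts), weights=list(nxts.values()))[0]
        melody.append(note)
    return melody
```

In the paper this would be done separately for the right-hand and left-hand note sequences, with durations handled alongside pitches.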
- 16:20 Reference Points Generated on Unit Hypersurfaces for MaOEAs
- This paper proposes a method to uniformly generate reference points on a hypersurface for many-objective optimization evolutionary algorithms (MaOEAs). Recently, MaOEAs have been proposed that obtain selection pressure in a multidimensional objective space by using a reference point set, but there is no method for generating a reference point set that incorporates user orientation. This paper proposes a method for generating uniform reference points on unit hyperspheres and unit hyperplanes in a multidimensional objective space. The proposed method is applied to the multi-objective GP problem with NSGA-III and to the multi-objective combinatorial optimization problem with MOEA/D. As a result, we confirm that the proposed method gives non-inferior results compared to conventional methods. Since the proposed method can easily incorporate user orientation, these results demonstrate its effectiveness.
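A standard way to draw uniform points on the two hypersurfaces mentioned (unit hypersphere and unit hyperplane/simplex) is sketched below; this is a generic construction, not necessarily the paper's generator:

```python
import math
import random

def point_on_hypersphere(dim, rng):
    """A normalised Gaussian vector is uniformly distributed on the
    unit hypersphere."""
    v = [rng.gauss(0.0, 1.0) for _ in range(dim)]
    norm = math.sqrt(sum(x * x for x in v))
    return [x / norm for x in v]

def point_on_simplex(dim, rng):
    """Exponential spacings normalised to sum to 1 are uniformly
    distributed on the unit hyperplane (simplex) used by
    decomposition-based MaOEAs such as NSGA-III and MOEA/D."""
    e = [rng.expovariate(1.0) for _ in range(dim)]
    s = sum(e)
    return [x / s for x in e]
```

User orientation could then be incorporated by biasing or filtering the generated points toward preferred objective regions.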
- 16:40 E-commerce Product Recommendation Based on Product Specification and Similarity
- Recommender systems play the role of leading users to customized suggestions in the broad universe of available possibilities. While producers use them for cross-selling, which suggests additional products or services to customers, consumers use recommender systems to seek items that match their interests and preferences. By establishing a value-added relationship between the system and the customer, recommender systems boost loyalty. In present e-commerce systems, user pattern search and item and historical analysis are substantial components of a recommendation system. A better recommendation system based on product specifications and product similarity measures, rather than historical data, could lead to a progressive change in e-commerce recommendation technologies. This paper proposes a model that uses product specifications and various similarity measures to compute user recommendations. The model considers product description and specifications to calculate a similarity measure and then uses these similarity values to form clusters of products. Based on the generated clusters, relevant products are recommended to the user. The paper presents an analysis of the various measures and metrics using a sample data set and compares the results of the proposed model with the traditionally followed model. The proposed methodology promises to build a user-friendly recommendation system.
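The specification-similarity step can be illustrated with cosine similarity over hypothetical spec vectors (product names and feature values are invented for the sketch; the paper evaluates several similarity measures and then clusters):

```python
import math

def cosine_similarity(a, b):
    """Cosine of the angle between two specification vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def recommend(target, catalogue, k=2):
    """Rank all other products by spec similarity to the target and
    return the top-k names."""
    scores = sorted(
        ((cosine_similarity(catalogue[target], vec), name)
         for name, vec in catalogue.items() if name != target),
        reverse=True)
    return [name for _, name in scores[:k]]
```
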
- 17:00 LSTM and Ensemble Based Approach for Predicting the Success of Movies Using Metadata and Social Media
- Social media platforms such as Twitter offer a wealth of information on people's choices. Because of social media's growing acceptance and popularity, extracting information from data produced on social media has emerged as a prominent research topic, and these massive amounts of data are used to build models that anticipate behavior and trends. On Twitter, people express their opinions about movies. In this study, a Long Short-Term Memory (LSTM) and ensemble-based approach was proposed to predict the success of movies using metadata and social media; both social media data and movie metadata were used. The metadata of a movie also plays an important role in predicting its success: IMDb ratings, the genre of the movies, and details about the awards that the movies won or were nominated for are some of the metadata used in addition to the tweets. LSTM, a neural network (NN) model, was applied to identify the sentiment value of the Twitter posts. Then, an ensemble approach was employed to predict the success of movies using the movie metadata and the results from the LSTM-based NN model. This combined model obtained 81.2% accuracy and outperformed the other implemented models.
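The late-fusion idea — combining the LSTM's sentiment output with a metadata-derived score — can be sketched as a weighted blend (the weights, threshold, and labels are assumptions for illustration, not the paper's ensemble):

```python
def ensemble_predict(sentiment_score, metadata_score, w_sent=0.5):
    """Blend the LSTM sentiment score with a metadata-derived score
    (both assumed to lie in [0, 1]) and threshold the blend."""
    blended = w_sent * sentiment_score + (1.0 - w_sent) * metadata_score
    return "hit" if blended >= 0.5 else "flop"
```
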
- 17:20 Big Data Analytics, Greedy Approach, and Clustering Algorithms for Real-Time Cash Management of Automated Teller Machines
- Automated Teller Machines (ATMs) often lack the required funds or malfunction, which affects the customer experience and the reputation of the bank. Banks try to resolve such problems quickly through cash-in-transit companies that handle ATM refilling and maintenance. However, one of the largest dilemmas is determining the order in which to visit the ATMs and balancing the workload among the workforces during the day; in addition, real-time and urgent requests must be handled throughout the day. This problem was modelled as a real-time multiple Travelling Salesmen Problem (mTSP), with new constraints including traffic data, ATM priorities, and safety measures. We used big data analytics to extract useful features related to customer withdrawal trends and active locations from real data provided by a Bahraini bank. To solve this NP-hard problem, we proposed a brute-force method that generates optimal routes for limited-size problem instances of up to 35 ATMs. Moreover, a greedy technique was proposed to solve large instances considering one salesman. The obtained TSP route is then cut into clusters using unsupervised machine learning models: a modified version of k-Means has been applied with constraints to control the size of each cluster.
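The greedy single-salesman route and the subsequent cutting into size-capped clusters can be sketched as follows (nearest-neighbour routing and contiguous cutting are simplifications of the paper's greedy and constrained k-Means steps):

```python
import math

def greedy_route(coords, start=0):
    """Nearest-neighbour heuristic: from each ATM, visit the closest
    unvisited ATM next."""
    unvisited = set(range(len(coords))) - {start}
    route = [start]
    while unvisited:
        cur = route[-1]
        nxt = min(unvisited, key=lambda j: math.dist(coords[cur], coords[j]))
        route.append(nxt)
        unvisited.remove(nxt)
    return route

def cut_route(route, n_teams):
    """Cut the single-salesman route into contiguous, size-capped chunks,
    one per cash-in-transit team."""
    size = math.ceil(len(route) / n_teams)
    return [route[i:i + size] for i in range(0, len(route), size)]
```

Traffic, priority, and safety constraints would enter as penalty terms in the distance function rather than plain Euclidean distance.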
- 16:00 Application of Artificial Intelligence in Digital Breast Tomosynthesis and Mammography
- Medical data mining comprises methods for extracting data from healthcare databases to help clinicians reach the best diagnosis. Among cancer diseases, breast cancer has been one of the deadliest diseases in the world in recent years; therefore, data mining techniques form the largest part of this study. To complement previous research related to breast cancer detection, this paper proposes a model that helps determine the degree of risk of the disease and obtain the best results, while also aiming to reduce the cost and time of diagnosis. The experiment used the Wisconsin Breast Cancer Diagnostic (WBCD) dataset, computed from digitized fine needle aspirate (FNA) images of breast masses and available in the UCI machine learning repository. The model applies classification techniques to the collected breast cancer data, which, in turn, predict the severity of a patient's breast cancer. In addition, this paper classifies and diagnoses cancers using deep learning algorithms alone, hybrid machine learning algorithms, and feature selection methods (applying correlation rules to find out which traits relate to breast cancer severity).
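The correlation-based feature selection mentioned at the end can be sketched as follows (a generic Pearson-correlation filter with an assumed threshold, not necessarily the paper's exact rules):

```python
import numpy as np

def select_by_correlation(X, y, threshold=0.5):
    """Keep the indices of features whose absolute Pearson correlation
    with the diagnosis label meets the threshold."""
    keep = []
    for j in range(X.shape[1]):
        r = np.corrcoef(X[:, j], y)[0, 1]
        if abs(r) >= threshold:
            keep.append(j)
    return keep
```

The retained columns would then feed the downstream classifiers.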
- 16:20 A Review of Malicious Altering Healthcare Imagery using Artificial intelligence
- During the second half of 2020, healthcare was, and has remained, the number one target for cybercrime: the number of cyberattacks on hospitals and health systems increased enormously, and specialists believe there are more to come. Attackers with access to medical records can do much more than hold the data for ransom or sell it on the underground economy; they can hold hostage the systems and the sensitive data they contain, with a significant impact on operations. In this review paper, we show how an attacker can use deep learning to add or remove evidence of medical conditions from medical scans and reports. An attacker may carry out such an act to stop a political candidate, hold up investigations, commit insurance fraud, perform an act of violence, or even commit homicide. Many related studies focused on GAN techniques, published between 2000 and 2021, are reviewed, with emphasis on evaluating the attack and on injecting and removing evidence from patients' medical image scans. Many papers showed how hospital systems, physicians, radiology specialists, and even state-of-the-art deep learning AI are highly vulnerable to the attack.
- 16:40 Generative Adversarial Networks (GAN) for Arabic Calligraphy
- Arabic calligraphy is one of the most aesthetic art forms in the world due to its variety and long history. However, generating calligraphic styles is mainly done by expert human calligraphers (known as Khattat) and has not been carried out by machine learning techniques. Generative adversarial networks (GANs) are deep learning tools that have achieved outstanding results in the field of style transfer and generation. In this paper, various GAN architectures, such as CycleGAN, Pix2pix, and deep convolutional generative adversarial networks (DCGAN), were investigated for Arabic calligraphy in two aspects: generation and style transfer. The experiments were limited to two styles: Naskh and Thulth. A tool was also created to remove noise from calligraphy papers, which is necessary for building a training dataset. The models are evaluated qualitatively using a preference-judgment survey.
- 17:00 Linear Regression and Counterfactual Fairness
- Over the last several years, society has begun to grapple with the extent to which human prejudices might infiltrate Artificial Intelligence (AI) systems, with potentially disastrous consequences. Being acutely aware of such hazards and striving to eliminate them is an essential concern at a time when many firms are aiming to adopt AI systems across their operations. Non-smooth regularized convex optimization algorithms have been developed as a strong tool for recovering structured signals from noisy linear measurements. If the studied data is unbiased and of finite variance, linear regression produces minimum-variance, unbiased estimates of the adjustable parameters; if the data is normally distributed, the estimated parameters will also be normally distributed. However, before fitting, data is usually transformed, and even if the original data are normal, the transformed data may not be. As a result, even when the transformed data are appropriately weighted in accordance with the transformation, least-squares analysis of such data gives skewed, non-normally distributed parameters. In this research study, many strategies are investigated and evaluated for developing a novel regularized linear regression model in the setting of a data set with a modest sample size compared to the number of parameters. The presented research paves the way for regularized linear models in cases with a finite number of data values, as biases often remain in such cases, implying that the estimators are not only flawed but also inconsistent.
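A canonical regularized linear regression for the small-sample regime is ridge regression in closed form (a generic baseline sketch, not the novel model proposed in the paper):

```python
import numpy as np

def ridge_fit(X, y, lam):
    """Closed-form regularized least squares:
    w = (X^T X + lam * I)^{-1} X^T y.
    The penalty lam stabilises the estimate when the sample size is
    modest relative to the number of parameters."""
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)
```

As lam grows, coefficients shrink toward zero, trading a small bias for a large reduction in variance.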
- 16:00 Combatting Resistance to Change During the COVID19 Pandemic with Design Thinking Approach: Making a Case for the Public Sector
- The human mind thrives on distraction for a change. Yet, counterintuitively, any alteration from the regular or routine baffles mankind and is perceived by default as a problem that triggers automatic resistance. Conventionally defined problems generate conventional solutions, which usually do not last. Contrarily, a problem defined by those most affected by it, or by living the experience of the affected ones, yields richer insights and far longer-lasting solutions. The early 2020 quarantines and social distancing practices, adopted globally in response to the spread of COVID-19, resulted in major disruption of workflow worldwide across public and private sectors, even with digitalized operations. To solve the problems arising from this scenario, the current study used a design thinking approach to find innovative and lasting solutions with wide applicability. The human-centric core of this design investigates resistance to change due to the COVID-19 pandemic by understanding human mindsets, needs, and limitations. Engaging a purpose-led participatory research design, qualitative data on why people resist change were collected using ethnographic tools with focus groups of employees from the Ministry of Education and the Ministry of Health, and quantitative data were collected from these and other public sectors using a survey. With a sample of 34 participants who volunteered to take part in the study in a short span of time, the paper culminates in proposing solutions that can be prototyped for testing and refined before being generalized for wider implementation. The design thinking approach adopted thus aims to establish transition guidelines for managing future organizational change with minimal resistance.
- 16:20 Post COVID-19 Work Transformation Behavior for Optimum Performance in the Public Sector in the Kingdom of Bahrain: A Design Thinking Approach
- The COVID-19 pandemic has brought changes to administrative routine in the public sector, and new norms have taken hold at the workplace. A portion of employees are required to work from home (WFH) due to the precautionary measures, which implies that communication relies heavily on technology-mediated platforms. The new working setup has posed a new challenge to supervisors in distributing tasks to employees. This study aims to investigate the challenges of maintaining and optimizing the performance of the public sector under this new norm. Six online semi-structured interviews were conducted using Microsoft Teams with supervisors and employees from the public sector to better understand this challenge. The analysis showed that organizational factors, supervisor factors, peer factors, and infrastructure factors played a significant role in determining the efficiency of task distribution. Based on the findings, it is recommended to set up a detailed WFH policy and to utilize Microsoft Office Planner to address the problem.
- 16:40 Social Distancing Compliant Virtual Queuing System for Public Services
- Many services provided to the public involve gathering in a waiting area to be fairly served on a first-come, first-served basis, i.e. queueing. Over time, the process has been improved by electronic systems that generate sequenced numbers to ensure fairness, and other advances have made the waiting area environment more relaxing and hospitable. Such efforts have been successful to some extent, but they still rely on the customers' physical presence in a waiting area. The recent crisis due to the COVID-19 pandemic and the necessity for everyone to observe social distancing posed a new challenge: the space provided by waiting areas has become insufficient. It has been noticed on several occasions that people had to break social distancing restrictions due to space limitations to complete a service, causing health risks to themselves and others, including service staff. Such risks are amplified in healthcare facilities serving diseased, sick, and immunocompromised people. To address this new challenge and provide an environment more compliant with social distancing restrictions, a design thinking approach is used to develop a customer-focused digital transformation solution. The proposed solution enhances the queuing process and replaces much of the physical presence in the waiting area with a dynamic digital application. The application enables customers to enter the queue remotely and synchronize their physical location and attendance with the service availability time and distance, allowing customers to utilize their time better and even be physically elsewhere while still keeping their place in the service queue. The system determines who should be present in the waiting area according to space and service availability, while still ensuring fairness.
This solution is expected to reduce the number of people physically present at the service area, so social distancing can be better observed. At the same time, the solution provides customers with a waiting location of their choice and comfort, and is expected to increase customers' safety and service satisfaction.
- 17:00 Sustaining Work Continuity through Hybrid Work Environments: Tracking Systems
- Work continuity is one of the most critical issues that have emerged due to the COVID-19 pandemic. The different public and private sectors have suffered from the consequent disruption, leading to numerous cases of work discontinuity. The pandemic experience is an example of a national crisis in which sustaining work and duties poses a serious challenge, but it has also introduced the concept of remote and hybrid working. This paper explores sustaining work continuity through hybrid work environments by analyzing the different challenges and threats that would lead to work discontinuity, including culture, technology, and geography, focusing on the public sector experience in the Kingdom of Bahrain. It mainly questions the efficiency and quality of services provided by public sector institutions during the pandemic, in addition to employees' satisfaction with the continuity of remote work as a permanent alternative to traditional in-person attendance. In order to deduce feasible and pragmatic solutions, the adopted methodology integrates two approaches to analyze data, relying substantially on the Institute of Public Administration in Bahrain (BIPA) and other academic resources. First, the collected data is analyzed through the Design Thinking approach, and then the addressed problems and possible solutions are validated through two Delphi Protocol rounds. In terms of building hybrid environments, a model has been adopted based on employee and customer satisfaction, extracting a number of indicators related to the sustainability of work continuity. The paper concludes with recommendations that would improve work continuity during and after crises through hybrid work systems and policies, assuring smooth work performance and effective internal and external communication.
- 17:20 The Effectiveness of the Performance Appraisal of Public Sector Employees During Covid-19 Pandemic
- The spread of the Covid-19 virus has obligated many organizations to restructure the means of conducting daily tasks and responsibilities. Employees were given the possibility of providing services online from home; nevertheless, their work performance needs to be assessed using a transparent and fair policy. This paper builds on previous research from the Bahrain Institute of Public Administration (BIPA), which measured the impact of the COVID-19 pandemic on employee performance in the public sector of the Kingdom of Bahrain, especially those employees who graduated from the National Program for the Development of Governmental Leadership. The purpose of this paper is to study the effectiveness of the current appraisal system in measuring the performance of employees working remotely in the public sector. The researchers used a mixed-method approach; a questionnaire was developed and validated to collect data. The results show that the majority of respondents believe that the evaluation process needs to be revised and improved to align it with the organization's vision and mission: the current appraisal system is not motivating and therefore somewhat ineffective. Results from this study can help the Kingdom's public sector set an adequate and effective method of evaluating employees' performance. The paper highlights the importance of involving all stakeholders in strategically developing the plans and objectives of the organization, which should reflect its overall goals.
- 16:00 Design and Implementation of Inverse Kinematics Algorithm to Manipulate 5-DOF Humanoid Robotic Arm
- Inverse Kinematics (IK) is a mathematical approach to computing the joint angles of an arm from a given position and orientation of the wrist in space. This paper presents the implementation of a new analytical Inverse Kinematics algorithm to manipulate a new 3D-printed 5-DOF Humanoid Robotic Arm (HRA). The proposed IK algorithm is programmed in MATLAB to find the joint angles, which are used to control the servos of the HRA via an Arduino microcontroller. Six desired positions are used as targets to be reached in an experimental workspace in the lab. The IK algorithm is assessed by calculating the Root Mean Squared Error (RMSE) of the absolute error vector over the six positions. Results show that the minimum achieved RMSE for the arm's end-effector position is 0.5774.
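The flavour of an analytical IK solution and the RMSE assessment can be shown on a simplified planar 2-link arm (a textbook reduction for illustration, not the paper's 5-DOF HRA solution):

```python
import math

def ik_2link(x, y, l1, l2):
    """Analytical IK for a planar 2-link arm: returns the shoulder and
    elbow angles (elbow-down branch) that place the end effector at (x, y)."""
    c2 = (x * x + y * y - l1 * l1 - l2 * l2) / (2.0 * l1 * l2)
    theta2 = math.acos(max(-1.0, min(1.0, c2)))
    theta1 = math.atan2(y, x) - math.atan2(l2 * math.sin(theta2),
                                           l1 + l2 * math.cos(theta2))
    return theta1, theta2

def rmse(errors):
    """Root mean squared error over end-effector position errors."""
    return math.sqrt(sum(e * e for e in errors) / len(errors))
```

The 5-DOF case adds three more joints, but the assessment is the same: compute the angles, drive the servos, measure the position error at each target, and report the RMSE.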
- 16:20 Implementation of a WGAN-GP for Human Pose Transfer using a 3-channel pose representation
- The computational problem of Human Pose Transfer (HPT) is addressed in this paper. HPT has recently become an emerging research topic with applications in fields like fashion design, media production, animation, and virtual reality. Given the image of a human subject and a target pose, the goal of HPT is to generate a new image of the human subject in the novel pose; that is, the target pose is transferred to the human subject. HPT is carried out in two stages: in stage 1, a rough estimate is generated, and in stage 2, the rough estimate is refined with a generative adversarial network. The novelty of this work is the way pose information is represented. Earlier methods used computationally expensive pose representations like 3D DensePose and 18-channel pose heatmaps; this work uses a 3-channel colour image of a stick figure to represent the human pose, with different body parts encoded in different colours. The convolutional neural networks now only have to recognize colours, and since these colours encode body parts, the network eventually also learns the positions of the body parts.
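The 3-channel pose encoding can be sketched by rasterising joints with one colour per body part (the colour map and joint names below are assumptions; the paper draws full stick-figure segments rather than single pixels):

```python
import numpy as np

# Hypothetical part-to-colour map; a real model would cover all limbs.
PART_COLOURS = {"head": (255, 0, 0), "torso": (0, 255, 0), "right_arm": (0, 0, 255)}

def pose_to_image(joints, size=64):
    """Rasterise named joint coordinates into a 3-channel pose image,
    one colour per body part."""
    img = np.zeros((size, size, 3), dtype=np.uint8)
    for part, (row, col) in joints.items():
        img[row, col] = PART_COLOURS[part]
    return img
```

This single RGB image replaces an 18-channel heatmap stack, which is what makes the representation cheap.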
- 16:40 Image Recognition System Using Neural Network Techniques: an Overview
- The huge demand for image recognition systems has led to an increased need to raise the efficiency of this technology and to find solutions that reduce the deficiencies and weaknesses of image recognition systems. This field of research confronts many challenges and issues that can affect the accuracy of such systems. This paper reviews several studies and approaches to system optimization for both object and face recognition systems; the main difference between these areas is that object recognition is concerned with classes of objects rather than a specific instance, as in face recognition. The review then illustrates Convolutional Neural Network (CNN) aspects, analyzes biometric technology, and compares several solution methods based on CNN taxonomies. Finally, it presents some of the challenges and potential future opportunities in the object and face recognition research field. The main objective is to highlight the important technologies used to build a robust image recognition system.
- 17:00 Principal Components-Artificial Neural Network in Functional Near-Infrared Spectroscopy (fNIRS) for Brain Control Interface
- Functional near-infrared spectroscopy (fNIRS) is a non-invasive brain imaging technology that is widely utilized in Brain Control Interface (BCI) applications. Feature extraction is crucial to remove unwanted signals and improve the accuracy of a machine learning algorithm in BCI. Although principal component analysis (PCA) is a popular feature extraction method in near-infrared spectroscopy, PCA is rarely studied in fNIRS. Thus, this study compared fNIRS-based BCI models that used PCA with models that used statistical features for the classification of four mental activities. First, PCA was applied to transform pre-processed fNIRS signals into a few principal components, which were the inputs of an artificial neural network (ANN), forming PCs-ANN. Three different combinations of fNIRS signals were used to study the performance of PCs-ANN using 10-fold cross-validation. The best PCs-ANN was compared with an ANN that used statistical features. The findings show that PCs-ANN outperformed the ANN that used statistical features in the BCI classification application.
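The PCA step that produces the ANN inputs can be sketched with an SVD-based projection (a generic PCA; the component count and data shapes are assumptions):

```python
import numpy as np

def pca_transform(X, n_components):
    """Project the signals onto the top principal components (via SVD of
    the mean-centred data); the scores become the ANN input features."""
    Xc = X - X.mean(axis=0)
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T
```

The few resulting score columns, rather than the raw channels, are then fed to the ANN classifier.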
- 17:20 Automatic Classification of Sleep Stages using EEG Sub-bands based Time-spectral Features
- Sleep scoring has proved to have a major impact on treating various sleep-related disorders, but carrying out this task manually is a very time-consuming process. Hence, an efficient computer-based system is required to carry out epoch-based multi-class sleep stage classification. Among all the polysomnography (PSG) signals, the Electroencephalogram (EEG) provides valuable information for sleep analysis by sensing and monitoring brain functions. Hence, in this study, an effective computer-assisted technique is proposed for classifying the various sleep stages. First, the input signal is segmented into 30-second epochs as per the Rechtschaffen and Kales criteria (1968). From the six EEG sub-bands, five features are extracted: normalized power, movement, mean absolute deviation, inter-quartile range, and the Fourier synchrosqueezed transform. The feature vector is subjected to 10-fold cross-validation for 2-class to 6-class classification, and accuracy, sensitivity, specificity, and Cohen's Kappa statistic are computed using an SVM classifier. The highest accuracies of 98.4%, 95.8%, 94.3%, 93.4%, and 92.5% are achieved for 2-class to 6-class classification, respectively. Subject-specific results are also computed for the 5-class problem, for which the F1 score is evaluated for each stage. The proposed method offers improved results compared with previous studies.
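Three of the five per-sub-band features can be computed per 30-second epoch as follows (a minimal sketch; the movement and Fourier synchrosqueezed transform features are omitted):

```python
import numpy as np

def epoch_features(epoch, total_power):
    """Normalised power, mean absolute deviation, and inter-quartile
    range of one 30-second sub-band epoch."""
    power = np.mean(np.square(epoch))
    mad = np.mean(np.abs(epoch - np.mean(epoch)))
    q75, q25 = np.percentile(epoch, [75, 25])
    return {"norm_power": power / total_power,
            "mad": mad,
            "iqr": q75 - q25}
```

Stacking these values across the six sub-bands yields the feature vector that the SVM classifies.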
- 16:00 Fuzzy Financial Fraud Risk Governance System in an Information Technology Environment
- Financial fraud risk assessment requires expertise in audit methodology and in the risk assessment of business processes. The assessment of financial fraud risks is a significant challenge for independent, external auditors, chiefly when dealing with an audit in an information technology (IT) environment. A meta-analysis financial fraud risk governance model that takes an IT environment into account and is based on a fuzzy inference system is proposed in this paper. Fuzzy set theory and fuzzy logic are employed to deal with actual business audits, which do not always involve dichotomous fraud risk conditions. The inputs of the proposed fuzzy financial fraud risk governance system are meta-analysis input factors related to the fundamental domains of risk in IT audit, comprising (i) effective identity and access protocols, (ii) system development, (iii) control of business operations, and (iv) change in systems or applications. Results demonstrate that the proposed approach supports operational risk management and promotes operational efficiency by identifying, measuring, and disclosing events (risk conditions) in terms of both qualitative (stratification) and quantitative (score) analysis. The fuzzy IT financial fraud risk system is able to act as a first barrier reflecting the adequacy of the information technologies and systems used to avoid risk in financial fraud governance.
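A fuzzy scoring step of this kind can be sketched with two of the four input domains (the membership bounds, the single rule, and the output levels are invented for illustration, not the paper's rule base):

```python
def ramp_down(x, lo, hi):
    """Membership degree: 1 below lo, falling linearly to 0 at hi."""
    if x <= lo:
        return 1.0
    if x >= hi:
        return 0.0
    return (hi - x) / (hi - lo)

def fraud_risk(access_control, change_mgmt):
    """Rule: IF access control is weak OR change management is weak
    THEN risk is high. Rule strengths are combined with max and the
    result is defuzzified as a weighted average of two output levels."""
    weak_access = ramp_down(access_control, 0.2, 0.6)
    weak_change = ramp_down(change_mgmt, 0.2, 0.6)
    high = max(weak_access, weak_change)
    low = 1.0 - high
    return (high * 0.9 + low * 0.1) / (high + low)  # score in [0.1, 0.9]
```

The continuous score, rather than a dichotomous flag, is what lets the system stratify risk conditions qualitatively and quantitatively at once.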
- 16:20 Robust and Secure AMI Framework Model
- The smart metering system in the smart grid requires securing the transmission of data from customers to the service provider in order to maintain consumer privacy. On the other side, operators must be able to safely manage the smart metering system equipment, as it is one of the most important infrastructures in smart electricity grids. In this work, we investigate a smart meter authentication and secure information exchange model against a combination of attacks such as replay, data modification, and eavesdropping attacks. A new feature has been added to secure device management in the system by protecting it from TCP SYN-flood attacks and by using the time synchronization protocol in the most appropriate way to reduce bandwidth consumption.
- 16:40 Securing SCADA Systems against Cyber-Attacks using Artificial Intelligence
- Monitoring and managing electric power generation, distribution, and transmission requires supervisory control and data acquisition (SCADA) systems. As technology has developed, these systems have become large, complicated, and distributed, which makes them susceptible to new risks. In particular, the lack of security in SCADA systems makes them a target for network attacks such as denial of service (DoS); developing solutions for this issue is the main objective of this work. After reviewing various existing solutions for securing SCADA systems, a new security approach is recommended that employs Artificial Intelligence (AI), an innovative approach that imparts learning ability to software. Deep learning and machine learning algorithms are used to develop an intrusion detection system (IDS) to combat cyber-attacks, and various methods and algorithms are evaluated to obtain the best intrusion detection results. The results reveal that the Bi-LSTM IDS technique provides the highest intrusion detection (ID) performance compared with previous techniques for securing SCADA systems.
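To make the Bi-LSTM idea concrete, the following is a toy forward pass, not the authors' trained model: a single-unit LSTM is run over a sequence of scalar traffic features once forwards and once backwards, and the two final hidden states are combined into an intrusion score. All weights (including the fixed 0.5 recurrent weight and the 2.0 read-out weights) are illustrative assumptions; a real IDS would learn them from labeled SCADA traffic.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def lstm_pass(seq, w):
    """Run a 1-unit LSTM over scalar features; return the final hidden state.

    w = (wi, wf, wo, wc) are the input weights for the input, forget, and
    output gates and the candidate cell state; the recurrent weight is
    fixed at 0.5 for brevity.
    """
    h = c = 0.0
    for x in seq:
        i = sigmoid(w[0] * x + 0.5 * h)    # input gate
        f = sigmoid(w[1] * x + 0.5 * h)    # forget gate
        o = sigmoid(w[2] * x + 0.5 * h)    # output gate
        g = math.tanh(w[3] * x + 0.5 * h)  # candidate cell state
        c = f * c + i * g
        h = o * math.tanh(c)
    return h

def bilstm_score(seq, w_fwd, w_bwd):
    """Concatenate forward and backward passes, then a linear read-out."""
    h_f = lstm_pass(seq, w_fwd)
    h_b = lstm_pass(list(reversed(seq)), w_bwd)
    return sigmoid(2.0 * h_f + 2.0 * h_b)  # score > 0.5 -> flag as intrusion
```

The bidirectional pass is what lets the classifier weigh both the packets preceding and those following each point in the traffic window, which is the property credited for the Bi-LSTM's strong ID performance.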
- 17:00 A Comparative Review of Security Threats Datasets for Vehicular Networks
- With the rapid growth of vehicular technology, Vehicle-to-everything (V2X) communication systems are becoming increasingly challenging, especially regarding security. Using Machine Learning (ML) techniques to build Intrusion Detection Systems (IDS) has shown high accuracy in mitigating attacks on V2X communications. However, the effectiveness of ML-based IDSs depends on the availability of sufficient relevant network traffic logs, covering a wide variety of normal and abnormal samples, to train and validate these models. In this paper, we provide an up-to-date review of existing V2X security datasets, classifying them according to the targeted architecture, the attacks involved, their severity, and other criteria. Based on these effectiveness criteria, we recommend four distinct yet realistic and reliable datasets: ROAD, VDDD, VeReMi, and VDOS-LRS.
Thursday, September 30 16:00 - 16:40 (Asia/Bahrain)
S5-G KS: Keynote Speaker-4: Co-education in Engineering: how to promote equality and diversity in STEAM careers
The underrepresentation of women in STEM (Science, Technology, Engineering, and Mathematics) careers is a global problem that is being studied and addressed through various initiatives. This talk presents an analysis of the factors that influence the gender gap in STEM studies, in terms of career choice and retention as well as the supports and interventions that promote diversity and inclusion. It also highlights some of the main challenges and initiatives that could help narrow the gender gap in STEM careers, among which co-education stands out.
Thursday, September 30 16:40 - 17:40 (Asia/Bahrain)
- 16:40 Technology Adoption Intention as a Driver of Success of Women Architect Entrepreneurs
- Very few studies directly address the effects of technology adoption intention on the success of women entrepreneurs, specifically in the Indian context. The current study examines the linkage between technology adoption intention and its antecedents and the success of a niche and largely unexplored segment of women entrepreneurs: architects. Using a modified form of the Unified Theory of Acceptance and Use of Technology (UTAUT) model, this study applies structural equation modeling to test the proposed model, which consists of the following constructs: Mental Access to Technology, Technical Skills, Performance Expectancy, Effort Expectancy, Facilitating Conditions, Social Influence, Technology Adoption Intention, and Women Entrepreneurial Success. Data were collected from 188 respondents using the chain-referral sampling method. The study offers a better understanding of technology adoption that can help reduce the barriers women architects face in adopting technology and inform strategies promoting entrepreneurial success for women architects working across India. Keywords: Architects; Professional Success; SEM; Technology Adoption Intention; UTAUT; Women Entrepreneurs
- 17:00 Investigating The Effects Of Societal Perceived Gender Differences On Female Entrepreneurship - Case Of Bahrain
- This paper explores societally perceived gender differences and their effects on female entrepreneurship. The literature has consistently documented entrepreneurship as a phenomenon rooted in social and geo-cultural environments. Furthermore, societally perceived gender differences are increasingly emphasized as strong influencers of entrepreneurial engagement around the globe, for both male and female entrepreneurs. However, the effects of these societally assigned roles are more pronounced for female entrepreneurs, who, according to the literature, deal with the magnified effects of such gender roles and the disadvantages brought about by the social setting in their place of business. This paper examines these issues through one-on-one in-depth interviews with 30 female entrepreneurs in the Kingdom of Bahrain. As a result of the study, the following are identified as major factors affecting female entrepreneurship: 1) economic conditions and 2) societal attitudes.
Thursday, September 30 17:40 - 18:00 (Asia/Bahrain)
Thursday, September 30 18:00 - 18:30 (Asia/Bahrain)
KS-Prof Jay: Keynote Speaker: "Industrial AI and Resilient Manufacturing Systems - Technologies, Challenges and Research Issues"
Today, many manufacturing companies face increasing challenges in managing their global operations due to several factors, including the pandemic, geopolitics, workforce issues, and other unknowns. This necessitates new thinking and technologies to make manufacturing systems smarter and more resilient. Industrial AI, Big Data Analytics, Machine Learning, and Cyber-Physical Systems are changing the way we design products, manufacturing, and service systems. It is clear that as more sensors and smart analytics software are integrated into networked industrial products, manufacturing, and maintenance systems, predictive technologies can further learn and autonomously optimize productivity and performance. This presentation introduces Industrial AI for smart, resilient machines and manufacturing operations. First, a systematic Industrial AI approach is introduced. Case studies on advanced predictive analytics technologies for different manufacturing and maintenance operations are then demonstrated. In addition, research issues concerning data quality for high-performance, real-time data analytics in future predictive manufacturing and maintenance are discussed.