International Journal of Computer Science & Network Security (IJCSNS)
- Monthly
- 1738-7906 (pISSN)
Volume 24 Issue 9
-
This paper develops a generalized framework for the analysis of multicarrier communication systems, using a generic pair of transmitter- and receiver-side transforms, Q_T and Q_R, such that DFT-based "conventional OFDM" is a special case. This analysis framework is then used to propose and prove theorems on various performance metrics of a multicarrier communication system, which apply to any system that fits the architecture, as most do. The framework also derives previously unknown closed-form expressions for these metrics, such as how the performance degradation due to carrier frequency offset or timing synchronization error, among others, is a function of the generic transforms. While extensive work exists on the impact of these impairments on conventional OFDM, how they depend on the transform matrices has been unknown in the literature. It is shown how the analysis of OFDM-based systems is a special case of the analysis in this paper. This paper is Part I of a three-paper series, where the other two parts supplement the arguments presented here.
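As a minimal sketch of the generic transform-pair idea (block size, QPSK mapping and the ideal-channel assumption are illustrative, not taken from the paper), choosing the unitary IDFT/DFT pair for Q_T and Q_R recovers conventional OFDM:

```python
import numpy as np

N = 8
F = np.fft.fft(np.eye(N)) / np.sqrt(N)   # unitary DFT matrix
Q_T = F.conj().T                          # transmit transform (unitary IDFT -> OFDM)
Q_R = F                                   # receive transform

x = np.random.choice([1+1j, 1-1j, -1+1j, -1-1j], N)  # QPSK symbols
s = Q_T @ x                               # transmitted block
x_hat = Q_R @ s                           # ideal channel: Q_R Q_T = I
assert np.allclose(x, x_hat)
```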
-
Sentiment analysis has become a pivotal component in understanding public opinion, market trends, and user experiences across various domains. The advent of GPT (Generative Pre-trained Transformer) models has revolutionized the landscape of natural language processing, introducing a new dimension to sentiment analysis. This comprehensive roadmap delves into the transformative impact of GPT models on sentiment analysis tasks, contrasting them with conventional methodologies. With an increasing need for nuanced and context-aware sentiment analysis, this study explores how GPT models, known for their ability to understand and generate human-like text, outperform traditional methods in capturing subtleties of sentiment expression. We scrutinize various case studies and benchmarks, highlighting GPT models' prowess in handling context, sarcasm, and idiomatic expressions. This roadmap not only underscores the superior performance of GPT models but also discusses challenges and future directions in this dynamic field, offering valuable insights for researchers, practitioners, and AI enthusiasts. The in-depth analysis provided in this paper serves as a testament to the transformational potential of GPT models in the realm of sentiment analysis.
-
Precoding of orthogonal frequency division multiplexing (OFDM) with the Walsh-Hadamard transform (WHT) is known in the literature. Instead of performing WHT precoding and the inverse discrete Fourier transform separately, the product of the two matrices yields a new matrix that can be applied with lower complexity. This resultant transform, the T-transform, results in T-OFDM. This paper significantly extends the limited existing work on T-OFDM by presenting a detailed account of its computational complexity, a lower-complexity receiver design, an expression for the PAPR and its cumulative distribution function (cdf), the sensitivity of T-OFDM to timing synchronization errors, and novel analytical expressions for the signal-to-noise ratio (SNR) under multiple equalization techniques. Simulation results show significant improvements in PAPR performance, as well as improvement in bit error rate (BER) over the Rayleigh fading channel. This paper is Part II of a three-paper series on alternative transforms, and many of its concepts and results refer to and stem from the generalized multicarrier communication (GMC) system presented in Part I of this series.
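The precoder-merging idea can be illustrated with a rough sketch (under assumed unitary normalizations, not the paper's exact derivation): the WHT and IDFT matrices are multiplied once offline, and applying the product matches the two-stage chain.

```python
import numpy as np
from scipy.linalg import hadamard

N = 8                                    # number of subcarriers (power of 2)
W = hadamard(N) / np.sqrt(N)             # normalized Walsh-Hadamard matrix
F_inv = np.conj(np.fft.fft(np.eye(N))).T / np.sqrt(N)  # unitary IDFT matrix
T = F_inv @ W                            # T-transform: one matrix, two stages merged

x = np.random.choice([1+1j, 1-1j, -1+1j, -1-1j], N)    # QPSK symbols
s_two_stage = F_inv @ (W @ x)            # WHT precoding, then IDFT
s_t = T @ x                              # single T-transform application
assert np.allclose(s_two_stage, s_t)
```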
-
Rotsnarani Sethy;Soumya Ranjan Mahanta;Mrutyunjaya Panda 30
Building an accurate 3-D spatial road network model has become an active area of research nowadays, professing to be a new paradigm in developing smart roads and intelligent transportation systems (ITS) that will help public and private road operators achieve better road mobility and eco-routing, so that better road traffic, less carbon emission and road safety may be ensured. Dealing with such large-scale 3-D road network data poses challenges in getting accurate elevation information of a road network to better estimate CO2 emission and accurate routing for vehicles in an Internet of Vehicles (IoV) scenario. Clustering and regression techniques are found suitable for discovering the missing elevation information for some points in a 3-D spatial road network dataset, which is envisaged to give the public a better eco-routing experience. Further, Explainable Artificial Intelligence (xAI) has recently drawn the attention of researchers seeking models that are more interpretable, transparent and comprehensible, enabling the design of efficient choice-based models depending upon users' requirements. The 3-D road network dataset, comprising spatial attributes (longitude, latitude, altitude) of North Jutland, Denmark, collected from the publicly available UCI repository, is preprocessed through feature engineering and scaling to ensure optimal accuracy for the clustering and regression tasks. K-Means clustering and regression using a Support Vector Machine (SVM) with a radial basis function (RBF) kernel are employed for the 3-D road network analysis. Silhouette scores and the number of clusters are chosen for measuring cluster quality, whereas error metrics such as MAE (Mean Absolute Error) and RMSE (Root Mean Square Error) are considered for evaluating the regression method. For better interpretability of the clustering and regression models, SHAP (Shapley Additive Explanations), a powerful xAI technique, is employed in this research. From extensive experiments, it is observed that SHAP analysis validated the importance of latitude and altitude in predicting longitude, particularly in the four-cluster setup, providing critical insights into model behavior and feature contributions, with an accuracy of 97.22% and strong performance metrics across all classes, having an MAE of 0.0346 and an MSE of 0.0018. On the other hand, the ten-cluster setup, while faster in SHAP analysis, presented challenges in interpretability due to increased clustering complexity. Hence, K-Means clustering with K=4 combined with the SVM hybrid model demonstrated superior performance and interpretability, highlighting the importance of careful cluster selection to balance model complexity and predictive accuracy.
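A hypothetical sketch of the clustering-plus-regression pipeline described above (K-Means with K=4, then an RBF-kernel SVM regressor per cluster); the synthetic array stands in for the UCI coordinates, so the column roles are assumptions:

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.svm import SVR
from sklearn.preprocessing import StandardScaler
from sklearn.metrics import silhouette_score, mean_absolute_error

X = np.random.rand(1000, 3)            # stand-in for (longitude, latitude, altitude)
X = StandardScaler().fit_transform(X)  # feature scaling

km = KMeans(n_clusters=4, random_state=0).fit(X)
print("silhouette:", silhouette_score(X, km.labels_))

# Per cluster: predict longitude (column 0) from latitude and altitude.
for c in range(4):
    Xc = X[km.labels_ == c]
    svr = SVR(kernel="rbf").fit(Xc[:, 1:], Xc[:, 0])
    pred = svr.predict(Xc[:, 1:])
    print(f"cluster {c}: MAE = {mean_absolute_error(Xc[:, 0], pred):.4f}")
```
-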
This paper proposes a dual symbol superposition block carrier transmission with frequency domain equalization (DSS-FDE) system. The system is based upon the χ-transform matrix, obtained by combining the discrete Hartley transform (DHT) and discrete Fourier transform (DFT) matrices into a single matrix that is remarkably sparse: as will be shown in this paper, it only has non-zero entries on its principal diagonal and one below the principal anti-diagonal, giving it the shape of the letter χ. When multiplied with the constellation-mapped complex transmit vector, each entry of the resultant vector is a weighted superposition of only two entries of the original vector, as opposed to all entries in conventional DFT-based OFDM. Such a transmitter is close to single carrier block transmission with frequency domain equalization (SC-FDE), which is known to have no superposition. DSS-FDE offers remarkable simplicity in transmitter design and yields great benefits in reduced complexity and low PAPR. At the receiver end, it offers the ability to harvest full diversity from the multipath fading channel and full coding gain, with significant bit error rate (BER) improvement. These results are demonstrated using both analytical expressions and simulation results. This paper is Part III of a three-paper series on alternative transforms for multicarrier communication (MC) systems.
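The sparsity pattern can be illustrated with a toy construction (the entry values and the cyclic placement are illustrative assumptions, not the paper's exact matrix): with non-zeros only on the principal diagonal and one below the anti-diagonal, each output symbol superposes at most two inputs.

```python
import numpy as np

N = 8
chi = np.zeros((N, N))
idx = np.arange(N)
chi[idx, idx] = 1 / np.sqrt(2)                    # principal diagonal
chi[(idx + 1) % N, N - 1 - idx] += 1 / np.sqrt(2) # one below the anti-diagonal (cyclic)

x = np.random.choice([1+1j, -1-1j], N)            # constellation-mapped symbols
s = chi @ x                                       # each s[k] mixes at most two x entries
print(np.count_nonzero(chi), "non-zero entries out of", N * N)
```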
-
The decision to integrate mobile cloud computing (MCC) in higher education without first defining suitable usage scenarios is a global issue, as the usage of such services becomes extensive. Consequently, this study investigates the security determinants of the educational use of mobile cloud computing among university students. The study proposes and develops a theoretical model by adopting and modifying the Protection Motivation Theory (PMT). The findings show that a significant amount of the variance in MCC adoption was explained by the proposed model. MCC adoption intention was shown to be highly influenced by threat appraisal and coping appraisal factors. Perceived severity alone explains 37.8% of students' "intention" to adopt MCC applications, indicating that students' perception of the degree of harm that could occur can deter them from using MCC; this encompasses concerns about data security, privacy breaches, and academic integrity issues. Response cost, perceived vulnerability and response efficacy also have a significant influence on students' "intention", at 18.8%, 17.7%, and 6.7%, respectively.
-
Bhagwati Sharan;Mohammad Husain;Mohammad Nadeem Ahmed;Anil Kumar Sagar;Arshad Ali;Ahmad Talha Siddiqui;Mohammad Rashid Hussain 63
Weather forecasting has become a very popular topic among researchers because of its various effects on global lives. It is a technique to predict future atmospheric conditions by analyzing various available datasets such as rain, snow, cloud cover, temperature, moisture in the air, and wind speed, with the help of accumulated scientific knowledge, i.e., the approaches and sets of rules, or algorithms, that are used to analyze and predict the weather. Weather analysis and prediction are required to mitigate natural losses before they happen, here using a deep learning approach. Such analysis and prediction are among the most challenging tasks because the data are multidimensional and nonlinear. Several approaches are available: Numerical Weather Prediction (NWP) needs highly computation-intensive mathematical equations to obtain the present condition of the weather, and quantitative precipitation nowcasting (QPN) is also used for weather prediction. In this article, we have implemented and analyzed the various distinct techniques that are used in data mining for weather prediction. -
Anitawati Mohd Lokman;Muhammad Nur Aiman Rosmin;Saidatul Rahah Hamidi;Surya Sumarni Hussein;Shuhaida Mohamed Shuhidan 77
Emotional health is important for overall health, and those who are experiencing difficulties should seek professional help. However, the social stigma associated with emotional health, as well as the influence of cultural beliefs, prevents many people from seeking help. This makes early detection difficult, which is critical for such health issues. It would be extremely beneficial if people could assess their emotional state and express their thoughts without prejudice. Emotional health apps exist on the market, but there is little to no evidence-based information on their quality. Hence, this study was conducted to provide an evidence-based quality assessment of emotional health mobile apps. Eleven functionality task scenarios were used to assess functional quality, while a System Usability Scale test (n=20) was used to assess usability, customer acceptability, learnability, and satisfaction. The findings show that the app for emotional health management is highly efficient and effective, with a high level of user satisfaction. This contributes to the creation of an app that will be useful and practical for people experiencing early-stage emotional health issues, as well as related stakeholders, in order to manage such issues. -
Information security is the foremost concern for IoT (Internet of Things) devices and applications. Since the advent of IoT, its devices and applications have experienced an exponential increase in the number of domains in which they are utilized. Nowadays people are becoming smart consumers of smart devices such as smartwatches, smart TVs, and smart home appliances, all of which are IoT devices. IoT devices differ widely in storage capacity, size, computational power, and energy supply. With the rapid increase of IoT devices in different fields, information security and privacy are not addressed well. Most IoT devices have constraints in computational and operational capabilities, are a threat to security and privacy, and are prone to cyber-attacks. This study presents a CIA triad-based information security implementation for the four-layer architecture of IoT devices, gives an overview of layer-wise threats to IoT devices, and finally suggests CIA triad-based security techniques for securing them.
-
The segmentation, detection, and extraction of the infected tumour area from magnetic resonance (MR) images are a primary concern but a tedious and time-consuming task performed by radiologists or clinical experts, and their accuracy depends on their experience alone. The use of computer-aided technology therefore becomes very necessary to overcome these limitations. In this study, to improve performance and reduce the complexity involved in the medical image segmentation process, we investigated the segmentation algorithms available in medical imaging and found that, among them, the threshold technique for brain tumour segmentation gives more accurate results than other methods for MR images. The proposed method is compared with K-means clustering, which yields clusters of images. The experimental results of the proposed technique were evaluated and validated for performance and quality analysis on magnetic resonance brain images, based on accuracy, processing time and similarity of the segmented part. The experimental results achieved higher accuracy, less running time and high resolution.
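A hedged sketch of threshold-based segmentation against a K-means baseline on a grayscale MR slice; the file name "mri_slice.png" and the use of Otsu thresholding are assumptions for illustration, not taken from the paper:

```python
import numpy as np
import cv2

img = cv2.imread("mri_slice.png", cv2.IMREAD_GRAYSCALE)

# Threshold technique: Otsu picks a cut-off separating tumour from background.
_, mask = cv2.threshold(img, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)

# K-means baseline: cluster pixel intensities into two groups.
pixels = img.reshape(-1, 1).astype(np.float32)
criteria = (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 10, 1.0)
_, labels, centers = cv2.kmeans(pixels, 2, None, criteria, 10,
                                cv2.KMEANS_RANDOM_CENTERS)
kmeans_mask = ((labels.reshape(img.shape) == np.argmax(centers)) * 255).astype(np.uint8)
```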
-
Ivan Vincent;Thanh.T.T.P;Suk-Hwan Lee;Ki-Ryong Kwon 97
Leukemia-induced death is listed among the top ten most common causes of mortality for human beings. Part of the reason is the slow decision-making process, which means suitable medical treatment cannot be applied on time. Therefore, good clinical decision support for acute leukemia type classification has become a necessity. In this paper, the authors propose a novel approach to acute leukemia type classification using a sequential neural network classifier. Our experimental results cover only the first classification stage, which shows excellent performance in differentiating normal and abnormal cells. Further development is needed to prove the effectiveness of the second neural network classifier.
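A minimal sketch of a sequential neural-network classifier for the first stage (normal versus abnormal cells); the layer sizes and input shape are assumptions, not the authors' architecture:

```python
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(64, 64, 3)),        # assumed cell-image size
    tf.keras.layers.Conv2D(16, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(32, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),  # normal vs. abnormal
])
model.compile(optimizer="adam", loss="binary_crossentropy",
              metrics=["accuracy"])
```
-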
Nowadays, permutation problems with large state spaces in which the path to the solution is irrelevant, such as the N-Queens problem, share their general properties with many important applications such as integrated-circuit design, factory-floor layout, job-shop scheduling, automatic programming, telecommunications network optimization, vehicle routing, and portfolio management. Therefore, methods that are able to find a solution are very important. The genetic algorithm (GA) is one of the most well-known methods for solving the N-Queens problem and is applicable to a wide range of permutation problems. In the absence of a specialized solution for a particular problem, a genetic algorithm would be efficient. But holism and random choices cause problems for genetic algorithms when searching large state spaces, so the efficiency of the algorithm degrades as the state space of the problem grows exponentially. In this paper, a new method based on the genetic algorithm is presented to cover this weakness. The new method tries to provide a partial view for the genetic algorithm by locally searching the state space, which may cause the genetic algorithm to take shorter steps toward the solution. To find the first and further solutions to the N-Queens problem, the proposed method divides the N-Queens problem into subproblems, which configure the initial population of the genetic algorithm. The proposed method is evaluated and compared with two similar methods, indicating the amount of performance improvement.
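For reference, a sketch of the plain genetic algorithm baseline for N-Queens (permutation encoding: board[i] is the row of the queen in column i); the paper's subproblem decomposition is not reproduced here, and the population size and mutation rate are assumptions:

```python
import random

N = 8

def conflicts(board):
    # Count diagonal attacks; permutation encoding rules out row/column clashes.
    return sum(abs(board[i] - board[j]) == j - i
               for i in range(N) for j in range(i + 1, N))

def crossover(a, b):
    cut = random.randrange(N)
    return a[:cut] + [g for g in b if g not in a[:cut]]  # stays a permutation

population = [random.sample(range(N), N) for _ in range(100)]
for _ in range(1000):
    population.sort(key=conflicts)
    if conflicts(population[0]) == 0:
        break
    parents = population[:50]
    children = [crossover(random.choice(parents), random.choice(parents))
                for _ in range(50)]
    for c in children:                      # mutation: swap two positions
        if random.random() < 0.3:
            i, j = random.sample(range(N), 2)
            c[i], c[j] = c[j], c[i]
    population = parents + children
print(population[0], "conflicts:", conflicts(population[0]))
```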
-
The concept of social stratification and hierarchy among humans dates back to the origin of the human race. Presently, the growing reputation of social networks has given us an opportunity to analyze these well-studied phenomena over different networks at different scales. Generally, a social network can be defined as a collection of actors and their interactions. In this work, we concern ourselves with a particular type of social network, known as trust networks, in which there is an explicit show of trust (positive interaction) or distrust (negative interaction) among the actors. In other words, an actor can designate others as friends or foes. Trust networks are typically modeled as signed networks. A signed network is a directed graph in which the edges carry an edge weight of +1 (indicating trust) or -1 (indicating distrust). Examples of signed networks include the Slashdot Zoo network, the Epinions network and the Wikipedia adminship election network. In a social network, actors tend to connect with each other on the basis of their perceived social hierarchy. The emergence of such a hierarchy within a social community shows the manner in which authority manifests in the community. In the case of signed networks, the concept of social hierarchy can be interpreted as the emergence of a tree-like structure comprising actors arranged top-down in the order of their ranks, describing a specific parent-child relationship, viz. child trusts parent. However, owing to the presence of positive as well as negative interactions in signed networks, deriving such "trust hierarchies" is a non-trivial challenge. We argue that traditional notions (from unsigned networks) are insufficient to derive the hierarchies that are latent within signed networks. In order to build hierarchies in signed networks, we look at two interpretations of trust, namely the presence of trust (or "good") and the lack of distrust (or "not bad"). In order to develop a hierarchy signifying both trust and distrust effectively, the above interpretations are combined to calculate the overall trustworthiness (termed deserve) of actors. The actors are then arranged in a hierarchical fashion based on these aggregate deserve values, according to the following hypothesis: actor v is assigned as a child of actor u if (i) v trusts u, and (ii) u has a higher deserve value than v. We describe this hypothesis with additional qualifiers in this thesis.
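The hypothesis can be sketched on a toy signed graph (the edge list and the net-signed-in-degree scoring are illustrative assumptions; the thesis' actual deserve computation is not reproduced): v becomes a child of u if v trusts u and u's deserve value exceeds v's.

```python
edges = [("a", "b", +1), ("c", "b", +1), ("b", "d", -1)]  # (source, target, sign)

# Toy deserve score: net signed in-degree of each actor.
deserve = {}
for src, dst, sign in edges:
    deserve[dst] = deserve.get(dst, 0) + sign
    deserve.setdefault(src, 0)

parent = {}
for v, u, sign in edges:                  # edge means: v trusts/distrusts u
    if sign == +1 and deserve[u] > deserve[v]:
        parent[v] = u                     # v is assigned as a child of u
print(parent)                             # {'a': 'b', 'c': 'b'}
```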
-
Diabetes is a chronic condition that occurs when the pancreas fails to produce enough insulin or when the body's insulin is used ineffectively. Uncontrolled diabetes causes hyperglycaemia, or high blood sugar, which over time causes catastrophic damage to many of the body's systems, including the nerves and blood vessels. The burden of the disease on the global healthcare system is enormous, so early diabetes diagnosis is critical to saving many lives. Current methods for determining whether a person has diabetes or is at risk of acquiring diabetes, on the other hand, rely heavily on clinical biomarkers. This research presents a unique deep learning architecture for predicting whether or not a person has diabetes, and the severity level of the diabetes, from the person's retinal image. The study incorporates datasets such as EyePACS and IDRiD, which comprise Diabetic Retinopathy (DR) images, and uses DenseNet-121 as the base network due to its improved performance.
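A hedged sketch of a DenseNet-121-based classifier for retinal images (the classification head, input size and five-grade output are assumptions, not the paper's exact architecture):

```python
import tensorflow as tf

base = tf.keras.applications.DenseNet121(
    include_top=False, weights="imagenet", input_shape=(224, 224, 3))
base.trainable = False                      # transfer learning: freeze backbone

model = tf.keras.Sequential([
    base,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(5, activation="softmax"),  # assumed 5 DR severity grades
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
```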
-
The acute respiratory infection known as coronavirus disease (COVID-19) may present with a wide range of clinical manifestations, ranging from no symptoms at all to severe pneumonia and even death. Expert medical systems, particularly those used in the diagnostic and monitoring phases of treatment, have the potential to provide beneficial results in the fight against COVID-19. The significance of healthcare mobile technologies, as well as the advantages they provide, is growing quickly, particularly when such applications are linked to the Internet of Things. This research work presents a knowledge-based smart system for the primary diagnosis of COVID-19. The system uses the symptoms that manifest in the patient to estimate the severity of the COVID-19 infection. The proposed inference system can assist individuals in self-diagnosing their condition and can also assist medical professionals in identifying the ailment. The system is designed to be user-friendly and easy to use, with the goal of increasing the speed and accuracy of COVID-19 diagnosis. With the current global pandemic, early identification of COVID-19 is essential to regulate and break the cycle of transmission of the disease. The results of this research demonstrate the feasibility and effectiveness of using a knowledge-based smart system for COVID-19 diagnosis, and the system has the potential to improve the overall response to the COVID-19 pandemic. In conclusion, such knowledge-based smart technologies have the potential to be useful in preventing deaths caused by the COVID-19 pandemic.
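A minimal rule-based sketch of symptom-driven severity inference (the symptom weights, thresholds and advice strings are assumptions for illustration; the paper's actual knowledge base is not reproduced):

```python
SYMPTOM_WEIGHTS = {"fever": 2, "dry_cough": 2, "shortness_of_breath": 3,
                   "loss_of_taste": 3, "fatigue": 1}

def assess(symptoms):
    # Sum the weights of the symptoms the patient reports.
    score = sum(SYMPTOM_WEIGHTS.get(s, 0) for s in symptoms)
    if score >= 6:
        return "high likelihood - seek medical testing"
    if score >= 3:
        return "moderate likelihood - self-isolate and monitor"
    return "low likelihood - continue monitoring"

print(assess(["fever", "dry_cough", "loss_of_taste"]))  # high likelihood
```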
-
Shakeel Ahmed;Ahmad Shukri Mohd Noor;Wazir Zada Khan;Mohamed Saad Eldin Mohamed 135
This research aimed to promote electronic evaluation tools to tackle the implications of the COVID-19 pandemic and to analyze attitude and academic acceptance among female students in the Department of Computer Science, Faculty of Computer Science and Information Technology, Jazan University, Saudi Arabia. The students' attitude toward e-assessment tools was measured, and the main research sample consisted of an experimental group of 40 students. A survey was also conducted with 50 students before implementation to assess the validity and reliability of the research questions. There was a statistically significant difference, at the significance level (0.01), between students' average grades in the post-measurement of the tendency toward electronic evaluation, in favor of the experimental group. The results also showed a statistically significant difference, at the significance level (0.01), between the average scores of students in academic acceptance level, in favor of the experimental group. The findings of this research indicate the achievement of e-evaluation acceptance, and it is highly recommended to propagate the use of electronic evaluation. -
With the advent of personalized search engines, a myriad of approaches came into practice, and with the emergence of social media, personalization was extended to a different level. The main reason for preferring a personalized engine over traditional search is the need for accurate and precise results: due to a paucity of time and patience, users do not want to surf several pages to find the result that suits them most. Personalized search engines solve this problem effectively by understanding the user through profiles and histories, thus diminishing uncertainty and ambiguity. But since several layers of personalization are added on top of basic search, the response time and resource requirements (for profile storage) increase manifold, so it is time to focus on optimizing the layered architectures of personalization. The paper presents a layout of a multi-agent based personalized search engine that works on histories and profiles. Further, to store the huge amount of data, a distributed database is used at its core, so high availability, scaling, and geographic distribution are built in and easy to use. Initially, results are retrieved using a traditional search engine; after applying the personalization layer, the results are provided to the user. MongoDB is used to store profiles in a flexible form, thus improving the performance of the engine. Further, a weighted sum model is used to rank the pages in the personalization layer.
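The weighted sum ranking step might look like the following sketch (the feature names and weights are assumptions, not the paper's model): each candidate page receives a personalized score, and pages are returned in descending score order.

```python
WEIGHTS = {"query_match": 0.5, "profile_similarity": 0.3, "history_score": 0.2}

pages = [
    {"url": "a.example", "query_match": 0.9, "profile_similarity": 0.2, "history_score": 0.1},
    {"url": "b.example", "query_match": 0.6, "profile_similarity": 0.8, "history_score": 0.7},
]

def weighted_sum(page):
    # Personalized score: weighted combination of per-page feature scores.
    return sum(WEIGHTS[f] * page[f] for f in WEIGHTS)

for page in sorted(pages, key=weighted_sum, reverse=True):
    print(page["url"], round(weighted_sum(page), 3))
```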
-
Cloud computing has become an important technology for distributed and parallel computing. It provides various facilities, such as shared resources, software packages, information, storage and many different applications, on user demand at any time and any place, and it provides extensive capacity for computation and storage. Services are provided to users on a pay-as-you-go model. Although it provides many facilities, some problems remain, including resource discovery, fault tolerance, load balancing, and security; of these, load balancing is the main challenge. There are many techniques used to distribute the workload or tasks equally across the servers. This paper covers cloud computing, cloud computing architecture, virtualization and the MS load balancing technique, which provides enhanced load balancing.
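For intuition, here is a minimal sketch of one classic workload-spreading strategy, a least-loaded dispatcher (an illustration only; it is not the MS technique discussed in the paper):

```python
import heapq

servers = [(0, f"server-{i}") for i in range(3)]   # (current load, name)
heapq.heapify(servers)

for task_cost in [5, 3, 8, 2, 4]:
    load, name = heapq.heappop(servers)            # pick the least-loaded server
    print(f"task({task_cost}) -> {name}")
    heapq.heappush(servers, (load + task_cost, name))
```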
-
VANET (vehicular ad hoc network) refers to ad hoc networks designed for vehicles. Such networks are established among vehicles equipped with communication tools, and within them vehicles are regarded as the network nodes. On-time and on-schedule transmission of data is of high significance for these networks. To accomplish on-time data transmission, specific electronic equipment is embedded in each vehicle which maintains ad hoc communications among the passengers. Information about traffic and road signs, and on-line observation of traffic status, can be transmitted via these networks; such data makes it possible for the driver to select the best route to reach the destination. If there is no infrastructure, two broadcasting approaches can be considered: flooding and rebroadcasting. The flooding approach leads to heavy traffic; hence, the challenge we face is to avoid the broadcast flood. In this paper, an approach for the management of the broadcast flood is proposed based on fuzzy theory. The proposed method is expected to show better performance and efficiency than other approaches in terms of crash rate, message success rate and delay.
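A hedged sketch of a fuzzy rebroadcast decision (the membership functions, thresholds and rule are assumptions, not the paper's controller): a vehicle rebroadcasts only when the fuzzy score for "far from the sender and few neighbours" is high, damping the broadcast flood.

```python
def tri(x, a, b, c):
    """Triangular membership function over [a, c] peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def rebroadcast_score(distance_m, neighbour_count):
    far = tri(distance_m, 100, 250, 400)      # far from the last sender
    sparse = tri(neighbour_count, 0, 5, 15)   # few neighbours nearby
    return min(far, sparse)                   # fuzzy AND of the two conditions

print(rebroadcast_score(260, 4) > 0.5)        # True -> rebroadcast the message
```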
-
Due to the rapid growth of E-Commerce, most organizations are moving towards cashless transactions. Unfortunately, cashless transactions are used not only by legitimate users but also by illegitimate users, which results in the loss of billions of dollars each year worldwide. Fraud prevention and fraud detection are the two methods used by financial institutions to protect against these frauds. Fraud prevention systems (FPSs) are not sufficient to provide full security for E-Commerce systems; however, the combined effect of fraud detection systems (FDSs) and FPSs might prevent the frauds. Still, many issues and challenges degrade the performance of FDSs, such as overlapping data, noisy data, misclassification of data, etc. This paper presents a comprehensive survey of financial fraud detection systems using data mining techniques. Over seventy research papers, mainly from the period 2002-2015, were analyzed in this study. The data mining approaches covered include Neural Network, Logistic Regression, Bayesian Belief Network, Support Vector Machine (SVM), Self-Organizing Map (SOM), K-Nearest Neighbor (K-NN), Random Forest and Genetic Algorithm. The algorithms that have achieved high success rates in detecting credit card fraud are Logistic Regression (99.2%), SVM (99.6%) and Random Forest (99.6%), while the most suitable approach reported is SOM, with a perfect accuracy of 100%. The algorithms applied to financial statement fraud show a large difference in accuracy, from CDA at 71.4% to a probabilistic neural network at 98.1%. In this paper, we identify the research gap and report the performance achieved by different algorithms in terms of parameters like accuracy, sensitivity and specificity. Some of the key issues and challenges associated with FDSs have also been identified.
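The evaluation parameters cited above can be computed as in this sketch (synthetic imbalanced data stands in for the surveyed datasets; the classifiers and split are assumptions):

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import confusion_matrix
from sklearn.model_selection import train_test_split

# Imbalanced binary data: ~95% legitimate, ~5% fraudulent.
X, y = make_classification(n_samples=2000, weights=[0.95], random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

for clf in (LogisticRegression(max_iter=1000), RandomForestClassifier()):
    tn, fp, fn, tp = confusion_matrix(y_te, clf.fit(X_tr, y_tr).predict(X_te)).ravel()
    print(type(clf).__name__,
          "accuracy:", round((tp + tn) / (tp + tn + fp + fn), 3),
          "sensitivity:", round(tp / (tp + fn), 3),
          "specificity:", round(tn / (tn + fp), 3))
```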
-
WebRTC (Web Real-Time Communication) is a technology that enables browser-to-browser communication; therefore, a signalling mechanism must be negotiated to create a connection between peers. The main aim of this paper is to create and implement a WebRTC hybrid signalling mechanism, named WebNSM, for video conferencing based on the Socket.io (API) mechanism. WebNSM was designed over different topologies such as simplex, star and mesh, so it offers several kinds of communication at the same time, such as one-to-one (unidirectional/bidirectional), one-to-many (unidirectional) and many-to-many (bidirectional), without any downloading or installation. In this paper, WebRTC video conferencing was accomplished via LAN and WAN networks, including the evaluation of resources in WebRTC such as bandwidth consumption, CPU performance, memory usage, Quality of Experience (QoE), and the calculation of maximum links and RTPs. This paper presents a novel signalling mechanism among different users, devices and networks to offer video conferencing using various topologies at the same time, as well as other typical features such as using the same server, determining the room initiator, and keeping the communication active even if the initiator or another peer leaves. This scenario highlights the limitations in CPU performance and bandwidth consumption, and the use of different topologies, for WebRTC video conferencing.
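The core of any such signalling server is a relay that forwards SDP offers/answers and ICE candidates between peers in a room, as in this sketch (python-socketio stands in for the Socket.io API used in the paper; the event names and room handling are assumptions, not WebNSM's protocol):

```python
import socketio

sio = socketio.Server(cors_allowed_origins="*")
app = socketio.WSGIApp(sio)

@sio.event
def join(sid, room):
    sio.enter_room(sid, room)                    # first joiner acts as room initiator
    sio.emit("peer-joined", sid, room=room, skip_sid=sid)

@sio.event
def signal(sid, data):
    # Relay SDP offers/answers and ICE candidates to the other peer(s) in the room.
    sio.emit("signal", {"from": sid, "payload": data},
             room=data["room"], skip_sid=sid)

# Run with a WSGI server, e.g.: gunicorn -k eventlet -w 1 module:app
```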
-
Vehicular Ad-hoc Networks (VANETs) have become very popular within a few years and are widely used in the research and industry communities. A VANET is a collection of wireless vehicle nodes forming a temporary network without using any centralized Road Side Unit (RSU). VANETs are a subset of Mobile Ad-hoc Networks (MANETs). They improve the safety of vehicles and also support Intelligent Transportation Systems. Routing is the major component of communication protocols in VANETs: packets are to be routed from the source node to the destination node. Because of frequent topology changes and routing overhead, the selection of a routing protocol in VANETs is a great challenge. There are various routing protocols available for VANETs. This paper studies the Temporally Ordered Routing Algorithm (TORA), and its performance metrics are analyzed with the help of the NS2 simulator.
-
A robust system may not guarantee its applicability and adaptability. That is why research and development go together in the modern research concept. In this paper we examine the applicability and adaptability of a gait-based biometric identity verification system, especially in the GCC (Gulf Cooperation Council). The system itself is closely involved with human interaction, where privacy and personality are of concern. In the first phase of our research we will establish the gait-based identity verification system, and then we will explain it in the context of human interaction with the system. With the interaction involved, we will conduct an extensive survey to find out both the applicability and the adoptability of the system. To conduct our experiment, we will use the UCMG database [1], which is readily available to the research community, with more than three thousand video sequences from different viewpoints collected across various walking patterns and clothing. For the survey we will prepare questionnaires covering the approach of data collection, potential traits to collect and possible consequences. For analyzing the gait biometric trait, we will apply multivariate statistical classifiers through well-known machine learning algorithms in a ready platform. Similarly, for the survey data analysis we will use a similar approach to correlate the user viewpoint for such a system. It will also help us to find the perception of the user of the system.