
2018 | Book

Smart Trends in Systems, Security and Sustainability

Proceedings of WS4 2017

Edited by: Prof. Xin-She Yang, Prof. Dr. Atulya K. Nagar, Amit Joshi

Publisher: Springer Singapore

Book series: Lecture Notes in Networks and Systems


About this book

The volume deals with sustainability transitions: transformations of major socio-technical systems of provision and use in areas such as energy, water, mobility, and food towards more sustainable ways of production and consumption. The book presents the proceedings of the World Conference on Smart Trends in Systems, Security and Sustainability (WS4 2017), organised into sections on Smart IT Infrastructure for Sustainable Society; Smart Management Perspectives for Sustainable Society; Smart Secure Systems for Next Generation Technologies; Smart Trends for Computational Graphics and Image Modelling; and Smart Trends for Biomedical and Health Informatics. The volume contains 31 high-quality papers presented at WS4 2017.

Table of Contents

Frontmatter
MITIGATE: A Dynamic Supply Chain Cyber Risk Assessment Methodology

Supply chain services and logistic chains of the modern era have become dependent on ICT assets, establishing a complex, interrelated and interactive cyber ecosystem. Isolated vulnerabilities in any of the cyber assets may have a catastrophic impact on the whole supply chain. In this context, the paper proposes an evidence-driven Supply Chain Risk Assessment methodology which relies on high-quality scientific and experimentally based proofs and findings (including simulation results and indicators, e.g. CVE) to optimize the evaluation and mitigation of supply-chain-related risks and cyber threats.

Spyridon Papastergiou, Nineta Polemi
Small Polar Codes Concatenating to STBC Applied to MIMO Systems

This paper uses a concatenation of short Polar Codes (length N = 32) and Space-Time Block Codes; this Polar-STBC scheme is applied to no-diversity (SISO), SIMO, MISO and MIMO systems. Minimum Mean Square Error with Successive Interference Cancellation (MMSE-SIC) provides soft output at the receiver to improve the Bit Error Rate (BER), and a Successive Cancellation Decoder (SCD) is placed at the decoder to further improve the BER and the Frame Error Rate (FER). Comparison between several non-concatenated STBC schemes and this small Polar-STBC shows that the proposed scheme minimizes the BER and FER.

Madiop Diouf, Idy Diop, Ibra Dioum, Birahime Diouf, Sidi Mohamed Farssi, Khaly Tall, Lamine Sane
Target Tracking in Wireless Sensor Networks Using NS2

A wireless sensor network (WSN) comprises an enormous number of sensors spread over a geographic zone. Each mote in a WSN has the ability to communicate with other motes and can process data within its limited capacity. Localization is the technique by which we estimate the location of sensor nodes. In a wireless sensor network, node locations are not fixed, so to track a target several solutions come to mind, such as GPS, RFID, Bluetooth, multilateration and trilateration. In this paper, we use a simple algorithm which reduces energy usage by minimizing the communication overhead and tracks the target efficiently.

Umair Shafiq Khan, Nazar A. Saqib, Muazzam A. Khan
Taking Away the Green Screen—A Brief Discussion of a Multidisciplinary Approach to IoT via, Smart Parking and Interaction Design

“See a need, fill a need” (Wedge et al., Robots, 20th Century Fox, [1]): though these words can aptly sum up an entrepreneurial spirit, they are also a potentially good attitude for finding relevant solutions to global environmental issues. The important question that arises is how the need is filled. Long-term, sustainable and creative solutions are required to reduce the impact of the human footprint. A multidisciplinary approach has great potential, as it is capable of combining a variety of domains and enriching the knowledge base. This paper discusses a smart parking project addressing the issues of carbon emissions, time management and congestion in cities, while a second project portrays a new approach to interacting with and designing computers inspired by efficient biological systems.

Muftah Fraifer, Helen Hasenfuss, Mikael Fernström
Modeling SCADA Attacks

SCADA systems play a vital role in the efficient operation of ports’ Critical Infrastructures (CIs) and their Maritime Logistics and Supply Chain Services (MLoSC). In this paper we provide a process-centric modeling approach using the BPMN 2.0 specification to visualize an attack likely to be detected on SCADA systems. The SCADA model serves as a study of how security concepts (e.g. security paths, vulnerabilities, propagation of attacks) can be represented with modeling notations.

Eleni-Maria Kalogeraki, Nineta Polemi, Spyridon Papastergiou, Themis Panayiotopoulos
Automatic Lung Field Segmentation Based on Non Negative Matrix Factorization and Fuzzy Clustering

Obtaining accurate and automated lung field segmentation is a challenging step in the development of a Computer-Aided Diagnosis (CAD) system. In this paper, a fully automatic lung field segmentation method is proposed. Initially, a visual appearance model is constructed by considering the spatial interaction of neighbouring pixels. Constrained non-negative matrix factorization (CNMF) then factorizes the data matrix obtained from the visual appearance model into basis and coefficient matrices. Initial lung segmentation is achieved by applying fuzzy c-means clustering to the obtained coefficient matrix. The trachea and bronchi appearing in the initial lung segmentation are removed by a 2-D region-growing operation. Finally, the lung contour is smoothed using a boundary-smoothing step. Experimental results on different databases show that the proposed method produces a significant DSC of 0.987 compared to existing lung segmentation algorithms.

Ganesh Singadkar, Shubham Talbar, Parang Sanghavi, Bhavin Jankharia, Sanjay Talbar
Competitive Advantage Through Social Media: A Study of Indian Firms

The purpose of this paper is to examine the role of social media in attaining and/or sustaining competitive advantage. As recently as 5–10 years ago, especially in the Indian context, social media was not this active and influential; the Internet existed merely as a source of information and nothing more. Technological and social advances have led the Internet to give birth to an entirely new phenomenon called “social media”. Twitter, Facebook, LinkedIn, and many other web platforms have been established. People share their views, companies promote themselves, and protests and campaigns are conducted, very much through social media. In these changed times, companies are opting to stand out among their competitors through the use of social media. By proposing new social marketing strategies, this paper contributes to the literature as well as opening scope for practising new ways to gain advantage through social media.

Pallavi Thacker, H. P. Mathur
A Real-Time Devnagari Sign Language Recognizer (α-DSLR) for Devnagari Script

Devnagari Sign Language (DSL) is used for communication by speech- and hearing-impaired users. Our Alphabet Devnagari Sign Language Recognizer (α-DSLR) system translates DSL alphabets into Devnagari alphabets along with speech. The Devnagari alphabet comprises fourteen vowels ranging from “A” to “A:” and thirty-three consonants ranging from “k” to “&”. The workflow of the α-DSLR system emphasizes sequential phases along with the algorithmic approach used in our system. The system works with a Single Hand, Single Camera approach and applies template-based and clustering-based algorithms. A detection rate of 97% is accomplished by the α-DSLR system against a complex background.

Jayshree Pansare, Maya Ingle
Novel Strategy for Fairness-Aware Congestion Control and Power Consumption Speed with Mobile Node in Wireless Sensor Networks

The power issue in wireless sensor networks (WSNs) remains one of the real barriers preventing the complete exploitation of this technology. A WSN is a dense network of sensors which sense environmental conditions, then process and propagate that data towards a sink node. The limited battery life of a sensor node and unbalanced utilization of that energy can affect the lifetime of the entire sensor network. In the proposed work, mobile nodes are used to transmit data near the areas where the power consumption of the nodes is highest. The mobile node reduces the workload and congestion of the nodes, which is controlled by adjusting the reporting rate (RR) according to the buffer occupancy level. In this paper, we compare the existing Ad hoc On-demand Distance Vector (AODV) routing protocol with our new approach in terms of power consumption. The proposed work evaluates the lifetime and energy consumption of the WSN with and without mobile nodes; results are achieved by adjusting the number of mobile nodes and the location and speed of each mobile node. The RR is also adjusted to control the buffer occupancy of each node and mitigate congestion in the sensor network. Based on the simulation results, the proposed work increases the lifetime of the nodes whose power-consumption speed is high and enhances the life of the entire network. The use of a dual threshold for the buffer and the mobile node can reduce congestion and the waiting time for data, reducing delay.

Sagar B. Tambe, Suhas S. Gajre
Two Stage Wireless Sensor Node Localization Using Firefly Algorithm

The locations of the sensors in wireless sensor networks usually have to be known. Wireless sensor networks contain a large number of sensors, so installing a Global Positioning System device in each of them is not acceptable; instead, a few anchor nodes with known positions are usually used. Estimating sensor node positions from the received signal strength indicator is a hard optimization problem. In this paper we propose a two-stage algorithm with semi-mobile anchors that uses the swarm-intelligence firefly algorithm for optimization. The proposed algorithm was tested by simulation on standard benchmark wireless networks, where it obtained better results compared to other approaches from the literature.

Eva Tuba, Milan Tuba, Marko Beko
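The firefly-based position search described in the abstract above can be sketched as follows. This is a minimal illustration, not the authors' two-stage formulation: the 10 × 10 field, anchor layout, noiseless ranges and all parameter values are invented assumptions.

```python
import math
import random

def range_error(p, anchors, ranges):
    # Sum of squared differences between candidate-to-anchor distances
    # and the measured ranges; zero at the true node position.
    return sum((math.dist(p, a) - r) ** 2 for a, r in zip(anchors, ranges))

def firefly_localize(anchors, ranges, n=25, iters=150,
                     alpha0=0.2, beta0=1.0, gamma=0.01, seed=1):
    rng = random.Random(seed)
    pop = [[rng.uniform(0, 10), rng.uniform(0, 10)] for _ in range(n)]
    for t in range(iters):
        alpha = alpha0 * (0.97 ** t)          # shrink the random walk over time
        fit = [range_error(p, anchors, ranges) for p in pop]
        for i in range(n):
            for j in range(n):
                if fit[j] < fit[i]:           # j is brighter: i moves toward j
                    r2 = sum((pop[i][d] - pop[j][d]) ** 2 for d in range(2))
                    beta = beta0 * math.exp(-gamma * r2)   # attractiveness
                    for d in range(2):
                        pop[i][d] += (beta * (pop[j][d] - pop[i][d])
                                      + alpha * (rng.random() - 0.5))
                    fit[i] = range_error(pop[i], anchors, ranges)
    return min(pop, key=lambda p: range_error(p, anchors, ranges))

# Hypothetical field: unknown node at (4, 6), noiseless ranges to three anchors.
anchors = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]
true_pos = (4.0, 6.0)
ranges = [math.dist(true_pos, a) for a in anchors]
est = firefly_localize(anchors, ranges)
```

With noiseless ranges and three non-collinear anchors the objective has a unique minimum at the true position, so the swarm contracts onto it as the random-walk term decays.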
Detection of Brain Tumor from MR Image Sequence Using Image Segmentation and Blob’s Centroid

This paper proposes a method to search for a probable tumor in magnetic resonance (MR) images of a human brain. Typically, a tumor can be found in several contiguous images of an MR sequence, and the positions of its appearance in such contiguous images usually have similar centroids; their corresponding projections should therefore be detectable automatically in order to support a user or a doctor in further diagnosis. Once the region of a probable tumor is detected, matched checking between a pair of contiguous MR images can be done and relabeled to indicate the same area of the tumor amongst sequential images. Any regions without a match between contiguous images are initially considered irrelevant components and are not analyzed further unless the doctor indicates otherwise. Then, the ratio of tumor to brain is calculated to support an initial diagnosis of tumors appearing in an MR image sequence.

Thanawit Chaisuparpsirikun, Varin Chouvatut
A Project Metric Model for Assessing ICT Project Success/Failure

Project success metrics play a vital role in project control as they focus on project deliverables as justification of success. However, this traditional view is not sufficient for measuring the degree of success or failure with consideration of the resulting Return on Investment (ROI). This study presents a project management metric model for assessing project success or failure which sufficiently accounts for the necessary expectations of the user community. The model evolved from existing project management metrics by adding features to justify precise measurement of degrees of project success or failure.

Ezekiel U. Okike, Ofaletse Mphale
Analysing Distribution of Copyrighted Files by Tracking BitTorrent Network

Nowadays, Internet usage continually increases at a rapid pace, and with it internet-based infringement. BitTorrent is the most popular open Internet application for content sharing, and copyrighted content is freely available through its p2p network, which continually grows the popularity of BitTorrent gateways. The protocol is one of the largest consumers of Internet bandwidth, and recent tracking summaries show that the content being shared is mostly illegal. The aim of this paper is therefore to find copyrighted files in torrent networks and to find the distributors of those files. To find illegal copies, we propose a crawling algorithm to collect efficient listing portals and characterize the sharing of illegal copies of content in torrent networks. The crawling algorithm mainly comprises three engines: a Scanner Engine, a Monitor Engine and an Analyzer Engine.

Chintan B. Sidpara, Darshana H. Patel, Kunal U. Khimani, Darshan Upadhyay, Avani R. Vasant
Ant Colony Optimization Based Load Frequency Control of Multi-area Interconnected Thermal Power System with Governor Dead-Band Nonlinearity

An interconnected thermal power system consists of several areas, and various parameters must be provided to reach firm operation of the power system. The current work proposes an optimization algorithm, namely Ant Colony Optimization (ACO), to optimize the Proportional-Integral-Derivative (PID) controller for the load frequency control of a two-area interconnected non-reheat thermal power system with governor dead-band nonlinearity. The ACO is used to determine the optimal controller parameters, where an objective function, namely the Integral Time Absolute Error (ITAE), is employed. A comparative study of ACO performance against Craziness-based Particle Swarm Optimization (CPSO) examines the proposed approach's performance in the interconnected thermal power system. The results establish the superiority of the ACO-optimized PID controller response compared to the CPSO-optimized controller.

Gia Nhu Nguyen, K. Jagatheesan, Amira S. Ashour, B. Anand, Nilanjan Dey
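The ITAE objective named in the abstract above can be approximated numerically from a simulated frequency-deviation trace. The sketch below is illustrative: the traces are invented, not taken from the paper's power-system model.

```python
import math

def itae(times, errors):
    # Integral Time Absolute Error: trapezoidal approximation of the
    # integral of t * |e(t)| over the simulated response.
    vals = [t * abs(e) for t, e in zip(times, errors)]
    total = 0.0
    for k in range(1, len(times)):
        total += 0.5 * (vals[k - 1] + vals[k]) * (times[k] - times[k - 1])
    return total

ts = [0.05 * k for k in range(201)]             # 0 .. 10 s
decaying = [0.5 * math.exp(-t) for t in ts]     # deviation dies out
persistent = [0.5] * len(ts)                    # deviation remains
```

A controller whose frequency deviation decays scores lower (better) than one whose deviation persists; the time weighting penalizes late errors more than early ones, which is why ITAE favors fast settling.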
A Qualitative Review of Two Evolutionary Algorithms Inspired by Heuristic Population Based Search Methods: GA & PSO

PSO is a relatively recent evolutionary computational technique inspired by the swarming behavior of biological populations. PSO and GA resemble each other in the sense that both are heuristic population-based search methods; in other words, both start from random variables and reach a final desired solution without any user input. GA has been utilized in much research and in many forms because of its ease of implementation and its ability to solve highly complex non-linear engineering problems, but it is computationally costly. This paper argues that PSO, although more recent, is more effective than GA in terms of implementation and more efficient in solving complex problems.

Kuldeep S. Raghuwanshi
Configuration of IRS Tool on Linux OS Using Low Cost Low Power Computer

Intrusion detection systems and intrusion avoidance systems (IDS/IAS) are crucial elements of computer network security. The outcomes of research using a freeware IDS tool, Snort, are presented here. The research ran Snort IDS on a flexible, inexpensive, reasonably powered machine, the Raspberry Pi 2 (Model B), with the precise aim of determining its capability, competence and worth in computer network environments where price is an influencing factor, such as SOHO and educational-institution computer networks.

S. R. Kattamuri, V. Kakulapati, N. Muthyala
Intelligent Twitter Spam Detection: A Hybrid Approach

Over the years there has been a large upheaval in the social networking arena. Twitter, being one of the most widely used social networks in the world, has always been a key target for intruders. Privacy concerns, the stealing of important information and the leakage of key credentials to spammers have been on the rise. In this paper, we develop an Intelligent Twitter Spam Detection System which gives precise details about spam profiles by identifying and detecting Twitter spam. The system takes a hybrid approach, as opposed to single-tier, single-classifier approaches: it takes into account some unique feature sets before analyzing the tweets and also checks links with the Google Safe Browsing API for added security. This in turn leads to better tweet classification and improved, intelligent Twitter spam detection.

Varad Vishwarupe, Mangesh Bedekar, Milind Pande, Anil Hiwale
Use of Learning Style Based Approach in Instructional Delivery

Technology Enabled Learning (TEL) started its journey with off-line, non-interactive content available on storage media; the current destination of that journey is personalised e-learning. Instructional delivery is an important phase in an e-learning environment. In our personalised e-learning model, we use learning style as the deciding factor in the instructional delivery mechanism. We tested our model on 111 learners. Our results show that learning-style-based learning object selection and delivery elevates learning, which in turn improves learner understanding of the subject.

Ravindra Vaidya, Manish Joshi
An Intelligent Hardware Calibration Scheme for Real Time VoIP Applications

VoIP (Voice over IP) is a methodology for the transmission of data, voice, video, messaging and chat services over the Internet Protocol. The quality of VoIP is heavily dependent on the type of hardware used. Hardware calibration is a mechanism for selecting suitable hardware for the required system characteristics, and it can be quite difficult to select hardware with optimal cost from a class of processors which provide the same services. The aim of this paper is to use an intelligent hardware calibration scheme for VoIP-based real-time applications by considering parameters such as CPU utilization, cost, RAM utilization, jitter and delay. Selecting the processor with this intelligent scheme saves execution time and resources: taking calls per second as input, it outputs the hardware recommended for the call manager to handle a given number of calls at an optimized cost. In this paper, a Fuzzy Analytic Hierarchy Process (FAHP) based scheme is proposed which allows the user to make the best decision from a set of available choices by comparing a set of processors, helping service providers optimally provide different types of VoIP-based services.

Chinu Singla, Gurjot Kaur, Jasleen Kaur, Nitish Mahajan, Shubhani Aggarwal, Sakshi Kaushal
Automatic Sub Classification of Benign Breast Tumor

This paper describes the automatic classification of benign lesions. Our research work mostly aims at the development of a data-driven, effective classification system to determine the type of benign breast tumors. The existing approach in the medical field identifies the type of benign breast tumor based on a histopathological report obtained after a painful surgical process. Our target is to develop a system that works without histopathological attributes, to spare the patient this pain. Our focus was to eliminate the role of histopathological attributes in the detection of benign tumor type, so we tried to identify correlations of histopathological features with mammographic image features and patient history features, in order to explore whether histopathological features can be replaced by corresponding correlated features. With the replaced attributes we obtain training accuracies of 79.78% for J48 and 81.91% for SVM, and testing accuracies for J48 and SVM of 100% and 90.90% respectively.

Aparna Bhale, Manish Joshi
Prevention of Conjunct Black Hole MANET on DSR Protocol by Cryptographic Method

Because MANETs have no fixed infrastructure, they are more vulnerable to attack, which encourages attacker nodes to become part of the network. Many efficient protocols have been proposed in this field, but all of these routing protocols rely solely on a trusting and supportive environment. The conjunct black hole attack is a network-layer attack in which the malicious node makes use of the routing protocol to pretend that it has the shortest path to the destination. In our research work we propose a protocol using the RSA algorithm to eliminate the conjunct black hole attack; a comparison with the DSR protocol is shown via performance metrics, in terms of the delay and throughput of the network, in an ad hoc network environment.

Jyoti Prabha, Dinesh Goyal, Savita Shivani, Amit Sanghi
Enhance the Data Security in Cloud Computing by Text Steganography

Cloud computing is a new-age computing model which allows resources to be used even if one does not own them. It provides the facility to share resources over the internet without building the necessary infrastructure, as an on-demand service; this model gives faster access to resources and reduces management-related problems. In this paper we discuss the different cloud computing models with their advantages and characteristics, and various security measures in the current scenario. Finally, we discuss a new level of security in cloud computing for data diffusion and storage on a hybrid cloud. We present a new linguistic text steganography approach for increasing data security: first we encrypt the secret data using a substitution method, and then embed it in some random cover media to hide the existence of the original text. We propose a new Indian script encoding method and a capital alphabet shape encoding which would be hard for an uninformed person to decipher. We apply these techniques on a private cloud server when transferring private data to a public cloud, so the data is secure both in transmission and in storage on the public cloud. The time overhead of these techniques is very low, and they do not take much space, as one character of the random cover is used to hide one character of the encrypted secret text.

Amit Sanghi, Sunita Chaudhary, Meenu Dave
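As a rough illustration of the pipeline the abstract describes (substitution encryption, then hiding in a cover text), the sketch below uses a plain Caesar-style substitution and hides bits in letter case. The paper's actual Indian-script and capital-alphabet-shape encodings are more elaborate, so every detail here is an assumption, not the authors' scheme.

```python
def substitute(text, key):
    # Stand-in substitution cipher (Caesar shift) for the encryption step.
    out = []
    for ch in text:
        if ch.isalpha():
            base = ord('A') if ch.isupper() else ord('a')
            out.append(chr((ord(ch) - base + key) % 26 + base))
        else:
            out.append(ch)
    return ''.join(out)

def embed(secret, cover, key=7):
    # Hide each bit of the encrypted secret in the case of one cover
    # letter: uppercase carries 1, lowercase carries 0.
    bits = [int(b) for ch in substitute(secret, key)
            for b in format(ord(ch), '08b')]
    out, k = [], 0
    for ch in cover.lower():
        if ch.isalpha() and k < len(bits):
            out.append(ch.upper() if bits[k] else ch)
            k += 1
        else:
            out.append(ch)
    if k < len(bits):
        raise ValueError("cover text too short for this secret")
    return ''.join(out)

def extract(stego, n_chars, key=7):
    bits = [1 if ch.isupper() else 0
            for ch in stego if ch.isalpha()][:8 * n_chars]
    chars = ''.join(chr(int(''.join(map(str, bits[i:i + 8])), 2))
                    for i in range(0, len(bits), 8))
    return substitute(chars, -key)

stego = embed("hi", "the quick brown fox jumps over the lazy dog")
```

The stego text spells the same words as the cover; only its capitalization pattern carries the payload, matching the one-cover-character-per-secret-bit flavor of capacity the abstract mentions.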
A Hybrid Community Based Rough Set Feature Selection Technique in Android Malware Detection

Feature selection is the process of choosing the most significant set of features, which reduces dimensionality and generates the most analytical result. Choosing relevant attributes is a critical issue for competitive classifiers and for data reduction. This work proposes a hybrid feature selection technique based on the Rough Set Quick Reduct algorithm combined with a community detection scheme. The proposed technique is applied in the Android malware detection domain and compared with the performance of existing feature selectors. It produces the highest average classification accuracy of 97.88% and average ROC values up to 0.987.

Abhishek Bhattacharya, Radha Tamal Goswami
Natural Language Interface for Ontology in Agriculture Domain

Ontologies have recently become a hot topic in the research community. They have numerous applications and are being used by researchers in almost all fields, primarily for information retrieval, information extraction, knowledge representation and knowledge management. This paper has two objectives: first, the design and development of an ontology for fertilizers in the agriculture domain, and second, the creation of a natural language interface for that ontology. Development of an ontology requires both manual and expert effort, so it requires a significant amount of time as well. One of the goals behind designing and developing this ontology is to make it useful in real-time scenarios; integration with other ontologies from the same domain, such as soil or crop ontologies, will enhance its real-world usage. Information from the ontology can be fetched through a natural language interface, which accepts users' questions in natural language and generates the corresponding SPARQL queries.

Nidhi Malik, Aditi Sharan, Jaya Shrivastav
3D-Based Unattended Object Detection Method for Video Surveillance

Inspired by 2D GrabCut for still images, this paper proposes automated abandoned-object segmentation by introducing 3D GrabCut in a surveillance scenario. This allows the method to produce precise object extraction without needing user intervention. Both RGB and depth input are utilized to build an abandoned-object detector which can resist shadow and brightness changes. We performed indoor experiments to show that our system obtains accurate detection and segmentation; both quantitative and qualitative measurements are provided to analyze the results.

Chia-Hung Yeh, Kahlil Muchtar, Chih-Yang Lin
Electrical Energy Output Prediction Using Cuckoo Search Based Artificial Neural Network

The ever-growing demand for environmental improvement, along with the increasing demand for electricity, has attracted researchers to designing efficient, accurate and robust models. Such models are used mainly to predict the energy output of combined steam and gas turbine mechanisms. The applicability of these systems depends on their sustainability, so it is essential to predict the combined mechanisms' output energy in order to produce more trustworthy mechanisms. Since the acceptability of the aforesaid turbine systems is judged in terms of their profitability, output energy prediction plays a vital role. In machine learning, neural network (NN) based models have proven trustworthy in critical prediction tasks. However, the traditional learning algorithms in NNs suffer from premature convergence to local optima while finding the optimum weight vectors. Consequently, the present work proposes a Cuckoo Search (CS) supported NN (NN-CS) and a Particle Swarm Optimization (PSO) supported NN (NN-PSO) to efficiently predict the electrical energy output of combined cycle gas turbines. In the current study, five features are extracted, namely the ambient temperature, relative humidity and ambient pressure in the gas turbines, and the exhaust vacuum from a steam turbine. The results establish the improved performance of the CS-based NN compared to the multilayer perceptron feed-forward neural network (MLP-FFN) and the NN-PSO in terms of root mean squared error: the proposed NN-CS achieved an average root mean squared error (RMSE) of 2.58%.

Sankhadeep Chatterjee, Nilanjan Dey, Amira S. Ashour, Cornelia Victoria Anghel Drugarin
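The RMSE comparison metric used in the abstract above is straightforward to compute; the numbers below are made-up illustrations, not the paper's dataset.

```python
import math

def rmse(actual, predicted):
    # Root mean squared error between measured energy output (MW) and
    # model predictions.
    return math.sqrt(sum((a - p) ** 2 for a, p in zip(actual, predicted))
                     / len(actual))

# Hypothetical hourly outputs vs. predictions, loosely in the range of a
# combined cycle plant.
actual = [480.0, 445.2, 438.8, 453.1]
predicted = [478.5, 447.0, 440.1, 452.0]
err = rmse(actual, predicted)
```

A lower RMSE means the model's predicted outputs track the measured outputs more closely, which is the criterion by which NN-CS is judged against MLP-FFN and NN-PSO.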
ASMAN Framework: A Framework for Comparison and Selection of SaaS Services

Cloud computing is a remarkable technology of the present era: for all services offered by a provider, users only need a web browser and an Internet connection. No server needs to be run on the user's own hardware; data moves to and from the user's computer or device over its Internet connection, and the user pays only for the services used. Cloud computing includes three different service models with different uses: Infrastructure as a Service (IaaS), Platform as a Service (PaaS) and Software as a Service (SaaS). The aim of this research is to support SaaS consumers in choosing the right SaaS provider. In recent times the list of providers offering various services to consumers has grown, and consumers need to select suitable SaaS providers to fulfil their needs. The ASMAN framework gives the user the advantage of choosing the best SaaS service when several providers offer SaaS-based products. It compares a few parameters: cost, speed, ease of use, reliability and availability. The ASMAN algorithm selects suitable SaaS services based on the priority of the selected parameters, comparing the desired and the provided parameters.

Mamta Dadhich, Vijay Singh Rathore
Estimation of Heterogeneity to Improve Robust Anisotropic Diffusion

Anisotropic diffusion (AD) became a prominent image enhancement and de-noising method after its introduction in 1987. However, anisotropic diffusion requires the selection of a diffusion function whose definition has remained ad hoc. Additionally, AD requires determining a scale parameter to which the final result is intimately related. Typically this parameter is chosen on an ad hoc basis, which makes it difficult to compare final outputs. In the literature, the Median Absolute Deviation (MAD) has been proposed as a heterogeneity scale for use with robust anisotropic diffusion (RAD). Despite its strong statistical foundation, diffusing an image by RAD results in an undesirable staircasing effect, and artefacts are created in the presence of noise. In this paper, we propose a robust L-estimator scale that correctly demarcates the homogeneous and heterogeneous regions of an image and smooths the image with minimal staircasing. Experimental results show that this scale remains effective in the presence of heavy noise.

Rohit Kamal Chatterjee, Avijit Kar
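The MAD heterogeneity scale that the abstract above contrasts against is simple to state; the gradient values below are arbitrary examples, not image data from the paper.

```python
import statistics

def mad_scale(values):
    # Median absolute deviation, multiplied by 1.4826 so the estimate is
    # consistent with the standard deviation for Gaussian data.
    med = statistics.median(values)
    return 1.4826 * statistics.median(abs(v - med) for v in values)

# A single large outlier barely moves the MAD scale, which is what makes
# it attractive for edge-preserving (robust) diffusion.
grads = [1.0, 2.0, 3.0, 4.0, 100.0]
scale = mad_scale(grads)   # median 3, abs deviations [2, 1, 0, 1, 97]
```

Because the outlier 100 lands beyond both medians, the scale is identical to what the outlier-free gradients [1, 2, 3, 4, 5] would give; large gradients (edges) do not inflate the estimated noise level.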
Efficient Economic Profit Maximization: Genetic Algorithm Based Approach

Economic profit is the main driving force of industrial and socio-economic growth, and it is imperative to maximize the profit of economic systems to retain economic stability. Traditional methods involving the Lewis model have been found to be unsuitable in terms of computational complexity. Motivated by recent developments and the successful application of meta-heuristic algorithms in achieving potent solutions, the present work proposes an efficient meta-heuristic algorithm to support the profit maximization formalism. A genetic algorithm (GA) is employed to maximize profit in terms of total revenue (TR) and total cost (TC). The real-parameter objective function depicting profit is gradually optimized by the GA. Experimental results suggest that the GA-based profit maximization method is extremely fast, accurate and robust.

Sankhadeep Chatterjee, Rhitaban Nag, Nilanjan Dey, Amira S. Ashour
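A minimal real-parameter GA for profit = TR − TC can be sketched as below. The linear revenue and quadratic cost curves are invented for illustration, not the paper's economic model; with these assumed coefficients the analytic optimum is q = (price − var) / (2 · quad) = 40.

```python
import random

def profit(q, price=50.0, fixed=100.0, var=10.0, quad=0.5):
    tr = price * q                        # total revenue
    tc = fixed + var * q + quad * q * q   # total cost
    return tr - tc

def ga_maximize(f, lo=0.0, hi=100.0, pop_size=30, gens=80, seed=3):
    rng = random.Random(seed)
    pop = [rng.uniform(lo, hi) for _ in range(pop_size)]
    for _ in range(gens):
        nxt = []
        while len(nxt) < pop_size:
            # tournament selection of two parents
            a = max(rng.sample(pop, 3), key=f)
            b = max(rng.sample(pop, 3), key=f)
            w = rng.random()
            child = w * a + (1 - w) * b       # blend crossover
            if rng.random() < 0.2:            # gaussian mutation
                child += rng.gauss(0, (hi - lo) * 0.02)
            nxt.append(min(hi, max(lo, child)))
        pop = nxt
    return max(pop, key=f)

q_star = ga_maximize(profit)   # should approach the analytic optimum, 40
```

Tournament selection keeps pressure toward higher profit while blend crossover and a small gaussian mutation balance exploitation against exploration of the quantity range.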
DLNEx: A Tool to Automatically Extract Desired Learning Nuggets from Various Learning Materials

The current educational ecosystem of untethered, anytime-anywhere learning pedagogy has accelerated individualized and personalized learning. With ubiquitous 24/7 access to educational resources, each learner wants the learning contents to be filtered and presented as desired. Short capsules of knowledge, referred to as learning nuggets, improve the time efficiency of learners by distilling and targeting specific, usable information. The DLNEx tool automatically extracts learning nuggets such as examples, problem-solving sums, formulae, lines of code, theorems, diagrams, charts, graphs and readability scores from three types of files: PDF, DOC and HTML. For a given topic or keyword, the tool also enables a student or instructor to view their desired nugget from various learning materials in one place instead of poring through all the different materials. Experiments show that the average F-measure over the three file types in learning nugget extraction is 90%.

Jyoti Pareek, Maitri Jhaveri
A Novel Approach to Optimize Subqueries for Open Source Databases

Query optimization is an important process in relational databases. Here, we present an overview of the query optimization process, with a focus on providing insight into MySQL subqueries to optimize the join operation. Efficient join processing for subqueries is one of the most fundamental and well-studied tasks in database research. For a given query, many plans can be considered; though the output will be the same, the amount of time required is a key consideration. In this work, we examine existing algorithms for the way subqueries are joined over many relations and describe a novel sorting-based algorithm to process these queries optimally. The algorithm is implemented and the results are compared with current MySQL strategies. The proposed sorting-based algorithm outperforms current algorithms without applying advanced DBMS techniques such as parallel processing or hashing.

Bhumika Shah, Jyoti Pareek, Darshit Kanziya
Backmatter
Metadata
Title
Smart Trends in Systems, Security and Sustainability
Edited by
Prof. Xin-She Yang
Prof. Dr. Atulya K. Nagar
Amit Joshi
Copyright year
2018
Publisher
Springer Singapore
Electronic ISBN
978-981-10-6916-1
Print ISBN
978-981-10-6915-4
DOI
https://doi.org/10.1007/978-981-10-6916-1
