
About this book

This book systematically examines and quantifies industrial problems by assessing the complexity and safety of large systems. It includes chapters on system performance management, software reliability assessment, testing, quality management, analysis using soft computing techniques, management analytics, and business analytics, with a clear focus on exploring real-world business issues. Through contributions from researchers working in the area of performance, management, and business analytics, it explores the development of new methods and approaches to improve business by gaining knowledge from bulk data. With system performance analytics, companies can now drive performance, provide actionable insights for each level and every role using key indicators, generate mobile-enabled scorecards, and produce time series-based analyses using charts and dashboards.

In the current dynamic environment, a viable tool known as multi-criteria decision analysis (MCDA) is increasingly being adopted to deal with complex business decisions. MCDA is an important decision support tool for analyzing goals and providing optimal solutions and alternatives. It comprises several distinct techniques, which are implemented by specialized decision-making packages. This book addresses a number of important MCDA methods, such as DEMATEL, TOPSIS, AHP, MAUT, and Intuitionistic Fuzzy MCDM, which make it possible to derive maximum utility in the area of analytics. As such, it is a valuable resource for researchers and academicians, as well as practitioners and business experts.

Table of Contents

Frontmatter

Chapter 1. Investing Data with Machine Learning Using Python

Data can be classified into various forms: structured, semi-structured and unstructured. Data is growing rapidly, in exponential form. Not all of it is useful; the percentage of useful data must be identified for future use. From useful data, the future can be predicted by analyzing the past, and various algorithms and tools are available in the market for such prediction [9]. Through this paper, we try to analyze and predict stock and investing data of various companies, drawn from sources such as the S&P 500 (Standard and Poor's 500) Index, Yahoo Finance and Google Finance, using supervised and unsupervised machine learning algorithms. Machine learning plays a vital role in today's scenario, from self-driven cars, Google Assistant and Siri to news recommendation systems, and has been incorporated into mainstream tools such as news recommendation engines, sentiment analysis and stock screeners.

Anish Gupta, Manish Gupta, Prateek Chaturvedi
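
A minimal sketch of the supervised-learning idea behind this chapter: framing price prediction as regression on lagged closing prices. The data here are a synthetic random walk, not the S&P 500 or Yahoo/Google Finance feeds the authors use, and the five-day windowing scheme is an assumption for illustration.

```python
# Hedged sketch: supervised price prediction on synthetic data with sklearn.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(42)
close = 100 + np.cumsum(rng.normal(0, 1, 500))   # synthetic closing prices

window = 5                                        # predict tomorrow's close
X = np.array([close[i:i + window] for i in range(len(close) - window)])
y = close[window:]

split = int(0.8 * len(X))                         # chronological train/test split
model = LinearRegression().fit(X[:split], y[:split])
preds = model.predict(X[split:])
print("test MAE:", np.mean(np.abs(preds - y[split:])))
```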

Chapter 2. Probabilistic Assessment of Complex System with Two Subsystems in Series Arrangement with Multi-types Failure and Two Types of Repair Using Copula

This paper presents the study of a repairable complex system consisting of two subsystems in a series configuration with a controller. Under operation, subsystem-1 works under a 2-out-of-4: G policy, while subsystem-2 has a single unit coupled with subsystem-1 in a series arrangement. Subsystem-1 is assumed to suffer two types of failures, namely partial failure and complete failure. Further, partial failure is categorized into two kinds: minor and major. Minor partial failures degrade the efficiency of the system, whereas after a major partial failure the system works in a seriously degraded mode, and further operation may cause sudden damage. The failure rates of both subsystems and the controller are constant and follow the negative exponential distribution, but repair follows two kinds of distribution. The supplementary variable technique has been employed to assess the system for particular cases, and numerical results are demonstrated graphically.

Pratap Kumar, Kabiru H. Ibrahim, M. I. Abubakar, V. V. Singh

Chapter 3. Recognition and Optimizing Process Lagging Indicators During Software Development

The software industry has grown tremendously over the last few years. Customers today have very high expectations of the systems delivered to them in terms of quality, cost and time. In order to satisfy the customer, the project manager needs to monitor and control these three so-called major project management constraints during project execution. In practice, it is often not possible for project managers to control all three factors together and still satisfy customer needs. The task of measuring and monitoring project data is considered an overhead, so the project manager becomes selective in his approach: managers choose the high-priority factor based on customer requirements or the signed project agreement. Thereafter, the relevant data need to be measured, collected, monitored and controlled using statistical techniques. But focusing on and controlling only one of the factors is often insufficient for the overall success of the project. The manager needs to enhance project monitoring to encompass the other factors as well, so that the project does not fail from any loose ends. The paper describes the stepwise approach followed to achieve this: firstly, how the importance of controlling more than one factor was accepted by the project managers; secondly, the statistical techniques used in implementing this; and thirdly, the resulting improvement in the final product.

Amrita Wadhawan, Shalu Gupta

Chapter 4. Risk and Safety Analysis of Warships Operational Lifetime Defects

Warship building and ownership is a maturing industry in India. Warships differ significantly from commercial liners in building and maintenance philosophy. The GAO congressional committee report (GAO-09-322) lists these differences as 'In Navy shipbuilding, the buyer favors the introduction of new technologies on lead ships—often at the expense of other competing demands—including fleet size. This focus—along with low volume, a relative lack of shipyard competition, and insufficient expertise—contributes to high-risk practices in Navy programs.' Although the 'cost' of an operational fleet is measured in strategic terms, it is imperative to develop a robust risk and safety model to chart out the tangible costs of warship building and ownership in this relatively nascent enterprise in India.

Uday Kumar, Ajit Kumar Verma, Piyush Pratim Das

Chapter 5. Facebook Data Breach: A Systematic Review of Its Consequences on Consumers’ Behaviour Towards Advertising

The advancement of the Internet and digitalisation of information have led to an increase in data breach incidents on social networking sites (SNS), particularly Facebook, affecting its market value and reputation. The intriguing question is whether and how data breaches affect consumers' behaviour towards Facebook advertising. The privacy literature provides an inadequate account of this question. In order to fill the gap, the current paper analyses privacy in Facebook and investigates the consequences of perceived data breach (PDB) on consumers' behaviour towards Facebook advertising. A systematic review of the privacy literature was undertaken to answer the research question. This paper reveals that protection of consumers' privacy in Facebook is in deficit, and it further hypothesises that PDB influences consumers' behaviour negatively by discouraging both acceptance of and engagement with ads. Moreover, PDB influences ad avoidance positively. Likewise, it influences consumers' psychology by increasing privacy concerns and emotional violation while concurrently reducing trust. The findings of this study are not confirmatory; thus, scope is provided for further empirical research to test the developed model and to include other mediating and moderating constructs. Practically, this paper recommends tough personal data protection regulations and privacy policies that prohibit Facebook and other SNS from online tracking of consumers. Also, Facebook should build trust and revise privacy settings to enable consumers to opt in to or out of receiving ads. Finally, this paper's theoretical contribution is the model which proposes that perceived data breach affects consumers' behaviour towards Facebook advertising both directly and indirectly through psychological mediating variables.

Emmanuel Elioth Lulandala

Chapter 6. Fuzzy DEMATEL Approach to Identify the Factors Influencing Efficiency of Indian Retail Websites

The growth in the Indian retail sector, specifically among online retailers, is coaxing the industry to look for innovative ways to attract customer attention. To respond to these changing market dynamics, the focus of online retailers has shifted towards improving their online portals. Modern Websites must accommodate a number of factors that can enhance the user experience. Thus, it is imperative for online retailers to understand the factors that affect the buying behaviour of Indian consumers. Contemporary studies in the field of e-retailing have mainly focused on giving customers a buying experience that can ultimately culminate in their loyalty. However, a systematic and effective overview of the factors influencing the buying behaviour of e-commerce users is still lacking in the literature. In this regard, the present study develops a quantitative assessment framework to identify and evaluate the relationships between the factors that can lead to an enhanced online user experience, using fuzzy set theory and the DEMATEL technique. Fuzzy set theory helps to incorporate the vagueness in decision-making, while the DEMATEL method is used to portray the contextual relationships among the identified factors. The study is based on data collected from users of an online grocery retail Website. The results help e-retail companies identify the crucial indicators for developing a responsive and performance-driven e-commerce Website.

Loveleen Gaur, Vernika Agarwal, Kumari Anshu
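
The core DEMATEL computation the chapter builds on can be sketched in a few lines; the fuzzy step (defuzzifying expert judgements into a crisp direct-influence matrix) is assumed already done here, and the matrix below is hypothetical.

```python
# Crisp DEMATEL sketch: total relation matrix, prominence, cause/effect split.
import numpy as np

A = np.array([[0, 3, 2, 1],        # hypothetical averaged direct-influence
              [1, 0, 3, 2],        # scores among four website factors
              [2, 1, 0, 3],
              [1, 2, 1, 0]], dtype=float)

D = A / max(A.sum(axis=1).max(), A.sum(axis=0).max())   # normalize
T = D @ np.linalg.inv(np.eye(len(D)) - D)               # total relation matrix

R, C = T.sum(axis=1), T.sum(axis=0)
for i, (prom, rel) in enumerate(zip(R + C, R - C)):
    print(f"factor {i}: prominence={prom:.2f}, "
          f"{'cause' if rel > 0 else 'effect'} ({rel:+.2f})")
```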

Chapter 7. Application of Statistical Techniques in Project Monitoring and Control

Project monitoring and control is a critical aspect of executing projects and delivering a high-quality product. Product quality and delivery depend on the processes defined and followed. Nowadays, projects are complex in nature, and monitoring them requires the application of statistical techniques. For this purpose, periodic project data collection is required to generate metrics. These metrics are baselined by deriving a process performance baseline. Further, regression analysis is carried out and a process performance model is created. The purpose of this paper is to discuss the challenges faced before the application of these techniques and to analyze the resulting improvement and impact. Finally, the paper discusses how these techniques have helped deliver a quality product within budgeted effort and schedule.

Shalu Gupta, Deepshikha Bharti
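
A toy version of the two statistical steps named above, on invented metrics: deriving a process performance baseline (mean and three-sigma limits) and fitting a regression-based process performance model. The metric names are assumptions, not the authors' data.

```python
# Sketch: baseline + regression process performance model on synthetic metrics.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
review_effort = rng.uniform(5, 20, 30)                        # h per KLOC
defect_density = 8 - 0.3 * review_effort + rng.normal(0, 0.5, 30)

mu, sigma = defect_density.mean(), defect_density.std(ddof=1)
print(f"baseline {mu:.2f}, UCL {mu + 3*sigma:.2f}, LCL {mu - 3*sigma:.2f}")

ppm = LinearRegression().fit(review_effort.reshape(-1, 1), defect_density)
print("predicted density at 15 h/KLOC:", ppm.predict([[15.0]])[0])
```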

Chapter 8. Sensor-Based Smart Agriculture in India

Until recently, drones were used mainly for warfare. But the present century has found that drones are a boon for easy and green agriculture. Also termed UAVs, they give farmers extensive, crucial information that results in healthy and abundant crops at lower cost and with less labor. Drones are a product of the new Internet of Things era. The researchers have analyzed the way drones are used in Indian agriculture, relying mainly on secondary data. Based on a review of studies conducted using drones in agriculture and of the literature in that area, the researchers have made a few recommendations. The limitation of the research is that it is only secondary in nature, but the researchers plan to build upon it to conduct primary research at a later stage. The study is significant as it will help both farmers and the government.

Pradnya Vishwas Chitrao, Rajiv Divekar, Sanchari Debgupta

Chapter 9. Integrating UTAUT with Trust and Perceived Benefits to Explain User Adoption of Mobile Payments

This study investigates the factors affecting behavioral intentions to adopt mobile payment services by Indian citizens. The proposed model integrates factors from the unified theory of acceptance and use of technology (UTAUT) along with other important factors like perceived benefits and trust. This research model was empirically tested in New Delhi, the capital city of India, using 341 responses from a field survey. Data were analyzed using multiple regression analysis (MRA). The results showed that behavioral intention to adopt mobile payments is significantly and positively influenced by facilitating conditions, effort expectancy, performance expectancy, perceived benefits, and trust. Theoretically, this study contributes to the existing knowledge specifically related to mobile payment channels and the technology acceptance area in general. Practically, the study aims to provide mobile payment service providers in India with suitable guidelines for effectively designing and implementing mobile payment services.

Rishi Manrai, Kriti Priya Gupta
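
The MRA step can be illustrated as below with simulated five-construct survey scores; the construct names follow the chapter, while the coefficients and data are invented.

```python
# Sketch of multiple regression analysis (MRA) with statsmodels on fake scores.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 341                                  # same n as the chapter, data simulated
X = rng.uniform(1, 5, (n, 5))            # FC, EE, PE, perceived benefits, trust
intention = X @ [0.2, 0.15, 0.3, 0.2, 0.25] + rng.normal(0, 0.3, n)

fit = sm.OLS(intention, sm.add_constant(X)).fit()
print(fit.summary(xname=["const", "FC", "EE", "PE", "PB", "Trust"]))
```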

Chapter 10. Improving the Quality of Customer Service of Financial Institutions: The Implementation of Challenger Banks

The past few years have been characterized by the spread of information and communication technologies into all areas of the world economy. Today, the concept of the digital economy has taken shape, denoting economic sectors in which the economic effect is achieved through the introduction and mass use of innovative digital technologies. The world economy is at the threshold of the Fourth Industrial Revolution, and the implementation of innovative technologies is gaining avalanche-like momentum. The use of innovative digital technologies provides a quality improvement in all sectors of economic development, both in industry and in the financial sector. Improving the quality of services is integral to every financial institution striving not only to keep its position in the market but also to remain sufficiently competitive, especially in our rather difficult economic times. The paper analyzes a new innovative form of business in the financial sphere: challenger banks, credit organizations functioning entirely on the Internet. Such an innovative form of financial institution can raise the quality of bank customer service to a new level, providing customers with new forms of interaction with a credit institution. The introduction of new innovative projects always faces the problem of evaluating the effectiveness of the proposed project. Therefore, a stochastic frontier model is suggested in the study, allowing the estimation of both individual financial institutions and challenger banks as a whole. The modeling showed that the challenger bank, as a new innovative form of financial institution, exceeds traditional forms in effectiveness.

Alexey V. Bataev, Dmitriy G. Rodionov

Chapter 11. Corporate Image Building in the Indian IT Sector: Conceptualizing Through HR Practices

This research paper's objective is to help researchers, academicians, and marketing and HR practitioners gain insights into the concept of corporate image building in the context of HR practices and to identify research opportunities in this area in the Indian IT sector. An inductive, conceptual theory-building approach with a qualitative research method for data collection and analysis has been adopted. Data were collected from one hundred respondents, specifically professionals from IT companies, namely HCL, Infosys, IBM, Wipro, TCS and Accenture. In this research, IT-company-specific conceptual perspectives have been integrated and extracted to formulate a conceptual framework that includes three main variables: (1) employer branding, (2) corporate social responsibility and (3) corporate social performance. The results indicate that these variables play a significant role in corporate image building in Indian IT companies. The research also suggests innovative measures for corporate image building through HR practices in the Indian IT sector.

Renu Rana, Shikha Kapoor

Chapter 12. Vague Reliability Analysis of Standby Systems with an (N + 1) Units

Vague logic plays an important role in the reliability analysis of a repairable standby system consisting of N + 1 units. The present paper investigates the reliability of an (N + 1)-unit system in which, at the starting stage, all units are in good condition; when a failure occurs in some module, the system immediately performs a reconfiguration operation within negligible time. One repairman is available to repair one faulty module at any time, except when the system has fully failed. A repaired unit functions successfully, and the repair rate is constant from one state to another.

Kapil Kumar Bansal, Mukesh K. Sharma, Dhowmya Bhatt

Chapter 13. Experimental Study for MRR and TWR on Machining of Inconel 718 using ZNC EDM

Manufacturing industries widely use very hard materials such as superalloys and composite materials for various engineering applications. Machining these materials is difficult using conventional processes, which necessitates the use of advanced material removal processes such as electro-discharge machining (EDM). EDM is a thermal material removal process in which repetitive electrical discharge between the electrode tool and the workpiece causes melting and evaporation of material, resulting in material removal. By virtue of this inherent principle, any conductive material, irrespective of hardness, can be machined with the EDM process. In the present paper, an experimental study on electro-discharge machining of Inconel 718 is attempted using the Taguchi approach. The effects of process parameters on material removal rate (MRR) and tool wear rate (TWR) are investigated. Peak current, pulse-on time, pulse-off time and gap voltage are the process parameters considered for the experimental study. Pulse-on time and current are observed to be the most significant factors affecting MRR and TWR.

Rajesh Kumar, Balbir Singh
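
The Taguchi analysis reduces to computing signal-to-noise ratios per factor level; a sketch with made-up responses is shown below (larger-the-better for MRR, smaller-the-better for TWR).

```python
# Taguchi S/N ratio sketch; response values are illustrative, not measured.
import numpy as np

def sn_larger_better(y):    # for responses to maximize, e.g., MRR
    return -10 * np.log10(np.mean(1.0 / np.asarray(y) ** 2))

def sn_smaller_better(y):   # for responses to minimize, e.g., TWR
    return -10 * np.log10(np.mean(np.asarray(y) ** 2))

mrr_by_current_level = {1: [4.1, 4.3], 2: [6.0, 5.8], 3: [7.9, 8.2]}
for level, ys in mrr_by_current_level.items():
    print(f"peak-current level {level}: S/N = {sn_larger_better(ys):.2f} dB")
# The level with the highest S/N ratio is the preferred setting for MRR.
```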

Chapter 14. Blind Quantitative Steganalysis Using CNN–Long Short-Term Memory Architecture

Covert communication is termed steganography, and unveiling details about this camouflaged communication is steganalysis. In the current era, steganalysis helps forensic examiners analyze and detect malicious communication. Feature extraction, a training phase and a testing phase are the three steps used by traditional machine learning algorithms to learn the complex characteristics of images before using them for analysis. Deep learning models, on the other hand, comprise a large number of hidden network layers which capture various statistical properties of digital images and learn them automatically. This paper proposes a novel deep neural network technique using a convolutional neural network (CNN) with long short-term memory (LSTM) for quantitative steganalysis, i.e., prediction of the embedded message length in a given stego-object. Experiments are performed using the Keras Python library. The statistical measure MAE (mean absolute error) is used to evaluate the proposed quantitative steganalysis. Experimental results show the good performance of the proposed model.

Anuradha Singhal, Punam Bedi
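
A minimal CNN-LSTM regressor in Keras of the kind described, trained with MAE loss; the layer sizes, input shape and synthetic data are placeholders rather than the chapter's configuration.

```python
# Hedged sketch: Conv1D features feeding an LSTM for message-length regression.
import numpy as np
from tensorflow.keras import layers, models

X = np.random.rand(64, 256, 1)    # 64 stego-objects, 256-step feature vectors
y = np.random.rand(64)            # normalized embedded-message lengths

model = models.Sequential([
    layers.Input(shape=(256, 1)),
    layers.Conv1D(16, 5, activation="relu"),   # local statistical features
    layers.MaxPooling1D(4),
    layers.LSTM(32),                           # longer-range dependencies
    layers.Dense(1),                           # predicted payload length
])
model.compile(optimizer="adam", loss="mae")    # MAE, as in the chapter
model.fit(X, y, epochs=2, batch_size=16, verbose=0)
print("MAE on training data:", model.evaluate(X, y, verbose=0))
```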

Chapter 15. Performance of Autoregressive Tree Model in Forecasting Cancer Patients

In this paper, our aim is to study time series data on cancer patients of Punjab, a state of India, by applying the autoregressive tree (ART) model. The ART model is an extension of the AR model which provides superior predictive accuracy compared to AR. In the ART model, the predictor and target variables are first decided on the basis of the data; the second step is to form a decision tree for the target variable by transforming the data. This extension of the AR model is very effective for forecasting, yet very little research has been done to date with the ART model. We chose this region for our research because Punjab is considered the cancer capital of India due to its rapidly increasing number of cancer-affected patients. Our goal is to forecast the future trend of cancer patients of Punjab; a case study of cancer incidence is also analyzed, and trends of common cancer types among males, females, and both are forecast. A hypothesis test and an evaluation of our forecast results, based on mean forecast error and mean square forecast error, are also provided. The purpose of this study is to make people and various organizations aware of this frightening issue in the region, and our results could support appropriate planning for eradicating this disease.

Sukhpal Kaur, Madhuchanda Rakshit
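
scikit-learn has no ART implementation, so the sketch below only mirrors the idea: lagged values become the predictors and a decision tree splits on them. The case counts are synthetic.

```python
# ART-like stand-in: decision tree regression on lagged time series values.
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(7)
cases = 100 + 1.5 * np.arange(120) + rng.normal(0, 3, 120)   # synthetic counts

lags = 4
X = np.array([cases[i:i + lags] for i in range(len(cases) - lags)])
y = cases[lags:]

tree = DecisionTreeRegressor(max_depth=3).fit(X[:-12], y[:-12])
pred = tree.predict(X[-12:])
print("MFE =", np.mean(y[-12:] - pred))          # mean forecast error
print("MSFE =", np.mean((y[-12:] - pred) ** 2))  # mean square forecast error
```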

Chapter 16. Developing a Usage Space Dimension Model to Investigate Influence of Intention to Use on Actual Usage of Mobile Phones

India is one of the largest mobile markets. This study aims to validate a usage space dimension model for investigating the influence of intention to use on actual usage of mobile phones, and the effect of gender and age on intention to use and actual usage across the usage space dimensions. The paper uses a systematic usage space analysis across age and gender to extend the intention-to-use construct from the original UTAUT model. Structural equation modeling (SEM) was used to develop the usage space dimension model. The empirical findings suggest that intention to use as a variable is not one-dimensional as initially proposed in the UTAUT model. Two distinct factors have emerged specific to mobile intention to use: inter-factors, pertaining to coordinating and using the mobile with the world outside; and intra-factors, pertaining to coordinating and using the mobile for oneself. The study concludes that both gender and age influence actual mobile phone usage, with age having a stronger influence on intra-intention. The study also validates usage dimensions as a viable and practical approach to understanding the adoption and integration of mobile phones in everyday life, and it helps give a picture of the potential adoption of new mobile features and applications. Using the usage space dimension, the study brings out the different ways mobile users apply mobile features in their daily lives.

Geeta Kumar, P. K. Kapur

Chapter 17. Examining the Relationship Between Customer-Oriented Success Factors, Customer Satisfaction, and Repurchase Intention for Mobile Commerce

The availability of smartphones at affordable prices and efficient data packs provided by telecom companies have shifted digitalization toward phone-enabled mobile commerce (m-commerce). This has forced firms to reengineer their strategies to incorporate mobile commerce within their organizational models. The paper proposes a conceptual model that includes the variables m-commerce application success (MAS), customer satisfaction (CS), and repurchase intention (RI) toward mobile commerce. The conceptual model analyzes the significance of the relationship of MAS with CS, and of CS and MAS with RI. The model has been validated through a survey of 530 respondents, mainly from northern India. The study was performed using the structural equation modeling (SEM) approach. Results validated the association of MAS with CS, and of CS with RI; however, no direct relation of MAS with RI was found. Future studies may extend the present model by incorporating a loyalty framework.

Abhishek Tandon, Himanshu Sharma, Anu Gupta Aggarwal

Chapter 18. Smart Industrial Packaging and Sorting System

In today's world, automation has become an inseparable part of our lives. At the same time, food is the basic unit on which humans survive, and when technology and food are considered together, it becomes important to heed hygiene, quality and cost. In today's scenario, 34% of the food available globally is packaged, which makes it crucial to address the shortcomings and advancements in production, packaging, inventory, supply chain, package handling, quality management and other related aspects. In conventional packaging systems, power failures and human error affect the efficiency of the system, causing mismatches between processes and waste of time, which is non-negotiable for any FMCG. This paper presents an engineering solution to these challenges. It includes a barcode sensor assisted by a load cell, a decision-making device (Arduino Mega 2560) and programmable logic controllers (PLCs) for the identification and sorting of CLDs or RSCs along a retracting conveyor. For the packaging solution, a revolving-disc-type lid closing system and a compact taping mechanism have been introduced, which can reduce dependency on manpower for CLD packing. A GSM module is placed in the feedback path to the microcontroller, so that any deviation from the default parameters is sensed automatically and the area of problem occurrence (AOC) is easily identified. In addition, this paper also discusses SCADA-implemented HMIs to meet industry standards of monitoring and control, along with overall condition monitoring and data acquisition for research and analysis.

Sameer Tripathi, Samraddh Shukla, Shivam Attrey, Amit Agrawal, Vikas Singh Bhadoria

Chapter 19. Credentials Safety and System Security Pay-off and Trade-off: Comfort Level Security Assurance Framework

System security is largely incomplete without credential safety. They are two sides of the same coin, but are seldom addressed together when considering unauthorized access attacks from a compromised system. The expression of credentials reflects ownership and assures system security, while unexpressed credentials assure credential safety. Limiting the expression of credentials to a safe extent is a limitation of current practices of monolithic authentication. We propose a comfort-level security assurance framework that pays off and trades off credential safety against the security requisites of the required functionality. A proof of concept demonstrates how to reduce the expression of credentials during the course of authentication and preserve privacy. Its safety, security and privacy are evaluated based on a user-adversary model, which demonstrates how comfort-level credential safety and application security can be achieved by correctly mapping the 'right credentials' to the 'right functionality' of an application.

Habib ur Rehman, Mohammed Nazir, Khurram Mustafa

Chapter 20. A Study of Microfinance on Sustainable Development

This article attempts to reach out to rural clients to identify the impact of microfinance on poverty reduction and women's economic empowerment, as well as on non-financial outcomes: gender equality, children's school enrolment (especially of girls), and health status. It also examines the developmental relationship of microfinance between India (Katihar) and Nepal (Morang). Primary data were surveyed from the two areas; the sample comprises 60 respondents from each district. The study systematically examines and analyses the quantitative data using statistical tools: percentages, means, standard deviations and t-tests processed in SPSS. The study provides evidence that microfinance contributes to the Sustainable Development Goals: many respondents agree, and some strongly agree, that it reduces poverty and improves women's economic empowerment, gender equality and children's school enrolment, especially of girls, although a few disagree. Moreover, microfinance is effective for poverty reduction, women's economic empowerment, gender equality and increased school enrolment of children, especially girls, in Morang district, whereas improvement in health status is effective in Katihar district. However, the study further indicates that there is no significant difference in the developmental impact of microfinance between the Katihar and Morang districts.

Shital Jhunjhunwala, Prasanna Vaidya

Chapter 21. Development of Decision Support System for a Paper Making Unit of a Paper Plant Using Genetic Algorithm Technique

In this paper, a decision support system (DSS) for the paper making unit of a paper plant has been developed using a genetic algorithm. Process industries such as paper plants, sugar mills and fertilizer plants are composed of various complex engineering units. A paper plant comprises chipping, digesting, washing, bleaching, screening, stock preparation and paper making units, among others, connected in a combined configuration. The paper making unit has four main subsystems. The mathematical model of the paper making unit has been developed using the Markov birth–death process. On the basis of a probabilistic approach, differential equations are formed from the transition diagram. These equations are solved recursively and then reduced to the steady-state conditions mostly required in the paper industry. The normalizing condition has been used to find the probability of the full working state with or without the use of standby subsystems, and a performance model has been developed for the paper making unit. Finally, the genetic algorithm technique is used to find the optimum value of the unit's performance by coordinating the parameters of all subsystems of the paper making unit.

Rajeev Khanduja, Mridul Sharma
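
A toy version of the pipeline, assuming four subsystems in series with exponential failure and repair: each unit's steady-state availability is mu/(mu + lambda), the series availability is their product, and a small hand-rolled genetic algorithm searches repair rates under a cost penalty. The rates and GA settings are invented.

```python
# Sketch: steady-state series availability tuned by a tiny genetic algorithm.
import numpy as np

FAIL = np.array([0.02, 0.015, 0.03, 0.01])     # assumed failure rates

def availability(mu):                           # product of unit availabilities
    return np.prod(mu / (mu + FAIL))

def fitness(mu):                                # availability minus repair cost
    return availability(mu) - 0.001 * mu.sum()

rng = np.random.default_rng(3)
pop = rng.uniform(0.1, 2.0, (30, 4))            # initial population
for _ in range(100):
    scores = np.array([fitness(ind) for ind in pop])
    parents = pop[np.argsort(scores)][-10:]     # selection of the fittest
    kids = parents[rng.integers(0, 10, 20)] + rng.normal(0, 0.05, (20, 4))
    pop = np.vstack([parents, np.clip(kids, 0.05, 2.0)])   # mutation

best = max(pop, key=fitness)
print("repair rates:", best.round(3), "availability:", availability(best))
```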

Chapter 22. The Challenge of Big Data in Official Statistics in India

The generation of all kinds of data has risen in recent years in terms of quantity, frequency and availability, which has led to the coining of the term big data. This abundance of data represents a challenge but also an extremely interesting opportunity in the field of Official Statistics, and several national statistics agencies have implemented big data initiatives. The United Nations, through its Statistics Commission, created the Global Working Group in 2014. The recommendations for the usage of big data in Official Statistics focus on several aspects which have become more specific through the years but can be summarized in the following points: (a) access to data that belong to other organizations, (b) the implementation of successful and collaborative partnerships with data providers, (c) the development of practical activities through pilot projects and (d) the development of appropriate methodology for the usage of big data in the production of Official Statistics. Another main impetus is the Government's push toward digitization, with initiatives like Digital India, which aims to transform the nation into a digitally empowered society, and the National Optical Fiber Network (NOFN) initiative, which intends to connect villages with optical fiber and provide high-speed Internet connectivity. The recent demonetization in India has brought about an attitudinal shift in the minds of the Indian people, with more people now moving toward digital payment mediums, and this increasing digitization of transactions is a positively correlated factor for big data analytics demand in the country. Likewise, there is also the government's unique ID project, which aims to give a unique digital identity, the AADHAR number, to each of its citizens for efficient delivery of services. Linking the AADHAR to various services like banking, healthcare and so on is expected to create a huge pool of data which would require big data analytics to convert the raw data into meaningful information and facilitate analysis. All this demands intense methodological and technical work, and it should address such topics as building capacities in those methodologies and creating specific positions for the incorporation of big data sources in the production of Official Statistics.

Pushpendra Kumar Verma, Preety

Chapter 23. CUDA Accelerated HAPO (C-HAPO) Algorithm for Fast Responses in Vehicular Ad Hoc Networks

With exponential growth in the number of vehicles plying the roads of metropolitan cities, there is an urgent need for means of controlling traffic congestion. Congestion increases travel cost, travel time and pollution, which have a substantial impact on both the health of people and the economy of the nation. To reduce congestion, one needs a routing algorithm that can detect congestion in advance and help avoid congested routes. Many algorithms have been proposed by researchers to address this problem. As the speed of vehicles is very high, the algorithm needs to compute its results in the minimum possible time. The hybrid ant particle optimization (HAPO) algorithm is used in the literature to choose the best route by avoiding the optimal but congested path during peak hours in vehicular ad hoc networks (VANETs). In this paper, we propose a CUDA-accelerated HAPO (C-HAPO) algorithm in which all phases of the HAPO algorithm are parallelized to provide faster computation by harnessing the compute unified device architecture (CUDA) on graphical processing units (GPUs). The proposed C-HAPO algorithm speeds up the computations of the existing HAPO algorithm significantly with respect to its counterparts under heavy traffic conditions and decreases travel time for commuters. The algorithm has been implemented on the NVIDIA architecture with CUDA toolkit version 7.5 on a real-time northwest Delhi map obtained through Google Maps.

Vinita Jindal, Punam Bedi
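
The parallelization idea can be illustrated with Numba's CUDA bindings: one GPU thread per road segment performing a pheromone evaporate-and-deposit update. This is not the authors' C-HAPO code, just one ACO-style step under assumed parameters, and it requires a CUDA-capable GPU to run.

```python
# Illustrative CUDA kernel (Numba): parallel pheromone update, one thread/edge.
import numpy as np
from numba import cuda

@cuda.jit
def update_pheromone(tau, deposit, rho):
    i = cuda.grid(1)                       # global thread index
    if i < tau.size:
        tau[i] = (1.0 - rho) * tau[i] + deposit[i]

n = 1_000_000
tau = cuda.to_device(np.ones(n, dtype=np.float32))
dep = cuda.to_device(np.random.rand(n).astype(np.float32))

threads = 256
blocks = (n + threads - 1) // threads
update_pheromone[blocks, threads](tau, dep, np.float32(0.5))
print(tau.copy_to_host()[:5])
```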

Chapter 24. A Survey of Portfolio Optimization with Emphasis on Investments Made by Housewives in Popular Portfolios

The paper is aimed at investors who are predominantly risk-averse and do not adopt broad allocation of investments, owing to lack of knowledge and adventurism. It applies mathematical tools of portfolio optimization to the simple and popular investments of housewives. After a two-stage process involving observation and analysis of the returns on investments made by housewives, an optimal choice of portfolio is suggested for them.

Sunita Sharma, Renu Tuli
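
The standard mean-variance machinery behind such a study can be sketched as below: minimize portfolio variance subject to a target return and no short selling (a natural constraint for a risk-averse investor). The expected returns and covariances are made up.

```python
# Mean-variance sketch: minimum-variance weights for a target return (scipy).
import numpy as np
from scipy.optimize import minimize

mu = np.array([0.06, 0.08, 0.12])          # assumed expected returns
cov = np.array([[0.02, 0.00, 0.01],
                [0.00, 0.01, 0.00],
                [0.01, 0.00, 0.05]])       # assumed covariance matrix
target = 0.08

res = minimize(
    lambda w: w @ cov @ w,                 # portfolio variance to minimize
    x0=np.ones(3) / 3,
    bounds=[(0, 1)] * 3,                   # no short selling
    constraints=[{"type": "eq", "fun": lambda w: w.sum() - 1},
                 {"type": "eq", "fun": lambda w: w @ mu - target}],
)
print("optimal weights:", res.x.round(3))
```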

Chapter 25. Single Vacation Policy for Discrete-Time Retrial Queue with Two Types of Customers

This analysis is devoted to modeling a retrial queue with Bernoulli feedback and preferred as well as impatient customers (units) in a discrete environment. Here, the server follows a state-dependent policy and may leave for a single vacation whenever it is idle. This investigation is motivated by the increasing impact of discrete-time retrial queues in real-world scenarios; for instance, they are widely used in the multiplexing of voice data, digital communication (ATM, BISDN), and switching modules and networks. In such queueing systems, time is a discrete random variable, calculated in equally sized data units. We consider a system with an early arrival pattern and study the Markov chain underlying this model. Along with this, the marginal generating function (mgf) for the total number of units present in the orbit, which depends on the server state, is obtained. Performance measures such as the average number of units in the orbit are also calculated by applying the probability generating function method. Further, a practical example is numerically illustrated and a sensitivity analysis is provided.

Geetika Malik, Shweta Upadhyaya

Chapter 26. Intuitionistic Fuzzy Hybrid Multi-criteria Decision-Making Approach with TOPSIS Method Using Entropy Measure for Weighting Criteria

Selecting a university that provides the best education and fulfils all criteria among many alternatives is a challenging task for a student. Hesitancy and vagueness in decision-making are best dealt with by intuitionistic fuzzy sets. University selection is a multi-criteria decision-making problem over several alternatives involving several academic and non-academic criteria, where it is difficult for a student to decide precisely based on the available information. The study is concerned with the criteria that influence a student's university selection and with establishing a multi-criteria model for ranking universities based on the important criteria affecting selection. In this paper, an intuitionistic fuzzy entropy-based MCDM approach with the TOPSIS method is proposed to determine the vagueness and exactness of alternatives over the relevant academic and non-academic criteria; to aggregate the decision-makers' opinions, an intuitionistic fuzzy operator is applied over the considered criteria for all alternatives, and the proposed method is applied to rank the universities.

Talat Parveen, H. D. Arora, Mansaf Alam
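
A crisp (non-intuitionistic) sketch of the entropy-weighted TOPSIS core: entropy gives more weight to criteria with more dispersion, and alternatives are ranked by closeness to the ideal solution. The chapter's intuitionistic fuzzy machinery is omitted, and the rating matrix is hypothetical.

```python
# Entropy-weighted TOPSIS sketch on a hypothetical university rating matrix.
import numpy as np

X = np.array([[7, 8, 6],          # rows: universities, cols: benefit criteria
              [9, 6, 7],
              [6, 9, 8]], dtype=float)

P = X / X.sum(axis=0)                               # column-wise proportions
e = -(P * np.log(P)).sum(axis=0) / np.log(len(X))   # entropy per criterion
w = (1 - e) / (1 - e).sum()                         # entropy weights

V = w * X / np.linalg.norm(X, axis=0)               # weighted normalized matrix
best, worst = V.max(axis=0), V.min(axis=0)          # ideal / anti-ideal points
d_best = np.linalg.norm(V - best, axis=1)
d_worst = np.linalg.norm(V - worst, axis=1)
closeness = d_worst / (d_best + d_worst)
print("ranking (best first):", np.argsort(-closeness))
```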

Chapter 27. Dynamic Analysis of Prey–Predator Model with Harvesting Prey Under the Effect of Pollution and Disease in Prey Species

In this paper, a prey–predator model has been developed in order to study the concurrent effect of contamination and the impact of disease on interacting species. The model, influenced by contaminants and with infection in the prey species, is proposed with the predator catching infective as well as susceptible prey. Boundedness and the existence of all equilibria have been established, and conditions for both local and global stability of the model have been derived. We also formulate the optimal control problem by choosing the control variable and give the optimal harvesting approach for the system by applying Pontryagin's maximum principle. Finally, numerical simulations along with graphical illustration have been performed to support our analytic outcomes.

Naina Arya, Sumit Kaur Bhatia, Sudipa Chauhan, Puneet Sharma
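
A numerical sketch of an eco-epidemiological system of this general kind (susceptible prey S, infected prey I, predator P); the functional forms and parameter values below are illustrative assumptions, not the chapter's model.

```python
# Simulate a susceptible/infected prey + predator system with scipy.
import numpy as np
from scipy.integrate import solve_ivp

r, beta, a1, a2, d1, d2, c = 1.0, 0.6, 0.4, 0.5, 0.3, 0.4, 0.3  # assumed

def rhs(t, y):
    S, I, P = y
    dS = r * S * (1 - (S + I)) - beta * S * I - a1 * S * P  # growth + infection
    dI = beta * S * I - a2 * I * P - d1 * I                 # infected prey
    dP = c * (a1 * S + a2 * I) * P - d2 * P                 # predator
    return [dS, dI, dP]

sol = solve_ivp(rhs, (0, 200), [0.8, 0.1, 0.2])
print("state at t=200:", sol.y[:, -1].round(4))   # long-run behaviour check
```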

Chapter 28. Univariate and Multivariate Process Capability Indices—Measures of Process Performance—A Case Study

Process capability indices such as Cp, Cpk, Cpm and Cpmk are commonly used in industry to assess the competence of a process to conform to specified limits on the quality parameters under study. As product designs become complicated to meet international standards and consumers' requirements change rapidly, quality characteristics must be analyzed simultaneously to improve product quality and also to account for multicollinearity among different quality characteristics. The objective of this paper is to measure univariate and multivariate capability indices to assess the performance of the multistage processes of paper manufacturing at Paswara Papers Ltd., Meerut. Paper manufacturing has various stages, namely pulping, screening, cleaning, deinking, refining, pressing, steam drying, bleaching, steam drying and size press. Different capability indices are measured for the various processes of paper manufacturing. The results reveal that the multivariate process capability index introduced by Shahriari and Lawrence (fourth industrial engineering research conference, pp 304–309, 1995, [6]) [Cpm = 0.6197] gives higher capability compared to Taam and Liddy (J Appl Stat 20:12, 1993, [5]) [MCpm = 0.1408] and Pan and Lee (Qual Reliab Eng Int 26:3–15, 2010, [7]) [NMCpm = 0.0130].

Vivek Tyagi, Lalit Kumar
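
The univariate indices mentioned follow directly from their textbook definitions; a sketch with invented specification limits and sample data:

```python
# Cp, Cpk and Cpm from their standard definitions (illustrative data).
import numpy as np

x = np.random.default_rng(5).normal(80.1, 0.9, 200)   # e.g., a paper property
LSL, USL = 77.0, 83.0                                  # assumed spec limits

mu, sigma = x.mean(), x.std(ddof=1)
Cp = (USL - LSL) / (6 * sigma)                         # potential capability
Cpk = min(USL - mu, mu - LSL) / (3 * sigma)            # penalizes off-centering
T = (LSL + USL) / 2                                    # target value
Cpm = Cp / np.sqrt(1 + ((mu - T) / sigma) ** 2)        # Taguchi-style index
print(f"Cp={Cp:.2f}  Cpk={Cpk:.2f}  Cpm={Cpm:.2f}")
```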

Chapter 29. Improving Customer Satisfaction Through Reduction in Post Release Defect Density (PRDD)

The quality of delivered software is a matter of significant concern to all stakeholders, especially customers. Post release defect density (PRDD) is commonly used as a measure of the quality of delivered software and is computationally defined as the number of defects leaked to the customer divided by the size of the software. PRDD is a major determinant of customer satisfaction, and hence improving PRDD would generally result in an increase in the level of customer satisfaction. This paper presents a case study that illustrates how to improve PRDD in enhancement and maintenance types of software projects. It describes the approach followed, which involved two stages and applied lean six sigma methodology. Improvements and changes were identified and implemented in a selected software project in stage one of the case study. After successful validation in the selected project, these improvements and changes were extended and replicated to other similar software projects in stage two. This paper postulates that improvements and changes in the planning of certain engineering activities in projects, selected based on statistical tools such as process capability analysis, scenario analysis, and sensitivity analysis, would result in a reduction in PRDD. It also postulates that tracking the above activities using statistical tools such as control charts and cause-and-effect diagrams would ensure the sustenance of improvements in PRDD. Finally, it illustrates how reduction in PRDD would eventually result in a higher level of customer satisfaction.

Hemanta Chandra Bhatt, Amit Thakur

Chapter 30. An Architecture for Data Unification in E-commerce using Graph

The graph model has emerged as a contemporary technique for relationship-centric applications because of features like index-free adjacency, schema-less data and faster traversal of relationships. In large-scale e-commerce applications, heterogeneous models are utilized for storing different types of data, e.g., the relational model for transactional processing, ontologies for product information and the graph model for user preferences. However, this creates the overhead of accessing multiple data models for query processing. In this paper, we present a data modeling architecture for e-commerce which can be utilized to unify data from different data models into a graph model. To verify the applicability of our approach, we analyze and compare its query performance with that of heterogeneous data models. In addition, we also discuss the issues and challenges of adopting the graph model for e-commerce applications.

Sonal Tuteja, Rajeev Kumar
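
The unification idea can be sketched with networkx: rows from relational-style stores become nodes and edges of one property graph, after which queries follow adjacency instead of joins. The entity and relationship names below are hypothetical.

```python
# Sketch: folding relational-style e-commerce records into one graph model.
import networkx as nx

users = [{"id": "u1", "name": "Asha"}]
products = [{"id": "p1", "name": "phone", "category": "electronics"}]
orders = [{"user": "u1", "product": "p1", "qty": 2}]    # transactional rows

G = nx.MultiDiGraph()
for u in users:
    G.add_node(u["id"], kind="user", name=u["name"])
for p in products:
    G.add_node(p["id"], kind="product", name=p["name"], category=p["category"])
for o in orders:                        # a relational row becomes an edge
    G.add_edge(o["user"], o["product"], kind="BOUGHT", qty=o["qty"])

# Index-free adjacency: products bought by user u1, no join required.
print([G.nodes[v]["name"] for v in G.successors("u1")])
```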

Chapter 31. Road Accidents in EU, USA and India: A critical analysis of Data Collection Framework

Transportation is the lifeline of any country across the globe, for well-known reasons. Being the lifeline, it faces several challenges, especially where road transportation and safety factors are concerned. Though safety is equally important in all modes of transportation, in road transport it has a higher priority due to the higher density of the population depending on it. Road accidents or crashes are one of the key concerns of road transportation safety, and all countries face challenges due to the associated fatalities and injuries, apart from the loss of other resources. Road safety is the backbone of a country, and policy planning and precautionary measures for an efficient road safety system depend on accident variables. Countries are working on minimizing these incidents, and the most common approach is to collect data from crashes and understand the patterns accordingly. The objective of this paper is to aggregate, explore, and outline the key parameters that are recorded at accident sites and act as key inputs in planning future strategies for the prevention and monitoring of such accidents. Accident data variables and collection procedures have evolved globally over time, with older systems replaced by new ones for recording accident data for better management and analysis of accidents. Examination of the systems in the countries considered reveals their different evolution, with similarities and dissimilarities in the road accident parameters recorded and maintained. This paper presents the accident data recording systems and key variables recorded in the USA, the European Union, and India. It explains the evolution of the different systems over time in the EU, viz. from CARE to CADaS, and in the USA with GES, CISS, CDS, CRSS, and FARS. These systems are explained with their data variables, and retrospective observations are discussed. The systems in India are also presented, and their limitations are discussed. The information collected from accident sites has special significance in understanding patterns and causes and in planning for safer infrastructure and systems.

Alok Nikhil Jha, Geetam Tiwari, Niladri Chatterjee

Chapter 32. Improving the QFD Methodology

The paper discusses theoretical provisions and practical recommendations for the implementation, development and use of QFD analysis to improve the competitive ability of the organization and customer satisfaction, and explores the application of the Weber–Fechner law. Based on an analysis of theoretical and practical work, it can be concluded that many studies at present are devoted to consumer satisfaction. However, most of this research concerns processes for assuring the quality of products at each stage of the life cycle, which would guarantee a result that meets the requirements and expectations of the consumer. Less attention is paid to the competitive ability of organizations, which is currently the most urgent problem for domestic manufacturers. A universal tool for developing new products is QFD analysis, which forms a continuous information flow, ensuring that all elements of the production system are interconnected and subordinated to consumer requirements. From the point of view of the organization's management, the introduction of QFD analysis is seen as an improvement in design, technology or process in order to save costs, improve quality and achieve other strategic goals that ensure the competitiveness of the organization. The disadvantage of the existing approach is that the method of calculating the significance of product characteristics does not take into account the degree of the organization's lag behind competitors.

Yury S. Klochkov, Albina Gazizulina, Maria Ostapenko

Chapter 33. Software Defect Prediction Based on Selected Features Using Neural Network and Decision Tree

Software defect prediction is a vital part of the software reliability field: software modules are identified as defective or non-defective based on the selection of the right set of software metrics. Software quality has become an emerging area for researchers, with challenges like defect prediction, defect removal and bug severity. The objective of this paper is to apply machine learning techniques for feature selection, removing redundant metrics, and to build a software defect prediction model to label software modules. We use principal component analysis as the feature selection technique to reduce redundant features on the basis of eigenvalues. After identifying the best set of software metrics, we develop software defect prediction models to classify software modules using an Artificial Neural Network (ANN) and a Decision Tree (DT). We have worked on data sets collected from a data repository available on GitHub; the best set of software metrics, obtained after removing irrelevant ones, acts as the predictors in the defect prediction model. We also compare the two classification techniques on the basis of F-score, AUC, precision, sensitivity and accuracy.

Prarna Mehta, Abhishek Tandon, Neha
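
The pipeline is straightforward to sketch with scikit-learn: PCA reduces the metric set, then an ANN and a DT are compared on the listed measures. The data here are synthetic stand-ins for the GitHub defect datasets.

```python
# PCA feature reduction + ANN/DT defect classifiers, compared on five metrics.
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import (accuracy_score, f1_score, precision_score,
                             recall_score, roc_auc_score)

X, y = make_classification(n_samples=500, n_features=20, random_state=0)
X = PCA(n_components=8).fit_transform(X)        # keep top components
Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)

for name, clf in [("ANN", MLPClassifier(max_iter=1000, random_state=0)),
                  ("DT", DecisionTreeClassifier(random_state=0))]:
    pred = clf.fit(Xtr, ytr).predict(Xte)
    print(name,
          f"F1={f1_score(yte, pred):.2f} AUC={roc_auc_score(yte, pred):.2f}",
          f"P={precision_score(yte, pred):.2f} R={recall_score(yte, pred):.2f}",
          f"Acc={accuracy_score(yte, pred):.2f}")
```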

Chapter 34. Testing-Effort Dependent Software Reliability Assessment Integrating Change Point, Imperfect Debugging and FRF

With tremendous growth in the use of software in today's hi-tech world, it is necessary to have high-quality, reliable software systems. Many SRGMs have been developed to assess reliability growth. In this paper, an SRGM incorporating a Weibull effort function is investigated. The time point at which the fault detection and removal rates change is termed the change point. Testers are unable to perfectly remove a fault on its detection and may add to the number of latent faults; this scenario of imperfect debugging leads to error generation. Here, we propose a testing-effort dependent software reliability growth model incorporating the fault reduction factor (FRF), change point and imperfect debugging. The model is derived for two cases: (i) when the FRF is represented by a logistic function and (ii) when the FRF is constant. For testing-effort expenditure, we use the Weibull function. The proposed models are tested on two real software fault datasets, and the results are compared with the perfect debugging scenario. The goodness-of-fit results show the accuracy of the proposed models in representing the failure behaviour of the software system.

Rajat Arora, Anu Gupta Aggarwal, Rubina Mittal
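
A common Weibull testing-effort form in this literature is W(t) = alpha(1 - exp(-beta t^gamma)); the sketch below fits it to synthetic cumulative-effort data with scipy, as a stand-in for the chapter's estimation step.

```python
# Fit a Weibull testing-effort curve to (synthetic) cumulative effort data.
import numpy as np
from scipy.optimize import curve_fit

def weibull_effort(t, alpha, beta, gamma):
    return alpha * (1.0 - np.exp(-beta * t ** gamma))

t = np.arange(1.0, 21.0)
noise = np.random.default_rng(2).normal(0, 1, t.size)
effort = weibull_effort(t, 100, 0.05, 1.4) + noise     # pretend observations

(alpha, beta, gamma), _ = curve_fit(weibull_effort, t, effort, p0=[80, 0.1, 1.0])
print(f"alpha={alpha:.1f} beta={beta:.3f} gamma={gamma:.2f}")
```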

Chapter 35. Assessing the Severity of Software Bug Using Neural Network

Bug severity defines to what extent a bug influences the software. Misclassified bug reports affect prediction performance and lead to bias. A multitude of relevant and irrelevant bugs are reported through bug-tracking systems, and it becomes necessary for the developer to prioritize the reported bugs: critical bugs need to be fixed immediately, while minor bugs can be fixed later on the basis of available resources. In this paper, we define different severity levels and sort bugs on the basis of those levels. Since it is problematic to sort tons of bugs manually, we use a text mining approach to collect words indicating the severity of reports obtained from Bugzilla for different components of ECLIPSE. The TF-IDF method is used to extract features and build a dictionary of critical words. We adopt a neural network for the classification of bug reports and measure its performance.

Ritu Bibyan, Sameer Anand, Ajay Jaiswal
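
The TF-IDF-plus-classifier pipeline looks roughly as below; the four toy summaries stand in for real Bugzilla reports from ECLIPSE components, and the network size is arbitrary.

```python
# TF-IDF features from bug summaries feeding a small neural network classifier.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.neural_network import MLPClassifier

summaries = ["crash on startup corrupts workspace",
             "NPE when opening editor",
             "typo in preferences dialog label",
             "UI freeze deadlock on save"]
severity = ["critical", "major", "minor", "critical"]

vec = TfidfVectorizer()
X = vec.fit_transform(summaries)                 # term-weighted features
clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000,
                    random_state=0).fit(X, severity)
print(clf.predict(vec.transform(["deadlock crash while saving file"])))
```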

Chapter 36. Optimal Refill Policy for New Product and Take-Back Quantity of Used Product with Deteriorating Items Under Inflation and Lead Time

This paper considers an inventory model in which the retailer sells new items to customers and also collects and trades used items. It is presumed that the demand rate for deteriorating items is a price-dependent quadratic function. The return of used items, as a price-dependent and linearly time-varying function, and the price-dependent quadratic demand function are discussed. Furthermore, the effect of inflation is also taken into consideration. The problem is expressed as a profit maximization problem for the retailer. The objective is to obtain the optimal selling price, the optimal ordering quantity of the new item, and the optimal take-back quantity of the used item simultaneously, such that the retailer's aggregate profit is maximized. A numerical example is included to discuss the sensitivity of the model.

S. R. Singh, Karuna Rana

Chapter 37. Blockchain-Based Social Network Infrastructure

More than ever, users' security and confidential data are at risk of exploitation due to ever-burgeoning cybercrime and the newer vulnerabilities exposed or discovered every other day. With increased connectivity via social networks, there is also an increased risk to data confidentiality and user privacy, never known before. These networks store a huge amount of private user data, mostly related to online interactions. This sensitive information is usually meant for very few people to see and is kept protected from the outside world or unauthorized access via various security techniques. This study investigates the online conduct of users and the perceived benefits of using online social networks. It highlights security issues and potential attacks on different aspects of users' security, and aims to provide a stable decentralized solution to control, manage and authenticate users' personal data, thus ensuring privacy and eliminating the need for a third party.

Charu Virmani, Tanu Choudhary

Chapter 38. Performance Analysis of an Active–Standby Embedded Cluster System with Software Rejuvenation, Fault Coverage and Reboot

The performance analysis of a standby-supported cluster system incorporating software (S/W) and hardware failures is performed in this paper. The cluster system, structured with multiple components, has a primary server considered the active server, while the remaining servers function as spare units. Any fault is successfully detected, located and recovered with a coverage probability; the coverage factor and reboot delay are also considered. To increase the performance capacity of the embedded system, a rejuvenation policy is applied to the S/W embedded cluster system, and a two-level software rejuvenation policy is taken into account in constructing a Markov model. To obtain performance measures such as availability and downtime cost, the steady-state probabilities are evaluated using the recursive method. Numerical results are provided to confirm the validity of the proposed model.

Preeti Singh, Madhu Jain, Rachana Gupta

Chapter 39. Analysis of Jeffcott Rotor and Rotor with Disk Using XLrotor

The work presented in this paper emphasizes understanding the causes of failures in rotors and the use of the XLrotor tool in designing reliable rotors. The approach to rotordynamics analysis through a simulation tool is to determine the undamped/damped critical speeds and the imbalance response with different configurations of bearings and mountings that cause changes in modes and mode shapes in synchronous vibration. The main aim of this study is to know the critical speeds of rotors at the design stage and to predict failure using the XLrotor computational tool, which improves the reliability of systems/machines by modifying the parameters. To measure and control extreme rotor vibrations, the simulation-capable XLrotor tool plays a vital role in redesigning rotor models. In this paper, an attempt is made to study and analyze the developed rotors through simulation, with validation, for understanding rotordynamics in reliability engineering. A Jeffcott rotor and a shaft with a disk are modeled, and the rotating machine is solved and analyzed for undamped critical speeds and modes. The impact of mass and stiffness on the rotor model is also studied and analyzed to define failure. This study gives an overview of the XLrotor software and demonstrates the power of simulation tools in reliability engineering.

H. N. Suresh, N. Madhusudan, D. Sarvana Bavan, B. S. Murgayya

Chapter 40. Foodie: An Automated Food Ordering System

This paper proposes an automated food selection and ordering system that reduces the time between ordering and serving dishes to customers. The system provides convenience to restaurant employees while they are on duty as well as to consumers who dine in at the restaurant, leading to greater customer satisfaction, which in turn helps improve business. The proposed system is fully secure and uses QR code and UID verification for placing an order at a specific restaurant. Every table in the restaurant is equipped with a user-friendly electronic menu through which the customer can select, place and pay for an order via an interactive interface. This QR-code-based method of placing orders increases efficiency and reduces time, replacing some stages of traditional order service without requiring a waiter to be present at the table.

Chetanya Batra, Gandharv Pathak, Siddharth Gupta, Saurabh Singh, Chetna Gupta, Varun Gupta

Chapter 41. Prioritization of Different Types of Software Vulnerabilities Using Structural Equation Modelling

Due to digitization and cloud computing, people nowadays prefer to save their important data online. But flaws in software systems pose a high risk to data security, and researchers are working on reducing both the risk and the time needed to fix such flaws. These software vulnerabilities present a high risk to the security of software systems. If a particular type of flaw or vulnerability prevails or occurs repeatedly and poses the highest risk, then our data can be protected by prioritizing such fixes. The work presented in this paper prioritizes the most severe types of vulnerabilities so that testers and fault fixers can work on the most severe ones first and decrease the risk they pose. Using the SEM technique, we first find out whether the different types of vulnerabilities are dependent on each other and quantify their interdependence. We then develop a structural equation model to recognize the severity of these vulnerabilities, which appears to be a more practical approach for determining the priority of fixing these flaws.

Swati Narang, P. K. Kapur, D. Damodaran

Chapter 42. Fog-Based Internet of Things Security Issues

Internet of things (IoT) environments differ in architecture and platform type; fundamentally, some are cloud based, and some use traditional data centres to support all necessary computation and data handling. In cloud-based IoT, the core of the cloud, where most of the computation takes place, is usually located away from the 'things'. Therefore, some researchers have recently proposed a fog-computing-based Internet of things, utilizing an adjacent layer of nodes (the fog) to provide the essential computational support from devices to servers and from device to device. Hence, fog provides a more distributed environment that can support IoT devices in the field. In this work, we analyse and compare the risks and advantages of using a cloud versus a fog cloud and discuss these issues. We show that the traditional cloud can be advantageous in some applications while fog can be better in others in terms of resistance to certain types of security attacks, each having its own strengths and weaknesses. Moreover, we review the pros and cons of using fog in an IoT environment and survey some of the common attacks that can target the IoT environment and devices.

Omar H. Alhazmi

Chapter 43. Estimation of Fatigue Life of PLCC Solder Joints Under Vibration Loading

Solder joints are a vital part of all electronic systems and play a critical role in ensuring system reliability. Of late, however, the failure distribution in electronic systems has shown an increase in fatigue failures of solder joints under thermal cycling and vibration loads. It is, therefore, essential to estimate the fatigue life of these solder joints if cyclic loads are prominently present in the environment. PLCC package solder joints are considered in this work to demonstrate the methodology of fatigue life estimation of solder joints. Both finite element analysis and vibration fatigue tests have been used to estimate the parameters of a Basquin model for fatigue. This model predicts the number of cycles to failure of PLCC solder joints for any vibration stress level.

Rohit Khatri, Diana Denice, Manoj Kumar
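
The Basquin relation S = A * N^b is linear on log-log axes, so its parameters follow from a straight-line fit; the test points below are hypothetical, not the chapter's measurements.

```python
# Fit Basquin's law S = A * N**b and invert it to predict fatigue life.
import numpy as np

cycles = np.array([1e5, 5e5, 2e6, 1e7])        # N: cycles to failure (assumed)
stress = np.array([60.0, 45.0, 33.0, 24.0])    # S: stress amplitude, MPa

b, logA = np.polyfit(np.log10(cycles), np.log10(stress), 1)
A = 10 ** logA
print(f"Basquin fit: S = {A:.1f} * N^({b:.3f})")

S = 40.0                                       # stress level of interest
N = (S / A) ** (1.0 / b)                       # inverted Basquin relation
print(f"predicted life at {S} MPa: {N:.2e} cycles")
```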