
About this book

This book presents real-world problems and pioneering research that reflect novel approaches to cybernetics, algorithms and software engineering in the context of intelligent systems. It gathers the peer-reviewed proceedings of the 2nd Computational Methods in Systems and Software 2018 (CoMeSySo 2018), a conference that broke down traditional barriers by being held online. The goal of the event was to provide an international forum for discussing the latest high-quality research results.



Efficient Framework to Secure Communication in IoT Using Novel Finite Field Encryption

Offering resiliency against major threats to the exponentially growing number of physical devices in the Internet of Things (IoT) is one of the most challenging problems in ubiquitous computing. Researchers have recently come up with many security solutions, only to find that they do not strike a proper equilibrium between security and communication performance. Therefore, this problem is addressed by proposing a novel analytical framework that uses highly enhanced finite field encryption to ensure that messages exchanged between IoT nodes are highly secured irrespective of the threat level. The study outcome shows that the proposed system offers reduced energy consumption, reduced delay, and maximized data delivery performance with a higher degree of data integrity and privacy preservation in contrast to frequently used security protocols in IoT.
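The abstract does not detail the encryption scheme itself. As general background on the finite-field arithmetic such schemes build on, here is a minimal sketch of multiplication in GF(2^8); the choice of field and of the AES reduction polynomial x^8 + x^4 + x^3 + x + 1 is an illustrative assumption, not taken from the paper:

```python
def gf256_mul(a: int, b: int) -> int:
    """Multiply two elements of GF(2^8) modulo the AES polynomial
    x^8 + x^4 + x^3 + x + 1 (0x11B), using shift-and-add.
    Addition in GF(2^m) is XOR; multiplication by x is a left shift
    followed by reduction when the result overflows 8 bits."""
    result = 0
    while b:
        if b & 1:          # add (XOR) the current multiple of a
            result ^= a
        a <<= 1            # multiply a by x
        if a & 0x100:      # reduce modulo the field polynomial
            a ^= 0x11B
        b >>= 1
    return result
```

For example, 0x53 and 0xCA are multiplicative inverses in this field, so their product is 0x01.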

Eisha Akanksha

A Novel Template-Based Data Structurization Scheme for Normalizing and Analyzing Medical Data

With the ever-increasing size of data in recent times, big data approaches are being widely adopted. This phenomenon can also be seen in the healthcare sector, where massive medical data are generated that are not only large in dimension but also highly unstructured. After reviewing existing approaches, it was found that existing systems do not offer a comprehensive and practical solution to the data structurization problem. Therefore, this paper presents a novel template-based data structurization scheme that takes unstructured medical data as input and subjects it to a novel structurization process. The system also contributes to obtaining knowledge related to the criticality of the disease associated with the dataset. The study outcome shows that the proposed system offers reduced response time for both data structurization and knowledge discovery.

A. S. Chandru, K. Seetharam

Indirect Optimal Approach Applied to H1N1 Spread Through Moroccan Regions

This study concerns modeling the H1N1 spread through the twelve official regions of Morocco using a discrete spatiotemporal epidemic model that includes many subsystems, which thoroughly describe the local characteristics inside each region and take into account the neighborhood impact. Each region has a known neighborhood deduced from the geographic map; the effect of this neighborhood is included in the model as a second process of transmission. We endeavor to reduce the number of infectious individuals in the target area, an inaccessible region, through control of certain accessible regions. Hence, we propose an indirect optimal control approach in order to reduce infected individuals in the target region at minimum cost. The optimality system is a two-point boundary value problem with separated boundary conditions. It is solved by an iterative method based on the Forward-Backward Sweep Method (FBSM). The effectiveness of the proposed approach is examined via numerous numerical illustrations.
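The Forward-Backward Sweep Method iterates a forward pass of the state equation and a backward pass of the adjoint equation, updating the control from the optimality condition. A minimal sketch on a scalar linear-quadratic problem (minimize the integral of x^2 + u^2 subject to dx/dt = -x + u, x(0) = 1), not the paper's epidemic model, which is far larger:

```python
import numpy as np

# Illustrative FBSM: Hamiltonian H = x^2 + u^2 + lam*(-x + u),
# adjoint lam' = -dH/dx = -2x + lam with lam(T) = 0, and dH/du = 0
# gives the control update u = -lam/2.
N, T = 1000, 1.0
h = T / N
u = np.zeros(N + 1)                 # initial control guess

def forward(u):
    x = np.empty(N + 1); x[0] = 1.0
    for k in range(N):              # explicit Euler, forward in time
        x[k + 1] = x[k] + h * (-x[k] + u[k])
    return x

def backward(x):
    lam = np.empty(N + 1); lam[-1] = 0.0   # transversality condition
    for k in range(N, 0, -1):       # Euler, backward in time
        lam[k - 1] = lam[k] - h * (-2 * x[k] + lam[k])
    return lam

def cost(x, u):
    return h * np.sum(x[:-1] ** 2 + u[:-1] ** 2)

for _ in range(50):                 # sweep until the control settles
    x = forward(u)
    lam = backward(x)
    u = 0.5 * u + 0.5 * (-lam / 2)  # relaxed update from dH/du = 0
```

The relaxation in the control update (averaging old and new controls) is the standard device for making the sweep converge.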

Amine Bouaine, Mostafa Rachik

Masking Operator Based Genetic Algorithm Optimization for Edge Detection in Images

In the proposed paper, genetic algorithm optimization of a masking operator for edge detection is implemented to give an optimized working model. The genetic algorithm helps in finding the operator values by optimizing the parameters to reach the target solution. Berkeley Segmentation Database images and their ground truths are used for experimentation and qualitative evaluation of the proposed model. Compared with other available methods, genetic-algorithm-optimized edge detection plays a key role in understanding the features of images, and helps in matching patterns and recognizing objects accurately.
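As a toy version of the idea (not the paper's operator or fitness function), the loop below evolves a single gradient-threshold parameter on a synthetic image, scoring each candidate against a ground-truth edge map with an F1 fitness, to show the selection/crossover/mutation cycle:

```python
import random
import numpy as np

random.seed(0); np.random.seed(0)

# Synthetic 32x32 image: dark left half, bright right half -> one vertical edge.
img = np.zeros((32, 32)); img[:, 16:] = 1.0
img += np.random.normal(0, 0.05, img.shape)
truth = np.zeros((32, 32), bool); truth[:, 15:17] = True  # ground-truth edge band

def edges(threshold):
    gx = np.abs(np.diff(img, axis=1, prepend=img[:, :1]))  # horizontal gradient
    return gx > threshold

def fitness(threshold):            # F1 score versus the ground truth
    e = edges(threshold)
    tp = np.sum(e & truth); fp = np.sum(e & ~truth); fn = np.sum(~e & truth)
    return 2 * tp / (2 * tp + fp + fn + 1e-9)

pop = [random.uniform(0.01, 1.0) for _ in range(20)]
for _ in range(30):                # selection, crossover, mutation
    pop.sort(key=fitness, reverse=True)
    parents = pop[:10]
    children = []
    while len(children) < 10:
        a, b = random.sample(parents, 2)
        child = (a + b) / 2 + random.gauss(0, 0.02)   # blend crossover + mutation
        children.append(min(max(child, 0.001), 1.0))
    pop = parents + children

best = max(pop, key=fitness)
```

In the paper the genome would encode the masking-operator coefficients themselves rather than a single threshold; the driving loop is the same.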

G. Kavitha

A Novel Design and Implementation of Imaging Chip Using AXI Protocol for MPSOC on FPGA

Medical images are the main source for the analysis of diseases. These images have visual quality (VQ) issues such as contrast, glare, and noise, which restrict the proper analysis of disease. In order to overcome these issues, various techniques have been presented in the last decade. Digital image enhancement (IE) is the technique which improves the VQ of the image, converting it into an image with the particular VQ required for an application. This paper introduces a novel approach to IE using a Field Programmable Gate Array (FPGA) by implementing different algorithms (brightness control, contrast adjustment, thresholding, negative transformation, filtering) for brain imaging. The proposed IE technique implements a Multi-Processor System on Chip (MPSoC) over the FPGA. The MPSoC is a distinct computer architecture, while the FPGA is the hardware on which the image processing algorithms can be implemented properly. The proposed IE methodology assures a low-cost system with high accuracy. The proposed technique incorporates the Advanced Microcontroller Bus Architecture (AMBA), which provides the interface for communication between master and slave modules.
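The point operations named in the abstract (brightness, contrast, thresholding, negative) have simple software reference models of the kind typically used to verify an FPGA implementation; a minimal 8-bit sketch, with illustrative parameter conventions not taken from the paper:

```python
import numpy as np

def brightness(img, offset):   # add an offset, saturating to 8 bits
    return np.clip(img.astype(int) + offset, 0, 255).astype(np.uint8)

def contrast(img, gain):       # scale pixel values around mid-gray 128
    return np.clip((img.astype(float) - 128) * gain + 128, 0, 255).astype(np.uint8)

def threshold(img, t):         # binarize at level t
    return np.where(img >= t, 255, 0).astype(np.uint8)

def negative(img):             # invert the 8-bit range
    return (255 - img.astype(int)).astype(np.uint8)
```

In hardware each of these maps to a per-pixel datapath; the software model supplies golden outputs for simulation.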

H. R. Archana, K. S. Vasundara Patel

Simple Modeling of Mobile Foreground Detection Using Probabilistic Linear Estimation Approach

The problems associated with precise, simple, and instantaneous identification of mobile objects have been researched by various investigators. However, a review of existing systems shows that they are quite complex and time-consuming, and cannot assist in identifying objects from multi-feed events. Hence, the proposed system offers a novel framework that is capable of identifying multiple objects from multiple feeds at different time scales. The implementation is carried out using an analytical modeling approach that applies probability theory to perform discrete-time modeling integrated with linear state space modeling. The proposed system also introduces a simple clustering process that uses a depth map to facilitate superior identification of the foreground. The study outcome was found to offer almost instantaneous identification across multiple video feeds.

G. Madhu Chandra, G. M. Sreeramareddy

Significant Simulation Parameters for RESTART/LRE Method in Teletraffic Systems of Ultra Broadband Convergence Networks

Rare events such as blocking, overflows, and losses in broadband convergence networks, especially in systems with very low probability, are one of the major estimation problems for ordinary simulation methods. Speed-up simulation is the most convenient method for the estimation of rare events. The main purpose of the research is to determine and compare the significant parameters for speed-up simulation of Ultra Broadband Convergence Networks. The simulation opportunities represent a chance to reduce the digital divide and to provide Ultra Broadband Convergence Networks at minimum cost in time and money. The simulation of teletraffic models allows the modeling of resource management issues and the admission of system users with different parameters.

Elena Ivanova, Teodor Iliev, Grigor Mihaylov, Ventsislav Keseev, Ivaylo Stoyanov

The Implementation of an SPC Chart to Improve the Forecasting Accuracy of the ARIMA Models

The capability of forecasting techniques is based on historical data, and autocorrelation is one of the structures used to construct a predicting model. However, a special cause, which cannot be explained by the model, always randomly occurs in the process and can significantly downgrade the performance of the forecasting model. Therefore, the statistical process control (SPC) technique widely used in the quality control area is applied to detect the outlier so that it can be removed and not be a part of the historical observations. However, since time series data are mostly autocorrelated, the basic assumption of the SPC methodology is violated. In this research, the Box-Jenkins autoregressive integrated moving average (ARIMA) models were used to filter the autocorrelation so that only the pure residual is monitored by the SPC. To illustrate the implementation, actual data from the Singapore Mercantile Exchange, the daily prices of black pepper and West Texas Intermediate crude oil future contracts, were used to represent two categories of autocorrelated data, stationary and non-stationary. As the first step, the historical data were modeled by the ARIMA models. Afterwards, the residuals from the models were monitored by the X-MR chart and the outliers were identified and removed. The study shows that the forecasting errors for both stationary and non-stationary cases are significantly improved after the outliers were systematically removed.
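The X-MR (individuals and moving range) chart applied to the residuals can be sketched in a few lines: the control limits on the individuals chart are the mean plus or minus 2.66 times the average moving range (2.66 = 3/d2 with d2 = 1.128, the standard X-MR constant). The ARIMA fitting step is assumed to have already produced the residuals:

```python
import numpy as np

def xmr_outliers(residuals):
    """Flag points outside the individuals (X) chart limits.
    Limits: mean +/- 2.66 * average moving range."""
    r = np.asarray(residuals, float)
    mr = np.abs(np.diff(r))            # moving ranges of successive points
    center = r.mean()
    half_width = 2.66 * mr.mean()      # 2.66 = 3 / d2, with d2 = 1.128
    ucl, lcl = center + half_width, center - half_width
    return np.where((r > ucl) | (r < lcl))[0], (lcl, ucl)
```

A residual series with one injected special cause would have that point flagged and removed before refitting the forecast model.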

Karin Kandananond

An Approach for Reducing Vulnerabilities in Web Information Systems

The evolution of the Internet has brought a significant change in the evolution of software, resulting in an increasing presence of information systems in Web environments and, consequently, an increase in security vulnerabilities and threats. In this context, secure application development has become a crucial component for information systems in the market. Exploring the main vulnerabilities in the coding of software for Web environments and the need for awareness and guidance for developers of information systems, this paper offers an approach for the treatment of vulnerabilities in Web applications, composed of the selection and mapping of categories of major vulnerabilities, risks, and existing proactive controls to prevent or treat occurrences of exploitation of vulnerabilities by malicious agents.

Gleidson Sobreira Leite, Adriano Bessa Albuquerque

An Approach of Filtering to Select IMFs of EEMD in Signal Processing for Acoustic Emission [AE] Sensors

The pipeline system is an important part of media transportation for oil and gas transmission, but weak maintenance leads to corrosion, leakage, stresses, and mechanical damage of oil and gas pipelines. Signal processing is used to decompose the raw signal, and the analysis is performed in the time-frequency domain. A number of existing signal processing methods can be used for extracting useful information. However, the problem of highlighting the wanted information while attenuating the undesired signal is non-trivial, and several signal processing methods have been implemented to solve it. Research using the Empirical Mode Decomposition (EMD) algorithm shows promising results in comparison to other signal processing methods, especially in accurately showing the relationship between signal energy and the time-frequency distribution by representing a series of stationary signals with different amplitudes and frequency bands. However, the EMD algorithm still suffers from noise contamination that may compromise the accuracy of the signal processing in highlighting the wanted information, owing to the mode mixing phenomenon in the Intrinsic Mode Functions (IMFs) caused by the undesirable signal mixed with additional noise. There is still room for improvement in the selection accuracy of the sensitive IMFs after decomposition, which can influence the correctness of feature extraction for oxidized carbon steel. Using two data sets from Acoustic Emission (AE) sensors, signal processing flows are presented in this paper. Wave propagation in the pipeline is a key parameter in the acoustic method when a leak occurs. More experiments and simulations need to be carried out to obtain further results on the leakage signature and the localisation of defects.
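One common post-EMD/EEMD filtering rule, stated here as an assumption since the abstract does not give the paper's exact criterion, keeps the IMFs whose correlation with the raw signal exceeds a fraction of the largest correlation. A sketch, with synthetic components standing in for real EEMD output:

```python
import numpy as np

def select_sensitive(signal, imfs, factor=0.5):
    """Keep the IMFs whose absolute Pearson correlation with the raw
    signal is at least `factor` times the maximum correlation."""
    corr = [abs(np.corrcoef(signal, imf)[0, 1]) for imf in imfs]
    cut = factor * max(corr)
    return [i for i, c in enumerate(corr) if c >= cut]

# Synthetic stand-ins for EEMD output: an AE-like burst, a slow drift, noise.
t = np.linspace(0, 1, 2000)
burst = np.sin(2 * np.pi * 150 * t) * np.exp(-30 * (t - 0.5) ** 2)
drift = 0.3 * np.sin(2 * np.pi * 2 * t)
noise = 0.05 * np.random.default_rng(0).normal(size=t.size)
raw = burst + drift + noise
imfs = [burst, drift, noise]
```

The energetic burst component passes the cut while the low-energy noise component is rejected, which is the intended behavior of the selection step.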

Nur Syakirah Mohd Jaafar, Izzatdin Abdul Aziz, Jafreezal Jaafar, Ahmad Kamil Mahmood

Influence Analysis of Selected Factors in the Function Point Work Effort Estimation

The paper presents an influence analysis of selected factors (independent variables: function point count approach, business area, industry sector, relative size) on the estimation of the work effort, for which the function point method is primarily used. Productivity influencing factors and the productivity estimation capability of the function point method are studied as well.

Zdenka Prokopova, Petr Silhavy, Radek Silhavy

The Performance Evaluation for the Efficiency of Coastal Regional Innovation Network Based on DEA

In order to evaluate the innovation efficiency of China's coastal regional innovation network accurately, this paper selects 11 coastal provinces, municipalities, and autonomous regions, and simplifies the published financial data by the factor analysis method and the principal component analysis method. We build the evaluation index system for the innovation efficiency of China's coastal regional innovation network from the interaction between scientific research institutions and the symbiotic environment, the interaction between enterprises and the symbiotic environment, and the interaction between scientific research institutions and enterprises. This paper analyzes the evaluation of the innovation efficiency of China's coastal regional innovation network, gives the input-output DEA model, and also presents the coastal regional innovation network evaluation results.

Song Cheng, Hu Longying, Yuan Haiyan

Models and Algorithms of Vector Optimization in Selecting Security Measures for Higher Education Institution’s Information Learning Environment

The paper looks at the features of the application of the Edgeworth-Pareto method in an effort to address the multicriteria discrete optimization problem associated with the assessment of security in a higher education institution's (HEI) information learning environment (ILE) and its technical means of protection (TMP). A modification to the method, based on a combination of the discrete optimization of Edgeworth-Pareto and Podinovski's lexicographic method, is proposed. An algorithm for choosing the rational decision from the suggested TMP for the ILE of an HEI is proposed. In order to highlight a variety of Pareto-optimal solutions, a vector criterion for evaluating output is used, which considers two optimality conditions as its elements: the cost assessment of the analyzed TMP and its technical efficiency evaluation.
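The core of the two-criterion selection (minimize TMP cost, maximize technical efficiency) is a Pareto-dominance filter; a minimal sketch with hypothetical option values, not the paper's data or its lexicographic refinement:

```python
def pareto_optimal(options):
    """Return indices of Pareto-optimal (cost, efficiency) pairs: an option
    is dominated if another has cost <= and efficiency >= with at least one
    strict inequality (cost is minimized, efficiency maximized)."""
    front = []
    for i, (ci, ei) in enumerate(options):
        dominated = any(
            (cj <= ci and ej >= ei) and (cj < ci or ej > ei)
            for j, (cj, ej) in enumerate(options) if j != i
        )
        if not dominated:
            front.append(i)
    return front

# Hypothetical TMP options as (cost, efficiency) pairs.
tmp_options = [(10, 0.6), (15, 0.9), (12, 0.6), (20, 0.7), (15, 0.95)]
```

A second, lexicographic pass of the kind the paper combines with this filter would then rank the surviving front by a priority ordering of the criteria.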

Berik Akhmetov, Valeriy Lakhno, Bakhytzhan Akhmetov, Yuri Myakuhin, Asselkhan Adranova, Lazat Kydyralina

Implementing DevOps in Legacy Systems

Legacy systems are a challenge for the operations of modern organizations, as they limit their ability to change and grow the business. However, they are often systems suited to their purpose: they deliver the expected value, and investment in them is justifiable. The concept of DevOps has come to reduce the separation between existing development and operations teams in software development companies, shortening the product lifecycle. With this concept comes the practice of continuous delivery, allowing teams to deliver and deploy any version of their software in any computing environment at any time through a fully automated process. This practice improves the feedback of the development process, so that problems are identified and resolved as early as possible. In this paper, we present a case study of deploying DevOps, and propose a formal process for deploying DevOps together with the modifications necessary to adapt it to legacy systems, so that the delivery process of the legacy systems has a short, high-quality lifecycle, delivering frequent versions in an automated way.

Adriano Bessa Albuquerque, Vinicius Lima Cruz

Development of Sectoral Intellectualized Expert Systems and Decision Making Support Systems in Cybersecurity

The paper considers the prerequisites for the integration of various expert and decision support systems for information security and cybersecurity. The possibility of sectoral pooling and sharing of the knowledge bases of such intellectualized systems is analyzed. The model of the knowledge management subsystem in sectoral expert and decision making support systems for the information security and cybersecurity of critical computer systems is described. The article also presents the results of studies of the knowledge management systems of existing local expert systems and decision making support systems for information protection. Algorithms for the distribution of requests between similar systems are specified.

Bakhytzhan Akhmetov, Valeriy Lakhno, Berik Akhmetov, Zhuldyz Alimseitova

Optimal Multi-robot Path Finding Algorithm Based on A*

This paper presents an optimal path finding algorithm for a group of robots. The presented approach is based on the A* graph traversal algorithm and contains modifications that make it possible to apply the strengths of the original algorithm to the multi-robot path planning problem. The proposed modification is to dynamically change the costs of nodes used in the path of one robot while planning the route for another; in other words, the cost is given the meaning of the time a robot needs to wait before it can take the node. Although the presented approach was used to solve the multi-robot path finding problem on an unknown map, it can be successfully applied to solve navigation problems of groups of mobile agents on known maps. Computer simulation showed reliable results for maps of different configurations.
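The idea of charging later robots a waiting cost for nodes already claimed can be sketched as sequential time-expanded A* with a reservation table. This is a simplification under stated assumptions: vertex conflicts only (head-to-head swaps are not checked), 4-connected grid, and hard-coded horizon constants; it is not the paper's exact cost formulation:

```python
import heapq

def plan(grid, start, goal, reserved, max_t=100):
    """Time-expanded A* on a 4-connected grid. `reserved` holds (cell, t)
    pairs claimed by earlier robots; waiting in place is a legal move."""
    rows, cols = len(grid), len(grid[0])
    h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])  # Manhattan heuristic
    open_set = [(h(start), 0, start, [start])]
    seen = set()
    while open_set:
        f, t, pos, path = heapq.heappop(open_set)
        if pos == goal:
            return path
        if (pos, t) in seen or t >= max_t:
            continue
        seen.add((pos, t))
        r, c = pos
        for dr, dc in ((0, 1), (0, -1), (1, 0), (-1, 0), (0, 0)):  # (0,0) = wait
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0 \
                    and ((nr, nc), t + 1) not in reserved:
                heapq.heappush(open_set,
                               (t + 1 + h((nr, nc)), t + 1, (nr, nc),
                                path + [(nr, nc)]))
    return None

def plan_all(grid, tasks):
    reserved, paths = set(), []
    for start, goal in tasks:   # plan sequentially; later robots avoid earlier ones
        path = plan(grid, start, goal, reserved)
        for t, cell in enumerate(path):
            reserved.add((cell, t))
        for t in range(len(path), 110):   # keep the goal cell occupied afterwards
            reserved.add((path[-1], t))
        paths.append(path)
    return paths
```

Two robots crossing the same corridor force the second to wait or detour, which is exactly the waiting-cost behavior the abstract describes.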

Alexander Erokhin, Vladimir Erokhin, Sergey Sotnikov, Anatoly Gogolevsky

An Agent-Architecture for Automated Decision-Making on the Semantic Web

Semantic web research has largely concentrated on the representation of web-based knowledge and data, often in the form of ontologies. Only a minority of the research has explored the implementation of intelligent agents that can use the web-based knowledge. Even less research has generalized intelligent agent implementations into an architecture. Here, an implementation and architecture are described for a multi-agent decision-making system using available web-based data and information.

Richard Fox, Dustin Gulley

An Efficient Hardware Realization of DCT Based Color Image Mosaicing System on FPGA

Researchers of surveillance systems in today's world are exploring various ways to capture continuous image sequences. Capturing these image sequences in a single plane is highly challenging. As a consequence, camera arrays have sprung into action to capture the image sequences in a single plane. This concept of combining the images captured by an array of cameras is referred to as mosaicing. Image mosaicing generally encounters the problem of sacrificing image quality because of an improper mode of image capturing. In order to tackle this problem, the cross correlation (Ccr) method can be implemented; however, it again faces quality problems under brightness variations. This paper introduces a Discrete Cosine Transform (DCT) based image mosaicing method that works effectively against brightness issues and is suitable for FPGA implementation. The proposal comprises functional blocks for the DCT, inverse DCT, image registration, and multipliers and dividers. The image quality assessment parameters PSNR and MSE, as well as device utilization and timing analysis, were considered for the performance analysis of the proposed scheme using Xilinx 14.7 ISE, simulated using ModelSim 6.3f. The proposed image mosaicing system gives excellent results in terms of image quality (PSNR of over 35 dB) and a fast execution time of 23 ms per image of size 1600 × 1200 pixels.
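The PSNR and MSE figures quoted in the abstract follow from their standard definitions for 8-bit images; a reference sketch of the two metrics:

```python
import numpy as np

def mse(a, b):
    """Mean squared error between two images of the same shape."""
    a, b = a.astype(float), b.astype(float)
    return np.mean((a - b) ** 2)

def psnr(a, b, peak=255.0):
    """Peak signal-to-noise ratio in dB for `peak`-valued images."""
    m = mse(a, b)
    return float('inf') if m == 0 else 10 * np.log10(peak ** 2 / m)
```

A PSNR above 35 dB, as reported, corresponds to an MSE below roughly 20.6 for 8-bit data.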

H. Jayalaxmi, S. Ramachandran

Semantic History: Ontology-Based Modeling of Users’ Web Browsing Behaviors for Improved Web Page Revisitation

Web Browsers are software solutions that facilitate users in browsing the Web. However, the huge size of the Web makes it difficult to find relevant resources, resulting in information and cognitive overload. To mitigate this overload, researchers have attempted to find ways for re-visitation of web pages that are deemed useful and more likely to be revisited. Also, web browsers have several built-in tools including history, bookmarks, backward & forward buttons, Uniform Resource Locator (URL) auto-completion, and so on. This research focuses on web browser history, which maintains details of visited web pages with their associated metadata to enable users in finding and re-finding (revisitation) web pages without encountering the information and cognitive overload. In addition to the built-in history tools in web browsers, several third-party tools in the form of toolbars, extensions, and add-ons are available. However, these solutions exploit no or limited web page-level semantics and fail to provide full revisitation support to the users. It is, therefore, necessary to fill this semantic gap by exploiting web page-level semantics, which is the aim of this paper. We contribute “Browser History Ontology,” and use it in our developed Chrome-based browser extension, namely “Semantic History.” Experimental results show that our proposed solution provides better re-visitation support to the users by semantically organizing the web browser history.

Imam ud Din, Shah Khusro, Irfan Ullah, Azhar Rauf

A Recovery Technique for the Fog-Computing-Based Information and Control Systems

The current paper deals with the dependability of fog-computing-based information and control systems (ICS). Compared with traditional "cloud" ICS architectures, the fog-computing paradigm brings the possibility to distribute the workload not only among the central computational units (CU), but also to shift it to the fog layer and edge devices of the ICS infrastructure. This allows the ICS to be reconfigured in case of CU failure so as to maximize the CU reliability function value by shifting the workload to the edge of the network. In this paper, a new recovery technique for fog-computing-based ICSs is presented and discussed. The technique is based on simplified models of the workload distribution in fog-computing-based systems. In addition, some important conclusions are drawn and selected simulation results are presented.

Eduard Melnik, Anna Klimenko, Vladislav Klimenko

An Examination of Message Ferry Movement for Delay Tolerant Networks

After a disaster has occurred, it is extremely difficult to confirm the safety of victims in an environment in which the communication infrastructure has been destroyed. For this reason, Delay Tolerant Networking (DTN), which makes communication possible even in environments without adequate communication infrastructure, is being actively researched. Moreover, a relaying technology called message ferrying, which utilizes Unmanned Aerial Vehicles (UAVs), has been attracting attention as a means of mediating between sites that are not connected for communication during a disaster, thus facilitating the exchange of data between them. In order to improve the performance of message ferrying, efficient movement of the UAVs is necessary to ensure that the safety information of more victims can be recovered in the limited time available during a disaster. In this study, evaluation experiments utilizing actual UAVs are conducted, UAV movement conditions are examined in light of the results of the experiments, and the ideal movement conditions of UAVs are described, with the overall objective of clarifying the necessary conditions for efficient UAV movement.

Takahiro Koita, Minami Yoneda

Comparison of ERP Systems with Blockchain Platform

This research paper is devoted to a comparison of ERP systems and the blockchain platform. Although it is too early to talk about the widespread use of blockchain platforms, people want to know how this technology can be used in their work compared to standard software products. One such product is the ERP system. Today, experts hold differing opinions about the applicability and opportunities of blockchain technology. This research paper presents a definition of the main terms, the benefits, the sphere of applicability, and the advantages and disadvantages of ERP systems and blockchain technology. In addition, this survey describes how ERP systems and the blockchain platform can work in different areas. The results show the ambiguity of applying the blockchain platform at the current moment. ERP systems also have some features, which are reflected in the conclusions of the research paper.

Boris Sokolov, Anton Kolosov

Image Content Protection Using Hybrid Approach of Blocking and Embedding Algorithm in the Context of Social Network

The security of informative data plays a crucial role in the current world. Reversible data hiding mechanisms are able to embed secret data in an image in a secure way, so that the receiver can recover the image data without any loss. However, the existing data hiding mechanisms are able to embed the data only after image encryption, which may cause errors during the extraction and recovery of the image. The security of image content shared on collaborative (social) networks is the de facto research topic today for minimizing losses. This paper presents a hybrid mechanism of blocking and embedding for image data protection in the context of a collaborative network. The outcomes of the system show an improved execution time for the optimized transformation (29.3 s) compared to the normal transformation (139.17 s). Also, PSNR values of 53.15 and 53.13 are observed for two sets of image data, which represents improved image quality after recovery.

M. Naveen, G. N. K. Suresh Babu

Calculation of Robustly Stabilizing PI Controllers for Linear Time-Invariant Systems with Multiplicative Uncertainty

The contribution presents a method of computing robustly stabilizing PI controllers for linear time-invariant systems with unstructured multiplicative uncertainty. This graphical technique is based on the application of the basic robust stability condition and on plotting the robust stability border pairs of P-I parameters, which subsequently leads to the robust stability region. An illustrative example is presented to show the effectiveness of this straightforward approach.

Radek Matušů

Data Extraction from the Distorted DCF77 Signal Captured Using Low-Cost Receivers

The DCF77 time signal is very often used in devices that need accurate and stable synchronization to astronomic time. It has been in use since 1959, is very well documented, and is available for public usage. Nowadays it is quite easy to buy a complete DCF77 signal receiver that costs €3-4 and outputs an already demodulated digital signal ready for further decoding. However, the DCF77 signal can be significantly distorted: it is modulated using amplitude modulation (AM), which is very prone to noise; the relatively long distance between the transmitter in Mainflingen (Germany) and the end devices influences the signal level; and some cheap DCF77 signal receivers are of low quality. In this paper, a solution for extracting correct data from the distorted DCF77 signal captured using low-cost receivers is presented. As the proposed solution has low computational complexity, it can easily be implemented on a low-performance MCU, which is shown in this paper as well.
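For context on the decoding task: DCF77 encodes a 0 as a roughly 100 ms carrier reduction and a 1 as roughly 200 ms, with the minute carried in BCD in bits 21-27 (LSB first) and an even parity bit at position 28. A minimal sketch of pulse classification and minute decoding (the tolerance value is an illustrative assumption, not the paper's algorithm, which must cope with far worse distortion):

```python
def bit_from_pulse(width_ms, tol=40):
    """Classify one demodulated pulse: ~100 ms -> 0, ~200 ms -> 1;
    anything else is rejected as a distorted sample."""
    if abs(width_ms - 100) <= tol:
        return 0
    if abs(width_ms - 200) <= tol:
        return 1
    return None

def decode_minute(bits):
    """Decode the minute from one DCF77 frame (list of 59 bits).
    Bits 21-27 carry the minute in BCD, LSB first; bit 28 is even parity."""
    weights = (1, 2, 4, 8, 10, 20, 40)
    if sum(bits[21:29]) % 2 != 0:        # even parity over bits 21..28
        raise ValueError("minute parity error")
    return sum(w for w, b in zip(weights, bits[21:28]) if b)
```

A robust decoder repeats this over consecutive frames and uses the parity bits (and frame-to-frame consistency) to reject corrupted seconds.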

Hubert Michalak

A Mobile Terrestrial Surveillance Robot Using the Wall-Following Technique and a Derivative Integrative Proportional Controller

This research aims to develop a terrestrial mobile surveillance robot whose primary task is to collect real-time videos and images in closed environments, using the Wall-Following technique and a proportional-integral-derivative (PID) controller to navigate the environment. We use ultrasonic sensors, infrared sensors, an IP camera, and encoders for closed-loop speed control. The development of the robot happened in three stages: first, the definition and acquisition of the model and the components used (mechanical structure, microcontroller, sensors, and actuators); then, the definition of the user-robot interaction software and the creation of the robot, the control software, and the integration tests, which serve to improve the programming routines of the robot and the interaction with it; finally, the description of the supervisory module, whose interface provides the visualization and capture of real-time videos and images.
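The wall-following control loop can be sketched as a PID regulator holding the lateral distance to the wall at a setpoint. The plant model, gains, and speeds below are illustrative assumptions, not values from the paper; the commanded lateral correction speed stands in for the real steering kinematics:

```python
kp, ki, kd = 2.0, 0.5, 0.5        # PID gains (illustrative)
setpoint = 0.30                   # desired wall distance, m
dt = 0.02                         # control period, s
distance = 0.60                   # initial condition: too far from the wall
integral = 0.0
prev_err = setpoint - distance    # avoids a derivative kick on the first step

for _ in range(1500):             # 30 s of simulated time
    err = setpoint - distance
    integral += err * dt
    deriv = (err - prev_err) / dt
    prev_err = err
    u = kp * err + ki * integral + kd * deriv   # lateral correction speed, m/s
    distance += u * dt            # simplified plant: distance integrates u
```

On the robot, the ultrasonic/infrared range reading supplies `distance`, and `u` is mapped onto a differential steering command while the encoders close the inner speed loop.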

Pedro Gabriel Calíope Dantas Pinheiro, Maikol Magalhães Rodrigues, João Paulo Agostinho Barrozo, Jose Pacelli Moreira de Oliveira, Plácido Rogério Pinheiro, Raimir Holanda Filho

Brain MRI Enhancement as a Pre-processing Step: An Evaluation Framework Using Optimal Gamma, Homographic and DWT Based Methods

The domain of medical imaging has been extensively studied for various parts of the human body. The study of brain MRI is quite helpful for the diagnosis of brain-related diseases. The presence of noise and the low clarity of the input brain MRI introduce ambiguity during analysis. This paper proposes optimized gamma correction, homomorphic filtering, and discrete wavelet transformation. The performance of these methods is evaluated using the global contrast factor (GCF), contrast per pixel (CPP), contrast (C), and sharpness (S) parameters.

S. Harish, G. F. Ali Ahammed

Efficiency Comparison of Modern Computer Languages: Sorting Benchmark

The paper surveys the execution features of ready-to-use sorting procedures in various modern computer languages/compilers. The chosen sorting functions were tested for randomly generated data sets of different size and structure resembling the lists or arrays commonly used in real life IT solutions. Our results reveal some differences between particular implementations in efficiency of sorting in terms of CPU load and execution time.
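A benchmark of this kind reduces to timing a language's ready-to-use sort on identical copies of the same random input; a minimal Python harness, illustrative of the methodology rather than a reproduction of the paper's multi-language setup:

```python
import random
import time

def bench(sort_call, data):
    """Time one sorting call on a fresh copy of the data, so every
    candidate sees exactly the same input."""
    items = list(data)
    t0 = time.perf_counter()
    result = sort_call(items)
    elapsed = time.perf_counter() - t0
    return (items if result is None else result), elapsed

random.seed(42)
data = [random.randint(0, 10**6) for _ in range(100_000)]

out_sorted, t_sorted = bench(sorted, data)                   # copying built-in
out_inplace, t_inplace = bench(lambda xs: xs.sort(), data)   # in-place list sort
```

Repeating the measurement over several sizes and input structures (random, pre-sorted, reversed, few-unique) gives the CPU-load and execution-time profiles the paper compares across languages.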

Agnieszka Bier, Zdzisław Sroczyński

Input Output Data Converter for the Math Engine in an Expert System

This study offers a simple and inexpensive tool for the semi-automatic control of computational algorithms with different input/output data structures. The proposed approach to data recognition is based on creating generation rules from a data structure sample. Our contribution lies in proposing a built-in semi-automatic engine for recognizing and transforming input/output data (MMC Tool) that requires neither special end-user queries nor special end-user skills; the confirmation procedure is carried out in a user-friendly view. The proposed MMC Tool is an effective solution for the automation of plugging and switching math modules running in a single system, without having to resort to third-party software developers.

Simon Barkovskii, Larisa Tselykh, Alexander Tselykh

Adaptive Software System for Optimization of the Admission and Management Process for Doctoral Students

The preparation of a doctoral student and a successful thesis is considered an achievement, not only for the doctoral student and their scientific supervisor, but also for the institution that has taken responsibility for developing a young scientist. Doctoral student education is a complex activity, which includes administrative services, discursive social practice, document turnover, etc. The creation of a software system aimed at tracking the development of the doctoral students themselves and their scientific interests is one of a series of initiatives that the University of Ruse has taken to optimize the organizational activity related to Ph.D. student education. The basic theoretical concepts forming the foundation of the build process of the software system are outlined in this article, as well as their successful practical implementation in a well-functioning software resource. An analysis of the results of a poll surveying the opinions of the users on their work and experience with the system is also included.

Pavel Zlatarov, Galina Ivanova, Desislava Baeva, Diana Antonova

The Production and Simplification of Evidence - Enhancing Trust and Costs Reduction on Court

To solve the problems of slowness, cost, and inefficiency in the e-Justice system, the TESTOR project consists of improving legal certainty, expediting dispute resolution, and reducing costs. The objective is to simplify the production of evidence so that it can be used for legal purposes. For that, the evidence (any type of digital document/file) is digitally signed by all the stakeholders (which may include a solicitor/lawyer) and submitted to an online platform that creates a proof of evidence, stores it, and manages access permissions to the evidence. This makes it possible to follow and connect the proof, which can be any digital content, with the civil process. The proof, with the full range of problems associated with it, has to be the nerve center of the system and of Justice. The material facts must be proved, and this is crucial to establish the value of proof: the value of what is provable.

Miguel Matos, Duarte Duque, Joaquim Silva, Irene Portela

