Search results
1 – 10 of over 6000

Huilong Zhang, Yudong Zhang, Atiqe Ur Rahman and Muhammad Saeed
Abstract
Purpose
In this article, the elementary notions and aggregation operations of single-valued neutrosophic parameterized complex fuzzy hypersoft set (sv-NPCFHSS) are characterized initially. Then by using matrix version of sv-NPCFHSS, a decision-support system is constructed for the evaluation of real estate residential projects by observing various risk factors.
Design/methodology/approach
Two approaches are utilized in this research: a set-theoretic approach and an algorithmic approach. The first is used to investigate the notions of sv-NPCFHSS and some of its aggregation operations, whereas the second is used to propose an algorithm for designing its decision-support system based on aggregation operations of sv-NPCFHSS such as the reduced fuzzy matrix and the decision matrix. The adopted algorithm is validated in a real estate scenario for the selection of a residential project by observing various risk factors to avoid any expected investment loss.
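The matrix-based aggregation step can be sketched as follows. This is a toy illustration only, not the paper's actual operators: the sv-NPCFHSS model uses complex-valued memberships over hypersoft attribute tuples, whereas here each matrix cell holds a plain single-valued neutrosophic triple (T, I, F), reduced to a fuzzy value with a common score function; the project names and evaluations are invented.

```python
# Illustrative sketch: reduce a neutrosophic decision matrix to fuzzy scores
# and rank the alternatives. Score function s = (2 + T - I - F) / 3 is one
# common choice, not necessarily the one used in the paper.

def score(t, i, f):
    """Map a neutrosophic triple (truth, indeterminacy, falsity) to [0, 1]."""
    return (2 + t - i - f) / 3

def rank_projects(decision_matrix, projects):
    """Average the per-risk-factor scores for each project and rank them."""
    totals = {}
    for name, row in zip(projects, decision_matrix):
        cells = [score(t, i, f) for (t, i, f) in row]
        totals[name] = sum(cells) / len(cells)
    return sorted(totals.items(), key=lambda kv: kv[1], reverse=True)

# Hypothetical evaluations of three residential projects against two risk factors
matrix = [
    [(0.8, 0.1, 0.1), (0.7, 0.2, 0.2)],   # project P1
    [(0.5, 0.4, 0.3), (0.6, 0.3, 0.4)],   # project P2
    [(0.9, 0.1, 0.2), (0.4, 0.5, 0.5)],   # project P3
]
ranking = rank_projects(matrix, ["P1", "P2", "P3"])
```

The alternative with the highest aggregated score is recommended; richer models replace the score function with operators that also account for the complex (phase) component.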
Findings
The proposed approach is more flexible and reliable, as it copes with the shortcomings of the literature on sv-neutrosophic sets, sv-neutrosophic soft sets and other fuzzy soft set-like structures by considering the hypersoft, complex and neutrosophic settings collectively.
Research limitations/implications
The model does not cover complex intuitionistic fuzzy hypersoft sets, complex neutrosophic hypersoft sets and other complex neutrosophic hypersoft set-like models.
Practical implications
The scope of this research may cover a wide range of applications in several fields of mathematical sciences like artificial intelligence, optimization, MCDM, theoretical computer science, soft computing, mathematical statistics etc.
Originality/value
The proposed model collectively bears the characteristics of most of the relevant existing fuzzy soft set-like models and addresses their limitations.
Efthimia Mavridou, Konstantinos M. Giannoutakis, Dionysios Kehagias, Dimitrios Tzovaras and George Hassapis
Abstract
Purpose
Semantic categorization of Web services comprises a fundamental requirement for enabling more efficient and accurate search and discovery of services in the semantic Web era. However, to efficiently deal with the growing presence of Web services, more automated mechanisms are required. This paper introduces an automatic Web service categorization mechanism that exploits several techniques to increase overall prediction accuracy.
Design/methodology/approach
The paper proposes the use of Error Correcting Output Codes on top of a Logistic Model Trees-based classifier, in conjunction with a data pre-processing technique that reduces the original feature-space dimension without affecting data integrity. The proposed technique is generalized so as to adhere to all Web services with a description file. A semantic matchmaking scheme is also proposed for enabling the semantic annotation of the input and output parameters of each operation.
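The error-correcting idea behind ECOC can be sketched in a few lines. This is a minimal stand-in, not the paper's implementation: the paper trains a Logistic Model Trees base classifier per codeword column, while here the code matrix, category names and bit predictions are all invented for illustration. Each class gets a binary codeword; a test instance is assigned the class whose codeword is nearest in Hamming distance to the concatenated binary predictions, so a few flipped bits can be corrected.

```python
# Hypothetical 5-bit codewords for four service categories; in practice the
# matrix is chosen (or learned) to maximise inter-codeword distance.
CODE_MATRIX = {
    "weather":   (0, 0, 0, 1, 1),
    "travel":    (0, 1, 1, 0, 1),
    "economy":   (1, 0, 1, 1, 0),
    "education": (1, 1, 0, 0, 0),
}

def hamming(a, b):
    """Number of positions where two bit tuples differ."""
    return sum(x != y for x, y in zip(a, b))

def decode(bit_predictions):
    """Pick the class whose codeword is closest to the predicted bits."""
    return min(CODE_MATRIX, key=lambda c: hamming(CODE_MATRIX[c], bit_predictions))

# Suppose the five binary learners voted (0, 1, 1, 0, 0): one bit away from
# "travel", so the single flipped bit is corrected.
predicted = decode((0, 1, 1, 0, 0))
```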
Findings
The proposed Web service categorization framework was tested with the OWLS-TC v4.0, as well as a synthetic data set with a systematic evaluation procedure that enables comparison with well-known approaches. After conducting exhaustive evaluation experiments, categorization efficiency in terms of accuracy, precision, recall and F-measure was measured. The presented Web service categorization framework outperformed the other benchmark techniques, which comprise different variations of it and also third-party implementations.
Originality/value
The proposed three-level categorization approach is a significant contribution to the Web service community, as it allows the automatic semantic categorization of all functional elements of Web services that are equipped with a service description file.
Barbara X. Rodriguez, Kathrina Simonen, Monica Huang and Catherine De Wolf
Abstract
Purpose
The purpose of this paper is to present an analysis of common parameters in existing tools that provide guidance on carrying out Whole Building Life Cycle Assessment (WBLCA) and to propose a new taxonomy, a catalogue of parameters, for the definition of the goal and scope (G&S) in WBLCA.
Design/methodology/approach
A content analysis approach is used to identify, code and analyze parameters in existing WBLCA tools. Finally, a catalogue of parameters is organized into a new taxonomy.
Findings
In total, 650 distinct parameter names related to the definition of G&S are identified from 16 WBLCA tools available in North America, Europe and Australia. Building on the analysis of existing taxonomies, a new taxonomy of 54 parameters is proposed to describe the G&S of WBLCA.
Research limitations/implications
The analysis of parameters in WBLCA tools does not include Green Building Rating Systems and is limited to tools available in English.
Practical implications
This research is crucial to life cycle assessment (LCA) method harmonization and serves as a stepping stone toward identifying and categorizing the parameters that could support the WBLCA comparisons necessary to meet current global carbon goals.
Social implications
The proposed taxonomy enables architecture, engineering and construction practitioners to contribute to current WBLCA practice.
Originality/value
A study of common parameters in existing tools contributes to identifying the type of data required to describe buildings and to building a standardized framework for LCA reporting, which would facilitate consistency across future studies and can serve as a checklist for practitioners conducting the G&S stage of WBLCA.
Abstract
Purpose
Measures are important to healthcare outcomes. Changes in outcomes result from the deliberate, selective introduction of interventions on a measure. If measures can be characterized and categorized, the resulting schema may be generalized and used as a framework for uniquely identifying, packaging and comparing different interventions and for probing target systems, thereby facilitating selection of the most appropriate intervention for maximum desired outcomes. Measure characterization was accomplished with multi-axial statistical analysis, and measure categorization by logical tabulation. The measure of interest is a key provider productivity index, “patient visits per hour,” while the specific intervention is “patient schedule manipulation by overbooking.” The paper aims to discuss these issues.
Design/methodology/approach
For statistical analysis, interrupted time series (ITS), robust-ITS and outlier detection models were applied to an 18-month data set that included patient visits per hour and intervention introduction time. A statistically significant change-point was determined, resulting in pre-intervention, transitional and post-effect segmentation. Linear regression modeling was used to analyze pre-intervention and post-effect mean change while a triangle was used to analyze the transitional state. For categorization, an “intervention moments” table was constructed from the analysis results with: time-to-effect, pre- and post-mean change magnitude and velocity; pre- and post-correlation and variance; and effect decay/doubling time. The table included transitional parameters such as transition velocity and transition footprint visualization represented as a triangle.
Findings
The intervention produced a significant change. The pre-intervention and post-effect means for patient visits per hour were statistically different (0.38, p=0.0001). The pre- and post-variance change (0.23, p=0.01) was statistically significant (variance was higher post-intervention, which was undesirable). Post-intervention correlation was higher (desirable). Decay time for the effect was calculated as 11 months post-effect. Time-to-effect was four months; mean change velocity was +0.094 visits per hour/month. A transitional triangular footprint was produced, yielding a transition velocity of 0.35 visits per hour/month. Using these results, the intervention was fully profiled and thereby categorized in an intervention moments table.
Research limitations/implications
One limitation is the sample size of this time series: 18 monthly cycles. However, interventions on measures in healthcare demand short time cycles (hence necessarily yielding fewer data points) for practicality, meaningfulness and usefulness. Despite this shortcoming, the statistical processes applied, such as outlier detection, the t-test for mean difference, the F-test for variances and the modeling, all account for the small sample size. Seasonality, which usually affects time series, was not detected and, even if present, would also be accounted for by the modeling.
Practical implications
Obtaining an intervention profile, made possible by multidimensional analysis, allows interventions to be uniquely classified and categorized, enabling informed, comparative and appropriate selective deployment against health measures, thus potentially contributing to outcomes optimization.
Social implications
The inevitable direction for healthcare is heavy investment in optimizing measure outcomes to improve patient experience and population health and to reduce costs. Interventions are the tools that change outcomes. Creative modeling and novel methods for intervention analysis are necessary if healthcare is to achieve this goal. Analytical methods should categorize and rank interventions; probe the measures to improve future selection and adoption; and reveal the strengths and shortcomings of the organic systems implementing the interventions, so that they can be fine-tuned for better performance.
Originality/value
An “intervention moments table” is proposed, created from a multi-axial statistical intervention analysis, for organizing, classifying and categorizing interventions. The analysis set was expanded with additional parameters such as time-to-effect, mean change velocity and effect decay/doubling time, including a transition-zone analysis that produced a unique transitional footprint and a transition velocity. The “intervention moments” should facilitate intervention cross-comparison, intervention selection and optimal intervention deployment for the best outcomes.
Helge Wurdemann, Vahid Aminzadeh, Jian S. Dai, John Reed and Graham Purnell
Abstract
Purpose
This paper aims to introduce and identify a new 3D handling operation (bin picking) for natural discrete food products using food categorisation.
Design/methodology/approach
The research presents a new food categorisation and the relation between food-ordering processes and food categories. Bin picking in the food industry needs more flexible vision software than in the manufacturing industry in order to decrease the degree of disarray of food products and bring them into a structured arrangement.
Findings
It has been shown that manually operated ordering processes, such as bin picking, still exist in the food industry; what is needed is the development of new image-processing algorithms, such as active shape models (ASMs), capable of recognising the highly varying shapes of food products.
Research limitations/implications
This research was aimed at locating a new ordering process and proving a new principle, but for practical implementation this bin picking solution needs to be developed and tested further.
Originality/value
Identifying new ordering processes via food categorisation is unique and applying ASMs to bin picking opens a new industrial sector (food industry) for 3D handling.
A.A. Syntetos, M. Keyes and M.Z. Babai
Abstract
Purpose
Spare parts have become ubiquitous in modern societies and managing their requirements is an important and challenging task with tremendous cost implications for the organisations that are holding relevant inventories. An important operational issue involved in the management of spare parts is that of categorising the relevant stock keeping units (SKUs) in order to facilitate decision‐making with respect to forecasting and stock control and to enable managers to focus their attention on the most “important” SKUs. This issue has been overlooked in the academic literature although it constitutes a significant opportunity for increasing spare parts availability and/or reducing inventory costs. Moreover, and despite the huge literature developed since the 1970s on issues related to stock control for spare parts, very few studies actually consider empirical solution implementation and with few exceptions, case studies are lacking. Such a case study is described in this paper, the purpose of which is to offer insight into relevant business practices.
Design/methodology/approach
The issue of demand categorisation (including forecasting and stock control) for spare parts management is addressed, and details are reported of a project undertaken by an international business machine manufacturer to improve its European spare parts logistics operations. The paper describes the actual intervention within the organisation in question, as well as the empirical benefits and the lessons learned from such a project.
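The paper does not spell out its categorisation rules here, but a widely used demand-categorisation scheme for spare parts (associated with the first author's earlier work) classifies each SKU by its average inter-demand interval (ADI) and the squared coefficient of variation of demand sizes (CV²), with customary cutoffs of 1.32 and 0.49. The sketch below assumes that scheme; the demand series is invented.

```python
# Classify a SKU's demand pattern as smooth / erratic / intermittent / lumpy
# from its ADI and CV^2, using the customary 1.32 / 0.49 cutoffs.
from statistics import mean, pstdev

def categorise(demand_history):
    """demand_history: demand per period; zeros are periods with no demand."""
    sizes = [d for d in demand_history if d > 0]
    adi = len(demand_history) / len(sizes)       # avg periods per demand occurrence
    cv2 = (pstdev(sizes) / mean(sizes)) ** 2     # variability of demand sizes
    if adi <= 1.32:
        return "smooth" if cv2 <= 0.49 else "erratic"
    return "intermittent" if cv2 <= 0.49 else "lumpy"

# Hypothetical monthly demand for one spare part: sporadic but similar sizes
category = categorise([0, 0, 4, 0, 0, 5, 0, 4, 0, 0, 0, 5])
```

Each quadrant then points to a different forecasting method (e.g. intermittent demand is commonly forecast with Croston-type estimators rather than simple exponential smoothing).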
Findings
This paper demonstrates the considerable scope that exists for improving relevant real-world practices. It shows that simple, well-informed solutions can result in substantial organisational savings.
Originality/value
This paper provides insight into the empirical utilisation of demand categorisation theory for forecasting and stock control and provides some very much needed empirical evidence on pertinent issues. In that respect, it should be of interest to both academics and practitioners.
Hasanuzzaman Hasanuzzaman and Chandan Bhar
Abstract
Purpose
Environmental pollution and corresponding adverse health impacts have now become a significant concern for the entire planet. In this regard, analysts and experts are continually formulating policies to reduce environmental pollution and improve natural ecological conditions. To aid in coping with the ecological predicament, a framework has been developed in the present study to inspect the adverse environmental impacts and related health issues of coal mining.
Design/methodology/approach
The parameters for this study were identified through a review of the literature, and 23 critical parameters relating to air, water, land and soil, and noise in coal mining were finalized in consultation with experts from industry and academia. The parameters were then categorized according to the level of threat they pose to the environment, with weights assigned using the Bradley–Terry model on attitudinal data acquired through a questionnaire survey.
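The Bradley–Terry weighting step can be sketched with the classical iterative (minorization-maximization) update. The parameter names and pairwise "which is the bigger threat?" win counts below are invented; the study fits the model to 23 parameters from survey data.

```python
# Fit Bradley-Terry worth parameters w_i from pairwise comparison counts
# using the classical iterative update, then normalise to weights.

def bradley_terry(wins, items, iterations=200):
    """wins[(a, b)] = number of times a was judged a bigger threat than b."""
    w = {i: 1.0 for i in items}
    for _ in range(iterations):
        new = {}
        for i in items:
            num = sum(wins.get((i, j), 0) for j in items if j != i)
            den = sum((wins.get((i, j), 0) + wins.get((j, i), 0)) / (w[i] + w[j])
                      for j in items if j != i)
            new[i] = num / den if den else w[i]
        total = sum(new.values())
        w = {i: v / total for i, v in new.items()}   # normalise each round
    return w

# Hypothetical judgements over three of the environmental parameters
items = ["SPM", "SO2", "noise"]
wins = {("SPM", "SO2"): 8, ("SO2", "SPM"): 2,
        ("SPM", "noise"): 9, ("noise", "SPM"): 1,
        ("SO2", "noise"): 7, ("noise", "SO2"): 3}
weights = bradley_terry(wins, items)
```

The normalised weights give the threat ranking used to prioritise the parameters.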
Findings
It is found that coal mining has a relatively higher impact on four attributes of “air pollution” (suspended particulate matter [SPM], respiratory particulate matter [RPM], sulfur dioxide [SO2] and oxides of nitrogen [NOx]), followed by “land and soil pollution” (deforestation and surface structure diversion), “noise pollution” (vehicle movement) and “water pollution” (water hardness, total solids [TSS/TDS] and iron content). It is also found that higher airborne concentrations of SPM and RPM result in increased respiratory and cardiopulmonary mortality. Therefore, reducing the dust concentrations generated in the air during coal mining is recommended to reduce air pollution, which will in turn reduce the contamination of water, land and soil.
Research limitations/implications
The model built in this study is a hypothesized model that relies on experts' opinions and considers the parameters of coal production only; parameters related to the usage of coal and its consequences have been excluded. Further, only industrial and academic experts were consulted; local people, coal mining personnel, policy authorities, etc. were excluded. Therefore, the study findings might differ in real circumstances. The research can be reproduced by considering the parameters related to the use of coal and its consequences, along with the opinions of local people, coal mining personnel and policy authorities.
Practical implications
Categorizing the parameters according to the threat they pose to the environment due to coal mining can help the decision-maker develop an effective policy to reduce environmental pollution due to coal mining by considering the parameters on a priority basis. In addition, the results further help the decision-makers to assess the environmental impact of coal mining and take necessary action.
Originality/value
The study has developed a framework using the Bradley–Terry model to categorize the environmental parameters of coal mining to develop effective environmental policies, which are original and unique in nature.
Mahakdeep Singh, Kanwarpreet Singh and Amanpreet Singh Sethi
Abstract
Purpose
The current manuscript is focused on evaluating the capabilities of green practices that affect various business performance (BP) parameters of small and medium-scale Indian manufacturing enterprises (SMEs). This study aims to identify the significant factors that influence the implementation of green practices.
Design/methodology/approach
The manuscript focuses on statistical testing of responses obtained from 168 Indian SMEs to determine the relationship between input parameters and BP parameters. The paper starts by deploying tests such as the Cronbach alpha and inter-item covariance tests to gain confidence in the data collected, followed by statistical tests such as Pearson correlation, multiple regression and canonical correlation to extract the significant factors for the study. Further, the Games-Howell post hoc test is deployed to evaluate the significant improvements in BP gained over a reasonable duration of time. Finally, a discriminant validity test is used to determine the success or failure of the organizations that participated in the survey.
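The first reliability check mentioned above, Cronbach's alpha over a respondents-by-items matrix, can be sketched as follows. The 5-point survey responses are invented; a value above roughly 0.7 is conventionally read as acceptable internal consistency.

```python
# Cronbach's alpha: k/(k-1) * (1 - sum of item variances / variance of totals)
from statistics import variance

def cronbach_alpha(responses):
    """responses: one row per respondent, one column per survey item."""
    k = len(responses[0])                     # number of items
    items = list(zip(*responses))             # transpose to per-item columns
    item_var = sum(variance(col) for col in items)
    total_var = variance([sum(row) for row in responses])
    return (k / (k - 1)) * (1 - item_var / total_var)

# Hypothetical 5-point responses from five SMEs on three related items
survey = [
    [4, 5, 4],
    [3, 3, 4],
    [5, 5, 5],
    [2, 3, 2],
    [4, 4, 5],
]
alpha = cronbach_alpha(survey)
```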
Findings
This research demonstrates the holistic effect of green manufacturing (GM) on improvements in the different BP parameters considered in the study. It has been found that input factors such as customer attributes, adoption of new technology, social pressure and government pressure are the main drivers of GM implementation. Further, it is observed that organizations at the maturity phase of GM implementation are reaping higher benefits than those at the transition and stability phases.
Originality/value
The current study was carried out in Indian SME manufacturing organizations to investigate the effects of GM implementation. Although the findings imply that effective use of green practices within an organization improves BP parameters and the organization's competitive image in the market, the study cannot be generalized; it can serve as an insight for both academicians and end-users in understanding the overall achievements of GM.
V. Chowdary Boppana and Fahraz Ali
Abstract
Purpose
This paper presents an experimental investigation establishing the relationship between FDM process parameters and the tensile strength of polycarbonate (PC) samples using the I-optimal design.
Design/methodology/approach
The I-optimal design methodology is used to plan the experiments by means of Minitab 17.1 software. Samples are manufactured using a Stratasys FDM 400mc and tested as per ISO standards. Additionally, an artificial neural network (ANN) model was developed and compared to the regression model in order to select an appropriate model for optimisation. Finally, the genetic algorithm (GA) solver is executed to improve the tensile strength of FDM-built PC components.
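The final GA optimisation step can be sketched in miniature. Everything here is a stand-in: the surrogate function below is an invented placeholder for the paper's trained ANN, the parameter bounds are hypothetical, and the real inputs are raster angle, raster-to-raster air gap, build orientation about the Y axis and number of contours.

```python
# Toy genetic algorithm: evolve process-parameter settings to maximise a
# surrogate strength model (a made-up stand-in for the trained ANN).
import random

random.seed(0)
BOUNDS = [(0, 90), (0.0, 0.5), (0, 90), (1, 10)]   # hypothetical ranges

def surrogate(x):
    """Stand-in fitness: peaks at an arbitrary 'optimal' setting."""
    target = (45, 0.0, 90, 5)
    return -sum((a - b) ** 2 for a, b in zip(x, target))

def genetic_algorithm(pop_size=30, generations=60):
    pop = [tuple(random.uniform(lo, hi) for lo, hi in BOUNDS)
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=surrogate, reverse=True)
        parents = pop[:pop_size // 2]              # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            # Averaging crossover plus small Gaussian mutation (no bound
            # clamping, acceptable for a toy sketch)
            child = tuple((x + y) / 2 + random.gauss(0, 0.5)
                          for x, y in zip(a, b))
            children.append(child)
        pop = parents + children
    return max(pop, key=surrogate)

best = genetic_algorithm()
```

A production GA would add bound handling, tournament selection and a convergence criterion, and would call the actual fitted ANN as the fitness function.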
Findings
This study demonstrates that the selected process parameters (raster angle, raster to raster air gap, build orientation about Y axis and the number of contours) had significant effect on tensile strength with raster angle being the most influential factor. Increasing the build orientation about Y axis produced specimens with compact structures that resulted in improved fracture resistance.
Research limitations/implications
The fitted regression model has a p-value less than 0.05 which suggests that the model terms significantly represent the tensile strength of PC samples. Further, from the normal probability plot it was found that the residuals follow a straight line, thus the developed model provides adequate predictions. Furthermore, from the validation runs, a close agreement between the predicted and actual values was seen along the reference line which further supports satisfactory model predictions.
Practical implications
This study successfully investigated the effects of the selected process parameters - raster angle, raster to raster air gap, build orientation about Y axis and the number of contours - on tensile strength of PC samples utilising the I-optimal design and ANOVA. In addition, for prediction of the part strength, regression and ANN models were developed. The selected ANN model was optimised using the GA-solver for determination of optimal parameter settings.
Originality/value
The proposed ANN-GA approach is more appropriate to establish the non-linear relationship between the selected process parameters and tensile strength. Further, the proposed ANN-GA methodology can assist in manufacture of various industrial products with Nylon, polyethylene terephthalate glycol (PETG) and PET as new 3DP materials.