Search results

1 – 10 of 24
Article
Publication date: 23 August 2022

Kamlesh Kumar Pandey and Diwakar Shukla

The K-means (KM) clustering algorithm is highly sensitive to the selection of initial centroids, since the initial centroids determine computational effectiveness…

Abstract

Purpose

The K-means (KM) clustering algorithm is highly sensitive to the selection of initial centroids, since the initial centroids determine its computational effectiveness, efficiency and susceptibility to local optima. Numerous initialization strategies have been proposed to overcome these problems through the random or deterministic selection of initial centroids. Random initialization suffers from local optima and poor clustering performance, while deterministic initialization incurs a high computational cost. Big data clustering aims to reduce computation cost and improve cluster efficiency. The objective of this study is to obtain better initial centroids for big data clustering of business management data, without random or deterministic initialization, in a way that avoids local optima and improves clustering efficiency and effectiveness in terms of cluster quality, computation cost, data comparisons and iterations on a single machine.

Design/methodology/approach

This study presents the Normal Distribution Probability Density based K-means (NDPDKM) algorithm for big data clustering on a single machine to solve business management-related clustering problems. The NDPDKM algorithm resolves the KM initialization problem through the probability density of each data point. It first identifies the most probable (highest-density) data points by using the mean and standard deviation of the dataset with the normal probability density function. Thereafter, it determines the K initial centroids by using sorting and linear systematic sampling heuristics.
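
As a minimal illustration of this initialization, not the authors' implementation, the sketch below assumes that the per-feature normal densities are combined into a single score by a product and that linear systematic sampling takes every (n/K)-th point of the density-sorted data; all function and variable names are illustrative.

    import numpy as np

    def ndpd_init(X, k):
        # Sketch of density-based centroid initialization (not the authors' code).
        X = np.asarray(X, dtype=float)
        mu = X.mean(axis=0)
        sigma = X.std(axis=0) + 1e-12          # guard against zero variance
        # Normal probability density of every point, evaluated per feature
        dens = np.exp(-0.5 * ((X - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))
        score = dens.prod(axis=1)              # assumption: combine features by product
        order = np.argsort(score)[::-1]        # indices sorted by decreasing density
        step = max(len(X) // k, 1)             # linear systematic sampling interval
        centroids = X[order[::step][:k]]       # every step-th point of the sorted list
        return centroids

The resulting centroids would then seed a standard KM run.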

Findings

The performance of the proposed algorithm is compared with the KM, KM++, Var-Part, Murat-KM, Mean-KM and Sort-KM algorithms using the Davies-Bouldin score, Silhouette coefficient, SD validity, S_Dbw validity, number of iterations and CPU time validation indices on eight real business datasets. The experimental evaluation demonstrates that the NDPDKM algorithm reduces iterations, local optima and computing cost, and improves cluster performance, effectiveness and efficiency with stable convergence compared to the other algorithms. The NDPDKM algorithm reduces average computing time by up to 34.83%, 90.28%, 71.83%, 92.67%, 69.53% and 76.03%, and average iterations by up to 40.32%, 44.06%, 32.02%, 62.78%, 19.07% and 36.74%, with reference to the KM, KM++, Var-Part, Murat-KM, Mean-KM and Sort-KM algorithms, respectively.

Originality/value

The KM algorithm is the most widely used partitional clustering approach among the data mining techniques that extract hidden knowledge, patterns and trends for decision-making strategies in business data. Business analytics is one application of big data clustering in which KM clustering is useful for various subcategories such as customer segmentation analysis, employee salary and performance analysis, document searching, delivery optimization, discount and offer analysis, chaplain management, manufacturing analysis, productivity analysis, specialized employee and investor searching, and other decision-making strategies in business.

Details

The Handbook of Road Safety Measures
Type: Book
ISBN: 978-1-84855-250-0

Article
Publication date: 13 June 2016

M. Arif Wani and Romana Riyaz

The most commonly used approaches for cluster validation are based on indices but the majority of the existing cluster validity indices do not work well on data sets of different…

Abstract

Purpose

The most commonly used approaches for cluster validation are based on indices, but the majority of existing cluster validity indices do not work well on data sets of different complexities. The purpose of this paper is to propose a new cluster validity index (the ARSD index) that works well on all types of data sets.

Design/methodology/approach

The authors introduce a new compactness measure that captures the typical behaviour of a cluster, where more points are located around the centre and fewer points towards the outer edge of the cluster. A novel penalty function is proposed for determining the distinctness measure of clusters. A random linear search algorithm is employed to evaluate and compare the performance of five commonly known validity indices and the proposed validity index. The values of the six indices are computed for every nc in the range (nc_min, nc_max) to obtain the optimal number of clusters present in a data set. The data sets used in the experiments include shaped, Gaussian-like and real data sets.
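
The ARSD formula itself is not given in the abstract; the sketch below only illustrates the surrounding procedure of computing an index for every candidate nc in (nc_min, nc_max) and selecting the optimum, with scikit-learn's silhouette coefficient standing in for the proposed index.

    from sklearn.cluster import KMeans
    from sklearn.metrics import silhouette_score

    def best_nc(X, nc_min=2, nc_max=10):
        # Compute a validity index for every candidate number of clusters and
        # return the best-scoring nc (silhouette stands in for the ARSD index).
        scores = {}
        for nc in range(nc_min, nc_max + 1):
            labels = KMeans(n_clusters=nc, n_init=10, random_state=0).fit_predict(X)
            scores[nc] = silhouette_score(X, labels)
        return max(scores, key=scores.get), scores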

Findings

Through an extensive experimental study, the proposed validity index is found to be more consistent and reliable in indicating the correct number of clusters than the other validity indices. This is demonstrated experimentally on 11 data sets, on which the proposed index achieves better results.

Originality/value

The originality of the research paper lies in proposing a novel cluster validity index for determining the optimal number of clusters present in data sets of different complexities.

Details

International Journal of Intelligent Computing and Cybernetics, vol. 9 no. 2
Type: Research Article
ISSN: 1756-378X

Keywords

Details

The Handbook of Road Safety Measures
Type: Book
ISBN: 978-1-84855-250-0

Book part
Publication date: 25 November 2019

Ahoo Tabatabai

Using queer/crip theory as a frame, I examine the narratives of 17 mothers raising children with disabilities.

Abstract

Purpose/Methods/Approach

Using queer/crip theory as a frame, I examine the narratives of 17 mothers raising children with disabilities.

Findings

Results show that the mothers’ narratives of an imagined future for their children often involve the idea of success in terms of production and reproduction. However, some mothers do question this idea of normalcy, challenge deep-seated ideas about neoliberal inclusion, and reframe disability as a different way of existing rather than a deficient way of being.

Implications/Value

The focus of this paper is on how mothers imagine different kinds of social arrangements. Rather than embracing success as narrowly defined under neoliberalism, some mothers challenge the idea and instead offer queer narratives of parenting. This study illustrates how counternarratives can be constructed to resist prevailing narratives of disability as deficiency.

Details

New Narratives of Disability
Type: Book
ISBN: 978-1-83909-144-5

Keywords

Article
Publication date: 1 July 2013

Gregory J. Soden and Antonio J. Castro

Although music can be used in social studies classrooms to give students a picture of society from different time periods, modern music of all genres can help students understand…

Abstract

Although music can be used in social studies classrooms to give students a picture of society from different time periods, modern music of all genres can also help students understand more recent historical events. This practitioner paper seeks to assist and encourage teachers to use modern music for present-day analysis of society. We address the current events of the Iraq and Afghanistan Wars and help teachers stimulate critical responses in students by using a variety of musical genres to analyze multiple perspectives on the wars. Teaching strategies and approaches to assessing students through authentic engagement with content and artists are discussed, and we conclude by offering a sample lesson plan using a model of analysis.

Details

Social Studies Research and Practice, vol. 8 no. 2
Type: Research Article
ISSN: 1933-5415

Keywords

Article
Publication date: 5 March 2021

Youcef Oussama Fourar, Mebarek Djebabra, Wissal Benhassine and Leila Boubaker

The assessment of patient safety culture (PSC) is a major priority for healthcare providers. It is often realized using quantitative approaches (questionnaires) separately from…

Abstract

Purpose

The assessment of patient safety culture (PSC) is a major priority for healthcare providers. It is often carried out using quantitative approaches (questionnaires) separately from qualitative ones (patient safety culture maturity models (PSCMM)). These approaches suffer from certain major limitations. Therefore, the aim of the present study is to overcome these limitations and propose a novel approach to PSC assessment.

Design/methodology/approach

The proposed approach consists of evaluating PSC in a set of healthcare establishments (HEs) using the HSOPSC questionnaire. Principal component analysis (PCA) and the K-means algorithm were then applied to the PSC dimensional scores in order to aggregate them into macro dimensions. The latter were used to overcome the limitations of dimensional PSC assessment and to propose a quantitative PSCMM.
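
A minimal sketch of this aggregation step, assuming the HSOPSC results are held as an establishments-by-dimensions score matrix and that the dimensions themselves are grouped into three macro dimensions; the abstract does not specify the exact setup, so the names and shapes below are illustrative.

    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.cluster import KMeans

    def macro_dimensions(scores, n_macro=3):
        # scores: (establishments x PSC dimensions) array of HSOPSC scores.
        # Each PSC dimension is represented by its score profile across
        # establishments, reduced by PCA and grouped by K-means.
        profiles = np.asarray(scores, dtype=float).T          # one row per PSC dimension
        n_comp = min(n_macro, *profiles.shape)
        reduced = PCA(n_components=n_comp).fit_transform(profiles)
        labels = KMeans(n_clusters=n_macro, n_init=10, random_state=0).fit_predict(reduced)
        return labels          # macro-dimension index assigned to each PSC dimension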

Findings

PSC dimensions are grouped into three macro dimensions. Capitalizing on these macro dimensions permits their association with the safety actors involved in PSC promotion. Consequently, a quantitative PSC maturity matrix was proposed. The problematic PSC dimensions for the studied HEs are “Non-punitive response to error”, “Staffing” and “Communication openness”. Their PSC maturity level was found to be underdeveloped owing to a managerial style that favors a “blame culture”.

Originality/value

A combined quali-quantitative assessment framework for PSC was proposed in the present study, as recommended by a number of researchers; to the best of our knowledge, few if any studies have been devoted to it. The results can be used for improvement and accreditation purposes, in which different PSC stakeholders can be involved, as suggested by international standards.

Details

International Journal of Health Governance, vol. 26 no. 2
Type: Research Article
ISSN: 2059-4631

Keywords

Article
Publication date: 16 January 2024

Thomas Pinger, Mirabela Firan and Martin Mensinger

Based on the known positive effects of conventional hot-dip galvanizing under fire exposure and indicative results on zinc–aluminum coatings from small-scale tests, a series of…

Abstract

Purpose

Based on the known positive effects of conventional hot-dip galvanizing under fire exposure and indicative results on zinc–aluminum coatings from small-scale tests, a series of tests was conducted on zinc-5% aluminum galvanized test specimens under fire loads to verify these positive findings under large-scale boundary conditions.

Design/methodology/approach

The emissivity of zinc-5% aluminum galvanized surfaces applied to steel specimens was determined experimentally under real fire loads and under laboratory thermal loads in accordance with the normative specifications of the standard fire curve. Both large- and small-scale specimens were used in this study. The steel grade and surface condition of the specimens were varied for both test scenarios.
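
For reference, and assuming the normative curve meant here is the usual ISO 834 / EN 1363-1 standard temperature-time curve, the laboratory thermal load follows

    \theta_g = 20 + 345\,\log_{10}(8t + 1),

where \theta_g is the gas temperature in °C and t is the time in minutes.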

Findings

Large-scale tests on specimens with typical steel construction dimensions under fire loads showed that the surface emissivity of zinc-5% aluminum galvanized steel was significantly lower than that of conventionally galvanized steel. Only minor influences from the weathering of the specimens and the steel chemistry were observed. These results agree well with those obtained from small-scale tests. Design values for the zinc-5% aluminum melt (Zn5Al) required for structural fire design were proposed based on the obtained results.

Originality/value

The tests presented in this study are the first to examine the behavior of zinc-5% aluminum galvanized large-scale steel construction components under real fire exposure and the positive effect of this galvanizing method on the emissivity of steel components. The results provide valuable insights into the behavior in the case of fire and the associated savings potential for steel construction.

Details

Journal of Structural Fire Engineering, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 2040-2317

Keywords

Article
Publication date: 13 April 2022

Thomas Pinger, Martin Mensinger and Maria-Mirabela Firan

Based on the advantages of conventional hot-dip galvanizing made from quasi-pure zinc melts in the event of fire, this article aims to perform a series of tests to verify whether…

Abstract

Purpose

Based on the advantages of conventional hot-dip galvanizing made from quasi-pure zinc melts in the event of fire, this article aims to perform a series of tests to verify whether a similar effect can be achieved with zinc-aluminum coatings.

Design/methodology/approach

The emissivity of galvanized surfaces, applied to steel specimens by the batch hot-dip galvanizing process, was determined experimentally under a continuously increasing temperature load. In addition to a quasi-pure zinc melt serving as a reference, a zinc melt alloyed with 500 ppm aluminum and a thin-film galvanizing melt of zinc with 5% aluminum were used. For the latter, post-treatment variants involving passivation and sealing of the coating were also investigated.

Findings

The results show that a lower emissivity can be achieved at higher temperatures by adding aluminum to the zinc melt and thereby into the zinc coating. The design values required for structural fire design were proposed, and an exemplary calculation of the temperature development in the case of fire was carried out based on these values. The result of this calculation indicates that a savings potential becomes apparent when using zinc-aluminum coatings.
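
The exemplary calculation is not reproduced in the abstract. Assuming it follows the usual EN 1993-1-2 step-by-step method for unprotected steel, the member temperature increment per time step is

    \Delta\theta_{a,t} = k_{sh}\,\frac{A_m/V}{c_a\,\rho_a}\,\dot h_{net,d}\,\Delta t,
    \qquad
    \dot h_{net,d} = \alpha_c\,(\theta_g - \theta_m)
        + \Phi\,\varepsilon_m\,\varepsilon_f\,\sigma\,\big[(\theta_g + 273)^4 - (\theta_m + 273)^4\big],

where k_sh is the shadow factor, A_m/V the section factor, c_a and \rho_a the specific heat and density of steel, \alpha_c the convection coefficient, \Phi the configuration factor, \varepsilon_f the emissivity of the fire, \sigma the Stefan-Boltzmann constant, and \theta_g and \theta_m the gas and member surface temperatures. The coating emissivity \varepsilon_m enters through the radiative term, so a lower \varepsilon_m slows the heating of the member, which is where the savings potential arises.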

Originality/value

The presented novel tests describe the behavior of zinc-aluminum coatings under the influence of elevated temperatures and their positive effect on the emissivity of steel components galvanized by this method. The results provide valuable insights and information on the performance in the event of fire and the associated potential savings for steel construction.

Details

Journal of Structural Fire Engineering, vol. 14 no. 1
Type: Research Article
ISSN: 2040-2317

Keywords

Article
Publication date: 4 May 2012

Amine Jaafar, Bruno Sareni and Xavier Roboam

A wide range of applications requires classifying or grouping data into a set of categories or clusters. The most popular clustering techniques to achieve this objective are…

Abstract

Purpose

A wide range of applications requires classifying or grouping data into a set of categories or clusters. The most popular clustering techniques to achieve this objective are K-means clustering and hierarchical clustering. However, both of these methods require the a priori setting of the number of clusters. The purpose of this paper is to present a clustering method based on the use of a niching genetic algorithm to overcome this problem.

Design/methodology/approach

The proposed approach aims at finding the best compromise between inter-cluster distance maximization and intra-cluster distance minimization through optimization of the silhouette index. It is capable of investigating multiple cluster configurations in parallel without requiring any assumption about the number of clusters.
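
The abstract gives only the objective. As an illustrative sketch of the fitness evaluation such a genetic algorithm could use, with the niching machinery omitted and the encoding of a candidate solution as a set of cluster centres taken as an assumption, scikit-learn's silhouette index serves as the quantity to maximize.

    import numpy as np
    from sklearn.metrics import silhouette_score

    def silhouette_fitness(centres, X):
        # Assign each point to its nearest candidate centre and score the
        # resulting partition with the silhouette index (GA fitness to maximize).
        X = np.asarray(X, dtype=float)
        centres = np.asarray(centres, dtype=float)
        dists = np.linalg.norm(X[:, None, :] - centres[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        if len(np.unique(labels)) < 2:      # silhouette needs at least two clusters
            return -1.0
        return silhouette_score(X, labels)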

Findings

The effectiveness of the proposed approach is demonstrated on 2D benchmarks with non‐overlapping and overlapping clusters.

Originality/value

The proposed approach is also applied to the clustering analysis of railway driving profiles in the context of hybrid supply design. Such a method can help designers to identify different system configurations in compliance with the corresponding clusters: it may guide suppliers towards “market segmentation”, not only fulfilling economic constraints but also technical design objectives.

Details

COMPEL - The international journal for computation and mathematics in electrical and electronic engineering, vol. 31 no. 3
Type: Research Article
ISSN: 0332-1649

Keywords

1 – 10 of 24