Search results
1 – 10 of over 1000
Mohamed Marzouk and Mohamed Zaher
Abstract
Purpose
Facility management has gained profound importance due to the increasing complexity of building systems and the cost of operation and maintenance. This growing complexity, however, may leave facility managers with a lack of information. The purpose of this paper is to propose a new facility management approach that links segmented assets to the vital data required for managing facilities.
Design/methodology/approach
Automatic point cloud segmentation is one of the most crucial processes required for modelling building facilities. In this research, laser scanning is used for point cloud acquisition. The research utilises the region-growing algorithm, a colour-based region-growing algorithm and the Euclidean cluster extraction algorithm.
Findings
A case study is worked out to test the accuracy of the considered point cloud segmentation algorithms using the precision, recall and F-score metrics. The results indicate that Euclidean cluster extraction and the region-growing algorithm achieved high segmentation accuracy.
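The evaluation metrics named in the findings are standard set-overlap measures. A minimal sketch, with illustrative point-index sets that are not from the case study:

```python
def segmentation_scores(predicted, ground_truth):
    """Precision, recall and F-score for one segmented region,
    treating predicted and ground-truth segments as sets of point indices."""
    tp = len(predicted & ground_truth)  # correctly segmented points
    precision = tp / len(predicted) if predicted else 0.0
    recall = tp / len(ground_truth) if ground_truth else 0.0
    f_score = (2 * precision * recall / (precision + recall)
               if precision + recall else 0.0)
    return precision, recall, f_score

# Illustrative point-index sets (hypothetical, not from the study)
pred = {1, 2, 3, 4, 5}
truth = {2, 3, 4, 5, 6}
p, r, f = segmentation_scores(pred, truth)
```

Precision penalises points wrongly included in a segment, recall penalises points missed, and the F-score balances the two.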
Originality/value
The research presents a comparative approach for selecting the most appropriate segmentation approach required for accurate modelling. As such, the segmented assets can be linked easily with the data required for facility management.
Reza Edris Abadi, Mohammad Javad Ershadi and Seyed Taghi Akhavan Niaki
Abstract
Purpose
The overall goal of the data mining process is to extract information from an extensive data set and make it understandable for further use. When working with large volumes of unstructured data in research information systems, it is necessary to divide the information into logical groupings after examining their quality before attempting to analyze it. On the other hand, data quality results are valuable resources for defining quality excellence programs of any information system. Hence, the purpose of this study is to discover and extract knowledge to evaluate and improve data quality in research information systems.
Design/methodology/approach
Clustering in data analysis and exploiting the outputs allows practitioners to gain an in-depth and extensive look at their information to form some logical structures based on what they have found. In this study, data extracted from an information system are used in the first stage. Then, the data quality results are classified into an organized structure based on data quality dimension standards. Next, partitioning clustering (K-Means), density-based clustering (density-based spatial clustering of applications with noise [DBSCAN]) and hierarchical clustering (balanced iterative reducing and clustering using hierarchies [BIRCH]) are applied to compare and find the most appropriate clustering algorithm for the research information system.
Findings
This paper showed that quality control results of an information system could be categorized through well-known data quality dimensions, including precision, accuracy, completeness, consistency, reputation and timeliness. Furthermore, among different well-known clustering approaches, the BIRCH algorithm of hierarchical clustering methods performs better in data clustering and gives the highest silhouette coefficient value. Next in line is the DBSCAN method, which performs better than the K-Means method.
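The silhouette coefficient used here to rank the algorithms measures how much closer each point is to its own cluster than to the nearest other cluster. A pure-Python sketch on illustrative one-dimensional data (not the study's metadata):

```python
def silhouette(points, labels):
    """Mean silhouette coefficient over all points (pure-Python sketch).
    points: list of floats (1-D for simplicity); labels: cluster ids."""
    def dist(a, b):
        return abs(a - b)
    scores = []
    for i, (p, l) in enumerate(zip(points, labels)):
        # a: mean distance to the other points of the same cluster
        same = [dist(p, q) for j, (q, m) in enumerate(zip(points, labels))
                if m == l and j != i]
        a = sum(same) / len(same) if same else 0.0
        # b: mean distance to the nearest other cluster
        b = min(
            sum(dist(p, q) for q, m in zip(points, labels) if m == other)
            / sum(1 for m in labels if m == other)
            for other in set(labels) if other != l
        )
        scores.append((b - a) / max(a, b) if max(a, b) else 0.0)
    return sum(scores) / len(scores)

# Two well-separated illustrative clusters -> score close to 1
pts = [0.0, 0.2, 0.4, 10.0, 10.2, 10.4]
labs = [0, 0, 0, 1, 1, 1]
score = silhouette(pts, labs)
```

A value near 1 indicates compact, well-separated clusters, which is why a higher silhouette favoured BIRCH in the findings.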
Research limitations/implications
In the data quality assessment process, the discrepancies identified and the lack of proper classification for inconsistent data have led to unstructured reports, making the statistical analysis of metadata quality problems difficult and preventing the observed errors from being rooted out. Therefore, in this study, the evaluation results of data quality have been categorized into various data quality dimensions, based on which multiple analyses have been performed in the form of data mining methods.
Originality/value
Although several pieces of research have been conducted to assess data quality results of research information systems, knowledge extraction from obtained data quality scores is a crucial work that has rarely been studied in the literature. Besides, clustering in data quality analysis and exploiting the outputs allows practitioners to gain an in-depth and extensive look at their information to form some logical structures based on what they have found.
Sanjay Saifi and Ramiya M. Anandakumar
Abstract
Purpose
In an era overshadowed by the alarming consequences of climate change and the escalating peril of recurring floods for communities worldwide, the significance of proficient disaster risk management has reached unprecedented levels. The successful implementation of disaster risk management necessitates the ability to make informed decisions. To this end, the utilization of three-dimensional (3D) visualization and Web-based rendering offers decision-makers the opportunity to engage with interactive data representations. This study focuses on Thiruvananthapuram, India, where an analysis of flooding caused by the Karamana River furnishes valuable insights for facilitating well-informed decision-making in the realm of disaster management.
Design/methodology/approach
This work introduces a systematic procedure for evaluating the influence of flooding on 3D building models through the utilization of Web-based visualization and rendering techniques. To ensure precision, aerial light detection and ranging (LiDAR) data is used to generate accurate 3D building models in CityGML format, adhering to the standards set by the Open Geospatial Consortium. By using one-meter digital elevation models derived from LiDAR data, flood simulations are conducted to analyze flow patterns at different discharge levels. The integration of 3D building maps with geographic information system (GIS)-based vector maps and a flood risk map enables the assessment of the extent of inundation. To facilitate visualization and querying tasks, a Web-based graphical user interface (GUI) is developed.
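At its core, the inundation assessment described above compares a simulated water surface against DEM elevations. A hypothetical sketch with illustrative values; the study's actual hydraulic simulation is far richer:

```python
def flood_depths(dem, water_level):
    """Per-cell inundation depth from a DEM grid and a flat water level.
    Hypothetical sketch: a cell is inundated when the simulated water
    surface exceeds the terrain elevation at that cell."""
    return [[max(0.0, water_level - z) for z in row] for row in dem]

# Illustrative 2 x 3 elevation grid in metres (not Thiruvananthapuram data)
dem = [[1.0, 2.5, 4.0],
       [0.5, 3.0, 5.0]]
depths = flood_depths(dem, water_level=2.0)
flooded_cells = sum(d > 0 for row in depths for d in row)
```

Intersecting the positive-depth cells with building footprints from the 3D city model would then yield the affected structures.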
Findings
The research establishes the effectiveness of comprehensive 3D building maps in evaluating flood consequences in Thiruvananthapuram. By merging them with GIS-based vector maps and a flood risk map, it becomes possible to scrutinize the extent of inundation and the affected structures. Furthermore, the Web-based GUI facilitates interactive data exploration, visualization and querying, thereby assisting decision-making.
Originality/value
The study introduces an innovative approach that merges LiDAR data, 3D building mapping, flood simulation and Web-based visualization, which can be advantageous for decision-makers in disaster risk management and may have practical use in various regions and urban areas.
Daniel Coughlin and Binky Lush
Abstract
Purpose
At the authors’ libraries, they consolidated two departments and attempted to find ways to increase productivity, reduce duplication and improve job happiness within their software development teams. The teams lose institutional knowledge when developers leave, yet remain responsible for critical library services. The merging of the departments provided the opportunity to rethink how the teams are structured and whether a different model could provide better professional development, more knowledge sharing and better stability of their services. This article presents a case study of moving from a project-centric approach to a platform-based model.
Design/methodology/approach
The authors met with those responsible for establishing priorities for their services and developers to assess successful and unsuccessful implementations and pivoted based on those assessments.
Findings
The authors found that their developers were happier with their broadened portfolios and professional development, and the librarians were satisfied with more stable services during a particularly unstable time within the authors’ institution.
Originality/value
This is a practical example of a positive way to structure development teams in libraries. Frequently, teams support a single service to the library because of the criticality of that service on a day-to-day basis, but that can create a lack of shared knowledge during institutional instability. This study reveals the benefits of a platform-based approach, including increased developer happiness, reduced disruptions due to staff turnover and improved system stability. It also discusses the challenges of managing product owners' expectations and balancing feature development with maintenance work.
Han Sun, Song Tang, Xiaozhi Qi, Zhiyuan Ma and Jianxin Gao
Abstract
Purpose
This study aims to introduce a novel noise filter module designed for LiDAR simultaneous localization and mapping (SLAM) systems. The primary objective is to enhance pose estimation accuracy and improve the overall system performance in outdoor environments.
Design/methodology/approach
Distinct from traditional approaches, the proposed MCFilter emphasizes enhancing point cloud data quality at the pixel level. This framework hinges on two primary elements. First, the D-Tracker, a tracking algorithm, is grounded on multiresolution three-dimensional (3D) descriptors and adeptly maintains a balance between precision and efficiency. Second, the R-Filter introduces a pixel-level attribute named motion-correlation, which effectively identifies and removes dynamic points. Furthermore, designed as a modular component, MCFilter ensures seamless integration into existing LiDAR SLAM systems.
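Dynamic-point removal of this kind can be imagined as thresholding per-point displacement between frames. The following is a hypothetical sketch, not the authors' actual motion-correlation attribute; the threshold and the assumption of already-compensated ego-motion are illustrative:

```python
def remove_dynamic_points(prev, curr, threshold=0.5):
    """Hypothetical sketch of a dynamic-point filter: a point tracked
    across two frames (ego-motion assumed already compensated) is kept
    only when its displacement stays below a threshold in metres."""
    kept = []
    for (x0, y0, z0), (x1, y1, z1) in zip(prev, curr):
        disp = ((x1 - x0) ** 2 + (y1 - y0) ** 2 + (z1 - z0) ** 2) ** 0.5
        if disp < threshold:  # static point: keep for mapping
            kept.append((x1, y1, z1))
    return kept

# Illustrative tracked points; the third moved 2 m between frames (dynamic)
prev = [(0.0, 0.0, 0.0), (1.0, 1.0, 0.0), (5.0, 0.0, 0.0)]
curr = [(0.1, 0.0, 0.0), (1.0, 1.1, 0.0), (7.0, 0.0, 0.0)]
static = remove_dynamic_points(prev, curr)
```

Removing such points before scan matching is what lets a SLAM back end estimate pose against the static scene only.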
Findings
Based on rigorous testing with public data sets and under real-world conditions, MCFilter achieved a 12.39% increase in average accuracy and a 24.18% reduction in processing time. These outcomes emphasize the method’s effectiveness in refining the performance of current LiDAR SLAM systems.
Originality/value
In this study, the authors present a novel 3D descriptor tracker designed for consistent feature point matching across successive frames. The authors also propose an innovative attribute to detect and eliminate noise points. Experimental results demonstrate that integrating this method into existing LiDAR SLAM systems yields state-of-the-art performance.
Yunwei Gai, Alia Crocker, Candida Brush and Wiljeana Jackson Glover
Abstract
Purpose
Research has examined how new ventures strengthen local economic outcomes; however, limited research examines health-oriented ventures and their impact on social outcomes, including health outcomes. Increased VC investment in healthcare service start-ups signals more activity toward this end, and the need for further academic inquiry. We examine the relationship between these start-ups and county-level health outcomes, health factors, and hospital utilization.
Design/methodology/approach
Data on start-ups funded via institutional venture capital from PitchBook were merged with US county-level outcomes from the County Health Rankings and Area Health Resources Files for 2010 to 2019. We investigated how the number of VC-funded healthcare service start-ups, as well as a subset defined as innovative, were associated with county-level health measures. We used panel models with two-way fixed effects and propensity score matching (PSM), controlling for demographics and socioeconomic factors.
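A two-way fixed-effects panel model absorbs unit (county) and time (year) effects through the within transformation. A minimal sketch on an illustrative balanced panel, not the PitchBook data:

```python
def twoway_demean(y):
    """Within transformation for a two-way fixed-effects panel:
    subtract unit means and time means, then add back the grand mean.
    y is a units x periods list of lists (balanced panel, illustrative)."""
    n, t = len(y), len(y[0])
    grand = sum(sum(row) for row in y) / (n * t)
    unit_means = [sum(row) / t for row in y]
    time_means = [sum(y[i][j] for i in range(n)) / n for j in range(t)]
    return [[y[i][j] - unit_means[i] - time_means[j] + grand
             for j in range(t)] for i in range(n)]

# This toy panel is purely additive in unit and time effects, so the
# transformation removes all variation (residuals are exactly zero)
panel = [[1.0, 2.0],
         [3.0, 4.0]]
demeaned = twoway_demean(panel)
```

Regressing the demeaned outcome on the demeaned start-up counts then estimates the within-county, within-year association the findings report.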
Findings
Each additional VC-funded healthcare service start-up was related to a significant 0.01 percentage point decrease in diabetes prevalence (p < 0.01), a decrease of 1.54 HIV cases per 100,000 population (p < 0.1), a 0.02 percentage point decrease in obesity rates (p < 0.01), and a 0.03 percentage point decrease in binge drinking (p < 0.01). VC-funded healthcare service start-ups were not related to hospital utilization.
Originality/value
This work expands our understanding of how industry-specific start-ups, in this case healthcare start-ups, relate to positive social outcomes. The results underscore the importance of evidence-based evaluation, the need for expanded outcome measures for VC investment, and the possibilities for integration of healthcare services and entrepreneurship ecosystems.
Olivier Dupouët, Yoann Pitarch, Marie Ferru and Bastien Bernela
Abstract
Purpose
This study aims to explore the interplay between community dynamics and knowledge production using the quantum computing research field as a case study. Quantum computing holds the promise of dramatically increasing computation speed and solving problems that are currently unsolvable in a short space of time. In this highly dynamic area of innovation, computer companies, research laboratories and governments are racing to develop the field.
Design/methodology/approach
After constructing temporal co-authorship networks, the authors identify seven different events affecting communities of researchers, which they label: forming, growing, splitting, shrinking, continuing, merging, dissolving. The authors then extract keywords from the titles and abstracts of their contributions to characterize the dynamics of knowledge production and examine the relationship between community events and knowledge production over time.
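Community events of this kind are typically labelled by comparing membership sets across consecutive network snapshots. A hypothetical sketch, with illustrative rules and threshold rather than the authors' exact definitions:

```python
def community_event(prev, curr, jaccard_threshold=0.5):
    """Hypothetical sketch of labelling the event between two snapshots
    of one community (sets of author ids). The rules and threshold are
    illustrative, not the authors' exact event definitions."""
    if not prev and curr:
        return "forming"
    if prev and not curr:
        return "dissolving"
    inter = len(prev & curr)
    union = len(prev | curr)
    if inter / union < jaccard_threshold:
        return "splitting-or-merging"  # major membership turnover
    if len(curr) > len(prev):
        return "growing"
    if len(curr) < len(prev):
        return "shrinking"
    return "continuing"

# Illustrative snapshots: one new co-author joins an existing community
event = community_event({1, 2, 3}, {1, 2, 3, 4})
```

Pairing each detected event with the keywords extracted from the community's contributions is what links these dynamics to knowledge production.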
Findings
The findings show that forming and splitting are associated with retaining in memory what is currently known; merging and growing with the creation of new knowledge; and splitting, shrinking and dissolving with the curation of knowledge.
Originality/value
Although the link between communities and knowledge has long been established, much less is known about the relationship between the dynamics of communities and their link with collective cognitive processes. To the best of the authors’ knowledge, the present contribution is one of the first to shed light on this dynamic aspect of community knowledge production.
The purpose of this paper is to gain insight into how management accountants can become relevant business partners out of respect for existing locally developed accounts of…
Abstract
Purpose
The purpose of this paper is to gain insight into how management accountants can become relevant business partners out of respect for existing locally developed accounts of economic performance for decision-making.
Design/methodology/approach
The paper is based on qualitative semi-structured interviews with local business actors, in this case, families from seven financially successful Danish dairy farms. The casework and the analysis have been informed by pragmatic constructivism.
Findings
The local business actors do not use the official accounting system for ongoing cost-management-related decision-making. Instead, they use several epistemic methods that include locally developed decision models, experiences, rules of thumb and intuition. The farmers use these vernacular accountings to compensate for the cost management illusion that the formal accounting system tends to create. What the study suggests is that when management accountants engage as business partners, they are likely to enter a space where accounting is already present.
Originality/value
This paper argues that local business actors practice epistemic methods where they develop and use vernacular accountings to support their managerial practice, even in the absence of a professional management accountant. These vernacular accountings may lead the local actors into an illusion because the vernacular accountings do not necessarily have an inherent economic logic and theoretical reliability. The role of the management accountant in such a setting is hence to understand, support and advance local epistemic methods. Becoming a business partner requires a combination of management accounting analytical skills and a sense of empathy and sensitivity regarding what is already at play and how this can become an object of discussion without violating the values of the other.
The expected learning outcomes are to understand the complexities involved in the integration of two carriers with different business strategies and approaches, the merger of two…
Abstract
Learning outcomes
The expected learning outcomes are to understand the complexities involved in the integration of two carriers with different business strategies and approaches, the merger of two brands with distinct personas and identities and the confluence of two different cultures; figure out the strategic options in front of the Tata Group and how it can deal with various macro- and micro-level business challenges, defy the financial hiccups and manoeuvre the operational complexities to accomplish mission Vihaan.AI; and develop a pragmatic approach to macro and micro business environmental scanning for making strategic business decisions.
Case overview/synopsis
In November 2022, Tata Group, the salt-to-software conglomerate, announced the merger of Air India (AI) and Vistara. This would lead to the formation of a full-service airline under the brand name “Air India”. The obvious reason behind this was the higher recognition, salience and recall of the brand AI as compared with Vistara in the global market. The Tata Group envisaged the brand AI to be a significant international aviation player with the heritage, persona and ethos of the brand Vistara in the renewed manifestation of AI. To realise these goals, Tata Group laid down an ambitious plan called “Vihaan.AI”, which was aimed at capturing a domestic market share of 30% by 2027.
Complexity academic level
This case study can be taught as part of undergraduate- and postgraduate-level management programmes.
Supplementary materials
Teaching notes are available for educators only.
Subject code
CSS 11: Strategy.