Search results
Ghada Karaki, Rami A. Hawileh and M.Z. Naser
Abstract
Purpose
This study examines the effect of temperature-dependent material models for normal-strength (NSC) and high-strength concrete (HSC) on the thermal analysis of reinforced concrete (RC) walls.
Design/methodology/approach
The study performs a one-at-a-time (OAT) sensitivity analysis to assess the impact of variables defining the constitutive and parametric fire models on the wall's thermal response. Moreover, it extends the sensitivity analysis to a variance-based analysis to assess the effect of constitutive model type, fire model type and constitutive model uncertainty on the variance of the RC wall's thermal response. The study then determines the reliability of the wall's thermal behaviour considering the different constitutive models and their uncertainty.
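The OAT procedure can be sketched generically: hold all inputs at their baseline values, perturb one input at a time, and rank inputs by the output swing they cause. The surrogate model, parameter names and perturbation size below are illustrative assumptions, not the paper's thermal model of RC walls.

```python
# Minimal one-at-a-time (OAT) sensitivity sketch (illustrative only; the
# surrogate below is a stand-in, not the study's RC-wall thermal analysis).

def oat_sensitivity(model, baseline, delta=0.10):
    """Perturb each input by +/- delta (relative) while holding the others
    at baseline, and report the resulting output swing per input."""
    y0 = model(baseline)
    swings = {}
    for name, value in baseline.items():
        lo = dict(baseline, **{name: value * (1 - delta)})
        hi = dict(baseline, **{name: value * (1 + delta)})
        swings[name] = abs(model(hi) - model(lo))
    return y0, swings

# Hypothetical surrogate: temperature rise of a wall section as a function
# of conductivity k, volumetric heat capacity c and fire heating rate q.
def surrogate(p):
    return p["q"] / (p["k"] * p["c"])

baseline = {"k": 1.6, "c": 1000.0, "q": 50000.0}
y0, swings = oat_sensitivity(surrogate, baseline)
ranked = sorted(swings, key=swings.get, reverse=True)  # most influential first
```

The ranking step is the essence of OAT: it attributes output variability to one input at a time, which is exactly why the study then moves to a variance-based analysis to capture joint effects.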
Findings
It is found that the impact of variability in concrete's conductivity depends on its temperature-dependent model, which differs for NSC and HSC; therefore, further testing and improved material modelling are needed. Furthermore, the heating rate of the fire scenario is the dominant factor in determining fire-resistance performance because it is a causal factor for spalling in HSC walls. Finally, the reliability of the wall's performance decreases sharply for HSC walls due to the expected spalling of the concrete and loss of cross-section integrity.
Originality/value
Limited studies in the current open literature have quantified the impact of constitutive models on the behaviour of RC walls, and none has examined the effect of material-model uncertainty on the reliability of the wall's response under fire. Furthermore, the study's results contribute to the ongoing attempts to shape performance-based structural fire engineering.
Fateme Akhlaghinezhad, Amir Tabadkani, Hadi Bagheri Sabzevar, Nastaran Seyed Shafavi and Arman Nikkhah Dehnavi
Abstract
Purpose
Occupant behavior can lead to considerable uncertainties in thermal comfort and air quality within buildings. To tackle this challenge, the use of probabilistic controls to simulate occupant behavior has emerged as a potential solution. This study seeks to analyze the performance of free-running households by examining adaptive thermal comfort and CO2 concentration, both crucial variables in indoor air quality. Investigating the indoor environment dynamics caused by occupants' behavior has become increasingly important, especially after the COVID-19 pandemic. Specifically, the study investigates 13 distinct window and shading control strategies in courtyard houses to identify the factors that prompt occupants to interact with shading and windows and to determine which control approach most effectively minimizes the performance gap.
Design/methodology/approach
This paper compares commonly used deterministic and probabilistic control functions and their effects on occupant comfort and indoor air quality in four zones surrounding a courtyard. The zones are differentiated by windows facing the courtyard. The study utilizes the energy management system (EMS) functionality of EnergyPlus within an algorithmic interface called Ladybug Tools. By modifying geometrical dimensions, orientation, window-to-wall ratio (WWR) and window operable fraction, a total of 465 cases are analyzed to identify effective control scenarios. These factors were selected because, according to the literature, they can significantly affect occupants' thermal comfort and indoor air quality, in addition to the natural ventilation flow rate. Additionally, the Random Forest algorithm is employed to estimate the individual impact of each control scenario on indoor thermal comfort and air quality metrics, including operative temperature and CO2 concentration.
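The two control families being compared can be illustrated in miniature: a deterministic rule opens the window whenever a fixed threshold is crossed, while a probabilistic rule opens it with a temperature-dependent probability. All thresholds and logistic coefficients below are hypothetical, not the paper's calibrated occupant-behavior functions.

```python
import math
import random

def deterministic_control(indoor_temp, co2_ppm, temp_limit=26.0, co2_limit=1000):
    """Open the window if either the comfort or the air-quality limit is exceeded."""
    return indoor_temp > temp_limit or co2_ppm > co2_limit

def probabilistic_control(indoor_temp, rng, a=-12.0, b=0.5):
    """Logistic occupant model: opening probability rises with temperature."""
    p_open = 1.0 / (1.0 + math.exp(-(a + b * indoor_temp)))
    return rng.random() < p_open

rng = random.Random(42)
opens_now = deterministic_control(27.5, 650)  # above the comfort threshold
# Fraction of simulated occupants who would open the window at 30 degrees C:
open_rate = sum(probabilistic_control(30.0, rng) for _ in range(1000)) / 1000
```

The deterministic rule always fires at the same conditions, whereas the probabilistic rule reproduces the occupant-to-occupant variability that drives the performance gap discussed in the abstract.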
Findings
The findings of the study confirmed that both deterministic and probabilistic window control algorithms were effective in reducing thermal discomfort hours, with reductions of 56.7 and 41.1%, respectively. Deterministic shading controls resulted in a reduction of 18.5%. Implementing the window control strategies led to a significant decrease of 87.8% in indoor CO2 concentration. The sensitivity analysis revealed that outdoor temperature exhibited the strongest positive correlation with indoor operative temperature while showing a negative correlation with indoor CO2 concentration. Furthermore, zone orientation and length were identified as the most influential design variables in achieving the desired performance outcomes.
Research limitations/implications
It’s important to acknowledge the limitations of this study. Firstly, the potential impact of air circulation through the central zone was not considered. Secondly, the investigated control scenarios may have different impacts on air-conditioned buildings, especially when considering energy consumption. Thirdly, the study heavily relied on simulation tools and algorithms, which may limit its real-world applicability. The accuracy of the simulations depends on the quality of the input data and the assumptions made in the models. Fourthly, the case study is hypothetical in nature to be able to compare different control scenarios and their implications. Lastly, the comparative analysis was limited to a specific climate, which may restrict the generalizability of the findings in different climates.
Originality/value
Occupant behavior represents a significant source of uncertainty, particularly during the early stages of design. This study aims to offer a comparative analysis of various deterministic and probabilistic control scenarios that are based on occupant behavior. The study evaluates the effectiveness and validity of these proposed control scenarios, providing valuable insights for design decision-making.
Abstract
Purpose
This paper aims to solve the path optimization problem by modifying the probabilistic roadmap (PRM) technique, as it suffers from difficulties in selecting the optimal number of nodes and deploying them in free space for reliable trajectory planning.
Design/methodology/approach
Traditional PRM is modified by developing a decision-making strategy for the selection of optimal nodes w.r.t. the complexity of the environment and deploying the optimal number of nodes outside the closed segment. Subsequently, the generated trajectory is made smoother by implementing the modified Bezier curve technique, which selects an optimal number of control points near the sharp turns for the reliable convergence of the trajectory that reduces the sum of the robot’s turning angles.
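The smoothing step can be illustrated with De Casteljau's algorithm for evaluating a Bezier curve over a set of control points. The corner coordinates below are invented, and the paper's modified technique additionally selects the control points adaptively near sharp turns.

```python
# De Casteljau evaluation of a Bezier curve: repeatedly interpolate between
# consecutive control points until a single point remains at parameter t.

def bezier_point(control_pts, t):
    """Evaluate a Bezier curve at parameter t in [0, 1]."""
    pts = list(control_pts)
    while len(pts) > 1:
        pts = [((1 - t) * x0 + t * x1, (1 - t) * y0 + t * y1)
               for (x0, y0), (x1, y1) in zip(pts, pts[1:])]
    return pts[0]

# A sharp 90-degree corner in a raw PRM path, smoothed into a curve:
corner = [(0.0, 0.0), (1.0, 0.0), (1.0, 1.0)]
smoothed = [bezier_point(corner, i / 10) for i in range(11)]
```

The resulting curve starts and ends on the original path but cuts inside the corner, which is how the smoothing reduces the sum of the robot's turning angles.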
Findings
The proposed technique is compared with state-of-the-art techniques and shows a 12.46% reduction in computational load, a 100% reduction in sharp turns, a 100% reduction in collisions and a 19.91% increase in the velocity parameter.
Originality/value
The proposed adaptive technique provides a better solution for autonomous navigation of unmanned ground vehicles, transportation, warehouse applications, etc.
Emmanouil G. Chalampalakis, Ioannis Dokas and Eleftherios Spyromitros
Abstract
Purpose
This study focuses on the evaluation of the banking systems in Portugal, Italy, Ireland, Greece and Spain (known as the PIIGS) during the financial and post-financial-crisis period from 2009 to 2018.
Design/methodology/approach
A conditional robust nonparametric frontier analysis (order-m estimators) is used to measure banking efficiency, combined with variables highlighting the effects of non-performing loans (NPLs). Next, a truncated regression is used to examine whether institutional, macroeconomic and financial variables affect bank performance differently. Unlike earlier studies, we use the Corruption Perceptions Index (CPI) as an institutional variable that affects banking sector efficiency.
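The order-m idea can be sketched with a small Monte Carlo routine: for each bank, repeatedly draw m peers that use no more input, and compare the expected best peer output with the bank's own output. The single-input, single-output toy data below are invented for illustration and do not reflect the study's conditional, multi-variable estimator.

```python
import random

def order_m_score(x0, y0, sample, m=25, draws=2000, rng=None):
    """Output-oriented order-m efficiency sketch: expected best output among
    m randomly drawn peers using no more input than x0, relative to y0."""
    rng = rng or random.Random(0)
    peers = [y for x, y in sample if x <= x0]  # dominating-input peer set
    total = 0.0
    for _ in range(draws):
        best = max(rng.choice(peers) for _ in range(m))
        total += best / y0
    return total / draws  # score > 1 means peers tend to dominate the bank

# Toy (input, output) pairs standing in for bank data:
banks = [(10, 5), (12, 9), (15, 7), (20, 12), (25, 10)]
score = order_m_score(15, 7, banks)
```

Because only a finite sample of m peers is drawn each time, the frontier is "robust": a single extreme bank does not automatically dominate every comparison, unlike in full DEA/FDH envelopment.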
Findings
This research shows that the PIIGS crisis affects each bank/country differently due to their various efficiency levels. Most of the study variables (CPI, government debt-to-GDP ratio, inflation and bank size) significantly affect banking efficiency measures.
Originality/value
The contribution of this article to the relevant banking literature is two-fold. First, it analyses the efficiency of the PIIGS banking system from 2009 to 2018, focusing on NPLs. Second, this is the first empirical study to use probabilistic frontier analysis (order-m estimators) to evaluate PIIGS banking systems.
Ahmad Ebrahimi and Sara Mojtahedi
Abstract
Purpose
Warranty-based big data analysis has attracted a great deal of attention because of its key capabilities and role in improving product quality while minimizing costs. Information and details about particular parts (components) repair and replacement during the warranty term, usually stored in the after-sales service database, can be used to solve problems in a variety of sectors. Due to the small number of studies related to the complete analysis of parts failure patterns in the automotive industry in the literature, this paper focuses on discovering and assessing the impact of lesser-studied factors on the failure of auto parts in the warranty period from the after-sales data of an automotive manufacturer.
Design/methodology/approach
The interconnected method used in this study for analyzing failure patterns is formed by combining association rules (AR) mining and Bayesian networks (BNs).
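The AR-mining half of the method rests on support and confidence counts over failure records. A minimal sketch with invented warranty "transactions" (not the manufacturer's after-sales data):

```python
# Support: fraction of claims containing an itemset.
# Confidence: support of (antecedent + consequent) over support of antecedent.

def support(itemset, transactions):
    return sum(itemset <= t for t in transactions) / len(transactions)

def confidence(antecedent, consequent, transactions):
    return support(antecedent | consequent, transactions) / support(antecedent, transactions)

# Invented part-failure records within the warranty term:
claims = [
    {"alternator", "battery"},
    {"alternator", "battery", "wiring"},
    {"battery"},
    {"alternator", "wiring"},
]
# How often does a battery claim accompany an alternator claim?
conf = confidence({"alternator"}, {"battery"}, claims)
```

Rules passing support and confidence thresholds feed the time/location analysis; factors the rules cannot capture are then handed to the Bayesian network stage.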
Findings
This research utilized AR analysis to extract valuable information from warranty data, exploring the relationship between component failure, time and location. Additionally, BNs were employed to investigate other potential factors influencing component failure that could not be identified using AR alone. This approach provided a more comprehensive evaluation of the data and valuable insights for decision-making in relevant industries.
Originality/value
This study's findings are expected to be of practical value, enabling a better dissection of failure patterns and providing a comprehensive package that can be utilized to increase component quality and move beyond cross-sectional solutions. The integration of these methods allowed for a wider exploration of potential factors influencing component failure, enhancing the validity and depth of the research findings.
Meenal Arora, Anshika Prakash, Amit Mittal and Swati Singh
Abstract
Purpose
HR analytics is a process for systematic computational analysis of data or statistics. It discovers, interprets and communicates significant patterns in data to enable evidence-based HR research and uses analytical insights to help organizations achieve their strategic objectives. However, its adoption and utilization among HR professionals remain a subject of concern. This study aims to determine the reasons that facilitate or inhibit the acceptance of HR analytics among HR professionals in the banking, financial services and insurance (BFSI) sector.
Design/methodology/approach
A sample of 387 HR professionals in BFSI firms across India was collected through non-probabilistic purposive sampling. Structural equation modeling was applied to analyze the association between predetermined variables. In addition, the predictive relevance of “Data Availability” was analyzed using hierarchical regression.
Findings
The results revealed that data availability, hedonic motivation and performance expectancy positively influenced behavioral intention (BI). In contrast, effort expectancy, social influence and habit had an insignificant effect on BI. Together, facilitating conditions (FCs), habit and BI explained 60% of the variance in HR analytics use. The use behavior of HR analytics was significantly influenced by FCs and BI.
Practical implications
This study offers insights into the elements that influence HR analytics adoption, shedding additional light on success drivers and on the grey areas behind failed adoption.
Originality/value
This research adds to the body of knowledge by identifying factors that hinder the adoption of HR analytics in Indian organizations and signifies the relevance of easy accessibility and availability of data for technology adoption.
Narsymbat Salimgereyev, Bulat Mukhamediyev and Aijaz A. Shaikh
Abstract
Purpose
This study developed new measures of the routine and non-routine task contents of managerial, professional, technical, and clerical occupations from a workload perspective. Here, we present a comparative analysis of the workload structures of state and industrial sector employees.
Design/methodology/approach
Our method involves detailed descriptions of work processes and an element-wise time study. We collected and analysed data to obtain a workload structure that falls within three conceptual task categories: (i) non-routine analytic tasks, (ii) non-routine interactive tasks and (iii) routine cognitive tasks. A total of 2,312 state and industrial sector employees in Kazakhstan participated in the study. The data were collected using a proprietary web application that resembles a timesheet.
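Aggregating element-wise time records into the three conceptual categories can be sketched as follows; the timesheet entries and the task-to-category mapping are invented stand-ins for the study's detailed work-process descriptions.

```python
# Hypothetical mapping from logged task names to the study's three categories.
CATEGORY = {
    "data analysis": "non-routine analytic",
    "client meeting": "non-routine interactive",
    "report filing": "routine cognitive",
    "data entry": "routine cognitive",
}

def workload_structure(entries):
    """Share of total logged minutes falling in each conceptual task category."""
    totals = {}
    for task, minutes in entries:
        cat = CATEGORY[task]
        totals[cat] = totals.get(cat, 0) + minutes
    grand = sum(totals.values())
    return {cat: m / grand for cat, m in totals.items()}

# One invented timesheet day (task, minutes):
day = [("data entry", 120), ("client meeting", 60),
       ("data analysis", 90), ("report filing", 30)]
shares = workload_structure(day)
```

The routine-cognitive share produced this way is the quantity the study compares across job levels, sectors and unit types, and it bounds the potential staff reduction from automating routine work.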
Findings
The study results are consistent with the general trend reported by previous studies: the higher the job level, the lower the occupation's routine task content. In addition, the routine cognitive task contents of managerial, professional, technical and clerical occupations in the industrial sector are higher than those in local governments. The work of women is also more routine than that of men. Finally, the routine cognitive task contents of occupations in administrative units are higher than those of occupations in substantive units.
Originality/value
Our study sought to address the challenges of using the task-based approach associated with measuring tasks by introducing a new measurement framework. The main advantage of our task measures is a direct approach to assessing workloads consisting of routine tasks, which allows for an accurate estimation of potential staff reductions due to the automation of work processes.
Punsara Hettiarachchi, Subodha Dharmapriya and Asela Kumudu Kulatunga
Abstract
Purpose
This study aims to minimize the transportation-related cost in distribution while utilizing a heterogeneous fixed fleet to deliver distinct demands at different geographical locations with a proper workload-balancing approach. Increased distribution cost is a major problem for many companies due to the absence of efficient planning methods to overcome operational challenges in distinct distribution networks, and this combined problem has not gained adequate attention in the literature.
Design/methodology/approach
This study formulated the transportation problem as a vehicle routing problem with a heterogeneous fixed fleet and workload balancing, a combinatorial optimization problem of the NP-hard category. The model was solved using both simulated annealing and a genetic algorithm (GA), adopting distinct local search operators, with a greedy approach used to generate the initial solution for both algorithms. The paired t-test was used to select the better algorithm. Through a number of scenarios, the baseline conditions of the problem were further tested by investigating alternative compositions of the heterogeneous fleet; results were analyzed using analysis of variance (ANOVA) and Hsu's MCB method to identify the best scenario.
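A bare-bones version of the simulated annealing component, using 2-opt-style segment reversals on a single route, might look like the sketch below. The coordinates, cooling schedule and neighbour move are illustrative; the study's model additionally handles a heterogeneous fixed fleet, multiple routes and load balancing.

```python
import math
import random

def route_length(route, pts):
    """Total closed-tour distance for a visiting order over point indices."""
    return sum(math.dist(pts[route[i]], pts[route[(i + 1) % len(route)]])
               for i in range(len(route)))

def anneal(pts, temp=10.0, cooling=0.995, steps=4000, seed=1):
    """Improve a route by reversing random segments (2-opt-style moves),
    occasionally accepting worse routes while the temperature is high."""
    rng = random.Random(seed)
    route = list(range(len(pts)))
    cur = route_length(route, pts)
    for _ in range(steps):
        i, j = sorted(rng.sample(range(len(pts)), 2))
        cand = route[:i] + route[i:j + 1][::-1] + route[j + 1:]
        delta = route_length(cand, pts) - cur
        if delta < 0 or rng.random() < math.exp(-delta / temp):
            route, cur = cand, cur + delta
        temp *= cooling
    return route, cur

# Invented delivery locations:
pts = [(0, 0), (0, 2), (2, 2), (2, 0), (1, 3)]
route, length = anneal(pts)
```

The GA in the study explores the same solution space with population-based operators; the paired t-test then decides which metaheuristic yields the better-quality solutions.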
Findings
The solutions generated by both algorithms were subjected to the paired t-test, and the results revealed that the GA produced better-quality solutions for planning a heterogeneous fleet for distribution with load balancing. The scenario analysis, evaluated using ANOVA and Hsu's MCB method, found that removing the lowest-capacity trucks enhances average vehicle utilization while reducing travel distance.
Research limitations/implications
The developed model considers both the planning of a heterogeneous fleet and the requirement of workload balancing, which are common industry needs that have not been addressed adequately in the literature, either individually or collectively. The adopted solution methodology for this NP-hard distribution problem, consisting of metaheuristics, statistical analysis and scenario analysis, is another significant contribution. The planning of distribution operations addresses not only operational-level decisions but also, through the scenario analysis, strategic-level decisions.
Originality/value
The planning of distribution operations addresses not only operational-level decisions but also, through a scenario analysis, strategic-level decisions.
Nicola Cobelli and Silvia Blasi
Abstract
Purpose
This paper explores the Adoption of Technological Innovation (ATI) in the healthcare industry. It investigates how the literature has evolved and what the emerging innovation dimensions in healthcare-industry adoption studies are.
Design/methodology/approach
We followed a mixed-method approach combining bibliometric methods and topic modeling, with 57 papers being deeply analyzed.
Findings
Our results identify three latent topics. The first relates to digitalization in healthcare, with a specific focus on the COVID-19 pandemic. The second groups the word combinations dealing with the research models and their constructs. The third refers to healthcare systems/professionals and their resistance to ATI.
Research limitations/implications
The study’s sample selection focused on scientific journals included in the Academic Journal Guide and in the FT Research Rank. However, the paper identifies trends that offer managerial insights for stakeholders in the healthcare industry.
Practical implications
ATI has the potential to revolutionize the health service delivery system and to decentralize services traditionally provided in hospitals or medical centers. All this would contribute to a reduction in waiting lists and the provision of proximity services.
Originality/value
The originality of the paper lies in the combination of two methods: bibliometric analysis and topic modeling. This approach allowed us to understand the ATI evolutions in the healthcare industry.
Abstract
Purpose
Construction risk management (CRM) has been a key issue in construction management research, producing a large number of publications. This study aims to undertake a review of the global CRM research published from 2000 to 2021 and identify the evolution of the research topics relating to CRM.
Design/methodology/approach
This study collected the bibliographic data of 2,034 journal articles published in 2000–2021 from the Web of Science (WoS) core collection database and adopted two bibliometric analysis methods, namely historiography and keyword co-occurrence, to identify the evolution trend of CRM research topics.
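The keyword co-occurrence step reduces to counting how often keyword pairs appear in the same bibliographic record. A toy sketch with invented keyword lists (not WoS data):

```python
from collections import Counter
from itertools import combinations

def cooccurrence(records):
    """Count how often each unordered keyword pair appears in the same paper."""
    pairs = Counter()
    for kws in records:
        for a, b in combinations(sorted(set(kws)), 2):
            pairs[(a, b)] += 1
    return pairs

# Invented author-keyword lists standing in for WoS records:
papers = [
    ["risk management", "construction", "safety"],
    ["risk management", "construction", "BIM"],
    ["safety", "construction"],
]
pairs = cooccurrence(papers)
```

Thresholding these pair counts yields the co-occurrence network whose clusters reveal how CRM research topics group and shift over time.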
Findings
This study found that risk analysis methods have shifted from simply ranking risks in terms of their relative importance or significance toward examining the interrelationships among risks, and that the objects of CRM research have shifted from generic construction projects toward specified types of construction projects (e.g. small projects, underground construction projects, green buildings and prefabricated projects). In addition, researchers tend to pay more attention to an individual risk category (e.g. political risk, safety risk and social risk) and integrate CRM into cost, time, quality, safety and environment management functions with the increasing adoption of various information and communication technologies.
Research limitations/implications
This study focused on English-language journal articles in the WoS core collection database only, thus excluding publications in other languages, publications not indexed by WoS and conference proceedings. In addition, the historiography focused on the top documents in terms of document strength and thus ignored the role of documents whose strengths were slightly below the threshold.
Originality/value
This review study is more inclusive than any prior reviews on CRM and overcomes the drawbacks of mere reliance on either bibliometric analysis results or subjective opinions. Revealing the evolution process of the CRM knowledge domain, this study provides an in-depth understanding of the CRM research and benefits industry practitioners and researchers.