Search results
1 – 10 of 34

Hai Thi Thanh Nguyen, Tommi Tapanainen and Geoffrey Hubona
Abstract
Purpose
The advancement of technologies has made it possible for health-care organizations to provide convenient online services that enable people to manage their health conditions. Although many studies have investigated the adoption and benefits of e-health services, there has been little focus on health-oriented behaviors after adoption, particularly in relation to service quality and user satisfaction.
Design/methodology/approach
This paper draws on the stimulus-organism-response (SOR) model and service quality theories to investigate behavioral responses, including word-of-mouth, intention to use and intention to act. The authors use partial least squares structural equation modeling (PLS-SEM) with 194 participants in a diabetes risk test survey in Finland.
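The structural chain the SOR model posits (service quality as stimulus, satisfaction as organism, behavioral responses) can be illustrated with a toy path estimation. This is a minimal sketch on synthetic data, using ordinary least squares on standardized scores as a crude stand-in for the PLS-SEM analysis the study actually ran; all variable names and effect sizes here are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 194  # same sample size as the study; the data below are synthetic

# Hypothetical standardized scores (not the authors' data):
# service quality -> user satisfaction -> intention to act
quality = rng.normal(size=n)
satisfaction = 0.6 * quality + rng.normal(scale=0.8, size=n)
intention_to_act = 0.5 * satisfaction + rng.normal(scale=0.9, size=n)

def path_coef(x, y):
    """OLS slope of y on standardized x (a crude stand-in for a PLS path)."""
    x = (x - x.mean()) / x.std()
    y = (y - y.mean()) / y.std()
    return float(np.linalg.lstsq(np.c_[x, np.ones_like(x)], y, rcond=None)[0][0])

b1 = path_coef(quality, satisfaction)           # stimulus -> organism
b2 = path_coef(satisfaction, intention_to_act)  # organism -> response
print(round(b1, 2), round(b2, 2))
```

With standardized single indicators the fitted slope is just the bivariate correlation; a real PLS-SEM analysis additionally models multi-item constructs and estimates all paths jointly.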
Findings
The results show that people are willing to engage in health self-management behaviors if they intend to use the e-health service and are satisfied with it. User satisfaction can be enhanced by improving the visual appeal of the website presentation, the quality of the presented information, as well as the usability of the website, all as components of e-health services.
Originality/value
The authors contribute by creating a construct “intention to act,” referring to health-oriented behaviors resulting from e-health service use. In addition, this study is among the first to apply the SOR model to investigate how user satisfaction leads to intention to use, intention to act and word-of-mouth.
Fatemeh Ravandi, Azar Fathi Heli Abadi, Ali Heidari, Mohammad Khalilzadeh and Dragan Pamucar
Abstract
Purpose
Untimely responses to emergency situations in urban areas contribute to a rising mortality rate and impact society's primary capital. The efficient dispatch and relocation of ambulances pose operational and momentary challenges, necessitating an optimal policy based on the system's real-time status. While previous studies have addressed these concerns, limited attention has been given to the optimal allocation of technicians to respond to emergency situations and minimize overall system costs.
Design/methodology/approach
In this paper, a bi-objective mathematical model is proposed to maximize system coverage and enable flexible movement across bases for location, dispatch and relocation of ambulances. Ambulance relocation involves two key decisions: (1) allocating ambulances to bases after completing services and (2) deciding to change the current ambulance location among existing bases to potentially improve response times to future emergencies. The model also considers the varying capabilities of technicians for proper allocation in emergency situations.
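The coverage-versus-cost trade-off can be sketched as a toy bi-objective location problem. The instance below is entirely hypothetical (not the paper's data) and is solved by exhaustive enumeration, which plays the role of an exact method on small instances; NSGA-II and MOPSO approximate the same Pareto front when enumeration becomes infeasible.

```python
from itertools import combinations

# Hypothetical toy instance: 4 candidate bases, 2 ambulances;
# each base covers a set of demand zones at a given cost.
coverage_of = {0: {"A", "B"}, 1: {"B", "C"}, 2: {"C", "D"}, 3: {"D", "E"}}
cost_of = {0: 5, 1: 4, 2: 6, 3: 9}

def evaluate(bases):
    covered = set().union(*(coverage_of[b] for b in bases))
    return len(covered), sum(cost_of[b] for b in bases)

def dominates(a, b):
    # a dominates b: at least as much coverage, at most as much cost,
    # and strictly better on at least one objective
    return a[1] >= b[1] and a[2] <= b[2] and (a[1] > b[1] or a[2] < b[2])

# Enumerate all placements of 2 ambulances; keep the Pareto-efficient ones
# (maximize coverage, minimize cost)
solutions = [(bases,) + evaluate(bases) for bases in combinations(range(4), 2)]
pareto = [s for s in solutions if not any(dominates(o, s) for o in solutions)]
for bases, cov, cost in sorted(pareto, key=lambda s: s[2]):
    print("bases", bases, "-> coverage:", cov, "cost:", cost)
```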
Findings
The Augmented Epsilon-Constrained (AEC) method is employed to solve the proposed model for small-sized problems. Due to the NP-hardness of the model, the NSGA-II and MOPSO metaheuristic algorithms are utilized to obtain efficient solutions for large-sized problems. The findings demonstrate the superiority of the MOPSO algorithm.
Practical implications
This study can be useful for emergency medical centers and healthcare companies in providing more effective responses to emergency situations by sending technicians and ambulances.
Originality/value
In this study, a bi-objective mathematical model is developed for ambulance location and dispatch and solved using the AEC method as well as the NSGA-II and MOPSO metaheuristic algorithms. The mathematical model encompasses three primary types of decision-making: (1) allocating ambulances to bases after completing their service, (2) deciding to relocate the current ambulance among existing bases to potentially enhance response times to future emergencies and (3) considering the diverse abilities of technicians for accurate allocation to emergency situations.
Abstract
Purpose
This study empirically demonstrates a contradiction between pillar 3 of Basel norms III and the designation of Systemically Important Banks (SIBs), also known as Too Big to Fail (TBTF). The objective of this study is threefold, which has been approached in a phased manner. The first is to determine the systemic importance of the banks under study; second, to examine if market discipline exists at different levels of systemic importance of banks and lastly, to examine if the strength of market discipline varies at different levels of systemic importance.
Design/methodology/approach
This study is based on all the public and private sector banks operating in the Indian banking sector. The Gaussian Mixture Model algorithm has been utilized to classify banks into distinct levels of systemic importance. Thereafter, market discipline has been observed by analyzing depositors' sentiments toward banks' risk (CAMEL indicators). The analysis has been performed by employing the system Generalized Method of Moments (GMM) to estimate models with different dependent variables.
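The classification step can be illustrated with scikit-learn's Gaussian Mixture Model. The sketch below uses synthetic two-dimensional stand-ins for systemic-importance indicators of 30 banks (the study itself uses CAMEL indicators for Indian public and private sector banks); the three fitted components play the role of the distinct levels of systemic importance.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(42)

# Synthetic stand-ins for size/interconnectedness indicators of 30 banks
# (illustrative only, not the study's data)
small = rng.normal([0.0, 0.0], 0.3, size=(20, 2))
mid   = rng.normal([2.0, 2.0], 0.3, size=(7, 2))
big   = rng.normal([5.0, 5.0], 0.3, size=(3, 2))
X = np.vstack([small, mid, big])

# Fit three latent levels of systemic importance and assign each bank a level
gmm = GaussianMixture(n_components=3, random_state=0).fit(X)
labels = gmm.predict(X)
print("cluster sizes:", sorted(np.bincount(labels).tolist()))
```

Unlike a hard threshold rule, the mixture model yields posterior membership probabilities per bank, which is what makes it a data-driven alternative to the RBI's conventional bucketing.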
Findings
The findings affirm the existence of market discipline across all levels of systemic importance. However, the strength of market discipline varies with the systemic importance of the banks, with weak market discipline being a negative externality of the SIBs designation.
Originality/value
By employing the Gaussian Mixture Model algorithm to develop a framework for categorizing banks on the basis of their systemic importance, this study is the first to go beyond the conventional method as outlined by the Reserve Bank of India (RBI).
Emerson Norabuena-Figueroa, Roger Rurush-Asencio, K. P. Jaheer Mukthar, Jose Sifuentes-Stratti and Elia Ramírez-Asís
Abstract
The development of information technologies has led to a considerable transformation in human resource management, from conventional personnel management to a modern, data-driven practice. Data mining technology, which has been widely used in several applications, including those that function on the web, includes clustering algorithms as a key component. Web intelligence is a recent academic field that calls for sophisticated analytics and machine learning techniques to facilitate information discovery, particularly on the web. Human resource data gathered from the web are typically enormous, highly complex, dynamic and unstructured, and traditional clustering methods are ineffective on such data and need to be upgraded. Swarm intelligence, a subset of nature-inspired computing, addresses this difficulty by enhancing and extending standard clustering algorithms with optimization capabilities. The authors collect the initial raw human resource data and preprocess them, wherein data cleaning, data normalization and data integration take place. The proposed K-C-means-data driven cuckoo bat optimization algorithm (KCM-DCBOA) is used for clustering of the human resource data. Feature extraction is done using principal component analysis (PCA), and the classification of human resource data is done using a support vector machine (SVM). Other approaches from the literature were contrasted with the suggested approach. According to the experimental findings, the suggested technique has extremely promising features in terms of clustering quality and execution time.
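The KCM-DCBOA itself is bespoke, so the sketch below substitutes plain K-means for the swarm-optimized clustering step; the surrounding pipeline (normalization, clustering, PCA feature extraction, SVM classification) mirrors the stages the abstract describes. Data, feature names and the resulting accuracy are synthetic and purely illustrative.

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans
from sklearn.decomposition import PCA
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)

# Synthetic HR records: [experience, performance, training hours, absences]
# drawn from two hypothetical employee profiles
X = np.vstack([rng.normal(m, 0.5, size=(60, 4))
               for m in ([0, 0, 0, 0], [3, 3, 3, 3])])

X_std = StandardScaler().fit_transform(X)            # data normalization
clusters = KMeans(n_clusters=2, n_init=10,           # stand-in for KCM-DCBOA
                  random_state=0).fit_predict(X_std)
X_pca = PCA(n_components=2).fit_transform(X_std)     # feature extraction
X_tr, X_te, y_tr, y_te = train_test_split(X_pca, clusters, random_state=0)
acc = SVC().fit(X_tr, y_tr).score(X_te, y_te)        # SVM classification
print("held-out accuracy:", acc)
```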
Serena Summa, Alex Mircoli, Domenico Potena, Giulia Ulpiani, Claudia Diamantini and Costanzo Di Perna
Abstract
Purpose
Nearly 75% of EU buildings are not energy-efficient enough to meet the international climate goals, which triggers the need to develop sustainable construction techniques with a high degree of resilience against climate change. In this context, a promising construction technique is represented by ventilated façades (VFs). This paper aims to propose three different VFs, and the authors define a novel machine learning-based approach to evaluate and predict their energy performance under different boundary conditions, without the need for expensive on-site experimentation.
Design/methodology/approach
The approach is based on the use of machine learning algorithms for the evaluation of different VF configurations and allows for the prediction of the temperatures in the cavities and of the heat fluxes. The authors trained different regression algorithms and obtained low prediction errors, in particular for temperatures. The authors used such models to simulate the thermo-physical behavior of the VFs and determined the most energy-efficient design variant.
Findings
The authors found that regression trees allow for an accurate simulation of the thermal behavior of VFs. The authors also studied feature weights to determine the most relevant thermo-physical parameters. Finally, the authors determined the best design variant and the optimal air velocity in the cavity.
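A minimal flavor of the regression-tree step, on synthetic data with an assumed toy relation between boundary conditions and cavity temperature (the real models are trained on measured or simulated VF data, and the true thermo-physical relation is far richer): fit a tree, score it, and read the feature weights the abstract mentions.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(7)
n = 500

# Synthetic boundary conditions (illustrative, not the authors' measurements)
t_out = rng.uniform(-5, 35, n)    # outdoor temperature [degC]
irrad = rng.uniform(0, 900, n)    # solar irradiance [W/m2]
v_air = rng.uniform(0.1, 2.0, n)  # air velocity in the cavity [m/s]

# Assumed toy relation: cavity temperature follows the outdoor temperature,
# rises with irradiance and is moderated by the ventilation rate
t_cav = t_out + 0.02 * irrad / (1 + v_air) + rng.normal(0, 0.5, n)

X = np.c_[t_out, irrad, v_air]
tree = DecisionTreeRegressor(max_depth=6, random_state=0).fit(X, t_cav)
r2 = tree.score(X, t_cav)
weights = dict(zip(["t_out", "irradiance", "v_air"], tree.feature_importances_))
print(round(r2, 3), max(weights, key=weights.get))
```

On this toy relation the tree's feature importances correctly single out the dominant driver, which is the kind of analysis used to identify the most relevant thermo-physical parameters.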
Originality/value
This study is unique in four main aspects: the thermo-dynamic analysis is performed under different thermal masses, positions of the cavity and geometries; the VFs are mated with a controlled ventilation system, used to parameterize the thermodynamic behavior under stepwise variations of the air inflow; temperatures and heat fluxes are predicted through machine learning models; the best configuration is determined through simulations, with no onerous in situ experimentations needed.
Abstract
Purpose
This study aims to construct a sentiment series generation method for danmu comments based on deep learning, and explore the features of sentiment series after clustering.
Design/methodology/approach
This study consisted of two main parts: danmu comment sentiment series generation and clustering. In the first part, the authors proposed a sentiment classification model based on BERT fine-tuning to quantify danmu comment sentiment polarity. To smooth the sentiment series, they used methods such as comprehensive weights. In the second part, the shape-based distance (SBD) K-shape method was used to cluster the actual collected data.
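The shape-based distance at the core of K-shape can be computed directly: it is one minus the maximum z-normalized cross-correlation over all alignments, which is what makes the clustering insensitive to the temporal phase shifts of danmu activity. A minimal numpy sketch on toy series (not actual danmu data):

```python
import numpy as np

def sbd(x, y):
    """Shape-based distance (SBD), the metric underlying K-shape clustering:
    1 minus the maximum normalized cross-correlation over all shifts."""
    x = (x - x.mean()) / x.std()
    y = (y - y.mean()) / y.std()
    cc = np.correlate(x, y, mode="full")   # cross-correlation at every lag
    return 1.0 - cc.max() / (np.linalg.norm(x) * np.linalg.norm(y))

# Two toy sentiment series: the second is a phase-shifted copy of the first,
# the third is its mirror image
t = np.linspace(0, 2 * np.pi, 100)
a = np.sin(t)
b = np.roll(a, 15)                         # temporal phase shift
print(round(sbd(a, b), 3), round(sbd(a, -a), 3))
```

A shifted copy stays close to the original under SBD, while a genuinely different shape (here the mirrored curve) does not; Euclidean distance would penalize both.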
Findings
The filtered sentiment series or curves of the microfilms on the Bilibili website could be divided into four major categories. There is an apparently stable time interval for the first three types of sentiment curves, while the fourth type of sentiment curve shows a clear trend of fluctuation in general. In addition, it was found that “disputed points” or “highlights” are likely to appear at the beginning and the climax of films, resulting in significant changes in the sentiment curves. The clustering results show a significant difference in user participation, with the second type prevailing over others.
Originality/value
The authors' sentiment classification model based on BERT fine-tuning outperformed the traditional sentiment lexicon method, which provides a reference for using deep learning as well as transfer learning for danmu comment sentiment analysis. The combined BERT fine-tuning and SBD-K-shape algorithm can weaken the effect of non-regular noise and the temporal phase shift of danmu text.
Stefano Costa, Eugenio Costamagna and Paolo Di Barba
Abstract
Purpose
A novel method for modelling permanent magnets is investigated based on numerical approximations with rational functions. This study aims to introduce the AAA algorithm and other recently developed, cutting-edge mathematical tools, which provide outstandingly fast and accurate numerical computation of potentials and vector fields.
Design/methodology/approach
First, the AAA algorithm is briefly introduced along with its main variants and other advanced mathematical tools involved in the modelling. Then, the analysis of a circular Halbach array with one pole pair is carried out by means of the AAA-least squares method, focusing on vector potential and flux density in the bore and validating results by means of classic finite element software. Finally, the investigation is completed by a finite difference analysis.
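The core AAA iteration is compact enough to sketch in a few lines of numpy, following the published algorithm: greedily add the worst-approximated sample as a barycentric support point, then choose the weights as the smallest singular vector of the Loewner matrix. This is a bare-bones real-valued sketch (the paper's AAA-least squares variant and field computations go well beyond it), here approximating exp on [-1, 1]:

```python
import numpy as np

def aaa(F, Z, tol=1e-12, mmax=20):
    """Minimal AAA: greedy barycentric rational approximation of data F on Z."""
    Z, F = np.asarray(Z, float), np.asarray(F, float)
    J = list(range(len(Z)))          # sample indices not yet used as support
    z, f = [], []                    # support points and their values
    R = np.full_like(F, F.mean())    # current approximant on the grid
    for _ in range(mmax):
        j = J[int(np.argmax(np.abs(F[J] - R[J])))]    # worst residual point
        z.append(Z[j]); f.append(F[j]); J.remove(j)
        C = 1.0 / (Z[J, None] - np.array(z)[None, :])        # Cauchy matrix
        A = (F[J, None] - np.array(f)[None, :]) * C          # Loewner matrix
        w = np.linalg.svd(A)[2][-1]  # weights: smallest right singular vector
        R = F.copy()
        R[J] = (C @ (w * np.array(f))) / (C @ w)   # barycentric evaluation
        if np.max(np.abs(F[J] - R[J])) <= tol * np.max(np.abs(F)):
            break
    return np.array(z), np.array(f), w

Z = np.linspace(-1, 1, 200)
z, f, w = aaa(np.exp(Z), Z)
mask = ~np.isin(Z, z)                # evaluate away from the support points
C = 1.0 / (Z[mask, None] - z[None, :])
err = np.max(np.abs((C @ (w * f)) / (C @ w) - np.exp(Z[mask])))
print("support points:", len(z), "max error:", err)
```

Even this unoptimized sketch reaches near machine precision with roughly a dozen support points, which illustrates the "strikingly fast and accurate" behavior reported in the findings.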
Findings
AAA methods for field analysis prove to be strikingly fast and accurate. Results are in excellent agreement with those provided by the finite element model, and the very good agreement with those from finite differences suggests future improvements. They are also easy to program: the MATLAB code is less than 200 lines long. This indicates they can provide an effective tool for rapid analysis.
Research limitations/implications
AAA methods in magnetostatics are novel, but their extension to analogous physical problems seems straightforward. Because the method is meshless, it is unlikely that local non-linearities can be considered. An aspect of particular interest, left for future research, is the capability of handling inhomogeneous domains, i.e. solving general interface problems.
Originality/value
The authors use cutting-edge mathematical tools for the modelling of complex physical objects in magnetostatics.
Daria Arkhipova, Marco Montemari, Chiara Mio and Stefano Marasca
Abstract
Purpose
This paper aims to critically examine the accounting and information systems literature to understand the changes that are occurring in the management accounting profession. The changes the authors are interested in are linked to technology-driven innovations in managerial decision-making and in organizational structures. In addition, the paper highlights research gaps and opportunities for future research.
Design/methodology/approach
The authors adopted a grounded theory literature review method (Wolfswinkel et al., 2013) to achieve the study’s aims.
Findings
The authors identified four research themes that describe the changes in the management accounting profession due to technology-driven innovations: structured vs unstructured data, human vs algorithm-driven decision-making, delineated vs blurred functional boundaries and hierarchical vs platform-based organizations. The authors also identified tensions mentioned in the literature for each research theme.
Originality/value
Previous studies display a rather narrow focus on the role of digital technologies in accounting work and new competences that management accountants require in the digital era. By contrast, the authors focus on the broader technology-driven shifts in organizational processes and structures, which vastly change how accounting information is collected, processed and analyzed internally to support managerial decision-making. Hence, the paper focuses on how management accountants can adapt and evolve as their organizations transition toward a digital environment.
Armando Calabrese, Antonio D'Uffizi, Nathan Levialdi Ghiron, Luca Berloco, Elaheh Pourabbas and Nathan Proudlove
Abstract
Purpose
The primary objective of this paper is to show a systematic and methodological approach for the digitalization of critical clinical pathways (CPs) within the healthcare domain.
Design/methodology/approach
The methodology entails the integration of service design (SD) and action research (AR) methodologies, characterized by iterative phases that systematically alternate between action and reflective processes, fostering cycles of change and learning. Within this framework, stakeholders are engaged through semi-structured interviews, while the existing and envisioned processes are delineated and represented using BPMN 2.0. These methodological steps emphasize the development of an autonomous, patient-centric web application alongside the implementation of an adaptable and patient-oriented scheduling system. In addition, business process simulation is employed to measure key performance indicators of the processes and to test potential improvements. This method is implemented in the context of the CP addressing transient loss of consciousness (TLOC), within a publicly funded hospital setting.
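Business process simulation of this kind can be illustrated with a toy discrete-event model of a single pathway step. All rates below are hypothetical and the model is far simpler than a BPMN 2.0 simulation, but it shows how a bottleneck surfaces in a waiting-time KPI and how a candidate corrective measure (an extra diagnostic station) is tested before changing the real process:

```python
import heapq
import random

def simulate(n_patients, n_stations, mean_arrival=10.0, mean_service=12.0, seed=0):
    """Toy discrete-event simulation of one diagnostic step in a clinical
    pathway: patients queue for the earliest-free of n_stations; returns
    the mean waiting time (illustrative stand-in for BPMN simulation)."""
    rng = random.Random(seed)
    t = 0.0
    free = [0.0] * n_stations        # next-free times of the stations
    heapq.heapify(free)
    waits = []
    for _ in range(n_patients):
        t += rng.expovariate(1.0 / mean_arrival)   # next patient arrives
        start = max(t, heapq.heappop(free))        # earliest free station
        waits.append(start - t)
        heapq.heappush(free, start + rng.expovariate(1.0 / mean_service))
    return sum(waits) / len(waits)

# KPI before/after a hypothetical corrective measure (adding one station)
as_is = simulate(2000, 1)
to_be = simulate(2000, 2)
print(f"mean wait as-is: {as_is:.1f}, to-be: {to_be:.1f}")
```

With service slower than arrivals, the single-station "as-is" queue grows without bound, while the "to-be" variant stays stable: exactly the kind of bottleneck evidence the simulated KPIs are meant to surface.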
Findings
The methodology integrating SD and AR enables the detection of pivotal bottlenecks within diagnostic CPs and proposes optimal corrective measures to ensure uninterrupted patient care, all the while advancing the digitalization of diagnostic CP management. This study contributes to theoretical discussions by emphasizing the criticality of process optimization, the transformative potential of digitalization in healthcare and the paramount importance of user-centric design principles, and offers valuable insights into healthcare management implications.
Originality/value
The study’s relevance lies in its ability to enhance healthcare practices without necessitating disruptive and resource-intensive process overhauls. This pragmatic approach aligns with the imperative for healthcare organizations to improve their operations efficiently and cost-effectively.
Luís Jacques de Sousa, João Poças Martins, Luís Sanhudo and João Santos Baptista
Abstract
Purpose
This study aims to review recent advances towards the implementation of artificial neural network (ANN) and natural language processing (NLP) applications during the budgeting phase of the construction process. During this phase, construction companies must assess the scope of each task and map the client’s expectations to an internal database of tasks, resources and costs. Quantity surveyors carry out this assessment manually with little to no computer aid, under very austere time constraints, even though these results determine the company’s bid quality and are contractually binding.
Design/methodology/approach
This paper seeks to compile applications of machine learning (ML) and natural language processing in the architectural engineering and construction sector to find which methodologies can assist this assessment. The paper carries out a systematic literature review, following the preferred reporting items for systematic reviews and meta-analyses guidelines, to survey the main scientific contributions within the topic of text classification (TC) for budgeting in construction.
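Text classification for this mapping task can be sketched with a standard TF-IDF plus linear SVM baseline, one of the common approaches such a review covers. The bill-of-quantities lines and internal cost categories below are invented for illustration, not taken from the reviewed studies:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.svm import LinearSVC
from sklearn.pipeline import make_pipeline

# Hypothetical task descriptions mapped to internal cost categories
tasks = [
    "supply and lay concrete foundation slab",
    "reinforced concrete column casting",
    "install electrical wiring and sockets",
    "electrical distribution board installation",
    "interior wall painting two coats",
    "exterior facade painting and priming",
]
labels = ["structure", "structure", "electrical", "electrical",
          "finishes", "finishes"]

# TF-IDF features feeding a linear SVM classifier
clf = make_pipeline(TfidfVectorizer(), LinearSVC()).fit(tasks, labels)
print(clf.predict(["concrete beam casting",
                   "interior ceiling painting one coat"]))
```

Real data sets would need far more varied task wording per category, which is precisely the data-availability gap the findings point to; predictions on unseen vocabulary degrade quickly, hence the continued need for expert validation.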
Findings
This work concludes that it is necessary to develop data sets that represent the variety of tasks in construction, achieve higher accuracy algorithms, widen the scope of their application and reduce the need for expert validation of the results. Although full automation is not within reach in the short term, TC algorithms can provide helpful support tools.
Originality/value
Given the increasing interest in ML for construction and recent developments, the findings disclosed in this paper contribute to the body of knowledge, provide a more automated perspective on budgeting in construction and break ground for further implementation of text-based ML in budgeting for construction.