Search results
1 – 10 of over 11,000

Marianne Jaakkola, Soila Lemmetty, Kaija Collin, Minna Ylönen and Teuvo Antikainen
Abstract
Purpose
This study aims to increase the understanding of the starting points and presuppositions of organizational learning (OL) processes in a hospital’s surgical department based on the existing theory of OL and to make visible the practical possibilities of the theory in this context.
Design/methodology/approach
The study was conducted as a case study. The data were collected from personnel of the hospital’s surgical department and consisted of 26 thematic interviews. The data were analyzed using qualitative theory-driven content analysis.
Findings
This study found different starting points for both employee-oriented and organization-oriented learning processes that could potentially progress to different levels of the organization: from individuals to a wider group or from a large group to an individual. The starting point of employee-oriented learning processes was depicted as everyday life problems or situations or was based on the person’s interest. The starting points of organization-oriented learning processes were described as achieving or maintaining the organization’s expected skill levels, pursuing continuous development or pursuing the organization’s specific development needs. Different kinds of presuppositions were also located within the OL processes.
Originality/value
This study produced new practice-based knowledge about the starting points of OL processes and their presuppositions. In health-care organizations, learning is especially important due to intensive and complex changes, and this study provides empirical evidence on how to enhance learning.
Stratos Moschidis, Angelos Markos and Athanasios C. Thanopoulos
Abstract
Purpose
The purpose of this paper is to create an automatic interpretation of the results of the method of multiple correspondence analysis (MCA) for categorical variables, so that the nonexpert user can immediately and reliably interpret the results, which concern the categories of variables that interact strongly and determine the trends of the subject under investigation.
Design/methodology/approach
This study is a novel theoretical approach to interpreting the results of the MCA method. The classical interpretation of MCA results is based on three indicators: the projection (F) of the category points of the variables on the factorial axes, the contribution of a point to axis creation (CTR) and the correlation (COR) of a point with an axis. The synthetic use of these indicators is arduous, particularly for nonexpert users, and frequently results in misinterpretations. The current study synthesizes the aforementioned indicators so that the interpretation of the results rests on a single new indicator, much as the interpretation of the well-known principal component analysis (PCA) method for continuous variables rests on a single index.
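The three classical indicators described above can be computed directly from the MCA principal coordinates. The following is a minimal illustrative sketch of the classical CTR and COR diagnostics only, not the authors' new unified indicator; the function name and the use of raw principal coordinates with category masses are assumptions for illustration:

```python
import numpy as np

def mca_diagnostics(F, masses):
    """Classical MCA interpretation aids.

    F      : (categories x axes) principal coordinates of the categories
    masses : (categories,) category masses (weights summing to 1)
    """
    # CTR: contribution of each category to the inertia of each axis;
    # each column sums to 1, so large values flag axis-defining categories
    ctr = masses[:, None] * F**2
    ctr = ctr / ctr.sum(axis=0)
    # COR: squared cosine (quality of representation) of each category
    # on each axis; each row sums to 1 across the retained axes
    cor = F**2 / (F**2).sum(axis=1, keepdims=True)
    return ctr, cor
```

Interpreting an axis then means reading CTR down a column (which categories built the axis) and COR along a row (how well an axis represents a category); the paper's contribution is fusing these into one index.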
Findings
The new theoretical approach proposes two concepts: the interpretative axis, corresponding to the classical factorial axis, and the interpretative plane, corresponding to the factorial plane. As will be seen, these offer clear and reliable interpretative results in MCA.
Research limitations/implications
In the proposed automatic interpretation of the MCA results, the interpretative axes do not carry the actual projections of the points, as the original factorial axes do. However, this is of no concern to the simple user, who is only interested in distinguishing the categories of variables that determine the most pronounced trends of the phenomenon under examination.
Practical implications
The results of this research can have positive implications for the dissemination of MCA as a method and its use as an integrated exploratory data analysis approach.
Originality/value
Interpreting the MCA results presents difficulties for the nonexpert user and sometimes leads to misinterpretations. This interpretative difficulty persists in the MCA's other interpretative proposals. The proposed method allows for a clear and accurate interpretation of the MCA results and thus contributes to the dissemination of MCA as an integrated method of categorical data analysis and exploration.
Chongjun Wu, Dengdeng Shu, Hu Zhou and Zuchao Fu
Abstract
Purpose
In order to improve the robustness to noise in point cloud plane fitting, a combined model of improved Cook's distance (ICOOK) and weighted total least squares (WTLS) is proposed, with a modified Cook's increment that helps adaptively remove noise points that exceed the threshold.
Design/methodology/approach
This paper proposes a robust point cloud plane fitting method based on ICOOK and WTLS to improve the robustness to noise in point cloud fitting. ICOOK was set up to denoise the initial point cloud and verified with experiments. Meanwhile, the weighted total least squares (WTLS) method was adopted to fit a plane to the denoised point cloud set and obtain the plane equation.
Findings
(a) A threshold-adaptive Cook's distance method is designed that can automatically match a suitable threshold. (b) ICOOK is fused with the WTLS method, and simulation experiments and an actual fitting of the surface of a DD motor were carried out to verify the practical application. (c) The results show that the plane fitting accuracy and unit weight variance of the proposed algorithm are substantially enhanced.
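The idea of screening points by Cook's distance before a final fit can be sketched as follows. This is a simplified illustration using ordinary least squares and a fixed multiple of the mean Cook's distance as the cutoff; the paper's adaptive threshold (the modified Cook's increment) and its WTLS weighting scheme are not reproduced here:

```python
import numpy as np

def fit_plane_cooks(points, k=4.0):
    """Fit z = a*x + b*y + c, dropping points with large Cook's distance."""
    X = np.column_stack([points[:, 0], points[:, 1], np.ones(len(points))])
    z = points[:, 2]
    beta, *_ = np.linalg.lstsq(X, z, rcond=None)
    resid = z - X @ beta
    p = X.shape[1]
    mse = resid @ resid / (len(z) - p)
    # leverage = diagonal of the hat matrix X (X'X)^-1 X'
    h = np.einsum('ij,jk,ik->i', X, np.linalg.inv(X.T @ X), X)
    cooks = resid**2 / (p * mse) * h / (1.0 - h) ** 2
    # keep points below a simple cutoff: k times the mean Cook's distance
    # (an assumption standing in for the paper's adaptive increment)
    keep = cooks < k * cooks.mean()
    beta, *_ = np.linalg.lstsq(X[keep], z[keep], rcond=None)
    return beta, keep
```

Points whose Cook's distance exceeds the cutoff are treated as noise and excluded from the final fit, which is the general pattern the combined ICOOK + WTLS model follows.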
Originality/value
The existing point cloud plane fitting methods are not robust to noise, so a robust point cloud plane fitting method based on a combined model of ICOOK and WTLS is proposed.
Jie Ma, Zhiyuan Hao and Mo Hu
Abstract
Purpose
The density peak clustering algorithm (DP) is proposed to identify cluster centers by two parameters, i.e. ρ value (local density) and δ value (the distance between a point and another point with a higher ρ value). According to the center-identifying principle of the DP, the potential cluster centers should have a higher ρ value and a higher δ value than other points. However, this principle may limit the DP from identifying some categories with multi-centers or the centers in lower-density regions. In addition, the improper assignment strategy of the DP could cause a wrong assignment result for the non-center points. This paper aims to address the aforementioned issues and improve the clustering performance of the DP.
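The two DP parameters can be computed directly from pairwise distances. A minimal sketch with a cutoff-kernel density (ties in ρ are not broken, and the center-selection and assignment steps that the paper improves on are omitted):

```python
import numpy as np

def density_peaks(X, dc):
    """Return (rho, delta) for each point, as in the DP algorithm."""
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    # rho: number of neighbours closer than the cutoff dc (excluding self)
    rho = (d < dc).sum(axis=1) - 1
    # delta: distance to the nearest point of strictly higher density;
    # the globally densest point(s) get the maximum distance instead
    n = len(X)
    delta = np.empty(n)
    for i in range(n):
        higher = np.flatnonzero(rho > rho[i])
        delta[i] = d[i].max() if higher.size == 0 else d[i, higher].min()
    return rho, delta
```

Candidate centers are the points with both large ρ and large δ, which is exactly the center-identifying principle whose limitations (multi-center categories, low-density centers) the TMsDP method addresses.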
Design/methodology/approach
First, to identify as many potential cluster centers as possible, the authors construct a point-domain by introducing the pinhole imaging strategy to extend the searching range of the potential cluster centers. Second, they design different novel calculation methods for calculating the domain distance, point-domain density and domain similarity. Third, they adopt domain similarity to achieve the domain merging process and optimize the final clustering results.
Findings
The experimental results on analyzing 12 synthetic data sets and 12 real-world data sets show that two-stage density peak clustering based on multi-strategy optimization (TMsDP) outperforms the DP and other state-of-the-art algorithms.
Originality/value
The authors propose a novel DP-based clustering method, i.e. TMsDP, and transform the relationship between points into that between domains to ultimately further optimize the clustering performance of the DP.
Vicente Rodríguez, Cristina Olarte-Pascual and Manuela Saco
Abstract
Purpose
The purpose of this paper is to study the optimization of the geographical location of a network of points of sale, so that each retailer can have access to a potential geographic market. In addition, the authors study the importance of the distance variable in the commercial viability of a point of sale and a network of points of sale, analysing if the best location for each point (local optimum) is always the best location for the whole (global optimum).
Design/methodology/approach
Location-allocation models are applied using p-median algorithms and spatial competition maximization to analyse the actual journeys of 64,740 car buyers in 1240 postal codes using a geographic information system (GIS) and geomarketing techniques.
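A p-median model locates p facilities so as to minimize total demand-weighted travel distance. As an illustration of the objective only (the paper's GIS-based location-allocation algorithms are more sophisticated), a simple greedy heuristic:

```python
import numpy as np

def greedy_p_median(dist, p):
    """Greedy heuristic for the p-median problem.

    dist : (demand points x candidate sites) distance matrix
    Returns the chosen site indices and the total assignment cost.
    """
    chosen, total = [], np.inf
    for _ in range(p):
        best, best_cost = None, np.inf
        for j in range(dist.shape[1]):
            if j in chosen:
                continue
            # cost if site j is added: every demand point uses its nearest site
            cost = dist[:, chosen + [j]].min(axis=1).sum()
            if cost < best_cost:
                best, best_cost = j, cost
        chosen.append(best)
        total = best_cost
    return chosen, total
```

Note that the best single site chosen first (a local optimum) need not belong to the best set of p sites, which mirrors the local-versus-global distinction the study examines.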
Findings
The models show that the pursuit of individual objectives by each concessionaire over the collective provides poorer results for the whole network of points of sale when compared to coordinated competition. The solutions provided by the models considering geographic and marketing criteria permit a reduction in the length of journeys made by the buyers. GIS allows the optimal control of market demand coverage through the collaborative strategies of the supplying retailers, in this case, car dealerships.
Originality/value
The paper contributes to the joint research of geography and marketing from a theoretical and practical point of view. The main contribution is the use of information on actual buyer journeys for the optimal location of a network of points of sale. This research also contributes to the analysis of the relationship between the local optimum and global optimum locations of a commercial network and is a pioneering work in the application of these models to the automotive sector in the territorial area of the study.
Gustaf Kastberg Weichselberger and Cristian Lagström
Abstract
Purpose
The authors argue that the mainstream scholarly discourse on hybridity and accounting is thus far primarily interested in the use and effects of accounting “in” hybrid organizations. Consequently, the literature has to a lesser extent explored how accounting mediates hybrid settings (while also being mediated), and the role of disentanglements in such processes. In hybrid settings, objects are difficult to define, and measures and tools difficult to agree upon. However, the literature on hybrid accounting is inconclusive and indicates that accounting can potentially both stabilize and de-stabilize relations in a hybrid setting. The authors address the research question of how accounting emerges and manifests itself in a process of entangling and disentangling in a heterogeneous emerging hybrid setting.
Design/methodology/approach
The paper is based on a longitudinal qualitative case study of the implementation of social investments, a public sector calculative framework based on the logic of measuring long term and social and economic impact of prevention. Methodologically, the study was guided by actor-network theory. In total, 18 observations and 48 interviews were conducted.
Findings
The observation the authors make in their case study is that much effort was spent on both keeping things apart and tying elements together. What the authors add to the literature is an illumination of how the interplay between entanglements and disentanglements facilitated the design idea of social investments to be enacted as multiple semi-integrated and purified hybridizations. The authors describe different translation points, each representing a specific hybridization where elements were added, recombined and disentangled. Still, the translation points were not completely compartmentalized, but rather semi-integrated where associations were facilitated through active mediation, likeness and productiveness for each other.
Research limitations/implications
One limitation is the single case approach. A second limitation arises from the ANT approach to hybridity.
Practical implications
A practical implication of this paper is that in hybrid settings, the semi-integrated character may be interpreted as a strength because it allows the mobilization of heterogenous actors. However, this may also come at the cost of governability and raises further questions of managerial practices in hybrid settings.
Social implications
The paper suggests the potentially productive role of disentanglements in allowing multiple hybridizations to evolve in hybrid accounting settings.
Originality/value
The paper suggests the potentially productive role of disentanglements in allowing multiple stabilized hybridizations to evolve in hybrid accounting settings.
Abstract
The purpose of this paper is to study the coupled fixed point problem and the coupled best proximity problem for single-valued and multi-valued contraction type operators defined on cyclic representations of the space. The approach is based on fixed point results for appropriate operators generated by the initial problems.
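For reference, the standard definitions behind these two problems, stated for a single-valued operator (these are the textbook formulations, not the paper's specific cyclic-representation setting):

```latex
% Coupled fixed point of F : X \times X \to X:
F(x^*, y^*) = x^*, \qquad F(y^*, x^*) = y^*.

% Coupled best proximity point of F : A \times A \to B,
% where d(A,B) = \inf\{ d(a,b) : a \in A,\; b \in B \}:
d\bigl(x^*, F(x^*, y^*)\bigr) = d(A,B), \qquad
d\bigl(y^*, F(y^*, x^*)\bigr) = d(A,B).
```

When A = B the proximity condition d(A,B) = 0 reduces the second problem to the first.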
Zhishuo Liu, Yao Dongxin, Zhao Kuan and Wang Chun Fang
Abstract
Purpose
There is a certain error in the satellite positioning of a vehicle. This error causes positioning points to drift, which makes the vehicle trajectory deviate from the real road. This paper aims to solve this problem.
Design/methodology/approach
The key technology for solving the problem is map matching (MM). At a low sampling frequency, the distance between adjacent trajectory points is large, which weakens the correlation between the points and makes MM more difficult. In this paper, an MM algorithm based on priority rules is designed for the characteristics of vehicle trajectories at low sampling frequencies.
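The core geometric step of such an MM algorithm is projecting each GPS point onto candidate road segments and scoring the candidates. A minimal sketch in which the only "priority rule" is a continuity bonus for the previously matched road; this rule and the bonus value are assumptions for illustration, not the paper's actual rule set:

```python
import numpy as np

def point_to_segment(p, a, b):
    """Distance from point p to segment ab, plus the projected point."""
    ab = b - a
    t = np.clip(np.dot(p - a, ab) / np.dot(ab, ab), 0.0, 1.0)
    proj = a + t * ab
    return np.linalg.norm(p - proj), proj

def match_point(p, segments, prev_idx=None, bonus=0.5):
    """Pick the road segment for p, favouring the previously matched road."""
    best_idx, best_score = None, np.inf
    for idx, (a, b) in enumerate(segments):
        dist, _ = point_to_segment(p, a, b)
        # continuity priority: discount the score of the previous road
        score = dist - (bonus if idx == prev_idx else 0.0)
        if score < best_score:
            best_idx, best_score = idx, score
    return best_idx
```

With low-frequency data, such rules matter because the nearest segment alone is often the wrong road for a drifted point.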
Findings
The experimental results show that the priority-rule-based MM algorithm can effectively match trajectory data collected at a low sampling frequency to the actual road. The matching accuracy is better than that of other similar algorithms, and the processing speed reaches 73 per second.
Research limitations/implications
In the algorithm verification in this paper, although the algorithm design and experimental verification take into account the diversity of GPS data sampling frequencies, the experimental data used are still from a single source.
Originality/value
Based on the GPS trajectory data of the Ministry of Transport, the experimental results show that the priority-rule-based algorithm achieves higher accuracy. Its accuracy exceeds 98.1 per cent, which is better than that of other similar algorithms.
Latifah Abdol Latif, Ramli Bahroom and Mohamad Afzhan Khan Mohamad Khalil
Abstract
Purpose
The purpose of this paper is to identify the “selling points” for Open University Malaysia (OUM) to be used in its marketing activities and the “critical points” that OUM should focus on for further improvements in providing its services to its students. These selling and critical points are derived from the analysis of the importance and satisfaction data collected from OUM’s postgraduate students.
Design/methodology/approach
This study employs a two-dimensional Importance-Satisfaction Survey consisting of 47 items categorized under eight dimensions. Items are phrased as positive statements, and students are asked to indicate how important each is to them on a seven-point Likert scale ranging from not at all important (1) to very important (7). They are then asked to rate their level of satisfaction on the same scale, from very dissatisfied (1) to very satisfied (7). A total of 709 postgraduate students' responses were used in this study. A multiple regression analysis was conducted to explain the relationship between the dependent variable, overall satisfaction, and the eight independent variables. The "selling points" and "critical points" are determined by combining quadrant and gap analyses: "selling point" items are high-importance-high-satisfaction (HIHS) items with relatively small gap scores, while "critical points" are those in the high-importance-low-satisfaction and HIHS quadrants with relatively large gap scores.
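The combined quadrant-and-gap logic can be sketched as follows. The cut-offs used here (the scale means as quadrant boundaries, a fixed gap cut-off of 0.5) are illustrative assumptions; the study's exact thresholds are not stated in the abstract:

```python
import numpy as np

def classify_items(importance, satisfaction, gap_cut=0.5):
    """Label each item a selling point, critical point, or low priority."""
    imp_mid, sat_mid = importance.mean(), satisfaction.mean()
    gap = importance - satisfaction
    labels = []
    for i, s, g in zip(importance, satisfaction, gap):
        if i >= imp_mid and s >= sat_mid:
            # HIHS quadrant: small gap -> selling point, large gap -> critical
            labels.append("selling point" if g < gap_cut else "critical point")
        elif i >= imp_mid:
            # high importance, low satisfaction -> critical point
            labels.append("critical point")
        else:
            labels.append("low priority")
    return labels, gap
```

The gap score refines the plain quadrant analysis: a high-importance item can sit in the "satisfied" quadrant yet still warrant attention if its importance-satisfaction gap is large.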
Findings
The overall results of the Importance-Satisfaction Survey showed that the postgraduate students are generally satisfied with OUM’s programmes and services. The multiple regression analysis of all dimensions against overall satisfaction as the dependent variable showed that the five dimensions of facilitator, curriculum, faculty, support services and learning centre account for 75.7 per cent of the variation in overall satisfaction. The selling points include: the learning management system (MyVLE), online registration, course contents, modules and facilitators. The critical points include those related to facilitator interaction and feedback, students’ sense of connectedness with the faculty staff, timely responses to enquiries and complaints and accessibility to digital library and learning centre staff.
Practical implications
Importance-Satisfaction Surveys can be used to help an institution to identify the services and facilities that can be marketed and also those that need to be improved in order to better meet its students’ expectations.
Originality/value
While many similar studies have been conducted elsewhere, this study identified the "selling points" and "critical points" that are unique to OUM. In addition, most previous studies focused on conventional institutions and were carried out in many different countries with differing learning environments and cultures.
Linh Truong-Hong, Roderik Lindenbergh and Thu Anh Nguyen
Abstract
Purpose
Terrestrial laser scanning (TLS) point clouds have been widely used in deformation measurement for structures. However, the reliability and accuracy of the resulting deformation estimation strongly depend on the quality of each step of the workflow, which has not been fully addressed. This study aims to give insight into the errors of these steps, and its results can serve as guidelines for the practical community to either develop a new workflow or refine an existing one for deformation estimation based on TLS point clouds. The main contributions of the paper are: investigating how point cloud registration error affects the resulting deformation estimation; identifying an appropriate segmentation method for extracting the data points of a deformed surface; investigating a methodology to determine an un-deformed or reference surface for estimating deformation; and proposing a methodology to minimize the impact of outliers, noisy data and/or mixed pixels on deformation estimation.
Design/methodology/approach
In practice, the quality of the point clouds and of the surface extraction strongly impacts the resulting deformation estimation, and this uncertainty can lead to an incorrect decision about the state of the structure. To gain more comprehensive insight into those impacts, this study addresses four issues: data errors due to registration from multiple scanning stations (Issue 1), methods used to extract point clouds of structure surfaces (Issue 2), selection of the reference surface Sref against which to measure deformation (Issue 3), and the presence of outliers and/or mixed pixels (Issue 4). The investigation is demonstrated by estimating the deformation of a bridge abutment, a building and an oil storage tank.
Findings
The study shows that both random sample consensus (RANSAC) and region-growing-based methods [cell-based/voxel-based region growing (CRG/VRG)] can extract the data points of surfaces, but RANSAC is only applicable to a primary primitive surface (e.g. a plane in this study) subjected to a small deformation (case studies 2 and 3) and cannot eliminate mixed pixels. CRG and VRG, on the other hand, are suitable methods for deformed, free-form surfaces. In addition, in practice, a reference surface of a structure is mostly not available. Using a plane fitted to the point cloud of the current surface would yield unrealistic and inaccurate deformation, because outlier data points and data points of damaged areas affect the accuracy of the fitted plane. This study therefore recommends using a reference surface determined from a design concept/specification. A smoothing method with a spatial interval can effectively minimize the negative impact of outliers, noisy data and/or mixed pixels on deformation estimation.
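For concreteness, the RANSAC plane extraction discussed above can be sketched as below. This is a plain three-point-sample RANSAC with illustrative parameter values; the CRG/VRG region-growing methods and the smoothing step are not reproduced:

```python
import numpy as np

def ransac_plane(points, n_iter=200, tol=0.05, seed=None):
    """Return a boolean mask of inliers to the best plane found."""
    rng = np.random.default_rng(seed)
    best_inliers = np.zeros(len(points), dtype=bool)
    for _ in range(n_iter):
        # fit a candidate plane through 3 random points
        sample = points[rng.choice(len(points), 3, replace=False)]
        normal = np.cross(sample[1] - sample[0], sample[2] - sample[0])
        norm = np.linalg.norm(normal)
        if norm < 1e-12:          # degenerate (collinear) sample
            continue
        normal /= norm
        d = -normal @ sample[0]
        # keep the plane with the most points within the distance tolerance
        inliers = np.abs(points @ normal + d) < tol
        if inliers.sum() > best_inliers.sum():
            best_inliers = inliers
    return best_inliers
```

The tolerance `tol` plays the role the study highlights: too large a value admits mixed pixels into the extracted surface, which RANSAC by itself cannot eliminate.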
Research limitations/implications
Due to difficulty in logistics, an independent measurement cannot be established to assess the deformation accuracy based on TLS data point cloud in the case studies of this research. However, common laser scanners using the time-of-flight or phase-shift principle provide point clouds with accuracy in the order of 1–6 mm, while the point clouds of triangulation scanners have sub-millimetre accuracy.
Practical implications
This study gives insight into the errors of these steps, and its results can serve as guidelines for the practical community to either develop a new workflow or refine an existing one for deformation estimation based on TLS point clouds.
Social implications
The results of this study would provide guidelines for a practical community to either develop a new workflow or refine an existing one of deformation estimation based on TLS point clouds. A low-cost method can be applied for deformation analysis of the structure.
Originality/value
Although a large number of studies have used laser scanning to measure structure deformation in the last two decades, the methods applied mainly measured the change between two states (or epochs) of the structure surface and focused on quantifying deformation based on TLS point clouds. Those studies proved that a laser scanner can be an alternative instrument for acquiring spatial information for deformation monitoring. However, there are still challenges in establishing an appropriate procedure for collecting high-quality point clouds and in developing methods for interpreting them to obtain reliable and accurate deformation when uncertainty, in both data quality and reference information, is present. Therefore, this study demonstrates the impact on deformation estimation of data quality in terms of point cloud registration error, of the methods selected for extracting point clouds of surfaces, of the identification of reference information, and of outliers, noisy data and/or mixed pixels.