Search results

1 – 10 of over 14,000
Article
Publication date: 17 May 2022

Qiucheng Liu

In order to analyze the text complexity of Chinese and foreign academic English writings, the artificial neural network (ANN) under deep learning (DL) is applied to the study of…

Abstract

Purpose

The study applies an artificial neural network (ANN) based on deep learning (DL) to analyze the text complexity of Chinese and foreign academic English writings.

Design/methodology/approach

Firstly, the research status and existing problems of text complexity analysis are reviewed from a DL perspective. Secondly, the text complexity of Chinese and foreign academic English writings is analyzed with the Back Propagation Neural Network (BPNN) algorithm, and a BPNN-based syntactic complexity evaluation system is established. Thirdly, MATLAB 2013b is used for simulation analysis of the model; the proposed BPNN algorithm is compared with other classical algorithms, and the weight of each index and the model's training effect are further analyzed with statistical methods. Finally, the L2 Syntactic Complexity Analyzer (L2SCA) is used to calculate the syntactic complexity of the two corpora, and the Mann–Whitney U test is used to compare the syntactic complexity of Chinese English learners and native English speakers.
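
The abstract describes the corpus comparison being run on L2SCA output with a Mann–Whitney U test, but no code is given. Purely as a hedged illustration of that comparison step, the Python sketch below applies SciPy's Mann–Whitney U test to invented per-text values of one syntactic complexity index (mean length of T-unit); the numbers, group sizes and index choice are hypothetical, not the study's data.

```python
# Hypothetical illustration: comparing one L2SCA-style syntactic complexity
# index between two writer groups with a Mann-Whitney U test.
# All values below are invented for demonstration purposes.
from scipy.stats import mannwhitneyu

# Per-text "mean length of T-unit" scores for each corpus (illustrative numbers)
chinese_learners = [11.2, 12.5, 10.8, 13.1, 12.0, 11.7, 10.9, 12.8]
native_speakers = [14.3, 13.9, 15.2, 14.8, 13.5, 15.0, 14.1, 13.8]

u_stat, p_value = mannwhitneyu(chinese_learners, native_speakers,
                               alternative="two-sided")
print(f"U = {u_stat:.1f}, p = {p_value:.4f}")  # small p suggests the groups differ
```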

Findings

The experimental results show that, compared with a shallow neural network, the deep neural network has more hidden layers, richer features and better feature-extraction performance. The BPNN algorithm performs well during training, with actual output values very close to the expected values; analysis of the sample test error shows that the evaluation error of the BPNN algorithm is less than 1.8%, indicating high accuracy. However, there are significant differences in grammatical complexity among students with different levels of English writing proficiency, and some measures cannot effectively reflect the types and characteristics of written language or may even be negatively related to writing quality. The research also finds that measures of syntactic complexity are more sensitive to writing proficiency. Therefore, the BPNN algorithm can effectively analyze the text complexity of academic English writing.

Originality/value

The results provide a reference for improving the evaluation system for text complexity in academic paper writing.

Details

Library Hi Tech, vol. 41 no. 5
Type: Research Article
ISSN: 0737-8831

Article
Publication date: 6 March 2024

Lillian Do Nascimento Gambi and Koenraad Debackere

The purpose of this paper is to examine the evolution of the literature on technology transfer and culture, identifying the main contents of the current body of knowledge…

Abstract

Purpose

The purpose of this paper is to examine the evolution of the literature on technology transfer (TT) and culture and to identify the main contents of the current body of knowledge encompassing the two. Based on bibliometric and multivariate statistical analyses of the relevant literature, the paper contributes to a better understanding of the relationship between TT and culture.

Design/methodology/approach

Data for this study were collected from the Web of Science (WoS) Core Collection database. Based on a bibliometric analysis and an in-depth empirical review of major TT subjects, supported by multivariate statistical analyses, over 200 articles were systematically reviewed. The use of these methods decreases bias, since it adds rigor to the subjective evaluation of the relevant literature base.
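
The abstract does not specify which multivariate procedures were used, so the following is only a generic sketch of the kind of co-word clustering a bibliometric review of this type might run. The records, keywords and cluster count are invented, and scikit-learn/SciPy stand in for whatever tooling the authors actually used.

```python
# Generic sketch (not the authors' pipeline): cluster articles by their
# author keywords, a common multivariate step in bibliometric reviews.
from sklearn.feature_extraction.text import CountVectorizer
from scipy.cluster.hierarchy import fcluster, linkage

# Hypothetical keyword fields from Web of Science-style records
records = [
    "technology transfer; organizational culture; university",
    "entrepreneurial culture; spin-off; technology transfer",
    "licensing; national culture; innovation",
    "technology transfer office; openness; collaboration",
]

# Binary keyword-occurrence matrix (rows = articles, columns = keywords)
vec = CountVectorizer(tokenizer=lambda s: [k.strip() for k in s.split(";")],
                      binary=True, token_pattern=None)
X = vec.fit_transform(records).toarray()

# Agglomerative (Ward) clustering of the occurrence profiles
Z = linkage(X, method="ward")
print(fcluster(Z, t=2, criterion="maxclust"))  # cluster label per article
```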

Findings

The exploratory analysis of the articles shows that, first, culture is an important topic for TT in the literature; second, the publication data demonstrate great dynamism in the different contexts in which culture is covered in the TT literature; and third, in the last couple of years, interest in stimulating a TT culture in the context of universities has grown continuously.

Research limitations/implications

This study focuses on culture in the context of TT and identifies the main contents of the body of knowledge in the area. Based on this first insight, obtained through more detailed bibliometric and multivariate analyses, it is now important to develop and validate a theory on TT culture, emphasizing the dimensions of organizational culture, entrepreneurial culture and a culture of openness that fosters economic and societal spillovers, and to link those dimensions to the performance of TT activities.

Practical implications

From the practical point of view, managers in companies and universities should be aware of the importance of identifying those dimensions of culture that contribute most to the success of their TT activities.

Originality/value

Despite several literature reviews on the TT topic, no studies focusing specifically on culture in the context of TT have been developed. Therefore, given the multifaceted nature of the research field, this study aims to expand and to deepen the analysis of the TT literature by focusing on culture as an important and commonly cited element influencing TT performance.

Details

Benchmarking: An International Journal, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 1463-5771

Article
Publication date: 28 August 2023

Haiyan Xie, Ying Hong, Mengyang Xin, Ioannis Brilakis and Owen Shi

The purpose of this study is to improve communication success through barrier identification and analysis so that the identified barriers can help project teams establish…

Abstract

Purpose

The purpose of this study is to improve communication success through barrier identification and analysis so that the identified barriers can help project teams establish effective information-exchange strategies.

Design/methodology/approach

Recent publications on construction communication related to time management are reviewed. Then, semi-structured interviews are performed with both questionnaires and audio recordings (n1 = 18). Next, the collected data are analyzed using statistical measures on the questionnaire survey and qualitative coding analysis on the text transcripts of the audio recordings. In particular, the identified barriers are substantiated using a scientometric approach based on published articles (2011–2020, n2 = 52,915) to identify purposeful information-sharing solutions in construction time management. Furthermore, the intervention strategies from the top 10 most-cited articles are analyzed and validated through comparisons with the results from construction surveys and relevant studies.

Findings

Based on the discussed communication difficulties, five main barriers were identified during time-cost risk management: probability and statistical concepts, availability of data from external resources, details of team member experiences, graphics (and graphical presentation skills), and spatial and temporal (a.k.a. 4D) simulation skills. For the improvement of communication skills and presentation quality regarding probability and statistical concepts, project teams should emphasize context awareness, case studies and group discussions. Details of communication techniques can be adjusted based on the backgrounds, experiences and expectations of team members.

Research limitations/implications

The dataset n1 has both size and duration limits because of the availability of the invited industry professionals. The dataset n2 covers the literature from 2011 to 2020; studies published before this period and unpublished studies are not included.

Practical implications

A thorough comprehension of communication barriers can help project teams develop speaking, writing and analytical thinking skills that will enable the teams to better deliver ideas, thoughts and meanings. Additionally, the established discussion on barrier-removal strategies may enhance time management effectiveness, reduce project delays, avoid confusion and misunderstanding and save rework costs.

Social implications

This research calls for the awareness of communication barriers in construction project execution and team collaboration. The identified barriers and the established solutions enrich the approaches of construction companies to share information with communities and society.

Originality/value

To the authors' knowledge, this is the first identification model for communication barriers in the time management of the construction industry. The influencing factors and the countermeasures of communication difficulties highlighted by the research were not examined systematically and holistically in previous studies. The findings provide a new approach to facilitate the development of powerful communication strategies and to improve project execution.

Details

Engineering, Construction and Architectural Management, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 0969-9988

Book part
Publication date: 13 May 2024

Fisnik Morina, Albulena Syla and Sadri Alija

Purpose: This study analyses how investments and specific financial factors affect the financial performance of businesses in Kosovo. Exploring the relationship between…

Abstract

Purpose: This study analyses how investments and specific financial factors affect the financial performance of businesses in Kosovo. Exploring the relationship between investments and financial performance and their impact on performance volatility, the study assesses performance using return on assets (ROA) and return on equity (ROE).

Methodology: Quantitative methods are applied to secondary data from the audited financial statements of manufacturing and commercial enterprises in Kosovo, covering a three-year period (2019–2021) and involving 40 enterprises with 120 observations. Statistical techniques including descriptive statistics, correlation analysis, linear regression, Hausman–Taylor regression, fixed-effects and random-effects models and the generalised estimating equations (GEE) model are applied. The study also uses ARCH–GARCH analysis to assess the relationship between investments and performance volatility.
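
The chapter reports ARCH–GARCH analysis of performance volatility but does not state its software. As a minimal sketch only, the Python `arch` package below fits a GARCH(1,1) model to a synthetic series standing in for period-to-period changes in ROA; the data, series length and model order are assumptions for illustration.

```python
# Minimal sketch, not the authors' estimation: fit a GARCH(1,1) model to a
# synthetic "ROA change" series to illustrate how performance volatility
# can be modelled. All data here are randomly generated.
import numpy as np
from arch import arch_model

rng = np.random.default_rng(0)
roa_changes = rng.normal(0.0, 1.5, size=120)  # invented series, 120 observations

model = arch_model(roa_changes, mean="Constant", vol="Garch", p=1, q=1)
result = model.fit(disp="off")
print(result.summary())  # omega, alpha[1] and beta[1] describe the volatility dynamics
```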

Findings: Investments positively impact the financial performance of Kosova businesses and significantly reduce performance volatility. Long-term liabilities, retained earnings, and short-term liabilities also play a role in reducing asset return volatility, while cash flow from financial activities increases it. Investments, cash flows from financial activities, long-term liabilities, short-term liabilities, retained earnings, and solvency affect equity return volatility.

Practical Implications: The study sheds light on how investments and financial factors influence the financial performance and volatility of Kosova businesses. Policymakers can use these insights to create policies that foster the development of commercial and manufacturing enterprises, given their importance in Kosovo’s economy.

Significance: This research provides valuable insights for business managers to enhance investment strategies and improve financial performance. Policymakers can rely on this academic study to enhance the economic environment and promote the growth of businesses in Kosovo.

Details

VUCA and Other Analytics in Business Resilience, Part A
Type: Book
ISBN: 978-1-83753-902-4

Article
Publication date: 24 November 2023

Kristina K. Lindsey-Hall, Eric J. Michel, Sven Kepes, Ji (Miracle) Qi, Laurence G. Weinzimmer, Anthony R. Wheeler and Matthew R. Leon

The purpose of this manuscript is to provide a step-by-step primer on systematic and meta-analytic reviews across the service field, to systematically analyze the quality of…

Abstract

Purpose

The purpose of this manuscript is to provide a step-by-step primer on systematic and meta-analytic reviews across the service field, to systematically analyze the quality of meta-analytic reporting in the service domain, to provide detailed protocols authors may follow when conducting and reporting these analyses and to offer recommendations for future service meta-analyses.

Design/methodology/approach

Eligible frontline service-related meta-analyses published through May 2021 were identified for inclusion (k = 33) through a systematic search of Academic Search Complete, PsycINFO, Business Source Complete, Web of Science, Google Scholar and specific service journals using search terms related to service and meta-analyses.
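
As a hedged, self-contained illustration of the core pooling step that the reviewed service meta-analyses report (and that the primer's protocols cover), the sketch below applies DerSimonian–Laird random-effects pooling to invented correlation effect sizes; none of the numbers comes from the 33 included meta-analyses.

```python
# Illustrative only: DerSimonian-Laird random-effects pooling of correlation
# effect sizes (Fisher z-transformed). Effect sizes and sample sizes are invented.
import numpy as np

r = np.array([0.32, 0.45, 0.28, 0.51, 0.38])  # study correlations (hypothetical)
n = np.array([120, 85, 200, 60, 150])         # study sample sizes (hypothetical)

z = np.arctanh(r)        # Fisher z transform
v = 1.0 / (n - 3)        # within-study variance of z
w = 1.0 / v              # fixed-effect weights

# Cochran's Q and the DerSimonian-Laird between-study variance tau^2
z_fixed = np.sum(w * z) / np.sum(w)
Q = np.sum(w * (z - z_fixed) ** 2)
c = np.sum(w) - np.sum(w**2) / np.sum(w)
tau2 = max(0.0, (Q - (len(z) - 1)) / c)

# Random-effects pooled estimate, transformed back to a correlation
w_re = 1.0 / (v + tau2)
z_pooled = np.sum(w_re * z) / np.sum(w_re)
print("pooled r =", round(float(np.tanh(z_pooled)), 3))
```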

Findings

An analysis of the existing meta-analyses within the service field revealed that, although they often provide high-quality results, the quality of their reporting can be improved in several ways to enhance the replicability of published meta-analyses in the service domain.

Practical implications

This research employs a question-and-answer approach to provide a substantive guide for both properly conducting and properly reporting high-quality meta-analytic research in the service field for scholars at various levels of experience.

Originality/value

This work aggregates best practices from diverse disciplines to create a comprehensive checklist of protocols for conducting and reporting high-quality service meta-analyses while providing additional resources for further exploration.

Article
Publication date: 29 September 2023

Beatriz Campos Fialho, Ricardo Codinhoto and Márcio Minto Fabricio

Facilities management (FM) plays a key role in the performance of businesses to ensure the comfort of users and the sustainable use of natural resources over operation and…

Abstract

Purpose

Facilities management (FM) plays a key role in business performance by ensuring the comfort of users and the sustainable use of natural resources throughout operation and maintenance. Nevertheless, reactive maintenance (RM) services are characterised by delays, waste and difficulties in prioritising services and identifying the root causes of failures, mostly caused by inefficient asset information and communication management. While linking building information modelling and the Internet of Things through a digital twin has demonstrated potential for improving FM practices, there is a lack of evidence regarding the process requirements involved in their implementation. This paper aims to address this challenge, as it is the first to statistically characterise RM services and processes to identify the most critical RM problems and scenarios for digital twin implementation. The statistical data analytics approach also constitutes a novel, practical way of holistically analysing RM occurrences.

Design/methodology/approach

The research strategy was based on multiple case studies, which adopted university campuses as the objects of investigation. A detailed literature review of work to date and document analysis assisted in generating data on the FM sector and RM services, and qualitative and statistical analyses were applied to approximately 300,000 individual work requests.
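
The paper's actual request schema is not given in the abstract, so the following is only a sketch of the descriptive step it describes, with hypothetical column names and a handful of made-up records: counting reactive-maintenance work requests per service category and priority level with pandas.

```python
# Sketch with hypothetical column names and invented records: summarise
# work requests by service category and priority level.
import pandas as pd

requests = pd.DataFrame({
    "building": ["Library", "Accommodation A", "Office B", "Accommodation A"],
    "category": ["Electrical-Lighting", "Mechanical-Plumbing",
                 "Building-Door", "Mechanical-Plumbing"],
    "priority": ["Low", "High", "Low", "Low"],
})

# Requests per category, split by priority level
summary = (requests
           .groupby(["category", "priority"])
           .size()
           .unstack(fill_value=0))
print(summary)
```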

Findings

The work provides substantial evidence of a series of patterns across both cases that were not evidenced prior to this study: a concentration of requests within main campuses; a balanced distribution of requests per building, mechanical and electrical service categories; a predominance of low priority level services; a low rate of compliance in attending priority services; a cumulative impact on the overall picture of five problem subcategories (i.e. Building-Door, Mechanical-Plumbing, Electrical-Lighting, Mechanical-Heat/Cool/Ventilation and Electrical-Power); a predominance of problems in student accommodation facilities, circulations and offices; and a concentration of requests related to unlisted buildings. These new patterns form the basis for business cases where maintenance services and FM sectors can benefit from digital twins. It also provides a new methodological approach for assessing the impact of RM on businesses.

Practical implications

The findings provide new insights for owners and FM staff in determining the criticality of RM services, justifying investments and planning the digital transformation of services for a smarter provision.

Originality/value

This study represents a unique approach to FM and provides detailed evidence to identify novel RM patterns of critical service provision and activities within organisations for efficient digitalised data management over a building’s lifecycle.

Details

Facilities, vol. 42 no. 3/4
Type: Research Article
ISSN: 0263-2772

Article
Publication date: 29 December 2022

Martha Zuluaga Quintero, Buddhike Sri Harsha Indrasena, Lisa Fox, Prakash Subedi and Jill Aylott

This paper aims to report on research undertaken in a National Health Service (NHS) emergency department in the north of England, UK, to identify which patients, with which…

Abstract

Purpose

This paper aims to report on research undertaken in a National Health Service (NHS) emergency department in the north of England, UK, to identify which patients, with which clinical conditions, are returning to the emergency department with an unscheduled return visit (URV) within seven days. This paper analyses the data in relation to the newly introduced Integrated Care Boards (ICBs). The continued increase in demand for emergency care services requires a new type of “upstreamist” health system leader from the emergency department, who can report on URV data to influence the development of integrated care services and so reduce further demand on the emergency department.

Design/methodology/approach

Patients were identified through the emergency department Symphony database and included patients with at least one return visit to the emergency department (ED) within seven days. A sample of 1,000 index visits between 1 January and 31 October 2019 was chosen by a simple random sampling technique in Excel. Of these 1,000 entries, only 761 had complete data on all variables. A statistical analysis was undertaken using Poisson regression in NCSS statistical software. A review of the literature on integrated health care and its relationship with health systems leadership was undertaken to conceptualise a new type of “upstreamist” system leadership to advance the integration of health care.
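
The study ran its Poisson regression in NCSS; purely as a language-neutral illustration of the same kind of model, the sketch below fits a Poisson regression with statsmodels on a tiny invented extract. The variable names (urv_count, gynae, eye_problem, age) and all values are hypothetical, not the hospital's data.

```python
# Hedged illustration: Poisson regression of early unscheduled return visits
# on presenting-condition indicators. Data and variable names are invented.
import pandas as pd
import statsmodels.formula.api as smf

visits = pd.DataFrame({
    "urv_count":   [0, 1, 0, 2, 1, 0, 0, 1, 0, 1, 0, 0],
    "gynae":       [0, 1, 0, 1, 0, 0, 0, 1, 0, 0, 0, 0],
    "eye_problem": [0, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0],
    "age":         [34, 27, 61, 30, 45, 52, 19, 38, 44, 29, 70, 23],
})

model = smf.poisson("urv_count ~ gynae + eye_problem + age", data=visits).fit(disp=0)
print(model.summary())  # exponentiated coefficients give rate ratios
```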

Findings

Of all 83 variables included in the statistical analysis, only 12 were statistically significant in the multi-variable regression. The most statistically important factor was presentation with gynaecological disorders, whose rate ratio (RR) for an early URV was 43%, holding the other variables constant. Eye problems were also highly statistically significant (RR = 41%); however, clinically the two accounted for just 1% and 2% of URVs, respectively. The URV data, combined with “upstreamist” system leadership from the ED, are required as a critical mechanism to identify gaps and inform a rationale for integrated care models that lessen further demand on emergency services in the ED.

Research limitations/implications

At a time of significant pressure for emergency departments, there needs to be a move towards more collaborative health system leadership with support from statistical analyses of the URV rate, which will continue to provide critical information to influence the development of integrated health and care services. This study identifies areas for further research, particularly for mixed methods studies to ascertain why patients with specific complaints return to the emergency department and if alternative pathways could be developed. The success of the Esther model in Sweden gives hope that patient-centred service development could create meaningful integrated health and care services.

Practical implications

This research was a large-scale quantitative study drawing upon data from one hospital in the UK to identify risk factors for URV. This quality metric can generate important data to inform the development of integrated health and care services. Further research is required to review URV data for the whole of the NHS and with the new Integrated Health and Care Boards, there is a new impetus to push for this metric to provide robust data to prioritise the need to develop integrated services where there are gaps.

Originality/value

To the best of the authors’ knowledge, this is the first large-scale study of its kind to generate whole-hospital data on risk factors for URVs to the emergency department. The URV is an important global quality metric and will continue to generate important data on patients with specific complaints who return to the emergency department. This is a critical time for the NHS and, at the same time, an important opportunity to develop “Esther” patient-centred approaches in the design of integrated health and care services.

Details

Leadership in Health Services, vol. 36 no. 3
Type: Research Article
ISSN: 1751-1879

Article
Publication date: 29 February 2024

Wenque Liu, Albert P.C. Chan, Man Wai Chan, Amos Darko and Goodenough D. Oppong

The successful implementation of hospital projects (HPs) tends to confront sundry challenges in the planning and construction (P&C) phases due to their complexity and…

Abstract

Purpose

The successful implementation of hospital projects (HPs) tends to confront sundry challenges in the planning and construction (P&C) phases due to their complexity and particularity. Employing key performance indicators (KPIs) facilitates the monitoring of HPs to advance their successful delivery. This study aims to comprehensively investigate the KPIs for hospital planning and construction (HPC).

Design/methodology/approach

The KPIs for HPC were identified through a systematic review. A comprehensive assessment of these KPIs was then performed using a meta-analysis method. In this process, basic statistical analysis, subgroup analysis, sensitivity analysis and publication bias analysis were performed.
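
The abstract's later discussion of heterogeneity can be made concrete with a short, hedged sketch: Cochran's Q and the I² statistic computed on invented per-study effect sizes and variances (not the study's 27 KPIs), showing how "high heterogeneity" is typically quantified.

```python
# Illustration only: quantify between-study heterogeneity with Cochran's Q
# and I^2. Effect sizes and variances below are invented.
import numpy as np

effects = np.array([0.62, 0.41, 0.78, 0.55, 0.70])        # hypothetical per-study effects
variances = np.array([0.010, 0.015, 0.008, 0.020, 0.012])  # hypothetical variances

w = 1.0 / variances
pooled = np.sum(w * effects) / np.sum(w)
Q = np.sum(w * (effects - pooled) ** 2)
df = len(effects) - 1
I2 = max(0.0, (Q - df) / Q) * 100  # share of variability attributable to heterogeneity

print(f"Q = {Q:.2f}, I^2 = {I2:.1f}%")  # I^2 above ~75% is usually read as high heterogeneity
```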

Findings

Results indicate that all 27 KPIs identified from the literature are significant for executing HPs in the P&C phases. Some unconventional performance indicators, such as “Project monitoring effectiveness” and “Industry innovation and synergy,” are also crucial for implementing HPs, as their high significance is reflected in this study. Although the findings of a meta-analysis are more trustworthy than those of individual studies, high heterogeneity still exists in the findings, highlighting the inherent uncertainty in the construction industry. Hence, this study applied subgroup analysis to explore the underlying factors causing the high level of heterogeneity and used sensitivity analysis to assess the robustness of the findings.

Originality/value

There is no consensus among the prior studies on KPIs for HPC specifically and their degree of significance. Additionally, few reviews in this field have focused on the reliability of the results. This study comprehensively assesses the KPIs for HPC and explores the variability and robustness of the results, which provides a multi-dimensional perspective for practitioners and the research community to investigate the performance of HPs during the P&C stages.

Details

Engineering, Construction and Architectural Management, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 0969-9988

Article
Publication date: 19 January 2024

Premaratne Samaranayake, Michael W. McLean and Samanthi Kumari Weerabahu

The application of lean and quality improvement methods is very common in process improvement projects at organisational levels. The purpose of this research is to assess the…

Abstract

Purpose

The application of lean and quality improvement methods is very common in process improvement projects at organisational levels. The purpose of this research is to assess the adoption of Lean Six Sigma™ approaches for addressing a complex process-related issue in the coal industry.

Design/methodology/approach

The sticky coal problem was investigated from the perspective of process-related issues. Issues were addressed using a blended Lean value stream of supply chain interfaces and waste minimisation through the Six Sigma™ DMAIC problem-solving approach, taking into consideration cross-organisational processes.

Findings

It was found that the tendency to “solve the problem” at the receiving location without communicating with the upstream operations was, and still is, a common practice that led to the main downstream issues. The application of the Six Sigma™ DMAIC approach helped to address this broader problem. Overall operations improved significantly, showing a reduction in sticky coal/wagon hang-ups at the downstream coal handling terminal.

Research limitations/implications

The Lean Six Sigma approaches were adopted using DMAIC across cross-organisational supply chain processes. However, blending Lean and Six Sigma methods needs to be empirically tested across other sectors.

Practical implications

The proposed methodology, using a framework of Lean Six Sigma approaches, could be used to guide practitioners in addressing similar complex and recurring issues in the manufacturing sector.

Originality/value

This research introduces a novel approach to process analysis, selection and contextualised improvement using a combination of Lean Six Sigma™ tools, techniques and methodologies sustained within a supply chain with certified ISO 9001 quality management systems.

Details

International Journal of Quality & Reliability Management, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 0265-671X

Book part
Publication date: 15 May 2023

Birol Yıldız and Şafak Ağdeniz

Purpose: The main aim of the study is to provide a tool for non-financial information in decision-making. We analysed the non-financial data in the annual reports in order to show…

Abstract

Purpose: The main aim of the study is to provide a tool for using non-financial information in decision-making. We analysed the non-financial data in annual reports to show how this information can be used in financial decision processes.

Need for the Study: Main financial reports such as balance sheets and income statements can be analysed with statistical methods. However, an expanded financial reporting framework requires new analysis methods because the data are unstructured and large. The study offers a solution to the analysis problem that comes with non-financial reporting, which is an essential communication tool in corporate reporting.

Methodology: Text mining analysis of annual reports is conducted using the R software. To simplify the problem, we try to predict companies’ corporate governance qualifications using text mining. The K-Nearest Neighbour, Naive Bayes and Decision Tree machine learning algorithms were used.
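
The chapter's analysis was carried out in R; the short Python/scikit-learn sketch below is offered only as an equivalent illustration of the idea, classifying a few invented annual-report snippets into corporate-governance labels with k-Nearest Neighbour. The texts, labels and pipeline are hypothetical, not the study's data or code.

```python
# Hedged illustration (the study itself used R): k-NN text classification of
# invented annual-report snippets into corporate-governance labels.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline

reports = [
    "board independence and audit committee strengthened this year",
    "no disclosure on risk management or shareholder rights",
    "transparent remuneration policy and independent directors",
    "limited reporting on internal controls",
]
labels = ["qualified", "unqualified", "qualified", "unqualified"]  # invented labels

clf = make_pipeline(TfidfVectorizer(), KNeighborsClassifier(n_neighbors=3))
clf.fit(reports, labels)
print(clf.predict(["independent board with detailed audit disclosures"]))
```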

Findings: Our analysis shows that K-Nearest Neighbour achieved the highest proportion of correct classifications at 85%, compared with 50% for the random walk. The empirical evidence suggests that text mining can be used by all stakeholders as a financial analysis method.

Practical Implications: Combining financial statement analyses with financial reporting analyses will decrease the information asymmetry between the company and its stakeholders, so stakeholders can make more accurate decisions. Analysing non-financial data with text mining will provide a decisive competitive advantage, especially for investors seeking to make the right decisions, and will lead to scarce resources being allocated more effectively. Another contribution of the study is that stakeholders can predict a company’s corporate governance qualification from its annual reports even if the company is not included in the Corporate Governance Index (CGI).

Details

Contemporary Studies of Risks in Emerging Technology, Part B
Type: Book
ISBN: 978-1-80455-567-5
