Search results
1 – 10 of 31
Muni Kelly
The purpose of this study is to examine whether the gender of an audit engagement partner (EP) is associated with the quality of the EP’s audit output.
Abstract
Purpose
The purpose of this study is to examine whether the gender of an audit engagement partner (EP) is associated with the quality of the EP’s audit output.
Design/methodology/approach
This paper defines a low-quality EP as an EP who leads the audit of at least one client firm that subsequently restates its financial statements, while a high-quality EP is one who is not associated with any restatement. Using a sample of 6,082 observations from 2016 to 2020, the study estimates a logistic regression of EP quality on EP gender and control variables.
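For illustration only (this is not the authors' code), the following is a minimal sketch of this kind of design in Python, assuming a hypothetical dataset with an ep_high_quality indicator, an ep_female indicator and two made-up controls; the odds ratio on gender is recovered by exponentiating the fitted coefficient.

```python
# Illustrative sketch only; column names and data are hypothetical,
# not the study's actual dataset or model specification.
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Hypothetical engagement-partner sample: 1 = high-quality EP (no restatement)
df = pd.DataFrame({
    "ep_high_quality": np.random.binomial(1, 0.7, 6082),
    "ep_female":       np.random.binomial(1, 0.3, 6082),
    "client_size":     np.random.normal(0, 1, 6082),   # stand-in control
    "big4":            np.random.binomial(1, 0.8, 6082),  # stand-in control
})

X = sm.add_constant(df[["ep_female", "client_size", "big4"]])
model = sm.Logit(df["ep_high_quality"], X).fit(disp=0)

# Exponentiating the gender coefficient gives the odds ratio: a value of
# 1.25 would mean the odds of being a high-quality EP are 1.25 times
# higher for female EPs, holding the controls fixed.
print(np.exp(model.params["ep_female"]))
```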
Findings
The results show that female EPs are more likely to be high-quality EPs. With an odds ratio of 1.25, the results imply that the odds of a female EP being associated with a higher-quality audit are 1.25 times those of a male EP.
Research limitations/implications
The results of this study imply that female EPs are more likely to perform high-quality audits, supporting the assertion that EP gender plays a significant role in determining EP quality. Further studies may apply gender theory to investigate the behavior of female EPs.
Practical implications
The results show that female EPs are more likely to be high-quality EPs. With an odds ratio of 1.25, the results imply that the odds of a female EP being associated with a higher-quality audit are 1.25 times those of a male EP.
Originality/value
The results of this study should be of interest to stakeholders such as audit committees, regulators, investors and creditors, as they provide an indicator for assessing the quality of audits. Moreover, considering the EP’s important role in an audit, the current study extends the existing literature by providing evidence of a relationship between EP gender and EP quality.
For over a decade now, various stakeholders in accounting education have called for the integration of technology competencies in the accounting curriculum (Association to Advance…
Abstract
For over a decade now, various stakeholders in accounting education have called for the integration of technology competencies in the accounting curriculum (Association to Advance Collegiate Schools of Business (AACSB), 2013, 2018; Accounting Education Change Commission (AECC), 1990; American Institute of Certified Public Accountant (AICPA), 1996; Behn et al., 2012; Lawson et al., 2014; PricewaterhouseCoopers (PWC), 2013). In addition to stakeholder expectations, the inclusion of data analytics as a key area in both the business and accounting accreditation standards of the AACSB signals the urgent need for accounting programs to incorporate data analytics into their accounting curricula. This paper examines the extent of the integration of data analytics in the curricula of accounting programs with separate accounting AACSB accreditation. The paper also identifies possible barriers to integrating data analytics into the accounting curriculum. The results of this study indicate that of the 177 AACSB-accredited accounting programs, 79 (44.6%) offer data analytics courses at either the undergraduate or graduate level or as a special track. The results also indicate that 41 (23.16%) offer data analytics courses in their undergraduate curriculum, 61 (35.88%) at the graduate level, and 12 (6.80%) offer specialized tracks for accounting data analytics. Taken together, the findings indicate an encouraging trend, albeit slow, toward the integration of data analytics into the accounting curriculum.
Nana Y. Amoah, Isaac Bonaparte, Ebenezer K. Lamptey and Muni Kelly
Using the L. Bebchuk, Cohen, and Ferrell (2009) entrenchment index (E-index), the authors examine the relation between management entrenchment and the probability of a firm being…
Abstract
Using the L. Bebchuk, Cohen, and Ferrell (2009) entrenchment index (E-index), the authors examine the relation between management entrenchment and the probability of a firm being implicated in the stock option backdating scandal. Estimating logistic regressions, they document a negative relation between the E-index and the probability of a firm being implicated in the scandal. The results of this study are consistent with the view that management entrenchment is advantageous to shareholders, as it protects managers from short-term reporting pressures and egregious opportunistic behavior that can be detrimental to firm value.
This article uses Michel Foucaultʼs theoretical work in examining relations of power within the unique context of street-level bureaucracies (Lipsky, 1980). Through Foucaultʼs…
Abstract
This article uses Michel Foucaultʼs theoretical work in examining relations of power within the unique context of street-level bureaucracies (Lipsky, 1980). Through Foucaultʼs techniques of discipline (1995), it analyzes how employees and managers are both objectified and self-produced within collective bargaining agreements from street-level organizations. Findings show that ‘managers’, ‘employees’ and ‘union representatives’ are produced but also constrained within these documents. These collective bargaining agreements also serve to ‘fix’ relationships discursively affirmed as unequal. Constrained by this ‘reality’, prescriptions that ask street-level bureaucrats to be ‘leaders’ or ‘responsible choice-makers’ (Vinzant & Crothers, 1998, p. 154), rather than policy implementers simply carrying out management directives, are largely futile.
Lukasz Prorokowski, Oleg Deev and Hubert Prorokowski
The use of risk proxies in internal models remains a popular modelling solution. However, there is some risk that a proxy may not constitute an adequate representation of the…
Abstract
Purpose
The use of risk proxies in internal models remains a popular modelling solution. However, there is some risk that a proxy may not constitute an adequate representation of the underlying asset in terms of capturing tail risk. Therefore, using empirical examples for the financial collateral haircut model, this paper aims to critically review available statistical tools for measuring the adequacy of capturing tail risk by proxies used in the internal risk models of banks. In doing so, this paper advises on the most appropriate solutions for validating risk proxies.
Design/methodology/approach
This paper reviews statistical tools used to validate whether an equity index or fund benchmark is a proxy that adequately represents tail risk in the returns on an individual asset (equity/fund). The following statistical tools for comparing the return distributions of the proxies and the portfolio items are discussed: the two-sample Kolmogorov–Smirnov test, the spillover test and Harrell’s C test.
Findings
Upon the empirical review of the available statistical tools, this paper suggests using the two-sample Kolmogorov–Smirnov test to validate the adequacy of capturing tail risk by the assigned proxy and Harrell’s C test to capture the discriminatory power of the proxy-based collateral haircut models. This paper also suggests a tool that compares the reactions of risk proxies to tail events to verify possible underestimation of risk in times of significant stress.
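As a hedged illustration of the recommended check (not the paper's implementation), the sketch below compares the loss tail of an asset's returns against its proxy's returns with SciPy's two-sample Kolmogorov–Smirnov test; the simulated return series and the 5% tail cut-off are assumptions made purely for the example.

```python
# Illustrative sketch; the return series and tail cut-off are assumptions,
# not the paper's data or validation procedure.
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(0)
asset_returns = rng.standard_t(df=4, size=2000) * 0.01   # heavier-tailed asset
proxy_returns = rng.normal(0, 0.01, size=2000)           # thinner-tailed proxy

# Focus the comparison on the loss tail (returns below the asset's 5th
# percentile), where an inadequate proxy would understate haircut risk.
cutoff = np.quantile(asset_returns, 0.05)
result = ks_2samp(asset_returns[asset_returns <= cutoff],
                  proxy_returns[proxy_returns <= cutoff])

# A small p-value flags that the proxy's tail distribution differs
# materially from the asset's, i.e. the proxy may understate tail risk.
print(result.statistic, result.pvalue)
```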
Originality/value
The current regulations require banks to prove that the modelled proxies are representative of the real price observations without underestimation of tail risk and asset price volatility. This paper shows how to validate proxy-based financial collateral haircut models.
Hannan Amoozad Mahdiraji, Hojatallah Sharifpour Arabi, Moein Beheshti and Demetris Vrontis
This research aims to extract Industry 4.0 technological building blocks (TBBs) capable of value generation in collaborative consumption (CC) and the sharing economy (SE)…
Abstract
Purpose
This research aims to extract Industry 4.0 technological building blocks (TBBs) capable of value generation in collaborative consumption (CC) and the sharing economy (SE). Furthermore, by employing a mixed methodology, this research strives to analyse the relationship amongst TBBs and classify them based on their impact on CC.
Design/methodology/approach
Due to the importance of technology for the survival of collaborative consumption in the future, this study suggests a classification of the auxiliary and fundamental Industry 4.0 technologies and their current upgrades, such as the metaverse or non-fungible tokens (NFT). First, by applying a systematic literature review and thematic analysis (SLR-TA), the authors extracted the TBBs that impact on collaborative consumption and SE. Then, using the Bayesian best-worst method (BBWM), TBBs are weighted and classified using experts’ opinions. Finally, a score function is proposed to measure organisations’ readiness level to adopt Industry 4.0 technologies.
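The paper's actual score function is not reproduced here; the sketch below is only a hedged guess at the general shape of such a readiness measure, assuming BBWM-style weights per TBB and a 0–1 adoption level reported by the organisation. All names and numbers are invented for illustration.

```python
# Hypothetical sketch of a weighted readiness score; the TBB weights and
# adoption levels below are invented, not the paper's values.
from typing import Dict

def readiness_score(weights: Dict[str, float], adoption: Dict[str, float]) -> float:
    """Weighted average of adoption levels (0-1) over the technological building blocks."""
    total = sum(weights.values())
    return sum(weights[t] * adoption.get(t, 0.0) for t in weights) / total

# Example with made-up weights, as might come from a BBWM exercise.
weights = {"virtual_reality": 0.18, "big_data": 0.15, "digital_twins": 0.14,
           "artificial_intelligence": 0.10, "metaverse": 0.04}
adoption = {"virtual_reality": 0.5, "big_data": 1.0, "digital_twins": 0.0,
            "artificial_intelligence": 0.7, "metaverse": 0.0}
print(round(readiness_score(weights, adoption), 3))
```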
Findings
The findings illustrated that virtual reality (VR) plays a vital role in CC and SE. Of the 11 TBBs identified in the CC and SE, VR was selected as the most determinant TBB and metaverse was recognised as the least important. Furthermore, digital twins, big data and VR were labelled as “fundamental”, and metaverse, augmented reality (AR), and additive manufacturing were stamped as “discretional”. Moreover, cyber-physical systems (CPSs) and artificial intelligence (AI) were classified as “auxiliary” technologies.
Originality/value
With an in-depth investigation, this research identifies TBBs of Industry 4.0 with the capability of value generation in CC and SE. To the authors’ knowledge, this is the first research that identifies and examines the TBBs of Industry 4.0 in the CC and SE sectors. Furthermore, a novel mixed method has been used to identify, weight and classify the pertinent technologies. The score function that measures the readiness level of each company to adopt TBBs in CC and SE is a unique contribution.
William C. Rivenbark, Dale J. Roenigk and Lidia Noto
A major part of maintaining a well-managed performance measurement system in local government is providing the infrastructure for performance management. The problem is that local…
Abstract
A major part of maintaining a well-managed performance measurement system in local government is providing the infrastructure for performance management. The problem is that local officials often struggle with moving from adopting performance measures to actually using them for improving services and for making resource allocation decisions. This article responds to this struggle by presenting information on the relationships between efficiency and effectiveness measures across six local government service areas, with the goal of providing guidance on using performance measures to support strategic resource management. Our research suggests that stronger correlations exist between efficiency and effectiveness measures associated with local services that possess private good characteristics, concluding that performance measures associated with market-oriented services lend themselves more readily to making resource allocation decisions.
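As a hedged illustration of the kind of correlation analysis described (not the authors' data or method), the sketch below computes the efficiency–effectiveness correlation within each of a few hypothetical service areas; the service areas, sample size and measures are assumptions.

```python
# Illustrative sketch; service areas and measures are hypothetical,
# not the article's dataset.
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)
n = 60  # hypothetical number of local governments per service area
frames = []
for area in ["refuse_collection", "police", "parks"]:
    efficiency = rng.normal(100, 15, n)                        # e.g. cost per unit of output
    effectiveness = 0.4 * efficiency + rng.normal(0, 20, n)    # e.g. outcome or quality rating
    frames.append(pd.DataFrame({"service_area": area,
                                "efficiency": efficiency,
                                "effectiveness": effectiveness}))

panel = pd.concat(frames, ignore_index=True)

# Correlate efficiency and effectiveness within each service area; stronger
# correlations would suggest measures better suited to resource-allocation decisions.
for area, grp in panel.groupby("service_area"):
    print(area, round(grp["efficiency"].corr(grp["effectiveness"]), 2))
```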