Search results
1 – 10 of over 10,000 results
Abstract
Purpose
Modern prejudice was examined as a potential predictor of overestimating proportions of minority employees in gender-typed occupations. Strength of conjunction error was considered as an indicator of distorted perceptions of these proportions. Furthermore, the purpose of this paper is to investigate whether the association between modern prejudice and strength of conjunction error was weaker for gender-untypical than for gender-typical targets.
Design/methodology/approach
Modern prejudice was considered as a predictor of overestimations of black female employees in Study 1 (n=183) and black female older employees in Study 2 (n=409). Data were collected using internet-mediated questionnaires.
Findings
In Study 1, modern racism, but not modern sexism, was associated with greater strength of conjunction error when respondents were presented with gender-typical targets. In Study 2, using a sample scoring higher on modern prejudice than in Study 1, modern racism, but not modern sexism and modern ageism, was associated with greater strength of conjunction error, irrespective of target occupation. Furthermore, there was an unexpected association between lower sexism and greater strength of conjunction error for gender-typical targets, but not for gender-untypical targets.
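The conjunction-error measure can be made concrete with a small sketch. The scoring rule and the example estimates below are illustrative assumptions, not the authors' actual instrument:

```python
# Illustrative scoring only (not the study's instrument): a conjunction
# error occurs when a conjunction, e.g. "black AND female employees", is
# judged more probable than one of its constituents, e.g. "black employees".

def conjunction_error_strength(p_constituent, p_conjunction):
    """Size of the probability-rule violation; 0.0 means no error."""
    return max(0.0, p_conjunction - p_constituent)

# A respondent estimates 20% black employees but 35% black female
# employees in the same occupation -- a clear conjunction error.
strength = conjunction_error_strength(0.20, 0.35)
print(round(strength, 2))  # 0.15
```

A higher value indicates a more strongly distorted proportion estimate; a rational respondent never exceeds zero.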
Research limitations/implications
The findings lend support to the ethnic-prominence hypothesis in that modern racism, but not modern sexism or modern ageism, was associated with greater strength of conjunction error. Furthermore, empirical evidence suggests that target non-prototypicality can dilute the effect of modern prejudice on strength of conjunction error.
Originality/value
This is one of the rare studies examining attitudes and conjunction error in a work-relevant context, thereby bridging the gap between social cognition and applied psychology.
Abstract
Purpose
The purpose of this paper is to propose a new approach to further obtain reduced Hamiltonian equations for certain nonlinear cases of finite amplitude.
Design/methodology/approach
Chebyshev polynomials are introduced to best approximate the primitive exact wave equations.
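The paper's reduced equations are not reproduced here, but the general Chebyshev approximation-plus-error-check workflow can be sketched with NumPy; the target function below is an invented stand-in for a wave quantity:

```python
import numpy as np
from numpy.polynomial import chebyshev as C

# Least-squares Chebyshev fit on [-1, 1]; the function is illustrative.
x = np.linspace(-1.0, 1.0, 400)
f = np.exp(x) * np.sin(3.0 * x)

coeffs = C.chebfit(x, f, deg=12)       # degree-12 Chebyshev series
approx = C.chebval(x, coeffs)

# Error analysis: the maximum residual bounds the range of applicability.
max_err = np.max(np.abs(f - approx))
print(f"max abs error of the fit: {max_err:.2e}")
```

Because Chebyshev coefficients of smooth functions decay rapidly, the truncated series gives a near-best uniform approximation, which is what makes such fits attractive for reduced wave equations.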
Findings
New results are derived for certain nonlinear cases of finite amplitude. Furthermore, ranges of applicability are determined in conjunction with error analyses for the various cases. In particular, the new structure can give a new, highly accurate formula for determining the wave forces on offshore structures.
Originality/value
New reduced Hamiltonian equations for nonlinear surface gravity waves have been derived for the first time for certain cases of finite amplitude, and the new structure yields a new, highly accurate formula for determining the wave forces on offshore structures. These results extend the usual results for weakly nonlinear surface waves to nonlinear surface waves over certain finite ranges.
George A. Gravvanis and Christos K. Filelis-Papadopoulos
Abstract
Purpose
The purpose of this paper is to propose multigrid methods in conjunction with explicit approximate inverses with various cycles strategies and comparison with the other smoothers.
Design/methodology/approach
The main motive for the derivation of the various multigrid schemes lies in the efficiency of the multigrid methods as well as the explicit approximate inverses. The combination of the various multigrid cycles with the explicit approximate inverses as smoothers in conjunction with the dynamic over/under relaxation (DOUR) algorithm results in efficient schemes for solving large sparse linear systems derived from the discretization of partial differential equations (PDE).
Findings
Application of the proposed multigrid methods on two-dimensional boundary value problems is discussed and numerical results are given concerning the convergence behavior and the convergence factors. The results are comparatively better than the V-cycle multigrid schemes presented in a recent report (Filelis-Papadopoulos and Gravvanis).
Research limitations/implications
The limitations of the proposed scheme lie in the fact that the explicit finite difference approximate inverse matrix used as a smoother in the multigrid method is a preconditioner for a specific sparsity pattern. Further research is being carried out to derive a generic explicit approximate inverse for any type of sparsity pattern.
Originality/value
A novel smoother for the geometric multigrid method is proposed, based on optimized banded approximate inverse matrix preconditioner, the Richardson method in conjunction with the DOUR scheme, for solving large sparse linear systems derived from finite difference discretization of PDEs. Moreover, the applicability and convergence behavior of the proposed scheme is examined based on various cycles and comparative results are given against the damped Jacobi smoother.
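The approximate-inverse/DOUR smoother itself is not reproduced here, but the damped Jacobi baseline it is compared against can be sketched as a geometric multigrid V-cycle for a 1D Poisson model problem (the grid, right-hand side and parameters below are illustrative):

```python
import numpy as np

def damped_jacobi(u, f, h, omega=2/3, sweeps=3):
    """Damped Jacobi smoother for -u'' = f with zero Dirichlet BCs."""
    for _ in range(sweeps):
        u[1:-1] = (1 - omega) * u[1:-1] + omega * 0.5 * (
            u[:-2] + u[2:] + h * h * f[1:-1])
    return u

def v_cycle(u, f, h):
    u = damped_jacobi(u, f, h)                        # pre-smoothing
    r = np.zeros_like(u)                              # residual of -u'' = f
    r[1:-1] = f[1:-1] - (2 * u[1:-1] - u[:-2] - u[2:]) / h**2
    rc = r[::2].copy()                                # restriction
    rc[1:-1] = 0.25 * r[1:-2:2] + 0.5 * r[2:-1:2] + 0.25 * r[3::2]
    ec = np.zeros_like(rc)
    if len(rc) > 3:
        ec = v_cycle(ec, rc, 2 * h)                   # recurse on coarse grid
    else:
        ec = damped_jacobi(ec, rc, 2 * h, sweeps=50)  # coarsest-level solve
    e = np.zeros_like(u)
    e[::2] = ec                                       # prolongation
    e[1::2] = 0.5 * (e[:-1:2] + e[2::2])              # (linear interpolation)
    u += e                                            # coarse-grid correction
    return damped_jacobi(u, f, h)                     # post-smoothing

n = 65                                                # 2**6 + 1 grid points
x = np.linspace(0.0, 1.0, n)
h = x[1] - x[0]
f = np.pi**2 * np.sin(np.pi * x)                      # exact solution: sin(pi x)
u = np.zeros(n)
for _ in range(10):
    u = v_cycle(u, f, h)
err = np.max(np.abs(u - np.sin(np.pi * x)))
print(f"max error after 10 V-cycles: {err:.2e}")
```

Swapping `damped_jacobi` for a better smoother, as the paper proposes, changes only the two smoothing calls; the cycle structure is unchanged.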
Abstract
Purpose
Advanced big data analysis and machine learning methods are used concurrently to unleash the value of the data generated by the government hotline and to help devise intelligent applications, including automated process management, standards construction and more accurate order dispatching, in order to build high-quality government service platforms as data-driven methods become more widely adopted.
Design/methodology/approach
In this study, machine learning tools are implemented and compared in order to optimize the classification of dispatching tasks, taking into account the influence of the record specifications of work-order texts generated by the government hotline. Exploratory studies on the hotline work-order text include linguistic analysis of text feature processing, new-word discovery, text clustering and text classification.
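As a minimal illustration of the order-classification step, a bag-of-words Naive Bayes classifier can route work-order text to a department. The departments and sample orders below are invented, and the study's actual models are not reproduced:

```python
from collections import Counter
import math

# Toy dispatch classifier: multinomial Naive Bayes with Laplace smoothing.
# Departments and work-order texts are illustrative only.
train = [
    ("water pipe burst flooding street", "utilities"),
    ("no water pressure in building", "utilities"),
    ("pothole damaged my car on main road", "roads"),
    ("road surface cracked near school", "roads"),
]

def fit(data):
    word_counts, class_counts, vocab = {}, Counter(), set()
    for text, label in data:
        class_counts[label] += 1
        wc = word_counts.setdefault(label, Counter())
        for w in text.split():
            wc[w] += 1
            vocab.add(w)
    return word_counts, class_counts, vocab

def predict(text, word_counts, class_counts, vocab):
    total = sum(class_counts.values())
    best, best_lp = None, float("-inf")
    for label, n in class_counts.items():
        lp = math.log(n / total)                 # class prior
        wc = word_counts[label]
        denom = sum(wc.values()) + len(vocab)
        for w in text.split():
            lp += math.log((wc.get(w, 0) + 1) / denom)  # smoothed likelihood
        if lp > best_lp:
            best, best_lp = label, lp
    return best

model = fit(train)
print(predict("burst pipe water leak", *model))  # -> "utilities"
```

More standardized writing specifications help precisely because such models rely on consistent vocabulary in the order text.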
Findings
The complexity of the work-order content is reduced by applying more standardized writing specifications that combine textual grammar with numerical features. As a result, the order-dispatch success prediction accuracy reaches 89.6 per cent with an LSTM model.
Originality/value
The proposed method can help improve the current dispatching processes run by the government hotline, better guide staff to standardize the writing format of work orders, improve the accuracy of order dispatching and provide innovative support to the current mechanism.
Ch. Pinto, R. Avilés, J. Albizuri and A. Hernández
Abstract
In this second part of the paper, some properties of the discretization error estimators are presented, although their theoretical background was already developed in the first part. Two numerical examples have been selected and will be used to check some properties of these error estimators. In addition to this, some practical conclusions will be addressed from the results and graphical output of the implemented procedure.
Abstract
This chapter explores the phenomenon of managerial overoptimism, focusing on the cognitive underpinnings of the mechanisms that generate this bias. It develops a formal model of probability estimation that is inspired by the biological (cognitive neuroscience) evidence on associative information processing in the brain. The model is able to make novel, testable predictions about managerial overoptimism. It is able to parse out three mechanisms that could lead to overoptimism, as well as predict boundary conditions on when these effects should be observed and when the opposite (a pessimistic bias) should be observed instead. Furthermore, it predicts that under certain conditions, attempts by managers to “debias” their estimates might exacerbate the overoptimistic bias.
Abstract
Purpose
The purpose of this paper is to revisit the question of whether analysts anticipate accruals’ predicted reversals (or persistence) in future earnings. Prior evidence documents that analysts who provide information to investors are overly optimistic about firms with high working capital (WC) accruals. The authors propose that empirical models using WC accruals alone may be incomplete and hence not entirely appropriate for assessing the level of analysts’ understanding of accruals. The authors argue that analysts’ optimism about WC accruals might not be due to a lack of sophistication, but rather driven by incomplete accrual information embedded in forecast accuracy tests.
Design/methodology/approach
The authors use non-financial US firms for the period between 1976 and 2013. The authors define earnings forecast errors as the analysts’ consensus earnings forecasts minus the actual earnings provided by IBES deflated by share price from CRSP. The authors carry out forecast error regressions on individual accrual components by decomposing total accruals into categories. The authors perform the tests across 12 months starting from the initial analysts’ forecasts, which are generally issued in the first month after the prior period earnings announcement date. The final sample contains 48,142 firm–year observations per month.
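The forecast-error definition can be illustrated with hypothetical numbers (the figures below are invented, not taken from the sample):

```python
# Forecast error = (consensus EPS forecast - actual EPS) / share price,
# per the definition above. All numbers here are hypothetical.

def forecast_error(consensus_eps, actual_eps, price):
    return (consensus_eps - actual_eps) / price

# Consensus $2.10 vs actual $2.00 at a $40 share price:
fe = forecast_error(2.10, 2.00, 40.0)
print(f"{fe:.4f}")  # positive -> optimistic forecast
```

Deflating by share price makes errors comparable across firms of different sizes; a positive value means the consensus was too optimistic.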
Findings
The empirical tests show no correlation between analysts’ forecast errors and revised total accruals. The findings are robust to different samples, periods, model specifications, decile ranked accruals, high accruals, absolute forecast errors, controlling for cash flows (CF) and high accounting conservatism. The findings imply that if analysts are to achieve more accurate forecasts, they should be considering all rather than some accrual components. The authors interpret this evidence as an indication of analysts’ relative sophistication with respect to accruals.
Research limitations/implications
The authors recognise that analysts’ correct anticipation of accruals’ persistence does not mean that their earnings forecasts are entirely free of bias. Analysts can make forecast errors for various reasons, including strategic biases. For instance, the tests show pessimistic forecast errors with respect to CF, which is in line with similar findings in prior research (Drake and Myers, 2011). Hence, the authors suggest that future research examine this correlation in greater depth, as CF components have the highest level of persistence and hence should be predicted most accurately.
Practical implications
The results imply that the argument about analysts’ lack of sophistication with respect to accruals’ persistence is not warranted. The results imply that forecasts appear to contribute to market efficiency. Another implication is that analysts seem to utilise all relevant accrual information in their forecasts, hence traditional accrual definition should be revised in future studies. Key inferences of the paper imply that the growing use of analysts’ reports by institutional investors and money managers in their decision-making processes is justified despite the debate in the prior literature on the role and the reputation of analysts as surrogates of market expectations.
Originality/value
The research sheds new light on the question of whether sell-side security analysts are able to anticipate the persistence of accruals in future earnings.
C. Koenke, R. Harte, W.B. Krätzig and O. Rosenstein
Abstract
The simulation of fracture processes for discrete crack propagation is well established for linear‐elastic cracking problems. Applying finite element techniques for the numerical formulation, at every incremental macro‐crack step the element mesh has to be adapted such that the crack path remains independent of the initial mesh. The accuracy of the obtained results has to be controlled by suitable error estimators and error indicators. Considering the dependence of the predicted crack path on the precision of the displacement and stress computation, quality measures for the computed results are recommended. In this research the use of the Babuska/Rheinboldt error indicator in combination with linear‐elastic crack propagation problems is demonstrated. Based on this error measure an adaptive mesh refinement technique is developed. In comparison with classical discrete crack propagation simulations the advantages of the new concept can be clearly observed.
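As a loose one-dimensional analogue of indicator-driven refinement (the Babuska/Rheinboldt indicator is not reproduced; the jump-based surrogate and the test function below are assumptions for illustration):

```python
import math

# Illustrative only: elements whose error indicator exceeds a fraction
# of the maximum indicator are split in half, so nodes cluster where
# the solution varies sharply -- the essence of adaptive refinement.

def refine(nodes, u, threshold=0.5):
    # Surrogate indicator per element: jump of u across the element.
    etas = [abs(u(b) - u(a)) for a, b in zip(nodes, nodes[1:])]
    eta_max = max(etas)
    new_nodes = [nodes[0]]
    for (a, b), eta in zip(zip(nodes, nodes[1:]), etas):
        if eta > threshold * eta_max:
            new_nodes.append(0.5 * (a + b))   # split the flagged element
        new_nodes.append(b)
    return new_nodes

u = lambda x: math.tanh(20.0 * (x - 0.5))     # steep layer at x = 0.5
mesh = [i / 8 for i in range(9)]              # uniform initial mesh
for _ in range(3):
    mesh = refine(mesh, u)
print(len(mesh))   # nodes have clustered around x = 0.5
```

In the crack-propagation setting the same loop runs per macro-crack increment, with the indicator computed from the finite element stress field rather than from a known function.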
Biometrics, the measurement of ‘people‐based’ characteristics, is rapidly becoming a major field of technology, as an expert explains.
Thomas E Scruggs and Margo A Mastropieri
Abstract
This chapter reviews problems in the identification of learning disabilities, with particular reference to issues involving discrepancy between IQ and achievement as a criterion for definition. Alternatives to present procedures for identification of learning disabilities are described. It is concluded that no presently proposed alternative meets all necessary criteria for identification of learning disabilities, and that radically altering or eliminating present conceptualizations of learning disabilities may be problematic. The major problems of identification of learning disabilities – including over-identification, variability, and specificity – can be addressed, it is suggested, by increasing specificity and consistency of state criteria and strict adherence to identification criteria on the local implementation level. However, further research in alternative methods for identifying learning disabilities is warranted.