Search results

1 – 10 of 131
Article
Publication date: 1 February 2016

Avraham Levi

The purpose of this paper is to explain why ROC analysis is an inappropriate replacement for probative analysis in lineup research.

Abstract

Purpose

The purpose of this paper is to explain why ROC analysis is an inappropriate replacement for probative analysis in lineup research.

Design/methodology/approach

The paper takes as its medical example a comparison of two methods for detecting a malignant tumor (Mickes et al., 2012) and uses it to define ROC analysis operationally: radiologists are shown the results from the two methods, and their confidence judgments generate a curve plotting correct identifications against mistaken ones, so the methods can be compared on radiologists’ ability to differentiate the sick from the healthy. Lineup researchers, by contrast, create two distinct lineups. In target-present lineups, witnesses differentiate between the target and the foils, not between the target and the innocent suspect. In target-absent lineups, witnesses cannot even differentiate between innocent suspects and foils, having seen none of them.
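As an illustration of the ROC procedure the abstract describes, confidence ratings can be converted into a curve of correct identifications against mistaken ones. This is a minimal sketch; the function name and all data are hypothetical and are not taken from the paper:

```python
# Sketch of ROC construction from confidence-rated decisions.
# All data below are invented for illustration; they are not from the paper.

def roc_points(positives, negatives, thresholds):
    """For each confidence threshold, compute the hit rate (correct IDs
    among target-present cases) and false-alarm rate (mistaken IDs
    among target-absent cases)."""
    points = []
    for t in sorted(thresholds, reverse=True):
        hit_rate = sum(c >= t for c in positives) / len(positives)
        fa_rate = sum(c >= t for c in negatives) / len(negatives)
        points.append((fa_rate, hit_rate))
    return points

# Hypothetical confidence ratings (0-100) for correct and mistaken IDs.
correct_ids = [90, 80, 75, 60, 55, 40]
mistaken_ids = [70, 50, 45, 30, 20, 10]

curve = roc_points(correct_ids, mistaken_ids, thresholds=[80, 60, 40, 20])
```

Sweeping the threshold from strict to lenient traces the curve from the origin outward; the better the witnesses (or radiologists) discriminate, the more the curve bows toward the top-left corner.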

Findings

Eyewitness ROC curves are similar to probative analysis, but provide less useful information.

Research limitations/implications

Researchers are warned against using ROC analysis when conducting lineup research.

Originality/value

Preventing inappropriate use of ROC analysis.

Details

Journal of Criminal Psychology, vol. 6 no. 1
Type: Research Article
ISSN: 2009-3829

Keywords

Abstract

Following the Supreme Court’s 1988 decision in Basic, securities class plaintiffs can invoke the “rebuttable presumption of reliance on public, material misrepresentations regarding securities traded in an efficient market” [the “fraud-on-the-market” doctrine] to prove classwide reliance. Although this requires plaintiffs to prove that the security traded in an informationally efficient market throughout the class period, Basic did not identify what constituted adequate proof of efficiency for reliance purposes.

Market efficiency cannot be presumed without proof because even large publicly traded stocks do not always trade in efficient markets, as documented in the economic literature that has grown significantly since Basic. For instance, during the recent global financial crisis, lack of liquidity limited arbitrage (the mechanism that renders markets efficient) and led to significant price distortions in many asset markets. Yet, lower courts following Basic have frequently granted class certification based on a mechanical review of some factors that are considered intuitive “proxies” of market efficiency (albeit incorrectly, according to recent studies and our own analysis). Such factors have little probative value and their review does not constitute the rigorous analysis demanded by the Supreme Court.

Instead, to invoke fraud-on-the-market, plaintiffs must first establish that the security traded in a weak-form efficient market (absent which a security cannot, as a logical matter, trade in a “semi-strong form” efficient market, the standard required for reliance purposes) using well-accepted tests. Only then do event study results, which are commonly used to demonstrate “cause and effect” (i.e., prove that the security’s price reacted quickly to news – a hallmark of a semi-strong form efficient market), have any merit. Even then, to claim classwide reliance, plaintiffs must prove such cause-and-effect relationship throughout the class period, not simply on selected disclosure dates identified in the complaint as plaintiffs often do.

These issues have policy implications because, once a class is certified, defendants frequently settle to avoid the magnified costs and risks associated with a trial, and the merits of the case (including the proper application of legal presumptions) are rarely examined at a trial.

Details

The Law and Economics of Class Actions
Type: Book
ISBN: 978-1-78350-951-5

Keywords

Article
Publication date: 1 February 2016

Katie Dhingra

Abstract

Details

Journal of Criminal Psychology, vol. 6 no. 1
Type: Research Article
ISSN: 2009-3829

Article
Publication date: 5 February 2018

Elena-Mădălina Vătămănescu, Andreia Gabriela Andrei and Florina Pînzaru

The purpose of this paper is to explore the influence of five dimensions of similarity (i.e. condition similarity, context similarity, catalyst similarity, consequence similarity…

Abstract

Purpose

The purpose of this paper is to explore the influence of five dimensions of similarity (i.e. condition similarity, context similarity, catalyst similarity, consequence similarity and connection similarity) on Facebook social networks development.

Design/methodology/approach

A questionnaire-based survey was conducted with 245 Romanian college students. SmartPLS 3 statistical software for partial least squares structural equation modeling was chosen as the most adequate technique for the assessment of models with both composites and reflective constructs.

Findings

More than 52 percent of the variance in social network development was explained by the advanced similarity model. Each dimension had a positive effect on Facebook social networks development, the highest influences being exerted by condition similarity, context similarity and consequence similarity.
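The “variance explained” figure reported above is a coefficient of determination (R²). As a minimal sketch of how such a share is computed (the function and all values below are hypothetical, not the study’s data):

```python
def r_squared(actual, predicted):
    """Share of variance in `actual` explained by `predicted` values:
    R^2 = 1 - SS_res / SS_tot."""
    mean = sum(actual) / len(actual)
    ss_tot = sum((y - mean) ** 2 for y in actual)
    ss_res = sum((y - p) ** 2 for y, p in zip(actual, predicted))
    return 1 - ss_res / ss_tot

# Hypothetical observed vs. model-predicted network-development scores.
observed = [3.0, 4.0, 5.0, 6.0]
predicted = [3.2, 3.8, 5.1, 5.9]
share = r_squared(observed, predicted)
```

A value above 0.52, as reported, would mean the similarity dimensions jointly account for more than half of the observed variation in the outcome.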

Research limitations/implications

The current approach is substantively based on the homophily paradigm in explaining social network development. Future research would benefit from comparing and contrasting complementary theories (e.g. the rational self-interest paradigm, social exchange or dependency theories) with the current findings. Also, the research relies on a convenience sample of Romanian college students, which limits the generalization of the results to other cultural contexts and thus invites further research initiatives to test the model in different settings.

Social implications

Similarity attributes and mechanisms consistently shape the dynamics of online social networks, a finding whose implications for the impact of new technologies on young people should be investigated in depth.

Originality/value

This study is among the first research initiatives to approach similarity structures and processes within an integrative framework and to conduct the empirical analysis beyond US-centric samples.

Article
Publication date: 1 June 1997

Lynn R. Kahle

The real‐time response survey can be viewed as a dialectic elaboration of the focus group and the sample survey, incorporating some of the advantages of each and producing a…

Abstract

The real‐time response survey can be viewed as a dialectic elaboration of the focus group and the sample survey, incorporating some of the advantages of each and producing a program of research quickly. An evaluation of the methodology shows its predictive utility from: real‐time response purchase intentions to self‐reported actual purchases of common commodities seven days later (r = 0.97); real‐time response purchase intentions to self‐reported actual purchases of new products six months later (r = 0.94); and an index of three real‐time response ratings of a product (purchase intention, price, and extent to which product is new and different) by residents of Cincinnati to national sales data for the following year (r = 0.45). Considers some advantages and disadvantages of the methodology. It can be quite useful in new product development.
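The predictive-utility figures above (r = 0.97, 0.94, 0.45) are Pearson correlation coefficients. A minimal sketch of that computation, with hypothetical data rather than the study’s:

```python
import math

def pearson_r(xs, ys):
    """Pearson product-moment correlation between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical real-time purchase intentions vs. later self-reported purchases.
intentions = [1, 2, 3, 4, 5, 6]
purchases = [1, 2, 2, 4, 5, 6]
r = pearson_r(intentions, purchases)
```

Values near 1, like the r = 0.97 reported for seven-day purchase follow-ups, indicate a nearly linear relationship between stated intention and later behavior.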

Details

Journal of Consumer Marketing, vol. 14 no. 3
Type: Research Article
ISSN: 0736-3761

Keywords

Article
Publication date: 9 May 2016

Kirk Luther and Brent Snook

A recent Supreme Court of Canada (SCC) ruling resulted in stricter rules being placed on how police organizations can obtain confessions through a controversial undercover…

Abstract

Purpose

A recent Supreme Court of Canada (SCC) ruling resulted in stricter rules being placed on how police organizations can obtain confessions through a controversial undercover operation, known as the Mr. Big technique. The SCC placed the onus on prosecutors to demonstrate that the probative value of any Mr. Big derived confession outweighs its prejudicial effect, and that the police must refrain from an abuse of process (i.e. avoid overcoming the will of the accused to obtain a confession). The purpose of this paper is to determine whether a consideration of the social influence tactics present in the Mr. Big technique would deem Mr. Big confessions inadmissible.

Design/methodology/approach

The social psychological literature on compliance and the six main principles of social influence (i.e. reciprocity, consistency, liking, social proof, authority, scarcity) was reviewed. The extent to which these social influence principles are arguably present in Mr. Big operations is discussed.

Findings

Mr. Big operations, by their very nature, create unfavourable circumstances for the accused that are rife with psychological pressure to comply and ultimately confess. A consideration by the SCC of the social influence tactics used to elicit confessions – because such tactics sully the circumstances preceding confessions and verge on abuse of process – should lead to all Mr. Big operations being prohibited.

Practical implications

Concerns regarding the level of compliance in the Mr. Big technique raise questions about whether Mr. Big operations violate the guidelines set out by the SCC ruling. The findings from the current paper could have an impact on the admissibility of Mr. Big confessions, along with the continued use of this controversial technique.

Originality/value

The current paper represents the first in-depth analysis of the Mr. Big technique through a social psychological lens.

Details

Journal of Forensic Practice, vol. 18 no. 2
Type: Research Article
ISSN: 2050-8794

Keywords

Abstract

Details

Marketisation and Forensic Science Provision in England and Wales
Type: Book
ISBN: 978-1-83909-124-7

Book part
Publication date: 6 September 2000

Adam Karp

Discrimination law has evolved from litigating or prosecuting overt, individual cases of egregious behavior solely by means of anecdotal evidence and eyewitness testimony…

Abstract

Discrimination law has evolved from litigating or prosecuting overt, individual cases of egregious behavior solely by means of anecdotal evidence and eyewitness testimony. Statistical evidence came to bear the imprimatur of the United States Supreme Court in the Seventies as a probative means of discerning guilt or liability, and has since been used to substantiate patterns of prejudice at a systemic level. Courtrooms of the Twenty-First Century have struggled to define discrimination through a quantitative lens, nonetheless relying on qualitative evidence to assist the factfinder in rendering a verdict. Some definitions carry more precision and accuracy than others. Consider the National Law Journal’s inflammatory indictment of the United States Environmental Protection Agency (‘EPA’) as an example of the latter. In 1992, the National Law Journal ran a Special Investigation of the EPA, claiming that the federal government had fostered a racist imbalance in hazardous site cleanup and in its pursuit of polluters. Kudos to the columnists for bringing environmental equity into the spotlight of public debate and for forewarning and encouraging the EPA to conduct its enforcements reflectively, in order to avoid being on the receiving end of a Title VI lawsuit. Nonetheless, the methodology used by the National Law Journal betrays an incomplete understanding of the bureaucratic structure that pursued these actions and of the notion of statistical significance. This Article confines itself to Region X’s actions between 1995 and 1999, applying linear regression and other statistical tests to determine whether biases found using the National Law Journal’s naive methodology stand after due consideration of chance. The NLJ approach finds evidence of bias, but the author also conducts more complicated and appropriate analyses, such as those contemplated by the National Guidance. After issuing some provisos, the author dismisses the charges of racism or classism.

While the National Guidance represents a positive first step in identifying environmental justice communities (those with an above-average proportion of lower-class or non-Caucasian inhabitants), it lacks statistical sophistication and econometric depth. This Article concludes by recommending the use of normalized racial distributions, Gini coefficients, and Social Welfare Functions to the EPA and to other organizations conducting environmental justice analysis.
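The Gini coefficient the Article recommends measures how unequally a quantity is distributed across communities (0 = perfect equality, approaching 1 = maximal concentration). A minimal sketch using the mean-absolute-difference formula; the function name and figures are hypothetical, not the Article’s data:

```python
def gini(values):
    """Gini coefficient via the mean-absolute-difference formula:
    G = sum_i sum_j |x_i - x_j| / (2 * n^2 * mean)."""
    n = len(values)
    mean = sum(values) / n
    mad = sum(abs(a - b) for a in values for b in values)
    return mad / (2 * n * n * mean)

# Hypothetical per-community cleanup expenditures.
equal = [10, 10, 10, 10]    # evenly spread enforcement effort
skewed = [0, 0, 0, 40]      # all effort concentrated in one community
```

An even spread yields G = 0, while concentrating everything in one of n communities yields G = (n − 1)/n, which is why the coefficient is a natural summary for environmental-justice comparisons.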

Details

Research in Law and Economics
Type: Book
ISBN: 978-1-84950-022-7

Book part
Publication date: 18 September 2006

Lori Anderson Snyder, Deborah E. Rupp and George C. Thornton

The impetus for this paper was the recognition, based on recent surveys and our own experiences, that organizations face special challenges when designing and validating selection…

Abstract

The impetus for this paper was the recognition, based on recent surveys and our own experiences, that organizations face special challenges when designing and validating selection procedures for information technology (IT) workers. The history of the IT industry, the nature of IT work, and characteristics of IT workers converge to make the selection of IT workers uniquely challenging. In this paper, we identify these challenges and suggest means of addressing them. We show the advantages offered by the modern view of validation that endorses a wide spectrum of probative information relevant to establishing the job relatedness and business necessity of IT selection procedures. Finally, we identify the implications of these issues for industrial/organizational psychologists, human resource managers, and managers of IT workers.

Details

Research in Personnel and Human Resources Management
Type: Book
ISBN: 978-1-84950-426-3

Article
Publication date: 4 June 2018

Simone Busetti and Giancarlo Vecchi

In 2009, the Italian Government initiated a national programme to improve the management of judicial offices. Programme implementation has been patchy and unsatisfactory in all…

Abstract

Purpose

In 2009, the Italian Government initiated a national programme to improve the management of judicial offices. Programme implementation has been patchy and unsatisfactory in all but a few cases. Against this background, the Law Court of Milan has achieved exceptional results and is now recognised as a good practice benchmark for Italy. The purpose of this paper is to investigate this case in order to reconstruct the local conditions for successful implementation of the national programme.

Design/methodology/approach

To test a theory of the programme based on leaders’ engagement, their access to managerial knowledge, and the transfer and consolidation of that knowledge, the present study applies process tracing, a qualitative method that uses Bayesian reasoning to improve the accuracy of within-case inferences.
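The Bayesian reasoning that process tracing formalizes amounts to updating confidence in a hypothesis as each piece of within-case evidence is found. A hedged sketch of that update rule; the function name and probabilities are illustrative, not taken from the paper:

```python
def bayes_update(prior, p_e_given_h, p_e_given_not_h):
    """Posterior probability of hypothesis H after observing evidence E,
    by Bayes' rule: P(H|E) = P(H)P(E|H) / [P(H)P(E|H) + P(~H)P(E|~H)]."""
    num = prior * p_e_given_h
    return num / (num + (1 - prior) * p_e_given_not_h)

# Illustrative: a 'smoking gun' piece of evidence (likely only if H holds)
# raises a 0.5 prior sharply; uninformative evidence leaves it unchanged.
posterior = bayes_update(0.5, 0.8, 0.1)
```

Evidence that is much likelier under the hypothesis than under its rival moves the posterior strongly, which is how process tracing distinguishes probative observations from mere straws in the wind.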

Findings

The analysis shows how programme and context features interacted to support change. In particular, while the national programme succeeded in providing resources for leader engagement and knowledge access, the transfer and consolidation of managerial knowledge depended largely on a brokerage function performed locally between consultants and magistrates.

Originality/value

The paper sheds light on the local conditions for change management and does so by employing an innovative qualitative method that improves the reliability of within-case inferences.

Details

International Journal of Public Sector Management, vol. 31 no. 5
Type: Research Article
ISSN: 0951-3558

Keywords
