Search results

1 – 10 of over 5000
Article
Publication date: 1 October 2018

Adriana Aparecida Lemos Torres, Benildes C.M.S Maculan, Célia da Consolação Dias and Gislene Rodrigues da Silva

Abstract

Purpose

This paper aims to present the results of an analysis about the subjectivity in the process of thematic representation of photographs.

Design/methodology/approach

The experiments were conducted with students of the Librarianship course (Biblioteconomia) at the Universidade Federal de Minas Gerais, during the discipline “thematic representation of images”. The analysis was carried out in two ways: in free form and by applying four established methods: those of Panofsky (1979), Smit (1996), Shatford (1986) and Manini (2002).

Findings

The results showed that subjectivity is always present, even with the use of methodologies.

Research limitations/implications

This paper shows the importance of the librarianship course and the need to improve and update its methods, standards, techniques and tools, in order to provide a basis for standardizing subject analysis as an essential procedure for information retrieval.

Practical implications

It is concluded that thematic representation of images requires clear and detailed indexation policies that can systematize the activity and minimize the effects of the subjectivity involved.

Social implications

The determination of the subject of a document involves many factors. One of them is the cognition of the indexer, which is influenced by their prior knowledge, limitations and biases about the context. Above all, however, the indexer must take into account the social reality of the user, as the main purpose of representation is the retrieval of information.

Originality/value

This study reaffirms the importance of the indexing discipline in vocational training in librarianship courses, emphasizing the cognitive aspects to which the indexer is subject.

Article
Publication date: 16 August 2021

Mark Edward Phillips and Hannah Tarver

Abstract

Purpose

This study furthers metadata quality research by providing complementary network-based metrics and insights to analyze metadata records and identify areas for improvement.

Design/methodology/approach

Metadata record graphs apply network analysis to metadata field values; this study evaluates the interconnectedness of subjects within each Hub aggregated into the Digital Public Library of America. It also reviews the effects of NACO normalization (simulating the revision of values for consistency) and of breaking up pre-coordinated subject headings (simulating the application of the Faceted Application of Subject Terminology to Library of Congress Subject Headings).

Findings

Network statistics complement count- or value-based metrics by providing context related to the number of records a user might actually find starting from one item and moving to others via shared subject values. Additionally, connectivity increases through the normalization of values to correct or adjust for formatting differences or by breaking pre-coordinated subject strings into separate topics.
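
The connectivity effect described here can be sketched with toy data: under exact-string matching the three hypothetical records below share no subjects, but after a crude stand-in for NACO normalization and the splitting of one pre-coordinated heading, every record becomes reachable from every other. (A real record-graph analysis would use a network library; this stdlib union-find sketch only illustrates the idea.)

```python
from collections import defaultdict

def connected_record_groups(records):
    """Group records that share at least one subject value (exact-string match)."""
    parent = list(range(len(records)))  # union-find over record indices

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i

    def union(i, j):
        parent[find(i)] = find(j)

    by_subject = defaultdict(list)
    for idx, subjects in enumerate(records):
        for s in subjects:
            by_subject[s].append(idx)
    for members in by_subject.values():
        for other in members[1:]:
            union(members[0], other)

    groups = defaultdict(set)
    for idx in range(len(records)):
        groups[find(idx)].add(idx)
    return list(groups.values())

def normalize(value):
    """Crude stand-in for NACO normalization: case-fold and strip punctuation."""
    return "".join(c for c in value.lower() if c.isalnum() or c.isspace()).strip()

raw = [
    {"United States -- History"},   # pre-coordinated heading
    {"united states"},              # same topic, different formatting
    {"History"},
]

# Exact-string matching on the raw values: no record shares a subject.
before = connected_record_groups([sorted(r) for r in raw])

# Normalize values and split the pre-coordinated heading into separate topics.
split = [{normalize(part) for s in r for part in s.split("--")} for r in raw]
after = connected_record_groups([sorted(r) for r in split])
```

Here `before` contains three isolated groups, while `after` contains a single group, mirroring the increase in connectivity the study reports.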

Research limitations/implications

This analysis focuses on exact-string matches, the lowest common denominator for searching, although many search engines and digital library indexes use less stringent matching methods. In terms of practical implications for evaluating or improving subjects in metadata, the normalization components demonstrate where resources may be most effectively allocated for these activities (depending on the collection).

Originality/value

Although the individual components of this research are not particularly novel, network analysis has not generally been applied to metadata analysis. This research furthers previous studies related to metadata quality analysis of aggregations and digital collections in general.

Details

The Electronic Library, vol. 39 no. 3
Type: Research Article
ISSN: 0264-0473

Article
Publication date: 17 October 2016

Hakki Ismail Bilgen and Abdulkadir Varoglu

Abstract

Purpose

The main purpose of this paper is to develop a methodology for analyzing competitiveness, based on the diamond model and the construction of a composite index; the secondary aim is to apply this methodology to a national index of Turkey’s defense industry.

Design/methodology/approach

Instead of presenting results based only on the diamond model, a composite index study was carried out. The collected variables were assigned to subject groups under the diamond determinants via an expert opinion survey. The variables were analyzed with alternative methods of imputation, normalization and aggregation. Factor analysis (FA) was performed on the aggregated values of each subject group to identify clusters of years.
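
As an illustration of an imputation–normalization–aggregation pipeline of this kind (with made-up indicator values, mean imputation, min–max normalization and equal weights — not the paper's actual data or method choices):

```python
# Hypothetical yearly indicator values for one subject group; None marks a gap.
years = [1998, 1999, 2000]
indicators = {
    "exports":  [10.0, None, 14.0],
    "rd_spend": [2.0, 2.5, 3.0],
}

def impute(series):
    """Fill missing values with the mean of the observed ones (one common choice)."""
    observed = [v for v in series if v is not None]
    mean = sum(observed) / len(observed)
    return [mean if v is None else v for v in series]

def min_max(series):
    """Rescale to [0, 1] so differently scaled indicators can be aggregated."""
    lo, hi = min(series), max(series)
    return [(v - lo) / (hi - lo) for v in series]

def composite(indicators):
    """Equal-weight average of the normalized indicators, one value per year."""
    normalized = [min_max(impute(s)) for s in indicators.values()]
    return [sum(vals) / len(vals) for vals in zip(*normalized)]

index = composite(indicators)  # one composite value per year
```

The resulting per-year composite values could then feed a clustering step such as the factor analysis described above.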

Findings

Turkey’s diamond model indicated an improvement in the defense industry between 1998 and 2010, and FA revealed the clusters 1998-2000, 2001-2007 and 2008-2010. It was found that Turkey had an advantage in demand conditions but needed to give greater weight to factor conditions. In addition, the key recommendations concerned issues related to the government and the defense industry.

Research limitations/implications

Only Turkey’s competitiveness structure between 1998 and 2010 was researched.

Originality/value

This study brings the qualitative approach of a composite index to the quantitative side of the diamond model.

Details

Competitiveness Review, vol. 26 no. 5
Type: Research Article
ISSN: 1059-5422

Article
Publication date: 1 March 2009

Kathleen A. McGinn

Abstract

This article uses Michel Foucault's theoretical work to examine relations of power within the unique context of street-level bureaucracies (Lipsky, 1980). Through Foucault's techniques of discipline (1995), it analyzes how employees and managers are both objectified and self-produced within collective bargaining agreements from street-level organizations. Findings show that ‘managers’, ‘employees’ and ‘union representatives’ are produced but also constrained within these documents. These collective bargaining agreements also serve to ‘fix’ relationships discursively affirmed as unequal. Constrained by this ‘reality’, prescriptions that ask street-level bureaucrats to be ‘leaders’ or “responsible choice-makers” (Vinzant & Crothers, 1998, p. 154), rather than policy implementers simply carrying out management directives, are largely futile.

Details

International Journal of Organization Theory & Behavior, vol. 12 no. 1
Type: Research Article
ISSN: 1093-4537

Article
Publication date: 1 December 2003

M.J. Taylor, S. Wade and D. England

Abstract

Designing a truly customer focused Web site can be difficult. This paper examines the approach of adapting the existing database design technique of normalisation to achieve customer focused Web site design. A Web site can be thought of as a multimedia database; in fact, under European law a Web site is legally classified as a database. In traditional database design, normalisation is used to structure data and provide efficient keys for data retrieval. In Web site normalisation, the “data” is the text, images, and functions (for example, e‐mail and ordering goods) required on the Web site, and the keys for this “data” are Web site headings, sub‐groupings, and topics. Web site normalisation allows the Web site designer to structure Web site material and identify the optimal navigational structure from a customer perspective, and thus produce a truly customer focused Web site. A case study in a UK marketing organisation is provided in order to demonstrate and evaluate Web site normalisation in action.
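
A minimal sketch of the idea, using hypothetical content items: headings and sub-groupings act as the keys under which each piece of Web site "data" is filed exactly once, and the navigational structure falls out of those keys.

```python
from collections import defaultdict

# Hypothetical flat list of Web site "data" items, each tagged with the
# heading and sub-grouping (the "keys") under which customers would look for it.
items = [
    {"heading": "Products", "subgroup": "Brochures", "content": "2024 catalogue (PDF)"},
    {"heading": "Products", "subgroup": "Ordering",  "content": "Order form"},
    {"heading": "Contact",  "subgroup": "Email",     "content": "sales@example.com"},
    {"heading": "Products", "subgroup": "Ordering",  "content": "Delivery terms"},
]

def normalise_site(items):
    """Group every content item under its heading/sub-grouping key, so each
    item appears exactly once and the site structure mirrors the keys."""
    site = defaultdict(lambda: defaultdict(list))
    for item in items:
        site[item["heading"]][item["subgroup"]].append(item["content"])
    return {heading: dict(subs) for heading, subs in site.items()}

structure = normalise_site(items)
```

Each top-level key becomes a navigation heading and each nested key a sub-page, with no content item duplicated across pages.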

Details

Internet Research, vol. 13 no. 5
Type: Research Article
ISSN: 1066-2243

Article
Publication date: 25 September 2019

Nabil Moukafih, Ghizlane Orhanou and Said Elhajji

Abstract

Purpose

This paper aims to propose a mobile agent-based security information and event management architecture (MA-SIEM) that uses mobile agents for near real-time event collection and normalization on the source device. Externalizing the normalization process to several distributed mobile agents running on interconnected computers and devices leaves the SIEM server dedicated mainly to correlation and analysis.
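
A minimal sketch of normalization at the source, assuming a hypothetical syslog-style input format (the paper's actual agent implementation and event schema are not reproduced here): the agent parses a raw line into a common structure locally, so only normalized events travel to the SIEM server.

```python
import json
import re
from datetime import datetime, timezone

# A raw syslog-style line as a local agent might read it on the source device
# (hypothetical format; real collectors handle many such formats).
RAW = "Jan 12 14:03:55 host1 sshd[912]: Failed password for root from 10.0.0.5"

PATTERN = re.compile(
    r"^(?P<ts>\w{3} +\d+ [\d:]+) (?P<host>\S+) (?P<proc>\w+)\[\d+\]: (?P<msg>.*)$"
)

def normalize_event(line, year=2024):
    """Normalize one raw event into a common schema on the source device,
    so the central SIEM server only has to correlate, not parse."""
    m = PATTERN.match(line)
    ts = datetime.strptime(f"{year} {m.group('ts')}", "%Y %b %d %H:%M:%S")
    return {
        "timestamp": ts.replace(tzinfo=timezone.utc).isoformat(),
        "host": m.group("host"),
        "source": m.group("proc"),
        "message": m.group("msg"),
        "severity": "alert" if "Failed password" in m.group("msg") else "info",
    }

event = normalize_event(RAW)
payload = json.dumps(event)  # what the agent would ship to the SIEM server
```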

Design/methodology/approach

The architecture was developed in three stages. First, the authors described the different aspects of the proposed approach. They then implemented the architecture and presented a new vision for the insertion of normalized data into the SIEM database. Finally, they performed a numerical comparison between the approach used in the proposed architecture and that of existing SIEM systems.

Findings

The results of the experiments showed that MA-SIEM systems are more efficient than existing SIEM systems because they leave the SIEM resources primarily dedicated to advanced correlation analysis. In addition, this paper takes into account realistic scenarios and use-cases and proposes a fully automated process for transferring normalized events in near real time to the SIEM server for further analysis using mobile agents.

Originality/value

The work provides new insights into the normalization of security-related events using lightweight mobile agents.

Details

Information & Computer Security, vol. 28 no. 1
Type: Research Article
ISSN: 2056-4961

Article
Publication date: 20 February 2007

Per Skålén and Martin Fougère

Abstract

Purpose

Marketing “from the intra‐organizational perspective” has been comparatively untouched by the critical turn in organization studies. The objective of the present paper is to contribute to a critical examination of marketing as a change discourse by focusing on service management scholarship. In particular, the paper focuses upon the gap‐model.

Design/methodology/approach

Foucault's disciplinary power concept is used to analyze how the gap‐model tends to objectify, subjectify and normalize.

Findings

Focusing on service management contributes to the scarce critical examination of marketing in general and the almost non‐existent critical examination of service management in particular. Further, the paper contributes to the investigation of the potential production of subjectivity and normalization as an effect of marketing technologies.

Research limitations/implications

This paper suggests empirical exploration of subjective responses to marketing discourse and associated technologies.

Originality/value

Critical examinations of marketing discourse in general, and service management in particular, are very scarce. Specifically, the paper contributes to the understanding of how service management intends to fixate the subject.

Details

Journal of Organizational Change Management, vol. 20 no. 1
Type: Research Article
ISSN: 0953-4814

Article
Publication date: 23 November 2010

Terhi Chakhovich

Abstract

Purpose

This paper seeks to elaborate on how subject positions promoting shareholder value are infused with an outcome focus.

Design/methodology/approach

The study employs Foucault's perspectives on government and the interrelations between objectivity and subjectivity in the analysis of in‐depth case data gathered in one shareholder value‐oriented listed company and one non‐listed company.

Findings

The outside financial market discipline that objectifies shareholder value‐oriented company executives makes them subjects in their own organisation, allowing them to redirect discipline onwards and thereby objectify their subordinates. The non‐listed company executives, due to the relatively closed governance structure of their company and the lack of outside ownership, are not subject to such continuous outside discipline; they lack the same access to the means to create tangible outcomes within their organisations. The subject positions promoting shareholder value are focused on outcomes, whereas the non‐listed company subject positions are focused on processes.

Research limitations/implications

The subject positions of actors within different types of non‐listed companies and listed companies without a shareholder focus form a target for future studies.

Originality/value

The study contributes to the literatures on manager subject position formation and shareholder value. These contributions are achieved by uncovering a novel consequence of subject position formation and by revealing a mechanism by which outcome focus is tied with shareholder value.

Details

Qualitative Research in Accounting & Management, vol. 7 no. 4
Type: Research Article
ISSN: 1176-6093

Book part
Publication date: 23 October 2023

Glenn W. Harrison and J. Todd Swarthout

Abstract

We take Cumulative Prospect Theory (CPT) seriously by rigorously estimating structural models using the full set of CPT parameters. Much of the literature only estimates a subset of CPT parameters, or more simply assumes CPT parameter values from prior studies. Our data are from laboratory experiments with undergraduate students and MBA students facing substantial real incentives and losses. We also estimate structural models from Expected Utility Theory (EUT), Dual Theory (DT), Rank-Dependent Utility (RDU), and Disappointment Aversion (DA) for comparison. Our major finding is that a majority of individuals in our sample locally asset integrate. That is, they see a loss frame for what it is, a frame, and behave as if they evaluate the net payment rather than the gross loss when one is presented to them. This finding is devastating to the direct application of CPT to these data for those subjects. Support for CPT is greater when losses are covered out of an earned endowment rather than house money, but RDU is still the best single characterization of individual and pooled choices. Defenders of the CPT model claim, correctly, that the CPT model exists “because the data says it should.” In other words, the CPT model was born of a wide range of stylized facts culled from parts of the cognitive psychology literature. If one is to take the CPT model seriously and rigorously, then it needs to do a much better job of explaining the data than we see here.
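
For context, the standard Tversky and Kahneman (1992) CPT functional forms can be sketched as follows, using their original published point estimates as illustrative parameter values (not the estimates from this chapter):

```python
def value(x, alpha=0.88, beta=0.88, lam=2.25):
    """CPT value function: concave for gains, convex and loss-averse for losses."""
    return x ** alpha if x >= 0 else -lam * ((-x) ** beta)

def weight(p, gamma=0.61):
    """Inverse-S probability weighting: overweights small p, underweights large p."""
    return p ** gamma / ((p ** gamma + (1 - p) ** gamma) ** (1 / gamma))

# A subject who "locally asset integrates" evaluates the net payment:
# e.g. a $30 endowment with a possible $20 loss is framed as a lottery over
# net outcomes ($10 or $30), not over the gross loss (-$20).
gross_loss_value = value(-20.0)   # what a strict CPT loss frame implies
net_outcome_value = value(10.0)   # what asset integration implies
```

The sign difference between the two evaluations is what makes the asset-integration finding so damaging to a direct loss-frame application of CPT.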

Details

Models of Risk Preferences: Descriptive and Normative Challenges
Type: Book
ISBN: 978-1-83797-269-2

Open Access
Article
Publication date: 4 August 2023

Marco Gatti and Simone Poli

Abstract

Purpose

This paper explores the role that the control system – understood as a set of financial and non-financial mechanisms – introduced by the Ministerial Decree of 15th February 1860 played in promoting the ethical tolerance of prostitution in the Kingdom of Italy.

Design/methodology/approach

A qualitative research method was adopted. Specifically, this study draws on literature on accounting and deviant behaviors and on Suchman's theories of legitimation (1995) to interpret empirical evidence collected from archival primary sources as well as secondary sources.

Findings

The paper highlights how the accounting mechanisms introduced by the law were molded to limit the serious consequences of prostitution from a public health standpoint and to demonstrate that the State neither profited from prostitution nor used public money to fund it. This should have stimulated ethical tolerance of the law itself and, consequently, of the prostitution that was regulated.

Originality/value

This paper opens a new research avenue in the field of accounting history by exploring the connection between accounting and prostitution. Moreover, unlike the extant literature on accounting and deviant behaviors, this study delves into the role played by accounting mechanisms to promote ethical tolerance rather than to activate normalization processes.

Details

Accounting, Auditing & Accountability Journal, vol. 36 no. 9
Type: Research Article
ISSN: 0951-3574
