Search results

1 – 10 of over 22,000
Article
Publication date: 30 August 2011

Gilbert Tekli, Richard Chbeir and Jacques Fayolle

Abstract

Purpose

XML has spread beyond computer science and reached other areas such as e‐commerce, identification, information storage and instant messaging. Data communicated over these domains are now mainly based on XML. Thus, allowing non‐expert programmers to manipulate and control their XML data is essential. The purpose of this paper is to present the XA2C framework, intended for both non‐expert and expert programmers, which provides them with the means to write/draw their XML data manipulation operations.

Design/methodology/approach

In the literature, this issue has been addressed from two perspectives. First, XML alteration/adaptation techniques require a certain level of expertise to implement and are not yet unified. Second, Mashups are not yet formally defined and are not specific to XML data, while XML‐oriented visual languages are based mainly on structural transformations and data extraction and do not allow manipulation of XML textual data. The paper discusses these existing approaches and then presents the XA2C framework.

Findings

The framework is defined on the basis of the dataflow paradigm (visual diagram compositions), taking advantage of both Mashups and XML‐oriented visual languages by defining a well‐founded modular architecture and an XML‐oriented visual functional composition language based on colored Petri nets. The framework also leverages existing XML alteration/adaptation techniques by defining them as XML‐oriented manipulation functions. A prototype of XA2C is developed and presented here for testing and validating the authors' approach.
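
To make the dataflow idea concrete, the sketch below chains plain Python functions over an XML tree in the spirit of such visual compositions. It is only an analogy under stated assumptions: the helper names (rename_tag, uppercase_text, compose) are invented here, and the actual XA2C language is visual and based on colored Petri nets, not Python.

```python
# A minimal sketch, not the XA2C language itself: each helper plays the role
# of one "box" in a dataflow diagram, and compose() wires the boxes together.
import xml.etree.ElementTree as ET
from functools import reduce

def rename_tag(old, new):
    def step(root):
        for el in root.iter(old):
            el.tag = new
        return root
    return step

def uppercase_text(tag):
    def step(root):
        for el in root.iter(tag):
            if el.text:
                el.text = el.text.upper()
        return root
    return step

def compose(*steps):
    # Chain manipulation functions left to right.
    return lambda root: reduce(lambda acc, f: f(acc), steps, root)

pipeline = compose(rename_tag("item", "product"), uppercase_text("name"))
doc = ET.fromstring("<order><item><name>widget</name></item></order>")
print(ET.tostring(pipeline(doc), encoding="unicode"))
# -> <order><product><name>WIDGET</name></product></order>
```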

Originality/value

This paper presents a detailed description of an XML‐oriented manipulation framework implementing the XML‐oriented composition definition language.

Details

International Journal of Web Information Systems, vol. 7 no. 3
Type: Research Article
ISSN: 1744-0084

Article
Publication date: 21 December 2021

Gianclaudio Malgieri

Abstract

Purpose

This study aims to discover the legal borderline between licit online marketing and illicit privacy-intrusive and manipulative marketing, considering in particular consumers’ expectations of privacy.

Design/methodology/approach

A doctrinal legal research methodology is applied throughout, with reference to the relevant legislative frameworks. In particular, this study analyzes the European Union (EU) data protection framework [the General Data Protection Regulation (GDPR)], one of the most advanced privacy laws in the world, with strong extra-territorial impact and consequent risk of high fines, compares it with privacy scholarship in the field and extracts a compliance framework for marketers.

Findings

The GDPR offers a solid compliance framework that can help distinguish licit marketing from illicit marketing. It brings clarity through four legal tests: the fairness test, the lawfulness test, the significant-effect test and the high-risk test. Performing these tests can benefit both consumers and marketers, particularly because meeting consumers' expectations of privacy can enhance their trust. A twofold solution lets marketers respect and leverage consumers' privacy expectations: enhancing critical transparency and avoiding the exploitation of individual vulnerabilities.

Research limitations/implications

This study is limited to the European legal framework scenario and to theoretical analysis. Further research is necessary to investigate other legal frameworks and to prove this model in practice, measuring not only the consumers’ expectation of privacy in different contexts but also the practical managerial implications of the four GDPR tests for marketers.

Originality/value

This study originally contextualizes the most recent privacy scholarship on online manipulation within the EU legal framework, proposing an easy and accessible four-step test and twofold solution for marketers. Such a test might be beneficial both for marketers and for consumers’ expectations of privacy.

Details

Journal of Consumer Marketing, vol. 40 no. 2
Type: Research Article
ISSN: 0736-3761

Article
Publication date: 26 July 2011

Mohamed Zaki, Babis Theodoulidis and David Díaz Solís

Abstract

Purpose

Although the financial markets are regulated by robust systems and rules that control their efficiency and try to protect investors from various manipulation schemes, markets still suffer from frequent attempts to mislead or misinform investors in order to generate illegal profits. Addressing such schemes effectively and systematically presents many challenges to academia, industry and the relevant authorities. This paper aims to discuss these issues.

Design/methodology/approach

The paper describes a case study on fraud detection using data mining techniques that help analysts to identify possible instances of touting based on spam e‐mails. Different data mining techniques such as decision trees, neural networks and linear regression are shown to offer great potential for this emerging domain. The application of these techniques is demonstrated using data from the Pink Sheets market.
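
A minimal sketch of the decision-tree part of such a pipeline follows; the features (tout count, abnormal return, relative volume), the synthetic data and the labeling rule are all invented for illustration and are not the paper's Pink Sheets dataset.

```python
# Hypothetical illustration of supervised classification for tout detection;
# feature names and data are assumptions, not taken from the paper.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
n = 500
X = np.column_stack([
    rng.poisson(3, n),        # daily count of spam touts for a stock
    rng.normal(0, 0.05, n),   # abnormal return
    rng.lognormal(0, 1, n),   # volume relative to its average
])
# Toy label: heavy touting coinciding with unusual volume is flagged.
y = ((X[:, 0] > 5) & (X[:, 2] > 2)).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X_tr, y_tr)
print("holdout accuracy:", clf.score(X_te, y_te))
```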

Findings

Results strongly suggest that the cumulative effect of “stock touting” spam e‐mails is key to understanding the patterns of manipulation associated with touting e‐mail campaigns, and that data mining techniques can be used to facilitate fraud investigations of spam e‐mails.

Practical implications

The approach proposed and the paper's findings could be used retroactively to help the relevant authorities and organisations identify abnormal behaviours in the stock market. It could also be used proactively to warn analysts and stockbrokers of possible cases of market abuse.

Originality/value

This research studies the relationships between the cumulative volume of spam touts and a number of financial indicators using different supervised classification techniques. The paper aims to contribute to a better understanding of the market manipulation problem and provide part of a unified framework for the design and analysis of market manipulation systems.

Details

Journal of Manufacturing Technology Management, vol. 22 no. 6
Type: Research Article
ISSN: 1741-038X

Article
Publication date: 9 January 2023

Etienne G. Harb, Nohade Nasrallah, Rim El Khoury and Khaled Hussainey

Abstract

Purpose

Lebanon has faced one of the most severe financial and economic crises since the end of 2019. The practices of the Lebanese banks are blamed for dangerously exposing economic agents and precipitating the current financial collapse. This paper examines the patterns of manipulation by the 10 biggest banks before and after the implementation of the financial engineering mechanism.

Design/methodology/approach

The authors apply Benford's Law (BL) to the first and second digit positions of the reports of condition and income, covering four of the six aspects of the CAMELS rating system (Capital Adequacy, Asset Quality, Management expertise, Earnings Strength, Liquidity and Sensitivity to the market), i.e. excluding Management and Sensitivity. Deviations from BL frequencies are tested using the Z-statistic and Chi-square tests.
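
A minimal sketch of such a first-digit BL test is shown below. The expected frequency log10(1 + 1/d) and the Z-statistic with continuity correction are standard in the Benford literature; the exact settings the authors used are not specified in the abstract, so this is an assumption about the general technique.

```python
# First-digit Benford test sketch: chi-square over digits 1-9 plus per-digit
# Z-statistics; standard formulas, not the paper's exact procedure.
import math
from collections import Counter
from scipy.stats import chisquare

def first_digit(x):
    return int(str(abs(x)).lstrip("0.")[0])  # first significant digit

def benford_tests(values):
    digits = [first_digit(v) for v in values if v]
    n = len(digits)
    counts = Counter(digits)
    p_exp = {d: math.log10(1 + 1 / d) for d in range(1, 10)}
    obs = [counts.get(d, 0) for d in range(1, 10)]
    exp = [p_exp[d] * n for d in range(1, 10)]  # sums to n, as chisquare requires
    chi2, p = chisquare(obs, exp)
    # Per-digit Z-statistic with continuity correction.
    z = {d: (abs(counts.get(d, 0) / n - p_exp[d]) - 1 / (2 * n))
            / math.sqrt(p_exp[d] * (1 - p_exp[d]) / n) for d in range(1, 10)}
    return chi2, p, z
```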

Findings

Banks seem to have manipulated their Capital Adequacy, Liquidity and Asset Quality in the pre-financial engineering period and considerably more so in the post-financial engineering period. Fraudulent manipulations in the banking sector can mislead depositors, shareholders and regulating authorities.

Research limitations/implications

This study has many implications for governmental authorities, commercial banks, depositors, businesses, accounting and auditing firms, and policymakers. The Lebanese government needs to implement corrective fiscal and monetary policies and apply amendments to the bank secrecy and capital control laws.

Practical implications

The study findings suggest that the central bank should revamp its organizational structure, improve its disclosure practices and significantly reduce its ties to the government and the political elite.

Originality/value

The study is the first to examine the patterns of fraudulent manipulation in the Lebanese banking industry using BL.

Details

Journal of Applied Accounting Research, vol. 24 no. 4
Type: Research Article
ISSN: 0967-5426

Article
Publication date: 1 December 1994

Gerti Kappel and Stefan Vieweg

Abstract

Changes in market and production profiles require a more flexible concept of manufacturing. Computer integrated manufacturing (CIM) describes an integrative concept for joining business and manufacturing islands. In this context, database technology is the key technology for implementing the CIM philosophy. However, CIM applications are more complex, and thus more demanding, than traditional database applications such as business and administrative applications. The paper systematically analyses the database requirements of CIM applications, including business and manufacturing tasks. Special emphasis is given to the integration requirements arising from the distributed, partly isolated nature of CIM applications developed over the years. An illustrative sampling of current efforts in the database community to meet the challenge of non‐standard applications such as CIM is presented.

Details

Integrated Manufacturing Systems, vol. 5 no. 4/5
Type: Research Article
ISSN: 0957-6061

Article
Publication date: 1 March 1994

P. Krysl

Abstract

This paper describes FLIPP, a Fortran library constituting a run‐time environment that is linked to scientific application software (such as finite‐element analysis programs) to support the programming of interactive program control and the use of persistent user‐defined dynamic data structures. The system consists of control and data definition/manipulation subsystems. The FLIPP routines are fully portable standard Fortran 77 procedures, and using FLIPP leads the programmer towards information hiding, as in object‐oriented systems. Program design and maintenance are facilitated to a considerable degree, while the performance of programs using the FLIPP system remains fairly good, as demonstrated by the examples.
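
FLIPP itself is standard Fortran 77 and the abstract does not show its API; the Python sketch below merely illustrates the style it describes, i.e. clients manipulating persistent user-defined structures through a small interface that hides the storage, with all names invented here.

```python
# Illustrative analogue only, not FLIPP's API: structures are reached through
# handles and persisted across runs, hiding the underlying storage.
import pickle

class DataStore:
    def __init__(self):
        self._objects = {}            # hidden storage

    def create(self, name, value):
        self._objects[name] = value
        return name                   # the handle is simply the name here

    def get(self, handle):
        return self._objects[handle]

    def save(self, path):             # persistence between program runs
        with open(path, "wb") as f:
            pickle.dump(self._objects, f)

    @classmethod
    def load(cls, path):
        store = cls()
        with open(path, "rb") as f:
            store._objects = pickle.load(f)
        return store

store = DataStore()
store.create("stiffness_matrix", [[2.0, -1.0], [-1.0, 2.0]])
store.save("session.pkl")
print(DataStore.load("session.pkl").get("stiffness_matrix"))
```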

Details

Engineering Computations, vol. 11 no. 3
Type: Research Article
ISSN: 0264-4401

Article
Publication date: 4 November 2014

Nikitas N. Karanikolas and Michael Vassilakopoulos

Abstract

Purpose

The purpose of this paper is to compare the use of two Object-Relational models against the use of a post-Relational model for a realistic application. Although real-world applications can, in most cases, be adequately modeled by the Entity-Relationship (ER) model, the transformation to the popular Relational model alters the representation of structures common in reality, such as multi-valued and composite fields. Alternative database models have been developed to overcome these shortcomings.
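
As a small illustration of that modeling gap, the sketch below shows an ER-style entity with a composite attribute (an address) and a multi-valued attribute (phone numbers), together with the flattening a plain relational schema would impose; the entity and table names are invented, not taken from the paper.

```python
# Hypothetical example of composite and multi-valued fields; a relational
# design must flatten them into extra tables, as noted in the comments.
from dataclasses import dataclass, field

@dataclass
class Address:                 # composite attribute
    street: str
    city: str

@dataclass
class Patient:                 # ER-style entity
    patient_id: int
    name: str
    address: Address
    phones: list[str] = field(default_factory=list)  # multi-valued attribute

p = Patient(1, "A. Doe", Address("Main St 5", "Athens"), ["210-555", "697-555"])

# A normalized relational schema would instead spread this over two tables:
#   patient(patient_id, name, street, city)
#   patient_phone(patient_id, phone)
# altering the representation the ER model captured directly.
```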

Design/methodology/approach

Based on the ER model of a medical application, this paper compares the information representation, manipulation and enforcement of integrity constraints through PostgreSQL and Oracle, against the use of a post-Relational model composed of the Conceptual Universal Database Language (CUDL) and the Conceptual Universal Database Language Abstraction Level (CAL).

Findings

The CAL/CUDL pair, although more periphrastic for data definition, is simpler for data insertion, does not require procedural code for data updates, produces clearer output when retrieving attributes, can retrieve rows based on conditions addressing composite data using declarative statements and supports data validation for relationships between composite data without the need for procedural code.

Research limitations/implications

To verify, in practice, the conclusions of the paper, complete implementation of a CAL/CUDL system is needed.

Practical implications

The use of the CAL/CUDL pair would advance the productivity of database application development.

Originality/value

This paper highlights the properties of realistic database-applications modelling and management that are desirable by developers and shows that these properties are better satisfied by the CAL/CUDL pair.

Details

Journal of Systems and Information Technology, vol. 16 no. 4
Type: Research Article
ISSN: 1328-7265

Article
Publication date: 31 May 2023

Nathanaël Betti, Steven DeSimone, Joy Gray and Ingrid Poncin

Abstract

Purpose

This research paper aims to investigate the effects of internal audit’s (IA) use of data analytics and the performance of consulting activities on perceived IA quality.

Design/methodology/approach

The authors conduct a 2 × 2 between-subjects experiment among upper and middle managers in which the use of data analytics and the performance of consulting activities by internal auditors are manipulated.
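
The abstract does not state the statistical procedure; a common way to analyze a 2 × 2 between-subjects design is a two-way ANOVA, sketched below on invented data as an assumption about the general technique rather than the authors' actual analysis.

```python
# Two-way ANOVA sketch for a 2 x 2 between-subjects design; all variable
# names and data are invented for illustration.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 200
df = pd.DataFrame({
    "analytics": rng.integers(0, 2, n),    # internal auditors use data analytics?
    "consulting": rng.integers(0, 2, n),   # internal auditors perform consulting?
})
# Toy response: perceived IA quality with main effects and an interaction.
df["quality"] = (4 + 0.5 * df["analytics"] + 0.4 * df["consulting"]
                 + 0.3 * df["analytics"] * df["consulting"]
                 + rng.normal(0, 1, n))

model = smf.ols("quality ~ C(analytics) * C(consulting)", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))
```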

Findings

Results highlight the importance of internal auditor use of data analytics and performance of consulting activities to improve perceived IA quality. First, managers perceive internal auditors as more competent when the auditors use data analytics. Second, managers perceive internal auditors’ recommendations as more relevant when the auditors perform consulting activities. Finally, managers perceive an improvement in the quality of relationships with internal auditors when auditors perform consulting activities, which is strengthened when internal auditors combine the use of data analytics and the performance of consulting activities.

Research limitations/implications

From a theoretical perspective, this research builds on the IA quality framework by considering digitalization as a contextual factor. This research focused on the perceptions of one major stakeholder of the IA function: senior management. Future research should investigate the perceptions of other stakeholders and other contextual factors.

Practical implications

This research suggests that internal auditors should prioritize the development of the consulting role in their function and develop their digital expertise, especially expertise in data analytics, to improve perceived IA quality.

Originality/value

This research tests the impacts of the use of data analytics and the performance of consulting activities on perceived IA quality holistically, by testing Trotman and Duncan’s (2018) framework using an experiment.

Details

Journal of Accounting & Organizational Change, vol. 20 no. 2
Type: Research Article
ISSN: 1832-5912

Article
Publication date: 16 October 2009

Dionysios S. Demetis

Abstract

Purpose

The purpose of this paper is to deconstruct the problem of data growth as a complex contemporary phenomenon and discuss its implications for the domains of anti‐money laundering (AML) and anti‐terrorist financing (ATF).

Design/methodology/approach

The theoretical approach of the paper is based upon an examination of the fundamental ideas around information growth, together with the tradition of systems theory, and draws on many of the theoretical concepts discussed by Niklas Luhmann. The concept of distinction is used throughout the paper to discuss technological consequences.

Findings

The volume of data available for manipulation and its implications for profiling in AML and ATF are discussed, and data growth is presented as an elemental, core issue within the paper.

Practical implications

The practical implications of this paper relate to how technology should be appropriated within financial and other institutions and how the decision‐making processes of traditional bureaucracies surrounding AML/ATF should reflect on their utilization of technology.

Originality/value

The originality of this paper lies in its distinctive treatment of the problem of data growth through its implications. A new theoretical concept, electreaucracy, is introduced to denote the systemic technological interference in bureaucratic conditions and how it influences AML/ATF.

Details

Journal of Money Laundering Control, vol. 12 no. 4
Type: Research Article
ISSN: 1368-5201

Open Access
Article
Publication date: 28 September 2018

Luciana Klein, Ilse Maria Beuren and Delci Dal Vesco

Abstract

Purpose

This study investigates which dimensions of the management control system (MCS) increase the perception of organizational justice and reduce unethical behavior, as perceived by managers. The purpose of this paper is to validate the theoretical model of Langevin and Mendoza (2012) by testing the theoretical hypotheses formulated by those authors.

Design/methodology/approach

A survey was performed in companies listed among the Best and Largest of Exame Magazine. The sample is composed of 102 respondents, and the research instrument consists of 41 assertions.

Findings

The results of the structural equation modeling show that the definition of objectives increases the perception of procedural justice, but the same was not observed regarding the remuneration of the managers. Likewise, disregarding aspects that are uncontrollable by managers in performance evaluation does not lead to the perception of procedural and distributive justice. However, feedback quality leads to the understanding that the MCS is fair. Perception of procedural and distributive justice was also observed in the use of multiple measures of performance by the company.

Research limitations/implications

Other factors not investigated here may interfere with and contribute to the reduction of unethical behaviors (budget slack and data manipulation).

Originality/value

The only variable that contributes to the reduction of unethical behavior is feedback quality. The fact that not all hypotheses were confirmed invites replication of the research in other contexts for empirical validation of the theoretical model of Langevin and Mendoza (2012).

Details

RAUSP Management Journal, vol. 54 no. 1
Type: Research Article
ISSN: 2531-0488
