Search results

1–10 of over 13,000
Book part
Publication date: 25 March 2021

Tom Grimes and Stephanie Dailey

Abstract

Purpose: Media violence theorists made five methodological errors, which have muddled theory construction. As such, the validity of the claim that media violence must share blame for a rise in aggression in society is suspect.

Approach: Here, the authors explain those five errors: (1) Subclinical psychopathologies interact with media messages in detectable ways. Media violence researchers never paid attention to the composition of their participant samples. Consequently, they were never aware of the inherent vulnerabilities, or immunities, to media violence of their participants. (2) Media violence researchers used convenience samples when they should have used random samples to study media violence. The nature of the research questions they were asking required the use of random samples. But, with the use of convenience samples, those samples never matched the populations they were designed to examine. (3) Media violence researchers used expansive variable lists that probably triggered family-wise interaction effects, thus reporting interactions between independent and dependent variables that were meaningless. (4) Most media violence data are correlational. So, researchers used converged data from correlational studies to infer causation. But their convergence procedures were improperly executed, which led to incorrect interpretations. (5) Media violence researchers, from the outset of their work in the 1980s, pathologized media violence first, then set about trying to find out how it presumably harmed society. Those researchers should have considered the idea that media violence is nothing more than mere entertainment for most people.
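Error (3), the family-wise problem, is easy to quantify: with m independent tests each run at significance level alpha, the probability of at least one false positive across the family is 1 − (1 − alpha)^m. A minimal sketch (the figure of 50 tests is purely illustrative):

```python
# Family-wise error rate (FWER) for m independent tests at level alpha:
#   FWER = 1 - (1 - alpha)**m
# Expansive variable lists drive m up, making spurious "significant"
# interactions near-certain.
alpha, m = 0.05, 50              # 50 tests is an illustrative figure
fwer = 1 - (1 - alpha)**m        # ~0.92
bonferroni = alpha / m           # a standard corrected per-test level
print(f"FWER with {m} tests: {fwer:.2f}; Bonferroni level: {bonferroni:.4f}")
```

At that rate, a study running 50 nominal 5% tests will report at least one spurious effect more than nine times out of ten, which is the authors' point about expansive variable lists.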

Value: In addition to questioning the claims made by media violence researchers, these five errors serve as a cautionary tale to social media researchers. Scholars investigating the effects of social media use might consider the possibility that social media are nothing more than new modes of communication.

Details

Theorizing Criminality and Policing in the Digital Media Age
Type: Book
ISBN: 978-1-83909-112-4

Article
Publication date: 5 May 2021

Shailaja Sanjay Mohite and Uttam D. Kolekar

Abstract

Purpose

Femtocells are low-power, inexpensive base stations (BSs) used in business enterprises or homes. They can offer a higher SNR over a smaller coverage area to enhance data rates and QoS. Femtocell deployment is expected to witness constant development in the coming years. Despite these benefits, certain challenges remain to be resolved, including management of the overlaying macrocell (MC), interference among femtocells and resource allocation between the two tiers.

Design/methodology/approach

This work analyses cross-tier interference and resource-allocation issues in full-duplex (FD) orthogonal frequency division multiple access (OFDMA)-oriented heterogeneous networks (HetNets) comprising a macrocell and underlying femtocells. The work makes three main contributions: it formulates a single-objective optimization problem covering subcarrier allocation, price allocation and power allocation in macrocell–femtocell networks; it introduces a novel Cat Swarm Mated-Lion algorithm (CSM-LA) for solving the defined optimization problem; and it demonstrates the superiority of the adopted scheme over traditional models through statistical and convergence analysis.

Findings

In terms of the cost function, the developed CSM-LA performed 87.5, 60, 93.75 and 93.75% better than LM, WOA, LA and CSO, respectively. In the utility analysis, it performed 70.58% better than LM, 88.23% better than GWO, 85.88% better than WOA and 88.23% better than CSO. In the statistical analysis, the median performance of the developed CSM-LA was 80.52% better than LA, 80.74% better than GWO, 72% better than WOA and 48.7% better than LA. Hence, the developed CSM-LA demonstrated improved results over the conventional models.

Originality/value

This paper adopts a recent optimization algorithm, CSM-LA, to address cross-tier interference and resource allocation in full-duplex (FD) orthogonal frequency division multiple access (OFDMA)-oriented heterogeneous networks (HetNets). This is the first work to propose a CSM-LA model for power control and resource allocation considering multiple objectives, namely price, subcarrier and power.

Details

International Journal of Intelligent Unmanned Systems, vol. 10 no. 4
Type: Research Article
ISSN: 2049-6427


Article
Publication date: 3 August 2012

Rahul Srivatsa and Stephen L. Lee

Abstract

Purpose

The purpose of this paper is to test the extent of convergence in rents and yields in the European real estate office market.

Design/methodology/approach

The paper uses the concepts of beta-convergence and sigma-convergence to evaluate empirically the hypothesis of rent and yield convergence in seven European office markets during the period 1982-2009. Because of the introduction of a single currency in January 1999, the analysis is carried out sequentially, first for the overall sample period and then for the periods before and after the introduction of the single currency.
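The two convergence concepts can be illustrated concretely. The panel below is synthetic, and the functions are textbook definitions of sigma- and beta-convergence rather than the authors' exact estimation procedure:

```python
import numpy as np

def sigma_convergence(panel):
    """Cross-sectional dispersion per period (rows = periods,
    columns = markets).  A declining series indicates
    sigma-convergence."""
    return np.std(panel, axis=1, ddof=1)

def beta_convergence_slope(panel):
    """OLS slope of average growth on the initial level.  A
    significantly negative slope is the classic sign of
    beta-convergence (laggard markets catching up)."""
    initial = panel[0]
    growth = (panel[-1] - panel[0]) / (len(panel) - 1)
    slope, _ = np.polyfit(initial, growth, 1)
    return slope

# synthetic rents for 4 markets converging toward a common level of 100
deviations = np.array([30.0, -20.0, 10.0, -15.0])
rents = np.array([100 + deviations * 0.7**t for t in range(10)])

dispersion = sigma_convergence(rents)
print(dispersion[0] > dispersion[-1])        # dispersion shrinks over time
print(beta_convergence_slope(rents) < 0)     # laggards grow faster
```

In the paper's application the same logic is run on observed rents and yields, with significance tests attached to the beta slope and the dispersion trend.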

Findings

The results indicate that, irrespective of the time period considered, there is not enough statistical evidence of beta‐convergence in either rents or yields but evidence of significant sigma‐convergence in rents and yields in the European office markets under review. Additionally, some evidence is found that the introduction of the single currency in 1999 has led to increasing signs of convergence, especially in the Continental European markets.

Practical implications

The results show that the real estate office markets in Europe are not fully integrated and so indicate that diversification across Europe is still a viable investment strategy.

Originality/value

This is the first paper to use beta and sigma convergence tests on European office market data.

Details

Journal of Property Investment & Finance, vol. 30 no. 5
Type: Research Article
ISSN: 1463-578X

Article
Publication date: 17 January 2023

Doron Nisani, Amit Shelef and Or David

Abstract

Purpose

The purpose of this study is to estimate the convergence order of the Aumann–Serrano Riskiness Index.

Design/methodology/approach

This study uses the equivalent relation between the Aumann–Serrano Riskiness Index and the moment generating function, and compares each pair of statistical moments in aggregate for statistical significance. This enables finding the convergence order of the index to its stable value.
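The defining equation of the Aumann–Serrano index, and the moment-based approximation the study's logic rests on, can be sketched numerically. The gamble below is hypothetical, and the bisection solver is a generic implementation, not the authors' estimator:

```python
import numpy as np

def as_riskiness(g, lo=1.0, hi=1e9, iters=200):
    """Aumann-Serrano riskiness of a gamble g (array of equally
    likely outcomes with positive mean and at least one loss):
    the unique R > 0 solving E[exp(-g / R)] = 1, by bisection."""
    f = lambda R: np.mean(np.exp(-g / R)) - 1.0
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if f(mid) > 0:   # R too small: the loss term dominates
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Truncating the moment generating function at the second moment,
#   E[exp(-g/R)] ~ 1 - E[g]/R + E[g^2]/(2 R^2),
# gives the closed form R ~ E[g^2] / (2 E[g]): the variance-flavoured
# "second-best" approximation the abstract refers to.
g = np.array([-100.0, 105.0, 120.0, -80.0])   # hypothetical gamble
exact = as_riskiness(g)
approx2 = np.mean(g**2) / (2 * np.mean(g))
```

For this gamble the exact index and the second-moment approximation agree to within a few per cent, consistent with the abstract's point that the second moment already gives a serviceable approximation of the index.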

Findings

This study finds that the first-best estimation of the Aumann–Serrano Riskiness Index is reached in no less than its seventh statistical moment. However, this study also finds that its second-best approximation could be achieved with its second statistical moment.

Research limitations/implications

The implications of this research support the standard deviation as a statistically sufficient approximation of the Aumann–Serrano Riskiness Index, thus strengthening the CAPM methodology for asset pricing in financial markets.

Originality/value

This research sheds new light, both in theory and in practice, on the understanding of the structure of risk, as it may improve the accuracy of asset pricing.

Article
Publication date: 4 January 2013

Shamsuddin Ahmed

Abstract

Purpose

The purpose of this paper is to present a degenerated simplex search method to optimize a neural network error function. By repeatedly reflecting and expanding a simplex, the centroid property of the simplex changes the location of the simplex vertices. The proposed algorithm selects the location of the centroid of a simplex as the possible minimum point of an artificial neural network (ANN) error function. The algorithm continually changes the shape of the simplex to move in multiple directions in the error function space. Each movement of the simplex in the search space generates a local minimum. Simulating the simplex geometry, the algorithm generates random vertices to train the ANN error function. It easily solves problems in lower dimensions. The algorithm is reliable and locates a minimum function value at an early stage of training. It is appropriate for classification, forecasting and optimization problems.

Design/methodology/approach

Adding more neurons to the ANN structure makes the terrain of the error function complex, and the Hessian matrix of the error function tends to be positive semi-definite. As a result, derivative-based training methods face convergence difficulty. If the error function contains several local minima, or if the error surface is almost flat, the algorithm faces convergence difficulty. The proposed algorithm is an alternative method in such cases. This paper presents a non-degenerate simplex training algorithm. It improves convergence by maintaining an irregular shape of the simplex geometry during the degenerated stage. A randomized simplex geometry is introduced to maintain the irregular contour of a degenerated simplex during training.
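The reflect/expand moves described above follow the generic simplex (Nelder–Mead-style) pattern. A minimal sketch on a toy quadratic error surface follows; it is not the authors' randomized variant, and the plain shrink step merely stands in for their degeneracy handling:

```python
import numpy as np

def simplex_min(f, x0, step=0.5, iters=300):
    """Minimal reflect/expand/shrink simplex search."""
    x0 = np.asarray(x0, dtype=float)
    simplex = [x0.copy()]
    for i in range(len(x0)):           # one perturbed vertex per dimension
        v = x0.copy()
        v[i] += step
        simplex.append(v)
    for _ in range(iters):
        simplex.sort(key=f)            # best vertex first, worst last
        best, worst = simplex[0], simplex[-1]
        centroid = np.mean(simplex[:-1], axis=0)   # centroid excluding worst
        reflected = centroid + (centroid - worst)
        if f(reflected) < f(best):     # promising direction: try expanding
            expanded = centroid + 2.0 * (centroid - worst)
            simplex[-1] = expanded if f(expanded) < f(reflected) else reflected
        elif f(reflected) < f(worst):  # accept the plain reflection
            simplex[-1] = reflected
        else:                          # no progress: shrink toward the best
            simplex = [best + 0.5 * (v - best) for v in simplex]
    return min(simplex, key=f)

# toy error surface with minimum at (3, -1)
x = simplex_min(lambda p: (p[0] - 3.0)**2 + (p[1] + 1.0)**2, [0.0, 0.0])
```

The degeneracy the abstract warns about shows up here when repeated shrinks collapse the vertices into a near-flat simplex; the paper's contribution is to randomize the geometry at that point instead of letting the simplex degenerate.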

Findings

Simulation results show that the new search is efficient and improves function convergence. Classification and statistical time-series problems in higher dimensions are solved. Experimental results show that the new algorithm (the degenerated simplex algorithm, DSA) works better than the random simplex algorithm (RSM) and the back-propagation training method (BPM), and confirm the algorithm's robust performance.

Research limitations/implications

The algorithm is expected to face convergence complexity for optimization problems in higher dimensions. A good-quality suboptimal solution is available at an early stage of training, and the locally optimized function value determined by the algorithm is not far from the global optimum.

Practical implications

The traditional simplex method faces convergence difficulty when training an ANN error function because, during training, the simplex cannot maintain an irregular shape to avoid degeneracy: the simplex size becomes extremely small, so convergence difficulty is common. Steps are taken to redefine the simplex so that the algorithm avoids local minima. The proposed ANN training method is derivative-free: it demands no first-order or second-order derivative information, making it simple to train the ANN error function.

Originality/value

The algorithm optimizes the ANN error function even when the Hessian matrix of the error function is ill-conditioned. Since no derivative information is necessary, the algorithm is appealing for instances where derivative information is hard to obtain. It is robust and can serve as a benchmark algorithm for unknown optimization problems.

Article
Publication date: 18 October 2011

Minghu Ha, Jiqiang Chen, Witold Pedrycz and Lu Sun

Abstract

Purpose

Bounds on the rate of convergence of learning processes based on random samples and probability are one of the essential components of statistical learning theory (SLT). Constructive distribution-independent bounds on generalization are the cornerstone of constructing support vector machines. Random sets and set-valued probability are important extensions of random variables and probability, respectively. The paper aims to address these issues.

Design/methodology/approach

In this study, bounds on the rate of convergence of learning processes based on random sets and set-valued probability are discussed. First, the Hoeffding inequality is extended to the random-set setting; then, making use of the key theorem, the non-constructive distribution-dependent bounds for learning machines based on random sets in set-valued probability space are revisited. Second, some properties of random sets and set-valued probability are discussed.
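For reference, the classical scalar Hoeffding inequality that the paper generalizes to random sets can be stated and checked empirically; the uniform sample below is purely illustrative:

```python
import numpy as np

def hoeffding_bound(n, eps, a=0.0, b=1.0):
    """Classical Hoeffding bound for the mean of n i.i.d. samples
    bounded in [a, b]:
        P(|sample_mean - E[X]| >= eps) <= 2*exp(-2*n*eps^2/(b-a)^2)
    """
    return 2.0 * np.exp(-2.0 * n * eps**2 / (b - a) ** 2)

# Empirical sanity check with Uniform(0, 1) samples (true mean 0.5)
rng = np.random.default_rng(42)
n, eps, trials = 100, 0.1, 2000
means = rng.random((trials, n)).mean(axis=1)
empirical = np.mean(np.abs(means - 0.5) >= eps)
bound = hoeffding_bound(n, eps)   # 2*exp(-2), about 0.271
```

The bound is distribution-independent, so it is loose for any particular distribution; the paper's program is to obtain analogous bounds when the samples are random sets rather than random variables.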

Findings

In the sequel, the concepts of the annealed entropy, the growth function, and VC dimension of a set of random sets are presented. Finally, the paper establishes the VC dimension theory of SLT based on random sets and set‐valued probability, and then develops the constructive distribution‐independent bounds on the rate of uniform convergence of learning processes. It shows that such bounds are important to the analysis of the generalization abilities of learning machines.

Originality/value

SLT is considered at present one of the fundamental theories of small-sample statistical learning.

Article
Publication date: 8 October 2019

Adriana Tiron-Tudor, Cristina Silvia Nistor and Cristina Alexandrina Stefanescu

Abstract

Purpose

The purpose of this paper is to approach, both theoretically and empirically, public sector reporting at European Union (EU) level. It contributes to the accounting harmonisation literature by revealing the actual status of governmental reporting at the national level.

Design/methodology/approach

The paper carries out an exploratory data analysis of the harmonisation of statistical, budgetary and financial reporting at the EU level. A mapping visualisation offers a comprehensive overview of the current state of the connections between these reporting systems.

Findings

The results reveal how complex it is to achieve homogeneity across governments' reporting systems, even though all stakeholders recognise the importance of the principles of performance and transparency in the public sector. These findings are in line with the EU Commission's study, which concludes that there is significant heterogeneity in the accounting and reporting practices applied across the Member States.

Research limitations/implications

The relevance of the study is broad: from the economic environment to practitioners, and from international regulatory bodies to national ones, all can assess and quantify the significance of past, present and future changes in light of their needs. The limitations of the research concern the documentation background, because uniformly accessing some of the information presented by the EU Member States is relatively difficult. Future research might focus on the effects of these changes as they occur.

Originality/value

The study contributes to the scientific literature in the public sector through a comprehensive, well-supported and statistically grounded analysis performed at EU level, able to provide reliable results and to support valuable future recommendations towards harmonised reporting. Moreover, it supports and encourages all national and international efforts for improving the comparability of financial, budgetary and aggregated statistical reports.

Details

International Journal of Public Sector Management, vol. 33 no. 2/3
Type: Research Article
ISSN: 0951-3558

Book part
Publication date: 8 October 2018

Joseph Dippong and Will Kalkhoff

Abstract

Purpose

We review literature linking patterns of vocal accommodation in the paraverbal range of the voice to small group structures of status and dominance. We provide a thorough overview of the current state of vocal accommodation research, tracing the development of the model from its early focus on patterns of mutual vocal adaptation, to the current focus on structural factors producing patterns of unequal accommodation between group members. We also highlight gaps in existing knowledge and opportunities to contribute to the development of vocal accommodation as an unobtrusive, nonconscious measure of small group hierarchies.

Approach

We trace the empirical development of vocal accommodation as a measure of status and power, and discuss connections between vocal accommodation and two prominent theoretical frameworks: communication accommodation theory (CAT) and expectation states theory. We also provide readers with a guide for collecting and analyzing vocal data and for calculating two related measures of vocal accommodation.

Findings

Across multiple studies, vocal accommodation significantly predicts observers’ perceptions regarding interactants engaged in debates and interviews. Studies have specifically linked vocal accommodation to perceptions of relative power or dominance, but have not shown a relationship between accommodation and perceptions of prestige.

Research Implications

Vocal accommodation measures have clear applications for measuring and modeling group dynamics. More work is needed to understand how accommodation functions in clearly-defined status situations, how the magnitude of status differences affects the degree of accommodation inequality, and how vocal accommodation is related to other correlates of social status, including openness to influence and contributions to group tasks.

Details

Advances in Group Processes
Type: Book
ISBN: 978-1-78769-013-4

Article
Publication date: 24 August 2020

Ambika Aggarwal, Priti Dimri, Amit Agarwal and Ashutosh Bhatt

Abstract

Purpose

In general, cloud computing is a model of on-demand business computing that grants convenient access to shared configurable resources over the internet. With the increasing workload and difficulty of the tasks submitted by cloud consumers, how to complete these tasks effectively and rapidly with limited cloud resources is becoming a challenging question. The main aim of a task-scheduling approach is to identify a trade-off between user needs and resource utilization. However, tasks submitted by varied users might have diverse needs in terms of computing time, memory space, data traffic, response time, etc. This paper proposes a new way of task scheduling.

Design/methodology/approach

To complete workflows efficiently and to reduce cost and flow time, this paper proposes a new way of task scheduling. Here, a self-adaptive fruit fly optimization algorithm (SA-FFOA) is used for scheduling the workflow. The proposed multiple-workflow scheduling model is compared with conventional methods through performance analysis, convergence analysis and statistical analysis. The outcomes of these analyses demonstrate the effectiveness of the proposed approach for workflow scheduling.

Findings

The proposed algorithm attains the minimum flow time, improving over FFOA by 0.23%, differential evolution (DE) by 2.48%, artificial bee colony (ABC) by 2.85%, particle swarm optimization (PSO) by 2.46%, genetic algorithm (GA) by 2.33% and expected time to compute (ETC) by 2.56%. For makespan, the proposed algorithm is 0.28%, 0.15%, 0.38%, 0.20%, 0.21% and 0.29% better than FFOA, DE, ABC, PSO, GA and ETC, respectively. Moreover, the proposed model attains a lower cost: 2.14% better than FFOA, 2.32% better than DE, 3.53% better than ABC, 2.43% better than PSO, 2.07% better than GA and 2.90% better than ETC.

Originality/value

This paper presents a new way of task scheduling that completes workflows efficiently while reducing cost and flow time. It is the first paper to use SA-FFOA for workflow scheduling.

Details

Kybernetes, vol. 50 no. 6
Type: Research Article
ISSN: 0368-492X
