Search results

11 – 20 of over 16000
Article
Publication date: 23 November 2012

Sami J. Habib and Paulvanna N. Marimuthu

Abstract

Purpose

Energy constraint is always a serious issue in wireless sensor networks, as the energy possessed by the sensors is limited and non‐renewable. Data aggregation at intermediate base stations increases the lifespan of the sensors, whereby the sensors' data are aggregated before being communicated to the central server. This paper proposes query‐based aggregation within a Monte Carlo simulator to explore the best and worst possible query orders for aggregating the sensors' data at the base stations. The proposed query‐based aggregation model can help the network administrator identify the query orders that improve the performance of the base stations under uncertain query ordering. Furthermore, the paper examines the feasibility of extending the model to simultaneous transmissions at the base station and derives a best‐fit mathematical model to study the behavior of data aggregation under uncertain querying order.

Design/methodology/approach

The paper considers small and medium‐sized wireless sensor networks comprising randomly deployed sensors in a square arena. It formulates the query‐based data aggregation problem as an uncertain ordering problem within a Monte Carlo simulator, generating several thousand uncertain orders to schedule the responses of M sensors at the base station within the specified time interval. For each selected time interval, the model finds the best possible querying order to aggregate the data with reduced idle time and improved throughput. Furthermore, the model is extended to include multiple sensing parameters and multiple aggregating channels, enabling the administrator to plan the capacity of the WSN according to specific time intervals known in advance.
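
The uncertain-ordering search described above can be sketched briefly; the function names, sensor response durations and scoring rule below are illustrative assumptions, not the authors' implementation.

```python
import random

def aggregation_metrics(order, durations, interval):
    """Greedily schedule sensor responses in the given query order.

    Returns (responses_aggregated, idle_time) for one aggregation
    interval; responses that would overrun the interval are skipped.
    """
    t = served = 0
    for sensor in order:
        if t + durations[sensor] <= interval:
            t += durations[sensor]
            served += 1
    return served, interval - t  # leftover time = channel idle time

def monte_carlo_best_order(durations, interval, trials=2000, seed=0):
    """Sample random query orders (the 'uncertain ordering') and keep
    the best and worst by the score (responses served, -idle time)."""
    rng = random.Random(seed)
    sensors = list(range(len(durations)))
    best = worst = None
    for _ in range(trials):
        order = sensors[:]
        rng.shuffle(order)
        served, idle = aggregation_metrics(order, durations, interval)
        score = (served, -idle)
        if best is None or score > best[0]:
            best = (score, order)
        if worst is None or score < worst[0]:
            worst = (score, order)
    return best, worst
```

For five sensors with hypothetical response times [3, 5, 2, 7, 4] and an interval of 10 time units, the sampled best order aggregates three responses with zero idle time, while the worst aggregates only two; this best/worst spread is the kind of gap the model reports to the administrator.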

Findings

The experimental results within the Monte Carlo simulator demonstrate that the query‐based aggregation scheme shows a better trade‐off between maximizing aggregating efficiency and reducing the average idle time experienced by individual sensors. The query‐based aggregation model was tested for a WSN containing 25 sensors with a single sensing parameter, transmitting data to a base station; the simulation results show continuous improvement in best‐case performance from 56 percent to 96 percent over time intervals of 80 to 200 time units. Moreover, the query aggregation is extended to analyze the behavior of a WSN with 50 sensors sensing two environmental parameters and a base station equipped with multiple channels, which demonstrates a shorter aggregation time interval than a single channel. The analysis of the average waiting time of individual sensors across the generated uncertain querying orders shows that the best‐case scenario within a specified time interval gains 10 percent to 20 percent over the worst‐case scenario, which reduces the total transmission time by around 50 percent.

Practical implications

The proposed query‐based data aggregation model can be utilized to predict the non‐deterministic real‐time behavior of the wireless sensor network in response to queries flooded by the base station.

Originality/value

This paper employs a novel framework to analyze all possible orderings of sensor responses to be aggregated at the base station within the stipulated aggregating time interval.

Details

International Journal of Pervasive Computing and Communications, vol. 8 no. 4
Type: Research Article
ISSN: 1742-7371

Article
Publication date: 7 September 2015

Yao 'Henry' Jin, Brent D. Williams, Matthew A. Waller and Adriana Rossiter Hofer

Abstract

Purpose

The accurate measurement of demand variability amplification across different nodes in the supply chain, or “bullwhip effect,” is critical for firms to achieve more efficient inventory, production, and ordering planning processes. Building on recent analytical research that suggests that data aggregation tends to mask the bullwhip effect in the retail industry, the purpose of this paper is to empirically investigate whether different patterns of data aggregation influence its measurement.

Design/methodology/approach

Utilizing weekly, product-level order and sales data from three product categories of a consumer packaged goods manufacturer, the study uses hierarchical linear modeling to empirically test the effects of data aggregation on different measures of bullwhip.
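
As a hedged illustration of the measurement at stake (the common variance-ratio bullwhip measure, not the authors' hierarchical linear model), the effect of temporal aggregation can be sketched as:

```python
import statistics

def bullwhip_ratio(orders, sales):
    """Classic bullwhip measure: variance amplification of orders over sales."""
    return statistics.variance(orders) / statistics.variance(sales)

def aggregate(series, k):
    """Temporally aggregate a weekly series into blocks of k periods."""
    return [sum(series[i:i + k])
            for i in range(0, len(series) - len(series) % k, k)]
```

With illustrative weekly sales and orders that are batched every two weeks, the weekly ratio signals strong amplification, yet aggregating both series into two-week buckets collapses the ratio to one: the masking effect the paper tests empirically.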

Findings

The authors' findings lend strong support to the masking effect of aggregating sales and order data along product-location and temporal dimensions, as well as to the dampening effect of seasonality on the measurement of the bullwhip effect.

Research limitations/implications

These findings indicate that inconsistencies found in the literature may be due to measurement aggregation and statistical techniques, both of which should be applied with care by academics and practitioners in order to preserve the fidelity of their analyses.

Originality/value

Using product-weekly level data that cover both seasonal and non-seasonal demand, this study is the first, to the authors' knowledge, to systematically aggregate data up to category and monthly levels to empirically examine the impact of data aggregation and seasonality on bullwhip measurement.

Details

International Journal of Physical Distribution & Logistics Management, vol. 45 no. 8
Type: Research Article
ISSN: 0960-0035

Open Access
Article
Publication date: 7 June 2021

Gregory S. Cooper, Karl M. Rich, Bhavani Shankar and Vinay Rana

Abstract

Purpose

Agricultural aggregation schemes provide numerous farmer-facing benefits, including reduced transportation costs and improved access to higher-demand urban markets. However, whether aggregation schemes also have positive food security dimensions for consumers dependent on peri-urban and local markets in developing country contexts is currently unknown. This paper aims to narrow this knowledge gap by exploring the actors, governance structures and physical infrastructures of the horticultural value chain of Bihar, India, to identify barriers to using aggregation to improve the distribution of fruits and vegetables to more local market environments.

Design/methodology/approach

This study uses mixed methods. Quantitative analysis of market transaction data explores the development of aggregation supply pathways over space and time. In turn, semi-structured interviews with value chain actors uncover the interactions and decision-making processes with implications for equitable fruit and vegetable delivery.

Findings

Whilst aggregation successfully generates multiple producer-facing benefits, the supply pathways tend to cluster around urban export-oriented hubs, owing to the presence of high-capacity traders, large consumer bases and traditional power dynamics. Various barriers across the wider enabling environment must be overcome to unlock the potential for aggregation to increase local fruit and vegetable delivery, including informal governance structures, cold storage gaps and underdeveloped transport infrastructures.

Originality/value

To the best of the authors’ knowledge, this study is the first critical analysis of horticultural aggregation through a consumer-sensitive lens. The policy-relevant lessons are pertinent to the equitable and sustainable development of horticultural systems both in Bihar and in similar low- and middle-income settings.

Details

Journal of Agribusiness in Developing and Emerging Economies, vol. 12 no. 2
Type: Research Article
ISSN: 2044-0839

Content available
Article
Publication date: 10 May 2021

Zachary Hornberger, Bruce Cox and Raymond R. Hill

Abstract

Purpose

Large/stochastic spatiotemporal demand data sets can prove intractable for location optimization problems, motivating the need for aggregation. However, demand aggregation induces errors. Significant theoretical research has been performed on the modifiable areal unit problem and the zone definition problem, but minimal research has addressed the specific issues inherent to spatiotemporal demand data, such as search and rescue (SAR) data. This study provides a quantitative comparison of various aggregation methodologies and their relation to distance- and volume-based aggregation errors.

Design/methodology/approach

This paper introduces and applies a framework for comparing both deterministic and stochastic aggregation methods using distance- and volume-based aggregation error metrics. This paper additionally applies weighted versions of these metrics to account for the reality that demand events are nonhomogeneous. These metrics are applied to a large, highly variable, spatiotemporal demand data set of SAR events in the Pacific Ocean. Comparisons using these metrics are conducted between six quadrat aggregations of varying scales and two zonal distribution models using hierarchical clustering.
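
A minimal sketch of one such metric, assuming a simple square-quadrat scheme (the function names and weighting are illustrative, not the paper's exact framework):

```python
import math

def quadrat_centroid(point, cell_size):
    """Map a demand point to the centroid of its square grid cell (quadrat)."""
    x, y = point
    return ((x // cell_size + 0.5) * cell_size,
            (y // cell_size + 0.5) * cell_size)

def distance_error(points, cell_size, weights=None):
    """Weighted mean distance between demand points and their aggregation
    centroids: one simple distance-based aggregation error metric."""
    weights = weights or [1.0] * len(points)
    total = sum(w * math.dist(p, quadrat_centroid(p, cell_size))
                for p, w in zip(points, weights))
    return total / sum(weights)
```

Refining the cell size (e.g. from 10 units down to 1) shrinks the weighted point-to-centroid error, mirroring the finding that distance error falls as quadrat fidelity increases, at the cost, in the paper's data, of volume error.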

Findings

As quadrat fidelity increases, the distance-based aggregation error decreases; the two deliberate zonal approaches reduce this error further while using fewer zones. However, the higher-fidelity aggregations detrimentally affect volume error. Additionally, by splitting the SAR data set into training and test sets, this paper shows that the stochastic zonal distribution aggregation method is effective at simulating actual future demands.

Originality/value

This study indicates that no single best aggregation method exists; by quantifying the trade-offs in aggregation-induced errors, practitioners can select the method that minimizes the errors most relevant to their study. The study also quantifies the ability of a stochastic zonal distribution method to effectively simulate future demand data.

Details

Journal of Defense Analytics and Logistics, vol. 5 no. 1
Type: Research Article
ISSN: 2399-6439

Article
Publication date: 2 October 2018

Senan Kiryakos and Shigeo Sugimoto

Abstract

Purpose

Multiple studies have illustrated that the needs of various users seeking descriptive bibliographic data for pop culture resources (e.g. manga, anime, video games) have not been properly met by cultural heritage institutions and traditional models. With a focus on manga as the central resource, the purpose of this paper is to address these issues to better meet user needs.

Design/methodology/approach

Based on an analysis of existing bibliographic metadata, this paper proposes a unique bibliographic hierarchy for manga that is also extendable to other pop culture resources. To better meet user requirements for descriptive data, an aggregation-based approach relying on the Open Archives Initiative Object Reuse and Exchange (OAI-ORE) model utilized existing, fan-created data on the web.
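
The ORE model expresses such aggregations as a resource map whose triples relate one aggregation URI to the resources it gathers. A minimal sketch, with illustrative example URIs (only the `ore:` term namespace is the real standard vocabulary):

```python
def make_resource_map(aggregation_uri, aggregated_uris):
    """Return (subject, predicate, object) triples: a resource map that
    ore:describes an aggregation, which ore:aggregates its parts."""
    ORE = "http://www.openarchives.org/ore/terms/"
    triples = [(aggregation_uri + "#map", ORE + "describes", aggregation_uri)]
    triples += [(aggregation_uri, ORE + "aggregates", uri)
                for uri in aggregated_uris]
    return triples
```

In this spirit, one manga volume's aggregation could gather a publisher record and a fan-wiki page under a single described entity, which is how fan-created descriptions can be reused without re-cataloguing them.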

Findings

The proposed hierarchy is better able to portray multiple entities of manga as they exist across data providers compared to existing models, while the utilization of OAI-ORE-based aggregation to build and provide bibliographic metadata for said hierarchy resulted in levels of description that more adequately meet user demands.

Originality/value

Though studies have proposed alternative models for resources like games or comics, manga has remained unexamined. As manga is a major component of many popular multimedia franchises, focusing on it here, while building the model with the intention of supporting other resource types, provides a foundation for future work seeking to incorporate these resources.

Details

Journal of Documentation, vol. 75 no. 2
Type: Research Article
ISSN: 0022-0418

Article
Publication date: 4 September 2017

Mozhgan Hosseinnezhad and Kamaladin Gharanjig

Abstract

Purpose

The purpose of this paper is to study assembling parameters in dye-sensitised solar cell (DSSC) performance. To this end, 3a,7a-dihydroxy-5ß-cholanic acid (cheno) is selected as the anti-aggregation agent, and two electrolytes, namely tetrabutyl ammonium iodide and the ionic liquid PMII, are used.

Design/methodology/approach

A series of organic dyes was selected, using N-substituted carbazole as the electron donor group and acrylic acid and cyanoacrylic acid as electron acceptor groups. Absorption properties of the purified dyes were studied in solution and on the photoelectrode substrate. DSSCs were prepared in the presence of the anti-aggregation agent and different electrolytes to determine the photovoltaic performance of each dye.

Findings

The results showed that all organic dyes form J-aggregates on the photoanode substrate in the absence of the anti-aggregation agent, and that the amount of aggregation was reduced in its presence. DSSCs were fabricated in the presence of the anti-aggregation agent, and the photovoltaic properties were improved using tetrabutyl ammonium iodide as the electrolyte. The maximum power conversion efficiency was achieved for D12 in the presence of cheno and tetrabutyl ammonium iodide as the anti-aggregation agent and electrolyte, respectively.

Social implications

Organic dyes attract increasing attention owing to their low cost, facile synthesis routes and lower hazard.

Originality/value

The effect of anti-aggregation agent and electrolyte on DSSCs performance was investigated for the first time.

Details

Pigment & Resin Technology, vol. 46 no. 5
Type: Research Article
ISSN: 0369-9420

Article
Publication date: 1 December 2003

Da Ruan, Jun Liu and Roland Carchon

Abstract

A flexible and realistic linguistic assessment approach is developed to provide a mathematical tool for synthesis and evaluation analysis of nuclear safeguards indicator information. This symbolic approach, which acts by direct computation on linguistic terms, is established based on fuzzy set theory. More specifically, a lattice‐valued linguistic algebra model, which is based on a logical algebraic structure of the lattice implication algebra, is applied to represent imprecise information and to deal with both comparable and incomparable linguistic terms (i.e. non‐ordered linguistic values). Within this framework, some weighted aggregation functions introduced by Yager are analyzed and extended to treat this kind of lattice‐valued linguistic information. The application of these linguistic aggregation operators for managing nuclear safeguards indicator information is successfully demonstrated.
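
For the totally ordered special case, Yager's weighted aggregation reduces to the familiar ordered weighted averaging (OWA) operator, sketched below as an illustration on real-valued scores; the paper's lattice-valued extension replaces these reals with lattice elements.

```python
def owa(values, weights):
    """Yager's ordered weighted averaging (OWA) operator: weights apply
    to values sorted in descending order, not to fixed argument slots."""
    assert len(values) == len(weights)
    assert abs(sum(weights) - 1.0) < 1e-9, "weights must sum to 1"
    ranked = sorted(values, reverse=True)
    return sum(w * v for w, v in zip(weights, ranked))
```

Weight vector (1, 0, ..., 0) recovers the maximum, (0, ..., 0, 1) the minimum, and uniform weights the mean, spanning the spectrum between "or-like" and "and-like" aggregation of indicator scores.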

Details

Logistics Information Management, vol. 16 no. 6
Type: Research Article
ISSN: 0957-6053

Book part
Publication date: 28 February 2022

Javier E. Portillo

Abstract

The act of consolidating multiple parcels of land to form a single larger parcel is known as land assembly. Laboratory experiments have enabled researchers to explore how various factors, environments, and institutions hinder or assist the aggregation process. This chapter surveys the experimental literature and highlights the experimental design used in those studies, as well as their main findings.

Details

Experimental Law and Economics
Type: Book
ISBN: 978-1-83867-537-0

Article
Publication date: 3 September 2024

Marco Humbel, Julianne Nyhan, Nina Pearlman, Andreas Vlachidis, JD Hill and Andrew Flinn

Abstract

Purpose

This paper aims to explore the accelerations and constraints libraries, archives, museums and heritage organisations (“collections-holding organisations”) face in their role as collection data providers for digital infrastructures. To date, digital infrastructures within the cultural heritage domain typically operate as data aggregation platforms, such as Europeana or Art UK.

Design/methodology/approach

Semi-structured interviews with 18 individuals in 8 UK collections-holding organisations and 2 international aggregators.

Findings

Discussions about digital infrastructure development often lay great emphasis on questions and problems that are technical and legal in nature. As important as technical and legal matters are, more latent, yet potent, challenges exist too. Though less discussed in the literature, collections-holding organisations' capacity to participate in digital infrastructures depends on a complex interplay of funding allocation across the sector, divergent traditions of collection description and disciplinary idiosyncrasies. Accordingly, the authors call for better social-cultural and trans-sectoral (collections-holding organisations, universities and technological providers) understandings of collection data infrastructure development.

Research limitations/implications

The authors recommend developing more understanding of the social-cultural aspects (e.g. disciplinary conventions) and their impact on collection data dissemination. More studies on the impact and opportunities of unified collections for different audiences and collections-holding organisations themselves are required too.

Practical implications

Sustainable financial investment across the heritage sector is required to address the discrepancies between different organisation types in their capacity to deliver collection data. Smaller organisations play a vital role in diversifying the (digital) historical canon, but they often struggle to digitise collections and bring catalogues online in the first place. In addition, investment in existing infrastructures for collection data dissemination and unification is necessary, instead of creating new platforms, with various levels of uptake and longevity. Ongoing investments in collections curation and high-quality cataloguing are prerequisites for a sustainable heritage sector and collection data infrastructures. Investments in the sustainability of infrastructures are not a replacement for research and vice versa.

Social implications

The authors recommend establishing networks where collections-holding organisations, technology providers and users can communicate their experiences and needs in an ongoing way and influence policy.

Originality/value

To date, the research focus on developing collection data infrastructures has tended to be on the drive to adopt specific technological solutions and copyright licensing practices. This paper offers a critical and holistic analysis of the dispersed experience of collections-holding organisations in their role as data providers for digital infrastructures. The paper contributes to the emerging understanding of the latent factors that make infrastructural endeavours in the heritage sector complex undertakings.

Article
Publication date: 1 March 1992

Romeo Castagna and Massimiliano Galli

Abstract

In a manufacturing system, time performances are measures of the system's response speed to external influences. This speed depends on the resource allocation process (materials, equipment, labour), which is driven by finished‐product forecasts. This paper describes two essential steps in developing a model for evaluating time performances that is able to detect crucial resources. The first step is an analysis of forecast characteristics; the second is a definition of the environment of manufacturing resources. The model, described in its structure and in its relationships with the most common business tools, has been tested in a number of manufacturing firms and the results are shown.

Details

Integrated Manufacturing Systems, vol. 3 no. 3
Type: Research Article
ISSN: 0957-6061
