Search results

1 – 10 of over 6000
Open Access
Article
Publication date: 28 June 2022

Olli Väänänen and Timo Hämäläinen

Abstract

Purpose

Minimizing the energy consumption in a wireless sensor node is important for lengthening the lifetime of a battery. Radio transmission is the most energy-consuming task in a wireless sensor node, and by compressing the sensor data in the online mode, it is possible to reduce the number of transmission periods. This study aims to demonstrate that temporal compression methods present an effective method for lengthening the lifetime of a battery-powered wireless sensor node.

Design/methodology/approach

In this study, the energy consumption of a LoRa-based sensor node was evaluated and measured. The experiments were conducted with different LoRaWAN data rate parameters, with and without compression algorithms implemented to compress sensor data in online mode. The effect of temporal compression algorithms on the overall energy consumption was measured.

Findings

Energy consumption was measured with different LoRaWAN spreading factors. The LoRaWAN transmission energy consumption significantly depends on the spreading factor used. The other significant factors affecting the LoRa-based sensor node energy consumption are the measurement interval and sleep mode current consumption. The results show that temporal compression algorithms are an effective method for reducing the energy consumption of a LoRa sensor node by reducing the number of LoRa transmission periods.

Originality/value

This paper demonstrates, through a practical case, that it is possible to reduce the overall energy consumption of a wireless sensor node by compressing sensor data in online mode with simple temporal compression algorithms.
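
The temporal compression the abstract describes can be illustrated with a send-on-delta (dead-band) scheme, one of the simplest online techniques of this family. This is an illustrative sketch, not the authors' implementation; the threshold and the temperature readings below are assumed values for demonstration.

```python
def send_on_delta(samples, threshold):
    """Transmit a sample only when it deviates from the last
    transmitted value by more than the threshold."""
    transmitted = []
    last = None
    for t, value in enumerate(samples):
        if last is None or abs(value - last) > threshold:
            transmitted.append((t, value))
            last = value
    return transmitted

readings = [20.0, 20.1, 20.1, 20.6, 20.7, 21.3, 21.3, 21.2]
kept = send_on_delta(readings, threshold=0.4)
# Fewer transmissions than raw samples -> fewer radio-on periods.
```

Because radio transmission dominates the node's energy budget, each suppressed sample is a transmission period saved, at the cost of a bounded reconstruction error.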

Details

Sensor Review, vol. 42 no. 5
Type: Research Article
ISSN: 0260-2288

Article
Publication date: 23 November 2010

Nils Hoeller, Christoph Reinke, Jana Neumann, Sven Groppe, Christian Werner and Volker Linnemann

Abstract

Purpose

In the last decade, XML has become the de facto standard for data exchange in the world wide web (WWW). The benefits of data exchangeability in supporting system and software heterogeneity at the application level, together with easy WWW integration, make XML an ideal data format for many other application and network scenarios such as wireless sensor networks (WSNs). Moreover, the usage of XML encourages using standardized techniques like SOAP to adapt the service‐oriented paradigm to sensor network engineering. Nevertheless, integrating XML into WSN data management is limited by the nodes' low hardware resources, which call for efficient XML data management strategies capable of bridging this resource gap. The purpose of this paper is to present two separate strategies for integrating XML data management in WSNs.

Design/methodology/approach

The paper presents two separate strategies for integrating XML data management in WSNs that have been implemented and are running on today's sensor node platforms. The paper shows how XML data can be processed and how XPath queries can be evaluated dynamically. In an extended evaluation, the performance of both strategies concerning memory and energy efficiency is compared, and both solutions are shown to be fully applicable to today's sensor node products.

Findings

This work shows that dynamic XML data management and query evaluation are possible on sensor nodes with strict limitations in terms of memory, processing power and energy supply.

Originality/value

The paper presents an optimized stream‐based XML compression technique and shows how XML queries can be evaluated on compressed XML bit streams using generic pushdown automata. To the best of the authors' knowledge, this is the first complete approach on integrating dynamic XML data management into WSNs.
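The core idea of querying a compressed XML stream with a pushdown mechanism can be sketched as follows. This is a deliberate simplification (attributes and bit-level packing omitted), not the authors' actual stream format: tag names are replaced by small dictionary codes, and a stack of open-element codes, standing in for the pushdown automaton's stack, lets an /a/b-style query be evaluated directly on the token stream without decompressing it.

```python
import re

def compress(xml):
    """Replace element names with small dictionary codes
    (attributes omitted for simplicity)."""
    dictionary, tokens = {}, []
    for close, tag, text in re.findall(r'<(/?)([A-Za-z]\w*)>|([^<]+)', xml):
        if tag:
            code = dictionary.setdefault(tag, len(dictionary))
            tokens.append(('/' if close else '') + str(code))
        elif text.strip():
            tokens.append('=' + text.strip())
    return dictionary, tokens

def match_path(dictionary, tokens, path):
    """Evaluate an /a/b-style query on the compressed stream."""
    want = [str(dictionary.get(p, -1)) for p in path.strip('/').split('/')]
    stack, hits = [], []
    for tok in tokens:
        if tok.startswith('='):           # text node
            if stack == want:
                hits.append(tok[1:])
        elif tok.startswith('/'):         # closing tag: pop
            stack.pop()
        else:                             # opening tag: push
            stack.append(tok)
    return hits

doc = '<data><temp>21</temp><temp>22</temp><hum>40</hum></data>'
codes, stream = compress(doc)
print(match_path(codes, stream, '/data/temp'))  # ['21', '22']
```

The point the paper makes is that this style of evaluation needs only the dictionary and a small stack, which fits the memory constraints of a sensor node.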

Details

International Journal of Web Information Systems, vol. 6 no. 4
Type: Research Article
ISSN: 1744-0084

Article
Publication date: 3 October 2016

Yair Wiseman

Abstract

Purpose

The purpose of this paper is to study extensive enlargement and safety of flight data recorder memory.

Design/methodology/approach

The study involves moving the memory of flight data recorders from an internal embedded device to the cloud.

Findings

The implementation has made the embedded memory of the flight data recorder effectively unlimited; hence, much more information can be stored.

Research limitations/implications

The probability of a flight data recorder being damaged or lost in a crash is not high, but the implementation can be very helpful in cases such as aerial disappearances.

Practical implications

The implication is larger and protected memory for flight data recorders.

Social implications

Finding the causes of crashes is faster, and immediate action can be taken to remedy the failures.

Originality/value

The use of the internet and cellphones in airplanes is commonplace at present; it is suggested that this technology be applied to flight data recorders as well.

Details

Aircraft Engineering and Aerospace Technology, vol. 88 no. 6
Type: Research Article
ISSN: 1748-8842

Article
Publication date: 13 March 2007

B. Pradhan, K. Sandeep, Shattri Mansor, Abdul Rahman Ramli and Abdul Rashid B. Mohamed Sharif

Abstract

Purpose

In GIS applications, a realistic representation of a terrain requires a great number of triangles, which ultimately increases the data size. For online interactive GIS programs, it has become highly essential to reduce the number of triangles in order to save storage space. Therefore, there is a need to visualize terrains at different levels of detail: for example, a region of high interest should be shown at a higher resolution than a region of low or no interest. Wavelet technology provides an efficient approach to achieve this; using it, one can decompose terrain data into a hierarchy. On the other hand, the number of triangles at subsequent levels should not be reduced too aggressively; otherwise, the terrain is poorly represented.

Design/methodology/approach

This paper proposes a new computational code (please see Appendix for the flow chart and pseudo code) for triangulated irregular networks (TINs) using Delaunay triangulation methods. The algorithms have proved to be efficient tools in numerical methods such as the finite element method and in image processing. Further, second‐generation wavelet techniques, popularly known as “lifting schemes”, have been applied to compress the TIN data.

Findings

A new interpolation wavelet filter for TIN has been applied in two steps, namely splitting and elevation. In the splitting step, a triangle has been divided into several sub‐triangles and the elevation step has been used to “modify” the point values (point coordinates for geometry) after the splitting. Then, this data set is compressed at the desired locations by using second generation wavelets.

Originality/value

A new algorithm for second‐generation wavelet compression has been proposed for TIN data compression. The quality of the geographical surface representation after using the proposed technique is compared with the original terrain. The results show that this method can be used for significant reduction of the data set.
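The split/predict machinery of a lifting scheme can be sketched in one dimension. This is an illustrative simplification of the second-generation wavelet idea the abstract describes (the real method operates on triangle meshes); the height profile below is an assumed example. Smooth terrain yields small detail coefficients, which is what makes the representation compressible.

```python
def lift(signal):
    """One predict step: split into evens/odds, predict each odd sample
    linearly from its even neighbours, keep only the residual."""
    evens, odds = signal[0::2], signal[1::2]
    details = [o - (evens[i] + evens[min(i + 1, len(evens) - 1)]) / 2
               for i, o in enumerate(odds)]
    return evens, details

def unlift(evens, details):
    """Exact inverse: re-predict the odds and add the residuals back."""
    out = []
    for i, e in enumerate(evens):
        out.append(e)
        if i < len(details):
            pred = (evens[i] + evens[min(i + 1, len(evens) - 1)]) / 2
            out.append(details[i] + pred)
    return out

heights = [10.0, 10.5, 11.0, 11.5, 12.0, 12.5, 12.0, 11.5]
evens, details = lift(heights)
# Smooth terrain -> mostly zero details; thresholding them compresses.
```

Applying the same step recursively to the coarse (even) samples yields the multi-resolution hierarchy used for level-of-detail terrain rendering.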

Details

Engineering Computations, vol. 24 no. 2
Type: Research Article
ISSN: 0264-4401

Article
Publication date: 1 August 1999

Schubert Foo, Siu Cheung Hui and See Wai Yip

Abstract

The Internet environment, with its packet‐switched network and lack of resource reservation mechanisms, has made the delivery of low bit‐rate real‐time communication services particularly difficult and challenging. The potentially high transmission delay and data packet loss under varying network conditions lead to unpleasant and unintelligible audio and jerky video play‐out. The Internet TCP/IP protocol suite can be extended with new mechanisms in an attempt to tackle such problems. In this research, an integrated transmission mechanism that incorporates a number of existing techniques to enhance the quality and deliver “acceptable” real‐time services is proposed. These techniques include the use of data compression, data buffering, dynamic rate control, packet loss replacement, silence deletion and a virtual video play‐out mechanism. The proposed transmission mechanism is designed as a generic communication system so that it can be used in different systems and under different conditions. This approach has been successfully implemented and demonstrated using three separate systems: an Internet Phone, WebVideo and a video‐conferencing tool.
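The buffering and packet-loss-replacement ideas listed above can be sketched with a minimal play-out routine. This is an illustrative assumption, not the paper's mechanism: sequence numbers index the packets that arrived in time, and a lost packet is replaced by repeating the previous one, a simple concealment policy chosen here for clarity.

```python
def play_out(received, total):
    """received: dict of seq -> payload for packets that arrived in time.
    Lost packets are concealed by repeating the last good payload."""
    stream, last = [], b''
    for seq in range(total):
        if seq in received:
            last = received[seq]
        stream.append(last)   # either the fresh packet or the replacement
    return stream

arrived = {0: b'A', 1: b'B', 3: b'D'}   # packet 2 was lost in transit
print(play_out(arrived, 4))  # [b'A', b'B', b'B', b'D']
```

Repeating the previous audio frame is far less jarring than playing silence, which is why loss replacement pairs naturally with the buffering the abstract mentions.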

Details

Internet Research, vol. 9 no. 3
Type: Research Article
ISSN: 1066-2243

Open Access
Article
Publication date: 29 July 2020

Mahmood Al-khassaweneh and Omar AlShorman

Abstract

In the big data era, image compression is of significant importance. Compression of large images is required for everyday tasks, including electronic data communications and internet transactions. However, two important measures should be considered for any compression algorithm: the compression factor and the quality of the decompressed image. In this paper, the Frei-Chen bases technique and modified Run Length Encoding (RLE) are used to compress images. The Frei-Chen bases technique is applied in the first stage, in which the average subspace is applied to each 3 × 3 block. Blocks with the highest energy are replaced by a single value that represents the average value of the pixels in the corresponding block. Even though the Frei-Chen bases technique provides lossy compression, it maintains the main characteristics of the image while enhancing the compression factor. In the second stage, RLE is applied to further increase the compression factor without adding any distortion to the resultant decompressed image. Integrating RLE with the Frei-Chen bases technique, as described in the proposed algorithm, ensures high-quality decompressed images and a high compression rate. The results of the proposed algorithm are shown to be comparable in quality and performance with other existing methods.
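The lossless second stage is plain run-length encoding, which can be sketched in a few lines. The pixel values below are illustrative, not from the paper; the point is that replacing averaged 3 × 3 blocks with a single value creates exactly the long runs that RLE exploits.

```python
def rle_encode(data):
    """Collapse each run of equal values into a (value, count) pair."""
    runs, i = [], 0
    while i < len(data):
        j = i
        while j < len(data) and data[j] == data[i]:
            j += 1
        runs.append((data[i], j - i))
        i = j
    return runs

def rle_decode(runs):
    """Exact inverse: expand each pair back into its run."""
    return [v for v, n in runs for _ in range(n)]

block = [128, 128, 128, 128, 7, 7, 200, 200, 200]
runs = rle_encode(block)
# 9 values become 3 (value, count) pairs; decoding is lossless.
```

Because RLE is lossless, the second stage raises the compression factor without adding any distortion beyond that of the Frei-Chen stage.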

Details

Applied Computing and Informatics, vol. 20 no. 1/2
Type: Research Article
ISSN: 2634-1964

Article
Publication date: 20 December 2007

Chuanfeng Lv and Qiangfu Zhao

Abstract

Purpose

In recent years, principal component analysis (PCA) has attracted great attention in dimension reduction. However, since a very large transformation matrix must be used for reconstructing the original data, PCA has not been successfully applied to image compression. To solve this problem, this paper aims to propose a new technique called k‐PCA.

Design/methodology/approach

Actually, k‐PCA is a combination of vector quantization (VQ) and PCA. The basic idea is to divide the problem space into k clusters using VQ, and then find a PCA encoder for each cluster. The point is that if the k‐PCA encoder is obtained using data containing enough information, it can be used as a semi‐universal encoder to compress all images in a given domain.

Findings

Although a k‐PCA encoder is more complex than a single PCA encoder, the compression ratio can be much higher because the transformation matrices can be excluded from the encoded data. The performance of the k‐PCA encoder can be improved further through learning. For this purpose, this paper proposes an extended LBG algorithm.

Originality/value

The effectiveness of the k‐PCA is demonstrated through experiments with several well‐known test images.
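The VQ-plus-PCA combination can be sketched with NumPy. This is an illustrative reading of the idea, not the authors' encoder: the cluster count, dimensions and random data below are assumptions, plain Lloyd's k-means stands in for the VQ stage, and an SVD-based PCA is fitted per cluster.

```python
import numpy as np

rng = np.random.default_rng(0)

def kmeans(X, k, iters=20):
    """Plain Lloyd's algorithm standing in for the VQ stage."""
    centers = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        labels = np.argmin(((X[:, None, :] - centers) ** 2).sum(-1), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return labels, centers

def fit_pca(X, dims):
    """Per-cluster PCA encoder: a mean vector plus a small basis."""
    mean = X.mean(axis=0)
    _, _, vt = np.linalg.svd(X - mean, full_matrices=False)
    return mean, vt[:dims]

X = rng.normal(size=(200, 8))              # stand-in for image blocks
labels, centers = kmeans(X, k=4)
encoders = [fit_pca(X[labels == j], dims=2) for j in range(4)]

# Encoding a vector: quantize to the nearest center, then project onto
# that cluster's basis -> only (cluster id, 2 coefficients) are stored.
x = X[0]
j = int(np.argmin(((centers - x) ** 2).sum(-1)))
mean, basis = encoders[j]
coefficients = basis @ (x - mean)
```

The semi-universal aspect is that, once trained on representative data, the k encoders are fixed and shared, so the transformation matrices never need to travel with the compressed images.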

Details

International Journal of Pervasive Computing and Communications, vol. 3 no. 2
Type: Research Article
ISSN: 1742-7371

Article
Publication date: 1 April 1974

Susan E. Creasey, Michael F. Lynch and J. Howard Petrie

Abstract

The application of a variable‐to‐fixed‐length compression coding technique to two bibliographic databases (MARC and INSPEC) is described. By transforming characters or digrams into bit patterns that more accurately reflect the character distributions in the databases, and then applying the encoding process, varying degrees of compression can be obtained.
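A close relative of this technique is digram substitution, sketched below. The digram table is an illustrative assumption, not derived from the MARC or INSPEC data: frequent character pairs are mapped to single fixed-length codes, so variable-length input chunks (one or two characters) each cost one output symbol.

```python
DIGRAMS = ['th', 'he', 'in', 'er', 'an']      # assumed frequent pairs
TABLE = {d: i for i, d in enumerate(DIGRAMS)}

def digram_encode(text):
    """Greedily emit one code per digram, or a literal per character."""
    out, i = [], 0
    while i < len(text):
        pair = text[i:i + 2]
        if pair in TABLE:
            out.append(TABLE[pair])                   # one digram code
            i += 2
        else:
            out.append(len(DIGRAMS) + ord(text[i]))   # literal character
            i += 1
    return out

encoded = digram_encode('the scanner')
# 11 characters become 8 fixed-length codes ('th', 'an', 'er' matched).
```

Compression improves with how well the digram table matches the actual character distribution of the database, which is exactly the tuning the abstract describes.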

Details

Program, vol. 8 no. 4
Type: Research Article
ISSN: 0033-0337

Article
Publication date: 6 January 2022

Ahmad Latifian

Abstract

Purpose

Big data has posed problems for businesses, the Information Technology (IT) sector and the science community. These problems can be effectively addressed using cloud computing and associated distributed computing technology. Cloud computing and big data are two significant topics of recent years that allow high-efficiency, competitive computing tools to be delivered as IT services. The paper aims to examine the role of the cloud as a tool for managing big data in its various aspects to help businesses.

Design/methodology/approach

This paper delivers solutions in the cloud for storing, compressing, analyzing and processing big data. The reviewed articles were divided into four categories: big data storage, big data processing, big data analysis and, finally, data compression in cloud computing. This article is based on a systematic literature review of 19 published papers on big data.

Findings

From the results, it can be inferred that cloud computing technology has features that can be useful for big data management. Challenging issues are raised in each section. For example, in storing big data, privacy and security issues are challenging.

Research limitations/implications

There were limitations to this systematic review. The first limitation is that only English articles were reviewed. Also, articles that matched the keywords were used. Finally, in this review, authoritative articles were reviewed, and slides and tutorials were avoided.

Practical implications

The research presents new insight into the business value of cloud computing in interfirm collaborations.

Originality/value

Previous research has often examined other aspects of big data in the cloud. This article takes a new approach to the subject. It allows big data researchers to comprehend the various aspects of big data management in the cloud. In addition, setting an agenda for future research saves time and effort for readers searching for topics within big data.

Details

Kybernetes, vol. 51 no. 6
Type: Research Article
ISSN: 0368-492X

Article
Publication date: 29 January 2018

Wasim Ahmad Bhat

Abstract

Purpose

The purpose of this paper is to investigate the prospects of current storage technologies for long-term preservation of big data in digital libraries.

Design/methodology/approach

The study employs a systematic and critical review of the relevant literature to explore the prospects of current storage technologies for long-term preservation of big data in digital libraries. Online computer databases were searched to identify the relevant literature published between 2000 and 2016. A specific inclusion and exclusion criterion was formulated and applied in two distinct rounds to determine the most relevant papers.

Findings

The study concludes that the current storage technologies are not viable for long-term preservation of big data in digital libraries. They can neither fulfil all the storage demands nor alleviate the financial expenditures of digital libraries. The study also points out that migrating to emerging storage technologies in digital libraries is a long-term viable solution.

Research limitations/implications

The study suggests that continuous innovation and research efforts in current storage technologies are required to lessen the impact of storage shortage on digital libraries, and to allow emerging storage technologies to advance further and take over. At the same time, more aggressive research and development efforts are required by academics and industry to further advance the emerging storage technologies for their timely and swift adoption by digital libraries.

Practical implications

The study reveals that digital libraries, besides incurring significant financial expenditures, will suffer from potential loss of information due to storage shortage for long-term preservation of big data, if current storage technologies are employed by them. Therefore, policy makers and practitioners should meticulously choose storage technologies for long-term preservation of big data in digital libraries.

Originality/value

This type of holistic study that investigates the prospects of magnetic drive technology, solid-state drive technology, and data-reduction techniques for long-term preservation of big data in digital libraries has not been conducted in the field previously, and so provides a novel contribution. The study arms academics, practitioners, policy makers, and industry with the deep understanding of the problem, technical details to choose storage technologies meticulously, greater insight to frame sustainable policies, and opportunities to address various research problems.

Details

Library Hi Tech, vol. 36 no. 3
Type: Research Article
ISSN: 0737-8831
