Search results
1 – 10 of over 5000

Guo‐Qiang Wu, Zhao‐Wei Sun and Xian‐De Wu
Abstract
Purpose
The purpose of this paper is to research the traditional belief‐propagation (BP) decoding algorithm of low‐density parity‐check (LDPC) codes. The heavy computational load is a shortcoming of the traditional BP decoding algorithm. Accordingly, the paper provides an improved BP decoding algorithm that raises decoding efficiency and reduces decoding delay.
Design/methodology/approach
An improved BP decoding algorithm is studied and the error correction performance of the improved BP decoding algorithm in the Gaussian channel is provided.
Findings
Simulation results show that the improved BP decoding algorithm has lower computational complexity and higher decoding speed, at the cost of only a small loss in decoding performance.
Research limitations/implications
The improved BP decoding algorithm has lower computational complexity and higher decoding speed, at the cost of only a small loss in decoding performance.
Practical implications
The improved BP decoding algorithm raises decoding efficiency and reduces decoding delay.
Originality/value
The decoding algorithm updates only the bit information that may be in error, and does not update bit information whose reliability is already very high. The improved BP decoding algorithm raises decoding efficiency and reduces decoding delay.
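The selective-update idea can be sketched as a minimal min-sum BP decoder that refreshes messages only for bits whose channel reliability is below a threshold. This is an illustration, not the paper's exact algorithm: the (7,4) Hamming parity-check matrix and the freeze_thresh value are assumptions chosen for a small, reproducible example.

```python
import numpy as np

# Parity-check matrix of the (7,4) Hamming code (illustrative example).
H = np.array([[1, 0, 1, 0, 1, 0, 1],
              [0, 1, 1, 0, 0, 1, 1],
              [0, 0, 0, 1, 1, 1, 1]])

def bp_decode(llr_ch, H, max_iter=10, freeze_thresh=2.5):
    """Min-sum belief propagation that, after each iteration, refreshes
    variable-to-check messages only for bits whose channel reliability
    is below `freeze_thresh` (a hypothetical threshold)."""
    m, n = H.shape
    frozen = np.abs(llr_ch) >= freeze_thresh        # high-reliability bits
    v2c = np.tile(llr_ch, (m, 1)) * H               # variable-to-check messages
    hard = (llr_ch < 0).astype(int)
    for _ in range(max_iter):
        c2v = np.zeros((m, n))
        for i in range(m):
            idx = np.flatnonzero(H[i])
            for j in idx:
                others = idx[idx != j]
                sgn = np.prod(np.sign(v2c[i, others]))
                c2v[i, j] = sgn * np.min(np.abs(v2c[i, others]))
        total = llr_ch + c2v.sum(axis=0)
        hard = (total < 0).astype(int)
        if not np.any(H @ hard % 2):                # valid codeword: stop early
            break
        for j in np.flatnonzero(~frozen):           # update unreliable bits only
            for i in np.flatnonzero(H[:, j]):
                v2c[i, j] = total[j] - c2v[i, j]
    return hard
```

With one low-reliability bit received in error, the decoder corrects it in the first iteration while leaving the six high-reliability messages untouched, which is the source of the claimed complexity saving.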
Kang Wang, Xingcheng Liu and Paul Cull
Abstract
Purpose
The purpose of this paper is to propose a novel decoding algorithm that decreases the complexity of decoding conventional block turbo codes.
Design/methodology/approach
In this algorithm, the signal‐to‐noise ratio (SNR) values of channels are adaptively estimated. After analyzing the relationship between the statistics of the received vectors R and the channel SNR, an adaptive method of tuning the decoding complexity is presented.
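One blind estimator consistent with this idea is the standard M2M4 moment estimator, which infers the channel SNR directly from the statistics of the received vector R. This is a sketch of that general technique for real-valued BPSK over AWGN, not necessarily the authors' estimator:

```python
import numpy as np

def m2m4_snr(r):
    """Blind M2M4 SNR estimate for real BPSK over AWGN.
    For r = s + n with constant-modulus s and real Gaussian n:
    M2 = S + N and M4 = S^2 + 6*S*N + 3*N^2, hence S = sqrt((3*M2^2 - M4)/2)."""
    m2 = np.mean(r**2)
    m4 = np.mean(r**4)
    sig = np.sqrt(max((3 * m2**2 - m4) / 2, 0.0))   # signal-power estimate
    noise = max(m2 - sig, 1e-12)                    # noise-power estimate
    return sig / noise

# Usage: 100k BPSK symbols at a true linear SNR of 4 (about 6 dB).
rng = np.random.default_rng(0)
symbols = 1 - 2 * rng.integers(0, 2, 100_000)       # +/-1, unit power
r = symbols + rng.normal(scale=0.5, size=symbols.size)  # noise power 0.25
snr_hat = m2m4_snr(r)                               # close to 4.0
```

An estimate of this kind could then drive the adaptive tuning the abstract describes, e.g. allowing fewer test patterns per decoded block at high estimated SNR.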
Findings
Simulation results show that the proposed algorithm has greatly decreased the decoding complexity and sped up the decoding process while achieving better bit error rate performance.
Originality/value
Simulation experiments described in this paper show that the proposed algorithm can decrease the decoding complexity, shorten the decoding time and achieve good decoding performance.
Abstract
Literacy demands have changed over the years and for success in society it is necessary to handle a wide range of texts and written information. The school has been criticized for not giving their pupils the necessary abilities to handle the kind of information they are faced with in society. One important dimension of literacy is reading comprehension, but even though much written information has the form of tables, drawings, graphs, etc., such presentations are most often accompanied by written text. This chapter focuses on the comprehension of different kinds of written information, and data from different tasks are evaluated in light of the simple view of reading. A total of 132 grade 6 readers were given four reading comprehension tasks concurrently with a decoding task and a listening comprehension task. It was found that the sum of decoding and listening comprehension accounted for a larger part of the variance in all the reading comprehension tasks than the product of decoding and listening comprehension. The pupils' results on a naming task and morphological tasks from preschool accounted for significant parts of the variance in the comprehension of both plain text and text combined with tables and graphs over and above the concurrent decoding and listening comprehension results. Speed of orthographic identification in 2nd grade accounted for an additional, significant part of the variance in the plain text reading tasks. These results show that processing speed and linguistic knowledge, such as morphological knowledge, are important contributors to the comprehension of different kinds of written information. Even if speed of orthographic identification is especially important for comprehending plain texts, a broad linguistic and cognitive perspective seems to be important when preparing pupils to comprehend different kinds of written material.
Marcel Fernandez, Josep Cotrina‐Navau and Miguel Soriano
Abstract
Purpose
The purpose of this paper is to show that a fingerprinting code is a set of code words that are embedded in each copy of a digital object, with the purpose of making each copy unique. If the fingerprinting code is c‐secure, then the decoding of a pirate word created by a coalition of at most c dishonest users will expose at least one of the guilty parties.
Design/methodology/approach
The paper presents a systematic strategy for collusions attacking a fingerprinting scheme. As a particular case, this strategy shows that linear codes are not good fingerprinting codes. Based on binary linear equidistant codes, the paper constructs a family of fingerprinting codes in which the identification of guilty users can be efficiently done using minimum distance decoding. Moreover, in order to obtain codes with a better rate a 2‐secure fingerprinting code is also constructed by concatenating a code from the previous family with an outer IPP code.
Findings
The particular choice of the codes is such that it allows the use of efficient decoding algorithms that correct errors beyond the error correction bound of the code, namely a simplified version of the Chase algorithms for the inner code and the Koetter‐Vardy soft‐decision list decoding algorithm for the outer code.
Originality/value
The paper presents a fingerprinting code together with an efficient tracing algorithm.
Tao Bao, Jiadong Xu and Gao Wei
Abstract
Purpose
The purpose of this paper is to design decoding software for Reed‐Solomon (RS) codes, using an efficient degree‐computationless algorithm based on the Euclidean algorithm. As a consequence, the complexity of this new decoding algorithm is dramatically reduced.
Design/methodology/approach
Applying the rules of polynomial arithmetic in a finite field, operation modules which can carry out multiplication and inverse calculation in GF(2^8) are designed in "C++", and an RS codec software based on these is implemented. In this software, the new decoding algorithm computes the error locator polynomial and the error evaluator polynomial simultaneously without performing polynomial divisions, and there is no need for the degree computation cell and the degree comparison cell.
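A multiplication module of the kind described can be sketched in a few lines (shown here in Python rather than C++ for brevity). The primitive polynomial 0x11d, i.e. x^8 + x^4 + x^3 + x^2 + 1, is a common choice for RS(255, k) codes and is an assumption here, since the paper's software makes the polynomial configurable:

```python
def gf_mul(a, b, prim=0x11d):
    """Carry-less "Russian peasant" multiplication in GF(2^8).
    `prim` is the field's primitive polynomial; 0x11d is a widely
    used choice for RS(255, k) codes (an assumption, not the paper's
    fixed setting)."""
    result = 0
    while b:
        if b & 1:
            result ^= a        # add (XOR) the current shifted multiplicand
        b >>= 1
        a <<= 1
        if a & 0x100:          # degree reached 8: reduce mod prim
            a ^= prim
    return result & 0xFF
```

An inverse module can then be built on top of this, e.g. by exponentiation, since a^254 = a^(-1) for nonzero a in GF(2^8).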
Findings
With the help of this software, RS codes with different values of t and different primitive polynomials over GF(2^m) can easily be evaluated without re‐designing the programme.
Originality/value
The software can serve as an efficient auxiliary tool for algorithm development and verification, alongside hardware design and debugging. Furthermore, an illustrative example of the (255, 223) RS code using this program shows that the decoding process is approximately three times faster than that of conventional decoding software.
Christy R. Austin and Sharon Vaughn
Abstract
A substantial number of students read significantly below grade level, and students with disabilities perform far below their non-disabled peers. Reading achievement data indicate that many students with and at-risk for reading disabilities require more intensive reading interventions. This chapter utilizes the theoretical model of the Simple View of Reading to describe the benefit of early reading instruction, targeting both word reading and word meaning. In addition, evidence is presented supporting the use of word meaning instruction to improve accurate and efficient word reading for students who have failed to respond to explicit decoding instruction.
W. Pedrycz and E. Roventa
Abstract
The concept of fuzzy information becomes a cornerstone of processing and handling linguistic data. As opposed to processing of numeric information where there is a wealth of advanced methods, by entering the area of linguistic information processing we are immediately faced with a genuine need to revisit the fundamental concepts. We first review a notion of information granularity as a primordial concept playing a key role in human cognition. Dwelling on that, the study embarks on the concept of interacting at the level of fuzzy sets. In particular, we discuss a basic construct of a fuzzy communication channel. The ideas of communication exploiting fuzzy information call for its efficient encoding and decoding that subsequently leads to minimal losses of transmitted information. Interestingly enough, the incurred losses depend heavily on the granularity of the linguistic information involved – in this way one can take advantage of the uncertainty residing within the transmitted information granules and exploit it in the design of the corresponding channel.
Abstract
Purpose
In the last half-century, individual sensory neurons have been bestowed with characteristics of the whole human being, such as behavior and its oft-presumed precursor, consciousness. This anthropomorphization is pervasive in the literature. It is also absurd, given what we know about neurons, and it needs to be abolished. This study aims to first understand how it happened, and hence why it persists.
Design/methodology/approach
The peer-reviewed sensory-neurophysiology literature extends to hundreds (perhaps thousands) of papers. Here, more than 90 mainstream papers were scrutinized.
Findings
Anthropomorphization arose because single neurons were cast as “observers” who “identify”, “categorize”, “recognize”, “distinguish” or “discriminate” the stimuli, using math-based algorithms that reduce (“decode”) the stimulus-evoked spike trains to the particular stimuli inferred to elicit them. Without “decoding”, there is supposedly no perception. However, “decoding” is both unnecessary and unconfirmed. The neuronal “observer” in fact consists of the laboratory staff and the greater society that supports them. In anthropomorphization, the neuron becomes the collective.
Research limitations/implications
Anthropomorphization underlies the widespread application of Information Theory and Signal Detection Theory to neurons, making both applications incorrect.
Practical implications
A great deal of time, money and effort has been wasted on anthropomorphic Reductionist approaches to understanding perception and consciousness. Those resources should be diverted into more-fruitful approaches.
Originality/value
A long-overdue scrutiny of sensory-neuroscience literature reveals that anthropomorphization, a form of Reductionism that involves the presumption of single-neuron consciousness, has run amok in neuroscience. Consciousness is more likely to be an emergent property of the brain.
Abstract
Shows that signal quantization can be conveniently captured and quantified in the language of information granules. Optimal codebooks exploited in any signal quantization (discretization) lend themselves to the underlying fundamental issues of information granulation. The paper elaborates on and contrasts various forms of information granulation such as set theory, shadowed sets, and fuzzy sets. It is revealed that a set‐based codebook can be easily enhanced by the use of shadowed sets. This also raises awareness about the performance of the quantization process and helps increase its quality by defining additional elements of the codebook and specifying their range of applicability. We show how different information granules contribute to the performance of signal quantization. The role of clustering techniques giving rise to information granules is also analyzed. Some pertinent theoretical results are derived. It is shown that fuzzy sets defined in terms of piecewise linear membership functions with 1/2 overlap between any two adjacent terms of the codebook give rise to the effect of lossless quantization. The study addresses both scalar and multivariable quantization. Numerical studies are included to help illustrate the quantization mechanisms carried out in the setting of granular computing.
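The lossless-quantization effect for triangular membership functions with 1/2 overlap is easy to verify numerically: for any input between two adjacent codebook prototypes, the two nonzero membership grades sum to 1, and weighted-average decoding reconstructs the input exactly. A minimal sketch, with an arbitrary prototype set and inputs assumed to lie inside the codebook range:

```python
import numpy as np

def encode(x, proto):
    """Membership grades of x in triangular fuzzy sets with 1/2 overlap,
    anchored at the sorted codebook prototypes `proto`."""
    u = np.zeros(len(proto))
    right = int(np.searchsorted(proto, x))
    if right == 0:                       # at (or below) the first prototype
        u[0] = 1.0
        return u
    left = right - 1
    t = (x - proto[left]) / (proto[right] - proto[left])
    u[left], u[right] = 1.0 - t, t       # grades sum to 1 by construction
    return u

def decode(u, proto):
    """Weighted-average (centroid-style) decoding of the grades."""
    return float(u @ proto)

proto = np.array([0.0, 1.0, 2.5, 4.0])   # arbitrary codebook
x_hat = decode(encode(1.9, proto), proto)  # reconstructs 1.9 exactly
```

A set-based (crisp) codebook would instead snap 1.9 to the nearest prototype, which is precisely the quantization loss the fuzzy codebook avoids.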
Rayne Reid and Johan Van Niekerk
Abstract
Purpose
This research aims to determine whether the educational influence of the cybersecurity awareness campaign on the audience (their knowledge, behaviour and potential cybersecurity culture) matches the campaign’s educational objectives. The research focuses on the knowledge component of this metric by examining the awareness campaign audience’s interpretative role in processing the campaign content, through the lens of active audience theory (AAT).
Design/methodology/approach
Using reflective practices, this research examines a single longitudinal case study of a cybersecurity awareness and education campaign which aims to raise awareness amongst school learners. Artefacts from a single sample are examined.
Findings
Reflexive practices using theories such as active audience can assist in identifying deviations between the message a campaign intends to communicate and the message that the campaign audience receives.
Research limitations/implications
Using this research approach, measurements could only be obtained for campaign messages depicted in artefacts. Future interventions should be designed to facilitate a more rigorous analysis of the audiences' interpretation of all campaign messages using AAT.
Originality/value
This paper applied principles of AAT to examine the audience's interpretative role in processing an awareness campaign's content based on artefacts they created after exposure to the campaign. Conducting such analyses as part of a reflective process between cyber awareness/education campaign cycles provides a way to identify areas or topics within the campaign that require corrective action.