Search results

1 – 10 of over 12000
Article
Publication date: 14 March 2024

Qiang Wen, Lele Chen, Jingwen Jin, Jianhao Huang and HeLin Wan

Fixed mode noise and random mode noise always exist in image sensors, affecting imaging quality. The charge diffusion and color mixing between…

Abstract

Purpose

Fixed mode noise and random mode noise always exist in image sensors, affecting imaging quality. The charge diffusion and color mixing between pixels during photoelectric conversion are forms of fixed mode noise. This study aims to improve image sensor imaging quality by processing the fixed mode noise.

Design/methodology/approach

Through iterative training of a long short-term memory (LSTM) recurrent neural network model, the authors obtain a neural network model able to compensate for image noise crosstalk. To overcome the lack of differences between same-color pixels on each template of the image sensor under flat-field illumination, the data before and after compensation were used as a new data set to further train the neural network iteratively.
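
The abstract does not spell out the network or training details; the sketch below is only a rough illustration of the iterative scheme it describes, written in PyTorch with hypothetical tensor shapes, layer sizes and training settings (none of which come from the paper): an LSTM learns to map raw pixel rows to compensated rows, and its compensated output is folded back in as a new data set for a further training round.

```python
import torch
import torch.nn as nn

class CrosstalkCompensator(nn.Module):
    """Hypothetical LSTM that maps a row of raw pixel values to a compensated row."""
    def __init__(self, hidden: int = 128):
        super().__init__()
        self.lstm = nn.LSTM(input_size=1, hidden_size=hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, rows):                      # rows: (batch, pixels, 1)
        features, _ = self.lstm(rows)
        return self.head(features)                # compensated rows, same shape

def train_iteratively(raw_rows, reference_rows, rounds: int = 3, epochs: int = 50):
    """raw_rows / reference_rows: (N, pixels, 1) float tensors (assumed data layout)."""
    model = CrosstalkCompensator()
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
    loss_fn = nn.MSELoss()
    inputs = raw_rows
    for _ in range(rounds):                       # outer iterative-learning loop
        for _ in range(epochs):
            optimizer.zero_grad()
            loss = loss_fn(model(inputs), reference_rows)
            loss.backward()
            optimizer.step()
        with torch.no_grad():
            # Use the compensated output as a new data set for the next round,
            # mirroring the before/after retraining idea in the abstract.
            inputs = model(inputs).detach()
    return model

# Toy usage with random stand-in data (32 rows of 64 pixels).
raw = torch.rand(32, 64, 1)
ref = torch.rand(32, 64, 1)
model = train_iteratively(raw, ref, rounds=2, epochs=10)
```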

Findings

A comparison of the images compensated by the two sets of neural network models shows that the gray value distribution is more concentrated and uniform. The middle- and high-frequency components of the spatial spectrum all increase, indicating that edges in the compensated images change faster and carry more detail (Hinton and Salakhutdinov, 2006; LeCun et al., 1998; Mohanty et al., 2016; Zang et al., 2023).
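
The spectral comparison reported here could be reproduced along the following lines; the NumPy sketch below is an illustration only, with image arrays and the frequency-band boundary invented for the example rather than taken from the paper.

```python
import numpy as np

def band_energy(image: np.ndarray, low_cut: float = 0.1) -> float:
    """Fraction of spectral energy above a normalised radial frequency low_cut.

    Illustrative only: the band boundary is arbitrary, not taken from the paper.
    """
    spectrum = np.abs(np.fft.fftshift(np.fft.fft2(image))) ** 2
    h, w = image.shape
    fy = np.fft.fftshift(np.fft.fftfreq(h))
    fx = np.fft.fftshift(np.fft.fftfreq(w))
    radius = np.sqrt(fy[:, None] ** 2 + fx[None, :] ** 2)
    return spectrum[radius > low_cut].sum() / spectrum.sum()

# Example: compare mid/high-frequency content before and after compensation.
before = np.random.rand(256, 256)   # stand-ins for real sensor frames
after = np.random.rand(256, 256)
print(band_energy(before), band_energy(after))
```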

Originality/value

In this paper, the authors use an iterative-learning color image pixel crosstalk compensation method to effectively alleviate the incomplete color mixing caused by an insufficient filter rate and the electrical crosstalk caused by lateral diffusion of photogenerated charge into adjacent pixels' potential traps.

Details

Sensor Review, vol. 44 no. 2
Type: Research Article
ISSN: 0260-2288

Keywords

Article
Publication date: 4 April 2023

Chao Ren, Xiaoxing Liu and Ziyan Zhu

The purpose of this paper is to test the invulnerability of the guarantee network at the equilibrium point.

Abstract

Purpose

The purpose of this paper is to test the invulnerability of the guarantee network at the equilibrium point.

Design/methodology/approach

This paper introduces a tractable guarantee network model that captures the invulnerability of the network under cascade-based attacks. Furthermore, equilibrium points are introduced for banks to determine loan origination.
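
The abstract does not give the formal cascade mechanism; as a loose illustration of a cascade-based attack on a guarantee network, the sketch below assumes a simple loss-threshold default rule, with synthetic guarantee amounts and firm capital invented for the example (networkx graph, hypothetical parameters).

```python
import networkx as nx

def simulate_cascade(guarantees: nx.DiGraph, seed_firm, loss_ratio=0.5, threshold=1.0):
    """Generic cascade sketch: an edge u -> v means firm u guarantees firm v.

    When a firm defaults, each guarantor absorbs loss_ratio of the guaranteed
    amount; a firm whose accumulated loss exceeds threshold * its capital
    defaults in turn. All rules here are assumptions, not the paper's model.
    """
    defaulted = {seed_firm}
    frontier = [seed_firm]
    losses = {n: 0.0 for n in guarantees}
    while frontier:
        firm = frontier.pop()
        for guarantor in guarantees.predecessors(firm):
            if guarantor in defaulted:
                continue
            losses[guarantor] += loss_ratio * guarantees[guarantor][firm].get("amount", 1.0)
            if losses[guarantor] > threshold * guarantees.nodes[guarantor].get("capital", 1.0):
                defaulted.add(guarantor)
                frontier.append(guarantor)
    survivors = guarantees.number_of_nodes() - len(defaulted)
    return survivors / guarantees.number_of_nodes()   # survival rate as an invulnerability proxy

# Toy example with synthetic guarantee relations.
g = nx.DiGraph()
g.add_weighted_edges_from([("A", "B", 2.0), ("B", "C", 1.5), ("C", "A", 1.0)], weight="amount")
print(simulate_cascade(g, seed_firm="B"))
```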

Findings

The proposed approach not only develops equilibrium analysis as an extended perspective on the guarantee network, but also applies a cascading failure method to construct the guarantee network. The equilibrium points are examined through simulation experiments. The invulnerability of the guarantee network is quantified by the survival of firms during the simulation process.

Research limitations/implications

There has been little study of equilibrium analysis in guarantee networks. Additionally, a cascading failure model is expressed in the presented approach. Moreover, an agent-based model could be extended to generate the guarantee network in future studies.

Originality/value

The approach of this paper presents a framework for analyzing the equilibrium of the guarantee network. For this, the systemic risk of the whole guarantee network and each node's contribution are measured to predict the probability of default under cascading failure. Focusing on the cascade failure process based on the equilibrium point, the invulnerability of the guarantee network can be quantified.

Details

Kybernetes, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 0368-492X

Keywords

Book part
Publication date: 26 August 2020

Jessica Clements and Marianne Stowell Bracke

Research and writing can be challenging tasks for undergraduate students, and evaluating online sources for use in papers can be especially difficult to do well. This chapter…

Abstract

Research and writing can be challenging tasks for undergraduate students, and evaluating online sources for use in papers can be especially difficult to do well. This chapter discusses a partnership between a library and a writing center to teach writing consultants the basics of the Information Literacy Framework and online source evaluation. This included providing additional education for peer writing consultants. Consultants became more fluent in the aspects of evaluation and provided perspective on how and why undergraduate students struggle with this in the research and writing process, establishing the value of assembling networked pedagogical models for teaching information literacy.

Details

International Perspectives on Improving Student Engagement: Advances in Library Practices in Higher Education
Type: Book
ISBN: 978-1-83909-453-8

Keywords

Content available
Book part
Publication date: 26 August 2020

Abstract

Details

International Perspectives on Improving Student Engagement: Advances in Library Practices in Higher Education
Type: Book
ISBN: 978-1-83909-453-8

Article
Publication date: 7 December 2020

Meng Xiao, Tie Zhang, Yanbiao Zou and Shouyan Chen

The purpose of this paper is to propose a robot constant grinding force control algorithm for the impact stage and processing stage of robotic grinding.

Abstract

Purpose

The purpose of this paper is to propose a robot constant grinding force control algorithm for the impact stage and processing stage of robotic grinding.

Design/methodology/approach

The robot constant grinding force control algorithm is based on a grinding model and an iterative algorithm. During the impact stage, active disturbance rejection control is used to plan the robot's reference contact force, and the robot speed is adjusted according to the error between the robot's real contact force and the reference contact force. In the processing stage, an RBF neural network is used to construct a model relating the robot's position offset displacement to the controlled output, and the increment of the control parameters is estimated according to the RBF neural network model. The error between the contact force and the expected force converges gradually as the control parameters are iterated online.
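
The exact ADRC and RBF formulations are not given in the abstract; the outline below is only a schematic of the force-error-driven speed adjustment and the online parameter iteration it describes, with a plain error-driven gain update standing in for the RBF-network-estimated increment and every constant chosen arbitrarily.

```python
def grinding_force_control(desired_force, plant, steps=200, kp=0.002, lr=1e-4):
    """Schematic constant-force loop; not the paper's ADRC/RBF formulation.

    plant(speed_offset) returns the normal contact force for the current
    feed-speed offset (a hypothetical stand-in for the real grinding process).
    """
    speed_offset = 0.0
    for _ in range(steps):
        error = plant(speed_offset) - desired_force
        # Adjust the robot speed/offset against the contact-force error.
        speed_offset += kp * error
        # Iterative-learning flavour: grow the gain while the error persists,
        # standing in for the RBF-network-estimated parameter increment.
        kp += lr * abs(error)
    return speed_offset, kp

# Toy plant: backing off (larger offset) lowers the normal force linearly.
toy_plant = lambda offset: 25.0 - 50.0 * offset
print(grinding_force_control(desired_force=20.0, plant=toy_plant))
```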

Findings

The experimental results show that the normal force overshoot of the robot based on the grinding model and iterative algorithm is small, and the processing convergence speed is fast. The error between the normal force and the expected force is mostly within ±3 N. The normal force under the force control algorithm is more stable than under position control, and the surface roughness of the processed workpiece is also improved: the Ra value is reduced by 24.2% compared with position control.

Originality/value

As the proposed approach achieves a constant force in both the impact stage and the processing stage of robotic grinding, as verified by experiment, it can be used in robotic grinding to improve machining accuracy.

Details

Industrial Robot: the international journal of robotics research and application, vol. 48 no. 2
Type: Research Article
ISSN: 0143-991X

Keywords

Article
Publication date: 1 February 2006

Hongze Ma and Chenxia Suo

This study aims to provide a model for designing logistics networks with multiple products. In such a design model, it is important to determine the operation‐related parameters…

Abstract

Purpose

This study aims to provide a model for designing logistics networks with multiple products. In such a design model, it is important to determine the operation‐related parameters accurately.

Design/methodology/approach

An iterative process is used to design the logistics network. First, a mixed integer-programming (MIP) model is used to determine the configuration of the network. Then, based on the output of the MIP model, an inventory-planning model for multiple products is developed to decide the lot size and ordering frequency for each product at each node of the logistics network, and a vehicle routing model is used to find the shortest product delivery routes from wholesalers to retailers. After that, the operation-related parameters are recalculated and updated, and the configuration of the logistics network is re-optimized. This process continues until it converges, allowing the operation-related parameters to be determined more accurately.
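
A skeleton of this iterative loop might look as follows; the function names, interfaces and the EOQ rule used for the inventory stage are assumptions made for the sketch, not details from the paper.

```python
import math

def eoq_lot_size(annual_demand, order_cost, holding_cost):
    """Classic economic-order-quantity rule, used here as the inventory stage."""
    return math.sqrt(2 * annual_demand * order_cost / holding_cost)

def design_network(products, solve_mip, plan_routes, max_rounds=10, tol=1e-3):
    """Skeleton of the iterative design loop; solve_mip and plan_routes are
    placeholders for the configuration (MIP) and vehicle-routing models."""
    params = {p["name"]: {"lot_size": None, "order_freq": None} for p in products}
    previous_cost = float("inf")
    for _ in range(max_rounds):
        config = solve_mip(params)                       # 1. network configuration
        for p in products:                               # 2. inventory planning per product
            q = eoq_lot_size(p["demand"], p["order_cost"], p["holding_cost"])
            params[p["name"]] = {"lot_size": q, "order_freq": p["demand"] / q}
        cost = plan_routes(config, params)               # 3. wholesaler -> retailer routing
        if abs(previous_cost - cost) < tol:              # 4. stop once the loop converges
            break
        previous_cost = cost
    return config, params

# Toy usage with trivial placeholder models.
products = [{"name": "A", "demand": 1200, "order_cost": 50, "holding_cost": 2.0}]
cfg, params = design_network(
    products,
    solve_mip=lambda params: {"warehouses": ["W1"]},     # placeholder configuration model
    plan_routes=lambda cfg, params: 1000.0,              # placeholder routing cost
)
print(cfg, params)
```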

Findings

The iterative process proposed in the paper possesses the following advantages: it determines operation-related parameters accurately and adapts quickly to a changing market.

Research limitations/implications

A typical logistics network used to distribute multiple products is considered as the target system for applying this model. The paper does not include a real case study as an application example.

Originality/value

This paper proposes a modular design approach in which different types of mathematical models are used for different modules. The model developed in this paper is realizable as a software tool for logistics management.

Details

International Journal of Physical Distribution & Logistics Management, vol. 36 no. 2
Type: Research Article
ISSN: 0960-0035

Keywords

Article
Publication date: 6 June 2016

Alison Booth

Within New Zealand, cultural festivals play a vital role in the local representation of diasporic cultures. By analysing the production design of festivals representing Indian culture in Auckland, New…

Abstract

Purpose

Within New Zealand, cultural festivals play a vital role in the local representation of diasporic cultures. By analysing the production design of festivals representing Indian culture in Auckland, New Zealand between 1995 and 2015, the purpose of this paper is to create a deeper understanding of collaborative networks and power relationships. Using Richards' pulsar/iterative network theory and Booth's notion of cultural production networks, a new theoretical model is proposed to visually track the collaborative networks that sustain and bridge cultures, empower communities and fulfil political agendas.

Design/methodology/approach

This ethnographic research draws upon event management studies, industry practice, ethnomusicology and sociology to take a multi-disciplinary approach to an applied research project. Using Richards' pulsar and iterative event framework and Castells' network theory, combined with qualitative data, this research considers critical clusters of collaborative relationships and how they might impact the temporal nature of festivals.

Findings

The 1997 Festival of Asia and the subsequent Lantern Festival in 2000 and Diwali: Festival of Lights in 2002 were pulsar events that played a significant role in collaborative networks extending across cultures, countries and traditions. The subsequent iterative events have played a vital role in the representation of Asian cultural identity in general and, more specifically, in the representation of the city's growing – in both size and cultural diversity – Indian diaspora.

Originality/value

This research proposes a new conceptual model of festival management and diasporic communities in the Asia-Pacific region. Richards' and Booth's conceptual models are used as a starting point to offer a new way of considering the importance of examining collaborative relationships through historical perspectives. The framework explored contributes a new approach to cultural festival network theory and a means of understanding the complexity of the networks required to engage actors from inside and outside both local and global communities.

Details

International Journal of Event and Festival Management, vol. 7 no. 2
Type: Research Article
ISSN: 1758-2954

Keywords

Book part
Publication date: 24 August 2011

Morten H. Abrahamsen

The study here examines how business actors adapt to changes in networks by analyzing their perceptions or their network pictures. The study is exploratory or iterative in the…

Abstract

The study here examines how business actors adapt to changes in networks by analyzing their perceptions or their network pictures. The study is exploratory or iterative in the sense that revisions occur to the research question, method, theory, and context as an integral part of the research process.

Changes within networks receive less research attention, although considerable research exists on explaining business network structures in different research traditions. This study analyzes changes in networks in terms of the industrial network approach. This approach sees networks as connected relationships between actors, where interdependent companies interact based on their sensemaking of their relevant network environment. The study develops a concept of network change as well as an operationalization for comparing perceptions of change, where the study introduces a template model of dottograms to systematically analyze differences in perceptions. The study then applies the model to analyze findings from a case study of Norwegian/Japanese seafood distribution, and the chapter provides a rich description of a complex system facing considerable pressure to change. In-depth personal interviews and cognitive mapping techniques are the main research tools applied, in addition to tracer studies and personal observation.

The dottogram method represents a valuable contribution to case study research as it enables systematic within-case and across-case analyses. A further theoretical contribution of the study is the suggestion that network change is about actors seeking to change their network position to gain access to resources. Thereby, the study also implies a close relationship between the concepts network position and the network change that has not been discussed within the network approach in great detail.

Another major contribution of the study is the analysis of the role that network pictures play in actors' efforts to change their network position. The study develops seven propositions in an attempt to describe the role of network pictures in network change. So far, the relevant literature discusses network pictures mainly as a theoretical concept. Finally, the chapter concludes with important implications for management practice.

Details

Interfirm Networks: Theory, Strategy, and Behavior
Type: Book
ISBN: 978-1-78052-024-7

Keywords

Article
Publication date: 1 June 2003

Jaroslav Mackerle

This paper gives a bibliographical review of the finite element and boundary element parallel processing techniques from the theoretical and application points of view. Topics…

Abstract

This paper gives a bibliographical review of the finite element and boundary element parallel processing techniques from the theoretical and application points of view. Topics include: theory – domain decomposition/partitioning, load balancing, parallel solvers/algorithms, parallel mesh generation, adaptive methods, and visualization/graphics; applications – structural mechanics problems, dynamic problems, material/geometrical non‐linear problems, contact problems, fracture mechanics, field problems, coupled problems, sensitivity and optimization, and other problems; hardware and software environments – hardware environments, programming techniques, and software development and presentations. The bibliography at the end of this paper contains 850 references to papers, conference proceedings and theses/dissertations dealing with presented subjects that were published between 1996 and 2002.

Details

Engineering Computations, vol. 20 no. 4
Type: Research Article
ISSN: 0264-4401

Keywords

Article
Publication date: 8 May 2009

Koen Casier, Sofie Verbrugge, Jan Van Ooteghem, Didier Colle, Mario Pickavet and Piet Demeester

The purpose of this paper is to show how, in a converged network, all services are provided over the same network infrastructure. This obfuscates the costs of the different…

Abstract

Purpose

The purpose of this paper is to show how, in a converged network, all services are provided over the same network infrastructure. This obfuscates the costs of the different services in an overall sunk cost. When deploying a new service over the network, it is important to know the price that will cover the costs incurred by this service. This paper aims to investigate different approaches to calculating this price, to propose an optimal calculation approach and to estimate the sensitivity of this approach to changes in the inputs or when the inputs recursively depend on the price set for the service.

Design/methodology/approach

The paper applies existing cost allocation schemes to this particular problem and, within simulations, investigates their effect on the bottom price margin. Additionally, dedicated Monte-Carlo simulations give information on general sensitivity, and iterative simulations are used to detect the impact of the recursive influence of the price on its inputs.
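
As a rough illustration of the Monte-Carlo sensitivity step, the sketch below perturbs invented cost and subscriber inputs and looks at the spread of a toy bottom price margin; the cost model, distributions and allocation rule are all assumptions made for the example, not the paper's.

```python
import numpy as np

def bottom_margin(subscribers, incremental_cost, shared_cost, share=0.3):
    """Toy bottom price margin: incremental cost per subscriber plus an
    allocated slice of the shared network cost (allocation rule is invented)."""
    return incremental_cost / subscribers + share * shared_cost / subscribers

def monte_carlo_margin(n=10_000, rng=None):
    rng = rng or np.random.default_rng(0)
    subscribers = rng.normal(50_000, 5_000, n).clip(min=1_000)
    incremental = rng.normal(2.0e6, 0.3e6, n).clip(min=0.0)
    shared = rng.normal(8.0e6, 1.0e6, n).clip(min=0.0)
    margins = bottom_margin(subscribers, incremental, shared)
    return margins.mean(), margins.std()    # spread indicates sensitivity to the inputs

print(monte_carlo_margin())
```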

Findings

The optimal calculation approach uses a combination of incremental allocation and full allocation, which places a bottom margin on the price that is both sustainable and competitive in the long run. Simulations show large differences, of up to 50 percent, from other approaches. Additionally, the simulations indicate the importance of the length of the calculation horizon, as a too-short calculation horizon can also lead to differences of up to 50 percent. Sensitivity results indicate a low impact of input changes on the bottom margin obtained using this optimal calculation approach and a much higher impact on the non-optimal margins. Finally, iterative calculations show the importance of highly detailed market research, as a 10 percent mismatch between the market-research implicit price and the calculated price margin leads to at least a 10 percent difference and may lead to up to a 25 percent difference.

Originality/value

The paper links the research fields of cost allocation and bottom-up cost calculation to the pricing margins calculated in a typical business case evaluation phase. It also links the pricing recursively to adoption and completes the calculations in an iterative manner. Finally, the research is completed with a sensitivity analysis of the results to changes in the input adoption.

Details

info, vol. 11 no. 3
Type: Research Article
ISSN: 1463-6697

Keywords
