Search results

1 – 10 of over 376,000
Book part
Publication date: 4 April 2016

Farley Grubb

Abstract

The British North American colonies were the first western economies to rely on legislature-issued paper monies as an important internal media of exchange. This system arose piecemeal. In the absence of banks and treasuries that exchanged paper monies at face value for specie monies on demand, colonial governments experimented with other ways to anchor their paper monies to real values in the economy. These mechanisms included tax-redemption, land-backed loans, sinking funds, interest-bearing notes, and legal tender laws. I assess and explain the structure and performance of these mechanisms. This was monetary experimentation on a grand scale.

Details

Research in Economic History
Type: Book
ISBN: 978-1-78635-276-7

Book part
Publication date: 11 August 2014

Joseph Berger and M. Hamit Fişek

Abstract

Purpose

The Spread of Status Value theory describes how new diffuse status characteristics can arise out of the association of initially non-valued characteristics with existing status characteristics that are already well established in a society. Our objective is to extend this theory so that it describes how still other status elements that have become of interest to researchers, such as “status objects” (Thye, 2000) and “valued roles” (Fişek, Berger, & Norman, 1995), can also be socially created.

Design/methodology/approach

Our approach involves reviewing research relevant to the Spread of Status Value theory and introducing concepts and assumptions applicable to status objects and valued roles.

Findings

Our major results are an elaborated theory that describes the construction of status objects and valued roles, a graphic representation of one set of conditions in which this creation process is predicted to occur, and a design for a further empirical test of the Spread of Status Value theory. This extension has social implications. It opens up the possibility of creating social interventions that involve status objects and valued roles to ameliorate dysfunctional social situations.

Originality/value

Our elaborated theory enables us to understand for the first time how different types of status valued elements can, under appropriate conditions, be socially created or socially modified as a result of the operation of what are fundamentally similar processes.

Article
Publication date: 1 September 2003

Marian White and Kate Mackenzie‐Davey

Abstract

Examines what makes employees feel valued by their employer, through a survey of training consultants operating at Brathay, an educational charitable trust, associate training consultants working with Brathay to support both its youth and corporate work, and training consultants operating in a commercial organization. Clusters responses under the headings of fairness, environment and inclusion. Suggests differences that may exist between the different types of employees sampled, and their needs/expectations in terms of feeling valued by their employer.

Details

Career Development International, vol. 8 no. 5
Type: Research Article
ISSN: 1362-0436

Article
Publication date: 12 August 2014

Yu-Ting Cheng and Chih-Ching Yang

Abstract

Purpose

Constructing a fuzzy control chart from interval-valued fuzzy data is an important topic in the medical, sociological, economic, service and management fields. Such data are often uncertain, inconsistent and incomplete, as is frequently the case with real data. Traditionally, a variables control chart is used to detect process shifts in real-valued data. However, when the data are interval-valued fuzzy, the traditional statistical process control (SPC) approach cannot be used to monitor the process. The purpose of this paper is to propose a designed standardized fuzzy control chart for interval-valued fuzzy data sets.

Design/methodology/approach

The general statistical principles used in the standardized control chart are applied to a fuzzy control chart for interval-valued fuzzy data.

Findings

When the data are interval-valued fuzzy, the traditional SPC approach cannot be used to monitor the process. This study applies the designed standardized fuzzy control chart to an interval-valued fuzzy data set of vegetable prices in Taiwan from January 2009 to September 2010, obtained from the Council of Agriculture, Executive Yuan. Empirical studies illustrate the application of the designed standardized fuzzy control chart, and this definition of the fuzzy control chart can explain further practical phenomena.

Originality/value

This paper uses a simple and straightforward approach to construct the standardized interval-valued chart for fuzzy data, based on the traditional standardized control chart. Moreover, the control limits of the designed standardized fuzzy control chart form an interval (LCL, UCL) that contains the conventional range of the classical standardized control chart.
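For intuition only, here is a crisp simplification of the idea (not the authors' construction; all names are hypothetical): a standardized chart transforms each observation to z = (x − μ)/σ with limits ±k, and for an interval observation [x_L, x_U] both endpoints can be standardized, with a conservative signal raised only when the entire standardized interval falls outside (LCL, UCL).

```python
def standardize(x, mu, sigma):
    # classical standardization: z = (x - mu) / sigma
    return (x - mu) / sigma

def interval_signal(lo, hi, mu, sigma, k=3.0):
    # Standardize both endpoints of the interval observation [lo, hi].
    z_lo = standardize(lo, mu, sigma)
    z_hi = standardize(hi, mu, sigma)
    # Conservative rule: signal only if the whole standardized
    # interval lies outside the control limits (-k, k).
    return z_hi < -k or z_lo > k
```

With μ = 5, σ = 1 and k = 3, the interval [10, 11] standardizes to [5, 6] and signals, while [4, 6] standardizes to [−1, 1] and does not.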

Details

Management Decision, vol. 52 no. 7
Type: Research Article
ISSN: 0025-1747

Article
Publication date: 10 February 2023

Rokhsaneh Yousef Zehi and Noor Saifurina Nana Khurizan

Abstract

Purpose

Uncertainty in data, whether in real-valued or integer-valued data, may result in infeasible optimal solutions or unreliable efficiency scores and ranking of decision-making units. To handle the uncertainty in integer-valued factors in data envelopment analysis (DEA) models, this study aims to propose a robust DEA model which is applicable in the presence of such factors.

Design/methodology/approach

This research focuses on the application of fuzzy interpretation of efficiency to a mixed-integer DEA (MIDEA) model. The robust optimization approach is used to address the uncertain integer-valued parameters in the proposed MIDEA model.

Findings

In this study, the authors propose an MIDEA model without any equality constraints, to avoid the problems that such constraints cause in constructing the robust counterpart of conventional MIDEA models. The characteristics and conditions for constructing an uncertainty set with uncertain integer-valued parameters are studied, and a robust MIDEA model is proposed under a combined box-polyhedral uncertainty set. The applicability of the developed models is shown in a case study of Malaysian public universities.

Originality/value

This study develops an MIDEA model equivalent to the conventional MIDEA model but excluding any equality constraints, which is crucial in the robust approach to avoid a restricted feasible region or infeasible solutions. It also proposes a robust DEA approach that is applicable to cases with uncertain integer-valued parameters, unlike previous studies in the robust DEA field, where uncertain parameters are generally assumed to be real-valued.

Details

Journal of Modelling in Management, vol. 19 no. 1
Type: Research Article
ISSN: 1746-5664

Article
Publication date: 11 September 2007

Amit Kumar, Shiv Prasad Yadav and Surendra Kumar

Abstract

Purpose

The purpose of this research is to develop a new approach for analyzing the fuzzy reliability of series and parallel systems, and to introduce the definition of the L‐R type interval-valued triangular vague set together with certain Tω‐based arithmetic operations between two such sets.

Design/methodology/approach

In the proposed approach, an interval-valued vague fault tree is developed for the system from its fault tree, with the fuzzy reliability of each component represented by an L‐R type interval-valued triangular vague set. With the help of this interval-valued vague fault tree, an algorithm is then developed to analyze the fuzzy system reliability.

Findings

For numerical verification of the proposed approach, the fuzzy reliability of basement flooding is analyzed using both the existing approaches and the proposed approach. Comparing the results shows that the proposed approach minimizes the uncertainty about the reliability and gives exact results, whereas the existing approaches give approximate results owing to the approximate product of triangular vague sets and interval-valued triangular vague sets.

Originality/value

The paper introduces a new approach for analyzing the fuzzy system reliability using Tω‐based arithmetic operations over L‐R type interval valued triangular vague sets.
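For orientation, the crisp reliability formulas underlying any series/parallel analysis are R_series = ∏ r_i and R_parallel = 1 − ∏(1 − r_i); both are monotone increasing in each r_i, so interval reliabilities [a_i, b_i] propagate by simple endpoint arithmetic. A minimal Python sketch of this interval simplification (not the paper's Tω‐based vague-set operations; function names are ours):

```python
from functools import reduce
from operator import mul

def series_reliability(intervals):
    # Series system: R = product of component reliabilities.
    # Monotone increasing in each r_i, so bounds multiply endpoint-wise.
    lo = reduce(mul, (a for a, b in intervals), 1.0)
    hi = reduce(mul, (b for a, b in intervals), 1.0)
    return lo, hi

def parallel_reliability(intervals):
    # Parallel system: R = 1 - product of component unreliabilities.
    # Also monotone increasing, so lower endpoints give the lower bound.
    lo = 1.0 - reduce(mul, (1.0 - a for a, b in intervals), 1.0)
    hi = 1.0 - reduce(mul, (1.0 - b for a, b in intervals), 1.0)
    return lo, hi
```

For two components with reliabilities [0.9, 0.95] and [0.8, 0.85], the series bound is [0.72, 0.8075] and the parallel bound is [0.98, 0.9925].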

Details

International Journal of Quality & Reliability Management, vol. 24 no. 8
Type: Research Article
ISSN: 0265-671X

Article
Publication date: 24 March 2021

Jawad Ali, Zia Bashir and Tabasam Rashid

Abstract

Purpose

The purpose of the paper is to construct a probabilistic interval-valued hesitant fuzzy Technique for Order Preference by Similarity to an Ideal Solution (TOPSIS) model and to improve some preliminary aggregation operators, namely the probabilistic interval-valued hesitant fuzzy averaging (PIVHFA), probabilistic interval-valued hesitant fuzzy geometric (PIVHFG), probabilistic interval-valued hesitant fuzzy weighted averaging (PIVHFWA), probabilistic interval-valued hesitant fuzzy ordered weighted averaging (PIVHFOWA), probabilistic interval-valued hesitant fuzzy weighted geometric (PIVHFWG) and probabilistic interval-valued hesitant fuzzy ordered weighted geometric (PIVHFOWG) operators, to cope with multicriteria group decision-making (MCGDM) problems efficiently.

Design/methodology/approach

(1) To design probabilistic interval-valued hesitant fuzzy TOPSIS model. (2) To improve some of the existing aggregation operators. (3) To propose the Hamming distance, Euclidean distance, Hausdorff distance and generalized distance between probabilistic interval-valued hesitant fuzzy sets (PIVHFSs).

Findings

The results of the proposed model are compared with the aggregation-based method from the related literature, demonstrating the effectiveness of the proposed model and the improved aggregation operators.

Practical implications

A case study concerning healthcare facilities in a public hospital is addressed.

Originality/value

The notion of the proposed distance measure is used as a rational tool to extend the TOPSIS model to the probabilistic interval-valued hesitant fuzzy setting.
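For intuition only, a crisp-interval simplification (not the paper's probabilistic hesitant setting): TOPSIS ranks an alternative by its relative closeness to the positive ideal versus the negative ideal, and a normalized Hamming distance over interval-valued ratings can serve as the metric. A hypothetical Python sketch:

```python
def hamming_distance(x, y):
    # x, y: lists of interval ratings (lo, hi), one per criterion.
    # Normalized Hamming distance: average the mean endpoint gaps.
    n = len(x)
    return sum((abs(a[0] - b[0]) + abs(a[1] - b[1])) / 2
               for a, b in zip(x, y)) / n

def closeness(alt, ideal, anti_ideal):
    # TOPSIS relative closeness coefficient: 1 at the positive ideal,
    # 0 at the negative ideal, larger is better.
    d_pos = hamming_distance(alt, ideal)
    d_neg = hamming_distance(alt, anti_ideal)
    return d_neg / (d_pos + d_neg)
```

An alternative coinciding with the positive ideal scores a closeness of 1; alternatives are then ranked by this coefficient in decreasing order.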

Details

Grey Systems: Theory and Application, vol. 12 no. 1
Type: Research Article
ISSN: 2043-9377

Article
Publication date: 27 March 2009

Anas N. Al‐Rabadi

Abstract

Purpose

The purpose of this paper is to introduce an approach for m‐valued classical and non‐classical (reversible and quantum) optical computing. The developed approach utilizes new multiplexer‐based optical devices and circuits within switch logic to perform the required optical computing. The implementation of the new optical devices and circuits in the optical regular logic synthesis using new lattice and systolic architectures is introduced, and the extensions to quantum optical computing are also presented.

Design/methodology/approach

The new linear optical circuits and systems utilize coherent light beams to perform the functionality of the basic logic multiplexer. The 2‐to‐1 multiplexer is a basic building block in switch logic, where a logic circuit is implemented as a combination of switches rather than a combination of logic gates as in gate logic; this proves less costly when synthesizing a wide variety of logic circuits and systems. Extensions to quantum optical computing using photon spins and the collision of Manakov solitons are also presented.
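The universality of the 2‐to‐1 multiplexer can be illustrated in software: by Shannon expansion, any Boolean function f(x, …) equals mux(x, f with x=1, f with x=0), so arbitrary functions can be built from multiplexers alone. A minimal Python sketch (illustrative only; the paper's devices are optical, and the function names here are hypothetical):

```python
def mux(s, a, b):
    # 2-to-1 multiplexer: output a when select s is 1, else b.
    return a if s else b

def xor2(x, y):
    # XOR built purely from multiplexers (switch logic, no gates),
    # via Shannon expansion on x: xor(1, y) = not y, xor(0, y) = y.
    return mux(x, mux(y, 0, 1), mux(y, 1, 0))
```

Cascading such expansions over each variable realizes any n-input Boolean function as a tree of multiplexers.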

Findings

New circuits for the optical realizations of m‐valued classical and reversible logic functions are introduced. Optical computing extensions to linear quantum computing using photon spins and nonlinear quantum computing using Manakov solitons are also presented. Three new multiplexer‐based linear optical devices are introduced that utilize the properties of frequency, polarization and incident angle that are associated with any light‐matter interaction. The hierarchical implementation of the new optical primitives is used to synthesize regular optical reversible circuits such as the m‐valued regular optical reversible lattice and systolic circuits. The concept of parallel optical processing of an array of input laser beams using the new multiplexer‐based optical devices is also introduced. The design of regular quantum optical systems using regular quantum lattice and systolic circuits is introduced. New graph‐based quantum optical representations using various types of quantum decision trees are also presented to efficiently represent quantum optical circuits and systems.

Originality/value

The introduced methods for classical and non‐classical (reversible and quantum) optical regular circuits and systems are new and interesting for the design of several future technologies that require optimal design specifications, such as super‐high speed, minimum power consumption and minimum size, as in quantum computing and nanotechnology.

Details

International Journal of Intelligent Computing and Cybernetics, vol. 2 no. 1
Type: Research Article
ISSN: 1756-378X

Article
Publication date: 10 April 2009

Minghu Ha, Witold Pedrycz, Jiqiang Chen and Lifang Zheng

Abstract

Purpose

The purpose of this paper is to introduce, for the first time, some basic knowledge of statistical learning theory (SLT) based on random set samples in set‐valued probability space, and to generalize Vapnik's key theorem and bounds on the rate of uniform convergence of learning theory to random sets in set‐valued probability space. SLT based on random samples formed in probability space is considered, at present, one of the fundamental theories of small-sample statistical learning. It has become a novel and important field of machine learning, along with other concepts and architectures such as neural networks. However, the theory hardly handles statistical learning problems for random set samples.

Design/methodology/approach

Motivated by some applications, an SLT based on random set samples is developed in this paper. First, a certain law of large numbers for random sets is proved. Second, the definitions of the distribution function and the expectation of random sets are introduced, and the concepts of the expected risk functional and the empirical risk functional are discussed. A notion of strict consistency of the principle of empirical risk minimization is presented.

Findings

The paper formulates and proves the key theorem and presents the bounds on the rate of uniform convergence of learning theory based on random sets in set‐valued probability space, which become cornerstones of the theoretical fundamentals of the SLT for random set samples.

Originality/value

The paper provides a careful analysis of some theoretical results of learning theory.

Details

Kybernetes, vol. 38 no. 3/4
Type: Research Article
ISSN: 0368-492X

Article
Publication date: 18 October 2011

Minghu Ha, Jiqiang Chen, Witold Pedrycz and Lu Sun

Abstract

Purpose

Bounds on the rate of convergence of learning processes based on random samples and probability are one of the essential components of statistical learning theory (SLT). The constructive distribution‐independent bounds on generalization are the cornerstone of constructing support vector machines. Random sets and set‐valued probability are important extensions of random variables and probability, respectively. The paper aims to address these issues.

Design/methodology/approach

In this study, the bounds on the rate of convergence of learning processes based on random sets and set‐valued probability are discussed. First, the Hoeffding inequality is enhanced based on random sets, and then, making use of the key theorem, the non‐constructive distribution‐dependent bounds of learning machines based on random sets in set‐valued probability space are revisited. Second, some properties of random sets and set‐valued probability are discussed.
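For orientation, the classical scalar Hoeffding inequality that the paper enhances for random sets bounds the deviation of the empirical mean of n independent [0, 1]-valued samples: P(|mean − E| ≥ ε) ≤ 2·exp(−2nε²). A small Python sketch of the bound and the sample size it implies (function names are ours, not the paper's):

```python
import math

def hoeffding_bound(n, eps):
    # Classical Hoeffding inequality for [0, 1]-valued i.i.d. samples:
    # P(|empirical mean - true mean| >= eps) <= 2 * exp(-2 * n * eps^2)
    return 2.0 * math.exp(-2.0 * n * eps * eps)

def samples_needed(eps, delta):
    # Smallest n that drives the Hoeffding bound down to delta:
    # solve 2 * exp(-2 * n * eps^2) <= delta for n.
    return math.ceil(math.log(2.0 / delta) / (2.0 * eps * eps))
```

For example, guaranteeing deviation at most 0.1 with failure probability 0.05 requires 185 samples under this bound; such distribution-independent sample-size guarantees are what the constructive bounds in SLT provide.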

Findings

In the sequel, the concepts of the annealed entropy, the growth function, and VC dimension of a set of random sets are presented. Finally, the paper establishes the VC dimension theory of SLT based on random sets and set‐valued probability, and then develops the constructive distribution‐independent bounds on the rate of uniform convergence of learning processes. It shows that such bounds are important to the analysis of the generalization abilities of learning machines.

Originality/value

SLT is considered at present as one of the fundamental theories of small-sample statistical learning.
