Search results

1 – 10 of over 6000
Article
Publication date: 21 October 2013

Anna Tyrie and Shelagh Ferguson


Abstract

Purpose

Social exchange theory (SET) literature posits that a relationship depends on the strength of its social interactions and is explicit about the roles of trust, power and commitment within that relationship as a means of value creation. However, an understanding of the nature of experiences, expectations, motivations and perceptions as components of the value derivation process is missing. SET literature does not identify these components as antecedents to value creation but as central to the value derived. This research builds upon that premise to develop an understanding of how value is derived from arts sponsorships.

Design/methodology/approach

A qualitative exploratory approach is used to research arts sponsorships in New Zealand of differing size, duration and profile.

Findings

This research clarifies the nature of experiences, expectations, motivations and perceptions as component parts of value derivation, and shows how their interactions result in an iterative value derivation model of the life cycle of an arts sponsorship relationship from a business perspective.

Originality/value

This research has relevance for both academics and marketing managers involved in arts sponsorship. The findings from this research can be used as an analytical tool to help businesses when evaluating their arts sponsorship.

Details

Arts Marketing: An International Journal, vol. 3 no. 2
Type: Research Article
ISSN: 2044-2084

Article
Publication date: 21 October 2013

Noel Dennis and Gretchen Larsen


Details

Arts Marketing: An International Journal, vol. 3 no. 2
Type: Research Article
ISSN: 2044-2084

Article
Publication date: 1 August 2003

K. Sivakumar and Cheryl Nakata


Abstract

Companies are increasingly bringing personnel together into teams from different countries, physically and/or electronically, to develop products for multiple or worldwide markets. Called global new product teams (GNPTs), these groups face significant challenges, including cultural diversity. Differing cultural values can lead to conflict, misunderstanding, and inefficient work styles on the one hand, and strong idea generation and creative problem solving on the other. A study was conducted to identify team compositions that would optimize the effects of national culture so that product development outcomes are favorable. This began by developing a theoretical framework describing the impact of national culture on product development tasks. The framework was then translated into several mathematical models using analytical derivations and comparative statics. The models identify the levels and variances of culture values that maximize product development success by simultaneously considering four relevant dimensions of GNPT performance. Next, the utility of these models was tested by means of numerical simulations for a range of team scenarios. The paper concludes by drawing implications of the findings for managers and researchers.

Details

International Marketing Review, vol. 20 no. 4
Type: Research Article
ISSN: 0265-1335


Article
Publication date: 29 March 2019

Vladimir Michaletz and Andrey I. Artemenkov


Abstract

Purpose

The purpose of this paper is to present a methodology based on the transactional asset pricing approach (TAPA) and to illustrate the application of TAPA within the context of professional property valuation.

Design/methodology/approach

The TAPA is a novel analytical valuation methodology that recasts the traditional derivations of the income approach techniques, including discounted cash flow (DCF) analysis, from a transactional perspective based on the principle of inter-temporal transactional equity, instead of the conventional investor-specific view originating from I. Fisher (1907, 1930).

Findings

The authors present DCF analysis as a specific case of a more general TAPA approach to valuation under the income method. This also leads to novel analytical derivations of the direct income capitalization, Gordon, Inwood, Hoskold and Ring models. Based on the TAPA framework, the authors also investigate the value-enhancing effects of benchmark market volatility on the subject property value and conclude that such effects can be statistically significant, depending on the DCF analysis period.
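For context, the income models named in these findings have well-known textbook forms. The sketch below shows those conventional formulas only, not the paper's TAPA re-derivations:

```python
# Conventional textbook forms of the income-capitalization models named
# in the abstract (direct capitalization, Gordon, Inwood, Hoskold, Ring).
# These are the standard pre-TAPA versions, shown for orientation only.

def direct_capitalization(noi, cap_rate):
    """Value = stabilized net operating income / overall cap rate."""
    return noi / cap_rate

def gordon_growth(cf1, discount_rate, growth_rate):
    """Value of a cash flow growing at a constant rate in perpetuity."""
    return cf1 / (discount_rate - growth_rate)

def inwood_factor(rate, periods):
    """Present value of an ordinary annuity of 1 at a single rate."""
    return (1 - (1 + rate) ** -periods) / rate

def hoskold_factor(spec_rate, safe_rate, periods):
    """Annuity factor with capital recaptured via a sinking fund
    accumulating at a lower 'safe' rate."""
    sinking_fund = safe_rate / ((1 + safe_rate) ** periods - 1)
    return 1 / (spec_rate + sinking_fund)

def ring_value(noi, rate, periods):
    """Straight-line (Ring) capitalization: linear capital recapture."""
    return noi / (rate + 1 / periods)

# e.g. a property with NOI 100,000 at an 8% overall rate
value = direct_capitalization(100_000, 0.08)  # 1,250,000
```

A useful sanity check is that with a one-period horizon the Inwood and Hoskold factors coincide, since the sinking-fund term reduces to 1.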

Research limitations/implications

The research has a direct bearing on time-variable discount rate forecasting capabilities, as it uses a time-variant structure for the discount rates.

Practical implications

Using the US Case-Shiller and BLS rental indices as a valuation benchmark, the paper contains an example of applying the general TAPA framework to value a notional property under a TAPA’s DCF version. Such property valuations can be easily replicated in practice – especially in the context of equitable/fair value determination under the International Valuation Standards Council valuation standards.

Social implications

TAPA is a deductive principles-based theory of asset valuation especially fit for the transactional and illiquid asset valuation contexts – thus enabling a more efficient pricing for such assets in a sense of reflecting the transactional interests of the parties more closely than achievable under the conventional valuation methods.

Originality/value

TAPA is an original filiation of research with roots going as far back as Aristotelian Catallactics. It contains analytical formalizations of certain transactional equity principles.

Details

Journal of Property Investment & Finance, vol. 37 no. 3
Type: Research Article
ISSN: 1463-578X


Article
Publication date: 1 May 2007

Marija Petek


Abstract

Purpose

The purpose of this research is to provide information about derivative bibliographic relationships in the online catalogue COBIB, to investigate the size and complexity of bibliographic families and to determine whether bibliographic characteristics are associated with the extent of derivations.

Design/methodology/approach

A bibliographic entity consisting of a work and an item is represented by bibliographic records. A random sample of records is converted into a sample of progenitor works, and a bibliographic family is constructed for each progenitor.

Findings

Of the progenitor works, 25.75 per cent are derivative; successive derivations, at 67.02 per cent, appear most frequently. The size of bibliographic families ranges from 1 to 16; older progenitors have larger families. The majority of families have one type of relationship; there is one case with four types. A large proportion, 59.06 per cent, of derivative relationships is not expressed explicitly by the catalogue.

Research limitations/implications

Research of bibliographic records representing more than one work is needed. It is also important to find out what catalogue users are looking for: a work or an item?

Practical implications

A model for COBIB is suggested; it enables works, items and relationships to be identified equally well. A cataloguer must create an authority record for each work and link it with the corresponding bibliographic records for items.
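The suggested linking model can be sketched as a simple data structure. All field names and the example records below are hypothetical illustrations, not drawn from COBIB:

```python
# Hypothetical sketch of the suggested model: one authority record per
# work, linked explicitly both to the bibliographic records of its items
# and to related (derivative) works, so derivative relationships are
# under explicit control rather than left implicit.

class WorkAuthority:
    def __init__(self, title, creator):
        self.title = title
        self.creator = creator
        self.items = []        # bibliographic records for items
        self.derivatives = []  # (relationship_type, WorkAuthority) pairs

    def add_item(self, record_id, edition):
        self.items.append({"record_id": record_id, "edition": edition})

    def add_derivative(self, relationship_type, work):
        self.derivatives.append((relationship_type, work))

# A progenitor work with one item and one successive derivation
progenitor = WorkAuthority("Hamlet", "Shakespeare, William")
progenitor.add_item("REC:000001", "1st ed.")
translation = WorkAuthority("Hamlet (Slovenian)", "Shakespeare, William")
progenitor.add_derivative("translation", translation)
```

With links of this kind, a catalogue query for a work can retrieve its whole bibliographic family by walking the `derivatives` edges, which is exactly the explicit control of derivative relationships the abstract argues for.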

Originality/value

Information about relationships should be incorporated into the catalogue and the corresponding records linked. Explicit control of derivative relationships would be of great help to catalogue users and would make information retrieval more effective and precise; it would also allow more efficient use of knowledge and library materials.

Details

Journal of Documentation, vol. 63 no. 3
Type: Research Article
ISSN: 0022-0418


Article
Publication date: 1 April 2009

M. Grujicic, B. Pandurangan, N. Coutris, B.A. Cheeseman, W. N. Roy and R.R. Skaggs


Abstract

A large-strain/high-deformation-rate model for clay-free sand, recently proposed and validated in our work [1,2], has been extended to sand containing relatively small amounts (<15 vol.%) of clay and having various levels of saturation with water. The model includes an equation of state, which represents the material response under hydrostatic pressure; a strength model, which captures material behavior under elastic-plastic conditions; and a failure model, which defines the conditions and laws for the initiation and evolution of damage/failure in the material. The model was validated by comparing the computational results associated with the detonation of a landmine in clayey sand (at different levels of saturation with water) with their experimental counterparts.
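As a rough illustration of what a pressure-dependent strength model does for a granular material (the constants and the functional form below are hypothetical, not the paper's calibrated model):

```python
# Illustrative sketch only: a generic pressure-dependent yield surface of
# the kind used in strength models for sand (a cohesion term plus a
# frictional term that grows with confining pressure, saturating at a
# cap). All parameter values are made up for the example.

def yield_strength(pressure, cohesion, friction_slope, cap):
    """Shear strength rises with pressure, then saturates at `cap`.
    Tensile (negative) pressures contribute no frictional strength."""
    return min(cohesion + friction_slope * max(pressure, 0.0), cap)

# Strength grows with confinement (Pa) and saturates at high pressure
low_p  = yield_strength(1.0e6, 5.0e4, 0.6, 2.0e8)  # frictional regime
high_p = yield_strength(1.0e9, 5.0e4, 0.6, 2.0e8)  # capped regime
```

In a full material model of the kind the abstract describes, a surface like this is evaluated at every time step alongside the equation of state (pressure-volume response) and the failure model (damage initiation and evolution).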

Details

Multidiscipline Modeling in Materials and Structures, vol. 5 no. 4
Type: Research Article
ISSN: 1573-6105


Article
Publication date: 7 September 2015

Iara Vigo de Lima


Abstract

Purpose

The purpose of this paper is to analyse Michel Foucault’s new epistemological model regarding an analogy between the theory of language and economic thought in the seventeenth and eighteenth centuries.

Design/methodology/approach

Through the scrutiny of language, Foucault intended to demonstrate that some analogies, among different branches of knowledge (interdiscursive practice), allow us to apprehend the underlying configuration of thought regarding ontological and epistemological conditions that have historically determined knowledge. He draws a parallel between four theoretical segments borrowed from general grammar (Attribution, Articulation, Designation and Derivation) and economic thought on wealth.

Findings

One of the most remarkable propositions of this approach is that the theory of language and economic thought were epistemologically isomorphic in that context. What the theory of language stated in relation to "attribution" and "articulation" corresponded to the "theory of value" in economic thought. What grammar investigated regarding "designation" and "derivation" was analogous to the "theory of money and trade" in economic thought. The relationships that were – directly and diagonally – identified between and among them led to the conclusion that there was "a circular and surface causality" in economic thought insofar as "circulation" preceded "production". It was "superficial" because it could not find an explanation for the cause of "wealth", which was only possible when "production" was placed in the front position of theories.

Practical implications

Such an epistemological point of view can inspire other studies in the history of economic thought.

Originality/value

This paper offers a perspective on how to think about the history of ontological and epistemological conditions of economic thought.

Article
Publication date: 7 November 2008

R.D. Sudduth


Abstract

Purpose

The primary objective of this two-part study was to show theoretically how pigment cluster voids and pigment distribution can influence the critical pigment volume concentration (CPVC) and, consequently, the properties of a dry coating. In Part I of this study, a pigment clustering model with an analytical solution was developed as a modification of an earlier model by Fishman, Kurtze and Bierwagen that could only be solved numerically.

Design/methodology/approach

The original derivation of the clustering concept developed by Fishman et al. resulted in a mathematical analysis that could only be solved numerically and was very tedious to use directly. In this study, a new derivation utilizing some of the original concepts of Fishman et al. was generated and shown to yield a practical and much more usable analytical treatment of the clustering concept. This new model was then applied directly to quantify the influence of flow agents or surfactants in a coating formulation on the CPVC, as described by Asbeck.
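For background, the CPVC itself is commonly approximated from a pigment's oil absorption. The sketch below shows that standard estimate, not the clustering model derived in the paper:

```python
# Common oil-absorption approximation of the CPVC, shown for context
# only; it is not the pigment clustering model of this study.
# oil_absorption: g of linseed oil per 100 g of pigment (OA value);
# rho_pigment: pigment density in g/cm^3. The constant 93.5 is roughly
# 100 divided by the density of linseed oil (~0.935 g/cm^3), converting
# the absorbed oil mass to a volume.

def cpvc_from_oil_absorption(oil_absorption, rho_pigment):
    """Critical pigment volume concentration as a volume fraction."""
    return 1.0 / (1.0 + oil_absorption * rho_pigment / 93.5)

# A TiO2-like pigment: OA about 20, density about 4.2 g/cm^3
cpvc = cpvc_from_oil_absorption(20.0, 4.2)  # roughly 0.53
```

Estimates like this treat the pigment as ideally dispersed; the point of the clustering model in the abstract is precisely that pigment clusters and their voids shift the effective CPVC away from such ideal-dispersion values.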

Findings

It was found that the largest deviation from 100 per cent pigment dispersion with no pigment clusters occurred just before and just after the ultimate CPVC (UCPVC). A theoretical relationship was also found between the pigment cluster dispersion coefficient, Cq, and CPVC. This result was consistent with the experimental relationship between CPVC and the per cent flow additive as found by Asbeck. The density ratio of overall coating to the pigment density was found to go through a maximum at a global volume fraction of pigment that was slightly greater than the UCPVC as expected for a mechanical property. It was also identified that mechanical failure of most coating formulations should be apparent at either the “Lower Zero Limit” or the “Upper Zero Limit” global volume fraction pigment as defined in this study.

Research limitations/implications

While the experimental measurement of the parameters to isolate the clustering concepts introduced in this study may be difficult, it is expected that better quantitative measurement of clustering concepts will eventually prove to be very beneficial to providing improved suspension applications including coatings.

Practical implications

The theoretical relationship developed in this study between the pigment cluster dispersion coefficient, Cq, and CPVC and the experimental relationship between CPVC and the per cent flow additive found by Asbeck inferred a direct relationship between Cq and the per cent flow additive. Consequently, it was shown that the theoretical pigment cluster model developed in this study could be directly related to the experimental matrix additive composition in a coating formulation. The implication is that the measurement tool introduced in this study can provide better measurement and control of clustering in coatings and other suspension applications.

Originality/value

The analytical re-derivation of the Fishman et al. clustering model, and its direct application to quantifying the influence of flow agents or surfactants on the CPVC as described by Asbeck, constitute the original contributions of this study.

Details

Pigment & Resin Technology, vol. 37 no. 6
Type: Research Article
ISSN: 0369-9420


Article
Publication date: 2 October 2007

Sven Bienert and Wolfgang Brunauer


Abstract

Purpose

The purpose of this paper is to critically review the German mortgage lending value (MLV) and to adapt it in order to find a new concept that could serve as the basis for an internationally accepted standard for valuations for lending purposes.

Design/methodology/approach

The research is based on a critical review of existing practices and literature and applies developments in the area of risk management tools, modern valuation techniques as well as the results of the consultation for Basel II in order to find an improved method.

Findings

It was found that a value-at-risk approach and the use of simulation help in understanding the concept of MLV. The results also indicate that the German system of calculating the MLV needs to be improved.
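The value-at-risk idea behind these findings can be illustrated with a minimal Monte Carlo sketch. The distribution, parameters and quantile below are illustrative assumptions, not the authors' calibrated model:

```python
# Minimal sketch of a value-at-risk view of a mortgage lending value:
# simulate future property values and read the lending value off a low
# quantile, so the collateral covers the loan in all but the worst
# simulated scenarios. Normal annual returns and all parameter values
# are illustrative assumptions only.
import random

def simulate_mlv(market_value, drift, volatility, years,
                 alpha=0.01, n_paths=20_000, seed=42):
    rng = random.Random(seed)  # seeded for reproducibility
    terminal = []
    for _ in range(n_paths):
        v = market_value
        for _ in range(years):
            v *= 1.0 + rng.gauss(drift, volatility)  # one-year return
        terminal.append(v)
    terminal.sort()
    return terminal[int(alpha * n_paths)]  # alpha-quantile of value

mlv = simulate_mlv(1_000_000, drift=0.02, volatility=0.10, years=10)
```

The resulting figure sits well below the current market value, which matches the conservative, through-the-cycle intent of the German MLV concept discussed in the paper.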

Practical implications

Banks need reliable tools and instruments with a strong theoretical basis when evaluating their collateral. The valuation of real estate for long-term loans has always been a problem. This paper provides a strong basis for the implementation of such tools in everyday business.

Originality/value

Value-at-risk concepts and the concepts of maximum/maximum potential loss within a (future) time period have, to date, not been integrated into the valuation of real estate serving as collateral.

Details

Journal of Property Investment & Finance, vol. 25 no. 6
Type: Research Article
ISSN: 1463-578X


Article
Publication date: 5 July 2021

Wang Jianhong


Abstract

Purpose

The purpose of this paper is to derive the output predictor for a stationary normal process with rational spectral density and for a linear stochastic discrete-time state-space model, respectively; the output predictor is central to model predictive control. The derivations depend only on matrix operations. Based on the output predictor, a quadratic programming problem is constructed to achieve the goal of subspace predictive control. An improved ellipsoid optimization algorithm is then proposed to solve for the optimal control input, and a complexity analysis of this algorithm is given to complete the previous work. Finally, a helicopter example demonstrates the efficiency of the proposed control strategy.

Design/methodology/approach

First, a stationary normal process with rational spectral density and a stochastic discrete-time state-space model are described. Second, the output predictors for these two forms are derived; the derivations rely on the Diophantine equation and basic matrix operations. Third, after these two output predictors are inserted into the cost function of predictive control, the control input is solved using the improved ellipsoid optimization algorithm, and the corresponding complexity analysis is provided.

Findings

Subspace predictive control not only enables automatic tuning of the parameters in predictive control but also avoids many steps of classical linear Gaussian control; it is therefore independent of any prior knowledge of the controller. An improved ellipsoid optimization algorithm is used to solve for the optimal control input, and a complexity analysis of this algorithm is also given.

Originality/value

To the best of the authors' knowledge, this is the first attempt at deriving the output predictors for a stationary normal process with rational spectral density and for a stochastic discrete-time state-space model. The derivations rely on the Diophantine equation and basic matrix operations, and the complexity of the improved ellipsoid optimization algorithm is analyzed.

Details

Aircraft Engineering and Aerospace Technology, vol. 93 no. 5
Type: Research Article
ISSN: 1748-8842

