Search results

1 – 10 of 489
Article
Publication date: 12 November 2019

Gohar Khan, Manar Mohaisen and Matthias Trier

Leveraging social action theory, social network theory and the notion of network externality, the purpose of this paper is to model two different return on investment (ROI)…


Abstract

Purpose

Leveraging social action theory, social network theory and the notion of network externality, the purpose of this paper is to model two different return on investment (ROI) measures: the networked ROI, which captures the network effect originating from a social media investment, and the discrete ROI, which focuses on the discrete returns from individual users.
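As a rough illustration of the distinction the abstract draws (the formulas, pass-through rate and numbers below are hypothetical, not the authors' models), a discrete ROI can be computed from direct user returns alone, while a networked ROI also credits returns that propagate through each user's network:

```python
# Illustrative toy model only: discrete ROI counts direct returns from targeted
# users; networked ROI also credits returns passed one hop through each user's
# follower network (a stand-in for the network-externality effect).

def discrete_roi(direct_returns, cost):
    """ROI from individual user actions only."""
    return (sum(direct_returns) - cost) / cost

def networked_roi(direct_returns, followers, pass_through, cost):
    """Adds returns generated via each user's followers, discounted by a
    hypothetical pass-through rate."""
    networked = sum(r + r * f * pass_through
                    for r, f in zip(direct_returns, followers))
    return (networked - cost) / cost

returns = [10.0, 4.0, 6.0]   # direct monetary return per targeted user
followers = [50, 10, 200]    # follower counts (network exposure)

print(discrete_roi(returns, cost=15.0))                # ~0.333
print(networked_roi(returns, followers, 0.01, 15.0))   # higher, via network effect
```

With the same investment, the two measures diverge exactly as the abstract describes: the network term can dominate even when direct returns are modest.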

Design/methodology/approach

A field experiment was set up over a period of three months to test the effects of two variants of an advertisement campaign (a social vs a discrete ad) on the modeled networked and discrete ROIs.

Findings

The authors find that emphasizing discrete user actions leads to lower network gains but higher monetary returns, while the social action emphasis produces higher network gains but lower monetary returns. The study further suggests that the social action focus is preferable for brand promotion and engagement, whereas the discrete action focus is suitable for boosting sales and website traffic.

Practical implications

Several potential implications for social media researchers and marketers are also discussed.

Originality/value

The authors show for the first time that social media returns derive not only from individual actions taken by the user (e.g. likes and shares) but also from users’ social interdependencies and the additional exposure that results from network effects.

Details

Internet Research, vol. 30 no. 2
Type: Research Article
ISSN: 1066-2243

Keywords

Article
Publication date: 1 February 1999

G.M. Giaglis, R.J. Paul and R.M. O’Keefe

Although the inherent interrelationships between business processes (BP) and the underlying information technology (IT) infrastructure imply that the design of these two…

Abstract

Although the inherent interrelationships between business processes (BP) and the underlying information technology (IT) infrastructure imply that the design of these two organisational facets should be performed in parallel, this does not seem to be the case in practice. For example, simulation is being extensively used in both the BP and IT domains, albeit in a disjointed fashion. In this paper, we investigate the potential of integrating different simulation models to facilitate concurrent engineering of business processes and information technology and to support the process of investment evaluation. Drawing on the findings of an example case, we identify a number of pertinent issues and articulate future research directions towards the integration of simulation usage in the business domain.

Details

Logistics Information Management, vol. 12 no. 1/2
Type: Research Article
ISSN: 0957-6053

Keywords

Article
Publication date: 1 March 1985

F.J. Arcelus and G. Srinivasan

This paper develops heuristics for the discrete‐time‐proportional demand inventory problem. In addition to the traditional minimal cost objective, return on investment in…

Abstract

This paper develops heuristics for the discrete‐time‐proportional demand inventory problem. In addition to the traditional minimal cost objective, return on investment in inventory is proposed as an alternative criterion in capital constrained situations. The resulting heuristics can be implemented with little computational effort. Moreover, the policies associated with the general inventory cases are independent of the length of the planning horizon. Some computational experience is also reported.

Details

Kybernetes, vol. 14 no. 3
Type: Research Article
ISSN: 0368-492X

Article
Publication date: 2 February 2015

Raed El-Khalil

The current economic crisis increased the demand on management to improve process efficiency. The purpose of this paper is to identify and resolve inefficiencies within the car…


Abstract

Purpose

The current economic crisis increased the demand on management to improve process efficiency. The purpose of this paper is to identify and resolve inefficiencies within the car assembly system utilizing discrete simulation modeling and analysis in order to improve productivity at one of the original equipment manufacturers (OEM) body shops in North America.

Design/methodology/approach

This research was driven by a manager’s recommendation from one of the Big Three (GM, Ford, Chrysler LLC) companies in order to improve operational performance. The data utilized in creating the simulation model was obtained from one of the assembly facilities that produce three different vehicles over a period of one year. All model simulation, inputs and outputs were discussed and agreed upon by facility management.

Findings

The established base model was verified and validated to mimic the actual facility outputs, indicating all process bottlenecks. Two model scenarios were considered: the first focused on flexibility at the top bottleneck processes, with an ROI of 497 percent, while the second considered changing the model mix percentage, leading to a cost improvement of $1.6 million annually.
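For readers unfamiliar with the technique, a minimal discrete-simulation sketch of a two-station serial line with a finite buffer (cycle times and buffer size here are hypothetical, unrelated to the facility's data) shows how the slower station acts as the bottleneck that caps throughput:

```python
# Toy discrete simulation of a two-station line: station 1 feeds a finite
# buffer, station 2 drains it. Time advances in unit steps; the slower
# station limits how many parts station 2 completes over the horizon.

def simulate_line(cycle1, cycle2, buffer_cap, horizon):
    """Deterministic two-station line; returns parts completed by station 2."""
    buffer = 0
    t1 = t2 = 0          # time remaining on each station's current part
    busy2 = False
    done = 0
    for _ in range(horizon):
        # Station 1 starts a part whenever the buffer has room.
        if t1 == 0 and buffer < buffer_cap:
            t1 = cycle1
        if t1 > 0:
            t1 -= 1
            if t1 == 0:
                buffer += 1
        # Station 2 pulls from the buffer when idle.
        if not busy2 and buffer > 0:
            buffer -= 1
            t2 = cycle2
            busy2 = True
        if busy2:
            t2 -= 1
            if t2 == 0:
                done += 1
                busy2 = False
    return done

# Station 2 (3 min/part) is slower than station 1 (2 min/part), so it
# governs output over a 30-minute horizon:
print(simulate_line(2, 3, 2, 30))   # 9 parts
```

A real model like the paper's would add stochastic cycle times, downtime and many stations, but the bottleneck-identification logic is the same.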

Research limitations/implications

The model only considered management decisions on buffer sizes, batch size and the top bottleneck station alternatives to make improvements. Simulating improvements in labor efficiency, robot uptime, scrap root causes, and maintenance response to downtime was not considered.

Practical implications

This paper indicated the importance of discrete simulation modeling in providing alternatives for improving process efficiency under certain financial limitations. Given the similarity of the automotive manufacturing processes among the various companies, the findings for this particular facility remain valid for other facilities.

Originality/value

Investment cost and process improvement are currently the two biggest challenges facing operations managers in the manufacturing industry. This study allows managers to gain a broader perspective on the ability of discrete simulation to model complicated systems and present different process improvement alternatives.

Details

Journal of Manufacturing Technology Management, vol. 26 no. 1
Type: Research Article
ISSN: 1741-038X

Keywords

Open Access
Article
Publication date: 22 December 2023

Khaled Hamad Almaiman, Lawrence Ang and Hume Winzar

The purpose of this paper is to study the effects of sports sponsorship on brand equity using two managerially related outcomes: price premium and market share.


Abstract

Purpose

The purpose of this paper is to study the effects of sports sponsorship on brand equity using two managerially related outcomes: price premium and market share.

Design/methodology/approach

This study uses a best–worst discrete choice experiment (BWDCE) and compares the outcome with that of the purchase intention scale, an established probabilistic measure of purchase intention. The total sample consists of 409 fans of three soccer teams sponsored by three different competing brands: Nike, Adidas and Puma.
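A common first-pass analysis of best–worst scaling data is a best-minus-worst tally per item, normalized by how often the item was shown; the sketch below illustrates that tally on hypothetical choice tasks (it is not the authors' full BWDCE estimation):

```python
from collections import Counter

# Each task records the set of brands shown, which one the respondent
# picked as best, and which as worst. The score for each brand is
# (times best - times worst) / times shown, ranging from -1 to +1.

def bw_scores(tasks):
    """Best-minus-worst count per item, normalized by appearances."""
    best, worst, shown = Counter(), Counter(), Counter()
    for items, chosen_best, chosen_worst in tasks:
        shown.update(items)
        best[chosen_best] += 1
        worst[chosen_worst] += 1
    return {i: (best[i] - worst[i]) / shown[i] for i in shown}

tasks = [  # hypothetical choice tasks: (brands shown, best pick, worst pick)
    (["Nike", "Adidas", "Puma"], "Nike", "Puma"),
    (["Nike", "Adidas", "Puma"], "Adidas", "Puma"),
    (["Nike", "Adidas", "Puma"], "Nike", "Adidas"),
]
print(bw_scores(tasks))
```

Full BWDCE studies fit a discrete choice model to these responses rather than stopping at counts, but the tally conveys how best and worst picks jointly rank the brands.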

Findings

With sports sponsorship, fans were willing to pay more for the sponsor’s product, with the sponsoring brand obtaining the highest market share. Prominent brands generally performed better than less prominent brands. The best–worst scaling method was also 35% more accurate in predicting brand choice than a purchase intention scale.

Research limitations/implications

Future research could use the same method to study other types of sponsors, such as title sponsors or other product categories.

Practical implications

Sponsorship managers can use this methodology to assess the return on investment in sponsorship engagement.

Originality/value

Prior sponsorship studies on brand equity tend to ignore market share or fans’ willingness to pay a price premium for a sponsor’s goods and services. However, these two measures are crucial in assessing the effectiveness of sponsorship. This study demonstrates how to conduct such an assessment using the BWDCE method. It provides a clearer picture of sponsorship in terms of its economic value, which is more managerially useful.

Details

European Journal of Marketing, vol. 58 no. 13
Type: Research Article
ISSN: 0309-0566

Keywords

Book part
Publication date: 1 November 2007

Irina Farquhar and Alan Sorkin

This study proposes targeted modernization of the Department of Defense’s (DoD) Joint Forces Ammunition Logistics information system by implementing the optimized innovative…

Abstract

This study proposes targeted modernization of the Department of Defense’s (DoD) Joint Forces Ammunition Logistics information system by implementing the optimized innovative information technology open architecture design and integrating Radio Frequency Identification Device (RFID) data technologies and real-time optimization and control mechanisms as the critical technology components of the solution. The innovative information technology, which pursues focused logistics, will be deployed in 36 months at an estimated cost of $568 million in constant dollars. We estimate that the Systems, Applications, Products (SAP)-based enterprise integration solution that the Army currently pursues will cost another $1.5 billion through the year 2014; however, it is unlikely to deliver the intended technical capabilities.

Details

The Value of Innovation: Impact on Health, Life Quality, Safety, and Regulatory Research
Type: Book
ISBN: 978-1-84950-551-2

Article
Publication date: 31 October 2008

Amal A. Said, Hassan R. HassabElnaby and Tanya S. Nowlin

The purpose of this paper is to examine the relative and incremental information content of a cash recovery‐based measure of performance, the estimated internal rate of return, vs…


Abstract

Purpose

The purpose of this paper is to examine the relative and incremental information content of a cash recovery‐based measure of performance, the estimated internal rate of return, vs an earnings‐based measure of performance, return on assets, in explaining firms' economic performance.

Design/methodology/approach

The paper uses the cash recovery rate that is based on continuous time analysis and U‐shaped cash flows to derive the estimated internal rate of return and compare it to return on assets. A cross‐sectional sample was used over a short interval (year 1993 and year 2005) and a time‐series sample (1993‐2005) to empirically examine the relative and incremental information content of the competing measures. Tobin's q and stock returns are used as performance benchmarks.
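The paper's estimated IRR is derived from the cash recovery rate under a continuous-time model with U-shaped cash flows; as background only, the sketch below shows the underlying definition of an internal rate of return (the rate at which the net present value of the cash flows is zero) solved numerically by bisection, with made-up cash flows:

```python
# Generic IRR illustration (not the paper's cash-recovery-based estimator):
# find the discount rate r with NPV(r) = 0 by bisection.

def npv(rate, cash_flows):
    """Net present value of cash_flows[t] discounted at `rate` per period."""
    return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cash_flows))

def irr(cash_flows, lo=-0.99, hi=10.0, tol=1e-9):
    """Bisection on NPV; assumes exactly one sign change in [lo, hi]."""
    for _ in range(200):
        mid = (lo + hi) / 2.0
        if npv(lo, cash_flows) * npv(mid, cash_flows) <= 0:
            hi = mid        # root lies in [lo, mid]
        else:
            lo = mid        # root lies in [mid, hi]
        if hi - lo < tol:
            break
    return (lo + hi) / 2.0

# Outlay of 100, then 60 per year for two years:
print(round(irr([-100.0, 60.0, 60.0]), 4))   # ~0.1307
```

The paper's contribution is precisely that this rate can be estimated from accounting cash recovery data rather than from fully observed project cash flows.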

Findings

The results of the empirical tests indicate that the estimated internal rate of return provides better relative and incremental information content over earnings‐based measures of performance. Specifically, the empirical evidence shows that the estimated internal rate of return is consistently positively related to Tobin's q and stock returns over all measurement intervals.

Research limitations/implications

These results imply that earnings‐based performance measures are less value relevant than cash recovery‐based measures. Some limitations may apply to this study. First, the systematic measurement error in estimating the cash recovery rate may not be independent of the measurement error in the estimated internal rate of return. Second, the performance benchmarks used in the study are not free from problems. In particular, return on assets is influenced by firms’ growth rates, and Tobin’s q is not a perfect measure of business performance. Therefore, one avenue of future research is to assess the usefulness of financial accounting data for analysts’ forecasts. Future research may also examine the role of institutional changes in financial reporting and their effect on the quality of earnings and economic performance.

Originality/value

This paper presents extended research on cash recovery‐based vs earnings‐based metrics as proxies for economic return using improved research designs, larger samples and new sensitivity analyses.

Details

Review of Accounting and Finance, vol. 7 no. 4
Type: Research Article
ISSN: 1475-7702

Keywords

Article
Publication date: 23 February 2010

Wojciech Peter Latusek

Discrete choice modeling has been discussed by both academics and practitioners as a means of analytical support for B2C relationship marketing. This paper aims to discuss…


Abstract

Purpose

Discrete choice modeling has been discussed by both academics and practitioners as a means of analytical support for B2C relationship marketing. This paper aims to discuss applying this analytical framework in B2B marketing, with an example of cross‐selling high‐tech services to a large business customer. This example is also used to show how an algorithm of genetic binary choice (GBC) modeling, developed by the author, performs in comparison with major techniques used nowadays, and to analyze the financial impact of these different approaches on profitability of B2B relationship marketing operations.

Design/methodology/approach

Predictive models based on the regression analysis, the classification tree and the GBC algorithm are built and analyzed in the context of their performance in optimizing cross‐selling campaigns. An example of business case analysis is used to estimate the financial implications of the different approaches.

Findings

B2B relationship marketing, although differing from B2C in many aspects, can also benefit from analytical support with discrete choice modeling. The financial impact of such support is significant, and can be further increased by improving the predictive accuracy of the models. In this context the GBC modeling algorithm proves to be an interesting alternative to the algorithms used nowadays.

Research limitations/implications

The generalizability of the findings, concerning performance characteristics of the algorithms, is limited: which method is best depends, for example, on data distributions and the particular relationships being modeled.

Practical implications

The paper shows how B2B marketing managers can increase the profitability of relationship marketing using discrete choice modeling, and how implementing new algorithms like the GBC model presented here can allow for further improvement.

Originality/value

The paper bridges the gap between research on binary choice modeling and the practice of B2B relationship marketing. It presents a new possibility of analytical support for B2B marketing operations together with financial implications. It also includes a demonstration of an algorithm newly developed by the author.

Details

Journal of Business & Industrial Marketing, vol. 25 no. 3
Type: Research Article
ISSN: 0885-8624

Keywords

Article
Publication date: 6 June 2016

Tammy Drezner, Zvi Drezner and Pawel J Kalczynski

The purpose of this paper is to investigate a competitive location problem to determine how to allocate a budget to expand a company’s chain by either adding new facilities…

Abstract

Purpose

The purpose of this paper is to investigate a competitive location problem to determine how to allocate a budget to expand a company’s chain by adding new facilities, expanding existing facilities, or a combination of both actions. Solving large problems may exceed the computational resources currently available. The authors treat a special case in which the market can be divided into mutually exclusive sub-markets. These can be markets in cities around the globe, or markets far enough from each other that customers in one market can be assumed not to patronize retail facilities in another market, or that cross-patronizing is negligible. The company has a given budget to invest in these markets. Three objectives are considered: maximizing profit, maximizing return on investment (ROI), and maximizing profit subject to a minimum ROI. An illustrative example problem of 20 sub-markets with a total of 400 facilities, 4,800 potential locations for new facilities, and 5,000 demand points is optimally solved in less than two hours of computing time.

Design/methodology/approach

Since the market can be partitioned into disjoint sub-markets, the profit obtained in each sub-market from any level of investment can be calculated. The best allocation of the budget among the sub-markets can then be found either by solving an integer linear program or by dynamic programming. In this way, otherwise intractable large competitive location problems can be optimally solved.
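The dynamic-programming step described above can be sketched as follows (the per-market profit table and numbers are hypothetical; the paper's per-market profits come from solving each sub-market's location problem):

```python
# Budget allocation across disjoint sub-markets by dynamic programming.
# profit[m][b] = best profit attainable in sub-market m with b budget units,
# assumed precomputed independently per market (profit[m][0] = 0).

def allocate_budget(profit, total_budget):
    """Returns (best total profit, budget units allocated to each market)."""
    n = len(profit)
    # best[m][b] = best profit using markets 0..m-1 and b budget units
    best = [[0.0] * (total_budget + 1) for _ in range(n + 1)]
    choice = [[0] * (total_budget + 1) for _ in range(n + 1)]
    for m in range(1, n + 1):
        for b in range(total_budget + 1):
            for spend in range(b + 1):
                v = best[m - 1][b - spend] + profit[m - 1][spend]
                if v > best[m][b]:
                    best[m][b] = v
                    choice[m][b] = spend
    # Recover the allocation by walking the choices backwards.
    alloc, b = [], total_budget
    for m in range(n, 0, -1):
        alloc.append(choice[m][b])
        b -= choice[m][b]
    return best[n][total_budget], alloc[::-1]

# Two hypothetical sub-markets, budget of 3 units:
profit = [[0.0, 5.0, 7.0, 8.0],    # market 0: diminishing returns
          [0.0, 4.0, 9.0, 10.0]]   # market 1
print(allocate_budget(profit, 3))  # best: 1 unit to market 0, 2 to market 1
```

Because the sub-markets are independent, the allocation step is just a knapsack-style recursion over budget units, which is why very large instances remain tractable.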

Findings

An illustrative example problem of 20 sub-markets with a total of 400 facilities, 4,800 potential locations for new facilities, and 5,000 demand points is optimally solved in less than two hours of computing time. Such a problem cannot be optimally solved by existing methods.

Originality/value

This model is new and has not been treated in previous papers.

Details

Kybernetes, vol. 45 no. 6
Type: Research Article
ISSN: 0368-492X

Keywords

Article
Publication date: 27 July 2021

Papangkorn Pidchayathanakorn and Siriporn Supratid

A major key success factor regarding proficient Bayes threshold denoising refers to noise variance estimation. This paper focuses on assessing different noise variance estimations…

Abstract

Purpose

A major key success factor regarding proficient Bayes threshold denoising refers to noise variance estimation. This paper focuses on assessing different noise variance estimations in three Bayes threshold models on two different characteristic brain lesions/tumor magnetic resonance imaging (MRIs).

Design/methodology/approach

Here, three Bayes threshold denoising models based on different noise variance estimations in the stationary wavelet transform (SWT) domain are assessed and compared with state-of-the-art non-local means (NLMs). The three models, named D1, GB and DR, depend, respectively, on the finest detail wavelet subband at the first resolution level, on all detail subbands globally, and on the detail subband in each direction and resolution. Explicit and implicit denoising performance are assessed consecutively through threshold denoising and segmentation identification results.
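The common recipe underlying such models (a generic BayesShrink-style sketch, not the paper's exact D1, GB or DR variants) estimates the noise standard deviation from a detail wavelet subband and soft-thresholds that subband with T = σ_noise² / σ_signal:

```python
import numpy as np

# Generic Bayes threshold sketch: noise std from the median absolute
# deviation of a detail subband (Donoho's estimator), then soft
# thresholding with T = sigma_noise^2 / sigma_signal.

def bayes_threshold(detail):
    """Bayes threshold for one detail subband."""
    sigma_n = np.median(np.abs(detail)) / 0.6745               # noise std
    sigma_x2 = max(np.mean(detail ** 2) - sigma_n ** 2, 1e-12)  # signal var
    return sigma_n ** 2 / np.sqrt(sigma_x2)

def soft_threshold(detail, t):
    """Shrink coefficients toward zero by t (soft thresholding)."""
    return np.sign(detail) * np.maximum(np.abs(detail) - t, 0.0)

rng = np.random.default_rng(0)
subband = rng.normal(0.0, 1.0, size=(64, 64))  # stand-in for a noisy subband
denoised = soft_threshold(subband, bayes_threshold(subband))
print(denoised.std() < subband.std())
```

The three models in the paper differ mainly in which subband(s) feed the noise variance estimate; in practice the thresholded subbands would then be inverse-transformed back to the image domain.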

Findings

Implicit performance assessment shows the first and second best accuracies, Dice similarity coefficients (Dice) of 0.9181 and 0.9048, yielded by GB and DR, respectively; reliability is indicated by a 45.66% Dice drop for DR, compared with 53.38%, 61.03% and 35.48% for D1, GB and NLMs, when the noise level on the brain lesions MRI increases from 0.2 to 0.9. For the brain tumor MRI at the 0.2 noise level, DR yields the best accuracy of 0.9592 Dice; however, DR shows an 8.09% Dice drop, relative to 6.72%, 8.85% and 39.36% for D1, GB and NLMs. NLMs clearly shows the lowest explicit and implicit denoising performance.

Research limitations/implications

Denoising performance could be improved further by creating a semi-supervised denoising model. During the autoencoder training phase, such a model would use the MRIs denoised by the DR and D1 thresholding models as the uncorrupted image versions, together with the noisy MRIs as the corrupted versions, to reconstruct the original clean image.

Practical implications

This paper should be of interest to readers in the areas of computing and information science, including data science and applications and computational health informatics, especially when applied as a decision support tool for medical image processing.

Originality/value

In most cases, DR and D1 provide the first and second best implicit performance in terms of accuracy and reliability on both the simulated, low-detail, small-size region-of-interest (ROI) brain lesions MRI and the realistic, high-detail, large-size ROI brain tumor MRI.
