Search results

1 – 10 of over 4000
Article
Publication date: 25 November 2021

Robert V. Kozinets

Abstract

Purpose

Contemporary branding transpires in a complex technological and media environment whose key contextual characteristics remain largely unexplained. The article provides a conceptual understanding of the elements of contemporary branding as they take place using networked platforms and explains them as an increasingly important practice that affects customer and manager experience.

Design/methodology/approach

This article draws on a variety of recent sources to synthesize a model that offers a more contextualized, comprehensive and up-to-date understanding of how branding has been and is being altered by the use of branded service platforms and algorithms.

Findings

Core terminology about technoculture, technocultural fields, platform assemblages, affordances, algorithms and networks of desire sets the foundation for a deeper conceptual understanding of the novel elements of algorithmic branding. Algorithmic branding has transcended the mere attachment of specific “mythic” qualities to a product or experience and has morphed into a multidimensional process of using media to manage communication. The goal of marketers is now to use engagement practices, as well as algorithmic activation, amplification, customization and connectivity, to drive consumers deeper into the brand spiral, entangling them in networks of brand-related desire.

Practical implications

The model has a range of important managerial implications for brand management and managerial relations. It promotes an understanding of platform brands as service brands. It underscores and models the interconnected roles that consumers, devices and algorithms, as well as technology companies and their own service brands, play in corporate branding efforts. It suggests that consumers might unduly trust these service platforms. It points to the growing importance of platforms’ service brands and the consequent surrender of branding power to technology companies. It also raises a range of important ethical and pragmatic questions for curious marketers, researchers and policy-makers to examine.

Originality/value

This model provides a fresh look at the important topic of branding today, updating prior conceptions with a comprehensive and contextually grounded model of service platforms and algorithmic branding.

Article
Publication date: 16 July 2019

Donghee (Don) Shin, Anestis Fotiadis and Hongsik Yu

Abstract

Purpose

The purpose of this study is to offer a roadmap for work on the ethical and societal implications of algorithms and AI. Based on an analysis of the social, technical and regulatory challenges posed by algorithmic systems in Korea, this work conducts socioecological evaluations of the governance of algorithmic transparency and accountability.

Design/methodology/approach

This paper analyzes algorithm design and development from critical socioecological angles: the social, technological, cultural and industrial phenomena that represent the strategic interaction among people, technology and society, touching on sensitive issues of a legal, cultural and ethical nature.

Findings

Algorithm technologies are part of a social ecosystem, and their development should be based on user interests and rights within a social and cultural milieu. An algorithm represents an interrelated, multilayered ecosystem of networks, protocols, applications, services, practices and users.

Practical implications

Value-sensitive algorithm design is proposed as a novel approach for designing algorithms. As algorithms have become a constitutive technology that shapes human life, it is essential to be aware of the value-ladenness of algorithm development. Human values and social issues can be reflected in an algorithm design.

Originality/value

The arguments in this study help ensure the legitimacy and effectiveness of algorithms. This study provides insight into the challenges and opportunities of algorithms through the lens of a socioecological analysis: political discourse, social dynamics and technological choices inherent in the development of algorithm-based ecology.

Details

Digital Policy, Regulation and Governance, vol. 21 no. 4
Type: Research Article
ISSN: 2398-5038

Article
Publication date: 28 March 2023

Seniye Banu Garip, Orkan Zeynel Güzelci, Ervin Garip and Serkan Kocabay

Abstract

Purpose

This study aims to present a novel Genetic Algorithm-Based Design Model (GABDM) to provide reduced-risk areas, namely, a “safe footprint,” in interior spaces during earthquakes. This study focuses on housing interiors as the space where inhabitants spend most of their daily lives.

Design/methodology/approach

The GABDM uses a genetic algorithm as its method, specifically the Non-dominated Sorting Genetic Algorithm II (NSGA-II), with the Wallacei X evolutionary optimization engine. The model setup, including inputs, constraints, operations and fitness functions, is presented, as is the algorithmic model’s running procedure. Following the development phase, the GABDM is tested on a sample housing interior designed by the authors, based on the literature related to earthquake risk in interiors. The implementation section is organized into two case studies.
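The multi-objective search described above can be sketched in miniature. The following toy Python sketch is not the authors' GABDM: the room dimensions, the two objectives and all parameters are invented for illustration, and it uses plain non-dominated (Pareto) selection rather than the full NSGA-II machinery of Wallacei X. It shows only the core idea of evolving a population of candidate "safe footprint" rectangles toward a Pareto front:

```python
import random

# Toy sketch (not the authors' GABDM): evolve candidate "safe footprint"
# rectangles in a room, minimizing two hypothetical objectives at once:
#   f1 - distance of the footprint centre from an assumed safe structural point
#   f2 - negative footprint area (larger safe areas are better)

ROOM_W, ROOM_H = 10.0, 8.0           # assumed room size in metres
CORE = (5.0, 4.0)                    # assumed safe structural point

def random_footprint():
    """A candidate is (x, y, w, h), kept inside the room."""
    w, h = random.uniform(0.5, 3.0), random.uniform(0.5, 3.0)
    return (random.uniform(0, ROOM_W - w), random.uniform(0, ROOM_H - h), w, h)

def objectives(fp):
    x, y, w, h = fp
    cx, cy = x + w / 2, y + h / 2
    dist = ((cx - CORE[0]) ** 2 + (cy - CORE[1]) ** 2) ** 0.5
    return (dist, -(w * h))          # both objectives are minimized

def dominates(a, b):
    """Pareto dominance: a is no worse in every objective and better in one."""
    return all(ai <= bi for ai, bi in zip(a, b)) and a != b

def pareto_front(pop):
    scored = [(fp, objectives(fp)) for fp in pop]
    return [fp for fp, s in scored
            if not any(dominates(t, s) for _, t in scored)]

random.seed(1)
pop = [random_footprint() for _ in range(60)]
for _ in range(30):                  # simple evolutionary loop
    front = pareto_front(pop)
    children = [tuple(g + random.gauss(0, 0.2) for g in random.choice(front))
                for _ in range(60 - len(front))]
    # clamp mutated genes back into plausible ranges
    pop = front + [(max(0, min(ROOM_W - 0.5, x)), max(0, min(ROOM_H - 0.5, y)),
                    max(0.5, min(3.0, w)), max(0.5, min(3.0, h)))
                   for x, y, w, h in children]

best = pareto_front(pop)
print(f"{len(best)} non-dominated footprint candidates")
```

A real GABDM run would replace the invented objectives with fitness functions derived from furniture layout and earthquake-risk criteria.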

Findings

The implementation of the GABDM resulted in optimal “safe footprint” solutions for both case studies. However, the results show that the fitness functions achieved in Case Study 1 differed from those achieved in Case Study 2. Furthermore, Case Study 2 generated more successful (higher-ranking) “safe footprint” alternatives with its proposed furniture system.

Originality/value

This study presents an original approach to dealing with earthquake risks in the context of interior design, as well as the development of a design model (GABDM) that uses a generative design method to reduce earthquake risks in interior spaces. By introducing the concept of a “safe footprint,” GABDM contributes explicitly to the prevention of earthquake risk. GABDM is adaptable to other architectural typologies that involve footprint and furniture relationships.

Details

Construction Innovation, vol. 24 no. 1
Type: Research Article
ISSN: 1471-4175

Article
Publication date: 18 April 2016

Petr Sosnin

Abstract

Purpose

Nowadays, experience bases are widely used by project companies in designing software-intensive systems (SISs). The efficiency of such informational sources is determined by the “nature” of the modeled experience units and the approaches applied to their systematization. This paper aims to increase the efficiency of designing SISs through ontological support for interactions with accessible experience, models of which are understood as intellectually processed conditioned reflexes.

Design/methodology/approach

Both the base of experience (BE) and the ontological support for interactions with its units are oriented toward precedents, built in accordance with the proposed normative schema as the occupational work is carried out by a team of designers. In creating the BE, and the ontology as part of it, the team should use a reflection of the operational space of solved tasks onto a specialized semantic memory intended for simulating applied reasoning of the question-answer type.

Findings

If the occupational space of designing is reflected onto semantic memory with a programmable shell, this environment can be adjusted to simulate the intellectual mechanisms that operate in human consciousness when designers interact ontologically with the BE and the tasks being solved. Simulating these processes of consciousness in accordance with their nature helps increase the efficiency of designing SISs.

Research limitations/implications

The orientation toward a precedent model as the basic type of experience unit, and the ontological approach to systematizing such units, are defined by the specifics of the study described in this paper. Models of precedents are constructed in accordance with the normative schema as the occupational work is carried out by a team of designers.

Practical implications

The means of ontological support investigated and developed here are oriented toward effective design of SISs by a team of designers using the toolkit Working In Questions and Answers (WIQA). The achieved effects aim to increase the level of success in collaborative design of SISs.

Social implications

The offered solutions are applicable to designing systems that support various relations between humans and artificial and natural environments. They facilitate natural interaction between humans and the computerized world.

Originality/value

The orientation toward the precedent model as the basic type of experience unit, and the ontological approach to systematizing such units, are defined by the specifics of this study. The novelty of the approach lies in the framework for the precedent model, understood as an intellectually processed conditioned reflex, in which the reflection onto semantic memory (of the question-answer type) is programmable in a conceptually algorithmic language. The ontological support is implemented in this programming environment.

Details

International Journal of Web Information Systems, vol. 12 no. 1
Type: Research Article
ISSN: 1744-0084

Article
Publication date: 1 March 2004

Y.T. Feng and D.R.J. Owen

Abstract

This paper proposes an energy-based general polygon-to-polygon normal contact model in which the normal and tangential directions, magnitude and reference contact position of the normal contact force are uniquely defined. The model in its final form is simple and elegant, with a clear geometric perspective, and also possesses some advanced features. Furthermore, it can be extended to more complex situations and, in particular, may provide a sound theoretical foundation for unifying existing contact models for all types of (convex) objects.
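To make the "energy-based" idea concrete: such formulations typically derive the normal contact force from a potential defined on the overlap region of the two polygons. The sketch below is not Feng and Owen's model; the geometry and the use of overlap area as the energy measure are illustrative assumptions. It computes the overlap of two convex polygons with Sutherland-Hodgman clipping and its area with the shoelace formula:

```python
# Toy sketch: overlap polygon of two convex polygons (CCW vertex lists,
# assumed in general position) via Sutherland-Hodgman clipping, plus its
# area via the shoelace formula -- a quantity an energy/penalty contact
# formulation could use as the contact "energy" measure.

def clip(subject, clipper):
    """Clip convex polygon `subject` against convex polygon `clipper`."""
    out = subject
    n = len(clipper)
    for i in range(n):
        ax, ay = clipper[i]
        bx, by = clipper[(i + 1) % n]
        inp, out = out, []
        def inside(p):
            # left of (or on) the directed edge a->b for a CCW clipper
            return (bx - ax) * (p[1] - ay) - (by - ay) * (p[0] - ax) >= 0
        def intersect(p, q):
            # intersection of segment pq with the (non-parallel) edge a->b
            dx1, dy1 = q[0] - p[0], q[1] - p[1]
            dx2, dy2 = bx - ax, by - ay
            t = ((ax - p[0]) * dy2 - (ay - p[1]) * dx2) / (dx1 * dy2 - dy1 * dx2)
            return (p[0] + t * dx1, p[1] + t * dy1)
        for j in range(len(inp)):
            cur, nxt = inp[j], inp[(j + 1) % len(inp)]
            if inside(cur):
                out.append(cur)
                if not inside(nxt):
                    out.append(intersect(cur, nxt))
            elif inside(nxt):
                out.append(intersect(cur, nxt))
        if not out:
            return []
    return out

def area(poly):
    """Shoelace formula for a CCW polygon."""
    if len(poly) < 3:
        return 0.0
    return 0.5 * sum(poly[i][0] * poly[(i + 1) % len(poly)][1]
                     - poly[(i + 1) % len(poly)][0] * poly[i][1]
                     for i in range(len(poly)))

# Two overlapping unit squares, offset by 0.5 in x:
a = [(0, 0), (1, 0), (1, 1), (0, 1)]
b = [(0.5, 0), (1.5, 0), (1.5, 1), (0.5, 1)]
overlap = clip(a, b)
print(round(area(overlap), 6))   # overlap area: 0.5
```

The paper's actual model goes further, defining the force direction and reference contact position uniquely from the geometry rather than from area alone.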

Details

Engineering Computations, vol. 21 no. 2/3/4
Type: Research Article
ISSN: 0264-4401

Book part
Publication date: 16 January 2023

Ingo Fiedler and Lennart Ante

Abstract

This chapter introduces the concept of stablecoins such as Tether, DAI or Ampleforth. It also provides a taxonomy of the different types of stablecoins, consisting of (1) traditional asset-backed stablecoins, (2) crypto-collateralized stablecoins, and (3) algorithmic stablecoins and seigniorage shares. The chapter continues with a brief history of stablecoins, from BitShares as the first stablecoin implementation, through Tether and market-wide stablecoin adoption, to the Facebook-initiated Diem. Next, the chapter explains the impact of stablecoins on cryptocurrencies and other markets and discusses the trends and challenges facing stablecoins. The chapter provides a basic understanding of stablecoins: their defining characteristics, challenges and markets.
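The supply-elastic "rebase" mechanism used by algorithmic stablecoins of the Ampleforth type can be illustrated in a few lines. This is a toy model, not any protocol's actual code; the target price, lag factor and balances are invented for illustration:

```python
# Toy illustration of a rebase-style algorithmic stablecoin: every holder's
# balance is scaled proportionally, so supply expands when the market price
# is above the target and contracts when it is below, nudging price back.

TARGET = 1.00        # target price in USD (assumed)
LAG = 10             # damping: apply 1/LAG of the deviation per rebase

def rebase(balances, market_price):
    """Scale all balances by the damped price deviation from target."""
    deviation = (market_price - TARGET) / TARGET
    factor = 1 + deviation / LAG
    return {holder: bal * factor for holder, bal in balances.items()}

balances = {"alice": 1000.0, "bob": 500.0}
# Price trades at $1.20: a 20% deviation damped by a lag of 10 expands
# every balance by 2%, leaving each holder's share of supply unchanged.
balances = rebase(balances, 1.20)
print(balances["alice"])   # 1020.0
```

Asset-backed and crypto-collateralized stablecoins, by contrast, hold price through redeemable reserves rather than supply adjustment.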

Details

The Emerald Handbook on Cryptoassets: Investment Opportunities and Challenges
Type: Book
ISBN: 978-1-80455-321-3

Book part
Publication date: 12 March 2001

Stephen J. DeCanio, William E. Watkins, Glenn Mitchell, Keyvan Amir-Atefi and Catherine Dibble

Abstract

Algorithmic models specifying the kinds of computations carried out by economic organizations have the potential to account for the serious discrepancies between the real-world behavior of firms and the predictions of conventional maximization models. The algorithmic approach uncovers a surprising degree of complexity in organizational structure and performance. The fact that firms are composed of networks of individual agents drastically raises the complexity of the firm's optimization problem. Even in very simple network models, a large number of organizational characteristics, including some for which no polynomial-time computational algorithm is known, appear to influence economic performance. We explore these effects using regression analysis and through the application of standard search heuristics. The calculations show that discovering optimal network structures can be extraordinarily difficult, even when a single clear organizational objective exists and the agents belonging to the firms are homogeneous. One implication is that firms are likely to operate at local rather than global optima. In addition, if organizational fitness is a function of the ability to solve multiple problems, the structure that evolves may not solve any of the individual problems optimally. These results raise the possibility that externally driven objectives, such as energy efficiency or pollution control, may shift the firm to a new structural compromise that also advances other objectives of the firm, rather than necessarily imposing economic losses.
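The chapter's central point, that greedy search over network structures gets trapped at local optima, can be illustrated with a toy hill climb. Everything below (the objective, the cost weights, the firm size) is our invented example, not the authors' model:

```python
import itertools
import random
from collections import Counter

# Toy sketch: search over communication networks for a small "firm" of
# agents. A hypothetical performance score rewards pairs of agents that
# can reach each other, and penalizes link cost and the coordination load
# on the busiest agent. Greedy single-link hill climbing from different
# random starting networks can settle on structurally different optima.

N = 5                                   # agents in the firm
EDGES = list(itertools.combinations(range(N), 2))

def performance(edge_set):
    """Hypothetical objective: connected pairs minus link and hub costs."""
    parent = list(range(N))             # union-find over agents
    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i
    for a, b in edge_set:
        parent[find(a)] = find(b)
    sizes = Counter(find(i) for i in range(N)).values()
    connected_pairs = sum(s * (s - 1) // 2 for s in sizes)
    degree = Counter(v for e in edge_set for v in e)
    max_deg = max(degree.values(), default=0)
    return connected_pairs - 0.6 * len(edge_set) - 0.5 * max_deg

def hill_climb(start):
    """Toggle one link at a time while the score improves."""
    current = set(start)
    improved = True
    while improved:
        improved = False
        for e in EDGES:
            neighbour = current ^ {e}   # add or remove link e
            if performance(neighbour) > performance(current):
                current, improved = neighbour, True
    return current, performance(current)

random.seed(0)
optima = {hill_climb({e for e in EDGES if random.random() < 0.5})[1]
          for _ in range(20)}
print(sorted(optima))                   # scores of the local optima reached
```

Under this objective the global optimum is a path network (score 6.6), but a climb that reaches a hub-and-spoke structure cannot escape it by single-link moves, which is the local-optimum effect the chapter describes.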

Details

The Long-Term Economics of Climate Change: Beyond a Doubling of Greenhouse Gas Concentrations
Type: Book
ISBN: 978-0-76230-305-2

Article
Publication date: 14 December 2020

Evgeny Volchenkov

Abstract

Purpose

The purpose of this paper is to establish the nature of mathematical modeling of systems within the framework of the object-semantic methodology.

Design/methodology/approach

The initial methodological position of the object-semantic approach is the principle of constructing the concepts of informatics from fundamental categories and laws. As the appropriate foundation, this paper adopts the system-physical meta-ontology developed within it.

Findings

The genesis of system modeling is considered in terms of the evolution of language tools in the direction of objectification. A new conception of formalized knowledge is put forward: the mathematical form of fixing time-invariant relations of the universe, reflecting the regularity of the dynamics of natural or anthropogenic organization. Object knowledge is considered a key component of the mathematical model, and solving system information problems with its use is characterized as “work of knowledge.” Establishing the meta-ontological essence of modern mathematical modeling allows its fundamental limitations to be formulated.

Originality/value

The establishment of the system-physical limitations of modern mathematical modeling outlines the boundaries from which the development of future paradigms of cognition of the surrounding world must proceed; such paradigms presuppose convergence, a synthesis of causal (physicalism) and target (elevationism) determination.

Details

Kybernetes, vol. 50 no. 9
Type: Research Article
ISSN: 0368-492X

Article
Publication date: 4 July 2016

Hammad Abdullah Al Nasseri, Kristian Widen and Radhlinah Aulin

Abstract

Purpose

The implementation and control processes of project planning and scheduling involve a wide range of methods and tools. Despite the development of project management theory and its modification and integration with newer scheduling approaches in particular, practitioners’ views on the efficiency and effectiveness of these methods and tools differ. This situation can be attributed in part to a lack of understanding of the most appropriate basis for implementing them. This study, therefore, aims to overcome this deficiency by conceptualizing and adopting a taxonomy of planning and scheduling methods.

Design/methodology/approach

The study is based on a review and discourse analysis of the literature covering a large number of theoretical and empirical studies. The underlying theories of various planning and scheduling methods were analyzed with respect to the taxonomy criteria adopted in the study.

Findings

Using the taxonomy, the key characteristics of the planning and scheduling methods considered in this study were identified and interpreted. These included concepts and theories; key features; suitability and usability; and benefits and limitations. Overall, the findings suggest that project managers should consider the taxonomy as a support tool for selecting and prioritizing the most appropriate method, or combination of methods, for managing their projects. Recommendations include the need for more advanced or multidimensional taxonomies to cope with the diversity of project types and sizes.

Originality/value

The results of the study allow project managers to improve their current practices by utilizing the taxonomy when considering the implementation of planning and scheduling methods. Moreover, the taxonomy can promote learning on the part of those less experienced in planning and scheduling, and can serve as an initial platform for further research in this area.

Details

Journal of Engineering, Design and Technology, vol. 14 no. 3
Type: Research Article
ISSN: 1726-0531

Article
Publication date: 15 October 2018

Abednico Lopang Montshiwa

Abstract

Purpose

This study aims to present a competitive advantages framework suited to disaster-prone regions in the era of climate change, to present supply chain cooperation (SCC) as an integral part of green supply chain management (GrSCM) within the automobile industry and to evaluate the merits of the competitive advantages framework based on SCC as a new implementation tool.

Design/methodology/approach

In an effort to address the limited green supply chain management implementation strategies available in disaster-prone regions, the paper presents SCC as an economic, social and political implementation tool. To explore this, the study introduces SCC in a three-phase competitive advantages model adapted from Barney’s (1995) model (with slight differences). The Smart PLS 3.0 software package was adopted to carry out multivariate data analysis. The study assumes a capitalist economic system and bases its analysis on stockholder theoretical lenses.

Findings

Company size does not significantly affect SCC, suggesting that companies of all sizes can organize and enhance their networks to be cooperative. Companies with cooperative supply chain networks tend to have competitive advantages. SCC is also a viable way to manage business risks, be they internal or external.

Research limitations/implications

One of the study’s limitations is the stockholder theory it adopts, which rests its assumptions on a capitalist economic model of operation, whereas the study covered China, widely seen as a communist-based economy. Another limitation is that data collection was narrowed to disaster-prone areas as documented by Guha-Sapir et al. (2012). Consequently, the findings of this study might be applicable only to areas that experience significant disruptions, usually caused by disaster incidents.

Originality/value

The study is also the first of its kind to propose a model for automobile manufacturing in disaster-prone regions. This is done by introducing SCC as an economic, social and political factor, while risk ranking is introduced as an environmental factor to constitute the external changes that Barney (1995) introduced.

Details

Competitiveness Review: An International Business Journal, vol. 28 no. 5
Type: Research Article
ISSN: 1059-5422
