
Search results

1 – 10 of over 2000
Article
Publication date: 1 February 1988

Expert systems and the use of information in building design and construction

Colin H. Davidson, Philippe L. Davidson and Kalev Ruberg

Abstract

The building industry, through its structure and its mandate, faces endemic information problems; expert systems are expected to have a positive impact. Expert systems are suited to situations of uncertainty; knowledge and reasoning are kept separate, allowing easier updating. Knowledge acquisition from human experts is difficult and problems of information reliability arise, suggesting scope for cooperation between knowledge engineers and documentalists familiar with the domain. In building, prevailing conditions seem to indicate the appropriateness of expert systems, particularly during the design phase; however, written documentation and general research results are rarely consulted. This highlights the need for an information ‘refining’ stage between production and use. It is easier to set up expert systems for specialised sub‐domains; however, on‐going research is attempting to develop a comprehensive approach to project‐specific information that would be operational from initial design through to completed construction. Criteria for such a comprehensive design information system are listed.

Details

Journal of Documentation, vol. 44 no. 2
Type: Research Article
DOI: https://doi.org/10.1108/eb026820
ISSN: 0022-0418

Article
Publication date: 8 March 2020

Evaluating the effectiveness of Google, Parsijoo, Rismoon, and Yooz to retrieve Persian documents

Mahdi Zeynali Tazehkandi and Mohsen Nowkarizi

Abstract

Purpose

The purpose was to evaluate the effectiveness of Google (as an international search engine) as well as of Parsijoo, Rismoon, and Yooz (as Persian search engines).

Design/methodology/approach

In this research, the Google search engine, as an international search engine, and three local ones, Parsijoo, Rismoon, and Yooz, were selected for evaluation. Then, 32 subject headings were selected from the Persian Subject Headings List, and simulated work tasks were designed based on them. A total of 192 students from Ferdowsi University of Mashhad were asked to search for the information needed for the simulated work tasks in the selected search engines and to copy the relevant website URLs into the search form.

Findings

The findings indicated that Google, Parsijoo, Rismoon, and Yooz differed significantly in precision, recall, and normalized discounted cumulative gain (NDCG). There was also a significant difference in the overall effectiveness (the average of precision, recall, and NDCG) of the four search engines in retrieving Persian resources.
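For readers unfamiliar with the measure compared in these findings, NDCG can be sketched in a few lines. The graded relevance scores below are illustrative values, not data from the study:

```python
import math

def dcg(relevances):
    """Discounted cumulative gain: each rank i is discounted by log2(i + 1)."""
    return sum(rel / math.log2(i + 2) for i, rel in enumerate(relevances))

def ndcg(relevances):
    """Normalize DCG by the DCG of the ideal (descending-sorted) ranking."""
    ideal = dcg(sorted(relevances, reverse=True))
    return dcg(relevances) / ideal if ideal > 0 else 0.0

# Hypothetical graded relevance of the top five results for one query.
ranking = [3, 2, 0, 1, 2]
score = ndcg(ranking)
```

The logarithmic discount rewards search engines that place highly relevant documents near the top of the result list, which is why NDCG complements plain precision and recall.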

Practical implications

Users of a more effective search engine will retrieve more relevant documents. Google was the most effective of the four at retrieving Persian resources and is therefore recommended.

Originality/value

In this research, for the first time, Google has been compared with local Persian search engines using a new approach (simulated work tasks).

Details

Library Hi Tech, vol. ahead-of-print no. ahead-of-print
Type: Research Article
DOI: https://doi.org/10.1108/LHT-11-2019-0229
ISSN: 0737-8831

Keywords

  • Evaluation
  • Information retrieval
  • Effectiveness
  • Search engines
  • Google
  • Online retrieval

Article
Publication date: 1 October 2004

Introduction to bibliometrics for construction and maintenance of thesauri: Methodical considerations

Jesper W. Schneider and Pia Borlund

Abstract

The paper introduces bibliometrics to the research area of knowledge organization – more precisely, in relation to the construction and maintenance of thesauri. The paper reviews related work that inspired the assembly of a semi‐automatic, bibliometric‐based approach to construction and maintenance, and discusses the methodical considerations behind the approach. Finally, the semi‐automatic approach is used to verify the applicability of bibliometric methods as a supplement to thesaurus construction and maintenance. In the context of knowledge organization, the paper outlines two fundamental approaches: the manual intellectual approach and the automatic algorithmic approach. Bibliometric methods belong to the latter, though bibliometrics has special characteristics that are substantially different from other methods within this approach.

Details

Journal of Documentation, vol. 60 no. 5
Type: Research Article
DOI: https://doi.org/10.1108/00220410410560609
ISSN: 0022-0418

Keywords

  • Knowledge management
  • Controlled language construction
  • Cataloguing

Article
Publication date: 3 July 2020

Algorithmic methods to explore the automation of the appraisal of structured and unstructured digital data

Basma Makhlouf Shabou, Julien Tièche, Julien Knafou and Arnaud Gaudinat

Open Access

Abstract

Purpose

This paper aims to describe an interdisciplinary and innovative research project conducted in Switzerland, at the Geneva School of Business Administration HES-SO and supported by the State Archives of Neuchâtel (Office des archives de l'État de Neuchâtel, OAEN). The problem to be addressed is one of the most classical ones: how to extract and discriminate relevant data in a huge amount of diversified and complex data record formats and contents. The goal of this study is to provide a framework and a proof of concept for software that supports defensible decisions on the retention and disposal of records and data proposed to the OAEN. For this purpose, the authors designed two axes: the archival axis, to propose archival metrics for the appraisal of structured and unstructured data, and the data mining axis, to propose algorithmic methods as complementary and/or additional metrics for the appraisal process.

Design/methodology/approach

Based on these two axes, this exploratory study designs and tests the feasibility of archival metrics that are paired with data mining metrics, to advance the digital appraisal process in a systematic or even automatic way. Under Axis 1, the authors took three steps: first, the design of a conceptual framework for records and data appraisal with a detailed three-dimensional approach (trustworthiness, exploitability, representativeness), together with the main principles and postulates guiding the operationalization of the conceptual dimensions. Second, the operationalization expressed the proposed metrics as variables supported by a quantitative method for their measurement and scoring. Third, the authors shared this conceptual framework, with its dimensions and operationalized variables (metrics), with experienced professionals to validate them. The experts' feedback gave the authors an indication of the relevance and the feasibility of these metrics; those two aspects may demonstrate the acceptability of such a method in real-life archival practice. In parallel, Axis 2 proposes functionalities covering not only macro analysis of data but also the algorithmic methods enabling the computation of digital archival and data mining metrics. On this basis, three use cases were proposed to imagine plausible and illustrative scenarios for the application of such a solution.

Findings

The main results demonstrate the feasibility of measuring the value of data and records with a reproducible method. More specifically, for Axis 1, the authors applied the metrics in a flexible and modular way and defined the main principles needed to enable a computational scoring method. The results obtained through the experts' consultation on the relevance of 42 metrics indicate an acceptance rate above 80%. In addition, the results show that 60% of all metrics can be automated. Regarding Axis 2, 33 functionalities were developed and proposed under six main types: macro analysis, microanalysis, statistics, retrieval, administration and, finally, decision modeling and machine learning. The relevance of the metrics and functionalities is based on the theoretical validity and computational character of their method. These results are largely satisfactory and promising.

Originality/value

This study offers a valuable aid to improve the validity and performance of archival appraisal processes and decision-making. Transferability and applicability of these archival and data mining metrics could be considered for other types of data. An adaptation of this method and its metrics could be tested on research data, medical data or banking data.

Details

Records Management Journal, vol. 30 no. 2
Type: Research Article
DOI: https://doi.org/10.1108/RMJ-09-2019-0049
ISSN: 0956-5698

Keywords

  • Archival appraisal
  • Algorithmic method
  • Appraisal criteria
  • Appraisal metrics
  • Automation appraisal
  • Data mining

Article
Publication date: 1 December 1996

Integration algorithms for concrete plasticity

G. Etse and K. Willam

Abstract

Presents a computational algorithm for the numerical integration of triaxial concrete plasticity formulations. The specific material formulation at hand is the so‐called Extended Leon model for concrete. It is based on the flow theory of plasticity, which entails isotropic hardening as well as fracture energy‐based softening in addition to non‐associated plastic flow. The numerical algorithm resorts to implicit integration according to the backward Euler strategy, which enforces plastic consistency according to the closest‐point‐projection method (generalized radial‐return strategy). Numerical simulations illustrate the overall performance of the proposed algorithm and the significant increase of the convergence rate when the algorithmic tangent is used in place of the continuum operator.
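The backward Euler return-mapping strategy described in this abstract is easiest to see in one dimension. The sketch below uses simple linear isotropic hardening with illustrative material constants; it is not the Extended Leon model itself, which adds softening and non-associated flow:

```python
def radial_return_1d(eps_total, eps_p, alpha, E=200e3, H=10e3, sigma_y=250.0):
    """One backward-Euler return-mapping step for 1D linear isotropic hardening.

    eps_total : total strain at the end of the step
    eps_p     : plastic strain at the start of the step
    alpha     : accumulated plastic strain (hardening variable)
    Returns (stress, eps_p, alpha, tangent).  Constants are illustrative.
    """
    # Elastic trial state: freeze plastic flow and test the yield condition.
    sigma_trial = E * (eps_total - eps_p)
    f_trial = abs(sigma_trial) - (sigma_y + H * alpha)
    if f_trial <= 0.0:
        return sigma_trial, eps_p, alpha, E          # elastic step
    # Plastic correction: enforce consistency f = 0 at the end of the step
    # (the "return" onto the yield surface).
    dgamma = f_trial / (E + H)
    sign = 1.0 if sigma_trial >= 0.0 else -1.0
    sigma = sigma_trial - E * dgamma * sign
    eps_p += dgamma * sign
    alpha += dgamma
    # Algorithmic (consistent) tangent; using it instead of E is what restores
    # the fast convergence rate noted in the abstract.
    tangent = E * H / (E + H)
    return sigma, eps_p, alpha, tangent
```

The same two-phase structure (elastic predictor, plastic corrector that projects the trial stress back onto the yield surface) generalizes to the triaxial case treated in the paper.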

Details

Engineering Computations, vol. 13 no. 8
Type: Research Article
DOI: https://doi.org/10.1108/02644409610153005
ISSN: 0264-4401

Keywords

  • Algorithms
  • Plasticity

Article
Publication date: 1 October 2005

Predictor‐corrector procedures for pseudo‐dynamic tests

Alessio Bonelli and Oreste S. Bursi

Abstract

Purpose

To propose novel predictor‐corrector time‐integration algorithms for pseudo‐dynamic testing.

Design/methodology/approach

The novel predictor‐corrector time‐integration algorithms are based on both the implicit and the explicit version of the generalized‐α method. In the non‐linear unforced case, second‐order accuracy, stability in energy, energy decay in the high‐frequency range and asymptotic annihilation are distinctive properties of the generalized‐α scheme; in the non‐linear forced case, its distinctive property is the limited error near resonance, in terms of both frequency location and intensity of the resonant peak. The implicit generalized‐α algorithm has been implemented in a predictor‐one‐corrector form, giving rise to the implicit IPC‐ρ∞ method, which avoids iterative corrections that are expensive from an experimental standpoint, as well as load oscillations of numerical origin. Moreover, the scheme embodies a secant stiffness formula able to closely approximate the actual stiffness of a structure. An explicit algorithm, the EPC‐ρb method, has also been implemented, endowed with user‐controlled dissipation properties. The resulting schemes have been tested experimentally both on a two‐ and on a six‐degrees‐of‐freedom system, exploiting substructuring techniques.

Findings

The analytical findings and the tests have indicated that the proposed numerical strategies enhance the performance of the pseudo‐dynamic test (PDT) method even in an environment characterized by considerable experimental errors. Moreover, the schemes have been tested numerically on strongly non‐linear multiple‐degrees‐of‐freedom systems reproduced with the Bouc‐Wen hysteretic model, showing that the proposed algorithms reap the benefits of the parent generalized‐α methods.

Research limitations/implications

Further developments envisaged for this study are the application of the IPC‐ρ∞ method and of EPC‐ρb scheme to partitioned procedures for high‐speed pseudo‐dynamic testing with substructuring.

Practical implications

The implicit IPC‐ρ∞ and the explicit EPC‐ρb methods give the user control over numerical dissipation, which reduces the effects of experimental error in the PDT without requiring onerous iterations.

Originality/value

The paper proposes novel time‐integration algorithms for pseudo‐dynamic testing. Thanks to a predictor‐corrector form of the generalized‐α method, the proposed schemes maintain a high computational efficiency and accuracy.

Details

Engineering Computations, vol. 22 no. 7
Type: Research Article
DOI: https://doi.org/10.1108/02644400510619530
ISSN: 0264-4401

Keywords

  • Tests and testing
  • Programming and algorithm theory
  • Numerical analysis
  • Structures

Article
Publication date: 1 April 1987

The Creation of Expert Performers without Years of Conventional Experience: the Landamatic Method

Lev N. Landa

Abstract

“Landamatics” is a label given by American scholars to Landa's algorithmic‐heuristic theory and method of performance, learning and instruction. Landamatics analyses and explains the mental processes which underlie expert performance, learning and decision making. It also defines specific ways of purposeful and accelerated development of such processes in novices and non‐experts through a special course of instruction.

Details

Journal of Management Development, vol. 6 no. 4
Type: Research Article
DOI: https://doi.org/10.1108/eb051652
ISSN: 0262-1711

Keywords

  • Algorithms
  • Heuristic Methods
  • Learning
  • Teaching Methods

Article
Publication date: 6 March 2020

Design in 3D: a computational fashion design protocol

Katherine Celia Greder, Jie Pei and Jooyoung Shin

Abstract

Purpose

The purpose of this study was to create a corset—understructure as well as fabric covering—using only computational, 3D approaches to fashion design. The process incorporated 3D body scan data, parametric methods for the 3D-printed design, and algorithmic methods for the automated, custom-fit fabric pattern.

Design/methodology/approach

The protocol-based framework that formed the nucleus of this design project (see Figure 1) enabled more concentrated research into the iterative, step-by-step procedure and the computational techniques used herein.

Findings

The 3D computational methods in this study demonstrated a new way of rendering the body-to-pattern relationship through the use of multiple software platforms. Using body scan data and computer coding, the computational construction methods in this study showed a pliant and sustainable method of clothing design where designers were able to manipulate the X, Y, and Z coordinates of the points on the scan surface.

Research limitations/implications

A study of algorithmic methods is inherently a study of limitation. The iterative process of design was defined and refined through the particularity of an algorithm, which required thoughtful manipulation to inform the outcome of this research.

Practical implications

This study sought to illustrate the use and limitations of algorithm-driven computer programming to advance creative design practices.

Social implications

As body scan data and biometric information become increasingly common components of computational fashion design practices, the need for more research on the use of these techniques is pressing. Moreover, computational techniques serve as a catalyst for discussions about the use of biometric information in design and data modeling.

Originality/value

The process of designing in 3D allowed for the dynamic capability to manipulate proportion and form using parametric design techniques.

Details

International Journal of Clothing Science and Technology, vol. 32 no. 4
Type: Research Article
DOI: https://doi.org/10.1108/IJCST-07-2019-0110
ISSN: 0955-6222

Keywords

  • Computational design
  • 3D printing
  • Parametric design
  • Corset
  • Digital design
  • Virtual fit

Article
Publication date: 3 August 2015

Consistent crystal plasticity kinematics and linearization for the implicit finite element method

Mark Messner, Armand Beaudoin and Robert Dodds

Abstract

Purpose

The purpose of this paper is to describe several novel techniques for implementing a crystal plasticity (CP) material model in a large deformation, implicit finite element framework.

Design/methodology/approach

Starting from the key kinematic assumptions of CP, the presentation develops the necessary CP correction terms to several common objective stress rates and the consistent linearization of the stress update algorithm. Connections to models for slip system hardening are isolated from these processes.

Findings

A kinematically consistent implementation is found to require a correction to the stress update to include plastic vorticity developed by slip deformation in polycrystals. A simpler, more direct form for the algorithmic tangent is described. Several numerical examples demonstrate the capabilities and computational efficiency of the formulation.

Research limitations/implications

The implementation assumes isotropic slip system hardening. With simple modifications, the described approach extends readily to anisotropic coupled or uncoupled hardening functions.

Practical implications

The modular formulation and implementation support streamlined development of new models for slip system hardening without modifications of the stress update and algorithmic tangent computations. This implementation is available in the open-source code WARP3D.

Originality/value

In the process of developing the CP formulation, this work identified the need for corrections to the Green-Naghdi and Jaumann objective stress rates to account properly for non-zero plastic vorticity. The paper fully describes the consistent linearization of the stress update algorithm and details a new scheme to implement the model with improved efficiency.

Details

Engineering Computations, vol. 32 no. 6
Type: Research Article
DOI: https://doi.org/10.1108/EC-05-2014-0107
ISSN: 0264-4401

Keywords

  • Crystal plasticity
  • Green-Naghdi rate
  • Implicit finite element method
  • Numerical efficiency
  • Objective rate
  • Slip system hardening

Article
Publication date: 1 May 1999

Towards a formal theory of development/evolution and characterization of time discretized operators for heat transfer

Kumar K. Tamma, Xiangmin Zhou and Desong Sha

Abstract

The time‐discretization process of transient equation systems is an important concern in computational heat transfer applications. The present paper therefore describes a formal basis for the theoretical concepts, evolution and development, and characterization of a wide class of time discretized operators for transient heat transfer computations. Emanating from a common family tree and explained via a generalized time‐weighted philosophy, the paper addresses the development and evolution of time integral operators [IO], leading to integration operators [InO] in time encompassing single‐step integration operators [SSInO], multi‐step integration operators [MSInO], and a class of finite element in time integration operators [FETInO], including the relationships among them and the resulting consequences. Also depicted are the so‐called discrete numerically assigned [DNA] algorithmic markers, comprising both the weighted time fields and the corresponding conditions imposed upon the dependent‐variable approximation, which uniquely characterize a wide class of transient algorithms. The paper thereby provides a plausible standardized formal framework for referring to and/or relating time discretized operators applicable to transient heat transfer computations.

Details

International Journal of Numerical Methods for Heat & Fluid Flow, vol. 9 no. 3
Type: Research Article
DOI: https://doi.org/10.1108/09615539910260185
ISSN: 0961-5539

Keywords

  • Discretized operators
  • Heat transfer

© 2021 Emerald Publishing Limited
