Search results: 1 – 10 of over 2,000
Colin H. Davidson, Philippe L. Davidson and Kalev Ruberg
Abstract
The building industry, through its structure and its mandate, faces endemic information problems; expert systems are expected to have a positive impact. Expert systems are suited to situations of uncertainty; their knowledge and reasoning are separated, allowing easier updating. Knowledge acquisition from human experts is difficult and problems of information reliability arise, suggesting scope for cooperation between knowledge engineers and documentalists familiar with the domain. In building, prevailing conditions seem to indicate the appropriateness of expert systems, particularly during the design phase; however, written documentation and general research results are rarely consulted. This highlights the need for an information ‘refining’ stage between production and use. It is easier to set up expert systems for specialised sub-domains; however, on-going research is attempting to develop a comprehensive approach to project-specific information that would be operational from initial design through to completed construction. Criteria for a comprehensive design information system can be listed.
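The separation of knowledge from reasoning mentioned above is the defining architectural feature of expert systems: the rules are plain data that a domain documentalist could update, while the inference loop never changes. A minimal sketch of that separation follows; the rule contents about design-phase information are invented for illustration and are not taken from the paper.

# Rules are data (easy to update); the inference procedure is generic.
# The facts and rules below are hypothetical examples only.
RULES = [
    ({"design_phase", "unfamiliar_material"}, "consult_technical_literature"),
    ({"consult_technical_literature", "no_in_house_expert"}, "use_refined_information_service"),
]

def forward_chain(facts, rules):
    """Fire every rule whose premises hold until no new conclusions appear."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if premises <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts

print(forward_chain({"design_phase", "unfamiliar_material", "no_in_house_expert"}, RULES))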
Mahdi Zeynali Tazehkandi and Mohsen Nowkarizi
Abstract
Purpose
The purpose was to evaluate the effectiveness of Google (as an international search engine) as well as of Parsijoo, Rismoon, and Yooz (as Persian search engines).
Design/methodology/approach
In this research, Google (as an international search engine) and three local Persian search engines, Parsijoo, Rismoon and Yooz, were selected for evaluation. Thirty-two subject headings were selected from the Persian Subject Headings List, and simulated work tasks were then constructed on their basis. A total of 192 students from Ferdowsi University of Mashhad were asked to search the selected search engines for the information needed to complete the simulated work tasks and then to copy the URLs of the relevant websites into the search form.
Findings
The findings indicated that Google, Parsijoo, Rismoon and Yooz differed significantly in precision, recall and normalized discounted cumulative gain (NDCG). There was also a significant difference in the overall effectiveness (the average of precision, recall and NDCG) of the four search engines in retrieving Persian resources.
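Precision, recall and NDCG are standard retrieval measures; a minimal sketch of how they are typically computed from a ranked result list and a set of relevance judgements is given below. These are the textbook binary-relevance definitions, not the authors' exact scoring procedure, and the example URLs are invented.

import math

def precision(retrieved, relevant):
    # proportion of retrieved documents that are relevant
    return sum(1 for d in retrieved if d in relevant) / len(retrieved)

def recall(retrieved, relevant):
    # proportion of relevant documents that were retrieved
    return sum(1 for d in retrieved if d in relevant) / len(relevant)

def ndcg(retrieved, relevant, k=10):
    # binary-relevance NDCG: each hit is discounted by the log of its rank
    dcg = sum(1 / math.log2(rank + 2) for rank, d in enumerate(retrieved[:k]) if d in relevant)
    idcg = sum(1 / math.log2(rank + 2) for rank in range(min(len(relevant), k)))
    return dcg / idcg if idcg else 0.0

# Hypothetical judgement set and ranking for one simulated work task.
relevant = {"url1", "url3", "url7"}
retrieved = ["url1", "url2", "url3", "url4", "url5"]
print(precision(retrieved, relevant), recall(retrieved, relevant), ndcg(retrieved, relevant))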
Practical implications
Users of an efficient search engine will retrieve more relevant documents, and Google was the most efficient at retrieving Persian resources; its use is therefore recommended.
Originality/value
In this research, for the first time, Google has been compared with local Persian search engines using the simulated work task approach.
Jesper W. Schneider and Pia Borlund
Abstract
The paper introduces bibliometrics to the research area of knowledge organization, more precisely in relation to the construction and maintenance of thesauri. As such, the paper reviews related work that has inspired the assembly of a semi-automatic, bibliometric-based approach to thesaurus construction and maintenance, and discusses the methodological considerations behind the approach. Finally, the semi-automatic approach is used to verify the applicability of bibliometric methods as a supplement to thesaurus construction and maintenance. In the context of knowledge organization, the paper outlines two fundamental approaches: the manual intellectual approach and the automatic algorithmic approach. Bibliometric methods belong to the automatic algorithmic approach, though bibliometrics have special characteristics that differ substantially from other methods within this approach.
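Bibliometric support for thesaurus work typically starts from co-occurrence counts over a document corpus. The sketch below shows that generic idea only: term pairs that co-occur frequently become candidate related-term links for intellectual review. It is not the authors' specific procedure, and the sample terms are invented.

from collections import Counter
from itertools import combinations

# Hypothetical indexing terms per document; real input would come from a bibliographic corpus.
doc_terms = [
    {"information retrieval", "thesaurus", "indexing"},
    {"thesaurus", "knowledge organization"},
    {"information retrieval", "indexing", "query expansion"},
]

cooc = Counter()
for terms in doc_terms:
    for a, b in combinations(sorted(terms), 2):
        cooc[(a, b)] += 1

# Frequently co-occurring pairs are candidate related-term (RT) links,
# to be verified intellectually before being added to the thesaurus.
for pair, count in cooc.most_common(3):
    print(pair, count)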
Basma Makhlouf Shabou, Julien Tièche, Julien Knafou and Arnaud Gaudinat
Abstract
Purpose
This paper aims to describe an interdisciplinary and innovative research project conducted in Switzerland at the Geneva School of Business Administration HES-SO and supported by the State Archives of Neuchâtel (Office des archives de l'État de Neuchâtel, OAEN). The problem addressed is a classical one: how to extract and discriminate relevant data within a huge volume of diversified and complex record formats and contents. The goal of this study is to provide a framework and a proof of concept for software that helps in taking defensible decisions on the retention and disposal of records and data proposed to the OAEN. For this purpose, the authors designed two axes: an archival axis, proposing archival metrics for the appraisal of structured and unstructured data, and a data mining axis, proposing algorithmic methods as complementary and/or additional metrics for the appraisal process.
Design/methodology/approach
Based on these two axes, this exploratory study designs and tests the feasibility of archival metrics paired with data mining metrics, to advance the digital appraisal process in a systematic or even automatic way as far as possible. Under Axis 1, the authors took three steps. First, they designed a conceptual framework for records and data appraisal with a detailed three-dimensional approach (trustworthiness, exploitability, representativeness) and defined the main principles and postulates guiding the operationalization of these conceptual dimensions. Second, the operationalization proposed metrics expressed as variables, supported by a quantitative method for their measurement and scoring. Third, the authors shared the conceptual framework, its dimensions and its operationalized variables (metrics) with experienced professionals to validate them; the experts' feedback gave an indication of the relevance and feasibility of the metrics, two aspects that bear on the acceptability of such a method in real-life archival practice. In parallel, Axis 2 proposes functionalities covering not only macro-analysis of the data but also the algorithmic methods needed to compute the digital archival and data mining metrics. On this basis, three use cases were proposed to sketch plausible and illustrative scenarios for the application of such a solution.
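To make the Axis 1 idea concrete, the sketch below shows one plausible way to turn scored metrics in the three dimensions (trustworthiness, exploitability, representativeness) into a single weighted appraisal score. The metric names, scores and weights are invented for illustration; they are not the study's 42 metrics.

# Each dimension holds metric scores in [0, 1]; dimension weights sum to 1.
# All names and numbers below are hypothetical.
appraisal = {
    "trustworthiness":    {"provenance_documented": 1.0, "fixity_verified": 0.5},
    "exploitability":     {"open_format": 1.0, "metadata_complete": 0.75},
    "representativeness": {"covers_core_mandate": 0.8},
}
weights = {"trustworthiness": 0.4, "exploitability": 0.3, "representativeness": 0.3}

def dimension_score(metrics):
    # unweighted mean of the metric scores within one dimension
    return sum(metrics.values()) / len(metrics)

total = sum(weights[d] * dimension_score(m) for d, m in appraisal.items())
print(f"overall appraisal score: {total:.2f}")  # could feed a retain/dispose threshold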
Findings
The main results demonstrate the feasibility of measuring the value of data and records with a reproducible method. More specifically, for Axis 1, the authors applied the metrics in a flexible and modular way and defined the main principles needed to enable a computational scoring method. The results of the experts' consultation on the relevance of the 42 metrics indicate an acceptance rate above 80%, and show that 60% of the metrics can be automated. Regarding Axis 2, 33 functionalities were developed and proposed under six main types: macro-analysis, micro-analysis, statistics, retrieval, administration and, finally, decision modeling and machine learning. The relevance of the metrics and functionalities rests on the theoretical validity and computational character of their method. These results are largely satisfactory and promising.
Originality/value
This study offers a valuable aid to improve the validity and performance of archival appraisal processes and decision-making. Transferability and applicability of these archival and data mining metrics could be considered for other types of data. An adaptation of this method and its metrics could be tested on research data, medical data or banking data.
Abstract
Presents a computational algorithm for the numerical integration of triaxial concrete plasticity formulations. The specific material formulation at hand is the so-called Extended Leon model for concrete. It is based on the flow theory of plasticity, which entails isotropic hardening as well as fracture energy-based softening in addition to non-associated plastic flow. The numerical algorithm resorts to implicit integration according to the backward Euler strategy, which enforces plastic consistency via the closest-point projection method (a generalized radial-return strategy). Numerical simulations illustrate the overall performance of the proposed algorithm and the significant increase of the convergence rate when the algorithmic tangent is used in place of the continuum operator.
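For readers unfamiliar with closest-point projection, the sketch below shows the same backward-Euler predictor/return-map structure for a much simpler material, von Mises plasticity with linear isotropic hardening, where the projection reduces to the classical radial return. It is a generic illustration only; the Extended Leon model of the paper involves a far more elaborate yield surface, non-associated flow and softening.

import numpy as np

def radial_return(deps, eps_e, alpha, E=30e3, nu=0.2, sigma_y=30.0, H=500.0):
    """One backward-Euler step for von Mises plasticity with linear isotropic
    hardening; deps and eps_e are symmetric 3x3 strain tensors."""
    G, K = E / (2 * (1 + nu)), E / (3 * (1 - 2 * nu))
    eps_e_tr = eps_e + deps                              # elastic predictor
    vol = np.trace(eps_e_tr) / 3.0
    s_tr = 2 * G * (eps_e_tr - vol * np.eye(3))          # trial deviatoric stress
    f_tr = np.sqrt(1.5) * np.linalg.norm(s_tr) - (sigma_y + H * alpha)
    if f_tr <= 0:                                        # step stays elastic
        return s_tr + 3 * K * vol * np.eye(3), eps_e_tr, alpha
    dgamma = f_tr / (3 * G + H)                          # enforce plastic consistency
    n = s_tr / np.linalg.norm(s_tr)
    s = s_tr - 2 * G * dgamma * np.sqrt(1.5) * n         # radial return to the yield surface
    eps_e_new = s / (2 * G) + vol * np.eye(3)
    return s + 3 * K * vol * np.eye(3), eps_e_new, alpha + dgamma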
Alessio Bonelli and Oreste S. Bursi
Abstract
Purpose
To propose novel predictor‐corrector time‐integration algorithms for pseudo‐dynamic testing.
Design/methodology/approach
The novel predictor-corrector time-integration algorithms are based on both the implicit and the explicit version of the generalized-α method. In the non-linear unforced case, second-order accuracy, stability in energy, energy decay in the high-frequency range and asymptotic annihilation are distinctive properties of the generalized-α scheme; in the non-linear forced case, its distinctive property is the limited error near resonance, in terms of both frequency location and intensity of the resonant peak. The implicit generalized-α algorithm has been implemented in a predictor-one-corrector form, giving rise to the implicit IPC-ρ∞ method, which avoids both iterative corrections, expensive from an experimental standpoint, and load oscillations of numerical origin. Moreover, the scheme embodies a secant stiffness formula able to approximate closely the actual stiffness of a structure. An explicit algorithm, the EPC-ρb method, endowed with user-controlled dissipation properties, has also been implemented. The resulting schemes have been tested experimentally on both a two- and a six-degrees-of-freedom system, exploiting substructuring techniques.
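For reference, the parent generalized-α update (in the standard Chung-Hulbert form) on which both variants are built can be written as follows; the predictor/one-corrector splitting, the secant stiffness formula and the EPC-ρb parametrization are specific to the paper and are not reproduced here. Here M, r and f denote the mass matrix, restoring force and external load, and d, v, a the displacement, velocity and acceleration vectors.

\begin{aligned}
&\mathbf{M}\,\mathbf{a}_{n+1-\alpha_m} + \mathbf{r}\!\left(\mathbf{d}_{n+1-\alpha_f}\right) = \mathbf{f}_{n+1-\alpha_f},\\
&\mathbf{d}_{n+1} = \mathbf{d}_n + \Delta t\,\mathbf{v}_n + \Delta t^2\left[\left(\tfrac{1}{2}-\beta\right)\mathbf{a}_n + \beta\,\mathbf{a}_{n+1}\right],\\
&\mathbf{v}_{n+1} = \mathbf{v}_n + \Delta t\left[(1-\gamma)\,\mathbf{a}_n + \gamma\,\mathbf{a}_{n+1}\right],\\
&\alpha_m = \frac{2\rho_\infty - 1}{\rho_\infty + 1},\qquad
\alpha_f = \frac{\rho_\infty}{\rho_\infty + 1},\qquad
\gamma = \tfrac{1}{2} - \alpha_m + \alpha_f,\qquad
\beta = \tfrac{1}{4}\left(1 - \alpha_m + \alpha_f\right)^2,
\end{aligned}

with $\mathbf{x}_{n+1-\alpha} = (1-\alpha)\,\mathbf{x}_{n+1} + \alpha\,\mathbf{x}_n$; the spectral radius at infinity $\rho_\infty \in [0,1]$ is the single user parameter controlling high-frequency dissipation.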
Findings
The analytical findings and the tests have indicated that the proposed numerical strategies enhance the performance of the pseudo‐dynamic test (PDT) method even in an environment characterized by considerable experimental errors. Moreover, the schemes have been tested numerically on strongly non‐linear multiple‐degrees‐of‐freedom systems reproduced with the Bouc‐Wen hysteretic model, showing that the proposed algorithms reap the benefits of the parent generalized‐α methods.
Research limitations/implications
Further developments envisaged for this study are the application of the IPC-ρ∞ method and of the EPC-ρb scheme to partitioned procedures for high-speed pseudo-dynamic testing with substructuring.
Practical implications
The implicit IPC-ρ∞ and the explicit EPC-ρb methods provide user-defined dissipation, which reduces the effects of experimental error in the PDT without requiring onerous iterations.
Originality/value
The paper proposes novel time‐integration algorithms for pseudo‐dynamic testing. Thanks to a predictor‐corrector form of the generalized‐α method, the proposed schemes maintain a high computational efficiency and accuracy.
Abstract
“Landamatics” is a label given by American scholars to designate an algorithmic-heuristic theory and method of performance, learning and instruction. Landamatics analyses and explains the mental processes which underlie expert performance, learning and decision making. It also defines specific ways of purposeful and accelerated development of such processes in novices and non-experts through a special course of instruction.
Katherine Celia Greder, Jie Pei and Jooyoung Shin
Abstract
Purpose
The purpose of this study was to create a corset—understructure as well as fabric covering—using only computational, 3D approaches to fashion design. The process incorporated 3D body scan data, parametric methods for the 3D-printed design, and algorithmic methods for the automated, custom-fit fabric pattern.
Design/methodology/approach
The methods or protocol-based framework that nucleated this design project (see Figure 1) enabled more concentrated research into the iterative step-by-step procedure and the computational techniques used herein.
Findings
The 3D computational methods in this study demonstrated a new way of rendering the body-to-pattern relationship through the use of multiple software platforms. Using body scan data and computer coding, the computational construction methods in this study showed a pliant and sustainable method of clothing design where designers were able to manipulate the X, Y, and Z coordinates of the points on the scan surface.
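As an illustration of what manipulating the X, Y and Z coordinates of the points on a scan surface can look like in code, the sketch below offsets scan points along their normals to add garment ease. The data and the operation are hypothetical; the study's actual pipeline combines several software platforms and parametric tools.

import numpy as np

# Hypothetical stand-in for body-scan data: (x, y, z) vertices in millimetres
# plus per-vertex unit normals; a real scan would supply both.
rng = np.random.default_rng(0)
points = rng.random((1000, 3)) * 1000.0
normals = np.tile([0.0, 1.0, 0.0], (1000, 1))

def offset_surface(points, normals, ease_mm=8.0):
    """Push every scan point outward along its normal to add ease --
    one simple parametric edit of the X, Y, Z coordinates."""
    return points + ease_mm * normals

pattern_points = offset_surface(points, normals)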
Research limitations/implications
A study of algorithmic methods is inherently a study of limitation. The iterative process of design was defined and refined through the particularity of an algorithm, which required thoughtful manipulation to inform the outcome of this research.
Practical implications
This study sought to illustrate the use and limitations of algorithm-driven computer programming to advance creative design practices.
Social implications
As body scan data and biometric information become increasingly common components of computational fashion design practices, the need for more research on the use of these techniques is pressing. Moreover, computational techniques serve as a catalyst for discussions about the use of biometric information in design and data modeling.
Originality/value
The process of designing in 3D allowed for the dynamic capability to manipulate proportion and form using parametric design techniques.
Mark Messner, Armand Beaudoin and Robert Dodds
Abstract
Purpose
The purpose of this paper is to describe several novel techniques for implementing a crystal plasticity (CP) material model in a large deformation, implicit finite element framework.
Design/methodology/approach
Starting from the key kinematic assumptions of CP, the presentation develops the necessary CP correction terms to several common objective stress rates and the consistent linearization of the stress update algorithm. Connections to models for slip system hardening are isolated from these processes.
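A common way to check a consistent (algorithmic) tangent such as the one derived analytically in the paper is to compare it against a finite-difference perturbation of the stress update; the sketch below shows that generic verification idea for any material routine working on 6-component Voigt vectors. It is a debugging aid under stated assumptions, not the paper's linearization.

import numpy as np

def numerical_tangent(stress_update, strain, state, eps=1e-7):
    """Finite-difference approximation of d(stress)/d(strain) for a material
    update routine stress_update(strain, state) -> stress (Voigt 6-vectors)."""
    n = strain.size
    D = np.zeros((n, n))
    sigma0 = stress_update(strain, state)
    for j in range(n):
        dstrain = np.zeros(n)
        dstrain[j] = eps
        D[:, j] = (stress_update(strain + dstrain, state) - sigma0) / eps
    return D  # compare entry-wise against the coded algorithmic tangent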
Findings
A kinematically consistent implementation is found to require a correction to the stress update to include plastic vorticity developed by slip deformation in polycrystals. A simpler, more direct form for the algorithmic tangent is described. Several numerical examples demonstrate the capabilities and computational efficiency of the formulation.
Research limitations/implications
The implementation assumes isotropic slip system hardening. With simple modifications, the described approach extends readily to anisotropic coupled or uncoupled hardening functions.
Practical implications
The modular formulation and implementation support streamlined development of new models for slip system hardening without modifications of the stress update and algorithmic tangent computations. This implementation is available in the open-source code WARP3D.
Originality/value
In the process of developing the CP formulation, this work realized the need for corrections to the Green-Naghdi and Jaumann objective stress rates to account properly for non-zero plastic vorticity. The paper describes fully the consistent linearization of the stress update algorithm and details a new scheme to implement the model with improved efficiency.
Kumar K. Tamma, Xiangmin Zhou and Desong Sha
Abstract
The time-discretization of transient equation systems is an important concern in computational heat transfer applications. The present paper therefore describes a formal basis for the theoretical concepts, the evolution and development, and the characterization of a wide class of time-discretized operators for transient heat transfer computations. Emanating from a common family tree and explained via a generalized time-weighted philosophy, the paper addresses the development and evolution of time integral operators [IO], leading to integration operators in time [InO] that encompass single-step integration operators [SSInO], multi-step integration operators [MSInO] and a class of finite-element-in-time integration operators [FETInO], including their relationships and the resulting consequences. Also depicted are what are termed discrete numerically assigned [DNA] algorithmic markers, essentially comprising both the weighted time fields and the corresponding conditions imposed on the approximation of the dependent variable, which uniquely characterize a wide class of transient algorithms. The paper thereby provides a plausible standardized formal ideology for referring to and/or relating time-discretized operators applicable to transient heat transfer computations.
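As one familiar member of the single-step, time-weighted family surveyed here, the generalized trapezoidal (θ) scheme for the semi-discrete heat equation $\mathbf{M}\dot{\mathbf{T}} + \mathbf{K}\mathbf{T} = \mathbf{F}$ reads (standard textbook form, not notation taken from the paper):

\left(\mathbf{M} + \theta\,\Delta t\,\mathbf{K}\right)\mathbf{T}_{n+1}
= \left(\mathbf{M} - (1-\theta)\,\Delta t\,\mathbf{K}\right)\mathbf{T}_{n}
+ \Delta t\left[(1-\theta)\,\mathbf{F}_{n} + \theta\,\mathbf{F}_{n+1}\right],

with $\theta = 0$ (forward Euler), $\theta = \tfrac{1}{2}$ (Crank-Nicolson) and $\theta = 1$ (backward Euler) as special cases; in the paper's terminology, the weighting parameter $\theta$ and the conditions imposed on the approximation of $\mathbf{T}$ over the step are the kind of [DNA] algorithmic markers that characterize such a scheme.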