Search results

1 – 10 of over 3000
Open Access
Article
Publication date: 7 December 2023

Elena Vazquez

Abstract

Purpose

Algorithmic and computational thinking are necessary skills for designers in an increasingly digital world. Parametric design, a method of constructing designs based on algorithmic logic and rules, has become widely used in architecture practice and has been incorporated into the curricula of architecture schools. However, few studies propose strategies for teaching parametric design to architecture students that tackle software literacy while promoting the development of algorithmic thinking.

Design/methodology/approach

A descriptive study and a prescriptive study are conducted. The descriptive study reviews the literature on parametric design education. The prescriptive study is centered on proposing the incomplete recipe as instructional material and a new approach to teaching parametric design.

Findings

The literature on parametric design education has mostly focused on curricular discussions, descriptions of case studies or studio-long approaches; day-to-day instructional methods, however, are rarely discussed. A pedagogical strategy to teach parametric design is introduced: the incomplete recipe. The instructional method proposed provides students with incomplete recipes for parametric scripts that are increasingly pared down as the students become expert users.
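
To make the proposal concrete, here is a minimal sketch, in Python, of what an incomplete recipe might look like; the article works with parametric design tools rather than this invented facade example, so the function, parameters and [GAP] convention below are purely illustrative assumptions.

    import math

    # Hypothetical parametric "recipe": compute an opening ratio per facade
    # panel from a sun-angle parameter. Lines marked [GAP] are the ones an
    # incomplete recipe would hand to students blank, with more of the
    # scaffold removed as their expertise grows.
    def panel_opening_ratios(n_panels, sun_angle_deg):
        ratios = []
        for i in range(n_panels):
            t = i / (n_panels - 1)                               # normalize index to 0..1
            exposure = math.cos(math.radians(sun_angle_deg)) * t # [GAP] rule linking form to sun angle
            ratios.append(max(0.1, min(0.9, exposure)))          # [GAP] clamp to a fabricable range
        return ratios

    print(panel_opening_ratios(5, 30.0))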

Originality/value

The article contributes to the existing literature by proposing the incomplete recipe as a strategy for teaching parametric design. The recipe as a pedagogical tool provides a means for both software skill acquisition and the development of algorithmic thinking.

Article
Publication date: 1 February 1988

COLIN H. DAVIDSON, PHILIPPE L. DAVIDSON and KALEV RUBERG

Abstract

The building industry, through its structure and its mandate, faces endemic information problems; expert systems are expected to have a positive impact. Expert systems are suited to situations of uncertainty; knowledge and reasoning are kept separate, allowing easier updating. Knowledge acquisition from human experts is difficult and problems of information reliability arise, suggesting scope for cooperation between knowledge engineers and documentalists familiar with the domain. In building, prevailing conditions seem to indicate the appropriateness of expert systems, particularly during the design phase; however, written documentation and general research results are rarely consulted. This highlights the need for an information ‘refining’ stage between production and use. It is easier to set up expert systems for specialised sub‐domains; however, on‐going research is attempting to develop a comprehensive approach to project‐specific information that would be operational from initial design through to completed construction. Criteria for a comprehensive design information system can be listed.

Details

Journal of Documentation, vol. 44 no. 2
Type: Research Article
ISSN: 0022-0418

Article
Publication date: 6 March 2020

Mahdi Zeynali Tazehkandi and Mohsen Nowkarizi

Abstract

Purpose

The purpose was to evaluate the effectiveness of Google (as an international search engine) as well as of Parsijoo, Rismoon, and Yooz (as Persian search engines).

Design/methodology/approach

In this research, the Google search engine, as an international search engine, and three local ones, Parsijoo, Rismoon, and Yooz, were selected for evaluation. In addition, 32 subject headings were selected from the Persian Subject Headings List, and simulated work tasks were designed based on them. A total of 192 students from Ferdowsi University of Mashhad were asked to search for the information needed for the simulated work tasks in the selected search engines and then to copy the relevant website URLs into the search form.

Findings

The findings indicated that Google, Parsijoo, Rismoon, and Yooz differed significantly in precision, recall, and normalized discounted cumulative gain (NDCG). There was also a significant difference in the effectiveness (the average of precision, recall, and NDCG) of these four search engines in retrieving Persian resources.
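
A generic sketch of how these three measures are typically computed per query may help; the paper's exact cutoffs and relevance scale are not given here, so the numbers below are invented.

    import math

    def precision(retrieved, relevant):
        return len(set(retrieved) & set(relevant)) / len(retrieved)

    def recall(retrieved, relevant):
        return len(set(retrieved) & set(relevant)) / len(relevant)

    def ndcg(gains):
        """gains: graded relevance of the results in ranked order."""
        dcg = sum(g / math.log2(r + 2) for r, g in enumerate(gains))
        ideal = sum(g / math.log2(r + 2)
                    for r, g in enumerate(sorted(gains, reverse=True)))
        return dcg / ideal if ideal > 0 else 0.0

    # A query where two of four retrieved URLs are among three relevant ones:
    print(precision(["a", "b", "c", "d"], ["a", "c", "e"]))  # 0.5
    print(recall(["a", "b", "c", "d"], ["a", "c", "e"]))     # 0.667
    print(ndcg([3, 0, 2, 0]))                                # ~0.94, rank-sensitive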

Practical implications

Users of a more effective search engine retrieve more relevant documents, and Google was the most effective of the four at retrieving Persian resources; it is therefore the recommended choice for such searches.

Originality/value

In this research, for the first time, Google has been compared with local Persian search engines using a simulated work task approach.

Details

Library Hi Tech, vol. 39 no. 1
Type: Research Article
ISSN: 0737-8831

Article
Publication date: 1 October 2004

Jesper W. Schneider and Pia Borlund

Abstract

The paper introduces bibliometrics to the research area of knowledge organization – more precisely, in relation to the construction and maintenance of thesauri. As such, the paper reviews related work that inspired the assembly of a semi‐automatic, bibliometric‐based approach to thesaurus construction and maintenance, and it discusses the methodical considerations behind the approach. Finally, the semi‐automatic approach is used to verify the applicability of bibliometric methods as a supplement to thesaurus construction and maintenance. In the context of knowledge organization, the paper outlines two fundamental approaches: the manual intellectual approach and the automatic algorithmic approach. Bibliometric methods belong to the automatic algorithmic approach, though they have special characteristics that differ substantially from those of other methods within this approach.
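
One bibliometric building block such a semi-automatic approach can rest on is term co-occurrence counting, shown in the hedged Python sketch below; the documents, terms and threshold are invented for illustration, and the paper's own procedure is more elaborate.

    from collections import Counter
    from itertools import combinations

    # Each entry lists the index terms assigned to one document.
    documents = [
        {"information retrieval", "search engines", "evaluation"},
        {"information retrieval", "thesauri", "knowledge organization"},
        {"thesauri", "knowledge organization", "bibliometrics"},
        {"bibliometrics", "citation analysis"},
    ]

    cooccur = Counter()
    for terms in documents:
        for pair in combinations(sorted(terms), 2):
            cooccur[pair] += 1

    # Pairs co-occurring at least twice become candidate related-term (RT)
    # links for a human indexer to review -- the "semi" in semi-automatic.
    print([pair for pair, n in cooccur.items() if n >= 2])
    # [('knowledge organization', 'thesauri')]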

Details

Journal of Documentation, vol. 60 no. 5
Type: Research Article
ISSN: 0022-0418

Article
Publication date: 19 December 2023

Susan Gardner Archambault

Abstract

Purpose

Research shows that postsecondary students are largely unaware of the impact of algorithms on their everyday lives. Also, most noncomputer science students are not being taught about algorithms as part of the regular curriculum. This exploratory, qualitative study aims to explore subject-matter experts’ insights and perceptions of the knowledge components, coping behaviors and pedagogical considerations to aid faculty in teaching algorithmic literacy to postsecondary students.

Design/methodology/approach

Eleven semistructured interviews and one focus group were conducted with scholars and teachers of critical algorithm studies and related fields. A content analysis was performed manually on the transcripts using a mixture of deductive and inductive coding. Data analysis was aided by the coding software Dedoose (2021), which was used to determine frequency totals for occurrences of each code across all participants, along with how many times specific participants mentioned a code. The findings were then organized around the three themes of knowledge components, coping behaviors and pedagogy.

Findings

The findings suggested a set of 10 knowledge components that would contribute to students’ algorithmic literacy along with seven behaviors that students could use to help them better cope with algorithmic systems. A set of five teaching strategies also surfaced to help improve students’ algorithmic literacy.

Originality/value

This study contributes to improved pedagogy surrounding algorithmic literacy and validates existing multi-faceted conceptualizations and measurements of algorithmic literacy.

Details

Information and Learning Sciences, vol. 125 no. 1/2
Type: Research Article
ISSN: 2398-5348

Open Access
Article
Publication date: 6 July 2020

Basma Makhlouf Shabou, Julien Tièche, Julien Knafou and Arnaud Gaudinat

Abstract

Purpose

This paper aims to describe an interdisciplinary and innovative research project conducted in Switzerland, at the Geneva School of Business Administration HES-SO and supported by the State Archives of Neuchâtel (Office des archives de l'État de Neuchâtel, OAEN). The problem addressed is one of the most classical ones: how to extract and discriminate relevant data from a huge amount of records of diversified and complex formats and contents. The goal of this study is to provide a framework and a proof of concept for software that helps archivists take defensible decisions on the retention and disposal of records and data proposed to the OAEN. For this purpose, the authors designed two axes: the archival axis, which proposes archival metrics for the appraisal of structured and unstructured data, and the data mining axis, which proposes algorithmic methods as complementary and/or additional metrics for the appraisal process.

Design/methodology/approach

Based on these two axes, this exploratory study designs and tests the feasibility of archival metrics that are paired with data mining metrics, to advance the digital appraisal process in a systematic or even automatic way, as far as possible. Under Axis 1, the authors took three steps: first, they designed a conceptual framework for records and data appraisal with a detailed three-dimensional approach (trustworthiness, exploitability, representativeness) and defined the main principles and postulates guiding the operationalization of the conceptual dimensions. Second, the operationalization proposed metrics expressed as variables, supported by a quantitative method for their measurement and scoring. Third, the authors shared the conceptual framework, with its dimensions and operationalized variables (metrics), with experienced professionals to validate it. The experts' feedback gave the authors an indication of the relevance and feasibility of these metrics, two aspects that bear on the acceptability of such a method in real-life archival practice. In parallel, Axis 2 proposes functionalities covering not only macro analysis of the data but also the algorithmic methods that enable the computation of the digital archival and data mining metrics. On this basis, three use cases were proposed as plausible and illustrative scenarios for the application of such a solution.

Findings

The main results demonstrate the feasibility of measuring the value of data and records with a reproducible method. More specifically, for Axis 1, the authors applied the metrics in a flexible and modular way and defined the main principles needed to enable a computational scoring method. The expert consultation on the relevance of the 42 metrics indicated an acceptance rate above 80%, and the results show that 60% of all metrics can be automated. Regarding Axis 2, 33 functionalities were developed and proposed under six main types: macro analysis, microanalysis, statistics, retrieval, administration and, finally, decision modeling and machine learning. The relevance of the metrics and functionalities rests on the theoretical validity and computational character of their method. These results are largely satisfactory and promising.
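
As a rough illustration of how operationalized metrics can feed a computational scoring method, the Python sketch below rolls invented metric scores up into one dimension score; the metric names, scales and weights are stand-ins, not the study's actual 42 metrics.

    def dimension_score(metrics, weights):
        """Weighted mean of metric scores, each normalized to 0..1."""
        total = sum(weights.values())
        return sum(metrics[name] * w for name, w in weights.items()) / total

    # Hypothetical scores for the trustworthiness dimension of one record set.
    trustworthiness = dimension_score(
        metrics={"provenance_documented": 1.0, "format_integrity": 0.8,
                 "chain_of_custody": 0.5},
        weights={"provenance_documented": 2.0, "format_integrity": 1.0,
                 "chain_of_custody": 1.0},
    )
    print(round(trustworthiness, 2))  # 0.82, one input to a retain/dispose decision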

Originality/value

This study offers a valuable aid to improve the validity and performance of archival appraisal processes and decision-making. Transferability and applicability of these archival and data mining metrics could be considered for other types of data. An adaptation of this method and its metrics could be tested on research data, medical data or banking data.

Details

Records Management Journal, vol. 30 no. 2
Type: Research Article
ISSN: 0956-5698

Article
Publication date: 1 December 1996

G. Etse and K. Willam

Abstract

Presents a computational algorithm for the numerical integration of triaxial concrete plasticity formulations. The specific material formulation at hand is the so‐called extended Leon model for concrete. It is based on the flow theory of plasticity, which entails isotropic hardening as well as fracture energy‐based softening in addition to non‐associated plastic flow. The numerical algorithm resorts to implicit integration according to the backward Euler strategy, which enforces plastic consistency according to the closest‐point‐projection method (generalized radial‐return strategy). Numerical simulations illustrate the overall performance of the proposed algorithm and the significant increase of the convergence rate when the algorithmic tangent is used in place of the continuum operator.
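
The extended Leon model itself is too involved for a short example, but the ingredients named above (elastic predictor, backward-Euler closest-point projection, algorithmic tangent) can be sketched on the simplest possible case, one-dimensional plasticity with linear isotropic hardening; the moduli and yield stress below are generic placeholders, not the paper's concrete parameters.

    import math

    def return_map_1d(eps_new, eps_p, alpha, E=30e9, H=2e9, sy=30e6):
        """One backward-Euler step; returns stress, updated state and tangent."""
        sigma_tr = E * (eps_new - eps_p)            # elastic predictor
        f_tr = abs(sigma_tr) - (sy + H * alpha)     # trial yield function
        if f_tr <= 0.0:
            return sigma_tr, eps_p, alpha, E        # elastic step: tangent = E
        dgamma = f_tr / (E + H)                     # enforce plastic consistency
        sign = math.copysign(1.0, sigma_tr)
        sigma = sigma_tr - E * dgamma * sign        # closest-point projection
        # Returning the algorithmic tangent E*H/(E+H) instead of the
        # continuum operator is what restores the fast convergence rate.
        return sigma, eps_p + dgamma * sign, alpha + dgamma, E * H / (E + H)

    print(return_map_1d(2e-3, 0.0, 0.0))  # strain well past yield (sy/E = 1e-3)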

Details

Engineering Computations, vol. 13 no. 8
Type: Research Article
ISSN: 0264-4401

Article
Publication date: 1 October 2005

Alessio Bonelli and Oreste S. Bursi

Abstract

Purpose

To propose novel predictor‐corrector time‐integration algorithms for pseudo‐dynamic testing.

Design/methodology/approach

The novel predictor‐corrector time‐integration algorithms are based on both the implicit and the explicit version of the generalized‐α method. In the non‐linear unforced case, second‐order accuracy, stability in energy, energy decay in the high‐frequency range and asymptotic annihilation are distinctive properties of the generalized‐α scheme; in the non‐linear forced case, its distinctive property is the limited error near resonance, in terms of both the frequency location and the intensity of the resonant peak. The implicit generalized‐α algorithm has been implemented in a predictor/one‐corrector form, giving rise to the implicit IPC‐ρ method, which avoids iterative corrections (expensive from an experimental standpoint) and load oscillations of numerical origin. Moreover, the scheme embodies a secant stiffness formula able to approximate closely the actual stiffness of a structure. An explicit algorithm has also been implemented, the EPC‐ρb method, endowed with user‐controlled dissipation properties. The resulting schemes have been tested experimentally both on a two‐ and on a six‐degrees‐of‐freedom system, exploiting substructuring techniques.
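
For orientation, the sketch below implements the underlying generalized-α step for a linear single-degree-of-freedom system, using the standard spectral-radius (ρ∞) parameterization, with the predictor/corrector split made explicit; it is a generic illustration of the parent scheme, not the authors' IPC-ρ or EPC-ρb implementation.

    def generalized_alpha(m, c, k, d, v, a, h, rho_inf=0.8, force=0.0):
        """Advance (d, v, a) one step of size h for m*a + c*v + k*d = force."""
        am = (2.0 * rho_inf - 1.0) / (rho_inf + 1.0)   # alpha_m
        af = rho_inf / (rho_inf + 1.0)                 # alpha_f
        gamma = 0.5 - am + af
        beta = 0.25 * (1.0 - am + af) ** 2
        # Predictor: state extrapolated with the old acceleration only.
        d_p = d + h * v + h * h * (0.5 - beta) * a
        v_p = v + h * (1.0 - gamma) * a
        # Corrector: solve the alpha-weighted balance for the new acceleration.
        lhs = (1.0 - am) * m + (1.0 - af) * (gamma * h * c + beta * h * h * k)
        rhs = (force - am * m * a
               - c * ((1.0 - af) * v_p + af * v)
               - k * ((1.0 - af) * d_p + af * d))
        a_new = rhs / lhs
        return d_p + beta * h * h * a_new, v_p + gamma * h * a_new, a_new

    # Free vibration of an undamped unit oscillator, started consistently.
    d, v, a = 1.0, 0.0, -1.0          # k/m = 1, so a(0) = -d(0)
    for _ in range(10):
        d, v, a = generalized_alpha(1.0, 0.0, 1.0, d, v, a, h=0.1)
    print(d, v)                       # close to cos(1), -sin(1) at t = 1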

Findings

The analytical findings and the tests have indicated that the proposed numerical strategies enhance the performance of the pseudo‐dynamic test (PDT) method even in an environment characterized by considerable experimental errors. Moreover, the schemes have been tested numerically on strongly non‐linear multiple‐degrees‐of‐freedom systems reproduced with the Bouc‐Wen hysteretic model, showing that the proposed algorithms reap the benefits of the parent generalized‐α methods.

Research limitations/implications

Further developments envisaged for this study are the application of the IPC‐ρ method and of EPC‐ρb scheme to partitioned procedures for high‐speed pseudo‐dynamic testing with substructuring.

Practical implications

The implicit IPC‐ρ and the explicit EPC‐ρb methods give the user control over numerical dissipation, which reduces the effects of experimental error in the PDT without the need for onerous iterations.

Originality/value

The paper proposes novel time‐integration algorithms for pseudo‐dynamic testing. Thanks to a predictor‐corrector form of the generalized‐α method, the proposed schemes maintain a high computational efficiency and accuracy.

Details

Engineering Computations, vol. 22 no. 7
Type: Research Article
ISSN: 0264-4401

Open Access
Article
Publication date: 1 February 2023

Tareq Babaqi and Béla Vizvári

Abstract

Purpose

The total capacity of ambulances in metropolitan cities is often less than the post-disaster demand, especially in the case of disasters such as earthquakes. However, because earthquakes are a rare occurrence in these cities, it is unreasonable to maintain the ambulance capacity at a higher level than usual. Therefore, the effective use of ambulances is critical in saving human lives during such disasters. Thus, this paper aims to provide a method for determining how to transport the maximum number of disaster victims to hospitals on time.

Design/methodology/approach

The transportation-related disaster management problem is complex and dynamic. A practical solution needs decomposition and a fast algorithm for determining the next mission of a vehicle. The suggested method is a synthesis of mathematical modeling, scheduling theory, heuristic methods and the Voronoi diagram of geometry, and this study presents new elements for its treatment, including new mathematical theorems and algorithms. In the proposed method, each hospital is responsible for a region determined by the Voronoi diagram; the region may change if a hospital becomes full. The ambulance vehicles work for hospitals. For every patient, there is an estimated deadline by which the person must reach the hospital to survive. The second part of the concept is the scheduling of the vehicles. The objective is to transport the maximum number of patients on time; in terms of scheduling theory, this is a problem whose objective function is to minimize the sum of the unit penalties.

Findings

The Voronoi diagram can be effectively used to decompose the complex problem. The mathematical model of transportation to one hospital is the P‖ΣUj problem of scheduling theory. This study provides a new mathematical theorem describing the structure of an algorithm that produces the optimal solution, and it introduces the notion of the partial oracle. This algorithmic tool helps in elaborating heuristic methods, which provide approximations to the precise method; its realization combines constructive elements with elements that prove the nonexistence of a solution. This paper contains case studies of three hospitals in Tehran. The results are close to the best possible results that can be achieved. However, obtaining the optimal solution requires a long CPU time, even in the nondynamic case, because the problem P‖ΣUj is NP-complete.
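
The paper's method (with its partial oracle) is richer than any short sketch, but its two named building blocks can be shown in their simplest form: nearest-hospital assignment, which is exactly a Voronoi decomposition, and, per hospital with one vehicle, the classical Moore-Hodgson rule, which solves the single-machine counterpart 1‖ΣUj of the stated objective exactly. The coordinates, transport times and deadlines below are invented.

    import heapq
    import math

    hospitals = {"H1": (0.0, 0.0), "H2": (10.0, 0.0)}
    patients = [  # (x, y, round-trip transport time, survival deadline)
        (1.0, 1.0, 3.0, 4.0), (2.0, 0.5, 2.0, 4.5),
        (8.0, 1.0, 2.5, 3.0), (9.0, 2.0, 4.0, 12.0),
    ]

    def nearest_hospital(x, y):           # Voronoi assignment
        return min(hospitals, key=lambda h: math.dist((x, y), hospitals[h]))

    def max_on_time(jobs):
        """Moore-Hodgson for one vehicle: jobs = [(transport_time, deadline)]."""
        kept, heap, t = 0, [], 0.0
        for p, d in sorted(jobs, key=lambda j: j[1]):  # earliest deadline first
            heapq.heappush(heap, -p)                   # max-heap on transport time
            t += p
            kept += 1
            if t > d:                                  # someone would be late:
                t += heapq.heappop(heap)               # drop the longest job
                kept -= 1
        return kept

    regions = {h: [] for h in hospitals}
    for x, y, p, d in patients:
        regions[nearest_hospital(x, y)].append((p, d))
    for h, jobs in regions.items():
        print(h, max_on_time(jobs), "of", len(jobs), "on time")
    # H1: 1 of 2 on time; H2: 2 of 2 on time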

Research limitations/implications

Because of the complexity of the problem, this research offers good approximations rather than exact solutions. Researchers are encouraged to test the proposed propositions further. In addition, the problem in a dynamic environment needs more attention.

Practical implications

If a large-scale earthquake can be expected in a city, the city authorities should have a central control system of ambulances. This study presents a simple and efficient method for the post-disaster transport problem and decision-making. The security of the city can be improved by purchasing ambulances and using the proposed method to boost the effectiveness of post-disaster relief.

Social implications

The population will be safer and more secure if the recommended measures are realized. The measures are important for any city situated in a region where the outbreak of a major earthquake is possible at any moment.

Originality/value

This paper fulfills an identified need to study the operations related to the transport of seriously injured people using emergency vehicles in the post-disaster period in an efficient way.

Details

Journal of Humanitarian Logistics and Supply Chain Management, vol. 13 no. 1
Type: Research Article
ISSN: 2042-6747

Article
Publication date: 1 April 1987

Lev N. Landa

Abstract

“Landamatics” is a label given by American scholars to designate the author's algorithmic‐heuristic theory and method of performance, learning and instruction. Landamatics analyses and explains the mental processes which underlie expert performance, learning and decision making. It also defines specific ways of purposeful and accelerated development of such processes in novices and non‐experts through a special course of instruction.

Details

Journal of Management Development, vol. 6 no. 4
Type: Research Article
ISSN: 0262-1711
