Niklas Rönnberg, Rasmus Ringdahl and Anna Fredriksson
Abstract
Purpose
Most stakeholders experience the noise and dust particles caused by construction transport as disturbing. The purpose of this study is to explore how sonification can support visualization in construction planning to decrease construction transport disturbances.
Design/methodology/approach
This paper presents an interdisciplinary research project, combining research on construction logistics, internet of things and sonification. First, a data recording device, including sound, particle, temperature and humidity sensors, was implemented and deployed in a development project. Second, the collected data were used in a sonification design, which was, third, evaluated with potential users.
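To make the parameter-mapping idea concrete, the sketch below shows one way sensor readings of this kind might be turned into sound parameters. It is a minimal illustration in Python: the field names, value ranges and the noise-to-pitch/particles-to-loudness mapping are assumptions for this example, not the authors' actual sonification design.

```python
# Illustrative parameter-mapping sonification: scale each sensor reading
# into an audible range. Field names, value ranges and the mapping itself
# are assumptions for illustration, not the authors' design.

def linmap(value, lo, hi, out_lo, out_hi):
    """Linearly rescale value from [lo, hi] to [out_lo, out_hi], clamped."""
    t = max(0.0, min(1.0, (value - lo) / (hi - lo)))
    return out_lo + t * (out_hi - out_lo)

def sonify(reading):
    """Turn one sensor reading into synthesis parameters."""
    return {
        # Louder ambient noise -> higher pitch (200-1000 Hz, assumed range)
        "pitch_hz": linmap(reading["noise_db"], 40, 100, 200, 1000),
        # More particles -> louder tone (hypothetical PM10 range in ug/m3)
        "amplitude": linmap(reading["pm10"], 0, 150, 0.1, 1.0),
    }

if __name__ == "__main__":
    sample = {"noise_db": 78, "pm10": 60}   # one fabricated reading
    print(sonify(sample))
```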
Findings
The results showed that the low-cost sensors used could capture “good enough” data, and that sonification is an interesting and potentially useful way of representing these data in urban and construction transport planning.
Research limitations/implications
There is a need to further develop the sonification design and to communicate the aim of the sounds more clearly to potential users. Further testing is also needed.
Practical implications
This study introduces new ideas on how sonification can support visualization when planning construction work and its impact on the vicinity of the site. Currently, urban planning and construction planning focus on visualizing the final result, with little attention to how disturbances during the construction process are handled.
Originality/value
Showing the potential of using low-cost sensor data in sonification, and of using sonification together with visualization, is the result of combining research areas in a novel interdisciplinary way.
Abstract
Purpose
The purpose of this paper is to construct a digital collection and database of traditional clothing that facilitates its digital dissemination and application, and to provide resources for research on clothing fashion, traditional clothing techniques, clothing culture and history, and clothing teaching.
Design/methodology/approach
A real-object analysis method was used in this paper. Based on the 15 core elements of the internationally common Dublin Core (DC) metadata standard, and considering the characteristics of clothing products and clothing industry application specifications, the core elements of DC were expanded to record in detail the characteristic information of clothing, especially its implicit culture. A code symbol compilation method was developed to give each piece of clothing a unique number, facilitating identification, classification and recording. Finally, a metadata construction scheme for traditional clothing was developed. A traditional embroidered children's hat and a Mamianqun (horse-face skirt) serve as examples to demonstrate the metadata elements.
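For illustration, a minimal sketch of what such a record and code compilation might look like is given below. The expanded element names (craft_technique, cultural_meaning) and the code format are hypothetical stand-ins; the paper's actual element set and numbering rules are not reproduced here.

```python
# Sketch of a Dublin Core-based clothing record with a compiled unique
# code. The elements beyond the DC core and the code format
# (category-ethnicity-sequence) are hypothetical, not the paper's scheme.
from dataclasses import dataclass, field
from itertools import count

_seq = count(1)  # running sequence number per session

def compile_code(category: str, ethnicity: str) -> str:
    """Build a unique identifier from coded attributes plus a sequence."""
    return f"{category[:2].upper()}-{ethnicity[:3].upper()}-{next(_seq):04d}"

@dataclass
class ClothingRecord:
    title: str                      # DC core element
    creator: str = "unknown"        # DC core element
    date: str = "undated"           # DC core element
    craft_technique: str = ""       # expanded element (assumed name)
    cultural_meaning: str = ""      # expanded element (assumed name)
    identifier: str = field(default="")

hat = ClothingRecord(title="Embroidered children's hat",
                     craft_technique="silk embroidery")
hat.identifier = compile_code(category="hat", ethnicity="Han")
print(hat.identifier)   # e.g. HA-HAN-0001
```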
Findings
The clothing meta-database takes traditional clothing as its main body while also collecting cultural elements. It is composed of five layers (classified data, source data, characteristic data, connotation data and management data) and 28 data elements, making the data easy to share and interoperate.
Originality/value
This paper expands the subset of fashion metadata by describing traditional clothing metadata, especially the excavation of clothing cultural elements, and by developing code compilation methods that give each clothing product a unique identification number. The result is a traditional clothing metadata construction scheme consisting of five data layers and 28 data elements. This scheme records the information at each layer of traditional clothing in detail and provides shared data for discipline research and industry applications.
Michael Nii Laryeafio and Omoruyi Courage Ogbewe
Abstract
Purpose
Qualitative research that involves human participants calls for protecting those participants so that they can give their honest views during data collection. This is an important part of every primary data collection in qualitative studies using interviews. This paper aims to investigate the ethical considerations that need to be observed by the researcher when conducting primary data collection through interviews and to explore the theories that underpin ethics in qualitative studies.
Design/methodology/approach
This paper systematically reviewed existing qualitative data on ethics and gathered information that was analysed and presented on the topic area.
Findings
The findings show that ethical considerations concern the various approaches adopted by the researcher to make participants feel safe taking part in any given research. During an interview process in qualitative research, anonymity, voluntary participation, privacy, confidentiality, the option to opt out and avoiding misuse of findings are the ethical considerations that must be observed by the researcher. The investigation also shows that deontology, utilitarianism, rights and virtue are the main theories underpinning ethical considerations in research.
Originality/value
The rights of research participants need to be respected in qualitative research to assist in gathering accurate information and achieving the objectives of the study. This and other ethical principles, such as anonymity, privacy, confidentiality, voluntary participation and the option to opt out, guide the researcher to systematically adhere to data collection approaches that yield valid results in qualitative data collection using interviews.
Abstract
Purpose
This study aims to determine how the applications of blockchain technology (BT) can play a crucial role in managing financial flows in the humanitarian supply chain (HSC) and what benefits and challenges are associated with BT in a humanitarian setting.
Design/methodology/approach
The present study used a qualitative research approach, incorporating a systematic literature review and semi-structured interviews with 12 experts in the fields of humanitarian operations, supply chain management, fintech and information technology.
Findings
The findings show that the humanitarian sector has the potential to reap significant benefits from BT, including secure data exchange, efficient supply chain management, streamlined donor financing, cost-effective financial transactions, smooth digital cash flow management and the facilitation of cash programs and crowdfunding. Despite the promising prospects, this study also illuminated various challenges associated with the application of BT in the HSC. Key challenges identified include scalability issues, high cost and resource requirements, lack of network reliability, data privacy, supply chain integration, knowledge and training gaps, regulatory frameworks and ethical considerations. Moreover, the study highlighted the importance of implementing mitigation strategies to address these challenges effectively.
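As background to the "secure data exchange" benefit, the following textbook sketch shows why a hash-chained ledger, the core data structure behind BT, is tamper-evident: each entry commits to the hash of the previous one, so altering any past transaction breaks verification downstream. It is a generic illustration with fabricated transactions, not the system studied in the paper.

```python
# Generic hash-chained ledger: altering any past entry invalidates every
# later link. A textbook sketch, not the paper's system.
import hashlib, json

def entry_hash(entry: dict) -> str:
    return hashlib.sha256(json.dumps(entry, sort_keys=True).encode()).hexdigest()

def append(ledger: list, payload: dict) -> None:
    prev = ledger[-1]["hash"] if ledger else "0" * 64
    entry = {"payload": payload, "prev": prev}
    entry["hash"] = entry_hash({"payload": payload, "prev": prev})
    ledger.append(entry)

def verify(ledger: list) -> bool:
    prev = "0" * 64
    for e in ledger:
        if e["prev"] != prev or e["hash"] != entry_hash(
                {"payload": e["payload"], "prev": e["prev"]}):
            return False
        prev = e["hash"]
    return True

ledger = []
append(ledger, {"donor": "A", "amount": 1000})   # fabricated transactions
append(ledger, {"donor": "B", "amount": 250})
assert verify(ledger)
ledger[0]["payload"]["amount"] = 9999             # tamper with history
assert not verify(ledger)                         # detected downstream
```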
Research limitations/implications
The present study is confined to exploring the benefits, challenges and corresponding mitigation strategies. The research uses a semi-structured interview method as the primary research approach.
Originality/value
This study adds to the existing body of knowledge concerning BT and HSC by explaining the pivotal role of BT in improving the financial flow within HSC. Moreover, it addresses a notable research gap, as there is a scarcity of studies that holistically cover the expert perspectives on benefits, challenges and strategies related to blockchain applications for effective financial flows within humanitarian settings. Consequently, this study seeks to bridge this knowledge gap and provide valuable insights into this critical area.
Francesco Leoni, Martina Carraro, Erin McAuliffe and Stefano Maffei
Abstract
Purpose
The purpose of this paper is three-fold. Firstly, through selected case studies, to provide an overview of how non-traditional data from digital public services were used as a source of knowledge for policymaking. Secondly, to argue for a design for policy approach to support the successful integration of non-traditional data into policymaking practice, thus supporting data-driven innovation for policymaking. Thirdly, to encourage a vision of the relation between data-driven innovation and public policy that considers policymaking outside the authoritative instrumental logic perspective.
Design/methodology/approach
A qualitative small-N case study analysis based on desk research data was developed to provide an overview of how data-centric public services could become a source of knowledge for policymaking. The analysis was based on an original theoretical-conceptual framework that merges the policy cycle model and the policy capacity framework.
Findings
This paper identifies three potential areas of contribution of a design for policy approach in a scenario of data-driven innovation for policymaking practice: the development of sensemaking and prefiguring activities to shape a shared rationale behind intra-/inter-organisational data sharing and data collaboratives; the realisation of collaborative experimentations for enhancing the systemic policy analytical capacity of a governing body, e.g. by integrating non-traditional data into new and trusted indicators for policy evaluation; and service design as an approach for data-centric public services that connects policy decisions to the socio-technical context in which data are collected.
Research limitations/implications
The small-N sample (four cases) selected is not representative of a broader population but isolates exemplary initiatives. Moreover, the analysis was based on secondary sources, limiting the assessment quality of the real use of non-traditional data for policymaking. This level of empirical understanding is considered sufficient for an explorative analysis that supports the original perspective proposed here. Future research will need to collect primary data about the potential and dynamics of how data from data-centric public services can inform policymaking and substantiate the proposed areas of a design for policy contribution with practical experimentations and cases.
Originality/value
This paper proposes a convergence, as yet largely underexplored, between two emerging perspectives on innovation in policymaking: data for policy and design for policy. This convergence helps address the design of data-driven innovations for policymaking while offering practitioners pragmatic indications of socially acceptable practices in this space.
Geming Zhang, Lin Yang and Wenxiang Jiang
Abstract
Purpose
The purpose of this study is to introduce the top-level design ideas and the overall architecture of the earthquake early-warning system for high-speed railways in China, which is based on P-wave earthquake early warning and multiple ways of rapid treatment.
Design/methodology/approach
The paper describes the key technologies that are involved in the development of the system, such as P-wave identification and earthquake early-warning, multi-source seismic information fusion and earthquake emergency treatment technologies. The paper also presents the test results of the system, which show that it has complete functions and its major performance indicators meet the design requirements.
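For readers unfamiliar with P-wave triggering, a classic generic detector is the short-term average/long-term average (STA/LTA) ratio, sketched below on a synthetic trace. This is a textbook stand-in for illustration; the paper does not disclose whether the system uses this particular algorithm, and the window lengths and threshold here are arbitrary.

```python
# Classic STA/LTA trigger: flag the onset of a seismic arrival when
# short-window signal energy jumps relative to the long-window average.
# Generic illustration only; not necessarily the system's algorithm.
import random

def sta_lta_trigger(trace, sta_n=50, lta_n=500, threshold=4.0):
    """Return the first sample index where STA/LTA exceeds threshold."""
    energy = [x * x for x in trace]
    for i in range(lta_n, len(trace)):
        sta = sum(energy[i - sta_n:i]) / sta_n      # short-window energy
        lta = sum(energy[i - lta_n:i]) / lta_n      # long-window energy
        if lta > 0 and sta / lta > threshold:
            return i                                # probable P-wave onset
    return None

# Synthetic trace: low-level noise, then a sudden arrival at sample 800.
random.seed(0)
trace = [random.gauss(0, 1) for _ in range(800)]
trace += [random.gauss(0, 12) for _ in range(400)]
print(sta_lta_trigger(trace))   # index shortly after 800
```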
Findings
The study demonstrates that the earthquake early-warning system serves as an important technical tool for high-speed railways to cope with the threat that earthquakes pose to operational safety. The key technical indicators of the system perform well: the first report of the P-wave arrives in less than three seconds; from the first arrival of the P-wave to the beginning of train braking, the total delay of onboard emergency treatment is 3.63 seconds at 95% probability; and the average total delay for power failures triggered by substations is 3.3 seconds.
Originality/value
The paper provides a valuable reference for the research and development of earthquake early-warning systems for high-speed railways in other countries and regions. It also contributes to earthquake prevention and disaster reduction efforts.
Manuel Pedro Rodríguez Bolívar and Laura Alcaide Muñoz
Abstract
Purpose
This study aims to conduct performance and clustering analyses, with the help of the Digital Government Reference Library (DGRL) v16.6 database, examining the role of emerging technologies (ETs) in public services delivery.
Design/methodology/approach
VOSviewer and SciMAT were used for clustering and mapping the use of ETs in public services delivery. Collecting documents from the DGRL v16.6 database, the paper uses text mining analysis to identify key terms and trends in e-Government research regarding ETs and public services.
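The term maps such tools produce ultimately rest on co-occurrence counts: terms that appear together in many documents become strongly linked nodes. The toy sketch below shows that underlying computation on fabricated keyword sets; it is not the authors' pipeline or the VOSviewer implementation.

```python
# Minimal term co-occurrence counting of the kind underlying
# VOSviewer-style term maps. The keyword sets are fabricated.
from itertools import combinations
from collections import Counter

docs = [
    {"blockchain", "public services", "e-government"},
    {"ai", "public services", "smart city"},
    {"ai", "e-government", "smart city"},
]

cooc = Counter()
for terms in docs:
    for a, b in combinations(sorted(terms), 2):
        cooc[(a, b)] += 1   # one co-occurrence per document pair

for (a, b), n in cooc.most_common(3):
    print(f"{a} -- {b}: {n}")
```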
Findings
The analysis indicates that all ETs are strongly linked to each other, except for blockchain technologies (due to their disruptive nature), which suggests that ETs can be seen as cumulative knowledge. In addition, on the whole, the findings identify four stages in the evolution of ETs and their application to public services: the “electronic administration” stage, the “technological baseline” stage, the “managerial” stage and the “disruptive technological” stage.
Practical implications
The output of the present research will help to orient policymakers in the implementation and use of ETs, evaluating the influence of these technologies on public services.
Social implications
The research helps researchers track research trends and uncover new paths on ETs and their implementation in public services.
Originality/value
Recent research has focused on the need to implement ETs to improve public services, which could help cities improve citizens’ quality of life in urban areas. This paper contributes to expanding knowledge about ETs and their implementation in public services, identifying trends and networks in research on these issues.
Sara Lafia, David A. Bleckley and J. Trent Alexander
Abstract
Purpose
Many libraries and archives maintain collections of research documents, such as administrative records, with paper-based formats that limit the documents' access to in-person use. Digitization transforms paper-based collections into more accessible and analyzable formats. As collections are digitized, there is an opportunity to incorporate deep learning techniques, such as Document Image Analysis (DIA), into workflows to increase the usability of information extracted from archival documents. This paper describes the authors' approach using digital scanning, optical character recognition (OCR) and deep learning to create a digital archive of administrative records related to the mortgage guarantee program of the Servicemen's Readjustment Act of 1944, also known as the G.I. Bill.
Design/methodology/approach
The authors used a collection of 25,744 semi-structured paper-based records from the administration of G.I. Bill Mortgages from 1946 to 1954 to develop a digitization and processing workflow. These records include the name and city of the mortgagor, the amount of the mortgage, the location of the Reconstruction Finance Corporation agent, one or more identification numbers and the name and location of the bank handling the loan. The authors extracted structured information from these scanned historical records in order to create a tabular data file and link them to other authoritative individual-level data sources.
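As a rough illustration of the regular-expression approach among those compared, the sketch below pulls fields out of a single OCR'd line. The record layout and the patterns are invented for this example; the actual G.I. Bill forms and the expressions the authors used differ.

```python
# Sketch of regular-expression field extraction from OCR'd record text.
# The record layout and patterns are invented for illustration.
import re

LINE = "Mortgagor: John Smith  City: Toledo, OH  Amount: $6,500  Loan No: 12-3456"

PATTERNS = {
    "mortgagor": r"Mortgagor:\s*([A-Za-z .'-]+?)\s{2,}",
    "city":      r"City:\s*([A-Za-z .,'-]+?)\s{2,}",
    "amount":    r"Amount:\s*\$([\d,]+)",
    "loan_no":   r"Loan No:\s*([\d-]+)",
}

record = {name: (m.group(1) if (m := re.search(pat, LINE)) else None)
          for name, pat in PATTERNS.items()}
print(record)
```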
Findings
The authors compared the flexible character accuracy of five OCR methods. The authors then compared the character error rate (CER) of three text extraction approaches (regular expressions, DIA and named entity recognition (NER)). The authors were able to obtain the highest quality structured text output using DIA with the Layout Parser toolkit by post-processing with regular expressions. Through this project, the authors demonstrate how DIA can improve the digitization of administrative records to automatically produce a structured data resource for researchers and the public.
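CER itself has a standard definition: the Levenshtein edit distance between the extracted text and a reference transcription, divided by the reference length. The following minimal implementation illustrates the metric; it is not the authors' evaluation code.

```python
# Character error rate (CER): edit distance / reference length.
def levenshtein(a: str, b: str) -> int:
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,                 # deletion
                           cur[j - 1] + 1,              # insertion
                           prev[j - 1] + (ca != cb)))   # substitution
        prev = cur
    return prev[-1]

def cer(reference: str, hypothesis: str) -> float:
    return levenshtein(reference, hypothesis) / max(1, len(reference))

print(cer("Loan No: 12-3456", "Loan No: 12-34S6"))  # one substitution
```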
Originality/value
The authors' workflow is readily transferable to other archival digitization projects. Through the use of digital scanning, OCR and DIA processes, the authors created the first digital microdata file of administrative records related to the G.I. Bill mortgage guarantee program available to researchers and the general public. These records offer research insights into the lives of veterans who benefited from loans, the impacts on the communities built by the loans and the institutions that implemented them.
Armando Calabrese, Antonio D'Uffizi, Nathan Levialdi Ghiron, Luca Berloco, Elaheh Pourabbas and Nathan Proudlove
Abstract
Purpose
The primary objective of this paper is to show a systematic and methodological approach for the digitalization of critical clinical pathways (CPs) within the healthcare domain.
Design/methodology/approach
The methodology entails the integration of service design (SD) and action research (AR) methodologies, characterized by iterative phases that systematically alternate between action and reflective processes, fostering cycles of change and learning. Within this framework, stakeholders are engaged through semi-structured interviews, while the existing and envisioned processes are delineated and represented using BPMN 2.0. These methodological steps emphasize the development of an autonomous, patient-centric web application alongside the implementation of an adaptable and patient-oriented scheduling system. Also, business process simulation is employed to measure key performance indicators of the processes and to test potential improvements. This method is implemented in the context of the CP addressing transient loss of consciousness (TLOC), within a publicly funded hospital setting.
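To show what business process simulation measures in such a setting, here is a minimal discrete-event sketch: patients queue for a single diagnostic slot and mean waiting time is recorded as a KPI. It assumes the SimPy package, and the arrival and service rates are fabricated rather than taken from the TLOC pathway.

```python
# Minimal discrete-event sketch of business process simulation for a
# clinical pathway. Rates are fabricated; not the paper's model.
# Requires SimPy (pip install simpy).
import random, simpy

random.seed(1)
waits = []

def patient(env, clinic):
    arrived = env.now
    with clinic.request() as slot:     # queue for the diagnostic resource
        yield slot
        waits.append(env.now - arrived)
        yield env.timeout(random.expovariate(1 / 30))  # ~30 min exam

def arrivals(env, clinic):
    while True:
        yield env.timeout(random.expovariate(1 / 40))  # ~40 min between arrivals
        env.process(patient(env, clinic))

env = simpy.Environment()
clinic = simpy.Resource(env, capacity=1)
env.process(arrivals(env, clinic))
env.run(until=8 * 60)                  # one 8-hour day, minutes as time unit
print(f"mean wait: {sum(waits) / len(waits):.1f} min over {len(waits)} patients")
```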
Findings
The methodology integrating SD and AR enables the detection of pivotal bottlenecks within diagnostic CPs and proposes optimal corrective measures to ensure uninterrupted patient care, all the while advancing the digitalization of diagnostic CP management. This study contributes to theoretical discussions by emphasizing the criticality of process optimization, the transformative potential of digitalization in healthcare and the paramount importance of user-centric design principles, and offers valuable insights into healthcare management implications.
Originality/value
The study’s relevance lies in its ability to enhance healthcare practices without necessitating disruptive and resource-intensive process overhauls. This pragmatic approach aligns with the imperative for healthcare organizations to improve their operations efficiently and cost-effectively.