Search results
1 – 10 of 114
Xingxing Li, Shixi You, Zengchang Fan, Guangjun Li and Li Fu
Abstract
Purpose
This review provides an overview of recent advances in electrochemical sensors for analyte detection in saliva, highlighting their potential applications in diagnostics and health care. The purpose of this paper is to summarize the current state of the field, identify challenges and limitations and discuss future prospects for the development of saliva-based electrochemical sensors.
Design/methodology/approach
The paper reviews relevant literature and research articles to examine the latest developments in electrochemical sensing technologies for saliva analysis. It explores the use of various electrode materials, including carbon nanomaterials, metal nanoparticles and conducting polymers, as well as the integration of microfluidics, lab-on-a-chip (LOC) devices and wearable/implantable technologies. The design and fabrication methodologies used in these sensors are discussed, along with sample preparation techniques and biorecognition elements for enhancing sensor performance.
Findings
Electrochemical sensors for salivary analyte detection have demonstrated excellent potential for noninvasive, rapid and cost-effective diagnostics. Recent advancements have resulted in improved sensor selectivity, stability, sensitivity and compatibility with complex saliva samples. Integration with microfluidics and LOC technologies has shown promise in enhancing sensor efficiency and accuracy. In addition, wearable and implantable sensors enable continuous, real-time monitoring of salivary analytes, opening new avenues for personalized health care and disease management.
Originality/value
This review presents an up-to-date overview of electrochemical sensors for analyte detection in saliva, offering insights into their design, fabrication and performance. It highlights the originality and value of integrating electrochemical sensing with microfluidics, wearable/implantable technologies and point-of-care testing platforms. The review also identifies challenges and limitations, such as interference from other saliva components and the need for improved stability and reproducibility. Future prospects include the development of novel microfluidic devices, advanced materials and user-friendly diagnostic devices to unlock the full potential of saliva-based electrochemical sensing in clinical practice.
Renata Lohmann and Ana Taís Martins
Abstract
This research is located at the intersection of communication, memetics, and the study of the imaginary. As a presupposition, we put forward the existence of a communicational imaginary, in which the contemporary person functions through their competencies in social networks, by meeting the demands of the public and the private, managing the obsessiveness of the sharing of intimacy and the exorbitant number of images. Considering memes as a significant aspect of this communicational imaginary, we seek to understand the dynamics and path of memes in the midst of this plethora of images. Drawing on the concept of iconophagy, we deal with the exacerbated multiplication of images and the path of memes from a marginalized environment until they are integrated into social roles and a rational level of thought. Thus, the general objective of this research is to understand the dynamics and path of memes amidst the plethora of images in the context of the communicational imaginary, and to investigate the multiplication of memes as representative of the myriad images in the contemporary imaginary.
Abstract
Purpose
This viewpoint article explores the transformative capabilities of large language models (LLMs) like the Chat Generative Pre-training Transformer (ChatGPT) within the property valuation industry. It particularly accentuates the pivotal role of prompt engineering in facilitating valuation reporting and advocates for adopting "Red Book"-compliant chain-of-thought (CoT) prompt engineering as a gold standard for generating AI-facilitated valuation reports.
Design/methodology/approach
The article offers a high-level examination of the application of LLMs in real estate research, highlighting the essential role of prompt engineering for future advancements in generative AI. It explores the collaborative dynamic between valuers and AI advancements, emphasising the importance of precise instructions and contextual cues in directing LLMs to generate accurate and reproducible valuation outcomes.
Findings
Integrating LLMs into property valuation processes paves the way for efficiency improvements and task automation, such as generating reports and drafting contracts. AI-facilitated reports offer unprecedented transparency and elevate client experiences. The fusion of valuer expertise with prompt engineering ensures the reliability and interpretability of valuation reports.
Practical implications
Delineating the types and versions of LLMs used in AI-generated valuation reports encourages the adoption of transparency best practices within the industry. Valuers, as expert prompt engineers, can harness the potential of AI to enhance efficiency, accuracy and transparency in the valuation process, delivering significant benefits to a broad array of stakeholders.
Originality/value
The article elucidates the substantial impact of prompt engineering in leveraging LLMs within the property industry. It underscores the importance of valuers training their unique GPT models, enabling customisation and reproducibility of valuation outputs. The symbiotic relationship between valuers and LLMs is identified as a key driver shaping the future of property valuations.
Przemysław G. Hensel and Agnieszka Kacprzak
Abstract
Purpose
Replication is a primary self-correction device in science. In this paper, we have two aims: to examine how and when the results of replications are used in management and organization research and to use the results of this examination to offer guidelines for improving the self-correction process.
Design/methodology/approach
Study 1 analyzes co-citation patterns for 135 original-replication pairs to assess the direct impact of replications, specifically examining how often and when a replication study is co-cited with its original. In Study 2, a similar design is employed to measure the indirect impact of replications by assessing how often and when a meta-analysis that includes a replication of the original study is co-cited with the original study.
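The co-citation measure described above can be sketched in a few lines: for each original-replication pair, take the papers that cite the original and ask what fraction also cite the replication. The paper identifiers and toy data below are invented for illustration and are not drawn from the authors' sample.

```python
# Hypothetical sketch of a co-citation rate: the share of papers citing an
# original study that also cite its replication. Data are illustrative.

def co_citation_rate(citing_refs, original, replication):
    """Fraction of papers citing `original` that also cite `replication`."""
    citers = [refs for refs in citing_refs if original in refs]
    if not citers:
        return 0.0
    co_citers = [refs for refs in citers if replication in refs]
    return len(co_citers) / len(citers)

# Toy citation data: each entry is one paper's reference list.
papers = [
    {"orig2005", "repl2010"},   # co-cites both
    {"orig2005"},               # cites only the original
    {"orig2005"},
    {"other2012"},              # cites neither
]
rate = co_citation_rate(papers, "orig2005", "repl2010")
print(rate)  # 1 of 3 citers co-cites the replication
```

Aggregating this rate over many original-replication pairs, and tracking it over time since publication, would yield the kind of direct-impact figures Study 1 reports.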
Findings
Study 1 reveals, among other things, that a huge majority (92%) of sources that cite the original study fail to co-cite a replication study, thus calling into question the impact of replications in our field. Study 2 shows that the indirect impact of replications through meta-analyses is likewise minimal. However, our analyses also show that replications published in the same journal that carried the original study and authored by teams including the authors of the original study are more likely to be co-cited, and that articles in higher-ranking journals are more likely to co-cite replications.
Originality/value
We use our results to formulate recommendations that would streamline the self-correction process in management research at the author-, reviewer- and journal-level. Our recommendations would create incentives to make replication attempts more common, while also increasing the likelihood that these attempts are targeted at the most relevant original studies.
Evangelia Panagiotidou, Panos T. Chountalas, Anastasios I. Magoutas and Fotis C. Kitsios
Abstract
Purpose
This study aims to dissect the multifaceted impact of ISO/IEC 17025 accreditation, specifically within civil engineering testing and calibration laboratories. To achieve this, it intends to explore several key objectives: identifying the prominent benefits of accreditation to laboratory performance, understanding the advantages conferred through participation in proficiency testing schemes, assessing the role of accreditation in enhancing laboratory competitiveness, examining the primary challenges encountered during the accreditation process, investigating any discernible adverse effects of accreditation on laboratory performance and evaluating whether the financial cost of accreditation justifies the resultant profitability.
Design/methodology/approach
This study employs a qualitative approach through semi-structured interviews with 23 industry professionals—including technical managers, quality managers, external auditors and clients. Thematic analysis, guided by Braun and Clarke’s six-stage paradigm, was utilized to interpret the data, ensuring a comprehensive understanding of the accreditation’s impact.
Findings
Findings reveal that accreditation significantly enhances operational processes, fosters quality awareness and facilitates continuous improvement, contributing to greater client satisfaction. In addition, standardized operations and rigorous quality controls further result in enhanced performance metrics, such as staff capability and measurement accuracy. However, the study also uncovers the challenges of accreditation, including high resource costs and bureaucratic hurdles that can inhibit innovation and slow routine operations. Importantly, the research underscores that the impact of accreditation on profitability is not universal, but contingent upon various factors like sector-specific regulations and market demand. The study also highlights sector-specific variations in the role of accreditation as a marketing tool and differing perceptions of its value among clients. It further emphasizes the psychological stress of high-stakes evaluations during audits.
Originality/value
This study represents the first in-depth investigation into the impact of ISO/IEC 17025 accreditation on civil engineering testing and calibration laboratories, directly contributing to the enhancement of their quality and operational standards. Providing actionable insights for laboratories, it underscores the importance of weighing accreditation costs and benefits and the necessity for a tailored approach to the unique market and regulatory landscapes they operate in.
Abstract
Purpose
Although the challenges associated with big data are increasing, the question of the most suitable big data analytics (BDA) platform in libraries is always significant. The purpose of this study is to propose a solution to this problem.
Design/methodology/approach
The current study identifies relevant literature and provides a review of big data adoption in libraries. It also presents a step-by-step guide for the development of a BDA platform using the Apache Hadoop Ecosystem. To test the system, an analysis of library big data using Apache Pig, which is a tool from the Apache Hadoop Ecosystem, was performed. It establishes the effectiveness of Apache Hadoop Ecosystem as a powerful BDA solution in libraries.
Findings
It can be inferred from the literature that libraries and librarians have not taken the possibility of big data services in libraries very seriously. Also, the literature suggests that there is no significant effort made to establish any BDA architecture in libraries. This study establishes the Apache Hadoop Ecosystem as a possible solution for delivering BDA services in libraries.
Research limitations/implications
The present work suggests adopting the idea of providing various big data services in a library by developing a BDA platform: for instance, assisting researchers in understanding big data, having skilled and experienced data managers clean and curate big data, and providing the infrastructural support to store, process, manage, analyze and visualize big data.
Practical implications
The study concludes that Apache Hadoop's Hadoop Distributed File System (HDFS) and MapReduce components significantly reduce the complexities of big data storage and processing, respectively, and that Apache Pig, using the Pig Latin scripting language, is very efficient in processing big data and responding to queries with a quick response time.
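The map/reduce pattern that HDFS and Pig build on can be illustrated with a minimal in-memory sketch: a mapper emits key-value pairs from records, and a reducer aggregates them per key. The circulation-record layout below is an assumption made for illustration; a real deployment would run this as a Pig Latin `GROUP ... / COUNT` job over HDFS.

```python
# In-memory sketch of the map/reduce aggregation pattern applied to toy
# library circulation records (user, subject); field names are assumptions.
from collections import defaultdict

records = [
    ("u1", "Data Science"), ("u2", "History"),
    ("u3", "Data Science"), ("u1", "History"), ("u4", "Data Science"),
]

def map_phase(records):
    # Emit (key, 1) pairs, as a MapReduce mapper would.
    for _user, subject in records:
        yield subject, 1

def reduce_phase(pairs):
    # Sum the counts per key, as the reducer would.
    totals = defaultdict(int)
    for key, count in pairs:
        totals[key] += count
    return dict(totals)

counts = reduce_phase(map_phase(records))
print(counts)  # {'Data Science': 3, 'History': 2}
```

The same aggregation in Pig Latin would be roughly a `GROUP records BY subject` followed by a `FOREACH ... GENERATE COUNT(...)`, with Pig compiling the script down to MapReduce jobs.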
Originality/value
According to the study, significantly fewer efforts have been made to analyze big data from libraries. Furthermore, it has been discovered that acceptance of the Apache Hadoop Ecosystem as a solution to big data problems in libraries is not widely discussed in the literature, although Apache Hadoop is regarded as one of the best frameworks for big data handling.
Rahma Torchani, Salma Damak-Ayadi and Issal Haj-Salem
Abstract
Purpose
This study aims to investigate the effect of mandatory international financial reporting standards (IFRS) adoption on the risk disclosure quality by listed European insurers.
Design/methodology/approach
The study used a content analysis of the annual reports and consolidated accounts of 13 insurance companies listed in the European market between 2002 and 2007 based on two regulatory frameworks, Solvency and IFRS.
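Content analysis of annual reports typically involves coding text against a scheme of risk-related categories. The sketch below shows one minimal, automatable scoring step; the keyword list and sentence-level counting rule are illustrative assumptions, not the authors' (manual) coding scheme.

```python
# Illustrative disclosure-scoring step: count sentences in a report excerpt
# that mention risk-related terms. Keyword list is a hypothetical example.
RISK_TERMS = ("risk", "exposure", "uncertainty", "hedging")

def risk_sentence_count(text):
    sentences = [s.strip() for s in text.split(".") if s.strip()]
    return sum(
        1 for s in sentences
        if any(term in s.lower() for term in RISK_TERMS)
    )

report = ("The group faces currency risk. Hedging instruments are used. "
          "Revenue grew by 4 percent.")
score = risk_sentence_count(report)
print(score)  # 2
```

Scores of this kind, computed per report and per year, would allow quality trends to be compared before and after the 2005 mandatory IFRS adoption.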
Findings
The results showed a significant effect of the mandatory adoption of IFRS and a clear improvement in the quality of risk disclosure. Moreover, risk disclosure is positively associated with the size of the company.
Research limitations/implications
The relatively limited size of the sample can be considered a limitation of this study. Moreover, the manual content analysis used may be considered subjective.
Practical implications
The findings of this study provide useful insights to professional and regulatory bodies about the consequences of IFRS adoption to enhance transparency and particularly risk disclosure.
Originality/value
The research contributes to the existing literature. First, the authors have shown that companies are improving in the quality of risk disclosure even before 2005. Second, the authors have shown that the year 2005 is distinguished by a marked improvement in disclosure trends, with companies aligning themselves with coercive and mimetic regulatory forces. Third, the authors highlight the significant effect of mandatory IFRS adoption even in highly regulated industries, such as the insurance industry.
Besiki Stvilia and Dong Joon Lee
Abstract
Purpose
This study addresses the need for a theory-guided, rich, descriptive account of research data repositories' (RDRs) understanding of data quality and the structures of their data quality assurance (DQA) activities. Its findings can help develop operational DQA models and best practice guides and identify opportunities for innovation in the DQA activities.
Design/methodology/approach
The study analyzed 122 data repositories' applications for the Core Trustworthy Data Repositories, interview transcripts of 32 curators and repository managers and data curation-related webpages of their repository websites. The combined dataset represented 146 unique RDRs. The study was guided by a theoretical framework comprising activity theory and an information quality evaluation framework.
Findings
The study provided a theory-based examination of the DQA practices of RDRs summarized as a conceptual model. The authors identified three DQA activities (evaluation, intervention and communication) and their structures, including activity motivations, roles played, mediating tools, and rules and standards. When defining data quality, study participants went beyond the traditional definition of data quality and referenced seven facets of ethical and effective information systems in addition to data quality. Furthermore, the participants and RDRs referenced 13 dimensions in their DQA models. The study revealed that DQA activities were prioritized by data value, level of quality, available expertise, cost and funding incentives.
Practical implications
The study's findings can inform the design and construction of digital research data curation infrastructure components on university campuses that aim to provide access not just to big data but trustworthy data. Communities of practice focused on repositories and archives could consider adding FAIR operationalizations, extensions and metrics focused on data quality. The availability of such metrics and associated measurements can help reusers determine whether they can trust and reuse a particular dataset. The findings of this study can help to develop such data quality assessment metrics and intervention strategies in a sound and systematic way.
Originality/value
To the best of the authors' knowledge, this paper is the first data quality theory guided examination of DQA practices in RDRs.
Huazhou He, Pinghua Xu, Jing Jia, Xiaowan Sun and Jingwen Cao
Abstract
Purpose
Fashion merchandising holds a paramount position within the realm of retail marketing. Currently, the assessment of display effectiveness predominantly relies on the subjective judgment of merchandisers, owing to the absence of an effective evaluation method. Although eye-tracking devices have found extensive use in tracking the gaze trajectories of subjects, they exhibit limitations in terms of stability when applied to the evaluation of various scenes. This underscores the need for a dependable, user-friendly and objective assessment method, which is the purpose of this article.
Design/methodology/approach
To develop a cost-effective and convenient evaluation method, the authors introduce an image processing framework for assessing variations in the impact of store furnishings. An optimized visual saliency methodology leverages a multiscale pyramid model, incorporating color, brightness and orientation features, to construct a visual saliency heatmap. Additionally, the authors establish two pivotal evaluation indices aimed at quantifying attention coverage and dispersion. Specifically, low-level features are extracted from nine images at distinct scales, downsampled from merchandising photographs. These extracted features are then amalgamated to form a heatmap, which serves as the focal point of the evaluation process. The proposed indices, dedicated to measuring visual focus and dispersion, facilitate a precise quantification of attention distribution within the observed scenes.
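The pyramid-based saliency pipeline described above can be sketched schematically. The code below uses a brightness channel only (the full method also combines color and orientation features), a three-level pyramid rather than nine scales, and a centroid-distance dispersion index; all of these simplifications are assumptions made for illustration.

```python
# Schematic sketch of multiscale center-surround saliency plus a dispersion
# index. Brightness-only, 3 levels; scale choices are illustrative.
import numpy as np

def downsample(img):
    return img[::2, ::2]  # halve resolution (nearest-neighbour for brevity)

def saliency_map(gray, levels=3):
    pyramid = [gray]
    for _ in range(levels - 1):
        pyramid.append(downsample(pyramid[-1]))
    h, w = gray.shape
    sal = np.zeros((h, w))
    for level in pyramid:
        # Contrast at this scale: deviation from the level's own mean,
        # upsampled back to full resolution by repetition.
        contrast = np.abs(level - level.mean())
        fy, fx = h // level.shape[0], w // level.shape[1]
        sal += np.repeat(np.repeat(contrast, fy, axis=0), fx, axis=1)[:h, :w]
    return sal / sal.max()

def dispersion(sal):
    # Spread of attention: saliency-weighted mean distance from the centroid.
    ys, xs = np.indices(sal.shape)
    total = sal.sum()
    cy, cx = (ys * sal).sum() / total, (xs * sal).sum() / total
    return float((np.hypot(ys - cy, xs - cx) * sal).sum() / total)

img = np.zeros((8, 8))
img[3:5, 3:5] = 1.0  # one bright patch in an otherwise uniform scene
sal = saliency_map(img)
print(sal.shape, dispersion(sal))
```

A concentrated display would yield a low dispersion value (attention mass close to one centroid), while a cluttered display would spread saliency and raise it, which is the behaviour the two proposed indices quantify.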
Findings
In comparison to conventional saliency algorithms, the optimization method yields more intuitive feedback regarding scene contrast. Moreover, the optimized approach results in a more concentrated focus within the central region of the visual field, a pattern in alignment with physiological research findings. The results affirm that the two defined indicators prove highly effective in discerning variations in visual attention across diverse brand store displays.
Originality/value
The study introduces an intelligent and cost-effective objective evaluation method founded upon visual saliency. This pioneering approach not only effectively discerns the efficacy of merchandising efforts but also holds the potential for extension to the assessment of fashion advertisements, home design and website aesthetics.
Nanond Nopparat and Damien Motte
Abstract
Purpose
Present for more than 20 years, 3D food printing (3DFP) technology has not experienced the same widespread adoption as its non-food counterparts. It is believed that relevant business models are crucial for its expansion. The purpose of this study is to identify the dominant prototypical business models and patterns in the 3DFP industry. The knowledge gained could be used to provide directions for business model innovation in this industry.
Design/methodology/approach
The authors established a business model framework and used it to analyse the identified 3DFP manufacturers. The authors qualitatively identified the market’s prototypical business models and used agglomerative hierarchical clustering to extract further patterns.
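Agglomerative hierarchical clustering of this kind repeatedly merges the two closest clusters until a target number remains. The average-linkage sketch below runs on binary business-model feature vectors; the feature names and toy firms are invented for illustration and do not reproduce the authors' coding of the 3DFP manufacturers.

```python
# Hypothetical sketch of average-linkage agglomerative clustering over
# binary business-model features. Firms and features are illustrative.
import math

def dist(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def average_linkage(c1, c2, points):
    # Mean pairwise distance between the two clusters' members.
    return sum(dist(points[i], points[j])
               for i in c1 for j in c2) / (len(c1) * len(c2))

def agglomerate(points, k):
    clusters = [[i] for i in range(len(points))]
    while len(clusters) > k:
        # Merge the pair of clusters with the smallest average distance.
        a, b = min(
            ((a, b) for a in range(len(clusters))
                    for b in range(a + 1, len(clusters))),
            key=lambda ab: average_linkage(clusters[ab[0]], clusters[ab[1]], points),
        )
        clusters[a] += clusters.pop(b)
    return clusters

# Features: [low-cost, private-use, small-scale-producer, high-revenue-market]
firms = [(1, 1, 0, 0), (1, 1, 0, 0), (0, 0, 1, 0), (0, 0, 1, 1)]
clusters = agglomerate(firms, 2)
print(clusters)  # → [[0, 1], [2, 3]]
```

With real data, cutting the resulting dendrogram at different heights would expose business-model patterns at varying granularity, such as the two primary patterns reported in the findings.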
Findings
All identified 3DFP businesses use the prototypical business model of selling ownership of physical assets, with some variations. Low-cost 3D food printers for private usage and dedicated 3D food printers for small-scale food producers are the two primary patterns identified. Furthermore, several benefits of 3DFP technology are not being used, and the identified manufacturers are barely present in high-revenue markets, which prevents them from driving technological innovation forward.
Practical implications
The extracted patterns can be used by the companies within the 3DFP industry and even in other additive manufacturing segments to reflect upon, refine or renew their business model. Some directions for business model innovation in this industry are provided.
Originality/value
To the best of the authors’ knowledge, this is the first quantitative study to give an account of the current 3DFP business models and their possible evolution. This study also contributes to the business model patterns methodological development.