Search results
1 – 10 of 22

Vaclav Snasel, Tran Khanh Dang, Josef Kueng and Lingping Kong
Abstract
Purpose
This paper aims to review in-memory computing (IMC) for machine learning (ML) applications from history, architectures and options aspects. In this review, the authors investigate different architectural aspects and collect and provide our comparative evaluations.
Design/methodology/approach
The authors collect over 40 recent IMC papers on hardware design and optimization techniques and classify them into three optimization categories: optimization through the graphics processing unit (GPU), optimization through reduced precision and optimization through hardware accelerators. They then summarize each technique in terms of the data sets it was applied to, how it is designed and what the design contributes.
Findings
ML algorithms are potent tools accommodated on IMC architecture. Although general-purpose hardware (central processing units and GPUs) can supply explicit solutions, their energy efficiency is limited by the excessive flexibility they must support. On the other hand, hardware accelerators (field programmable gate arrays and application-specific integrated circuits) win on energy efficiency, but an individual accelerator often adapts exclusively to a single ML approach (family). From a long-term hardware evolution perspective, heterogeneous hardware/software co-design on hybrid platforms is a promising option for researchers.
Originality/value
Optimizing IMC enables high-speed processing, increases performance and supports real-time analysis of massive volumes of data. This work reviews IMC and its evolution, then categorizes three optimization paths for improving the performance of IMC architectures.
Miquel Centelles and Núria Ferran-Ferrer
Abstract
Purpose
This study develops a comprehensive framework for assessing knowledge organization systems (KOSs), including the taxonomy of Wikipedia and the ontologies of Wikidata, with a specific focus on enhancing management and retrieval from a gender nonbinary perspective.
Design/methodology/approach
This study employs heuristic and inspection methods to assess Wikipedia’s KOS, ensuring compliance with international standards. It evaluates the efficiency of retrieving non-masculine gender-related articles using the Catalan Wikipedian category scheme, identifying limitations. Additionally, a novel assessment of Wikidata ontologies examines their structure and coverage of gender-related properties, comparing them to Wikipedia’s taxonomy for advantages and enhancements.
Findings
This study evaluates Wikipedia’s taxonomy and Wikidata’s ontologies, establishing evaluation criteria for gender-based categorization and exploring their structural effectiveness. The evaluation process suggests that Wikidata ontologies may offer a viable solution to address Wikipedia’s categorization challenges.
Originality/value
The assessment of Wikipedia categories (taxonomy) based on KOS standards leads to the conclusion that there is ample room for improvement, not only in matters concerning gender identity but also in the overall KOS to enhance search and retrieval for users. These findings bear relevance for the design of tools to support information retrieval on knowledge-rich websites, as they assist users in exploring topics and concepts.
Jiangnan Qiu, Wenjing Gu, Zhongming Ma, Yue You, Chengjie Cai and Meihui Zhang
Abstract
Purpose
In the extant research on online knowledge communities (OKCs), little attention has been paid to the influence of membership fluidity on the coevolution of the social and knowledge systems. This article aims to fill this gap.
Design/methodology/approach
Based on the attraction-selection-attrition (ASA) framework, this paper constructs a simulation model to study the coevolution of these two systems under different levels of membership fluidity.
Findings
By analyzing the evolution of these systems with the vector autoregression (VAR) method, the authors find that the social and knowledge systems become more orderly as the coevolution progresses. Furthermore, in communities with low membership fluidity, the microlevel of the social system (i.e. users) drives the coevolution, whereas in communities with high membership fluidity, the microlevel of the knowledge system (i.e. users' views) drives the coevolution.
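The VAR analysis described above can be sketched in miniature. The snippet below simulates a toy two-variable system standing in for social-order and knowledge-order indices and recovers the lag-1 coefficient matrix by least squares; the coefficients and series are purely illustrative assumptions, not the paper's data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative "true" VAR(1) dynamics for two coevolving indices
# (hypothetical values, not estimated from any real community).
A_true = np.array([[0.6, 0.2],
                   [0.1, 0.7]])
T = 500
y = np.zeros((T, 2))
for t in range(1, T):
    # y_t = A @ y_{t-1} + noise
    y[t] = A_true @ y[t - 1] + rng.normal(scale=0.1, size=2)

# Estimate the VAR(1) coefficients by least squares: y_t ≈ A @ y_{t-1}
Y, X = y[1:], y[:-1]
A_hat = np.linalg.lstsq(X, Y, rcond=None)[0].T
```

With enough observations, `A_hat` recovers `A_true` up to sampling noise; real VAR studies add lag selection and significance tests on top of this core fit.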
Originality/value
This paper extends the application of the ASA framework and enriches the literature on membership fluidity of online communities and on the driving factors for coevolution of the social and knowledge systems in OKCs. On a practical level, this work suggests that community administrators should adopt different strategies for different levels of membership fluidity to efficiently promote the coevolution of the social and knowledge systems in OKCs.
Yaolin Zhou, Zhaoyang Zhang, Xiaoyu Wang, Quanzheng Sheng and Rongying Zhao
Abstract
Purpose
The digitalization of archival management has developed rapidly with the maturation of digital technology. With the exponential growth of data, archival resources have transitioned from single modalities, such as text, images, audio and video, to integrated multimodal forms. This paper identifies key trends, gaps and areas of focus in the field. Furthermore, it proposes a theoretical organizational framework based on deep learning to address the challenges of managing archives in the era of big data.
Design/methodology/approach
Via a comprehensive systematic literature review, the authors investigate the field of multimodal archive resource organization and the application of deep learning techniques in archive organization. A systematic search and filtering process is conducted to identify relevant articles, which are then summarized, discussed and analyzed to provide a comprehensive understanding of existing literature.
Findings
The authors' findings reveal that most research on multimodal archive resources predominantly focuses on aspects related to storage, management and retrieval. Furthermore, the utilization of deep learning techniques in image archive retrieval is increasing, highlighting their potential for enhancing image archive organization practices; however, practical research and implementation remain scarce. The review also underscores gaps in the literature, emphasizing the need for more practical case studies and the application of theoretical concepts in real-world scenarios. In response to these insights, the authors' study proposes an innovative deep learning-based organizational framework. This proposed framework is designed to navigate the complexities inherent in managing multimodal archive resources, representing a significant stride toward more efficient and effective archival practices.
Originality/value
This study comprehensively reviews the existing literature on multimodal archive resources organization. Additionally, a theoretical organizational framework based on deep learning is proposed, offering a novel perspective and solution for further advancements in the field. These insights contribute theoretically and practically, providing valuable knowledge for researchers, practitioners and archivists involved in organizing multimodal archive resources.
Kimberly A. Whitler, Graham D. Wells and Gerry Yemen
Abstract
Few cases allow the student to understand the relationship between brand strategy, marketing strategy, implementation, and analysis. While some conceive of the process as being sequential, this case demonstrates that in fact, this process is more fluid, and that implementation and analysis impact subsequent strategy.
This field-based case provides a rare glimpse into the turnaround of a brand that was all but dead. After Buick suffered more than five decades of declining business results and an inferior brand image versus all rivals, few thought that the brand could be resuscitated. This case provides a valuable under-the-hood look at how the Buick team, over time, progresses through a series of marketing improvements all anchored on an evolved strategy. Specifically, Buick introduced a shift in brand strategy behind an evolved brand essence statement (i.e., brand positioning), improved product lineup, new-to-the-world innovation, enhanced dealership service, and more compelling advertising. The results led to a record number of product awards, significantly improved advertising measures, improved service ratings, and better business results.
Despite significant improvement across multiple dimensions of the business, Buick still trailed key competitors on one of the most important measures Buick tracked—the brand momentum rating—suggesting that there was still more work needed to complete the brand turnaround. The case introduces Molly Peck, the new marketing director on Buick, who is wondering what more, if anything, Buick should do. The material allows for instruction around marketing strategy and the process of converting it into implementation through the use of a creative brief.
Peter Dornheim and Ruediger Zarnekow
Abstract
Purpose
The human factor is the most important defense asset against cyberattacks. To ensure that the human factor stays strong, a cybersecurity culture must be established and cultivated in a company to guide the attitudes and behaviors of employees. Many cybersecurity culture frameworks exist; however, their practical application is difficult. This paper aims to demonstrate how an established framework can be applied to determine and improve the cybersecurity culture of a company.
Design/methodology/approach
Two surveys were conducted within eight months in the internal IT department of a global software company to analyze the cybersecurity culture and the applied improvement measures. Both surveys comprised the same 23 questions to measure cybersecurity culture according to six dimensions: cybersecurity accountability, cybersecurity commitment, cybersecurity necessity and importance, cybersecurity policy effectiveness, information usage perception and management buy-in.
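Scoring a survey like the one described can be sketched as follows: average the Likert responses within each of the six dimensions and flag the weakest one as the place to target improvement measures. The dimension names follow the abstract, but the question grouping and response values below are hypothetical, not the paper's 23-item instrument.

```python
from statistics import mean

# Hypothetical 5-point Likert responses grouped by the six dimensions
# named in the abstract (question counts and values are illustrative).
responses = {
    "accountability": [4, 3, 5, 4],
    "commitment": [3, 3, 4],
    "necessity and importance": [5, 4, 4, 5],
    "policy effectiveness": [2, 3, 3],
    "information usage perception": [4, 4],
    "management buy-in": [5, 5, 4],
}

# One maturity score per dimension; the lowest score flags where to act.
scores = {dim: mean(vals) for dim, vals in responses.items()}
weakest = min(scores, key=scores.get)
```

Repeating the same scoring on a second survey, as the study does, shows whether the targeted dimensions actually improved.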
Findings
Results demonstrate that cybersecurity culture maturity can be determined and improved if accurate measures are derived from the results of the survey. The first survey showed potential for improving the dimensions of cybersecurity accountability, cybersecurity commitment and cybersecurity policy effectiveness, while the second survey proved that these dimensions have been improved.
Originality/value
This paper proves that practical application of cybersecurity culture frameworks is possible if they are appropriately tailored to a given organization. In this regard, scientific research and practical application combine to offer real value to researchers and cybersecurity executives.
Abstract
Purpose
With the rapid development of social media, the occurrence and evolution of emergency events are often accompanied by massive users' expressions. The fine-grained analysis on users' expressions can provide accurate and reliable information for event processing. Hence, 2,003,814 expressions on a major malignant emergency event were mined from multiple dimensions in this paper.
Design/methodology/approach
This paper conducted finer-grained analysis on users' online expressions in an emergency event. Specifically, the authors firstly selected a major emergency event as the research object and collected the event-related user expressions that lasted nearly two years to describe the dynamic evolution trend of the event. Then, users' expression preferences were identified by detecting anomic expressions, classifying sentiment tendencies and extracting topics in expressions. Finally, the authors measured the explicit and implicit impacts of different expression preferences and obtained relations between the differential expression preferences.
Findings
Experimental results showed that users pay both short- and long-term attention to emergency events: their enthusiasm for discussing an event is quickly dispelled but easily rearoused. Meanwhile, most users prefer to make rational and normative expressions about events, and the expression topics are diversified. In addition, compared with anomic negative expressions, anomic expressions with positive sentiments are more common. In conclusion, integrating multi-dimensional analyses of users' expression preferences (including discussion heat, preference impacts and preference relations) is an effective means of supporting emergency event processing.
Originality/value
To the best of the authors' knowledge, this is the first research to conduct in-depth, fine-grained analysis of user expression in emergencies, yielding detailed, multi-dimensional characteristics of users' online expressions to support event processing.
Hazwani Shafei, Rahimi A. Rahman and Yong Siang Lee
Abstract
Purpose
Policymakers are developing national strategic plans to encourage organizations to adopt Construction 4.0 technologies. However, organizations often adopt the recommended technologies without aligning with organizational vision. Furthermore, there is no prioritization on which Construction 4.0 technology should be adopted, including the impact of the technologies on different criteria such as safety and health. Therefore, this study aims to evaluate Construction 4.0 technologies listed in a national strategic plan that targets the enhancement of safety and health.
Design/methodology/approach
A list of Construction 4.0 technologies from a national strategic plan is evaluated using the fuzzy technique for order preference by similarity to ideal solution (TOPSIS) method. Then, the data are analyzed using reliability, fuzzy TOPSIS, normalization, Pareto, sensitivity, ranking and correlation analyses.
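As a rough illustration of the ranking step, the sketch below applies crisp (non-fuzzy) TOPSIS to a hypothetical decision matrix of three technologies scored against three benefit criteria; the study itself uses fuzzy TOPSIS with its own criteria, weights and expert data, so every number here is an assumption.

```python
import numpy as np

# Illustrative decision matrix: rows = technologies, columns = criteria
# (scores and weights are hypothetical, not from the study).
techs = ["IoT", "BIM", "Big data"]
X = np.array([
    [8.0, 7.0, 9.0],
    [6.0, 8.0, 7.0],
    [7.0, 6.0, 8.0],
])
w = np.array([0.5, 0.3, 0.2])  # criterion weights, summing to 1

# Vector-normalize each column, then apply the weights.
R = X / np.linalg.norm(X, axis=0)
V = R * w

# Ideal best/worst points (all criteria treated as benefits here).
best, worst = V.max(axis=0), V.min(axis=0)
d_best = np.linalg.norm(V - best, axis=1)
d_worst = np.linalg.norm(V - worst, axis=1)

# Closeness to the ideal solution; higher = better rank.
closeness = d_worst / (d_best + d_worst)
ranking = [techs[i] for i in np.argsort(-closeness)]
```

The fuzzy variant replaces the crisp scores with triangular fuzzy numbers and defuzzifies the distances, but the normalize–weight–rank skeleton is the same.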
Findings
The analyses identified six Construction 4.0 technologies that are critical to enhancing safety and health: the Internet of Things, autonomous construction, big data and predictive analytics, artificial intelligence, building information modeling, and augmented reality and virtualization. In addition, six pairs of Construction 4.0 technologies show strong relationships.
Originality/value
This study contributes to the existing body of knowledge by ranking a list of Construction 4.0 technologies in a national strategic plan that targets the enhancement of safety and health. Decision-makers can use the study findings to prioritize the technologies during the adoption process. Also, to the best of the authors’ knowledge, this study is the first to evaluate the impact of Construction 4.0 technologies listed in a national strategic plan on a specific criterion.
Abstract
Purpose
The paper seeks to introduce the “critical open access literacy” construct as a holistic approach to confront the challenges in open access (OA) as a dimension of scholarly communication.
Design/methodology/approach
The paper first introduces the concepts of information literacy (IL) and OA in the context of transformations in the scholarly information environment. Via a theoretical-analytical exercise on the basis of a literature review of the intersections between the two concepts and of the criticisms of OA, the paper discusses the role of critical IL in addressing the challenges in OA and lays the theoretical-conceptual groundwork for the critical OA literacy construct.
Findings
The structural nature of the challenges and transformations in the scholarly information environment requires new foci and pedagogical practices in library and information studies. A more holistic, critical and integrative approach to OA is warranted, which could effectively be achieved through the re-conceptualization of IL.
Practical implications
The paper specifies the avenues for putting the theoretical conceptualizations of critical OA literacy into practice by identifying possible foci for IL instruction alongside a transformed role for librarians.
Originality/value
The paper extends deliberations on the role of critical IL for scholarly communication and attempts to advance the research fields of the two domains by proposing a new construct situated at the junction of OA and IL.
Shahin Alipour Bonab, Alireza Sadeghi and Mohammad Yazdani-Asrami
Abstract
Purpose
The ionization of the air surrounding the phase conductor in high-voltage transmission lines results in a phenomenon known as the Corona effect. To avoid this, Corona rings are used to dampen the electric field imposed on the insulator. The purpose of this study is to present a fast and intelligent surrogate model for determining the electric field imposed on the surface of a 120 kV composite insulator in the presence of the Corona ring.
Design/methodology/approach
Usually, the structural design parameters of the Corona ring are selected through an optimization procedure combined with numerical simulations such as the finite element method (FEM). These methods are slow and computationally expensive, severely limiting the speed of optimization. In this paper, a novel surrogate model is proposed that can calculate the maximum electric field imposed on a ceramic insulator in a 120 kV line. The surrogate model was created from different scenarios of the height, radius and inner radius of the Corona ring as the inputs of the model, with the maximum electric field on the body of the insulator as the output.
Findings
The proposed model is based on artificial intelligence techniques that offer high accuracy and low computational time. Three methods were used to develop the AI-based surrogate model: cascade forward neural network (CFNN), support vector regression and K-nearest neighbors regression. The results indicated that the CFNN has the highest accuracy among these methods, with an R-squared of 99.81% and a root mean squared error of only 0.045468, while the testing time is less than 10 ms.
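Of the three methods, K-nearest neighbors regression is the simplest to sketch. The toy surrogate below maps the three Corona-ring inputs named in the abstract (height, radius, inner radius) to a maximum-field value by averaging the targets of the nearest training points; all numbers are invented for illustration and stand in for the FEM-generated training data a real surrogate would use.

```python
import math

# Hypothetical training samples:
# (height_mm, radius_mm, inner_radius_mm) -> max E-field (kV/cm).
# Values are illustrative, not FEM results from the paper.
train = [
    ((100.0, 150.0, 40.0), 4.2),
    ((120.0, 160.0, 45.0), 3.9),
    ((140.0, 170.0, 50.0), 3.6),
    ((160.0, 180.0, 55.0), 3.4),
]

def knn_predict(x, k=2):
    """Average the targets of the k nearest training points (Euclidean)."""
    dists = sorted((math.dist(x, xi), yi) for xi, yi in train)
    return sum(y for _, y in dists[:k]) / k

# Query an unseen ring geometry.
e_max = knn_predict((130.0, 165.0, 47.0))
```

Because prediction is a handful of distance computations rather than a full FEM solve, such a surrogate answers in milliseconds, which is what makes it usable inside an optimization loop.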
Originality/value
To the best of the authors' knowledge, this is the first surrogate method proposed for predicting the maximum electric field imposed on high-voltage insulators in the presence of a Corona ring, and it is faster than any conventional finite element method.