Search results

1–10 of over 102,000
Article
Publication date: 3 December 2020

Monireh Gharibe Niazi, Masumeh Karbala Aghaei Kamran and Amir Ghaebi

Downloads
408

Abstract

Purpose

This study aims to design a proposed framework for evaluating university websites.

Design/methodology/approach

This study is exploratory mixed-methods research. It is applied research in terms of its objective and used the Delphi technique together with systematic review and meta-analysis approaches. Data were collected through library studies, a Delphi checklist and observation. The statistical population comprised 17 experts who design university websites and 20 Iranian university websites selected from the Webometrics website. The data were analyzed using fuzzy methods, descriptive and inferential statistics and the SWARA weighting method, with SPSS 20, Excel 2016, TOPSIS software and MAXQDA used for the analysis.
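
The SWARA weighting step mentioned above can be sketched briefly. This is an illustrative implementation of the generic SWARA procedure, not the study's code; the dimension names and comparative-importance values (s_j) below are hypothetical placeholders, not the study's data.

```python
def swara_weights(criteria, s_values):
    """Stepwise Weight Assessment Ratio Analysis (SWARA).

    criteria: names ordered from most to least important.
    s_values: comparative importance s_j of criterion j relative to the
    criterion ranked just above it (the first entry is unused, set to 0).
    """
    k = [1.0] + [s + 1.0 for s in s_values[1:]]   # k_1 = 1, k_j = s_j + 1
    q = [1.0]
    for kj in k[1:]:
        q.append(q[-1] / kj)                      # q_j = q_{j-1} / k_j
    total = sum(q)
    return {c: qj / total for c, qj in zip(criteria, q)}

# Hypothetical example: three dimensions ranked by expert consensus.
weights = swara_weights(["credibility", "reliability", "usability"],
                        [0.0, 0.05, 0.04])
```

By construction the weights sum to 1 and decrease down the ranking, matching the pattern of the dimension weights reported in the findings.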

Findings

Findings indicated that the dimensions of the proposed framework, in order of their weights, are credibility (0.130), reliability (0.125), usability (0.120), website design (0.110), functionality (0.104), content (0.100), page design (0.0922), efficiency (0.082), Webometrics (0.070) and systematic evaluation (0.067). Mebrate’s (2010) framework had the highest overlap (mean = 74.65) with the proposed framework, and Webometrics had the least overlap (mean = 18.5) and dependency (mean = 19). In the evaluation of the 20 Iranian university websites selected from the Webometrics site, the University of Tehran was ranked first with a score of 82.7 and Shiraz University was ranked last with a score of 75.

Originality/value

This study provides a comprehensive proposed framework for evaluating university websites that eliminates the shortcomings of all models, frameworks and methods of university website evaluation that focused only on one or more dimensions of university websites.

Details

The Electronic Library, vol. 38 no. 5/6
Type: Research Article
ISSN: 0264-0473

Keywords

Article
Publication date: 8 April 2021

Lynne Caley, Sharon J. Williams, Izabela Spernaes, David Thomas, Doris Behrens and Alan Willson

Abstract

Purpose

It has become accepted practice to include an evaluation alongside learning programmes that take place at work, as a means of judging their effectiveness. There is a tendency to focus such evaluations on the relevance of the intervention and the amount of learning achieved by the individual. The aim of this review is to examine existing evaluation frameworks that have been used to evaluate education interventions and, in particular, assess how these have been used and the outcomes of such activity.

Design/methodology/approach

A scoping review using Arksey and O’Malley’s five-stage framework was undertaken to examine existing evaluation frameworks claiming to evaluate education interventions.

Findings

Forty-five articles were included in the review. Most papers concentrate on learner satisfaction and/or learning achieved. Rarely is a structured framework mentioned, or detail of the analytical approach cited. Typically, evaluations lacked baseline data, control groups, longitudinal observations and contextual awareness.

Practical implications

This review has implications for those involved in designing and evaluating work-related education programmes, as it identifies areas where evaluations need to be strengthened and recommends how existing frameworks can be combined to improve how evaluations are conducted.

Originality/value

This scoping review is novel in its assessment and critique of evaluation frameworks employed to evaluate work-related education programmes.

Details

Journal of Workplace Learning, vol. 33 no. 6
Type: Research Article
ISSN: 1366-5626

Keywords

Article
Publication date: 9 November 2015

Colette Henry

Abstract

Purpose

The purpose of this paper is to consider entrepreneurship education (EE) evaluation. Specifically, it explores some of the challenges involved in applying the HEInnovate tool and considers ways in which its accuracy and value might be strengthened. Using Storey (2000) by way of reflective critique, the paper proposes an augmented framework to support the application of HEInnovate. It provides a further framework to help signpost those involved in EE towards a more robust consideration of EE evaluation. In so doing, the paper aims to contribute to extant theory in the field of EE by raising awareness of the continued need for evaluation; highlighting the potential benefits, as well as the associated challenges, of applying a self-assessment framework such as HEInnovate; and proposing an augmented framework that enhances the accuracy and value of the HEInnovate tool. Some avenues worthy of future research are identified.

Design/methodology/approach

This is a conceptual paper that draws on extant EE evaluation frameworks, specifically Storey’s “Six steps” to Heaven (2000), to explore how a more robust application of the HEInnovate self-evaluation tool might be achieved.

Findings

The HEInnovate framework is an easily accessible and widely applicable self-evaluation tool that higher education institutions (HEIs) are encouraged to use to determine their level of innovativeness and entrepreneurialism and, as a proxy, their preparedness to deliver EE programmes. The paper highlights the inherent challenges involved in administering self-evaluation frameworks of this nature, and uses Storey to identify areas for consideration so that the framework’s overall reliability and robustness can be enhanced, and findings rendered more accurate. The search for the “flawless” evaluative framework is likened to that of “Hunting the heffalump”.

Research limitations/implications

As a conceptual, perspective paper, the paper is limited by personal opinion. The focus on a single self-assessment institutional evaluative framework is a further limiting factor. That said, this approach prompts those using the HEInnovate framework to reflect on ways in which its application can be rendered more accurate and reliable.

Practical implications

The findings offer practical guidelines to enhance the overall robustness and accuracy of the HEInnovate framework. The paper will be of value to HEIs seeking to introduce or increase their EE provision.

Originality/value

The paper demonstrates a novel application of Storey’s evaluative framework, allowing users of the HEInnovate tool to greatly enhance its robustness and value. It also provides two new frameworks signposting entrepreneurship educators towards a more robust consideration of EE evaluation.

Details

Education + Training, vol. 57 no. 8/9
Type: Research Article
ISSN: 0040-0912

Keywords

Article
Publication date: 1 July 2014

Saad AboMoslim and Alan Russell

Abstract

Purpose

The paper aims to study the screening of design and construction technologies for skyscrapers. Skyscraper projects provide an illustration of important driving factors (e.g. economies of scale and international expertise) when utilising a wide range of solutions, including innovative ones, in the design and construction of building systems and subsystems. The need exists for a methodology for the speedy screening and comprehensive evaluation of candidate solutions covering the complete spectrum of systems that comprise a building project and that have an impact on life cycle performance. Presented in this paper is a three-step evaluation framework directed at meeting this need, along with application of the first step to three case studies.

Design/methodology/approach

Research objectives were achieved by an extensive literature review of the current state-of-the-art evaluation tools and criteria; formulation of a three-step evaluation process for screening and ranking candidates; identification and structuring of comprehensive checklists of evaluation criteria; application of the first step of the evaluation framework to three case studies to gauge completeness and ease of use; and assessment of the framework by experienced practitioners.

Findings

The framework proposed provides a structured and transparent approach to assessing design/construction choices. It makes explicit the spectrum of criteria to be considered when assessing their feasibility. Feedback from industry professionals indicates that the framework is reflective of industry needs.

Originality/value

The originality and value of the approach lie in the comprehensiveness of the criteria considered and in their relevance both to signature building projects that draw on international expertise and technologies and to all phases of the project life cycle.

Details

Construction Innovation, vol. 14 no. 3
Type: Research Article
ISSN: 1471-4175

Keywords

Article
Publication date: 5 October 2018

Daniela Carlucci, Paolo Renna, Carmen Izzo and Giovanni Schiuma

Abstract

Purpose

The purpose of this paper is to propose a framework for the analysis of students’ ratings of teaching quality in higher education and the disclosure of risky issues undermining the quality of teaching and courses that require attention for continuous improvement. The framework integrates two decision-based methods: the standardized u-control chart and ABC analysis using fuzzy weights. The control chart, using the students’ ratings, allows the identification of those courses requiring an improvement of teaching quality in the short-to-medium term, while the ABC analysis uses fuzzy weights to deal with the vagueness and uncertainty of students’ teaching evaluations and provides a risk map of the potential areas of teaching performance improvement in the long term. The proposed framework allows the identification of teaching and course quality aspects that need corrective actions in response to students’ criticisms in accordance with different levels of priority.
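
As a rough sketch of the first method: a standardized u-chart flags courses whose rate of critical ratings per respondent deviates from the overall rate by more than three standard errors. This is a generic illustration under assumed data, not the authors' implementation; the course counts below are hypothetical.

```python
import math

def u_chart_flags(counts, sizes):
    """Standardized u-chart: flag units whose rate of critical ratings
    per respondent lies above the upper three-sigma control limit.

    counts[i]: number of critical ratings for course i.
    sizes[i]:  number of respondents rating course i.
    Returns (z_scores, flags).
    """
    u_bar = sum(counts) / sum(sizes)              # overall rate per respondent
    zs, flags = [], []
    for c, n in zip(counts, sizes):
        u_i = c / n                               # per-course rate
        z = (u_i - u_bar) / math.sqrt(u_bar / n)  # standardized statistic
        zs.append(z)
        flags.append(z > 3.0)                     # above upper control limit
    return zs, flags

# Hypothetical ratings data for three courses.
zs, flags = u_chart_flags([4, 5, 30], [50, 60, 40])
```

Courses flagged above the upper control limit (z > 3) would be candidates for the short-to-medium-term corrective action the framework describes.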

Design/methodology/approach

This study adopts two methods, commonly used in industrial applications, i.e. the u-control chart and ABC analysis. Combining the results of a literature review on teaching evaluation and the application of these two methods as building blocks for the assessment, a framework to detect potential risks reducing teaching quality in higher education is proposed. The application of the framework is shown through an action-based case study developed in an Italian public university.

Findings

The study proposes a framework that combines two methods, i.e. the u-control chart and ABC analysis with fuzzy weights, to support the assessment of teaching and course quality. The framework is proposed as an approach for assessing teaching performance in higher education, with the purpose of continuously improving the quality of teaching and courses in the short, medium and long term.

Originality/value

The study provides an original contribution to the understanding of how to analyze students’ evaluation of teaching performance in order to take proper and timely decisions on corrective actions in response to the need of continuously improving the level of teaching and course quality.

Details

Management Decision, vol. 57 no. 2
Type: Research Article
ISSN: 0025-1747

Keywords

Article
Publication date: 9 March 2015

Dan Albertson

Downloads
1034

Abstract

Purpose

The purpose of this study is to synthesize prior user-centered research to develop and present a generalized framework for evaluating visual (i.e. both image and video) digital libraries. The primary objectives include comprehensively examining the current state of visual digital library research to: develop a generalized framework applicable for designing user-centered evaluations of visual digital libraries; identify influential experimental factors warranting assessment as part of evaluations in specific contexts; and provide examples of applied methods that have been used in research, demonstrating notable findings.

Design/methodology/approach

The framework presented in the present study depicts a set of user-centered methodological considerations and examples, synthesized from a review of prior research that provides significant understanding of users and uses of visual information.

Findings

Primary components for digital library evaluation, pertaining to user, interaction, system and domain and topic, and their implications for interactive research are presented. Methods, examples and discussion are presented for each primary evaluation component of the framework.

Practical implications

Previously applied evaluations and their significance are described and presented as part of the developed framework, providing the importance of each component for practical application in future research and development of interactive visual digital libraries.

Originality/value

Visual digital libraries warrant individual assessment, apart from other types of digital collections, as they offer users more ways to retrieve and interact with collection items. The present study complements prior digital library evaluation research by demonstrating the need for a separate framework due to variations influenced by visual information and reporting on evaluations from different perspectives.

Details

New Library World, vol. 116 no. 3/4
Type: Research Article
ISSN: 0307-4803

Keywords

Article
Publication date: 1 April 2006

Ying‐Lien Lee, Sheue‐Ling Hwang and Eric Min‐Yang Wang

Downloads
2226

Abstract

Purpose

The primary purpose of this paper is to present an integrated framework for user interface prototyping and evaluation for the development of information systems and to present architecture for evaluating generic applications.

Design/methodology/approach

The framework is constructed by combining two distinctive methods of prototyping and evaluation: statecharts and GOMS (goals, operators, methods and selection rules). Relevant methods and architectures of the integrated framework are presented in unified modeling language (UML) where possible.
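
The GOMS side of such a framework can be illustrated with its simplest member, the keystroke-level model, which estimates task execution time by summing standard operator times. This sketch is a generic GOMS-family illustration, not the paper's framework; the operator times are the commonly published KLM values and the method sequence is hypothetical.

```python
# Operator times (seconds) from the keystroke-level model, a simple
# member of the GOMS family: K = keystroke, P = point with mouse,
# H = home hands between devices, M = mental preparation.
KLM_TIMES = {"K": 0.2, "P": 1.1, "H": 0.4, "M": 1.35}

def klm_estimate(operators):
    """Estimate task execution time by summing operator times."""
    return sum(KLM_TIMES[op] for op in operators)

# Hypothetical method for "save file via toolbar button":
# mental preparation, point at button, click.
t = klm_estimate(["M", "P", "K"])
```

Evaluating a prototype then amounts to enumerating the operator sequence each statechart path implies and comparing the predicted times of candidate designs.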

Findings

The importance of the usability of information systems is highlighted in this research. However, the field still lacks an integrated framework for information system development and usability evaluation. This paper provides a framework in which the evaluation method is intertwined with user interface prototyping to shorten the development lifecycle. The architecture for evaluating generic applications is also invaluable for motion and time study and for the procurement of vendor-provided systems.

Research limitations/implications

The user base of information systems is diverse and the requirements of these systems change over time. This paper provides a framework that helps managers and engineers smooth and shorten the development phases. For future works, an object‐oriented programming framework and a tool for evaluating generic applications will be developed.

Originality/value

This paper proposes a comprehensive framework combining prototyping and evaluation, as well as an architecture for the evaluation of generic applications. It shortens the development phases by using formal modeling for user interface construction and evaluation, provides a means to evaluate candidate systems whose program logic cannot be accessed or modified, and complements the models used in the framework by extending their practical and academic value.

Details

Industrial Management & Data Systems, vol. 106 no. 4
Type: Research Article
ISSN: 0263-5577

Keywords

Article
Publication date: 4 March 2014

Ekaterina A. Makarova and Anna Sokolova

Downloads
2317

Abstract

Purpose

The aim of this paper is to identify ways for improvement of the foresight evaluation framework on the basis of analysis and systematisation of accumulated experience in the field of project management.

Design/methodology/approach

The paper is based on a detailed literature review of the evaluation of foresight and traditional projects. Approaches to project evaluation in the field of project management were investigated, and the main steps of the traditional project evaluation process were determined. The most commonly applied steps of foresight evaluation were identified through analysis of recent foresight evaluation projects. Comparing the evaluation frameworks for foresight and traditional projects makes it possible to provide recommendations for improving the foresight evaluation framework.

Findings

The paper identifies several lessons for foresight evaluation from project management. The elements that can enrich the foresight evaluation framework are: the development of an evaluation model; the extensive use of quantitative methods; the elaboration of evaluation scales; the inclusion of economic indicators in evaluation; and the provision of more openness and transparency for evaluation results.

Originality/value

Given the importance of foresight evaluation procedures and the lack of a commonly applied methodological approach, the value of this paper consists in identifying a foresight evaluation framework and enriching it with elements of project management.

Details

Foresight, vol. 16 no. 1
Type: Research Article
ISSN: 1463-6689

Keywords

Article
Publication date: 1 April 2004

Scott Nicholson

Downloads
3143

Abstract

This conceptual piece presents a framework to aid libraries in gaining a more thorough and holistic understanding of their users and services. Through a presentation of the history of library evaluation, a multidimensional matrix of measures is developed that demonstrates the relationship between the topics and perspectives of measurement. These measurements are combined through evaluation criteria, which different participants in the library system then view for decision making. By implementing this framework for holistic measurement and cumulative evaluation, library evaluators can gain a more holistic knowledge of the library system, and library administrators can be better informed in their decision-making processes.

Details

Journal of Documentation, vol. 60 no. 2
Type: Research Article
ISSN: 0022-0418

Keywords

Article
Publication date: 3 April 2009

Elise Ramstad

Downloads
2424

Abstract

Purpose

During the past decade, new types of broader networks that aim to achieve widespread effects in working life have emerged. These are typically based on an interactive innovation approach, in which knowledge is created jointly with diverse players. The challenge at the moment is how to evaluate these complex networks and learning processes. This paper seeks to present a developmental evaluation framework for innovation and learning networks.

Design/methodology/approach

The evaluation framework is based on a systemic and complementarity view of knowledge sources and innovation activities. The framework integrates three elements of a network: structure, learning processes and the outcomes for different actors. The basic assumption is that networks with several actors based on an expanded triple helix model (workplaces, R&D infrastructure and policy makers) and several learning processes enable better innovation potential and broader outcomes. Criteria for an evaluation framework are created and then confronted with empirical data, in this case learning network projects (n=17) funded by the Finnish Workplace Development Programme.

Findings

The results show that the created evaluation framework offers a useful tool for identifying the networks with the best potential for broader outcomes for diverse actors. It can provide a tool for policy makers, as well as for the participants involved, to direct and coordinate innovation and generative learning more effectively. However, there is not, and cannot be, a common and strict pattern for an innovation and learning network, as one of their main goals is to create and experiment with new forms of development cooperation.

Originality/value

An evaluation framework is needed to direct innovation and learning networks and to increase their validity.

Details

Journal of Workplace Learning, vol. 21 no. 3
Type: Research Article
ISSN: 1366-5626

Keywords
