Search results

1 – 10 of over 1000
Article
Publication date: 6 October 2020

Mulki Indana Zulfa, Rudy Hartanto and Adhistya Erna Permanasari

Abstract

Purpose

Internet users and Web-based applications continue to grow every day. The response time on a Web application really determines the convenience of its users. Caching Web content is one strategy that can be used to speed up response time. This strategy is divided into three main techniques, namely, Web caching, Web prefetching and application-level caching. The purpose of this paper is to put forward a literature review of caching strategy research that can be used in Web-based applications.

Design/methodology/approach

The method used in this paper was as follows: determine the review method, conduct the review process, analyse the pros and cons of the selected work and draw conclusions. The review was carried out by searching the literature of leading journals and conferences. The search process started by determining keywords related to caching strategies. To keep the review in line with current developments in website technology, the results were limited to the past 10 years, to English-language publications and to the field of computer science.

Findings

Web caching and Web prefetching are slightly overlapping techniques, as both aim to reduce latency on the user’s side, but they rely on different basic mechanisms. Web caching is built on cache replacement, that is, the algorithm that decides which objects to evict from memory when the cache capacity is full, whereas Web prefetching is built on predicting which objects will be accessed in the future. This paper also contributes practical guidelines for choosing the appropriate caching strategy for Web-based applications.
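
To make the distinction concrete, the sketch below contrasts the two mechanisms under simple assumptions: an LRU policy stands in for cache replacement (the paper surveys many policies; LRU is chosen here purely for illustration) and a most-frequent-successor rule stands in for prefetching prediction. Class and method names are illustrative and do not come from any surveyed system.

```python
from collections import OrderedDict, Counter, defaultdict

class LRUCache:
    """Cache replacement: evict the least recently used object when the
    cache capacity is full (one example policy among the many surveyed)."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.store = OrderedDict()          # URL -> cached Web object

    def get(self, url):
        if url not in self.store:
            return None                     # miss: caller fetches from the origin server
        self.store.move_to_end(url)         # mark as most recently used
        return self.store[url]

    def put(self, url, obj):
        if url in self.store:
            self.store.move_to_end(url)
        self.store[url] = obj
        if len(self.store) > self.capacity:
            self.store.popitem(last=False)  # evict the least recently used object

class SuccessorPrefetcher:
    """Prefetching: predict the object most likely to be requested next,
    here by counting which URL most often followed the current one."""

    def __init__(self):
        self.successors = defaultdict(Counter)
        self.previous = None

    def record(self, url):
        if self.previous is not None:
            self.successors[self.previous][url] += 1
        self.previous = url

    def predict(self, url):
        counts = self.successors.get(url)
        return counts.most_common(1)[0][0] if counts else None
```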

Originality/value

This paper presents a state-of-the-art review of caching strategies that can be used in Web applications. Specifically, it presents a taxonomy, weighs the pros and cons of the selected research and discusses the data sets that are often used in caching strategy research. The paper also provides a further contribution, namely, practical instructions to help Web developers decide on a caching strategy.

Details

International Journal of Web Information Systems, vol. 16 no. 5
Type: Research Article
ISSN: 1744-0084

Article
Publication date: 19 June 2017

Tsuyoshi Donen, Shingo Otsubo, Ryo Nishide, Ian Piumarta and Hideyuki Takada

Abstract

Purpose

The purpose of this study is to reduce internet traffic when performing collaborative Web search. Mobile terminals are now in widespread use and people are increasingly using them for collaborative Web search to achieve a common goal. When performing such searches, the authors want to reduce internet traffic as much as possible, for example, to avoid bandwidth throttling that occurs when data usage exceeds a certain quota.

Design/methodology/approach

To reduce internet traffic, the authors use a proxy system based on the peer cache mechanism. The proxy shares Web content stored on mobile terminals participating in an ad hoc Bluetooth network, focusing on content that is accessed multiple times from different terminals. The proxy’s effectiveness was evaluated in experiments designed to replicate realistic usage scenarios.
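
As a rough illustration of the lookup order such a proxy implies (local cache, then peer caches reachable over the ad hoc network, then the internet), the sketch below uses plain dictionaries for the caches and a generic fetch callback; none of these names come from the authors’ system.

```python
def resolve(url, local_cache, peer_caches, fetch_from_internet):
    """Serve a request with as little internet traffic as possible.
    Illustrative only: caches are plain dicts, peers are assumed reachable."""
    if url in local_cache:
        return local_cache[url]              # served locally, no traffic at all
    for peer in peer_caches:                 # transferred over Bluetooth only
        if url in peer:
            local_cache[url] = peer[url]     # keep a copy for later requests
            return peer[url]
    content = fetch_from_internet(url)       # internet traffic is incurred here
    local_cache[url] = content
    return content
```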

Findings

Experimental results show that the proxy reduces internet traffic by approximately 20 per cent when four people collaboratively search the Web to find good restaurants for a social event.

Originality/value

Unlike previous work on co-operative Web proxies, the authors study a form of collaborative Web caching between mobile devices within an ad hoc Bluetooth network created specifically for the purpose of sharing cached content, acting orthogonally to (and independently of) traditional hierarchical Web caching.

Details

International Journal of Web Information Systems, vol. 13 no. 2
Type: Research Article
ISSN: 1744-0084

Article
Publication date: 1 August 2000

Bert J. Dempsey

Abstract

Every user of the World Wide Web understands why the WWW is often ridiculed as the World Wide Wait. The WWW and other applications on the Internet have been developed with a client-server orientation that, in its simplest form, involves a centralized information repository to which users (clients) send requests. This single-server model suffers from performance problems when clients are too numerous, when clients are physically far away in the network, when the materials being delivered become very large and hence stress the wide-area bandwidth, and when the information has a real-time delivery component, as with streaming audio and video materials. Engineering information delivery solutions that break the single-site model has become an important aspect of next-generation WWW delivery systems. This article intends to help the information professional understand what new directions the delivery infrastructure of the WWW is taking and why these technical changes will affect users around the globe, especially in bandwidth-poor areas of the Internet.

Details

The Electronic Library, vol. 18 no. 4
Type: Research Article
ISSN: 0264-0473

Article
Publication date: 1 May 2001

David C. Yen and David C. Chou

Abstract

Building intranets enhances organizational communication and information access. Intranets can be used to achieve the goals of business process reengineering and organizational innovation. This paper discusses the implications, benefits, concerns and challenges of building intranets for organizational innovation.

Details

Information Management & Computer Security, vol. 9 no. 2
Type: Research Article
ISSN: 0968-5227

Article
Publication date: 1 September 2005

Lin‐Chih Chen and Cheng‐Jye Luh

Abstract

Purpose

This study aims to present a new web page recommendation system that can help users to reduce navigational time on the internet.

Design/methodology/approach

The proposed design is based on the primacy effect of browsing behavior, namely that users prefer top-ranking items in search results. This approach is intuitive and requires no training data at all.

Findings

A user study showed that users are more satisfied with the proposed search methods than with general search engines using hot keywords. Moreover, two performance measures confirmed that the proposed search methods outperform other metasearch and search engines.

Research limitations/implications

The research has limitations and future work is planned along several directions. First, the search methods implemented are primarily based on the keyword match between the contents of web pages and the user query items. Using the semantic web to recommend concepts and items relevant to the user query might be very helpful in finding the exact contents that users want, particularly when the users do not have enough knowledge about the domains in which they are searching. Second, offering a mechanism that groups search results to improve the way search results are segmented and displayed also assists users to locate the contents they need. Finally, more user feedback is needed to fine‐tune the search parameters including α and β to improve the performance.

Practical implications

The proposed model can be used to improve the search performance of any search engine.

Originality/value

First, compared with the democratic voting procedure used by metasearch engines, search engine vector voting (SVV) enables a specific combination of search parameters, denoted as α and β, to be applied to a voted search engine, so that users can either narrow or expand their search results to meet their search preferences. Second, unlike page quality analysis, the hyperlink prediction (HLP) determines qualified pages by simply measuring their user behavior function (UBF) values, and thus takes less computing power. Finally, the advantages of HLP over statistical analysis are that it does not need training data, and it can target both multi‐site and site‐specific analysis.
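
The abstract does not spell out the SVV formula, so the sketch below only illustrates the general idea of rank-based voting with tunable parameters; the names alpha and beta mirror the α and β mentioned above, but their roles here are assumptions, not the authors’ definition.

```python
def rank_vote(engine_results, alpha=0.5, beta=1.0):
    """Generic rank-based voting across several search engines (NOT the
    paper's SVV definition). alpha scales each engine's vote and beta
    controls how quickly a result's weight decays with its rank position,
    reflecting the primacy effect that users prefer top-ranking items."""
    scores = {}
    for results in engine_results:                     # each engine: ordered list of URLs
        for rank, url in enumerate(results, start=1):
            scores[url] = scores.get(url, 0.0) + alpha / (rank ** beta)
    return sorted(scores, key=scores.get, reverse=True)

# Example: two engines with partially overlapping results
merged = rank_vote([["a.com", "b.com", "c.com"], ["b.com", "a.com", "d.com"]])
```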

Details

Internet Research, vol. 15 no. 4
Type: Research Article
ISSN: 1066-2243

Article
Publication date: 10 July 2007

Jenny Craven and Annika Nietzio

Abstract

Purpose

The purpose of this paper is to describe research undertaken for the European Internet Accessibility Observatory (EIAO) project. It aims to demonstrate how, using a task-based approach, statistical measures can be applied to an initial assessment of a web site’s accessibility, which could then be applied to further assessments to provide an evolving picture of the ongoing accessibility of a web site.

Design/methodology/approach

Task-based assessments were used to assess the accessibility of web sites, using quantitative and qualitative analysis. The findings from this approach were mapped onto a probabilistic model, developed to assess the probability of an accessibility barrier relating to a specific feature or features of a web site.

Findings

The paper finds that providing participants with a task instead of allowing them to randomly explore and evaluate a web site yielded more comparable results. For the EIAO project team, the benefit of the task-based approach was that it allowed them to compare the user testing results with the results of the automated testing tool developed by the project. From the aggregation models included in the analysis, the most appropriate model and parameters were selected, and adjustments were made according to the comparison outcome.

Research limitations/implications

Due to resource limitations and efficiency requirements, the assessments undertaken were limited to automatic evaluation, which could also be tested by the users. Therefore not all accessibility barriers in a web site could be identified. Despite this, it is felt that the outcome of the automatic analysis can be used as an indicator of the overall accessibility of the web site.

Originality/value

This paper provides a framework for web designers, commissioners and policy makers to undertake a user-focussed assessment of the accessibility of their web sites, which could be used in conjunction with other assessment methods.

Details

Performance Measurement and Metrics, vol. 8 no. 2
Type: Research Article
ISSN: 1467-8047

Article
Publication date: 9 November 2015

Mark Taylor, John Haggerty, David Gresty, Natalia Criado Pacheco, Tom Berry and Peter Almond

Abstract

Purpose

The purpose of this paper is to examine the process of investigation of employee harassment via social media to develop best practices to help organisations conduct such investigations more effectively.

Design/methodology/approach

The paper reviews the technical, managerial and legal literature to develop guidance for organisations conducting investigations of employee harassment via social media.

Findings

Organisations may not have effective procedures for the investigation of social media misuse, in general, and employee harassment via social media, in particular. This paper provides guidance for organisations to conduct investigation of employee harassment via social media more effectively.

Originality/value

The paper consolidates the fragmented discussion of investigating social media misuse, and employee harassment in particular, through a literature review across the technical, managerial and legal disciplines. It provides guidance to support organisations in conducting investigations of employee harassment via social media more effectively.

Details

Journal of Systems and Information Technology, vol. 17 no. 4
Type: Research Article
ISSN: 1328-7265

Article
Publication date: 4 April 2008

Sami Habib and Maytham Safar

Abstract

Purpose

The purpose of this paper is to propose a four‐level hierarchy model for multimedia documents representation to be used during the dynamic scheduling and altering of multimedia contents.

Design/methodology/approach

The four-level hierarchy model (object, operation, timing and precedence) offers a fine-grained representation of multimedia contents and is embedded within a research tool called WEBCAP. WEBCAP uses the four-level hierarchy to synchronize the retrieval of objects in the multimedia document by employing Allen’s temporal relations, and then applies the Bellman-Ford algorithm to the precedence graph to schedule all operations (fetch, transmit, process and render) while satisfying in-time updating and all Web workload resource constraints.
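
As a minimal sketch of the scheduling step described above, the code below computes earliest start times over a precedence graph by Bellman-Ford-style relaxation; the operation names and durations are invented for illustration and do not reflect WEBCAP’s actual data structures or constraint handling.

```python
def earliest_start_times(durations, precedence):
    """Relax precedence edges (Bellman-Ford style) until the earliest start
    time of every operation is consistent with its predecessors.
    durations: operation -> execution time; precedence: list of (before, after)."""
    start = {op: 0.0 for op in durations}
    for _ in range(len(durations) - 1):          # at most |V|-1 relaxation rounds
        changed = False
        for before, after in precedence:
            finish = start[before] + durations[before]
            if finish > start[after]:            # 'after' cannot begin before 'before' ends
                start[after] = finish
                changed = True
        if not changed:
            break
    return start

# One object's pipeline: fetch -> transmit -> process -> render (times in seconds, invented)
ops = {"fetch": 0.12, "transmit": 0.30, "process": 0.05, "render": 0.02}
chain = [("fetch", "transmit"), ("transmit", "process"), ("process", "render")]
print(earliest_start_times(ops, chain))          # {'fetch': 0.0, 'transmit': 0.12, ...}
```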

Findings

The experimental results demonstrate the effectiveness of the model in scheduling the periodic updating of multimedia documents while considering a variety of Web/TCP workloads.

Research limitations/implications

WEBCAP should be enhanced to automatically measure and/or approximate the available bandwidth of the system using sophisticated measurement of end‐to‐end connectivity. In addition, WEBCAP should be expanded and enhanced to examine system infrastructure for more real‐time applications, such as tele‐medicine and e‐learning.

Practical implications

WEBCAP can be used as an XML markup language for describing multimedia presentations. It can be used to create online presentations similar to PowerPoint in a desktop environment, or as an interactive e-learning tool. An HTML browser may use a WEBCAP plug-in to display a WEBCAP document embedded in an HTML/XML page.

Originality/value

This paper proposed dynamic scheduling of frequently updated multimedia documents that takes the network’s workload into consideration to reduce the packet loss ratio in the TCP flow, especially in the early stages. WEBCAP can be used to guide distributed-system designers/managers in scheduling or tuning their resources for optimal or near-optimal performance, minimizing the cost of document retrieval while satisfying the in-time constraints.

Details

International Journal of Web Information Systems, vol. 4 no. 1
Type: Research Article
ISSN: 1744-0084

Article
Publication date: 5 September 2008

Seda Ozmutlu and Gencer C. Cosar

Abstract

Purpose

Identification of topic changes within a user search session is a key issue in content analysis of search engine user queries. Recently, various studies have focused on new topic identification/session identification of search engine transaction logs, and several problems regarding the estimation of topic shifts and continuations were observed in these studies. This study aims to analyze the reasons for the problems that were encountered as a result of applying automatic new topic identification.

Design/methodology/approach

Measures, such as cleaning the data of common words and analyzing the errors of automatic new topic identification, are applied to eliminate the problems in estimating topic shifts and continuations.
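
A toy version of the cleaning step and of a term-overlap heuristic for spotting topic shifts is sketched below; the stopword list and the zero-overlap rule are assumptions for illustration, not the estimator evaluated in the paper.

```python
STOPWORDS = {"the", "a", "an", "and", "or", "of", "in", "on", "for", "to", "how", "what", "is"}

def content_terms(query):
    """Clean a query of common words so they do not mask real topic changes."""
    return {term for term in query.lower().split() if term not in STOPWORDS}

def looks_like_topic_shift(previous_query, query):
    """Toy heuristic (not the paper's estimator): flag a new topic when two
    consecutive queries share no content terms after cleaning."""
    return not (content_terms(previous_query) & content_terms(query))

# Example session
print(looks_like_topic_shift("cheap flights to rome", "rome hotels"))      # False: same topic
print(looks_like_topic_shift("cheap flights to rome", "python tutorial"))  # True: topic shift
```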

Findings

The findings show that the resulting errors of automatic new topic identification have a pattern, and further research is required to improve the performance of automatic new topic identification.

Originality/value

Improving the performance of automatic new topic identification would be valuable to search engine designers, so that they can develop new clustering and query recommendation algorithms, as well as custom‐tailored graphical user interfaces for search engine users.

Details

Library Hi Tech, vol. 26 no. 3
Type: Research Article
ISSN: 0737-8831

Article
Publication date: 4 September 2009

Linda Cloete

Details

Library Hi Tech, vol. 27 no. 3
Type: Research Article
ISSN: 0737-8831
