Search results

1 – 10 of 34
Article
Publication date: 6 September 2021

Sivaraman Eswaran, Vakula Rani, Daniel D., Jayabrabu Ramakrishnan and Sadhana Selvakumar

Abstract

Purpose

In the recent era, banking infrastructure has built various remotely accessible platforms for users. However, the security risk to the banking sector has also risen, as is evident from the growing number of reported attacks against these security systems. Threat intelligence shows that crawler-based cyberattacks are increasing. Malicious crawlers can crawl Web pages, crack passwords and harvest users' private data. Moreover, intrusion detection systems in dynamic environments produce more false positives. The purpose of this research paper is to propose an efficient methodology that detects these attacks while producing a low rate of false positives.

Design/methodology/approach

In this research, the authors have developed an efficient approach for malicious crawler detection and correlated the resulting security alerts. The behavioral features of crawlers are examined to recognize malicious crawlers, and a novel methodology is proposed to improve the security of the bank user portal. The authors have compared various machine learning strategies, including the Bayesian network, support vector machine (SVM) and decision tree.
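
A minimal sketch of such a comparison (not the authors' code), assuming a feature matrix X of per-session behavioral attributes and labels y marking sessions as benign or malicious crawlers; GaussianNB stands in here for the Bayesian network:

```python
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB          # stand-in for the Bayesian network
from sklearn.svm import SVC                         # support vector machine
from sklearn.tree import DecisionTreeClassifier

def compare_classifiers(X, y):
    """Return the mean 5-fold cross-validated accuracy of each candidate model."""
    models = {
        "bayesian": GaussianNB(),
        "svm": SVC(kernel="rbf"),
        "decision_tree": DecisionTreeClassifier(max_depth=5),
    }
    return {name: cross_val_score(model, X, y, cv=5).mean()
            for name, model in models.items()}
```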

Findings

The proposed work spans several aspects. Initially, results are reported for a mixture of different kinds of log files. Then, distinct sites from the various log files are selected to construct suitable data sets. Session identification, attribute extraction, session labeling and classification are then performed. Moreover, the approach clusters alerts into higher-level meta-alerts to fuse multistage attacks and the various attack types.
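
As an illustration of the session identification and attribute extraction steps (a sketch under assumed log fields, not the paper's implementation), consecutive requests from the same client can be grouped into sessions with a timeout heuristic and summarized into behavioral attributes:

```python
from datetime import timedelta

SESSION_TIMEOUT = timedelta(minutes=30)  # common sessionization heuristic for web logs

def sessionize(entries):
    """Group parsed log entries (ip, timestamp, url, user_agent) into per-client sessions."""
    sessions = {}
    for ip, ts, url, agent in sorted(entries, key=lambda e: (e[0], e[1])):
        client = sessions.setdefault(ip, [[]])
        if client[-1] and ts - client[-1][-1][0] > SESSION_TIMEOUT:
            client.append([])          # gap too large: start a new session
        client[-1].append((ts, url, agent))
    return sessions

def session_attributes(session):
    """Behavioral attributes often used to separate crawlers from human visitors."""
    duration = (session[-1][0] - session[0][0]).total_seconds() or 1.0
    return {
        "requests": len(session),
        "requests_per_second": len(session) / duration,
        "robots_txt_hits": sum(1 for _, url, _ in session if url.endswith("robots.txt")),
    }
```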

Originality/value

This methodology used incremental clustering techniques and analyzed the probability of existing topologies in SVM classifiers for more deterministic classification. It also enhanced the taxonomy for various domains.

Details

International Journal of Pervasive Computing and Communications, vol. 18 no. 1
Type: Research Article
ISSN: 1742-7371

Content available
Article
Publication date: 24 January 2022

A. Pasumponpandian, Robert Bestak, Klimis Ntalianis and Ram Palanisamy

Abstract

Details

International Journal of Pervasive Computing and Communications, vol. 18 no. 1
Type: Research Article
ISSN: 1742-7371

Article
Publication date: 14 May 2018

Robert Fox

Abstract

Purpose

This paper aims to describe several methods of exposing website information to Web crawlers so that value-added services can be provided to patrons.

Design/methodology/approach

This is a conceptual paper exploring the areas of search engine optimization (SEO) and usability in the context of search engines.

Findings

Not applicable

Originality/value

This paper explains several methods that can be used to appropriately expose website content and library services to Web crawlers so that services and content can be syndicated via the corresponding search engines.
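
One common way of exposing content to crawlers, shown here as a hedged illustration rather than a method prescribed by the paper, is to publish an XML sitemap and reference it from robots.txt; the library URLs below are hypothetical:

```python
from xml.sax.saxutils import escape

def build_sitemap(urls):
    """Render a minimal XML sitemap that search engine crawlers can fetch and index."""
    entries = "\n".join(f"  <url><loc>{escape(u)}</loc></url>" for u in urls)
    return ('<?xml version="1.0" encoding="UTF-8"?>\n'
            '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
            f"{entries}\n"
            "</urlset>")

# robots.txt pointing crawlers at the sitemap (hypothetical domain)
ROBOTS_TXT = "User-agent: *\nAllow: /\nSitemap: https://library.example.org/sitemap.xml\n"

print(build_sitemap(["https://library.example.org/catalog/record/1",
                     "https://library.example.org/hours"]))
```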

Details

Digital Library Perspectives, vol. 34 no. 2
Type: Research Article
ISSN: 2059-5816

Article
Publication date: 1 March 2006

L. Kazatzopoulos, C. Delakouridis, G.F. Marias and P. Georgiadis

Abstract

Purpose

The purpose of this paper is to propose the use of priority‐based incentives for collaborative hiding of confidential information in dynamic environments, such as self‐organized networks, peer‐to‐peer systems, pervasive and grid computing applications.

Design/methodology/approach

The paper documents the necessity of ISSON (Incentives for Secret‐sharing in Self‐Organised Networks); it provides functional and technical details on the proposed architecture; and it assesses its feasibility in mobile ad‐hoc networks through real experiments. The paper elaborates on the availability of the hidden information through an analytical framework.
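
The abstract does not specify ISSON's secret-sharing scheme, but the kind of primitive such architectures build on can be illustrated with a threshold (Shamir) sharing sketch, in which a secret is divided across peers and any threshold-sized subset can reconstruct it; this is an assumption-laden illustration, not ISSON itself:

```python
import random

PRIME = 2**127 - 1  # field modulus for the share arithmetic

def _eval_poly(coeffs, x, prime):
    """Evaluate a polynomial (lowest-degree coefficient first) at x, modulo prime."""
    result = 0
    for c in reversed(coeffs):
        result = (result * x + c) % prime
    return result

def split_secret(secret, n_shares, threshold, prime=PRIME):
    """Split an integer secret into n_shares; any threshold-sized subset reconstructs it."""
    coeffs = [secret] + [random.randrange(prime) for _ in range(threshold - 1)]
    return [(x, _eval_poly(coeffs, x, prime)) for x in range(1, n_shares + 1)]

def reconstruct(shares, prime=PRIME):
    """Recover the secret via Lagrange interpolation at x = 0."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = (num * -xj) % prime
                den = (den * (xi - xj)) % prime
        secret = (secret + yi * num * pow(den, -1, prime)) % prime
    return secret

shares = split_secret(123456789, n_shares=5, threshold=3)  # five peers hold shares
assert reconstruct(shares[:3]) == 123456789                # any three recover the secret
```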

Findings

Through the real experiments, ISSON was found to be efficient in terms of communication and processing costs. Additionally, it prevents collusion aimed at the unauthorized disclosure of the hidden information, and it ensures the unlinkability and availability of the secret when it is divided and stored across peers.

Originality/value

The proposed, incentive‐based, privacy enforcement architecture is novel and applies to distributed, dynamic, and self‐configured computing environments.

Details

Internet Research, vol. 16 no. 2
Type: Research Article
ISSN: 1066-2243

Article
Publication date: 12 April 2013

Herbert Zuze and Melius Weideman

Abstract

Purpose

The purpose of this research project was to determine how the three biggest search engines interpret keyword stuffing as a negative design element.

Design/methodology/approach

This research was based on triangulation between scholarly reports, search engine claims, SEO practitioner views and empirical evidence on the interpretation of keyword stuffing. Five websites with varying keyword densities were designed and submitted to Google, Yahoo! and Bing. The experiment was run in two phases, and the responses of the search engines were recorded.
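
Keyword density, the quantity varied across the five test websites, is simply the share of body-text words accounted for by the target keyword; a minimal sketch (not the authors' measurement code):

```python
import re
from collections import Counter

def keyword_density(body_text, keyword):
    """Return keyword occurrences as a percentage of all words in the body text."""
    words = re.findall(r"[a-z0-9']+", body_text.lower())
    if not words:
        return 0.0
    return 100.0 * Counter(words)[keyword.lower()] / len(words)

print(keyword_density("buy cheap flights cheap hotels cheap deals", "cheap"))  # ~42.9
```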

Findings

Scholars have expressed differing views on spamdexing, characterised by different keyword density measurements in the body text of a webpage. During both phases, almost all the test webpages, including the one with a 97.3 per cent keyword density, were indexed.

Research limitations/implications

Only the three biggest search engines were considered, and monitoring was done for a set time only. The claims that high keyword densities will lead to blacklisting have been refuted.

Originality/value

Websites should be designed with high-quality, well-written content. Even though keyword stuffing is unlikely to lead to search engine penalties, it could deter human visitors and reduce website value.

Details

Online Information Review, vol. 37 no. 2
Type: Research Article
ISSN: 1468-4527

Article
Publication date: 1 March 2006

Aameek Singh, Bugra Gedik and Ling Liu

Abstract

Purpose

To provide mutual anonymity over traditionally un-anonymous Distributed Hash Table (DHT)-based peer-to-peer overlay networks, while maintaining the desired scalability and guaranteed lookup properties of DHTs.

Design/methodology/approach

Agyaat uses a novel hybrid-overlay design, a fully decentralized topology without any trusted proxies. It anonymizes both the querying and responding peers through the use of unstructured topologies, called clouds, which are added onto the structured overlays. In addition, it regulates the cloud topologies to ensure the guaranteed location of data and the scalability of routing. A unique characteristic of the design is the ability of users to trade off desired anonymity against performance. The paper presents a thorough performance and anonymity analysis of the system, and also analyzes a few anonymity-compromising attacks and countermeasures.
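
For context, the guaranteed-lookup property that the clouds are layered on comes from the underlying DHT, where every key deterministically maps to a responsible node; a minimal consistent-hashing sketch (illustrative only, not Agyaat's protocol):

```python
import hashlib
from bisect import bisect_right

def _hash(key):
    """Map a key or node identifier onto the ring, Chord-style, using SHA-1."""
    return int(hashlib.sha1(key.encode()).hexdigest(), 16)

class Ring:
    """A minimal consistent-hashing ring: each key is owned by its successor node."""
    def __init__(self, node_ids):
        self.nodes = sorted((_hash(n), n) for n in node_ids)

    def lookup(self, key):
        points = [h for h, _ in self.nodes]
        idx = bisect_right(points, _hash(key)) % len(self.nodes)
        return self.nodes[idx][1]

ring = Ring(["peer-a", "peer-b", "peer-c", "peer-d"])
print(ring.lookup("some-data-key"))  # always resolves to the same responsible peer
```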

Findings

The results indicate that Agyaat is able to provide mutual anonymity while maintaining the scalability of lookups, affecting the costs only by a constant factor.

Research limitations/implications

While Agyaat is able to meet its mutual anonymity and performance goals, there exist other security vulnerabilities like possible Denial‐of‐Service (DoS) attacks, both due to its design and the underlying DHT overlay. This is fertile ground for future work.

Originality/value

Agyaat uses a novel topology architecture and associated protocols that are conducive to providing mutually anonymous services.

Details

Internet Research, vol. 16 no. 2
Type: Research Article
ISSN: 1066-2243

Article
Publication date: 1 January 1963

Abstract

To all our readers we can wish a happy and prosperous New Year with greater confidence than usual that the coming months will translate the familiar hope into reality. The reason for that optimism is National Productivity Year. There is growing evidence that the public understands and appreciates its central theme. Naturally the official Guildhall opening attracted wide publicity. Since then it has been reinforced through the addresses given by prominent personalities at meetings up and down the country; meetings which will continue and build up a favourable climate in industry.

Details

Work Study, vol. 12 no. 1
Type: Research Article
ISSN: 0043-8022

Open Access
Book part
Publication date: 4 June 2021

Briony Anderson and Mark A. Wood

Abstract

This chapter examines the phenomenon of doxxing: the practice of publishing private, proprietary, or personally identifying information on the internet, usually with malicious intent. Undertaking a scoping review of research into doxxing, we develop a typology of this form of technology-facilitated violence (TFV) that expands understandings of doxxing, its forms and its harms, beyond a taciturn discussion of privacy and harassment online. Building on David M. Douglas's typology of doxxing, our typology considers two key dimensions of doxxing: the form of loss experienced by the victim and the perpetrator's motivation(s) for undertaking this form of TFV. Through examining the extant literature on doxxing, we identify seven mutually non-exclusive motivations for this form of TFV: extortion, silencing, retribution, controlling, reputation-building, unintentional, and doxxing in the public interest. We conclude by identifying future areas for interdisciplinary research into doxxing that brings criminology into conversation with the insights of media-focused disciplines.

Details

The Emerald International Handbook of Technology-Facilitated Violence and Abuse
Type: Book
ISBN: 978-1-83982-849-2

Article
Publication date: 23 November 2012

Swapan Purkait

Abstract

Purpose

Phishing is essentially a social engineering crime on the Web, whose rampant occurrence and advancing techniques pose big challenges for researchers in both academia and industry. The purpose of this study is to examine the available phishing literature and phishing countermeasures to determine how research has evolved and advanced in terms of quantity, content and publication outlets. In addition, this paper aims to identify the important trends in phishing and its countermeasures and to provide a view of the research gaps that still prevail in this field of study.

Design/methodology/approach

This paper is a comprehensive literature review prepared after analyzing 16 doctoral theses and 358 papers in this field of research. The papers were analyzed based on their research focus, empirical basis on phishing and proposed countermeasures.

Findings

The findings reveal that the current anti-phishing approaches that have seen significant deployment over the internet can be classified into eight categories. The different approaches proposed so far are all preventive in nature. A phisher mainly targets innocent consumers, who happen to be the weakest link in the security chain, and various usability studies have found that neither server-side security indicators nor client-side toolbars and warnings succeed in preventing vulnerable users from being deceived.
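
As a hedged illustration of the client-side, heuristic style of countermeasure surveyed here (not a technique taken from the review itself), a toolbar-like check might flag structural red flags in a URL:

```python
import re
from urllib.parse import urlparse

SUSPICIOUS_KEYWORDS = ("login", "verify", "update", "secure", "account")

def phishing_heuristics(url):
    """Return a list of simple red flags raised by a URL (illustrative heuristics only)."""
    flags = []
    parsed = urlparse(url)
    host = parsed.hostname or ""
    if re.fullmatch(r"\d{1,3}(\.\d{1,3}){3}", host):
        flags.append("raw IP address instead of a domain name")
    if host.count(".") >= 4:
        flags.append("unusually deep subdomain nesting")
    if "@" in url:
        flags.append("'@' in the URL can hide the real destination")
    if any(k in url.lower() for k in SUSPICIOUS_KEYWORDS) and parsed.scheme != "https":
        flags.append("credential-related keyword served over plain HTTP")
    return flags

print(phishing_heuristics("http://192.168.0.1/secure/login"))
```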

Originality/value

Educating internet users about phishing, together with the implementation and proper application of anti-phishing measures, is a critical step in protecting the identities of online consumers against phishing attacks. Further research is required to evaluate the effectiveness of the available countermeasures against fresh phishing attacks. There is also a need to identify the factors that influence internet users' ability to correctly identify phishing websites.

Details

Information Management & Computer Security, vol. 20 no. 5
Type: Research Article
ISSN: 0968-5227

Article
Publication date: 18 December 2019

Konstantina Vemou and Maria Karyda

Abstract

Purpose

In the Web 2.0 era, users massively communicate through social networking services (SNS), often under the false expectation that their communications and personal data are private. This paper aims to analyze the privacy requirements of personal communications over a public medium.

Design/methodology/approach

This paper systematically analyzes SNS services as communication models and considers privacy as an attribute of users’ communication. A privacy threat analysis for each communication model is performed, based on misuse scenarios, to elicit privacy requirements per communication type.
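
A rough sketch of how such an analysis could be represented in code, with entirely hypothetical communication models and misuse scenarios standing in for the paper's own:

```python
from dataclasses import dataclass

@dataclass
class CommunicationModel:
    name: str              # e.g. private message, wall post, broadcast
    audience: str          # intended recipients
    metadata: list         # data observable by the platform or third parties

# Hypothetical misuse scenarios per communication model
MISUSE_SCENARIOS = {
    "private message": ["platform mines message content",
                        "metadata reveals the contact graph"],
    "wall post": ["audience wider than intended",
                  "third-party apps harvest posts"],
    "broadcast": ["profiling from public reactions"],
}

def elicit_requirements(model):
    """Derive one privacy requirement per misuse scenario of the given model."""
    return [f"mitigate: {scenario}" for scenario in MISUSE_SCENARIOS.get(model.name, [])]

msg = CommunicationModel("private message", audience="single recipient",
                         metadata=["timestamp", "sender", "recipient"])
print(elicit_requirements(msg))
```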

Findings

This paper identifies all communication attributes and privacy threats and provides a comprehensive list of privacy requirements concerning all stakeholders: platform providers, users and third parties.

Originality/value

Elicitation of privacy requirements focuses on the protection of both the communication's message and its metadata, and takes into account the public-private character of the medium (the SNS platform). The paper proposes a model of SNS functionality as communication patterns, along with a method for analyzing privacy threats. Moreover, a comprehensive set of privacy requirements is identified for SNS designers, third parties and users, covering the voluntary sharing of personal data, the role of the SNS platforms and the various types of communication instantiated in SNS.

Details

Information & Computer Security, vol. 28 no. 1
Type: Research Article
ISSN: 2056-4961
