Search results
1 – 10 of over 17,000

Peter Kieseberg, Sebastian Schrittwieser, Lorcan Morgan, Martin Mulazzani, Markus Huber and Edgar Weippl
Abstract
Purpose
Today's database management systems implement sophisticated access control mechanisms to prevent unauthorized access and modifications. For instance, this is an important basic requirement for SOX (Sarbanes‐Oxley Act) compliance, whereby every past transaction has to be traceable at any time. However, malicious database administrators may still be able to bypass the security mechanisms in order to make hidden modifications to the database. This paper aims to address these issues.
Design/methodology/approach
In this paper the authors define a novel signature of a B+‐tree, a widely‐used storage structure in database management systems, and propose its utilization for supporting the logging in databases. This additional logging mechanism is especially useful in conjunction with forensic techniques that directly target the underlying tree‐structure of an index. Several techniques for applying this signature in the context of digital forensics on B+‐trees are proposed in the course of this paper. Furthermore, the authors' signature can be used to generate exact copies of an index for backup purposes, thereby enabling the owner to completely restore data, even on the structural level.
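To make the idea concrete, a structural signature of this kind might be sketched as follows. This is a hypothetical illustration only: the node representation and the bottom-up hashing scheme are assumptions, not the authors' exact construction.

```python
# Hypothetical sketch of a structural signature for a B+-tree-like index.
# The node layout and hashing scheme are illustrative assumptions.
import hashlib

class Node:
    def __init__(self, keys, children=None):
        self.keys = keys                  # keys stored in this node
        self.children = children or []    # empty for leaf nodes

def signature(node):
    """Hash the tree bottom-up: a node's digest covers its key layout
    and the digests of its children, so any structural change (splits,
    merges, reordered pages) changes the root digest."""
    child_digests = [signature(c) for c in node.children]
    payload = repr(node.keys).encode() + b"".join(child_digests)
    return hashlib.sha256(payload).digest()

# Two indexes holding the same key set but with different node layouts:
tree_a = Node([3], [Node([1, 2]), Node([3, 4])])
tree_b = Node([2], [Node([1]), Node([2, 3, 4])])

sig_a = signature(tree_a)
sig_b = signature(tree_b)
```

Because child digests feed into parent digests, equal key sets with different internal layouts yield different root signatures, which is the property a forensic log of index structure would need.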
Findings
For database systems in enterprise environments, compliance with regulatory standards such as SOX (Sarbanes‐Oxley Act), whereby every past transaction has to be traceable at any time, is a fundamental requirement. Today's database management systems usually implement sophisticated access control mechanisms to prevent unauthorized access and modifications. Nonetheless, malicious database administrators may be able to bypass these security mechanisms in order to make modifications to the database while covering their tracks.
Originality/value
In this paper, the authors demonstrate how the tree structure of the underlying storage engine can be used to enhance forensic logging mechanisms of the database. They define a novel signature for B+‐trees, which are used by the InnoDB storage engine. This signature stores the structure of database storage files and can help in reconstructing previous versions of the file for forensic purposes. Furthermore, the authors' signature can be used to generate exact copies of an index for backup purposes, thus enabling the owner to completely restore data, even on the structural level. The authors applied their concept to four real‐life scenarios in order to evaluate its effectiveness.
Abstract
The purpose of this article is to present an overview of the history and development of transaction log analysis (TLA) in library and information science research. Organizing a literature review of the first twenty‐five years of TLA poses some challenges and requires some decisions. The primary organizing principle could be a strict chronology of the published research, the research questions addressed, the automated information retrieval (IR) systems that generated the data, the results gained, or even the researchers themselves. The group of active transaction log analyzers remains fairly small in number, and researchers who use transaction logs tend to use this method more than once, so tracing the development and refinement of individuals' uses of the methodology could provide insight into the progress of the method as a whole. For example, if we examine how researchers like W. David Penniman, John Tolle, Christine Borgman, Ray Larson, and Micheline Hancock‐Beaulieu have modified their own understandings and applications of the method over time, we may get an accurate sense of the development of all applications.
David Nicholas, Paul Huntington, Peter Williams, Nat Lievesley, Tom Dobrowolski and Richard Withey
Abstract
There is a general dearth of trustworthy information on who is using the web and how they use it. Such information is of vital concern to web managers and their advertisers yet the systems for delivering such data, where in place, generally cannot supply accurate enough data. Nor have web managers the expertise or time to evaluate the enormous amounts of information that are generated by web sites. The article, based on the experience of evaluating The Times web server access logs, describes the methodological problems that lie at the heart of web log analysis, evaluates a range of use measures (visits, page impressions, hits) and provides some advice on what analyses are worth conducting.
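The hit / page-impression / visit distinction the article evaluates can be sketched on server log lines. The Common Log Format layout and the 30-minute visit timeout below are common conventions assumed for illustration, not taken from the article.

```python
# Sketch of counting hits, page impressions, and visits from access
# log lines in Common Log Format (field layout assumed).
from datetime import datetime, timedelta

LOG_LINES = [
    '1.2.3.4 - - [10/Oct/2000:13:55:36 +0000] "GET /index.html HTTP/1.0" 200 2326',
    '1.2.3.4 - - [10/Oct/2000:13:55:37 +0000] "GET /logo.gif HTTP/1.0" 200 512',
    '5.6.7.8 - - [10/Oct/2000:14:10:00 +0000] "GET /news.html HTTP/1.0" 200 1024',
]

def parse(line):
    ip = line.split()[0]
    ts = datetime.strptime(line.split("[")[1].split("]")[0],
                           "%d/%b/%Y:%H:%M:%S %z")
    path = line.split('"')[1].split()[1]
    return ip, ts, path

hits = len(LOG_LINES)                    # every request, images included
pages = sum(1 for l in LOG_LINES         # only HTML counts as a page impression
            if parse(l)[2].endswith(".html"))

# A "visit": a run of requests from one IP with gaps under 30 minutes.
visits, last_seen = 0, {}
for ip, ts, _ in sorted(map(parse, LOG_LINES), key=lambda r: r[1]):
    if ip not in last_seen or ts - last_seen[ip] > timedelta(minutes=30):
        visits += 1
    last_seen[ip] = ts
```

The three measures diverge even on this tiny sample (three hits, two page impressions, two visits), which is why the choice of measure matters so much in web log analysis.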
Yanjun Zuo and Brajendra Panda
Abstract
Purpose
Damage assessment and recovery play key roles in the process of secure and reliable computer systems development. Post‐attack assessment in a distributed database system is rather complicated due to the indirect dependencies among sub‐transactions executed at different sites. Hence, the damage assessment procedure in these systems must be carried out in a collaborative way among all the participating sites in order to accurately detect all affected data items. This paper seeks to propose two approaches for achieving this, namely, centralized and peer‐to‐peer damage assessment models.
Design/methodology/approach
Each of the two proposed methods should be applied immediately after an intrusion on a distributed database system is reported. Within the centralized model, three sub‐models are further discussed, each of which is best suited to a particular type of situation in a distributed database system.
Findings
Advantages and disadvantages of the models are analyzed on a comparative basis, and the situations to which each model is best suited are presented. A set of algorithms is developed to formally describe the damage assessment procedure for each model (sub‐model). Since synchronization is essential in any system where multiple processes run concurrently, user‐level synchronization mechanisms are presented to ensure that the damage assessment operations are conducted in the correct order.
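The core of any such damage assessment is propagating contamination along read-from dependencies between transactions. A minimal single-site sketch of that propagation, with an invented log format and without the paper's distributed coordination, might look like this:

```python
# Hypothetical sketch of damage assessment: starting from the items
# written by a reported malicious transaction, follow read-from
# dependencies to mark every affected item. Log format is invented.

# Each entry: (transaction_id, items_read, items_written), commit order.
LOG = [
    ("T1", set(),      {"x"}),   # malicious: wrote x
    ("T2", {"x"},      {"y"}),   # read tainted x -> y is contaminated
    ("T3", {"a"},      {"b"}),   # independent, unaffected
    ("T4", {"y", "b"}, {"z"}),   # read contaminated y -> z affected
]

def assess(log, malicious):
    damaged = set()
    for tid, reads, writes in log:            # single forward scan
        if tid in malicious or reads & damaged:
            damaged |= writes                 # its writes become suspect
    return damaged

affected = assess(LOG, {"T1"})
```

In the distributed setting the paper addresses, the same scan must be coordinated across sites, since a sub-transaction at one site may read items written by a tainted sub-transaction at another.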
Originality/value
The paper proposes two models for damage assessment in distributed database systems: a centralized model and a peer‐to‐peer model.
A.S. Sodiya, H.O.D. Longe and A.T. Akinwale
Abstract
Researchers have used many techniques in designing intrusion detection systems (IDS) and yet we still do not have an effective IDS. The interest in this work is to combine techniques of data mining and expert systems in designing an effective anomaly‐based IDS. Combining methods may give better coverage, and make the detection more effective. The idea is to mine system audit data for consistent and useful patterns of user behaviour, and then keep these normal behaviours in profiles. An expert system is used as the detection system that recognizes anomalies and raises an alarm. The evaluation of the intrusion detection system design was carried out to justify the importance of the work.
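The profile-then-detect idea can be sketched as follows. The feature choice (command frequencies) and the thresholds are assumptions for illustration, not the authors' design.

```python
# Illustrative sketch of anomaly detection against mined user profiles.
from collections import Counter

def build_profile(audit_records):
    """Summarize normal behaviour as relative command frequencies."""
    counts = Counter(cmd for _, cmd in audit_records)
    total = sum(counts.values())
    return {cmd: n / total for cmd, n in counts.items()}

def is_anomalous(profile, session, rare_threshold=0.05, max_rare=0.5):
    """Expert-system-style rule: alarm if more than half of a session's
    commands are rare (or unseen) under the user's profile."""
    rare = sum(1 for cmd in session if profile.get(cmd, 0.0) < rare_threshold)
    return rare / len(session) > max_rare

history = [("alice", c) for c in ["ls", "ls", "cat", "vi", "ls",
                                  "cat", "vi", "ls", "make", "ls"]]
profile = build_profile(history)
normal = is_anomalous(profile, ["ls", "cat", "vi"])
suspicious = is_anomalous(profile, ["nmap", "nc", "chmod", "ls"])
```

A real system would mine richer patterns from the audit data and encode many such rules in the expert system, but the division of labour (mining builds profiles, rules raise alarms) is the one described above.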
Martin Botha and Rossouw von Solms
Abstract
A survey recently completed by the Computer Security Institute (CSI) and the Federal Bureau of Investigation (FBI) revealed that corporations, banks, and governments all face a growing threat from computer crime, and in particular computer hacking. Computer hacking activities caused well over US$100 million in losses last year in the USA, and the trend toward professional computer crime, such as computer hacking, is on the rise. Different methods are currently used to control the computer crime problem, for example, controlling access to and from a network by implementing a firewall. As the survey highlighted, most of these methods are insufficient, so new means and ways which will minimise and control the hacking problem must continuously be researched and defined. Proposes a method, using trend analysis, that could be utilized to minimise and control the hacking problem in an organisation.
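One simple form such trend analysis could take is comparing the latest value of a security indicator against a moving average of recent values. The indicator, numbers, and threshold factor below are made up for illustration; they are not from the paper.

```python
# Hedged sketch of trend analysis on a security indicator:
# alarm when the newest count far exceeds the recent baseline.

def trend_alert(series, window=5, factor=2.0):
    """Alarm when the newest value exceeds `factor` times the mean of
    the preceding `window` values - a crude upward-trend detector."""
    baseline = series[-window - 1:-1]
    avg = sum(baseline) / len(baseline)
    return series[-1] > factor * avg

failed_logins = [3, 4, 2, 5, 3, 4, 21]   # hypothetical daily counts
alert = trend_alert(failed_logins)       # 21 vs. baseline mean of 3.6
```

The same scheme applies to any indicator that hacking activity inflates over time, such as port scans, firewall denials, or off-hours logins.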
Hu Xia, Yan Fu, Junlin Zhou and Qi Xia
Abstract
Purpose
The purpose of this paper is to provide an intelligent spam filtering method to meet the real‐time processing requirement of the massive short message stream and reduce manual operation of the system.
Design/methodology/approach
An integrated framework based on a series of algorithms is proposed. The framework consists of a message filtering module, a log analysis module and a rules handling module, and dynamically filters short message spam while generating the filtering rules. Experiments implemented in Java are used to evaluate the proposed framework.
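The filter-and-learn loop behind such a framework can be sketched as follows (in Python rather than the paper's Java). The keyword rules, token mining, and threshold are illustrative assumptions, not the paper's algorithms.

```python
# Illustrative sketch of a filtering module plus a rules-handling
# module that promotes recurring tokens from reported spam into rules.
from collections import Counter

rules = {"prize", "winner"}              # current blocking keywords

def filter_message(text):
    """Filtering module: block when any rule keyword appears."""
    return "spam" if rules & set(text.lower().split()) else "ham"

def update_rules(reported_spam, min_count=2):
    """Rules-handling module: mine reported spam for tokens that recur
    often enough to become new filtering rules."""
    tokens = Counter(t for msg in reported_spam for t in msg.lower().split())
    rules.update(t for t, n in tokens.items() if n >= min_count)

before = filter_message("claim your free bonus now")
update_rules(["free bonus inside", "free bonus waiting"])
after = filter_message("claim your free bonus now")
```

Generating rules from the logs, as here, is what removes the manual rule maintenance that the abstract identifies as the bottleneck.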
Findings
The experiments are carried out both on the simulation model (off‐line) and on the actual plant (on‐line), using real short messages of both normal and spam types. The results show that the integrated framework achieves comparable accuracy and meets the real‐time filtering requirement.
Originality/value
The approach taken in the design of the filtering system is novel. In addition, the proposed integrated framework not only reduces the computational cost, leading to a high processing speed, but also filters spam messages with high accuracy.
Ahmad Rafee Che Kassim and Thomas R. Kochtanek
Abstract
This paper presents the current status in the development of the ongoing project now known as Project i‐DLR. The content of this "pointer site" includes resources pertaining to digital libraries organised using an educational framework for access. The paper describes the five‐stage evaluation of that educational digital library resource (www.coe.missouri.edu/rafee/iDLR/index.php). The focus of this particular effort is on the continued development and refinement based on the recent evaluations of this resource by end users seeking to access digital library resources. The five evaluation methods are presented and described, beginning with focus group reviews, Web log analysis, database transaction logs, a Web survey, and most recently, a remote usability evaluation. As the resource continues to grow in both breadth and depth, such analyses are critical to continued refinement of the interface, the sources themselves, and the manner in which they are organised and presented.
Abstract
Wireless technologies have enhanced application mobility in no small way. They have also created a new and increasing number of human‐related challenges, particularly in the area of wireless‐based applications such as Mobile Marketing (M‐Marketing). Bluetooth wireless technology is a comparatively new method through which devices within a short radius can communicate effectively. This paper explores the world of wireless technologies for marketing purposes, focusing on Bluetooth as an example, and builds a system that provides an interactive Bluetooth station for marketing purposes. The Bluetooth station includes a Bluetooth profile (OBEX), and server and client applications.
Joyce Chapman and David Woodbury
Abstract
Purpose
The purpose of this paper is to encourage administrators of device‐lending programs to leverage existing quantitative data for management purposes by integrating analysis of quantitative data into the day‐to‐day workflow.
Design/methodology/approach
This is a case study of NCSU Libraries' efforts to analyze and visualize transactional data to aid in the on‐going management of a device‐lending program.
Findings
Analysis and visualization of quantitative data related to technology lending revealed patterns in lending over the course of the semester, day, and week that had previously gone unrecognized. With more concrete data about trends in wait times, capacity lending, and circulation volume, staff are now able to make more informed purchasing decisions, modify systems and workflows to better meet user needs, and begin to explore new ideas for services and staffing models.
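The kind of transactional-data aggregation described above can be sketched in a few lines: bucket device checkouts by hour of day to expose lending patterns. The record format here is hypothetical, not NCSU's actual schema.

```python
# Sketch of aggregating device-lending transactions by hour of day.
from collections import Counter
from datetime import datetime

checkouts = [                         # (device, checkout timestamp)
    ("laptop",  "2011-09-12 10:15"),
    ("laptop",  "2011-09-12 14:40"),
    ("ipad",    "2011-09-12 14:05"),
    ("charger", "2011-09-13 14:55"),
]

by_hour = Counter(
    datetime.strptime(ts, "%Y-%m-%d %H:%M").hour for _, ts in checkouts
)
peak_hour, peak_count = by_hour.most_common(1)[0]
```

The same grouping by weekday or by week of semester, fed into a chart, yields the visualizations of daily and semester-long lending patterns the case study describes.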
Practical implications
The concepts and processes described here can be replicated by other libraries that wish to leverage transactional data analysis and data visualization to aid in management of a device‐lending program.
Originality/value
Although much literature exists on the implementation and qualitative evaluation of device‐lending programs, this paper is the first to provide librarians with ideas for leveraging analysis of transactional data to improve management of a device‐lending program.