Search results

1 – 10 of 58
Article
Publication date: 1 July 1992

Oded Maimon and Daniel Braha

The problem of design is to find the best mapping between specifications and a solution space. A key operation that is involved in the early stages of the design process is the…

Abstract

The problem of design is to find the best mapping between specifications and a solution space. A key operation involved in the early stages of the design process is synthesis. This short communication employs a framework previously published by the authors to formulate a simple variant of the design synthesis problem. The problem is proved to be computationally intractable, leading to the practical conclusion that heuristics should be sought.

Details

Kybernetes, vol. 21 no. 7
Type: Research Article
ISSN: 0368-492X


Article
Publication date: 31 December 2007

Eleftheria Katsiri, Jean Bacon and Alan Mycroft

The event‐driven paradigm is appropriate for context aware, distributed applications, yet basic events may be too low level to be meaningful to users. The authors aim to argue…

Abstract

Purpose

The event‐driven paradigm is appropriate for context-aware, distributed applications, yet basic events may be too low level to be meaningful to users. The authors aim to argue that this bottom‐up approach is insufficient to handle very low‐level sensor data or to express all the queries users might wish to make.

Design/methodology/approach

The authors propose an alternative model for querying and subscribing transparently to distributed state in a real‐time, ubiquitous, sensor‐driven environment such as is found in Sentient Computing.

Findings

The framework consists of four components: a state‐based, temporal first‐order logic (TFOL) model that represents the concrete state of the world, as perceived by sensors; an expressive TFOL‐based language, the Abstract Event Specification Language (AESL), for creating abstract event definitions, subscriptions and queries; a higher-order service (Abstract Event Detection Service) that accepts a subscription containing an abstract event definition as an argument and in return publishes an interface to a further service, an abstract event detector; and a satisfiability service that applies classical, logical satisfiability in order to check the satisfiability of the AESL definitions against the world model, in a manner similar to a constraint‐satisfaction problem.

Originality/value

The paper develops a model‐based approach, appropriate for distributed, heterogeneous environments.
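As a rough illustration of the kind of check the satisfiability service performs, the sketch below evaluates an abstract event, defined as a conjunction of predicates, against a snapshot of sensor-derived world state. All names here (Predicate, hot_and_occupied, world_state) are hypothetical; this is not AESL or the Abstract Event Detection Service itself, only a minimal stand-in for the idea.

```python
# Minimal sketch: checking a conjunctive abstract-event definition against a
# world-model snapshot. Names and thresholds are invented for illustration.
from dataclasses import dataclass
from typing import Callable, Dict, List

State = Dict[str, float]  # sensor-derived facts, e.g. {"temp.room1": 27.0}

@dataclass
class Predicate:
    name: str
    test: Callable[[State], bool]

def satisfied(definition: List[Predicate], state: State) -> bool:
    """An abstract event 'fires' when every predicate in its definition
    holds in the current world state (conjunctive definitions only)."""
    return all(p.test(state) for p in definition)

# A toy abstract event: "room1 is hot and occupied".
hot_and_occupied = [
    Predicate("hot(room1)", lambda s: s.get("temp.room1", 0.0) > 26.0),
    Predicate("occupied(room1)", lambda s: s.get("presence.room1", 0.0) >= 1.0),
]

world_state = {"temp.room1": 27.4, "presence.room1": 1.0}
print(satisfied(hot_and_occupied, world_state))  # True
```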

Details

International Journal of Pervasive Computing and Communications, vol. 3 no. 4
Type: Research Article
ISSN: 1742-7371


Article
Publication date: 25 February 2019

Celia Hireche and Habiba Drias

This paper is an extended version of Hireche and Drias (2018) presented at the WORLD-CIST’18 conference. The major contribution, in this work, is defined in two phases. First of…

Abstract

Purpose

This paper is an extended version of Hireche and Drias (2018), presented at the WORLD-CIST’18 conference. The contribution of this work is defined in two phases. The first is the use of data mining technologies, especially data preprocessing tools, on instances of hard and complex problems prior to their resolution; the authors focus on clustering the instance with the aim of reducing its complexity. The second phase is to solve the instance using the knowledge acquired in the first step together with problem-solving methods.

Design/methodology/approach

Because different clustering techniques may offer different results for a data set, prior knowledge of the data helps to determine the adequate type of clustering to apply. The first part of this work is therefore a study of the descriptive characteristics of the data, in order to understand the data better. In particular, the dispersion and distribution of the variables in the problem instances are explored to determine the most suitable clustering technique to apply.

Findings

Several experiments were performed on instances of different kinds and with different data distributions. The results show the importance and efficiency of applying the proposed preprocessing approaches prior to problem solving.

Practical implications

The proposed approach is developed, in this paper, on the Boolean satisfiability problem because of its well-recognised importance. The aim is complexity reduction, which allows an easier resolution of the latter problem and, in particular, substantial time savings.

Originality/value

The state of the art in problem solving offers plenty of algorithms and solvers for hard problems, which nonetheless remain a challenge because of their complexity. The originality of this work lies in the investigation of appropriate preprocessing techniques that tackle and overcome this complexity prior to resolution, which then becomes easier and yields substantial time savings.
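To make the idea of preprocessing-by-clustering concrete, here is a deliberately simplified sketch rather than the authors' method: it splits a CNF formula into the connected components of its variable-interaction graph and solves each sub-instance independently, which is sound because a formula is satisfiable exactly when every component is. A brute-force solver keeps the sketch self-contained; the point is the reduction in instance size before solving.

```python
# Decomposition-style preprocessing for SAT (illustrative only): group clauses
# into connected components of shared variables, then solve each component.
from itertools import product

def components(clauses):
    """Union clauses that share a variable into independent sub-instances."""
    parent = {}
    def find(x):
        parent.setdefault(x, x)
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x
    def union(a, b):
        parent[find(a)] = find(b)
    for clause in clauses:
        vs = [abs(l) for l in clause]
        for v in vs[1:]:
            union(vs[0], v)
    groups = {}
    for clause in clauses:
        groups.setdefault(find(abs(clause[0])), []).append(clause)
    return list(groups.values())

def brute_force_sat(clauses):
    vars_ = sorted({abs(l) for c in clauses for l in c})
    for bits in product([False, True], repeat=len(vars_)):
        assign = dict(zip(vars_, bits))
        if all(any(assign[abs(l)] == (l > 0) for l in c) for c in clauses):
            return True
    return False

# (x1 ∨ x2) ∧ (¬x2 ∨ x3) is independent of (x4 ∨ ¬x5).
cnf = [[1, 2], [-2, 3], [4, -5]]
print(all(brute_force_sat(sub) for sub in components(cnf)))  # True
```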

Details

Data Technologies and Applications, vol. 53 no. 1
Type: Research Article
ISSN: 2514-9288


Article
Publication date: 16 November 2018

Yasmine Lahsinat, Dalila Boughaci and Belaid Benhamou

This paper aims to describe two enhancements of the variable neighbourhood search (VNS) algorithm to solve efficiently the minimum interference frequency assignment problem

Abstract

Purpose

This paper aims to describe two enhancements of the variable neighbourhood search (VNS) algorithm to solve efficiently the minimum interference frequency assignment problem (MI-FAP), a major issue in radio networks as well as a well-known NP-hard combinatorial optimisation problem. The challenge is to assign a frequency to each transceiver of the network with limited interference, or none at all. Indeed, given that the number of radio network users is ever increasing and that the radio spectrum is a scarce and expensive resource, the spectrum must be carefully managed to avoid interference.

Design/methodology/approach

The authors suggest two new enhanced VNS variants for MI-FAP, namely, the iterated VNS (It-VNS) and the breakout VNS (BVNS). These two algorithms are designed around hybridisation and collaboration, two approaches that have emerged as powerful means of solving hard combinatorial optimisation problems; they therefore draw their strength from other meta-heuristics. In addition, the authors introduce a new perturbation mechanism to enhance the performance of VNS. An extensive experiment was conducted to evaluate the performance of the proposed methods on some well-known MI-FAP datasets. Moreover, the authors carried out a comparative study with other meta-heuristics and performed Friedman’s non-parametric statistical test to check the actual effect of the proposed enhancements.

Findings

The experiments showed that the two enhanced methods, It-VNS and BVNS, achieved better results than the VNS method. The comparative study with other meta-heuristics showed that the results are competitive and very encouraging. Friedman’s non-parametric statistical test clearly reveals that the results of the three methods (It-VNS, BVNS and VNS) are significantly different. The authors therefore carried out Nemenyi’s post hoc test, which allowed them to identify those differences. The impact of the introduced changes on both It-VNS and BVNS was thus confirmed. The proposed BVNS is competitive and able to produce good results compared with both It-VNS and VNS for MI-FAP.

Research limitations/implications

The proposed methods, and particularly the newly designed ones, may have drawbacks that weaken the results, in particular when dealing with extensive data. These limitations should therefore be addressed through an appropriate approach, with a view to designing methods suited to large-scale data.

Practical implications

The authors designed and implemented two new variants of the VNS algorithm before carrying out an exhaustive experimental study. The findings highlight the potential of these two enhanced methods, which could be adapted and applied to other combinatorial optimisation problems, real-world applications or academic problems.

Originality/value

This paper aims at enhancing the VNS algorithm through two new approaches, namely, the It-VNS and the BVNS. These two methods were applied to MI-FAP, a crucial problem arising in radio networks. The numerical results are interesting and demonstrate the benefits of the proposed approaches, in particular BVNS, for MI-FAP.
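For readers unfamiliar with the base algorithm, the sketch below shows a plain VNS loop on a toy MI-FAP encoding, in which interference is penalised whenever constrained transceivers receive frequencies closer than a required separation. The data structures and figures are assumptions for illustration; this is the generic VNS skeleton, not the paper's It-VNS or BVNS.

```python
# Generic VNS skeleton on a toy MI-FAP: assign one frequency per transceiver
# so that constrained pairs keep a minimum spectral separation.
import random

FREQS = list(range(1, 12))
# (transceiver_a, transceiver_b, required_separation, penalty)
CONSTRAINTS = [(0, 1, 3, 5), (1, 2, 2, 4), (0, 2, 1, 2), (2, 3, 3, 6)]
N_TRX = 4

def interference(assign):
    return sum(p for a, b, sep, p in CONSTRAINTS
               if abs(assign[a] - assign[b]) < sep)

def shake(assign, k):
    """Neighbourhood k: reassign k randomly chosen transceivers."""
    out = assign[:]
    for t in random.sample(range(N_TRX), k):
        out[t] = random.choice(FREQS)
    return out

def local_search(assign):
    """First-improvement descent over single-frequency changes."""
    improved = True
    while improved:
        improved = False
        for t in range(N_TRX):
            for f in FREQS:
                cand = assign[:]
                cand[t] = f
                if interference(cand) < interference(assign):
                    assign, improved = cand, True
    return assign

def vns(k_max=3, iters=200):
    best = [random.choice(FREQS) for _ in range(N_TRX)]
    for _ in range(iters):
        k = 1
        while k <= k_max:
            cand = local_search(shake(best, k))
            if interference(cand) < interference(best):
                best, k = cand, 1      # move and restart neighbourhoods
            else:
                k += 1                 # try a larger neighbourhood
    return best, interference(best)

random.seed(0)
print(vns())
```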

Details

Journal of Systems and Information Technology, vol. 20 no. 4
Type: Research Article
ISSN: 1328-7265


Article
Publication date: 3 November 2014

John H Drake, Matthew Hyde, Khaled Ibrahim and Ender Ozcan

Hyper-heuristics are a class of high-level search techniques which operate on a search space of heuristics rather than directly on a search space of solutions. The purpose of this…

Abstract

Purpose

Hyper-heuristics are a class of high-level search techniques which operate on a search space of heuristics rather than directly on a search space of solutions. The purpose of this paper is to investigate the suitability of using genetic programming as a hyper-heuristic methodology to generate constructive heuristics to solve the multidimensional 0-1 knapsack problem.

Design/methodology/approach

Early hyper-heuristics focused on selecting and applying a low-level heuristic at each stage of a search. Recent trends in hyper-heuristic research have led to a number of approaches being developed to automatically generate new heuristics from a set of heuristic components. A population of heuristics to rank knapsack items is trained on a subset of test problems and then applied to unseen instances.

Findings

The results over a set of standard benchmarks show that genetic programming can be used to generate constructive heuristics which yield human-competitive results.

Originality/value

In this work the authors show that genetic programming is suitable as a method to generate reusable constructive heuristics for the multidimensional 0-1 knapsack problem. This is classified as a hyper-heuristic approach as it operates on a search space of heuristics rather than a search space of solutions. To the authors' knowledge, this is the first time in the literature a GP hyper-heuristic has been used to solve the multidimensional 0-1 knapsack problem. The results suggest that using GP to evolve ranking mechanisms merits further research effort.
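As a point of reference for what a generated constructive heuristic does, the sketch below packs items for a multidimensional 0-1 knapsack in the order given by a scoring function. The hand-written score (profit over aggregate resource usage) merely stands in for the kind of expression a GP hyper-heuristic would evolve; the data are invented for illustration.

```python
# Constructive heuristic for the multidimensional 0-1 knapsack problem:
# rank items with a scoring function, then greedily add feasible items.
# The fixed score below is a stand-in for an evolved GP expression.

def constructive_pack(profits, weights, capacities, score):
    n, m = len(profits), len(capacities)
    order = sorted(range(n), key=lambda i: score(profits[i], weights[i]),
                   reverse=True)
    used = [0] * m
    chosen, value = [], 0
    for i in order:
        if all(used[d] + weights[i][d] <= capacities[d] for d in range(m)):
            for d in range(m):
                used[d] += weights[i][d]
            chosen.append(i)
            value += profits[i]
    return chosen, value

# Hand-written ranking rule: profit per unit of total resource consumption.
score = lambda p, w: p / (1 + sum(w))

profits = [10, 7, 12, 4]
weights = [[3, 5], [2, 2], [6, 4], [1, 1]]  # one row per item, one column per dimension
capacities = [8, 8]
print(constructive_pack(profits, weights, capacities, score))  # ([1, 3, 0], 21)
```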

Article
Publication date: 31 December 2006

Hooman Homayounfar and Fangju Wang

XML is becoming one of the most important structures for data exchange on the web. Despite having many advantages, XML structure imposes several major obstacles to large document…

Abstract

XML is becoming one of the most important structures for data exchange on the web. Despite having many advantages, XML structure imposes several major obstacles to large document processing. The inconsistency between the linear nature of the current algorithms (e.g. for caching and prefetching) used in operating systems and databases and the non‐linear structure of XML data makes XML processing more costly. In addition to verbosity (e.g. tag redundancy), interpreting (i.e. parsing) the depth-first (DF) structure of XML documents is a significant overhead for processing applications (e.g. query engines). Recent research on XML query processing has shown that sibling clustering can improve performance significantly. However, the existing clustering methods are not able to avoid parsing overhead, as they are limited by larger document sizes. In this research, we have developed a better data organization for native XML databases, named the sibling‐first (SF) format, which improves query performance significantly. SF uses an embedded index for fast access to child nodes. It also compresses documents by eliminating extra information from the original DF format. The converted SF documents can be processed for XPath query purposes without being parsed. We have implemented the SF storage in virtual memory as well as in a format on disk. Experimental results with real data have shown that significantly higher performance can be achieved when XPath queries are conducted on very large SF documents.
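To give an intuition for a sibling-first layout (this is a toy reconstruction of the idea, not the paper's actual SF storage format), the sketch below lays a tree out level by level so that all children of a node are contiguous and reachable through a stored first-child offset, without scanning the node's descendants as a depth-first layout would require.

```python
# Toy sibling-first layout: nodes stored level by level, each record keeping
# an offset to its first child and a child count, so a query can jump to a
# node's children without walking its whole subtree. Intuition sketch only.
from collections import deque

def sibling_first_layout(root):
    """root is (tag, [children]); returns (tag, first_child_index, n_children)
    records in sibling-first (level) order."""
    records = []
    queue = deque([root])
    while queue:
        tag, kids = queue.popleft()
        # Children are appended after everything already queued, so the first
        # child's index is: records written + nodes still queued + 1.
        first = len(records) + len(queue) + 1 if kids else -1
        records.append((tag, first, len(kids)))
        queue.extend(kids)
    return records

doc = ("book", [("title", []), ("chapter", [("para", [])]), ("chapter", [])])
for i, rec in enumerate(sibling_first_layout(doc)):
    print(i, rec)
# 0 ('book', 1, 3) ... children of "book" occupy indices 1-3 contiguously.
```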

Details

International Journal of Web Information Systems, vol. 2 no. 3/4
Type: Research Article
ISSN: 1744-0084


Article
Publication date: 8 October 2018

Nikolaos Argyropoulos, Konstantinos Angelopoulos, Haralambos Mouratidis and Andrew Fish

The selection of security configurations for complex information systems is a cumbersome process. Decision-making regarding the choice of security countermeasures has to take into…

Abstract

Purpose

The selection of security configurations for complex information systems is a cumbersome process. Decision-making regarding the choice of security countermeasures has to take into consideration a multitude of, often conflicting, functional and non-functional system goals. Therefore, a structured method to support crucial security decisions during a system’s design that can take account of risk whilst providing feedback on the optimal decisions within specific scenarios would be valuable.

Design/methodology/approach

Secure Tropos is a well-established security requirements engineering methodology, but it has no concept of risk, whilst constrained goal models are an existing method to support relevant automated reasoning tasks. Hence, we bridge these methods by extending Secure Tropos to incorporate the concept of risk, so that the elicitation and analysis of security requirements can be complemented by a systematic risk assessment process during a system’s design time, and so that reasoning about the selection of optimal security configurations, with respect to multiple system objectives and constraints, is supported via constrained goal models.

Findings

As a means of conceptual evaluation, to give an idea of the applicability of the approach and to check whether alterations may be desirable, a case study of its application to an e-government information system is presented. The proposed approach is able to generate security mechanism configurations for the multiple optimisation scenarios that are provided, whilst there are limitations in terms of a natural trade-off in the level of risk assessment information that must be elicited.

Originality/value

The proposed approach adds value through its flexibility in permitting the consideration of different optimisation scenarios, by prioritising different system goals, and through its automated reasoning support.
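The sketch below gives a flavour of the kind of automated reasoning involved: a toy brute-force search over countermeasure configurations that maximises the security goals covered subject to a cost budget and a residual-risk cap. All names and figures are invented for illustration; this is not the Secure Tropos or constrained-goal-model tooling, which uses dedicated reasoners rather than enumeration.

```python
# Toy constrained selection of security countermeasures: maximise the number
# of covered security goals subject to a cost budget and a residual-risk cap.
from itertools import combinations

# countermeasure -> (cost, risk reduction, security goals it satisfies)
CM = {
    "tls":       (2, 0.20, {"confidentiality"}),
    "audit_log": (3, 0.15, {"accountability"}),
    "mfa":       (4, 0.30, {"authentication", "confidentiality"}),
    "backup":    (2, 0.10, {"availability"}),
}
BASE_RISK, BUDGET, RISK_CAP = 0.9, 7, 0.5

def best_configuration():
    best_combo, best_goals = (), set()
    for size in range(len(CM) + 1):
        for combo in combinations(CM, size):
            cost = sum(CM[c][0] for c in combo)
            risk = BASE_RISK - sum(CM[c][1] for c in combo)
            goals = set().union(*(CM[c][2] for c in combo)) if combo else set()
            if cost <= BUDGET and risk <= RISK_CAP and len(goals) > len(best_goals):
                best_combo, best_goals = combo, goals
    return best_combo, best_goals

print(best_configuration())  # e.g. (('audit_log', 'mfa'), {...3 goals...})
```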

Details

Information & Computer Security, vol. 26 no. 4
Type: Research Article
ISSN: 2056-4961


Article
Publication date: 1 March 1980

John R. King and Alexander S. Spachis

Scheduling is defined by Baker as, “the allocation of resources over time to perform a collection of tasks”. The term facilities is often used instead of resources and the tasks…

Abstract

Scheduling is defined by Baker as “the allocation of resources over time to perform a collection of tasks”. The term facilities is often used instead of resources, and the tasks to be performed may involve a variety of different operations.

Details

International Journal of Physical Distribution & Materials Management, vol. 10 no. 3
Type: Research Article
ISSN: 0269-8218

Article
Publication date: 31 July 2023

Jinzhong Li, Ming Cong, Dong Liu and Yu Du

Robots face fundamental challenges in achieving reliable and stable operations for complex home service scenarios. This is one of the crucial topics of robotics methods to imitate…

Abstract

Purpose

Robots face fundamental challenges in achieving reliable and stable operation in complex home service scenarios. Imitating human beings’ advanced cognitive characteristics and applying them to solve complex tasks is one of the crucial topics in robotics. The purpose of this study is to enable robots to understand the scene and the task process in complex scenes and to provide a reference method for robot task programming in such scenes.

Design/methodology/approach

This paper constructs a task modeling method for robots in complex environments based on the characteristics of the perception-motor memory model of human cognition. For episodic memory construction, the task execution process is described within the framework of qualitative spatio-temporal calculus. The topological interaction of objects in a task scenario is used to define scene attributes, so the task process can be regarded as a change of scene attributes over a time scale. Qualitative spatio-temporal activity graphs are used to analyze how the state of each object changes over time during task execution, and the task is divided according to the different values of the scene attributes at different times. Based on this, in procedural memory, an object-centered motion model is developed by analyzing how the relationships between objects in the scene change before and after the robot performs its actions. Finally, the task execution process of the robot is constructed by alternately reconstructing episodic memory and procedural memory.

Findings

To verify the applicability of the proposed model, a scenario in which the robot combines objects (one of the most common tasks in home service) is set up. The proposed method can obtain the landscape of robot tasks in a complex environment.

Originality/value

The robot can achieve high-level task programming through the alternating interpretation of scenarios and actions. The proposed model differs from traditional methods based on geometric or physical feature information: instead, it focuses on the spatial relationships between objects, which is closer to the cognitive mechanism by which humans understand the environment.
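A small sketch of the underlying idea, under a deliberately simplified assumption of a 2D world with axis-aligned boxes: compute a qualitative topological relation between two objects in each frame and cut the task into episodes whenever that scene attribute changes. This illustrates qualitative spatio-temporal segmentation in general, not the paper's specific model.

```python
# Sketch of qualitative spatio-temporal segmentation: classify the relation
# between two axis-aligned boxes per frame and split the task into segments
# whenever the relation (the "scene attribute") changes. Illustration only.

def relation(a, b):
    """a, b = (xmin, ymin, xmax, ymax). Returns a coarse RCC-like label."""
    ax0, ay0, ax1, ay1 = a
    bx0, by0, bx1, by1 = b
    if ax1 < bx0 or bx1 < ax0 or ay1 < by0 or by1 < ay0:
        return "disconnected"
    if ax0 >= bx0 and ay0 >= by0 and ax1 <= bx1 and ay1 <= by1:
        return "inside"
    if ax1 == bx0 or bx1 == ax0 or ay1 == by0 or by1 == ay0:
        return "touching"
    return "overlapping"

def segment(frames):
    """frames: list of (box_hand, box_cup) per time step ->
    list of (relation, start_index, end_index) episodes."""
    segments = []
    for t, (hand, cup) in enumerate(frames):
        rel = relation(hand, cup)
        if segments and segments[-1][0] == rel:
            segments[-1] = (rel, segments[-1][1], t)
        else:
            segments.append((rel, t, t))
    return segments

# Hand approaches, touches, then overlaps (grasps) a cup.
frames = [((0, 0, 1, 1), (5, 0, 6, 1)),
          ((3, 0, 4, 1), (5, 0, 6, 1)),
          ((4, 0, 5, 1), (5, 0, 6, 1)),
          ((4.5, 0, 5.5, 1), (5, 0, 6, 1))]
print(segment(frames))
# [('disconnected', 0, 1), ('touching', 2, 2), ('overlapping', 3, 3)]
```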

Article
Publication date: 12 October 2012

Zhixiang Yin, Jianzhong Cui, Yan Yang, Yin Ma, Wei Wang, Jin Yang and Xia Sun

The bottleneck of current DNA computing paradigms based on brute‐force search strategy is that initial solution space grows exponentially with problem size, thus only trivial…


Abstract

Purpose

The bottleneck of current DNA computing paradigms based on a brute‐force search strategy is that the initial solution space grows exponentially with problem size, so only trivial instances of NP‐complete problems can be solved. The purpose of this paper is to present a novel molecular program based on sticker models for solving dominating set problems.

Design/methodology/approach

The authors do not synthesize an initial solution pool containing every possible candidate solution, as previously reported algorithms do. Instead, solution DNA molecules for the problem of interest are constructed during the course of the computation.

Findings

It is shown that “exponential explosions” inherent in current DNA computing paradigms may be overcome in this way.

Originality/value

The paper proposes an error‐resistant DNA algorithm based on the sticker model for solving minimum dominating set problems.
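For context on the problem being attacked (independently of the DNA encoding, which this sketch does not attempt to reproduce), a minimum dominating set of a graph is a smallest vertex set such that every vertex is either in the set or adjacent to it. The conventional brute-force baseline below makes the exponential cost concrete; it is precisely this blow-up that motivates non-trivial approaches such as the sticker-model algorithm.

```python
# Conventional brute-force minimum dominating set, shown only to make the
# target problem concrete; it enumerates vertex subsets by increasing size.
from itertools import combinations

def is_dominating(graph, subset):
    """Every vertex is in `subset` or has a neighbour in it."""
    covered = set(subset)
    for v in subset:
        covered.update(graph[v])
    return covered == set(graph)

def minimum_dominating_set(graph):
    vertices = list(graph)
    for size in range(1, len(vertices) + 1):
        for subset in combinations(vertices, size):
            if is_dominating(graph, subset):
                return set(subset)

# A 5-cycle: two well-placed vertices dominate all five.
graph = {0: {1, 4}, 1: {0, 2}, 2: {1, 3}, 3: {2, 4}, 4: {3, 0}}
print(minimum_dominating_set(graph))  # {0, 2}
```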
