Search results

1–10 of over 29,000
Article
Publication date: 7 August 2017

Eun-Suk Yang, Jong Dae Kim, Chan-Young Park, Hye-Jeong Song and Yu-Seop Kim

Abstract

Purpose

This paper examines the problem of a nonlinear model – specifically the hidden unit conditional random fields (HUCRFs) model, which has binary stochastic hidden units between the data and the labels – exhibiting unstable performance depending on the hyperparameters under consideration.

Design/methodology/approach

There are three main search methods for hyperparameter tuning: manual search, grid search and random search. This study shows that the performance of HUCRFs is unstable, depending on the hyperparameter values used, and tunes the model using both grid and random searches. All experiments used n-gram features – specifically, unigrams, bigrams and trigrams.
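
As a hedged illustration of the two tuning strategies compared in this study, the sketch below contrasts grid search (exhaustive over a value list) with random search (a fixed sampling budget). The hyperparameter names, value lists and the stand-in scoring function are invented for illustration; the HUCRF training loop itself is not reproduced.

```python
import itertools
import random

# Stand-in for "train an HUCRF with these hyperparameters and return
# validation accuracy"; the real model is not reproduced here.
def evaluate(params):
    lr, l2, hidden = params["lr"], params["l2"], params["hidden"]
    return 1.0 / (1.0 + abs(lr - 0.05) + abs(l2 - 0.001) + abs(hidden - 64) / 64)

# Hypothetical hyperparameter space (names and values are assumptions).
space = {
    "lr": [0.001, 0.01, 0.05, 0.1],
    "l2": [0.0001, 0.001, 0.01],
    "hidden": [32, 64, 128],
}

# Grid search: evaluate every combination (4 x 3 x 3 = 36 runs).
grid = [dict(zip(space, vals)) for vals in itertools.product(*space.values())]
best_grid = max(grid, key=evaluate)

# Random search: sample a fixed budget of combinations (here 10 runs).
random.seed(0)
sampled = [{k: random.choice(v) for k, v in space.items()} for _ in range(10)]
best_random = max(sampled, key=evaluate)

print("grid best:  ", best_grid)
print("random best:", best_random)
```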

Findings

Naturally, selecting a list of hyperparameter values based on a researcher's experience and searching it for the best-performing set is better than drawing values from a probability distribution. Realistically, however, it is impossible to evaluate all combinations of the parameters. The present research indicates that the random search method performs better than the grid search method while requiring shorter computation time and lower cost.

Originality/value

This paper examines the issues affecting the performance of HUCRF, a nonlinear model whose performance varies with its hyperparameters but which performs better than CRF.

Details

Engineering Computations, vol. 34 no. 6
Type: Research Article
ISSN: 0264-4401

Article
Publication date: 19 March 2024

Claire K. Wan and Mingchang Chih

Abstract

Purpose

We argue that a fundamental issue regarding how to search and how to switch between different cognitive modes lies in the decision rules that influence the dynamics of learning and exploration. We examine the search logics underlying these decision rules and propose conceptual prompts that can be applied mentally or computationally to aid managers’ decision-making.

Design/methodology/approach

By applying Multi-Armed Bandit (MAB) modeling to simulate agents’ interaction with dynamic environments, we compared the patterns and performance of selected MAB algorithms under different configurations of environmental conditions.
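
As a rough sketch of such a simulation, the following compares a simple heuristic-based strategy (epsilon-greedy) with an uncertainty-based one (UCB1) on a toy Bernoulli bandit. The arm payoff probabilities, horizon and exploration parameters are invented; the paper's environmental configurations are not reproduced.

```python
import math
import random

# Toy sketch only: arm payoff probabilities, horizon and exploration
# parameters are assumptions, not the paper's configurations.
ARMS = [0.3, 0.5, 0.7]    # Bernoulli reward probability of each arm
HORIZON = 10_000

def pull(arm):
    return 1.0 if random.random() < ARMS[arm] else 0.0

def run(select):
    counts = [0] * len(ARMS)      # pulls per arm
    values = [0.0] * len(ARMS)    # running mean reward per arm
    total = 0.0
    for t in range(1, HORIZON + 1):
        arm = select(t, counts, values)
        reward = pull(arm)
        counts[arm] += 1
        values[arm] += (reward - values[arm]) / counts[arm]
        total += reward
    return total / HORIZON

def eps_greedy(t, counts, values, eps=0.1):
    # Simple heuristic: explore uniformly with probability eps.
    if random.random() < eps:
        return random.randrange(len(values))
    return max(range(len(values)), key=values.__getitem__)

def ucb1(t, counts, values):
    # Uncertainty-based: add a confidence bonus to each arm's mean.
    for arm, c in enumerate(counts):
        if c == 0:
            return arm            # play every arm once first
    return max(range(len(values)),
               key=lambda a: values[a] + math.sqrt(2 * math.log(t) / counts[a]))

random.seed(0)
print("epsilon-greedy mean reward:", run(eps_greedy))
print("UCB1 mean reward:          ", run(ucb1))
```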

Findings

We develop three conceptual prompts. First, the simple heuristic-based exploration strategy works well in conditions of low environmental variability and few alternatives. Second, an exploration strategy that combines simple and de-biasing heuristics is suitable for most dynamic and complex decision environments. Third, the uncertainty-based exploration strategy is more applicable under high environmental unpredictability, as it can more effectively recognize deviating patterns.

Research limitations/implications

This study contributes to emerging research on using algorithms to develop novel concepts and on combining heuristics and algorithmic intelligence in strategic decision-making.

Practical implications

This study offers the insights that managers can apply a range of exploration strategies conceptually and that the adaptability of cognitively distant search may be underestimated in turbulent environments.

Originality/value

Drawing on insights from machine learning and cognitive psychology research, we demonstrate the fitness of different exploration strategies in different dynamic environmental configurations by comparing the different search logics that underlie the three MAB algorithms.

Details

Management Decision, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 0025-1747

Book part
Publication date: 25 July 2011

Lawrence F. Rossow and Jacqueline A. Stefkovich

Abstract

Searching public school students has been a Constitutional reality since the landmark decision New Jersey v. T.L.O. in 1985. The law in this area of students’ rights has expanded greatly, including everything from locker searches involving canines to random drug testing of students involved in sports and extracurricular activities to highly intrusive personal searches. As recently as April 2009, the U.S. Supreme Court decided that the strip search of a middle school student for ibuprofen was illegal, but because school authorities would not necessarily have known they were violating the student's Constitutional rights, the school was immune from paying money damages. Thousands of searches of all kinds are conducted every day in schools across the country. Many of those searches are legal, but not all. Whether legal or not, are those searches ethical? Is an illegal search of a student per se unethical because it violates the Ethic of Care? If a search is legal, can it nevertheless conform to any standard of ethics? Does searching a student violate the Ethic of Care or the Ethic of Critique?

Details

Leadership in Education, Corrections and Law Enforcement: A Commitment to Ethics, Equity and Excellence
Type: Book
ISBN: 978-1-78052-185-5

Article
Publication date: 1 May 2002

Mike Thelwall

Abstract

There have been many attempts to study the content of the Web, either through human or automatic agents. Describes five different previously used Web survey methodologies, each justifiable in its own right, but presents a simple experiment that demonstrates concrete differences between them. The concept of crawling the Web also bears further inspection, including the scope of the pages to crawl, the method used to access and index each page, and the algorithm for the identification of duplicate pages. The issues involved here will be well‐known to many computer scientists but, with the increasing use of crawlers and search engines in other disciplines, they now require a public discussion in the wider research community. Concludes that any scientific attempt to crawl the Web must make available the parameters under which it is operating so that researchers can, in principle, replicate experiments or be aware of and take into account differences between methodologies. Also introduces a new hybrid random page selection methodology.
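
To make the duplicate-identification point concrete, here is one minimal way a crawler might test for duplicate pages: hashing markup-stripped, whitespace-normalised text. This is an assumed sketch, not the paper's method; production crawlers typically use more robust near-duplicate techniques such as shingling, which is exactly why the abstract argues the chosen algorithm should be reported.

```python
import hashlib
import re

# One possible duplicate test (an assumption, not the paper's method):
# hash the markup-stripped, whitespace-normalised text of each page.
def content_fingerprint(html: str) -> str:
    text = re.sub(r"<[^>]+>", " ", html)        # crude tag stripping
    text = re.sub(r"\s+", " ", text).strip().lower()
    return hashlib.sha256(text.encode("utf-8")).hexdigest()

seen = {}  # fingerprint -> first URL observed with that content

def is_duplicate(url: str, html: str) -> bool:
    fp = content_fingerprint(html)
    if fp in seen:
        return True      # same content reached via a different URL
    seen[fp] = url
    return False

print(is_duplicate("http://a.example/", "<p>Hello  World</p>"))  # False
print(is_duplicate("http://b.example/", "<p>hello world</p>"))   # True
```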

Details

Internet Research, vol. 12 no. 2
Type: Research Article
ISSN: 1066-2243

Article
Publication date: 1 June 2000

Mike Thelwall

Abstract

In the UK, millions are now online and many are prepared to use the Internet to make and influence purchasing decisions. Businesses should, therefore, consider whether the Internet could provide them with a new marketing opportunity. Although increasing numbers of businesses now have a website, there seems to be a quality problem that is leading to missed opportunities, particularly for smaller enterprises. This belief is backed up by an automated survey of 3,802 predominantly small UK business sites, believed to be by far the largest of its kind to date. Analysis of the results reveals widespread problems in relation to search engines. Most Internet users find new sites through search engines, yet over half of the sites checked were not registered in the largest one, Yahoo!, and could therefore be missing a sizeable percentage of potential customers. The underlying problem with business sites is the lack of maturity of the medium as evidenced by the focus on technological issues amongst designers and the inevitable lack of Web‐business experience of managers. Designers need to take seriously the usability of the site, its design and its ability to meet the business goals of the client. These issues are perhaps being taken up less than in the related discipline of software engineering, probably owing to the relative ease of website creation. Managers need to dictate the objectives of their site, but also, in the current climate, cannot rely even on professional website design companies and must be capable of evaluating the quality of their site themselves. Finally, educators need to ensure that these issues are emphasised to the next generation of designers and managers in order that the full potential of the Internet for business can be realised.

Details

Journal of Small Business and Enterprise Development, vol. 7 no. 2
Type: Research Article
ISSN: 1462-6004

Article
Publication date: 24 August 2010

Tushar Jain, Srinivasan Alavandar, Singh Vivekkumar Radhamohan and M.J. Nigam

Abstract

Purpose

The purpose of this paper is to propose a novel algorithm that hybridizes the best features of three basic algorithms – genetic algorithm, bacterial foraging and particle swarm optimization (PSO) – into genetically bacterial swarm optimization (GBSO). The implementation of GBSO is illustrated by designing the fuzzy pre-compensated PD (FPPD) control for a two-link rigid-flexible manipulator.

Design/methodology/approach

The hybridization is carried out in two phases: first, the diversity in searching for the optimal solution is increased using selection, crossover and mutation operators; second, the search direction vector is optimized using PSO to enhance the convergence rate of the fitness function in achieving optimality. The FPPD controller design objective was to tune the PD controller constants and the normalization and denormalization factors for both joints so that the integral square error, overshoots and undershoots are minimized.
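
A loose sketch of this two-phase idea follows: GA-style selection, crossover and mutation diversify the population, after which a PSO-style velocity update steers individuals toward the best solutions found. The test function, operator rates and PSO coefficients are invented, and the bacterial foraging component and the actual FPPD tuning objective are omitted.

```python
import random

# Loose sketch of the two-phase hybrid only: the paper's exact GBSO
# update rules, the bacterial foraging component and the FPPD tuning
# objective are not reproduced. All constants below are assumptions.
DIM, POP, GENS = 4, 30, 100
W, C1, C2 = 0.7, 1.5, 1.5          # assumed PSO coefficients

def fitness(x):                     # sphere test function (minimise)
    return sum(v * v for v in x)

random.seed(0)
pop = [[random.uniform(-5, 5) for _ in range(DIM)] for _ in range(POP)]
vel = [[0.0] * DIM for _ in range(POP)]
pbest = [p[:] for p in pop]         # best position seen in each slot
gbest = min(pop, key=fitness)

for _ in range(GENS):
    # Phase 1 (GA): tournament selection, one-point crossover and
    # mutation increase the diversity of the search.
    nxt = []
    while len(nxt) < POP:
        a = min(random.sample(pop, 2), key=fitness)
        b = min(random.sample(pop, 2), key=fitness)
        cut = random.randrange(1, DIM)
        child = a[:cut] + b[cut:]
        if random.random() < 0.1:
            child[random.randrange(DIM)] += random.gauss(0.0, 0.5)
        nxt.append(child)
    pop = nxt
    # Phase 2 (PSO): optimise the search direction toward the personal
    # and global bests to speed up convergence.
    for i, p in enumerate(pop):
        for d in range(DIM):
            vel[i][d] = (W * vel[i][d]
                         + C1 * random.random() * (pbest[i][d] - p[d])
                         + C2 * random.random() * (gbest[d] - p[d]))
            p[d] += vel[i][d]
        if fitness(p) < fitness(pbest[i]):
            pbest[i] = p[:]
    gbest = min(pbest, key=fitness)

print("best fitness found:", fitness(gbest))
```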

Findings

The proposed algorithm is tested on a set of mathematical functions and compared with the basic algorithms. The results showed that GBSO had a better convergence rate than the other algorithms in reaching the optimal solution. The fuzzy pre-compensator also successfully reduced overshoots and undershoots for loading-unloading and circular trajectories compared with a simple PD controller. The results presented emphasize that satisfactory tracking precision can be achieved using the hybrid FPPD controller with GBSO.

Originality/value

Simulation results were reported, and the proposed algorithm established superiority over the basic algorithms with respect to the set of functions considered; it can easily be extended to other global optimization problems. The proposed FPPD controller tuning approach is interesting for the design of controllers for inherently unstable high-order systems.

Details

International Journal of Intelligent Computing and Cybernetics, vol. 3 no. 3
Type: Research Article
ISSN: 1756-378X

Article
Publication date: 7 October 2013

M. Vaz Jr, E.L. Cardoso and J. Stahlschmidt

Abstract

Purpose

Parameter identification is a technique which aims at determining material or other process parameters based on a combination of experimental and numerical techniques. In recent years, heuristic approaches, such as genetic algorithms (GAs), have been proposed as possible alternatives to classical identification procedures. The present work shows that particle swarm optimization (PSO), as an example of such methods, is also appropriate for the identification of inelastic parameters. The paper aims to discuss these issues.

Design/methodology/approach

PSO is a class of swarm intelligence algorithms which attempts to reproduce the social behaviour of a generic population. In parameter identification, each individual particle is associated with hyper-coordinates in the search space, corresponding to a set of material parameters, upon which velocity operators with random components are applied, leading the particles to cluster together at convergence.
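
A minimal sketch of such an identification run, under invented data: each particle holds a candidate parameter pair (E, n) for a toy hardening law sigma = E * eps^n, and the objective is the squared misfit to synthetic "experimental" points. The constitutive law, data and coefficients are illustrative assumptions, not the paper's.

```python
import random

# Minimal sketch under invented data: the hardening law, parameters and
# coefficients below are illustrative assumptions, not the paper's.
random.seed(1)
TRUE_E, TRUE_N = 200.0, 0.25                        # "unknown" parameters
strains = [0.01 * i for i in range(1, 11)]
measured = [TRUE_E * e ** TRUE_N for e in strains]  # synthetic data

def misfit(p):
    # Squared error between the toy model sigma = E * eps**n and data.
    E, n = p
    return sum((E * e ** n - m) ** 2 for e, m in zip(strains, measured))

POP, ITERS = 20, 200
W, C1, C2 = 0.7, 1.5, 1.5
pos = [[random.uniform(50, 400), random.uniform(0.05, 0.9)]
       for _ in range(POP)]
vel = [[0.0, 0.0] for _ in range(POP)]
pbest = [p[:] for p in pos]
gbest = min(pbest, key=misfit)

for _ in range(ITERS):
    for i, p in enumerate(pos):
        for d in range(2):
            # Velocity operator with random components.
            vel[i][d] = (W * vel[i][d]
                         + C1 * random.random() * (pbest[i][d] - p[d])
                         + C2 * random.random() * (gbest[d] - p[d]))
            p[d] += vel[i][d]
        if misfit(p) < misfit(pbest[i]):
            pbest[i] = p[:]
    gbest = min(pbest, key=misfit)

print("identified (E, n):", [round(v, 3) for v in gbest])
```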

Findings

PSO has proved to be a viable alternative for the identification of inelastic parameters owing to its robustness (achieving the global minimum with high tolerance for variations of the population size and control parameters) and, in contrast to GAs, its higher convergence rate and small number of control variables.

Originality/value

PSO has been mostly applied to electrical and industrial engineering. This paper extends the field of application of the method to identification of inelastic material parameters.

Details

Engineering Computations, vol. 30 no. 7
Type: Research Article
ISSN: 0264-4401

Article
Publication date: 20 November 2019

Gaetano Lisi

Abstract

Purpose

This paper aims to study the phenomenon known as “house price dispersion”, one of the most important distinctive features of housing markets. House price dispersion refers to the phenomenon whereby two houses with very similar attributes, in nearby locations, are sold at the same time but at very different prices.

Design/methodology/approach

This theoretical paper makes use of a search and matching model of the housing market. The search and matching models are the benchmark models of the “matching” markets, such as the labour market and the housing market, where trade is a decentralised, uncoordinated and time-consuming economic activity.

Findings

Unlike the previous related literature, which attributes a significant part of the price volatility to the heterogeneity of buyers and sellers, in this paper the house price dispersion depends on the housing tenure status of home-seekers in the house search process. Indeed, in the presence of different housing tenure statuses of home-seekers, the house search process leads to different types of matching. In turn, this implies different surpluses (the sum of the net gains of the parties involved in the trade), and eventually, different surpluses produce different equilibrium prices.
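
One standard way such models map surpluses into prices is Nash bargaining; the notation below is an illustrative assumption, not taken from the paper.

```latex
% Illustrative notation (not the paper's): S_i is the total surplus of a
% match with a home-seeker of tenure status i, \beta the seller's
% bargaining power and v_s the seller's reservation value.
p_i = v_s + \beta S_i
```

With two tenure statuses generating surpluses S_1 ≠ S_2, two otherwise identical houses trade at prices p_1 ≠ p_2, which is the dispersion mechanism described above.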

Research limitations/implications

An interesting research agenda for future works would be an extension of the model to study the effect of “online housing search” on the house search and matching process, and thus, on the house price dispersion.

Practical implications

The main practical implication of this work is that the house price dispersion is an inherent phenomenon in the house search and matching process.

Originality/value

None of the existing related research has considered how to take advantage of the search and matching approach to deal with the phenomenon known as “house price dispersion” without relying on the ex ante heterogeneity of the parties, looking instead at the “core” of the house search and matching process.

Details

Journal of European Real Estate Research, vol. 12 no. 3
Type: Research Article
ISSN: 1753-9269

Article
Publication date: 30 October 2018

Satyabrata Dash, Sukanta Dey, Deepak Joshi and Gaurav Trivedi

Abstract

Purpose

The purpose of this paper is to demonstrate the application of river formation dynamics to sizing the widths of a power distribution network for very large-scale integration designs so that the wire area required by the power rails is minimized. The area minimization problem is transformed into a single-objective optimization problem subject to various design constraints, such as IR drop and electromigration constraints.

Design/methodology/approach

The minimization process is carried out using the river formation dynamics heuristic. Its random probabilistic search strategy is used to work through stringent design requirements and minimize the wire area of an over-designed power distribution network.
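
The sketch below illustrates the generic river formation dynamics heuristic on a toy graph, not the paper's power-grid formulation: drops flow from a source to a sink, choosing edges with probability proportional to the downhill gradient, and nodes on shorter successful paths are eroded more strongly, which reinforces good routes for later drops. The graph, altitudes and erosion constants are invented.

```python
import random

# Generic sketch of river formation dynamics on an invented toy graph;
# the paper's power-grid objective and constraints are not reproduced.
random.seed(0)
EDGES = {                      # node -> {neighbour: edge length}
    "S": {"A": 1.0, "B": 4.0},
    "A": {"T": 5.0, "B": 1.0},
    "B": {"T": 1.0},
    "T": {},
}
alt = {n: 10.0 for n in EDGES}
alt["T"] = 0.0                 # the sink stays at altitude zero

def gradient(u, v):
    return max(alt[u] - alt[v], 0.0) / EDGES[u][v]

for _ in range(5000):          # release drops one at a time
    node, path, cost = "S", ["S"], 0.0
    while node != "T" and EDGES[node]:
        nbrs = list(EDGES[node])
        grads = [gradient(node, v) for v in nbrs]
        total = sum(grads)
        if total == 0.0:                       # flat: explore uniformly
            nxt = random.choice(nbrs)
        else:                                  # roulette-wheel choice
            r, acc, nxt = random.random() * total, 0.0, nbrs[-1]
            for v, g in zip(nbrs, grads):
                acc += g
                if r <= acc:
                    nxt = v
                    break
        cost += EDGES[node][nxt]
        node = nxt
        path.append(node)
    if node == "T":
        # Erode intermediate nodes; shorter paths erode more strongly.
        for u in path[1:-1]:
            alt[u] = max(alt[u] - 0.05 / cost, 0.0)

# Lower altitudes mark the nodes the drops have reinforced most.
print({n: round(a, 2) for n, a in alt.items()})
```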

Findings

A number of experiments are performed on several power distribution benchmarks to demonstrate the effectiveness of the river formation dynamics heuristic. The heuristic is observed to outperform other standard optimization techniques in most cases, and a power distribution network having 16 million nodes is successfully designed for optimal wire area using river formation dynamics.

Originality/value

Although many research works in the literature address minimizing the wire area of power distribution networks, they offer little insight into optimizing very large-scale networks (i.e. networks having more than four million nodes) in an automated environment. The originality of this research is the illustration of an automated environment, equipped with an efficient optimization technique based on the random probabilistic movement of water drops, for solving very large-scale power distribution networks without sacrificing accuracy or incurring additional computational cost. Based on the computation of river formation dynamics, knowledge of the minimum area bounded by the optimum IR drop value can significantly reduce routable space and improve system performance.

Details

Journal of Systems and Information Technology, vol. 20 no. 4
Type: Research Article
ISSN: 1328-7265

Article
Publication date: 1 December 2020

Chandramohan D., Ankur Dumka, Dhilipkumar V. and Jayakumar Loganathan

Abstract

Purpose

This paper aims to predict traffic and to help find a solution. Unpredictable traffic puts more vehicles on the road, which is one of the factors that aggravate traffic congestion. Traffic congestion occurs when the available transport resources are scarce compared with the number of vehicles that share them. As the number of vehicles increases, the resources become scarcer and congestion grows.

Design/methodology/approach

The population of urban areas keeps increasing as people move to cities in search of jobs and a better lifestyle. This leads to an increase in the number of vehicles on the road. However, the transport network accessible to citizens is small compared with their demand.

Findings

The demand for resources is higher than the actual capacity of the roads and streets. Some circumstances aggravate traffic congestion further, such as road conditions (potholes and road repairs), accidents and natural calamities.

Originality/value

A great deal of research is being done to predict traffic and model it in order to find a solution that will improve conditions. However, it is still an open issue, and the accuracy of existing predictions is low.

Details

International Journal of Pervasive Computing and Communications, vol. 17 no. 1
Type: Research Article
ISSN: 1742-7371
