Search results

1 – 10 of 427
Article
Publication date: 7 September 2015

Kazuya Murao, Hayami Tobise, Tsutomu Terada, Toshiki Iso, Masahiko Tsukamoto and Tsutomu Horikoshi

User authentication is generally used to protect personal information such as phone numbers, photos and account information stored in a mobile device by limiting the user to a…

Abstract

Purpose

User authentication is generally used to protect personal information such as phone numbers, photos and account information stored in a mobile device by limiting the user to a specific person, e.g. the owner of the device. Authentication methods based on a password, PIN, face recognition or fingerprint identification have been widely used; however, these methods are difficult to operate with one hand, vulnerable to shoulder surfing and open to illegal access using a fingerprint forged with super glue or a facial portrait. From the viewpoints of usability and safety, a strong yet uncomplicated method is required.

Design/methodology/approach

In this paper, a user authentication method is proposed based on grip gestures, using pressure sensors mounted on the lateral and back sides of a mobile phone. A grip gesture is the action of grasping a mobile phone, which is assumed to replace the conventional unlock procedure. A grip gesture can be performed with one hand. Moreover, grip gestures are hard to imitate, as the finger movements and grip force during a gesture are hardly visible to others.

Findings

The feature values of grip force are experimentally investigated and the proposed method is evaluated in terms of error rate. The method achieved an equal error rate of 0.02, which is comparable to face recognition.
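For readers unfamiliar with the metric, the equal error rate (EER) is the operating point at which the false acceptance rate equals the false rejection rate. A minimal sketch of how such an EER can be estimated from genuine and impostor match scores follows; the score distributions and threshold sweep are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def equal_error_rate(genuine_scores, impostor_scores):
    """Estimate the EER by sweeping a decision threshold over all scores.

    Higher scores are assumed to indicate a better match to the enrolled user.
    """
    thresholds = np.sort(np.concatenate([genuine_scores, impostor_scores]))
    best_gap, eer = np.inf, 1.0
    for t in thresholds:
        far = np.mean(impostor_scores >= t)  # false acceptance rate
        frr = np.mean(genuine_scores < t)    # false rejection rate
        if abs(far - frr) < best_gap:
            best_gap, eer = abs(far - frr), (far + frr) / 2.0
    return eer

# Illustrative scores only (e.g. similarity between a probe grip gesture and
# the owner's enrolled template); real values would come from the sensors.
rng = np.random.default_rng(0)
genuine = rng.normal(0.8, 0.1, 200)
impostor = rng.normal(0.5, 0.1, 200)
print(f"EER ~ {equal_error_rate(genuine, impostor):.3f}")
```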

Originality/value

Many studies using pressure sensors to recognize grip patterns have been proposed thus far; however, conventional works either only recognize grip patterns without identifying users or require long pressure recordings to achieve confident authentication. The proposed method authenticates users with a short grip gesture.

Details

International Journal of Pervasive Computing and Communications, vol. 11 no. 3
Type: Research Article
ISSN: 1742-7371

Keywords

Article
Publication date: 9 September 2014

Darryl Charles, Katy Pedlow, Suzanne McDonough, Ka Shek and Therese Charles

The Leap Motion represents a new generation of depth sensing cameras designed for close range tracking of hands and fingers, operating with minimal latency and high spatial…


Abstract

Purpose

The Leap Motion represents a new generation of depth sensing cameras designed for close range tracking of hands and fingers, operating with minimal latency and high spatial precision (0.01 mm). The purpose of this paper is to develop virtual reality (VR) simulations of three well-known hand-based rehabilitation tasks using a commercial game engine and utilising a Leap camera as the primary mode of interaction. The authors present results from an initial evaluation by professional clinicians of these VR simulations for use in their hand and finger physical therapy practice.

Design/methodology/approach

A cross-disciplinary team of researchers collaborated with a local software company to create three-dimensional interactive simulations of three hand-focused rehabilitation tasks: Cotton Balls, Stacking Blocks, and the Nine Hole Peg Test. These simulations were presented for evaluation to a group of eight physiotherapists and occupational therapists (n=8) based in the Regional Acquired Brain Injury Unit, Belfast Health and Social Care Trust. After induction, the clinicians attempted the tasks presented and provided feedback by filling out a questionnaire.
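As a rough illustration of how a camera-tracked task such as Stacking Blocks could be scored, the sketch below checks whether a tracked fingertip releases a pinch inside a target zone; the frame format, target position and tolerance are hypothetical and do not reflect the authors' game-engine implementation.

```python
import math

# Hypothetical per-frame tracking sample, as a close-range depth camera such
# as the Leap Motion might report it: fingertip position in millimetres plus
# a pinch flag, e.g. {"tip": (x, y, z), "pinching": True}.

TARGET = (0.0, 50.0, 100.0)  # assumed centre of the stacking target zone (mm)
TOLERANCE_MM = 15.0          # assumed placement tolerance

def block_placed(frames):
    """Return True once a pinch is released with the fingertip inside the
    target zone, a simple stand-in for completing one Stacking Blocks step."""
    was_pinching = False
    for frame in frames:
        if frame["pinching"]:
            was_pinching = True
        elif was_pinching:  # pinch just released
            if math.dist(frame["tip"], TARGET) <= TOLERANCE_MM:
                return True
            was_pinching = False
    return False

# Illustrative trajectory: pinch a block, carry it to the target, release.
frames = [
    {"tip": (80.0, 20.0, 60.0), "pinching": True},
    {"tip": (20.0, 45.0, 95.0), "pinching": True},
    {"tip": (2.0, 52.0, 101.0), "pinching": False},
]
print(block_placed(frames))  # True
```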

Findings

Results from questionnaires (using a Likert scale 1-7, where 1 was the most favourable response) revealed a positive response to the simulations with an overall mean score across all questions equal to 2.59. Clinicians indicated that the system contained tasks that were easy to understand (mean score 1.88), and though it took several attempts to become competent, they predicted that they would improve with practice (mean score 2.25). In general, clinicians thought the prototypes provided a good illustration of the tasks required in their practice (mean score 2.38) and that patients would likely be motivated to use the system (mean score 2.38), especially young patients (mean score 1.63), and in the home environment (mean score 2.5).

Originality/value

Cameras offer an unobtrusive and low maintenance approach to tracking user motion in VR therapy in comparison to methods based on wearable technologies. This paper presents positive results from an evaluation of the new Leap Motion camera for input control of VR simulations or games. This mode of interaction provides a low cost, easy to use, high-resolution system for tracking fingers and hands, and has great potential for home-based physical therapies, particularly for young people.

Details

Journal of Assistive Technologies, vol. 8 no. 3
Type: Research Article
ISSN: 1754-9450

Keywords

Abstract

Details

Mad Muse: The Mental Illness Memoir in a Writer's Life and Work
Type: Book
ISBN: 978-1-78973-810-0

Article
Publication date: 1 December 2010

Caroline Langensiepen, Ahmad Lotfi and Scott Higgins

The world has an ageing population who want to stay at home, many of whom are unable to care for themselves without help. As the number of available carers is becoming saturated…

Abstract

The world has an ageing population who want to stay at home, many of whom are unable to care for themselves without help. As the supply of carers is becoming saturated by demand, research is being carried out into how technology could assist elderly people in the home. A barrier to wide adoption is that this audience can find controlling assistive technology difficult, as they may be less dexterous and computer literate. This paper explores the use of gestures to control home automation, aiming to provide a more natural and intuitive interface that helps bridge the gap between technology and older users. A prototype was created and then trialled with a small panel of older users. Using Nintendo Wii Remote (Wiimote) technology, gestures performed in the air were captured with an infrared camera. Computational intelligence techniques were then used to learn and recognise the gestures, and the resulting command was sent to standard X10 home automation units to control a number of attached electrical devices. It was found that although older people could readily use gestures to control devices, configuration of a home system might remain a task for carers or technicians.
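The abstract does not say which computational intelligence technique was used, so the sketch below stands in with a generic nearest-template matcher over resampled 2-D trajectories of the kind an infrared camera could supply; the gesture templates and the gesture-to-X10 command mapping are purely hypothetical.

```python
import numpy as np

def resample(points, n=32):
    """Resample a 2-D trajectory (as captured from the Wiimote IR camera)
    to n evenly spaced points so gestures of different speeds are comparable."""
    pts = np.asarray(points, dtype=float)
    d = np.r_[0.0, np.cumsum(np.linalg.norm(np.diff(pts, axis=0), axis=1))]
    t = np.linspace(0.0, d[-1], n)
    return np.c_[np.interp(t, d, pts[:, 0]), np.interp(t, d, pts[:, 1])]

def recognise(trajectory, templates):
    """Nearest-template classification; returns the gesture label."""
    probe = resample(trajectory)
    probe -= probe.mean(axis=0)
    best, best_dist = None, np.inf
    for label, tmpl in templates.items():
        ref = resample(tmpl)
        ref -= ref.mean(axis=0)
        dist = np.linalg.norm(probe - ref)
        if dist < best_dist:
            best, best_dist = label, dist
    return best

# Hypothetical learned templates and gesture-to-X10 command mapping.
templates = {
    "circle": [(np.cos(a), np.sin(a)) for a in np.linspace(0, 2 * np.pi, 40)],
    "swipe_right": [(x, 0.0) for x in np.linspace(0.0, 2.0, 40)],
}
commands = {"circle": "X10 A1 ON (lamp)", "swipe_right": "X10 A2 OFF (fan)"}

gesture = recognise([(x, 0.02 * x) for x in np.linspace(0, 2, 25)], templates)
print(gesture, "->", commands[gesture])
```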

Details

Journal of Assistive Technologies, vol. 4 no. 4
Type: Research Article
ISSN: 1754-9450

Keywords

Article
Publication date: 1 September 1992

David A. Wesson

The first thing you do when you meet someone in a business context is to tell that person who you think you are, who you think they are, and what you think the nature of your…


Abstract

The first thing you do when you meet someone in a business context is to tell that person who you think you are, who you think they are, and what you think the nature of your relationship is going to be. Explores some of the issues which surround the handshake in a business setting and discusses some of the characteristics and meanings of handshakes, with special attention to gender and cultural differences. The argument is made that, if handshaking poses a problem in the workplace for uninitiated, untrained persons, it would be worthwhile to address this communication skill in an educational setting.

Details

Marketing Intelligence & Planning, vol. 10 no. 9
Type: Research Article
ISSN: 0263-4503

Keywords

Article
Publication date: 25 March 2024

Boyang Hu, Ling Weng, Kaile Liu, Yang Liu, Zhuolin Li and Yuxin Chen

Gesture recognition plays an important role in many fields such as human–computer interaction, medical rehabilitation, virtual and augmented reality. Gesture recognition using…

Abstract

Purpose

Gesture recognition plays an important role in many fields such as human–computer interaction, medical rehabilitation, and virtual and augmented reality. Gesture recognition using wearable devices is a common and effective recognition method. This study aims to combine the inverse magnetostrictive effect and the tunneling magnetoresistance effect and to propose a novel wearable sensing glove for the field of gesture recognition.

Design/methodology/approach

A magnetostrictive sensing glove with a gesture recognition function is proposed, based on Fe-Ni alloy, tunneling magnetoresistive elements, an Agilus30 base and square permanent magnets. The sensing glove consists of five sensing units that measure the bending angle of each finger joint. The optimal structure of the sensing units is determined through experimentation and simulation. The output voltage model of the sensing units is established, and their output characteristics are tested on the experimental platform. Fifteen gestures are selected for recognition, and the corresponding output voltages are collected to construct the data set; the data are processed using a back-propagation neural network.
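As a rough sketch of the final classification step, the snippet below trains a small back-propagation network on five-channel voltage features for 15 gesture classes; the synthetic data, the network size and the use of scikit-learn's MLPClassifier are assumptions standing in for the authors' own network and data set.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

# Synthetic stand-in data: one voltage feature per sensing unit (5 fingers),
# 15 gesture classes, 40 repetitions each. Real data would come from the
# tunneling magnetoresistive elements on the glove.
rng = np.random.default_rng(1)
n_classes, reps, n_channels = 15, 40, 5
centres = rng.uniform(0.0, 1.0, size=(n_classes, n_channels))
X = np.vstack([c + rng.normal(0, 0.05, size=(reps, n_channels)) for c in centres])
y = np.repeat(np.arange(n_classes), reps)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0, stratify=y)

# Small multilayer perceptron trained with back-propagation.
clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=2000, random_state=0)
clf.fit(X_train, y_train)
print(f"recognition accuracy: {clf.score(X_test, y_test):.2%}")
```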

Findings

The sensing units can detect changes in the bending angle of finger joints from 0 to 105 degrees, with a maximum error of 4.69% between experimental and theoretical values. The average recognition accuracy of the back-propagation neural network is 97.53% over the 15 gestures.

Research limitations/implications

The sensing glove can only recognize static gestures at present, and further research is still needed to recognize dynamic gestures.

Practical implications

A new approach to gesture recognition using wearable devices.

Social implications

This study has a broad application prospect in the field of human–computer interaction.

Originality/value

The sensing glove can collect voltage signals under different gestures to realize the recognition of different gestures with good repeatability, which has a broad application prospect in the field of human–computer interaction.

Details

Sensor Review, vol. 44 no. 2
Type: Research Article
ISSN: 0260-2288

Keywords

Article
Publication date: 17 August 2015

Gilbert Tang, Seemal Asif and Phil Webb

The purpose of this paper is to describe the integration of a gesture control system for industrial collaborative robot. Human and robot collaborative systems can be a viable…

Abstract

Purpose

The purpose of this paper is to describe the integration of a gesture control system for an industrial collaborative robot. Human–robot collaborative systems can be a viable manufacturing solution, but efficient control and communication are required for operations to be carried out effectively and safely.

Design/methodology/approach

The integrated system consists of facial recognition, static pose recognition and dynamic hand motion tracking. Each sub-system has been tested in isolation before integration and demonstration of a sample task.
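The abstract gives no implementation details, but the overall control logic, forwarding gesture commands only after facial verification has granted a sufficient permission level, might be sketched as follows; every name, role and permission level here is hypothetical.

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical permission levels assigned after facial verification.
PERMISSIONS = {"operator": {"start", "stop"}, "engineer": {"start", "stop", "jog"}}

@dataclass
class ControlSession:
    role: Optional[str] = None  # set once facial verification succeeds

    def verify_face(self, recognised_user: Optional[str]) -> None:
        # Stand-in for the facial verification sub-system (hypothetical users).
        roles = {"alice": "engineer", "bob": "operator"}
        self.role = roles.get(recognised_user)

    def handle_gesture(self, command: str) -> str:
        """Gate static-pose and hand-motion commands behind verification."""
        if self.role is None:
            return "rejected: no verified user"
        if command not in PERMISSIONS[self.role]:
            return f"rejected: '{command}' not permitted for role '{self.role}'"
        return f"sent to robot: {command}"

session = ControlSession()
print(session.handle_gesture("start"))  # rejected: no verified user
session.verify_face("bob")
print(session.handle_gesture("start"))  # sent to robot: start
print(session.handle_gesture("jog"))    # rejected for the 'operator' role
```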

Findings

It is demonstrated that the combination of multiple gesture control methods can broaden the potential applications of gesture control for industrial robots.

Originality/value

The novelty of the system is its dual gesture control method, which allows operators to command an industrial robot by posing hand gestures as well as to control the robot's motion by moving one of their hands in front of the sensor. A facial verification system is integrated to improve the robustness, reliability and security of the control system, and it also allows permission levels to be assigned to different users.

Details

Industrial Robot: An International Journal, vol. 42 no. 5
Type: Research Article
ISSN: 0143-991X

Keywords

Open Access

Abstract

Purpose

To compare the electromyography (EMG) features during physical and imagined standing up in healthy young adults.

Design/methodology/approach

Twenty-two participants (aged 20 to 29 years) were recruited to participate in this study. Electrodes were attached to the rectus femoris, biceps femoris, tibialis anterior and medial gastrocnemius muscles of both sides to monitor the EMG features during physical and imagined standing up. The percentage of maximal voluntary contraction (%MVC), onset and duration were calculated.
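A simplified sketch of how such EMG features can be computed from a rectified signal is shown below; the sampling rate, the rest window, the onset threshold rule and the synthetic signal are illustrative assumptions rather than the authors' processing pipeline.

```python
import numpy as np

FS = 1000  # assumed sampling rate in Hz

def emg_features(emg, mvc_emg, win_s=0.1, threshold_sd=5.0):
    """Return (%MVC, onset time in s, burst duration in s) for one channel.

    %MVC  : mean rectified amplitude as a percentage of the mean rectified
            amplitude of a maximal voluntary contraction recording.
    onset : first time the smoothed rectified signal exceeds the resting
            mean by `threshold_sd` standard deviations (a simple rule).
    """
    rect = np.abs(emg)
    win = int(win_s * FS)
    envelope = np.convolve(rect, np.ones(win) / win, mode="same")
    rest = envelope[: FS // 2]              # assume the first 0.5 s is rest
    active = envelope > rest.mean() + threshold_sd * rest.std()
    idx = np.flatnonzero(active)
    onset = idx[0] / FS if idx.size else np.nan
    duration = (idx[-1] - idx[0]) / FS if idx.size else 0.0
    pct_mvc = 100.0 * rect.mean() / np.abs(mvc_emg).mean()
    return pct_mvc, onset, duration

# Synthetic example: quiet baseline, a 1 s burst of activity, then quiet again.
rng = np.random.default_rng(2)
emg = rng.normal(0.0, 0.01, 3 * FS)
emg[FS:2 * FS] += rng.normal(0.0, 0.2, FS)
mvc = rng.normal(0.0, 0.5, FS)
print(emg_features(emg, mvc))
```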

Findings

The onset and duration of each muscle on both sides showed no statistically significant differences between physical and imagined standing up (p > 0.05). The %MVC of all four muscles during physical standing up was statistically significantly higher than during imagined standing up (p < 0.05) on both sides. Moreover, the tibialis anterior muscle on both sides contracted significantly earlier than the other muscles (p < 0.05) during both physical and imagined standing up.

Originality/value

Muscles can be activated during imagined movement, and the patterns of muscle activity during physical and imagined standing up were similar. Imagined movement may be used in rehabilitation as an alternative or additional technique, combined with other techniques, to enhance the sit-to-stand (STS) skill.

Details

Journal of Health Research, vol. 35 no. 1
Type: Research Article
ISSN: 0857-4421

Keywords

Article
Publication date: 21 September 2015

Linda Wulf, Markus Garschall, Michael Klein and Manfred Tscheligi

The purpose of this paper is to gain deeper insights into performance differences of younger and older users when performing touch gestures, as well as the influence of tablet…

Abstract

Purpose

The purpose of this paper is to gain deeper insights into performance differences of younger and older users when performing touch gestures, as well as the influence of tablet device orientation (portrait vs landscape).

Design/methodology/approach

The authors performed a comparative study involving 20 younger (25-45 years) and 20 older participants (65-85 years). Each participant executed six gestures with each device orientation. Age was set as a between-subject factor. The dependent variables were task completion time and error rates (missed target rate and finger lift rate). To measure various performance characteristics, the authors implemented an application for the iPad that logged completion time and error rates of the participants when performing six gestural tasks – tap, drag, pinch, pinch-pan, rotate left and rotate right – for both device orientations.
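To make the dependent variables concrete, the sketch below summarises synthetic trial logs per age group and runs a simple between-group comparison of completion times; the data and the t-test are only indicative of this kind of between-subject analysis and are not the authors' statistical procedure.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)

# Synthetic trial logs: completion time (s) and missed-target flag per trial,
# for two age groups (between-subject factor), 20 participants each.
young_times = rng.normal(1.8, 0.4, 20 * 12)   # 12 gesture trials per person
older_times = rng.normal(2.6, 0.6, 20 * 12)
young_missed = rng.random(20 * 12) < 0.05
older_missed = rng.random(20 * 12) < 0.12

for label, times, missed in [("younger", young_times, young_missed),
                             ("older", older_times, older_missed)]:
    print(f"{label}: mean completion {times.mean():.2f} s, "
          f"missed-target rate {missed.mean():.1%}")

# Simple between-group comparison of completion times.
t, p = stats.ttest_ind(young_times, older_times, equal_var=False)
print(f"t = {t:.2f}, p = {p:.3g}")
```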

Findings

The results show a significant effect of age on completion time and error rates. Means reveal faster completion times and lower error rates for younger users than for older users. In addition, a significant effect of device orientation on error rates was found, with higher error rates for portrait orientation than for landscape orientation. Qualitative results reveal a clear preference for landscape orientation in both age groups and a lower acceptance of rotation gestures among older participants.

Originality/value

In this study the authors were able to show the importance of device orientation as an influencing factor on touch interaction performance, indicating that age is not the exclusive influencing factor.

Details

Journal of Assistive Technologies, vol. 9 no. 3
Type: Research Article
ISSN: 1754-9450

Keywords

Article
Publication date: 14 January 2014

Ludger Schmidt, Jens Hegenberg and Liubov Cramar

To avoid harm to humans, environment, and capital goods, hazardous or explosive gases that are possibly escaping from industrial and infrastructure facilities of the gas and oil…

Abstract

Purpose

To avoid harm to humans, environment, and capital goods, hazardous or explosive gases that are possibly escaping from industrial and infrastructure facilities of the gas and oil processing industry have to be detected and located quickly and reliably. Project RoboGasInspector aims at the development and evaluation of a human-robot system that applies autonomous robots equipped with remote gas detection devices to detect and locate gas leaks. This article aims to focus on the usability of telemanipulation in this context.

Design/methodology/approach

This paper presents four user studies concerning human-robot interfaces for teleoperation in industrial inspection tasks. Their purpose is to resolve contradictory scientific findings regarding aspects of teleoperation and to verify functionality, usability, and technology acceptance of the designed solution in the actual context of use. Therefore, aspects concerning teleoperation that were separately examined before are evaluated in an integrated way. Considered aspects are influence of media technology on telepresence, simulator sickness and head slaved camera control, usability of different input devices for telemanipulation, and identification of intuitive gestures for teleoperation of mobile robots.

Findings

In general, the implemented interaction concepts perform better than the conventional ones used in contemporary, deployed robot systems; where they do not, the reasons are analyzed and approaches for further improvement are discussed. Exemplary results are given for each study.

Originality/value

The solution combines several technical approaches that have so far been examined separately. Each approach is transferred to the novel domain of industrial inspection, and its applicability in this context is verified. The new findings yield design recommendations for remote workplaces of robot operators.

Details

Industrial Robot: An International Journal, vol. 41 no. 1
Type: Research Article
ISSN: 0143-991X

Keywords

1 – 10 of 427