
CHAPTER 1 INTRODUCTION

1.1 What is ‘Haptic’?

Haptic refers to technology that uses touch to control and interact with computers. A
user may apply a sense of touch through vibrations, motion or force. Haptic technology is
used mainly in creating virtual objects, controlling virtual objects or in the improvement of
the remote control of machines and devices. The word haptic is derived from the Greek
"haptikos," which means a sense of touch.

Haptic technology refers to technology that interfaces the user with a virtual
environment via the sense of touch by applying forces, vibrations, and/or motions to the user.
This mechanical stimulation may be used to assist in the creation of virtual objects (objects
existing only in a computer simulation), for control of such virtual objects, and to enhance the
remote control of machines and devices (teleoperators). This emerging technology promises
to have wide-reaching applications, as it already has in some fields. For example, haptic
technology has made it possible to investigate in detail how the human sense of touch works
by allowing the creation of carefully controlled haptic virtual objects. These objects are used
to systematically probe human haptic capabilities, which would otherwise be difficult to
achieve. These new research tools contribute to our understanding of how touch and its
underlying brain functions work. Although haptic devices are capable of measuring the bulk or reactive forces applied by the user, they should not be confused with touch or tactile sensors that measure the pressure or force exerted by the user on the interface.
Haptics is the technology of adding the sensation of touch and feeling to computers. When virtual objects are touched, they seem real and tangible. Haptic sensing is linked to the brain’s perception of the position and movement of the body by means of sensory nerves within the muscles and joints. Haptic devices may incorporate tactile sensors that measure the forces exerted by the user on the interface. Haptic technology has made it possible to investigate how the human sense of touch works by allowing the creation of carefully controlled haptic virtual objects.

• Haptics = touch = connection.
• Touch is the code of personal experience.
• Of the five senses, touch is the most proficient, the only one capable of simultaneous input and output.
1.2 History

One of the earliest applications of haptic technology was in large aircraft that
use servomechanism systems to operate control surfaces. Such systems tend to be "one-way",
meaning external forces applied aerodynamically to the control surfaces are not perceived at
the controls. Here, the missing normal forces are simulated with springs and weights. In
lighter aircraft without servo systems, as the aircraft approached a stall the aerodynamic
buffeting (vibrations) was felt in the pilot's controls. This was a useful warning of a
dangerous flight condition. This control shake is not felt when servo control systems are used.
To replace this missing sensory cue, the angle of attack is measured and when it approaches
the critical stall point, a stick shaker is engaged which simulates the response of a simpler
control system. Alternatively, the servo force may be measured and the signal directed to a
servo system on the control, known as force feedback. Force feedback has been implemented
experimentally in some excavators and is useful when excavating mixed material such as
large rocks embedded in silt or clay. It allows the operator to "feel" and work around unseen
obstacles, enabling significant increases in productivity and less risk of damage to the
machine.
The first US patent for a tactile telephone was granted to Thomas D. Shannon in
1973. An early tactile man-machine communication system was constructed by A. Michael
Noll at Bell Telephone Laboratories, Inc. in the early 1970s and a patent was issued for his
invention in 1975.
In 1994, Aura Systems launched the Interactor Vest, a wearable force-feedback device
that monitors an audio signal and uses Aura's patented electromagnetic actuator technology to
convert bass sound waves into vibrations that can represent such actions as a punch or kick.
The Interactor Vest plugs into the audio output of a stereo, TV, or VCR, and the user is provided with controls that allow adjusting the intensity of vibration and filtering out high-frequency sounds. The Interactor Vest is worn over the upper torso and the audio signal
is reproduced through a speaker embedded in the vest. After selling 400,000 of its Interactor
Vest, Aura began shipping the Interactor Cushion, a device which operates like the Vest but
instead of being worn, it's placed against a seat back and the user must lean against it. Both
the Vest and the Cushion were launched with a price tag of $99.
In 1995, the Norwegian Geir Jensen described a wristwatch haptic device with a skin-tap mechanism, termed Tap-in. It would connect to a mobile phone via Bluetooth, and tapping-frequency patterns would identify callers and enable the wearer to respond with selected short messages. It was submitted to a governmental innovation contest but received no award, and it was not pursued or published until recovered in 2015. Jensen's Tap-in device was designed to face the user, to avoid twisting of the wrist, and was intended to work across all mobile phone and watch brands. In 2015, Apple started to sell a wristwatch that delivers skin-tap notifications and alerts from the watch wearer's mobile phone.

1.3 Need for Haptics
Gesture control is an increasingly common form of interface across many sectors, including
desktop computers, gaming, interactive displays and automotive. However, in current gesture
control systems, users can gesture, but they cannot feel the controls they are interacting with.
A team of human computer interaction (HCI) researchers from Glasgow University decided
to find out whether this mattered. Using Ultrahaptics’ mid-air haptic technology, Dr Dong-
Bach Vo and Professor Stephen Brewster put together a user study to determine whether
adding haptic feedback to gesture controls would improve performance.

1.4 Aim of Haptics and Its Future

Love, compassion, joy, affection, warmth… All our emotions come alive through the sense of touch. Be it the physical world or an artificial one, the sense of ‘reality’ is massively governed by touch. So it is obvious that a better future will need a heightened response to touch stimuli. When haptic technology, together with other promising innovations such as Virtual Reality, Augmented Reality, 3D virtual worlds and 3D visualization, rises to prominence, our entire perception of the world will be elevated to a whole new level of grandeur.
Look around you. There is masterful design in almost everything that we see around us. We may not always give due credit, but every little thing we see around us is the end result of a creative design: from the pen in your hand to the water bottle, notepad, computer table and even the building you work in. Engineering, design, and architecture have played a great role in making the world as beautiful as it is today. It is going to look even better with efficient utilization of haptic technology.
In fiction, characters have worn gloves with feedback that let them feel the imaginary objects in their hands; they could upgrade to full-body suits that reproduced the force of a punch to the chest or the stroking of a caress. And yet these capabilities, too, might not be as far off as we imagine.
We rely on touch — or “haptic” — information continuously, in ways we don’t even
consciously recognize. Nerves in our skin, joints, muscles and organs tell us how our bodies
are positioned, how tightly we’re holding something, what the weather is like, or that a loved
one is showing affection through a hug. Around the world, engineers are now working to
recreate realistic touch sensations, for video games and more. Engaging touch in human-
computer interactions would enhance robotic control, physical rehabilitation, education,
navigation, communication and even online shopping.
“In the past, haptics has been good at making things noticeable, with vibration in your
phone or the rumble packs in gaming controllers,” says Heather Culbertson, a computer
scientist at the University of Southern California. “But now there’s been a shift toward
making things that feel more natural, that more mimic the feel of natural materials and natural
interactions.”
Take surgical robots, which allow doctors to operate from the other side of the world,
or to manipulate tools too small or in spaces too tight for their hands. Numerous studies have
shown that adding haptic feedback to the control of these robots increases accuracy and
reduces tissue damage and operation time. Robots with haptic feedback also allow doctors to train on patients that exist only in virtual reality while getting the feeling of actual cutting and suturing. Getting a feel for what the robot under your command is doing would also be helpful
for defusing bombs or extracting people from collapsed buildings. Or for repairing a satellite
without suiting up for a spacewalk. Even Disney has looked into haptic telepresence robots,
for safe human-robot interactions. They developed a system that has pneumatic tubes
connecting a humanoid’s robotic arms with a mirror set of arms for a human to grasp. The
person can manipulate the mirror bot to cause the first bot to hold a balloon, pick up an egg or
pat a child on the cheeks.

1.5 Problems with Haptics

Haptics is a recent enhancement to virtual environments, allowing users to "touch" and feel
the simulated objects they interact with. Current commercial products allow tactile feedback through desktop interfaces (such as the FEELit™ mouse or the PHANToM™ arm) and dextrous tactile and force feedback at the fingertips through haptic gloves (such as the CyberTouch™ and the CyberGrasp™). Virtual reality haptic programming
requires good physical modeling of user interactions, primarily through collision detection,
and of object responses, such as surface deformation, hard-contact simulation, slippage, etc.
It is at present difficult to simulate complex virtual environments that have a realistic behavior. This task is aided by the recent introduction of haptic toolkits (such as GHOST™ or VPS). Current technology suffers from a number of limitations, which go beyond the
higher production cost of haptic interfaces. These technical drawbacks include the limited
workspace of desktop interfaces, the large weight of force-feedback gloves, the lack of force
feedback to the body, safety concerns, etc. Not to be neglected is the high bandwidth
requirement of haptics, which is not met by current Internet technology. As a result, it is not
possible at present to have a large number of remote participants interacting haptically in a
shared virtual space.

CHAPTER 2 LITERATURE REVIEW

2.1 BASIC CONCEPTS OF HAPTICS

The haptic system can both sense and act on the environment, whereas vision and audition are purely sensory in nature. Haptics denotes the combined sensation of mechanical, thermal and pain (nociceptive) perception (fig. 1). As a result, haptics consists of nociceptive, thermoceptive, kinaesthetic and tactile perception. The sense of balance takes an exceptional position, as it is not counted among the five human senses with receptors of their own; yet it really exists, making use of the receptors of all the other senses, especially the haptic ones.
Unlike the four other senses (sight, hearing, taste, and smell), the sense of touch is not
localized to a specific region of the body; instead, it is distributed across the entire body
through the touch sensory organ, our skin, and in our joints, muscles, and tendons. The sense
of touch is typically described as being divided into two modalities: kinesthetic and tactile.
Kinesthetic sensations, such as forces and torques, are sensed in the muscles, tendons, and
joints. Tactile sensations, such as pressure, shear, and vibration, are sensed by specialized
sensory end organs known as mechanoreceptors that are embedded in the skin. Each type of
mechanoreceptor senses and responds to a specific type of haptic stimulus.

Fig. 1 – Distribution of senses

2.2 TERMINOLOGY USED IN HAPTIC SYSTEMS
Some technical terminology used in haptic systems is listed below and illustrated via block diagrams. The arrows between the components of the block diagrams remain unlabeled because they may represent different kinds of information depending on the devices they refer to. Haptic devices are capable of transmitting elongations, forces and temperature differences, and in a few realizations they also stimulate pain receptors. The terms “system”, “device” and “component” are not defined on an interdisciplinary basis: depending on one’s point of view, the same object can be, for example, “a device” to a hardware designer, “a system” to a software engineer, or “just a component” to another hardware engineer.

A. A haptic device is a system generating an output which can be perceived haptically. It has (fig. 2) at least one output, but not necessarily any input. The tactile markers on the F and J keys of a keyboard represent information for the positioning of the index fingers; by this property alone, the keys are already tactile devices. On closer inspection, the key itself shows a haptically noticeable point of actuation, the haptic click. This information is transmitted in a kinaesthetic and tactile way through the interaction of the key’s mechanics with the muscles and joints and through the force transmitted through the skin. Such a key is therefore a haptic device with two outputs but no changing input.
B. A user (in the context of haptic systems) is a receiver of haptic information.
C. A haptic controller describes a component of a haptic system for processing haptic
information flows and improving transmission.

Fig – Haptic device, user and controller

D. Haptic interaction describes the haptic transmission of information. This transmission can be bi- or unidirectional (fig. 3). Moreover, specifically tactile (unidirectional) or kinaesthetic (uni- or bidirectional) interaction may happen. A tactile marker like embossed printing on a bill can communicate tactile information (the bill’s value) as a result of haptic interaction.

E. The addressability of haptic systems refers to the subdivision of an output signal of a device (frequently a force) or of the user (frequently a position).
F. The resolution of a haptic system refers to the capability to detect a subdivision (spatial or temporal) of an input signal. With reference to a device, this corresponds to the measuring accuracy; with respect to the user, it corresponds to his perceptual resolution.

G. A haptic marker refers to a mark communicating information about the object
carrying the marker by way of a defined code of some kind. Examples are markers in
Braille on bills or road maps.
H. A haptic display is a haptic device permitting haptic interaction, whereby the
transmitted information is subject to change (fig. 4). Purely tactile as well as
kinaesthetic displays are available.

Fig – Haptic Display

I. A tactor is a purely tactile haptic display generating a dynamic and oscillating output. Tactors usually provide a translatory output, but can also be rotatory.
J. Haptic interface devices are those that measure the motion of, and stimulate the sensory capabilities within, our hands, thus permitting a haptic interaction (fig. 5). A haptic interface always refers to both data and device.

Fig – Haptic Interface

K. Force-Feedback (FFB) refers to the information transmitted by kinaesthetic interaction (fig. 5). This term is widely used in advertising and in numerous commercial products such as FFB-joysticks, FFB-steering wheels and FFB-mice.
L. A haptic manipulator is a system interacting mechanically with objects, whereby information about the positions in space and the forces and torques of the interaction is continuously acquired.

2.3 HAPTIC TECHNOLOGIES


Tactile cues include textures, vibrations, and bumps, while kinesthetic cues include weight, impact, etc. In the following, we present some crucial concepts and terminology related to haptics.
Haptic: the science of applying tactile, kinesthetic, or both sensations to human computer
interactions. It refers to the ability of sensing and/or manipulating objects in a natural or
synthetic environment using a haptic interface.
Cutaneous: relating to or involving the skin. It includes sensations of pressure, temperature,
and pain.
Tactile: pertaining to the cutaneous sense, but more specifically the sensation of pressure
rather than temperature or pain.
Kinesthetic: relating to the feeling of motion. It is related to sensations originating in muscles, tendons, and joints.
Force Feedback: relating to the mechanical production of information that can be sensed by
the human kinesthetic system.

Haptic communication: the means by which humans and machines communicate via touch.
It mostly concerns networking issues.
Haptic device: is a manipulator with sensors, actuators, or both. A variety of haptic devices have been developed for various purposes. The most popular are tactile-based, pen-based, and 3 degree-of-freedom (DOF) force feedback devices.
Haptic interface: consists of a haptic device and software-based computer control
mechanisms. It enables human–machine communication through the sense of touch. By using a haptic interface, one can not only feed information to the computer but can also receive information or feedback from the computer in the form of a physical sensation on some part of the body.
Haptic perception: the process of perceiving the characteristics of objects through touch.
Haptic rendering: the process of calculating the sense of touch, especially force. It involves sampling the position sensors at the haptic device to obtain the user’s position within the virtual environment. Haptic rendering is, therefore, a system that consists of three parts: a collision detection algorithm, a collision response algorithm, and a control algorithm.
Sensors and Actuators: a sensor is responsible for sensing the haptic information exerted by
the user on a certain object and sending these force readings to the haptic rendering module.
The actuator will read the haptic data sent by the haptic rendering module and transform this
information into a form perceivable by human beings.
Tele-haptics: the science of transmitting haptic sensations from a remote explored
object/environment, using a network such as the Internet, to a human operator. In other
words, it is an extension of human touching sensation/capability beyond physical distance
limits.
Tele-presence: the situation of sensing sufficient information about the remote task
environment and communicating this to the human operator in a way that is sufficient for the
operator to feel physically present at the remote site. The user’s voice, movements, actions,
etc. may be sensed, transmitted, and duplicated in the remote location. Information may be
traveling in both directions between the user and the remote location.
Virtual Reality (VR): can be described as the computer simulation of a real or imaginary world with which users can interact in real time, changing its state to increase realism. Such interactions are sometimes carried out with the help of haptic interfaces, allowing participants to exchange tactile and kinesthetic information with the virtual environment.
Virtual environment (VE): is an immersive virtual reality that is simulated by a computer
and primarily involves audiovisual experiences. Despite the fact that the terminology is
evolving, a virtual environment is mainly concerned with defining interactive and virtual
image displays.
Collaborative virtual environments (CVE): is one of the most challenging fields in VR
because the simulation is distributed among geographically dispersed computers. Potential
CVE applications vary widely from medical applications to gaming.
Collaborative haptic audio visual environment (C-HAVE): in addition to traditional media, such as image, audio, and video, haptics – as a new medium – plays a prominent role in making virtual or real-world objects physically palpable in a CVE. A C-HAVE allows multiple users, each with his/her own haptic interface, to collaboratively and/or remotely manipulate shared objects in a virtual or real environment, as shown in the figure below.

Fig: Haptic Visual Environment

Haptic rendering is the process of applying forces to the user through a force-feedback device. Using haptic rendering, we can enable a user to touch, feel and manipulate virtual objects, and thereby enhance the user’s experience in a virtual environment. More precisely, haptic rendering is the process of displaying synthetically generated 2D/3D haptic stimuli to the user. The haptic interface acts as a two-port system terminated on one side by the human operator and on the other side by the virtual environment.
Contact Detection: A fundamental problem in haptics is to detect contact between the virtual
objects and the haptic device (a PHANTOM, a glove, etc.). Once this contact is reliably
detected, a force corresponding to the interaction physics is generated and rendered using the
probe. This process usually runs in a tight servo loop within a haptic rendering system.
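To make the “tight servo loop” concrete, here is a minimal sketch in Python of an impedance-style loop running at roughly 1 kHz: it samples the probe position from the device, detects contact against a virtual floor plane, and commands a spring force while the probe penetrates it. The functions read_position and send_force are hypothetical stand-ins for whatever the actual device driver provides, and the device object with position and force attributes is assumed purely for illustration.

```python
import time

def read_position(device):
    """Sample the current probe position (x, y, z) in metres (hypothetical driver call)."""
    return device.position

def send_force(device, force):
    """Command a force vector in newtons to the device actuators (hypothetical driver call)."""
    device.force = force

def servo_loop(device, rate_hz=1000, stiffness=1500.0, floor_z=0.0):
    """Tight haptic servo loop: detect contact with a virtual floor at
    z = floor_z and render a penalty force while the probe is in contact."""
    period = 1.0 / rate_hz
    while True:                                   # a real loop would have a stop condition
        start = time.perf_counter()
        x, y, z = read_position(device)
        penetration = floor_z - z                 # contact when the probe dips below the floor
        if penetration > 0.0:
            force = (0.0, 0.0, stiffness * penetration)   # push the probe back up
        else:
            force = (0.0, 0.0, 0.0)               # free space: render no force
        send_force(device, force)
        # Sleep for the remainder of the cycle to hold the servo rate.
        elapsed = time.perf_counter() - start
        if period > elapsed:
            time.sleep(period - elapsed)
```

In practice such a loop typically runs in its own high-priority thread at around 1 kHz, much faster than the 30–60 Hz graphics update of the virtual environment, which helps rendered contacts feel stiff and stable.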

