Haptic technology
ABSTRACT
‘Haptics’ is a technology that adds the sense of touch to virtual environments. Users are given the illusion that they are touching or manipulating a real physical object.
This seminar discusses the important concepts in haptics and some of the most commonly used haptic systems, such as the ‘Phantom’, ‘Cyberglove’, ‘Novint Falcon’ and similar devices. Following this, it describes how sensors and actuators are used to track the position and movement of haptic systems.
The different types of force rendering algorithms are discussed next, along with the blocks that make up force rendering. Finally, a few applications of haptic systems are taken up for discussion.
INTRODUCTION
2. a) What is ‘Haptics’?

Haptic technology refers to technology that interfaces the user with a virtual environment via the sense of touch by applying forces, vibrations, and/or motions to the user. This mechanical stimulation may be used to assist in the creation of virtual objects (objects existing only in a computer simulation), for control of such virtual objects, and to enhance the remote control of machines and devices (teleoperators). This emerging technology promises to have wide-reaching applications, as it already has in some fields. For example, haptic technology has made it possible to investigate in detail how the human sense of touch works by allowing the creation of carefully controlled haptic virtual objects. These objects are used to systematically probe human haptic capabilities, which would otherwise be difficult to achieve. These new research tools contribute to our understanding of how touch and its underlying brain functions work. Although haptic devices are capable of measuring the bulk or reactive forces that are applied by the user, they should not be confused with touch or tactile sensors, which measure the pressure or force exerted by the user on the interface.
The term haptic originated from the Greek word ἁπτικός (haptikos), meaning “pertaining to the sense of touch”, and comes from the Greek verb ἅπτεσθαι (haptesthai), meaning to “contact” or “touch”.
2. b) History of Haptics
In the early 20th century, psychophysicists introduced the word haptics to label the subfield of their studies that addressed human touch-based perception and manipulation. In the 1970s and 1980s, significant research efforts in a completely different field, robotics, also began to focus on manipulation and perception by touch. Initially concerned with building autonomous robots, researchers soon found that building a dexterous robotic hand was much more complex and subtle than their initial naive hopes had suggested.
In time these two communities, one that sought to understand the human hand and one that aspired to create devices with dexterity inspired by human abilities, found fertile mutual interest in topics such as sensory design and processing, grasp control and manipulation, object representation and haptic information encoding, and grammars for describing physical tasks.
In the early 1990s a new usage of the word haptics began to emerge. The confluence of several emerging technologies made virtualized haptics, or computer haptics possible. Much like computer graphics, computer haptics enables the display of simulated objects to humans in an interactive manner. However, computer haptics uses a display technology through which objects can be physically palpated.
WORKING OF HAPTIC SYSTEMS
3. a) Basic system configuration.

Basically, a haptic system consists of two parts, namely the human part and the machine part. In the figure shown above, the human part (left) senses and controls the position of the hand, while the machine part (right) exerts forces on the hand to simulate contact with a virtual object. Both systems are provided with the necessary sensors, processors and actuators. In the human system, nerve receptors perform sensing, the brain performs processing and the muscles perform actuation of the motion of the hand, while in the machine system these functions are performed by encoders, a computer and motors respectively.
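As a rough illustration of this sense–process–actuate loop on the machine side, the sketch below uses a hypothetical encoder/motor API; the class names, methods and servo rate are assumptions for illustration, not a real device driver.

```python
import time

class Encoder:
    """Stands in for the machine's position sensor (counterpart of the nerve receptors)."""
    def read_position(self) -> float:
        # A real driver would return the measured handle position here.
        return 0.0

class Motor:
    """Stands in for the machine's actuator (counterpart of the muscles)."""
    def apply_force(self, force: float) -> None:
        # A real driver would command the motor here.
        pass

def haptic_loop(encoder: Encoder, motor: Motor, compute_force) -> None:
    """The 'computer' stage: sense, process, actuate, repeated at a high servo rate."""
    while True:
        position = encoder.read_position()   # sensing
        force = compute_force(position)      # processing (force computation)
        motor.apply_force(force)             # actuation
        time.sleep(0.001)                    # roughly 1 kHz update rate
```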
3. b) Haptic Information
Basically, the haptic information provided by the system is a combination of (i) tactile information and (ii) kinesthetic information.
Tactile information refers to the information acquired by the sensors that are actually connected to the skin of the human body, with particular reference to the spatial distribution of pressure, or more generally tractions, across the contact area.
For example when we handle flexible materials like fabric and paper, we sense the pressure variation across the fingertip. This is actually a sort of tactile information. Tactile sensing is also the basis of complex perceptual tasks like medical palpation, where physicians locate hidden anatomical structures and evaluate tissue properties using their hands.
Kinesthetic information refers to the information acquired through the sensors in the joints.
Interaction forces are normally perceived through a combination of these two kinds of information.
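Purely as an illustration of how these two kinds of information might be carried together in software (the field names below are assumptions, not taken from the seminar), a combined sample could look like this:

```python
from dataclasses import dataclass
from typing import List

@dataclass
class TactileSample:
    # Spatial distribution of pressure (or tractions) across the contact area.
    pressures: List[float]

@dataclass
class KinestheticSample:
    # Positions and forces sensed at the joints.
    joint_angles: List[float]
    joint_torques: List[float]

@dataclass
class HapticSample:
    # Interaction forces are normally perceived through both together.
    tactile: TactileSample
    kinesthetic: KinestheticSample
```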
3. c) Creation of Virtual environment (Virtual reality).
Virtual reality is the technology which allows a user to interact with a computer-simulated environment, whether that environment is a simulation of the real world or an imaginary world. Most current virtual reality environments are primarily visual experiences, displayed either on a computer screen or through special stereoscopic displays, but some simulations include additional sensory information, such as sound through speakers or headphones. Some advanced haptic systems now include tactile information, generally known as force feedback, in medical and gaming applications. Users can interact with a virtual environment or a virtual artifact (VA) either through the use of standard input devices such as a keyboard and mouse, or through multimodal devices such as a wired glove, the Polhemus boom arm, and an omnidirectional treadmill. The simulated environment can be similar to the real world, for example, simulations for pilot or combat training, or it can differ significantly from reality, as in VR games. In practice, it is currently very difficult to create a high-fidelity virtual reality experience, due largely to technical limitations on processing power, image resolution and communication bandwidth. However, those limitations are expected to eventually be overcome as processor, imaging and data communication technologies become more powerful and cost-effective over time.
Virtual reality is often used to describe a wide variety of applications commonly associated with immersive, highly visual 3D environments. The development of CAD software, graphics hardware acceleration, head-mounted displays, data gloves and miniaturization has helped popularize the notion. The most successful use of virtual reality is in computer-generated 3D simulators. Pilots, for instance, train on flight simulators, which are designed just like the cockpit of an airplane or helicopter. The screen in front of the pilot creates the virtual environment, and trainers outside the simulator command it to adopt different modes. The pilots are thus trained to control the plane in difficult situations and emergency landings. The simulator provides the environment; such simulators cost millions of dollars.
Virtual reality games are used in much the same fashion. The player has to wear special gloves, headphones, goggles, full-body gear and special sensory input devices, and feels as if he is in the real environment. The special goggles contain monitors to see through, and the environment changes according to the movements of the player. These games are very expensive.
3. d) Haptic feedback
Virtual reality (VR) applications strive to simulate real or imaginary scenes with which users can interact and perceive the effects of their actions in real time. Ideally the user interacts with the simulation via all five senses. However, today’s typical VR applications rely on a smaller subset, typically vision, hearing, and more recently, touch.
The application’s main elements are the following (a minimal sketch combining them appears after the list):
1) The simulation engine, responsible for computing the virtual environment’s behavior over time;
2) Visual, auditory, and haptic rendering algorithms, which compute the virtual environment’s graphic, sound, and force responses toward the user; and
3) Transducers, which convert visual, audio, and force signals from the computer into a form the operator can perceive.
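A minimal code skeleton of these three elements might look as follows; the class and function names are illustrative assumptions rather than an actual VR framework API.

```python
class SimulationEngine:
    """1) Computes the virtual environment's behaviour over time."""
    def step(self, dt: float) -> dict:
        # A real engine would integrate object dynamics here.
        return {"object_position": 0.0}

def haptic_render(state: dict, probe_position: float) -> float:
    """2) Haptic rendering algorithm: compute the force response toward the user."""
    # A real renderer would evaluate collisions and a force law here.
    return 0.0

class ForceTransducer:
    """3) Converts the computed force signal into something the operator can feel."""
    def output(self, force: float) -> None:
        pass  # placeholder for the device driver

def frame(engine: SimulationEngine, device: ForceTransducer,
          probe_position: float, dt: float = 0.001) -> None:
    state = engine.step(dt)                       # simulation engine
    force = haptic_render(state, probe_position)  # haptic rendering
    device.output(force)                          # transducer
```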
HAPTIC DEVICES
A haptic device is one that provides a physical interface between the user and the virtual environment by means of a computer. This can be done through an input/output device that senses the body’s movement, such as a joystick or data glove. By using haptic devices, the user can not only feed information to the computer but can also receive information from the computer in the form of a felt sensation on some part of the body. This is referred to as a haptic interface.
Haptic devices can be broadly classified into
4. a) Virtual reality/ Telerobotics based devices
i) Exoskeletons and Stationary device
ii) Gloves and wearable devices
iii) Point-sources and Specific task devices
iv) Locomotion Interfaces
4. b) Feedback devices
i) Force feedback devices
ii) Tactile displays
4. a. i) Exoskeletons and Stationary devices
The term exoskeleton refers to the hard outer shell that exists on many creatures. In a technical sense, the word refers to a system that covers the user or that the user has to wear. Current haptic devices that are classified as exoskeletons are large and immobile systems to which the user must attach himself or herself.
4. a. ii) Gloves and wearable devices
These devices are smaller, exoskeleton-like devices that are often, but not always, anchored to a larger exoskeleton or other immobile device. Since the goal of building a haptic system is to be able to immerse a user in the virtual or remote environment, it is important to provide as small a reminder of the user’s actual environment as possible. The drawback of wearable systems is that, since the weight and size of the devices are a concern, they will have more limited sets of capabilities.
4. a. iii) Point sources and specific task devices
This is a class of devices that are very specialized for performing a particular task. Designing a device to perform a single type of task restricts its application to a much smaller number of functions; however, it allows the designer to make the device perform its task extremely well. These task devices take two general forms: single-point-of-interface devices and specific-task devices.
4. a. iv) Locomotion interfaces
An interesting application of haptic feedback is full-body force feedback in the form of locomotion interfaces. Locomotion interfaces are devices that confine the user’s movement to a restricted space while simulating unrestrained mobility such as walking and running in virtual reality. These interfaces overcome the limitations of joysticks for maneuvering, of whole-body motion platforms, in which the user is seated and does not expend energy, and of room environments, where only short distances can be traversed.
4. b. i) Force feedback devices
Force feedback input devices are usually, but not exclusively, connected to computer systems and are designed to apply forces that simulate the sensation of weight and resistance in order to provide information to the user. As such, the feedback hardware represents a more sophisticated form of input/output device, complementing others such as keyboards, mice or trackers. Input from the user is in the form of hand or other body-segment position, whereas feedback from the computer or other device is in the form of force or position. These devices translate digital information into physical sensations.
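As one hedged example of how such a device can be driven to simulate resistance, the sketch below uses a common spring-damper "virtual wall" force law; the gains and wall position are arbitrary illustrative values, not taken from the seminar.

```python
def virtual_wall_force(position: float, velocity: float,
                       wall: float = 0.0, k: float = 500.0, b: float = 2.0) -> float:
    """Force (in newtons) resisting penetration of a virtual wall at position = wall.

    position -- probe position measured by the device (m)
    velocity -- probe velocity (m/s)
    k        -- stiffness gain (N/m); b -- damping gain (N*s/m)
    """
    penetration = position - wall
    if penetration <= 0.0:
        return 0.0                           # free space: no resistance felt
    return -k * penetration - b * velocity   # push back against penetration
```

For example, at 1 cm of penetration and zero velocity, these example gains give a restoring force of 5 N pushing the hand back out of the wall.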
4. b. ii) Tactile display devices
Simulation tasks involving active exploration or delicate manipulation of a virtual environment require the addition of feedback data that presents an object’s surface geometry or texture. Such feedback is provided by tactile feedback systems or tactile display devices. Tactile systems differ from haptic systems in the scale of the forces being generated. While haptic interfaces present the shape, weight or compliance of an object, tactile interfaces present the surface properties of an object, such as its surface texture. Tactile feedback applies sensation to the skin.
COMMONLY USED HAPTIC