
Vision-Based Estimation for Guidance, Navigation, and Control of an Aerial Vehicle
Seminar Report
Presented by
Ms. Preethy P
1st Semester, M.Tech
Guidance and Navigational Control
Roll No: 10GNC09

Abstract
While a Global Positioning System (GPS) is the most widely used sensor modality for
aircraft navigation, researchers have been motivated to investigate other navigational sensor
modalities because of the desire to operate in GPS denied environments. Due to advances in
computer vision and control theory, monocular camera systems have received growing interest
as an alternative/collaborative sensor to GPS systems. Cameras can act as navigational
sensors by detecting and tracking feature points in an image. Current methods have a limited
ability to relate feature points as they enter and leave the camera field of view (FOV).
A vision-based position and orientation estimation method for aircraft navigation and
control is described. This estimation method accounts for a limited camera FOV by releasing
tracked features that are about to leave the FOV and tracking new features. At each time
instant that new features are selected for tracking, the previous pose estimate is updated. The
vision-based estimation scheme can provide input directly to the vehicle guidance system and
autopilot. Simulations are performed wherein the vision-based pose estimation is integrated
with a nonlinear flight model of an aircraft. Experimental verification of the pose estimation
is performed using the modeled aircraft.

Chapter 1
Introduction

Global Positioning System (GPS) is the primary navigational sensor modality used for vehicle
guidance, navigation, and control. However, a comprehensive study (the Volpe report)
indicates several vulnerabilities of GPS associated with signal disruptions. The report
delineates the sources of interference with the GPS signal into two categories: unintentional
and deliberate disruptions. Among the report's ultimate recommendations were to
"create awareness among members of the domestic and global transportation community of
the need for GPS backup systems" and to "conduct a comprehensive analysis of GPS backup
navigation".
The Volpe report inspired a search for strategies to mitigate the vulnerabilities of the
current GPS navigation protocol. Nearly all resulting strategies followed the suggested GPS
backup methods that revert to legacy methods. Unfortunately, these navigational modalities
are limited by the range of their land-based transmitters, which are expensive and may not
be feasible for remote, hazardous, or adversarial environments. Based on these restrictions,
researchers have investigated local methods of estimating position when GPS is denied.
Given the advancements in computer vision and estimation and control theory, monocular
camera systems have received growing interest as a local alternative sensor to GPS systems.
One issue that has inhibited the use of a vision system as a navigational aid is the difficulty
in reconstructing inertial measurements from the projected image. Current approaches to
estimating the aircraft state through a camera system utilize the motion of feature points
in an image. A geometric approach is proposed that uses a series of homography relation-
ships to estimate position and orientation with respect to an inertial pose. This approach
creates a series of Daisy-Chained pose estimates in which the current feature points can be
related to previously viewed feature points to determine the current coordinates between
each successive image. Through these relationships previously recorded GPS data can be
linked with the image data to provide position measurements in navigational regions where
GPS is denied. The method also delivers an accurate estimation of vehicle attitude, which
is an open problem in aerial vehicle control. The position and attitude (pose) estimation
method can be executed in real time, making it amenable for use in closed-loop guidance
control of an aircraft.
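The core of the daisy-chaining idea is pose composition: each homography between successive views yields a relative rotation and translation, which is stacked onto the last known world pose. The numpy sketch below illustrates this composition under assumed frame conventions; the function name and the sample maneuver are illustrative, not taken from the report.

```python
import numpy as np

def chain_pose(R_w0, x_w0, R_01, x_01):
    """Daisy-chain a relative pose (R_01, x_01), estimated between two
    camera views, onto the last known world pose (R_w0, x_w0).

    Illustrative conventions: R_w0 rotates frame-0 vectors into the
    world frame, x_w0 is the frame-0 origin in world coordinates, and
    (R_01, x_01) is the motion of frame 1 expressed in frame 0.
    """
    R_w1 = R_w0 @ R_01         # compose rotations along the chain
    x_w1 = x_w0 + R_w0 @ x_01  # express the new step in world coordinates
    return R_w1, x_w1

# Example: two successive 90-degree yaw steps, each with unit forward motion.
Rz90 = np.array([[0.0, -1.0, 0.0],
                 [1.0,  0.0, 0.0],
                 [0.0,  0.0, 1.0]])
step = np.array([1.0, 0.0, 0.0])

R, x = np.eye(3), np.zeros(3)
R, x = chain_pose(R, x, Rz90, step)  # first leg: ends at (1, 0, 0)
R, x = chain_pose(R, x, Rz90, step)  # second leg: ends at (1, 1, 0)
```

In the same way, previously recorded GPS fixes can seed `(R_w0, x_w0)`, so that subsequent vision-only relative poses remain referenced to the world frame after GPS is lost.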
The concept of vision-based control for a flight vehicle has been an active area of research
over the last decade. Recent literature focused on vision-based state estimation for use in
control of a flight vehicle can be categorized by several distinctions. One distinction is that
some methods require simultaneous sensor fusion, while other methods rely solely on camera
feedback. Research can further be categorized into methods that require a priori knowledge
of landmarks versus techniques that do not require any prior knowledge of landmarks.
Another category of research includes methods that require the image features to remain
in the field of view (FOV) versus methods that are capable of acquiring new features. Finally,
methods can be categorized according to the vision-based technique for information extraction,
such as optic flow, simultaneous localization and mapping (SLAM), stereo vision, or
epipolar geometry. This last category might also be delineated between methods that are
more computationally intensive and therefore indicative of the level of real-time, on-board
computational feasibility. The method presented can provide an estimate of the position
and attitude of an unmanned air vehicle (UAV), and the method can be extended to map the
location of static landmarks in the world frame. By using the daisy-chaining strategy, the
coordinates of static features that have moved out of the FOV can also be estimated.
To investigate the performance of the developed method, a numerical simulation is provided
for a nonlinear, six degrees-of-freedom model of an Osprey UAV. The simulation
illustrates the ability of the estimation method to reconstruct the UAV pose in the presence
of disturbances, such as errors in the initial altitude measure, image quantization, and noise.
To illustrate the potential use of the estimates in a feedback loop, an autopilot was also
included in the simulation, with inputs complementary to the outputs from the estimation
method. A specific maneuver is created to perform a simultaneous rolling, pitching, and
yawing motion of the aircraft, combined with a fixed-mounted camera. The aircraft/autopilot
modeling effort and maneuver are intended to test the robustness of the homography-based
estimation method as well as to provide proof-of-concept in using the camera as the primary
sensor for achieving closed-loop autonomous flight.
Based on the outcomes from the simulation, the performance of the developed estimation
method is also experimentally tested through two flight tests with an Osprey UAV. One
experiment compares the estimated inertial coordinates of the UAV to the GPS position
reported by two on-board GPS units. The GPS units provide different outputs, but the
homography-based estimation is shown to provide approximately equivalent performance. A
second experiment is also performed, where precision laser-measured ground markers were
viewed by the flight camera. In this experiment, the images of the known marker locations
are used to generate a precise ground truth. The estimation method (which did not use any
information about marker locations) matches the reconstructed ground truth data.

Chapter 2
Pose Reconstruction from Two Views

2.1 Euclidean Relationships
Consider a body-fixed coordinate frame Fc that defines the position and attitude of a camera
with respect to a constant world frame Fw. The world frame could represent a departure point,
destination, or some other point of interest. The rotation and translation of Fc with respect
to Fw are defined as R(t) ∈ ℝ^(3×3) and x(t) ∈ ℝ^3, respectively. The camera rotation and
translation of Fc between two time instances, t0 and t1, are denoted by R01(t1) and x01(t1).
During the camera motion, a collection of I (where I ≥ 4) coplanar and noncolinear static
feature points are assumed to be visible in a plane. The assumption of four coplanar and
noncolinear feature points is only required to simplify the subsequent analysis and is made
without loss of generality. Image processing techniques can be used to select coplanar and
noncolinear feature points within an image. However, if four coplanar target points are not
available, then the subsequent development can also exploit a variety of linear solutions for
eight or more non-coplanar points or nonlinear solutions for five or more points.
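For concreteness, the planar homography underlying these relationships can be estimated from the I ≥ 4 coplanar point correspondences with a direct linear transform (DLT). The following is a minimal numpy sketch of that standard construction, not the report's implementation; the function name and the synthetic points are illustrative.

```python
import numpy as np

def homography_dlt(src, dst):
    """Estimate the 3x3 homography H with dst ~ H @ src (in homogeneous
    coordinates) from I >= 4 point correspondences via the direct
    linear transform. src, dst: (I, 2) arrays of image coordinates."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        # Each correspondence contributes two linear constraints on h.
        A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    # The null vector of A (smallest singular value) gives H up to scale.
    _, _, Vt = np.linalg.svd(np.asarray(A))
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]  # fix the scale ambiguity

# Four coplanar, non-colinear points and their images under a known map.
src = np.array([[0.0, 0.0], [1.0, 0.0], [1.0, 1.0], [0.0, 1.0]])
H_true = np.array([[1.20, 0.10, 0.30],
                   [0.00, 0.90, -0.20],
                   [0.05, 0.00, 1.00]])
pts_h = np.hstack([src, np.ones((4, 1))]) @ H_true.T
dst = pts_h[:, :2] / pts_h[:, 2:]

H_est = homography_dlt(src, dst)
```

The estimated H can then be decomposed into the rotation R01 and scaled translation between the two views, which is the step the subsequent development builds on.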
