Landing system for AR.Drone 2.0 using onboard camera and ROS
Tianqu Zhao, Hong Jiang
Abstract—As the application of unmanned aerial vehicles (UAVs) becomes more widespread, the demand for precise autonomous landing is rising quickly. The AR.Drone 2.0 is a high-tech, low-cost flying toy that is able to take on the role of a UAV. In this paper, we present a system that enables the drone to land autonomously. The image stream from the drone's camera is obtained and rectified using camera calibration. The image frames are then processed by a marker recognition package to detect the AR marker that serves as the landing pad, and to calculate the position and orientation of the drone relative to the landing pad. Next, a PID controller is used to control the drone's velocity so that it hovers stably over the center of the landing pad. A laptop computer running Ubuntu and ROS communicates with the drone via Wi-Fi throughout the landing process. Several experiments were conducted to verify the feasibility of the whole system.

Keywords: AR.Drone; Robot Operating System (ROS); landing system

This paper is structured as follows. In Section 2, we introduce the hardware and software used in our project. In Section 3, the main steps of our work are described. Section 4 presents the experimental results, and in Section 5 we summarize our work and discuss future improvements.

II. HARDWARE AND SOFTWARE

The AR.Drone 2.0 quadrotor (hereafter called the drone) is used as our UAV platform; it has been adopted by several research groups at various universities. It is an off-the-shelf product, low-cost and easy to use, released by Parrot SA in 2012. It comes with two body covers (an indoor and an outdoor hull); the indoor hull encloses the propellers, so the drone can be used safely indoors. Every part of the drone can be replaced individually, which makes it easy to repair.
The current version of the SDK is 2.0.1, which can be downloaded from the Parrot developer website [5]. The original version of the drone's SDK is hard to use; many developers encountered errors when trying to compile it. Therefore, we chose to use a substitute package based on ROS.

ROS, the Robot Operating System, is a flexible framework and toolbox for the development of robot applications. ROS provides several functions similar to a traditional operating system, such as hardware abstraction, low-level device control, message passing between processes, and package management. In addition, it provides a large number of utilities and libraries. In our project, the most commonly used ROS tools and packages are RViz, ardrone_autonomy, ar_track_alvar, and tum_ardrone.

RViz stands for ROS visualization. It is a general-purpose 3D visualization environment for robots, sensors, and algorithms. RViz can plot a variety of data types streaming through a typical ROS system, works with any robot, and can be rapidly configured for a particular application. The default, unconfigured RViz window is shown in Fig. 2.

ardrone_autonomy: This is the ROS driver for the AR.Drone, built on the official SDK. While the drone is flying, it is commanded with messages of type geometry_msgs/Twist sent to the ROS topic cmd_vel, which lets you move the drone forward/backward, up/down, and left/right as you like. The driver translates each geometry_msgs message into the corresponding changes of the roll, pitch, and yaw angles.
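As an illustration, a minimal Python sketch of sending such a velocity command is shown below; the cmd_vel topic and the geometry_msgs/Twist type are the driver's standard interface, while the node name, rate, and velocity values are arbitrary choices for this example.

    #!/usr/bin/env python
    # Minimal sketch: command the drone through the ROS driver.
    # The velocity values here are arbitrary example numbers.
    import rospy
    from geometry_msgs.msg import Twist

    rospy.init_node('velocity_command_example')
    pub = rospy.Publisher('cmd_vel', Twist, queue_size=1)

    cmd = Twist()
    cmd.linear.x = 0.1   # forward/backward (the driver maps this to pitch)
    cmd.linear.y = 0.0   # left/right (roll)
    cmd.linear.z = 0.0   # up/down
    cmd.angular.z = 0.0  # rotation (yaw)

    rate = rospy.Rate(10)            # re-publish at 10 Hz
    while not rospy.is_shutdown():
        pub.publish(cmd)
        rate.sleep()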
tum_ardrone: This is a ROS package developed by Engel et al. [7]. It enables the low-cost AR.Drone, coupled with a laptop computer, to navigate in previously unknown and GPS-denied indoor environments. They also conducted many experiments proving that the system remains robust even under significant communication delays and temporary loss of visual tracking. The package contains three nodes; Fig. 3 shows the running interface of tum_ardrone.

drone_stateestimation: This node provides the drone with SLAM capabilities by implementing an algorithm based on Parallel Tracking and Mapping (PTAM). In addition, it implements an EKF for state estimation and for compensation of the time delays that Wi-Fi communication introduces into the system.

drone_autopilot: This node implements a PID controller for pose stabilization and navigation. It also includes basic waypoint navigation and automatic initialization of the drone.

drone_gui: This node implements a simple GUI to display the drone's status and control its movement. You can use a joystick or simply a keyboard to drive the drone.
Figure 4. AR Marker tag examples

The encoding of these tags is carefully designed to reduce reading errors and to permit accurate calculation of the orientation and distance of a tag relative to the camera that images it. ALVAR marker tags work surprisingly well in a wide variety of application environments.

In our project, we use the ar_track_alvar package to calculate the coordinate transformation between the ALVAR tag and the drone.

We obtain a 720p video stream from the drone's front-facing camera and a 320x240-pixel stream from its downward-facing camera. Due to the limited computing power of the drone's onboard computer, all computation runs on a laptop, a Lenovo Y430p (CPU: Intel Core i7-4710MQ 2.5 GHz; GPU: NVIDIA GeForce GTX 850M; 8 GB RAM) running Ubuntu 14.04 LTS.

A. Camera Calibration

As is well known, every camera lens introduces distortion. In order to obtain accurate pose estimation results, we should first calibrate the drone's camera. In the ideal pinhole model, a 3D point is projected onto the image plane according to

$X = \frac{f x}{z}, \qquad Y = \frac{f y}{z}$

where (X, Y) is the coordinate of the projected point p on the projection plane, (x, y, z) is the coordinate of the 3D point P in the camera coordinate system, and f is the distance between the optic center and the image plane, known as the focal length.
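As a quick numerical illustration (with values chosen arbitrarily): a point at (x, y, z) = (0.2, 0.1, 1.0), in meters, viewed through a lens with f = 0.004 m projects to X = fx/z = 0.0008 m and Y = fy/z = 0.0004 m on the image plane; halving the distance z doubles both X and Y, which is the familiar perspective effect.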
In ROS, we have a useful utility called camera_calibration, which allows easy calibration of monocular or stereo cameras using a checkerboard calibration target. Any camera that satisfies the standard ROS camera interface can be calibrated with camera_calibration, and the ardrone_autonomy driver meets this requirement. Therefore, we first run the ardrone_autonomy driver in one terminal, then run camera_calibration in another terminal like this:

rosrun camera_calibration cameracalibrator.py --size 9x6 --square 0.044 image:=/ardrone/front/image_raw camera:=/ardrone/front

This will open up the calibration window, which highlights the checkerboard.
Figure 7. Camera calibration result. (a) Image before rectification.
B. Marker Recognition
Before the landing phase, we need to determine the spatial relationship between the drone and the landing pad. In [9], Merz et al. use specially designed landing pad patterns. In our project, we use the AR marker shown in Fig. 4 as our landing pad pattern.

AR markers are augmented reality tags formed of different patterns of black and white squares arranged in a grid. Such markers are well suited to pattern recognition approaches and can easily be detected with existing software. ALVAR is a software library for creating virtual and augmented reality (AR) applications, developed by the VTT Technical Research Centre [10].
The ar_track_alvar package publishes a ROS topic named ar_pose_marker, which contains a list of the poses of all observed AR tags with respect to the camera frame or another specified coordinate frame.
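As an illustration, a minimal subscriber to this topic might look like the following sketch; the AlvarMarkers message type is the one the package publishes (it lives in ar_track_alvar_msgs in recent releases, in ar_track_alvar.msg in older ones), while the node name and the logging behavior are arbitrary choices for this example.

    #!/usr/bin/env python
    # Sketch: read marker poses published by ar_track_alvar.
    # The message module is ar_track_alvar_msgs in recent releases
    # (older releases used ar_track_alvar.msg).
    import rospy
    from ar_track_alvar_msgs.msg import AlvarMarkers

    def on_markers(msg):
        # Each entry carries the tag id and its pose in the camera frame.
        for marker in msg.markers:
            p = marker.pose.pose.position
            rospy.loginfo('marker %d at (%.2f, %.2f, %.2f)',
                          marker.id, p.x, p.y, p.z)

    rospy.init_node('marker_listener_example')
    rospy.Subscriber('ar_pose_marker', AlvarMarkers, on_markers)
    rospy.spin()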
Beneath the surface, the pose estimation problem is solved through the following process. First, the color image is converted to grayscale. This image is then converted into a binary (black-and-white) image using an adaptive threshold. Edges are searched for in this image, producing a number of lines. Next, sets of four intersecting lines (i.e., quadrangles) are identified; these are potential markers. It is then verified that the outside of each quadrangle is white and the inside is black (i.e., that we are indeed seeing the border stripe of a marker). Finally, the inside of the marker border is interpreted as bits, and it is checked that the bit pattern forms a valid marker. From the four corner points of the detected marker, the program can compute the pose (i.e., location and orientation) of the marker in camera coordinates.
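The sketch below is not ALVAR's actual implementation; it merely illustrates the first stages of such a pipeline (grayscale conversion, adaptive thresholding, and quadrangle candidate search) using OpenCV. The threshold block size, offset, and area cutoff are arbitrary example values.

    import cv2

    def find_marker_candidates(bgr_image):
        # 1. Color image to grayscale.
        gray = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2GRAY)
        # 2. Grayscale to binary with an adaptive threshold,
        #    which tolerates uneven lighting.
        binary = cv2.adaptiveThreshold(gray, 255,
                                       cv2.ADAPTIVE_THRESH_MEAN_C,
                                       cv2.THRESH_BINARY, 11, 2)
        # 3. Extract contours and keep those that reduce to four
        #    corner points: candidate marker quadrangles.
        #    (OpenCV 2.x/4.x signature; 3.x returns an extra value first.)
        contours, _ = cv2.findContours(binary, cv2.RETR_LIST,
                                       cv2.CHAIN_APPROX_SIMPLE)
        quads = []
        for c in contours:
            approx = cv2.approxPolyDP(c, 0.02 * cv2.arcLength(c, True), True)
            if len(approx) == 4 and cv2.contourArea(approx) > 100:
                quads.append(approx.reshape(4, 2))
        return quads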
We ran some experiments to test the accuracy of the pose computed by ar_track_alvar. The drone was placed in front of the AR marker at a distance of exactly one edge length of a square ceramic tile, 60 cm. We then ran the ar_track_alvar node with the relevant parameters set properly, after which the pose of the marker with respect to the ardrone_base_frontcam frame can be seen in RViz. Fig. 8 is a screenshot of RViz while running this program. On the left side, the position of ar_marker_4 is represented by three constantly changing values; this is caused by sensor noise. The first two digits after the decimal point are stable, so we can tell that the accuracy of the computed pose is acceptable.

Figure 8. Screenshot of running ar_track_alvar and RViz

In our experiment, we plan to use the images from the drone's down-looking camera, so we write a subscriber to the ROS topic ardrone/bottom/image_raw and use the result of the camera calibration phase to rectify the incoming image stream, which is transmitted to our laptop via Wi-Fi. Based on these image frames, the relevant nodes process the images, then search for and locate the landing target (in our project, the AR marker). Fig. 9 shows an image from the bottom camera in actual operation.

Figure 9. Actual image from bottom camera
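A minimal version of such a subscriber might look like the following sketch; the topic name comes from the driver, while the node name and the intrinsic parameters are placeholders, since the real values come out of the calibration step.

    #!/usr/bin/env python
    # Sketch: subscribe to the bottom camera and rectify each frame
    # with the intrinsics obtained from camera_calibration.
    # The matrix and distortion values below are placeholders; the
    # real ones come from the calibration step.
    import rospy
    import cv2
    import numpy as np
    from cv_bridge import CvBridge
    from sensor_msgs.msg import Image

    K = np.array([[200.0, 0.0, 160.0],          # placeholder intrinsics
                  [0.0, 200.0, 120.0],
                  [0.0, 0.0, 1.0]])
    dist = np.array([-0.5, 0.3, 0.0, 0.0, 0.0])  # placeholder distortion

    bridge = CvBridge()

    def on_image(msg):
        frame = bridge.imgmsg_to_cv2(msg, desired_encoding='bgr8')
        rectified = cv2.undistort(frame, K, dist)
        # hand `rectified` to the marker detection stage here

    rospy.init_node('bottom_camera_rectifier_example')
    rospy.Subscriber('ardrone/bottom/image_raw', Image, on_image)
    rospy.spin()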
C. AR.Drone velocity control

Control theory deals with the problem of controlling the behavior of a dynamic system. The general goal is to calculate system input values u(t) such that the system reaches and holds a desired state. In other words, the measured error e(t) between a given setpoint w(t) and the measured output of the system y(t) is to be minimized over time. In our situation, the setpoint is the image center, and the measured error is the distance between the AR marker's center and the image center. We plan to use the most common controller, the proportional integral derivative (PID) controller, in our velocity control process:

$U_t = K_P \left( X_{(0,0)} - X_{t-1} \right) + K_D \left( \dot{X}_{(0,0)} - \dot{X}_{t-1} \right) + K_I \int_0^t \left( X_{(0,0)} - X_{t-1} \right) dt$

(Figure: flow chart of the landing process, from drone take-off through image frame processing.)
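A discrete implementation of the PID law above might look like the following sketch; the error is computed per image axis, the derivative and integral are approximated by a finite difference and a running sum, and the gains are placeholders that would have to be tuned by hand.

    # Sketch: discrete PID step for one image axis, following the
    # control law above. Gains are placeholder values to be tuned.
    class PID(object):
        def __init__(self, kp, kd, ki):
            self.kp, self.kd, self.ki = kp, kd, ki
            self.prev_error = 0.0
            self.integral = 0.0

        def step(self, setpoint, measured, dt):
            error = setpoint - measured             # X_(0,0) - X_{t-1}
            derivative = (error - self.prev_error) / dt
            self.integral += error * dt
            self.prev_error = error
            return (self.kp * error
                    + self.kd * derivative
                    + self.ki * self.integral)

    # One controller per axis; the outputs would feed the cmd_vel
    # linear.x / linear.y fields of the driver.
    pid_x = PID(kp=0.5, kd=0.1, ki=0.01)   # placeholder gains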
REFERENCES
[1] Venugopalan, T. K., Taher, T., & Barbastathis, G. (2012, October).
Autonomous landing of an Unmanned Aerial Vehicle on an
autonomous marine vehicle. In Oceans, 2012 (pp. 1-9). IEEE.
[2] Kim, J., Jung, Y., Lee, D., & Shim, D. H. (2014, May). Outdoor
autonomous landing on a moving platform for quadrotors using an
omnidirectional camera. In Unmanned Aircraft Systems (ICUAS),
2014 International Conference on (pp. 1243-1252). IEEE.