Drone Cat and Mouse

The goal of this exercise is to implement the logic that allows a quadrotor to play a game of cat and mouse with a second quadrotor.

In this exercise, the student programs the cat quadrotor to follow the mouse quadrotor (which is preprogrammed and moves randomly) as closely as possible without crashing into it. The referee application measures the distance between the two drones and assigns a score based on the time the drones spend close to each other.

Installation

Install the General Infrastructure of the JdeRobot Robotics Academy.

As this is a drones exercise, you will additionally need to install the jderobot-assets, dronewrapper and rqt_drone_teleop packages. They can be installed with:

sudo apt-get install ros-melodic-drone-wrapper ros-melodic-rqt-drone-teleop ros-melodic-jderobot-assets

There is an additional dependency on MAVROS and PX4, which you can fulfill by following the Drones installation instructions.

Finally, since this exercise uses two drones, you also have to install xmlstarlet:

sudo apt-get install -y xmlstarlet

How to run

To launch the exercise, simply use the following command from this directory:

roslaunch drone_cat_mouse.launch

Once the exercise is launched, you can start the autonomous mouse by running the following in another terminal:

roslaunch autonomous_mouse.launch mouse:=0

Several mice with different paths are available. Currently there are 4 different mice, which you can launch by changing the mouse argument from 0 to 3. The difficulty increases with the number of the selected mouse: mouse 0 is the easiest to follow and mouse 3 the most difficult.

Notice too that you can restart or change the mouse without shutting down the exercise. Simply shut down the running mouse and launch a new one after the drone has returned home and landed.

How to do the practice

To solve the exercise, you must edit the my_solution.py file and insert the control logic into it.

Where to insert the code

Your code has to be entered in the execute function between the Insert your code here comments.

my_solution.py

def execute(event):
  global drone
  img_frontal = drone.get_frontal_image()
  img_ventral = drone.get_ventral_image()
  # Both the above images are cv2 images
  ################# Insert your code here #################################

  set_image_filtered(img_frontal)
  set_image_threshed(img_ventral)

  #########################################################################
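A typical first step is to segment the mouse drone in the frontal image before deciding how to move. The snippet below is a minimal sketch of such a filter, assuming the mouse can be distinguished by colour; the HSV range and the filter_mouse helper name are placeholders used for illustration and would need tuning in simulation.

import cv2
import numpy as np

def filter_mouse(img_frontal):
  # Convert to HSV and keep only the pixels inside a colour range.
  # The bounds below are placeholders; tune them to the mouse drone's colour.
  hsv = cv2.cvtColor(img_frontal, cv2.COLOR_BGR2HSV)
  lower = np.array([0, 120, 70])
  upper = np.array([10, 255, 255])
  mask = cv2.inRange(hsv, lower, upper)
  filtered = cv2.bitwise_and(img_frontal, img_frontal, mask=mask)
  return filtered, mask

Inside execute you could then call set_image_filtered(filtered) and set_image_threshed(cv2.cvtColor(mask, cv2.COLOR_GRAY2BGR)) to check the segmentation in the GUI while tuning.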

API

  • set_image_filtered(cv2_image) - Shows a filtered version of the camera image in the GUI
  • set_image_threshed(cv2_image) - Shows a thresholded image in the GUI
  • drone.get_frontal_image() - Returns the latest image from the frontal camera as a cv2_image
  • drone.get_ventral_image() - Returns the latest image from the ventral camera as a cv2_image
  • drone.get_position() - Returns the position of the drone as a numpy array [x, y, z]
  • drone.get_orientation() - Returns the roll, pitch and yaw of the drone as a numpy array [roll, pitch, yaw]
  • drone.get_roll() - Returns the roll of the drone
  • drone.get_pitch() - Returns the pitch of the drone
  • drone.get_yaw() - Returns the yaw of the drone
  • drone.set_cmd_vel(vx, vy, vz, az) - Commands the linear velocity of the drone in the x, y and z directions and the angular velocity in z in its body fixed frame
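
As an illustration of how these calls fit together, the sketch below drives the drone towards the centroid of a thresholded mask of the mouse (such as the one produced by the filtering sketch above) with a simple proportional controller. The gains, the forward speed, the sign conventions and the follow_mouse helper are assumptions made for illustration, not part of the exercise template.

import cv2

# Illustrative gains; these are assumptions and need tuning in simulation.
KP_YAW = 0.5
KP_Z = 0.5
VX_FORWARD = 1.0
SEARCH_YAW_RATE = 0.3

def follow_mouse(drone, mask):
  # Steer towards the centroid of the thresholded mouse blob using set_cmd_vel.
  h, w = mask.shape[:2]
  M = cv2.moments(mask)
  if M["m00"] > 0:
    cx = M["m10"] / M["m00"]
    cy = M["m01"] / M["m00"]
    # Centre the mouse horizontally with yaw and vertically with vz.
    # Signs may need flipping depending on the camera and body frames.
    yaw_rate = -KP_YAW * (cx - w / 2) / (w / 2)
    vz = -KP_Z * (cy - h / 2) / (h / 2)
    drone.set_cmd_vel(VX_FORWARD, 0.0, vz, yaw_rate)
  else:
    # Mouse not visible: hover in place and rotate slowly to search for it.
    drone.set_cmd_vel(0.0, 0.0, 0.0, SEARCH_YAW_RATE)

Calling follow_mouse(drone, mask) at the end of execute closes the loop: since execute is called repeatedly, each call issues a fresh velocity command based on the latest image.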

Theory

Coming soon.

Hints

Simple hints to help you solve the Drone Cat and Mouse exercise. Please note that the full solution has not been provided.

Coming soon.


Contributors