
RoboSub Club of the Palouse competes in the 18th Annual International RoboSub Competition

Picture from WSU News

RoboSub Club of the Palouse is gearing up to compete in the annual International RoboSub Competition in San Diego. The members leave Pullman on Saturday, July 18th, for the week-long event. James Irwin, one of the IRL Lab’s graduate students, will also be heading to San Diego as a member of the Electrical Engineering team.

The team was recently featured in WSU News and KLEWTV; click the links to read the articles.

Good luck to the team and Go Cougs!!!

Controlling the AR Drone with the Microsoft Kinect 2.0

This semester I created a C# program to control the AR.Drone using the Microsoft Kinect 2.0 on Windows 8.

At the beginning of the semester I spent most of my time figuring out how to set up the Kinect 2.0. Once I discovered that my laptop was not compatible, I had to research one that was, and Dr. Matthew Taylor generously offered to buy me a laptop to use for the duration of the project. My original plan was to use multiple Kinects to track the coordinates of an AR.Drone in the Intelligent Robot Learning Lab to simulate GPS, but after doing some research I realized the Kinects were not a good fit for that task, so we redefined my goal: create a hand-motion controller for the AR.Drone using the Microsoft Kinect 2.0.

My first task was to set up the Kinect 2.0 hand motions. This was difficult at first because I had never used C# before and because I had to learn the Kinect SDK 2.0, which at that point was still in development and not fully documented. To learn the SDK and C#, I worked through four of the tutorials listed on the official Kinect for Windows website. After these tutorials I determined that skeletal tracking would be a better fit than gesture tracking, which would have required recording hundreds of Kinect videos to train my customized gestures. Once I had finished all of the skeletal-tracking gestures, I tested them by drawing a green circle on the screen when a gesture was recognized and a red circle when it failed.
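To give a feel for the skeletal-tracking approach, here is a minimal sketch using the standard Kinect SDK 2.0 body-tracking API (Microsoft.Kinect). The "right hand above the shoulder" check and its threshold are illustrative placeholders, not my actual gesture set.

```csharp
using System;
using Microsoft.Kinect;

class GestureTracker
{
    private KinectSensor sensor;
    private BodyFrameReader reader;
    private Body[] bodies;

    public void Start()
    {
        sensor = KinectSensor.GetDefault();
        sensor.Open();
        bodies = new Body[sensor.BodyFrameSource.BodyCount];
        reader = sensor.BodyFrameSource.OpenReader();
        reader.FrameArrived += OnFrameArrived;
    }

    private void OnFrameArrived(object sender, BodyFrameArrivedEventArgs e)
    {
        using (BodyFrame frame = e.FrameReference.AcquireFrame())
        {
            if (frame == null) return;
            frame.GetAndRefreshBodyData(bodies);

            foreach (Body body in bodies)
            {
                if (!body.IsTracked) continue;

                // Compare joint positions to define a simple gesture, e.g.
                // "right hand above the right shoulder" could mean "climb".
                CameraSpacePoint hand = body.Joints[JointType.HandRight].Position;
                CameraSpacePoint shoulder = body.Joints[JointType.ShoulderRight].Position;

                bool climbGesture = hand.Y > shoulder.Y + 0.1f;  // 0.1 m threshold (assumed)
                Console.WriteLine(climbGesture ? "climb" : "hover");
            }
        }
    }
}
```

In the real project the success/failure feedback was the green and red circles drawn to the screen rather than console output.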

My next task was to figure out how to get the information from the Kinect project to the AR.Drone. I went through multiple open-source C# projects that claimed they could pilot the AR.Drone, and I even considered building a server to talk to my Linux machine so that I could use ROS to fly the drone. Figuring out the next step was the most difficult part of the project, because I had no guarantee that either of these solutions would work and I wanted to make sure I finished before the end of the semester. After a lot of research I found Ruslan’s AR.Drone 2.0 C# library. Getting it to work was extremely hard: I had to learn how to set up references in C# using Visual Studio 2013, and there is no documentation or tutorial for the library, so I ended up spending a large amount of time reading through its code to figure out how to actually set up and fly the AR.Drone.
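Roughly, driving the drone from that library looks like the sketch below. The class and method names (DroneClient, FlatTrim, Takeoff, Progress, Hover, Land) and the namespaces are based on my reading of the library's source and may not match it exactly, since there is no documentation to check against.

```csharp
using System.Threading;
using AR.Drone.Client;          // namespaces recalled from the library's source; verify against the code
using AR.Drone.Client.Command;

class FlightTest
{
    static void Main()
    {
        // The AR.Drone creates its own Wi-Fi access point at 192.168.1.1 by default.
        var drone = new DroneClient("192.168.1.1");
        drone.Start();
        Thread.Sleep(2000);      // give the client time to connect

        drone.FlatTrim();        // calibrate level before takeoff
        drone.Takeoff();
        Thread.Sleep(5000);

        // A progressive command sets pitch/roll/yaw/gaz in one message.
        drone.Progress(FlightMode.Progressive, pitch: -0.05f);  // drift slowly forward
        Thread.Sleep(2000);

        drone.Hover();
        drone.Land();
        drone.Stop();
    }
}
```

Once something like this worked from a console, the same calls could be triggered from the Kinect gesture handlers instead of timers.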

Finally, once I had both the flight project and the Kinect project working, I combined the two and rewrote the code a few times so that multiple flight commands could be processed in a single message. Once all of the flight controls were working, I spent the last two weeks of the project trying to get a video feed from the AR.Drone into a Windows Forms application. I made multiple attempts, including using ffmpeg to stream the video to a separate player and using Ruslan’s FFMPEG project, but I was unable to get this part working.
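The sketch below illustrates the idea of folding several hand positions into one combined flight message per Kinect frame instead of sending separate climb/turn/bank commands; the thresholds and the axis mapping are assumptions for illustration, not the values I actually used.

```csharp
using System;

struct FlightCommand
{
    public float Pitch;  // forward/backward tilt
    public float Roll;   // left/right bank
    public float Gaz;    // vertical speed
    public float Yaw;    // rotation
}

static class GestureMapper
{
    // Hypothetical translation from a few joint positions to a single command.
    public static FlightCommand FromJoints(float rightHandY, float rightShoulderY,
                                           float leftHandX, float leftShoulderX)
    {
        var cmd = new FlightCommand();
        if (rightHandY > rightShoulderY + 0.1f) cmd.Gaz = 0.3f;   // right hand up: climb
        if (rightHandY < rightShoulderY - 0.3f) cmd.Gaz = -0.3f;  // right hand down: descend
        cmd.Roll = Math.Max(-0.3f, Math.Min(0.3f, leftHandX - leftShoulderX)); // bank with left hand
        return cmd;
    }
}
```

Structuring it this way keeps the two halves decoupled: the Kinect side only produces a FlightCommand each frame, and the flight side turns that one value into a single message to the drone.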

I found this project to be very beneficial, and I had a great time working on it. From it I learned how to program in C#, how to program with the Kinect 2.0, how to program with the AR.Drone on Windows, and how to create a GUI in C#.

I think the next step for this project should be adding a video stream so that the AR.Drone can be piloted using its camera. I also think it would be cool to add voice commands to the Kinect project for takeoff and landing.