Intelligent Robot Learning Laboratory (IRL Lab)

2016 Showcase for Undergraduate Research and Creative Activities (SURCA)

Kayl Coulston during the poster session.

Kayl Coulston was one of more than 200 undergraduates invited to present their research at the 2016 SURCA, held in the Compton Union Building Senior Ballroom on March 28, 2016. SURCA is an annual event at WSU that promotes undergraduate research from all majors.

Kayl, a junior undergraduate in Computer Science, has been working under Dr. Matthew Taylor since Summer 2015 on building a low-cost autonomous rover that will be used to count trees on nursery farms. The current practice requires a farm worker to walk miles through hundreds of rows, counting the trees by hand.

Click here to view his research poster, and visit the project’s webpage to read more about this exciting project.

REU participants showcase their research projects

Jessie Bryant during the poster session.

The IRL Lab had two REU participants this year: Jessie Bryant, a junior Mechanical Engineering student from WSU, and Maddie Chili, a junior Computer Science and Communications Design student from Elon University.

The REU program allows undergraduate students to experience what it takes to do research under the guidance of a top research professor while working alongside graduate students.

At yesterday’s poster session, which was well attended by students and professors, the REU participants presented the results of their research. To read more about their individual research projects, please click on the links below.

Jessie Bryant: Laser Power Beaming in Smart Homes
Maddie Chili: On the Ability to Provide Demonstrations on a UAS

RoboSub Club of the Palouse competes in the 18th Annual International RoboSub Competition

Picture from WSU News

RoboSub Club of the Palouse is gearing up to compete in the annual International RoboSub Competition in San Diego. The members are leaving Pullman on Saturday, July 18th, for the week-long event. James Irwin, one of the IRL Lab’s graduate students, will also be heading to San Diego as a member of the Electrical Engineering team.

The team has recently been featured in WSU News and KLEWTV. Click on the links to read the articles.

Good luck to the team and Go Cougs!!!

Controlling the AR Drone with the Microsoft Kinect 2.0

This semester I created a C# program to control the ARDrone using the Microsoft Kinect 2.0 on Windows 8.

At the beginning of the semester I spent most of my time figuring out how to set up the Kinect 2.0. Once I discovered that my laptop was not compatible, I had to research one that was; Dr. Matthew Taylor very generously offered to buy me a laptop to use for the duration of my project. My original project was to use multiple Kinects to track the coordinates of an ARDrone in the Intelligent Robot Learning Lab to simulate GPS, but after doing a bit of research I realized that this was not a good use of the Kinects, so we redefined my goal: create a hand-motion controller for the ARDrone using the Microsoft Kinect 2.0.

My first task was to set up the Kinect 2.0 hand motions. This was difficult at first because I had never used C# before, and because I had to learn the Kinect SDK 2.0, which at that point was still in development and whose documentation had not been completed. To learn the Kinect SDK 2.0 and C#, I took some time to run through four of the tutorials listed on the official Windows Kinect 2.0 website. After running through these tutorials I determined that skeletal tracking would be a better solution than gesture tracking, which would have required taking hundreds of videos with the Kinect so it could recognize my customized gestures. Once I had finished all the skeletal-tracking gestures, I tested them by drawing a green circle on the screen when they succeeded and a red circle when they failed.
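In practice, a skeletal-tracking check like the ones described above boils down to a geometric test on joint coordinates. The sketch below is a minimal illustration of the idea, not the Kinect SDK 2.0 API: the `JointPos` struct and method names are hypothetical stand-ins for the SDK’s body-frame joint data.

```csharp
using System;

// Hypothetical stand-in for the Kinect SDK 2.0's joint data
// (the real SDK exposes per-body joints with camera-space positions).
struct JointPos
{
    public float X, Y, Z;
    public JointPos(float x, float y, float z) { X = x; Y = y; Z = z; }
}

static class GestureCheck
{
    // A skeletal-tracking "gesture" is just a geometric test on joint
    // coordinates, e.g. "hand raised above the head by some margin".
    public static bool HandRaised(JointPos hand, JointPos head, float margin = 0.1f)
    {
        return hand.Y > head.Y + margin; // Y increases upward in camera space
    }

    // Visual feedback used while testing the gestures:
    // green circle on success, red circle on failure.
    public static string FeedbackColor(bool detected)
    {
        return detected ? "green" : "red";
    }
}
```

Each frame, the program would run tests like `HandRaised` against the latest joint positions and draw the corresponding feedback circle.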

My next task was to figure out how to get information from the Kinect project to the ARDrone. I went through multiple open-source C# projects that claimed they could pilot the ARDrone, and even considered building a server to talk to my Linux machine so that I could use ROS to fly the drone. Figuring out the next step was the most difficult part of the project, because I had no guarantee that either of these solutions would work, and I wanted to make sure I finished before the end of the semester. After doing a lot of research I found Roslan’s AR.Drone 2.0 C# library. Getting it to work was extremely hard, because I had to learn how to set up references in C# using Visual Studio 2013, and because there was no documentation or tutorial for the library, so I ended up spending a large amount of time reading through its code to figure out how to actually set up and fly the ARDrone.

Finally, once I had both the flight project and the Kinect project, I combined the two and rewrote the code a few times so that multiple flight commands could be processed in a single message. Once all the flight controls were working, I spent the last two weeks of the project trying to get a video feed from the ARDrone into a Windows Forms application. I made multiple attempts, including using ffmpeg to stream to a separate video player and using Roslan’s FFMPEG project, but I was unable to figure this part out.
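The "multiple flight commands in a single message" idea amounts to collapsing all of the gestures detected in one Kinect frame into one combined movement command, rather than sending a separate packet per gesture. Here is a hedged sketch of that aggregation step; the `FlightCommand` type, field names, signs, and speed value are my own illustration, not Roslan’s library API.

```csharp
using System;

// Hypothetical single flight message: all four axes are set at once,
// so e.g. "forward + climb" goes out as one command, not two.
struct FlightCommand
{
    public float Pitch; // forward / backward
    public float Roll;  // left / right
    public float Yaw;   // rotate
    public float Gaz;   // climb / descend
}

static class CommandBuilder
{
    // Collapse the gesture flags detected in one Kinect frame into a
    // single combined command (signs and scale chosen arbitrarily here).
    public static FlightCommand Combine(bool forward, bool back,
                                        bool left, bool right,
                                        bool up, bool down,
                                        float speed = 0.2f)
    {
        var cmd = new FlightCommand();
        if (forward) cmd.Pitch += speed;
        if (back)    cmd.Pitch -= speed;
        if (right)   cmd.Roll  += speed;
        if (left)    cmd.Roll  -= speed;
        if (up)      cmd.Gaz   += speed;
        if (down)    cmd.Gaz   -= speed;
        return cmd;
    }
}
```

With this shape, simultaneous gestures such as "both hands forward and raised" translate into one message with nonzero pitch and gaz, instead of two conflicting messages sent back to back.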

I found this project very beneficial, and I also had a great time working on it. From it I learned how to program in C#, how to program with the Kinect 2.0, how to program with the ARDrone on Windows, and how to create a GUI in C#.

I think the next step for this project should be adding a video stream so that the ARDrone can be piloted using its camera. I also think it would be cool to add voice commands to the Kinect project for takeoff and landing.