Intelligent Robot Learning Laboratory (IRL Lab)

Kory Mathewson visits IRL Lab

What started as a simple email exchange between Kory and Dr. Taylor ended with an invitation to visit the IRL Lab.

Kory Mathewson, a PhD student from the University of Alberta and a member of the Reinforcement Learning and Artificial Intelligence research lab, visited WSU from February 22-24. During his short visit, he gave an engaging guest lecture to the students of the Introduction to Robotics class. He also gave a departmental talk on “Developing Machine Intelligence to Improve Bionic Limb Control,” which drew a diverse audience.

Read more about Kory’s visit in his own words on his blog post.

1st Annual Hardware Hackathon

Organized by the IEEE Student Chapter, the Robotics Club, and the Palouse RoboSub Club, the 1st Annual Hardware Hackathon was held in the Intelligent Robot Learning Laboratory and the Frank Innovation Zone on November 14-15, 2015. The theme for this year’s event was “Internet of Things.” More than 25 teams, comprising 75 total participants, attended the event. Many majors were represented, including Computer Science, Computer Engineering, Electrical Engineering, Mechanical Engineering, and Materials Science Engineering.

Teams had 24 hours to plan, design, and build their projects. At the end, all teams presented their work to a panel of judges drawn from industry and academia.

This event was sponsored by Digilent Inc., Voiland College of Engineering and Architecture, and School of Electrical Engineering and Computer Science.

To view this year’s winners, please visit the Hardware Hackathon’s website.


REU participants showcase their research projects

Jessie Bryant during the poster session.

The IRL Lab had two REU participants this year: Jessie Bryant, a junior Mechanical Engineering student from WSU, and Maddie Chili, a junior Computer Science and Communications Design student from Elon University.

The REU program allows undergraduate students to experience what it takes to do research under the guidance of a top research professor and to work alongside graduate students.

At yesterday’s poster session, which was well attended by students and professors, the REU participants presented the results of their research. To read more about their individual research projects, please click the links below.

Jessie Bryant: Laser Power Beaming in Smart Homes
Maddie Chili: On the Ability to Provide Demonstrations on a UAS

RoboSub Club of the Palouse competes in the 18th Annual International RoboSub Competition

Picture from WSU News

RoboSub Club of the Palouse is gearing up to compete in the annual International RoboSub Competition in San Diego. The members leave Pullman on Saturday, July 18th, for the week-long event. James Irwin, one of the IRL Lab’s graduate students, will also be heading to San Diego as a member of the Electrical Engineering team.

The team was recently featured in WSU News and on KLEWTV. Click the links to read the full articles.

Good luck to the team and Go Cougs!!!

Controlling the AR Drone with the Microsoft Kinect 2.0

This semester I created a C# program to control the ARDrone using the Microsoft Kinect 2.0 on Windows 8.

At the beginning of the semester I spent most of my time figuring out how to set up the Kinect 2.0. Once I discovered that my laptop was not compatible, I had to research one that was. Dr. Matthew Taylor very generously offered to buy me a laptop to use for the duration of my project. My original project was to use multiple Kinects to track the coordinates of an ARDrone in the Intelligent Robot Learning Lab to simulate GPS, but after doing a bit of research I realized the Kinects were not well suited to that task, so we redefined my goal: create a hand-motion controller for the ARDrone using the Microsoft Kinect 2.0.

My first task was to set up the Kinect 2.0 hand motions. This was difficult at first because I had never used C# before and because I had to learn the Kinect SDK 2.0, which at that point was still in development and whose documentation had not been completed. To learn the Kinect SDK 2.0 and C#, I worked through four of the tutorials listed on the official Windows Kinect 2.0 website. After running through these tutorials, I determined that skeletal tracking would be a better solution than gesture tracking, which would have required me to record hundreds of videos with the Kinect so it could recognize my customized gestures. Once I had finished all the skeletal-tracking gestures, I tested them by drawing a green circle on the screen when they succeeded and a red circle when they failed.
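The gesture checks themselves were written in C# against the Kinect SDK 2.0’s skeletal (Body) data, but the underlying idea is just comparing joint coordinates. Here is a minimal Python sketch of that idea; the joint names, thresholds, and gesture-to-action mapping are illustrative assumptions, not the project’s actual code:

```python
# Hedged sketch: recognizing simple hand gestures from skeleton joints.
# The real project used C# and the Kinect SDK 2.0's Body/Joint types;
# here a skeleton is just a dict of (x, y, z) tuples in camera space,
# with y pointing up (as in the Kinect SDK), in meters.

def detect_gesture(joints, threshold=0.15):
    """Classify a simple hand gesture by comparing joint heights."""
    head_y = joints['head'][1]
    right_y = joints['right_hand'][1]
    left_y = joints['left_hand'][1]
    if right_y > head_y + threshold and left_y > head_y + threshold:
        return 'takeoff'        # both hands raised above the head
    if right_y > head_y + threshold:
        return 'turn_right'     # only the right hand raised
    if left_y > head_y + threshold:
        return 'turn_left'      # only the left hand raised
    return 'hover'              # no gesture recognized

# Example skeleton: both hands well above the head.
joints = {'head': (0.0, 0.6, 2.0),
          'right_hand': (0.3, 0.9, 2.0),
          'left_hand': (-0.3, 0.9, 2.0)}
print(detect_gesture(joints))   # -> takeoff
```

In the actual project the result of each check drove the green/red feedback circles described above.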

My next task was to figure out how to get the information from my Kinect project to the ARDrone. I went through multiple open-source C# projects that claimed they could pilot the ARDrone, and even considered building a server to talk to my Linux machine so that I could use ROS to fly it. Figuring out the next step was the most difficult part of the project, because I had no guarantee that either of these solutions would work and I wanted to make sure I finished before the end of the semester. After doing a lot of research I found Roslan’s AR.Drone 2.0 C# library. Getting it to work was extremely hard, because I had to learn how to set up references in C# using Visual Studio 2013, and because there was no documentation or tutorial for the library, so I ended up spending a large amount of time reading through its code to figure out how to actually set up and fly the ARDrone.
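Under the hood, what libraries like this wrap is fairly simple: the AR.Drone 2.0 is driven by plain-text “AT” commands sent over UDP to port 5556 on the drone. As a hedged illustration (not the C# library’s API), here is a Python sketch that builds two of the documented commands; the constants come from the AR.Drone developer documentation, and the networking is omitted:

```python
import struct

# Hedged sketch of the AR.Drone 2.0 AT command format (UDP port 5556).
# These are the documented control bit patterns for AT*REF.
REF_TAKEOFF = 0x11540200  # 290718208
REF_LAND = 0x11540000     # 290717696

def f2i(x):
    """AT*PCMD encodes each float as the int32 with the same IEEE-754 bits."""
    return struct.unpack('<i', struct.pack('<f', x))[0]

def at_ref(seq, takeoff):
    """Take off or land (AT*REF); seq is a monotonically increasing counter."""
    return 'AT*REF=%d,%d\r' % (seq, REF_TAKEOFF if takeoff else REF_LAND)

def at_pcmd(seq, roll, pitch, gaz, yaw):
    """Progressive movement command; arguments are floats in [-1, 1]."""
    return 'AT*PCMD=%d,1,%d,%d,%d,%d\r' % (
        seq, f2i(roll), f2i(pitch), f2i(gaz), f2i(yaw))

print(at_ref(1, True))          # AT*REF=1,290718208
print(at_pcmd(2, 0.0, 0.0, 0.5, 0.0))
```

In a real client the strings would be sent with a UDP socket, with the sequence number incremented on every command; the drone ignores commands whose sequence number is not greater than the last one it saw.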

Finally, when I had both my flight project and my Kinect project working, I combined the two and rewrote the code a few times so that multiple flight commands could be processed in a single message. Once all the flight controls were working, I spent the last two weeks of the project trying to figure out how to get a video feed from the ARDrone into a Windows Forms application. I made multiple attempts, including using ffmpeg to stream to a separate video player and using Roslan’s FFMPEG project, but I was unable to figure this part out.
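Combining the two halves comes down to turning skeleton measurements into control values and batching commands: the AR.Drone protocol allows several AT commands, each terminated by a carriage return, to be concatenated into one UDP packet. The sketch below illustrates both ideas in Python; the joint-to-control mapping, the gain, and the sample coordinates are hypothetical, not the project’s actual tuning:

```python
# Hedged sketch: map a hand position to flight-control values, and pack
# several AT command strings into one UDP payload. The mapping and gain
# here are illustrative assumptions, not the project's actual code.

def hand_to_controls(hand, shoulder, gain=2.0):
    """Map the right hand's offset from the shoulder to (roll, pitch).

    Moving the hand sideways rolls the drone; pushing it forward
    (toward the Kinect, i.e. smaller z) pitches it forward.
    """
    dx = hand[0] - shoulder[0]   # left/right offset -> roll
    dz = shoulder[2] - hand[2]   # forward push -> pitch
    clamp = lambda v: max(-1.0, min(1.0, v))
    return clamp(gain * dx), clamp(gain * dz)

def build_packet(commands):
    """Join several '\r'-terminated AT command strings into one UDP payload."""
    return ''.join(commands).encode('ascii')

roll, pitch = hand_to_controls((0.45, 0.4, 1.7), (0.2, 0.5, 2.0))
print(roll, pitch)

# Two commands (a watchdog reset and a land) in a single message.
packet = build_packet(['AT*COMWDG=1\r', 'AT*REF=2,290717696\r'])
print(packet)
```

Batching like this keeps the drone’s command watchdog fed while still delivering the latest movement command in the same datagram.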

I found this project very beneficial, and I also had a great time working on it. From it I learned how to program in C#, how to program with the Kinect 2.0, how to program with the ARDrone on Windows, and how to create a GUI in C#.

I think the next step for this project should be adding a video stream so that the ARDrone can be piloted using its camera. I also think it would be cool to add voice commands to the Kinect project for taking off and landing.