Intelligent Robot Learning Laboratory (IRL Lab) Undergraduate Research Posters

On the Ability to Provide Demonstrations on a UAS

Maddie Chili
BS Computer Science and Communication Design (2017)
Elon University
REU Participant (2015)

Abstract: Human-Robot Interaction (HRI) is a relatively young field, but it is growing rapidly. The field is expanding beyond academic communities as more people encounter robots in their everyday lives, such as robotic toys and household appliances, and robots are being studied and developed for real-world applications such as education and health care. To further this development, we designed an experiment to determine the effect that instructions can have on an individual’s performance and interaction with an AR Drone 2.0. We hypothesize that the instructions will significantly change how participants pilot the AR Drone 2.0. First, participants went through a brief training session on how to fly the unmanned aircraft system (UAS). Then we split the participants into two groups that received different instructions, alternating between the two instruction sets for every other participant: we told one group the UAS was an inexpensive toy and the other that it was an expensive piece of research equipment. We told both groups to fly the UAS as quickly as possible without hitting any of the obstacles. Participants then flew the UAS through a specified obstacle course: flying to the right of a pole, then through and to the right of a hula-hoop, and finally around the left side of the first pole before landing in a green square. We examined the time and accuracy of each flight. After the course, participants answered a short survey covering age, gender, experience with UASs, experience with video games, and their thoughts during the experiment, such as nervousness and their belief about the value of the UAS. Experiments are ongoing at the time of this abstract submission, so we cannot draw any conclusions yet.

Laser Power Beaming in Smart Homes

Jessie Bryant
BS Mechanical Engineering (2017)
Washington State University
REU Participant (2015)

Abstract: Laser power beaming is a method of charging devices by transmitting energy via the high-density light of a laser. This project focuses on using this technology to power sensors in smart homes, eliminating the need for wired connections and battery replacement: the sensors can instead be charged remotely. The system involves two major parts, the base and the receiver. The base houses the camera, computer system, low-power visible laser, high-power near-infrared laser, and beam-directing mirrors driven by small motors. The receiver’s main component is a set of silicon-based vertical-multijunction photovoltaic cells designed for high light concentrations and high voltage outputs. These cells are centered between two LEDs used for positioning and are part of a circuit connected to a capacitor that stores the acquired energy. To charge, the camera first locates the visible laser and the LEDs, signals the motors, and lines the laser up with the midpoint of the LEDs, targeting the receiving cells. Once in place, the visible laser is switched off while the near-infrared laser powers on and begins transmitting energy. By our calculations, this system is capable of charging sensors within smart homes. With minor alterations, it could serve many other purposes as well, whether the targets are stationary or moving. Over 10 million iRobot Roombas have been purchased worldwide, and in the future this home charging system could be integrated into a similar autonomous multipurpose robot.
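
A minimal sketch of the aiming-and-handoff sequence described above, assuming OpenCV is available. The camera, mirror-motor, and laser interfaces (capture, step, on, off) are hypothetical stand-ins for this system’s drivers, and the brightness heuristics and pixel tolerance are assumptions, not the project’s actual values.

```python
import cv2
import numpy as np

ALIGN_TOLERANCE_PX = 3  # assumed: how close (in pixels) counts as "lined up"

def find_laser_spot(frame_bgr):
    # Assumption: the visible laser dot is the brightest red pixel in view.
    _, _, _, loc = cv2.minMaxLoc(frame_bgr[:, :, 2])
    return np.array(loc, dtype=float)

def find_led_midpoint(frame_bgr):
    # Assumption: the two positioning LEDs are the brightest green pixels;
    # their centroid approximates the midpoint between them, i.e. the cells.
    _, mask = cv2.threshold(frame_bgr[:, :, 1], 240, 255, cv2.THRESH_BINARY)
    ys, xs = np.nonzero(mask)
    return np.array([xs.mean(), ys.mean()])

def align_and_charge(camera, mirrors, visible_laser, ir_laser):
    visible_laser.on()                    # aim with the low-power visible beam
    while True:
        frame = camera.capture()
        error = find_led_midpoint(frame) - find_laser_spot(frame)
        if np.linalg.norm(error) < ALIGN_TOLERANCE_PX:
            break                         # beam centered on the receiving cells
        mirrors.step(error)               # nudge the beam-directing mirrors
    visible_laser.off()                   # aiming done: switch beams so the
    ir_laser.on()                         # near-infrared laser transmits energy
```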

Quadcopters in Artificial Intelligence Research

Maher M. Abujelala
BS Computer Engineering (2014)
Washington State University


In this project, a quadcopter (quad-rotor helicopter) is used to showcase how camera data can be interpreted into a set of corrective commands for autonomous flight. The quadcopter’s built-in camera has been programmed to work as a tracking system for places where navigation systems are not accessible. The project is still in progress: at the moment the tracking system is complete, and we can detect objects up to 25 feet away.
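
The abstract does not detail how the tracking output drives the flight corrections, so the following is an illustrative sketch only: a simple proportional mapping from a tracked object’s pixel offset to yaw and climb commands, with made-up gains.

```python
def correction_from_tracking(frame_w, frame_h, obj_x, obj_y,
                             k_yaw=0.002, k_alt=0.002):
    """Map a tracked object's pixel position to flight corrections.

    The gains k_yaw and k_alt are hypothetical and would need tuning.
    """
    err_x = obj_x - frame_w / 2.0   # positive: object is right of center
    err_y = obj_y - frame_h / 2.0   # positive: object is below center
    yaw_rate = k_yaw * err_x        # turn toward the object
    climb_rate = -k_alt * err_y     # climb if the object sits above center
    return yaw_rate, climb_rate

# Example: a 640x360 frame with the object at (500, 150) yields a right
# turn and a slight climb.
yaw, climb = correction_from_tracking(640, 360, 500, 150)
```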

Presented during WSU Showcase for Undergraduate Research and Creative Activities (SURCA) 2014

Can the AR Drone Fly Solo?

Denzel Hamilton
University of Maryland Baltimore County
REU Participant (2013)


Abstract: The AR Drone, created by Parrot, does not ship with an autonomous flying mode in which the quadcopter can follow something while maintaining the same distance at all times. We use image processing to convert the AR Drone’s default video feed to black and white: a pixel is turned white if its color matches the object we want the quadcopter to follow, while pixels of all other colors are turned black. Using the white pixels in the image, we have the quadcopter steer toward their (approximate) middle. We then use the dimensions of the camera image to compute the ratio of white pixels to overall pixels. This ratio is what maintains the distance, because the quadcopter is programmed to move so as to hold that ratio at all times.
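
A minimal sketch of one loop iteration of the follow-me behavior described above, assuming OpenCV. The HSV color bounds, target ratio, gains, and the drone’s hover/move interface are assumptions for illustration, not the original implementation.

```python
import cv2
import numpy as np

TARGET_RATIO = 0.05  # assumed: white-pixel fraction the drone tries to hold
LOWER = np.array([100, 120, 70])    # assumed HSV bounds for the
UPPER = np.array([130, 255, 255])   # target object's color (blue-ish)

def follow_step(frame_bgr, drone):
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, LOWER, UPPER)     # target color -> white (255),
    ys, xs = np.nonzero(mask)                 # everything else -> black (0)
    if xs.size == 0:
        drone.hover()                         # target lost: hold position
        return
    cx = xs.mean()                            # middle of the white pixels
    ratio = xs.size / mask.size               # white pixels : overall pixels
    yaw = 0.002 * (cx - mask.shape[1] / 2.0)  # turn toward the target
    pitch = 0.5 * (TARGET_RATIO - ratio)      # too few white pixels -> move
    drone.move(pitch=pitch, yaw=yaw)          # closer; too many -> back away
```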