In my work at Rakuten Institute of Technology, I developed a ROS wrapper and tools for the Dorna Arm. Rakuten has released this work under the MIT license. You can find the code and documentation on my GitHub page, and more information about the Dorna Arm at dorna.ai
The Planar Hopper is a 3-DoF (x, z, pitch), power-autonomous, single-legged hopper driven by two direct-drive motors in parallel. The robot was designed to test the stability of combining controllers created in isolation. I implemented the active damping hopping controller described in A. De and D. E. Koditschek, "Vertical hopper compositions for preflexive and feedback-stabilized quadrupedal bounding, pacing, pronking, and trotting," the fore-aft controller described in M. Raibert, Legged Robots that Balance (1986), and a PD controller to correct the pitch of the robot. I worked on this project for an independent study during my sophomore year at the University of Pennsylvania under the mentorship and guidance of Dr. Dan Koditschek and Dr. Avik De.
Figure 1: Pitch controller implementation and testing in isolation
Figure 2: Pitch and Hopping controller combined testing on linear rail
Figure 3: Fore-aft and Hopping controllers combined testing
Figure 4: Pitch, Fore-aft and Hopping controllers integrated all together
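The two corrective controllers above can be sketched in a few lines. This is a minimal illustration of the classic Raibert fore-aft foot placement rule and a PD pitch correction; the function names, gains, and signs are my own illustrative assumptions, not the values or code used on the robot.

```python
# Illustrative sketch of the stance corrections described above.
# Gains and conventions are assumptions, not the robot's actual parameters.

def raibert_foot_placement(x_dot, x_dot_des, stance_time, k_xdot=0.05):
    """Raibert fore-aft rule: touch down at the 'neutral point' (half the
    stance travel at the current speed) plus a velocity-error correction."""
    neutral_point = x_dot * stance_time / 2.0
    return neutral_point + k_xdot * (x_dot - x_dot_des)

def pd_pitch_torque(pitch, pitch_rate, kp=8.0, kd=0.5):
    """PD controller driving body pitch (and pitch rate) back to zero."""
    return -kp * pitch - kd * pitch_rate
```

At the desired speed the correction term vanishes and the foot lands at the neutral point, which is what makes the rule stable to compose with the vertical hopping controller.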
The Delta Hopper is an ongoing research project in Kod*Lab at the University of Pennsylvania. It aims to use an unconventional leg design with three electromechanical motors in parallel to create a free-running monopedal hopper. I worked on this project for two years and developed several design iterations, simulations, and controllers for the system. To see more on this project click here.
Iteration 1: Tries to maximize the force and workspace of the end effector
Iteration 2: Tries to maximize the body-to-leg inertia ratio to reduce the inertial effects of the leg
Iteration 3: Makes compromises between the previous designs to improve overall performance
Stabilize was a Senior Design project at the University of Pennsylvania by Jared Sobel, Devin Caplow-Munro, Ilana Tiecher, Langston Macdiarmid, Sean Cohen, and myself.
Stabilize is a 4-DoF camera stabilizer that integrates non-invasively with existing legged robots. Much like a chicken isolates its head from the motion of its torso, Stabilize seeks to isolate an onboard camera from the motion of the robot. It is also designed to be robust and energy efficient so that it does not limit the distance or terrain that RHex can traverse.
Stabilize uses high-precision motor controllers and brushless direct-drive motors to actively stabilize and track Roll, Pitch, and Yaw (RPY) as well as Z translation of the sensor. The three motors controlling RPY are axially aligned to reduce power consumption, and a spring in parallel with the Z-axis motor compensates for gravity and helps to passively stabilize at higher frequencies.
The system is validated by measuring the reduction in maximum rotational and translational velocity of the sensor. This in turn reduces frame loss and motion blur, two key factors in vision performance.
To see the full report click here.
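The Z-axis gravity compensation above amounts to choosing a spring preload so the static load cancels out, leaving the motor to supply only dynamic forces. A back-of-the-envelope sketch, with illustrative numbers rather than the actual Stabilize parameters:

```python
# Sizing sketch for the Z-axis gravity-compensation spring.
# Masses, spring rates, and names here are illustrative assumptions.

G = 9.81  # gravitational acceleration, m/s^2

def spring_preload(payload_mass_kg, spring_rate_n_per_m):
    """Pre-compression (m) at which the spring force equals the payload weight,
    so the Z motor holds zero current at equilibrium."""
    return payload_mass_kg * G / spring_rate_n_per_m

def residual_motor_force(payload_mass_kg, spring_rate_n_per_m, preload_m, z_m):
    """Static force (N) the Z motor must supply at displacement z once the
    preloaded spring is fitted: weight minus remaining spring force."""
    return payload_mass_kg * G - spring_rate_n_per_m * (preload_m - z_m)
```

With the preload chosen this way, the residual force is zero at equilibrium and grows linearly with displacement, which is why the spring also acts as a passive stabilizer at frequencies above the motor's control bandwidth.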
This project was a combined effort by Tyler Altenhofen and myself for our final project for ESE 650: Learning in Robotics. We investigated localization and mapping of a robot using acoustic signals. In our experiments we had four speakers arranged in an arbitrary configuration, plus our microphone, which had line of sight to each speaker. We were able to get both the mapping and the localization to work relatively well individually, but we were unable to integrate them into simultaneous localization and mapping (SLAM) due to high noise in our measurements.
To see the full report click here.
Figure 1: A graphic of our speaker array for our test setup
Figure 2: Probability map of the first speaker given a microphone location and distance.
Figure 3: Probability map of speaker location given 1, 2, 5, and 20 samples respectively
Figure 4: Applying an odometry update and noise to our particle filter
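The probability maps in Figures 2 and 3 can be sketched as a grid of likelihoods: each range sample defines a Gaussian ring around the microphone, and multiplying rings from successive samples sharpens the map toward the speaker. This is a minimal sketch assuming Gaussian range noise; the grid, noise level, and function names are my assumptions, not our actual implementation.

```python
import numpy as np

def speaker_likelihood(grid_x, grid_y, mic_pos, measured_dist, sigma=0.1):
    """Likelihood of each candidate speaker cell given one range sample:
    a Gaussian ring of radius `measured_dist` centered on the microphone."""
    dist = np.hypot(grid_x - mic_pos[0], grid_y - mic_pos[1])
    return np.exp(-0.5 * ((dist - measured_dist) / sigma) ** 2)

def fuse_samples(grid_x, grid_y, mic_positions, measured_dists, sigma=0.1):
    """Multiply per-sample likelihoods (as in Fig. 3); the ring intersection
    concentrates probability at the speaker as samples accumulate."""
    post = np.ones_like(grid_x, dtype=float)
    for mic, d in zip(mic_positions, measured_dists):
        post *= speaker_likelihood(grid_x, grid_y, mic, d, sigma)
    total = post.sum()
    return post / total if total > 0 else post
```

Two samples from different microphone positions already pin the speaker down to the intersection of two rings, which is the mechanism behind the sharpening seen across the 1-, 2-, 5-, and 20-sample panels.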
For an independent study during my master's work, with the support of Dr. Dan Koditschek and PhD candidate Vassilis Vasilopoulos, I investigated state-of-the-art vision systems and integrated VIO software to improve robot state estimation. I implemented the system in ROS, which was my first exposure to the framework. During the independent study I was only able to integrate and test the system on the TurtleBot; I continued the investigation during the summer after graduation and implemented an improved system on the Minitaur robot by Ghost Robotics. The final report can be found here. (Note: The report only covers the work completed during the school semester.)
Figure 1: This setup includes the Intel RealSense ZR300 depth camera angled 45 deg toward the ground and the TurtleBot 2E using the Kobuki base. (Note: The Hokuyo lidar pictured was not used in my experiments.)
Figure 2: This setup has the Intel RealSense ZR300 depth camera angled 45 deg toward the ground and mounted on the Ghost Robotics Minitaur robot.
This is an ongoing project of mine. I am currently designing and manufacturing the frame for the mirror, and I have implemented and modified the open-source software from the MagicMirror GitHub community on a Raspberry Pi with a 7" touch display. The mirror I am designing is a small desk mirror that can also serve as a smart photo frame and a display for other useful information. When the project is done, I will release everything for public use as well.