I am a roboticist currently working as a Senior Software Engineer in Robotics at Bossa Nova Robotics, where we are bringing autonomous robots into public spaces around the world. Before joining Bossa Nova, I earned my Ph.D. in Robotics from Georgia Tech (via the Healthcare Robotics Lab). My graduate research focused on enabling non-expert users with severe motor impairments to effectively operate human-scale, general-purpose robots for self-care and everyday tasks. The complexity of these robots gives them the potential to provide economical, 24/7 assistance to diverse users for a variety of tasks, but also makes them difficult for non-experts to control. My dissertation worked to change this through the design and evaluation of accessible interfaces and intelligent robotic capabilities. I described the results of a multi-year, user-centered research and design process, including both a laboratory evaluation with 15 non-expert users with profound motor impairments and a seven-day deployment of the robot in the home of another individual with quadriplegia.
When not working on making robots more capable, I like to spend my time with my wife, Heather, and our dog, Seamus. On most nights, you will find the three of us in the kitchen trying new recipes or perfecting old ones (with Seamus making sure any dropped food never hits the floor). Aside from that, I am often working on at least one side-project, which might range from woodworking to embedded microcontrollers, or possibly both. My most recent build was a small digital clock, custom-designed and built from the PCB layout to the wooden case carved from pen blanks.
Heather and I are active in our local church, and enjoy hiking and working on our home together. I recently enjoyed installing custom under-cabinet LED lighting in our kitchen, and had enough spare components to add lights in our basement hallway as well. As we are originally from Vandalia, OH, after a few years living in the Atlanta heat, we are enjoying Pittsburgh's snowy winters, though Seamus is still not convinced. Every fall we follow college football, especially the Big Ten. Being from Ohio, I have always been an Ohio State Buckeyes fan, even though Heather, after attending Luther College in Iowa, now supports the Hawkeyes.
Robots for Humanity: A Case Study in Assistive Mobile Manipulation. TL Chen, M Ciocarlie, S Cousins, PM Grice, K Hawkins, K Hsiao, CC Kemp, C-H King, DA Lazewatsky, A Leeper, H Nguyen, A Paepcke, C Pantofaru, WD Smart, L Takayama. IEEE Robotics and Automation Magazine (RAM), Special Issue on Assistive Robotics. Vol. 20. March 2013. Authors listed alphabetically.
A Technique to Measure Optical Properties of Brownout Clouds for Modeling Terahertz Propagation. ST Fiorino, JA Deibel, PM Grice, MH Novak, J Spinoza, L Owens, S Ganti. Applied Optics. 51(16): 3605-3613. June 2012.
Autobed: Open Hardware for Accessible Web-based Control of an Electric Bed. PM Grice, Y Chitalia, M Rich, HM Clever, CC Kemp. Rehabilitation Engineering and Assistive Technology Society of North America, 2016 Annual Conference (RESNA 2016). Washington, DC, USA. July 2016. [Poster]
Assistive Mobile Manipulation for Self-Care Tasks Around the Head. KP Hawkins, PM Grice, TL Chen, C-H King, CC Kemp. 2014 IEEE Symposium on Computational Intelligence in Robotic Rehabilitation and Assistive Technologies. Orlando, FL, USA. December 2014.
Whole-Arm Tactile Sensing for Beneficial and Acceptable Contact During Robotics Assistance. PM Grice, MD Killpack, A Jain, S Vaish, J Hawke, CC Kemp. Rehabilitation Robotics (ICORR), 2013 IEEE International Conference on. Seattle, WA, USA. June 2013.
Robots for Humanity: User-Centered Design for Assistive Mobile Manipulation. TL Chen, M Ciocarlie, S Cousins, PM Grice, K Hawkins, K Hsiao, CC Kemp, C-H King, DA Lazewatsky, A Leeper, H Nguyen, A Paepcke, C Pantofaru, WD Smart, L Takayama. Intelligent Robots and Systems (IROS), 2012 IEEE/RSJ International Conference on. pp. 2434-2435. Vilamoura, Algarve, Portugal. October 2012. Authors listed alphabetically.
The Wouse: A Wearable Wince Detector to Stop Assistive Robots. PM Grice, A Lee, H Evans, CC Kemp. 21st IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN). pp. 165-172. Paris, France. September 2012.
Lab Measurements to Support Modeling Terahertz Propagation in Brownout Conditions. ST Fiorino, PM Grice, MJ Krizo, RJ Bartell, JD Haiducek, SJ Cusumano. SPIE Defense, Security, and Sensing. Orlando, FL, USA. April 2010.
Modeling THz Propagation in Brownout Conditions. PM Grice. 12th Annual Directed Energy Symposium. San Antonio, TX, USA. November 3, 2009.
Image Based BRDF Acquisition. PM Grice. 11th Annual Directed Energy Symposium. Honolulu, HI, USA. November 18, 2008.
Assistive Mobile Manipulation: Designing for Operators with Motor Impairments. PM Grice, CC Kemp. RSS 2016 Workshop: Socially & Physically Assistive Robotics for Humanity. Ann Arbor, MI, USA. June 18, 2016.
A Robotic System for Reaching in Dense Clutter that Integrates Model Predictive Control, Learning, Haptic Mapping, and Planning. T Bhattacharjee, PM Grice, A Kapusta, MD Killpack, D Park, CC Kemp. IROS 2014 Workshop: 3rd Workshop on Robots in Clutter: Perception and Interaction in Clutter. Chicago, IL, USA. September 18, 2014. Authors listed alphabetically except CC Kemp.
Robots for Humanity: Developing Assistive Mobile Manipulation. PM Grice, TL Chen, M Ciocarlie, S Cousins, K Hawkins, K Hsiao, C-H King, D Lazewatsky, A Leeper, H Nguyen, A Paepcke, C Pantofaru, W Smart, L Takayama, CC Kemp. 2012 Biomedical Engineering Society Annual Meeting: Assistive Technology & Robotics in Rehabilitation Engineering. Atlanta, GA, USA. October 2012. Authors listed alphabetically except first, last.
Modeling THz Propagation in Brownout Conditions. PM Grice. Directed Energy Education Workshop. San Antonio, TX, USA. November 6, 2009.
2008 AFIT DE Summer Intern Program. M Houle, B McClung, PM Grice. Directed Energy Education Workshop. Honolulu, HI, USA. November 21, 2009.
The focus of my Ph.D. dissertation is the design and evaluation of an accessible interface to enable individuals with profound motor impairments to operate a complex assistive robot. The interface is web-based, and can be operated via a single-button mouse. This makes it accessible to non-expert users, as no installation is required, and many individuals with profound motor impairments can operate a single-button mouse via accessible Human-Computer Interface devices, such as eye-gaze tracking, head tracking, speech controls, or simple touchpads.
The greatest challenge in designing such a system is mapping the 2D screen surface onto control of the PR2's 20 degrees of freedom, while keeping the system accessible to non-technical users with little or no training. To accomplish this, I designed a video-centric interface with augmented reality features, which provides a consistent perspective and more natural interaction. Multiple interface modes break down the control space into manageable segments. The modes are visually and functionally distinct to avoid confusion, and each is designed to allow control of the relevant portion of the robot in a natural and efficient manner.
I have also designed and conducted an evaluation of this system. In my study, 15 individuals with profound motor impairments, arising from a variety of causes, operated the robot effectively from across the US using a variety of accessible interface devices. After only limited training, participants demonstrated a clinically significant improvement in a modified clinical manipulation test, as well as success in a simulated self-care task. I also deployed this system for seven days in the home of Henry and Jane Evans. During this time, Henry used the robot in 16 separate sessions, totaling over 22.5 hours, to complete a variety of tasks both for his own care and around the home. The results of this exploratory study highlight both near-term opportunities for assistive robots in the home, as well as remaining challenges to near-term adoption. Additional results from both of these studies will be presented in my upcoming Ph.D. dissertation defense and associated academic publications.
The Autobed device enables an individual to operate an Invacare Full Electric Homecare Bed via a web-based interface. This makes important functionality accessible to individuals, such as Henry Evans, whose motor impairments prevent them from adjusting their bed using the standard control pendant. This capability can improve comfort and reduce the need for able-bodied caregivers to adjust the bed, and may help to prevent the formation of bedsores.
In June 2014, I mentored Henry Clever in developing an initial prototype using Arduino, XBee, and a PySerial-based interface on Henry's Mac laptop. I later developed and deployed a web-based GUI to simplify control for the user.
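The control path in that prototype is simple: a button press in the GUI becomes a single command forwarded over a serial link to the Arduino, which actuates the bed. A minimal Python sketch of the PySerial side, where the command names and byte values are illustrative assumptions rather than the actual Autobed protocol:

```python
# Hypothetical sketch of a PySerial-based bed command sender, in the
# spirit of the Autobed prototype. Command names and byte values are
# assumptions for illustration, not the deployed protocol.

COMMANDS = {
    "head_up": b"U",
    "head_down": b"D",
    "bed_up": b"R",
    "bed_down": b"L",
    "legs_up": b"F",
    "legs_down": b"B",
}

def encode(command: str) -> bytes:
    """Map a GUI button name to the single byte sent over serial."""
    try:
        return COMMANDS[command]
    except KeyError:
        raise ValueError(f"unknown bed command: {command!r}")

def send(port: str, command: str) -> None:
    """Open the serial link to the Arduino and write one command byte."""
    import serial  # pyserial; imported here so the sketch loads without hardware
    with serial.Serial(port, 9600, timeout=1.0) as link:
        link.write(encode(command))
```

On the Arduino side, each received byte would then pulse the corresponding actuation line for the bed, standing in for a press on the standard control pendant.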
Henry Evans has used some form of the Autobed nearly every day since June 2014, approaching three years of real-world use. We also submitted a prototype to Invacare, Inc., which has decided to pursue commercialization of the Autobed through a subsidiary, Adaptive Switch Laboratories, Inc. We have continued to work with their engineers as they refine the prototype for sale. The Autobed was also featured on one of Georgia Tech's 2017 Robot Trading Cards to celebrate National Robotics Week.
In 2015, I decided to design and build an SMD-component digital clock from scratch, including a handmade hardwood case.
For the digital clock, I designed a custom Printed Circuit Board (PCB), which I had fabricated through OSHPark. I then hand-soldered the components I selected, including an 8-pin ATtiny85 AVR microcontroller, two SN74LS595 shift registers, two momentary push buttons, a green 7-segment display, and numerous transistors and 0603 resistors. I programmed the timing and display logic in Arduino, and flashed it onto the microcontroller using an Arduino Pro Micro configured as an AVR programmer.
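The heart of a design like this is the digit-to-segment encoding that gets shifted out through the '595 registers. A small Python sketch of that encoding, assuming the conventional mapping of segments a through g to bits 0 through 6 (the real bit order depends entirely on how the shift-register outputs are wired to the display):

```python
# Sketch of 7-segment digit encoding for a shift-register-driven display.
# Bit i set means segment chr(ord('a') + i) is lit; this bit order is an
# assumption, since the actual order follows the PCB wiring.

SEGMENTS = {
    0: 0b0111111, 1: 0b0000110, 2: 0b1011011, 3: 0b1001111,
    4: 0b1100110, 5: 0b1101101, 6: 0b1111101, 7: 0b0000111,
    8: 0b1111111, 9: 0b1101111,
}

def frame(hours: int, minutes: int) -> list[int]:
    """Four bytes to shift out, one per display digit (HH:MM)."""
    return [SEGMENTS[d] for d in (hours // 10, hours % 10,
                                  minutes // 10, minutes % 10)]
```

On the microcontroller, the Arduino sketch would clock each of these bytes out (e.g., with `shiftOut()`) and latch the registers once per refresh.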
I carved the wooden case from six pen blanks of lacewood, placing the PCB inside a hollow so that the 7-segment display is exposed from the front, and the two buttons are accessible at the back for setting the clock. Once the front and back of the case were carved to fit, I finished the lacewood with two coats of Teak Oil before assembling. I loved this project for the many new skills it allowed me to develop, and for the finished piece, which I gave to my father for Father's Day 2016.
The Robots for Humanity Project is the brainchild of Henry Evans. Henry, who lives near Palo Alto, CA, with his wife Jane, has quadriplegia resulting from a brainstem stroke suffered in 2002. Robots for Humanity encompasses Henry's diverse efforts at developing technology to improve the lives of individuals with severe motor impairments. Our work, which helped bring Robots for Humanity into existence, has focused on developing human-scale, general-purpose assistive mobile manipulation. It all started in January 2011 after Henry saw my Ph.D. advisor Prof. Charlie Kemp and Travis Deyle on CNN demonstrating the potential of the PR2 from Willow Garage. Henry, who controls a computer with a head-tracking mouse, recognized that while he couldn't physically interact with his own environment, or provide for his own care, he could control a computer, and the PR2 was simply a computer which could do some of the physical tasks for him. Inspired, Henry contacted Willow Garage and Dr. Kemp, who suggested that I, a new student in robotics, work with Henry to see what we could accomplish. What a journey it has been...
In the beginning, we partnered with Willow Garage, Prof. Bill Smart and Dan Lazewatsky, and enabled Henry to operate the PR2 with a web-based interface based on Henry's design. This video shows some of the earliest work.
In 2012, our group from Georgia Tech enabled Henry to shave his face using the PR2.
Our work with Henry continues today, and has produced many results, including the main components of my dissertation. Our efforts have also been covered by numerous media outlets, including ABC News, CNET, Singularity Hub, Slashdot, IEEE Spectrum, and Fast Company. CBS News also did a special segment on the project.
As part of our lab's involvement in the DARPA Maximum Mobility and Manipulation (M3) program, our lab has developed novel techniques using fabric-based tactile sensors for whole-arm robotic tactile sensing. In 2013, I investigated the potential for using our lab's force-regulating model predictive controller and whole-arm tactile sensing to aid in assistive robotics. The results of my study show that people are accepting of significant physical contact when using the tactile sensor and force-regulating controller, and Henry reports that when using the system the PR2 "feels VERY safe." Our ongoing work incorporates tactile sensing as its benefits become evident for more and more applications. This work has been reported by the New York Times, Reuters, KurzweilAI.net, and more.
In late 2013, I led the integration of multiple research efforts from the lab into a single, coordinated system. Working with other lab members, I designed and implemented a coordinating module on our Meka M1 robot. This module performed action sequencing and execution monitoring for reaching into dense clutter. The combined system integrated learning of initial configurations, dynamic model predictive control with tactile feedback, real-time haptic mapping and classification, and a novel motion-planning strategy for operating on sparse maps. I then presented this work on behalf of our team at the 3rd Workshop on Robots in Clutter at IROS 2014.
The Wouse (for 'Wince Mouse') was an attempt to develop a wearable, robot-independent run-stop that could be worn and used by people with severe motor impairments, such as Henry Evans. The Wouse uses a mouse sensor placed near the wearer's temple on a pair of glasses to monitor the motion of skin near the eye. As many individuals with severe motor impairments maintain voluntary control of their face and eyes, this seemed like a promising signal which would not interfere with control of a robot, and which would be available even if the control computer or interface became unresponsive.
I developed a first prototype using the sensor and wireless communications from a SwiftPoint SM300 wireless laptop mouse and a pair of safety glasses. After testing it out, we collected data from a number of individuals making various head motions and facial expressions. With this data, we showed that a Support Vector Machine (SVM) classifier was able to detect wincing while ignoring a variety of other facial expressions and head motions. Henry Evans' nephew, Henry Clever, then an undergraduate at Kansas University, offered to help with the mechanical design. Henry Clever built a second hardware iteration and tested it with his uncle. His prototype can be seen in the video below.
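The raw signal from the mouse sensor is a stream of (dx, dy) displacement deltas from the skin near the eye. A minimal Python sketch of the kind of fixed-length feature vector a classifier could consume from a short window of such deltas; these particular features are illustrative assumptions, not the actual Wouse feature set:

```python
# Illustrative feature extraction for wince detection from optical mouse
# deltas. The real Wouse pipeline fed features like these to an SVM; the
# specific features below are assumptions for the sake of the sketch.
import math

def features(deltas):
    """Summarize a window of (dx, dy) skin-motion deltas into three
    features: net displacement, total path length, and peak speed."""
    net_x = sum(dx for dx, _ in deltas)
    net_y = sum(dy for _, dy in deltas)
    speeds = [math.hypot(dx, dy) for dx, dy in deltas]
    return [math.hypot(net_x, net_y), sum(speeds), max(speeds, default=0.0)]
```

Windows summarized this way, labeled as wince or non-wince, are what an SVM classifier would then be trained and evaluated on.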
During the summer of 2010 I worked with Lt. Col. Michael Hawks at the Air Force Institute of Technology (AFIT) on the space-based Chromotomography Experiment (CTEx). Chromotomography is a form of hyper-spectral imaging that uses a rotating prism to spread the spectrum of light from each point in an image across a line on the imaging plane. By knowing the orientation of the prism when an image is taken, and the spread of spectral lines from that prism, the spectral intensity at each image point can be deduced by enforcing agreement between reconstructions of multiple images from different prism angles. When you work for the U.S. Air Force, you try to do this at very high speed (high frame rates and prism rotation velocity), from space, so that you can observe explosive detonations, determine the spectra of the detonation, and thereby determine the particular explosives used.
During the summer, I rewired and improved the optical calibration of a ground-based prototype system, producing visible improvements in image and data quality. I also developed control and data collection software using a combination of LabVIEW and MATLAB.
My undergraduate honors thesis under Prof. Arto Nurmikko in the Neuroengineering and Nanophotonics Laboratory at Brown University involved designing and fabricating an implantable electrode for optically-powered functional electrical stimulation (OFES). Functional electrical stimulation (FES) aims to reactivate peripheral nerves which are healthy but unused due to injury closer to the spine or other higher structures. While potentially useful, current methods use long wires to transfer electrical power from batteries in the abdomen to stimulation sites in the extremities. These wires can both generate and be susceptible to electrical interference and, if damaged, could lead to further tissue damage and injury. Optical fibers can be smaller and safer for delivering power throughout the body, if light from the fiber can be transformed back into electrical potentials at the stimulation site.
Working with then graduate student David Borton, I designed molds for a nerve cuff and containment for wire leads, optical fibers, and a photovoltaic chip for implantation in the body. After extensive bench-top testing, I was able to demonstrate optically-powered electrical stimulation in the sciatic nerve of a Sprague-Dawley Rat under anesthesia.
During the summer of 2009, I worked with Prof. Steven Fiorino at the Air Force Institute of Technology (AFIT) on experimentally determining a model for the propagation of Terahertz (THz) radiation through brownout conditions. Brownout is the condition, common in dusty environments, where sand and dust are kicked up by the downwash of a helicopter during landing. Brownout not only blinds the pilot and many conventional sensors, but can produce optical illusions resulting in potentially critical pilot over-correction. Terahertz radiation has been suggested as a method of imaging through brownout, as the THz wavelengths are sufficiently large to avoid the scattering by dust particles which blinds visible-light sensors, while still being small enough to provide clear visual imagery, unlike traditional radar.
Conducting my research required analyzing thin layers of sand using multiple Terahertz spectrometers, including experimental setups at AFIT, the Air Force Research Labs, and IDCAST (thanks to Prof. Jason Deibel). I incorporated my data-driven model into LEEDR, a MATLAB-based software package for determining radiation propagation through the atmosphere. My updated model clearly demonstrates the opportunity for Terahertz imaging to provide improved visibility through brownout conditions.
During the summer of 2008, I worked with Dr. Michael Marciniak at the Air Force Institute of Technology (AFIT) to develop a platform for camera-based Bidirectional Reflectance Distribution Function (BRDF) acquisition. The system used an expanded (~18 in), collimated laser source to consistently illuminate a cylinder of target material. A camera mount revolved around the target cylinder's central axis, collecting reflectance intensity over a wide range of angles in every frame. I developed data processing scripts using a combination of ImageJ macros and MATLAB. This method allows for the determination of isotropic BRDF characteristics of the target material in less time and with less effort than existing methods. I was invited to present my results at the annual Directed Energy Symposium in Honolulu, HI.
After building a small trebuchet for a high-school class project, I designed and built a slightly larger version with my grandfather (a career civil engineer) from 2006 to 2008.
Standing 12 feet tall at the axle, with a roughly 23-foot throwing arm and a water-filled tub from an old water heater as the counterweight, it managed to throw a few paving bricks over 100 yards before the throwing arm started to warp. While I did learn a lot constructing it, this project was purely for fun. Unfortunately, after sitting behind my grandfather's barn worrying his neighbors for a few years, it has since been taken apart and the wood recycled.