Phillip M. Grice

Image: Phillip M. Grice

Professional Bio

I am a roboticist currently working as a Principal Robotics Software Engineer at Blue River Technology, developing automation for the future of farming. Previously, I worked at iRobot and Bossa Nova Robotics. Before joining Bossa Nova, I earned my Ph.D. in Robotics in the Healthcare Robotics Lab at Georgia Tech. My experience spans research-, business-, and consumer-grade robotic systems, indoor and outdoor applications in diverse environments, and development at every level of the robotics stack, including hardware integration, diagnostics and monitoring, perception pipelines, fleet management, and web interface design. From this broad background, I have developed expertise in robotic systems design and integration, high-quality production robotic software development practices, and human-robot interaction, especially human support of autonomous systems.

Image: Phillip and his wife Heather

Personal Bio

When not advancing the future of robotics, I like to spend my time with my wife, Heather, and our dogs, Seamus and Sadie. On most nights, you will find us in the kitchen trying new recipes or perfecting old ones (with the dogs making sure any dropped food never hits the floor). Heather and I are active in our local church, and enjoy hiking and home improvement projects, including a recent update to our master bathroom. Every fall we follow college football, especially the Big Ten. Being from Ohio, I have always been an Ohio State Buckeyes fan, even though Heather, after attending Luther College in Iowa, now supports the Hawkeyes. I am also usually working on at least one side-project, which might range from woodworking to embedded microcontrollers, or possibly both. See my projects page for more details.

Publications

Google Scholar

Journals

A System for Bedside Assistance that Integrates a Robotic Bed and a Mobile Manipulator. AS Kapusta, PM Grice, HM Clever, Y Chitalia, D Park, and CC Kemp. PLoS ONE 14(10):e0221854. October 2019.

In-home and Remote Use of Robotic Body Surrogates by People with Profound Motor Deficits. PM Grice and CC Kemp. PLoS ONE 14(3):e0212904. March 2019. *2019 PLoS ONE Editors' Pick

Robots for Humanity: A Case Study in Assistive Mobile Manipulation. TL Chen, M Ciocarlie, S Cousins, PM Grice, K Hawkins, K Hsiao, CC Kemp, C-H King, DA Lazewatsky, A Leeper, H Nguyen, A Paepcke, C Pantofaru, WD Smart, L Takayama. IEEE Robotics and Automation Magazine (RAM), Special Issue on Assistive Robotics. Vol. 20. March 2013. Authors listed alphabetically.

A Technique to Measure Optical Properties of Brownout Clouds for Modeling Terahertz Propagation. ST Fiorino, JA Deibel, PM Grice, MH Novak, J Spinoza, L Owens, S Ganti. Applied Optics. 51(16): 3605-3613. June 2012.

Conferences

Autobed: Open Hardware for Accessible Web-based Control of an Electric Bed. PM Grice, Y Chitalia, M Rich, HM Clever, CC Kemp. Rehabilitation Engineering and Assistive Technology Society of North America, 2016 Annual Conference (RESNA 2016). Washington, DC, USA. July 2016. [Poster]

Assistive Mobile Manipulation for Self-Care Tasks Around the Head. KP Hawkins, PM Grice, TL Chen, C-H King, CC Kemp. 2014 IEEE Symposium on Computational Intelligence in Robotic Rehabilitation and Assistive Technologies. Orlando, FL, USA. December 2014.

Whole-Arm Tactile Sensing for Beneficial and Acceptable Contact During Robotics Assistance. PM Grice, MD Killpack, A Jain, S Vaish, J Hawke, CC Kemp. Rehabilitation Robotics (ICORR), 2013 IEEE International Conference on. Seattle, WA, USA. June 2013.

Robots for Humanity: User-Centered Design for Assistive Mobile Manipulation. TL Chen, M Ciocarlie, S Cousins, PM Grice, K Hawkins, K Hsiao, CC Kemp, C-H King, DA Lazewatsky, A Leeper, H Nguyen, A Paepcke, C Pantofaru, WD Smart, L Takayama. Intelligent Robots and Systems (IROS), 2012 IEEE/RSJ International Conference on. pp. 2434-2435. Vilamoura, Algarve, Portugal. October 2012. Authors listed alphabetically.

The Wouse: A Wearable Wince Detector to Stop Assistive Robots. PM Grice, A Lee, H Evans, CC Kemp. 21st IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN). pp. 165-172. Paris, France. September 2012.

Lab Measurements to Support Modeling Terahertz Propagation in Brownout Conditions. ST Fiorino, PM Grice, MJ Krizo, RJ Bartell, JD Haiducek, SJ Cusumano. SPIE Defense, Security, and Sensing. Orlando, FL, USA. April 2010.

Modeling THz Propagation in Brownout Conditions. PM Grice. 12th Annual Directed Energy Symposium. San Antonio, TX, USA. November 3, 2009.

Image Based BRDF Acquisition. PM Grice. 11th Annual Directed Energy Symposium. Honolulu, HI, USA. November 18, 2008.

Workshops & Presentations

Assistive Mobile Manipulation for Users with Severe Motor Impairments. PM Grice. Defense of Ph.D. Dissertation. Atlanta, GA, USA. June 12, 2017.

7 Day In-home Evaluation: Mobile Manipulator Robot. PM Grice, HM Clever. TechSAge State of the Science Conference. Atlanta, GA, USA. March 27, 2017.

Assistive Mobile Manipulation: Designing for Operators with Motor Impairments. PM Grice, CC Kemp. RSS 2016 Workshop: Socially & Physically Assistive Robotics for Humanity. Ann Arbor, MI, USA. June 18, 2016.

A Robotic System for Reaching in Dense Clutter that Integrates Model Predictive Control, Learning, Haptic Mapping, and Planning. T Bhattacharjee, PM Grice, A Kapusta, MD Killpack, D Park, CC Kemp. IROS 2014 Workshop: 3rd Workshop on Robots in Clutter: Perception and Interaction in Clutter. Chicago, IL, USA. September 18, 2014. Authors listed alphabetically except CC Kemp.

Robots for Humanity: Developing Assistive Mobile Manipulation. PM Grice, TL Chen, M Ciocarlie, S Cousins, K Hawkins, K Hsiao, C-H King, D Lazewatsky, A Leeper, H Nguyen, A Paepcke, C Pantofaru, W Smart, L Takayama, CC Kemp. 2012 Biomedical Engineering Society Annual Meeting: Assistive Technology & Robotics in Rehabilitation Engineering. Atlanta, GA, USA. October 2012. Authors listed alphabetically except first and last.

Modeling THz Propagation in Brownout Conditions. PM Grice. Directed Energy Education Workshop. San Antonio, TX, USA. November 6, 2009.

2008 AFIT DE Summer Intern Program. M Houle, B McClung, PM Grice. Directed Energy Education Workshop. Honolulu, HI, USA. November 21, 2008.

Projects

Dissertation: Accessible, Web-based Robot UI

2014 - 2017

My graduate research focused on enabling non-expert users with severe motor impairments to effectively operate human-scale, general-purpose robots for self-care and everyday tasks. These robots are capable enough to offer economical, around-the-clock assistance to diverse users for a variety of tasks, but their complexity also makes them difficult for non-experts to control. I worked to change this through the design and evaluation of accessible interfaces and intelligent robotic capabilities. My dissertation describes the results of a multi-year, user-centered research and design process, including both a laboratory evaluation with 15 non-expert users with profound motor impairments and a seven-day deployment of the robot in the home of another individual with quadriplegia.

The interface is web-based and can be operated via a single-button mouse. This makes it accessible to non-expert users: no installation is required, and many individuals with profound motor impairments can operate a single-button mouse via accessible human-computer interface devices, such as eye-gaze tracking, head tracking, speech controls, or simple touchpads. The greatest challenge in designing such a system is mapping the 2D screen surface onto control of the PR2's 20 degrees of freedom while keeping the system accessible to non-technical users with little or no training. To accomplish this, I designed a video-centric interface with augmented reality features, which provides a consistent perspective and more natural interaction. Multiple interface modes break the control space down into manageable segments. The modes are visually and functionally distinct to avoid confusion, and each is designed to allow control of the relevant portion of the robot in a natural and efficient manner.

Image: The accessible PR2 web interface - Driving Mode. Image: The accessible PR2 web interface - Hand Control Mode.

I use the PR2 robot from Willow Garage, Inc. as the assistive robot for this work, and have developed numerous ROS components for its various capabilities, using both Python and C++. The web-based interface uses ROSLIB.js from Robot Web Tools to communicate with the robot, and the front end is a combination of custom HTML5, CSS3, and JavaScript. I use jQuery and jQuery UI for DOM manipulation and styling, as well as the Snap.svg and THREE.js libraries for more advanced augmented reality display features.
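
As a concrete illustration of this stack, the sketch below publishes a PR2 base command over a rosbridge WebSocket connection, the same pattern the browser interface uses through ROSLIB.js. It is written with the Python roslibpy client purely for illustration; the host, topic name, and values are assumptions rather than details of my actual interface.

```python
# Minimal sketch (not the deployed interface code): send a driving
# command to a PR2 through rosbridge, as the web UI does via ROSLIB.js.
import roslibpy

# Host is a placeholder; rosbridge listens on port 9090 by default.
client = roslibpy.Ros(host='pr2-robot.local', port=9090)
client.run()  # connect and spin the event loop in a background thread

# The PR2's base controller accepts geometry_msgs/Twist commands.
base_cmd = roslibpy.Topic(client, '/base_controller/command',
                          'geometry_msgs/Twist')

# Drive forward slowly while turning, roughly what one click in the
# interface's Driving Mode might request.
base_cmd.publish(roslibpy.Message({
    'linear': {'x': 0.1, 'y': 0.0, 'z': 0.0},
    'angular': {'x': 0.0, 'y': 0.0, 'z': 0.2},
}))

client.terminate()
```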

In one study, 15 individuals with profound motor impairments from a variety of causes operated the robot effectively from across the US, using a variety of accessible interface devices. After only limited training, participants demonstrated a clinically significant improvement on a modified clinical manipulation test, as well as success in a simulated self-care task. I also deployed this system for seven days in the home of Henry and Jane Evans. During this time, Henry used the robot in 16 separate sessions, totaling over 22.5 hours, to complete a variety of tasks both for his own care and around the home. The results of this exploratory study highlight both near-term opportunities for assistive robots in the home and remaining challenges to adoption. Full details of the system and evaluation results can be found in my dissertation.

Autobed

2012 - Present

The Autobed device enables an individual to operate an Invacare Full Electric Homecare Bed via a web-based interface. This makes important functionality accessible to individuals, such as Henry Evans, whose motor impairments prevent them from adjusting their bed using the standard control pendant. This capability can improve comfort and reduce the need for able-bodied caregivers to adjust the bed, and may help to prevent the formation of bedsores.

Image: The Autobed v2 device.

In June 2014, I mentored Henry Clever in developing an initial prototype using an Arduino, an XBee radio, and a PySerial-based interface running on Henry Evans' Mac laptop. I later developed and deployed a web-based GUI to simplify control for the user.

Image: The Autobed v2 web interface.

In summer 2015, I designed and developed Autobed v2. Autobed v2 uses a Raspberry Pi for both hardware integration and software hosting. The software backend uses a lightweight Python Tornado WebSocket server for responsive, bi-directional communication with the interface. The interface itself is custom HTML5, CSS3, and plain JavaScript for lightweight operation. Unlike the initial prototype, the Autobed v2 hardware module connects inline between the standard control pendant and the bed's motor control board, using optoisolator circuits to simulate digitally controlled button presses. The bed hardware remains completely unaltered, and the standard control pendant continues to work whether or not the Autobed module is powered. The complete build instructions are publicly archived here.
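
The following sketch shows the shape of this backend: a Tornado WebSocket handler that pulses a GPIO pin driving an optoisolator, simulating a pendant button press. Pin numbers, command names, and pulse timing are illustrative assumptions, not the deployed Autobed configuration.

```python
# Sketch of the Autobed v2 backend pattern (assumptions noted above).
import time

import tornado.ioloop
import tornado.web
import tornado.websocket
import RPi.GPIO as GPIO

# Hypothetical mapping from interface commands to optoisolator pins.
COMMAND_PINS = {'head_up': 17, 'head_down': 27, 'bed_up': 22, 'bed_down': 23}

GPIO.setmode(GPIO.BCM)
for pin in COMMAND_PINS.values():
    GPIO.setup(pin, GPIO.OUT, initial=GPIO.LOW)


class AutobedSocket(tornado.websocket.WebSocketHandler):
    def on_message(self, message):
        pin = COMMAND_PINS.get(message)
        if pin is None:
            self.write_message('error: unknown command')
            return
        # Energize the optoisolator briefly, mimicking a press of the
        # corresponding button on the standard control pendant.
        GPIO.output(pin, GPIO.HIGH)
        time.sleep(0.25)  # fine for a sketch; a real server would avoid blocking
        GPIO.output(pin, GPIO.LOW)
        self.write_message('ok: ' + message)


app = tornado.web.Application([(r'/autobed', AutobedSocket)])
app.listen(8888)
tornado.ioloop.IOLoop.current().start()
```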

Henry Evans has used some form of the Autobed almost daily since June 2014, approaching three years of real-world use. We also submitted a prototype to Invacare, Inc., which has decided to pursue commercialization of the Autobed through a subsidiary, Adaptive Switch Laboratories, Inc. We have continued to work with their engineers as they refine the prototype for sale. The Autobed was also featured on one of Georgia Tech's 2017 Robot Trading Cards to celebrate National Robotics Week.

Image: The Autobed Trading Card (Front). Image: The Autobed Trading Card (Back).

Custom Digital Clock

2015 - 2016

In 2015, I decided to design and build an SMD-component digital clock from scratch, including a handmade hardwood case.

Image: The finished digital clock.

For the digital clock, I designed a custom printed circuit board (PCB), which I had fabricated through OSHPark. I then hand-soldered the components I had selected, including an 8-pin ATtiny85 AVR microcontroller, two SN74LS595 shift registers, two momentary push-button switches, a green 7-segment display, and numerous transistors and 0603 resistors. I programmed the timing and display logic in Arduino, and flashed it onto the microcontroller using an Arduino Pro Micro configured as an AVR programmer.

I carved the wooden case from six pen blanks of lacewood, placing the PCB inside a hollow so that the 7-segment display is exposed at the front and the two buttons for setting the clock are accessible at the back. Once the front and back of the case were carved to fit, I finished the lacewood with two coats of teak oil before assembly. I loved this project for the many new skills it let me develop, and for the finished piece, which I gave to my father for Father's Day 2016.

Image: The front panel of the custom PCB. Image: The back panel of the custom PCB.

Henry Evans and Robots for Humanity

2011 - Present

The Robots for Humanity Project is the brainchild of Henry Evans. Henry, who lives near Palo Alto, CA, with his wife Jane, has quadriplegia resulting from a brainstem stroke suffered in 2002. Robots for Humanity encompasses Henry's diverse efforts at developing technology to improve the lives of individuals with severe motor impairments. Our work, which helped bring Robots for Humanity into existence, has focused on developing human-scale, general-purpose assistive mobile manipulation. It all started in January 2011, after Henry saw my Ph.D. advisor Prof. Charlie Kemp and Travis Deyle on CNN demonstrating the potential of the PR2 from Willow Garage. Henry, who controls a computer with a head-tracking mouse, recognized that while he couldn't physically interact with his own environment or provide for his own care, he could control a computer, and the PR2 was simply a computer that could do some of those physical tasks for him. Inspired, Henry contacted Willow Garage and Prof. Kemp, who suggested that I, a new student in robotics, work with Henry to see what we could accomplish. What a journey it has been...

Image: Group photo of Robots for Humanity group members taken in 2011

In the beginning, we partnered with Willow Garage, Prof. Bill Smart, and Dan Lazewatsky, and enabled Henry to operate the PR2 with a web-based interface based on Henry's own design. This video shows some of the earliest work.

In 2012, our group from Georgia Tech enabled Henry to shave his face using the PR2.

Our work with Henry continues today, and has produced many results, including the main components of my dissertation. Our efforts have also been covered by numerous media outlets, including ABC News, CNET, Singularity Hub, Slashdot, IEEE Spectrum, and Fast Company. CBS News also did a special segment on the project.

Robotic Tactile Sensing

2012 - Present - Georgia Institute of Technology

As part of our lab's involvement in the DARPA Maximum Mobility and Manipulation (M3) program, we developed novel techniques using fabric-based tactile sensors for whole-arm robotic tactile sensing. In 2013, I investigated the potential of our lab's force-regulating model predictive controller and whole-arm tactile sensing to aid in assistive robotics. The results of my study show that people accept significant physical contact when using the tactile sensor and force-regulating controller, and Henry reports that when using the system the PR2 "feels VERY safe." Our ongoing work incorporates tactile sensing as its benefits become evident for more and more applications. This work has been reported by the New York Times, Reuters, KurzweilAI.net, and more.

In late 2013, I led the integration of multiple research efforts from the lab into a single, coordinated system. Working with other lab members, I designed and implemented a coordinating module on our Meka M1 robot. This module performed action sequencing and execution monitoring for reaching into dense clutter. The combined system integrated learning for initial configurations, dynamic model predictive control with tactile feedback, real-time haptic mapping and classification, and a novel motion-planning strategy for operating on sparse maps. I then presented this work on behalf of our team at the 3rd Workshop on Robots in Clutter at IROS 2014.
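
The sketch below illustrates the coordinating pattern described above, sequencing actions while an execution monitor checks each step; the actions, names, and monitors are hypothetical stand-ins, not our lab's actual module.

```python
# Hypothetical action-sequencing loop with execution monitoring.
def run_sequence(actions, monitors):
    """Run (name, callable) actions in order; abort if a monitor trips.

    `monitors` maps each action name to a callable returning True
    while execution remains healthy.
    """
    for name, action in actions:
        action()
        if not monitors[name]():
            print('Execution monitor tripped during %s; aborting.' % name)
            return False
    return True


# Stand-in sequence mirroring the integrated reaching system: pick a
# learned starting configuration, plan on the sparse haptic map, then
# reach with the tactile model predictive controller.
sequence = [
    ('select_initial_config', lambda: print('selecting learned start pose')),
    ('plan_on_sparse_map', lambda: print('planning on sparse haptic map')),
    ('mpc_reach', lambda: print('reaching with tactile MPC')),
]
monitors = {name: (lambda: True) for name, _ in sequence}

run_sequence(sequence, monitors)
```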

The Wouse

2012 - Spring - Georgia Institute of Technology

The Wouse (for 'Wince Mouse') was an attempt to develop a wearable, robot-independent run-stop usable by people with severe motor impairments, such as Henry Evans. The Wouse uses a mouse sensor placed near the wearer's temple on a pair of glasses to monitor the motion of the skin near the eye. Because many individuals with severe motor impairments maintain voluntary control of their face and eyes, this seemed a promising signal that would not interfere with control of a robot, and that would remain available even if the control computer or interface became unresponsive.

Image: Phillip wearing his Wouse prototype, showing the sensor and battery pack attached to the temple of a pair of safety glasses

I developed a first prototype using the sensor and communication hardware from a SwiftPoint SM300 wireless laptop mouse mounted on a pair of safety glasses. After testing it out, we collected data from a number of individuals making various head motions and facial expressions. With this data, we showed that a Support Vector Machine (SVM) classifier was able to detect wincing while ignoring a variety of other facial expressions and head motions. Henry Evans' nephew, Henry Clever, then an undergraduate at the University of Kansas, offered to help with the mechanical design. Henry Clever built a second hardware iteration and tested it with his uncle. His prototype can be seen in the video below.
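
For flavor, here is a minimal sketch of that classification step using scikit-learn. The features stand in for windows of optical-mouse displacement readings and are randomly generated, so everything below is illustrative rather than our actual training pipeline.

```python
# Illustrative wince-vs-other classifier with an SVM (synthetic data).
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Hypothetical features: short windows of (dx, dy) skin-motion readings
# flattened into fixed-length vectors; label 1 = wince, 0 = other motion.
X = rng.normal(size=(200, 40))
y = rng.integers(0, 2, size=200)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0)

clf = make_pipeline(StandardScaler(), SVC(kernel='rbf', C=1.0))
clf.fit(X_train, y_train)
print('held-out accuracy:', clf.score(X_test, y_test))
```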

Space-based Chromotomography Experiment (CTEx)

2010 - Summer - Air Force Institute of Technology

During the summer of 2010 I worked with Lt. Col. Michael Hawks at the Air Force Institute of Technology (AFIT) on the space-based Chromotomography Experiment (CTEx). Chromotomography is a form of hyperspectral imaging that uses a rotating prism to spread the spectrum of light from each point in an image across a line on the imaging plane. By knowing the orientation of the prism when an image is taken, and the spectral spread that prism produces, the spectral intensity at each image point can be deduced by enforcing agreement between reconstructions of multiple images taken at different prism angles. When you work for the U.S. Air Force, you try to do this at very high speed (high frame rates and prism rotation velocity), from space, so that you can observe explosive detonations, determine their spectra, and thereby identify the particular explosives used.
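
In standard chromotomography notation (mine, not drawn from the CTEx publications), each measurement is a wavelength-integrated, prism-shifted projection of the spectral scene, and the scene is recovered by enforcing consistency across prism angles:

```latex
% Forward model: at prism angle \theta, each wavelength \lambda is
% displaced on the focal plane by a dispersion offset d(\lambda).
g_\theta(x, y) = \int f\bigl(x - d(\lambda)\cos\theta,\; y - d(\lambda)\sin\theta,\; \lambda\bigr)\, d\lambda

% Reconstruction: choose the spectral scene \hat{f} that best agrees
% with the projections P_{\theta_k} observed at all prism angles \theta_k.
\hat{f} = \arg\min_{f} \sum_{k} \bigl\| g_{\theta_k} - P_{\theta_k} f \bigr\|^2
```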

During the summer, I rewired and improved the optical calibration of a ground-based prototype system, producing visible improvements in image and data quality. I also developed control and data collection software using a combination of LabVIEW and MATLAB.

Optically-Powered Peripheral Neural Stimulation

2009-2010 - Brown University

My undergraduate honors thesis under Prof. Arto Nurmikko in the Neuroengineering and Nanophotonics Laboratory at Brown University involved designing and fabricating an implantable electrode for optically-powered functional electrical stimulation (OFES). Functional electrical stimulation (FES) aims to reactivate peripheral nerves which are healthy but unused due to injury closer to the spine or other higher structures. While possibly useful, current methods use long wires to transfer electrical power from batteries in the abdomen to stimulation sites in the extremities. These wires can both generate and suffer from electrical interference and, if damaged, can lead to further tissue damage and injury. Optical fibers can be smaller and safer for delivering power throughout the body, if light from the fiber can be transformed back into electrical potentials at the stimulation site.

Image: The optically-powered functional electrical stimulation device produced as part of Phillip's undergraduate honors thesis. Image: A close-up view of the photovoltaics, fiber optics, wire leads, and nerve cuff of the OFES system.

Working with then-graduate student David Borton, I designed molds for a nerve cuff and housing for the wire leads, optical fibers, and a photovoltaic chip for implantation in the body. After extensive bench-top testing, I demonstrated optically-powered electrical stimulation of the sciatic nerve of an anesthetized Sprague-Dawley rat.

Modeling Terahertz (THz) Radiation Propagation through Brownout Conditions

2009 - Summer - Air Force Institute of Technology

During the summer of 2009, I worked with Prof. Steven Fiorino at the Air Force Institute of Technology (AFIT) on experimentally determining a model for the propagation of terahertz (THz) radiation through brownout conditions. Brownout is the condition, common in dusty environments, in which sand and dust are kicked up by the downwash of a helicopter during landing. Brownout not only blinds the pilot and many conventional sensors, but can produce optical illusions resulting in potentially critical pilot over-correction. Terahertz radiation has been suggested as a method of imaging through brownout because THz wavelengths are long enough to avoid the scattering by dust particles that blinds visible-light sensors, while still short enough to provide clear visual imagery, unlike traditional radar.
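
The underlying physics (a standard argument, not taken from our papers) is that dust grains are much smaller than THz wavelengths, putting scattering in the Rayleigh regime, where the cross-section collapses as the wavelength grows:

```latex
% Rayleigh scattering for particle diameter d \ll \lambda:
\sigma_s \propto \frac{d^6}{\lambda^4}

% Transmission through a dust cloud of number density n over path L
% (Beer-Lambert), which improves sharply as \lambda grows from visible
% (~0.5 \mu m) to THz (~0.3 mm) scales:
I(L) = I_0\, e^{-n \sigma_s L}
```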

Image: A Boeing V-22 Osprey near the ground showing the dust cloud from rotor downwash which can lead to brownout conditions.

My research required analyzing thin layers of sand with multiple terahertz spectrometers, including experimental setups at AFIT, the Air Force Research Laboratory, and IDCAST (thanks to Prof. Jason Deibel). I incorporated my data-driven model into LEEDR, a MATLAB-based software package for determining radiation propagation through the atmosphere. The updated model demonstrates the potential of terahertz imaging to provide improved visibility through brownout conditions.

Image-based BRDF Acquisition

2008 - Summer - Air Force Institute of Technology

During the summer of 2008, I worked with Dr. Michael Marciniak at the Air Force Institute of Technology (AFIT) to develop a platform for camera-based Bidirectional Reflectance Distribution Function (BRDF) acquisition. The system used an expanded (~18 in), collimated laser source to consistently illuminate a cylinder of target material. A camera mount revolved around the target cylinder's central axis, collecting reflectance intensity over a wide range of angles in every frame. I developed data-processing scripts using a combination of ImageJ macros and MATLAB. This method determines the isotropic BRDF characteristics of the target material in less time and with less effort than existing methods. I was invited to present my results at the annual Directed Energy Symposium in Honolulu, HI.
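
For reference, the quantity being measured is the BRDF, which relates reflected radiance to incident irradiance (standard definition; notation mine). Imaging a cylinder under collimated light samples many surface orientations at once, which is why each frame covers a wide range of angles:

```latex
% BRDF: ratio of reflected radiance along \omega_o to the incident
% irradiance arriving from \omega_i at incidence angle \theta_i.
f_r(\omega_i, \omega_o) = \frac{dL_o(\omega_o)}{L_i(\omega_i)\,\cos\theta_i\; d\omega_i}
```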

Image: Experimental setup for camera-based BRDF acquisition from 2008 summer internship at AFIT

Homemade 12-foot Trebuchet

2006 - 2008

After a high-school class project building a small trebuchet, I spent 2006-2008 designing and building a slightly larger version with my grandfather, a career civil engineer.

Image: Side view of a 12 foot tall trebuchet that I built with my grandfather during college. Image: A front view of the same trebuchet.

Standing 12 feet tall at the axle, with a ~23-foot throwing arm and a tub from an old water heater, filled with water, as the counterweight, the trebuchet threw a few paving bricks over 100 yards before the throwing arm started to warp. While I did learn a lot constructing it, this project was purely for fun. Unfortunately, after sitting behind my grandfather's barn worrying his neighbors for a few years, it has since been taken apart and the wood recycled.

Curriculum Vitae

Contact

Phillip M. Grice
Principal Robotics Software Engineer
Blue River Technology
Phone: (563) 419-2345
E-mail: phillip.grice@bluerivertech.com

Image: LinkedIn icon linking to Phillip's LinkedIn profile.