My research interests include human-centered computing, mobile computing, biosignal processing, and augmented reality. I have spent over ten years working for the Department of Defense, where I have worked on various human-centered systems for the U.S. military that facilitate ubiquitous interaction, communication, and collaboration through technology. I have also conducted research on innovative and unorthodox pedagogical methods for introducing underrepresented groups to computing. More information on specific research efforts can be found below.

Publications

  • Chavez, B., White, C., Andrews, A., Wilde, J., Valentine, W., Harrison, P., Kupferman, S., Hadfield, S., Cummings, D., Sparkman, C., Hadfield, M., McMichael, J. Real World Service Learning: Programming Humanoid NAO and Pepper Robots to Foster Social Interaction Development for Children with Autism. 10th Annual International Conference on Education and New Learning Technologies (EDULEARN 2018), Palma de Mallorca, Spain. July 2–4, 2018.
  • Cummings, D., Cheeks, L., Robinson, R. Culturally-Centric Outreach and Engagement for Underserved Groups in STEM. ACM Special Interest Group on Computer Science Education Technical Symposium (SIGCSE 2018), Baltimore, Maryland. February 21–24, 2018.
  • Obaid, M., Duenser, A., Moltchanova, E., Cummings, D., Wagner, J., Bartneck, C. LEGO Pictorial Scales for Assessing Affective Response. 15th IFIP TC13 Conference on Human-Computer Interaction (INTERACT 2015), Bamberg, Germany. September 14–18, 2015.
  • Cummings, D., Prasad, M., Lucchese, G., Aikens, C., Ho, J. and Hammond, T. Multi-modal Location-Aware System for Paratrooper Team Coordination. ACM SIGCHI Conference on Human Factors in Computing Systems (CHI 2013) Case Studies, Paris, France. April 27 – May 2, 2013. Best of CHI Award: Honorable Mention.
  • Cummings, D., Lucchese, G., Prasad, M., Aikens, C., Ho, J. and Hammond, T. Haptic and AR Interface for Paratrooper Coordination. 13th Annual Conference of the NZ ACM Special Interest Group on Human-Computer Interaction (CHINZ 2012), Dunedin, New Zealand. July 2–3, 2012.
  • Cummings, D., Vides, F. and Hammond, T. I Don’t Believe My Eyes! Geometric Sketch Recognition for a Computer Art Tutorial. International Symposium on Sketch-Based Interfaces and Modeling (SBIM 2012), Annecy, France. June 4–6, 2012.
  • Cummings, D., Fymat, S. and Hammond, T. RedDog: A Smart Sketch Interface for Autonomous Aerial Systems. International Symposium on Sketch-Based Interfaces and Modeling (SBIM 2012), Annecy, France. June 4–6, 2012.
  • Cummings, D., Fymat, S. and Hammond, T. Sketch-based Interface for Interaction with Unmanned Air Vehicles. ACM SIGCHI Conference on Human Factors in Computing Systems (CHI 2012) Works-in-Progress, Austin, TX. May 5–10, 2012.
  • Paulson, B., Cummings, D., Hammond, T. (2010). Object Interaction Detection using Hand Posture Cues in an Office Setting. International Journal of Human-Computer Studies, 12, 389–401.


GeoTrooper: Location-Aware Paratrooper Coordination
Navigation and assembly are critical tasks for Soldiers in battlefield situations. Soldiers must locate equipment, supplies, and teammates quickly and quietly to ensure the success of their mission, a task that can be extremely difficult and time-consuming without guidance or extensive experience. To facilitate the re-assembly and coordination of airborne paratrooper teams, we developed GeoTrooper, a location-aware system that uses an ad-hoc Wi-Fi network on Android phones to broadcast and receive encrypted GPS coordinates of equipment and/or rendezvous points. We tested GeoTrooper with U.S. Army paratroopers and demonstrated that the system was easy to learn and effectively helped Soldiers navigate to prescribed locations.
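The broadcast-and-receive idea can be sketched in a few lines. This is a minimal stand-in, not GeoTrooper's actual code: to stay dependency-free it authenticates each GPS fix with an HMAC over a hypothetical pre-shared team key (standing in for the encryption described above), and the packet layout and names are invented for illustration.

```python
import hashlib
import hmac
import json
import struct

SHARED_KEY = b"demo-team-key"  # hypothetical pre-shared team key

def pack_fix(unit_id: str, lat: float, lon: float) -> bytes:
    """Serialize a GPS fix and append an HMAC tag so receivers can
    reject spoofed or corrupted broadcasts."""
    body = json.dumps({"id": unit_id, "lat": lat, "lon": lon}).encode()
    tag = hmac.new(SHARED_KEY, body, hashlib.sha256).digest()
    return struct.pack("!H", len(body)) + body + tag

def unpack_fix(packet: bytes):
    """Check the tag and recover the fix; returns None if tampered."""
    (n,) = struct.unpack("!H", packet[:2])
    body, tag = packet[2:2 + n], packet[2 + n:]
    expected = hmac.new(SHARED_KEY, body, hashlib.sha256).digest()
    if not hmac.compare_digest(tag, expected):
        return None
    return json.loads(body)

# Round-trip one rendezvous point as a peer phone would receive it.
packet = pack_fix("alpha-1", 48.8566, 2.3522)
print(unpack_fix(packet))
```

In the real system each packet would be sent over the ad-hoc Wi-Fi network (e.g. as a UDP broadcast) rather than handed directly to the receiver.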

Real-Time Visualization of User State
Extensive research has been conducted on technologies that allow computers to identify a user's state. Despite significant advances in physiological data-capture technology, there is surprisingly little research on visualizing user states and the value such visualization holds in a social context. We are currently developing a system that uses biosignal data to recognize human emotion in real time. We hope to one day combine augmented reality with physiological signal classification to create a real-time visualization of the user's state. Our goal is a system that facilitates non-verbal communication, enhancing human-to-human interaction and team coordination in the field.
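As a rough illustration of such a pipeline, the sketch below extracts simple features (mean and spread) from a sliding window over a simulated heart-rate stream and assigns a coarse affective state by nearest centroid. The feature set, centroid values, and state labels are all hypothetical, not our system's actual model, which would be trained on labeled physiological recordings.

```python
import math
from collections import deque

def features(window):
    """Mean and standard deviation of one signal window."""
    m = sum(window) / len(window)
    sd = math.sqrt(sum((x - m) ** 2 for x in window) / len(window))
    return (m, sd)

# Illustrative centroids in feature space (would be learned from data).
CENTROIDS = {"calm": (60.0, 2.0), "stressed": (95.0, 12.0)}

def classify(window):
    """Assign the window to the nearest centroid's state label."""
    f = features(window)
    return min(CENTROIDS, key=lambda s: math.dist(f, CENTROIDS[s]))

# Simulate streaming classification with a sliding window.
stream = [61, 59, 60, 62, 58] * 4 + [90, 110, 85, 105, 95] * 4
win = deque(maxlen=5)
states = []
for sample in stream:
    win.append(sample)
    if len(win) == win.maxlen:
        states.append(classify(list(win)))
print(states[0], states[-1])  # calm stressed
```

A real-time deployment would replace the simulated list with a live sensor feed and overlay the resulting state in the augmented-reality view.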

Sketch-based Unmanned Aerial System Interface
The U.S. military is exploring the use of Unmanned Aircraft Systems (UAS) to deliver cargo to Soldiers in the field and, eventually, to fulfill various other air-support requests. Because the UAS is expected to operate with a high level of autonomy, there is a need for a multimodal interface that allows Soldiers to directly interact with and direct the Cargo UAS. SRL is working in conjunction with Polarity Labs to develop a user-interaction approach for autonomous unmanned aircraft systems that combines sketch, text, and speech input with video/augmented-reality and graphical output.
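To give a flavor of the sketch modality, here is a toy recognizer that maps a drawn stroke to a UAS command using two geometric cues: endpoint closure (a closed loop) and path straightness (a direct line). The command names and thresholds are invented for illustration and do not reflect the actual SRL/Polarity Labs system.

```python
import math

def stroke_command(points):
    """Classify a stroke (list of (x, y) points) by simple geometry:
    a nearly closed loop reads as 'loiter', a nearly straight path
    as 'goto', anything else as 'unknown'."""
    path_len = sum(math.dist(points[i], points[i + 1])
                   for i in range(len(points) - 1))
    closure = math.dist(points[0], points[-1])  # endpoint gap
    if closure < 0.2 * path_len:
        return "loiter"   # endpoints nearly meet: closed loop
    if closure > 0.9 * path_len:
        return "goto"     # path is nearly straight
    return "unknown"

# A square-ish loop versus a straight stroke:
loop = [(0, 0), (1, 0), (1, 1), (0, 1), (0.05, 0.05)]
line = [(0, 0), (1, 0.02), (2, 0.01), (3, 0)]
print(stroke_command(loop), stroke_command(line))  # loiter goto
```

A production recognizer would instead use trained geometric sketch recognition (as in our SBIM work above) and fuse the result with the speech and text channels.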

