Thursday, December 11, 2014

CSE Capstone Students Use Machine Learning and Haptics to Further Technology

On Wednesday, December 10, students in the Computer Science and Engineering senior capstone design course presented their semester projects to classmates and industry representatives alike in the Harvey R. Bright Building at Texas A&M University. Each of the five senior design teams was mentored by members of Dr. Tracy Hammond’s Sketch Recognition Lab. The projects were iErgonomics, VR, InMotion, C.A.N.E., and CourseSketch.

Team iErgonomics
The first group created iErgonomics, a system that attempts to use eye tracking to detect when computer users are least busy and prompt them to take a break, helping them avoid computer vision syndrome. The group was made up of students Aaron Moore, Karan Khatter, Michael Gyarmathy, and Zach Cannon, and was supported by Dr. Hammond, SRL members Cassandra Oduola and Folami Alamudun, and Dr. Mark Benden, an Associate Professor in Environmental and Occupational Health.

Computer vision syndrome (CVS) refers to the discomfort that 132 million people in the U.S. experience from looking at computer screens for extended periods of time, including eyestrain, headaches, dry eyes, neck pain, and shoulder pain. Other programs have been created to help prevent CVS, but the group argued that none are user friendly because they interrupt productivity. The group therefore created iErgonomics, which uses inexpensive eye tracking to detect when a user is not busy and thus determine the best moment to interrupt their computer use for the break recommended to avoid CVS.

iErgonomics uses the Eye Tribe, an eye tracking product from Denmark that detects eye direction and pupil dilation. When a user is considered “not busy,” a notification box appears on screen alerting the user to begin their rest period. If the user’s eyes return to the screen during the break, the eye tracker recognizes their pupils and the break time starts to increase again, so the break is not over until the user has looked away for the full required time. The user can choose the 20-20-20 option, which requires looking away from the screen every twenty minutes for twenty seconds, ideally at an object at least twenty feet away, or the 60-5 option, which requires looking away from the screen every hour for five minutes. Both options are supported by research on CVS.
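The reset-on-glance behavior described above boils down to a simple timer loop. Here is a minimal Python sketch of one way such logic could work, assuming the break must be one continuous look-away; the gaze_on_screen and notify callbacks are hypothetical placeholders, not the actual Eye Tribe API or the team’s code.

```python
import time

# Hypothetical break rules taken from the two options described above.
BREAK_RULES = {
    "20-20-20": {"work_seconds": 20 * 60, "break_seconds": 20},
    "60-5":     {"work_seconds": 60 * 60, "break_seconds": 5 * 60},
}

def run_break_cycle(rule_name, gaze_on_screen, notify):
    """Wait out a work period, then require one continuous off-screen break."""
    rule = BREAK_RULES[rule_name]
    time.sleep(rule["work_seconds"])   # simplified: the real system waits for a "not busy" moment
    notify("Time to rest your eyes!")
    seconds_looked_away = 0
    while seconds_looked_away < rule["break_seconds"]:
        if gaze_on_screen():           # eyes back on screen: the break starts over
            seconds_looked_away = 0
        else:
            seconds_looked_away += 1
        time.sleep(1)
```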

In the future, per Dr. Mark Benden’s suggestion, the iErgonomics group plans to investigate how to show that avoiding CVS improves individual and employee productivity. With that focus on productivity, the group believes businesses would take interest in the product.


Karan Khatter

Aaron Moore

Zach Cannon

Michael Gyarmathy

Team VR
The second group created VR, a program that combines motion tracking and virtual reality to provide an immersive education system. The group was made up of students Patrick Knauth, Jose Manriquez, Adrien Mombo-Caristan, and Hao Sun, and was supported by Dr. Hammond, SRL members Cassandra Oduola and Raniero Lara-Garduno, and Dr. Debra Fowler of the Center for Teaching Excellence at Texas A&M.

VR offers student users a physics learning application and an astronomy learning application. For physics, the program uses motion tracking, via the Leap Motion controller, and virtual reality, via the Oculus Rift, to let users perform learning tasks such as lifting and dropping balls under varying gravitational pulls or rolling balls down surfaces with different levels of friction. In the astronomy application, users can visit each planet of our solar system and interact with planets and other celestial objects, each with its own gravitational pull.
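The effect of the different gravitational pulls can be illustrated with a few lines of physics. The sketch below is illustrative only and is not the team’s code; it simply advances a dropped ball one frame under a chosen planet’s surface gravity.

```python
# Approximate surface gravities in m/s^2 (standard reference values).
SURFACE_GRAVITY = {"Earth": 9.81, "Moon": 1.62, "Mars": 3.71, "Jupiter": 24.79}

def step_falling_ball(height_m, velocity_m_s, planet, dt=1 / 60):
    """Advance a dropped ball by one frame under the chosen planet's gravity."""
    g = SURFACE_GRAVITY[planet]
    velocity_m_s -= g * dt                              # gravity accelerates the ball downward
    height_m = max(0.0, height_m + velocity_m_s * dt)   # stop at the ground
    return height_m, velocity_m_s
```

Dropping the same ball on the Moon versus Jupiter then produces visibly different fall times, which is the kind of contrast the learning tasks are meant to convey.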

To account for the cost these learning technologies can impose on users, the team separated VR into modules. Depending on what hardware they can afford, users can interact with the educational program using just a mouse, or add the optional motion tracking and virtual reality components to make the experience more immersive.

In the future, the group sees VR as an easily distributed platform that can reach many users and serve a range of learning preferences.

Team InMotion
The third group created the Therapeutic Exercise Haptic Monitor, an at-home physical therapy program that uses haptic feedback, visual feedback, and motion tracking to ensure that patients execute their exercises correctly. The group was made up of students Jerry Barth, Patrick Vo, Trevor Gray, and Matthew Mjelde, and was supported by Dr. Hammond and SRL members Cassandra Oduola and Vijay Rajanna.

The Therapeutic Exercise Haptic Monitor uses motion tracking to ensure that a patient completes their physical therapy exercises correctly and completely. Physical therapy can be costly and time consuming, and the group hopes the program will reduce the hours physicians must spend helping patients execute basic movements. Physical therapists can create exercise routines and export them to the patient’s home program, and the patient then performs the exercises as the program instructs. Once the patient reaches the correct position in an exercise, such as lifting their arm 50 degrees, a circle on the screen turns red and a vibration is sent through a felt armband with sensors embedded in the material. The patient feels the vibration and lowers their arm back to the original position.
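The feedback step amounts to a threshold check on the tracked joint angle. Below is a minimal sketch of what that check might look like; the 50-degree target, the tolerance, and the send_vibration and set_indicator_color helpers are assumptions for illustration, not the team’s actual implementation.

```python
# Hypothetical exercise target taken from the arm-raise example above.
TARGET_ANGLE_DEGREES = 50
TOLERANCE_DEGREES = 5

def check_repetition(measured_angle, send_vibration, set_indicator_color):
    """Give visual and haptic feedback once the tracked limb reaches the target angle."""
    if measured_angle >= TARGET_ANGLE_DEGREES - TOLERANCE_DEGREES:
        set_indicator_color("red")  # the on-screen circle turns red
        send_vibration()            # buzz the felt armband so the patient knows to lower the arm
        return True
    set_indicator_color("neutral")
    return False
```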

In the future, the group sees the Therapeutic Exercise Haptic Monitor benefiting athletes in their training, with the primary example being weight lifters. The Therapeutic Exercise Haptic Monitor could be used to ensure these lifters are not overextending their limbs, and the program would vibrate when the athlete reaches the ideal angle.


Jerry Barth

Patrick Vo

Trevor Gray

Matthew Mjelde

Team C.A.N.E.
The fourth group created C.A.N.E., a vibrating belt that uses sonar and GPS to help blind users navigate their world without a conspicuous cane or guide dog. The group was made up of Grant Hendley, Kodi Tapie, Jeff Harrison, and Matt Harper, and was supported by Dr. Hammond and SRL members Cassandra Oduola and Larry Powell.

The group deliberately chose haptic, or vibrating, feedback because visually impaired students told them that audio feedback would interfere with their natural ability to use sound to navigate their surroundings. The belt uses ultrasonic sensors and GPS positioning to determine where the wearer is and warn them of upcoming obstacles: if the belt vibrates on the right side of the body, the user should move to the left to avoid the obstacle.
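The obstacle-to-vibration mapping can be thought of as a simple rule over the sonar readings. The sketch below is a guess at that logic; the two-sensor layout, the distance threshold, and the vibrate helper are illustrative assumptions, not the team’s hardware interface.

```python
# Hypothetical distance below which an obstacle triggers a warning.
OBSTACLE_THRESHOLD_CM = 120

def choose_vibration(left_distance_cm, right_distance_cm, vibrate):
    """Vibrate the side of the belt nearest an obstacle so the wearer steers away from it."""
    if right_distance_cm < OBSTACLE_THRESHOLD_CM:
        vibrate("right")  # obstacle on the right, so the user should move left
    if left_distance_cm < OBSTACLE_THRESHOLD_CM:
        vibrate("left")   # obstacle on the left, so the user should move right
```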

The group is passionate about using human-computer interaction to improve the well-being of the disadvantaged. In the future, they may extend the technology to firefighters, who often navigate dark, smoky, and dangerous environments with little to no guidance.

Team CourseSketch
The fifth group created CourseSketch, an educational program that recognizes the images student users draw or sketch. The group was made up of Khoa Bui, James Granger, Andrew King, Angel Lozano, Matthew Runyon, Antonio Sanchez, Devin Tuchsen, Michael Turner, and Joshua Zodda, and was supported by Dr. Tracy Hammond, SRL members Cassandra Oduola and David Turner, and Dr. Dwayne Raymond, who used CourseSketch in his Intro to Logic class last semester.

CourseSketch is a sketch recognition program that lets instructors of online classes, or in-person classes with online components, do more than multiple-choice questions online. Because the system recognizes sketches, students can draw images and equations that the system can even grade automatically. If the instructor prefers, the program can give instant, automatic feedback on open-ended problems. Instructors can also watch a student’s drawing unfold and give feedback in real time.

The group argued that the program is intuitive for student and instructor users alike, and interactive tutorials are already built into the system. In the future, the group hopes that CourseSketch will continue to be developed so that full lectures can be conducted through the program. It will be used in the classroom next semester for Phil240.


Andrew King III

Antonio Sanchez

Joshua Zodda

Angel Lozano

Devin Tuchsen

James Granger

Khoa Bui

Matthew Runyon

Michael Turner


Jess Gantt can be reached at jessicalgantt@gmail.com

Friday, December 5, 2014

Clint Brown's Reflections on his Visit to the SRL

As discussed in last week's blog post, Clint Brown, Director of Product Engineering at Esri, a leading innovator in geographic information systems (GIS), visited Dr. Tracy Hammond's Sketch Recognition Lab on Tuesday, November 18. Brown visited during the three-day Texas A&M University GIS Day 2014 event organized by Dr. Daniel Goldberg, Assistant Professor of Geography.

Brown offered his thoughts on his visit to the SRL and graciously allowed us to share them with you here! He found many similarities between SRL research and Esri product development: both use small collaborative teams to engineer projects on the cutting edge of computer science.

Clint Brown, Monday, December 1:


I very much enjoyed meeting the students in the lab and having each of them show their work. It reminded me a lot of how our development teams work at Esri. At Esri, we are always searching for true talent. We continually seek out clever software engineers who have what it takes to be creative, to work on teams, and to pay close attention to the user experience.

We like to work on real customer problems, very much like the ones that the students are working on in the lab. And a key aspect of that work is to imagine new kinds of experiences for software users. 
The best experiences are graphic and interactive and fun to use. And they do something useful, but are also cool. They are delightful in their own unique ways. That’s the vibe I picked up on in the Lab. You definitely have a talented team of very smart, creative people collaborating to solve some incredibly complex problems. And those team members are coming up with some inventive, and fun, approaches. It’s a talent that the software industry (and Esri in particular) is always looking for.

In addition, I believe that leadership and collaboration are incredibly important. Leaders who can motivate and guide designs and navigate the team through critically important creative processes are incredibly valuable. Dr. Hammond, I think, is providing strong vision and leadership. And I liked the collegial atmosphere. You work as a team.


At Esri, on our software development teams, we have similar meetings where our small project teams show their latest work to their peers and to our leadership. The process of having to introduce, explain, and ultimately present your results through demonstrations is a very effective way to develop strong, robust solutions to the problems you are trying to solve. And this process needs to be repeated over the life of the project; you are constantly iterating and evolving your designs and implementations.

Great ideas are hard enough to create, but implementing elegant solutions for those ideas requires the genius of teams working together, combined with a willingness to adapt and evolve your designs to move them forward. I really liked that about the lab. I really liked the creativity and openness of the students. I felt like feedback and collaboration are a big part of your success.

It’s fantastic to see these applied projects. I believe they add so much more depth and understanding about how new computing approaches and technology are evolving. The empirical experience that the students are gaining is incredibly valuable.


Here are a few other observations about how your work is like our product development here at Esri:

- We have small teams, and we expect each team to articulate their goals and to demonstrate their software in short iterations. We apply agile methods like Scrum.


- We expect that each new version of the software does something interesting and important to address key aspects of the overall problem for which we are trying to build solutions. We demo at each iteration and in between.


- Our work is very graphic – cartographic. In other words, maps and information layers are at the heart of our work. Your work is as well.


- The broad adoption of smart phones and tablets in the past decade is causing great disruption and is transforming computing. The designs for commercial software and apps are now incredibly graphic in nature. This is a great point of departure (compared to traditional computer science approaches) for the innovative work being done in the lab. It’s not that computer science concepts are invalid or are going away. It’s how they are being applied in this new paradigm.


- Today’s computing platforms are communications networks, enabling a great deal of web interaction and the use of smart devices and development tools like JavaScript, JSON, the web, CSS, iOS, Android, etc., as well as focused apps like the ones you are building.

- Your projects are collaborative in nature. It’s about the work being done by the team and about the team’s talent and ethics.


- Social coding and shared source are fundamental to how new software will be developed. This is about people collaborating and sharing great ideas and implementations that others can leverage in their own work.


So, I very much enjoyed my visit to the lab. And as an added bonus, it was in the Teague Building, in the same location where the “Institute” of Statistics was back in 1976 to 1978, when I was a graduate student in the Statistics program earning my Master of Science degree!


Thanks for the wonderful experience on my visit.

Jess Gantt can be contacted at jessicalgantt@gmail.com.