Thursday, December 11, 2014

CSE Capstone Students Use Machine Learning and Haptics to Further Technology

On Wednesday, December 10, students in the Computer Science and Engineering senior capstone design course presented their semester projects to classmates and industry representatives alike in the Harvey R. Bright Building at Texas A&M University. Each of the five senior design teams was mentored by members of Dr. Tracy Hammond’s Sketch Recognition Lab. The projects were iErgonomics, VR, InMotion, C.A.N.E., and CourseSketch.



Team iErgonomics
The first group created iErgonomics, a system that uses eye tracking to determine when computer users are least busy, so it can prompt them to take a break and avoid computer vision syndrome. The group comprised students Aaron Moore, Karan Khatter, Michael Gyarmathy, and Zach Cannon, and was supported by Dr. Hammond, SRL members Cassandra Oduola and Folami Alamudun, and Dr. Mark Benden, an Associate Professor in Environmental and Occupational Health.

Computer vision syndrome (CVS) refers to the discomfort that 132 million people in the U.S. experience from looking at computer screens for extended periods of time, including eyestrain, headaches, dry eyes, neck pain, and shoulder pain. Other programs have been created to help prevent CVS, but the group argued that none are user friendly because they interrupt productivity. The group therefore created iErgonomics, which uses economical eye tracking to decide when a user is not busy and, thus, when is the best time to interrupt their computer usage for the recommended break.

iErgonomics uses the Eye Tribe, an eye tracking product from Denmark that detects gaze direction and pupil dilation. When a user is considered “not busy,” a notification box appears on screen alerting the user to begin their rest period. If the user’s eyes return to the screen during the break, the tracker detects their pupils and the remaining break time starts to climb again, so the break only counts as complete once the user has looked away for the full duration. The user can choose the 20-20-20 option, which requires looking away from the screen every twenty minutes for twenty seconds, ideally at an object at least twenty feet away, or the 60-5 option, which requires looking away from the screen every hour for five minutes. Both options are supported by research on CVS.
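The two break policies described above reduce to a simple timer rule: accumulate screen time until the work interval expires, then count the break down only while the user is actually looking away. A minimal sketch of that scheduling logic (all names hypothetical; this is not the team’s actual code) might look like:

```python
# Hypothetical sketch of iErgonomics-style break scheduling.
# 20-20-20: a 20-second break every 20 minutes; 60-5: a 5-minute
# break every hour. Purely illustrative, not the team's implementation.

POLICIES = {
    "20-20-20": {"work_secs": 20 * 60, "break_secs": 20},
    "60-5": {"work_secs": 60 * 60, "break_secs": 5 * 60},
}

def break_due(seconds_on_screen, policy="20-20-20"):
    """Return True once continuous screen time reaches the work interval."""
    return seconds_on_screen >= POLICIES[policy]["work_secs"]

def update_break(remaining_secs, eyes_on_screen, elapsed_secs):
    """Count the break down only while the user looks away;
    looking back at the screen pauses the countdown."""
    if eyes_on_screen:
        return remaining_secs  # break does not progress
    return max(0.0, remaining_secs - elapsed_secs)
```

The key design point the article describes is the pause-on-return behavior: glancing back at the screen does not cancel the break, it simply stops the countdown until the user looks away again.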

In the future, per Dr. Mark Benden’s suggestion, the iErgonomics group plans to investigate whether preventing CVS improves personal or employee productivity. With a demonstrated productivity benefit, the group believes businesses would show interest in the product.


Karan Khatter

Aaron Moore

Zach Cannon

Michael Gyarmathy

Team VR
The second group created VR, a program that combines motion tracking and virtual reality to provide an immersive education system. The group comprised students Patrick Knauth, Jose Manriquez, Adrien Mombo-Caristan, and Hao Sun, and was supported by Dr. Hammond, SRL members Cassandra Oduola and Raniero Lara-Garduno, and Dr. Debra Fowler, from the Center for Teaching Excellence at Texas A&M.

VR provides student users with a physics learning application and an astronomy learning application. For physics, the program uses motion tracking, via Leap Motion, and virtual reality, via the Oculus Rift, to let users perform learning tasks such as lifting and dropping balls under varying gravitational pulls or rolling balls down surfaces with different levels of friction. In the astronomy application, users can visit each planet of our solar system and interact with the planets and other celestial objects, which likewise have varying gravitational pulls.

To account for the economic impact these learning technologies could have on users, the team separated VR into modules. Depending on what hardware they can afford, users can interact with the educational program using only a mouse, or add the optional motion tracking and virtual reality components to make the experience more immersive.

In the future, the group sees VR serving many users with different learning preferences as an easily distributed platform.

Team InMotion
The third group created the Therapeutic Exercise Haptic Monitor, an at-home physical therapy program that uses haptic feedback, visual feedback, and motion tracking to ensure the patient executes their exercises correctly. The group comprised students Jerry Barth, Patrick Vo, Trevor Gray, and Matthew Mjelde, and was supported by Dr. Hammond and SRL members Cassandra Oduola and Vijay Rajanna.

The Therapeutic Exercise Haptic Monitor uses motion tracking to ensure a patient completes their physical therapy exercises correctly and completely. Physical therapy can be costly and time consuming, and the group hopes the program will reduce the hours physical therapists must spend helping patients execute basic movements. A therapist creates an exercise routine and exports it to the patient’s home program, and the patient then performs the exercises as instructed. Once the patient reaches the correct position in an exercise, such as lifting their arm 50 degrees, a circle on the screen turns red and a vibration is sent through a felt armband with sensors embedded in the material. The patient feels the vibration and lowers their arm to the original position.
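At its core, the feedback loop described above is a threshold check on a tracked joint angle. A hypothetical sketch of that check (the target angle, tolerance, and function names are illustrative, not the team’s actual code):

```python
# Hypothetical sketch of the angle-threshold check behind the
# Therapeutic Exercise Haptic Monitor's feedback. Illustrative only.

TARGET_ANGLE_DEG = 50.0   # e.g., lift the arm to 50 degrees
TOLERANCE_DEG = 2.0       # allow small tracking error

def check_position(measured_angle_deg):
    """Return the feedback to emit for the current tracked joint angle:
    turn the on-screen circle red and pulse the armband once the
    patient is within tolerance of the target."""
    if measured_angle_deg >= TARGET_ANGLE_DEG - TOLERANCE_DEG:
        return {"circle": "red", "vibrate": True}
    return {"circle": "neutral", "vibrate": False}
```

Pairing the visual cue with the vibration means the patient gets confirmation even when they are not watching the screen mid-exercise.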

In the future, the group sees the Therapeutic Exercise Haptic Monitor benefiting athletes in their training, with the primary example being weight lifters. The Therapeutic Exercise Haptic Monitor could be used to ensure these lifters are not overextending their limbs, and the program would vibrate when the athlete reaches the ideal angle.


Jerry Barth

Patrick Vo

Trevor Gray

Matthew Mjelde
Team C.A.N.E.
The fourth group created C.A.N.E., a vibrating belt that uses sonar and GPS to allow blind users to navigate their world without a conspicuous cane or Seeing Eye dog. The group comprised Grant Hendley, Kodi Tapie, Jeff Harrison, and Matt Harper, and was supported by Dr. Hammond and SRL members Cassandra Oduola and Larry Powell.

The group deliberately chose haptic, or vibrating, feedback, because visually impaired students told them that audio feedback would interfere with their natural ability to navigate by sound. The belt uses ultrasonic sensors and GPS positioning to determine where the wearer is located and warn them of upcoming obstacles. If the belt vibrates on the right side of the body, the user should move to the left to avoid an obstacle.
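The steering cue described above amounts to mapping the bearing of a detected obstacle onto the belt motor on that same side, so the wearer steps the other way. A hypothetical sketch of that mapping (not the team’s actual firmware):

```python
# Hypothetical sketch of a C.A.N.E.-style obstacle-to-motor mapping:
# vibrate the motor on the side of the obstacle so the wearer moves
# away from it. Illustrative only, not the team's implementation.

def motor_for_obstacle(bearing_deg):
    """Map an obstacle bearing (degrees; 0 = straight ahead,
    positive = to the wearer's right) to the belt motor to fire."""
    if bearing_deg > 0:
        return "right"   # obstacle on the right -> wearer moves left
    if bearing_deg < 0:
        return "left"    # obstacle on the left -> wearer moves right
    return "both"        # dead ahead: warn on both sides
```

Because the cue is spatial rather than audible, it leaves the wearer’s hearing free for the ambient sound cues the students said they rely on.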

The group is passionate about using human-computer interaction to improve the well-being of the disadvantaged. In the future, they may extend the technology to firefighters, who must often navigate dark, smoky, and dangerous environments with little to no guidance.

Team CourseSketch
The fifth group created CourseSketch, an educational program that recognizes the images students draw or sketch. The group comprised Khoa Bui, James Granger, Andrew King, Angel Lozano, Matthew Runyon, Antonio Sanchez, Devin Tuchsen, Michael Turner, and Joshua Zodda, and was supported by Dr. Tracy Hammond, SRL members Cassandra Oduola and David Turner, and Dr. Dwayne Raymond, who used CourseSketch in his Intro to Logic class last semester.

CourseSketch is a sketch recognition program. It allows instructors of online classes, or in-person classes with online components, to go beyond multiple-choice questions. Because the system recognizes sketches, students can draw images and write equations that the system can even grade automatically. If the instructor prefers, the program gives instant feedback on open-ended problems; the instructor can also watch a student draw and give feedback in real time.

The group argued that the program is intuitive for students and instructors alike, and interactive tutorials are already built into the system. In the future, the group hopes CourseSketch will continue to be developed until full lectures can be conducted successfully using the program. It will be used in the classroom next semester for Phil240.


Andrew King III

Antonio Sanchez

Joshua Zodda

Angel Lozano

Devin Tuchsen

James Granger

Khoa Bui

Matthew Runyon

Michael Turner


Jess Gantt can be reached at jessicalgantt@gmail.com
