Friday, March 4, 2016


SRL Dissertation Defense
Hong-Hoe (Ayden) Kim
Friday, March 4, 2016

Title:  A Fine Motor Skill Classifying Framework to Support Children's Self-regulation Skills and School Readiness

Hong-Hoe Kim
3:30pm Friday, March 4, 2016
Room 323 Teague Building

Abstract
Children's self-regulation skills predict their school readiness and social behaviors, and assessing these skills enables parents and educators to target areas for improvement so that children enter school ready to learn and achieve. Educators currently assess children's fine motor skills through paper-based assessments, either by determining the correctness of their shape drawings or by measuring their drawing durations. However, these methods require human experts to manually assess children's fine motor skills, which is time-consuming and prone to human error and bias. We introduce a fine motor skill classifying framework based on children's digital drawings on tablet computers. The framework contains two fine motor skill classifiers and a sketch-based educational interface.

Biography
Hong-Hoe Kim is a PhD candidate in the Sketch Recognition Lab at Texas A&M University under the supervision of Dr. Tracy Hammond. He received his Master's degree in Computer Science from Texas A&M University and his B.S. degree in Computer Science from Soongsil University in Korea. His research areas include Child-Computer Interaction, Human-Computer Interaction (HCI), Machine Learning, and Educational Psychology.

Advisor: Dr. Tracy Hammond

Thursday, March 3, 2016


SRL Thesis Defense:
Purnendu Kaul
Thursday, March 3

Title:  Gaze Assisted Classification of On-Screen Tasks (by Difficulty Level) and User Activities (Reading, Writing/Typing, Image-Gazing)
Purnendu Kaul
12:30pm Thursday, March 3, 2016
323 Teague, CSE, TAMU

Abstract
Intelligent tutoring systems (ITS) are commonly used to indirectly assist classroom instructors by helping them deliver learning material and assess students' progress as they learn. Today, such systems put the onus of asking for appropriate help on students instead of assessing their needs automatically. This provides an opportunity to build systems capable of adapting to the cognitive states of students as they learn.

We have shown that gaze-assisted human-computer interaction is a means of transforming these systems into more proactive and adaptive ones. A system with eye-tracking capability can be trained to learn the cognitive states of a user and offer contextual assistance. In this research, we conducted an experiment using Mechanix, a sketch-based ITS that helps students learn how to solve truss problems.

Through this experiment, we investigated the possibility of using eye gaze data to classify problems being solved by students as easy, medium, or hard. We also classified the activity being performed by users as "reading," "gazing at an image," or "drawing/typing." We used only those gaze features that can be calculated in real time and are not dependent on the duration of activity on the system. The results show that gaze features can clearly differentiate between the activities with an accuracy of 94% and classify the problems as easy, medium, or hard with an accuracy of more than 70%.
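As a toy illustration of the activity-classification step described above, the sketch below trains a nearest-centroid classifier on invented, duration-independent gaze features. The feature names, values, and the nearest-centroid model are all hypothetical stand-ins; the thesis evaluates richer features and classifiers.

```python
# Hypothetical sketch: classifying user activity ("reading", "image-gazing",
# "drawing/typing") from duration-independent gaze features using a
# nearest-centroid classifier. All feature values here are invented.

def centroid(rows):
    """Component-wise mean of a list of equal-length feature vectors."""
    return [sum(col) / len(col) for col in zip(*rows)]

def train(examples):
    """examples: dict mapping activity label -> list of feature vectors."""
    return {label: centroid(rows) for label, rows in examples.items()}

def classify(model, features):
    """Return the label whose centroid is closest (squared Euclidean)."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(model, key=lambda label: dist(model[label], features))

# Feature vector: [mean fixation duration (ms), mean saccade amplitude (px)]
examples = {
    "reading":        [[210, 60], [200, 70], [220, 65]],
    "image-gazing":   [[350, 180], [340, 200], [360, 190]],
    "drawing/typing": [[150, 30], [140, 25], [160, 35]],
}
model = train(examples)
```

In this scheme, a new feature vector such as `[205, 68]` would fall nearest the "reading" centroid; a real system would compute such features from a live eye-tracker stream.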

Biography
Purnendu is a master's student in the Sketch Recognition Lab. He completed his undergraduate degree at the National Institute of Technology Kurukshetra, India, and worked at the Indian Institute of Technology for a year before starting the graduate program at Texas A&M University. He was a software intern at Schlumberger Information Solutions in Houston during the summer of 2014.

Advisor: Dr. Tracy Hammond


SRL Thesis Defense: Shalini Priya Ashok Kumar
Thursday, March 3

Title:  Evaluation of Conceptual Sketches on Stylus-based Devices

Shalini Priya Ashok Kumar
9:00am Thursday, March 3, 2016
Room  323 Teague Building

Abstract
Design sketching is an important tool for designers and creative professionals to express their ideas and thoughts in a visual medium. Because it is a critical and versatile skill for engineering students, the course is often taught in universities on pen and paper. However, this traditional pedagogy is limited by the availability of human instructors to give feedback. Intelligent interfaces can address this problem by mimicking the feedback given by an instructor and assessing student-drawn sketches to give students insight into the areas they need to improve. PerSketchTivity is an intelligent tutoring system that allows students to practice their drawing fundamentals and gives them real-time assessment and feedback. This research develops the grading rubric that enables us to grade students from their sketch data.

Biography
Shalini Priya Ashok Kumar received her Bachelor of Technology degree in Information Technology from the National Institute of Technology Karnataka, Surathkal. She worked for Citrix R&D India, Bangalore, for a couple of years, after which she joined the Master's program at Texas A&M University. She works in the Sketch Recognition Lab on developing software for design sketching students. Her interests are Artificial Intelligence, Machine Learning, and Human-Computer Interaction.

Tuesday, March 1, 2016


SRL Dissertation Defense:
Folami Alamudun
Monday, February 29

Title:  Analysis of Visuo-cognitive Behavior in Screening Mammography

Folami Alamudun
12:00pm Monday, February 29, 2016
Room 323 Teague Building

Abstract
Improved precision in modeling and predicting human behavior and the underlying metacognitive processes is now possible thanks to significant advances in bio-sensing device technology and improved techniques in machine intelligence. Eye-tracking bio-sensors measure psycho-physiological response through changes in the configuration of the human eye. These changes include positional measures, such as visual fixations, saccadic movements, and scanpaths, and non-positional measures, such as blinks and pupil dilation and constriction. Using data from eye-tracking sensors, we can model human perception, cognitive processes, and responses to external stimuli.
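For concreteness, fixations are commonly extracted from raw gaze samples with a dispersion-threshold (I-DT) algorithm. The sketch below is a generic illustration of that idea with invented thresholds and sample format; it is not the processing pipeline used in this study.

```python
# Minimal dispersion-threshold (I-DT) fixation detection sketch.
# Thresholds and the (x, y) sample format are illustrative only.

def dispersion(window):
    """Spread of a window of gaze points: (max x - min x) + (max y - min y)."""
    xs = [p[0] for p in window]
    ys = [p[1] for p in window]
    return (max(xs) - min(xs)) + (max(ys) - min(ys))

def detect_fixations(samples, max_dispersion=25.0, min_samples=5):
    """samples: list of (x, y) gaze points at a fixed sampling rate.
    Returns a list of (start_index, end_index, centroid) fixations."""
    fixations = []
    i, n = 0, len(samples)
    while i < n:
        j = i + min_samples
        if j > n:
            break
        if dispersion(samples[i:j]) <= max_dispersion:
            # Grow the window while the dispersion stays under threshold.
            while j < n and dispersion(samples[i:j + 1]) <= max_dispersion:
                j += 1
            xs = [p[0] for p in samples[i:j]]
            ys = [p[1] for p in samples[i:j]]
            fixations.append((i, j - 1, (sum(xs) / len(xs), sum(ys) / len(ys))))
            i = j
        else:
            i += 1
    return fixations
```

Saccades then fall out as the movements between consecutive detected fixations, and non-positional measures such as blinks and pupil size come directly from the tracker's data stream.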

In this study, we investigate visuo-cognitive behavior in screening mammography under clinically equivalent experimental conditions. We examined the behavior of 10 image readers (three breast-imaging radiologists and seven Radiology residents) during the diagnostic decision process for breast cancer in screening mammography. Using a head-mounted eye tracking device, we recorded eye movements, pupil response, and diagnostic decisions from each image reader for 100 screening mammograms. Our corpus of mammograms comprised cases of varied pathology and breast parenchyma density.

We proposed algorithms for the extraction of primitives, which encode discriminative patterns in positional and non-positional measures of the eye. These primitives capture changes that correlate with individual radiologists, radiologists’ experience level, case pathology, breast parenchyma density, and diagnostic decision. We evaluated the effectiveness of these primitives through performance measures, using ten-fold cross-validation for training and testing a simple learning algorithm.
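The evaluation protocol above can be illustrated with a minimal ten-fold cross-validation loop. The majority-class baseline below is only a stand-in for the study's simple learning algorithm, and the labels are synthetic; the point is the train/test fold structure.

```python
# Sketch of ten-fold cross-validation: split the data into 10 folds, train on
# 9 and test on the held-out fold, and average accuracy over all folds.
# The "classifier" here is a majority-class baseline, used only to illustrate
# the protocol.

from collections import Counter

def k_fold_indices(n, k=10):
    """Yield (train_indices, test_indices) for k contiguous folds over n items."""
    fold = n // k
    for f in range(k):
        test = list(range(f * fold, (f + 1) * fold if f < k - 1 else n))
        held_out = set(test)
        train = [i for i in range(n) if i not in held_out]
        yield train, test

def cross_validate(labels, k=10):
    """Mean accuracy of a majority-class predictor under k-fold CV."""
    correct = total = 0
    for train, test in k_fold_indices(len(labels), k):
        majority = Counter(labels[i] for i in train).most_common(1)[0][0]
        correct += sum(1 for i in test if labels[i] == majority)
        total += len(test)
    return correct / total
```

In the study itself, the feature vectors would be the extracted eye-movement primitives and the labels would be, e.g., case pathology or the reader's diagnostic decision.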

Our results suggest that a combination of machine intelligence and new bio-sensing modalities is an adequate predictor of the characteristics of a mammographic case and of image readers’ diagnostic performance. Our results also suggest that primitives characterizing eye movements can be useful for biometric identification of radiologists. These findings have implications for real-time performance monitoring and for personalized intelligent training and evaluation systems in screening mammography.

Biography
Folami Alamudun is a doctoral candidate in the Department of Computer Science and Engineering at Texas A&M University, working directly with Dr. Tracy Hammond in the Sketch Recognition Laboratory (SRL) and in collaboration with Dr. Georgia Tourassi at Oak Ridge National Laboratory's (ORNL) Biomedical Sciences and Engineering Center.

His research focuses on machine intelligence and bio-sensing device applications in user behavior modeling.

Advisor: Dr. Tracy Hammond