Academic Interests
Augmented Reality
Virtual Reality
Novel User Interfaces
Human-Computer Interaction
Education
University of Illinois at Chicago, IL
Ph.D. student in Computer Science
Advised by Prof. Debaleena Chattopadhyay
at the UIC HCI Lab.
Korea University, Seoul
M.S. in Computer Science and Engineering
Advised by Prof. Gerard J. Kim
at the Digital Experience Laboratory
Korea University, Seoul
B.Eng. in Computer and Communication Engineering
Advised by Prof. Gerard J. Kim
at the Digital Experience Laboratory
Ewha Womans University, Seoul
B.S. in Public Health
Department of Health Education & Management
Minor: Multimedia
Research & Projects
myCityMeter: Helping Older Adults Manage the Environmental Risk Factors for Cognitive Impairment
2018
A pollution exposure management tool for older adults and their caregivers. myCityMeter measures the pollutants shown to be associated with cognitive impairment in older adults: PM2.5 and ambient noise.
- Android, Raspberry Pi
DoodleLock: N-stroke Rhythmic Drawing Lockscreen
2016
A multi-stroke pattern-password scheme that lets users draw meaningful, easy-to-remember patterns, or the patterns they habitually doodle.
- Android, OpenCV
3D Indoor Crime Scene Reconstruction Using a Smartphone and Kinect
2014, 2016
Funded by the National Forensic Service (NFS), an agency of the Ministry of the Interior, Korea
Reconstructs an indoor crime scene in a 3D virtual environment from acquired scene models. Users can explore the scene in first-person view (as the victim or the criminal) and simulate crime scenarios in virtual space.
(figures) First-person view through the HMD; third-person view
Coordination of CCTV and 3D Video, and a 3D-Based Evidence-Collecting System
2015
Funded by the National Forensic Service (NFS), an agency of the Ministry of the Interior, Korea
A system that builds a textured 3D face model from face images detected and captured in 2D video (CCTV footage).
- C++, OpenCV, OpenGL
(figures) Frontalized face image; texture map; generated texture image; screenshot
Blurry (Sticky) Finger: Proprioceptive Pointing and Selection of Distant Objects for Optical See-through based Augmented Reality.
2015, 2016
Proposes an AR interaction method in which one uses the unfocused, blurred finger, together with the senses of proprioception and ocular dominance, to aim at, point to, and directly select a distant object in the real world for further interaction.
- Android
The basic concept of Blurry Finger
AR based object inquiry system
(top) The system setup with the OST display and (bottom) a scene from actual usage.
The user encircles an object in the real world; the captured image is sent to a database for matching, and the result is shown through the OST. A monitor can be used to show the glass/camera view to an audience.
Project 'Eyeing'
2014
A two-channel password system that uses eye movement as a complementary factor to prevent attacks such as shoulder surfing and keyboard sniffing.
- Android, OpenCV
- SHA-256 hash function to hash the password before storing it.
‘Eyeing’ requires a number and a gaze position at the same time: the user must move their eyes to the correct position while entering the number by tapping the screen. As shown in the screenshot above, to input the key ‘number: 1 / eye: left’, the user directs their pupils to the left and presses the number 1.
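The two-channel idea can be illustrated with a minimal sketch (in Python rather than the project's Android/Java; all names and the salt are hypothetical, not the project's actual code): each entry pairs the tapped digit with the gaze direction captured at that moment, and the whole sequence is hashed with SHA-256 before storage.

```python
import hashlib

def hash_password(entries, salt=b"demo-salt"):
    """Hash a sequence of (digit, gaze) pairs with SHA-256.

    Combining both channels means a shoulder surfer (who sees only
    the screen taps) or a keyboard sniffer (who logs only the digits)
    captures just one half of each entry.
    """
    h = hashlib.sha256()
    h.update(salt)
    for digit, gaze in entries:
        h.update(f"{digit}:{gaze}".encode())
    return h.hexdigest()

stored = hash_password([(1, "left"), (7, "up")])
attempt = hash_password([(1, "left"), (7, "up")])
print(stored == attempt)  # → True
```

Note that the same digits entered with different gaze directions produce a different digest, which is what makes the second channel a complementary factor.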
Publications
2018
myCityMeter: Helping Older Adults Manage the Environmental Risk Factors for
Cognitive Impairment.
Sakhnini, Nina, Yu, Ja Eun, and Chattopadhyay, Debaleena
Proposes a pollution-exposure management tool for older adults and their caregivers. It measures the pollutants shown to be associated with cognitive impairment in older adults. Using a set of neighborhood-level stationary and personal mobile sensors, it helps users monitor their environmental exposures, anticipate exposures when planning activities, journal cognitive performance, and take day-to-day actions to avoid the environmental risk factors for early dementia.
UbiComp/ISWC 2018 (poster) [pdf]
2016
Blurry (Sticky) Finger: Proprioceptive Pointing and Selection of Distant Objects
for Optical See-through based Augmented Reality.
Yu, Ja Eun, and Kim, Gerard J.
Proposes an AR interaction method in which one uses the unfocused, blurred finger, together with the senses of proprioception and ocular dominance, to aim at, point to, and directly select a distant object in the real world for further interaction.
ISMAR 2016 (demo) [pdf]
ICAT 2016 (full paper) [pdf]
2015
Resolving view difference between eye and camera for proprioceptive pointing and selection in augmented reality applications.
Yu, Ja Eun, and Kim, Gerard J.
Proposes “proprioceptive” pointing (and selection), in which the finger and the sense of proprioception are used, without focusing on the finger, to point at and select an object in the real world.
VRST 2015 (poster) [pdf]
Eye Strain from Switching Focus in Optical See-through Displays
Yu, Jaeun, and Kim, Gerard J.
Measures the level of eye fatigue caused by frequent refocusing between real and virtual objects, and its relation to focusing duration.
INTERACT 2015 (poster) [pdf]
© 2016 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works.
Technical Skills
Experienced:
C, C++, Java, Android development, OpenGL, OpenCV
Basic ability:
Python, web programming (HTML, CSS, JavaScript, jQuery),
Arduino, VHDL, MATLAB
Software:
Unity3D, 3ds Max
© 2019 Ja Eun Yu.