Niall Williams

Niall L. Williams

PhD Student
GAMMA Lab
Department of Computer Science
University of Maryland, College Park
College Park, MD 20740
USA

e-mail: niallw AT umd.edu
[CV] [Google Scholar] [Twitter] [Github] [LinkedIn]

I am on the job market for a full-time position starting in Summer or Fall 2024! I am interested in positions relating to AR/VR, human perception, and computer graphics. Please contact me if you have any leads!

About me

I am a PhD candidate in computer science at the University of Maryland, College Park. I am a member of the GAMMA lab, where I work with Dr. Dinesh Manocha, Dr. Aniket Bera, and Dr. Ming C. Lin. My research interests include virtual/augmented reality, human perception, computer graphics, and robotics (in that order). For my dissertation, I am developing robust methods that let users explore large virtual environments by walking naturally within arbitrary physical environments. To this end, I use techniques from visual perception, robot motion planning, computational geometry, and statistical modeling to develop rigorous algorithms for steering users through unseen physical environments.

I graduated with a B.S. with High Honors in Computer Science from Davidson College. During my time at Davidson, I was a member of the DRIVE lab, where I was advised by Dr. Tabitha C. Peck. My undergraduate thesis studied redirected walking thresholds under different conditions and how to estimate them efficiently.

My name is pronounced in the same way that you pronounce "Nile."
In my free time, I mostly enjoy drawing and competitive video games (DotA 2 and Tetris).


News


Journal and Conference Publications

A full list of my publications can also be found on my Google Scholar profile. Representative papers are highlighted.
* denotes equal contributions.
Perceptual Thresholds for Radial Optic Flow Distortion in Near-Eye Stereoscopic Displays
Mohammad R. Saeedpour-Parizi, Niall L. Williams, Tim Wong, Phillip Guan, Dinesh Manocha, Ian M. Erkelens
Transactions on Visualization and Computer Graphics, 2024
Proc. IEEE VR 2024
We measured how sensitive observers are to image magnification artifacts in varifocal displays and to what extent we can leverage blinks to decrease their visual sensitivity, with applications to mitigating the vergence-accommodation conflict.
A Framework for Active Haptic Guidance Using Robotic Haptic Proxies
Niall L. Williams*, Nicholas Rewkowski*, Jiasheng Li, Ming C. Lin
IEEE International Conference on Robotics and Automation (ICRA), 2023
We used a robot to proactively generate context-aware haptic feedback that influences the user's behavior in mixed reality, to improve the immersion and safety of their virtual experience.
ENI: Quantifying Environment Compatibility for Natural Walking in Virtual Reality
Niall L. Williams, Aniket Bera, Dinesh Manocha
IEEE Conference on Virtual Reality and 3D User Interfaces (IEEE VR), 2022   [Best Paper Honorable Mention]
We provide a metric to quantify the ease of collision-free navigation in VR for any given pair of physical and virtual environments, using geometric features.
Redirected Walking in Static and Dynamic Scenes Using Visibility Polygons
Niall L. Williams, Aniket Bera, Dinesh Manocha
Transactions on Visualization and Computer Graphics, 2021
Proc. IEEE ISMAR 2021   [Best Paper Honorable Mention]
We formalize the redirection problem using motion planning and use this formalization to develop an improved steering algorithm based on the similarity of physical and virtual free spaces.
ARC: Alignment-based Redirection Controller for Redirected Walking in Complex Environments
Niall L. Williams, Aniket Bera, Dinesh Manocha
Transactions on Visualization and Computer Graphics, 2021
Proc. IEEE VR 2021   [Best Paper Honorable Mention]
We achieve improved steering results with redirected walking by steering the user towards positions in the physical world that more closely match their position in the virtual world.
PettingZoo: Gym for Multi-Agent Reinforcement Learning
J. K. Terry, Benjamin Black, Mario Jayakumar, Ananth Hari, Ryan Sullivan, Luis Santos, Clemens Dieffendahl, Niall L. Williams, Yashas Lokesh, Caroline Horsch, Praveen Ravi
Neural Information Processing Systems (NeurIPS), 2021
One of the most popular libraries for multi-agent reinforcement learning.
Generating Emotive Gaits for Virtual Agents Using Affect-Based Autoregression
Uttaran Bhattacharya, Nicholas Rewkowski, Pooja Guhan, Niall L. Williams, Trisha Mittal, Aniket Bera, Dinesh Manocha
IEEE International Symposium on Mixed and Augmented Reality (ISMAR), 2020
We automatically synthesize emotionally expressive gaits for virtual avatars using an autoregression network.
Estimation of Rotation Gain Thresholds Considering FOV, Gender, and Distractors
Niall L. Williams, Tabitha C. Peck
Transactions on Visualization and Computer Graphics, 2019
Proc. IEEE ISMAR 2019
We measured perceptual thresholds for redirected walking and found that the user's tolerance for redirection depends on the field of view, the presence of distractors, and their gender.

Workshop Papers and Posters

Redirection Using Alignment
Niall L. Williams, Aniket Bera, Dinesh Manocha
IEEE VR Locomotion Workshop, 2021
We provide a general framework for how alignment can be used in redirected walking to steer the user towards similar physical and virtual positions.
Augmenting Physics Education with Haptic and Visual Feedback
Kern Qi, David Borland, Emily Jackson, Niall L. Williams, James Minogue, and Tabitha C. Peck
IEEE VR 5th Workshop on K-12+ Embodied Learning through Virtual & Augmented Reality (KELVAR), 2020
Using haptic force feedback to help teachers better understand physics concepts.
The Impact of Haptic and Visual Feedback on Teaching
Kern Qi, David Borland, Emily Jackson, Niall L. Williams, James Minogue, and Tabitha C. Peck
IEEE Conference on Virtual Reality and 3D User Interfaces, 2020
Estimation of Rotation Gain Thresholds for Redirected Walking Considering FOV and Gender
Niall L. Williams, Tabitha C. Peck
IEEE Conference on Virtual Reality and 3D User Interfaces, 2019

Invited Talks

ARC: Alignment-based Redirection Controller for Redirected Walking in Complex Environments
Niall L. Williams
SIGGRAPH TVCG Session on VR, 2021

Service

Teaching

I greatly enjoy teaching since it combines some of my favorite things: talking about computer science, introducing people to computer science, and learning. I would like to see more diverse groups of people become active in the computer science community, and I think teaching is an important step towards that. It's important to me that everyone has an equal opportunity to learn, so I try my best to be welcoming and to make no assumptions about people's prior knowledge. I learned a lot about teaching during my time as an undergrad at Davidson, and I deeply agree with the college's approach.

Fun Stuff

A collection of random bits of info about me or things I find interesting.

Useful resources:

Fun reads/cool things:


Website format copied from Sophie Jörg.