(MEET ØffGrid)

Jeff Hansberger, Ph.D.
Founder/CEO
Dr. Hansberger is one of the Army’s leading researchers in Human–Computer Interaction (HCI) and spatial user interfaces, with over 20 years of service at the U.S. Army Research Laboratory.
He founded and led a skunkworks-style spatial UI initiative, driving the design and evaluation of next-generation Soldier–system interfaces that integrate cognitive science, advanced visualization, and multimodal interaction research.
He was principal investigator for Vitreous, a 360-degree indirect vision and spatial UI system for ground vehicle crews, which he led from concept to Technology Readiness Level (TRL) 6. Vitreous integrated real-time sensor data, holographic dashboards, and multimodal inputs (eye gaze, voice, gesture, and physical controls), enabling Soldiers to “see through” vehicle hulls while remaining protected, significantly improving target detection and mission awareness.
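For illustration only, a minimal Swift sketch of one interaction pattern of the kind Vitreous combined: gaze-to-select plus voice-to-confirm for target designation. All names, types, and the dwell threshold here are assumptions for the sketch, not code or values from the actual Vitreous system.

```swift
import Foundation

// Hypothetical sketch: the Soldier looks at a sensor track (eye gaze)
// and confirms it by voice. Names and thresholds are illustrative.
enum InputEvent {
    case gazeFixation(trackID: Int, duration: TimeInterval)
    case voiceCommand(String)
}

struct TargetDesignator {
    private(set) var gazedTrack: Int?

    // Returns a designated track ID only when gaze and voice agree.
    mutating func handle(_ event: InputEvent) -> Int? {
        switch event {
        case .gazeFixation(let trackID, let duration) where duration > 0.3:
            gazedTrack = trackID          // dwell long enough to count as intent
            return nil
        case .voiceCommand(let phrase) where phrase.lowercased().contains("designate"):
            defer { gazedTrack = nil }
            return gazedTrack             // confirm whatever is currently gazed at
        default:
            return nil
        }
    }
}

var designator = TargetDesignator()
_ = designator.handle(.gazeFixation(trackID: 42, duration: 0.5))
if let target = designator.handle(.voiceCommand("designate target")) {
    print("Track \(target) designated")   // -> Track 42 designated
}
```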
Beyond Vitreous, Dr. Hansberger has led multiple DARPA and OSD efforts, including Visual Media Reasoning (VMR), which applied cognitive task analysis and spatial interfaces to enhance intelligence analysis. His broader research portfolio spans command-and-control visualization, digital twin development, PMESII analysis, and adaptive multimodal interaction frameworks.
As Affiliate Professor at the University of Alabama in Huntsville, he teaches graduate courses in task analysis, prototyping, and usability evaluation, strengthening the pipeline of Army-relevant HCI expertise. He has authored over 50 peer-reviewed publications, technical reports, and NATO/ARL papers on spatial interfaces, decision support, and cognitive systems engineering.
Education: 
Ph.D., Human Factors & Applied Cognition, George Mason University (2003)
Relevant Awards: DEVCOM ARL Honorary Award – Impactful Communication (2023); Best Paper Award, HCI International – VR/XR track (2017); Finalist, Federal Virtual World Challenge (2010).
Relevant Publications:
Designing an Interactive Mixed Reality Cockpit for Enhanced Soldier-Vehicle Interaction (GVSETS 2023); Target Detection Performance for Head Mounted Indirect Vision Displays (SPIE 2020); Virtual Collaboration Spaces: Bringing Presence to Distributed Collaboration (JVWR 2014); Whole-government Planning and Wargaming of Complex International Interventions (International C2 Journal 2010).

Jayse Hansen
Founder/CCO
Spatial UI Designer with 20+ years of professional experience; widely recognized for pioneering work in cinematic and defense user interfaces.
Relevant Experience:
Jayse Hansen has over 20 years of experience designing user interfaces for film, defense, and commercial applications. He is widely recognized for creating futuristic UIs—HUDs, holograms, volumetric displays, and spatial interactions—for major film franchises including Marvel’s Avengers, Iron Man, Spider-Man, Guardians of the Galaxy, Star Wars, Top Gun: Maverick, and Tron: Ares. These cinematic designs have shaped how global audiences imagine next-generation interfaces and have directly influenced real-world design thinking.
In defense contexts, Jayse has collaborated with Army researchers to adapt advanced design principles for spatial UIs, contributing to the Vitreous program for mounted vehicle operations. He has also brought UI design expertise to projects supporting the Navy, the Space Force, and other services, reinforcing his ability to tailor advanced visualization concepts to diverse mission domains. As PI, Hansen will lead design direction, ensure delivery of annotated mockups, and serve as the primary point of contact with the government. Hansen’s defense interface work has been profiled in Popular Mechanics (“How Hollywood Tech is Transforming the Military Vehicles of Tomorrow,” July/Aug 2023), which highlighted a collaborative Army effort demonstrating that spatially displayed information could improve human performance metrics by up to 60 percent.
Education:
Jayse Hansen’s background reflects a lifelong pursuit of innovation beyond traditional systems. After graduating high school two years early, he flipped the script by joining the Los Angeles Unified School District as an instructor, introducing creative and technology-driven methods that reshaped how students and teachers engaged with learning. Since then, he has authored books and articles on design and human-machine interaction, developed online and in-person curricula, and become a sought-after international speaker. His unconventional path embodies the ØffGrid ethos: challenging norms to advance how humans think, learn, and design for the future.
Relevant Awards or Patents:
Featured in Fast Company’s “Most Innovative Design” coverage for film UI contributions; Recognized in Wired and Gizmodo for innovative HUD and holographic interface design; Invited speaker at SIGGRAPH, Augmented World Expo, and SXSW panels on futuristic interface design.
Relevant Publications or Media Features: Interviews and feature articles in Popular Mechanics, Fast Company, Wired, and Variety on cinematic UI design and its real-world influence; Portfolio and case studies showcased on jayse.io, highlighting cross-domain interface design expertise.

Ty Conner
Lead Developer
Mr. Conner is a former U.S. Army Infantryman and Aviation Maintenance Technician with over six years of operational and technical experience, including deployments in three countries and leadership as a Stryker Vehicle Commander. He served as a Battalion Javelin Missile Trainer/Instructor and was awarded the Expert Infantryman Badge, Aviation Badge, and Master Drivers Badge. He also earned two Sikorsky Rescue Medals for lifesaving MEDEVAC missions.
Since transitioning from military service, Conner has become a lead Unreal Engine/Swift developer and project lead at the University of Alabama in Huntsville. He has designed and implemented advanced simulations integrating head-mounted displays, multimodal interaction (hand tracking, gestures, HUD overlays), and vehicle/weapon systems for Army research and experimentation. His dual background as a Soldier and developer provides unique operational insight into interface requirements and ensures that UI designs are both technically feasible and tactically relevant.
Education: 
B.S., Game Development, Full Sail University (in progress); Nationally Registered EMT; multiple Army technical schools and certifications (Stryker Leaders Course, Javelin Missile Trainer, Aviation Maintenance).
Relevant Awards: 
Twice awarded Sikorsky Rescue Medal for lifesaving MEDEVAC missions; Expert Infantryman Badge, Aviation Badge, Master Drivers Badge.
Relevant Publications / Demonstrations: 
Designing an Interactive Mixed Reality Cockpit for Enhanced Soldier-Vehicle Interaction (GVSETS 2023); Spatial UI demonstration presented at SXSW 2024 showcasing advanced multimodal interaction and defense-relevant applications.

Jacob Nix
Software Engineer
Mr. Nix is a software engineer specializing in immersive simulation, mixed reality, and interactive system design for defense and research applications. Since 2020, he has served as a Software Engineer at the University of Alabama in Huntsville supporting the U.S. Army Research Laboratory. In this role, he has developed next-generation virtual cockpits and simulation environments for ground combat vehicles, integrating multimodal interactions such as eye gaze, gesture recognition, and voice control to enhance Soldier–system interfaces.
Mr. Nix has led the design and implementation of mixed and virtual reality dashboards for operational visualization, contributing to multiple demonstrations and field tests. His strong problem-solving ability and innovative approach have made him a key asset to the development team, consistently driving forward the technical and experiential goals of each project.
Education:
B.S., Computer Science, University of Alabama in Huntsville
Relevant Publications / Demonstrations:
Designing an Interactive Mixed Reality Cockpit for Enhanced Soldier-Vehicle Interaction (GVSETS 2023); Spatial UI demonstration presented at SXSW 2024 showcasing advanced multimodal interaction and defense-relevant applications.

(TEAM EXPERIENCE)
2026
OPSVision
Army Research Labs
Data-rich command-and-control (C2) centers of the future need to be mobile, decentralized, and more efficient than ever. We combined AI, spatial computing, and hyper-efficient UI design to prototype the command center of the future.
2023
F-18: Holographic Cockpit Design & Prototype
Army Research Labs
The world's most battle-tested fighter jet needed an upgrade. We designed and prototyped a holographic cockpit for use by pilots in (and out of) the plane.
2016
Special Forces HUD, SEAL Paratrooper / Ground Multi-Mode
Naval Special Warfare, US Special Operations Command (SOCOM)
If you're a SEAL, you might find yourself jumping out of a plane, trying to find the others who jumped with you, landing, navigating underwater and above ground, and kicking in the door while trying to locate the target you're there to rescue. All of this in the pitch black of night. This is why we need HUDs.
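A minimal sketch of the multi-mode idea just described: a HUD that swaps its symbology set as the operator moves from freefall to water to ground. The phase names, layer lists, and trigger thresholds below are assumptions for illustration, not SOCOM specifications.

```swift
import Foundation

// Illustrative multi-mode HUD: the active mission phase selects which
// symbology layers the display prioritizes.
enum MissionPhase {
    case freefall, canopy, underwater, ground

    var hudLayers: [String] {
        switch self {
        case .freefall:   return ["altitude", "jumper beacons", "drop zone marker"]
        case .canopy:     return ["altitude", "wind", "landing point"]
        case .underwater: return ["depth", "heading", "distance to shore"]
        case .ground:     return ["waypoints", "teammate positions", "target marker"]
        }
    }
}

struct HUDModeController {
    private(set) var phase: MissionPhase = .freefall

    // Simple sensor-driven transitions; a real system would debounce and fuse sensors.
    mutating func update(altitudeMeters: Double, submerged: Bool) {
        switch (submerged, altitudeMeters) {
        case (true, _):           phase = .underwater
        case (false, ..<0.5):     phase = .ground
        case (false, 0.5..<300):  phase = .canopy
        default:                  phase = .freefall
        }
    }
}

var hud = HUDModeController()
hud.update(altitudeMeters: 0, submerged: true)
print(hud.phase.hudLayers)   // -> ["depth", "heading", "distance to shore"]
```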
2015
Cognitive Tools for Target Detection – U.S.-India Defense Technology & Trade Initiative (DTTI)
Army Research Labs
A multi-year, multi-million-dollar international collaboration between the U.S. Department of Defense (DoD) and India’s Ministry of Defence (MoD) under the Defense Technology & Trade Initiative (DTTI). This project aimed to develop cognitive augmentation tools to enhance human target detection and intelligence analysis by integrating neuroscience, computer vision, and human-computer interaction (HCI) techniques.
2013
CIA/DARPA Visual Media Reasoning (VMR) - Zoomable UI for Intelligence Analysis
U.S. Army Research, Development and Engineering Command (RDECOM)
Bad guys generate lots of data. An analyst needs to tie tens of thousands of elements together to reveal a story. In partnership with DARPA, we created a 'use-anywhere-anytime' spatial UI where agents could sift massive quantities of text documents, photos, videos, and audio to track down the enemy with ease.
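The core trick behind a zoomable UI like this is semantic zoom: the same item renders at different levels of detail depending on scale. A hedged Swift sketch, with thresholds and types invented for illustration (nothing here is drawn from the VMR codebase):

```swift
import Foundation

// One evidence item in the analyst's workspace.
struct EvidenceItem {
    let title: String
    let summary: String
    let fullText: String
}

// Choose what to draw for an item given the current zoom scale.
func representation(for item: EvidenceItem, zoomScale: Double) -> String {
    switch zoomScale {
    case ..<0.25:    return "•"               // dot on the overview map
    case 0.25..<1.0: return item.title        // readable label
    case 1.0..<3.0:  return item.summary      // card with a summary
    default:         return item.fullText     // fully opened document
    }
}

let item = EvidenceItem(title: "Intercept 0417",
                        summary: "Call places subject near the border crossing.",
                        fullText: "Transcript: ...")
print(representation(for: item, zoomScale: 0.5))   // -> Intercept 0417
```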
2012
Next-Generation Adaptive Interfaces for UAV Operators
Army Research Labs
We designed adaptive interfaces for UAV operators that reduce cognitive overload and enhance decision-making. Leveraging insights from cognitive science and human factors, Jeff developed user-centered design solutions tailored to individual differences in information processing.
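To make the adaptive idea concrete, a minimal sketch: throttle how much the interface shows based on an estimate of operator workload. The workload proxy and thresholds below are placeholders for illustration; the actual project used cognitive-science-derived measures, not this arithmetic.

```swift
import Foundation

// Crude stand-in for an operator workload estimate.
struct OperatorState {
    var activeAlerts: Int
    var vehiclesSupervised: Int
}

enum DisplayDensity { case full, reduced, critical }

// Map estimated workload to how much the interface displays.
func density(for state: OperatorState) -> DisplayDensity {
    let load = state.activeAlerts + 2 * state.vehiclesSupervised
    switch load {
    case ..<4:  return .full       // show everything; operator has capacity
    case 4..<8: return .reduced    // collapse secondary panels
    default:    return .critical   // only mission-critical cues remain
    }
}

print(density(for: OperatorState(activeAlerts: 1, vehiclesSupervised: 4)))  // -> critical
```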
2014
Next Generation Air Dominance (NGAD) / Next-Gen Fighter Jet Cockpit Volumetric / Holographic UI
Northrop Grumman Systems
For the next fighter jet to rival the most advanced fighter ever built, the F-35, you have to think very differently. The F-35 effectively got rid of buttons in the cockpit, swapping them for touch screens. For what would become known today as the F-47, Jayse pitched getting rid of the screens as well, replacing them with tap areas and a cockpit driven entirely by the helmet's AR system. This opened up a world of possibilities, including not needing to be physically 'in' the cockpit, and not even needing glass on the canopy (a concept called 'indirect vision').
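A sketch of the screenless-cockpit idea under assumed names: physical tap areas carry no display of their own, and the helmet AR system decides what each one means at any given moment, so the same pad can be rebound as the mission phase changes. Everything here is an illustrative assumption about such a design, not Northrop Grumman or program code.

```swift
import Foundation

// A blank physical control surface, e.g. a conductive pad on the console.
struct TapArea: Hashable {
    let id: String
}

struct ARCockpit {
    // The helmet re-binds pads to functions as the mission phase changes.
    var bindings: [TapArea: String] = [:]

    mutating func configureForCombat() {
        bindings[TapArea(id: "left-pad")] = "cycle weapons page"
        bindings[TapArea(id: "right-pad")] = "lock nearest track"
    }

    func handleTap(on pad: TapArea) -> String {
        bindings[pad] ?? "no function bound"
    }
}

var cockpit = ARCockpit()
cockpit.configureForCombat()
print(cockpit.handleTap(on: TapArea(id: "right-pad")))   // -> lock nearest track
```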

