Future of Work


Secure Smart Machining

Sponsor: Stanford University
Award Number: 1931750
Philip Levis [email protected] (Principal Investigator)
Dawson Engler (Co-Principal Investigator)
David Mazieres (Co-Principal Investigator)

ABSTRACT

Machining is software. Gcode, the programming language for machining tools such as milling machines, lathes, and plasma cutters, was developed in the late 1950s and remains the dominant language today. In the past 60 years, programming languages and software have changed and advanced tremendously, but Gcode remains mostly unchanged. This is true for legacy systems as well as new ones, such as 3D printers. Machining pioneered cyber-physical systems but, from a computing perspective, remains half a century in the past. Enabling machine tools as modern, networked programming systems has the potential to revolutionize the $40B machining industry.
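For readers unfamiliar with it, a Gcode program is simply a sequence of terse motion commands. The short Python sketch below is illustrative only and is not taken from the award; it emits standard Gcode moves (G0 rapid positioning, G1 linear feed) that trace a rectangular outline.

def rectangle_gcode(width, height, feed_rate=200):
    # Emit Gcode lines that trace a width x height rectangle from the origin.
    return [
        "G21        ; units in millimeters",
        "G90        ; absolute positioning",
        "G0 X0 Y0   ; rapid move to the start point",
        f"G1 X{width} Y0 F{feed_rate}  ; feed along the bottom edge",
        f"G1 X{width} Y{height}        ; feed up the right edge",
        f"G1 X0 Y{height}              ; feed along the top edge",
        "G1 X0 Y0   ; feed back to the start",
    ]

print("\n".join(rectangle_gcode(40, 25)))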

This research project will demonstrate new techniques that safely and securely improve machining automation, using new embedded operating systems, program analysis, secure code distribution, and user tools. The research relies on three important principles: discretization, programmable safety, and end-to-end integrity with auditing. The first principle, discretization, is a method of program representation that greatly simplifies correctness checks and verifying invariants. Rather than rely on implicit curves and geometry to define physical shapes, programs use an explicit, discretized representation defined by the desired machining precision. The second principle, programmable safety, allows quickly changing software to have the same physical safety as traditional machining systems, by using high-assurance software that operates correctly even if the entire system crashes. Finally, end-to-end integrity and auditing allow operators to verify code before running it and allow the system to prove that programs executed correctly.
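To make the discretization principle concrete, here is a minimal sketch, assuming a simple chord-error criterion, of how an implicit curve can be turned into an explicit, precision-bounded representation. The function name, parameters, and tolerance formula are illustrative assumptions, not details from the award.

import math

def discretize_arc(radius, start_angle, end_angle, tolerance):
    # Approximate a circular arc with explicit line segments whose maximum
    # deviation (chord error) from the true curve stays within `tolerance`.
    # A segment spanning angle `step` has error radius * (1 - cos(step / 2)),
    # so solve for the largest step that keeps the error below the precision.
    max_step = 2.0 * math.acos(max(1.0 - tolerance / radius, -1.0))
    sweep = end_angle - start_angle
    n_segments = max(1, math.ceil(abs(sweep) / max_step))
    return [
        (radius * math.cos(start_angle + sweep * i / n_segments),
         radius * math.sin(start_angle + sweep * i / n_segments))
        for i in range(n_segments + 1)
    ]

# Example: a quarter-circle of radius 25 mm discretized to 0.01 mm precision.
points = discretize_arc(25.0, 0.0, math.pi / 2, 0.01)

A checker can then verify invariants (for example, that every point stays within the stock and clear of fixtures) directly on the explicit list of points, which is far simpler than reasoning about the implicit curve.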

This award reflects NSF’s statutory mission and has been deemed worthy of support through evaluation using the Foundation’s intellectual merit and broader impacts review criteria.



Collaborative Research: Enhancing Human Capabilities through Virtual Personal Embodied Assistants in Self-Contained Eyeglasses-Based Augmented Reality (AR) Systems

Sponsor: University of North Carolina at Chapel Hill
Award Number: 1840131
Henry Fuchs [email protected] (Principal Investigator)
Jan-Michael Frahm (Co-Principal Investigator)
Mohit Bansal (Co-Principal Investigator)
Felicia Williams (Co-Principal Investigator)
Prudence Plummer (Co-Principal Investigator)

ABSTRACT

The Future of Work at the Human-Technology Frontier (FW-HTF) is one of 10 new Big Ideas for Future Investment announced by NSF. The FW-HTF cross-directorate program aims to respond to the challenges and opportunities of the changing landscape of jobs and work by supporting convergent research. This award fulfills part of that aim.

This award supports basic research underpinning development of an eyeglass-based 3D mobile telepresence system with integrated virtual personal assistant. This technology will increase worker productivity and improve skills. The system automatically adjusts visual focus and places virtual elements in the image without eye strain. The user will be able to communicate with the system by speech. The system also uses sensors to keep track of the user’s surroundings and provide the relevant information to the user automatically. The project will explore two of the many possible uses of the system: amplifying a worker’s capabilities (such as a physical therapist interacting with a remote patient), and accelerating post-injury return to work through telepresence (such as a burn victim reintegrating into his/her workplace). The project will advance the national interest by allowing the right person to be virtually in the right place at the right time. The project also includes an education and outreach component wherein undergraduate and graduate students shall receive training in engineering and research methods. Course curricula at Stanford University and the University of North Carolina at Chapel Hill shall be updated to include project-related content and examples.

This project comprises fundamental research activities needed to develop an embodied Intelligent Cognitive Assistant (GLASS-X) that will amplify the capabilities of workers in a way that will increase productivity and improve quality of life. GLASS-X is conceived of as an eyeglass-based 3D mobile telepresence system with integrated virtual personal assistant. Methods include: body and environment reconstruction (situation awareness) from a fusion of images provided by an eyeglass frame-based camera array and limb motion data provided by inertial measurement units; fundamental research on adaptive focus displays capable of reducing eye strain when using augmented reality displays; dialog-based communication with a virtual personal assistant, including transformations from visual input to dialog and vice versa; and human subject evaluations of GLASS-X technology in two workplace domains (remote interactions between a physical therapist and his/her patient; burn survivor remote return-to-work). This research promises to push the state of the art in core areas including: computer vision; augmented reality; accommodating displays; and natural language and dialogue models.
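As a rough illustration of the sensor-fusion step, the sketch below, which is a simplification and not the project’s actual reconstruction pipeline, blends a gyroscope-integrated orientation with an occasional drift-free camera-based estimate using a complementary filter. All names and the blending weight are assumptions made for illustration.

def fuse_orientation(prev_angle, gyro_rate, dt, camera_angle=None, alpha=0.98):
    # Dead-reckon orientation from the IMU gyroscope rate, then correct it
    # with an absolute camera-derived estimate whenever one is available.
    predicted = prev_angle + gyro_rate * dt
    if camera_angle is None:
        return predicted  # no visual fix this frame
    # Trust the smooth, high-rate gyro in the short term and the drift-free
    # (but slower, noisier) camera estimate in the long term.
    return alpha * predicted + (1.0 - alpha) * camera_angle

# Example: 100 Hz IMU updates with a camera fix arriving every 10th frame.
angle = 0.0
for frame in range(100):
    fix = 0.05 if frame % 10 == 0 else None
    angle = fuse_orientation(angle, gyro_rate=0.01, dt=0.01, camera_angle=fix)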

This award reflects NSF’s statutory mission and has been deemed worthy of support through evaluation using the Foundation’s intellectual merit and broader impacts review criteria.
