Articles Tagged with “Future of Work”

I-Corps: Virtual Reality Biofeedback Education Technologies

Award Number: 1938166
Sponsor: Cornell University
Michael Timmons [email protected] (Principal Investigator)

The broader impact/commercial potential of this I-Corps project is to provide a new means of delivering educational content in virtual reality (VR) with consistent impact, independent of demographics and environment. This technology offers a higher level of personalized learning that can be administered without accompanying professional development for teachers, because lessons are delivered in a highly controlled virtual environment. Removing this dependence on additional training facilitates adoption across schools with limited resources. The increased efficacy in learning and retention of material through VR has made it a technology of interest for teachers and school administrators. This technology can be used across a range of educational subjects and, with its immersive, personalized learning experience, offers a high-quality learning solution for educators.

This I-Corps project uses biofeedback to personalize learning in virtual reality (VR). The technology manipulates a user’s environment based on their physiological reactions. This can include the surrounding physical objects, light source, or landscape. Auditory manipulation is a more indirect change, in which the audio path of a lesson is adapted to the emotional reaction of a user, such as when someone is distracted or agitated. This emotional reaction is inferred by tracking numerous physiological inputs over time. The identification process is based on existing research using the same physiological measures to classify users’ emotional states, and will be improved through this project. This level of adaptation enables a higher degree of VR program customization and a more meaningful learning experience while using the technology.
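The adaptation loop described above can be sketched in a few lines. This is a minimal, hypothetical illustration: the signal names, thresholds, state labels, and environment actions are assumptions for clarity, not the project's actual classifier, which the abstract says builds on existing emotion-classification research and tracks signals over time.

```python
# Hypothetical sketch of a biofeedback-driven VR adaptation loop.
# Signals, thresholds, and actions are illustrative only.

def classify_state(heart_rate_bpm, skin_conductance_uS):
    """Map two physiological readings to a coarse emotional state."""
    if heart_rate_bpm > 100 and skin_conductance_uS > 8.0:
        return "agitated"
    if heart_rate_bpm < 70 and skin_conductance_uS < 4.0:
        return "calm"
    return "neutral"

def adapt_environment(state):
    """Choose an environment/audio adjustment for the detected state."""
    actions = {
        "agitated": {"lighting": "dim", "audio_path": "slower-paced lesson"},
        "calm":     {"lighting": "normal", "audio_path": "standard lesson"},
        "neutral":  {"lighting": "normal", "audio_path": "standard lesson"},
    }
    return actions[state]

# One tick of the loop: read sensors, classify, adjust the VR scene.
readings = [(110, 9.1), (65, 3.2), (85, 5.0)]
states = [classify_state(hr, sc) for hr, sc in readings]
```

A production system would replace the fixed thresholds with a model trained on time series of physiological measures, as the abstract indicates.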

This award reflects NSF’s statutory mission and has been deemed worthy of support through evaluation using the Foundation’s intellectual merit and broader impacts review criteria.


The Agave Platform: An Open Science-As-A-Service Cloud Platform for Reproducible Science

Sponsor: Chapman University

Rion Dooley [email protected] (Principal Investigator)


In today’s data-driven research environment, the ability to easily and reliably access compute, storage, and derived data sources is as much a necessity as the algorithms used to make the actual discoveries. The earth is not shrinking, it is digitizing, and the ability of US researchers to stay competitive in the global research community will increasingly be determined by their ability to reduce the time from theory to discovery. Over the last 5 years, the open source commercial sector has greatly outpaced the academic research world in its growth and adoption of programming languages, infrastructure design, and interface development. Problems that were primarily academic in nature several years ago are now common in the commercial world. Terms like big data, business intelligence, remote visualization, and streaming event processing have moved from the classroom to the board room. However, academic projects are largely unable to take advantage of many of today’s most popular and widely used open source technologies within the context of their campus and shared research infrastructure. The recently completed, NSF-funded Science Gateway Institute planning project revealed just how far behind many communities are. In a survey of over 26,000 NSF-funded PIs, science gateway developers, and leaders in higher education (i.e., CIOs, CTOs, and others), over 85% of respondents said they needed help adapting existing technologies to realize the needs of their gateway. Another 80% said they needed help simply understanding what technologies were available to them. The research community doesn’t just see the gap; it lives it. This project seeks to quickly close the capability gap between academic and commercial infrastructure by extending and making robust the Agave Platform, an open, Science-as-a-Service cloud platform for reproducible science.
Essentially, this project will allow scientists to focus their energies on their science rather than on the computing technologies they use.

The Agave Platform will build upon the success of the existing Agave Developer APIs, which currently serve over 20,000 users in the plant biology community. This project includes three well-defined efforts that will synergistically evolve the current technology into a sustainable Science-as-a-Service platform for the national research community. First, it will extend the Agave Developer APIs with additional services and management interfaces to create a cohesive, self-provisioning Agave Platform that will deliver Science-as-a-Service to the developer community. Second, the project team will partner with commercial and academic institutions to create a community-driven Application Exchange (AX) based on Docker container technology to facilitate application transparency, portability, attribution, and reproducibility. Third, the project will consolidate existing open source contributions from projects already within the Agave ecosystem into Agave ToGo, a collection of reference science gateways in multiple languages and web frameworks. The Agave Platform will democratize access to software and infrastructure across all areas of science and engineering by modernizing the mechanisms with which the research community can utilize and access academic research infrastructure. This will bridge the gap between industrial and academic research infrastructure and allow researchers to use a new generation of open source software and technologies. The AX will enable greater interoperability and accountability in the way computational science results are published and reviewed. Through the matching investment of industrial partners, reproducibility, best practices, and rigorous scientific review will be brought to the mainstream and promoted as a fundamental aspect of the scientific process in an open, sustainable way. Agave ToGo will make custom gateways readily available to end users and developers alike.
For end users, it will empower them to focus on domain science rather than computer science. For developers, it will stimulate innovation and increase the opportunity for discovery. When combined with the Agave Platform and Application Exchange, Agave ToGo will enable novice users to create scalable, reproducible, digital labs that span their office, commercial cloud, and national data centers in a matter of minutes.
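The container-based Application Exchange idea can be illustrated with a minimal app manifest of the kind a Docker-backed catalog might store: the pinned image captures the exact runtime for reproducibility, while attribution fields support citation and review. The field names below are hypothetical, chosen for illustration; they are not the actual Agave app schema.

```python
import json

# Hypothetical manifest for a containerized app in an Application
# Exchange. Pinning the Docker image tag fixes the runtime environment;
# author/version fields support attribution and review.
# Field names are illustrative, not the real Agave schema.
manifest = {
    "name": "rnaseq-align",
    "version": "1.2.0",
    "container_image": "example.org/lab/rnaseq-align:1.2.0",
    "authors": ["J. Researcher"],
    "inputs": [{"id": "reads", "type": "fastq"}],
    "outputs": [{"id": "alignments", "type": "bam"}],
}

def validate(m):
    """Return the fields a catalog would still need to run and attribute the app."""
    required = {"name", "version", "container_image", "authors"}
    return sorted(required - m.keys())

serialized = json.dumps(manifest, indent=2)
```

Publishing such a manifest alongside the container is one common way exchanges make computational results portable and reviewable.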


Closing Gaps: Connecting Assessment and Culture to Increase Achievement

Sponsor: WestEd
Sharon Nelson-Barber [email protected] (Principal Investigator)
Matt Silberglitt (Co-Principal Investigator)
Jonathan Boxerman (Co-Principal Investigator)

This project will advance efforts of the Innovative Technology Experiences for Students and Teachers (ITEST) program to better understand and promote practices that increase students’ motivations and capacities to pursue careers in fields of science, technology, engineering, or mathematics (STEM) by investigating ways to make science assessment and science instruction more culturally relevant to Native Hawaiians. Closing Gaps: Connecting Assessment and Culture to Increase Achievement, a three-year design and development study, opens new doors for understanding how technology can enhance teaching and learning. The project focuses on ways in which technology-rich learning environments can improve instruction and assessment practices for diverse indigenous students. It combines two innovative learning technologies — SimScientists and FieldScope — that support STEM teaching and learning through the practices of science. Teachers in the Na Lei Na’auao Native Hawaiian Charter School Alliance use these innovative technologies in their classrooms and on ecosystems-themed field trips. Project researchers will study how features of each technology can foster learning and enhance assessment. The project addresses a persistent limitation of STEM learning: students’ lack of access to connected and familiar experiences that can help build foundational knowledge. Although new technologies to support STEM learning are available each year, many deliver inaccessible information because the context of the information is unfamiliar and does not relate to children’s own experiences and intuitive knowledge. This promotes fragile understandings rather than the kinds of knowledge valued by NGSS and in work environments. This project explores how to design educational learning tools that can be adapted to a local context yet be standardized enough to align with state and national guidelines. Findings may prove critical in improving test development practices for diverse populations. 
Testing in diverse indigenous communities is underexplored; little is known about how assessments can be adapted to serve the dual role of assessing content and practice standards while attending to specifics of the local context. This project intends to enhance the educational advancement of all students in STEM areas.

This project intends to advance the field of educational technology to maximize benefits of cultural and contextual diversity in technology-rich learning environments. It addresses four research questions: (1) Can features of two learning technologies be customized to be both contextually relevant and aligned with standardized learning goals?; (2) Can technology-rich learning environments be used to make salient connections between instruction and the culture in which learning is situated?; (3) Can assessment embedded into technology-rich learning environments be responsive to ways of knowing and demonstrating understanding unique to an indigenous culture?; and (4) Can assessment embedded in technology-rich learning environments support inferences about student understanding of the practices, core ideas, and crosscutting concepts of science with appropriate and sufficient evidence? In Year 1, the project will conduct initial feasibility studies with students and teachers to inform revisions to existing SimScientists modules and reflection activities. In Year 2, the project will revise existing modules to enhance their cultural relevance and then conduct small-scale usability and feasibility testing with the revised modules. In Year 3, the customized modules will be piloted with 12 teachers. 
Data collection and analysis strategies include: (a) design charrettes; (b) focus groups and usability testing; (c) cognitive labs for cultural relevance, construct validity, and innovation impacts; (d) pre/posttests of American Association for the Advancement of Science (AAAS) items; (e) benchmark assessment data; (f) teacher surveys; (g) case studies; (h) classroom and field trip participant observations; (i) differential item functioning; (j) analysis of covariance; and (k) analysis of variance on posttest scores (outcome variable) to compare the means across student groups (by intervention mode) and their prior science achievement levels to measure the technical quality of the assessments. Project success means students will make personal connections between the knowledge they gain throughout the course of their lives and the knowledge that is important in STEM fields, offering additional ways to see the value and possibilities of STEM careers.
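The analysis-of-covariance step amounts to regressing posttest scores on an intervention indicator while adjusting for prior achievement. A stripped-down sketch with invented data, solving the ordinary least squares normal equations by hand:

```python
# Minimal ANCOVA sketch: posttest ~ intercept + intervention + pretest.
# Data are invented; the study compares group means adjusted for
# prior science achievement.

def solve3(A, b):
    """Solve a 3x3 linear system by Gauss-Jordan elimination."""
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for col in range(3):
        pivot = max(range(col, 3), key=lambda r: abs(M[r][col]))
        M[col], M[pivot] = M[pivot], M[col]
        for r in range(3):
            if r != col:
                f = M[r][col] / M[col][col]
                M[r] = [a - f * c for a, c in zip(M[r], M[col])]
    return [M[i][3] / M[i][i] for i in range(3)]

def ancova(pretest, posttest, treated):
    """OLS fit of posttest on [1, treated, pretest] via normal equations."""
    X = [[1.0, float(t), float(p)] for t, p in zip(treated, pretest)]
    XtX = [[sum(x[i] * x[j] for x in X) for j in range(3)] for i in range(3)]
    Xty = [sum(x[i] * y for x, y in zip(X, posttest)) for i in range(3)]
    return solve3(XtX, Xty)  # [intercept, adjusted treatment effect, pretest slope]

pre  = [40, 50, 60, 45, 55, 65]
trt  = [0, 0, 0, 1, 1, 1]
post = [2 + 0.5 * p + 3 * t for p, t in zip(pre, trt)]  # built-in effect of 3
b0, effect, slope = ancova(pre, post, trt)
```

The real study would add error terms, differential item functioning checks, and significance tests; this only shows the adjusted-group-comparison idea.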


Information Technology Skill Standards, 2020 and Beyond

Sponsor: Collin County Community College
Ann Beheler [email protected] (Principal Investigator)

In 2003, the ATE National Workforce Center for Emerging Technologies (NWCET) developed and published “Building a Foundation for Tomorrow: Skill Standards for Information Technology”. This document has not been updated since 2003, and it no longer aligns with current information technology (IT) industry needs. This project will develop a new employer-led and employer-verified IT Skill Standards document for the top eight to ten IT industry job clusters, supporting positions that require a two-year or a four-year applied IT degree. Initial discussions with industry experts have identified at least four new job clusters that did not exist in 2003, along with an additional five to six job clusters that need to be updated. Skill standards make IT careers more accessible because they provide transparency regarding the knowledge, skills, and performance standards needed for success in the industry, and educators use skill standards to develop relevant curricular materials that better prepare students for the workplace.

This project will develop a future-facing set of IT Skill Standards for the most critical IT job clusters, led by employer subject matter experts. In addition to creating a set of skill standards for each job cluster, the standards within each job cluster will be stratified by the top four to eight critical work functions. The project will also compile a list of the certifications valued by employers as of the date of publication. For each cluster, a national group of educators and Business and Industry Leadership Team (BILT) members will determine which portions of the standards apply to two-year and four-year programs, to facilitate ease of use in developing employer-aligned curricula. These stratifications will help both employers and educators apply the standards more easily. Subject matter experts will be asked to predict trends in IT and the knowledge and skills that will likely be needed to support emerging trends. It is expected that this effort will increase the use and longevity of the developed standards. Dissemination efforts will use ATE Central and the Convergence Technology Center website, will feature multiple strategies to increase awareness of the skill standards, and will provide tools for their use and application.


Engineering for US All – E4USA: A National Pilot High School Engineering Course and Database

Sponsor: University of Maryland College Park
Award Number: 1849430
Darryll Pines [email protected] (Principal Investigator)
Jumoke Ladeji-Osias (Co-Principal Investigator)
Stacy Klein-Gardner (Co-Principal Investigator)
Kenneth Reid (Co-Principal Investigator)
Adam Carberry (Co-Principal Investigator)
Leigh Abts (Former Co-Principal Investigator)
Ann McKenna (Former Co-Principal Investigator)


The College Board currently serves 7 million students, 23,000 high schools, and 3,600 colleges through the AP and SAT annually. However, no standardized educational program exists for pre-college students to earn widely accepted, transferable engineering course credits. In addition, there are no nationally offered professional development programs to train and certify highly qualified secondary teachers to support an undergraduate-level engineering course at the pre-college level. An Engineering for US All (E4USA) one-year high school course has the potential to ‘democratize’ the learning and practice of engineering by engaging high school students and their teachers to think like engineers and to practice engineering principles and design. E4USA would be equivalent to placement credit for an introductory college course: 1) a core engineering course; 2) an elective; or 3) a substitute for a required course in the general education sequence. The impact might well go beyond an E4USA credit: by credentialing a broad range of STEM-trained teachers to instruct and assess engineering principles and design-based experiences, the course could become a cornerstone supporting the introduction of engineering principles and design as outlined in the Next Generation Science Standards (NGSS). The E4USA framework will focus on three “big ideas”: 1) Engineering and Society; 2) Engineering Processes; and 3) Essential Engineering Content, Skills and Tools. Credit would be earned by students through two integrated pathways: 1) a standards-based curriculum covering the Principles of Engineering; and 2) the submission of a design project. The national pilot will be led by the University of Maryland College Park and include Arizona State University, Morgan State University, Vanderbilt University, Virginia Tech, a dissemination collaboration with NASA, and a sampling of some 70 high schools across the United States.

Engineering for US All (E4USA) would help to ‘demystify’ and ‘democratize’ engineering, and empower science, technology, engineering, and mathematics (STEM) teachers to gain the self-efficacy, self-confidence, and skills to teach and assess their students’ engineering-based competencies. No standardized programs currently exist at a national level to train and certify high school teachers to support a one-year high school course based on engineering principles and a design-based experience. The proposed national pilot would enable the standardization and centralization of data collection from across the United States, tracking STEM teachers’ and their students’ trajectories of learning engineering concepts through competency-based evaluations and design project submissions. A national data repository will be created and updated at the University of Maryland to track the training of teachers and their students. The research will explore whether: 1) a broader diversity of students will consider engineering as an academic and career option; 2) professional development (PD) can enable teachers to apply engineering concepts across STEM disciplines to train students to tackle and solve problems; and 3) the piloting of the PD models can be used to certify highly qualified teachers. Projected outcomes include: 1) a hybrid (in-person and online) PD model that prepares STEM teachers to gain the confidence, instructional skills, and assessment competencies to support E4USA; 2) guidelines for the use of Learning Management Systems and online analytical tools to collect data from a diverse sampling of teachers, students, institutions, and high schools; 3) E4USA course materials and resources; and 4) E4USA models that can be aligned to state and local high school graduation requirements.



FABRIC: Adaptive Programmable Research Infrastructure for Computer Science and Science Applications

Sponsor: University of North Carolina at Chapel Hill
Ilya Baldin [email protected] (Principal Investigator)
James Griffioen (Co-Principal Investigator)
Kuang-Ching Wang (Co-Principal Investigator)
Indermohan Monga (Co-Principal Investigator)
Anita Nikolich (Co-Principal Investigator)
Award Number: 1935966


FABRIC is a unique national research infrastructure to enable cutting-edge, exploratory research at scale in computer networking, distributed computing systems, and applications. It is a platform on which researchers will experiment with new ideas that will become building blocks of the next-generation Internet and address requirements for emerging science applications that depend on large-scale networking. FABRIC will create opportunities to explore innovative solutions not previously possible for a large variety of high-end science applications. FABRIC will provide a platform on which to educate and train the next generation of researchers on future advanced distributed systems designs. It will engage with students and educators from under-represented communities to create a diverse cohort of developers and experimenters. FABRIC members will participate in community building and will engage in outreach and tech transfer to industry affiliates. The FABRIC team is led by researchers from the University of North Carolina at Chapel Hill, the University of Kentucky, Clemson University, the Illinois Institute of Technology, and the Department of Energy’s ESnet (Energy Sciences Network). The team also includes researchers from many other universities, who will help test the design of the facility and integrate their computing facilities, testbeds, and instruments into FABRIC.

The main focus of the project is to create a nationwide high-speed (100-1000 gigabits per second) network interconnecting major research centers and national computing facilities that will allow researchers and scientists at these facilities to develop and experiment with new distributed application, compute, and network architectures not possible today. Uniquely, FABRIC nodes can store and process information “in the network” in ways not possible in the current Internet, which will lead to completely new networking protocols, architectures, and applications that address pressing problems with performance, security, and adaptability in the Internet. Reaching deep into university campuses, FABRIC will connect university researchers and their local compute clusters and scientific instruments to the larger FABRIC infrastructure. The infrastructure will also provide access to public clouds, such as Amazon Web Services, Google Cloud Platform, and Microsoft Azure. This experimental facility will allow multiple experiments to be conducted simultaneously and is capable of incorporating real traffic and real users into experiments.

This project is supported by the Foundation-wide Mid-scale Research Infrastructure program. The project will be managed by the Division of Computer & Network Systems (CNS) within the Directorate for Computer & Information Science & Engineering (CISE).

This award reflects NSF’s statutory mission and has been deemed worthy of support through evaluation using the Foundation’s intellectual merit and broader impacts review criteria.


EI: Virtual Data Collaboratory: A Regional Cyberinfrastructure for Collaborative Data Intensive Science

Sponsor: Rutgers University New Brunswick
Ivan Rodero [email protected] (Principal Investigator)
Manish Parashar (Former Principal Investigator)
Vasant Honavar (Co-Principal Investigator)
Jenni Evans (Co-Principal Investigator)
Grace Agnew (Co-Principal Investigator)
James von Oehsen (Co-Principal Investigator)
Award Number: 1640834


This project develops a virtual data collaboratory that can be accessed by researchers, educators, and entrepreneurs across institutional and geographic boundaries, fostering community engagement and accelerating interdisciplinary research. A federated data system is created, using existing components and building upon existing cyberinfrastructure and resources in New Jersey and Pennsylvania. Seven universities are directly involved (the three Rutgers University campuses, Pennsylvania State University, the University of Pennsylvania, the University of Pittsburgh, Drexel University, Temple University, and the City University of New York); indirectly, other regional schools served by the New Jersey and Pennsylvania high-speed networks also participate. The system has applicability to several science and engineering domains, such as protein-DNA interaction and smart cities, and is likely to be extensible to other domains. The cyberinfrastructure is to be integrated into both graduate and undergraduate programs across several institutions.

The end product is a fully developed system for collaborative use by the research and education community. A data management and sharing system is constructed, based largely on commercial off-the-shelf technology. The storage system is based on the Hadoop Distributed File System (HDFS), a Java-based file system providing scalable and reliable data storage, designed to span large clusters of commodity servers. The Fedora and VIVO object-based storage systems are used, enabling linked data approaches. The system will be integrated with existing research data repositories, such as the Ocean Observatories Initiative and Protein Data Bank repositories. Regional high-performance computing and network infrastructure is leveraged, including New Jersey’s Regional Education and Research Network (NJEdge), Pennsylvania’s Keystone Initiative for Network Based Education and Research (KINBER), the Extreme Science and Engineering Discovery Environment (XSEDE) computing capabilities, Open Science Grid, and other NSF Campus Cyberinfrastructure investments. The project also develops a custom site federation and data services layer; the data services layer provides services for data linking, search, and sharing; coupling to computation, analytics, and visualization; mechanisms to attach unique Digital Object Identifiers (DOIs), archive data, and broadly publish to internal and wider audiences; and management of the long-term data lifecycle, ensuring immutable and authentic data and reproducible research.
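The requirement that archived data be immutable and authentic is commonly met by fingerprinting each object and binding the fingerprint to its identifier, so later readers can verify the bytes were never altered. A small sketch of that idea; the record layout is invented and the DOI shown is a placeholder, not a real identifier:

```python
import hashlib
import json

def make_record(doi, payload: bytes):
    """Bind a dataset's SHA-256 fingerprint to its identifier so future
    readers can verify the archived bytes were not altered."""
    return {
        "doi": doi,  # placeholder identifier for illustration
        "sha256": hashlib.sha256(payload).hexdigest(),
        "size_bytes": len(payload),
    }

def verify(record, payload: bytes):
    """Re-hash the payload and compare against the published record."""
    return hashlib.sha256(payload).hexdigest() == record["sha256"]

data = b"station,temp_c\nA,21.4\nB,19.8\n"
record = make_record("10.0000/vdc.example.1", data)
published = json.dumps(record)  # what a data services layer might expose
```

Any modification to the archived bytes changes the hash, so `verify` fails on tampered data, which is the property that underwrites reproducible downstream analyses.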


Collaborative Research: Development of Language-Focused Three-Dimensional Science Instructional Materials to Support English Language Learners in Fifth Grade

Sponsor: Stanford University
Award Number: 1502507
Guadalupe Valdes [email protected] (Principal Investigator)


This project was submitted to the Discovery Research K-12 (DRK-12) program, which seeks to significantly enhance the learning and teaching of science, technology, engineering, and mathematics (STEM) by preK-12 students and teachers through research and development of innovative resources, models, and tools. Projects in the DRK-12 program build on fundamental research in STEM education and prior research and development efforts that provide theoretical and empirical justification for proposed projects. The project is responsive to the societal challenges emerging from the nation’s diverse and rapidly changing student demographics, including the rise of English language learners (ELLs), the fastest-growing student population (see, for example, “U.S. school enrollment hits majority-minority milestone”, Education Week, February 1, 2015). ELLs have grown exponentially: 1 in 5 students (21%) in the nation spoke a language other than English at home in 2011. The project’s main purpose is to develop instructional materials for a year-long, fifth grade curriculum for all students, including ELLs. The planned curriculum will promote language-focused and three-dimensional science learning (through blending of science and engineering practices, crosscutting concepts, and disciplinary core ideas), aligned with the Framework for K-12 Science Education (National Research Council, 2012), the Next Generation Science Standards (Achieve, 2013), and the Conceptual Framework for Language Use in the Science Classroom (Lee, Quinn & Valdés, 2013). The grade-level science content will target topics such as structure and properties of matter, matter and energy in organisms and ecosystems, and Earth’s systems and space systems, with engineering design embedded in each topic. The language approach will emphasize analytical science tasks aimed at making sense of and constructing scientific knowledge; and receptive (listening and reading) and productive (speaking and writing) language functions.
Products and research results from this project will help to reduce the science achievement gaps between ELLs and non-ELLs, and enable all students to attain higher levels of proficiency in subsequent grade levels.

After the curriculum has been developed and field-tested during Years 1-3, a pilot study will be conducted in Year 4 to investigate promise of effectiveness. Using a randomized controlled trial design, the pilot study will address three research questions: (1) What is the impact of the intervention on science learning and language development for all students, including ELLs and former ELLs?; (2) What is the impact of the intervention on teachers’ instructional practices?; and (3) To what extent are teachers able to implement the instructional materials with fidelity? To address research question 1, a sequence of multi-level models (MLMs) will be fitted in which the posttest score for each student measure (the state/district science test score, and the science score and the language score on the researcher-developed assessment) is regressed on a dummy variable representing condition (treatment or control) and pretest covariates. To examine whether the intervention is beneficial for students of varying levels of English proficiency, subgroup analyses will be conducted comparing ELLs in the treatment group against ELLs in the control group; former ELLs in the treatment group against former ELLs in the control group; and non-ELLs in the treatment group against non-ELLs in the control group, using the same MLMs. Exploratory analyses will be employed to examine the extent to which the level of English proficiency moderates the impact of the intervention on ELLs. To address research question 2, a 2-level model (teachers as level-1, and schools as level-2) will be fitted in which the post-questionnaire scale score is regressed on a dummy variable representing condition (treatment or control). To address research question 3, plans are to analyze ratings on coverage, adherence, and quality of instruction from classroom observations, along with ratings on program differentiation and participant responsiveness from the implementation and feedback form.


A Center for Brains, Minds and Machines: the Science and the Technology of Intelligence

Sponsor: Massachusetts Institute of Technology
Award Number: 1231216
Tomaso Poggio [email protected] (Principal Investigator)
Ellen Hildreth (Co-Principal Investigator)
Matthew Wilson (Co-Principal Investigator)
Gabriel Kreiman (Co-Principal Investigator)
Haym Hirsh (Former Co-Principal Investigator)
Lakshminarayana Mahadevan (Former Co-Principal Investigator)


Today’s AI technologies, such as Watson, Siri and MobilEye, are impressive yet still confined to a single domain or task. Imagine how truly intelligent systems — systems that actually understand their world — could change our world. The work of scientists and engineers could be amplified to help solve the world’s most pressing technical problems. Education, healthcare and manufacturing could be transformed. Mental health could be understood on a deeper level, leading in turn to more effective treatments of brain disorders. These accomplishments will take decades. The proposed Center for Brains, Minds, and Machines (CBMM) will enable the kind of research needed to ultimately achieve such ambitious goals. The vision of the Center is of a world where intelligence, and how it emerges from brain activity, is truly understood. A successful research plan for realizing this vision requires four main areas of inquiry and integrated work across all four guided by a unifying theoretical foundation. First, understanding intelligence requires discovering how it develops from the interplay of learning and innate structure. Second, understanding the physical machinery of intelligence requires analyzing brains across multiple levels of analysis, from neural circuits to large-scale brain architecture. Third, intelligence goes beyond the narrow expertise of chess or Jeopardy-playing computers, bridging several domains including vision, planning, action, social interactions, and language. Finally, intelligence emerges from interactions among individuals; it is the product of social interactions. Therefore, the research of the Center engages four major research thrusts (Reverse Engineering the Infant Mind, Neuronal Circuits Underlying Intelligence, Integrating Intelligence, and Social Intelligence) with interlocking teams and working groups, and a common theoretical, mathematical, and computational platform (Enabling Theory).

The intellectual merit of the Center is its focus on elucidating the mechanisms and architecture of intelligence in the most intelligent system known: the human brain. Success in this project will ultimately enable us to understand ourselves better, to produce smarter machines, and perhaps even to make ourselves smarter. The Center’s potential legacy of a deep understanding of intelligence, and the ability to engineer it, is tantalizing and timeless. It includes the creation of a community of researchers by programs such as an intensive summer school, technical workshops and online courses that will train the next generation of scientists and engineers in an emerging new field — the Science and Engineering of Intelligence. This new field will catalyze continuing progress in and cross-fertilization between computer science, math and statistics, robotics, neuroscience, and cognitive science. Sitting between science and engineering, it will attract growing interest from the best students at all levels. The broader impact of the Center program could be to revolutionize K-12 education, as well as pre-K and lifelong learning, with a deeper understanding of the process of learning. The ability to build more human-like intelligence in machines will transform our productivity, enabling robots to care for the aged, drive our cars, and help with small-business manufacturing. The Center team is composed of over 23 investigators, many having already made significant accomplishments in multiple research areas relevant to the science and the technology of intelligence. The Center team has a mix of junior and senior researchers, bringing expertise in Computer Science, Neuroscience, Cognitive Science and Mathematics. The institutional partners include eleven institutions (MIT, Harvard, Cornell, Rockefeller, UCLA, Stanford, The Allen Institute, Wellesley, Howard, Hunter and the University of Puerto Rico), three of which have significant underrepresented student populations.
The academic institutions are complemented by the Center’s industrial partners (Microsoft, IBM, Google, DeepMind, Orcam, MobilEye, Willow Garage, RethinkRobotics, Boston Dynamics) and by world-renowned researchers at international institutions (Max Planck Institute, The Weizmann Institute, Italian Institute of Technology, The Hebrew University).

Please report errors in award information by writing to: [email protected]

Tags: Awards, NSF, The Research University

Learning From Diverse Populations: A Complexity-Theoretic Perspective

Sponsor: Stanford University
Omer Reingold [email protected] (Principal Investigator)
Award Number: 1908774


Despite the successes of machine learning at complex prediction and classification tasks (such as predicting which ad a reader will click or which word a speaker pronounced), there is growing evidence that “state-of-the-art” predictors can perform significantly less accurately on minority populations than on the majority population. Indeed, a notable study of three commercial face recognition systems, known as the “Gender Shades” project, demonstrated significant performance gaps across different subpopulations at natural classification tasks. Systematic errors on underrepresented subpopulations limit the overall utility of machine-learned prediction systems and may cause material harm to individuals from minority groups. To address accuracy disparity and systematic biases throughout machine learning, the project pursues a principled study of learning in the presence of diverse populations. The project puts high value on education, service to the research community, and wide dissemination of knowledge. The research activities will be accompanied by and integrated with curriculum development, research advising (for students at all levels), service, and outreach to other scientific communities and in popular writing. In addition, in the age of machine learning and big data, the project’s societal impact is twofold: making sure that algorithms work for everyone, but also making sure algorithms uncover all potential talent, which exists in all communities.

The project combines theoretical and empirical investigations to develop algorithmic tools for mitigating systematic bias across subpopulations and to answer basic scientific questions about why discrepancy in accuracy across subpopulations emerges in the first place. Specifically, the project aims to ask and resolve questions that arise in the context of learning from diverse populations along three main axes: (1) improving predictions for underrepresented populations: can learning algorithms be developed that provably do not overlook significant subpopulations? (2) representing individuals to improve the ability to audit and repair models; and (3) understanding the causes of bias in common machine learning models and algorithms.
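To make the notion of accuracy disparity concrete, the audit described above can be sketched as a per-subpopulation accuracy computation. This is an illustrative sketch only, not code from the award; the function names (`per_group_accuracy`, `max_accuracy_gap`) and the toy data are hypothetical.

```python
# Hypothetical sketch: disaggregating a classifier's accuracy by
# subpopulation, the kind of audit that exposed the "Gender Shades" gaps.

def per_group_accuracy(y_true, y_pred, groups):
    """Return a dict mapping each group label to accuracy on that group."""
    totals, correct = {}, {}
    for truth, pred, group in zip(y_true, y_pred, groups):
        totals[group] = totals.get(group, 0) + 1
        correct[group] = correct.get(group, 0) + int(truth == pred)
    return {g: correct[g] / totals[g] for g in totals}

def max_accuracy_gap(y_true, y_pred, groups):
    """Largest accuracy difference between any two subpopulations."""
    accs = per_group_accuracy(y_true, y_pred, groups)
    return max(accs.values()) - min(accs.values())
```

A predictor whose aggregate accuracy looks strong can still show a large `max_accuracy_gap` when one subpopulation is small, which is precisely the failure mode motivating axis (1) above.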

This award reflects NSF’s statutory mission and has been deemed worthy of support through evaluation using the Foundation’s intellectual merit and broader impacts review criteria.
