Professor wins $546K NSF grant to develop AI robotics for disabled patients

University of Rhode Island assistant professor Reza Abiri has received a $546,848 National Science Foundation CAREER Award to develop AI-driven robotics systems for people with spinal cord injuries and stroke survivors. The five-year project aims to create brain-machine interfaces that can interpret minimal human movements—like head nods or eye movements—and translate them into complex robotic actions, potentially restoring significant independence for individuals with physical disabilities.

How it works: Abiri’s system combines human movement detection with artificial intelligence to create intuitive robotic assistance.

  • The technology captures small inputs such as head turns or eye movements and uses AI to infer the person’s intent.
  • Computer vision allows the system to analyze the environment—for example, recognizing a coffee cup and understanding it should be grabbed from the side rather than the top.
  • The AI learns individual capabilities and adapts to each person’s specific disabilities and interaction patterns.
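The pipeline the bullets describe (small movement in, object-aware grasp command out) can be sketched as a toy rule-based stand-in. The object classes, grasp rules, and threshold below are illustrative assumptions, not details from Abiri’s actual system:

```python
# Hypothetical sketch: map a small detected movement plus a gaze target
# to a robot command. All names and thresholds are illustrative.

GRASP_RULES = {
    "coffee_cup": "side",   # grab from the side, not the top
    "bottle": "side",
    "book": "top",
}

def infer_intent(head_turn_deg: float, gaze_target: str) -> dict:
    """Combine a head movement with a gaze-detected object into a command."""
    if abs(head_turn_deg) < 5.0:        # below threshold: treat as no intent
        return {"action": "idle"}
    approach = GRASP_RULES.get(gaze_target, "top")
    return {"action": "grasp", "object": gaze_target, "approach": approach}
```

In the real system a learned model, not a lookup table, would supply both the intent detection and the grasp choice; the sketch only shows the shape of the mapping.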

The collaboration approach: Unlike current AI systems that work independently, Abiri’s research focuses on human-AI partnership.

  • “Our work is how the human and AI can collaborate to complete a task,” Abiri explained, distinguishing his approach from text-based AI like ChatGPT.
  • The system becomes increasingly personalized as it learns from each individual user’s movements and preferences.
  • Users remain actively engaged in the process rather than relying on fully autonomous assistance.
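One common way to keep the user "in the loop" like this is shared autonomy, where the robot blends the human’s input with the AI’s suggestion rather than acting alone. The linear blend below is a standard textbook arbitration scheme, offered as an assumption about how such a partnership could work, not as Abiri’s method:

```python
def blend_command(human_cmd: list[float], ai_cmd: list[float], alpha: float) -> list[float]:
    """Linearly arbitrate between human input and the AI's suggestion.

    alpha = 1.0 gives the human full control; alpha = 0.0 is fully
    autonomous; values in between share control.
    """
    return [alpha * h + (1.0 - alpha) * a for h, a in zip(human_cmd, ai_cmd)]
```

With alpha tuned per user, the same mechanism lets the system become more personalized over time, as the bullets describe.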

Why this matters: The project addresses a critical gap in assistive technology for people with permanent motor impairments.

  • While stroke patients often experience some recovery, spinal cord injury patients typically require assistive technology for independence.
  • Current brain-machine interfaces require extensive invasive procedures or complex setups, limiting their practical application.
  • The research builds on decades of neuroscience investment, focusing on translating scientific advances into real-world patient benefits.

What patients experience: Early human subjects show enthusiasm for technology that preserves their agency and decision-making.

  • “They’re excited to see this technology, and that they are really allowed to do something on their own,” Abiri noted.
  • The interactive approach motivates users to engage more actively with their environment.
  • As users become more proficient, the AI system can step back and provide less assistance.
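The "stepping back" behavior could be modeled as an assistance level that decays with user success. This is a minimal sketch under that assumption; the step size and clamping are placeholders, not parameters from the project:

```python
def update_assistance(level: float, task_succeeded: bool, step: float = 0.1) -> float:
    """Lower the assistance level after a success, raise it after a failure,
    clamped to [0, 1] -- so the AI steps back as the user gains proficiency."""
    level += -step if task_succeeded else step
    return min(1.0, max(0.0, level))
```

A run of successful tasks drives the level toward 0 (mostly human control), while failures push it back toward 1 (more AI help).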

The bigger picture: The project extends beyond immediate patient care to inspire future scientists and engineers.

  • Abiri is developing a new university course, annual hackathon competitions, and K-12 internships focused on human-centered robotics.
  • The research involves collaborations with medical schools, rehabilitation centers, and robotics companies across Rhode Island and Massachusetts.
  • People with disabilities interested in participating can contact Abiri at [email protected].

Timeline expectations: While autonomous robotics technology has advanced significantly, human-AI collaboration represents the beginning of a new frontier.

  • “From the perspective of autonomous, we are in a good stage. But from the perspective of human involvement, I see this is the beginning,” Abiri said.
  • The five-year NSF grant provides funding through 2029 for continued development and testing.
