QualityPoint Technologies News

Emerging Technologies News

Use of brain-computer interface, virtual avatar could help people with gait disabilities

Posted on August 25, 2017

Researchers from the University of Houston have shown for the first time that a brain-computer interface augmented with a virtual walking avatar can be used to control gait, suggesting the protocol may help patients recover the ability to walk after stroke, some spinal cord injuries and certain other gait disabilities.

Researchers said the work, done at the University’s Noninvasive Brain-Machine Interface System Laboratory, is the first to demonstrate that a brain-computer interface can promote and enhance cortical involvement during walking. The study, funded by the National Institute of Neurological Disorders and Stroke, was published this week in Scientific Reports.

Jose Luis Contreras-Vidal, Cullen Professor of electrical and computer engineering at UH and senior author of the paper, said the data will be made available to other researchers. While similar work has been done in non-human primates, this is the first to involve humans, he said. Contreras-Vidal is also site director of the BRAIN Center (Building Reliable Advances and Innovation in Neurotechnology), a National Science Foundation Industry/University Cooperative Research Center.

Contreras-Vidal and researchers in his lab use non-invasive brain monitoring to determine which parts of the brain are involved in an activity, then use that information to create an algorithm, or brain-machine interface, that can translate the subject’s intentions into action.
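In broad strokes, such a decoder maps features extracted from EEG channels to a movement command. The article does not describe the lab's actual algorithm; the sketch below is a deliberately simplified, hypothetical linear decoder (the function name, weights, and a 4-channel feature vector are all assumptions, standing in for the study's 64 channels):

```python
# Minimal, illustrative sketch of one brain-machine interface decoding step.
# This is NOT the UH lab's algorithm; it only shows the general idea of
# mapping per-channel EEG features to a predicted kinematic value.

def decode_gait(eeg_window, weights, bias):
    """Predict one kinematic value (e.g., a knee angle) from EEG features.

    eeg_window : per-channel feature values for the current time window
    weights    : learned decoder weights, one per channel
    bias       : learned offset
    """
    return sum(w * x for w, x in zip(weights, eeg_window)) + bias

# Hypothetical example with 4 channels instead of the study's 64.
features = [0.2, -0.1, 0.4, 0.05]
weights = [1.0, 0.5, -0.3, 2.0]
angle = decode_gait(features, weights, bias=10.0)  # predicted joint angle
```

In a real system the weights would be fitted from training data recorded while the subject walks, and the prediction would be recomputed for each successive EEG window.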

In addition to Contreras-Vidal, researchers on the project are first author Trieu Phat Luu, a research fellow in neural engineering at UH; Sho Nakagome and Yongtian He, graduate students in the UH Department of Electrical and Computer Engineering.

“Voluntary control of movements is crucial for motor learning and physical rehabilitation,” they wrote. “Our results suggest the possible benefits of using a closed-loop EEG-based BCI-VR (brain-computer interface-virtual reality) system in inducing voluntary control of human gait.”

Researchers already knew that electroencephalogram (EEG) readings of brain activity can distinguish whether a subject is standing still or walking. But it was not known whether a brain-computer interface was practical for helping to promote the ability to walk, or which parts of the brain are relevant to determining gait.
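The standing-versus-walking distinction can be illustrated with a toy two-state classifier. The article does not say how the study made this distinction; the sketch below simply thresholds the mean power of a pre-filtered EEG window, on the assumption (not stated in the source) that gait-related modulation raises signal power during walking:

```python
# Illustrative sketch only: a threshold classifier separating standing
# from walking based on EEG window power. The feature, threshold, and
# decision rule are assumptions, not the study's method.

def band_power(samples):
    """Mean squared amplitude of a (pre-filtered) EEG window."""
    return sum(s * s for s in samples) / len(samples)

def classify_state(samples, threshold):
    """Return 'walking' if window power exceeds a learned threshold."""
    return "walking" if band_power(samples) > threshold else "standing"

# Hypothetical windows: a quiet (standing) and a strongly modulated
# (walking) trace.
standing_window = [0.1, -0.1, 0.05, -0.05]
walking_window = [0.8, -0.9, 1.1, -1.0]
print(classify_state(standing_window, threshold=0.2))  # standing
print(classify_state(walking_window, threshold=0.2))   # walking
```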

In this case, they collected data from eight healthy subjects, all of whom participated in three trials involving walking on a treadmill while watching an avatar displayed on a monitor. The volunteers were fitted with a 64-channel EEG headset and motion sensors at the hip, knee and ankle joints.

The avatar first was activated by the motion sensors, allowing its movement to precisely mimic that of the test subject. In later tests, the avatar was controlled by the brain-computer interface, meaning the subject controlled the avatar with his or her brain.

The avatar perfectly mimicked the subject’s movements when relying upon the sensors, but the match was less precise when the brain-computer interface was used.
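One common way to quantify how closely a decoded trajectory tracks the real movement is the Pearson correlation between the two signals. The article does not name the metric the study used, so this is an illustrative choice, with hypothetical joint-angle traces:

```python
# Illustrative metric: Pearson correlation between a sensor-recorded
# trajectory and a BCI-decoded one. Values near 1.0 mean a close match.

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

# Hypothetical knee-angle traces over one stride.
actual  = [10.0, 20.0, 30.0, 20.0, 10.0]   # from motion sensors
decoded = [12.0, 18.0, 27.0, 22.0, 11.0]   # from the brain-computer interface
r = pearson_r(actual, decoded)  # high, but below the sensor-driven case's 1.0
```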

Contreras-Vidal said that’s to be expected, noting that other studies have shown some initial decoding errors as the subject learns to use the interface. “It’s like learning to use a new tool or sport,” he said. “You have to understand how the tool works. The brain needs time to learn that.”

The researchers reported increased activity in the posterior parietal cortex and the inferior parietal lobe, along with increased involvement of the anterior cingulate cortex, which is involved in motor learning and error monitoring.

The next step is to use the protocol with patients, the subject of He’s Ph.D. dissertation.

“The appeal of brain-machine interface is that it places the user at the center of the therapy,” Contreras-Vidal said. “They have to be engaged, because they are in control.”

News Source: https://ssl.uh.edu/news-events/stories/2017/August%202017/08232017Contreras-Vidal-Computer-Interface-Virtual-Avatar.php
