To solve this problem, the research team developed an AI-enabled model that adjusted for day-to-day changes in his brain activity. For two weeks, the participant visualized simple movements while the AI learned from his brain signals. When he first attempted to control a robotic arm, his movements were imprecise. To improve accuracy, he practiced with a virtual robotic arm that provided real-time feedback.
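The article does not describe the model's internals, but the core idea, a decoder that is recalibrated each session so it keeps tracking drifting neural signals, can be sketched roughly. The snippet below is purely illustrative: the simulated data, the simple ridge-regression decoder, and the blending rule are all assumptions for the sake of the example, not the team's actual method.

```python
import numpy as np

rng = np.random.default_rng(0)

N_CHANNELS = 64      # hypothetical number of recorded neural features
N_DIMS = 3           # decoded arm velocity (x, y, z)

def simulate_session(true_map, n_samples=500, noise=0.1):
    """Simulate one day's calibration data: imagined-movement targets
    and the neural features they evoke under the current (drifted) map."""
    intended = rng.standard_normal((n_samples, N_DIMS))
    neural = intended @ true_map.T + noise * rng.standard_normal((n_samples, N_CHANNELS))
    return neural, intended

def fit_ridge(X, Y, lam=1.0):
    """Closed-form ridge regression decoder: neural features -> intended velocity."""
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ Y)

# Ground-truth neural encoding, which drifts a little every day.
true_map = rng.standard_normal((N_CHANNELS, N_DIMS))

decoder = None
for day in range(14):                                         # two weeks of sessions
    true_map += 0.05 * rng.standard_normal(true_map.shape)    # day-to-day drift
    X, Y = simulate_session(true_map)
    new_w = fit_ridge(X, Y)
    # Blend yesterday's decoder with today's fit so the model tracks drift
    # without discarding what it has already learned.
    decoder = new_w if decoder is None else 0.7 * decoder + 0.3 * new_w

    X_test, Y_test = simulate_session(true_map, n_samples=200)
    err = np.mean((X_test @ decoder - Y_test) ** 2)
    print(f"day {day + 1:2d}: held-out decoding error {err:.3f}")
```

In this toy setup the error stays roughly flat across days even though the underlying mapping keeps shifting, which is the behavior the adaptive model is meant to achieve.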
Once he had learned to use the virtual arm, he quickly transferred those skills to the real robotic arm. He used it to pick up blocks, rotate them, and place them in different positions. In a more advanced task, he opened a cabinet, took out a cup, and placed it under the water dispenser.