Path-breaking work that translates an amputee's thoughts into finger motions, and even into commands in video games, holds open the possibility of humans controlling just about anything digital with their minds. Using GPUs, a group of researchers trained an AI neural decoder that can run on a compact, power-efficient NVIDIA Jetson Nano system on module (SOM).
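To make the workflow concrete, here is a minimal sketch of the general pattern the post describes: train a neural decoder on a GPU, then export it for inference on an edge device such as a Jetson Nano. The architecture, layer sizes, channel counts, and file names below are illustrative assumptions, not the researchers' actual model or pipeline.

```python
import torch
import torch.nn as nn

# Hypothetical neural decoder: maps a window of nerve-signal features
# to per-finger motion commands. Sizes and names are assumptions.
class NeuralDecoder(nn.Module):
    def __init__(self, n_channels=64, hidden=128, n_fingers=5):
        super().__init__()
        self.rnn = nn.GRU(n_channels, hidden, batch_first=True)
        self.head = nn.Linear(hidden, n_fingers)

    def forward(self, x):              # x: (batch, time, n_channels)
        out, _ = self.rnn(x)
        return self.head(out[:, -1])   # predicted finger commands

# Train on a GPU workstation (real data loading omitted), then export a
# TorchScript artifact that a PyTorch runtime on a Jetson Nano could load
# for low-power, real-time inference.
device = "cuda" if torch.cuda.is_available() else "cpu"
model = NeuralDecoder().to(device)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

# Placeholder batch standing in for recorded signal windows and
# measured finger trajectories.
signals = torch.randn(32, 100, 64, device=device)
targets = torch.randn(32, 5, device=device)

for _ in range(10):
    optimizer.zero_grad()
    loss = loss_fn(model(signals), targets)
    loss.backward()
    optimizer.step()

# Save a device-independent TorchScript module for edge deployment.
scripted = torch.jit.script(model.cpu().eval())
scripted.save("neural_decoder.pt")
```

On the embedded side, the saved module could be loaded with `torch.jit.load("neural_decoder.pt")` and fed streaming signal windows; the compact SOM form factor is what makes this kind of on-device decoding practical for a wearable prosthetic.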