
Musician Who Lost His Arm Plays Piano Again With AI Prosthesis

 

This article comes from nvidia.com. The original URL is: https://blogs.nvidia.com/blog/2017/12/28/ai-prosthesis-skywalker/


The Star Wars-inspired limb lets him control each finger.

From a galaxy far, far away comes the inspiration for this invention: a robotic arm modeled on Luke Skywalker's bionic hand.

While the arm may not wield a lightsaber, it holds greater power for jazz musician Jason Barnes: it let him play the piano for the first time in five years.

Barnes, who lost his right arm in a work accident, is back at the keys thanks to an AI prosthesis created by researchers at the Georgia Institute of Technology. Unlike most prosthetics, it gives the 28-year-old the ability to control each finger individually.

With it, Barnes can play Beethoven. He also played the "Star Wars" theme song. (You can watch him in the video below.)

"It's completely mind-blowing," Barnes said. "If it can play piano, it can do almost anything."

Individual Finger Control Enables Dexterity

With individual finger control, Barnes and other amputees can use the AI prosthesis for everyday activities such as holding a fork, a towel or a comb. That dexterity comes from combining GPU-accelerated deep learning with an ultrasound machine.

Barnes’ everyday prosthesis, like most, relies on electromyogram (EMG) sensors to detect electrical impulses in his muscles. Although these recognize muscle movement, EMG signals are too noisy to determine which finger wants to move.

“It’s like putting a microphone next to a concert hall,” said Gil Weinberg, the Georgia Tech professor who leads the research. “We needed to be inside the concert hall.”

Ultrasound Strikes a Chord

Weinberg was in a colleague’s lab trying to improve EMG when he noticed an ultrasound machine next to where he was working. The same device that doctors use to see babies in the womb let him see muscle contractions, as well as the speed and direction of muscle movements.

“It was a big eureka,” he said. “With the ultrasound, there was a distinct correlation between what finger moved and what was on the machine.”

By attaching an ultrasound probe to the arm, Weinberg trained a deep learning network to analyze and detect muscle movements. Using our GeForce GTX TITAN X GPU with the cuDNN-accelerated TensorFlow deep learning framework, the team created an algorithm that predicts which finger the musician is trying to use.
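The post doesn't detail the network itself, but the prediction step it describes, mapping ultrasound-derived muscle features to a finger label, can be illustrated with a minimal, framework-agnostic sketch. Everything below is an assumption for illustration only: the feature count, the five-finger label set, and the synthetic data. The team's actual model was a TensorFlow network trained on real ultrasound imagery.

```python
# Minimal sketch of the finger-prediction step, assuming hypothetical
# ultrasound-derived feature vectors and five finger classes. This is a
# plain softmax classifier in NumPy; the Georgia Tech team used a
# cuDNN-accelerated TensorFlow network, not this code.
import numpy as np

rng = np.random.default_rng(0)

N_FINGERS = 5      # thumb through pinky (assumed label set)
N_FEATURES = 16    # hypothetical features extracted from ultrasound frames

# Synthetic training data: one well-separated cluster per finger.
centers = rng.normal(0.0, 3.0, size=(N_FINGERS, N_FEATURES))
X = np.vstack([c + rng.normal(0.0, 0.3, size=(200, N_FEATURES)) for c in centers])
y = np.repeat(np.arange(N_FINGERS), 200)

# Softmax regression trained by batch gradient descent on cross-entropy.
W = np.zeros((N_FEATURES, N_FINGERS))
b = np.zeros(N_FINGERS)
for _ in range(500):
    logits = X @ W + b
    logits -= logits.max(axis=1, keepdims=True)   # numerical stability
    p = np.exp(logits)
    p /= p.sum(axis=1, keepdims=True)
    p[np.arange(len(y)), y] -= 1.0                # softmax minus one-hot
    W -= 0.1 * (X.T @ p) / len(y)
    b -= 0.1 * p.mean(axis=0)

def predict_finger(features):
    """Return the index of the most likely finger for one feature vector."""
    return int(np.argmax(features @ W + b))

accuracy = np.mean([predict_finger(x) == t for x, t in zip(X, y)])
```

A real pipeline would extract features from streaming ultrasound frames and train a deeper network on the GPU; the linear classifier above only shows the shape of the problem, a many-feature signal reduced to a single finger decision.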

A Different Drummer

The “Star Wars” arm is Barnes’ second AI prosthesis. The Atlanta music teacher is a drummer at heart. Determined to keep playing after the accident, he rigged up a homemade prosthesis. It let him use a drumstick, but he couldn’t control the speed or bounce of the stick.

That posed the perfect challenge for Weinberg, founding director of Georgia Tech’s center for music technology, who wants to change how we think about music by creating AI technologies that compose and perform songs.

When Barnes approached him, Weinberg had already built a marimba-playing robot percussionist that uses deep learning to improvise with human musicians. Like Barnes, he's a jazz musician (piano), and the idea of using AI to help Barnes get his groove back intrigued him.

AI Music on Tour

But Weinberg did more than let Barnes beat the drum again. He built the deep learning prosthesis with not one, but two drumsticks. Barnes controls one, and the other improvises tunes based on the music in the room. Besides composing music, the robotic arm plays faster than any drummer in the world, according to Weinberg.

“The idea is to bring you back to how you used to be — or better,” Weinberg said. “We can push the limits of what’s humanly possible with deep learning.”

Although the second drumstick was intimidating at first, Barnes mastered it well enough to go on tour. The journey took him and Weinberg to four continents and included a stop at the Kennedy Center in Washington.

“I went from a horrible accident to playing around the world,” Barnes said.
