This article is from wired.com. Original URL: https://www.wired.com/story/this-computer-uses-lightnot-electricityto-train-ai-algorithms/
William Andregg ushers me into the cluttered workshop of his startup Fathom Computing and gently lifts the lid from a bulky black box. Inside, green light glows faintly from a collection of lenses, brackets, and cables that resemble an exploded telescope. It’s a prototype computer that processes data using light, not electricity, and it’s learning to recognize handwritten digits. In other experiments, the device has learned to generate text.
Right now, this embryonic optical computer is good, not great: on its best run it read 90 percent of scrawled numbers correctly. But Andregg, who cofounded Fathom late in 2014 with his brother Michael, sees it as a breakthrough. “We opened the champagne when it was only at about 30 percent,” he says with a laugh.
Andregg claims this is the first time such complex machine-learning software has been trained using circuits that pulse with laser light, not electricity. The company is working to shrink its exploded telescope, which covers a few square feet of workbench, to fit into a standard cloud server. Fathom hopes the technology will become one of the shovels of the artificial-intelligence gold rush.
Tech companies, particularly large cloud providers like Amazon and Microsoft, spend heavily on computer chips to power machine-learning algorithms. The current AI-crazed moment began when researchers found that chips marketed for graphics were well-suited to power so-called artificial neural networks for tasks such as recognizing speech or images. The stock price of leading graphics-chip supplier Nvidia has grown more than 10-fold in the past three years, and Google and many other companies are now making or developing specialized machine-learning chips of their own.
Fathom’s founders are betting this hunger for more powerful machine learning will outstrip the capabilities of purely electronic computers. “Optics has fundamental advantages over electronics that no amount of design will overcome,” says William Andregg. The brothers’ 11-person company is backed by Playground Global, the venture firm led by Andy Rubin, co-inventor of the Android operating system now owned by Google. Fathom operates out of Playground’s combined offices and workshops in Palo Alto, California. The facility, which true to its name also boasts a slide popular with Andregg’s 18-month-old daughter, previously hosted Nervana, the startup acquired by Intel in 2016 to form the heart of the chip giant’s AI hardware strategy.
You’re already reaping the benefits of using light instead of electricity to work with data. Telecommunications companies move our web pages and selfies over long distances by shooting lasers down optical fiber, because light signals travel much farther, using a fraction of the energy, than electrical pulses in a metal cable. A single cable can house many parallel streams of data at once, carried by light of different colors.
Using light to crunch data, as well as transport it, should also offer significant performance gains. Light inside optical circuits travels more or less for free. By contrast, electrical signals must battle resistance, producing waste heat. A combination of capacity gains and energy savings could be tempting to companies running big machine-learning projects. A single experiment at Google, for example, can now use hundreds of powerful graphics chips for solid weeks at a time, according to some of the firm’s research papers.
Optical processors aren’t a new idea. They were a feature of some 1960s military radar systems. But the idea fell by the wayside when the semiconductor industry hit its stride, delivering the decades of exponential increases in chip density that came to be known as Moore’s Law. Fathom is part of a nascent optical computing renaissance sparked by a realization that Moore’s Law appears to be running out of steam. The trend’s demise was cited in a recent report by 14 Berkeley researchers on the technical challenges to making AI systems ever smarter. “Our historically rapidly improving hardware technology is coming to a grinding halt,” they wrote.
Optical computers aren’t likely to power your laptop or smartphone any time soon. Fathom’s prototype is still too bulky, for one thing. But the technology does look to be a decent match for the main work that chips perform in AI projects based on artificial neural networks, says Pierre-Alexandre Blanche, a professor at the University of Arizona. Siri’s speech recognition, and Alphabet’s conquest of the boardgame Go, are built on huge volumes of one particular mathematical operation, multiplying grids of numbers known as matrices.
Fathom’s prototype performs those operations by encoding numbers into beams of light. The beams are passed through a sequence of lenses and other optical components. Reading how the beams have been changed by their ordeal reveals the result of a calculation. Optical circuits like this can effectively perform the work of both the memory and processor in conventional computers. The time and energy costs of moving data between those components are a bottleneck on the performance of systems in use today.
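To see why matrix multiplication is the operation worth accelerating, here is a minimal sketch in NumPy of what a single neural-network layer computes when classifying a handwritten digit. This is a generic illustration of the math, not Fathom's actual system; the sizes (a flattened 28x28 image, 10 digit classes) and the random weights are assumptions for the example.

```python
import numpy as np

# A dense neural-network layer is essentially one matrix multiplication
# (plus a bias) -- the workload optical circuits aim to perform with light.
rng = np.random.default_rng(0)

x = rng.standard_normal(784)        # a flattened 28x28 handwritten-digit image
W = rng.standard_normal((10, 784))  # layer weights: one row per digit class
b = np.zeros(10)                    # bias term

logits = W @ x + b                  # the matrix-vector multiply at the core
prediction = int(np.argmax(logits)) # highest-scoring class, a digit 0-9

print(prediction)
```

Deep networks stack many such layers, so training and inference reduce to huge numbers of these multiplications, which is why hardware that performs them faster or with less energy is so valuable.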
Fathom isn’t alone in thinking AI systems need to trip the light fantastic. Paris-based startup LightOn announced Friday that it has begun testing its own technology in a data center. Startups Lightmatter and Lightelligence spun out of MIT last year, raising a total of $21 million in funding, including from China’s search giant Baidu. The pair originate in an MIT project that ran neural networks for speech recognition on an optical computer, although unlike Fathom’s device the system didn’t play host to the training of that software. “As soon as we posted our research paper on that project online we received multiple calls from investors,” says Yichen Shen, CEO and cofounder of Lightelligence. “There’s recognition this is a big opportunity.”
The Andregg brothers’ last startup, Halcyon Molecular, stumbled in pursuit of a different big opportunity. The genome-sequencing company was backed by Tesla CEO Elon Musk and Facebook investor Peter Thiel but folded in 2012 because, the founders say, competitors were further ahead.
Andregg believes his team is better placed in the optical-computing race. All the same, Fathom’s prototype has a way to go. Beyond its size, the current version becomes error-prone when it gets cold. The goal is to fit the system onto one circuit board so it can be slid into a server. Some aspects of the bulky system I saw should be relatively easy to shrink; it was assembled from low-cost parts to aid hands-on tinkering as the idea was proven out. But the company also has to create a new chip to detect and manipulate laser beams. That’s within the realm of what contract chip manufacturers can build, but designing any kind of chip is a complex task for a startup.
Andregg guesses the final product won’t be ready for about two years, but he and his brother are already worrying about what people might do with it. Fathom was incorporated as a benefit corporation with the mission statement “Making better hardware for artificial intelligence and improving all lives.” That is intended to give Fathom’s leaders the right to turn down sales they think could lead to harmful uses of artificial intelligence. “We don’t want a negative singularity,” Andregg says. “If the military wants to buy a bunch of systems we’ll be like eh…no.”
- Chips that exploit quantum mechanics could reinvent computing; startup Rigetti Computing is racing Google, Microsoft, and IBM to do it first.
- China’s plan to overtake the U.S. in artificial intelligence rides in part on the country developing new silicon chips.
- Cramming AI smarts into tiny devices could require reviving ideas from the birth of computing 60 years ago.