
Monday, October 28, 2019

Nature can help solve optimization problems

Today's best digital computers still struggle to solve, in a practical time frame, a certain class of problem: combinatorial optimization problems, or those that involve combing through large sets of possibilities to find the best solution. Quantum computers hold potential to take on these problems, but scaling up the number of quantum bits in these systems remains a hurdle. 

Now, MIT Lincoln Laboratory researchers have demonstrated an alternative, analog-based way to accelerate the computing of these problems. "Our computer works by 'computing with physics' and uses nature itself to help solve these tough optimization problems," says Jeffrey Chou, co-lead author of a paper about this work published in Nature's Scientific Reports. "It's made of standard electronic components, allowing us to scale our computer quickly and cheaply by leveraging the existing microchip industry."

Perhaps the most well-known combinatorial optimization problem is that of the traveling salesperson. The problem asks for the shortest route a salesperson can take through a number of cities, starting and ending at the same one. It may seem simple with only a few cities, but the problem becomes exponentially difficult to solve as the number of cities grows, bogging down even the best supercomputers. Yet optimization problems need to be solved in the real world daily; the solutions are used to schedule shifts, minimize financial risk, discover drugs, plan shipments, reduce interference on wireless networks, and much more.
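To see how quickly the search blows up, consider a minimal brute-force sketch (the distance values below are made up for illustration): checking every possible route works for a handful of cities, but the number of distinct tours grows factorially.

from itertools import permutations
from math import factorial

# Made-up symmetric distances between 5 cities (for illustration only).
distances = [
    [0, 2, 9, 10, 7],
    [2, 0, 6, 4, 3],
    [9, 6, 0, 8, 5],
    [10, 4, 8, 0, 6],
    [7, 3, 5, 6, 0],
]
n = len(distances)

def route_length(route):
    # Total length of a closed tour that starts and ends at route[0].
    return sum(distances[route[i]][route[(i + 1) % n]] for i in range(n))

# Fix city 0 as the start and enumerate every ordering of the remaining cities.
best = min(((0,) + rest for rest in permutations(range(1, n))), key=route_length)
print("shortest tour:", best, "length:", route_length(best))

# The count of distinct tours, (n - 1)! / 2, explodes as cities are added.
for cities in (5, 10, 15, 20):
    print(cities, "cities ->", factorial(cities - 1) // 2, "distinct tours")

Even at 20 cities there are roughly 60 quadrillion distinct tours, which is why practical solvers rely on heuristics rather than exhaustive search.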

"It has been known for a very long time that digital computers are fundamentally bad at solving these types of problems," says Suraj Bramhavar, also a co-lead author. "Many of the algorithms that have been devised to find solutions have to trade off solution quality for time. Finding the absolute optimum solution winds up taking an unreasonably long time when the problem sizes grow." Finding better solutions and doing so in dramatically less time could save industries billions of dollars. Thus, researchers have been searching for new ways to build systems designed specifically for optimization.

Finding the beat  

Nature likes to optimize energy, or achieve goals in the most efficient and distributed manner. This principle can be witnessed in the synchrony of nature, like heart cells beating together or schools of fish moving as one. Similarly, if you set two pendulum clocks on the same surface, no matter when the individual pendula are set into motion, they will eventually be lulled into a synchronized rhythm, reaching their apex at the same time but moving in opposite directions (or out of phase). This phenomenon was first observed in 1665 by the Dutch scientist Christiaan Huygens. These clocks are an example of coupled oscillators, set up in such a way that energy can be transferred between them. 

"We've essentially built an electronic, programmable version of this [clock setup] using coupled nonlinear oscillators," Chou says, showing a YouTube video of metronomes displaying a similar phenomenon. "The idea is that if you set up a system that encodes your problem's energy landscape, then the system will naturally try to minimize the energy by synchronizing, and in doing so, will settle on the best solution. We can then read out this solution."

The laboratory's prototype is a type of Ising machine, a computer based on a model in physics that describes a network of magnets, each of which has a magnetic "spin" orientation that can point only up or down. Each spin's final orientation depends on its interaction with every other spin. The individual spin-to-spin interactions are defined with a specific coupling weight, which denotes the strength of their connection. The goal of an Ising machine is to find, given a specific coupling strength network, the correct configuration of each spin, up or down, that minimizes the overall system energy.
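In its simplest textbook form (a sketch for illustration, not the laboratory's hardware), the Ising energy of a set of spins s_i = +1 or -1 with coupling weights J_ij is the negative sum of J_ij * s_i * s_j over all pairs, and for a tiny network the minimum-energy configuration can be found by simply checking every possibility.

from itertools import product

# Hypothetical coupling weights for a 4-spin network (symmetric, zero diagonal).
J = [
    [0,  1, -2,  0],
    [1,  0,  3, -1],
    [-2, 3,  0,  2],
    [0, -1,  2,  0],
]
N = len(J)

def ising_energy(spins):
    # E = -(sum over pairs i < j of J[i][j] * s_i * s_j)
    return -sum(J[i][j] * spins[i] * spins[j]
                for i in range(N) for j in range(i + 1, N))

# Exhaustive search: 2**N configurations -- fine for N = 4, hopeless for large N.
ground_state = min(product([-1, 1], repeat=N), key=ising_energy)
print("lowest-energy spins:", ground_state,
      "energy:", ising_energy(ground_state))

The catch is that the number of configurations doubles with every added spin, which is exactly the kind of explosion an Ising machine is meant to sidestep.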

But how does an Ising machine solve an optimization problem? It turns out that optimization problems can be mapped directly onto the Ising model, so that a set of spins with certain coupling weights can represent each city and the distances between them in the traveling salesperson problem. Thus, finding the lowest-energy configuration of spins in the Ising model translates directly into the solution for the salesperson's shortest route. However, solving this problem by individually checking each of the possible configurations becomes prohibitively difficult when the problems grow to even modest sizes. 
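As a rough illustration of such a mapping (this is the standard penalty-term recipe found in textbooks, not necessarily the encoding used in the laboratory's machine), a tiny traveling-salesperson instance can be written in terms of binary variables whose lowest-energy assignment is a valid, shortest tour.

from itertools import product
import numpy as np

# Made-up pairwise distances for 4 cities.
dist = np.array([[0, 2, 9, 10],
                 [2, 0, 6, 4],
                 [9, 6, 0, 8],
                 [10, 4, 8, 0]])
n = len(dist)
A = 30.0   # penalty weight; must outweigh any distance saved by cheating

def energy(x):
    # x[v][p] = 1 means city v is visited at tour position p.
    e = 0.0
    for v in range(n):                      # each city in exactly one position
        e += A * (1 - x[v, :].sum()) ** 2
    for p in range(n):                      # each position holds exactly one city
        e += A * (1 - x[:, p].sum()) ** 2
    for u in range(n):                      # length of the implied closed tour
        for v in range(n):
            if u != v:
                for p in range(n):
                    e += dist[u, v] * x[u, p] * x[v, (p + 1) % n]
    return e

# Exhaustive search over all 2**(n*n) = 65,536 assignments (a few seconds here,
# hopeless at scale). The lowest-energy assignment encodes a shortest valid tour.
best = min(product([0, 1], repeat=n * n),
           key=lambda bits: energy(np.array(bits).reshape(n, n)))
x = np.array(best).reshape(n, n)
print("tour order:", [int(np.argmax(x[:, p])) for p in range(n)],
      "energy (= tour length):", energy(x))

The penalty terms rule out assignments that are not valid tours, so minimizing the energy and solving the routing problem become the same task.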

In recent years, there have been efforts to build quantum machines that map to the Ising model, the most notable of which is one from the Canadian company D-Wave Systems. These machines may offer an efficient way to search the large solution space and find the correct answer, although they operate at cryogenic temperatures.

The laboratory's system runs a similar search, but does so using simple electronic oscillators. Each oscillator represents a spin in the Ising model and likewise settles into one of two phases: oscillators that are synchronized, or in phase, represent the "spin up" configuration, and those that are out of phase represent "spin down." To set the system up to solve an optimization problem, the problem is first mapped to the Ising model, translating it into programmable coupling weights connecting the oscillators.

With the coupling weights programmed, the oscillators are allowed to run, like the pendulum arm of each clock being released. The system then naturally relaxes to its overall minimum energy state. Electronically reading out each oscillator's final phase, representing "spin up" or "spin down," yields the answer to the posed question. When the system was run on more than 2,000 random optimization problems, it came to the correct solution 98 percent of the time.
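A minimal numerical sketch of this kind of relaxation (an idealized phase-oscillator model chosen for illustration, not the laboratory's actual circuit) captures the idea: coupled oscillators with a phase-binarizing term drift toward one of two phases, and the final phases are read out as spins.

import numpy as np

rng = np.random.default_rng(0)

# Hypothetical programmed coupling weights (the Ising problem), symmetric.
J = np.array([[0,  1, -2,  0],
              [1,  0,  3, -1],
              [-2, 3,  0,  2],
              [0, -1,  2,  0]], dtype=float)
N = len(J)

K_couple = 1.0     # oscillator-to-oscillator coupling strength
K_binarize = 2.0   # term that pushes each phase toward 0 or pi
dt, steps = 0.01, 5000

phase = rng.uniform(0, 2 * np.pi, N)   # random starting phases
for _ in range(steps):
    diff = phase[:, None] - phase[None, :]             # pairwise phase differences
    coupling = -K_couple * (J * np.sin(diff)).sum(axis=1)
    binarize = -K_binarize * np.sin(2 * phase)
    noise = 0.05 * rng.standard_normal(N)              # small jitter to escape ties
    phase = (phase + dt * (coupling + binarize + noise)) % (2 * np.pi)

# Read out the answer: in-phase oscillators -> "spin up", out of phase -> "spin down".
spins = np.where(np.cos(phase) > 0, 1, -1)
print("spins:", spins, "Ising energy:", -0.5 * spins @ J @ spins)

In this toy model the readout is simply whether each oscillator settled in phase or out of phase, mirroring the "spin up"/"spin down" assignment described above.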

Previously, researchers at Stanford University demonstrated an Ising machine that uses lasers and electronics to solve optimization problems. That work revealed the potential for a significant speedup over digital computing although, according to Chou, the system may be difficult and costly to scale to larger sizes. The goal of finding a simpler alternative ignited the laboratory's research. 

Scaling up

The individual oscillator circuit the team used in their demonstration is similar to circuitry found inside cellphones or Wi-Fi routers. One addition they've made is a crossbar architecture that allows all of the oscillators in the circuit to be directly coupled to each other. "We have found an architecture that is both scalable to manufacture and can enable full connectivity to thousands of oscillators," Chou says. Full connectivity allows the system to be mapped easily to a wide variety of optimization problems. 

"This work from Lincoln Laboratory makes innovative use of a crossbar architecture in its construction of an analog-electronic Ising machine," says Peter McMahon, an assistant professor of applied and engineering physics at Cornell University who was not involved in this research. "It will be interesting to see how future developments of this architecture and platform perform."  

The laboratory's prototype Ising machine uses four oscillators. The team is now working out a plan to scale the prototype to larger numbers of oscillators, or "nodes," and fabricate it on a printed circuit board. "If we can get to, say, 500 nodes, there is a chance we can start to compete with existing computers, and at 1,000 nodes we might be able to beat them," Bramhavar says.

The team sees a clear path forward to scaling up because the technology is based on standard electronic components. It's also extremely cheap. All the parts for their prototype can be found in a typical undergraduate electrical engineering lab and were bought online for about $20.

"What excites me is the simplicity," Bramhavar adds. "Quantum computers are expected to demonstrate amazing performance, but the scientific and engineering challenges required to scale them up are quite hard. Demonstrating even a small fraction of the performance gains envisioned with quantum computers, but doing so using hardware from the existing electronics industry, would be a huge leap forward. Exploiting the natural behavior of these circuits to solve real problems presents a very compelling alternative for what the next era of computing could be."



from MIT News https://ift.tt/2NcTfJG