MIT’s New Neural Network: “Liquid” Machine-Learning System Adapts to Changing Conditions

MIT researchers have developed a type of neural network that learns on the job, not just during its training phase. These flexible algorithms, dubbed “liquid” networks, change their underlying equations to continuously adapt to new data inputs. The advance could aid decision making based on data streams that change over time, including those involved in medical diagnosis and autonomous driving.

“This is a way forward for the future of robot control, natural language processing, video processing — any form of time series data processing,” says Ramin Hasani, the study’s lead author. “The potential is really significant.”

The research will be presented at February’s AAAI Conference on Artificial Intelligence. In addition to Hasani, a postdoc in the MIT Computer Science and Artificial Intelligence Laboratory (CSAIL), MIT co-authors include Daniela Rus, CSAIL director and the Andrew and Erna Viterbi Professor of Electrical Engineering and Computer Science, and PhD student Alexander Amini. Other co-authors include Mathias Lechner of the Institute of Science and Technology Austria and Radu Grosu of the Vienna University of Technology.

Time series data are both ubiquitous and vital to our understanding of the world, according to Hasani. “The real world is all about sequences. Even our perception — you’re not perceiving images, you’re perceiving sequences of images,” he says. “So, time series data actually create our reality.”

He points to video processing, financial data, and medical diagnostic applications as examples of time series that are central to society. These ever-changing data streams can fluctuate unpredictably. Yet analyzing these data in real time, and using them to anticipate future behavior, can boost the development of emerging technologies like self-driving cars. So Hasani built an algorithm fit for the task.

Hasani designed a neural network that can adapt to the variability of real-world systems. Neural networks are algorithms that recognize patterns by analyzing a set of “training” examples. They’re often said to mimic the processing pathways of the brain — Hasani drew inspiration directly from the microscopic nematode, C. elegans. “It only has 302 neurons in its nervous system,” he says, “yet it can generate unexpectedly complex dynamics.”

Hasani coded his neural network with careful attention to how C. elegans neurons activate and communicate with each other via electrical impulses. In the equations he used to structure his neural network, he allowed the parameters to change over time based on the results of a nested set of differential equations.
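As a rough illustration of this idea (not the authors’ code), here is a minimal NumPy sketch of a liquid time-constant style neuron layer, in which the same input-dependent nonlinearity that drives the state update also modulates the effective time constant of each neuron. The weights, sizes, and the simple Euler solver are all illustrative assumptions:

```python
import numpy as np

def ltc_step(x, I, W_in, W_rec, b, tau, A, dt=0.05):
    """One Euler step of a liquid time-constant (LTC) style layer.

    The gate f depends on the current input I and state x, and it
    appears both in the decay term and the drive term, so the
    effective time constant adapts to the incoming data stream.
    """
    f = np.tanh(W_in @ I + W_rec @ x + b)        # input-dependent gate
    dxdt = -(1.0 / tau + f) * x + f * A          # "liquid" dynamics
    return x + dt * dxdt

# Toy usage: run a 4-neuron layer on a noisy sine-wave input stream.
rng = np.random.default_rng(0)
n, m = 4, 1
W_in = rng.normal(size=(n, m))
W_rec = rng.normal(size=(n, n)) * 0.1
b = np.zeros(n)
tau = np.ones(n)          # base time constants
A = rng.normal(size=n)    # per-neuron target states

x = np.zeros(n)
for t in range(200):
    I = np.array([np.sin(0.1 * t)]) + 0.05 * rng.normal(size=m)
    x = ltc_step(x, I, W_in, W_rec, b, tau, A)
```

Because the decay rate `1/tau + f` changes with every input, the network’s response to the same stimulus depends on recent history, which is the flexibility the article describes.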

This flexibility is key. Most neural networks’ behavior is fixed after the training phase, which means they’re bad at adjusting to changes in the incoming data stream. Hasani says the fluidity of his “liquid” network makes it more resilient to unexpected or noisy data, like if heavy rain obscures the view of a camera on a self-driving car. “So, it’s more robust,” he says.

There’s another advantage of the network’s flexibility, he adds: “It’s more interpretable.”

Hasani says his liquid network skirts the inscrutability common to other neural networks. “Just changing the representation of a neuron,” which Hasani did with the differential equations, “you can really explore some degrees of complexity you couldn’t explore otherwise.” Thanks to Hasani’s small number of highly expressive neurons, it’s easier to peer into the “black box” of the network’s decision making and diagnose why the network made a certain characterization.

“The model itself is richer in terms of expressivity,” says Hasani. That could help engineers understand and improve the liquid network’s performance.

Hasani’s network excelled in a battery of tests. It edged out other state-of-the-art time series algorithms by a few percentage points in accurately predicting future values in datasets ranging from atmospheric chemistry to traffic patterns. “In many applications, we see the performance is reliably high,” he says. Plus, the network’s small size meant it completed the tests without a steep computing cost. “Everyone talks about scaling up their network,” says Hasani. “We want to scale down, to have fewer but richer nodes.”

Hasani plans to keep improving the system and ready it for industrial application. “We have a provably more expressive neural network that is inspired by nature. But this is just the beginning of the process,” he says. “The obvious question is how do you extend this? We think this kind of network could be a key element of future intelligence systems.”

Silicon Anode Nanostructure Generates New Potential for Lithium-Ion Batteries

Scientists reveal a new nanostructure that could revolutionize technology in batteries and beyond.

- New research has identified a nanostructure that improves the anode in lithium-ion batteries
- Instead of using graphite for the anode, the researchers turned to silicon: a material that stores more charge but is susceptible to fracturing
- The team made the silicon anode by depositing silicon atoms on top of metallic nanoparticles
- The resulting nanostructure formed arches, increasing the strength and structural integrity of the anode
- Electrochemical tests showed the lithium-ion batteries with the improved silicon anodes had a higher charge capacity and longer lifespan

New research conducted by the Okinawa Institute of Science and Technology Graduate University (OIST) has identified a specific building block that improves the anode in lithium-ion batteries. The unique properties of the structure, which was built using nanoparticle technology, are revealed and explained today (February 5, 2021) in Communications Materials.

Powerful, portable and rechargeable, lithium-ion batteries are crucial components of modern technology, found in smartphones, laptops and electric vehicles. In 2019, their potential to revolutionize how we store and consume power in the future, as we move away from fossil fuels, was notably recognized, with the Nobel Prize co-awarded to new OIST Board of Governors member, Dr. Akira Yoshino, for his work developing the lithium-ion battery.

Traditionally, graphite is used for the anode of a lithium-ion battery, but this carbon material has major limitations.

“When a battery is being charged, lithium ions are forced to move from one side of the battery — the cathode — through an electrolyte solution to the other side of the battery — the anode. Then, when a battery is being used, the lithium ions move back into the cathode and an electric current is released from the battery,” explained Dr. Marta Haro, a former researcher at OIST and first author of the study. “But in graphite anodes, six atoms of carbon are needed to store one lithium ion, so the energy density of these batteries is low.”

With science and industry currently exploring the use of lithium-ion batteries to power electric vehicles and aerospace craft, improving energy density is critical. Researchers are now searching for new materials that can increase the number of lithium ions stored in the anode.

One of the most promising candidates is silicon, which can bind four lithium ions for every one silicon atom.

“Silicon anodes can store ten times as much charge in a given volume as graphite anodes — a whole order of magnitude higher in terms of energy density,” said Dr. Haro. “The problem is, as the lithium ions move into the anode, the volume change is huge, up to around 400%, which causes the electrode to fracture and break.”
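The graphite and silicon figures quoted here can be checked with a standard back-of-the-envelope electrochemistry estimate (not taken from the paper itself): the theoretical specific capacity is Q = n·F/(3.6·M) in mAh/g, where n is the number of lithium ions stored per formula unit, F is the Faraday constant, and M is the molar mass of the host. The 3.75 Li per Si used below corresponds to the fully lithiated Li15Si4 phase, the usual basis for the “four lithium ions per silicon atom” rounding:

```python
# Theoretical specific capacity Q = n * F / (3.6 * M)  [mAh/g]
F = 96485.0  # Faraday constant, C/mol

def specific_capacity_mAh_per_g(n_li, molar_mass):
    """Capacity per gram of host material for n_li ions per formula unit."""
    return n_li * F / (3.6 * molar_mass)

# Graphite stores 1 Li per 6 carbons (LiC6): M = 6 * 12.011 g/mol
q_graphite = specific_capacity_mAh_per_g(1, 6 * 12.011)   # ~372 mAh/g

# Fully lithiated silicon (Li15Si4), 3.75 Li per Si: M = 28.086 g/mol
q_silicon = specific_capacity_mAh_per_g(3.75, 28.086)     # ~3580 mAh/g

print(round(q_graphite), round(q_silicon))  # prints: 372 3578
```

The roughly tenfold gap between the two numbers is the “order of magnitude” advantage Dr. Haro describes, and it follows directly from needing six carbon atoms per lithium ion versus nearly four lithium ions per silicon atom.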

The large volume change also prevents stable formation of a protective layer that lies between the electrolyte and the anode. Every time the battery is charged, this layer therefore must continually reform, using up the limited supply of lithium ions and reducing the lifespan and rechargeability of the battery.

“Our goal was to try and create a more robust anode capable of resisting these stresses, that can absorb as much lithium as possible and ensure as many charge cycles as possible before deteriorating,” said Dr. Grammatikopoulos, senior author of the paper. “And the approach we took was to build a structure using nanoparticles.”

In a previous paper, published in 2017 in Advanced Science, the now-disbanded OIST Nanoparticles by Design Unit developed a cake-like layered structure, where each layer of silicon was sandwiched between tantalum metal nanoparticles. This improved the structural integrity of the silicon anode, preventing over-swelling.

While experimenting with different thicknesses of the silicon layer to see how it affected the material’s elastic properties, the researchers noticed something strange.

“There was a point at a specific thickness of the silicon layer where the elastic properties of the structure completely changed,” said Theo Bouloumis, a current PhD student at OIST who was conducting this experiment. “The material became gradually stiffer, but then quickly decreased in stiffness when the thickness of the silicon layer was further increased. We had some ideas, but at the time, we didn’t know the fundamental reason behind why this change occurred.”

Now, this new paper finally provides an explanation for the sudden spike in stiffness at one critical thickness.

Through microscopy techniques and computer simulations at the atomic level, the researchers showed that as the silicon atoms are deposited onto the layer of nanoparticles, they don’t form an even and uniform film. Instead, they form columns in the shape of inverted cones, growing wider and wider as more silicon atoms are deposited. Eventually, the individual silicon columns touch each other, forming a vaulted structure.

“The vaulted structure is strong, just like an arch is strong in civil engineering,” said Dr. Grammatikopoulos. “The same concept applies, just on a nanoscale.”

Importantly, the increased strength of the structure also coincided with enhanced battery performance. When the scientists carried out electrochemical tests, they found that the lithium-ion battery had an increased charge capacity. The protective layer was also more stable, meaning the battery could withstand more charge cycles.

These improvements are only seen at the precise moment that the columns touch. Before this moment occurs, the individual pillars are wobbly and so cannot provide structural integrity to the anode. And if silicon deposition continues after the columns touch, it creates a porous film with many voids, resulting in a weak, sponge-like behavior.

This revelation of the vaulted structure and how it gains its unique properties not only marks an important step toward the commercialization of silicon anodes in lithium-ion batteries, but also has many other potential applications within materials science.

“The vaulted structure could be used when materials are needed that are strong and able to withstand various stresses, such as for bio-implants or for storing hydrogen,” said Dr. Grammatikopoulos. “The exact type of material you need — stronger or softer, more flexible or less flexible — can be precisely made, simply by changing the thickness of the layer. That’s the beauty of nanostructures.”

Harvard Scientists’ Trilayer Graphene Breakthrough Opens the Door for High Temperature Superconductors

In 2018, the physics world was set ablaze with the discovery that when an ultrathin layer of carbon, called graphene, is stacked and twisted to a “magic angle,” that new double-layered structure converts into a superconductor, allowing electricity to flow without resistance or energy waste. Now, in a literal twist, Harvard scientists have expanded on that superconducting system by adding a third layer and rotating it, opening the door for continued advancements in graphene-based superconductivity.

The work is described in a new paper in Science and could one day help lead to superconductors that operate at higher temperatures, even close to room temperature. These superconductors are considered the holy grail of condensed matter physics since they would allow for tremendous technological revolutions in many areas including electricity transmission, transportation, and quantum computing. Most superconductors today, including the double-layered graphene structure, work only at ultracold temperatures.

“Superconductivity in twisted graphene provides physicists with an experimentally controllable and theoretically accessible model system where they can play with the system’s properties to decode the secrets of high temperature superconductivity,” said one of the paper’s co-lead authors, Andrew Zimmerman, a postdoctoral researcher working in the lab of Harvard physicist Philip Kim.

Graphene is a one-atom-thick layer of carbon atoms that is 200 times stronger than steel yet is extremely flexible and lighter than paper. It has long been known to be a good conductor of heat and electrical current but is notoriously difficult to handle. Experiments unlocking the puzzle of twisted bilayer graphene have been ongoing since MIT physicist Pablo Jarillo-Herrero and his group pioneered the emerging field of “twistronics” with their experiment in 2018, in which they produced the graphene superconductor by twisting it to a magic angle of 1.1 degrees.

The Harvard scientists report successfully stacking three sheets of graphene and then twisting each of them at that magic angle to produce a three-layered structure that is not only capable of superconductivity but does so more robustly and at higher temperatures than many double-layered graphene structures. The new and improved system is also sensitive to an externally applied electric field, which allows the researchers to tune the level of superconductivity by adjusting the strength of that field.

“It enabled us to observe the superconductor in a new dimension and provided us with important clues about the mechanism that’s driving the superconductivity,” said the study’s other lead author Zeyu Hao, a Ph.D. student in the Graduate School of Arts and Sciences also working in the Kim Group.

One of those mechanisms has the theorists really excited. The trilayer system showed evidence that its superconductivity is due to strong interactions between electrons as opposed to weak ones. If true, this could not only help open a path to high temperature superconductivity but also point to possible applications in quantum computing.

“In most conventional superconductors, electrons move with a high speed and occasionally cross-paths and influence each other. In this case, we say their interaction effects are weak,” said Eslam Khalaf, a co-author on the study and postdoctoral fellow working in the lab of Harvard physics professor Ashvin Vishwanath. “While weakly interacting superconductors are fragile and lose superconductivity when heated to a few Kelvins, strong coupling superconductors are much more resilient but much less understood. Realizing strong coupling superconductivity in a simple and tunable system such as trilayer could pave the way to finally develop a theoretical understanding of strongly-coupled superconductors to help realize the goal of a high temperature, maybe even room temperature, superconductor.”

The researchers plan on continuing to explore the nature of this unusual superconductivity in further studies.

“The more we understand, the better chance we have to increase the superconducting transition temperatures,” said Kim.

Atomic Design for a Carbon-Free Planet – “To Help Save Earth, Essentially”

For much of his career, Ju Li thrived on the theoretical aspects of his work, which investigated how manipulating and restructuring materials at the atomic scale could yield surprising and useful new macroscale properties. This research, which he began in 1994 as a graduate student at MIT, was situated at “the interface between the known and unknown,” says Li PhD ’00, the Battelle Energy Alliance Professor of Nuclear Science and Engineering (NSE) and professor of Materials Science and Engineering. “There was a kind of uncertainty in doing research that was very attractive to me, almost addictive.”

Li’s work modeling the positions of atoms “the way Newton tracked trajectories of planets,” he says, was a form of deep play: “The science was fascinating, and I was having a lot of fun doing simulations about electrons, atoms and defects,” he says.

But beginning in 2011, after he returned to MIT as a faculty member, Li began questioning his goals. “As one gets older, just doing theory and talking about science is not enough,” he says. “I had known since the late 1990s that climate change was a problem, and I came to realize there was a lot I could and should do personally to contribute.”

Li recognized that his years of microstructural material simulations provided a robust platform for exploring energy solutions to help address climate change. He launched an experimental program in his lab, and, he says, “I became more engineering-focused.”

The result: a gusher of advances in materials with applications in nuclear energy, batteries, and energy conversion, with significant near- and long-term implications for decarbonizing the planet. The breadth of his work, captured in hundreds of journal articles — 45 in 2020 alone — has earned Li recognition, including fellowship in the Materials Research Society and the American Physical Society, and, just last November, election as a fellow of the American Association for the Advancement of Science.

But what drives all this productivity “is feeling the pressure of time,” says Li, who has launched what amounts to an ambitious campaign “to help save Earth, essentially.”

Researching A+B
As a way of organizing his own burgeoning energy research portfolio, and establishing a model for the larger research community, Li has embraced a two-part, “A+B” approach:

“‘A’ is for action, which means rapidly scaling up proven technologies such as nuclear power and battery energy storage that we know can work at the terawatt scale required to reduce CO2 emissions drastically before mid-century,” says Li. “‘B’ is for baby technologies, like advanced fission and fusion reactors, and quantum computing, new technologies that we must nurture today so that they are ready in 20 to 30 years.”

Earth is catching fire, Li believes, and it’s important to direct the full force of scalable technologies at the conflagration right now. “You put out the fire by 2050, slow down the slope of CO2 and temperature rise, then bring in cleaner, more advanced energy systems to scale,” he says.

To underscore his commitment to this approach, Li last year launched the Applied Energy Symposium: MIT A+B, showcasing the most promising materials and technologies for immediate and future energy impacts.

Li’s own A+B research draws on his deep expertise in materials theory, modeling, and microstructural science. For more than a decade, he has been investigating innovative applications for elastic strain engineering, a technique that puts huge tensile and shear mechanical stresses on the lattice-like atomic structure of certain materials in order to generate novel optical, electrical, thermal, catalytic, and other properties. This approach first emerged in the 1990s, when researchers strained a silicon crystal lattice by 1 percent beyond its original state, permitting electrons to travel faster through the material and setting the stage for better lasers and transistors.

Li’s group has broken past previous elastic strain limits, unleashing more potential in materials. Among other accomplishments, his team can strain silicon beyond 10 percent and diamond beyond 7 percent, paving the way for much faster semiconductors. They have developed better catalysts for hydrogen fuel cells, and for the energy conversions required to turn the electricity from solar, wind, and nuclear energy into chemical fuels that can be stored. Li’s team has also demonstrated strain-engineered superconductors. “These strained metallic conductors could significantly improve superconducting magnets, as well as efficient, long-range power transmission,” he says.

Nanocircuitry and beyond
In another application of strain engineering, Li and his collaborators were able to stretch micron-sized, uniformly shaped structures out of industrial diamond material, deploying microfabricated grippers triggered by microelectromechanical systems. These structures, which Li calls microbridges, have unique electrical properties and can be massively replicated. “We can put gazillions of these microbridges onto wafers, and each of these bridges can host thousands of transistors,” Li says. “We hope they could prove useful in power electronics for solar photovoltaics.”

This work in nanocircuitry is part of Li’s broader efforts in advanced computing, which incorporate a range of engineering techniques. For instance, his lab has learned how to manipulate single atoms with great precision, employing highly focused electron beams. “We can dribble and shoot the atom, like a soccer ball, controlling its direction and energy,” says Li. It is research he hopes will advance quantum information processing, boosting many domains of engineering including A+B technologies.

In parallel with this advanced computing work, Li is forging ahead with critical energy applications, aided by in situ transmission electron microscopy, machine learning, and electronic structure modeling. One current project: designing safe and powerful all-solid-state batteries, using honeycomb-shaped nanostructures that remain stable in contact with highly corrosive lithium metal.

In the nuclear energy arena, Li is developing robust metallic nanocomposites, strengthened with carbon nanotubes and nanowires, that can survive high-dose radiation and high temperatures; 3D printing of refractory alloys; and materials crafted from ceramic-zirconium crystal that could serve as a thermal superinsulator, withstanding heat up to 1,400 degrees Celsius. He is also crafting processes for removing radioactive gases and liquids when treating spent nuclear fuel, in an attempt to “fully close the nuclear fuel cycle,” says Li.

To top off this flood of research, Li is co-directing the MIT Energy Initiative’s Low-Carbon Energy Center for Materials in Energy and Extreme Environments, with NSE Professor Bilge Yildiz.

From theory to device
As the child of two engineers who built nuclear power plants in China, Li always felt comfortable with nuclear energy and other sophisticated energy technologies. But he loved computer programming and theoretical physics, and never viewed himself as an engineer.

It was through his MIT mentor, Professor Emeritus Sidney Yip, who spanned the fields of material science and nuclear science, that Li first glimpsed the nearly unlimited potential of working with materials. “This totally shaped me as a scientist,” he says. “I found out both how ignorant I was, and how interdisciplinary research could be.”

After nine years away from MIT “learning the ropes” in other universities, Li had the tools in hand, and new resolve, to start “coming up with more and more relevant material solutions to climate change problems,” he says. “Going from computer simulations all the way to actual devices is now what I love to do.”

With three children, Li finds himself increasingly preoccupied by the urgency of his mission. “I would like to see some of my discoveries and inventions being exponentially replicated, really used by people,” he says. “My dream is to see us carbon-free, and improving lives around the globe.”

A Black Computing Pioneer Takes His Place in Technology History at MIT

A brief history of a 1950s photo featuring Joseph Thompson, one of the original operators of MIT’s groundbreaking Whirlwind computer.

The caption on a black-and-white photo reads, in part: “In 1951, high school graduate Joe Thompson, 18, was trained as one of the first two computer operators. The computer was the Whirlwind, the prototype for the SAGE air defense system.”

MIT’s Whirlwind was one of the earliest high-speed digital computers, and Thompson played a key role in its operation at the start of his decades-long career in computing. With help from Deborah Douglas, director of collections at the MIT Museum, David Brock of the Computer History Museum recently caught up with Thompson, the first person trained as a Whirlwind operator at the MIT Digital Computer Laboratory, to learn more about his time with the project and his subsequent years as a leader in the computing industry.

“They at MIT were looking for bright, young kids who were not going to college,” Thompson told Brock. “I was the first [operator] to see if it would work, and I guess it worked well. … You had to learn the whole system, and you’d get to the point where you understand what they’re doing.”

Also seen in the photo is system programmer John “Jack” Gilmore. According to a publication from the Computer History Museum, “It had been Jack Gilmore of the Whirlwind project, famous for his software contributions, who had been key to bringing Joe Thompson into the project in an MIT push to meet the demands for skilled staff by recruiting from local high schools those students who were academically and socially exceptional, but for whom, for whatever reasons, college was inaccessible.”

After Whirlwind, Thompson accepted a job with RAND as a programmer working on the SAGE air defense system software. He transferred to California with the company, and his group eventually spun off into the non-profit System Development Corporation. Thompson retired in the 1990s after four decades in computing.

Gilmore would go on to work in advanced computing research at MIT Lincoln Laboratory before starting his own firm and spending the rest of his career in the computing industry. He died in 2015.