Science/Tech

Title: Physicists make the case that our brains' learning is controlled by entropy
Source: [None]
URL Source: [None]
Published: Feb 8, 2017
Author: FIONA MACDONALD
Post Date: 2017-02-08 21:17:51 by Tatarewicz
Keywords: None
Views: 44

ScienceAlert... The way our brains learn new information has puzzled scientists for decades: we come across so much new information daily, so how do our brains store what's important and forget the rest more efficiently than any computer we've built?

It turns out that this could be controlled by the same laws that govern the formation of stars and the evolution of the Universe, because a team of physicists has shown that, at the neuronal level, the learning process could ultimately be limited by the laws of thermodynamics.

"The greatest significance of our work is that we bring the second law of thermodynamics to the analysis of neural networks," lead researcher Sebastian Goldt from the University of Stuttgart in Germany told Lisa Zyga from Phys.org.

The second law of thermodynamics is one of the most famous physics laws we have, and it states that the total entropy of an isolated system always increases over time.

Entropy is a thermodynamic quantity that's often referred to as a measure of disorder in a system. What that means is that, without extra energy being put into a system, transformations can't be reversed - things are going to get progressively more disordered, because disordered arrangements vastly outnumber ordered ones, making disorder overwhelmingly the more probable outcome.

Entropy is currently the leading hypothesis for why the arrow of time only ever marches forwards. The second law of thermodynamics says that you can't un-crack an egg, because it would lower the Universe's entropy, and for that reason, there will always be a future and a past.
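
To see why disorder wins, it helps to count arrangements. Here's a minimal Python sketch (an illustration, not from the paper) that counts the microstates W for different ways of splitting gas particles between the two halves of a box, and computes the Boltzmann entropy S = ln W:

    # Boltzmann entropy: S = k_B * ln(W), where W counts the microstates
    # consistent with a macrostate; we work in units where k_B = 1.
    from math import comb, log

    N = 100  # total gas particles in the box

    for n_left in (0, 25, 50):  # how many particles sit in the left half
        W = comb(N, n_left)     # ways to choose which particles those are
        print(f"{n_left:2d} left / {N - n_left:3d} right: "
              f"W = {W:.3e}, S = {log(W):.1f}")

The 50/50 split has about 10^29 times as many microstates as the all-on-one-side arrangement, so random motion is overwhelmingly likely to carry the system toward the higher-entropy state - that statistical weight, not any preference of nature, is what the second law captures.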

But what does this have to do with the way our brains learn? Just like the bonding of atoms and the arrangement of gas particles in stars, our brains are designed to find the most efficient way to organise themselves.

"The second law is a very powerful statement about which transformations are possible - and learning is just a transformation of a neural network at the expense of energy," Goldt explained to Zyga.

If you keep in mind that learning in its simplest form is controlled by billions of neurons firing inside our brains, then finding patterns in that energy output becomes a little easier.

To model how this works, Goldt and his team set up a neural network - a computer system that models the activity of neurons in the human brain.

"Virtually every organism gathers information about its noisy environment and builds models from those data, mostly using neural networks," the team writes in Physical Review Letters.

What the researchers were looking for was how neurons filter out noise and respond only to important sensory input.
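
The paper's model is more sophisticated than this, but a toy thresholded neuron illustrates the basic filtering idea: noise alone rarely drives the cell above its firing threshold, while a genuine signal does so reliably. (The weight, threshold, and noise level below are illustrative assumptions, not the authors' parameters.)

    import random

    def spikes(signal, weight=1.0, threshold=0.5, noise_std=0.2, trials=1000):
        """Count how often a toy neuron fires: it spikes only when the
        weighted input plus Gaussian noise exceeds its threshold."""
        return sum(
            weight * signal + random.gauss(0.0, noise_std) > threshold
            for _ in range(trials)
        )

    print(spikes(signal=0.0))  # noise alone: roughly 6 spikes per 1000 trials
    print(spikes(signal=1.0))  # real input: roughly 994 spikes per 1000 trials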

They based their models on something called Hebbian theory, which explains how neurons adapt during the learning process. It's often summarised by the saying "cells that fire together, wire together" - which basically means that connections between neurons that repeatedly fire together are strengthened, so well-practised patterns of activity become easier and easier to trigger.
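
In code, the simplest version of that rule just nudges a connection weight upward whenever the input and output neurons are active at the same time. A minimal sketch (the learning rate and activity values are illustrative, not taken from the paper):

    def hebbian_update(w, x, y, lr=0.1):
        """'Fire together, wire together': strengthen the weight w in
        proportion to the coincidence of presynaptic activity x and
        postsynaptic activity y."""
        return w + lr * x * y

    w = 0.0
    for _ in range(10):    # ten episodes of joint firing
        w = hebbian_update(w, x=1.0, y=1.0)
    print(round(w, 3))     # 1.0: repeated co-activation built a strong link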

Using this model, the team showed that learning efficiency was constrained by the total entropy production of a neural network.

They noticed that the slower a neuron learns, the less heat and entropy it produces, which increases its efficiency.
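
Schematically, the constraint has the shape of a second-law bound: the information a network gains about its inputs must be paid for in total entropy production, and a slow, nearly reversible learner wastes the least. The following is a paraphrase of that idea in units where Boltzmann's constant k_B = 1; the precise definitions and the exact form of the bound are in the paper itself:

    \Delta S_{\text{tot}} \;=\; \Delta S_{\text{network}} + \Delta S_{\text{environment}} \;\ge\; 0,
    \qquad
    \eta \;=\; \frac{I_{\text{learned}}}{\Delta S_{\text{tot}}} \;\le\; 1

On this picture, fast learning is like compressing a gas quickly: it gets the job done but dissipates extra heat, pushing the efficiency well below its limit - which matches the observation that slower learning produces less entropy.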

What does that mean for you and me? Unfortunately, the result doesn't tell us a whole lot about how to learn better or smarter.

It also doesn't provide any magical solutions for how to create computers that can learn as efficiently as the human brain - these particular results can only be applied to simple learning algorithms that don't use feedback.

But what the researchers have done is offer a new perspective on the study of learning, and provide evidence that our neurons follow the same thermodynamic laws as the rest of the Universe.

They're not the first ones to think about our brains in terms of thermodynamics, either.

Last year, a team from France and Canada proposed that consciousness could simply be a side effect of entropy, and our brains organising themselves in the most efficient manner.

"We find a surprisingly simple result: normal wakeful states are characterised by the greatest number of possible configurations of interactions between brain networks, representing highest entropy values," they wrote at the time.

We're still a long way from understanding how our brains work - and these are just two studies out of many that have tried to identify why our neurons connect and function the way they do.

But every new clue takes us closer to unlocking the keys to our brains' enormous power - and hopefully learning how to harness that in artificial systems.

"Having a thermodynamic perspective on neural networks gives us a new tool to think about their efficiency and gives us a new way to rate their performance," Goldt told Zyga.

The research has been published in Physical Review Letters, and you can read the full paper online at arxiv.org/pdf/1611.09428.pdf
