[Home]  [Headlines]  [Latest Articles]  [Latest Comments]  [Post]  [Sign-in]  [Mail]  [Setup]  [Help]  [Register] 




Science/Tech

Title: Physicists make the case that our brains' learning is controlled by entropy
Source: [None]
URL Source: [None]
Published: Feb 8, 2017
Author: FIONA MACDONALD
Post Date: 2017-02-08 21:17:51 by Tatarewicz
Keywords: None
Views: 37

ScienceAlert... The way our brains learn new information has puzzled scientists for decades - we encounter so much new information daily, so how do our brains store what's important and forget the rest more efficiently than any computer we've built?

It turns out that this could be controlled by the same laws that govern the formation of the stars and the evolution of the Universe, because a team of physicists has shown that, at the neuronal level, the learning process could ultimately be limited by the laws of thermodynamics.

"The greatest significance of our work is that we bring the second law of thermodynamics to the analysis of neural networks," lead researcher Sebastian Goldt from the University of Stuttgart in Germany told Lisa Zyga from Phys.org.

The second law of thermodynamics is one of the most famous physics laws we have, and it states that the total entropy of an isolated system always increases over time.

Entropy is a thermodynamic quantity that's often referred to as a measure of disorder in a system. What that means is that, without extra energy being put into a system, transformations can't be reversed - things get progressively more disordered, because disordered arrangements vastly outnumber ordered ones.
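Disorder in this sense can be made concrete with Shannon entropy, the information-theoretic counterpart of thermodynamic entropy: a spread-out, "disordered" probability distribution has higher entropy than a sharply peaked one. A minimal sketch (the distributions are made up for the example):

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p))."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A sharply peaked ("ordered") distribution vs a uniform ("disordered") one.
peaked = [0.97, 0.01, 0.01, 0.01]
uniform = [0.25, 0.25, 0.25, 0.25]

print(shannon_entropy(peaked))   # low entropy, well under 2 bits
print(shannon_entropy(uniform))  # 2.0 bits, the maximum for 4 outcomes
```

The uniform distribution maximises entropy because no outcome is privileged - the same intuition behind disordered states being overwhelmingly more likely.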

The steady increase of entropy is currently the leading explanation for why the arrow of time only ever marches forwards. The second law of thermodynamics says that you can't un-crack an egg, because doing so would lower the Universe's entropy, and for that reason there will always be a distinct future and past.

But what does this have to do with the way our brains learn? Just like the bonding of atoms and the arrangement of gas particles in stars, our brains are designed to find the most efficient way to organise themselves.

"The second law is a very powerful statement about which transformations are possible - and learning is just a transformation of a neural network at the expense of energy," Goldt explained to Zyga.

If you keep in mind that learning, in its simplest form, is driven by billions of neurons firing inside our brains, then finding patterns in that energy output becomes a little easier.

To model how this works, Goldt and his team set up a neural network - a computer system that models the activity of neurons in the human brain.

"Virtually every organism gathers information about its noisy environment and builds models from those data, mostly using neural networks," the team writes in Physical Review Letters.

What the researchers were looking for is how neurons filter out the noise, and only respond to important sensory input.

They based their models on something called Hebbian theory, which explains how neurons adapt during the learning process. It's often summarised by the saying "cells that fire together, wire together" - which basically means that, as cells get better at firing in certain patterns, the resulting thoughts get more reinforced in our brains.
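As a rough illustration (not the team's actual model), a Hebbian update can be sketched in a few lines of Python: each connection strengthens in proportion to how strongly the neurons on both of its ends fire together. The learning rate and activity values here are invented for the example.

```python
def hebbian_update(weights, pre, post, eta=0.1):
    """Strengthen each weight in proportion to the product of
    presynaptic and postsynaptic activity: dw = eta * pre * post."""
    return [
        [w + eta * x * y for w, x in zip(row, pre)]
        for row, y in zip(weights, post)
    ]

weights = [[0.0, 0.0], [0.0, 0.0]]  # 2 inputs -> 2 outputs, initially unwired
pre = [1.0, 0.0]   # first input neuron fires, second stays silent
post = [1.0, 1.0]  # both output neurons fire

weights = hebbian_update(weights, pre, post)
# Only connections from the active input neuron strengthen:
# weights == [[0.1, 0.0], [0.1, 0.0]]
```

Cells that fire together get a bigger weight between them; connections involving a silent neuron are left untouched, so repeated co-activation carves the pattern into the network.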

Using this model, the team showed that learning efficiency was constrained by the total entropy production of a neural network.

They found that the slower a neuron learns, the less heat and entropy it produces, which increases its efficiency.
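That trade-off can be illustrated with a toy example (again, not the paper's model): a single weight relaxing toward a target value by gradient descent. Using the summed squared weight changes as a crude stand-in for dissipated energy, a smaller learning rate takes more steps to converge but "dissipates" less in total.

```python
def relax(eta, target=1.0, steps=10_000):
    """Drive one weight toward `target`; tally a dissipation proxy."""
    w, dissipation = 0.0, 0.0
    for _ in range(steps):
        dw = eta * (target - w)   # gradient step toward the target
        dissipation += dw * dw    # crude proxy for dissipated energy
        w += dw
    return w, dissipation

for eta in (0.9, 0.1, 0.01):
    w, d = relax(eta)
    print(f"eta={eta:<5} final w={w:.4f}  dissipation proxy={d:.4f}")
```

All three learning rates end up at the same weight, but the dissipation proxy shrinks as the learning rate does - a cartoon of the slower-is-thermodynamically-cheaper result.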

What does that mean for you and me? Unfortunately, the result doesn't tell us a whole lot about how to learn better or smarter.

It also doesn't provide any magical solutions for how to create computers that can learn as efficiently as the human brain - these particular results can only be applied to simple learning algorithms that don't use feedback.

But what the researchers have done is put a new perspective on the study of learning, and provided evidence that our neurons follow the same thermodynamic laws as the rest of the Universe.

They're not the first ones to think about our brains in terms of thermodynamics, either.

Last year, a team from France and Canada proposed that consciousness could simply be a side effect of entropy, and our brains organising themselves in the most efficient manner.

"We find a surprisingly simple result: normal wakeful states are characterised by the greatest number of possible configurations of interactions between brain networks, representing highest entropy values," they wrote at the time.

We're still a long way off understanding how our brains work - and these are just two studies out of many that have tried to identify why our neurons connect and function the way they do.

But every new clue takes us closer to unlocking the keys to our brains' enormous power - and hopefully learning how to harness that in artificial systems.

"Having a thermodynamic perspective on neural networks gives us a new tool to think about their efficiency and gives us a new way to rate their performance," Goldt told Zyga.

The research has been published in Physical Review Letters, and you can read the full paper online at arxiv.org/pdf/1611.09428.pdf
