The Science and Maths Behind How Your Brain Learns (and Unlearns)

May 8, 2020
in Neural Networks

Definitions of learning vary widely across disciplines, shaped largely by how each field measures it. At its core, learning can be defined as a process that results in a change in knowledge or behaviour as a result of experience. Many learning activities make use of the brain's reward system. You know you should not play with fire because you got burnt as a little kid. You know you will be very happy if someone gives you a box of chocolate (maybe not you, but definitely me). You can learn Norwegian more easily by building on your knowledge of Swedish. All of this "past experience" is the result of your brain taking in information and storing it in memory, where it can, hopefully, be applied to new situations, which then leads to an update of your current state of knowledge. Thus, learning and memory are strongly correlated, particularly declarative memory, which holds memories of facts (e.g. the name of the prime minister) and events (e.g. your hiking trip last summer).

https://human-memory.net/types-of-memory/

Recall, in short, that when an action potential from neuron A arrives at the synapse, it causes either an excitatory or an inhibitory response in the receiving neuron B, and this change can be measured as an Excitatory or Inhibitory Postsynaptic Potential (EPSP or IPSP). The synaptic strength is said to be stronger when it shows an increase in the EPSP, meaning that the postsynaptic neuron is more likely to fire an action potential.
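
To make this concrete, here is a minimal toy sketch (Python, with made-up weights and threshold, no biological detail) of how excitatory and inhibitory inputs could be summed to decide whether neuron B fires, and how a stronger synapse from neuron A makes firing more likely:

```python
# Toy sketch only: positive weights stand in for excitatory synapses (EPSPs),
# negative weights for inhibitory ones (IPSPs). Neuron B "fires" when the
# summed postsynaptic potential reaches a threshold. All numbers are made up.

def neuron_b_fires(presynaptic_spikes, synaptic_weights, threshold=1.0):
    """Return True if the summed postsynaptic potential reaches the threshold."""
    potential = sum(spike * weight
                    for spike, weight in zip(presynaptic_spikes, synaptic_weights))
    return potential >= threshold

spikes = [1, 1, 0]                                # which presynaptic neurons fired
print(neuron_b_fires(spikes, [0.4, 0.3, -0.5]))   # weak synapse from A  -> False
print(neuron_b_fires(spikes, [0.9, 0.3, -0.5]))   # stronger synapse from A -> True
```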

Neuroplasticity, or simply plasticity, is defined as the ability of the brain to physically change its connectivity and neuronal synaptic strength selectively. Just like plastic, your brain goes through a series of changes throughout life, forming new connections when it needs to. According to Gerstner (2011), different forms of learning are actually the result of dynamic changes in the strength of synapses. This goes back to the quote I wrote at the beginning of the article, "Neurons that fire together wire together", which is basically a rough summary of Hebbian theory. Forms of plasticity are differentiated by how long the stimulus from presynaptic neuron A can increase the EPSP, or simply put, how good neuron A is at exciting neuron B.

Simply put, synaptic plasticity can be divided into:

(1) Short-term Plasticity (STP): the increase in the EPSP lasts for only one or a few seconds.

(2) Long-term Potentiation (LTP): the increase can last much longer, from hours up to months. In other words, neuron A is good at exciting neuron B for a long period of time. Conversely, if neuron A suppresses neuron B for a long period of time, this is called Long-term Depression (LTD). Sometimes a stimulus only 3 seconds long can increase the EPSP for minutes or hours.

Gerstner (2011)
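
As a rough illustration of the timescales involved, the toy sketch below (illustrative numbers only, not a model from Gerstner 2011) contrasts a short-term boost that decays back to baseline within seconds with long-term changes that persist:

```python
import math

# Toy timescale contrast: a short-term boost fades within seconds, while
# long-term potentiation (LTP) and depression (LTD) persist. The baseline,
# boost sizes and time constant are all made up for illustration.
baseline = 1.0
stp_boost, ltp_boost, ltd_drop = 0.5, 0.5, -0.5
tau_stp = 2.0                      # short-term boost fades on a ~2 s timescale

for t in [0, 1, 5, 60, 3600]:      # seconds after the triggering stimulus
    stp = baseline + stp_boost * math.exp(-t / tau_stp)   # decays to baseline
    ltp = baseline + ltp_boost                             # persistent increase
    ltd = baseline + ltd_drop                              # persistent decrease
    print(f"t={t:>5}s  STP={stp:.2f}  LTP={ltp:.2f}  LTD={ltd:.2f}")
```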

Both LTP and LTD are thought to be the building blocks of how learning happens in the brain.

The majority of existing synaptic theories of learning today are, in some way, influenced by Hebbian learning, which arose from Hebbian theory, a theory that attempts to explain synaptic plasticity, introduced by Donald Hebb in 1949.

When an axon of cell A is near enough to excite cell B and repeatedly or persistently takes part in firing it, some growth process or metabolic change takes place in one or both cells such that A's efficiency, as one of the cells firing B, is increased.

Hebbian learning is powerful when it comes to studying the process of learning, since it implies that the connection between two neurons reflects their past correlated activity, and that the change in the strength of that connection represents the association. Let us use the example below:

Gerstner (2011): Representation of Hebbian Learning in Humans

Let’s say the person in the figure above sees a banana for the first time, and that he has 10,000 neurons in a network that have to work together to learn about this banana (let’s depict all those thousands of neurons as 4, 5, 6, and 9). Those neurons might be encoding the smell, shape, texture, taste, colour, or the environment associated with that banana. Now, to learn about this banana and to be able to store it in memory, neurons 4, 5, 6, and 9 have to be switched on together, and according to Hebbian learning, this co-activation will lead to a strengthening of their connections. At this point, the memory concept of “banana” has been formed.
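
A heavily simplified way to write down "fire together, wire together" is a weight update proportional to the product of pre- and postsynaptic activity. The sketch below uses a generic textbook form of the Hebbian rule, not code from Gerstner (2011); the learning rate and activity values are made up for illustration:

```python
import numpy as np

def hebbian_update(w, pre, post, lr=0.1):
    """One Hebbian step: each weight grows in proportion to the correlated
    activity of its presynaptic and postsynaptic neuron."""
    return w + lr * np.outer(post, pre)

# Two presynaptic neurons (A and C) and one postsynaptic neuron (B), all
# initially unconnected. Repeated co-activation of A and B strengthens
# their connection, while the silent neuron C stays disconnected.
w = np.zeros((1, 2))
for _ in range(5):
    pre = np.array([1.0, 0.0])   # neuron A fires, neuron C stays silent
    post = np.array([1.0])       # neuron B fires
    w = hebbian_update(w, pre, post)
print(w)  # weight A->B has grown to ~0.5, weight C->B remains 0
```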

Next, let’s say the person encounters a cue that resembles a banana the next day; maybe something with a similar smell, colour, or shape. The neurons that respond to such cues and are also part of the “banana” concept (e.g. neuron 5) will become active and fire action potentials to their neighbouring neurons. Those that were previously associated with the “banana” concept (4, 6, and 9) will also become activated and fire together in a cascade, thanks to the strengthened past connections, while the remaining irrelevant neighbours stay inactive, or less active, because their connections are weaker. Repeat this a few times (iteration) and you get a solid memory of a “banana” that can be retrieved.
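
To make the retrieval step concrete, here is a small sketch in the spirit of a Hopfield network: the "banana" neurons (4, 5, 6 and 9 in a toy population of ten) are wired together by Hebbian co-activation, and a single partial cue later reactivates the whole group. This is only an illustration of the idea under those assumptions, not the mechanism depicted in the figure:

```python
import numpy as np

n_neurons = 10
banana = np.full(n_neurons, -1.0)
banana[[4, 5, 6, 9]] = 1.0            # neurons co-active when seeing the banana

# Hebbian storage: neurons that fire together get a positive connection.
W = np.outer(banana, banana)
np.fill_diagonal(W, 0.0)

# Partial cue the next day: only neuron 5 (say, the smell) is driven.
cue = np.full(n_neurons, -1.0)
cue[5] = 1.0

# A few update iterations: each neuron adopts the sign of its summed input,
# so the strengthened connections pull neurons 4, 6 and 9 back on in a cascade.
state = cue.copy()
for _ in range(3):
    state = np.sign(W @ state)
print(np.where(state == 1.0)[0])      # -> [4 5 6 9]: the "banana" memory is retrieved
```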

Credit: BecomingHuman By: Puttatida Mahapattanakul
