Neural Nets

[last updated: 2024-05-19]
-----


      On This Page:
  • Neurons
  • Synapses
  • Networks
  • Learning

---------------------------------------------

  • Neurons:
    • A neuron is a processing element. It takes input from other neurons (through synapse connections),
      and sends an output to some number of "downstream" neurons through connecting synapses.
      • All downstream neurons receive the same output, though each copy is individually weighted by its connecting synapse.
    • A given neuron has a "threshold" parameter. When the weighted sum of all its inputs exceeds that threshold, it fires (sends an output).
    • There must be a delay/hysteresis: once a neuron fires, it cannot fire again for some period of time.
      During that time it continues to accumulate its inputs, and when the delay expires, if the weighted total of its inputs exceeds its threshold, it fires again.
      It may be that, when the delay expires and the neuron can fire, the accumulated inputs far exceed its minimum threshold, in which case its output will be proportional to that total accumulated input.

    • In fact, in biological brains, individual neurons have a "life span" and die after a time.
      In addition, new neurons are continuously being "born" or created.
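  The firing behavior described above can be sketched in code. This is a toy model under stated assumptions (discrete time steps, a refractory delay counted in steps, output proportional to the accumulated input at firing time); the class name, method names, and default values are illustrative, not any standard API.

  ```python
  # Toy model of the neuron described above: inputs accumulate, and the
  # neuron fires only when the weighted sum exceeds its threshold AND its
  # refractory delay has elapsed. Output is proportional to the total
  # accumulated input. All names and defaults are illustrative assumptions.

  class Neuron:
      def __init__(self, threshold=1.0, refractory_steps=3):
          self.threshold = threshold
          self.refractory_steps = refractory_steps  # delay after firing
          self.accumulated = 0.0                    # weighted input total
          self.cooldown = 0                         # steps until it may fire again

      def receive(self, weighted_input):
          """Add one (already synapse-weighted) input to the accumulator."""
          self.accumulated += weighted_input

      def step(self):
          """Advance one time step; return the output (0.0 if not firing)."""
          if self.cooldown > 0:
              self.cooldown -= 1
              return 0.0
          if self.accumulated > self.threshold:
              output = self.accumulated          # proportional to accumulated input
              self.accumulated = 0.0
              self.cooldown = self.refractory_steps
              return output
          return 0.0
  ```

  For example, a neuron with threshold 1.0 stays silent after receiving 0.6, fires once a further 0.7 arrives, and then stays silent for its refractory period even if large inputs keep accumulating.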

  • Synapses:
    • Synapses are the connections/conduits between neurons.
    • Synapses have a "weight", or multiplication parameter. The net signal they transmit to their downstream neuron
      is the value of their input (from their upstream neuron) multiplied by their "weight" parameter.
      The weight parameter may be positive or negative.
    • Each synapse has an inherent delay.
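  A synapse as described (a weight, possibly negative, plus an inherent delay) might be sketched like this, with the delay modeled as a FIFO queue of in-transit signals; the class name and defaults are illustrative assumptions.

  ```python
  from collections import deque

  # Toy synapse: multiplies its input by a (possibly negative) weight and
  # delivers it after a fixed delay, modeled as a FIFO queue of pending
  # signals. Names and defaults are illustrative assumptions.

  class Synapse:
      def __init__(self, weight, delay_steps=1):
          self.weight = weight
          self.pending = deque([0.0] * delay_steps)  # in-transit signals

      def transmit(self, signal):
          """Push the upstream signal in; pop the delayed, weighted signal out."""
          self.pending.append(signal * self.weight)
          return self.pending.popleft()
  ```

  With weight -0.5 and a 2-step delay, a signal of 1.0 emerges two calls later as -0.5.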

  • Networks:
    • Many neurons are connected together in a network.
      "input neurons" are connected to inputs from external sensors/data.
      "output neurons" are connected to outputs/actions.
      "hidden neurons" are connected only to other neurons, not directly to any external inputs or outputs.
    • In "feedforward" networks, starting from input neurons, connections progress towards the output,
      through however many "layers" there are in the network.
    • "Recurrent" networks may have some neurons connected to themselves, or to other neurons in their same layer or in previous layers.
      This feature constitutes (functions as) a "memory".
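  A minimal feedforward pass, under the simple threshold-neuron model above: each layer computes weighted sums of the previous layer's outputs and applies a step activation. The activation, layout, and example weights are illustrative, not a trained network.

  ```python
  # Toy feedforward network: input neurons feed hidden neurons, which feed
  # output neurons. Each layer computes weighted sums passed through a
  # simple threshold (step) activation. Weights here are arbitrary
  # illustrative values, not learned ones.

  def step_activation(x, threshold=0.0):
      return 1.0 if x > threshold else 0.0

  def layer_forward(inputs, weight_matrix, threshold=0.0):
      """One layer: each row of weights connects all inputs to one neuron."""
      return [
          step_activation(sum(w * x for w, x in zip(row, inputs)), threshold)
          for row in weight_matrix
      ]

  def feedforward(inputs, layers):
      """Propagate inputs through a list of weight matrices, layer by layer."""
      signal = inputs
      for weights in layers:
          signal = layer_forward(signal, weights)
      return signal
  ```

  Here `layers` is a list of weight matrices, one per layer, so the signal progresses from the input layer towards the output through however many layers are supplied.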

  • Learning:
    • Online/real-time vs. batch:
      In online learning, connection weights are adjusted with each new data point.
      In batch learning, weights are adjusted only after an entire training set has been processed.
    • Hebbian Rule:
      "The weight between two neurons gets reinforced if the two neurons are active at the same time"
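      The rule quoted above amounts to a one-line weight update; the learning-rate parameter is an illustrative assumption.

      ```python
      # Hebbian update: the weight between two neurons is strengthened in
      # proportion to their simultaneous activity. The learning rate is an
      # illustrative assumption.

      def hebbian_update(weight, pre_activity, post_activity, learning_rate=0.1):
          """Reinforce the connection when both neurons are active together."""
          return weight + learning_rate * pre_activity * post_activity
      ```

      If either neuron is inactive, the product is zero and the weight is unchanged; if both are active, the weight grows.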

    • Back propagation:
      A method for training multi-layer networks: the error measured at the output layer is propagated backwards through the hidden layers, and each weight is adjusted in proportion to its contribution to that error.
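      A minimal sketch of back propagation on a tiny 2-2-1 sigmoid network, trained by per-sample gradient descent on the classic XOR problem. The architecture, learning rate, epoch count, and seed are illustrative, and a network this small is not guaranteed to escape local minima, so this is a demonstration of the mechanics rather than a reliable trainer.

      ```python
      import math
      import random

      # Sketch of backpropagation on a 2-input, 2-hidden, 1-output sigmoid
      # network: the output error is propagated backward through the hidden
      # layer, and each weight is nudged down its error gradient. Sizes,
      # learning rate, and the toy XOR data are illustrative assumptions.

      def sigmoid(x):
          return 1.0 / (1.0 + math.exp(-x))

      def train_xor(epochs=5000, lr=0.5, seed=0):
          rng = random.Random(seed)
          w1 = [[rng.uniform(-1, 1) for _ in range(2)] for _ in range(2)]  # input->hidden
          b1 = [0.0, 0.0]
          w2 = [rng.uniform(-1, 1) for _ in range(2)]                      # hidden->output
          b2 = 0.0
          data = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 0)]
          for _ in range(epochs):
              for x, target in data:
                  # forward pass
                  h = [sigmoid(w1[j][0] * x[0] + w1[j][1] * x[1] + b1[j]) for j in range(2)]
                  y = sigmoid(w2[0] * h[0] + w2[1] * h[1] + b2)
                  # backward pass: output error, then hidden errors
                  d_y = (y - target) * y * (1 - y)
                  d_h = [d_y * w2[j] * h[j] * (1 - h[j]) for j in range(2)]
                  # gradient-descent weight updates
                  for j in range(2):
                      w2[j] -= lr * d_y * h[j]
                      b1[j] -= lr * d_h[j]
                      for i in range(2):
                          w1[j][i] -= lr * d_h[j] * x[i]
                  b2 -= lr * d_y
          def predict(x):
              h = [sigmoid(w1[j][0] * x[0] + w1[j][1] * x[1] + b1[j]) for j in range(2)]
              return sigmoid(w2[0] * h[0] + w2[1] * h[1] + b2)
          return predict
      ```

      XOR is the standard example here because no single-layer network can represent it; the hidden layer, trained by the propagated error, is what makes it learnable.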

    • Reinforcement Learning:
      Learning driven by a reward/penalty signal rather than by explicit target outputs; weights are adjusted so as to increase the reward received over time.

    • Connectionist models:
      propose that each neuron represents (or is correlated to, or corresponds to) a specific "concept"
      and the state of the neuron amounts to the truth of that concept.
      The synaptic connections then represent constraints or dependencies between concepts.
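      That idea can be sketched as a tiny constraint network (in the spirit of Hopfield-style settling): units stand for concepts, unit states for the truth of those concepts, and symmetric weights for constraints between them. The concepts ("penguin", "bird", "flies") and the weight values below are invented for illustration.

      ```python
      # Toy connectionist constraint network: each unit represents a concept,
      # its state (0/1) that concept's truth, and each symmetric weight a
      # constraint (positive = mutually supporting, negative = conflicting).
      # Units are repeatedly updated to satisfy their net constraint support.
      # Concepts and weights are illustrative assumptions.

      CONCEPTS = ["penguin", "bird", "flies"]
      WEIGHTS = [
          [0, 2, -3],   # penguin: supports "bird", strongly conflicts with "flies"
          [2, 0, 1],    # bird: supports "penguin", weakly supports "flies"
          [-3, 1, 0],   # flies
      ]

      def settle(states, weights, biases, iterations=10):
          """Set each unit to 1 iff its bias plus weighted support is positive."""
          states = list(states)
          for _ in range(iterations):
              for i in range(len(states)):
                  net = biases[i] + sum(
                      weights[i][j] * states[j] for j in range(len(states)) if j != i
                  )
                  states[i] = 1 if net > 0 else 0
          return states
      ```

      Biases act as external evidence: strong evidence for "penguin" settles the network into "penguin and bird but not flies", while evidence for "bird" alone lets the weaker bird-flies constraint win.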