Neural Nets (Artificial Intelligence)
-----

[last updated: 2024-11-26]

    This is all pretty rough ...


      On This Page:
  • Introduction
  • Nodes (Neurons)
  • Synapses
  • Networks
  • Learning
  • Links/Refs

---------------------------------------------

  • Introduction:
    • A Neural Network (NN) is an assembly ("network") of connected components.
    • Components of a NN are Nodes (also called Neurons)
      and Synapses (sometimes called Edges or Weights).
    • The components are connected to each other in a specific pattern (network).
    • Once a NN is assembled, it is ready to receive inputs, process them, and send outputs.
    • Input signals are real numbers.
      There may be one or many input signals.
      A given input signal is sent into a node; any node that receives an input signal is thereby an "Input Node".

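    • As a concrete (purely illustrative) picture of the pieces named above, here is one way to lay
      them out in Python; the names and numbers are made up for the example, not taken from any
      particular library or source.

          # Illustrative sketch only: Nodes, weighted Synapses between them,
          # and real-number input signals fed to the designated Input Nodes.
          nodes = ["in1", "in2", "h1", "out1"]

          synapses = {                      # (from_node, to_node) -> weight
              ("in1", "h1"): 0.8,
              ("in2", "h1"): -0.3,
              ("h1", "out1"): 1.5,
          }

          inputs = {"in1": 0.25, "in2": 1.0}   # "in1" and "in2" are the Input Nodes
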
    ------------------------------------------------------

  • Nodes:
    • A Node is a processing element.
      It may have one or many input channels.
      It has one output channel.
      As a "processing element", it is encoded with a program or algorithm that performs calculations on the input signals it receives
      and then sends a single output signal.
      Its output signal may in fact be sent to several destinations, but all such destinations will receive the exact same output.
    • A given Node has a "threshold" parameter. When the weighted sum of all its inputs exceeds that threshold, it fires (sends an output).
    • There must be a delay/hysteresis, such that, once a Node fires, it cannot fire again for some period of time.
      During that time, it accumulates its inputs, and when that delay expires, if the weighted total of its inputs exceeds its threshold, it fires again.
      It may be that, when the delay expires and the Node can fire, the accumulated input is well beyond its minimum threshold, in which case its output will be proportional to that total accumulated input.
      (A rough sketch of such a node appears at the end of this section.)

    • In fact, in biological brains, individual neurons have a "life span" and die after a time.
      In addition, new neurons are continuously being "born" or created.
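
    • A rough Python sketch of the firing behavior described above (the class name, method names, and
      the time-step bookkeeping are my own assumptions about how to make the description concrete):

          class Node:
              """Toy threshold node: accumulates weighted inputs, fires when the total
              exceeds its threshold, then waits out a refractory delay before firing again."""

              def __init__(self, threshold=1.0, delay_steps=2):
                  self.threshold = threshold
                  self.delay_steps = delay_steps   # steps it must wait after firing
                  self.cooldown = 0                # steps remaining before it may fire again
                  self.accumulated = 0.0           # weighted input gathered so far

              def receive(self, signal, weight):
                  """Add one incoming signal, scaled by its synapse weight."""
                  self.accumulated += signal * weight

              def step(self):
                  """Advance one time step; return the output signal, or 0.0 if not firing."""
                  if self.cooldown > 0:
                      self.cooldown -= 1
                      return 0.0
                  if self.accumulated > self.threshold:
                      output = self.accumulated   # output proportional to accumulated input
                      self.accumulated = 0.0
                      self.cooldown = self.delay_steps
                      return output
                  return 0.0

      The single value returned by step() is what gets copied to every downstream destination.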

  • Synapses:
    • Synapses are the connections/conduits between Nodes.
    • Synapses have a "weight", or multiplication parameter. The net signal they transmit to their downstream neuron
      is the value of their input (from their upstream neuron) multiplied by their "weight" parameter.
      The weight parameter may be positive or negative.
    • Each synapse has an inherent delay.
      In the brain, chemical synapse delays may be several milliseconds, whereas electrical synapses are nearly instantaneous.
      Further, chemical synapses can either excite or inhibit the downstream neuron, while electrical synapses can only excite.
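
    • A matching Python sketch of a synapse as described above (modeling the delay as a short queue of
      signals in transit is my own assumption; only the weight multiplication and the sign convention
      come from the notes above):

          from collections import deque

          class Synapse:
              """Toy synapse: multiplies the upstream signal by its weight and delivers it
              downstream after a fixed number of time steps."""

              def __init__(self, weight, delay_steps=1):
                  self.weight = weight                        # positive = excite, negative = inhibit
                  self.pipeline = deque([0.0] * delay_steps)  # signals still in transit

              def transmit(self, upstream_signal):
                  """Push the new signal in; return whatever arrives downstream this step."""
                  self.pipeline.append(upstream_signal * self.weight)
                  return self.pipeline.popleft()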

  • Networks:
    • Many Nodes are connected together in a network.
      "input Nodes" are connected to inputs from external sensors/data.
      "output Nodes" are connected to outputs/actions.
      "hidden Nodes" are not connected directly to any external inputs or outputs; they connect only to other Nodes.
    • In "feedforward" networks, starting from input Nodes, connections progress towards the output,
      through however many "layers" there are in the network.
    • "Recurrent" networks may have some Nodes connected to themselves, or to other Nodes in their same layer or in previous layers.
      This feature constitutes (functions as) a "memory".
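
    • A small Python sketch of a feedforward pass through such layers (layer sizes, weights, and the
      simple zero-threshold firing rule are made-up examples, not anything prescribed above):

          def feedforward(inputs, layers):
              """inputs: list of real numbers, one per input Node.
              layers: list of weight matrices; layers[k][i][j] is the synapse weight from
              node i of one layer to node j of the next.  Each node fires 1.0 if its
              weighted input sum exceeds a threshold of 0.0, otherwise it stays at 0.0."""
              signal = inputs
              for weights in layers:
                  signal = [
                      1.0 if sum(signal[i] * weights[i][j] for i in range(len(signal))) > 0.0 else 0.0
                      for j in range(len(weights[0]))
                  ]
              return signal

          # Example: 2 input Nodes -> 2 hidden Nodes -> 1 output Node
          hidden_w = [[0.5, -1.0],
                      [0.5,  1.0]]
          output_w = [[1.0],
                      [-0.5]]
          print(feedforward([1.0, 0.0], [hidden_w, output_w]))   # -> [1.0]

      A recurrent network would additionally feed some of these node outputs back in as inputs on the
      next time step, which is what gives it the "memory" mentioned above.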

  • Learning:
    • Online/real-time vs. batch:
      In online learning, connection weights are adjusted with each new data point.
      In batch learning, weights are adjusted only after the entire training (data) set has been presented.
    • Hebbian Rule:
      "The weight between two Nodes gets reinforced if the two neurons are active at the same time"
      (a rough sketch of this rule appears at the end of this section).

    • Back propagation:
      A method for training multi-layer networks: the error measured at the output Nodes is propagated
      backwards through the layers, and each weight is adjusted in proportion to its contribution to that error.

    • Reinforcement Learning:
      Weights are adjusted based on a reward (or penalty) signal received after actions are taken,
      rather than on explicit target outputs.

    • Connectionist models:
      propose that each Node represents (or is correlated to, or corresponds to) a specific "concept"
      and the state of the Node amounts to the truth of that concept.
      The synaptic connections then represent constraints or dependencies between concepts.
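
    • A rough Python sketch of the Hebbian rule quoted above, applied online (one weight update per
      data point); the learning rate and the "both active" test are my own assumptions about how to
      make the rule concrete:

          def hebbian_online(samples, weight=0.0, learning_rate=0.1, active=0.5):
              """samples: sequence of (pre_activity, post_activity) pairs for two connected Nodes.
              The weight is reinforced whenever both Nodes are active at the same time.
              (A batch version would instead sum the updates over all samples and apply
              them once at the end.)"""
              for pre, post in samples:
                  if pre > active and post > active:   # both Nodes active at the same time
                      weight += learning_rate * pre * post
              return weight

          print(hebbian_online([(1.0, 1.0), (0.0, 1.0), (0.9, 0.8)]))   # about 0.172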

    ----------------------------------------------------------------------------

  • Links/Refs: