For AI: what we're learning about computations

scruffy

Diamond Member
Mar 9, 2022

The article suggests a parallel, and entirely different, form of computation for AI.

Traditionally, neural networks are linear-ish (feed-forward layers interleaved with pointwise nonlinearities), even the transformers underlying ChatGPT.

However, the linked article points to a controllable alternative form of computation based on nonlinearity and criticality.

Of immense interest are the "phase transitions" that happen in small regions of the network, which, when they couple, give rise to a global topology.

The same thing happens in ferromagnets and in biological cells.
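The ferromagnet analogy can be sketched numerically. Here is a minimal 2D Ising model with Metropolis dynamics (an assumption on my part; the article's exact model isn't quoted here): each spin flips based only on its four neighbors, yet below the critical temperature (about 2.27 in units of the coupling) those local interactions couple into a globally ordered state, while above it the order dissolves.

```python
import math
import random

def ising_magnetization(n=16, temperature=1.5, sweeps=400, seed=0):
    """Metropolis simulation of a 2D Ising model on an n x n lattice
    with periodic boundaries; returns |mean spin| after `sweeps` passes."""
    rng = random.Random(seed)
    spins = [[1] * n for _ in range(n)]  # cold start: all spins up
    beta = 1.0 / temperature
    for _ in range(sweeps):
        for _ in range(n * n):
            i, j = rng.randrange(n), rng.randrange(n)
            # sum of the four nearest neighbours (periodic wrap-around)
            nb = (spins[(i + 1) % n][j] + spins[(i - 1) % n][j]
                  + spins[i][(j + 1) % n] + spins[i][(j - 1) % n])
            dE = 2 * spins[i][j] * nb  # energy cost of flipping spin (i, j), J = 1
            # Metropolis rule: always accept downhill, uphill with prob e^(-beta*dE)
            if dE <= 0 or rng.random() < math.exp(-beta * dE):
                spins[i][j] = -spins[i][j]
    m = sum(sum(row) for row in spins) / (n * n)
    return abs(m)
```

Running it below the critical temperature (say 1.5) gives magnetization near 1, i.e. global order emerging from purely local coupling; well above it (say 5.0) the magnetization collapses toward 0. That loss and onset of global order is the phase transition in miniature.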

The reason it's of interest: I want to program my own personal AI myself. Everything needed is available online, including training datasets for natural language and the like.
 
Here is a concrete example:

First this: [image]

Then this: [image]
