Neural Networks (Foundations)
Neural networks are loosely inspired by the brain, but the analogy only goes so far. At their simplest, they're layers of tiny mathematical functions - called neurons - that each take some numbers in, do a small calculation, and pass the result forward. Stack enough of these layers together and something remarkable happens: the network can learn to recognise patterns far too complex for anyone to program by hand. The earliest neural networks had just a few layers and could handle simple tasks like recognising handwritten digits.

The key insight is that during training, the network adjusts its internal settings - called weights - until its outputs match the correct answers. Small networks have thousands of weights; the largest modern systems have billions. Each weight is a tiny dial, and learning is the process of turning all the dials until the system gets things right.

This basic principle underpins virtually every modern AI system, from chatbots to self-driving cars. If you're evaluating AI tools, it helps to know that neural networks aren't magic - they're pattern-matching machines whose quality depends entirely on their structure, their training data, and how carefully those weights were tuned.
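To make the "dials" idea concrete, here is a minimal sketch in plain Python: a single neuron (a weighted sum squashed through a sigmoid) trained to compute logical AND. Real systems adjust weights using gradients rather than the random nudging shown here, and the data, nudge size, and iteration count are all illustrative choices, but the principle is the same: turn the dials, keep the turns that reduce the error.

```python
import math
import random

def neuron(inputs, weights, bias):
    # Weighted sum of inputs plus a bias, squashed through a sigmoid
    # so the output always lands between 0 and 1.
    total = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1 / (1 + math.exp(-total))

# Training data: the logical AND of two inputs (1 only when both are 1).
data = [([0, 0], 0), ([0, 1], 0), ([1, 0], 0), ([1, 1], 1)]

def total_error(weights, bias):
    # How far the neuron's outputs are from the correct answers.
    return sum((neuron(x, weights, bias) - target) ** 2 for x, target in data)

# "Turning the dials": randomly nudge one weight at a time and keep the
# nudge only if it lowers the error. Real training computes gradients to
# decide which way to turn each dial, but the goal is identical.
random.seed(0)
weights, bias = [0.0, 0.0], 0.0
err = total_error(weights, bias)
for _ in range(5000):
    i = random.randrange(3)                # pick a dial: w0, w1, or the bias
    nudge = random.uniform(-0.5, 0.5)
    trial_w, trial_b = list(weights), bias
    if i < 2:
        trial_w[i] += nudge
    else:
        trial_b += nudge
    trial_err = total_error(trial_w, trial_b)
    if trial_err < err:                    # keep the change only if it helps
        weights, bias, err = trial_w, trial_b, trial_err

print("final error:", err)
for x, target in data:
    print(x, "->", round(neuron(x, weights, bias), 3), "target:", target)
```

After a few thousand nudges the error drops well below its starting value of 1.0, and the neuron's outputs sit near 0 for the first three cases and near 1 for [1, 1]. Stacking many such neurons into layers, and tuning all their weights at once, is what lets full networks capture patterns no single dial could express.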