A C++ implementation of neural networks and perceptrons from scratch, developed as a project during my Bachelor's degree. It features a hand-written Matrix class with matrix operations, backpropagation training, multiple activation functions, and support for several datasets.
- Multi-layer neural networks with configurable architecture
- Backpropagation training algorithm
- Multiple activation functions: tanh, sigmoid, linear
- Loss functions: mean squared error
- Support for training and validation datasets
- Single-layer perceptron implementation
- Training for basic logical operations (AND, OR, NAND, XOR)
- XOR problem
- Abalone dataset (regression)
- Boston Housing dataset (regression)
- Sine function approximation
- Sinc function approximation
- GCC 13 or later (g++-13)
- C++23 standard support
- Linux environment
To build the project, run:

```sh
make all
```

This will compile main.cpp with debugging symbols enabled.
For a release build with optimizations:
```sh
make release
```

After building, run the executable:
```sh
./programa
```

The program contains various commented-out examples. Uncomment the desired section in main.cpp to test different neural network configurations or perceptron operations.
Uncomment the XOR section in main.cpp to train a neural network on the XOR problem:
```cpp
// XOR example code
Matrix x_test({0,0, 0,1, 1,0, 1,1}, 2);
Matrix y_test({0,1,1,0}, 1);
std::vector<LayerDescriptor> arch;
arch.emplace_back(2, ActivationFunctions::tanh);
arch.emplace_back(1, ActivationFunctions::tanh);
NeuralNetwork nn(2, arch, LossFunctions::mean_squared_error, 0.1);
TrainResult r = nn.fit(x_test, y_test, 10000, true);
```

Uncomment the AND section in main.cpp to train a perceptron for logical AND:
```cpp
Matrix x_test({0,0, 0,1, 1,0, 1,1}, 2);
Matrix y_and({-1,-1,-1,1}, 1);
Perceptron p_and;
p_and.fit(x_test, y_and, 1000);
```

The project uses only standard C++ libraries, with no external dependencies beyond the compiler.
This project was developed by Eduard Duta and Francisco Wendeburg (wendeburg).