Algorithms_in_C++
1.0.0
Set of algorithms implemented in C++.
Implementation of Multilayer Perceptron. More...
#include <algorithm>
#include <cassert>
#include <chrono>
#include <cmath>
#include <fstream>
#include <iostream>
#include <sstream>
#include <string>
#include <valarray>
#include <vector>
#include "vector_ops.hpp"
Classes
- class machine_learning::neural_network::layers::DenseLayer
- class machine_learning::neural_network::NeuralNetwork
Namespaces
- machine_learning: Machine learning algorithms.
- machine_learning::neural_network: Neural Network or Multilayer Perceptron.
- machine_learning::neural_network::activations: Various activation functions used in the neural network.
- machine_learning::neural_network::util_functions: Various utility functions used in the neural network.
- machine_learning::neural_network::layers: This namespace contains the layers used in the MLP.
Functions
- double machine_learning::neural_network::activations::sigmoid(const double &x)
- double machine_learning::neural_network::activations::dsigmoid(const double &x)
- double machine_learning::neural_network::activations::relu(const double &x)
- double machine_learning::neural_network::activations::drelu(const double &x)
- double machine_learning::neural_network::activations::tanh(const double &x)
- double machine_learning::neural_network::activations::dtanh(const double &x)
- double machine_learning::neural_network::util_functions::square(const double &x)
- double machine_learning::neural_network::util_functions::identity_function(const double &x)
- static void test()
- int main()
Implementation of Multilayer Perceptron.
A multilayer perceptron (MLP) is a class of feedforward artificial neural network (ANN). The term MLP is used ambiguously: sometimes loosely to mean any feedforward ANN, and sometimes strictly to refer to networks composed of multiple layers of perceptrons (with threshold activation). Multilayer perceptrons are sometimes colloquially referred to as "vanilla" neural networks, especially when they have a single hidden layer.
An MLP consists of at least three layers of nodes: an input layer, a hidden layer, and an output layer. Except for the input nodes, each node is a neuron that uses a nonlinear activation function. An MLP is trained with a supervised learning technique called backpropagation. Its multiple layers and nonlinear activations distinguish an MLP from a linear perceptron and allow it to separate data that is not linearly separable.
See Backpropagation for the training algorithm.
double machine_learning::neural_network::activations::drelu(const double &x)
Derivative of the ReLU function.

double machine_learning::neural_network::activations::dsigmoid(const double &x)
Derivative of the sigmoid function.

double machine_learning::neural_network::activations::dtanh(const double &x)
Derivative of the tanh function.

double machine_learning::neural_network::util_functions::identity_function(const double &x)
Identity function; returns its argument unchanged.
int main()
Driver code.
double machine_learning::neural_network::activations::relu(const double &x)
ReLU function.
Parameters:
  x: value
double machine_learning::neural_network::activations::sigmoid(const double &x)
Sigmoid function.
Parameters:
  x: value
double machine_learning::neural_network::util_functions::square(const double &x)
Square function; returns x squared.
double machine_learning::neural_network::activations::tanh(const double &x)
Tanh function.
Parameters:
  x: value
static void test()
Function to test the neural network.