Genann: simple neural network library in ANSI C



Genann is a minimal, well-tested library for training and using feedforward artificial neural networks (ANN) in C. Its primary focus is on being simple, fast, reliable, and hackable. It achieves this by providing only the necessary functions and little extra.


Features

  • C99 with no dependencies.
  • Contained in a single source code and header file.
  • Simple.
  • Fast and thread-safe.
  • Easily extendible.
  • Implements backpropagation training.
  • Compatible with alternative training methods (classic optimization, genetic algorithms, etc.).
  • Includes examples and test suite.
  • Released under the zlib license - free for nearly any use.


Building

Genann is self-contained in two files: genann.c and genann.h. To use Genann, simply add those two files to your project.

Example Code

Four example programs are included with the source code.

Quick Example

We create an ANN taking 2 inputs, having 1 layer of 3 hidden neurons, and providing 2 outputs. It has the following structure:

NN Example Structure

We then train it on a set of labeled data using backpropagation and ask it to predict on a test data point:

#include "genann.h"

/* Not shown, loading your training and test data. */
double **training_data_input, **training_data_output, **test_data_input;
int i, j;

/* New network with 2 inputs,
 * 1 hidden layer of 3 neurons,
 * and 2 outputs. */
genann *ann = genann_init(2, 1, 3, 2);

/* Learn on the training set. */
for (i = 0; i < 300; ++i) {
    for (j = 0; j < 100; ++j)
        genann_train(ann, training_data_input[j], training_data_output[j], 0.1);
}

/* Run the network and see what it predicts. */
double const *prediction = genann_run(ann, test_data_input[0]);
printf("Output for the first test data point is: %f, %f\n", prediction[0], prediction[1]);


This example demonstrates API usage; it is not a demonstration of good machine learning practice. In a real application you would likely want to present the training data in a random order, and you would also want to monitor learning on held-out data to prevent over-fitting.


Creating and Freeing ANNs

genann *genann_init(int inputs, int hidden_layers, int hidden, int outputs);
genann *genann_copy(genann const *ann);
void genann_free(genann *ann);

Creating a new ANN is done with the genann_init() function. Its arguments are the number of inputs, the number of hidden layers, the number of neurons in each hidden layer, and the number of outputs. It returns a genann struct pointer.

Calling genann_copy() will create a deep-copy of an existing genann struct.

Call genann_free() when you're finished with an ANN returned by genann_init().

Training ANNs

void genann_train(genann const *ann, double const *inputs,
        double const *desired_outputs, double learning_rate);

genann_train() will perform one update using standard backpropagation. It should be called by passing in an array of inputs, an array of desired outputs, and a learning rate. See example1.c for an example of learning with backpropagation.

A primary design goal of Genann was to store all the network weights in one contiguous block of memory. This makes it easy and efficient to train the network weights using direct-search numeric optimization algorithms, such as hill climbing, genetic algorithms, simulated annealing, etc. These methods can be applied by searching on the ANN's weights directly. Every genann struct contains the members int total_weights; and double *weight;. weight points to an array of total_weights doubles containing all weights used by the ANN. See example2.c for an example of training using random hill climbing.

Saving and Loading ANNs

genann *genann_read(FILE *in);
void genann_write(genann const *ann, FILE *out);

Genann provides the genann_read() and genann_write() functions for loading or saving an ANN in a text-based format.


Running ANNs

double const *genann_run(genann const *ann, double const *inputs);

Call genann_run() on a trained ANN to run a feed-forward pass on a given set of inputs. genann_run() returns a pointer to the array of predicted outputs (of length ann->outputs).


Hints

  • All functions start with genann_.
  • The code is simple. Dig in and change things.

Extra Resources

The comp.ai.neural-nets FAQ is an excellent resource for an introduction to artificial neural networks.

If you need an even smaller neural network library, check out the excellent single-hidden-layer library tinn.

If you're looking for a heavier, more opinionated neural network library in C, I recommend the FANN library. Another good library is Peter van Rossum's Lightweight Neural Network, which despite its name, is heavier and has more features than Genann.
