Hi everyone!

A couple of days ago, I finally decided to follow Andrej Karpathy’s micrograd tutorial. I liked the concepts so much that I implemented it in C++ and added more functionality:

- Optimizers: Adam, SGD with momentum
- Activation functions: tanh, sigmoid, relu

I tried to make both the API and the library internals as clean as possible, so that anyone can understand the basic algorithms behind deep learning. Here is the project’s repo and a getting-started guide. Let me know what you think :)

  • shefcu · 1 year ago

    All contributions are welcome!