KaiTorch


KaiTorch is a deep learning library that dynamically builds a neural network as a directed acyclic graph (DAG) of Scalar values and implements backprop using reverse-mode autodiff. Heavily over-commented, highly impractical, but hopefully educational.
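The core idea can be sketched in a few lines. This is an illustrative, self-contained mini version of a micrograd-style Scalar, not KaiTorch's actual class: each operation records its inputs (building the DAG), and backward() walks the graph in reverse topological order applying the chain rule.

```python
class Scalar:
    """Minimal illustration of a value node in a DAG with reverse-mode autodiff.
    (Simplified sketch; not KaiTorch's actual implementation.)"""

    def __init__(self, data, _children=()):
        self.data = data
        self.grad = 0.0
        self._backward = lambda: None  # how to push grads to children
        self._prev = set(_children)    # edges of the DAG

    def __add__(self, other):
        other = other if isinstance(other, Scalar) else Scalar(other)
        out = Scalar(self.data + other.data, (self, other))
        def _backward():
            # d(out)/d(self) = 1, d(out)/d(other) = 1
            self.grad += out.grad
            other.grad += out.grad
        out._backward = _backward
        return out

    def __mul__(self, other):
        other = other if isinstance(other, Scalar) else Scalar(other)
        out = Scalar(self.data * other.data, (self, other))
        def _backward():
            # d(out)/d(self) = other, d(out)/d(other) = self
            self.grad += other.data * out.grad
            other.grad += self.data * out.grad
        out._backward = _backward
        return out

    def backward(self):
        # topologically sort the DAG, then propagate gradients in reverse
        topo, visited = [], set()
        def build(v):
            if v not in visited:
                visited.add(v)
                for child in v._prev:
                    build(child)
                topo.append(v)
        build(self)
        self.grad = 1.0  # d(self)/d(self)
        for node in reversed(topo):
            node._backward()

a, b = Scalar(2.0), Scalar(3.0)
c = a * b + a          # c = a*b + a = 8.0
c.backward()
print(a.grad, b.grad)  # dc/da = b + 1 = 4.0, dc/db = a = 2.0
```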

It implements a Keras-like API that allows you to build models using a Sequential class with Dense and Dropout layers, with implementations of several commonly used weight initializers, activation functions, optimizers, and loss functions.

This project was inspired by and is an extension of Andrej Karpathy's micrograd :)

Github Repo: https://github.com/kaihayden/KaiTorch


Installation

pip install kaitorch

Tutorial Notebooks

  1. Functions and Gradients

  2. Functions as a Feed Forward Neural Net

  3. Reverse-mode Autodiff and Backpropagation

  4. Activation Functions

  5. Dense Layer and Weight Initialization

  6. Loss Functions

  7. Gradient Descent and Optimizers (*personal favorite)

  8. Inverted Dropout
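As a taste of the last tutorial, inverted dropout can be sketched as follows. This is a generic illustration of the technique, not KaiTorch's implementation: at train time each unit is zeroed with probability p and survivors are scaled by 1/(1-p), so the expected activation stays the same and inference needs no rescaling.

```python
import random

def inverted_dropout(xs, p, training=True):
    """Inverted dropout on a list of activations (illustrative sketch).
    Train time: drop each unit with probability p, scale survivors by 1/(1-p).
    Inference time: identity, since scaling already happened during training."""
    if not training or p == 0.0:
        return list(xs)
    keep = 1.0 - p
    return [x / keep if random.random() >= p else 0.0 for x in xs]

activations = [0.5, 1.0, 1.5, 2.0]
print(inverted_dropout(activations, p=0.5))            # e.g. [1.0, 0.0, 3.0, 0.0]
print(inverted_dropout(activations, p=0.5, training=False))  # unchanged at inference
```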

Example Notebooks

  1. Regression

  2. Binary Classification

  3. Multi-class Classification


Keras-esque API

Building a Neural Net

from kaitorch.models import Sequential
from kaitorch.layers import Dense, Dropout
from kaitorch.losses import CategoricalCrossentropy
from kaitorch.optimizers import Adam
from kaitorch.activations import LeakyReLU
from kaitorch.initializers import LecunNormal

model = Sequential()

model.add(Dense(12, activation='sigmoid', initializer='he_normal'))
model.add(Dropout(0.25))
model.add(Dense(12, activation=LeakyReLU(alpha=0.01), initializer=LecunNormal()))
model.add(Dense(3, activation='softmax'))

model.compile(
    optimizer=Adam(lr=0.025),
    loss=CategoricalCrossentropy()
)

Training a Neural Net

history = model.fit(X_train, y_train, epochs=32)
y_pred = model.predict(X_test)

Tracing/Visualization

model.plot_model(filename='trace')