Andrej Karpathy

@karpathy
karpathy / add_to_zshrc.sh
Created August 25, 2024 20:43
Git Commit Message AI
# -----------------------------------------------------------------------------
# AI-powered Git Commit Function
# Copy-paste this gist into your ~/.bashrc or ~/.zshrc to gain the `gcm` command. It:
# 1) gets the diff of the currently staged changes
# 2) sends it to an LLM to write the git commit message
# 3) lets you easily accept, edit, regenerate, or cancel
# But - just read and edit the code however you like
# the `llm` CLI util is awesome, you can get it here: https://llm.datasette.io/en/stable/
gcm() {
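  # NOTE: the gist preview is truncated at this point; the body below is not
  # the original gist code, just a minimal sketch of the flow described in the
  # comments above (regeneration omitted for brevity), assuming the `llm` CLI
  # is installed and configured.
  local diff msg choice
  diff=$(git diff --cached)
  [ -z "$diff" ] && { echo "no staged changes"; return 1; }
  msg=$(echo "$diff" | llm "Write a concise one-line git commit message for this diff")
  echo "proposed: $msg"
  printf "accept (a) / edit (e) / anything else cancels: "
  read -r choice
  case "$choice" in
    a) git commit -m "$msg" ;;
    e) git commit -e -m "$msg" ;;
    *) echo "cancelled" ;;
  esac
}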
@karpathy
karpathy / pytorch_strangeness.py
Created June 15, 2023 04:13
pytorch strangeness
import torch
import torch.nn as nn
torch.manual_seed(42)
x = torch.randn(2, 768)
# matrix multiply "ignores" the second row when calculating the first row
w = torch.randn(768, 768)
z1 = x[0] @ w
z2 = (x @ w)[0]
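The preview cuts off before the punchline; a plausible continuation (an assumption, not the rest of the gist) simply compares the two results, which agree only approximately because slicing first sends the matmul down a different kernel and accumulation order:
print(torch.allclose(z1, z2))        # True: equal within float tolerance
print((z1 == z2).all().item())       # typically False: not bitwise identical
print((z1 - z2).abs().max().item())  # small but nonzero difference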
@karpathy
karpathy / stablediffusionwalk.py
Last active October 1, 2024 09:56
hacky stablediffusion code for generating videos
"""
stable diffusion dreaming
creates hypnotic moving videos by smoothly walking randomly through the sample space
example way to run this script:
$ python stablediffusionwalk.py --prompt "blueberry spaghetti" --name blueberry
to stitch together the images, e.g.:
$ ffmpeg -r 10 -f image2 -s 512x512 -i blueberry/frame%06d.jpg -vcodec libx264 -crf 10 -pix_fmt yuv420p blueberry.mp4
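The "smooth walk" in scripts like this usually comes from interpolating between latent noise vectors and decoding each intermediate point; below is a minimal numpy sketch of spherical interpolation (slerp), the common choice for Gaussian latents. The names and the 4x64x64 latent size are illustrative assumptions, not code from the gist.
import numpy as np

def slerp(t, v0, v1):
    # spherical interpolation between two flattened latent vectors; keeps
    # intermediate points at a sensible norm for Gaussian noise
    v0n = v0 / np.linalg.norm(v0)
    v1n = v1 / np.linalg.norm(v1)
    theta = np.arccos(np.clip(np.dot(v0n, v1n), -1.0, 1.0))
    if np.isclose(theta, 0.0):
        return (1.0 - t) * v0 + t * v1  # nearly parallel: plain lerp is fine
    return (np.sin((1.0 - t) * theta) * v0 + np.sin(t * theta) * v1) / np.sin(theta)

# walk between two random latents and render a frame at each step
a = np.random.randn(4 * 64 * 64)
b = np.random.randn(4 * 64 * 64)
frames = [slerp(t, a, b) for t in np.linspace(0.0, 1.0, 60)]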
@karpathy
karpathy / nes.py
Last active October 31, 2024 10:45
Natural Evolution Strategies (NES) toy example that optimizes a quadratic function
"""
A bare-bones example of optimizing a black-box function (f) using
Natural Evolution Strategies (NES), where the parameter distribution is a
Gaussian of fixed standard deviation.
"""
import numpy as np
np.random.seed(0)
# the function we want to optimize
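The preview stops at this comment; a minimal sketch of the NES loop the description refers to follows, with an illustrative quadratic objective and illustrative constants (population size, noise scale, learning rate), not necessarily the gist's own values:
def f(w):
    # toy quadratic: reward is highest at the hidden target point
    target = np.array([0.5, 0.1, -0.3])
    return -np.sum((w - target) ** 2)

npop, sigma, alpha = 50, 0.1, 0.001   # population size, noise std, learning rate
w = np.random.randn(3)                # random initial guess
for i in range(300):
    N = np.random.randn(npop, 3)                     # perturbations to try
    R = np.array([f(w + sigma * n) for n in N])      # reward of each perturbed point
    A = (R - R.mean()) / (R.std() + 1e-8)            # standardize rewards
    w = w + alpha / (npop * sigma) * np.dot(N.T, A)  # estimated-gradient step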
@karpathy
karpathy / pg-pong.py
Created May 30, 2016 22:50
Training a Neural Network ATARI Pong agent with Policy Gradients from raw pixels
""" Trains an agent with (stochastic) Policy Gradients on Pong. Uses OpenAI Gym. """
import numpy as np
import cPickle as pickle
import gym
# hyperparameters
H = 200 # number of hidden layer neurons
batch_size = 10 # every how many episodes to do a param update?
learning_rate = 1e-4
gamma = 0.99 # discount factor for reward
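gamma above feeds the discounted-return computation that policy gradients need; a sketch of that helper, consistent with these hyperparameters, is below. The reset at nonzero rewards is a Pong-specific detail (a point was just scored); whether it matches the gist line-for-line is an assumption.
def discount_rewards(r):
    # r: 1D float array of per-timestep rewards for one episode
    discounted = np.zeros_like(r)
    running_add = 0
    for t in reversed(range(len(r))):
        if r[t] != 0:
            running_add = 0  # Pong-specific: nonzero reward ends a point, reset the sum
        running_add = running_add * gamma + r[t]
        discounted[t] = running_add
    return discounted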
@karpathy
karpathy / gist:88701557e59199f16045
Last active March 6, 2019 02:53
Google Slides in presenter view shows the next slide, but it is tiny and very difficult to see. This CSS hack enlarges the next-slide preview.
.punch-viewer-speakernotes-side-panel {
width: 400px !important;
}
.punch-viewer-speakernotes-text-body-scrollable {
left: 435px !important;
}
.punch-viewer-speakernotes-page,
.punch-viewer-speakernotes-page svg {
width: 400px !important;
height: 300px !important;
}
@karpathy
karpathy / min-char-rnn.py
Last active November 20, 2024 18:46
Minimal character-level language model with a Vanilla Recurrent Neural Network, in Python/numpy
"""
Minimal character-level Vanilla RNN model. Written by Andrej Karpathy (@karpathy)
BSD License
"""
import numpy as np
# data I/O
data = open('input.txt', 'r').read() # should be simple plain text file
chars = list(set(data))
data_size, vocab_size = len(data), len(chars)
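The natural next step after this data I/O block is a mapping between characters and integer indices, which everything downstream (one-hot inputs, sampling) relies on; a short sketch in that spirit, not necessarily line-for-line from the gist:
print('data has %d characters, %d unique.' % (data_size, vocab_size))
char_to_ix = { ch: i for i, ch in enumerate(chars) }  # character -> integer index
ix_to_char = { i: ch for i, ch in enumerate(chars) }  # integer index -> character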
@karpathy
karpathy / gist:f3ee599538ff78e1bbe9
Last active July 6, 2019 13:34
Batched L2 Normalization Layer for Torch nn package
--[[
This layer expects an [n x d] Tensor and normalizes each
row to have unit L2 norm.
]]--
local L2Normalize, parent = torch.class('nn.L2Normalize', 'nn.Module')
function L2Normalize:__init()
parent.__init(self)
end
function L2Normalize:updateOutput(input)
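The preview stops at the forward pass; in plain numpy the computation the layer describes is just dividing each row by its L2 norm. A small illustration of that math (deliberately in numpy rather than Lua, and not the gist's Torch implementation):
import numpy as np
x = np.random.randn(4, 8)                         # an [n x d] batch
norms = np.linalg.norm(x, axis=1, keepdims=True)  # per-row L2 norm, shape [n x 1]
y = x / norms                                     # every row of y has unit L2 norm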
@karpathy
karpathy / gist:7bae8033dcf5ca2630ba
Created May 5, 2015 07:31
Efficient LSTM cell in Torch
--[[
Efficient LSTM in Torch using nngraph library. This code was optimized
by Justin Johnson (@jcjohnson) based on the trick of batching up the
LSTM GEMMs, as also seen in my efficient Python LSTM gist.
--]]
function LSTM.fast_lstm(input_size, rnn_size)
local x = nn.Identity()()
local prev_c = nn.Identity()()
local prev_h = nn.Identity()()
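The "batching up the LSTM GEMMs" trick the comment mentions is to compute all four gate pre-activations with one big matrix multiply and then slice, instead of four separate multiplies per gate. A numpy illustration of the idea (not the nngraph graph itself; shapes and names are illustrative):
import numpy as np
input_size, rnn_size, batch = 64, 128, 16
W = np.random.randn(input_size + rnn_size, 4 * rnn_size)  # one weight matrix covering all gates
xh = np.random.randn(batch, input_size + rnn_size)         # [x, prev_h] concatenated
gates = xh.dot(W)                                           # single GEMM for i, f, o, g
i, f, o, g = np.split(gates, 4, axis=1)                     # slice into the four gates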
@karpathy
karpathy / gist:587454dc0146a6ae21fc
Last active July 11, 2024 10:36
An efficient, batched LSTM.
"""
This is a batched LSTM forward and backward pass
"""
import numpy as np
import code
class LSTM:
@staticmethod
def init(input_size, hidden_size, fancy_forget_bias_init = 3):
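The preview ends at the init signature; the default fancy_forget_bias_init = 3 hints at the common trick of initializing forget-gate biases to a positive value so the cell retains memory early in training. A hedged sketch of what such an init typically looks like (exact array layout and gate ordering are assumptions, not the gist's code):
def lstm_init(input_size, hidden_size, fancy_forget_bias_init=3):
    # one combined weight matrix for [bias, input, hidden] across all 4 gates
    WLSTM = np.random.randn(input_size + hidden_size + 1, 4 * hidden_size)
    WLSTM /= np.sqrt(input_size + hidden_size)   # scale down to keep activations sane
    WLSTM[0, :] = 0                               # first row acts as the bias
    if fancy_forget_bias_init != 0:
        # bias the (assumed) forget-gate block open so gradients flow through time
        WLSTM[0, hidden_size:2 * hidden_size] = fancy_forget_bias_init
    return WLSTM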