Gerred Dillon (gerred)
Here's a breakdown of the thought process behind crafting the Python Doom-like raycaster:
1. **Understanding the Core Concept:** The request is for a "Doom-like raytracer." It's crucial to recognize that classic Doom *isn't* a raytracer in the modern sense. It uses a technique called "raycasting," which is a simplified form of ray tracing optimized for speed and the specific constraints of early 3D graphics. The key difference is that raycasting only casts rays in a 2D plane to determine wall intersections, not for lighting, reflections, or other advanced effects.
2. **Simplifying the Problem:** A full 3D raytracer is complex. To make this achievable in a reasonable scope, the focus needs to be on the core raycasting algorithm for rendering walls. Things like enemies, items, and complex level geometry can be considered as potential extensions, but the initial focus should be on the fundamental wall rendering.
3. **Core Raycasting Algorithm:** The fundamental idea is to cast rays from the player's position across the field of view, find the distance at which each ray first hits a wall, and draw a vertical wall slice whose height is inversely proportional to that distance, as sketched below.
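A minimal, self-contained sketch of that core loop in Python may make it concrete. Everything here (the map layout, screen size, march step, and shading characters) is illustrative, not code from the gist:

```python
import math

# Illustrative map: '#' is a wall cell, '.' is empty space.
GAME_MAP = [
    "########",
    "#......#",
    "#......#",
    "########",
]
SCREEN_W, SCREEN_H = 80, 24
FOV = math.pi / 3
MAX_DEPTH = 16.0

def render(px, py, pa):
    """Cast one ray per screen column and build an ASCII frame."""
    frame = [[" "] * SCREEN_W for _ in range(SCREEN_H)]
    for col in range(SCREEN_W):
        ray = pa - FOV / 2 + FOV * col / SCREEN_W
        # March the ray forward until it enters a wall cell.
        dist = 0.0
        while dist < MAX_DEPTH:
            x = px + math.cos(ray) * dist
            y = py + math.sin(ray) * dist
            if GAME_MAP[int(y)][int(x)] == "#":
                break
            dist += 0.05
        # Nearer walls produce taller, denser columns.
        height = min(SCREEN_H, int(SCREEN_H / (dist + 0.1)))
        shade = "#" if dist < 3 else "+" if dist < 6 else "."
        top = (SCREEN_H - height) // 2
        for row in range(top, top + height):
            frame[row][col] = shade
    return "\n".join("".join(row) for row in frame)

print(render(2.0, 2.0, 0.3))
```

Real engines typically replace this naive fixed-step march with a DDA grid step, which is faster and cannot skip over thin walls; the fixed step is just the easiest version to read.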
Here's a breakdown of the thought process to generate the Doom-like raytracer:
1. **Understand the Core Request:** The user wants a "Doom-like raytracer game." This immediately brings several key concepts to mind:
* **Raycasting, not Raytracing:** Doom uses a fast, simpler technique called raycasting, not full raytracing, for its 2.5D effect. This is crucial for performance in a real-time game.
* **Walls, not Full 3D:** Doom renders vertical walls, not true 3D objects. The floor and ceiling are often simple textures.
* **Limited Vertical Look:** Basic raycasters often don't have true up/down look.
* **Movement and Interaction:** The game needs to allow player movement and possibly some interaction (though this initial request doesn't emphasize complex interaction).
2. **High-Level Structure:** A game needs a main loop, input handling, rendering, and some way to represent the game world. I'll start with a basic game loop using Pygame.
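As a rough illustration of that structure (not the gist's code), a minimal Pygame skeleton might look like the sketch below; the window size, map layout, movement keys, and the cast_ray helper are all assumptions made for the example:

```python
import math
import pygame

WIDTH, HEIGHT = 640, 400
FOV = math.pi / 3

# Illustrative map: '#' is a wall, '.' is walkable space.
GAME_MAP = [
    "########",
    "#......#",
    "#..##..#",
    "#......#",
    "########",
]

def cast_ray(px, py, angle, max_depth=16.0, step=0.05):
    """March along the ray until a wall cell is hit; return the distance."""
    dist = 0.0
    while dist < max_depth:
        x = px + math.cos(angle) * dist
        y = py + math.sin(angle) * dist
        if GAME_MAP[int(y)][int(x)] == "#":
            break
        dist += step
    return dist

def main():
    pygame.init()
    screen = pygame.display.set_mode((WIDTH, HEIGHT))
    clock = pygame.time.Clock()
    px, py, pa = 2.0, 2.0, 0.0  # player position and view angle

    running = True
    while running:
        # Input handling: quit events plus simple turning.
        for event in pygame.event.get():
            if event.type == pygame.QUIT:
                running = False
        keys = pygame.key.get_pressed()
        if keys[pygame.K_LEFT]:
            pa -= 0.05
        if keys[pygame.K_RIGHT]:
            pa += 0.05

        # Rendering: one ray per screen column, drawn as a vertical slice.
        screen.fill((0, 0, 0))
        for col in range(WIDTH):
            ray = pa - FOV / 2 + FOV * col / WIDTH
            dist = cast_ray(px, py, ray)
            wall_h = min(HEIGHT, int(HEIGHT / (dist + 0.1)))
            shade = max(0, 255 - int(dist * 14))
            top = (HEIGHT - wall_h) // 2
            pygame.draw.line(screen, (shade, shade, shade),
                             (col, top), (col, top + wall_h))
        pygame.display.flip()
        clock.tick(30)

    pygame.quit()

if __name__ == "__main__":
    main()
```

This assumes pygame is installed (`pip install pygame`) and deliberately leaves out the fisheye correction, forward movement with collision checks, and textured floors that a fuller raycaster would add.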
❯ curl 'https://gist.githubusercontent.com/gerred/ae5074a928f5ef29265dc468093b0a64/raw/e1621a1575d189c18112c586ede3e8176648f813/doom.ts' | uvx llm -m gemini-2.0-flash-thinking-exp-1219 -s 'convert to python'
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100  5364  100  5364    0     0   121k      0 --:--:-- --:--:-- --:--:--  121k
The user wants to convert the provided JavaScript code to Python. This looks like a simple 2D raycasting engine.
Here's a plan to convert the code:
1. **Imports:** The `readline` import from Node.js needs to be replaced with Python's `asyncio` and `readline` (or potentially just `input()` for simplicity).
2. **Constants:** Translate the JavaScript `const` declarations to Python constants (uppercase variable names).
3. **Game World (Map):** The map array can be directly translated to a Python list of strings, as sketched below.
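Applied to the constants visible in doom.ts below, steps 2 and 3 of that plan would come out roughly like this (the small map literal is an illustrative stand-in, since the gist's full 32x32 map isn't shown here):

```python
import math

# Step 2: JavaScript const declarations become uppercase Python constants.
MAP_WIDTH = 32
MAP_HEIGHT = 32
SCREEN_WIDTH = 80
SCREEN_HEIGHT = 30
FOV = math.pi / 3  # Field of View
MAX_DEPTH = 20

# Step 3: the map array carries over directly as a list of strings.
GAME_MAP = [
    "########",
    "#......#",
    "#......#",
    "########",
]
```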
@gerred
gerred / doom.ts
Created December 20, 2024 00:25
gemini-2.0-flash-thinking-experimental doom ascii raytracer
import * as readline from 'node:readline/promises';
import { stdin as input, stdout as output } from 'node:process';
// --- Game Constants ---
const MAP_WIDTH = 32; // map size in grid cells
const MAP_HEIGHT = 32;
const SCREEN_WIDTH = 80; // terminal viewport in characters
const SCREEN_HEIGHT = 30;
const FOV = Math.PI / 3; // Field of View
const MAX_DEPTH = 20; // maximum ray-march distance in cells
Okay, so I need to build a transducer system using TypeScript that allows for the safe composition of streaming iterators. Transducers are a way to compose multiple transformation functions into a single transformation, which can be applied to a stream of data efficiently. The goal is to make this system safe, meaning it should handle errors well and prevent common pitfalls like memory leaks or incorrect data flow.
First, I need to understand what transducers are and how they work. Transducers are composable transformations that can be applied to various types of collections, including arrays, streams, and more. They were popularized by Clojure but have been implemented in several other languages, including JavaScript.
In the context of streaming iterators, transducers can be used to process data as it comes in, without having to buffer the entire stream in memory. This is particularly useful for large data sets or real-time data processing.
Now, since I'm supposed to use TypeScript, I need to ensure that the input and output types of each transformation line up, so that compositions are checked at compile time rather than failing at runtime.
@gerred
gerred / gist:7e39e3753ae44b617b49c8f45a81c549
Created December 17, 2024 03:00
qwq 32b int4 local transducers implementation
>>> From types up, using TypeScript, build a safe transducer system that allows for composition of streaming iterators. As you proceed through your thinking, add in thinking tags along the way in the form of: <thought title="Considering types...">I am currently considering the types. If I...</thought>. This will help separate and organize thoughts and contextualize your thinking process to me along the way, better enabling me to understand your final response.
Alright, so I need to build a safe transducer system using TypeScript that allows for composing streaming iterators. Transducers are a way to compose multiple operations on data streams into a single transformation without creating intermediate arrays or objects. They're really useful for processing large data sets efficiently.
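The gist itself targets TypeScript, but the mechanism being described is language-agnostic. Purely as an illustration of reducer composition (all names here are invented for the example), a minimal sketch in Python:

```python
# A reducer takes (accumulator, item) and returns a new accumulator.
# A transducer wraps a reducer in another reducer, so transformations
# compose without building intermediate collections.

def mapping(f):
    def transducer(reducer):
        def step(acc, item):
            return reducer(acc, f(item))
        return step
    return transducer

def filtering(pred):
    def transducer(reducer):
        def step(acc, item):
            return reducer(acc, item) if pred(item) else acc
        return step
    return transducer

def compose(*transducers):
    def composed(reducer):
        # Apply right-to-left so data flows left-to-right.
        for t in reversed(transducers):
            reducer = t(reducer)
        return reducer
    return composed

def transduce(xform, reducer, init, iterable):
    step = xform(reducer)
    acc = init
    for item in iterable:  # works for any iterator, including streams
        acc = step(acc, item)
    return acc

# Keep the even numbers, then square them, in a single pass.
xform = compose(filtering(lambda x: x % 2 == 0), mapping(lambda x: x * x))
print(transduce(xform, lambda acc, x: acc + [x], [], range(10)))
# [0, 4, 16, 36, 64]
```

A TypeScript version would add generic parameters to the reducer and transducer types so the compiler rejects mismatched compositions, which is the "safe" part of the request.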
# starter.gpt
tools: gptscript.docs, sys.write, sys.exec
Write the list of tools built into GPTScript in a markdown file called `tools.gpt`.
---
name: gptscript.docs
description: Access the GPTScript docs. Use this tool to answer questions about GPTScript, understand the GPTScript syntax, get documentation about the GPTScript CLI, and to write and edit GPTScript files.
args: query: The query to search for
@gerred
gerred / recoil.md
Created December 21, 2022 00:20 — forked from kellabyte/recoil.md

History

For a long time I've been really impressed by the ease of use Cassandra and CockroachDB bring to operating a data store at scale. While these systems make very different tradeoffs, what they have in common is how easy it is to deploy and operate a cluster. I have experience with them at cluster sizes in the dozens, hundreds, and even thousands of nodes, and in comparison to some other clustered technologies they get you far pretty fast. They have sane defaults that provide scale and high availability to people who wouldn't always understand how to achieve it with more complex systems. People can get pretty far before they have to become experts; when you start needing more extreme usage you will need to become an expert in the system, just like with any other piece of infrastructure. But what I really love about these systems is that they make geo-aware data placement and data replication and movement a breeze most of the time, and can potentially simplify GDPR concerns.

Several years ago the great [Andy Gross](ht

---
apiVersion: apiextensions.k8s.io/v1beta1
kind: CustomResourceDefinition
metadata:
  annotations:
    controller-gen.kubebuilder.io/version: v0.2.4
  creationTimestamp: null
  name: operators.kudo.dev
spec:
  group: kudo.dev
package dynamic

import (
    "fmt"

    "k8s.io/apimachinery/pkg/apis/meta/v1/unstructured"
    "sigs.k8s.io/controller-runtime/pkg/client/config"
    "sigs.k8s.io/controller-runtime/pkg/controller"
    "sigs.k8s.io/controller-runtime/pkg/handler"
    "sigs.k8s.io/controller-runtime/pkg/manager"
    "sigs.k8s.io/controller-runtime/pkg/reconcile"