dfeldman / gist:5a5630d28b8336f403123c071cfdac9e
Created June 5, 2024 15:48
Database Schema for Microsoft's Copilot+Recall feature
****** SemanticTextStore.db:
CREATE TABLE si_db_info (
schema_version INTEGER
);
CREATE TABLE si_items (
id BLOB(16) PRIMARY KEY NOT NULL
);
CREATE TABLE si_diskann_graph (
id INTEGER PRIMARY KEY
-- (remaining columns truncated in the gist preview)
);
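The preview truncates the schema mid-table. As a minimal exploration sketch (assuming the file is an ordinary, readable SQLite database; the path is illustrative), Python's built-in sqlite3 module can dump whatever table definitions are actually present:

# Sketch: print every table definition in SemanticTextStore.db.
# Assumes an ordinary, readable SQLite file; the path is illustrative.
import sqlite3

conn = sqlite3.connect("SemanticTextStore.db")
for name, sql in conn.execute("SELECT name, sql FROM sqlite_master WHERE type = 'table'"):
    print(f"-- {name}")
    print(sql)
conn.close()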
dfeldman / System prompt
Created March 6, 2024 16:41
ChatGPT->Wikidata connector
You are a Wikidata query helper. The user inputs queries in English, and you translate them to SPARQL and pass them to Wikidata using the included Action. ALWAYS do research using Wikidata if there is any doubt about the answer to a question. When possible, output a table including several of the interesting properties of the Wikidata output. Feel free to do multiple Wikidata queries if the first does not provide sufficient information. Do as many queries as needed to provide a satisfactory result. If there are no rows, always refine the query and improve it. Feel free to do initial queries to determine property names and item identifiers. Think step by step and be careful. If a query returns no results, it is probably incorrect. When returning numerical results, always think through the units used. Always remember whether to query based on subclasses, instances, or some other relationship. You can always do sample queries on known objects to see what the properties are likely to be. Remember that Wikidata in
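For reference, a minimal sketch of the query step the Action performs, assuming the public Wikidata SPARQL endpoint and the requests library; the example query and User-Agent string are illustrative, not from the gist:

# Sketch: run a SPARQL query against the public Wikidata endpoint.
# The query text is an illustrative example (first five items that are
# an instance of "city", wd:Q515), not part of the connector itself.
import requests

SPARQL_ENDPOINT = "https://query.wikidata.org/sparql"
query = """
SELECT ?city ?cityLabel WHERE {
  ?city wdt:P31 wd:Q515 .
  SERVICE wikibase:label { bd:serviceParam wikibase:language "en". }
}
LIMIT 5
"""
resp = requests.get(
    SPARQL_ENDPOINT,
    params={"query": query, "format": "json"},
    headers={"User-Agent": "wikidata-query-helper-example/0.1"},
)
for row in resp.json()["results"]["bindings"]:
    print(row["cityLabel"]["value"])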
dfeldman / to_time.py
Created March 4, 2024 22:35
Process a ChatGPT conversations.json export into the total time spent per conversation
# GOAL:
# Determine how long (in seconds) you've spent talking to ChatGPT.
# This walks through each message in order. If two consecutive messages are
# less than 5 minutes apart, it adds their time delta to the total. If the gap
# is over 5 minutes, it assumes this is idle time and ignores the delta entirely.
# Outputs a CSV file of total time per conversation, and prints the grand total
# across all conversations.
# All output is in seconds.
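A minimal sketch of that accumulation rule (assuming `timestamps` is a sorted list of per-message Unix timestamps pulled from conversations.json; the helper name is illustrative):

# Sketch of the 5-minute idle-gap rule described above.
# Assumes `timestamps` is a sorted list of message times in Unix seconds.
IDLE_GAP = 5 * 60  # gaps longer than this count as idle time

def active_seconds(timestamps):
    total = 0.0
    for prev, cur in zip(timestamps, timestamps[1:]):
        delta = cur - prev
        if delta <= IDLE_GAP:
            total += delta
    return total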
dfeldman / pdttochatgpt.py
Last active January 4, 2024 05:54
Script to send a PDF file to ChatGPT page by page
import sys
import os
import base64
import requests
import io
import hashlib
from PyPDF2 import PdfFileReader
from pdf2image import convert_from_path
# OpenAI API Key
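The preview cuts off after the imports. A rough sketch of the page-by-page loop the description implies (the model name, prompt text, and file path are illustrative assumptions; pdf2image needs poppler installed):

# Sketch: render each PDF page to an image and send it to the OpenAI
# chat completions API. Model, prompt, and path are assumptions.
import base64
import io
import requests
from pdf2image import convert_from_path

API_KEY = "YOUR_OPENAI_API_KEY"  # placeholder

for page_num, page in enumerate(convert_from_path("input.pdf"), start=1):
    buf = io.BytesIO()
    page.save(buf, format="PNG")
    b64 = base64.b64encode(buf.getvalue()).decode()
    resp = requests.post(
        "https://api.openai.com/v1/chat/completions",
        headers={"Authorization": f"Bearer {API_KEY}"},
        json={
            "model": "gpt-4o",
            "messages": [{
                "role": "user",
                "content": [
                    {"type": "text", "text": f"Summarize page {page_num}."},
                    {"type": "image_url",
                     "image_url": {"url": f"data:image/png;base64,{b64}"}},
                ],
            }],
        },
    )
    print(resp.json()["choices"][0]["message"]["content"])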
dfeldman / gist:55d597dc9af5e388d5a0143e2a0a03fd
Last active May 17, 2023 07:31
cities_without_airports
# Code by chatgpt
# Cities database:
# https://simplemaps.com/static/data/world-cities/basic/simplemaps_worldcities_basicv1.76.zip
# (Needs to be unzipped -- copy worldcities.csv to the same directory)
# Airports database:
# https://davidmegginson.github.io/ourairports-data/airports.csv
import pandas as pd
import numpy as np
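A minimal sketch of one way the check can be done (the column names lat/lng and latitude_deg/longitude_deg match the two linked datasets; the 50 km radius and the function names are illustrative assumptions):

# Sketch: flag cities with no airport within a given radius.
# Column names match the linked datasets; the 50 km radius is illustrative.
import numpy as np
import pandas as pd

cities = pd.read_csv("worldcities.csv")
airports = pd.read_csv("airports.csv")

def haversine_km(lat1, lon1, lat2, lon2):
    # Great-circle distance in kilometers between two points (broadcasts).
    lat1, lon1, lat2, lon2 = map(np.radians, (lat1, lon1, lat2, lon2))
    a = (np.sin((lat2 - lat1) / 2) ** 2
         + np.cos(lat1) * np.cos(lat2) * np.sin((lon2 - lon1) / 2) ** 2)
    return 6371.0 * 2 * np.arcsin(np.sqrt(a))

alat = airports["latitude_deg"].to_numpy()
alon = airports["longitude_deg"].to_numpy()

def has_nearby_airport(row, radius_km=50.0):
    return bool((haversine_km(row["lat"], row["lng"], alat, alon) < radius_km).any())

cities["has_airport"] = cities.apply(has_nearby_airport, axis=1)
print(cities.loc[~cities["has_airport"], "city"])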
dfeldman / get_census_data.py
Last active February 24, 2016 23:39
get_census_data
import census
# Get your own key!
c = census.Census("b99ef7ede80606207d3a3836bafdc00dd90a244d")
c.acs.zipcode('B19001_001E', 'zip:55406')
# Available variables: http://api.census.gov/data/2014/acs5/variables.html
# Get output like:
# [{u'zip code tabulation area': u'55406', u'B19001_001E': u'14900'}]
dfeldman / get_census_data.py
Last active February 25, 2016 01:24
analyzethis-example
import census
# Get your own key!
c = census.Census("b99ef7ede80606207d3a3836bafdc00dd90a244d")
c.acs.zipcode('B19001_001E', 'zip:55406')
# Available variables: http://api.census.gov/data/2014/acs5/variables.html
# Get output like:
# [{u'zip code tabulation area': u'55406', u'B19001_001E': u'14900'}]
dfeldman / gist:d25dd0fac033f991ff0e
Last active December 29, 2015 06:29
arduino unicycle controller
//TWIN WHEELER MODIFIED FOR ARDUINO SIMPLIFIED SERIAL PROTOCOL TO SABERTOOTH V2
// based on instructable from J Dingley
//i.e. the current standard Arduino board.
//designed for use with a rocker switch for steering as in original description on my "instructable":
//http://www.instructables.com/id/Easy-build-self-balancing-skateboardrobotsegway-/
//XXXXXXXXXXXXXXXXX UPDATES May 2011 XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX