
@magnetikonline
Last active October 10, 2024 17:31
List all Git repository objects by size.


Summary

Bash script which will:

  • Iterate all commits made within a Git repository.
  • List every object at each commit.
  • Order unique objects in descending size order.

Useful for removing large resources from a Git repository - for instance ahead of a migration to GitHub, where individual objects are limited to a maximum of 100MB.

Example

$ ./gitlistobjectbysize.sh

100644 blob de6bdeaefebec0bff53d4859833caddba635609c    123452290	something/really/large.iso
100644 blob 946488f3c2ab8abf5d36b88f9018af77dceda12d         2290	path/to/script.js
100644 blob 2e234e61460f2fa087f9aebbfee2f6b524bc38fe         1724	README.md
100644 blob 1807d789603ae1038985f76c54e6de3b093da761         1710	README.md
100644 blob 7b5071e880f1abed9191fb34425157901c0a51a7         1083	LICENSE
100755 blob ef377e40d54365c814b9324ab4001455f4b5d4d8          651	bashscript.sh
100644 blob 08ca429f5434247f12f503dd69df244399d4ef83           19	.gitignore
100644 blob 8a52f946a9aed2c242cbe8891b3510f750527bb2           18	.gitignore

Note

For Git version 2.38.0 and above, the git ls-tree --format argument provides a more succinct report output; it is used by the gitlistobjectbysize-git2.38.0.sh script variant.

If we now wish to remove something/really/large.iso we can rewrite history using git filter-branch:

$ git filter-branch \
  --tree-filter "rm -f something/really/large.iso" \
  -- --all

Ref 'refs/heads/main' was rewritten
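The rewrite above only affects the local repository; to publish it, the rewritten refs must be force pushed (a sketch, assuming the remote is named origin - as the comments below discuss, the old commits may linger on the remote as unreachable objects until it garbage collects):

```shell
# publish the rewritten branches and tags; assumes a remote named "origin"
git push --force --all origin
git push --force --tags origin
```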
gitlistobjectbysize-git2.38.0.sh:

#!/bin/bash -e

function main {
	local tempFile=$(mktemp)

	# work over each commit and append all files in tree to $tempFile
	local IFS=$'\n'
	local commitSHA1
	for commitSHA1 in $(git rev-list --all); do
		git ls-tree \
			--format="%(objectname) %(objectsize:padded) %(path)" \
			-r \
			"$commitSHA1" >>"$tempFile"
	done

	# sort files by SHA-1, de-dupe list and finally re-sort by filesize
	sort --key 1 "$tempFile" | \
		uniq | \
		sort --key 2 --numeric-sort --reverse

	# remove temp file
	rm "$tempFile"
}

main
gitlistobjectbysize.sh:

#!/bin/bash -e

function main {
	local tempFile=$(mktemp)

	# work over each commit and append all files in tree to $tempFile
	local IFS=$'\n'
	local commitSHA1
	for commitSHA1 in $(git rev-list --all); do
		git ls-tree -r --long "$commitSHA1" >>"$tempFile"
	done

	# sort files by SHA-1, de-dupe list and finally re-sort by filesize
	sort --key 3 "$tempFile" | \
		uniq | \
		sort --key 4 --numeric-sort --reverse

	# remove temp file
	rm "$tempFile"
}

main
@Maxattax97

Made some modifications: it de-duplicates the file paths, shows file sizes in human-readable format, reduces the output to human-friendly columns, and sorts it so you'll quickly see the largest objects next to your prompt (without scrolling or using less).

Sample output:

...
46 KiB		clib/docs/kbuild/makefiles.txt
58 KiB		clib/scripts/kconfig/zconf.lex.c_shipped
71 KiB		clib/lib/cmocka/cmocka.h
75 KiB		clib/scripts/kconfig/zconf.tab.c_shipped
110 KiB		clib/lib/cmocka/cmocka.c
$ # your shell prompt starts here

https://gist.github.com/Maxattax97/f566fdf67ac4ad2492ea1c732f5afdda

@magnetikonline
Author

I like this @Maxattax97 - probably at the point I'd convert this to a Python script... 😄

@kenorb

kenorb commented Nov 9, 2020

Add these aliases into ~/.gitconfig file:

[alias]
  big-files    = !"git rev-list --objects --all \
                 | git cat-file --batch-check='%(objecttype) %(objectname) %(objectsize) %(rest)' \
                 | sed -n 's/^blob //p' \
                 | sort -nk2 \
                 | cut -c 1-12,41- \
                 | $(command -v gnumfmt || echo numfmt) --field=2 --to=iec-i --suffix=B --padding=7 --round=nearest"
  big-objects = !"git rev-list --all \
                | parallel git ls-tree -r --long "{}" \
                | sort -uk3 \
                | sort -nk4"

Then run git big-files or git big-objects. It'll show the biggest files/objects at the bottom (tested on macOS; note the big-objects alias depends on a parallel command such as GNU parallel).

@magnetikonline
Author

Nice additions @kenorb - thanks! 👍

@voiski

voiski commented Dec 24, 2020

If I'm not wrong, this only affects your local repository. If you run a fresh clone it will also download the orphaned commits - the ones you detached. The only way to make it fully work is to push your local repository to a new repo and replace the original with that new one. Alternatively, if you are an admin on the server you can manually drop those orphaned commits - which is not possible on GitHub.

You can push to the server and it will rewrite the branch, but the orphaned commits will still be there.

Another alternative to this script, though with the same issue I pointed out: https://rtyley.github.io/bfg-repo-cleaner/

@magnetikonline
Author

magnetikonline commented Dec 28, 2020

@voiski that's not quite correct. To clarify:

  • Using git filter-branch or BFG/etc. will remove the commit(s) against the objects locally.
  • Force pushing then to a remote will create a rewritten commit log which does not reference the offending commit(s), but existing commit(s) will still exist in the remote as unreachable/orphan commits.
  • If someone then does a fresh git clone they will not receive the orphan commit(s) - only reachable commits based on the commit log are fetched - so the fresh clone will not receive the removed objects.
  • If you have direct access to the remote repo - you could issue a git gc --aggressive --prune=now - which will nuke the orphan commit(s) in question from the remote entirely.
  • In the case of GitHub, that's a bit of a black box - but we can assume it somewhat follows the rules that are outlined against the gc.auto setting, which determines when a Git repository (both local and remote) will self-clean/prune/gc. It's fair to say at some point it will git gc based on number of commits/pushes/etc.
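If you do have that direct access, the prune step can be sketched as follows (the reflog usually needs expiring first, since its entries keep the rewritten-away commits reachable and gc would otherwise retain them):

```shell
# drop reflog entries that still reference the removed commits,
# then repack and prune everything unreachable
git reflog expire --expire=now --all
git gc --aggressive --prune=now

# sanity check: object counts after the prune
git count-objects -v
```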

@voiski

voiski commented Dec 29, 2020

@magnetikonline Thanks for the clarification. I got confused because I had tried it before and it didn't work, but my case was redacting secrets in the source code. I guess if you don't fully delete the file, the orphaned commits are still kept. For the intent of this gist, though, it looks fine.

@pansila

pansila commented Feb 18, 2021

python version

import subprocess
from tqdm import tqdm

files = []

commit_list = subprocess.check_output(['git', 'rev-list', '--all'], text=True)
for c in tqdm(commit_list.splitlines()):
    files.extend(subprocess.check_output(['git', 'ls-tree', '-r', '--long', c], text=True).splitlines())

# de-dupe, then sort by object size (field 4), largest first
files = list(set(files))
files.sort(key=lambda x: int(x.split()[3]), reverse=True)
print('\n'.join(files[:100]))

@tombohub

I waited more than 5 minutes without seeing any results, and they only appeared once I pressed a key.
If I hadn't pressed a key, I would have waited for eternity.

WSL1

@Jan-Bruun-Andersen

I probably went a bit overboard, but here is my version:

https://github.com/Jan-Bruun-Andersen/git-ls-blobs

@kraduk

kraduk commented Feb 22, 2022

Removing the temp file is suboptimal: after waiting ages for the script to run, the biggest objects are listed first and scroll off the terminal, and there may not be enough scrollback history to see the complete output 8(

#!/bin/bash -e

function main {
	local tempFile=$(mktemp)

	# work over each commit and append all files in tree to $tempFile
	local IFS=$'\n'
	local commitSHA1
	for commitSHA1 in $(git rev-list --all); do
		git ls-tree -r --long "$commitSHA1" >>"$tempFile"
	done

	# sort files by SHA1, de-dupe list and finally re-sort by filesize
	sort --key 3 "$tempFile" | \
		uniq | \
		sort --key 4 --numeric-sort --reverse

	# remove temp file
	#rm "$tempFile"
}


main

@samzmann

Thanks for this!
gitlistobjectbysize.sh works well, though kinda slow.

Then, the git filter-branch ... command is extremely slow: over 30 minutes expected to remove one large file from the history of a relatively small repo (~1000 commits).

I found https://github.com/newren/git-filter-repo (which is also recommended when running git filter-branch).
It provides a super fast and in-depth way to analyze the repo:

git filter-repo --analyze

And then provides all kinds of ways to filter/clean a repo, all super fast.

I can recommend!

@pauljohn32

I have Git LFS holding a lot of large files. I want to find the large files that only exist in the local ".git" folder or history. Can you discuss these questions?

Example issues:

  1. A user put in a file as an ordinary Git file, and later I changed it to LFS. Does the original commit stay somewhere in the ".git/" folder? I guess yes.
  2. Do you know of a way to scan for everything that is correctly in LFS on the current master branch, and then check for those same named files in the ".git" history and delete them - all versions?
  3. I delete a file from the master branch, but copies of it are still sitting about in history. Does each version show up as a separate file when we do this kind of search?

@magnetikonline
Author

magnetikonline commented Nov 30, 2022

@pauljohn32 I can probably only answer the first with any confidence - haven't really played with LFS (yet).

User put in a file as ordinary git file. Then later I changed it to lfs. Does the original commit stay somewhere in the ".git/" folder. I guess yes.

yes it does 👍

I delete a file from the master branch, but copies of it are still sitting about in history. Does each version show up as a separate file when we do this kind of search?

each modified version of the same object/path will - yes.

@zhaopan

zhaopan commented Mar 1, 2023

+10086 👍

@Stonks3141

A nice one-line version that doesn't use a tempfile or GNU-specific flags and puts the largest files at the bottom:

git rev-list --all | xargs -n1 git ls-tree --long -r | sort -k3 | uniq | sort -nk4
