The best part about writing your own #GitLFS server is that the API "specification" provided by the Git LFS project is basically Just a Bunch of Markdown Files 😬😬😬
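For anyone digging through those markdown files: the heart of the protocol is the Batch API, a single JSON-over-HTTP endpoint. A minimal sketch of the request body a client sends (the oid/size values here are made up; see `batch.md` in the git-lfs repo for the full shape):

```python
import json

# Sketch of a Git LFS Batch API request body, as described in batch.md.
# The oid/size values below are made-up examples.
def batch_request(operation, objects):
    """Build the JSON body for POST {repo}/info/lfs/objects/batch."""
    assert operation in ("download", "upload")
    return {
        "operation": operation,
        "transfers": ["basic"],  # the default transfer adapter
        "objects": [{"oid": oid, "size": size} for oid, size in objects],
    }

body = batch_request("download", [("deadbeef" * 8, 12345)])
print(json.dumps(body, indent=2))
```

The server replies with the same `objects` list, each entry carrying an `actions` map of signed `href`s to hit next; both request and response use `Content-Type: application/vnd.git-lfs+json`.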
clone2leak, or how small details cause credential leaks
Git is indisputably one of the most popular version control systems. Adopted by many organizations (e.g. GitHub), it has gained support in the form of various tools, such as GitHub Desktop and Git LFS. That, in turn, made it necessary to share the user's credentials between them. Unfortunately, not all checks were performed with due care, and a researcher...
#WBiegu #Git #Gitcls #Gitdesktop #Gitlfs #Injection #Newline #Podatność
https://sekurak.pl/clone2leak-czyli-jak-drobne-szczegoly-powoduja-wyciekanie-poswiadczen/
I was too shy to try it, but now there's more pressure to do so. And yes! The Git LFS (Large File Storage) extension is really cool 😎:
If you want to use Restic (or any other deduplicating backup tool) for backing up Git repositories with lots of large binary files, this blogpost might be helpful: https://olivergerlich.wordpress.com/2025/01/10/efficient-backup-of-a-git-repository-with-deduplication/ .
Unfortunately I did not find an _easy_ solution for this; but at least it is possible to make space-efficient backups in this case.
Started the new year by pushing a bunch of commits to #github, when I got the dreaded warning about commit file size:
remote: warning: File <blob>... is larger than GitHub's recommended maximum file size of 50.00 MB.
The recommended solution is to use #gitlfs, but having been burned by it in the past, I know Git LFS is the spawn of the devil.
So looking for an alternative, I came upon #gitannex.
Will try that on an experimental basis to see if it works.
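As a stopgap while evaluating alternatives, a quick check for files over GitHub's 50 MB recommendation can be scripted before pushing. This is just a directory walk, not tied to any particular tool (the threshold and starting path are assumptions):

```python
import os

LIMIT = 50 * 1024 * 1024  # GitHub's recommended maximum: 50 MB

def oversized_files(root, limit=LIMIT):
    """Yield (path, size) for files under `root` larger than `limit`."""
    for dirpath, dirnames, filenames in os.walk(root):
        dirnames[:] = [d for d in dirnames if d != ".git"]  # skip git internals
        for name in filenames:
            path = os.path.join(dirpath, name)
            size = os.path.getsize(path)
            if size > limit:
                yield path, size

for path, size in oversized_files("."):
    print(f"{path}: {size / 1024 / 1024:.1f} MB")
```

Note this only scans the working tree; large blobs already buried in history need something like `git rev-list --objects --all` piped through `git cat-file` to track down.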
Just because it invaded my tiny little world and brought pain and destruction about: Why the f*** does git-lfs exist?
Seriously, unless you have to track different versions of Blade Runner: Don’t use git for large files, and don’t even think about git-lfs.
And never, ever, ever store your builds in your SCM, it’s called SCM for a reason. Why not use something like Artifactory? Or whatever artefact storage your git hoster throws in for free? As long as it’s not Git LFS.
#git #gitlfs #scm
If my team works with SVN and I'd like to use git-svn, is there a way to handle binaries via Git LFS while keeping in sync with the SVN repo? 🤔👀
I tried:
- git svn init
- placing a .gitattributes
- adding .gitattributes to .git/info/exclude
- git svn fetch
But the binaries won't be listed by `git lfs ls-files`, meaning they are still handled outside LFS 🤕
After struggling with #gitlfs for the last couple of days, I've made the decision to avoid it. In theory, Git LFS allows us to manage our large video files in the same way that we manage the rest of our website assets. In practice it's just broken in so many ways that it is effectively unusable. Fortunately, stripping it out of our repo is not a big deal, since I'm able to rewrite histories without any real consequence for now. https://gregoryszorc.com/blog/2021/05/12/why-you-shouldn%27t-use-git-lfs/
Update:
Experimented a bit with this idea and it seems to work OK.
Enabled LFS support in one of my Git servers, and as long as we don't use any pointers and stick to using the locking functionality only, we can go back and forth between Git-core-only and Git-LFS-enabled servers without any issue.
I'd appreciate your views on Git LFS.
I've always been reluctant to enable LFS in my Git servers basically because of the reasons so well explained by Gregory Szorc in this article:
https://gregoryszorc.com/blog/2021/05/12/why-you-shouldn%27t-use-git-lfs/
When we've needed to track big binary files, we have pushed them inside the repos and allowed them to balloon rather than implement LFS.
There is an interesting use case that has come up now, though: the interest is not in storing the large files in LFS, but in taking advantage of the locking feature only... this case is less controversial, as there will actually be no files in the LFS storage and no pointers in the repo.
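For the locking-only case, git-lfs supports exactly this via the `lockable` attribute: files marked `lockable` (without any `filter=lfs` attribute) get no pointers and nothing in LFS storage, but can still be locked against the server. A sketch of what the `.gitattributes` could look like (the file patterns are just examples):

```
# .gitattributes — locking only, no LFS tracking:
# "lockable" without "filter=lfs" means no pointers are created.
*.psd   lockable
*.blend lockable
```

Locks are then taken with `git lfs lock <path>` and released with `git lfs unlock <path>`.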
Please, share your experiences or views. Thanks.
Trying to figure out a good setup for photos on my blog (https://jonas.brusman.se).
The current setup based on Git LFS on Netlify is cumbersome and doesn’t allow me to post new photos from my phone with something like Tina CMS.
Suggestions are welcome!
#eleventy #11ty #staticsite #netlify #photoblog #gitlfs #tinacms #ssg #webdev #indieweb
1. You have #GitLFS enabled
2. You added a fresh screenshot to a new note
3. #Git recognized it as a new media file to be tracked in LFS
4. You committed
5. `git lfs fsck` says `Git LFS fsck OK`
6. You pushed
7. You waited for the build and deploy to production
8. `The image “https://toby3d.me/ru/2023/10/11/143444/photo.png” cannot be displayed because it contains errors.`
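A common culprit for that last step (a hypothesis, based on the pointer format in the git-lfs spec): the deployed site is serving the text of the LFS pointer instead of the actual image. A pointer is a tiny three-line text file, so it is easy to detect; a minimal check, assuming the spec/v1 pointer format:

```python
def is_lfs_pointer(data: bytes) -> bool:
    """True if `data` looks like a Git LFS pointer file rather than real content.

    Pointers are small text files of the form:
        version https://git-lfs.github.com/spec/v1
        oid sha256:<64 hex chars>
        size <bytes>
    """
    if len(data) > 1024:  # pointers are tiny; real media is not
        return False
    return data.startswith(b"version https://git-lfs.github.com/spec/v1")

# Example: a pointer body vs. a real PNG header
pointer = (b"version https://git-lfs.github.com/spec/v1\n"
           b"oid sha256:" + b"a" * 64 + b"\nsize 12345\n")
print(is_lfs_pointer(pointer))         # True
print(is_lfs_pointer(b"\x89PNG\r\n"))  # False
```

Running this over the file fetched from the broken URL would tell you whether the build copied pointers to production instead of the smudged files.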
TIL about Git Large File Storage. Tomorrow: put all my EXEs in my automated VM setup repos. I really should spend more time reading the documentation. https://docs.github.com/en/repositories/working-with-files/managing-large-files/installing-git-large-file-storage #git #github #gitlfs #automation #programming #malwareanalysis #documentation
I need need need to stop reading "GitLFS" as "GILFS"
The part I've been most obsessed with for the last couple of years is #Studio #Pipeline and #Collaboration software.
This year, I made a transition from using #Subversion for version control and asset management to #Git and #Gitea , which requires a lot of use of #GitLFS for our application (MOST of our files, by number and certainly by volume, are in the LFS storage).
Still working out the problems. Just last week, I migrated from AWS S3 to Backblaze B2 for storing the LFS data.
Have encountered a problem with my #Gitea server, which uses S3 to store #GitLFS files: the S3 is getting hammered, apparently by search/indexing bots?
I don't mind that the site is getting scanned. Nor, for that matter, that people legitimately interested want to download files.
But there's no way this is legit traffic, and the scans shouldn't be retrieving LFS data. That's just very wasteful! And I'm paying for it in fees.
I wonder if there is an easy way to block that kind of request?