#Backblaze

2026-01-21

ohh is #Synology C2 now cheaper than #Backblaze B2? $69.99/yr vs $72/yr for 1TB. Plus Synology C2 also has deduplication.

Have I missed something? I thought C2 was always much more expensive than B2.

#nas

Sam Steiner samsteiner@swiss.social
2026-01-21

Can anyone recommend a non-US alternative backup solution like #Backblaze?

2026-01-19

You're running Immich in Docker on Debian with ~1 TB of data and want an offsite backup. Two options:
a) Move the mini-PC to Windows, run Immich in Docker, and back up the volumes plus a PostgreSQL dump to Backblaze Personal ($9/month, unlimited).
b) Use rsync to back up the Debian volumes to a personal Windows machine, then push everything to Backblaze (which already holds 20 TB).
Weigh the cost, stability, and flexibility of each option.
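A minimal sketch of option b), assuming the Immich data lives under /srv/immich, the Postgres container is called immich_postgres, and the Windows box accepts SSH/rsync (all of these names and paths are hypothetical):

# dump the Immich Postgres database to a file (container and database names are assumptions)
docker exec immich_postgres pg_dump -Fc -U postgres immich > immich-db.dump
# copy the dump and the upload volume to the Windows machine over SSH (paths are assumptions)
rsync -avz immich-db.dump user@windows-box:/backups/immich/
rsync -avz --delete /srv/immich/upload/ user@windows-box:/backups/immich/upload/

From there the unlimited Backblaze Personal client on the Windows side only needs /backups included in its backup set.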

#Backup #Immich #Docker #Backblaze #Selfhosted #SaoLưu #CôngNghệ

https://www.r

KipJayChou :debian: :docker:admin@mstdn.feddit.social
2026-01-16

RE: mstdn.feddit.social/@admin/115

Rclone mount fronted by a Cloudflare CNAME + Rules does indeed get rid of backblaze's egress fees, but kiwix now feels like it starts up 100x slower than when it pulled straight from backblaze...

Feels like I need a new Hetzner machine with 2x4TB HDD dedicated to the mirror sites...
Kiwix-serve, Immich......
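One thing that might help the startup time is letting rclone keep a local read cache instead of re-fetching ZIM chunks through Cloudflare on every start; a minimal sketch, assuming a remote named b2 and a bucket called zims (both assumptions):

# mount the bucket with a persistent local read cache (remote and bucket names are assumptions)
rclone mount b2:zims /mnt/zims \
  --vfs-cache-mode full \
  --vfs-cache-max-size 50G \
  --vfs-read-chunk-size 64M \
  --daemon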

#rclone #cloudflare #backblaze #BandwidthAlliance #hetzner #kiwix

tree /mnt/zims
docker compose logs -f
2026-01-15

@cinimodev My #GoogleDrive upload rates are a total disaster... nightmarishly slow, and they always have been. The ONLY service I have ever seen actually use all my bandwidth is #backblaze, which is amazing...

Nathan Arthur (OLD ACCOUNT)narthur
2026-01-14

I've been a customer for years, but it sounds like they're slipping:

1. They made me change my password because mine "wasn't strong enough".

- How did they know, if they're storing it salted and hashed?!?
- It wasn't weak! It was generated by 1Password, and I had 2FA enabled!
- Their web form for changing passwords has UX issues!

2. I got a support reply from an AI telling me they "know about the issue" and my "feedback is valuable". And they'll automatically close the ticket soon!

Trezzer (aka Helvedeshunden)trezzer@social.linux.pizza
2026-01-05

Extremely frustrated with #Backblaze today. They actively block restores to non-Apple formats on a Mac, even natively supported ones like ExFAT. I explicitly want to restore to a non-Apple format to stay platform-agnostic. Now I have 8 TB stuck in their cloud and have to decide which inconvenient route around my elderly Mac Mini server I want to struggle with to get my data back. Oh, and it immediately started backing up the ExFAT drive (which already has some data on it) despite Backblaze support saying it's "unsupported" (no warning from the app!), and apparently I will never be able to download that data again from the same machine. I guess I need a new backup provider. This is 🤡 stuff.

2025-12-23

Backblaze no longer backs up the contents of Dropbox or other storage services dlvr.it/TPzGQS #Backblaze #Dropbox

KipJayChou :debian: :docker:admin@mstdn.feddit.social
2025-12-23

Photos on iOS: upload them to the Mac first
Photos on Windows: upload them to the Mac first (not a big fan of the Windows file system)
Set up immich on the Mac; once the import finishes, rclone sync the library to b2:XXX
Set up immich on hetzner and rclone mount b2 onto the server locally, making sure the hetzner immich never uploads

Key points:
Reset the database first
Never upload via the app/web; the app/web is only a reader
All uploads go through the local immich server (to keep the file layout tidy)
Generated thumbnails and other derived content live outside the library, so that should be fine
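The two rclone steps in that flow, as a rough sketch; the remote name, the XXX bucket, and the local paths are all placeholders:

# on the Mac: push the finished immich library to B2 (bucket and paths are placeholders)
rclone sync /opt/immich/library b2:XXX/library --progress
# on hetzner: expose the same bucket read-only to the local immich instance
rclone mount b2:XXX/library /srv/immich/library --read-only --vfs-cache-mode full --daemon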

immich.app/?ref=selfh.st

#immich #b2 #backblaze #s3

KipJayChou :debian: :docker:admin@mstdn.feddit.social
2025-12-22

Previously I'd only dealt with video and music media, and whether on the server or a VPS it all lived locally
Today I finally got the comics onto object storage and set up OPDS
Storage is backblaze b2, and the OPDS server is ComicOPDS
The comics come to 144GB in total (lesson learned: with hundreds of GB over a slow network, it's better to upload in batches)
On iPadOS I read them through Panels connected to OPDS, $1.49/month

Music sits on Cloudflare R2 object storage
Next up is video storage; I've decided to go with Backblaze B2
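For the "hundreds of GB over a slow network" problem, rclone can at least keep each batch resumable and throttled; a minimal sketch, with the local path, remote name, and bucket all assumed:

# copy one series at a time, limiting parallel transfers and bandwidth (paths and bucket are assumptions)
rclone copy ./manga/SeriesA b2:comics/SeriesA --transfers 4 --bwlimit 8M --progress
# rerunning the same command only uploads whatever is still missing, so an interrupted batch resumes cleanly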

#manga #comic #backblaze #opds

backblaze caps and alerts
KipJayChou :debian: :docker:admin@mstdn.feddit.social
2025-12-22

The campus network exists for exactly this kind of grunt work

#backblaze #manga #comic #b2

Marcus "MajorLinux" Summersmajorlinux@toot.majorshouse.com
2025-12-21

Also, in my haste to do so, I forgot to back up my data from my MBP (which I traded in).

Luckily, I have #Backblaze to bail me out.

KipJayChou :debian: :docker:admin@mstdn.feddit.social
2025-12-16

immich doesn't have S3 support yet, but could I work around that by mounting a backblaze bucket as a disk?

v2ex.com/t/911646
docs.raidrive.com/en/gui/stora
blog.spikeseed.cloud/mount-s3-

On macOS, cloudmount can already mount S3 into Finder, but I don't know whether that only amounts to a shortcut or whether it can really be used as a path
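If the goal is a mount that behaves like a real path, rclone mount is another way to test it (on macOS it needs macFUSE installed first). A rough sketch, with the remote, bucket, and mount point all assumed; whether immich tolerates the latency of a mounted bucket is exactly the open question here:

# mount a B2 bucket so it appears as an ordinary directory (remote, bucket, and path are assumptions)
rclone mount b2:immich-library ~/immich-upload --vfs-cache-mode writes --daemon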

#s3 #immich #backblaze #b2

KipJayChou :debian: :docker:admin@mstdn.feddit.social
2025-12-13

Just finished a first pass at backing up Mastodon :neofinder:
Followed 蜗牛哥's guide at eallion.com/mastodon-backup, plus some pointers from GPT
Now that I understand the basic backup flow, the next step is to turn it into a script and schedule it as a recurring job

First, configure rclone to connect to Backblaze b2
rclone config
For convenience I deleted the Application Keys and regenerated the Master Application Key

Apart from the storage type, account, and key, rclone needs no extra configuration
~/.config/rclone/rclone.conf
[b2]
type = b2
account = $keyID$
key = $applicationKey$

rclone lsd b2:
List the buckets and confirm everything looks right

Basic backup flow:
back up a few directories/files + one database → pack them into a timestamped tar.gz → push to multiple clouds with rclone → clean up locally

Files I need to back up:
compose.yaml
Dockerfile
.env.production
docker-compose.override.yml
mastodon.dump (the database)

docker exec mastodon_dev_db \
pg_dump -Fc -U postgres postgres > mastodon.dump

Directories I need to back up:
/home/jay/docker/social/mastodon/main/overrides

Directories that don't need backing up:
/home/jay/docker/social/mastodon/main/public (my media is all on R2; nothing stored locally)
/home/jay/docker/social/mastodon/main/elasticsearch (the index can be rebuilt)
/home/jay/docker/social/mastodon/main/redis (not an authoritative data source)

Next, upload to backblaze
ts=$(date +"%Y%m%d-%H%M%S") (create a timestamp)
tar -czf mastodon-dev-backup-${ts}.tar.gz \
compose.yaml \
Dockerfile \
.env.production \
docker-compose.override.yml \
mastodon.dump \
overrides
(pack everything into a tar.gz)

rclone copy mastodon-dev-backup-${ts}.tar.gz b2:Hostdzire-SFO-Backup/ (upload to the Hostdzire-SFO-Backup bucket on Backblaze)

Verify the upload:
rclone ls b2:Hostdzire-SFO-Backup | tail (list the most recent files)

Delete the temporary files and the local backup:
rm -f mastodon.dump
rm -f mastodon-dev-backup-${ts}.tar.gz
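Since the plan is to turn this into a script on a timer, here is a rough sketch that strings the steps above together; the paths, container name, and bucket come from the post, but treat the whole thing as an untested template:

#!/usr/bin/env bash
# mastodon-backup.sh: dump the DB, tar the configs, push to B2, clean up (sketch only)
set -euo pipefail
cd /home/jay/docker/social/mastodon/main   # assuming this is the project root

ts=$(date +"%Y%m%d-%H%M%S")
docker exec mastodon_dev_db pg_dump -Fc -U postgres postgres > mastodon.dump
tar -czf "mastodon-dev-backup-${ts}.tar.gz" \
  compose.yaml Dockerfile .env.production docker-compose.override.yml \
  mastodon.dump overrides
rclone copy "mastodon-dev-backup-${ts}.tar.gz" b2:Hostdzire-SFO-Backup/
rm -f mastodon.dump "mastodon-dev-backup-${ts}.tar.gz"

A crontab entry like 0 3 * * * /home/jay/bin/mastodon-backup.sh (the path is just an example) would then run it nightly.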

#b2 #backblaze #rclone #backup #备份 #docker #tar #打包 #mastodon

backblaze b2
KipJayChou :debian: :docker:admin@mstdn.feddit.social
2025-12-12

A really good blog post on object storage options both inside and outside China
It makes my storage buckets spin :bilibili_049:
So my choice stays:
* cloudflare r2 for high-traffic social media storage (gts, mstdn, etc.)
* backblaze b2 for personal backup data and photo albums

blog.sotkg.com/2025/12/oss-pre

#object_storage #cloudflare #r2 #backblaze #b2 #对象存储

2025-12-07

Yo Fedi!

Is there anyone here who uses Backblaze with Linux? Do you use the CLI client and scripts, or a third-party tool from this list at the link? Infodump on me please.

backblaze.com/cloud-storage/in

#backup #backblaze #linux #LinuxBackup

:idle: Don T3rr0r :antifa:t3rr0rz0n3@xarxa.cloud
2025-12-04

:aw_yeah: @homeassistant backup to #BackBlaze ♥️

Home Assistant backup to BackBlaze
KipJayChou[maintenance mode]jay@gts.feddit.social
2025-12-03

GoToSocial advanced configuration: migrating data to Backblaze B2 (S3 storage)

Notes for my own reference

Also: raise VIDEO_MAX_SIZE to 40MB

GTS_MEDIA_VIDEO_MAX_SIZE: "41943040"
From 20MB to 40MB: with images and videos stored on S3, the video size limit can be relaxed a bit.
Since Mastodon usually defaults to a 40MB maximum, oversized videos may be rejected during federation
Larger videos will need to be hosted on https://video.feddit.social (under construction)

1. Configure backblaze B2

If you don't have an overseas credit card, see this guide: https://linux.do/t/topic/1093338

Create a bucket
bucketName: gotosocial-s3-media
Type: Private

Create an Application Key
keyName: gts-s3
bucketName: gotosocial-s3-media
KeyID: XXX
APPLICATION_KEY: XXX

2. Set up the MinIO mc client

curl -O https://dl.min.io/client/mc/release/linux-amd64/mc
sudo chmod +x mc
sudo mv mc /usr/local/bin/
mc -v
mc alias set b2-gts https://Endpoint ACCESS_KEY APPLICATION_KEY

3. Migrate local data to the backblaze bucket

docker compose down
Stop the stack so no new files land while the data is being migrated

mc mirror --exclude "sqlite.db*" ~/gotosocial/data b2-gts/gotosocial-s3-media/

Back up the data directory
cp -r ~/gotosocial/data ~/gotosocial/data_backup
Edit docker-compose.yaml
mv docker-compose.yaml docker-compose.yaml.bak
nano docker-compose.yaml
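The post doesn't show what actually changes inside docker-compose.yaml; the key part is switching GoToSocial's storage backend to S3. A rough sketch of the environment block, with the endpoint and key values as placeholders; double-check the GoToSocial storage docs for the exact variable names on your version:

environment:
  # switch media storage from local disk to the B2 bucket (values below are placeholders)
  GTS_STORAGE_BACKEND: "s3"
  GTS_STORAGE_S3_ENDPOINT: "s3.<region>.backblazeb2.com"
  GTS_STORAGE_S3_ACCESS_KEY: "XXX"
  GTS_STORAGE_S3_SECRET_KEY: "XXX"
  GTS_STORAGE_S3_BUCKET: "gotosocial-s3-media"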

Start the server
docker compose up -d

4. Verify S3 is working (local file count does not grow)

find ~/gotosocial/data -type f | wc -l
# upload a few files
find ~/gotosocial/data -type f | wc -l
# the count stays the same

Verify S3 is working (remote file count grows)

mc find b2-gts/gotosocial-s3-media --name "*" | wc -l
# or open Browse Files in the backblaze console, go into gotosocial-s3-media, select all, and check whether the file count has grown

5. Check the web client

If it loads normally, the migration succeeded, and everything inside ~/gotosocial/data except the sqlite-db-* files/folders can be deleted

#gotosocial #peertube #s3 #backblaze #url #torrent #sysadmin #admin #fediverse #mastodon #socialmedia #linux #docker
