Mac mini large file scan completed with cleanup candidates identified

Source type: obs · Harvested: 2026-05-02 · Original date: 2026-05-02T01:01:51.004Z Metadata: {"project":"Desktop","type":"discovery","obs_id":64844}


obs/64844 · discovery · 2026-05-02T01:01:51.004Z

Mac mini large file scan completed with cleanup candidates identified

The automation completed a remote disk scan of mac-mini over SSH and identified 8 files exceeding the 2GB threshold. The largest storage consumers are AI model files: Ollama models in ~/.ollama/models/blobs (roughly 34GB across 4 blobs) and a quantized Gemma-4-26b model in ~/.omlx/models (15.6GB across 3 shards). These model files were last modified in early August 2024, making them roughly 21 months old at scan time. A 4.27GB Chrome on-device model cache (weights.bin) from March 2024 was also identified. This data feeds into the automation's cleanup recommendations, particularly for unused or outdated AI models that can be re-downloaded if needed.
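A minimal local sketch of the scan step. The actual remote command is not recorded, so the host name, flags, and the scaled-down demo threshold here are assumptions; only the tool (find over SSH) and the 2GB cutoff are stated in the note.

```shell
# Real run was roughly: ssh mac-mini 'find /Users/lunhsiangyuan -type f -size +2G'
# Here we exercise the same size predicate locally on a throwaway directory,
# with +2k standing in for +2G so the demo runs instantly.
tmp=$(mktemp -d)
# Create a 3 KB stand-in for a "large" file.
dd if=/dev/zero of="$tmp/big.bin" bs=1024 count=3 2>/dev/null
found=$(find "$tmp" -type f -size +2k)
echo "$found"
rm -rf "$tmp"
```

Note that find's `-size +N` means strictly greater than N units, so a file exactly at the threshold is not reported.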

Concepts: ["how-it-works","what-changed"]

Facts: ["Largest file on mac-mini is Ollama model blob sha256-7121… at 17.98GB modified 2024-08-05","Three additional Ollama model blobs found: 6.14GB, 5.97GB, and 4.68GB","Gemma-4-26b-a4b-it-4bit model consists of three safetensors files totaling 15.6GB modified 2024-08-06","Chrome OptGuideOnDeviceModel weights.bin is 4.27GB modified 2024-03-27","SSH-based find command successfully scanned entire /Users/lunhsiangyuan tree on remote mac-mini","All identified large files are in cache/model directories: ~/.ollama/models/blobs, ~/.omlx/models, ~/Library/Application Support/Google/Chrome"]
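The aggregation behind the cleanup recommendations can be sketched as follows. Sizes and dates come from the facts above; file names beyond the recorded sha256-7121... prefix are placeholders, and the equal Gemma per-shard split is an assumption (only the 15.6GB total across 3 shards is recorded).

```python
from collections import defaultdict
from datetime import date

# Scan hits as (path, size_gb, modified). Values taken from this note's facts;
# placeholder names and the assumed equal shard split are flagged above.
hits = [
    ("~/.ollama/models/blobs/sha256-7121...", 17.98, date(2024, 8, 5)),
    ("~/.ollama/models/blobs/blob-b", 6.14, date(2024, 8, 5)),
    ("~/.ollama/models/blobs/blob-c", 5.97, date(2024, 8, 5)),
    ("~/.ollama/models/blobs/blob-d", 4.68, date(2024, 8, 5)),
    ("~/.omlx/models/gemma-4-26b-a4b-it-4bit/shard-1.safetensors", 5.2, date(2024, 8, 6)),
    ("~/.omlx/models/gemma-4-26b-a4b-it-4bit/shard-2.safetensors", 5.2, date(2024, 8, 6)),
    ("~/.omlx/models/gemma-4-26b-a4b-it-4bit/shard-3.safetensors", 5.2, date(2024, 8, 6)),
    ("~/Library/Application Support/Google/Chrome/OptGuideOnDeviceModel/weights.bin",
     4.27, date(2024, 3, 27)),
]

def cleanup_candidates(hits, threshold_gb=2.0):
    """Group files at or above the size threshold by parent directory,
    largest aggregate first."""
    totals = defaultdict(float)
    for path, size_gb, _modified in hits:
        if size_gb >= threshold_gb:
            totals[path.rsplit("/", 1)[0]] += size_gb
    return sorted(totals.items(), key=lambda kv: kv[1], reverse=True)

for directory, total in cleanup_candidates(hits):
    print(f"{total:6.2f} GB  {directory}")
```

Grouping by parent directory surfaces the three cache locations named in the facts, with the Ollama blob directory dominating at roughly 34.8GB.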



[← Back to Alfred Brain Hub]