How to Fix Git 'Out of Memory - malloc Failed' Error on Windows When Pulling/Pushing Large Files

If you’ve ever tried to pull from or push to a Git repository containing large files on Windows, you might have encountered the frustrating error: “Out of memory - malloc failed”. This error occurs when Git cannot allocate enough memory to process the large files, leaving you stuck in a loop of failed operations.

Git, by default, is not optimized for handling extremely large binary files (e.g., videos, high-res images, or large datasets) because it loads these files into memory during operations like pull, push, or clone. On Windows, this issue is often exacerbated by default memory limits, 32-bit Git installations, or inefficient handling of large pack files.

In this blog, we’ll break down why this error happens, walk through pre-solution checks, and provide step-by-step fixes to resolve it—including long-term solutions like Git LFS (Large File Storage) and short-term workarounds. By the end, you’ll be able to push/pull large files without memory issues.

Table of Contents#

  1. Understanding the 'Out of Memory - malloc Failed' Error
  2. Common Causes on Windows
  3. Pre-Solution Checks
  4. Solutions to Fix the Error
  5. Prevention Tips
  6. Conclusion
  7. References

1. Understanding the 'Out of Memory - malloc Failed' Error#

The error message “Out of memory - malloc failed” originates from Git’s inability to allocate memory using the malloc (memory allocation) function. Git relies on memory to process files during operations like:

  • Reading/writing large files during push or pull.
  • Packing loose objects into compressed “pack files” (Git’s way of storing data efficiently).
  • Validating or checking out large files from the repository.

When Git tries to load a large file or process a massive pack file, it requests memory from your system. If your system lacks available RAM or Git’s internal memory limits are too low, malloc fails, triggering the error.

2. Common Causes on Windows#

This error is particularly common on Windows due to several factors:

  • 32-bit Git Installation: 32-bit Git can only address up to 4GB of memory (often less in practice), making it prone to memory exhaustion with large files.
  • Default Git Memory Limits: Git has conservative default settings for memory usage during packing (e.g., pack.windowMemory), which may be too low for large repos.
  • Unoptimized Large Files: Binary files (e.g., .zip, .exe, .psd) are stored as-is in Git, leading to bloated pack files that strain memory.
  • Insufficient System RAM: If your PC has limited RAM (e.g., 4GB or less), other apps may consume memory, leaving little for Git.
  • Corrupt Git Objects: Damaged or corrupted objects in the repo can force Git to use excessive memory while trying to repair them.
  • Outdated Git Versions: Older Git releases may have memory leaks or inefficient packing logic.

3. Pre-Solution Checks#

Before diving into fixes, run these checks to narrow down the cause:

Check 1: Verify Git Architecture (32-bit vs. 64-bit)#

Open Git Bash or Command Prompt and run:

git --version  

Look for 64-bit in the output (e.g., git version 2.45.0.windows.1 (64-bit)). If it says 32-bit, switch to 64-bit Git (see Fix 1 below)—this is often the root cause.

Check 2: Identify Large Files in the Repo#

Use Git to list large files in your repo:

git ls-tree -r -t -l --full-name HEAD  

Sort by size (add | sort -nrk 4 to the command) to find files >100MB—these are likely culprits.
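Putting the check together, here's a one-liner sketch (run in Git Bash) that lists the ten largest blobs in HEAD; the -t flag is dropped so trees, which have no size, don't clutter the output:

```shell
# List the 10 largest files tracked in HEAD (size is the 4th column, in bytes)
git ls-tree -r -l --full-name HEAD | sort -nrk 4 | head -10
```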

Check 3: Verify Available System Memory#

Open Task Manager (Ctrl+Shift+Esc) → Go to the Performance tab → Check “Available” memory. If available RAM is <1GB, close other apps (e.g., browsers, video editors) to free up space.

Check 4: Scan for Corrupt Objects#

Run Git’s built-in integrity check:

git fsck --full  

If you see errors like corrupt object or dangling blob, corrupt objects may be the issue (see Fix 8).

4. Solutions to Fix the Error#

1. Switch to 64-bit Git#

If you’re using 32-bit Git, upgrading to 64-bit is critical. 64-bit Git can address far more memory, reducing the risk of allocation failures.

Steps:

  1. Uninstall your current Git via Settings → Apps → Apps & features → Git → Uninstall.
  2. Download the 64-bit Git installer from the official Git for Windows page.
  3. Run the installer, ensuring “64-bit” is selected (default for modern systems).
  4. Verify with git --version (should show 64-bit).

2. Tune Git’s Memory Limits#

Git’s packing behavior is controlled by configuration settings. On 64-bit Git several of these default to “unlimited”, so the practical fix is to set explicit caps that keep Git’s allocations within what your system can actually supply; on low-RAM machines that often means lowering these values, not raising them.

Key Settings:

  • pack.windowMemory: Per-thread cap on memory used for delta windows during packing (default: 0, i.e., unlimited).
  • pack.packSizeLimit: Maximum size of a single pack file (default: unlimited; huge packs strain memory).
  • core.packedGitWindowSize: Size of the memory-mapped windows Git uses when reading pack files.

Steps to Adjust:
Open Git Bash and run these commands to set caps (tune the values to your RAM; on low-memory machines values as small as 100m are common):

# Cap memory used during packing (global settings)  
git config --global pack.windowMemory "2g"  
git config --global pack.packSizeLimit "2g"  
git config --global core.packedGitWindowSize "1g"  
# Optional: pack with a single thread so the cap applies once, not per thread  
git config --global pack.threads "1"  

Notes:

  • Use --local instead of --global to apply settings only to the current repo.
  • Keep values well below your available RAM, and remember that pack.windowMemory applies per packing thread (e.g., with 8GB RAM, stay at or below 4g in total).
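To see which of these keys are explicitly set (versus falling back to Git’s built-in defaults), `git config --get` prints the value, or nothing at all (with a non-zero exit code) when the key is unset:

```shell
# Inspect the pack limits currently in effect (empty output = unset/default)
git config --get pack.windowMemory
git config --get pack.packSizeLimit
git config --get core.packedGitWindowSize
```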

3. Use Git LFS (Large File Storage)#

Git LFS replaces large binary files with tiny pointer files, storing the actual files on a remote server (e.g., GitHub, GitLab). This eliminates bloated pack files and reduces memory usage.

Steps to Set Up Git LFS:

Step 1: Install Git LFS#

Download and run the Git LFS installer from git-lfs.com (recent Git for Windows builds also bundle it), or install it with a package manager such as Chocolatey:

choco install git-lfs -y  

Verify the client is available with git lfs version.

Step 2: Initialize LFS in Your Repo#

git lfs install  

Step 3: Track Large Files#

Tell LFS to manage specific file types (e.g., .psd, .zip, .mp4):

# Track all .zip files  
git lfs track "*.zip"  
# Track specific large files (e.g., a 2GB backup)  
git lfs track "backup_20240101.iso"  

Step 4: Commit the LFS Tracking Rule#

LFS stores tracking rules in .gitattributes. Add and commit it:

git add .gitattributes  
git commit -m "Track large files with Git LFS"  

Step 5: Migrate Existing Large Files (If Error Occurs During Push)#

If you already committed large files (causing the error), migrate them to LFS:

# Replace "*.zip" with your file pattern  
git lfs migrate import --include="*.zip" --everything  

This rewrites history to replace large files with LFS pointers. Warning: rewriting published history breaks existing clones, so coordinate with collaborators first; after the migration you must force-push, and everyone else should re-clone.
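A quick sketch for checking the migration result and publishing the rewritten history (git lfs ls-files requires the LFS client from Step 1):

```shell
# Confirm which files are now stored as LFS pointers
git lfs ls-files
# The migration rewrote history, so the remote needs a force push
git push --force --all
```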

4. Reduce Pack Size with git repack#

Git’s pack files can grow very large, causing memory issues. Use git repack to split large packs into smaller, more manageable ones.

Run this command in Git Bash:

git repack -a -d --window=0 --depth=0  

What it does:

  • -a: Repack all objects (not just loose ones).
  • -d: Remove old pack files after repacking.
  • --window=0/--depth=0: Disable delta compression (reduces memory usage during repacking, at the cost of slightly larger packs).

Note: Disabling delta compression can grow the repo noticeably (deltas are what keep packs small), but it reduces memory strain during push/pull. You can restore compression later with a plain git repack -a -d.
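To gauge the effect, you can inspect pack sizes before and after the repack (assuming a standard .git layout):

```shell
# List pack files by size, largest first
du -h .git/objects/pack/*.pack | sort -hr
```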

5. Free Up System Memory#

If Git is starved for RAM, close memory-heavy apps:

  • Open Task Manager (Ctrl+Shift+Esc).
  • Go to the Processes tab, sort by “Memory” (descending).
  • End tasks for apps you don’t need (e.g., Chrome with 50 tabs, Adobe Premiere, games).

6. Update Git to the Latest Version#

Newer Git releases often include memory optimizations. Update Git via:

Method 1: Git Installer
Download the latest Git for Windows and run the installer (it upgrades your existing installation). On Git for Windows 2.16.1 or later you can instead run git update-git-for-windows from Git Bash.

Method 2: Chocolatey (Package Manager)
If you use Chocolatey, run:

choco upgrade git -y  

7. Clone with Shallow Depth (Temporary Workaround)#

If the error occurs during git clone, use a shallow clone to download only the latest commit (reducing initial data transfer and memory usage):

git clone --depth 1 <your-repo-url>  

Limitations: Shallow clones lack full history. To fetch more history later, run:

git fetch --depth=100  # Fetches the last 100 commits  

Not ideal for long-term use, but useful for urgent clones.
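If you later need the complete history, a shallow clone can be converted into a full one:

```shell
# Fetch all remaining history; the clone stops being shallow
git fetch --unshallow
```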

8. Fix Corrupt Git Objects#

Corrupt objects can make Git misbehave during transfers, including failing allocations while it tries to work around damaged data. To fix:

Step 1: Clean Up Corrupted Data#

Run Git’s garbage collection to remove invalid objects:

git gc --prune=now  

Step 2: Reclone the Repo (If Corruption Persists)#

If git fsck still shows errors, delete the local repo and reclone:

rm -rf <repo-folder>  # WARNING: Deletes local changes!  
git clone <repo-url>  

5. Prevention Tips#

  • Use Git LFS from the Start: Track large binaries (e.g., .exe, .zip, .mp4) with LFS before committing them.
  • Regularly Clean Up Repos: Run git gc periodically to optimize pack files and free memory.
  • Avoid Committing Large Files: Compress files (e.g., .zip) or host them externally (e.g., AWS S3) instead of in Git.
  • Monitor Repo Size: Use tools like git-sizer to track repo bloat.
  • Keep Git Updated: New releases fix memory bugs and improve efficiency.

6. Conclusion#

The “Out of memory - malloc failed” error in Git on Windows is usually caused by large files, low memory limits, or 32-bit Git. The most effective fixes are:

  • Switching to 64-bit Git for better memory addressing.
  • Using Git LFS to offload large files.
  • Tuning Git’s memory limits for packing (capping them on low-RAM machines).

By combining these solutions with prevention tips like regular cleanup and LFS adoption, you can avoid this error and keep your Git workflow smooth.

7. References#