How to Convert YUV_420_888 to Bitmap on Samsung Galaxy S7 (Camera2): Understanding Plane Structure, Pixel Stride & Padding

Mobile camera development often requires working with raw image data, and Android’s Camera2 API is a powerful tool for this. However, Camera2 returns images in the YUV_420_888 format by default, which is not directly displayable as a Bitmap (Android’s preferred format for images). Converting YUV_420_888 to Bitmap can be tricky, especially on devices with unique hardware quirks—like the Samsung Galaxy S7.

The Samsung Galaxy S7, while popular, has specific memory layout behaviors (e.g., padding in image planes) that can lead to distorted images if not handled correctly. This blog demystifies the YUV_420_888 format, breaks down its plane structure, explains critical concepts like pixel stride and row stride, and provides a step-by-step guide to converting it to a Bitmap on the S7.

Table of Contents#

  1. Understanding YUV_420_888 in Camera2
    • 1.1 What is YUV_420_888?
    • 1.2 Why YUV Instead of RGB?
  2. Key Concepts: Planes, Strides, and Padding
    • 2.1 Plane Structure in YUV_420_888
    • 2.2 Pixel Stride vs. Row Stride
    • 2.3 What is Padding and Why It Matters?
  3. Samsung Galaxy S7 Specific Considerations
    • 3.1 S7’s YUV_420_888 Memory Layout
    • 3.2 Common Pitfalls (e.g., Swapped U/V Planes)
  4. Step-by-Step Conversion: YUV_420_888 to Bitmap
    • 4.1 Access Image Data from Camera2
    • 4.2 Extract Planes, Strides, and Buffers
    • 4.3 Manual Conversion with Stride/Padding Handling
    • 4.4 Optimized Conversion with RenderScript
  5. Debugging Tips for S7
  6. Conclusion
  7. References

1. Understanding YUV_420_888 in Camera2#

1.1 What is YUV_420_888?#

YUV_420_888 is a flexible raw image format used by Android’s Camera2 API. Unlike RGB (which stores color channels directly), YUV separates luminance (Y) (brightness) from chrominance (U and V) (color information). This separation reduces bandwidth and storage, making it ideal for camera sensors.

The “420” refers to chrominance subsampling:

  • Y: Full resolution (every pixel has its own luma sample).
  • U/V: Half resolution in both width and height (one U sample and one V sample per 2x2 block of four Y samples).

The “888” indicates each channel (Y, U, V) uses 8 bits per sample.

1.2 Why YUV Instead of RGB?#

  • Efficiency: 4:2:0 YUV needs 12 bits per pixel versus 24 for RGB, i.e. half the data, with little perceived quality loss (the eye is less sensitive to color detail than to brightness).
  • Sensor Compatibility: Most camera sensors output YUV natively, avoiding costly on-sensor RGB conversion.
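To make the savings concrete, here is the byte arithmetic for a single 1920x1080 frame (plain numbers, no Android APIs involved):

```java
// Raw byte counts for one 1920x1080 frame, 8 bits per sample
int width = 1920, height = 1080;
int rgbBytes = width * height * 3;            // 24-bit RGB: 3 bytes per pixel
int ySize = width * height;                   // full-resolution luma
int uvSize = (width / 2) * (height / 2) * 2;  // U + V, each subsampled 2x2
int yuvBytes = ySize + uvSize;                // 1.5 bytes per pixel
// yuvBytes (3,110,400) is exactly half of rgbBytes (6,220,800)
```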

2. Key Concepts: Planes, Strides, and Padding#

2.1 Plane Structure in YUV_420_888#

A YUV_420_888 image is divided into 3 separate planes (buffers) for Y, U, and V:

| Plane | Purpose                       | Resolution                     | Pixel Format     |
|-------|-------------------------------|--------------------------------|------------------|
| Y     | Luminance (brightness)        | Full width/height              | 8-bit grayscale  |
| U     | Chrominance (blue-difference) | ½ width, ½ height (subsampled) | 8-bit color      |
| V     | Chrominance (red-difference)  | ½ width, ½ height (subsampled) | 8-bit color      |

Example: For a 1920x1080 image:

  • Y plane: 1920x1080
  • U/V planes: 960x540 (each).

2.2 Pixel Stride vs. Row Stride#

To store image data efficiently in memory, hardware often aligns each row to a larger memory boundary (e.g., 16-, 32-, or 64-byte alignment for faster access). This introduces two critical parameters:

Pixel Stride#

The number of bytes between consecutive pixels in a row. For YUV_420_888:

  • Y Plane: Pixel stride is always 1 (each Y sample is 1 byte).
  • U/V Planes: Pixel stride may be 1 (fully planar layout) or 2 (semi-planar layout, where the U and V bytes are interleaved in memory). The format guarantees neither value, so always query Plane.getPixelStride() at runtime instead of assuming.

Example: A pixel stride of 1 means adjacent samples in a row are consecutive in memory; a pixel stride of 2 means every other byte belongs to the other chroma channel.
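The difference matters when indexing: with a pixel stride of 2, consecutive samples of the same chroma channel sit two bytes apart. A toy example with made-up byte values:

```java
// Semi-planar chroma: the U plane's buffer view interleaves U and V bytes.
// The values below are illustrative, not from a real device.
byte[] uPlaneBytes = {10, 20, 11, 21, 12, 22};  // U0, V0, U1, V1, U2, V2
int pixelStride = 2;
int u0 = uPlaneBytes[0 * pixelStride] & 0xFF;   // first U sample: 10
int u1 = uPlaneBytes[1 * pixelStride] & 0xFF;   // second U sample: 11
```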

Row Stride#

The total number of bytes per row, including padding (extra bytes at the end of a row added to meet alignment requirements). Row stride is always ≥ width × pixel stride.

Formula:
Row Stride = (Width × Pixel Stride) + Padding

Example: A Y plane with width 1920 might report a row stride of 1936, i.e. 16 bytes of padding per row. The exact value is device-dependent, so always read it from Plane.getRowStride().

2.3 What is Padding and Why It Matters?#

Padding is the extra bytes at the end of a row (calculated as Row Stride − Width × Pixel Stride). Ignoring padding corrupts images:

  • If you advance by width bytes per row instead of the row stride, every row after the first starts too early: padding bytes get read as if they were pixels, and each row shifts a little further, producing the classic skewed, horizontally distorted image.
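The fix is always to advance through the buffer by the row stride while copying only width bytes per row. A self-contained sketch with a tiny simulated plane (4x2 pixels, row stride 6):

```java
// Simulated padded plane: width 4, height 2, row stride 6 (2 padding bytes per row)
int width = 4, height = 2, rowStride = 6;
byte[] padded = {
    1, 2, 3, 4, 0, 0,   // row 0 + padding
    5, 6, 7, 8, 0, 0    // row 1 + padding
};
byte[] tight = new byte[width * height];
for (int row = 0; row < height; row++) {
    // Index the source with rowStride, but copy only `width` bytes per row
    System.arraycopy(padded, row * rowStride, tight, row * width, width);
}
// tight now holds {1, 2, 3, 4, 5, 6, 7, 8} with no padding
```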

3. Samsung Galaxy S7 Specific Considerations#

The Samsung Galaxy S7 (and many Samsung devices) has unique behaviors when outputting YUV_420_888 via Camera2. These quirks are critical to handle for accurate conversions:

3.1 S7’s YUV_420_888 Memory Layout#

  • Row Stride Padding: The S7 often pads rows of the Y, U, and V planes to satisfy its hardware alignment requirements. For example:
    • A 1920x1080 Y plane may report a row stride of 1936 (1936 − 1920 = 16 bytes of padding per row).
    • U/V planes (960x540) may report a row stride of 960 (no padding) or 976 (16 bytes of padding).
  • Subsampling Consistency: U and V planes are reliably subsampled to ½ width/height (no surprises here).
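Plugging the S7 example figures above into the stride formula from Section 2.2 (both numbers are illustrative; always query them at runtime):

```java
// Padding per row = rowStride - width * pixelStride
int width = 1920;
int pixelStride = 1;   // example value; read Plane.getPixelStride() in practice
int rowStride = 1936;  // example value; read Plane.getRowStride() in practice
int padding = rowStride - width * pixelStride;  // 16 bytes of padding per row
```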

3.2 Common Pitfalls#

  • Swapped U/V Planes: Some S7 models return the chroma planes in swapped order (planes[1] holds V and planes[2] holds U, instead of U then V). This inverts colors (red and blue exchange).
  • Hidden Padding: Even if the row stride equals the image width, always verify—S7 may add padding for certain resolutions (e.g., 1280x720).

4. Step-by-Step Conversion: YUV_420_888 to Bitmap#

4.1 Access Image Data from Camera2#

First, capture a YUV_420_888 image via Camera2’s ImageReader. In your ImageAvailableListener:

ImageReader reader = ImageReader.newInstance(width, height, ImageFormat.YUV_420_888, 2);
reader.setOnImageAvailableListener(imageReader -> {
    try (Image image = imageReader.acquireNextImage()) {
        if (image != null) {
            Bitmap bitmap = convertYuvToBitmap(image); // Implement this!
            // Use the bitmap (e.g., display, save)
        }
    } catch (Exception e) {
        e.printStackTrace();
    }
}, backgroundHandler);

4.2 Extract Planes, Strides, and Buffers#

A YUV_420_888 Image object contains 3 planes. Extract their buffers, row strides, and pixel strides:

private Bitmap convertYuvToBitmap(Image image) {
    Image.Plane[] planes = image.getPlanes();
    int width = image.getWidth();
    int height = image.getHeight();
 
    // Extract Y, U, V planes
    Image.Plane yPlane = planes[0];
    Image.Plane uPlane = planes[1];
    Image.Plane vPlane = planes[2];
 
    // Get buffers, row strides, and pixel strides
    ByteBuffer yBuffer = yPlane.getBuffer();
    ByteBuffer uBuffer = uPlane.getBuffer();
    ByteBuffer vBuffer = vPlane.getBuffer();
 
    int rowStrideY = yPlane.getRowStride();
    int rowStrideU = uPlane.getRowStride();
    int rowStrideV = vPlane.getRowStride();
 
    int pixelStrideY = yPlane.getPixelStride(); // Always 1 for Y
    int pixelStrideU = uPlane.getPixelStride(); // 1 (planar) or 2 (semi-planar)
    int pixelStrideV = vPlane.getPixelStride(); // 1 (planar) or 2 (semi-planar)
    ...
}

4.3 Manual Conversion with Stride/Padding Handling#

To convert manually, iterate over each pixel, sample Y, U, V values (accounting for strides/padding), and convert to RGB.

Step 1: Allocate Pixels Array#

Bitmap pixels are stored as int[] (ARGB format). Allocate an array for the image:

int[] pixels = new int[width * height];

Step 2: Iterate Over Pixels and Sample Y, U, V#

For each pixel (x, y) in the output Bitmap:

  • Y: Sampled from the Y plane (full resolution).
  • U/V: Sampled from U/V planes (subsampled to x/2, y/2).

Account for padding by using rowStride (not width) to index into the buffers:

for (int y = 0; y < height; y++) {
    for (int x = 0; x < width; x++) {
        // Sample Y (luminance)
        int yIndex = y * rowStrideY + x * pixelStrideY;
        byte yByte = yBuffer.get(yIndex);
        int Y = yByte & 0xFF; // Convert to unsigned (0-255)
 
        // Sample U and V (chrominance, subsampled)
        int uvX = x / 2;
        int uvY = y / 2;
 
        int uIndex = uvY * rowStrideU + uvX * pixelStrideU;
        byte uByte = uBuffer.get(uIndex);
        int U = uByte & 0xFF;
 
        int vIndex = uvY * rowStrideV + uvX * pixelStrideV;
        byte vByte = vBuffer.get(vIndex);
        int V = vByte & 0xFF;
        ...
    }
}

Step 3: Convert YUV to RGB#

Use the standard YUV-to-RGB conversion formula. Note: Adjust for S7’s U/V swap if colors are inverted!

// Convert YUV to RGB (full-range BT.601 coefficients)
// R = Y + 1.402*(V-128), G = Y - 0.34414*(U-128) - 0.71414*(V-128), B = Y + 1.772*(U-128)
int R = (int) (Y + 1.402 * (V - 128));
int G = (int) (Y - 0.34414 * (U - 128) - 0.71414 * (V - 128));
int B = (int) (Y + 1.772 * (U - 128));
 
// Clamp values to 0-255 (prevent overflow)
R = Math.max(0, Math.min(255, R));
G = Math.max(0, Math.min(255, G));
B = Math.max(0, Math.min(255, B));
 
// Store as ARGB (alpha = 255 for opaque)
pixels[y * width + x] = 0xFF000000 | (R << 16) | (G << 8) | B;
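A quick sanity check on the formula above: a neutral chroma sample (U = V = 128) must leave the pixel gray, with R = G = B = Y.

```java
// Neutral chroma: all three (V - 128) / (U - 128) terms vanish
int Y = 128, U = 128, V = 128;
int R = (int) (Y + 1.402 * (V - 128));
int G = (int) (Y - 0.34414 * (U - 128) - 0.71414 * (V - 128));
int B = (int) (Y + 1.772 * (U - 128));
// R = G = B = 128 (mid-gray)
```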

Step 4: Create Bitmap#

Finally, convert the pixels array to a Bitmap:

Bitmap bitmap = Bitmap.createBitmap(width, height, Bitmap.Config.ARGB_8888);
bitmap.setPixels(pixels, 0, width, 0, 0, width, height);
return bitmap;

4.4 Optimized Conversion with RenderScript#

Manual conversion is slow for large images. Android’s RenderScript offers hardware-accelerated conversion instead (RenderScript is deprecated from Android 12 onward, but remains available on the Android versions the S7 runs). However, ScriptIntrinsicYuvToRGB requires NV21 format (a specific semi-planar YUV variant). To use it:

Step 1: Convert YUV_420_888 to NV21#

NV21 is a semi-planar format (a full Y plane followed by an interleaved VU plane). Merge the U and V planes into a single interleaved buffer, honoring each plane’s row and pixel strides so that padding and semi-planar layouts are both handled correctly:

private byte[] yuvToNv21(Image image) {
    Image.Plane[] planes = image.getPlanes();
    int width = image.getWidth();
    int height = image.getHeight();
    byte[] nv21 = new byte[width * height * 3 / 2];

    // Copy Y row by row, skipping any padding at the end of each row
    ByteBuffer yBuffer = planes[0].getBuffer();
    int yRowStride = planes[0].getRowStride();
    int pos = 0;
    for (int row = 0; row < height; row++) {
        yBuffer.position(row * yRowStride);
        yBuffer.get(nv21, pos, width);
        pos += width;
    }

    // Interleave V and U (NV21 uses VU order), honoring row and pixel strides.
    // Swap uBuffer and vBuffer here if colors come out inverted on your S7!
    ByteBuffer uBuffer = planes[1].getBuffer();
    ByteBuffer vBuffer = planes[2].getBuffer();
    int uvRowStride = planes[1].getRowStride();
    int uvPixelStride = planes[1].getPixelStride(); // 1 (planar) or 2 (semi-planar)
    for (int row = 0; row < height / 2; row++) {
        for (int col = 0; col < width / 2; col++) {
            int uvIndex = row * uvRowStride + col * uvPixelStride;
            nv21[pos++] = vBuffer.get(uvIndex); // V first
            nv21[pos++] = uBuffer.get(uvIndex); // U second
        }
    }
    return nv21;
}

Step 2: Use RenderScript to Convert NV21 to Bitmap#

private Bitmap renderScriptNv21ToBitmap(Context context, byte[] nv21, int width, int height) {
    RenderScript rs = RenderScript.create(context);
    ScriptIntrinsicYuvToRGB yuvToRgb = ScriptIntrinsicYuvToRGB.create(rs, Element.U8_4(rs));
 
    // Create input allocation (NV21 format)
    Type.Builder yuvType = new Type.Builder(rs, Element.U8(rs)).setX(nv21.length);
    Allocation inAlloc = Allocation.createTyped(rs, yuvType.create(), Allocation.USAGE_SCRIPT);
 
    // Create output allocation (Bitmap)
    Bitmap bitmap = Bitmap.createBitmap(width, height, Bitmap.Config.ARGB_8888);
    Allocation outAlloc = Allocation.createFromBitmap(rs, bitmap);
 
    // Copy NV21 data and process
    inAlloc.copyFrom(nv21);
    yuvToRgb.setInput(inAlloc);
    yuvToRgb.forEach(outAlloc);
    outAlloc.copyTo(bitmap);
 
    // Cleanup (in production, cache the RenderScript context instead of recreating it per frame)
    rs.destroy();
    return bitmap;
}

5. Debugging Tips for S7#

  • Log Strides/Padding: Print row strides for Y, U, V to detect padding:
    Log.d("YUV_DEBUG", "Y Stride: " + rowStrideY + ", U Stride: " + rowStrideU + ", V Stride: " + rowStrideV);
  • Test U/V Swap: If colors are inverted, swap U and V in the conversion formula or NV21 interleaving step.
  • Visual Inspection: Capture a test image (e.g., a red object) and check if the bitmap shows red (correct) or blue (swapped U/V).

6. Conclusion#

Converting YUV_420_888 to Bitmap on the Samsung Galaxy S7 requires careful handling of plane structure, row strides, and padding. By understanding how Y, U, and V planes are stored, accounting for S7-specific quirks (e.g., padding, U/V swaps), and using optimized tools like RenderScript, you can reliably convert raw camera data to displayable bitmaps.

Always test with your target S7 model—hardware variations exist!

7. References#