{"id":6215,"date":"2014-04-13T23:30:04","date_gmt":"2014-04-13T23:30:04","guid":{"rendered":"https:\/\/unknownerror.org\/index.php\/2014\/04\/13\/images-to-video-using-mediacodec-and-mediamuxer-collection-of-common-programming-errors\/"},"modified":"2014-04-13T23:30:04","modified_gmt":"2014-04-13T23:30:04","slug":"images-to-video-using-mediacodec-and-mediamuxer-collection-of-common-programming-errors","status":"publish","type":"post","link":"https:\/\/unknownerror.org\/index.php\/2014\/04\/13\/images-to-video-using-mediacodec-and-mediamuxer-collection-of-common-programming-errors\/","title":{"rendered":"Images to Video using MediaCodec and MediaMuxer-Collection of common programming errors"},"content":{"rendered":"<li><img decoding=\"async\" src=\"http:\/\/www.gravatar.com\/avatar\/73b47bf10b09085cff6016b53b398f3a?s=32&amp;d=identicon&amp;r=PG&amp;f=1\" \/><br \/>\nchentc<\/p>\n<p>I have a bunch of local images saved as jpeg files. My images are captured using CameraPreview and the PreviewFormat is as default: NV21. I want to generate a small video from a fixed number of images.<\/p>\n<p><strong>I am not going to use FFMpeg<\/strong> because it requires NDK and will introduce compatibility issues.<\/p>\n<p>MediaCodec and MediaMuxer seems work but there are not one working solutions on the web.<\/p>\n<p>There are a few references lead to my current solution.<\/p>\n<p>1.<strong>EncodeAndMuxTest<\/strong>: http:\/\/bigflake.com\/mediacodec\/EncodeAndMuxTest.java.txt<\/p>\n<p>This one is written by fadden. It quite suits my needs except he is using createInputSurface not queueInputBuffer.<\/p>\n<p><strong>2.Convert bitmap array to YUV (YCbCr NV21)<\/strong><\/p>\n<p>I do the conversion following this answer. http:\/\/stackoverflow.com\/a\/17116985\/3047840<\/p>\n<p><strong>3.Using MediaCodec to save series of images as Video<\/strong><\/p>\n<p>This question looks much similar as mine but I don&#8217;t bother using MediaMuxer.<\/p>\n<p>My code is the following:<\/p>\n<pre><code>public class EncodeAndMux extends Activity {\nprivate static final String TAG = \"EncodeAndMuxTest\";\n\nprivate static final boolean VERBOSE = false;\n\nprivate static final File OUTPUT_DIR = Environment\n        .getExternalStorageDirectory();\n\nprivate static final String MIME_TYPE = \"video\/avc\";\n\nprivate static final int FRAME_RATE = 10;\n\/\/ 10 seconds between I-frames\nprivate static final int IFRAME_INTERVAL = 10;\n\nprivate static final int NUM_FRAMES = 5;\nprivate static final String DEBUG_FILE_NAME_BASE = \"\/sdcard\/test\";\n\/\/ two seconds of video size of a frame, in pixels\nprivate int mWidth = -1;\n\nprivate int mHeight = -1;\n\/\/ bit rate, in bits per second\nprivate int mBitRate = -1;\n\nprivate byte[] mFrame;\n\n\/\/ largest color component delta seen (i.e. actual vs. expected)\nprivate int mLargestColorDelta;\n\/\/ encoder \/ muxer state\nprivate MediaCodec mEncoder;\nprivate MediaMuxer mMuxer;\nprivate int mTrackIndex;\nprivate boolean mMuxerStarted;\nprivate Utils mUtils;\nprivate float mPadding;\nprivate int mColumnWidth;\n\nprivate static final int TEST_Y = 120; \/\/ YUV values for colored rect\nprivate static final int TEST_U = 160;\nprivate static final int TEST_V = 200;\nprivate static final int TEST_R0 = 0; \/\/ RGB equivalent of {0,0,0}\nprivate static final int TEST_G0 = 136;\nprivate static final int TEST_B0 = 0;\nprivate static final int TEST_R1 = 236; \/\/ RGB equivalent of {120,160,200}\nprivate static final int TEST_G1 = 50;\nprivate static final int TEST_B1 = 186;\n\nprivate static final boolean DEBUG_SAVE_FILE = false; \/\/ save copy of\n                                                        \/\/ encoded movie\n\/\/ allocate one of these up front so we don't need to do it every time\nprivate MediaCodec.BufferInfo mBufferInfo;\nprivate ArrayList mImagePaths = new ArrayList();\n\nbyte[] getNV21(int inputWidth, int inputHeight, Bitmap scaled) {\n\n    int[] argb = new int[inputWidth * inputHeight];\n    scaled.getPixels(argb, 0, inputWidth, 0, 0, inputWidth, inputHeight);\n    byte[] yuv = new byte[inputWidth * inputHeight * 3 \/ 2];\n    encodeYUV420SP(yuv, argb, inputWidth, inputHeight);\n    scaled.recycle();\n    return yuv;\n}\n\nvoid encodeYUV420SP(byte[] yuv420sp, int[] argb, int width, int height) {\n    final int frameSize = width * height;\n\n    int yIndex = 0;\n    int uvIndex = frameSize;\n\n    int a, R, G, B, Y, U, V;\n    int index = 0;\n    for (int j = 0; j &lt; height; j++) {\n        for (int i = 0; i &lt; width; i++) {\n\n            a = (argb[index] &amp; 0xff000000) &gt;&gt; 24; \/\/ a is not used obviously\n            R = (argb[index] &amp; 0xff0000) &gt;&gt; 16;\n            G = (argb[index] &amp; 0xff00) &gt;&gt; 8;\n            B = (argb[index] &amp; 0xff) &gt;&gt; 0;\n\n            \/\/ well known RGB to YUV algorithm\n            Y = ((66 * R + 129 * G + 25 * B + 128) &gt;&gt; 8) + 16;\n            U = ((-38 * R - 74 * G + 112 * B + 128) &gt;&gt; 8) + 128;\n            V = ((112 * R - 94 * G - 18 * B + 128) &gt;&gt; 8) + 128;\n\n            \/\/ NV21 has a plane of Y and interleaved planes of VU each\n            \/\/ sampled by a factor of 2\n            \/\/ meaning for every 4 Y pixels there are 1 V and 1 U. Note the\n            \/\/ sampling is every other\n            \/\/ pixel AND every other scanline.\n            yuv420sp[yIndex++] = (byte) ((Y &lt; 0) ? 0\n                    : ((Y &gt; 255) ? 255 : Y));\n            if (j % 2 == 0 &amp;&amp; index % 2 == 0) {\n                yuv420sp[uvIndex++] = (byte) ((V &lt; 0) ? 0\n                        : ((V &gt; 255) ? 255 : V));\n                yuv420sp[uvIndex++] = (byte) ((U &lt; 0) ? 0\n                        : ((U &gt; 255) ? 255 : U));\n            }\n\n            index++;\n        }\n    }\n}\n\npublic static Bitmap decodeFile(String filePath, int WIDTH, int HIGHT) {\n    try {\n\n        File f = new File(filePath);\n\n        BitmapFactory.Options o = new BitmapFactory.Options();\n        o.inJustDecodeBounds = true;\n        o.inPurgeable = true;\n        o.inInputShareable = true;\n        BitmapFactory.decodeStream(new FileInputStream(f), null, o);\n\n        final int REQUIRED_WIDTH = WIDTH;\n        final int REQUIRED_HIGHT = HIGHT;\n        int scale = 1;\n        while (o.outWidth \/ scale \/ 2 &gt;= REQUIRED_WIDTH\n                &amp;&amp; o.outHeight \/ scale \/ 2 &gt;= REQUIRED_HIGHT)\n            scale *= 2;\n        BitmapFactory.Options o2 = new BitmapFactory.Options();\n        o2.inSampleSize = scale;\n        o2.inPurgeable = true;\n        o2.inInputShareable = true;\n        return BitmapFactory.decodeStream(new FileInputStream(f), null, o2);\n    } catch (FileNotFoundException e) {\n        e.printStackTrace();\n    }\n    return null;\n}\n\n@Override\nprotected void onCreate(Bundle savedInstanceState) {\n    super.onCreate(savedInstanceState);\n    setContentView(R.layout.activity_encode_and_mux);\n    mUtils = new Utils(this);\n    mImagePaths = mUtils.getBackFilePaths();\n    mPadding = TypedValue.applyDimension(TypedValue.COMPLEX_UNIT_DIP,\n            AppConstant.GRID_PADDING, getResources().getDisplayMetrics());\n    mColumnWidth = (int) ((mUtils.getScreenWidth() - ((AppConstant.NUM_OF_COLUMNS + 1) * mPadding)) \/ AppConstant.NUM_OF_COLUMNS);\n\n\n    try {\n        testEncodeDecodeVideoFromBufferToSurface720p();\n    } catch (Exception e) {\n        \/\/ TODO Auto-generated catch block\n        e.printStackTrace();\n    } catch (Throwable e) {\n        \/\/ TODO Auto-generated catch block\n        e.printStackTrace();\n    }\n}\n\n\/**\n * Returns the first codec capable of encoding the specified MIME type, or null if no\n * match was found.\n *\/\nprivate static MediaCodecInfo selectCodec(String mimeType) {\n    int numCodecs = MediaCodecList.getCodecCount();\n    for (int i = 0; i &lt; numCodecs; i++) {\n        MediaCodecInfo codecInfo = MediaCodecList.getCodecInfoAt(i);\n\n        if (!codecInfo.isEncoder()) {\n            continue;\n        }\n\n        String[] types = codecInfo.getSupportedTypes();\n        for (int j = 0; j &lt; types.length; j++) {\n            if (types[j].equalsIgnoreCase(mimeType)) {\n                return codecInfo;\n            }\n        }\n    }\n    return null;\n}\n\n\/**\n * Returns a color format that is supported by the codec and by this test code.  If no\n * match is found, this throws a test failure -- the set of formats known to the test\n * should be expanded for new platforms.\n *\/\nprivate static int selectColorFormat(MediaCodecInfo codecInfo, String mimeType) {\n    MediaCodecInfo.CodecCapabilities capabilities = codecInfo.getCapabilitiesForType(mimeType);\n    for (int i = 0; i &lt; capabilities.colorFormats.length; i++) {\n        int colorFormat = capabilities.colorFormats[i];\n        if (isRecognizedFormat(colorFormat)) {\n            return colorFormat;\n        }\n    }\n    Log.e(\"\",\"couldn't find a good color format for \" + codecInfo.getName() + \" \/ \" + mimeType);\n    return 0;   \/\/ not reached\n}\n\n\/**\n * Returns true if this is a color format that this test code understands (i.e. we know how\n * to read and generate frames in this format).\n *\/\nprivate static boolean isRecognizedFormat(int colorFormat) {\n    switch (colorFormat) {\n        \/\/ these are the formats we know how to handle for this test\n        case MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420Planar:\n        case MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420PackedPlanar:\n        case MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420SemiPlanar:\n        case MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420PackedSemiPlanar:\n        case MediaCodecInfo.CodecCapabilities.COLOR_TI_FormatYUV420PackedSemiPlanar:\n            return true;\n        default:\n            return false;\n    }\n}\n\n\/**\n * Returns true if the specified color format is semi-planar YUV.  Throws an exception\n * if the color format is not recognized (e.g. not YUV).\n *\/\nprivate static boolean isSemiPlanarYUV(int colorFormat) {\n    switch (colorFormat) {\n        case MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420Planar:\n        case MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420PackedPlanar:\n            return false;\n        case MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420SemiPlanar:\n        case MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420PackedSemiPlanar:\n        case MediaCodecInfo.CodecCapabilities.COLOR_TI_FormatYUV420PackedSemiPlanar:\n            return true;\n        default:\n            throw new RuntimeException(\"unknown format \" + colorFormat);\n    }\n}\n\n\/**\n * Does the actual work for encoding frames from buffers of byte[].\n *\/\nprivate void doEncodeDecodeVideoFromBuffer(MediaCodec encoder, int encoderColorFormat,\n        MediaCodec decoder, boolean toSurface) {\n    final int TIMEOUT_USEC = 10000;\n    ByteBuffer[] encoderInputBuffers = encoder.getInputBuffers();\n    ByteBuffer[] encoderOutputBuffers = encoder.getOutputBuffers();\n    ByteBuffer[] decoderInputBuffers = null;\n    ByteBuffer[] decoderOutputBuffers = null;\n    MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();\n    MediaFormat decoderOutputFormat = null;\n    int generateIndex = 0;\n    int checkIndex = 0;\n    int badFrames = 0;\n    boolean decoderConfigured = false;\n    OutputSurface outputSurface = null;\n\n    \/\/ The size of a frame of video data, in the formats we handle, is stride*sliceHeight\n    \/\/ for Y, and (stride\/2)*(sliceHeight\/2) for each of the Cb and Cr channels.  Application\n    \/\/ of algebra and assuming that stride==width and sliceHeight==height yields:\n\n    \/\/ Just out of curiosity.\n    long rawSize = 0;\n    long encodedSize = 0;\n\n    \/\/ Save a copy to disk.  Useful for debugging the test.  Note this is a raw elementary\n    \/\/ stream, not a .mp4 file, so not all players will know what to do with it.\n\n\n    if (toSurface) {\n        outputSurface = new OutputSurface(mWidth, mHeight);\n    }\n\n    \/\/ Loop until the output side is done.\n    boolean inputDone = false;\n    boolean encoderDone = false;\n    boolean outputDone = false;\n    while (!outputDone) {\n        Log.e(TAG, \"loop\");\n\n        \/\/ If we're not done submitting frames, generate a new one and submit it.  By\n        \/\/ doing this on every loop we're working to ensure that the encoder always has\n        \/\/ work to do.\n        \/\/\n        \/\/ We don't really want a timeout here, but sometimes there's a delay opening\n        \/\/ the encoder device, so a short timeout can keep us from spinning hard.\n        if (!inputDone) {\n            int inputBufIndex = encoder.dequeueInputBuffer(TIMEOUT_USEC);\n            Log.e(TAG, \"inputBufIndex=\" + inputBufIndex);\n            if (inputBufIndex &gt;= 0) {\n                long ptsUsec = computePresentationTime(generateIndex);\n                if (generateIndex == NUM_FRAMES) {\n                    \/\/ Send an empty frame with the end-of-stream flag set.  If we set EOS\n                    \/\/ on a frame with data, that frame data will be ignored, and the\n                    \/\/ output will be short one frame.\n                    encoder.queueInputBuffer(inputBufIndex, 0, 0, ptsUsec,\n                            MediaCodec.BUFFER_FLAG_END_OF_STREAM);\n                    inputDone = true;\n                    Log.e(TAG, \"sent input EOS (with zero-length frame)\");\n                } else {\n                    generateFrame(generateIndex, encoderColorFormat, mFrame);\n                    \/\/generateFrame(generateIndex);\n\n                    ByteBuffer inputBuf = encoderInputBuffers[inputBufIndex];\n                    \/\/ the buffer should be sized to hold one full frame\n                    inputBuf.clear();\n                    inputBuf.put(mFrame);\n\n                    encoder.queueInputBuffer(inputBufIndex, 0, mFrame.length, ptsUsec, 0);\n                    Log.e(TAG, \"submitted frame \" + generateIndex + \" to enc\");\n                }\n                generateIndex++;\n            } else {\n                \/\/ either all in use, or we timed out during initial setup\n                Log.e(TAG, \"input buffer not available\");\n            }\n        }\n\n        \/\/ Check for output from the encoder.  If there's no output yet, we either need to\n        \/\/ provide more input, or we need to wait for the encoder to work its magic.  We\n        \/\/ can't actually tell which is the case, so if we can't get an output buffer right\n        \/\/ away we loop around and see if it wants more input.\n        \/\/\n        \/\/ Once we get EOS from the encoder, we don't need to do this anymore.\n        if (!encoderDone) {\n            int encoderStatus = encoder.dequeueOutputBuffer(info, TIMEOUT_USEC);\n            if (encoderStatus == MediaCodec.INFO_TRY_AGAIN_LATER) {\n                \/\/ no output available yet\n                Log.e(TAG, \"no output from encoder available\");\n            } else if (encoderStatus == MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED) {\n                \/\/ not expected for an encoder\n                encoderOutputBuffers = encoder.getOutputBuffers();\n                Log.e(TAG, \"encoder output buffers changed\");\n            } else if (encoderStatus == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {\n                \/\/ not expected for an encoder\n\n\n                if (mMuxerStarted) {\n                    throw new RuntimeException(\"format changed twice\");\n                }\n                MediaFormat newFormat = encoder.getOutputFormat();\n                Log.e(TAG, \"encoder output format changed: \" + newFormat);\n\n                \/\/ now that we have the Magic Goodies, start the muxer\n                mTrackIndex = mMuxer.addTrack(newFormat);\n                Log.e(TAG, \"muxer defined muxer format: \" + newFormat);\n                mMuxer.start();\n                mMuxerStarted = true;\n\n            } else if (encoderStatus &lt; 0) {\n                Log.e(\"\",\"unexpected result from encoder.dequeueOutputBuffer: \" + encoderStatus);\n            } else { \/\/ encoderStatus &gt;= 0\n                ByteBuffer encodedData = encoderOutputBuffers[encoderStatus];\n                if (encodedData == null) {\n                    Log.e(\"\",\"encoderOutputBuffer \" + encoderStatus + \" was null\");\n                }\n\n                \/\/ It's usually necessary to adjust the ByteBuffer values to match BufferInfo.\n                encodedData.position(info.offset);\n                encodedData.limit(info.offset + info.size);\n\n                encodedSize += info.size;\n\n                if ((info.flags &amp; MediaCodec.BUFFER_FLAG_CODEC_CONFIG) != 0) {\n                    \/\/ Codec config info.  Only expected on first packet.  One way to\n                    \/\/ handle this is to manually stuff the data into the MediaFormat\n                    \/\/ and pass that to configure().  We do that here to exercise the API.\n\n                    MediaFormat format =\n                            MediaFormat.createVideoFormat(MIME_TYPE, mWidth, mHeight);\n                    format.setByteBuffer(\"csd-0\", encodedData);\n                    decoder.configure(format, toSurface ? outputSurface.getSurface() : null,\n                            null, 0);\n\n                    decoder.start();\n                    decoderInputBuffers = decoder.getInputBuffers();\n                    decoderOutputBuffers = decoder.getOutputBuffers();\n                    decoderConfigured = true;\n                    Log.e(TAG, \"decoder configured (\" + info.size + \" bytes)\"+format);\n                } else {\n                    \/\/ Get a decoder input buffer, blocking until it's available.\n\n                    int inputBufIndex = decoder.dequeueInputBuffer(-1);\n                    ByteBuffer inputBuf = decoderInputBuffers[inputBufIndex];\n                    inputBuf.clear();\n                    inputBuf.put(encodedData);\n                    decoder.queueInputBuffer(inputBufIndex, 0, info.size,\n                            info.presentationTimeUs, info.flags);\n\n                    encoderDone = (info.flags &amp; MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0;\n                    Log.e(TAG, \"passed \" + info.size + \" bytes to decoder\"\n                            + (encoderDone ? \" (EOS)\" : \"\"));\n                    Log.e(\"encoderDone\",encoderDone+\"\");\n                }\n\n                encoder.releaseOutputBuffer(encoderStatus, false);\n            }\n        }\n\n        \/\/ Check for output from the decoder.  We want to do this on every loop to avoid\n        \/\/ the possibility of stalling the pipeline.  We use a short timeout to avoid\n        \/\/ burning CPU if the decoder is hard at work but the next frame isn't quite ready.\n        \/\/\n        \/\/ If we're decoding to a Surface, we'll get notified here as usual but the\n        \/\/ ByteBuffer references will be null.  The data is sent to Surface instead.\n        if (decoderConfigured) {\n            int decoderStatus = decoder.dequeueOutputBuffer(info, 3*TIMEOUT_USEC);\n            if (decoderStatus == MediaCodec.INFO_TRY_AGAIN_LATER) {\n                \/\/ no output available yet\n                Log.e(TAG, \"no output from decoder available\");\n            } else if (decoderStatus == MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED) {\n                \/\/ The storage associated with the direct ByteBuffer may already be unmapped,\n                \/\/ so attempting to access data through the old output buffer array could\n                \/\/ lead to a native crash.\n                Log.e(TAG, \"decoder output buffers changed\");\n                decoderOutputBuffers = decoder.getOutputBuffers();\n            } else if (decoderStatus == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {\n                \/\/ this happens before the first frame is returned\n                decoderOutputFormat = decoder.getOutputFormat();\n                Log.e(TAG, \"decoder output format changed: \" +\n                        decoderOutputFormat);\n            } else if (decoderStatus &lt; 0) {\n                Log.e(TAG, \"unexpected result from deocder.dequeueOutputBuffer: \" + decoderStatus);\n\n            } else {  \/\/ decoderStatus &gt;= 0\n                if (!toSurface) {\n                    ByteBuffer outputFrame = decoderOutputBuffers[decoderStatus];\n\n                    outputFrame.position(info.offset);\n                    outputFrame.limit(info.offset + info.size);\n                    mMuxer.writeSampleData(mTrackIndex, outputFrame,\n                            info);\n                    rawSize += info.size;\n                    if (info.size == 0) {\n                        Log.e(TAG, \"got empty frame\");\n                    } else {\n                        Log.e(TAG, \"decoded, checking frame \" + checkIndex);\n\n                        if (!checkFrame(checkIndex++, decoderOutputFormat, outputFrame)) {\n                            badFrames++;\n                        }\n                    }\n\n                    if ((info.flags &amp; MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0) {\n                        Log.e(TAG, \"output EOS\");\n                        outputDone = true;\n                    }\n                    decoder.releaseOutputBuffer(decoderStatus, false \/*render*\/);\n                } else {\n                    Log.e(TAG, \"surface decoder given buffer \" + decoderStatus +\n                            \" (size=\" + info.size + \")\");\n                    rawSize += info.size;\n                    if ((info.flags &amp; MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0) {\n                        Log.e(TAG, \"output EOS\");\n                        outputDone = true;\n                    }\n\n                    boolean doRender = (info.size != 0);\n\n                    \/\/ As soon as we call releaseOutputBuffer, the buffer will be forwarded\n                    \/\/ to SurfaceTexture to convert to a texture.  The API doesn't guarantee\n                    \/\/ that the texture will be available before the call returns, so we\n                    \/\/ need to wait for the onFrameAvailable callback to fire.\n                    decoder.releaseOutputBuffer(decoderStatus, doRender);\n                    if (doRender) {\n                        Log.e(TAG, \"awaiting frame \" + checkIndex);\n\n                        outputSurface.awaitNewImage();\n                        outputSurface.drawImage();\n                        if (!checkSurfaceFrame(checkIndex++)) {\n                            badFrames++;\n                        }\n                    }\n                }\n            }\n        }\n    }\n\n    Log.e(TAG, \"decoded \" + checkIndex + \" frames at \"\n            + mWidth + \"x\" + mHeight + \": raw=\" + rawSize + \", enc=\" + encodedSize);\n\n    if (outputSurface != null) {\n        outputSurface.release();\n    }\n\n    if (checkIndex != NUM_FRAMES) {\n\n        Log.e(TAG, \"awaiting frame \" + checkIndex);\n    }\n    if (badFrames != 0) {\n        Log.e(TAG, \"Found \" + badFrames + \" bad frames\");\n    }\n}\nprivate void generateFrame(int frameIndex) {\n\n    Bitmap bitmap = decodeFile(mImagePaths.get(frameIndex), mColumnWidth,\n            mColumnWidth);\n\n    mFrame = getNV21(bitmap.getWidth(), bitmap.getHeight(), bitmap);\n}\n\n\/**\n * Generates data for frame N into the supplied buffer.  We have an 8-frame animation\n * sequence that wraps around.  It looks like this:\n * <\/code><\/pre>\n<pre><code>\n *   0 1 2 3\n *   7 6 5 4\n * <\/code><\/pre>\n<pre>\n * We draw one of the eight rectangles and leave the rest set to the zero-fill color.\n *\/\nprivate void generateFrame(int frameIndex, int colorFormat, byte[] mFrame) {\n    final int HALF_WIDTH = mWidth \/ 2;\n    boolean semiPlanar = isSemiPlanarYUV(colorFormat);\n    \/\/ Set to zero.  In YUV this is a dull green.\n    Arrays.fill(mFrame, (byte) 0);\n\n    int startX, startY, countX, countY;\n\n    frameIndex %= 8;\n    \/\/frameIndex = (frameIndex \/ 8) % 8;    \/\/ use this instead for debug -- easier to see\n    if (frameIndex &lt; 4) {\n        startX = frameIndex * (mWidth \/ 4);\n        startY = 0;\n    } else {\n        startX = (7 - frameIndex) * (mWidth \/ 4);\n        startY = mHeight \/ 2;\n    }\n\n    for (int y = startY + (mHeight\/2) - 1; y &gt;= startY; --y) {\n        for (int x = startX + (mWidth\/4) - 1; x &gt;= startX; --x) {\n            if (semiPlanar) {\n                \/\/ full-size Y, followed by UV pairs at half resolution\n                \/\/ e.g. Nexus 4 OMX.qcom.video.encoder.avc COLOR_FormatYUV420SemiPlanar\n                \/\/ e.g. Galaxy Nexus OMX.TI.DUCATI1.VIDEO.H264E\n                \/\/        OMX_TI_COLOR_FormatYUV420PackedSemiPlanar\n                mFrame[y * mWidth + x] = (byte) TEST_Y;\n                if ((x &amp; 0x01) == 0 &amp;&amp; (y &amp; 0x01) == 0) {\n                    mFrame[mWidth*mHeight + y * HALF_WIDTH + x] = (byte) TEST_U;\n                    mFrame[mWidth*mHeight + y * HALF_WIDTH + x + 1] = (byte) TEST_V;\n                }\n            } else {\n                \/\/ full-size Y, followed by quarter-size U and quarter-size V\n                \/\/ e.g. Nexus 10 OMX.Exynos.AVC.Encoder COLOR_FormatYUV420Planar\n                \/\/ e.g. Nexus 7 OMX.Nvidia.h264.encoder COLOR_FormatYUV420Planar\n                mFrame[y * mWidth + x] = (byte) TEST_Y;\n                if ((x &amp; 0x01) == 0 &amp;&amp; (y &amp; 0x01) == 0) {\n                    mFrame[mWidth*mHeight + (y\/2) * HALF_WIDTH + (x\/2)] = (byte) TEST_U;\n                    mFrame[mWidth*mHeight + HALF_WIDTH * (mHeight \/ 2) +\n                              (y\/2) * HALF_WIDTH + (x\/2)] = (byte) TEST_V;\n                }\n            }\n        }\n    }\n}\n\n\n\n\n \/**\n * Sets the desired frame size and bit rate.\n *\/\nprivate void setParameters(int width, int height, int bitRate) {\n    if ((width % 16) != 0 || (height % 16) != 0) {\n        Log.w(TAG, \"WARNING: width or height not multiple of 16\");\n    }\n    mWidth = width;\n    mHeight = height;\n    mBitRate = bitRate;\n    mFrame = new byte[mWidth * mHeight * 3 \/ 2];\n}\npublic void testEncodeDecodeVideoFromBufferToSurface720p() throws Throwable {\n    setParameters(1280, 720, 6000000);\n    encodeDecodeVideoFromBuffer(false);\n}\n<\/pre>\n<p><code>}<\/code><\/p>\n<p><code>Logcat:<\/code><\/p>\n<pre><code><code>  12-17 18:25:47.405: E\/EncodeAndMuxTest(16415): found codec: OMX.qcom.video.encoder.avc\n  12-17 18:25:47.405: I\/OMXClient(16415): Using client-side OMX mux.\n  12-17 18:25:47.455: E\/EncodeAndMuxTest(16415): found colorFormat: 21\n  12-17 18:25:47.455: E\/EncodeAndMuxTest(16415): format: {frame-rate=10, bitrate=6000000, height=720, mime=video\/avc, color-format=21, i-frame-interval=10, width=1280}\n  12-17 18:25:47.465: I\/OMXClient(16415): Using client-side OMX mux.\n  12-17 18:25:47.495: E\/ACodec(16415): [OMX.qcom.video.encoder.avc] storeMetaDataInBuffers (output) failed w\/ err -2147483648\n  12-17 18:25:47.495: I\/ACodec(16415): setupVideoEncoder succeeded\n  12-17 18:25:47.535: I\/OMXClient(16415): Using client-side OMX mux.\n  12-17 18:25:47.545: E\/EncodeAndMuxTest(16415): loop\n  12-17 18:25:47.545: E\/EncodeAndMuxTest(16415): inputBufIndex=0\n  12-17 18:25:47.655: E\/EncodeAndMuxTest(16415): submitted frame 0 to enc\n  12-17 18:25:47.655: E\/EncodeAndMuxTest(16415): encoder output format changed: {csd-1=java.nio.ByteArrayBuffer[position=0,limit=8,capacity=8], height=720, mime=video\/avc, csd-0=java.nio.ByteArrayBuffer[position=0,limit=18,capacity=18], what=1869968451, width=1280}\n  12-17 18:25:47.655: E\/EncodeAndMuxTest(16415): muxer defined muxer format: {csd-1=java.nio.ByteArrayBuffer[position=0,limit=8,capacity=8], height=720, mime=video\/avc, csd-0=java.nio.ByteArrayBuffer[position=0,limit=18,capacity=18], what=1869968451, width=1280}\n 12-17 18:25:47.655: I\/MPEG4Writer(16415): limits: 2147483647\/0 bytes\/us, bit rate: -1 bps and the estimated moov size 3072 bytes\n 12-17 18:25:47.655: E\/EncodeAndMuxTest(16415): inputBufIndex=2\n 12-17 18:25:47.795: E\/EncodeAndMuxTest(16415): submitted frame 1 to enc\n 12-17 18:25:47.825: E\/EncodeAndMuxTest(16415): decoder configured (26 bytes){csd-0=java.nio.DirectByteBuffer[position=0,limit=26,capacity=692224], height=720, width=1280, mime=video\/avc}\n 12-17 18:25:47.855: E\/EncodeAndMuxTest(16415): no output from decoder available\n  12-17 18:25:47.855: E\/EncodeAndMuxTest(16415): inputBufIndex=0\n  12-17 18:25:47.976: E\/EncodeAndMuxTest(16415): submitted frame 2 to enc\n  12-17 18:25:48.136: E\/EncodeAndMuxTest(16415): passed 3188 bytes to decoder\n  12-17 18:25:48.176: E\/EncodeAndMuxTest(16415): no output from decoder available\n  12-17 18:25:48.176: E\/EncodeAndMuxTest(16415): inputBufIndex=1\n  12-17 18:25:48.296: E\/EncodeAndMuxTest(16415): submitted frame 3 to enc\n  12-17 18:25:48.296: E\/EncodeAndMuxTest(16415): passed 1249 bytes to decoder\n 12-17 18:25:48.326: E\/EncodeAndMuxTest(16415): no output from decoder available\n  12-17 18:25:48.326: E\/EncodeAndMuxTest(16415): loop\n  12-17 18:25:48.326: E\/EncodeAndMuxTest(16415): inputBufIndex=2\n   12-17 18:25:48.396: E\/EncodeAndMuxTest(16415): submitted frame 4 to enc\n   12-17 18:25:48.396: E\/EncodeAndMuxTest(16415): passed 3085 bytes to decoder\n  12-17 18:25:48.436: E\/EncodeAndMuxTest(16415): no output from decoder available\n  12-17 18:25:48.436: E\/EncodeAndMuxTest(16415): inputBufIndex=0\n   12-17 18:25:48.436: E\/EncodeAndMuxTest(16415): sent input EOS (with zero-length frame)\n  12-17 18:25:48.436: E\/EncodeAndMuxTest(16415): passed 3056 bytes to decoder\n    12-17 18:25:48.466: E\/EncodeAndMuxTest(16415): no output from decoder available\n   12-17 18:25:48.466: E\/EncodeAndMuxTest(16415): passed 1085 bytes to decoder (EOS)\n  12-17 18:25:48.476: E\/EncodeAndMuxTest(16415): decoder output buffers changed\n  12-17 18:25:48.496: E\/EncodeAndMuxTest(16415): decoder output format changed:\n<\/code><\/code><\/pre>\n<\/li>\n","protected":false},"excerpt":{"rendered":"<p>chentc I have a bunch of local images saved as jpeg files. My images are captured using CameraPreview and the PreviewFormat is as default: NV21. I want to generate a small video from a fixed number of images. I am not going to use FFMpeg because it requires NDK and will introduce compatibility issues. MediaCodec [&hellip;]<\/p>\n","protected":false},"author":1,"featured_media":0,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[1],"tags":[],"class_list":["post-6215","post","type-post","status-publish","format-standard","hentry","category-uncategorized"],"_links":{"self":[{"href":"https:\/\/unknownerror.org\/index.php\/wp-json\/wp\/v2\/posts\/6215","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/unknownerror.org\/index.php\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/unknownerror.org\/index.php\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/unknownerror.org\/index.php\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/unknownerror.org\/index.php\/wp-json\/wp\/v2\/comments?post=6215"}],"version-history":[{"count":0,"href":"https:\/\/unknownerror.org\/index.php\/wp-json\/wp\/v2\/posts\/6215\/revisions"}],"wp:attachment":[{"href":"https:\/\/unknownerror.org\/index.php\/wp-json\/wp\/v2\/media?parent=6215"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/unknownerror.org\/index.php\/wp-json\/wp\/v2\/categories?post=6215"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/unknownerror.org\/index.php\/wp-json\/wp\/v2\/tags?post=6215"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}