How to repair a corrupted MP4 file generated by MediaMuxer?

Stack Overflow user
Asked on 2020-10-11 03:19:26
1 answer · 412 views · 0 followers · 2 votes

I generated an mp4 video using MediaMuxer and MediaCodec.

After calling mMediaMuxer.stop(), the video plays fine.

But when the user quits the application before I get the chance to call the stop() method, I am left with a big mp4 file that cannot be played.

Is there a way to repair this mp4 file so that it becomes playable?

EDIT

Here is a sample of the corrupted mp4 file.

I was able to repair the file with this online tool, but the tool requires uploading an uncorrupted video as a reference.

Here is the uncorrupted mp4 video I used as the reference. Once I uploaded it, the tool repaired my corrupted mp4 file.

So repairing the file is possible, but how do they do it?

In case it is useful, here is the code I use to generate both the corrupted and the uncorrupted videos:

package com.tolotra.images_to_video

import android.content.ContentValues.TAG
import android.content.Context
import android.graphics.Bitmap
import android.graphics.BitmapFactory
import android.media.*
import android.opengl.*
import android.util.Log
import android.util.TimingLogger
import android.view.Surface
import java.io.File
import java.nio.ByteBuffer
import java.nio.ByteOrder
import java.nio.FloatBuffer
import java.nio.IntBuffer
import java.text.SimpleDateFormat
import java.util.*


class VideoBuilder(applicationContext: Context) {


    private var frameId: Long = 0
    private lateinit var muxer: MediaMuxer
    private lateinit var glTool: OverlayRenderer
    private lateinit var encoder: MediaCodec
    private lateinit var outVideoFilePath: String
    private var context = applicationContext
    private var trackIndex: Int = 0
    private lateinit var bufferInfo: MediaCodec.BufferInfo
    private var eglContext: EGLContext? = null
    private var eglDisplay: EGLDisplay? = null
    private var eglSurface: EGLSurface? = null
    private lateinit var surface: Surface


    val timeoutUs = 10000L
    val frameRate = 5
    var presentationTimeUs: Long = 0


    fun setup() {
        encoder = createEncoder()
        initInputSurface(encoder)
        encoder.start()

        outVideoFilePath = getScreenshotPath("tolotra-screen-recoder-${Date().time}.mp4")
        muxer = MediaMuxer(outVideoFilePath, MediaMuxer.OutputFormat.MUXER_OUTPUT_MPEG_4)

        glTool = OverlayRenderer()
        glTool.initGl()
    }

    /**
     * timelapse is the duration, in microseconds, between the current frame
     * and the previous frame
     */
    fun feed(bitmap: Bitmap, timelapse: Long) {

        frameId++
        Log.d("FEED_PROFILE", "feed frame:$frameId")
        val timings = TimingLogger("FEED_PROFILE", "feed frame:$frameId")
        // Get encoded data and feed it to muxer
        drainEncoder(encoder, muxer, false, timelapse)

        timings.addSplit("drainEncoder done");
        // Render the bitmap/texture with OpenGL here
        glTool.render(bitmap)
        timings.addSplit("render done");

        // Set timestamp with EGL extension
        EGLExt.eglPresentationTimeANDROID(eglDisplay, eglSurface, presentationTimeUs * 1000)

        // Feed encoder with next frame produced by OpenGL
        EGL14.eglSwapBuffers(eglDisplay, eglSurface)

        timings.dumpToLog();
    }

    fun finish() {
        Log.d(TAG, "Finishing")

        // Drain last encoded data and finalize the video file
        drainEncoder(encoder, muxer, true, 0)
        _cleanUp(encoder, muxer)

        val file = File(outVideoFilePath)

        val fileSizeKb = (file.length() / 1024).toInt()
        val retriever = MediaMetadataRetriever()
        retriever.setDataSource(outVideoFilePath)
        val width =
            retriever.extractMetadata(MediaMetadataRetriever.METADATA_KEY_VIDEO_WIDTH)
        val height =
            retriever.extractMetadata(MediaMetadataRetriever.METADATA_KEY_VIDEO_HEIGHT)
        val rotation =
            retriever.extractMetadata(MediaMetadataRetriever.METADATA_KEY_VIDEO_ROTATION)

        val bitRate =
            retriever.extractMetadata(MediaMetadataRetriever.METADATA_KEY_BITRATE)

        val duration =
            java.lang.Long.valueOf(retriever.extractMetadata(MediaMetadataRetriever.METADATA_KEY_DURATION)) * 1000

        Log.d("Result", "bitrate $bitRate duration $duration  fileSize $file_size ")

    }

    fun getScreenshotPath(fileName: String): String {
        val externalDir: String = context.externalCacheDir!!.path
        val sDir: String = externalDir + File.separator + "Screen Recorder"
        val dir = File(sDir)
        val dirPath: String = if (dir.exists() || dir.mkdir()) {
            sDir + File.separator + fileName
        } else {
            externalDir + File.separator + fileName
        }
        Log.d("Mp4 file path", "Path: $dirPath")

        return dirPath
    }


    fun createEncoder(): MediaCodec {

        bufferInfo = MediaCodec.BufferInfo()
        val MIME = "video/avc"
        val encoder = MediaCodec.createEncoderByType(MIME)
        val width = 320
        val height = 512
        val format = MediaFormat.createVideoFormat(MIME, width, height)
        format.setInteger(
            MediaFormat.KEY_COLOR_FORMAT,
            MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface
        )
//        format.setInteger(MediaFormat.KEY_BIT_RATE, 2_000_000)
        format.setInteger(MediaFormat.KEY_BIT_RATE, 350_000)
        format.setInteger(MediaFormat.KEY_FRAME_RATE, 45)
        format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 5)

        encoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE)
        trackIndex = -1
        return encoder
    }

    /**
     * Pulls any pending encoded output out of the encoder and writes it to the
     * muxer. With endOfStream = true, signals end-of-input first and keeps
     * draining until the encoder emits BUFFER_FLAG_END_OF_STREAM.
     */
    fun drainEncoder(
        encoder: MediaCodec,
        muxer: MediaMuxer,
        endOfStream: Boolean,
        timelapseUs: Long
    ) {
        if (endOfStream)
            encoder.signalEndOfInputStream()

        while (true) {
            val outBufferId = encoder.dequeueOutputBuffer(bufferInfo, timeoutUs)

            if (outBufferId >= 0) {
                val encodedBuffer = encoder.getOutputBuffer(outBufferId)

                // MediaMuxer is ignoring KEY_FRAME_RATE, so I set the timestamp
                // manually here to achieve the desired frame rate
                bufferInfo.presentationTimeUs = presentationTimeUs
                if (encodedBuffer != null) {
                    muxer.writeSampleData(trackIndex, encodedBuffer, bufferInfo)
                }

                presentationTimeUs += timelapseUs

                encoder.releaseOutputBuffer(outBufferId, false)

                // Are we finished here?
                if ((bufferInfo.flags and MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0)
                    break
            } else if (outBufferId == MediaCodec.INFO_TRY_AGAIN_LATER) {
                if (!endOfStream)
                    break

                // End of stream, but still no output available. Try again.
            } else if (outBufferId == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
                // Arrives once, before the first encoded buffer: the muxer can
                // only be started after the actual output format is known
                trackIndex = muxer.addTrack(encoder.outputFormat)
                muxer.start()
            }
        }
    }

    /**
     * Creates the encoder's input surface and binds an EGL display, context and
     * window surface to it, so frames drawn with OpenGL ES are fed to the encoder.
     */
    private fun initInputSurface(encoder: MediaCodec) {

        val surface = encoder.createInputSurface()

        val eglDisplay = EGL14.eglGetDisplay(EGL14.EGL_DEFAULT_DISPLAY)
        if (eglDisplay == EGL14.EGL_NO_DISPLAY)
            throw RuntimeException(
                "eglDisplay == EGL14.EGL_NO_DISPLAY: "
                        + GLUtils.getEGLErrorString(EGL14.eglGetError())
            )

        val version = IntArray(2)
        if (!EGL14.eglInitialize(eglDisplay, version, 0, version, 1))
            throw RuntimeException("eglInitialize(): " + GLUtils.getEGLErrorString(EGL14.eglGetError()))

        val attribList = intArrayOf(
            EGL14.EGL_RED_SIZE, 8,
            EGL14.EGL_GREEN_SIZE, 8,
            EGL14.EGL_BLUE_SIZE, 8,
            EGL14.EGL_ALPHA_SIZE, 8,
            EGL14.EGL_RENDERABLE_TYPE, EGL14.EGL_OPENGL_ES2_BIT,
            EGLExt.EGL_RECORDABLE_ANDROID, 1,
            EGL14.EGL_NONE
        )
        val configs = arrayOfNulls<EGLConfig>(1)
        val nConfigs = IntArray(1)
        EGL14.eglChooseConfig(eglDisplay, attribList, 0, configs, 0, configs.size, nConfigs, 0)

        var err = EGL14.eglGetError()
        if (err != EGL14.EGL_SUCCESS)
            throw RuntimeException(GLUtils.getEGLErrorString(err))

        val ctxAttribs = intArrayOf(
            EGL14.EGL_CONTEXT_CLIENT_VERSION, 2,
            EGL14.EGL_NONE
        )
        val eglContext =
            EGL14.eglCreateContext(eglDisplay, configs[0], EGL14.EGL_NO_CONTEXT, ctxAttribs, 0)

        err = EGL14.eglGetError()
        if (err != EGL14.EGL_SUCCESS)
            throw RuntimeException(GLUtils.getEGLErrorString(err))

        val surfaceAttribs = intArrayOf(
            EGL14.EGL_NONE
        )
        val eglSurface =
            EGL14.eglCreateWindowSurface(eglDisplay, configs[0], surface, surfaceAttribs, 0)
        err = EGL14.eglGetError()
        if (err != EGL14.EGL_SUCCESS)
            throw RuntimeException(GLUtils.getEGLErrorString(err))

        if (!EGL14.eglMakeCurrent(eglDisplay, eglSurface, eglSurface, eglContext))
            throw RuntimeException("eglMakeCurrent(): " + GLUtils.getEGLErrorString(EGL14.eglGetError()))


        this.eglSurface = eglSurface
        this.eglDisplay = eglDisplay
        this.eglContext = eglContext
        this.surface = surface
    }

    private fun _cleanUp(encoder: MediaCodec, muxer: MediaMuxer) {
        if (eglDisplay != EGL14.EGL_NO_DISPLAY) {
            EGL14.eglDestroySurface(eglDisplay, eglSurface)
            EGL14.eglDestroyContext(eglDisplay, eglContext)
            EGL14.eglReleaseThread()
            EGL14.eglTerminate(eglDisplay)
        }
        surface.release()
        eglDisplay = EGL14.EGL_NO_DISPLAY
        eglContext = EGL14.EGL_NO_CONTEXT
        eglSurface = EGL14.EGL_NO_SURFACE

        encoder.stop()
        encoder.release()

        // muxer.stop() is what writes the 'moov' index; if the process dies
        // before reaching this point, the file on disk has no index
        muxer.stop()
        muxer.release()
    }


}

class OverlayRenderer {

    private val mvpMatrix = FloatArray(16)
    private val projectionMatrix = FloatArray(16)
    private val viewMatrix = FloatArray(16)

    private val vertexShaderCode =
        "precision highp float;\n" +
                "attribute vec3 vertexPosition;\n" +
                "attribute vec2 uvs;\n" +
                "varying vec2 varUvs;\n" +
                "uniform mat4 mvp;\n" +
                "\n" +
                "void main()\n" +
                "{\n" +
                "\tvarUvs = uvs;\n" +
                "\tgl_Position = mvp * vec4(vertexPosition, 1.0);\n" +
                "}"

    private val fragmentShaderCode =
        "precision mediump float;\n" +
                "\n" +
                "varying vec2 varUvs;\n" +
                "uniform sampler2D texSampler;\n" +
                "\n" +
                "void main()\n" +
                "{\t\n" +
                "\tgl_FragColor = texture2D(texSampler, varUvs);\n" +
                "}"

    private var vertices = floatArrayOf(
        // x, y, z, u, v
        -1.0f, -1.0f, 0.0f, 0f, 0f,
        -1.0f, 1.0f, 0.0f, 0f, 1f,
        1.0f, 1.0f, 0.0f, 1f, 1f,
        1.0f, -1.0f, 0.0f, 1f, 0f
    )

    private var indices = intArrayOf(
        2, 1, 0, 0, 3, 2
    )

    private var program: Int = 0
    private var vertexHandle: Int = 0
    private var bufferHandles = IntArray(2)
    private var uvsHandle: Int = 0
    private var mvpHandle: Int = 0
    private var samplerHandle: Int = 0
    private val textureHandle = IntArray(1)


    val viewportWidth = 320
    val viewportHeight = 486


    var vertexBuffer: FloatBuffer = ByteBuffer.allocateDirect(vertices.size * 4).run {
        order(ByteOrder.nativeOrder())
        asFloatBuffer().apply {
            put(vertices)
            position(0)
        }
    }

    var indexBuffer: IntBuffer = ByteBuffer.allocateDirect(indices.size * 4).run {
        order(ByteOrder.nativeOrder())
        asIntBuffer().apply {
            put(indices)
            position(0)
        }
    }

    fun render(bitmap: Bitmap) {

        Log.d("Bitmap", "width ${bitmap.width} height ${bitmap.height}")


        // Prepare some transformations
        val mvp = FloatArray(16)
        Matrix.setIdentityM(mvp, 0)
        Matrix.scaleM(mvp, 0, 1f, -1f, 1f)

        // Set the clear color before clearing so the first clear uses it too
        GLES20.glClearColor(0f, 0f, 0f, 0f)
        GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT or GLES20.GL_DEPTH_BUFFER_BIT)

        GLES20.glViewport(0, 0, viewportWidth, viewportHeight)

        GLES20.glUseProgram(program)

        // Pass transformations to shader
        GLES20.glUniformMatrix4fv(mvpHandle, 1, false, mvp, 0)

        // Prepare texture for drawing
        GLES20.glActiveTexture(GLES20.GL_TEXTURE0)
        GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, textureHandle[0])
        GLES20.glPixelStorei(GLES20.GL_UNPACK_ALIGNMENT, 1)

        // Pass the Bitmap to OpenGL here
        GLUtils.texImage2D(GLES20.GL_TEXTURE_2D, 0, bitmap, 0)

        GLES20.glTexParameteri(
            GLES20.GL_TEXTURE_2D,
            GLES20.GL_TEXTURE_MIN_FILTER,
            GLES20.GL_NEAREST
        )
        GLES20.glTexParameteri(
            GLES20.GL_TEXTURE_2D,
            GLES20.GL_TEXTURE_MAG_FILTER,
            GLES20.GL_NEAREST
        )

        // Prepare buffers with vertices and indices & draw
        GLES20.glBindBuffer(GLES20.GL_ARRAY_BUFFER, bufferHandles[0])
        GLES20.glBindBuffer(GLES20.GL_ELEMENT_ARRAY_BUFFER, bufferHandles[1])

        GLES20.glEnableVertexAttribArray(vertexHandle)
        GLES20.glVertexAttribPointer(vertexHandle, 3, GLES20.GL_FLOAT, false, 4 * 5, 0)

        GLES20.glEnableVertexAttribArray(uvsHandle)
        GLES20.glVertexAttribPointer(uvsHandle, 2, GLES20.GL_FLOAT, false, 4 * 5, 3 * 4)

        GLES20.glDrawElements(GLES20.GL_TRIANGLES, 6, GLES20.GL_UNSIGNED_INT, 0)
    }


    fun initGl() {
        val vertexShader = GLES20.glCreateShader(GLES20.GL_VERTEX_SHADER).also { shader ->
            GLES20.glShaderSource(shader, vertexShaderCode)
            GLES20.glCompileShader(shader)
        }

        val fragmentShader = GLES20.glCreateShader(GLES20.GL_FRAGMENT_SHADER).also { shader ->
            GLES20.glShaderSource(shader, fragmentShaderCode)
            GLES20.glCompileShader(shader)
        }

        program = GLES20.glCreateProgram().also {
            GLES20.glAttachShader(it, vertexShader)
            GLES20.glAttachShader(it, fragmentShader)
            GLES20.glLinkProgram(it)

            vertexHandle = GLES20.glGetAttribLocation(it, "vertexPosition")
            uvsHandle = GLES20.glGetAttribLocation(it, "uvs")
            mvpHandle = GLES20.glGetUniformLocation(it, "mvp")
            samplerHandle = GLES20.glGetUniformLocation(it, "texSampler")
        }

        // Initialize buffers
        GLES20.glGenBuffers(2, bufferHandles, 0)

        GLES20.glBindBuffer(GLES20.GL_ARRAY_BUFFER, bufferHandles[0])
        GLES20.glBufferData(
            GLES20.GL_ARRAY_BUFFER,
            vertices.size * 4,
            vertexBuffer,
            GLES20.GL_DYNAMIC_DRAW
        )

        GLES20.glBindBuffer(GLES20.GL_ELEMENT_ARRAY_BUFFER, bufferHandles[1])
        GLES20.glBufferData(
            GLES20.GL_ELEMENT_ARRAY_BUFFER,
            indices.size * 4,
            indexBuffer,
            GLES20.GL_DYNAMIC_DRAW
        )

        // Init texture handle
        GLES20.glGenTextures(1, textureHandle, 0)

        // Ensure I can draw transparent stuff that overlaps properly
        GLES20.glEnable(GLES20.GL_BLEND)
        GLES20.glBlendFunc(GLES20.GL_SRC_ALPHA, GLES20.GL_ONE_MINUS_SRC_ALPHA)
    }
}

1 Answer

Stack Overflow user

Answer accepted

Answered on 2020-10-11 19:39:44

Generally speaking, MP4 is not a good recording format. Typically the sample tables are kept in memory and written out when the file is closed, so in the event of a power failure or an application error you lose the recording. With an MPEG-2 transport stream or fragmented MP4, most of the media written so far remains playable. Your file most likely contains only an MP4 'ftyp' and an 'mdat' atom, with the audio and video interleaved. With some educated guessing and some knowledge about the video stream, there is a chance to extract the audio and video. https://fix.video appears to do exactly that.

Correct MP4:
[ftyp]
[mdat]
[moov]
-end-

Truncated MP4:
[ftyp]
[mdat]
-end-
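
To confirm that a recording has the truncated shape shown above, the top-level boxes can simply be walked and listed. The following Kotlin sketch is my own illustration, not part of the answer; the helper name listTopLevelBoxes is made up:

import java.io.RandomAccessFile

// Minimal sketch, not from the answer: list the top-level boxes of an mp4 so a
// truncated recording (ftyp + mdat, no moov) can be recognized. Note that in a
// file cut off mid-write the 'mdat' size field may be stale or zero, so the
// reported size of the last box is not reliable.
fun listTopLevelBoxes(path: String): List<String> {
    val boxes = mutableListOf<String>()
    RandomAccessFile(path, "r").use { file ->
        var pos = 0L
        while (pos + 8 <= file.length()) {
            file.seek(pos)
            var size = file.readInt().toLong() and 0xFFFFFFFFL    // 32-bit box size
            val type = ByteArray(4).also { file.readFully(it) }.toString(Charsets.US_ASCII)
            if (size == 1L) size = file.readLong()                // 64-bit 'largesize' variant
            if (size == 0L) size = file.length() - pos            // box runs to end of file
            boxes.add("[$type] $size bytes")
            pos += size
        }
    }
    return boxes
}

On the corrupted sample this should print something like [ftyp] followed by [mdat] and nothing else, matching the "Truncated MP4" layout.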

Fix.video parses your good file and extracts the audio and video settings from it. Using the information from the good file, it recreates most of the 'moov' atom. The missing sample tables ('stXX') are recreated by parsing the 'mdat' atom: the video chunks inside 'mdat' are each prefixed with their length, and whatever remains must be the AAC audio.
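
The sample-table reconstruction described above can be sketched roughly as follows. This is my illustration, not fix.video's actual code, and it assumes a video-only AVC file like the one produced by the code in the question (which writes no audio track), where each NAL unit in 'mdat' carries a 4-byte big-endian length prefix:

import java.io.RandomAccessFile

// Illustrative sketch only (not fix.video's code): scan the 'mdat' payload of a
// video-only AVC mp4 and recover one size per length-prefixed NAL unit. A real
// repair tool would additionally group NALs into access units (one mp4 sample
// may hold several NALs) and, for interleaved files, separate the AAC audio.
fun scanMdatNalSizes(path: String, payloadStart: Long, payloadEnd: Long): List<Int> {
    val sizes = mutableListOf<Int>()
    RandomAccessFile(path, "r").use { file ->
        var pos = payloadStart
        while (pos + 4 <= payloadEnd) {
            file.seek(pos)
            val nalLength = file.readInt()                        // 4-byte big-endian prefix
            if (nalLength <= 0 || pos + 4 + nalLength > payloadEnd) break // truncated tail
            sizes.add(4 + nalLength)                              // prefix + NAL payload
            pos += 4 + nalLength
        }
    }
    return sizes
}

With those sizes (and offsets accumulated the same way), plus the codec parameters taken from an intact reference file, a new 'moov' atom can be assembled and appended, which is presumably what the online tool does with the uploaded reference video.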

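As for the recording-format advice at the top of the answer: Android's MediaMuxer has no fragmented-MP4 output format, so one practical mitigation is to rotate to a fresh muxer every few seconds, losing at most the current unfinalized segment on a crash. Below is a minimal sketch under that assumption; the names (SegmentedMuxer, pathFor) and the 5-second interval are illustrative, not from the question:

import android.media.MediaCodec
import android.media.MediaFormat
import android.media.MediaMuxer
import java.nio.ByteBuffer

// Sketch of a mitigation, not the asker's code: write the recording as a chain
// of short, individually finalized mp4 segments, so a crash loses only the
// segment that was still open.
class SegmentedMuxer(
    private val pathFor: (segment: Int) -> String,    // e.g. { i -> "$dir/rec-$i.mp4" }
    private val segmentDurationUs: Long = 5_000_000L  // illustrative rotation interval
) {
    private var muxer: MediaMuxer? = null
    private var trackIndex = -1
    private var segment = 0
    private var segmentStartUs = -1L
    private lateinit var format: MediaFormat

    /** Call from the INFO_OUTPUT_FORMAT_CHANGED branch of the drain loop. */
    fun onOutputFormatChanged(newFormat: MediaFormat) {
        format = newFormat
        openSegment()
    }

    /** Call instead of muxer.writeSampleData() in the drain loop. */
    fun writeSample(buffer: ByteBuffer, info: MediaCodec.BufferInfo) {
        // Codec config data already travels inside the MediaFormat
        if (info.flags and MediaCodec.BUFFER_FLAG_CODEC_CONFIG != 0) return
        if (segmentStartUs < 0) segmentStartUs = info.presentationTimeUs
        // Only rotate on a key frame so every segment starts decodable
        val keyFrame = info.flags and MediaCodec.BUFFER_FLAG_KEY_FRAME != 0
        if (keyFrame && info.presentationTimeUs - segmentStartUs >= segmentDurationUs) {
            closeSegment()
            openSegment()
            segmentStartUs = info.presentationTimeUs
        }
        muxer?.writeSampleData(trackIndex, buffer, info)
    }

    fun finish() = closeSegment()

    private fun openSegment() {
        muxer = MediaMuxer(pathFor(segment++), MediaMuxer.OutputFormat.MUXER_OUTPUT_MPEG_4).also {
            trackIndex = it.addTrack(format)
            it.start()
        }
    }

    private fun closeSegment() {
        muxer?.let { it.stop(); it.release() }    // stop() writes the 'moov' index
        muxer = null
    }
}

The segments can be played back-to-back or concatenated offline; the point is only that each one already has its own 'moov' atom.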
2 votes
Original content provided by Stack Overflow. Link to the original question:

https://stackoverflow.com/questions/64297338