I am using an Android v21 device to stream data to a JavaFX application. It's working fine, but I have about 2 seconds of latency.

As of now the basic transport goes like this:

  • android webrtc/custom implementation: 16 ms
  • android packetizer (udp): 6 ms
  • udp transport: assumed at < 5 ms
  • windows depacketizer: no buildup of data in buffers
  • windows ffmpeg framegrabber: unknown latency
  • javafx imageview: < 1 ms

My data stream to my desktop and my packetizer are much faster than my frame rate and are often just waiting. There is no buildup of data anywhere else, so I assume there is no delay in any of my code.

    I tested my Android device by writing the YUV from the camera to a texture and timing how long it takes before the device can encode the frame into h264, and then how long until it is sent. So 16 + 6 = 22 ms.

    I feel the problem is with the JavaCV ffmpeg framegrabber. I'm studying this API in order to learn why this is occurring.

    My major concern is that the framegrabber takes forever to start... around 4 seconds.

    Once it starts I can clearly see how many frames I insert and how many it is grabbing, and it is always lagging by some large number, such as 40 up to 200.

    Also, Framegrabber.grab() is blocking and runs every 100 ms to match my frame rate, no matter how fast I tell it to run, so I can never catch up.

    Do you have any suggestions?

    I'm starting to think javacv is not a viable solution because it seems many people struggle with this delay issue. If you have alternate suggestions, please advise.
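    For reference, here is a rough standalone sketch (not part of my app; the UDP URL is only a placeholder for whatever input you feed the grabber) of how the per-call blocking time of grab() could be measured, to confirm that it paces itself to the incoming frame rate:

        import org.bytedeco.javacv.FFmpegFrameGrabber;
        import org.bytedeco.javacv.Frame;

        public class GrabTimer {
            public static void main(String[] args) throws Exception {
                // Placeholder input; substitute the stream you are actually decoding.
                FFmpegFrameGrabber grabber = new FFmpegFrameGrabber("udp://0.0.0.0:5000");
                grabber.start();
                for (int i = 0; i < 100; i++) {
                    long t0 = System.nanoTime();
                    Frame f = grabber.grab();                      // the blocking call in question
                    long ms = (System.nanoTime() - t0) / 1_000_000;
                    System.out.println("grab #" + i + " blocked for " + ms + " ms"
                            + (f == null ? " (null frame)" : ""));
                }
                grabber.stop();
                grabber.release();
            }
        }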

    My ffmpeg framegrabber:

        public RapidDecoder(final InputStream inputStream, final ImageView view) {
            System.out.println(TAG + " starting");
            grabber = new FFmpegFrameGrabber(inputStream, 0);
            converter = new Java2DFrameConverter();
            mView = view;

            emptyBuffer = new Runnable() {
                @Override
                public void run() {
                    System.out.println(TAG + " emptybuffer thread running");
                    try {
                        grabber.setFrameRate(12);
                        grabber.setVideoBitrate(10000);
                        //grabber.setOption("g", "2");
                        //grabber.setOption("bufsize", "10000");
                        //grabber.setOption("af", "delay 20");
                        //grabber.setNumBuffers(0);
                        //grabber.setOption("flush_packets", "1");
                        //grabber.setOption("probsize", "32");
                        //grabber.setOption("analyzeduration", "0");
                        grabber.setOption("preset", "ultrafast");
                        grabber.setOption("fflags", "nobuffer");
                        //grabber.setVideoOption("nobuffer", "1");
                        //grabber.setOption("fflags", "discardcorrupt");
                        //grabber.setOption("framedrop", "\\");
                        //grabber.setOption("flags", "low_delay");
                        grabber.setOption("strict", "experimental");
                        //grabber.setOption("avioflags", "direct");
                        //grabber.setOption("filter:v", "fps=fps=30");
                        grabber.setVideoOption("tune", "zerolatency");
                        //grabber.setFrameNumber(60);
                        grabber.start();
                    } catch (Exception e) {
                        System.out.println(TAG + e);
                    }
                    while (true) {
                        try {
                            grabFrame();
                            Thread.sleep(1);
                        } catch (Exception e) {
                            System.out.println(TAG + " emptybuffer " + e);
                        }
                    }
                }
            };

            display = new Runnable() {
                @Override
                public void run() {
                    System.out.println(TAG + " display thread running ");
                    while (true) {
                        try {
                            displayImage();
                            Thread.sleep(10);
                        } catch (Exception e) {
                            System.out.println(TAG + " display " + e);
                        }
                    }
                }
            };
        }

        public synchronized void grabFrame() throws FrameGrabber.Exception {
            //frame = grabber.grabFrame();
            frame = grabber.grab();
            //System.out.println("grab");
        }

        public synchronized void displayImage() {
            bufferedImage = converter.convert(frame);
            frame = null;
            if (bufferedImage == null) return;
            mView.setImage(SwingFXUtils.toFXImage(bufferedImage, null));
            //System.out.println("display");
        }
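    For context, the code above is only an excerpt; roughly, the surrounding class looks like this (the field declarations and the thread start-up are assumed, since they are not shown in the snippet):

        // Assumed surrounding fields and start-up code, not shown in the excerpt above:
        private FFmpegFrameGrabber grabber;
        private Java2DFrameConverter converter;
        private ImageView mView;
        private Frame frame;                 // shared between grabFrame() and displayImage(), both synchronized
        private BufferedImage bufferedImage;
        private Runnable emptyBuffer, display;

        public void start() {
            // Run the grab loop and the display loop on their own daemon threads.
            Thread grabThread = new Thread(emptyBuffer, "grab-loop");
            Thread drawThread = new Thread(display, "display-loop");
            grabThread.setDaemon(true);
            drawThread.setDaemon(true);
            grabThread.start();
            drawThread.start();
        }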

    Here you can see I draw the texture with the image and send it to the h264 encoder:

    @Override
    public void onTextureFrameCaptured(int width, int height, int texId, float[] tranformMatrix, int rotation, long timestamp) {
        //Log.d(TAG, "onTextureFrameCaptured: ->");
        VideoRenderer.I420Frame frame = new VideoRenderer.I420Frame(width, height, rotation, texId, tranformMatrix, 0, timestamp);
        avccEncoder.renderFrame(frame);
        videoView.renderFrame(frame);
        surfaceTextureHelper.returnTextureFrame();
    }

    Here you can see the webrtc encoding happen:

    @Override
    public void renderFrame(VideoRenderer.I420Frame i420Frame) {
        start = System.nanoTime();
        bufferque++;
        mediaCodecHandler.post(new Runnable() {
            @Override
            public void run() {
                videoEncoder.encodeTexture(false, i420Frame.textureId, i420Frame.samplingMatrix,
                        TimeUnit.NANOSECONDS.toMicros(i420Frame.timestamp));
            }
        });
    }

    /**
     * Called to retrieve an encoded frame
     */
    @Override
    public void onEncodedFrame(MediaCodecVideoEncoder.OutputBufferInfo frame, MediaCodec.BufferInfo bufferInfo) {
        b = new byte[frame.buffer().remaining()];
        frame.buffer().get(b);
        synchronized (lock) {
            encodedBuffer.add(b);
            lock.notifyAll();
            if (encodedBuffer.size() > 1)
                Log.e(TAG, "drainEncoder: too big: " + encodedBuffer.size(), null);
        }
        duration = System.nanoTime() - start;
        bufferque--;
        calcAverage();
        if (bufferque > 0)
            Log.d(TAG, "onEncodedFrame: bufferque size: " + bufferque);
    }
    Probably encoding related. You can try to set IFRAME_INTERVAL to -1 instead of 5. Not too many other options with v21 to reduce latency. – Mike Dec 5, 2019 at 19:03

    Thanks for the response. I notice that 4.4 is required for most video call apps. Would you have a link to any example projects? – cagney Dec 5, 2019 at 19:26

    Most video call apps probably aren't using the MediaCodec api. If that is the route you are going, check out WebRTC. – Mike Dec 6, 2019 at 0:42

    This change should take care of the startup time at least: github.com/bytedeco/javacpp/commit/…  Let me know if it doesn't though. – Samuel Audet Dec 19, 2019 at 22:01
    

    I edited my question above as I solved the problem over the course of a few days, but let me give details for those who may need them.

    Android - I ended up using this library: https://github.com/Piasy/VideoCRE. It tears the webrtc function open and allows you to encode video frame by frame. That's how I benchmarked the frames at 16 ms for encoding on an old, terrible phone.

    javacv ffmpeg - The solution was a buffering issue in the C++ avcodec. To prove it, try feeding every frame in twice or 10 times instead of once; it cuts down the latency by the same factor, although the feed becomes useless as well. It also reduces the startup time of the video feed. However, on line 926 of FFmpegFrameGrabber in the javacv code, I set threads from (0) to (1) per this link: https://mailman.videolan.org/pipermail/x264-devel/2009-May/005880.html

    The thread_count = 0 directs x264 to use enough threads to load all your CPU cores during encode. So you probably run the tests on a dual core machine (2 cores will have 3 threads). To get x264 encode without delay, set thread_count = 1.
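    If you prefer not to patch the JavaCV source, the same effect should be reachable through the decoder option that Samuel Audet points to in the comments below; roughly like this (a sketch, not my exact code):

        // Sketch: ask avcodec for single-threaded decoding through the grabber
        // options instead of editing line 926 of FFmpegFrameGrabber.java.
        FFmpegFrameGrabber grabber = new FFmpegFrameGrabber(inputStream, 0);
        grabber.setVideoOption("threads", "1");   // avoid the frame-threading decode delay
        grabber.setOption("fflags", "nobuffer");  // same low-latency flag used in the question
        grabber.start();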

    You may find countless suggestions for setting options through javacv; however, I never had javacv reject the options I set, and I learned many times that I was affecting the wrong factors. Here is a list of the things I tried:

                    //grabber.setFrameRate(12);
                    //grabber.setVideoBitrate(10000);
                    //grabber.setOption("g", "2");
                   // grabber.setOption("bufsize", "10000");
                    //grabber.setOption("af", "delay 20");
                    //grabber.setNumBuffers(0);
                    //grabber.setOption("flush_packets", "1");
                    //grabber.setOption("probsize", "32");
                    //grabber.setOption("analyzeduration", "0");
                    //grabber.setOption("preset", "ultrafast");
                    //grabber.setOption("fflags", "nobuffer");
                    //grabber.setVideoOption("nobuffer", "1");
                    //grabber.setOption("fflags", "discardcorrupt");
                    //grabber.setOption("framedrop", "\\");
                   //grabber.setOption("flags","low_delay");
                    //grabber.setOption("strict","experimental");
                    //grabber.setOption("avioflags", "direct");
                    //grabber.setOption("filter:v", "fps=fps=30");
                    //grabber.setOptions("look_ahead", "0");
                    //Map options = new HashMap();
                    //options.put("tune", "zerolatency");
                    grabber.setVideoOption("look_ahead", "0");
                    //grabber.setFrameNumber(60);
    

    None of them worked, and as you read the documentation you will understand that when ffmpeg starts up there are different contexts (avcontext, videocontext, audiocontext) which take different values, and there are different APIs (framegrabber and ffplay, I believe) which take different flags, so throwing things at the wall is rather futile.

    Try adding the extra frames to your stream first. Also, if you only need a single image, just add a null packet to your input stream and it will flush the buffer.
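    To make the "feed every frame in twice" experiment concrete, here is a rough sketch of the idea on the depacketizer side; writeFrame and DUPLICATES are hypothetical names, not from my code:

        // Hypothetical relay on the desktop side: each reassembled h264 payload is
        // written more than once into the stream that feeds FFmpegFrameGrabber.
        // If latency drops by roughly the same factor as DUPLICATES, the delay is
        // coming from buffering inside avcodec rather than from the transport.
        private static final int DUPLICATES = 2;   // try 2 or 10

        void writeFrame(byte[] h264Payload, OutputStream toGrabber) throws IOException {
            for (int i = 0; i < DUPLICATES; i++) {
                toGrabber.write(h264Payload);
            }
            toGrabber.flush();
        }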

    If you need to stream video for robotic vision, check out my blog post: http://cagneymoreau.com/stream-video-android/

    What you're looking for is setVideoOption("threads", "1"), which is well documented here: ffmpeg.org/ffmpeg-codecs.html – Samuel Audet Dec 28, 2019 at 7:11
