
The buffered video source uses a queue of frames, and the source then picks the best next frame based on their timestamps. But that makes the offset between input timestamps and output timestamps grow whenever the video source temporarily stops having frames ready (e.g. the CPU is too busy, or video acquisition lags a bit).

I have a situation where the camera returns between 30 and 60fps, and my output is at 30fps.

Now imagine there is a lag in receiving frames (say the CPU is too busy). For a while no new frames arrive, but last_frame_ts nevertheless advances by sys_offset (1/30 of a second at 30 fps). It falls into this case:

source->last_frame_ts += sys_offset;

and the function finishes with no frame:

blog(LOG_DEBUG, "no frame!");

Even though we end up with no frame, last_frame_ts has advanced a lot. With regular frames I get a frame_offset of about 20 ms, but when there are no frames because of a temporary CPU lag, last_frame_ts still advances by 33 ms!

So when the CPU becomes free again, the video source may have last_frame_ts as much as a second in the future, making the video go 1 s behind the audio (up to 2 s, or as much as the queue length), and it never recovers back to a small delay.

It's fair to say that, with a buffered video source, video will be delayed by a random amount of time that increases every time some frame lags a bit behind, and it will never be resynced.

This looks clearly wrong, and I don't see when anyone would want this behavior. So is the ready_async_frame algorithm wrong, or do people really just have to use video sources unbuffered when recording live video?

One hard limit I thought of was something like:

if (source->last_frame_ts > sys_time)
   source->last_frame_ts = sys_time;

That at least limits the delay to the acquisition latency, but it leaves too little room to make any use of the buffer. Maybe "= sys_time + 50ms"?
I don't know what the desirable fix would be: try to keep frame->timestamp within some bounds? Use a queue smaller than 30 frames (that's a 1 s delay at 30 fps input)?
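
For reference, a minimal sketch of that hard limit with the small slack suggested above (the 50 ms value and the MAX_BUFFERING_SLACK_NS name are just illustrative here, not existing libobs constants):

/* Illustration only, not the actual libobs code: clamp last_frame_ts so it
 * can never run more than a small slack ahead of the current system time. */
#define MAX_BUFFERING_SLACK_NS (50 * 1000000ULL) /* hypothetical 50 ms */

if (source->last_frame_ts > sys_time + MAX_BUFFERING_SLACK_NS)
	source->last_frame_ts = sys_time + MAX_BUFFERING_SLACK_NS;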

Replies: 3 comments · 1 reply

Comment:

In the past I've had to turn off buffering with V4L2 inputs, too, for this same reason. I never investigated why because I hadn't looked under the hood of OBS.

I definitely think this should be looked at or explained!

The documentation on async video sources is rather sparse - there are no docs for obs_source_async_unbuffered() and the like.
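
Since the functions are undocumented, a minimal sketch of how a source plugin might opt out of buffering may be useful; the helper name here is just for illustration, and only the two obs_source_* calls are the actual libobs API:

#include <obs-module.h>
#include <util/base.h>

/* Sketch only: disable timestamp-based buffering for an async video source,
 * so frames are rendered as soon as they arrive instead of being queued. */
static void use_unbuffered_mode(obs_source_t *source)
{
	obs_source_set_async_unbuffered(source, true);

	/* the matching (also undocumented) getter */
	blog(LOG_INFO, "async unbuffered: %s",
	     obs_source_async_unbuffered(source) ? "on" : "off");
}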

Comment:

I added a debug string to show a situation where the V4L2 input grabs a few frames between calls to ready_async_frame, implying a break of about 100 ms between calls (the webcam is 30 FPS), while OBS is (supposedly) rendering at 50 FPS.

I did this by switching scenes, which caused a small stutter (see the debug output line: info: User switched to scene 'Desktop share test').

I'm seeing that the sys_time argument suggests only 20 ms has passed each time ready_async_frame is called - this comes from obs->video.video_time - and I'm still trying to figure out why obs->video.video_time doesn't advance as much as I thought it would, given that there was a delay of 100 ms+:

debug: source->last_frame_ts: 30288348000, frame_time: 30300588000, sys_offset: 20000000, frame_offset: 12240000, number of frames: 1
debug: new frame, source->last_frame_ts: 30308348000, next_frame->timestamp: 30300588000
debug: v4l2-input: /dev/video0: ts: 039610 buf id #3, flags 0x00012001, seq #874, len 816000, used 73081
debug: v4l2-input: output timestamp: 30332581000
info: User switched to scene 'Desktop share test'
debug: v4l2-input: /dev/video0: ts: 071608 buf id #0, flags 0x00012001, seq #875, len 816000, used 73186
debug: v4l2-input: output timestamp: 30364579000
debug: v4l2-input: /dev/video0: ts: 107617 buf id #1, flags 0x00012001, seq #876, len 816000, used 72775
debug: v4l2-input: output timestamp: 30400588000
debug: v4l2-input: /dev/video0: ts: 139610 buf id #2, flags 0x00012001, seq #877, len 816000, used 72423
debug: v4l2-input: output timestamp: 30432581000
debug: source->last_frame_ts: 30308348000, frame_time: 30332581000, sys_offset: 20000000, frame_offset: 24233000, number of frames: 4
debug: no frame!
debug: source->last_frame_ts: 30328348000, frame_time: 30332581000, sys_offset: 20000000, frame_offset: 4233000, number of frames: 4
debug: new frame, source->last_frame_ts: 30348348000, next_frame->timestamp: 30332581000
debug: source->last_frame_ts: 30348348000, frame_time: 30364579000, sys_offset: 20000000, frame_offset: 16231000, number of frames: 3

It never recovers to the 1-2 frames that would normally be in the async buffer; it eventually balloons out to 20-21 frames.
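
To reason about that ballooning, here is a rough toy model of the arithmetic visible in the log above (plain C, not libobs code). It assumes, as the log suggests, that sys_offset stays a fixed 20 ms per call, that calls are simply skipped during the ~100 ms stall while the camera keeps queueing frames every ~32 ms, and that each call drains every queued frame whose timestamp has fallen behind last_frame_ts. Under those assumptions the stall permanently raises the queue depth, because afterwards last_frame_ts and the camera timestamps advance at the same average rate, so the backlog never drains:

/* Toy model only, not libobs code.  All times in milliseconds. */
#include <stdio.h>

int main(void)
{
	double last_frame_ts = 0.0; /* advances 20 ms per render call */
	double next_capture = 0.0;  /* camera queues a frame every 32 ms */
	int queued = 0;

	for (int tick = 0; tick < 300; tick++) {
		double now = tick * 20.0; /* wall clock, 50 render ticks/s */
		int stalled = now >= 2000.0 && now < 2100.0; /* 100 ms stall */

		/* the camera keeps capturing regardless of the stall */
		while (next_capture <= now) {
			queued++;
			next_capture += 32.0;
		}

		if (stalled)
			continue; /* render call skipped, last_frame_ts frozen */

		last_frame_ts += 20.0; /* fixed sys_offset, as in the log */

		/* queued frames carry timestamps next_capture - 32, -64, ...;
		 * drop every frame whose timestamp fell behind last_frame_ts */
		while (queued > 0 &&
		       next_capture - queued * 32.0 <= last_frame_ts)
			queued--;

		if (tick % 25 == 0)
			printf("t=%5.0f ms  queued=%d\n", now, queued);
	}
	return 0;
}

In this model the queue sits at 0-1 frames before the stall and settles at about 3 frames afterwards, and each further stall would add to that, which matches the behaviour in the log.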

Comment:

This is definitely bugged 🐛

Firstly, if any timestamp sent with a frame is zero, then the test if (!source->last_frame_ts) gives a false positive for whether or not we are looking at the first frame (there are a couple of places in the code where this test is performed).
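
To spell out that false positive: when zero doubles as the "no previous frame" sentinel, a frame that legitimately carries timestamp 0 is indistinguishable from the start-up state. A minimal sketch (the have_first_frame flag is purely hypothetical, not a proposed patch):

/* Illustration only.  With zero doubling as the sentinel, this branch is
 * also taken when the previous frame genuinely had timestamp == 0: */
if (!source->last_frame_ts) {
	/* treated as "first frame ever" */
}

/* One unambiguous alternative would be to track the state explicitly
 * instead of overloading the timestamp value (hypothetical field): */
if (!source->have_first_frame) {
	source->have_first_frame = true;
	/* first-frame handling */
}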

Secondly, there is something incorrect in the logic within get_closest_frame, which can be demonstrated with the following example.

Reproducible example

The optional test-input plugin has a Sync Test (Async Video/Audio) source. This source is supposed to show a black box for 1 sec, followed by a white box for 1 sec. When the box is white, a 'tone' is played.

This source, however, sends a timestamp of zero, so we need to address that first. I made the following changes to test/test-input/sync-async-source.c as a temporary workaround.

@@ -80,8 +80,8 @@ static void *video_thread(void *data)
        while (os_event_try(ast->stop_signal) == EAGAIN) {
                fill_texture(pixels, whitelist ? 0xFFFFFFFF : 0xFF000000);
 
-               frame.timestamp = cur_time - start_time;
-               audio.timestamp = cur_time - start_time;
+               frame.timestamp = cur_time; //- start_time;
+               audio.timestamp = cur_time; //- start_time;

I then put the (workaround) source into a scene and found that the tone was being played while the box was black - the opposite of what should be happening.

I'll work on a fix for the logic, but given how 'deep' this is within the way OBS (async) sources operate, it will need some good eyes to check it over.

I have a sneaking suspicion that async (video) sources are 1 frame 'behind' where they should be at all times - but most people are playing videos at frame rates where this doesn't matter.

Reply from @stephematician:

The test plugin can be modified to use the "unbuffered" approach, in which case the audio and video are synchronised correctly (e.g. within the video thread of the source, add the line obs_source_set_async_unbuffered(ast->source, true); before the loop).
