
Hi there, thanks so much for making this, it's helped me a lot to understand timecode creation.

I'm just a video editor trying to solve a problem, so I don't know Swift very well, but I kept getting warnings about the .wait() in "AVAsset Timecode Write.swift" because it doesn't seem to play well with modern concurrency, or at least not the way I was trying to use it (I don't really know what I'm doing). It worked until I ran 1TB of files through it to replace timecode and then compress; eventually it would cause a crash, so unfortunately I had to stop using TimecodeKit.

However, I looked into what you were doing and what I could do and came up with a solution using your method of creating new timecode samples.

I was just wondering: is your code supposed to add only one sample? My output files seem fine, but the timecode end is always 4 frames after the timecode start, whereas the original files have a timecode end matching the final frame of the video. If it's only supposed to add one sample, it seems like you realised the timecode end doesn't actually matter?

I'm also not sure whether your timecode is always written in the timescale of the original timecode track, so I've added the timescale to the writer just in case:

import AVFoundation
import CoreMedia

// called with: await writeTemporaryTimecodeTrack(from: sourceTimecodeTrack, toURL: tempfileURL)

func writeTemporaryTimecodeTrack(
    from sourceTimecodeTrack: AVAssetTrack,
    toURL tempfileURL: URL
) async {
    // get settings from source asset
    let formatDescriptions = try? await sourceTimecodeTrack.load(.formatDescriptions)
    guard let formatDescription = formatDescriptions?.first else {
        print("Failed to load timecode track properties")
        return
    }
    let inputFrameDuration = CMTimeCodeFormatDescriptionGetFrameDuration(formatDescription)
    let timeScale = inputFrameDuration.timescale
    
    // set up writer
    guard let writer = try? AVAssetWriter(url: tempfileURL, fileType: .mov) else {
        print("Writer initialisation failed")
        return
    }
    writer.movieTimeScale = timeScale
    
    let input = AVAssetWriterInput(mediaType: .timecode, outputSettings: nil)
    input.mediaTimeScale = timeScale
    
    guard writer.canAdd(input) else {
        print("Adding input to writer failed")
        return
    }
    writer.add(input)
    
    guard writer.startWriting() else {
        print("Failed to start writing")
        return
    }
    writer.startSession(atSourceTime: .zero)
    
    // fetch start timecode or set new one here
    guard let initialFrameNumber = getInitialTimecode(for: sourceTimecodeTrack) else {
        print("Failed to get initial timecode frame")
        return
    }
    var startTimecode = initialFrameNumber.bigEndian
    
    // write data
    guard let blockBuffer = try? CMBlockBuffer(length: MemoryLayout<UInt32>.size) else {
        print("Failed to initialise blockbuffer")
        return
    }
    do {
        try blockBuffer.fillDataBytes(with: 0x00)
        try withUnsafeBytes(of: &startTimecode) { startTimecodePointer in
            try blockBuffer.replaceDataBytes(with: startTimecodePointer)
        }
    } catch {
        print("Failed to write blockbuffer data")
        return
    }
    
    // append sample buffer
    guard let sampleBuffer = try? CMSampleBuffer(
        dataBuffer: blockBuffer,
        formatDescription: formatDescription,
        numSamples: 1,
        sampleTimings: [
            CMSampleTimingInfo(
                duration: inputFrameDuration,
                presentationTimeStamp: .zero,
                decodeTimeStamp: .invalid
            )
        ],
        sampleSizes: [MemoryLayout<UInt32>.size]
    ) else {
        print("Failed to initialise samplebuffer")
        return
    }
    do {
        try sampleBuffer.makeDataReady()
    } catch {
        print("Failed to make samplebuffer ready")
        return
    }
    input.append(sampleBuffer)
    input.markAsFinished()
    
    // finish writing
    do {
        try await finishWritingAsset(writer: writer, frameDuration: inputFrameDuration)
        print("Asset writing completed successfully.")
    } catch {
        print("Error during asset writing: \(error)")
    }
}

func finishWritingAsset(writer: AVAssetWriter, frameDuration: CMTime) async throws {
    writer.endSession(atSourceTime: frameDuration)
    try await withCheckedThrowingContinuation { (continuation: CheckedContinuation<Void, Error>) in
        writer.finishWriting {
            if writer.status == .completed {
                continuation.resume(returning: ())
            } else if writer.status == .failed {
                let defaultError = NSError(domain: "com.yourdomain.error",
                                           code: -1,
                                           userInfo: [NSLocalizedDescriptionKey: "Unknown error during asset writing."])
                continuation.resume(throwing: writer.error ?? defaultError)
            } else {
                let defaultError = NSError(domain: "com.yourdomain.error",
                                           code: -2,
                                           userInfo: [NSLocalizedDescriptionKey: "Asset writing was cancelled or an unexpected error occurred."])
                continuation.resume(throwing: defaultError)
            }
        }
    }
}
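The getInitialTimecode(for:) helper referenced above isn't shown here, so for anyone reading along, this is a minimal, hypothetical sketch of the non-drop-frame conversion such a helper might perform before the .bigEndian byte swap (the HMSF values and fps below are illustrative assumptions, not from the original code, and drop-frame is deliberately ignored):

```swift
import Foundation

// Hypothetical: convert a non-drop-frame HH:MM:SS:FF timecode into the
// UInt32 frame counter that a 'tmcd' sample stores (before byte-swapping).
func frameNumber(hours: Int, minutes: Int, seconds: Int,
                 frames: Int, fps: Int) -> UInt32 {
    let totalSeconds = (hours * 60 + minutes) * 60 + seconds
    return UInt32(totalSeconds * fps + frames)
}

// Example: a start timecode of 19:43:14:04 at 25 fps
let start = frameNumber(hours: 19, minutes: 43, seconds: 14, frames: 4, fps: 25)
// The sample data is this value in big-endian byte order:
let sampleValue = start.bigEndian
print(start)
```

A real helper would instead read the counter straight out of the source track's first sample, but the arithmetic above is what that counter represents for non-drop-frame media.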

// if successfully written then call: await replaceTimecodeInMovie(movie: movie, withNewTimecodeFrom: tempfileURL)

func replaceTimecodeInMovie(movie: AVMutableMovie, withNewTimecodeFrom tempfileURL: URL) async {
    // load new timecode asset
    let newTimecodeAsset = AVMutableMovie(url: tempfileURL, options: [AVURLAssetPreferPreciseDurationAndTimingKey: true])
    guard let newTimecodeTrack = newTimecodeAsset.tracks(withMediaType: .timecode).first else {
        print("Failed to load new timecode track.")
        return
    }
    let newTimecodeTrackDuration = newTimecodeTrack.timeRange.duration
    
    // remove existing timecode tracks from the mutable movie
    let existingTimecodeTracks = movie.tracks(withMediaType: .timecode)
    existingTimecodeTracks.forEach { track in
        movie.removeTrack(track)
    }
    
    // create new mutable timecode track
    guard let movieTimecodeTrack = movie.addMutableTrack(withMediaType: .timecode, copySettingsFrom: nil) else {
        print("Failed to add new mutable timecode track.")
        return
    }
    
    // add new timecode track
    do {
        let range = CMTimeRangeMake(start: .zero, duration: newTimecodeTrackDuration)
        try movieTimecodeTrack.insertTimeRange(range, of: newTimecodeTrack, at: .zero, copySampleData: true)
        let videoTracks = movie.tracks(withMediaType: .video)
        if videoTracks.isEmpty {
            print("No video tracks found.")
        } else {
            videoTracks.forEach { videoTrack in
                movieTimecodeTrack.addTrackAssociation(to: videoTrack, type: .timecode)
                print("Timecode associated with video track.")
            }
        }
    } catch let error as NSError {
        print("Failed to replace timecode track: \(error)")
        print("Error localised description: \(error.localizedDescription)")
    }
}


I kept getting warnings about the .wait() in "AVAsset Timecode Write.swift" because it doesn't seem to work well with modern concurrency, or at least however I was trying to use it

Do you remember what sort of errors? Do you mean when Thread Sanitizer is turned on?

it worked until I ran 1TB of files to replace timecode and then compress, and then eventually it would cause a crash

Sounds like a memory issue. Do you recall if that caused really large RAM usage? If so, then that's probably a small fix where a release pool might be needed.

I was just wondering if your code is supposed to only add 1 sample? I noticed that my output files seem to be fine but the timecode end is always 4 frames after timecode start, whereas the original files have a timecode end matching the final frame of the video. If it's only supposed to add 1 sample it seems like you realised the timecode end doesn't actually matter?

I'd have to look into it. I didn't run into any issues with the test video files I was using, so maybe it's an edge case, not sure. In theory you just need one timecode sample at the start of the track and have the track duration match the video track's (if I recall correctly; it's been a while since I worked on this module).

@KweenLizzie

Thanks, and sorry, but I don't remember the exact error as it never printed to the console. There was just a purple warning in Xcode about waiting for a lower-class or lower-priority task or something? Like I said, I don't really know what I'm doing, but the gist was that the function that compresses the files could hang because of the .wait() when appending the timecode. I ignored it at first, but then it started crashing after running for a while, so I updated the code to use await instead and haven't had any crashes since. I assume it's because everything else in the code uses the latest concurrency and doesn't play well with the .wait()

I've since done some tests with the samples, reconnecting Premiere projects with a mix of camera files, and generally there don't seem to be any issues with just the one timecode sample. I ended up adding an extra sample at the end for the final frame anyway, and now the timecode in MediaInfo has the same start and end timecode as the original file, which feels right regardless of whether it's needed or not. Hopefully it isn't a bad thing to have an extra sample at the end! The whole reason I tried using TimecodeKit in the first place is that AVAssetWriter kept crashing when camera files had a timecode sample for every video frame, so replacing the timecode fixed that issue.

Anyway everything seems to be working for me now, thanks.
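In case it helps anyone doing the same thing, the timing of that extra end sample can be reasoned about with plain rational arithmetic. This is an illustrative sketch, not the actual writer code; times are modelled as value/timescale pairs like CMTime, and the example numbers are assumptions:

```swift
import Foundation

// Illustrative sketch: where a second "end" timecode sample would sit.
// Times are modelled as value/timescale rationals, like CMTime.
struct RationalTime { var value: Int64; var timescale: Int32 }

// For a track of `totalFrames` frames, the final sample's presentation
// time is (totalFrames - 1) frame durations into the track, and its
// stored frame counter is startFrame + totalFrames - 1.
func endSampleTiming(startFrame: UInt32, totalFrames: Int64,
                     frameDuration: RationalTime) -> (pts: RationalTime, frame: UInt32) {
    let pts = RationalTime(value: frameDuration.value * (totalFrames - 1),
                           timescale: frameDuration.timescale)
    let frame = startFrame + UInt32(totalFrames) - 1
    return (pts, frame)
}

// Example: 10 seconds at 25 fps (250 frames), starting at frame 0:
// the end sample lands at 249/25 s and stores frame counter 249.
let timing = endSampleTiming(startFrame: 0, totalFrames: 250,
                             frameDuration: RationalTime(value: 1, timescale: 25))
print(timing.pts.value, timing.frame)
```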

@orchetect

Thanks for the details.

purple warning in Xcode about waiting for a lower class task or lower priority task or something

Yeah, that might depend on how/where the methods are being called, as I'm not seeing these errors come up when running the package's unit tests. But since DispatchGroup is being used, that could definitely run interference in an async/await context.

The AVFoundation portion of TimecodeKit was written around the time when async/await was very new, so efforts were made to keep the functions backwards compatible for apps with older OS requirements. At some point I'll refactor them to use the newer API, since Swift Concurrency is fairly pervasive now. That should fix the issues you ran into.

Hopefully it isn't a bad thing to have a random sample at the end!

Technically it's harmless, but I'm not sure it's necessary, that's all. A start sample and the track duration should be all that's needed. But if it works, then great!

everything seems to be working for me now

Great - glad you found some solutions for your use case.

I'm always interested if there's areas of the library that are causing friction or may be buggy, as I want to improve its usage where possible. Thanks for sharing the code snippet (I cleaned it up a little so it's formatted with syntax highlighting) - it may be useful for reference when I dive back into the codebase.

@orchetect

Just FYI - I finally updated the library and migrated all of the AVFoundation methods to async/await. Released in 2.2.0.

If you end up doing refactors let me know how that goes. Otherwise you can stay on 2.1.0 if you're not ready yet.

@KweenLizzie

Thanks, I'm finally going to look into this. I was going along great until I found that the new Canon C80 and C400 files have timecode issues after transcode, and I think it has something to do with the way I was calculating the start frame. Most of the info on the internet seems to be about issues with drop frame, not 25fps, so I'll report back my findings in case anyone with the same issue finds this thread.

Answer selected by KweenLizzie

I discovered that the issue with Adobe Premiere isn't that the timecode is incorrect; it's the way timecode is read in Premiere, specifically with MP4s. Up until this point I hadn't compressed MP4s with timecode, and (at least on macOS) Premiere doesn't read MP4s the same way QuickTime does, so if the timecode doesn't start at 0 it does some wacky calculation that throws everything off massively the further from 0 the timecode starts.

After investigating, I realised that AVAssetWriter was also causing all the MOV files to read timecode 1 frame earlier in Premiere, but as the timecode is correct in QuickTime Player and MediaInfo, I just didn't notice.

People who know what they're doing might already be aware of this edit-list/start-frame issue with AVAssetWriter vs Premiere and know how to fix it, but I've had to ditch AVAssetWriter since I couldn't find anything online explaining the issue or how to fix it. I now just use FFmpeg, which is slower and has some other issues, but the timecode matches perfectly in Premiere every time.

Anyway, I thought I should share in case anyone else comes across this issue, as I couldn't find any info about it in internet searches. I also discovered I can fix already-compressed files with this FFmpeg command:

ffmpeg -i "INPUT PATH" \
    -movflags use_metadata_tags+faststart -use_editlist 0 \
    -c copy "OUTPUT PATH"
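For anyone who'd rather drive this remux from Swift than the shell, a minimal sketch using Foundation's Process might look like the following. The ffmpeg install path and file paths are assumptions (ffmpeg must be installed on the machine), and the argument builder simply mirrors the command above:

```swift
import Foundation

// Build the argument list for the edit-list-stripping remux, mirroring:
// ffmpeg -i IN -movflags use_metadata_tags+faststart -use_editlist 0 -c copy OUT
func editListFixArguments(input: String, output: String) -> [String] {
    [
        "-i", input,
        "-movflags", "use_metadata_tags+faststart",
        "-use_editlist", "0",
        "-c", "copy",
        output
    ]
}

// Hypothetical usage (assumes ffmpeg lives at /usr/local/bin/ffmpeg):
func runEditListFix(input: String, output: String) throws {
    let process = Process()
    process.executableURL = URL(fileURLWithPath: "/usr/local/bin/ffmpeg")
    process.arguments = editListFixArguments(input: input, output: output)
    try process.run()
    process.waitUntilExit()
}
```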

@KweenLizzie

I tried to get to the bottom of it but couldn't find anything online (and I'm out of my depth). However, if you want to understand what I'm talking about, you might be able to see the same issue in DaVinci Resolve if your compressed files have the same problem: if an MP4 with timecode written by AVAssetWriter shows the correct timecode in QuickTime but not in Resolve, then you have the same issue. Otherwise I must be missing something in the AVAssetWriter setup.

A simple test is running this ffprobe command on a MOV or MP4 file from a camera with timecode:
ffprobe -select_streams d:0 -show_packets -i "FILE PATH" | grep -m1 pts_time

Fresh from the camera I always get: pts_time=0.000000
But after re-compressing with AVAssetWriter I get: pts_time=0.040000 (for a 25fps file, so that's 1 frame)

I've tried everything to get AVAssetWriter to start at 0, but apparently it's supposed to start at 1 frame, with the edit list informing the software reading the file (I could be wrong, I really don't know). Either way, the timecode is correct in QuickTime Player, but apparently Premiere ignores the edit list, so the timecode is 1 frame off, and the way Premiere reads MP4s means that 1 frame compounds into a large time difference the further from 0 the timecode is. DaVinci Resolve is even further from the correct timecode.

As an example, I have a GoPro MP4 with timecode start of 19:43:14:04, but after compressing the three apps show different start times:
QT: 19:43:14:04
Premiere: 19:26:32:21
Resolve: 17:48:53:18

But then after running this on the compressed file:
ffmpeg -i "INPUT PATH" \
    -movflags use_metadata_tags+faststart -use_editlist 0 \
    -c copy "OUTPUT PATH"

And then running this on the new output file:
ffprobe -select_streams d:0 -show_packets -i "OUTPUT PATH" | grep -m1 pts_time

It returns:
pts_time=0.000000

And then the timecode reads 19:43:14:04 in all three apps.

If you have the time, I would appreciate finding out whether you have the same issue; if not, then it must be something I'm missing in the AVAssetWriter setup. I've tried everything I can think of to get the pts_time to 0 and have given up and just use FFmpeg now, but it's so much slower when compressing (on an M2 Mac using VideoToolbox, at least), so not ideal.

@KweenLizzie

Hi there, I found that the solution to my issue is to force the first timecode sample's timestamp to zero when using AVAssetWriter to append the first timecode sample, and then subtract 1 frame from the final timecode sample's timestamp before appending (if just adding the first and last samples).

So basically, subtract 1 frame from all timecode sample timestamps and use that timing to create the sample buffer just before appending.

After doing this QuickTime Player, Premiere, and Resolve all showed the same correct timecode. Also, FFmpeg probe showed the new file to have pts_time=0.000000, and not pts_time=0.040000.

So I'm not sure if I had AVAssetWriter set up differently from others, but as far as I can tell it was all correct according to the Apple docs, and QuickTime Player and MediaInfo both showed correct timecode after compression. I couldn't find anyone else talking about this issue online, so maybe it's common knowledge and people already know to subtract 1 frame from the sample buffer timing when appending timecode with AVAssetWriter?
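To make the adjustment concrete, here is a small illustrative sketch of the idea described above. This is pure arithmetic, not the actual writer code; times are value/timescale pairs like CMTime, and the clamp-at-zero behaviour for the first sample is my assumption about the intent:

```swift
import Foundation

// Illustrative sketch: shift a timecode sample's presentation timestamp
// back by one frame duration, clamping at zero so that the first sample
// lands exactly at time 0 instead of 1 frame in.
// `value` and `frameDurationValue` share the same timescale (e.g. 25 for
// a 25 fps file with frame duration 1/25).
func adjustedPTS(value: Int64, frameDurationValue: Int64) -> Int64 {
    max(0, value - frameDurationValue)
}

// Example at 25 fps (frame duration value 1 at timescale 25):
// the writer's default first-sample time of 1 frame becomes 0,
// and a last-sample time of 250 frames becomes 249.
let first = adjustedPTS(value: 1, frameDurationValue: 1)
let last  = adjustedPTS(value: 250, frameDurationValue: 1)
print(first, last)
```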

@orchetect

Thanks for all the info. I'll do some tests on my end at some point to see if I can reproduce the issue or if I can find any reasonable explanation.

@orchetect

I will say, just offhand, that GoPro is known for doing some strange things. I worked on a contract last year for a software developer with an app that recorded video and audio streams, and I spent many hours chasing odd behaviour from the GoPro webcam driver, for one thing. It was not syncing to the host clock properly and was therefore getting written to video files totally out of sync or corrupted, no matter what software was recording it. I'm not saying that's directly related to this issue, but I thought I'd mention it. As for the video files the GoPro device itself writes to memory cards on its own hardware, I'm not sure if there could be issues there too, because the experience didn't instil a lot of confidence.

@KweenLizzie

Yes, I would have been more suspicious of the GoPro files if it didn't also happen to the Canon C400 when saving files as .mp4 (I'm guessing most people save as MXF). That's when I realised all my compressed .mov files were off by 1 frame, but as it's such a small amount I just never noticed (because the timecodes looked correct when inspecting). So I then compared MOV files that were compressed and imported into Premiere against the same exports from the original files, and every file I checked with timecode and a .mov extension was 1 frame off, and every file with timecode and a .mp4 extension was wildly off, even more so in DaVinci Resolve. So it actually happens with every camera with timecode I've tested; it's just that MOV files aren't affected as much.
