EP049: Dealing With Archival Pt 2
Please Give A Warm Welcome To Our Newest Sponsor Conform.Tools!
Conform Tools allows you to convert timelines between Premiere, Resolve, and other NLEs while automatically solving all those tedious issues that can add significant time to your workflow. With a growing toolbox of features, you can avoid time consuming trim and transfer issues, and securely send large media files to collaborators at a fraction of the size, in minutes instead of hours.
Built by post professionals, Conform Tools helps editors, colorists, and conform artists move faster and finish stronger.
Check out conform.tools for more info.
Video Originated Archival Can Look Better –
Once You Know What To Look For
Some of the specifics we discuss include:
- Dealing with interlacing
- Blanking & edges
- Pixel Aspect Ratio (PAR) issues
- Resolution
Check out www.offsetpodcast.com for our entire library of episodes + some of the additional assets mentioned in this episode that are available for download.
Be sure to like and subscribe to the podcast wherever you found it, and check out our growing library of episodes. If you like the podcast, it'd mean the world to us if you'd consider supporting the show by buying us a cup of virtual coffee - https://buymeacoffee.com/theoffsetpodcast
See you in about two weeks for a new episode.
Video
Links
Transcript
00:00:00:05 - 00:00:15:09
Robbie
Hey everybody, welcome back to The Offset Podcast. And today we're continuing on with part two of our discussion on dealing with archival sources. Stay tuned.
00:00:15:11 - 00:00:47:14
Joey
Support for this episode comes from Flanders Scientific's XMP551 and XMP651, the flagship QD-OLED reference monitors that are reshaping modern grading rooms. Their large format and industry-leading viewing angles let clients see accurate images from anywhere in the room, and their true HDR and SDR reference performance makes single-monitor room layouts possible. When everyone relies on the same display, you avoid the headache of explaining why the client and grading monitors don't match.
00:00:47:19 - 00:00:50:19
Joey
Learn more at FlandersScientific.com.
00:00:50:21 - 00:01:11:05
Robbie
Hey everybody, welcome back to The Offset Podcast. Today we're continuing on with part two of our two-part series on dealing with archival sources. In part one, we talked about the big picture of archival as well as dealing with film archival, but in this episode, we're going to specifically talk about video archival and some of the challenges that pop up there.
00:01:11:07 - 00:01:32:16
Robbie
Now, before we continue on like usual, let's just do some quick housekeeping. As a reminder, our viewer audience survey is still open. You can find that at this link right here. If you have 5 or 10 minutes to give us some feedback about how the show is treating you, we'd really appreciate that. Your feedback is going to directly help us shape 2026 and how the podcast is going to shape up.
00:01:32:16 - 00:01:56:10
Robbie
So we appreciate anything you can do there. Of course, you can always follow us on social media, on Facebook or Instagram. Just search for The Offset Podcast, and then you can also head over to offsetpodcast.com for our complete library, as well as show notes, and we actually link to some cool tools and utilities that Joey created himself to deal with some of the challenges with interlacing and stuff like that.
00:01:56:10 - 00:02:21:16
Robbie
So be sure to check out offsetpodcast.com as well. Now, Joey, as I said, we are going to talk a little bit more about the video side of things rather than the film archival. There's some overlap here, of course, but I want to tell the viewers at home that I have been on the receiving end of Joey's pontifications on interlacing for almost, you know, 15, 20 years, whatever it's been.
00:02:21:18 - 00:02:52:11
Robbie
So this is something I just want to preempt and say that Joey is very passionate about it. But let us just begin with the idea of interlacing. And it's amazing to me in 2025, or 2026 I should say now, how quickly people forgot about interlacing as a thing. It has somehow ended up in the area of nostalgia.
00:02:52:12 - 00:03:14:09
Robbie
It's a vibe, as my kids would say, right? It's like, oh, we want to make something look archival, old. Let's add scan lines and make it interlaced looking or whatever. And I'm like, guys, nobody was watching interlaced TV going, wow, I love these scan lines and all of these jagged edges. They look so good.
00:03:14:09 - 00:03:25:13
Robbie
Like, it didn't happen then. What makes people think that that's the way people watched TV and consumed media? You know, even 20 years ago, it didn't happen that way, right?
00:03:25:14 - 00:03:50:19
Joey
Yeah. This is my personal documentary hill to die on, and I'll say this, I've done extensive research. I've interviewed editors, I've interviewed other colorists of various generations to figure out what their understanding level of these issues is. And I've come up with some answers, I think. But let's start with the beginning. What is interlacing? Well, yeah.
00:03:50:22 - 00:04:21:14
Joey
For the first 100 years of television, every frame of the image was made up of two fields. Every other line would be displayed: the even lines first, and then the odd lines after. Okay. This was a way of optimizing the signal to have the best bandwidth efficiency for broadcast. Okay, so this was happening in analog television from the black and white days.
00:04:21:14 - 00:04:51:17
Joey
Okay. That's how far back this concept goes. Now, at the time, we were using displays called cathode ray tubes. For each line, essentially, imagine a laser drawing the line one by one. It was an electron beam hitting glowing phosphors, and they would glow for a little bit and then fade down gradually. So it's going beam line one, beam line three, beam line five.
00:04:51:18 - 00:05:15:03
Joey
And we get to the bottom. It goes back up to the top: beam line two, beam line four. Okay. Now these pixels, if you will, weren't actually pixels, because they didn't have discrete boundaries. They were not instant on and off. They would slowly glow up and then slowly glow down. If you look at high-speed footage of a CRT, you'll see exactly what I mean.
00:05:15:05 - 00:05:51:14
Joey
This meant, combined with the persistence of vision in our brains, we would put together all of the lines as one image, and both fields as one contiguous image. So when we watched interlaced footage, it looked smooth on the displays of the time, which were CRTs. That's point number one. Point number two is, today when we deal with interlaced footage, most of the time it's a deliverable that's derived from a progressive source.
00:05:51:14 - 00:06:16:16
Joey
So we take one image, split it into two fields for the file or the transmission, and send it on its way. Okay, that's an easily reversible operation. However, for the first hundred years of television, and basically up until the mid-2000s, when Phantom Menace came out and Sony invented 24p on a television camera, the HDW-F900, well.
00:06:16:18 - 00:06:18:14
Robbie
A little earlier than that. Early, early 2000s.
00:06:18:14 - 00:06:44:06
Joey
Yeah, yeah, yeah. Phantom Menace literally was the demarcation line for 24 frames a second in video. Okay, everything before that, there was almost always what we refer to as inter-field motion. In a 59.94i signal, it's not 30 frames a second, it's 60 fields per second, which means an object can move between field one and field two.
00:06:44:09 - 00:07:09:11
Joey
Field one and field two are two distinct moments in time, whereas now, with one frame and progressive, that frame is one distinct moment in time. So what this means is if you take an interlaced image and you put it on a progressive display, and you put field one and field two on top of each other, they might horizontally not line up because things are moving.
00:07:09:11 - 00:07:36:15
Joey
That's where you get those little jaggy sideways lines. We call that baked-in interlacing, because we're taking two discrete moments in time and squishing them together as one, and it looks like garbage. It looks like utter, complete garbage. And I have talked to so many young people in our industry, and I've shown them kind of images both ways.
00:07:36:17 - 00:08:02:06
Joey
And a vast majority of them that didn't grow up on CRTs and interlacing, they're not leaving that in these shows because they don't want to fix it. And like I said, I've seen this on the highest end documentaries on Netflix, on Amazon. This is not a small problem, in my opinion. They're not leaving it in out of laziness or malice or anything bad.
00:08:02:08 - 00:08:22:12
Joey
Yeah, they think, because they open up the file on their computer and this is what it looks like, that this is what the show used to look like, or the image used to look like. I'm here to tell you all that TV for the past hundred years did not have horizontal jaggy lines. Okay, those need to be fixed if at all possible.
00:08:22:15 - 00:08:28:23
Joey
And now we can talk about kind of how we can fix that in a modern production world.
00:08:29:00 - 00:08:47:06
Robbie
I want to add just a few other bits of context here, because that's all really good stuff. The first one being that you said something earlier about, hey, it's the even and the odd fields, right? This is a concept known as field order that I think a lot of people have forgotten about.
00:08:47:06 - 00:09:04:09
Robbie
Right. And field order is important because if you have mismatched or incorrect field order, bad things happen. It's kind of like time travel a little bit, right? If you have the wrong field order, you're putting a moment in time that was supposed to come after, before, or vice versa. Oh, and fun fact.
00:09:04:12 - 00:09:09:19
Joey
Standard definition TV in the United States has a different field order than high definition interlaced TV.
00:09:09:19 - 00:09:28:10
Robbie
Didn't you say it used to be that we did everything, like in the days of DV tape, lower field first, right? And now it's all upper field first. But my point being more about the incorrect field order, and mismanaging that leads to a lot of these type of problems.
00:09:28:12 - 00:09:51:14
Robbie
The other thing I would say is that, you know, you're talking about the two fields being overlaid on each other and being in incorrect alignment. The one thing that a lot of people think works in that situation, but often doesn't because the cadence or the order of the fields is not correct, is, oh, I'll just de-interlace this content and have it work great.
00:09:51:16 - 00:10:09:03
Robbie
Yep. That only works because the de-interlacer works by going, okay, which field do you want me to pull out? Right. I'll pull out the upper field or the lower field, and then I'll recreate the image from the other field. Right. The problem is when you de-interlace and do that, if you have the wrong fields, the wrong field order, it's just not going to do anything.
00:10:09:05 - 00:10:23:12
Robbie
Or, you know, at best case, it's just going to give you an image where, anytime you do something like this, you're getting rid of temporal information. Guess what that means: you're going to have a softer, less sharp image because you're removing data.
00:10:23:14 - 00:10:46:06
Joey
So that's the thing, right? If you use what's called a de-interlace effect, and in a lot of these cases you have to do it because you're going to progressive. It's got to become progressive. So we've got to remove the interlacing somehow. But here's the problem. All de-interlacing algorithms are based on the assumption that you're giving it an interlaced input.
00:10:46:12 - 00:10:51:07
Joey
Right. And that means it has access to field one and field two.
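What Robbie and Joey are describing can be sketched in a few lines. This is a hypothetical illustration (not any NLE's actual algorithm): split an interlaced frame into its two fields, keep one, and line-double it back to full height, the crude "bob" style de-interlace that throws away the other field's temporal information.

```python
def extract_field(rows, field="upper"):
    """Pull one field out of an interlaced frame (rows = list of scanlines)
    and rebuild full height by line-doubling, a crude 'bob' de-interlace.
    Discarding the other field is exactly why the result is softer than
    the original frame."""
    start = 0 if field == "upper" else 1
    out = []
    for line in rows[start::2]:
        out.append(list(line))
        out.append(list(line))  # repeat each field line to restore height
    return out[:len(rows)]
```

Note that this only makes sense if the lines in the file really still alternate field one / field two; once the fields have been baked together and rescaled, there is nothing meaningful left to extract.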
00:10:51:10 - 00:10:53:01
Robbie
So that's what I'm saying. That's what I'm saying.
00:10:53:01 - 00:11:14:21
Joey
You don't always have that, because most of the time somebody has captured this to some digital format and burnt those two fields together into one frame. And then, hey, maybe they resized it after that. Maybe it got scaled up to HD. Once you've done that, that relationship between field one and field two is completely blown out the window.
00:11:14:21 - 00:11:26:21
Joey
You have horizontal jaggy lines baked into your image forever. And that's where I think a lot of people give up. But I'm here to tell you: still, there are ways to deal with that.
00:11:26:23 - 00:11:57:02
Robbie
Well, and also this is a big one, and you are perhaps the only person I know that still actually has this in a physical setup, right? One of the challenges for a lot of people is they don't even see the problem, right? They might see the jagged edges or whatever, but they're not even at the point where they can identify the problems, like whether it's true interlace or not, because they're not looking at it on an interlaced monitor either.
00:11:57:02 - 00:12:07:10
Robbie
Right. They're looking at things on a progressive monitor, which, depending on how it rebuilds interlaced fields or whatever, can be an issue. Yeah.
00:12:07:10 - 00:12:31:14
Joey
The quick thing to do, if you're looking at a fully progressive monitor and you need to know if there's inter-field motion in a source: bounce to another Resolve project, turn on interlacing, and by default, Resolve will now, with the left and right arrows, go by the field, not by the frame. So in a progressive image that was converted to interlaced, you'll see left and right will be the same, right?
00:12:31:14 - 00:12:52:07
Joey
It will look like a freeze frame, but if there is inter-field motion that eventually you might need to remove or address somehow, you can go left and right and see it on your progressive display, because what you're doing is just saying, go field by field. But the big thing is, in almost all cases, we're going to be getting rid of interlacing for our final mastering process at this point.
00:12:52:07 - 00:13:18:16
Joey
So we are going to be getting rid of some temporal information. The best way to do it is with a real de-interlacer, and Resolve has a very good de-interlacer, especially if you go into preferences and turn on its enhanced mode. But those de-interlacers all rely on a good, solid interlaced signal, whereas if it has been de-interlaced before, or has been captured to video and then scaled up to HD,
00:13:18:17 - 00:13:28:04
Joey
for example, your native interlacing data is gone, and we need to start looking at other solves for that baked-in jaggedness that we see.
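One way to programmatically flag that baked-in jaggedness (a hypothetical sketch of my own, not a Resolve feature) is to compare how much adjacent scanlines differ, since they came from opposite fields, versus lines two apart, which came from the same field. On a combed frame the cross-field difference spikes:

```python
def combing_score(rows):
    """Crude detector for baked-in interlacing.
    rows: list of scanlines (lists of luma values).
    Adjacent lines come from opposite fields; lines two apart come from
    the same field. On a combed frame, cross-field differences dwarf
    same-field differences, so the ratio shoots up."""
    cross = same = 0.0
    for y in range(len(rows) - 2):
        for a, b, c in zip(rows[y], rows[y + 1], rows[y + 2]):
            cross += abs(a - b)  # opposite-field neighbours
            same += abs(a - c)   # same-field neighbours
    return cross / (same + 1e-9)  # epsilon avoids divide-by-zero
```

On clean progressive material the two sums stay comparable; a score far above 1 suggests intra-frame combing worth inspecting by eye.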
00:13:28:05 - 00:13:37:19
Robbie
Okay, so let me give you a hypothetical. I have a situation that's got some baked-in jaggedness, and the producer or director is yelling at me about it. What is my first go-to way to address this?
00:13:37:21 - 00:14:03:08
Joey
A lot of people will start thinking, oh, maybe I'll do some noise reduction. Maybe I'll try that de-interlace effect. And none of those things work, right? There's only one real solve to baked-in interlacing, and it's a bit of a bummer because you do lose some resolution doing it. However, since we're keeping the entirety of the horizontal resolution, it actually looks shockingly good.
00:14:03:09 - 00:14:24:00
Joey
What we want to do is we want to resample the image vertically, essentially averaging out those two different areas in time where we get those jaggies along the vertical axis. And I do this with a fusion effect that I've built. I'll give this fusion effect out to anybody who wants it because.
00:14:24:00 - 00:14:25:02
Robbie
I'm passionate.
00:14:25:05 - 00:14:25:12
Joey
About.
00:14:25:12 - 00:14:26:00
Robbie
We'll put it in.
00:14:26:03 - 00:14:53:19
Joey
This problem in documentaries. We're going to post it. We're going to put it in the show notes. Everybody that knows me has gotten a copy of this Fusion effect, and everybody loves it. I'm actually really proud of this one. So it's really useful. Essentially, inside that Fusion effect, all we're doing is, with a slider, scaling the image up vertically, then scaling it back down again by the exact same amount.
00:14:53:19 - 00:15:17:06
Joey
So we're resampling it vertically. But unlike doing that in the timeline, Fusion has a lot more options for the resampling algorithms. So I went through and picked the one that held the most detail for this very weird application and baked that into it. So essentially you just get a slider that very gently smears the image vertically to get rid of those lines.
00:15:17:08 - 00:15:38:05
Joey
And the reason why it needs to be a slider, as opposed to just, oh, only two lines, right, is because, hey, if it was two perfect TV lines in the file, we could de-interlace it. You know, we could use the de-interlace effect. Most of these sources have gone through generations of SD to HD, to scaling, to H.264, to time effects, to whatever else.
00:15:38:11 - 00:16:01:01
Joey
So we have that little slider. Now, the last part of this puzzle and this is going to get to our next issue I want to talk about with video sources and documentaries is what we call blanking or edge behavior. Right? When we scale or resample this image vertically, it's going to mess up the top and bottom edges. You're going to get softness.
00:16:01:01 - 00:16:24:13
Joey
There. So you can either scale it up a little bit and crop it, or do a little bit of a clone on the top. Yeah. To adjust that. So in this effect that we're going to put in the show notes and give you to play with, I have little options for how to deal with that. But in general, if you're using this technique of resampling vertically to get rid of baked-in interlacing, be aware of the top and bottom edges in those blanking regions.
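Joey's Fusion macro is the real tool here (see the show notes), but the underlying idea, averaging along the vertical axis while leaving horizontal resolution untouched, can be approximated in a few lines. This is a simplified stand-in for the scale-up/scale-down resample, implemented as a neighbour blend:

```python
def soften_baked_interlacing(rows, strength=0.5):
    """Blend each scanline toward the average of its vertical neighbours,
    approximating the vertical resample described above. Horizontal
    detail is untouched. strength=0.0 is a no-op; strength=1.0 fully
    averages with the neighbours. Note the first and last lines clamp to
    themselves, which is the top/bottom edge behaviour mentioned above."""
    h = len(rows)
    out = []
    for y in range(h):
        above = rows[max(0, y - 1)]
        below = rows[min(h - 1, y + 1)]
        out.append([
            (1 - strength) * c + strength * (a + b) / 2
            for c, a, b in zip(rows[y], above, below)
        ])
    return out
```

At strength 0.5, alternating bright/dark field lines in the interior of the frame converge to their average, which is exactly the gentle vertical smear the slider produces.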
00:16:24:13 - 00:16:44:19
Joey
So now that I have completely ranted and raved like an insane person about obscure interlacing. Well, it's not obscure. I really think this is a major issue that we all, as people in this industry doing finishing work, should take more seriously. I really believe that.
00:16:44:21 - 00:17:09:07
Robbie
Support for this episode comes from Conform.Tools. Conform.Tools allows you to translate timelines between Premiere, Resolve, and other NLEs, while automatically solving common issues that normally need to be fixed by hand. Avoid time-consuming trim and transfer issues, and securely send large media files to collaborators at a fraction of the size, in minutes instead of hours.
00:17:09:09 - 00:17:27:04
Robbie
With a growing toolbox of features, let Conform Tools handle the tedious stuff so you can focus on the creative. Built by post professionals, Conform Tools helps editors, colorists, and conform artists move faster and finish stronger. Learn more at conform.tools.
00:17:27:06 - 00:17:51:05
Joey
That kind of leads me into the next major, major thing to look out for. And again, that's another thing that is often missed when dealing with video archival sources. That is, things in what we refer to as blanking or the edges. Now, we used to call it blanking because in the original video signals those areas were blanked out, as in not visible.
00:17:51:08 - 00:18:00:09
Joey
So, Robbie, why don't you tell us a little bit more about what other issues blanking can give us now that we're finishing archival stuff in modern formats?
00:18:00:11 - 00:18:30:11
Robbie
Oh man, this is giving me a little PTSD about my days stuck in a QC box looking at scopes and analyzing, what do they call that, front porch, and all sorts of analog-type evaluations. But generally speaking, these days when we're talking about blanking, it's referring to dead areas or black parts of the screen, which we're commonly going to find either in the pillarbox areas on the left and right sides of the screen, or the letterbox bars at the top or bottom of the screen, right?
00:18:30:12 - 00:18:56:11
Robbie
Oftentimes, especially with archival that originated on tape and was digitized, you'll see a strip down the side of the frame, maybe 2 to 10 pixels wide, sometimes a little bigger even, that is black, that is not active picture. Right? It was never meant to be active picture. But when it got digitized, it was digitized with that; it wasn't scaled or anything.
00:18:56:11 - 00:19:17:16
Robbie
So it was copied right off the tape. And so that can be annoying visually, but it also can be a QC issue. You'll often get flagged for issues like blanking on the top or bottom of the screen. The other thing that you'll commonly see, too, depending on the source and how it was digitized, is that at the very top of the screen, you might see what looks to be like noise.
00:19:17:16 - 00:19:34:19
Robbie
Right? It looks like little dots going off, or little lines or whatever, and people ask, what is all that weird noise at the top? Chances are it's one of two things, or potentially both. It could be closed caption data: embedded line 21 closed caption data at the top of the screen.
00:19:34:21 - 00:19:52:15
Robbie
It could also be timecode data. It could be VITC timecode embedded at the top of the screen as well. VITC timecode, or vertically integrated timecode... vertical interval timecode, thank you, I'm sorry. Vertical interval timecode, versus, what's the opposite of it? It's, it's, let's see.
00:19:52:17 - 00:19:53:15
Joey
Linear timecode.
00:19:53:21 - 00:20:14:13
Robbie
Thank you. Which you'll see, or hear rather, on an audio output as long as a tape is playing. It's just one method of inserting timecode into a source. It can also be VITC on top, right? So oftentimes you'll see things slightly mis-scaled or misshapen or whatever, and you'll see that blanking, or that busy VITC, or that closed caption at the top.
00:20:14:15 - 00:20:37:22
Robbie
So how do you go about fixing these issues? Right. Well, the first thing is that I'm all about eagle eyes on this. I will oftentimes zoom in the viewer right to just the edge of the screen. And we've talked about this one before, but it's super helpful: in the Resolve viewer options, you can actually tie your viewer zoom to your SDI output zoom.
00:20:38:03 - 00:20:40:19
Robbie
So it will also zoom on your reference monitor.
00:20:40:20 - 00:20:54:03
Joey
Now, another important thing while you're doing that is, in your Resolve preferences, set the option for viewer background to gray. That way you'll see a hard edge where there's any discrepancy.
00:20:54:03 - 00:21:09:19
Robbie
Yeah. So that's my first step, just to eagle-eye this. But even then, it can change shot to shot, or whatever, it's late at night and you're working through it. If you want to go lo-fi about this, you could put a brightly colored solid behind the clips and just move your clips up to video track number two.
00:21:09:21 - 00:21:27:06
Robbie
But our smart audience will go, well, that's not going to work, Robbie. How am I going to see the bright clip behind that black strip? Well, here's the deal. There's two types of blanking that I think you'll find. One is blanking that's actually baked into the clip, which is an actual black bar of pixels. That's blanking type number one.
00:21:27:06 - 00:21:46:20
Robbie
But blanking type number two is you've done a reposition or a resize of your own, and you've introduced blanking. Then you have a set of transparent pixels, and when there's nothing behind it, it's black. But if you put something back there, then you can see that color behind it. So I check for both of those things.
00:21:46:22 - 00:22:04:06
Robbie
And they're super useful to check. Now, in terms of the VITC or the closed caption data, that's usually just a slight scale. And one of the things, especially for docs or shows that are really archival-heavy, is I'll create an input scaling preset so I can just go, hey, you know what?
00:22:04:06 - 00:22:20:07
Robbie
I don't want to have to grab the input or edit sizing every time to resize it. I'll just figure out a good 1% or 2% push and save that as a preset, so anytime, I can just apply that to the archival with one click and it's done.
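For archival-heavy shows, you could even pre-scan sources for dead edges before building that preset. A hypothetical sketch (not a built-in Resolve feature): count the fully black rows and columns surrounding the active picture to see how much scale or crop a clip needs.

```python
def detect_blanking(rows, threshold=16):
    """Measure black blanking around the active picture.
    rows: scanlines of luma values (0-255). Values at or below
    threshold count as 'black'. Returns pixels of blanking on each
    edge as (top, bottom, left, right)."""
    def dark(line):
        return all(v <= threshold for v in line)

    h = len(rows)
    top = 0
    while top < h and dark(rows[top]):
        top += 1
    bottom = 0
    while bottom < h - top and dark(rows[h - 1 - bottom]):
        bottom += 1
    cols = list(zip(*rows))  # transpose so we can scan columns the same way
    w = len(cols)
    left = 0
    while left < w and dark(cols[left]):
        left += 1
    right = 0
    while right < w - left and dark(cols[w - 1 - right]):
        right += 1
    return top, bottom, left, right
```

Note this only catches the baked-in type of blanking Robbie describes; transparent pixels introduced by your own repositions won't read as black in the source file.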
00:22:20:12 - 00:22:40:14
Joey
Yeah. And it's also one of those times where it is good to have. We talked a little bit previously about actively masking your four by three sources. If they're pillar boxed. This is a great time to do that because when you scale it up a tiny bit, you'll also clean up those left and right horizontal edges to be a dead straight line, which in general I think looks better.
00:22:40:19 - 00:22:56:17
Robbie
Yeah, I agree. And one note about the artificial masking is that you have to get a little used to it; there's a couple of different ways you can handle it. But one of the things that can happen with that is, let's say you're doing a zoom on a photo.
00:22:56:19 - 00:23:11:23
Robbie
You don't want to also zoom the masking, right? So you either have to do that through some sort of layer ordering, or a compound clip or whatever, to make sure that you can still do the moves that you potentially want to do without that masking also changing.
00:23:12:01 - 00:23:46:09
Joey
Yeah. And that applies to any kind of sizing, things like, for example, stabilization. Right. If you have a four by three clip that you stabilized, you don't want those edges jiggling around, right? You want them to be locked into a four by three crop. Now, speaking of four by three, there's another thing, to get into my old video nerd history mode again, my favorite thing that I don't think a lot of people have heard of at this point, but it's another thing that I see done wrong often. That is the concept of the pixel aspect ratio.
00:23:46:13 - 00:24:14:17
Joey
We've talked about normal aspect ratios and how you should never change what the actual real aspect ratio is. What's a pixel aspect ratio? Well, you know, we joked about field order being weird for standard definition television. We've talked about the weird history of interlacing and how CRTs worked for television. Well, guess what? CRTs also had this concept of the oval-shaped pixel.
00:24:14:19 - 00:24:43:19
Joey
On all modern displays, our individual pixels are square. Makes sense, right? You draw a square, it'll be the same length top to bottom. Well, for the first 80 years of television, up until HDTV, that was not the case. All of our television signals and every video source had vertically oblong, oval-shaped pixels. This didn't really matter when dealing tape to tape.
00:24:43:22 - 00:25:11:23
Joey
It didn't really matter when broadcasting, because the CRT would display it correctly at all times. But when we brought those signals into a computer, we had to compensate for that, because we're essentially taking an oblong pixel and putting it into a square pixel raster. So with video sources, you often see the resolution 720 by 480 or 720 by 486. And you often see the resolution 640 by 480.
00:25:12:05 - 00:25:43:09
Joey
Obviously, those are horizontally very different numbers. That's because if you were to capture all of the pixel data of a standard definition image, you do get 720 pixels across each line. However, they are stretched vertically, and it's about a 10% scale to bring that to where they will actually look correct on a square pixel display. Now, Resolve does this automatically to 720 by 480 sources that are tagged appropriately.
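The arithmetic is simple enough to sketch. Assuming the commonly quoted 0.9 pixel aspect ratio for NTSC standard definition (the exact value varies slightly depending on which spec you follow), converting a stored raster to its square-pixel display size means scaling one axis by that ratio:

```python
def square_pixel_display_size(width, height, par):
    """Convert a stored raster (e.g. 720x480 NTSC) plus its pixel aspect
    ratio into the size it should occupy on a square-pixel display.
    Scaling the width by the PAR is equivalent, aspect-wise, to the
    roughly 10% vertical stretch described above."""
    return round(width * par), height
```

For example, `square_pixel_display_size(720, 480, 0.9)` gives (648, 480), which in practice ends up treated as the familiar 640 by 480 once the edge blanking is cropped away.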
00:25:43:10 - 00:26:12:10
Joey
Most NLEs do this automatically. But this is one of those cases where, if everybody looks slightly oblong, it might be wrong, and you might need to apply that scaling yourself. It's literally 10%, that's the number, of vertical scaling. And it's funny, I've had this baked into my head since 2010, which was right around the time when political advertisers, who were kind of late to the game because it was expensive, moved from standard definition to high definition.
00:26:12:11 - 00:26:35:12
Joey
And I actually got on the news for this one. We did a political ad, and I brought in a photograph that was going to a standard definition output. So I had to do the opposite: compensate by stretching it slightly vertically, or sorry, squeezing it down a little bit, so it would be the right pixel aspect ratio for our standard definition deliverable.
00:26:35:14 - 00:26:57:07
Joey
Now, I was used to high definition at the time, so we had kind of started going the other way. Well, anyway, the end result was I put out a political ad where a particular candidate was 10% thinner than they should have been, and this particular candidate was known for being a larger individual. So the internet blew up. There were news stories about this.
00:26:57:07 - 00:27:24:02
Joey
Oh, we're digitally manipulating the image to make X candidate look thinner. Nobody could do that by accident. That is obviously an intentional decision; they're being dishonest, and blah, blah. There were local news stories, there were forum posts, there was all kinds of stuff. And it was because I had the pixel aspect ratio checked wrong when I brought that picture into my non-linear editor. It was a long day; I didn't notice.
00:27:24:08 - 00:27:28:06
Robbie
So. So yeah, that's a bad day. I remember I.
00:27:28:08 - 00:27:32:18
Joey
Had to go scaling could be a pixel aspect ratio issue, and I think a lot of people don't realize that.
00:27:32:22 - 00:27:51:13
Robbie
True. And I think that the last time I really seriously thought hard about this, because it was a daily occurrence back then, was in the DV days, the HDCAM days, that kind of stuff. And I just had to look it up on Wikipedia, because I had this number in my head and I couldn't remember if it was correct.
00:27:51:15 - 00:28:14:15
Robbie
I kept thinking about 0.9. Wasn't 0.9 the non-square DV pixel aspect, right? That's the decimal version of it, how it related. Yeah. So it can be a little bit of a weird one. But thankfully, these days most everything is square pixels, so it's less of an issue with acquired footage.
00:28:14:15 - 00:28:36:07
Robbie
But you're right, it still pops up from time to time. Bad pixel aspect ratios can get baked into things. And I think one of the things to do, as you said earlier, is to focus on geometric shapes. Right? Is a circle an oval? Is somebody too thin, or are they too fat, or whatever?
00:28:36:09 - 00:29:01:11
Robbie
But this can also bite you; it bit me recently. I didn't realize that some anamorphic film footage that I was dealing with had been stretched improperly. Its pixel aspect ratio had been calculated incorrectly. So you can still run into these issues even if they don't have anything to do with archival DV or whatever, right? Anamorphic is a great example.
00:29:01:13 - 00:29:02:01
Robbie
How.
00:29:02:03 - 00:29:05:10
Joey
Anamorphic is the film equivalent of a different pixel aspect ratio?
00:29:05:10 - 00:29:24:10
Robbie
Yeah, man. I mean, you know what it is. It's one of those things that, like, it seems like a digital thing, but in this case it was an optical thing. And I didn't do the squeeze originally myself; I was working with a sort of baked-in image of this. And, you know, at that point in time it can be a little difficult, right?
00:29:24:10 - 00:29:43:19
Robbie
Because you don't know, especially if you're working with something that's already been stretched or squeezed or whatever direction you're going in, you kind of have to use your best judgment at that point. You're probably never going to get it mathematically perfect, but that's where looking for those circles, ovals, etc. can always get you in the passable ballpark.
00:29:43:21 - 00:30:07:01
Joey
Yeah, absolutely. That's kind of one of the overarching things I really want to emphasize here: these sources go through generations of different conversions, different processing. Sometimes you got to put your detective hat on and kind of think about what could have done this to this image, and how do I undo it? Yeah. One quick little aside, little piece of history.
00:30:07:02 - 00:30:37:18
Joey
We dodged a bullet with HDTV and pixel aspect ratio. The original proposed HDTV spec was 1920 by 1035, with oblong, oval-shaped pixels, just like standard definition had been. And the biggest advocate against that, and for 1920 by 1080 as the standard, was the legendary, and quite an idol of mine, Mr. Charles Poynton. That was one of his causes in the original development of HDTV.
00:30:37:18 - 00:30:41:02
Joey
So we can thank him for square pixels finally.
00:30:41:05 - 00:30:50:14
Robbie
I mean, it's our go to, you know, all these years later, it seems like a no brainer. Why do you want to have to be doing mental math constantly when you can just go, it's square?
00:30:50:16 - 00:31:14:03
Joey
Well, the argument was we had been doing it for 80 years, 90 years. Support for this episode comes from Flanders Scientific and the XMP270 and XMP310. The accessible, lightweight and versatile monitors helping to bring HDR monitoring on set while also being very well suited to post-production work. Learn more at FlandersScientific.com.
00:31:14:05 - 00:31:32:20
Robbie
All right, so moving right along. One of the things I wanted to chat about, in terms of video as well, is the resolution issue and how we attack the resolution issue. And this is going to parlay into a brief discussion for those of you who are AI-averse. We're going to talk about this in a second.
00:31:32:22 - 00:31:49:21
Robbie
But let's talk about the non, well, I can't say it's completely non-AI, but let's talk about the more manual approaches to dealing with low resolution video sources. And I suppose even film sources too. With a film source, obviously your answer is, hey, we need this to be better, so you could potentially go back and re-scan it.
00:31:49:21 - 00:32:04:12
Robbie
Right. On the video side of things, you're not going to go back and redo the acquisition in any way, because it is what it is. So how do we deal with something, let's say 720 by 480, when we need to put it in a project? Well, we can get creative with how we do it.
00:32:04:14 - 00:32:23:00
Robbie
We talked about this earlier. You know, window it box it, you know, do a background treatment or whatever. But sometimes you do want those things to go to, go full screen. The thing you probably don't want to do, I'm just going to put this out there is just use regular old transform and push in a couple hundred percent into something, right?
00:32:23:02 - 00:32:48:16
Robbie
I generally, and this is not a hard and fast rule, but I generally think about 15 to 20% as kind of my cap for how far I'm willing to push in on things with just traditional scale and basic transform controls. After that, I'm thinking, hey, if I can go further, do I need to start doing some other treatments: noise reduction, sharpening, that kind of stuff?
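(Editor's note: the arithmetic behind that push-in rule of thumb can be sketched quickly. This snippet is illustrative only; the function name is made up and the numbers just show why a full-screen SD blow-up far exceeds a 15 to 20% push-in.)

```python
# How much extra scaling is needed to fill a destination width from a
# source width, expressed as percent beyond 100%.
def scale_up_percent(src_width: int, dst_width: int) -> float:
    """Extra scaling (percent) needed to fill dst_width from src_width."""
    return (dst_width / src_width - 1) * 100

print(round(scale_up_percent(1920, 2304)))  # a 20% push-in on an HD frame
print(round(scale_up_percent(720, 1920)))   # SD full-screen in HD: ~167%
```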
00:32:48:21 - 00:33:14:08
Robbie
But, you know, once you're up in that 50% range, you reach the point of like, nope, no matter what I do, noise reduction, sharpening, whatever, this is probably going to get not so good. It's going to get softer. So enter the world of AI-assisted tools, right? And the first one that I've actually come to love a lot for this, which does a pretty good job, is Super Scale in DaVinci Resolve.
00:33:14:08 - 00:33:41:12
Robbie
Right. So Super Scale applies a mathematical algorithm to basically do some doubling up of pixels, and there's different ways the algorithm can be handled, but it gives you the perception of a sharper, more robust image when blown up. The downside of it is that it's machine intensive to do this math all the time, especially as you go up in resolution and start dealing with more high resolution sources and stuff like that.
00:33:41:16 - 00:33:55:04
Robbie
Have you had pretty good results with Super Scale? I've found it pretty good for a lot of things. But, especially at the settings that focus more on noise reduction, it can get things pretty, pretty soft too.
00:33:55:09 - 00:34:17:23
Joey
Yeah. The thing to remember with Super Scale is that it's marketed as kind of an AI tool, but it's not generative. It is not filling in new pixels that it makes up, which, and we'll talk about this a little bit more in detail, is very important for accuracy in a documentary, if that matters to your project.
00:34:18:01 - 00:34:41:04
Joey
We are not making up new film material with super scale. We are just combining some noise reductions and sharpening algorithms and some other things together. You can also kind of make your own formula with that by adding some noise reduction or some sharpening, or sometimes a little bit of film grain can increase the perceived detail without giving the ringing around edges.
00:34:41:04 - 00:35:01:15
Joey
That hard sharpening can do. Same thing if you do a frequency-separated sharpen, like the Soften and Sharpen tool: you can get those crisp details a little bit sharper without really getting that hard ringing. So it really depends, shot to shot. Sometimes it's Super Scale, sometimes it's Soften and Sharpen, sometimes it's a little bit of regular sharpening.
00:35:01:15 - 00:35:24:03
Joey
Sometimes it's a little bit of film grain. But the other thing that I think gets forgotten about a lot is there are various different scaling algorithms, or interpolation algorithms, available. If you look in the Inspector, you can do sharper, softer, better quality; there's a couple of different options you can have. And for different sources, maybe something that has a lot of really sharp pixels or sharp detail,
00:35:24:03 - 00:35:35:22
Joey
you might want to use the sharper version or the softer version, depending on how it is. So dig into the Inspector, even when you're just using regular scaling, and see what works best for your footage. It's not always just the defaults.
00:35:35:22 - 00:36:02:06
Robbie
One thing I would point out about those scaling algorithms is that that math can actually make a potentially gigantic difference, and it's not just about going up, either. It can sometimes be about going down. I'm sure people have had this problem where they take, say, a UHD drone shot, and then they scale it down to HD, and all of a sudden it's got all of this aliasing and moiré and all that kind of stuff.
00:36:02:08 - 00:36:16:09
Robbie
That's a scaling issue going the other way around, right? So in those situations I often try Lanczos, I think that's how you say it, Lanczos scaling, and that works tremendously well for downscale operations.
00:36:16:11 - 00:36:34:08
Joey
Yeah. Jump into fusion. Fusion has a ton of options for different interpolation, especially if you're going to be doing slow pushes or zooms or moves. If you start seeing twinkles or aliases or stuff like that, dip into fusion. Try doing your scaling there and go through the different algorithms and see what works with the image.
00:36:34:10 - 00:36:54:15
Robbie
Now there are a lot of applications out there now that are claiming to have the secret sauce for this, in terms of getting the best results. And I have to admit, somewhat begrudgingly, that they can do a fairly good job, depending on the source. Right? And probably the most popular one out there these days, it gets a lot of talk,
00:36:54:15 - 00:37:18:08
Robbie
is the tool set from Topaz. And Topaz comes as a standalone application, or it can actually even run as a plugin inside of Resolve. I generally prefer the standalone option versus the plugin, for a couple reasons, but mostly workflow-wise. But it does, with various AI models, do targeted focuses for things like actual upscaling.
00:37:18:13 - 00:37:34:01
Robbie
It can do a really good job with noise reduction and sharpening, but the same general rules apply, where you have to kind of work out a little bit of a recipe. This is not just like, oh, I'm just going to choose this and it's one size fits all. You really kind of have to start separating out:
00:37:34:01 - 00:37:55:19
Robbie
okay, these sources do well with this kind of upscale algorithm, those sources do well with that, and kind of evaluate the result and try to iterate a little bit. I would say the one other thing I would put out there about using an AI tool like this is you definitely have to factor in the processing time that's going to be involved in doing these sources.
00:37:55:19 - 00:38:17:15
Robbie
Now, it's one thing if you're dealing with processing the clips that are on your timeline; that's a relatively straightforward thing because you're talking, you know, seconds or minutes, not hours. But if you're trying to do this up-convert on sources before you get them in, yeah, that's where you're going to have to really budget for some time, because some of these things can be really, really machine intensive.
00:38:17:17 - 00:38:44:21
Joey
Now, I'm going to be really dogmatic here and say that it's very important to remember that there is a demarcation line when you get into a generative tool like Topaz, where essentially what it's doing is looking at your image and then making its best guess, from all of the photos it has been trained on, of what pieces and pixels it can steal to fill in details that don't exist in your image.
00:38:45:03 - 00:39:14:00
Joey
So if historical accuracy is important in your project, not only do you need to be aware of this, your client might not be aware of this. If you use tools that are generative AI to fill in texture or details, yes, it might look very convincing. You are now in my opinion, and I would say factually, you are destroying the authenticity of that image for a bump in visual quality.
00:39:14:06 - 00:39:39:17
Joey
It's essentially, you're faking it. Those are details and information that were never captured. And that could be as subtle as blemishes in people's skin, as subtle as someone's hair, as subtle as the way they move, right? It might look super convincing, but I'm sorry, it's not real. And for real historical documentaries, I don't think it's appropriate.
00:39:39:19 - 00:39:58:12
Robbie
And I think it can be very easy to have, like, a gut reaction to it, like, oh my God, that looks so much better. But then once you start pixel peeping that a little bit and really doing some analysis on it, you're like, why has that guy's hair gone to a geometric square now, right?
00:39:58:14 - 00:39:59:19
Robbie
Or why is it seven.
00:39:59:19 - 00:40:00:15
Joey
Fingers.
00:40:00:17 - 00:40:26:11
Robbie
Right. Or, I've never experienced that with Topaz, but more like, you know, things taking on sort of a plastic sheen when the noise reduction is overzealous. You can't just assume that the algorithms these tools are using are always doing supportive or good things. That's what I'm saying: you have to be iterative about this, and generally speaking, I tend to take the less-is-more approach with these tools.
00:40:26:11 - 00:40:46:13
Robbie
Right. Like, okay, can I get to a good baseline with this tool, but then maybe use more sophisticated noise reduction in resolve or do other techniques and combine that, like I don't need Topaz or any of these AI tools to necessarily solve every problem with the clip. I'm just looking at the things that it does really, really well.
00:40:46:15 - 00:41:05:20
Robbie
Okay, you're really great at, you know, tripling or quadrupling pixels, but you're, you know, I don't like your noise reduction, so fine, just separate those two tasks, right? That's totally that's totally fine, I think. But you're only going to get to that once you experiment a little bit. And the last place you want to be with this kind of stuff, by the way, is just kind of winging it on deadline.
00:41:05:22 - 00:41:19:07
Robbie
I would really, really suggest that if you're going to use something like a Topaz that you get familiar with the controls, you understand recipes that generally work or don't work, rather than going, oh crap, I now need to process 50 clips and I have an hour to do it kind of thing.
00:41:19:10 - 00:41:50:01
Joey
Yeah, and I don't mean to insult the Topazes and that category of products. They do a great job. In fact, Topaz specifically, I'd say, does a very impressive job of not doing sloppy-ish artifacting, as in the seven fingers or things like that. They've tuned their models to be very, very good looking, and they've also built their models from, well, what we'll say are authorized sources.
00:41:50:01 - 00:42:11:05
Joey
They didn't just scrape the internet for copyrighted work, so there's not going to be licensing problems, things like that. But when it comes to actual historical footage, I think it's very important to draw a line here and say: if you are presenting this as historical photography, you just can't use generative algorithms to fill in the details.
00:42:11:05 - 00:42:19:00
Joey
It's basically, you know, in Jurassic Park, they put frog DNA in there to make the dinosaurs work. And we saw how that worked out. Right? Yeah.
00:42:19:00 - 00:42:44:05
Robbie
And that's why I mentioned much earlier the idea of chain of custody. A good archival producer, and that's a whole other subject we don't have to dive into right now, but a good archival producer will understand that chain of custody to a certain degree and understand, oh, there are much better versions of this, and this is what it looks like.
00:42:44:09 - 00:42:59:03
Robbie
We just can't afford it for this project, so let's use that as a reference point in the technical work that we're going to try to do. Because you're right. Like, take that Challenger documentary I was talking about earlier. Obviously they went back to the original film and re-scanned it, but let's just say they couldn't.
00:42:59:03 - 00:43:30:16
Robbie
Right. That's a case in point where too much cleanup is altering the original content to a certain degree. Right? Like, the space shuttle was never that white, or whatever the case may be. Right? Like, yes, there were these lines you could see in the heat tiles on the bottom, and with all the noise reduction you've gotten rid of those lines. Whatever it may be, I think there can be aspects of this where something that seems good on the surface goes a little too far.
00:43:30:18 - 00:43:50:17
Joey
Yeah. And as long as you understand that generative tools are making up data that wasn't captured, you can use that in your decision making process. And you have to talk to the client, like, look, if it's a B-roll shot between recreations that's made to look historical for the story, fine, Topaz that all you want. If it's a president making a speech,
00:43:50:20 - 00:44:11:08
Joey
no, that's not historically appropriate to use an AI upscaler on. You know, it depends on the context in the film, and only you and your client and the producers can really be the judge of that. But it's important to understand that when you get into generative AI, you are removing the authenticity of the image. That's unquestioned.
00:44:11:10 - 00:44:37:09
Robbie
Yeah, I agree, I agree. And I mean, I think there are obviously some creative slash authenticity issues that exist there. But I think the best work that I've seen done in this regard tries to respect it with moderate improvement. Right? So the idea that you're going to get to a perfectly clean, perfect image, that should probably never be the goal with most video archival, right?
00:44:37:11 - 00:45:01:21
Robbie
The idea that you can improve and enhance tastefully, that's really more what I think the goal should be. And the generative stuff, I want to be clear: I think from a purely technical point, you're correct about it creating pixels that are not there. To me, it feels a little different than, hey, you know, chatbot, make this cool image for me, right?
00:45:01:21 - 00:45:22:17
Robbie
It's not necessarily generative in the sense that, you know, I'm not saying, hey, make a person and put him next to this other guy, right? That's clearly generative in this regard. I get what you're saying, but I look at those algorithms more as enhancement, you know, resolution or noise reduction enhancement. Yes, technically,
00:45:22:17 - 00:45:29:19
Robbie
are they making new pixels? Yeah, I agree with that. But it's not exactly the same as putting a different person in the shot.
00:45:29:21 - 00:45:52:15
Joey
Yeah. That's why, like, I'm drawing a hard line in the sand here, and that's kind of where I stand on it. Yeah. But like I said, it depends on the context in the film and what the goals of your producer and your client are. It's just important to understand the difference in the technology between something like a Topaz versus something like a Super Scale or a regular noise reduction.
00:45:52:17 - 00:46:13:23
Robbie
Yeah. And I think, if authenticity to the image is the most important thing, then having a plan about how to handle this stuff is more important. Right? I remember years ago I did a film about the punk rock and hardcore scene here in DC, and the filmmaker was like, I don't even want to color this.
00:46:13:23 - 00:46:34:06
Robbie
I just literally want, like, can you just clean up the edges? So every time they had a shot from, you know, 1982 of Bad Brains or whatever, right, it was a four by three image in the middle of the frame. But he came up with some other creative, artistic ways to make that seem less boring, because he didn't want to scale it
00:46:34:10 - 00:46:53:07
Robbie
or noise reduce it. He wanted it to be as raw as possible. And so you had to consider that stuff as well. All right, man, good stuff. I think over these past two episodes we've covered a lot, a 50,000-foot view of this stuff. Obviously, there are hundreds if not thousands of things we could cover in detail about each one of these topics.
00:46:53:07 - 00:47:11:23
Robbie
But the idea here is that focusing on the challenges and the big-picture ways to fix them is going to leave you plenty of opportunities to not just settle for, oh, this is archival and we're just going to insert it in. Right? You can improve, you can get better, you can identify what's good and what's bad.
00:47:11:23 - 00:47:27:21
Robbie
Quick bit of housekeeping again: if you wouldn't mind, we still have our audience survey open. That's right here on the link on screen. If you have 5 or 10 minutes to answer our audience survey, that would be really helpful. We're using this feedback to help sort of guide the podcast in 2026.
00:47:27:23 - 00:47:45:16
Robbie
So we appreciate anything that you can do there. As a reminder, you can head over to offsetpodcast.com, to find our complete library. But that's also where we have show notes, including some of the things that we're going to include here on this episode, the DCTLs, and some of the fusion stuff that Joey mentioned.
00:47:45:18 - 00:48:05:14
Robbie
Over the course of these two episodes, we'll link to that over on offsetpodcast.com. If you're listening to us on YouTube or various audio podcast platforms, do us a favor and give us a like and subscribe wherever you find the show. And then lastly, if you do have a few minutes, head over to this link right here where you can buy us a cup of virtual coffee.
00:48:05:15 - 00:48:25:14
Robbie
Your support of the show means the world to us, and every dollar donated on Buy Me a Coffee goes right to supporting the show, helping us pay our editor and all that kind of jazz. So we really appreciate the support there as well. Joey, fun two episodes. Hopefully a lot of people got a couple nuggets out of this.
00:48:25:15 - 00:48:42:06
Robbie
There's a lot to talk about, but it's always good talking about how to handle this kind of stuff, because honestly, if you do any sort of long-form doc work or that kind of thing, this is always going to be something that pops up, and you'll want to know how to best handle it. So for The Offset Podcast, I'm Robbie Carman.
00:48:42:08 - 00:48:44:00
Joey
And I'm Joey D'Anna. Thanks for listening.
Robbie Carman
Robbie is the managing colorist and CEO of DC Color. A guitar aficionado who’s never met a piece of gear he didn’t like.
Joey D'Anna
Joey is lead colorist and CTO of DC Color. When he’s not in the color suite you’ll usually find him with a wrench in hand working on one of his classic cars or bikes
Stella Yrigoyen
Stella Yrigoyen is an Austin, TX-based video editor specializing in documentary filmmaking. With a B.S. in Radio-Television-Film from UT Austin and over 7 years of editing experience, Stella possesses an in-depth understanding of the post-production pipeline. In the past year, she worked on Austin PBS series like 'Taco Mafia' and 'Chasing the Tide,' served as a Production Assistant on 'Austin City Limits,' and contributed to various post-production roles on other creatively and technically demanding projects.