Single-frame decoding continued

This page is a continuation from Algorithmic approaches - thoughts

Andrew Steer - 19 April 2008


**More improvements!**

I'm starting a new page as it makes for easier following (and the Wiki editor seems to get flaky once the page gets too long).

Last night I received the DVD from James with a 34 second clip (857 frames) as a 3.5GB HD1080 UYVY file. Today I wrote a simple module to decode single-frames from the UYVY file.
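For anyone curious, pulling a single frame out of a raw UYVY file is just a matter of seeking to a fixed offset. A minimal sketch (the 1920x1080 UYVY geometry at 2 bytes per pixel is from the text; the function name and error handling are illustrative, not the actual module):

```c
#include <stdio.h>
#include <stdlib.h>
#include <sys/types.h>

#define WIDTH  1920
#define HEIGHT 1080
#define FRAME_BYTES ((size_t)WIDTH * HEIGHT * 2)  /* UYVY: 2 bytes per pixel */

/* Read frame frame_no (0-based) from a raw UYVY file.
   Returns a malloc'd buffer the caller frees, or NULL on error. */
unsigned char *read_frame(const char *path, int frame_no)
{
    FILE *f = fopen(path, "rb");
    if (!f)
        return NULL;
    unsigned char *buf = malloc(FRAME_BYTES);
    /* 64-bit seek needed: 857 frames of ~4MB each make a ~3.5GB file */
    if (buf == NULL ||
        fseeko(f, (off_t)frame_no * (off_t)FRAME_BYTES, SEEK_SET) != 0 ||
        fread(buf, 1, FRAME_BYTES, f) != FRAME_BYTES) {
        free(buf);
        fclose(f);
        return NULL;
    }
    fclose(f);
    return buf;
}
```

As a sanity-check, 857 frames at 4,147,200 bytes each is about 3.55GB, which tallies with the 3.5GB file size quoted.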

I made some tiny changes to my colour-recovery program as very early steps towards improvements and speedups, and suddenly had an unexpected improvement in image-quality! The horizontal colour-banding is much reduced and the U and V amplitudes became less unbalanced. The strange thing is that I don't quite understand why the improvement occurred - I suspect it's to do with the way the C/C++ compiler manages different data-types within an expression, and that by splitting the expression into two parts I've helped things. I've also realised a possible reason that might contribute to my U/V gain-offset (but Andrew Browne should not be affected).

Latest 1:1 pixel image (of same original film-scan), now showing less colour-banding. I've also adjusted the U/V ratio fixup to compensate for algorithmic improvements.

Another impression of the whole-image, to show progress - again from the same source file as I used on the previous page. Oops - I squashed the vertical height a bit too much. Should have been 360 pixels, but I typed 320. Never mind!


**Who'd believe it?**

[I have a slight suspicion that the chromadots are slightly "cleaner" in the UYVY file than in the .BMP files James supplied before.]

Here's frame 750 from the UYVY clip: Who would believe that this came off a piece of black-and-white 16mm film, tele-recorded 30-odd years ago? But it did.

Remarkably the -U / +V trick still seems to work quite well. I have no idea what the original colours looked like though!

(edit to add frame from VT - J.Wood 21/4/08)

After a bit more tidying up (and speeding up) of my program code, maybe I can think about doing the U and V polarity properly.

Maybe not relevant (yet) for this project, but I also have some ideas on how to get some of the PAL-decoding performance benefits of the "Transform Decoder" (by the BBC's James Easterbrook) without doing transforms...

Bedtime!

Andrew Steer - 20 April 2008


**Analysis of what my Colour Recovery software-program is actually doing**

The above plot shows the decoding behaviour of my present program on a zoneplate (still fixed -U,+V decoding).


 * The U-blobs (greeny-yellow) appear slightly larger than the V-blobs (orange-red) owing to the slight extra gain I'm still giving U at the moment.
 * The magenta stripe through the orange-red region is a bit surprising, and indicates that something is probably slightly wrong. I'm not going to worry about it now, but keep an eye on it - hope it will sort itself out as I tidy up the program.

I could probably afford to reduce the vertical bandwidth of my filter (if it proved useful). Care needs to be taken in doing this as PAL is not inherently bandwidth-limited vertically, and using a narrow filter to decode it can cause "cross-luma" - "hanging dots" on horizontal-boundaries of colour. This is a big issue on modern computer-generated graphics or captions, but less of a worry for vintage material.

The plot also reminds me that at the moment my colour-recovery program finds the "maximum" phase of U and V //independently of each other//. In reality of course they are always in quadrature (90 degrees apart). This extra constraint ought to be added to my program; it would probably reduce chroma noise, and would be a useful step towards proper phase-locking.

This image shows the effect of the colour-decoding on some white bars (aligned at angles close to the U and V axes). Shown at 50% of original size. The source-image was band-limited. In a proper phase-sensitive PAL decoder you would get both +ve and -ve U and V colours on the edges. They would typically alternate polarity as you go up and down the image (and across frames). With my present program which forces -U,+V, all the cross-colour (false colour) goes the same way. Similarly all chroma-noise is biased the same way.

As an aside: With the BBC's "Transform Decoder" (or an idea I have for future use) you wouldn't get the cross-colour on luminance edges at all. Well-known 2D PAL decoders reduce cross-colour on vertical edges, but don't help with edges aligned at angles close to the 2D U or V axes.


**Details of the earlier coding error**

I found what the earlier problem was with my code; it's an //extremely subtle// error.

```c
// for U-phase ie \
m1[i]=(b[i]+b7[i-2]+b5[i-1]+b3[i-1]+b1[i]+b2[i+1]+b4[i+1]+b6[i+2]/2)*sine[i]/7.5;
n1[i]=(b[i]+b7[i-2]+b5[i-1]+b3[i-1]+b1[i]+b2[i+1]+b4[i+1]+b6[i+2]/2)*cosine[i]/7.5;
```

It would take a real C-geek to realise why this does not perform as expected.

[Hint: the compiler gives a warning about a continuation character in the comment...]

This was causing my U-phase signal to decode incorrectly, with the extent of the error depending on the absolute phase of the signal in a horizontal sense. The effect was banding over every couple of lines (seen in the closeups during last week) and an overall loss of U saturation. Correcting this code has improved things considerably, although U appears to remain slightly weaker than V for as-yet unknown reasons.
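For the non-C-geeks, here is a minimal self-contained illustration of the trap (hypothetical variables): a backslash at the end of a `//` comment splices the following source line into the comment, so the first assignment is silently never compiled.

```c
/* Demonstrates the line-continuation-in-comment trap described above. */
void continuation_trap(int *m_out, int *n_out)
{
    int m = 0, n = 0;
    // this comment ends with a backslash \
    m = 1;   /* swallowed: spliced into the comment line above! */
    n = 1;
    *m_out = m;   /* m is still 0, not 1 */
    *n_out = n;
}
```

GCC does warn about this ("multi-line comment") - but only if warnings are enabled, which is exactly the hint mentioned above.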

One other thought (not something that's got any urgency now): but knowing that the PAL signal should be a sinewave in amplitude, in principle it ought to be possible to calibrate overall gamma distortions by studying the detailed luminance gradations of the subcarrier. Getting the gamma right would help to avoid harmonics appearing post-filtering. Although it wouldn't be hard to filter them out anyway.

That's all (for today) folks!

Andrew Steer - 21 April 2008


**On U and V polarities, and towards pulling out the other colours**

A diagram showing UV colour space can be found on the Wikipedia article at http://en.wikipedia.org/wiki/YUV

Graphic showing U/V colour axis (from Wikipedia)

The -U +V quadrant is the most "useful" containing red-orange-yellow to olive-green, and red with a magenta tinge. Includes skin-tones. The -U -V quadrant is all greens, a bit "olive-green" close to the V=0 axis, and slightly turquoisy near the U=0 axis. The +U -V quadrant is blues, mostly sky-blue; a bit turquoisy near U=0, and erring slightly purply near V=0. The +U +V quadrant is magenta becoming progressively more purply-blue as you move downwards towards the V=0 axis.

This shows the test-image rendered with all four permutations of fixed U and V polarity.

Oddly, the blue shirt, blue denim, and blue waistcoats haven't really come through very well on any of the renditions. This might be just because the colours are too dark and the subcarrier has got lost, or it may be that we're getting too much U-V cross-coupling because I need a more selective vertical filter and/or to force U and V into quadrature. I'll have to look closely at the source.

On inspection of the source, the chroma patterning is //very// weak on the denim. Some more powerful vertical filtering (possibly combined with inter-frame techniques) for better phase-locking would probably help - but the fact that the subcarrier is coded on top of such low luminance may point to something of a fundamental problem... After all, the subcarrier would have gone "below black" in the original PAL signal :-(

I tried a bit of "gamma correction" to pull a bit more detail out of the dark colours, but it didn't help very much. You can see the coding is pretty noisy on the original scan.

No real evidence this is the cause, but dark low-contrast detail is also something that might tend to get lost/corrupted in any DCT-based (HDCAM) compression.

Bear in mind that we still don't fully understand the detail of the line-by-line chroma structure we're seeing - and my decoding effectively decodes as if there were chroma detail from one field only. A better understanding of what's going on (and maybe a scan with a higher vertical resolution) might help pull more chroma data out of the noise. That said, I suspect dark blues are always likely to be tricky with this kind of colour-recovery approach.

In the shorter term, proper phase-locking may help, but it's still going to be a tall order!

This is an experiment - the basic -U/+V image, but with key denim blue areas manually pasted in from +U/-V image. The blue frame doesn't look much, but when set against the red/yellow, it might be passable. Do bear in mind this is cheating somewhat because the random noise will have gained a blue bias in the "blue" corner anyway. It might mean there's some hope though. I've also tried applying a bit of gamma-correction to brighten the image overall.

I guess in processing a movie sequence it would theoretically be possible to use data from adjacent frames in a "3D" filter to pull more signal out of the noise. As a matter of principle I'd prefer to treat the frames in isolation as far as possible though.

Goodnight.

Andrew Steer - 24 April 2008


**Thoughts on the detail of the chroma and line-structure**

Within a single //field//, the chroma phase (for arbitrary U and V magnitudes) has a complete periodicity of 4 lines. This is equivalent to 15 lines in the HD1080 scan. Chroma sampling with the 1080 line structure will therefore result in a beat frequency of 15 lines. Now the interlaced field similarly has a periodicity of 4 lines (within its field), but is 180 degrees offset. This will create a second beat structure offset by around 7.5 lines from that resulting from the first field. Result: a beat structure with 7.5-line periodicity for arbitrary U and V magnitudes. For U-only or V-only colours, different symmetries result in these beats dropping out (we see these "cleaner" colour structures on some of the frames from the UYVY file).
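The arithmetic above can be sanity-checked directly (numbers from the text: 576 active PAL lines scanned into 1080 HD lines):

```c
/* 4 field lines = 8 frame lines; scaled by 1080/576 = 1.875 HD lines per
   PAL frame line, a single-field chroma period spans 15 HD lines, and the
   two-field beat structure lands at half that: 7.5 HD lines. */
double hd_lines_per_chroma_period(void)
{
    double scale = 1080.0 / 576.0;   /* 1.875 HD lines per PAL frame line */
    return 4 * 2 * scale;            /* 8 * 1.875 = 15 HD lines */
}
```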

I had previously worried that the 180 degree chroma-phase-offset of the interlaced field ought to have the net effect of cancelling the chroma if two fields are integrated equally. The fact that my method works at all is proof that some element of this assumption is wrong.

Playing around in Excel, it soon becomes apparent that although the fields partially cancel, analysed vertically the phases are not 180 degrees out of phase, but 180**+45** degrees, because of the physical displacement of the interleave (maybe 180-45 degrees; I've not been totally rigorous). The sum of two sinewaves with a 180+45 degree offset is another sinewave still with significant amplitude (actually 1/sqrt(2) times the original amplitude). So the fact that the colour from interlaced fields //can// be decoded crudely by treating the combined image as if it were a single field should not be surprising.

This method though, as previously suggested, will worsen the chroma noise - and markedly worsen cross-colour when compared to a smarter approach. Because the vertically-filtered (i.e. vertically-smoothed/averaged) chroma-signal will have phase errors with respect to any individual scanline, the 2D filter (while useful for recovering chroma) would not recover the luminance cleanly. At the moment I'm only using a simple 1D filter to recover luminance - and this loses more sharpness than an ideal 2D filter (and/or Transform decoding) might.

Although a telecine with higher vertical resolution (e.g. around 1500 lines) would be nice, I am clarifying my ideas for some clever signal-processing tricks we can do with what we've got... for reducing cross-colour (and perhaps for recovering more of the line structure).

The apparent darkening of some post-processed high-chroma-strength lines which I observed on the previous page is probably due to systematic gamma-errors in the telerecording and telecine process. This is plausible considering that the post-processed images have come out too dark overall.

The interlaced line-structure in the low-chroma areas cannot be explained by the previous argument. Among photographers, it is known that film can be pre-sensitised by one exposure, making it more sensitive to a future exposure. I wonder whether this kind of effect might explain the visibility of line-structure in low-chroma areas - where the first field (somewhat blurred by the spot-wobble) presensitises the film in advance of the second field, making the latter's structure more dominant? Would any film-expert like to comment? In coloured areas, where the chroma from adjacent scanlines is mis-aligned, this effect may reasonably be expected to be less significant.

Nothing especially new or clever, but here's a couple more recent images, after processing with tighter bandwidth filters to improve sharpness and reduce cross-colour. On the Blodwyn image I've applied a 1.3 gamma correction before processing. The Jimmy image has a 1.15 gamma correction.



Both images still with forced -U / +V. Other colours to follow ... all in good time!

That's it for now.

Andrew Steer - 25 April 2008


**Small tweaks**

The mission is now to work towards a video-demo (albeit only with -U/+V colours for now). I've implemented the gamma-adjustment in my software, since I can't be messing with PaintShopPro manually for 800+ frames! I've also re-remembered, of course, that digital video has luminance between digital counts of 16 (nominal black) and 235 (nominal white), and played around with the gamma and colour saturation etc. so that when I expand the final image from 16-235 to 0-255 for PC display, everything is sweet.
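For reference, the 16-235 to 0-255 expansion itself is a one-liner; a sketch (the clamping of out-of-range excursions is my own choice here, purely illustrative):

```c
/* Expand studio-range luma (16 = nominal black, 235 = nominal white)
   to full-range 0-255 for PC display, clamping any excursions. */
unsigned char expand_luma(unsigned char y)
{
    int v = ((int)y - 16) * 255 / (235 - 16);
    if (v < 0)   v = 0;
    if (v > 255) v = 255;
    return (unsigned char)v;
}
```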

I had been bothered that while my improved colour filter (yesterday) appeared to have cleaned things up in general, it had also reintroduced some alternate-line hue variations... and today found a little typo in my code which was responsible, and have fixed it.

Next steps (tomorrow) are:

 * better modularise the code so I can open frames programmatically
 * introduce new frame-buffers so I can store the decoded image natively in UYVY, rather than purely in RGB
 * write a routine to save (append) the frame back to a very big file
 * write a simple loop to step through 857 frames, opening/decoding/saving

It's all fairly straightforward.
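A skeleton of that batch loop might look like the following (file handling and the decode_frame() placeholder are illustrative, not the actual code):

```c
#include <stdio.h>
#include <stdlib.h>

#define FRAME_BYTES ((size_t)1920 * 1080 * 2)   /* HD1080 UYVY frame */

/* Placeholder for the single-frame colour-recovery pass. */
static void decode_frame(unsigned char *frame)
{
    (void)frame;
}

/* Step through up to max_frames frames: read, decode, append to out_path.
   Returns the number of frames processed, or -1 on error. */
int batch_process(const char *in_path, const char *out_path, int max_frames)
{
    FILE *in = fopen(in_path, "rb");
    FILE *out = fopen(out_path, "ab");      /* append decoded frames */
    unsigned char *frame = malloc(FRAME_BYTES);
    int done = 0;
    if (!in || !out || !frame) {
        if (in)  fclose(in);
        if (out) fclose(out);
        free(frame);
        return -1;
    }
    while (done < max_frames &&
           fread(frame, 1, FRAME_BYTES, in) == FRAME_BYTES) {
        decode_frame(frame);
        fwrite(frame, 1, FRAME_BYTES, out);
        done++;
    }
    free(frame);
    fclose(in);
    fclose(out);
    return done;
}
```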

For reference: latest chroma-filter characteristics.

Andrew Steer - 26 April 2008


**Processing 34-second movie-sequence**

Software prepared for batch-processing of frames into a 1080HD UYVY format. Various tests performed.

Bulk-processing of frames 5-200 began at 17:47. It is currently taking approx. 30 seconds per frame. I haven't yet made any significant speedups to the program!

Having watched the first few frames go by, there is evidence that some frames are displaced by a line or so. This is not too surprising considering the fast pull-down film feed... but it will add some extra complexity to methods to recover the U and V phases to yield the full colour gamut...

Time to go and prepare dinner. While colourising, my computer is too sluggish to use for anything much else anyway!

22:23 - got to frame 370

Andrew Steer - 27 April 2008

Update: apparently the computer finished at 04:18 this morning!

On a newer PC the program runs in two-thirds of the time, despite a slightly slower processor clock-speed. Early indications show that on a dual-core processor you can run two copies of my program simultaneously with near-full performance of both instances.

A colour-recovered UYVY file has been supplied to James. Assuming it all reads back okay, Jonathan will get to see it later in the week. I look forward to your feedback.

Agreed, the next technical step is phase-sensitive decoding - i.e. full-gamut. Bring out those blues, purples, bright greens, etc.!

Andrew Steer - 28 April 2008


**Wow ... amazing ... awesome ... incredible!**

James and the good folks at BBC R&D Kingswood Warren have just returned to me a standard-definition DVD version of my colour-recovered clip. I'm really pleased with it! I've played it on my computer. I've played it on a domestic (CRT) television and DVD player. I've watched all 34 seconds over and over :-)

As an engineer and a perfectionist, there's always room for improvement. Apart from getting the rest of the colours back, the only real issue seems to be the frame-to-frame fluctuation in the saturation of the orangey-reds (the green is rock-steady), and a lesser weak pink/green fluctuation on the skin-tones. This is no worse than many VHS video-recordings I've seen - and the sharpness and other aspects of the picture are vastly superior to that.

I have one idea for how we might improve this saturation-fluctuation fundamentally: tweak the vertical chrominance filter. I suspect that with the present implementation there's some U/V mixing which (thanks partly to the limited vertical resolution of the HD scan) depends on the stage of the PAL frame-sequencing. It may also sort itself out with phase-sensitive decoding, or just with forced //quadrature// decoding. The fudge would be just to use some inter-frame chroma-smoothing.

Overall, //extremely// encouraging though.

I saw some odd white flecks in Jimmy's clothing and wondered what was going on ... then realised they were chroma (or luma?) keying artifacts on the original programme material!

If we just ironed out the overall vertical jitter, you'd never guess it had come off film - let alone //black and white// film! Of course the recovered colours are electronic (TV) colours rather than "film" colours too.

Shall sleep well.

Andrew Steer - 29 April 2008


**Competition!**

Looks like I have some competition from Richard Russell!

Some of my earlier comments on "forced quadrature" decoding are probably a bit misleading. Given the resampling in the vertical direction, the U and V will //not// be in quadrature for arbitrary HD scanlines. What I probably mean is more of a low-pass (substantially sub-1MHz bandwidth) "flywheel" on the U and V phasing - and then phase-sensitive decoding with respect to that reference.

I'm very confident that we can use the PAL information to **perfectly** measure and correct for film weave and jitter... in //most cases//. In the absence of suitable chroma (reasonably strong mixed U/V colours) we might have to fall back to a less precise luminance-based approach (which is also susceptible to interference from camera pans). Correcting for weave and jitter would cosmetically help the material look more "video" and less "film"... but it would also be the gateway to inter-frame awareness or adaptation.

I can think of lots of steps towards doing a really nice technically perfect job towards getting the full gamut back and curing many other ills in the process. What would be good though is a simple hack... a shortcut.

//Some// of what-should-be-blue is markedly out-of-gamut when decoded in -U/+V. This depends though on the colour, and the luminance, and of course the noise - which is poor for dark colours. It would probably be possible to use this information to //assist// phase-determinations, but whether it is robust enough to use without much manual intervention remains to be seen.

Simple hack to highlight out-of-gamut (-ve blue):

In this image I've simply highlighted the colours which decode to negative-blue components when interpreted as -U/+V.
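The check itself is simple; a sketch (using the standard PAL weighting B - Y = U/0.492, i.e. B ≈ Y + 2.03·U; the normalisation and function name are illustrative):

```c
#include <stdbool.h>

/* Flag a pixel whose decoded colour implies a negative blue component,
   i.e. the -U/+V assumption has pushed it out of gamut.
   y is normalised luminance 0..1; u is the decoded (signed) U value. */
bool negative_blue(double y, double u)
{
    return y + 2.03 * u < 0.0;   /* B = Y + 2.03 * U goes below zero */
}
```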

This shows marginal detection on the denim, but a strong reading on the purples in the mirrors at the extreme right/top of the image.

The highlighting doesn't tell you which quadrant the colours should be, but does give an indication of where the -U/+V assumption looks particularly poor. A more careful measure of gamut may help inform an auto-phase-tracking algorithm for the U and V carriers - although I would prefer a more fundamental approach if possible. Where the existing red/orange butts up to the blue jeans it should be possible to track the phase-changes anyway - provided the signal-to-noise is good enough.

Question for Jonathan and/or James. The file you supplied from the VT begins with colour-bars (I need to write a small program to open the file properly so I can browse the rest of it). **Do the film telerecordings include these initial colour-bars?** If they do, those colour-bars would form a very useful calibration screen. Otherwise we will have to build up a calibration from the real imagery.

For Richard's benefit, here's a closeup 1:1 crop of the Jimmy image processed as per my movie demo (native 16...235 luminance):



Andrew Steer - 30 April 2008

Thinking about the colours which "flicker" in saturation in the movie. They are mostly the orangey colours which are mixed U and V, with the V component markedly stronger than the U (in terms of subcarrier amplitude, at least). I had suspected, and proved today (details another time) that some of the V is cross-coupling into the U filter. A better vertical filter characteristic may help slightly. I wonder whether some ratios of U and V lead to particularly challenging beats with the vertical HD scanning...? On the other hand, it is curious that the whole area of colour seems to flash (in saturation) in time with itself, rather than vertical bands which flash in anti-phase. So maybe it's not related to the telecine sampling?

Is there any chance that the source VT has been through two cascaded PAL codings? I'll have to check whether there's any instability on the VT reference. Is it possible that the non-phase-aware decoding emphasises latent coding errors? I tried to check whether the instability is at all periodic with the 4-frame PAL sequence, but so far the results are inconclusive. If all else fails, it could be brute-force smoothed across frames in post-processing anyway.

Also looking at ideas for calibrating the absolute phase, in order to recover the full gamut. It's not going to be easy! Unless we can pull out a few frames with very large area saturated colour, I'm probably going to need to develop rather better chroma/luma separating filters. At present, sufficient diagonal (U/V aligned) luma transitions still "bang" the filter enough to cause massive local phase-errors!

Andrew Steer - 1st May 2008


**Phase analysis**

No new work today - other things cropped up. Instead you get a picture of things I was visualising yesterday...

Image showing the filtered V carrier, from a Jimmy image. The original luminance is overlaid faintly in blue for reference. Easy on the solid orange block, but it'll take some more work to establish the colour phasing from the whole area of the picture from a mix of typical image-content... note also how the filter is "banged" by luminance edges. :-)

Image showing U-filter (1:2 scale).

In general, there will be gaps and glitches to smooth over. If the opposite-phase colours crop up then an antiphase signal will appear. Owing to cross-colour, artificial/false patterns also appear. Some pairs of frames align well enough to subtract out the luminance (assuming no movement), but often the film-judder is an issue. I have more ideas though. Watch this (Wiki)space!

Andrew Steer - 3 May 2008


**Anti-Cross-Colour** (WAS-ACC)

Just a brief experiment to test an idea...



Yes. It's a crude implementation, but I've verified that my previously-alluded-to anti-cross-colour idea basically works. With a bit of refinement, I'm hoping I can get similar results to the Transform Decoder ... without infringing the patent ;-)

Jimmy, with my new anti-cross-colour (ACC) beta. Compare with the ordinarily filtered image earlier on this page. :-)

Crop from 768x576 rescaled Jimmy, with ACC. It's still quite a crude version of my intended algorithm, but quite effective nonetheless. I can't really optimise it further until I've made the implementation more rigorous. This is still processed as a stand-alone frame.

To clarify: at the moment the luma is still decoded with a simple 1D 4.43MHz notch filter. The chroma is being decoded with a 2D filter, and the anti-cross-colour is only in the chroma path. In principle, with a proper PAL signal, symmetrical paths could be used for both - but this is not realistic for the film colour-recovery (at least not at the present scanning resolution), where the chroma data from the two fields is beating and aliased.

Normal version for comparison: As previous image, but without anti-cross-colour processing. The image processing should be identical in all other respects (they're both nominally the same c. 1.5MHz bandwidth colour filters).

That's an interesting diversion, and one I'd been meaning to try for a very long time.

Then again, I still think it's a long stretch to get to phase-calibration (full-gamut) from arbitrary images. There's got to be an easier way...?

Can we just scan through a whole programme and hope we can pick out a handful of frames which are very rich in relatively constant colours? Have a separate applet for doing the calibration, possibly with some kind of human-guidance...?

//Need to work on a crack for the colour gamut!// Think I should go to Brighton tomorrow for some inspiration from the sea...

Andrew Steer - 4 May 2008


**Thoughts on Harmonic distortion**

I've been continuing to ponder why the orange square doesn't come out the right colour, and why its luminance doesn't decode cleanly with my simple horizontal notch filter. Jonathan Wood helpfully commented (private correspondence) on a correlation between the chroma saturation fluctuation and large net picture luminance changes.

I have noted before on the wiki that the only reason for the luminance not filtering properly would be distortion of the chroma subcarrier sine waveform, and that this would be caused by net gamma errors or - more seriously - by black-level clipping.

Thanks to Jonathan's comment, a penny has just dropped. It is reasonable (given my experience of analog video systems and monitors) to suppose that although the net picture brightness is fairly stable, there may be momentary black-level fluctuations resulting from large-area luma changes. This would change the chroma distortion - and hence decoded saturation.

I think it would be very much worth my while analysing for subcarrier harmonic distortion products and subtracting them out of the picture before decoding. This may allow us to recover "below black" levels of subcarrier ;-) and may fix several issues.

I will also try and improve my U and V vertical filters as I know there are still some spurious responses which could cause a degree of U/V mixing.

As another aside, I really need to write a quick program to open and display the VT reference material. This would probably provide further hints too.

Andrew Steer - 5 May 2008


**Sorry - another distraction** :-)

When looking at the ACC images, I couldn't help but get annoyed by the luminance filter ringing, causing the "shadowing" around Jimmy's sleeve, Jimmy's hair, and on the "Edison Lighthouse" lettering. So I copied and pasted and modified the ACC code to do much the same trick with the luminance. Voila:



This is still a simplified version of the idea, but (IMHO) it looks rather good.

Here's a 768x576 scaled copy of the whole frame. This is the raw output from today's version of my code, apart from a final pre-resize vertical filter to tame the raster structure.

Further good news: the new processing is computationally fairly lightweight, and hasn't affected the runtime too adversely. It has made the code a bit longer and harder to follow, though.

Possibly a bit too "beta" to let loose on the next movie sequence (particularly the luma processing for which any artifacts will be more noticeable). Definitely verifies the general principle though.

//Need// to get back to the problem of absolute-phase and full-gamut. Am I procrastinating?

[Insert from Jonathan Wood]
**Comparison of graded recovered colour** (before the last few enhancements):

Jonathan Wood - 5 May 2008

(might be time to start a new page!)

Hint taken.

This page is now continued on Towards full-gamut ...