
576p or 1080i



Aust 1080i is 1440x1080 @ 14 Mbit/s.

Hear, hear about STBs and one format.

Channel Nine is 1920x1088

Also my 2c about 576p vs 1080i. It's like comparing night and day. 1080i is worlds ahead of 576p, even on my 17" CRT monitor. On a 50" display, I'm sure the difference would be even greater in favour of 1080i.
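The odd-looking 1088 figure above is an MPEG-2 quirk: the coded picture height must be a multiple of the 16x16 macroblock size, so 1080 is padded to 1088 and the extra 8 lines are cropped on display. A minimal sketch of that rounding:

```python
def coded_size(width, height, macroblock=16):
    """Round picture dimensions up to the next multiple of the
    MPEG-2 macroblock size (16x16 pixels)."""
    round_up = lambda n: -(-n // macroblock) * macroblock  # ceiling division
    return round_up(width), round_up(height)

print(coded_size(1920, 1080))  # (1920, 1088) -- why broadcasts report 1088
print(coded_size(1440, 1080))  # (1440, 1088)
print(coded_size(704, 576))    # (704, 576) -- SD already aligns
```

This is why a stream analyser can report "1920x1088" even though only 1080 lines are ever meant to be shown.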



Hmm, my experience is that the TV stations are 1440 wide, but to correct the aspect ratio for square-pixel PCs, the image is stretched to 1920x1440 if using PC-based software such as WinDVD to play the HD stream.

Now to the 576p vs 1080i argument:

But you have to be careful when comparing.

576p/720p (50) would look superior with fast-moving live events, whereas 1080i based on film would look superior (or with slow live material), as it can be de-interlaced successfully to 1080p from film.

It could be argued that 1080i for fast motion stuff (not talking about film here!) is not as good.

Home&Away doesn't fully take advantage of 576p50, as it repeats every 2nd frame. If you were watching something fast moving that had 50 unique frames per second, I think 576p would possibly give 1080i a run for its money.
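That repeat-every-2nd-frame behaviour is easy to check in principle: if 50p material came from a 25p source, consecutive frames arrive in identical pairs. A toy sketch (pure Python, with plain lists standing in for decoded frames; real video would need a noise threshold rather than exact equality):

```python
def unique_frame_rate(frames, fps=50):
    """Estimate the effective frame rate of a stream by counting
    frames that differ from their predecessor."""
    unique = 1  # the first frame always counts
    for prev, cur in zip(frames, frames[1:]):
        if cur != prev:  # real video: compare with a noise tolerance
            unique += 1
    return fps * unique / len(frames)

# 25p-in-50p: every source frame transmitted twice
base = [[i * 10] * 4 for i in range(25)]      # stand-in 25p frames
doubled = [f for f in base for _ in (0, 1)]   # 50 frames, pairs repeated
print(unique_frame_rate(doubled))  # 25.0 -- only 25 unique frames/s
```

A true 50p sports feed would score ~50 here, which is where 576p50 would earn its keep.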


BUT here are my observations when I view my RankArena 86cm at 3.0m, with my HD Thomson DTI1500HD set to 31.3kHz & 50Hz:

Dugby,

While I am not sure of the exact "measurements", I seem to recall that to get the benefit of HD, you should be viewing within 6 times picture height of the screen. That would be within 8 feet of your screen, if my calculations are correct.

It would be interesting to hear your thoughts viewing a bit closer to the screen. Too far away and your eyes cannot resolve the extra detail.

Someone may care to correct me on the recommended figures, I am going by vague memory.

Cheers

JB

Hi John, well to answer your question..... last night whilst watching a true 1080i broadcast, I sat 2.0m from the HDTV and then back at 3.0m.

Yes I was able to see more detail in peoples' skin complexion and individual strands of their hair, whilst at 2.0m.

At 3.0m I can still see individual strands of hair.

What I personally don't like is the 'too big screen' effect when sitting closer to the HDTV. I have a friend with a PJ, and every time I see his screen, which is set up for 4.0m seat-to-screen viewing, I get quite nauseous and have to leave the room.

As a result of this, whenever I go to a cinema, you'll see me in the back 4 rows.

So for me, 3.0m is optimum for my 86cm HDTV.
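For what it's worth, John's vague-memory figures roughly check out: an 86cm 16:9 screen is about 42cm tall, so six picture heights comes to about 2.5m, close to the 8 feet he mentions and not far from Dugby's preferred 3.0m. A quick sketch, taking the 6x-height rule as given (quoted figures for this rule vary by source):

```python
import math

def max_viewing_distance(diagonal_cm, multiple=6.0, aspect=16 / 9):
    """Viewing distance (m) as a multiple of picture height,
    for a screen of the given diagonal and aspect ratio."""
    # diagonal^2 = height^2 + (aspect * height)^2
    height_cm = diagonal_cm / math.sqrt(1 + aspect ** 2)
    return multiple * height_cm / 100

d = max_viewing_distance(86)               # 86cm screen, 6x height rule
print(round(d, 2), round(d / 0.3048, 1))   # ~2.53 m, ~8.3 ft
```

Beyond that distance the eye simply can't resolve the extra HD detail, which matches Dugby's hair-strand test at 2.0m vs 3.0m.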


576p/720p (50) would look superior with fast-moving live events, whereas 1080i based on film would look superior (or with slow live material), as it can be de-interlaced successfully to 1080p from film.

That's pretty much it in a nutshell.

576p is near pointless for low frame rate sources (as Seven are using now), as you'll only ever be looking at 25 unique progressive frames a second anyway (and you can get this with de-interlaced 1080/25p @ 50Hz at much higher spatial resolution).

Home&Away doesn't fully take advantage of 576p50, as it repeats every 2nd frame.  If you were watching something fast moving that had 50 unique frames per second, I think 576p would possibly give 1080i a run for its money.

Exactly. It's important to note that even with native 576/50p sources you'd still of course notice the lower spatial resolution compared to 1080i, but what you would see is superb, consistent resolution at all times, with no blurring or loss of detail during movement (which in my experience gives everything an enhanced depth and a 3D-like appearance) and no artefacts whatsoever (no aliasing, interline flicker etc). In these cases 576/50p has the potential to look very good indeed and offers much over the 576/50i format.




Darklord,

I would like to commend you on the outstandingly polite way you have handled this discussion. Often discussion can turn to argument. You are indeed a gentleman, and one with plenty of time on his hands judging by the length of your posts. :blink:

There’s no doubting that from a simple “lines of resolution” standpoint 1080i is superior. However, it’s important to note that line structure certainly isn’t everything, and that the very nature of progressive scanning makes line structure far less visible on many displays, as the lines are refreshed in the same spot rather than jumping up and down. There are also twice as many of them displayed in the same time.

True, but there is one detail I’d like to point out. CRT phosphors have a designed-in amount of persistence, where they do not turn off instantly after being illuminated. This allows the first field to be held on screen while the second field is drawn. Sort of a sample-and-hold function, if you will. This means that all 1080 lines ARE on screen at the same time, even though the first field will have decayed in brightness somewhat.

A well designed digital display should deinterlace and display progressively so no line jumping should occur.

I am not arguing that interlaced is better than progressive. Far from it: all else being equal, progressive scan is far better than interlaced, especially for fast movement, as you have said, but there are many variables at work that make things not always equal. :P

Regards,

Owen


Darklord,

I would like to commend you on the outstandingly polite way you have handled this discussion. Often discussion can turn to argument. You are indeed a gentleman, and one with plenty of time on his hands judging by the length of your posts.

LOL. Thanks Owen. I probably do spend too much of my life contributing at AV forums, but once a topic of interest comes up I can’t help myself. I realise that makes me an epic geek, but hey since you’re reading this you’re more than likely one yourself :blink:

As for being “outstandingly polite”, thank you for that courteous comment. I’m actually guilty of letting these kinds of discussions get out of hand previously in other topics and forums, as when you have strong opinions about technical topics it’s tempting to get carried away, and in turn fall into the trap of being unnecessarily condescending and insulting. I’ve certainly learnt that the only way to get people to listen is to treat them with respect, put forward arguments in a reasonable way, and most importantly be willing to listen to and learn from angles you simply may not have considered.

Anyway, thanks again for your comments, and thank you also for keeping this discussion on friendly and enjoyable terms.

There’s no doubting that from a simple “lines of resolution” standpoint 1080i is superior. However, it’s important to note that line structure certainly isn’t everything, and that the very nature of progressive scanning makes line structure far less visible on many displays, as the lines are refreshed in the same spot rather than jumping up and down. There are also twice as many of them displayed in the same time.

True, but there is one detail I’d like to point out. CRT phosphors have a designed-in amount of persistence, where they do not turn off instantly after being illuminated. This allows the first field to be held on screen while the second field is drawn. Sort of a sample-and-hold function, if you will. This means that all 1080 lines ARE on screen at the same time, even though the first field will have decayed in brightness somewhat.

It’s a good point and certainly not one I have overlooked. Persistence of phosphors on CRT-based displays is one aspect I’ve often brought up when discussing the relative merits of interlaced scan. In fact, as you probably know, it’s one of the main reasons that interlaced scan was deemed an acceptable broadcast solution when it was first introduced, as it certainly helps the illusion of there being greater resolution available than there actually is. However, it’s interesting to note for this topic that many newer CRT displays (including HD models) actually have a faster phosphor refresh than older TVs, so that they work well with 100Hz digital processing. The reason being that if phosphor refresh is slow in 100Hz mode you get a hideous trailing effect (you’ve probably seen this on cheaper TVs with poor 100Hz modes). Increasing the phosphor refresh means crisper images and more detail during 100Hz motion. Even though many of these TVs are designed for 50Hz HD, they are still often designed primarily with 100Hz SD viewing in mind, as that makes up the majority of viewing for most “average” consumers.

A TV with fast phosphor refresh not only shows less detail when viewing interlaced content at 50Hz (as lines obviously disappear more quickly), but full-frame flicker is more noticeable at 50Hz (one of the reasons some CRT TVs seem to flicker more with HD at 50Hz than others).

Anyway, regardless of phosphor refresh, fixed-panel native progressive digital displays are becoming more and more common, and I have no doubt they will eventually be the norm once their overall quality and price point are up to scratch. As we have discussed, these fixed-panel displays simply can’t display an interlaced signal anyway, and refresh the whole screen at once, so this changes things dramatically.

A well designed digital display should deinterlace and display progressively so no line jumping should occur.

Actually it still can. The reason being that if a source is natively interlaced to begin with (shot in 576/50i or 1080/50i, for example), each individual field is from a different motion stage, and hence you can’t recombine these fields using "weave" de-interlacing to form complete frames on a progressive display. The result is that most digital displays, when detecting this type of video source, will simply scale each individual interlaced field to a full frame via basic interpolation (known as "bob" de-interlacing, which I admittedly spend way too much time discussing!). Because the original even/odd line sequence remains (the gaps are just filled in), the original lines still jump up and down (or detail flashes on and off), causing the same interline flicker artefact as found on interlaced monitors! In addition, because those fields aren’t being perceived as woven together any more (due to the lack of phosphor persistence and the interleaving of those fields), you’re only perceiving around half the lines of vertical resolution (in the case of 576i: 288 lines) at any given time! This is one of those cases where you’re actually better off viewing interlaced sources as interlaced.
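The bob process described above can be sketched in a few lines: each field keeps only its own scanlines, and the gaps are filled from neighbouring lines, so only half the vertical resolution is present at any instant. A toy version (pure Python, with nearest-line duplication standing in for the interpolation a real scaler would do):

```python
def bob_deinterlace(field_rows, parity, full_height):
    """Scale one interlaced field to a full frame by duplicating
    lines (nearest-neighbour 'bob'; real displays interpolate).
    parity 0 = top/even field, 1 = bottom/odd field."""
    frame = [None] * full_height
    # place the field's real lines back on their original scanlines
    for i, row in enumerate(field_rows):
        frame[2 * i + parity] = row
    # fill the missing scanlines from the nearest real line
    for y in range(full_height):
        if frame[y] is None:
            src = y - 1 if y > 0 else y + 1
            frame[y] = frame[src]
    return frame

even_field = [[10] * 4, [30] * 4]  # lines 0 and 2 of a 4-line frame
print(bob_deinterlace(even_field, 0, 4))  # each field line doubled
```

Note the output has only two distinct lines out of four, and the odd field would put them on the other scanlines: that alternation is exactly the "lines jumping up and down" interline flicker described above.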

I am not arguing that interlaced is better than progressive. Far from it: all else being equal, progressive scan is far better than interlaced, especially for fast movement, as you have said, but there are many variables at work that make things not always equal.

As is probably clear from my arguments I completely agree! There are many cases where progressive viewing is better than interlaced and sometimes vice versa. There are so many things to take into consideration with modern digital viewing it’s a nightmare!

It really will be much simpler when we live in an all progressive world (Hey that sounds like a suitable slogan for JVC’s latest campaign! :P)


Hmm, my experience is that the TV stations are 1440 wide, but to correct the aspect ratio for square-pixel PCs, the image is stretched to 1920x1440 if using PC-based software such as WinDVD to play the HD stream.

The STB will output 1920x1080, but the TV station's MPEG encoders sample at 1440 H and 1080 V.

If Nine were sampling at 1920x1080 (@ 14 Mbit/s), the resulting "blocking" would be woeful on fast-moving images. If they do 1920, their encoders are VERY VERY good.
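Both halves of that post can be put into numbers: a 1440x1080 frame displayed at 16:9 implies anamorphic pixels stretched horizontally by 4/3 (1440 becomes 1920), and sampling the full 1920 raises the pixel rate by a third against the same fixed 14 Mbit/s budget. A rough sketch, using the figures from this thread (25 full frames/s assumed for 1080i50; real encoder efficiency varies widely):

```python
def display_width(coded_w, coded_h, aspect=16 / 9):
    """Horizontal size after stretching anamorphic pixels to square."""
    return round(coded_h * aspect)

def bits_per_pixel(bitrate_bps, w, h, fps=25):
    """Average compressed bits available per pixel per frame."""
    return bitrate_bps / (w * h * fps)

print(display_width(1440, 1080))                   # 1920
print(round(bits_per_pixel(14e6, 1440, 1080), 3))  # ~0.36 bits/pixel
print(round(bits_per_pixel(14e6, 1920, 1080), 3))  # ~0.27 bits/pixel
```

Fewer bits per pixel means coarser quantisation under motion, which is the "blocking" the poster predicts for full-width 1920 sampling at this bitrate.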



So you are agreeing with me, right?

I actually made a typo in my original statement; I meant "1920x1088 if using PC based software..." etc.

Cheers

