Report: 1080 & 768 LCD Panels Side By Side



GG Osborne Park has on display two Philips 37" LCD panels side by side, one 1920x1080, one 1366x768, both being fed the Philips HD demo loop from GG's 1080i "Stream Generator".

I watched carefully from 1m, then 2m and 4m; honestly, I couldn't tell much difference in terms of resolution. The 768 picture was about as sharp and well defined as the 1080 picture. No way will anyone convince me to pay 50% more $$ for the higher-resolution panel.

In my view, packing 1920x1080 into a 37-42" screen is good only for use as a PC monitor (where you typically sit less than 1m from it), but is a big waste on a TV screen, as no-one watches TV from less than 1m.
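That viewing-distance argument can be cross-checked with a back-of-envelope acuity calculation. The sketch below is illustrative only: the function name is made up, and it assumes the commonly quoted figure of about one arcminute of angular resolution for 20/20 vision.

```python
import math

def max_useful_distance_m(diagonal_in, rows, aspect=16 / 9,
                          acuity_arcmin=1.0):
    """Distance beyond which one pixel row subtends less than the
    assumed acuity limit (~1 arcminute), so extra rows go unseen."""
    height_m = diagonal_in * 0.0254 / math.sqrt(1 + aspect ** 2)
    pixel_m = height_m / rows
    theta_rad = math.radians(acuity_arcmin / 60.0)
    return pixel_m / theta_rad

# The two 37" panels from the post
d1080 = max_useful_distance_m(37, 1080)
d768 = max_useful_distance_m(37, 768)
print(f"1080 rows resolvable out to ~{d1080:.2f} m")
print(f" 768 rows resolvable out to ~{d768:.2f} m")
```

On these assumptions the full 1080-row detail of a 37" panel is only resolvable from within roughly 1.5 m, which is consistent with not seeing much difference at 2 m and 4 m.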






Correct .. There is not much benefit in 1080p for a small 37" screen .. It becomes more noticeable at 50" and is important at 60" and above ..



Wouldn't it also depend how each tv was connected? And wouldn't you need a 1080p source to compare properly?


Wouldn't it also depend how each tv was connected? And wouldn't you need a 1080p source to compare properly?

I did report that both panels were fed from the "Stream Generator", which provides multiple component outputs @ 1080i. The picture material is high quality as well, being promo stuff from Philips stored in HD format on the Stream Generator's hard disk.




I did report that both panels were fed from the "Stream Generator", which provides multiple component outputs @ 1080i. The picture material is high quality as well, being promo stuff from Philips stored in HD format on the Stream Generator's hard disk.

That doesn't mean it was driving the 1080p panel to its full ability, though.

For a start, if it's driving it at 1080i, the delivered resolution isn't actually better than the 768p image.

Also, the original source material may have been 720p (that is still considered HD), so the 768p display would have been a better match to the source anyway.

Still... given the lack of good 1080p HD content, and the likely continuation of that, it's probably wiser to save the 50% and buy a second, better HD panel in a few years' time when the price drops. You get a good screen now (at 768p) and a better screen later, all for the same cost as getting a 1080p panel now.


Guest Sparky66

I agree that 1080p would show a slight increase in picture quality on anything above 60", but for the 720p/1080i content that makes up most viewing at present, I don't agree.

My brother owns the 1080p 72CM9UA Toshiba and I own the 720p 62JM9UA Toshiba and to be honest we agree that the picture output by both units is identical. Only when viewing 1080p WMV HD is there a slight difference and only noticeable when viewed from closer up to the screen.

As a twist, because I have a better quality DVD player (Arcam DV29) and a DVDO iScan VP30 video processor feeding the 720p TV, it shits on his for movie playback. Go figure that one out.


1080i that is deinterlaced and displayed properly is the same resolution as 1080p, and 1080i from a progressive source like film can be converted back to 1080p very easily, although many digital displays still can't do it properly and use simple bob methods.

Even true interlaced 1080i is the same as 1080p in low motion scenes, and if properly deinterlaced, is so close to 1080p the rest of the time that it is virtually impossible to pick the difference, even up very close.

1080i also has smoother motion than 1080p at 24/25 fps, which is the only kind of 1080p around for video use.

1080 is a waste on relatively small displays at normal viewing distances for video use.

Even for general PC use, 1920x1080 is bloody high resolution, and text is hard to read at TV type viewing distances.

For those that want to view a 60” plus screen at 3 meters or even less, the difference between 720p and 1080i/p is definitely quite noticeable for video.



Perhaps the Philips 37" LCD display with the 1920x1080 resolution was the 37PF9830/10 (see the Philips webpage).

According to page 5 of the online manual, the Philips 37PF9830/10 has a picture format key at the bottom left of the remote control. This allows for a number of options including auto format, super zoom, movie expand 16:9, subtitle zoom, and wide screen. There is a note: "With HD sources not all screen formats will be available". From the diagrams, it appears that the format most likely to display a 1920 x 1080 source without scaling is the "wide screen" format.

I think that one of the difficulties in making comparisons would be that a lot of 1920 x 1080 displays default to overscanning a 1920 x 1080 source. This inevitably degrades the displayed picture somewhat, although according to some experts this scaling degradation is virtually undetectable at normal viewing distances. I await the opportunity to make an A/B comparison myself of a 1920 x 1080 display at close range, displaying a 1920 x 1080 source, as the picture format is alternated between overscan format and actual 1920 x 1080 format.
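To put rough numbers on that concern, here is a hypothetical sketch of the geometry a display applies when it overscans. The 2.5%-per-edge figure is a common ballpark, not something stated in this thread, and the function is illustrative only.

```python
def overscan_geometry(src_w, src_h, overscan_pct):
    """Visible region of the source when a display overscans by
    overscan_pct on each edge, and the upscale factor needed to
    stretch that region back out to the full panel."""
    f = overscan_pct / 100.0
    crop_w = round(src_w * (1 - 2 * f))
    crop_h = round(src_h * (1 - 2 * f))
    scale = src_w / crop_w
    return crop_w, crop_h, scale

w, h, s = overscan_geometry(1920, 1080, 2.5)
print(f"visible region: {w}x{h}, upscaled by ~{s:.3f}x")
```

So with 2.5% overscan on each edge, only a 1824x1026 window of the 1080 source is shown, rescaled by about 5%, which is the reinterpolation step being debated here.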




You NEED overscan when displaying video; that's why ALL TVs overscan at any input resolution.

I don't WANT compulsory overscanning when displaying video. If the outer edges of the frame of a particular movie are shockingly compromised and too distracting, I might appreciate the OPTION of overscanning, or the alternative option of a reduced aperture, by way of a black border electronically overlaying the raw full picture. (Preferably the aperture would be adjustable, to hide only as much of the source frame as needs to be hidden from view.)

Some advanced products already offer such options. For example the Sharp Aquos 45" LCD TV has a "dot for dot" mode available to display external 1080i material.

With blu-ray and HD-DVD disks and players arriving in Australia over coming months, we will be increasingly likely to want to disable overscanning. We will want to see the WHOLE of the picture frame, and will want to see it without a FURTHER stage of scaling and interpolation.

To use a crude analogy, if faced with the necessity of looking at a scene through a double layer of insect screening, which of us would not prefer the nearer insect screen to be the same gauge as, and to match up perfectly in alignment with, the farther insect screen? And would we be happy to have the outer edges of the nearer insect screen permanently obscured, to protect us from things that MIGHT be present or MIGHT NOT (e.g. a boom microphone, or weird flashing lights from an alien spacecraft visible at one edge of a scene that is otherwise just a newsreader in the studio)?

I live in a democracy and I am a consumer who will only BUY an LCD product that has the option of displaying a 1920 x 1080 video in that actual format -- not in a cropped, upscaled and reinterpolated version.

These remarks also apply to still pictures taken with readily available high resolution domestic digital cameras. If a PC is set to display at 1920 x 1080 and interfaced with the LCD TV, there is absolutely no need for the LCD TV to compulsorily crop upscale and reinterpolate. I think the LCD TV manufacturers (and consumers) will become increasingly aware of this, particularly as more computer video cards become available that can comprehensively cope with the 1920 x 1080 scanning resolution.

May we all live long and prosper but remember to exercise and not sit down all day in front of a computer screen. :blink:


This topic has been thrashed out several times on this forum, and I understand where you are coming from, I really do.

Inexperienced people get all worked up about 1:1 mapping, but for video use, it just does not matter if decent scaling is employed.

As hard as I try, I can't see any difference between a 1:1 image and a slightly scaled image on any of my 1080 or 1200 line displays when displaying 1080 video.

It’s also rare to find video that does not need overscan.

I run 2% overscan, and that's often not enough.

Just because 1:1 gives sharper text on a PC desktop, does not mean it provides better video.

Video is vastly different to PC generated text and graphics.

Anything that was captured with a camera lens cannot have one pixel perfectly sharp edges.

The only thing that has such sharp edges in video is compression macroblocks - artifacts, and you sure don’t want them cleanly reproduced if quality video is your aim.

The very sharp edged pixels of LCD screens are what make them good as a PC monitor; however, sharp edged pixels are the last thing you want for video, unless you enjoy a harsh, digital and unrealistic looking image.

The real world is not made up of pixels.


Hello MLXXX,

This topic has been done to death. It's pretty normal for people to feel the way you do, but seriously, it is misguided concern (no offence!). Yes, you live in a democracy and can decide to buy whatever LCD product you like. The thing is though, you may actually be doing yourself a disservice by limiting your options because of something that truly is not important. Don't get me wrong, there are valid reasons (mainly, PC use) for not wanting overscan, but for normal video viewing, it really is not desirable. Nor does it have to degrade the image (the scaling that may be invoked from having overscan). Of course in reality, most displays have to scale all of the time, anyway -- overscan or not -- simply because of resolution mismatch.

It can be mathematically proven that scaling, if done correctly, does not degrade the image. Unlike PC graphics, which is based upon discrete pixels and sharp edges that must be reproduced faithfully in order to convey what has been rendered, video is not captured and created in this way. In fact, if you ever do encounter anything like a sharp edge in normal video, then it is actually an artifact. It may be the edge of an MPEG macroblock, or, commonly, it may be aliasing which is high-frequency content that occurs when you capture or scale video without filtering it correctly. If video is filtered correctly so that it contains no aliasing (no high-frequency content above the Nyquist frequency) -- which is a must for high quality video -- then it most certainly will not contain any sharp edges that would warrant 1:1 pixel mapping. It's that simple! (Well, it's actually pretty complicated, but you get the idea).

Let's think of it another way... If video did contain the sharp kind of content that you find in PC text and graphics, then it would look really bad whenever a scene panned up/down/left or right. You guessed it, you would get... aliasing! As you know, you can't shift the image on your PC screen by sub-pixel amounts and have it still look sharp (remember how the older VGA analogue-interface LCD screens looked when they weren't calibrated correctly to the incoming analogue signal? The image would look soft unless it was "dead-on". The same type of effect would occur if it was not running at native resolution and therefore having to scale).

Well, video, whether we like it or not, is full of natural sub-pixel shifts (pans) and scaling (zooms)!! Because of this, and in order to have a scene look consistent, no matter whether individual pieces of detail (say, the hard edge of a table leg) correspond with individual pixels in the eventual raster or not, it's important to pre-filter the image (or oversample to begin with and then filter) prior to packing into the eventual raster. To do this correctly (and thereby avoid all aliasing) implies absolutely that there will not be any detail present that would be lost if high-quality scaling is later done (other than the obvious detail that you lose if you downsample).
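The band-limiting argument above can be sketched in one dimension: smoothly varying, camera-like content survives a small rescale almost untouched, while a PC-style hard edge does not. This is a toy model of my choosing, with plain linear interpolation standing in for a real scaler.

```python
import numpy as np

n = 1000
x = np.linspace(0.0, 1.0, n)
x_up = np.linspace(0.0, 1.0, int(n * 1.04))  # ~4% denser grid, like a mild rescale

def round_trip(y):
    """Resample up ~4% and back down with linear interpolation,
    a crude stand-in for a display's scaler."""
    return np.interp(x, x_up, np.interp(x_up, x, y))

smooth = np.sin(2 * np.pi * 3 * x)   # band-limited, lens-like content
edge = (x > 0.47).astype(float)      # PC-graphics-style hard edge

err_smooth = np.max(np.abs(round_trip(smooth) - smooth))
err_edge = np.max(np.abs(round_trip(edge) - edge))
print(f"max error, smooth content: {err_smooth:.1e}")
print(f"max error, hard edge:      {err_edge:.1e}")
```

The smooth signal comes back with a worst-case error orders of magnitude below anything visible, while the hard edge loses a noticeable fraction of its amplitude at the transition, which is the "scaling only hurts sharp, aliased content" point in miniature.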

So, it's really all a moot point. If you ever do think that some video looks worse or softer after scaling, then it's either:

1. poor quality scaling or other processing being performed

or

2. the apparent "detail" that you are losing is aliasing (or other artifacts, such as MPEG-related) and therefore you truly are better off with it being somewhat filtered by the scaling anyway.

or

3. you are downsampling by a significant amount. It's fairly obvious that you lose detail when you downsample, proportionally to the amount you are downsampling by!

In reality, a lot of video (especially content captured on CCDs) contains aliasing. This is not useful "resolution" and is actually harmonically at odds with the content! So preserving aliasing is never really a good idea, unless you're the kind of guy who loves distortion :blink:

Hope that clears some things up.

Cheers,

Adam


Well, I am a brave (many would think foolhardy) man to attempt to resist the patient explanations immediately above from Owen and Adam-O, who have each posted more than 1000 times to the forum, appear to be very well informed indeed, and could be dubbed our experts.

As for the desirability of having a 1:1 mapping option, this seems to be conceded (even if indirectly) by both of the experts, in relation to using an LCD television to display text and graphics from a PC operating at 1920 x 1080 pixels.

Allied to that question would be displaying high quality still images originating from a domestic still digital camera costing, say, a few hundred dollars. These days it is common for such cameras to have over 2160 horizontal lines of pixels. They may not quite manage 3840 vertical lines of pixels. A camera specified to capture a 16:9 image at 3840 x 2160 pixels would be rated at some 8 Megapixels, which is higher than average and could currently cost more than a few hundred dollars. However, the LCD panels we buy in the next few months will still be in our living rooms 12 months later.

Some very high end models of domestic still cameras already come with a raw image output option, but more realistically we might expect a form of compression. This compression will typically be strong in relation to the colour detail but not too severe in relation to the luminance detail. If the picture is taken in good lighting and well focussed, we will have a picture that is 3840 x 2160 pixels but compressed into jpeg or some other format. We can then scale that to 1920 x 1080 with appropriate software and save the result using minimal compression, or even zero further compression. I find that if I try to slightly rescale such an image the quality visibly deteriorates. It could hardly be otherwise.

I assert that this type of quality still image will be available in the home over the next 12 months, if it is not already available, and the LCD TVs we buy today should cater for 1:1 mapping. I do not accept the contention that upscaling by 2 to 5% and reinterpolation does not noticeably affect image quality, in circumstances where the original quality is high and where the rescaled image can be examined at close range. However, Adam-O does explain that video material is subject to filtering and processing.

The trickiest question is indeed the compressed video. It requires massive amounts of compression to fit raw video into the available bandwidth of a conventional DVD, or a current DTV terrestrial broadcast in Australia. If the picture content is changing, the approximations become severe. I find it disconcerting that a person dashing into a room can appear for an instant to have no face! I am able to accept the suggestion that this type of material may not be noticeably impacted by rescaling, particularly when viewed from a distance. This I think is what Owen has been suggesting. Adam-O goes so far as to say, "It can be mathematically proven that scaling, if done correctly, does not degrade the image." I do not think that statement is strictly correct for an upscaling in the 2 to 5% range, unless the source material is already degraded, e.g. through video compression.

I have not been able to properly witness higher bandwidth technology displaying slow moving or static material, though I've had a glimpse of such material. I would presume the effects of the compression would be far less severe than what we have been used to with conventional DVDs and terrestrial digital TV.

Just to round off, one way of applying mathematics to the effects of scaling is to ascertain a final figure for the resolution of a still image. That can be a misleading approach. The eye, really the brain, can perceive more than a final resolution figure. If you look through venetian blinds at a ball that is falling, it is very apparently a ball even though you can never technically see the whole of the ball in a single instant, because the blinds hamper your view. If in addition to the venetian blinds there is a dust storm, the measurable static picture resolution may become very low, but out of the chaos the brain is still able to perceive something.

At the local Ozzie tv station, the picture content may have been through many processes but -- to use audio terminology -- is finally 'mixed down' to a particular format. Immediately prior to final mix at the local tv station, the material may exist at a fairly good standard, e.g. a live feed from a studio interview, or a digital stream derived from a high quality conventional movie on film. The format of the final mix for a 1920 x 1080 broadcast should presumably be at least 1920 x 1080 [or 1088] pixels which is MPEG2 encoded and occupies its particular allocation in the transmission stream. (I do not work in the industry and have just an overview knowledge.)

The encoding introduces what could be described as a blurring. This effect is worse if the picture content is complex or fast changing.

In the end what we see is a somewhat foggy result supposedly containing 1920 x 1080 pixels but most of the time consisting of blurred pixels in many parts of the image.

If this result is cropped by say 2 to 5%, and then upscaled, it is not too surprising that the degradation introduced by the upscaling might not be readily observed, because a lot of the source pixels are rather blurred to begin with, and the eye may be preoccupied with other parts of the image. Certainly upscaling alone can't make a person temporarily lose their facial features, the way MPEG2 encoding -- with rather suboptimal bandwidth -- can.

I withhold my final verdict on upscaling and image quality for 1920 x 1080 video until I've been able to inspect the picture content of some of the new-technology DVDs, sent to a display that can be alternated between 1:1 mapping mode and conventional cropping, upscaling and reinterpolation mode.

Sorry this is a lengthy post. I'm almost certain it will not alter the opinions of our two experts. Must go now and play back tonight's Dr Who :-)

P.S. 9.45pm I accept what Adam-O says, that video resolution by necessity must be filtered, or a thin vertical string that was one pixel wide would, as it was panned by the camera, at times occupy two pixel widths (as a half-bright image) and at other times just one pixel width (as a bright image). Zooming in and out would give a similar result. Interestingly, even a half-pixel wide string could register on a CCD camera as a half-bright pixel, or even as two quarter-bright pixels! However, the approach being suggested is that the brain prefers not to cope with this type of detail. Presumably a one pixel wide string might even be suppressed altogether from appearing, because of this need to average and filter. That apparently is the price to be paid for a smooth looking video. And it cuts down too on the bandwidth requirements of whatever medium is in use to convey the video picture content: disk, satellite/terrestrial, or cable.
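That panning thought-experiment can be modelled directly with a box-filter sensor: integrate a bright line over each pixel's aperture and watch what happens as the line shifts by half a pixel. The function below is my own illustrative sketch, not anything from the thread; pixel i is taken to span the interval [i, i+1], so a line centred at 4.5 sits exactly on pixel 4.

```python
import numpy as np

def sample_line(center, width=1.0, n=9):
    """Integrate a bright vertical line (given center and width in
    pixel units) over each pixel's [i, i+1] aperture -- a box-filter
    model of a row of CCD sensels."""
    out = np.zeros(n)
    lo, hi = center - width / 2, center + width / 2
    for i in range(n):
        out[i] = max(0.0, min(i + 1, hi) - max(i, lo))
    return out

print(sample_line(4.5))              # aligned 1-px line: one full-bright pixel
print(sample_line(5.0))              # shifted half a pixel: two half-bright pixels
print(sample_line(4.5, width=0.5))   # half-pixel-wide line: one half-bright pixel
```

The three cases reproduce the P.S. exactly: the same physical line registers as one bright pixel, two half-bright pixels, or one half-bright pixel, depending only on sub-pixel position and width.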




As a simple experiment, open one of your high res photographs full screen in Photoshop on your high quality, high res PC monitor.

Now use the zoom control to scale the image up slightly.

Can you really see a significant change in the image?

If you want to be real pedantic, scale and save the image at your PC monitors resolution so it can be viewed 1:1 at full screen, and then upscale that image 2-5% and save as another file to simulate overscan.

Then use a picture viewer to show both images at full screen, and switch between the 1:1 image and the 2-5% scaled image.

Don’t forget to do all the evaluation at a screen height to viewing distance ratio that you expect to use with your TV.

If you think that level of degradation is important, then more power to you mate.

I thought I was fussy. :blink:
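The scale-and-save experiment described above can also be scripted. The sketch below uses Pillow with synthetic images in place of "one of your high res photographs"; the smooth-versus-noise split is my own illustration of the aliasing point made elsewhere in the thread, not part of the recipe as posted.

```python
from PIL import Image
import numpy as np

def round_trip_error(img):
    """Upscale ~4% (simulated overscan) then back to native size with
    a high-quality filter; return mean absolute pixel error."""
    w, h = img.size
    up = img.resize((int(w * 1.04), int(h * 1.04)), Image.LANCZOS)
    back = up.resize((w, h), Image.LANCZOS)
    diff = np.abs(np.asarray(img).astype(float) - np.asarray(back).astype(float))
    return float(diff.mean())

h, w = 540, 960
yy, xx = np.mgrid[0:h, 0:w]
# Smooth, lens-like content with no energy near the pixel frequency...
smooth = Image.fromarray(
    (127 + 100 * np.sin(xx / 30.0) * np.cos(yy / 40.0)).astype(np.uint8), "L")
# ...versus pixel-level noise, a stand-in for aliased "detail".
noise = Image.fromarray(
    np.random.default_rng(0).integers(0, 256, (h, w), dtype=np.uint8), "L")

smooth_err = round_trip_error(smooth)
noise_err = round_trip_error(noise)
print(f"smooth image error: {smooth_err:.2f} / 255")
print(f"noisy image error:  {noise_err:.2f} / 255")
```

Properly band-limited content comes through the 4% round trip nearly untouched, while single-pixel detail is substantially altered, which is consistent with both sides of this argument: scaling is benign for filtered video, and only "detail" at the pixel frequency pays the price.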


Thanks very much, Ikari.

I presume these animated GIFs alternate between what ought to be (1) an original version as decoded from the broadcast data stream, and (2) an upscaled, re-interpolated, cropped back (i.e. overscanned) version of the decoded data stream. Theoretically, after good quality scaling, when displayed on an LCD screen pixel to pixel, the overscanned version could be nearly as clear as the original. Perhaps so similar as to be indistinguishable in quality.

First GIF - BBC Planet Earth 1280 x 720 pixels

This GIF illustrates one of the problems in asking an audience to compare an overscanned image to an original. The overscanned picture looks bigger, so automatically looks more imposing! And for people with poor eyesight, or sitting a long way from the screen, it could actually allow a viewer to see MORE detail.

I initially had trouble viewing the GIFs as Internet Explorer wanted to resize them. When viewed with specific GIF displaying/editing software (e.g. "Advanced GIF Animator"), you can observe the separate pictures at your leisure and at a pixel-on-pixel resolution. The text is softer in the overscanned version, which is to be expected. The view of the earth is, however, much the same as far as perceivable detail is concerned, to my eyes. I believe this is because the picture of the earth is not particularly clear to begin with. It could be a different story if the image were a close-up of a person in the studio.

This is an interesting GIF because it reminds us that even when viewing video material, there may be overlaid graphics that are noticeably degraded by overscanning even if the picture content is not.

Second GIF - 1920 x 1080 test pattern

Here, as expected, the overscanned image is quite noticeably deficient, compared with the original. Whereas the original has clear text and displays the resolution test bars cleanly, the overscanned version has blurred text, and very noticeable artifacts in the detail of the resolution bars. It would be interesting to know what bit rate/compression were used for encoding and transmitting the original, as it is exceptionally clear.

The question arises though, how often in Australia will broadcast video material be of a pristine high resolution image, and how much of such a pristine image's detail will not be filtered and MPEG2'd away. (And another objection raised by the experts, is that you can only really notice these deficiencies if you sit pretty close to the screen.)

Still I'd prefer not to have the overscanning, please!

Displaying still camera pictures

When I try this type of exercise on pictures from my own digital still camera, along the lines Owen describes in his post earlier this morning, upscaling by 5% does introduce visible deterioration in image detail. (I use Paint Shop Pro for the image processing). The loss in detail is slight, but for pedantic moi, annoying. This is, I concede, a much more exacting use to put an LCD display to, than displaying broadcast video.


Inexperienced people get all worked up about 1:1 mapping, but for video use, it just does not matter if decent scaling is employed.

That's the real key, mate. And I think you miss it every time.

If you have the option of 1:1 pixel mapping there is NO scaling required (given the proper source). If a panel forces overscanning and it has SHITHOUSE scaling, then yes, we will be losing quality. 1:1 pixel mapping negates a panel's inability to scale properly.


Unfortunately, some 1920 x 1080 LCD TVs do not offer the option of 1:1 mapping. They do force upscaling, cropping and reinterpolation. The scaling and reinterpolation would normally be well done, but must introduce some loss in quality. For example, on the first GIF in Ikari's post above, the High Definition logo is not as clear in the overscanned version as in the 1:1 mapped version. With the current data rates and compression protocols for broadcast digital TV in Australia, it seems that most video material is not visibly impacted by the overscanning, other than the obvious inability to see the outer edges of the picture (which experts argue are often not intended to be seen anyway).

But later this year there will be new technology high-definition DVD disks (not too expensive) on sale with high quality transfers of movie classics. And there will be stand-alone players (initially pricey). I don't think anyone has proven that overscanning will not degrade the visible detail of this higher quality material.

When I buy an LCD monitor, or an LCD monitor with built-in tuner, I will not want it to forcibly reprocess 1080i material sent to it into an upscaled, reinterpolated, cropped version.

Interestingly, some LCD monitors can apparently be tricked into not overscanning (see the topic "Acer AT3705-MGW 37" LCD TV, True 1080p?", e.g. mrangryfish, Mar 10 2006, 08:02 PM).




This GIF illustrates one of the problems in asking an audience to compare an overscanned image to an original. The overscanned picture looks bigger, so automatically looks more imposing! And for people with poor eyesight, or sitting a long way from the screen, it could actually allow a viewer to see MORE detail.

That is why I left out the cropping step in my example.

If the 2-5% scaled image is not cropped, the video card will scale it down to fit the display.

This way the 1:1 image and the scaled image can be viewed at the same size.

When I try this type of exercise on pictures from my own digital still camera, along the lines Owen describes in his post earlier this morning, upscaling by 5% does introduce visible deterioration in image detail. (I use Paint Shop Pro for the image processing). The loss in detail is slight, but for pedantic moi, annoying. This is, I concede, a much more exacting use to put an LCD display to, than displaying broadcast video.

Did you view the images at the same size (upscaled image scaled back down to fit the screen), and at a screen height to viewing distance ratio that you would use for normal TV viewing?

Here is a 1920x1080 HD test pattern taken from a 19Mbps Mpeg2 stream.

http://img154.imageshack.us/my.php?image=testpatternrm3.png

Here is the same image upscaled 4% and down scaled back to 1920x1080 for comparison.

http://img132.imageshack.us/my.php?image=t...rnscalednl3.png

Download both and look at them at a normal viewing distance.

Viewing in your browser is not ideal, better to use a proper full screen photo viewer.

The scaled image is slightly dimmer for some reason, ignore that and just look at the resolution.

The images are in a lossless PNG format.


Unfortunately, some 1920 x 1080 LCD TVs do not offer the option of 1:1 mapping.

Is this the case, though, when connected to a HTPC, DVI-DVI or HDMI-DVI?

I recently trialled a Pyrod 1920x1080 LCD TV, DVI-DVI from a PC, and it displayed 1920x1080 native without overscan.


That's the real key, mate. And I think you miss it every time.

If you have the option of 1:1 pixel mapping there is NO scaling required (given the proper source). If a panel forces overscanning and it has SHITHOUSE scaling, then yes, we will be losing quality. 1:1 pixel mapping negates a panel's inability to scale properly.

That’s a valid point, mate; however, you seem to have the wrong idea. I am not anti 1:1 mapping, it’s just not as important for video viewing as many would believe, that’s all I am saying.

If you buy an el-cheapo LCD panel, you will get the scaling you paid for, however decent quality displays have decent quality scalers.

Obviously, if you have a cheap panel, you would want to avoid the internal scaler if possible.

HTPC users have the option of scaling on the PC to provide overscan, or get 1:1 if required.

I have been viewing a 57” 1080 display for over two years, and have a significant library of 1080 HD.

I can set overscan to whatever suits me, but I have to say that I never choose to go without a little overscan, as it just looks a whole lot better with nice clean edges on the image.

HD DVD and Blu-ray will most likely require a little overscan, just as DVDs do.

All video formats are designed with overscan in mind, it’s the industry standard, like it or not.



Most interesting. The original is a little soft, certainly softer than the test pattern in Ikari's second GIF, which is exceptionally clear. The rescaled one is a little softer again. If I stand 3 metres from my 19" CRT monitor operating at 1024 x 768, with both PNG images displayed in Paint Shop Pro as windows showing the right-hand corner of the test pattern at actual 100% size, I can see the difference. The upscaled image is just a bit more blurred than the non-upscaled image. By the way, 3 metres away is as far as the room my monitor is in permits. I haven't rigorously done the mathematics of viewing distance. However, here are some quick calcs.

Roughly measured, the rescaled entire test pattern displays at about 13.5cm high on my CRT monitor when PSP is displaying at 40% of original size. It ought therefore to display at 34cm high if my monitor screen were bigger but retained the same scanning pitch, and PSP was displaying at 100% original size. I think 34cm high is not all that big for a widescreen LCD display. My viewing distance of 3 metres is over 8 times the calculated screen height. I am not sure how far I would sit in practice when watching video on a 34cm high widescreen display.
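Those quick calcs check out when run mechanically (the variable names are mine; the 13.5cm, 40% and 3m figures are the ones given above):

```python
measured_h_cm = 13.5   # pattern height on the CRT at 40% zoom
zoom = 0.40
viewing_m = 3.0

full_h_cm = measured_h_cm / zoom           # implied height at 100% zoom
ratio = viewing_m * 100 / full_h_cm        # viewing distance in screen heights
print(f"height at 100%: {full_h_cm:.2f} cm")
print(f"distance/height ratio: {ratio:.1f}")
```

That gives 33.75cm of screen height and a viewing distance of roughly 8.9 screen heights, consistent with the "over 8 times" estimate.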

All of this just shows the difference can be seen. It doesn't establish it's a major issue for watching video.

Unfortunately I must leave this forum discussion for a little while. Cannot spend my life here. Thanks again to Owen for these images. They certainly give the discussion a practical edge.



3 meters on a 19" CRT :P Come on man, you can't be serious. :D

I am viewing those images on a 17" 1920x1200 LCD (23cm picture height) at 300mm (0.3 meters) viewing distance, and I am hard pressed to pick any difference other than the brightness.

Nothing wrong with my vision either.

Something very wrong there dude. :blink:

The images are softer than the GIFs because they are from HD video, not PC graphics.



