All this, again, started with me noticing people asking for refunds simply because of how the term native resolution is understood by the general public. Some people's actions resulted in a lot of unhappy backers who thought they would not see a true 1080p resolution on the screen - which is not true. I myself had private conversations with people, explaining what it is about and calming them down after they read the rants online.
My Facebook beer offer was about something else (but I may be wrong). I will debate anything if I feel there is a reason to debate - and I am not afraid to lose - but I will defend a position where a defence is obviously needed.
In this debate I said the fairest way to put it would be: the native resolution of the chip is 540p with pixel shifting, and the native resolution of the projector (the image on the screen) is 1080p.
You can all blame the TV manufacturers who decided to use native resolution heavily in the marketing of TV screens. When an ordinary person buys a projector - your mother, father, uncle, barber, chef, lorry driver - for them it is the same as buying a new TV set, and the words native resolution are compared with what they mean for TVs, namely the number of unique dots on the screen.
That is a fact of life. When you add the expert version, go into the chips and call someone a liar, you defend your expert position but you demolish the world of ordinary people. And then what I described above happens.
So in my debates I am simply defending the right of an ordinary person to buy what they actually want to buy. The right to use the word the way they are used to using it, without getting confused because experts need to be right. The right to compare apples to apples (TV screens to projected images), not apples to oranges (the different technologies used to create the image on the screen). Sorry if that is arrogant.
p.s. I am not throwing fuel on the fire. I am done with this topic. The likes and private messages from people who read my posts are enough for me.
Very interesting and clear explanation. Never mind the other 95% who do not understand what is happening. So I backed and bought a projector with a native resolution of only 960x540, even if, with the help of software wizardry, the display shows 1920x1080. This REALLY explains the huge difference in resolution quality between my 75" TV and this projector. The image is noticeably blurred when compared to the TV. I feel cheated…
Philips people, could you step in and clarify this situation please?
It is not entirely software; it's actually more hardware. As explained, this technique is used by many projectors that are marketed as native, like many 4K projectors. I think the difference between those that use it and those that don't is barely noticeable.
Also, don't expect to get the same quality as your TV. Even high-end, bigger projectors might not win that comparison.
Last thing: picture quality is currently worse from the internal system. Try using an external source like a Fire Stick or TV box and play HD content. You'll see that this is indeed a 1080p projector.
Well, unless you have really high standards for resolution.
So… you confirm that the input it works with is indeed 960x540? It downgrades the input it receives (from Netflix or elsewhere) and then expands that 960x540 information by a factor of 4 (equivalent to a digital zoom, where you always lose quality?) into a false 1920x1080, hoping to trick the human eye. Is this what is happening? From one dot, our PPM creates 4 dots?
No, I don't think it works like that. Videos and inputs have no visibility into how the display works. They just see a 1080p device, and the engine does the job of displaying the content in 1080p, not 540p.
Anyway, it really comes down to how you see it, not how it is done. It's up to you whether you see FHD or not. I guess there's no point arguing about this any further.
The only real issue, I think, was that Philips was not clear or accurate about the fact that it uses this technique.
It is definitely understandable that they use it, because otherwise the PPM might have been a lot bigger.
No. The video path is 1920x1080 at up to 60Hz, from the internal Android system or from external HDMI/USB-C video sources, until it reaches the DMD processing FPGA. At this point the 1920x1080 image is split into four, resulting in 960x540 video at up to 240Hz. Each of these four sub-frames is projected in a different position, basically shifting half a pixel to the right, half a pixel down, half a pixel left and then finally half a pixel up, returning to where it started. Rinse and repeat. The TI datasheet for the DMD device used, https://www.ti.com/lit/ds/symlink/dlp230np.pdf (page 17), explains this very briefly. The DMD is marketed as a 1080p device.
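If it helps to picture that split, here is a minimal sketch in Python/NumPy. It is my own simplification, not Philips' or TI's actual processing: it treats the four sub-frames as the four interleaved half-pixel-offset grids of the 1080p frame and ignores the optical spot overlap and the DLP compensation algorithms.

```python
# Rough sketch of the 4-way pixel-shift decomposition described above.
# Simplification: each 960x540 sub-frame is modelled as one of the four
# interleaved half-pixel-offset grids of the 1080p frame; optical spot
# overlap and the DLP image-processing compensation are ignored.
import numpy as np

W, H = 1920, 1080
frame = np.random.randint(0, 256, size=(H, W), dtype=np.uint8)  # stand-in 1080p image

# Four sub-frame positions; each (row_offset, col_offset) stands for one
# half-mirror-pitch shift of the projected DMD image (order is illustrative).
offsets = [(0, 0), (0, 1), (1, 1), (1, 0)]

# Split: each sub-frame is 960x540 and would be shown at 240Hz (4 x 60Hz).
subframes = [frame[dr::2, dc::2] for dr, dc in offsets]
assert all(sf.shape == (540, 960) for sf in subframes)

# Reassemble: interleave the four sub-frames back onto the 1080p grid.
rebuilt = np.empty_like(frame)
for (dr, dc), sf in zip(offsets, subframes):
    rebuilt[dr::2, dc::2] = sf

# In this simplified model no information is lost: every 1080p pixel is
# addressed exactly once per frame, just not all at the same instant.
assert np.array_equal(rebuilt, frame)
```

The point of the round-trip check at the end is that, at least in this model, nothing is downscaled or thrown away before it reaches the mirrors.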
To put it in simple words, it is a true 1080p image with no software trick, obtained with a combination of 1) the speed of the micromirrors and 2) a system that physically (through actuators) slightly shifts the image of the mirrors. It is pretty much the same technique that many 4K DLP projectors use.
Hello,
In order to get your opinion, I have taken 2 pictures (almost the same, 1 second apart) of the same 4K video on YouTube, one from my 1080p home projector and one from the PPM, under the same conditions (light, wall, dimensions, fps/Hz…). The PPM was updated with the latest firmware available. I tried to optimize the PPM, but it isn't really easy; give me your advice! My home projector:
Very interesting - thank you for sharing. With a bit of online exploration I found the http://www.ti.com/lit/ds/dlps056b/dlps056b.pdf - which is the real deal: the Texas Instruments DMD that indeed has 1920x1080 micromirrors and needs no special image trickery to display a full 1080p HD image - it is truly native. What our lovely little PicoPix Max has inside is the DLP230NP, which only contains 960x540 micromirrors, and with these "the fast switching speed of the DMD micromirrors combined with advanced DLP image processing algorithms enables each micromirror to display 4 distinct pixels on the screen during every frame, resulting in a full 1920 x 1080 pixel image being displayed", as mentioned in the document you linked (thank you once again). This is indeed a way of saying that it displays Full HD, but the real image being displayed is not fully native HD - it is 4 times smaller and goes through image processing algorithms to be turned into 1920x1080, losing the original quality it had. Am I understanding this correctly?
Hi Pedro. Your understanding is not correct, and you were probably misled by the "image processing algorithms". It is an algorithm that simply coordinates the movements of the mirrors, the LEDs and the actuators so that the image you see on screen corresponds to the 1080p source. Once again, the image you see is 1080p. It is totally different from scaling.
Hi Maxime, I'm sorry if I'm making this thread too long, but I am really trying to get to the bottom of this. Sorry for all the questioning. I understand that the displayed image is 1080p. There is an input, let's say from Netflix, with 1080p quality. This input arrives at 960x540 mirrors, which is 4 times fewer mirrors than pixels. I can imagine 3 possible scenarios (I could not find details in the Texas Instruments documentation on their site that I could really understand or that would clarify my doubts), and here they are:
1) The HD 1080p input image is downgraded to 1/4 of its resolution so that 1 pixel goes to 1 mirror. There is then mirror and byte trickery (hardware and software) to expand the downgraded image to a 1080p output. Counting the pixels, we have the right total, but the real quality is lower than true 1080p. This is the worst-case scenario, as we get a sub-optimal result.
2) The HD 1080p input image is sliced into tiny chunks of 4 pixels each. Each chunk is sent to one of the mirrors and gets displayed as a 1080p output (this sounds unlikely judging by the TI material, but it is beyond my knowledge to say otherwise). The display quality should be real 1080p.
3) The HD 1080p input image is sliced into all its pixels, and in an amazing - but I guess possible - piece of technical trickery, each mirror receives 4 pixels at 4 different moments, moving slightly each time to display them in the right position, so the whole thing is reconstructed as 1080p, every pixel in the right place, and given the super-fast speed at which it all happens we don't really notice it. So each mirror would juggle 4 pixels very quickly. The display quality should be real 1080p.
So, 1) sounds highly plausible, 2) is viable, and 3) is an amazing piece of technical complexity and wizardry! Is there a 4th option (probably the right one), or is one of the previous 3 correct? Once again, sorry for being so curious.
BTW, the Texas Instruments text seems to match option 2) or 3): "the fast switching speed of the DMD micromirrors combined with advanced DLP image processing algorithms enables each micromirror to display 4 distinct pixels on the screen during every frame, resulting in a full 1920 x 1080 pixel image being displayed".
It's a native 1080p picture produced with a 1/4-size mirror array, by moving the entire array a little 4 times per frame to put the correct pixels in the right part of the image.
In effect, a 60fps 1080p picture is produced by the projector running 540p at 240Hz.
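Just as a back-of-the-envelope sanity check (my arithmetic, not a figure from the datasheet), the pixel throughput works out the same either way:

$$960 \times 540 \times 240\,\text{Hz} = 1920 \times 1080 \times 60\,\text{Hz} = 124{,}416{,}000 \text{ pixels per second}$$

So the 540p mirror array running four times faster addresses exactly as many pixel positions per second as a native 1080p panel would at 60Hz.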
The only concern I have with this is that each pixel is only lit for 1/4 of the frame, whereas with a native 1920x1080 array each pixel can be lit for the entire frame.
Depending on how much the human eye is able to detect this, there will be some impact. How much, will come down to individual perception. I fully understand that within any 1/60th of a second all 1920x1080 pixels will be individually lit, just not at the same time.
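To put rough numbers on that (my own estimate, assuming 60fps content and four equal sub-frames):

$$t_{frame} = \tfrac{1}{60}\,\text{s} \approx 16.7\,\text{ms}, \qquad t_{subframe} = \tfrac{1}{240}\,\text{s} \approx 4.2\,\text{ms} = \tfrac{1}{4}\,t_{frame}$$

So each pixel position gets light for roughly 4ms out of every 16.7ms frame.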
It is not dissimilar to the old interlaced TV displays, and while the switching here is faster, that interlacing was definitely visible to the human eye.
Personally I think it's a worthwhile trade-off, but let's see how it behaves in reality, especially with fast-moving images.