How does grayscale on monochrome LCD work

Hello everyone, Quentin here. What type of screen are you currently using to watch this video? I guess many people will say it's a TFT LCD, be it a TN, VA, or IPS panel. Some others would probably say it's an AMOLED, which is also quite common nowadays. These display technologies have been a critical part of the mobile devices we use today. However, before any of these dominated mobile devices, there existed another type of screen: the passive matrix LCD, those grayscale screens with a greenish tint. Many portable devices made last century and in the early 2000s featured this type of display. While they are almost extinct in modern consumer electronics, they can still easily be found on various embedded devices. I've always been fascinated by the distinct look of these screens, so I picked up a few of them online and started using microcontrollers to drive them; that was more than 10 years ago. This is one of the screens I bought, so let's do something with it.

But before talking about this one specifically, I want to clarify that generally there are two types of screen interfaces: the video (pixel) interface and the MPU interface. A pixel interface only takes pixels and works like a video stream, with continuous pixel transmission and horizontal/vertical synchronization signals, sort of like VGA. The MPU interface is more like a co-processor interface: the screen has its own memory, and the host sends commands to the screen over some general-purpose protocol like SPI, I2C, etc. One major difference is who refreshes the screen. An LCD typically needs constant refreshing to keep the image displayed. If using the pixel interface, the image is streamed directly into the scanning circuit driving the panel; if using the MPU interface, the image is written into the screen controller's memory, and the controller then refreshes the screen asynchronously. Generally, for these monochrome screens, those with a resolution of 240x160 or higher tend to use the pixel interface, and for color TFT screens, those with a resolution of 640x480 or higher tend to use the pixel interface rather than the MPU interface. The same division exists for e-ink screens as well. These resolutions are for reference only and based on my own personal experience; please refer to the datasheet for the interface used on your screen. The advantage of using a pixel interface is that the screen output is synchronized to the host, so screen tearing can easily be avoided; the downside is that the host must be fast enough to push the pixels at the screen refresh rate, and it typically needs enough memory to hold the entire frame buffer. MPU screens, on the other hand, don't require these, but getting a tearing-free image on them can be tricky. In this video I'm going to focus on screens with pixel interfaces.

Back to our topic: this is a 220 by 100 pixel reflective FSTN LCD with the de facto industry-standard STN LCD pixel interface. I don't know the exact name for this protocol, but I can confirm the same protocol is used across many, many different pixel-interface STN screens. It's really just horizontal sync plus vertical sync plus a pixel clock plus an inversion clock, with a bunch of data lines transmitting multiple pixels per clock. I'm not going into details about the protocol, though; if you're interested, it's the same protocol as the screen I showed in the previous video, and you can find more detailed information there. In summary, the microcontroller sends the entire frame to the screen at a constant refresh rate of 60 to 120 Hz.
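To make that refresh loop concrete, here is a minimal sketch of how a host might bit-bang one frame over such an STN-style pixel interface. The pin names, the 4-pixels-per-clock packing, the GPIO helper, and the exact sync timing are my own assumptions for illustration; the real signal names and timing come from the screen's datasheet, and the inversion clock mentioned above is omitted for brevity.

```c
/* Hypothetical sketch: pushing one 1-bit-per-pixel frame over an STN-style
 * pixel interface (VSYNC + HSYNC + pixel clock + data lines).
 * Pin macros, bus width, and sync timing are assumptions, not a spec. */
#include <stdint.h>

#define LCD_W 220
#define LCD_H 100
#define PIX_PER_CLK 4                       /* assumed data bus width       */

extern void gpio_write(int pin, int level); /* assumed platform helper      */
enum { PIN_VSYNC, PIN_HSYNC, PIN_CLK, PIN_D0, PIN_D1, PIN_D2, PIN_D3 };

void lcd_send_frame(const uint8_t fb[LCD_H][LCD_W]) /* one byte per pixel, 0 or 1 */
{
    gpio_write(PIN_VSYNC, 1);                  /* mark the start of the frame */
    for (int y = 0; y < LCD_H; y++) {
        gpio_write(PIN_HSYNC, 1);              /* mark the start of the line  */
        for (int x = 0; x < LCD_W; x += PIX_PER_CLK) {
            gpio_write(PIN_D0, fb[y][x + 0]);  /* present a group of pixels   */
            gpio_write(PIN_D1, fb[y][x + 1]);
            gpio_write(PIN_D2, fb[y][x + 2]);
            gpio_write(PIN_D3, fb[y][x + 3]);
            gpio_write(PIN_CLK, 1);            /* latch them with one clock   */
            gpio_write(PIN_CLK, 0);
        }
        gpio_write(PIN_HSYNC, 0);
        gpio_write(PIN_VSYNC, 0);              /* VSYNC spans only the 1st line */
    }
}
```

Calling lcd_send_frame() at a fixed 60 to 120 times per second is what keeps the image on the panel; everything that follows in this video is about what to put into that 1-bit frame buffer.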
So let's just place an image on it. This is the source image, and this is the result. The issue is obvious: the screen is one-bit monochrome, so each pixel can only be on or off, without any shades in between. If I directly display the image by clamping each pixel value to 0 or 1, it looks like this. The good news is I can do better. Displaying images with a limited color palette is a well-researched and well-understood topic, and one answer is dithering. The main idea is to use a distribution of the available colors to create an approximation of colors not available in the palette. For example, if I apply dithering to the image, it looks like this: it's still output to a one-bit screen, but now I can see more detail compared to direct clamping. There are many different ways of doing the dithering, and the one I used here is based on error diffusion.

Let's take a closer look. I have an input image that's 8-bit grayscale, and the output is 1-bit; in other words, if I were to represent the output in the input's range, the output would be either 0 or 255, and anything in the middle is unavailable. To choose the output value based on the input, a quantizer is introduced: if the input is larger than half, or 127, then the output is 255, otherwise 0. This is what I've been doing here, clamping the image to black and white directly. Now bring error diffusion into the picture. Each time the quantizer chooses an output value that's different from the input, an error is introduced; the error is simply the difference between input and output. In the previous case the error was simply dropped. However, if the goal is to minimize the error, then I should do something to compensate for it. Error diffusion means diffusing the error to neighboring pixels; as a result, when the quantizer processes those pixels, the error has a chance to be compensated. This is what I showed before.
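As a concrete sketch, here is one common error-diffusion kernel, Floyd-Steinberg, applied to an 8-bit grayscale image to produce a 1-bit output. The video doesn't say which kernel was actually used, so treat this as one representative choice rather than the exact implementation; the image size and buffer layout are also assumptions.

```c
/* Floyd-Steinberg error diffusion: 8-bit grayscale in, 1 bit per pixel out.
 * One common kernel; the video does not specify which one it uses. */
#include <stdint.h>

#define IMG_W 220
#define IMG_H 100

void dither_error_diffusion(const uint8_t *in, uint8_t *out)
{
    static int16_t work[IMG_W * IMG_H];   /* pixels plus diffused error      */

    for (int i = 0; i < IMG_W * IMG_H; i++)
        work[i] = in[i];

    for (int y = 0; y < IMG_H; y++) {
        for (int x = 0; x < IMG_W; x++) {
            int idx = y * IMG_W + x;
            int q   = (work[idx] > 127) ? 255 : 0;  /* 1-bit quantizer       */
            int err = work[idx] - q;                /* quantization error    */
            out[idx] = q ? 1 : 0;

            /* Spread the error onto neighbors that haven't been visited yet. */
            if (x + 1 < IMG_W)
                work[idx + 1] += err * 7 / 16;
            if (y + 1 < IMG_H) {
                if (x > 0)
                    work[idx + IMG_W - 1] += err * 3 / 16;
                work[idx + IMG_W] += err * 5 / 16;
                if (x + 1 < IMG_W)
                    work[idx + IMG_W + 1] += err * 1 / 16;
            }
        }
    }
}
```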
But something is off. I remember these screens can do many levels of grayscale; on these devices they can display different shades of gray or green. However, almost all the screens I got are one-bit mono. Initially I thought this was probably because the screens I got were meant for industrial control or similar applications, so they don't need it, and only screens for consumer electronics would have grayscale support. That turned out to be false: I took apart some devices that do 16-level grayscale and found out that the screens themselves are one-bit screens. So what's the trick? Let's take a deep dive.

Now I have a screen that can display either black or white, and I would like to display different shades of gray on it. How do I do that? Well, let me use an analogy: I have a digital output pin which can be either high or low, and there is an LED connected to it; instead of just lighting it up or shutting it off, I want different brightness levels. This sounds familiar. There is a commonly used technique called PWM for doing this: define a period, then for some time in that period the output is high and for the rest it's low, and the whole period gets repeated over and over. If this is fast enough that the eye can't detect it, one perceives some brightness in the middle. Can I do the same on these monochrome LCDs? Yes and no. There are some LCDs that have PWM built into the driver chip, so they natively take multiple bits per pixel as input and modulate the grayscale during the screen refresh process. One example is the Game Boy's screen, which natively takes 2-bit-per-pixel input and produces 4-level grayscale. However, based on my own observation, this is not very common; most other screens, like the ones on these PDAs, only take one bit per pixel and only do black and white natively.

Well then, if the screen driver chip doesn't have PWM built in, can I still do PWM at a higher level? You see, to keep displaying, the LCD needs to be constantly refreshed, so if I look at a single pixel, it's being driven at a constant interval, and I have control over its level on every refresh. In this sense, I could simply put a PWM waveform here. For example, if I want 16-level grayscale, I can use a PWM period of 15 frames; then, based on the output level, the duty cycle would be 0 to 15 cycles. Sounds great, but does it work? Let's code it and see. There is a 1-bit-per-pixel frame buffer that gets sent to the screen, and there is an 8-bit-per-pixel buffer that holds the grayscale image to be displayed. Now I just need to write a function that does the software PWM between these buffers. I set up a PWM period counter which goes from 0 to 14, incrementing on each frame; then I loop over each pixel and compare the incoming pixel value with the counter value: if it's larger, output one, otherwise zero. With different pixel values, this produces a different duty cycle on each pixel.
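Here is a minimal sketch of that per-pixel software PWM, assuming a 220x100 screen, an 8-bit grayscale buffer, and a 1-bit output buffer; the buffer names and the exact scaling from 8 bits down to 16 levels are my own choices, not necessarily what the video's code does.

```c
/* Software PWM across frames: 16 target levels, period of 15 frames.
 * Called once per refresh; 'phase' advances 0..14 and wraps around. */
#include <stdint.h>

#define LCD_W 220
#define LCD_H 100

static uint8_t gray_fb[LCD_W * LCD_H];  /* 8-bit grayscale source image   */
static uint8_t mono_fb[LCD_W * LCD_H];  /* 1-bit-per-pixel output (0/1)   */
static uint8_t phase;                   /* PWM period counter, 0..14      */

void pwm_build_frame(void)
{
    for (int i = 0; i < LCD_W * LCD_H; i++) {
        uint8_t level = gray_fb[i] >> 4;        /* 0..255 -> 0..15 levels  */
        mono_fb[i] = (level > phase) ? 1 : 0;   /* duty cycle = level/15   */
    }
    phase = (phase + 1) % 15;                   /* next frame, next phase  */
}
```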
Now let's run it and see how it looks. It doesn't seem to work very well: the image is quite flickery, and the reason should be obvious. The minimum unit of output is a frame, and at a 120 Hz refresh rate that's about 8.3 milliseconds. If the PWM period is 15 cycles, that's 15 frames or 124.5 milliseconds total, which translates to a PWM frequency of only 8 Hz; surely it will flicker pretty badly. I could reduce the PWM period down to, say, three, so the PWM frequency goes up to 40 Hz for a more or less steady image, but obviously the number of grayscale levels is then reduced to four. So I can get some grayscale, and it's all good, right? Well, something is still off. I know these devices can do 16-level or even higher grayscale out of these one-bit screens, so there must be a better way.

After reading some papers, I found out this is typically done using optimized sequences. For example, when outputting 50% gray with a PWM period of 16, it would be 8 cycles high and 8 cycles low, then repeat; but to reduce flicker, it could instead be just 1 cycle high and 1 cycle low, then repeat, still maintaining the same 50% duty cycle. Sounds great, so let's implement that. I designed some naive sequences for 32-level grayscale, where the duty cycle linearly corresponds to the pixel value. Some may recognize this and say, isn't this just mimicking PDM? Yes, it is, but let's hold on to that for now. The code for using such sequences is also straightforward: again I keep a periodic counter, this time from 0 to 30, but instead of directly comparing the pixel value with the counter value to produce a PWM waveform, both are used as indices into the sequence lookup table to derive the output value.

Now, running the code, I'm almost getting 32 grayscale levels, except some levels are still flickery. This is totally understandable: for these levels, sequence optimization doesn't improve much. In the extreme case, the lightest level that's not purely white is just one pulse out of thirty-one frames; it doesn't matter where you put that pulse. What to do with these flickery levels? Well, there are two possible ways to fix that. One is to simply not use those duty cycles. In the sequences I crafted, the duty cycle linearly corresponds to the input gray level; this is not really required, and it's also not what's typically done. There are many duty cycles that have a shorter period but don't appear in this 31-frame sequence, for example 5/9, 3/5, 3/4, 7/8, etc.; I can use those to replace the ones that are too flickery. I also don't have to start with a single pulse; I could start with, say, a few cycles at the lightest gray level. This would obviously screw up the linearity, but the screen response is not linear to begin with anyway. Now, with a probably better sequence table, run it again: a little bit better, but still far from good. I said there are two ways to fix this, and now it's time for method two. Looking at the flickery shades: imagine it were just one pixel; then it wouldn't be all that obvious that it's flickering. A whole block of pixels updating the same way is what makes it especially easy to catch with the eye. In other words, if the pixels are not flipped on the same frame, it won't be that obvious. There are multiple ways to achieve that, like reversing or rotating the sequence table based on the pixel location (even/odd, etc.), or simply using some random function. I'm going to just use an LFSR to add an offset into the sequence for each pixel; remember to reset the LFSR at the start of each frame so each pixel always gets the same offset. Run it again: much better, right? This method of using sequence tables, along with such decorrelation tricks, to generate grayscale is typically called frame rate control, or FRC for short, even though it doesn't really control the frame rate at all. As a result, I've implemented what's typically done in a grayscale LCD controller from the last century, and I've got 32 levels of grayscale out of this one-bit LCD. The difference is that these are typically implemented in hardware, while here I'm emulating them in software.
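A sketch of that lookup-based approach is below. The table here is filled with the naive linear sequences only, and the LFSR polynomial, seed, and buffer layout are placeholders I chose for illustration; the hand-crafted, flicker-optimized tables from the video are not reproduced.

```c
/* FRC-style grayscale: 32 levels, 31-frame sequence table, with a per-pixel
 * LFSR offset so neighboring pixels don't flip on the same frame. */
#include <stdint.h>

#define LCD_W   220
#define LCD_H   100
#define FRC_LEN 31

static uint8_t frc_table[32][FRC_LEN];   /* frc_table[level][frame] = 0 or 1 */
static uint8_t gray_fb[LCD_W * LCD_H];   /* 8-bit source image               */
static uint8_t mono_fb[LCD_W * LCD_H];   /* 1-bit output for the screen      */
static uint8_t frame;                    /* frame counter, 0..30             */

void frc_init_naive_table(void)
{
    /* Naive linear sequences: level n is 'on' for n of the 31 frames.
     * Hand-crafted sequences would replace these to reduce flicker. */
    for (int lvl = 0; lvl < 32; lvl++)
        for (int f = 0; f < FRC_LEN; f++)
            frc_table[lvl][f] = (f < lvl) ? 1 : 0;
}

void frc_build_frame(void)
{
    uint16_t lfsr = 0x1ACE;              /* reset each frame: stable offsets  */
    for (int i = 0; i < LCD_W * LCD_H; i++) {
        /* Advance a 16-bit Fibonacci LFSR (taps chosen as an example). */
        uint16_t bit = ((lfsr >> 0) ^ (lfsr >> 2) ^ (lfsr >> 3) ^ (lfsr >> 5)) & 1;
        lfsr = (lfsr >> 1) | (uint16_t)(bit << 15);

        uint8_t level = gray_fb[i] >> 3;             /* 0..255 -> 0..31      */
        uint8_t phase = (frame + lfsr) % FRC_LEN;    /* per-pixel offset     */
        mono_fb[i] = frc_table[level][phase];
    }
    frame = (frame + 1) % FRC_LEN;
}
```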
In the early part of this video I introduced the concept of dithering, and in fact all the stuff I've been talking about is dithering: the initial error-diffusion design I showed is spatial dithering, or dithering over an area; the later PWM and grayscale sequences are temporal dithering, or dithering over a period of time. Now, thinking about error diffusion: if it's possible to diffuse the error over an area, would it be possible to also diffuse the error over a period of time, like a temporal error diffuser? Let's try that as well. Back in the function where it picks the value from the sequence table, I modify the code to quantize the pixel and then calculate the error value; but instead of diffusing the error to neighboring pixels, the error is accumulated in a per-pixel error buffer. When the next frame comes, the quantization error is added back to the quantizer input, so the error is taken care of. Did I just reinvent the first-order delta-sigma modulator?
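Here is a minimal sketch of that temporal error diffusion: a first-order delta-sigma modulator running independently on every pixel. The buffer names and integer ranges are assumptions for illustration.

```c
/* Temporal error diffusion: a first-order delta-sigma modulator per pixel.
 * Each pixel's quantization error is stored and fed back on the next frame
 * instead of being diffused to neighboring pixels. */
#include <stdint.h>

#define LCD_W 220
#define LCD_H 100

static uint8_t gray_fb[LCD_W * LCD_H];  /* 8-bit grayscale input            */
static uint8_t mono_fb[LCD_W * LCD_H];  /* 1-bit output sent to the screen  */
static int16_t err_fb[LCD_W * LCD_H];   /* per-pixel accumulated error      */

void dsm_build_frame(void)
{
    for (int i = 0; i < LCD_W * LCD_H; i++) {
        int in  = gray_fb[i] + err_fb[i];   /* feed back last frame's error  */
        int out = (in > 127) ? 255 : 0;     /* 1-bit quantizer               */
        err_fb[i] = (int16_t)(in - out);    /* remember the new error        */
        mono_fb[i] = out ? 1 : 0;
    }
}
```

Over many frames, the density of 1s at each pixel converges toward the pixel's gray value, which is exactly the pulse-density behavior discussed next.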
I said before that the whole optimized-sequence thing is just mimicking PDM, or pulse density modulation, and now with the delta-sigma modulator I'm getting a real PDM signal. Run it: does the result look familiar? Yes, it's very similar to the previous case with the optimized sequences, but without any randomness. If I add some randomness by adding a +1 or -1 random number to each pixel every time, then we get some randomness back, and it looks pretty good. Thinking about that again: adding a small random number is basically injecting a small amount of noise, and injecting noise into a sigma-delta modulator... I just reinvented a first-order noise shaper. Cool.

Now the only issue left is gamma correction, or grayscale calibration. As you can see, the screen response is quite nonlinear: many levels near white and black are not usable. To get more usable levels, I'm bumping the total number of grayscale levels up to 64. Then I let it cycle through all 64 levels and take a photo of each level, and I use a simple Python script to crop the image center and calculate the average sRGB value. If I plot this value against the level number, the result is quite nonlinear, which matches the observation. The goal is to have a straight line here. Note this is not linear brightness, because the display should be tuned for a gamma of 1.8 or 2.2; but since the camera is also doing gamma correction, I just need to shoot for linear sRGB values and the gamma should be automatically taken care of. How am I going to do that? I cannot change the relationship between the output value and the brightness level, but I can change how the input is mapped to the output. The input image has 256 shades while the output has 64, so I can just craft a table that maps the 256 shades onto the 64 and compensates for the nonlinearity. Color lookup table calibration on a monochrome LCD: did I just go a little bit too far? Now it looks really good.
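One plausible way to build such a table is sketched below: for every input shade, pick the output level whose measured brightness is closest. The video only says a table was crafted from the measurements, so the exact method and the measured[] data source are assumptions here.

```c
/* Build a 256 -> 64 calibration LUT from the measured screen response.
 * measured[level] is the average sRGB value photographed for each of the
 * 64 levels; for every input shade we pick the closest measured level.
 * This is one possible derivation, not necessarily the one used in the video. */
#include <stdint.h>
#include <stdlib.h>

static uint8_t calib_lut[256];           /* input sRGB shade -> output level */

void build_calib_lut(const uint8_t measured[64])
{
    for (int in = 0; in < 256; in++) {
        int best = 0;
        for (int lvl = 0; lvl < 64; lvl++) {
            if (abs((int)measured[lvl] - in) < abs((int)measured[best] - in))
                best = lvl;
        }
        calib_lut[in] = (uint8_t)best;
    }
}
```

At run time the lookup sits in front of the modulator: each 8-bit input pixel is mapped through calib_lut[] to one of the 64 levels before being temporally dithered.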

To be honest, I didn't come to this stage by accident. I've worked with delta-sigma modulators and noise shapers in audio applications before, so I know them well, and I've been wondering for a long time whether they could be applied to driving these LCDs. I looked through many different LCD controller manuals and searched many relevant papers, but it seems like no one is doing this. Finally, I decided to just try it myself and see if it works. Now we can clearly see it works, but is it worth it? You see, in order to implement this feedback loop, I had to create a state buffer to remember the accumulated error for each pixel. For the controller, the processing of each pixel is now a read-modify-write operation on a large buffer, compared to a simple table lookup, and this additional complexity could mean a lot. In this proof-of-concept experiment I'm using a microcontroller to emulate an LCD controller, but in real-life applications that would be an ASIC. If the LCD controller is one of these self-refresh controllers, it would need to double the size of its internal RAM; but if the LCD controller is one of those without any memory at all, like a controller taking VGA input and outputting to the screen, this would mean it now needs an external memory controller and a whole set of pipelines just for this. This could be why it's never really been used; instead, people use these pre-crafted sequences to drive the screen.

Okay, enough about disadvantages; does it have any advantages? Let's compare. To ensure both sides have the same brightness response for the same level, I've reverted to using the linear duty cycle sequences plus the same calibration lookup table. As we can see, with static images there isn't much difference: in this test, using a noise shaper doesn't yield much meaningful advantage over the traditional approach. Now let's look at the dynamic response, basically playing back a video. The video I'm using is Sintel, a movie by the Blender Foundation, licensed under Creative Commons Attribution 3.0. Just for the sake of fun, first let's look at the results with direct clamping, one-bit spatial dithering, and four-level PWM grayscale. Next, let's bring in the quasi-industry-standard method of using grayscale sequences: the video now looks much better, right? Then bring in the noise shaper. Now we can see the difference: with the noise shaper, the dynamic response is much improved compared to using pre-crafted sequences. Now let's explore whether this could be tweaked a little bit more. For example, currently this is a first-order noise shaper; what if I use a second-order noise shaper? Well, if we say the video frame rate is 24 frames per second and the screen is being refreshed at 120 Hz, this is essentially an oversampling ratio of only five. Given this low ratio, higher-order noise shapers wouldn't improve much. So how about increasing the ratio? I can overclock the screen to 240 Hz and see how well it does.

In conclusion, in this video I explored and implemented various methods of producing grayscale images on these passive matrix monochrome LCDs. In addition to using the industry-standard methods, I also tried applying noise shaping techniques commonly used in other applications, and got some very good results as well. That's it for today. I'm including the full-length movie played on this screen, using the second-order noise shaper at a 240 Hz refresh rate, after this. I will see you in the comment section, and see you in the next video. Bye.

2022-04-30 14:05
