Binarizing QR codes

Captions
Yo, what's up, welcome back. Today we are working on our QR code scanner. We have this version of the scanner that works on idealized cases: small QR codes, perfectly axis-aligned, perfect lights and darks, basically the kind you would see before you print it. Now we want to start looking at how we go from a noisy real photo of a QR code, which maybe has shadows and imperfections, to something that we can scan.

So on Monday we built this little debug UI. We made a really rough test image: we took some photo of a QR code from the internet, cranked up the noise, and added a shadow to simulate, you know, a bad phone in a dark bar with lights and shadows. The debug UI isn't running right now, hold on: `zig build`, and run this. Okay. What we can do here is select a region; it will take that region, blur it a little bit, and generate this histogram over here. Then we started working on the world's worst binning algorithm.

The idea is that depending on where you've selected, what is considered dark and light is going to change. Down here in this region your lights are in this area and your darks are in this area, where the histogram indicates counts per brightness: the bars on the left are the number of dark pixels, the bars on the right are the number of bright pixels, and everything in between. Where it gets interesting is that different areas of the image have different ranges, and across the boundary of the shadow, what counts as light and dark is kind of confusing.

So what we want to do is figure out how to categorize these. Our first algorithm just looks at the tallest bin in the histogram and puts it into a clump, and over time we generate more clumps; we still have to figure out some sort of merging strategy, which we haven't done yet. That's what I want to work on first: how do we turn this into two distinct categories, light and dark?

I think we have to fix this algorithm a little. We probably want to remove some of the noise from the histogram and get it nice and round so that our algorithm does a bit better. We need to figure out what happens if there are more than two bins: in this scenario we should end up with four categories, so how do we decide which ones are light and which ones are dark? Maybe it's by limiting the region so that we only ever need to consider two: if we're looking at a pixel in the center of this region, maybe it's good enough that over here it biases towards the light side and over here it biases towards the dark side. Maybe that's fine, maybe it's not. Once we've figured that out, we can start actually applying a binning algorithm to the picture. What we want at the end is an image with exactly two colors, perfectly light and perfectly dark, that still represents the original. That is the plan.

So the first thing I want to do is smooth out the histogram a little. Specifically, if our region is small, this is hard to interpret: it's very noisy and possibly flat, like you could imagine drawing a line here. If we were to smooth this out a little bit by averaging the bins, I think
that we'll get something that might be a little more defined. I want to try that; at the very least, let's see what happens.

Since the other day this has all been merged into the actual QR code decoder repo, so it's a little different from what we saw, but it's pretty close. Somewhere in here we have this server, and our web UI will request a histogram, which we handle in handleHistogram. This is generated from our application state, which says: hey, give me this histogram. For now let's just add a smoothing radius and make it some usize.

So: we loop over every pixel in the image and say, hey, we counted this many pixels at this brightness; that's here. Then we smooth once we have all the counts. For 0 to len, we go over some smoothing window: the smoothing start is i minus smoothing radius over two. But this could underflow at zero, so we need a saturating subtraction, which I think Zig has built in; it's either pipe-minus or minus-pipe. Let me look up Zig's saturating operators... here we go: the pipe goes after the minus, so `-|` and `-|=`. The smoothing end is the minimum of the histogram length and i plus the smoothing radius. Then we loop from smoothing start to smoothing end with an index j (the name doesn't really matter) and accumulate an average: average plus-equals unsmoothed at j. I'll call the source array `unsmoothed`, because we don't want to adjust it while we're iterating. At the end we say the result at i equals the average. We don't even have to divide, because everything just scales relative to everything else, so it's probably fine.

Then we do the same thing for the result, and we don't actually have to stash the average: we can just say result at i plus-equals unsmoothed at j.

Now we have to set this smoothing radius. We might as well parameterize it immediately, because I know I'm going to want to see how it looks at different smoothing amounts. So in the server, when we handle the histogram, we look at the query parameters of the request. (Chat: "not late today.") Nice, hooray, you made it. I kind of assumed that if you were late once you'd be late every single time, because your time zone lines up badly with the stream, and I stream at the same time every day. Pretty cool that you managed to make it.

So we have some query parameters, and we say: if we see a parameter whose key is, let's call it smoothing radius, we parse it as an int. We'll set some default. (Chat: "it's 238.") What time zone is that, western Europe? UK time, a little east of England? Spain? Nice, cool. Okay: we look for a smoothing radius parameter, parse it as an int into a usize, and default it to zero, no smoothing. I think that works with the way we set up the histogram: minus zero to plus zero just does nothing. Actually, I'm not sure about that; we need to make sure the app can handle no smoothing. We're iterating from i minus radius over two to i plus radius over two, supposedly, and we should handle odd numbers somehow. What we want is for i minus 0 to i plus 1 to end up just copying i, so the start subtracts smoothing radius over two, and the end adds smoothing radius over two plus smoothing radius mod two, I think. Then a smoothing radius of one does what we want. Let's try that, maybe; I don't know.

Here we parse the string and turn it into an int: std.fmt.parseInt as a usize on param.val, in base 10, and then we call our histogram function with the smoothing radius. Then everything else just happens. We need to make sure we can actually get query parameters here, so we say: if somebody gives us extra trailing stuff on this histogram path, don't worry about it, try to parse the query params anyway. And maybe that just runs now. Not quite; we'll throw zero smoothing in there on initialization.

At line 280, it's almost like we want to stash the smoothing radius, because when we pick a new region of interest we reset the cluster finder and pull in a histogram for it. Maybe we can just use one here... no, we do actually need to stash it: somebody's going to say, hey, I want to set the smoothing radius. The stashed hist smoothing radius will be a usize; this makes more sense, I think. Then histogram won't take the radius as a parameter anymore, it'll just use whatever is stashed, and we'll have a setSmoothingRadius.
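The smoothing pass worked out above is written in Zig on stream; as a rough sketch of the same logic (the function name is my own, with `max(..., 0)` standing in for Zig's saturating `-|`), it might look like this in Python:

```python
def smooth_histogram(unsmoothed, radius):
    """Box-smooth a histogram by averaging each bin with its neighbours."""
    n = len(unsmoothed)
    out = [0.0] * n
    for i in range(n):
        # saturating subtraction: Zig's `i -| radius / 2`
        start = max(i - radius // 2, 0)
        # window end handles odd radii (i + radius/2 + radius%2), clamped to n,
        # and always covers i itself so radius 0 or 1 is a plain copy
        end = max(min(n, i + radius // 2 + radius % 2), i + 1)
        # the stream skips this division and lets everything scale together;
        # dividing keeps edge bins (which see a smaller window) comparable
        out[i] = sum(unsmoothed[start:end]) / (end - start)
    return out
```

With a radius of 0 or 1 the window only covers `i`, so the histogram comes back unchanged, matching the no-smoothing default discussed above.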
That's camelCase in this language, and it takes in the radius; then we just say self hist smoothing radius equals radius. Then here we use self hist smoothing radius, and I think we're chilling.

Okay, on the server side we now need to add this set-hist-smoothing route so we can match it; this one doesn't need the beginning-match type anymore. We have a new action, set hist smoothing, and since somebody is going to switch over all of these, we wire it up: set hist smoothing goes to handleSetHistSmoothing, which looks very similar to handleHistogram, so we split the query-parameter matching out and slap it down here. The handler then just sets the hist smoothing radius to the parsed smoothing radius. It doesn't have a default anymore: if somebody calls this, they'd better actually send the value. So it starts as null, we assign it if we see it, and if it's still null down here (call it the optional version), we log "smoothing radius not provided" and return error.InvalidArgument, maybe.

At the end of this we also have to send an empty response, because the client will be waiting for confirmation. So here we say: we did it, we succeeded, thank you for asking us to do something for you.
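The start-as-null-then-check pattern for the new endpoint can be sketched like this (hypothetical Python names; the real handler is Zig and returns `error.InvalidArgument`):

```python
def parse_set_hist_smoothing(params):
    """Require a smoothing radius in the query parameters: no default."""
    radius = None
    for key, val in params.items():
        if key == "smoothing_radius":
            # std.fmt.parseInt(usize, param.val, 10) in the Zig version
            radius = int(val, 10)
    if radius is None:
        # mirrors the log-then-error path described on stream
        raise ValueError("smoothing radius not provided")
    return radius
```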
In app.zig at line 122 there's an argument we don't need anymore, and in server.zig at line 189 we forgot to write try. Otherwise it looks like we're chilling. Okay, this is working.

Now let's spin up a new shell so we can run the server in the background while we edit the JavaScript live. Give it a sec; it's thinking, it's got a hard life. `zig build`, run, and we set the www root to source/res. Okay, let's add a little slider here: input type equals range, min equals 1, max equals, I don't know, 100, like this. Then a label with for equals... smoothing range, yep, and the slider's id is smoothing-range. Refresh: we have a smoothing range slider I can move around. We need to set the initial value, so value equals 1. There we go.

Now, in index.js, I can attach something to this slider: on change, call updateSmoothingRadius. That's an async function, and it does something very similar to this one: it calls, what did we name it, set hist smoothing, with the smoothing radius parameter set to the value we grab from the element with id smoothing-range. Shove that in here. We don't need an image refresh here, but we do need a histogram refresh.

Okay, it's doing nothing. Why not? getElementById is null. Which one, this one? Why? Oh, it's because the id is smoothing-range. Okay, let's try again: refresh, and are we chilling? No, we're not chilling. Refresh. Oh, there we go, nice. Yeah, so we start from here and we can kind of increment it here. Oh yeah. We can probably get pretty wild with it, to be honest. That's so cool, look at that.

Okay, on a small area like this, a value around there is pretty good. Does that extend anywhere else? Yeah, it feels pretty reasonable overall; we could probably do a little less, maybe something like that. I was going to say I don't know what's going on here, but I guess the region is just too small. Let's assume that when we do the real thing we'll be looking at a region around this size; that feels good to me, maybe almost a quarter of the image, maybe a little smaller. We need enough data that the histogram is reasonable, but not so much that we're looking at the whole image, because we want to preserve some sort of local information. Maybe. Or maybe global is fine, to be honest: maybe we look at the whole thing, identify that there are some dark peaks and some light peaks, and then correlate them together. Maybe we walk over the whole image selecting regions, and there will be trends: in this section of the image we always see the cluster here and the cluster here, and on this side we always see this dark cluster and this light cluster, and over time we correlate those with the full image, as in which clusters we expect to be dark and which light. The way it is right now, where we just start with the next biggest thing, probably works a lot better now.

(Chat: "can you test regions against an expected pattern?") I guess the problem is that I don't know what the expected pattern is. I'm expecting to have light and dark, but that's about all I know. (Chat: "the actual shape, not the histogram.") Like, I'm
expecting a finder: if I'm looking for the finder patterns, maybe I'm expecting this shape here, and maybe there's some image transformation that gets me there. But I feel like before we start on that, we just need to get to black-and-white and denoised. That's how I'm feeling right now; it could be the wrong approach, but we'll see. Let's keep going and see what happens.

Okay, so we kind of did this. Let's say we're at 2350, and let's look at the binning algorithm. Right now what it does is sort the histogram and say: which point is tallest? I'll look at that one. Which point is the next tallest? I'll look at that one. And if a point is close to one already marked into a cluster, it merges itself into that cluster. What I want to do instead is start by walking down the hill: we start with the center bin and just keep going until we hit some threshold, then move on. Maybe that gives me something a little better. So, idealized version: while we walk to the left and we see a decrease (not even a large decrease, just any decrease), we keep going until we see an increase. Then we'd take this section as one thing, this section as another, and this as another. Okay, let's try that.

(Chat: "found this Twitch stream from the YouTube algorithm, by the way, hello.") That's good to know, because that's part of the strategy: I was hoping that by posting to YouTube, where discovery is a little stronger than on Twitch, people would find the stream. (Chat: "saw the Zig, that was neat.") I also think it's neat, which is why I'm doing it; glad that extends to others as well.

So: we grab the biggest bucket and ask, does it belong to any cluster? If it doesn't, we add it to a new cluster. And maybe, instead of adding a single item, we add a range: a min and a max, and when we add a cluster we walk left until we see an increase. (Chat: "is that vimwiki? I've heard that a couple of times.") I don't know what vimwiki is; this is just a markdown file that I type into. I don't think I need anything more advanced than that, but maybe one day I'll discover that I do.

Okay, so min starts equal to the biggest bucket, and while... we'll just say while true for now. I'm trying to figure out whether we need some sort of extra smoothing; maybe we don't. We say: if the histogram at min minus one is less than the histogram at min, decrement min; if min equals zero, break; otherwise break. Then the same thing for max in the other direction: if max plus one equals the histogram length, break; if the histogram at max plus one is less than the histogram at max, increment max. Then we add the cluster with its min and max.

Let's double-check the membership logic. We used to say: if the bucket is less than the cluster min but within some epsilon of it, we add it. That's maybe a little sketchy, so let's get rid of the epsilon and just say: the bucket belongs if it's greater than or equal to the cluster min and less than or equal to the cluster max. We've already covered the other case, so we won't worry about it. Let's just see what happens if we do that. It doesn't compile, obviously.
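The walk-down-the-hill idea just described can be sketched in Python (names are my own; the stream's version is Zig): seed a cluster at the tallest bin, extend its min left while the histogram strictly decreases, then extend its max right the same way.

```python
def grow_cluster(hist, peak):
    """Grow a cluster [lo, hi] outward from a peak bin by walking downhill.

    Extend left while the next bin is strictly lower, then right the same
    way; stop at the first increase or at the histogram edge.
    """
    lo = hi = peak
    while lo > 0 and hist[lo - 1] < hist[lo]:
        lo -= 1
    while hi + 1 < len(hist) and hist[hi + 1] < hist[hi]:
        hi += 1
    return lo, hi
```

As discussed below, any uptick, however slight, stops the walk, which is exactly the jitter problem the stream runs into next.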
Why would I expect anything different? Here I have to spell out the usize... oh no, it's because I wrote const while mutating this variable. And this should be max, and max here as well. All right, refresh. Boom. Next, next. Okay, what if I smooth more?

Now, why is it only walking to the left? Max equals max plus one... oh, it's because I wrote min in here. Oopsie daisies. Refresh. Oh, core dump, nice. What did I do... oops, this should be: if max plus one is equal to the histogram length. Oopsies. All right, let's try again: smooth a little more, and... no, that didn't do what I thought it would. Something here is very wrong. Oh my God, I keep writing min. Dude, this is why functions are good: they limit scope so you don't mess up and use the wrong variable.

Interesting: it's not working like I thought it would. If I look here... oh, there is a very slight increase. Okay, so we also need to handle some amount of jitter in the signal. How do we determine how much jitter to allow? There's a very slight uptick here that's making the algorithm stop. So I guess we'll use a percentage: if a bin goes up by less than n percent, we allow it, we eat that bucket as well. Maybe we call it epsilon, or jitter; I don't really have a good word for it, but it's: how much can the histogram increase while we still consume the bucket?

But that's not really what I want either, because what if you have a slow hill climb? If the histogram looks like this and you start here, you'll go down, down, down, but if each of these later increases is less than 10%, you're going to keep going all the way and consume everything, which isn't what we want. We almost want: if the trend line is positive, stop; you almost want to take the derivative. (Chat: "perhaps take min and max into account.") Oh yeah, I kind of like that: you keep walking and look at the slope from max to min, and if we have a steeper slope, maybe. I mean, you could also just smooth more, although even with a lot of smoothing you still have enough jitter that it's a problem. I feel like this is obvious and I'm struggling.

(Chat: "are we searching for peaks?") I'm more searching for extents; I'm trying to bisect, I guess. I'm trying to consume this whole segment here. We start here, and this part is pretty good, this is what I wanted: it goes along eating bucket after bucket, and then it hits here and stops. Okay, let's keep going; maybe we can handle the rest by merging clusters more effectively.

First, I want to fix this "next" spam, so I'll put a while true in here; instead of returning at this point we continue, and we break at the end. Biggest bucket becomes a var, undefined, set here, used down here. Then we need some sort of cluster merging, but I don't know what that looks like yet. Let's just start here and see what happens.

Okay, that's obviously not doing what I wanted: it's eating itself, which is bad. What's happening? This should be return... no, this should be continue: if our bucket is already within some cluster, we don't use it. Why am I consuming another bucket here? I guess it's probably in this hill-climbing stuff... no, I'm not.
Okay, that's fine. What's happening here? We must just not be hitting this case. (Chat: "when the slope changes direction, it's a split.") Yeah, I agree; that's effectively what I'm doing now, it's just that I'm only considering the slope over a distance of one, which maybe is wrong. You almost want an average; you want to smooth out the line, if that makes sense. Which, I guess, you could get by doing more smoothing iterations: say you smooth over three elements, but do it two times, and maybe that gives a more consistent slope. Have to think about it.

Because there's this thing, kernel density estimation, which I think does almost exactly what we want here: instead of producing bins like this, it creates a smooth function. And I'm pretty sure all they do is run a kernel over the data; they're essentially doing histogram smoothing. Where do they define these things... here: you have some kernel that says, multiply the things within the range negative one to one of me by some weight. The moving average we're doing is essentially the box version of this, but they have other distributions that weight the stuff closer to you a little more strongly. Either way it says: look at the points within some distance of me and average them in some way, which is effectively what we're doing with a sliding window over the histogram. Then they... I think they just sum the thing and divide by h, the smoothing parameter, called the bandwidth. I see; so that's the part we're not doing, but it's effectively the same. And then maybe we get some sort of nice differentiable thing out of it.
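For reference, a minimal Python sketch of the kernel density estimate described here (my own names): place a kernel on every sample, sum, and divide by n times h, where h is the bandwidth. The box kernel reproduces the moving average we already have; the triangular kernel weights nearby samples more strongly.

```python
def box(u):
    """Uniform kernel: weight 1/2 inside [-1, 1], the moving-average case."""
    return 0.5 if abs(u) <= 1.0 else 0.0

def triangle(u):
    """Triangular kernel: weight falls off linearly with distance."""
    return max(0.0, 1.0 - abs(u))

def kde(samples, h, xs, kernel):
    """Kernel density estimate at each x: sum kernels over samples, / (n * h)."""
    n = len(samples)
    return [sum(kernel((x - s) / h) for s in samples) / (n * h) for x in xs]
```

The divide-by-h step is the part the stream's smoothing skips, but as noted above the shape comes out effectively the same.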
But for now I want to fix, at the very least, the edge cases here. We need the same condition at each border, so: pub fn anyClusterContains, taking a point as a usize and returning a bool, and it just runs this loop. Boom. So this now becomes: if self clusters anyClusterContains the biggest bucket, continue. Down here: if min equals zero, or anyClusterContains of min saturating-minus one; and here: or self clusters anyClusterContains of max plus one. We don't need a saturating sub on the max side, because we've already checked the equals-length case down here. Hopefully that stops the clusters overlapping each other. Oops, it doesn't compile, because this should be return true and this return false. Okay, boom. At the very least they're not overlapping anymore, but we still have the problem that they're not consuming enough.

(Chat: "I thought you were programming in Go.") Nah: a little bit of Zig, a little bit of HTML, a little bit of JavaScript.

All right, I'm going to try adding this jitter parameter and just see what happens. If this is less than min times jitter... I think we do this in both spots. This should be minus; let's see what that does. And then I guess we have to cast, oh god. Self histogram at min minus one... let's do some nice naming: potential idx is min minus one, we take the histogram at potential idx, cast it to a float, and multiply by the jitter. So potential val with jitter is an f32, and this one is minus this time: self histogram at min. Okay: if potential val with jitter is less than that, we keep going. Then we copy-paste this down below and adjust: max plus one instead of min minus one, max instead of min, and potential val again. So now we should allow at least some movement; maybe too much, maybe too little.

We're still broken because... oh my God: const current val f32 is floatFromInt of this. I appreciate Zig's explicit casting, but it's annoying, which is kind of the point: they make casting a struggle so that you're aware of it. But holy moly. So it's like this, and maybe this one as well; these get turned into floats. Anything else we can do to clean this up? We can say const potential val f32 equals this, and that helps, because now we can just write potential val f32 minus this. Yeah, I like that better. All right, I think that's right. Might not be, but it very well might be.

So now we hit next, next, next... still nothing, huh. Next. Oh, that did something. Next, next. Okay, now the jitter is clearly too high, so let's drop it down a little. That, or I wrote a bug; we never know. Okay, that feels close; we're getting close.

(Chat: "what are we working on?") The top-right corner kind of summarizes it, but basically we're trying to figure out how to turn this photo of a QR code into a clean black-and-white image so that we can process it, to find the finder patterns and so on. Right now we're looking at this histogram and trying to identify which segments are dark and which are light for a given region of the image: we generate a histogram, take the tallest part, and try to consume the bins around it. That one worked pretty well. Then the idea is that hopefully we can look at this and say: this stuff over here is going to be the lights, and the stuff over here is going to be the darks.
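The jitter tweak can be sketched in Python (my own formulation of the allow-small-upticks rule; the stream's Zig version subtracts a jitter-scaled copy of the neighbouring bin before comparing, which amounts to the same shrink-then-compare test):

```python
def grow_cluster_with_jitter(hist, peak, jitter):
    """Downhill walk that tolerates small upticks.

    A neighbouring bin is still consumed when, shrunk by the jitter
    fraction, it stays below the current bin; jitter 0.0 recovers the
    plain strictly-decreasing walk. A long run of sub-threshold upticks
    can still overrun, which is why extra smoothing helps too.
    """
    lo = hi = peak
    while lo > 0 and hist[lo - 1] * (1.0 - jitter) < hist[lo]:
        lo -= 1
    while hi + 1 < len(hist) and hist[hi + 1] * (1.0 - jitter) < hist[hi]:
        hi += 1
    return lo, hi
```

With too large a jitter the walk climbs right over neighbouring peaks, which is the "clearly too high" behaviour seen on stream.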
We have to find some way to... maybe we iteratively take each chunk from each side and fold them inwards, because here I'm not actually sure what this middle section even is. Okay, let's try this guy over here, and we'll smooth it a little.

I also wanted to change our histogram smoothing. I don't like the idea that more smoothing always means consuming from farther away, if that makes sense: we have a smoothing radius, but I also want smoothing iterations. So we loop from zero to smoothing iterations and do the smoothing pass each time, and at the end of every iteration we copy: we take the current result and put it into unsmoothed, so that on each iteration unsmoothed gets updated and the result is rebuilt on the next pass. Boom.

(Chat: "does Zig compile to WASM for browser interaction?") That is an approach I have not looked at. Right now we're doing everything on the back end and just exposing some JSON endpoints that the JavaScript fetches.

Okay, so for the image histogram we also want to store hist smoothing iterations. When we initialize, hist smoothing iterations is set to zero, no smoothing by default; that seems reasonable. Whenever we set it, we call setSmoothingParams with the number of iterations, and then self hist smoothing iterations equals n iterations. Boom. I think that's all the places we needed to touch. Now in the server, when we get a set-histogram-smoothing request, we read the smoothing iterations, and we make that checked as well: we start it as null, and if we don't find it, then we've got a problem.
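The iterations change, sketched in Python (my own names): run the box pass repeatedly, copying the output back over the input after each pass as described above. Several narrow passes flatten jitter while staying local, and repeated box blurs also tend toward a Gaussian shape, which connects back to the kernel discussion.

```python
def smooth_iterated(hist, radius, iterations):
    """Run the box-smoothing pass `iterations` times.

    After each pass the output is copied back over the input (the
    end-of-iteration copy from the stream), so the next pass smooths
    the already-smoothed data. Zero iterations returns the input.
    """
    current = [float(v) for v in hist]
    for _ in range(iterations):
        out = [0.0] * len(current)
        for i in range(len(current)):
            start = max(i - radius // 2, 0)  # saturating subtraction
            end = max(min(len(current), i + radius // 2 + radius % 2), i + 1)
            out[i] = sum(current[start:end]) / (end - start)
        current = out  # the copy-back step
    return current
```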
got a problem you know smoothing iterations not provided and then here we say smoothing iterations pass that guy in and then we just do a little uh else if we see smoothing iterations we set the smoothing generation opt boom boom boom boom all right might be good enough probably won't compile it never compiles but if it did then it might work um right so this isn't radius anymore this is uh pams and UNS smooth is not an indexable pointer so at uh app 264 I just have to put a little Amper sand on this guy and this guy to say hey could you coer those into slices for me please and maybe we're chilling looks like we're chilling so why am I not getting the histogram smoothing iteration is not provided good thing that we put that error log so uh I guess we'll do the same thing here where uh we say hey can we look at iterations and maybe there's like some max value of like five maybe we just make the input type here number um and then just don't set Minimax and then in our index JavaScript when we set the smoothing frams uh we just have to add in plus like it goes Plus on this side and uh smoothing iter ation is equal to the number of s durations which comes from this guy boom okay refresh all right so error handling request what's going on set Roi that's fine histogram we're getting the histogram but something's wrong something's wrong for sure okay let's go back to app and look at our set smoothing okay I'm assigning him here uhhuh and then I'm using him in the image histogram call and here we are just iterating for zero to smoothing iterations M Copy is source to test yep here we assign test I'm just not seeing it I'm not seeing it okay let's just do the stupid thing and just go back to where we were before right just ignore this completely like surely this shouldn't be broken right this is fine that's working as expected so how come doing anything here is not working the way we expect oh um let's make sure that when the smoothing iterations change that also updates 
the histogram. And that's doing nothing, cool. Can we maybe look at what request is being sent when this changes? We are calling smoothing iterations equals 3. Are we getting errors here? No, so nothing's complaining. What am I missing? Smoothing iterations comes from here; we would error out if it wasn't getting set. Let's break out GDB, I guess. This is at binarizing — debug, break at this line, run — run with image path — oh, GDB args — break at this line, run. Okay, boom, let's look at our smoothing iterations: seven. Okay, then what the hell is the problem? "We zigging?" We sure are. We are zigging and zagging, that's for sure. Okay, so this guy — we are 100% using him here, and is it possible that it's just not doing anything? Like, is that possible? I don't see how it would not be doing anything. Clearly when I set it to zero, that's clearly having an effect... oh, it is doing something and I'm just an idiot. Okay, so with one iteration, right, we can just do this — there we go. Okay, now when I hit next, nothing's happening, super cool... there, that did something. Here we're back at that problem where our jitter is too high and it's climbing up the hill when it shouldn't, which is annoying. I would prefer not to have a tunable parameter here, to be honest. Let's go back to zero jitter and see if, with the extra smoothing, we can get away without it. Maybe we don't need it if we just smooth more, you know? Okay... nothing is happening. Why is nothing happening? We are getting clusters — we have clusters, they're just not doing anything. Minus jitter times this... I guess we need to make sure that the max value is inclusive in our clustering thing, so we'll double check in the index JavaScript. Something's happening on the far right — oh, there is something happening on the far right, good eye — which is... oh, there we go, that looks kind of right. Oh yeah — wait, that's doing exactly what I want. So with no
jitter it's working kind of correctly. So now let's go back to maybe increasing the smoothing range. Well, what are these peaks, that's what I want to know. Like, okay, if we did something like this — that's almost perfect, right? Boom, that's super clean. "This is Otsu's method." Who the hell is Otsu?... Oh, this is pretty funny. Wow, this looks a lot like what we're trying to do, huh? He's doing some sort of moving average and looking at variance. This is not Otsu's method, but this is an interesting thing to know about, for sure. So what is he doing... gross, oh, that's a lot of letters. Weighting the probabilities of two classes being thresholded... optimal thresholding for contrasting images. Yeah, this is just better than what I'm doing, but I'm going to keep doing what I'm doing, because I haven't read this and I'm not going to be able to parse it effectively right now. So we're just going to keep going, but I think I'm kind of happy with what just happened here. When we reset — so something just crashed: integer overflow. Okay, hold on, we integer overflowed for sure in our application here. Let's just say `red /= smoothing_end - smoothing_start`, boom, and then hopefully we won't overflow like we did there. Okay, so there's something wrong going on — like, something's not resetting. If I refresh this, the clustering algorithm should be resetting here, and it's not. So let's fix that. Maybe that's in set here: we need to clear the cluster finder and reinitialize him with this, yeah, like this, boom, and the output image comes from our self here. All right, try that again. This guy was considered non-failing; now he can fail, so we go back to the server where we call set histogram smoothing params and we write `try`, boom. And now hopefully the visualization state is consistent. We also have to set reasonable
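(Aside: the stream skips reading the Otsu's method article, but for reference, a minimal sketch of it — pick the threshold that maximizes the between-class variance w0·w1·(μ0−μ1)² between the two sides of the histogram. Plain JavaScript, names illustrative.)

```javascript
// Otsu's method: exhaustively try each threshold t and keep the one that
// maximizes the between-class variance of the "dark" (<= t) and "light"
// (> t) pixel populations.
function otsuThreshold(hist) {
  const total = hist.reduce((a, b) => a + b, 0);
  const sumAll = hist.reduce((a, b, i) => a + i * b, 0);
  let w0 = 0, sum0 = 0, bestT = 0, bestVar = -1;
  for (let t = 0; t < hist.length; t++) {
    w0 += hist[t];       // pixels in the dark class so far
    sum0 += t * hist[t]; // weighted sum for the dark-class mean
    const w1 = total - w0;
    if (w0 === 0) continue;
    if (w1 === 0) break;
    const mu0 = sum0 / w0, mu1 = (sumAll - sum0) / w1;
    const varBetween = w0 * w1 * (mu0 - mu1) ** 2;
    if (varBetween > bestVar) { bestVar = varBetween; bestT = t; }
  }
  return bestT; // bins <= bestT are "dark"
}
```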
starting parameters, because whatever is getting set off rip is just wrong. This should be one by default — there we go. So now, refresh — there we go. Probably just kind of eyeballing it, it feels like around here is a good smoothing range, and maybe a lot of iterations is good here, and then let's just add a very, very small amount of jitter allowance. Okay, next, next — yeah, that feels good, that feels good for sure. Boom. Yeah, look at that — it's clean, it's clean! Oh yeah, I'll take it for sure. Okay, cool. So now what we have to do is figure out how the hell to bisect these guys. I think it's probably: we start at the leftmost bin and set it to dark, and we start at the rightmost bin and set that to light, and if there's an odd number, maybe we just prefer whichever one has less of the pixels, right? Let's assume that we want about 50/50, and so if we are unsure, then we just go beep boop, I will be in whatever category has less. Yeah, that seems reasonable to me. Oh, maybe we need more colors a little bit, but that's fine. Yeah, I'll take it. Boom. Jitter is allowing a little too much — maybe we need even less jitter. I don't like that I'm seeing red climb up a little here... yeah, there we go, good enough for me. All right, I'm just going to put a note here: this is sketch and can allow for climbing an adjacent hill that we shouldn't; probably something that approximates slope and looks for bottoming out is better, but we were not smart enough to do that without thinking very hard. There we go — we'll say we could do better, but we're just not going to right now. Okay, I think this epsilon parameter isn't used... yeah, it's not, so it's fine. Okay, so I think the next step is to take our clusters and sort
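(Aside: the "fold in from both ends" bisection just described can be sketched like this — sort the clusters, label the outermost pair dark/light from each end, and give a leftover middle cluster to whichever side currently holds fewer pixels, aiming for a roughly 50/50 split. Plain JavaScript; cluster shape `{min, max}` and names are illustrative, not the stream's Zig.)

```javascript
// Returns the dark/light threshold: histogram bins <= darkMax are "dark".
function splitClusters(hist, clusters) {
  const sorted = clusters.slice().sort((a, b) => a.min - b.min);
  const count = (c) => {
    let n = 0;
    for (let i = c.min; i <= c.max; i++) n += hist[i]; // inclusive of max
    return n;
  };
  let lo = 0, hi = sorted.length - 1, darkPx = 0, lightPx = 0, darkMax = 0;
  while (lo < hi) {
    darkPx += count(sorted[lo]); darkMax = sorted[lo].max; // fold in from the left
    lightPx += count(sorted[hi]);                          // and from the right
    lo++; hi--;
  }
  if (lo === hi && darkPx < lightPx) darkMax = sorted[lo].max; // odd count: middle joins the smaller side
  return darkMax;
}
```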
them, and then bin them. So I guess for now, since this is all in our debug-visualizer mode, maybe what we do is add a URI for a binarized image, and we do this little trick — by the way, this is kind of funny, I don't know if this is how you're supposed to do it — but when I update the image, I need to force the browser to re-request it. So in our index JS we just append the timestamp to our request as a query parameter, and the browser just goes, well, it's a new picture, because every time you ask, it's a new picture. That's why we say our match type is "begins with" instead of an exact URL match. This guy — we need to say that our purpose is binarized image, and let's see, what are we going to do here? We're going to say, hey, if we get a binarized-image request, we're going to try to handle that binarized-image request, and that's going to look very similar to handling an output-image request, which is our blurred thing. Handle binarized image, all right. And so we are going to have to say, hey, application, could you give us a binarized image? Okay, yep. And then in our app we're just going to have a binarized image, and we're going to have to make that thing somehow, so that's going to be like extract binarized image from... something: input image and the cluster finder, maybe. "Can you speak to using Zig at all? I got off the Rust bus recently, I'm looking for something clean, thinking Zig, but your experience on it?" Um, it's pretty fun. I would say it fills this niche that I think needs to be filled. Rust is kind of like C++ for C++ people, right? I don't think anybody really likes C++ — maybe I'm wrong, but I feel like it pisses me off — but the things that people like in C++, that you see people write about, like modern C++... Rust is like if you took modern C++ to the extreme, and then just didn't make crazy design decisions in the middle. I feel like Zig is like, if
you took the same... like, C++-to-Rust is like C-to-Zig, is how it feels: you've kept the simplicity, but you've tacked on a lot of stuff that's nice. It feels like what C++ could have been, in some ways — you have these generics that you can generate, but they don't come with the template system, and it feels good. Hand-wavy, right? But I would say that I would not use it in production; that seems scary. Like, there are some languages that aren't going anywhere: C will be around forever, C++ will be around forever, Rust seems like it has enough momentum that it's going to be around for a long time — I don't think we have to worry about it going away. But Zig is like, you're kind of relying on this one guy to keep driving it in the direction he's been driving it. And being owned by one guy is really nice in that there's clear direction and clear consistency, and you don't have this designed-by-committee thing where you get shitty stuff out of it. But, you know, what happens if he decides he gets bored of it and we just stop making new Zig versions? Like, if one guy on GCC stops working, it's not like GCC is going away, but you'd have to worry about that. So I don't know if I would use Zig in production, but it does feel like, if it gained more stability and more longevity, I would use it where I would have used C++. Sorry, that's kind of rambly, but that's how I feel, if that's helpful. Extract binarized image — and we take our input image, which is going to be an image, and we're going to have our clusters. So I guess we don't pass in the cluster finder, we pass in the clusters, and this is going to return an image. And I guess we need some sort of ROI from the input... maybe we're going to use — actually, the blurred image. Blurred, yeah, we'll
just say we'll just binarize the image, yeah. "Modern C++ is so alien to me now." I feel like I was comfortable in C++, actually — yeah, C++11 and 14 were fine, right? `auto` was kind of nice; range-based for loops — is that what they call them? — range-based fors were kind of nice; things like shared_ptr, unique_ptr, move semantics, they're all kind of nice. Although writing any class where you try to use move semantics is rough — writing a class is just hard in C++ now — but the move semantics are maybe arguably worth it. Being able to express something like unique_ptr is something you couldn't do before, which is kind of nice. I haven't really worked with C++20; C++17 just felt like they went ham on constexpr, which I liked. But yeah, I don't know — I noticed recently that C++ modules, still nobody uses them. It's not even implemented, which is really funny — it became part of the spec what, four years ago, and nobody's even finished it yet. "Wonder if Zig could fulfill the C promise Go tried to make, but instead Go is more like a Python replacement for cloud and not a systems language." I was wondering about that, I was Googling — okay, this is going to sound so stupid, really stupid, but when do you use Go or Java? I'm not sure, really. Because I get it: if you're trying to write something high performance, you drop to C, C++, Zig, Rust — all of those guys kind of fit that, right? If you're trying to bang out something, you work in Python. So what is the middle ground there? I guess you just want something that's a little bit faster than Python, but not enough faster — like, you don't care enough that you want to go all the way? I just don't really see where Go and Java fit, you know? So when you say it seems like the C promise Go tried to
make — I get what you're saying, it's a lot faster. So, yeah, I guess Python's pretty slow, but then you end up using it to stitch together C modules. And cloud software kind of makes sense, because I was also thinking: if you want any system integration, then I would imagine that C/C++ — everything speaks C — feels like a better fit. "You want fast but you don't want to manage memory." Okay, sure. Maybe I'm just too deep in the sauce, but I feel like managing memory isn't that bad with RAII, right? If you just have scopes and... right, I guess the difference is — I was going to say, with automatic deallocation on destruction it's like, who cares, that's just as good as garbage collection and more flexible. But then I guess if your options were C++ or Java, I can totally see why you would skip C++. "I think it's that most devs just haven't managed memory and are generally scared of it." I could see that, but isn't working with a garbage collector just as frustrating? Like when you go, oops, I have a circular reference and I have a leak — you just have kind of more roundabout leaks, don't you? Maybe — that's my guess, but I'm just talking out of my ass, because I don't really use garbage-collected languages that often. "Go is not as slow as you'd think — automatic memory management, an order of magnitude faster than Python, a little slower than C++/Rust." Okay, okay, I can see that. So you're saying: we want to go faster, but we also don't want to waste all of our time debugging stupid stuff. I could see that, yeah. Everyone's kind of saying that it's not as far from C as I would have guessed — it's not like it's halfway: if you have C here and Python here, it's not like Java's in the middle,
it's like C, Java, Python — in which case, sure, I can see it. Okay. And I want to extract an image from this image, and I have my clusters, so first we kind of have to categorize our values. We'll just say that dark max is zero and bright min is 255 — these are u8s — and we just walk our clusters until we run out. So we'll say our dark cluster index is zero and the light cluster index is clusters.items.len, like this. "Are you enjoying Zig?" Yeah, for sure, I think it's fun. It feels like... I like having written Rust, but it is a little like you have to write it their way, and so having the freedom to mess up is kind of nice, you know? These are usizes. But writing this again, I feel like maybe C's not so bad anymore either — once you learn the strategies from being forced to write Rust, those strategies can apply anywhere, and maybe it's fine, you know? Okay, so I have to sort my clusters, I guess. Maybe we even just take in a cluster range here, and here we say `std.sort.pdq` — our clusters, which are elements of cluster range; context is probably nothing, I probably don't give a damn; and then the sorting function is going to be... what? I don't know — we have some function that takes in this thing, a less-than function where the context here is void, the left-hand side is a cluster range and the right-hand side is a cluster range. Boom. And we're just going to say lhs.min is less than rhs.
min. Sure, that's fine, good enough for me. And we'll just say that while dark cluster idx is less than light cluster idx, the dark max is clusters[dark_cluster_idx].max, and the bright min is clusters[light_cluster_idx].min. Uh-huh. Defer dark cluster idx += 1 and light cluster idx -= 1. Here we have to be a little bit careful: here we know that they're not equal, so we're okay doing this, and then if they are equal at this point, we just pick one. "Anyone know how to switch between .h and .c files in Neovim?" Go to definition — I use clangd; clangd has a switch-source/header thing in it, I have it bound to a key, maybe H. Yeah, clangd switchSourceHeader, that's what I do. But there's also something called a.vim, if I remember correctly, but I haven't used it — I haven't been deep in C++ for a while, so don't take my word for it. Okay, so if these are the same, then I have to assign it to one, so I guess here we're going to count how many items are in each of the two bins. We'll look at our histogram — these are usizes, and I guess this is by const as well — and we'll just say num dark pixels is going to be zero and num light pixels is going to be zero, and for 0 to the dark max we say num dark pixels += histogram[i], right, and we can do the same thing for light min to histogram len for the num light pixels, like this. And we'll just assign it to whichever one makes sense: if num dark pixels is less than num light pixels, dark max is now going to be the cluster at dark cluster idx's max; otherwise we assign the minimum light value. Yeah, uh-huh. Sure, that might work. This is ripe for bugs right now, but we'll see. Maybe what we should do here, actually, is visualize the thresholded histogram. Yeah, so let's do that: we'll say function extract
light dark threshold, and we'll pass in this stuff. I don't think we actually need the image at this point, and we'll just return a u8, which is a pixel value — well, a brightness value. And really, here we actually don't need to be tracking the bright min ever, because the dark max should end up equaling the bright min later. Yeah, so I think we just take out that stuff — that's cool — and then all of this goes here. Okay, and now we need to actually call this somewhere, so we'll say dark light threshold is extract light dark threshold from the histogram, uh-huh, and the clusters — what the hell — clusters.items, all right. Which means that we need a dark light threshold, which is going to start at 128 or something, who cares. "You guys complaining about cloning in Rust..." I will say, okay, so re: Rust cloning, and Arcs and Rcs and stuff — shared pointers and that stuff — it's like, you would just do that in other languages, except they don't tell you that you're doing it. So it's kind of nice to just embrace it sometimes: look, copying this string is stupid, but in reality copying a string never matters, right? How often does it actually cause a noticeable slowdown? Almost never. So just do it — clone, clone, clone, who cares. And the language will kind of guide you into not doing that, and you should listen when it's easy, but when it's not easy... "When I run clangd switchSourceHeader it just exits in zero milliseconds; I'm using clangd from the language server." I mean, if clangd is set up right, that's always worked for me, but no promises, your mileage may vary, etc. Oh, shadowing — what an annoying thing. Okay, this guy — it doesn't like light min, so this is also dark max here. Oh, maybe I did need to track light min; I lied, I do need it. How do I get this back? This is the only part where it
really matters, I think. Go back to here: bright min goes here, var bright_min is equal to 255. Hopefully everything else is okay there — I shouldn't have deleted it, I've changed my mind. All right. Binarized image is missing — that's okay, we'll just delete that for now, get this here. And coercion at 169 — this seems to be a little ampersand-y, probably. Yep. Expected this, found that — so for this guy, take the percent away. No field named binarized image, so we'll look at server, and this binarized-image section will just return error unimplemented for now. And what we'll do is we will add a new API to get the partition point — dark light partition, uh-huh, yes. And we'll say handle dark light partition, and this is going to look similar to getting clusters, I guess. And it's going to return some JSON, but the JSON is just going to be one number, I guess — we just write the value in, in which case it's not even really JSON. We just say response writer write all, and we do a `std.fmt.bufPrint` — we need some buffer, var buf is like [4]u8 or something, because 255 is the max value, so three is enough. Then we can write the value of self.app.dark_light_threshold into that buffer. So we'll call this dlts, for dark-light-threshold string, and then we just write dlts here. All right, here we need to use this buf, so I think we just write buf here, and const, okay. "If there's something to complain about in Rust, it's the lifetimes, not borrowing — borrowing gets easy if you use the language, but lifetimes are a pain in the ass." Yeah, I've found that if you're doing more than a little bit of lifetiming, give up — it's just not worth it. Just like in C++: if you're doing more than a little bit of templating, also stop. If you're pulling out SFINAE or ADL in C++, stop. And the same thing in Rust: if you're pulling out too many lifetimes, stop — just do the stupider thing. Okay, so now we
probably need to say that there is some URL we can call to get the dark light threshold. And it's going to say dark light threshold — is that what we called it? Get dark light partition is what we called it. Maybe I should use the same name everywhere; that would reduce confusion by a lot. So here we'll say partition. All right, let's see how we are. Use an ampersand to coerce to slice — so here I need to write var, and here write. And I think dlts has a possible error, so we write `try` there. Then we have 184 in app, and in here — what's it — oh, okay, we messed this up: expected type cluster, found const cluster range. Oh, because we can't write these, so we need to say, hey, we want to sort these guys, and you have to let us. Dark max: expected u8, found usize — so at 182 we'll just do some @intCast here, because I'm pretty sure those can't be out of the range of 0–255. Num dark pixels — these guys have to be usizes here. And function type function — oh, histogram, this has underscores everywhere here, nice. Expected u8, found usize, so this is also an @intCast. Are we done? No, of course we're not done, of course not: this guy never returns; we have to return dark max here. Yeah, all right, now let's see. Oopsies, we have to set the smoothing, and now if we call a dark light threshold — is that what... I already forgot — server: dark light threshold, partition, dark light partition. So this is going to be zero. And actually, I think that we forgot to ever assign it — I think we didn't ever actually set it. So I guess when we set anything of interest... we need our exit condition from this cluster-finder algorithm, so we say here: we return null if we've gone through all of our buckets, otherwise we're chilling. And then when we set the ROI, we just run the cluster finder — maybe we do it when we call — it doesn't matter. Every time we set this guy, we'll just say: while self cluster finder next, we do nothing, we don't give a damn, and then dark light threshold is
equal to extract dark light threshold, given the histogram and the clusters, or something. And is this the only place? No, we need to do this here as well — which is wrong, this is all wrong, I'll clean this up later. Don't worry, don't you dare worry about it, this will get cleaned up. Let's use an ampersand here and here, and cluster finder next — where is he complaining? Server, maybe. Yeah, all good. Okay, so now when we go in here and we go like this... we crash, super cool. So this is index out of bounds in extract dark light threshold at the light cluster index — this is app 193, so light cluster idx is this minus one. Oops, oopsies — okay, integer overflow, so this has to be a saturating sub. Okay, and then when we search this — okay, let's set our smoothing... oh, there we go, nice. Okay, and if we call this we get a value of 90, so that's probably here, which I think is probably good. What about now? 27 — that does not look good. What about over here? 202 — it's almost like it's backwards. Maybe our sort — maybe we should `std.debug.print` the clusters here, because it doesn't look like it's working the way I want it to. Does not look like it's doing what I want. Refresh, refresh. Okay: 0 to 1, 2 to 37, 38 to 77, 78 to 166 — so that does look sorted. It also looks like there are way more clusters than I would expect. Oh, it's because there's a bunch of really small ones. So it's almost like we want to discard clusters that are too small. So we want to do something like: as we're walking the clusters — here we're just going one by one, but we actually want to do something more like while clusters[dark_cluster_idx].max minus clusters... we'll say const dark cluster is at dark cluster idx, and we'll say while true: we have this cluster, and if the dark cluster's max minus the dark cluster's min is less than, like, five, dark cluster idx += 1, and dark max is equal to this — no, we just keep incrementing it. This is a little sketchy: we actually don't
want the width of the cluster, we want the number of elements in it. Yes, I like that. So let's do num elems in cluster: we take in the histogram and the cluster and return the number of elements. We'll just say var sum equals zero, and then for cluster min to cluster max we have sum and index, and we just say sum += the histogram at that value. Uh-huh. And then down here we can say const total histogram size is equal to numElemsInCluster — I spelled this wrong, I want an uppercase C — and we'll just use the histogram as well as a cluster where min is zero and max is 255... which means that numElemsInCluster — this has to be plus one. Or are there inclusive ranges in this language? Zig inclusive range... I don't actually know. Where is the name of the dot-dot syntax... yeah, it doesn't look like it. Okay, just plus one here, no biggie. Okay, so we have our total histogram size, boom, and then we'll just say: while the number of elements in the cluster is less than some threshold — like total histogram size over 10 or something, 20 — right, it doesn't have to be crazy, it just has to be something. Yeah. And then I guess we do this for light as well, which also has to be factored out, but shut up, you know — we'll deal with it later. Okay, let's see what that does. This guy needs to be a usize. And refresh, draw an ROI, update the smoothing, and... okay, here is an example where we should consume this green section — this green section, yellow, red and orange should be considered light — which means that we should consider here... 84? I don't know if this is 84, to be honest. This might be 84, so let's maybe draw a second histogram based off of the splitting point. So here we'll say histogram 2, and we're going to remove this next button, it's not doing anything anymore,
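(Aside: the small-cluster filtering just described — count the pixels inside each cluster and discard clusters holding less than some fraction, like 1/10 or 1/20, of the total histogram mass — can be sketched like this. Plain JavaScript; cluster shape `{min, max}` and names are illustrative.)

```javascript
// Number of histogram entries inside a cluster; note the max bin is
// inclusive, which is the "+ 1" discussed above.
function numElemsInCluster(hist, c) {
  let sum = 0;
  for (let i = c.min; i <= c.max; i++) sum += hist[i];
  return sum;
}

// Keep only clusters holding at least `fraction` of the total mass,
// so tiny noise clusters don't end up on the dark or light side.
function significantClusters(hist, clusters, fraction = 1 / 20) {
  const total = hist.reduce((a, b) => a + b, 0);
  return clusters.filter((c) => numElemsInCluster(hist, c) >= total * fraction);
}
```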
um, and histogram 2 is going to be something — we'll figure this out. So let's run over here: zig build, and run this guy, and let's look at our index JS. We have render histogram, and I guess here we'll just have a second context. And what's going on in here? We figure out the bar width — this all stays the same — we clear the context, and then we render our bars here and we set them to be blue. Here, what we'll do now is we'll set the colors to be like blue and yellow or something, depending on if they're black or white. So say — here we have to get the split point, so split point response is await fetch from dark light partition — what was it called? I already forgot — it's over here, boom. So we get this guy, and then const split point is equal to await response.json(), and then we say: if i is greater than or equal to the split point, we just change this to yellow or something, I don't know, doesn't matter. And we'll take this into a function, render histogram uncolored, right, and he's going to take in the context as well as — I guess I'll just take in the canvas, so we'll do this as well — and then he needs the bar width to be passed in, and is there anything else in here? Split point, and data... you know what, this noise is too much work for now, I'm just going to type return here. I'll probably regret this in a second, but let's just see. Okay, so what's it complaining about? Get element by ID is null — which element is null? Next — oh, because I deleted the button that would do it. Now what? Oh, there we go. What the hell is going on — console — ah, "body has already been consumed," because this is split point response. Oops, refresh. Okay — oh yeah, hell yeah, okay, okay. You know what, I might take that — that feels pretty goddamn good. That feels like it's working, am I crazy? Ooh, that one might not be good. This is a problem, because I'm pretty sure that this hump is supposed to be
bright. Here we go, we'll try that. So I guess the failure there was that the smoothing range was too low. Okay, what do these histograms even look like? There we go. Okay, so let's kind of stick them here. Okay, so I can see maybe there's a risk that if the smoothing range is too low, some of the filtering won't work correctly, but otherwise it seems like it's probably okay. I feel like for the most part that looks about like how I would do it. Sorry, I'm not really saying anything, because I'm just staring at this histogram, selecting over and over again. "Does anyone else expect the histograms to be a lot more peaked?" I mean, without the smoothing it might be — there's some blurring going on as well as the histogram smoothing, which is not helping, right? If we take out this blur — Zig app, where is it, app.zig — and we take out the blur image... if we just do this, I think that'll do it. Let's just look. Refresh, refresh, refresh... no. Why? I guess maybe we need the blur for it to work, but can we blur it less? Oh, it's probably — there's a bug in here somewhere. Just looking at the blurred, even in the selected region there does not look to be that large a range of colors. But okay, so unsmoothed — the histogram, if we look at the size of this guy, this is only taking up half of the range, right, which isn't really shown — this is the full range here — and if we were just to look at, like, one dark section here, you do see quite a short segment. So I think it's doing the right thing, but I've been wrong in my life. This was at 25 before, when we liked it, and... I think I like it like this. Okay, so now what? Now we have to apply this to the whole image, and maybe for now what we'll do — "the mass contained in the regions between the peaks is very large." The mass contained in the regions between the peaks is very large... oh, I see what you're saying. Yeah, but I think that's just a
result of — no, it's not even a result of smoothing, huh. Yeah, I see what you're saying now. Actually, it does feel like the peaks — I get what you're saying now: this is two-tone, so why the hell is there so much gray in here? And I do think that's a result of the blur, right — I do think that probably in here there is a lot of gray. I could be wrong, though. Yeah, I don't know where that's coming from. You are right that these feel very, very tall. It could be that, with the noise in this area here, we just end up with dispersed pixels — seems possible. Did I just crash? I did crash. Okay, so let's pick maybe a smoothing parameter like this — so we have our smoothing radius as five, for 25 iterations — and I guess now we can just apply — we can just make a binarized image, apply the threshold. "I would expect four peaks in the dark-light section." I think we do see four peaks here: this one here, this one here, this one here, this one here. There's a lot of gray. "But I feel like the mass of the identical black and white should be larger — by mass I mean the function integrated." Yeah, I get what you're saying now, I think I agree. Oh man — the thing is bugged if I don't smooth enough, so we can't really look that easily. Maybe we could just fix the bug, yeah. I am pretty sure that it didn't look like this in the original image. In fact, what we can do is copy this image and open it in GIMP — we open it, we can look at the... oh, here it is. Right, this is the histogram of the unblurred image, like this. But as soon as we apply some amount of blur, the blurring does heavily affect it — you know what I mean? Like here, this is kind of what you're saying, right: you would expect to have 1, 2, 3, four peaks, but then as soon as we add any amount of blurring, it really degrades that, you know what I mean? And it could be — yeah, I'm just not sure. I guess it's just that the mass in each of these
pieces is actually quite small and it's probably a result of the algorithm that we used to add noise probably just doing something that's a little unexpected would be my guess no I don't know what you mean to be honest yeah I guess I wasn't really saying words there um so I think what's happening is this histogram on the right here for the unblurred image we have four very narrow but tall sections and these are saying we have a lot of pixels that are this exact value and then we have a lot of pixels around this value underneath but there's not a significant number of pixels at any given specific pixel brightness then when we blur it using filters blur we're not doing a Gaussian blur we're doing a box blur but whatever it's close enough I think what's happening is that peak is getting copy pasted into a lot of bins around it um and so I'm not entirely sure but I think it's just that the slopes are softer now and the height of the histogram is maybe normalized a little bit differently and that's causing some form of difference but that's a really good histogram for the unblurred version maybe we're just making things worse by blurring I was worried that the noise algorithm wasn't indicative of real noise right because we got here by taking a really clean image and then applying noise on top and I worry that in the real world if I took a photo it might not look the same and so I kind of get what you're saying like yeah we're making things worse by blurring but I do think that it's worth taking that risk but I agree with you there's a chance that that's true all right so go back here we said smoothing radius five num iterations 25 and let's just apply the threshold so we can get a look at it and be happy you know so we'll say uh maybe we'll place blurred image here do we have
a function already for extract binarize no so we will make this one extract binarized image and we take in the image and the threshold so this is going to be some usize and no wait it's going to be and we are going to create a new image which I think we have some code to do somewhere wherever we call blur image here we go so image from luma I guess we don't even have to we already have an image right so here we can just say try uh maybe we just apply it to the original so we'll say binarize image here and we'll pass in our image and our uh what do we call it dark light partition threshold dark light threshold okay so this here is just going to be called binarize image and he's going to return void and we're just going to iterate over all the pixels and if the pixel is above the threshold uh image set x y 255 else set it to zero yeah that should be it uh here we will put the threshold here and here we say like this boom something about this is wrong this doesn't go here this does not go here so this guy we use him in the cluster finder and after we do all of this then we binarize the image down here yes yes yes so here we binarize the output image with the self dark light threshold there we go and we just do that everywhere we use this cluster finder next and again this is a lot of duplication wrong spots whatever but I just want to see the results before ending stream so we're taking shortcuts and I'll clean it up later I just want to see it work uh expected u8 found usize uh we'll just intCast it and fix it later uh still no good expected error union type found void uh yeah this just can't fail which is nice I'll take that and one more at line 423 this guy has to have an intCast as well oh okay let's see it something about this is wrong uh fine fine we will not use that as the output image we'll use this as the binarized image because I feel like we messed up somewhere so somewhere we are going to say self binarized image is
equal to output image uh but we need to allocate a new actual chunk of data in this guy uh do I have like an image duplicate I guess not but I can just make one we can just make one so we can look at image.zig and we can just add a little function clone that takes self and returns a new version of him where everything is the same except for the data what is in this guy data is a u8 slice so we're just going to say std do I have an allocator in here I don't which is annoying so we'll clone him with some allocator and we will say alloc.alloc uh self data len u8s yep yeah oh this is also annoying I'm not going to do this it just doesn't work out nicely um because the abstraction is wrong because I took too many shortcuts uh okay self binarized image is equal to self output image self binarized image data is equal to self alloc alloc u8 of self output image data len yep uh-huh and then we memcpy from the output into the binarized image and then we binarize our binarized image then hopefully that doesn't break anything else and I just apply this here and here boom uh yeah maybe I don't know man does that build at least line 152 uh oh my God right here right here this is awful I know I know I know but I just want to see the answer so I just want to take as many shortcuts as I can get angry at myself about and then fix it later do we build yet no line 152 binarized image you will be undefined okay so now I refresh that's just the blurred thing nice I apply my smoothing and then now we should be able to go to our server and add back in the binarized image handler which should just be write image as bmp get binarized image okay I think everything here is in the right spot refresh apply my smoothing uh and here can I just go to this oh hell yeah baby okay okay let's apply all here okay so that's definitely something um it's pretty
clear that we've messed this up and this like we're leaning too close to light so based off this histogram maybe it is too smooth so let's try that again it's still not great huh I guess you could kind of argue that because we are blurring it so much it's hard to preserve the edges so maybe we can blur it less um let's just do two iterations instead of three or is that kernel size three iterations one maybe we'll try blurring it a little less because you could argue that from the blurred image we don't have enough info oh and now at line 248 we are crashing because if our clusters are empty this is a problem so um we'll just say here if dark cluster idx is greater than the cluster len uh return dark max or something I don't know doesn't matter we just need something still no good oh this should be greater than or equal to oops okay now at least we get here nice nice okay okay and then go back over here refresh uh and let's try to look at something that we're going to recognize refresh okay that looks about right so what's interesting there is okay say we smooth less okay refresh this is clearly wrong right this is very very clearly wrong uh okay let's see maybe along here how's that look dark light dark light dark light dark light it's not great but again maybe you could make the argument that the blurred image isn't a nice place to start from right like these three dots here let's at least put it on the same page so we can see it fine fine fine let's put it on the same page so uh this is going to be binarized and we'll look here and we'll look at image binarized is image binarized and binarized is this uh okay does that at least show us the two things next to each other it does nice okay and actually we need to update this guy um after we re-render the histogram after we update the smoothing parameters uh so I guess we'll just do it in render histogram because I'm too lazy to put it in the right
spot and again we'll do the right thing another day another time I just want to see something okay so the question is do we believe that the thing on the right is a good representation of the thing on the left here I would argue yes here yeah I mean it doesn't look great but I think it's because the blurred thing doesn't look great right so even if I just take this and I try to look at the timing row I think that we do see a pretty clear light dark light dark light dark light dark light dark light dark light dark if we look at this timing row we have light dark light dark light dark light dark light dark light dark light dark like it's not bad to be honest I was pretty disappointed at first um but it could just be that the source image is too messed up and so this could be maybe good enough maybe let's see if we adjust this at all like what happens if we play around with these parameters oh yeah we could drop the number of smoothing iterations let's try that oh yeah oh baby yeah I don't know it's not awful this is a problem though there's a little bit of uh thresholding issues where maybe the hump is getting put on the wrong side which seems pretty crazy so I think there's probably a bug in here but other than that it's not so bad I wonder if we can take a look at maybe a different sample let's look at something else just to see I've downloaded a few QR codes from the internet and let's just take a look at that guy oh hell yeah baby look at that wait that's so good this is a way cleaner image right but yeah look at that that's so sick that's almost perfect and that histogram has a nice clean split what does it look like I mean it could be just that the source data is really good right but yeah that's not bad at all okay I'm pretty happy with that um I think that probably what I'm going to end up doing
is I'm probably going to end up picking a window size that's somewhat local and then uh probably iterating that kind of chunk by chunk around some smaller area and use that in some way not really sure yet um but yeah I think I'm going to play with this a little bit more off stream um and let's see tomorrow hopefully I get it to a point where I'm happy enough with it that we can start trying to find the finder codes in these versions I think what we're going to have to do is maybe we can get away with completely axis-aligned boxes still um but yeah pretty happy there uh so raid purple elf yeah yeah okay sorry I have been meaning to start hitting the raid button a little bit more I mean at all so I'll try I'll figure it out in a second but before we do that let's do our little plug so thanks for watching guys if you like what you saw we stream most days at around 2 o'clock Pacific time for two hours um we're working on this QR code scanner right now but there's a variety of other projects on the linked YouTube and GitHub so if you want to see those you can check them out there's stuff like a Rust operating system neural network stuff some 3D geometry stuff a little bit of webdev some ZB stuff some diff tooling just kind of whatever we feel like right so if you want to see any of that that's there or if you want to swing back by in the future um we're around um yeah if you're watching on YouTube you should try swinging by on Twitch and say hi see what's up I think you guys are right we should try raiding so how do we do that we go to Twitch and look for a channel to raid you are suggesting purple elf and I think that I've seen her a little bit in the past so I'm down how do we do this there's a raid button and we just type in purple elf nice all right guys uh enjoy go say hi don't be rude don't be weird you
know just be normal I'll catch you guys on the next one do I have to wait till the raid starts before I hit stop streaming I think so so I'm going to hit the raid button and then we'll turn off so thanks for watching guys I'll catch you on the next one bye
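The histogram smoothing discussed in the stream (a smoothing radius run for some number of iterations, e.g. radius 5 for 25 iterations) can be sketched roughly like this. This is a Python sketch with hypothetical names; the actual project is in Zig, and the real code may differ:

```python
def histogram(pixels, bins=256):
    """Count how many pixels fall into each brightness bin (0..255)."""
    counts = [0] * bins
    for p in pixels:
        counts[p] += 1
    return counts

def smooth(counts, radius, iterations):
    """Repeatedly average each bin with its neighbors (a box filter),
    so narrow spikes spread out into soft humps. Edges clamp."""
    for _ in range(iterations):
        out = []
        for i in range(len(counts)):
            lo = max(0, i - radius)
            hi = min(len(counts), i + radius + 1)
            out.append(sum(counts[lo:hi]) / (hi - lo))
        counts = out
    return counts
```

Each smoothing pass lowers the tall narrow peaks and widens them, which matches the observation that the smoothed histograms look less "peaked" than expected.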
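The "why is there so much gray" question from chat has a simple mechanical explanation: a clean two-tone image has only two histogram spikes, but a box blur (which the stream notes is what filters blur actually does, not a Gaussian) averages neighboring pixels across every black/white edge and manufactures intermediate grays. A hypothetical one-dimensional sketch:

```python
def box_blur_1d(row, radius):
    """Average each pixel with its neighbors; the window clamps at the edges."""
    out = []
    for i in range(len(row)):
        lo = max(0, i - radius)
        hi = min(len(row), i + radius + 1)
        out.append(sum(row[lo:hi]) // (hi - lo))
    return out

# A hard black/white edge gains in-between gray values after blurring,
# which is where the extra histogram mass between the two peaks comes from.
row = [0] * 8 + [255] * 8
blurred = box_blur_1d(row, 2)
grays = [p for p in blurred if 0 < p < 255]
```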
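The binarize step described in the stream is just a per-pixel threshold: anything above the dark/light threshold becomes white (255), everything else black (0). A minimal Python sketch of that idea (the real Zig function mutates the image in place):

```python
def binarize_image(pixels, threshold):
    """Map every pixel above the dark/light threshold to 255, the rest to 0."""
    return [255 if p > threshold else 0 for p in pixels]
```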
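The clone helper added to image.zig (same metadata, freshly allocated copy of the pixel buffer, so binarizing the copy doesn't clobber the output image) corresponds to something like this in Python, with a hypothetical Image type standing in for the Zig struct:

```python
from dataclasses import dataclass

@dataclass
class Image:
    width: int
    height: int
    data: list  # flat row-major buffer of 0..255 values

    def clone(self):
        """Same dimensions, but a newly allocated copy of the pixel data,
        so mutating the clone leaves the original untouched."""
        return Image(self.width, self.height, list(self.data))
```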
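The stream doesn't show exactly how the dark/light threshold falls out of the cluster finder, but one common way to pick it from a smoothed histogram is to locate a dark peak and a light peak and place the threshold in the valley between them, with a fallback for the empty-cluster crash that came up. A hypothetical sketch, not the project's actual algorithm:

```python
def dark_light_threshold(hist):
    """Pick a threshold at the lowest valley between the tallest peak in
    the dark half and the tallest peak in the light half of a histogram.
    Returns a mid-range fallback when the histogram is empty, in the
    spirit of the empty-cluster guard mentioned in the stream."""
    if sum(hist) == 0:
        return len(hist) // 2  # nothing to cluster; any answer will do
    half = len(hist) // 2
    dark_peak = max(range(half), key=lambda i: hist[i])
    light_peak = max(range(half, len(hist)), key=lambda i: hist[i])
    return min(range(dark_peak, light_peak + 1), key=lambda i: hist[i])
```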
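The closing plan, picking a somewhat local window size and iterating chunk by chunk, is essentially adaptive thresholding: compute a separate dark/light threshold per tile so a shadow on one side of the photo doesn't poison the threshold on the other. A hypothetical sketch of that direction (threshold_fn stands in for whatever per-region threshold picker already exists):

```python
def binarize_adaptive(pixels, width, height, tile, threshold_fn):
    """Binarize chunk by chunk: each tile gets its own threshold computed
    from only the pixels inside it, so lighting can vary across the image."""
    out = [0] * (width * height)
    for ty in range(0, height, tile):
        for tx in range(0, width, tile):
            ys = range(ty, min(ty + tile, height))
            xs = range(tx, min(tx + tile, width))
            region = [pixels[y * width + x] for y in ys for x in xs]
            t = threshold_fn(region)
            for y in ys:
                for x in xs:
                    out[y * width + x] = 255 if pixels[y * width + x] > t else 0
    return out
```

With a global threshold, a bright tile and a dark tile would share one cutoff; here a dim "light" module in a shadowed tile can still come out white because its tile's threshold is lower.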
Info
Channel: sphaerophoria
Views: 549
Id: iTTiRl_tZ4s
Length: 137min 42sec (8262 seconds)
Published: Thu May 02 2024