Mini ITX NAS Motherboard Intel CPU 6xSATA 4x2.5GbE 2xM.2

Captions
This is a mini ITX motherboard designed specifically for use in a NAS. You're probably familiar with this type of layout on an ITX motherboard, which usually comes with some sort of socketed CPU, one or two full-size RAM slots, if you're lucky maybe four SATA ports, and then a full-size PCIe slot, plus the usual USB, display outputs, and some sort of Ethernet connection. In this case it also has an M.2 slot and a Wi-Fi slot. If you're building something like this basic 4-bay NAS box, a board like that would probably be more than sufficient for the use case. However, if you're looking to build something like this Jonsbo N3, which can support up to eight hard drives, you're going to need something with a few more expansion options, and that's where a board like this one comes in.

It isn't made by any known manufacturer like ASUS, Gigabyte, MSI, or ASRock. I got this one from Amazon; you can also find it on AliExpress, probably cheaper. They're pretty much a no-name brand, but the boards do come with some quality components. This one has an N100: an Alder Lake-N, 6W TDP, four-core / four-thread, 3.4GHz CPU that supports up to nine PCIe lanes. They also make a version based on the Jasper Lake Celeron N5105, which is also four-core / four-thread, but with a 10W TDP, and it supports up to eight PCIe lanes. Once I get these all hooked up, I'm going to install and benchmark Windows 11, Ubuntu, TrueNAS SCALE, Unraid, and OpenMediaVault. But first, let's take a look at the components on each of these boards.

You'll notice the heatsink and fan come with the board itself, because the CPU is soldered on; again, it's the N100. It supports a single channel of DDR4 or DDR5 RAM; this particular board comes with a DDR5 socket and uses traditional laptop-style SO-DIMM memory. The CPU specs call out support for up to 16GB. I don't have a 32GB module on hand, otherwise I'd check whether it actually supports more, but it should handle 16GB natively. You'll also notice the 24-pin ATX connector as well as a 4-pin CPU connector, so a traditional ATX power supply will work with this board. You can see the onboard USB 2.0 header, and flipping the board around, there are two M.2 slots. They're supposed to be NVMe on the PCIe 3.0 standard, and my assumption is that they get one PCIe lane each, because of the limited number of lanes the CPU supports.

Down here we have a PCIe x1 slot. If you're not aware of how PCIe works: the front section, up to that divider, supplies power to the device, up to 75W from the motherboard; everything beyond that is the PCIe data lanes. On a traditional x16 slot the contacts obviously run all the way back, but in this case it's just that little segment. The nice thing is that the back end of the slot is open, so you can fit a card larger than x1 without using an adapter or modifying the board; we'll take a look at that in a second. That's actually nice to have: even though you only get x1 speed, you do have expansion options.

Moving to the back, you can see there are six SATA ports, and if you look closely you can see the JMicron JMB585 chip, which supports up to five 6Gbps SATA III ports. So one or two of these are probably taken off the CPU, as I believe the CPU itself can support up to two SATA ports natively; that would mean four or five ports off the JMB585 and one or two off the CPU.

Moving on to the back panel, you've got two USB 2.0 ports, a full-size DisplayPort, a full-size HDMI, and two USB 3 ports, which should be 10Gbps. But what I like most about this board is that it has four 2.5GbE Ethernet ports using the Intel i226-V chipset. If you know how to do link aggregation or SMB Multichannel, you can combine them for up to 10Gbit of total bandwidth, or you can use them as independent networks, or for failover, or whatever you want.
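If you go the SMB Multichannel route under Linux, Samba supports it directly. Here's a minimal sketch of the relevant smb.conf settings; the option names are real Samba options, but the addresses and speeds are placeholders you'd swap for your own:

  [global]
      # let smbd negotiate multiple TCP connections per SMB session
      server multi channel support = yes
      # advertise each 2.5GbE port's address and speed so clients can
      # spread I/O across the links (example addresses only)
      interfaces = "192.168.1.10;speed=2500000000,capability=RSS" "192.168.1.11;speed=2500000000,capability=RSS"
      bind interfaces only = yes

Windows clients will negotiate multichannel automatically once multiple paths are advertised; plain LACP link aggregation is the alternative if your switch supports it.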
Looking at the bottom of the board, there's really not much there other than the four screws that hold the CPU heatsink and fan.

Now, I propped this board up on a box because I want to show you this PCIe x1 slot with a couple of adapters slotted in. This is a PCIe x8 card, and you can see it fits just fine; the problem is that it overhangs the SATA ports. That's not a huge deal, because you could just use right-angle SATA cables if you needed to. However, if you use a x16 card like this video card, it slots in between the SATA connectors. Thankfully it does fit, but if your right-angle SATA cables have latching clips, the clips will interfere with the card and make it difficult to seat. I have these right-angle SATA cables with the clasp on them, and with those snapped in place you can see the latch overhangs the gap where the PCIe card wants to sit. So you really want something more like this: a right-angle cable without any kind of clasp, which should give you the clearance you need. With one of those clasp-less right-angle connectors installed, there's plenty of clearance for the PCIe card, and on the other side, those three SATA connectors can take regular straight cables, even with the clasp, because they won't interfere. So this x16 video card slides right in.

Now, taking a look at the N5105 version: again, you've got the Celeron N5105 Jasper Lake CPU, which is four-core / four-thread with a 10W TDP, and it supports up to eight PCIe lanes. What you'll notice is absent here is an actual PCIe x1 slot, probably because this CPU only supports eight PCIe lanes and not nine like the N100. Notice it has two SO-DIMM slots, so it uses traditional DDR4 laptop-style SO-DIMMs, up to 16GB total. I did put a single 16GB module in here and it did not boot; two 8GB modules booted just fine, and a single 8GB module booted just fine as well. It also uses the ATX 24-pin power connector, with the 4-pin CPU connector over here. This board likewise has two M.2 slots, which you can see are oriented a little differently than on the N100: these run perpendicular to the CPU heatsink, whereas the N100's run parallel to it. Both are again supposed to be M.2 NVMe, and we'll take a closer look at that. I should also mention that the M.2 slots on both of these boards only support the 2280 form factor, the 80mm length, so if you want to use a 2260 or 2242 SSD you'd have to get some sort of adapter. You can see it also has the six SATA ports here, and I did pull the heatsink off to check: it does have the JMB585 chipset under there as well.
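As a side note, you don't actually have to pull heatsinks to identify the controllers; on any of the Linux installs you can just ask the PCI bus. A quick sketch (the 01:00.0 address is a placeholder you'd take from the first command's output):

  # list storage and network controllers; expect the JMicron JMB585 AHCI
  # controller and four Intel I226-V Ethernet controllers to show up
  lspci | grep -i -e sata -e ethernet

  # confirm what a card in the open-ended x1 slot actually negotiated
  sudo lspci -vv -s 01:00.0 | grep 'LnkSta:'
  # e.g. "LnkSta: Speed 8GT/s, Width x1" means PCIe 3.0 at x1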
Looking at the back of the N5105 board, you've also got the USB 2.0 ports, a full-size DisplayPort, an HDMI port, two USB 3 ports, and again the four 2.5GbE Ethernet jacks, also on the Intel i226-V chipset. You'll notice this one has two onboard USB 2.0 headers, and the bottom is the same story as the other board: nothing there except the four screws for the CPU heatsink.

Now, you may be wondering: okay, I've got six SATA ports on here, but what if I have something like this Jonsbo N3, which supports up to eight hard drives? How do I get the other two drives connected to this motherboard? You could probably do this on a traditional ITX motherboard too, but because you already have the 2.5GbE Ethernet ports on board, you get expansion options, since you don't have to spend your PCIe slot on a faster network adapter. You could use something like this ORICO M.2 adapter, which uses an ASMedia controller supporting up to six full 6Gbps SATA III ports, or something like this SAS adapter, which uses these connectors with a breakout cable that fans out to four SATA connectors each, so you can support up to eight drives off of it. Or, in the case of the N100 board, you can just get a traditional SATA card, slot it into the PCIe slot like this, and connect your SATA drives that way.
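Whichever expansion route you take, it's worth confirming that every drive actually enumerated and which controller it landed on. A minimal sketch on Linux (sda is just an example device name):

  # model, size, and transport for every block device
  lsblk -o NAME,MODEL,SIZE,TRAN

  # map a disk back to its controller; the PCI address in this path tells
  # you whether sda sits on the CPU's AHCI, the JMB585, or an add-in card
  readlink -f /sys/block/sda/device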
Now, full disclosure here: the N100 board I was using initially ended up having some issues, where an Ethernet port just disappeared, not sure why, and then it started to hang and lock up. I requested an RMA and they shipped out a replacement board right away. The other thing you'll notice is that the SATA chipset on this one doesn't have a heatsink, whereas over here it does. Not a huge deal; they probably decided later on that it wasn't needed, but either way I'm wondering if maybe mine was a return or something. The new board seems to be working perfectly well, knock on wood, and all the benchmarks here are based on the new board.

You may also be wondering: why this motherboard, and not just a traditional ITX form factor board? Finding an ITX motherboard that supports at least one 2.5GbE Ethernet port and has at least a second M.2 slot for expansion can actually be difficult, and it gets expensive really fast: you're typically looking at at least $200 just for the motherboard, plus a CPU on top of that, and you don't always even get all the features you need. There are other ITX NAS-oriented motherboards out there, like the CWWK, which is an AMD-based board that comes with a full-size x16 PCIe slot, M.2 slots that get a full two lanes each, support for up to (it claims) nine SATA ports through these connectors, 2.5GbE Ethernet, et cetera. But look at the cost: $423.17 for that one, compared to $125 for the board we're looking at. ASRock Rack also makes some nice server and NAS motherboards, but again, those are difficult to find, and when you do find them they're expensive, at $400 to $600 apiece.

So this is my setup: I've got two sets of six 500GB 7200RPM hard drives, which I'll use to test both the SATA ports and their throughput, as well as the M.2-to-six-port SATA adapter through an M.2 slot on each of these boards. I've got M.2 PCIe SSDs for the boot drives, 16GB in dual channel (two 8GB modules) on the N5105 board, a single 16GB module on the N100 board, the EVGA 500W Gold power supply, and I'll also be measuring power draw through the meter over here.

For testing throughput on the M.2 slots, the USB ports, and the PCIe x1 slot, I needed a couple of fast NVMe M.2 drives, and that's where these come in: the Samsung 970 EVO Plus 2TB and the WD Black SN770 2TB NVMe SSDs. I also have a x16 PCIe adapter for NVMe drives so I can test the PCIe x1 slot, and I stuck it in the x16 slot of my desktop to get a performance baseline for these disks; I'll show you the CrystalDiskMark results here. You can see they're plenty fast to saturate up to a x4 slot, and these slots are more than likely only PCIe x1 anyhow.

When it came to installing the operating systems, there were really no issues with the Linux-based ones; on Ubuntu, OpenMediaVault, TrueNAS SCALE, and Unraid, all the devices were detected and working fine right after installation. Windows 11, on the other hand, had some missing device drivers that I had to hunt down, because there is no support site. I emailed the vendor and they just said the drivers are "on the internet," so a lot of use that was. Most importantly, the Intel i226-V Ethernet controller, the chip used on both motherboards, was not detected, so I had to locate and download the drivers on another PC and load them in via a USB drive before I could even update Windows, because I had no network connection whatsoever until those drivers were installed. Once they were in and Windows Update completed, there were still a couple of missing devices. For the N5105, I found some chipset drivers on Gigabyte's website that cleared those up. For the N100, there was an SMBus device that needed a driver; some scouring turned one up through the Microsoft Update Catalog website, and another came from someone on Ten Forums via a Google Drive download, so of course be a little skeptical, but I did scan it, it came up clean for viruses, and in the end everything worked fine.

I ran a lot of benchmarks on the system components of both motherboards: the CPU, the RAM, both M.2 slots, the PCIe x1 slot that's only on the N100, the four 2.5GbE Ethernet ports, the two USB 3 Type-A ports, and the six onboard SATA ports, mainly checking functionality and general performance across the five operating systems I installed.

First up, the M.2 slots and the PCIe x1 slot on the N100 are definitely only one PCIe 3.0 lane each, as they benched at only about 850MB/s sequential in the CrystalDiskMark, KDiskMark, and hdparm tests, and an actual 1GB file transfer ran at about 800MB/s. For reference, the theoretical maximum for a PCIe 3.0 x1 lane is about 1,000MB/s (8GT/s with 128b/130b encoding works out to roughly 985MB/s before protocol overhead). The two USB 3 Type-A ports on the back actually performed better than the M.2 slots, hitting about 1,000MB/s in the CrystalDiskMark and hdparm tests; the real-world 1GB file transfer results were all over the place, though, with the N100 typically faring better than the N5105 there. As for the four 2.5GbE Ethernet ports, they were all instantly detected at a 2500Mbps link speed.
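Before file copy tests like the ones coming up, it's worth sanity-checking the raw links themselves. A quick sketch using ethtool and iperf3 (the interface name and address are placeholders for your setup):

  ethtool enp2s0 | grep Speed      # should report "Speed: 2500Mb/s"
  iperf3 -s                        # run on the NAS
  iperf3 -c 192.168.1.10 -t 30     # run on a client; roughly 2.3-2.4Gbit/s
                                   # of TCP goodput is right for clean 2.5GbE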
When performing a 10x 1GB file transfer over the network, they all ended up at about 270-280MB/s, which is right where it should be. The only exception was the N5105 under Windows, where write tests only ran at 240MB/s; read performance was similar to the others and at expected speeds. I can't explain the Windows result; it's probably just Windows being Windows.

Looking at the SATA ports, which are important for setting up a NAS, right, the six onboard ports produced some interesting, if not confusing, results. I ran through a number of tests, including single disk, six-disk RAID 0, 12-disk RAID 0, six-disk RAID 6, two simultaneous six-disk RAID 6 arrays, and a RAID 60 setup, and without getting into too much detail, the bottom line is that the N100's SATA ports restrict performance for some reason. Connecting a fast SSD to each SATA port individually, per-port performance was limited to about 420MB/s, which you can see here; this chart shows each of the ports and the average speed over that port. Compare that with the desktop PC shown here, where the same SSD ran closer to about 560MB/s. The weird thing is that port two actually runs faster than the other ports; my thought is that this port is connected directly to the CPU, whereas the other five run through the JMicron JMB585 chipset. Now, 420MB/s should be perfectly fine for a single-disk situation, since hard drives typically don't run faster than 200-250MB/s individually anyhow, but if you're running SSDs you're obviously going to hit that wall and it becomes a bottleneck.

More concerning is that with six hard drives in a RAID 0 array, the array peaked at about 500MB/s using the onboard SATA ports on the N100, whereas the N5105 and the ORICO M.2 adapter offered up over 900MB/s. That's odd, because the N100 and N5105 SATA ports both use the same JMicron JMB585 chipset. For a real-world scenario, look at the 1GB file copy: the N5105 and the ORICO really didn't fare much better than the N100 in that test, so maybe it's not as big of a deal.

Okay, now I'm going to show you the limits of the onboard SATA ports. This is the N100 with the six hard disks connected to the six onboard SATA ports, running OpenMediaVault. Here you can see the six disks, sda through sdf, which are the 500GB 7200RPM hard drives I just showed you, and a quick hdparm test on each ranged from 206MB/s down to 186MB/s, so around 200MB/s on average. I'll go ahead and build a RAID 0 array; you can see the RAID 0 is created, and a cat /proc/mdstat confirms it. An hdparm run against the array ends up at about 515MB/s. Now I'll do a timed copy of the ten 1GB files from the Samsung SSD, over this USB adapter, onto the RAID 0 array, and you can see that came to 16.626 seconds. Take the total of those file sizes, 10,240MB, divided by 16.626 seconds, and that's about 616MB/s. Why that's higher than the hdparm number is probably caching: the first few gigabytes get buffered through RAM, so the copy looks faster.
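For reference, this is roughly what the OpenMediaVault UI is doing under the hood; a sketch, with device names and mount points as placeholders from my setup:

  # stripe the six disks into a RAID 0 array
  sudo mdadm --create /dev/md0 --level=0 --raid-devices=6 /dev/sd[a-f]
  cat /proc/mdstat                  # confirm the array assembled

  sudo hdparm -t /dev/md0           # buffered sequential read test

  # timed copy of the ten 1GB files from the USB-attached SSD
  time cp /mnt/usb-ssd/file{01..10}.bin /mnt/raid0/
  # throughput = 10240MB / elapsed seconds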
So now I'll shut down and switch the disks over to the ORICO M.2 card. Okay, the six hard drives are now connected to the ORICO M.2 add-in card, and I'll run the same tests I did with the onboard SATA ports. First up, hdparm: there you can see an 835MB/s transfer speed, as opposed to the 515MB/s we had previously. Then the file copy test: this time it took 14.138 seconds, which is about 724MB/s.

Here are the six hard drives again, but this time connected to the N5105's onboard SATA ports, running the same tests. A cat /proc/mdstat shows the array is now named md127, and you can instantly see we're getting about 1,000MB/s, a full gigabyte per second, out of the same RAID array on the N5105. The file copy test comes in at 14 seconds, so again about 727MB/s from the 10,240MB.

Now I've got the N100 version of the motherboard with 12 hard drives hooked up to it: six going to the onboard SATA ports and the other six to the ORICO 6-bay card with the ASMedia ASM1166 chipset. I'm going to set this up as a RAID 60, and you can see all 12 disks are detected in OpenMediaVault. First I select RAID 6 and the first six disks (the others are down here) and click save; before approving that, I create another RAID 6 with the additional six drives, click save, and then apply, which generates both arrays at the same time. If you notice, md0 is on the onboard SATA ports and md1 is on the ORICO, and you can already see the ORICO building a lot faster than the onboard ports, because of that roughly 500MB/s limit the onboard ports seem to impose for some reason.

Here you can see, first, a single six-disk RAID 6 build on both the N100, in blue, and the N5105, in pink and purple: the N5105 took 106 minutes, whereas the N100 took 127 minutes. When I built both RAID 6 arrays simultaneously, the N5105 still completed both at the same time, in 106 minutes, while on the N100 the onboard SATA ports took 145 minutes and the ORICO M.2 adapter finished in 114 minutes. So it seems those onboard SATA ports definitely restrict performance during a RAID 6 build as well.

I also tracked CPU usage while the RAID arrays were being built: CPU utilization was at about 50% on the N100, with the 15-minute load average around four, so it was heavily loaded. The N5105 was also at about 50% CPU utilization, with a load average above four that peaked at about 4.5. Even so, system responsiveness was actually pretty decent.
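For anyone who'd rather build that RAID 60 from a shell than the OMV UI, the CLI equivalent is roughly this sketch (device names are placeholders):

  # two six-disk RAID 6 arrays...
  sudo mdadm --create /dev/md0 --level=6 --raid-devices=6 /dev/sd[a-f]
  sudo mdadm --create /dev/md1 --level=6 --raid-devices=6 /dev/sd[g-l]
  # ...striped together into a RAID 60
  sudo mdadm --create /dev/md2 --level=0 --raid-devices=2 /dev/md0 /dev/md1

  # watch both builds (and their relative rebuild speeds) side by side
  watch cat /proc/mdstat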
I set up Unraid with a four-data-disk, two-parity-disk scenario using the 500GB hard drives and ran the initial sync with both the onboard SATA ports and the ORICO M.2 SATA adapter. You can see that the N5105's onboard ports took 53 minutes to complete, the ORICO took 53 minutes on both the N100 and the N5105, but the N100's onboard ports took 77 minutes. Once the initial sync was done, I ran a parity check, which as expected should take about as long as the initial sync: it took 54 minutes on the N5105's onboard ports as well as with the ORICO on the N5105, but on the N100 it took 93 minutes on the onboard SATA, and the ORICO took 60 minutes. I'm not sure why it ran a little longer in this case, but those are the results.

For TrueNAS SCALE, I set up a six-disk RAIDZ2 array, which is effectively a RAID 6, a four-data, two-parity-disk scenario, however you want to look at it, and did a file transfer of 1TB of data over the 2.5GbE network. You can see the results here; I didn't expect these to be far off from each other, because you're only talking about 200-something MB/s. Once the data was in place, I pulled a disk, replaced it, and did a resilver: the N100 with the ORICO M.2 adapter and the N5105 both took 38 minutes to rebuild, while the N100 with the onboard SATA took 47 minutes. So again, the onboard SATA ports are affecting disk performance.

Speaking of system performance, I ran Cinebench R23 before and after a repaste: I removed the heatsink and fan, cleaned off the thermal paste that was on there, and applied a better paste to see if I could get any improvement in temperatures or performance. Here we have the N100, and what's important (the legend is down here) is that CPU utilization didn't really change before and after the repaste; the little dip here is because at the end of an R23 scene render it resets, so CPU utilization can drop briefly and then spike back up when the next pass starts. In any case, the clock speeds didn't change, and the CPU power draw wasn't much different either. The numbers here represent the peak of each graph after the initial boost: right here it indicates 80°C, which was the peak, but the average throughout the test was about 74-75°C before the repaste, and after the repaste it averaged about 55-56°C, so you're looking at roughly a 20°C drop in temperature overall. On the N5105 we ended up with about a 10-12°C drop on average, not quite as drastic, but still an improvement. All the other metrics stayed about the same, so the repaste didn't really affect performance.

The next benchmark uses HandBrake to encode a 4K 60fps, 10-minute-long video down to 1080p 30fps using the default Fast 1080p30 preset, and you can see the results here for both QSV and the CPU; QSV is Quick Sync Video, the embedded GPU acceleration for video encoding. The N5105 took about 39 minutes to encode the 10-minute video, versus 28.6 minutes for the N100; compare that with an i3-10105 at about 16 and a half minutes. Using Quick Sync improves on those times slightly, but you can see this is not a real powerhouse when it comes to encoding.
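For reproducibility, the command-line equivalent of that HandBrake test looks roughly like this (file names are placeholders, and the QSV run assumes the Intel iGPU is exposed to HandBrake on your OS):

  # CPU (software) encode with the default Fast 1080p30 preset
  HandBrakeCLI -i input-4k60.mkv -o out-cpu.mp4 --preset "Fast 1080p30"

  # same encode using Intel Quick Sync Video hardware acceleration
  HandBrakeCLI -i input-4k60.mkv -o out-qsv.mp4 --preset "Fast 1080p30" --encoder qsv_h264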
Speaking of encoding, if you want to use this for a Plex Media Server, it's fine serving up two, three, even four streams simultaneously, as long as they're in their native format and resolution; it didn't seem to have a problem with that. But when it came to transcoding on the fly, there's no way it could handle it: I tried a single 4K 60fps video converting down to 1080p, and it would just buffer, pause, and stutter for a good minute or two before it could even pick up again. So that's not a possibility, but as far as serving regular video files, it's not a problem at all.

I also tracked power usage at idle and under load. Both versions of the motherboard idled at about 20W when configured with the onboard CPU, 16GB of RAM, a single M.2 NVMe drive, and a single Ethernet port connected, using the 500W EVGA Gold PSU. Under load, the N100 would typically run 35-40W, whereas the N5105 would peak at about 30W.

So in the end, what does this all mean? If you're looking for a budget build primarily for storing and serving files, either option seems to check all the boxes, as I was able to successfully connect up to 12 disks with the use of an add-in card, without issue, on both versions. You get a low-voltage four-core / four-thread Intel CPU, four 2.5GbE Ethernet ports, two M.2 slots, two 10Gbps USB ports, and six SATA ports, and in the case of the N100 you also get a x1 PCIe 3.0 slot. In both cases the CPU is more than adequate for managing software RAID configurations, including a 12-disk RAID 60 with mdadm under Ubuntu and OpenMediaVault, as well as a pool built from two six-disk RAIDZ2 vdevs in TrueNAS SCALE, although the maximum supported RAM of only 16GB might become a limiting factor, especially with TrueNAS, depending on which features you enable.

I'm mostly excited about this board because there are unfortunately not a lot of full-featured NAS boards in the ITX form factor, and the ones you can find are usually quite a bit more expensive. Since this is a generic brand, don't expect any support or warranty with these boards, so you kind of have to take a leap of faith, but at $125 it's affordable and makes use of name-brand components like the Intel CPU, the JMicron SATA controller, and the Intel i226-V 2.5GbE Ethernet controller. In my testing, other than the N100 board I had to exchange, both boards handled their workloads without any other issues once the replacement arrived. The biggest concern was the N100's onboard SATA ports underperforming for some reason, despite using the same JMicron JMB585 controller chip as the N5105; everything worked, it just seemed to limit performance a bit in most scenarios. Maybe it's just this particular board maker that has issues, as there are others available on AliExpress. Otherwise, the N100 does offer the added benefit of a PCIe slot, if that's important to you. I actually plan on making use of the N5105 version of this motherboard in a home NAS project I'm working on, so hopefully I can provide some long-term feedback down the road. I guess that means I should tell you to like and subscribe if you want to be notified of that video in the future. I hope you found this helpful; until next time, talk to you later.
Info
Channel: HTWingNut
Views: 133,624
Id: PO8Kfi4qpY8
Length: 29min 43sec (1783 seconds)
Published: Thu Mar 14 2024