I’ve been trying to make my computers quieter for nearly three decades. Custom liquid cooling loops, magnetically-stabilised fluid-dynamic bearings, acoustic dampeners, silicone shock absorbers, you name it. Well, last week I finally managed to build a completely silent computer. Without further ado…
Say hello to the Streacom DB4 — a 26x26x27cm case that doesn’t have a single fan. Indeed, it doesn’t have any moving parts at all. It’s totally silent — 0dB.
If you strip away the top and four (13mm-thick extruded aluminium) side walls, you see a minimal chassis, and a central mounting plate for a mini-ITX motherboard (oriented with I/O pointing down through the bottom of the case).
At the time I selected components for the system, there were only four mini-ITX motherboards for me to choose from:
- ASUS ROG Strix B350-I Gaming
- Gigabyte AB350N-Gaming-WiFi ITX
- MSI B350I Pro AC
- ASRock Fatal1ty AB350 Gaming-ITX/ac
(Astute readers will notice they are all AMD (Socket AM4) motherboards. The whole Meltdown/Spectre debacle rendered my previous Intel system insecure and unsecurable so that was the final straw for me — no more Intel CPUs.)
I ended up getting the ASRock AB350 Gaming-ITX/ac motherboard:
Although any mini-ITX motherboard can — theoretically — be mounted in the DB4, the whole case is designed to be passively-cooled by using heat pipes to transfer the heat generated by the CPU and GPU to the side panels where it can be radiated and convected away. Careful analysis of the routes the CPU heat pipes would need to take, and clearances required by them, revealed that certain motherboards simply wouldn’t work — components were in the way.
- The Gigabyte has the ATX power connector at the top of the board for some reason, so that was a massive, insurmountable obstacle.
- The Asus has a solid bank of SoC VRM caps that the heatpipes would have literally rested on. Anyone who knows anything about capacitors and heat knows that would have been a recipe for disaster.
- The MSI has a huge SoC VRM heatsink that would have posed an insurmountable obstacle to one (maybe even two) of the heatpipes.
The ASRock was the only motherboard that could fit the DB4 and (optional) LH6 Cooling Kit heat pipes without much in the way of drama. All of that will probably make a lot more sense when you see the heatpipes installed:
To fully appreciate the minute clearances involved, here it is from another angle:
Yep, literally fractions of a millimetre of clearance in some places.
The DB4 comes with the hardware necessary to shift heat from the CPU to one of the side panels via four heatpipes and a single heat spreader. In this configuration a 65W CPU can be supported. By adding the LH6 Cooling Kit, you can connect the CPU to two side panels via six heatpipes and three heat spreaders, and support a 105W CPU.
In such a passively-cooled system, the heat dissipation figures limit the CPUs that can be installed. For reference:
- Ryzen 5 2400G 4C8T 3.6GHz — 46-65W
- Ryzen 5 1600 6C12T 3.2GHz — 65W
- Ryzen 5 1600X 6C12T 3.6GHz — 95W
- Ryzen 7 1700 8C16T 3.0GHz — 65W
- Ryzen 7 1700X 8C16T 3.4GHz — 95W
- Ryzen 7 1800X 8C16T 3.6GHz — 95W
So a stock DB4 can only support up to a 2400G/1600/1700 — forget overclocking — whilst a DB4+LH6 can support even a 1600X/1700X/1800X — with a little bit of room for overclocking.
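If you like to sanity-check this sort of thing, the TDP filtering above is trivial to express in a few lines of Python (TDP figures are from the list above; the limits are Streacom's stated values):

```python
# Which Ryzen CPUs fit each cooling configuration, going by TDP alone.
cpus = {
    "Ryzen 5 2400G": 65,   # configurable 46-65W; worst case used here
    "Ryzen 5 1600": 65,
    "Ryzen 5 1600X": 95,
    "Ryzen 7 1700": 65,
    "Ryzen 7 1700X": 95,
    "Ryzen 7 1800X": 95,
}
limits = {"stock DB4": 65, "DB4+LH6": 105}

for config, max_tdp in limits.items():
    fits = [cpu for cpu, tdp in cpus.items() if tdp <= max_tdp]
    print(f"{config} ({max_tdp}W): {', '.join(fits)}")
```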
Unlike Intel — who only support their sockets for as long as it takes you to unwrap the box — AMD supports their sockets for much longer. The AM4 will be supported until 2020. Thus my cunning plan was to start off 2018 with a CPU that can be comfortably cooled by the DB4+LH6, overclock, stress test and monitor thermals for a couple of years, then — if the advantages would be tangible and I feel the need — throw in a more efficient CPU when the last AM4 CPUs come off the production line in 2020, then cruise for the next half-decade or so.
All of that led me to install a 65W Ryzen 5 1600. Since I have a B350 motherboard, I have the ability to overclock the CPU to 1600X/95W levels without much of an issue.
Note: If you are happy sitting within the 65W thermal envelope, and are not overclocking anything, you could forgo the LH6 Cooling Kit. Because the DB4 heat pipes are shorter than the LH6 ones, and don’t go over the edge of the motherboard, pretty much all of the component obstruction issues that eliminated the Gigabyte, Asus and MSI motherboards from consideration would no longer apply. Something to keep in mind if you don’t need the speed but do want some of the features that one or more of those boards may have (but which the ASRock does not).
As far as memory goes, I went with a Corsair Vengeance LPX CMK32GX4M2Z2400C16 32GB (2x16GB) DDR4 kit.
I’ve never had a problem with Corsair Vengeance LPX RAM. This specific kit was on the QVL for the motherboard and an overclocker was able to push his kit to 3200MHz on exactly the same motherboard as I have, so I was confident that I could get a nice memory overclock with minimal effort — subject to the silicon lottery, of course. Since this machine isn’t for gaming, and isn’t running an APU, really high memory speeds aren’t as important to me as large amounts of RAM.
SSDs are the only totally silent storage option, and I got rid of my last HDD more than seven years ago, so this system was always going to have SSDs. The only question was “Which ones?”
Since the motherboard has an M.2 slot on the back, I decided to go with a 1TB Samsung 960 Evo NVMe for the main drive and a 1TB Samsung 860 Evo SATA for the backup drive.
I would have preferred two NVMe drives (to cut down on cable clutter), but the ASRock motherboard only has one M.2 slot. The Asus motherboard, on the other hand, has two slots but (as mentioned before) that is not compatible with the LH6 Cooling Kit. Ah well — compromises of some sort often need to be made.
For what I will be doing with this computer, fast transfer rates and a life expectancy of at least seven years are what I am after from these drives. I only really need ~600GB of space, so by over-provisioning a couple of hundred gigs I can let wear-levelling do its thing and make seven years an easy target to hit.
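For the curious, here is the rough arithmetic behind that seven-year claim as a sketch (the endurance rating and daily write volume below are illustrative assumptions, not measured figures):

```python
# Rough endurance arithmetic for a 1TB drive with ~600GB of live data.
# The rated endurance and daily write volume are assumptions for illustration.
drive_capacity_gb = 1000
data_needed_gb = 600
rated_endurance_tbw = 400     # assumed terabytes-written rating
daily_writes_gb = 50          # assumed average write volume

spare_gb = drive_capacity_gb - data_needed_gb
print(f"Spare area for wear-levelling: {spare_gb}GB "
      f"({spare_gb / drive_capacity_gb:.0%} of the drive)")

years_to_exhaustion = (rated_endurance_tbw * 1000) / (daily_writes_gb * 365)
print(f"~{years_to_exhaustion:.0f} years to reach the rated endurance")
```

Even with pessimistic numbers plugged in, the spare area gives the wear-levelling algorithm a lot of room to spread writes around.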
Even though this system is not meant to be a gaming rig, there’s no harm in putting in the best GPU you can without blowing the thermals. The GPU Cooling Kit allows up to a 75W GPU to be modded and cooled via heat pipes and spreader to a single wall. That pretty much limits you to the GTX 1050 Ti and below if you prefer Nvidia cards — like I do.
The GPU I wanted was the MSI GeForce GTX 1050 Ti Aero ITX OC 4GB but my parts supplier ran out of them literally as I was assembling my online order. With no idea when supplies would be restored (thanks to the cryptocurrency mining craze), I went with my second preference of an ASUS Phoenix GeForce GTX 1050 Ti 4GB:
Whilst both GPUs fit into the space, the MSI was a few centimetres shorter than the Asus. None of the dual-fan 1050 Ti GPUs had even the remotest chance of fitting.
After removing the fan, shroud and heatsink I cleaned up the GPU itself, applied fresh thermal paste, then fitted the GPU Cooling Kit:
The final step was to pop heat sinks onto each of the four VRAM chips:
Power testing of a wide range of 1050 Ti cards reveals that they do indeed pull the full 75W when under load, so I’m at the limits of the GPU Cooling Kit and there’s no room for overclocking the GPU (even if I wanted to).
To power all this I installed a Streacom ZF240 Fanless 240W ZeroFlex PSU:
I researched the power draws of the various components and worked out that the power budgets on all rails — except the 12V rail — had plenty of headroom. The 12V rail can theoretically hit ~85% of the 168W max capacity if both the CPU and GPU are running at 100%. Normally I prefer to leave myself a lot more headroom than that, but since this system is not meant for gaming, and I can’t actually think of any other scenarios where I’m likely to max out both at the same time, I’m not really concerned. (If it does become an issue then I can install a SFX PSU with minimal effort and buy myself more headroom.)
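Here is the back-of-the-envelope 12V budget, sketched out (the 5W miscellaneous figure is an assumption):

```python
# Worst-case 12V rail load for this build (the 5W misc figure is assumed).
rail_12v_max_w = 168   # ZF240 12V rail capacity
cpu_max_w = 65         # Ryzen 5 1600 TDP
gpu_max_w = 75         # GTX 1050 Ti board power
misc_12v_w = 5         # assumed draw from drives, VRM losses, etc.

load_w = cpu_max_w + gpu_max_w + misc_12v_w
utilisation = load_w / rail_12v_max_w
print(f"Worst case: {load_w}W of {rail_12v_max_w}W ({utilisation:.0%})")
```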
Over the years I’ve also come to appreciate PSU efficiency curves, and recognise that ‘idling’ systems with large PSUs is a horrible waste of energy. To get the most out of your PSU you should size it so that your typical usage falls in the 25-75% range. The ZF240 has an efficiency rating of 93% and I think my selection of components will let it achieve such levels on a regular basis — given my historic and anticipated usage patterns.
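A quick sketch of the sweet-spot check (the scenario draws are rough assumptions, not measurements):

```python
# Does typical usage land in the 25-75% efficiency sweet spot of a 240W PSU?
# Scenario draws are rough assumptions, not measurements.
psu_w = 240
low, high = 0.25 * psu_w, 0.75 * psu_w   # 60W to 180W

scenarios = {"idle": 40, "desktop use": 70, "CPU pegged": 110, "CPU+GPU pegged": 150}
for name, draw_w in scenarios.items():
    verdict = "in" if low <= draw_w <= high else "outside"
    print(f"{name}: {draw_w}W is {verdict} the sweet spot")
```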
Low power consumption is an especially important issue if you plan on going off-grid. Since that’s a goal we have in the 2–4 year time frame, and this computer will be used much longer than that, it makes sense to aim for high efficiency and low power consumption at the same time.
Final remarks…
The pursuit of silence can be costly and this build certainly was — ending up just shy of AU$3,000. If cryptocurrency miners weren’t inflating prices all over the place, it probably could have come in closer to $2,400 — still a fair bit, but not eye-watering. Nonetheless, the price is less than each of my last three systems and it manages to achieve what none of them ever did: Complete and utter silence.
This computer makes no noise when it starts up. It makes no noise when it shuts down. It makes no noise when it idles. It makes no noise when it’s under heavy load. It makes no noise when it’s reading or writing data. It can’t be heard in a regular room during the day. It can’t be heard in a completely quiet house in the middle of the night. It can’t be heard from 1m away. It can’t be heard from 1cm away. It can’t be heard — period. It’s taken nearly 30 years to reach this point, but I’ve finally arrived. The journey is over and it feels great.
If you are after a silent — not just quiet, but silent — daily driver, then I strongly recommend a passively-cooled case, heat pipes and solid state drives. Eliminate the moving parts (e.g. fans, HDDs) and you eliminate the noise — it’s not that complicated. It also doesn’t need to be really expensive (my system requirements were not ‘average’ so please don’t infer from this post that all DB4-based systems are as expensive). Silence (and a perfectly respectable computer) can easily be had for half the price.
That’s about it, methinks. If you have any questions or would like more details (about any aspect of this build) to be added to the post, fire away in the comments.
Cheerio!
- Update (2018-04-25): Does Pinnacle Ridge change anything?
- Update (2018-05-01): Passively-cooled CPU Thermals
- Update (2018-05-15): Passively-cooled CPU Thermals (Part 2)
Really nice build and write up! I’ve grappled with PC noise for years, opting for watercooling systems to reduce noise. Fully passive is something I have never tried but this has got me thinking now!!!
Thanks! I grappled with noise for decades, so I know what you mean. At the end of the day the only true fix is to eliminate moving parts. In 2018 passive cooling and SSDs make it possible. So long as you aren’t trying to build a high-end (hot) gaming rig, a system like mine should serve you well. Thanks to cases like the Streacom DB4, you don’t need to be an elite hacker with mad modding skillz either. 😉
I hope you can give a silent system a shot. Fair warning though: Once you’ve used a completely silent system for a while, you’ll never go back.
Hey TP69!
Great article and a big thank you for sharing how you managed to make a completely silent PC. I haven’t been at it as long as you, trying to find the ‘perfect’ solution, but I have been searching for an article like yours for a couple of years.
You mention that your build isn’t for gaming. I am not a hardcore gamer, but I occasionally enjoy booting up Windows and playing some games. I have found all the parts you have in your build, and was wondering if the cooling of the GPU would be sufficient for some rare (2-3h) gaming sessions? I am not planning on overclocking the GPU.
If I manage to get this working, I will then migrate my ‘home-lab’ to complete silence as well.
Great stuff, and thanks again for this article
Chess
I’m glad you found it useful.
The limitations with gaming have less to do with how long you can play and more to do with how graphically demanding the games are. A single case wall can dissipate all the heat that a 75W GTX 1050 Ti can produce when running flat-out — so if your game (with settings that are acceptable to you) runs fine on a 1050 Ti, then you can play it all day long on a DB4.
If you don’t already have a 1050 Ti (or equivalent) then I’d suggest going to YouTube and doing a search for “1050 Ti benchmarks”. You might get lucky and someone may have already benchmarked the game(s) you play. Benchmarks normally display the settings they used, so it can give you an idea about what to expect if you go down this route and help you decide if it’s worth doing or not.
To be clear: I didn’t say that this system can’t be used for gaming. I just said that I — personally — won’t be using this system for gaming. Any game that can be played on a 1050 Ti can be played on a DB4. That’s actually a huge number of games. 🙂
So I finally got the new fanless server up and running: i7-8700, Nofan CR95 passive heat sink, Thermaltake P1 case. Ambient temp in my equipment closet is ~30°C and the rig idles at 43°C. I just finished a 12-hour test of Prime95 and it leveled off at a 73°C max for basically the entire test. The CR95 is a great piece of kit – gets hot to the touch but not insanely so. 15k in Passmark and zero noise … I am loving it.
Congrats! Yes, the CR95 sure is a nice — huge, but nice. One underrated thing about completely passive cooling solutions that doesn’t get mentioned a lot is the peace of mind that comes from knowing your cooling system is effectively bullet-proof. No moving parts, nothing to wear out or break down, nothing to clean or maintain, leak-proof, etc., etc. Truly set and forget. Great for unattended servers.
https://hackaday.com/2018/05/22/this-computer-is-as-quiet-as-the-mouse/
Cheers mate!
Beautiful case, thanks for outlining the project so well. It inspired me to build my own,
I’m almost done with the build, but for whatever reason the video isn’t giving me a signal. A couple questions:
1) Did you plug an eight pin connector into the motherboard, or just stick with the four pin power connection; and
2) Did video output give you any trouble?
I bought a second mobo, both are behaving the same, indicating to me I’m doing something wrong.
I plugged the GPU into another system and got video, so I don’t believe it’s that. No CPU pin bends that I noted.
I think I’ll test against a different set of memory (bought the Corsair pair you’d spec’d out), and different GPUs if they’ll be willing. I’ve done about half a dozen builds, but this is my first mini-ITX board, and I’m not sure where I’m going wrong here.
I’m glad it gave you some inspiration — not so happy to hear you’re having issues.
The ZF240 PSU only has a 4-pin connector (not an 8). It is plugged into the motherboard as shown in this image (which you should be able to click a couple of times for the high-resolution version):

(Sorry about the angle, it’s really hard to get a phone and light in there now that it’s closed up.)
I have a BenQ XL2720Z (27″, 144Hz) display that I have connected to three different systems, so the 1050 Ti is connected to the monitor using DVI (simply because the monitor’s DP and HDMI ports are already taken). The video signal has always been picked up by the monitor — as long as DVI is specified as the source on the monitor, of course.
To be honest, I haven’t tried any of the other output ports on the card.
PS: I just rebooted and poked around in my UEFI BIOS for a bit. About the only thing I saw in there that might be responsible for no video output was the Fast Boot mode (in the Boot menu). Make sure that’s Disabled. I seem to have vague recollections that only very specific combinations of hardware and OS are compatible with that option, and mine wasn’t one. No video was supposedly a symptom if you enable it with an incompatible setup.
Hi
How about your GPU temperature?
Thank you
I haven’t had a chance to do thorough GPU testing so far. What I do know is that the GTX 1050 Ti scores ~9200 in glmark2 and that when I peg the GPU at 100% with pure OpenGL workloads (simulation visualisations) the GPU temperature tends to stabilise in the 47–49°C region in a room with an ambient temperature of 20°C.
So at this point I’d say 100% GPU loads reach equilibrium temperatures of Ambient + 28°C (±1°C).
That’s about the best info I can give you right now.
I am not an extreme gamer, but want higher performance than a standard desktop, which is what the performance envelope for this case seems to be. 60 fps with full game display settings is all I ask 🙂 Even if I go with the 65W CPU (Intel Core i7-6700), and went with a smaller external GPU like the GTX 1050 (75W) it would put me over the limit of 110W for the case. And that is if I could find the power from their Zeroflex 240W PSU.
I like the concept of a fan-less passively cooled computer. I’d like to see them build a high performance model. I think they could do that by increasing the surface area of the case by making the case size larger or the fins larger. I would add a fanless SFX PSU like the Silverstone NJ450-SXL 450W PSU. They could make an adapter for the Silverstone that would take the top piece off of the Silverstone case and put on a heat sink with pipes to the side of the computer case, similar to what they did with their LH6 CPU cooling kit.
I would then go with the higher performance components such as the Intel Core i7-8700K CPU, ASUS ROG Strix B250I Mobo and GeForce GTX 1080 GPU. I will continue to monitor their product line and have sent this feedback to Streacom. Thanks.
The case doesn’t have a 110W limit. Not sure where you got that information from. With the optional LH6 the CPU has a 105W limit. With the GPU Cooling Kit installed, the GPU has a 75W limit. Combined you get 105+75=180W.
Careful with your wishes for an 8700. Intel lied about the TDP — it’s nowhere near 65W. Hardware Unboxed has more details:
Do an image search for “Streacom DA4”. Note DA not DB. The DA is the actively-cooled version which already exists as a prototype. If you want a hotter/higher performance/noisier system that might be a better choice. No idea if/when it will actually enter production.
Here is what Streacom says on their website on their System Build Guide page. It’s under the CPU tab so it may refer to only that, but the language implies a case limit. I could be misreading it:
“Below are the maximum recommended TDP values for our passive cooled cases. For our non passive cases (F1C, F7C) the max TDP will depend on which CPU cooler is purchased and of course the power supply limitations.
| Model | FC5 | FC8 | FC9 | FC10 | DB4 | DB4+LH6 |
| --- | --- | --- | --- | --- | --- | --- |
| Max TDP | 73W | 95W | 73W | 95W | 65W | 110W |
Apart from the maximum cooling performance of the case, you will also need to consider the power output limits of the PSU.”
http://www.streacom.com/support/system-build-guide/
Streacom responded via email to my feedback regarding a high performance system:
“We have been working on larger cases that support higher TDP components for some time, but unfortunately our approach to fanless cooling does not scale up very well and you end up with some very heavy impractical solutions, added complexity for evenly distributing the heat and of course cost. Another issue is that whilst you can cool higher TDP CPUs and GPUs, on these higher performance components, there are also the secondary components that need cooling which are not so easy to create universal mountings for. We did show a prototype last year at Computex, the DB6, which was capable of handling a 1060 and accepted full size components, but that case is currently on hold until we work out some of the limitations. Ultimately we feel that without the support of hardware vendors creating components that are specifically designed for fanless cooling, we won’t be able to bring higher TDP fanless cooling to the market without a lot of limitations.”
On the page that you linked to the TDP being discussed is the CPU TDP. In the LH6 manual (http://www.streacom.com/downloads/ug/db4-lh6-kit-160826.pdf) it says “total cooling performance [of stock+LH6] can reach 105W* of heat dissipation.” Not sure why that 5W discrepancy exists but I’ll never get there so it doesn’t matter to me. 🙂
It all makes sense when you think of each case wall as a heat sink with a maximum dissipation capacity of about 65–75W. If you only bond an internal heat source to one wall (using 4 heat pipes), then that’s your limit. If you bond to two walls, then your limit is ~140W. If you bond to three walls, then your limit is ~210W. The trick is actually doing the bonding.
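In code form, the per-wall model looks like this (using an assumed ~70W midpoint of the per-wall range):

```python
# Each bonded wall modelled as an independent ~70W heat sink (midpoint of the
# 65-75W per-wall range discussed above).
per_wall_w = 70

for walls_bonded in (1, 2, 3):
    capacity_w = walls_bonded * per_wall_w
    print(f"{walls_bonded} wall(s) bonded: ~{capacity_w}W")
```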
In any case, I’ve had this running for months now so I can vouch from personal experience that with DB4+LH6+GPU heat pipes bonded to three walls the cooling can happily handle ~150W total system draw (75W used by the GPU and 75W by the CPU, drives, and the rest of the system). It’s not theory — I do it nearly every day. If 105/110W was a ‘total’ limit, my system would have melted months ago and I wouldn’t be able to write this response.
Streacom’s response regarding a higher performance system makes sense. It’s really hard to do in a small form factor for a reasonable price. Physics just doesn’t care about ambitions or desires. Another guy tried to get an 8700 working in a DB4 over on the Small Form Factor Forum and failed. High performance, passive cooling, silent, small, and affordable are terms that don’t naturally occur in the same sentence.
I had a similar problem with my build, so it was good to see Tim’s response that the 12V worked OK with just 4 pins.
In the end I traced the issue to the CPU not being seated properly. I guess the cooling arrangements put a lot of mechanical stress onto the CPU, so just because it was well seated when first dropped in doesn’t mean it’s still where it’s supposed to be after adding all the heat pipes and pulling things into place for efficient heat transfer.
Along the way I tried another graphics card that I pulled from an old PC, which at least gave me some faith that it wasn’t a problem with my new graphics card.
Getting everything working out in the open on a bench turned out to be a sensible course of action before putting the system back together in the DB4 case.
An over-tightened CPU cooler — now that’s something I wouldn’t have ever suspected! Then again, when I tighten up coolers I lay a small ruler across the motherboard and stop tightening at the first clear sign of deflection — so I doubt any of them could ever be tightened to the point of causing a connection/seating problem.
Thanks for the troubleshooting tip Chris!
I kept meaning to circle back and say thank you for taking the time to photograph the GPU power plug. It removed a question mark for me during the build process. I ended up paying a few extra bucks to bench test a few parts, and against the odds, the problem on my build ended up being two bad motherboards in a row. After figuring that out and finishing the build, I use this as my daily work machine.
I love that this build makes less noise than my co-worker’s MacBook Pro.
Glad the photo was useful, but two bad motherboards in a row — what are the chances of that?!? The important thing is that you’ve got it all sorted now and are enjoying it. Well done!
Wonderful review thank you.
I am genuinely envious. I’ve been coveting that beautiful DB4 since it came out.
The let-down for me, apart from the price point the case comes in at, is the restricted PSU options.
If the case was simply taller and I had room for a full ATX fanless PSU I’d be there. Even just an aftermarket range of PSUs from the likes of Seasonic or Silverstone.
Hi Tim,
The first picture makes it seem like it’s pretty much a box. Could you seal it completely and then fill it with something inert that transfers heat slightly better than air does? If there are no fans then anything inert should do. Maybe some kind of oil? It might not make that much of a difference but could only improve its heat properties, right?
Removing anything would be a pain though (all oily).
Yeah — it still brings a smile to my face. 🙂
I’ve pegged both CPU and GPU many, many times and the Streacom ZF240 PSU hasn’t missed a beat. It is simply not possible for my system to draw enough power to reach any of the current limits on any of the rails. Very happy with the ZF240 so far and see no reason to have a bigger PSU.
That said, you can actually fit an SFX power supply in the DB4. Both Silverstone and Seasonic make them. The problem is that I don’t think either Silverstone or Seasonic make fanless SFX power supplies. The ones they do make tend to turn the fan on at ~30% load. If you installed a 450W+ SFX then the fan would only kick in at ~135W+ draw — which isn’t possible with my setup. So even though you’ve got a fan, it shouldn’t turn on, and the whole system should remain silent.
Food for thought.
Most of the heat (generated by the CPU and GPU) is transferred directly to the walls via heatpipes. You don’t actually want that heat coming back from the walls to the CPU/GPU/motherboard, so having an (insulating) air gap between the (hotter) walls of the case and the (cooler) internal components is actually a good thing. Filling the case with something like mineral oil would thermally bridge everything and be self-defeating, I think.
In my thermals post you can see more images that show the vents on both the bottom of the case and on the side panels. The (relatively) small amount of heat generated by other components (like VRMs and memory controllers/chips) warms up the air inside the case to around ~40°C on most days, before it passively vents out of the top of the case. That’s quite a reasonable temperature (for a passively-cooled system) and no cause for concern.
When I’m hammering the CPU or GPU the internal air temperature can get over 50°C. The only times I’ve seen it above 60°C were during torture tests when I pegged both CPU and GPU at 100% for hours on end — something that never happens during normal use.
Since the system is already being adequately cooled, I’m happy with the current setup and am not looking to change it.
If I was concerned about the internal heat, and did want to improve the passive ventilation, then all I’d probably do is mod the plastic cover that sits on top of the case or replace it with a rigid perforated screen/mesh. That would do the trick of lowering the internal temperature by a few degrees.
Anyone keen to experiment with sealing the DB4 and filling it with oil should be aware that the system already weighs ~10kg. Adding ~16kg of oil would bring the total to ~26kg. That’s a lot! The shelf my DB4 currently sits on can’t even handle that sort of weight, so I’d have to buy/build a reinforced stand to support it if I wanted to try that experiment.
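The weight estimate comes straight from the case dimensions (this uses the external dimensions, so it slightly overstates the usable volume):

```python
# Oil-fill weight estimate from the external case dimensions. Real usable
# volume is lower once the internal components are subtracted.
w_m, d_m, h_m = 0.26, 0.26, 0.27
volume_l = w_m * d_m * h_m * 1000        # ~18.3 litres
oil_density_kg_per_l = 0.9               # typical for mineral oil

oil_kg = volume_l * oil_density_kg_per_l
print(f"~{oil_kg:.0f}kg of oil on top of the ~10kg system")
```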
Do you think with this build I would be able to use Gentoo and compile programs comfortably? The biggest program being Firefox. Or do you think compiling programs is too stressful for this case?
Anthony, I run a simulation on this machine that pegs all 12 threads at ~100% load for 4+ hours on an almost-daily basis. The CPU gets to ~60°C (~39°C above the ambient room temperature of ~21°C) and does not come anywhere close to thermally throttling. The case is more than capable of dissipating all of the heat that a Ryzen 5 1600 can generate under 100% load. In fact, with the LH6 Cooling Kit installed, it’s a walk in the park. There’s plenty of headroom for a faster processor (as discussed in subsequent posts) or some overclocking. Heat is not a problem.
The 960 Evo NVMe is a beast of an SSD. I’ve never had anywhere near as much IO capacity on a personal machine before. Loving it. A sweet choice for devs.
If you want to know how quickly this thing will compile an open source project, I’m happy to conduct a test. I’m running Ubuntu 18.04 LTS, but don’t really want to mess with my build environment too much, so would be willing to clone and compile a git repo that you choose and report back times/temps. If that would help, let me know the details (e.g. what repo, the full compiler command).
Alternatively, the Ryzen 5 1600 has already been benchmarked compiling various things (like the kernel at https://www.servethehome.com/amd-ryzen-5-1600-linux-benchmarks-and-review-we-like-this-one/) so you can just check those out. This system’s performance should be very similar.
Summary: The passive cooling that the DB4+LH6 combination delivers does not limit a Ryzen 5 1600 in any way that I have been able to detect. I’m 100% confident that any load you can throw at a 1600 in an actively-cooled case you can also throw at a 1600 in a DB4 — provided the ambient room temperature is maintained at ‘reasonable’ levels. The optional LH6 is not even needed, in my opinion — stock 1600s just don’t generate that much heat. A DB4 with a Ryzen 5 1600 and NVMe SSD makes for a nice development system — assuming that 6 cores @ 3.2GHz is enough for your particular workload. The same would apply for a Ryzen 5 2600 (6 cores @ 3.4GHz).
Any chance you will start a business and sell these?
No chance at all. Everyone’s ears are different. What is totally silent to one person may not be to another. Age is a big factor. Other people’s acoustic environments are also variables beyond my control. Thus there is no way I can assemble and test a silent system, put it in a box, ship it, and guarantee that it will subjectively behave exactly the same way for a customer as it does here for me.
All I can do is document how this particular silent system is working for me, answer any questions people may have, and encourage them to give it a try if they have the motivation and the means.
The AB350 mobo has an 8-pin connector to power the CPU. However the ZF240 only has a 4-pin connector. I have connected the 4-pin into the 8-pin socket and things are working, but I wonder how you solved this problem.
Fabien, the 8-pin EPS socket on motherboards is actually just a pair of 4-pin sockets side-by-side. Each 4-pin socket can be used to provide up to 144W to the motherboard. That is in addition to the power coming in over the 24-pin ATX connector. If you are installing ‘reasonable’ components in a DB4 with ‘reasonable’ power draws, then a single 4-pin connector is fine, and there is no ‘problem’ to be solved. A GTX 1050 Ti, for example, only draws 75W. A single 4-pin connector in the EPS socket more than covers that.
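The 144W figure falls out of the connector’s pin ratings (the ~6A per-pin rating is a typical Mini-Fit Jr figure and varies by terminal, so treat this as approximate):

```python
# Where the 144W-per-4-pin figure comes from: two 12V conductors per plug,
# each rated around 6A (a typical Mini-Fit Jr figure; exact ratings vary).
volts = 12
amps_per_pin = 6
pins_12v_per_plug = 2   # a 4-pin EPS plug has two 12V and two ground pins

watts_per_plug = volts * amps_per_pin * pins_12v_per_plug
print(f"One 4-pin EPS plug: ~{watts_per_plug}W")
print(f"Both plugs of an 8-pin: ~{watts_per_plug * 2}W")
```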
As mentioned in a previous comment, I just plugged the 4-pin in as shown:

It would actually be pointless to put a power supply into a DB4 that has 2×4 or 1×8-pin EPS power connectors. If you installed components that used that much power (over 288W), there is no way the case could get rid of all of the heat, temps would shoot up, and your components would fry.