I’ve been trying to make my computers quieter for nearly three decades. Custom liquid cooling loops, magnetically-stabilised fluid-dynamic bearings, acoustic dampeners, silicone shock absorbers, you name it. Well, last week I finally managed to build a completely silent computer. Without further ado…
Say hello to the Streacom DB4 — a 26x26x27cm case that doesn’t have a single fan. Indeed, it doesn’t have any moving parts at all. It’s totally silent — 0dB.
If you strip away the top and four (13mm-thick extruded aluminium) side walls, you see a minimal chassis, and a central mounting plate for a mini-ITX motherboard (oriented with I/O pointing down through the bottom of the case).
At the time I selected components for the system, there were only four mini-ITX motherboards for me to choose from:
- ASUS ROG Strix B350-I Gaming
- Gigabyte AB350N-Gaming-WiFi ITX
- MSI B350I Pro AC
- ASRock Fatal1ty AB350 Gaming-ITX/ac
(Astute readers will notice they are all AMD (Socket AM4) motherboards. The whole Meltdown/Spectre debacle rendered my previous Intel system insecure and unsecurable so that was the final straw for me — no more Intel CPUs.)
I ended up getting the ASRock AB350 Gaming-ITX/ac motherboard:
Although any mini-ITX motherboard can — theoretically — be mounted in the DB4, the whole case is designed to be passively-cooled by using heat pipes to transfer the heat generated by the CPU and GPU to the side panels where it can be radiated and convected away. Careful analysis of the routes the CPU heat pipes would need to take, and clearances required by them, revealed that certain motherboards simply wouldn’t work — components were in the way.
- The Gigabyte has the ATX power connector at the top of the board for some reason, so that was a complete non-starter.
- The Asus has a solid bank of SoC VRM caps that the heat pipes would have literally rested on. Anyone who knows anything about capacitors and heat knows that would have been a recipe for disaster.
- The MSI has a huge SoC VRM heatsink that would have posed an insurmountable obstacle to one (maybe even two) of the heatpipes.
The ASRock was the only motherboard that could accommodate the DB4 and (optional) LH6 Cooling Kit heat pipes without much in the way of drama. All of that will probably make a lot more sense when you see the heat pipes installed:
To fully appreciate the minute clearances involved, here it is from another angle:
Yep, literally fractions of a millimetre of clearance in some places.
The DB4 comes with the hardware necessary to shift heat from the CPU to one of the side panels via four heatpipes and a single heat spreader. In this configuration a 65W CPU can be supported. By adding the LH6 Cooling Kit, you can connect the CPU to two side panels via six heatpipes and three heat spreaders, and support a 105W CPU.
In such a passively-cooled system, the heat dissipation figures limit the CPUs that can be installed. For reference:
- Ryzen 5 2400G 4C8T 3.6GHz — 46-65W
- Ryzen 5 1600 6C12T 3.2GHz — 65W
- Ryzen 5 1600X 6C12T 3.6GHz — 95W
- Ryzen 7 1700 8C16T 3.0GHz — 65W
- Ryzen 7 1700X 8C16T 3.4GHz — 95W
- Ryzen 7 1800X 8C16T 3.6GHz — 95W
So a stock DB4 can only support up to a 2400G/1600/1700 — forget overclocking — whilst a DB4+LH6 can support even a 1600X/1700X/1800X — with a little bit of room for overclocking.
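The selection logic above can be sketched as a simple filter over the CPU list. This is just an illustration using the TDP figures quoted in the post (taking 65W as the 2400G's worst case, since its range is 46-65W); the 65W and 105W limits are the stock-DB4 and DB4+LH6 cooling capacities respectively.

```python
# Candidate AM4 CPUs and their TDPs, as listed above.
# The 2400G is configurable between 46-65W; use 65W as the worst case.
CPUS = {
    "Ryzen 5 2400G": 65,
    "Ryzen 5 1600": 65,
    "Ryzen 5 1600X": 95,
    "Ryzen 7 1700": 65,
    "Ryzen 7 1700X": 95,
    "Ryzen 7 1800X": 95,
}

def supported(cooling_limit_w):
    """Return the CPUs whose TDP fits under the given cooling capacity."""
    return [name for name, tdp in CPUS.items() if tdp <= cooling_limit_w]

print(supported(65))   # stock DB4: the 65W parts only
print(supported(105))  # DB4 + LH6: every CPU in the list fits
```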
Unlike Intel — who only support their sockets for as long as it takes you to unwrap the box — AMD supports their sockets for much longer. The AM4 socket will be supported until 2020. Thus my cunning plan was to start off 2018 with a CPU that can be comfortably cooled by the DB4+LH6, overclock, stress test and monitor thermals for a couple of years, then — if the advantages are tangible and I feel the need — throw in a more efficient CPU when the last AM4 CPUs come off the production line in 2020, then cruise for the next half-decade or so.
All of that led me to install a 65W Ryzen 5 1600. Since I have a B350 motherboard, I have the ability to overclock the CPU to 1600X/95W levels without much of an issue.
Note: If you are happy sitting within the 65W thermal envelope, and are not overclocking anything, you could forego the LH6 Cooling Kit. Because the DB4 heat pipes are shorter than the LH6 ones, and don’t go over the edge of the motherboard, pretty-much all of the component obstruction issues that eliminated the Gigabyte, Asus and MSI motherboards from consideration would no longer apply. Something to keep in mind if you don’t need the speed but do want some of the features that one or more of those boards may have (but which the ASRock does not).
As far as memory goes, I went with a Corsair Vengeance LPX CMK32GX4M2Z2400C16 32GB (2x16GB) DDR4 kit.
I’ve never had a problem with Corsair Vengeance LPX RAM. This specific kit was on the QVL for the motherboard and an overclocker was able to push his kit to 3200MHz on exactly the same motherboard as I have, so I was confident that I could get a nice memory overclock with minimal effort — subject to the silicon lottery, of course. Since this machine isn’t for gaming, and isn’t running an APU, really high memory speeds aren’t as important to me as large amounts of RAM.
SSDs are the only totally silent storage option, and I got rid of my last HDD more than seven years ago, so this system was always going to have SSDs. The only question was “Which ones?”
Since the motherboard has an M.2 slot on the back, I decided to go with a 1TB Samsung 960 Evo NVMe for the main drive and a 1TB Samsung 860 Evo SATA for the backup drive.
I would have preferred two NVMe drives (to cut down on cable clutter), but the ASRock motherboard only has one M.2 slot. The Asus motherboard, on the other hand, has two slots, but (as mentioned before) it is not compatible with the LH6 Cooling Kit. Ah well — compromises of some sort often need to be made.
For what I will be doing with this computer, fast transfer rates and a life expectancy of at least seven years are what I am after from these drives. I only really need ~600GB of space, so by over-provisioning a couple of hundred gigs I can let wear-levelling do its thing and make seven years an easy target to hit.
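The over-provisioning arithmetic is straightforward. A minimal sketch using the figures above — a 1TB drive of which only ~600GB is actually needed:

```python
# Over-provisioning headroom, using the figures from the post:
# a 1 TB drive with only ~600 GB of space actually required.
drive_gb = 1000
needed_gb = 600

spare_gb = drive_gb - needed_gb           # capacity left for wear-levelling
op_percent = spare_gb / drive_gb * 100    # over-provisioning as a share of the drive

print(f"{spare_gb} GB spare ({op_percent:.0f}% of the drive free for wear-levelling)")
```

Leaving roughly 40% of the drive untouched gives the controller plenty of spare blocks to rotate writes through, which is what makes a seven-year lifespan a comfortable target.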
Even though this system is not meant to be a gaming rig, there’s no harm in putting in the best GPU you can without blowing the thermals. The GPU Cooling Kit allows up to a 75W GPU to be modded and cooled via heat pipes and spreader to a single wall. That pretty-much limits you to the GTX 1050 Ti and below if you prefer Nvidia cards — like I do.
The GPU I wanted was the MSI GeForce GTX 1050 Ti Aero ITX OC 4GB but my parts supplier ran out of them literally as I was assembling my online order. With no idea when supplies would be restored (thanks to the cryptocurrency mining craze), I went with my second preference of an ASUS Phoenix GeForce GTX 1050 Ti 4GB:
Whilst both GPUs fit into the space, the MSI was a few centimetres shorter than the Asus. None of the dual-fan 1050 Ti GPUs had even the remotest chance of fitting.
After removing the fan, shroud and heatsink I cleaned up the GPU itself, applied fresh thermal paste, then fitted the GPU Cooling Kit:
The final step was to pop heat sinks onto each of the four VRAM chips:
Power testing of a wide range of 1050 Ti cards reveals that they do indeed pull the full 75W when under load, so I’m at the limits of the GPU Cooling Kit and there’s no room for overclocking the GPU (even if I wanted to).
To power all this I installed a Streacom ZF240 Fanless 240W ZeroFlex PSU:
I researched the power draws of the various components and worked out that the power budgets on all rails — except the 12V rail — had plenty of headroom. The 12V rail can theoretically hit ~85% of the 168W max capacity if both the CPU and GPU are running at 100%. Normally I prefer to leave myself a lot more headroom than that, but since this system is not meant for gaming, and I can’t actually think of any other scenarios where I’m likely to max out both at the same time, I’m not really concerned. (If it does become an issue then I can install a SFX PSU with minimal effort and buy myself more headroom.)
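The 12V rail sum works out as follows — a sketch using the stock TDPs from the post (65W CPU + 75W GPU against the ZF240's 168W rail). The stock figures alone give ~83%; the ~85% quoted above presumably allows for other draw on the rail as well.

```python
# Worst-case 12 V rail budget for this build.
# 168 W (14 A) is the ZF240's 12 V rail capacity; TDPs are stock figures.
RAIL_12V_W = 168

cpu_w = 65   # Ryzen 5 1600 at stock
gpu_w = 75   # GTX 1050 Ti at full load

worst_case = cpu_w + gpu_w
utilisation = worst_case / RAIL_12V_W * 100

print(f"{worst_case} W worst case = {utilisation:.0f}% of the 12 V rail")
```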
Over the years I’ve also come to appreciate PSU efficiency curves, and recognise that ‘idling’ systems with large PSUs is a horrible waste of energy. To get the most out of your PSU you should size it so that your typical usage falls in the 25-75% range of its rated capacity. The ZF240 has an efficiency rating of 93%, and I think my selection of components will let it operate near that level on a regular basis — given my historic and anticipated usage patterns.
Low power consumption is an especially important issue if you plan on going off-grid. Since that’s a goal we have in the 2–4 year time frame, and this computer will be used much longer than that, it makes sense to aim for high efficiency and low power consumption at the same time.
The pursuit of silence can be costly and this build certainly was — ending up just shy of AU$3,000. If cryptocurrency miners weren’t inflating prices all over the place, it probably could have come in closer to $2,400 — still a fair bit, but not eye-watering. Nonetheless, the price is less than each of my last three systems and it manages to achieve what none of them ever did: Complete and utter silence.
This computer makes no noise when it starts up. It makes no noise when it shuts down. It makes no noise when it idles. It makes no noise when it’s under heavy load. It makes no noise when it’s reading or writing data. It can’t be heard in a regular room during the day. It can’t be heard in a completely quiet house in the middle of the night. It can’t be heard from 1m away. It can’t be heard from 1cm away. It can’t be heard — period. It’s taken nearly 30 years to reach this point, but I’ve finally arrived. The journey is over and it feels great.
If you are after a silent — not just quiet, but silent — daily driver, then I strongly recommend a passively-cooled case, heat pipes and solid state drives. Eliminate the moving parts (e.g. fans, HDDs) and you eliminate the noise — it’s not that complicated. It also doesn’t need to be really expensive (my system requirements were not ‘average’ so please don’t infer from this post that all DB4-based systems are as expensive). Silence (and a perfectly respectable computer) can easily be had for half the price.
That’s about it, methinks. If you have any questions or would like more details (about any aspect of this build) to be added to the post, fire away in the comments.
- Update (2018-04-25): Does Pinnacle Ridge change anything?
- Update (2018-05-01): Passively-cooled CPU Thermals
- Update (2018-05-15): Passively-cooled CPU Thermals (Part 2)
146 thoughts on “Completely Silent Computer”
Very interesting article. I’m glad to see I’m not the only person who wants a silent computer. I’m surprised you didn’t choose ARM for its low energy consumption plus high performance.
Good catch Miriam. I actually spent the greater part of a week researching and deliberating AMD vs ARM. My daily driver needs to get a certain amount of “serious” computational work done in restricted time windows. The Ryzen 5 1600 will give me similar performance to my current system while using about half the power. An ARM system would use far less power but would take noticeably longer to complete those tasks. The restricted time windows don’t allow that. So AMD it was.
Roughly speaking, my computational needs have been stable since 2011. I don’t need ‘the fastest’ CPUs any more. I’m happy to invest in more energy-efficient systems so long as they are equally performant. At the rate that ARM processors are currently improving, I expect them to be a superior alternative when I update or replace this system seven years hence. Unless AMD releases a compelling upgrade before they phase out the AM4 socket in 2020, the Ryzen 5 1600 is quite likely the last x86 processor I’ll ever buy.
In the interim, I need a dedicated system for some security and automation functions. I’m looking at Arch Linux ARM for the OS. That project will give me a chance to try out Arch Linux as well as ARM processors first-hand. Then I’ll have a concrete reference point when it comes time to replace my daily driver in 2025.
Cool! I’ll be very interested in your adventures with that. I’m looking at ARM myself. My computational needs are pretty modest.
Brilliant article Tim! Experiencing a truly silent workstation for the first time is really something special.
Is it okay for me to publish one or two of your pictures as well as your final remarks? I would absolutely link to tp69 of course.
Congrats again! 🙂
Cheers. Yes, there’s a huge difference between ‘quiet’ and ‘silent’. Quiet was good, but silent is great. If it wasn’t for the single Power LED on the DB4 I wouldn’t even know the machine was on. It’s quite strange — but in a fabulous sort of way.
I don’t mind you re-publishing some of the images/text. I would ask that you not hotlink the images though — take a copy and host them on your own server. I play a lot (behind the scenes) with the images on my site, so hotlinks would break in pretty short order as I tweak things and rename images. I’m also not sure about WordPress’ policy on hotlinking — so don’t want to suffer any grief on that front.
Thanks for sharing Tim, I really appreciate it.
Sir, you are misinformed if you think that AMD CPUs are/were not affected by Spectre/Meltdown.
Simon, I never said that AMD CPUs weren’t affected — I didn’t even suggest it. What I did say was that my Intel CPU was rendered insecure and unsecurable. That was the last straw. Intel’s ridiculous socket obsolescence policy. The Intel Management Engine insanity. Stagnant multicore development. High prices. Poor efficiency and thermals. Then along comes Meltdown/Spectre and renders my daily driver worthless. There’s only so much a sane person can put up with. I’m not a “fanboi” so I don’t defend the indefensible. I vote with my wallet. I was forced to buy a new processor, so I bought one from a company that hadn’t just cost me thousands of dollars. That’s all there was to it. Economically. Rational. Behaviour.
Having said that, AMD is only affected by Spectre, whilst Intel is affected by both Meltdown and Spectre… so any attempt to paint them as being equally vulnerable would be disingenuous — at best.
To be absolutely clear: I prefer to avoid getting involved in pointless Intel vs AMD arguments. If I could, I’d rather be buying Open Source Hardware from neither of them.
Any chance a ryzen 7 2700x would work in your setup? I am eyeing that cpu for a completely passive build – i guess i like to live dangerously…
The Ryzen 7 2700X has a TDP of 105W so the DB4+LH6 heat pipes would be able to cool it — providing you don’t overclock it. However, once you add 75W from the GPU to that number you end up at 180W. The ZF240 only provides 168W (14A) on the 12V rail. Potentially overdrawing the 12V rail by 12W (7.1%) definitely puts you into exciting territory. If your “workload” can peg both the CPU and the GPU — at the same time — then you’re asking for trouble.
One solution is to upgrade to a larger SFX PSU. Another option is to downgrade the GTX 1050 Ti to something like a GT 1030 — which drops the GPU TDP from 75W to 30W and total draw on the 12V rail down to 135W (comfortably below the 168W limit).
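Reworking the rail arithmetic from the reply above as a reusable check (TDP and rail figures as quoted; negative headroom means overdraw):

```python
# 12 V rail budget check for CPU/GPU combinations in a ZF240-powered DB4.
RAIL_12V_W = 168  # ZF240's 12 V rail capacity (14 A)

def rail_margin(cpu_w, gpu_w=0):
    """Return (total draw, headroom) on the 12 V rail.

    Negative headroom means the combination can overdraw the rail.
    """
    total = cpu_w + gpu_w
    return total, RAIL_12V_W - total

print(rail_margin(105, 75))  # 2700X + GTX 1050 Ti: 12 W overdraw
print(rail_margin(105, 30))  # 2700X + GT 1030: comfortably within budget
print(rail_margin(105))      # 2700X with no GPU (the media-server case)
```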
I won’t have a gpu. This is for a media server only so may be a workable solution.
Do you plan on doing benchmarking results showing CPU, RAM and Graphics tests along with temperature specs? I would be curious to see what this case and your choice of hardware and configuration would show.
Historically I’ve booted into a Windows partition to do that sort of stuff. This machine, however, will be pure Ubuntu — so I want/need to teach myself how to do that sort of monitoring and testing using native Linux software.
Being a passively-cooled system, my main interest is (obviously) the thermals. Working out how hard I can push various components before they thermally throttle is a priority. Then I need to work out if anything unexpected happens when I do things like peg all cores of the CPU at 100% for 4–6 hours at a time — something I need to know before running unattended overnight jobs. I also need to work out how hot the exterior of the panels get, as well as how the system reacts to the top vents being covered, before the cats are allowed anywhere near it.
So yeah — lots of testing to do, with practical tests taking priority over academic ones. I’ll post the results as they come in.
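For the native Linux monitoring mentioned above, the kernel already exposes most temperature sensors through the hwmon sysfs interface, with readings in millidegrees Celsius. A minimal sketch — sensor paths and labels vary by motherboard, so treat this as a starting point rather than a finished monitoring tool:

```python
# Read every hwmon temperature sensor the kernel exposes.
# Values in temp*_input files are in millidegrees Celsius.
from pathlib import Path

def read_temps(hwmon_root="/sys/class/hwmon"):
    """Return {sensor-file: temperature in Celsius} for every temp*_input found."""
    temps = {}
    for f in Path(hwmon_root).glob("hwmon*/temp*_input"):
        try:
            temps[str(f)] = int(f.read_text().strip()) / 1000  # mC -> C
        except (OSError, ValueError):
            pass  # sensor may be unreadable or transiently absent
    return temps

if __name__ == "__main__":
    for sensor, celsius in sorted(read_temps().items()):
        print(f"{sensor}: {celsius:.1f}°C")
```

Polling this in a loop while pegging all cores (e.g. with `stress`) is enough to find the thermal equilibrium point before committing to unattended overnight jobs.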
Thanks for the link.
Yes, the Ryzen 7 1700 IHS is soldered, but when I was speccing my system I wasn’t seriously considering Ryzen 7s — mainly because they have more cores than this computer actually needs — so I didn’t pay much attention to der8auer’s video at the time. The Ryzen 5 2400G uses TIM. The Ryzen 3 2200G also uses TIM. I (somehow) formed the opinion that solder was only used on the high-end Ryzen 7 CPUs and that the Ryzen 5 and Ryzen 3 lines used TIM. It seemed to make sense at the time. That opinion was, however, incorrect.
As it turns out — after a bit of digging — all Summit Ridge CPUs are soldered, all Raven Ridge APUs use TIM, and all new Pinnacle Ridge CPUs are soldered (confirmed by AMD’s Robert Hallock).
So, in essence, as of 2018-04, all Ryzen CPUs are soldered and all Ryzen APUs use TIM. Good to know.
Thanks for bringing it to my attention. I’ll correct my post.
Why 32 instead of 16GB of RAM?
I’ve been running and tuning a simulation for several years. The hydrologic cycle, in particular, takes up a large amount of memory. Around 14–15GB of RAM is dedicated to the sim, and about 17–18GB is left over for everything else — of which I use about 12±3GB on a daily basis.
Thanks for your reply.
Very nice build you have here. I am in the process of debugging a very similar one. I have the Black DB4, using an i7-8700K, ASRock MB, and a Palit Kalm 1050 Ti. Same memory and drives. I have the option to overclock or use lower voltages depending on where the temperatures hit. My first NVMe drive died after 2 days, so replacing it this evening.
It is a booger trying to squeeze all the heat pipes into the case when putting in the motherboard. 🙂
Anthony, what sort of temps do you get at idle and with a heavy workload using the I7-8700k in the DB4 case?
It sure was. I think my hands are too large — I felt like a gorilla trying to do brain surgery. I’ve a newfound respect for all the modders and system builders out there that work with/on/in small mini-ITX cases like this (which is only 17L in volume).
CPU thermal testing has started…
Once I fixed my cpu heatsink (the loose copper part slid during installation and the whole thing was angled), the idle temps are around 37-40°C. After playing an MMORPG some, I had temps up to 62°C. The GPU was running a little over 50°C. The case only felt slightly warm on the CPU-cooled sides. They went down fairly quickly after turning the game off but things stayed a little warmer. I have not tried long gaming sessions yet. Once I have software settled in, I plan to run some benchmarks. I don’t expect to push things like I would if it were a larger, actively-cooled case, and was anticipating possibly trying to undervolt the cpu to keep temps down, but so far I don’t think I will need to try that. Hopefully I can manage a slight overclock after I get my bearings with the Borg cube 🙂
Recommendation: using right/left angle plugs for audio, video, usb, even power cable helps a lot.
Choosing a 2700X for a home server is maybe not a great idea. The power draw is not insignificant when idle (which is the state of 99% of home server for 90% of the time). Just my 2 cents
A fair assessment. Outgoing server is a 65 watt i7-3770S and there is so much cpu power available nowadays in the 65 watt power envelope (e.g. the 8700 and some of the higher-end Ryzens (non-X versions)) that a 2700X may not be needed. I am a fan of overkill though.
I’ve finished my first round of thermal testing: The system is idling at 32°C and reaches equilibrium at 58°C when under 100% CPU load.
Full details over on Passively-cooled CPU Thermals.
Have you ever had to deal with coil whine? I have that issue on a Dell laptop that I recently purchased and am not sure what to do about it.
I think using the case itself as a heatsink is a brilliant idea. The Mac laptops sort of do this, but I’m annoyed that they generate a lot of heat and lose performance whenever the CPU or GPU temps go up. My Mac Pro desktop (the black trash can), while not silent, performs well even under heavy load without running its fan above 800rpm.
Good system! One suggestion for the OS: you can consider using Qubes OS since you seem to need a security focused operating system.
With regard to “totally silent” — I have found that even without fans, and all solid-state, some motherboards I’ve used still emanate little “eeeeee” mosquito-like sounds depending on what’s going on. I don’t know if it’s capacitors or what, and may also be related to age. Also, not-so-great on-board audio hardware can pick up interference and create little “eeee” sounds noticeable in headphones.
Have you experienced either of those? I haven’t built a new rig in a while, so maybe those issues have gone away on modern hardware. It would just bug me if it was _mostly_ totally silent, but with those little high-pitched nuisances all the more noticeable against a quiet backdrop.
As a server side app dev and dba I’ve recently abandoned the ‘latest/fastest/biggest’ path I was on for the last … gosh, 30+ years. Now my workstation is an Intel NUC i5 running linux with 16GB RAM pushing two monitors and drawing about 16W. Also quiet. Best workstation I’ve ever owned. 🙂
Sweet build and a very informative post! Got some ideas from here for my would be gaming rig!
Well, you’ve just made me completely aware of how noisy my (quiet) desktop is…
I disagree. You can hear SSD capacitors working on heavy or even any disk workload.
Sure have Kerrick. Of the three AMD graphics cards I’ve ever owned, two of them had coil whine that drove me nuts. The last one that exhibited coil whine was a Sapphire Radeon HD 7970 Dual-X OC 3GB — still have that in a box somewhere around here.
I don’t think there’s anything that can be done — by an average user — to properly and permanently fix coil whine. You just need to replace the graphics card — if you can (not sure what options you have with your laptop though).
One thing I did notice with the HD 7970 was that the coil whine would only occur at certain resolutions and frequencies. In my case it whined at the monitor’s native resolution of 2560×1440 at 60Hz. The coil whine would go away if I dropped resolution to 1920×1080 — but then the UI on the games I played would become blurry and hard to read. Given a choice between blurry UI and coil whine I chose Option 3 and replaced the card.
Maybe I’ve been lucky, but I’ve never had any coil whine issues with Nvidia cards, so now I just buy those.
Try running at different resolutions and/or frequencies (if you can) whilst inquiring about warranty options with the reseller/Dell — that’s all I can suggest.