Archer: Workstation Build
Four years ago I built my first custom desktop. It was more of an exploratory venture than anything else, an initial curious step into an endlessly exciting realm. The machine itself was an assembly of a reasonably good mainstream consumer processor, a mid-range graphics card, and entry-level everything else, packed into a sound-dampening case. To this day it is still running strong.
Looking back, it is almost laughable how something built out of curiosity ended up playing such a major role in my school life. The amount of computational resources I found myself in need of was astounding. It is easy to reason that such assets will remain important as I tread onwards, especially when I am already running into limits at such an early stage of my journey; and just as a machinist cannot do without his wrenches and spanners, I reckoned I could not do without a silicon mind supplementing my own.
It is entirely plausible that I have misjudged this, and that a couple of years later I would find myself lamenting my blunder in making such extravagant purchases. Truth be told, I have absolutely no idea what sort of life awaits me. I seek comfort in knowing that I have worked fairly hard in high school, to reasonably fruitful ends; but I am also aware that undergraduate research is very different from what I have been doing. For one, no single undergraduate is likely to own the project and have the privilege of working on it independently, and two, the resources available are likely to be much vaster than what we had in high school. Both differences point to a reduced need for personal computational tools. Attempting to predict the life that I will lead, and by extension, my future needs, is dodgy at best. At the same time, however, the whole point of getting into the institute is to spend my time as fruitfully as possible; so perhaps the quantity of work might see an increase, if I am not already slain by the curriculum. At the very least, if I have utterly misjudged this, it would still be an enjoyable experiment, for reasons that I will explain later. There is an appeal in putting things together that I love.
The original plan was to fly to the United States, order the parts in, and build my next rig there. Seeing that I am in need of computational resources right now, and having had a chat with a particularly tall and absolutely overpowered friend, an alternative was realized. It might be feasible to build the rig temporarily here, then break it down later and send the parts over for re-assembly. If done correctly, the shipping costs might not be prohibitive. Further, the component that enjoys the greatest cost savings when purchased overseas, namely the graphics card, is not necessary at this preliminary stage. And lastly, it is probably easier to settle dead-on-arrival and warranty issues here. There are a lot of mights here, because this whole plan carries a degree of uncertainty. I'll worry about making it work when I'm closer to getting my freedom back.
The parts list for this build, which I will be naming Archer, is as follows:
First, a few words on the specific combination I have chosen. This is meant to be a lopsided build. The processor sits near the summit of the consumer market at this point in time—right on its silver throne at the top of the Core i7 HEDT family. Any higher and we would intrude upon the quintuplet Core i9 HEDT space, whose prices break the thousand-dollar mark for the processor alone. And yet accompanying it is the Sapphire HD6450, a graphics card six years out of date.
I don't think this kind of imbalance is common in the builds typically shown online. The considerable difference arises because the fundamental purposes of the machines are different. Most of those computers are built for gaming or media entertainment; I intend to do neither on mine. For one, I hardly own any modern game titles; and two, applications that may benefit from hardware acceleration, such as CAD or video processing, can be done on the desktop. At this stage the machine targets only raw processor compute power, so the GPU can be kept as budget-friendly as possible.
Keen-eyed readers might notice something strange about the CPU—the base frequency of the i7-7820X is officially listed at 3.60 GHz, a far cry from the 4.80 GHz written on the list above. Indeed, the chip I intend to purchase is not vanilla. This is why it is to be obtained from Silicon Lottery, and not a local source. It is a pre-binned chip, guaranteed to be capable of a 4.80 GHz overclock on all cores. Furthermore, it comes with a delidding service. The current HEDT chip family, like several previous generations, does not have permanent solder between the processor die and the heat spreader attached atop; instead a removable thermal interface material is used, inevitably sacrificing conductivity. This makes for underwhelming thermal performance, especially when pursuing overclocks. Delidding refers to breaking open the chip to replace this interface material with a superior after-market one, enhancing heat dissipation.
The Skylake-X generation is technically not the latest. The Coffee Lake generation has already been released, with the Core i7-8700K as its flagship, carrying 6 physical cores and 12 threads. The highest feasible overclock on the i7-8700K seems to hover around 5.20 GHz, a tad higher than the i7-7820X. Judging by these figures, the former would indeed be the better, and marginally cheaper, choice for single-threaded or lightly multi-threaded applications; but the typical workload intended for this machine is heavily multi-threaded. We are comparing 16 threads against 12 slightly faster ones—the 16 win. This is why we have gone with Skylake-X, and the choice is supported by benchmarks.
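The comparison above can be sketched as a back-of-envelope calculation. This is only a first-order model: it assumes throughput scales linearly with thread count times clock speed, ignoring IPC, cache, and memory bandwidth differences between the two chips.

```python
# Rough aggregate throughput for a heavily multi-threaded workload,
# modeled as threads x clock. A crude approximation: it ignores IPC,
# cache, and memory bandwidth differences between the two chips.
def aggregate_throughput(threads: int, ghz: float) -> float:
    """Relative throughput in 'thread-GHz'."""
    return threads * ghz

skylake_x = aggregate_throughput(16, 4.8)    # i7-7820X, overclocked
coffee_lake = aggregate_throughput(12, 5.2)  # i7-8700K, overclocked

print(f"i7-7820X:  {skylake_x:.1f} thread-GHz")    # 76.8
print(f"i7-8700K:  {coffee_lake:.1f} thread-GHz")  # 62.4
print(f"advantage: {skylake_x / coffee_lake - 1:.0%}")  # 23%
```

By this crude measure the 16-thread chip comes out roughly 23% ahead, consistent with the choice made above.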
It is about time we proceed with the build itself. I'll explain the rest along the way.
As is customary, a group photo of the components before starting. I have somehow forgotten to include the Samsung 960 EVO in there, but we'll see it soon enough anyway.
We begin with the motherboard. This is an X299 motherboard of mATX form factor—relatively uncommon at this juncture. Its local unavailability was the reason for purchasing it via Amazon. Four PCI-e expansion slots are more than sufficient; and as far as I can tell, there is no difference between the power delivery of this mATX board and that of its ATX counterparts. In other words, this is a more space-efficient option with little trade-off. Why waste valuable space? It would also make moving this hunk around a whole lot easier.
The CPU goes in next. There is nothing much to say regarding this process—it is essentially pick and place, with the orientation specified by a little silk-screened triangle on both the processor and the socket. Though the chip is delidded, the heat spreader is sealed back with silicone at the same positioning. The chip hence mounts the same way as a vanilla one does.
The Samsung 960 EVO NVMe SSD goes onto the M-keyed M.2 slot next, and the 4 sticks of RAM are seated in their slots in similar fashion. Perhaps I should take the initiative and explain how this figure of 32 GB came about. From experience, the work I have come across so far seldom exceeds 2 GB per thread; and since this processor has 16 threads, 32 GB is the logical estimate.
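The sizing heuristic is simple enough to write down. The 2 GB/thread figure is my own observed ceiling from past workloads, not a general rule.

```python
# Memory sizing heuristic: budget observed peak usage per thread,
# multiplied across all hardware threads. The 2 GB/thread figure is
# the author's own observation, not a universal guideline.
GB_PER_THREAD = 2   # observed peak usage per worker thread
THREADS = 16        # i7-7820X: 8 cores x 2-way hyper-threading

ram_gb = GB_PER_THREAD * THREADS
print(f"{ram_gb} GB")  # 32 GB
```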
It might be worth mentioning that the Samsung 960 EVO interfaces through NVMe, which is approximately an order of magnitude faster than the Samsung 850 EVO and 850 PRO, both based on the SATA III interface. An NVMe drive takes up four PCI-e lanes for data transfer, which means we have effectively four fewer lanes left for the graphics card and any other peripherals. This is a real consideration on mainstream consumer chips, which typically have 16 lanes or fewer, essentially limiting the graphics card to an x8 configuration with an NVMe drive plugged in. But elevating to the HEDT family, we have at minimum 28 PCI-e lanes, making this whole calculation exercise irrelevant.
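The lane budget being described works out as follows. The lane counts are the commonly quoted platform figures; treat them as assumptions, since actual allocation depends on the motherboard's routing.

```python
# Sketch of the PCI-e lane budget. CPU lane counts are the commonly
# quoted platform figures; real allocation depends on the motherboard.
def remaining_lanes(cpu_lanes: int, nvme_drives: int,
                    lanes_per_nvme: int = 4) -> int:
    """Lanes left for the GPU and peripherals after NVMe drives."""
    return cpu_lanes - nvme_drives * lanes_per_nvme

# Mainstream consumer CPU: 16 lanes. One NVMe drive leaves 12,
# forcing the GPU down from x16 to x8 (the next supported width).
print(remaining_lanes(16, 1))  # 12

# HEDT i7-7820X: 28 lanes minimum. One NVMe drive still leaves 24,
# so the GPU keeps its full x16 link.
print(remaining_lanes(28, 1))  # 24
```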
Lastly, we assemble the included Wi-Fi E-keyed M.2 card and bracket onto the motherboard, and we are largely done with the board. One of the reasons for choosing the EVGA X299 Micro over competing options is its included Wi-Fi capability, which I have learned from experience to be an exceedingly valuable feature. The trouble of dealing with expansion cards and external adapters is unnecessarily frustrating. Surprisingly, there weren't any instructions in the manual regarding this installation process—a production oversight, perhaps?
Here comes another bit of design consideration. The Intel Core i7-7820X has a thermal design power (TDP) of 140 W, which we may interpret as a nominal heat output of approximately 140 W under maximum load. The exact definition of TDP is more complicated than that, and it is more trouble than it is worth to digest here; we shall just take it as such. We intend to put a rather aggressive overclock on the chip, involving a core voltage increase to approximately 1.26 V. Consulting existing overclocking benchmarks online, we may estimate the heat output of our overclocked chip to be between 260 W and 280 W at maximum load—which is an absolute heck tonne. To put it in perspective, the overclocked Intel Core i7-4770K in my desktop produces about 140 W under maximum load, which is already close to the limits of its attached 240 mm radiator.
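As a sanity check on that 260-280 W range, we can apply the common first-order scaling for CMOS dynamic power, P ∝ f·V². The stock core voltage used below is an assumption on my part (the chip's actual stock voltage varies by sample), and the formula ignores static leakage, so this is a rough cross-check rather than a prediction.

```python
# First-order power scaling for an overclock: P ∝ f * V^2 (dynamic
# CMOS power). Ignores static leakage; the stock voltage below is an
# assumed figure, as actual stock voltage varies chip to chip.
def scaled_power(p_stock: float, f_stock: float, f_oc: float,
                 v_stock: float, v_oc: float) -> float:
    return p_stock * (f_oc / f_stock) * (v_oc / v_stock) ** 2

# i7-7820X: 140 W TDP at 3.6 GHz base; overclocked to 4.8 GHz at 1.26 V.
# Assuming a stock core voltage of roughly 1.05 V:
estimate = scaled_power(140, 3.6, 4.8, 1.05, 1.26)
print(f"{estimate:.0f} W")  # ~269 W
```

The result lands comfortably inside the 260-280 W range quoted from the benchmarks.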
This raises the pertinent question of how to dissipate such an amount of heat. The next step up from all-in-one (AIO) water cooling would logically be a fully custom loop. Custom loops generally have pumps capable of significantly greater flow rates; copper radiators are also available, as opposed to the aluminium ones in AIO coolers. Of course, custom loops also allow multiple radiators to be used, which might single-handedly bring the greatest improvement in thermal performance. But all this comes at the steep expense of build simplicity and ease of maintenance on the user's part. I am personally averse to anything that requires high maintenance, and my idea of building a computer is to finish the job in a couple of hours and go straight to using it—custom loops are therefore out the window.
A viable option is to go for an AIO cooler larger than the 240 mm variant. There is the Corsair H115i, sporting a 280 mm radiator; and there is a competing option from NZXT, the Kraken X62, which is basically the same thing with different aesthetics. Up one notch we have the Fractal Design Kelvin S36 and the Thermaltake Water 3.0 Ultimate, both with 360 mm radiators. To a very qualitative, first-order approximation, the cooling capacity of a radiator is proportional to its surface area. Presuming identical fin densities, we might expect about 36% greater heat dissipation from a 280 mm radiator than from a 240 mm counterpart—a reasonable improvement. Surprisingly, a 360 mm radiator has only about 10% more surface area than a 280 mm one, and this seems insufficient to justify purchasing from a less experienced vendor. This is why the Corsair H115i is the final choice.
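The area comparison can be made concrete. The model here follows the post's own approximation: frontal area taken as fan width times radiator length, with fin density and radiator thickness assumed identical across models.

```python
# Frontal area as a proxy for cooling capacity, per the first-order
# approximation in the text. Assumes identical fin density and
# thickness; real radiators differ in both.
def frontal_area(fan_width_mm: int, length_mm: int) -> int:
    return fan_width_mm * length_mm

a240 = frontal_area(120, 240)  # 2 x 120 mm fans
a280 = frontal_area(140, 280)  # 2 x 140 mm fans
a360 = frontal_area(120, 360)  # 3 x 120 mm fans

print(f"280 mm vs 240 mm: +{a280 / a240 - 1:.0%}")  # +36%
print(f"360 mm vs 280 mm: +{a360 / a280 - 1:.0%}")  # +10%
```

The 280 mm format gains area on both dimensions (wider fans and a longer body), while the 360 mm format only gains length, which is why the second jump is so much smaller.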
And this, indeed, is where the aforementioned experiment lies. I am curious about the exact performance limits of a top-grade AIO water cooler on the market, and I want to know whether I should be more accepting of fully custom water cooling loops.
And now it is time to bring out the computer case. The second part of the experiment, albeit a relatively minor one, is to judge the merits of a cuboid case, as compared to the more typical mid-tower or full-tower form factors. Intuition might have it that a cuboid case is less space-efficient, since its footprint is larger than a tower; but this is negated if it is stacked atop a drawer unit or similar. Moreover, most tower cases for mATX boards seem horribly space-inefficient on the inside, especially because I do not intend to have a second graphics card, nor do I intend to have multiple hard disks. I figured it is the right time to step into new territory.
The Thermaltake Core V21 is rather unique, in that the four longitudinal panels—top, bottom, left and right—can all be mutually swapped. This allows the window to be placed on any side, and offers great flexibility in planning airflow. The front panel can be rotated as well, to facilitate changing the orientation of the case. On the interior, two pairs of rails support the mounting of radiators and fans; and these rails are similarly relocatable to any side of the case.
The power supply goes into the case first. You might have noticed that the same "i" suffix is on both the PSU and the water cooler—it is not a coincidence. The suffix is attached to models that support the digital Corsair Link interface. In the case of the cooler, it allows the speed of the fans and pump to be controlled via software, and allows the cooler to report temperatures and diagnostics; in the case of the PSU, it allows power consumption and efficiency figures to be reported. I want to be able to look at this information easily when testing, which makes these product lines a good fit.
The motherboard can go in next—but not before installing the rear I/O cover and shield. Following that, the radiator and the fans can be mounted. I chose a primary left-to-right airflow path, with the radiator as the exhaust, and so the two mounting rails were relocated to the right side. There is a secondary airflow from front to back, created by the front 200 mm intake fan, whose purpose is to cool the hard disk located below the motherboard tray. This configuration also results in a positive gauge pressure in the interior, which would hopefully keep dust ingestion at bay.
And now things start to get messy. We mount the hard disk into the cages below the motherboard tray, and the GPU onto the PCI-e slot; and we apply the Cooler Master MasterGel Maker thermal paste onto the CPU and mount the water block—which I admittedly think I messed up. The mounting pressure on the water block was initially skewed, courtesy of unintentional uncontrolled tubing tension; and though I attempted to compensate, I'm really not sure if the distribution of the thermal paste is ruined. More on that in another story. For now the build happens to work, and I can't be bothered to change it.
At the end of it all, we connect the wires across the various components. The two auxiliary Fractal Design fans get mounted on the left pair of rails, completing the primary airflow path; and after some very limited effort in cable management, the build is largely complete.
Writing the build in this manner is perhaps a little misleading, because it implies that everything was done with machine-like efficiency, one step after another. This is not true. In fact, before the auxiliary fans were installed and the cables somewhat managed, I decided to do a little mid-build testing, just to make sure that the components work, and that the cooler is sufficient to dissipate the heat from the processor. It is easier to troubleshoot and re-assemble components at this stage than when the build is completely assembled.
Stress-testing the machine necessarily means the OS must be installed first. Perhaps I'll explicitly explain the SSD-HDD hybrid setup, for the benefit of those who might not be familiar. The point of a hybrid like this is that the OS and the most-often-used programs can be installed on the SSD, which offers significantly faster I/O than the HDD. This speeds up the machine by a considerable margin. Meanwhile, documents and videos and everything else go on the HDD. This way we don't have to rob a bank to build a computer. Of course, if you have the budget, an all-SSD configuration would be faster still, but you will see diminishing returns the higher up the spectrum you ride.
I have already written a blog post on overclocking, so I will not go into the details here. For a quick-and-dirty test, I ran Prime95 and IntelBurnTest on the half-completed machine. At stock settings the heat is peanuts for the cooler. And the machine boots fine, so that's a trivial indication that nothing has disastrously failed. To test the memory, I fell back on MemTest86. And that's basically it—at the very least, the machine works. Now for the real test. Silicon Lottery guarantees that the i7-7820X chip is stable at 4.8 GHz with 1.262 V core voltage, with a -3 AVX offset and a -5 AVX512 offset. The offsets essentially impose additional frequency penalties when AVX and AVX512 instructions are used, for effective frequencies of 4.5 GHz and 4.3 GHz respectively. Implementing these figures in the BIOS, I went back to stress-testing. Temperatures are high, hovering in the 85-90 degrees Celsius range under load—but the machine still works, and the overclock appears stable. The machine passed.
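For readers unfamiliar with offsets, the arithmetic is just the all-core multiplier reduced by the offset whenever the corresponding instruction set is active, with each multiplier step worth 100 MHz on these platforms.

```python
# Effective clock under AVX offsets: the all-core frequency drops by
# the offset (in 100 MHz multiplier steps) when AVX / AVX-512
# instructions are executing.
BCLK_GHZ = 0.1  # one multiplier step = 100 MHz

def effective_ghz(all_core_ghz: float, offset: int) -> float:
    return all_core_ghz - offset * BCLK_GHZ

print(f"{effective_ghz(4.8, 0):.1f} GHz")  # 4.8 GHz, non-AVX code
print(f"{effective_ghz(4.8, 3):.1f} GHz")  # 4.5 GHz under AVX
print(f"{effective_ghz(4.8, 5):.1f} GHz")  # 4.3 GHz under AVX-512
```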
It is almost a sin to not pause for a few moments and appreciate the sheer beauty of having 16 little boxes on Task Manager. Here, I implore you, stop what you're doing right now and stare at this screenshot, as you would for a piece of true art. I think this beats a good number of the modern art pieces out there. I'm very sure the lines on my screen are straighter than that single line on the canvas, and you can reflect on the meaning of life better when there are 16 little boxes encouraging you with hundred-watt intensity.
Before anyone comes after me with a machete, or perhaps a paintbrush, I should declare that I appreciate art and design and aesthetics in general. Unless the particular art involves painting single lines on a monochromatic canvas, in which case, nope. Back to the build. The test is done and finished, so all that's left is to complete the assembly. There actually is a slight problem with this. The side panels no longer fit properly with the fans and radiator mounted, although Thermaltake claims the case can handle them just fine. The issue is insufficient clearance—the screw heads for the fans and radiator protrude from the mounting rails a millimetre too far, and are in the way of the panels. I could not be bothered to modify the case, so I just forced the side panels in. A little bending of sheet metal can't do any harm, right?
The machine looks like this on its table. What's left now is simply to continue stress-testing for a while longer, and the build can be considered done! Of course there's still the temperature issue, and I think I will see some improvements if I re-apply the thermal paste, but that is for another story.
Lastly, some rough benchmarks that show the performance of this machine compared to the desktop:
I am very happy with the results. I had targeted a rough 200% increase in compute power, and I largely got it. Till next time, goodbye!