Water Cooling Unconversion#

The nVidia 6800 Ultras that used to inhabit Phillip were the bane of my water cooling plans. I water cool for the quiet, not the performance. But these 6800s ran so hot, my quietness plan was all messed up.

And it's entirely my fault, too. I bought waaay too much video card. I wanted to experiment with SLI, using two video cards to run one display. Granted, the one display was a Samsung 243T with a native resolution of 1920x1200. The cards performed amazingly well, except that they ran so hot I ended up adding an additional radiator and two fans to the system to keep it cool.

The latest refit retired these toaster ovens, and I figured rather than let them sit there and rot, I'd let my friendly neighborhood computer store resell them for me. In fact, the owner came and picked them up; he had them sold before I was even ready.

Unfortunately, the new owner wasn't into water cooling, so I had to convert these water cooled video cards back into air cooled ones. I had kept all the fan equipment in the original boxes, so it wasn't tough to find. Reassembly, however, is tricky.

 

Freshly removed from Phillip and drained of water, one water-cooled nVidia 6800 Ultra.

Four center plate screws, six spring-loaded edge screws and two voltage regulator mount screws later, the water cooling block is removed from the board. Notice the less-than-perfect impressions on the cooling block from the RAM chips of the video card. The system was never unstable, but it sure looks like this block wasn't as tightly fitted as it could have been.

Deploying all the air cooling hardware. The copper block goes onto the GPU (along with the black backing plate), the angled aluminum block with the white blobs on it goes onto the RAM chips, the voltage regulator heat sink is in the bottom right-hand corner of the picture, and the fan assembly itself is in the top right-hand corner.

After cleaning off the old thermal paste, I applied new stuff to the GPU, used the original white contact pads for the RAM, and carefully put everything together. The copper GPU plate goes on first, using spring-loaded screws that pass through the plate and board into the black backing plate. Then the RAM cooler goes on with six different spring-loaded screws. Then comes the heat sink for the voltage regulators, held on with a pair of spring-loaded clips. Finally, the fan itself is held on by three screws and plugged into the board.

Innit purdy? I like the water cooled version better myself.

That was the first one; the second one was even easier. Then a careful repack into the box, including power cables and DVI-VGA adapters for each.

These video cards were not a great investment for me - I think they were worth about 20% of what I paid for them a year later. That's not counting the water jackets, which I still have and can't imagine what I'll do with. Maybe eBay.

Sunday, February 19, 2006 9:45:19 PM (Pacific Standard Time, UTC-08:00)

 

Refit and Clean up of a Water Cooled SLI System#

Before I could get into rebuilding Terrance, my triple-screen system, there was an obstacle that had to be resolved first: Phillip, the SLI system that sits on top of it.


This is what my workstation rack looked like - at the bottom, barely visible, is a Minuteman 1000RM E rackmount UPS. I had my electrician rewire the outlets in my workstation bays so that the power passed through the rack closet; that let this UPS move into the rack closet, saving me 2U and two fans.

Above the UPS is Terrance, the triple-screen system. It was the quietest thing in the stack, a P4-based system with a Matrox Parhelia to drive the three ViewSonic 18" displays. And sitting on top was the problem child: Phillip, the AMD-based gaming system with a pair of nVidia 6800 Ultras configured for SLI, plugged into a Samsung 243T. This rig put out 100 frames per second in Half-Life 2 at 1920x1200. It also nearly melted in the process. The 6800 Ultras are just too hot. I ended up strapping a big radiator to the top of the case with a pair of Vantec 120mm Stealth fans mounted on it. Yes, I know: fan bad. But melting worse.

Here's a top view of the SLI system; you can see the size of the additional radiator. This kept the system cool even under heavy SLI use - in exchange, of course, for ugliness and noise. Whenever I would record .NET Rocks! I'd have to turn this machine off.

The solution was to get rid of the 6800 Ultras. I considered going with later model SLI cards, say a pair of 7800GTs. These are actually cooler than the 6800s, and have more horsepower. Then I got a look at ATI's X1900. A 512MB video card with comparable performance to many SLI systems. In one card. How great is that? So I switched - trading in the 6800 Ultras for one Sapphire X1900XT (and a bunch of money).

I'm very much of the mindset that anything worth doing is worth doing excessively. And since I was going to totally overhaul Terrance, why not do the same for Phillip? The problem was, there really wasn't much better than the existing gear. The ASUS A8N SLI motherboard is great. The AMD processor in it is, granted, a single-core 4000+, but still a great processor. 7200rpm hard drive, dual burners... what could I really do to improve it? The new video card gets rid of the heat problem, so other than that, a couple of gigs of stinky fast Corsair RAM was all I could come up with.

The upside to this is that it meant I had one machine that would stay operational - it didn't need to have a scratch re-install because I wasn't changing the motherboard, just the video card.

However, it also meant breaching the water loop of the biggest, ugliest water cooled machine I've ever built.

The number one problem you face when breaching a water loop is how to do it without making a mess. The first thing I always do is take the cap off the reservoir, which allows air into the system. The water loop is more or less airtight, so creating some pressure relief lets water leave the lines. Next I open up the highest point in the loop, which is normally the top of the radiator. In this case (look at the photo above) the top of the radiator is quite high up, and the line is essentially dry when the pump isn't running.

Ordinarily I'd use my little bulb pump to force all the water out of the system, but since its fatal encounter with the resident terrier, it was up to my lungs. So I pulled the line from the top of the upper radiator, then added a bit of hose onto the radiator connector and aimed it at ye olde yogurt container. Then I blew into the other end. And blew, and blew. There's a lot of water in the system.

Eventually I drained enough that the lower line of the upper radiator was also dry, and then I pulled that off as well, and reconnected it to the upper connector of the lower radiator.

That got the upper radiator out of the loop. I was careful in actually removing it because it still had a lot of water in it. I had to rotate it a bunch of times to get the majority of the water out.

I took a break at this point; you can see in the above photo the now-completed water loop without the additional radiator. This is how I originally configured the system, until I discovered that 6800 Ultras run at sun-like temperatures.

Next step, extract the 6800s.

The 6800s were plumbed into the system between the processor block and the Northbridge block, which is between the two video cards. And boy, was that ever fun to get together the first time. However, getting them out wasn't so bad - the connectors for the water blocks sit relatively high up, so with most of the water out, the lines were pretty much high and dry. I had to cut a new segment of hose to run between the processor block and Northbridge block.

Once I got the cards out, this is what I found.

The second video card in the SLI pair had sprung a leak. That goop is from the water loop dripping down into and beside the PCI-E slot. Beats me why the thing still worked. I wasn't all that concerned, since all of this was on the second PCI-E slot, and I was switching to a single card. Notice I've already turned the ASUS Patent Pending SLI mode card over, although I don't think it's actually inserted correctly...

You can see where the leak came off the water jacket and dripped down onto the motherboard. I strongly suspect I melted the seals on this water jacket when it overheated... before I realized I needed a second radiator for it.

I'd worry about the 6800 Ultras later. Now it was time to fit the new video card and get things back up and running again.

You can see the new card and the new hose running from CPU to Northbridge. It looks too high in this photo, but it wasn't. Unfortunately, Innovatek hasn't made a water jacket for the X1900XT yet, so I'm going to have to leave the fan on the video card for now. The good news is that it's speed-sensitive, so when I'm not running anything graphically intense, it's pretty quiet.

One interesting problem was that the power adapter cord that came with the video card was only a four-prong cable, and there's a six-prong plug on the board. I tried it, and it didn't work - the machine kept coming up with a BIOS-level error on the display saying "plug power into the video card." I used one of my six-prong spares and it powered up fine. The price of being first with one of these cards, I guess.

What isn't in the above photo is the 2GB matched pair of Corsair 3500LLPro I stuffed in - fast-response RAM with lots of headroom and blinky lights.

Phillip powered up fine in this new configuration, and Half-Life 2 plays great on it.

One machine down, one to go.

Saturday, February 18, 2006 2:29:44 PM (Pacific Standard Time, UTC-08:00)

 

Cynicism and High Resolution Monitors#

 

When last we left my latest journey into the realm of the resolutionally absurd, I had a couple of large boxes on the floor, one of which contained an Apple 30" Cinema Display. This display uses dual-link DVI, which has been around for a while but is not widely understood. The reality of the DVI system is that it supports a lot of different modes, and dual-link is the most powerful and most expensive of them. Inside a dual-link DVI cable are two entirely separate sets of video signals. This is necessary to handle the 2560x1600 resolution of the Apple 30" Cinema Display. A single-link DVI pretty much maxes out at 1920x1200.
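If you want to see why 1920x1200 is roughly the single-link ceiling, the arithmetic is simple: a single TMDS link tops out at a 165 MHz pixel clock, and dual-link doubles that. Here's a rough back-of-the-envelope sketch; the 12% blanking overhead is my approximation of reduced-blanking timings, not an exact figure from any spec.

```python
# Why 2560x1600 needs dual-link DVI: single-link TMDS is capped at a
# 165 MHz pixel clock. The blanking overhead factor below is a rough
# reduced-blanking approximation (an assumption, not a spec value).

SINGLE_LINK_MAX_MHZ = 165.0
BLANKING_OVERHEAD = 1.12  # approximate reduced-blanking overhead

def pixel_clock_mhz(width, height, refresh_hz=60):
    """Approximate pixel clock in MHz, including blanking overhead."""
    return width * height * refresh_hz * BLANKING_OVERHEAD / 1e6

wuxga = pixel_clock_mhz(1920, 1200)    # ~155 MHz: fits on a single link
apple30 = pixel_clock_mhz(2560, 1600)  # ~275 MHz: needs both links

print(f"1920x1200 @ 60 Hz: {wuxga:.0f} MHz, single-link ok: {wuxga <= SINGLE_LINK_MAX_MHZ}")
print(f"2560x1600 @ 60 Hz: {apple30:.0f} MHz, single-link ok: {apple30 <= SINGLE_LINK_MAX_MHZ}")
```

The 2560x1600 mode lands at roughly 275 MHz, well past the 165 MHz single-link limit but comfortably inside the 330 MHz a dual-link connection can carry.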

So to drive the Apple 30" Cinema Display at 2560x1600, I needed a dual-link capable video card and a dual-link cable. The display came with a six-foot hard-wired cable, much to my annoyance, since I need at least twelve feet of cable to reach my workstation bay. That meant finding a dual-link extension cable.

You'd think finding a dual-link video card would be easy, and you'd be right, unless you wanted assurance that you're actually getting one. It used to be that dual-link DVI was a rare and expensive feature, requiring you to order specific cards for the purpose. That's not true anymore: pretty much every nVidia 7800 series video card supports dual-link. It's so common that it's not mentioned anywhere in the specifications or documentation at all.

So, when I looked at my situation - new monitor, unfamiliar cabling protocol, fixing wiring with an extension and utilizing an essentially undocumented feature of a video card - I figured there was no hope in hell of it actually working. Cynical? Perhaps. But just because you're not cynical doesn't mean you aren't screwed.

And realize that the machine I wanted to rebuild is my main workstation - granted I have backup machines, but taking the main workstation down is not something I do casually.

So, I built a testbed. I left the existing machine entirely alone and bought the parts to rebuild it, with the intention of testing all those parts independently of the existing machine.

Since I was going to need two video cards, one to drive the Apple display and the other to drive the two wing displays, I wanted to get a motherboard with two PCI-Express slots in it. This means an SLI board like the ASUS A8N I currently had in my gaming system. Normally SLI uses two video cards to run one display, thereby doubling the frame rate. For my purposes, I'd be using two video cards independently, but with symmetrical performance. Sure, I could have done this with one AGP card and one PCI card - but that would suck. Dual PCI-E is the way to go.

I chose the ASUS A8N32-SLI motherboard for the job, and just for good measure, plugged an AMD 4800 Dual Core in it. Hey, two video cards deserve two processors, right? The video cards I chose are MSI's implementation of the nVidia 7800GT. These are high performance video cards, but not top of the line: I'd had enough of the heat problems with the 6800 Ultras to know better. These are great cards, lots of horsepower, but not so much that they're running in a state of near meltdown.

So, to build the testbed, I rigged up the motherboard with the processor, some spare RAM I had lying around, a hard drive and a DVD player. Just sitting there at the service desk on a towel. I stuck one video card in it because I wanted to work out the first issue: could I make the Apple 30" Cinema Display work in 2560x1600 mode with an extension cable? The list of failure points was long, but the key ones were whether or not I had the right video card, and whether or not a 2560x1600 signal would travel through an extension cable and still be usable on the far end.

Here's what the rig looked like:

You might just spy the screwdriver stuffed under the back of the board. The video card sticks down enough that it was popping itself out of the slot when I was testing, freaking me out when suddenly nothing worked. Getting the "machine" up and running wasn't all that difficult. I first fired things up with the video card plugged into the little 15" LCD panel you see sitting behind the board. Once I was sure the basic configuration worked, I fired it against the Apple display without the extension cable.

That worked as well, so I went ahead and did an install of Windows XP. This takes a while, between the hard drive formatting and the basic install. The rig was plenty quick. Once the base OS install was finished, I focused on video drivers. This was best done by getting network drivers running first, then downloading the latest video drivers.

640x480 on a 30" display is hilarious - the icons are the size of your fist. Then I got the nVidia reference drivers installed, and the resolution bumped up to 1280x800. Better, but not what I wanted. My mistake was plugging the display into the top connector on the video card: only the lower connector has dual-link support. Once it was down there, I got this:

And just in case you can't read it clearly:

The final test was the extension cable. Plugging it in was fine: the screen was clear and stable with and without the extension cable. On reboot, the starting low-resolution screens had little bits of distortion in them, but as soon as it kicked into high-resolution mode again, it looked perfect.

So, tests complete, I guess I'm ready to tear apart some gear and get these new displays integrated into the office.

Sunday, February 12, 2006 3:46:18 PM (Pacific Standard Time, UTC-08:00)

 

Time for New Toys!#

I've said this before, and it's still true - Christmas isn't a good time for me in terms of getting toys. It's good for everyone else, because they all come to me for advice about toys. Spouses are especially interesting around Christmas: there are some that say "don't buy anything until you talk to Richard" and others that say "don't you dare talk to him, he's a bad influence on you!" Either way, I'm happy to help folks out selecting gadgets. I just don't expect them for myself. After all, considering how difficult it is to buy for me when I'm doing it, I wouldn't even try to put that sort of pressure on my loved ones.

My happy time comes after Christmas for a variety of reasons. Since Christmas is over, I'm not stepping on anyone's presents if I buy myself something. Also, since January is a slow time for gear sales, my regular suppliers really appreciate my buying spree.

So, with that preamble, let me give you a few photos of what showed up over here in ToyLand...

Yes indeed, I finally pulled the trigger on ordering DigitalTigers' ZenView PowerTrio HD. The combined resolution of the three monitors involved is 4960x1600 - just over 7.9 million pixels. Woohoo! The package arrived remarkably fast, in around a week. There are two boxes, one containing the Apple 30" Cinema Display, and the other with everything else in it: the two Samsung 204T panels, the stand, cables and instructions.
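The 4960x1600 figure checks out with a little arithmetic. Note the side-panel orientation below is my inference from the numbers - the Samsung 204T is natively 1600x1200, so the math only works if the two wing panels are turned portrait:

```python
# Sanity-check the quoted 4960x1600 combined resolution.
# Assumption: the two Samsung 204T panels (natively 1600x1200) are
# mounted portrait (1200x1600) flanking the 2560x1600 Apple display.

panels = [
    (1200, 1600),  # left Samsung 204T, portrait
    (2560, 1600),  # Apple 30" Cinema Display
    (1200, 1600),  # right Samsung 204T, portrait
]

total_width = sum(w for w, _ in panels)           # 1200 + 2560 + 1200
total_height = max(h for _, h in panels)          # all panels are 1600 tall
total_pixels = sum(w * h for w, h in panels)

print(f"{total_width}x{total_height} = {total_pixels:,} pixels")
# 4960x1600 = 7,936,000 pixels
```

That works out to 7,936,000 pixels, which is the "just over 7.9 million" in the post.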

The "everything else" box was nicely packed. Each set of components sat in its own foam layer. The picture above is the top layer, containing the stand. Beneath that were the layers of the Samsung displays, each separated by foam and/or styrofoam. It's a great package.

Here's a look inside the Apple box. Note the large monitor. That's 30" diagonal, baby.

I didn't realize that the LG.Philips display was actually the Apple 30" Cinema Display. The big whammy there is that there's only a one-year warranty on this display, unlike the Samsung panels (and virtually every other LCD out there), which have three-year warranties. Also, the cable on the Apple display is hard-wired, which meant using an extension cable. My displays sit about 12 feet from the computer when you account for going through the desk, along the cable channel at the back of the desk and into the slide-out workstation bay. For the Samsung displays, I bought 12-foot DVI cables, but I had to get a dual-link DVI extension cable.

Although the stand is branded DigitalTigers, I'm pretty sure it's an Ergotron stand, if for no other reason than that I already had the same style of stand for my old triple-screen display. Admittedly, to handle these enormous monitors, this stand is a bit bigger.

Next step - serious testing.

Thursday, February 9, 2006 10:06:04 PM (Pacific Standard Time, UTC-08:00)

 

All content © 2023, Richard Campbell