Last year (2011), when I graduated from high school, my school was going through a huge overhaul of its campus. The old campus was slowly being demolished as new buildings went up. At the same time, the school went through a shift in its technological resources: it decided to write off, or just plain junk, a significant portion of the old computers (P4-era Dell Optiplex machines), and I happened to obtain two 17″ TFT LCD panels. I brought them home thinking I'd do something with them in the future. I was working with an upstart gaming studio, E1FTW Games (http://www.e1ftwgames.com/), that summer (I still am), and I had an iMac on my desk, so I did nothing with the monitors.

After I left for college, I used the pair of them with an old eMachines computer my family had long since forgotten about as my at-home machine, because my primary (most awesome) computer had come with me to college and resided in my dorm room. When I finished my first year, I moved back in with my parents (where I still live as of July 2012) and set my big computer back up. I use a 1080p 23.5″ LCD TV as my primary monitor, but I seriously wanted to use the pair of monitors with my desktop. Much to my dismay, Nvidia GPUs only support two monitors per chip, so even though I had three monitors on my desk, only the 23.5″ panel and one of the smaller screens worked.

So there I was, trying to find the cheapest Nvidia GPU that would fit into a PCIe x1 slot. Much to my surprise, they cost more than their PCIe x16 counterparts, something I regard as pretty damn stupid. Then I searched for one that would fit in a PCI bus, and those cost even more than the PCIe x1 cards. It just wasn't fair. I was lining up to buy a GeForce GT 430 for something like $80 that slotted into a PCI socket. I was pretty bummed that this seemed to be the only solution, but then I had an idea: PCIe is supposed to be failure tolerant.
If one of the lanes goes dead, the bus just isolates the problem and ignores that it exists. So I had a thought – could I stick a PCIe x16 card in a PCIe x1 slot and run it at just 1/16th the bandwidth? Sure enough, there were websites all over the internet describing how to cut the closed end off a PCIe x1 slot and insert a PCIe x16 card. I decided to try it out, and what do you know, it worked. So here are some pictures.
There was one unforeseen side effect of running the GPUs under Linux (my OS of choice). I was trying to use Xinerama to make one contiguous display so I could do the awesome extended-desktop thing, but alas, it was not to be, since I am using two wildly different cards. The GeForce 7300 is so old it was available before Windows XP had any service packs. It doesn't even use unified shader processors: it has dedicated vertex and fragment units straight on the card and can only run one shader program at a time – it's a DX9 GPU. The primary card is a GeForce GTX 560: a card with 8 times the memory, hundreds of times the performance, 336 CUDA cores, and support for DX11 and OpenGL 4.x. Compositing did not work, and because OpenGL wasn't available on every display (the old card isn't compatible with the main one), KDE wouldn't run the effects manager. This resulted in really slow window operations; the UI was very laggy.

So I decided to give separate X screens a go. It works flawlessly. Windows may be locked to their respective screens, but it's not at all bad. Kwin places new windows on the screen where the mouse is when the application is launched. I do wish that when I want a new Chromium window I could put it on another screen without having to run DISPLAY=":0.2" chromium from the console every time it's already launched on another X screen. I spend a lot of time in the console, though, so it's really not too bad – it beats having only one monitor. Since I did it this way, OpenGL applications are supported in all the windows, and unless instructed otherwise they start on the primary screen. Fullscreen OpenGL applications on the two side monitors are unpredictable and unstable, but they run just fine on the center screen, driven by the massive GPU. All in all, it's an awesome setup and I love it.
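For anyone who wants to try the separate-X-screens arrangement, it can be sketched in xorg.conf roughly like this. This is a sketch, not my exact config – the BusIDs and identifiers here are placeholders (find your own with `lspci | grep VGA`), and the duplicated Device sections with a `Screen` number are the Nvidia driver's way of splitting one card's two outputs into two X screens:

```
# Sketch of an xorg.conf for three separate X screens across two Nvidia cards.
# BusIDs below are placeholders - check yours with: lspci | grep VGA

Section "Device"
    Identifier "GTX560"
    Driver     "nvidia"
    BusID      "PCI:1:0:0"
EndSection

# One Device section per output of the second card, distinguished by "Screen".
Section "Device"
    Identifier "GF7300-0"
    Driver     "nvidia"
    BusID      "PCI:2:0:0"
    Screen     0
EndSection

Section "Device"
    Identifier "GF7300-1"
    Driver     "nvidia"
    BusID      "PCI:2:0:0"
    Screen     1
EndSection

Section "Screen"
    Identifier "Center"
    Device     "GTX560"
EndSection

Section "Screen"
    Identifier "Left"
    Device     "GF7300-0"
EndSection

Section "Screen"
    Identifier "Right"
    Device     "GF7300-1"
EndSection

Section "ServerLayout"
    Identifier "ThreeHead"
    Screen 0 "Center"
    Screen 1 "Left"  LeftOf  "Center"
    Screen 2 "Right" RightOf "Center"
EndSection
```

With the layout above, the screens show up as :0.0, :0.1, and :0.2, and you can point an application at any of them with the DISPLAY variable, e.g. DISPLAY=:0.2 chromium.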
Linux has come a long way since its conception, and now, with Unity3D, a very popular game engine, officially supporting Linux, and Autodesk releasing its 3D software (such as Maya) for Linux, maybe Windows will start losing its stranglehold on gaming.