+1 for Nvidia

An article from Legit Reviews says:

NVIDIA has centered the Windows taskbar to the middle display by default! When we first started using a triple monitor setup like this one of our biggest gripes was that the Windows start button was all the way to the left and the clock was four feet over on the far right monitor. This was a pain in the butt, but NVIDIA has solved it and enabled this by default (yes, you can disable it in the control panel). NVIDIA also made it to where you can maximize windows to a single display, which is something you can’t do on the AMD Eyefinity setup without having to find and install optional software.

That means that with an Nvidia graphics card all the displays will act independently, as opposed to all the displays acting as one big monitor like ATI. I don’t know about you, but one of the reasons I didn’t use Eyefinity that much is that maximizing a window would make it stretch across all three monitors.

NVIDIA GeForce GTX 680 Accessory Display

T minus one month till I get the GTX 670!

GTX 670 Is the Winner

It looks like the GTX 670 is the card to get. It’s not too expensive, about $400, and still has all the features of the GTX 680 with just a little less power: 1344 stream processors and a 915MHz core clock. I saw some rumors about a 4GB VRAM version coming out at the end of June, and if it can overclock like the current GTX 670 it will probably be the best card to have for the next two years.

I’ll probably get the 4GB version of the GTX 670 since it will have enough power to run my displays and still be in my budget.

I sold my 5870s about 6 months ago and will have to wait another month to get a new card. I hope a new card isn’t announced in the next year or so, making me start this process all over again. 🙂

OBi100 VoIP Box

I just ordered the OBi100 from Amazon. What this cool little device, about the size of a deck of cards, does is connect to your Google Voice account so you can call out with a regular plug-in-the-wall phone.


I had a Cisco device that did this a while back, but when Google purchased Gizmo5 they stopped all SIP calls. For around $45 I can use a real phone instead of using my cell phone all the time.

I’ll post more when it comes in.

Skype HD Video Calling

Finally my video in Skype (v5.9) is in HD, and it only took an $8.5 billion purchase by Microsoft. I think Logitech had an agreement with Skype to only allow their “Skype Certified” cameras to enable HD, and now that Microsoft owns Skype, other cameras get HD enabled automatically.

For the past few years I’ve tried everything to get my video to be HD: new web cameras, editing configuration files, forwarding ports. Even though I tried everything in my arsenal of tech tricks, I always got “watching YouTube on my iPhone over 1xRTT” quality, or 360p on good days. If you use YouTube you know that 360p is just barely decipherable.

YouTube Video Quality Settings:

The video quality was so bad that I had started using Google Hangouts instead (more on that later). Then, I don’t know why, but I decided to upgrade Skype and, BAM, HD video. I couldn’t believe it. After everything I tried in the past, out of nowhere it just works. I hung up and called my friend back and, BAM, HD again. Just to verify, I restarted my computer and I was still able to send in HD.

When I refer to HD I mean anything above 480p (704×480). The video I was sending was 720p (1280×720). I believe my camera can send 1080p (1920×1080), but not everyone has the resources to view that. As you can see below, I was sending my video out at around 400KB/s and using about 20% of my CPU. My computer is not the latest and greatest, but it is pretty fast (3.5GHz, 4 cores with HT), and I have a pretty speedy broadband connection (50/7) from Comcast. Viewing the technical info of the call, the average upload (sending video) was 480KB/s, whereas the download (receiving video) was 112KB/s. All that means is that with a decent computer, a mid-tier broadband connection, and a capable camera, you should be able to send HD video.

Network Usage:
Skype Network Usage

CPU Usage:
Skype CPU Usage
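Those numbers are easy to sanity-check. As a quick back-of-the-envelope sketch (using the 480KB/s and 112KB/s averages from the call’s technical info, against my 50/7 Mbps plan), converting kilobytes per second to megabits per second shows how little of the connection HD video actually uses:

```python
# Rough bandwidth check for the Skype HD call described above.
# Inputs from the post: ~480 KB/s average upload, ~112 KB/s download,
# on a 50 Mbps down / 7 Mbps up Comcast connection.

def kbytes_to_mbps(kb_per_s: float) -> float:
    """Convert kilobytes/second to megabits/second (1 KB = 8 kilobits)."""
    return kb_per_s * 8 / 1000

upload = kbytes_to_mbps(480)    # sending 720p video
download = kbytes_to_mbps(112)  # receiving video

print(f"Upload:   {upload:.2f} of 7 Mbps")    # Upload:   3.84 of 7 Mbps
print(f"Download: {download:.2f} of 50 Mbps") # Download: 0.90 of 50 Mbps
```

So a 720p send only uses a little over half of even a 7 Mbps uplink, which lines up with my claim that a mid-tier connection is enough.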

Doing some testing, I found that Skype tests the connection and a few other things to make sure you can send in HD. I made a video with my brother Aaron to demonstrate. About 50 seconds in you can see my video change from letterbox (square) to widescreen (rectangle). You can also see my network usage go from 350KB/s to about 360KB/s. Aaron wasn’t able to get his video to send, hence the letterbox window.

HD Test Call


HD Video Receive
Above you can see a screenshot of what the HD video looks like on the receiving end; it’s about twice the size and not as blurry.

Yes, I know my office area is a mess; I’m moving :).

I started using Google Hangouts when it launched a few months ago. It had minor “beta” issues but was still better than Skype in some ways: it uses less processing power and allows multi-person video calls for free (Skype makes you pay $10 a month), so I tend to use it more. Some of the people I talk to have slower computers and can’t do anything else while they talk on Skype. Which brings me to the only negative of HD video on Google Hangouts: the video is HD resolution (the size of the video) but still has some pixelation since, I think, the video is relayed from their server to your computer rather than over a direct connection like Skype.

Either way, they are both great technologies, but I’ll probably use Google Hangouts and Google Chat because they work natively on Android phones.

VelociRaptor 600GB

Well, it’s finally here: my current hard drives (WD6000HLHX) are half what I paid for them a little over 3 years ago. I have two of them in RAID 0 and use a 1TB Caviar Black for more important data. I think this is an indication that I should install my OCZ Vertex 2 SSD :).


If you are looking for a nice upgrade and for some reason don’t want an SSD, the 600GB Raptor is a steal at $169. I do know they came out with a 1TB Raptor for $329 that performs very well, but if you’re going to spend close to $350 on a hard drive you might as well get an SSD.

Graphic Card Conundrum

Currently there are three graphics cards I can upgrade to. The first is the GeForce GTX 680 (I’ll list the specs below for all the cards); the other two are the Radeon HD 7970 and the Radeon HD 7950 from ATI. They all perform well, but my goal is to be able to enjoy Battlefield 3 and the next few game releases on all three of my 24” monitors. By “enjoy” I mean 50fps or higher. If you’ve seen my other posts about my current setup, I have three Dell U2410s connected to my custom-built workstation.

My workstation is a Core i7-860 overclocked to 3.5GHz with 16GB of RAM (used for virtual machines) and two 600GB VelociRaptors (I’ll use my SSD sooner or later). I had two 5870s in Crossfire, which worked out well for Bad Company 2, Civ 5, and other games before the current generation of power-hungry games.

GeForce GTX 680

GeForce GTX 680

The GeForce GTX 680, above, has 1536 stream processors, a 28nm process, and a 1006MHz core clock with 2048MB of RAM.

Below you can see the Radeon HD 7970. It has 2048 stream processors, a 28nm process, a 925MHz core clock, and 3072MB of RAM.

Radeon HD 7970

Radeon HD 7970


The HD 7950 looks exactly the same but has 1792 stream processors, a 28nm process, an 800MHz core clock, and 3072MB of RAM.
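For a rough paper-spec comparison of the three cards, you can multiply stream processors by core clock (times two FLOPs per shader per cycle, the usual single-precision figure for this generation). This is just a back-of-the-envelope proxy, not a benchmark:

```python
# Theoretical single-precision throughput from the spec lists above.
# GFLOPS = stream processors x core clock (MHz) x 2 FLOPs/cycle / 1000.
cards = {
    "GeForce GTX 680": (1536, 1006),
    "Radeon HD 7970":  (2048, 925),
    "Radeon HD 7950":  (1792, 800),
}

for name, (shaders, clock_mhz) in cards.items():
    gflops = shaders * clock_mhz * 2 / 1000
    print(f"{name}: {gflops:.0f} GFLOPS")
```

That works out to roughly 3090, 3789, and 2867 GFLOPS respectively, which is why raw-spec comparisons favor the HD 7970 even though the real three-monitor benchmarks end up nearly tied.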

Right now I’m using a graphics card that one of my friends is letting me borrow, a GTX 480. It runs Battlefield 3 at 50fps to 60fps on low and AOE above 60fps, but my main goal of gaming on three monitors is too much for the card, and it doesn’t support the feature anyway.

I’m truly stuck between the GTX 680 and the HD 7970. I was eyeing the HD 7950, but after doing a little more research I saw that I would need the higher-performance cards for my setup: at the triple-monitor resolution of 5760×1200, the HD 7950 in Crossfire falls about 10fps behind. The three-monitor multi-GPU benchmarks for the top cards are pretty close, too: a 26fps average for the GTX 680 and a 25fps average for the HD 7970.
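To put those benchmark numbers in perspective, going from one monitor to three is a straight tripling of the pixels the cards have to render every frame:

```python
# Pixel counts for the resolutions mentioned above.
single = 1920 * 1200   # one Dell U2410
triple = 5760 * 1200   # three U2410s side by side (Surround/Eyefinity)

print(f"Single monitor: {single:,} pixels")   # 2,304,000
print(f"Triple monitor: {triple:,} pixels")   # 6,912,000
print(f"Ratio: {triple / single:.0f}x")       # 3x
```

So a card pushing 60fps at 1920×1200 has to do roughly three times the work for the same frame rate across all three displays, which is why even these flagship cards average only ~25fps in the multi-monitor tests.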

I might be able to overclock the cards too. With overclocking I can push the card to its limit and get free performance. Here too the cards are almost exactly the same except for the price. It makes me wonder if $50 is worth an extra monitor and a little less power draw: 480 watts for the GTX 680 SLI versus 490 watts for the HD 7970 CF.

I’ll do a bit more research, make a pros and cons list next week, and see which card ends up on top.

(Images from Guru3d.com)