
The Tech Report has posted a review of Nvidia's GeForce 8300 chipset:

Just a few short years ago, home theater PCs were cutting-edge. You pretty much had to be an enthusiast to even know such a thing was possible, and setting up a suitable system wasn't cheap—especially if you wanted to make the most of a high-definition TV. But as is often the case in this industry, cutting-edge features and capabilities quickly trickle down to the mainstream. Even today's run-of-the-mill home theater PCs are leagues ahead of the once-impressive media rig that I assembled several years ago and still use today.

Several factors have conspired to make home theater PCs so capable and popular. Microsoft deserves some credit for bringing a 10-foot GUI to Windows, making it easier for folks to control their PCs from the couch without having to mess with additional software. The industry trend toward lower power consumption has helped, too, delivering scores of cool-running chips that can get by with the kind of near-silent cooling you want in your living room. Integrated graphics chipsets have also stepped up in a big way, offering credible gaming chops and an arsenal of advanced video decoding tricks.

For a few months now, AMD's 780G has reigned as the only integrated graphics chipset capable of handling high-definition video decoding. Now it has company in the form of Nvidia's new GeForce 8300. This single-chip core logic package features a graphics core derived from the GeForce 8400 GS, full Blu-ray decode acceleration, a HyperTransport 3.0 processor link primed for Phenom processors, PCI Express 2.0 connectivity, Gigabit Ethernet, loads of SATA RAID ports, and an even dozen USB ports. Impressive specs, no doubt, but can the GeForce 8300 unseat the 780G as our integrated graphics chipset of choice? Read on to find out.