
Thursday, May 31, 2012

Intel Says Haswell Coming in 2013

Intel has dropped a few hints about its upcoming 22 nm Haswell architecture, currently under development by the company's Oregon team. In a post on the Intel Software Network blog titled "Haswell New Instruction Descriptions Now Available!", the company reveals that it plans to launch the new CPU in 2013.

Haswell will utilize the same power-saving tri-gate 3D transistor technology that debuted with Ivy Bridge in early 2012. Major architectural changes reportedly include a completely redesigned cache, support for fused multiply-add (FMA3) instructions, and an on-chip vector coprocessor.

The vector coprocessor, which will work with the on-die GPU, was a major focus of the post. The company is preparing a set of instructions called Advanced Vector Extensions (AVX), which speed up vector math. The company writes:

Intel AVX addresses the continued need for vector floating-point performance in mainstream scientific and engineering numerical applications, visual processing, recognition, data-mining/synthesis, gaming, physics, cryptography and other areas of applications. Intel AVX is designed to facilitate efficient implementation by wide spectrum of software architectures of varying degrees of thread parallelism, and data vector lengths.
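For a sense of what that means in practice, AVX operates on 256-bit registers, so a single instruction can process eight single-precision floats at once. Here is a minimal sketch using the C intrinsics from <immintrin.h> (an illustration, not code from the Intel post):

```c
#include <immintrin.h>
#include <stdio.h>

int main(void) {
    /* Two vectors of eight single-precision floats each. */
    float a[8] = {1, 2, 3, 4, 5, 6, 7, 8};
    float b[8] = {8, 7, 6, 5, 4, 3, 2, 1};
    float r[8];

    __m256 va = _mm256_loadu_ps(a);      /* load 8 floats into one 256-bit register */
    __m256 vb = _mm256_loadu_ps(b);
    __m256 vr = _mm256_mul_ps(va, vb);   /* one instruction multiplies all 8 lanes  */

    _mm256_storeu_ps(r, vr);
    for (int i = 0; i < 8; i++)
        printf("%.1f ", r[i]);
    printf("\n");
    return 0;
}
```

Compiled with AVX enabled (for example, gcc -mavx), the multiply above becomes a single vector instruction; Haswell's FMA3 support adds fused multiply-add variants such as _mm256_fmadd_ps on top of this.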

According to CNET, Intel's marketing chief Tom Kilroy indicated that Intel hopes the new chip's integrated graphics will rival today's discrete graphics.

Intel has a ways to go to meet that objective -- its on-die GPU in Sandy Bridge marked a significant improvement over past designs, but it still fell far short of the integrated GPUs in Advanced Micro Devices, Inc.'s (AMD) competing chips.

Intel has enjoyed a love/hate relationship with graphics makers AMD and NVIDIA Corp. (NVDA). While it has been forced to allow their GPUs to live on its motherboards and alongside its CPUs, the company has also long dreamed of usurping the graphics veterans. Those plans culminated in the company's Larrabee project, which aimed to offer discrete Intel graphics cards.

Now that a commercial release of Larrabee has been cancelled, Intel has seized upon on-die integrated graphics as its latest answer to try to push NVIDIA and AMD out of the market. Intel is heavily promoting the concept of ultrabooks -- slender notebooks like Apple, Inc.'s (AAPL) MacBook Air or ASUSTeK Computer Inc.'s (TPE:2357) UX21, which feature low-voltage CPUs and -- often -- no discrete GPU.

Mr. Kilroy reportedly wants ultrabook manufacturers using Haswell to shoot for a target MSRP of $599 USD, which would put them roughly in line with this year's Llano notebooks from AMD and its partners. That's about $100 USD less than current Sandy Bridge notebooks run.

Intel also faces pressure from a surging ARM Holdings plc (ARMH), which is looking to unveil notebook processors sometime next year.

Wednesday, May 30, 2012

What Is DDR3 RAM?

DDR3 modules have greater speeds and more memory capacity than prior generations, though to take advantage of them, the memory sockets on the computer's motherboard must match the DDR3 standard.

DRAM

  • A computer's main memory takes the form of Dynamic Random Access Memory (DRAM). This memory type achieves high capacity by using only one transistor (plus a storage capacitor) per data bit, the fundamental unit of data. Each byte has its own unique address, making it directly accessible.

DDR3

  • Double Data Rate memory gets its name from the fact that it moves data twice every clock cycle. The clock is a high-speed electronic pulse with abrupt rising and falling edges. DDR moves data on both the leading and trailing edges of the pulse, an improvement over earlier types of memory modules, which moved data only once per pulse.

 

Speed

  • The DDR3 series of memory modules comes in four standard speeds: 800 MHz, 1,066 MHz, 1,333 MHz and 1,600 MHz (strictly speaking, millions of transfers per second). These correspond to peak bandwidths of 6.4 gigabytes per second, 8.53 GB/s, 10.67 GB/s and 12.8 GB/s. These rates are double those of DDR2 memory, which, in turn, were a twofold improvement over the original DDR series.
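Those bandwidth figures follow directly from the 64-bit (8-byte) width of a DDR3 module's data bus: peak bandwidth is simply transfers per second multiplied by 8 bytes. A quick sketch of the arithmetic:

```c
#include <stdio.h>

int main(void) {
    /* DDR3 speed grades in millions of transfers per second (MT/s). */
    int rates[] = {800, 1066, 1333, 1600};
    const double bus_bytes = 8.0;   /* 64-bit module data bus = 8 bytes per transfer */

    for (int i = 0; i < 4; i++) {
        double gb_per_s = rates[i] * 1e6 * bus_bytes / 1e9;
        printf("DDR3-%d: %.2f GB/s\n", rates[i], gb_per_s);
    }
    return 0;
}
```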

Capacity

  • DDR3 memory modules come in power-of-two capacities, including 1-, 2-, 4- and 8-GB modules. These capacities are double those of the DDR2 series, and DDR2 doubled the capacities of DDR. Within a series, you can mix and match modules of different capacities and speeds; the slowest module sets the overall speed.

Power Consumption

  • Each DDR3 module consumes about 3 watts of power; this compares favorably with 4.4 watts for DDR2 and 5.4 watts for DDR. Lower power consumption translates to better energy efficiency and cooler operation of the computer. by j.tbarett

Tuesday, May 29, 2012

Installing Your New Video Card

Physical Installation

The first part of installing a video card is the physical installation. Always unplug the computer before you begin to remove the case; then open it so that you have access to the motherboard.
Now, you must find the correct slot for your video card. In this tutorial, we are learning how to install a PCI Express video card, so you will need to locate the PCI Express slot (commonly abbreviated PCI-E or PCIe). PCI Express (Peripheral Component Interconnect Express) is a high-performance standard implemented on all modern personal computers; it has almost completely replaced the PCI, PCI-X, and AGP standards. The standard PCI Express slot size for video cards is x16 (there are also x1, x2, x4, x8, x12, and x32 sizes, but this is not important to our tutorial). Here is how a PCI Express slot should look:
Locate the PCI Express slot on your motherboard.
Always make sure you find it! If you are unable to find the slot, your motherboard might not support PCI Express. In this case, you cannot install a PCI Express video card. DO NOT ATTEMPT TO INSTALL THE CARD IN A DIFFERENT SLOT! It may permanently damage your hardware.
After you have the box with your PCI Express video card at hand, before opening it, make sure your work environment is not full of static electricity. Static electricity can destroy your new video card or any of the components in your computer. You should always have a pair of antistatic gloves when you work with sensitive electronic components. An antistatic table mat adds another layer of protection, but it is not strictly required. Before you put your gloves on, touch a metallic surface to get rid of the static electricity. After you put your gloves on, open the box and extract the video card. It should look something like the one below (of course, the manufacturer and model number might be different):
Notice how the PCI Express interface on your graphics card matches the motherboard slot.
Now that you know where to place it and you have the PCI Express video card in your hand, let's see exactly how you install it. On one side of the card, you will see the connector; this goes into the PCI Express slot. On the front of the video card, you will see the video outputs. These must line up with the opening in the back of the case. Here is how to hold the video card so you can see the connector and the video outputs:
Make sure the video card and the motherboard are properly aligned!
You must align the video card to the PCI Express slot. Make sure the card's connectors are perfectly aligned to the PCI Express slot's connectors. This is a very important part of the process because a wrong alignment can pose difficulties and can lead to hardware failure. Here is exactly how the connectors should be matched:
Insert the video card (carefully!) into the motherboard.
Lower the video card until its connectors touch the motherboard's connectors. Gently push the PCI Express graphics card into the slot. Do not force it; it should slide in relatively easily. Keep pushing until you hear a click. The click means that your video card is in place and secured. Here is how it should look after the whole process is completed:
The video card is now physically installed!
Congratulations! You just physically installed your PCI Express video card in a professional manner, all by yourself! Your new video card is now in place and your system should see it. If your system does not see it, try reseating the card, pushing a bit harder this time. If the problem persists, your PCI Express slot or even the video card might be damaged. Contact your vendor for more information.

Driver installation

Now that we have our graphics card installed and ready to use, put your case back together, plug the computer in, and start it. Windows should start normally. However, once you see the desktop, you will notice that something is not right. The resolution is low, icons are large, and the overall image is of poor quality. Don't worry! This is because you don't have the hardware drivers installed. It is perfectly normal.
How do you install the drivers? It is quite a simple process and it should not take more than 10 minutes.

In case you still have your CD and documentation

Do you remember what came with your video card? If you don't, I'll tell you: a CD. This CD contains the hardware drivers you need. So, go ahead and insert it into the CD-ROM drive. An auto-setup window should appear, asking what you want to do. You just need to install the drivers, so go ahead and click that option. Windows should install the drivers and, after it finishes, ask you to reboot your computer. Allow it to reboot. After it restarts, the desktop is just the same as before – unfriendly! So right-click on your desktop, choose Properties, go to Settings, and choose a larger resolution. This depends entirely on your monitor, so make sure you know what resolution is recommended for yours. Click Apply, and OK. Windows will change the resolution and ask you if you want to keep it. If you like it, click OK.
Want to make sure your drivers are in place and no errors have occurred? Simply go to Start, click Control Panel, choose Performance and Maintenance, and click System. In the window that appears, click on the Hardware tab. Now click on Device Manager. Go to Display Adapters and click on the small "+". You should see the model of your PCI Express graphics card there. If there is a problem with it, a yellow exclamation mark will appear beside it. If you don't see an exclamation mark, everything is OK and your card is working properly. If you want to check it further, double-click on it. The Properties window should appear. In the General tab, you should see this:
  • "This device is working properly."
"If you are having problems with this device, click Troubleshoot to start the troubleshooter."
In the Driver tab, you should see the driver provider, driver date, and driver version filled in. This means that your graphics card is completely installed, both hardware and software. Congratulations!
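If you'd rather check from code than click through Device Manager, the Win32 EnumDisplayDevices API reports the display adapters Windows currently sees. A minimal C sketch (an optional extra; the Device Manager route above is all you really need):

```c
#include <windows.h>
#include <stdio.h>

int main(void) {
    DISPLAY_DEVICEA dd;
    dd.cb = sizeof(dd);

    /* Walk the display adapters Windows has enumerated. */
    for (DWORD i = 0; EnumDisplayDevicesA(NULL, i, &dd, 0); i++) {
        printf("Adapter %lu: %s\n", i, dd.DeviceString);
        if (dd.StateFlags & DISPLAY_DEVICE_PRIMARY_DEVICE)
            printf("  (primary display adapter)\n");
    }
    return 0;
}
```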
Keep in mind that the drivers found on the graphics card's CD can be six months old. The drivers will work, of course, but may hold back your card's performance. We all want full performance, don't we? To obtain it, we must update the drivers. The steps below apply to both updating and finding drivers.

In case you don't have the original CD or documentation

Lost the original driver CD? Don't have the documentation anymore? To make sure you still get the latest drivers installed, you must find an alternative way to get the newest version. If, for example, your PCI Express video card has an Nvidia chipset, you should go to Nvidia's website to find new drivers. Please note that Nvidia might not be the manufacturer – in my case, the manufacturer is Asus. The card manufacturer does not usually keep the latest drivers on its website; go to the chipset manufacturer's official website if you want to find the latest drivers. Here is where to go on the Nvidia website:
The NVIDIA driver website homepage.
Once in the download drivers section, choose your graphic card model, click search, and you should be able to find and download the driver:
Selecting the correct NVIDIA device.
You must now install the driver. Go where you downloaded it, double click on it, and the following window should appear:
The security warning that occurs when you run the NVIDIA driver.
Click Run because you can be sure that this driver is genuine, from Nvidia's own official website. Another window should appear:
The first screen of the installation process of the NVIDIA driver.
You must choose where to extract the files. Choose a drive where you have approximately 100-150 MB of free space. After extraction is completed, the setup should start. It looks something like the one below:
Completed extraction of the NVIDIA video driver.
Click Next, accept the Terms and Conditions, and click Yes. The drivers will be installed. After it finishes, you will be asked if you want to restart your system now or later. If you are working on something, save, and then click "Yes, I want to restart my computer now”:
Final step of the NVIDIA video driver installation.
Go to Device Manager, double click on your PCI Express video card and go to the Driver tab. You should see the driver date and driver version there. Notice that both fields are filled and up to date.
Some drivers have bugs and can cause errors. Don't blame the manufacturer; everybody makes mistakes! If you don't manage to find the right driver on the chipset manufacturer's website, you should try Google. For example, I googled "latest GeForce GTX 670 drivers" and found quite a large list of websites. You should pick a website that you can trust, because anything you install on your computer can contain viruses, spyware, or adware. There are many websites that act like huge video driver databases. These websites offer both old and new drivers. Some of them require a paid membership, but you should be able to find some that are completely free.
If you don't find your driver anywhere, you should try the premium websites. If this does not fix the problem, try to email the chipset manufacturer and ask about the drivers – though this really is the worst-case scenario! I haven't heard of any PCI Express graphics card without proper drivers; the technology is recent, so drivers are abundant.
Our tutorial ends here. I hope you found it a good read and that it will help you when it comes to installing a PCI-E video card, both hardware and software. Good luck with your new card. by playtool.com

Windows 8: The End of Endless Reboots

Microsoft yesterday promised that a feature it's added to Windows 8 will put a stop to endless reboots.
Unlike earlier versions, Windows 8 will automatically call up a new menu with repair and recovery options when the software sniffs out problems getting the machine to boot or the OS to load properly.

In a post to the Building Windows 8 blog Tuesday, Chris Clark, a program manager with the User Experience team, described new tools embedded in the operating system designed to step in when a PC reboots more than twice because of problems.

Although Clark couched the changes as necessary because of increasingly fast boot times -- meaning users are often unable to interrupt the process with traditional key presses like F2 or F8 -- one side effect is that endless reboots should be a thing of the past.

The problem has plagued Windows at times.

In 2008, an update to prep machines for the release of Vista Service Pack 1 (SP1) crippled PCs when it sent them into an endless cycle of rebooting.

The same trouble resurfaced in late 2009 as users tried to upgrade to the then-new Windows 7 from Vista, and again in 2010 when a Microsoft security update conflicted with rootkit-infected Windows XP PCs.

Windows 8, however, features what Clark called "automatic failover," which loads a new boot options menu "when there is no way to successfully complete Windows startup." From the boot menu, users can select the operating system's repair or restoration tools to try to find and fix the underlying problem.

"This automatic failover behavior will take you directly to the boot options menu whenever there is a problem that would otherwise keep your PC from loading Windows," said Clark. "This even includes cases where it appears  that boot has succeeded, but in actuality the PC is unusable."

Windows 8 also takes other boot-related actions. An unsuccessful core boot sequence triggers an automatic retry. If a second attempt fails, then the machine automatically loads the Windows Recovery Environment, the diagnostic and recovery toolset that's been bundled since Vista.

Because the failover kicks in after Windows 8 detects two consecutive failed boots, theoretically the new OS should be immune to the endless rebooting that has dogged XP, Vista and Windows 7.
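In rough terms, the policy Clark describes boils down to logic like the following sketch (a hypothetical illustration, not Microsoft's actual code):

```c
#include <stdio.h>

/* Sketch of the described failover policy: retry once automatically, and after
 * two consecutive boot failures hand control to the boot options menu / Windows RE
 * instead of looping forever. */
typedef enum { BOOT_OK, BOOT_FAILED } boot_result;

static boot_result try_boot(void) {
    /* Stand-in for the real boot sequence; pretend it always fails here. */
    return BOOT_FAILED;
}

int main(void) {
    for (int attempt = 1; attempt <= 2; attempt++) {
        if (try_boot() == BOOT_OK) {
            printf("Boot attempt %d succeeded; loading Windows.\n", attempt);
            return 0;
        }
        printf("Boot attempt %d failed.\n", attempt);
    }
    printf("Two consecutive failures: launching the boot options menu (Windows RE).\n");
    return 1;
}
```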

Along with addressing endless reboots, the new boot options in Windows 8 let users initiate a restart that doesn't rely on a key press during the reboot process, eliminating the need to interrupt the reboot at just the right time.

As Clark explained, Windows 8 -- following in the footsteps of Windows 7 -- has such short boot times that it's essentially impossible to press a pause key such as F2 or F8 before the process moves on.

Windows 8 boot times are something Microsoft has already hawked, starting last September when the company debuted the Developer Preview. According to the company's tests, Windows 8 boots between 30% and 70% faster than earlier versions. by greggkeizer

Monday, May 28, 2012

DDR4 Coming Soon?

Can you believe DDR3 has been present in home PC systems for four years already? It still has another two years as king of the hill before DDR4 will be introduced, and the industry currently isn’t expecting volume shipments of DDR4 until 2015.
JEDEC isn’t due to confirm the DDR4 standard until next year, but following on from the recent MemCon Tokyo 2010, Japanese website PC Watch has combined the roadmaps of several memory companies on what they expect DDR4 to offer.
Frequency abound! Voltage drops!
It looks like DDR4 frequencies will be introduced at 2,133MHz and scale to over 4.2GHz. 1,600MHz could still well be the base spec for server DIMMs that require reliability, but it's expected that JEDEC will create new standard DDR3 frequency specifications all the way up to 2,133MHz, which is where DDR4 should jump off.
As the prefetch per clock should extend to 16 bits (up from 8 bits in DDR3), the internal cell frequency only has to scale at the same rate it did from DDR2 to DDR3 in order to achieve the 4GHz-plus target.
The downside of frequency scaling is that voltage isn’t dropping fast enough and the power consumption is increasing relative to PC-133. DDR4 at 4.2GHz and 1.2V actually uses 4x the power of SDRAM at 133MHz at 3.3V. 1.1V and 1.05V are currently being discussed, which brings the power down to just over 3x, but it depends on the quality of future manufacturing nodes – an unknown factor.
While 4.2GHz at 1.2V might require 4x the power, it's also a 2.75x drop in voltage for a 32-fold increase in frequency: that seems like a very worthy trade-off to us – put that against the evolution of power use in graphics cards for a comparison and it looks very favourable.
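Those ratios line up with the usual first-order estimate that switching power scales with frequency and with the square of voltage (P ∝ f·V²). A rough back-of-the-envelope check under that assumption (ignoring capacitance and I/O differences):

```c
#include <stdio.h>

int main(void) {
    /* First-order dynamic power model: P ~ f * V^2 (capacitance assumed constant). */
    double f_sdram = 133e6, v_sdram = 3.3;   /* PC-133 SDRAM   */
    double f_ddr4  = 4.2e9, v_ddr4  = 1.2;   /* projected DDR4 */

    double rel_power = (f_ddr4 * v_ddr4 * v_ddr4) / (f_sdram * v_sdram * v_sdram);
    printf("Frequency ratio: %.1fx\n", f_ddr4 / f_sdram);        /* ~31.6x */
    printf("Voltage ratio:   %.2fx lower\n", v_sdram / v_ddr4);  /* ~2.75x */
    printf("Relative power:  %.1fx\n", rel_power);               /* ~4.2x  */
    return 0;
}
```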
One area where this design might cause problems is enterprise computing. If you’re using a lot of DIMMs, considerably higher power, higher heat and higher cost aren’t exactly attractive. It’s unlikely that DDR4 4.2GHz will reach a server rack near you though: remember most servers today are only using 1,066MHz DDR3 whereas enthusiast PC memory now exceeds twice that.
Server technology will be slightly different and use high performance digital switches to add additional DIMM slots per channel (much like PCI-Express switches we expect, but with some form of error prevention), and we expect it to be used with the latest buffered LR-DIMM technology as well, although the underlying DDR4 topology will remain the same.
This is the same process the PCI bus went through in its transition to PCI-Express: replacing the bus's parallel nature with a serial approach. DDR4 will become a point-to-point bus, with the parallelism left to the memory controller itself in the form of multiple memory channels.
If we look at Intel's upcoming LGA2011 socket, which is anticipated to use a quad-channel memory interface and a single DIMM per channel, it's now quite obvious that future CPUs using this socket stand a good chance of using DDR4, especially as LGA1366 has had a well-defined three-year lifespan. In the same timeframe, DDR4 could see considerable market acceptance, so it's a smart move by Intel.
The big questions remain to be answered, then: is it a consumer (cost) friendly process, and how well does TSV (through-silicon via) cope with overclocking? We'll have to wait for the first samples in 2011-2012 to find out.

No price cuts for Windows 8

 Microsoft will not reduce the price of Windows 8 upgrades, as it did three years ago before the roll-out of Windows 7, a retail sales analyst said today.
"I would expect upgrade pricing to consumers to be on par with Windows 7," said Stephen Baker of the NPD Group. "They had a compelling reason to get consumers off of Vista and priced [it] to make that happen [in 2009]. But the reason to get consumers onto a more modern platform with a software upgrade is a lot less now than in 2009."
Three years ago, Microsoft dropped the price of the primary Windows 7 upgrade edition -- Home Premium -- by $10, or about 8%, from what it had charged customers for the comparable upgrade to Vista two years earlier. It also cut the price of the full version of Windows 7 Home Premium by $30, or 17%.
Other editions, however, including the business-oriented Windows 7 Professional, were priced the same as their Vista ancestors.

Microsoft did not openly tout the Windows 7 price cuts as a way to move people off Vista, but the newer OS has put Vista in the rear-view mirror: The latest statistics from Web metrics company Net Applications showed Vista with a 7% share of all operating systems, down from its peak of 19% in October 2009, the month Windows 7 launched.

 Windows 7 holds a 39% global share, second only to the nearly-11-year-old Windows XP.

Although Microsoft is probably weeks away from announcing Windows 8 pricing -- in 2009 it waited until late June to reveal Windows 7's -- Baker made a case for why the company will keep to its current price chart.

 "I think they see a world where the consumers' trust in Windows will be rewarded and they can derive revenue from that."

Under that theory, Microsoft would be hesitant to cut prices, believing that doing so would cheapen the value of the new OS in the eyes of customers. And Microsoft has been adamant about Windows 8's value, casting the new OS as a revolutionary departure that justifies the repeated use of the tag phrase "Windows 8 reimagines Windows."

That's why Baker saw price cuts as sending the wrong message.

"I think Microsoft will be, and correctly in my view, very wary about devaluing Windows in the customer's eye by some sort of cheap pricing trick," he said. "They believe the product has value and will grow rapidly regardless of the price of the upgrade and that is incentive enough, in my mind, to not reduce the price."

Some, however, have speculated that Microsoft will drop the price of Windows 8 to encourage users to adopt the new OS. The reason: The more people running Windows 8, the more revenue Microsoft can earn from the new Windows Store.

Windows Store is Microsoft's app market for Metro-style software, and will be accessible only to Windows 8 and Windows RT users. Microsoft will take a 30% cut of the first $25,000 each app earns, then 20% of all additional revenue.
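In concrete terms, here is a small sketch of how that split works out for an app at a few revenue levels (our own arithmetic illustration of the published terms):

```c
#include <stdio.h>

/* Windows Store split: Microsoft keeps 30% of the first $25,000 an app earns, 20% after that. */
static double store_cut(double revenue) {
    const double threshold = 25000.0;
    if (revenue <= threshold)
        return revenue * 0.30;
    return threshold * 0.30 + (revenue - threshold) * 0.20;
}

int main(void) {
    double sales[] = {10000.0, 25000.0, 100000.0};
    for (int i = 0; i < 3; i++) {
        double cut = store_cut(sales[i]);
        printf("App revenue $%.0f -> Microsoft keeps $%.0f, developer keeps $%.0f\n",
               sales[i], cut, sales[i] - cut);
    }
    return 0;
}
```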

Baker dismissed that idea as well.

"I am sure they would look at the value of Windows 8 on its own, not subsidized by something else," Baker argued. "It would not be in their best interest to use those [Windows Store] revenues to subsidize Windows 8 pricing."

While Baker believes that Microsoft will hope for a quick adoption of Windows 8, he doesn't think the company needs to reduce the price of upgrades to accomplish that. by gregkeizer

Windows 8 price possibilities
If Microsoft did cut Windows 8 prices, here are some possible outcomes based on 5% and

Sunday, May 27, 2012

Eurocom Neptune 3D

We reviewed Eurocom’s top-of-the-line mobile workstation, the Panther 2.0, in our June 2011 issue. That high-end behemoth weighed more than 15 pounds and cost upward of $5,000, but it sported a desktop Core i7-980X CPU and a pair of Radeon HD 6970s in CrossFire. This time around we’re taking a look at the company’s lighter-weight mobile workstation, the Neptune 3D.
While also billed as a high-end desktop replacement, the Neptune 3D is far more modest than its beefy big brother. It's based on a mobile Sandy Bridge CPU (Intel's Core i7-2760QM) and a single mobile GPU (Nvidia's GeForce GTX 580M). The Neptune 3D weighs less than nine pounds, but it features a 17.3-inch, 120Hz 3D display.
Eurocom bundles one pair of Nvidia’s 3D Vision active-shutter LCD glasses with the machine, and the emitter is built into the chassis. The 1920x1080 LED-backlit panel is bright, no doubt to compensate for the darkening the glasses cause. After we turned it down a bit to evaluate the screen’s quality, we saw that it produced crisp text and still images, as well as impressively dark blacks. The display was equally impressive in motion, with no visible blurring or ghosting. The matte finish did a great job of reducing ambient glare. If you dig 3D, you’ll enjoy the 3D experience the Neptune delivers; if you’re not sold on 3D, nothing about this notebook will change your mind.
Storage comes in the form of a 250GB Intel SSD, supplemented by a 750GB mechanical hard drive. The machine is also outfitted with an optical drive that can burn Blu-ray media at 6x speed and DVDs at 8x. The chiclet-style keys are comfortable and responsive. Lap weight and a two-hour battery life are well within standard range for high-end gaming notebooks.
Aside from the 3D feature, the Neptune 3D’s all-around performance is the epitome of standard. It trounced our aging zero-point machine, but that’s exactly what we expect from a system running Sandy Bridge hardware. As a gaming laptop, the Neptune 3D is equally sufficient. The GeForce GTX 580 handled all but the most demanding games with relative ease, and the system delivered benchmark numbers on par with a similarly clocked Sandy Bridge desktop machine.
The main issue here is cost. With a price tag  of $3,500, we expect a bit more bang for our buck. The 3D video is nice, but it’s not enough to justify the Neptune 3D’s gaudy price tag. Eurocom should have overclocked the CPU for even better all-around performance, or dropped a second GPU under the hood for faster gaming.
Still, if you're looking for a well-constructed desktop replacement with a sharp display, 3D capability, and strong all-around performance, the Eurocom Neptune 3D won't disappoint. But if you're looking to raise the bar on high-performance mobile gaming, look elsewhere; the Neptune 3D is unremarkable. by dan scharff



Saturday, May 26, 2012

ROG Xonar Phoebus

The Republic of Gamers is committed to helping you win more through superior audio power, precise positioning, and lifelike immersion ― the perfect aids for those critical online shooter moments. Are opposing snipers hiding in the dark, taking advantage of cover to keep you suppressed? It's time to get your team on the offensive with powerful and precise battleground audio intelligence – thanks to advanced Xonar sound hardware. No more confusion or uncertainty. Clear audio turns you into a virtual online gaming general, and paves the way to victory.

Superior audio detection and positioning make gamers equipped with ROG audio gear true champions. Developed using real life insight from full-time players and refined by the expertise of the Xonar team, the ultimate goal here is to help you win! Built for the most dedicated gamers, ROG audio products are committed to win-boosting features, from clear communications to precision detection that amplifies every footstep taken by opponents. No more sneaking around or getting stealth-flanked!

Command Your Way to Victory

ROG Command technology
Precise positioning and lifelike immersion
Get the drop on opponents with improved audio positioning and bask in game music with crystal-clear 118dB SNR sound. All done by the creators of Xonar Xense (2011 CES Innovations Award winner) and Xonar Essence One (2012 CES Innovations Award winner). The commitment of the Xonar team delivers leading-performance gaming products, merged with the excitement of a unique, stylish and gamer-focused design. ROG Xonar Phoebus, named after the Greek god of the sun and music, comes with exclusive ROG Command technology, Hyper Grounding technology, EMI-shielding and top notch components, combined with the latest Dolby® surround enhancement to ensure you hear exactly what game designers want you to hear.

ROG Command Technology Boosts In-game Voice

Clearer communications with up to 50% environmental noise cancelation
Beyond excellent sound quality, ROG Xonar Phoebus gives you clean and seamless communications for a totally immersive experience. World-exclusive ROG Command technology was specifically developed by the Xonar team to bring effective noise cancelation to a gaming soundcard set. Intelligent sound detection algorithms and array microphones (two on the dedicated control box) enable up to 50% ambient noise reduction, effectively turning loud locations such as LAN parties into something as quiet as a library.
Excellent sound quality
How it works
With an external microphone:
A smart algorithm analyzes sound detected by both the external microphone and one of the micro-mics on the control box to cancel out environmental noise.
Without an external microphone:
Array microphones on the control box form a virtual 30° cone of precise voice capture.
*Control box intelligently detects plugged-in headsets when engaging ROG Command features

Embrace Exceptional Sound Quality for the Win

118dB Clarity SNR
Up to 4X audio clarity compared with the SNR ratios of competing gaming audio cards.
Hyper Grounding technology
Noise-blocking multi-layer PCB designed with decades of ASUS professional layout experience.

Digital to Analog Converter (DAC)
The PCM1796 digital-to-analog converter on ROG Xonar Phoebus delivers clear and crisp 118dB SNR high-fidelity audio. The new soundcard therefore outperforms competing gaming audio products by a factor of four, and onboard audio by an impressive 32 times. It also excels in total harmonic distortion (THD) performance, with a minuscule 0.00039% of sound output experiencing distortion.
EMI Shielding
Stylish and efficient EMI shield blocks electromagnetic interference to help provide cleaner audio

Headphone Amplifier
Since many gamers choose to refrain from disturbing others while insisting on the best sound quality, ROG and Xonar teams integrated the top-flight TPA6120A2 headphone amplifier on ROG Xonar Phoebus, driving up to 600 ohm impedance for genuinely impactful headphone output.
3 Headphone Gain Settings
Three different gain settings cater to various headphone impedance levels, with aesthetic lighting that changes accordingly: blue for <32 ohms and red for the two settings over 32 ohms.
* SNR (Signal-to-Noise Ratio) is measured in dB: SNR = 10 log10(Psignal/Pnoise), where P represents power
** Compared with the SNR ratio (109dB) of average competing gaming audio cards
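To get a feel for what the footnote's formula means, here is a small sketch evaluating it in both directions (our own illustration, not ASUS material):

```c
#include <math.h>
#include <stdio.h>

int main(void) {
    /* SNR (dB) = 10 * log10(P_signal / P_noise), with P measured as power. */
    double power_ratio = 6.3e11;   /* example signal-to-noise power ratio */
    double snr_db = 10.0 * log10(power_ratio);
    printf("Power ratio %.2e -> SNR of %.1f dB\n", power_ratio, snr_db);

    /* Inverting the formula: 118 dB corresponds to a power ratio of 10^(118/10). */
    double ratio_118 = pow(10.0, 118.0 / 10.0);
    printf("118 dB SNR -> signal power about %.2e times the noise power\n", ratio_118);
    return 0;
}
```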

Control Box within Easy Reach

Hassle-Free Headset Connections
Clad in the trademark ROG red and black color scheme, the ROG Xonar Phoebus control box offers gamers improved response and convenience. Connected to the card via the back I/O, the control box can be placed on the desk so it remains within easy reach. The control box further offers microphone and headphone connectors, making it possible to use even short-wired headsets with ease, as it brings the connectors closer to gamers.
Instant Volume Tuning
Gamers can quickly adjust the volume or mute with one click, without bringing up game or operating system menus, or having to hunt around for display remote controls.
Illuminated Gold-Plated Jacks
Gold-plated jacks ensure greater durability. To help with gaming in the dark, the control box headphone-out lights up green, while the microphone-in glows red!

New Xonar Audio Center

New Xonar Audio Center
Effortless adjustment with New Xonar Audio Center
The latest Xonar audio center brings great sound combined with the spirit of ROG to provide gamers impactful and precise audio control delivered in style. It is designed with a range of effortless adjustment features: with one-click ease, gamers can access features such as I/O settings, volume, game effects, and digital signal processor (DSP) mode selection. The spacious and intuitive interface further delivers a completely user-friendly experience, and gamers can also enable Dolby® Home Theater V4 with a click.

Realistic Surround and Enhanced Sound

Dolby® Home Theater V4
New Dolby® Home Theater V4 enhances surround sound with improved processing and playback.
Realistic surround
  • Surround decoder ― converts 2-channel audio sources to multi-channel. Experience immersive audio through multi-channel playback
  • Surround virtualizer ― delivers stunning virtual surround playback for standard stereo headphones or speakers
Sound enhancement
  • Intelligent equalizer ― allows users to adjust tone dynamically with easy visual controls
  • Dialogue enhancer ― improves dialogue clarity for in-game communications
  • Volume leveler ― helps maintain chosen volume consistently across audio sources
from asus.com

ASUS Xonar Essence STX Virtual 7.1 Channels

The ASUS Xonar Essence STX is a PCI-e audio card that is designed for the music enthusiast. Equipped with the best components and the finest design, the STX delivers a top-of-the-line audio experience with a 124 dB SNR rating. With a built-in headphone amp that can power headphones up to 600 ohms and 6.3 mm headphone jacks, the STX makes a perfect pair with high-end headphones.


  • High Signal-to-Noise Ratio (SNR): A high SNR rating means less unwanted noise, so the audio is cleaner, which translates to better sound quality. The Xonar STX is the world's first sound card to achieve a 124 dB SNR rating for crystal clear audio.
  • Built-in Headphone Amp: A high-quality TI TPA6120A2 headphone amp supports headphones up to 600 ohms of impedance with lower than 0.001% distortion. This allows users to drive their headphones as intended without additional amplification.
  • EMI Shield: EMI shielding seals the RCA/headphone signals from noise and offers distortion-free audio.
  • Nichicon “Fine Gold” Professional Audio Capacitors: Using state-of-the-art etching technology, Nichicon “Fine Gold” capacitors deliver rich bass and crystal clear high frequencies.
  • 6.3 mm Headphone and Mic Jacks: The 6.3 mm headphone jack offers compatibility with high-end headphones. An adapter is included for use with headphones that have 3.5 mm connectors.

Friday, May 25, 2012

Corsair Vengeance 1500 USB Gaming Headset

We awarded Corsair's HS1 USB headset a 9 verdict last year, remarking that its huge 50mm drivers, solid and comfortable construction, and $100 price tag added up to a surprisingly good value for a freshman effort. The one element that denied the HS1 a Kick Ass award was its uninspired looks.
Corsair’s new flagship USB headset, the Vengeance 1500, retains all the strengths of the HS1 and eliminates nearly all its weaknesses. The Vengeance 1500 packs the same gigantic drivers as its predecessor, providing top-notch sound quality for this price range. The circumaural design and thick, squishy padding make for a tight seal around your ears that isolates you from the pollution of ambient noise.

The Vengeance 1500 sounds every bit as good as Corsair's earlier HS1 USB headset, and it looks a whole lot better.
While it doesn’t deliver the level of quality that some higher-end products provide—Sennheiser’s PC 333D G4ME, for example—the Vengeance 1500 does provide respectable dynamic range and bass response that’s perfectly suitable for both games and movies. And while nothing can compare to an actual surround-sound setup, Corsair does deliver Dolby Headphone. This software algorithm upmixes stereo and 5.1-channel sources to simulate a 7.1-channel speaker system wrapped around your head, delivering better positional awareness than stereo phones are capable of providing.
Build quality as compared to the HS1 has also improved significantly. The struts connecting the ear cups to the headband feature an attractive brushed-aluminum finish, and the cups themselves swivel to lay flat against your chest when the headset is resting on your neck. They might feel odd if you’re transitioning from an on-ear headset, but after many extended gaming sessions, we’ve found the Vengeance 1500 to be one of the most comfortable headsets we’ve tested. They are quite large, however, so they might not be the right choice if your head is particularly small.
Corsair's HS1 is a solid headset; the only reason we wouldn't recommend it today is that the Vengeance 1500 is even better. If you're looking for a serious gaming headset and can afford to spend 100 bucks, you won't go wrong with this one. by alex castle

Things To Look Forward To In Windows 8

Windows 8 To Have Flash Integration?

 Reports suggest that Adobe Flash will be supported in both desktop and Metro-style Internet Explorer 10 browsers in Windows 8.
According to reports, Microsoft is looking to integrate Adobe Flash capabilities into Internet Explorer 10.
Adobe Flash will function on any website when the browser is run as a desktop application, but will be limited when used as a Metro-style browser on tablet devices.
Screenshots from WinUnleaked, released by Windows ‘insider’ Canouna through a tweet, show that the browser plugin will have some functionality in the Metro version of Internet Explorer 10, even though previous reports had stated such plugins were to be barred from use, as the corporation expects a radical shift towards HTML5, and away from plugin services, in the coming years.

The Flash capability of the Metro browser is reported to be limited to “trusted” websites. These include streaming sites such as Hulu, YouTube, and Vimeo, news broadcasters such as CNN and the BBC, and a number of entertainment and social media sites including Facebook.
Internet Explorer chief Dean Hachamovitch has previously called Flash and other similar plugins a “relic” of an archaic time in the development of Internet services. He said:
“Running Metro-style IE plug-in free improves battery life as well as security, reliability, and privacy for consumers. Plug-ins were important early on in the Web’s history. But the Web has come a long way since then with HTML5.
Providing compatibility with legacy plug-in technologies would detract from, rather than improve, the consumer experience of browsing in the Metro-style UI.”
However, as many websites still rely on Flash, barring the Metro browser from running it had the potential to damage the browser's attractiveness to consumers. By integrating Flash directly, Microsoft sidesteps its previous position: Flash will not be treated as a standard plugin but as a feature of the browser itself.
Microsoft’s anticipated Release Preview version of Windows 8 is to be launched in the first week of June.

AMD Entering Tablets With New Processors

It hasn't been much of an ARM wrestle in the tablet space up to this point, and it's not because AMD and Intel haven't talked the talk. For the most part, they just haven't walked the walk, which has allowed ARM to dominate the category. That could change once Windows 8 comes into view in a few months, and if Microsoft's upcoming Metro-infused OS proves popular on touchscreen tablets, you can expect a dogfight between AMD and Intel.

Intel hasn't been shy about saying it plans to compete in the mobile device category, but what about AMD? According to DigiTimes, AMD is readying the release of its Hondo processors, which are low-power chips built around the company's 40nm Bobcat architecture. Hondo processors boast a power consumption of just 4-5W, 1-2 processing cores, and on-die DirectX 11 graphics. These will go up against Intel's Clover Trail-W CPUs.

AMD will roll out Tamesh processors in 2013. These second-generation ultra-low-power APUs will feature AMD's Jaguar architecture with additional power enhancements and performance improvements. by paul lilly

Wednesday, May 23, 2012

3DMark 11 Performance Preset Comparison

Nvidia Geforce GTX 780 SPECIFICATION

 


    GPU
    Manufacturer: NVIDIA
    Specs status: Rumored
    GPU Series: GeForce 700
    GPU Model: Maxwell GM110
    Fabrication Process: 22nm
    CUDA Cores: 2048
    TMUs: 128
    ROPs: 32

    CARD
    Dimensions (inches): – / – / Dual-slot
    Display Outputs:

    SUPPORT
    Bus Support: PCI-E 3.0 x16
    DirectX Support: 11.1
    OpenGL Support: 4.2

    CLOCKS
    GPU Base Clock: ~1100 MHz
    GPU Boost Clock: ~1150 MHz
    Shaders Clock: ~1100 MHz
    CPU Clock: ~1000 MHz
    Memory Clock: ~1550 MHz
    Effective Memory Clock: ~6200 MHz

    MEMORY
    Memory Size: 3072 MB
    Memory Type: GDDR5
    Memory Bus Type: 384-bit
    Memory Bandwidth: ~250 GB/sec

    POWER
    Power Draw: 200 W / – W
    Min Required PSU: 500 W
    Power Connectors: 2 x 6-pin
    Temperature: – C / – C / – C
    Noise Level: – dBA / – dBA


    PERFORMANCE

    3DMark11 Score
    3DMark Vantage

    RELEASE
    Release Date: Q1 2013
    Launch Price: