Monday, May 20, 2013

The Best Operating System for Gaming

Every gamer asks this question every now and then: which is the best operating system for gaming? I’m here to answer it.
The real battle is between Windows XP and Windows 7, but if you ask me there is no “best operating system”. Every gamer has different needs that can’t all be satisfied by a single operating system, so we have to compromise and choose the one that best fits our individual needs.
Depending on our requirements, the “best operating system” can be any version of Windows. If you want to play DirectX 9 games, the best choice is probably Windows XP, but if you want DirectX 10 and DirectX 11 you must choose a newer operating system: Windows 7. But new isn’t always better; see Vista.
Many people say that Windows XP is dying or dead, and I like to tell them that more than 30% of PC users still use it. Even though Microsoft recently stopped supporting Windows XP, that doesn’t mean it’s completely dead.
Best Operating System
As you can see, contrary to popular belief, Windows 7 actually outperforms Windows XP in DirectX 9 games, and is thus better for gaming! This time around, new does seem to be better. Windows 7 runs well even on older hardware, making it king over other operating systems.
When it comes to DirectX 10 and 11 games there is absolutely no competition whatsoever. Because Windows XP only has support for DirectX 9 there is clearly one winner: Windows 7.
Furthermore, Microsoft has stopped supporting Windows XP, while Windows 7 continues to receive improvements.
If you’re not sure which architecture of Windows 7 to choose, x64 or x86, consider this: most hardcore gamers choose the 64-bit version of Windows because it supports more than 3 GB of RAM and is more stable and reliable.
40% of gamers on Steam use Windows 7 Ultimate x64, and more users make the move to x64 every day. In contrast, only 20% of Steam users have Windows XP (x32 or x64) installed.
All we can hope is that Windows 8 will take 7’s throne at release, but something tells me that Windows 7 will be the best operating system for many years to come, or at least until DirectX 12 is released.

Best Operating System for Gaming is: Windows 7 x64

All details on the games that were included in this roundup can be found here: BenchmarK3D.com/benchmarks
I will continue to add games, so the graph above is not final.
Here is a list of included games:
  • Alice Madness Returns
  • Battlefield Bad Company 2
  • Call Of Juarez: Bound in Blood
  • Call Of Juarez: The Cartel
  • Bioshock 2
  • Crysis 2
  • Dead Island
  • Deus Ex Human Revolution Benchmark
  • DiRT 3
  • Duke Nukem Forever
  • Dungeon Siege III
  • Fable III
  • FEAR 3
  • Formula 1 2010
  • Grand Theft Auto IV
  • Hard Reset
  • Hunted the Demon’s Forge
  • Medal Of Honor 2010
  • Metro 2033
  • Red Faction Armageddon
  • Red Orchestra 2 Heroes of Stalingrad
  • StarCraft II
  • The Witcher 2
Source: http://benchmark3d.com

    What Is Crossfire

    Crossfire is a technology developed by graphics chipset manufacturer ATI that allows the use of two graphics cards in a single system in order to boost the PC's graphics output. Like Nvidia's SLI, Crossfire uses the PCI-Express 16x architecture to connect the graphics cards, but it has some advantages over Nvidia's technology and works in a slightly different way.

    The first Crossfire system made its appearance in 2005, but unlike Nvidia's SLI offering, which required technologies to be built into its graphics processing units in order to transfer data, the first Crossfire systems utilised a special master card with chips installed to link the two cards together. On top of this, using Crossfire required a Crossfire-compatible motherboard, as well as a Crossfire master card and a second, regular ATI graphics card.

    Initially, there were only two ATI master cards released: the X800 Crossfire Edition and the similarly named X850. To use Crossfire, the second card had to belong to the same family as the master, so you could use any variant of the X800 with the X800 Crossfire master card. The master card could also auto-configure itself in line with the technologies on the slave card, meaning Crossfire wasn't tied to running at a slower clock speed, as was found with Nvidia's SLI.

    Whereas Nvidia SLI cards had to be connected together inside the PC, early generations of Crossfire-enabled cards connected externally via a pass-through connector, which linked the slave card to the master card via a DVI output. The Crossfire master card had a decoder chip that could convert the signal received from the slave card so it could be combined with the data produced by the master card and rendered. Later generations of Crossfire adopted the Nvidia approach, and Crossfire graphics cards are now connected internally using a bridge cable.

    On Crossfire-capable motherboards, Crossfire is enabled from the BIOS, rather than by a flippable selector card as on early SLI motherboards.

    Crossfire provides several ways to divide 3D rendering work between the graphics cards. The first mode is 'alternate frame rendering', or AFR, which first appeared with ATI's Rage Fury MAXX graphics card (it used two Rage 128 graphics chips to render alternate frames). Crossfire AFR works in a similar fashion: each card takes on the job of rendering alternate frames, with one card rendering odd-numbered frames and the other rendering even-numbered frames.
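To make the AFR idea concrete, here is a minimal sketch in plain Python (not real driver code): frames are handed to the two cards in strict alternation.

```python
# Illustrative sketch: AFR assigns frames to GPUs round-robin, so with
# two cards one renders even-numbered frames and the other odd-numbered.

def afr_schedule(frame_numbers, gpus):
    """Map each frame number to the GPU that renders it."""
    return {f: gpus[f % len(gpus)] for f in frame_numbers}

schedule = afr_schedule(range(6), ["card A", "card B"])
# frames 0, 2, 4 -> card A; frames 1, 3, 5 -> card B
```

The function names here are invented for illustration; the point is simply that each card only touches every second frame, roughly halving its per-frame workload.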

    The second mode of Crossfire is known as 'scissoring', which is similar to the split frame rendering used by Nvidia's SLI mode. In 'scissoring' mode, the screen is divided into two areas so that each half requires a roughly equal amount of work to produce. Each card renders one part of the image, and the slave sends its work to the primary card for merging and display.

    The third mode of Crossfire is known as 'supertiling'. This mode divides the screen into what resembles a chessboard, with each graphics card processing alternate squares of the board. However, 'supertiling' mode only works in games that use the DirectX Direct3D API, and only pairs of cards that each have 16 pixel pipelines can properly utilise it.
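The chessboard pattern can be sketched in a few lines of Python (purely illustrative, with invented names): tile ownership simply alternates with the parity of the tile coordinates.

```python
# Illustrative sketch of 'supertiling': the screen is cut into a grid of
# tiles and ownership alternates like the colours of a chessboard.

def supertile_owner(tile_x, tile_y):
    """Return 0 or 1: which of the two cards renders the tile at
    (tile_x, tile_y). Neighbouring tiles always go to different cards."""
    return (tile_x + tile_y) % 2

# One row of four tiles alternates between the cards: 0, 1, 0, 1
row = [supertile_owner(x, 0) for x in range(4)]
```

Because the small tiles are interleaved everywhere on screen, the load naturally balances even when one region of the frame is much more complex than another.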

    A fourth mode of Crossfire is also available, known as 'super AA'. This involves both graphics cards performing anti-aliasing to produce a smoother image and increase image quality. Each of the two cards performs anti-aliasing on each frame using a different sample pattern, and the results are then combined to give a higher cumulative level of anti-aliasing than would be possible on a single card, without suffering a loss of frame rate.
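A minimal sketch of the combining step, assuming each card has already produced its own anti-aliased pixel values for the same frame:

```python
# Illustrative sketch of 'super AA': both cards anti-alias the same frame
# with different sample patterns, and the results are averaged per pixel,
# doubling the effective number of samples contributing to each pixel.

def super_aa_combine(pixels_card_a, pixels_card_b):
    """Average per-pixel intensity values produced by the two cards."""
    return [(a + b) / 2 for a, b in zip(pixels_card_a, pixels_card_b)]

# Two 4-pixel scanlines, sampled with different patterns, blended together
combined = super_aa_combine([0.0, 0.5, 1.0, 1.0], [0.5, 0.5, 1.0, 0.0])
```

Real hardware blends full multi-sample buffers rather than simple intensity lists, but the principle is the same: two differently sampled renderings of one frame are merged into a smoother result.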

    In the early days of Crossfire, it proved very difficult for anyone looking to build a Crossfire-enabled PC due to severe shortages of master cards. As a result, with the next generation of Crossfire, ATI released its Xpress 3200 motherboard chipset. This allowed Crossfire systems to utilise the full speed of the PCI Express x16 slot across two installed cards, allowing both to run with all 16 lanes functional. When used in conjunction with two X1300, X1600 or X1800 graphics cards, it let users run a Crossfire system without a dedicated Crossfire master card. Furthermore, because neither card in these setups has the input connector used on Crossfire master cards, the image data is shuttled to the display card via the PCI Express bus rather than a connector cable inside the PC. However, more powerful cards such as the X1900 range require a connector cable as well as a master card.

    ATI announced in late 2006 that the Crossfire system would be further expanded to allow the format to perform physics calculations, by linking cards in Crossfire mode with a third card that carries out the physics work. However, such a setup would require a motherboard with three PCI Express x16 slots and would likely carry a hefty price tag.

    NVIDIA Says AMD's DirectX 11 Cards Are Only a Short-Term Advantage

    During the past months, most news articles that revolved around graphics products usually made at least a passing note that ATI (a business unit of AMD) was the only developer offering DirectX 11-capable graphics processing units. Still, even though it hasn't managed to introduce its own Fermi products, NVIDIA reportedly doesn't see AMD's 60-day head start as something noteworthy in the grand scheme of things.
    "To us, being out of sync with the API for a couple of months isn't as important as what we're trying to do in the big scheme of things for the next four or five years. We're just around the corner from preparing our next GeForce and the experience of what you'll see in 3D, what you'll feel in physics, and the improvements you get in graphics will be obvious to the market," said Michael Hara, senior vice president of investor relations and communications of NVIDIA.

    According to NVIDIA, the short-term advantage that AMD has been enjoying so far will be overshadowed by the transition from purely graphics-oriented GPUs to units capable of parallel computing tasks. Granted, the advanced visual features of DirectX are not overlooked by the GPU maker, but the company is fully confident that Fermi will reclaim the market.

    "This 60-day lag between these events of when our competition has DX11 and when we're coming to market will absolutely seem insignificant in the big picture," Hara stated. "We're almost there. In Q1, the world will get to see what we've done with Fermi."

    Among the advanced capabilities offered by DirectX 11 (which was released along with Windows 7) are support for multi-core processors and DirectCompute, which enables developers to fully utilize the parallel processing capabilities of graphics processors for operations such as video editing. DirectX 11 also supports tessellation, allowing for the rendering of smoother curved surfaces.

    "We go through revolutionary changes every three or four years, and that's exactly where we're at today. The next big evolution in the API world has come with DirectX 11 (DX11), but we believe that's a part of the experience," Hara added.


    softpedia 

    What is DirectX 11......

    DirectX 11 has created a lot of buzz throughout the 3D gaming community, but what could the application programming interface (API) mean for the designer or CAD professional? AMD’s Rob Jamieson explains.
    All 3D programs rely on an application programming interface (API). Combined with a 3D graphics card, this is what makes real-time 3D graphics possible. Most professional 3D applications use OpenGL, whereas most games use Microsoft’s DirectX. However, with DirectX 11 and supporting hardware and software now on the market, details of what the API offers 3D CAD users are becoming clearer.
    DirectX currently provides the 3D graphics engine for a few CAD applications, including Autodesk Inventor and MicroStation, and is the secondary graphics engine in Solid Edge. Currently, Autodesk Inventor is mainly DX9, with some bits of DX10 used to free the 3D engine for other tasks. It’s likely that Autodesk will introduce some of the new features of DX11 in a future release of its 3D CAD software, but what does DX11 offer?

    DirectX 11 can be used to tessellate meshes on the fly. In engineering, the technology could potentially be used to refine meshes for FEA

    Multi core

    For some time now we have been using multi-core CPUs. However, the number of applications that can really exploit all of these cores is still limited; it includes rendering, plus certain software for Finite Element Analysis (FEA), Computational Fluid Dynamics (CFD) and CAM. Recently this has started to change, with multithreaded software appearing in other areas, including graphics.
    The DirectX 11 API also supports multithreading to handle things like display lists: information that is passed back and forth between the computer's system memory and the graphics card's GPU and memory. By using spare CPU cores to control the graphics card, the workstation can dedicate specific CPU cores to the CAD application. This not only speeds up the CAD application, but can also increase the speed of 3D model rotation, in some cases by as much as 20 to 50%.
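The division of labour described above can be sketched with plain Python threads (this is the general pattern, not the Direct3D API itself): one thread is dedicated to feeding the graphics card from a command queue while the main thread stays on application work.

```python
# Illustrative sketch: a dedicated thread drains a queue of draw
# commands, standing in for a CPU core that only talks to the GPU.
import queue
import threading

commands = queue.Queue()
submitted = []

def graphics_worker():
    """Dedicated 'graphics' thread: submit queued commands in order."""
    while True:
        cmd = commands.get()
        if cmd is None:        # sentinel value: no more work
            break
        submitted.append(cmd)  # stand-in for handing work to the GPU

worker = threading.Thread(target=graphics_worker)
worker.start()
for frame in range(3):
    commands.put(f"draw frame {frame}")  # main thread records commands
commands.put(None)
worker.join()
```

In D3D11 terms the worker corresponds to the thread that executes command lists on the immediate context, while other threads record work on deferred contexts; the sketch only shows the queue-and-dedicated-thread shape of that design.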

    Compute shaders

    The next thing that DX11 offers is “compute shaders”. These can perform post-processing operations including physics, Artificial Intelligence (AI) and particle systems. Most 3D animation software, including 3ds Max, supports some of these features in one form or another, using the CPU or GPU to handle them. DX11 has direct access to the geometry these operations use, which gives it a key advantage: it can make things happen quickly, without the long waits associated with existing methods.
    Physics is also of great interest to solid modellers, for testing whether components collide when moving. A lot of modellers do this already using the CPU, but when the assembly gets larger the processing time can be huge. Using compute shaders could solve this.

    Tessellation

    Tessellation is the process of transforming data and adapting it for another purpose, on the fly. It can take an existing mesh and enhance certain sections of a model to give more detail. It can also be used the other way round to reduce detail in a scene. For example, in a landscape the far off detail is not required, so it can be reduced to increase real time 3D performance or reduce render times. In engineering, the technology could potentially be used to refine meshes for FEA.
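The refinement step can be illustrated with the simplest form of subdivision, midpoint splitting (an illustrative sketch, not the DX11 hull/domain shader pipeline): each pass turns every triangle into four smaller ones, adding detail on the fly.

```python
# Illustrative sketch of tessellation by midpoint subdivision: one level
# splits a triangle into four, refining the mesh where detail is needed.

def midpoint(p, q):
    """Point halfway between two points of any dimension."""
    return tuple((a + b) / 2 for a, b in zip(p, q))

def subdivide(tri):
    """Split one triangle (three 2D points) into four smaller triangles."""
    a, b, c = tri
    ab, bc, ca = midpoint(a, b), midpoint(b, c), midpoint(c, a)
    return [(a, ab, ca), (ab, b, bc), (ca, bc, c), (ab, bc, ca)]

tris = subdivide(((0.0, 0.0), (1.0, 0.0), (0.0, 1.0)))
# one triangle becomes four; applying it again would give sixteen
```

Run in reverse (merging triangles instead of splitting), the same idea gives the detail reduction mentioned above for distant scenery.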

    Visual effects

    3ds Max uses .fx shaders today to show depth of field and motion blur that can be carried through to the final render. DX11 enhances the use of these inside a viewport, with options to interact with the effects as they are applied. Studio GPU's software, Mach Studio Pro, also has some of these capabilities in a rasteriser format. Roughly 85% of applications use a rasteriser approach, including most 3D creation applications. There is often a lot of confusion between rasterisation and ray tracing when people see real-time shader effects. All of these functions add realism to any 3D application, including CAD.

    OpenGL

    In order to make use of these new capabilities, a DirectX 11 compatible graphics card is required, but the raw hardware can also be utilised by OpenGL and OpenCL, which executes across heterogeneous platforms consisting of CPUs, GPUs and other processors (tinyurl.com/D3DopenCL).
    The recently announced OpenGL 4.0 includes a raft of new features bringing OpenGL in line with Microsoft’s DirectX specification. OpenGL 3.3 was also just released, providing as many of the new version 4 features as possible to older graphics hardware. (http://www.opengl.org has a list of all these new features).
    OpenGL is still extremely important for CAD as 70% of CAD applications use it. It also has the advantage of working on multiple platforms, including Windows, Linux and Mac, whereas DX11 only works on Windows Vista and Windows 7.

    The future

    Historically, when it comes to 3D, most CAD users have been primarily concerned with frame rates, visual quality and stability. However, with the advent of DirectX 11 there is the potential for benefits to go way beyond these standard metrics. And while OpenGL is likely to remain a strong foundation technology for CAD, the increased competition from DX 11 is certainly helping advance 3D technology in general, which can only be a good thing.

    www.directx.com
    develop3d 