  1. #1
    Jr. Member
    Join Date
    Dec 2006
    Age
    61
    Posts
    64

    Graphics Card Output


    Does anyone have any idea how to measure graphics card output?
    The card is a Radeon Dual-X R9 270X.
    Motherboard is a Gigabyte 990FXA-UD5.
    CPU is an AMD FX-8350.
    RAM is 16 GB of 1600 MHz ADATA.
    It's connecting to a Sony KDF-50E2000 TV.
    Sony says 340 MHz through the HDMI cable will be enough for a 1080p picture and will carry the audio as well.
    But how do I determine whether the card's output will be sufficient?
    I haven't been able to find anything at all about how to measure graphics card output.
    Any suggestions?
    Please help!
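    As a rough sanity check of that 340 MHz figure, here is a minimal Python sketch, assuming the standard CEA-861 timing for 1080p60; 340 MHz is the per-channel TMDS clock limit of a High Speed HDMI link:

        # Does 1080p60 (with audio) fit under a 340 MHz TMDS clock?
        # Standard CEA-861 timing for 1920x1080 @ 60 Hz, blanking included.
        H_TOTAL = 2200        # total pixels per line, including horizontal blanking
        V_TOTAL = 1125        # total lines per frame, including vertical blanking
        REFRESH_HZ = 60

        pixel_clock_mhz = H_TOTAL * V_TOTAL * REFRESH_HZ / 1e6   # ~148.5 MHz
        tmds_limit_mhz = 340.0                                    # High Speed HDMI clock limit

        print(f"1080p60 pixel clock: {pixel_clock_mhz:.1f} MHz")
        print(f"Headroom under 340 MHz: {tmds_limit_mhz - pixel_clock_mhz:.1f} MHz")
        # Audio is carried in the blanking intervals, so it adds no extra clock requirement.

    In other words, 1080p60 needs roughly 148.5 MHz, so a link rated for 340 MHz has plenty of headroom.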
    If you try to please everybody, nobody'll like it.

  2. #2
    CB Mod. & Porn Expert
    Join Date
    Aug 2014
    Posts
    8,056

    There are various benchmarks that can be used to gauge the performance of a graphics card, though I don't think you need to worry about that. The R9 270X will have no problem outputting 1920x1080; one of the selling points of the R9 range is its 4K capability.

  3. #3
    Administrator Irrumator
    Join Date
    May 2007
    Posts
    21,246

    I wouldn't worry about the graphics card. Be careful which HDMI cable you use, though: there are different standards, and some have very different speeds and/or will not carry audio.
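    For reference, here is a minimal Python sketch of how the two common HDMI cable categories line up against a 1080p60 signal; the 74.25 MHz and 340 MHz figures are the certification limits for Standard (Category 1) and High Speed (Category 2) cables:

        # Which HDMI cable categories are certified for a given pixel clock?
        # Category 1 ("Standard") is certified to a 74.25 MHz TMDS clock (720p/1080i);
        # Category 2 ("High Speed") is certified to 340 MHz (1080p60 and beyond).
        CABLE_CATEGORIES_MHZ = {
            "Standard (Category 1)": 74.25,
            "High Speed (Category 2)": 340.0,
        }

        def cables_that_fit(pixel_clock_mhz):
            """Return the cable categories certified for the given pixel clock."""
            return [name for name, limit in CABLE_CATEGORIES_MHZ.items()
                    if pixel_clock_mhz <= limit]

        print(cables_that_fit(148.5))   # 1080p60 -> only High Speed (Category 2)
        print(cables_that_fit(74.25))   # 720p60 / 1080i60 -> either category

    So for 1080p60 a genuine High Speed cable is enough.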

  4. #4
    Moderator Dracula975
    Join Date
    Oct 2006
    Posts
    19,587

    Quote Originally Posted by Irrumator
    I wouldn't worry about the graphics card. Be careful which HDMI cable you use, though: there are different standards, and some have very different speeds and/or will not carry audio.
    The length of the cable could also be a factor if it's over 50 feet. DO NOT get pressured into buying the most expensive cable; just get the right standard at the best price.


    To get better answers, ask better questions.


  5. #5
    Member
    Join Date
    Apr 2007
    Age
    53
    Posts
    191

    Two benchmarks I have used...

    http://unigine.com/en/products/benchmarks

    http://www.3dmark.com/

    3DMark is probably the better of the two, but it's a large file to download. You run the program, it tests and rates your video card, and it shows you results from other users with similar hardware, so you can see where you stand, from office machines to high-end gaming rigs.

    They are free, but they will try to get you to buy stuff.

  6. #6
    Newbie
    Join Date
    Sep 2015
    Posts
    17

    OK, a graphics card has what's called the base clock, which is the frequency it runs at in normal use.
    Then you have the boost clock, which is the higher frequency it can ramp up to under load when power and temperature allow.
    Then there is the VRAM, which is the card's own memory; the more of it you have, the better it handles higher resolutions.
    Then there is the chip itself; the better the chip, the better the card, e.g. the GTX Titan uses the GK110.
    Then there are the shader cores (CUDA cores on NVIDIA, stream processors on AMD); within the same architecture, more cores generally means a faster card.
    Then there is memory speed, which is self-explanatory.
    Then there is memory bandwidth, which is how much data the card can move to and from its memory per second (see the worked example below).
    Then there are the power connectors, the PCIe power plugs that come from the power supply.
    Then you have the TDP, which is how much power it draws and heat it puts out.
    Then there are the outputs, which are the connectors it supports, e.g. DVI, HDMI, DisplayPort.
    That's it!
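    As a worked example of the memory-bandwidth point above, here is a minimal Python sketch; the 5.6 Gbit/s effective memory speed and 256-bit bus are the commonly quoted R9 270X figures, so check them against AMD's spec sheet for your exact card:

        # Peak memory bandwidth = effective memory data rate * bus width, in bytes/s.
        effective_speed_gbps = 5.6   # GDDR5 effective data rate per pin (Gbit/s) - assumed R9 270X figure
        bus_width_bits = 256         # memory interface width - assumed R9 270X figure

        bandwidth_gb_s = effective_speed_gbps * bus_width_bits / 8
        print(f"Peak memory bandwidth: {bandwidth_gb_s:.1f} GB/s")   # ~179.2 GB/s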
    Last edited by kaufen; 6th June 2016 at 22:49.
