
Thread: What Video Card Should I Buy?

  1. #2761
    Donor Pattern's Avatar
    Join Date
    April 9, 2011
    Posts
    6,277
    ITT people with shitty monitors wondering why they don't feel like upgrading video cards.

  2. #2762
    Cosmin's Avatar
    Join Date
    March 14, 2012
    Location
    UK
    Posts
    5,170
    Quote Originally Posted by Pattern View Post
    ITT people with shitty monitors wondering why they don't feel like upgrading video cards.
    Actually looking at a 1440p 27'', but it's around £700 and I don't feel like blowing that much right now. Hell, my TV is a 36'' 720p "HD Ready" Samsung. It can upscale to 1080i on my PS3, so not unhappy with it. Plays shit quite nicely
    Guns make the news, science doesn't.
    Six shooters ruined PvP.
    "What are you doing with your life?" - Doomchinchilla 2015

  3. #2763
    Movember 2011 Movember 2012 Nordstern's Avatar
    Join Date
    April 10, 2011
    Posts
    8,424
    If we don't count the Apple IIe, then my first graphics card was an S3 ViRGE 3D 2000 2MB. Utter shit, but it ran TIE Fighter and MechWarrior 2. Many fond memories playing those. Years later, I tried to play the original Half-Life and somehow made it to the Xen level with 7-16 fps, never exceeding 20.

    Next up was the integrated graphics in the Intel 845 chipset. Played some Neverwinter Nights and Freelancer with that, worked alright. NWN:HOTU was too much, though.

    Built my own system circa 2005/2006, got a Sapphire Radeon X300SE 64MB, fully knowing it was a cut-down card and I intended to upgrade later as my budget allowed. NWN played so much better and I started playing lots of Battlefield 2. Even got Splinter Cell: Chaos Theory to run, but had to turn down settings due to framerates and lack of Shader Model 3.0 and Hardware T&L. Vowed to upgrade once I heard of a game called EVE Online...

    ATI Radeon X1950 Pro 256MB was next, and stayed with me for a while. For some reason, I thought I needed a one-slot video card, and this fit the bill. Maybe I was going to do Crossfire? I dunno. Now I could run SC:CT at full settings. Battlefield 2 was much better. Started playing EVE, Prey, Far Cry, Mass Effect 1, Half Life 2, maybe some others. New graphics in EVE made the card choke and I knew it was only a matter of time before I couldn't play EVE anymore. The motherboard failed when the southbridge heatsink popped off from overheating, so I decided to upgrade the graphics at the same time.

    Next up was a XFX Radeon HD5750 1GB, part of an all-new system (except the case, since I liked it). This card worked great, and I still have it lying around. Monitor died, then the flat panel died, so I upgraded to a 1080p flat panel display. Did lots of Battlefield 3, even some Battlefield 4 (but it had some framerate issues, even on lower settings). Handled EVE, Skyrim, Star Trek Online and Sins of a Solar Empire: Rebellion like a champ. Deus Ex: Human Revolution and Mass Effect 2 ran at full settings. Decided to contribute my graphics horsepower to BOINC, was informed I needed a double-precision card. I had no idea what double-precision meant at the time, so I started doing some research.

    After a few years, I decided that instead of simply buying a more powerful graphics card, I wanted a more power-efficient card. I saw the RX400 series and thought it was promising, but all the new cards drew far more power than my HD5750. Passed on the RX470/480 cards (in hindsight, a mistake) and acquired an XFX RX460 2GB. It was a little more powerful than my last card but drew less power, had double-precision, and had a 0dB fan (which was important to me). No new games at this point other than Deus Ex: Mankind Divided. Everything was a little smoother and quieter, but no big jumps. I was able to play with all the Skyrim graphics mods I wanted, however.

    Currently running an XFX RX560 4GB since I don't want to pay outrageous sums for an RX480/580. I won't be upgrading the graphics on this system anymore, as it is an Ivy Bridge system limited to PCI Express 2.0 and has a 500W power supply. I don't know what my next system will be, but it will definitely have two video cards capable of handling VR.
    Last edited by Nordstern; August 29 2017 at 12:48:04 AM.
    "Holy shit, I ask you to stop being autistic and you debate what autistic is." - spasm
    Quote Originally Posted by Larkonis Trassler View Post
    WTF I hate white people now...

  4. #2764
    walrus's Avatar
    Join Date
    April 9, 2011
    Location
    Fancomicidolkostümierungsspielgruppenzusammenkunft
    Posts
    5,945
    Diamond Stealth 3D 2000 -> NVidia Riva TNT 2 -> Ati Radeon 9800 Pro -> Ati Radeon HD 2600 Pro -> AMD Radeon 7950
      Spoiler:
    Quote Originally Posted by RazoR View Post
    But islamism IS a product of class warfare. Rich white countries come into developing brown dictatorships, wreck the leadership, infrastructure and economy and then act all surprised that religious fanaticism is on the rise.
    Also:
    Quote Originally Posted by Tellenta View Post
    walrus isnt a bad poster.
    Quote Originally Posted by cullnean View Post
    also i like walrus.
    Quote Originally Posted by AmaNutin View Post
    Yer a hoot

  5. #2765
    Movember 2012 Zekk Pacus's Avatar
    Join Date
    April 11, 2011
    Posts
    6,827
    Quote Originally Posted by Pattern View Post
    ITT people with shitty monitors wondering why they don't feel like upgrading video cards.
    I'm happy with my boring old 23" 1080p IPS screen. Like I said I keep looking but nothing's attracting me at a price I want to pay.

    Maybe next year.
    Last edited by Zekk Pacus; August 29 2017 at 04:08:47 PM.
    'I'm pro life. I'm a non-smoker. I'm a pro-life non-smoker. WOO, Let the party begin!'

  6. #2766
    GeromeDoutrande's Avatar
    Join Date
    April 10, 2011
    Location
    Fakefrenchistan
    Posts
    1,769
    I usually try to keep a card for three years or so, but I upgraded to a 1070 to go with a higher resolution display earlier this year. I think three years is quite a practical time period these days, maybe even too short.

  7. #2767
    Super Moderator Global Moderator QuackBot's Avatar
    Join Date
    March 7, 2012
    Posts
    20,879
    Quote Originally Posted by Straight Hustlin View Post
    Yeah, lately I upgrade the GFX card a lot less often than I used to. I used to upgrade it every year or two on some of my old machines. Now I've been getting a good (but not top-notch) card with the initial build, upgrading to something pretty good 3 years or so down the line, then getting a whole new machine 3 or so years after that, since at that point just throwing more GFX card at it doesn't solve any actual bottlenecks.
    People are more or less.

  8. #2768
    Cosmin's Avatar
    Join Date
    March 14, 2012
    Location
    UK
    Posts
    5,170
    Wow, that was a trip down memory lane. My first GFX card was a Riva TNT embedded in the motherboard of my HP system, which was sporting a Pentium III. It even had its own VRAM (8MB!). Then I was lucky enough to get a GeForce 2 Pro (which I still have, still in my HP system lol). Then an upgrade beckoned, and alongside a 2.4GHz Pentium 4 (Northwood core) came a GeForce FX5200, which was awful from every point of view but did the job. Then I got an Athlon64 (Venice), which I paired with an ATi X800GTO which was p. awesome for the monies. Next up was an 8800GT (best buy ever <3) inside a system containing an Intel E2140 (overclocked 100% to 3.2GHz on a DFI Dark motherboard).

    The 8800GT made it into my later system built around a Xeon X3440 with a DFI LanParty Dark P55-T3eH6. This was funny, because as soon as I put 4x4GB sticks in it, it just died. So I was like wtf and put the same RAM in my gf's PC, which sported the same motherboard. Needless to say, that motherboard died as well. Apparently, although the board could support 16GB RAM, as soon as you'd actually go ahead and install it, it'd just die. Completely. Got both boards refunded and got an MSI Big Bang Fuzion, which ended up on eBay after the last upgrade to the 4930k. The video card after the 8800GT was an AMD 7970 from XFX, which basically made the system BSOD constantly because of bad drivers and whatnot, so I exchanged it for a GTX680. Come to think of it, I've kept each video card in my system for 2-3 years at least.
    Guns make the news, science doesn't.
    Six shooters ruined PvP.
    "What are you doing with your life?" - Doomchinchilla 2015

  9. #2769
    XenosisMk4's Avatar
    Join Date
    July 13, 2017
    Location
    More turbo-lightspeed neoliberal platitudes/virtue signaling/misplaced priorities on full display.
    Posts
    1,580
    Quote Originally Posted by Pattern View Post
    ITT people with shitty monitors wondering why they don't feel like upgrading video cards.
    Mate, I'm currently using an IPS LG Ultrawide

    If I want to upgrade, I either go 4K IPS Ultrawide or 144Hz IPS Ultrawide

    I don't have £900 to spend on a monitor currently :V

    This monitor has doomed me forever

  10. #2770
    Specially Pegged Donor Overspark's Avatar
    Join Date
    April 10, 2011
    Location
    NL fuck yeah
    Posts
    3,064
    ITT anyone who didn't start with a graphics card with a 3Dfx Voodoo chipset which required you to put a VGA daisy-chain cable between your 2D card and your 3D card is a newbie.

    I'll give walrus a pass for owning an (even older) Diamond Stealth 3D 2000, but that card was known as a 3D decelerator at the time. The Voodoo cards were the first real GPUs as we now know them, even though they didn't have any 2D capabilities in their first iteration.

    And to answer the original question, I try to keep a few years between GPU purchases and switch (chipset) brands all the time. I generally donate my "old" GPU to indi, as she plays less demanding games than I do, so that works out fine.

  11. #2771
    XenosisMk4's Avatar
    Join Date
    July 13, 2017
    Location
    More turbo-lightspeed neoliberal platitudes/virtue signaling/misplaced priorities on full display.
    Posts
    1,580
    I have an ATI All-in-Wonder in a drawer somewhere; it comes with a TV tuner

  12. #2772
    Donor
    Join Date
    April 9, 2011
    Posts
    1,312
    Quote Originally Posted by Overspark View Post
    ITT anyone who didn't start with a graphics card with a 3Dfx Voodoo chipset which required you to put a VGA daisy-chain cable between your 2D card and your 3D card is a newbie.
    Heh. Despite having been around long enough that the first actual graphics card I ever used was an original IBM 16K CGA board, I've never actually owned one of the 3D-only Voodoo cards. My first 3Dfx was a Voodoo Banshee (which I still think was a good and unfairly maligned card). I did once see a pair of Voodoos working in SLI, complete with the VGA-VGA dongle sticking out the back. I just laughed at it, same as I did at the original ATI Crossfire system and its DVI-DVI dongle.

  13. #2773
    Specially Pegged Donor Overspark's Avatar
    Join Date
    April 10, 2011
    Location
    NL fuck yeah
    Posts
    3,064
    Quote Originally Posted by Bombcrater View Post
    Quote Originally Posted by Overspark View Post
    ITT anyone who didn't start with a graphics card with a 3Dfx Voodoo chipset which required you to put a VGA daisy-chain cable between your 2D card and your 3D card is a newbie.
    Heh. Despite having been around long enough that the first actual graphics card I ever used was an original IBM 16K CGA board, I've never actually owned one of the 3D-only Voodoo cards. My first 3Dfx was a Voodoo Banshee (which I still think was a good and unfairly maligned card). I did once see a pair of Voodoos working in SLI, complete with the VGA-VGA dongle sticking out the back. I just laughed at it, same as I did at the original ATI Crossfire system and its DVI-DVI dongle.
    SLI didn't come until the Voodoo2, but you're right about it being ridiculous then & now.

  14. #2774
    Cosmin's Avatar
    Join Date
    March 14, 2012
    Location
    UK
    Posts
    5,170
    Quote Originally Posted by Overspark View Post
    ITT anyone who didn't start with a graphics card with a 3Dfx Voodoo chipset which required you to put a VGA daisy-chain cable between your 2D card and your 3D card is a newbie.

    I'll give walrus a pass for owning an (even older) Diamond Stealth 3D 2000, but that card was known as a 3D decelerator at the time. The Voodoo cards were the first real GPUs as we now know them, even though they didn't have any 2D capabilities in their first iteration.

    And to answer the original question, I try to keep a few years between GPU purchases and switch (chipset) brands all the time. I generally donate my "old" GPU to indi, as she plays less demanding games than I do, so that works out fine.
    I never had monies for Voodoo

    I do have a Voodoo 2 for historical purposes in my HP though.
    Guns make the news, science doesn't.
    Six shooters ruined PvP.
    "What are you doing with your life?" - Doomchinchilla 2015

  15. #2775
    Specially Pegged Donor Overspark's Avatar
    Join Date
    April 10, 2011
    Location
    NL fuck yeah
    Posts
    3,064
    Quote Originally Posted by Duckslayer View Post
    Quote Originally Posted by Zekk Pacus View Post
    Quote Originally Posted by Pattern View Post
    ITT people with shitty monitors wondering why they don't feel like upgrading video cards.
    I'm happy with my boring old 23" 1080p IPS screen. Like I said I keep looking but nothing's attracting me at a price I want to pay.

    Maybe next year.
    same
    There's been a lot of very interesting monitor developments in the past few years though:

    1) G-Sync / FreeSync. It may not sound like that big a deal, but the level of fluidity you get out of these monitors is a huge step up from normal monitors, and once you're used to it it's hard to go back. Both technologies are good these days. G-Sync is stupidly expensive but p much all monitors that have it are good. FreeSync is much, much cheaper (like $300 cheaper), but don't just buy the cheapest one, as there is more variance in what they actually bring to the table (mostly look out for a low minimum Hz and a high maximum Hz; you want the max to be more than 2.5 times the min, or just get a FreeSync 2 panel which eliminates this issue; there's a rough sketch of this range rule at the end of this post). One of the most interesting things these technologies can do for you is make low-fps situations or even low-fps spikes almost unnoticeable, allowing you to crank up the visuals without too much impact in heavy games. The other is that screen tearing is completely eliminated without the overhead of V-sync. Not all games support it properly, and it can be a bitch to get them to cooperate, especially when using newer APIs like Vulkan. Oh, and obviously G-Sync only works on Nvidia, and FreeSync works on "everything else" (it's royalty-free, but only AMD, who made it, uses it), so there's that.

    2) 21:9 gives you far more screen to look at. It mostly gives you more peripheral vision, which is great in all 1st/3rd perspective games. In RTS-like games it just gives you more map to look at. In EVE it gives you loads more room for all your little windows (lolEVE), etc. A lot of games support it, some better than others. Worst case is the game stays at 16:9 and you get black bars at the sides. Some games intentionally don't support it properly in a misguided attempt to be "fair", such as Overwatch. Once again, hard to go back to 16:9 once you get used to more.

    3) Higher refresh rates. 60Hz isn't that good. I have a 75Hz monitor overclocked to 80Hz and that is already a huge step up. With monitors going much further these days (100Hz, 144Hz, etc.) they're obviously hitting diminishing returns, but they are much more fluid than older 60Hz panels. A stupidly high value like 144Hz brings along some of the advantages that G-Sync / FreeSync offer wrt fluidity and fewer V-sync issues, but it's not completely the same.

    4) HDR. If properly supported it's awesome, bringing you incredibly high brightness with detailed shadows alongside it at the same time. But it's still a fairly new technology on PC so I'm still in a "wait and see" mode until everything standardizes on one technology (right now there are 5?). Will probably take a couple of years to mature.

    Obviously getting a monitor that can do all of those things properly would be nice, but they barely exist at this point, and if they do they're stupidly expensive. You'll have to decide for yourself if it's worth upgrading now (I did and am extremely happy I did) or wait a couple of years, assuming something new doesn't come along in that timeframe which you'll also want to have. I don't think waiting one year will be enough for everything to come together, so you'll still be making a compromise next year and might as well buy this year IMHO.
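    A rough sketch of the FreeSync range rule of thumb from point 1) above (the 2.5x factor is just that rule of thumb, not an official spec, and the function name and example ranges are made up for illustration):

    Code:
    # Rule-of-thumb check from point 1) above: the maximum refresh rate should be
    # more than 2.5x the minimum so frame doubling (low framerate compensation)
    # can cover dips below the minimum. Illustrative only, not an official spec.
    def freesync_range_ok(min_hz, max_hz, factor=2.5):
        """Return True if the variable refresh range is wide enough."""
        return max_hz > factor * min_hz

    print(freesync_range_ok(48, 144))  # True: 144 > 2.5 * 48 = 120
    print(freesync_range_ok(48, 75))   # False: 75 < 120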

  16. #2776
    vDJ's Avatar
    Join Date
    July 31, 2012
    Location
    out there
    Posts
    1,273
    And still just a handful of monitors with input lag similar to a CRT TV :'(

  17. #2777
    Super Moderator Global Moderator QuackBot's Avatar
    Join Date
    March 7, 2012
    Posts
    20,879
    Quote Originally Posted by vDJ View Post
    And still just a handful of monitors with input lag similar to a CRT TV :'(
    Like on that time i ran with you, i was a similar.

  18. #2778
    GeromeDoutrande's Avatar
    Join Date
    April 10, 2011
    Location
    Fakefrenchistan
    Posts
    1,769
    I think Valve in part simulate(d) HDR with the Source engine, so that color brightness appeared higher than it objectively was on the display, and in part bundled some other color-related features under the term even though they are not related to HDR.

    Some more info on "Valve HDR" here:
    https://arstechnica.com/features/2005/09/lostcoast/

  19. #2779
    Specially Pegged Donor Overspark's Avatar
    Join Date
    April 10, 2011
    Location
    NL fuck yeah
    Posts
    3,064
    Yeah, HDR is really an umbrella term covering a lot of different things. The common thread is attempting to show more relevant shades of colour/brightness than you usually get working in the traditional way. This can be accomplished at the source end (rendering in larger colour spaces before condensing it down for output) or at the display end (accepting larger colour spaces and having the brightness to display them properly). HDR monitors are naturally about the display end. HDR photography is something else again, vaguely comparable to doing HDR at the source end.
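    To make the source-end idea concrete, here's a toy sketch (my own illustration, not any particular engine's pipeline): lighting is computed in a wide linear range and then tone-mapped down to the 0-1 range an SDR display expects. Reinhard's operator is used purely as an example of the "condensing it down" step.

    Code:
    import numpy as np

    # Toy source-end HDR: radiance is computed in a wide linear range, then
    # tone-mapped into [0, 1) for an SDR display. Reinhard's global operator
    # is just one example of such a mapping.
    def reinhard_tonemap(hdr):
        """Map linear radiance values (>= 0, unbounded) into [0, 1)."""
        return hdr / (1.0 + hdr)

    # Values far above 1.0 ("brighter than white") stay displayable while
    # dark detail isn't simply crushed to black.
    scene = np.array([0.05, 0.5, 1.0, 4.0, 50.0])
    print(reinhard_tonemap(scene))  # approx [0.048 0.333 0.5 0.8 0.98]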

  20. #2780
    Movember 2011 Movember 2012 Nordstern's Avatar
    Join Date
    April 10, 2011
    Posts
    8,424
    Quote Originally Posted by Duckslayer View Post
    Didn't Half-Life 2: Lost Coast introduce HDR like a decade ago?
    Yes and no.
    High-dynamic-range imaging (HDRI) is the compositing and tone-mapping of images to extend the dynamic range beyond the native capability of the capturing device.[1][2]

    High-dynamic-range video (HDR video) is greater than standard dynamic range (SDR) video which uses a conventional gamma curve.[3]

    High-dynamic-range rendering (HDRR) is the real-time rendering and display of virtual environments using a dynamic range of 65,535:1 or higher (used in computer, gaming, and entertainment technology).[4]
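    For scale (my own arithmetic, not part of the quoted definitions): each stop doubles the range, so a 65,535:1 ratio works out to about \(\log_2(65535) \approx 16\) stops of dynamic range.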
    "Holy shit, I ask you to stop being autistic and you debate what autistic is." - spasm
    Quote Originally Posted by Larkonis Trassler View Post
    WTF I hate white people now...
