
Thread: What Video Card Should I Buy?

  1. #2801

    Join Date
    April 18, 2011
    Posts
    2,296
    Let's see here, you actually made me go down memory lane and I discovered a few gaps. Let's start with the best worst card I've ever owned.

    1999: The S3 Savage 2000. A mess of broken hardware and buggy drivers, but it was the card for playing Unreal thanks to its S3TC texture compression. It was gorgeous.

    The quirky one.
    2001: PowerVR's Kyro II. A novel card with its tile-based rendering that was way, way faster than anything at its price point.

    The best one
    2003: ATI 9500 PRO. Great card, TV out worked without a hitch. Just a pleasure in general.

    The forgotten one
    2007: Nvidia 8800 GTS. The thing I recall about this card is that I had trouble playing World in Conflict with it. I could boot the computer and play, but if I wanted to launch the game again after a break I had to restart the computer or it wouldn't work. Didn't like how it handled its TV-out either; it was subpar to the 9500 in that regard.

    Another good one
    2009: ATI 5850. Great card, no complaints whatsoever. Multimonitor support was such a step up from the 8800. Shipped it to an EVE buddy who is still using it.

    My best Nvidia card
    2013: Nvidia GTX 760. Good card for the money, and I was happy with it until I installed Win 10 and it just wouldn't play nice no matter how many reinstalls.

    My current one.
    2016: AMD RX 470. Solid card; multimonitor is still better than Nvidia's solution for some reason.
    Last edited by Spartan Dax; August 28 2017 at 04:25:06 PM.

  2. #2802
    Movember 2012 Zekk Pacus's Avatar
    Join Date
    April 11, 2011
    Posts
    6,733
    Used to be more frequent - I used to skip a gen, then buy. My most recent upgrade was a GTX 560 to a GTX 970. I had planned to get the GTX 960, but it only had 2GB of VRAM. I remember having an 8800GT for AGES because I was more of a console gamer at the time.

    Most gamers I know are currently on around a 36-month schedule. Personally I'm waiting for Volta; the 970 is doing fine at 1080p, but I'd like to go 1440p/4K at some point in the nearish future.

    (for funsies, as much as I can remember: Matrox something or other -> Voodoo 2 -> SLI Voodoo 2 -> nVidia GeForce 2 Ti -> Radeon 9700 Pro -> nVidia GeForce 6600 -> nVidia GeForce 8800GT -> nVidia GeForce 560 -> nVidia GeForce 970)

    To be fair, that's one of the things you're paying for with a Titan - you're paying for the absolute bleeding edge in a consumer format, so it should hold performance longer.
    Last edited by Zekk Pacus; August 28 2017 at 05:39:12 PM.
    'I'm pro life. I'm a non-smoker. I'm a pro-life non-smoker. WOO, Let the party begin!'

  3. #2803
    Straight Hustlin's Avatar
    Join Date
    April 14, 2011
    Posts
    9,806
    Yeah, lately I upgrade the GFX card a lot less often than I used to. I used to upgrade it every year or two on some of my old machines. Now I get a good (but not top-notch) card with the initial build, upgrade to something pretty good 3 years or so down the line, then get a whole new machine 3 or so years after that, since at that point just throwing more GFX card at it doesn't solve the actual bottlenecks.

  4. #2804
    Approaching Walrus's Avatar
    Join Date
    March 8, 2013
    Location
    MAKE AMERICA WHOOP AGAIN
    Posts
    6,219
    Console stagnation reduces the need to upgrade frequently too, since tons of games are made to run cross-platform. Pretty sure my gaming laptop from 2012 is still more powerful than a PS4.

  5. #2805
    Donor Pattern's Avatar
    Join Date
    April 9, 2011
    Posts
    6,155
    ITT people with shitty monitors wondering why they don't feel like upgrading video cards.

  6. #2806
    Cosmin's Avatar
    Join Date
    March 14, 2012
    Location
    UK
    Posts
    5,056
    Quote Originally Posted by Pattern View Post
    ITT people with shitty monitors wondering why they don't feel like upgrading video cards.
    Actually looking at a 1440p 27'', but it's around £700 and I don't feel like blowing that much right now. Hell, my TV is a 36'' 720p "HD Ready" Samsung. It accepts 1080i from my PS3, so I'm not unhappy with it. Plays shit quite nicely.
    Guns make the news, science doesn't.
    Six shooters ruined PvP.
    What are you doing with your life? - Doomchinchilla 2015

  7. #2807
    Movember 2011Movember 2012 Nordstern's Avatar
    Join Date
    April 10, 2011
    Posts
    8,188
    If we don't count the Apple IIe, then my first graphics card was an S3 ViRGE 3D 2000 2MB. Utter shit, but it ran TIE Fighter and MechWarrior 2. Many fond memories playing those. Years later, I tried to play the original Half-Life and somehow made it to the Xen levels at 7-16 fps, never exceeding 20.

    Next up was the integrated graphics in the Intel 845 chipset. Played some Neverwinter Nights and Freelancer with that, worked alright. NWN:HOTU was too much, though.

    Built my own system circa 2005/2006, got a Sapphire Radeon X300SE 64MB, fully aware it was a cut-down card; I intended to upgrade later as my budget allowed. NWN played so much better and I started playing lots of Battlefield 2. Even got Splinter Cell: Chaos Theory to run, but had to turn down settings due to framerates and the lack of Shader Model 3.0 support. Vowed to upgrade once I heard of a game called EVE Online...

    ATI Radeon X1950 Pro 256MB was next, and stayed with me for a while. For some reason, I thought I needed a one-slot video card, and this fit the bill. Maybe I was going to do Crossfire? I dunno. Now I could run SC:CT at full settings. Battlefield 2 was much better. Started playing EVE, Prey, Far Cry, Mass Effect 1, Half-Life 2, maybe some others. The new graphics in EVE made the card choke, and I knew it was only a matter of time before I couldn't play EVE anymore. The motherboard failed after its southbridge heatsink popped off and the chip overheated, so I decided to upgrade the graphics at the same time.

    Next up was an XFX Radeon HD5750 1GB, part of an all-new system (except the case, since I liked it). This card worked great, and I still have it lying around. Monitor died, then the flat panel died, so I upgraded to a 1080p flat panel display. Did lots of Battlefield 3, even some Battlefield 4 (though it had some framerate issues, even on lower settings). Handled EVE, Skyrim, Star Trek Online and Sins of a Solar Empire: Rebellion like a champ. Deus Ex: Human Revolution and Mass Effect 2 ran at full settings. Decided to contribute my graphics horsepower to BOINC and was informed I needed a double-precision card. I had no idea what double-precision meant at the time, so I started doing some research.

    After a few years, I decided that instead of simply buying a more powerful graphics card, I wanted a more power-efficient card. I saw the RX400 series and thought it was promising, but all the new cards drew far more power than my HD5750. Passed on the RX470/480 cards (in hindsight, a mistake) and acquired an XFX RX460 2GB. It was a little more powerful than my last card but drew less power, had double-precision, and had a 0dB fan (which was important to me). No new games at this point other than Deus Ex: Mankind Divided. Everything was a little smoother and quieter, but no big jumps. I was able to play with all the Skyrim graphics mods I wanted, however.

    Currently running an XFX RX560 4GB, since I don't want to pay outrageous sums for an RX480/580. I won't be upgrading the graphics on this system anymore, as it's an Ivy Bridge system limited to PCI Express 2.0 with a 500W power supply. I don't know what my next system will be, but it will definitely have two video cards capable of handling VR.
    Last edited by Nordstern; August 28 2017 at 11:48:04 PM.
    "Holy shit, I ask you to stop being autistic and you debate what autistic is." - spasm
    Quote Originally Posted by Larkonis Trassler View Post
    WTF I hate white people now...

  8. #2808
    walrus's Avatar
    Join Date
    April 9, 2011
    Location
    Fancomicidolkostümierungsspielgruppenzusammenkunft
    Posts
    5,892
    Diamond Stealth 3D 2000 -> NVidia Riva TNT 2 -> Ati Radeon 9800 Pro -> Ati Radeon HD 2600 Pro -> AMD Radeon 7950
      Spoiler:
    Quote Originally Posted by RazoR View Post
    But islamism IS a product of class warfare. Rich white countries come into developing brown dictatorships, wreck the leadership, infrastructure and economy and then act all surprised that religious fanaticism is on the rise.
    Also:
    Quote Originally Posted by Tellenta View Post
    walrus isnt a bad poster.
    Quote Originally Posted by cullnean View Post
    also i like walrus.
    Quote Originally Posted by AmaNutin View Post
    Yer a hoot

  9. #2809
    Movember 2012 Zekk Pacus's Avatar
    Join Date
    April 11, 2011
    Posts
    6,733
    Quote Originally Posted by Pattern View Post
    ITT people with shitty monitors wondering why they don't feel like upgrading video cards.
    I'm happy with my boring old 23" 1080p IPS screen. Like I said, I keep looking, but nothing's attracting me at a price I want to pay.

    Maybe next year.
    Last edited by Zekk Pacus; August 29 2017 at 03:08:47 PM.
    'I'm pro life. I'm a non-smoker. I'm a pro-life non-smoker. WOO, Let the party begin!'

  10. #2810
    GeromeDoutrande's Avatar
    Join Date
    April 10, 2011
    Location
    Fakefrenchistan
    Posts
    1,695
    I usually try to keep a card for three years or so, but I upgraded to a 1070 to go with a higher resolution display earlier this year. I think three years is quite a practical time period these days, maybe even too short.

  11. #2811
    Super Moderator Global Moderator QuackBot's Avatar
    Join Date
    March 7, 2012
    Posts
    20,806
    Quote Originally Posted by Straight Hustlin View Post
    Yeah, lately I upgrade the GFX card a lot less often than I used to. I used to upgrade it every year or two on some of my old machines. Now I get a good (but not top-notch) card with the initial build, upgrade to something pretty good 3 years or so down the line, then get a whole new machine 3 or so years after that, since at that point just throwing more GFX card at it doesn't solve the actual bottlenecks.
    People are more or less.

  12. #2812
    Cosmin's Avatar
    Join Date
    March 14, 2012
    Location
    UK
    Posts
    5,056
    Wow, that was a trip down memory lane. My first GFX card was a Riva TNT embedded in the motherboard of my HP system, which was sporting a Pentium III. It even had its own VRAM (8MB!). Then I was lucky enough to get a GeForce 2 Pro (which I still have, still in my HP system lol). Then an upgrade beckoned, and alongside a 2.4GHz Pentium 4 (Northwood core) came a GeForce FX5200, which was awful from every POV but did the job. Then I got an Athlon64 (Venice), to which I paired an ATi X800GTO, which was p. awesome for the monies. Next up was an 8800GT (best buy ever <3) inside a system containing an Intel E2140 (overclocked 100% to 3.2GHz on a DFI Dark motherboard).

    The 8800GT made it into my later system, built around a Xeon X3440 with a DFI LanParty Dark P55-T3eH6. This was funny, because as soon as I put 4x4GB sticks in it, it just died. So I was like wtf and put the same RAM in my gf's PC, which sported the same motherboard. Needless to say, that motherboard died as well. Apparently, although the board could supposedly support 16GB RAM, as soon as you'd actually go ahead and do it, it'd just die. Completely. Got both boards refunded and got an MSI Big Bang Fuzion, which ended up on eBay after the last upgrade to the 4930k. The video card after the 8800GT was an AMD 7970 from XFX, which basically made the system BSOD constantly because of bad drivers and whatnot, so I exchanged it for a GTX680. Come to think of it, I've kept each video card in my system for more or less 2-3 years at least.
    Guns make the news, science doesn't.
    Six shooters ruined PvP.
    What are you doing with your life? - Doomchinchilla 2015

  13. #2813
    XenosisMk4's Avatar
    Join Date
    July 13, 2017
    Location
    More turbo-lightspeed neoliberal platitudes/virtue signaling/misplaced priorities on full display.
    Posts
    869
    Quote Originally Posted by Pattern View Post
    ITT people with shitty monitors wondering why they don't feel like upgrading video cards.
    Mate, I'm currently using an IPS LG Ultrawide

    If I want to upgrade, I either go 4K IPS Ultrawide or 144hz IPS Ultrawide

    I don't have £900 to spend on a monitor currently :V

    This monitor has doomed me forever

  14. #2814
    Specially Pegged Donor Overspark's Avatar
    Join Date
    April 10, 2011
    Location
    NL fuck yeah
    Posts
    2,982
    ITT anyone who didn't start with a graphics card with a 3Dfx Voodoo chipset which required you to put a VGA daisy-chain cable between your 2D card and your 3D card is a newbie.

    I'll give walrus a pass for owning an (even older) Diamond Stealth 3D 2000, but that card was known as a 3D decelerator at the time. The Voodoo cards were the first real GPUs as we now know them, even though they didn't have any 2D capabilities in their first iteration.

    And to answer the original question: I try to keep a few years between GPU purchases and switch (chipset) brands all the time. I generally donate my "old" GPU to indi, as she plays less demanding games than I do, so that works out fine.

  15. #2815
    XenosisMk4's Avatar
    Join Date
    July 13, 2017
    Location
    More turbo-lightspeed neoliberal platitudes/virtue signaling/misplaced priorities on full display.
    Posts
    869
    I have an ATI All in Wonder in a drawer somewhere, comes with a TV tuner

  16. #2816
    Donor
    Join Date
    April 9, 2011
    Posts
    1,298
    Quote Originally Posted by Overspark View Post
    ITT anyone who didn't start with a graphics card with a 3Dfx Voodoo chipset which required you to put a VGA daisy-chain cable between your 2D card and your 3D card is a newbie.
    Heh. Despite having been around long enough that the first actual graphics card I ever used was an original IBM 16K CGA board, I've never actually owned one of the 3D-only Voodoo cards. My first 3Dfx was a Voodoo Banshee (which I still think was a good and unfairly maligned card). I did once see a pair of Voodoos working in SLI, complete with the VGA-VGA dongle sticking out the back. I just laughed at it, same as I did at the original ATI Crossfire system and its DVI-DVI dongle.

  17. #2817
    Specially Pegged Donor Overspark's Avatar
    Join Date
    April 10, 2011
    Location
    NL fuck yeah
    Posts
    2,982
    Quote Originally Posted by Bombcrater View Post
    Quote Originally Posted by Overspark View Post
    ITT anyone who didn't start with a graphics card with a 3Dfx Voodoo chipset which required you to put a VGA daisy-chain cable between your 2D card and your 3D card is a newbie.
    Heh. Despite having been around long enough that the first actual graphics card I ever used was an original IBM 16K CGA board, I've never actually owned one of the 3D-only Voodoo cards. My first 3Dfx was a Voodoo Banshee (which I still think was a good and unfairly maligned card). I did once see a pair of Voodoos working in SLI, complete with the VGA-VGA dongle sticking out the back. I just laughed at it, same as I did at the original ATI Crossfire system and its DVI-DVI dongle.
    SLI didn't come until the Voodoo2, but you're right about it being ridiculous then & now.

  18. #2818
    Cosmin's Avatar
    Join Date
    March 14, 2012
    Location
    UK
    Posts
    5,056
    Quote Originally Posted by Overspark View Post
    ITT anyone who didn't start with a graphics card with a 3Dfx Voodoo chipset which required you to put a VGA daisy-chain cable between your 2D card and your 3D card is a newbie.

    I'll give walrus a pass for owning an (even older) Diamond Stealth 3D 2000, but that card was known as a 3D decelerator at the time. The Voodoo cards were the first real GPUs as we now know them, even though they didn't have any 2D capabilities in their first iteration.

    And to answer the original question: I try to keep a few years between GPU purchases and switch (chipset) brands all the time. I generally donate my "old" GPU to indi, as she plays less demanding games than I do, so that works out fine.
    I never had monies for Voodoo

    I do have a Voodoo 2 for historical purposes in my HP though.
    Guns make the news, science doesn't.
    Six shooters ruined PvP.
    What are you doing with your life? - Doomchinchilla 2015

  19. #2819
    Duckslayer's Avatar
    Join Date
    April 10, 2011
    Location
    Here
    Posts
    11,753
    Quote Originally Posted by Zekk Pacus View Post
    Quote Originally Posted by Pattern View Post
    ITT people with shitty monitors wondering why they don't feel like upgrading video cards.
    I'm happy with my boring old 23" 1080p IPS screen. Like I said, I keep looking, but nothing's attracting me at a price I want to pay.

    Maybe next year.
    same

  20. #2820
    Specially Pegged Donor Overspark's Avatar
    Join Date
    April 10, 2011
    Location
    NL fuck yeah
    Posts
    2,982
    Quote Originally Posted by Duckslayer View Post
    Quote Originally Posted by Zekk Pacus View Post
    Quote Originally Posted by Pattern View Post
    ITT people with shitty monitors wondering why they don't feel like upgrading video cards.
    I'm happy with my boring old 23" 1080p IPS screen. Like I said, I keep looking, but nothing's attracting me at a price I want to pay.

    Maybe next year.
    same
    There have been a lot of very interesting monitor developments in the past few years though:

    1) G-Sync / FreeSync. It may not sound like that big a deal, but the level of fluidity you get out of these monitors is a huge step up from normal monitors, and once you're used to it it's hard to go back. Both technologies are good these days. G-Sync is stupidly expensive, but pretty much all monitors that have it are good. FreeSync is much, much cheaper (like $300 cheaper), but don't just buy the cheapest one, as there's more variance in what they actually bring to the table. Mostly look for a low minimum Hz and a high maximum Hz: you want the max to be more than 2.5 times the min, which is the threshold that lets the monitor frame-double (Low Framerate Compensation) when your fps drops below the range; or just get a FreeSync 2 panel, which eliminates this issue (see the quick sanity check after this list). One of the most interesting things these technologies can do for you is make low-fps situations, or even low-fps spikes, almost unnoticeable, allowing you to crank up the visuals without too much impact in heavy games. The other is that screen tearing is completely eliminated without the overhead of V-sync. Not all games support it properly, and it can be a bitch to get them to cooperate, especially when using funky technologies like Vulkan. Oh, and obviously G-Sync only works on Nvidia, and FreeSync works on "everything else" (it's royalty-free, but only AMD, who made it, uses it), so there's that.

    2) 21:9 gives you far more screen to look at. It mostly gives you more peripheral vision, which is great in all first/third-person games. In RTS-like games it just gives you more map to look at. In EVE it gives you loads more room for all your little windows (lolEVE), etc. A lot of games support it, some better than others. Worst case, the game stays at 16:9 and you get black bars at the sides. Some games intentionally don't support it properly in a misguided attempt to be "fair", such as Overwatch. Once again, it's hard to go back to 16:9 once you get used to more.

    3) Higher refresh rates. 60Hz isn't that good. I have a 75Hz monitor overclocked to 80Hz, and that is already a huge step up. With monitors going much further these days (100Hz, 144Hz, etc.) they're obviously hitting diminishing returns, but they are much more fluid than older 60Hz panels. A stupidly high refresh rate like 144Hz brings along some of the advantages that G-Sync / FreeSync would bring with regard to fluidity and fewer V-sync issues, but it's not completely the same.

    4) HDR. If properly supported, it's awesome, bringing you incredibly high brightness with detailed shadows alongside it at the same time. But it's still a fairly new technology on PC, so I'm in "wait and see" mode until everything standardizes on one format (right now there are five?). It will probably take a couple of years to mature.
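
    For the FreeSync range rule in point 1, here's a quick sanity check: a minimal back-of-the-envelope sketch in Python (my own illustration with made-up example ranges, not anything from AMD's drivers or a spec sheet). It tests the 2.5x rule of thumb for LFC and shows the per-frame time budget you gain at higher refresh rates.

      Code:
        # Minimal sketch only; the 2.5x threshold is AMD's documented requirement
        # for Low Framerate Compensation (LFC). The example ranges are made up.

        def lfc_capable(min_hz: float, max_hz: float) -> bool:
            """True if the variable refresh range is wide enough for frame doubling (LFC)."""
            return max_hz >= 2.5 * min_hz

        def frame_time_ms(hz: float) -> float:
            """Per-refresh time budget in milliseconds."""
            return 1000.0 / hz

        print(lfc_capable(48, 75))   # False: 75 < 2.5 * 48 = 120, fps below 48 will judder
        print(lfc_capable(40, 144))  # True: 144 >= 2.5 * 40 = 100, LFC can frame-double
        print(frame_time_ms(60))     # ~16.7 ms per frame at 60Hz
        print(frame_time_ms(144))    # ~6.9 ms per frame at 144Hz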

    Obviously getting a monitor that can do all of those things properly would be nice, but such monitors barely exist at this point, and the ones that do are stupidly expensive. You'll have to decide for yourself whether it's worth upgrading now (I did, and I'm extremely happy I did) or waiting a couple of years, assuming something new doesn't come along in that timeframe which you'll also want to have. I don't think waiting one year will be enough for everything to come together, so you'd still be making a compromise next year and might as well buy this year IMHO.
