Do I need this?
I just ordered a 2080ti founders
You baller. Do you need anything special to run it or will it fit anything that is standard PCIe?
I’m debating the necessity of this new technology. A few developers have already announced that their games will be using ray tracing, including Battlefield V which launches in October. That is soon.
However, I haven’t yet seen a direct comparison of these new cards to the 10xx series. I’ve only seen the graphic that says TEN-THOUSAND PERCENT BETTER or whatever. Is it only better at ray tracing? When performing other processes is it only 5% better?
It would be pretty buck for Nvidia to design a brand-new interface and release a card before anyone mentioned hardware to support it. Don't Intel/AMD coordinate with board manufacturers to get compatible sockets into the market before a new CPU release? Can you imagine: here's your new card, now wait a few months while someone makes a thing for it to plug into.
I need over a thousand dollars more than a graphics card.
Ideally I’d like both, but my card is a 980, and it’s showing its age.
Also, 285W for the 2080.
The 2080ti seems to have similar clock speeds to the 1080ti, and has ~13% more cores… So I imagine that, ignoring new technologies like ray tracing, it will be about 13% faster.
No. Let me help you step up your audio game first.
Now that’s a markdown.
Get those 1080 prices down to where I can actually afford one.
Be sure to wait for reviews before you buy one of these new video cards. This initial report does not look promising.
It sure would be nice if they could be certain of what resolution they were playing at…
It’s going to be 25-35% faster than the 1080Ti. Which imo, is more than marginal.
I’m of course not stating that off any specific benchmarks, as there are none, but rather from looking at clock speed and core count differences, which have historically scaled fairly similarly across architectures. At least since the 700 series.
They played Battlefield V and Metro: Exodus with Nvidia RTX turned on, and saw performance run in excess of 100 fps at 4K and Ultra settings.
The 2080ti has ~20% more CUDA cores (4352 vs 3584), and is clocked slightly slower (1635MHz boost vs 1645MHz). Architectural differences should make up for the slower clock speed, so ~20% faster seems more accurate.
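The back-of-the-envelope estimate above can be sketched out, using the core counts and boost clocks quoted in this thread (a rough cores × clock calculation only; it deliberately ignores architectural IPC gains, memory bandwidth, and of course RTX):

```python
# Naive relative-throughput estimate: CUDA cores x boost clock.
# Numbers are the ones quoted in this thread, not official specs.
cards = {
    "1080 Ti": {"cores": 3584, "boost_mhz": 1645},
    "2080 Ti": {"cores": 4352, "boost_mhz": 1635},
}

def naive_throughput(card):
    # Cores times clock, a crude proxy for raw shader throughput.
    return card["cores"] * card["boost_mhz"]

ratio = naive_throughput(cards["2080 Ti"]) / naive_throughput(cards["1080 Ti"])
print(f"2080 Ti is {ratio:.1%} of 1080 Ti naive throughput")
# ratio works out to roughly 1.21, i.e. ~21% faster before any
# architectural improvements are counted
```

Which lines up with the ~20% figure: the extra cores more than cover the slightly lower clock.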
From what I have read this is the opposite of what people are currently reporting. I believe it has been 40-50fps at 1080p with RTX on, and 100+fps with it off, similar to what the current 1080Ti does (minus the RTX, of course). Have there been new benchmarks to back up the 100fps with RTX on?