Stirlingsays 14 Sep 18 11.22pm
Now that Nvidia have announced a release date of 20th September for the 2080 and 2080ti Turing cards, I'm wondering if any of our Palace HOL brethren are interested in upgrading their graphics cards. Like many, I'm licking my lips in expectation of some price drops on 1080ti cards and will be dipping my toes in early next month. Ray tracing, while an exciting technology, seems like it needs another card generation or two before it can really impress. Any thoughts?
'Who are you and how did you get in here? I'm a locksmith. And, I'm a locksmith.' (Leslie Nielsen)
Jimenez SELHURSTPARKCHESTER,DA BRONX 14 Sep 18 11.44pm
Originally posted by Stirlingsays
Now that Nvidia have announced a release date of 20th September for the 2080 and 2080ti Turing cards, I'm wondering if any of our Palace HOL brethren are interested in upgrading their graphics cards. Like many, I'm licking my lips in expectation of some price drops on 1080ti cards and will be dipping my toes in early next month. Ray tracing, while an exciting technology, seems like it needs another card generation or two before it can really impress. Any thoughts?
Pro USA & Israel
Stirlingsays 15 Sep 18 12.16am
Originally posted by Jimenez
JusticeToad Beckenham 15 Sep 18 6.33am
The 2080 is a grand! [Link] PC gaming has always been a premium market, but this is getting silly, especially given that consoles can offer pretty good graphics now.
chateauferret 15 Sep 18 9.38pm
I have been learning how to use the graphics card as a pretty good parallel computing engine, and for certain general computing tasks (as well as just graphical display) it can blow CPU processing out of the water. This generally means work involving the same smallish bit of computation repeated for many millions of data items.

I took a set of C++ procedures involving the generation and transformation of 3D noise across a spherical space (planet generation), which ran in a couple of minutes on the CPU, ported it to GLSL and ran it as a compute shader on a reasonably decent but not bleeding-edge Nvidia graphics card, and it did it in a few tenths of a second.

You can get software development kits like CUDA which present the GPU's parallel architecture as a pure general computing platform and run high-volume computations massively in parallel, moving the data from the CPU to the GPU beforehand and pulling the results back on completion. Other applications would include raster image filtering (Gaussian blur, edge detection, emboss etc.), Fourier transforms, process simulation such as erosion modelling, and even neural networks (I think).

Why anyone wants to waste computing power like this on s*** like Fortnite I don't comprehend.
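For anyone curious, here is roughly what that CPU-to-GPU round trip looks like as a minimal CUDA sketch. It assumes a CUDA-capable card and the standard toolkit; the cheap hash "noise" function and the kernel name are illustrative stand-ins, not the actual planet-generation code described above.

```cuda
// Minimal sketch: run one small computation across a million points on the GPU,
// copying inputs from the CPU beforehand and pulling results back on completion.
#include <cstdio>
#include <vector>
#include <cuda_runtime.h>

__device__ float hash_noise(float x, float y, float z)
{
    // Cheap pseudo-random value in [0,1) from a 3D position; illustrative only.
    float h = sinf(x * 12.9898f + y * 78.233f + z * 37.719f) * 43758.5453f;
    return h - floorf(h);
}

__global__ void noise_kernel(const float3* points, float* out, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        out[i] = hash_noise(points[i].x, points[i].y, points[i].z);
}

int main()
{
    const int n = 1 << 20;                        // ~1 million sample points
    std::vector<float3> h_points(n, make_float3(0.f, 1.f, 0.f));
    std::vector<float>  h_out(n);

    float3* d_points = nullptr;
    float*  d_out    = nullptr;
    cudaMalloc(&d_points, n * sizeof(float3));
    cudaMalloc(&d_out,    n * sizeof(float));

    // Move the data from the CPU to the GPU beforehand...
    cudaMemcpy(d_points, h_points.data(), n * sizeof(float3), cudaMemcpyHostToDevice);

    // ...run the same small computation once per data item, in parallel...
    int block = 256;
    int grid  = (n + block - 1) / block;
    noise_kernel<<<grid, block>>>(d_points, d_out, n);

    // ...and pull the results back on completion.
    cudaMemcpy(h_out.data(), d_out, n * sizeof(float), cudaMemcpyDeviceToHost);

    printf("first value: %f\n", h_out[0]);
    cudaFree(d_points);
    cudaFree(d_out);
    return 0;
}
```

The point is the shape of the workflow rather than the maths: allocate on the device, copy in, launch one thread per data item, copy out.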
davenotamonkey 15 Sep 18 10.36pm
Originally posted by Stirlingsays
Now that Nvidia have announced a release date of 20th September for the 2080 and 2080ti Turing cards, I'm wondering if any of our Palace HOL brethren are interested in upgrading their graphics cards. Like many, I'm licking my lips in expectation of some price drops on 1080ti cards and will be dipping my toes in early next month. Ray tracing, while an exciting technology, seems like it needs another card generation or two before it can really impress. Any thoughts?

I have a 1080ti, and not just any one - one of the high-spec cards. The only interest I'd have is driving 4K VR, and that would first mean stumping up for an upgrade to the Vive, so not soon. For the price point, early benchmarks (NVIDIA's own, so they'll likely have selected favourable tests) show the 2080 is about 13% faster than the 1080ti. I'll wait!
davenotamonkey 15 Sep 18 10.41pm
Originally posted by chateauferret
I have been learning how to use the graphics card as a pretty good parallel computing engine, and for certain general computing tasks (as well as just graphical display) it can blow CPU processing out of the water. This generally means work involving the same smallish bit of computation repeated for many millions of data items. I took a set of C++ procedures involving the generation and transformation of 3D noise across a spherical space (planet generation), which ran in a couple of minutes on the CPU, ported it to GLSL and ran it as a compute shader on a reasonably decent but not bleeding-edge Nvidia graphics card, and it did it in a few tenths of a second. You can get software development kits like CUDA which present the GPU's parallel architecture as a pure general computing platform and run high-volume computations massively in parallel, moving the data from the CPU to the GPU beforehand and pulling the results back on completion. Other applications would include raster image filtering (Gaussian blur, edge detection, emboss etc.), Fourier transforms, process simulation such as erosion modelling, and even neural networks (I think). Why anyone wants to waste computing power like this on s*** like Fortnite I don't comprehend.

I never really got into CUDA, to be honest. I'd have to hack away at my legacy code to the point of diminishing returns. I'm aware it's really good at array functions though (rough sketch below). Is this procedural planet topology you're generating, incidentally?

PS: those games are driving the FLOPS, don't bash them ;-)
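As a rough idea of the "array functions" side of things, here is a small sketch using Thrust, which ships with the CUDA toolkit; the squaring operation and the names are purely illustrative, not taken from anyone's actual code.

```cuda
// Minimal sketch: apply one function to every element of a large array on the GPU.
#include <cstdio>
#include <thrust/device_vector.h>
#include <thrust/transform.h>
#include <thrust/sequence.h>

struct square
{
    __host__ __device__ float operator()(float x) const { return x * x; }
};

int main()
{
    const int n = 1 << 20;
    thrust::device_vector<float> in(n), out(n);   // both live in GPU memory

    thrust::sequence(in.begin(), in.end());       // fill with 0, 1, 2, ...
    thrust::transform(in.begin(), in.end(),       // run the functor over
                      out.begin(), square());     // every element in parallel

    float last = out[n - 1];                      // copies one value back to the host
    printf("last value: %f\n", last);
    return 0;
}
```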
Stirlingsays 15 Sep 18 10.43pm
Originally posted by chateauferret
I have been learning how to use the graphics card as a pretty good parallel computing engine, and for certain general computing tasks (as well as just graphical display) it can blow CPU processing out of the water. This generally means work involving the same smallish bit of computation repeated for many millions of data items. I took a set of C++ procedures involving the generation and transformation of 3D noise across a spherical space (planet generation), which ran in a couple of minutes on the CPU, ported it to GLSL and ran it as a compute shader on a reasonably decent but not bleeding-edge Nvidia graphics card, and it did it in a few tenths of a second. You can get software development kits like CUDA which present the GPU's parallel architecture as a pure general computing platform and run high-volume computations massively in parallel, moving the data from the CPU to the GPU beforehand and pulling the results back on completion. Other applications would include raster image filtering (Gaussian blur, edge detection, emboss etc.), Fourier transforms, process simulation such as erosion modelling, and even neural networks (I think). Why anyone wants to waste computing power like this on s*** like Fortnite I don't comprehend.
Stirlingsays 15 Sep 18 10.47pm
Originally posted by davenotamonkey
I have a 1080ti, and not just any one - one of the high-spec cards. The only interest I'd have is driving 4K VR, and that would first mean stumping up for an upgrade to the Vive, so not soon. For the price point, early benchmarks (NVIDIA's own, so they'll likely have selected favourable tests) show the 2080 is about 13% faster than the 1080ti. I'll wait!

Maybe by the time of the 30 series VR will have reached a critical mass, with the developer time and the specs it offers. I see it breaking through to the mass market via the consoles first, if they stick with it.... though AMD are the chip makers there. Exciting times.
Stirlingsays 15 Sep 18 10.52pm
Originally posted by davenotamonkey
I have a 1080ti, and not just any one - one of the high-spec cards. The only interest I'd have is driving 4K VR, and that would first mean stumping up for an upgrade to the Vive, so not soon. For the price point, early benchmarks (NVIDIA's own, so they'll likely have selected favourable tests) show the 2080 is about 13% faster than the 1080ti. I'll wait!

I've got my eyes on an Asus STRIX 1080ti..... I might go second hand, as the 30 series will possibly be released before or around Q4 of 2019..... and those cards are likely to give you well above a reliable 60 fps... probably more like 100 fps at 4K. Proper immersion.

Edited by Stirlingsays (15 Sep 2018 10.53pm)
davenotamonkey 15 Sep 18 11.00pm
Originally posted by Stirlingsays
I've got my eyes on an Asus STRIX 1080ti..... I might go second hand, as the 30 series will possibly be released before or around Q4 of 2019..... and those cards are likely to give you well above a reliable 60 fps... probably more like 100 fps at 4K. Proper immersion. Edited by Stirlingsays (15 Sep 2018 10.53pm)

That's a nice one... I imagine there will be quite a few trading up, so you could grab a bargain. I ended up adding keyword alerts to [Link] so you get a notification if something is posted there. For some games, I couldn't go back to "pancake mode" - 3D all the way :-)
Stirlingsays 15 Sep 18 11.08pm
Originally posted by davenotamonkey
That's a nice one... I imagine there will be quite a few trading up, so you could grab a bargain. I ended up adding keyword alerts to [Link] so you get a notification if something is posted there. For some games, I couldn't go back to "pancake mode" - 3D all the way :-)

Thanks for the link. I use PCPartPicker, so it looks like I have something else to use now as well. VR is the future, along with AR..... but I do love pancakes as well.