
Nvidia drops 32bit support after 390

Posted: 2017-12-24 16:42
by 4D696B65
http://nvidia.custhelp.com/app/answers/ ... a_id/4604/
After Release 390, NVIDIA will no longer release drivers for 32-bit operating systems for any GPU architecture.

Re: Nvidia drops 32bit support after 390

Posted: 2017-12-24 17:23
by Bulkley
We went from 8-bit to 16-bit to 32-bit so regularly that I fully expected 32-bit to disappear several years ago. I also expected that by now we would be buying 128-bit processors. It looks like "good enough" is becoming dominant.

Re: Nvidia drops 32bit support after 390

Posted: 2017-12-26 17:35
by llivv
As multi-core grows, the bus may get shorter again.
Not likely, but almost anything seems possible these days.

Many years ago IBM was watching data flow through 100 x 8-bit cores, back when the PlayStation 3 was supposed to get a 20-core brain or something like that.
They were probably trying to figure out how the inter-processor communication worked at the micro scale.

Besides, we Linux users are all going to be "massively parallel" soon, along with everyone else.

Re: Nvidia drops 32bit support after 390

Posted: 2017-12-27 04:07
by dotlj
Moore's law used to apply to memory, disk capacity and processors, but getting down to 10 nm technology has slowed things down.
Some years ago I also thought that we might see 128-bit processors. The changes in the 6th, 7th and 8th generation Intel processors have been less impressive than in previous generations.
At least memory is continuing to increase, and new CPUs support more memory, greater bandwidth, and so on.
I'm not surprised to see Nvidia drop support, as support for 32-bit processors is being dropped more widely.
How long will we have to wait to see what Intel and AMD do with their graphics processors?
It's not always easy to buy a suitable computer without an Nvidia GPU, depending on your specifications. YMMV.
I'd prefer to buy Nvidia-free so that I don't need the latest kernel and non-free firmware to get external monitors to work.

Re: Nvidia drops 32bit support after 390

Posted: 2017-12-27 19:50
by llivv
I remember reading about the DEC Alpha's 128-bit data path shortly after Intel's Pentium replaced its 486.

The issue, as I understood it back in the last century, was that it was nearly impossible to program the thing, i.e. to take advantage of the huge pipe it had for each clock cycle.
I still don't understand a LOT of how these things actually work.
Galaxy Quest wrote: Historical Documents
https://en.wikipedia.org/wiki/Alpha_21064
https://en.wikipedia.org/wiki/128-bit