

2013/11/6 Kracked_P_P---webmaster <webmaster@krackedpress.com>


To be honest, number crunching on a GPU is better than on a CPU, since
GPUs were designed for number crunching.  Look at the newer CUDA and ATI
GPU cards.  They have 32, 64, 128, 512, or more "cores" or "streams".
There is a movement to create systems that are GPU-based instead of
CPU-based.  The gaming systems are almost all GPU-based, since they do
not need to run the traditional packages that a "home" or business
computer works with.

Well, the Excel vs. Calc speed comparison on the same system [32-bit] is
a different "thing" than making a 64-bit version of LO or a GPU-based LO
package.  The difference between Calc and Excel may be the efficiency of
the coding.  There is still a lot of old legacy code in LO that is being
worked on to make it run better and much more efficiently.  Just saying
we need to create a 64-bit version of LO to fix the "speed issues" does
not really solve that issue.

As for making LO work with a GPU card, well, I would not be surprised
if, not too long from now, both Windows and Linux have versions that are
GPU-based.  That is one of the things that would make our current
systems faster without replacing the motherboard or CPU.  Just buy a
newer, faster GPU for the system.  This is what the gamers do currently.
The prices of these faster GPUs are going down.  For $100, I could buy a
GPU 3 to 4 times faster than one I could buy 2 or 3 years ago.  The GPU
speed per price is a much better "ratio" than the CPU speed per price.
You just get more speed, or number-crunching power, for your money with
a GPU card, compared to the CPU/motherboard costs.



Whoa, you're leaping way too fast here. GPUs are better at number
crunching, but only at that, and literally only at number crunching.
Branching, working with the rest of the hardware, handling interrupts,
and even some kinds of float or integer operations are not exactly their
strong point.
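
To make that concrete, here is a minimal CUDA sketch (my own
illustration, not anything from LO; the kernel name "branchy" is made
up) of why branch-heavy code maps poorly to a GPU: threads in the same
warp that take different branches are executed one branch after the
other, so most of the "many cores" advantage evaporates.

#include <cstdio>
#include <cuda_runtime.h>

// Each thread takes a different branch depending on its data. Threads
// in the same warp that diverge like this run serially, one branch
// after the other, so the parallel hardware sits mostly idle.
__global__ void branchy(const int *in, int *out, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= n) return;
    if (in[i] % 2 == 0)        // half the warp goes here...
        out[i] = in[i] * 3;
    else                       // ...the other half waits, then runs this
        out[i] = in[i] - 7;
}

int main()
{
    const int n = 1024;
    int h_in[n], h_out[n];
    for (int i = 0; i < n; ++i) h_in[i] = i;

    int *d_in, *d_out;
    cudaMalloc(&d_in, n * sizeof(int));
    cudaMalloc(&d_out, n * sizeof(int));
    cudaMemcpy(d_in, h_in, n * sizeof(int), cudaMemcpyHostToDevice);

    branchy<<<(n + 255) / 256, 256>>>(d_in, d_out, n);
    cudaMemcpy(h_out, d_out, n * sizeof(int), cudaMemcpyDeviceToHost);

    printf("out[0]=%d out[1]=%d\n", h_out[0], h_out[1]);  // 0 and -6
    cudaFree(d_in);
    cudaFree(d_out);
    return 0;
}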

I agree that moving the *relevant* operations to GPU-based hardware can
(and actually does) provide a very large boost, but the kinds of
operations that benefit from it are really limited. There is a reason we
still have those general-purpose CPUs around, you know. Talking about
Windows or Linux systems that are GPU-based is a bit of a stretch.
Even for LO Calc, formulas that are more complicated than "add x and y",
for example those using lookup functions, statistical functions and
other funny stuff, would probably not run better on a GPU. One of the
key points of the GPU (or GPGPU, general-purpose GPU) performance boost
is that you can handle *arithmetic operations* that are *massively*
parallel, not that you have a lot of raw computing power.
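
By contrast, the kind of per-row arithmetic a GPU does shine at looks
like the sketch below (again my own CUDA illustration with a made-up
addColumns kernel, not code from Calc): the same addition applied
independently to every row of two large columns.

#include <cstdio>
#include <cuda_runtime.h>

// One thread per row, no branching, no dependency between rows, so
// thousands of threads can genuinely run at once.
__global__ void addColumns(const float *a, const float *b,
                           float *sum, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        sum[i] = a[i] + b[i];   // think "=A1+B1" filled down a column
}

int main()
{
    const int n = 1 << 20;                 // about a million rows
    size_t bytes = n * sizeof(float);

    float *h_a = new float[n], *h_b = new float[n], *h_sum = new float[n];
    for (int i = 0; i < n; ++i) { h_a[i] = float(i); h_b[i] = 2.0f * i; }

    float *d_a, *d_b, *d_sum;
    cudaMalloc(&d_a, bytes);
    cudaMalloc(&d_b, bytes);
    cudaMalloc(&d_sum, bytes);
    cudaMemcpy(d_a, h_a, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(d_b, h_b, bytes, cudaMemcpyHostToDevice);

    addColumns<<<(n + 255) / 256, 256>>>(d_a, d_b, d_sum, n);
    cudaMemcpy(h_sum, d_sum, bytes, cudaMemcpyDeviceToHost);

    printf("sum[42] = %f\n", h_sum[42]);   // expect 126.0
    cudaFree(d_a); cudaFree(d_b); cudaFree(d_sum);
    delete[] h_a; delete[] h_b; delete[] h_sum;
    return 0;
}

Every thread performs the same operation on a different row; as soon as
a formula needs a lookup or a data-dependent branch, that structure is
gone and the CPU is usually the better place for it.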


-- 
Cley Faye
http://cleyfaye.net

