If you want to do serious math computing, use OpenCL or CUDA on your AMD or Nvidia graphics card.
What if you want a table or a single number as your final result?
GPU programming is ideal for generating tables of parallel calculations. If you want a single number, do it on the CPU.
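To make that split concrete, here is a minimal sketch (an OpenCL C kernel held in a C++ string; the names fill_table and sum_table are illustrative, not from the thread): the table is filled with one work-item per cell on the GPU, while boiling the table down to a single number is a cheap serial loop on the CPU.

#include <cstddef>

// GPU side: OpenCL C kernel source kept in a C++ string. One work-item per
// table cell, which is exactly the "table of parallel calculations" case.
const char* kFillTableSrc = R"CLC(
__kernel void fill_table(__global float* table, const float step)
{
    size_t i = get_global_id(0);
    float x = (float)i * step;
    table[i] = sin(x) * exp(-x);   /* any per-cell function of x */
}
)CLC";

// CPU side: collapsing the table to one number is a cheap serial loop,
// so there is little point shipping it to the GPU.
double sum_table(const float* table, std::size_t n)
{
    double total = 0.0;
    for (std::size_t i = 0; i < n; ++i)
        total += table[i];
    return total;
}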
I’ll look into it, thanks Steve. I’ve seen some interesting things being done with Perl, Java, etc. I don’t know much, but maybe I should… the problem is finding the time to learn a new language.
The NSA likes the number-crunching speed of graphics processors; no doubt it sorts passwords out real fast.
I’ve done the most serious math imaginable on my Sharp EL-512 scientific calculator (the best scientific calculator ever made; I bought it in 1984 for $17).
I still have my EL-5103, and I still love it.
And then there is this…
http://www.parallella.org/board/
And since it runs Linux on its 16 cores (plus 2 ARM cores)…
That comment demonstrates you don’t know what you are talking about when it comes to “serious” math computations. The tool must match the task, and not all tasks can be done with the same tools.
I have done a lot of “serious” number crunching using C on single- and multi-core CPUs that cannot be done on a brain-dead GPU. A GPU is only suitable for doing a few things on multiple long streams of rigorously standardized data (a.k.a. pipelined vector processing). If that is your task, a GPU is great. If not, you will likely not complete your task without using an actual CPU.
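As a rough illustration of the distinction being drawn here (both functions are my own illustrative examples, not code from the thread): the first is a uniform stream operation that maps naturally onto GPU work-items; the second is data-dependent pointer chasing with branching, which suits a conventional CPU far better.

#include <cstddef>

// Stream-friendly: the same operation applied to every element of a long,
// uniformly laid-out array. This is the shape of problem a GPU handles well.
void scale_stream(float* data, std::size_t n, float factor)
{
    for (std::size_t i = 0; i < n; ++i)
        data[i] *= factor;
}

// CPU-friendly: data-dependent pointer chasing with branching. Each step
// depends on the previous one, so there is nothing to hand out to thousands
// of GPU threads.
struct Node { double value; Node* next; };

double walk_and_sum(const Node* head, double threshold)
{
    double total = 0.0;
    for (const Node* p = head; p != nullptr; p = p->next) {
        if (p->value > threshold)
            total += p->value;
        else
            total -= p->value;
    }
    return total;
}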
ROFL. You are a little behind the times. There are very few parallel problems which can’t be substantially accelerated on modern GPUs.
However, not all can be, and THAT is the point. There is no possible universal tool that can solve all problems equally well, and that includes computer languages and hardware, just as there is no one tool that can fix watches, cars, trucks, trains, and airplanes.
You have absolutely no knowledge of the task I had to solve nor the qualities necessary for its solution to have. Why then do you insist that you know what is best for me and the problems I have had to solve? Are you also telepathic, clairvoyant, and omniscient? Or are you just an arrogant fool when it comes to computers and computer programming? Perhaps you have written a few programs and you think that makes you an expert programmer. Sorry, that doesn’t quite cut it.
The bottom line is if you want to use a GPU and solve only the problems it can solve, go ahead and use it. I don’t give a flying fig one way or another. However, my goal is to solve problems and make things that work. I will use the tools that are suited to my purpose. I don’t limit myself to the latest proprietary gadget and fad language just so I can pretend that I am soooo superior. I am simply competent, productive, my results perform as I say they do, and they constitute reliable deliverable products.
I’m not talking about “your problem.” This discussion is about scientific problems, which scientists normally use Fortran for.
I will agree that the problems I have had to solve are not the kinds of problems that so-called scientists try to solve using Fortran. I don’t solve problems based upon fantasy and science fiction. I solve real-world problems so that they are actually solved, rather than being used as the basis of a scam for more public funding to do more of the same. Actual customers need, want, and will voluntarily pay for the solutions I produce BECAUSE they solve real problems, not the fantasy and science fiction kind of problems.
Now for full disclosure. I programmed my kind of solutions in Fortran as early as 1965 and have used a wide variety of hardware platforms, languages, and OSes since then. Fortran can be made to work, but it is quite limited in what you can do with it. Even the later versions of Fortran, whose programs could be more correctly structured, are limited. I had to use far too many assembly language modules to get the behavior and performance that was required.
However, no matter what you like to pretend, some problems are better served by certain tools than others. Your “preferred” tools solve only one very narrow class of problems extremely well. For almost all other problems, they are either a force fit or all but useless. To say that the one narrow class of problems is the totality of all “serious” math computations is outside the context of reality and into the realm of delusion.
Are GPUs good for floating point calculations?
They only have floating point math. There are no integer units.
Oh man am I behind the times.
There are no native 32-bit/64-bit integer units; however, the 24-bit integer multiply-add is still native on SIMD GPUs, for efficient address lookups in a workgroup.
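For reference, a minimal sketch of what that looks like in practice (an OpenCL C kernel embedded as a C++ string; the kernel name scale_rows is illustrative): mad24() is the OpenCL built-in 24-bit multiply-add, and here it computes a flat buffer index as row * width + col.

// OpenCL C kernel source kept in a C++ string.
const char* kScaleRowsSrc = R"CLC(
__kernel void scale_rows(__global float* data,
                         const int width,
                         const float factor)
{
    int row = (int)get_global_id(1);
    int col = (int)get_global_id(0);
    /* 24-bit multiply-add: idx = row * width + col */
    int idx = mad24(row, width, col);
    data[idx] *= factor;
}
)CLC";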
You can learn a lot about GPU programming at the following site. The blogger documents his attempt to port a chess engine to a GPU:
http://chessgpgpu.blogspot.com/2013/01/what-is-goal.html
Interesting. The blogger you refer to supports my contention that our host’s preferred tools are either useless or a force fit for the kind of computational problem he faces. They do ONE thing extremely well. If your problem fits that capability, then it is great. If not, and you still insist on using it, it requires an extraordinary effort whose cost exceeds the benefit by a wide margin.
Question: how many executions of the computation, using our host’s preferred special-purpose tool, will it take for the time saved to exceed the time it took to adapt the problem to his tool?
Question: how realistic is it that that number of executions will occur?
Question: what will the real-world payoff be if it does? That is, beyond being able to say it was done according to the new state of the art and giving you bragging rights that it is really “kool” to have done it. Or worse, beyond being able to scam still more extorted funds from taxpayers who actually do useful work for a living.
Don’t get me wrong. It is OK by me if that is what you want to do. However, don’t expect me to be happy about being forced to pay for it. Do it with your own resources and keep your hands out of my pocket. I don’t put my hands in your pocket to pay for my work. That should be a fair trade.
Now back to the real world. I have some real work to get done.
C++ AMP is my choice.
http://channel9.msdn.com/posts/AFDS-Keynote-Herb-Sutter-Heterogeneous-Computing-and-C-AMP
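For anyone curious what C++ AMP looks like, here is a minimal sketch (it assumes Visual C++’s <amp.h> header and the concurrency namespace; the function square_all is illustrative): the lambda marked restrict(amp) runs on the default accelerator over an array_view of the data, and synchronize() copies the results back.

#include <amp.h>
#include <vector>
using namespace concurrency;

// Square every element of v on the default accelerator (typically the GPU).
void square_all(std::vector<float>& v)
{
    array_view<float, 1> av(static_cast<int>(v.size()), v);
    parallel_for_each(av.extent, [=](index<1> idx) restrict(amp) {
        av[idx] = av[idx] * av[idx];
    });
    av.synchronize();   // copy results back to the host vector
}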