Tim Hutton - 2014-07-07 10:36:02+0000 - Updated: 2014-07-18 12:42:13+0000
I have just realised that I have already been doing quantum computing!

On a CPU, if-then-else branching is like the Copenhagen interpretation: we take only one branch or the other. But on a GPU (or other SIMD environment), branching is like the many-worlds interpretation of quantum mechanics: we evaluate both branches and then allow the results to interfere to give a single result:

result = conditional * output1 + (1-conditional) * output2;

The double-slit experiment is exactly an if-then-else branch on a GPU, with data as the stream of photons. We don't need to make sure each bit goes through the correct slit; instead we fire the data at the slits and each bit goes through both. Only those results that we are interested in appear on the wall behind; the others destructively interfere.
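
Concretely, the idiom looks something like this in plain C (a minimal scalar sketch of the SIMD blend; the input data and the two branch computations here are invented for illustration):

#include <stdio.h>

/* Branchless select, as in the formula above: compute both "branches"
   for every element, then blend so only one survives. The data and the
   two branch computations are illustrative, not from the original post. */
int main(void)
{
    float data[4] = {0.1f, 0.9f, 0.4f, 0.7f};
    float result[4];

    for (int i = 0; i < 4; ++i) {
        /* Both "worlds" are computed unconditionally... */
        float output1 = data[i] * 2.0f;   /* the "then" branch */
        float output2 = data[i] + 10.0f;  /* the "else" branch */

        /* ...then blended: conditional is 1.0f or 0.0f, so exactly
           one of the two outputs survives the sum. */
        float conditional = (data[i] > 0.5f) ? 1.0f : 0.0f;
        result[i] = conditional * output1 + (1.0f - conditional) * output2;

        printf("%f -> %f\n", data[i], result[i]);
    }
    return 0;
}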

GPU Gems - Chapter 34. GPU Flow-Control Idioms

Shared with: Public, Paul Gray
Cornus Ammonis - 2014-07-07 18:29:36+0000
Great article, those z-culling and occlusion query optimizations are very smart! CPUs often execute both sides of a branch as well, to mitigate branch misprediction penalties. They don't take a significant performance hit in doing so, either; once the branch is resolved, the unneeded path is thrown away mid-flight (no context switch necessary), and its calculations are mostly filling otherwise-unused pipeline slots anyway. I don't know for certain whether the big CPU manufacturers do this (trade secret), but it's a reasonable guess that they decide to execute both branches probabilistically, since modern branch predictors are known to use probabilistic (perceptron) models.
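
For anyone who wants to see the misprediction penalty directly, here is a minimal C sketch (the classic sorted-versus-unsorted benchmark; the array size, threshold, and pass count are illustrative choices, not from the thread):

#include <stdio.h>
#include <stdlib.h>
#include <time.h>

/* With sorted data the branch in time_sum is almost always predicted
   correctly; with random data it mispredicts about half the time and
   the same loop runs noticeably slower. */

static int cmp(const void *a, const void *b)
{
    int x = *(const int *)a, y = *(const int *)b;
    return (x > y) - (x < y);
}

static double time_sum(const int *v, int n)
{
    clock_t t0 = clock();
    long long sum = 0;
    for (int pass = 0; pass < 100; ++pass)
        for (int i = 0; i < n; ++i)
            if (v[i] >= 128)          /* the branch whose prediction matters */
                sum += v[i];
    volatile long long sink = sum;    /* keep the loop from being optimized away */
    (void)sink;
    return (double)(clock() - t0) / CLOCKS_PER_SEC;
}

int main(void)
{
    enum { N = 1 << 20 };
    static int v[N];
    for (int i = 0; i < N; ++i) v[i] = rand() % 256;

    printf("random: %.3fs\n", time_sum(v, N));
    qsort(v, N, sizeof v[0], cmp);
    printf("sorted: %.3fs\n", time_sum(v, N));
    return 0;
}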
Paul Gray - 2014-07-07 18:59:14+0000
When branch prediction comes back with 42 every time, that's when I'll agree about the quantum aspect :).

I'm just waiting for analogue computers to come back into fashion, that and photonic CPUs.

But whatever we end up with, you can be guaranteed of one thing: quantum computer or not, people will still argue that random is not random enough to be called random.
Cornus Ammonis - 2014-07-07 19:08:14+0000
+Paul Gray Especially if it's Intel's RDRAND :)

This post was originally on Google+