by daveguy on 12/14/2015, 8:23:38 PM
by fivesigma on 12/14/2015, 11:52:28 PM
TL;DR version: If there are any quantum phenomena going on in that device, they are happening only within the 8-qubit domains, not between them.
Bonus food for thought: how much more of a speedup would an ASIC farm provide over a general-purpose CPU for that particular use case if you threw $150M at it?
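To make the "within 8-qubit domains" picture concrete, here is a minimal classical sketch (all sizes, couplings, and the cooling schedule are illustrative, not taken from the Google paper): simulated annealing on an Ising model whose spins are grouped into 8-spin clusters with strong bonds inside each cluster and weak bonds between them. The whole-cluster flip move stands in for the collective tunneling a real 8-qubit domain would do; classical algorithms with cluster moves along these lines (e.g., Selby's) were what Aaronson discussed as matching D-Wave on this benchmark family.

    # Toy illustration only: classical simulated annealing on an Ising model
    # of 4 clusters x 8 spins. Strong ferromagnetic bonds inside each cluster,
    # one weak bond chaining neighboring clusters together. A cluster-flip
    # move (all 8 spins at once) mimics collective behavior inside a domain.
    import math
    import random

    random.seed(0)

    N_CLUSTERS = 4
    CLUSTER_SIZE = 8
    J_INTRA = 1.0    # strong coupling within a cluster (assumed value)
    J_INTER = 0.2    # weak coupling between clusters (assumed value)

    def energy(spins):
        """Ising energy: intra-cluster all-to-all bonds + weak inter-cluster chain."""
        e = 0.0
        for c in range(N_CLUSTERS):
            base = c * CLUSTER_SIZE
            for i in range(CLUSTER_SIZE):
                for j in range(i + 1, CLUSTER_SIZE):
                    e -= J_INTRA * spins[base + i] * spins[base + j]
            if c + 1 < N_CLUSTERS:
                e -= J_INTER * spins[base] * spins[base + CLUSTER_SIZE]
        return e

    def anneal(steps=20000, t_start=5.0, t_end=0.05, p_cluster_move=0.1):
        spins = [random.choice([-1, 1]) for _ in range(N_CLUSTERS * CLUSTER_SIZE)]
        e = energy(spins)
        for step in range(steps):
            t = t_start * (t_end / t_start) ** (step / steps)  # geometric cooling
            if random.random() < p_cluster_move:
                # collective move: flip an entire 8-spin cluster at once
                c = random.randrange(N_CLUSTERS)
                flip = range(c * CLUSTER_SIZE, (c + 1) * CLUSTER_SIZE)
            else:
                # ordinary single-spin move
                flip = [random.randrange(len(spins))]
            for i in flip:
                spins[i] = -spins[i]
            e_new = energy(spins)
            # Metropolis acceptance: always keep downhill moves, sometimes uphill
            if e_new <= e or random.random() < math.exp((e - e_new) / t):
                e = e_new
            else:
                for i in flip:  # reject: undo the flip
                    spins[i] = -spins[i]
        return spins, e

    spins, e = anneal()
    print("final energy:", e)

With only single-spin moves the annealer tends to get stuck once a cluster has frozen the "wrong" way, because flipping spins one at a time must climb through high-energy intermediate states; the cluster move escapes in one step, which is the classical analogue of the tunneling story.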
by DannyBee on 12/14/2015, 10:08:25 PM
"There were $150 million dollars that went into designing this special-purpose hardware for this D-Wave machine and making it as fast possible. So in some sense, it’s no surprise that this special-purpose hardware could get a constant-factor speedup over a classical computer for the problem of simulating itself."
I kinda wonder how much money Scott thinks goes into most chip hardware ;-)
(I think the special- vs. general-purpose argument is certainly true, but citing the money numbers to bolster it seems ... silly)
by gnoway on 12/14/2015, 9:52:59 PM
I saw this headline and immediately thought "that's some pretty special paper."
It's been a long day.
by dvh on 12/14/2015, 3:10:31 PM
> "A mainstream media article about quantum computer that starts with a reference to D-Wave can be safely ignored"
marcosdumay, HN, 1 hour ago
If you came here, like I did, thinking "HEY! That's old news!", you are correct. The added bonus is an interview with Aaronson for MIT News about his response to the Google paper (which he previously blogged about). The original blog post has an update from Aaronson:
" MIT News now has a Q&A with me about the new Google paper. I’m really happy with how the Q&A turned out; people who had trouble understanding this blog post might find the Q&A easier. Thanks very much to Larry Hardesty for arranging it."
EDIT: It is definitely worth the read: a concise layman's summary of the original post and an outline of the issues D-Wave still has to overcome.