Quantum Computing reminds me of the story of the Deep Thought computer from The Hitchhiker’s Guide to the Galaxy. All answers are known and pre-computed if you only knew how to formulate the question. There are niches where quantum computing can really pay off, such as in cryptography. Elsewhere, where you need determinism, not so much.
OK, you maniac-brainiacs here on Rand’s fine site, help me out here.
My starting point is the Wikipedia article, and as we know, Wikipedia is the world’s most trusted information source.
Oversimplifying somewhat by leaving out complex numbers: a “q-bit” does not store a binary number, a zero or a one; rather, it stores a real number p in the range zero to one. The number p is a probability: if you do a readout of the q-bit, you will get a 1 with probability p and a 0 with probability 1-p.
You don’t really get to add, subtract, multiply, or divide the p values of different q-bits, but you do get to combine q-bits with some sort of logic-gate-like things that yield a new value of p: the new bit returns 1 on readout with probability p and 0 with probability 1-p.
I presume there is also some way to initialize a quantum bit, perhaps not to any p but to some predetermined p? Or maybe it initializes to a random p on the interval 0 to 1? Then there are some auto-magical algorithms where you somehow initialize the quantum computer, cycle its q-bits through some gate transformations, and then read out 0’s or 1’s with probabilities given by the p-values stored in the q-bits, and these algorithms do something useful besides getting a grant funded or promoting an assistant professor to associate professor?
Just as Boolean algebra describes a regular kind of computer, and this description doesn’t depend on whether the 0 or 1 states are implemented in 12-nanometer CMOS or with clothes pins, a quantum computer has these q-bit thingies that read out a 0 or a 1 with probabilities 1-p and p. There is some way of initializing a q-bit to a quantum state p, a way of combining q-bits to come up with a new quantum state p that remains between 0 and 1, making it a proper binary probability, and there are algorithms for doing something remotely useful with a device following those rules. And the rules stay the same whether the q-bits are implemented with a rubidium laser cold-trap or a Ouija-board session?
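Here is my crude attempt at writing those rules down, a minimal sketch in Python/numpy with the complex numbers put back in (the Wikipedia article says the q-bit really stores two complex amplitudes, and my p is the squared magnitude of the “1” amplitude). The function names and the particular rotation gate are just mine for illustration; correct me if I have the rules wrong.

```python
import numpy as np

# One q-bit = two complex amplitudes (c0, c1), with |c0|^2 + |c1|^2 = 1.
# Readout gives 1 with probability p = |c1|^2 and 0 with probability 1 - p.

ket0 = np.array([1.0 + 0j, 0.0 + 0j])   # initialize to a definite 0 (p = 0)

def rotation_gate(theta):
    """A 2x2 unitary; rotating |0> by theta gives any p = sin^2(theta/2) you like."""
    return np.array([[np.cos(theta / 2), -np.sin(theta / 2)],
                     [np.sin(theta / 2),  np.cos(theta / 2)]], dtype=complex)

def readout_probability(state):
    """Born rule: probability that a readout returns 1."""
    return abs(state[1]) ** 2

state = rotation_gate(2 * np.arcsin(np.sqrt(0.3))) @ ket0   # dial in p = 0.3
print(readout_probability(state))                           # ~0.3

rng = np.random.default_rng()
samples = rng.random(10000) < readout_probability(state)    # simulated readouts
print(samples.mean())                                       # ~0.3, up to statistics
```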
Have I missed anything? Rand has some fine people visiting his Web site who turned up the paper on the optimal-control algorithm for the SpaceX booster-rocket tail landing, not that anyone here, including me, understands that paper, but at least someone turned it up. Help me out, here.
My main suggestion is Scott Aaronson’s Quantum Computing Since Democritus: https://www.scottaaronson.com/democritus/
Aaronson is really good at describing the field, what it does, and what it can’t do. I’ve skimmed it a few times, each time gliding over the math parts that were too much trouble, and I learned a lot about everything. For instance, he has a wonderful chapter on Free Will.
It would be nice if it were what I used to think: a way to do massively parallel processing by running a quintillion classical computers at once and then combining the results. For instance, the Nth computer checks if the ball is in box N and returns N if it is and 0 otherwise, and then you add them all up to find the ball. [This case is actually kind of a working example.]
But the last step, “combining the results”, only happens in very specific ways. So far the field looks like: see what happens in some special setup, and then see if the result is interesting and useful. Sometimes it is. But no one has found a way to make designer results to order.
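That ball-in-the-boxes search is, as far as I can tell, Grover’s algorithm in disguise, and you can fake it on a classical machine by just tracking the amplitudes. A rough sketch, assuming I have the iteration count right; the variable names are mine:

```python
import numpy as np

# Toy classical simulation of Grover's search: find which of N boxes holds
# the ball, consulting the "oracle" only about sqrt(N) times instead of N.
# A real quantum computer would hold these N amplitudes in log2(N) qubits.

N = 64       # number of boxes (a power of 2, so it maps onto 6 qubits)
ball = 37    # which box actually holds the ball (only the oracle knows)

# Start in the uniform superposition: equal amplitude for every box.
state = np.full(N, 1.0 / np.sqrt(N))

iterations = int(round(np.pi / 4 * np.sqrt(N)))   # ~ (pi/4) * sqrt(N)
for _ in range(iterations):
    # Oracle: flip the sign of the amplitude of the marked box.
    state[ball] *= -1.0
    # Diffusion ("inversion about the mean"): reflect amplitudes about their average.
    state = 2.0 * state.mean() - state

# Readout: probability of seeing box k is |amplitude_k|^2.
probs = state ** 2
print("most likely box:", int(np.argmax(probs)))
print("probability of reading out the right box: %.3f" % probs[ball])
```

The point being that the oracle only gets asked about sqrt(N) times rather than N times, and the “combining the results” happens in exactly one very specific way, the diffusion step.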
I took a look at the course Web page, and the situation reminds me of when a biology professor, I’ll call him Bernie, was invited to speak before the Madison Computer Club about gene sequencing.
Bernie gave a professional-quality lecture-on-my-research-for-the-informed-lay-audience. I don’t know if I fully understood gene sequencing, but it is not done by reading the DNA directly; rather, you have these “restriction enzymes” that cut the DNA in specific places, and all you get to observe is the lengths of the resulting fragments after making a specific cut. In that way it is analogous to quantum computing, in that you don’t have a way to read out the quantum states directly either.
Being the Madison Computer Club of the early 1980s, the membership an eclectic group of computer-kit entrepreneurs and early microcomputer adopters, we just peppered Bernie with questions, because we were not going to let him leave the Computer Sciences lecture room we had reserved without figuring out how this works. Bernie gave it his best shot answering our questions in the lingo of biology and biochemistry, but he tried to disengage and get home to his family by offering, “it’s complicated” and “there are a lot of other things you need to learn first that I don’t have time to explain.”
I don’t remember if there was some kind of break between agenda items, this being a club meeting, but I remember a bunch of us huddled in the back because we were simply not going to give up on understanding this, regardless of what Bernie was telling us. There was a club member, also named Paul, who said, “I think it works like this,” and he proceeded to summarize Bernie’s lecture, but translated from biology-professor jargon into computer-geek talk.
I don’t think I fully caught on after Paul’s explanation, but it seems a lot of the other guys indeed did.
Scott Aaronson is still lecturing on a seemingly impenetrable subject for a lay audience, Bernie style. A signals-and-systems EE like me knows about real numbers, complex numbers, binary numbers, linear transformations, and probability. I guess I am looking for the other guy named Paul in the Madison Computer Club who could translate Quantum Computing into Electrical Engineering jargon.
The other thing I see missing in discussions of Quantum Computing is some manner of “separation of concerns.”
You can write a C-language computer program without knowing electron transport in a MOSFET. You have a quantum-mechanical-level description of electron transport, and then you have a circuit-level description where quantum mechanics no longer enters into it, once you accept the voltage-current curves of the circuit elements as given. From circuits you get gates, but at the gate-level description Boolean algebra takes over and you no longer need to know about circuits. Once you have the Boolean algebra, you can go to higher functional levels: descriptions of functional units (memory addressing, adders, multipliers), a description of the machine registers, instructions, and memory-addressing modes, and from there you can go to a somewhat higher-level language such as C, where there remains a correlation between addressing modes and pointer operations, and so on.
OK, OK, high-performance IC design requires your logic simulation to drill down into the circuit level so that you meet timing constraints and voltage thresholds, C-language programming for some memory-constrained Internet-of-Things processor requires consideration of the underlying instruction set, and so on. But you get the idea.
“Conventional” computing is a rather opaque subject unless you break it down this way. I am thinking Quantum Computing could be amenable to a similar treatment? To understand quantum computing gates, you don’t have to ooh and ahh, “Quantum Mechanics! Spooky action at a distance! Schrödinger’s Cat!” You just need to say: these are the mathematical rules these gates follow, and I don’t care (right now) about the Physics spells behind them?
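To make that concrete: at the gate level of description, a quantum gate is nothing but a small unitary matrix, and wiring gates in sequence is nothing but matrix multiplication, so circuit identities can be checked with plain linear algebra and no physics in sight. The gate definitions below are the standard textbook ones; the rest is just a numpy sketch:

```python
import numpy as np

# At the "gate" level, quantum gates are just unitary matrices, and putting
# gates in sequence is just matrix multiplication.  No rubidium cold-traps required.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate
X = np.array([[0, 1], [1, 0]])                 # "NOT" gate
Z = np.array([[1, 0], [0, -1]])                # phase-flip gate

# A circuit identity you can check with plain linear algebra: H Z H = X.
print(np.allclose(H @ Z @ H, X))        # True

# Every gate is reversible; these particular ones are their own inverses.
print(np.allclose(H @ H, np.eye(2)))    # True
```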
I attended the talk with Scott Aaronson. He did actually mention that there are people working on high-level compilers, etc., to program a quantum computer, but it’s all very much in the research stage. No one really knows how to do that. Aaronson says that in the quantum computing age we are at the functional equivalent of the classical vacuum-tube computer. Others think we are even less far along than that. In the early days of classical computing, problems were ‘programmed’ using wires and plugboards; quantum computers seem very much like those days, needing hardwired connections to solve a specific problem. It’s pretty far from being anything like “general purpose”, owing in large part to the lack of the “separation of concerns” you describe. Even so, it seems unclear that there will ever be a clean or clear hierarchy of concepts like there is with classical computers. I think the jury is still out on how well this form of computing will “generalize”.
A quantum computer takes advantage of the strange (and perhaps illusory – we still don’t know) property of quantum mechanics that the state of the system is a wavefunction defined over what would have been the classical configuration space of the system.
If you have one qbit, the state of your computer is (c1, c2), where c1 and c2 are two complex numbers, normalized so that |c1|² + |c2|² = 1.
If you have two qbits, the state of your computer lives in the tensor-product space (c1,c2)⊗(c3,c4): four complex amplitudes, one for each memory state, and so on. The computer has some amplitude to be in any one of its possible memory states, and quantum gates operate on that entire configuration simultaneously.
Of course, what you get out at the end is just one of those states: (00), (01), (10), (11), etc. You have to do the calculation some number of times and build a histogram to “read” the output – a destructive process.
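For instance, here is the two-qbit case done longhand on a classical machine: a Hadamard and a CNOT make the standard entangled (Bell) state, and then you “run” it a few thousand times and histogram the readouts. A sketch, with the numpy spelling being mine:

```python
import numpy as np

# Two qbits = 4 complex amplitudes, one for each classical state 00, 01, 10, 11.
state = np.zeros(4, dtype=complex)
state[0] = 1.0                                   # start in |00>

H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
I2 = np.eye(2, dtype=complex)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

# Hadamard on the first qbit (kron = tensor product), then CNOT: a Bell state.
state = CNOT @ np.kron(H, I2) @ state

# Readout is destructive and probabilistic, so "read" the output by running
# the whole thing many times and histogramming the results.
probs = np.abs(state) ** 2
rng = np.random.default_rng()
shots = rng.choice(["00", "01", "10", "11"], size=4000, p=probs)
labels, counts = np.unique(shots, return_counts=True)
print(dict(zip(labels, counts)))   # roughly half 00 and half 11, nothing else
```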
I sort of view quantum computers as an experimental test of Everett’s “many-worlds” interpretation of quantum mechanics. Is this exponentially vast state space real (indispensable in explaining the dynamics of the world), or were we staring cross-eyed at some sort of statistical mechanics over wave mechanics all along?
If they can demonstrate some *unambiguous* (not contrived, special-cased, or obfuscated) quantum supremacy, then that’s good evidence for the reality of configuration space. Current claims seem premature.