This naturally gives rise to a question: what are some non-classical techniques? Many have been discussed before, such as different ways of isolating trends or different methods of modelling. These ideas are rooted in the fact that we have only so much processing power and increasingly large amounts of data. But there is another option. What if we limit our processing technique, but in exchange give it nearly unlimited power? In other words, running our programs won't tell us the same things, but they will run orders of magnitude faster! What is this amazing technology, you ask? Well, welcome to quantum computing.
Quantum computing has suffered its share of poorly-researched journalism over the years, but after focusing my summer research project at Stanford on quantum information theory, I feel prepared enough to dispel some of the illusions.
The basis of quantum computing is that the concept of a bit, a "light bulb" that is either on or off, can be slightly changed. In classical computing this idea of on or off, all or nothing, is how we store data. Through long strings of on and off light bulbs (or 1's and 0's, as they are often written) we can express all manner of ideas. Quantum computing uses physical properties of the universe to make things a little more interesting. Instead of a bit being on or off, it has some probability of being on and some probability of being off. That means we don't know whether it is a 1 or a 0; if we look closely enough we can find out, but without looking closely all we know are these probabilities. (This explanation still skates around some major subtleties, but it is accurate enough for this blog post.)
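To make the "probability of being on" idea concrete, here is a toy sketch in Python. The `Qubit` class and its `prob_one` field are names I made up for this post, and real qubits carry complex amplitudes rather than bare probabilities, so treat this as a cartoon of the measurement behavior, not a real simulator:

```python
import random

class Qubit:
    """Toy model of a single qubit: tracks only the probability of
    reading a 1, ignoring phase. (Real qubits hold complex amplitudes;
    this simplification is just for illustration.)"""

    def __init__(self, prob_one):
        assert 0.0 <= prob_one <= 1.0
        self.prob_one = prob_one

    def measure(self):
        """Looking 'closely' forces the qubit to be a 0 or a 1 and
        destroys the in-between probability."""
        outcome = 1 if random.random() < self.prob_one else 0
        self.prob_one = float(outcome)  # collapse: now definitely 0 or 1
        return outcome

q = Qubit(0.7)        # 70% chance of reading a 1
first = q.measure()   # either 0 or 1
second = q.measure()  # always matches the first: the state has collapsed
```

The point of the sketch is the last two lines: before measuring, the qubit "knows" the value 0.7, but the moment you look, all you get is a single 0 or 1, and looking again tells you nothing new.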
But Ryan, what does this have to do with analyzing data? I'm glad you asked! It turns out that since this bit can have a whole continuous spectrum of probabilities between its on and off, it can store a lot more data. This means we can take large amounts of data, translate them into these "quantum bits," and use them for our purposes. But this comes with a great drawback: information in a "qubit" is not as accessible as in a regular bit. When we "read" a qubit, it resolves into either a 1 or a 0, and any other information is lost. However, there are certain mathematical techniques we can use to solve problems faster than we could with classical bits. And thus comes the hope that someday we can use these techniques to analyze large amounts of data quickly.
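The drawback can also be shown in a few lines. In this sketch (the `encode` and `read_once` helpers are invented for illustration, not any real quantum API), a continuous value is "stored" as a qubit's probability of reading 1. A single readout yields only one classical bit, so recovering the stored value takes many freshly-prepared copies:

```python
import random

def encode(value):
    """Pretend to store a continuous value as a qubit's Pr(read 1).
    (Invented helper; real amplitude encoding is more involved.)"""
    assert 0.0 <= value <= 1.0
    return value

def read_once(prob_one):
    """One readout yields exactly one classical bit."""
    return 1 if random.random() < prob_one else 0

random.seed(0)
p = encode(0.42)  # one qubit 'holds' 0.42...

# ...but a single read gives just a 0 or a 1. To estimate the stored
# value we must prepare and read many identical copies:
samples = [read_once(p) for _ in range(10_000)]
estimate = sum(samples) / len(samples)
# estimate lands near 0.42, but only after 10,000 preparations
```

This is the tension in the paragraph above: the continuous spectrum lets a qubit hold a lot, but readout only leaks it one noisy bit at a time, which is why clever mathematical techniques are needed to get a speedup at all.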