
Quantum Computers May be More of an Imminent Threat than AI

The Washington Post published an article yesterday (Feb. 5, 2018) about the rapid progress being made in quantum computing, and how this will likely affect fields such as cryptography and, by extension, IT security. The article also talks about quantum computing's potential "in [improving] weather forecasting, financial analysis, logistical planning, the search for Earth-like planets, and drug discovery."

The article spends a fair amount of time talking about the traveling salesperson problem (TSP), which I cover in my post on Genetic Algorithms Basics. Combinatorial problems like TSP are not easily solvable on traditional computers. For those interested in complexity theory, they fall into a category of problems called NP-complete: a proposed solution can be checked in polynomial time, but no known algorithm is guaranteed to find one in polynomial time. The only way to solve these problems in polynomial (i.e., reasonable) time is to guess the right answer nondeterministically - hence the name Nondeterministic Polynomial (NP).
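To make the combinatorial explosion concrete, here is a minimal brute-force TSP solver in Python. The city names and coordinates are made up purely for illustration; with n cities it has to check (n - 1)! candidate tours, which is why exact search stops being practical almost immediately.

import itertools
import math

# Hypothetical city coordinates, purely for illustration.
cities = {
    "A": (0, 0),
    "B": (1, 5),
    "C": (4, 2),
    "D": (6, 6),
    "E": (3, 7),
}

def tour_length(order):
    """Total distance of a round trip visiting the cities in the given order."""
    total = 0.0
    for i in range(len(order)):
        x1, y1 = cities[order[i]]
        x2, y2 = cities[order[(i + 1) % len(order)]]
        total += math.hypot(x2 - x1, y2 - y1)
    return total

# Fix the starting city and try every permutation of the rest.
# For n cities this checks (n - 1)! tours: fine for 5, hopeless for 50.
start = "A"
rest = [c for c in cities if c != start]
best = min(((start,) + perm for perm in itertools.permutations(rest)), key=tour_length)
print(best, round(tour_length(best), 2))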

Quantum computers are great combinatorial optimizers. If you can frame your problem as a combinatorial optimization problem, then a quantum computer may be able to solve it in minutes or seconds. Well-known combinatorial optimization problems include TSP, factoring large numbers, and breaking encryption ciphers.
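As a rough sketch of what "framing your problem as a combinatorial optimization problem" looks like in practice, here is a toy number-partitioning instance written as a QUBO (quadratic unconstrained binary optimization), the form quantum annealers typically accept. The numbers are invented for illustration, and the QUBO is solved here by classical brute force rather than on quantum hardware.

import itertools

import numpy as np

# Hypothetical numbers to split into two equal-sum groups (made up for illustration).
a = np.array([3, 1, 1, 2, 2, 1])
S = a.sum()

# Build the QUBO matrix Q so that x^T Q x (plus a constant) equals the squared
# difference between the two group sums, with each x_i in {0, 1}.
Q = 4 * np.outer(a, a)
np.fill_diagonal(Q, 4 * a * a - 4 * S * a)

# A quantum annealer would sample low-energy x directly; here we brute-force
# all 2^n bit strings, which only works because the problem is tiny.
best_x, best_e = None, float("inf")
for bits in itertools.product([0, 1], repeat=len(a)):
    x = np.array(bits)
    e = x @ Q @ x + S * S  # add the constant offset back for readability
    if e < best_e:
        best_x, best_e = x, e

print("assignment:", best_x, "squared imbalance:", best_e)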

Quantum computers could also be used to solve other, less well-known problems. These problems are currently solved using genetic algorithms, and range from scheduling and logistics, to laying out transistors on a silicon wafer, to robot motion planning, and even to creating AI in the form of pre-architected, pre-trained artificial neural networks.
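For comparison, here is a stripped-down genetic algorithm applied to the same kind of toy TSP instance: selection, order crossover, and swap mutation. The city layout, population size, mutation scheme, and number of generations are arbitrary choices for this sketch, not the settings from the Genetic Algorithms Basics post.

import math
import random

# Toy city layout, made up for illustration.
cities = [(0, 0), (1, 5), (4, 2), (6, 6), (3, 7), (8, 1), (7, 4)]

def tour_length(order):
    """Round-trip distance for a tour given as a list of city indices."""
    return sum(
        math.dist(cities[order[i]], cities[order[(i + 1) % len(order)]])
        for i in range(len(order))
    )

def crossover(p1, p2):
    """Order crossover: copy a slice from one parent, fill the rest in the other's order."""
    a, b = sorted(random.sample(range(len(p1)), 2))
    child = [None] * len(p1)
    child[a:b] = p1[a:b]
    fill = [c for c in p2 if c not in child]
    for i in range(len(child)):
        if child[i] is None:
            child[i] = fill.pop(0)
    return child

def mutate(order):
    """Swap two random cities, the simplest mutation that keeps the tour valid."""
    a, b = random.sample(range(len(order)), 2)
    order[a], order[b] = order[b], order[a]
    return order

random.seed(0)
population = [random.sample(range(len(cities)), len(cities)) for _ in range(50)]

for _ in range(200):  # generations
    population.sort(key=tour_length)
    parents = population[:10]  # keep the 10 shortest tours as parents
    children = [
        mutate(crossover(random.choice(parents), random.choice(parents)))
        for _ in range(40)
    ]
    population = parents + children

best = min(population, key=tour_length)
print(best, round(tour_length(best), 2))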

While quantum computers are well-suited to combinatorial optimization problems, they are not necessarily the best solution for every problem. Quantum computers will (at least for the foreseeable future) be confined to the data center, so processing at the edge - where much of the AI inference work will be done - will remain the domain of traditional processors.

The original article from the Washington Post is linked below. Please feel free to leave a comment or reach out on Twitter or LinkedIn!

https://www.washingtonpost.com/news/innovations/wp/2018/02/05/quantum-computers-may-be-more-of-an-imminent-threat-than-ai/
