I wonder if it will be an issue of capability or morality. Surely, computer hardware and software are fast approaching a point where the machine is capable of doing what a human pilot can do, and in many ways doing it better. But when you're talking about arming a computer, essentially, and allowing it to decide when to pull the trigger, is humankind ready to turn moral decisions over to a machine? To turn the rules of engagement into an algorithm? It's one thing to have a drone operated remotely, but something else entirely to essentially click an icon and watch the video feed.