Some people fear giving too much control to machines, but after each colossally stupid case of human error (the Costa Concordia, Asiana 214, and now the Santiago de Compostela derailment) the argument for automation gets a little stronger.

Have you ever watched "Air Disasters" on the Smithsonian Channel? (Great show, btw.) It chronicles the history of plane accidents, and guess what? The cause is almost always human error.

Frankly, even with all the bugs, hacking, blue screens, and god knows what else that a fully automated world will entail, I think it will still be safer than what we have now.

The sooner the better I say.