Thursday, December 17, 2015

Originally shared by Thomas Baekdal

It's fascinating how history tends to repeat itself, and the evolution of self-driven cars is certainly one such case. Think about it like this.

Back in the 1800s, we had self-driven horse carriages. The horse could, to some extent, autonomously drive the carriage with only minor directions from the person in the carriage.

This gave us two levels of safety. First, we had the human driver who directed the horse to go where he wanted. But the horse itself would also navigate or brake if needed. If someone stepped in front of the horse, it wouldn't just run into them; it would try to avoid them.

But then we got rid of the horse with the invention of the combustion engine, and this was quite a scary prospect. Now we had self-propelled vehicles where the only thing keeping them safe was the human behind the wheel.

So laws, called the Locomotive Acts, were imposed to ensure public safety. Today we also know them as the 'red flag' laws (repealed in 1896). As Wikipedia puts it:

Firstly, at least three persons shall be employed to drive or conduct such locomotive, and if more than two waggons or carriages be attached thereto, an additional person shall be employed, who shall take charge of such waggons or carriages.

Secondly, one of such persons, while any locomotive is in motion, shall precede such locomotive on foot by not less than sixty yards, and shall carry a red flag constantly displayed, and shall warn the riders and drivers of horses of the approach of such locomotives, and shall signal the driver thereof when it shall be necessary to stop, and shall assist horses, and carriages drawn by horses, passing the same.

The law(s) also imposed speed limits (all the way down to 4 MPH in the country and 2 MPH in the city).

It was all quite silly.

Fast forward to 2015, and something funny is happening. First of all, Google and many others are developing self-driving cars, which are already at the point where they are safer than human-driven vehicles.

The first thing about this is that we have come full circle. In the 1800s, we had 'cars' with two levels of safety. One was the human, the other was the autonomous horse. In 2015, we have Google cars that also come with two levels of safety (for the moment). One is the human, and the other is the software and sensors (which can do a much better job than any horse could).

So for all intents and purposes, self-driving cars are already much safer today than anything we have ever had. But this hasn't stopped governments from freaking out about it.

A new California draft law is set to define the requirements for driving a self-driven car:

"It is required that the cars have a steering wheel, and a licensed driver must be ready to take over if the machine fails. [...] Both the manufacturer and an independent certifier would need to sign off that the car has passed safety testing. Any person who wants to lease or use one of the cars would need special training provided by the manufacturer and then receive a special certification on their driver's license." (http://goo.gl/nGsleP)

In other words: if I wanted to borrow a self-driven car from Google to do some shopping, it would not be enough that I have a driver's license. I would actually need to get special training and certification ... for driving a car that is already safer than any other car on the road.

This is just like the old red flag laws.

And it's not just Google. Look at Tesla. Today, with Tesla's assisted-driving technology, you don't need a special driver's license to drive one. But if the car were to drive itself, you, the driver, would need more training and a special license to operate one.

Wait, what?

Here is how I think self-driven cars should be evaluated. Ask them to take a driver's test.

We have the theoretical test (the one we humans complete in classrooms), which could just as easily be presented to the car's software. Put the cars in a number of pre-defined scenarios, and see if the software recognizes each situation and acts accordingly. This is what that test is supposed to check with us humans, so why not do the same with cars?
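To make the idea concrete, here is a minimal sketch (in Python) of what such a scenario-based 'theory test' harness for a car's software might look like. Everything in it is a hypothetical illustration: the scenarios, the expected actions, and the car.decide() interface are assumptions made up for this example, not any real API or certification standard.

```python
# Hypothetical sketch of a scenario-based "theory test" for driving software.
# The scenarios, expected actions, and the car.decide() interface are all
# illustrative assumptions, not a real API or a real certification procedure.

from dataclasses import dataclass

@dataclass
class Scenario:
    description: str       # the situation presented to the software
    expected_action: str   # what a passing answer looks like

# A handful of the pre-defined situations a human would face on a theory test.
SCENARIOS = [
    Scenario("Pedestrian steps into the road ahead", "brake"),
    Scenario("Traffic light turns amber at safe stopping distance", "stop"),
    Scenario("Emergency vehicle approaching from behind", "pull_over"),
]

def theory_test(car, scenarios=SCENARIOS, pass_mark=1.0):
    """Present each scenario to the car's software and score its answers."""
    correct = sum(
        1 for s in scenarios
        if car.decide(s.description) == s.expected_action  # hypothetical method
    )
    score = correct / len(scenarios)
    return score >= pass_mark, score
```

A real test would of course feed the software simulated sensor data rather than text descriptions, but the point stands: the same check we apply to humans can be applied mechanically to the car.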

Secondly, we have the practical test, which would be very simple to perform. Put the examiner in the car and see if the car can follow the directions. If the car takes the examiner to where he wants to go and it follows the rules of the road, it has passed.

But wait a minute, you say. This might not be enough.

True, there may be several cases where things could still go wrong, but that is no different from how we humans drive. More than 2 million people were injured in car accidents last year in the US alone.

2 million!

We don't need self-driving cars to be 100% safe. We just need them to cause fewer than 2 million personal injuries per year, and then gradually improve on that number all the way down to zero.

From all the testing that Google has already done, it's pretty clear that we are already well below this threshold.

We shouldn't have laws requiring "the cars have a steering wheel, and a licensed driver must be ready to take over if the machine fails." We should have laws saying: "the car must be ready to take over if the human fails."

We are the ones who are failing. Not just at driving our cars, but also in legislating whether something is safe or not.

Wednesday, December 9, 2015

This could be exciting!

Originally shared by Daniel Suarez

Stellarator fusion reactor ready for tests.

In the next day or so, German researchers will for the first time switch on the Wendelstein 7-X stellarator fusion reactor -- the largest ever built. Housed in the Max Planck Institute of Plasma Physics in Greifswald, the €1 billion machine looks like a prop escaped from the film 'Event Horizon.' However, its unusual shape serves a specific purpose, as seen in the video below.

More info available here: http://www.sciencemag.org/content/350/6259/369.full
https://www.youtube.com/watch?time_continue=72&v=u-fbBRAxJNk

In 1976 (yes, 1976), I heard my professor, one Don Norman, say pretty much the same thing. https://www.fastcompany.com/90202172/why-bad-tech...