The robots are coming … for whom the bell tolls

Reshared post from +Simon Wardley

The robots are coming … for whom the bell tolls.
I often read articles about how machine intelligence and robotics will replace basic roles in society: they will become the new drivers, the new waiters, the new hospital porters and nurses of this world. But is this likely? And if they are going to replace…


I come across so many people on the internet giving advice on how to meet some fitness…

I come across so many people on the internet giving advice on how to meet some fitness goal. You know, the ones about how to get rid of tummy fat, get toned arms, etc.

Much of the advice I read sounds plausible.

Pretty much none of it ever includes any scientific evidence, and it's infuriating.

Andrew Gelman thinks we should just drastically discount our confidence in the results…

Andrew Gelman thinks we should just drastically discount our confidence in the results of any paper published before 2016 because journals are terrible and they will never, ever fix their problems.

Let’s just put a bright line down right now. 2016 is year 1. Everything published before 2016 is provisional. Don’t take publication as meaning much of anything, and just cos a paper’s been cited approvingly, that’s not enough either. You have to read each paper on its own. Anything published in 2015 or earlier is part of the “too big to fail” era, it’s potentially a junk bond supported by toxic loans and you shouldn’t rely on it.

This is based on an article (http://goo.gl/sYHBKo) that shows just how bad journals are at acknowledging and fixing the problems with published papers.




Too big to fail: Why it’s unrealistic to expect scientific journals to retract their huge backlog of erroneous papers – Statistical Modeling, Causal Inference, and Social Science
A couple years ago I wrote an article, “It’s too hard to publish criticisms and obtain data for replication.” I gave two examples demonstrating the struggles of myself and others to get journals to admit errors. The problem is that the standards for post-publication review are higher than for pre-publication review. You can find an …

I mostly avoid discussions about privilege and social justice issues because I find…

I mostly avoid discussions about privilege and social justice issues because I find both the concepts and the sureness with which people talk about them confusing.

However, I stumbled into this nice essay on the subject by the best-selling author Marti Leimbach, which I found well-written (unsurprisingly), moving, and thought-provoking.

The part that first grabbed me given my above-mentioned confusion on the subject:

Nonetheless, this whole notion of “privilege” vexes me. We talk about it as though we can all recognise what it is. I am not always so sure. I can tell one narrative of my life and it seems to describe someone who grew up without privilege, and I can tell another narrative and it seems almost as though my life was one of ease and privilege from the time I was born.

I help my aging parents manage their rental property

Lately, I've been keeping statistics on the behavior of people who have appointments to view an apartment or house for rent.

Out of 50 appointments, 32% were no-shows, and 29% were 10+ minutes late.
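The bookkeeping behind those percentages can be sketched like this (the raw counts are back-calculated from the stated figures and are assumptions; 29% of 50 isn't a whole number, so the original was presumably rounded):

```python
# Tally sheet for the 50 viewing appointments described above.
# Counts are reconstructed from the stated percentages, so the
# late count (~14-15) is an assumption.
total = 50
no_shows = 16        # 16 / 50 = 32%
late_10_plus = 14    # 14 / 50 = 28%, rounding near the stated 29%

no_show_pct = 100 * no_shows / total
late_pct = 100 * late_10_plus / total
on_time = total - no_shows - late_10_plus

print(f"No-shows: {no_show_pct:.0f}%, 10+ min late: {late_pct:.0f}%, "
      f"on time: {on_time} of {total}")
```

On these assumed counts, only 20 of 50 parties actually showed up on time.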

Thanks for wasting my time, jerks.

The 50 states are a bad way to organize the USA

America is increasingly divided not between red states and blue states, but between connected hubs and disconnected backwaters. Bruce Katz of the Brookings Institution has pointed out that of America’s 350 major metro areas, the cities with more than three million people have rebounded far better from the financial crisis. Meanwhile, smaller cities like Dayton, Ohio, already floundering, have been falling further behind, as have countless disconnected small towns across the country.

A New Map for America – The New York Times
The 50-state model is holding the country back. It needs a new system, built around urban corridors.

GiveDirectly is doing an experiment with basic income (see here for more explanation…

GiveDirectly is doing an experiment with basic income (see here for more explanation about what basic income is: https://goo.gl/LvOrop) wherein they're giving 6,000 Kenyans a guaranteed cash flow for 10 years. The amount they're giving seems to basically match the average income these people are already making: $250-$400 per year.

A charity’s radical experiment: giving 6,000 Kenyans enough money to escape poverty for a decade
This is a big deal.

Nice long-form article exploring Laron syndrome, which protects the few hundred people…

Nice long-form article exploring Laron syndrome, which protects the few hundred people who have it from cancer and diabetes, and how researchers are mimicking its disease-protective effects via caloric-restriction diets.

The experimental diet that mimics a rare genetic mutation
Peter Bowes has been on a new diet that claims to guard against disease and slow ageing. Then he met a group with a mutation that lets them eat what they want while enjoying the same protection.


Man, how does that work…?

Reshared post from +gwern branwen

Man, how does that work…?

"In this paper we propose stochastic depth, a training procedure that enables the seemingly contradictory setup to train short networks and obtain deep networks. We start with very deep networks but during training, for each mini-batch, randomly drop a subset of layers and bypass them with the identity function. The resulting networks are short (in expectation) during training and deep during testing. Training Residual Networks with stochastic depth is compellingly simple to implement, yet effective. We show that this approach successfully addresses the training difficulties of deep networks and complements the recent success of Residual and Highway Networks. It reduces training time substantially and improves the test errors on almost all data sets significantly (CIFAR-10, CIFAR-100, SVHN). Intriguingly, with stochastic depth we can increase the depth of residual networks even beyond 1200 layers and still yield meaningful improvements in test error (4.91%) on CIFAR-10."
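The mechanism in the abstract can be sketched in a few lines. This is a toy NumPy illustration, not the authors' implementation: the paper uses per-block survival probabilities that decay linearly with depth, whereas this sketch uses a single constant `survival_prob`, and the residual function here is just a ReLU over a linear map.

```python
import numpy as np

rng = np.random.default_rng(0)

def residual_branch(x, w):
    # Toy residual function f(x): a linear map followed by ReLU.
    return np.maximum(w @ x, 0.0)

def forward(x, weights, survival_prob=0.8, training=True):
    # Stochastic depth: during training, each residual block is kept
    # with probability `survival_prob`; a dropped block is bypassed by
    # the identity, so the network is shorter in expectation.
    # At test time every block runs, with its output scaled by
    # survival_prob so activations match their training expectation.
    for w in weights:
        if training:
            if rng.random() < survival_prob:
                x = x + residual_branch(x, w)
            # else: identity bypass; this block is skipped for the batch
        else:
            x = x + survival_prob * residual_branch(x, w)
    return x

x = np.ones(4)
weights = [rng.standard_normal((4, 4)) * 0.1 for _ in range(10)]
out = forward(x, weights, training=False)
```

With L blocks, the expected depth per mini-batch is L times the survival probability, which is where the training-time savings come from; at test time the full-depth network is used.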


[1603.09382] Deep Networks with Stochastic Depth