Hi Daniel,

A while ago I wondered how viable the idea of regulating the ownership of our future is: if I offer my past, I should be the first owner of the predictions these systems make from it.

Should we have mandatory access to any individual predictions that companies make about us when they use our past data for ML?

With the advent of neural networks as a service, anyone can make predictions based on behaviour and aggregated past data. As a human being, if the past belongs to me, then by all means the future does too. If my data is fed into a predictive neural network, I should expressly give my consent and, at the same time, be granted access to the predictions that come out of the network, not by special request but by proactive, informative action on the company's part. My future is as personal as my past.

For example, Walmart might collect data about my food purchases, combine it, and infer that I may have diabetes; or Target might figure out that my underage daughter is pregnant before I do.

Target correctly inferred based on purchasing data that one of its customers — sorry, guests — a teenage girl in Minnesota, was pregnant, based on an arcane formula involving elevated rates of buying unscented lotion, mineral supplements, and cotton balls. Target started sending her coupons for baby gear, much to the consternation of her father, who, with his puny human inferential power, was still in the dark. (Full article: Slate)
