Secrets from smart devices find path to US legal system

From phys.org: Secrets from smart devices find path to US legal system.

An Ohio man claimed he was forced into a hasty window escape when his house caught fire last year. His pacemaker data obtained by police showed otherwise, and he was charged with arson and insurance fraud.

In Pennsylvania, authorities dismissed rape charges after data from a woman’s Fitbit contradicted her version of her whereabouts during the 2015 alleged assault.

Vast amounts of data collected from our connected devices—fitness bands, smart refrigerators, thermostats and automobiles, among others—are increasingly being used in US legal proceedings to prove or disprove claims by people involved.

In a recent case that made headlines, authorities in Arkansas sought, and eventually obtained, data from a murder suspect’s Amazon Echo speaker as evidence.

The US Federal Trade Commission in February fined television maker Vizio for secretly gathering data on viewers collected from its smart TVs and selling the information to marketers.

The maker of the smartphone-connected sex toy We-Vibe meanwhile agreed in March to a court settlement of a class-action suit from buyers who claimed “highly intimate and sensitive data” was uploaded to the cloud without permission—and shown last year to be vulnerable to hackers. [continue]

How does this make you feel about the electronic devices in your life?

The surprising things algorithms can glean about you from photos

This is an article I’ll be sharing with all my friends, because it’s important for us to understand the consequences one single photo can have.

Even if you do not tag the people in an image, photo recognition systems can do so. Facebook’s DeepFace algorithm can match a face to one that has appeared in previously uploaded images, including photos taken in dramatically different lighting and from dramatically different points of view. Using identified profile photos and tagged photos and social-graph relationships, a very probable name can be attached to the face. (…)

A person pounding the pavement of a city street can be identified and tracked block-to-block by the unique characteristics of her gait. (…)

Taking a photo or video in public isn’t illegal, nor is taking one without a person’s permission. It’s also not illegal to upload the file or store it in the cloud. Applying optical character recognition, facial recognition, or a super-resolution algorithm isn’t illegal, either. There’s simply no place for us to hide anymore. [continue]
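The face-matching step Weigend describes boils down to comparing a face from a new photo against faces from previously tagged ones. Purely as an illustration (this is not Facebook's actual DeepFace code, and the tiny vectors and names here are made up), the idea can be sketched as nearest-neighbor search over face "embeddings" produced by a recognition model:

```python
import numpy as np

def identify_face(query_embedding, gallery):
    """Return the gallery name whose stored face embedding is most
    similar to the query embedding, plus the cosine-similarity score.

    gallery: dict mapping a person's name -> embedding vector built
    from their previously tagged photos.
    """
    def cosine(a, b):
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

    best_name, best_score = None, -1.0
    for name, emb in gallery.items():
        score = cosine(query_embedding, emb)
        if score > best_score:
            best_name, best_score = name, score
    return best_name, best_score

# Toy 4-dimensional vectors standing in for the hundreds of dimensions
# a real face-recognition network would output.
gallery = {
    "alice": np.array([0.9, 0.1, 0.0, 0.1]),
    "bob":   np.array([0.1, 0.8, 0.2, 0.0]),
}
query = np.array([0.85, 0.15, 0.05, 0.1])  # face from a new, untagged photo
name, score = identify_face(query, gallery)
print(name)  # the closest match in the gallery
```

The unsettling part isn't this lookup, which is trivial; it's that the gallery is assembled automatically from the billions of tagged photos people upload every day.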

That’s from Andreas Weigend’s article, The Surprising Things Algorithms Can Glean About You From Photos, published on Slate. I think you’ll want to read the whole thing.

A note at the bottom of the Slate article says, in part, “Andreas Weigend is the author of Data for the People: How to Make Our Post-Privacy Economy Work for You.” I’m grateful for this Slate article – it’s full of useful information and will be a handy thing to send to friends and post on a certain bulletin board. So I’ve just bought Andreas’ book, as a way to thank him.

Oh, and about laws regarding the taking of photos: we had a house guest from the Netherlands a while ago. He said it’s illegal in the Netherlands to take photos of people without their permission. Really? That’s a great idea. I wish we had a similar law here.

Are any of you saying no when others want to photograph you?