September 16, 2024 at 12:00 am
Comments posted to this topic are about the item Technology and Privacy
September 16, 2024 at 8:39 am
I used to work for an advertising agency. Various companies offered up their customer lists to non-competitors and allowed various filters to be applied to the selection: recency, purchase category, and so on. The data received would be basic name and address information.
The prices charged varied depending on what those lists of names and addresses represented and the risk to the seller. The lowest-value list was the electoral roll. One of my duties was to dedupe the lists, primarily so the customer did not get charged for the same name coming from two or more separate sources. We stumbled on the idea that knowing a customer was on a number of source files was itself both interesting and useful information. Geodemographic datasets are still on the market. They don't target you specifically; they categorise you based on where you live.
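The dedupe-and-count idea described above can be sketched roughly as follows. This is a hypothetical illustration, not how any agency's actual system worked: the field names, the sources, and the naive normalised name-plus-address matching key are all assumptions (real list brokers use much fuzzier matching).

```python
# Hypothetical sketch of cross-list deduplication that also counts
# how many source lists each person appears on.
from collections import defaultdict

def normalise(name, address):
    """Naive matching key: trimmed, lower-cased name and address."""
    return (name.strip().lower(), address.strip().lower())

def dedupe_and_count(sources):
    """Merge several customer lists into one deduped set, recording
    the number of distinct source lists each person appeared on."""
    seen_on = defaultdict(set)
    for source_name, records in sources.items():
        for name, address in records:
            seen_on[normalise(name, address)].add(source_name)
    # One entry per deduped person, plus the 'how many lists' signal.
    return {key: len(srcs) for key, srcs in seen_on.items()}

# Invented example data: two overlapping lists.
sources = {
    "catalogue_buyers": [("Ann Smith", "1 High St"), ("Bob Jones", "2 Low Rd")],
    "electoral_roll":   [("ann smith", "1 High St "), ("Cy Dee", "3 Mid Ln")],
}
print(dedupe_and_count(sources))
```

The count per person is exactly the "bracketed by searchlights" signal: a customer on many lists is more profiled, and more valuable, than one on a single list.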
And that is why I am nervous about privacy. Your name on one list is neither here nor there. The more lists it appears on, the more accurate a picture someone (probably an ML model) can paint of you. It's like being bracketed by searchlights. If this is used wisely and ethically, then the worst it can do is annoy the recipient. The majority of companies do behave ethically and with an aspiration to wisdom. But it doesn't take many rotten apples to spoil the barrel.
With anything that tracks your location, the concern is that your location habits can be used to determine when you are NOT at home. The data from the Strava fitness app could reveal whether you were likely to own a theft-worthy bicycle. Strava offer a number of privacy settings to mask your home location.
In general, I think it wise to consider what someone with dubious motives could do with the data you emit. It's then up to you to judge how comfortable you are with your position on the possibility-to-probability scale.
September 16, 2024 at 4:03 pm
I am certainly concerned about data, though in some sense the car has left the proverbial bag. Far too much data is emitted, and too often we don't know the scope, scale and risk. Most humans are poor at assessing risk, but maybe more dissemination of how much data is being captured would help us start to decide how to manage risk.
Of course, I also think strong per-person penalties for losing data, not necessarily huge but stiff (say, even $1k per person affected), would get companies to rethink risk.
I might also add that a day of community service picking up litter, on camera, for each security incident would help if applied to CEOs/CFOs/CTOs/CSOs and directors of software development.