Penguin Group, New York, 2014. Hardcover, 288 pages, $20.65; Kindle, $15.99 from Amazon.ca.
Tucker is deputy editor of The Futurist magazine, so he has both an interest in technological developments and a concern with their social impacts. From his tone and his examples, he regards himself as a “realist” rather than an “idealist,” meaning he believes we can direct technological change but we cannot stop it.
Big Data is a case in point. Every click we make online adds to the vast store of data that governments and corporations collect on all of us. Tucker argues that the development of Big Data is inevitable because it has already provided major security assistance in the fight against crime and terrorism, and will be even more useful in these areas as managing vast databases becomes more cost-effective. In other words, the security benefits of the applications of Big Data are far too great to be forgone.
The dilemma is that over-enthusiastic applications of Big Data for security often ignore or invade personal privacy, even when this is not necessary and could be avoided. Part of the problem is that security practitioners often do not want to be bothered with the due diligence that would be needed to gather and use social data while still respecting and protecting personal privacy.
Quite simply, they have a cavalier attitude that everyone else (not themselves, of course) should be willing to sacrifice whatever it takes in the pursuit of security. The irony is that their pursuit of security also entails considerable secrecy, yet they see no contradiction between insisting on their own privacy (secrecy) and being willing to invade the privacy of others.
Seen in these terms, the conflict of interests seems stark. In many cases, however, the differences are not so clear-cut. Many of the benefits of Big Data can only be realized if personal data is gathered and analyzed so that the patterns of a person’s behavior can form the basis for therapeutic assistance. If a person wears miniaturized monitoring equipment under their clothing, for example, a health alert system can diagnose and prescribe treatment for ailments as they arise.
Similarly, a GPS system in a person’s vehicle or cell phone can warn them when they are entering a dangerous area. Many applications like these only work if a person participates in data-gathering that is very private indeed.
So the question Tucker asks is: is more security worth less privacy? The answer, to which he often alludes but rarely states outright, is that better design could ensure a maximum of security with a minimum of privacy invasion.
How? First, personal permission should always be sought before private data is gathered or revealed, unless crime or terrorism is unequivocally involved.
Second, when data is being aggregated to reveal social patterns, the personal details of all of the data contributors should be thoroughly scrubbed out so that forensic analysis cannot be used to identify particular persons. Both of these procedures are perfectly feasible, but they would not allow the carte blanche approach that many security (and commercial) organizations seem to prefer.
Security and business organizations want society to prioritize their mandates above those of other individuals and groups. Public opinion could force a curtailment of these methods, and with the growing application of Big Data that is very likely to happen.
William Sheridan is a Knowledge Engineer who lives and works in Ottawa, Canada.