
Too much information: devil in the details


      By Don Butler, Canwest News Service
      January 31, 2009 9:09 PM
      http://www.canada.com/news/national/much+information+devil+details/1240498/story.html

      That oft-used colloquial expression "too much information" certainly can be applied to the way some businesses are boning up on your background.


      Last summer, the Laurentian Bank rejected a loan application for an all-terrain vehicle from a resident of the Kitigan Zibi First Nation, an Algonquin community near Maniwaki, Que., about 175 kilometres north of Ottawa. The man had an impeccable credit history. The problem was where he lived.


      The bank's policy is to deny all-terrain vehicle loans to people with certain postal codes, most of which cover aboriginal reserves. For anyone living in one of those postal codes, it doesn't matter how good their personal credit rating is; the bank has categorized them as unsuitable.
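
      In code, a rule like this amounts to a postal-code lookup that runs before any credit check. The following Python sketch is purely illustrative: the prefixes, score threshold and function name are invented for the example, not the bank's actual criteria.

          # Hypothetical sketch of postal-code screening; the prefixes
          # and score threshold are invented for illustration.
          DENIED_PREFIXES = {"J9E", "J0W"}

          def atv_loan_decision(postal_code: str, credit_score: int) -> str:
              # Geography is checked first and overrides credit history.
              if postal_code[:3].upper() in DENIED_PREFIXES:
                  return "denied"
              return "approved" if credit_score >= 650 else "denied"

          # An applicant with excellent credit is still refused:
          print(atv_loan_decision("J9E 1A1", 800))  # -> denied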


      The policy is a classic example of social sorting, driven by surveillance processes that legally vacuum up information about us - including where we live - and use it to slot us into categories of risk or desirability that affect our life chances, for good or for ill.


      Increasingly, social sorting defines the surveillance society. Governments and corporations draw on large databases of personal information compiled by commercial data brokers, credit reporting companies and others to define target markets and risky populations. "We're being classified and rated, no question about it," says Jeff Chester, of the Center for Digital Democracy in Washington.


      Based on this information - sometimes referred to as "dataveillance" - we're being slotted into categories such as Golden Empty Nesters, Burdened Optimists, White Van Culture and Highrise Hardship, or being assigned "trust scores" that can determine how we're treated.
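
      A "trust score" of this kind is often little more than a weighted sum over whatever attributes a broker's database holds. The Python sketch below is a made-up illustration; the attributes, weights and segment labels are invented, and real scoring models are proprietary and far more elaborate.

          # Hypothetical trust-scoring sketch; attributes, weights and
          # segment labels are invented for illustration.
          WEIGHTS = {"years_at_address": 2.0,
                     "late_payments": -15.0,
                     "income_band": 5.0}

          def trust_score(profile: dict) -> float:
              # Missing attributes silently score zero -- one way an
              # inaccurate profile can skew the result.
              return sum(WEIGHTS[k] * profile.get(k, 0) for k in WEIGHTS)

          def segment(score: float) -> str:
              if score >= 50:
                  return "preferred"
              return "standard" if score >= 0 else "high risk"

          print(segment(trust_score({"years_at_address": 10,
                                     "income_band": 8})))  # -> preferred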


      Our classifiers examine our online and in-store purchasing behaviour, the websites we visit, geographic data, information we post on social networking sites, credit reports they gather from companies such as Equifax and TransUnion, and anything else they can get their hands on.


      "Every minute little detail of information gets caught in an algorithmic way to paint a picture about you," explains Valerie Steeves, a criminology professor at the University of Ottawa. "Judgments are made about you on the basis of that shadow, even though the shadow may or may not be accurate."


      All this takes place almost entirely out of sight. Few of us even realize it's happening. Yet these hidden processes are having real effects on our lives.


      Because of social sorting, people are being denied jobs and insurance, says Philippa Lawson, former director of the University of Ottawa's Canadian Internet Policy and Public Interest Clinic. "Their lives are being seriously affected by decisions based on this information, which didn't use to be collected simply because we didn't have the technology to do it."


      Few of us will ever realize we didn't get an interview with a potential employer because they pulled up something unflattering - and quite possibly inaccurate - about us from a commercial database, accessible to anyone for a fee.


      And, says David Lyon, a Queen's University sociologist and leading figure in the field of surveillance studies, this process of classification is increasingly automated, with a correspondingly smaller role for human judgment.


      To some degree, bureaucracies have always assigned us to categories, relying on stereotypes and the discretion of clerks for their classification. But computerized systems have altered both, Lyon says.


      They have vastly reduced discretion, because they impose more rules and regulations on the process. And they have increased stereotyping, because the categories they sort us into are narrower and subtler, producing more sharply defined stereotypes.


      Most often, "we don't know the criteria on which we're being assessed for our trustworthiness or our risk-proneness," Lyon says. "I don't believe we have recognized sufficiently just how far our own choices, life chances and opportunities to be involved in society in a regular way are being influenced by sets of surveillance processes that occur below the surface."


      Social sorting often works to the benefit of the affluent and educated. Many companies now rank customers by how much they spend on their products, for example, valuing most those who spend more. When preferred customers call for service, they are routed into shorter queues staffed by more skilled employees while the rest of us - the lesser valued - fume on hold. Some offer products to different customers at different prices, depending on how they have been categorized.
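
      The ranking itself can be as crude as a spend threshold applied before a call is queued. Here is a minimal Python sketch, with the thresholds and queue names invented for illustration:

          # Hypothetical spend-based call routing; tiers and thresholds
          # are invented for illustration.
          def route_caller(annual_spend: float) -> str:
              if annual_spend >= 5000:
                  return "priority queue (senior agents, short wait)"
              if annual_spend >= 1000:
                  return "standard queue"
              return "overflow queue (longest wait)"

          print(route_caller(6200.0))  # high spender -> priority queue
          print(route_caller(300.0))   # low spender  -> overflow queue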


      Sometimes, social sorting can have unexpected - even perverse - results. You might expect people coming out of bankruptcy to have trouble getting a credit card, Steeves says. "In fact, these people are being targeted precisely because they're bad credit risks and have irresponsible credit histories. So you'll make a lot more money in interest from those people."


      Ottawa Citizen

