
Dubai Police To Use Google Glass For Facial Recognition

cold fjord sends word about what the Dubai police plan on doing with their Google Glass: "Police officers in Dubai will soon be able to identify suspects wanted for crimes just by looking at them. Using Google Glass and custom-developed facial recognition software, Dubai police will be able to capture photos of people around them and search their faces in a database of people wanted for crimes ... When a match is made in the database, the Glass device will receive a notification. ... What's particularly interesting about the project is that facial recognition technology is banned by the Google Glass developer policy. ... The section of the policy that addresses such technology seems to disqualify the Dubai police force's plan for Glass."
This discussion has been archived. No new comments can be posted.

  • Enforce (Score:5, Interesting)

    by mwvdlee ( 775178 ) on Monday October 06, 2014 @08:54AM (#48072401) Homepage

    I've always wondered if and how Google would enforce that rule.
    Now we'll find out.
    My money is on "Pay lip service to privacy in the media, keep supplying the Dubai police anyway".

    • by Anonymous Coward

      Of course they'll keep supplying them, no one else uses it.

    • Re:Enforce (Score:5, Funny)

      by weilawei ( 897823 ) on Monday October 06, 2014 @09:04AM (#48072485)

      There's absolutely no potential to abuse this. Everyone knows that only rich people live in Dubai and rich people can't be criminals. Just look at the arrest rates.

      • I know you're not being serious, but it's rich people and almost unpaid slaves.

        • by Anonymous Coward

          Slaves are not people.

          • Re: (Score:2, Informative)

            by cellocgw ( 617879 )

            Slaves are not people.

            Bah. Here in the US of frakking A, slaves are (well, were) 60% of a person. Take that, you backwater countries like Dubai!

            • The Constitution actually gave the slave states Congressional representation counting 3/5 of a person for each slave, even though the slaves didn't get the right to vote. It was a compromise between the Southerners, who wanted them counted at 100%, and the Northerners, who mostly didn't want them counted at all. Effectively, it meant that a Southern white man's vote counted more than a free-state man's vote, because it took fewer Southern whites to get a Congressman.

      • by r1348 ( 2567295 )

        Rich people, and a mass of immigrant semi-slave workers that built all of those slightly overcompensating skyscrapers...

    • Re:Enforce (Score:5, Interesting)

      by davecb ( 6526 ) <davecb@spamcop.net> on Monday October 06, 2014 @09:04AM (#48072491) Homepage Journal
      The German Federal security service tried this years ago in airports, and got a combinatorial explosion in false positives (AKA the "birthday paradox") that drowned out the real positives. Google knows the math, and is trying to save the innumerate from an expensive failure (;-))
      • Re:Enforce (Score:5, Insightful)

        by swb ( 14022 ) on Monday October 06, 2014 @09:09AM (#48072525)

        But with facial recognition, the entire technology stack keeps getting better: higher-resolution cameras, faster processing that will enable improved and more sophisticated algorithms. It seems naive to say it doesn't work and can't work.

        • Re:Enforce (Score:5, Interesting)

          by davecb ( 6526 ) <davecb@spamcop.net> on Monday October 06, 2014 @10:01AM (#48072905) Homepage Journal

          better technology doesn't help enough!

          To oversimplify, if you have 1 error in a thousand, and you have 10,000 people (crooks + innocent people), you do (10,000 * 9,999) comparisons and get 99,990,000 / 1,000 = 99,990 errors. In statistical terms, it's comparing every pair of persons out of 10,000.

          It's really something like (select one of 100 crooks from 10,000 innocents), but it's still an insanely huge number of comparisons. However good your technology, adding more people will give you (N * N-1) more chances of getting an error.

          Facial recognition vendors are very careful to NOT report their error rates in ways that expose this problem: it's the "elephant in the room" for that industry. And that includes Siemens, my former employer.
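
          To make that arithmetic concrete, here is a minimal back-of-envelope sketch in Python. The error rate, watchlist size, and crowd size are illustrative assumptions, not figures from any real deployment:

```python
# Rough sketch of the false-positive explosion described above.
# All numbers are assumptions for illustration only.

pairwise_error_rate = 1e-3   # 1 bad call per 1,000 comparisons (assumed)
watchlist = 100              # wanted faces in the database (assumed)
crowd = 10_000               # innocent passers-by scanned (assumed)

comparisons = watchlist * crowd                      # every face vs. every watchlist entry
expected_false_alarms = comparisons * pairwise_error_rate

print(f"{comparisons:,} comparisons -> ~{expected_false_alarms:,.0f} false alarms")
# 1,000,000 comparisons -> ~1,000 false alarms, and every one of them
# is an innocent person if nobody in the crowd is actually wanted.
```

          Scanning a bigger crowd or a bigger watchlist multiplies the comparisons, and the false alarms multiply with them, which is exactly the scaling problem the comment describes.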

          • Re:Enforce (Score:5, Interesting)

            by postbigbang ( 761081 ) on Monday October 06, 2014 @10:12AM (#48072999)

            You forgot to mention what's necessary for the simple act of walking around: liberty. Even if you're a "positive", what of due process? Will you land in a jail, await a long process? Who guarantees, and how, that you'll then be cleared if you're falsely positive? It's a slippery slope. Google has opened a Pandora's box of paranoia.

            Will people stop traveling in fear of false positives? How far are governments permitted to gnaw on their citizenry, privacy death by a thousand cuts?

            • by Calydor ( 739835 )

              The first thing to spring to mind about false positives is they'll (in most cases, one would hope) be pulled aside and questioned, while the Glass runs a more in-depth analysis of their face rather than just the quick scan necessary to look out over a crowd. Various science-fiction movies and shows already give ideas in this regard, I believe. It's not like computerized facial recognition when a police officer looks at you is a new idea.

              • Depends on the jurisdiction and the procedures used THERE. How many stories about people languishing in jails do you need before you're revolted by the concept?

              • Re:Enforce (Score:4, Interesting)

                by davecb ( 6526 ) <davecb@spamcop.net> on Monday October 06, 2014 @11:40AM (#48073695) Homepage Journal

                It's TERRIBLE public policy for people to be pulled aside for mere physical resemblance to a third person: a person the cop has never seen and only has a photo of, but a computer has told them this is the person in the photograph.

                And computers are never wrong

                • by Calydor ( 739835 )

                  "Meh, I'm bored. Nothing ever happens on my WAIT THAT GUY LOOKS LIKE THAT PHOTO I SAW LAST WEEK!" *bullrush*

                  See, this is the kind of thing we have RIGHT NOW.

                • Sounds like a recipe for memory tampering.

                • by flux ( 5274 )

                  Once the police learn that there are false positives, I'm sure they will learn to give computer recognition the proper weight.

                  • by davecb ( 6526 )

                    They're not supposed to learn things like that; it would affect their case-closed rates.

                    --dave
                    My local Chief of Police has fought for years to get his people to "keep the peace" instead of "show high case-closed numbers". He's started to succeed, and the crime rates are going down, but he's been rewarded with budget cuts and is being phased out for being too expensive... Bummer!

            • Even if you're a "positive", what of due process? Will you land in a jail, await a long process?

              The brain-dead obvious solution is that if the software identifies you as a "positive", then a human would look at the photo of the perp and the photo of the suspect, and verify that they match. I doubt that anyone is going to be arrested just because the "match" LED blinks.
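
              For a rough sense of how often that human check would fire on an innocent person, here is a hedged Bayes-style sketch; the prevalence, hit rate, and false-alarm rate are all assumed for illustration:

```python
# Base-rate sketch: what fraction of "match" alerts point at a wanted person?
# All rates below are assumptions, not vendor figures.

prevalence  = 100 / 10_000   # fraction of scanned people actually wanted (assumed)
hit_rate    = 0.99           # P(alert | wanted)   (assumed)
false_alarm = 0.001          # P(alert | innocent) (assumed)

p_alert = hit_rate * prevalence + false_alarm * (1 - prevalence)
p_wanted_given_alert = hit_rate * prevalence / p_alert

print(f"P(actually wanted | alert) = {p_wanted_given_alert:.1%}")
# About 91% with these generous assumptions; with a prevalence of 1 in
# 10,000 the same numbers drop to roughly 9%, so most blinking LEDs
# would point at innocent people and the human check does real work.
```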

            • Turn this on its head. Facial scans for ID cards, credit/debit cards, passports, driving licences etc. become the norm: a scanner fails to identify you at an airport, so you are a suspicious individual - guilty until proven innocent.
            • You forgot to mention what's necessary for the simple act of walking around: liberty. Even if you're a "positive", what of due process? Will you land in a jail, await a long process? Who guarantees, and how, that you'll then be cleared if you're falsely positive? It's a slippery slope. Google has opened a Pandora's box of paranoia.

              Will people stop traveling in fear of false positives? How far are governments permitted to gnaw on their citizenry, privacy death by a thousand cuts?

              It's Dubai. They don't have those worries.

              • You see, it might be Dubai, but the software will be perfected there, and it will migrate elsewhere. Slowly, it becomes acceptable in a conventional sense. Then it becomes "the norm".

                A thousand cuts..... then a million.

                • You see, it might be Dubai, but the software will be perfected there, and it will migrate elsewhere. Slowly, it becomes acceptable in a conventional sense. Then it becomes "the norm".

                  A thousand cuts..... then a million.

                  I agree; my point was that Dubai doesn't necessarily have the same due-process concerns as the OP.

          • by swb ( 14022 )

            I get that, but it seems like something that would improve over time based on all manner of improvements.

            The inputs might get better: comparison pictures in the database, which I would imagine are for the most part DMV or passport photos, may end up being very high-resolution images or include 3D scan data. The on-site imagery will almost certainly end up at 4K resolution if not some kind of real-time 3D scanning. And comparison and analytics will get better as the processing involved gets be

            • by davecb ( 6526 )

              You need as many 9's after the decimal point as you have digits in (N * N-1). As N is unbounded and accuracy is bounded, you get screwed. It's fine for a 10-person company (90 comparisons, negligible false positives). It's out of the question for airports (10,000 * 9,999 comparisons).

              As the ARPAnauts would say, "it doesn't scale".
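
              A tiny sketch of that scaling argument, assuming you want the expected number of false positives across all N*(N-1) pairwise comparisons to stay below one (the population sizes are arbitrary examples):

```python
# How accurate must each comparison be to expect fewer than one false
# positive across N*(N-1) pairwise comparisons? Sizes are assumptions.

for n in (10, 1_000, 10_000, 1_000_000):
    comparisons = n * (n - 1)
    max_error_rate = 1 / comparisons   # per-comparison error rate needed
    print(f"N={n:>9,}: {comparisons:>17,} comparisons, "
          f"error rate must be below {max_error_rate:.1e}")
# The required accuracy tightens roughly with N squared while N itself
# only grows linearly, which is the sense in which "it doesn't scale".
```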

          • I think that you have your math wrong. Number one, you don't know whose face you are comparing, so according to your schema you would need to do 10,000 * 10,000 comparisons, since you don't have a way to say "Oh! This is John Derp's face, so let's only compare it to the other 9,999 faces!". Also, you don't compare each face with everyone else's; you only compare with a list of known crooks (hopefully not EVERYONE'S face is in the system, although I am not sure of anything these days).

            Then, assuming t

      • The German Federal security service tried this years ago in airports, and got a combinatorial explosion in false positives (AKA the "birthday paradox") that drowned out the real positives. Google knows the math, and is trying to save the innumerate from an expensive failure (;-))

        There's an estimate that about one in a million people looks practically identical to you. So if you have a database of 100,000 criminals, about every tenth random person matches someone from the database so closely that they would look identical to a police officer who checks.
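
        Spelling that estimate out (and taking the one-in-a-million figure as a rough assumption): each stranger has a 1/1,000,000 chance of being a near-twin of any particular database entry, so against 100,000 entries the expected number of near-matches per stranger is 100,000 / 1,000,000 = 0.1, i.e. roughly one random passer-by in ten resembles someone on the list closely enough to fool a quick visual check.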

      • by tlhIngan ( 30335 )

        Google knows the math, and is trying to save the innumerate from an expensive failure (;-))

        More like the product getting so tainted in the public eye that it's impossible to release it. I mean, Google Glass has its uses, but not only is general society unable to sort out its potential privacy issues (face it: we're still dealing with trying to fit cameras into our society properly, and those have been around for nearly two centuries now), but it takes just a few incidents before the public will conclude they'r

        • Google Glass may be tainted for everyday public use, but it will gain ground in business use: surgeons, construction managers, etc.
      • The cynic in me says there isn't any need for high precision, as long as the facial recognition system can pick out foreign faces over native ones. All the system really needs to do is give plausible justification for racial profiling.

        Article 25 of the Constitution of the UAE provides for the equitable treatment of persons with regard to race, nationality, religious beliefs or social status. However...

        [...]

        Foreign laborers in Dubai often live in conditions described by Human Rights Watch as "less than humane", and their situation was the subject of the documentary Slaves in Dubai.

        [wikipedia entry] [wikipedia.org]

        At least, that's what the cynic in me says. I have no other basis for that theory. The other reason, which would be much more likely, is to justify a shiny new toy for a police force which has plenty of funding and to grab some headlines in a mainstream press which real

      • and is trying to save the innumerate from an expensive failure (;-))

        If you can get some fool of a company PHB to pay you to develop an application that will never work in the real world, then as long as you cash the pay cheques before they get cancelled, it's an expensive success for you and an expensive failure for them.

        Just make sure that you set out your project roadmap so that you get paid for delivering a system that works in the workshop, and let the client discover that it doesn't work in the r

    • Re:Enforce (Score:5, Insightful)

      by Wycliffe ( 116160 ) on Monday October 06, 2014 @09:32AM (#48072697) Homepage

      You can try to restrict first sale, but places like AT&T have tried to do that with the iPhone,
      and I had friends who still managed to resell dozens of them to China without much effort.

      Google will most likely just enforce it by excluding it from their Play store, so it can't officially
      be sold through normal channels, but the Dubai police can still "enable 3rd party apps" and be fine.

    • by bigpat ( 158134 )

      I think the Google rule is more a function of battery life since that kind of constant radio communication uploading video back to the cloud is a drain on batteries.

      In terms of personal privacy or police state concerns... decent facial recognition technology is already available to police and government, along with fixed cameras that are hard-wired for power. Yes, there is a performance issue if you try to match too many faces to too many faces, but as others have said this is subject to Moore'

    • by hey! ( 33014 )

      I suspect the restriction is impossible to enforce, because it's almost certainly the case that the facial recognition isn't performed on the device itself. So it's a bit like saying you can't use the things for pornography; you'd have to know somehow that the user intends to pleasure himself later by looking at pictures of ladies' shoes.

      It's a bit too late on that score anyway. Having boots on the ground is an anachronism, even if they've got high-tech wearables. In 2000 Scotland Yard was able to foil t

    • I've always wondered if and how Google would enforce that rule.
      Now we'll find out.

      Given that the policy applies to "Glassware", which is on-board software, and the facial recognition is on a back end server ("not Glassware"), they probably are not going to do dick about it.

      If they *were* going to do something about it, it would be to not allow the Dubai police to distribute their Glassware in the Google store. I'm pretty sure the Dubai police will be side-loading the client app anyway, and would be just as happy that *NOTHING* from the Google store got onto their officers' Glass devices.

  • by Anonymous Coward

    So Glass gets about 45 minutes of battery life recording with the camera...probably even less if it's sending the video somewhere.

    The quality of the camera is also pretty crappy. Surely there are a hundred better ways to do this than with a Google Glass as a video source...

  • by Anonymous Coward

    Google policy simply does not apply. Period. The end. Corporate policy does not trump the authority of a sovereign government.

  • by ihtoit ( 3393327 ) on Monday October 06, 2014 @09:08AM (#48072523)

    I got dowsing rods for bomb detection, a Pacific road bridge on discount, and - oh, this is a gem - a barge that used to be a British aircraft carrier. All you's gotta do for that one is steal it from the Turks before they sell the keel to the Chinese.

    If they like, we have a surplus of cardboard policeman standies as well.

  • Facial recognition, like all biometrics, is not good for this purpose. You either get decent true-positive rates at the expense of many false positives, or few false positives at the cost of many missed matches.

    It's never like it is in the TV shows. The results from biometrics are far too fuzzy to be useful in this context, where one looks for one of a large set within a much larger set. It's somewhat useful in non-time-critical situations where one looks for one identity from a large set.

    In general, you can
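
    A hedged sketch of that trade-off: in a typical matcher the only knob is a similarity threshold, and moving it just exchanges one kind of error for the other. The score distributions below are made up for illustration, not taken from any real face-recognition system:

```python
# Illustrative only: synthetic similarity scores, not real biometric data.
import numpy as np

rng = np.random.default_rng(0)
genuine  = rng.normal(0.80, 0.08, 1_000)    # same-person comparison scores (assumed)
impostor = rng.normal(0.55, 0.08, 100_000)  # different-person scores (assumed)

for threshold in (0.60, 0.70, 0.80):
    false_reject_rate = np.mean(genuine < threshold)    # real matches missed
    false_accept_rate = np.mean(impostor >= threshold)  # innocents flagged
    print(f"threshold {threshold:.2f}: "
          f"FRR {false_reject_rate:.1%}, FAR {false_accept_rate:.1%}")
# Raising the threshold cuts false accepts but misses more real matches;
# lowering it does the opposite. No single setting fixes both at once.
```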

    • Or, you just jail everyone using facial recognition as a pretense. You're assuming this is a real police force. It's not. This is Dubai. It's a monarchy/theocracy and a police state.

    • by Wycliffe ( 116160 ) on Monday October 06, 2014 @09:27AM (#48072655) Homepage

      Facial recognition, like all biometrics, is not good for this purpose.

      Even if the technology is poor, it should be able to pop up a photo and a confidence level; then the cop can look closer and decide if it really is the right person. Even with today's technology, a computer is going to be much better than a cop studying a list of a thousand pictures and trying to memorize them. If a computer can narrow it down to the 10 most likely then it's made the cop's job a lot easier.

      Of course, this is Dubai; everyone knows what they are really looking for.
      They aren't looking for criminals. They are looking for escaped slaves.
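
      A minimal sketch of that "top 10 for a human to eyeball" idea, using cosine similarity over face embeddings. The embedding size and the randomly generated watchlist are placeholders, not anything a real deployment uses:

```python
# Hypothetical top-k shortlist: compare one probe face embedding against a
# watchlist and hand the closest candidates to a human officer.
import numpy as np

EMBEDDING_DIM = 128                                    # assumed embedding size
rng = np.random.default_rng(1)

watchlist = rng.normal(size=(1_000, EMBEDDING_DIM))    # stand-in wanted-list embeddings
watchlist /= np.linalg.norm(watchlist, axis=1, keepdims=True)

probe = rng.normal(size=EMBEDDING_DIM)                 # stand-in face seen on the street
probe /= np.linalg.norm(probe)

scores = watchlist @ probe                             # cosine similarity (unit vectors)
top10 = np.argsort(scores)[::-1][:10]                  # indices of the 10 closest entries

for rank, idx in enumerate(top10, 1):
    print(f"{rank:2d}. watchlist entry {idx}, similarity {scores[idx]:.3f}")
# The officer, not the computer, decides whether any of the ten is a real match.
```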

  • by Applehu Akbar ( 2968043 ) on Monday October 06, 2014 @09:12AM (#48072549)

    Facial recognition may be against the terms of the beta, but you can bet that in production this will be a major application for Glass. It will be a hit with prosopagnosics, for example, despite the social stigma against the product.

    Contractors all over the world find it easy to bend whatever restrictions their own cultures may impose in applying tech of any kind when Dubai threatens to make a large purchase. Our best hope is that the technology will leak to ISIS. If the Silicon Valley beta experience is any guide, seeing Glass on Jihadi John in beheading videos to come will cause ISIS to suddenly lose favor with al-Ummah.

    • by torsmo ( 1301691 )
      Well said. Also, this is Dubai, with a king and God as rulers. They don't care about false positives, as long as the rich aren't inconvenienced, and the enslaved foreign labour in the slums is kept in check.
  • But I do wonder, have they considered hiring Tom Cruise for this? Of course, if this technology does take off, you can rest assured that after the events in Ferguson, Missouri, chest cameras will be the least of criminals' worries....
  • Facial recognition is the obvious killer app for Glass. It would be very helpful for people like me who can't remember names or faces for the life of me. I think Google's policy is short-sighted.

  • Why this is bad (Score:5, Interesting)

    by Charliemopps ( 1157495 ) on Monday October 06, 2014 @09:21AM (#48072617)

    For those that were unaware, Dubai is an awful place to live.
    The majority of low-wage workers are shipped in from outside the country and are treated as slaves. They have no hope of leaving, and any questioning of the system will land you in prison. There are dozens of documentaries on the situation.

    Vice has a good one: http://www.vice.com/vice-news/... [vice.com]
    Caution, it's an auto-play video and it's got a loud intro.

    • Re: (Score:3, Interesting)

      by Anonymous Coward

      I happen to live in Dubai, having moved here from Russia about a year ago. I now work in IT and get a pretty good salary even compared to the US, especially given zero taxes.
      Everything around here is done by Indian, Filipino, or Pakistani people; most locals work in government organisations and the military. In all my time here I have never met a person who didn't speak English.
      Also, it is not quite true that it is hard to leave; there are a lot of cases of people getting into debt and just leaving the country for good. Problem starts when

      • I suppose the trains run on time, too.

        • by mjwx ( 966435 )

          I suppose the trains run on time, too.

          Dubai is fine if you're a white guy with marketable skills and don't run afoul of an Emirati citizen... But it sucks if you're an Indian or Filipino "guest worker" (pointing out they're sarcastic quotes, for extra sarcasm). There is an enforced pecking order with the Emirs at the top, then Emirati citizens, then white people. It starts to go seriously downhill from there. Even amongst the privileged classes, more money means you have more rights. Libertarians who've never seen the reality of unbridled capitalism won

          • You have no idea what Libertarianism is... none. You're basically arguing that Republicans, unregulated, would be a terrible idea. Something I agree with. The ideals you suggest are republican ideals, not Libertarian ideals. Republicans have about as much to do with Libertarianism as Democrats have to do with Socialism... as in, they steal some of the more popular ideas, but in reality could care less unless it furthers their ambitions. Dubai is about as polar opposite from what a Libertarian government wou

  • Technology is already being developed to countermeasure this google glass thing in Dubai. It's called the Burka and it's supposed to shield the device from doing its job. Specs remain unavailable at this time.

  • It's only a matter of time before we see US cops wearing Glass doing this same thing.

    I have seen it happen *over and over* throughout the years...it goes like this:

    "contractors" get ahold of technology X
    X has major privacy implications that prevent advanced nations from using it
    implement X in foreign countries, then use whatever civil unrest is happening at the moment to justify its use
    profit

    • ANPR, which automatically reads plate numbers, has been in use in the US for a while now.

      As far as I'm concerned, there are only benefits in having cops wear a camera:
      1. Their actions are monitored, which deters corruption and abuse of power
      2. They can tie in the face recognition system to identify wanted individuals

  • by gnasher719 ( 869701 ) on Monday October 06, 2014 @10:35AM (#48073203)
    I will easily believe that someone sold a system that uses Google Glass for facial recognition to the Dubai police. It's much harder to believe that someone sold them a system that actually works.
    • It works. The technology has existed since the 90s. Face recognition algorithms have improved significantly in the last 10 years due to better processing power and new imaging technologies.

      As long as the list of individuals put on the wanted list is limited, existing processing power can provide almost live results. The more profiles there are to evaluate, the longer it will take. If the client application actually does the conversion from the image to the vector representation, the server can actually be far more
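
      A hedged sketch of the client/server split that comment describes: the headset (or a phone tethered to it) turns the image into a compact vector, and only that vector travels to the server for matching. The function names, embedding size, and threshold are invented for illustration.

```python
# Hypothetical split: the client sends a small embedding, not the video.
import numpy as np

def extract_embedding(image: np.ndarray, dim: int = 128) -> np.ndarray:
    """Placeholder for whatever on-device face-embedding model is assumed."""
    rng = np.random.default_rng(abs(hash(image.tobytes())) % (2**32))
    vec = rng.normal(size=dim)
    return vec / np.linalg.norm(vec)

def server_match(embedding: np.ndarray, database: np.ndarray, threshold: float = 0.8):
    """Server side: compare one embedding against stored wanted-list embeddings."""
    scores = database @ embedding
    best = int(np.argmax(scores))
    return (best, float(scores[best])) if scores[best] >= threshold else None

# A 128-float vector is about half a kilobyte; a video frame is hundreds of
# kilobytes, which is the bandwidth/battery argument for doing the conversion
# on the client and the heavy matching on the server.
```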

    • With all those gutras, agals, keffiyehs and turbans, as well as hijabs, burqas, niqabs, and chadors.

      Toss in a variety of Ray Bans...

      Well, you see the problem? (But not much of a face.)
  • by Anonymous Coward

    Once this technology matures, you'll be able to have Facebook profiles dynamically pop up as you walk up to someone or even look at them. Forgetting names will be a thing of the past, and most people will have a general knowledge base on whoever they meet, which creates some interesting social dynamics. Very exciting.

  • by retroworks ( 652802 ) on Monday October 06, 2014 @11:46AM (#48073749) Homepage Journal

    I walk into Staples to buy something, and then am distracted by the price of an HP laser printer, spend a minute looking it over. I get home and find an ad for the same HP Laser printer on Facebook. Ok, maybe they identified me from the credit card I used and just randomly advertised that? Nope. Because this weekend I walked into a Best Buy and wound up getting curious about a particular Sony movie camera. Left the store without making a purchase. Facebook ad for that specific Sony camera when I got home.

    Minority Report is here, and I don't see any AntiPhorm or Digital Haystack / Data Pollution solution. Guy Fawkes Masks or Groucho Marx glasses don't seem realistic. Maybe if people boycott the stores using facial recognition cameras for internet advertising it would blunt the ads, but the tech is still there.

    • Anyone else heard of this? Is that actually happening at Staples or other retailers?

      I've had experiences like this, but usually I find out later it was a coincidence. I know retailers use cameras to track traffic and shopping patterns, but I've not seen actual facial recognition.

    • Take off the tinfoil hat. Facebook knows you like cameras and printers from your techy profile, and you happened to look at popular ones.

      • Maybe it's only Tesco and Walmart, then.

        http://www.huffingtonpost.ca/2013/11/08/tesco-facial-recognition-scanners_n_4241801.html

        http://adage.com/article/digital/facebook-walmart-write-rules-facial-recognition/245707/

  • Just wait till police officers', bankers', and politicians' faces are recognized by this technology as THE CRIMINALS.

    Oh, and imagine the ramifications for false positives!

    This won't be handled under civil lawsuits.

    The ramifications of this go very deep.

    There will be alternative databases of information to search against.

  • ...for everyone.
