{"id":123664,"date":"2018-12-10T12:00:25","date_gmt":"2018-12-10T12:00:25","guid":{"rendered":"https:\/\/www.transcend.org\/tms\/?p=123664"},"modified":"2018-12-06T12:58:11","modified_gmt":"2018-12-06T12:58:11","slug":"homeland-security-will-let-computers-predict-who-might-be-a-terrorist-on-your-plane-just-dont-ask-how-it-works","status":"publish","type":"post","link":"https:\/\/www.transcend.org\/tms\/2018\/12\/homeland-security-will-let-computers-predict-who-might-be-a-terrorist-on-your-plane-just-dont-ask-how-it-works\/","title":{"rendered":"Homeland Security Will Let Computers Predict Who Might Be a Terrorist on Your Plane \u2014 Just Don\u2019t Ask How It Works"},"content":{"rendered":"<div id=\"attachment_123665\" style=\"width: 710px\" class=\"wp-caption aligncenter\"><a href=\"https:\/\/www.transcend.org\/tms\/wp-content\/uploads\/2018\/12\/binary-code-computer-virtual-reality-big-brother-surveillance-spy.jpg\" ><img loading=\"lazy\" decoding=\"async\" aria-describedby=\"caption-attachment-123665\" class=\"wp-image-123665\" src=\"https:\/\/www.transcend.org\/tms\/wp-content\/uploads\/2018\/12\/binary-code-computer-virtual-reality-big-brother-surveillance-spy-1024x512.jpg\" alt=\"\" width=\"700\" height=\"350\" srcset=\"https:\/\/www.transcend.org\/tms\/wp-content\/uploads\/2018\/12\/binary-code-computer-virtual-reality-big-brother-surveillance-spy-1024x512.jpg 1024w, https:\/\/www.transcend.org\/tms\/wp-content\/uploads\/2018\/12\/binary-code-computer-virtual-reality-big-brother-surveillance-spy-300x150.jpg 300w, https:\/\/www.transcend.org\/tms\/wp-content\/uploads\/2018\/12\/binary-code-computer-virtual-reality-big-brother-surveillance-spy-768x384.jpg 768w, https:\/\/www.transcend.org\/tms\/wp-content\/uploads\/2018\/12\/binary-code-computer-virtual-reality-big-brother-surveillance-spy.jpg 1440w\" sizes=\"auto, (max-width: 700px) 100vw, 700px\" \/><\/a><p id=\"caption-attachment-123665\" class=\"wp-caption-text\">Illustration: Soohee Cho\/The Intercept, 
Getty Images<\/p><\/div>\n<p><em>3 Dec 2018 &#8211; <\/em>You\u2019re rarely allowed to know exactly what\u2019s keeping you safe. When you fly, you\u2019re subject to secret <a target=\"_blank\" href=\"https:\/\/apps.bostonglobe.com\/news\/nation\/graphics\/2018\/07\/tsa-quiet-skies\/\" >rules<\/a>, secret <a target=\"_blank\" href=\"https:\/\/theintercept.com\/2014\/07\/23\/blacklisted\/\" >watchlists<\/a>, hidden cameras, and other trappings of a plump, thriving surveillance culture. The Department of Homeland Security is now complicating the picture further by paying a private Virginia firm to build a software algorithm with the power to flag you as someone who might try to blow up the plane.<\/p>\n<p>The new DHS program will give foreign airports around the world free software that teaches itself who the bad guys are, continuing society\u2019s relentless swapping of human judgment for machine learning. DataRobot, a northern Virginia-based automated machine learning firm, won a contract from the department to develop \u201cpredictive models to enhance identification of high risk passengers\u201d in software that should \u201cmake real-time prediction[s] with a reasonable response time\u201d of less than one second, according to a technical overview that was written\u00a0for potential contractors and reviewed by The Intercept. The contract assumes the software will produce false positives and requires that the terrorist-predicting algorithm\u2019s accuracy should increase when confronted with such mistakes. DataRobot is currently testing the software, according to a\u00a0DHS\u00a0news release.<\/p>\n<p>The contract also stipulates that the software\u2019s predictions must be able to function \u201csolely\u201d using data gleaned from ticket records and demographics \u2014 criteria like origin airport, name, birthday, gender, and citizenship. 
The software can\u00a0also draw\u00a0from slightly more complex inputs, like the name of the associated travel agent, seat number, credit card information, and broader travel itinerary. The overview document describes a situation in which the software could \u201cpredict if a passenger or a group of passengers is intended to join the terrorist groups overseas, by looking at age, domestic address, destination and\/or transit airports, route information (one-way or round trip), duration of the stay, and luggage information, etc., and comparing with known instances.\u201d<\/p>\n<p>DataRobot\u2019s bread and butter is turning vast troves of raw data, which all modern businesses accumulate, into predictions of future action, which all modern companies desire. Its clients include Monsanto and the CIA\u2019s venture capital arm, In-Q-Tel. But not all of DataRobot\u2019s clients are looking to pad their revenues; DHS plans to integrate the code into an existing DHS offering called the Global Travel Assessment System, or GTAS, a toolchain that has been released as open source software and which is designed to make it easy for other countries to quickly implement no-fly lists like those used by the U.S.<\/p>\n<p>According to the technical overview, DHS\u2019s predictive software contract would \u201ccomplement the GTAS rule engine and watch list matching features with predictive models to enhance identification of high risk passengers.\u201d In other words, the government has decided that it\u2019s time for the world to move beyond simply putting names on a list of bad people and then checking passengers against that list. After all, an advanced computer program can identify risky fliers faster than humans could ever dream of and can also operate around the clock, requiring nothing more than electricity. The extent to which GTAS is monitored by humans is unclear. 
The overview document implies a degree of autonomy, listing as a requirement that the software should \u201cautomatically augment Watch List data with confirmed \u2018positive\u2019 high risk passengers.\u201d<\/p>\n<p>The document does make repeated references to \u201ctargeting analysts\u201d reviewing what the system spits out, but the underlying data-crunching appears to be almost entirely the purview of software, and it\u2019s unknown what ability said analysts would have to check or challenge these predictions. In an email to The Intercept, Daniel Kahn Gillmor, a senior technologist with the American Civil Liberties Union, expressed concern with this lack of human touch: \u201cAside from the software developers and system administrators themselves (which no one yet knows how to automate away), the things that GTAS aims to do look like they could be run mostly \u2018on autopilot\u2019 if the purchasers\/deployers choose to operate it in that manner.\u201d But Gillmor cautioned that even\u00a0including a human\u00a0in the loop could be a red herring when it comes to accountability: \u201cEven if such a high-quality human oversight scheme were in place by design in the GTAS software and contributed modules (I see no indication that it is), it\u2019s free software, so such a constraint could be removed. Countries where labor is expensive (or controversial, or potentially corrupt, etc) might be tempted to simply edit out any requirement for human intervention before deployment.\u201d<\/p>\n<blockquote><p><strong><em>\u201cCountries where labor is expensive might be tempted to simply edit out any requirement for human intervention.\u201d<\/em><\/strong><\/p><\/blockquote>\n<p>For the surveillance-averse, consider the following: Would you rather a group of government administrators,\u00a0who meet in secret and are exempt from disclosure, decide who is unfit to fly? Or would it be better for a computer, accountable only to its own code, to make that call? 
It\u2019s hard to feel comfortable with the very concept of profiling, a practice that so easily collapses into prejudice rather than vigilance. But at least with uniformed government employees doing the eyeballing, we know who to blame when, say, a woman in a headscarf is needlessly hassled, or a man with dark skin is pulled aside for an extra pat-down.<\/p>\n<p>If you ask DHS, this is a categorical win-win for all parties involved. Foreign governments are able to enjoy a higher standard of security screening; the United States gains some measure of confidence about the millions of foreigners who enter the country each year; and passengers can drink their complimentary beverage knowing that the person next to them wasn\u2019t flagged as a terrorist by DataRobot\u2019s algorithm. But watchlists, among the most notorious features of post-9\/11 national security mania, are of questionable efficacy and dubious legality. A 2014 report by The Intercept\u00a0pegged the U.S. Terrorist Screening Database, an FBI data set from which the no-fly list is excerpted, at roughly 680,000 entries, including some 280,000 individuals with \u201cno recognized terrorist group affiliation.\u201d That same year, a U.S. district court judge ruled in favor of an ACLU lawsuit, declaring the\u00a0no-fly\u00a0list unconstitutional. 
The list could only be used again if the government improved the mechanism through which people could challenge their inclusion on it \u2014 a process that, at the very least, involved human government employees, convening and deliberating in secret.<\/p>\n<div id=\"attachment_123666\" style=\"width: 710px\" class=\"wp-caption aligncenter\"><a href=\"https:\/\/www.transcend.org\/tms\/wp-content\/uploads\/2018\/12\/gtas-airplane-terrorist-big-brother-surveillance-spy.jpeg\" ><img loading=\"lazy\" decoding=\"async\" aria-describedby=\"caption-attachment-123666\" class=\"wp-image-123666\" src=\"https:\/\/www.transcend.org\/tms\/wp-content\/uploads\/2018\/12\/gtas-airplane-terrorist-big-brother-surveillance-spy-1024x470.jpeg\" alt=\"\" width=\"700\" height=\"321\" srcset=\"https:\/\/www.transcend.org\/tms\/wp-content\/uploads\/2018\/12\/gtas-airplane-terrorist-big-brother-surveillance-spy.jpeg 1024w, https:\/\/www.transcend.org\/tms\/wp-content\/uploads\/2018\/12\/gtas-airplane-terrorist-big-brother-surveillance-spy-300x138.jpeg 300w, https:\/\/www.transcend.org\/tms\/wp-content\/uploads\/2018\/12\/gtas-airplane-terrorist-big-brother-surveillance-spy-768x353.jpeg 768w\" sizes=\"auto, (max-width: 700px) 100vw, 700px\" \/><\/a><p id=\"caption-attachment-123666\" class=\"wp-caption-text\">Diagram from a Department of Homeland Security technical document illustrating how GTAS might visualize a potential terrorist onboard during the screening process. Document: DHS<\/p><\/div>\n<p>But what if you\u2019re one of the inevitable false positives? Machine learning and behavioral prediction are already widespread; The Intercept <a target=\"_blank\" href=\"https:\/\/theintercept.com\/2018\/04\/13\/facebook-advertising-data-artificial-intelligence-ai\/\" >reported<\/a> earlier this year that Facebook is selling advertisers on its ability to forecast and pre-empt your actions. 
The consequences of botching consumer surveillance are generally pretty low: If a marketing algorithm mistakenly predicts your interest in fly fishing where there is none, the false positive is an annoying waste of time. The stakes at the airport are orders of magnitude higher.<\/p>\n<p>What happens when DHS\u2019s crystal ball gets it wrong \u2014 when the machine creates a prediction with no basis in reality and an innocent person with no plans to \u201cjoin a terrorist group overseas\u201d is essentially criminally defamed by a robot? Civil liberties advocates not only worry\u00a0that\u00a0such false positives are likely, possessing a great potential to upend lives, but also question whether such a profoundly damning prediction is even technologically possible. According to DHS itself, its predictive software would have relatively little information upon which to base a prognosis of impending terrorism.<\/p>\n<p>Even from such mundane data inputs, privacy watchdogs cautioned, prejudice and biases always follow \u2014 something only worsened under the auspices of self-teaching artificial intelligence. Faiza Patel, co-director of the Brennan Center\u2019s Liberty and National Security Program, told The Intercept that giving predictive abilities to watchlist software will present only the veneer of impartiality. \u201cAlgorithms will both replicate biases and produce biased results,\u201d Patel said, drawing a parallel to situations in which police are algorithmically allocated to \u201crisky\u201d neighborhoods based on racially biased crime data, a process that results in racially biased arrests and a checkmark for the computer. In a self-perpetuating bias machine like this, said Patel, \u201cyou have all the data that\u2019s then affirming what the algorithm told you in the first place,\u201d which creates \u201ca kind of cycle of reinforcement just through the data that comes back.\u201d What kind of people should get added to a watchlist? 
The ones who resemble those on the watchlist.<\/p>\n<blockquote><p><strong><em>What kind of people should get added to a watchlist? The ones who resemble those on the watchlist.<\/em><\/strong><\/p><\/blockquote>\n<p>Indeed, DHS\u2019s system stands to deliver a computerized turbocharge to the bias that is already endemic to the American watchlist system. The overview document for the Delphic profiling tool made repeated references to the fact that it will create a feedback loop of sorts. The new system \u201cshall automatically augment Watch List data with confirmed \u2018positive\u2019 high risk passengers,\u201d one page read, with quotation marks doing some very real work. The software\u2019s predictive abilities \u201cshall be able to improve over time as the system feeds actual disposition results, such as true and false positives,\u201d said another section. Given that the existing watchlist framework has ensnared countless thousands of innocent people, the notion of \u201cfeeding\u201d such \u201cpositives\u201d into a machine that will then search even harder for that sort of person is downright dangerous. It also becomes absurd: When the criteria for who is \u201crisky\u201d and who isn\u2019t are kept secret, it\u2019s quite literally impossible for anyone on the outside to tell what is a false positive and what isn\u2019t. Even for those without civil libertarian leanings, the notion of an automatic \u201cbad guy\u201d detector that uses a secret definition of \u201cbad guy\u201d and will learn to better spot \u201cbad guys\u201d with every \u201cbad guy\u201d it catches would be comical were it not endorsed by the federal government.<\/p>\n<p>For those troubled by the fact that this system is not only real but currently being tested by an American company, the refusal of both the government and DataRobot to reveal any details of the program is perhaps the most troubling\u00a0of all. 
When asked where the predictive watchlist prototype is being tested, the DHS tech directorate spokesperson, John Verrico, told The Intercept, \u201cI don\u2019t believe that has been determined yet,\u201d and stressed that the program was meant for use with foreigners. Verrico referred further questions about test location and which \u201crisk criteria\u201d the algorithm will be trained to look for back to DataRobot. Libby Botsford, a DataRobot spokesperson, initially told The Intercept that she had \u201cbeen trying to track down the info you requested from the government but haven\u2019t been successful,\u201d and later added, \u201cI\u2019m not authorized to speak about this. Sorry!\u201d Subsequent requests sent to both DHS and DataRobot were ignored.<\/p>\n<p>Verrico\u2019s assurance \u2014 that the watchlist software is an outward-aiming tool provided to foreign governments, not a means of domestic surveillance \u2014 is an interesting feint given that Americans fly through non-American airports in great numbers every single day. But it obscures ambitions much larger than GTAS itself: The export of opaque, American-style homeland security to the rest of the world and the hope of bringing every destination in every country under a single, uniform, interconnected surveillance framework. Why go through the trouble of sifting through the innumerable bodies entering the United States in search of \u201crisky\u201d ones when you can move the whole haystack to another country entirely? A global network of terrorist-scanning predictive robots at every airport would spare the U.S. 
a lot of heavy, politically ugly lifting.<\/p>\n<blockquote><p><strong><em>\u201cAutomation will exacerbate all of the worst aspects of the watchlisting system.\u201d<\/em><\/strong><\/p><\/blockquote>\n<p>Predictive screening further shifts responsibility.\u00a0The ACLU\u2019s Gillmor explained that making these tools available to other countries may mean that those external agencies will prevent people from flying so that they never encounter DHS at all, which makes DHS less accountable for any erroneous or damaging flagging, a system he described as \u201ca quiet way of projecting U.S. power out beyond U.S. borders.\u201d Even at this very early stage, DHS seems eager to wipe its hands of the system it\u2019s trying to spread around the world: When Verrico brushed off questions of what the system would consider \u201crisky\u201d attributes in a person, he added in his email that \u201cthe risk criteria is being defined by other entities outside the U.S., not by us. I would imagine they don\u2019t want to tell the bad guys what they are looking for anyway. ;-)\u201d DHS did not answer when asked whether there were any plans to implement GTAS within the United States.<\/p>\n<p>Then there\u2019s the question of appeals. Those on DHS\u2019s current watchlists may seek legal redress; though the appeals system is generally considered inadequate by civil libertarians, it offers at least a theoretical possibility of removal. The documents surrounding DataRobot\u2019s predictive modeling contract make no mention of an appeals system for those deemed risky by an algorithm, nor is there any requirement in the DHS overview document that the software must be able to explain how it came to its conclusions. 
Accountability remains a fundamental problem in the fields of machine learning and computerized prediction, with some computer scientists adamant that an ethical algorithm must be able to show its work, and others objecting on the grounds that such transparency compromises the accuracy of the predictions.<\/p>\n<p>Gadeir Abbas, an attorney with the Council on American-Islamic Relations, who has spent years fighting the U.S. government in court over watchlists, saw the DHS software as only more bad news for populations already unfairly surveilled. The U.S. government is so far \u201cnot able to generate a single set of rules that have any discernible level of effectiveness,\u201d said Abbas, and so \u201cthe idea that they\u2019re going to automate the process of evolving those rules is another example of the technology fetish that drives some amount of counterterrorism policy.\u201d<\/p>\n<p>The entire concept of making watchlist software capable of terrorist predictions is mathematically doomed, Abbas added, likening the system to a \u201ccrappy Minority Report. \u2026 Even if they make a really good robot, and it\u2019s 99 percent accurate,\u201d the fact that terror attacks are \u201cexceedingly rare events\u201d in terms of naked statistics means you\u2019re still looking at \u201cmillions of false positives. \u2026 Automation will exacerbate all of the worst aspects of the watchlisting system.\u201d<\/p>\n<p>The ACLU\u2019s Gillmor agreed that this mission is simply beyond what computers are even capable of:<\/p>\n<blockquote><p><em>For very-low-prevalence outcomes like terrorist activity, predictive systems are simply likely to get it wrong. When a disease is a one-in-a-million likelihood, the surest bet is a negative diagnosis. But that\u2019s not what these systems are designed to do. They need to \u201cdiagnose\u201d some instances positively to justify their existence. 
So, they\u2019ll wrongly flag many passengers who have nothing to do with terrorism, and they\u2019ll do it on the basis of whatever meager data happens to be available to them.<\/em><\/p><\/blockquote>\n<p>Predictive software is not just the future, but the present. Its expansion into the way we shop, the way we\u2019re policed, and the way we fly will soon be commonplace, even if we\u2019re never aware of it. Designating enemies of the state based on a crystal ball locked inside a box represents a grave, fundamental leap in how societies appraise danger. The number of active, credible terrorists-in-waiting is an infinitesimal slice of the world\u2019s population. The number of people placed on watchlists and blacklists is significant. Letting software do the sorting \u2014 no matter how smart and efficient we tell ourselves it will be \u2014 will likely do much to worsen this inequity.<\/p>\n<p>__________________________________________________<\/p>\n<p><em>Related:<\/em><\/p>\n<ul>\n<li><em><a target=\"_blank\" href=\"https:\/\/theintercept.com\/2018\/11\/26\/cloud-act-data-privacy-us-tech-companies\/\" ><strong>New Law Could Give U.K. 
Unconstitutional Access to Americans\u2019 Personal Data, Human Rights Groups Warn<\/strong><\/a><\/em><\/li>\n<li><a href=\"https:\/\/www.transcend.org\/tms\/2018\/11\/amazons-accent-recognition-technology-could-tell-the-government-where-youre-from\/\" ><em><strong>Amazon\u2019s Accent Recognition Technology Could Tell the Government Where You\u2019re From<\/strong><\/em><\/a><\/li>\n<li><em><a target=\"_blank\" href=\"https:\/\/theintercept.com\/2018\/11\/13\/google-quayside-toronto-smart-city\/\" ><strong>Google\u2019s \u201cSmart City of Surveillance\u201d Faces New Resistance in Toronto<\/strong><\/a><\/em><\/li>\n<li><em><a target=\"_blank\" href=\"https:\/\/theintercept.com\/2018\/09\/26\/airport-facial-recognition-flight-delay\/\" ><strong>The Government Wants Airlines to Delay Your Flight So They Can Scan Your Face<\/strong><\/a><\/em><\/li>\n<\/ul>\n<p style=\"padding-left: 30px;\"><a href=\"https:\/\/www.transcend.org\/tms\/wp-content\/uploads\/2018\/05\/Sam-Biddle-e1526217424796.jpg\" ><img loading=\"lazy\" decoding=\"async\" class=\"alignleft size-full wp-image-111065\" src=\"https:\/\/www.transcend.org\/tms\/wp-content\/uploads\/2018\/05\/Sam-Biddle-e1526217424796.jpg\" alt=\"\" width=\"100\" height=\"100\" \/><\/a><\/p>\n<p>&nbsp;<\/p>\n<p>&nbsp;<\/p>\n<p style=\"padding-left: 30px;\"><em><a target=\"_blank\" href=\"https:\/\/theintercept.com\/staff\/sambiddle\/\" >Sam Biddle<\/a> &#8211; <a href=\"mailto:sam.biddle@theintercept.com\">sam.biddle@\u200btheintercept.com<\/a><\/em><\/p>\n<p>&nbsp;<\/p>\n<p><a target=\"_blank\" href=\"https:\/\/theintercept.com\/2018\/12\/03\/air-travel-surveillance-homeland-security\/\" >Go to Original \u2013 theintercept.com<\/a><\/p>\n","protected":false},"excerpt":{"rendered":"<p>3 Dec 2018 &#8211; The software, developed under contract, will be given to foreign governments. 
It is already being tested.<\/p>\n","protected":false},"author":4,"featured_media":89314,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[60],"tags":[],"class_list":["post-123664","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-whistleblowing-surveillance"],"_links":{"self":[{"href":"https:\/\/www.transcend.org\/tms\/wp-json\/wp\/v2\/posts\/123664","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.transcend.org\/tms\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.transcend.org\/tms\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.transcend.org\/tms\/wp-json\/wp\/v2\/users\/4"}],"replies":[{"embeddable":true,"href":"https:\/\/www.transcend.org\/tms\/wp-json\/wp\/v2\/comments?post=123664"}],"version-history":[{"count":0,"href":"https:\/\/www.transcend.org\/tms\/wp-json\/wp\/v2\/posts\/123664\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.transcend.org\/tms\/wp-json\/wp\/v2\/media\/89314"}],"wp:attachment":[{"href":"https:\/\/www.transcend.org\/tms\/wp-json\/wp\/v2\/media?parent=123664"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.transcend.org\/tms\/wp-json\/wp\/v2\/categories?post=123664"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.transcend.org\/tms\/wp-json\/wp\/v2\/tags?post=123664"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}