Bad Eye on High — Seeking ‘Adversarial Intent’

September 29, 2011

“If this works out, we’ll have the ability to track people persistently across wide areas,” says Tim Faltemier, the lead biometrics researcher at Progeny Systems Corporation, which recently won one of the Army contracts.
“A guy can go under a bridge or inside a house. But when he comes out, we’ll know it was the same guy that went in.”
— Description of innovative drone technology, from Wired



Even as the situation on the ground in Afghanistan becomes worse — violence is up nearly 40 percent this year over 2010 — the sky-ways above the war-spattered country have come alive with the whine of unmanned drones.
The conflict there, however, is getting bad, real bad: Three NATO soldiers were killed Wednesday in eastern Afghanistan, while at nearly the same time eight Afghan policemen died in an ambush in the south — and all this horror coming in the wake of the US embassy attack in Kabul earlier this month and the assassination of a former Afghan president who was trying to work with the Taliban on a peace plan.
No country for young or old men.

Emboldened by the successful slaughter in Afghanistan, the US is reportedly building new drone bases in Ethiopia and in the Seychelles, an archipelago in the Indian Ocean — all in secret, of course — to enable better attacks on insurgents in Somalia and Yemen.
Use of drones surged 134 percent in 2010 over the previous year, and President Obama has apparently made these machines the focal point of modern war — there were only five drone attacks in Pakistan during 2007 and 36 in 2008 (most of those in the last half of that year), and in Obama’s first year in office the tempo of such attacks in Pakistan increased 47 percent.

Despite all evidence to the contrary, last summer Obama’s chief counter-terrorism adviser, John Brennan, blubbered that “there hasn’t been a single collateral [civilian] death” in Pakistan — since drones are, of course, perfect: “exceptionally surgical and precise,” they “do not put… innocent men, women and children in danger.”
Such bullshit — there have been 236 attacks under Obama, one every four days.

And with drones, it’s quiet, easy killing, with the machines themselves making life-and-death decisions.
War has gone beyond human control:

And the machines being used to do the killing are also being enhanced, moving the United States one step closer to an apparent goal of constant low-intensity warfare capability worldwide.
The United States government is reportedly working to develop pilotless military drones that are fully automatic, identifying and destroying human targets on the ground without any intervention from an operator or pilot back in Nevada, and this is generating virtually no public outrage.
The drones would reportedly seek their targets based on facial-recognition software or other biometrics. The Defense Department planners have dubbed the technological leap “lethal autonomy,” meaning that the life-or-death decision can be made instantaneously and independently by the machine without any slowing down of the process due to a human being having to make a decision whether to fire or not.
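
For the technically curious, here is a minimal, purely hypothetical sketch of what that “life-or-death decision… made instantaneously and independently by the machine” boils down to in software terms: a biometric match score and a threshold where a human judgment used to be. The watchlist, the face-embedding numbers, the threshold, and the function names below are all invented for illustration; none of this reflects any actual military system.

    # Purely illustrative sketch (not any real system): a face "embedding" is
    # compared against a watchlist, and a decision threshold replaces the human
    # operator. The vectors, names, and threshold below are invented.
    import math

    WATCHLIST = {
        "target_alpha": [0.11, 0.83, 0.42, 0.31],   # hypothetical enrolled face embeddings
        "target_bravo": [0.72, 0.05, 0.55, 0.40],
    }
    MATCH_THRESHOLD = 0.97  # assumed cosine-similarity cutoff

    def cosine_similarity(a, b):
        dot = sum(x * y for x, y in zip(a, b))
        norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b))
        return dot / norm if norm else 0.0

    def lethal_autonomy_decision(observed_embedding):
        """Return (target_id, similarity) if the machine alone would 'decide'."""
        best_id, best_sim = None, 0.0
        for target_id, enrolled in WATCHLIST.items():
            sim = cosine_similarity(observed_embedding, enrolled)
            if sim > best_sim:
                best_id, best_sim = target_id, sim
        # The unsettling part: no operator in the loop, just a threshold.
        if best_sim >= MATCH_THRESHOLD:
            return best_id, best_sim
        return None, best_sim

    # Example: an observed embedding close to "target_alpha" clears the bar.
    print(lethal_autonomy_decision([0.12, 0.82, 0.43, 0.30]))

The whole “leap” is that last if-statement: once the score crosses the cutoff, nothing slows the process down.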

That key phrase — ‘generating virtually no public outrage’ — is where the US war effort is moving in an attempt to keep the Long War going even longer — out of sight, out of range.

And it’s topical — this week the FBI arrested a 26-year-old US citizen on charges of “plotting an attack on the Pentagon and the U.S. Capitol with a remote-controlled model aircraft,” and although authorities claim there was no real danger, the ability of a guy armed with a physics degree to plan something like this might become commonplace in the near future.

So this unmanned-drone future is kind of scary, even for folks here in the US.
From the Wired link above:

The Pentagon isn’t content to simply watch the enemies it knows it has, however.
The Army also wants to identify potentially hostile behavior and intent, in order to uncover clandestine foes.
Charles River Analytics is using its Army cash to build a so-called “Adversary Behavior Acquisition, Collection, Understanding, and Summarization (ABACUS)” tool.
The system would integrate data from informants’ tips, drone footage, and captured phone calls.
Then it would apply “a human behavior modeling and simulation engine” that would spit out “intent-based threat assessments of individuals and groups.”
In other words: This software could potentially find out which people are most likely to harbor ill will toward the U.S. military or its objectives.
Feeling nervous yet?
“The enemy goes to great lengths to hide his activities,” explains Modus Operandi, Inc., which won an Army contract to assemble “probabilistic algorithms th[at] determine the likelihood of adversarial intent.”
The company calls its system “Clear Heart.”
As in, the contents of your heart are now open for the Pentagon to see.
It may be the most unnerving detail in this whole unnerving story.
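
Strip away the acronyms and what “probabilistic algorithms th[at] determine the likelihood of adversarial intent” most plausibly means is something like the toy fusion below: independent “signals” (an informant’s tip, flagged drone footage, an intercepted call) each nudge a prior probability up or down, naive-Bayes style, until a number pops out. Every signal name, likelihood ratio, and prior here is an assumption for illustration — this is not ABACUS, not “Clear Heart,” not anything documented.

    # Toy illustration only: a naive-Bayes-style fusion of "signals" into a single
    # probability of "adversarial intent." The signals, likelihood ratios, and
    # prior are invented; this is not ABACUS, "Clear Heart," or any real system.

    # Assumed likelihood ratios: how much more likely each observation is for a
    # "hostile" person than for an innocent one.
    LIKELIHOOD_RATIOS = {
        "informant_tip": 4.0,
        "suspicious_drone_footage": 2.5,
        "flagged_phone_call": 3.0,
    }

    def intent_score(observed_signals, prior=0.01):
        """Fuse observed signals into a posterior probability of 'adversarial intent'."""
        odds = prior / (1.0 - prior)
        for signal in observed_signals:
            odds *= LIKELIHOOD_RATIOS.get(signal, 1.0)  # unknown signals change nothing
        return odds / (1.0 + odds)

    # A villager reported by one informant and caught on one flagged call:
    print(intent_score(["informant_tip", "flagged_phone_call"]))  # ~0.108

The point of the toy: start from a one-percent prior, add two noisy hearsay “signals,” and the machine is already past ten percent — and it is that kind of number, not a confession, that would feed an “intent-based threat assessment.”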

Not only watch your back, but watch your emotions.
