“Property will cost us the earth.”
—Andreas Malm
Today’s links include several pieces by Carl Beijer and a few reflections on the future of drone warfare. Plus music.
Links
• Some PR advice for protesters (Carl Beijer)
1. Never throw the first punch or adopt a threatening posture.
If there is an imminent threat of violence, your absolute priority should be to protect yourself and those around you. … If you throw the first punch or even position yourself in a way that other people can construe as aggressive, the opposition will use this as a pretext to retaliate and the courts will use it as an excuse for an adverse judgment.
I’ll just be blunt here: if you are alert enough to throw a first strike, you should be alert enough to either dodge or absorb a hit with little problem. These people are not the trained killing machines they want you to think that they are. Look how out of breath this Chabad goon was after landing two standing punches to a guy lying on the ground:
2. Narrate for the camera.
Most modern protest incidents have plenty of accompanying footage from witnesses with smartphones, but that doesn’t mean that your audience will always have a clear view of what is happening. A lot of times the footage is just too dark, shaky, and poorly positioned to capture crucial details; even clear footage can miss certain things, like whether the clouds wafting into view are smoke or tear gas.

So even if you aren’t the one holding the camera, it’s often a good idea to narrate what you see happening. As mentioned above, if someone attacks you, you should make a point of being loudly incredulous about it while making it clear that you are not resisting or provoking them. If you see a cop trying to goad a protester into fighting, say so. If counterprotesters are brandishing weapons, ask “why do you have a knife?” If you can identify someone or who they’re with, do it. If a Zionist is posing as the opposition and yelling slurs, don’t assume that everyone watching will realize what’s going on — call them out. Your extemporaneous narration in the video will be taken much more seriously than anything anyone tries to say about it afterwards.
Other advice includes “record faces and other personally identifying information.” All in all, a terrific list.
File under “The obvious is not always obvious.”
• Occupation and capitalist ideology (Carl Beijer)
[W]hat is it about encampments that the ruling class finds so threatening? Why are its footsoldiers freaking out over what amounts, legally, to a case of mass loitering?
It is tempting to come up with all kinds of complicated and counterintuitive answers to this question, but I think we have already provided a simple answer. The capitalist class hates loitering. And it really hates mass loitering, because what this amounts to is a popular rejection of the laws of private property.
File under “Property will cost us the earth.”
• ‘Lavender’: The AI machine directing Israel’s bombing spree in Gaza (Yuval Abraham, +972 Mag)
In 2021, a book titled “The Human-Machine Team: How to Create Synergy Between Human and Artificial Intelligence That Will Revolutionize Our World” was released in English under the pen name “Brigadier General Y.S.” In it, the author — a man who we confirmed to be the current commander of the elite Israeli intelligence unit 8200 — makes the case for designing a special machine that could rapidly process massive amounts of data to generate thousands of potential “targets” for military strikes in the heat of a war. Such technology, he writes, would resolve what he described as a “human bottleneck for both locating the new targets and decision-making to approve the targets.”
Such a machine, it turns out, actually exists. A new investigation by +972 Magazine and Local Call reveals that the Israeli army has developed an artificial intelligence-based program known as “Lavender,” unveiled here for the first time. According to six Israeli intelligence officers, who have all served in the army during the current war on the Gaza Strip and had first-hand involvement with the use of AI to generate targets for assassination, Lavender has played a central role in the unprecedented bombing of Palestinians, especially during the early stages of the war. In fact, according to the sources, its influence on the military’s operations was such that they essentially treated the outputs of the AI machine “as if it were a human decision.” […]
The sources told +972 and Local Call that, during the first weeks of the war, the army almost completely relied on Lavender, which clocked as many as 37,000 Palestinians as suspected militants — and their homes — for possible air strikes.
There’s some awful information in that article. More than you’d think from the headline.
File under “AI is never wrong…”