A Call to Arms for the Attention-Driven Tech Industry

With the incident at the Capitol, the chickens are coming home to roost: it is very likely that US federal legislation will go on to address some major links in the chain of events that led to a fascist liar, and the party that enabled his behavior, driving a crowd to breach the legislature on January 6th, 2021. The link we are all familiar with is Facebook.

But that's not how laws work: it doesn't make sense to sue Facebook, or to write a law that applies only to Facebook. After all, the whole attention-driven economy is partly at fault, including the media. Maybe individual sites could be targeted if they cross the customary free-speech lines (the public sites those protesters organized on will probably get deplatformed if they resist taking down content related to the events of 1/6). This means every site will have to bear the brunt of Facebook's negative externalities. In other words, a powerful tool for humanity is going to get hammered because of one bad actor.

The exact same thing has already been happening for some time with facial recognition. Clearview AI is well known in the machine vision industry as unscrupulous shits who put profit before everything else, being the only major player that lets law enforcement (and others) do the same things the CCP does to the Uighurs. Thankfully, US local law enforcement are, as we know, clumsy bad actors, lacking the technological sophistication and ruthless purpose that embolden the Chinese government. But by 2020, these human rights abuses of facial recognition had already prompted local US governments to start banning such uses. Amazon is also a big-time offender here, tying one of the most commonly deployed products (Ring doorbells/cameras) to unconsented law enforcement use, after making similar mistakes in giving any law enforcement agency access to its big data sets.

Ultimately this is about externalities. I think it needs to be extremely clear: if you are doing a large-scale deployment of any new business or technology, examine the externalities. Think of it as risk assessment. It's one thing to move fast and break things; it's another to break so fast that you never recover. It's a third thing, unfortunately, to break things so badly that nobody else can ever touch them again. If the wrong legislation comes down, these technology sectors are dead ends.

It's really important for big tech companies to be careful and not fuck it up for the rest of humanity. This isn't about starting Civil War 2; it's about not starting the next Chernobyl, which poisoned the well for nuclear power. It's about not throwing the baby of advanced AI out with the bathwater of negative externalities that unethical actors like Facebook and Amazon are generating.