The Mediation and Commodification. Part of the #transhumancode #bestseller
https://m.barnesandnoble.com/w/the-transhuman-code-carlos-moreira/1130063219
The best technology can do is prioritize its efforts according to our administration, our design, our programming. The key is making sure the priorities we assign to it and the governance we ascribe to it are in the best interest of all humanity. How? In general terms, the solution is to codify, into the technology we create, the core human attributes that set us apart from every other life form on the planet. To write HI (human intelligence) code into AI technology so that the product serves us instead of making us subservient to it.
Technology only has the freedom to go as far as we allow it. To this point, we've been lethargic about that freedom. The manipulation of the U.S. presidential election was perhaps a turning point. At the least, it has served as a global wake-up call. But the truth we must grasp hits closer to home: we're all complicit in the Facebook platform's capacity to manipulate. We created it, embraced it, and empowered it for years, knowingly and unknowingly.
In her article for Vanity Fair, Susan Fowler describes the conflicting human efforts that have allowed technology to initiate a swelling, covert coup against humanity.
A few weeks into my tenure at Uber…a co-worker sat down next to my desk. “There’s something you need to know,” she said in a low voice, “and I don’t want you to forget it. When you’re writing code, you need to think of the drivers. Never forget that these are real people who have no benefits, who have to live in this city, who depend on us to write responsible code. Remember that.”
I didn’t understand what she meant until several weeks later, when I overheard two other engineers in the cafeteria discussing driver bonuses — specifically, ways to manipulate bonuses so that drivers could be “tricked” into working longer hours. Shortly thereafter, a wave of price cuts hit drivers in the Bay Area. When I talked to the drivers, they described how Uber kept fares in a perfectly engineered sweet spot: just high enough for them to justify driving, but just low enough that not much more than their gas and maintenance expenses were covered.
For a week in January 2012, Facebook removed between 10 and 90% of the positive emotional content from the newsfeeds of approximately 700,000 users. The action was part of a covert study conducted by Facebook and US academics who were interested in whether the emotions expressed by friends via social networks influenced users' moods. The ultimate objective was to determine whether it was possible to manipulate users' feeds to keep them happier, which would in turn keep them on Facebook longer, making it possible to expose them to more ads and thus increase Facebook's revenue. According to Guardian columnist Stuart Jefferies, the study found that "reducing the number of emotionally positive posts in someone's newsfeed produced a statistically significant fall in the number of positive words they used in their own status updates and a slight increase in the number of negative words." In other words, the study proved that Facebook can, if it so chooses, sway users' moods to benefit the bottom line.
As you might imagine, when news leaked that Facebook had run the test, the response from users was less than cheerful. Explains Jefferies, "There was a disgust at the possibility that Facebook wanted to make us happier on their site so that we'd stay there longer…so that Mark Zuckerberg can buy more yachts." While Jefferies deems it "a loathsome business model," he acknowledges that the practice is nothing new in the world of technology platforms. He cites Thomas Jones of the London Review of Books, who reminds us that "the purpose of Facebook is to harvest, organize, and store as much personal information as possible to be flogged, ready-sifted and stratified, to advertisers… We aren't Facebook users, we're its product."
This "mediation and commodification of every aspect of everyday life," as Jones puts it, is not the practice of Facebook alone. It is the modus operandi of nearly every technological tool you use, from cell phones and search engines to doorbells, security cameras, and voice-controlled speakers. The unveiled reality of the technologically driven world we live in can lead one to ask, as Jefferies does, whether we've become pawns "whose moods can be altered like lobotomized lab rats" to boost corporate revenues. The truth is that the handful of multi-billion-dollar platform companies vying for control of the web (Facebook, Amazon, Google, and Twitter, to begin with) aren't looking to influence our moods for the sport of it. These companies are, in most cases, looking to boost our morale while we use their products and then feed us more of what we are looking for. On one hand, this is simply the fundamentals of good customer service at work. What better way to serve a customer than to understand the person as much as possible?