Transparency and Manipulation
We like the idea of transparency within our society. It is the solution for addressing the corruption that happens behind closed doors. We extend the idea not just to politicians but to loved ones as well. Having little to no secrets is supposed to give those around us a sense of trust and well-being. If there's nothing to feel ashamed or guilty about, then why hide? So goes the reasoning.
Critics of transparency have pointed to its ugly downside ever since its rise in the 20th century. Orwellian dystopias warn that the surveillance society looms around the corner once we are made to disclose too much. And even if the state never embraces the practice, surveillance applied to personal relationships erodes our sense of autonomy. Someone asking where we went last night may be innocuous, or it may be despotic.
Having our data manipulate us doesn't necessarily have to be painful, though. In the case of Google and Meta (formerly Facebook), our data is used to feed us things we like. But as we're being fed, we're also being profiled. Our affiliations, beliefs, and ideals are generalized and reaffirmed. We are caught in a feedback loop that can reinforce our own biases. As a result, we become unjustifiably confident in what we already think.
Though plenty of people have made this point before, I think there is a more troubling element to all of this. If our online profiles are data-driven, then we're vulnerable to being nudged down a course that is an oversimplified characterization of what a person is supposed to be. Watch a particular comedian on YouTube and start getting political commentary recommendations of a certain brand. Search for a way to decorate your house and, depending on which influencer's picture you saved, get suggestions for memes with ideological undertones. The personal is supposedly the political, and every part of one's life is surveilled to figure these things out, a situation driven entirely by AI compartmentalizations. It seems unsurprising, then, that we find ourselves increasingly polarized.
Being unsettled by the government knowing everything about us makes sense. And even though I don't believe major tech companies have any desire to rule over us tyrannically, the data we give them has the potential to take away our autonomy in the same way all transparency can. What we may not realize is that the automated breadcrumbs the algorithms drop lead us to become less complex, more sure of ourselves, and intolerant of those who can't see the world with the clarity we and our fellow comrades claim. Can we expect anything else if we construct ourselves upon the impersonal binary systems that inform our day-to-day lives?