The real nature of automated profiling
How mass profiling works and how it can become a weapon of mass manipulation.
Raise your hand if you know the difference between "service personalization" and profiling.
These two terms are often mistaken for synonyms, but they couldn't be more different. It is important to understand what they mean, because almost all of cyberspace runs on profiling algorithms, which carry significant implications for us as individuals and for society as a whole.
Individuality in the Digital Age
Our grandparents and all previous generations lived in a context where it was very difficult to obtain low-cost information, and it was equally difficult to interact and communicate with people outside their local sphere. Conforming to customs was an easy way not to make mistakes and to be accepted by one's community.
In the Digital Age we live in, it is now very easy to obtain information of any kind at very low cost and to interact with people on the other side of the world. This allows us to build our identity based on countless inputs and influences, well beyond the borders of our own country.
Being "unique" today is something much easier — at least in theory — compared to the past. Contamination and the creation of relationships even at distances of thousands of kilometers are increasingly easier, leading to the development of subtle but significant different personalities, preferences, and attitudes among individuals.
While this potential is positive, excessive individuality and market fragmentation create significant challenges for companies that want to sell their products. Selling goods and services can indeed prove difficult if your audience has no face, lives hundreds or thousands of kilometers away from you, and has little in common with your fellow townspeople.
This is where big tech corporations, data brokers and mass profiling come in.
What actually is profiling?
The ease with which data on people's actions and behaviors can be acquired today has led the market to create a solution to this problem, giving rise to the phenomenon we now know as profiling.
Technically speaking, profiling is a process, typically automated, that uses algorithms to analyze one or more data sets in order to infer additional information about an individual, such as characteristics, behaviors, or preferences. This information is then used to create “profiles” that can be employed for assessments, predictions, or decisions about the individual.
Profiling therefore consists of three phases, illustrated in the sketch after this list:
Data mining: collection and analysis of raw data
Inference: generation of additional information and profiles through models and algorithms
Application or decision: use of profiles for concrete actions
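To make these phases concrete, here is a toy sketch in Python. Everything in it, the events, the threshold, the category label, is invented for illustration; real profiling systems rely on large datasets and statistical or machine-learning models rather than a handful of hard-coded rules.

```python
# Toy illustration of the three phases of profiling described above.
# All data, thresholds, and category names are invented for the example.

# 1. Data mining: raw behavioral data collected about a user
raw_events = [
    {"user": "u42", "action": "viewed", "item": "running shoes"},
    {"user": "u42", "action": "viewed", "item": "protein bars"},
    {"user": "u42", "action": "purchased", "item": "yoga mat"},
]

# 2. Inference: derive a simplified "profile" from the raw data
def infer_profile(events):
    fitness_signals = sum(
        1 for e in events
        if e["item"] in {"running shoes", "protein bars", "yoga mat"}
    )
    # The user is reduced to a coarse category (a stereotype)
    return "fitness-oriented consumer" if fitness_signals >= 2 else "general consumer"

# 3. Application or decision: the profile drives a concrete action
profile = infer_profile(raw_events)
if profile == "fitness-oriented consumer":
    print("Show sportswear ads")
else:
    print("Show generic ads")
```

Note that at no point does the sketch describe the individual: it only decides which bucket they fall into, and every user in that bucket gets the same treatment.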
As you might imagine, profiling is not a human activity, but to draw a comparison, it can be likened to stereotyping. A stereotype is a simplified and persistent subjective characterization applied to a place, an object, an event, or a recognizable group of people who share certain characteristics or qualities.
In psychology, a stereotype is a preconceived, generalized, and simplistic opinion that is not based on a personal evaluation of individual cases but is mechanically repeated on people, events, and situations.
We could say, then, that the function of profiling is to create stereotypes: it finds correlations in data that describe, in a simplified or generalized way, a person's behavior or certain characteristics, in order to place that person in a group of people with the same profile.
In practice, profiling is the attempt to use mathematical models to generalize and simplify the attitudes, interests, and thoughts of 8 billion people and thus predict their behaviors, in order to sell goods, services, and ideas.
Personalization, by contrast, is the activity of tailoring something to the needs or preferences of a person or group of people. Essentially, it is the effort required to transform a conventional good or service into one custom-made for the individual who purchases it. The opposite of profiling.
The example of Spotify
Big corporations and companies want you to believe that their profiling is meant to offer you “service personalization”. However, as you may have already realized, these two concepts are very different. After all, would you ever willingly accept being reduced to a stereotype?
Today there is a billion-dollar, rapidly expanding industry selling artificial intelligence services that promise, for example, to "understand human emotions" and personalize services of all kinds.
How can goods and services be personalized through the creation of stereotypes designed to do the exact opposite: standardize, generalize, and categorize people? They can't.
An example of this comes from a patent filed by Spotify in 2018 for a technology that would analyze users' voices to better understand their musical tastes and suggest content based on their preferences.
The technology is essentially an artificial intelligence model that, from voice analysis, would infer "metadata" about the user's emotional state, gender, age, and accent, as well as the number of people in the room and the physical environment they are in.
Once this information is obtained and processed, Spotify would use it to recommend the music that the algorithm deems most suitable for us and the context we are in.
However, what Spotify is trying to do is not personalize the music feed, but profile the user more accurately by acquiring additional data. A subtle but fundamental difference.
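To see the difference in code, here is a hypothetical sketch of the kind of pipeline described above. All function names, attributes, and playlists are invented; this is not Spotify's actual system. The point it illustrates is that the inferred attributes do not describe the individual: they only decide which pre-existing profile bucket, and therefore which pre-packaged recommendation, the user is assigned to.

```python
# Hypothetical sketch of a voice-based profiling pipeline, loosely modeled
# on the pipeline described in the article. Function names, attributes, and
# playlists are invented for illustration; this is not Spotify's system.

def infer_voice_metadata(voice_sample: bytes) -> dict:
    # Stand-in for a model that claims to infer attributes from audio.
    # Here it just returns fixed guesses to keep the sketch runnable.
    return {"mood": "sad", "age_group": "18-25", "environment": "alone"}

# Recommendations are keyed by profile buckets, not by the individual:
# everyone assigned the same bucket receives the same suggestion.
PROFILE_PLAYLISTS = {
    ("sad", "18-25", "alone"): "Melancholy Indie Mix",
    ("happy", "18-25", "party"): "Dance Hits",
}

def recommend(voice_sample: bytes) -> str:
    meta = infer_voice_metadata(voice_sample)
    bucket = (meta["mood"], meta["age_group"], meta["environment"])
    return PROFILE_PLAYLISTS.get(bucket, "Top 50 Global")

print(recommend(b"\x00\x01"))  # -> "Melancholy Indie Mix"
```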
There are 8 billion individuals in the world, each with their own individuality and subjectivity. We are all evolving subjects, enriched by our differences and uniqueness; music is certainly an expression of this individuality. Thinking that an algorithm, from voice samples and a few other pieces of metadata, can understand a person's emotional state is very naive.
The risks of profiling
No software will ever be able to understand human emotions, and no software will ever understand our personality. If anything, it is mass profiling, together with advertising and propaganda, that gradually flattens individuals, forcing them to fit a certain consumer profile.
Mass profiling can therefore pose a threat to our subjectivity and freedom of self-determination. A child born into a world where every product, service, piece of news, and form of entertainment is automatically suggested by profiling and recommendation algorithms will struggle to develop critical thinking skills or the ability to make independent choices.
Profiling is a hammer
As Paolo Benanti said:
“Algorithms are arrangements of power. They are not simply things; they are ways of organizing society and power, rights, and privileges.”
Profiling algorithms are no exception. Even what may seem most harmless, like Spotify’s patent, has very specific effects on society.
Profiling cannot be used to create personalized services and products, but it works wonderfully when it comes to doing what it was born for: categorizing people based on common (generalized) characteristics, such as ethnicity, political opinions, religion, health status, etc.
When these tools are in the hands of States and big corporations (which basically work for the former), they become real weapons of mass manipulation: digital hammers that strike entire categories of people, united only by a few general characteristics.
The only way to avoid being hammered down is to recognize that we live in a world increasingly invaded and governed by profiling algorithms and technologies whose primary purpose is to manipulate people through the non-consensual invasion of their private sphere. Only by acknowledging this truth can one begin a path of liberation from algorithmic manipulation.