Scoring, the practice of aggregating data into an index used for decision-making, is not a new phenomenon; it has deep roots in the culture of companies and organizations. But augmented by the wealth of data now available about individuals and by artificial intelligence, scoring is undergoing a complete transformation. Amid fears of being barred from a flight over a prior offence, or of missing out on a loan because of a shadowy mass of personal data compiled by artificial intelligence, the practice has raised concerns that its applications could limit personal freedom. That is why the AXA Foresight Trendbook has selected it as one of the main drivers of socio-economic transformation at the center of its study. The report's release provided an occasion to talk with Daniel Kaplan, co-founder and CEO of the Next-Generation Internet Foundation (FING), who views scoring in the context of a confidence crisis in the digital age.
How would you define scoring? What makes it a major socio-economic trend in the years ahead?
We can define scoring as the creation of a general index by an organization from collected data. The index gives the organization information about a situation or a user, which it then feeds into its decision-making. In itself, the practice is neither new nor futuristic; banks have been using scores based on aggregated data for more than 20 years. What has fundamentally transformed the practice is the ever-growing wealth of data available today. That is what has enabled organizations to apply this tool in a wide range of fields, and also in less transparent ways.
So that means advances in technology have brought about this transformation and made scoring more widespread?
I want to emphasize an important point: it's not the technology (Big Data, artificial intelligence, etc.) that is changing things, but the way organizations decide to use it. The way scoring is used today reflects a strategy aimed at creating a data asymmetry stacked in the organization's favor. The goal is to amass a wealth of data and to use it in relations with users without giving users any say in the matter. But I repeat: these behaviors are not dictated by the technology; they are strategies put in place by the organizations. To me, this speaks to a profound crisis of confidence.
Does that mean we are assigning scores to cover up a lack of confidence?
The problem is more complex than that. In my research on confidence in the digital age, I noticed one trend emerging inside organizations. First of all, there is a lot of confusion surrounding the notion of confidence itself. Ordinarily, if I have confidence in someone, I won't monitor their actions or verify the information they give me. Unfortunately, organizations set things up to avoid even the slightest uncalculated risk. As a result, the exact opposite of confidence is taking root in the digital realm: a strong climate of distrust towards individuals is developing in organizations' technological practices and in the systems they put in place.
Does it also speak to a transformation in the relationship between individuals and organizations?
It's true that some organizations use digital tools to reduce the amount of human contact they have with their customers or users. Between these dehumanized relationships and decisions based on data collected without users' knowledge, organizations are placing more pressure on individuals. Scoring is part and parcel of this new relationship, in which people vanish behind an unforgiving logic and a script written by the organization.
What is the forecast for this trend? What actions can we take today?
To restore confidence and avoid going down the dark path of dehumanization, organizations need to explain their scoring methodologies and maintain a genuine dialogue with users. It makes sense for companies to use data. Privacy concerns should focus more on autonomy and agency than on secrecy: it's less about knowing what companies know about us, and more about whether that knowledge is used to make decisions without consulting us. One interesting possibility is to give users access to their own data, enabling them to anticipate or prevent the decisions companies make about them. This should open up a vast market of services for users.