Can AI recommendation algorithms provoke new interests instead of recycling familiar preferences?
Recommendation algorithms analyze our past behavior and reproduce it as future possibilities. Netflix keeps proposing similar films, LinkedIn job suggestions replicate past roles, social media platforms connect us with people like ourselves, and Tinder surfaces what is effectively the same person under a different name. We are trapped inside who we are. Accurately replicating established preferences delivers efficiency and convenience, but it also inhibits the discovery of new interests.
The goal is to re-engineer AI recommendations so that they provoke curiosity and new interests instead of merely echoing those already established.
On one side, disruptive recommendations must break the flow of similarity, the logic of filtering for nearness or resemblance to past choices. There must be discontinuity with established interests.
On the other side, recommendations must be engaging: they must appeal to users despite being atypical and pointing toward unexplored possibilities.
The industry standard for recommendations is collaborative filtering, meaning that recommendations are produced through a logic of resemblance. If two users have enjoyed similar movies in the past, and one of these users enjoys a new offering, that offering will be recommended to the other user.
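For concreteness, this logic of resemblance can be sketched as minimal user-based collaborative filtering. The ratings data, user names, and thresholds below are all hypothetical, chosen only to make the mechanism visible:

```python
# Minimal user-based collaborative filtering sketch (hypothetical data).
from math import sqrt

ratings = {
    "ana":  {"heat": 5, "alien": 4, "amelie": 1},
    "ben":  {"heat": 5, "alien": 5, "drive": 4},
    "cara": {"amelie": 5, "before_sunrise": 4, "heat": 1},
}

def cosine(u, v):
    """Cosine similarity over the items two users have both rated."""
    shared = set(u) & set(v)
    if not shared:
        return 0.0
    dot = sum(u[i] * v[i] for i in shared)
    return dot / (sqrt(sum(r * r for r in u.values())) *
                  sqrt(sum(r * r for r in v.values())))

def recommend(user, ratings):
    """Suggest unseen items the *most similar* other user rated highly."""
    _, nearest = max((cosine(ratings[user], ratings[o]), o)
                     for o in ratings if o != user)
    seen = set(ratings[user])
    return [item for item, r in ratings[nearest].items()
            if item not in seen and r >= 4]

print(recommend("ana", ratings))  # ben is nearest; suggests ["drive"]
```

The core move is the `max` over similarity: the recommendation pool is drawn from whoever most resembles you, which is exactly the recycling of familiar preferences described above.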
The curiosity engine employs antagonistic filtering and explainability. Antagonistic filtering begins by identifying users who are significantly different. Then, among those diverse users, a strand of resonance is located. Resonance is not similarity, but a connection across differences. White wine is not similar to fish, and it is not similar to steak, but it resonates with one more than the other. Transferred to recommenders, what is sought are resonances between otherwise distinct users. From there, interests can be recommended back and forth between them.
The idea is that such recommendations will be unfamiliar, since they emerge from very different kinds of users, yet engaging, because a resonance connects the two.
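One way to operationalize this is sketched below. It is a hypothetical heuristic, not the project's specified algorithm: resonance is modeled as a single item that both users rated highly, and the partner chosen is the *least* similar user who still shares such an item. All data and thresholds are invented for illustration:

```python
# Sketch of antagonistic filtering (hypothetical data and heuristic):
# seek the *least* similar user who still shares one strongly liked
# item -- the "resonance" -- and recommend their favorites across it.
from math import sqrt

ratings = {
    "ana":  {"heat": 5, "alien": 4, "amelie": 1},
    "ben":  {"heat": 5, "alien": 5, "drive": 4},
    "cara": {"amelie": 5, "before_sunrise": 4, "heat": 4, "tampopo": 5},
}

def cosine(u, v):
    shared = set(u) & set(v)
    if not shared:
        return 0.0
    dot = sum(u[i] * v[i] for i in shared)
    return dot / (sqrt(sum(r * r for r in u.values())) *
                  sqrt(sum(r * r for r in v.values())))

def antagonistic_recommend(user, ratings, min_resonance=4):
    candidates = []
    for other in ratings:
        if other == user:
            continue
        shared = set(ratings[user]) & set(ratings[other])
        # Resonance: an item both users rated highly, regardless of
        # how dissimilar their overall profiles are.
        resonant = [i for i in shared
                    if ratings[user][i] >= min_resonance
                    and ratings[other][i] >= min_resonance]
        if resonant:
            candidates.append((cosine(ratings[user], ratings[other]),
                               other, resonant[0]))
    if not candidates:
        return []
    _, partner, bridge = min(candidates)  # most dissimilar resonant user
    seen = set(ratings[user])
    return [item for item, r in ratings[partner].items()
            if item not in seen and r >= min_resonance]

print(antagonistic_recommend("ana", ratings))
```

Note the inversion relative to collaborative filtering: the `min` replaces the `max`, so the recommendation pool comes from a dissimilar user, while the resonant item guarantees the bridge is not arbitrary.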
Finally, the reason behind each offering will be explained. This empowers users to steer the experiment in novel preferences, which in turn increases engagement.
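As a hypothetical sketch, an explanation could be generated from the resonant item that bridged the two users (the wording and item names here are illustrative, not a specified interface):

```python
# Hypothetical explanation template for a resonance-based suggestion.
def explain(item, bridge):
    """Tell the user why an unfamiliar item is being offered."""
    return (f"Suggested: {item}. Why: someone with very different taste "
            f"shares your enthusiasm for {bridge}, and this is one of "
            f"their favorites.")

print(explain("tampopo", "heat"))
```

Surfacing the bridge item makes the provocation legible: the user sees that the suggestion is not random but anchored in something they already love.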
Technical conclusion: the recommendation strategy is provocation, not accuracy. It seeks to provoke new interests rather than accurately offer recommendations that resemble past preferences.
By provoking new interests and possibilities, the Curiosity Engine project increases human freedom, contributes to an open and diverse society, and helps platforms retain their users.