We are already living in what was once imagined as a futuristic world. The fact that we take so much of it for granted suggests we have yet to fully mature in the areas of consciousness and technology.
However, Microsoft is nudging human civilization toward a future where people can chat with those who are no longer alive. Microsoft calls this new technology the Hypothetical Chatbot You: it uses chats, texts, voice notes, calls, and online profiles to mimic an individual's personality, and even their face, to entertain people or answer questions left unasked.
According to the US patent filing, the Hypothetical Chatbot You uses the social data of a person who might be a friend, a spouse, a celebrity, a fictional character, a historical figure, or even a random entity. The technology then uses this data to render a talking version of that person with a photorealistic face and voice.
The Hypothetical Chatbot You is more than a chatbot
According to the United States patent filing, the Hypothetical Chatbot You can use psychographic data, social data, and crowdsourced data to create a life-like model of a person's personality. Combined with 3D rendering, that user data can produce a realistic version of someone who never existed in the real world, or who died fifty years ago.
But what about people who are not on social media? Hypothetically, some people have yet to join the internet at all. For those cases, the Hypothetical Chatbot You can fall back on crowdsourced data and the social media behavior of others to construct a life-like version of the individual. And unlike regular chatbots, which can hold a conversation with human-like humor and sense, the Hypothetical Chatbot You goes one step further: instead of relying only on openly available data on the internet, it draws on a person's private data stored on servers to build the host's personality.
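To make the idea concrete, here is a purely illustrative sketch of such a pipeline. None of this comes from the patent: the `PersonaProfile` class, the word-frequency "personality index," and the templated reply are all invented stand-ins for what would, in reality, be large language and voice models trained on a person's chats and posts.

```python
from collections import Counter
from dataclasses import dataclass, field

@dataclass
class PersonaProfile:
    """A toy stand-in for the 'personality index' such a system might build."""
    name: str
    favorite_topics: list = field(default_factory=list)
    word_counts: Counter = field(default_factory=Counter)

def build_persona(name, messages):
    """Aggregate a person's messages into a crude persona profile.

    A real system would train models on chats, voice notes, and social
    posts; here we simply count word frequencies as a placeholder.
    """
    profile = PersonaProfile(name=name)
    for msg in messages:
        for word in msg.lower().split():
            if len(word) > 4:  # skip short filler words
                profile.word_counts[word] += 1
    # The most frequent long words act as the persona's "signature" topics.
    profile.favorite_topics = [w for w, _ in profile.word_counts.most_common(3)]
    return profile

def respond_as(profile, prompt):
    """Produce a trivially templated reply 'in the voice of' the persona."""
    topic = profile.favorite_topics[0] if profile.favorite_topics else "life"
    return f"{profile.name} here! You asked about '{prompt}'. I always loved talking about {topic}."

messages = [
    "The garden roses bloomed early this spring",
    "Spent the morning in the garden again",
    "Nothing beats fresh garden tomatoes",
]
persona = build_persona("Grandma", messages)
print(respond_as(persona, "your hobbies"))
# → Grandma here! You asked about 'your hobbies'. I always loved talking about garden.
```

Even this toy version hints at the privacy problem discussed below: the profile is built entirely from messages the subject may never have consented to share.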
Privacy concerns and data sharing with the Hypothetical Chatbot You
The United States Patent and Trademark Office may have granted the patent, but that does not in itself permit Microsoft to use, share, or sell the technology. According to Jennifer Rothman, a law professor at the University of Pennsylvania, the Hypothetical Chatbot You could create conflicts around the right to privacy, defamation, copyright infringement, and false endorsements.
In her view, the Hypothetical Chatbot You could be used as an impersonator, leading to social, emotional, and financial harm, much as we have seen with celebrity deepfake videos circulating on the internet. Another major concern is finance and banking. Even if such a technology is believed not to infringe on users' privacy, the stakes are too high: if someone finds a loophole in it, the consequences could be disastrous for the economy, for social life, and even for matters of war and peace.
Considering these repercussions, the world is not mature enough for the Hypothetical Chatbot You. In fields from banking, the military, politics, and entertainment to space, such technology might reduce our dependence on humans, but the risks remain too high and the cost of damage could be unbearable. Imagine deploying it in an office, only for the chatbot to reveal something from a person's past or share sensitive details from their private chats with outsiders. Worse still, the bot could incite unrest by surfacing stored data from someone's account that was never, under any circumstances, meant to be made public.