'CIA has your fingerprints if you used ChatGPT palm reader': viral meme reflects real anxiety
Did you ask ChatGPT to read your palm or analyse your face? If you have, the internet says the CIA now has your biometric details, such as fingerprints and face scans. The claim is outlandish, of course. But it reflects real anxiety about privacy and AI tools like ChatGPT.
by Divya Bhati · India Today

In Short
- Users are asking ChatGPT to read their palms
- The AI analyses the palm and generates a report
- Many say that giving biometric data to ChatGPT is a privacy risk
A trend is going viral on social media in which people upload photos of their palms and faces to ChatGPT for a reading and analysis. Alongside it, another claim has spread: “Congratulations. Now the CIA has your face and palm data.” For real? It is hyperbole more than reality. But the claim that the CIA now holds your biometric data because you had some fun with ChatGPT does reflect genuine anxiety about data privacy and AI tools like ChatGPT and Gemini.
As the trend of uploading palm and face images to ChatGPT became popular, many privacy-minded people started sounding the alarm. Using a viral meme photo, in which a person whispers into the ear of OpenAI CEO Sam Altman, one X user captioned the image: "Sir, they already gave us their eyes & retina data during the Ghibli trend. Now they're voluntarily uploading high-res palm photos for AI palm reading."
Well, the data sharing is indeed real. And so are the privacy concerns. Behind the memes lies genuine unease about handing private details to AI tools like ChatGPT, Gemini and Claude. The CIA part is a joke, though in an age when conspiracy theories keep coming true, who really knows? Governments across the world do work with tech companies behind the scenes.
The latest palm and face trend was sparked when OpenAI rolled out ChatGPT Images 2.0. Within a few hours of its release, as expected, users began testing the model in creative ways. Among all the things users have been trying, one trend has stood out: palm reading. Users started uploading photos of their hands and asking ChatGPT to analyse them. In response, they received detailed palm readings. Naturally, the trend took off.
We tried it too, and like many others, we were impressed by how detailed and polished the output looked. That said, the readings did feel somewhat generic and sycophantic, with the AI offering analysis that was optimistic and pleasing to read. Still, it came across as a fun way to see how the machine can “see” and analyse.
Internet fears CIA connection
As the trend became popular, privacy-minded users began raising the alarm. Across platforms like X (formerly Twitter) and Reddit, many are decrying the trend. Some users are even joking that OpenAI might end up “sharing” biometric data with the CIA.
Several viral posts point out that a palm photo isn’t just a simple image: it contains detailed biometric data, including fingerprints and unique hand geometry. Unlike passwords, these identifiers cannot be changed if compromised. A community note under a viral ChatGPT palm reading post on X highlights that “uploading palm photos to AI tools can expose extractable fingerprints and other biometric data that cannot be changed if compromised or misused.”
Some of this is being discussed through memes, while others are raising more serious concerns.
Some web users believe that intelligence agencies, particularly the CIA, could be monitoring AI interactions or gaining access to such data. Interestingly, this is not the first time we have seen chatter about the possibility that big American AI companies like OpenAI or Anthropic share user data with intelligence agencies.
To be clear, there is no evidence to support these claims. The chatter is mostly about how these companies handle user data, and how much personal data users are willing to share with AI tools, often without fully understanding the implications.