Neon, an app that paid users to record their phone calls and sold the audio to AI companies as training data, has been taken offline after a security flaw exposed users' phone numbers, call recordings, and transcripts. The app had climbed the Apple App Store charts quickly, but its servers failed to block unauthorized access to other users' data, and the developer pulled the service while a security audit and server patching are underway. The incident sharpens the privacy and legal concerns that already surrounded the app, from its broad license over users' audio and the potential for misuse, to call-recording consent laws that vary from state to state, and it underscores persistent gaps in security oversight across app marketplaces.