Google Messages now automatically blurs NSFW photos and warns users before they open explicit media, with all processing done locally on the device to protect privacy. The feature also warns senders about the risks of sharing nude images. It is powered by Android's SafetyCore and is enabled by default for most users, with settings that both teens and adults can manage. The change aims to prevent accidental exposure and promote safer messaging practices.
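Google has not published SafetyCore's internals, but the flow this summary describes, classify locally and blur before display, can be sketched. The threshold and the source of the classifier score below are illustrative assumptions; only the Pillow blur call is concrete.

```python
# Minimal sketch of the blur-before-display pattern, assuming an on-device
# classifier produces an nsfw_score. The threshold is an illustrative value,
# not Google's, and nothing here touches the network.
from PIL import Image, ImageFilter

NSFW_THRESHOLD = 0.8   # assumed cutoff for "likely explicit"
BLUR_RADIUS = 40       # heavy enough that the preview is unreadable

def prepare_incoming_image(path: str, nsfw_score: float) -> Image.Image:
    """Return a blurred copy if the local classifier flagged the image, else the original."""
    image = Image.open(path)
    if nsfw_score >= NSFW_THRESHOLD:
        return image.filter(ImageFilter.GaussianBlur(radius=BLUR_RADIUS))
    return image
```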
Nvidia has released Chat With RTX, a free AI chatbot that runs locally on PCs with Nvidia RTX graphics cards, using the open-weight Mistral or Llama LLMs to search through local files and answer questions about them. The app supports a range of file formats, can pull in information from YouTube videos, and emphasizes user privacy by keeping sensitive data off cloud-based services. While the app is rough around the edges and crashed during testing, it represents a significant step toward cloud independence, putting generative AI capabilities directly on users' devices.
Nvidia has released a free demo of an AI chatbot called Chat with RTX, which runs locally on PCs with GeForce RTX GPUs and can be personalized with the user's own content. Powered by Nvidia's TensorRT-LLM software, the bot can be fed various file types and even YouTube videos, generating personalized responses without relying on cloud-based services and thus keeping user data on the machine. It requires an Nvidia GeForce RTX 30 Series GPU or higher and Windows 10 or 11, and uses retrieval-augmented generation with RTX acceleration to produce relevant answers from local files used as its dataset.
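Nvidia describes the pipeline only at the level of TensorRT-LLM plus retrieval-augmented generation, but the retrieval step can be sketched roughly as below. The all-MiniLM-L6-v2 embedding model and the run_local_llm helper are assumptions for illustration, not components of the actual demo.

```python
# Rough sketch of retrieval-augmented generation over a folder of local text files,
# in the spirit of Chat with RTX. The embedding model and run_local_llm() are
# illustrative assumptions, not Nvidia's actual pipeline.
from pathlib import Path
import numpy as np
from sentence_transformers import SentenceTransformer

embedder = SentenceTransformer("all-MiniLM-L6-v2")  # any small local embedding model

def load_chunks(folder: str, size: int = 800) -> list[str]:
    """Split every .txt file in the folder into fixed-size character chunks."""
    chunks: list[str] = []
    for path in Path(folder).glob("*.txt"):
        text = path.read_text(encoding="utf-8", errors="ignore")
        chunks += [text[i:i + size] for i in range(0, len(text), size)]
    return chunks

def run_local_llm(prompt: str) -> str:
    """Hypothetical stand-in for the locally hosted Mistral/Llama model."""
    raise NotImplementedError("wire this to a local inference runtime")

def answer(question: str, folder: str, top_k: int = 3) -> str:
    chunks = load_chunks(folder)
    chunk_vecs = embedder.encode(chunks, normalize_embeddings=True)
    query_vec = embedder.encode([question], normalize_embeddings=True)[0]
    best = np.argsort(chunk_vecs @ query_vec)[-top_k:]   # cosine similarity via dot product
    context = "\n---\n".join(chunks[i] for i in best)
    prompt = f"Answer using only this context:\n{context}\n\nQuestion: {question}"
    return run_local_llm(prompt)   # everything stays on the local machine
```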
NVIDIA has released a demo version of a chatbot, Chat with RTX, that runs locally on Windows PCs with NVIDIA GeForce RTX 30 Series GPUs or higher, letting users process sensitive data without sharing it with a third party or needing an internet connection. The chatbot can read local files and documents, create summaries, answer questions, and incorporate YouTube videos and playlists for contextual queries. While it has steep hardware requirements and, as a demo, potential bugs, it represents a significant step toward a contextual digital assistant built around the user's own data.
Nvidia has released an early version of Chat with RTX, an AI chatbot that runs locally on PCs with RTX 30- or 40-series GPUs and can be fed YouTube videos and documents for summarization and question answering. Despite some bugs and rough edges, it shows promise for data research and analysis, particularly for journalists: it handles YouTube videos and local documents with near-instant responses, free of the lag of cloud-based chatbots. It remains an early developer demo, however, with known issues and limitations, and it requires significant system resources to run.
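The YouTube workflow described here amounts to summarizing a transcript in pieces small enough for the model's context window. A minimal map-reduce sketch, assuming the transcript has already been saved locally and that summarize_locally is a hypothetical call into the on-device model:

```python
# Illustrative map-reduce summarization of a long transcript, in the spirit of
# Chat with RTX's YouTube feature. summarize_locally() is a hypothetical stand-in
# for the local Mistral/Llama model; the chunk size is an arbitrary illustrative value.
def summarize_locally(prompt: str) -> str:
    """Hypothetical call into a locally running LLM."""
    raise NotImplementedError("wire this to a local inference runtime")

def chunk(text: str, size: int = 4000) -> list[str]:
    """Split the transcript into pieces that fit the model's context window."""
    return [text[i:i + size] for i in range(0, len(text), size)]

def summarize_transcript(path: str) -> str:
    transcript = open(path, encoding="utf-8").read()
    # Map step: summarize each chunk independently.
    partials = [summarize_locally(f"Summarize this part of a video transcript:\n{c}")
                for c in chunk(transcript)]
    # Reduce step: merge the partial summaries into a single answer.
    merged = "\n".join(partials)
    return summarize_locally(f"Combine these partial summaries into one coherent summary:\n{merged}")
```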