Distillation

All articles tagged with #distillation

Anthropic alleges Chinese firms used 16M Claude prompts to clone capabilities
technology · 3 days ago

Anthropic says three Chinese AI labs (DeepSeek, Moonshot AI, and MiniMax) ran industrial-scale distillation campaigns against Claude, generating over 16 million exchanges through roughly 24,000 fraudulent accounts and proxy services. Each campaign targeted different Claude capabilities: DeepSeek focused on reasoning and censorship-safe responses (≈150,000 exchanges), Moonshot AI on agentic reasoning, tool use, coding, and vision (≈3.4 million), and MiniMax on agentic coding and tool use (≈13 million). The prompts were crafted to harvest capabilities for training rival models while evading detection, which Anthropic frames as a national-security concern because distilled clones could reproduce those capabilities without safeguards. The company says it has strengthened its defenses and detection, characterizing the activity as illicit distillation rather than ordinary user abuse; Google reported similar attacks earlier.

Google Accuses Copycats of Distilling Gemini While Scrutiny of Its Own Data Scraping Grows
technology · 13 days ago

Google says unknown actors are attempting to clone its Gemini AI through distillation, issuing thousands of prompts to replicate its reasoning, and frames the effort as intellectual-property theft, a stance that sits awkwardly alongside the company's own history of scraping data for training. Google points to "private sector entities" and researchers as possible culprits, and says real-time detection limited the attack's impact, set against the backdrop of an AI arms race and mounting pressure to monetize models.

Distillation: Making AI Models More Efficient and Affordable
technology · 7 months ago

DeepSeek's use of knowledge distillation, training a smaller model on the outputs of a larger one, has sparked controversy, yet the technique is standard practice in AI development. Introduced in 2015 by researchers at Google as a way to compress ensembles into a single efficient model, distillation produces smaller, cheaper, and faster models by transferring "dark knowledge" (the teacher model's soft output probabilities) to a student model. It has since become a fundamental tool, letting companies such as Google, OpenAI, and Amazon deploy powerful models more efficiently, and it remains an active area of research and application.
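The core mechanism can be illustrated with a minimal numpy sketch, loosely following the temperature-scaled loss from the original 2015 work; the function names and example logits here are illustrative, not any lab's actual training code:

```python
import numpy as np

def softmax(logits, temperature=1.0):
    # Higher temperature "softens" the distribution, exposing the
    # teacher's relative preferences among non-top classes.
    z = logits / temperature
    z = z - z.max(axis=-1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    # KL divergence between temperature-softened teacher and student
    # outputs, scaled by T^2 to keep gradient magnitudes consistent.
    p_teacher = softmax(teacher_logits, temperature)
    p_student = softmax(student_logits, temperature)
    kl = np.sum(p_teacher * (np.log(p_teacher) - np.log(p_student)), axis=-1)
    return (temperature ** 2) * kl.mean()

# Toy logits over three classes: the soft teacher probabilities carry
# "dark knowledge" (inter-class similarities) that one-hot labels discard.
teacher = np.array([[4.0, 1.5, 0.2]])
student = np.array([[2.0, 1.0, 0.5]])
print(distillation_loss(student, teacher))
```

In practice this term is usually mixed with a standard cross-entropy loss on the true labels, and the student is trained by gradient descent to drive the divergence toward zero.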

Unveiling Quantum Secrets: Harnessing Undetected Light for Imaging and Insights into Photochemical Processes
science-and-technology · 2 years ago

Researchers have experimentally demonstrated quantum imaging distillation with undetected light (QIUL), a method that recovers high-quality images of objects even when noise exceeds the signal of interest. The technique uses photon pairs: one photon illuminates the object while only its partner is detected, and an interferometric modulation scheme distills the quantum image from the noise. The team verified the method's performance even under extreme noise intensities, advancing quantum imaging and its potential applications in fields such as light detection and ranging (lidar).