DeepSeek Quietly Tests Updated Model with Recent Knowledge
DeepSeek is quietly testing an updated AI model that incorporates more recent knowledge, potentially improving on the current public release.
Someone noticed DeepSeek quietly rolled out a new model in limited testing that knows about recent stuff like Gemini 2.5 Pro without needing web search.
The updated model has a 1M token context window and fresher training data than the public V3. Only certain accounts can access it right now on chat.deepseek.com and their app - looks like they’re doing a gradual rollout.
How to check if you have access: Just ask it something like “Do you know about Gemini 2.5 Pro?” If it gives detailed info without searching, you’re in the test group.
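The access check above can be sketched as a small script. The probe question comes from the article; the stale-reply markers, length threshold, and sample replies are illustrative assumptions, not real model output (the test model is only reachable through chat.deepseek.com and the app, so wiring this to an API is left out):

```python
# Hypothetical sketch: classify a model's reply to the probe question to guess
# whether the account is in DeepSeek's test group. The probe phrase is from the
# article; everything else here is an illustrative assumption.

PROBE = "Do you know about Gemini 2.5 Pro?"

# Phrases that typically signal the model's training data predates the topic.
STALE_MARKERS = (
    "knowledge cutoff",
    "as of my last update",
    "i'm not aware",
    "i am not aware",
    "no information about",
)

def has_fresh_knowledge(reply: str) -> bool:
    """Heuristic: a detailed answer without cutoff disclaimers suggests
    the updated model; a disclaimer suggests the public V3."""
    text = reply.lower()
    if any(marker in text for marker in STALE_MARKERS):
        return False
    # A substantive answer should at least name the model and say something.
    return "gemini 2.5" in text and len(text) > 80

# Illustrative replies (made up for the example, not real model output):
fresh = ("Gemini 2.5 Pro is Google's reasoning-focused model with a large "
         "context window and strong coding benchmark results.")
stale = "I'm not aware of a model called Gemini 2.5 Pro; my knowledge cutoff..."

print(has_fresh_knowledge(fresh))  # True  -> likely in the test group
print(has_fresh_knowledge(stale))  # False -> still on the public V3
```

The heuristic is deliberately loose; in practice you would just read the reply yourself, but the same idea works if you want to script the check across several accounts.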
Turns out this probably isn’t the full V4 release people were expecting - more like a V4 Lite or intermediate update. The main giveaway is the context size (1M vs rumors of much larger) and the limited rollout strategy. Still, having up-to-date knowledge baked in saves time compared to waiting for web searches on every recent topic.