This is a subreddit preview page.
r/mlscaling
14k members
r/mlscaling is a subreddit with 14k members, distinguished by the large size of its community.
ML/AI/DL research on approaches using large models, datasets, and compute: "more is different"
Popular Themes in r/mlscaling
#1
News
: "'Microsoft and OpenAI Plot $100 Billion Stargate AI Supercomputer', The Information"
34 posts
#2
Pain & Anger
: "Data movement bottlenecks could limit LLM scaling beyond 2e28 FLOP, with a "latency wall" at 2e31 FLOP. We may hit these in ~3 years."
2 posts
#3
Ideas
: "Introducing OpenAI o3 and o4-mini"
1 post
#4
Opportunities
: "Thinking Machines is aiming to raise a $1 billion funding round"
1 post
Popular Topics in r/mlscaling
#1
AI
: "'The market plausibly expects AI software to create trillions of dollars of value by 2027', Benjamin Todd"
55 posts
#2
Scaling
: "Elon Musk reportedly cancels mass-market car model to free up Tesla resources for giant datacenter for scaling up self-driving cars"
49 posts
#3
Training
: "'The Longest Training Run: Training runs of large machine learning systems are likely to last less than 14-15 months. This is because longer runs will be outcompeted by runs that start later' (wait equation)"
34 posts
#4
Language Models
: "'Large Language Models are getting bigger and better: Can they keep improving forever?', The Economist"
25 posts
#5
Compute
: "'AI progress is about to speed up', Ege Erdil (the compute drought is ending as LLMs finally scale to 100k+ H100 training runs)"
24 posts
#6
LLM
: "NSA research director Gilbert Herrera: the NSA can't create SOTA LLMs because it doesn't have the data or budget"
22 posts
#7
Model
: "Sam Altman on Lex Fridman's podcast: "We will release an amazing new model this year. I don’t know what we’ll call it." Expects the delta between (GPT) 5 and 4 will be the same as between 4 and 3."
22 posts
#8
OpenAI
: "'Microsoft and OpenAI Plot $100 Billion Stargate AI Supercomputer', The Information"
21 posts
#9
Reasoning
: "LIMR: Less is More for RL Scaling, Li et al. 2025 ["[P]recise sample selection, rather than data scale, may be the key to unlocking enhanced reasoning capabilities"]"
20 posts
#10
Models
: "Epoch AI: Total installed Nvidia GPU computing power is growing by 2.3x per year"
18 posts
Member Growth in r/mlscaling
Yearly
+5k members (61.9%)
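The growth figure above is presumably computed as net gain divided by the member count one year earlier. A minimal sketch of that arithmetic (the exact counts behind the rounded "+5k" and "14k" are not shown on this page, so the numbers below are illustrative):

```python
def yearly_growth_pct(current: int, gain: int) -> float:
    """Percent growth over the past year, given the current member
    count and the net members gained in that year."""
    previous = current - gain  # member count one year ago
    return gain / previous * 100

# Illustrative: ~14,000 current members with a ~5,353 yearly gain
# reproduces the ~61.9% figure shown above.
print(round(yearly_growth_pct(14_000, 5_353), 1))  # → 61.9
```

Note that the percentage is relative to the *earlier* count, which is why a "+5k" gain on a 14k community reads as ~62% rather than ~36%.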
Similar Subreddits to r/mlscaling

r/ArtificialInteligence
1.4M members
184.0% / yr

r/ChatGPT
10.0M members
91.0% / yr
r/LanguageTechnology
55k members
16.4% / yr

r/LargeLanguageModels
4k members
43.8% / yr

r/LLMDevs
78k members
999.1% / yr

r/LocalLLaMA
454k members
194.1% / yr

r/MachineLearning
3.0M members
2.6% / yr

r/machinelearningnews
92k members
123.4% / yr

r/OpenAI
2.3M members
73.1% / yr

r/singularity
3.7M members
61.3% / yr