r/mlscaling
13k members
r/mlscaling is a subreddit with 13k members, distinguished by its large community size.
ML/AI/DL research on approaches using large models, datasets, and compute: "more is different"
Popular Themes in r/mlscaling
#1
News
: ""Microsoft and OpenAI Plot $100 Billion Stargate AI Supercomputer", The Information"
34 posts
#2
Pain & Anger
: "Data movement bottlenecks could limit LLM scaling beyond 2e28 FLOP, with a "latency wall" at 2e31 FLOP. We may hit these in ~3 years."
2 posts
#3
Opportunities
: "Thinking Machines is aiming to raise a $1 billion funding round"
1 post
Popular Topics in r/mlscaling
#1
AI
: "'The market plausibly expects AI software to create trillions of dollars of value by 2027', Benjamin Todd"
56 posts
#2
Scaling
: "Elon Musk reportedly cancels mass-market car model to free up Tesla resources for giant datacenter for Scaling up self-driving cars"
47 posts
#3
Training
: ""The Longest Training Run: Training runs of large machine learning systems are likely to last less than 14-15 months. This is because longer runs will be outcompeted by runs that start later" (wait equation)"
39 posts
#4
Language Models
: ""Large Language Models are getting bigger and better: Can they keep improving forever?", The Economist"
26 posts
#5
Model
: "Sam Altman on Lex Fridman's podcast: "We will release an amazing new Model this year. I don’t know what we’ll call it." Expects the delta between (GPT) 5 and 4 will be the same as between 4 and 3."
25 posts
#6
Compute
: ""AI progress is about to speed up", Ege Erdil (the Compute drought is ending as LLMs finally scale to 100k+ H100 training runs)"
24 posts
#7
LLM
: "NSA research director Gilbert Herrera: the NSA can't create SOTA LLMs because it doesn't have the data or budget"
22 posts
#8
OpenAI
: "'Microsoft and OpenAI Plot $100 Billion Stargate AI Supercomputer', The Information"
19 posts
#9
Reasoning
: "LIMR: Less is More for RL Scaling, Li et al. 2025 ["[P]recise sample selection, rather than data scale, may be the key to unlocking enhanced Reasoning capabilities"]"
17 posts
#10
Language
: ""A Benchmark for Learning to Translate a New Language from One Grammar Book", Tanzer et al 2023 (efficiency of learning unknown Language from textbook scales drastically with model size)"
15 posts
Member Growth in r/mlscaling
Yearly
+5k members (68.0%)
Similar Subreddits to r/mlscaling

r/ArtificialInteligence
1.4M members
193.2% / yr

r/ChatGPT
9.7M members
95.8% / yr

r/LargeLanguageModels
4k members
35.9% / yr

r/LLMDevs
70k members
1007.9% / yr

r/LocalLLaMA
423k members
201.5% / yr

r/MachineLearning
3.0M members
2.6% / yr

r/machinelearningnews
88k members
120.4% / yr

r/OpenAI
2.3M members
80.9% / yr

r/PromptEngineering
106k members
590.0% / yr

r/singularity
3.7M members
70.1% / yr