r/mlscaling

14k members
r/mlscaling is a subreddit with 14k members; its distinguishing quality is its large community size.
ML/AI/DL research on approaches using large models, datasets, and compute: "more is different"

Popular Themes in r/mlscaling

#1
News
: "Trump allies draft AI executive order, includes "Manhattan Projects" for military AI"
20 posts
#2
Pain & Anger
: "Data movement bottlenecks could limit LLM scaling beyond 2e28 FLOP, with a "latency wall" at 2e31 FLOP. We may hit these in ~3 years."
2 posts
#3
Ideas
: ""Does Reinforcement Learning Really Incentivize Reasoning Capacity in LLMs Beyond the Base Model?", Yue et al 2025 (RL training remains superficial: mostly eliciting pre-existing capabilities hidden in base models)"
2 posts
#4
Money Talk
: ""Google announces $250/month AI Ultra subscription plan" ($50 more than OA Pro)"
1 post
#5
Opportunities
: "Thinking Machines is aiming to raise a $1 billion funding round"
1 post

Popular Topics in r/mlscaling

#1

AI

: ""The market plausibly expects Ai software to create trillions of dollars of value by 2027", Benjamin Todd"
49 posts
#2

Scaling

: "Musk diverts 12k H100s from Tesla to Twitter; Nvidia comments Musk public statements on GPU Scaling "conflict with bookings & forecasts""
36 posts
#3

LLM

: ""Llms may be fundamentally incapable of fully general reasoning, and if so, short timelines are less plausible.""
21 posts
#4

Training

: ""The Longest Training Run: Training runs of large machine learning systems are likely to last less than 14-15 months. This is because longer runs will be outcompeted by runs that start later" (wait equation)"
21 posts
#5

Language Models

: ""DeepMind is holding back release of AI research to give Google an edge" (Ars Technica) {'I cannot imagine us putting out the transformer papers for general use now'}"
20 posts
#6

Model

: "Large Language Monkeys: Scaling Inference Compute with Repeated Sampling, Brown et al. 2024 [Given sufficient number of attempts, smaller Models can reach parity with larger Models in solving tasks. Pareto frontier for compute cost varies from task to task]"
19 posts
#7

OpenAI

: "Openai co-founder Sutskever's new safety-focused AI startup SSI raises $1 billion"
19 posts
#8

Reasoning

: ""Reasoning to Learn from Latent Thoughts" Ruan et al 2025"
17 posts
#9

Language

: ""A Benchmark for Learning to Translate a New Language from One Grammar Book", Tanzer et al 2023 (efficiency of learning unknown Language from textbook scales drastically with model size)"
16 posts
#10

GPU

: "Musk diverts 12k H100s from Tesla to Twitter; Nvidia comments Musk public statements on Gpu scaling "conflict with bookings & forecasts""
14 posts

Member Growth in r/mlscaling

Yearly
+5k members (54.2%)

Similar Subreddits to r/mlscaling

r/agi

73k members
68.9% / yr

r/ArtificialInteligence

1.5M members
157.7% / yr

r/ChatGPT

10.9M members
75.6% / yr

r/LanguageTechnology

57k members
18.0% / yr

r/LLMDevs

95k members
946.6% / yr

r/LocalLLaMA

497k members
169.5% / yr

r/MachineLearning

3.0M members
2.8% / yr

r/machinelearningnews

103k members
121.4% / yr

r/OpenAI

2.4M members
51.6% / yr

r/singularity

3.7M members
39.7% / yr