Frame rate debates (30fps vs. 60fps vs. higher) became gaming’s most pedantic yet passionate arguments. While PC gamers championed high refresh rates, console developers defended 30fps as “cinematic,” spawning mockery and technical-literacy wars in which smoothness became an ideological battlefield.
The Great Divide
Frame rate priorities split gaming:
- PC gamers: 60fps minimum, 144Hz+ ideal
- Console: 30fps acceptable for graphics fidelity
- Competitive: 240fps+ for esports
- “Can’t see past 24fps anyway”: a debunked myth (see The Human Eye Debate below)
“Cinematic” 30fps
Developers defending 30fps:
- The Order: 1886’s letterboxed 30fps, defended as “filmic,” becoming a punchline
- “More realistic motion blur”
- Graphics vs. performance trade-off
- Mockery ensued (“cinematic 15fps next?”)
Console Limitations
Hardware constraints forced compromises:
- PS4/Xbox One struggled to hit 1080p/60fps
- Resolution vs. frame rate debates (see the pixel-budget sketch below)
- Performance modes in the PS5/Series X era
- Players choosing the trade-off for themselves
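The trade-off is easy to see as back-of-envelope arithmetic: resolution and frame rate both multiply the number of pixels a GPU has to shade each second. The Python sketch below is illustrative only (shading cost is not strictly linear in pixel count), but it shows why dropping from 1080p to 900p was a common way to claw back a 60fps target.

```python
# Rough pixel-throughput comparison: resolution and frame rate compete
# for the same rendering budget (pixels shaded per second).
RESOLUTIONS = {
    "900p":  (1600, 900),
    "1080p": (1920, 1080),
    "4K":    (3840, 2160),
}

def pixels_per_second(res_name: str, fps: int) -> int:
    width, height = RESOLUTIONS[res_name]
    return width * height * fps

for res, fps in [("1080p", 30), ("1080p", 60), ("900p", 60), ("4K", 30)]:
    print(f"{res}@{fps}fps -> {pixels_per_second(res, fps) / 1e6:.0f} Mpx/s")
# 1080p@30fps -> 62 Mpx/s
# 1080p@60fps -> 124 Mpx/s  (double the fill cost of 30fps)
# 900p@60fps -> 86 Mpx/s    (the classic "drop resolution to hit 60" compromise)
# 4K@30fps -> 249 Mpx/s
```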
Competitive Advantage
Higher frame rates provided a measurable edge:
- Lower input lag (see the frame-time sketch below)
- Smoother target tracking
- 240fps+ as the de facto esports standard
- “Git gud” came to include upgrading hardware
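Most of the input-lag argument is frame-time arithmetic: a frame at N fps represents 1000/N milliseconds, and every pipeline stage that waits on a frame adds at least that much delay. A minimal sketch (illustrative; real latency also includes input polling, buffering, and display scanout):

```python
# Frame time shrinks as frame rate rises, so input is reflected on
# screen sooner. This ignores input polling, buffering, and display
# scanout, all of which add further latency on top.
def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

for fps in (30, 60, 120, 240):
    print(f"{fps:>3} fps: {frame_time_ms(fps):5.1f} ms per frame")
#  30 fps:  33.3 ms per frame
#  60 fps:  16.7 ms per frame
# 120 fps:   8.3 ms per frame
# 240 fps:   4.2 ms per frame
```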
The Human Eye Debate
Misinformation persisted:
- The “eye can’t see past 24/30fps” myth
- Scientific studies disproving it
- Anecdotal “I can’t tell the difference” testimonies
- Accusations that high-refresh benefits were placebo
In practice, most players could tell the difference in side-by-side A/B tests.
120Hz Console Era
PS5/Xbox Series X enabled:
- 120fps modes in select games
- HDMI 2.1 requirements for 4K/120 (see the bandwidth sketch below)
- TV upgrades necessary for many living rooms
- High refresh rates going mainstream
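A rough, illustrative calculation shows why the HDMI 2.1 requirement exists: counting only raw pixel data, with no blanking intervals or link-encoding overhead, 4K at 120Hz in 10-bit colour already exceeds HDMI 2.0’s 18 Gbps ceiling, while HDMI 2.1 raises the link to 48 Gbps.

```python
# Lower-bound bandwidth for uncompressed video (pixel data only; real
# links also carry blanking intervals and encoding overhead, so the
# true requirement is higher than this figure).
def video_gbps(width: int, height: int, fps: int, bits_per_pixel: int) -> float:
    return width * height * fps * bits_per_pixel / 1e9

# 4K @ 120 Hz, 10-bit RGB (30 bits per pixel)
print(f"{video_gbps(3840, 2160, 120, 30):.1f} Gbps")  # ~29.9 Gbps
# HDMI 2.0 tops out at 18 Gbps; HDMI 2.1 allows up to 48 Gbps, which is
# why 4K/120 output on PS5 and Series X needs an HDMI 2.1 port and TV.
```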
Diminishing Returns
The community broadly acknowledged diminishing returns (the frame-time arithmetic below shows why):
- 30→60fps: a huge difference
- 60→120fps: noticeable
- 120→240fps: mainly relevant for competitive play
- 240→360fps: minimal gain
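The intuition is that perception and latency track frame time, not frame rate: each doubling of fps halves an already small number of milliseconds. A quick sketch:

```python
# Each jump in frame rate saves less absolute time per frame than the
# previous one, which is where the "diminishing returns" feeling comes from.
for low, high in [(30, 60), (60, 120), (120, 240), (240, 360)]:
    saved_ms = 1000 / low - 1000 / high
    print(f"{low}->{high} fps: {saved_ms:.1f} ms saved per frame")
# 30->60 fps: 16.7 ms saved per frame
# 60->120 fps: 8.3 ms saved per frame
# 120->240 fps: 4.2 ms saved per frame
# 240->360 fps: 1.4 ms saved per frame
```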