10 years of EA SEED: ray tracing, smarter AI, and new pipelines
Electronic Arts is marking the 10-year anniversary of SEED – its internal research group focused on future-facing game technology. The team outlined how prototype work in rendering, animation, content pipelines, and AI has moved into production across multiple EA titles. The update highlights early ray tracing research, real-time global illumination at stadium scale, machine-learning deformers, and reinforcement learning applied to gameplay systems. SEED also points to new LightStage-powered face capture workflows and audio-driven facial animation used in recent projects.
Real-time realism: lighting, reflections and animation

SEED’s early project PICA PICA explored ray tracing for real-time applications, demonstrating physically based lighting and reflections years before the technique became standard in games. The group also collaborated with Frostbite on GIBS (surfel-based global illumination), a system that brings real-time indirect lighting to large, open environments. According to EA, GIBS is implemented in EA SPORTS College Football 25, enabling more than 150 stadiums with dynamic day-night cycles, all without prebaked lighting.
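The core surfel idea can be illustrated with a minimal sketch (all names and parameters below are hypothetical, not the Frostbite/GIBS implementation): indirect lighting is cached at sparse surface points and blended in over frames, so the scene converges to stable global illumination without any prebaked lightmaps.

```python
import numpy as np

class Surfel:
    """Toy surfel: a surface point that caches indirect-light samples."""

    def __init__(self, position, normal):
        self.position = np.asarray(position, dtype=float)
        self.normal = np.asarray(normal, dtype=float)
        self.irradiance = 0.0  # cached indirect light, refined each frame

    def update(self, new_sample, blend=0.1):
        # Exponential moving average: each new ray-traced sample is blended
        # into the cache, trading a little lag for temporal stability.
        self.irradiance = (1.0 - blend) * self.irradiance + blend * new_sample
        return self.irradiance

s = Surfel(position=[0.0, 0.0, 0.0], normal=[0.0, 1.0, 0.0])
for sample in (1.0, 1.0, 1.0):   # three frames of identical light samples
    s.update(sample)
# the cached irradiance converges toward the sampled value over frames
```

Because the cache persists across frames, only a few rays per surfel per frame are needed, which is what makes this style of indirect lighting feasible at stadium scale.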
On the animation side, SEED’s Swish in Madden NFL 21 is cited as one of the first machine-learning deformers shipped in a game for player apparel. The system used neural networks to simulate cloth stretch, flow, and collisions in real time, shifting from hand-authored rules to learned behavior. For Dragon Age: The Veilguard, SEED research contributed to higher-quality and more varied motion data, aimed at expressive characters without compromising performance.
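The ML-deformer idea described above can be sketched in a few lines (weights and sizes here are made up for illustration, not Swish's actual trained network): a small network maps per-frame pose features to per-vertex cloth offsets, replacing hand-authored deformation rules with learned behavior.

```python
import numpy as np

# Stand-in "trained" weights; a real deformer would load these from disk.
rng = np.random.default_rng(0)
W1 = rng.standard_normal((16, 8)) * 0.1   # pose features -> hidden layer
W2 = rng.standard_normal((8, 12)) * 0.1   # hidden -> 4 vertices * xyz offsets

def predict_cloth_offsets(pose_features):
    """Run the tiny MLP: pose features in, per-vertex offsets out."""
    hidden = np.maximum(pose_features @ W1, 0.0)   # ReLU activation
    return (hidden @ W2).reshape(-1, 3)            # one xyz offset per vertex

pose = rng.standard_normal(16)            # e.g. joint angles for this frame
offsets = predict_cloth_offsets(pose)     # shape (4, 3), applied after skinning
```

At runtime this is just two small matrix multiplies per frame, which is why learned deformers can run alongside regular skinning without hurting performance.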
Creation tools: faster lipsync and high-fidelity face scans
SEED’s Voice2Face system generates realistic lip and facial motion directly from recorded speech. EA reports it was used in Battlefield 6 cinematics and for crowd chants in EA SPORTS FC 25, reducing turnaround time while maintaining visual quality. In parallel, new collaborative workflows around LightStage – a high-end facial scanning setup – helped teams capture fine skin details and subtle expressions for in-game head models with greater accuracy.
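Voice2Face itself is a learned model, but its input/output contract can be shown with a deliberately simple toy rule (everything below is an illustrative assumption, not EA's pipeline): an audio amplitude envelope is converted into one jaw-open blendshape weight per animation frame.

```python
import numpy as np

def jaw_open_curve(samples, sample_rate=16000, fps=30):
    """Map audio loudness to a per-frame jaw-open weight in [0, 1]."""
    samples = np.asarray(samples, dtype=float)
    hop = sample_rate // fps                 # audio samples per video frame
    frames = len(samples) // hop
    weights = []
    for i in range(frames):
        window = samples[i * hop:(i + 1) * hop]
        rms = np.sqrt(np.mean(window ** 2))  # loudness of this frame
        weights.append(min(1.0, rms * 4.0))  # clamp to blendshape range
    return np.array(weights)

# One second of a 220 Hz tone stands in for recorded speech.
t = np.linspace(0.0, 1.0, 16000, endpoint=False)
curve = jaw_open_curve(0.2 * np.sin(2 * np.pi * 220 * t))  # ~30 weights
```

A production system predicts full facial motion, not just jaw opening, but the turnaround win is the same: animation is derived from the audio track instead of being keyframed by hand.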
Smarter systems: reinforcement learning and at-scale security
- Reinforcement learning in gameplay – In EA SPORTS FC 26, a goalkeeper trained with reinforcement learning adapts its positioning in real time, analyzing player movement and reacting much as a real keeper would.
- Security and fair play – In Apex Legends, SEED-backed detection systems have flagged suspected cheating across millions of matches, supporting integrity at scale.
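The reinforcement-learning positioning idea above can be sketched with a tabular Q-learner (a toy model, not EA's goalkeeper): on a 1D goal line, the agent learns from reward alone to step toward the incoming shot.

```python
import random

random.seed(1)
positions = range(5)                 # five spots along the goal line
actions = (-1, 0, 1)                 # step left, stay, step right
Q = {(p, s, a): 0.0 for p in positions for s in positions for a in actions}

def step(pos, shot, a):
    """Move the keeper; reward closeness to the shot position."""
    new = min(max(pos + a, 0), 4)
    reward = 1.0 if new == shot else -abs(new - shot) * 0.1
    return new, reward

for _ in range(5000):                # training episodes
    pos, shot = random.choice(positions), random.choice(positions)
    if random.random() < 0.2:        # epsilon-greedy exploration
        a = random.choice(actions)
    else:
        a = max(actions, key=lambda x: Q[(pos, shot, x)])
    new, r = step(pos, shot, a)
    best_next = max(Q[(new, shot, x)] for x in actions)
    # Standard Q-learning update: learning rate 0.5, discount 0.9.
    Q[(pos, shot, a)] += 0.5 * (r + 0.9 * best_next - Q[(pos, shot, a)])

# After training, the keeper at position 0 steps toward a shot at position 4.
best = max(actions, key=lambda a: Q[(0, 4, a)])
```

A shipped system would learn from far richer state (ball trajectory, attacker animation) with a neural policy, but the loop is the same: act, observe reward, improve.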
At a glance: where SEED tech showed up
The table below summarizes the key technologies SEED highlighted and where EA says they have been demonstrated or deployed, condensing the projects across rendering, animation, tools, and AI.

| Area | SEED technology | Where EA says it appeared |
| --- | --- | --- |
| Rendering | PICA PICA ray tracing research | Early real-time ray tracing demos |
| Rendering | GIBS surfel-based global illumination (with Frostbite) | EA SPORTS College Football 25 |
| Animation | Swish machine-learning cloth deformer | Madden NFL 21 |
| Animation | Higher-quality, more varied motion data | Dragon Age: The Veilguard |
| Tools | Voice2Face audio-driven facial animation | Battlefield 6 cinematics, EA SPORTS FC 25 crowd chants |
| Tools | LightStage facial scanning workflows | Recent in-game head models |
| AI | Reinforcement-learning goalkeeper | EA SPORTS FC 26 |
| AI | Cheat-detection systems | Apex Legends |
What’s next
EA says SEED will continue partnering with internal teams on techniques across ray-traced lighting, machine learning, and generative AI, with an emphasis on applied research that can ship in future titles.
Bottom line for players
Why it matters – SEED’s decade of R&D is already visible in shipped games, from lighting that adapts in real time to ML-driven animation and smarter AI behaviors. For players, that points to more convincing visuals, more responsive systems, and steadier security as these technologies propagate across upcoming releases.