https://arstechnica.com/information-technology/2024/12/new-physics-sim-trains-robots-430000-times-faster-than-reality/

New physics sim trains robots 430,000 times faster than reality

"Genesis" can compress training times from decades into hours using 3D worlds conjured from text.

Benj Edwards - Dec 19, 2024 3:10 pm

A simulated teapot and letters created using the Genesis platform. Credit: Zhou et al.

On Thursday, a large group of university and private industry researchers unveiled Genesis, a new open source computer simulation system that lets robots practice tasks in simulated reality 430,000 times faster than in the real world. The researchers also plan to introduce an AI agent that generates 3D physics simulations from text prompts.

The accelerated simulation means a neural network for piloting robots can spend the virtual equivalent of decades learning to pick up objects, walk, or manipulate tools during just hours of real computer time. "One hour of compute time gives a robot 10 years of training experience.
That's how Neo was able to learn martial arts in a blink of an eye in the Matrix Dojo," wrote Genesis paper co-author Jim Fan on X, who says he played a "minor part" in the research. Fan has previously worked on several robotics simulation projects for Nvidia.

Genesis arrives as robotics researchers hunt for better tools to test and train robots in virtual environments before deploying them in the real world. Fast, accurate simulation helps robots learn complex tasks more quickly while reducing the need for expensive physical testing. For example, on the project page, the researchers show techniques developed in Genesis physics simulations (such as doing backflips) being applied to quadruped robots and soft robots.

Example images of the simulated physics-based worlds created by Genesis, provided by the researchers. Credit: Zhou et al.

The Genesis platform, developed by a group led by Zhou Xian of Carnegie Mellon University, processes physics calculations up to 80 times faster than existing robot simulators (like Nvidia's Isaac Gym). It uses graphics cards similar to those that power video games to run up to 100,000 copies of a simulation at once. That's important when it comes to training the neural networks that will control future real-world robots.

"If an AI can control 1,000 robots to perform 1 million skills in 1 billion different simulations, then it may 'just work' in our real world, which is simply another point in the vast space of possible realities," wrote Fan in his X post. "This is the fundamental principle behind why simulation works so effectively for robotics."

Generating dynamic worlds

The team also announced that it is working on the ability to generate what it calls "4D dynamic worlds," perhaps using "4D" because they can simulate a 3D world in motion over time.
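As a quick sanity check on those headline numbers (a back-of-the-envelope calculation, not from the paper), the 430,000x figure converts into simulated training time per real compute hour like this:

```python
# Convert the claimed 430,000x real-time speedup into virtual training time.
SPEEDUP = 430_000          # simulated seconds per real second (peak claim)
HOURS_PER_YEAR = 365 * 24  # 8,760 hours in a year

sim_hours = 1 * SPEEDUP            # one real hour -> 430,000 simulated hours
sim_years = sim_hours / HOURS_PER_YEAR
print(f"{sim_years:.0f} simulated years per real compute hour")  # ~49 years
```

At the peak rate that works out to roughly 49 simulated years per real hour, so Fan's "10 years" figure presumably reflects typical rather than peak throughput.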
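To picture what running up to 100,000 copies of a simulation at once looks like, here is a minimal NumPy sketch of batched physics stepping (a toy projectile integrator, not Genesis's actual API): a single vectorized update advances every environment simultaneously, which is the pattern that lets GPU parallelism pay off.

```python
import numpy as np

def step_batch(pos, vel, dt=0.01, g=-9.81):
    """Advance every environment by one timestep in a single vectorized
    update (semi-implicit Euler integration)."""
    vel = vel + np.array([0.0, 0.0, g]) * dt  # gravity acts on the z axis
    pos = pos + vel * dt
    return pos, vel

n_envs = 100_000  # Genesis reportedly runs up to 100,000 copies at once
pos = np.zeros((n_envs, 3))                  # all projectiles start at origin
vel = np.tile([1.0, 0.0, 5.0], (n_envs, 1))  # identical launch velocities

for _ in range(100):  # one simulated second across all environments at once
    pos, vel = step_batch(pos, vel)

print(pos.shape)  # (100000, 3): every environment advanced in lockstep
```

A real engine replaces the toy integrator with rigid-body, soft-body, or fluid solvers, but the batching principle is the same: one array operation per step, regardless of how many environments are running.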
The system will reportedly use vision-language models (VLMs) to generate complete virtual environments from text descriptions (similar to "prompts" in other AI models), utilizing Genesis's own simulation infrastructure APIs to create the worlds. The AI-generated worlds will reportedly include realistic physics, camera movements, and object behaviors, all from text commands. The system then creates physically accurate ray-traced videos and data that robots can use for training. Of course, we have not tested this, so these claims should be taken with a grain of salt at the moment.

Examples of "4D dynamical and physical" worlds that Genesis created from text prompts.

This prompt-based system may let researchers create complex robot testing environments by typing natural language commands instead of programming them by hand. "Traditionally, simulators require a huge amount of manual effort from artists: 3D assets, textures, scene layouts, etc. But every component in the workflow can be automated," wrote Fan.

Using its engine, Genesis could also generate character motion, interactive 3D scenes, facial animation, and more, which may allow for the creation of artistic assets for creative projects. It may also lead to more realistic AI-generated games and videos in the future, since it constructs a simulated world in data instead of operating on the statistical appearance of pixels, as a video synthesis diffusion model does.

Examples of character motion generation from Genesis, using a prompt that includes, "A miniature Wukong holding a stick in his hand sprints across a table surface for 3 seconds, then jumps into the air, and swings his right arm downward during landing."
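The prompt-to-world pipeline described above can be pictured as two stages: a language model emits a machine-readable scene specification, and the simulator instantiates it. The sketch below is purely hypothetical; the function names and spec format are illustrative and are not Genesis's real interfaces.

```python
import json

def generate_scene_spec(prompt: str) -> dict:
    """Stand-in for a vision-language model call; a real system would query
    a VLM here and parse its structured output. Hard-coded for illustration."""
    return {
        "objects": [
            {"type": "teapot", "position": [0.0, 0.0, 0.8]},
            {"type": "table", "material": "wood", "position": [0.0, 0.0, 0.0]},
        ],
        "camera": {"position": [1.5, 1.5, 1.2], "look_at": [0.0, 0.0, 0.8]},
        "physics": {"gravity": [0.0, 0.0, -9.81], "timestep": 0.01},
    }

spec = generate_scene_spec("A teapot resting on a wooden table")
# A simulator would walk this spec, loading each asset into the scene.
for obj in spec["objects"]:
    print(f"adding {obj['type']} at {obj['position']}")
print(json.dumps(spec["physics"]))
```

The appeal of a structured intermediate like this is that the generated world stays physically grounded: the simulator, not the language model, enforces the physics.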
While the generative system isn't yet part of the currently available code on GitHub, the team plans to release it in the future.

Training tomorrow's robots today (using Python)

Genesis remains under active development on GitHub, where the team accepts community contributions. The platform stands out from other 3D world simulators for robotic training by using Python for both its user interface and core physics engine. Other engines use C++ or CUDA for their underlying calculations while wrapping them in Python APIs; Genesis takes a Python-first approach.

Notably, the non-proprietary nature of the Genesis platform makes high-speed robot training simulations available to any researcher for free through simple Python commands that work on regular computers with off-the-shelf hardware. Previously, running robot simulations required complex programming and specialized hardware, Fan says in his post announcing Genesis, and that shouldn't be the case. "Robotics should be a moonshot initiative owned by all of humanity," he wrote.

Benj Edwards is Ars Technica's Senior AI Reporter and founder of the site's dedicated AI beat in 2022. He's also a tech historian with almost two decades of experience. In his free time, he writes and records music, collects vintage computers, and enjoys nature. He lives in Raleigh, NC.