Epic Games hosted its State of Unreal keynote at GDC 2023 (Game Developers Conference), held Wednesday, revealing its long-term plans for content creation, in addition to new tools being used in upcoming games. The nearly eight-hour-long presentation was streamed live on the company’s YouTube channel, touching upon updates in Unreal Engine 5.2, a new performance capture system, an Unreal Editor for Fortnite, and more. We also got a fresh look at the Lords of the Fallen reboot, Hexworks’ upcoming dark fantasy souls-like title, focusing mainly on its high-resolution models and lighting technology, powered by Unreal Engine 5.
With that, here’s a roundup of the biggest announcements from Epic Games’ State of Unreal 2023 presentation:
Unreal Editor for Fortnite
As the name implies, this tool is a version of Unreal Engine that allows players to create and publish their work directly into Fortnite — be it new environments, absurd models, or gameplay experiences. Running on PC, the app is integrated directly with Fortnite, providing access to over four years of content and assets to play around with. It also comes with a “live edit” feature that lets anyone on your team join your UEFN session, regardless of platform, and collaborate on creations. Additionally, players can import custom assets and animate them, while adding extra depth with Unreal Engine 5’s Lumen, the real-time global illumination system.
Unreal Editor for Fortnite comes with a new programming language, ‘Verse,’ offering “powerful customisation abilities” such as manipulating or chaining together devices. Epic Games says Verse is being designed as a programming language for the metaverse, with “upcoming features to enable future scalability to vast open worlds,” per a press release. UEFN is now available on the Epic Games Store as a public beta.
MetaHuman Animator
Developer Ninja Theory is leveraging Unreal Engine’s MetaHuman Animator technology in its upcoming Senua’s Saga: Hellblade II to achieve photorealistic facial expressions in its characters. The tool allows developers to use an iPhone or a stereo helmet-mounted camera (HMC) to capture all the nuances of an actor’s performance, and then transfer every detail and animation onto their digital counterpart to bring them to life. The tool is slated to release this summer, and is claimed to produce facial animation of the quality expected from AAA games and even Hollywood productions (Love, Death & Robots, for example).
“With the iPhone you may already have in your pocket and a standard tripod, you can create believable animation for your MetaHumans that will build emotional connections with your audiences—even if you wouldn’t consider yourself an animator,” a blog post adds.
Unreal Engine 5.2
Epic Games partnered with vehicle manufacturer Rivian to create the gorgeous Electric Dreams demo, as a means to showcase the changes in Unreal Engine 5.2, now available via the Epic Games Launcher and GitHub. In it, a photorealistic Rivian R1T electric truck moves through an overgrown forest, showcasing the new Substrate shading system. For the demo, the presenters changed the truck’s paint to an opal finish, in order to best demonstrate how light interacts with the surface. The effect is achieved through multiple layers, with the reflections seen changing live as a dust layer is added on top.
Unreal Engine 5.2 also introduces Procedural Content Generation Framework (PCG) tools that enable artists to “define rules and parameters to quickly populate expansive, highly detailed spaces” in an efficient manner. The procedural assemblies, when dropped into an existing scene, interact in real time with other nearby elements, updating their appearance accordingly.
Lords of the Fallen technical showcase trailer
This new trailer from Hexworks took us on a gothic journey to Skyrest Bridge, one of the early locations in Lords of the Fallen, a reboot-ish sequel to the 2014 original. The developer claims to have used 3D scans of real people to build a better in-game character customisation system — body types, faces, and such — in addition to using UE5’s Chaos physics engine to simulate movement effects on weapons and clothing. It was interesting to see Hexworks use a Bloodborne-esque character to demonstrate this effect — a nice nod to an iconic game.
The Lords of the Fallen trailer then touched upon the lighting system, which again leverages Unreal’s Lumen global illumination system to create lighting effects that bounce off surfaces in real time. In the game, players will intermittently travel between two worlds — Axiom, the realm of the living, and Umbral, the realm of the dead — which have been crafted side-by-side, allowing for seamless swapping between them. “This means our artists and designers can ensure these worlds feel intrinsically linked, like two sides of the same coin,” the narrator claimed.
Creator Economy 2.0
The new Creator Economy 2.0 entails that developers who create and publish islands in Fortnite will benefit monetarily, depending on engagement. Epic Games will share 40 percent of the net revenue collected from Fortnite’s item shop — skins, battle passes, etc — with creators of eligible islands. Notably, the phrasing suggests engagement is measured across “…both islands from independent creators and Epic’s own such as Battle Royale.”
Fab
Epic Games is combining all its asset marketplaces — Unreal Engine Marketplace, Sketchfab, Quixel Bridge, and the ArtStation Marketplace — under the brand name ‘Fab.’ It essentially functions as a single destination for creators to discover, share, buy, or sell digital assets across one massive library. Sellers gain an 88 percent revenue share, and the platform offers content including 3D models, VFX, sound, and more. Fab is scheduled to launch later this year, though an alpha version of the Fab plugin is now included in Unreal Editor for Fortnite.
Affiliate links may be automatically generated – see our ethics statement for details.