Feb 23, 2026
This week I worked on the AI system for LGS. The robots are humanoid security units, off-the-shelf hardware that LGS modified for facility use. A major focus was balancing the system so the AI feels threatening and reactive without being unfair, something I'd been thinking about since reading breakdowns of the AI design in Alien: Isolation, which handles this problem better than almost any other game I can think of. I spent a while thinking through the architecture before writing any code.
The system is split into two layers. The first is an AI Director: a global autoload that manages pacing and tracks robot state across the whole facility. It holds save data for every robot so their positions and states persist when you leave and re-enter areas. Robot data is serialized as Godot Resource files and written to disk on scene transition, keyed by level and robot ID. When a robot spawns, it checks for saved data and restores from it; otherwise it starts fresh.
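As a rough sketch of that persistence flow, the Director could look something like the snippet below. `RobotState` is assumed to be a Resource subclass with exported fields (robot ID, position, current state); all names and paths here are illustrative, not the actual project code.

```gdscript
# director.gd, registered as an autoload. Illustrative sketch only.
extends Node

func _save_path(level_name: String, robot_id: String) -> String:
	return "user://robots/%s_%s.tres" % [level_name, robot_id]

func save_robot(level_name: String, state: RobotState) -> void:
	# Ensure the save directory exists, then serialize the resource.
	DirAccess.make_dir_recursive_absolute("user://robots")
	ResourceSaver.save(state, _save_path(level_name, state.robot_id))

func load_robot(level_name: String, robot_id: String) -> RobotState:
	var path := _save_path(level_name, robot_id)
	if ResourceLoader.exists(path):
		return ResourceLoader.load(path) as RobotState
	return null  # no save found: the robot starts fresh
```

Keying the filename by both level and robot ID means two robots with the same ID in different levels never clobber each other's saves.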
The second layer is the per-robot behavior, built from a reusable state machine (shared with the player controller) and a sensory component. Each robot currently has three states: Patrol, Investigate, and Hunt, with groundwork laid for more later. In Patrol the robot samples random walkable points from the navmesh and wanders between them. The sensory component runs every frame checking for the player: vision uses a horizontal and vertical FOV cone check followed by a physics raycast for line of sight. Detection builds a confidence value over time rather than triggering instantly, so brief glimpses don't immediately alert the robot. Once confidence hits a threshold, the robot transitions to Hunt and actively chases. If it loses sight, a countdown timer runs and it falls back to Investigate, searching the last known position before giving up and returning to Patrol.
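The cone-then-raycast-then-confidence pipeline can be sketched roughly as below. The FOV angles, gain/decay rates, and node names are all made up for illustration, and a real version would exclude the robot's own collider from the raycast.

```gdscript
# Illustrative sensory-component sketch; tuning values are placeholders.
extends Node3D

@export var horizontal_fov_deg := 70.0
@export var vertical_fov_deg := 40.0
@export var gain_rate := 1.5    # confidence gained per second while visible
@export var decay_rate := 0.5   # confidence lost per second while hidden
@export var hunt_threshold := 1.0

var confidence := 0.0

func can_see(player: Node3D) -> bool:
	var to_player: Vector3 = player.global_position - global_position
	var forward: Vector3 = -global_transform.basis.z
	# Horizontal check: compare directions projected onto the XZ plane.
	var flat := Vector3(to_player.x, 0.0, to_player.z)
	var flat_fwd := Vector3(forward.x, 0.0, forward.z)
	if flat_fwd.angle_to(flat) > deg_to_rad(horizontal_fov_deg / 2.0):
		return false
	# Vertical check: elevation angle of the target above/below eye level.
	if absf(atan2(to_player.y, flat.length())) > deg_to_rad(vertical_fov_deg / 2.0):
		return false
	# Line-of-sight raycast against level geometry.
	var query := PhysicsRayQueryParameters3D.create(global_position, player.global_position)
	var hit := get_world_3d().direct_space_state.intersect_ray(query)
	return hit.is_empty() or hit.get("collider") == player

func _physics_process(delta: float) -> void:
	var player := get_tree().get_first_node_in_group("player") as Node3D
	if player and can_see(player):
		confidence = minf(confidence + gain_rate * delta, hunt_threshold)
	else:
		confidence = maxf(confidence - decay_rate * delta, 0.0)
```

The asymmetric gain and decay rates are the interesting knob: decaying slower than it builds means a robot stays "suspicious" for a while after a glimpse instead of instantly forgetting you.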
I also added a hearing system. Player actions like walking, running, and jumping emit sound events through the Director, which broadcasts to all nearby robots. Each robot's sensory component checks the sound against a distance-based threshold curve — loud sounds carry further, quiet ones don't. This means crouching and moving carefully actually matters.
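The broadcast side could be as small as a signal on the Director, with each robot applying its own distance test. Both snippets are hypothetical sketches; `hear()` and the quadratic falloff are stand-ins for whatever the real threshold curve ends up being.

```gdscript
# In the Director autoload: player actions call emit_sound().
signal sound_emitted(source_position: Vector3, loudness: float)

func emit_sound(source_position: Vector3, loudness: float) -> void:
	sound_emitted.emit(source_position, loudness)
```

```gdscript
# In each robot's sensory component.
@export var hearing_scale := 4.0  # illustrative: carry distance per loudness²

func _ready() -> void:
	Director.sound_emitted.connect(_on_sound_emitted)

func _on_sound_emitted(source_position: Vector3, loudness: float) -> void:
	# Louder sounds carry further; a quadratic curve keeps quiet
	# crouch-steps from registering at range.
	if global_position.distance_to(source_position) <= loudness * loudness * hearing_scale:
		hear(source_position)  # hypothetical handler that raises suspicion
```

Routing everything through the Director (rather than robots polling the player) also means any future sound source, like a thrown object, gets hearing support for free.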
The two-layer design is directly inspired by the AI system in Alien: Isolation, where a high-level Director manages the Xenomorph's pacing and a separate behavior layer handles its real-time reactions. The Director in that game is responsible for making the alien feel omnipresent and fair — nudging it toward the player when tension is low, pulling it back when things get too intense. I wanted the same feel for LGS. The Director and the per-robot behavior are deliberately decoupled — the Director handles pacing and persistence, the robots handle their own moment-to-moment decisions.
That said, the Director side is still incomplete. Right now robots find the player entirely through their own sensors. The Director isn't yet sending hints to robots about the player's general location, which is a key part of what makes the Isolation system work — it's what stops the alien from wandering the wrong end of the map while you stand still for five minutes. Implementing that is the next priority on the AI side.
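For my own reference, one possible shape for that hint loop, entirely speculative since none of it is implemented yet: the Director hands a robot a coarse search zone rather than the player's exact position, so the nudge reads as a patrol directive instead of cheating. `zone_containing`, `farthest_idle_robot_from`, and `receive_hint` are all hypothetical.

```gdscript
# Speculative Director hint loop; nothing here exists in the project yet.
const LOW_TENSION_THRESHOLD := 0.3

func _on_hint_timer_timeout() -> void:
	if tension < LOW_TENSION_THRESHOLD:
		# Send the robot that's most out of position toward the player's
		# general area, not their exact spot.
		var zone := zone_containing(player.global_position)
		var robot := farthest_idle_robot_from(player)
		if robot:
			robot.receive_hint(zone.random_point())
```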
The other obvious gap is that the robots have no body: no model, no animations, just a capsule moving around the level. Animation is something I find genuinely difficult and have been putting off. Rigging and animating a humanoid robot is a non-trivial chunk of work, and I don't want to block AI progress on it, so for now the logic runs on a placeholder. It'll need to be tackled eventually, but I'd rather have the behavior working correctly first.