The Scope
Emojimo is a 27,000+ line vanilla JavaScript game engine rendered entirely on HTML5 Canvas using emoji as the primary visual language. No sprite sheets, no asset pipeline, no game framework. Every character, item, terrain tile, and UI element is a Unicode glyph drawn directly to a 2D canvas context. The entire project was built with Claude Code as a creative partner.
What started as an experiment in procedural world generation became a full survival game with AI-driven NPCs, real-time multiplayer architecture, and a suite of visual admin tools. The game generates a 100x100 tile procedural wasteland with distinct biomes (forest, desert, ocean, tundra, volcanic), each with unique flora, fauna, and ambient sound profiles.
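The biome assignment can be pictured as a noise-driven lookup per tile. This is a minimal sketch under assumed names and thresholds (the post doesn't describe the actual generator's algorithm); a hash-based pseudo-noise stands in for real value noise:

```javascript
// Illustrative tile-based biome assignment for a 100x100 world.
// All names, thresholds, and the noise function are placeholders.
const WORLD_SIZE = 100;

// Tiny deterministic hash noise in [0, 1] (stand-in for real value noise).
function noise(x, y, seed) {
  let h = (x * 374761393 + y * 668265263 + seed * 982451653) | 0;
  h = ((h ^ (h >> 13)) * 1274126177) | 0;
  return ((h ^ (h >> 16)) >>> 0) / 4294967295;
}

// Map elevation/moisture samples to one of the five biomes.
function pickBiome(elevation, moisture) {
  if (elevation < 0.25) return { name: 'ocean', tile: '🌊' };
  if (elevation > 0.8) return { name: 'volcanic', tile: '🌋' };
  if (moisture < 0.25) return { name: 'desert', tile: '🏜️' };
  if (moisture > 0.75) return { name: 'tundra', tile: '🏔️' };
  return { name: 'forest', tile: '🌲' };
}

function generateWorld(seed) {
  const tiles = [];
  for (let y = 0; y < WORLD_SIZE; y++) {
    const row = [];
    for (let x = 0; x < WORLD_SIZE; x++) {
      row.push(pickBiome(noise(x, y, seed), noise(x, y, seed + 1)));
    }
    tiles.push(row);
  }
  return tiles;
}
```

Because every tile is just an emoji glyph, "generating the world" is generating a grid of strings; no texture loading ever happens.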
AI-Powered NPCs
The NPC system is where the technical depth lives. Every NPC has a unique GPT-4-powered personality defined through structured system prompts that include backstory, personality traits, temperament, values, and dialogue style. NPCs don't just respond to the player -- they maintain persistent memory through a Retrieval-Augmented Generation (RAG) system backed by Supabase. Each NPC tracks relationship state (trust level, conversation count, relationship stage) and retrieves contextually relevant memories via vector similarity search before generating responses.
The trading system goes beyond simple buy/sell. NPCs can haggle, counter-offer, pressure, scam, gift, and even craft items mid-conversation. They have emotional responses that shift based on interaction history. A merchant who's been given gifts remembers them and adjusts pricing. An NPC you've angered will reference the specific incident.
The Engine
Under the hood, the game runs a proper game loop with delta-time interpolation at 60 fps, viewport culling (only visible tiles render), and smooth 8-tile-per-second movement with sub-tile interpolation. The architecture evolved from a monolithic game.js into a modular engine with an EventBus, a FeatureFlags system, an Entity Component System, and a clean separation between the reusable engine layer and game-specific code.
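Two of those pieces are easy to show in isolation: culling the tile grid to the camera viewport, and moving at a fixed tiles-per-second rate regardless of frame time. Function and field names are assumed for the sketch:

```javascript
// Viewport culling: compute the inclusive tile range under the camera,
// so only those tiles are drawn each frame.
function visibleTileRange(camera, tileSize, viewW, viewH, worldSize) {
  return {
    x0: Math.max(0, Math.floor(camera.x / tileSize)),
    y0: Math.max(0, Math.floor(camera.y / tileSize)),
    x1: Math.min(worldSize - 1, Math.ceil((camera.x + viewW) / tileSize)),
    y1: Math.min(worldSize - 1, Math.ceil((camera.y + viewH) / tileSize)),
  };
}

// Delta-time movement: advance toward the target tile at 8 tiles/second,
// independent of frame rate. `dt` is the frame delta in seconds.
const TILES_PER_SECOND = 8;
function stepPosition(pos, target, dt) {
  const maxMove = TILES_PER_SECOND * dt;
  const dx = target.x - pos.x, dy = target.y - pos.y;
  const dist = Math.hypot(dx, dy);
  if (dist <= maxMove) return { ...target }; // arrived this frame
  return { x: pos.x + (dx / dist) * maxMove, y: pos.y + (dy / dist) * maxMove };
}
```

In the render loop, only the `x0..x1` by `y0..y1` rectangle is drawn, which keeps per-frame work bounded by the screen size rather than the 100x100 world.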
Key systems include an AnimationStateMachine (multi-frame emoji sequences per state), a 24-hour TimeSystem with clock emoji display and period-based lighting, grid-based movement with collision detection, a combat manager supporting melee and ranged weapons with projectile physics, and a dungeon system with procedurally generated interior spaces.
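The AnimationStateMachine idea can be sketched in a few lines: each state owns a frame list of emoji and a playback rate, and advancing the clock selects the current glyph. The shape of the API is an assumption:

```javascript
// Minimal emoji animation state machine. Each state maps to a frame
// sequence and a frames-per-second rate; advance() returns the glyph
// to draw for the current state and elapsed time.
class AnimationStateMachine {
  constructor(states, initial) {
    this.states = states; // e.g. { walk: { frames: ['🚶', '🏃'], fps: 4 } }
    this.state = initial;
    this.elapsed = 0;
  }
  setState(name) {
    if (name !== this.state) { // restart the clock on state change
      this.state = name;
      this.elapsed = 0;
    }
  }
  advance(dt) {
    this.elapsed += dt;
    const { frames, fps } = this.states[this.state];
    return frames[Math.floor(this.elapsed * fps) % frames.length];
  }
}
```

Because frames are just strings, swapping an animation means swapping an array of glyphs; there is no sprite sheet to slice.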
Multiplayer Architecture
The multiplayer layer uses Supabase Realtime for Phase 1 (2-10 players) with a planned migration path to raw WebSockets for Phase 3 (50-100 players). The architecture is designed as 12-15 toggleable modules that progressively enhance the single-player experience without breaking it. Network failures gracefully degrade to solo play. State synchronization targets 20-30 ticks/second with client-side prediction and server reconciliation.
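The toggle-and-degrade pattern can be sketched with a feature flag guarding the sync path: if the network call fails, the flag flips off and the game keeps running solo. Identifiers here are illustrative, not the engine's real API:

```javascript
// Minimal feature-flag registry guarding optional modules.
class FeatureFlags {
  constructor() { this.flags = new Map(); }
  enable(name) { this.flags.set(name, true); }
  disable(name) { this.flags.set(name, false); }
  isEnabled(name) { return this.flags.get(name) === true; }
}

// Wrap a network send so that failures degrade to solo play
// instead of breaking the game loop.
function makeSync(flags, networkSend) {
  return function syncState(state) {
    if (!flags.isEnabled('multiplayer')) return 'local'; // solo play
    try {
      networkSend(state);
      return 'synced';
    } catch (err) {
      flags.disable('multiplayer'); // degrade gracefully, keep playing
      return 'degraded';
    }
  };
}
```

Every multiplayer module sits behind a flag like this, which is what lets the single-player experience survive a dead connection untouched.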
The Admin Tools
The project includes a full suite of browser-based admin tools: a Character Creator for defining NPC personalities, skills, behaviors, and AI configurations; an Animation Editor with grid interface, live preview, and export/import; a Map Editor for hand-crafting world layouts; a Biome Manager for tuning terrain generation parameters; and an Item Database editor. Each tool writes to Supabase and hot-reloads into the running game.
Companion Systems
KennyBot 2000 is an autonomous AI companion with four behavioral modes -- Follow, Farm, Protect, and Hunt -- each with distinct movement patterns, engagement ranges, and damage profiles. The pet stays within a configurable radius of the player's home base and can be summoned from anywhere with a hotkey.
The game also features a vehicle system, fire propagation system, Spotify integration for in-game radio, a reputation/karma system that affects NPC behavior, and a v2 rewrite that introduced Google Sheets as a data backend for entity configuration -- letting non-developers tweak game balance through a spreadsheet.
Tech Stack
Vanilla JavaScript (ES6+), HTML5 Canvas, Supabase (Postgres + Realtime + Auth + Storage), OpenAI GPT-4 API, ElevenLabs voice synthesis, Vite build system, and a custom Ollama model for local NPC dialogue. No React, no game engine, no dependencies beyond the APIs.