"Daemon" is an action-packed first-person shooter set in modern, stylized Tokyo. It encourages expressive gameplay through customizable classes built around themes of holding space, oppressive damage, and tactical utility.
This online, networked 4v4 shooter was developed by a team of 15 students over the course of 15 weeks. With only three programmers on the team, the polish and reliability of the gameplay are a testament to the team’s efficiency and technical skill. What you're seeing is the final version delivered at the end of our 15-week timeline. Development is ongoing as we continue refining the experience with the goal of releasing it on Steam—free for anyone to play this summer.
Note: Since our game is not yet available on public storefronts like Steam or the Epic Games Store, access requires a development account. I am unable to distribute these accounts publicly. However, if you are a recruiter or hiring manager reviewing my application, I have included a couple of development accounts in my cover letter for your convenience.
Daemon Gameplay Trailer
For this project, I led our three-person programming team and was responsible for developing all major UI systems. My contributions included building the player inventory backend, designing and implementing the buy menu, creating all HUD elements (such as the kill feed, notification pop-ups, and score), and developing the main, scoreboard, lobby, and pause menus. I also handled world space UI features like damage numbers, name tags, and ping indicators.
The buy menu in Daemon allows players to purchase and sell items during a match. Players can choose from custom-built presets created by our development team or assemble their own loadouts by purchasing individual items. Build credits scale over the course of a match, and each player can equip up to four items at a time. The menu can be accessed upon death or at designated locations on the map. This system was a core part of our design philosophy, supporting expressive, dynamic gameplay and encouraging players to explore unique playstyles.
Inventory Item Class: Created a modular inventory item base class to streamline both UI and backend development. All inventory items inherit from this base class, which contains core variables such as the item's stat map, name, type, cost, and image. This architecture allowed developers and designers to access shared data and functionality without repetitive casting or boilerplate code.
The base class also included common utility functions, such as enabling/disabling input and toggling visibility, making it easier to control item behavior across different systems. This setup significantly improved workflow efficiency and laid the groundwork for more maintainable and scalable inventory features.
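Our implementation lived in Blueprints, but the shape of that shared base class translates directly to C++. Below is a rough sketch of what it might look like; the class name AInventoryItemBase, the EItemType enum, and the stat-map layout are illustrative stand-ins rather than our exact names.

```cpp
// Illustrative sketch of an inventory item base class (names are hypothetical;
// the shipped version was authored as a Blueprint base class).
#pragma once

#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "InventoryItemBase.generated.h"

class UTexture2D;

UENUM(BlueprintType)
enum class EItemType : uint8 { Weapon, Ability, Utility, Armor };

UCLASS(Abstract, Blueprintable)
class AInventoryItemBase : public AActor
{
    GENERATED_BODY()

public:
    // Shared data every item exposes to the UI and gameplay code.
    UPROPERTY(EditDefaultsOnly, BlueprintReadOnly, Category = "Item")
    FText DisplayName;

    UPROPERTY(EditDefaultsOnly, BlueprintReadOnly, Category = "Item")
    EItemType ItemType = EItemType::Weapon;

    UPROPERTY(EditDefaultsOnly, BlueprintReadOnly, Category = "Item")
    int32 Cost = 0;

    UPROPERTY(EditDefaultsOnly, BlueprintReadOnly, Category = "Item")
    UTexture2D* Icon = nullptr;

    // Editable stat map read by the buy menu's stat panel.
    UPROPERTY(EditDefaultsOnly, BlueprintReadOnly, Category = "Item")
    TMap<FName, float> Stats;

    // Common utilities so other systems never need to cast to concrete item types.
    UFUNCTION(BlueprintCallable, Category = "Item")
    virtual void SetItemInputEnabled(bool bEnabled) { bInputEnabled = bEnabled; }

    UFUNCTION(BlueprintCallable, Category = "Item")
    virtual void SetItemVisible(bool bVisible) { SetActorHiddenInGame(!bVisible); }

protected:
    bool bInputEnabled = false;
};
```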
Inventory Backend: To support a smooth and reliable inventory experience in our networked game, I developed a backend system that connects directly with the build menu and HUD. When a player attempts to purchase an item, the system checks whether they can afford it, if it’s already in their inventory, and whether there’s at least one open slot available. If all conditions are met, the item is spawned on the server and added to a replicated inventory array. If it's the only item currently owned, it becomes the held item, and both its input and visibility are enabled automatically.
To maintain consistency and prevent UI elements from shifting, the inventory is structured as an array of exactly four indices—each representing a slot. If a slot is empty, it remains null rather than being removed, which ensures stability in both UI presentation and backend logic. Selling an item simply sets its slot to null, and swapping between items using the scroll wheel or number keys disables the currently held item and enables the new one.
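As a rough illustration of that flow, here is how the server-side purchase check might look in C++ (ours was implemented in Blueprints). The ServerTryPurchaseItem RPC, the ContainsItemOfClass and CountOwnedItems helpers, and the member names are all assumptions for the sketch.

```cpp
// Hypothetical server-side purchase flow for the fixed four-slot inventory.
// Declared as UFUNCTION(Server, Reliable) on the character; Inventory is
// initialized to exactly four null entries so slot indices never shift.
void ADaemonCharacter::ServerTryPurchaseItem_Implementation(TSubclassOf<AInventoryItemBase> ItemClass)
{
    if (!ItemClass)
    {
        return;
    }

    const AInventoryItemBase* Defaults = ItemClass->GetDefaultObject<AInventoryItemBase>();

    // 1) Can the player afford it?  2) Do they already own it?
    if (Credits < Defaults->Cost || ContainsItemOfClass(ItemClass))
    {
        return;
    }

    // 3) Is at least one slot open? Empty slots stay null rather than being removed.
    const int32 FreeSlot = Inventory.IndexOfByPredicate(
        [](const AInventoryItemBase* Slot) { return Slot == nullptr; });
    if (FreeSlot == INDEX_NONE)
    {
        return;
    }

    // Spawn on the server; the replicated Inventory array carries it to clients.
    AInventoryItemBase* NewItem = GetWorld()->SpawnActor<AInventoryItemBase>(ItemClass);
    NewItem->SetOwner(this);
    Credits -= Defaults->Cost;
    Inventory[FreeSlot] = NewItem;

    // If this is the only item currently owned, it becomes the held item immediately.
    if (CountOwnedItems() == 1)
    {
        HeldSlot = FreeSlot;
        NewItem->SetItemInputEnabled(true);
        NewItem->SetItemVisible(true);
    }
}
```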
Each time an item is bought, sold, or swapped, the HUD and the dynamic armor system update in real time. The dynamic armor visuals are managed through a scene capture component that renders a base character model on each client, reflecting gear changes based on the player's build. This structure gave us consistent performance, minimized replication issues, and provided a solid foundation for expanding inventory features in future updates.
Dynamic Stat panels: Developed a dynamic stats panel to address the issue of players being overwhelmed by item choices without understanding their value. The panel, located on the right side of the build menu, pulls an editable stat map from the currently hovered inventory item. It automatically hides unused stats and displays relevant ones as bar graphs or numbers, depending on the stat type (e.g., time, ammo, or health are shown numerically).
Additionally, the panel shows a detailed item description to provide further context and suggested use cases. This system helps players quickly grasp each item's function and effectiveness, giving them the critical information needed to build informed and strategic loadouts.
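A simplified sketch of how such a panel can populate itself from the hovered item's stat map; UStatRowWidget, ShowAsNumber, ShowAsBar, and the NumericStats set are hypothetical names standing in for our Blueprint widgets.

```cpp
// Populate the stat panel from the hovered item's editable stat map.
void UStatPanelWidget::ShowStatsForItem(const AInventoryItemBase* Item)
{
    for (const TPair<FName, UStatRowWidget*>& Pair : StatRows)
    {
        UStatRowWidget* Row = Pair.Value;
        const float* Value = Item ? Item->Stats.Find(Pair.Key) : nullptr;

        if (!Value)
        {
            // Hide stats the item does not define so the panel stays compact.
            Row->SetVisibility(ESlateVisibility::Collapsed);
            continue;
        }

        Row->SetVisibility(ESlateVisibility::Visible);

        // Stats like time, ammo, or health read better as raw numbers; the rest as bar graphs.
        if (NumericStats.Contains(Pair.Key))
        {
            Row->ShowAsNumber(*Value);
        }
        else
        {
            Row->ShowAsBar(*Value / MaxStatValue);
        }
    }
}
```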
Dynamic Armor Modification: Implemented a dynamic armor visualization system tied directly to the player's loadout decisions. When a player buys or sells an item, we calculate how many credits they’ve invested in each category (Spirit, Tank, Support, Tech). Based on that distribution, armor pieces are dynamically assigned to the player.
To reflect these changes in the UI, we spawn a base character model on each client at match start, equipped with a Scene Capture Component. This component renders the character model to a texture, which is displayed in the build menu. As the player modifies their loadout, the system updates the armor on the model, tells the Scene Capture to refresh, and then updates the render texture, ensuring the player sees a real-time visual representation of their evolving build.
Reusable item buttons: Designed and implemented a custom user widget that dynamically displays an assigned inventory item's class, type, and price, and functions as an interactive button for purchasing and selling items. This modular approach allowed for quick integration into the buy menu. Each item could be added simply by assigning its class, eliminating the need for manually creating and managing individual buttons, images, and text, and leading to a much cleaner event graph.
Preset Buttons: Designed preset buttons to accept an array of inventory item classes defined by designers. Each button calculates the total price of its associated loadout and gathers references to the corresponding item buttons to handle purchases and sales efficiently. These curated presets were created to provide players with quick access to recommended loadouts, helping to reduce decision paralysis and streamline the buy phase by offering clear, accessible options.
Weapon Affordability Visual: Whenever a player buys or sells an item or receives credits, each item and preset button automatically recalculates affordability based on the player's current credits. If the player cannot afford an item, the corresponding button is grayed out. This system effectively reduced information overload by providing clear, immediate visual feedback, allowing players to quickly identify which items were available for purchase.
Scaling Images: Developed a custom function (included in our shared function library) to scale images proportionally based on their assigned textures. The function compares the texture’s X and Y dimensions, clamps the larger dimension to the corresponding image axis, and then uses the aspect ratio to calculate the other axis value. This ensures that images are precisely scaled relative to their source texture, maintaining consistent visual quality and proportions. This was particularly important for our project, as we dynamically switched images across multiple UI components and needed them to scale accurately without distortion.
Players were overwhelmed by the amount of item information and had trouble understanding what each item did.
Managing and displaying a large number of unique items made the UI cluttered and hard to maintain.
Items wouldn't consistently replicate to the client, causing them to not appear or function properly after being bought.
Making the visuals and information more digestible through the use of our presets, stats panels, and weapon affordability visuals.
Creating the reusable item buttons and utilizing the inventory item base class.
Utilizing RepNotify variables, calling the function that enables the item only once the replicated variable has notified the client that it has fully replicated.
As with most competitive shooters, Daemon’s HUD was designed to deliver essential information clearly and efficiently without distracting from gameplay. It communicated critical details such as player health, equipped items, ammo count, ability cooldowns, team scores, match time, and hardpoint control time. In addition, it provided responsive feedback through the killfeed, game state notifications, and kill confirmations. The HUD played a vital role in keeping players informed and grounded in the moment-to-moment action, while also adding impactful visual "juice" to elevate the overall game feel.
Kill Feed: To keep all players informed of team and enemy activity, I implemented a dynamic kill feed system. When a player was killed, a server function captured the killer, victim, and the primary assister (the player who dealt the most damage without getting the final blow). This information was passed to the GameState, which exists on all clients, and a multicast event updated each player's kill feed. The UI would then check if the killed player was on the same team as the local player, using our team enums, and color names accordingly—blue for allies and red for enemies. This consistent color logic was also used across other UI elements for clear team identification. The kill feed is in the top left of the screen in the above video.
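A condensed C++ sketch of that broadcast path, assuming a GameState subclass with a reliable NetMulticast event and a GetLocalDaemonHUD helper (both names illustrative; ours was built in Blueprints):

```cpp
// Server entry point: the game mode reports the kill to the GameState,
// which exists on every machine.
void ADaemonGameState::ReportKill(APlayerState* Killer, APlayerState* Victim, APlayerState* Assister)
{
    if (HasAuthority())
    {
        MulticastKillFeedEntry(Killer, Victim, Assister);
    }
}

// Declared UFUNCTION(NetMulticast, Reliable) in the header; runs on every client.
void ADaemonGameState::MulticastKillFeedEntry_Implementation(APlayerState* Killer, APlayerState* Victim, APlayerState* Assister)
{
    // Each client pushes the entry to its local HUD, which colors names blue or red
    // by comparing each player's team enum to the local player's team.
    if (ADaemonHUD* HUD = GetLocalDaemonHUD(this))
    {
        HUD->AddKillFeedEntry(Killer, Victim, Assister);
    }
}
```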
Game State Notifiers: As we brought in more playtesters, it became clear that players lacked crucial feedback on events like hardpoint rotations, credit gains, and other major game state changes. To address this, I developed a game state notifier system—text pop-ups that appear near the center of the player's screen. These were driven by a UI container (a vertical box) that listened for specific event delegates. When triggered, the system would display a message with the provided context. To maintain clarity and avoid UI clutter, we limited the container to three messages at once and ensured that duplicate messages would replace older instances of the same content.
Damage Indicators: After receiving feedback during an early industry playtest, we learned that while players could tell they were taking damage, they often had no idea where it was coming from—especially when attacked from off-screen. We tested this ourselves and confirmed that it made many engagements feel unfair or disorienting. To address this, we implemented a directional damage indicator inspired by Call of Duty. When a player takes damage, we run an event that captures the attacker’s world position at the time of the hit. Using this, we spawn an arrow on the HUD and calculate its angle from the difference between the look-at rotation toward that captured position and the player’s current rotation. The arrow rotates in real time while visible, updating each frame to reflect the player’s changing perspective, giving them clear and immediate feedback on the direction of incoming damage. This is visible at the center of the player's screen in the above video.
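The per-frame arrow update is essentially a yaw comparison. Here is a hedged sketch of that tick logic, assuming the widget stores AttackerLocationAtHit and an ArrowImage (illustrative names for what we did in Blueprints):

```cpp
// Rotate the indicator each frame so it keeps pointing at where the attacker was
// when the hit landed, relative to the player's current view.
void UDamageIndicatorWidget::NativeTick(const FGeometry& MyGeometry, float InDeltaTime)
{
    Super::NativeTick(MyGeometry, InDeltaTime);

    APawn* Pawn = GetOwningPlayerPawn();
    if (!Pawn)
    {
        return;
    }

    // Look-at rotation from the player toward the attacker's captured world position.
    const FRotator ToAttacker = (AttackerLocationAtHit - Pawn->GetActorLocation()).Rotation();
    const FRotator View = Pawn->GetControlRotation();

    // Normalize the delta so the arrow always rotates the short way around.
    const float AngleDegrees = FRotator::NormalizeAxis(ToAttacker.Yaw - View.Yaw);
    ArrowImage->SetRenderTransformAngle(AngleDegrees);
}
```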
Ability Cooldowns: About seven weeks into the project, we made a key design shift: we removed bounty points and replaced them with ability cooldowns. This change introduced the need for a clear, intuitive way to communicate cooldown status to players. Taking inspiration from Overwatch, I implemented an ability cooldown timer system. Each inventory slot is assigned an item from the inventory array, and if that item has a cooldown, the UI binds to its cooldown start delegate. When an ability goes on cooldown, it notifies the UI and passes the cooldown duration. A background progress bar visually displays the cooldown using an animation whose playback speed is dynamically adjusted based on the cooldown length, by dividing the default animation speed by the cooldown time, creating a responsive and readable visual indicator for players. The same logic used for the ability cooldown animation was also applied to our weapon reload wheel. This consistency across UI elements helped players intuitively understand both ability and weapon timing at a glance.
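The play-rate trick is a one-liner once the cooldown duration arrives. A small sketch, assuming a fill animation and illustrative names like CooldownAnim and DefaultAnimSpeed (the rate at which the animation lasts exactly one second):

```cpp
// Stretch the cooldown sweep animation over the full cooldown duration by scaling its play rate.
void UAbilitySlotWidget::HandleCooldownStarted(float CooldownDuration)
{
    if (CooldownDuration <= 0.f)
    {
        return;
    }

    // e.g. a 1-second animation played at (1 / 8) speed fills over an 8-second cooldown.
    const float PlayRate = DefaultAnimSpeed / CooldownDuration;
    PlayAnimation(CooldownAnim, 0.f, 1, EUMGSequencePlayMode::Forward, PlayRate);
}
```

The same calculation drove the weapon reload wheel, which is what kept the two indicators visually consistent.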
Kill feedback: Continuing the trend of implementing feedback-driven features to enhance game feel, I created a kill feedback system. While simple in execution, it significantly improved clarity, satisfaction, and responsiveness. When a player secured a kill, an event ran on the killer's client, passing in the victim's PlayerState. This allowed us to reference the killer's HUD and display the victim’s name in an animated UI element. The result was a satisfying and informative visual that let players know not just that they got a kill—but who they took down.
Early playtests showed players couldn’t tell where incoming fire was coming from, especially when attacked off-screen, making engagements feel unfair and confusing.
Without any feedback on confirmed kills, players often didn’t realize they had eliminated an enemy, losing out on valuable context and satisfaction.
With both teams using the same character models and limited HUD indicators, many players struggled to quickly identify friend from foe, especially in chaotic moments.
Because the game was networked, getting vital UI updates and state changes to replicate consistently and accurately across all clients proved to be a constant technical challenge.
Creating damage indicator arrows that pointed to where the damager was at the time of the hit: This gave players crucial directional feedback, improving survivability and fairness during off-screen engagements.
Adding in both a kill feed displaying everyone's kills as well as a kill feedback animation that told the player who they had killed in a more accessible location on the screen: These systems enhanced clarity and satisfaction by clearly informing players of combat outcomes in real time.
To rectify this, we ensured that the local player's team always appeared as blue and the enemy team as red. For the UI, this meant that your team's score is always blue and your teammates always appear in blue in the kill feed. We extended this approach to name tags and hardpoint markers: This consistent color-coding across all UI elements helped eliminate team confusion and allowed players to quickly assess the situation.
I found that the best way to send information to every client was to utilize multicasts (events that run on every client) on blueprints that existed on every client (like the game state or player controller): Leveraging multicasts on widely replicated classes ensured reliable and consistent communication of key events across all clients.
World space UI was essential in Daemon for delivering critical in-game context without breaking player immersion. Elements such as hardpoint markers, player name tags, and damage numbers were all rendered in world space, allowing players to quickly identify objectives, track damage dealt, and distinguish between teammates and enemies. This approach ensured that important information was always spatially relevant, enhancing clarity and reinforcing the game's fast-paced, tactical gameplay.
Contextual Player Names: Player names were one of the first features where we encountered issues with network delay. To ensure every client had the correct name, I retrieved each player's PlayerState during BeginPlay, waiting until it was valid. Once retrieved, I pulled the player's name and updated a replicated RepNotify name variable. This variable was then used to assign the name to the player’s name tag widget on all clients. Later, we expanded this system to include team-based coloring: when assigning the name, the local player compared the tag owner's team to their own, displaying ally names in blue and enemy names in red. This gave players essential context—both who they were fighting and what team they were on.
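Sketched in C++, the flow looks roughly like this; the retry timer, ReplicatedDisplayName, NameTagWidget, and IsOnLocalPlayersTeam are illustrative stand-ins for our Blueprint setup:

```cpp
void ADaemonCharacter::BeginPlay()
{
    Super::BeginPlay();
    if (HasAuthority())
    {
        TryInitPlayerName();
    }
}

void ADaemonCharacter::TryInitPlayerName()
{
    APlayerState* PS = GetPlayerState();
    if (!PS)
    {
        // PlayerState is often not valid yet at BeginPlay; retry shortly.
        GetWorldTimerManager().SetTimer(NameInitTimer, this, &ADaemonCharacter::TryInitPlayerName, 0.1f, false);
        return;
    }

    ReplicatedDisplayName = PS->GetPlayerName();   // UPROPERTY(ReplicatedUsing = OnRep_DisplayName)
    OnRep_DisplayName();                           // OnRep does not fire on the server in C++, so apply locally too.
}

void ADaemonCharacter::OnRep_DisplayName()
{
    // Runs on each client only after the value has replicated; team comparison picks blue or red.
    if (NameTagWidget)
    {
        NameTagWidget->SetPlayerName(ReplicatedDisplayName, IsOnLocalPlayersTeam());
    }
}
```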
Contextual Hardpoint Markers: For most of the project, the hardpoint markers remained a static red, which often led to confusion—at a glance, it looked like the enemy team always held the objective. To address this and provide clearer context, I redesigned the hardpoint marker to dynamically change colors: it turns yellow when the point is uncontrolled or being contested, red when only enemy players are present, and blue when only players from the local player’s team are on the point.
Incrementing Damage Numbers: Another feedback-driven feature was the implementation of incrementing damage numbers. Initially, we had no plans to include damage numbers, but after testing, it became clear that players lacked sufficient feedback when hitting enemies and had little sense of how much damage they were dealing. Our first approach spawned static damage numbers at the hit location, replicated to all clients, but this cluttered the screen and confused non-involved players. Inspired by Apex Legends, we pivoted to a client-only incrementing system: when a player damages an enemy, a number appears near the enemy's head, visible only to the attacker. If more damage is dealt to the same target within a short window, the number increments and the display timer resets. This system gave players much clearer and more satisfying feedback on their cumulative damage.
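A simplified, client-only sketch of the incrementing behavior; it omits positioning the widget near the target's head, and the Client RPC, widget class, member names, and 1.5-second window are assumptions for illustration:

```cpp
// Declared UFUNCTION(Client, Reliable): only the attacker's machine ever sees the number.
void ADaemonPlayerController::ClientShowDamage_Implementation(AActor* Target, float Damage)
{
    const float Window = 1.5f;   // assumed reset window

    if (ActiveDamageNumber && ActiveDamageTarget == Target)
    {
        // Same target hit again before the window expired: increment instead of spawning a new number.
        AccumulatedDamage += Damage;
        ActiveDamageNumber->SetDamage(AccumulatedDamage);
    }
    else
    {
        AccumulatedDamage = Damage;
        ActiveDamageTarget = Target;
        ActiveDamageNumber = CreateWidget<UDamageNumberWidget>(this, DamageNumberClass);
        ActiveDamageNumber->SetDamage(AccumulatedDamage);
        ActiveDamageNumber->AddToViewport();
    }

    // Restart the expiry timer on every hit so the number lingers while damage keeps coming.
    GetWorldTimerManager().SetTimer(DamageNumberTimer, this, &ADaemonPlayerController::ClearDamageNumber, Window, false);
}

void ADaemonPlayerController::ClearDamageNumber()
{
    if (ActiveDamageNumber)
    {
        ActiveDamageNumber->RemoveFromParent();
        ActiveDamageNumber = nullptr;
        ActiveDamageTarget = nullptr;
    }
}
```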
Ping: The ping system was one of the first UI features we implemented, designed to let players quickly communicate locations in the world. For example, a player spotting an enemy could say, “look at my ping” to direct teammates. The mechanic is simple yet effective: when a player presses the ping button (middle mouse by default), it casts a ray from the center of their camera into the world. On impact, it spawns a ping actor at that location, which is set to always face the player and render above all other elements to ensure visibility and prevent clipping.
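The trace-and-spawn step might look like this in C++; APingActor, PingActorClass, and the 10,000-unit trace length are illustrative:

```cpp
// Fired from the ping input action (middle mouse by default).
void ADaemonCharacter::OnPingPressed()
{
    AController* PC = GetController();
    if (!PC)
    {
        return;
    }

    // Ray from the center of the player's camera out into the world.
    FVector ViewLocation;
    FRotator ViewRotation;
    PC->GetPlayerViewPoint(ViewLocation, ViewRotation);
    const FVector TraceEnd = ViewLocation + ViewRotation.Vector() * 10000.f;

    FHitResult Hit;
    FCollisionQueryParams Params;
    Params.AddIgnoredActor(this);

    if (GetWorld()->LineTraceSingleByChannel(Hit, ViewLocation, TraceEnd, ECC_Visibility, Params))
    {
        // The ping actor itself billboards toward each viewer and renders on top
        // of other geometry so it never clips out of sight.
        GetWorld()->SpawnActor<APingActor>(PingActorClass, Hit.ImpactPoint, FRotator::ZeroRotator);
    }
}
```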
In fast-paced combat scenarios, players often struggled to quickly identify whether another player was an ally or an enemy. This created confusion and hesitation in engagements, especially at a distance or when multiple players were present on screen.
The hardpoint marker originally remained a static red, regardless of who was in control. This caused frequent misreads during matches, as players assumed the enemy always held the objective, even when that wasn't the case. The lack of immediate, visual clarity on objective status hurt game readability and strategic decision-making.
Players had no way of knowing how much damage they were dealing or how low their target was, leading to confusion and unsatisfying combat. Without proper feedback, players often disengaged prematurely or overcommitted without realizing how close they were to securing a kill.
This was a recurring issue we encountered throughout development. Our initial approach was to add colored outlines to all player characters, with the color reflecting the player's team relative to the local client. However, due to the bulky armor and visual effects, these outlines often weren't distinct enough. To address this, we leveraged the existing name tag widgets—modifying their color dynamically per client to reflect the viewed player’s team. This made identifying allies and enemies much easier and more consistent, especially in chaotic moments.
Originally, the hardpoint marker remained a static red regardless of which team controlled the point, which led to confusion—players often assumed the enemy always had control. To solve this, we made the marker's color contextual. It now displays yellow when the point is neutral or contested, blue when controlled by the local player’s team, and red when controlled by the opposing team. This change gave players immediate visual feedback on the game state at a glance.
Players initially had no way to gauge how much damage they were dealing, making fights feel unsatisfying and disorienting. Our first implementation of damage numbers helped a bit, but it wasn’t until we introduced the incrementing damage number system that this challenge was fully addressed. By aggregating successive hits into a single floating number that updated in real time, players gained a clear, satisfying sense of the damage they were dealing. This feature had a strong positive impact on gameplay clarity and overall feel.
The scoreboard in Daemon allows players to see team compositions and assess individual performance through stats such as kills, deaths, and (eventually) assists—though assist tracking was not fully functional in the build featured on the right. Players can also view each other's builds using balance progress bars, offering insight into strategic loadout choices. Additionally, the scoreboard currently displays each player's maximum available credits, though we plan to replace this with a stat showing time spent on hardpoints to better reflect objective-focused play.
Team sensitive player names: To improve clarity and ensure consistency, we designed the scoreboard so that the local player’s team would always appear at the top, with a blue background to reinforce team color identity. Implementing this was straightforward: using the GameState, I accessed the PlayerStates for both teams. When the scoreboard is opened, it checks the local player's team assignment. If the player is on Team Two, for example, the top section of the scoreboard is populated with Team Two’s players, and the bottom with Team One’s, and vice versa. This small adjustment significantly improved readability during fast-paced matches.
Player stats: The PlayerState class played a critical role in powering our UI systems, particularly for tracking and displaying player statistics. One key example of this was our kill tracking system. As previously mentioned, when a player secured a kill, a server RPC was called with references to the killer, victim, and assister. Using these, I accessed each player’s PlayerState and incremented their respective stat categories—kills, deaths, and assists—all of which were replicated variables. Once updated, these variables triggered a delegate that notified the scoreboard to refresh, ensuring that each client always had accurate and up-to-date player stats.
To improve readability and reduce the mental effort required to scan the scoreboard, we wanted the local player’s team to always appear at the top of the list.
One issue we encountered was inconsistent or collapsed UI elements when a team had fewer than the maximum 8 players.
When the scoreboard was opened, it would first check the player’s team using the GameState. Based on that, it would determine which team’s player names should populate the top and which should go on the bottom. This ensured consistent visual hierarchy and helped players more quickly understand the match status.
To solve this, we used size boxes to lock in uniform slot dimensions. For name slots that weren’t used, we simply set their visibility to "Hidden" (rather than "Collapsed"), which kept them invisible to the player while still occupying layout space. This kept the scoreboard clean and evenly spaced even when the teams weren’t full.
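The key detail is the visibility enum: Hidden keeps the slot's layout space, while Collapsed removes it and shifts the rows. A tiny sketch with illustrative row-widget names:

```cpp
// Keep empty scoreboard slots invisible but still space-reserving.
for (UScoreboardRowWidget* Row : TeamRows)   // TeamRows: illustrative array of name-slot widgets
{
    const bool bInUse = Row->HasAssignedPlayer();
    Row->SetVisibility(bInUse ? ESlateVisibility::Visible : ESlateVisibility::Hidden);
}
```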
As with any game—especially competitive ones—Daemon’s settings menu was essential for giving players control over their experience and ensuring accessibility. It allowed users to remap keybinds for core actions, adjust mouse sensitivity, fine-tune a range of graphical settings, and customize their audio preferences. This level of flexibility helped players tailor the game to their personal playstyles and hardware setups, contributing to both comfort and performance.
Remapping keybinds: To ensure that players could tailor the controls to their preferences, we introduced a customizable keybinding system. This allowed players to rebind their inputs directly from the settings menu, improving accessibility and overall user experience. I implemented this system using the same foundational approach I used on Project Mercury. For a more in-depth breakdown of the process and implementation details, I encourage you to take a look at the Project Mercury section of my portfolio.
Multitude of graphical settings: We were mindful of ensuring the game remained technically accessible across a wide range of PC hardware. Beyond general optimization (which we actively worked on), one of the best ways to support this was by implementing a robust set of visual settings. This not only made the game more accessible to lower-end systems but also allowed players with high-end rigs to enjoy the experience at a higher fidelity. While Unreal Engine provides many of these settings by default, I created a custom UI button system to allow players to cycle through their available options. Pressing the left or right buttons would decrement or increment the underlying setting value, and the display text was dynamically pulled from a corresponding array based on the current value.
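As one hypothetical example of those cycle buttons, here is how a texture-quality button could work; the button only adjusts a pending value and its label, while the Apply flow described below pushes it into Unreal's UGameUserSettings. PendingTextureQuality, QualityLabels, and ValueText are illustrative members.

```cpp
void USettingCycleButton::OnRightArrowClicked() { CycleSetting(+1); }
void USettingCycleButton::OnLeftArrowClicked()  { CycleSetting(-1); }

void USettingCycleButton::CycleSetting(int32 Direction)
{
    // Increment or decrement the underlying value, clamped to the available options.
    const int32 MaxIndex = FMath::Max(QualityLabels.Num() - 1, 0);
    PendingTextureQuality = FMath::Clamp(PendingTextureQuality + Direction, 0, MaxIndex);

    // Display text is pulled from a parallel array indexed by the current value.
    ValueText->SetText(QualityLabels[PendingTextureQuality]);
}

void USettingCycleButton::ApplyPendingValue(UGameUserSettings* Settings)
{
    // Called by the settings menu's Apply flow (see below); nothing changes until then.
    Settings->SetTextureQuality(PendingTextureQuality);
    Settings->ApplySettings(false);
}
```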
Saving/applying and resetting settings: To prevent players from accidentally applying unwanted changes in the settings menu, we implemented an Apply Settings system. Any modified setting would be stored in a temporary variable and only applied to the game once the player confirmed their choices by pressing the Apply button. We also included a Reset to Defaults option to let players easily revert changes. In addition to applying the settings at runtime, pressing Apply also saved the current settings to the player’s machine using Unreal’s SaveGame object system, ensuring that preferences persist between sessions. To support this, I created a base class for settings menus that managed core functionality like apply and reset, while individual settings submenus (such as graphics or keybinds) overrode these base methods to handle their specific data.
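A minimal sketch of the Apply path using Unreal's SaveGame flow; UDaemonSettingsSave, the pending-value fields, the ApplyPendingToGame helper, and the slot name are assumptions for illustration:

```cpp
#include "Kismet/GameplayStatics.h"

// Base-class Apply: push the temporary values into the live game, then persist them.
void USettingsMenuBase::ApplySettings()
{
    ApplyPendingToGame();   // assumed: submenus override this to commit their specific data

    UDaemonSettingsSave* Save = Cast<UDaemonSettingsSave>(
        UGameplayStatics::CreateSaveGameObject(UDaemonSettingsSave::StaticClass()));

    Save->MouseSensitivity = PendingSensitivity;
    Save->FieldOfView      = PendingFOV;
    Save->Keybinds         = PendingKeybinds;

    // Written to the local machine so preferences persist between sessions.
    UGameplayStatics::SaveGameToSlot(Save, TEXT("PlayerSettings"), 0);
}
```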
In a networked game, we couldn’t simply pause the world like we would in a single-player project. Fully disabling input would also block UI hotkeys, preventing players from closing menus or navigating settings.
Given that some team members, including myself, were using mid-tier hardware like an RTX 2060 laptop, we needed to ensure that the game remained playable without sacrificing key visual elements.
Every player has different preferences for sensitivity, FOV, and input bindings, so we wanted to allow for a highly customizable experience.
In past single-player projects, we would often pause the game when menus opened, but that approach was not viable for this networked project. Simply disabling all non-UI input wasn't an option either, as it would prevent players from using hotkeys to close UI panes. The final solution involved broadcasting a delegate when the player opened a menu. This delegate disabled input for all items in the player's inventory and temporarily prevented the player and their camera from moving, effectively locking gameplay input without compromising UI usability.
To ensure the game was accessible on a variety of hardware setups, we introduced a wide range of customizable visual settings. Players could adjust texture, shadow, and VFX quality to help reduce graphical load or enhance visual fidelity, depending on their system's capabilities.
We also included settings for mouse sensitivity, control schemes, and FOV. This gave players the ability to tailor the core gameplay experience to their preferences, ensuring comfort and responsiveness regardless of playstyle.
The lobby menu in Daemon served as a critical transition point between joining the game and starting a match. It allowed players to view all connected participants, see which team each player was on, and prepare for the upcoming game. The menu supported team switching, displayed player names, provided the game code, and displayed the game's name based on which player was the host. This interface was important for organizing matches smoothly, creating a clear pre-game phase, and setting the stage for competitive play.
It was integral to communicate to players who was on each team so they could easily identify open slots, locate specific teammates, and confirm which team they were on after joining.
To ensure that team selection in the lobby was reflected for all players, I utilized multicast events. When a player clicked a team button, the system would first add them to the appropriate team array on the server. Then, using a multicast, the player’s name was added to the corresponding team array on every other client. Each client would search through its list of UI name slots and replace the first instance of “Empty” with the new player’s name. When a player left or changed teams, their name would be removed from the array and the corresponding UI slot would be reset back to “Empty.” This kept team information consistent and up to date across the lobby for all players.
Without a dedicated QA team, we took full responsibility for testing and balancing the game throughout development. To achieve this, we implemented a multi-pronged approach that included internal testing, weekly external playtests, and the use of detailed bug and crash reports. These strategies allowed us to identify and fix critical issues early, validate gameplay balance, and ensure technical stability across a variety of hardware configurations.
Our testing process began with rigorous internal evaluation. Any time a networked feature was implemented, the responsible programmer would pair with a teammate—ideally another programmer—for immediate in-editor testing to verify proper replication and functionality across the network. If issues were discovered, one person would run as the server and the other as the client, stepping through the debugger to identify replication hiccups or timing issues. Once a feature passed editor testing, we moved on to testing in packaged builds—the same format we provided to players and industry professionals—creating upwards of 10 to 20 builds per week. If a bug only surfaced in the build version, we would revisit the codebase to identify areas susceptible to network latency and attempt to reproduce the issue. Because many of our bugs stemmed from timing and replication problems, we eventually established internal playtests with 4–8 team members, both in-editor and in builds. While most bugs were resolved at the one-on-one testing level, the full team sessions were essential for catching the edge cases that slipped through. These structured internal tests became a priority after early external playtests revealed critical issues that hindered gameplay evaluation.
In addition to our internal testing efforts, we hosted weekly external playtests to gather feedback on balance, gameplay flow, and player experience. These sessions were designed to be as bug-free as possible, which is why we implemented our multi-layered internal testing system beforehand. Nonetheless, issues would still emerge—often due to players interacting with the game in unexpected ways or from network stress when matches exceeded eight connected users (including spectators). When bugs arose, we quickly made use of our reporting form, and programmers were prompt in addressing the issues. With tests typically held on Fridays, Saturdays, and Sundays, our dev team would often spend Friday nights and Saturday mornings fixing and testing bugs found during earlier sessions. These playtests drew a diverse group of participants, including industry professionals, competitive FPS players, and our fellow students at Michigan State.
We implemented a shared Google Form for bug and crash reporting across both internal and external playtests. When a player encountered a bug, they were asked to document what happened, how it affected gameplay, steps to reliably reproduce it, and—if possible—include a recording of the issue. For crashes, players were also instructed to submit their crash logs. Prior to each external playtest, we walked participants through how to access the form and retrieve crash logs to ensure a smooth reporting process. After every internal 4+ player session and all external tests, programmers reviewed new reports, prioritized fixes, and updated the team on what had been resolved, how it was fixed, and when. To maintain quality control, no bug was considered fully resolved until it was retested and verified in a 6+ player packaged build environment following the documented reproduction steps.
Around week 5 or 6, once our game had reached a fully networked and playable state, we began encountering crashes that affected either the server, individual clients, or both. These crashes disrupted our playtests and hindered player enjoyment, making it clear that we needed a more rigorous debugging process. We turned to Unreal’s crash logs to diagnose issues. After a crash, players could navigate to the project’s Saved/Crashes folder and locate a crash log (.txt file). In approximately 70% of cases, reading the final lines of this file provided enough context to identify the issue. For the remaining 30%, we used the memory dump (.dmp) file in conjunction with Visual Studio and our PDB (program database) file to pinpoint the specific line or area of code causing the crash.
This workflow helped us identify and fix several major issues. One critical crash involved inconsistent player respawns—using the memory dump, we discovered that an object had been placed on a player spawn point, and our respawn logic hadn’t accounted for blocked spawns. Another recurring crash stemmed from accessing an invalid array index; we traced the issue to a skeletal mesh asset with a malformed bone hierarchy.
Optimization is even more critical in networked projects than in single-player games—especially when the game is competitive, as ours is. One key area where we learned this was in our UI implementation in Unreal Engine. In my earlier experience, I commonly used UI bindings to connect elements like text fields or progress bars directly to player variables (e.g., binding a health bar’s percentage directly to a health variable). This seemed like best practice, as it's what my professors taught and what most online tutorials demonstrated. In practice, though, those property bindings are re-evaluated every frame whether or not the value has changed, which adds avoidable per-frame cost across a full HUD. For this project, I shifted to event-driven updates that only touch a widget when the underlying value actually changes.
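The event-driven alternative looks roughly like this in C++ (our version was Blueprint); the OnHealthChanged delegate, GetHealth/GetMaxHealth accessors, and widget names are illustrative.

```cpp
// Bind once to a change delegate instead of evaluating a property binding every tick.
void UPlayerHealthWidget::NativeConstruct()
{
    Super::NativeConstruct();

    if (ADaemonCharacter* Character = Cast<ADaemonCharacter>(GetOwningPlayerPawn()))
    {
        // OnHealthChanged is assumed to be a dynamic multicast delegate on the character;
        // HandleHealthChanged must be a UFUNCTION to bind dynamically.
        Character->OnHealthChanged.AddDynamic(this, &UPlayerHealthWidget::HandleHealthChanged);
        HandleHealthChanged(Character->GetHealth(), Character->GetMaxHealth());
    }
}

void UPlayerHealthWidget::HandleHealthChanged(float Health, float MaxHealth)
{
    // Only runs when health actually changes, so idle frames cost nothing.
    HealthBar->SetPercent(MaxHealth > 0.f ? Health / MaxHealth : 0.f);
}
```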
Another bad practice I fell into in earlier projects was cramming too many UI elements into a single widget, rather than breaking them down into modular, reusable components. For this project, I focused on improving that by creating custom UI components that I could easily reuse across different parts of the interface.
For instance, I decomposed our HUD into individual widgets such as the score overlay, kill feed, game state notifications, and player health. This not only made each component easier to manage—with much cleaner and less cluttered Event Graphs—but also allowed me to selectively display HUD elements based on the context. Since we had both a player HUD and a spectator HUD, this modularity made it easy to reuse and rearrange components as needed.
I also applied this principle to our buy menu by creating a custom button widget. Instead of manually creating and managing over 20 unique buttons for each item, I built a single customizable button widget that accepted an item class as a parameter. This approach saved significant development time and greatly simplified the logic in the buy menu’s Event Graph.
As we discovered during this project, network delay poses a significant challenge in online development, especially when using replicated variables that are modified on the server and then accessed by the client. A reliable way to mitigate this issue is by using RepNotify variables. These are variables that, when changed on the server, trigger a notification function on the client only after the updated value has been successfully replicated. This ensures that any client-side logic relying on the variable’s updated state only runs once the data is valid and synchronized, preventing errors or unintended behavior caused by premature access.
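For reference, the equivalent C++ pattern is shown below; in Blueprints this is simply marking the variable as RepNotify. The class, property, and function names are illustrative, and DOREPLIFETIME requires including "Net/UnrealNetwork.h".

```cpp
// Minimal RepNotify pattern: the OnRep function runs on clients only after the new
// value has arrived, so dependent logic never reads stale or unreplicated data.
UCLASS()
class ADaemonCharacter : public ACharacter
{
    GENERATED_BODY()

protected:
    UPROPERTY(ReplicatedUsing = OnRep_HeldItem)
    AInventoryItemBase* HeldItem = nullptr;

    UFUNCTION()
    void OnRep_HeldItem();

    virtual void GetLifetimeReplicatedProps(TArray<FLifetimeProperty>& OutLifetimeProps) const override;
};

void ADaemonCharacter::GetLifetimeReplicatedProps(TArray<FLifetimeProperty>& OutLifetimeProps) const
{
    Super::GetLifetimeReplicatedProps(OutLifetimeProps);
    DOREPLIFETIME(ADaemonCharacter, HeldItem);
}

void ADaemonCharacter::OnRep_HeldItem()
{
    // Safe to enable the item here: the pointer is guaranteed to be the replicated value.
    if (HeldItem)
    {
        HeldItem->SetItemInputEnabled(true);
        HeldItem->SetItemVisible(true);
    }
}
```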