Escape Area 51 is a top-down stealth game. You play an alien who crashed its spacecraft and ended up in a lab called Area 51, where you are going to be dissected for research. You have nothing but yourself to rely on.
This was a group project for a course. My main role in the group was developer: most of the code, such as the enemy AI and the interaction of throwable items, was written by me.
I was not the designer on the team, so I will talk about the project from a developer's perspective.
The guard, the enemy in this game, moves along a set patrol pattern. This was the most difficult part of the whole development for me at the time, so it will be one of the main topics here.
In the design document, the designer wanted the guard to "see" and "hear", and to react based on that information (for example, by chasing the player character). Meanwhile, the player cannot attack a guard directly, only indirectly by using the items found on the map.
There is also a lighting system, and the lights affect the guards as well.
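Putting these requirements together, the guard's reactions can be thought of as a small state machine. Below is a minimal sketch of that logic in Python; the project itself was written in Unity C#, and all names here are illustrative, not taken from the actual code:

```python
from enum import Enum, auto

class GuardState(Enum):
    PATROL = auto()        # follow the preset patrol route
    CHASE = auto()         # player seen directly: chase them
    INVESTIGATE = auto()   # heard a sound or lights went out: go check

def next_state(state, sees_player, heard_sound):
    """Pick the guard's next state from what he senses this frame.

    Simplified: seeing the player always wins over hearing, and the
    guard falls straight back to patrolling when nothing is sensed.
    """
    if sees_player:
        return GuardState.CHASE
    if heard_sound:
        return GuardState.INVESTIGATE
    return GuardState.PATROL
```

In the real game the transitions were not this clean (for example, a guard keeps moving toward the last seen position for a while), but the priority order is the same.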
It is important to mention that I didn’t write the pathfinding code for the guard. I used A* Pathfinding Project for this.
Vision and hearing
This was the first time I had implemented anything like this, and it took me some time to think it through. My final solution is as follows:
Firstly, there are two colliders representing the ranges of these senses. The guard's hearing is a circular area centred on the guard, and his vision is a capsule-shaped area in front of him.
When the player enters the collider that represents the visual range, the system casts a ray from the guard towards the player character. If this ray hits the map layer, there is an obstacle (e.g. a wall) between them. If it hits nothing, the guard can see the player character directly, which triggers the chase.
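The line-of-sight test itself is language-agnostic. The following Python sketch (illustrative only; the project used Unity's built-in physics raycast against the map layer) treats walls as 2D segments and checks whether the guard-to-player ray crosses any of them:

```python
def segments_intersect(p1, p2, p3, p4):
    """True if segment p1-p2 properly crosses segment p3-p4 (2D cross-product test)."""
    def cross(o, a, b):
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])
    d1 = cross(p3, p4, p1)
    d2 = cross(p3, p4, p2)
    d3 = cross(p1, p2, p3)
    d4 = cross(p1, p2, p4)
    return ((d1 > 0) != (d2 > 0)) and ((d3 > 0) != (d4 > 0))

def can_see(guard_pos, player_pos, walls):
    """Guard sees the player if no wall segment blocks the ray between them."""
    return not any(segments_intersect(guard_pos, player_pos, a, b)
                   for a, b in walls)
```

This ignores edge cases such as rays exactly touching a wall endpoint, which a physics engine handles for you; it only shows the idea of "ray blocked means hidden".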
The guards' hearing does not react to the player directly, but to items thrown by the player. This system is a little more complex than the vision system.
If a thrown object stops within a guard's hearing range, it makes a "sound": the object creates a "sound source" at its position. The guard constantly checks for sound sources within his hearing range and goes to investigate as soon as he detects one.
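A minimal sketch of that hearing logic, again in Python rather than the project's Unity C# (all names are made up for illustration):

```python
import math

def landed(item_pos, sound_sources):
    """When a thrown item stops, it leaves a 'sound source' at its position."""
    sound_sources.append(item_pos)

def nearest_heard_sound(guard_pos, hearing_radius, sound_sources):
    """Return the closest sound source within hearing range, or None.

    The circular hearing collider reduces to a simple distance check.
    """
    in_range = [s for s in sound_sources
                if math.dist(guard_pos, s) <= hearing_radius]
    return min(in_range, key=lambda s: math.dist(guard_pos, s), default=None)
```

In Unity the circular collider does the distance filtering for you via trigger callbacks; the distance check here is just the same idea made explicit.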
In this game, the large map is divided into rooms. Each room has its own controller script that manages everything in the room, including guards, lights, etc. When the player turns off the lights by interacting with the light switch, the closest guard goes to check it and tries to turn the lights back on.
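The "closest guard responds" rule is just a nearest-neighbour pick by the room controller. A tiny Python sketch of that decision (illustrative, not the project's actual code):

```python
import math

def guard_to_send(switch_pos, guard_positions):
    """Room controller picks the index of the guard closest to the switch."""
    return min(range(len(guard_positions)),
               key=lambda i: math.dist(switch_pos, guard_positions[i]))
```

Note that straight-line distance is a simplification; with walls in the way, path length from the pathfinder would be the fairer metric.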
The version of Unity we used at the time did not support 2D lighting, so I used 3D lights and modified materials to achieve the visual effect.
By the end of the course, there were still a number of bugs in the project that had not been fully fixed. For example, guards sometimes do not try to turn the lights back on after they are switched off, and guards are slow to turn around. After reflecting as a group, we concluded that, to avoid these problems recurring in future projects, we should:
- Plan the development timeline more rationally.
- Adjust the scale of the design to the available development time.
- Optimise staffing and allocate development work more sensibly.
You can download both the build and the Unity project from GitHub.
- [W / A / S / D] move the character
- [Mouse] aim
- [Left mouse button] throw an item
- [1 / 2 / 3] switch items
- [E] interact with items and switches
- A* Pathfinding Project. Retrieved from https://arongranberg.com/astar/#