7.7.2025 - 3.8.2025 (Week 11 - Week 15)
Gam Jze Shin / 0353154
Experiential Design / Bachelor of Design in Creative Media
Task 4: Final Project
Index
1. Instructions
2. Task 4: Final Project
3. Feedback
4. Reflection
Students will synthesise the knowledge gained in Tasks 1, 2 and 3 for application in Task 4. Students will create and integrate visual assets and refine the prototype into a complete, working and functional product experience.
Task 4 Progress
Since I already had the homepage and bottom navigation set up, I just needed to add the graph editor and link the new page so that it loads its scene from the previous page.
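As a rough sketch of the scene-loading part only (assuming a button on the homepage and a placeholder scene name such as "ARScene" that has been added to Build Settings; this is not my exact script), the hook-up could look something like this:

```csharp
using UnityEngine;
using UnityEngine.SceneManagement;

// Attached to a GameObject in the homepage scene.
public class SceneLoader : MonoBehaviour
{
    // Name of the scene to load; must be added in File > Build Settings.
    // "ARScene" is a placeholder name.
    [SerializeField] private string sceneName = "ARScene";

    // Assigned to the homepage button's OnClick event in the Inspector.
    public void LoadTargetScene()
    {
        SceneManager.LoadScene(sceneName);
    }
}
```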

fig 2.5 Effect added
Then, I needed to add a 3D environment, so I searched the Unity Asset Store for a suitable road environment model. I also looked for 3D models of road signs to include in the scene.
When I added the 3D environment models into my Unity project, I noticed that some of them appeared pink. This indicated that the materials were missing or incompatible. I consulted ChatGPT for help and discovered that the issue was due to the models being created with an older version of Unity. To resolve this, I upgraded the project's render pipeline to URP (Universal Render Pipeline). After updating the materials to be compatible with URP, the textures were successfully restored. Through this process, I also learned that assets from older Unity versions often require material or shader upgrades to display correctly in newer Unity projects.
In my 3D view, I added an environment to represent a road scenario. Then, I placed the necessary 3D models as prefabs within the scene, which are essential for the quiz level functionality.
A script also needed to be written in Visual Studio to link the Box Collider with the reminder message. The screenshot above shows the script I wrote to handle that interaction.
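As a rough sketch of the general idea (assuming the Box Collider is set to Is Trigger, the reminder is a UI panel assigned in the Inspector, and the tag check and names here are placeholders rather than my exact script), it could look something like this:

```csharp
using UnityEngine;

public class ReminderTrigger : MonoBehaviour
{
    // UI panel holding the reminder message, assigned in the Inspector.
    [SerializeField] private GameObject reminderPanel;

    private void OnTriggerEnter(Collider other)
    {
        // Show the reminder when the tagged object enters the Box Collider.
        if (other.CompareTag("Player"))
        {
            reminderPanel.SetActive(true);
        }
    }

    private void OnTriggerExit(Collider other)
    {
        // Hide the reminder again once the object leaves the collider.
        if (other.CompareTag("Player"))
        {
            reminderPanel.SetActive(false);
        }
    }
}
```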
Besides the road sign, I also added a plane in the scene to display a video. Depending on the selected road sign, a different video will be shown on the plane. I included a 'Play Video' button to start the video and a 'Close' button to stop or hide the video display.
Video From: https://www.youtube.com/watch?v=JU0PXiLPw3U
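As a rough sketch of how the video plane could be controlled (assuming the plane has a VideoPlayer component attached, and that the field and method names here are placeholders rather than my exact script), the Play and Close buttons might call something like this:

```csharp
using UnityEngine;
using UnityEngine.Video;

public class SignVideoController : MonoBehaviour
{
    // Plane that renders the video, assigned in the Inspector.
    [SerializeField] private GameObject videoPlane;

    // VideoPlayer component on the plane.
    [SerializeField] private VideoPlayer videoPlayer;

    // Different road signs can swap in their own clip before playing.
    public void SetClip(VideoClip clip)
    {
        videoPlayer.clip = clip;
    }

    // Called by the 'Play Video' button's OnClick event.
    public void PlayVideo()
    {
        videoPlane.SetActive(true);
        videoPlayer.Play();
    }

    // Called by the 'Close' button's OnClick event.
    public void CloseVideo()
    {
        videoPlayer.Stop();
        videoPlane.SetActive(false);
    }
}
```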
I also added a script to handle raycasting, which detects when the user interacts with objects like the road sign. I used the console to display debug messages to confirm that the raycast is working correctly.
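A minimal sketch of that kind of raycast check (using the legacy Input system and placeholder names, so not my exact script) looks roughly like this:

```csharp
using UnityEngine;

public class SignRaycaster : MonoBehaviour
{
    // AR camera used as the origin of the ray, assigned in the Inspector.
    [SerializeField] private Camera arCamera;

    private void Update()
    {
        // A tap on Android is also reported as mouse button 0 by the legacy Input system.
        if (Input.GetMouseButtonDown(0))
        {
            // Cast a ray from the tap position through the AR camera.
            Ray ray = arCamera.ScreenPointToRay(Input.mousePosition);

            if (Physics.Raycast(ray, out RaycastHit hit))
            {
                // Console message confirms which object (e.g. a road sign) the ray hit.
                Debug.Log("Raycast hit: " + hit.collider.gameObject.name);
            }
        }
    }
}
```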
After completing everything, I started the build and run process to test the app on an Android phone. The initial build took a longer time because many assets needed to be processed and exported. However, after the first build, subsequent builds became faster—even when I added new elements—because Unity only needed to recompile the updated changes.
During the first export attempt, an error occurred and the build process failed. I checked the Console in Unity to identify the issue, reviewed the error message, and fixed the problem accordingly. After resolving it, the export was successful. I tested the app on an Android phone and confirmed that everything was working as intended. With the app functioning properly, I was finally able to proceed with recording the walkthrough video and preparing my presentation. I am glad that the app now runs smoothly from Unity to the Android device.
Google Drive
Click here to view project file in Google Drive.
Presentation Video
Click here to view Task 4 Presentation Video in YouTube.
fig 2.28 Task 4 Presentation Video
AR Project Final Video Walkthrough
Click here to view Task 4 Video Walkthrough in YouTube.
fig 2.29 AR Walkthrough Video
Week 14
General Feedback
Mr. Razif mentioned that we should ensure the features we include are functional—if a feature doesn't work properly, we shouldn't proceed with it. Instead, we should focus on simpler features that work reliably.
Reflection
Over the past few weeks, I have become more familiar with using Unity. However, importing 3D assets was initially challenging for me. To make my app more functional and engaging, I had to explore how to find, download, and import assets into Unity. Sometimes the imported models didn’t display correctly due to missing materials, which required me to troubleshoot and find solutions. Another challenge was exporting the app to my phone—errors would occur during the build process, and I had to identify and resolve them before I could successfully run the app. Despite the difficulties, I felt a great sense of accomplishment when everything worked, and I learned a lot through this AR project.
There are many details to pay attention to, especially when linking UI elements with script functions. It's important to test them repeatedly to ensure everything works correctly. I also observed that AR content anchored to a ground plane in Unity doesn't always appear directly in front of the user; it depends on where the ground plane is detected. Sometimes the AR content appears too near or too far, which can affect the user experience. Although this issue limits placement accuracy, most of the functionality still works as intended.