Explorations in visionOS
____
This documentation showcases my ongoing journey into iOS development, where I've been experimenting with Apple's visionOS framework to create immersive applications. My journey began during my Master's program, when I first encountered Reality Composer and marveled at its evolution into Reality Composer Pro. Over the past two years, I've been steadily expanding my iOS development skills through summer courses and online Kodeco classes. This series of documentation highlights the prototypes I've been crafting, representing the culmination of my academic foundation, practical experience, and enthusiasm for iOS development.
____
RealityKit, SwiftUI, ARKit, gesture-based interaction, Reality Composer Pro, shader graphs, geometry modifiers, particle systems.
Blog: Interactive 3D Buttons in RealityKit: Implementing Tap Gestures and 3D Buttons from Reality Composer Pro
____
In this blog post, I explore the challenges and solutions encountered while developing an interactive 3D button interface with RealityKit in SwiftUI. The project involved creating multiple buttons within a 3D scene, each with a unique function triggered by user taps. Given the limitations of SpatialTapGesture for handling multiple entities, I took a different approach. This post walks through setting up the buttons and implementing tap and hover detection, as sketched below.
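As a rough illustration of the general pattern (not the exact code from the prototype), here's a minimal visionOS sketch that handles taps on multiple entities with a single gesture: one `TapGesture().targetedToAnyEntity()` is attached to a `RealityView`, and the handler branches on the tapped entity's name. The button names, layout, and actions are placeholders; `HoverEffectComponent` is included to get the system's built-in hover highlight.

```swift
import SwiftUI
import RealityKit

struct ButtonsView: View {
    var body: some View {
        RealityView { content in
            // Create three simple box "buttons", each named so that
            // taps can be routed to the right action later.
            for (index, name) in ["play", "pause", "stop"].enumerated() {
                let button = ModelEntity(
                    mesh: .generateBox(size: 0.1),
                    materials: [SimpleMaterial(color: .blue, isMetallic: false)]
                )
                button.name = name
                button.position = [Float(index) * 0.15 - 0.15, 0, 0]
                // Collision shapes and an input target component are
                // required for an entity to receive gestures.
                button.generateCollisionShapes(recursive: false)
                button.components.set(InputTargetComponent())
                // System hover highlight when the user looks at the button.
                button.components.set(HoverEffectComponent())
                content.add(button)
            }
        }
        // One gesture covers every entity; the tapped entity is exposed
        // on the gesture value, so we switch on its name.
        .gesture(
            TapGesture()
                .targetedToAnyEntity()
                .onEnded { value in
                    switch value.entity.name {
                    case "play":  print("Play tapped")
                    case "pause": print("Pause tapped")
                    case "stop":  print("Stop tapped")
                    default:      break
                    }
                }
        )
    }
}
```

The key design choice is registering a single gesture at the view level rather than one per entity, which scales to any number of buttons without duplicating gesture plumbing.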
____
Blog: Text Placement on 3D Cubes in RealityKit
____
This prototype series covers the challenges and solutions involved in placing 3D text on a cube in RealityKit. The process starts with creating a basic RealityKit scene, followed by generating a text mesh, and finally attaching the text accurately to the front face of the cube, as in the sketch below.
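Here's a minimal sketch of that pipeline using RealityKit's `MeshResource.generateText` API. The helper name `makeLabeledCube`, the sizes, and the offsets are illustrative, not the prototype's actual values; the main gotcha it demonstrates is that the generated text mesh is not centered on the entity's origin, so it must be re-centered before being parented to the cube.

```swift
import UIKit
import RealityKit

/// Builds a cube with 3D text attached to its front (+Z) face.
func makeLabeledCube(text: String) -> ModelEntity {
    let cubeSize: Float = 0.2
    let cube = ModelEntity(
        mesh: .generateBox(size: cubeSize),
        materials: [SimpleMaterial(color: .gray, isMetallic: false)]
    )

    // Generate an extruded text mesh. Font size is in meters here,
    // so 0.05 produces letters about 5 cm tall.
    let textMesh = MeshResource.generateText(
        text,
        extrusionDepth: 0.005,
        font: .systemFont(ofSize: 0.05),
        containerFrame: .zero,
        alignment: .center,
        lineBreakMode: .byWordWrapping
    )
    let textEntity = ModelEntity(
        mesh: textMesh,
        materials: [SimpleMaterial(color: .white, isMetallic: false)]
    )

    // The text mesh's origin sits at its lower-left corner, so shift
    // it by the negative of its bounds center to center it on the
    // face, then push it just in front of the cube's +Z surface to
    // avoid z-fighting.
    let bounds = textEntity.visualBounds(relativeTo: nil)
    textEntity.position = [
        -bounds.center.x,
        -bounds.center.y,
        cubeSize / 2 + 0.001
    ]
    cube.addChild(textEntity)
    return cube
}
```

Because the text is a child of the cube, it inherits the cube's transform, so rotating or moving the cube keeps the label glued to its front face.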
____