Squat Analysis
After injuring myself too many times while doing squats, I decided it was finally time to learn the proper form... by building an app.
Available now for the iPhone
Role: iOS engineer, mobile vision developer, UX/UI designer
Software: Swift, SwiftUI, Vision framework, AVFoundation, Core Motion, LiDAR/Depth API, Blender, 3D modeling, character animation/rigging, After Effects, particle animation.
Goal:
1. Understand Apple's Vision framework for pose detection.
2. Create a privacy-first fitness tool that processes everything on-device.
3. Explore real-time biomechanical analysis through camera sensors and camera perspectives.
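The first goal maps onto Vision's body-pose API. A minimal sketch of on-device detection, assuming a still `CGImage` input; the joint choices and the 0.3 confidence cutoff are illustrative, not the app's actual values:

```swift
import Vision
import CoreGraphics

// Minimal sketch of on-device pose detection with Vision.
// All processing stays local; no frames leave the phone.
func detectSquatJoints(in image: CGImage) {
    let request = VNDetectHumanBodyPoseRequest { request, _ in
        guard let observation = request.results?.first
                as? VNHumanBodyPoseObservation else { return }
        // Pull the joints most relevant to squat form.
        if let knee = try? observation.recognizedPoint(.leftKnee),
           let hip = try? observation.recognizedPoint(.leftHip),
           knee.confidence > 0.3, hip.confidence > 0.3 {
            print("Knee:", knee.location, "Hip:", hip.location)
        }
    }
    let handler = VNImageRequestHandler(cgImage: image, orientation: .up)
    try? handler.perform([request])
}
```

In the live app the same request would run per camera frame via `AVFoundation`, but the request/observation flow is identical.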
UI Design Process:
1. Create the UI in Figma, then bring it into Photoshop for final touches
2. Set the scene inside Blender
3. Import the UI, set the camera and lighting
4. Render the image and bring it into Xcode to develop the UI card
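The last step above can be sketched in SwiftUI: the Blender render ships as a bundled asset (`SquatCardRender` is a placeholder name, not the app's actual asset) and gets wrapped in a card view so live feedback can be layered on top:

```swift
import SwiftUI

// Sketch of wrapping a pre-rendered Blender image in a SwiftUI card.
// "SquatCardRender" is a placeholder asset-catalog name.
struct RenderedCard: View {
    var body: some View {
        Image("SquatCardRender")
            .resizable()
            .scaledToFit()
            .clipShape(RoundedRectangle(cornerRadius: 16))
            .shadow(radius: 8)  // lifts the card off the camera feed
    }
}
```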
Development Process: September 2025 - January 1, 2026 (4 months)
The overall building process always required real-world testing since it couldn’t be validated in the Xcode simulator. These were some of the variables to account for:
1. Camera angle and distance
2. Lighting conditions and user clothing
3. The strongest detection points are the knees and lower body, while the wrists and upper body are harder to track
4. Configuring custom detection logic for both the front and rear camera lenses, including LiDAR support
5. Designing responsive text/UI information based on the user’s distance from the iPhone
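The knee/wrist asymmetry above suggests per-joint confidence thresholds: trust lower-body joints more, and accept weaker evidence (or skip) for the wrists. A sketch, with illustrative thresholds rather than the app's actual values:

```swift
import Vision

// Sketch: weight joints by how reliably Vision tracks them.
// Thresholds here are illustrative, not the app's actual values.
let minimumConfidence: [VNHumanBodyPoseObservation.JointName: Float] = [
    .leftKnee: 0.5, .rightKnee: 0.5,    // strong, stable detection
    .leftAnkle: 0.5, .rightAnkle: 0.5,
    .leftWrist: 0.2, .rightWrist: 0.2   // noisy; accept weaker evidence
]

func usableJoints(from observation: VNHumanBodyPoseObservation)
    -> [VNHumanBodyPoseObservation.JointName: VNRecognizedPoint] {
    var joints: [VNHumanBodyPoseObservation.JointName: VNRecognizedPoint] = [:]
    for (name, threshold) in minimumConfidence {
        if let point = try? observation.recognizedPoint(name),
           point.confidence >= threshold {
            joints[name] = point  // keep only joints we can trust
        }
    }
    return joints
}
```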
Fun design challenge: how do you design an app that sits five feet away from the user, one they can't physically touch while exercising?
In traditional video games, secondary information is usually placed along the edges because the main experience happens at the center of the screen. But in a squat analysis app, the user is the center of the experience — so they need to clearly see feedback while moving.
That meant every piece of text had to be concise and readable from a distance. The main text pop-ups had to be placed at the top of the screen. The bottom area was already occupied by foot analysis visuals, and when the iPhone tilts upward during use, it further reduces the visible space at the bottom. Positioning feedback at the top ensured it stayed clear, visible, and unobstructed.
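One way to keep that feedback readable from five feet is to scale the font with the user's measured distance (e.g. from the LiDAR depth map). A sketch in plain Swift; the base size, reference distance, and cap are illustrative assumptions, not the app's actual values:

```swift
// Sketch of distance-responsive text: scale the feedback font with the
// user's distance from the phone so it stays legible from five feet away.
// Breakpoints are illustrative, not the app's actual values.
func feedbackFontSize(forDistanceMeters distance: Double) -> Double {
    let baseSize = 17.0           // comfortable at arm's length (~0.5 m)
    let referenceDistance = 0.5
    // Grow linearly with distance, clamped to a sane range.
    let scaled = baseSize * (distance / referenceDistance)
    return min(max(scaled, baseSize), 64.0)
}
```

At arm's length this returns the base 17 pt; at two meters it hits the 64 pt cap.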
Follow the development journey on Instagram.