Techniques

Software, hardware, APIs, and approaches used to create our app.

Software

Xcode

IDE used to develop our project using UIKit and SwiftUI.

Reality Composer

Apple's augmented reality authoring tool for creating interactive, animated USDZ models that can be imported directly into AR apps. We used it to create our app's models and their corresponding animations.

Hardware

LiDAR-Equipped Phone

Used in conjunction with RealityKit, the LiDAR sensor allows the app to detect multiple surfaces and scale and position the weather models accordingly. It also provides lighting estimates that allow the models to be lit realistically in the rendering.

APIs

OpenWeather

This API allows our app to fetch real-time weather data for any city worldwide.
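As a rough sketch of how such a fetch might look, the snippet below calls OpenWeather's current-weather endpoint with `units=metric` (matching the Celsius display in our UI). The `CurrentWeather` struct and the `apiKey` parameter are illustrative assumptions, not the app's actual code.

```swift
import Foundation

// Minimal shape of the fields we care about from the OpenWeather response.
struct CurrentWeather: Decodable {
    struct Main: Decodable { let temp: Double }      // temperature in °C with units=metric
    struct Weather: Decodable { let main: String }   // condition, e.g. "Clear", "Rain"
    let main: Main
    let weather: [Weather]
}

// Fetch the current weather for a city; the caller supplies an OpenWeather API key.
func fetchWeather(for city: String, apiKey: String) async throws -> CurrentWeather {
    var components = URLComponents(string: "https://api.openweathermap.org/data/2.5/weather")!
    components.queryItems = [
        .init(name: "q", value: city),
        .init(name: "units", value: "metric"),
        .init(name: "appid", value: apiKey)
    ]
    let (data, _) = try await URLSession.shared.data(from: components.url!)
    return try JSONDecoder().decode(CurrentWeather.self, from: data)
}
```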

RealityKit

Apple's augmented reality framework, written in Swift, that simplifies the development of AR experiences. Used in conjunction with the LiDAR sensor on newer iOS devices, it provides high-quality, physically based rendering of our models.

ARKit

Provides the sensing data for our app using LiDAR. It supports occlusion of surfaces and people, adaptive lighting, and basic object detection. Paired with RealityKit, it lets our app easily create AR scenes.
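The features above map onto an ARKit session configuration roughly like the one below. The option names (`sceneReconstruction`, `personSegmentationWithDepth`, light estimation) are ARKit's public API, but the overall setup here is a sketch, not our app's exact code.

```swift
import ARKit
import RealityKit

// Configure an ARView's session with LiDAR meshing, people occlusion,
// and lighting estimation, guarded by device-capability checks.
func configureSession(for arView: ARView) {
    let config = ARWorldTrackingConfiguration()

    // LiDAR-based scene reconstruction: a live mesh of surrounding surfaces.
    if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
        config.sceneReconstruction = .mesh
    }

    // Occlude virtual content behind real people.
    if ARWorldTrackingConfiguration.supportsFrameSemantics(.personSegmentationWithDepth) {
        config.frameSemantics.insert(.personSegmentationWithDepth)
    }

    // Adaptive lighting estimation for realistic shading of the models.
    config.isLightEstimationEnabled = true

    arView.session.run(config)
}
```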

Our Approaches

UI

Taking advantage of both UIKit and SwiftUI, the app displays the current city along with its temperature (in Celsius) and weather conditions. A search bar lets the user look up the weather of any city, and a hide mesh button lets them toggle the visibility of the mesh created by RealityKit.
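A SwiftUI sketch of that overlay might look like the following; the view name, state, and layout are hypothetical stand-ins for illustration, assuming the weather values arrive from elsewhere in the app.

```swift
import SwiftUI

// Illustrative overlay: search bar, current city with temperature and
// conditions, and a button that toggles the RealityKit mesh display.
struct WeatherOverlayView: View {
    @State private var searchText = ""
    @State private var showMesh = true
    var city = "Santa Barbara"       // placeholder values; the real app
    var temperatureC = 21.0          // fills these from OpenWeather data
    var condition = "Clear"

    var body: some View {
        VStack(spacing: 12) {
            TextField("Search for a city", text: $searchText)
                .textFieldStyle(.roundedBorder)
            Text("\(city): \(temperatureC, specifier: "%.0f")°C, \(condition)")
            Button(showMesh ? "Hide Mesh" : "Show Mesh") {
                showMesh.toggle()    // would switch the mesh debug view on/off
            }
        }
        .padding()
    }
}
```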

Data

Weather data is fetched from OpenWeather and stored in our WeatherModel class, where it can be readily accessed by any function in the program.
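The page doesn't show WeatherModel's actual shape, so the sketch below assumes a simple observable store whose properties mirror the values the UI displays; the names are illustrative.

```swift
import Combine

// Hypothetical shared store for the latest OpenWeather result.
// @Published lets SwiftUI views refresh when the data changes.
final class WeatherModel: ObservableObject {
    @Published var city: String = ""
    @Published var temperatureCelsius: Double = 0
    @Published var condition: String = ""   // e.g. "Clear", "Rain", "Snow"

    // Replace the stored values with freshly fetched data.
    func update(city: String, temperatureCelsius: Double, condition: String) {
        self.city = city
        self.temperatureCelsius = temperatureCelsius
        self.condition = condition
    }
}
```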

Model Loading

Models and their built-in animations, created in Reality Composer, are loaded asynchronously into the scene. Our weather model class keeps track of the condition reported by the OpenWeather data and loads the model for the corresponding weather condition. For example, if the data pulled from OpenWeather says that it is sunny in Santa Barbara, the model class looks up the model with the matching name and asynchronously loads it into the scene.
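That lookup-by-condition flow could be sketched with RealityKit's asynchronous loader as below, assuming the bundled Reality Composer resources are named after conditions (e.g. a "Clear" model for sunny weather); the anchor handling and naming scheme are assumptions.

```swift
import RealityKit
import Combine

// Keep the load request alive until it completes.
var loadCancellable: AnyCancellable?

// Load the model whose resource name matches the weather condition and
// attach it to the scene, playing its Reality Composer animations on loop.
func loadModel(forCondition condition: String, into anchor: AnchorEntity) {
    loadCancellable = Entity.loadAsync(named: condition)
        .sink(receiveCompletion: { completion in
            if case .failure(let error) = completion {
                print("Failed to load model for \(condition): \(error)")
            }
        }, receiveValue: { entity in
            anchor.addChild(entity)
            // Play the animations authored alongside the model.
            for animation in entity.availableAnimations {
                entity.playAnimation(animation.repeat())
            }
        })
}
```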

Physical Interaction with Environment

Using LiDAR together with the APIs provided by RealityKit and ARKit, the app detects and creates real-time meshes of the user's environment, which it uses to render and display the models with correct sizing and lighting. Detection of multiple surfaces and depth allows the models to interact realistically with the environment. Because people are included within this LiDAR mesh, users can interact with raindrops, which bounce off of them.
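In RealityKit terms, this behavior comes from enabling scene-understanding occlusion and physics on the ARView so the reconstructed mesh participates in collisions; the raindrop entity below is a hypothetical illustration of how a falling drop would bounce off that mesh.

```swift
import RealityKit

// Let the LiDAR-reconstructed mesh (surfaces and people) occlude virtual
// content and act as a collision surface for physics bodies.
func enableSceneUnderstanding(for arView: ARView) {
    arView.environment.sceneUnderstanding.options.insert(.occlusion)
    arView.environment.sceneUnderstanding.options.insert(.physics)
}

// A hypothetical raindrop: a small dynamic sphere that falls under
// gravity and collides with the scene-understanding mesh.
func makeRaindrop() -> ModelEntity {
    let drop = ModelEntity(mesh: .generateSphere(radius: 0.005))
    drop.generateCollisionShapes(recursive: true)
    drop.physicsBody = PhysicsBodyComponent(massProperties: .default,
                                            material: .default,
                                            mode: .dynamic)
    return drop
}
```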