
Our Process

From initial design to final implementation, this page documents every step we took in developing weathAR and the design decisions we made along the way.

[Image: phase0.png]

01

Initial Design

For our initial design, we wanted to render the weather onto the user's indoor ceiling and show a circular UI containing the city's name, temperature, and current conditions, along with a scrolling feature that would let the user step through the day's forecasted weather. As the user scrolls through the hours, the changes in weather would be displayed in real time. We wanted the main focus to be on the rendered weather, so we kept the UI very minimal so as not to draw attention away from the weather itself.
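Below is a minimal SwiftUI sketch of what such a circular display could look like; the HourlyForecast model and WeatherDialView are illustrative assumptions, not code from the actual app.

```swift
import SwiftUI

// Hypothetical model for one hour of forecast data; the names are
// illustrative, not taken from the weathAR codebase.
struct HourlyForecast {
    let hour: Int          // 0-23
    let temperature: Int   // degrees
    let conditions: String // e.g. "Rain", "Clear"
}

// A sketch of the circular UI from the initial design: city name,
// temperature, and conditions inside a circle, with a slider
// standing in for the hourly scrolling feature.
struct WeatherDialView: View {
    let city: String
    let forecast: [HourlyForecast]
    @State private var hourIndex = 0.0

    var body: some View {
        let entry = forecast[Int(hourIndex)]
        VStack {
            VStack {
                Text(city).font(.headline)
                Text("\(entry.temperature)°").font(.largeTitle)
                Text(entry.conditions).font(.subheadline)
            }
            .padding(40)
            .background(Circle().fill(.ultraThinMaterial))
            // Scrubbing through hours; the real app would also update
            // the rendered weather models as this value changes.
            Slider(value: $hourIndex,
                   in: 0...Double(forecast.count - 1),
                   step: 1)
                .padding(.horizontal)
        }
    }
}
```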

02

First Approach

After seeing the limitations of requiring the user to have a ceiling on which to display the weather models, we decided to forego rendering on the ceiling and instead display the models a set height above the user. This did require the user to find a planar surface on which to render the weather models, but it gave them the freedom to use the app indoors or outdoors. Because we were attempting to support as many iOS devices as possible, plane detection was constrained by non-LiDAR devices, which have limited capability in detecting planar surfaces. Although we managed to get physical interaction with the real environment working with the help of Reality Composer, it would only detect one planar surface, and all physical interactions had to occur on that surface. This led us to switch to developing the app only for LiDAR-enabled iOS devices, so we could take full advantage of the sensor's capabilities in detecting the environment and performing physical interactions with it.
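A minimal RealityKit sketch of this approach, assuming an ARView is already set up; the weather entity and the 2 m offset are illustrative placeholders:

```swift
import ARKit
import RealityKit

// Sketch of the first approach: detect a horizontal plane and place
// the weather model a fixed height above it.
func placeWeather(in arView: ARView) {
    // Horizontal plane detection works on both LiDAR and non-LiDAR
    // devices, though less reliably on the latter.
    let config = ARWorldTrackingConfiguration()
    config.planeDetection = [.horizontal]
    arView.session.run(config)

    // Anchor to the first detected horizontal plane.
    let anchor = AnchorEntity(plane: .horizontal)

    // Stand-in for the real weather model, offset ~2 m above the
    // detected surface so it floats over the user's space.
    let weather = ModelEntity()
    weather.position = SIMD3<Float>(0, 2.0, 0)
    anchor.addChild(weather)
    arView.scene.addAnchor(anchor)
}
```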

[Image: Screen Shot 2022-03-14 at 2.16.50 AM.png]
[Image: phase2.png]

03

Final Approach

We used starter code from Apple's RealityKit developer page that performed environment detection and created meshes of the environment in real time. This capability gave us the tools to detect multiple surfaces at once, which is required for realistic interaction between the rendered weather models and the user's space. The LiDAR sensor captures the environment's lighting and stores data that RealityKit can then use to generate realistic lighting for the models, making them appear more integrated with the space. Due to time constraints, we did not implement the hourly scroll of the forecasted weather.
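A minimal sketch of the configuration this approach relies on, assuming a LiDAR-equipped device; the option set mirrors what Apple's scene-reconstruction sample demonstrates, and the function name is our own:

```swift
import ARKit
import RealityKit

// Final approach: turn on ARKit scene reconstruction so the
// environment is meshed in real time, and let RealityKit use that
// mesh for occlusion, physics, and lighting.
func configureSceneReconstruction(for arView: ARView) {
    guard ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) else {
        return // non-LiDAR device: scene reconstruction unavailable
    }

    let config = ARWorldTrackingConfiguration()
    config.sceneReconstruction = .mesh
    config.environmentTexturing = .automatic // capture lighting data
    arView.session.run(config)

    // Let rendered models collide with and be occluded by the
    // reconstructed real-world mesh, and receive estimated lighting.
    arView.environment.sceneUnderstanding.options = [
        .occlusion, .physics, .receivesLighting
    ]
}
```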

04

Future Work

For future work, we would want to refine the rain animation so that it better aligns with the rendered clouds. From our original design, we would also want to incorporate the scrolling feature that lets the user view the day's forecasted weather and see the changes reflected in the rendered model. Another feature we would like to fully integrate is gesture tracking, which would allow the user to control the weather with their hands and interact with more weather elements, such as clouds and snow.
