JP, R&D — WEEK 1
Arriving at class with some idea of which thesis to pursue, I was in a familiar place. For the past six months my work has focused on using open-source geographic data to construct a 3D virtual model of the Ohio State University main campus, with ArcGIS serving as the geographic information system that processes and visualizes the data. The data consist of Lidar (3D), Shapefiles (2D), and GeoTIFFs (photo).
Process
- Collect open-source data
- USGS (Lidar)
- OGRIP (Photo)
- Franklin County Auditor (Building)
- City of Columbus (Public Trees)
- OSU GIS Portal (Campus Info)
- Use ArcGIS Pro 3D Basemap Solution
- Process Lidar
- Create DEMs (see the sketch after this list)
- Generate Buildings
- Populate Trees
- Share Web Map
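As a concrete reference for the Lidar-processing step, here is a minimal sketch of the LAS-to-DEM conversion using arcpy, ArcGIS Pro's Python API. The workspace, file names, and cell size are placeholder assumptions, not the exact settings from my build.

```python
# Sketch of the Lidar-to-DEM step using arcpy (ArcGIS Pro's Python API).
# Workspace, paths, and cell size are placeholder assumptions.
import arcpy

arcpy.env.workspace = r"C:\data\osu_campus"  # hypothetical project folder

# 1. Gather the downloaded USGS .las tiles into a LAS dataset
arcpy.management.CreateLasDataset("las_tiles", "campus.lasd")

# 2. Keep only ground returns (class code 2) for a bare-earth surface
arcpy.management.MakeLasDatasetLayer("campus.lasd", "ground_lyr", class_code=[2])

# 3. Rasterize the ground returns into a DEM (2 ft cells, assumed)
arcpy.conversion.LasDatasetToRaster(
    "ground_lyr",
    "campus_dem.tif",
    value_field="ELEVATION",
    interpolation_type="BINNING AVERAGE LINEAR",
    sampling_type="CELLSIZE",
    sampling_value=2,
)
```

The same DEM output feeds the building and tree steps of the 3D Basemap Solution.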
Next Steps
I intend to start building in Unreal Engine 5 using the open-source data. I plan to use the data gathered for the ArcGIS build to drive a procedural 3D environment in UE5. Next week my plan is to explore tools that will help me translate the data into UE5. I am most interested in Gaea, Houdini, and QGIS.
ArcGIS Interface with 3D Basemap

3D Basemap of The Ohio State University Oval — Columbus, OH

Visualizing the contrast between open data for the OSU campus and the City of Columbus.

JP, R&D — Paper Prototype
The core deliverable for my final project will be a qualitative analysis tool, workflow, and research product that uses a Data-driven Environment within UE5. This deliverable will include an open data-driven environmental model and video recordings of individuals’ narration of the space. Text, audio, and video will be geo-located to the areas within the virtual space. These virtual spaces can be based on real-world places or synthetic environments.
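To make the geo-location requirement concrete, the sketch below shows one possible record structure for a geo-located annotation. The class and field names are my own illustrative assumptions, not a finalized schema.

```python
# Hypothetical record structure for a geo-located annotation in the
# geonarrative format; class and field names are illustrative assumptions.
from dataclasses import dataclass, field
from typing import List, Optional, Tuple

@dataclass
class GeoAnnotation:
    participant_id: str
    media_type: str                        # "text", "audio", or "video"
    media_path: str                        # file or URL of the recording/text
    position: Tuple[float, float, float]   # x, y, z inside the UE5 environment
    geo_coords: Optional[Tuple[float, float]] = None  # lat/lon when the space mirrors a real place
    transcript: str = ""                   # narration text for qualitative coding
    tags: List[str] = field(default_factory=list)
```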
My paper prototype is an interface for the Qualitative Research of Virtual Environments using this Geonarrative format.

JP, R&D — WEEK 2
Process
- Define Area of Interest: University District
- Identify the existing city planning commission area boundary for the Ohio State University
- Site Selection and Visit: Iuka Ravine
- Choose site (Iuka Ravine) to begin terrain mapping
- Visit the site and document the space in person
- Model Terrain to Test Tools
- Use the Digital Elevation Models (DEMs) from previous ArcGIS build
- Import into Gaea to test procedural terrain modeling for UE5
- Import into Houdini to test procedural terrain modeling for UE5 (see the sketch after this list)
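For the Houdini test, here is a rough sketch using Houdini's hou Python module. The file path is a placeholder, and the SOP and parameter names ("heightfield_file", "filename") are my reading of the HeightField File node; verify them in your build.

```python
# Sketch of loading an ArcGIS-exported DEM into Houdini as a heightfield.
# Node/parameter names are assumptions based on the HeightField File SOP.
import hou

geo = hou.node("/obj").createNode("geo", "iuka_terrain")
hf = geo.createNode("heightfield_file", "dem_import")
hf.parm("filename").set("$HIP/data/iuka_ravine_dem.tif")  # DEM exported from ArcGIS
hf.setDisplayFlag(True)
```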
Next Steps
- Use Houdini 19 for procedural terrain and city modeling
- Export terrain and city model to Unreal Engine 5
- Create Landscape Material
- Explore tool creation with Houdini Engine and UE5



JP, R&D — WEEK 3
Process
- Download USGS Digital Terrain Model
- Download Aerial Photo from OGRIP
- Extract Digital Surface Model from USGS Lidar
- Download .SHP files of building and street data from the city
- Import .TIFs and .SHP into Blender using BlenderGIS
- Create base meshes (see the sketch after this list)
- Apply raster image
- Georeference shapes
- Import .TIFs into Gaea to explore terrain tools
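Outside of BlenderGIS, the base-mesh step can be sketched in plain Python. The following uses rasterio and numpy to turn a DEM GeoTIFF into a georeferenced vertex grid, which is essentially what BlenderGIS does on import; file names are placeholders.

```python
# Sketch: read a DEM GeoTIFF and build a georeferenced grid of mesh vertices,
# the same "base mesh" step BlenderGIS performs on import. Paths are placeholders.
import numpy as np
import rasterio

with rasterio.open("usgs_dtm.tif") as src:
    dem = src.read(1)          # elevation band as a 2D array
    transform = src.transform  # pixel -> map coordinates

rows, cols = np.indices(dem.shape)
xs, ys = rasterio.transform.xy(transform, rows.ravel(), cols.ravel())
# One (x, y, z) vertex per DEM cell; faces would connect neighboring cells.
vertices = np.column_stack([xs, ys, dem.ravel()])
print(vertices.shape)  # (n_cells, 3)
```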
Next Steps
- Test ZBrush tools for merging terrain models
- Explore audio visualization in Unreal Engine 4
- Inspiration: MetaSounds
- Inspiration: Niagara Audio
JP, R&D — WEEK 4
Process
- Identify street forms to explore based on literature
- Model street case studies in Rhino
- Export from Rhino to Twinmotion via Plug-in
- Publish via Twinmotion Cloud
- Export to UE4 via Datasmith Plug-in
- Import .obj exported from Rhino into Blender
- Use Blender to export .glb (see the sketch after this list)
- Import .glb into Mozilla Spoke
- Publish via Mozilla Hubs
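The .obj-to-.glb conversion can be scripted with Blender's Python API. This is a minimal sketch assuming Blender 2.9x/3.x operator names; the file paths are placeholders.

```python
# Sketch of the Rhino .obj -> .glb conversion using Blender's Python API
# (operator names per Blender 2.9x/3.x; paths are placeholders). Run with:
#   blender --background --python obj_to_glb.py
import bpy

# Start from an empty scene so only the Rhino export ends up in the .glb
bpy.ops.wm.read_factory_settings(use_empty=True)

bpy.ops.import_scene.obj(filepath="street_study.obj")  # .obj exported from Rhino
bpy.ops.export_scene.gltf(
    filepath="street_study.glb",
    export_format="GLB",  # single binary file for Mozilla Spoke/Hubs
)
```

The resulting .glb can be dropped straight into Mozilla Spoke for publishing to Hubs.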
Next Steps
- Identify the polycount, texture, and lighting limits for importing and constructing scenes in Mozilla Hubs
- Explore the use of photogrammetry to bring real-life street-view models into Hubs
- Explore the use of 360 photospheres in Hubs
- Explore the composition of spatial sound in Hubs
- Test collaboration within Hubs
Model in Rhino

Street Scene Dressed within Twinmotion

Model within a Mozilla Hubs Room

JP, R&D — WEEK 5
I am working to interpret and translate analog co-design methods into an immersive virtual co-design environment. This week I started organizing Dr. Elizabeth B.-N. Sanders’ “Generative Design Methods” according to the “Path of Expression”. Dr. Sanders describes the “Path of Expression” as equipping the participant with the knowledge and feelings needed to design future experiences by
- Grounding in the Present
- Remembering the Past
- Imagining the Future
I have grouped the “Generative Design Methods” based on whether they address the Present, Past, or Future.
I am starting to develop space planning for virtual co-design sessions that begin with individual spaces and then lead to collective spaces. I have learned through speaking with Dr. Sanders that the collective design methods depend on individuals first making things on their own before making things together.
One resource I recall that seems applicable to this line of research is Bret Victor’s “Humane Representation of Thought” presentation:
The Humane Representation of Thought from Bret Victor on Vimeo.
Here is an image of the methods sorted and a copy of Bret Victor’s research poster from “Humane Representation of Thought”.

JP, R&D — WEEK 6
I have been developing the concept for our project 2 assignment. I plan to use immersive soundscape scenarios in a virtual co-creation environment to elicit individuals’ cultural values with regard to soundscapes. The prototype model uses:
- 360 Photos from Google Street View
- Audio captured in the field
- Audio toolkit using YouTube clips
- Music chosen by the participant
- Mozilla Hubs room to host immersive experience
Within Mozilla Hubs there is a set of soundscape scenarios that juxtapose urban and natural images in different combinations, or that use the audio toolkit to mask a noisy soundscape.
This prototype will be user-tested by fellow graduate student designers before use with community members in the University District of Columbus, OH.
The final model of the co-creation scenarios will use open-source photogrammetry instead of 360 photos from Google Street View (a scripted sketch of this pipeline follows the list):
- Choose street scenes
- Photograph with camera rig
- Correct Images via Darktable
- Process mesh in Meshroom
- Retopologize using Instant Meshes
- Optimize mesh and material in Blender
- Publish in Mozilla Hubs
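As a note to myself, the command-line stages of this pipeline could be driven from one script. This is a hypothetical sketch: Meshroom does ship a batch CLI (meshroom_batch), but the exact flags, the Instant Meshes binary name and options, and all paths here are assumptions to verify against each tool's documentation.

```python
# Hypothetical driver for the open-source photogrammetry pipeline. Meshroom
# ships a batch CLI (meshroom_batch); the Instant Meshes binary name and
# flags shown are assumptions to check against its --help. Paths are placeholders.
import subprocess

IMAGES = "shoot/corrected"   # images already color-corrected in Darktable
OUT = "shoot/meshroom_out"

# 1. Photogrammetry: images -> textured mesh
subprocess.run(["meshroom_batch", "--input", IMAGES, "--output", OUT], check=True)

# 2. Retopology: dense scan mesh -> clean quad-dominant mesh
subprocess.run(
    ["InstantMeshes", f"{OUT}/texturedMesh.obj", "-o", "retopo.obj"],
    check=True,
)

# Remaining steps (mesh/material optimization, .glb export) happen in Blender
# before publishing the model to Mozilla Hubs.
```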
JP, R&D — WEEK 7
This week I held two interviews with university experts on photogrammetry and started developing a participatory photogrammetry framework in which many people create a large photogrammetry model together and bring it into an immersive collaborative space. This requires site selection, photo-shoot planning, team coordination, technical processing, and facilitation of collaboration in the virtual space.
For my research I have been testing Mozilla Hubs as the immersive collaborative space and conducted a pilot study with it. In the Mozilla Hubs use test, 12 design students created an avatar, entered Mozilla Hubs, and completed a series of survey questions. The questions led the participants through two experiences in which 360 photos were combined with recorded location-based audio to create soundscape environment scenarios. Participants were asked about their mood and affect, their preference for music, and how they would like to change the environment.


JP, R&D — WEEK 8
This week I took the time to clean the five transcripts from last week's pilot co-design study using the Descript app, clearing away filler words and gaps of silence.
(Example of clean transcript)

Later I went with two other graduate students to Iuka Ravine Park to try out a participatory photogrammetry process. We walked together with one person facing forward and a person on each side taking pictures to the sides: / ^ \
These 1,000+ images will be processed later today, when we have access to the updated software.
JP, R&D — WEEK 11
This week I have been preparing for Friday's playtest at ACCAD, where I will exhibit a soundscape study in VR using Mozilla Hubs. I have been working in another design studio as part of a research team developing a qualitative research protocol. In this course I focus on the technical development of my Design Research & Development MFA thesis, and I have developed the VR experience that accompanies the study.
The experience includes 360 Google Street View photos of the five most and five least forested areas within the University District in Columbus, OH. Audio recordings of each site were taken and will be spatially located within the 360 photos in a Mozilla Hubs VR room. Mood surveys will accompany each 360 photo experience. We plan to measure the effect each scenario has on the participant's mood.
Here is a snapshot from our planning Miro board illustrating the sites where the 360 photos and audio were captured.

JP, R&D — WEEK 12
This week I worked on sketching a storyboard for a short film about mental health, creative expression, and oppression. I also researched how I could produce the film by myself. Virtual Production seems to be a viable method for this one-person approach. To this end I will research and plan inputs for a Virtual Production of my short film: “This is not Art.”

JP, R&D — WEEK 13
This week I spent time creating MetaHumans for the storyline I have developed.

The story takes place in the Ohio State University Harding Psychiatric Hospital, and I was able to request the actual architectural model from the hospital. I brought the architecture into Blender and exported an .OBJ to Twinmotion.

In Twinmotion I quickly applied materials and populated the scene with character models to communicate my storyline.
I am now working in Unreal Engine with dialogue and animating the characters throughout the scene.
This semester will focus on the technical challenges involved in the development of a Geographic Information System (GIS) to author Geonarratives using Digital Humans within 3D Data-driven Virtual Environments.
As work progresses over the course of this semester, I will include updates and visuals to document my progress in this class.
Working Plans
- Model Data-driven Virtual Environment of University District
- Prototype system for Location-based Qualitative Research
- Produce interactive experience based on Design Research & Development


