diff --git a/assets/images/dreamcrafter_progress_teaser.png b/assets/images/dreamcrafter_progress_teaser.png
new file mode 100644
index 0000000..a2313c5
Binary files /dev/null and b/assets/images/dreamcrafter_progress_teaser.png differ
diff --git a/assets/images/nerfstudio_logo.png b/assets/images/nerfstudio_logo.png
new file mode 100644
index 0000000..1418b7f
Binary files /dev/null and b/assets/images/nerfstudio_logo.png differ
diff --git a/assets/pdfs/nerf_vfx_abstract.pdf b/assets/pdfs/nerf_vfx_abstract.pdf
new file mode 100644
index 0000000..a31a517
Binary files /dev/null and b/assets/pdfs/nerf_vfx_abstract.pdf differ
diff --git a/dreamcrafter_progress.md b/dreamcrafter_progress.md
new file mode 100644
index 0000000..e63e9bd
--- /dev/null
+++ b/dreamcrafter_progress.md
@@ -0,0 +1,35 @@
+---
+layout: post
+title: Dreamcrafter (In Progress)
+
+show_tile: false
+---
+
+
+
+## Overview
+We propose Dreamcrafter, a virtual reality 3D content generation and editing system assisted by generative AI. Dreamcrafter harnesses the immersive experience and spatial interactions of VR, coupled with the advanced capabilities of generative AI, to enhance the process of creating and editing 3D environments.
+
+NeRFs and diffusion models offer unparalleled realism and detail in rendering; however, their integration into user-friendly platforms for 3D environment creation is still in its infancy.
+Dreamcrafter aims to bridge this gap with a seamless and intuitive interface for users to design, modify, and generate complex, photorealistic 3D environments. Editing of radiance fields and generative 3D objects is currently limited to text prompts or constrained 2D interfaces. Current research on NeRFs and diffusion models focuses primarily on enhancing image and reconstruction quality, and we aim to address the noticeable lack of exploration in user interfaces designed for editing and controlling these models and novel 3D representations.
+The core of our approach is a VR-based system that allows users to interact with and manipulate 3D objects and environments in real time. Dreamcrafter comprises two subsystems that leverage novel 3D representations and Stable Diffusion. The Stable Diffusion-powered subsystem assigns semantically mapped spatial tags to 3D primitive objects to generate Stable Diffusion previews of scenes. The second subsystem leverages NeRFs and 3D Gaussian Splatting for rendering and editing photorealistic 3D scenes. Dreamcrafter is designed to be simple to use, lowering the barrier to entry for users without extensive experience in 3D modeling, while still providing realistic output results.
+
+
+We developed several prototype systems within a month.
+
+## Prototype Demos
+Here are some demo videos of our prototype systems that demonstrate basic versions of some of the key features and interactions.
+
+## Future plans
+
+
+Here is our in-class assignment short paper.
+
+
+
+
diff --git a/gallery.md b/gallery.md
index c28778a..f7f56c9 100644
--- a/gallery.md
+++ b/gallery.md
@@ -3,7 +3,7 @@ layout: landing
 title: Gallery
 description: 3D renders, 3D models, 3D Captures
 image: assets/images/galleryImage.png
-show_tile: true
+show_tile: false
 ---
diff --git a/nerfstudio_contributions.md b/nerfstudio_contributions.md
new file mode 100644
index 0000000..03a90eb
--- /dev/null
+++ b/nerfstudio_contributions.md
@@ -0,0 +1,49 @@
+---
+layout: post
+title: Nerfstudio Contributions
+
+show_tile: false
+---
+
+
+
+Since Jan 2023 I have made contributions to Nerfstudio, including new features and small improvements to other parts of the system. I am also acknowledged in the Nerfstudio SIGGRAPH paper.
+
+## Nerfstudio Blender VFX Add-on
+
+
+
+I created a Blender add-on that allows NeRFs to be used in visual effects, enabling a pipeline for integrating NeRFs into traditional compositing VFX pipelines using Nerfstudio. The add-on uses Blender, a widely used open-source 3D creation software, to align camera paths and composite NeRF renders with meshes and other NeRFs. It allows for more controlled camera trajectories of photorealistic scenes, compositing meshes and other environmental effects with NeRFs, and compositing multiple NeRFs in a single scene. This approach of generating NeRF-aligned camera paths can be adapted to other 3D toolsets and workflows, enabling a more seamless integration of NeRFs into visual effects and film production. The add-on also supports Nerfstudio Gaussian splatting.
+
+The exported mesh or point cloud representation is imported into Blender, and a render camera path is generated by transforming the Blender virtual camera into the coordinate space of the NeRF scene, yielding aligned camera paths.
+
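+A minimal sketch of that alignment step (assuming Blender's Python API; the object name and the direction of the transform are my reading of the approach, not the add-on's actual code):
+
+```python
+import bpy
+
+camera = bpy.context.scene.camera                # active render camera
+nerf_obj = bpy.data.objects["nerf_point_cloud"]  # imported NeRF mesh/point cloud (hypothetical name)
+
+# Express the camera pose in the NeRF scene's local space, so repositioning,
+# rotating, or scaling the NeRF object in Blender is compensated for in the
+# exported camera path.
+camera_in_nerf_space = nerf_obj.matrix_world.inverted() @ camera.matrix_world
+```
+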

v0.1.16 Released 🎉

New Blender integration for VFX workflows!#NeRF #nerfacto #VFX #Blender3d pic.twitter.com/uU8QO1AWwU

— nerfstudio (@nerfstudioteam) January 27, 2023
+
+I created documentation for it [here](https://docs.nerf.studio/extensions/blender_addon.html) and a tutorial video demonstrating basic examples of using the add-on, as well as a breakdown of other effects that can be achieved with it.
+
+
+
+
+## 🥽 VR Video Rendering
+
+I implemented VR180 and VR360 (omnidirectional stereo) render cameras to support VR video rendering. This allows users to render stereo equirectangular videos to view on VR headsets or post on YouTube. Documentation is [here](https://docs.nerf.studio/quickstart/custom_dataset.html#render-vr-video).
+The Blender add-on is used to create the final render path and to correctly scale the NeRF to real-world size, since the scanned NeRF coordinate system is arbitrary.
+
+
+
+## Additional contributions
+I have also made smaller contributions to Nerfstudio, including to [Viser](https://viser.studio/), the new 3D viewer that Nerfstudio uses, as well as adding Nerfstudio support for [Instruct-GS2GS](https://docs.nerf.studio/nerfology/methods/igs2gs.html).
+
+
diff --git a/nerfstudio_vfx_blender.md b/nerfstudio_vfx_blender.md
new file mode 100644
index 0000000..bd426c7
--- /dev/null
+++ b/nerfstudio_vfx_blender.md
@@ -0,0 +1,45 @@
+---
+layout: post
+title: Nerfstudio VFX Blender Add-on
+
+show_tile: false
+---
+
+
+
+## Nerfstudio Blender VFX Add-on
+
+
+
+## Overview
+
+I created a Blender add-on that allows NeRFs to be used in visual effects, enabling a pipeline for integrating NeRFs into traditional compositing VFX pipelines using Nerfstudio. The add-on uses Blender, a widely used open-source 3D creation software, to align camera paths and composite NeRF renders with meshes and other NeRFs. It allows for more controlled camera trajectories of photorealistic scenes, compositing meshes and other environmental effects with NeRFs, and compositing multiple NeRFs in a single scene. This approach of generating NeRF-aligned camera paths can be adapted to other 3D toolsets and workflows, enabling a more seamless integration of NeRFs into visual effects and film production. The add-on also supports Nerfstudio Gaussian splatting.
+
+The exported mesh or point cloud representation is imported into Blender, and a render camera path is generated by transforming the Blender virtual camera into the coordinate space of the NeRF scene, yielding aligned camera paths.
+
+## Implementation Details
+
+To generate the JSON camera path, we iterate over the scene frame sequence (from start to end at the step interval) and get the camera's 4x4 world matrix at each frame; the world transformation matrix gives the position, rotation, and scale of the camera. We then obtain the world matrix of the NeRF representation at each frame and transform the camera coordinates with it to get the final camera world matrix. This lets us re-position, rotate, and scale the NeRF representation in Blender while still generating the correct camera path to render the NeRF accordingly in Nerfstudio. Additionally, we calculate the camera FOV at each frame from the sensor fit (horizontal or vertical), angle of view, and aspect ratio. Next, we construct the list of keyframes, which closely mirrors the transformed camera world matrices. Camera properties in the JSON file come from user-specified fields such as resolution (set in Output Properties in Blender) and camera type (Perspective or Equirectangular). In the JSON file, aspect is specified as 1.0, smoothness_value is set to 0, and is_cycle is set to false. The Nerfstudio render uses the fps specified in Blender, and the duration is the total number of frames divided by the fps.
+Finally, we construct the full JSON object and write it to the file path specified by the user.
+
+To generate a camera from a JSON file, we create a new Blender camera based on the input file and iterate through the camera_path field in the JSON, reading the world matrix from matrix_to_world and the FOV from the fov field. At each iteration, we set the camera to these parameters and insert a keyframe for the camera's position, rotation, and scale, as well as its focal length, derived from the vertical FOV input.
+
+
+
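+Below is a minimal sketch of the export loop described above (assuming Blender's Python environment; the NeRF object name, output path, and exact JSON field names are illustrative, not the add-on's actual code):
+
+```python
+import json
+import math
+
+import bpy
+
+scene = bpy.context.scene
+camera = scene.camera
+nerf_obj = bpy.data.objects["nerf_representation"]  # hypothetical object name
+
+keyframes = []
+for frame in range(scene.frame_start, scene.frame_end + 1, scene.frame_step):
+    scene.frame_set(frame)
+    # Camera pose relative to the NeRF object: repositioning, rotating, or
+    # scaling the NeRF in Blender still yields a correct path for Nerfstudio.
+    matrix = nerf_obj.matrix_world.inverted() @ camera.matrix_world
+    # Vertical FOV from sensor fit, angle of view, and aspect ratio.
+    if camera.data.sensor_fit == "VERTICAL":
+        fov = math.degrees(camera.data.angle_y)
+    else:
+        aspect = scene.render.resolution_y / scene.render.resolution_x
+        fov = math.degrees(2 * math.atan(math.tan(camera.data.angle_x / 2) * aspect))
+    keyframes.append({
+        "matrix_to_world": [v for row in matrix for v in row],  # flattened 4x4
+        "fov": fov,
+        "aspect": 1.0,
+    })
+
+fps = scene.render.fps
+camera_path = {
+    "render_height": scene.render.resolution_y,
+    "render_width": scene.render.resolution_x,
+    "camera_type": "perspective",      # or "equirectangular"
+    "fps": fps,
+    "seconds": len(keyframes) / fps,   # duration = total frames / fps
+    "smoothness_value": 0,
+    "is_cycle": False,
+    "camera_path": keyframes,
+}
+
+with open("/tmp/camera_path.json", "w") as f:  # user-specified path in the add-on
+    json.dump(camera_path, f, indent=4)
+```
+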

v0.1.16 Released 🎉

New Blender integration for VFX workflows!#NeRF #nerfacto #VFX #Blender3d pic.twitter.com/uU8QO1AWwU

— nerfstudio (@nerfstudioteam) January 27, 2023
+
+I created documentation for it [here](https://docs.nerf.studio/extensions/blender_addon.html) and a tutorial video demonstrating basic examples of using the add-on, as well as a breakdown of other effects that can be achieved with it.
+
+
+I also have a blog-post-style walkthrough of making it.
+
+
+
+
+
diff --git a/projects.md b/projects.md
index 64c8dcc..bfed1cd 100644
--- a/projects.md
+++ b/projects.md
@@ -35,9 +35,28 @@ show-title: true

Experimentation with NeRFs/Neural Rendering

-

Various experiments with NeRFS, Neural Rendering, Relighting, Virtual Production, AI, and more

+

Various experiments with NeRFs, Neural Rendering, Relighting, Virtual Production, and more + Will be updated with 2023 projects soon.

+ + + +
+ + + +
+
+
+

Gallery

+
+

3D renders, 3D models, 3D Captures + Will be updated soon with NeRFs captured over the last year.

+
diff --git a/research.md b/research.md index a5ecff6..85783cf 100644 --- a/research.md +++ b/research.md @@ -23,7 +23,7 @@ menu-show: true
-
+
@@ -35,9 +35,9 @@ menu-show: true

We propose a method for editing 3D Gaussian Splatting (3DGS) scenes with text instructions, in a manner similar to Instruct-NeRF2NeRF. Given a 3DGS of a scene and the collection of images used to reconstruct it, our method uses an image-conditioned diffusion model (InstructPix2Pix) to iteratively edit the input images while optimizing the underlying scene, resulting in an optimized 3D scene that respects the edit instruction. We demonstrate that our proposed method is able to edit large-scale, real-world scenes, and is able to accomplish more realistic, targeted edits than prior work.
- Paper comming soon + - Paper coming soon
- Nerfstudio integration supported + - Nerfstudio integration supported

  • Project Page
  • @@ -58,14 +58,14 @@ menu-show: true

    Nerfstudio Blender VFX Add-on

    -

    We present a pipeline for integrating NeRFs into traditional compositing VFX pipelines using Nerfstudio, an open-source framework for training and rendering NeRFs. Our approach involves using Blender, a widely used open-source 3D creation software, to align camera paths and composite NeRF renders with meshes and other NeRFs, allowing for seamless integration of NeRFs into traditional VFX pipelines. Our NeRF Blender add-on allows for more controlled camera trajectories of photorealistic scenes, compositing meshes and other environmental effects with NeRFs, and compositing multiple NeRFs in a single scene.This approach of generating NeRF aligned camera paths can be adapted to other 3D tool sets and workflows, enabling a more seamless integration of NeRFs into visual effects and film production. +

    We present a pipeline for integrating NeRFs into traditional compositing VFX pipelines using Nerfstudio, an open-source framework for training and rendering NeRFs. Our approach involves using Blender, a widely used open-source 3D creation software, to align camera paths and composite NeRF renders with meshes and other NeRFs, allowing for seamless integration of NeRFs into traditional VFX pipelines. Our NeRF Blender add-on allows for more controlled camera trajectories of photorealistic scenes, compositing meshes and other environmental effects with NeRFs, and compositing multiple NeRFs in a single scene. This approach of generating NeRF-aligned camera paths can be adapted to other 3D toolsets and workflows, enabling a more seamless integration of NeRFs into visual effects and film production.
    - Shown in CVPR 2023 Art Gallery + - Shown in CVPR 2023 Art Gallery
    - Arxiv submission in progress

    + - arXiv submission in progress

@@ -120,8 +120,41 @@ menu-show: true

Future/Current Research Projects

I am also working on a project on capturing, creating, and viewing synthetic NeRF environments in VR for my master's thesis.

-

Since Jan 2023, I have been contributing features to the Nerfstudio system including the Blender VFX add-on and VR180/Omnidirectional (VR 360) video/image render outputs.

- +

+
+
+ + + +
+
+
+

Dreamcrafter (In Progress)

+
+

In my current 5th-year master's program (a 1-year graduate program after a 4-year undergraduate degree), I am attempting to build the initial concept of the VR environment creation system I proposed in 2022. For my VR/AR class, I worked with a team to implement two prototype systems that leverage NeRFs, 3DGS, and Stable Diffusion to create a VR interface for photorealistic 3D content creation. This includes a system to edit existing NeRF/GS scenes through voice, hand controls, and existing diffusion models (such as InstructPix2Pix). We also have a system leveraging ControlNet to create 2D mockups of scenes based on 3D primitive objects. I am currently developing the complete system with intelligent natural language region selection and additional features. We are working towards a research publication for 2024.

+ +
+
+
+
+ + + +
+
+
+

Nerfstudio Contributions

+
+

Since Jan 2023, I have been contributing features to the Nerfstudio system, including the Blender VFX add-on and VR180/omnidirectional (VR360) video/image render outputs.

+ +
+
+
+