Gallery
3D renders, 3D models, and 3D captures. Will be updated soon with NeRFs taken over the last year.
---
layout: post
title: Dreamcrafter (In Progress)
show_tile: false
---
## Overview
We propose Dreamcrafter, a Virtual Reality 3D content generation and editing system assisted by generative AI. Our system harnesses the immersive experience and spatial interactions of VR, coupled with the capabilities of generative AI, to enhance the process of 3D environment creation and editing.

NeRFs and diffusion models offer unparalleled realism and detail in rendering; however, their integration into user-friendly platforms for 3D environment creation is still in its infancy. Editing of radiance fields and generative 3D objects is currently limited to text prompts or constrained 2D interfaces, and current research on NeRFs and diffusion models focuses primarily on enhancing image and reconstruction quality. We aim to address the noticeable lack of exploration in user interfaces designed for editing and controllability of these models and novel 3D representations. Dreamcrafter bridges these gaps with a seamless and intuitive interface for users to design, modify, and generate complex, photorealistic 3D environments.

The core of our approach is a VR-based system that allows users to interact with and manipulate 3D objects and environments in real time. Dreamcrafter involves two subsystems which leverage novel 3D representations and Stable Diffusion. The first subsystem assigns semantically mapped spatial tags to 3D primitive objects to generate Stable Diffusion previews of scenes. The second leverages NeRFs and 3D Gaussian Splatting for rendering and editing of photorealistic 3D scenes. Dreamcrafter is designed to be simple to use, lowering the barrier to entry for users without extensive experience in 3D modeling, while still providing realistic output results.
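To make the spatial-tag idea concrete, a toy sketch might map tagged 3D primitives to a text prompt for the diffusion preview. The `Primitive` class and the left/right relation rule below are hypothetical illustrations, not Dreamcrafter's actual mapping:

```python
from dataclasses import dataclass

@dataclass
class Primitive:
    tag: str         # semantic label, e.g. "oak table" (illustrative)
    position: tuple  # (x, y, z) in scene coordinates

def spatial_relation(a: Primitive, b: Primitive) -> str:
    # Toy rule: compare x-coordinates to phrase a left/right relation.
    dx = b.position[0] - a.position[0]
    side = "right" if dx > 0 else "left"
    return f"{b.tag} to the {side} of {a.tag}"

def scene_prompt(primitives: list) -> str:
    # Join all semantic tags, then phrase pairwise relations between neighbors.
    tags = ", ".join(p.tag for p in primitives)
    relations = "; ".join(
        spatial_relation(primitives[i], primitives[i + 1])
        for i in range(len(primitives) - 1)
    )
    return f"photorealistic scene with {tags}; {relations}"
```

A prompt assembled this way can then be fed to a Stable Diffusion/ControlNet preview pass; the real system would also condition on the rendered primitive geometry rather than text alone.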
Our current system is in progress; we developed a few prototype systems within a month.

## Prototype Demos
Here are some demo videos of our prototype systems that demonstrate a basic version of some of the key features and interactions.

## Future plans

Here is our in-class assignment short paper.

---
layout: post
title: Nerfstudio Contributions
show_tile: false
---

Since Jan 2023 I have made contributions to the Nerfstudio API system, including new features and small improvements to other parts. I am also acknowledged in the Nerfstudio SIGGRAPH paper.

## Nerfstudio Blender VFX Add-on

I created a Blender add-on that allows NeRFs to be used in visual effects, enabling a pipeline for integrating NeRFs into traditional compositing VFX workflows with Nerfstudio. The add-on uses Blender, a widely used open-source 3D creation tool, to align camera paths and composite NeRF renders with meshes and other NeRFs. It allows for more controlled camera trajectories of photorealistic scenes, compositing meshes and other environmental effects with NeRFs, and compositing multiple NeRFs in a single scene. This approach of generating NeRF-aligned camera paths can be adapted to other 3D tool sets and workflows, enabling a more seamless integration of NeRFs into visual effects and film production. It also supports Nerfstudio Gaussian splatting.
The exported mesh or point cloud representation is imported into Blender, and a render camera path is generated by transforming the coordinate space of the NeRF scene to the Blender virtual camera, allowing aligned camera paths.

I created documentation for it [here](https://docs.nerf.studio/extensions/blender_addon.html) and a tutorial video demonstrating basic examples of using the add-on, as well as a breakdown of other effects that can be done with it.

v0.1.16 Released 🎉
— nerfstudio (@nerfstudioteam) January 27, 2023
New Blender integration for VFX workflows! #NeRF #nerfacto #VFX #Blender3d pic.twitter.com/uU8QO1AWwU
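The coordinate-space alignment described above boils down to a change of frame: given the world transform of the imported NeRF mesh/point cloud and the camera-to-world matrix of the Blender render camera, the camera pose in the NeRF's local space is the object transform's inverse times the camera transform. A minimal numpy sketch, with an assumed function name and 4x4 homogeneous-matrix conventions (not the add-on's actual code):

```python
import numpy as np

def camera_in_nerf_space(T_world_obj: np.ndarray,
                         T_world_cam: np.ndarray) -> np.ndarray:
    """Express the Blender render camera in the NeRF scene's local frame.

    T_world_obj: 4x4 world transform of the imported NeRF mesh/point cloud.
    T_world_cam: 4x4 camera-to-world transform of the Blender camera.
    Returns the 4x4 camera pose in NeRF scene coordinates.
    """
    return np.linalg.inv(T_world_obj) @ T_world_cam
```

Evaluating this per animation frame yields a NeRF-aligned camera path that can be exported for rendering in Nerfstudio, regardless of how the representation object is moved, rotated, or scaled in the Blender scene.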
I also have a blog-post-style walkthrough of making it.
Various experiments with NeRFs, Neural Rendering, Relighting, Virtual Production, and more. Will be updated with 2023 projects soon.
We propose a method for editing 3D Gaussian Splatting (3DGS) scenes with text instructions, in a manner similar to Instruct-NeRF2NeRF. Given a 3DGS reconstruction of a scene and the collection of images used to reconstruct it, our method uses an image-conditioned diffusion model (InstructPix2Pix) to iteratively edit the input images while optimizing the underlying scene, resulting in an optimized 3D scene that respects the edit instruction. We demonstrate that our proposed method is able to edit large-scale, real-world scenes, accomplishing more realistic, targeted edits than prior work.
- Paper coming soon
- Nerfstudio integration supported
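The iterative scheme in the abstract above can be sketched as a small loop that periodically re-edits every training image toward the instruction while scene optimization continues in between. Here `render`, `edit`, and `optimize_step` are injected stand-ins with hypothetical signatures, not the actual implementation:

```python
class View:
    """One training view: a camera and its current (possibly edited) image."""
    def __init__(self, camera, image):
        self.camera = camera
        self.image = image

def iterative_dataset_update(scene, dataset, instruction,
                             render, edit, optimize_step,
                             num_iters=10, edit_every=5):
    for it in range(num_iters):
        if it % edit_every == 0:
            # Periodically replace each training image with an
            # InstructPix2Pix-style edit conditioned on the current render,
            # so edits stay consistent with the evolving 3D scene.
            for view in dataset:
                current = render(scene, view.camera)
                view.image = edit(current, view.image, instruction)
        optimize_step(scene, dataset)  # ordinary 3DGS photometric loss step
    return scene
```

Because edits and optimization alternate, inconsistent per-image edits get averaged out by the 3D representation, which is the key property the method inherits from Instruct-NeRF2NeRF.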
We present a pipeline for integrating NeRFs into traditional compositing VFX pipelines using Nerfstudio, an open-source framework for training and rendering NeRFs. Our approach involves using Blender, a widely used open-source 3D creation software, to align camera paths and composite NeRF renders with meshes and other NeRFs, allowing for seamless integration of NeRFs into traditional VFX pipelines. Our NeRF Blender add-on allows for more controlled camera trajectories of photorealistic scenes, compositing meshes and other environmental effects with NeRFs, and compositing multiple NeRFs in a single scene. This approach of generating NeRF-aligned camera paths can be adapted to other 3D tool sets and workflows, enabling a more seamless integration of NeRFs into visual effects and film production.
- Shown in CVPR 2023 Art Gallery
- arXiv submission in progress
I am also working on a project on capturing, creating, and viewing synthetic NeRF environments in VR for my master's thesis.
In my current 5th-year masters program (a 1-year graduate program after a 4-year undergraduate degree), I am attempting to build the initial concept of the VR environment creation system I proposed in 2022. For my VR/AR class, I worked with a team to implement two prototype systems which leverage NeRFs, 3DGS, and Stable Diffusion to create a VR interface for photorealistic 3D content creation. This includes a system to edit existing NeRF/GS scenes through voice, hand controls, and existing diffusion models (such as Instruct-Pix2Pix), as well as a system leveraging ControlNet to create 2D mockups of scenes based on 3D primitive objects. I am currently developing the complete system with intelligent natural language region selection and additional features. We are working towards a research publication for 2024.

Since Jan 2023, I have been contributing features to the Nerfstudio system including the Blender VFX add-on and VR180/Omnidirectional (VR 360) video/image render outputs.