All Things 3D
3D Tech Closet MO CAP YO VIVE
updated
Come back to this video tomorrow for a link to get 20% off any of the ‘Gaming Lens’ products when you preorder before December 1st.*
*Product availability will vary based on manufacturing, packaging and delivery of inventory between the end of December and January 2024.
Meet the Alldocube iPlay 50, shown here attached to a Sony DualShock 4 using the ‘Gaming Lens’ mount and running a number of game streaming services (Microsoft Xbox Game Pass, Nvidia GeForce Now, Valve’s Steam Link, and the open-source streaming app Moonlight), all via WiFi or 4G (YES, Google Fi works great!). More importantly, with its 8GB of RAM and fast storage, it also plays every AAA Android game I threw at it, like ‘Alien: Isolation’, Doom 3, Half-Life 2 (and Portal 1 & 2) and even Epic’s Fortnite. This is also due to it being Google Play certified, which sadly many low-cost Android tablets are not.
Why is this exciting to me? I have been looking for an 8”-9” tablet to use with the ‘Gaming Lens’ mount, but other than 1280x800 8” tablets, the only other option has been the 6th generation iPad mini, which will set you back $500 for just 64GB of storage, with a less-than-ideal 4:3 screen for gaming. To be fair, the iPad mini did great with everything I threw at it, including PlayStation Remote Play, but it is still $500, you can’t run Game Pass or GeForce Now through an app (only through Safari), and there is definitely NO Fortnite. So again, what a surprise to find a low-cost Android tablet that does about everything the iPad mini does, with a more open Android OS, and did I mention 4G? If that isn’t enough, how about 18W fast charging via a #USBC port (only USB 2 speed, though) and the ability to plug in a DualShock 4, DualSense or an @XBOX controller directly for the fastest, lowest-latency connection.
However, none of these tablets would even be in the game as an excellent Android gaming platform if it weren’t for the ‘Gaming Lens’ mount, which is very close to production, hopefully in time for the holiday season. Preordering will be available soon.
Oh, hang in there for the end of the video :) BOO!
Happy Halloween!
You can see more of his images here:
sites.google.com/site/thomasappere
twitter.com/thomas_appere
flickr.com/photos/thomasappere
instagram.com/thomas_appere
planetary.org/profiles/thomas-apprere
Papers:
nature.com/articles/s41561-018-0233-2.epdf?author_access_token=0rSXvzl1uqVFkfXd6gLGm9RgN0jAjWel9jnR3ZoTv0MFzzEAVt3mAX-a87uOFAse8-NQltCF6-J8ciVCfQSmcHX9lAsJcopiZ6DpMNAUhXCjK9QoINRIbzbmAX3oLKAjTiqnbpXXXotOPIdXW3djTA%3D%3D
drive.google.com/file/d/0B8YujxCPJC0PNjY0YmM5ODYtNGY1NC00YzdkLWJkMjAtNDIwZjk3YjVkNjIz/view?resourcekey=0-pNFWziqJN1kSlgtGrSrCkg
sites.google.com/site/thomasappere/publications?authuser=0
Links to Associates:
3D models from Gwenael Caravaca at LPGN - sketchfab.com/LPG-3D
vr2planets.com
The RTX 4060 has gotten a lot of negative press for its ho-hum specs. Which is true if you are comparing it to 500W behemoths like the #RTX4090 or even a #RTX4070. But compare it to the RTX A2000 or even the RX 6400: for the same price (former) or twice the price (latter), you get 2X the performance (former) or 3X the performance (latter) for only 25W-40W more power consumption. This has allowed me to build a fairly low-cost ($700) solution based on an @Intel 11th gen #i1140T and an @msiUSA #miniITX Pro board, 16GB of 3200MHz RAM and a 1TB M.2 2280 Gen3 SSD, which showed noticeable performance and visual gains on my 8.9” 2.5K (shown here at 1920x1200, 90fps) #GamingLens .
Even though I was able to house all of this in a 3.8L case with a max power draw of 200W for the entire system, including the Gaming Lens, it still generated a great deal of heat. Using the case’s existing side panels, which have too little perforation, the GPU ran over 85°C and the CPU over 70°C in a @ULSolutions #3DMark stress test, which it did not pass. I constructed new side panels that frame #PVC mesh, bringing the GPU down to 75°C and the CPU down to 61°C and allowing it to pass at 98.7%. Even better, I went from 9380 points to 9600 points in #TimeSpy, which is 6000 more points than the exact same system with an RX 6400. Spend a little more for a 13th gen ‘T’ processor with DDR5 and you will add about 500-1000 to this number.
As for the 8GB #VRAM limitation: at 1200p, even in #TheLastofUs with #DLSS , I still had 800MB unallocated on ‘High’ settings. #Cyberpunk2077 ran on max settings with ray tracing enabled, #APlagueTaleRequiem on ‘Max’ settings and #DeadspaceRemake on ‘Ultra’.
Not bad for an extra $150 over the most used GPU on #Steam , the GTX 1650. Would a 12GB or 16GB version have been better? Probably, but for a smaller 1080p or 1440p screen, I feel I got my money’s worth.
If Vikas sounds familiar, he was the cofounder of Occipital with Jeff Powers, whom I have had on the show several times over the past ten years talking about the progress of one of the first portable 3D scanners, as well as one of the very first mixed reality headsets, 'Bridge', which used a technique called "unbounded tracking" that we now know as the "inside-out tracking" everyone is using today.
If you want to find out more about LightTwist head to lighttwist.com
Speaking of ten years, this is the tenth anniversary year of 'All Things 3D' and even though I (Mike) have been on a hiatus from the show, I am back with more news and more "3D Tech Closet" projects and tutorials, with the first '3D in Review' episode hitting YouTube later today (1/29/2023) and more to come every Friday, including an audio-only podcast you can soon find at https://allthings3D.net or wherever you listen to podcasts.
Glad to be back, and if you're not a subscriber, please subscribe. Sadly, I don't have much control over Google, but my content is always free, I request no advertisements, and anything I review or work on is based solely on my own knowledge, research and opinion, with no payment or free products from the companies I talk about, unless specifically stated.
----------------------------------------------------
Quick links:
2:07 Intro
3:20 The Spark that created 'LightTwist'
6:13 How does it look?
10:00 A look back at Occipital and how its experience shaped Vikas's future.
18:32 How about we jump into the demo of 'LightTwist'?
------------------------------------------------
Credits
"Mandalorian" is a Disney+ & Lucas Films Production
"Unreal VIrtual Studio" is a product of Epic Games
To celebrate the one-year anniversary of Perseverance and Ingenuity coming back together today (not quite this close though), I created this little VR180 with AmbiX audio to showcase the latest features in Unreal 5.0, based on one of the more recent builds of 'UE5 Main'. It takes advantage of Lumen GI & reflectivity as well as some features from 4.26+ like Render Queue and 2D Capture. The latter uses a novel method of over/under stereo capture using a Cinema Camera instead of the older technique of using an orthographic camera, which not only allows for more control over the visual appearance, but also allows Render Queue to be used, providing faster rendering and, more importantly, the ability to improve anti-aliasing (at the expense of performance) for better image quality. What is important in this method is that I was able to use a standard 2D plane instead of a stereo omni capture to capture 110 degrees of 'panini'-correct FOV, which was later converted to equirectangular in Adobe After Effects. This is notable since one can basically use any camera to capture an FOV of 100-130 degrees, which is about the limit of most VR HMDs, to provide a very immersive experience without having to resort to slower, less effective methods to derive perspective-correct stereo "3D" output. Now the goal is to provide stereo-paired output without having to resort to a 3D capture process, which sadly is not much better than cube capture, by capturing the output from a VR camera instead.
Sadly, the Motion Blur effect seems to suffer, as can be seen in the almost complete lack of blur on the props, yet at ground level the motion appears more realistic. Something I will have to spend more time with in the future.
The Mars environment is based on images from MRO HiRISE, Mastcam-Z and Ingenuity's navigation cameras, used to create an HDRI environment, two levels of terrain mesh, a landscape mesh, and the actual landing-site terrain based on photogrammetry. This not only provides the most detail at inches from the surface, but also provides a smooth transition from the surface to 10 meters (and higher) for Ingenuity's first liftoff, viewed from either the FPV eyes of Ingenuity's RTE camera or the Mastcam-Z stereo cameras, including a zoom feature. More to come in the next month at:
https://ingenuity-vr.space
How did I do it? First, you can't just put the cameras next to each other, even though they are pocket sized. You can, however, invert one of the cameras so their shortest sides sit next to each other, which creates another challenge: how to keep them in place. I was able to do this with a 'Gearbox' universal camera cage and some extension tubes. From there I had to drill a few new holes so I could align and secure the inverted cameras through their tripod mounting holes. I also had to use rubber shims to bring the sensors parallel with each other, and double-sided tacky tape along the contacting edges to keep them in place. Sadly, even with the short sides against each other, I still had to contend with an 85mm versus 65mm distance between the lens centers. This will cause the eyes to strain at objects close to the camera, but objects further from the camera will have a more effective 3D effect. I can manipulate this in software to bring them closer together, but that has other consequences, so I am left with 75mm apart. Knowing now that I am only 10 degrees off of a full 180 vertical, I can probably just butt their bottoms up against each other with a stabilizing bar between them and a custom threaded bolt, with enough thread to allow for a silicone washer/bushing, plus tacky tape along the entire rail to give even more grip to the bottom plate. The plate will have to extend past both cameras to allow another T-plate to be attached for a tripod mount.
I also use dummy-battery systems to power each of the cameras from a large 5 VDC lithium-ion battery, and USB cables to export files, since once mounted it is a pain to get at the batteries. Also, thank Panasonic for creating custom memory settings and the ability to trigger the shutter from a phone or the touch screen; not having those features would make adjusting the upside-down camera painful.
In any case, I have found viewing the footage in my Quest 2 before upload pleasant, and well within its 17.2 pixels per degree.
How did I create the VR180?
First, I took the two separate video files and used FFmpeg to convert them from spherical/fisheye to equirectangular based on the FOVs mentioned above. I used HEVC NVENC to render the video, with 'roll:180' added to the left camera to invert it vertically to match the other camera.
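As a sketch of what that per-eye conversion can look like with FFmpeg's v360 filter: the 220-degree lens FOV, file names, and NVENC encoder below are assumptions for illustration, and the script only builds and prints the command so you can inspect it before running it yourself.

```shell
# Hypothetical per-eye conversion: fisheye -> equirectangular with v360.
# Assumptions: a 220-degree lens, placeholder file names, NVENC available.
LENS_FOV=220
# roll=180 flips the inverted left camera to match the right one.
FILTER="v360=input=fisheye:ih_fov=${LENS_FOV}:iv_fov=${LENS_FOV}:output=equirect:roll=180"
CMD="ffmpeg -y -i left_fisheye.mp4 -vf ${FILTER} -c:v hevc_nvenc left_equirect.mp4"
echo "${CMD}"
```

Drop the roll=180 term for the camera that is right-side up, and swap hevc_nvenc for libx265 if you have no NVIDIA GPU.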
I then imported these two equirectangular videos into Adobe Premiere and did a tiny bit of grading to bring the white/black levels down/up slightly, though it really wasn't needed. Then I added the VR Projection filters (again, I'm not sure I needed them, since I had already adjusted the cameras to match via the horizontal/vertical adjustments), scaled the video out horizontally to clip the 220-180 degree portion on both sides, then rendered it out again using HEVC NVENC, but at a slightly smaller scale. Since Premiere adds the proper metadata tag for 180 over/under, I just test in my Quest 2 using the Skybox VR player before uploading it here. For my next set of videos I will post a link to download the versions I created before uploading, and upload them to my Oculus Creator page as well.
The final goal is to create a script for FFmpeg to do the full encoding, skipping Premiere entirely. I have started to do this for my Unreal VR360 & VR180 videos as well to improve the workflow.
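A minimal sketch of what such a Premiere-free pipeline could look like: each eye goes through v360, the two are stacked over/under with vstack, and everything is encoded in one pass. The 220-degree input FOV, the hequirect (half-equirectangular, i.e. 180-degree) output, and the file names are all assumptions; as above, the command is only printed, not executed.

```shell
# Hypothetical one-shot VR180 pipeline: convert both eyes, stack over/under.
# Assumptions: 220-degree lenses, placeholder file names, NVENC available.
EYE="v360=input=fisheye:ih_fov=220:iv_fov=220:output=hequirect"
CMD="ffmpeg -y -i top_eye.mp4 -i bottom_eye.mp4 \
-filter_complex '[0:v]${EYE}:roll=180[t];[1:v]${EYE}[b];[t][b]vstack[ou]' \
-map '[ou]' -c:v hevc_nvenc vr180_overunder.mp4"
echo "${CMD}"
```

You would still need to inject the VR180 over/under metadata afterwards (as Premiere does automatically) before YouTube will recognize it.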
In doing so, many of the materials had to be modified, and a number of lights were either turned off or attenuated to let Lumen's excellent GI do its magic. Notably, I had to tweak the eagle's rotation on some frames to prevent a light-saturation effect on the material; if you look closely as the eagle is coming out of the sun, there is a frame I left in to show what happens. This was done on a GitHub UE5 build from mid-December, so maybe this will be fine-tuned, but I think it has more to do with my not tweaking every material. It should also be noted that, as with standard reflection capture spheres and screen-space reflection modes, character meshes don't appear. If you look closely at the eyes of the meerkat in the original project as the eagle is almost upon it, there is no reflection. Of course this would only be for a few frames, but I nevertheless decided to create a static mesh of the character mesh at that moment and add it to the sequence for those frames to capture the reflection and appropriate lighting. To add even more realism, I moved the static model toward the meerkat for even greater effect. Of course, you will never notice unless you do a frame-by-frame analysis, but it was worth doing. I could have just enabled RTX reflections, which do allow character meshes and would have provided even better reflections, but this was a test with "software" Lumen only, on a standard Intel i9 with 128 GB of memory.
I also took that time to add spatial audio by attaching the audio track to the meerkat and using a modified 'Sound Attenuation' to spread the 2-channel original track out to four channels of a 5.1 layout, adding more depth. It would have been more convincing if they had laid down each foley or creature audio track separately, but it appears the audio was mixed outside of Unreal Engine. Sadly, the Unreal audio developers seem to think this is the best way to work, spending more time turning Unreal Engine 5 into a synthesizer instead of working with the Sequencer developers to allow full mixdowns to a full spatial track series using the Microsoft Spatial Audio API. I have found I can do this, but only by creating two passes with the camera turned 90 degrees on its side to capture the up/down channels, then using Premiere to lay out a conversion to AmbiX. If you are looking for stereo, 5.1 or 7.1 audio, you can export it as a multichannel WAV file, but I found I needed to export it, import it into Audacity, and export it again for it to be muxed properly with the ProRes LT video file into a 6-channel AAC file in FFmpeg and uploaded to YouTube. The original MOV was 3.7 GB for less than two minutes of animation. Not the most efficient, but I have found it provides the best transfer to YouTube. It should be noted that stream copy ('-c copy') is the best way to use FFmpeg, since it will do a direct transfer without transcoding the output. If you need to transcode, I highly suggest H.265 with NVIDIA GPU encoding. Super fast, and the results are very good.
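A minimal sketch of that mux step, with the video stream copied untouched and only the WAV transcoded to 6-channel AAC; the file names are placeholders, and the command is built as a string so you can inspect it before running.

```shell
# Hypothetical mux: ProRes LT video + multichannel WAV -> 6-channel AAC MOV.
# '-c:v copy' stream-copies the video so nothing is re-encoded.
CMD="ffmpeg -y -i render.mov -i mix_5_1.wav \
-map 0:v:0 -map 1:a:0 -c:v copy -c:a aac -ac 6 upload.mov"
echo "${CMD}"
```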
One more note: even though this could have been rendered in real time on my RTX 3080 with Lumen at '4' quality, I needed to wait a few frames to let Lumen settle down before the next 'shot camera' was rendered. Even so, it took only 10 minutes to render both the 4K MOV and WAV tracks using the legacy 'Movie Scene Capture'. I tried Render Queue, but it needed a setup for each shot, which wouldn't have given me more quality unless I wanted to use a lot of frame samples for anti-aliasing, which would have taken much longer to render. I really hope they don't pull the old 'Movie Scene Capture', since 'Render Queue' doesn't do full-mixdown audio and cannot capture orthographic camera output.
Oh, and happy new year.
Since most of my latest Unreal Engine projects had already been converted over to 4.27.1 or 4.27.2, I was bummed (though unsurprised) to find that, since UE5 Early Access was released in July 2021 with its last update coming in early September, levels created or modified in 4.27 would not show up. To alleviate that, I went to Unreal's GitHub, downloaded the latest source build (12/7) and created a new UE5 build that could open them. However, Lumen in that build had some terrible glitches handling lights and particle occlusion that caused light to flash and leak out of an occluding box. So I went back and created another build from a later date (12/16), and now I had a whole host of new problems, since emissive textures were 10-100 times brighter and the sequencer file kept freezing, requiring me to kill the process. Soooo, I just created a new sequence, which I wanted to do anyway for a new "Hot Cocoa in VR" trailer video based on some new "friends" to keep you company in the long winter months.
In working with Lumen, I find it very easy but frustrating: even though everything is real-time, many of my materials had to be tweaked to behave. It should also be noted that I wanted to do this without any NVIDIA RTX features, so I remained on DX11; what you see is ALL Lumen and a few Nanite meshes.
To render this out, I used both the legacy 'Movie Scene Capture' and 'Render Queue'. The former was used ONLY for rendering audio since, sadly, only 'Movie Scene Capture' allows for submixes. The video was rendered as an EXR (linear, 16 bits per color) image sequence with 3:2 Temporal Super Resolution anti-aliasing. This image sequence was brought into Adobe Premiere using the EXR Pro plugin; I set the project up for PQ (203 nits / 58% PQ), then adjusted the white and black levels to HDR PQ levels of 1000 nits. For export I chose HEVC at 40 Mbps, Rec. 2100 PQ.
Even though this was more of a study in using Lumen & HDR, most of it is useless in VR, since Lumen & Nanite do not work there. My next test is to see if the omni-capture tool can, so that I can capture 7K images using a specially designed capture cinema camera.
You can download the file used for upload to YouTube to view directly on your HDR monitor: https://1drv.ms/v/s!AjCOOLquXxqngvcdjzwXSieV2DeriA?e=oI6THn
In working with Lumen, I find it very easy but frustrating: even though everything is real-time, many of my materials had to be tweaked to behave. It should also be noted that I wanted to do this without any NVIDIA RTX features, so I remained on DX11; what you see is ALL Lumen and a few Nanite meshes.
To render this out, I used both the legacy 'Movie Scene Capture' and 'Render Queue'. The former was used ONLY for rendering audio since, sadly, only 'Movie Scene Capture' allows for submixes. The video was rendered using Apple ProRes LT, which is only a 4:2:2 encoder, but at 150Mbps, with 3:2 Temporal Super Resolution anti-aliasing, creating a 5 GB MOV file that rendered 2 minutes of animation in 45 minutes. I then used FFmpeg to mux the MOV with the stereo audio WAV to create the final MOV file used for this 4K file. Sadly, Unreal has not upgraded Apple ProRes to HDR, so this is still Rec. 709. And since the goal was to do EVERYTHING in Unreal, exporting 3700 EXR 4K image files into Adobe Premiere would not have allowed me to make that claim (I guess I could have brought the image sequence into FFmpeg; maybe next time). I think it looks pretty good, since I had control over all the lighting to prevent blown-out areas thanks to Unreal's new set of monitor tools.
So even though this was more of a study in using Lumen, most of it is useless in VR, since Lumen & Nanite do not work there. My next test is to see if the omni-capture tool can, so that I can capture 8K images using a specially designed capture cinema camera.
This version adds a few “little” surprises to make you smile, maybe even laugh a little.
“It’s cold out there, come in where it’s warm & cozy”
hotcocoa-vr.com
The songs " A Lighthouse in Space" & "Floating" by @Kim Aspen, aka Jimmy Walsteen were licensed from Adobe Content. Please check out Jimmy Walsteen & Kim Aspen artist page here on YouTube:
youtube.com/channel/UCN2vMEt08AGJTnWthIcVWDQ
youtube.com/channel/UCDXTvrDTilCO9O30SP1rllA
3D #AR / #VR Model on Sketchfab - skfb.ly/o7AE7
Apple #USDZ & #gLTF #3D model - (Link to be added soon)
Image Credits
NASA JPL
James Hastings-Trew
Owl Creek Technologies
Music Credits
Mars, the Bringer of War
Composer - Gustav Holst
Performer - Gizz Van Buskirk
*I am developing an inexpensive LED side-lit picture frame to illuminate a series of 10" velums, as well as a 3" backlit nightlight and a 6" animated frame for the 2021 holidays. Interested? Reach out by emailing info@owlcreektechnologies.tech
Goals of your mission:
- Observe from either NASA's Perseverance Mars Rover Mastcam-Z or as the #MarsHelicopter #RTE "eye", for each of Ingenuity's flights.
- Program your own flight pattern and observe from either Perseverance or Ingenuity.
- "Free Mode" allows you to fly Ingenuity like a drone with its computer assisted flight mode. Be careful and don't let your battery die. Even at 10m you won't stand a chance as you hit the Mars surface like a rock.
- Or just stroll along taking pictures & samples as the Perseverance Rover.
Using actual images from Perseverance & Ingenuity, I created photogrammetry-based 3D meshes of several terrain areas that closely match those actually surveyed, as well as areas not yet explored, based on the orbiting #HIRISE camera system. Along with that comes accurate color grading based on the color targets NASA's Jet Propulsion Laboratory uses to calibrate its own images taken by Perseverance, as well as audio produced in a physically accurate model of the low-density atmosphere of Mars.
Find out more over the next few SOLs here at https://IngenuityinVR.space
https://1smallstepfor.space
* Part of the "Apollo 11: One Small Step For...' VR Experiences" that will also include "Ascent: Eagle Has Left the Moon" (Q1 2022) and "LUNAR WARS" part 1 & part 2 in (Q2 2022 or when Unreal Engine 5.0 is released).
The 3D terrain was created by piping the 100+ images taken by the NAVCAM camera at 640x480, scaled to 1280x720, through the ‘Reality Capture’ and Apple ‘Object Capture’ photogrammetry tools. Sadly, NASA JPL has not released actual lens data, so there was an accumulated curvature error that had to be corrected in Blender. I chose Apple’s Object Capture since it does a better job of creating a decent mesh size, as well as 8K textures for diffuse, roughness, AO & normals. This was brought into Blender 3.0 Alpha with Cycles X to create the initial animations seen on Twitter (All Things 3D). I then used Adobe PS to create an orthographic texture from the third 4K image taken by the RTE camera, which after cropping left me a decent 3K texture I could project onto the terrain using the current low-res texture as a guide. Finally, it was sectioned, re-UVed & baked.
Ingenuity RTE & NAVCAM and Perseverance Mastcam-Z images for 360 courtesy of NASA JPL
Ingenuity audio and Mars wind based on Perseverance recordings, courtesy of NASA JPL
"The Commander Thinks Aloud" is a song composed by The Long Winters and performed by eaneikiciv . I found it fitting not only in the title, but also what the song is really about and one of the US's few space tragedies -- Space Shuttle Columbia.
I hope you enjoy it.
You can find eaneikciv's cover on SoundCloud - soundcloud.com/eaneikciv/the-commander-thinks-aloud
A solo piano version by John Roderick (of the Long Winters) can be heard here - youtu.be/grPBYQ_a6Cg?t=239
A new (free VR) version of "Excursion: 145 Minutes on the Moon" is coming out 6.2.2021 using UE 4.27 . Find out more here: https://1smallstepfor.space
Mars Tech Updates:
As mentioned above, the main goal is to emulate the flight characteristics of Ingenuity so that a VR user can fly it without having to worry about pitching or rolling the helicopter so much that flight becomes unstable, and to provide external forces so the craft doesn't feel like a "point and direct" game-style craft. For "Ascent: Eagle Has Left the Moon," part of the "Apollo 11: 'One Small Step For...' VR Experiences" series, I created a complex flight model to emulate the forces at play in lifting off and returning to orbit around the Moon. This allowed me to synthesize the vectored thrust necessary to recreate the actual flight path Eagle took, as described by Neil Armstrong and Buzz Aldrin, as it reached escape velocity and engine shutdown. Of course, over 50 years later there are volumes of data and books available describing the navigation computer algorithms and the exact thrust (in Newtons) and fuel amounts needed for Eagle's ascent module and RCS thrusters, enough to create a simplified model in Unreal Engine's Blueprint visual scripting system. Sadly, there has been only one paper published (in 2018) by the Ingenuity team, and it is more of a summary and description of the engineering goals, with limited specifications; in trying to emulate Ingenuity's RTE camera, as seen in this video's perspective, I found the camera specifications in that paper to be wrong. The other article does provide some updates and a few illustrations, but the charts are not very detailed. Again, NASA, please release the data.
Unreal Engine Tech Updates:
One of the problem areas in "Ascent: Eagle Has Left the Moon" was smoothly transitioning from detailed terrain to actual DEM-based meshes and orthographic images at different levels of detail. The LROC camera system has provided detailed images of the entire Moon at resolutions from 25cm to 50m. Blending these together was a problem, and frankly one of the reasons it has not been released yet. However, taking that development experience and working from scratch, I was able to come up with a much better system of blending UE terrain with HiRISE DEM & orthographic images so that they are seamless. (I offer anyone a free copy of "Ingenuity in VR" if they can find the seam, but it will already be free.) This method also allows me to add more detailed terrain dynamically as more information comes in from Perseverance's cameras. What is still needed before release is the recreation of a number of unique rock formations and more precise geo data. Eyeballing locations based on geographic features is not really the most exact way to position the rover and helicopter, or many of the rocks, especially if the images presented by NASA also change over time. Since "Ingenuity in VR" is supposed to be a drone simulator as well as a discovery tool, again, I wish NASA would provide more data to the public.
YouTube Tech Update:
NASA JPL just released their own "3D" video of Ingenuity's 3rd flight from the Mastcam-Z camera, but it should be noted it was only 480p and a red/cyan anaglyph pair. In creating this video, I took the time to figure out how to use YouTube's "3D" feature, which sadly has been downgraded over the past two years in favor of their 180 panoramic video method. This broke most of the 3D (stereo-paired) videos on YouTube, and on top of that it is not even a "colored" anaglyph method. Even more frustrating is how you get YouTube to recognize that your video is in 3D: you must either encode frame packing (mp4) or a meta tag (mkv), since they did away with the simple toggle. Luckily, this FFmpeg command line does work: "ffmpeg -i vidfilein.mp4 -vcodec libx264 -crf 18 -x264opts frame-packing=3 vidfileout.mp4"; just replace vidfilein.mp4 with your file and vidfileout.mp4 with your output file name. '-crf 18' (compression quality) is optional, and a little better than the default of '23'.
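For reference, here are both routes as ready-to-edit command strings: the MP4 frame-packing route from above, and an MKV route that only tags Matroska stereo_mode metadata without re-encoding. The file names are placeholders, and the commands are printed rather than executed.

```shell
# MP4 route: re-encode with an H.264 frame-packing SEI
# (frame-packing=3 is side by side; 4 would be top/bottom).
MP4_CMD="ffmpeg -i vidfilein.mp4 -vcodec libx264 -crf 18 -x264opts frame-packing=3 vidfileout.mp4"
# MKV route: stream-copy and set the Matroska stereo_mode metadata instead.
MKV_CMD="ffmpeg -i vidfilein.mp4 -c copy -metadata:s:v:0 stereo_mode=top_bottom vidfileout.mkv"
echo "${MP4_CMD}"
echo "${MKV_CMD}"
```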
Ingenuity Technical Notes:
After talking to helicopter expert Wayne Johnson, one of the NASA consultants used in the design of the Ingenuity helicopter, I have learned that it has only three rotor speeds: off, idle, and full speed. Vertical thrust and vector positional changes are all done with coordinated & differential pitch changes to the propeller blades, about 400-500 times a second. I had already read that, but what I didn't know was that spinning on its Y axis (yaw rotation) is also done with differential pitch changes to the upper and lower props, creating the necessary rotational torque to spin the body. By differentially changing the upper and lower props' pitch, you don't lose altitude, and the motor RPM never changes. This reduces strain on the motors and, more importantly, creates a flat power curve, with the only surge coming from the initial velocity change from off to idle to full speed. This goes a long way toward increasing flight time, and also reduces stress on the Li-Ion batteries.
Next week, we will be ready to show off what a "360" panoramic would look like if Ingenuity were to take four or five overlapping images based only on 90 or 72 degree yaw rotational shifts, as well as the very first "real time" video capture of "Ingenuity in VR" from the perspective of the Mastcam-Z camera and the FPV view from Ingenuity's RTE camera. Both modes will be available while flying in Jezero Crater. We will also take video of our hands using the controllers to show how they affect flight in the real-time video. It is getting pretty exciting on our end as we draw nearer to "Ingenuity in VR's" first VR flight.
https://IngenuityinVR.space
#ingenuity , #ingenuityhelicopter , #perseverance , #perseverancerover , #marshelicopter , #marsrover
Technical Notes:
Notice the inset video, which is from the perspective of Ingenuity's RTE color camera. What is interesting is that the paper written in 2018 by Bob Balaram, lead engineer for the Ingenuity project, states that this is a Sony IMX-214 with 4208 x 3120 effective pixels (confirmed in the images sent back from Ingenuity). The problem is that this is a 1/3.2" sensor (the same one used in the Apple iPhone 5), and there is no way the FOV is only 47 degrees; I have figured it to be about 93.92 degrees horizontally to allow the landing feet to show (which is also why you see some geometric distortion), and the 22 degree horizon offset cannot be true either, since it needs to be more like 42 degrees to not show the rover. It is a little odd that they didn't pitch up, or climb a couple more meters, to take a picture of the rover. Also, with a 93.92 degree FOV, you can capture an entire 360 with four 90 degree offset rotations and about 4 degrees of overlap at each seam; five 72 degree offsets would be better, giving you about 22 degrees of overlap, but requiring five images to be processed and uploaded. I am surprised that they have not already captured these images, though I did have to pitch up to take these shots above the horizon. At 5.2 meters, this would be a very impressive 360 panoramic. Add a few more, pitched up and down, and you can create an equirectangular at 13 megapixels per image. Let's hope they do this before the end of the week. It also dawned on me that if Ingenuity is still active when Perseverance moves on, they should just program a flight path to keep it within radio distance. With the panoramic, its laser range finder, and its B/W lookdown camera, it should be able to stay clear of any surfaces that could cause it to tip. Even if it does land canted, it can always send back IMU data so they can program it to pitch the propellers for vertical flight.
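The seam-overlap arithmetic above is easy to check. This sketch (function names are mine, and the 4.54mm/2.27mm figures are a hypothetical example, not datasheet values for the IMX-214) computes horizontal FOV from sensor geometry and the per-seam overlap for a given number of yaw-offset shots:

```python
import math

def hfov_deg(sensor_width_mm: float, focal_mm: float) -> float:
    """Horizontal field of view from sensor width and focal length."""
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_mm)))

def overlap_per_seam(fov_deg: float, shots: int) -> float:
    """Degrees of overlap at each seam when `shots` images are taken
    at equal yaw offsets of 360/shots degrees."""
    return fov_deg - 360 / shots

print(round(overlap_per_seam(93.92, 4), 2))  # 3.92 deg: four 90-degree offsets
print(round(overlap_per_seam(93.92, 5), 2))  # 21.92 deg: five 72-degree offsets
print(round(hfov_deg(4.54, 2.27), 1))        # 90.0 deg for this hypothetical lens
```

So with four shots the seams only just overlap, which is why five 72-degree offsets give a much safer stitching margin.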
Technical Unreal Engine Notes:
This teaser video shows off the latest updates to the Mars landscape, including some unique ways of using Unreal Engine's terrain painting to actually paint orthographic textures created from photogrammetry 3D models built from NASA JPL Perseverance Mastcam-Z images from SOL-3, as well as hours of hand-planting to create a similar terrain. The rover tracks were done by creating decals and laying them down like railroad tracks. Next week I finish up the physics model and controls, so stay tuned for updates as we get closer to release; sadly, Ingenuity reaches its final week before Perseverance packs up and moves on, leaving Ingenuity stranded but still able to fly. My hope is that they start mapping flights that keep it close enough to Perseverance to continue relaying images back.
#ingenuityhelicopter #perseverancerover #VR
In any case, this plus all the other motion/facial capture tools available to the indie developer should finally give me the power to bring Vincent back to life, at least in "Chatting with Vincent," a VR experience where you sit down with Vincent and ask him questions via a simple voice input parser. The room has been recreated in the authentic colors of the actual bedroom Vincent stayed in within the "Yellow House" in Arles, France. Sadly, this is also the time that Vincent had one of his major psychotic episodes, in which he cut off his left ear. As you may have noticed, this 'MetaVincent' still has his ear; even if I could remove it later, I probably won't, to harken back to a time when he was more cogent. Plus, I think it would be unsettling in a pretty real VR setting.
Technical Notes:
Both inset videos show what the VR view will look like based upon the actual camera specifications for #Ingenuity 's RTE horizon-facing color camera and #Perseverance 's Mastcam-Z camera, including a working zoom. Both cameras will, like their real-world counterparts, have limited motion, all dependent on how you manipulate the joysticks on your VR hand controllers. There will also be a "spectator mode" (main video) that will allow you free motion behind the rover, with limited "physical motion" roaming space, to view the actual flights of Ingenuity, which has now completed its first historic flight, which I am happy to say I witnessed in real time very early Monday morning when the data started coming back.
For those of you who are anxious to get your hands on "Ingenuity in VR," we are sorry we missed our 4/15/2021 (SOL 65) date, but we have decided to take more time to meticulously paint and add rocks around the craft, as well as make mesh refinements to the surrounding crater ridge and delta cliffs. Thankfully, NASA Jet Propulsion Laboratory (JPL) and its team at Arizona State University under Jim Bell's leadership have done a magnificent job of providing detailed images from the Mastcam-Z and NavCam to make this task more accurate. We are also hoping to see how close we are in recreating the Ingenuity RTE camera view in these preliminary teaser videos, but will make any necessary adjustments to make the user feel they are "one" with Ingenuity.
#marshellicopter
Yea, we are going to limit you to 46 deg. FOV in your #VR HMD as well.
#marshelicopter
https://IngenuityinVR.space
#marshelicopter
As mentioned in the previous teaser, the Unreal Engine was used to create the Apple ProRes 4K video and 5.1 audio, which was only brought into Adobe Premiere to multiplex the audio for upload to YouTube. A 1080p stereo version was also created for Facebook, Twitter and LinkedIn.
https://IngenuityinVR.space
Music from the first MIB movie, made almost 25 years ago by Will Smith - youtu.be/fiBLgEx6svA
#marshelicopter
This video was completely rendered in the Unreal Engine and only multiplexed in Adobe Premiere/Encoder. 5.1 Audio was exported directly from Unreal as well with no extra audio layers or sweetening done in a DAW.
#marshelicopter
The goal, when this is done, is for the VR user to be able to stage and launch Ingenuity for either a programmed flight or two modes of user control: through the eyes of Perseverance's Mastcam-Z for standard "drone"-like flight, or "FPV" mode through the down-looking cameras, which is what Ingenuity uses to "map" its path. Currently it will be optimized for the desktop, with a "lite" version for the Oculus Quest 2 in standalone mode.
In creating the version of the craft for Unreal Engine, we broke apart the NASA #Perseverance & #Ingenuity models to provide some articulation in the camera and rotors. We also redid the models' UVs and textures into only one "super" PBR texture to improve VR performance. The goal is to provide a series of movable components so you can program the rover to perform the tasks that stage the initial launch of Ingenuity. (You will notice both the rotors and the Mastcam structure move in this video.)
The terrain is based on earlier NASA Jet Propulsion Laboratory (JPL) images downloaded from Astropedia as a 25cm-per-pixel orthographic image and a 1m-per-pixel height map in their proprietary HiRISE DTM mosaic, which we scaled up to match the 25cm orthographic image, creating a whopping 400GB image. We immediately cropped this down to a more manageable 64K x 64K area that will be brought into UE as 16K x 16K UDIM quadrant textures. For this demo video we only imported one 16K quadrant, with a special terrain derived from a DEM in HiRISE TIFF converted into a fairly high-polygon mesh to capture some of the nuances of the sand drifts. We also added a rough, highly detailed terrain around the rover, again based upon images captured by the rover on SOL 11-20, and did some basic color grading. In the future we will blend the two terrains together and, using height-determined blending techniques, give the #VR user an experience that is out of this world - literally
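As a rough sanity check on the numbers in the terrain pipeline above (the tile size and the 1m-to-25cm upscale are from the post; the bytes-per-pixel figure is my assumption, since the post does not state a bit depth):

```python
# Back-of-envelope numbers for the terrain pipeline described above.
ORTHO_M_PER_PX = 0.25                     # 25 cm orthographic imagery
HEIGHT_M_PER_PX = 1.0                     # 1 m HiRISE DTM height map
scale = HEIGHT_M_PER_PX / ORTHO_M_PER_PX  # upscale the DTM 4x per axis

crop_px = 64 * 1024                       # 64K x 64K working crop
tile_px = 16 * 1024                       # 16K x 16K UDIM tiles
tiles = (crop_px // tile_px) ** 2         # tiles needed to cover the crop

bytes_per_px = 3                          # assume 8-bit RGB (my assumption)
tile_gb = tile_px * tile_px * bytes_per_px / 2**30

print(scale, tiles, round(tile_gb, 2))    # 4.0 16 0.75
```

So a 64K crop actually yields sixteen 16K tiles, and even a single 8-bit RGB tile is about 0.75 GB uncompressed, which is why only one quadrant was imported for the demo.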
More teaser videos on YouTube : youtube.com/playlist?list=PLqnhoKrXoLPEhJca5y_H85xFGx7PCU3As
More info about the VR app coming to https://ingenuityinvr.space soon
#marshelicopter
The goal, when this is done, is for the VR user to be able to stage and launch Ingenuity for either a programmed flight or two modes of user control: through the eyes of Perseverance's Mastcam-Z for standard "drone"-like flight, or "FPV" mode through the down-looking cameras, which is what Ingenuity uses to "map" its path. Currently it will be optimized for the desktop, with a "lite" version for the Oculus Quest 2 in standalone mode.
In creating the version of the craft for Unreal Engine, we broke apart the NASA #Perseverance & #Ingenuity models to provide some articulation in the camera and rotors. We also redid the models' UVs and textures into only one "super" texture for PBR to improve VR performance. The goal is to provide a series of movable components so you can program the rover to perform the tasks that stage the initial launch of Ingenuity. (You will notice both the rotors and the Mastcam structure move in this video.)
The terrain is based on earlier NASA Jet Propulsion Laboratory (JPL) images downloaded from Astropedia as a 25cm-per-pixel orthographic image and a 1m-per-pixel height map in their proprietary HiRISE DTM mosaic, which we scaled up to match the 25cm orthographic image, creating a whopping 400GB image. We immediately cropped this down to a more manageable 64K x 64K area that will be brought into UE as 16K x 16K UDIM quadrant textures. For this demo video, we only imported one 16K quadrant, with a special terrain derived from a DEM in HiRISE TIFF converted into a fairly high-polygon mesh to capture some of the nuances of the sand drifts. The next step is to create a very high resolution terrain based on photogrammetry of the surface around the rover, along with particle generation for shifting sand.
The lighting is based on a cubemap created from an earlier 8K 360 I modified from the actual 360 from JPL.
More to come...
In creating the version of the craft for Unreal Engine, we broke apart the NASA #Perseverance & #Ingenuity models to provide some articulation in the camera and rotors. We also redid the models' UVs and textures into only one "super" texture for PBR to improve VR performance. The goal is to provide a series of movable components so you can program the rover to perform the tasks that stage the initial launch of Ingenuity. (You will notice both the rotors and the Mastcam structure move in this video.)
The terrain is based on earlier NASA Jet Propulsion Laboratory (JPL) images downloaded from Astropedia as a 25cm-per-pixel orthographic image and a 1m-per-pixel height map in their proprietary HiRISE DTM mosaic, which we scaled up to match the 25cm orthographic image, creating a whopping 400GB image. We immediately cropped this down to a more manageable 64K x 64K area that will be brought into UE as 16K x 16K UDIM quadrant textures. For this demo video, we only imported one 16K quadrant, with a special terrain derived from the height map of the same 16K quadrant. The next step is to create a very high resolution terrain based on photogrammetry of the surface around the rover, along with particle generation for shifting sand.
The lighting is based on a cubemap created from an earlier 8K 360 I modified from the actual 360 from JPL.
More to come...
Owl Creek Technologies regraded the sky and surface and ensured the color remained as vibrant as in the TIF file. We also added a little fun "cover" over the lower intersecting point.
We also added the audio files recorded by #Perseverance, but Ambisonicized them to give them a "3D" positional quality and give more life to the barren area surrounding the rover known as #JezeroCrater .
Looking forward to when all the image files will be available from the right camera of the stereo-paired #MastCamZ camera that took this, as well as the many other gorgeous, extremely detailed images being made available here: mars.nasa.gov/mars2020/multimedia/raw-images so that a "3D" stereo 360 can be created; until then, just pretend you have one eye closed. :)
Enjoy and look forward to our Unreal Engine based Ingenuity FPV flight sim coming out in a few weeks.
Now for the technical stuff:
This version of Hot Cocoa in VR is based on a novel technique to capture full 16-bit HDR stereo projection cube maps, color graded down to 12-bit Apple ProRes in Rec.709 color space, then optimized in Adobe Premiere to push up the mid levels to make it easier on the eyes in your #VR headset. This version also includes a novel way to capture #Ambisonic audio: a 5.1-channel sequence for left/right, front/back and "omni," plus a 5.1-channel sequence to capture up/down. Using this technique and the Ambisonic VST library from Matthias Kronlachner allowed these nine channels to create the full spatial audio experience within the Unreal Engine without having to add additional audio tracks or positioning beyond the fixed positioning required to map the eight channels plus one omni or center channel. This technique allows the full spatial experience to be heard as though you were listening to it in the actual #VirtualReality app, whose "Winter Edition" will be available soon.
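For context on what the fixed channel positioning ultimately encodes, here is a minimal sketch of standard first-order AmbiX panning gains (ACN channel order, SN3D weighting). This is textbook first-order ambisonics, not the exact nine-channel routing used in the project:

```python
import math

def ambix_foa_gains(azimuth_deg: float, elevation_deg: float):
    """First-order AmbiX gains (ACN order W, Y, Z, X; SN3D weighting)
    for a mono source at the given direction."""
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    w = 1.0                           # omnidirectional component
    y = math.sin(az) * math.cos(el)   # left(+) / right(-)
    z = math.sin(el)                  # up / down
    x = math.cos(az) * math.cos(el)   # front / back
    return w, y, z, x

# A source straight ahead puts all directional energy in W and X:
print(ambix_foa_gains(0, 0))  # (1.0, 0.0, 0.0, 1.0)
```

Multiplying a mono signal by these four gains yields the four AmbiX channels that players like YouTube 360 decode back into a rotatable soundfield.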
You can find the same 360 video on Facebook here:
https://fb.watch/3tjVV9AXP-/
The songs "A Lighthouse in Space" & "Floating" by @Kim Aspen, aka Jimmy Walsteen, were licensed from Adobe Content. Please check out the Jimmy Walsteen & Kim Aspen artist pages here on YouTube:
youtube.com/channel/UCN2vMEt08AGJTnWthIcVWDQ
youtube.com/channel/UCDXTvrDTilCO9O30SP1rllA
Now for the technical stuff:
Further tests and research into the best way to push HDR-compatible content from the Unreal Engine to Adobe Premiere or another NLE, as well as further tests and research into pushing multi-track spatial audio from Unreal to Premiere. This version is best enjoyed in a quiet setting with the lights dimmed, on a large HDR-compatible monitor or screen, with a 5.1 surround sound system. The next version will be the 6K stereo 360 using my novel 360 capture system and a new technique needing only two audio export passes to derive either 1st order or 2nd order Ambisonic audio without having to recreate new audio channels in your DAW or NLE. Look for the Apple ProRes Rec.709 10-bit stereo 360 and 180 videos with 1st order and 2nd order audio channels, then finally an upgrade to the VR experience. All this to be followed up with a tutorial on how to do it all yourself.
The songs "A Lighthouse in Space" & "Floating" by Kim Aspen, aka Jimmy Walsteen, were licensed from Adobe Content. Please check out Jimmy Walsteen & Kim Aspen here on YouTube:
youtube.com/channel/UCN2vMEt08AGJTnWthIcVWDQ
youtube.com/channel/UCDXTvrDTilCO9O30SP1rllA
Now for the technical stuff:
This version of Hot Cocoa in VR is based on a novel technique to capture full 16-bit HDR stereo projection cube maps, graded down to 12-bit Apple ProRes in Rec.709 color space, then optimized in Adobe Premiere to push up the mid levels to make it easier on the eyes in your VR headset. (Note this version is based on a 10-bit Rec.709, Profile High10, Level 6.2, 80 Mbps H.264 / 620 Kbps AAC encode for compatibility with Facebook.)
This version also includes a novel way to capture Ambisonic audio: a quad/4-channel sequence for left/right, front/back and another sequence to capture up/down. Using this technique and the Ambisonic VST library from Matthias Kronlachner allowed these six channels to create the full spatial audio experience within the Unreal Engine without having to add additional audio tracks or positioning beyond the fixed positioning required to map the six channels. This technique allows the full spatial experience to be heard as though you were listening to it in the actual VR app, whose "Winter Edition" will be available soon.
Even though Premiere can only work with 1st order Ambisonic audio (AmbiX), this method is more than adequate for assembling, editing and positioning audio in the soundfield without requiring another spatial audio application like FB Spatial Workstation, Reaper or Pro Tools for Google YouTube 360 video with AmbiX immersive audio. Keep in mind you can still bring your rendered audio/video file into the Spatial Workstation encoder for Facebook re-encoding, as done on my 'Owl Creek Technologies' page: https://fb.watch/2Yy0G7c5yY/ It should be noted that Facebook does not allow HDR and limits resolution to 5120 x 5120 over/under stereo 360. In trying to upload this 6K HDR video with this AmbiX audio track converted for Facebook, it became an unwatchable mess. Converting it to 5K was not enough, and I finally had to remove the HDR encoding. Sadly, I did not do a proper conversion to SDR, and the lower end is now crushed. Here on YouTube, they do a pretty good job of converting back to SDR, so you don't lose as much detail in the shadows.
0:00 Opening
Examples of 360s done to promote gaming or movie experiences
5:07 - Hellblade: Senua’s Sacrifice - youtu.be/Kz2xpeGkcRU
7:51 - VR Spacewalk Experience | BBC HOME - youtu.be/hEdzv7D4CbQ
9:52 - GTA V - 360 VR Video - youtu.be/SWiWKrJdb18
10:51 - Darth Vader Immortal - youtu.be/GcLhDx10Em0
11:54 - Hot Cocoa in VR - youtube.com/playlist?list=PLqnhoKrXoLPEAkSfPvk1WlOvYqsOVpSrZ
13:43 - Excursion: 137 Minutes on the Moon - youtube.com/playlist?list=PLqnhoKrXoLPEbIPDT4AeJUOG31u5cwCmG
Tools for capturing and editing Video and Audio in the Unreal Engine
17:20 - Kite & Lightning Panoramic Capture Tool - docs.unrealengine.com/en-US/WorkingWithMedia/StereoPanoramicCapture/index.html
20:52 - NVIDIA Ansel - nvidia.com/en-us/geforce/geforce-experience/ansel/#:~:text=NVIDIA%20Ansel%20is%20a%20powerful,%2C%20HDR%2C%20and%20stereo%20photographs.
22:01 - Surreal Capture - surrealcapture.com
24:40 - Unreal Scene Capture Cube - docs.unrealengine.com/en-US/Resources/ContentExamples/Reflections/1_6/index.html#:~:text=Scene%20Capture%20Cubes%20capture%20a,a%20texture%20within%20any%20Material.
25:43 - John Carmack’s Quest (Android) 360 capture player - developer.oculus.com/downloads/package/vr5kplayer
29:08 - Spatial Media Metadata Injector - github.com/google/spatial-media/releases
30:24 - Google 180 Creator - arvr.google.com/vr180/apps
- Adobe Premiere
Video
30:53 Audio - http://www.willyurman.com/teaching/handouts/Ambisonic_in_Premiere.pdf
31:22 - Facebook Spatial Audio - facebook.com/formedia/blog/introducing-spatial-audio-for-360-videos-on-facebook
31:39 - Reaper Digital Audio Workstation - https://reaper.fm
360 Capturing ﹘ The Process
33:24 - Panoramic Capture Tool
Quick Start - docs.unrealengine.com/en-US/WorkingWithMedia/StereoPanoramicCapture/QuickStart/index.html
Reference - docs.unrealengine.com/en-US/WorkingWithMedia/StereoPanoramicCapture/Reference/index.html
Tips & Tricks - docs.unrealengine.com/en-US/WorkingWithMedia/StereoPanoramicCapture/TipsAndTricks/index.html
Other stuff to make your life easier - unrealengine.com/en-US/tech-blog/capturing-stereoscopic-360-screenshots-videos-movies-unreal-engine-4?sessionInvalidated=true
47:46 - NVIDIA Ansel Photography Plugin
Overview - docs.unrealengine.com/en-US/WorkingWithMedia/Ansel/Overview/index.html
Variable Reference - docs.unrealengine.com/en-US/WorkingWithMedia/Ansel/Reference/ConsoleVariables/index.html
Testing - docs.unrealengine.com/en-US/WorkingWithMedia/Ansel/Testing/index.html
Missing Ansel folder - nvidia.custhelp.com/app/answers/detail/a_id/4932
Ibrews ‘Macro Creator’ hack to capture video frames - gumroad.com/l/dAgAT
1:01:59 - Unreal Scene Capture Cube
Overview
Implementing (see this video)
Forum - forums.unrealengine.com/development-discussion/rendering/40943-cuberendertarget-to-2d-texture
Sergey Marasov - http://brabl.com/360-video-capture-in-unreal-engine-4
1:13:00 - Unreal Stereo Capture Cube
Overview
Implementing (see this video)
1:17:45 - Surreal Capture
Overview
Tips and Tricks
360 Image Uploading
- Kuula 360 - kuula.co
360 & 180 Video Uploading
- YouTube - support.google.com/youtube/answer/6178631?hl=en
- Facebook - facebook.com/help/828417127257368
- Oculus Quest or other Android based VR HMD - developer.oculus.com/blog/techniques-for-improved-vr-video-w-john-carmack/?locale=es_ES
- Standalone 360 Players
Desktop
"Hot Cocoa in VR" YouTube 360 - youtu.be/B0yN4gyi7K8
"Hot Cocoa in VR" Google 180 - youtu.be/-tpR8_uluN4
Actual "Hot Cocoa in VR" VR app for Steam VR & Oculus (desktop) - owlcreektech.itch.io/hot-cocoa-in-vr (Open XR coming soon)
Join us here in Google 180 for hot cocoa and stay for a while for the reading of "It Was The Night Before Christmas"
This is the first of a new series of VR experiences that calm your mind and heart by immersing you in tranquil experiences away from modern life with all of its anxiety-provoking issues.
Merry Christmas and a Happy New Year
Mike & Shavaun
You can find the full immersive VR version for SteamVR and Oculus here: owlcreektech.itch.io/hot-cocoa-in-vr with OpenXR coming soon.
Join us here in stereo 360 for hot cocoa and stay for a while for the reading of "It Was The Night Before Christmas"
This is the first of a new series of VR experiences that calm your mind and heart by immersing you in tranquil experiences away from modern life with all of its anxiety-provoking issues.
Merry Christmas and a Happy New Year
Mike & Shavaun
You can find the full immersive VR version for SteamVR and Oculus here: owlcreektech.itch.io/hot-cocoa-in-vr
"Peace is just a breath away. Calm your mind and heart with guided imagery in our virtual cozy Christmas cabin. "Hot Cocoa in VR" is here, created specially for you."
This is the first of a new series of VR experiences that calm your mind and heart by immersing you in tranquil experiences away from modern life with all of its anxiety-provoking issues.
Today, however, you should join us for hot cocoa and stay for a while for the reading of "It Was The Night Before Christmas"
Merry Christmas and a Happy New Year
Mike & Shavaun
owlcreektech.itch.io/hot-cocoa-in-vr
*This video was rendered entirely in Unreal's Sequencer using ProRes 422 LT, converted using 'Shutter Encoder' to H.264 2020 PQ 10-bit, and muxed with a rendered 16-bit 2-channel WAV file to capture a reasonable facsimile of sound location without having to resort to Adobe Premiere or Blackmagic DaVinci Resolve. In fact, even the titling at the end was done using the 'font renderer', with a few tweaks to a material to give it an ethereal translucent glow that can be called in the Sequencer, or for that matter in a Blueprint.
Look for a Google 180 of this sequence this weekend, captured using a fairly new 360 tool called "Surreal Capture" that works in any Unreal-based executable or even the editor's render window, as well as a full tutorial, presented to the local Portland Unreal Developer Meetup group and uploaded to YouTube afterwards, that discusses the pros and cons of several 360 tools one can use with the Unreal Engine.
Read more about it and download the initial version for the Oculus and Steam VR for free on itch.io
Next week a version in the new #VR #OpenXR standard will be released for Microsoft Windows, Windows Mixed Reality, as well as stereo "3D" 360 and 180 videos for Facebook and YouTube.
Until then, sit back and relax while Pamela Shavaun Scott, LMFT, uses her special power of guided imagery to quiet your thoughts and help you relax in a warm, inviting cabin.
The hot cocoa is on us.
owlcreektech.itch.io/hot-cocoa-in-vr
"Hot Cocoa" is a VR experience using safe, calming environments that stimulate your senses. For the senses your VR system cannot interact with, your host -- psychotherapist Pamela S. Scott, with an over-38-year career helping thousands of clients overcome depression, anxiety, and family and relationship problems -- will use a powerful tool called "Guided Imagery" to help you immerse yourself even further in the experience.
It will not only be available as a VR app for the desktop, but also as stereo 360, 180 and standard 16:9 videos created for those who are using a phone or a mobile platform like the Oculus Quest(2).
A special thanks to Unreal for releasing so much free content, like the main components of this experience, titled "Log Cabin" by Gabro Media ( facebook.com/gabromedia ), whose main map was almost perfect, needing only a few alterations and add-ons to make it a little more cozy and, well, a perfect place to spend the holidays. I also give a shout-out to all the other 3D artists who indirectly helped make it possible for me and the many small indie teams to populate our experiences with creative, beautiful and original pieces of 3D art; they will be listed in the app notes when released.
I would also like to bring attention to the talented guitarist Kim Aspen (open.spotify.com/artist/0TWDlZlnx5EiVZ1Ik1l2Al) who brings life to her rendition of "O' Tannenbaum" that fits perfectly in this slow dolly shot through the cabin.
Look closely, and you may even see the "Mikey Elf": a very cute elf model I found and remodeled to fit my face, which was 3D scanned with my 4eyes lens system & the Occipital Structure Sensor ( structure.io ), then hand painted using 3D Coat ( 3dcoat.com ).
Enjoy the sneak holiday treat and look for the 360, 180 videos and VR app soon.
This is part of the "Apollo 11: 'One Small Step For...' VR Experiences"* series, titled "Excursion: 137 Minutes on the Moon," to be out this holiday on Steam VR, Oculus, and Viveport. In the actual VR experience you can move around, pick up rocks, work with the tools & test equipment, and even take pictures using your chest-mounted Hasselblad EL500 camera, which you can make appear/disappear with the press of a button. Images are stored on your hard drive so you can send them to your friends to, you know, prove that you were on the moon.
This 360 capture was done with 'Surreal Capture', a tool that allows you to capture 360 and stereo SBS video directly from any Unreal Engine game or VR experience in real time at 7680 x 3840 with varying sample rates. It also works great during development while in the editor. Sadly, stereo 360 capture is not implemented yet (coming soon), but it is easy to get true stereo for at least 180 degrees by using the horizontal offset variable and inserting 6. It captures most post-processing effects except Exposure/Eye Adaptation; I am working with the developer to correct this if possible. surrealcapture.com
You can find out more about the "Apollo 11: 'One Small Step For...' VR Experiences" series at https://1smallstepfor.space
* An EPIC Mega Grant winner.
This stereo SBS was captured with 'Surreal Capture,' a tool that lets you capture 360 and stereo SBS video directly from any Unreal Engine game or VR experience in real time at 7680 x 3840 with varying sample rates. It also works great during development, right in the editor. Sadly, stereo 360 capture is not implemented yet (coming soon), but it is easy to get true stereo for at least 180 degrees by setting the horizontal offset variable to 6. It captures most post-processing effects, except Exposure/Eye Adaptation; I'm working with the developer to correct this if possible. surrealcapture.com
You can find out more about the "Apollo 11: 'One Small Step For...' VR Experiences" series at https://1smallstepfor.space
* An Epic MegaGrants winner.