3D in Review for January 10 - 16, 2015
All Things 3D
2015-01-16 | Mike and Chris wrap up CES, GE's new near-real-time CT scanning with expert radiologist Dr. Klioze, Mike's latest projects and product feedback, and of course the "Print Whisperer's" latest print tip.
Projects of the week:
Mike - NEODiMOUNT case punching
Mike - App development to incorporate Grip & Shoot and 4eyes lens compatibility
Mike - Building a new skull with the MakerGear M2 for the Make magazine and AP article, coating it with XTC-3D: http://www.reynoldsam.com/product/xtc…
Mike - VR-One/NEODiMOUNT conversion
Chris - Taz4 dual extruder. Got it working.
Updates from the Past: Mike - Valcrow’s glamour video of his Ducati and Ducati girl
Upcoming Events: 3D Printing World, January 29-31, http://www.3dprinterworld.com

‘Gaming Lens Mount’ is coming!
All Things 3D
2023-11-23 | “Float” any low-cost (#Alldocube comes to mind) 8”-9” tablet above a #Sony or #Xbox controller for a console/PC gaming experience on the go.
Come back to this video tomorrow for a link to get 20% off when you preorder any of the ‘Gaming Lens’ products before December 1st.*
*Product availability will vary based on manufacturing, packaging and delivery of inventory between the end of December and January 2024.

Introducing the ‘Gaming Lens’ mount attached to a surprisingly affordable 8.4” Android Gaming Tablet
All Things 3D
2023-10-04 | Sony just made the ‘Portal’ available for preorder in Japan with great success. However, before you preorder your own, what if I told you there was an 8.4” IPS 1920x1200 display, Android 13 tablet with a MediaTek Helio G99 SOC, 8GB RAM, 256GB UFS storage, 13MP rear/5MP front cameras, WiFi 5, BT 5 AND 4G LTE? All this in a package that resembles an @Apple iPad Mini 6 at 1/3 the $$$?
Meet Alldocube’s iPlay50, here attached to a Sony DualShock 4 using the ‘Gaming Lens’ mount, running through a number of game streaming services like Microsoft Xbox Game Pass, Nvidia GeForce Now, Valve’s Steam Link and the open-source streaming app Moonlight, all via WiFi or 4G (YES, GoogleFi works great!). More importantly, with its 8GB RAM and fast storage, it also plays every AAA Android game I threw at it, like ‘Alien Isolation’, Doom 3, Half-Life 2 (and Portal 1 & 2) and even Epic’s Fortnite. This is also due to it being Google Play certified, which sadly many low-cost Android tablets are not.
Why is this exciting to me? I have been looking for an 8”-9” tablet to use with the ‘Gaming Lens’ mount, but other than 1280x800 8” tablets, the only other solution is the iPad Mini 6th generation, which will set you back $500, and that is for 64GB storage and a less-than-ideal 4:3 screen for gaming. It did great with anything I threw at it, including PlayStation remote play, but still: $500! There's also the fact that you can’t run Game Pass or GeForce Now through an app (just through Safari), and definitely NO Fortnite. So again, what a surprise to find a low-cost Android tablet that does about everything the iPad Mini does, with a more open Android OS, and did I mention 4G? If that isn’t enough, how about 18W fast charging via a #USBC port (only USB 2 speed though) and the ability to plug in a DualShock 4, DualSense or an @XBOX controller directly for the fastest, lowest-latency connection.
However, none of these tablets would even be in the game as an excellent Android gaming platform if it weren’t for the ‘Gaming Lens’ mount, which is very close to production, hopefully in time for the holiday season, with preordering available soon.
Oh, hang in there for the end of the video :) BOO!
Happy Halloween!

Interview with Dr. Thomas Appéré, Physics & Chemistry Agrégé
All Things 3D
2023-09-04 | In this interview I speak with Dr. Thomas Appéré about astrophysics and his passion for composing "artful" images of Mars, Europa and Titan from NASA and ESA data.
Links to Associates: 3D models from Gwenael Caravaca at LPGN - sketchfab.com/LPG-3D vr2planets.com

RTX4060 SOLO from ZOTAC easily runs the best looking games on the GAMING LENS
All Things 3D
2023-07-10 | My goal has always been to design or create low-cost “tiny” PCs, which sadly don’t go hand in hand. One of those goals is to create slim or very small PCs based on #miniITX , #picoITX and #NUC formats. The problem is getting a #dGPU small enough without breaking the bank. I have found the #Nvidia #GTX1650 , #RTXA2000 , as well as the #AMD #RX6400 & #Zen2 & #Zen3 #APU s to fit the budget, but they don’t quite have the oomph in performance that a larger GPU can bring to the table. If you add a few more cubic centimeters to your build space, you can use half-length, dual-slot cards like the AMD #RX6600 ( @Asrock ), Nvidia #GTX1060 , #RTX2060 , #RTX3060 or now the #RTX4060 ( #ZOTAC #SOLO ).
The RTX4060 has gotten a lot of negative press for its ho-hum specs. Which is true if you are comparing it to 500W behemoths like the #RTX4090 or even the #RTX4070. But compare it to the RTXA2000 (same price) or the RX6400 (half the price) and you get 2X or 3X the performance respectively, for only 25W-40W more power consumption. This has allowed me to build a fairly low-cost ($700) solution based on the @Intel 11th gen #i1140T and an @msiUSA #miniITX Pro board with 16GB 3200MHz RAM & a 1TB M.2 2280 Gen3 SSD that showed noticeable performance and visual gains on my 8.9” 2.5K (shown here at 1920x1200 90fps) #GamingLens .
Even though I was able to house all of this in a 3.8L case with a max power draw of 200W for the entire system, including the Gaming Lens, it still generated a great deal of heat. Using the case’s existing side panels, which have too little perforation, the GPU was over 85°C & the CPU 70°C+ in a @ULSolutions #3DMark stress test that did not pass. I constructed new side panels framing #PVC mesh to bring the GPU down to 75°C and the CPU down to 61°C, allowing it to pass at 98.7%. Even better, I went from 9380 points to 9600 points in #TimeSpy , which is 6000 more points than the exact same system with an RX6400. Spend a little more for a 13th gen ‘T’ processor with DDR5 and you will add about 500-1000 points to this number.
As for the 8GB #VRAM limitation: at 1200p, even on #TheLastofUs with #DLSS , I still had 800MB unallocated on ‘High’ settings. I ran #Cyberpunk2077 on Max settings with ‘Ray Tracing’ enabled, #APlagueTaleRequiem on ‘Max’ settings and #DeadspaceRemake on ‘Ultra’.
Not bad for an extra $150 over the most used GPU on #Steam - the GTX1650. Would a 12GB or 16GB version have been better? Probably, but for a smaller 1080p or 1440p screen, I feel I got my money's worth.

Interview with Vikas Reddy, Founder of LightTwist
All Things 3D
2023-01-29 | We're back with our first interview in two years, and the first for 2023. In this interview we had Vikas Reddy, founder of LightTwist, to talk about his new product: a cloud-based "virtual" multimedia studio, something like Zoom but WAAAAy more powerful due to its integration with Unreal Engine 5.1, allowing for a fully immersive "3D" digital studio (of your own creation) with multiple cameras, inputs and even interactive animated content.
If Vikas sounds familiar, he was the cofounder of Occipital with Jeff Powers, whom I have had on the show several times over the past ten years talking about the progress of one of the first portable 3D scanners, as well as one of the very first Mixed Reality headsets, 'Bridge', which used a technique called "unbounded tracking" that we now know as the "inside-out tracking" everyone is using today.
If you want to find out more about LightTwist head to lighttwist.com
Speaking of ten years, this is the tenth anniversary year of 'All Things 3D', and even though I (Mike) have been on a hiatus from the show, I am back with more news and more "3D Tech Closet" projects and tutorials, with the first '3D in Review' episode hitting YouTube later today (1/29/2023) and more to come every Friday, including an audio-only podcast you can find soon at https://allthings3D.net or wherever you listen to podcasts.
Glad to be back, and if you're not a subscriber, please subscribe. Sadly, I don't have much control over Google, but my content is always free, I request no advertisements, and anything I review or work on is based solely on my own knowledge, research and opinion, with no payment or free products from the companies I am talking about, unless specifically stated.
10:00 A look back at Occipital and how its experience shaped Vikas's future.
18:32 How about we jump into the demo of 'LightTwist'?
------------------------------------------------
Credits
"Mandalorian" is a Disney+ & Lucas Films Production
"Unreal Virtual Studio" is a product of Epic Games

Welcome Back After 1 Year Perseverance & Ingenuity Unite Back at Octavia Butler Landing Site VR180
All Things 3D
2022-03-01 | NOTE: This video uses a new technique to render stereo 180 from the Unreal Engine by rendering from a 2D capture probe instead of an omni (cube) capture probe. By pushing the FOV to 110-130 and then using Adobe After Effects to map it to a half sphere, you can effectively produce VR180 video that envelops the entire FOV of your VR headset. In this example I used an FOV of 110, which is a little more than the Meta Quest 2, providing an immersive "picture box" experience. I can see this technique being used in standard filming without too much trouble with 25mm prime lenses on two cameras, or even converting "3D"-formatted 21:9 ratio movies to fully immersive experiences.
To celebrate the one-year anniversary of Perseverance and Ingenuity coming back together today (not quite this close though), I created this little VR180 with AmbiX audio to showcase the latest features in Unreal 5.0, based on one of the more recent builds of 'UE5 Main'. It takes advantage of Lumen GI & reflectivity as well as some features from 4.26+ like Render Queue and 2D Capture. The latter uses a novel method of allowing over/under stereo capture by using a Cinema Camera instead of the older technique of using an orthographic camera, which not only allows for more control of the visual appearance, but also allows Render Queue to be used, providing faster rendering and, more importantly, the ability to improve anti-aliasing (at the expense of performance) for better image quality. What is important in this method is that I was able to use a standard 2D plane instead of a stereo omni capture to grab 110 degrees of 'panini'-correct FOV, later converted to equirectangular in Adobe After Effects. This is notable since one can basically use any camera to capture an FOV of 100-130, which is about the limit of most VR HMDs, to provide a very immersive experience without having to resort to slower, less effective methods of deriving perspective-correct stereo "3D" output. Now the goal is to provide stereo-paired output without having to resort to the 3D capture process, which sadly is not much better than cube capture, by capturing the output from the VR camera instead.
Sadly, the Motion Blur effect seems to suffer, as can be seen in the almost complete lack of blur in the props, yet at ground level the motion appears more realistic. Something I will have to spend more time with in the future.
The Mars environment is based on images from MRO HiRISE, Mastcam-Z & Ingenuity's navigation cameras, used to create an HDRI environment, two levels of terrain mesh, a landscape mesh & the actual landing-site terrain based on photogrammetry. This not only provides the most detail at inches from the surface, but also a smooth transition from the surface to 10 meters (and higher) for the first liftoff of Ingenuity, viewed from either the FPV eyes of Ingenuity's RTE camera or the Mastcam-Z stereo cameras, including a zoom feature. More to come in the next month at:
https://ingenuity-vr.space

VR180 8K Welcome Back After 1 Year Perseverance & Ingenuity Unite Back at the Landing Site
All Things 3D
2022-03-01 | NOTE: This was upscaled from 3K to 4K so that YouTube's VR180 processing allows for 4320s (8K) and 2160s (4K) modes, which provide a more accurate view of the original Apple ProRes 3072x6144-formatted video.
To celebrate the one-year anniversary of Perseverance and Ingenuity coming back together today (not quite this close though), I created this little VR180 with AmbiX audio to showcase the latest features in Unreal 5.0, based on one of the more recent builds of 'UE5 Main'. It takes advantage of Lumen GI & reflectivity as well as some features from 4.26+ like Render Queue and Stereo Omni Capture. The latter uses a novel method of allowing over/under stereo capture by using a Cinema Camera instead of the older technique of using an orthographic camera, which not only allows for more control of the visual appearance, but also allows Render Queue to be used, providing faster rendering and, more importantly, the ability to improve anti-aliasing (at the expense of performance) for better image quality. Sadly, the Motion Blur effect seems to suffer, as can be seen in the almost complete lack of blur in the props, yet at ground level the motion appears more realistic. Something I will have to spend more time with in the future.
The Mars environment is based on images from MRO HiRISE, Mastcam-Z & Ingenuity's navigation cameras, used to create an HDRI environment, two levels of terrain mesh, a landscape mesh & the actual landing-site terrain based on photogrammetry. This not only provides the most detail at inches from the surface, but also a smooth transition from the surface to 10 meters (and higher) for the first liftoff of Ingenuity, viewed from either the FPV eyes of Ingenuity's RTE camera or the Mastcam-Z stereo cameras, including a zoom feature. More to come in the next month at:
https://ingenuity-vr.space

Welcome Back After 1 Year Perseverance & Ingenuity Unite Back at Octavia Butler Landing Site VR180
All Things 3D
2022-02-27 | NOTE: Sadly, YouTube does not recognize Oculus/Meta-spec-based VR180 to allow for 3K viewing and limits it to 1920s (stereo). I am uploading a version upscaled from 3072 to 4096 to allow for 4KS and even 8KS. Look for the link here soon.
To celebrate the one-year anniversary of Perseverance and Ingenuity coming back together today (not quite this close though), I created this little VR180 with AmbiX audio to showcase the latest features in Unreal 5.0, based on one of the more recent builds of 'UE5 Main'. It takes advantage of Lumen GI & reflectivity as well as some features from 4.26+ like Render Queue and Stereo Omni Capture. The latter uses a novel method of allowing over/under stereo capture by using a Cinema Camera instead of the older technique of using an orthographic camera, which not only allows for more control of the visual appearance, but also allows Render Queue to be used, providing faster rendering and, more importantly, the ability to improve anti-aliasing (at the expense of performance) for better image quality. Sadly, the Motion Blur effect seems to suffer, as can be seen in the almost complete lack of blur in the props, yet at ground level the motion appears more realistic. Something I will have to spend more time with in the future.
The Mars environment is based on images from MRO HiRISE, Mastcam-Z & Ingenuity's navigation cameras, used to create an HDRI environment, two levels of terrain mesh, a landscape mesh & the actual landing-site terrain based on photogrammetry. This not only provides the most detail at inches from the surface, but also a smooth transition from the surface to 10 meters (and higher) for the first liftoff of Ingenuity, viewed from either the FPV eyes of Ingenuity's RTE camera or the Mastcam-Z stereo cameras, including a zoom feature. More to come in the next month at:
https://ingenuity-vr.space

Welcome Back! Perseverance & Ingenuity Unite Back at the Octavia Butler Landing Site After 1 Year
All Things 3D
2022-02-27 | To celebrate the one-year anniversary of Perseverance and Ingenuity coming back together today (not quite this close though), I created this little side-by-side video with stereo audio to showcase the latest features in Unreal 5.0, based on one of the more recent builds of 'UE5 Main'. It takes advantage of Lumen GI & reflectivity as well as some features from 4.26+ like Render Queue and 2D Capture. The latter uses a novel method of allowing over/under stereo capture by using a Cinema Camera instead of the older technique of using an orthographic camera, which not only allows for more control of the visual appearance, but also allows Render Queue to be used, providing faster rendering and, more importantly, the ability to improve anti-aliasing (at the expense of performance) for better image quality.
The Mars environment is based on images from MRO HiRISE, Mastcam-Z & Ingenuity's navigation cameras, used to create an HDRI environment, two levels of terrain mesh, a landscape mesh & the actual landing-site terrain based on photogrammetry. This not only provides the most detail at inches from the surface, but also a smooth transition from the surface to 10 meters (and higher) for the first liftoff of Ingenuity, viewed from either the FPV eyes of Ingenuity's RTE camera or the Mastcam-Z stereo cameras, including a zoom feature. More to come in the next month at:
https://ingenuity-vr.space

VR180 6K (6340 x 4574 4:3 O/U) using two Panasonic GX-85s w. Meike fisheye lenses
All Things 3D
2022-02-02 | Here are the test results of using two Panasonic GX-85s with Meike fisheye lenses to capture almost a full 180 (180 x 170) of my backyard. What makes this important is that I was able to shoot in 4K 4:3 image mode with the LUT hack to produce a much richer image than the standard 4K 3840x2160, along with more vertical FOV due to the 4:3 ratio. Plus, the original spherical resolution is 3328x2496 for 220 x 170 degrees of FOV, allowing you to create an M4/3 VR180 camera system for less than $1700, even less if you can pick up the GX-85s used like I did. The Meike fisheye lenses are $159.95 on Amazon at 3.5mm and are close to, but not quite, a full spherical 220-degree FOV horizontally and 170 vertically, which is not bad considering the only other M4/3 fisheye that can manage it is 3.25mm and costs $600, and even then I am not sure it provides complete encapsulation. Of course at this price the lens is completely manual, with focus and aperture rings running from 1.7m to infinity and f/2.4 to f/16 respectively. Sadly there is also extensive chromatic aberration at the outer edge, which you can see in the tree branches at the edge of the video, but I should be able to get rid of this with a mask-based color-defringing tool.
How did I do it? First, you can't just put the cameras side by side, even though they are pocket size, but you can invert one of the cameras so the shortest sides sit next to each other. That creates another challenge: how to keep them in place. I was able to do this with a 'Gearbox' universal camera cage and some extension tubes. From there I had to drill a few new holes so I could align and secure the inverted cameras through the tripod mounting holes. I also had to use rubber shims to bring the sensors parallel with each other, and double-sided tacky tape along the edges making contact to keep them in place. Sadly, even with the short sides against each other, I still had to contend with 85mm versus 65mm distance between the lens centers. This will cause the eyes to strain at objects close to the camera, but objects further from the camera will have a more effective 3D effect. I can manipulate this in software to bring them closer together, but that has other consequences, so I am left with 75mm apart. Knowing now that I am only 10 degrees off of a full 180 vertical, I can probably just butt their bottoms up against each other with a stabilizing bar between them and a custom threaded bolt, with enough thread to allow for a silicone washer/bushing, plus tacky tape along the entire rail to provide even more grip across the bottom of the plate. The plate will have to extend past both cameras to allow another T-plate to be attached for a tripod mount.
I also use dummy-battery systems to power each of the cameras with a large 5 VDC lithium-ion battery, and USB cables to export files, since once mounted it is a pain to get at the batteries. Also, thanks to Panasonic for creating custom memory settings and the ability to trigger the shutter from a phone or the touch screen. Not having these features would make adjusting the upside-down camera painful.
In any case, I have found viewing the footage before upload in my Quest 2 pleasant and well within its 17.2 pixels per degree.
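That pixels-per-degree figure can be roughly sanity-checked as per-eye horizontal pixels divided by horizontal FOV. This is a sketch: the Quest 2's 1832-pixel per-eye panel width is published, but its effective horizontal FOV varies by source and face fit, so the ~104-107 degree range used here is an assumption.

```python
# Rough pixels-per-degree (ppd) estimate for a VR headset:
# per-eye horizontal pixels divided by horizontal FOV in degrees.
def pixels_per_degree(h_pixels: int, h_fov_deg: float) -> float:
    return h_pixels / h_fov_deg

# Quest 2: 1832 px per eye; ~104-107 degrees horizontal FOV (assumed range).
low = pixels_per_degree(1832, 107)   # ~17.1 ppd
high = pixels_per_degree(1832, 104)  # ~17.6 ppd
print(f"{low:.1f} - {high:.1f} ppd")
```

The quoted 17.2 ppd falls inside that range, so the footage's angular resolution claim is plausible.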
How did I create the VR180?
First I took the two separate video files and used FFMPEG to convert from spherical/fisheye to equirectangular based on the FOVs mentioned above. I used HEVC NVENC to render the video, with 'roll:180' added for the left camera to invert it vertically to match the other camera.
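A sketch of how that step might be scripted: FFmpeg's real `v360` filter handles fisheye-to-equirectangular conversion with `ih_fov`/`iv_fov` input FOVs and a `roll` rotation, but the helper name, filenames and the exact option values below are illustrative assumptions, not the author's actual command.

```python
# Hypothetical helper assembling the FFmpeg v360 conversion described above:
# fisheye input (220x170-degree FOV from the text) to equirectangular output,
# with an optional 180-degree roll for the inverted left camera.
def fisheye_to_equirect_cmd(src, dst, ih_fov=220, iv_fov=170, roll=0):
    vf = (f"v360=input=fisheye:ih_fov={ih_fov}:iv_fov={iv_fov}"
          f":output=equirect:roll={roll}")
    return ["ffmpeg", "-i", src, "-vf", vf,
            "-c:v", "hevc_nvenc",  # HEVC NVENC hardware encoder
            dst]

cmd = fisheye_to_equirect_cmd("left.mp4", "left_eq.mp4", roll=180)
print(" ".join(cmd))
```

Run once per camera file; only the inverted camera needs `roll=180`.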
I then imported these two equirectangular videos into Adobe Premiere and did a tiny bit of grading to bring the white/black levels down/up slightly, though it really wasn't needed. Then I added the VR Projection filters (again, not sure I needed them since I adjusted the cameras to match via the horizontal/vertical adjust), scaled horizontally to clip the extra 220-to-180 portion of the video on both sides, and rendered it out again using HEVC NVENC, but at a slightly smaller scale. Since Premiere adds the proper meta tag for 180 O/U, I just tested it in my Quest 2 using the Skybox VR player before uploading it here. I will make a link for my next set of videos to download the versions I created before uploading, and upload them to my Oculus Creator page as well.
Final goal is to create a script for FFMPEG to do the full encoding, skipping Premiere entirely. I have started to do this for my Unreal VR360 & VR180 videos as well to improve workflow.

Meerkat Demo 4K in Unreal Engine Editor 5.1, Lumen GI & Reflection + 5.1 Audio
All Things 3D
2022-01-01 | This is an update to the "Meerkat Demo" project (created by Weta Digital), available from the Unreal Engine Marketplace, that provides an excellent example of 4.2x animation, hair and scene-capture GI, updated here to Unreal Editor 5.1 (GitHub 12/10 'UE5 Main' build) and Lumen.
In doing so, many of the materials had to be modified, and a number of lights were either turned off or attenuated to allow Lumen's excellent GI to do its magic. It is notable that I had to make a few tweaks to the eagle's rotation on some frames to prevent a light-saturation effect on the material; if you look closely as the eagle is coming out of the sun, there is a frame I left in to show what happens. This was done on a GitHub UE5 build from mid-December, so maybe this will be fine-tuned, but I think it has more to do with not tweaking every material. It should also be noted that, as with standard reflection capture spheres and scene-capture reflection modes, character meshes don't appear: if you look closely at the original project, in the eyes of the meerkat as the eagle is almost upon it, there is no reflection. Of course this would only be for a few frames, but nevertheless I decided to create a static mesh of the character mesh at that moment and add it to the sequence for those frames to capture the reflection and appropriate lighting. To add even more realism, I moved the static model towards the meerkat for even greater effect. Of course you will never notice unless you do a frame-by-frame analysis, but it was worth doing. I could have just enabled RTX reflections, which do allow character meshes and would have provided even better reflections, but this was a test with "software" Lumen only on a standard Intel i9 and 128 GB of memory.
I also took the time to add spatial audio by attaching the audio track to the meerkat and using a modified 'Sound Attenuation' to spread the 2-channel original track out to four channels of a 5.1 layout, adding more depth. It would have been more convincing if they had laid down each foley or creature audio track separately, but it appears the audio was mixed outside of the Unreal Engine. Sadly, the Unreal audio developers seem to think this is the best way to do it, spending more time turning Unreal Engine 5 into a synthesizer instead of working with the Sequencer developers to allow full mixdowns to a full spatial track series using the Microsoft Spatial Audio API. I have found I can do this, but only by creating 2 passes with the camera turned 90 degrees on its side to capture up/down channels and using Premiere to lay out a conversion to AmbiX. If you are looking for stereo, 5.1 or 7.1 audio, you can export it as a multichannel WAV file, but I found I needed to export it, import it into Audacity and export it again for it to be muxed properly with the ProRes LT video file to create a 6-channel AAC file in FFMPEG for upload to YouTube. The original MOV was 3.7 GB for less than two minutes of animation. Not the most efficient, but I have found it to provide the best transfer to YouTube. It should be noted that using "-c copy" is the best way to use FFMPEG, since it will do a direct transfer without transcoding the output. If you need to transcode, I highly suggest HEVC with NVIDIA GPU transcoding. Super fast, and the results are very good.
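The mux step described above might be assembled like this. A sketch only: the stream-copy (`-c:v copy`), stream-mapping (`-map`) and AAC options are standard FFmpeg flags, but the helper name and filenames are placeholders, not the author's actual script.

```python
# Hypothetical assembly of the FFmpeg mux described above: copy the ProRes
# video stream untouched (no transcode) while encoding the 6-channel WAV
# to AAC for YouTube.
def mux_cmd(video_mov, audio_wav, out_file):
    return ["ffmpeg", "-i", video_mov, "-i", audio_wav,
            "-map", "0:v", "-map", "1:a",  # video from input 0, audio from input 1
            "-c:v", "copy",                # direct stream copy, no re-encode
            "-c:a", "aac", "-ac", "6",     # 6-channel AAC audio
            out_file]

print(" ".join(mux_cmd("meerkat.mov", "mix_5_1.wav", "final.mov")))
```

Because the video stream is copied rather than re-encoded, this step is nearly instantaneous even on large ProRes files.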
One more note: even though this could have been rendered in real time on my RTX 3080 with Lumen at '4' quality, I needed to wait a few frames to allow Lumen to settle down before the next 'shot camera' was rendered. Even so, it only took 10 minutes to render both the 4K MOV & WAV tracks using the legacy 'Movie Scene Capture'. I tried Render Queue, but it needed a setup for each shot, which wouldn't have given me more quality unless I wanted to use a lot of frame samples for anti-aliasing, which would have taken much longer to render. I really hope they don't remove the old 'Movie Scene Capture', since 'Render Queue' doesn't do full-mixdown audio and cannot capture orthographic camera output.
Oh, and happy new year.

The Night Before Christmas with a special guest at the end.
All Things 3D
2021-12-25 | A surprise guest joins us in the cabin.

Kitty in Hot Cocoa Unreal 5.1 Lumen Render in 4K HDR (Rec. 2100 PQ)
All Things 3D
2021-12-22 | NOTE: This version was uploaded to YouTube using HEVC Rec. 2100 PQ (10 bits), 1000 nits maximum.
Since most of my latest Unreal Engine projects have already been converted over to 4.27.1 or 4.27.2, I was bummed to find out that, unsurprisingly, since UE5 Early Access was released in July 2021 with the last update coming in early September, levels created or modified in 4.27 would not show up. To alleviate that, I went out to Unreal's GitHub, downloaded the latest source build (12/7) and created a new UE5 that could open them. However, it also seems Lumen had some terrible glitches handling lights and particle occlusion that caused it to flash-leak out of the occluding box. So I went back and created another build based on a later date (12/16), and now I had a whole host of new problems, since emissive textures were 10-100 times brighter and the sequencer file kept freezing, requiring me to kill the process. Soooo, I just created a new sequence, which I wanted to do anyway for a new "Hot Cocoa in VR" trailer video based on some new "friends" to keep you company in the long winter months.
In working with Lumen, I find it very easy but frustrating, since even though everything is real-time, many of my materials had to be tweaked to behave. It should also be noted that I wanted to do this without any NVIDIA RTX features, so I remained on DX11: what you see is ALL Lumen and a few Nanite meshes.
To render this out, I used both the legacy 'Movie Scene Capture' and 'Render Queue'. The former was used ONLY for rendering audio since, sadly, only the legacy 'Movie Scene Capture' allows for submixes. The video was rendered as an EXR (linear, 16 bits per color) image sequence with 3:2 Temporal Super Resolution anti-aliasing. This image sequence was brought into Adobe Premiere using the EXR Pro plugin, with the project set up for (203 58% PQ), and I then adjusted the white and black levels to the HDR PQ level of 1000 nits. For export I chose HEVC at 40 Mbps, Rec. 2100 PQ.
Even though this was more of a study in using Lumen & HDR, most of it is useless in VR, since Lumen & Nanite do not work there. My next test is to see if the omni-capture tool can; that way I can capture 7K images using a specially designed capture cinema camera.
You can download the file used for upload to YouTube to view directly on your HDR monitor: https://1drv.ms/v/s!AjCOOLquXxqngvcdjzwXSieV2DeriA?e=oI6THn

Kitty in Hot Cocoa (or UE 5.1 Testing of Lumen at 4K)
All Things 3D
2021-12-20 | Since most of my latest Unreal Engine projects have already been converted over to 4.27.1 or 4.27.2, I was bummed to find out that, unsurprisingly, since UE5 Early Access was released in July 2021 with the last update coming in early September, levels created or modified in 4.27 would not show up. To alleviate that, I went out to Unreal's GitHub, downloaded the latest source build (12/7) and created a new UE5 that could open them. However, it also seems Lumen had some terrible glitches handling lights and particle occlusion that caused it to flash-leak out of the occluding box. So I went back and created another build based on a later date (12/16), and now I had a whole host of new problems, since emissive textures were 10-100 times brighter and the sequencer file kept freezing, requiring me to kill the process. Soooo, I just created a new sequence, which I wanted to do anyway for a new "Hot Cocoa in VR" trailer video based on some new "friends" to keep you company in the long winter months.
In working with Lumen, I find it very easy but frustrating, since even though everything is real-time, many of my materials had to be tweaked to behave. It should also be noted that I wanted to do this without any NVIDIA RTX features, so I remained on DX11: what you see is ALL Lumen and a few Nanite meshes.
To render this out, I used both the legacy 'Movie Scene Capture' and 'Render Queue'. The former was used ONLY for rendering audio since, sadly, only the legacy 'Movie Scene Capture' allows for submixes. The video was rendered using Apple ProRes LT, which is only a 4:2:2 encoder, but at 150Mbps, with 3:2 Temporal Super Resolution anti-aliasing, creating a 5 GB MOV file that rendered 2 minutes of footage in 45 minutes. I then used FFMPEG to mux the MOV with the stereo audio WAV to create the final MOV file that was used to create this 4K file. Sadly, Unreal has not upgraded Apple ProRes to HDR, so this is still Rec. 709. And since the goal was to do EVERYTHING in Unreal, exporting 3700 EXR 4K image files into Adobe Premiere would not allow me to make that claim (I guess I could have brought the image sequence into FFMPEG -- maybe next time), and I think it looks pretty good, since I had control over all the lighting to prevent blown-out areas thanks to Unreal's new set of monitoring tools.
So even though this was more of a study in using Lumen, most of it is useless in VR, since Lumen & Nanite do not work there. My next test is to see if the omni-capture tool can; that way I can capture 8K images using a specially designed capture cinema camera.

“Cozy Cabin” Hot Cocoa in VR, Holiday Version 2021-2022
All Things 3D
2021-12-06 | Promo video for the latest coming-soon version of “Hot Cocoa in VR” (2021-2022, UE 4.27), also a standalone “Guided Sensory Perception” video experience in which licensed psychotherapist Pamela S. Scott, LMFT, with over 30 years of experience, lends her soothing voice to help you escape to your "safe place" cabin to rest, relax and, most important, recharge your emotional battery to face another day.
This version adds a few “little” surprises to make you smile, maybe even laugh a little.
“It’s cold out there, come in where it’s warm & cozy”
The songs "A Lighthouse in Space" & "Floating" by @Kim Aspen, aka Jimmy Walsteen, were licensed from Adobe Content. Please check out the Jimmy Walsteen & Kim Aspen artist page here on YouTube:
youtube.com/channel/UCDXTvrDTilCO9O30SP1rllAMars Bringer of Wars or How tiny Ingenuity Really Is on the Planet MarsAll Things 3D2021-09-29 | I was working on another project* and needed a really great image of the entire Mars surface from space, and also realized how tiny the area NASA's Perseverance & Ingenuity are actually exploring is in the scheme of things. So I encircled the entire Jezero crater with a luminescent green toroid 45 km wide, or about the size of the Hawaiian island of Maui. As you can see, NASA should play nice with the God of War -- Mars.
Apple #USDZ & #gLTF #3D model - (Link to be added soon)
Image Credits NASA JPL James Hastings-Trew Owl Creek Technologies
Music Credits Mars Bringer of Wars Composer - Gustav Holst Performer - Gizz Van Buskirk
*I am developing an inexpensive LED side-lit picture frame for a series of 10" vellums to illuminate, as well as a 3" backlit nightlight and a 6" animated frame for the 2021 holidays. Interested? Reach out by emailing info@owlcreektechnologies.techIngenuity in VR Test Flight #2All Things 3D2021-08-19 | Latest test video of "Ingenuity in VR" using Unreal Engine 4.27's GPU-based lens distortion plugin to create a better depiction of what it will be like "to be" #ingenuity as you traverse in FPV mode across the rocks and sand dunes of Mars' Jezero Crater dry lake bed (even buzz Percy just for the hell of it), available sometime at the end of Q1 2022 on the 1st anniversary of Ingenuity's flight.
Goals of your mission:
- Observe from either NASA's Perseverance Mars Rover Mastcam-Z or as the #MarsHelicopter #RTE "eye", for each of Ingenuity's flights.
- Program your own flight pattern and observe from either Perseverance or Ingenuity.
- "Free Mode" allows you to fly Ingenuity like a drone with its computer assisted flight mode. Be careful and don't let your battery die. Even at 10m you won't stand a chance as you hit the Mars surface like a rock.
- Or just stroll along taking pictures & samples as the Perseverance Rover.
Actual images from Perseverance & Ingenuity are used to create photogrammetry-based 3D meshes of several terrain areas that will closely match those actually surveyed, as well as areas not yet explored, based on the orbiting #HiRISE camera system. Color grading is based on the color targets that NASA Jet Propulsion Laboratory uses to calibrate its own images taken by Perseverance, and audio is produced in a physically accurate model of the low-density atmosphere of Mars.
Find out more over the next few SOLs here at https://IngenuityinVR.spaceExcursion: 145 Minutes on the Moon - OpenXR TrailerAll Things 3D2021-08-15 | This December (2021) "Excursion: 145 Minutes on the Moon"* will finally be released for ALL desktop VR systems thanks to its OpenXR compatibility. We had to wait for Unreal Engine 4.27 to fix a bug in its use with SteamVR before we could release it, but that also allowed us to redo the light maps using GPU Lightmass rendering for cleaner surface light & shadows, and to utilize new VR rendering features as well. It will still be capable of running on a GTX 1070 at 90 fps, and at less than that with a GTX 1060 or even a GTX 1050ti in Windows Mixed Reality. The goal is to make it available exclusively in the new Windows 11 app store as a FREE download.
https://1smallstepfor.space
* Part of the "Apollo 11: 'One Small Step For...' VR Experiences" series that will also include "Ascent: Eagle Has Left the Moon" (Q1 2022) and "LUNAR WARS" parts 1 & 2 (Q2 2022, or when Unreal Engine 5.0 is released).Ingenuity in Flight SOL-133 (3D Anaglyph Glasses Viewable)All Things 3D2021-07-24 | This "3D" video is based on an actual 3D terrain created using a unique workflow that takes Ingenuity’s NAVCAM & RTE images to create a section of the 625m flight path.
The 3D terrain was created by taking the 100+ images shot by the NAVCAM camera at 640x480, scaling them to 1280x720, then running them through ‘Reality Capture’ & Apple’s ‘Object Capture’ photogrammetry tools. Sadly, NASA JPL has not offered actual lens data, so there was an accumulated curvature error that had to be corrected in Blender. I chose Apple’s Object Capture since it does a better job of creating a decent mesh size as well as 8K textures for diffuse, roughness, AO & normals. This was brought into Blender 3.0 Alpha with Cycles X to create the initial animations seen on Twitter (All Things 3D). I then used Adobe PS to create an orthographic texture from the 3rd 4K image taken by the RTE camera, which after cropping left me a decent 3K texture that I could project onto the terrain using the current low-res texture as a guide. Finally, it was sectioned, re-UVed & baked.
Ingenuity RTE & NAVCAM and Perseverance Mastcam-Z images for the 360 courtesy of NASA JPL. Ingenuity audio and Mars wind based on Perseverance recordings courtesy of NASA JPL.Triumphs and FailuresAll Things 3D2021-07-06 | I was hoping to have this done by yesterday but hit a snag using Unreal Engine's upcoming #ue5 & 'Movie Render Queue', with something called "widgets" showing up where they shouldn't. Figured it out and was able to render this out in Apple ProRes and add audio/music in Adobe Premiere.
"The Commander Thinks Aloud" is a song composed by The Long Winters and performed by eaneikiciv. I found it fitting not only in its title, but also in what the song is really about: one of the US's few space tragedies, Space Shuttle Columbia.
A piano solo version by John Roderick (of the Long Winters) can be heard here - youtu.be/grPBYQ_a6Cg?t=239
A new (free VR) version of "Excursion: 145 Minutes on the Moon" is coming out 6.2.2021 using UE 4.27. Find out more here: https://1smallstepfor.spaceUp, Up, Up Ingenuity (In 3D & 5.1 Audio)All Things 3D2021-05-15 | In this "Ingenuity in VR" teaser video we have decided to test the limits of Ingenuity's (in VR) thrust vector algorithm at 1 meter per second velocity by going straight up. There are no other external forces (wind) affecting it at this time, but even then the "synthetic" IMU would minimize any external forces to allow Ingenuity to fly smoothly. The key is to synthesize the feedback loop to work at 500 cycles per second like the "real" Ingenuity, which is not possible using the standard 'tick'-based timer system in Unreal, which at most can reach 90 frames per second, so we will have to do some predictive averaging to provide a similar experience. Or... just fake it and create a function table based on the data, which sadly has not been made available beyond a one-page article with some simple charts. I had hoped that through some requests to the Ingenuity team more information would become available. Sadly, I have received no response from the team's engineering leader Bob Balaram, or NASA's public relations representatives.
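A common way to approximate that 500 Hz loop inside a slower engine tick is a fixed-timestep accumulator that runs several controller substeps per frame. This is a sketch of the general technique, not the actual Unreal implementation:

```python
# Run a fixed 500 Hz control loop inside a variable-rate engine tick by
# accumulating frame time and stepping the controller at a fixed dt.
CONTROL_DT = 1.0 / 500.0   # 2 ms, matching Ingenuity's quoted loop rate

class ControlLoop:
    def __init__(self):
        self.accumulator = 0.0
        self.steps = 0     # total controller updates performed

    def control_step(self):
        # placeholder for one IMU-feedback / blade-pitch update
        self.steps += 1

    def tick(self, frame_dt):
        # called once per engine frame (e.g. ~1/90 s in VR)
        self.accumulator += frame_dt
        while self.accumulator >= CONTROL_DT:
            self.control_step()
            self.accumulator -= CONTROL_DT

loop = ControlLoop()
for _ in range(90):            # simulate one second of 90 fps frames
    loop.tick(1.0 / 90.0)
print(loop.steps)              # ~500 controller updates per simulated second
```

In engine terms this amounts to running substeps inside the per-frame tick rather than tying the controller rate to the frame rate.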
Mars Tech Updates: As mentioned above, the main goal is to emulate the flight characteristics of Ingenuity so that a VR user can fly it without having to be concerned about pitching or rolling the helicopter so much that flight becomes unstable, as well as to provide external forces so the craft doesn't feel like a "point and direct" game-style craft. As in "Ascent: Eagle Has Left the Moon," part of the "Apollo 11: 'One Small Step For...' VR Experiences" series, I created a complex flight model to emulate the forces at play in lifting off and returning to orbit around the moon. This allowed me to synthesize the vectored thrust necessary to recreate the actual flight path Eagle took, as described by Neil Armstrong and Buzz Aldrin, through orbital insertion and engine shutdown. Of course, over 50 years later there are volumes of data and books available describing the navigation computer algorithms and the exact thrust (in Newtons) and fuel amounts needed for Eagle's ascent module and RCS thrusters, enough to create a simplified model for Unreal Engine's Blueprint visual scripting system. Sadly, there has been only one paper published in 2018 by the Ingenuity team, and it is more of a summary and description of the engineering goals, with limited specifications; in trying to emulate Ingenuity's RTE camera, as seen in this video's perspective, I found those specifications to be wrong in that paper. The other article does provide some updates and a few illustrations, but the charts are not very detailed. Again, NASA, please release the data.
Unreal Engine Tech Updates: One of the problem areas in "Ascent: Eagle Has Left the Moon" was smoothly transitioning from detailed terrain to an actual DEM-based mesh and orthographic images at different levels of detail. The LROC camera system has provided detailed images of the entire moon at 25cm to 50m per pixel. Blending these together was a problem & frankly one of the reasons it has not been released yet. However, taking that development experience and working from scratch, I was able to come up with a much better system of blending UE terrain with HiRISE DEM & orthographic images so that they are seamless. (I offer anyone a free copy of "Ingenuity in VR" if they can find the seam, but it will already be free.) This method also allows me to add more detailed terrain dynamically as more information comes in from Perseverance's cameras. What is still needed before release is the recreation of a number of unique rock formations and more precise geo data. Eyeballing locations based upon geographic features is not really the most exact way to go about positioning the rover and helicopter or many of the rocks, especially if the images presented by NASA also change over time. Since "Ingenuity in VR" is supposed to be a drone simulator as well as a discovery tool, again I wish NASA would provide more data to the public.
YouTube Tech Update: NASA JPL just released their own "3D" video of Ingenuity's 3rd flight from the Mastcam-Z camera, but it should be noted it was only 480p and a red/cyan anaglyph. In creating this video, I took the time to figure out how to use YouTube's "3D" feature, which sadly has been downgraded over the past two years in favor of their 180 panoramic video method. This broke most of the 3D (stereo paired) videos on YouTube, and on top of that it is not even a "colored" anaglyph method. Even more frustrating is the way YouTube recognizes that your video is in 3D, which is to either encode using frame stacking (mp4) or a meta tag (mkv), since they did away with a simple toggle. Luckily this FFmpeg command line does work: "ffmpeg -i vidfilein.mp4 -vcodec libx264 -crf 18 -x264opts frame-packing=3 vidfileout.mp4" -- just replace vidfilein.mp4 with your file & vidfileout.mp4 with your output file name. The "-crf 18" (compression quality) is optional and a little better than the x264 default of 23.Peek A Boo... (4K Apple ProRes/5.1 Audio)All Things 3D2021-05-07 | In this little teaser for "Ingenuity in VR," we have added a new series of rocks and boulders with the unique characteristics of the blue/green oxidation seen so prominently in some 'earth'-color-balanced images. However, to keep Mars, well, looking like Mars, we only do a grey balance and adjust the single light source to 5500K, bringing out more of the natural color of the rocks and other surfaces that have been smoothed down by ancient water & wind currents. It also reflects that there must have been volcanic activity millions of years ago, since many of the rocks have pits, and that the rocks contain a lot of copper and iron, which accounts for the different colored oxides.
Ingenuity Technical Notes:
After talking to helicopter expert Wayne Johnson, one of the NASA consultants used in the design of the Ingenuity helicopter, I have learned that it has only three rotor speeds: off, idle, and full speed. Vertical thrust and vector positional changes are all done with coordinated & differential pitch changes to the propeller blades, about 400-500 times a second. I had already read that, but what I didn't know was that spinning on its Y axis (yaw rotation) is also done with differential pitch changes to the upper and lower props to create the necessary rotational torque to spin the body. By differentially changing the upper and lower props' pitch, you don't lose altitude and the motor RPM never changes. This reduces strain on the motors and, more importantly, creates a flat power curve, with only the initial velocity change from off to idle to full speed creating a surge. This goes a long way toward increasing flight time, and also reduces stress on the Li-Ion batteries.
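To illustrate the mechanism, here is a toy model of differential collective pitch; the thrust and drag constants are invented for the example and are not Ingenuity data:

```python
# Toy model of coaxial counter-rotating props: symmetric pitch changes
# alter thrust only, while differential pitch trades the opposing drag
# torques into a net yaw torque at constant total thrust.
K_THRUST = 10.0   # hypothetical newtons of thrust per degree of pitch
K_DRAG = 0.3      # hypothetical N*m of drag torque per degree of pitch

def rotor_forces(upper_pitch_deg, lower_pitch_deg):
    thrust = K_THRUST * (upper_pitch_deg + lower_pitch_deg)
    # the props spin in opposite directions, so their drag torques
    # cancel unless the pitches differ
    yaw_torque = K_DRAG * (upper_pitch_deg - lower_pitch_deg)
    return thrust, yaw_torque

hover = rotor_forces(8.0, 8.0)     # symmetric: lift only, zero yaw
yawing = rotor_forces(10.0, 6.0)   # +2/-2 differential: same lift, net yaw
print(hover, yawing)
```

The point of the +2/-2 split is visible in the numbers: total thrust is unchanged, so the craft spins without losing altitude or changing motor RPM.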
Next week, we will be ready to show off what a "360" panoramic would look like if Ingenuity were to take four or five overlapping images based on either 90 or 72 degree yaw rotational shifts, as well as the very first "real time" video capture of "Ingenuity in VR" from the perspective of the Mastcam-Z camera and the FPV view from Ingenuity's RTE camera. Both modes will be available when flying in the Jezero crater. We will also take video of our hands using the controllers to show how they affect flight in the real-time video. Things are getting pretty exciting on our end as we draw nearer to "Ingenuity in VR's" first VR flight.
https://IngenuityinVR.space
#ingenuity , #ingenuityhelicopter , #perseverance , #perseverancerover , #marshelicopter , #marsroverProposed Last Flight for Ingenuity (Apple ProRes UHD with 5.1 Audio)All Things 3D2021-04-26 | Sadly, this flight will never happen because it is deemed too risky to fly that close to Perseverance. However, that doesn't mean you can't fly any way you want when "Ingenuity in VR" is made available soon.
Technical Notes: Notice the inset video, which is from the perspective of Ingenuity's RTE color camera. What is interesting is that the paper written in 2018 by Bob Balaram, lead engineer for the Ingenuity project, states that this is a Sony IMX-214 with 4208 x 3120 effective pixels (confirmed in the images sent back from Ingenuity). The problem is that this is a 1/3.2" sensor (the same one used in the Apple iPhone 5), and there is no way the FOV is only 47 degrees; I have figured it to be about 93.92 degrees horizontally to allow the landing feet to show (which is also why you see some geometric distortion), and the 22 degree horizon offset cannot be true either, since it needs to be more like 42 degrees to not show the rover. It is a little odd that they didn't pitch up, or climb a couple more meters, to take a picture of the rover. Also, with a 93.92 degree FOV, you can capture an entire 360 with four 90-degree offset rotations and about 4 degrees of overlap; five 72-degree offsets would be better, giving you about 20 degrees of overlap but requiring five images to be processed and uploaded. I am surprised that they have not already captured these images, though I did have to pitch up to take these shots above the horizon. At 5.2 meters, this would be a very impressive 360 panoramic. Add a few more, pitched up and down, and you can create an equirectangular at 13 megapixels per image. Let's hope they do this before the end of the week. It also dawned on me that if Ingenuity is still active when Perseverance moves on, they should just program a flight path to keep it within radio distance. With the panoramic, its laser range finder and B/W lookdown camera, they should be able to stay clear of any surfaces that could cause it to tip. Even if it does land canted, it can always send back IMU data so they can program it to pitch the propellers for vertical flight.
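The panorama overlap figures above are simple to sanity-check:

```python
# Check the overlap arithmetic for a 360 panorama built from yaw steps,
# using the ~93.92 degree horizontal FOV estimated above.
FOV_DEG = 93.92

def per_seam_overlap(yaw_step_deg):
    # overlap between two adjacent shots
    return FOV_DEG - yaw_step_deg

def covers_full_circle(num_shots, yaw_step_deg):
    return num_shots * yaw_step_deg >= 360.0 and per_seam_overlap(yaw_step_deg) > 0

print(per_seam_overlap(90.0))   # four shots: ~3.9 degrees per seam
print(per_seam_overlap(72.0))   # five shots: ~21.9 degrees per seam
print(covers_full_circle(4, 90.0), covers_full_circle(5, 72.0))
```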
Technical Unreal Engine Notes: This teaser video shows off the latest updates to the Mars landscape, including some unique uses of Unreal Engine's terrain painting to actually paint orthographic textures created from photogrammetry 3D models built from NASA JPL Perseverance Mastcam-Z images from SOL-3, as well as hours of hand-planting to create a similar terrain. The rover tracks were done by creating decals and laying them down like railroad tracks. Next week I finish up the physics model and controls; stay tuned for updates as we get closer to release and, sadly, Ingenuity reaches its final week before Perseverance packs up and moves on, leaving Ingenuity stranded but still able to fly. My hope is that they start mapping flights that keep it close enough to Perseverance to continue relaying images back.
#ingenuityhelicopter #perseverancerover #VRMetaVincentAll Things 3D2021-04-21 | Thanks to MetaHuman Creator for allowing me to take part in their beta. Here is an example of what can be done in a few minutes using their simple but powerful sculpting tools. It is my hope to be able to work with the hair more (in this case, add a 'Widow's Peak' and thicken his beard). I am also hoping there will be mesh tools that allow vertex manipulation to fine-tune facial areas. Currently I am not sure how you import these into the Unreal Engine, but I have to assume there will be a plugin that allows import from the cloud-based MetaHuman Creator tool. From there I should be able to export textures and mesh files to manipulate further.
In any case, this plus all the other motion/facial capture tools available to the indie developer should finally give me the power to bring Vincent back to life, at least in "Chatting with Vincent," a VR experience where you sit down with Vincent and ask him questions through a simple voice input parser. The room has been recreated in the authentic colors of the actual bedroom Vincent stayed in within the "Yellow House" in Arles, France. Sadly, this is also the time that Vincent had one of his major psychotic episodes, in which he cut off his left ear. As you may have noticed, this 'MetaVincent' still has his ear; even though I could remove it later, I probably won't, to harken back to a time when he was more cogent. Plus, I think it would be unsettling in a pretty real VR setting.Fly Your WayAll Things 3D2021-04-20 | Teaser to show off the ways you can fly in #ingenuityinvr based on the actual camera parameters for NASA - National Aeronautics and Space Administration / NASA Jet Propulsion Laboratory (JPL) NASA's Perseverance Mars Rover & #ingenuityhelicopter.
Technical Notes: Both inset videos show what the VR view will look like based upon the actual camera specifications for #Ingenuity 's RTE horizon-facing color camera and #Perseverance 's Mastcam-Z camera, including a working zoom. Both cameras will, like their real-world counterparts, have limited motion, all dependent on how you manipulate the joysticks on your VR hand controllers. There will also be a "spectator mode" (main video) that will allow you free motion behind the rover with a limited "physical motion" roaming space to view the actual flights of Ingenuity, which has now completed its first historic launch, which I am happy to say I witnessed in real time very early Monday morning when the data started coming back.
For those of you who are anxious to get your hands on "Ingenuity in VR," we are sorry we missed our 4/15/2021 (SOL 65) date, but we have decided to take more time to meticulously paint and add rocks around the craft, as well as make mesh refinements to the surrounding crater ridge and delta cliffs. Thankfully, NASA Jet Propulsion Laboratory (JPL) and its team at Arizona State University under Jim Bell's leadership have done a magnificent job of providing detailed images from the Mastcam-Z and NavCam to make this task more accurate. We are also hoping to see how close we came to recreating the Ingenuity RTE camera view in these preliminary teaser videos, and will make any necessary adjustments so the user feels they are "one" with Ingenuity.
#marshelicopterIngenuity in VRs First Flight in FPV ModeAll Things 3D2021-04-19 | I guess we will find out in 1 hour (4/19/2021, 3:30 am PDT) what @NASA / @NASAJPL #ingenuityhelicopter RTE camera's first shot(s) will look like. Here is our first flight of #IngenuityinVR "FPV" mode just moments ago.
Yea, we are going to limit you to a 46 deg. FOV in your #VR HMD as well.
#marshelicopterIngenuitys First Flight? (4K ProRes, 5.1 Audio)All Things 3D2021-04-10 | Maybe not NASA's first flight of Ingenuity, which we all hope will be successful on Sunday, April 11, 2021 (SOL 61), but your first flight on Ingenuity, either in FPV "look down" mode or Perseverance "Mastcam-Z" mode. Either way, do it manually or assisted, but be a part of NASA from the comfort of your home or office. "Remote Access Control" has a new meaning on April 15th, 2021 (SOL 65).
https://IngenuityinVR.space
#marshelicopterBreaking News! NASA Detects Unusual Power Surge on Perseverance/Ingenuity (4K ProRes - 5.1 Audio)All Things 3D2021-04-07 | Not really, just another teaser trailer for the upcoming "Ingenuity in VR" experience to show off the completed 'control rig' for Perseverance's Drill/Watson arm and the Mastcam, as well as Perseverance's new paint job to better match the version on Mars, and some tweaks to the photogrammetry terrain based on NASA/JPL images captured on SOL 3. Here's hoping they do another batch at Ingenuity's launch site before we launch on Mars SOL 65 (April 15th).
As mentioned in the previous teaser, the Unreal Engine was used to create the Apple ProRes 4K video and 5.1 audio, which was only brought into Adobe Premiere to multiplex the audio for upload to YouTube. A 1080p stereo version was also created for Facebook, Twitter and LinkedIn.
https://IngenuityinVR.space
Music from the first MIB movie made almost 25 years ago by Will Smith - youtu.be/fiBLgEx6svA
#marshelicopterThrough the Eyes of Perseverance (ProRes-422 - 5.1 Audio)All Things 3D2021-03-29 | As we get closer to releasing "Ingenuity in VR," it is a race with NASA, JPL, and the Perseverance/Ingenuity team over who will have the first FPV experience. All kidding aside, most of the images, HDRIs, and meshes are from NASA, brought into Reality Capture, Blender, 3D Coat and finally the Unreal Engine to create the closest thing to actually being there in person.
This video was completely rendered in the Unreal Engine and only multiplexed in Adobe Premiere/Encoder. The 5.1 audio was exported directly from Unreal as well, with no extra audio layers or sweetening done in a DAW.
#marshelicopterThe Maiden Voyage The 3rd Teaser Trailer for Ingenuity in VR.All Things 3D2021-03-15 | In the third and final teaser, Ingenuity is shown in its maiden voyage from a 3rd-person perspective. You also get an idea of a rough draft of the detailed area around #Perseverance based on actual images taken from the rover during SOL 11-20. More work is to come in cleaning up and blending the super-detailed photogrammetry (3D-model-based overlaid images) area with the less detailed larger terrain, as well as adding more animated articulation points on the rover and "hand painted" touches to give it a more "weatherized" look. In any case it will be an "out of this world" experience for everyone to enjoy, young AND old. The goal when this is done is for the VR user to be able to stage and launch Ingenuity for either a programmed flight or two modes of user control: through the eyes of Perseverance's MastCAM-Z for standard "drone"-like flight, or "FPV" mode through the down-looking cameras, which is what Ingenuity will use to "map" its path. Currently it will be optimized for the desktop with a "lite" version for the Oculus Quest 2 in standalone mode. In creating the version of the craft for Unreal Engine, we broke apart the NASA - National Aeronautics and Space Administration #Perseverance & #Ingenuity models to provide some articulation in the camera and rotors. We also redid the models' UVs and textures into a single "super" PBR texture to improve VR performance. The goal is to provide a series of movable components to perform the tasks you program the rover to do to stage the initial launch of Ingenuity. (You will notice both the rotors and the MastCAM structure move in this video.)
The terrain is based off of NASA Jet Propulsion Laboratory (JPL) images downloaded earlier from Astropedia as a 25cm-per-pixel orthographic image and a 1m-per-pixel height map in their proprietary HiRISE DTM Mosaic, which we scaled up to match the 25cm orthographic image, creating a whopping 400GB image. We immediately cropped it down to a more manageable 64K x 64K area that will be brought into UE as 16K x 16K UDIM texture quadrants. For this demo video we only imported one 16K quadrant with a special terrain derived from a DEM in HiRISE TIFF, converted into a fairly high-polygon mesh to capture some of the nuances of the sand drifts. We also added a rough, highly detailed terrain around the rover, again based upon images captured by the rover on SOL 11-20, and did some basic color grading. In the future we will blend the two terrains together and, using height-determined blending techniques, give the #VR user an experience that is out of this world - literally. More teaser videos on YouTube: youtube.com/playlist?list=PLqnhoKrXoLPEhJca5y_H85xFGx7PCU3As
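The 64K-to-16K quadrant crop works out to a 4 x 4 grid of tiles; here is a sketch of the index math only (no image data involved, and not the actual pipeline code):

```python
# Split a 64K x 64K crop into 16K x 16K quadrant tiles and report each
# tile's pixel bounding box as (left, top, right, bottom).
CROP_PX = 65536    # 64K pixels per side
TILE_PX = 16384    # 16K pixels per side

def tile_bounds(crop=CROP_PX, tile=TILE_PX):
    return [
        (tx, ty, tx + tile, ty + tile)
        for ty in range(0, crop, tile)
        for tx in range(0, crop, tile)
    ]

tiles = tile_bounds()
print(len(tiles))    # 16 tiles cover the 64K crop in a 4 x 4 grid
```

These bounding boxes are what an image tool would use to crop each quadrant out of the large mosaic before import.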
More info about the VR app coming to https://ingenuityinvr.space soon
#marshelicopter2nd Teaser Trailer for Ingenuity in VR. Now in Stereo 3D 8K 360 with AmbiX audio.All Things 3D2021-03-15 | This trailer shows off a few features of the Unreal Engine using its stereo projection cube-map, along with a 7680 x 7680 over/under image camera to capture the terrain surrounding where Perseverance landed (in Jezero crater) in stereo "3D" 360, brought into Adobe Premiere along with two sets of six audio channels derived from a novel technique that captures 3D XYZ audio for conversion to AmbiX. Here this is dramatically shown in the illusion of the high-pitched whine of Ingenuity's two rotors (based on an actual recording during a test flight) moving from front/left to rear/left and then approaching again at the end of the video from below. The Doppler effect was manually created based upon the actual shift in frequency related to Mars' atmosphere and the velocity of the observer's camera as it strafes the rover and Ingenuity, since UE's built-in Doppler effect was too slow and derived from Earth's speed-of-sound constant instead of Mars' much lower speed of sound. In fact, we had to slow down the camera movement because its velocity was initially faster than Mars' speed of sound. Sadly, UE's motion blur did not capture very well in the 360, so a little directional blur was added in Premiere. It should also be noted that the blades spin at 2400rpm, or 40rps, which breaks down to 240 degrees per 1/60 of a second (about two-thirds of a revolution per frame), meaning the blades don't appear to turn much at this capture speed, especially when you have no actual residual imager lag. :(
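The manual Doppler correction can be sketched with the textbook moving-observer formula; the speeds of sound here are approximate figures, and the tone frequency and camera speed are made-up example values, not project constants:

```python
# Observed frequency for a listener moving toward a stationary source:
# f_observed = f_source * (c + v_observer) / c
SPEED_OF_SOUND_MARS = 240.0    # m/s at the surface, approximate
SPEED_OF_SOUND_EARTH = 343.0   # m/s at 20 C, approximate

def doppler_observed(freq_hz, observer_speed, c):
    return freq_hz * (c + observer_speed) / c

# Example: an 80 Hz blade-pass tone heard from a camera closing at 30 m/s.
on_mars = doppler_observed(80.0, 30.0, SPEED_OF_SOUND_MARS)
on_earth = doppler_observed(80.0, 30.0, SPEED_OF_SOUND_EARTH)
print(on_mars, on_earth)
```

Because Mars' speed of sound is much lower, the same camera velocity produces a noticeably larger pitch shift than an Earth-constant engine default would, which is the effect described above.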
The goal when this is done is for the VR user to be able to stage and launch Ingenuity for either a programmed flight or two modes of user control: through the eyes of Perseverance's MastCAM-Z for standard "drone"-like flight, or "FPV" mode through the down-looking cameras, which is what Ingenuity will use to "map" its path. Currently it will be optimized for the desktop with a "lite" version for the Oculus Quest 2 in standalone mode.
In creating the version of the craft for Unreal Engine, we broke apart the NASA - National Aeronautics and Space Administration #Perseverance & #Ingenuity models to provide some articulation in the camera and rotors. We also redid the models' UVs and textures into a single "super" PBR texture to improve VR performance. The goal is to provide a series of movable components to perform the tasks you program the rover to do to stage the initial launch of Ingenuity. (You will notice both the rotors and the MastCAM structure move in this video.)
The terrain is based off of NASA Jet Propulsion Laboratory (JPL) images downloaded earlier from Astropedia as a 25cm orthographic image and a 1m height map in their proprietary HiRISE DTM Mosaic, which we scaled up to match the 25cm orthographic image, creating a whopping 400GB image. We immediately cropped it down to a more manageable 64K x 64K area that will be brought into UE as 16K x 16K UDIM texture quadrants. For this demo video, we only imported one 16K quadrant with a special terrain derived from a DEM in HiRISE TIFF, converted into a fairly high-polygon mesh to capture some of the nuances of the sand drifts. The next step is to create a very high resolution terrain based on photogrammetry of the surface around the rover, along with particle generation for shifting sand.2nd Teaser Trailer for Ingenuity in VR. Now in Google 180 with AmbiX audio. (based on 8K 360)All Things 3D2021-03-11 | This trailer shows off a few features of the Unreal Engine using its stereo projection cube-map, along with a 7680 x 7680 over/under image camera to capture the terrain surrounding where Perseverance landed (in Jezero crater) in stereo "3D" 360, brought into Adobe Premiere along with two sets of six audio channels derived from a novel technique that captures 3D XYZ audio for conversion to AmbiX. Here this is dramatically shown in the illusion of the high-pitched whine of Ingenuity's two rotors (based on an actual recording during a test flight) moving from front/left to rear/left and then approaching again at the end of the video from below. The Doppler effect was manually created based upon the actual shift in frequency related to Mars' atmosphere and the velocity of the observer's camera as it strafes the rover and Ingenuity, since UE's built-in Doppler effect was too slow and derived from Earth's speed-of-sound constant instead of Mars' much lower speed of sound. In fact, we had to slow down the camera movement because its velocity was initially faster than Mars' speed of sound.
Sadly, UE's motion blur did not capture very well in the 360, so a little directional blur was added in Premiere. It should also be noted that the blades spin at 2400rpm, or 40rps, which breaks down to 240 degrees per 1/60 of a second (about two-thirds of a revolution per frame), meaning the blades don't appear to turn much at this capture speed, especially when you have no actual residual imager lag. :(
The goal when this is done is for the VR user to be able to stage and launch Ingenuity for either a programmed flight or two modes of user control: through the eyes of Perseverance's MastCAM-Z for standard "drone"-like flight, or "FPV" mode through the down-looking cameras, which is what Ingenuity will use to "map" its path. Currently it will be optimized for the desktop with a "lite" version for the Oculus Quest 2 in standalone mode.
In creating the version of the craft for Unreal Engine, we broke apart the NASA - National Aeronautics and Space Administration #Perseverance & #Ingenuity models to provide some articulation in the camera and rotors. We also redid the models' UVs and textures into a single "super" PBR texture to improve VR performance. The goal is to provide a series of movable components to perform the tasks you program the rover to do to stage the initial launch of Ingenuity. (You will notice both the rotors and the MastCAM structure move in this video.)
The terrain is based off of NASA Jet Propulsion Laboratory (JPL) images downloaded earlier from Astropedia as a 25cm orthographic image and a 1m height map in their proprietary HiRISE DTM Mosaic, which we scaled up to match the 25cm orthographic image, creating a whopping 400GB image. We immediately cropped it down to a more manageable 64K x 64K area that will be brought into UE as 16K x 16K UDIM texture quadrants. For this demo video, we only imported one 16K quadrant with a special terrain derived from a DEM in HiRISE TIFF, converted into a fairly high-polygon mesh to capture some of the nuances of the sand drifts. The next step is to create a very high resolution terrain based on photogrammetry of the surface around the rover, along with particle generation for shifting sand.
The lighting is based on a cubemap created from an earlier 8K 360 I modified from the actual 360 from JPL.
More to come...Teaser Trailer for Ingenuity in VR. Now in Google 180 with AmbiX audio.All Things 3D2021-03-10 | The goal, when this is done, is for the VR user to be able to stage and launch Ingenuity for either a programmed flight or two modes of user control: through the eyes of Perseverance's MastCAM-Z for standard "drone"-like flight, or "FPV" mode through the down-looking cameras, which is what Ingenuity will use to "map" its path. Currently it will be optimized for the desktop, with a "lite" version for the Oculus Quest 2 in standalone mode.
In creating the version of the craft for Unreal Engine, we broke apart the NASA - National Aeronautics and Space Administration #Perseverance & #Ingenuity models to provide some articulation in the camera and rotors. We also redid the models' UVs and textures into a single "super" texture for PBR to improve VR performance. The goal is to provide a series of movable components to perform the tasks you program the rover to do to stage the initial launch of Ingenuity. (You will notice both the rotors and the MastCAM structure move in this video.)
The terrain is based on earlier NASA Jet Propulsion Laboratory (JPL) imagery downloaded from Astropedia: a 25cm orthographic image and a 1m height map from their proprietary HiRISE DTM Mosaic, which we scaled up to match the 25cm orthographic image, creating a whopping 400GB image. We immediately cropped that down to a more manageable 64K x 64K area, to be brought into UE as 16K x 16K UDIM texture quadrants. For this demo video, we only imported one 16K quadrant, with a special terrain derived from the height map of the same 16K quadrant. The next step is to create a very high resolution terrain based on photogrammetry of the surface around the rover, along with particle generation for shifting sand.
The lighting is based on a cubemap created from an earlier 8K 360 I modified from the actual 360 from JPL.
More to come...Teaser Trailer for Ingenuity in VR Teaser Now in 4K with 5.1 AudioAll Things 3D2021-03-09 | The goal, when this is done, is for the VR user to be able to stage and launch Ingenuity for either a programmed flight or two modes of user control: through the eyes of Perseverance's MastCAM-Z for standard "drone"-like flight, or "FPV" mode through the down-looking cameras, which is what Ingenuity will use to "map" its path. Currently it will be optimized for the desktop, with a "lite" version for the Oculus Quest 2 in standalone mode.
In creating the version of the craft for Unreal Engine, we broke apart the NASA - National Aeronautics and Space Administration #Perseverance & #Ingenuity models to provide some articulation in the camera and rotors. We also redid the models' UVs and textures into a single "super" texture for PBR to improve VR performance. The goal is to provide a series of movable components to perform the tasks you program the rover to do to stage the initial launch of Ingenuity. (You will notice both the rotors and the MastCAM structure move in this video.)
The terrain is based on earlier NASA Jet Propulsion Laboratory (JPL) imagery downloaded from Astropedia: a 25cm orthographic image and a 1m height map from their proprietary HiRISE DTM Mosaic, which we scaled up to match the 25cm orthographic image, creating a whopping 400GB image. We immediately cropped that down to a more manageable 64K x 64K area, to be brought into UE as 16K x 16K UDIM texture quadrants. For this demo video, we only imported one 16K quadrant, with a special terrain derived from the height map of the same 16K quadrant. The next step is to create a very high resolution terrain based on photogrammetry of the surface around the rover, along with particle generation for shifting sand.
The lighting is based on a cubemap created from an earlier 8K 360 I modified from the actual 360 from JPL.
More to come...Perseverance in 8K 360 with AmbiX (Created from the JPL panoramic TIF file)All Things 3D2021-03-02 | This 8192 x 4096 360 video was created from the very high resolution TIFF (32K pixels) 360 panoramic created by NASA Jet Propulsion Laboratory (JPL). However, the panoramic made available is not consistently color graded across the entire image, and the sky and lower intersecting point had not been properly covered, until now.
Owl Creek Technologies regraded the sky and surface and ensured the colors remained as vibrant as those in the TIF file. We also added a little fun "cover" over the lower intersecting point.
We also added the audio files recorded by #Perseverance, but Ambisonicized them to give them a "3D" positional quality and bring more life to the barren area surrounding the rover, known as #JezeroCrater.
Looking forward to when all the image files will be available from the right camera of the stereo-paired #MastCamZ camera that took this, as well as many of the other gorgeous, extremely detailed images being made available here: mars.nasa.gov/mars2020/multimedia/raw-images so that a "3D" stereo 360 can be created. Until then, just pretend you have one eye closed. :)
Enjoy, and look forward to our Unreal Engine based Ingenuity FPV flight sim coming out in a few weeks.Hot Cocoa in VR Winter Edition in 5K Stereo 360 - AmbiX (Revised audio mix & 8 bit color conv.)All Things 3D2021-02-06 | The holidays are over, but for many, winter is not. With that in mind, I have gone back and removed all of the holiday stuff, changed out the music, and Pamela S. Scott, LMFT has volunteered to provide another soothing "Guided Sensory Perception" voice-over to continue to help you escape to our "safe place" cabin to rest, relax and, most important, recharge your emotional batteries to face another day. Oh yeah, and we added a little "friend" to help you feel a little less lonely.
Now for the technical stuff:
This version of Hot Cocoa in VR is based on a novel technique to capture full 16-bit HDR stereo projection cube maps, color graded down to 12-bit Apple ProRes in Rec.709 color space, then optimized in Adobe Premiere to push up the mid levels to make it easier on the eyes in your #VR headset. This version also includes a novel new way to capture #Ambisonic audio from a 5.1-channel sequence for left/right, front/back and "omni", plus a second 5.1-channel sequence to capture up/down. Using this technique and the Ambisonic VST library from Matthias Kronlachner allowed these nine channels to create the full spatial audio experience within the Unreal Engine without having to add additional audio tracks or positioning outside of the fixed positioning required to map the eight channels plus one omni or center channel. This technique allows the full spatial experience to be heard as though you were listening to it in the actual #VirtualReality app, the "Winter Edition" of which will be available soon.
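For those curious what mapping fixed channel positions into an Ambisonic soundfield involves, a first-order AmbiX encode of a fixed source direction is just four gains per channel. A minimal sketch of the general technique (my own illustration; this is not the internals of the Kronlachner plug-ins):

```python
import math

# First-order AmbiX gains (ACN channel order W, Y, Z, X; SN3D normalization)
# for a source at a fixed azimuth/elevation, e.g. one mapped speaker channel.
def ambix_gains(azimuth_deg, elevation_deg):
    az, el = math.radians(azimuth_deg), math.radians(elevation_deg)
    w = 1.0                                # omnidirectional component
    y = math.sin(az) * math.cos(el)        # left/right
    z = math.sin(el)                       # up/down
    x = math.cos(az) * math.cos(el)        # front/back
    return (w, y, z, x)

print(ambix_gains(0, 0))    # straight ahead: W and X only
print(ambix_gains(90, 0))   # hard left: W and Y only (roughly)
```

Summing each channel's signal weighted by its four gains yields the W/Y/Z/X tracks a player like YouTube expects for AmbiX audio.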
You can find the same 360 video on Facebook here:
https://fb.watch/3tjVV9AXP-/
The songs "A Lighthouse in Space" & "Floating" by @Kim Aspen, aka Jimmy Walsteen, were licensed from Adobe Content. Please check out the Jimmy Walsteen & Kim Aspen artist page here on YouTube:
youtube.com/channel/UCDXTvrDTilCO9O30SP1rllA4K, HDR-10, 5.1 Surround Sound Hot Cocoa in VR Winter EditionAll Things 3D2021-02-04 | The holidays are over, but for many, winter is not. With that in mind, I have gone back and removed all of the holiday stuff, changed out the music, and Pamela S. Scott, LMFT has volunteered to provide another soothing guided imagery voice-over to continue to help you escape to our "safe place" cabin to rest, relax and, most important, recharge your emotional batteries to face another day. Oh yeah, and we added a little "friend" to help you feel a little less lonely.
Now for the technical stuff:
Further tests and research into the best way to push HDR-compatible content from the Unreal Engine to Adobe Premiere or another NLE, as well as further tests and research into pushing multi-track spatial audio from Unreal to Premiere. This version is best enjoyed in a quiet setting with the lights dimmed, on a large HDR-compatible monitor or screen with a 5.1 surround sound system. The next version will be the 6K stereo 360 using my novel 360 capture system and a new technique that needs only two audio export passes to derive either 1st-order or 2nd-order Ambisonic audio without having to recreate new audio channels in your DAW or NLE. Look for the Apple ProRes Rec.709 10-bit stereo 360 and 180 videos with 1st-order and 2nd-order audio channels, then finally an upgrade to the VR experience. All this to be followed up with a tutorial on how to do all this yourself.
The songs "A Lighthouse in Space" & "Floating" by Kim Aspen, aka Jimmy Walsteen, were licensed from Adobe Content. Please check out Jimmy Walsteen & Kim Aspen here on YouTube:
youtube.com/channel/UCDXTvrDTilCO9O30SP1rllAHot Cocoa in VR Winter Edition in 6K Stereo 360 Video - AmbiXAll Things 3D2021-02-04 | The holidays are over, but for many, winter is not. With that in mind, I have gone back and removed all of the holiday stuff, changed out the music, and Pamela S. Scott, LMFT has volunteered to provide another soothing "Guided Sensory Perception" (GSP) voice-over to continue to help you escape to our "safe place" cabin to rest, relax and, most important, recharge your emotional batteries to face another day. Oh yeah, and we added a little "friend" to help you feel a little less lonely.
Now for the technical stuff:
This version of Hot Cocoa in VR is based on a novel technique to capture full 16-bit HDR stereo projection cube maps, graded down to 12-bit Apple ProRes in Rec.709 color space, then optimized in Adobe Premiere to push up the mid levels to make it easier on the eyes in your VR headset. (Note this version is based on a 10-bit Rec.709, Profile High10, Level 6.2, 80 Mbps H.264 / 620 Kbps AAC encode for compatibility with Facebook.)
This version also includes a novel new way to capture Ambisonic audio from a quad/4-channel sequence for left/right and front/back, and another sequence to capture up/down. Using this technique and the Ambisonic VST library from Matthias Kronlachner allowed these six channels to create the full spatial audio experience within the Unreal Engine without having to add additional audio tracks or positioning outside of the fixed positioning required to map the six channels. This technique allows the full spatial experience to be heard as though you were listening to it in the actual VR app, the "Winter Edition" of which will be available soon.Stereo 3D 360 Hot Cocoa in VR revised for 6K, HDR and AmbiXAll Things 3D2021-01-11 | This stereo 360 version of the VR experience "Hot Cocoa in VR", created for the holidays, has been rendered in 6K from a modified and recompiled Unreal Engine 4.26 that allows up to 16K x 8K cube map capture in 10-bit Apple ProRes LT, with the soundfield recaptured using a never-before-published method within Adobe Premiere using the Matthias Kronlachner Ambisonic VST libraries ( http://www.matthiaskronlachner.com ) and the new VR visualization & auralization tools in Premiere.
Even though Premiere can only work with 1st-order Ambisonic audio (AmbiX), this method is more than adequate for assembling, editing and positioning audio in the soundfield without requiring another spatial audio application like FB Spatial Workstation, Reaper or Pro Tools to produce a Google YouTube 360 video with AmbiX immersive audio. Keep in mind you can still bring your rendered audio/video file into the Spatial Workstation encoder for Facebook re-encoding, as done on my 'Owl Creek Technologies' page: https://fb.watch/2Yy0G7c5yY/ It should be noted that Facebook does not allow HDR and limits resolution to 5120 x 5120 over/under stereo 360. In trying to upload this 6K HDR video with this AmbiX audio track converted for Facebook, it became an unwatchable mess. Converting it to 5K was not enough, and I finally had to remove the HDR encoding. Sadly, I did not do a proper conversion to SDR, and the lower end is now crushed. Here on YouTube they do a pretty good job of converting back to SDR, so you don't lose as much detail in the shadows.How-To on creating stereo 3D 360s and 180s in a number of tools made for the Unreal EngineAll Things 3D2020-12-24 | Why capturing standard video & images isn't enough when it comes to VR promotional material.
0:00 Opening
Examples of 360s done to promote gaming or movie experiences
Actual "Hot Cocoa in VR" VR app for Steam VR & Oculus (desktop) - owlcreektech.itch.io/hot-cocoa-in-vr (OpenXR coming soon)Hot Cocoa in VR now in Google 180 for YouTubeAll Things 3D2020-12-24 | Peace is just a breath away. Calm your mind and heart with guided imagery in our virtual cozy Christmas cabin. "Hot Cocoa in VR" is here in Google 180, created specially for you.
Join us here in Google 180 for hot cocoa and stay awhile for the reading of "It Was The Night Before Christmas"
This is the first of a new series of VR experiences that calm your mind and heart by immersing you in tranquil experiences away from modern life with all of its anxiety-provoking issues.
Merry Christmas and a Happy New Year
Mike & Shavaun
You can find the full immersive VR version for SteamVR and Oculus here: owlcreektech.itch.io/hot-cocoa-in-vr with OpenXR coming soon.Hot Cocoa in VR now in stereo 3D 360 for YouTubeAll Things 3D2020-12-24 | Peace is just a breath away. Calm your mind and heart with guided imagery in our virtual cozy Christmas cabin. "Hot Cocoa in VR" is here in stereo "3D" 360, created specially for you.
Join us here in stereo 360 for hot cocoa and stay awhile for the reading of "It Was The Night Before Christmas"
This is the first of a new series of VR experiences that calm your mind and heart by immersing you in tranquil experiences away from modern life with all of its anxiety-provoking issues.
Merry Christmas and a Happy New Year
Mike & Shavaun
You can find the full immersive VR version for SteamVR and Oculus here: owlcreektech.itch.io/hot-cocoa-in-vrHot Cocoa in VR full demo video in Unreal Engine's ProRes 10-bit HDR, 1080P*All Things 3D2020-12-19 | This is the latest version 1.1 that can be downloaded from itch.io, which fixes a few things and adds some new features to enhance the experience that psychotherapist Pamela Shavaun Scott (voiceover) describes below:
"Peace is just a breath away. Calm your mind and heart with guided imagery in our virtual cozy Christmas cabin. "Hot Cocoa in VR" is here, created specially for you."
This is the first of a new series of VR experiences that calm your mind and heart by immersing you in tranquil experiences away from modern life with all of its anxiety-provoking issues.
Today, however, you should join us for hot cocoa and stay awhile for the reading of "It Was The Night Before Christmas"
*This video was completely rendered in Unreal's Sequencer using ProRes 422 LT, converted using 'Shutter Encoder' to H.264 2020 PQ 10-bit, and muxed with a rendered 16-bit 2-channel WAV file to capture a reasonable facsimile of sound location without having to resort to Adobe Premiere or Blackmagic DaVinci Resolve. In fact, even the titling at the end was done using the 'font renderer', with a few tweaks to a material to give it an ethereal translucent glow, which can be called in the Sequencer, or for that matter in a Blueprint.
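As an aside on that 2020 PQ encode: PQ (SMPTE ST 2084) is a perceptual curve, not a gamma curve, which is why converting such footage to SDR needs real tone mapping. A small sketch of the PQ EOTF (the constants are from the standard; the surrounding workflow above is untouched):

```python
# PQ (SMPTE ST 2084) EOTF: decode a normalized PQ code value to nits.
# Constants are the standard's m1, m2, c1, c2, c3.
def pq_to_nits(e):                      # e in [0.0, 1.0]
    m1, m2 = 2610 / 16384, 2523 / 32
    c1, c2, c3 = 3424 / 4096, 2413 / 128, 2392 / 128
    p = e ** (1 / m2)
    return 10000 * (max(p - c1, 0.0) / (c2 - c3 * p)) ** (1 / m1)

# Half the code range is nowhere near half brightness (~92 nits vs a
# 10000-nit peak), so reinterpreting PQ values as SDR gamma crushes shadows.
print(pq_to_nits(0.5), pq_to_nits(1.0))
```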
Look for a Google 180 of this sequence this weekend, made using a fairly new 360 tool called "Surreal Capture" that works in any Unreal-based executable or even the editor's render window. A full tutorial will also be presented to the local Portland Unreal Developer Meetup group and uploaded to YouTube afterwards, discussing the pros and cons of several 360 tools one can use with the Unreal Engine.Stay awhile, the hot cocoa is on us.All Things 3D2020-12-12 | Here is a standard "2K" promo video of the cabin environment of "Hot Cocoa in VR"
Read more about it and download the initial version for the Oculus and Steam VR for free on itch.io
Next week, a version in the new #VR #OpenXR standard will be released for Microsoft Windows and Windows Mixed Reality, as well as stereo "3D" 360 and 180 videos for Facebook and YouTube.
Until then, sit back and relax while Pamela Shavaun Scott, LMFT uses her special power of guided imagery to quiet your thoughts and help you relax in a warm, inviting cabin.
The hot cocoa is on us.
owlcreektech.itch.io/hot-cocoa-in-vrHot Cocoa & O Tannenbaum in 4KAll Things 3D2020-12-09 | Here is a sneak peek at Owl Creek Technologies' (my tech company) entry into the Unreal MegaJam, to be completed Friday by 11:00 am PDT.
"Hot Cocoa" is a VR experience using safe, calming environments that stimulate your senses. For the senses your VR system cannot interact with, your host -- psychotherapist Pamela S. Scott, with an over-38-year career helping 1000s of clients overcome depression, anxiety, and family and relationship problems -- will use a powerful tool called "Guided Imagery" to help you immerse yourself fully in the experience.
It will not only be available as a VR app for the desktop, but also as stereo 360, 180, and standard 16:9 videos created for those who are using their phone or a mobile platform like the Oculus Quest (2).
A special thanks to Unreal for releasing so much free content, like the main component of this experience, "Log Cabin" by Gabro Media ( facebook.com/gabromedia ), whose main map was almost perfect, needing only a few alterations and add-ons to make it a little more cozy and, well, a perfect place to spend the holidays. I also give a shout-out to all the other 3D artists who indirectly helped make it possible for myself and the many small indie teams to populate their experiences with creative, beautiful and original pieces of 3D art; they will be listed in the app notes when released.
I would also like to bring attention to the talented guitarist Kim Aspen ( open.spotify.com/artist/0TWDlZlnx5EiVZ1Ik1l2Al ), who brings life to her rendition of "O Tannenbaum", which fits perfectly in this slow dolly shot through the cabin.
Look closely, and you may even see the "Mikey Elf," a combination of a very cute elf model I found and remodeled to fit my face, which was 3D scanned with my 4eyes lens system & the Occipital Structure Sensor ( structure.io ), then hand painted using 3D Coat ( 3dcoat.com ).
Enjoy the sneak holiday treat and look for the 360 and 180 videos and the VR app soon.8K Google 180 sample of Buzz Aldrin descending the ladder in Excursion: 137 Minutes on the MoonAll Things 3D2020-12-03 | Here is a stereo (3D) 360 demo video captured from one of the six brief sequences during Neil Armstrong & Buzz Aldrin's lunar excursion. Here we see Buzz Aldrin descending the ladder as Neil Armstrong looks on.
This is part of the "Apollo 11: 'One Small Step For...' VR Experiences"* series, titled "Excursion: 137 Minutes on the Moon", to be out this holiday on Steam VR, Oculus, and Vive Port. In the actual VR experience you can move around, pick up rocks, work with the tools & test equipment, and even take pictures using your chest-mounted Hasselblad EL500 camera, which you can make appear/disappear with the press of a button. Images are stored on your hard drive so you can send them to your friends to, you know, prove that you were on the moon.
This 360 capture was done with 'Surreal Capture', a tool that allows you to capture 360 or stereo SBS video directly from any Unreal Engine game or VR experience in real time at 7680 x 3840 with varying sample rates. It also works great during development, even while in the editor. Sadly, stereo 360 capture is not implemented yet (coming soon), but it is easy to get true stereo for at least 180 degrees by using the horizontal offset variable and inserting 6. It captures most post-processing effects except Exposure/Eye Adaptation; I am working with the developer to correct this if possible. surrealcapture.com
You can find out more about the "Apollo 11: 'One Small Step For...' VR Experiences" series at https://1smallstepfor.space
* An EPIC Mega Grant winner.Stereo 360 sample of the Buzz Aldrin seismometer alignment from Excursion: 137 Minutes on the MoonAll Things 3D2020-12-03 | Here is a stereo (3D) 360 demo video captured from one of the six brief sequences during Neil Armstrong & Buzz Aldrin's lunar excursion. Here we see Neil Armstrong using the ALSCC (Apollo Lunar Surface Closeup Camera) and Buzz Aldrin aligning the seismometer.
This is part of the "Apollo 11: 'One Small Step For...' VR Experiences" series, titled "Excursion: 137 Minutes on the Moon", to be out this holiday on Steam VR, Oculus, and Vive Port. In the actual VR experience you can move around, pick up rocks, work with the tools & test equipment, and even take pictures using your chest-mounted Hasselblad EL500 camera, which you can make appear/disappear with the press of a button. Images are stored on your hard drive so you can send them to your friends to, you know, prove that you were on the moon.
This 360 capture was done with 'Surreal Capture', a tool that allows you to capture 360 or stereo SBS video directly from any Unreal Engine game or VR experience in real time at 7680 x 3840 with varying sample rates. It also works great during development, even while in the editor. Sadly, stereo 360 capture is not implemented yet (coming soon), but it is easy to get true stereo for at least 180 degrees by using the horizontal offset variable and inserting 6. It captures most post-processing effects except Exposure/Eye Adaptation; I am working with the developer to correct this if possible. surrealcapture.com
You can find out more about the "Apollo 11: 'One Small Step For...' VR Experiences" series at https://1smallstepfor.space
* An EPIC Mega Grant winner.Stereo 360 sample of Buzz Aldrin descending the ladder in Excursion: 137 Minutes on the MoonAll Things 3D2020-12-03 | Here is a stereo (3D) 360 demo video captured from one of the six brief sequences during Neil Armstrong & Buzz Aldrin's lunar excursion. Here we see Buzz Aldrin descending the ladder as Neil Armstrong looks on.
This is part of the "Apollo 11: 'One Small Step For...' VR Experiences"* series, titled "Excursion: 137 Minutes on the Moon", to be out this holiday on Steam VR, Oculus, and Vive Port. In the actual VR experience you can move around, pick up rocks, work with the tools & test equipment, and even take pictures using your chest-mounted Hasselblad EL500 camera, which you can make appear/disappear with the press of a button. Images are stored on your hard drive so you can send them to your friends to, you know, prove that you were on the moon.
This 360 capture was done with 'Surreal Capture', a tool that allows you to capture 360 or stereo SBS video directly from any Unreal Engine game or VR experience in real time at 7680 x 3840 with varying sample rates. It also works great during development, even while in the editor. Sadly, stereo 360 capture is not implemented yet (coming soon), but it is easy to get true stereo for at least 180 degrees by using the horizontal offset variable and inserting 6. It captures most post-processing effects except Exposure/Eye Adaptation; I am working with the developer to correct this if possible. surrealcapture.com
You can find out more about the "Apollo 11: 'One Small Step For...' VR Experiences" series at https://1smallstepfor.space
* An EPIC Mega Grant winner.A SBS 3D 8K stereo video showing off Excursion: 137 Minutes on the MoonAll Things 3D2020-12-03 | This video was captured as an SBS (Side by Side) '3D' stereo image at 8K x 2048, upscaled here to 4320p (8K), to showcase one of the six cinematic transitions at the edge of the Little West Crater. The NASA voice chatter corresponds to the time when Buzz was heading back to Eagle and Neil Armstrong was finishing up taking images. The vacuum-cleaner-like device to his right is the ALSCC (Apollo Lunar Surface Closeup Camera), used to take stereo images of the surface. You will learn more about that in this part of the "Apollo 11: 'One Small Step For...' VR Experiences"* series, titled "Excursion: 137 Minutes on the Moon", to be out this holiday on Steam VR, Oculus, and Vive Port. In the actual VR experience you can move around, pick up rocks, work with the tools & test equipment, and even take pictures using your chest-mounted Hasselblad EL500 camera, which you can make appear/disappear with the press of a button. Images are stored on your hard drive so you can send them to your friends to, you know, prove that you were on the moon.
This stereo SBS was captured with 'Surreal Capture', a tool that allows you to capture 360 or stereo SBS video directly from any Unreal Engine game or VR experience in real time at 7680 x 3840 with varying sample rates. It also works great during development, even while in the editor. Sadly, stereo 360 capture is not implemented yet (coming soon), but it is easy to get true stereo for at least 180 degrees by using the horizontal offset variable and inserting 6. It captures most post-processing effects except Exposure/Eye Adaptation; I am working with the developer to correct this if possible. surrealcapture.com
You can find out more about the "Apollo 11: 'One Small Step For...' VR Experiences" series at https://1smallstepfor.space
* An EPIC Mega Grant winner.8K Stereo 360 of Neil Armstrong at the Little West Crater in Excursion: 137 Minutes on the MoonAll Things 3D2020-12-03 | Here is an 8K stereo (3D) 360 demo video captured from one of the six brief cinematic sequences during Neil Armstrong & Buzz Aldrin's lunar excursion. Here we see Neil Armstrong on the ridge of the "Little West Crater" as he takes a few panoramic shots while Buzz Aldrin heads back to Eagle, finishing up his excursion.
This is part of the "Apollo 11: 'One Small Step For...' VR Experiences" series, titled "Excursion: 137 Minutes on the Moon", to be out this holiday on Steam VR, Oculus, and Vive Port. In the actual VR experience you can move around, pick up rocks, work with the tools & test equipment, and even take pictures using your chest-mounted Hasselblad EL500 camera, which you can make appear/disappear with the press of a button. Images are stored on your hard drive so you can send them to your friends to, you know, prove that you were on the moon.
This 360 capture was done with 'Surreal Capture', a tool that allows you to capture 360 or stereo SBS video directly from any Unreal Engine game or VR experience in real time at 7680 x 3840 with varying sample rates. It also works great during development, even while in the editor. Sadly, stereo 360 capture is not implemented yet (coming soon), but it is easy to get true stereo for at least 180 degrees by using the horizontal offset variable and inserting 6. It captures most post-processing effects except Exposure/Eye Adaptation; I am working with the developer to correct this if possible. surrealcapture.com
You can find out more about the "Apollo 11: 'One Small Step For...' VR Experiences" series at https://1smallstepfor.space