Powerful Magic Leap 2 INPUT Features Are HERE! (Controller & Head Pose)
Dilmer Valecillos | 2023-11-16 | Today, we're going to cover how to use the Magic Leap 2 Powerful INPUT API.
We'll go over how to detect controller input changes, how to read controller state information directly, and how to detect a variety of gestures generated from the controller touchpad. We will also make use of head pose info to adjust a canvas UI's position and rotation, making the UI follow us around. Lastly, we'll test all of these features using a powerful dev tool called the "Application Simulator", as well as by deploying to the device.
📚 Magic Leap 2 Input Video Chapters:
00:00 - 00:31 - Introduction to Magic Leap 2 Input (Controller & Head Pose)
00:31 - 01:31 - Creating a new Unity project with ML2 input features and the Unity ML2 Application Simulator
01:31 - 02:50 - ML2 Application Simulator action bindings and demos
02:50 - 15:54 - Interacting with the ML2 Input API with Unity scripting (Controller Input)
15:54 - 19:02 - Reading head pose information to change UI position and rotation during input events
📙 Great Magic Leap 2 Input Resources:
- Controller API Overview: https://developer-docs.magicleap.cloud/docs/guides/unity/input/controller/unity-controller-api-overview/
- Application Simulator (Action Bindings): https://developer-docs.magicleap.cloud/docs/guides/developer-tools/app-sim/app-sim-action-bindings/#movement-controls
- Helpful ways to read input within callbacks: docs.unity3d.com/Packages/com.unity.inputsystem@1.0/api/UnityEngine.InputSystem.InputAction.html
- Controller Gestures: https://www.magicleap.care/hc/en-us/articles/4424698871565-Controller-Overview
Also huge thanks to @MagicLeap for sponsoring this ML2 video!
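As a rough illustration of the approach covered in the video, here's a minimal Unity (C#) sketch of reading a controller action inside an Input System callback and using head pose to reposition a world-space canvas. The component name and serialized fields are my own assumptions for illustration, not the video's exact code; the action is assumed to be bound to the ML2 controller trigger in an Input Actions asset.

```csharp
using UnityEngine;
using UnityEngine.InputSystem;

// Hypothetical component: reads the trigger inside an Input System callback
// and uses head pose to make a world-space canvas follow the user.
public class FollowUIOnTrigger : MonoBehaviour
{
    [SerializeField] private InputActionProperty triggerAction; // bound to the controller trigger
    [SerializeField] private Transform headTransform;           // e.g. the Main Camera under the XR rig
    [SerializeField] private Transform uiCanvas;                // world-space canvas to reposition
    [SerializeField] private float distance = 1.5f;

    private void OnEnable()
    {
        triggerAction.action.Enable();
        triggerAction.action.performed += OnTriggerPerformed;
    }

    private void OnDisable()
    {
        triggerAction.action.performed -= OnTriggerPerformed;
    }

    private void OnTriggerPerformed(InputAction.CallbackContext ctx)
    {
        // Read the analog trigger value directly from the callback context.
        float value = ctx.ReadValue<float>();
        Debug.Log($"Trigger: {value}");

        // Use head pose to place the canvas in front of the user, facing them.
        uiCanvas.position = headTransform.position + headTransform.forward * distance;
        uiCanvas.rotation = Quaternion.LookRotation(uiCanvas.position - headTransform.position);
    }
}
```

The same `ctx.ReadValue<T>()` pattern works for the touchpad (`Vector2`) and buttons (`float`/button controls); see the InputAction docs linked above.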
👉 My Blog / 🔥 Newsletter (Subscribe to get up to date XR news) blog.learnxr.io
#xr #metaverse #unity #magicleap

SenseGlove Nova 2: Touch and Feel Virtual Reality in Your Hands!
Dilmer Valecillos | 2024-10-08 | Do you want to take VR to a whole new level? These haptic gloves, called the SenseGlove Nova 2, let you feel virtual objects as if they were right in your hands. I had the opportunity to test them, and they were mind-blowing! I wasn't a fan of the $5,999.00 USD price, but aside from that, they showcase an impressive level of innovation in haptics.
👉 My Blog / 🔥 Newsletter (Subscribe to get up to date XR news) blog.learnxr.io
#virtualreality #mixedreality #shorts

I Tried Snap's NEW Spectacles AR Glasses!
Dilmer Valecillos | 2024-10-08 | These are Snap's new Spectacles AR Glasses, targeted at developers and recently announced by Snap CEO Evan Spiegel.
These AR Glasses are not your typical smart glasses that simply mirror your display. Instead, they provide a full understanding of the real world, allowing you to place virtual objects in precise locations, effectively using the world as your canvas. You can interact with holograms using hand tracking, voice control, or even your mobile device as a controller, which Snap calls multi-modal input.
In any case, they're pretty cool and super lightweight (226 grams). The Spectacles price is currently $99 USD per month, with a 12-month commitment.
👉 My Blog / 🔥 Newsletter (Subscribe to get up to date XR news) blog.learnxr.io
#spectacles #augmentedreality #tech #shorts

Snap Spectacles AR Glasses Unveiled - BUT Here Is My Dev Take!
Dilmer Valecillos | 2024-10-05 | Snap announced its new Spectacles AR Glasses during my trip to the Snap Partner Summit, and they are hugely improved over the previous generation, with better battery life, cameras, processors, increased FoV, multi-modal support, and much more.
Today, I also cover a lot of cool things, including specs, unboxing, setup, SnapOS apps, building a Spectacles app from the ground up, and finally, my developer take on these AR glasses.
00:00 Intro To Snap Spectacles
01:14 Spectacles Specs
02:09 Spectacles Unboxing
02:26 Spectacles Setup
03:04 Spectacles Hand Interactions Tutorial
03:55 Spectacles Mirror, Spectator, and Controller Toggle
04:40 Showcasing A Few Spectacles Games And Apps
05:18 Creating A Lens Studio Spectacles App
07:43 Installing Spectacles Interaction Kit (SIK)
13:58 Creating A New TypeScript Component (For Planet Details)
20:59 Extending SIK With A Custom Interactable Rotator Component
23:07 Sending Lens to Spectacles & Testing in AR
23:44 My Dev Take (Takeaways)
24:51 Outro
📢 Huge thanks to @SnapAR and @OfficialSnapchat for sponsoring this video!
👉 My Blog / 🔥 Newsletter (Subscribe to get up to date XR news) blog.learnxr.io
#snap #spectacles #augmentedreality

Pico 4 Ultra Vs Quest 3 Passthrough Comparison!
Dilmer Valecillos | 2024-09-28 | In one of my previous videos, I had the opportunity to review the PICO 4 Ultra and noticed a few differences in the passthrough results compared to the Meta Quest 3. Some of you asked about these differences, so this video gives you a sneak peek at a few of them, such as distortion, quality, and how the sensors react to lighting conditions.
Thanks everyone and have an amazing new week!
📌 Support me by Subscribing to avoid missing future videos! youtube.com/dilmerv
📌 Support me on Patreon so I can keep doing stuff like this for free! patreon.com/dilmerv
📌 Get XR & game development tips from me on Twitter: https://x.com/dilmerv
👉 My Blog / 🔥 Newsletter (Subscribe to get up to date XR news) blog.learnxr.io
#pico4ultra #quest3 #shorts #metaverse

Meta Quest 3S First Look During Meta Connect 2024!
Dilmer Valecillos | 2024-09-26 | This was a first look at the Meta Quest 3S during today's #MetaConnect, and honestly it's a great device for anyone wanting to get into XR at a very low cost ($299.99, shipping October 15).
📌 Support me by Subscribing to avoid missing future videos! youtube.com/dilmerv
📌 Support me on Patreon so I can keep doing stuff like this for free! patreon.com/dilmerv
📌 Get XR & game development tips from me on Twitter: https://x.com/dilmerv
👉 My Blog / 🔥 Newsletter (Subscribe to get up to date XR news) blog.learnxr.io
#quest3s #mixedreality #virtualreality

XREAL Beam Pro + XREAL Air Pro 2: GREAT Lightweight AR Glasses!
Dilmer Valecillos | 2024-09-16 | Here's a quick overview of the XREAL Beam Pro and XREAL Air 2 Pro, which offer a very low cost of entry into Augmented Reality. I had a blast testing them out and was amazed at how practical and portable they are. I can't wait to take these devices to my upcoming XR conferences!
👉 My Blog / 🔥 Newsletter (Subscribe to get up to date XR news) blog.learnxr.io
#xr #metaverse #unity

PICO 4 Ultra UNBOXING: Is This A GOOD Mixed Reality Headset?
Dilmer Valecillos | 2024-09-09 | Today, I'd like to show you PICO's new high-end mixed reality headset, the PICO 4 Ultra, which is packed with a lot of technology, including two 32-megapixel passthrough cameras, a ToF depth-sensing camera, four environment-tracking cameras, the upgraded Qualcomm Snapdragon XR2 Generation 2 processor, and many new upgrades and features.
👉 My Blog / 🔥 Newsletter (Subscribe to get up to date XR news) blog.learnxr.io
#pico #pico4ultra #mixedreality

XREAL Beam Pro And Air 2 Pro AR Glasses: BUT Are They Worth IT?
Dilmer Valecillos | 2024-09-03 | I had the opportunity to check out the XREAL Beam Pro in combination with the XREAL Air 2 Pro, and honestly, we really need to talk about them.
👉 My Blog / 🔥 Newsletter (Subscribe to get up to date XR news) blog.learnxr.io
#XREAL #XREALAir2Pro #XREALBeamPro

The Meta Immersive Debugger is Now Available! - Unity Tutorial
Dilmer Valecillos | 2024-08-24 | Today, I'd like to introduce you to a new tool that will help you develop VR/MR games or apps efficiently during project iterations. It's called the Immersive Debugger: a powerful in-headset runtime debugger that allows you to expose variables, methods, and console logs, all without needing any coding experience.
👉 My Blog / 🔥 Newsletter (Subscribe to get up to date XR news) blog.learnxr.io
#unity #tutorial #quest3

Mixed Reality Utility Kit: A Powerful MR Utility for Scene API NOW Available!
Dilmer Valecillos | 2024-08-19 | Today, I'd like to introduce you to Meta's Mixed Reality Utility Kit (MRUK) by going over its features, using it with the Meta XR Simulator and synthetic environments, and finally, creating a small demo that utilizes MRUK with C# to query information about our scene.
📚 This video covers the following:
- Introduction to MRUK
- How to set up and use MRUK features from a C# standpoint
- Using Unity play mode & the Meta XR Simulator to test scene understanding with MRUK
- MRUK Android permissions & deploying to a Quest 3 device
👉 My Blog / 🔥 Newsletter (Subscribe to get up to date XR news) blog.learnxr.io
#unity #mixedreality #quest3

Creating Custom Unity Packages: Step-by-Step Tutorial
Dilmer Valecillos | 2024-08-15 | In today's video, we cover how to create custom packages and samples for your projects to help streamline your codebase. By transitioning from a monolithic project structure to a slimmer, more modular version, you'll improve your workflow as well as long-term reusability for future projects.
📚 Here is the summary:
- Creating a GitHub repository to store your custom packages
- Custom package structure and explanations
- Consuming and updating Unity custom packages
- Adding Unity samples to custom packages
👉 My Blog / 🔥 Newsletter (Subscribe to get up to date XR news) blog.learnxr.io
#unity #unitytips #gamedev

How to create a Quest 3 Mixed Reality App - Unity BEGINNER Tutorial!
Dilmer Valecillos | 2024-08-13 | I'll walk you through creating a Mixed Reality app based on a productivity app idea I had, using Meta Platform Tools in Unity. We'll be using Building Blocks to add a camera rig, passthrough, controller tracking, and interactions.
📌 Here's a summary of today's Unity Mixed Reality Quest 3 video:
- Unity Mixed Reality project setup and configuration (Project Setup Tool & Building Blocks)
- Creating a Measuring Tape feature (C# MonoBehaviour with input handling, line generation, and measurement label placement)
- Creating a Level Tool feature (grab interactions + C# MonoBehaviour)
- Creating a simple UI with buttons (poke interactions + C# MonoBehaviour)
- Demos showing the finished prototype, as well as an extended prototype I submitted to Meta as part of a new program
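The measuring-tape feature described above can be sketched as a minimal Unity MonoBehaviour. This is a hypothetical sketch, not the video's code: the component and method names are my own, and the input handling and label placement from the video are omitted.

```csharp
using UnityEngine;

// Hypothetical sketch: draws a line between two measurement points placed by
// the controller tip and reports the distance between them.
public class MeasuringTape : MonoBehaviour
{
    [SerializeField] private LineRenderer line;

    private Vector3 startPoint;
    private bool hasStart;

    // Called by your input handler with the current controller tip position.
    public void PlacePoint(Vector3 controllerTipPosition)
    {
        if (!hasStart)
        {
            // First click: anchor the start of the tape.
            startPoint = controllerTipPosition;
            hasStart = true;
            line.positionCount = 2;
            line.SetPosition(0, startPoint);
            line.SetPosition(1, startPoint);
        }
        else
        {
            // Second click: finish the measurement and report it.
            line.SetPosition(1, controllerTipPosition);
            float meters = Vector3.Distance(startPoint, controllerTipPosition);
            Debug.Log($"Measured: {meters:F2} m");
            hasStart = false;
        }
    }
}
```

A measurement label would typically be a world-space text object placed at the line's midpoint, which is where the video's label-placement step comes in.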
💡Get Started With Meta Presence Platform & Building Blocks: ocul.us/3Vadlsx
👉 My Blog / 🔥 Newsletter (Subscribe to get up to date XR news) blog.learnxr.io
#meta #unity #quest3

Create a Mixed Reality Game FAST - BUT JUST With The Unity Asset Store!
Dilmer Valecillos | 2024-08-07 | Today, I'd like to walk you through how to create a Mixed Reality game in Unity by just using a few assets from the Unity Asset Store.
👉 My Blog / 🔥 Newsletter (Subscribe to get up to date XR news) blog.learnxr.io
#madewithunity #mixedreality #extendedreality

How to create a Quest 3 Mixed Reality Game - Unity BEGINNER Tutorial!
Dilmer Valecillos | 2024-07-23 | Today, I will go over how to create a Quest 3 Mixed Reality game in Unity for beginners, including: enabling developer mode on the Quest 3 with the Meta Horizon App, installing Meta Quest Link, setting up Unity and Android dependencies for PC and macOS, and finally, creating a new Unity Mixed Reality project.
👉 My Blog / 🔥 Newsletter (Subscribe to get up to date XR news) blog.learnxr.io
#unity #tutorial #quest3

Apple visionOS Object Tracking Is Here! BUT DOES It Work Well?
Dilmer Valecillos | 2024-07-10 | Today, I would like to share what I've learned about Apple's Object Tracking features, recently announced as part of the visionOS 2.0 Preview.
This video will walk you through the entire workflow: image capturing/scanning, object reconstruction, machine learning training with spatial object tracking, and integrating trained objects into Xcode.
📚 Vision Pro Object Tracking Chapters:
00:00 - Apple visionOS Object Tracking Introduction & Questions
00:29 - visionOS Object Tracking Requirements
01:17 - Preparing Physical Objects For Object Reconstruction
02:12 - Generating USDZ Files (3D Models) With Reality Composer Pro
03:10 - Training Models With Create ML Object Tracking Features
08:27 - Integrating Trained Object Tracking Models With Xcode
19:05 - Running Object Tracking Demo With The Apple Vision Pro
20:27 - Adding Reference Object USDZ Model For Additional Visualizations
21:02 - Object Tracking Demo With Reference Object Model Rendering
21:57 - Outro (Final Thoughts)
👉 My Blog / 🔥 Newsletter (Subscribe to get up to date XR news) blog.learnxr.io
#AppleVisionPro #VisionPro #Apple

FASTER Iteration With Meta XR Simulator - MacOS Support Available!
Dilmer Valecillos | 2024-07-01 | Today, I am excited to announce the availability of macOS support in the Meta XR Simulator as part of v66 or above, enabling faster Unity development iteration.
📚 Video Chapters:
00:00 - Intro to Meta XR Simulator
00:36 - Benefits of using Meta XR Simulator with Unity Mixed Reality Projects
00:56 - Intro to LightSaber prototype which we'll integrate to learn about Meta XR Simulator
01:11 - Setting Up Meta XR Simulator Demos Project Template & Resources
01:55 - Adding Building Blocks & LightSaber prefabs
02:16 - Launching Meta XR Simulator (Virtual Reality Testing & Inputs)
04:33 - Adding Grab Interaction Building Block & Testing With Keyboard, Mouse, & Xbox Controller
06:56 - Data Forwarding Setup To Enable Physical Controller(s) In Meta XR Simulator
09:26 - Adding Empire Crates & LightSaber Slicing Features (With Ezy-Slice)
12:33 - Meta XR Simulator Support for MAC
13:56 - Adding MR Features: Passthrough & Synthetic Environment Testing
17:05 - Adding MR Features: MRUK, Effect Mesh, and Find Spawn Positions
22:50 - Outro
💻 GitHub repositories are available with templates to get started, resources, and a completed project if you prefer to review the final version rather than following step by step:
👉 My Blog / 🔥 Newsletter (Subscribe to get up to date XR news) blog.learnxr.io
#unity #meta #quest3

GIANT Rockets In Mixed Reality With OpenXR ML2 Spatial Anchors Features!
Dilmer Valecillos | 2024-06-12 | Today, I am excited to announce that Persistent Spatial Anchors with OpenXR are now available for Magic Leap 2.
📚 Video Chapters:
00:00 - Introduction To Spatial Anchors API & Spatial Anchors Storage API
02:13 - Unity Project Setup And Resources
02:26 - Camera Near Clipping Configuration & Spatial Anchor Permission
03:17 - AR Foundation (AR Anchor Manager Setup)
04:26 - C# Script With Spatial Anchors API (Anchor Creator Component)
17:05 - Anchor Creator Component Demo From The Headset
18:17 - Getting Additional Anchor Info From ML XR Anchor Subsystem
20:42 - ML XR Anchor Subsystem Demo
21:11 - Adding Spatial Anchors Storage API Capabilities
50:24 - Spatial Anchors Storage API Demo
51:00 - Adding Restore of Anchors From Storage With New UI
55:53 - Restoring of Anchors With UI Demo
56:52 - Outro
Spatial Anchors are fully compatible with and built on top of Unity's AR Foundation. These new API additions allow you to perform asynchronous calls for publishing, creating, and deleting anchors, as well as updating the expiration dates of anchors stored with the Spatial Anchor Storage API.
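As a rough sketch of the AR Foundation side of this, here's a minimal anchor-creator component. The component name and prefab wiring are my own assumptions for illustration; the ML2-specific publish/storage and expiration calls are SDK additions not shown here.

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Hypothetical sketch of anchor creation on top of AR Foundation.
public class AnchorCreator : MonoBehaviour
{
    [SerializeField] private ARAnchorManager anchorManager;
    [SerializeField] private GameObject anchorPrefab;

    public ARAnchor CreateAnchorAt(Pose pose)
    {
        // Instantiating a GameObject at a pose and adding an ARAnchor
        // component is the standard AR Foundation pattern for new anchors.
        var instance = Instantiate(anchorPrefab, pose.position, pose.rotation);
        return instance.AddComponent<ARAnchor>();
    }

    private void OnEnable() => anchorManager.anchorsChanged += OnAnchorsChanged;
    private void OnDisable() => anchorManager.anchorsChanged -= OnAnchorsChanged;

    private void OnAnchorsChanged(ARAnchorsChangedEventArgs args)
    {
        // Track anchor lifecycle through the manager's change event.
        foreach (var anchor in args.added)
            Debug.Log($"Anchor added: {anchor.trackableId}");
    }
}
```

The Magic Leap storage layer then builds on top of these anchors with its own asynchronous publish/delete/expiration API.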
Thanks to @MagicLeap @MagicLeapDevs for sponsoring this video.
👉 My Blog / 🔥 Newsletter (Subscribe to get up to date XR news) blog.learnxr.io
#openxr #unity #ml2 #programming

WebXR Tutorial: BUILD A Mixed Reality Game In Mattercraft Using Realtime Physics!
Dilmer Valecillos | 2024-06-04 | In today's video, we're going to create a Mixed Reality (MR) game from the ground up using WebXR in Mattercraft.
Mattercraft allows us to maintain a single codebase and run the same MR experience on Apple Vision Pro, Meta Quest 3, Zapbox + iPhone, and Magic Leap 2. We'll begin by experimenting with the new WebXR physics, which will enable us to create a fun slingshot mechanic and a target for our mini game.
📌 In this second Mattercraft video, we'll be working on the following areas:
- Creating the WebXR project using the Headsets & VR templates
- Creating a slingshot mechanic (SlingshotGrabber.ts TypeScript behavior)
- Creating a ball factory, or in other words a ball spawner (BallSpawner.ts TypeScript component)
- Creating a target (SlingshotTarget.ts TypeScript behavior)
- Creating a simple score system (GlobalContext.ts TypeScript context)
- Creating a simple environment & various physical objects (Havok Physics)
📚 Video Chapters:
00:00 - Intro To Mixed Reality Project WebXR Tools
00:44 - Overview of WebXR New Features Covered Today With Mattercraft
01:00 - Creating A New WebXR Project & Setup Of Physics Dependency
03:41 - Setting Up Controllers / Hands With Grab Interactions
06:11 - Adding Shapes, Setting Up Shapes Rigidbodies, And Colliders
12:16 - Setting Up Slingshot Ball Component
17:34 - Setting Up Ball Factory
38:01 - Testing Ball Factory Physics Within Mattercraft Viewer And Apple Vision Pro
38:30 - Setting Up Slingshot Component And Slingshot Grabber Behavior
50:20 - Testing Slingshot Mechanic With Meta Quest 3 And Zapbox
51:03 - Setting Up Target And Slingshot Target Behavior
53:44 - Creating A UIVisualizer To Bind UI And Audio Manager
54:25 - Testing Our Final WebXR Mixed Reality Project
56:08 - Outro
📢 Recommended:
👉 If you missed the previous Mattercraft WebXR video, be sure to watch it at: youtu.be/1y40Y3wdpCY
👉 Get started with the Mattercraft IDE by signing up at: bit.ly/3VSmruD
👉 Get started with AR/VR using 🕶️ Zapbox: zapbox.io/?via=dilmer (affiliate link)
👉 My Blog / 🔥 Newsletter (Subscribe to get up to date XR news) blog.learnxr.io
#webxr #applevisionpro #quest3

My New VR/MR Portable Developer Setup (Alienware x16 R2 First Impressions)
Dilmer Valecillos | 2024-05-21 | I briefly talked about getting a new Alienware x16 R2 recently and how much I needed a powerful laptop for XR (AR/VR) development. Well, I found that laptop, and here's a small video about it.
As far as pros and cons of using this laptop for XR development, here's what I’ve experienced in these last 2 weeks:
Pros 👍:
- It is portable. I can now take it anywhere, unlike my previous main machine, which was a desktop; before, it was hard to work on XR projects unless I was at that powerful desktop.
- It runs Meta Link incredibly fast compared to my older desktop (Titan GPU + 64GB + i9) versus this laptop's i9 Ultra + 4090 GPU + 32GB.
- Large Unity projects open incredibly fast; I would say 50% faster than on my MacBook Pro (Apple M1 Pro 2021).
- The keyboard feels like a full-size mechanical keyboard and is very responsive.
- It can switch between performance modes (battery, quiet, balanced, or performance for more resource-intensive experiences).
- Customer service from @Dell @AlienwareChannel is amazing; they're very responsive and willing to help right away.
Cons 👎:
- Although it is portable, it doesn't fit in a regular-sized backpack. You need a larger backpack or an Alienware backpack, which works very well with this laptop. This is a con, but the Horizon Slim Backpack is awesome and very affordable at $38.99 USD.
- It is a bit loud, but honestly, I wouldn't expect less given its power in a portable form.
- The price is pretty high. I can justify it due to its portability and the ability to use it during conferences and trips, but for the average XR developer, this may be too expensive.
- When I initially received this machine, there was screen light bleeding. However, I was able to get an exchange right away, which was delivered within 3-4 days of my initial request.
Overall, I am pretty happy with this purchase. I had a Razer Blade prior to this (with very low specs), and it became outdated too fast. For that reason, when Meta Link (aka Oculus Link) got an update, I couldn't use that feature just one year after purchasing the laptop. This is the main reason why I decided to get the highest-spec version of the Alienware x16 R2.
📌 Support me by Subscribing to avoid missing future videos! youtube.com/@dilmerv
📣 Consider becoming a Patreon today: patreon.com/dilmerv and GET MY “Full Source Code” Tier
💡 What do you get from Patreon?
👉 Access to this video's GitHub repo + all code I work on for each video
👉 Access to a special Patreon Discord group where I can answer questions
👉 My Blog / 🔥 Newsletter (Subscribe to get up to date XR news) blog.learnxr.io
#xr #metaverse #unity

Building A Mixed Reality Tabletop Game FAST - Quest 3 & Presence Platform!
Dilmer Valecillos | 2024-05-18 | Today, I am excited to announce a new YouTube video series featuring Meta's Presence Platform. In this series, we will prototype small tabletop games and productivity apps with mixed reality.
💡To Get Started With The Presence Platform & Building Blocks check out the docs at: ocul.us/3Vadlsx
In the first video, we will cover:
- How to set up a Unity project with Meta XR components using the project setup tool.
- Using core building blocks such as the camera rig, passthrough, controller tracking, hand tracking, visualizers, room mesh, and interactions.
- How to build a mixed reality tabletop bowling game from the ground up.
- Step-by-step instructions on how to set up and use the Meta XR Simulator with data forwarding features, allowing you to use physical controllers with the simulator.
- How to use synthetic environments with the Meta XR Simulator to test passthrough via the sim.
📚 Video Chapters:
00:00 - Intro to Meta Presence Platform video series and prototypes
01:03 - Reviewing the tabletop bowling game we'll build today
01:29 - Creating a new Unity Project with Meta XR components & the project setup tool
02:48 - Intro to Building Blocks
05:06 - Adding additional Building Blocks: passthrough, controller tracking, hand tracking, virtual hands, and grab interactions
06:39 - Intro to Meta XR Simulator & synthetic environments
08:15 - Data forwarding feature overview
09:44 - Building a mixed reality tabletop game
23:54 - Demos showing our mixed reality bowling game completed
👉 My Blog / 🔥 Newsletter (Subscribe to get up to date XR news) blog.learnxr.io
#meta #unity #quest3

NEW WebXR Tool Available For Apple Vision Pro, Quest 3, ML2, And More!
Dilmer Valecillos | 2024-04-28 | Here's a sneak peek of my latest WebXR video, where I showcased some of the major features available with Mattercraft. Highlights include support for Apple Vision Pro, Meta Quest 3, Magic Leap 2, Zapbox, and many others, a LIVE PREVIEW tool for fast WebXR dev iteration, a very cool animation system, and much more.
💡 Also, feel free to let me know if you have any questions below. Thanks, everyone!
📌 Support me by Subscribing to avoid missing future videos! youtube.com/@dilmerv
📣 Consider becoming a Patreon today: patreon.com/dilmerv and GET MY “Full Source Code” Tier
💡 What do you get from Patreon?
👉 Access to this video's GitHub repo + all code I work on for each video
👉 Access to a special Patreon Discord group where I can answer questions
👉 My Blog / 🔥 Newsletter (Subscribe to get up to date XR news) blog.learnxr.io
#webxr #visionpro #quest3 #shorts

WebXR in Mattercraft: Rapid AR/VR Creation for Apple Vision Pro & Quest 3!
Dilmer Valecillos | 2024-04-26 | Today, I would like to invite you to join me as I introduce you to a new WebXR tool and build a SpaceX fan prototype that runs on multiple platforms.
📢 To register for FREE for 14 days and follow along, go to: bit.ly/3VSmruD
📌 Get Started With AR/VR By Using Zapbox (A Very Low Cost XR Device): zapbox.io/?via=dilmer (affiliate link)
📚 This video also covers the following areas:
00:00 - Intro To Major Mattercraft WebXR Tools
01:38 - Creating An Account And Logging Into Mattercraft
02:34 - Creating A WebXR Mattercraft Project
05:05 - Selecting A WebXR Controller (For instance: Quest 3, ML2, Hands, etc.)
05:39 - Adding glTF / GLB 3D Models
07:56 - WebXR One-Click Publishing and Testing With Quest 3 & Apple Vision Pro
09:23 - Adding A NoSkyOnAR Custom Behavior With TypeScript
12:05 - Adding A Legend And Implementing The LineLegend Custom Behavior With TypeScript
16:30 - Adding Animations (States And Timelines)
20:58 - Adding Billboard Components To All Legends
23:28 - Adding UI Interactions With 3D Models And Event Bindings
26:42 - Final Demo of Our "SpaceX Fan Project" Running On Multiple Headsets
28:51 - Outro
👉 My Blog / 🔥 Newsletter (Subscribe to get up to date XR news) blog.learnxr.io
#webxr #applevisionpro #quest3

Unity visionOS 2D Windows and FULLY Immersive VR! (Apple Vision Pro Development)
Dilmer Valecillos | 2024-04-08 | In this video, we'll take a look at how to create visionOS experiences in Unity using 2D Windowed and Fully Immersive app modes.
Additionally, I will go over creating an Input Actions mapping file to bind various input events used with 2D Windowed apps and Fully Immersive VR, including utilizing the new VisionOS Spatial Pointer.
💡 A small fix is shown in this video to allow for proportionally scaling 2D windows and 3D content in the generated Unity builds.
👉 My Blog / 🔥 Newsletter (Subscribe to get up to date XR news) blog.learnxr.io
#applevisionpro #unity #visionos

Immersed App Releasing Soon For Apple Vision Pro | Sneak Peek #apple #applevisionpro
Dilmer Valecillos | 2024-04-01 | I had the opportunity to get a BETA version of Immersed for Apple Vision Pro, and even though they're early in development, I could see how much better this is than the native visionOS mirroring features available today.
ℹ️ This BETA version supports:
- Up to 5 displays
- Standing vs. sitting mode (this means there's a slight Y-axis change when going from sitting to standing, a pretty cool feature)
- Basic spatial window transformations (movement, scaling, and adding curvature)
- Audio sent from your OS to the Vision Pro headset
- Support for PC/Mac/Linux
I only tested it with a Mac, and I found a few issues during the setup, but to be fair, they're early in BETA, so I am sure these will be fixed soon. On the other hand, the screens look amazing, and it was great to be able to move, scale, and add curvature to windows in my working area.
📌 Support me by Subscribing to avoid missing future videos! youtube.com/@dilmerv
📣 Consider becoming a Patreon today: patreon.com/dilmerv and GET MY “Full Source Code” Tier
💡 What do you get from Patreon?
👉 Access to this video's GitHub repo + all code I work on for each video
👉 Access to a special Patreon Discord group where I can answer questions
👉 My Blog / 🔥 Newsletter (Subscribe to get up to date XR news) blog.learnxr.io
#applevisionpro #shorts #apple

NEW Mixed Reality Spectator App For iOS NOW Available!
Dilmer Valecillos | 2024-03-30 | Today, I spent a big part of my day playing around with multiple MR experiences while testing the new spectator app for iOS. Honestly, it is the best of its kind, and I wish we had something similar for all XR devices today. Perhaps this app will push other platforms to release something similar!
📌 Support me by Subscribing to avoid missing future videos! youtube.com/@dilmerv
📣 Consider becoming a Patreon today: patreon.com/dilmerv and GET MY “Full Source Code” Tier
💡 What do you get from Patreon?
👉 Access to this video's GitHub repo + all code I work on for each video
👉 Access to a special Patreon Discord group where I can answer questions
👉 My Blog / 🔥 Newsletter (Subscribe to get up to date XR news) blog.learnxr.io
#mixedreality #spatialcomputing #shorts

Diving Into Unity OpenXR ML2 Gaze Features - Eye Tracking!
Dilmer Valecillos | 2024-03-28 | Today, I'd like to go over a few Eye Tracking/Gaze features that are now part of the new Magic Leap OpenXR support released recently.
We will cover the gaze interaction profile setup, requesting eye tracking permissions through C#, and creating a prototype that makes use of eye tracking data to interact with virtual content.
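A minimal sketch of the runtime permission step, using Unity's Android permission API. The ML2 permission string and the component wiring here are assumptions for illustration; the video's exact setup (including the gaze interaction profile) may differ.

```csharp
using UnityEngine;
using UnityEngine.Android;

// Hypothetical sketch: request the ML2 eye tracking permission at runtime
// before any gaze data can be read.
public class EyeTrackingPermission : MonoBehaviour
{
    // Assumed Android-style ML2 permission string for eye tracking.
    private const string EyeTracking = "com.magicleap.permission.EYE_TRACKING";

    private void Start()
    {
        if (!Permission.HasUserAuthorizedPermission(EyeTracking))
        {
            var callbacks = new PermissionCallbacks();
            callbacks.PermissionGranted += _ => Debug.Log("Eye tracking granted");
            callbacks.PermissionDenied += _ => Debug.LogWarning("Eye tracking denied");
            Permission.RequestUserPermission(EyeTracking, callbacks);
        }
    }
}
```

Once the permission is granted, gaze origin/direction can be read through the configured gaze interaction profile and used to raycast against virtual content.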
👉 My Blog / 🔥 Newsletter (Subscribe to get up to date XR news) blog.learnxr.io
#unity #openxr #magicleap

How To GET Advanced VR & AR Player Data With Cognitive3D Analytics
Dilmer Valecillos | 2024-03-23 | Today, I'd like to share my experience using an advanced analytics tool for VR/AR that allows you to track and collect player data.
The visualizations offered by Cognitive3D are something else, and I would love to show you a few interesting demos as well as how to integrate their SDK into a Unity project.
📌 In addition to what was mentioned, I will also cover:
- How you can use Scene Explorer to view your players' recorded sessions, including HMDs, controllers, gaze-generated heatmaps, and additional stats.
- How to track specific object behaviors using Dynamic Objects + Custom Events.
- How to customize your Cognitive3D session info for authentication purposes.
👉 My Blog / 🔥 Newsletter (Subscribe to get up to date XR news) blog.learnxr.io
#unity #spatialcomputing #cognitive3d

OpenXR With Magic Leap 2 NOW Available - Unity Setup & Plane Detection!
Dilmer Valecillos | 2024-03-15 | Today, I am really excited to announce that Magic Leap is now moving from their custom MLSDK to OpenXR (open standards for XR).
This is a very good move because the OpenXR runtime lets you use a common set of APIs shared with other XR devices, in addition to the Unity AR Foundation & XR components you may already be familiar with.
I’m also sharing a step-by-step process which includes: setting up an OpenXR Unity project, setting up a rig and controller, adding plane detection, and lastly adding plane classifications.
👉 My Blog / 🔥 Newsletter (Subscribe to get up to date XR news) blog.learnxr.io
#openxr #unity #magicleap

NEW Interaction SDK Features Are HERE! Hands Locomotion & MORE!
Dilmer Valecillos | 2024-03-12 | Today, I am excited to provide you with an overview of Meta's new Interaction SDK features released with Meta XR version 62.0.0 & the Interaction SDK OVR Samples package.
I will also go over Unity project setup with the updated Meta XR packages, new hand locomotion examples, new comprehensive rig examples, and lastly, multimodal support.
👉 My Blog / 🔥 Newsletter (Subscribe to get up to date XR news) blog.learnxr.io
#unity #meta #sdk #xr

Step-By-Step Guide: Mapping My YouTube Studio With Immersal AR
Dilmer Valecillos | 2024-03-11 | Today, I would like to continue my AR location-based video series, where I use Immersal to scan my YouTube studio and generate an Augmented Reality map.
📚 Video Chapters:
00:00 - Mapping My Studio With Immersal AR (Introduction)
00:29 - Creating A New Unity Project
00:55 - Installing Immersal SDK Packages, Samples, And Scanning My Office
02:16 - AR Camera Setup & Immersal SDK Components
07:00 - Getting Map Id From The Immersal Developer Hub
08:33 - Changing the AR Localization Behavior
12:51 - Adding A Script To Toggle Point Cloud Visualizers
14:25 - Adding AR Objects Relative To Our AR Map
16:30 - Adding AR Content Raycast For AR Object Selection
19:25 - Outro
👉 My Blog / 🔥 Newsletter (Subscribe to get up to date XR news) blog.learnxr.io
#augmentedreality #spatialcomputing #unityApple Vision Pro: Unity visionOS Template IS NOW Available! #apple #applevisionproDilmer Valecillos2024-02-14 | Today, I did some testing with the latest version of Unity visionOS 1.0.3, which included utilizing the impressive Unity visionOS template showcased in this short video. Additionally, I went through the new setup process and reviewed the updated Play To Device features for Dev Fast Iteration. Moving forward, my plan is to create a new, small, step-by-step video series focusing on Fully Immersive (VR) experiences, covering both bounded and unbounded volumes in Unity.
All the research I did today will help me get there, and I should have a total of 2 new videos this month!
📌 Support me by Subscribing to avoid missing future videos! youtube.com/@dilmerv
📣 Consider becoming a Patreon today: patreon.com/dilmerv and GET MY “Full Source Code” Tier
💡 What do you get from Patreon ? 👉 Access to this video GitHub repo + all code I work on for each video 👉 Access to special Patreon discord group where I can answer questions
👉 My Blog / 🔥 Newsletter (Subscribe to get up to date XR news) blog.learnxr.io
#visionpro #unityApple Vision Pro FAST Development Iteration With Mac Virtual Display!Dilmer Valecillos2024-02-13 | Using the Apple Vision Pro along with Virtual Display to deploy from Xcode to the physical device without ever leaving the immersive experience significantly enhances development iteration times.
Learn how I use this today through a short Vision Pro development video with Xcode. For further resources on visionOS development, check out the playlists below.
📣 Consider becoming a Patreon today: patreon.com/dilmerv and GET MY “Full Source Code” Tier
💡 What do you get from Patreon ? 👉 Access to this video GitHub repo + all code I work on for each video 👉 Access to special Patreon discord group where I can answer questions
👉 My Blog / 🔥 Newsletter (Subscribe to get up to date XR news) blog.learnxr.io
#AppleVisionPro #VisionPro #AppleHow To Build Immersive AR Environments With Immersal SDK?Dilmer Valecillos2024-02-12 | Today, I'd like to introduce you to Immersal, which allows us to build accurate and persistent multiuser Location-Based AR experiences that can run on most mobile devices.
We'll also explore an AR Navigation Demo created with the Immersal SDK, showcasing their very reliable AR localization features. Additionally, we'll build a Unity demo demonstrating persistent AR content placement.
📚 Immersal SDK Video Chapters: 00:00 - Immersal AR Technology (Introduction) 01:06 - AR Navigation Demo With Immersal 01:43 - AR Navigation Demo Prototype In Unity 02:44 - AR Navigation Demo Testing With an iPhone 15 Pro 03:19 - Using Mapper 2.0 To Scan My Studio (Immersal Workflow) 04:10 - Immersal Developer Hub Dashboard 08:25 - Integrating Immersal SDK in Unity 19:13 - Testing Our Immersal Demo App With iOS 19:51 - Switching To Android And Testing With a Google Pixel 4
Also huge 🎉 thanks to @Immersal for sponsoring this XR video. I had a great experience learning about and testing their useful AR localization tools.
👉 My Blog / 🔥 Newsletter (Subscribe to get up to date XR news) blog.learnxr.io
#augmentedreality #immersal #unity #xrApple Vision Pro Developer Strap Is Here! BUT DO You Need It?Dilmer Valecillos2024-02-11 | The Apple Developer Strap was released very quickly after the Apple Vision Pro, and many of you had questions about it.
Today, I would like to answer some of those questions by going through unboxing, installation, Xcode deployment comparisons with Wi-Fi vs USB-C, running Virtual Display with USB-C, Apple Configurator with USB-C, and Reality Composer Pro 4K USB-C video transfers.
Tech specs from Apple: USB-C data connection, individually amplified dual driver audio pods, and compatible with Mac.
📚 Vision Pro Developer Strap Chapters: 00:00 - Vision Pro Developer Strap (Intro) 00:26 - Unboxing The Vision Pro Developer Strap 01:55 - Installing The Vision Pro Developer Strap 05:11 - Connecting The Vision Pro Developer Strap To Your Mac 05:19 - Enabling Apple Vision Pro Developer Mode 05:38 - Using Apple Configurator With Vision Pro Developer Strap Via USB-C 06:06 - Connecting Vision Pro Developer Strap To Xcode 06:41 - Xcode USB-C Vs Wi-Fi Deployments 08:18 - Vision Pro Virtual Display Connection Via USB-C 08:40 - Reality Composer Pro Developer Capture 4K Transfers Via USB-C 09:06 - Reality Composer Pro Vision Pro Connection 09:21 - Conclusion 10:18 - Outro
👉 My Blog / 🔥 Newsletter (Subscribe to get up to date XR news) blog.learnxr.io
#AppleVisionPro #VisionPro #AppleApple Vision Pro First Impressions From A DEV Perspective!Dilmer Valecillos2024-02-06 | I couldn't help but show you what I've been playing with since I got my Apple Vision Pro.
Today, I'll be unboxing the Apple Vision Pro, showing you a few cool demos as well as core visionOS features, and lastly, I'll give you my early review of what I like and don't like so far about the Apple Vision Pro. Keep in mind that I will be doing a more detailed review in a few weeks.
Feel free to share your thoughts on the Apple Vision Pro in the comments section below! I'm curious to hear your opinions.
📚 Apple Vision Pro Chapters: 00:00 - Apple Vision Pro (Intro) 00:28 - Unboxing & Specs 09:55 - Device Setup & Persona Results 10:47 - Passthrough Quality 11:29 - visionOS Core Interactions 13:03 - visionOS Environments 13:31 - Sharing Mac screen on visionOS & Persistent UI 14:04 - Encounter Dinosaurs 14:38 - My Early Review / Takeaways
👉 My Blog / 🔥 Newsletter (Subscribe to get up to date XR news) blog.learnxr.io
#AppleVisionPro #VisionPro #AppleVisionProReviewUnity XR Hands Custom Gestures Tools Are Here!Dilmer Valecillos2024-01-31 | Today, we're going to take a look at XR Hands Custom Gestures tools, which allow you to author your own hand gestures with the OpenXR Plugin & XR Hands Packages.
👉 My Blog / 🔥 Newsletter (Subscribe to get up to date XR news) blog.learnxr.io
#unity #xr #xrhandsHOW To Get Started With ML2 Hand Tracking Features In Unity (XR Toolkit & ML2 SDK)Dilmer Valecillos2024-01-03 | In today's video, we'll go over how to integrate hand tracking features by using the ML2 SDK.
We'll also create a demo scene where Magic Leap 2 hand tracking permissions will be configured in Unity, and we'll be building a real-time hand visualizer to display each hand skeleton bone as well as its position and rotation.
📚 Magic Leap 2 Hand Tracking Video Chapters: 00:00 - 00:58 - Introduction to Magic Leap 2 Hand Tracking Features 00:58 - 04:53 - ML2 Hand Tracking Project Setup 04:53 - 06:08 - Integrating Hand Tracking with Hand Tracking Manager Script 06:08 - 07:52 - Getting XR Input Devices For Left And Right Hand Device 07:52 - 12:48 - Building A Hand Tracking Bone Visualizer 12:48 - 15:05 - Getting And Displaying Detected Gestures 15:05 - 17:11 - Adding Bone Names To Bone Visualizer
👉 My Blog / 🔥 Newsletter (Subscribe to get up to date XR news) blog.learnxr.io
#xr #mixedreality #unity #magicleapApple Vision Pro: Building A visionOS Prototype with ShapesXR and Unity!Dilmer Valecillos2023-12-30 | I recently launched a new YT video about the process of using ShapesXR as an XR design tool 👨🎨 and today I'd like to give you a sneak peek at it.
💡How can we build an Apple Vision Pro prototype with ShapesXR and convert the result into a fully working Mixed Reality Unity project? Well, I answer that question and many more in this long-form video: youtu.be/rkKGfp1PZ3c
📌 Support me by Subscribing to avoid missing future videos! youtube.com/@dilmerv
📣 Consider becoming a Patreon today: patreon.com/dilmerv and GET MY “Full Source Code” Tier
💡 What do you get from Patreon ? 👉 Access to this video GitHub repo + all code I work on for each video 👉 Access to special Patreon discord group where I can answer questions
👉 My Blog / 🔥 Newsletter (Subscribe to get up to date XR news) blog.learnxr.io
#shorts #visionos #unity #mixedrealityI Built A visionOS Prototype With ShapesXR And Unity!Dilmer Valecillos2023-12-28 | Today, I will walk you through designing an Apple Vision Pro landing area using a powerful XR Design tool called ShapesXR.
We'll utilize ShapesXR tools for UI Design, Eye Gaze Interactions, and Hand Pinch Gestures. In addition, I will use the ShapesXR Unity Plugin to convert the design into a Unity project, creating a fully functional VR/MR demo for Quest Pro with Eye Gaze and Pinch Gestures.
📚 What are we going to cover today? 00:00 - 00:47 - ShapesXR Prototype For Apple Vision Pro (Introduction) 00:47 - 00:53 - ShapesXR MR/VR App Installation Steps 00:53 - 01:13 - Pairing Your Quest 3 Or Quest Pro With ShapesXR 01:13 - 04:28 - ShapesXR Dashboard, Figma Token Setup, & Adding visionOS Resources 04:28 - 07:01 - Designing An Apple Vision Pro Landing Area With ShapesXR 07:01 - 11:06 - Setting Up Meta Tools, XR Toolkit, And Meta Gaze Adapter 11:06 - 13:52 - Installing ShapesXR Unity Plugin And Importing ShapesXR visionOS Space 13:52 - 25:11 - Adding XR Interactions And Implementing Eye Gaze With Hand Pinch Detection 25:11 - 25:28 - Outro
Also huge 🎉 thanks to @ShapesXR for sponsoring this XR video. I had a blast and this tool was designed in such a way that it enables you to be very creative, super easy to use, but at the same time very powerful.
👉 My Blog / 🔥 Newsletter (Subscribe to get up to date XR news) blog.learnxr.io
#shapesxr #applevisionpro #unityVastly Improved Hand Tracking Accuracy & Latency: Magic Leap 2 OS Update!Dilmer Valecillos2023-12-19 | Magic Leap recently released big hand tracking improvements as part of ML2 OS 1.4.1.
📌 Here are some of the highlights:
👉 Users will notice a 6x improvement in overall accuracy and a 10% improvement in latency of virtual hands. 👉 Keypoints are more closely aligned with users' real hands, and depth is more consistent. 👉 Gestures are more responsive and reliable, with notable impacts to pinch and home gesture detection. 👉 A new setting to toggle between near and far hand tracking interactions, which significantly improves hand tracking features.
Overall, it feels much more in line with real hand movements. Gestures are more responsive during detection, and the virtual hands displayed within their "Model Viewer" experience look very impressive.
📌 Support me by Subscribing to avoid missing future videos! youtube.com/@dilmerv
📣 Consider becoming a Patreon today: patreon.com/dilmerv and GET MY “Full Source Code” Tier
💡 What do you get from Patreon ? 👉 Access to this video GitHub repo + all code I work on for each video 👉 Access to special Patreon discord group where I can answer questions
👉 My Blog / 🔥 Newsletter (Subscribe to get up to date XR news) blog.learnxr.io
#augmentedreality #mixedreality #arglassesMeta Haptics Studio and Haptics SDK: Full Walkthrough NOW Available!Dilmer Valecillos2023-12-16 | Today, I am really excited to share and walk you through a pretty cool new set of haptic tools from Meta, now available as a production release.
The tools covered today include: Meta Haptics Studio, Meta Haptics Studio Companion App, and Meta Haptics SDK (For Unity & Unreal).
I'm also excited to show you how we can take an already established mixed reality game, such as The World Beyond, and guide you through the process of integrating Haptics.
📚 Meta Haptics SDK Video Chapters: 00:00 - 01:18 - Meta Haptics Introduction (What We'll Cover Today) 01:18 - 02:19 - Sneak Peek "The World Beyond" Project With Integrated Haptics 02:19 - 03:07 - Installing Meta Haptics Studio And Meta Companion App 03:07 - 13:38 - Meta Haptics Studio Overview (Amplitude, Frequency, Emphasis Envelopes) 13:38 - 24:56 - Integrating Haptics SDK into "The World Beyond" Game And Testing 24:56 - 25:18 - Outro
👉 My Blog / 🔥 Newsletter (Subscribe to get up to date XR news) blog.learnxr.io
#meta #haptics #quest3Meta Haptics Studio And Meta Haptics SDK NOW Available!Dilmer Valecillos2023-12-15 | Today, I am really excited to share a pretty cool new set of haptic tools from Meta, now available as a production release! These tools include the NEW Meta Haptics Studio, Meta Haptics Studio Companion App, and Meta Haptics SDK (for Unity & Unreal).
💡 A Full YouTube video about these tools coming out this weekend!
📌 Support me by Subscribing to avoid missing future videos! youtube.com/@dilmerv
📣 Consider becoming a Patreon today: patreon.com/dilmerv and GET MY “Full Source Code” Tier
💡 What do you get from Patreon ? 👉 Access to this video GitHub repo + all code I work on for each video 👉 Access to special Patreon discord group where I can answer questions
👉 My Blog / 🔥 Newsletter (Subscribe to get up to date XR news) blog.learnxr.io
#meta #quest3 #unityVERY FAST Iteration With Unity visionOS PolySpatial Play To Device!Dilmer Valecillos2023-11-25 | Today, we're going to take a look at Unity's PolySpatial Play To Device tool, which enables fast iteration in visionOS development within Unity.
We will also review the compatibility matrix for all requirements needed to run the Play To Device tool, learn how to configure the tool, and explore a few visionOS demos while using it. Additionally, I will share some interesting history about ARKit Remote as a fun part of this video.
📚 Unity PolySpatial Play To Device Chapters: 00:00 - 00:32 - Introduction to Unity PolySpatial Play To Device and video overview 00:32 - 03:16 - ARKit Remote History (to give you more context about why this tool is so important) 03:16 - 04:42 - Why is Unity PolySpatial Play To Device Really Needed? 04:42 - 06:48 - Unity PolySpatial Play To Device Requirements 06:48 - 11:54 - Setting Up And Reviewing Unity PolySpatial Play To Device Compatibility Matrix (Dependencies) 11:52 - 18:30 - Running Unity visionOS demos with Play To Device (IP Address Connectivity) 18:30 - 20:23 - Testing Unity PolySpatial With AR Features Through XR Simulation
👉 My Blog / 🔥 Newsletter (Subscribe to get up to date XR news) blog.learnxr.io
#xr #metaverse #unityHow To Get STARTED With Unity visionOS PolySpatial Tools! (Vision Pro Development)Dilmer Valecillos2023-11-21 | Today, we're reviewing Unity's visionOS PolySpatial tools, including creating a fully immersive VR experience, a shared space MR experience, and a full MR experience.
👉 My Blog / 🔥 Newsletter (Subscribe to get up to date XR news) blog.learnxr.io
#unity #visionos #apple #xrTransitioning Between Realities: ML2 Dynamic Dimmer And UnityDilmer Valecillos2023-11-08 | Today, I would like to introduce you to the Magic Leap 2 Dynamic Dimmer.
Magic Leap 2 Dynamic Dimmer provides two powerful options for developers:
- A Global Dimmer, which dims the environment to ensure clear, solid, and vibrant digital content in bright areas. Think of it as a tint applied to the background under all the virtual content, whose opacity value you can control.
- A Segmented Dimmer, which allows applications to locally dim just the part of the display with virtual content. This means a subtle tint or border is applied around the edges of 2D/UI content and even 3D content.
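To make the Global Dimmer option more concrete, here is a minimal Unity sketch, assuming the `MLGlobalDimmer.SetValue` API from the ML2 Unity SDK (check the resources linked below for the exact API surface in your SDK version):

```csharp
using UnityEngine;
using UnityEngine.XR.MagicLeap; // ML2 Unity SDK namespace (assumed)

// Hedged sketch: drives the Global Dimmer's background tint from a script.
// MLGlobalDimmer.SetValue is assumed from the ML2 Unity SDK; verify it
// against the dimmer guides for your SDK version.
public class GlobalDimmerController : MonoBehaviour
{
    // 0 = no dimming (fully transparent tint), 1 = fully dimmed background.
    [Range(0f, 1f)]
    [SerializeField] private float dimLevel = 0.35f;

    private void Update()
    {
        // Apply the current opacity value to the background tint each frame,
        // so tweaking dimLevel in the Inspector updates the dimming live.
        MLGlobalDimmer.SetValue(dimLevel);
    }
}
```

Exposing the level as a serialized `[Range]` field lets you scrub the dimming in the Inspector while the app runs, which is handy for finding a comfortable opacity for your environment.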
📙 Great resources to learn more about ML2 Dynamic Dimmer features: 👉 Unity info: https://developer-docs.magicleap.cloud/docs/guides/unity/display/unity-global-dimming/ 👉 Guides: https://developer-docs.magicleap.cloud/docs/guides/features/dimmer-feature/
Also huge thanks to @MagicLeap for sponsoring this ML2 video!
👉 My Blog / 🔥 Newsletter (Subscribe to get up to date XR news) blog.learnxr.io
#metaverse #magicleap #unity #xrQuest 3: Powerful Mixed Reality Features with The NEW Meta Depth APIDilmer Valecillos2023-11-03 | Today, I would like to introduce you to the "Meta Depth API" for Quest 3, which is now part of Meta's Presence Platform tools.
Meta Depth API can greatly enhance mixed reality experiences by occluding digital objects with the real world. I will also explore various occlusion examples, including the creation of a basic 3D platformer with passthrough, scene understanding, and the exciting new occlusion features.
📚 Quest 3 With Meta Depth API Video Chapters: 00:00 - 00:35 - Meta Depth API With Quest 3 video introduction 00:35 - 02:40 - A variety of Demos With Meta Depth API (Hard Occlusion vs Soft Occlusion) 02:40 - 09:56 - Building a Meta Depth API demo with Occlusion Standard and Occlusion Particle Standard Unlit shader 09:56 - 10:19 - Testing your first Meta Depth API Unity Project! 10:19 - 15:16 - Adding a Robot Character Controller and testing it on the Quest 3 15:16 - 15:57 - Adding additional features such as UI, interactions, and score system 15:57 - 16:31 - Wrapping Up Meta Depth API tutorial and video
👉 My Blog / 🔥 Newsletter (Subscribe to get up to date XR news) blog.learnxr.io
#xr #metaverse #unityIs The QUEST 3 Good For MR Development? (First Impressions & Dev Tools)Dilmer Valecillos2023-10-21 | I've been experimenting with the NEW Quest 3 headset, testing various mixed reality experiences, and evaluating the hardware in depth.
Today, we'll explore the Quest 3 development tools, including deploying from Unity, and I'll share what I've learned and my takeaways from this testing experience.
📚 Quest 3 Video Chapters: 00:00 - 00:27 - Quest 3 video introduction 00:27 - 01:57 - Testing Quest 3 First Encounters mixed reality demo 01:57 - 03:48 - Testing Quest 3 passthrough quality with a few objects 03:48 - 04:55 - Testing Quest 3 with Cubism mixed reality & hand tracking 04:55 - 06:09 - Testing Quest 3 with Immersed 06:09 - 07:06 - Opening the Quest 3 headset and Quest 3 accessories 07:06 - 09:16 - Quest 3 specifications 09:16 - 11:50 - Quest 3 development setup and deploying your first Quest 3 Unity App 11:50 - 12:26 - Comparing Quest 3 passthrough quality with Quest 2 and Quest Pro 12:26 - 13:25 - My overall takeaways about the Quest 3 device
💡 Recommended Oculus link cable: amzn.to/3rSLzF7 (affiliate link)
👉 My Blog / 🔥 Newsletter (Subscribe to get up to date XR news) blog.learnxr.io
#quest3 #mixedreality #unityvisionOS Development Fundamentals - RealityView Attachments, Systems, & More!Dilmer Valecillos2023-10-10 | This is video #2 of my visionOS development fundamentals series, where we add new features by using RealityView attachments, systems, and custom components.
📌 This video covers additional visionOS SDK topics including: - Structure of RealityViews including Update, Placeholder, and Attachments functions. - Adding a Swift extension method to provide image based lights and image based lights receivers to loaded entities. - Adding an immersive space of type mixed in addition to existing volumetric and full views created in the previous video. - Adding an Orbit System and Orbit Component for circular movement applied to SpaceX capsule. - Using the Preview tag for rapid development during System and Component testing.
Here are the visionOS tutorial project requirements: - Xcode Version 15 beta 8 or greater - visionOS Version 1 beta 3 Simulator Runtime or greater - Reality Composer Pro Version 1.0 (393.3) or greater (bundled with Xcode)
👉 My Blog / 🔥 Newsletter (Subscribe to get up to date XR news) blog.learnxr.io
#visionOS #swiftui #visionproUnreal Engine 5 For Magic Leap 2 Is Here - Build UE5 From Source!Dilmer Valecillos2023-10-04 | Today, we cover how to compile Unreal Engine 5 from source.
The video covers all the dependencies, developer tools, and environment configurations necessary for integrating the NEW Magic Leap 2 Unreal Plugin.
📚 Magic Leap UE5 Plugin Video Chapters: 00:00 - 00:40 - Unreal 5 compile from source with or without ML2 00:40 - 01:20 - Requesting permissions to Unreal Engine GitHub repository 01:20 - 02:14 - Magic Leap 2 Hub installation, Magic Leap Unreal SDK, and Magic Leap Native C SDK 02:14 - 03:45 - Magic Leap 2 Android dependencies setup 03:45 - 05:10 - Compiling and building Unreal Engine 5 from source 05:10 - 10:39 - Magic Leap 2 UE5 project setup (camera / capsule collisions) 10:39 - 15:03 - Adding UE5 Magic Leap 2 controller support 15:03 - 15:43 - Player character animations with ML2 controller inputs 15:43 - 16:16 - Project summary and ML2 dev video recommendation
📙 Magic Leap UE5 Dev Resources: https://developer-docs.magicleap.cloud/docs/guides/unreal/unreal-overview
Also huge thanks to @MagicLeap for sponsoring this ML2 video!