DreamCraft Attractions and Cavu DesignWerks launched 'The Twilight Saga: Midnight Ride' this year at Lionsgate Entertainment World - one of China's leading theme parks - and it features hand tracking from Ultraleap!
Find out more in our case study: ultraleap.com/company/news/case-study/twilight-dreamcraft-ride
A decade of innovation.
A decade of world-leading products and solutions.
A decade of Ultraleap!
Aurora is a virtual space created by Ultraleap that's all about hand interaction. Aurora is made up of three islands, each hosting a different experience for you to explore. Simple tutorials show you how to navigate the world of Aurora without needing to physically move.
Download it now via Steam, the Pico store or at our website!
-------------------------------------------------------------------------
Steam: store.steampowered.com/app/2537360/Aurora/?beta=0
Ultraleap: gallery.leapmotion.com/aurora
This is hand tracking.
Unlock new ways to experience digital content with hand tracking. It’s natural, intuitive, and immersive, and it can help you to:
► Express yourself
► Expand your creativity
► Design with precision
► Train immersively
► Lower barriers to XR entry
Enhance your XR experience like never before with world-leading hand tracking from Ultraleap. Get your Leap Motion Controller 2 today from RobotShop, Schenker, @redboxvr113, @ThingbitsNet, and IPMall.
leap2.ultraleap.com/leap-motion-controller-2
-------------------------------------------------------------------------
▼ More info here ▼
The Leap Motion Controller 2 is the second generation of the iconic hand tracking camera that allows you to use your own hands to interact naturally with digital content in 3D.
It’s the most flexible camera ever developed by Ultraleap and is compatible across platforms (Windows, Android XR2, macOS, and soon Linux) and complementary hardware including VR/MR/AR headsets, PCs, and holographic displays.
Hear from the many companies working with Ultraleap about why this new device is a game changer, why hand tracking is crucial to them and how it helps them to scale.
Timestamps:
00:02 Mohsen Khurasany | Walmart | Senior Manager AR/VR Content
00:16 Taylor Lallas | ManageXR | Product Manager
00:37 Brian Hamilton | Digilens | VP Sales & Marketing
00:56 Nick Syris | Pico XR | Enterprise Sales Director
01:12 Sean Hurwitz | Pixo VR | Founder
01:29 Diane Bowser | Lenovo | AR-VR-XR UX Designer & Innovator
01:56 Josh Franzen | ArborXR | Senior Product Manager
02:11 Robert Murphy | AXON | Director, Virtual & Augmented Reality
02:24 Taylor Lallas | ManageXR | Product Manager
Hear from Pico and their partners about this close partnership, why hand tracking unlocks new use cases, and how it provides a fantastic open solution for the VR market that just works.
Timestamps:
00:06 Jordan Williams | ArborXR | Co-Founder
00:22 Amir Khorram | Pico XR | Managing Director, North America
00:53 Luke Wilson | ManageXR | CEO/Co-Founder
01:09 Nick Syris | Pico XR | Enterprise Sales Director
Key improvements over the original Leap Motion Controller include:
▷ higher resolution cameras
▷ increased field of view
▷ 30% lower power consumption
▷ 30% smaller package for optimum placement and convenience
It’s the most flexible camera ever developed by Ultraleap and is compatible across platforms and complementary hardware including VR/MR/AR headsets, PCs, and holographic displays.
-------------------------------------------------------------------------
▼ More info here ▼
▷ Secure yours now: www2.ultraleap.com/leap2-robotshop
▷ Full press release: ultraleap.com/company/news/press-release/leap-motion-controller-2
————————————————————————————————————————
Timestamps:
► 0:00 Intro
► 0:10 Self-Service Ordering
► 0:20 Experiential Retail
► 0:30 Product Exploration
*Find out more about Ultraleap's hand tracking solutions: ultraleap.com/enterprise/xr
*Buy an Ultraleap hand tracking camera: ultraleap.com/tracking
*Start developing with hand tracking in Unity: developer.leapmotion.com/unity
*Start developing with hand tracking in Unreal Engine: developer.leapmotion.com/unreal
With thanks to @dilmerv and Charmaine Lee!
► Read blog: ultraleap.com/company/news/blog/vr-training
► Download ebook: ultraleap.com/company/news/resources/vr-training-ebook
With thanks to NMY Mixed Reality Studio & Lufthansa Aviation Training
————————————————————————————————
Humans learn best by doing. The ancient Greek philosopher Aristotle wrote that “for the things we have to learn before we can do them, we learn by doing them”. Since then, his point has been proved time and again: “active learning” helps people learn faster, retain information for longer, and fail less often.
Today, VR training is providing firms worldwide with a safe, interactive place for active learning and training. The technology required is tried, tested, and affordable.
In VR training, learners can interact within a 3D virtual environment, and learn procedural tasks or simulate interactions almost as naturally as they would on the job. This helps people to learn faster and remember for longer. Trainers are no longer restricted by location (a big benefit during the pandemic), environment, or risk. In addition, people can learn safely from their mistakes.
Thanks to the technology that’s powering virtual reality, Aristotle’s paradox – the idea that we can only really learn to do something by actually doing it ourselves – has finally become achievable.
Find out how VR could improve your training ROI today.
► Read blog: ultraleap.com/company/news/blog/vr-training
► Download ebook: ultraleap.com/company/news/resources/vr-training-ebook
----------------------------------------------------
* Learn more about TouchFree: ultraleap.com/enterprise/touchless-experiences/touchfree-solution
* Buy an Ultraleap 3Di camera: ultraleap.com/product/ultraleap-3di
* Read our PepsiCo case study: ultraleap.com/company/news/case-study/pepsico-touchless-ordering
----------------------------------------------------
Ultraleap TouchFree is an end-to-end solution for quick and easy deployment of touchless experiences using gesture control. Try it today and find out why consumers rate touchless interaction higher than both touchscreens and mobile solutions.
*Ultraleap 3Di camera: Patented stereo infrared camera designed for connection to public interactive screens.
*TouchFree Application: Reliably tracks a wide variety of hand sizes and shapes and converts hand movements to an on-screen cursor.
* TouchFree Tooling: Integrate touchless as a native feature in minutes, including Web and Unity applications.
----------------------------------------------------
Why go TouchFree?
Gesture control delivers results in both self-serve kiosks and interactive digital signage
*82% think gesture control is hygienic
*66% adoption rate over a touchscreen
*+125% unaided advertisement recall
Learn more about TouchFree: ultraleap.com/enterprise/touchless-experiences/touchfree-solution
----------------------------------------------------------------------
*Read our Q&A with Stan Larroque, CEO of Lynx: ultraleap.com/company/news/case-study/lynx-stan-larroque
*Find out about integrating Ultraleap hand tracking into VR/AR/MR headsets: ultraleap.com/enterprise/xr/oems
*Learn about developing with hand tracking in VR: ultraleap.com/enterprise/xr/software-developers
------------------------------------------------------
* Buy your Ultraleap Hand Tracking Accessory for Pico Neo 3 Pro / Pro Eye today: ultraleap.com/enterprise/pico-neo-3/#pricingandlicensing
* Find out more about the Hand Tracking Accessory: ultraleap.com/enterprise/pico-neo-3/#overview
------------------------------------------------------
An early-access version of the Ultraleap Hand Tracking Accessory is available now for development and proof-of-concept work. The full bundle for scalable deployments will come in summer 2022.
Add Ultraleap hand tracking to the Pico Neo 3 Pro or Pico Neo 3 Pro Eye to unlock high-value use cases and bring in new users of all ages and levels of technical competency. Using hand tracking to supplement gaming controllers expands the VR enterprise market, opening up new possibilities such as flight attendant and surgical training in VR.
* Learn about hand tracking in VR training: ultraleap.com/enterprise/xr/vr-training
* Learn about hand tracking in location-based entertainment: ultraleap.com/enterprise/xr/vr-arcades
Here's a snippet of the interview! Check out the full Q&A here: ultraleap.com/company/news/case-study/lynx-stan-larroque
————————————————————————————————————————
► Find out about hand tracking cameras, headsets, and tooling: ultraleap.com/enterprise/xr
► Download tracking software today: developer.leapmotion.com
With thanks to @mixedrealityTV for the amazing in-headset footage. See more here: youtube.com/watch?v=wrURe07Q1lw
————————————————————————————————————————
End-to-end touchless technology solution for quick and easy development of touchless experiences.
Suitable for retail, QSR, digital out-of-home, museums, and theme parks.
► Buy Ultraleap 3Di camera: ultraleap.com/product/ultraleap-3di
► Download TouchFree software: developer.leapmotion.com/touchfree
► Find out about camera placement and mounting solutions: docs.ultraleap.com/touchfree-user-manual/camera-placement.html
► Learn to design for touchless interfaces: docs.ultraleap.com/touchless-interfaces
85% would use a touchless kiosk again: foodingredientsfirst.com/news/pepsico-unveils-touchless-menu-gesture-technology-targeting-convenience-in-fast-food-outlets.html
Built with Ultraleap TouchFree – an end-to-end solution for touchless interaction. Camera hardware, reliable hand tracking software, and developer tooling.
* Download TouchFree software: developer.leapmotion.com/touchfree
* Ultraleap 3Di hand tracking camera: ultraleap.com/product/ultraleap-3di
* Find out more about designing touchless interfaces: docs.ultraleap.com/touchless-interfaces
Built with Ultraleap TouchFree – an end-to-end solution for touchless interaction. Camera hardware, reliable hand tracking software, and developer tooling.
* Download TouchFree software: developer.leapmotion.com/touchfree
* Buy hand tracking cameras: ultraleap.com/tracking
* Find out more about designing touchless interfaces: docs.ultraleap.com/touchless-interfaces
Download plugin here: developer.leapmotion.com/unreal
Explore Unreal documentation: docs.ultraleap.com/unreal-api
Find out about our plans to make it even easier to develop for Unreal or Unity: ultraleap.com/company/news/blog/unreal-unity-vr-plugins
————————————————————————————————————————
What are we doing with Unreal Engine for VR?
*Our award-winning Interaction Engine is now compatible with Unreal Engine. This is a layer that exists between the game engine and real-world physics, making interaction with virtual objects in VR feel natural, satisfying, and easy to use.
*Our Hands module is also compatible. Easily bind Ultraleap data to your own hand assets, or use our optimized and pre-rigged hand models. See docs.ultraleap.com/unreal-api/unreal-guide/hands-module.html
*We’ve added functionality for UI Input. You can now retrofit Unreal Motion Graphics (UMG) UIs so that they can be interacted with using hand tracking.
*Our Unreal Engine documentation has been overhauled and expanded. It also has a new home on our comprehensive developer resources site - see docs.ultraleap.com
We’d love to hear your feedback on our new Unreal Engine tooling for VR. What do you like? What’s not working so well for you? And what features do you want next?
And don’t forget to share the awesome things you create with us at twitter.com/ultraleap_devs
Happy building!
To find out more visit developer.leapmotion.com/touchfree
--------------
How TouchFree works
TouchFree is a touchless software application that runs on any screen or interactive kiosk.
When combined with an Ultraleap camera module, it provides touchscreen emulation by detecting a user’s hand in mid-air and converting its movements to an on-screen cursor. It's the easiest way to retrofit interactive kiosks with touchless gesture control.
TouchFree runs invisibly on top of existing user interfaces, without the system needing major modifications. You can add touchless interaction to interactive kiosks without writing a single line of code, or changing existing user experience design.
The touchless interface provides familiar, touchscreen-style interactions and supports multiple camera positioning options. Powered by the world's leading hand tracking, it's robust, reliable, and easy to use.
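The core of touchscreen emulation is mapping a mid-air hand position to a screen-space cursor. As a minimal sketch (not the actual TouchFree API — the function name, coordinate ranges, and interaction-zone bounds here are illustrative assumptions), the mapping can be a clamped linear projection from a tracked region in front of the screen to pixel coordinates:

```python
# Hypothetical sketch: map a hand position (metres, camera space) to an
# on-screen cursor (pixels). Ranges below are illustrative assumptions,
# not TouchFree's real calibration values.

def hand_to_cursor(hand_x, hand_y, screen_w=1920, screen_h=1080,
                   x_range=(-0.2, 0.2), y_range=(0.1, 0.5)):
    """Linearly map a hand position within a tracked region to pixels."""
    def norm(v, lo, hi):
        # Normalise to [0, 1] and clamp so the cursor stays on-screen
        return min(max((v - lo) / (hi - lo), 0.0), 1.0)

    px = norm(hand_x, *x_range) * screen_w
    py = (1.0 - norm(hand_y, *y_range)) * screen_h  # screen y grows downward
    return round(px), round(py)
```

A hand in the centre of the tracked region lands in the centre of the screen, and positions outside the region clamp to the nearest edge rather than losing the cursor.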
Find out more about TouchFree here: developer.leapmotion.com/touchfree
Read our design guidelines for touchless interfaces here: docs.ultraleap.com/touchless-interfaces
With thanks to NMY/Lufthansa Aviation Training, Varjo and Autodesk VRED.
————————————————————————————————————————
Download Ultraleap Gemini (5.2.0) for Windows: developer.leapmotion.com/tracking-software-download
Read our blog on why Gemini matters in VR for training: ultraleap.com/company/news/blog/why-gemini-matters
Find out more about our hand tracking solutions for XR: ultraleap.com/enterprise/xr
————————————————————————————————————————
Using hand tracking to supplement or replace controllers expands the XR market. Natural, intuitive interaction brings in new users of all ages and levels of technical competency.
Ultraleap Gemini is the fifth generation of our computer vision model. Its unified neural network is several generations ahead of other hand tracking solutions. Gemini's robust, reliable hand tracking will bring XR, and your product, to the masses.
PC VR and mobile HMDs are both supported.
► Fast initialization: See your hands and start interacting immediately
► Robust for different hand anatomy: Works for everyone
► Interact naturally with two hands, even if one is partly obscured
► Works in a variety of different lighting conditions
► Picks out hands even against cluttered backgrounds
► Flexible for different platforms and camera hardware
Download TouchFree Tooling for Unity here: developer.leapmotion.com/touchfree-tooling-unity
----------------------------
To use TouchFree Tooling you will also require:
• TouchFree 2.0 (developer.leapmotion.com/touchfree)
• Ultraleap hand tracking camera/software (developer.leapmotion.com/get-started)
Find out more about developing touchless kiosk experiences:
• TouchFree user manual (docs.ultraleap.com/touchfree-user-manual)
• Camera placement (docs.ultraleap.com/touchfree-user-manual/camera-placement)
----------------------------
How TouchFree Interactive Kiosk Software Works
*TouchFree Application
TouchFree Application runs invisibly on top of existing user interfaces. You can retrofit systems without needing to modify your existing application’s user interface, or changing a single line of code. The interactive kiosk software provides a cursor that users control touchlessly and communicates with Windows’ input system to control your application.
*TouchFree Tooling
TouchFree Tooling allows Web and Unity developers to connect applications and content to TouchFree interactive kiosk software and use its hand positioning data directly. With it, you can move beyond retrofits and design self-serve kiosks or interactive digital signage with touchless interaction as a native feature.
*TouchFree Service
TouchFree Service is the engine of TouchFree and is required to power both TouchFree Application and applications integrated with TouchFree Tooling. The TouchFree Service interactive kiosk software converts hand movements, provided by Ultraleap’s hand tracking cameras, into positional data. TouchFree Application or your own kiosk application can use this to represent a cursor on-screen.
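The three-part split above amounts to a publish/subscribe pipeline: the Service converts camera frames into cursor events, and either the stock Application or your own Tooling-based client consumes them. This toy sketch illustrates that data flow only — the class, method names, and event shape are invented for illustration and are not the real TouchFree Service protocol:

```python
# Illustrative sketch of the Service -> client event flow.
# Names and event fields are hypothetical, not the TouchFree API.

class TouchFreeServiceSketch:
    def __init__(self):
        self._subscribers = []

    def subscribe(self, callback):
        """Tooling clients register to receive cursor events."""
        self._subscribers.append(callback)

    def on_hand_frame(self, x_px, y_px, is_push):
        """Camera frame in -> cursor event out, fanned out to all clients."""
        event = {"x": x_px, "y": y_px, "type": "DOWN" if is_push else "MOVE"}
        for cb in self._subscribers:
            cb(event)

events = []
service = TouchFreeServiceSketch()
service.subscribe(events.append)        # e.g. your own kiosk application
service.on_hand_frame(400, 300, False)  # hand hovering over the screen
service.on_hand_frame(400, 300, True)   # hand pushes forward: a "click"
```

The point of the design is that retrofits (TouchFree Application) and native integrations (your app via Tooling) are just different subscribers to the same Service.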
Download TouchFree Tooling for Web here: developer.leapmotion.com/touchfree-tooling-web
----------------------------
• Download TouchFree 2.0 (developer.leapmotion.com/touchfree)
• TouchFree user manual (docs.ultraleap.com/touchfree-user-manual)
• TouchFree Tooling for Web and Unity (developer.leapmotion.com/touchfree-tooling)
• Camera placement (docs.ultraleap.com/touchfree-user-manual/camera-placement)
• Ultraleap Developer Resources (docs.ultraleap.com)
----------------------------
System Requirements
• Ultraleap Hand Tracking Camera (ultraleap.com/tracking)
• Our latest Hand Tracking Software (developer.leapmotion.com/tracking-software-download)
• Windows® 10, 64-bit
• Minimum Intel HD Graphics 530
• Intel® Core i3 or higher
• 4 GB RAM
• USB 2.0 port
NB: TouchFree is only compatible with Windows
————————————————————————————————————————
According to Delta Air Lines, it takes six weeks to train flight attendants, using full-scale dummy aircraft. Flight attendants have to retrain every 18 months. (youtube.com/watch?v=a8lbitoBhP4)
Substituting VR training for some of this would reduce costs. But if learners interact using VR controllers, this builds muscle memory of an abstract set of button pushes – not of actions flight attendants would do in the real world.
In contrast, Ultraleap's advanced Gemini hand tracking enables flight attendants to perform actions in VR training that closely mimic those they would do in reality.
Find out more about how hand tracking enables adoption of VR for training: (ultraleap.com/company/news/blog/vr-for-training)
————————————————————————————————————————
How we made the VR for training flight attendant demo:
Our demo is made with Unity 2020 and uses the Ultraleap Hand Tracking Unity package. (developer.leapmotion.com/unity)
The physics-based interactions within this demo are built using Unity’s native physics engine in combination with the Ultraleap Interaction Engine module (developer.leapmotion.com/releases/interaction-engine-120). This is a publicly available SDK that allows for hand-based physical interactions to work as you would expect them to in the real world.
To create a further level of immersion in this demo, we utilized Unity’s new articulation body feature (docs.unity3d.com/Manual/class-ArticulationBody.html) to create what we call "Articulation Hands". These allow your virtual hands to behave as if they were solid and will naturally deform around physical objects in the airplane.
Together, this creates a world that feels more real, behaves a bit more like you would expect the world to, and greatly enhances the immersiveness of the experience.
Read our interactive DOOH blog: ultraleap.com/company/news/blog/dooh-interactivity-study
Download interactive DOOH whitepaper: ultraleap.com/company/news/resources/dooh-interactivity-whitepaper
————————————————————————————————————————
Interactive DOOH study background:
Conversion ratio, dwell time, and attention time were taken from anonymous audience tracking data captured with Quividi digital signage analytics software, measuring audience response to side-by-side touchless interactive DOOH movie posters and static movie poster advertising. Data is averaged across three film posters (Detective Pikachu, Shazam! and The Curse of La Llorona), each with approximately a 3-week run.
Brand favourability, ad recall, and the directional estimate of sales uplift come from an independent study conducted by ERm Research, measuring 316 moviegoers’ responses to advertising for the movie Pokemon: Detective Pikachu at the same cinema. The test group walked past touchless interactive DOOH advertising Detective Pikachu on the way to their movie; the control group walked past static DOOH advertising Detective Pikachu.
The directional estimate of sales uplift is calculated as the uplift in purchase intent (17% more of the test group were interested in seeing the movie) x the average number of impressions per screen over the 3-week run (9,714) x the average cinema ticket price ($9.11).
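The uplift formula described above is simple enough to reproduce directly. Using the figures quoted in the study background (17% purchase-intent uplift, 9,714 impressions per screen, $9.11 average ticket price):

```python
# Directional sales-uplift estimate, using the figures quoted in the
# study description above.
purchase_intent_uplift = 0.17    # 17% more of the test group interested
impressions_per_screen = 9714    # average over the 3-week run
avg_ticket_price = 9.11          # USD

uplift_estimate = (purchase_intent_uplift
                   * impressions_per_screen
                   * avg_ticket_price)
print(f"Directional sales uplift estimate: ${uplift_estimate:,.2f}")
```

This works out to roughly $15,000 of estimated uplift per screen over the three-week run — directional only, since it chains a survey result with an impression count.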
Use with our tracking software for fast, robust, and reliable hand tracking.
————————————————————————————————————————
For more details, including which camera module is best for you and where to buy, visit ultraleap.com/tracking/.
Want to stay in the loop on the latest Ultraleap updates? Sign up to our newsletter ultraleap.com/newsletter-sign-up
Watch this demo of the setup process to get started with your hand-tracked XR keyboard in Unity.
Want to stay in the loop on the latest Ultraleap updates? Sign up to our newsletter by visiting ultraleap.com/newsletter-sign-up
————————————————————————————————————————
What you need:
*Our latest Unity Modules Package (including our Core Unity Module and Interaction Engine) - developer.leapmotion.com/unity
*NaughtyAttributes - github.com/dbrizov/NaughtyAttributes
*TextMeshPro - docs.unity3d.com/Packages/com.unity.textmeshpro@3.0/manual/index.html
The XR Keyboard was built with Unity 2019.4.18f1 and Leap Motion Unity Modules 4.8.0.
Set Up Your Keyboard In Four Steps:
*Ensure you have Leap and the Interaction Engine set up in your scene
*Place the QwertyKeyboard prefab into the scene
*Set Up Input Fields. Add TMPInputFieldTextReceiver to any text fields you want the keyboard to input to (note: only compatible with TextMeshPro InputFields)
*Check it works and start customizing!
Find out more: docs.ultraleap.com/ultralab
Read about our design process and get everything you need to add a hand tracking keyboard in Unity - docs.ultraleap.com/ultralab/xr-keyboard
Want to stay in the loop on the latest Ultraleap updates? Sign up to our newsletter by visiting ultraleap.com/newsletter-sign-up
————————————————————————————————————————
We’ve built a hand tracking keyboard that's robust, easy to use, and designed for the untrained end user.
Removing peripherals to achieve direct physical interaction in VR and AR is key to natural-feeling input. But text entry is still a necessary part of any UI and a common point of frustration in VR. At some point, users will need to quickly enter a password or a username without wanting to learn a whole new typing paradigm.
Taking inspiration from both the physical and digital worlds, we wanted to create a solution that exists in neither but retains the familiarity of both.
We’ve built a keyboard prefab for Unity so you can design your own keyboard, and have it work seamlessly with our hand tracking (it will even work on our new Gemini platform).
What you get is a robust and reliable basic open source keyboard that retains excellent usability no matter what you do (and there’s a lot you can do with it).
Find out more: docs.ultraleap.com/ultralab
We look at what it took to build the multi-sensory interface of the future.
Want to stay in the loop on the latest Ultraleap updates? Sign up to our newsletter by visiting ultraleap.com/newsletter-sign-up
————————————————————————————————————————
At Ultraleap we’ve always been focused on pushing human-computer interfaces forward.
In 2017 we partnered with the Universities of Glasgow, UCL, Bayreuth, and Chalmers University of Technology on an EU-funded project to research whether it was possible to make two-dimensional digital information into something tangible and three-dimensional. The result was a prototype that uses ultrasonic levitation. The machine flies particles around so fast that 3D objects seem to materialise in mid-air.
This apparent triumph over the laws of physics is the bleeding edge of technology.
Using ultrasonic speakers to levitate small particles is the stuff of make-at-home science projects. But the true potential of ultrasonic levitation opens up extraordinary possibilities for blending the digital and physical worlds.
Find out more: docs.ultraleap.com/ultralab
The Levitate Project: levitateproject.org
Find out more about how hand tracking unlocks enterprise use cases in spatial computing: varjo.com/blog/how-hand-tracking-unlocks-enterprise-use-cases-guest-post-by-ultraleap
Join our Gemini Developer Preview: developer.leapmotion.com/gemini-v5-preview
Download the demo here: gallery.leapmotion.com/touchless-hotel-check-in
---------------------------
With hygiene top of guests' minds, more hotels than ever are turning to touchless check-in.
Our downloadable Touchless Hotel Check-in demo shows how an intuitive, easy-to-use interactive kiosk operated by touchless gesture control can be created in practice.
Touchless Hotel Check-In Demo - Key Features
*Quick and intuitive "Air Push" interaction style
*Customers can choose whether to operate the interactive kiosk using gesture control or traditional touchscreen buttons.
*With our best-practice user onboarding guidance, guests can typically learn how to use a touchless interactive kiosk in a few seconds (Read more here: docs.ultraleap.com/touchless-interfaces/instructional-information)
*Hand movements can be reliably detected up to 75 cm away from the surface
*Powered by Ultraleap’s world-leading hand tracking hardware and software – fast, robust, and accurate
*Runs using the TouchFree application. This can be used to retrofit existing interactive kiosks with touchless gesture control. It runs invisibly on top of existing user interfaces, allowing you to add touchless capability without writing a single line of code or changing the current user interface.
Download the demo here: gallery.leapmotion.com/touchless-hotel-check-in
To answer these questions, this lecture, aimed at developers and designers, takes you behind the scenes at Ultraleap.
The lecture was given in October 2020 as part of the XR Bootcamp Hand Interactions Pro Event Series. Watch it here to get expert insight from our engineers.
Video courtesy of XR Bootcamp: xrbootcamp.com/
youtube.com/channel/UCTM0Y8sc5YHIhOXNXT11MBw
-------------------------------
SPEAKERS:
John Selstad (Principal Software Engineer)
John covers how to get started with the Ultraleap Interaction Engine and core development tools. John also discusses interacting with objects at a distance.
Chris Wren (Global Applications Team Lead)
Chris takes you behind the scenes of our Crystal Cave multiplayer room-scale demo. Learn more about social interactions, connecting users with each other and the environment, and creating a sense of presence.
Hannah Limerick (User Research and Insights)
Test, test, test! Hannah shares testing insights from our VR autonomous vehicle experience. You’ll learn about the value of user testing and how users are always surprising you.
This deep dive into the future of hand interaction technology was given in October 2020 as part of the XR Bootcamp Hand Interactions Pro Event Series.
Tom gives us his unique perspective on the adoption of hand tracking in XR and OpenXR, plus a Q&A session covering accessibility, touch-free screens, enterprise use cases, and automotive. Tom also explains the key advantages of Ultraleap camera modules over other providers.
Video courtesy of XR Bootcamp: xrbootcamp.com
youtube.com/channel/UCTM0Y8sc5YHIhOXNXT11MBw
To find out more visit: ultraleap.com/company/news/blog/touchless-interactions
--------------
At Ultraleap we know a lot about the many different ways to perform mid-air interactions. So we undertook extensive user testing to find the best, most effective interaction for use with touchless interfaces.
Air Push was the result. Listen to Greg explain what goes on behind the scenes to make Air Push so robust while feeling natural and intuitive.
TouchFree detects hand movements in mid-air and converts them to an onscreen cursor. People can continue to use interactive screens without actually touching them.
--------------
In 2021 people are looking for new ways to interact without touching surfaces – for transport stations, quick-service restaurants, ATMs, elevators, and more.
But how will users adapt to this new way of using a familiar technology? We think the technology should adapt to them. Which is why we developed Air Push, a way of interacting touchlessly that is robust, flexible, and intuitive to use.
We’ve thought about every step in the functional design of touchless kiosks. With our user-centred design process, we’ve engineered robustness in.
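One common way to engineer robustness into a push-to-click gesture is hysteresis: fire only when the hand crosses a virtual touch plane while moving towards the screen, then refuse to fire again until the hand has retreated past a second, farther plane. This sketch is a deliberately simplified illustration of that idea — the class, thresholds, and logic are assumptions for the example, not Ultraleap's actual Air Push design:

```python
# Simplified sketch of a hysteresis-based push detector. Thresholds and
# structure are illustrative; the real Air Push design is more involved.

class PushDetectorSketch:
    def __init__(self, touch_plane_m=0.10, rearm_plane_m=0.15):
        self.touch_plane = touch_plane_m  # distance that counts as a "touch"
        self.rearm_plane = rearm_plane_m  # must retreat past this to re-arm
        self.armed = True
        self.prev_distance = None

    def update(self, distance_m):
        """Feed the hand's current distance to the screen; True on a click."""
        clicked = False
        moving_towards = (self.prev_distance is not None
                          and distance_m < self.prev_distance)
        if self.armed and moving_towards and distance_m <= self.touch_plane:
            clicked = True
            self.armed = False  # ignore jitter until the hand retreats
        elif not self.armed and distance_m >= self.rearm_plane:
            self.armed = True
        self.prev_distance = distance_m
        return clicked

det = PushDetectorSketch()
# A push, some jitter near the screen, a retreat, then a second push:
clicks = [det.update(d)
          for d in [0.30, 0.20, 0.12, 0.08, 0.09, 0.08, 0.20, 0.07]]
```

The two-plane gap is what makes the interaction forgiving: small tremors around the touch plane produce exactly one click instead of a burst of them.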
--------------
Read more
ultraleap.com/company/news/blog/touchless-public-touchscreens
ultraleap.com/company/news/blog/touchless-ticket-machine
ultraleap.com/company/news/blog/touchless-self-ordering-kiosks
Find out more about our world-leading hand tracking hardware and software here: ultraleap.com/tracking
Take a look at our XR guidelines for tips on how to create amazing interactions: docs.ultraleap.com/xr-guidelines
Video courtesy of Ocean Outdoor: oceanoutdoor.com
Find out more about Ultraleap’s technology in digital out-of-home: ultraleap.com/enterprise/out-of-home
------------------
LEGO’s first brand campaign for 30 years championed the power of children’s creativity by inviting them to Rebuild the World. For the 2020 campaign, Ocean Labs integrated Ultraleap’s touchless gesture control and “virtual touch” haptics with a premium digital out-of-home screen.
The interactivity was completely contactless. Children were able to create LEGO builds using intuitive hand gestures, and without touching any surfaces.
Immersive technology:
*STRATOS Inspire haptic module (ultraleap.com/product/stratos-inspire)
*Ocean Outdoor premium curved full-motion screen, Four Dials, Westfield Stratford City, London, UK
Circle and line haptic sensations were combined and animated to create a variety of tactile effects. The installation also included sound effects to match different sensations.
The results
*Live for one weekend in October 2020
*Drove strong engagement with the LEGO brand, with users interacting with the installation for extended periods
*Outstanding example of the “honeypot” effect in action. This is when passers-by are attracted by people engaging in an interactive experience.
*Coverage in Campaign, Graphic Display World and TrendHunter
*Winner: Best Creative Idea Over £250k in Insight & Innovation Category, Campaign Media Awards
*Winner: Bronze in Outdoor Experience Category, Campaign Experience Awards
Download the demo here: developer.leapmotion.com/touchfree/#touchfree-demo
---------------------------------------
COVID-19 has significantly changed consumers’ perceptions of the hygiene risks of using self-serve kiosks. Given the option, QSR customers prefer touchless gesture control to touchscreens, mobile apps, or going to the counter. Read more here: ultraleap.com/company/news/blog/touchless-public-touchscreens
Our downloadable Touchless Self-Ordering demo shows how a self-ordering kiosk operated by touchless gesture control can be created in practice, and with minimal impact on user journey time.
---------------------------------------
Touchless Self-Ordering Demo – Key Features
▷Quick and intuitive “Air Push” interaction style
▷Customers can choose whether to use gesture control or traditional touchscreen buttons
▷Includes a “Call to Interact” animation to help users understand touchless interaction quickly
▷Hand movements can be reliably detected up to 75 cm away from the surface
▷Uses Ultraleap’s world-leading hand tracking hardware and software – fast, robust, and accurate
▷Runs using the TouchFree application. This can be used to retrofit existing kiosks with touchless gesture control. It runs invisibly on top of existing user interfaces, allowing you to add touchless capability without writing a single line of code or changing the current user interface.
Download the demo here: developer.leapmotion.com/touchfree/#touchfree-demo
Download the demo here: bit.ly/2Wk7vW5
--------------
While over 80% of people think that public touchscreens are unhygienic, many consumers are also reluctant to use mobile app/second screen solutions. Gesture control can be used to create a touchless experience close to that of using a touchscreen.
Find out more about our study into consumer attitudes to public touchscreens and touchless interfaces here: bit.ly/3gU5A4g
--------------
Ultraleap’s Leap Motion Controller or Stereo IR 170 camera modules allow passengers to collect pre-booked tickets and check train times, simply by moving their hand in the air.
Our TouchFree application can be used to retrofit existing ticket vending machines and other interactive kiosks with touchless gesture control. It runs invisibly on top of existing user interfaces, allowing you to add touchless interaction without writing a single line of code.
Touchless Ticketing Machine Demo – Key Features
▷ Simple, intuitive “Air Push” interaction
▷ Hand movements can be reliably detected up to 75cm away from the surface
▷ Demonstrates both keyboards (e.g. for typing in a collection code) and simpler button-style interactions
▷ Uses Ultraleap’s world-leading hand tracking hardware and software – fast, robust, and accurate
▷ Works in a range of lighting conditions
▷ Includes a “Call to Interact” animation to help users get to grips with touchless interaction quickly
You can download the demo here: bit.ly/2Wk7vW5
Find out more here: bit.ly/2IXJIs6
To find out more visit developer.leapmotion.com/touchfree
--------------
How TouchFree works
TouchFree is a touchless software application that runs on an interactive kiosk or advertising totem.
When combined with an Ultraleap camera module, it provides touchscreen emulation by detecting a user’s hand in mid-air and converting its movements to an on-screen cursor. It's the easiest way to retrofit touchscreens with touchless gesture control.
TouchFree is designed to run invisibly on top of existing user interfaces, without the system needing major modifications. You can add touchless interaction without writing a single line of code, or changing existing user experience design.
The touchless interface provides familiar, touchscreen-style interactions and supports multiple camera positioning options. Powered by the world's leading hand tracking, it's robust, reliable, and easy to use.
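As a rough illustration of the touchscreen-emulation idea, the core mapping can be sketched in a few lines. This is a minimal sketch with made-up function names, units, and thresholds, not TouchFree's actual code: a tracked hand position in front of the screen is mapped to a cursor, and a forward push past a distance threshold is treated as a click.

```python
# Illustrative sketch (not Ultraleap's implementation): map a tracked
# hand position in front of a screen to a cursor position, and treat
# a forward "push" past a distance threshold as a click.

def hand_to_cursor(hand_xy_mm, screen_px, zone_mm):
    """Linearly map a hand's (x, y) position in mm within the
    interaction zone to pixel coordinates on the screen."""
    x_mm, y_mm = hand_xy_mm
    w_px, h_px = screen_px
    w_mm, h_mm = zone_mm
    # Clamp to the interaction zone, then scale to pixels.
    x = min(max(x_mm, 0.0), w_mm) / w_mm * w_px
    y = min(max(y_mm, 0.0), h_mm) / h_mm * h_px
    return (round(x), round(y))

def is_push(distance_mm, threshold_mm=100.0):
    """A 'push' fires when the hand moves closer to the screen
    than the threshold distance (threshold value is illustrative)."""
    return distance_mm < threshold_mm

# A hand in the middle of a 400 x 300 mm zone lands mid-screen.
cursor = hand_to_cursor((200.0, 150.0), (1920, 1080), (400.0, 300.0))
```

In a real deployment this mapping also needs smoothing and hysteresis so the cursor doesn't jitter and a single push doesn't register twice.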
Find out more about TouchFree here: developer.leapmotion.com/touchfree
Read our design guidelines for touchless interfaces here: docs.ultraleap.com/touchless-interfaces
With thanks to Groupe PSA.
Find out more about Ultraleap's haptics hardware and software here: ultraleap.com/haptics
-------------------------------------------------------------------------
▼ Here's how Ultraleap's haptic technology works in a bit more detail ▼
1. It all starts with one small ultrasound speaker. This small speaker emits ultrasound waves at a frequency too high for you to hear.
2. We put lots of these speakers together to create an array. Every ultrasound speaker in the array can be individually controlled.
3. Using our patented algorithms, the ultrasound speakers are triggered with very specific time differences. These time differences mean the ultrasound waves arrive at the same point in space, at the same time.
4. The place where all the ultrasound waves coincide is called the focal point.
5. Where the focal point is positioned in 3D space is programmable in real time. It can change position from instant to instant.
6. We use a hand tracking device (usually a Leap Motion Controller or Stereo IR 170 Camera Module) to track the exact position of your hand and position the focal point at a spot on it.
7. The combined ultrasound waves have enough force to create a tiny dent on your skin. We use this pressure point to create a vibration that touch sensors in your hands can detect.
8. By moving the pressure points around, we use them to create a wide range of tactile effects – from sculpting virtual lines and shapes to forming 3D controls in mid-air.
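The timing trick in steps 3 and 4 can be sketched in a few lines. This is an illustrative model, not Ultraleap's patented algorithms: each speaker's firing delay is derived from its time of flight to the focal point, so that waves from every speaker arrive there at the same instant.

```python
# Illustrative sketch (names and values are assumptions, not
# Ultraleap's algorithm): compute per-speaker firing delays so that
# ultrasound waves from every array element coincide at a focal point.
import math

SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 degrees C

def firing_delays(speaker_positions, focal_point):
    """Return a delay in seconds for each speaker such that all
    waves arrive at the focal point simultaneously.

    Speakers farther from the focal point must fire earlier, so each
    delay is measured relative to the farthest speaker, which fires
    at t = 0."""
    times = [math.dist(p, focal_point) / SPEED_OF_SOUND
             for p in speaker_positions]
    latest = max(times)  # time of flight from the farthest speaker
    return [latest - t for t in times]

# A tiny 1D "array" of three speakers along the x axis (positions in
# metres), focusing 20 cm above the middle speaker.
speakers = [(-0.01, 0.0, 0.0), (0.0, 0.0, 0.0), (0.01, 0.0, 0.0)]
delays = firing_delays(speakers, (0.0, 0.0, 0.2))
```

By symmetry the two outer speakers fire together at t = 0, while the middle speaker, being slightly closer to the focal point, fires fractionally later. Re-running this calculation per frame is what lets the focal point move in real time (step 5).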
-------------------------------------------------------------------------
▼ Subscribe to our channel ▼
Stay tuned for more from Ultraleap → http://bit.ly/2rgEEAZ
Find out more about our world-leading hand tracking hardware and software here: ultraleap.com/tracking
--------------------------------------
HAND TRACKING HARDWARE
From a hardware perspective, hand tracking is relatively simple. The heart of a device is two cameras and some infrared LEDs. The cameras track infrared light at a wavelength of 850 nanometers, which is outside the visible light spectrum. The LEDs pulse in sync with the camera framerate, allowing for significantly lower power use and increased intensity.
Ultraleap’s hand tracking modules, the Leap Motion Controller (ultraleap.com/product/leap-motion-controller) and the Stereo IR 170 Camera Module, work on this principle, as do VR/AR headsets built using Qualcomm’s XR2 reference designs. (ultraleap.com/company/news/press-release/qualcomm-snapdragon-xr2)
Wide-angle lenses are used to create a large interaction zone within which a user’s hands can be detected. The Leap Motion Controller hand tracking module has an interaction zone that extends up to 60 cm (24”) or more from the device, with a 140x120° typical field of view. The Stereo IR 170 Camera Module has an even larger interaction zone, extending from 10 cm (4”) to 75 cm (29.5”) or more, with a 170x170° typical field of view (160x160° minimum).
The interaction zone, created by the intersection of the binocular cameras’ fields of view, takes the form of an inverted pyramid for the Leap Motion Controller and an inverted cone-like shape for the Stereo IR 170.
The range is limited by LED light propagation through space, since it becomes much harder to infer your hand’s position in 3D beyond a certain distance. LED light intensity is ultimately limited by the maximum current that can be drawn over the USB connection.
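The interaction zone described above can be modelled as a simple angular test. This is an illustrative simplification with assumed coordinate conventions, not the device's real calibration: a point counts as inside the zone if it is in front of the device, within range, and within half the field of view in each plane.

```python
# Illustrative geometry sketch (a simplified model, not the device's
# actual calibration): test whether a point lies inside an interaction
# zone defined by horizontal/vertical fields of view and a max range.
import math

def in_interaction_zone(point, fov_h_deg=140.0, fov_v_deg=120.0,
                        max_range_m=0.60):
    """point = (x, y, z) in metres, with the device at the origin
    and z pointing straight up out of the device."""
    x, y, z = point
    if z <= 0:
        return False  # behind or on the device
    if math.dist((0.0, 0.0, 0.0), point) > max_range_m:
        return False  # beyond tracking range
    # The angle off the central axis in each plane must stay inside
    # half the field of view.
    ang_h = math.degrees(math.atan2(abs(x), z))
    ang_v = math.degrees(math.atan2(abs(y), z))
    return ang_h <= fov_h_deg / 2 and ang_v <= fov_v_deg / 2

# A hand 30 cm above the device, slightly off-centre, is in the zone.
inside = in_interaction_zone((0.10, 0.05, 0.30))
```

The default parameters match the Leap Motion Controller figures quoted above; swapping in 170x170° and 0.75 m would approximate the Stereo IR 170.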
At this point, the hand tracking device’s USB controller reads the sensor data into its own local memory and performs any necessary resolution adjustments. This data is then streamed via USB to Ultraleap’s tracking software.
The data from the hand tracking device takes the form of a grayscale stereo image of the near-infrared light spectrum, separated into the left and right cameras. Typically, the only objects you’ll see are those directly illuminated by the device’s LEDs. However, incandescent light bulbs, halogens, and daylight will also light up the scene in infrared. You might also notice that certain things, like cotton shirts, can appear white even though they are dark in the visible spectrum.
--------------------------------------
HAND TRACKING SOFTWARE
Once the image data is streamed to your computer, it’s time for some heavy mathematical lifting. Despite popular misconceptions, our hand tracking platform doesn’t generate a depth map – instead it applies advanced algorithms to the raw sensor data.
The Leap Motion Service is the software on your computer that processes the images. After compensating for background objects (such as heads) and ambient environmental lighting, the images are analyzed to reconstruct a 3D representation of what the device “sees”.
Next, the tracking layer matches the data to extract tracking information such as fingers. Our hand tracking algorithms interpret the 3D data and infer the positions of occluded objects. Filtering techniques are applied to ensure smooth temporal coherence of the data. The Leap Motion Service then feeds the results – expressed as a series of frames, or snapshots, containing all of the tracking data – into a transport protocol.
Through this protocol, the service communicates with the Leap Motion Control Panel, as well as native and web client libraries, through a TCP or WebSocket connection. The client library organizes the data into an object-oriented API structure, manages frame history, and provides helper functions and classes. From there, the application logic ties into the Leap Motion input, allowing a motion-controlled interactive experience.
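The client library's job described above, organising frame data into objects and managing frame history, can be sketched roughly. Field names and structure here are assumptions for illustration, not the actual Leap Motion wire format.

```python
# Illustrative sketch (field names are assumptions, not the real
# Leap Motion wire format): parse a JSON tracking frame into simple
# objects and keep a bounded frame history, as a client library might.
import json
from collections import deque

class Hand:
    def __init__(self, data):
        self.id = data["id"]
        self.palm_position = tuple(data["palmPosition"])  # (x, y, z) mm

class Frame:
    def __init__(self, data):
        self.id = data["id"]
        self.timestamp = data["timestamp"]
        self.hands = [Hand(h) for h in data.get("hands", [])]

class FrameHistory:
    """Keep the most recent N frames, newest first."""
    def __init__(self, maxlen=60):
        self._frames = deque(maxlen=maxlen)

    def push(self, raw_json):
        self._frames.appendleft(Frame(json.loads(raw_json)))

    def frame(self, history=0):
        """history=0 is the latest frame, 1 the one before, etc."""
        return self._frames[history]

# Feed in one hypothetical frame payload, as if received over the
# service's WebSocket connection.
payload = json.dumps({
    "id": 1, "timestamp": 123456,
    "hands": [{"id": 7, "palmPosition": [10.0, 200.0, -15.0]}],
})
history = FrameHistory()
history.push(payload)
```

Application logic would then poll `history.frame()` each render tick, which is essentially the "frames, or snapshots" model the text describes.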
@VarjoTechnologies jumped at the challenge and created this demo, combining hand tracking and eye tracking, showing how it works and exploring the potential of human interaction in mixed reality.
This video contains unmodified XR-1 footage.
Read more about Varjo's revolutionary video pass-through mixed reality that has solved the 'hard AR' problem: varjo.com/xr-1
varjo.com/blog/video-pass-through-xr-changes-reality-as-you-know-it
No controllers. No wearables. Just natural spatial interaction.
To find out more about the most advanced VR device for professionals: www.varjo.com/products/vr-2-pro
--------------------------------
First prize: Touchless.Design
Ideum
The Ideum team created an integrated hardware and software solution for touch displays and touch tables in museums. The system includes a Leap Motion Controller, a 3.5-ft display, and LED lights that work in concert with custom cursors.
The interaction was, unusually, designed for a horizontal screen. It used novel gestures, and the use of LED lighting for additional feedback was great. This got a big “wow” from us – and a deserved first prize.
Full video: youtu.be/apu0_l-zF6g
Open source code: github.com/ideum/Touchless.Design-Ultraleap-Beyond-Touch
--------------------------------
Second prize: Soulful Bowl with Touchless Control
Patrick Saalfeld and Danny Schott
Customize your own bowl to the level of single ingredients. Instead of tapping on a small, pre-defined selection of ingredients via a touchscreen, the customer can grab, prepare, and drop single ingredients in a flexible, touchless way. Turning the user into the chef and making ordering food in a self-serve restaurant an immersive experience is very cool.
Full video: vimeo.com/436924741
Open source code: bitbucket.org/patrick_/soulful-bowl-with-touchless-control/src/master
--------------------------------
Runner-up: Vendoor
Ashish Bakshi, Cesar de Castro, Gabriel Santa Maria, Kavya Bakshi
Vendoor is a touchless walk-up storefront solution designed for small businesses. The interface is built around physical metaphors. Using it is more akin to hand-crafting a bespoke item than ordering off a generic point-of-sale machine. It’s also difficult to think of a more fun product for this application than unicorn ice creams!
Full video: youtube.com/watch?v=t7S9YgHR8Dk
Open source code: github.com/gabexr/Vendoor
--------------------------------
Runner-up: Touchless Elevator Concept
Tanay Singhal & Mahika Phutane
Tanay and Mahika’s concept for a touchless elevator includes not only hand tracking but also mid-air haptics. We loved its inclusivity: the design includes both touchless tactile braille and audio feedback. There are intuitive gestures for opening/closing doors, and button magnification on hover for improved accuracy.
Full video: youtube.com/watch?v=wYpcyULBc30
Open source code: github.com/TanaySinghal/Touchless-Elevator
A recent study showed that in March 2020, 71 people in China were infected with COVID-19 as the result of one elevator trip. With 18 billion elevator trips per year in the United States alone, elevator buttons are a major shared surface with potentially significant risks. A majority of people also think that touching shared surfaces in public places is unhygienic.
While elevator controls aren’t complex in nature, they need to be truly intuitive, especially for customers with additional needs. This presents an interesting challenge for our touchless interaction design.
More information:
wwwnc.cdc.gov/eid/article/26/9/20-1798_article
Research by Ultraleap in late April and early May showed that only 12% of consumers thought that public touchscreens were hygienic.
See: ultraleap.com/company/news/press-release/end-of-public-touchscreens
www.media4growth.com/videos/creating-immersive-engaging-memorable-ooh-experiences-281
Featuring:
Catherine Morgan, Director, Ocean Labs, Ocean
Anders Hakfelt, SVP Product & Marketing, Ultraleap
Stephen Lepitak, Executive Editor, The Drum
There are an estimated 3 million ATMs in the world, but COVID-19 has brought widespread concerns among consumers about the health risks of public touchscreens and buttons. Ultraleap’s ATM demo shows how a common, well-understood interaction can be quickly moved from touchscreens or buttons into mid-air without creating user interaction issues. This safe and clean mid-air interaction will help to restore consumer confidence in public environments.
Download demo: bit.ly/2OfFM4I
This is an Interaction In Progress video. These videos show you behind the scenes at Ultraleap and a little snippet of what we are working on.