Acoustic Levitator DIY - TinyLev - levitate liquids and insects at home
UpnaLab 2017-08-14 | Use acoustic waves to hold samples such as water, ants or tiny electronic components in mid-air. This technology was previously restricted to a handful of research labs, but now you can build it at home.
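As a back-of-the-envelope sketch of the physics (not part of the original build instructions): a TinyLev-style levitator traps particles at the pressure nodes of a standing wave between two opposed 40 kHz arrays, and those nodes sit half a wavelength apart. The 6 cm gap used below is an illustrative assumption.

```python
# Sketch: approximate trap (pressure-node) positions in a 40 kHz standing wave.
# Assumes two opposed emitters facing each other across `gap` metres.

SPEED_OF_SOUND = 343.0  # m/s in air at ~20 C
FREQ = 40_000.0         # Hz, typical TinyLev transducer frequency

def node_positions(gap: float) -> list[float]:
    """Return approximate trap positions along the axis, in metres."""
    wavelength = SPEED_OF_SOUND / FREQ   # ~8.6 mm at 40 kHz
    spacing = wavelength / 2.0           # nodes every half wavelength (~4.3 mm)
    n_nodes = int(gap / spacing)
    # Centre the stack of nodes inside the gap.
    start = (gap - (n_nodes - 1) * spacing) / 2.0
    return [start + i * spacing for i in range(n_nodes)]

print(len(node_positions(0.06)))  # a 6 cm gap fits 13 traps
```

This is why the levitator can hold several droplets or insects at once: each half-wavelength node is an independent trap.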
In this project, we create artificial piloerection using contactless electrostatics to induce tactile sensations without touch. First, we design several high-voltage generators and evaluate their static charge, safety and frequency response with different electrodes and grounding strategies. Second, a psychophysics user study revealed which parts of the upper body are most sensitive to electrostatic piloerection and which adjectives are associated with them. Finally, we combine an electrostatic generator that produces artificial piloerection on the nape with a head-mounted display; this device provides an augmented virtual experience related to fear. We hope this work encourages designers to explore contactless piloerection for enhancing experiences such as music, short movies, video games, or exhibitions.

Complex Selective Manipulations of Thermomagnetic Programmable Matter | Paper
UpnaLab 2022-12-13 | Link to paper: 10.1038/s41598-022-24543-5

Programmable matter can change its shape, stiffness or other physical properties on command. Previous work has shown contactless optically controlled matter or magnetic actuation, but the former is limited in strength and the latter in spatial resolution. Here, we show an unprecedented level of control by combining light patterns and magnetic fields. A mixture of thermoplastic and ferromagnetic powder is heated at specific locations, which become malleable and are attracted by magnetic fields. These heated areas solidify on cooling down, and the process can be repeated. We show complex control of 3D slabs, 2D sheets, and 1D filaments, with applications in tactile displays and object manipulation.
Due to the low transition temperature and the possibility of using microwave heating, the compound can be manipulated in air, water, or inside biological tissue, and has the potential to revolutionize biomedical devices, robotics or display technologies.

Paper | LevPet: A Magnetic Levitating Spherical Pet with Affective Reactions
UpnaLab 2022-09-20 | LevPet combines affective computing and magnetic levitation to create an artificial levitating pet with affective responses and novel ways of moving to express emotions. Our interactive pet can recognise the user's emotional state using computer vision and respond to it with a low-level empathy system based on mirroring behaviour. For example, if you approach it with a happy face, the pet will greet you and move in a nimble way. A repulsive magnetic levitator is attached to a mechanical stage controlled by a computer system. On top of it sits the pet playground, where a house, a ping-pong ball, a xylophone and other accessories are placed. Two cameras capture the user's face and the objects placed on the playground, so that the pet can interact with them. LevPet is an exploration of how to communicate internal state with only a levitating sphere; it is a platform for experimentation and an interactive demo that brings together an other-worldly levitating metallic sphere with familiar things like emotions and a playground made of traditional items.

Paper | LeviPrint: Contactless Fabrication using Full Acoustic Trapping of Elongated Parts
UpnaLab 2022-06-29 | LeviPrint is a system for assembling objects in a contactless manner using acoustic levitation. We explore a set of optimal acoustic fields that enable full trapping in position and orientation of elongated objects such as sticks. We then evaluate the capabilities of different ultrasonic levitators for dynamically manipulating these elongated objects.
The combination of novel optimization algorithms and levitators enables the manipulation of sticks, beads and droplets to fabricate complex objects. A system prototype composed of a robot arm and a levitator is tested on different fabrication processes. We highlight the reduction of cross-contamination and the capability of building on top of objects from different angles, as well as inside closed spaces. We hope that this technique inspires novel fabrication methods and reaches fields such as the microfabrication of electromechanical components or even in-vivo additive manufacturing.
Paper from SIGGRAPH 2022: dl.acm.org/doi/abs/10.1145/3528233.3530752

Multimedia Undergrads 2019
UpnaLab 2022-06-15 | Multimedia undergrad projects from 2019. Due to the lockdown, students were only able to use their phones and computers.

Multimedia Undergrads 2020
UpnaLab 2022-06-15 | Multimedia undergrad projects from 2020/21. Ingeniería Informática, Universidad Pública de Navarra.

Undergrad projects - Multimedia 2021
UpnaLab 2022-06-15 | A summary of the projects from the subject Sistemas Multimedia of the Computer Science undergrad programme at UPNA.
Music from Uppbeat (free for creators!): uppbeat.io/t/mountaineer/footpath License code: NJK4WQIK4TK4N3UJ

Paper | A Multi-Object Grasp Technique for Placement of Objects in Virtual Reality
UpnaLab 2022-05-09 | Paper: mdpi.com/2076-3417/12/9/4193

Abstract: Some daily tasks involve grasping multiple objects in one hand and releasing them in a determined order, for example laying out a surgical table or distributing items on shelves. For training these tasks in Virtual Reality (VR), there is no technique that allows users to grasp multiple objects in one hand in a realistic way, and it is not known whether such a technique would benefit user experience. Here, we design a multi-object grasp technique that enables users to grasp multiple objects in one hand and release them in a controlled way. We tested an object placement task under three conditions: real life, VR with single-object grasp, and VR with multi-object grasp. Task completion time, distance travelled by the hands and subjective experience were measured in three scenarios: sitting in front of a desktop table, standing in front of shelves, and a room-size scenario where walking was required. Results show that performance in a real environment is better than in Virtual Reality, both for single-object and multi-object grasping. The single-object technique performs better than the multi-object one, except in the room scenario, where multi-object grasping leads to less distance travelled and lower reported physical demand. For use cases where the distances are small (i.e., the desktop scenario), single-object grasp is simpler and easier to understand. For larger scenarios, the multi-object grasp technique is a good option that can be considered by other application designers.

SonicSurface: DIY ultrasonic phased array for levitation, haptics, and directive audio
UpnaLab 2021-07-26 | Do you want to build an integrated 256-channel ultrasonic array?
It can be used for acoustic levitation, haptic feedback, directional audio and other cool ideas that you have in mind. We show example applications and how to assemble the array.
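The core trick behind all three applications is focusing: each emitter is phase-delayed so that its wave arrives at a chosen point in phase with all the others. A minimal sketch follows; the 16x16 grid with 1 cm pitch and the focal point are illustrative assumptions, not the exact SonicSurface geometry.

```python
# Sketch: per-emitter phase delays that focus a flat ultrasonic array on a point.
# Grid geometry (16x16 emitters, 1 cm pitch) is assumed for illustration.
import math

SPEED_OF_SOUND = 343.0          # m/s in air
FREQ = 40_000.0                 # Hz
WAVELENGTH = SPEED_OF_SOUND / FREQ

def focus_phases(pitch: float, n: int, focal: tuple[float, float, float]) -> list[float]:
    """Phase (radians) per emitter of an n x n grid so all waves arrive in phase at `focal`."""
    phases = []
    offset = (n - 1) * pitch / 2.0  # centre the grid on the origin
    for iy in range(n):
        for ix in range(n):
            x, y = ix * pitch - offset, iy * pitch - offset
            d = math.dist((x, y, 0.0), focal)
            # Farther emitters fire earlier: subtract the propagation phase.
            phases.append((-2.0 * math.pi * d / WAVELENGTH) % (2.0 * math.pi))
    return phases

# Focus 10 cm above the centre of the array.
phases = focus_phases(pitch=0.01, n=16, focal=(0.0, 0.0, 0.10))
```

The same phase map, with a sign flip or an added signature, is the starting point for levitation traps and haptic focal points.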
UltraLeap Ltd. commercialises hardware and software solutions that are already tested and certified: ultraleap.com
This research was funded by the European Union's Horizon 2020 research and innovation programme under grant 101017746 (touchlessai.com) and the Government of Navarre, FEDER 0011-1365-2019-000086.

Paper | SliceView: Content Adaptation and Depth Perception in an Affordable Multi-View Display
UpnaLab 2020-10-21 | Authors: Iñigo Ezcurdia, Adriana Arregui, Oscar Ardaiz, Amalia Ortiz and Asier Marzo
Abstract: We present SliceView, a simple and inexpensive multi-view display made with multiple parallel translucent sheets that sit on top of a regular monitor; each sheet reflects a different 2D image, and the images are perceived cumulatively. A technical study is performed on the reflected and transmitted light for sheets of different thicknesses. A user study compares SliceView with a commercial light-field display (LookingGlass) regarding the perception of information at multiple depths. More importantly, we present automatic adaptations of existing content to SliceView: 2D layered graphics such as retro games or painting tools, movies and subtitles, and regular 3D scenes with multiple clipping z-planes. We show that it is possible to create an inexpensive multi-view display and automatically adapt content for it; moreover, depth perception in some tasks is superior to that obtained with a commercial light-field display. We hope that this work stimulates more research and applications with multi-view displays.
--- Abstract: While pressing can enable a wide variety of interesting applications, most press-sensing techniques operate only at close distances and rely on fragile electronics. We present EchoTube, a robust, modular, simple, and inexpensive system for sensing low-resolution press events at a distance. EchoTube works by emitting ultrasonic pulses inside a flexible tube, which acts as a waveguide, and detecting reflections caused by deformations in the tube. EchoTube is deployable in a wide variety of situations: the flexibility of the tubes allows them to be wrapped around and affixed to irregular objects. Because the electronic elements are located at one end of the tube, EchoTube is robust, able to withstand crushing, impacts, water, and other adverse conditions. In this paper, we detail the design, implementation, and theory behind EchoTube; characterize its performance under different configurations; and present a variety of exemplar applications that illustrate its potential.
--- Authors: Carlos E. Tejada, Jess McIntosh, Klaes Alexander Bergen, Sebastian Boring, Daniel Ashbrook and Asier Marzo.
--- Affiliations: University of Copenhagen, Copenhagen, Denmark; Universidad Pública de Navarra, Pamplona, Navarre, Spain
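EchoTube's pulse-echo principle can be sketched numerically: a press location is half the echo's round-trip time multiplied by the propagation speed. The 343 m/s speed and the example delay below are illustrative assumptions, not measurements from the paper.

```python
# Sketch of EchoTube's time-of-flight principle: a pulse travels down the tube,
# reflects at a deformation (a press), and returns to the transducer.
# The propagation speed is assumed (sound in air); a real tube needs calibration.

SPEED_OF_SOUND = 343.0  # m/s, assumed speed inside the waveguide

def press_distance(echo_delay_s: float) -> float:
    """Distance (m) from the transducer to the deformation causing the echo."""
    # Halve the round trip: the pulse travels to the press and back.
    return SPEED_OF_SOUND * echo_delay_s / 2.0

# An echo arriving 2.9 ms after the pulse implies a press ~0.5 m down the tube.
print(round(press_distance(2.9e-3), 3))  # 0.497
```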
--- Parts of the music are from: youtube.com/watch?v=0D2o8F2MOuI

Tutorial: Marking the Polarity of Ultrasonic Piezos using a Multimeter
UpnaLab 2019-06-17 | Piezoelectric transducers are used in many projects. However, the marked polarity is often incorrect. Here, I will show you how to mark the polarity using a simple multimeter, copper tape and a permanent marker.

Interactive Devices - Undergrad - Computer Science - 2019
UpnaLab 2019-06-05 | Interactive projects from Computer Science undergrad students, 2019.
Big Piano - Natalia Jimenez
Handball projection - Iker de la Nava, David Erroz, Daniel Callado
Smart Mirror - Julen Merchan
Whiteboard wars - Unai Fernandez, Jon Apesteguia
SoundDodge - Marcos Galvez, Iñaki Pérez
GCube Snake - Javier Morala
Stroboscopic Fountain - Josselyn Ramos, Rubén Jimenez
Capacitive Array - Nerea Elizalde
LIDAR Bow - Carlos Zubiaur, Iñaki Zabalegui
MineSweeper Shock - Javier Diez
Delta Maze - Mohamed Abdulla
SandProjector - Ángela Gonzalez

Holographic acoustic tweezers | Paper
UpnaLab 2018-12-18 | Paper: Holographic Acoustic Tweezers. PNAS. 2018. Asier Marzo, Bruce W. Drinkwater. pnas.org/content/early/2018/12/11/1813047115
Abstract: Acoustic tweezers use sound radiation forces to manipulate matter without contact. They provide unique characteristics compared with the more established optical tweezers, such as higher trapping forces per unit input power and the ability to manipulate objects from the micrometre to the centimetre scale. They also enable the trapping of a wide range of sample materials in various media. A dramatic advancement in optical tweezers was the development of Holographic Optical Tweezers (HOT), which enabled the independent manipulation of multiple particles, leading to applications such as the assembly of 3D micro-structures and the probing of soft matter. Now, 20 years after the development of HOT, we present the first realization of Holographic Acoustic Tweezers (HAT). We experimentally demonstrate a 40 kHz airborne HAT system implemented using two 256-emitter phased arrays and individually manipulate up to 25 millimetric particles simultaneously. We show that the maximum trapping forces are achieved once the emitting array satisfies Nyquist sampling and has an emission phase discretisation below π/8 radians. When considered on the scale of a wavelength, HAT provides similar manipulation capabilities to HOT while retaining its unique characteristics. The examples shown here suggest the future use of HAT for novel forms of displays in which the objects are made of physical levitating voxels, assembly processes at the micrometre and millimetre scale, as well as the positioning and orientation of multiple objects, which could lead to biomedical applications.

Acoustic Virtual Vortices with Tunable Orbital Angular Momentum for Trapping of Mie Particles
UpnaLab 2018-01-22 | Paper: journals.aps.org/prl/abstract/10.1103/PhysRevLett.120.044301
Acoustic vortices can transfer angular momentum and trap particles. Here, we show that particles trapped in airborne acoustic vortices orbit at high speeds, leading to dynamic instability and ejection. We demonstrate stable trapping inside acoustic vortices by generating sequences of short-pulsed vortices of equal helicity but opposite chirality. This produces a "virtual vortex" with an orbital angular momentum that can be tuned independently of the trapping force. We use this method to adjust the rotational speed of particles inside a vortex beam and, for the first time, create three-dimensional acoustic traps for particles of wavelength order (so-called Mie particles).

Ultraino: DIY ultrasonic airborne phased array with 64 channels
UpnaLab 2017-12-18 | We present Ultraino, a modular, inexpensive, and open platform that provides hardware, software and example applications specifically aimed at controlling the transmission of narrowband airborne ultrasound. The software can be used to define array geometries, simulate the acoustic field in real time and control the connected driver boards. The driver board design is based on an Arduino Mega and can control 64 channels with a square wave of up to 17 Vpp and π/5 phase resolution. Multiple boards can be chained together to increase the number of channels. 40 kHz arrays with flat and spherical geometries are demonstrated for parametric audio generation, acoustic levitation and haptic feedback.
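The π/5 phase resolution mentioned for the Ultraino driver board means each channel's emission phase is one of 10 discrete levels per cycle. A minimal sketch of that quantisation step (an illustration, not code from the Ultraino firmware):

```python
# Sketch: quantising a continuous emission phase to pi/5 resolution,
# i.e. 10 discrete phase levels per 2*pi cycle, as in the Ultraino driver spec.
import math

PHASE_STEP = math.pi / 5.0  # 10 levels per cycle

def quantise_phase(phase: float) -> float:
    """Round a phase in radians to the nearest representable pi/5 step, wrapped to [0, 2*pi)."""
    return (round(phase / PHASE_STEP) * PHASE_STEP) % (2.0 * math.pi)

print(quantise_phase(1.0))  # snaps to 2*pi/5 ≈ 1.257
```

Since the HAT paper above reports that trapping forces saturate once phase discretisation is below π/8, the coarser π/5 step is a deliberate cost/precision trade-off for a DIY board.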
Paper: http://ieeexplore.ieee.org/document/8094247
Instructables: instructables.com/id/Ultrasonic-array
Latest simulator release: github.com/asiermarzo/Ultraino/releases

Eating From an Acoustic Tractor Beam
UpnaLab 2017-03-14 | ...

Flour inside a Droplet of Levitating Water
UpnaLab 2017-03-14 | ...

Levitating Coffee and Burger
UpnaLab 2017-03-14 | Acoustic levitation.

Ghost Touch: turning surfaces into interactive tangible canvases with focused ultrasound
UpnaLab 2016-10-27 | Digital art technologies take advantage of the input, output and processing capabilities of modern computers. However, fully digital systems lack the tangibility and expressiveness of their traditional counterparts. We present Ghost Touch, a system that remotely actuates the artistic medium with an ultrasound phased array. Ghost Touch transforms a normal surface into an interactive tangible canvas in which the users and the system collaborate in real time to produce an artistic piece. Ghost Touch is able to detect traces and reproduce them, therefore enabling common digital operations such as copy, paste, save or load, whilst maintaining the tangibility of the traditional medium. Ghost Touch has enhanced expressivity since it uses a novel algorithm to generate multiple ultrasound focal points with specific intensity levels. Different artistic effects can be performed on sand, milk & ink, or liquid soap.

Evaluating controls for a point and shoot mobile game: Augmented Reality, Touch and Tilt
UpnaLab 2014-11-08 | Controls based on Augmented Reality (AR), Tilt and Touch have been evaluated in a point and shoot game for mobile devices. Tilt and AR controls provided more enjoyment, immersion and accuracy for players than Touch. Nonetheless, Touch caused fewer nuisances and was playable in more varied situations. Despite the current technical limitations, we suggest incorporating AR controls into mobile games that support them.
Nowadays, AR controls can be implemented on handheld devices as easily as the more established Tilt and Touch controls. However, this study is the first comparison of them, and thus its findings could be of interest to game developers.

Combining multi-touch and device movement in mobile augmented reality manipulations
UpnaLab 2014-11-08 | Nowadays, handheld devices are capable of displaying augmented environments in which virtual content overlaps reality. To interact with these environments, it is necessary to use a manipulation technique. The objective of a manipulation technique is to define how the input data modify the properties of the virtual objects. Current devices have multi-touch screens that can serve as input. Additionally, the position and rotation of the device can also be used as input, creating both an opportunity and a design challenge.

collARt
UpnaLab 2013-10-30 | A collage is an artistic composition made by assembling different parts to create a new whole. This procedure can also be applied to assembling three-dimensional objects. In this paper we present CollARt, a Mobile Augmented Reality application that permits the creation of 3D photo collages. Virtual pieces are textured with pictures taken with the camera and can be blended with real objects.