Why AR is going to give you superpowers in the future

The HoloLens 2 reboots augmented reality for Microsoft. It’s got a new fit, a larger field of view and detailed hand and eye tracking. But one thing it’s still not is a consumer device.

Alex Kipman is a Microsoft technical fellow and the inventor of the Kinect and the HoloLens. During a visit to Microsoft’s Redmond campus to try the HoloLens 2 before its debut, we talked to Kipman about his vision for where computing is headed, what HoloLens is becoming and how far away we are from a future where everyday people are actually wearing advanced AR headsets.

To Kipman, and Microsoft, a headset like the HoloLens is one of many devices in which sensors will digest the world with AI. As Kipman says: on a headset, it’s the HoloLens. In a home, it’s a smart camera. On a car or drone, it’s an autonomous vehicle. This time, HoloLens 2 aims to connect its AR experiences in the cloud to other devices, including iOS and Android, and feel more like a work tool than ever.

How much of these technologies is the general consumer going to see in the short term?

I have no interest in overhyping, and having a bunch of people think these things are consumer products. And then … get to the trough of disillusionment when people are like, “my God, I’m not using this instead of a PC, instead of a phone, instead of a television.” These devices need to be more comfortable, they need to become more immersive and ultimately they need a better value-to-price ratio. There is a threshold in the journey where there is enough immersion, enough comfort, enough out-of-box value, where I’ll be happy to announce a consumer product. This is not it.

How far are we?

That’s impossible for me to guess, per se.
… I think humans are terrible predictors of time. In enterprise, in first-line worker scenarios, we’re finding great value where this stuff is transformative. Now, to be clear, if I can take the highest watermark, the single best product that exists in this space today, and it’s still “not ready for consumers,” you guys can be the judges of everybody else’s product in this space.

What do you think the killer apps will be?

I think communication … essentially defines a secular trend in computing. As it turns out, more often than not, it’s the innovation in the communication stack that changes things. Snail mail to email, email to messaging, to text messaging, to Snapchatting people. It’s like going from a still to a video to a teleportation. Like, my daughter being able to teleport to play with her cousins in Brazil, as a parent. Me not having to travel around the world to visit all my partners. How much would I love that? If you guys could be having this level of present experience, and you’re in New York, and you guys are in San Francisco, and still it’s this immersive, with me in Redmond? It’s not hard to imagine presence as a killer experience for mixed-reality devices.

As for monitor replacement … think about you sitting in front of your PC for “n” number of hours a day. This can be immersive and comfortable. Would you go spend this much money to put this [the HoloLens] on your head with a keyboard and mouse, versus buying a 30-inch, $500 monitor? Probably not, so we do try to focus a lot on things that you simply cannot do otherwise.
Done right, we will all live in a world very soon where we will interface and interact and instinctually manipulate technology ubiquitously through our days, and you’re not going to be interfacing through monitors.

In terms of eye tracking, what challenges or opportunities do you see?

I think ultimately the quest here goes back to having our AI understand people, places and things. You want to get as much signal in that conversation as possible. If you’re going to teleport somewhere, I want to be able to know what you’re doing, and have that level of understanding so I can really teleport you. I still want your facial expressions to come through. That’s something that I am very excited about, the eyes and the emotion of your eyes — there is so much signal there for us to mine to create more immersive and more comfortable experiences.

I do think a lot about the security of these devices. We’re going to have state-of-the-art iris recognition, the single most secure biometric system, through HoloLens, which allows me to get all of that data securely, all of my comfort information, all of my customization. And then lastly, there is the idea that we can close the loop. I know at any point in time where the device is in relation to your eyes as we’re starting to form the hologram. Without that signal, I may or may not be getting the image correct, which means your eyes and your brain are doing all of the math for everything that I get wrong. That is what translates into fatigue at the end of the day. We’re able to do most of that math on the device and adjust the hologram as you’re moving, to create sharper, more immersive holograms in the experience.
One thing that sets your tech apart from others in the field is not having any physical controller at all, just using hands. Is a controller or haptics ever going to happen?

100 percent, we love haptics. We started this journey 11 years ago with input. Kinect was about having sensors on the edge that observe the environment to understand people, places and things. We went from Kinect, input innovation, to HoloLens, input plus output. The last one is having these things in my world exchange energy. Having zeros and ones that transact into photons actually transact into energy, so I can push a hologram and it pushes me back with equal force. So I can hold the hologram and I can feel the temperature of a hologram. We can call that haptic feedback. Much more sophisticated than how you traditionally would think about [it], but another level of immersion. The minute that I throw a hologram to you and you can catch it and it pushes you back … ooh, immersion just took one crank forward. The minute that I’m holding a hologram and there’s temperature to it, it changes the level of immersion and believability of the experience.

Now although that’s absolutely in our dreams, we also believe that humans are tool builders. I would not want my doctor to operate on me without tools, just with their bare hands, any more than I’d like to eat my food tonight without a fork and a knife. We don’t have any dogma on, “You cannot have something in your hands.” As a matter of fact, in our virtual reality headsets, you’re holding things in your hands: tools, controllers. That device could work here, but I don’t know if you guys have seen that one, it has lights. All you see is the lights over the hologram. It’s not that great of an experience.
It’s super easy for us to go create a version of that that goes in IR, so you don’t see the light. It’s absolutely also in our roadmap to think about holding things in the hand. Not just things we create. What if I’m a person with a real physical hammer, or holding a coffee cup, and I still want to touch my hologram?

[Photo: Trying on the HoloLens 2. James Martin/CNET]

How long are you spending right now using HoloLens each day?

Several hours a day is the short answer. We actually designed HoloLens on HoloLens. Wearing HoloLens and looking at the model in 3D is a much more visceral way of being able to understand space and creation in it. But look, when I’m in meetings, I’m not wearing a HoloLens. There are plenty of times when I’m in my office, and I’m using my keyboard, mouse and my PC monitor to do any number of things. But I do wear the device several hours a day, and so do most people on the team.

What’s the one thing that kept bugging you while making the HoloLens 2?

It’s everything. I have a dream that one day there’s only gonna be one problem that keeps us up at night. We count HoloLens in miracles. You know, we can’t have double-digit miracles in any given product cycle. That’s how we kind of size how much innovation, or how many issues, we’re going to pack into one release. This carbon fiber enclosure is a huge issue. It’s there so that we can essentially make the device much more comfortable and much more stable. But shipping carbon fiber that doesn’t look like carbon fiber … was incredibly hard, is still incredibly hard, has tons of issues. Inventing a new display engine: that was a huge miracle. The innovation in the lenses, to the vapor chamber in the back, to the fit system, and making the fit system extensible for enterprises, so you can put it under a hard hat, any number of things. That’s just the hardware. The manufacturing, building at scale, at yield? A whole different set of issues. Getting articulated hand tracking to feel instinctual.
Getting eye tracking to work over glasses. How do you create this platform from edge to cloud? To staying up at night and saying, “Oh my God, how do we take all of this and make it open?” If you do it wrong, we’re gonna have to live with some of these decisions for the next decade-plus.

If you solve all the problems, what’s the dream end state you want?

My dream state is I walk on an airplane, man, and every single person on that airplane is wearing our product. That’s not this product, by the way. It’s probably not the next one either. But, ultimately, the goal is that these things transform humans; they empower people and organizations to do things they just plainly were not able to do before; they allow us to displace space and time on a daily basis as if we were born instinctually with those superpowers. It’s the work of a lifetime, but certainly I can’t think of anything better to do with my life.

Kinect started with consumers, on the Xbox. Do you ever think of revisiting that route for the HoloLens?

Look, like everything in life, you learn. I am incredibly proud, obviously, of the work we did with Kinect, and I think Kinect transformed the world. But, look, people in the living room don’t want to stand up and play games … they want to lean back and enjoy, and they want the precision of a controller in their hands. We didn’t find as much signal in the living room for entertainment with devices like Kinect at the time. But you know, go look today at an Alexa. What are those things doing? They’re recognizing people, they’re recognizing speech. They’re doing a lot of the things that Kinect was doing in 2010. So, obviously, there’s space in people’s homes for devices like Kinect, that recognize people, that recognize the objects around them and understand the context of who you are. But we’re finding way more signal with Kinect in enterprise workloads.
Those workloads don’t tend to go through a proxy PC, which is why we end-of-lifed Kinect for Windows and just launched Azure Kinect, which is of course still tetherable to a PC but also connects directly as an IoT appliance to our cloud.

Where’s mixed reality going in the next five years or so, and what part does Microsoft play?

I’m not going to guess five years, to be honest with you. Let me say for the duration of this product, let’s say more in the one-to-two-year category … I think all the successful ones will be enterprise-bound, primarily first-line worker scenarios, increasing over time to knowledge worker scenarios. So I’ll give you the prediction for the next two years. Next two years, these are still enterprise-bound.

But when will that magic augmented world become something for the rest of us? This is an edited version of our conversation with Kipman, from when CNET spoke to him on January 31, 2019.

Physicists find quantum coherence and quantum entanglement are two sides of the same coin

More information: Alexander Streltsov, et al. “Measuring Quantum Coherence with Entanglement.” Physical Review Letters. To be published. Also at arXiv:1502.05876 [quant-ph]

Journal information: Physical Review Letters

Close relatives with the same roots

Although physicists have known that coherence and entanglement are close relatives, the exact relationship between the two resources has not been clear. It’s well known that quantum coherence and quantum entanglement are both rooted in the superposition principle—the phenomenon in which a single quantum state simultaneously consists of multiple states—but in different ways.

Quantum coherence deals with the idea that all objects have wave-like properties. If an object’s wave-like nature is split in two, the two waves may coherently interfere with each other in such a way as to form a single state that is a superposition of the two. This concept of superposition is famously represented by Schrödinger’s cat, which is both dead and alive at the same time while in its coherent state inside a closed box. Coherence also lies at the heart of quantum computing, in which a qubit is in a superposition of the “0” and “1” states, resulting in a speed-up over various classical algorithms. When such a state experiences decoherence, however, all of its quantumness is typically lost and the advantage vanishes.

The second phenomenon, quantum entanglement, also involves superposition. But in this case, the states in the superposition are the shared states of two entangled particles rather than those of the two split waves of a single particle. The intrigue of entanglement lies in the fact that the two entangled particles are so intimately correlated that a measurement on one particle instantly affects the other, even when they are separated by a large distance.
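The distinction between the two forms of superposition can be made concrete in a few lines of numpy. This sketch is an illustrative aside, not taken from the paper: a lone qubit's coherence shows up as off-diagonal terms in its density matrix, while a Bell state's superposition lives entirely in the correlations between two particles, leaving each particle locally incoherent.

```python
import numpy as np

# Coherence: the superposition |+> = (|0> + |1>)/sqrt(2) keeps
# off-diagonal terms in its density matrix (computational basis).
plus = np.array([1.0, 1.0]) / np.sqrt(2)
rho_coherent = np.outer(plus, plus)        # [[0.5, 0.5], [0.5, 0.5]]

# Decoherence wipes those off-diagonals out, leaving a classical mixture.
rho_mixture = np.diag([0.5, 0.5])

# Entanglement: the Bell state (|00> + |11>)/sqrt(2) is a superposition
# of shared two-particle states.
bell = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)
rho_bell = np.outer(bell, bell)

# Tracing out the second particle leaves the first in the incoherent
# mixture: the superposition lives in the correlations, not locally.
rho_reduced = rho_bell.reshape(2, 2, 2, 2).trace(axis1=1, axis2=3)
```

Note that `rho_reduced` comes out identical to `rho_mixture`, which is the article's point that entanglement is not well-defined for a single system viewed on its own.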
Like coherence, quantum entanglement plays an essential role in quantum technologies, such as quantum teleportation, quantum cryptography, and superdense coding.

Converting one to the other

In a paper to be published in Physical Review Letters, physicists led by Gerardo Adesso, Associate Professor at the University of Nottingham in the UK, with coauthors from Spain and India, have provided a simple yet powerful answer to the question of how these two resources are related: the scientists show that coherence and entanglement are quantitatively, or operationally, equivalent, based on their behavior arising from their respective resource theories.

The physicists arrived at this result by showing that, in general, any nonzero amount of coherence in a system can be converted into an equal amount of entanglement between that system and another, initially incoherent, one. This discovery of the conversion between coherence and entanglement has several important implications. For one, it means that quantum coherence can be measured through entanglement. Consequently, all of the comprehensive knowledge that researchers have obtained about entanglement can now be directly applied to coherence, which in general is not nearly as well researched (outside of the area of quantum optics). For example, the new knowledge has already allowed the physicists to settle an important open question concerning the geometric measure of coherence: since the geometric measure of entanglement is a “full convex monotone,” the same can be said of the associated coherence measure.
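The conversion can be sketched for the simplest case. This toy example is mine, not the paper's protocol: for a pure qubit, attaching an incoherent ancilla and applying a CNOT gate (an incoherent operation) turns the qubit's coherence, quantified here by the relative entropy of coherence, into an equal amount of entanglement, quantified by the entanglement entropy.

```python
import numpy as np

def entropy(p):
    """Shannon entropy (base 2) of a probability vector."""
    p = p[p > 1e-12]
    return -np.sum(p * np.log2(p))

# System qubit with some coherence: sqrt(0.7)|0> + sqrt(0.3)|1>.
psi = np.array([np.sqrt(0.7), np.sqrt(0.3)])

# Relative entropy of coherence of a pure state = entropy of its
# populations in the incoherent (computational) basis.
coherence = entropy(np.abs(psi) ** 2)

# Attach an incoherent ancilla |0> and apply CNOT, an incoherent
# operation, producing sqrt(0.7)|00> + sqrt(0.3)|11>.
cnot = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])
joint = cnot @ np.kron(psi, [1.0, 0.0])

# Entanglement entropy: entropy of the system's reduced state.
rho = np.outer(joint, joint).reshape(2, 2, 2, 2).trace(axis1=1, axis2=3)
entanglement = entropy(np.linalg.eigvalsh(rho))

# coherence == entanglement: the input coherence has been converted
# into an equal amount of output entanglement.
```

A fully incoherent input (say `psi = [1, 0]`) would give zero for both quantities, matching the claim that incoherent operations alone cannot create entanglement.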
As the scientists explained, this is possible because the new results allowed them to define and quantify one resource in terms of the other.

“The significance of our work lies in the fact that we prove the close relation between entanglement and coherence not only qualitatively, but on a quantitative level,” coauthor Alex Streltsov, of ICFO-The Institute of Photonic Sciences in Barcelona, told Phys.org. “More precisely, we show that any quantifier of entanglement gives rise to a quantifier of coherence. This concept allowed us to prove that the geometric measure of coherence is a valid coherence quantifier, thus answering a question left open in several previous works.”

While the results show that coherence and entanglement are operationally equivalent, the physicists explain that this doesn’t mean they are the exact same thing, as they are still conceptually different ideas. “Despite having the same roots of origin, namely quantum superposition, coherence and entanglement are conceptually different,” said coauthors Uttam Singh, Himadri Dhar, and Manabendra Bera at the Harish-Chandra Research Institute in Allahabad, India. “For example, coherence can be present in single quantum systems, where entanglement is not well-defined. Also, coherence is defined with respect to a given basis, while entanglement is invariant under local basis changes. In all, we believe coherence and entanglement are operationally equivalent but conceptually different.”

Future quantum connections

The operational equivalence of coherence and entanglement will likely have a far-reaching impact on areas ranging from quantum information theory to more nascent fields such as quantum biology and nanoscale thermodynamics.
In the future, the physicists plan to investigate whether coherence and entanglement might also be interconverted with a third resource—that of quantum discord, which, like entanglement, is another type of quantum correlation between two systems.

“Our future plans are diverse,” Adesso said. “On the theoretical side, we are working to construct a unified framework to interpret, classify and quantify all different forms of quantum resources, including and beyond entanglement and coherence, and highlight the interlinks among them from an operational perspective. This will allow us to navigate the hierarchy of quantumness indicators in composite systems with a common pilot, and to appreciate which particular ingredients are needed in various informational tasks.

“On the practical side, we are investigating experimentally friendly schemes to detect, quantify, and preserve coherence, entanglement and other quantum correlations in noisy environments. More fundamentally, we hope these results will inspire us to devise scalable and efficient methods to convert between different quantum resources for technological applications, and bring us closer to understanding where the boundaries of the quantum world ultimately lie in realistic scenarios.”

[Figure caption: (a) Input states that are fully incoherent (S and A) cannot be converted to entanglement via incoherent operations. (b) When the input state of S has any nonzero coherence, the coherence can be converted to entanglement via incoherent operations. The new results show that, in such a scenario, the input coherence and the output entanglement are quantitatively equivalent. Credit: Streltsov, et al.]
(Phys.org)—Quantum coherence and quantum entanglement are two landmark features of quantum physics, and now physicists have demonstrated that the two phenomena are “operationally equivalent”—that is, equivalent for all practical purposes, though still conceptually distinct. This finding allows physicists to apply decades of research on entanglement to the more fundamental but less well-researched concept of coherence, offering the possibility of advancing a wide range of quantum technologies.

Citation: Physicists find quantum coherence and quantum entanglement are two sides of the same coin (2015, June 25), retrieved 18 August 2019 from https://phys.org/news/2015-06-physicists-quantum-coherence-entanglement-sides.html