'Apple Glass' users may be able to manipulate AR images with any real object

Apple is researching technology that would let an “Apple Glass” wearer move beyond fixed physical gestures or touch controls, and instead manipulate almost any real object to change what is shown on the headset.

You can look, but you can’t touch, and that’s fine, even great. But as Apple AR becomes more and more a part of our everyday world, and as Apple integrates it into everything it does, that becomes a problem.

With AR, and especially with what Apple calls Mixed Reality (MR), it’s great to be able to see an iPad Pro in front of you, but you also have to be able to use it. You have to be able to pick up a virtual object and work with it; otherwise AR is no better than a 3D movie.

Apple’s proposed solution is described in “Manipulation of virtual objects using a tracked physical object,” a patent application filed in January 2020 but only revealed this week. It suggests really mixing realities, in that a virtual object could be mapped onto a real object in the physical world.

“There are various electronic devices, such as head-mounted devices (also called headsets and HMDs),” the application begins, “with screens that present users with a computer generated reality (CGR) environment in which they can be fully immersed in a surrounding physical environment, fully immersed in a virtual reality environment including virtual objects, or anywhere in between.”

“While direct manipulation of physical objects in the surrounding physical environment is naturally available to users, the same is not true for virtual objects in the CGR environment,” it continues.

“Not having a way to directly interact with virtual objects presented to a user in a CGR environment limits the degree of integration of virtual objects into the CGR environment. It may therefore be desirable to provide users with a means to directly manipulate virtual objects presented within CGR environments.”

It might not be practical to keep a block of wood in the shape of a life-size Lamborghini. And maybe it wouldn’t help much to have a piece of fiberglass shaped like a Mac Pro. But you could much more usefully have an iPad-shaped, iPad-sized object.

Manipulating a virtual object by controlling a physical one

Apple refers to the real-world object as a “proxy device” and says that the same camera system used to position virtual AR content around the user can track it accurately.

“Input is received from the proxy device using an input device of the proxy device which represents a request to create a fixed alignment between the virtual object and the virtual representation in a three-dimensional (“3-D”) coordinate space defined for the content,” the patent application specifies.

“A position and orientation of the virtual object in 3-D coordinate space is dynamically updated using positional data that defines the movement of the proxy device in the physical environment,” it continues.

So if you hold, for example, a rectangular device, the AR/MR system can tell where it is, what orientation it has, which face is up, and so on. It can tell when you move or rotate it, which means it knows exactly where to overlay the AR image.
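To make that concrete, here is a rough sketch, not Apple’s actual implementation, of how the filing’s “fixed alignment” could work: the offset between the proxy and the virtual object is captured once, and each new tracked pose of the proxy then yields the virtual object’s pose. The Swift below uses Apple’s simd types; the type and property names are invented for illustration.

```swift
import simd

/// Rough model of the filing's "fixed alignment": capture the offset between
/// the proxy device and the virtual object once, then reapply it every time
/// the proxy's tracked pose changes. Names here are invented for illustration.
struct VirtualObjectBinding {
    /// Constant offset from the proxy's pose to the virtual object's pose,
    /// recorded at the moment the user requests the alignment.
    let fixedAlignment: simd_float4x4

    /// Both poses are 4x4 rigid transforms in the shared 3-D coordinate space.
    init(proxyPose: simd_float4x4, virtualPose: simd_float4x4) {
        fixedAlignment = proxyPose.inverse * virtualPose
    }

    /// Dynamically update the virtual object's pose from the latest
    /// tracked pose of the physical proxy device.
    func virtualPose(forProxyPose proxyPose: simd_float4x4) -> simd_float4x4 {
        proxyPose * fixedAlignment
    }
}
```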

“The method involves presenting content comprising a virtual object and a virtual representation of a proxy device not physically associated with an electronic device on a screen of the electronic device,” the application continues.

So to somebody standing next to you, you are waving around a blank slate. But to you, it’s an iPad Pro, with all the controls and displays that implies. In that sense, the idea is similar to a recent filing covering privacy screens.
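Apple’s filing doesn’t say how such tracking would be built, but ARKit already offers a rough analogue for rigid, pre-scanned objects: a recognised physical object becomes an ARObjectAnchor, and virtual content attached to that anchor is drawn wherever the object sits. The minimal sketch below is only an analogy, not the patented method; the class name and the “ProxyObjects” asset-catalog group are hypothetical, and ARKit’s object detection targets largely static objects rather than something waved around like a slate.

```swift
import ARKit
import SceneKit
import UIKit

/// Hypothetical view controller that treats a pre-scanned real object
/// as a stand-in "proxy device" and overlays virtual content on it.
final class ProxyTrackingViewController: UIViewController, ARSCNViewDelegate {
    private let sceneView = ARSCNView()

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.frame = view.bounds
        sceneView.delegate = self
        view.addSubview(sceneView)

        // Ask ARKit to detect a physical object that was scanned ahead of time
        // and stored in an asset-catalog group (the group name is made up).
        let configuration = ARWorldTrackingConfiguration()
        if let proxies = ARReferenceObject.referenceObjects(inGroupNamed: "ProxyObjects",
                                                            bundle: nil) {
            configuration.detectionObjects = proxies
        }
        sceneView.session.run(configuration)
    }

    // When the physical proxy is recognised, ARKit hands back an ARObjectAnchor;
    // any node returned here is kept aligned with that anchor's transform.
    func renderer(_ renderer: SCNSceneRenderer, nodeFor anchor: ARAnchor) -> SCNNode? {
        guard anchor is ARObjectAnchor else { return nil }

        // Stand-in for the virtual "iPad Pro": a thin slab drawn over the proxy.
        let slab = SCNBox(width: 0.25, height: 0.005, length: 0.18, chamferRadius: 0.005)
        slab.firstMaterial?.diffuse.contents = UIColor.systemBlue

        let node = SCNNode()
        node.addChildNode(SCNNode(geometry: slab))
        return node
    }
}
```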

The invention is credited to Austin C. Germer and Ryan S. Burgoyne, the latter of whom was recently listed on a patent application covering gaze-based user interactions.
