'Apple Glass' may use Look Around-style smooth navigation motion

The upcoming “Apple Glass” will present wearers with new AR views of real and virtual environments, and Apple is working on technology that lets users zoom and magnify those views smoothly.

Think of how Apple Maps has its Look Around feature. It’s comparable to Google’s Street View, but it removes the jerky steps from that kind of service and lets you glide smoothly along the streets. This is what Apple wants to bring to applications using “Apple Glass” and Apple AR.

“Movement in an Environment” is a recently revealed Apple patent application that explains how virtual landscapes can be modified to convey movement. A user will still be able to walk around a computer-generated reality (CGR) environment, but this technique would also let them move to another part of it without disorientation.

“Some CGR applications display the CGR environment from a particular position in the CGR environment,” the patent application states. “In some cases, the position represents the location of a user or a virtual camera in the CGR environment. [However] in some applications, a user may wish to move around the CGR environment and/or view the CGR environment from a different position.”

“Movement in a CGR environment does not necessarily have to correspond directly to the user’s actual physical movement in the real world,” it continues. “This is because a user can move quickly over great distances in a CGR environment that would be impractical in the real world.”

Detail of the patent showing how an “enlarged” view could be presented to “Apple Glass” wearers

The virtual environment must change according to the user’s position and orientation. “For example, a CGR system can detect the rotation of a person’s head and, in response, adjust the graphic content and an acoustic field presented to the person in a manner similar to how those sights and sounds would change in a physical environment,” the application explains.

This adjustment is necessary in any AR, VR, or mixed reality environment, whether everything the wearer sees is virtual or, in augmented reality, virtual objects are positioned over a view of the real world.
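
To make the idea concrete, here is a minimal sketch in Swift of how a renderer might keep its camera in step with the wearer’s head. The ViewerPose type and the function names are hypothetical illustrations, not Apple’s actual API, and a real headset would track a full six-degree-of-freedom pose rather than a single orientation.

    import simd

    // Hypothetical viewer pose: where the wearer is and which way they are facing.
    struct ViewerPose {
        var position: SIMD3<Float>
        var orientation: simd_quatf
    }

    // When the headset reports a new head orientation, re-orient the rendering
    // camera (and, by the same transform, the acoustic field) so that sights and
    // sounds shift the way they would in a physical environment.
    func update(_ pose: inout ViewerPose, headOrientation: simd_quatf) {
        pose.orientation = headOrientation
    }

    // The forward direction used to aim both the camera and the audio listener.
    func forwardVector(of pose: ViewerPose) -> SIMD3<Float> {
        pose.orientation.act(SIMD3<Float>(0, 0, -1))
    }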

What’s new about this application is that Apple wants you to be able to turn your head, see something “in the distance,” and then decide to jump to it immediately. What you can “see” from where you are in the environment can be paired with a magnified view presented from that distant point.

“A magnified portion and a non-magnified portion of a Computer Generated Reality (CGR) environment are displayed from a first position,” says Apple. “In response to receiving an input, an enlarged portion of the CGR environment from a second position is displayed.”

So you will see a regular view of a distant virtual point, but you can also choose to see it in close-up. Having seen that preview, you can choose to travel to that point, and you arrive mentally prepared for it rather than simply being teleported to a new location.
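
As a rough illustration of that two-step flow, here is a minimal Swift sketch. The CGRViewer type and its methods are hypothetical names, not anything from Apple’s frameworks: one input shows an enlarged preview rendered from the second position, and only a confirming input actually moves the viewer there.

    import simd

    // Hypothetical viewer state for the "preview, then move" flow.
    struct CGRViewer {
        var position: SIMD3<Float>                 // first position (where the wearer is)
        var previewedPosition: SIMD3<Float>? = nil // second position being previewed, if any

        // Step 1: in response to an input, display an enlarged portion of the CGR
        // environment as seen from the second position, alongside the normal view.
        mutating func showMagnifiedView(from secondPosition: SIMD3<Float>) {
            previewedPosition = secondPosition
        }

        // Step 2: if the wearer confirms, relocate to the previewed position, so the
        // change of viewpoint is expected rather than a disorienting teleport.
        mutating func confirmMove() {
            guard let target = previewedPosition else { return }
            position = target
            previewedPosition = nil
        }
    }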

The patent application is credited to three inventors, including Ryan S. Burgoyne, whose previous filings include one covering the ability for “Apple Glass” users to manipulate AR images.

Making virtual objects look real

Apple has also filed a patent application on making the virtual objects you see through “Apple Glass” look more real, and more at home where they appear to be placed. “Rendering objects based on camera noise” is all about matching different video qualities within the same AR environment.

Detail of the patent showing a real object (left) and a virtual object (right). Despite the poor-quality illustration, Apple claims that the lack of video noise makes the virtual object look fake

“Some augmented reality (AR) systems capture a video stream and combine images from the video stream with virtual content,” says this other patent application. “The images in the video stream can be very noisy, especially in low light conditions where certain ISO settings are used to increase the brightness of the image.”

“[But] since the virtual content renderer does not suffer from the physical limitations of the image capture device, there is little or no noise in the virtual content,” it continues. “The lack of appearance of the noise, for example its graininess, strength, its property variations between color channels, etc., on virtual content can make the virtual content appear to float, appear detached, stand out, or not fit the actual content.”

This is not a new problem, but Apple claims that “existing systems and techniques do not sufficiently take image noise into account when presenting virtual content with AR image content and other content combining virtual content with real image content.”

Apple therefore proposes several methods for characterizing camera noise and applying it to otherwise pristine virtual objects. Whatever the approach, the goal is to make a virtual object look real, and genuinely present in the position where it is shown.
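
As a sketch of the general idea rather than of Apple’s specific methods, the Swift snippet below assumes a simple noise profile (an overall amplitude plus per-channel variation) and adds matched grain to a rendered pixel so the virtual content blends with the noisy camera feed. NoiseProfile and applyMatchedNoise are hypothetical names; in practice the profile would be estimated from the camera feed itself, for example from its ISO setting or a flat region of the image.

    import Foundation
    import simd

    // Hypothetical description of the camera's noise: overall graininess plus how
    // strongly it varies between the red, green and blue channels.
    struct NoiseProfile {
        var amplitude: Float            // overall strength of the grain
        var channelScale: SIMD3<Float>  // relative variation per color channel
    }

    // Add camera-matched noise to one rendered (virtual) pixel, given as 0...1 RGB.
    func applyMatchedNoise(to pixel: SIMD3<Float>, using profile: NoiseProfile) -> SIMD3<Float> {
        // Roughly Gaussian noise via the Box-Muller transform.
        func gaussianSample() -> Float {
            let u1 = Double.random(in: 0.0001..<1)
            let u2 = Double.random(in: 0..<1)
            return Float(sqrt(-2 * log(u1)) * cos(2 * Double.pi * u2))
        }
        let noise = SIMD3<Float>(gaussianSample(), gaussianSample(), gaussianSample())
        let grain = noise * profile.channelScale * profile.amplitude
        return simd_clamp(pixel + grain, SIMD3<Float>(repeating: 0), SIMD3<Float>(repeating: 1))
    }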

This application is credited to Daniel Kurz and Tobias Holl. Kurz has previously obtained patents for using AR to turn any surface into a screen with touch controls.
