Apple AR wearables could track gaze, alter resolution for legibility & power saving

Multiple lines of research from Apple focus on the ability of portable AR devices like the rumored “Apple Glass” to detect what a wearer is looking at, and to reconfigure themselves to contextually present the information that wearer needs.

A wide range of Apple’s plans for “Apple Glass,” and for its AR work overall, have been revealed in multiple patent applications. Filed separately, they nonetheless combine to give a broad picture of Apple working on how people will interact with real and virtual objects.

Detecting the real-world environment

Broadly, the four patent applications concern detecting and using the real and virtual environments around a user. “Environment-based application presentation” describes a method of interpreting what the device’s cameras see of the real world as the basis for a kind of map.

“[Using this] a geometric arrangement of the physical environment is determined,” explains the application. “Depending on the type of the physical environment, one or more virtual reality objects are displayed corresponding to a representation of the physical environment.”

Once the physical layout of the environment is established, the system can position virtual objects precisely within it.
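As a rough illustration of the idea, not Apple’s implementation, once a flat surface in the room has been detected, a virtual object can be “pinned” to it by projecting the object’s position onto the surface. The `Plane` class and `anchor_object` function below are invented for this sketch.

```python
# Illustrative sketch only: pinning a virtual object to a detected
# real-world surface. Names here are invented, not from the patent.
from dataclasses import dataclass

@dataclass
class Plane:
    """A detected flat surface: a point on the plane and its unit normal."""
    point: tuple
    normal: tuple

def anchor_object(plane, desired_pos):
    """Snap a desired 3D position onto the plane so the virtual
    object appears to rest on the real surface."""
    px, py, pz = plane.point
    nx, ny, nz = plane.normal
    dx, dy, dz = (desired_pos[0] - px, desired_pos[1] - py, desired_pos[2] - pz)
    # Signed distance of the desired position from the plane.
    dist = dx * nx + dy * ny + dz * nz
    # Project along the normal back onto the plane.
    return (desired_pos[0] - dist * nx,
            desired_pos[1] - dist * ny,
            desired_pos[2] - dist * nz)

# A horizontal floor at height zero; an object half a meter above it
# lands on the floor directly below its requested position.
floor = Plane(point=(0.0, 0.0, 0.0), normal=(0.0, 1.0, 0.0))
print(anchor_object(floor, (1.0, 0.5, 2.0)))  # (1.0, 0.0, 2.0)
```

In practice a headset would get such planes from its scene-mapping pipeline rather than hard-coding them, but the projection step is the same geometric idea.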

“Some users of portable devices have visual defects such as myopia, hyperopia, astigmatism, or presbyopia,” explains a second application. “It can be difficult to ensure that an optical system in a head-mounted device satisfactorily displays computer-generated content and provides an acceptable viewing experience for users with visual impairments.

“If no precautions are taken,” it continues, “it may be difficult, if not impossible, for a user with visual defects to properly focus on the content that is displayed, or the content may not otherwise be displayed as desired.”

Detail from a patent application showing the design of an adjustable headset

Apple’s proposals revolve around specifics of “head-mounted devices [that] may include optical systems with lenses.” The application says that “[these] lenses allow displays in the devices to present visual content to users.”

Foveated imagery in “Apple Glass”

More specific aspects of headsets are discussed in the “Head Mounted Device with Active Optical Foveation” application. Rather than concentrating on the lenses through which images are received, it focuses on displaying those images in the way most beneficial to the user.

Headsets that have “a transparent display for a user to observe real world objects” as well as virtual ones could use cameras to relay that physical environment. This kind of “pass-through camera” can intelligently modify its own parameters to suit what is being viewed.

“The pass-through camera can capture certain high-resolution image data for display on the screen,” says the application. “However, only low-resolution image data may be required to display low-resolution images at the periphery of the user’s field of vision on the screen.”

Thus, the headset can vary the resolution of what it shows the wearer, depending on where the wearer is looking. This has the advantage of reducing the amount of data to be processed, which improves performance and, in turn, battery life.
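The core of that trade-off can be sketched as a simple falloff function: full resolution inside a small region around the gaze point, tapering off toward the periphery. This is a generic illustration of foveated rendering, not the patent’s method; the function name and the falloff shape are invented for the example.

```python
# Illustrative sketch of foveated rendering: keep full resolution where
# the user is looking, and reduce it toward the periphery to cut the
# amount of pixel data to process. Not Apple's actual algorithm.
def resolution_scale(pixel, gaze, fovea_radius=200.0, min_scale=0.25):
    """Return a render-resolution multiplier (1.0 = full resolution)
    for a pixel, based on its distance in pixels from the gaze point."""
    dx = pixel[0] - gaze[0]
    dy = pixel[1] - gaze[1]
    dist = (dx * dx + dy * dy) ** 0.5
    if dist <= fovea_radius:
        return 1.0  # full detail inside the foveal region
    # Fall off linearly to min_scale over one additional fovea radius.
    falloff = max(0.0, 1.0 - (dist - fovea_radius) / fovea_radius)
    return min_scale + (1.0 - min_scale) * falloff

gaze = (960, 540)                        # wearer looking at screen center
print(resolution_scale((960, 540), gaze))    # 1.0 at the gaze point
print(resolution_scale((1600, 540), gaze))   # 0.25 far in the periphery
```

A real renderer would apply something like this per tile rather than per pixel, and hardware such as variable-rate shading exists for exactly this purpose, but the saving comes from the same principle: the periphery simply needs fewer pixels.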

Foveated imagery, as this variable-resolution technique is called, depends on knowing where a user is looking. Some of this can be determined from the physical movement of the head, but that does not tell the system which specific part of a display is being examined.

Eye tracking in AR and with “Apple Glass”

“Using a gaze tracking system in the head-mounted device,” suggests Apple, “the device can determine which part of the display is viewed directly by a user.”

The last of the four new related patent applications focuses on this question of gaze tracking, but for operating the devices themselves. “Gaze-based user interactions” describes methods by which “a user uses their eyes to interact with user interface objects displayed on the electronic device.”

Detail of the patent application showing a real environment being mapped to allow the positioning of virtual objects

The devices could therefore “provide a more natural and efficient interface.” As well as letting a user select objects by looking at them, these methods offer the ability to interpret everything from a glance to a Paddington hard stare.

The application also describes the difficulties caused by “the uncertainty and instability of the user’s gaze position.”
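The patent names the problem of gaze jitter rather than prescribing a single fix, but one common way systems tame it is to smooth raw gaze samples before hit-testing them against UI objects. The `GazeSmoother` class below is an invented sketch using a simple exponential moving average.

```python
# Illustrative sketch: smoothing noisy gaze samples with an exponential
# moving average before using them to select UI objects. The patent
# describes the jitter problem; this particular fix is a common generic
# technique, not necessarily Apple's.
class GazeSmoother:
    def __init__(self, alpha=0.2):
        self.alpha = alpha  # lower alpha = heavier smoothing, more lag
        self.x = None
        self.y = None

    def update(self, raw_x, raw_y):
        """Blend a new raw sample into the running gaze estimate."""
        if self.x is None:
            self.x, self.y = float(raw_x), float(raw_y)
        else:
            self.x += self.alpha * (raw_x - self.x)
            self.y += self.alpha * (raw_y - self.y)
        return self.x, self.y

smoother = GazeSmoother(alpha=0.5)
for sample in [(100, 100), (104, 98), (98, 102)]:  # jittery samples
    x, y = smoother.update(*sample)
print((x, y))  # hovers near (100, 100) despite the noise
```

The design choice is the usual trade-off: more smoothing makes a dwell-to-select gesture stable, but adds lag when the user deliberately shifts their gaze, which is one reason gaze interfaces remain hard to tune.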

The four applications each represent developments and improvements in areas that Apple has pursued for years, across many other patents. As well as giving an overview of areas of interest to the company, they are also an example of how Apple benefits from working on both hardware and software.
