Touch screens become impractical when that screen is next to your eye in something like “Apple Glass,” so Apple is investigating how you might manipulate virtual controls in the real world.
A recently disclosed patent application showed Apple devising a way to use Apple AR to display information that appears to everyone except the wearer as a blank screen. Now, in a separate application, it seeks to make any surface appear to wearers of "Apple Glass" like a control panel, with buttons and a screen.
"Method and device for detecting contact between a first object and a second object," filed in 2016 but revealed only this week, explains why this way of presenting controls superimposed on real-world objects will be necessary.
"A natural way for humans to interact with (real) objects is to touch them with their hands," says the application. "[Surfaces] that detect and locate touches on their surface are commonly referred to as touch screens and are now part of, for example, smartphones and tablets."
"[However, the] current trend is for displays to get smaller and smaller and/or to move closer to the retina of the user's eye," it continues. "This is the case for head-mounted displays, for example, and makes the use of touch screens difficult, if not impossible."
Your "Apple Glass" can then show you information, but there isn't much you can do with it short of reaching up and poking the lens with your finger. Since AR already maps virtual objects onto the real world around you, however, it could be extended to make it look as if there were conveniently located buttons and controls right where you can tap or touch them.
In this case, your eyes would see the virtual object because "Apple Glass" shows it to you, but your fingers would be touching something that is really there in the real world. If the AR system can determine that you've tapped what you think is a button, it can respond as if you've actually pressed a control.
Unfortunately, this is more difficult than it sounds. The Apple privacy-screen idea can work because the real-world object is something like an iPad. Even though that iPad shows you nothing at all on its screen, it can still register where you tap.
Apple's new proposal is to recognize those taps or touches on any surface. And it can do so without forcing the user to wear sensors on their fingertips, or relying on occlusion.
"The most common approach is to physically equip the object or the human body (for example the fingertip) with a sensor capable of detecting touch," says the application. But it's not impressed with that, or with electrical versions that make the body part of a circuit: "The limitation of these types of approaches is that they require modifications to the object or the human body."
Detail of the patent showing how a flat square object could be turned into a calculator
With occlusion, a camera can track a user's fingers. Especially in AR this might work well, because the system will have mapped the whole environment and knows that, for example, there is a table surface under the user's fingers.
It could therefore map buttons onto this surface. However, the patent application points out that they would have to be big buttons. The fatal problem, according to Daniel Kurz, the credited inventor on Apple's patent application, is that occlusion can never be sufficiently precise.
"With virtual buttons, it's impossible to trigger a button, say #5 on an array of adjacent buttons on a numeric keypad, without triggering another button first," he argues, "because button #5 cannot be reached without occluding one of the surrounding buttons."
You can get around this by having even bigger buttons, or maybe fewer of them, but even then what these systems really detect is occlusion. They detect whether a finger has passed over a "button," not whether that button has actually been pressed.
The solution offered by Apple is heat. "If two objects at different temperatures touch each other," explains the application, "the area where they touch will change temperature and then slowly converge back to the initial temperature from before the touch."
"Therefore, for pixels corresponding to a point in the environment where contact has recently occurred," it continues, "it reveals a slow but clearly measurable decrease or increase in temperature."
So whatever your finger touches, there is going to be a heat transfer. It’s tiny, but it’s definitely there. “Smooth temperature changes may be a sign of a contact between two objects that has recently occurred at the sampled position,” the patent states.
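The patent doesn't publish an algorithm, but the idea it describes can be sketched in a few lines: compare a thermal frame against the surface's baseline temperature, and flag pixels that are both warmer than the baseline and slowly cooling back toward it. The thresholds, array sizes, and function name below are illustrative assumptions, not values from the application.

```python
import numpy as np

def detect_recent_touch(baseline, frame_t0, frame_t1,
                        warm_delta=1.5, min_cooling=0.05):
    """Flag pixels that look like a recent fingertip touch.

    A touched spot is warmer than the baseline surface temperature
    and is decaying back toward it (all values in degrees C).
    Thresholds here are illustrative, not taken from the patent.
    """
    warmer = (frame_t0 - baseline) > warm_delta    # residual heat left behind
    cooling = (frame_t0 - frame_t1) > min_cooling  # fading, so not a hot object
    return warmer & cooling

# Toy 4x4 "thermal images": surface at 22 C, one touched pixel at (1, 2)
baseline = np.full((4, 4), 22.0)
t0 = baseline.copy(); t0[1, 2] = 25.0  # just after the finger lifts
t1 = baseline.copy(); t1[1, 2] = 24.0  # a moment later, cooling down
mask = detect_recent_touch(baseline, t0, t1)
print(np.argwhere(mask))  # -> [[1 2]]
```

Requiring the warm spot to be cooling, not just warm, is what lets this approach distinguish a recent touch from, say, a mug of coffee sitting on the table.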
The application covers different methods of detecting these heat changes, such as thermal imaging, but it is chiefly concerned with the precision that can be achieved.
"[It] can distinguish between touches or occlusions caused by a human body (which happen on purpose) and touches or occlusions by anything else (which can happen by accident)," it says. "It is able to precisely locate a touch and can identify touch events after they have occurred (that is, even if no camera, computer, or anything else was present at the moment of the touch)."
This depends on the detection happening fairly soon after the touch, but, for example, a user might touch a surface and the thermal imaging could take place as they remove their finger. The application includes an example photograph showing the residual heat of a recent fingertip tap on a surface.
Along with the related idea of showing data only to the wearer of "Apple Glass," this is all part of Apple's extensive AR work. It shows the company taking a deep look, and a long one given that this patent application was filed four years ago, into the practical applications of augmented reality rather than just the visual side.
Among Apple’s many patents in the AR field, there is another by the same Daniel Kurz that concerns the precise manipulation of real and virtual objects together.