A new eyewear patent application from Apple may shed some light on the company’s much discussed augmented reality (AR) and smart glasses plans.
Rumors have swirled for a while that Apple could be working on smart glasses following its initial AR development for the iPhone. And before Apple unveiled ARKit, its framework for creating AR experiences on the iPhone, in June, a leaked document suggested Apple was working on a prototype pair of smart glasses, possibly with AR capabilities.
Now Patently Apple has spotted a patent application originally filed by Metaio, an AR company that Apple bought in May 2015, which is now attributed to Apple.
The patent describes a “method for representing points of interest in a view of a real environment on a mobile device”. Apple discusses this capability both in terms of a smartphone and a semi-transparent screen that’s part of a head-mounted display.
The points of interest (POI) would include real-world objects like vehicles and buildings that are associated on screen with various content, such as audio, video, pictures, text, or 3D images. Key to interacting with that content would be balloons marking each POI and annotation boxes.
Apple notes that interacting with these overlays on a smartphone is problematic because of limited screen space: a touching finger can block the view. So, instead of touching the screen, the phone's camera could detect the user's finger and use its detected position in place of an actual touch.
As Apple explains, the method could be used for a head-mounted display where physically touching the screen isn’t practical.
“This embodiment is particularly useful when using a head-mounted display comprising the camera and the screen. For example, the head-mounted display is a video-see-through head-mounted display (HMD). It is typically not possible for the user to touch the head-mounted screen in a manner like a touchscreen. However, the camera that captures an image of the real environment may also be used to detect image positions of the user’s finger in the image. The image positions of the user’s finger could be equivalent to touching points touched by the user’s finger on the touchscreen.”
The HMD’s semi-transparent screen would display computer-generated virtual objects that the user sees as an overlay on the real environment they’re looking at. Here, the human eye sees the real environment directly, taking the place of the smartphone’s camera view, while the HMD’s own camera detects the position of the user’s finger as well as the positions of the balloons and annotations.
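The core idea of treating the fingertip's detected image position like a touch point can be sketched roughly as follows. This is an illustrative assumption, not code from the patent: the balloon geometry, the rectangle hit test, and all names here are hypothetical.

```python
# Hypothetical sketch of the finger-as-touch idea: the camera detects the
# fingertip's position in the captured image, and that image-space position
# is hit-tested against POI balloons exactly as a touchscreen tap would be.
from dataclasses import dataclass

@dataclass
class Balloon:
    poi_name: str
    # Screen-space rectangle of the balloon: origin (x, y), size (w, h).
    x: float
    y: float
    w: float
    h: float

    def contains(self, px: float, py: float) -> bool:
        """True if the point (px, py) falls inside this balloon's rectangle."""
        return self.x <= px <= self.x + self.w and self.y <= py <= self.y + self.h

def hit_test(balloons, fingertip):
    """Return the balloon 'touched' by the detected fingertip position, if any."""
    fx, fy = fingertip
    for balloon in balloons:
        if balloon.contains(fx, fy):
            return balloon
    return None

balloons = [Balloon("Cafe", 10, 10, 80, 30), Balloon("Museum", 120, 10, 80, 30)]
print(hit_test(balloons, (40, 25)).poi_name)  # prints Cafe
```

The interesting property is that nothing downstream of `hit_test` needs to know whether the coordinates came from a touchscreen event or from finger detection in a camera frame, which is what makes the same interaction model work on both a phone and an HMD.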
Apple is also considering practical positioning of the annotations and balloons. It notes usability problems if, say, the user is looking at a building and must hold their hand at eye level to point at an annotation at the top of the screen. For one, the hand would block the real view; for another, it would be uncomfortable to hold a hand that high.
To deal with these problems, the patent suggests placing the balloons at the top of the display and visually connecting each one with a line to its associated annotation at the bottom of the display. The user interacts with each POI by touching its annotation, which keeps the arm at a less strenuous angle and stops the finger or hand from obstructing the view.
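That layout scheme can be sketched as below. This is a minimal illustration of the arrangement the patent describes, not Apple's actual method: the even horizontal spacing, the coordinate values, and all function names are assumptions made for the example.

```python
# Illustrative layout: balloons anchored along the top edge of the display,
# tappable annotations along the bottom edge, with a leader line joining
# each balloon to its annotation so the user's hand stays low and clear
# of the real-world view.
def layout(pois, screen_w, screen_h, balloon_y=20.0, annotation_y=None):
    """Evenly space balloons at the top and annotations at the bottom.

    Returns one (poi, balloon_pos, annotation_pos, leader_line) tuple per POI,
    where leader_line is the (start, end) pair connecting the two.
    """
    if annotation_y is None:
        annotation_y = screen_h - 40.0  # keep annotations near the bottom edge
    n = len(pois)
    placements = []
    for i, poi in enumerate(pois):
        x = (i + 0.5) * screen_w / n   # even horizontal spacing across the screen
        balloon = (x, balloon_y)
        annotation = (x, annotation_y)
        placements.append((poi, balloon, annotation, (balloon, annotation)))
    return placements

for poi, balloon, annotation, line in layout(["Cafe", "Museum"], 400, 800):
    print(poi, balloon, annotation)
```

On a 400×800 display this puts the two balloons at the top at x = 100 and x = 300, with their annotations directly below at y = 760, so a touch near the bottom of the screen selects a POI whose marker sits at the top.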