The U.S. Patent and Trademark Office officially published a series of 80 newly granted patents for Apple Inc. today. In this particular report we focus on Apple’s invention covering 3D user interface effects on an iPhone or iPad that could take Apple’s current parallax effects to a whole new level, based on a stereoscopic or 3D camera coming to at least the iPhone 7 Plus next month. Such a camera would be able to provide superior head and/or eye tracking, and would also be able to understand and execute in-air gestures. Whether Apple will introduce any of these features in September or in a future version of the iPhone is unknown at this time.
Granted Patent: 3D UI Effects on a Display
Apple was first granted a patent covering 3D UI effects on a display using properties of motion; you can check out that report for more details here. Today’s granted patent 9,411,413 covers different patent claims. One such difference states that “The graphical user interface method of claim 1, wherein at least one of the one or more optical sensors comprises: a front-facing camera, an image sensor, a two-dimensional camera, a stereoscopic camera, an infrared camera, proximity sensor, video camera, or a laser.” The addition of a “stereoscopic camera” is interesting because Apple’s dual-lens camera coming to the iPhone 7 will provide exactly such a camera, which wasn’t mentioned in the earlier patent filing.
Other major deviations found in this granted patent from the 2015 filing include head tracking via a 3D camera on a mobile device. This will only be applicable to the iPhone 7 and devices going forward, which makes it an interesting and timely granted patent. Another new point of interest is the mention of “Lidar data,” which you’ll see below.
As for the noted second point of differentiation: “A device, comprising: a display; one or more optical sensors; one or more positional sensors; a memory; and one or more programmable control devices communicatively coupled to the display, the optical sensors, the positional sensors, and the memory, wherein the memory includes instructions for causing the one or more programmable control devices to: receive optical data from the one or more optical sensors, wherein the optical data comprises one or more of: two-dimensional image data, stereoscopic image data, structured light data, depth map data, and Lidar data; receive non-optical data from one or more non-optical sensors; determine a position of a user of the device’s head based, at least in part, on the received optical data and the received non-optical data; generate a virtual 3D depiction of at least part of a graphical user interface on the display; and apply an appropriate perspective transformation to the virtual 3D depiction of the at least part of the graphical user interface on the display of the device, wherein the instructions to generate and apply are based, at least in part, on the determined position of the user of the device’s head, the received optical data, and the received non-optical data, and wherein the at least part of the graphical user interface is represented in a virtual 3D operating system environment.”
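The claim’s final steps, estimating the user’s head position and then applying a perspective transformation to the rendered UI, can be illustrated with a minimal sketch. To be clear, everything below is our own illustration, not Apple’s method: the function name, the fixed viewing-distance assumption, and the simple depth-scaled parallax model are all hypothetical.

```python
def perspective_offset(head_pos, layer_depth, screen_distance=400.0):
    """Hypothetical parallax model: shift a UI layer opposite the
    user's head displacement, scaled by the layer's virtual depth.

    head_pos: (x, y, z) of the head relative to the screen center,
    in the same units as screen_distance (e.g., millimeters).
    layer_depth: how far 'behind' the glass the layer virtually sits.
    """
    hx, hy, hz = head_pos
    # A farther head produces less parallax; clamp the denominator
    # to avoid a divide-by-zero for degenerate input.
    scale = layer_depth / max(hz + screen_distance, 1e-6)
    return (-hx * scale, -hy * scale)
```

A real implementation would feed this from fused camera and motion-sensor data, as the claim describes, but the depth-scaled offset captures the basic geometry: a head centered on the screen yields no offset, while moving the head right shifts deeper layers left, producing the look-behind effect.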
In the noted ‘detailed description’ area of the patent Apple adds clarity to this point: “This disclosure pertains to techniques for tracking the movement of an electronic device having a display, as well as lighting conditions in the environment of a user of such an electronic device and the movement of the user of such an electronic device–and especially the position of the user of the device’s eyes and/or head.”
Apple’s patent FIG. 10 noted above illustrates an exemplary gesture #1002 for activating the display #102 of a personal electronic device to operate in a virtual 3D operating system environment mode.
While operating in a 3D environment is computationally expensive and can drain the battery, we can understand why Apple is currently working on a fuel cell system for iOS devices, one that could arrive as a next generation of the current Smart Battery Case and power a 3D UI all day.
3D UI Activation Gesture
Apple’s patent further notes that “One potential ‘activation gesture’ is the so-called ‘princess wave,’ i.e., the wave-motion rotation of the device about its Y-axis #1000. For example, the virtual 3D operating system environment mode can be turned on when more than three waves of 10-20 degrees #1002 of modulation along one axis #1000 occur within a predetermined threshold amount of time, e.g., one second.” Position quiescence, e.g., holding the device relatively still for at least a predetermined threshold amount of time, e.g., two to three seconds, could be one potential cue to the device to freeze back to the 2D or 2½D operating system environment mode and restore the display of objects to their traditional 2D representations.
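As a rough illustration of the described trigger, here is a sketch of how “more than three waves of 10-20 degrees within about one second” might be detected from gyroscope samples. The function name, the sampling format, and the turning-point heuristic are our assumptions for illustration only; the patent does not specify an algorithm.

```python
def detect_princess_wave(samples, min_waves=3, min_swing=10.0,
                         max_swing=20.0, window_s=1.0):
    """Hypothetical detector for the patent's 'princess wave'.

    samples: chronological (timestamp_s, y_rotation_deg) readings.
    Returns True when more than min_waves swings of min_swing to
    max_swing degrees occur within window_s seconds.
    """
    if len(samples) < 3:
        return False
    # Locate turning points (local extrema) of the rotation angle.
    turns = [samples[0]]
    for prev, cur, nxt in zip(samples, samples[1:], samples[2:]):
        if (cur[1] - prev[1]) * (nxt[1] - cur[1]) < 0:
            turns.append(cur)
    turns.append(samples[-1])
    # Each adjacent pair of turning points is one swing; keep those
    # whose amplitude falls inside the 10-20 degree band.
    wave_times = [b[0] for a, b in zip(turns, turns[1:])
                  if min_swing <= abs(b[1] - a[1]) <= max_swing]
    # Look for more than min_waves swings inside a sliding window.
    for i in range(len(wave_times) - min_waves):
        if wave_times[i + min_waves] - wave_times[i] <= window_s:
            return True
    return False
```

A complete implementation would also handle the quiescence cue, reverting to the 2D mode after two to three seconds of stillness, but the wave counter above conveys the core idea.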
Apple further notes that each of the icons #1006 on a springboard #110, such as is shown on the display in FIG. 10, could transform from 2D representations of icons into 3D “Lucite” cubes #1004/#1006, i.e., cubes that appear to be made of a clear plastic or glass-like material and that have pictures at the bottom of them (Lucite is a registered trademark of Lucite International, Inc.).