Multi-Level Authentication System Provisions
Apple's patent FIG. 3 illustrates the pop-up screen directing a user to authenticate. Prior to authentication, the data on-screen may be blurred and unreadable until the user has been verified. In some cases, the user may be able to access basic phone privileges, such as making a call, without interference, yet be denied access to email or the contact list until authenticated. The good news here is that there appears to be a lot of flexibility built into Apple's authentication system to meet the varied security levels that users require or demand.
Unique Time-Allotted Security Feature
Apple states that the electronic device may not require a user to authenticate each time the user unlocks or operates the electronic device. And here's where it gets interesting. In some embodiments, the electronic device may allow a user to authenticate for a particular amount of time. For example, once authenticated, the electronic device may allow a user to access restricted resources for 10 hours from the time the user authenticated. As another example, the electronic device may retain the user's authentication for a particular amount of time after having received the user's last instruction or having entered a stand-by mode (e.g., retain authentication for thirty minutes after an input). The amount of time the electronic device retains authentication information may be set by the device or by the user, and may be based on the particular types of resources protected by the authentication information. I think many will want to put a restriction on long-distance calling on their iPhone so that a thief would have only a limited time to rack up charges. This time-allotment approach to security is unique.
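The patent describes the idea at a high level only, but the retention window could be sketched roughly as follows. This is my own minimal illustration, not Apple's implementation; the class name, method names and the 30-minute default are all invented for the example.

```python
import time

# Hypothetical sketch of time-allotted authentication: access is
# granted only while the stored authentication is still fresh.
class AuthSession:
    def __init__(self, timeout_seconds=30 * 60):
        self.timeout = timeout_seconds
        self.last_activity = None

    def authenticate(self, now=None):
        # Record a successful authentication (e.g., a verified fingerprint).
        self.last_activity = now if now is not None else time.time()

    def touch(self, now=None):
        # Each user input refreshes the retention window.
        if self.last_activity is not None:
            self.last_activity = now if now is not None else time.time()

    def is_authenticated(self, now=None):
        if self.last_activity is None:
            return False
        now = now if now is not None else time.time()
        return (now - self.last_activity) <= self.timeout

session = AuthSession(timeout_seconds=1800)   # retain for 30 minutes
session.authenticate(now=0)
print(session.is_authenticated(now=1000))     # True: within the window
print(session.is_authenticated(now=2000))     # False: window expired
```

The same structure would cover the 10-hour example from the patent by simply enlarging the timeout, or per-resource timeouts by keeping one session object per protected resource.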
Types of Biometrics Available
Any suitable authentication system may be implemented. In some embodiments, the authentication system may include a system for detecting biometric features or attributes of a user. For example, the electronic device may include a system operative to detect and authenticate a user based on features of or under a user's skin, such as a fingerprint, hand print, palm print, knuckle print, blood vessel pattern, or any other suitable portion of or under the user's skin. As another example, the electronic device may include a system operative to detect and authenticate a user based on features of a user's eyes or face, or movements of the user's eyes. As still another example, the electronic device may include a system operative to detect features of a user's ear canal, an odor associated with the user, a user's DNA, or any other suitable biometric attribute or information associated with a user.
Apple's patent FIG. 6 shown above is a schematic view of an illustrative electronic device display for detecting a user's fingerprint in accordance with one embodiment of the invention. Display 600 may include screen 602 instructing the user to unlock the electronic device. The classic iPhone or iPod touch slider may in fact become the fingerprint scanner. In some embodiments, sensor 620 may be embedded in the electronic device such that it is not visible in display 600. For example, sensor 620 may be assembled, printed or etched directly on display 600 (e.g., etched on glass) such that the user cannot see the fingerprint scanner. If a user has difficulty providing a suitable fingerprint to sensor 620, display 600 may highlight the outlines of sensor 620 (e.g., display an icon directing the user to place a finger on the icon over sensor 620) to assist the user in authenticating.
Apple's patent FIG. 7 is a schematic view of a MacBook using fingerprint detection. Electronic device 700 may include input mechanisms 710 and 712 that the user may actuate to provide inputs to electronic device 700. For example, input mechanism 710 may include a keyboard, and input mechanism 712 may include a touchpad or track pad.
Furthermore, the MacBook may include a sensor (720) embedded in one or more distinct keys. For example, an optical or capacitive sensor may be placed at the top surface of a key such that when a user places a finger on the key (e.g., rests his index fingers on the "F" or "J" keys), the sensor may detect features of the user's fingertips for authenticating the user. A two-dimensional or moving sensor may be used for this implementation to authenticate the user while the user's fingers are placed over the keys.
In Apple's patent FIG. 9 below, we see another schematic view of a MacBook that could detect a user's hand print. The MacBook may include input mechanism 910 with which a user may provide inputs to the device. Input mechanism 910 may be positioned such that a user's fingers are placed over input mechanism 910 while the user's palms and wrists are placed on or extend over housing 912. The MacBook may include one or more sensors 920 embedded in or placed on housing 912 to authenticate a user of the device. Sensors 920 may be located such that the user's hands, palms or wrists are aligned with sensors 920 when the user places his hands over housing 912 to operate input mechanism 910.
Sensors 920 may be operative to detect features of the user's skin when the user's hands are placed over housing 912, for example using a two-dimensional sensor. In some embodiments, the authentication system may instead or in addition include a sensing mechanism for detecting features underneath a user's skin. For example, the authentication system may include a sensor operative to detect the pattern of a user's veins, arteries, follicle distribution, or any other suitable feature underneath the user's skin that may be detected.
Face Recognition, Retinal Scanning
Apple's patent FIG. 11 is a schematic view of a MacBook having a sensor for detecting features of a user's face within iSight. The MacBook could include sensor 1120 as indicated, so that the features of interest on the user's face may be aligned with camera sensor 1120.
In response to detecting a user's face, the MacBook may direct sensor 1120 to capture and analyze features of the user's face, and compare the analyzed features with a library of features associated with authorized users. If an authorized user is detected, the MacBook may display or provide access to restricted content 1112 on display 1110.
In some embodiments, the authentication system may instead or in addition include a sensor operative to authenticate a user based on attributes of the user's eyes. For example, the sensor may be operative to scan a user's retina, iris or retinal blood vessels to detect unique patterns of the user. The sensor may include a light source operative to emit light, for example infrared light, to be reflected by the user's eye and detected by a lens or optical sensor. The sensor may analyze the received light to create a representation of the user's eyes that can be compared with a library of authorized users' eyes.
As another example, the sensor may instead or in addition be operative to detect movements of the user's eyes, for example by tracking the position and movement of a user's retina, iris, blood vessels, or any other feature of the user's eyes. Before providing a user with access to electronic device resources, the electronic device may direct the sensor to detect a predetermined eye movement set up by an authorized user. For example, each authorized user may create an eye movement track by moving his eyes in a particular manner (e.g., up, down, left, right, blink, blink) while looking at the sensor. When a user of the device moves his eyes in a manner that matches a predetermined eye movement, the electronic device may unlock the device or provide access to restricted resources.
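Once the sensor has reduced the user's eye movements to a sequence of gestures, the matching step is simple ordered comparison. Here's a hedged sketch of that last step only; the enrolled track, the gesture tokens, and the function name are my own illustrative assumptions, and the hard part (turning raw sensor data into "up"/"blink" tokens) is assumed away.

```python
# The authorized user's enrolled eye-movement track, as described in
# the patent's example (up, down, left, right, blink, blink).
ENROLLED_TRACK = ["up", "down", "left", "right", "blink", "blink"]

def matches_eye_track(observed, enrolled=ENROLLED_TRACK):
    # Ordered comparison: extra, missing, or out-of-order movements
    # cause the attempt to fail.
    return list(observed) == list(enrolled)

print(matches_eye_track(["up", "down", "left", "right", "blink", "blink"]))  # True
print(matches_eye_track(["up", "down", "blink"]))                            # False
```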
iPhone Voice, Face Recognition and Ear Canal Sensors
In Apple's patent FIG. 12 we see an iPhone schematic having a sensor for detecting features of a user's eyes in accordance with one embodiment of the invention. The iPhone or iPod touch could include sensor 1220 located adjacent to display 1210 such that the user's eyes may be aligned with sensor 1220 (e.g., in the field of view of sensor 1220) when the user faces display 1210 to view or access electronic device resources. Using sensor 1220, electronic device 1200 may detect features or movements of a user's eyes to authenticate the user and provide access to restricted device resources. In some embodiments, sensor 1220 may be implemented to authenticate a user based on features of the user's face (e.g., like sensor 1120, FIG. 11).
In some embodiments, the authentication system may be operative to authenticate users based on attributes or qualities of their voices. For example, the authentication system may be operative to detect a particular voice pitch or voice signature. The authentication system may be text dependent (e.g., the user must say a particular phrase to authenticate, such as "my voice is my passport") or text independent (e.g., any suitable words may be said to authenticate the user). In some embodiments, the authentication system may require the user to say a secret password to authenticate, thus requiring both knowledge of the user's password and the user's voice pitch to properly authenticate. The authentication system may include any suitable component for authenticating a user, including for example a microphone. In some embodiments, the microphone may be primarily used for other purposes (e.g., telephone communications or video conferencing).
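The text-dependent two-factor idea (right phrase and right voice) boils down to a conjunction of two checks. A minimal sketch, assuming a speaker-recognition model elsewhere produces a similarity score; the phrase, the threshold and the function name are invented for illustration.

```python
# Text-dependent voice authentication sketch: both the spoken
# passphrase and a voiceprint similarity score must pass.
SECRET_PHRASE = "my voice is my passport"

def voice_authenticate(transcript, voiceprint_score, threshold=0.8):
    # transcript: what a speech recognizer heard the user say.
    # voiceprint_score: 0..1 similarity to the enrolled voice model
    # (assumed to come from a real speaker-recognition system).
    phrase_ok = transcript.strip().lower() == SECRET_PHRASE
    voice_ok = voiceprint_score >= threshold
    return phrase_ok and voice_ok

print(voice_authenticate("My voice is my passport", 0.92))  # True
print(voice_authenticate("open sesame", 0.95))              # False: wrong phrase
print(voice_authenticate("my voice is my passport", 0.4))   # False: voice mismatch
```

Requiring both factors means a recording of the phrase by someone else fails on the voiceprint, and the right voice saying the wrong words fails on the phrase.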
In some embodiments, other types of authentication systems may be used. In some embodiments, the authentication system may be operative to identify and authenticate users from the shape of their ear canals. For example, the authentication system may include a sensor (e.g., optical, radar or sonar) operative to detect unique features of a user's ear canal (e.g., shape and length). The sensor may be located, for example, near a speaker of the device (e.g., if the device is a telephone). In some embodiments, the authentication system may be operative to identify a user based on an odor particular to the user. For example, the authentication system may include a sensor operative to detect unique attributes of the odor of a user's skin or sweat glands. The sensor may be located at any suitable position on the device, including for example at or near an input mechanism (e.g., where the user touches the device).
Temporal Pattern Methodology
In some embodiments, the authentication system may include a system operative to identify a user based on a visual or temporal pattern of inputs provided by the user. For example, the electronic device may display several selectable options or shapes forming a visual pattern. The user may select any suitable predetermined subset of displayed options to authenticate. For example, the user may select one or more options that have a predetermined attribute (e.g., size, color, shape or contour) in common. As another example, the user may select one or more options positioned in predetermined areas of the display (e.g., independent of the attributes of the selected options). The user may select options simultaneously, sequentially, or as a combination of these.
As another example, the user may provide a series of inputs at a particular pace or in a particular pattern. For example, the user may select options with a particular delay (e.g., pause between two selections). Alternatively, the user may provide inputs detected by a sensor (e.g., an accelerometer or a gyroscope) of the device following a predetermined temporal pattern. The device may detect the inputs from vibrations caused by tapping the device or an area adjacent to the device, moving the device in a particular manner, or any other suitable approach for detecting inputs.
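The shape-selection scheme above (pick every option sharing a secret attribute) can be sketched in a few lines. The shape list, the attribute, and the function name are all illustrative assumptions, not details from the patent.

```python
# Hypothetical displayed options, each with attributes the patent
# mentions (shape, color); the secret is "select all red shapes".
SHAPES = [
    {"id": 0, "shape": "circle",   "color": "red"},
    {"id": 1, "shape": "square",   "color": "blue"},
    {"id": 2, "shape": "triangle", "color": "red"},
    {"id": 3, "shape": "circle",   "color": "green"},
]

def selections_match(selected_ids, attribute="color", value="red"):
    # Succeed only if every selected shape has the secret attribute
    # value, and every shape with that value was selected.
    chosen = [s for s in SHAPES if s["id"] in selected_ids]
    required = [s for s in SHAPES if s[attribute] == value]
    return chosen == required and len(chosen) > 0

print(selections_match({0, 2}))  # True: exactly the red shapes
print(selections_match({0, 1}))  # False: wrong subset
```

The position-based variant the patent also mentions would work the same way, with screen regions in place of the color attribute.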
Apple's patent FIGS. 13 and 14 are schematic views of an illustrative display for providing a visual pattern.
As an extra security feature, Apple states that after each failed attempt at selecting shapes for authentication, the electronic device may change the distribution of the displayed shapes, or even change the shapes (e.g., use different colors or contours) to prevent an unauthorized user from guessing the proper subset of shapes. The electronic device may lock the device resources after a particular number of failed attempts to select the proper subset of shapes. Once locked, a user may need to couple the device with a host to re-enable the device (e.g., couple a mobile device to a fixed device) or use another authentication system (e.g., a biometric system) to re-enable the device.
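The reshuffle-and-lockout behavior could look roughly like this. The attempt limit, class name and shape list are my own assumptions; the patent leaves those specifics open.

```python
import random

MAX_ATTEMPTS = 3  # invented limit; the patent doesn't specify one

class ShapeChallenge:
    def __init__(self, shapes, correct_subset):
        self.shapes = list(shapes)
        self.correct = set(correct_subset)
        self.failures = 0
        self.locked = False

    def attempt(self, selected):
        if self.locked:
            return False
        if set(selected) == self.correct:
            self.failures = 0
            return True
        self.failures += 1
        random.shuffle(self.shapes)   # redistribute the displayed shapes
        if self.failures >= MAX_ATTEMPTS:
            self.locked = True        # would require a host or biometric reset
        return False

challenge = ShapeChallenge(["circle", "square", "triangle", "star"],
                           {"circle", "star"})
print(challenge.attempt({"circle", "star"}))  # True: correct subset
for _ in range(MAX_ATTEMPTS):
    challenge.attempt({"square"})
print(challenge.locked)                       # True after three failures
```

A fuller sketch would also swap the shapes' colors or contours on each failure, as the patent suggests, rather than only shuffling positions.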
Combining Temporal and Visual Patterns for Authentication
In some embodiments, the electronic device may authenticate a user based instead or in addition on a received temporal pattern of inputs by the user. For example, the user may provide a particular number of inputs at a particular rate to authenticate. The electronic device may detect the inputs using any suitable approach. For example, the electronic device may detect inputs provided using an input mechanism of the device (e.g., inputs received by a touch screen). As another example, the electronic device may detect inputs from motion, contacts, vibrations or other impacts detected by an appropriate sensor of the device (e.g., an accelerometer). In such an approach, a user may tap any portion of the device (or a body in contact with the device, such as a table on which the device is placed) such that the sensor in the device detects the taps and determines whether they correspond to an authorized temporal pattern. As still another example, the electronic device may detect that it has been moved in a particular manner (e.g., shaken twice then spun) using a sensor in the device (e.g., an accelerometer or gyroscope). In response to detecting a correct temporal pattern, the electronic device may provide access to restricted resources.
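The temporal-pattern check amounts to comparing the delays between taps against an enrolled rhythm, within some tolerance for human imprecision. A minimal sketch; the enrolled delays, the tolerance value and the function names are invented for the example.

```python
# The enrolled secret: seconds between five taps (tap-tap, pause, tap-tap).
ENROLLED_DELAYS = [0.2, 0.2, 1.0, 0.2]

def taps_to_delays(timestamps):
    # Convert absolute tap times (e.g., from an accelerometer event
    # stream) into inter-tap intervals.
    return [b - a for a, b in zip(timestamps, timestamps[1:])]

def matches_temporal_pattern(timestamps, enrolled=ENROLLED_DELAYS,
                             tolerance=0.1):
    delays = taps_to_delays(timestamps)
    if len(delays) != len(enrolled):
        return False
    # Each interval must land within the tolerance of the enrolled one.
    return all(abs(d - e) <= tolerance for d, e in zip(delays, enrolled))

print(matches_temporal_pattern([0.0, 0.2, 0.4, 1.4, 1.6]))  # True
print(matches_temporal_pattern([0.0, 0.5, 1.0, 1.5, 2.0]))  # False
```

Working on inter-tap delays rather than absolute times means the pattern matches no matter when the user starts tapping.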
In some embodiments, the authentication system may combine temporal and visual patterns for authentication. For example, a user may be required to select particular displayed shapes at a certain rate (e.g., the first two shapes quickly, then a pause before simultaneously selecting the last two). As another example, the user may be required to first select the proper shapes then provide an input for a temporal pattern. As still another example the user may be required to select one or more shapes and move the device (e.g., shake the device). Any other suitable combination of inputs may be required for authentication.
Inventors: Apple lists the following as the inventors of patent application number 20090083850: Anthony Fadell (Portola Valley, CA); Andrew Hodge (Palo Alto, CA); Stephan Schell (Cupertino, CA); Ruben Caballero (San Jose, CA); Jesse Lee Dorogusker (Los Altos, CA); Stephen Zadesky (Portola Valley, CA); and Emery Sanford (San Francisco, CA).
For more information on today's patent application, simply feed the individual patent number noted above into this search engine.
NOTICE: Patently Apple presents only a brief summary of patents with associated graphic(s) for journalistic news purposes as each such patent application and/or grant is revealed by the U.S. Patent & Trademark Office. Readers are cautioned that the full text of any patent application and/or grant should be read in its entirety for further details.