US20210081553A1 - Management of drone operations and security in a pervasive computing environment - Google Patents
- Publication number
- US20210081553A1 (application US17/098,118)
- Authority
- US
- United States
- Prior art keywords
- data
- aerial drone
- area
- server
- module
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F21/00—Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F21/60—Protecting data
- G06F21/62—Protecting access to data via a platform, e.g. using keys or access control rules
- G06F21/6218—Protecting access to data via a platform, e.g. using keys or access control rules to a system of files or objects, e.g. local or distributed file system or database
- G06F21/6245—Protecting personal data, e.g. for financial or medical purposes
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F21/00—Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F21/60—Protecting data
- G06F21/62—Protecting access to data via a platform, e.g. using keys or access control rules
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64D—EQUIPMENT FOR FITTING IN OR TO AIRCRAFT; FLIGHT SUITS; PARACHUTES; ARRANGEMENT OR MOUNTING OF POWER PLANTS OR PROPULSION TRANSMISSIONS IN AIRCRAFT
- B64D47/00—Equipment not otherwise provided for
- B64D47/08—Arrangements of cameras
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/0055—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots with safety arrangements
- G05D1/0077—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots with safety arrangements using redundant signals or controls
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/10—Simultaneous control of position or course in three dimensions
- G05D1/101—Simultaneous control of position or course in three dimensions specially adapted for aircraft
- G05D1/102—Simultaneous control of position or course in three dimensions specially adapted for aircraft specially adapted for vertical take-off of aircraft
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F21/00—Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F21/60—Protecting data
- G06F21/62—Protecting access to data via a platform, e.g. using keys or access control rules
- G06F21/6218—Protecting access to data via a platform, e.g. using keys or access control rules to a system of files or objects, e.g. local or distributed file system or database
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F21/00—Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F21/60—Protecting data
- G06F21/62—Protecting access to data via a platform, e.g. using keys or access control rules
- G06F21/6218—Protecting access to data via a platform, e.g. using keys or access control rules to a system of files or objects, e.g. local or distributed file system or database
- G06F21/6245—Protecting personal data, e.g. for financial or medical purposes
- G06F21/6254—Protecting personal data, e.g. for financial or medical purposes by anonymising data, e.g. decorrelating personal data from the owner's identification
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q50/00—Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
- G06Q50/10—Services
- G06Q50/18—Legal services
- G06Q50/188—Electronic negotiation
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G5/00—Traffic control systems for aircraft, e.g. air-traffic control [ATC]
- G08G5/0047—Navigation or guidance aids for a single aircraft
- G08G5/006—Navigation or guidance aids for a single aircraft in accordance with predefined flight zones, e.g. to avoid prohibited zones
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G5/00—Traffic control systems for aircraft, e.g. air-traffic control [ATC]
- G08G5/04—Anti-collision systems
- G08G5/045—Navigation or guidance aids, e.g. determination of anti-collision manoeuvers
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L63/00—Network architectures or network communication protocols for network security
- H04L63/08—Network architectures or network communication protocols for network security for authentication of entities
- H04L63/0861—Network architectures or network communication protocols for network security for authentication of entities using biometrical features, e.g. fingerprint, retina-scan
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L63/00—Network architectures or network communication protocols for network security
- H04L63/10—Network architectures or network communication protocols for network security for controlling access to devices or network resources
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L63/00—Network architectures or network communication protocols for network security
- H04L63/20—Network architectures or network communication protocols for network security for managing network security; network security policies in general
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64U—UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
- B64U2101/00—UAVs specially adapted for particular uses or applications
- B64U2101/30—UAVs specially adapted for particular uses or applications for imaging, photography or videography
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64U—UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
- B64U2101/00—UAVs specially adapted for particular uses or applications
- B64U2101/60—UAVs specially adapted for particular uses or applications for transporting passengers; for transporting goods other than weapons
- B64U2101/64—UAVs specially adapted for particular uses or applications for transporting passengers; for transporting goods other than weapons for parcel delivery or retrieval
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L2463/00—Additional details relating to network architectures or network communication protocols for network security covered by H04L63/00
- H04L2463/101—Additional details relating to network architectures or network communication protocols for network security covered by H04L63/00 applying security measures for digital rights management
Definitions
- The present teaching relates to a data processing system, and in particular to managing data privacy and security in a network system that includes a plurality of sensors, processors, and other devices used to capture and process personal and private information.
- A concern of many consumers is that copies of these data are re-combined, re-packaged, and/or re-sold to other data dealers whose interests are not aligned with the consumers' own. Privacy concerns arise when these data are replicated across the Internet and its datacenters and become immortalized in the computing cloud. Because of the redundancy of the copies, these data and metadata are very difficult to protect, delete, and secure through enforcement of data-access constraints.
- Embodiments include a security and privacy wrapper configured to prevent unauthorized use of data beyond a negotiated purpose.
- The security and privacy wrapper is associated with data pertaining to an entity, such as a person, and with a set of permissions associated with that entity.
- The wrapper also contains a set of negotiation instructions that allow it to work independently, or in conjunction with other programs, to dynamically negotiate the use and dispensation of both the data collected for the purpose of the acquisition transaction requested by the entity and any extraneous data acquired along with it.
- Embodiments include a system configured to negotiate the flow of drones over a site location or through drone traffic, responsive to a set of permissions that ensure any extraneous data collected as a result of a drone's activities is disposed of or handled according to a negotiated transaction protocol.
- The system is designed to receive drones used for an intended purpose or transaction by a person or entity, determine whether any extraneous data is collected or unpermitted passage occurs along with the transaction, negotiate the dispensation of that extraneous data or passage, and negotiate and manage the use and dispensation of the transaction data required to perform the transaction.
- Embodiments include a data storage monitoring system configured to intercept data bound for storage and analyze it to determine whether it includes a security data wrapper. If the data contains the wrapper, a storage schema is dynamically negotiated based on a predetermined storage protocol associated with a user or entity. The system also uses the wrapper to determine what data may be retrieved (read) from the storage system, as well as other parameters such as the expiration of the stored data, the number of copies that may be made, the locations where the data may be stored within the storage system, the type of encryption, the permitted uses for the data, whether to partition or destroy extraneous data, and the like.
- Embodiments include a method and system configured to determine how data collected from an entity or person may be combined with other data.
- A sentry code is used to analyze the data, determine a set of recombination permissions, and then use those permissions to either allow or prevent the collected data from being combined with data from other repositories.
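The interception step described in this embodiment can be sketched as follows. This is a minimal illustration, assuming a hypothetical `StoragePolicy` carried by the wrapper; all class and field names here are illustrative, not from the specification:

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class StoragePolicy:
    """Negotiated storage parameters a security data wrapper might carry."""
    ttl_days: int = 30                       # expiration of the stored data
    max_copies: int = 1                      # number of copies that may be made
    allowed_locations: Tuple[str, ...] = ("region-a",)
    destroy_extraneous: bool = True

@dataclass
class DataBlock:
    payload: bytes
    wrapper: Optional[StoragePolicy] = None  # None: no security wrapper present

def intercept_for_storage(block: DataBlock) -> dict:
    """Inspect a block bound for storage and derive its storage schema."""
    if block.wrapper is None:
        return {"restricted": False}         # default, unrestricted schema
    p = block.wrapper
    return {
        "restricted": True,
        "ttl_days": p.ttl_days,
        "copies": p.max_copies,
        "locations": list(p.allowed_locations),
        "destroy_extraneous": p.destroy_extraneous,
    }
```

A real system would enforce the derived schema in the storage layer; the sketch only shows the negotiation of the schema itself.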
- Embodiments for addressing the privacy concerns of personal/private data and metadata are disclosed herein from the perspective of data processing architecture and digital rights management.
- The problem of being unable to practically protect private data is resolved by an architectural change to the data platform that collects, processes, distributes, mines, and stores data, such that private data cannot be rediscovered from stored or shared data.
- Under this architecture, the stored data does not, in fact, contain the private information in the first place.
- The architecture includes a data tag module that wraps every newly generated data block so that a deletion policy and a pipeline policy follow each block from its birth.
- Each question component includes intelligent code to calculate and manipulate the data block to answer a particular question. Once answered, an answer data block is generated.
- The answer data block can also be wrapped by the data tag module.
- The answer data block can be further processed by other question components.
- The architecture ensures that, at all exit points of the data processing architecture, only answer data blocks are allowed to leave. Exit points include a data block being saved to persistent storage, shared, replicated, published, or accessed.
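The exit-point rule above can be sketched as a type gate: raw blocks holding private data are refused at every exit, and only derived answer blocks pass. The class names are hypothetical stand-ins for the architecture's components:

```python
class RawBlock:
    """Holds captured private data; must never leave the architecture."""
    def __init__(self, private_payload: bytes):
        self.private_payload = private_payload

class AnswerBlock:
    """Result of a question component; carries only the derived answer."""
    def __init__(self, answer):
        self.answer = answer

def question_component(raw: RawBlock) -> AnswerBlock:
    # Example question: "is the payload larger than 100 bytes?"
    return AnswerBlock(len(raw.private_payload) > 100)

def exit_point(block):
    """Save/share/replicate/publish/access gate: only answer blocks leave."""
    if not isinstance(block, AnswerBlock):
        raise PermissionError("raw data blocks may not exit")
    return block.answer
```

Because the answer block never contains the raw payload, nothing stored or shared downstream can be mined back into the private data.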
- FIG. 1A illustrates a block diagram of a data collection system over a network in accordance with some embodiments.
- FIG. 1B illustrates a block diagram of a data collection system in accordance with some embodiments.
- FIG. 1C illustrates a block diagram of a security/privacy data wrapper in accordance with some embodiments.
- FIG. 1D illustrates a block diagram of a data marking and tagging system in accordance with some embodiments.
- FIG. 2 illustrates a control flow of a data processing architecture in accordance with some embodiments.
- FIG. 3 illustrates a block diagram of a privacy-aware data processing architecture in accordance with some embodiments.
- FIG. 4 illustrates a device for use with a privacy-aware data processing system in accordance with some embodiments.
- FIG. 5 illustrates a method for privacy-aware data processing in accordance with some embodiments.
- FIG. 6 illustrates a block diagram example of a privacy-aware scenario in accordance with some embodiments.
- FIG. 7 illustrates a method for privacy-aware data processing in accordance with some embodiments.
- FIG. 8 illustrates a block diagram example of a privacy-aware scenario in accordance with some embodiments.
- FIG. 9 illustrates a method for autonomous actor data processing in accordance with some embodiments.
- FIG. 10 is a block diagram illustrating a portable multifunction device with touch-sensitive displays in accordance with some embodiments.
- FIG. 11 illustrates an exemplary computer architecture for use with the present system, in accordance with some embodiments.
- FIG. 12 is an illustrative representation of the method of the present invention as applied to drone routes through space.
- FIGS. 1A-1D describe different ways data is generated or first recognized by a data collection system 100 A.
- FIG. 1A illustrates a network diagram of a data collection system 100 A over a network 100 B.
- The network adaptors 102 can be Ethernet adaptors, Wi-Fi adaptors, and the like. Other systems and devices 104 capable of gathering data include touch devices, cable modems, dial-up modems, satellite communication transceivers, optical links, optical fiber interfaces, cameras, drones, sensors, camera arrays, microphone arrays, infrared sensors, audio sensors, temperature sensors, body imaging devices, activity sensors, accelerometers, radar detectors, sonar, surface tension sensors, weight sensors, mobile phones, smart phones, vibration sensors, camera/projector detection systems, global positioning devices, location transmitters, beacons, location lighting, or any combination thereof.
- FIG. 1B illustrates a block diagram of a data collection system 140 A at the hardware level.
- The data collection system 140 A includes input hardware devices 142 A.
- The input hardware devices 142 A are electronic devices for capturing data via sensors.
- The sensors can be mechanical sensors, electromagnetic sensors, temperature sensors, chemical sensors, pressure sensors, or any combination thereof.
- The input hardware devices 142 A can be computer pointing devices, computer keyboards, cameras, microphones, scanners, or telephones, as well as systems and devices such as touch devices, cable modems, dial-up modems, satellite communication transceivers, optical links, optical fiber interfaces, drones, camera arrays, microphone arrays, infrared sensors, audio sensors, body imaging devices, activity sensors, accelerometers, radar detectors, sonar, surface tension sensors, weight sensors, mobile phones, smart phones, vibration sensors, camera/projector detection systems, global positioning devices, location transmitters, beacons, location lighting, or any combination thereof.
- The input hardware devices 142 A are connected to their respective hardware input adaptors 144 A.
- The hardware input adaptors 144 A are hardware interfaces to the input hardware devices 142 A.
- The hardware input adaptors 144 A report data streams or data blocks, in the form of a raw dataset 146 A, from the input hardware devices 142 A to a processor 148 A.
- The processor 148 A can then decide how to further process the raw dataset 146 A, and whether the raw dataset 146 A is to be saved to a persistent storage 150 A or an unrestricted volatile memory space 152 A.
- FIG. 1C illustrates a block diagram of a security/privacy data wrapper 170 A, herein referred to as “data wrapper.”
- A data wrapper may be understood as a form of metadata.
- Such a wrapper can comprise data in the form of a program, script, code, instruction, permission, container, symbol, or other type of information.
- A data wrapper may be a script or a sequence of instructions to be carried out by another program.
- A data wrapper may, in the context of data transmission, be understood as a "stream" wrapper.
- The stream wrapper may be data placed in front of or around a transmission or stream of underlying data that provides information about, or restricts access to, the data being transmitted.
- A stream wrapper may be data in the form of a header and trailer around the encapsulated payload data within a digital transmission unit (for example, a data packet or frame).
- A stream wrapper may also be understood as a "stream transformation."
- A stream transformation transforms a stream of initial data, through hardware- and/or software-implemented algorithms, into a stream of new data.
- The data wrapper 170 A is software configured to "wrap" data associated with a person, entity, and the like.
- The data wrapper may include security/privacy parameters, a negotiation module, a combination module, an expiration module, a masking module, a recasting module, a dissociation module, a pattern analyzer, and the like.
- The security/privacy parameters comprise the rights, permissions, and other indicia used to define and indicate the person's data privacy and security requirements.
- The security/privacy parameters include one or more levels of negotiable privacy and security parameters.
- The security/privacy parameters may include permissions attached to copyrighted material and its use or sale to third parties.
- The security/privacy parameters may include use cases for certain data, such as personal data associated with a person's social security number, body type, weight, address, age, fingerprint identification, and other data personal to the person.
- Data wrapper 170 A includes a negotiation module.
- The negotiation module may be used to negotiate with data collection devices, such as those described above, over what data may be collected, transmitted, and retransmitted by each device.
- The negotiation module may be software or a combination of hardware and software modules.
- The negotiation module may also be built on artificial intelligence (AI), such that it actively negotiates with devices, such as the data collection devices described herein, acting as an advocate on behalf of the person or entity.
- Such an AI is configured to negotiate at various levels of permission to determine the dispensation of the data used for a transaction and of any extraneous data collected in ancillary processes associated with that collection.
- For example, the AI may be configured to distinguish between a request for an address for credit card verification and a request for a purchase location, so that the credit card request is limited to the minimal subset of data needed to verify the transaction.
- Likewise, the AI can distinguish between the need for a delivery address and the need for credit card verification, providing the delivery service with just enough information to find and deliver the purchased goods.
- The AI may also be configured to recognize that delivering a physical product requires a delivery location, whereas delivering a service may not.
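The purpose-based minimization described above might be sketched as a table mapping each declared purpose to the minimal fields it legitimately needs, with the negotiation releasing only those. The purpose names and field sets below are assumptions for the sketch, not from the specification:

```python
# Minimal fields each declared purpose legitimately needs (illustrative).
PURPOSE_FIELDS = {
    "credit_card_verification": {"billing_zip"},     # minimal verification subset
    "physical_delivery": {"street", "city", "zip"},  # goods need a location
    "digital_service": set(),                        # no location required
}

def negotiate_release(purpose: str, requested_fields: set):
    """Grant only the fields the stated purpose requires; deny the rest."""
    allowed = PURPOSE_FIELDS.get(purpose, set())
    granted = set(requested_fields) & allowed
    denied = set(requested_fields) - allowed
    return granted, denied
```

For instance, a credit card verifier that also asks for a street address would receive only the billing ZIP, with the address request denied.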
- Data wrapper 170 A includes a combination module.
- The combination module may be used to negotiate with data collection devices, such as those described above, to determine what data may be collected and shared between devices and systems.
- For example, the combination module may allow two or more types of personal data to be combined for presentation to a doctor for a medical procedure, but not for a marketing campaign.
- The combination module may be used to prevent or inhibit data mining operations on extraneous data.
- The combination module may be configured to detect combinations of data that, taken together, meet a data mining profile threshold.
- For example, the combination module would not allow data collected for a transaction in which a person buys a dress to be combined with shopping-history metadata from other online shopping databases to infer whether the person is a man or a woman.
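One way to realize such a data-mining profile threshold is to score a proposed merge by how many quasi-identifiers it would accumulate and refuse merges that reach the threshold. The identifier set and threshold value here are illustrative assumptions:

```python
# Attributes that, combined, tend to enable profiling (illustrative set).
QUASI_IDENTIFIERS = {"purchase_history", "zip", "birth_date", "gender_signal"}

def mining_risk(attributes: set) -> int:
    """Toy profiling score: count the quasi-identifiers accumulated."""
    return len(set(attributes) & QUASI_IDENTIFIERS)

def allow_combination(existing: set, incoming: set, threshold: int = 2) -> bool:
    """Reject any merge whose combined attributes meet the mining threshold."""
    return mining_risk(set(existing) | set(incoming)) < threshold
```

In the dress-purchase example, merging transaction data with shopping-history metadata would push the score past the threshold and the combination would be refused.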
- Data wrapper 170 A includes an expiration module.
- The expiration module may be used to establish the expiration limits of data collected from a person or entity.
- The expiration module may be configured to determine, from the negotiation parameters, how long data may be used before being destroyed, converted to an encrypted form, removed from memory, redacted, and the like.
- For example, the expiration module may receive navigation data pertaining to photographs in an autonomous drone's flight records and tag that data with a label that allows it to persist only for the duration of the necessary travel or a set time limit, whichever is shorter.
- The expiration module may also be used to set an expiration time or date on other extraneous data collected, such as the temperature of a room or area inhabited by a person. Such expiration limits may be used in conjunction with the other modules, such as the combination module, to allow data to be combined only within a predefined time limit.
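The "whichever is shorter" rule in the drone-photograph example reduces to a simple tag computation; the choice of seconds as the unit is an assumption of the sketch:

```python
def expiration_tag(travel_seconds: int, negotiated_limit_seconds: int) -> int:
    """Tag data with the lesser of the travel duration and negotiated limit."""
    return min(travel_seconds, negotiated_limit_seconds)

def is_expired(tag_seconds: int, elapsed_seconds: int) -> bool:
    """Once the tagged lifetime elapses, the data must be destroyed,
    encrypted, removed from memory, or redacted per the negotiation."""
    return elapsed_seconds >= tag_seconds
```

A flight of an hour with a ten-minute negotiated limit thus yields a ten-minute tag.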
- Data wrapper 170 A includes a masking module.
- The masking module may be used to establish the viewing or data processing limits of data collected from a person or entity.
- The data is obscured, or "masked," for a specified window of time, which could be indefinite.
- Such a masking module may be used in scenarios where, for example, the navigation data discussed above is blurred instead of deleted.
- Such masking may be configured with a hash code that distorts the data or otherwise prevents data collection devices from perceiving what they would normally be able to detect.
- The masking module may be configured to mix random numbers or other data, such as white noise, into a data hash in order to mask the output of navigational telemetry data. This masking could be used by security personnel, for example, to prevent thieves from photographing fingerprints or other biometric data.
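A minimal sketch of noise-based masking of telemetry, assuming a shared seed stands in for the hash code: a party holding the seed can regenerate and subtract the noise stream, while anyone else sees distorted values. This is obfuscation against casual collection, not cryptographic protection:

```python
import random

def mask_telemetry(values, seed, noise_scale=1000.0):
    """Blend a seeded pseudo-random noise stream into telemetry samples."""
    rng = random.Random(seed)
    return [v + rng.uniform(-noise_scale, noise_scale) for v in values]

def unmask_telemetry(masked, seed, noise_scale=1000.0):
    """A holder of the seed regenerates the same stream and subtracts it."""
    rng = random.Random(seed)
    return [m - rng.uniform(-noise_scale, noise_scale) for m in masked]
```

Because the noise scale dwarfs the signal, masked coordinates are useless to an eavesdropper for the duration of the masking window.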
- Data wrapper 170 A includes a recasting module.
- The recasting module may be used to establish the viewing or data processing limits of data collected from a person or entity.
- The recasting module may add specified data to the personal data in order to "recast" the data used by third parties, preventing data mining or rendering the retrieved data unusable. For example, if a woman buys a dress from an online shopping store, the recasting module may recast the purchase data as some other type of data, e.g., temperature data. The data looks valid, but will not provide accurate results for data mining purposes.
- The recasting module may also make it appear that one type of data has been received when another was actually received. For example, against a surreptitious program counting keystrokes to gauge keyboard usage, the recasting module may add hundreds of mouse clicks to the data in order to overwhelm and detract from the collection process.
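The keystroke example might be sketched as flooding each genuine event with decoy clicks; the event format and decoy ratio are assumptions for illustration:

```python
import random

def recast_keystrokes(events, decoy_ratio=10, seed=0):
    """Interleave each genuine keystroke with decoy mouse-click events,
    overwhelming a surreptitious counter's signal."""
    rng = random.Random(seed)
    out = []
    for event in events:
        out.append(event)
        out.extend({"type": "mouse_click"} for _ in range(decoy_ratio))
    rng.shuffle(out)  # hide the interleaving pattern
    return out
```

A collector counting event types now sees an order of magnitude more clicks than keystrokes, so inferred usage profiles are dominated by noise.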
- Data wrapper 170 A includes a dissociation module.
- The dissociation module may be used to change data that could be associated into data that is dissociated. For example, consider the case where a person orders a book from an online shopping store. The data collected, such as the name, address, and credit card number, may be used to complete the purchase transaction, but when a data mining system tries to use the data, it is altered (such as with the recasting module) to keep it from being associated by third-party systems. For example, the dissociation module may reset the recorded purchase date to look like a different date, modify the address to look like a different address, and so on. In one embodiment, the dissociation applied is proportional to the amount of extraneous data collected.
- The more extraneous data a third party attempts to collect, the more disparate the data becomes with respect to that party's systems.
- Thus, the data collected for the purpose of the transaction may be verified and non-disparate, whereas the data collected by third-party data mining systems becomes increasingly disparate and unusable with respect to the person or entity associated with it.
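A sketch of proportional dissociation: the more extraneous fields a requester asks for, the larger the perturbation applied to those fields, while transaction-required fields pass through untouched. The field names and perturbation formulas are illustrative assumptions:

```python
import datetime
import random

def dissociate(record, extraneous_fields, seed=0):
    """Perturb extraneous fields; distortion grows with over-collection."""
    rng = random.Random(seed)
    severity = len(extraneous_fields)        # proportional to what is over-asked
    out = dict(record)
    if "purchase_date" in extraneous_fields:
        drift = rng.randint(1, 3 * severity)  # days of drift, scales with severity
        out["purchase_date"] = record["purchase_date"] + datetime.timedelta(days=drift)
    if "address" in extraneous_fields:
        out["address"] = f"unit-{rng.randint(1, 100 * severity)}"
    return out
```

Fields the transaction actually needs (here, the name) are returned verbatim, so the purchase still completes while the mined view diverges.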
- data wrapper 170 A includes a pattern generator, embodiments of which are described herein.
- the pattern generator may be used to seek out and detect watermarks, metadata, and other data patterns used by the data wrapper and the security system in order to detect the data wrapper, and to label the data with the negotiation parameters.
- the pattern generator may be configured to work in conjunction with a pattern analyzer as described herein to detect rotating vector data configured to change per transaction. Such data may then be used to prevent data mining processes from circumventing the security and privacy protocols and parameters.
- Such pattern generator may be used as part of the recasting module by translating one form of data pattern to another.
- the pattern generator may also be used to detect encrypted keys used to identify specific types of data.
- biometric data such as DNA may be combined with other data to create a hash data label.
- the hash data label may then be used and decrypted by the pattern analyzer described herein to validate the DNA data in a manner that changes as a function of the change in DNA sequence.
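The hash data label described above can be illustrated with a keyed digest. This is a sketch under assumed inputs; the patent does not specify a hash construction, and the salt and sequence values here are invented.

```python
import hashlib
import hmac

def hash_data_label(biometric_sequence: str, salt: bytes, other: bytes) -> str:
    """Combine a biometric sequence (e.g., a DNA string) with other data
    unique to the person or device into a hash data label. Any change in
    the sequence changes the label, so the label varies as a function of
    the change in the underlying DNA data."""
    message = biometric_sequence.encode() + other
    return hmac.new(salt, message, hashlib.sha256).hexdigest()

label1 = hash_data_label("ACGTACGT", b"owner-salt", b"device-42")
label2 = hash_data_label("ACGTACGA", b"owner-salt", b"device-42")
```

A pattern analyzer holding the same salt can recompute the label to validate the data without the raw sequence ever leaving the wrapper.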
- FIG. 1D illustrates a block diagram of a data collection system 180 that generates and stores metadata based on data analysis.
- the data collection system 180 includes a pattern analyzer 182 .
- an owner ID is combined with the data to form security data wrapper 170 A.
- the owner ID may be a unique ID, such as an encrypted file, that is a combination of personal data and other data unique to the person or entity.
- Such an owner ID may also contain the negotiation parameters or be tied to a contract or other document that indicates the person's data utilization criteria and negotiation parameters.
- the owner ID may be linked to a terms of service that stipulates the owner's intended use for the data being transmitted, the expiration date or time, data recombination criteria, monetary information for data use, and the like.
- the owner ID may also be used as a virtual watermark that may be used to detect violation of the agreements associated with the owner ID.
- the owner ID may be configured as metadata that is attached to the data being transacted, may be a pointer used to direct the data flow to a particular data rights management system, may include a digital key to lock and/or unlock an encrypted storage location, may be part of one or more header files used in packet transmission, and the like.
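The owner ID metadata enumerated above might be modeled as follows. The field names and the digest-based virtual watermark are illustrative assumptions chosen to mirror the terms-of-service items listed in the text.

```python
import hashlib
import json
from dataclasses import asdict, dataclass

@dataclass
class OwnerID:
    """Illustrative owner ID attached to transacted data as metadata."""
    owner: str
    intended_use: str          # owner's intended use for the data
    expires_at: float          # expiration time for the data
    recombination_allowed: bool
    fee_per_use: float         # monetary information for data use

    def watermark(self) -> str:
        # A virtual watermark: a digest over the terms that can later be
        # matched against copies of the data found in the wild to detect
        # violation of the associated agreements.
        terms = json.dumps(asdict(self), sort_keys=True)
        return hashlib.sha256(terms.encode()).hexdigest()

def wrap(data: dict, oid: OwnerID) -> dict:
    """Attach the owner ID and watermark to the data being transacted."""
    return {"payload": data, "owner_id": asdict(oid), "watermark": oid.watermark()}

oid = OwnerID("alice", "shipping only", 1700000000.0, False, 0.0)
wrapped = wrap({"address": "1 Main St"}, oid)
```

In a packet-transmission embodiment the same structure could instead ride in a header file, as the text notes.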
- FIG. 2 illustrates a control flow of data processing architecture 200 .
- the data processing architecture 200 includes a processor 202 .
- the processor 202 receives data inputs 204 from a variety of sources. The exit points of collected data are marked by an outgoing arrow with a circle.
- the processor 202 can receive inputs from a network adaptor 206 , such as the network adaptors 102 A of FIG. 1A , a hardware input adaptor 210 , such as the hardware input adaptors 144 A of FIG. 1B , or a pattern analyzer 214 , such as the pattern analyzer 182 A of FIG. 1D .
- the restrictions can be governed either by a data wrapper (for example, the data wrapper 170 of FIG. 1D ) tagged when the data inputs 204 are first received and/or by data processing applications executed by the processor 202 .
- the restrictions may be triggered at output channels 220 from the data processing architecture 200 , such as the network adaptor 206 , a hardware output adaptor 224 , the volatile memory space 152 A, or a persistent storage 228 .
- the network adaptor 206 can be one of the network adaptors 102 of FIG. 1A .
- the hardware output adaptor 224 is a hardware interface to an output device, such as a printer, a display, a fax, or any combination thereof.
- the volatile memory space 152 is a volatile memory that is shared between applications that are executed on the data processing architecture 200 .
- the persistent storage 228 is a storage device capable of storing data indefinitely even without power, such as the persistent storage 150 A of FIG. 1B .
- FIG. 3 illustrates one embodiment of a digital rights management process 300 incorporating the use of the data wrapper to direct the flow of data and negotiations.
- negotiation parameters and data usage parameters are defined in the form of questions and answers.
- Such questions and answers allow the data to be used, for example, for publication or sharing according to a negotiated agreement between the person or entity associated with the data and third parties.
- the data wrapper may contain the stipulations that only the last four digits of the person's social security number may be used in authentication.
- the third party may need the last six digits of the person's social security number.
- the process 300 may be configured to negotiate the usage of the additional two numbers.
- question #1 302 may be, “May we have the additional two numbers of the client's social security number?”; the answer may be, “Why?” 304 ; the process 300 may reply, “As that is the minimum number needed for the verification process”; the reply from the wrapper may be, “Only if you use the digits for this one transaction”; and the process 300 may reply, “OK.”
- a negotiation process has been completed and the transaction is processed.
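The question-and-answer exchange above can be condensed into a small negotiation routine. This is a sketch: the function name, the single-use concession, and the transcript strings are assumptions tracking the example dialogue, not a prescribed protocol.

```python
def negotiate_ssn_digits(wrapper_max: int, needed: int, single_use_ok: bool):
    """Negotiate use of trailing SSN digits beyond the wrapper's stipulation.

    wrapper_max: digits the data wrapper permits (e.g., last four).
    needed: digits the third party requires (e.g., last six).
    single_use_ok: whether the wrapper will concede extra digits for one
    transaction only, as in the example exchange above."""
    transcript = []
    if needed <= wrapper_max:
        return True, transcript
    transcript.append(f"May we have {needed - wrapper_max} additional digits?")
    transcript.append("Why?")
    transcript.append("That is the minimum number needed for verification.")
    if single_use_ok:
        transcript.append("Only if you use the digits for this one transaction.")
        transcript.append("OK.")
        return True, transcript
    return False, transcript

ok, dialogue = negotiate_ssn_digits(wrapper_max=4, needed=6, single_use_ok=True)
```

When the negotiation exceeds the system's authorization ability, a real implementation would escalate to the person or a human advocate, as described below the example.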
- Other scenarios are contemplated. For example, as negotiations may exceed the system capacity or authorization ability, the person undergoing the transaction process may be alerted that the process requires more input or a decision from the person.
- system 300 may also enlist additional resources such as a human advocate to assist in the third party transaction on behalf of the person via, for example, a question interface.
- system 300 dynamically negotiates the terms that allow the person or entity to vary the terms according to its preferences.
- when the negotiation includes a clause for destruction of data, the system 300 sends the data to a destroy process 306 in order to permanently destroy the data once the transaction is completed according to the agreed upon negotiations.
- the negotiations may extend to the data required to complete a transaction as well as to additional data collected.
- additional data collected such as identity, purpose, time, etc.
- if the negotiation terms were set to destroy such extraneous data, the data would be destroyed.
- the opposing negotiating entity may agree to have the extraneous data used for another purpose, such as for public reporting. The extraneous data would then be used for such agreed upon purpose.
- a breach of the agreement could be set to fine the third parties using the data outside the agreement.
- Such fines could be used as a control mechanism for data mining and other surreptitious activities.
- the fines could also be set up as payments directed to fund third parties. For example, a fine could be redirected to a charity organization.
- penalties could be used as part of the negotiation to have the data mining organizations pay for such data to gain access or as a fine to deter such behavior.
- FIG. 4 is an exemplary system level diagram of a pervasive computing environment 400 utilizing a server, according to one embodiment.
- pervasive computing environment 400 has a cloud 411 and one or more devices (device_ 1 401 , device_ 2 402 , device_N 403 ).
- a device_ 1 401 can have input capabilities 404 , output capabilities 405 , computing capabilities 406 , an input encoder and decoder 407 , an output encoder and decoder 408 , a device agent 409 , and a network stack 410 .
- a device can have any combination of the referenced components and is not required to have all of them.
- a cloud 411 has a supervisor 413 (referred to as an I/O/C supervisor, for input, output, and computing), a name service 412 , a data provider 414 in communication with storage providers ( 416 , 417 ), and an authentication and security gateway 415 .
- device agent 409 is configured to act as a negotiation agent to employ the data wrapper.
- the device agent 409 may be configured to negotiate with the authentication and security gateway 415 .
- the device agent 409 employs the protocols discussed herein to manage the data being sent to and from the various devices 401 , 402 , and 403 .
- the respective device agents may be configured to negotiate between each other and the authentication and security gateway 415 .
- device_ 2 402 , via its device agent 420 , may request, via cloud 411 , a specific set of data 490 from device_ 1 401 .
- device agent 409 may request highly-restrictive data usage terms to be included in a data wrapper (for example, a data wrapper similar to the data wrapper 170 A of FIG. 1C ) associated with data 490 .
- device agent 420 of device_ 2 402 may need to negotiate with the device agent of a third party device, for example, device agent 430 of device_N 403 .
- An individual person requests via a smart phone (in this example, device_ 1 401 ) to purchase a product from an online retailer.
- the online retailer server device (in this example, device_ 2 402 ) requests: 1) the person's shipping address, 2) the person's billing address, and 3) the person's credit card information.
- the device agent of the smart phone (in this example, device agent 409 ) requests that if sent, such data be subject to a data wrapper containing restricted usage terms.
- the device agent for the smart phone (device agent 409 ) analyzes the internal transaction processing systems and procedures of the online retailer.
- it is determined that: 1) the person's data may need to be stored on multiple physical devices within the online retailer's system due to, for example, redundancy and/or mirroring functionality; 2) in order to transmit the data, the data may be transmitted through and stored on the devices of a third-party cloud computing service (in this example cloud 411 ); and 3) in order to complete the transaction, the online retailer must transmit the person's billing address and credit card information to a third-party credit card company for payment verification.
- the device agent for the online retailer server (device agent 420 ) returns this information to the device agent for the person's smart phone (device agent 409 ) with a request that the device agent for the smart phone modify its stipulated restricted usage terms contained within the data wrapper.
- the device agent for the smart phone agrees to modify the terms to 1) allow storage on multiple physical devices within the online retailer's system only where necessary, and 2) allow transfer to a single third-party device for the limited purpose of transmission and payment verification as long as the third-party device is made aware of and complies with the restricted usage terms as negotiated between the device agent for the smart phone (device agent 409 ) and the device agent for the online retailer's server device (device agent 420 ).
- the device agent for the smart phone may incorporate the agreed-to terms into a data wrapper and functionally combine the data wrapper with the data to be transmitted from the person's smart phone to the online retailer's server device.
- this negotiation process may take place automatically without any human input according to preset negotiation parameters.
- the negotiation parameters may be dynamic and automatically adjust based on contextual characteristics of the current situation.
- Contextual characteristics may include, but are not limited to, the nature of the data, the transmission protocols employed along data transmission channels, the security protocols employed along data transmission channels, the geographic location of the sending and receiving devices, as well as intermediary devices along the data transmission channels, the location of data storage systems, the routing path through which the data is to be transmitted over the network (including whether the path will travel over wired vs. wireless physical infrastructure), or any combination thereof.
- a human actor may also pre-configure or dynamically configure in near real- time the negotiation parameters. This may be accomplished in a number of ways including, but not limited to, setting detailed permissions for a set of likely contexts or, at a higher level, providing an overarching directive to a device agent instantiated on a device he or she is using to apply relatively higher or lower protections when negotiating data use restrictions. For example, using a human interaction interface (HII)(also known as a “user interface” (UI)), a high-level configuration directive to the device agent may be set by a person using the device simply through the use of a slider bar on a touch screen interface.
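The slider-bar example above suggests a simple mapping from a high-level directive to concrete negotiation parameters. This is a sketch; the three-level thresholds and the specific term names are assumptions.

```python
def slider_to_terms(position: float) -> dict:
    """Map a 0.0-1.0 privacy slider position from the HII to a set of
    default negotiation parameters, applying relatively higher or lower
    protections as described above. Thresholds are illustrative."""
    if not 0.0 <= position <= 1.0:
        raise ValueError("slider position must be in [0, 1]")
    if position < 0.34:
        terms = {"protection_level": "low",
                 "share_with_third_parties": True, "retention_days": 365}
    elif position < 0.67:
        terms = {"protection_level": "medium",
                 "share_with_third_parties": False, "retention_days": 90}
    else:
        terms = {"protection_level": "high",
                 "share_with_third_parties": False, "retention_days": 0}
    return terms
```

A device agent could start each negotiation from these defaults and then adjust them per the contextual characteristics listed above.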
- the drone may have permissions to expend a range of monetary amounts at a given site, or different ranges for different sites.
- Another possibility would include programming the drone with a mission budget which may be expended over a number of sites across a mission, where the drone would be programmed to optimize the use of the funds. Should a given mission include crossing 5 sites, one of which is substantially larger than the others and is surrounded by passage conditions difficult enough that avoiding the larger site would likely be treacherous, the optimization would allow the autonomous drone to negotiate, using higher monetary offers, to pass through the larger site, because the larger site is of greater importance to the mission than crossing the other 4 sites.
- the autonomous drone would need dynamic configuration either from posing queries to a user concerning new or varying mission conditions, or including artificial intelligence to autonomously alter configured parameters.
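The mission-budget optimization described above might, in its simplest form, be a greedy allocation over quoted crossing fees. This is a sketch under assumed site data; a real planner would also weigh detour cost and passage risk.

```python
def plan_crossings(sites, budget):
    """Spend the mission budget on the most important crossings first and
    reroute around the rest.

    sites: iterable of (name, quoted_fee, importance) tuples (hypothetical).
    Returns a per-site decision and the unspent budget."""
    plan, remaining = {}, budget
    for name, fee, importance in sorted(sites, key=lambda s: -s[2]):
        if fee <= remaining:
            plan[name] = "cross"
            remaining -= fee
        else:
            plan[name] = "reroute"
    return plan, remaining

# Five sites; "E" is the large, important site with a high fee.
sites = [("A", 5, 1), ("B", 5, 1), ("C", 5, 1), ("D", 5, 1), ("E", 40, 10)]
plan, left = plan_crossings(sites, budget=50)
```

Here the drone commits most of its budget to the important site E and reroutes around two of the smaller sites.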
- device agent 409 may be configured to enforce the negotiated terms of the data wrapper associated with transmitted data (for example, data 490 mentioned in the previous paragraphs).
- for example, the smart phone (e.g., device_ 1 401 ) may transmit data 490 to the online retailer's server device (e.g., device_ 2 402 ) along with a data wrapper containing a set of data use restriction terms applicable to both the online retailer and other third parties such as the credit card company and the cloud computing service.
- the device agent of the smart phone may track the propagation of data 490 among other devices connected to the internet.
- the device agent of the smart phone may accomplish this in a number of ways.
- the device agent for the smart phone may periodically “crawl” the internet for data the same as or similar to the data transmitted as data 490 .
- Current internet search engines employ such technology to scour the internet for documents, index the locations of the documents, and return links to such documents as search results in response to submitted queries.
- the device agent for the smart phone may include as part of the data wrapper data in the form of a “water mark” or similar form, recognizable only by the device agent for the smart phone (for example, a continuously changing pattern created by a pattern generator similar to that described in FIG. 1C and recognizable by a pattern analyzer similar to that described in FIG. 1D ).
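The continuously changing watermark described above can be sketched as a counter-keyed digest: the generator derives a fresh pattern per transaction, and the analyzer recognizes its own patterns by recomputation. The derivation scheme and truncation length are assumptions.

```python
import hashlib

def rotating_watermark(secret: bytes, transaction_counter: int) -> str:
    """Pattern-generator side: derive a per-transaction watermark that
    changes with every transaction, per the rotating vector data above."""
    material = secret + transaction_counter.to_bytes(8, "big")
    return hashlib.sha256(material).hexdigest()[:16]

def matches_watermark(candidate: str, secret: bytes, window: range) -> bool:
    """Pattern-analyzer side: recognize our own watermark by recomputing
    it over a window of recent transaction counters."""
    return any(rotating_watermark(secret, n) == candidate for n in window)

wm = rotating_watermark(b"phone-secret", 7)
```

Because only the holder of the secret can recompute the pattern, data found while crawling can be attributed to a specific transmission without the watermark being forgeable.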
- the device agent for the smart phone may enforce the data use restriction terms in a number of different ways.
- the device agent for the smart phone may automatically notify (through email, text message, or otherwise) the person associated with the data that his or her data is being used improperly and inform that person as to the details of the improper use.
- the device agent for the smart phone may automatically notify (through email, text message, or otherwise) a privacy or security regulatory authority and inform that authority as to the details of the improper use.
- the device agent for the smart phone may initiate a process on the data through the data wrapper to curtail the improper use of the data.
- Processes may include, but are not limited to, deleting the data, masking the data, recasting the data, disassociating the data, and/or encrypting the data.
- the device agent for the smart phone may track the usage of the data and monetize the improper usage through, for example, automatically demanding a royalty payment from the offending user for continued use of the data. For example, if data 490 was improperly shared with an entity performing targeted advertising, the device agent of the smart phone may negotiate a royalty payment (payable to the person to whom the data belongs) with a device agent of the targeted advertising entity using data 490 for the continued use of the data 490 . Again, as described earlier, this process may occur automatically without any human input, and/or be based on pre-configured or dynamically configured negotiation parameters.
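The curtailment processes listed above can be framed as a small enforcement dispatcher. The actions here are simplified stand-ins for the patent's delete/mask/recast/notify steps; the action names and behaviors are illustrative.

```python
def enforce(action: str, data: dict) -> dict:
    """Apply one of the curtailment processes to improperly used data."""
    actions = {
        "delete": lambda d: {},                                  # permanently remove
        "mask": lambda d: {k: "***" for k in d},                 # hide values
        "recast": lambda d: {"type": "temperature", "count": len(d)},
        "notify": lambda d: dict(d, notice="owner and regulator notified"),
    }
    if action not in actions:
        raise ValueError(f"unknown enforcement action: {action}")
    return actions[action](data)
```

A device agent detecting a violation could select the action per the data wrapper's terms, escalating from notification to deletion.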
- FIG. 5 illustrates a method 500 for data collection negotiation.
- device 401 is a mobile phone with a camera and the phone is being used to purchase an item using a purchase application instantiated on device 401 , for example, within the network stack 410 .
- the application asks the user of the phone to take a picture of a check for the transaction.
- device 402 is a wireless access point used to convey the transaction data between device 401 and the authentication and security gateway instantiated on cloud server 411 .
- the data from the check is processed on device 401 and is broken into account number, name, address, date, time, and location of transaction.
- the IP address of device 401 may only be used for location verification per the negotiation strategy determined at step 505 and negotiated at step 506 . Therefore, if the negotiation is successful at 508 , according to the negotiation at 510 , as the data is being channeled through device 402 , the agent on device 402 prevents the transmission of the IP address to anyone but the purchase authorization unit in communication with the authentication and security gateway 415 . In this scenario, the IP address data would only be passed to the gateway, and not stored or used by any other device in communication with the wireless access point 402 (device_ 2 ). Further, if the negotiation between the access point 402 and device 401 allows for the IP address to be used to verify that the signal strength and wireless communication channel are stable, then device 402 may use the data for a different but allowed purpose.
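The per-purpose gating performed by the access-point agent above reduces to a policy lookup: a field is forwarded only to destinations the negotiation allows for it. The policy table and destination names are illustrative.

```python
def forward_field(field: str, destination: str, allowed_uses: dict) -> bool:
    """Return True only if the negotiated policy allows this field to be
    passed to this destination; all other egress is blocked."""
    return destination in allowed_uses.get(field, set())

# Negotiated outcome of the check-purchase scenario above (assumed names):
# the IP address may reach only the purchase authorization gateway and a
# local signal-strength check, and nothing else.
policy = {
    "ip_address": {"purchase_authorization_gateway", "signal_strength_check"},
}
allowed = forward_field("ip_address", "purchase_authorization_gateway", policy)
```

Fields absent from the policy default to blocked, matching the stance that data not covered by a negotiation is not released.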
- FIG. 6 is an exemplary system level diagram of a pervasive computing environment 600 utilizing a server, according to one embodiment.
- FIG. 7 illustrates a method 700 for negotiating the use of sensor data collected by sensors in contact with users.
- pervasive computing environment 600 includes a scenario whereby there are two persons (user 1 and user 2 ) having different data wrappers specific to their needs.
- User 1 does not allow any data outside a specified purpose to be shared publicly except to those devices that negotiate the use within the negotiation parameters set forth by the user, while user 2 has preset applications that are allowed to be shared publicly.
- user 1 and user 2 approach computer 1 614 .
- computer 1 has an authentication detection device 612 that in this scenario is in communication with user 1 authentication device 616 and user 2 authentication device 618 .
- sensor data from computer 1 is intercepted by user 1 authentication device 616 and user 2 authentication device 618 .
- data security and privacy rights are determined for user 1 and user 2 .
- the purpose of the data collection is interpreted and negotiated according to the negotiation parameters set forth by user 1 and user 2 .
- Dynamically, at 705 , the data needed from the interaction with the computer is determined and a strategy is agreed to as to the use of the data collected.
- the data collected would be stopped until a negotiation settlement is reached.
- a camera or other sensing device as part of a vehicle 802 used to help deliver a package to a home.
- the delivery may be accomplished using a parcel delivery service or other means, such as a drone or helicopter.
- a geo-fence in this example geo-fence 1 .
- the data wrapper may be employed to initiate the negotiation between the person or entity controlling access to the sensor data of the approaching home (such as the route taken, the location, the house color, and video of the house) and the third party controlling sensor data usage and egress to and from the vehicle.
- a default minimum subset of data is used for the delivery, such as the path and obstacle avoidance.
- all or some of the data collected may be destroyed in accordance with the parameters. For example, if the data being collected will be used by a third party company to produce street maps showing the person's home publicly, and such data usage was prohibited by the person or an entity, such data wrapper may be used to tag and provide verification that such data will be destroyed.
- a second or third geo-fence may be used to set up a negotiation zone such that the data collected may be used for some purposes related to the vehicle proximity to the zone and, for example, to provide an access corridor to the delivery vehicle to enforce that certain data may not be collected under the negotiation agreement parameters.
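The nested geo-fences described above amount to classifying the vehicle's distance from the home into zones with different data rules. The radii and zone names here are assumptions for illustration.

```python
def zone_for(distance_m: float, fences=(50.0, 150.0, 300.0)) -> str:
    """Classify a delivery vehicle's distance from the home against nested
    geo-fences: the innermost ring is the access corridor (minimum data
    set only), the middle ring is the negotiation zone, and the outer ring
    merely triggers contact."""
    corridor, negotiation, contact = fences
    if distance_m <= corridor:
        return "access_corridor"   # default minimum subset of data only
    if distance_m <= negotiation:
        return "negotiation_zone"  # data use terms being negotiated
    if distance_m <= contact:
        return "contact_zone"      # initiate the negotiation
    return "outside"
```

A vehicle entering the negotiation zone without settled terms would, per the surrounding text, have its data collection stopped until a settlement is reached.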
- the data collected would be stopped until a negotiation settlement is reached.
- FIG. 9 illustrates a method 900 for negotiating the use of sensor data collected by sensors on drones or associated with geo-fences.
- One embodiment includes a scenario whereby there are two autonomous actors (actor 1 and 2) having different data wrappers specific to their needs. Actor 1 might manage a territory and not allow any data outside a specified purpose to be shared publicly except with those devices that negotiate the use within the negotiation parameters set forth by a programmer of actor 1, while actor 2 might be an autonomous vehicle programmed to carry out an action and accept a selection of negotiated deviations from that action.
- actor 1 and actor 2 make contact using communicative sensors.
- actor 2 determines the identity and location of actor 1 and rights associated with that identity/location.
- actor 1 determines what actor 2 intends to do with data collected while in the territory managed by actor 1.
- each actor determines what the other's desired programming seeks dynamically. For example, actor 1 may request a specified monetary amount from actor 2 and actor 2 may be configured to offer a monetary amount to continue the programmed action. These amounts could be programmed to include an acceptable range of values.
- the data needed from the interaction by the computer is determined and a strategy as to the use of the data collected is agreed to. In this example, if the negotiation is successful, at 914 actor 1 receives an acceptable monetary amount and access to the territory is granted to actor 2.
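The range-based monetary negotiation between the two actors above can be settled by intersecting their programmed ranges. This is a sketch; the convention of settling at the lowest mutually acceptable value is an assumption.

```python
def settle_fee(requested_range, offer_range):
    """Find an agreeable fee where actor 1's requested range overlaps
    actor 2's programmed offer range; return None if the ranges do not
    intersect and the negotiation fails."""
    low = max(requested_range[0], offer_range[0])
    high = min(requested_range[1], offer_range[1])
    return low if low <= high else None
```

When `settle_fee` returns None, actor 2 would fall back to a negotiated deviation from its programmed action, such as rerouting.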
- FIG. 12 is an illustrative representation of the method of the present invention as applied to drone routes through space.
- Drone A 1202 has original route 1204 .
- the original route 1204 of drone A 1202 crosses properties A, B and C 1206 .
- Properties A, B, and C are surrounded by associated geo-fences 1208 .
- Drone A 1202 will encounter each geo-fence 1208 on original route 1204 and upon reaching each geo-fence 1208 , both the servers (not pictured) supporting the geo-fences 1208 and drone A 1202 will enter into a negotiation.
- Drone A 1202 encounters geo-fence A 1208 of Property A 1206 and the negotiation may proceed such that a drone having the identity or ownership of drone A 1202 may proceed freely for a given nominal fee.
- Drone A's 1202 configured parameters accept the nominal fee of property A 1206 to freely continue on original route 1204 , and drone A transfers the nominal fee to a specified account and continues.
- Drone A 1202 will then encounter geo-fence B 1208 of property B 1206 , and the negotiation proceeds such that a drone having the identity or ownership of drone A 1202 is not allowed to collect photographic data in property B without first paying an exorbitant fee.
- Since drone A 1202 requires collecting such data for navigational purposes, proceeding through property B 1206 will subject it to an exorbitant fee, which the configured parameters of drone A 1202 dictate is not acceptable. As a result of the negotiation not being successful, drone A 1202 will have to re-route and take a modified route 1210 that avoids crossing into Property B 1206 .
- the modified route 1210 of drone A 1202 comes into contact with geo-fence C 1208 of property C 1206 .
- Drone A 1202 may find through negotiation that property C 1206 requires passing drones having the identity or ownership of drone A 1202 to approach from the southwest, not take any photographs for commercial purposes, and delete all data collected while in property C 1206 .
- Drone A 1202 is already approaching from the correct direction, the owner of drone A 1202 has configured drone A 1202 to affirm no intent to sell data collected, and the configured parameters of drone A indicate that navigational data does not have to be retained. Accordingly, Drone A will proceed through property C 1206 along the modified route 1210 , discard prior navigational information and rejoin the original route 1204 .
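The three property negotiations of FIG. 12 condense into a single decision: pay the quoted fee and cross, comply with the property's conditions and cross, or reroute. The term names and parameter values below are illustrative assumptions.

```python
def decide(property_terms: dict, drone_params: dict) -> str:
    """Decide whether a drone crosses a geo-fenced property.

    property_terms: fee quoted at the geo-fence and any behavioral
    conditions (e.g., approach direction, data handling).
    drone_params: the drone's configured maximum fee and the set of
    conditions its owner has pre-authorized it to accept."""
    fee = property_terms.get("fee", 0)
    if fee > drone_params["max_fee"]:
        return "reroute"  # exorbitant fee, as with property B
    for condition in property_terms.get("conditions", []):
        if condition not in drone_params["acceptable_conditions"]:
            return "reroute"
    return "cross"

drone = {"max_fee": 10,
         "acceptable_conditions": {"approach_from_southwest",
                                   "no_commercial_photos",
                                   "delete_data_on_exit"}}
```

Property A's nominal fee and property C's conditions both yield "cross" for this configuration, while property B's exorbitant fee forces the reroute shown in FIG. 12.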
- Drones A and B 1202 reach a communicative proximity 1214 with one another and a negotiation begins.
- Drone B 1202 indicates, for example, weather station ownership and drone A's 1202 configured parameters indicate deferential behavior towards weather drones.
- Drone A 1202 slows down and allows drone B 1202 to pass.
- Drone B 1202 further indicates an oncoming storm to the east and suggests that drone A 1202 lower altitude to avoid damage—drone A 1202 complies.
- The functions discussed in FIG. 12 are merely illustrative and serve to demonstrate several applications of the present invention. A plurality of non-illustrated parameters and negotiated directives could also be implemented.
- FIG. 10 is a block diagram illustrating portable multifunction device 100 with touch-sensitive displays 112 in accordance with some embodiments.
- the touch-sensitive display 112 is sometimes called a “touch screen” for convenience, and may also be known as or called a touch-sensitive display system.
- the device 100 may include a memory 102 (which may include one or more computer readable storage mediums), a memory controller 122 , one or more processing units (CPU's) 120 , a peripherals interface 118 , RF circuitry 108 , audio circuitry 110 , a speaker 111 , a microphone 113 , an input/output (I/O) subsystem 106 , other input or control devices 116 , and an external port 124 .
- the device 100 may include one or more optical sensors 164 . These components may communicate over one or more communication buses or signal lines 103 .
- the device 100 is only one example of a portable multifunction device, and the device 100 may have more or fewer components than shown, may combine two or more components, or may have a different configuration or arrangement of the components.
- the various components shown in FIG. 10 may be implemented in hardware, software or a combination of both hardware and software, including one or more signal processing and/or application specific integrated circuits.
- Memory 102 may include high-speed random access memory and may also include non-volatile memory, such as one or more magnetic disk storage devices, flash memory devices, or other non-volatile solid-state memory devices. Access to memory 102 by other components of the device 100 , such as the CPU 120 and the peripherals interface 118 , may be controlled by the memory controller 122 .
- the peripherals interface 118 couples the input and output peripherals of the device to the CPU 120 and memory 102 .
- the one or more processors 120 run or execute various software programs and/or sets of instructions stored in memory 102 to perform various functions for the device 100 and to process data.
- the peripherals interface 118 , the CPU 120 , and the memory controller 122 may be implemented on a single chip, such as a chip 104 . In some other embodiments, they may be implemented on separate chips.
- the RF (radio frequency) circuitry 108 receives and sends RF signals, also called electromagnetic signals.
- the RF circuitry 108 converts electrical signals to/from electromagnetic signals and communicates with communications networks and other communications devices via the electromagnetic signals.
- the RF circuitry 108 may include well-known circuitry for performing these functions, including but not limited to an antenna system, an RF transceiver, one or more amplifiers, a tuner, one or more oscillators, a digital signal processor, a CODEC chipset, a subscriber identity module (SIM) card, memory, and so forth.
- the RF circuitry 108 may communicate with networks, such as the Internet, also referred to as the World Wide Web (WWW), an intranet and/or a wireless network, such as a cellular telephone network, a wireless local area network (LAN) and/or a metropolitan area network (MAN), and other devices by wireless communication.
- the wireless communication may use any of a plurality of communications standards, protocols and technologies, including but not limited to Global System for Mobile Communications (GSM), Enhanced Data GSM Environment (EDGE), high-speed downlink packet access (HSDPA), wideband code division multiple access (W-CDMA), code division multiple access (CDMA), time division multiple access (TDMA), Bluetooth, Wireless Fidelity (Wi-Fi) (e.g., IEEE 802.11a, IEEE 802.11b, IEEE 802.11g and/or IEEE 802.11n), voice over Internet Protocol (VoIP), Wi-MAX, a protocol for email (e.g., Internet message access protocol (IMAP) and/or post office protocol (POP)), instant messaging (e.g., extensible messaging and presence protocol (XMPP), Session Initiation Protocol for Instant Messaging and Presence Leveraging Extensions (SIMPLE), and/or Instant Messaging and Presence Service (IMPS)), and/or Short Message Service (SMS), or any other suitable communication protocol, including communication protocols not yet developed as of the filing date of this document.
- the audio circuitry 110 , the speaker 111 , and the microphone 113 provide an audio interface between a user and the device 100 .
- the audio circuitry 110 receives audio data from the peripherals interface 118 , converts the audio data to an electrical signal, and transmits the electrical signal to the speaker 111 .
- the speaker 111 converts the electrical signal to human-audible sound waves.
- the audio circuitry 110 also receives electrical signals converted by the microphone 113 from sound waves.
- the audio circuitry 110 converts the electrical signal to audio data and transmits the audio data to the peripherals interface 118 for processing. Audio data may be retrieved from and/or transmitted to memory 102 and/or the RF circuitry 108 by the peripherals interface 118 .
- the audio circuitry 110 also includes a headset jack.
- the headset jack provides an interface between the audio circuitry 110 and removable audio input/output peripherals, such as output-only headphones or a headset with both output (e.g., a headphone for one or both ears) and input (e.g., a microphone).
- the I/O subsystem 106 couples input/output peripherals on the device 100 , such as the touch screen 112 and other input/control devices 116 , to the peripherals interface 118 .
- the I/O subsystem 106 may include a display controller 156 and one or more input controllers 160 for other input or control devices.
- the one or more input controllers 160 receive/send electrical signals from/to other input or control devices 116 .
- the other input/control devices 116 may include physical buttons (e.g., push buttons, rocker buttons, etc.), dials, slider switches, joysticks, click wheels, and so forth.
- input controller(s) 160 may be coupled to any (or none) of the following: a keyboard, infrared port, USB port, and a pointer device such as a mouse.
- the one or more buttons may include an up/down button for volume control of the speaker 111 and/or the microphone 113 .
- the one or more buttons may include a push button. A quick press of the push button may disengage a lock of the touch screen 112 or begin a process that uses gestures on the touch screen to unlock the device. A longer press of the push button may turn power to the device 100 on or off.
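The push-button behavior above can be sketched as a small decision function. This is an illustrative sketch only; the press-duration threshold and the function's interface are assumptions not specified in the source:

```python
# Hypothetical threshold distinguishing a "quick press" from a "longer press".
LONG_PRESS_SECONDS = 2.0

def handle_push_button(press_duration, device_locked, device_on):
    """Return the action taken for a press of the given duration (seconds)."""
    if press_duration >= LONG_PRESS_SECONDS:
        # A longer press toggles power to the device.
        return "power_off" if device_on else "power_on"
    if device_locked:
        # A quick press disengages the touch-screen lock (or starts the
        # gesture-based unlock process on the touch screen).
        return "unlock"
    return "no_op"
```

For example, `handle_push_button(0.2, True, True)` yields `"unlock"`, while a three-second press on a powered device yields `"power_off"`.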
- the user may be able to customize a functionality of one or more of the buttons.
- the touch screen 112 is used to implement virtual or soft buttons and one or more soft keyboards.
- the touch-sensitive touch screen 112 provides an input interface and an output interface between the device and a user.
- the display controller 156 receives and/or sends electrical signals from/to the touch screen 112 .
- the touch screen 112 displays visual output to the user.
- the visual output may include graphics, text, icons, video, and any combination thereof (collectively termed “graphics”). In some embodiments, some or all of the visual output may correspond to user-interface objects, further details of which are described below.
- a touch screen 112 has a touch-sensitive surface, sensor or set of sensors that accepts input from the user based on haptic and/or tactile contact.
- the touch screen 112 and the display controller 156 (along with any associated modules and/or sets of instructions in memory 102 ) detect contact (and any movement or breaking of the contact) on the touch screen 112 and convert the detected contact into interactions with user-interface objects (e.g., one or more soft keys, icons, web pages or images) that are displayed on the touch screen.
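Converting a detected contact into an interaction with a displayed user-interface object amounts to hit-testing the contact point against the objects' screen regions. A minimal sketch, assuming hypothetical object names and rectangle layouts:

```python
def hit_test(contact, objects):
    """Return the first object whose bounding box contains the contact point.

    objects maps a name to a (left, top, width, height) rectangle.
    """
    x, y = contact
    for name, (left, top, width, height) in objects.items():
        if left <= x < left + width and top <= y < top + height:
            return name
    return None  # contact landed outside every object

# Two illustrative soft keys laid out side by side.
soft_keys = {"key_a": (0, 0, 50, 50), "key_b": (50, 0, 50, 50)}
```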
- a point of contact between a touch screen 112 and the user corresponds to a finger of the user.
- the touch screen 112 may use LCD (liquid crystal display) technology, or LPD (light emitting polymer display) technology, although other display technologies may be used in other embodiments.
- the touch screen 112 and the display controller 156 may detect contact and any movement or breaking thereof using any of a plurality of touch sensing technologies now known or later developed, including but not limited to capacitive, resistive, infrared, and surface acoustic wave technologies, as well as other proximity sensor arrays or other elements for determining one or more points of contact with a touch screen 112 .
- the touch screen 112 may have a resolution in excess of 100 dpi. In an exemplary embodiment, the touch screen has a resolution of approximately 160 dpi.
- the user may make contact with the touch screen 112 using any suitable object or appendage, such as a stylus, a finger, and so forth.
- the user interface is designed to work primarily with finger-based contacts and gestures, which are much less precise than stylus-based input due to the larger area of contact of a finger on the touch screen.
- the device translates the rough finger-based input into a precise pointer/cursor position or command for performing the actions desired by the user.
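One way the device could reduce a rough finger contact to a single precise pointer position is to take the centroid of the sensor samples across the contact area. The source does not specify the algorithm, so this is an illustrative assumption:

```python
def contact_centroid(samples):
    """Reduce a finger's contact area to one point.

    samples: list of (x, y) sensor points reported across the contact area.
    Returns their centroid as the cursor position.
    """
    n = len(samples)
    return (sum(x for x, _ in samples) / n,
            sum(y for _, y in samples) / n)
```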
- the device 100 may include a touchpad (not shown) for activating or deactivating particular functions.
- the touchpad is a touch-sensitive area of the device that, unlike the touch screen, does not display visual output.
- the touchpad may be a touch-sensitive surface that is separate from the touch screen 112 or an extension of the touch-sensitive surface formed by the touch screen.
- the device 100 may include a physical or virtual click wheel as an input control device 116 .
- a user may navigate among and interact with one or more graphical objects (henceforth referred to as icons) displayed in the touch screen 112 by rotating the click wheel or by moving a point of contact with the click wheel (e.g., where the amount of movement of the point of contact is measured by its angular displacement with respect to a center point of the click wheel).
- the click wheel may also be used to select one or more of the displayed icons. For example, the user may press down on at least a portion of the click wheel or an associated button.
- User commands and navigation commands provided by the user via the click wheel may be processed by an input controller 160 as well as by one or more of the modules and/or sets of instructions in memory 102 .
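Measuring click-wheel movement by the angular displacement of the contact point about the wheel's center, as described above, can be sketched as follows (the normalization convention is an illustrative choice):

```python
import math

def angular_displacement(center, prev, curr):
    """Angle in radians swept between two contact points about the center."""
    a0 = math.atan2(prev[1] - center[1], prev[0] - center[0])
    a1 = math.atan2(curr[1] - center[1], curr[0] - center[0])
    # Normalize into (-pi, pi] so small back-and-forth motions stay small.
    return (a1 - a0 + math.pi) % (2 * math.pi) - math.pi
```

A quarter turn counterclockwise from (1, 0) to (0, 1) about the origin yields pi/2.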
- the click wheel and click wheel controller may be part of the touch screen 112 and the display controller 156 , respectively.
- the click wheel may be either an opaque or semitransparent object that appears and disappears on the touch screen display in response to user interaction with the device.
- a virtual click wheel is displayed on the touch screen of a portable multifunction device and operated by user contact with the touch screen.
- the device 100 also includes a power system 162 for powering the various components.
- the power system 162 may include a power management system, one or more power sources (e.g., battery, alternating current (AC)), a recharging system, a power failure detection circuit, a power converter or inverter, a power status indicator (e.g., a light-emitting diode (LED)) and any other components associated with the generation, management and distribution of power in portable devices.
- the device 100 may also include one or more optical sensors 164 .
- FIG. 10 shows an optical sensor 164 coupled to an optical sensor controller 158 in I/O subsystem 106 .
- the optical sensor 164 may include charge-coupled device (CCD) or complementary metal-oxide semiconductor (CMOS) phototransistors.
- the optical sensor 164 receives light from the environment, projected through one or more lenses, and converts the light to data representing an image.
- in conjunction with an imaging module 143 (also called a camera module), the optical sensor 164 may capture still images or video.
- an optical sensor is located on the back of the device 100 , opposite the touch screen display 112 on the front of the device, so that the touch screen display may be used as a viewfinder for still and/or video image acquisition.
- an optical sensor is located on the front of the device so that the user's image may be obtained for videoconferencing while the user views the other video conference participants on the touch screen display.
- the position of the optical sensor 164 can be changed by the user (e.g., by rotating the lens and the sensor in the device housing) so that a single optical sensor 164 may be used along with the touch screen display for both video conferencing and still and/or video image acquisition.
- the device 100 may also include one or more proximity sensors 166 .
- FIG. 10 shows a proximity sensor 166 coupled to the peripherals interface 118 .
- the proximity sensor 166 may be coupled to an input controller 160 in the I/O subsystem 106 .
- the proximity sensor turns off and disables the touch screen 112 when the multifunction device is placed near the user's ear (e.g., when the user is making a phone call).
- the proximity sensor keeps the screen off when the device is in the user's pocket, purse, or other dark area to prevent unnecessary battery drainage when the device is in a locked state.
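The two proximity-sensor behaviors above (ear detection during a call, and pocket detection while locked) can be combined into one gating rule. A minimal sketch with an assumed boolean sensor interface:

```python
def screen_should_be_on(proximity_near, in_call, device_locked, ambient_dark):
    """Return False when the touch screen should be turned off."""
    if in_call and proximity_near:
        # Device held near the user's ear during a phone call.
        return False
    if device_locked and proximity_near and ambient_dark:
        # Locked device in a pocket, purse, or other dark area:
        # keep the screen off to prevent unnecessary battery drainage.
        return False
    return True
```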
- the device 100 may also include one or more accelerometers 168 .
- FIG. 10 shows an accelerometer 168 coupled to the peripherals interface 118 .
- the accelerometer 168 may be coupled to an input controller 160 in the I/O subsystem 106 .
- information is displayed on the touch screen display in a portrait view or a landscape view based on an analysis of data received from the one or more accelerometers.
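Choosing between a portrait and a landscape view from accelerometer data can be done by comparing the magnitude of gravity along the device's axes. The axis convention here is an illustrative assumption:

```python
def display_orientation(ax, ay):
    """Pick a view from acceleration (in g) along the device's axes.

    ax: acceleration along the short (horizontal-in-portrait) axis.
    ay: acceleration along the long (vertical-in-portrait) axis.
    """
    # Gravity dominating the long axis means the device is held upright.
    return "landscape" if abs(ax) > abs(ay) else "portrait"
```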
- the software components stored in memory 102 may include an operating system 126 , a communication module (or set of instructions) 128 , a contact/motion module (or set of instructions) 130 , a graphics module (or set of instructions) 132 , a text input module (or set of instructions) 134 , a Global Positioning System (GPS) module (or set of instructions) 135 , and applications (or set of instructions) 136 .
- the operating system 126 (e.g., Darwin, RTXC, LINUX, UNIX, OS X, WINDOWS, or an embedded operating system such as VxWorks) includes various software components and/or drivers for controlling and managing general system tasks (e.g., memory management, storage device control, power management, etc.) and facilitates communication between various hardware and software components.
- the communication module 128 facilitates communication with other devices over one or more external ports 124 and also includes various software components for handling data received by the RF circuitry 108 and/or the external port 124 .
- the external port 124 (e.g., Universal Serial Bus (USB), FIREWIRE, etc.) is adapted for coupling directly to other devices or indirectly over a network (e.g., the Internet, wireless LAN, etc.).
- the external port is a multi-pin (e.g., 30-pin) connector that is the same as, or similar to and/or compatible with, the 30-pin connector used on iPod (trademark of Apple Computer, Inc.) devices.
- the contact/motion module 130 may detect contact with the touch screen 112 (in conjunction with the display controller 156 ) and other touch sensitive devices (e.g., a touchpad or physical click wheel).
- the contact/motion module 130 includes various software components for performing various operations related to detection of contact, such as determining if contact has occurred, determining if there is movement of the contact and tracking the movement across the touch screen 112 , and determining if the contact has been broken (i.e., if the contact has ceased). Determining movement of the point of contact may include determining speed (magnitude), velocity (magnitude and direction), and/or an acceleration (a change in magnitude and/or direction) of the point of contact.
- the contact/motion module 130 and the display controller 156 also detect contact on a touchpad. In some embodiments, the contact/motion module 130 and the controller 160 detect contact on a click wheel.
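The contact-tracking quantities above (speed, velocity, and acceleration of the point of contact) can be sketched from three successive touch samples; the sampling interface is a hypothetical simplification:

```python
def motion_stats(p0, p1, p2, dt):
    """Derive speed, velocity, and acceleration of a contact point.

    p0, p1, p2: three successive (x, y) contact samples, dt seconds apart.
    """
    # Velocity (magnitude and direction) over each sampling step.
    v1 = ((p1[0] - p0[0]) / dt, (p1[1] - p0[1]) / dt)
    v2 = ((p2[0] - p1[0]) / dt, (p2[1] - p1[1]) / dt)
    # Speed is the magnitude of the most recent velocity.
    speed = (v2[0] ** 2 + v2[1] ** 2) ** 0.5
    # Acceleration is the change in velocity per unit time.
    accel = ((v2[0] - v1[0]) / dt, (v2[1] - v1[1]) / dt)
    return speed, v2, accel
```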
- the graphics module 132 includes various known software components for rendering and displaying graphics on the touch screen 112 , including components for changing the intensity of graphics that are displayed.
- graphics includes any object that can be displayed to a user, including, without limitation, text, web pages, icons (such as user-interface objects including soft keys), digital images, videos, animations and the like.
- the text input module 134 which may be a component of graphics module 132 , provides soft keyboards for entering text in various applications (e.g., contacts 137 , e-mail 140 , IM 141 , blogging 142 , browser 147 , and any other application that needs text input).
- the GPS module 135 determines the location of the device and provides this information for use in various applications (e.g., to telephone 138 for use in location-based dialing, to camera 143 and/or blogger 142 as picture/video metadata, and to applications that provide location-based services such as weather widgets, local yellow page widgets, and map/navigation widgets).
- the applications 136 may include the following modules (or sets of instructions), or a subset or superset thereof:
- widget modules 149, which may include weather widget 149-1, stocks widget 149-2, calculator widget 149-3, alarm clock widget 149-4, dictionary widget 149-5, and other widgets obtained by the user, as well as user-created widgets 149-6;
- Examples of other applications 136 that may be stored in memory 102 include other word processing applications, JAVA-enabled applications, encryption, digital rights management, voice recognition, and voice replication.
- the contacts module 137 may be used to manage an address book or contact list, including: adding name(s) to the address book; deleting name(s) from the address book; associating telephone number(s), e-mail address(es), physical address(es) or other information with a name; associating an image with a name; categorizing and sorting names; providing telephone numbers or e-mail addresses to initiate and/or facilitate communications by telephone 138 , video conference 139 , e-mail 140 , or IM 141 ; and so forth.
- Embodiments of user interfaces and associated processes using contacts module 137 are described further below.
- the telephone module 138 may be used to enter a sequence of characters corresponding to a telephone number, access one or more telephone numbers in the address book 137 , modify a telephone number that has been entered, dial a respective telephone number, conduct a conversation and disconnect or hang up when the conversation is completed.
- the wireless communication may use any of a plurality of communications standards, protocols and technologies. Embodiments of user interfaces and associated processes using telephone module 138 are described further below.
- the videoconferencing module 139 may be used to initiate, conduct, and terminate a video conference between a user and one or more other participants. Embodiments of user interfaces and associated processes using videoconferencing module 139 are described further below.
- the e-mail client module 140 may be used to create, send, receive, and manage e-mail.
- the e-mail module 140 makes it very easy to create and send e-mails with still or video images taken with camera module 143 . Embodiments of user interfaces and associated processes using e-mail module 140 are described further below.
- the instant messaging module 141 may be used to enter a sequence of characters corresponding to an instant message, to modify previously entered characters, to transmit a respective instant message (for example, using a Short Message Service (SMS) or Multimedia Message Service (MMS) protocol for telephony-based instant messages or using XMPP, SIMPLE, or IMPS for Internet-based instant messages), to receive instant messages and to view received instant messages.
- transmitted and/or received instant messages may include graphics, photos, audio files, video files and/or other attachments as are supported in a MMS and/or an Enhanced Messaging Service (EMS).
- instant messaging refers to both telephony-based messages (e.g., messages sent using SMS or MMS) and Internet-based messages (e.g., messages sent using XMPP, SIMPLE, or IMPS).
- the blogging module 142 may be used to send text, still images, video, and/or other graphics to a blog (e.g., the user's blog). Embodiments of user interfaces and associated processes using blogging module 142 are described further below.
- the camera module 143 may be used to capture still images or video (including a video stream) and store them into memory 102 , modify characteristics of a still image or video, or delete a still image or video from memory 102 . Embodiments of user interfaces and associated processes using camera module 143 are described further below.
- the image management module 144 may be used to arrange, modify or otherwise manipulate, label, delete, present (e.g., in a digital slide show or album), and store still and/or video images. Embodiments of user interfaces and associated processes using image management module 144 are described further below.
- the video player module 145 may be used to display, present or otherwise play back videos (e.g., on the touch screen or on an external, connected display via external port 124 ). Embodiments of user interfaces and associated processes using video player module 145 are described further below.
- the music player module 146 allows the user to download and play back recorded music and other sound files stored in one or more file formats, such as MP3 or AAC files.
- the device 100 may include the functionality of an MP3 player. Embodiments of user interfaces and associated processes using music player module 146 are described further below.
- the browser module 147 may be used to browse the Internet, including searching, linking to, receiving, and displaying web pages or portions thereof, as well as attachments and other files linked to web pages. Embodiments of user interfaces and associated processes using browser module 147 are described further below.
- the calendar module 148 may be used to create, display, modify, and store calendars and data associated with calendars (e.g., calendar entries, to do lists, etc.). Embodiments of user interfaces and associated processes using calendar module 148 are described further below.
- the widget modules 149 are mini-applications that may be downloaded and used by a user (e.g., weather widget 149-1, stocks widget 149-2, calculator widget 149-3, alarm clock widget 149-4, and dictionary widget 149-5) or created by the user (e.g., user-created widget 149-6).
- a widget includes an HTML (Hypertext Markup Language) file, a CSS (Cascading Style Sheets) file, and a JavaScript file.
- a widget includes an XML (Extensible Markup Language) file and a JavaScript file (e.g., Yahoo! Widgets).
- the widget creator module 150 may be used by a user to create widgets (e.g., turning a user-specified portion of a web page into a widget). Embodiments of user interfaces and associated processes using widget creator module 150 are described further below.
- the search module 151 may be used to search for text, music, sound, images, videos, and/or other files in memory 102 that match one or more search criteria (e.g., one or more user-specified search terms).
- the notes module 153 may be used to create and manage notes, to do lists, and the like. Embodiments of user interfaces and associated processes using notes module 153 are described further below.
- the map module 154 may be used to receive, display, modify, and store maps and data associated with maps (e.g., driving directions; data on stores and other points of interest at or near a particular location; and other location-based data). Embodiments of user interfaces and associated processes using map module 154 are described further below.
- the online video module 155 allows the user to access, browse, receive (e.g., by streaming and/or download), play back (e.g., on the touch screen or on an external, connected display via external port 124 ), send an e-mail with a link to a particular online video, and otherwise manage online videos in one or more file formats, such as H.264.
- instant messaging module 141 rather than e-mail client module 140 , is used to send a link to a particular online video.
- each of the above-identified modules and applications corresponds to a set of instructions for performing one or more of the functions described above.
- video player module 145 may be combined with music player module 146 into a single module (e.g., video and music player module 152 , FIG. 10 ).
- memory 102 may store a subset of the modules and data structures identified above. Furthermore, memory 102 may store additional modules and data structures not described above.
- the device 100 is a device where operation of a predefined set of functions on the device is performed exclusively through a touch screen 112 and/or a touchpad.
- by using a touch screen and/or a touchpad as the primary input/control device for operation of the device 100 , the number of physical input/control devices (such as push buttons, dials, and the like) on the device 100 may be reduced.
- the predefined set of functions that may be performed exclusively through a touch screen and/or a touchpad includes navigation between user interfaces.
- the touchpad when touched by the user, navigates the device 100 to a main, home, or root menu from any user interface that may be displayed on the device 100 .
- the touchpad may be referred to as a “menu button.”
- the menu button may be a physical push button or other physical input/control device instead of a touchpad.
- FIG. 11 illustrates an exemplary computer architecture for use with the present system, according to one embodiment.
- architecture 1100 comprises a system bus 120 for communicating information, and a processor 110 coupled to bus 120 for processing information.
- Architecture 1100 further comprises a random access memory (RAM) or other dynamic storage device 125 (referred to herein as main memory), coupled to bus 120 for storing information and instructions to be executed by processor 110 .
- Main memory 125 also may be used for storing temporary variables or other intermediate information during execution of instructions by processor 110 .
- Architecture 1100 also may include a read only memory (ROM) 127 and/or other static storage device 126 coupled to bus 120 for storing static information and instructions used by processor 110 .
- the computer further includes user input apparatus 128 and computer output apparatus 129 .
- the machine operates as a standalone device or may be connected (e.g., networked) to other machines.
- the machine may operate in the capacity of a server or a client machine in a client-server network environment, or as a peer machine in a peer-to-peer (or distributed) network environment.
- the machine may be a server computer, a client computer, a personal computer (PC), a tablet PC, a laptop computer, a set-top box (STB), a personal digital assistant (PDA), a cellular telephone, an iPhone, a Blackberry, a processor, a telephone, a web appliance, a network router, switch or bridge, or any machine capable of executing a set of instructions (sequential or otherwise) that specifies actions to be taken by that machine.
- although the machine-readable medium or machine-readable storage medium is shown in an exemplary embodiment to be a single medium, the terms “machine-readable medium” and “machine-readable storage medium” should be taken to include a single medium or multiple media (e.g., a centralized or distributed database, and/or associated caches and servers) that store the one or more sets of instructions.
- the terms “machine-readable medium” and “machine-readable storage medium” shall also be taken to include any medium that is capable of storing, encoding or carrying a set of instructions for execution by the machine and that causes the machine to perform any one or more of the methodologies or modules of the presently disclosed technique and innovation.
- routines executed to implement the embodiments of the disclosure may be implemented as part of an operating system or a specific application, component program, object, module or sequence of instructions referred to as “computer programs.”
- the computer programs typically comprise one or more instructions set at various times in various memory and storage devices in a computer that, when read and executed by one or more processing units or processors in a computer, cause the computer to perform operations to execute elements involving the various aspects of the disclosure.
- further examples of machine-readable storage media, machine-readable media, or computer-readable (storage) media include, but are not limited to, recordable type media such as volatile and non-volatile memory devices, floppy and other removable disks, hard disk drives, and optical disks (e.g., Compact Disk Read-Only Memory (CD ROMs), Digital Versatile Disks (DVDs), etc.), among others, and transmission type media such as digital and analog communication links.
- operation of a memory device may comprise a transformation, such as a physical transformation. In some cases, such a transformation may comprise a physical transformation of an article to a different state or thing.
- a change in state may involve an accumulation and storage of charge or a release of stored charge.
- a change of state may comprise a physical change or transformation in magnetic orientation or a physical change or transformation in molecular structure, such as from crystalline to amorphous or vice versa.
- a storage medium typically may be non-transitory or comprise a non-transitory device.
- a non-transitory storage medium may include a device that is tangible, meaning that the device has a concrete physical form, although the device may change its physical state.
- non-transitory refers to a device remaining tangible despite this change in state.
- connection means any connection or coupling, either direct or indirect, between two or more elements; the coupling or connection between the elements can be physical, logical, or any combination thereof.
- the words “herein,” “above,” “below,” and words of similar import when used in this application shall refer to this application as a whole and not to any particular portions of this application. Where the context permits, words in the above Detailed Description using the singular or plural number may also include the plural or singular number, respectively.
- the word “or” in reference to a list of two or more items covers all of the following interpretations of the word: any of the items in the list, all of the items in the list, and any combination of the items in the list.
- a software module is implemented with a computer program product comprising a computer-readable medium containing computer program code, which can be executed by a computer processor for performing any or all of the steps, operations, or processes described.
- Embodiments of the invention may also relate to an apparatus for performing the operations herein.
- This apparatus may be specially constructed for the required purposes, and/or it may comprise a general-purpose computing device selectively activated or reconfigured by a computer program stored in the computer.
- a computer program may be stored in a non-transitory, tangible, computer-readable storage medium, or any type of media suitable for storing electronic instructions, which may be coupled to a computer system bus.
- any computing systems referred to in the specification may include a single processor or may be architectures employing multiple processor designs for increased computing capability.
- Embodiments of the invention may also relate to a product that is produced by a computing process described herein.
- a product may comprise information resulting from a computing process, where the information is stored on a non-transitory, tangible, computer-readable storage medium and may include any embodiment of a computer program product or other data combination described herein.
Description
- This application is a continuation of U.S. patent application Ser. No. 15/482,558, entitled “MANAGEMENT OF DRONE OPERATIONS AND SECURITY IN A PERVASIVE COMPUTING ENVIRONMENT,” and filed Apr. 7, 2017, which is a continuation of U.S. patent application Ser. No. 15/019,892, entitled, “MANAGEMENT OF DRONE OPERATIONS AND SECURITY IN A PERVASIVE COMPUTING ENVIRONMENT” and filed Feb. 9, 2016, now issued as U.S. Pat. No. 9,654,476 on May 16, 2017, which is a continuation of U.S. patent application Ser. No. 14/629,313, entitled, “MANAGEMENT OF DRONE OPERATIONS AND SECURITY IN A PERVASIVE COMPUTING ENVIRONMENT” and filed Feb. 23, 2015, issued as U.S. Pat. No. 9,292,705 on Mar. 22, 2016, which claims the benefit of U.S. Provisional Patent Application No. 61/942,852, entitled, “MANAGEMENT OF DATA PRIVACY AND SECURITY IN A PERVASIVE COMPUTING ENVIRONMENT” and filed Feb. 21, 2014. The content of the above-identified application is incorporated herein by reference in its entirety.
- The present teaching relates to a data processing system, and in particular to managing data privacy and security in a network system that includes a plurality of sensors, processors and other devices used to capture and process personal and private information.
- Free availability of storage space, the existence of large data networks, and multitudes of sensing devices, such as cameras, have helped to spawn the phenomenon of big data analysis. Under this paradigm, people, devices, companies, governments and the like tend to collect some data for purposes such as surveillance, usage patterns, mapping, etc., and in the process collect as much extraneous data as possible, regardless of whether or not the extraneous data is needed for the particular purpose. For example, in a typical credit card transaction, the cardholder's name, address, credit card number, and security pin are all used to verify the identity of the cardholder for purchase authorization. However, the back-end processing system may collect other extraneous data in bulk, such as the location where the transaction is made, the IP address of the purchase, the network provider, etc. After the data is collected in bulk and the relationships between those data are recorded as metadata, data mining applications are often used to process these data and/or metadata to answer specific questions for technical or business reasons.
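The credit-card example above draws a line between the fields required to authorize the purchase and the extraneous data swept up alongside them. That partition can be sketched as follows; the field names are hypothetical illustrations:

```python
# Fields actually needed to verify the cardholder (per the example above).
REQUIRED_FIELDS = {"name", "address", "card_number", "security_pin"}

def split_transaction_data(collected):
    """Partition collected data into required and extraneous portions."""
    required = {k: v for k, v in collected.items() if k in REQUIRED_FIELDS}
    extraneous = {k: v for k, v in collected.items() if k not in REQUIRED_FIELDS}
    return required, extraneous
```

Everything in the extraneous portion (location, IP address, network provider, and so on) is what the embodiments below subject to negotiated dispensation.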
- A concern to many consumers is that copies of these data are re-combined, re-packaged, and/or re-sold to other dealers of data whose particular interests are not aligned with the consumers' interests. Privacy concerns of the original consumer arise when these data are replicated across the vast Internet and its datacenters and become immortalized in the computing cloud. Because of the redundancy of the copies, these data and metadata are very difficult to protect, delete, and secure via enforcement of data access constraints.
- Embodiments include a security and privacy wrapper configured to prevent the unauthorized usage of data beyond a negotiated purpose. In one embodiment, the security and privacy wrapper is associated with data pertaining to an entity, such as a person, and a set of permissions associated with the entity. The wrapper also contains a set of negotiation instructions that allow the wrapper to work independently, or in conjunction with other programs, to dynamically negotiate the usage and dispensation of both the data collected for the purpose of the data acquisition transaction requested by the entity and any extraneous data acquired along with the data required by the purpose of the negotiated transaction.
- Embodiments include a system configured to negotiate the flow of drones over a site location or through drone traffic responsive to a set of permissions utilized to ensure any extraneous data collected as a result of the drone's activities is disposed of or handled according to a negotiated transaction protocol. The system is designed to receive drones used for an intended purpose or transaction by a person or entity, determine whether there is any extraneous data collected or unpermitted passage along with the transaction, negotiate the dispensation of the extraneous data or passage, and negotiate and manage the use and dispensation of the transaction data required to perform the transaction.
- Embodiments include a data storage monitoring system configured to intercept data for storage and to analyze the data to determine whether the data to be stored includes a security data wrapper. If the data contains the wrapper, then a storage schema is dynamically negotiated based upon a predetermined storage protocol associated with a user or entity. The system also uses the wrapper to determine what data may be retrieved (read) from the storage system, as well as determine other parameters such as the expiration of the stored data, number of copies that will be made, locations where the data may be stored within the storage system, type of encryption, type of uses for the data, whether to partition or destroy extraneous data, and the like.
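The storage-monitoring behavior described above can be sketched as follows. This is a minimal illustration assuming a dictionary-based wrapper with hypothetical field names (`expires_after_s`, `max_copies`, `encryption`, `readable_fields`), not the claimed implementation.

```python
# Hypothetical sketch of the storage monitoring system described above:
# incoming data is intercepted, checked for a security data wrapper, and
# a storage schema is derived from the wrapper's protocol. All field
# names (expires_after_s, max_copies, encryption) are assumptions.

def has_wrapper(data):
    return isinstance(data, dict) and "wrapper" in data

def negotiate_storage_schema(data, default_protocol):
    """Intercept data bound for storage and derive a storage schema."""
    if not has_wrapper(data):
        return default_protocol              # unwrapped data: default rules
    wrapper = data["wrapper"]
    return {
        "expires_after_s": wrapper.get("expires_after_s",
                                       default_protocol["expires_after_s"]),
        "max_copies": wrapper.get("max_copies", default_protocol["max_copies"]),
        "encryption": wrapper.get("encryption", default_protocol["encryption"]),
        "readable_fields": wrapper.get("readable_fields", []),
    }

DEFAULTS = {"expires_after_s": 86400, "max_copies": 3, "encryption": "aes-256"}

wrapped = {"payload": {"name": "Alice"},
           "wrapper": {"expires_after_s": 600, "max_copies": 1,
                       "readable_fields": ["name"]}}
schema = negotiate_storage_schema(wrapped, DEFAULTS)
```

Here the wrapper's terms override the storage system's defaults only for the fields the wrapper actually stipulates.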
- Embodiments include a method and system configured to determine how data collected from an entity or person may be combined with other data. In one configuration, a sentry code is used to analyze the data, determine a set of recombination permissions, and then use the permissions to either allow or prevent the collected data from being combined with data from other data repositories.
- Embodiments for solving the privacy concerns of personal/private data and metadata are disclosed herein from a perspective of data processing architecture and digital rights management. The problem of the inability to practically protect private data is resolved by an architectural change to a data platform for collecting, processing, distributing, data mining, and storing data in such a way that private data cannot be rediscovered from stored or shared data. Unlike the conventional methodology of privacy policy enforcement at the data accessing end, such as a policy based on encryption or data access authentication, the stored data under this architecture does not, in fact, contain the private information in the first place. The architecture includes a data tag module that wraps all newly generated data blocks such that a deletion policy and a pipeline policy follow the data blocks from their creation. Once data is collected, the data blocks are piped through question components generated by a question generator module. Each question component includes intelligent code to calculate and manipulate the data block to answer a particular question. Once answered, an answer data block is generated. The answer data block can also be wrapped by the data tag module. The answer data block can be further processed by other question components. The architecture ensures that, at all exit points of the data processing architecture, only answer data blocks are allowed to leave. Exit points include: a data block being saved to persistent storage, a data block being shared, a data block being replicated, a data block being published, or a data block being accessed.
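The question-component pipeline and exit-point rule described above might be sketched as follows; `DataBlock`, `question_component`, and `exit_point` are illustrative names, not the patent's.

```python
# Hypothetical sketch of the exit-point rule: raw data blocks are tagged
# at creation, piped through "question components", and only the
# resulting answer blocks may leave the system.

class DataBlock:
    def __init__(self, payload, is_answer=False):
        self.payload = payload
        self.is_answer = is_answer                 # set by a question component
        self.policy = {"delete_after_use": True}   # deletion/pipeline policy


def question_component(block, question):
    """Compute an answer from a raw block; the raw payload never leaves."""
    if question == "over_18":
        answer = block.payload.get("age", 0) >= 18
    else:
        answer = None
    return DataBlock({"question": question, "answer": answer}, is_answer=True)


def exit_point(block):
    """Gatekeeper at storage/share/replicate/publish/access boundaries."""
    if not block.is_answer:
        raise PermissionError("raw data blocks may not leave the system")
    return block.payload


raw = DataBlock({"name": "Alice", "age": 34})      # private raw data
answer = question_component(raw, "over_18")
print(exit_point(answer))                          # {'question': 'over_18', 'answer': True}
```

The question is answered and the answer leaves, but the name and exact age never cross an exit point.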
- Some embodiments have other aspects, elements, features, and steps in addition to or in place of what is described above. These potential additions and replacements are described throughout the rest of the specification.
- These and other objects, features and characteristics of the present invention will become more apparent to those skilled in the art from a study of the following detailed description in conjunction with the appended claims and drawings, all of which form a part of this specification. In the drawings:
-
FIG. 1A illustrates a block diagram of a data collection system over a network in accordance with some embodiments. -
FIG. 1B illustrates a block diagram of a data collection system in accordance with some embodiments. -
FIG. 1C illustrates a block diagram of a security/privacy data wrapper in accordance with some embodiments. -
FIG. 1D illustrates a block diagram of a data marking and tagging system in accordance with some embodiments. -
FIG. 2 illustrates a control flow of a data processing architecture in accordance with some embodiments. -
FIG. 3 illustrates a block diagram of a privacy-aware data processing architecture in accordance with some embodiments. -
FIG. 4 illustrates a device for use with a privacy-aware data processing system in accordance with some embodiments. -
FIG. 5 illustrates a method for privacy-aware data processing in accordance with some embodiments. -
FIG. 6 illustrates a block diagram example of a privacy aware scenario in accordance with some embodiments. -
FIG. 7 illustrates a method for privacy-aware data processing in accordance with some embodiments. -
FIG. 8 illustrates a block diagram example of a privacy aware scenario in accordance with some embodiments. -
FIG. 9 illustrates a method for autonomous actor data processing in accordance with some embodiments. -
FIG. 10 is a block diagram illustrating a portable multifunction device with touch-sensitive displays in accordance with some embodiments. -
FIG. 11 illustrates an exemplary computer architecture for use with the present system, in accordance with some embodiments. -
FIG. 12 is an illustrative representation of the method of the present invention as applied to drone routes through space. -
- The figures depict various embodiments for purposes of illustration only. One skilled in the art will readily recognize from the following discussion that alternative embodiments of the structures and methods illustrated herein may be employed without departing from the principles described herein.
-
FIGS. 1A-1D describe different ways data is generated or first recognized by a data collection system 100A. FIG. 1A illustrates a network diagram of a data collection system 100A over a network 100B. For example, the network adaptors 102 can be Ethernet adaptors, Wi-Fi adaptors, and the like. Other systems and devices 104 capable of gathering data include touch devices, cable modems, dial-up modems, satellite communication transceivers, optical links, optical fiber interfaces, cameras, drones, sensors, camera arrays, microphone arrays, infrared sensors, audio sensors, temperature sensors, body imaging devices, activity sensors, accelerometers, radar detectors, sonar, surface tension sensors, weight sensors, mobile phones, smart phones, vibration sensors, camera/projector detection systems, global positioning devices, location transmitters, beacons, location lighting, or any combination thereof. -
FIG. 1B illustrates a block diagram of a data collection system 140A at the hardware level. The data collection system 140A includes input hardware devices 142A. The input hardware devices 142A are electronic devices for capturing data via sensors. For example, the sensors can be mechanical sensors, electromagnetic sensors, temperature sensors, chemical sensors, pressure sensors, or any combination thereof. Specifically, the input hardware devices 142A can be computer pointing devices, computer keyboards, cameras, microphones, scanners, telephones, and may be systems and devices such as touch devices, cable modems, dial-up modems, satellite communication transceivers, optical links, optical fiber interfaces, drones, camera arrays, microphone arrays, infrared sensors, audio sensors, body imaging devices, activity sensors, accelerometers, radar detectors, sonar, surface tension sensors, weight sensors, mobile phones, smart phones, vibration sensors, camera/projector detection systems, global positioning devices, location transmitters, beacons, location lighting, or any combination thereof. - The
input hardware devices 142A are connected to their respective hardware input adaptors 144A. The hardware input adaptors 144A are hardware interfaces to the input hardware devices 142A. The hardware input adaptors 144A report data streams or data blocks in the form of a raw dataset 146A from the input hardware devices 142A to a processor 148A. The processor 148A can then decide how to further process the raw dataset 146A, and whether the raw dataset 146A is to be saved onto a persistent storage 150A or an unrestricted volatile memory space 152A. -
FIG. 1C illustrates a block diagram of a security/privacy data wrapper 170A, herein referred to as a “data wrapper.” At a general level, a data wrapper may be understood as a form of metadata; in other words, data (in the form of a program, script, code, instruction, permission, container, symbol, or other type of information or the like) associated with the underlying personal and/or sensitive data for which security and/or privacy is sought. According to some embodiments, a data wrapper may be a script or sequence of instructions to be carried out by another program. According to other embodiments, a data wrapper may, in the context of the transmission of data, be understood as a “stream” wrapper. In such embodiments, the stream wrapper may be data placed in front of or around a transmission or stream of underlying data that may provide information about, or restrict access to, the data being transmitted. For example, in the context of transmission of data over a packet-switched network, a stream wrapper may be data in the form of a header and trailer around the encapsulated payload data within a digital transmission unit (for example, a data packet or frame). According to other embodiments, a stream wrapper may be understood as a “stream transformation.” A stream transformation may transform a stream of initial data, through the use of hardware- and/or software-implemented algorithms, into a stream of new data. - In one embodiment, the
data wrapper 170A is software configured to “wrap” data associated with a person, entity, and the like. The data wrapper may include security/privacy parameters, negotiation module, combination module, expiration module, masking module, recasting module, dissociation module, pattern analyzer, and the like. The security/privacy parameters are associated with the rights, permissions, and other indicia used to define and indicate the security and privacy parameters related to the person's data privacy and security requirements. In other embodiments, the security/privacy parameters include one or more levels of negotiable privacy and security parameters. For example, the security/privacy parameters may include permissions to copyrighted material and the use or sale thereof to third parties. In other examples, the security/privacy parameters may include use cases for certain data, such as personal data associated with a person's social security number, body type, weight, address, age, fingerprint identification, and other data personal to the person. - In other embodiments,
data wrapper 170A includes a negotiation module. The negotiation module may be used to negotiate with data collection devices such as those described above, in order to determine what data may be collected, transmitted, and retransmitted by the device. The negotiation module may be software or a combination of hardware and software modules. The negotiation module may also be configured with an artificial intelligence (AI) such that the negotiation module actively negotiates with devices, such as the data collection devices described herein, to act as an advocate on behalf of the person or entity. In one embodiment, such AI is configured to negotiate at various levels of permission to determine the dispensation of data used for a transaction and of extraneous data collected in one or more ancillary processes associated with the collection of the data. For example, the AI may be configured to distinguish between a request for an address for credit card verification and a request for a purchase location, such that the credit card request would be limited to the minimal subset of data needed to verify the transaction. For location identification, the AI could distinguish between the need for a delivery address and the credit card verification in order to provide sufficient information to the delivery services to find and deliver the purchased goods. In other embodiments, the AI may be configured to distinguish whether a location is needed for delivery of a product, or whether delivery of a service does not require a delivery location. - In other embodiments,
data wrapper 170A includes a combination module. The combination module may be used to negotiate with data collection devices such as those described above in order to determine what data may be collected and shared between devices and systems. For example, the combination module may allow two or more types of personal data to be combined for presentation to a doctor for a medical procedure, but not allow the data to be combined for a marketing campaign. In other examples, the combination module may be used to prevent or inhibit data mining operations from occurring on extraneous data. In this scenario, the combination module may be configured to detect data that, when combined, meets a data mining profile threshold. In one example, the combination module would not allow data collected for a transaction associated with a person buying a dress to be combined with metadata history of online shopping from other online shopping databases to determine whether the person was a man or a woman. - In other embodiments,
data wrapper 170A includes an expiration module. The expiration module may be used to establish the expiration limits of data collected from a person or entity. For example, the expiration module may be configured to determine from the negotiation parameters how long data should be allowed to be used before being destroyed, converted to an encrypted form, removed from memory, redacted, and the like. In one example, the expiration module receives navigation data pertaining to photographs of an autonomous drone's flight records and tags the photo navigation data with a label that only allows such data to last for the duration of the necessary travel, or a time limit, whichever is less. The expiration module may also be used to set an expiration time or date on other extraneous data collected, such as the temperature of a room or area that is being inhabited by a person. Such expiration limits may be used in conjunction with the other processes, such as the combination module, to allow data to be combined, for example, only over a predefined time limit. - In other embodiments,
data wrapper 170A includes a masking module. As with the expiration module, the masking module may be used to establish the viewing or data processing limits of data collected from a person or entity. In the masking module, the data is obscured or “masked” for a specified window of time, which could be indefinite. Such a masking module may be used in scenarios where, for example, the navigation data discussed above could be blurred instead of deleted. Such masking may be configured with a hash code that distorts the data or otherwise prevents data collection devices from perceiving the data they would normally be able to detect. The masking module may be configured to mix random numbers or other data, such as white noise, into a data hash in order to mask the output of navigational telemetry data. This masking could be used by security personnel, for example, to prevent thieves from taking photographs of fingerprints or other biometric data. - In other embodiments,
data wrapper 170A includes a recasting module. As with the expiration and masking processes, the recasting module may be used to establish the viewing or data processing limits of data collected from a person or entity. The recasting module may add specified data to the personal data in order to “recast” the data being used by third parties, to prevent data mining or to make the retrieved data unusable. For example, if a woman buys a dress from an online shopping store, the recasting module may recast the purchase data as some other type of data, e.g., temperature data. The data looks valid, but will not provide accurate results for data mining purposes. In other embodiments, the recasting module may be used to make it appear that one type of data has been received when, in fact, other data was received. For example, for a surreptitious program counting the number of keystrokes to determine keyboard usage, the recasting module may be configured to add hundreds of mouse clicks to the data in order to overwhelm and detract from the data collection process. - In other embodiments,
data wrapper 170A includes a dissociation module. The dissociation module may be used to transform data that could be associated with a person or entity into data that is dissociated from them. For example, consider the case where a person orders a book from an online shopping store. The data that is collected, such as the name, address, and credit card number, may be used to complete the purchase transaction, but when a data mining system tries to use the data, the data is altered (such as with the recasting module) to keep the data from being associated by third-party systems. For example, the date of purchase being collected may be reset by the dissociation module to look like a different date, the address may be modified to look like a different address, etc. In one embodiment, the degree of dissociation is proportional to the amount of extraneous data collected. In other words, the more extraneous data that is attempted to be collected, the more disparate the data becomes with respect to third-party systems. In this scenario, the data being collected for the purpose of the transaction may be verified and non-disparate, whereas the data collected by third-party data mining systems would be more disparate and unusable with respect to the person or entity associated with the data. - In other embodiments,
data wrapper 170A includes a pattern generator, embodiments of which are described herein. The pattern generator may be used to seek out and detect watermarks, metadata, and other data patterns used by the data wrapper and the security system in order to detect the data wrapper and to label the data with the negotiation parameters. For example, the pattern generator may be configured to work in conjunction with a pattern analyzer as described herein to detect rotating vector data configured to change per transaction. Such data may then be used to prevent data mining processes from circumventing the security and privacy protocols and parameters. Such a pattern generator may be used as part of the recasting module by translating one form of data pattern into another. The pattern generator may also be used to detect encrypted keys used to identify specific types of data. For example, biometric data such as DNA may be combined with other data to create a hash data label. The hash data label may then be used and decrypted by the pattern analyzer described herein to validate the DNA data in a manner that changes as a function of the change in the DNA sequence. -
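The expiration, masking, recasting, and dissociation modules described above can be loosely sketched as follows; every rule here (the one-hour hard limit, the noise scale, the relabeling, the perturbation format) is an invented example of the described behavior, not the patent's method.

```python
import random

# Illustrative sketch of several wrapper modules: expiration, masking,
# recasting, and dissociation. All thresholds and formats are assumptions.

def expire_tag(nav_data, travel_duration_s, hard_limit_s=3600):
    # Expiration module: data lasts for the travel time or the hard
    # limit, whichever is less.
    return {"data": nav_data, "ttl_s": min(travel_duration_s, hard_limit_s)}

def mask(samples, scale=50.0, seed=0):
    # Masking module: blur telemetry by mixing in pseudo-random noise
    # (seeded here only so the demo is deterministic).
    rng = random.Random(seed)
    return [s + rng.uniform(-scale, scale) for s in samples]

def recast(record, fake_type="temperature"):
    # Recasting module: relabel the data so mined copies look valid but
    # mislead data mining.
    return {"type": fake_type, "value": record["value"]}

REQUIRED = {"name", "address", "card_number"}

def dissociate(requested_fields, record):
    # Dissociation module: perturbation grows with the amount of
    # extraneous collection; required transaction fields stay verified.
    extraneous = [f for f in requested_fields if f not in REQUIRED]
    out = {}
    for f in requested_fields:
        if f in REQUIRED:
            out[f] = record.get(f)                       # verified, unaltered
        else:
            out[f] = f"altered-{record.get(f)}-x{len(extraneous)}"
    return out
```

Each function stands in for one module; in the architecture above they would be invoked by the wrapper according to the negotiated parameters.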
FIG. 1D illustrates a block diagram of a data collection system 180 that generates and stores metadata based on data analysis. The data collection system 180 includes a pattern analyzer 182. In one embodiment, an owner ID is combined with the data to form the security data wrapper 170A. The owner ID may be a unique ID, such as an encrypted file, that is a combination of personal data and other data unique to the person or entity. Such an owner ID may also contain the negotiation parameters or be tied to a contract or other document that indicates the person's data utilization criteria and negotiation parameters. For example, the owner ID may be linked to terms of service that stipulate the owner's intended use for the data being transmitted, the expiration date or time, data recombination criteria, monetary information for data use, and the like. The owner ID may also be used as a virtual watermark to detect violation of the agreements associated with the owner ID. The owner ID may be configured as metadata that is attached to the data being transacted, may be a pointer used to direct the data flow to a particular data rights management system, may include a digital key to lock and/or unlock an encrypted storage location, may be part of one or more header files used in packet transmission, and the like. -
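One plausible realization of such an owner ID is a digest over the personal data and the negotiated terms, attached to the data as metadata; the SHA-256 scheme below is an assumption for illustration, not the patent's encoding.

```python
import hashlib
import json

# Sketch of an owner ID as a virtual watermark: a digest of personal
# data plus the negotiated terms, attached to the data as metadata.

def make_owner_id(personal_data: dict, terms: dict) -> str:
    blob = json.dumps({"data": personal_data, "terms": terms}, sort_keys=True)
    return hashlib.sha256(blob.encode()).hexdigest()

def attach_owner_id(data, owner_id):
    # Owner ID travels with the transacted data as metadata.
    return {"owner_id": owner_id, "payload": data}

def verify_owner_id(wrapped, personal_data, terms):
    # Detect tampering or use under terms other than those agreed.
    return wrapped["owner_id"] == make_owner_id(personal_data, terms)
```

Because the digest covers both the data and the terms, a copy circulating under altered terms fails verification, which is how a watermark-style violation check could work.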
FIG. 2 illustrates a control flow of data processing architecture 200. The data processing architecture 200 includes a processor 202. The processor 202 receives data inputs 204 from a variety of sources. The exit points of collected data are marked by an outgoing arrow with a circle. As illustrated by FIGS. 1A-1D, for example, the processor 202 can receive inputs from a network adaptor 206, such as the network adaptors 102A of FIG. 1A, a hardware input adaptor 210, such as the hardware input adaptors 144A of FIG. 1B, or a pattern analyzer 214, such as the pattern analyzer 182A of FIG. 1D. - Under the
data processing architecture 200, once the data inputs 204 are processed into processed data 216, further use, including but not limited to alteration, storage, processing, and/or movement of the processed data 216, is restricted. This restriction can be governed either by a data wrapper (for example, the data wrapper 170A of FIG. 1D) tagged when the data inputs 204 are first received and/or by data processing applications executed by the processor 202. In particular, the restrictions may be triggered at output channels 220 from the data processing architecture 200, such as the network adaptor 206, a hardware output adaptor 224, the volatile memory space 152A, or a persistent storage 228. The network adaptor 206 can be one of the network adaptors 102 of FIG. 1A. The hardware output adaptor 224 is a hardware interface to an output device, such as a printer, a display, a fax, or any combination thereof. The volatile memory space 152A is a volatile memory that is shared between applications that are executed on the data processing architecture 200. The persistent storage 228 is a storage device capable of storing data indefinitely even without power, such as the persistent storage 150A of FIG. 1B. -
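The output-channel restriction might be sketched as a guard consulted at each exit point; the channel names mirror the four channels above, while the `allowed_channels` permission field is an assumption for illustration.

```python
# Sketch of wrapper-triggered restrictions at output channels: each exit
# (network, hardware output, shared volatile memory, persistent storage)
# checks the block's wrapper before letting processed data through.

CHANNELS = {"network", "hardware_output", "volatile_memory", "persistent_storage"}

def write_to_channel(block, channel):
    assert channel in CHANNELS
    allowed = block.get("wrapper", {}).get("allowed_channels", set())
    if channel not in allowed:
        raise PermissionError(f"wrapper forbids exit via {channel}")
    return ("written", channel, block["payload"])

block = {"payload": "answer-data",
         "wrapper": {"allowed_channels": {"persistent_storage"}}}
```

With this wrapper, the block may be persisted but any attempt to send it over the network is refused at the exit point.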
FIG. 3 illustrates one embodiment of a digital rights management process 300 incorporating the use of the data wrapper to direct the flow of data and negotiations. In this embodiment, negotiation parameters and data usage parameters are defined in the form of questions and answers. Such questions and answers allow the data to be used, for example, for publication or sharing according to a negotiated agreement between the person or entity associated with the data and third parties. For example, consider the case of a shopping transaction that involves the use of a credit authorization associated with the sale of an automobile. In order to determine whether a buyer has sufficient credit, the data wrapper may contain the stipulation that only the last four digits of the person's social security number may be used in authentication. However, the third party may need the last six digits of the person's social security number. The process 300 may be configured to negotiate the usage of the additional two numbers. For example, question #1 302 may be, “May we have the additional two numbers of the client's social security number?”; the answer may be, “Why?” 304; the process 300 may reply, “As that is the minimum number needed for the verification process”; the reply from the wrapper may be, “Only if you use the digits for this one transaction”; and the process 300 may reply, “OK.” Thus, a negotiation process has been completed and the transaction is processed. Other scenarios are contemplated. For example, as negotiations may exceed the system capacity or authorization ability, the person undergoing the transaction process may be alerted that the process requires more input or a decision from the person.
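The digit-by-digit exchange above amounts to a small negotiation rule; a toy version, with an invented negotiable range and escalation rule, might look like:

```python
# Sketch of the question-and-answer negotiation above: the wrapper grants
# extra social security digits only under a one-transaction-use condition,
# and escalates to the person beyond its authority. The four-digit default
# and six-digit negotiable range are made-up illustrations.

def negotiate_ssn_digits(requested_digits, granted_digits=4):
    """Return (digits_granted, conditions) per the wrapper's rules."""
    if requested_digits <= granted_digits:
        return requested_digits, []
    if requested_digits <= 6:                       # negotiable range
        return requested_digits, ["use for this one transaction only"]
    return granted_digits, ["escalate to the person for a decision"]
```

Requests within the default are granted unconditionally, requests slightly beyond it are granted with a usage condition, and anything larger is deferred to the person, mirroring the alert described above.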
As the person may not understand the ramifications of the transaction and the negotiation of his or her rights, the system 300 may also enlist additional resources, such as a human advocate, to assist in the third-party transaction on behalf of the person via, for example, a question interface. Thus, unlike traditional terms of service agreements and transactions that cancel the transaction unless all the terms are agreed to, the system 300 dynamically negotiates the terms, allowing the person or entity to vary the terms according to its preferences. In one embodiment, if the negotiation includes a clause for destruction of data, the system 300 sends the data to a destroy process 306 in order to permanently destroy the data once the transaction is completed according to the agreed-upon negotiations. - In other embodiments, the negotiations may extend to the data required to complete a transaction as well as to additional data collected. For example, where an autonomous drone operated by the military comes into contact with another entity, and the data needed to complete a transaction includes the drone's ownership, the retention of negotiation data collected, such as identity, purpose, time, etc., could be negotiated as well according to the negotiation parameters. In one example, if the negotiation terms were set to destroy such extraneous data, the data would be destroyed. However, the opposing negotiating entity may agree to have the extraneous data used for another purpose, such as for public reporting. The extraneous data would then be used for such agreed-upon purpose. If the data were used for other purposes, the breach of the agreement could trigger a fine against the third parties using the data outside the agreement. Such fines could be used as a control mechanism against data mining and other surreptitious activities. The fines could also be set up as payments directed to fund third parties. For example, a fine could be redirected to a charity organization.
Thus, such penalties could be used as part of the negotiation, either to have data mining organizations pay for access to such data or as fines to deter such behavior.
-
FIG. 4 is an exemplary system-level diagram of a pervasive computing environment 400 utilizing a server, according to one embodiment. In one embodiment, pervasive computing environment 400 has a cloud 411 and one or more devices (device_1 401, device_2 402, device_N 403). A device_1 401 can have input capabilities 404, output capabilities 405, computing capabilities 406, an input encoder and decoder 407, an output encoder and decoder 408, a device agent 409, and a network stack 410. A device, according to one embodiment, can have any combination of the referenced components and is not required to have all of them. A cloud 411 has a supervisor 413 (referred to as an I/O/C supervisor, for input, output, and computing), a name service 412, a data provider 414 in communication with storage providers (416, 417), and an authentication and security gateway 415. - In one embodiment,
device agent 409 is configured to act as a negotiation agent to employ the data wrapper. The device agent 409 may be configured to negotiate with the authentication and security gateway 415. In this embodiment, the device agent 409 employs the protocols discussed herein to manage the data being sent to and from the various devices and the authentication and security gateway 415. For example, device_2 402, via its device agent 420, may request, via cloud 411, a specific set of data 490 from device_1 401. During negotiation with device agent 420 through the authentication & security gateway 415, device agent 409 may request highly restrictive data usage terms to be included in a data wrapper (for example, a data wrapper similar to the data wrapper 170A of FIG. 1C) associated with data 490. In order to determine whether the data usage terms requested by device agent 409 of device_1 401 will work for its own purposes, device agent 420 of device_2 402 may need to negotiate with the device agent of a third party device, for example, device agent 430 of device_N 403. - In order to further illustrate the embodiment described above, consider the following scenario. An individual person requests via a smart phone (in this example, device_1 401) to purchase a product from an online retailer. The online retailer server device (in this example, device_2 402) requests: 1) the person's shipping address, 2) the person's billing address, and 3) the person's credit card information. In response to the request from the online retailer's server, the device agent of the smart phone (in this example, device agent 409) requests that, if sent, such data be subject to a data wrapper containing restricted usage terms. For example: 1) that the data sent be stored only on the online retailer's server (device_2 402), 2) that the data stored on the online retailer's server device be deleted immediately following completion of the transaction, and 3) that the data never be transferred in any way to a third device.
Following this request from the device agent for the smart phone (device agent 409), the device agent for the online retailer server (device agent 420) analyzes the internal transaction processing systems and procedures of the online retailer. This analysis reveals the following: 1) the person's data may need to be stored on multiple physical devices within the online retailer's system due to, for example, redundancy and/or mirroring functionality, 2) in order to transmit the data, the data may be transmitted through and stored on the devices of a third-party cloud computing service (in this example, cloud 411), and 3) in order to complete the transaction, the online retailer must transmit the person's billing address and credit card information to a third-party credit card company for payment verification. The device agent for the online retailer server (device agent 420) returns this information to the device agent for the person's smart phone (device agent 409) with a request that the device agent for the smart phone modify its stipulated restricted usage terms contained within the data wrapper. With the goal of effectuating the transaction, the device agent for the smart phone (device agent 409) agrees to modify the terms to 1) allow storage on multiple physical devices within the online retailer's system only where necessary, and 2) allow transfer to a single third-party device for the limited purpose of transmission and payment verification, as long as the third-party device is made aware of and complies with the restricted usage terms as negotiated between the device agent for the smart phone (device agent 409) and the device agent for the online retailer's server device (device agent 420).
This second request, that the third-party device comply with negotiated restricted usage terms, will in turn necessitate similar back-and-forth negotiations between the device agent for the online retailer's server device (device agent 420) and the device agent for the credit card company's server device (in this example, device agent 430), and between the device agent for the online retailer's server device (device agent 420) and the authentication & security gateway for the cloud computing service (in this example, authentication and security gateway 415). Once the multiple devices and the cloud computing service arrive at an acceptable set of restricted usage terms, the device agent for the smart phone (device agent 409) may incorporate the agreed-to terms into a data wrapper and functionally combine the data wrapper with the data to be transmitted from the person's smart phone to the online retailer's server device.
- As discussed earlier, this negotiation process may take place automatically without any human input according to preset negotiation parameters. The negotiation parameters, however, may be dynamic and automatically adjust based on contextual characteristics of the current situation. Contextual characteristics may include, but are not limited to, the nature of the data, the transmission protocols employed along data transmission channels, the security protocols employed along data transmission channels, the geographic location of the sending and receiving devices, as well as intermediary devices along the data transmission channels, the location of data storage systems, the routing path through which the data is to be transmitted over the network (including whether the path will travel over wired vs. wireless physical infrastructure), or any combination thereof.
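By way of illustration, the dynamic adjustment of negotiation parameters based on contextual characteristics might be approximated by a scoring heuristic such as the following Python sketch. The characteristic names and the scoring weights are hypothetical assumptions, not drawn from the specification.

```python
def protection_level(context: dict) -> str:
    """Score contextual characteristics and map them to a protection level.

    The characteristic names and weights are illustrative assumptions.
    """
    score = 0
    if context.get("data_sensitivity") == "financial":
        score += 2                                  # nature of the data
    if context.get("channel") == "wireless":
        score += 1                                  # wireless vs. wired infrastructure
    region = context.get("receiver_region")
    if region is not None and region not in context.get("trusted_regions", set()):
        score += 1                                  # geographic location of the receiver
    return "high" if score >= 3 else "medium" if score >= 1 else "low"
```

A device agent could feed the resulting level into its negotiation strategy, demanding tighter data wrapper terms when the level is high.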
- A human actor may also pre-configure or dynamically configure the negotiation parameters in near real-time. This may be accomplished in a number of ways including, but not limited to, setting detailed permissions for a set of likely contexts or, at a higher level, providing an overarching directive to a device agent instantiated on a device he or she is using to apply relatively higher or lower protections when negotiating data use restrictions. For example, using a human interaction interface (HII) (also known as a "user interface" (UI)), a person using the device may set a high-level configuration directive for the device agent simply through the use of a slider bar on a touch screen interface. In the case of programming an autonomous drone, the drone may have permission to expend a range of monetary amounts at a given site, or different ranges for different sites. Another possibility would be programming the drone with a mission budget which may be expended over a number of sites across a mission, where the drone is programmed to optimize the use of the funds. Suppose a given mission includes crossing 5 sites, one of which is substantially larger than the others and is notoriously surrounded by difficult passage conditions, such that avoiding the larger site would likely be treacherous. The optimization would allow the autonomous drone to negotiate, using higher monetary offers, to pass through the larger site because the larger site is of greater importance to the mission than crossing the other 4 sites. In a similar example, crossing a given site may become more important to an overall mission en route should dangerous weather systems develop. Accordingly, the autonomous drone would need dynamic configuration, either by posing queries to a user concerning new or varying mission conditions or by including artificial intelligence to autonomously alter configured parameters.
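The mission-budget optimization described above can be illustrated by a proportional allocation sketch in Python. The site names and importance weights are hypothetical; a real drone would combine such an allocation with route-risk estimates.

```python
def allocate_offers(budget: float, site_importance: dict) -> dict:
    """Reserve more of a fixed mission budget for negotiating passage through
    more important sites (simple proportional allocation)."""
    total = sum(site_importance.values())
    return {site: round(budget * weight / total, 2)
            for site, weight in site_importance.items()}

# Five sites, one (site_a) far more important to the mission than the rest.
offers = allocate_offers(100.0, {"site_a": 6, "site_b": 1, "site_c": 1,
                                 "site_d": 1, "site_e": 1})
```

With site_a weighted six times the other four sites, most of the 100-unit budget is reserved for negotiating passage through the larger, more important site.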
- In another embodiment,
device agent 409 may be configured to enforce the negotiated terms of the data wrapper associated with transmitted data (for example, data 490 mentioned in the previous paragraphs). Consider the illustrative scenario discussed in the previous paragraphs. The smart phone (e.g., device_1 401) has transmitted data 490 (containing the person's shipping address, billing address, and credit card information) to the online retailer's server device (e.g., device_2 402) along with a data wrapper containing a set of data use restriction terms applicable to the online retailer as well as to third parties such as the credit card company and the cloud computing service. In order to enforce the data use restriction terms contained in the data wrapper of data 490, the device agent of the smart phone (e.g., device agent 409) may track the propagation of data 490 among other devices connected to the internet. The device agent of the smart phone may accomplish this in a number of ways. - According to one embodiment, the device agent for the smart phone may periodically "crawl" the internet for data identical or similar to the data transmitted as data 490. Current internet search engines employ such technology to scour the internet for documents, index the locations of the documents, and return links to such documents as search results in response to submitted queries.
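A minimal illustration of such a crawl, assuming the device agent already holds an index of document locations and their content (as a search engine would), might look like the following Python sketch. The index entries and URLs are hypothetical.

```python
def find_matches(transmitted_fields: set, indexed_documents: dict) -> list:
    """Return the locations of indexed documents whose content overlaps the
    transmitted data fields, in the spirit of a search-engine crawl."""
    return [
        location
        for location, content in indexed_documents.items()
        if transmitted_fields & set(content)   # any field appears in the document
    ]

# Hypothetical index; a real agent would build this by crawling the network.
index = {
    "https://example.com/listing": ["A. Person", "123 Main St"],
    "https://example.com/blog": ["unrelated text"],
}
hits = find_matches({"A. Person", "123 Main St"}, index)
```

Any hit outside the set of devices permitted by the data wrapper would be a candidate violation for the enforcement steps discussed below.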
- According to another embodiment, the device agent for the smart phone may include, as part of the data wrapper, data in the form of a "watermark" or similar mark, recognizable only by the device agent for the smart phone (for example, a continuously changing pattern created by a pattern generator similar to that described in
FIG. 1C and recognizable by a pattern analyzer similar to that described in FIG. 1D). - It will be understood by those having ordinary skill in the art that the above two embodiments serve only as examples and that there are many other ways in which a software or hardware component may be programmed to seek a particular set of data located on a network or to seek evidence of prior use of that specific set of data via the network.
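One way such a continuously changing, agent-recognizable mark might be realized is with a keyed digest over the data and a rolling counter, as in the following Python sketch. This construction is an assumption standing in for the pattern generator and pattern analyzer of FIGS. 1C and 1D; the disclosure leaves the pattern open.

```python
import hashlib

def watermark(data: bytes, secret: bytes, counter: int) -> str:
    """A continuously changing mark: keyed digest over the data and a
    rolling counter. Only a holder of the secret can reproduce it."""
    return hashlib.sha256(secret + counter.to_bytes(4, "big") + data).hexdigest()[:16]

def recognize(data: bytes, secret: bytes, mark: str, counter_window: range) -> bool:
    """The analyzer re-derives candidate marks over a window of recent
    counter values and checks for a match."""
    return any(watermark(data, secret, c) == mark for c in counter_window)
```

Because the mark changes with the counter, copies of the data found on the network can be matched to the transmission that produced them, while third parties without the secret cannot forge or strip the mark undetectably.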
- If the device agent for the smart phone discovers uses of the data 490 that do not comply with the data use restriction terms of the original data wrapper, it may enforce the data use restriction terms in a number of different ways.
- According to one embodiment, the device agent for the smart phone may automatically notify (through email, text message, or otherwise) the person associated with the data that his or her data is being used improperly and inform that person as to the details of the improper use.
- According to another embodiment, the device agent for the smart phone may automatically notify (through email, text message, or otherwise) a privacy or security regulatory authority and inform that authority as to the details of the improper use.
- According to yet another embodiment, the device agent for the smart phone may initiate a process on the data through the data wrapper to curtail the improper use of the data. Processes may include, but are not limited to, deleting the data, masking the data, recasting the data, disassociating the data, and/or encrypting the data.
- According to yet another embodiment, the device agent for the smart phone may track the usage of the data and monetize the improper usage through, for example, automatically demanding a royalty payment from the offending user for continued use of the data. For example, if data 490 was improperly shared with an entity performing targeted advertising, the device agent of the smart phone may negotiate a royalty payment (payable to the person to whom the data belongs) with a device agent of the targeted advertising entity using data 490 for the continued use of the data 490. Again, as described earlier, this process may occur automatically without any human input, and/or be based on pre-configured or dynamically configured negotiation parameters.
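The enforcement options enumerated above amount to a policy dispatch, which can be sketched in Python as follows. The violation attributes and the escalation order are illustrative assumptions, not prescribed by the embodiments.

```python
ENFORCEMENT_ACTIONS = ("notify_person", "notify_regulator", "curtail", "monetize")

def choose_action(violation: dict) -> str:
    """Pick an enforcement response for a discovered non-compliant use.

    The violation attributes and escalation order are assumptions.
    """
    if violation.get("commercial_use"):
        return "monetize"          # e.g., demand a royalty from a targeted advertiser
    if violation.get("data_recoverable"):
        return "curtail"           # delete, mask, recast, disassociate, or encrypt
    if violation.get("severe"):
        return "notify_regulator"  # report the details to a privacy authority
    return "notify_person"         # email or text the person whose data it is
```

As with the negotiation itself, such a dispatch could run automatically without human input, or be governed by pre-configured parameters.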
-
FIG. 5 illustrates a method 500 for data collection negotiation. For example, referring to FIG. 4, consider the case where device 401 is a mobile phone with a camera and the phone is being used to purchase an item using a purchase application instantiated on device 401, for example, within the network stack 410. In this scenario, at the first step of method 500 the application asks the user of the phone to take a picture of a check for the transaction. Further consider where device 402 is a wireless access point used to convey the transaction data between device 401 and the authentication and security gateway instantiated on cloud server 411. The data from the check is processed on device 401 and is broken into account number, name, address, date, time, and location of transaction. According to the negotiation parameters from the data wrapper at 502 and the purpose at 503, the IP address of device 401 may only be used for location verification per the negotiation strategy determined at step 505 and negotiated at step 506. Therefore, if the negotiation is successful at 508, according to the negotiation at 510, as the data is being channeled through device 402, the agent on device 402 prevents the transmission of the IP address to anyone but the purchase authorization unit in communication with the authentication and security gateway 415. In this scenario, the IP address data would only be passed to the gateway, and not stored or used by any other device in communication with the wireless access point 402 (device_2). Further, if the negotiation between the access point 402 and device 401 allows for the IP address to be used to verify that the signal strength and wireless communication channel are stable, then device 402 may use the data for a different but allowed purpose. -
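The field-level purpose restriction applied by the agent on the relay device can be sketched as a filter that forwards only the fields whose negotiated purpose matches the recipient's declared purpose. The field names, purposes, and sample values below are hypothetical.

```python
def relay(record: dict, permitted_purposes: dict, recipient_purpose: str) -> dict:
    """Forward only the fields whose negotiated purpose set includes the
    recipient's declared purpose (per-field filtering at a relay device)."""
    return {
        name: value
        for name, value in record.items()
        if recipient_purpose in permitted_purposes.get(name, set())
    }

# Hypothetical check data and negotiated per-field purposes.
check_data = {"account": "12345678", "name": "A. Person", "ip_address": "192.0.2.1"}
terms = {
    "account": {"payment"},
    "name": {"payment"},
    "ip_address": {"location_verification"},   # IP usable only to verify location
}
```

A recipient declaring the "payment" purpose would receive the account and name but never the IP address, which passes only to a recipient performing location verification.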
FIG. 6 is an exemplary system level diagram of a pervasive computing environment 600 utilizing a server, according to one embodiment. -
FIG. 7 illustrates a method 700 for negotiating the use of sensor data collected by sensors in contact with users. In one embodiment, pervasive computing environment 600 includes a scenario whereby there are two persons (user 1 and user 2) having different data wrappers specific to their needs. User 1 does not allow any data outside a specified purpose to be shared publicly except to those devices that negotiate the use within the negotiation parameters set forth by the user, while user 2 has preset applications that are allowed to be shown publicly. At step 702, user 1 and user 2 approach computer 1 614. - As illustrated in
FIG. 6, computer 1 has an authentication detection device 612 that in this scenario is in communication with user 1 authentication device 616 and user 2 authentication device 618. At 702, sensor data from computer 1 is intercepted by user 1 authentication device 616 and user 2 authentication device 618. At 703, data security and privacy rights are determined for user 1 and user 2. At 704, the purpose of the data collection is interpreted and negotiated according to the negotiation parameters set forth by user 1 and user 2. Dynamically, at 705, the data needed from the interaction with the computer is determined and a strategy is agreed to as to the use of the data collected. In this example, if the negotiation is successful, then at 712, since user 1 does not allow any access except that specified, only the data allowed to be shown publicly is presented. However, since user 2 has a more relaxed policy, user 2's data is shown immediately on computer 1 without any further negotiations, as shown at 630. - In other embodiments, using the data wrapper and negotiation parameters as described herein, without a negotiation the data collected would be stopped until a negotiation settlement is reached. For example, referring to
FIG. 8, consider the case of a camera or other sensing device as part of a vehicle 802 used to help deliver a package to a home. The delivery may be accomplished using a parcel delivery service or other means, such as a drone or helicopter. As the vehicle 802 approaches a home 804 at position A, the vehicle encounters a geo-fence, in this example geo-fence 1. At this juncture, the data wrapper may be employed to initiate the negotiation between the person or entity controlling access to the sensor data of the approaching home (such as the route taken, the location, the house color, and video of the house) and the third party controlling sensor data usage and egress to and from the vehicle. In some scenarios, a default minimum subset of data is used for the delivery, such as the path and obstacle avoidance. When the delivery is complete, all or some of the data collected may be destroyed in accordance with the parameters. For example, if the data being collected will be used by a third-party company to produce street maps showing the person's home publicly, and such data usage was prohibited by the person or an entity, the data wrapper may be used to tag and provide verification that such data will be destroyed. In other scenarios, a second or third geo-fence may be used to set up a negotiation zone such that the data collected may be used for some purposes related to the vehicle's proximity to the zone and, for example, to provide an access corridor to the delivery vehicle to enforce that certain data may not be collected under the negotiation agreement parameters.
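The nested geo-fence zones described above can be sketched as a mapping from vehicle distance to a permitted data set. The distances and data categories in this Python sketch are illustrative assumptions; the specification does not fix particular values.

```python
def data_permissions(distance_m: float) -> set:
    """Nested geo-fences around the home: the closer the vehicle, the
    narrower the permitted data set (distances and categories assumed)."""
    if distance_m > 500:                     # outside all fences: full sensing
        return {"path", "obstacles", "imagery", "location"}
    if distance_m > 100:                     # negotiation zone (second fence)
        return {"path", "obstacles", "location"}
    return {"path", "obstacles"}             # access corridor: default minimum subset
```

In the access corridor the vehicle is reduced to the default minimum subset of data needed for the delivery, enforcing that imagery of the home is not collected under the negotiated parameters.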
-
FIG. 9 illustrates a method 900 for negotiating the use of sensor data collected by sensors on drones or associated with geo-fences. One embodiment includes a scenario whereby there are two autonomous actors (actor 1 and actor 2) having different data wrappers specific to their needs. Actor 1 might manage a territory and not allow any data outside a specified purpose to be shared publicly except with those devices that negotiate the use within the negotiation parameters set forth by a programmer of actor 1, while actor 2 might be an autonomous vehicle programmed to carry out an action and accept a selection of negotiated deviations from that action. At step 902, actor 1 and actor 2 make contact using communicative sensors. At 904, actor 2 determines the identity and location of actor 1 and the rights associated with that identity/location. At 906, actor 1 determines what actor 2 intends to do with data collected while in the territory managed by actor 1. At 908, each actor dynamically determines what the other's programming seeks. For example, actor 1 may request a specified monetary amount from actor 2, and actor 2 may be configured to offer a monetary amount to continue the programmed action. These amounts could be programmed to include an acceptable range of values. At 910 and 912, the data needed from the interaction is determined and a strategy is agreed to as to the use of the data collected. In this example, if the negotiation is successful, at 914 actor 1 receives an acceptable monetary amount and access to the territory is granted to actor 2. - Referring now to
FIG. 12, which is an illustrative representation of the method of the present invention as applied to drone routes through space. Drone A 1202 has original route 1204. The original route 1204 of drone A 1202 crosses properties A, B and C 1206. Properties A, B, and C are surrounded by associated geo-fences 1208. Drone A 1202 will encounter each geo-fence 1208 on original route 1204, and upon reaching each geo-fence 1208, both the servers (not pictured) supporting the geo-fences 1208 and drone A 1202 will enter into a negotiation. For example, drone A 1202 encounters geo-fence A 1208 of property A 1206 and the negotiation may proceed such that a drone having the identity or ownership of drone A 1202 may proceed freely for a given nominal fee. Drone A's 1202 configured parameters accept the nominal fee of property A 1206 to freely continue on original route 1204, and drone A transfers the nominal fee to a specified account and continues. Drone A 1202 will then encounter geo-fence B 1208 of property B 1206, and the negotiation proceeds such that a drone having the identity or ownership of drone A 1202 is not allowed to collect photographic data in property B without first paying an exorbitant fee. Since drone A 1202 requires collecting such data for navigational purposes, proceeding through property B 1206 will subject it to an exorbitant fee which the configured parameters of drone A 1202 dictate is not acceptable. As a result of the negotiation not being successful, drone A 1202 will have to re-route and take a modified route 1210 that avoids crossing into property B 1206. - The modified
route 1210 of drone A 1202 comes into contact with geo-fence C 1208 of property C 1206. Upon reaching geo-fence C 1208, drone A 1202 may find through negotiation that property C 1206 requires passing drones having the identity or ownership of drone A 1202 to approach from the southwest, not take any photographs for commercial purposes, and delete all data collected while in property C 1206. Drone A 1202 is already approaching from the correct direction, the owner of drone A 1202 has configured drone A 1202 to affirm no intent to sell data collected, and the configured parameters of drone A indicate that navigational data does not have to be retained. Accordingly, drone A will proceed through property C 1206 along the modified route 1210, discard prior navigational information, and rejoin the original route 1204. - When drone A 1202 continues on the
original route 1204, there will be a conflict with drone B 1202 traveling along intercept route 1212. Drones A and B 1202 reach a communicative proximity 1214 with one another and a negotiation begins. Drone B 1202 indicates, for example, weather station ownership, and drone A's 1202 configured parameters indicate deferential behavior towards weather drones. Drone A 1202 slows down and allows drone B 1202 to pass. Drone B 1202 further indicates an oncoming storm to the east and suggests that drone A 1202 lower altitude to avoid damage; drone A 1202 complies. - The functions discussed in
FIG. 12 are merely illustrative and serve to demonstrate several applications of the present invention. A plurality of non-illustrated parameters and negotiated directives could also be implemented. - Attention is now directed towards embodiments of the device.
-
FIG. 10 is a block diagram illustrating portable multifunction device 100 with touch-sensitive displays 112 in accordance with some embodiments. The touch-sensitive display 112 is sometimes called a "touch screen" for convenience, and may also be known as or called a touch-sensitive display system. The device 100 may include a memory 102 (which may include one or more computer readable storage mediums), a memory controller 122, one or more processing units (CPUs) 120, a peripherals interface 118, RF circuitry 108, audio circuitry 110, a speaker 111, a microphone 113, an input/output (I/O) subsystem 106, other input or control devices 116, and an external port 124. The device 100 may include one or more optical sensors 164. These components may communicate over one or more communication buses or signal lines 103. - It should be appreciated that the
device 100 is only one example of a portable multifunction device 100, and that the device 100 may have more or fewer components than shown, may combine two or more components, or may have a different configuration or arrangement of the components. The various components shown in FIG. 10 may be implemented in hardware, software or a combination of both hardware and software, including one or more signal processing and/or application specific integrated circuits. -
Memory 102 may include high-speed random access memory and may also include non-volatile memory, such as one or more magnetic disk storage devices, flash memory devices, or other non-volatile solid-state memory devices. Access to memory 102 by other components of the device 100, such as the CPU 120 and the peripherals interface 118, may be controlled by the memory controller 122. - The peripherals interface 118 couples the input and output peripherals of the device to the
CPU 120 and memory 102. The one or more processors 120 run or execute various software programs and/or sets of instructions stored in memory 102 to perform various functions for the device 100 and to process data. - In some embodiments, the
peripherals interface 118, the CPU 120, and the memory controller 122 may be implemented on a single chip, such as a chip 104. In some other embodiments, they may be implemented on separate chips. - The RF (radio frequency)
circuitry 108 receives and sends RF signals, also called electromagnetic signals. The RF circuitry 108 converts electrical signals to/from electromagnetic signals and communicates with communications networks and other communications devices via the electromagnetic signals. The RF circuitry 108 may include well-known circuitry for performing these functions, including but not limited to an antenna system, an RF transceiver, one or more amplifiers, a tuner, one or more oscillators, a digital signal processor, a CODEC chipset, a subscriber identity module (SIM) card, memory, and so forth. The RF circuitry 108 may communicate with networks, such as the Internet, also referred to as the World Wide Web (WWW), an intranet and/or a wireless network, such as a cellular telephone network, a wireless local area network (LAN) and/or a metropolitan area network (MAN), and other devices by wireless communication. The wireless communication may use any of a plurality of communications standards, protocols and technologies, including but not limited to Global System for Mobile Communications (GSM), Enhanced Data GSM Environment (EDGE), high-speed downlink packet access (HSDPA), wideband code division multiple access (W-CDMA), code division multiple access (CDMA), time division multiple access (TDMA), Bluetooth, Wireless Fidelity (Wi-Fi) (e.g., IEEE 802.11a, IEEE 802.11b, IEEE 802.11g and/or IEEE 802.11n), voice over Internet Protocol (VoIP), Wi-MAX, a protocol for email (e.g., Internet message access protocol (IMAP) and/or post office protocol (POP)), instant messaging (e.g., extensible messaging and presence protocol (XMPP), Session Initiation Protocol for Instant Messaging and Presence Leveraging Extensions (SIMPLE), and/or Instant Messaging and Presence Service (IMPS)), and/or Short Message Service (SMS), or any other suitable communication protocol, including communication protocols not yet developed as of the filing date of this document. - The
audio circuitry 110, the speaker 111, and the microphone 113 provide an audio interface between a user and the device 100. The audio circuitry 110 receives audio data from the peripherals interface 118, converts the audio data to an electrical signal, and transmits the electrical signal to the speaker 111. The speaker 111 converts the electrical signal to human-audible sound waves. The audio circuitry 110 also receives electrical signals converted by the microphone 113 from sound waves. The audio circuitry 110 converts the electrical signal to audio data and transmits the audio data to the peripherals interface 118 for processing. Audio data may be retrieved from and/or transmitted to memory 102 and/or the RF circuitry 108 by the peripherals interface 118. In some embodiments, the audio circuitry 110 also includes a headset jack. The headset jack provides an interface between the audio circuitry 110 and removable audio input/output peripherals, such as output-only headphones or a headset with both output (e.g., a headphone for one or both ears) and input (e.g., a microphone). - The I/
O subsystem 106 couples input/output peripherals on the device 100, such as the touch screen 112 and other input/control devices 116, to the peripherals interface 118. The I/O subsystem 106 may include a display controller 156 and one or more input controllers 160 for other input or control devices. The one or more input controllers 160 receive/send electrical signals from/to other input or control devices 116. The other input/control devices 116 may include physical buttons (e.g., push buttons, rocker buttons, etc.), dials, slider switches, joysticks, click wheels, and so forth. In some alternate embodiments, input controller(s) 160 may be coupled to any (or none) of the following: a keyboard, infrared port, USB port, and a pointer device such as a mouse. The one or more buttons may include an up/down button for volume control of the speaker 111 and/or the microphone 113. The one or more buttons may include a push button. A quick press of the push button may disengage a lock of the touch screen 112 or begin a process that uses gestures on the touch screen to unlock the device. A longer press of the push button may turn power to the device 100 on or off. The user may be able to customize a functionality of one or more of the buttons. The touch screen 112 is used to implement virtual or soft buttons and one or more soft keyboards. - The touch-
sensitive touch screen 112 provides an input interface and an output interface between the device and a user. The display controller 156 receives and/or sends electrical signals from/to the touch screen 112. The touch screen 112 displays visual output to the user. The visual output may include graphics, text, icons, video, and any combination thereof (collectively termed "graphics"). In some embodiments, some or all of the visual output may correspond to user-interface objects, further details of which are described below. - A
touch screen 112 has a touch-sensitive surface, sensor or set of sensors that accepts input from the user based on haptic and/or tactile contact. The touch screen 112 and the display controller 156 (along with any associated modules and/or sets of instructions in memory 102) detect contact (and any movement or breaking of the contact) on the touch screen 112 and convert the detected contact into interactions with user-interface objects (e.g., one or more soft keys, icons, web pages or images) that are displayed on the touch screen. In an exemplary embodiment, a point of contact between a touch screen 112 and the user corresponds to a finger of the user. - The
touch screen 112 may use LCD (liquid crystal display) technology, or LPD (light emitting polymer display) technology, although other display technologies may be used in other embodiments. The touch screen 112 and the display controller 156 may detect contact and any movement or breaking thereof using any of a plurality of touch sensing technologies now known or later developed, including but not limited to capacitive, resistive, infrared, and surface acoustic wave technologies, as well as other proximity sensor arrays or other elements for determining one or more points of contact with a touch screen 112. - The
touch screen 112 may have a resolution in excess of 100 dpi. In an exemplary embodiment, the touch screen has a resolution of approximately 160 dpi. The user may make contact with the touch screen 112 using any suitable object or appendage, such as a stylus, a finger, and so forth. In some embodiments, the user interface is designed to work primarily with finger-based contacts and gestures, which are much less precise than stylus-based input due to the larger area of contact of a finger on the touch screen. In some embodiments, the device translates the rough finger-based input into a precise pointer/cursor position or command for performing the actions desired by the user. - In some embodiments, in addition to the touch screen, the
device 100 may include a touchpad (not shown) for activating or deactivating particular functions. In some embodiments, the touchpad is a touch-sensitive area of the device that, unlike the touch screen, does not display visual output. The touchpad may be a touch-sensitive surface that is separate from the touch screen 112 or an extension of the touch-sensitive surface formed by the touch screen. - In some embodiments, the
device 100 may include a physical or virtual click wheel as an input control device 116. A user may navigate among and interact with one or more graphical objects (henceforth referred to as icons) displayed in the touch screen 112 by rotating the click wheel or by moving a point of contact with the click wheel (e.g., where the amount of movement of the point of contact is measured by its angular displacement with respect to a center point of the click wheel). The click wheel may also be used to select one or more of the displayed icons. For example, the user may press down on at least a portion of the click wheel or an associated button. User commands and navigation commands provided by the user via the click wheel may be processed by an input controller 160 as well as by one or more of the modules and/or sets of instructions in memory 102. For a virtual click wheel, the click wheel and click wheel controller may be part of the touch screen 112 and the display controller 156, respectively. For a virtual click wheel, the click wheel may be either an opaque or semitransparent object that appears and disappears on the touch screen display in response to user interaction with the device. In some embodiments, a virtual click wheel is displayed on the touch screen of a portable multifunction device and operated by user contact with the touch screen. - The
device 100 also includes a power system 162 for powering the various components. The power system 162 may include a power management system, one or more power sources (e.g., battery, alternating current (AC)), a recharging system, a power failure detection circuit, a power converter or inverter, a power status indicator (e.g., a light-emitting diode (LED)) and any other components associated with the generation, management and distribution of power in portable devices. - The
device 100 may also include one or more optical sensors 164. FIG. 10 shows an optical sensor 164 coupled to an optical sensor controller 158 in I/O subsystem 106. The optical sensor 164 may include charge-coupled device (CCD) or complementary metal-oxide semiconductor (CMOS) phototransistors. The optical sensor 164 receives light from the environment, projected through one or more lenses, and converts the light to data representing an image. In conjunction with an imaging module 143 (also called a camera module), the optical sensor 164 may capture still images or video. In some embodiments, an optical sensor is located on the back of the device 100, opposite the touch screen display 112 on the front of the device, so that the touch screen display may be used as a viewfinder for either still and/or video image acquisition. In some embodiments, an optical sensor is located on the front of the device so that the user's image may be obtained for videoconferencing while the user views the other video conference participants on the touch screen display. In some embodiments, the position of the optical sensor 164 can be changed by the user (e.g., by rotating the lens and the sensor in the device housing) so that a single optical sensor 164 may be used along with the touch screen display for both video conferencing and still and/or video image acquisition. - The
device 100 may also include one or more proximity sensors 166. FIG. 10 shows a proximity sensor 166 coupled to the peripherals interface 118. Alternately, the proximity sensor 166 may be coupled to an input controller 160 in the I/O subsystem 106. In some embodiments, the proximity sensor turns off and disables the touch screen 112 when the multifunction device is placed near the user's ear (e.g., when the user is making a phone call). In some embodiments, the proximity sensor keeps the screen off when the device is in the user's pocket, purse, or other dark area to prevent unnecessary battery drainage when the device is in a locked state. - The
device 100 may also include one or more accelerometers 168. FIG. 10 shows an accelerometer 168 coupled to the peripherals interface 118. Alternately, the accelerometer 168 may be coupled to an input controller 160 in the I/O subsystem 106. In some embodiments, information is displayed on the touch screen display in a portrait view or a landscape view based on an analysis of data received from the one or more accelerometers. - In some embodiments, the software components stored in
memory 102 may include an operating system 126, a communication module (or set of instructions) 128, a contact/motion module (or set of instructions) 130, a graphics module (or set of instructions) 132, a text input module (or set of instructions) 134, a Global Positioning System (GPS) module (or set of instructions) 135, and applications (or set of instructions) 136. - The operating system 126 (e.g., Darwin, RTXC, LINUX, UNIX, OS X, WINDOWS, or an embedded operating system such as VxWorks) includes various software components and/or drivers for controlling and managing general system tasks (e.g., memory management, storage device control, power management, etc.) and facilitates communication between various hardware and software components.
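The portrait/landscape selection driven by accelerometer data, described above for accelerometer 168, can be sketched as follows. This is an illustrative assumption, not part of the disclosure; the function name, axis convention (x along the short edge, y along the long edge), and gravity-dominance heuristic are all hypothetical.

```python
# Illustrative sketch (assumption, not from the disclosure): choosing a
# display orientation from accelerometer readings, as described for the
# one or more accelerometers 168.

def choose_orientation(ax: float, ay: float) -> str:
    """Pick 'portrait' or 'landscape' from gravity components along the
    device's x (short edge) and y (long edge) axes, in m/s^2."""
    # When the device is held still, gravity dominates the reading; the
    # axis with the larger magnitude indicates which edge points down.
    return "portrait" if abs(ay) >= abs(ax) else "landscape"

# Device upright: gravity mostly along y.
print(choose_orientation(0.1, -9.7))   # portrait
# Device on its side: gravity mostly along x.
print(choose_orientation(9.6, 0.3))    # landscape
```

A real implementation would also low-pass filter the samples and apply hysteresis so the view does not flip on small tilts; those refinements are omitted here.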
- The
communication module 128 facilitates communication with other devices over one or more external ports 124 and also includes various software components for handling data received by the RF circuitry 108 and/or the external port 124. The external port 124 (e.g., Universal Serial Bus (USB), FIREWIRE, etc.) is adapted for coupling directly to other devices or indirectly over a network (e.g., the Internet, wireless LAN, etc.). In some embodiments, the external port is a multi-pin (e.g., 30-pin) connector that is the same as, or similar to and/or compatible with, the 30-pin connector used on iPod (trademark of Apple Computer, Inc.) devices. - The contact/
motion module 130 may detect contact with the touch screen 112 (in conjunction with the display controller 156) and other touch sensitive devices (e.g., a touchpad or physical click wheel). The contact/motion module 130 includes various software components for performing various operations related to detection of contact, such as determining if contact has occurred, determining if there is movement of the contact and tracking the movement across the touch screen 112, and determining if the contact has been broken (i.e., if the contact has ceased). Determining movement of the point of contact may include determining speed (magnitude), velocity (magnitude and direction), and/or an acceleration (a change in magnitude and/or direction) of the point of contact. These operations may be applied to single contacts (e.g., one finger contact) or to multiple simultaneous contacts (e.g., “multitouch”/multiple finger contacts). In some embodiments, the contact/motion module 130 and the display controller 156 also detect contact on a touchpad. In some embodiments, the contact/motion module 130 and the controller 160 detect contact on a click wheel. - The
graphics module 132 includes various known software components for rendering and displaying graphics on the touch screen 112, including components for changing the intensity of graphics that are displayed. As used herein, the term “graphics” includes any object that can be displayed to a user, including, without limitation, text, web pages, icons (such as user-interface objects including soft keys), digital images, videos, animations and the like. - The
text input module 134, which may be a component of graphics module 132, provides soft keyboards for entering text in various applications (e.g., contacts 137, e-mail 140, IM 141, blogging 142, browser 147, and any other application that needs text input). The GPS module 135 determines the location of the device and provides this information for use in various applications (e.g., to telephone 138 for use in location-based dialing, to camera 143 and/or blogger 142 as picture/video metadata, and to applications that provide location-based services such as weather widgets, local yellow page widgets, and map/navigation widgets). - The
applications 136 may include the following modules (or sets of instructions), or a subset or superset thereof: -
- a contacts module 137 (sometimes called an address book or contact list);
- a
telephone module 138; - a
video conferencing module 139; - an
e-mail client module 140; - an instant messaging (IM)
module 141; - a
blogging module 142; - a
camera module 143 for still and/or video images; - an
image management module 144; - a
video player module 145; - a
music player module 146; - a
browser module 147; - a
calendar module 148;
-
widget modules 149, which may include weather widget 149-1, stocks widget 149-2, calculator widget 149-3, alarm clock widget 149-4, dictionary widget 149-5, and other widgets obtained by the user, as well as user-created widgets 149-6; -
-
widget creator module 150 for making user-created widgets 149-6; -
search module 151; - video and music player module 152, which merges
video player module 145 and music player module 146; - notes module 153; and/or
- map module 154; and/or
- online video module 155.
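As an illustration only, the application modules listed above can be pictured as a registry mapping each module name to its reference numeral, from which "a subset or superset thereof" may be selected. The registry data structure and function name are assumptions for this sketch, not part of the disclosure; the names and numerals are taken from the list above.

```python
# Illustrative sketch (assumption, not from the disclosure): applications 136
# modeled as a name -> reference-numeral registry, mirroring the list above.

APPLICATION_MODULES = {
    "contacts": 137, "telephone": 138, "video conferencing": 139,
    "e-mail client": 140, "instant messaging": 141, "blogging": 142,
    "camera": 143, "image management": 144, "video player": 145,
    "music player": 146, "browser": 147, "calendar": 148,
    "widget modules": 149, "widget creator": 150, "search": 151,
    "video and music player": 152, "notes": 153, "map": 154,
    "online video": 155,
}

def select_subset(names):
    """Return the registry restricted to a chosen subset of module names."""
    return {name: APPLICATION_MODULES[name] for name in names}

# An embodiment that ships only a camera and a browser:
print(select_subset(["camera", "browser"]))
```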
-
- Examples of
other applications 136 that may be stored in memory 102 include other word processing applications, JAVA-enabled applications, encryption, digital rights management, voice recognition, and voice replication. - In conjunction with
touch screen 112, display controller 156, contact module 130, graphics module 132, and text input module 134, the contacts module 137 may be used to manage an address book or contact list, including: adding name(s) to the address book; deleting name(s) from the address book; associating telephone number(s), e-mail address(es), physical address(es) or other information with a name; associating an image with a name; categorizing and sorting names; providing telephone numbers or e-mail addresses to initiate and/or facilitate communications by telephone 138, video conference 139, e-mail 140, or IM 141; and so forth. Embodiments of user interfaces and associated processes using contacts module 137 are described further below. - In conjunction with
RF circuitry 108, audio circuitry 110, speaker 111, microphone 113, touch screen 112, display controller 156, contact module 130, graphics module 132, and text input module 134, the telephone module 138 may be used to enter a sequence of characters corresponding to a telephone number, access one or more telephone numbers in the address book 137, modify a telephone number that has been entered, dial a respective telephone number, conduct a conversation and disconnect or hang up when the conversation is completed. As noted above, the wireless communication may use any of a plurality of communications standards, protocols and technologies. Embodiments of user interfaces and associated processes using telephone module 138 are described further below. - In conjunction with
RF circuitry 108, audio circuitry 110, speaker 111, microphone 113, touch screen 112, display controller 156, optical sensor 164, optical sensor controller 158, contact module 130, graphics module 132, text input module 134, contact list 137, and telephone module 138, the videoconferencing module 139 may be used to initiate, conduct, and terminate a video conference between a user and one or more other participants. Embodiments of user interfaces and associated processes using videoconferencing module 139 are described further below. - In conjunction with
RF circuitry 108, touch screen 112, display controller 156, contact module 130, graphics module 132, and text input module 134, the e-mail client module 140 may be used to create, send, receive, and manage e-mail. In conjunction with image management module 144, the e-mail module 140 makes it very easy to create and send e-mails with still or video images taken with camera module 143. Embodiments of user interfaces and associated processes using e-mail module 140 are described further below. - In conjunction with
RF circuitry 108, touch screen 112, display controller 156, contact module 130, graphics module 132, and text input module 134, the instant messaging module 141 may be used to enter a sequence of characters corresponding to an instant message, to modify previously entered characters, to transmit a respective instant message (for example, using a Short Message Service (SMS) or Multimedia Message Service (MMS) protocol for telephony-based instant messages or using XMPP, SIMPLE, or IMPS for Internet-based instant messages), to receive instant messages and to view received instant messages. In some embodiments, transmitted and/or received instant messages may include graphics, photos, audio files, video files and/or other attachments as are supported in an MMS and/or an Enhanced Messaging Service (EMS). As used herein, “instant messaging” refers to both telephony-based messages (e.g., messages sent using SMS or MMS) and Internet-based messages (e.g., messages sent using XMPP, SIMPLE, or IMPS). Embodiments of user interfaces and associated processes using instant messaging module 141 are described further below. - In conjunction with
RF circuitry 108, touch screen 112, display controller 156, contact module 130, graphics module 132, text input module 134, image management module 144, and browsing module 147, the blogging module 142 may be used to send text, still images, video, and/or other graphics to a blog (e.g., the user's blog). Embodiments of user interfaces and associated processes using blogging module 142 are described further below. - In conjunction with
touch screen 112, display controller 156, optical sensor(s) 164, optical sensor controller 158, contact module 130, graphics module 132, and image management module 144, the camera module 143 may be used to capture still images or video (including a video stream) and store them into memory 102, modify characteristics of a still image or video, or delete a still image or video from memory 102. Embodiments of user interfaces and associated processes using camera module 143 are described further below. - In conjunction with
touch screen 112, display controller 156, contact module 130, graphics module 132, text input module 134, and camera module 143, the image management module 144 may be used to arrange, modify or otherwise manipulate, label, delete, present (e.g., in a digital slide show or album), and store still and/or video images. Embodiments of user interfaces and associated processes using image management module 144 are described further below. - In conjunction with
touch screen 112, display controller 156, contact module 130, graphics module 132, audio circuitry 110, and speaker 111, the video player module 145 may be used to display, present or otherwise play back videos (e.g., on the touch screen or on an external, connected display via external port 124). Embodiments of user interfaces and associated processes using video player module 145 are described further below. - In conjunction with
touch screen 112, display system controller 156, contact module 130, graphics module 132, audio circuitry 110, speaker 111, RF circuitry 108, and browser module 147, the music player module 146 allows the user to download and play back recorded music and other sound files stored in one or more file formats, such as MP3 or AAC files. In some embodiments, the device 100 may include the functionality of an MP3 player. Embodiments of user interfaces and associated processes using music player module 146 are described further below. - In conjunction with
RF circuitry 108, touch screen 112, display system controller 156, contact module 130, graphics module 132, and text input module 134, the browser module 147 may be used to browse the Internet, including searching, linking to, receiving, and displaying web pages or portions thereof, as well as attachments and other files linked to web pages. Embodiments of user interfaces and associated processes using browser module 147 are described further below. - In conjunction with
RF circuitry 108, touch screen 112, display system controller 156, contact module 130, graphics module 132, text input module 134, e-mail module 140, and browser module 147, the calendar module 148 may be used to create, display, modify, and store calendars and data associated with calendars (e.g., calendar entries, to do lists, etc.). Embodiments of user interfaces and associated processes using calendar module 148 are described further below. - In conjunction with
RF circuitry 108, touch screen 112, display system controller 156, contact module 130, graphics module 132, text input module 134, and browser module 147, the widget modules 149 are mini-applications that may be downloaded and used by a user (e.g., weather widget 149-1, stocks widget 149-2, calculator widget 149-3, alarm clock widget 149-4, and dictionary widget 149-5) or created by the user (e.g., user-created widget 149-6). In some embodiments, a widget includes an HTML (Hypertext Markup Language) file, a CSS (Cascading Style Sheets) file, and a JavaScript file. In some embodiments, a widget includes an XML (Extensible Markup Language) file and a JavaScript file (e.g., Yahoo! Widgets). Embodiments of user interfaces and associated processes using widget modules 149 are described further below. - In conjunction with
RF circuitry 108, touch screen 112, display system controller 156, contact module 130, graphics module 132, text input module 134, and browser module 147, the widget creator module 150 may be used by a user to create widgets (e.g., turning a user-specified portion of a web page into a widget). Embodiments of user interfaces and associated processes using widget creator module 150 are described further below. - In conjunction with
touch screen 112, display system controller 156, contact module 130, graphics module 132, and text input module 134, the search module 151 may be used to search for text, music, sound, images, videos, and/or other files in memory 102 that match one or more search criteria (e.g., one or more user-specified search terms). Embodiments of user interfaces and associated processes using search module 151 are described further below. - In conjunction with
touch screen 112, display controller 156, contact module 130, graphics module 132, and text input module 134, the notes module 153 may be used to create and manage notes, to do lists, and the like. Embodiments of user interfaces and associated processes using notes module 153 are described further below. - In conjunction with
RF circuitry 108, touch screen 112, display system controller 156, contact module 130, graphics module 132, text input module 134, GPS module 135, and browser module 147, the map module 154 may be used to receive, display, modify, and store maps and data associated with maps (e.g., driving directions; data on stores and other points of interest at or near a particular location; and other location-based data). Embodiments of user interfaces and associated processes using map module 154 are described further below. - In conjunction with
touch screen 112, display system controller 156, contact module 130, graphics module 132, audio circuitry 110, speaker 111, RF circuitry 108, text input module 134, e-mail client module 140, and browser module 147, the online video module 155 allows the user to access, browse, receive (e.g., by streaming and/or download), play back (e.g., on the touch screen or on an external, connected display via external port 124), send an e-mail with a link to a particular online video, and otherwise manage online videos in one or more file formats, such as H.264. In some embodiments, instant messaging module 141, rather than e-mail client module 140, is used to send a link to a particular online video. - Each of the above identified modules and applications corresponds to a set of instructions for performing one or more functions described above. These modules (i.e., sets of instructions) need not be implemented as separate software programs, procedures or modules, and thus various subsets of these modules may be combined or otherwise re-arranged in various embodiments. For example,
video player module 145 may be combined with music player module 146 into a single module (e.g., video and music player module 152, FIG. 10). In some embodiments, memory 102 may store a subset of the modules and data structures identified above. Furthermore, memory 102 may store additional modules and data structures not described above. - In some embodiments, the
device 100 is a device where operation of a predefined set of functions on the device is performed exclusively through a touch screen 112 and/or a touchpad. By using a touch screen and/or a touchpad as the primary input/control device for operation of the device 100, the number of physical input/control devices (such as push buttons, dials, and the like) on the device 100 may be reduced. - The predefined set of functions that may be performed exclusively through a touch screen and/or a touchpad includes navigation between user interfaces. In some embodiments, the touchpad, when touched by the user, navigates the
device 100 to a main, home, or root menu from any user interface that may be displayed on the device 100. In such embodiments, the touchpad may be referred to as a “menu button.” In some other embodiments, the menu button may be a physical push button or other physical input/control device instead of a touchpad. -
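The contact/motion module's determination of speed (magnitude) and velocity (magnitude and direction) of a point of contact, described earlier for contact/motion module 130, can be sketched as follows. The function name, pixel units, and two-sample interface are assumptions made for this illustration and are not part of the disclosure.

```python
import math

# Illustrative sketch (assumption, not from the disclosure): deriving speed
# and velocity of a tracked contact point from two successive touch samples,
# as described for the contact/motion module 130.

def contact_velocity(p0, p1, dt):
    """Return (speed, direction) between samples p0 and p1 taken dt seconds
    apart, where each sample is an (x, y) position in pixels.

    speed is the magnitude in px/s; direction is in radians."""
    dx, dy = p1[0] - p0[0], p1[1] - p0[1]
    speed = math.hypot(dx, dy) / dt    # magnitude of the displacement rate
    direction = math.atan2(dy, dx)     # direction of travel
    return speed, direction

# A finger moving 30 px right and 40 px down over 0.1 s travels 50 px,
# i.e. 500 px/s.
speed, direction = contact_velocity((0, 0), (30, 40), 0.1)
print(round(speed))  # 500
```

Acceleration, also mentioned in the description, would follow the same pattern applied to successive velocity samples rather than position samples.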
FIG. 11 illustrates an exemplary computer architecture for use with the present system, according to one embodiment. One embodiment of architecture 1100 comprises a system bus 120 for communicating information, and a processor 110 coupled to bus 120 for processing information. Architecture 1100 further comprises a random access memory (RAM) or other dynamic storage device 125 (referred to herein as main memory), coupled to bus 120 for storing information and instructions to be executed by processor 110. Main memory 125 also may be used for storing temporary variables or other intermediate information during execution of instructions by processor 110. Architecture 1100 also may include a read only memory (ROM) 127 and/or other static storage device 126 coupled to bus 120 for storing static information and instructions used by processor 110. The computer further includes user input apparatus 128 and computer output apparatus 129. - References in this specification to “an embodiment,” “one embodiment,” or the like mean that the particular feature, structure, or characteristic being described is included in at least one embodiment of the present invention. Occurrences of such phrases in this specification do not necessarily all refer to the same embodiment.
- Some portions of the detailed description may be presented in terms of algorithms and symbolic representations of operations on data bits within a computer memory. These algorithmic descriptions and representations are the means used by those skilled in the data processing arts to most effectively convey the substance of their work to others skilled in the art. An algorithm is here, and generally, conceived to be a self-consistent sequence of operations leading to a desired result. The operations are those requiring physical manipulations of physical quantities. Usually, though not necessarily, these quantities take the form of electrical or magnetic signals capable of being stored, transferred, combined, compared, and otherwise manipulated. It has proven convenient at times, principally for reasons of common usage, to refer to these signals as bits, values, elements, symbols, characters, terms, numbers, or the like.
- It should be borne in mind, however, that all of these and similar terms are to be associated with the appropriate physical quantities and are merely convenient labels applied to these quantities. Unless specifically stated otherwise as apparent from the following discussion, it is appreciated that throughout the description, discussions utilizing terms such as “processing” or “computing” or “calculating” or “determining” or “displaying” or “generating” or the like refer to the action and processes of a computer system, or similar electronic computing device, that manipulates and transforms data represented as physical (electronic) quantities within registers and memories of the computer system into other data similarly represented as physical quantities within the computer system memories or registers or other such information storage, transmission or display devices.
- The algorithms and displays presented herein are not inherently related to any particular computer or other apparatus. Various general purpose systems may be used with programs in accordance with the teachings herein, or it may prove convenient to construct more specialized apparatuses to perform the methods of some embodiments. The required structure for a variety of these systems will appear from the description below. In addition, the techniques are not described with reference to any particular programming language, and various embodiments may thus be implemented using a variety of programming languages.
- In alternative embodiments, the machine operates as a standalone device or may be connected (e.g., networked) to other machines. In a networked deployment, the machine may operate in the capacity of a server or a client machine in a client-server network environment, or as a peer machine in a peer-to-peer (or distributed) network environment.
- The machine may be a server computer, a client computer, a personal computer (PC), a tablet PC, a laptop computer, a set-top box (STB), a personal digital assistant (PDA), a cellular telephone, an iPhone, a Blackberry, a processor, a telephone, a web appliance, a network router, switch or bridge, or any machine capable of executing a set of instructions (sequential or otherwise) that specifies actions to be taken by that machine.
- While the machine-readable medium or machine-readable storage medium is shown in an exemplary embodiment to be a single medium, the term “machine-readable medium” and “machine-readable storage medium” should be taken to include a single medium or multiple media (e.g., a centralized or distributed database, and/or associated caches and servers), that store the one or more sets of instructions. The term “machine-readable medium” and “machine-readable storage medium” shall also be taken to include any medium that is capable of storing, encoding or carrying a set of instructions for execution by the machine and that causes the machine to perform any one or more of the methodologies or modules of the presently disclosed technique and innovation.
- In general, the routines executed to implement the embodiments of the disclosure may be implemented as part of an operating system or a specific application, component program, object, module or sequence of instructions referred to as “computer programs.” The computer programs typically comprise one or more instructions set at various times in various memory and storage devices in a computer that, when read and executed by one or more processing units or processors in a computer, cause the computer to perform operations to execute elements involving the various aspects of the disclosure.
- Moreover, while embodiments have been described in the context of fully functioning computers and computer systems, those skilled in the art will appreciate that the various embodiments are capable of being distributed as a program product in a variety of forms and that the disclosure applies equally regardless of the particular type of machine or computer-readable media used to actually effect the distribution.
- Further examples of machine-readable storage media, machine-readable media, or computer-readable (storage) media include, but are not limited to, recordable type media such as volatile and non-volatile memory devices, floppy and other removable disks, hard disk drives, optical disks (e.g., Compact Disk Read-Only Memory (CD ROMS), Digital Versatile Disks, (DVDs), etc.), among others, and transmission type media such as digital and analog communication links.
- In some circumstances, operation of a memory device, such as a change in state from a binary one to a binary zero or vice versa, for example, may comprise a transformation, such as a physical transformation. With particular types of memory devices, such a physical transformation may comprise a physical transformation of an article to a different state or thing. For example, but without limitation, for some types of memory devices, a change in state may involve an accumulation and storage of charge or a release of stored charge. Likewise, in other memory devices, a change of state may comprise a physical change or transformation in magnetic orientation or a physical change or transformation in molecular structure, such as from crystalline to amorphous or vice versa. The foregoing is not intended to be an exhaustive list of all examples in which a change in state from a binary one to a binary zero or vice versa in a memory device may comprise a transformation, such as a physical transformation. Rather, the foregoing is intended as illustrative examples.
- A storage medium typically may be non-transitory or comprise a non-transitory device. In this context, a non-transitory storage medium may include a device that is tangible, meaning that the device has a concrete physical form, although the device may change its physical state. Thus, for example, non-transitory refers to a device remaining tangible despite this change in state.
- The above description and drawings are illustrative and are not to be construed as limiting the invention to the precise forms disclosed. Persons skilled in the relevant art can appreciate that many modifications and variations are possible in light of the above disclosure. Numerous specific details are described to provide a thorough understanding of the disclosure. However, in certain instances, well-known or conventional details are not described in order to avoid obscuring the description.
- Reference in this specification to “one embodiment” or “an embodiment” means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the disclosure. The appearances of the phrase “in one embodiment” in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. Moreover, various features are described which may be exhibited by some embodiments and not by others. Similarly, various requirements are described which may be requirements for some embodiments but not other embodiments.
- As used herein, the terms “connected,” “coupled,” or any variant thereof, when applied to modules of a system, mean any connection or coupling, either direct or indirect, between two or more elements; the coupling or connection between the elements can be physical, logical, or any combination thereof. Additionally, the words “herein,” “above,” “below,” and words of similar import, when used in this application, shall refer to this application as a whole and not to any particular portions of this application. Where the context permits, words in the above Detailed Description using the singular or plural number may also include the plural or singular number, respectively. The word “or,” in reference to a list of two or more items, covers all of the following interpretations of the word: any of the items in the list, all of the items in the list, and any combination of the items in the list.
- Those of skill in the art will appreciate that the invention may be embodied in other forms and manners not shown below. It is understood that the use of relational terms, if any, such as first, second, top and bottom, and the like are used solely for distinguishing one entity or action from another, without necessarily requiring or implying any such actual relationship or order between such entities or actions.
- While processes or blocks are presented in a given order, alternative embodiments may perform routines having steps, or employ systems having blocks, in a different order, and some processes or blocks may be deleted, moved, added, subdivided, substituted, combined, and/or modified to provide alternatives or subcombinations. Each of these processes or blocks may be implemented in a variety of different ways. Also, while processes or blocks are at times shown as being performed in series, these processes or blocks may instead be performed in parallel or may be performed at different times. Further, any specific numbers noted herein are only examples. Alternative implementations may employ differing values or ranges.
- The teachings of the disclosure provided herein can be applied to other systems, not necessarily the system described above. The elements and acts of the various embodiments described above can be combined to provide further embodiments.
- Any patents, applications, and other references noted above, including any that may be listed in accompanying filing papers, are incorporated herein by reference. Aspects of the disclosure can be modified, if necessary, to employ the systems, functions, and concepts of the various references described above to provide yet further embodiments of the disclosure.
- These and other changes can be made to the disclosure in light of the above Detailed Description. While the above description describes certain embodiments of the disclosure, and describes the best mode contemplated, no matter how detailed the above appears in text, the teachings can be practiced in many ways. Details of the system may vary considerably in their implementation details, while still being encompassed by the subject matter disclosed herein. As noted above, particular terminology used when describing certain features or aspects of the disclosure should not be taken to imply that the terminology is being redefined herein to be restricted to any specific characteristics, features, or aspects of the disclosure with which that terminology is associated. In general, the terms used in the following claims should not be construed to limit the disclosure to the specific embodiments disclosed in the specification, unless the above Detailed Description section explicitly defines such terms. Accordingly, the actual scope of the disclosure encompasses not only the disclosed embodiments, but also all equivalent ways of practicing or implementing the disclosure under the claims.
- While certain aspects of the disclosure are presented below in certain claim forms, the inventors contemplate the various aspects of the disclosure in any number of claim forms. Any claims intended to be treated under 35 U.S.C. § 112, ¶ 6 will begin with the words “means for.” Accordingly, the applicant reserves the right to add additional claims after filing the application to pursue such additional claim forms for other aspects of the disclosure.
- The terms used in this specification generally have their ordinary meanings in the art, within the context of the disclosure, and in the specific context where each term is used. Certain terms that are used to describe the disclosure are discussed above, or elsewhere in the specification, to provide additional guidance to the practitioner regarding the description of the disclosure. For convenience, certain terms may be highlighted, for example, using capitalization, italics and/or quotation marks. The use of highlighting has no influence on the scope and meaning of a term; the scope and meaning of a term is the same, in the same context, whether or not it is highlighted. It will be appreciated that some elements can be described in more than one way.
- Consequently, alternative language and synonyms may be used for any one or more of the terms discussed herein, nor is any special significance to be placed upon whether or not a term is elaborated on or discussed herein. Synonyms for certain terms are provided. A recital of one or more synonyms does not exclude the use of other synonyms. The use of examples anywhere in this specification, including examples of any terms discussed herein, is illustrative only, and is not intended to further limit the scope and meaning of the disclosure or of any exemplified term. Likewise, the disclosure is not limited to various embodiments given in this specification.
- Without intent to further limit the scope of the disclosure, examples of instruments, apparatuses, methods and their related results according to the embodiments of the present disclosure are given below. Note that titles or subtitles may be used in the examples for convenience of a reader, which in no way should limit the scope of the disclosure. Unless otherwise defined, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this disclosure pertains. In the case of conflict, the present document, including definitions, will control.
- Some portions of this description describe the embodiments of the invention in terms of algorithms and symbolic representations of operations on information. These algorithmic descriptions and representations are commonly used by those skilled in the data processing arts to convey the substance of their work effectively to others skilled in the art. These operations, while described functionally, computationally, or logically, are understood to be implemented by computer programs or equivalent electrical circuits, microcode, or the like. Furthermore, it has also proven convenient at times to refer to these arrangements of operations as modules, without loss of generality. The described operations and their associated modules may be embodied in software, firmware, hardware, or any combinations thereof.
- Any of the steps, operations, or processes described herein may be performed or implemented with one or more hardware or software modules, alone or in combination with other devices. In one embodiment, a software module is implemented with a computer program product comprising a computer-readable medium containing computer program code, which can be executed by a computer processor for performing any or all of the steps, operations, or processes described.
- Embodiments of the invention may also relate to an apparatus for performing the operations herein. This apparatus may be specially constructed for the required purposes, and/or it may comprise a general-purpose computing device selectively activated or reconfigured by a computer program stored in the computer. Such a computer program may be stored in a non-transitory, tangible, computer-readable storage medium, or any type of media suitable for storing electronic instructions, which may be coupled to a computer system bus. Furthermore, any computing systems referred to in the specification may include a single processor or may be architectures employing multiple processor designs for increased computing capability.
- Embodiments of the invention may also relate to a product that is produced by a computing process described herein. Such a product may comprise information resulting from a computing process, where the information is stored on a non-transitory, tangible, computer-readable storage medium and may include any embodiment of a computer program product or other data combination described herein.
- The language used in the specification has been principally selected for readability and instructional purposes, and it may not have been selected to delineate or circumscribe the inventive subject matter. It is therefore intended that the scope of the invention be limited not by this detailed description, but rather by any claims that issue on an application based hereon. Accordingly, the disclosure of the embodiments of the invention is intended to be illustrative, but not limiting, of the scope of the invention, which is set forth in the following claims.
Claims (23)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US17/098,118 US20210081553A1 (en) | 2014-02-21 | 2020-11-13 | Management of drone operations and security in a pervasive computing environment |
Applications Claiming Priority (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201461942852P | 2014-02-21 | 2014-02-21 | |
US14/629,313 US9292705B2 (en) | 2014-02-21 | 2015-02-23 | Management of drone operations and security in a pervasive computing environment |
US15/019,892 US9654476B2 (en) | 2014-02-21 | 2016-02-09 | Management of drone operations and security in a pervasive computing environment |
US15/482,558 US10839089B2 (en) | 2014-02-21 | 2017-04-07 | Management of drone operations and security in a pervasive computing environment |
US17/098,118 US20210081553A1 (en) | 2014-02-21 | 2020-11-13 | Management of drone operations and security in a pervasive computing environment |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/482,558 Continuation US10839089B2 (en) | 2014-02-21 | 2017-04-07 | Management of drone operations and security in a pervasive computing environment |
Publications (1)
Publication Number | Publication Date |
---|---|
US20210081553A1 true US20210081553A1 (en) | 2021-03-18 |
Family
ID=53882512
Family Applications (6)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/629,313 Active US9292705B2 (en) | 2014-02-21 | 2015-02-23 | Management of drone operations and security in a pervasive computing environment |
US14/629,436 Active 2035-11-18 US10121015B2 (en) | 2014-02-21 | 2015-02-23 | Management of data privacy and security in a pervasive computing environment |
US15/019,892 Active US9654476B2 (en) | 2014-02-21 | 2016-02-09 | Management of drone operations and security in a pervasive computing environment |
US15/482,558 Active 2036-07-06 US10839089B2 (en) | 2014-02-21 | 2017-04-07 | Management of drone operations and security in a pervasive computing environment |
US16/171,258 Active US10963579B2 (en) | 2014-02-21 | 2018-10-25 | Management of data privacy and security in a pervasive computing environment |
US17/098,118 Pending US20210081553A1 (en) | 2014-02-21 | 2020-11-13 | Management of drone operations and security in a pervasive computing environment |
Family Applications Before (5)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/629,313 Active US9292705B2 (en) | 2014-02-21 | 2015-02-23 | Management of drone operations and security in a pervasive computing environment |
US14/629,436 Active 2035-11-18 US10121015B2 (en) | 2014-02-21 | 2015-02-23 | Management of data privacy and security in a pervasive computing environment |
US15/019,892 Active US9654476B2 (en) | 2014-02-21 | 2016-02-09 | Management of drone operations and security in a pervasive computing environment |
US15/482,558 Active 2036-07-06 US10839089B2 (en) | 2014-02-21 | 2017-04-07 | Management of drone operations and security in a pervasive computing environment |
US16/171,258 Active US10963579B2 (en) | 2014-02-21 | 2018-10-25 | Management of data privacy and security in a pervasive computing environment |
Country Status (1)
Country | Link |
---|---|
US (6) | US9292705B2 (en) |
Cited By (42)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20220173934A1 (en) * | 2008-08-11 | 2022-06-02 | Icontrol Networks, Inc. | Mobile premises automation platform |
US11553399B2 (en) | 2009-04-30 | 2023-01-10 | Icontrol Networks, Inc. | Custom content for premises management |
US11582065B2 (en) | 2007-06-12 | 2023-02-14 | Icontrol Networks, Inc. | Systems and methods for device communication |
US11588787B2 (en) | 2004-03-16 | 2023-02-21 | Icontrol Networks, Inc. | Premises management configuration and control |
US11595364B2 (en) | 2005-03-16 | 2023-02-28 | Icontrol Networks, Inc. | System for data routing in networks |
US11601810B2 (en) | 2007-06-12 | 2023-03-07 | Icontrol Networks, Inc. | Communication protocols in integrated systems |
US11611568B2 (en) | 2007-06-12 | 2023-03-21 | Icontrol Networks, Inc. | Communication protocols over internet protocol (IP) networks |
US11616659B2 (en) | 2008-08-11 | 2023-03-28 | Icontrol Networks, Inc. | Integrated cloud system for premises automation |
US11615697B2 (en) | 2005-03-16 | 2023-03-28 | Icontrol Networks, Inc. | Premise management systems and methods |
US11625161B2 (en) | 2007-06-12 | 2023-04-11 | Icontrol Networks, Inc. | Control system user interface |
US11625008B2 (en) | 2004-03-16 | 2023-04-11 | Icontrol Networks, Inc. | Premises management networking |
US11626006B2 (en) | 2004-03-16 | 2023-04-11 | Icontrol Networks, Inc. | Management of a security system at a premises |
US11632308B2 (en) | 2007-06-12 | 2023-04-18 | Icontrol Networks, Inc. | Communication protocols in integrated systems |
US11641391B2 (en) | 2008-08-11 | 2023-05-02 | Icontrol Networks Inc. | Integrated cloud system with lightweight gateway for premises automation |
US11646907B2 (en) | 2007-06-12 | 2023-05-09 | Icontrol Networks, Inc. | Communication protocols in integrated systems |
US11656667B2 (en) | 2004-03-16 | 2023-05-23 | Icontrol Networks, Inc. | Integrated security system with parallel processing architecture |
US11663902B2 (en) | 2007-04-23 | 2023-05-30 | Icontrol Networks, Inc. | Method and system for providing alternate network access |
US11700142B2 (en) | 2005-03-16 | 2023-07-11 | Icontrol Networks, Inc. | Security network integrating security system and network devices |
US11706045B2 (en) | 2005-03-16 | 2023-07-18 | Icontrol Networks, Inc. | Modular electronic display platform |
US11706279B2 (en) | 2007-01-24 | 2023-07-18 | Icontrol Networks, Inc. | Methods and systems for data communication |
US11722896B2 (en) | 2007-06-12 | 2023-08-08 | Icontrol Networks, Inc. | Communication protocols in integrated systems |
US11729255B2 (en) | 2008-08-11 | 2023-08-15 | Icontrol Networks, Inc. | Integrated cloud system with lightweight gateway for premises automation |
US11757834B2 (en) | 2004-03-16 | 2023-09-12 | Icontrol Networks, Inc. | Communication protocols in integrated systems |
US11758026B2 (en) | 2008-08-11 | 2023-09-12 | Icontrol Networks, Inc. | Virtual device systems and methods |
US11792330B2 (en) | 2005-03-16 | 2023-10-17 | Icontrol Networks, Inc. | Communication and automation in a premises management system |
US11809174B2 (en) | 2007-02-28 | 2023-11-07 | Icontrol Networks, Inc. | Method and system for managing communication connectivity |
US11811845B2 (en) | 2004-03-16 | 2023-11-07 | Icontrol Networks, Inc. | Communication protocols over internet protocol (IP) networks |
US11816323B2 (en) | 2008-06-25 | 2023-11-14 | Icontrol Networks, Inc. | Automation system user interface |
US11824675B2 (en) | 2005-03-16 | 2023-11-21 | Icontrol Networks, Inc. | Networked touchscreen with integrated interfaces |
US11831462B2 (en) | 2007-08-24 | 2023-11-28 | Icontrol Networks, Inc. | Controlling data routing in premises management systems |
US11894986B2 (en) | 2007-06-12 | 2024-02-06 | Icontrol Networks, Inc. | Communication protocols in integrated systems |
US11900790B2 (en) | 2010-09-28 | 2024-02-13 | Icontrol Networks, Inc. | Method, system and apparatus for automated reporting of account and sensor zone information to a central station |
US11916870B2 (en) | 2004-03-16 | 2024-02-27 | Icontrol Networks, Inc. | Gateway registry methods and systems |
US11916928B2 (en) | 2008-01-24 | 2024-02-27 | Icontrol Networks, Inc. | Communication protocols over internet protocol (IP) networks |
US11943301B2 (en) | 2014-03-03 | 2024-03-26 | Icontrol Networks, Inc. | Media content management |
US11991306B2 (en) | 2004-03-16 | 2024-05-21 | Icontrol Networks, Inc. | Premises system automation |
US12003387B2 (en) | 2012-06-27 | 2024-06-04 | Comcast Cable Communications, Llc | Control system user interface |
US12021649B2 (en) | 2010-12-20 | 2024-06-25 | Icontrol Networks, Inc. | Defining and implementing sensor triggered response rules |
US12063220B2 (en) | 2004-03-16 | 2024-08-13 | Icontrol Networks, Inc. | Communication protocols in integrated systems |
US12063221B2 (en) | 2006-06-12 | 2024-08-13 | Icontrol Networks, Inc. | Activation of gateway device |
US12088425B2 (en) | 2010-12-16 | 2024-09-10 | Icontrol Networks, Inc. | Bidirectional security sensor communication for a premises security system |
US12100287B2 (en) | 2010-12-17 | 2024-09-24 | Icontrol Networks, Inc. | Method and system for processing security event data |
Families Citing this family (111)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11316958B2 (en) | 2008-08-11 | 2022-04-26 | Icontrol Networks, Inc. | Virtual device systems and methods |
US10339791B2 (en) | 2007-06-12 | 2019-07-02 | Icontrol Networks, Inc. | Security network integrated with premise security system |
US11113950B2 (en) | 2005-03-16 | 2021-09-07 | Icontrol Networks, Inc. | Gateway integrated with premises security system |
US10156959B2 (en) | 2005-03-16 | 2018-12-18 | Icontrol Networks, Inc. | Cross-client sensor user interface in an integrated security network |
US11677577B2 (en) | 2004-03-16 | 2023-06-13 | Icontrol Networks, Inc. | Premises system management using status signal |
US11277465B2 (en) | 2004-03-16 | 2022-03-15 | Icontrol Networks, Inc. | Generating risk profile using data of home monitoring and security system |
US11159484B2 (en) | 2004-03-16 | 2021-10-26 | Icontrol Networks, Inc. | Forming a security network including integrated security system components and network devices |
US9141276B2 (en) | 2005-03-16 | 2015-09-22 | Icontrol Networks, Inc. | Integrated interface for mobile device |
US10522026B2 (en) | 2008-08-11 | 2019-12-31 | Icontrol Networks, Inc. | Automation system user interface with three-dimensional display |
US10348575B2 (en) | 2013-06-27 | 2019-07-09 | Icontrol Networks, Inc. | Control system user interface |
US7711796B2 (en) | 2006-06-12 | 2010-05-04 | Icontrol Networks, Inc. | Gateway registry methods and systems |
US9531593B2 (en) | 2007-06-12 | 2016-12-27 | Icontrol Networks, Inc. | Takeover processes in security network integrated with premise security system |
US20090077623A1 (en) | 2005-03-16 | 2009-03-19 | Marc Baum | Security Network Integrating Security System and Network Devices |
US10142392B2 (en) | 2007-01-24 | 2018-11-27 | Icontrol Networks, Inc. | Methods and systems for improved system performance |
US10200504B2 (en) | 2007-06-12 | 2019-02-05 | Icontrol Networks, Inc. | Communication protocols over internet protocol (IP) networks |
US11201755B2 (en) | 2004-03-16 | 2021-12-14 | Icontrol Networks, Inc. | Premises system management using status signal |
US11489812B2 (en) | 2004-03-16 | 2022-11-01 | Icontrol Networks, Inc. | Forming a security network including integrated security system components and network devices |
US11496568B2 (en) | 2005-03-16 | 2022-11-08 | Icontrol Networks, Inc. | Security system with networked touchscreen |
US9306809B2 (en) | 2007-06-12 | 2016-04-05 | Icontrol Networks, Inc. | Security system with networked touchscreen |
US20170180198A1 (en) | 2008-08-11 | 2017-06-22 | Marc Baum | Forming a security network including integrated security system components |
US8874477B2 (en) | 2005-10-04 | 2014-10-28 | Steven Mark Hoffberg | Multifactorial optimization system and method |
US10079839B1 (en) | 2007-06-12 | 2018-09-18 | Icontrol Networks, Inc. | Activation of gateway device |
US11089122B2 (en) | 2007-06-12 | 2021-08-10 | Icontrol Networks, Inc. | Controlling data routing among networks |
US11423756B2 (en) | 2007-06-12 | 2022-08-23 | Icontrol Networks, Inc. | Communication protocols in integrated systems |
US11258625B2 (en) * | 2008-08-11 | 2022-02-22 | Icontrol Networks, Inc. | Mobile premises automation platform |
US8782434B1 (en) | 2010-07-15 | 2014-07-15 | The Research Foundation For The State University Of New York | System and method for validating program execution at run-time |
US9122873B2 (en) | 2012-09-14 | 2015-09-01 | The Research Foundation For The State University Of New York | Continuous run-time validation of program execution: a practical approach |
US9292705B2 (en) | 2014-02-21 | 2016-03-22 | Lens Ventures, Llc | Management of drone operations and security in a pervasive computing environment |
US11146637B2 (en) | 2014-03-03 | 2021-10-12 | Icontrol Networks, Inc. | Media content management |
US11151460B2 (en) | 2014-03-26 | 2021-10-19 | Unanimous A. I., Inc. | Adaptive population optimization for amplifying the intelligence of crowds and swarms |
US10110664B2 (en) | 2014-03-26 | 2018-10-23 | Unanimous A. I., Inc. | Dynamic systems for optimization of real-time collaborative intelligence |
US11269502B2 (en) | 2014-03-26 | 2022-03-08 | Unanimous A. I., Inc. | Interactive behavioral polling and machine learning for amplification of group intelligence |
US12099936B2 (en) | 2014-03-26 | 2024-09-24 | Unanimous A. I., Inc. | Systems and methods for curating an optimized population of networked forecasting participants from a baseline population |
US9940006B2 (en) | 2014-03-26 | 2018-04-10 | Unanimous A. I., Inc. | Intuitive interfaces for real-time collaborative intelligence |
US10817159B2 (en) | 2014-03-26 | 2020-10-27 | Unanimous A. I., Inc. | Non-linear probabilistic wagering for amplified collective intelligence |
US12001667B2 (en) | 2014-03-26 | 2024-06-04 | Unanimous A. I., Inc. | Real-time collaborative slider-swarm with deadbands for amplified collective intelligence |
US10416666B2 (en) * | 2014-03-26 | 2019-09-17 | Unanimous A. I., Inc. | Methods and systems for collaborative control of a remote vehicle |
AU2015236010A1 (en) | 2014-03-26 | 2016-11-10 | Unanimous A.I. LLC | Methods and systems for real-time closed-loop collaborative intelligence |
US12079459B2 (en) | 2014-03-26 | 2024-09-03 | Unanimous A. I., Inc. | Hyper-swarm method and system for collaborative forecasting |
US10133460B2 (en) | 2014-03-26 | 2018-11-20 | Unanimous A.I., Inc. | Systems and methods for collaborative synchronous image selection |
US10817158B2 (en) | 2014-03-26 | 2020-10-27 | Unanimous A. I., Inc. | Method and system for a parallel distributed hyper-swarm for amplifying human intelligence |
US11941239B2 (en) | 2014-03-26 | 2024-03-26 | Unanimous A.I., Inc. | System and method for enhanced collaborative forecasting |
US9940444B1 (en) * | 2014-04-21 | 2018-04-10 | Virtual Marketing Incorporated | Software wrapper and installer using timestamp validation and system identification validation |
US9495877B2 (en) * | 2014-05-27 | 2016-11-15 | The Boeing Company | Airspace deconfliction system and method |
US9798322B2 (en) | 2014-06-19 | 2017-10-24 | Skydio, Inc. | Virtual camera interface and other user interaction paradigms for a flying digital assistant |
US12007763B2 (en) | 2014-06-19 | 2024-06-11 | Skydio, Inc. | Magic wand interface and other user interaction paradigms for a flying digital assistant |
US9678506B2 (en) | 2014-06-19 | 2017-06-13 | Skydio, Inc. | Magic wand interface and other user interaction paradigms for a flying digital assistant |
WO2016093908A2 (en) | 2014-09-05 | 2016-06-16 | Precisionhawk Usa Inc. | Automated un-manned air traffic control system |
EP3192000B1 (en) * | 2014-09-08 | 2024-06-12 | Uri Jacob Braun | System and method of controllably disclosing sensitive data |
US9536216B1 (en) | 2014-12-18 | 2017-01-03 | Amazon Technologies, Inc. | Delivery of packages by unmanned aerial vehicles |
US10445513B2 (en) * | 2015-03-06 | 2019-10-15 | Nokia Technologies Oy | Privacy management |
US10248973B1 (en) * | 2015-03-13 | 2019-04-02 | Marin Software Incorporated | Automated selection of bidders for online advertisements using collaborative bidding rules |
US20160307449A1 (en) * | 2015-04-15 | 2016-10-20 | International Business Machines Corporation | Autonomous drone service system |
US10284560B2 (en) | 2015-08-22 | 2019-05-07 | Just Innovation, Inc. | Secure unmanned vehicle operation and communication |
US10102757B2 (en) | 2015-08-22 | 2018-10-16 | Just Innovation, Inc. | Secure unmanned vehicle operation and monitoring |
US9934630B2 (en) * | 2015-10-30 | 2018-04-03 | Capital One Services, Llc | Secure delivery via unmanned vehicles |
US9928748B2 (en) | 2015-11-25 | 2018-03-27 | International Business Machines Corporation | Dynamic geo-fence for drone |
US9523986B1 (en) | 2015-12-04 | 2016-12-20 | International Business Machines Corporation | System and method for secure, privacy-aware and contextualised package delivery using autonomous vehicles |
JP6687488B2 (en) * | 2015-12-24 | 2020-04-22 | パナソニック インテレクチュアル プロパティ コーポレーション オブ アメリカPanasonic Intellectual Property Corporation of America | Unmanned aerial vehicle and control method thereof |
US20170200085A1 (en) * | 2016-01-11 | 2017-07-13 | Anuthep Benja-Athon | Creation of Abiotic-Biotic Civilization |
US10331909B2 (en) * | 2016-01-26 | 2019-06-25 | International Business Machines Corporation | Dynamic data flow analysis for dynamic languages programs |
CN113238581B (en) | 2016-02-29 | 2024-08-20 | 星克跃尔株式会社 | Method and system for flight control of unmanned aerial vehicle |
JP6372508B2 (en) * | 2016-03-15 | 2018-08-15 | オムロン株式会社 | Data flow control device and data flow control method |
US10657830B2 (en) * | 2016-03-28 | 2020-05-19 | International Business Machines Corporation | Operation of an aerial drone inside an exclusion zone |
US10303186B2 (en) * | 2016-03-29 | 2019-05-28 | Chengwen Chris WANG | Unmanned spatial vehicle performance |
US10169608B2 (en) * | 2016-05-13 | 2019-01-01 | Microsoft Technology Licensing, Llc | Dynamic management of data with context-based processing |
GB2565027A (en) * | 2016-05-18 | 2019-01-30 | Walmart Apollo Llc | Apparatus and method for displaying content with delivery vehicle |
CA3187011A1 (en) * | 2016-05-20 | 2017-11-23 | United Parcel Service Of America, Inc. | Sharing location information with a recipient |
US10435176B2 (en) | 2016-05-25 | 2019-10-08 | Skydio, Inc. | Perimeter structure for unmanned aerial vehicle |
US10382539B1 (en) * | 2016-06-01 | 2019-08-13 | Cape Productions Inc. | Methods and apparatus for data control and transfer with an unmanned aerial vehicle |
US10791213B2 (en) | 2016-06-14 | 2020-09-29 | Hand Held Products, Inc. | Managing energy usage in mobile devices |
US9947233B2 (en) | 2016-07-12 | 2018-04-17 | At&T Intellectual Property I, L.P. | Method and system to improve safety concerning drones |
US10217002B2 (en) | 2016-08-04 | 2019-02-26 | International Business Machines Corporation | Systems and methods for monitoring unmanned vehicles |
US10520943B2 (en) | 2016-08-12 | 2019-12-31 | Skydio, Inc. | Unmanned aerial image capture platform |
WO2018053320A1 (en) * | 2016-09-15 | 2018-03-22 | Oracle International Corporation | Automatic partitioning of stream data for shapes |
US10139836B2 (en) | 2016-09-27 | 2018-11-27 | International Business Machines Corporation | Autonomous aerial point of attraction highlighting for tour guides |
US20180101777A1 (en) * | 2016-10-12 | 2018-04-12 | Anuthep Benja-Athon | EM Oracle |
US10417755B1 (en) | 2016-11-18 | 2019-09-17 | Talon Aerolytics (Holding), Inc. | Drone-based inspection of wireless communication towers and corresponding methods, systems, and apparatuses |
US11295458B2 (en) | 2016-12-01 | 2022-04-05 | Skydio, Inc. | Object tracking by an unmanned aerial vehicle using visual sensors |
WO2018102286A1 (en) * | 2016-12-02 | 2018-06-07 | Equifax, Inc. | Generating and processing obfuscated sensitive information |
US10395017B2 (en) * | 2017-02-01 | 2019-08-27 | International Business Machines Corporation | Selectively redacting digital footprint information in order to improve computer data security |
US10310501B2 (en) | 2017-02-15 | 2019-06-04 | International Business Machines Corporation | Managing available energy among multiple drones |
KR20180096190A (en) | 2017-02-20 | 2018-08-29 | 삼성전자주식회사 | Electronic device for controlling unmanned aerial vehicle and method of operating the same |
US20190237168A1 (en) * | 2018-01-29 | 2019-08-01 | Anuthep Benja-Athon | Abiotic Intelligence-Rendered Pay |
US20190238338A1 (en) * | 2018-01-31 | 2019-08-01 | Walmart Apollo, Llc | Cloning drones using blockchain |
CN108719242B (en) * | 2018-03-27 | 2020-12-29 | 钟静海 | Pesticide spraying system |
CN108516092B (en) * | 2018-03-27 | 2021-01-01 | 含山县丰华供销合作社有限公司 | Agricultural unmanned aerial vehicle's medicine system that spouts |
US10629009B2 (en) | 2018-04-25 | 2020-04-21 | International Business Machines Corporation | Non-intrusive unmanned entity inspection |
US10676216B2 (en) * | 2018-04-25 | 2020-06-09 | International Business Machines Corporation | Non-intrusive unmanned entity inspection |
US10564940B2 (en) * | 2018-05-03 | 2020-02-18 | International Business Machines Corporation | Systems and methods for programming drones |
WO2019231745A1 (en) * | 2018-05-29 | 2019-12-05 | Walmart Apollo, Llc | Unmanned retail delivery vehicle protection systems and methods of protection |
US10713468B2 (en) * | 2018-11-08 | 2020-07-14 | International Business Machines Corporation | Checking credentials using a drone |
CN109582034B (en) * | 2018-11-29 | 2021-08-06 | 沈阳无距科技有限公司 | Multitask route planning method and device and electronic equipment |
CN113273164A (en) * | 2019-01-07 | 2021-08-17 | 昕诺飞控股有限公司 | Controller, system and method for providing location-based services to an area |
EP3690383A1 (en) * | 2019-02-04 | 2020-08-05 | CMI Defence S.A. | Operational section of armoured vehicles communicating with a flotilla of drones |
US11479357B1 (en) | 2019-03-15 | 2022-10-25 | Alarm.Com Incorporated | Perspective angle acquisition and adjustment of security camera drone |
US11237749B2 (en) * | 2019-06-06 | 2022-02-01 | EMC IP Holding Company LLC | System and method for backup data discrimination |
US10694372B1 (en) | 2019-09-06 | 2020-06-23 | International Business Machines Corporation | Independent agent-based location verification |
US11687595B2 (en) | 2019-10-30 | 2023-06-27 | EMC IP Holding Company LLC | System and method for searching backups |
US11507473B2 (en) | 2019-10-30 | 2022-11-22 | EMC IP Holding Company LLC | System and method for efficient backup generation |
US11586506B2 (en) | 2019-10-30 | 2023-02-21 | EMC IP Holding Company LLC | System and method for indexing image backups |
US11593497B2 (en) | 2019-10-30 | 2023-02-28 | EMC IP Holding Company LLC | System and method for managing sensitive data |
US11475159B2 (en) | 2019-10-30 | 2022-10-18 | EMC IP Holding Company LLC | System and method for efficient user-level based deletions of backup data |
KR102189486B1 (en) * | 2020-06-17 | 2020-12-11 | (주)인티그리트 | System for providing shared contents service using remote controlling of shared autonomous device |
US11693399B2 (en) | 2020-09-01 | 2023-07-04 | Ge Aviation Systems Llc | Systems and methods for market based deconfliction for unmanned traffic management |
US11693404B2 (en) | 2020-10-22 | 2023-07-04 | Ge Aviation Systems Llc | Trusted autonomy framework for unmanned aerial systems |
US11798117B2 (en) | 2021-06-16 | 2023-10-24 | Bank Of America Corporation | Systems and methods for intelligent steganographic protection |
US11932281B2 (en) * | 2021-09-22 | 2024-03-19 | International Business Machines Corporation | Configuring and controlling an automated vehicle to perform user specified operations |
US12033482B1 (en) * | 2022-08-01 | 2024-07-09 | Amazon Technologies, Inc. | Techniques for implementing customized intrusion zones |
US11953996B1 (en) | 2023-01-20 | 2024-04-09 | Dell Products L.P. | Method and system for selectively preserving data generated during application access |
US11949638B1 (en) | 2023-03-04 | 2024-04-02 | Unanimous A. I., Inc. | Methods and systems for hyperchat conversations among large networked populations with collective intelligence amplification |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20070284474A1 (en) * | 2006-06-09 | 2007-12-13 | The Insitu Group, Inc. | Wirelessly controlling unmanned aircraft and accessing associated surveillance data |
US20120059535A1 (en) * | 2010-09-03 | 2012-03-08 | Honeywell International Inc. | Systems and methods for rta control of multi-segment flight plans with smooth transitions |
US20120158280A1 (en) * | 2008-01-14 | 2012-06-21 | Ravenscroft Donald L | Computing route plans for routing around obstacles having spatial and temporal dimensions |
Family Cites Families (94)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5495412A (en) * | 1994-07-15 | 1996-02-27 | Ican Systems, Inc. | Computer-based method and apparatus for interactive computer-assisted negotiations |
CN100452071C (en) * | 1995-02-13 | 2009-01-14 | 英特特拉斯特技术公司 | Systems and methods for secure transaction management and electronic rights protection |
US7095854B1 (en) * | 1995-02-13 | 2006-08-22 | Intertrust Technologies Corp. | Systems and methods for secure transaction management and electronic rights protection |
DE834824T1 (en) * | 1996-10-04 | 1998-09-24 | Toshiba Kawasaki Kk | Cooperative inference device, cooperative inference process, storage medium, which stores a program thereof and cooperative inference system thereof |
US6401080B1 (en) * | 1997-03-21 | 2002-06-04 | International Business Machines Corporation | Intelligent agent with negotiation capability and method of negotiation therewith |
US6148342A (en) * | 1998-01-27 | 2000-11-14 | Ho; Andrew P. | Secure database management system for confidential records using separately encrypted identifier and access request |
US6437692B1 (en) * | 1998-06-22 | 2002-08-20 | Statsignal Systems, Inc. | System and method for monitoring and controlling remote devices |
US6304948B1 (en) * | 1998-10-06 | 2001-10-16 | Ricoh Corporation | Method and apparatus for erasing data after expiration |
US7222109B1 (en) * | 1998-11-16 | 2007-05-22 | Sky Technologies Llc | System and method for contract authority |
US6141653A (en) * | 1998-11-16 | 2000-10-31 | Tradeaccess Inc | System for interactive, multivariate negotiations over a network |
US20030154172A1 (en) * | 1999-05-04 | 2003-08-14 | Mark Richards | Negotiation facilitation during claim processing |
US7330826B1 (en) * | 1999-07-09 | 2008-02-12 | Perfect.Com, Inc. | Method, system and business model for a buyer's auction with near perfect information using the internet |
US6842899B2 (en) * | 1999-12-21 | 2005-01-11 | Lockheed Martin Corporation | Apparatus and method for resource negotiations among autonomous agents |
US7738550B2 (en) * | 2000-03-13 | 2010-06-15 | Sony Corporation | Method and apparatus for generating compact transcoding hints metadata |
US7028009B2 (en) | 2001-01-17 | 2006-04-11 | Contentguardiholdings, Inc. | Method and apparatus for distributing enforceable property rights |
AU2002255806A1 (en) * | 2001-03-20 | 2002-10-08 | Dealigence Inc. | Negotiating platform |
US20040088264A1 (en) * | 2001-04-11 | 2004-05-06 | Preist Christopher William | Automatic contract negotiation with multiple parameters |
US20030014325A1 (en) * | 2001-06-27 | 2003-01-16 | Peter Biffar | Automatic pricing and negotiation system |
US8250025B2 (en) * | 2001-11-06 | 2012-08-21 | Business Controls, Inc. | Anonymous reporting system |
US7478157B2 (en) * | 2001-11-07 | 2009-01-13 | International Business Machines Corporation | System, method, and business methods for enforcing privacy preferences on personal-data exchanges across a network |
US7316032B2 (en) * | 2002-02-27 | 2008-01-01 | Amad Tayebi | Method for allowing a customer to preview, acquire and/or pay for information and a system therefor |
US7904360B2 (en) * | 2002-02-04 | 2011-03-08 | Alexander William EVANS | System and method for verification, authentication, and notification of a transaction |
US7472423B2 (en) * | 2002-03-27 | 2008-12-30 | Tvworks, Llc | Method and apparatus for anonymously tracking TV and internet usage |
US20170313332A1 (en) * | 2002-06-04 | 2017-11-02 | General Electric Company | Autonomous vehicle system and method |
US7685073B2 (en) * | 2002-07-30 | 2010-03-23 | Baker Paul L | Methods for negotiating agreement over concealed terms through a blind agent |
US20040153908A1 (en) * | 2002-09-09 | 2004-08-05 | Eprivacy Group, Inc. | System and method for controlling information exchange, privacy, user references and rights via communications networks |
US20040210926A1 (en) * | 2003-01-08 | 2004-10-21 | Avtrex, Inc. | Controlling access to content |
US20040172371A1 (en) * | 2003-02-28 | 2004-09-02 | Fujitsu Limited | Automated negotiation |
US20040254846A1 (en) * | 2003-06-13 | 2004-12-16 | Byde Andrew Robert | Options for negotiation with multiple sellers |
US20040254847A1 (en) * | 2003-06-13 | 2004-12-16 | Preist Christopher William | Automated negotiation with multiple parties |
US7526640B2 (en) * | 2003-06-30 | 2009-04-28 | Microsoft Corporation | System and method for automatic negotiation of a security protocol |
US6856894B1 (en) * | 2003-10-23 | 2005-02-15 | International Business Machines Corporation | Navigating a UAV under remote control and manual control with three dimensional flight depiction |
US20120210119A1 (en) * | 2004-06-14 | 2012-08-16 | Arthur Baxter | Method and Apparatus for Secure Internet Browsing |
US20060048224A1 (en) * | 2004-08-30 | 2006-03-02 | Encryptx Corporation | Method and apparatus for automatically detecting sensitive information, applying policies based on a structured taxonomy and dynamically enforcing and reporting on the protection of sensitive data through a software permission wrapper |
EP1667088B1 (en) * | 2004-11-30 | 2009-11-04 | Oculus Info Inc. | System and method for interactive 3D air regions |
US8874477B2 (en) * | 2005-10-04 | 2014-10-28 | Steven Mark Hoffberg | Multifactorial optimization system and method |
US9459622B2 (en) * | 2007-01-12 | 2016-10-04 | Legalforce, Inc. | Driverless vehicle commerce network and community |
US8015117B1 (en) * | 2006-04-27 | 2011-09-06 | Hewlett-Packard Development Company, L.P. | Method and system for anonymous reporting |
US8996715B2 (en) * | 2006-06-23 | 2015-03-31 | International Business Machines Corporation | Application firewall validation bypass for impromptu components |
US8549077B2 (en) * | 2006-06-30 | 2013-10-01 | The Invention Science Fund I, Llc | Usage parameters for communication content |
US8537716B2 (en) * | 2006-07-28 | 2013-09-17 | Ca, Inc. | Method and system for synchronizing access points in a wireless network |
DE102007032084A1 (en) * | 2007-07-09 | 2009-01-22 | Eads Deutschland Gmbh | Collision and Conflict Prevention System for autonomous unmanned aerial vehicles (UAV) |
US8082102B2 (en) * | 2008-01-14 | 2011-12-20 | The Boeing Company | Computing flight plans for UAVs while routing around obstacles having spatial and temporal dimensions |
US11159909B2 (en) * | 2008-02-05 | 2021-10-26 | Victor Thomas Anderson | Wireless location establishing device |
US20090313173A1 (en) * | 2008-06-11 | 2009-12-17 | Inderpal Singh | Dynamic Negotiation System |
WO2010048980A1 (en) * | 2008-10-27 | 2010-05-06 | Telecom Italia S.P.A. | Method and system for profiling data traffic in telecommunications networks |
US20100286859A1 (en) * | 2008-11-18 | 2010-11-11 | Honeywell International Inc. | Methods for generating a flight plan for an unmanned aerial vehicle based on a predicted camera path |
WO2010123529A2 (en) * | 2008-12-19 | 2010-10-28 | Xollai, Llc | System and method for autonomous vehicle control |
JP5643292B2 (en) * | 2009-04-20 | 2014-12-17 | インターデイジタル パテント ホールディングス インコーポレイテッド | Multiple domain systems and domain ownership |
US8213957B2 (en) * | 2009-04-22 | 2012-07-03 | Trueposition, Inc. | Network autonomous wireless location system |
US20100312706A1 (en) * | 2009-06-09 | 2010-12-09 | Jacques Combet | Network centric system and method to enable tracking of consumer behavior and activity |
EP3096503A1 (en) * | 2009-10-15 | 2016-11-23 | Interdigital Patent Holdings, Inc. | Registration and credential roll-out for accessing a subscription-based service |
US20110161236A1 (en) * | 2009-12-29 | 2011-06-30 | Sharad Singhal | System and method for negotiating a sale |
US8964625B2 (en) * | 2009-12-31 | 2015-02-24 | Verizon Patent And Licensing Inc. | Dynamic wireless network apparatuses, systems, and methods |
FR2955959B1 (en) * | 2010-02-02 | 2012-09-21 | Thales Sa | NAVIGATION ASSISTANCE SYSTEM FOR A DRONE |
US8190675B2 (en) * | 2010-02-11 | 2012-05-29 | Inditto, Llc | Method and system for providing access to remotely hosted services through a normalized application programming interface |
US9032473B2 (en) * | 2010-03-02 | 2015-05-12 | Interdigital Patent Holdings, Inc. | Migration of credentials and/or domains between trusted hardware subscription modules |
US20110238482A1 (en) * | 2010-03-29 | 2011-09-29 | Carney John S | Digital Profile System of Personal Attributes, Tendencies, Recommended Actions, and Historical Events with Privacy Preserving Controls |
US8544104B2 (en) * | 2010-05-10 | 2013-09-24 | International Business Machines Corporation | Enforcement of data privacy to maintain obfuscation of certain data |
JP5545084B2 (en) * | 2010-07-08 | 2014-07-09 | ソニー株式会社 | Information processing apparatus, information processing method, and program |
DE102011012874A1 (en) * | 2010-12-29 | 2012-07-05 | Francotyp-Postalia Gmbh | Method for approving utilization of ink cartridge of postage meter that is utilized for producing valid franking impression on mail item, involves carrying out billing of approval of utilization if non-usage of cartridge is detected |
US9739763B2 (en) * | 2011-05-16 | 2017-08-22 | Trimble Inc. | Telematic locomotive microfluidic analysis |
US9251216B2 (en) * | 2011-05-19 | 2016-02-02 | At&T Intellectual Property I, L.P. | Efficient publication of sparse data |
US20140074646A1 (en) * | 2011-05-31 | 2014-03-13 | Mehmet Kivanc Ozonat | Automated Negotiation |
US9159055B2 (en) * | 2011-09-07 | 2015-10-13 | Elwha Llc | Computational systems and methods for identifying a communications partner |
US8869305B1 (en) * | 2011-09-22 | 2014-10-21 | Symantec Corporation | Systems and methods for implementing password-protection policies based on physical locations of mobile devices |
US8452693B2 (en) * | 2011-10-06 | 2013-05-28 | Dhavalkumar M. Shah | Method for providing geographical location-based security, restrict, permit access of varying level to individual's any kind of data, information, credit, finances, services obtained(online and or offline) |
US20180032997A1 (en) | 2012-10-09 | 2018-02-01 | George A. Gordon | System, method, and computer program product for determining whether to prompt an action by a platform in connection with a mobile device |
US20130111545A1 (en) * | 2011-11-02 | 2013-05-02 | Alcatel-Lucent Usa Inc. | Privacy Management for Subscriber Data |
US20130160144A1 (en) * | 2011-12-14 | 2013-06-20 | Microsoft Corporation | Entity verification via third-party |
US9841761B2 (en) * | 2012-05-04 | 2017-12-12 | Aeryon Labs Inc. | System and method for controlling unmanned aerial vehicles |
EP2852881A4 (en) * | 2012-05-22 | 2016-03-23 | Intouch Technologies Inc | Graphical user interfaces including touchpad driving interfaces for telemedicine devices |
US20140018979A1 (en) * | 2012-07-13 | 2014-01-16 | Honeywell International Inc. | Autonomous airspace flight planning and virtual airspace containment system |
US9518821B2 (en) * | 2012-08-02 | 2016-12-13 | Benjamin Malay | Vehicle control system |
US8925054B2 (en) * | 2012-10-08 | 2014-12-30 | Comcast Cable Communications, Llc | Authenticating credentials for mobile platforms |
CN105009519A (en) | 2012-11-23 | 2015-10-28 | 卡尔加里科技股份有限公司 | Methods and systems for peer-to-peer discovery and connection from a collaborative application session |
US9310809B2 (en) * | 2012-12-03 | 2016-04-12 | The Boeing Company | Systems and methods for collaboratively controlling at least one aircraft |
US20140279568A1 (en) * | 2013-03-15 | 2014-09-18 | Variab.Ly Ltd | Price negotiation method and system |
US20140286744A1 (en) * | 2013-03-21 | 2014-09-25 | Unitronics Parking Solutions Ltd. | Vehicle centering system |
US20140358629A1 (en) * | 2013-05-31 | 2014-12-04 | Gurudatta Horantur Shivaswamy | Competitive pricing platform and managed inventory repository |
US9544623B2 (en) * | 2013-07-08 | 2017-01-10 | The Trustees Of Princeton University | Quota aware video adaptation |
US9721020B2 (en) * | 2013-07-31 | 2017-08-01 | International Business Machines Corporation | Search query obfuscation via broadened subqueries and recombining |
US9489538B2 (en) * | 2014-01-02 | 2016-11-08 | Alcatel Lucent | Role-based anonymization |
US9292705B2 (en) * | 2014-02-21 | 2016-03-22 | Lens Ventures, Llc | Management of drone operations and security in a pervasive computing environment |
US9959424B2 (en) * | 2014-03-12 | 2018-05-01 | Michael Bilotta | Information based life view |
US9256994B2 (en) * | 2014-05-12 | 2016-02-09 | Unmanned Innovation, Inc. | Unmanned aerial vehicle authorization and geofence envelope determination |
EP3233633A4 (en) * | 2014-12-17 | 2018-05-30 | Picpocket, Inc. | Drone based systems and methodologies for capturing images |
US9552736B2 (en) * | 2015-01-29 | 2017-01-24 | Qualcomm Incorporated | Systems and methods for restricting drone airspace access |
CA2996709A1 (en) * | 2015-08-27 | 2017-03-02 | Dronsystems Limited | A highly automated system of air traffic control (atm) for at least one unmanned aerial vehicle (unmanned aerial vehicles uav) |
US20170142589A1 (en) * | 2015-11-18 | 2017-05-18 | Samsung Electronics Co., Ltd | Method for adjusting usage policy and electronic device for supporting the same |
US10657827B2 (en) * | 2015-12-09 | 2020-05-19 | Dronesense Llc | Drone flight operations |
US11494808B2 (en) * | 2016-05-28 | 2022-11-08 | Anagog Ltd. | Anonymizing potentially sensitive data |
US10628608B2 (en) * | 2016-06-29 | 2020-04-21 | Sap Se | Anonymization techniques to protect data |
US20200311791A1 (en) * | 2019-03-28 | 2020-10-01 | Zycus Infotech Pvt. Ltd. | System and method for assisting in negotiations for commercial transactions |
2015
- 2015-02-23 US US14/629,313 patent/US9292705B2/en active Active
- 2015-02-23 US US14/629,436 patent/US10121015B2/en active Active

2016
- 2016-02-09 US US15/019,892 patent/US9654476B2/en active Active

2017
- 2017-04-07 US US15/482,558 patent/US10839089B2/en active Active

2018
- 2018-10-25 US US16/171,258 patent/US10963579B2/en active Active

2020
- 2020-11-13 US US17/098,118 patent/US20210081553A1/en active Pending
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20070284474A1 (en) * | 2006-06-09 | 2007-12-13 | The Insitu Group, Inc. | Wirelessly controlling unmanned aircraft and accessing associated surveillance data |
US20120158280A1 (en) * | 2008-01-14 | 2012-06-21 | Ravenscroft Donald L | Computing route plans for routing around obstacles having spatial and temporal dimensions |
US20120059535A1 (en) * | 2010-09-03 | 2012-03-08 | Honeywell International Inc. | Systems and methods for rta control of multi-segment flight plans with smooth transitions |
Cited By (57)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11625008B2 (en) | 2004-03-16 | 2023-04-11 | Icontrol Networks, Inc. | Premises management networking |
US11626006B2 (en) | 2004-03-16 | 2023-04-11 | Icontrol Networks, Inc. | Management of a security system at a premises |
US11656667B2 (en) | 2004-03-16 | 2023-05-23 | Icontrol Networks, Inc. | Integrated security system with parallel processing architecture |
US11588787B2 (en) | 2004-03-16 | 2023-02-21 | Icontrol Networks, Inc. | Premises management configuration and control |
US11811845B2 (en) | 2004-03-16 | 2023-11-07 | Icontrol Networks, Inc. | Communication protocols over internet protocol (IP) networks |
US12063220B2 (en) | 2004-03-16 | 2024-08-13 | Icontrol Networks, Inc. | Communication protocols in integrated systems |
US11991306B2 (en) | 2004-03-16 | 2024-05-21 | Icontrol Networks, Inc. | Premises system automation |
US11601397B2 (en) | 2004-03-16 | 2023-03-07 | Icontrol Networks, Inc. | Premises management configuration and control |
US11916870B2 (en) | 2004-03-16 | 2024-02-27 | Icontrol Networks, Inc. | Gateway registry methods and systems |
US11893874B2 (en) | 2004-03-16 | 2024-02-06 | Icontrol Networks, Inc. | Networked touchscreen with integrated interfaces |
US11782394B2 (en) | 2004-03-16 | 2023-10-10 | Icontrol Networks, Inc. | Automation system with mobile interface |
US11757834B2 (en) | 2004-03-16 | 2023-09-12 | Icontrol Networks, Inc. | Communication protocols in integrated systems |
US11810445B2 (en) | 2004-03-16 | 2023-11-07 | Icontrol Networks, Inc. | Cross-client sensor user interface in an integrated security network |
US11706045B2 (en) | 2005-03-16 | 2023-07-18 | Icontrol Networks, Inc. | Modular electronic display platform |
US11595364B2 (en) | 2005-03-16 | 2023-02-28 | Icontrol Networks, Inc. | System for data routing in networks |
US11792330B2 (en) | 2005-03-16 | 2023-10-17 | Icontrol Networks, Inc. | Communication and automation in a premises management system |
US11700142B2 (en) | 2005-03-16 | 2023-07-11 | Icontrol Networks, Inc. | Security network integrating security system and network devices |
US11824675B2 (en) | 2005-03-16 | 2023-11-21 | Icontrol Networks, Inc. | Networked touchscreen with integrated interfaces |
US11615697B2 (en) | 2005-03-16 | 2023-03-28 | Icontrol Networks, Inc. | Premise management systems and methods |
US12063221B2 (en) | 2006-06-12 | 2024-08-13 | Icontrol Networks, Inc. | Activation of gateway device |
US12120171B2 (en) | 2007-01-24 | 2024-10-15 | Icontrol Networks, Inc. | Methods and systems for data communication |
US11706279B2 (en) | 2007-01-24 | 2023-07-18 | Icontrol Networks, Inc. | Methods and systems for data communication |
US11809174B2 (en) | 2007-02-28 | 2023-11-07 | Icontrol Networks, Inc. | Method and system for managing communication connectivity |
US11663902B2 (en) | 2007-04-23 | 2023-05-30 | Icontrol Networks, Inc. | Method and system for providing alternate network access |
US11632308B2 (en) | 2007-06-12 | 2023-04-18 | Icontrol Networks, Inc. | Communication protocols in integrated systems |
US11722896B2 (en) | 2007-06-12 | 2023-08-08 | Icontrol Networks, Inc. | Communication protocols in integrated systems |
US11582065B2 (en) | 2007-06-12 | 2023-02-14 | Icontrol Networks, Inc. | Systems and methods for device communication |
US11601810B2 (en) | 2007-06-12 | 2023-03-07 | Icontrol Networks, Inc. | Communication protocols in integrated systems |
US11611568B2 (en) | 2007-06-12 | 2023-03-21 | Icontrol Networks, Inc. | Communication protocols over internet protocol (IP) networks |
US11894986B2 (en) | 2007-06-12 | 2024-02-06 | Icontrol Networks, Inc. | Communication protocols in integrated systems |
US11646907B2 (en) | 2007-06-12 | 2023-05-09 | Icontrol Networks, Inc. | Communication protocols in integrated systems |
US11625161B2 (en) | 2007-06-12 | 2023-04-11 | Icontrol Networks, Inc. | Control system user interface |
US11815969B2 (en) | 2007-08-10 | 2023-11-14 | Icontrol Networks, Inc. | Integrated security system with parallel processing architecture |
US11831462B2 (en) | 2007-08-24 | 2023-11-28 | Icontrol Networks, Inc. | Controlling data routing in premises management systems |
US11916928B2 (en) | 2008-01-24 | 2024-02-27 | Icontrol Networks, Inc. | Communication protocols over internet protocol (IP) networks |
US11816323B2 (en) | 2008-06-25 | 2023-11-14 | Icontrol Networks, Inc. | Automation system user interface |
US11616659B2 (en) | 2008-08-11 | 2023-03-28 | Icontrol Networks, Inc. | Integrated cloud system for premises automation |
US11711234B2 (en) | 2008-08-11 | 2023-07-25 | Icontrol Networks, Inc. | Integrated cloud system for premises automation |
US11641391B2 (en) | 2008-08-11 | 2023-05-02 | Icontrol Networks Inc. | Integrated cloud system with lightweight gateway for premises automation |
US11758026B2 (en) | 2008-08-11 | 2023-09-12 | Icontrol Networks, Inc. | Virtual device systems and methods |
US20220173934A1 (en) * | 2008-08-11 | 2022-06-02 | Icontrol Networks, Inc. | Mobile premises automation platform |
US11729255B2 (en) | 2008-08-11 | 2023-08-15 | Icontrol Networks, Inc. | Integrated cloud system with lightweight gateway for premises automation |
US11962672B2 (en) | 2008-08-11 | 2024-04-16 | Icontrol Networks, Inc. | Virtual device systems and methods |
US11792036B2 (en) * | 2008-08-11 | 2023-10-17 | Icontrol Networks, Inc. | Mobile premises automation platform |
US11601865B2 (en) | 2009-04-30 | 2023-03-07 | Icontrol Networks, Inc. | Server-based notification of alarm event subsequent to communication failure with armed security system |
US11665617B2 (en) | 2009-04-30 | 2023-05-30 | Icontrol Networks, Inc. | Server-based notification of alarm event subsequent to communication failure with armed security system |
US11997584B2 (en) | 2009-04-30 | 2024-05-28 | Icontrol Networks, Inc. | Activation of a home automation controller |
US11778534B2 (en) | 2009-04-30 | 2023-10-03 | Icontrol Networks, Inc. | Hardware configurable security, monitoring and automation controller having modular communication protocol interfaces |
US11856502B2 (en) | 2009-04-30 | 2023-12-26 | Icontrol Networks, Inc. | Method, system and apparatus for automated inventory reporting of security, monitoring and automation hardware and software at customer premises |
US11553399B2 (en) | 2009-04-30 | 2023-01-10 | Icontrol Networks, Inc. | Custom content for premises management |
US12127095B2 (en) | 2009-04-30 | 2024-10-22 | Icontrol Networks, Inc. | Custom content for premises management |
US11900790B2 (en) | 2010-09-28 | 2024-02-13 | Icontrol Networks, Inc. | Method, system and apparatus for automated reporting of account and sensor zone information to a central station |
US12088425B2 (en) | 2010-12-16 | 2024-09-10 | Icontrol Networks, Inc. | Bidirectional security sensor communication for a premises security system |
US12100287B2 (en) | 2010-12-17 | 2024-09-24 | Icontrol Networks, Inc. | Method and system for processing security event data |
US12021649B2 (en) | 2010-12-20 | 2024-06-25 | Icontrol Networks, Inc. | Defining and implementing sensor triggered response rules |
US12003387B2 (en) | 2012-06-27 | 2024-06-04 | Comcast Cable Communications, Llc | Control system user interface |
US11943301B2 (en) | 2014-03-03 | 2024-03-26 | Icontrol Networks, Inc. | Media content management |
Also Published As
Publication number | Publication date |
---|---|
US20170278407A1 (en) | 2017-09-28 |
US9654476B2 (en) | 2017-05-16 |
US20150242648A1 (en) | 2015-08-27 |
US10963579B2 (en) | 2021-03-30 |
US10839089B2 (en) | 2020-11-17 |
US20190087593A1 (en) | 2019-03-21 |
US20150242972A1 (en) | 2015-08-27 |
US20160164874A1 (en) | 2016-06-09 |
US9292705B2 (en) | 2016-03-22 |
US10121015B2 (en) | 2018-11-06 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20210081553A1 (en) | Management of drone operations and security in a pervasive computing environment |
US11423126B2 (en) | Computerized system and method for modifying a media file by automatically applying security features to select portions of media file content | |
US11740096B2 (en) | User interfaces for customized navigation routes | |
US11468198B2 (en) | Secure digital media authentication and analysis | |
US10862843B2 (en) | Computerized system and method for modifying a message to apply security features to the message's content | |
US11695741B2 (en) | Blockchain network incorporating an individual's geo-location via a communication network and applications using the same | |
US20200269861A1 (en) | Operationally customizable smart vehicle access | |
Kim et al. | Automatic, location-privacy preserving dashcam video sharing using blockchain and deep learning | |
US20210049703A1 (en) | Method for subscribing insurance policies from geolocated mobile devices with contracting on a distributed database | |
KR101557031B1 (en) | Method and system for performing image contents registration service | |
US11734397B2 (en) | Hallmark-based image capture prevention | |
US11818045B2 (en) | System for performing dynamic monitoring and prioritization of data packets | |
US20220108391A1 (en) | Systems and methods for utilizing project information for adjusting a credit offer | |
US20220321437A1 (en) | System for performing dynamic monitoring and filtration of data packets | |
Conley et al. | Location-based services: time for a privacy check-in | |
Sipior et al. | Cyberespionage goes mobile: Fasttrans company attacked | |
US20240362623A1 (en) | Liquidity and security mechanisms as part of a unified cryptographic wallet | |
US20230388804A1 (en) | System and method for preventing geo-location data tampering | |
Mukisa | A Framework for Privacy Aware Design in Future Mobile Applications | |
CA3185407A1 (en) | Systems and methods for generating and using entity specific data assets | |
JP2005025258A (en) | Terminal performing data communication, program to be used therewith, server, and data transmission system | |
Mudiyanselage | BACHELOR THESIS ASSIGNMENT | |
Ricciardi | Ambient Intelligence in the Internet of Things | |
Boothe | Ubiquitous Computing and Privacy |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
STPP | Information on status: patent application and granting procedure in general |
Free format text: APPLICATION DISPATCHED FROM PREEXAM, NOT YET DOCKETED

STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED

AS | Assignment |
Owner name: LENS VENTURES LLC, CALIFORNIA
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:LEMMEY, TARA;VONOG, STANISLAV;REEL/FRAME:067140/0270
Effective date: 20160209

AS | Assignment |
Owner name: VONOG, STANISLAV, CALIFORNIA
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:LENS VENTURES LLC;REEL/FRAME:067166/0932
Effective date: 20240419
Owner name: LEMMEY, TARA, CALIFORNIA
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:LENS VENTURES LLC;REEL/FRAME:067166/0932
Effective date: 20240419

AS | Assignment |
Owner name: LENS VENTURES LLC, CALIFORNIA
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:LEMMEY, TARA;VONOG, STANISLAV;REEL/FRAME:067241/0242
Effective date: 20240426

STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP | Information on status: patent application and granting procedure in general |
Free format text: FINAL REJECTION MAILED