US20160257198A1 - In-vehicle component user interface - Google Patents
- Publication number
- US20160257198A1 (U.S. application Ser. No. 14/635,321; US201514635321A)
- Authority
- US
- United States
- Prior art keywords
- component
- user interface
- vehicle
- control set
- user
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K35/00—Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
- B60K35/10—Input arrangements, i.e. from user to vehicle, associated with vehicle functions or specially adapted therefor
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K35/00—Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B19/00—Programme-control systems
- G05B19/02—Programme-control systems electric
- G05B19/04—Programme control other than numerical control, i.e. in sequence controllers or logic controllers
- G05B19/042—Programme control other than numerical control, i.e. in sequence controllers or logic controllers using digital processors
- G05B19/0421—Multiprocessor system
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K35/00—Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
- B60K35/80—Arrangements for controlling instruments
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K35/00—Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
- B60K35/85—Arrangements for transferring vehicle- or driver-related data
-
- B60K2350/1024—
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K2360/00—Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
- B60K2360/55—Remote control arrangements
- B60K2360/56—Remote control arrangements using mobile devices
- B60K2360/563—Vehicle displaying mobile device information
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K2360/00—Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
- B60K2360/55—Remote control arrangements
- B60K2360/56—Remote control arrangements using mobile devices
- B60K2360/566—Mobile devices displaying vehicle information
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K2360/00—Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
- B60K2360/55—Remote control arrangements
- B60K2360/56—Remote control arrangements using mobile devices
- B60K2360/569—Vehicle controlling mobile device functions
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K2360/00—Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
- B60K2360/55—Remote control arrangements
- B60K2360/56—Remote control arrangements using mobile devices
- B60K2360/573—Mobile devices controlling vehicle functions
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K2360/00—Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
- B60K2360/589—Wireless data transfers
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K35/00—Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
- B60K35/60—Instruments characterised by their location or relative disposition in or on vehicles
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/20—Pc systems
- G05B2219/23—Pc programming
- G05B2219/23067—Control, human or man machine interface, interactive, HMI, MMI
Definitions
- aspects of the disclosure generally relate to deployment of a user interface for interior vehicle component configuration by way of a personal user device.
- Smartphone and wearable device sales volumes continue to increase. Thus, more such devices are brought by users into the automotive context. Smartphones can already be used in some vehicle models to access a wide range of vehicle information, to start the vehicle, and to open windows and doors. Additionally, some wearable devices are capable of providing real-time navigation information to the driver.
- In a first illustrative embodiment, a system includes an in-vehicle component, including a first control set to configure the component, configured to identify a device associated with a user approach to the component; and send an interaction request to the device to cause the device to display a user interface for the component including a second control set to configure the component, the second control set including at least one function unavailable in the first control set.
- A personal device is configured to receive, from an in-vehicle component including a first control set to configure the component, a user interface definition descriptive of a second control set to configure the component; and receive, from the component, a request to display a user interface for the component including the second control set to configure the component, the second control set including at least one function unavailable in the first control set.
- A computer-implemented method includes receiving, by a personal device from an in-vehicle component including a first control set to configure the component, a user interface definition descriptive of a second control set to configure the component; and receiving, from the component, a request to display a user interface for the component including the second control set to configure the component, the second control set including at least one function unavailable in the first control set.
- FIG. 1 illustrates an example diagram of a system that may be used to provide telematics services to a vehicle
- FIG. 2A illustrates a diagram of a request by a user to configure an in-vehicle component via the user's mobile device
- FIG. 2B illustrates an alternate diagram of a request by a user to configure an in-vehicle component via the user's mobile device
- FIG. 3 illustrates an example vehicle including a plurality of in-vehicle components and a plurality of vehicle seats from which the in-vehicle components are accessible;
- FIG. 4A illustrates an example in-vehicle component receiving wireless signal intensity data from other in-vehicle components
- FIG. 4B illustrates an example in-vehicle component providing the identified mobile device with a user interface definition
- FIG. 5 illustrates an example process for identifying a mobile device associated with a user in the vehicle requesting an action
- FIG. 6 illustrates an example process for displaying a user interface on the identified mobile device.
- a system may be configured to allow vehicle occupants to seamlessly interact with their vehicle or with any other framework-enabled vehicle.
- the system may include a vehicle configured to detect a user approach to a proximity sensor of an in-vehicle component to be configured, and further to identify a personal device of the approaching user on which to display a user interface for the in-vehicle component.
- a personal device may generally refer to a mobile device such as a smartphone, or a wearable device such as a smart watch or smart glasses.
- the personal device of the user may be configured to communicate with the vehicle to receive the user interface to display, provide the user interface to the user, and forward any commands entered via the user interface to the vehicle for configuration of the in-vehicle component.
- the system may be configured to determine which occupant of the vehicle desires to interact with a specific function, i.e., which device should interact with the in-vehicle component to be configured, and further to communicate, to the identified device, which user interface information is to be displayed.
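The occupant-identification step described above can be sketched as follows. This is an illustrative Python sketch only, not the disclosed implementation: it assumes each in-vehicle component reports the received signal strength (RSSI, in dBm) it measures for each advertising personal device, and that the component being approached selects the device whose signal is strongest at it relative to the other components in the cabin.

```python
def identify_approaching_device(local_rssi, peer_rssi_reports):
    """Pick the device most likely approaching this component.

    local_rssi:        {device_id: rssi_dbm} measured by this component.
    peer_rssi_reports: list of {device_id: rssi_dbm} dicts reported by the
                       other in-vehicle components.
    """
    best_device, best_margin = None, float("-inf")
    for device, rssi_here in local_rssi.items():
        peer_values = [r[device] for r in peer_rssi_reports if device in r]
        if not peer_values:
            continue
        # A device close to this component is louder here than anywhere else.
        margin = rssi_here - max(peer_values)
        if margin > best_margin:
            best_device, best_margin = device, margin
    return best_device

# "phone-A" is much stronger at this component than at any peer component,
# so it is identified as the device associated with the user approach.
local = {"phone-A": -40, "phone-B": -70}
peers = [{"phone-A": -65, "phone-B": -55}, {"phone-A": -68, "phone-B": -58}]
```

The margin test (strength here versus the strongest peer reading) is one simple way to distinguish a device moving toward this component from a device that merely happens to be loud everywhere.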
- a user may reach for a light switch within the vehicle cabin, e.g., located on the vehicle headliner near a lamp or on a seat armrest.
- When the light switch is touched by the user, it may provide some basic functionality to allow for the configuration of the light, such as turning the light off or on.
- As the user approaches the light switch, his or her mobile device may be configured to automatically display a more in-depth interface for the light switch.
- the in-depth user interface may accordingly enable the user to set up additional lighting features, such as tone, mood, intensity, etc., which may be unavailable via the direct physical user interface of the light.
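The light example can be sketched as two control sets, per the claims: a basic first control set exposed on the physical switch, and a richer second control set offered only through the interface displayed on the personal device. This is an illustrative Python sketch; the class, control names, and values are assumptions, not part of the disclosure.

```python
class ReadingLight:
    FIRST_CONTROL_SET = {"power"}                                # on the switch
    SECOND_CONTROL_SET = {"power", "intensity", "tone", "mood"}  # on the device

    def __init__(self):
        self.state = {"power": "off", "intensity": 50,
                      "tone": "warm", "mood": "neutral"}

    def apply(self, control, value, via_device):
        """Apply a setting, enforcing which interface may use which control."""
        allowed = (self.SECOND_CONTROL_SET if via_device
                   else self.FIRST_CONTROL_SET)
        if control not in allowed:
            raise ValueError(f"{control!r} unavailable on this interface")
        self.state[control] = value

light = ReadingLight()
light.apply("power", "on", via_device=False)   # basic function on the switch
light.apply("mood", "relax", via_device=True)  # only via the device interface
```

Attempting `light.apply("mood", "relax", via_device=False)` would raise, reflecting that the second control set includes at least one function unavailable in the first.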
- a user may request a taxi, a shared car, or another type of public transportation vehicle.
- the user may desire to perform customization to the local experience within the vehicle by adjusting lighting, climate, and sound attributes for the user's seat location.
- the user may also desire to be made aware of the specific features of the user's seat, such as whether the seat has cooling or massage features or some other feature available. If such features are available, the user may wish to be able to craft a customized experience without having to learn a vehicle-specific or application-specific user interface. Accordingly, when the user approaches one of the controls of the vehicle to configure, the vehicle may be configured to provide a user interface definition to the user's personal device including the specifics of the particular vehicle control.
- the user may perform the same customization on a first vehicle, and may desire that the user's vehicle settings would automatically be applied to a second vehicle supporting the customizations in which the user may travel.
- the user's personal device may maintain lighting, climate, infotainment, and seat position settings from the first vehicle, and may attempt to set user defaults accordingly based on the available features of the second vehicle. Further aspects of the system are discussed in detail below.
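The settings portability described above can be sketched as a simple filter: the personal device retains the user's preferences from the first vehicle and applies only the subset that the second vehicle reports as supported. Illustrative Python; all setting keys are assumptions.

```python
# Preferences retained on the personal device from a first vehicle.
saved_settings = {
    "lighting.intensity": 40,
    "climate.temperature_c": 21,
    "seat.massage": "medium",
    "audio.volume": 12,
}

def defaults_for(vehicle_features, settings):
    """Return the saved settings that this vehicle's feature set supports."""
    return {k: v for k, v in settings.items() if k in vehicle_features}

# The second vehicle lacks massage seats, so that preference is dropped
# while the remaining defaults carry over.
second_vehicle = {"lighting.intensity", "climate.temperature_c",
                  "audio.volume"}
carried_over = defaults_for(second_vehicle, saved_settings)
```

Keeping the preference store on the device, rather than in either vehicle, is what lets the same defaults follow the user between framework-enabled vehicles.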
- FIG. 1 illustrates an example diagram of a system 100 that may be used to provide telematics services to a vehicle 102 .
- the vehicle 102 may be one of various types of passenger vehicles, such as a crossover utility vehicle (CUV), a sport utility vehicle (SUV), a truck, a recreational vehicle (RV), a boat, a plane, or other mobile machine for transporting people or goods.
- Telematics services may include, as some non-limiting possibilities, navigation, turn-by-turn directions, vehicle health reports, local business search, accident reporting, and hands-free calling.
- the system 100 may include the SYNC system manufactured by The Ford Motor Company of Dearborn, Mich. It should be noted that the illustrated system 100 is merely an example, and more, fewer, and/or differently located elements may be used.
- the computing platform 104 may include one or more processors 106 configured to perform instructions, commands and other routines in support of the processes described herein.
- the computing platform 104 may be configured to execute instructions of vehicle applications 110 to provide features such as navigation, accident reporting, satellite radio decoding, and hands-free calling.
- Such instructions and other data may be maintained in a non-volatile manner using a variety of types of computer-readable storage medium 112 .
- the computer-readable medium 112 may also be referred to as a processor-readable medium or storage
- Computer-executable instructions may be compiled or interpreted from computer programs created using a variety of programming languages and/or technologies, including, without limitation, and either alone or in combination, Java, C, C++, C#, Objective C, Fortran, Pascal, Java Script, Python, Perl, and PL/SQL.
- the computing platform 104 may be provided with various features allowing the vehicle occupants to interface with the computing platform 104 .
- the computing platform 104 may include an audio input 114 configured to receive spoken commands from vehicle occupants through a connected microphone 116 , and auxiliary audio input 118 configured to receive audio signals from connected devices.
- the auxiliary audio input 118 may be a physical connection, such as an electrical wire or a fiber optic cable, or a wireless input, such as a BLUETOOTH audio connection.
- the audio input 114 may be configured to provide audio processing capabilities, such as pre-amplification of low-level signals, and conversion of analog inputs into digital data for processing by the processor 106 .
- the computing platform 104 may also provide one or more audio outputs 120 to an input of an audio module 122 having audio playback functionality. In other examples, the computing platform 104 may provide the audio output to an occupant through use of one or more dedicated speakers (not illustrated).
- the audio module 122 may include an input selector 124 configured to provide audio content from a selected audio source 126 to an audio amplifier 128 for playback through vehicle speakers 130 or headphones (not illustrated).
- the audio sources 126 may include, as some examples, decoded amplitude modulated (AM) or frequency modulated (FM) radio signals, and audio signals from compact disc (CD) or digital versatile disk (DVD) audio playback.
- the audio sources 126 may also include audio received from the computing platform 104 , such as audio content generated by the computing platform 104 , audio content decoded from flash memory drives connected to a universal serial bus (USB) subsystem 132 of the computing platform 104 , and audio content passed through the computing platform 104 from the auxiliary audio input 118 .
- the computing platform 104 may utilize a voice interface 134 to provide a hands-free interface to the computing platform 104 .
- the voice interface 134 may support speech recognition from audio received via the microphone 116 according to grammar associated with available commands, and voice prompt generation for output via the audio module 122 .
- the system may be configured to temporarily mute or otherwise override the audio source specified by the input selector 124 when an audio prompt is ready for presentation by the computing platform 104 and another audio source 126 is selected for playback.
- the computing platform 104 may also receive input from human-machine interface (HMI) controls 136 configured to provide for occupant interaction with the vehicle 102 .
- the computing platform 104 may interface with one or more buttons or other HMI controls configured to invoke functions on the computing platform 104 (e.g., steering wheel audio buttons, a push-to-talk button, instrument panel controls, etc.).
- the computing platform 104 may also drive or otherwise communicate with one or more displays 138 configured to provide visual output to vehicle occupants by way of a video controller 140 .
- the display 138 may be a touch screen further configured to receive user touch input via the video controller 140 , while in other cases the display 138 may be a display only, without touch input capabilities.
- the computing platform 104 may be further configured to communicate with other components of the vehicle 102 via one or more in-vehicle networks 142 .
- the in-vehicle networks 142 may include one or more of a vehicle controller area network (CAN), an Ethernet network, and a media oriented system transfer (MOST), as some examples.
- the in-vehicle networks 142 may allow the computing platform 104 to communicate with other vehicle 102 systems, such as a vehicle modem 144 (which may not be present in some configurations), a global positioning system (GPS) module 146 configured to provide current vehicle 102 location and heading information, and various vehicle ECUs 148 configured to cooperate with the computing platform 104 .
- the vehicle ECUs 148 may include a powertrain control module configured to provide control of engine operating components (e.g., idle control components, fuel delivery components, emissions control components, etc.) and monitoring of engine operating components (e.g., status of engine diagnostic codes); a body control module configured to manage various power control functions such as exterior lighting, interior lighting, keyless entry, remote start, and point of access status verification (e.g., closure status of the hood, doors and/or trunk of the vehicle 102 ); a radio transceiver module configured to communicate with key fobs or other local vehicle 102 devices; and a climate control management module configured to provide control and monitoring of heating and cooling system components (e.g., compressor clutch and blower fan control, temperature sensor information, etc.).
- the audio module 122 and the HMI controls 136 may communicate with the computing platform 104 over a first in-vehicle network 142 -A, and the vehicle modem 144 , GPS module 146 , and vehicle ECUs 148 may communicate with the computing platform 104 over a second in-vehicle network 142 -B.
- the computing platform 104 may be connected to more or fewer in-vehicle networks 142 .
- one or more HMI controls 136 or other components may be connected to the computing platform 104 via different in-vehicle networks 142 than shown, or directly without connection to an in-vehicle network 142 .
- the computing platform 104 may also be configured to communicate with mobile devices 152 of the vehicle occupants.
- the mobile devices 152 may be any of various types of portable computing devices, such as cellular phones, tablet computers, smart watches, laptop computers, portable music players, or other devices capable of communication with the computing platform 104 .
- the computing platform 104 may include a wireless transceiver 150 (e.g., a BLUETOOTH module, a ZIGBEE transceiver, a Wi-Fi transceiver, an IrDA transceiver, an RFID transceiver, etc.) configured to communicate with a compatible wireless transceiver 154 of the mobile device 152 .
- the computing platform 104 may communicate with the mobile device 152 over a wired connection, such as via a USB connection between the mobile device 152 and the USB subsystem 132 .
- the communications network 156 may provide communications services, such as packet-switched network services (e.g., Internet access, VoIP communication services), to devices connected to the communications network 156 .
- An example of a communications network 156 may include a cellular telephone network.
- Mobile devices 152 may provide network connectivity to the communications network 156 via a device modem 158 of the mobile device 152 .
- mobile devices 152 may be associated with unique device identifiers (e.g., mobile device numbers (MDNs), Internet protocol (IP) addresses, etc.) to identify the communications of the mobile devices 152 over the communications network 156 .
- occupants of the vehicle 102 or devices having permission to connect to the computing platform 104 may be identified by the computing platform 104 according to paired device data 160 maintained in the storage medium 112 .
- the paired device data 160 may indicate, for example, the unique device identifiers of mobile devices 152 previously paired with the computing platform 104 of the vehicle 102 , such that the computing platform 104 may automatically reconnect to the mobile devices 152 referenced in the paired device data 160 without user intervention.
- the mobile device 152 may allow the computing platform 104 to use the network connectivity of the device modem 158 to communicate over the communications network 156 with the remote telematics services 162 .
- the computing platform 104 may utilize a data-over-voice plan or data plan of the mobile device 152 to communicate information between the computing platform 104 and the communications network 156 .
- the computing platform 104 may utilize the vehicle modem 144 to communicate information between the computing platform 104 and the communications network 156 , without use of the communications facilities of the mobile device 152 .
- the mobile device 152 may include one or more processors 164 configured to execute instructions of mobile applications 170 loaded to a memory 166 of the mobile device 152 from storage medium 168 of the mobile device 152 .
- the mobile applications 170 may be configured to communicate with the computing platform 104 via the wireless transceiver 154 and with the remote telematics services 162 or other network services via the device modem 158 .
- the computing platform 104 may also include a device link interface 172 to facilitate the integration of functionality of the mobile applications 170 into the grammar of commands available via the voice interface 134 as well as into display 138 of the computing platform 104 .
- the device link interface 172 may also provide the mobile applications 170 with access to vehicle information available to the computing platform 104 via the in-vehicle networks 142 .
- Some examples of device link interfaces 172 include the SYNC APPLINK component of the SYNC system provided by The Ford Motor Company of Dearborn, Mich., the CarPlay protocol provided by Apple Inc. of Cupertino, Calif., or the Android Auto protocol provided by Google, Inc. of Mountain View, Calif.
- the vehicle component interface application 174 may be one such application installed to the mobile device 152 .
- the vehicle component interface application 174 of the mobile device 152 may be configured to facilitate access to one or more vehicle 102 features made available for device configuration by the vehicle 102 .
- the available vehicle 102 features may be accessible by a single vehicle component interface application 174 , in which case the vehicle component interface application 174 may be configured to be customizable or to maintain configurations supportive of the specific vehicle 102 brand/model and option packages.
- the vehicle component interface application 174 may be configured to receive, from the vehicle 102 , a definition of the features that are available to be controlled, display a user interface descriptive of the available features, and provide user input from the user interface to the vehicle 102 to allow the user to control the indicated features.
- an appropriate mobile device 152 to display the vehicle component interface application 174 may be identified, and a definition of the user interface to display may be provided to the identified vehicle component interface application 174 for display to the user.
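The exchange between the in-vehicle component and the vehicle component interface application can be sketched as follows: the component supplies a user interface definition (here encoded as JSON), the application renders controls from it, and user input is validated against the definition and forwarded back as a command. This is an illustrative Python sketch; the schema, control types, and identifiers are assumptions, not the disclosed format.

```python
import json

# A hypothetical user interface definition sent by an overhead light component.
ui_definition = json.dumps({
    "component": "overhead-light-206A",
    "controls": [
        {"id": "power", "type": "toggle"},
        {"id": "intensity", "type": "slider", "min": 0, "max": 100},
        {"id": "tone", "type": "choice", "options": ["warm", "cool"]},
    ],
})

def build_command(definition_json, control_id, value):
    """Validate user input against the definition and build a command
    to forward to the in-vehicle component."""
    definition = json.loads(definition_json)
    control = next(c for c in definition["controls"] if c["id"] == control_id)
    if control["type"] == "slider":
        assert control["min"] <= value <= control["max"], "value out of range"
    elif control["type"] == "choice":
        assert value in control["options"], "value not an option"
    return {"target": definition["component"], "set": {control_id: value}}
```

Because the definition travels with the request, the same application can render controls for any framework-enabled component without a vehicle-specific user interface baked in.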
- Systems such as the system 100 described above may require mobile device 152 pairing with the computing platform 104 and/or other setup operations.
- a system may be configured to allow vehicle occupants to seamlessly interact with user interface elements in their vehicle or with any other framework-enabled vehicle, without requiring the mobile device 152 or wearable device 202 to have been paired with or be in communication with the computing platform 104 .
- FIG. 2A illustrates a diagram 200 -A of a request by a user to configure an in-vehicle component 206 via the user's mobile device 152 .
- In the diagram 200 -A, a wearable device 202 associated with the user's mobile device 152 is moved toward an in-vehicle component 206 having a proximity sensor 208 .
- the wearable device 202 may include a smartwatch, smart glasses, fitness band, control ring, or other personal mobility or accessory device designed to be worn and to communicate with the user's mobile device 152 .
- the wearable device 202 may communicate data with the mobile device 152 over a wireless connection 204 .
- the wireless connection 204 may be a Bluetooth Low Energy (BLE) connection, but other types of local wireless connection, such as Wi-Fi or Zigbee may be utilized as well.
- the mobile device 152 may provide access to one or more control or display functions of the mobile device 152 to the wearable device 202 .
- the mobile device 152 may enable the wearable device 202 to accept a phone call to the mobile device 152 , enable a mobile application of the mobile device 152 to execute, receive and present notifications sent to the mobile device 152 , and/or a combination thereof.
- the in-vehicle component 206 may include various elements of the vehicle 102 having user-specific configurable settings. As shown in FIG. 3 , an example vehicle 102 includes a plurality of in-vehicle components 206 -A through 206 -I (collectively 206 ) and a plurality of vehicle seats 302 -A through 302 -D (collectively 302 ) from which the in-vehicle components 206 are accessible. These in-vehicle components 206 may include, as some examples, overhead light in-vehicle components 206 -A through 206 -D, overhead compartment in-vehicle component 206 -E, and speaker in-vehicle components 206 -F through 206 -I.
- in-vehicle components 206 are possible as well, such as power seats or climate control vents.
- the in-vehicle component 206 may expose controls such as buttons, sliders, and touchscreens that may be used by the user to configure the particular settings of the in-vehicle component 206 .
- the controls of the in-vehicle component 206 may allow the user to set a lighting level of a light control, set a temperature of a climate control, set a volume and source of audio for a speaker, and set a position of a seat control.
- the illustrated portion of the vehicle 102 in FIG. 3 is merely an example, and more, fewer, and/or differently located elements may be used.
- each in-vehicle component 206 may be equipped with a proximity detection sensor 208 configured to facilitate detection of the wearable device 202 .
- the proximity detection sensor 208 may include a wireless device, such as an Apple iBeacon device or an AltBeacon device, configured to use low energy Bluetooth signal intensity as a locator to determine the proximity of the wearable device 202 or mobile device 152. Detection of proximity of the wearable device 202 or mobile device 152 by the proximity detection sensor 208 may cause the vehicle component interface application 174 of the mobile device 152 to be activated. In an example, a wearer of the wearable device 202 may reach his or her hand toward the in-vehicle component 206.
- the intensity shift of the wireless connection 204 strength may be detected by the proximity detection sensor 208 , and a handshake may be established between the proximity detection sensor 208 and the approaching wearable device 202 .
- This connection functionality of the mobile device 152 may accordingly be utilized as a trigger to invoke the vehicle component interface application 174 on the mobile device 152 .
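The threshold-crossing trigger described above can be sketched in a few lines — a minimal illustration only, since the disclosure does not specify sensor hardware, intensity units, or threshold values (the dBm figures and device names below are assumptions):

```python
# Sketch of the proximity trigger: when an approaching device's measured
# signal intensity rises across a minimum threshold, a handshake is
# established and the interface application would be invoked.
# Threshold and readings below are hypothetical illustrations.

MIN_INTENSITY_DBM = -60  # assumed handshake threshold

def check_approach(previous_dbm, current_dbm, threshold=MIN_INTENSITY_DBM):
    """Return True when intensity rises across the threshold."""
    return previous_dbm < threshold <= current_dbm

def on_sensor_reading(state, device_id, current_dbm):
    """Track per-device intensity and fire the trigger on a crossing."""
    previous = state.get(device_id, float("-inf"))
    state[device_id] = current_dbm
    if check_approach(previous, current_dbm):
        return f"handshake:{device_id}"  # would launch the interface app
    return None

state = {}
assert on_sensor_reading(state, "watch-1", -80) is None   # still far away
assert on_sensor_reading(state, "watch-1", -55) == "handshake:watch-1"
assert on_sensor_reading(state, "watch-1", -50) is None   # already in range
```

The trigger fires only on the upward crossing, so a device that stays in range does not repeatedly re-invoke the application.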
- the proximity detection sensor 208 may include a near field communication (NFC) tag that may be detected by the wearable device 202 or mobile device 152 . Accordingly, as the wearable device 202 or mobile device 152 is moved into proximity to the in-vehicle component 206 , the vehicle component interface application 174 on the mobile device 152 may be activated. However, the use of NFC tags may require a controlled, slow motion of the approaching device to close proximity to the proximity detection sensor 208 . As a further possibility, the proximity detection sensor 208 may include a static image such as a quick response (QR) code or other information-encoded image that may be captured via a camera of the wearable device 202 or mobile device 152 .
- the vehicle component interface application 174 on the mobile device 152 may be activated responsive to the user pointing a camera of the wearable device 202 or mobile device 152 at the QR code or other image.
- QR codes or other image representations may require the approaching device to keep its camera on, and further require the user to orient the approaching device to acquire the image.
- each in-vehicle component 206 may include a set of controls configured to receive input from the user with respect to basic or core functions of the in-vehicle component 206 (e.g., turn light on/off, turn speaker on/off, etc.), and a proximity detection sensor 208 configured to identify proximity of wearable device 202 or mobile device 152 . It should be noted that the user interaction with the in-vehicle component 206 may be performed despite the mobile device 152 or wearable device 202 not having been paired with or being in communication with the computing platform 104 .
- FIG. 2B illustrates an alternate diagram 200 -B of a request by a user to configure an in-vehicle component 206 via the user's mobile device 152 .
- the user is approaching and may touch the proximity detection sensor 208 of the in-vehicle component 206 with a “naked” hand, i.e., a hand that is not wearing a wearable device 202 or holding a mobile device 152 .
- the vehicle 102 may be unable to detect which device to utilize based on the wireless signal intensity 210 .
- instructing all mobile devices 152 in the vehicle 102 to launch the vehicle component interface application 174 or sending to all of them a notification that the interface is available would be an inelegant solution.
- triangulation may be used to detect which mobile device 152 is that of the passenger requesting interaction with the in-vehicle component 206 .
- the vehicle 102 may determine that a mobile device 152 located in seat 302 -B is the device of the user in proximity to the in-vehicle component 206 -B.
- each of the in-vehicle controls 206 -A through 206 -D is located closest to one of the seats 302 -A through 302 -D, respectively.
- each of the in-vehicle controls 206 -A through 206 -D includes a respective proximity sensor 208 .
- a preliminary action may be performed by the in-vehicle component 206 , such as toggling the on-off state of a light of the in-vehicle component 206 .
- the in-vehicle component 206 -B may broadcast or otherwise send a request for intensity information 210 to the other in-vehicle components 206 of the vehicle 102 (e.g., 206 -A and 206 -C as illustrated).
- This request may cause the other in-vehicle components 206 to return wireless signal intensity 210 data identified by their respective proximity sensors 208 for whatever devices they detect (e.g., intensity data 210 -A identified by the proximity sensor 208 -A, intensity data 210 -C identified by the proximity sensor 208 -C).
- the in-vehicle component 206 -B may use the wireless signal intensity 210 -B from its own proximity sensor 208 -B as well as data from the other proximity sensors 208 (e.g., proximity sensors 208 -A and 208 -C) to determine a mobile device 152 of the approaching user.
- the proximity detection sensors 208 may be configured to share device wireless signal intensity 210 data with one another to allow for triangulation and identification of which of the wearable devices 202 or mobile devices 152 are closest to a given in-vehicle component 206.
- a mobile device 152 may be detected as being the only mobile device 152 having a higher measured wireless signal intensity 210 at the in-vehicle component 206 -B than at the in-vehicle component 206 -A and the in-vehicle component 206 -C. That device may therefore be determined to be the mobile device 152 most likely located in seat 302 -B.
- the proximity detection sensors 208 may additionally be utilized to enable in-cabin gesture interfaces for users wearing capable wearable devices 202 (e.g., BLE devices in the case of BLE proximity detection sensors 208 ), such as a smartwatch, fitness band, or control ring. Based on the aforementioned triangulation techniques, the network of proximity detection sensors 208 may be able to perform in-cabin location tracking of the wearable devices 202 in order to detect a gesture action performed by a user in the air, such as to open a window with a simple swipe of the hand, or to control the volume with an up-down hand motion.
- the in-cabin tracking may also be extended to passengers not wearing wearable devices 202 .
- electrical field distortions may be measurable with sufficiently sensitive proximity detection sensors 208 based on the field generated by the wireless components inside the vehicle 102. If different communication technologies are used, such as 60 GHz modulation, then in addition to increasing the bandwidth of data communicable between devices, in some cases the vehicle 102 may be able to perform in-cabin tracking to detect gestures and other motions at a high resolution.
- the integration of proximity detection sensors 208 with the configurable in-vehicle components 206 , as well as the triangulation method or wireless signal intensity 210 threshold techniques, may accordingly allow the vehicle 102 to determine which mobile device 152 belongs to the user engaging the configurable in-vehicle component 206 .
- the vehicle 102 may be configured to provide the identified mobile device 152 with a user interface definition 402 regarding what functionality is available to perform on the in-vehicle component 206 .
- the in-vehicle component 206 may be configured to communicate the user interface definition 402 to the mobile devices 152 or wearable devices 202 identified to display the user interface.
- the in-vehicle component 206 may be configured to request the mobile device 152 to launch a vehicle component interface application 174 previously installed to the mobile device 152. If the vehicle component interface application 174 is not already installed on the personal device, the in-vehicle component 206 may be configured to offer to side-load the application to it, or to offer a link from which the vehicle component interface application 174 may be installed to the personal device (e.g., from the Google Play Store or the Apple App Store, as some possibilities).
- the user interface definition 402 may be encoded in a data interchange format, such as hypertext markup language (HTML), extensible markup language (XML), or JavaScript Object Notation (JSON).
- the user interface definition 402 may be encoded in a markup similar to that of the view and viewgroup user interface definitions utilized by the Google Android operating system.
- One advantage of using a data interchange format commonly used on the web is that user devices (e.g., mobile devices 152, wearable devices 202, etc.) may be able to render the user interface definition 402 to display the user interface using existing or downloadable functionality of the device (e.g., a web browser plugin).
- a personal device of the user may be configured to utilize the vehicle component interface application 174 to connect to the vehicle 102 to receive the user interface definition 402 .
- the personal device may connect to the in-vehicle components 206 to receive the user interface definition 402 via available wireless protocols (e.g., BLE, etc.) provided by the proximity sensors 208 .
- the received user interface definition 402 may be descriptive of the functions available in each in-vehicle component 206 , variables that may be controlled, and current state of the variables.
- a universal vehicle component interface application 174 may be utilized across various brands/makes/models of vehicle 102 .
- a user interface definition 402 template for an in-vehicle light fixture having a single lamp may be described as an XML element with two attributes as follows:
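The XML snippet announced here does not survive in this text. A plausible sketch of such a single-lamp template, with assumed element and attribute names (the originals are not recoverable from this extraction), might read:

```xml
<!-- Hypothetical single-lamp template; element and attribute
     names are illustrative, not taken from the disclosure -->
<light state="off" intensity="0.0"/>
```

The two attributes correspond to the lamp's on/off state and its dimming level, which a rendering device could present as a toggle and a slider.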
- for in-vehicle components 206 with richer functionality, a more complex user interface definition 402 template would accordingly be utilized, such as one used to control seat functions (e.g., forward, back, tilt, recline, lumbar, etc.).
- the user interface definition 402 template may be defined to include attributes descriptive of the available functions, their names for presentation in the user interface, their allowed range of values (e.g., min, max, step size, default, etc.), and potentially layout information descriptive of grouping, ordering, or suggested controls (e.g., toggle control, slider control, knob control, etc.) of how to render the interface controls to change these attributes.
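A device consuming such a template would parse the declared functions and keep requested values within the allowed range. The sketch below assumes a hypothetical markup shape with `min`, `max`, and `step` attributes, as suggested by the description; none of these names are taken verbatim from the disclosure:

```python
# Sketch: parse an assumed seat-control template and clamp a requested
# value to the declared min/max, snapping it to the step grid.
import xml.etree.ElementTree as ET

TEMPLATE = """
<seat>
  <function name="recline" min="0" max="45" step="5" default="10"/>
  <function name="lumbar"  min="0" max="10" step="1" default="3"/>
</seat>
"""

def clamp_setting(template_xml, name, requested):
    root = ET.fromstring(template_xml)
    fn = next(f for f in root.iter("function") if f.get("name") == name)
    lo, hi, step = (float(fn.get(k)) for k in ("min", "max", "step"))
    value = min(max(requested, lo), hi)          # clamp into [min, max]
    return lo + round((value - lo) / step) * step  # snap to step grid

assert clamp_setting(TEMPLATE, "recline", 60) == 45.0  # above max
assert clamp_setting(TEMPLATE, "recline", 12) == 10.0  # snapped to step
assert clamp_setting(TEMPLATE, "lumbar", -2) == 0.0    # below min
```

Validating on the device side like this lets the in-vehicle component trust incoming commands to fall within its advertised range.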
- As the mobile device 152 or wearable device 202 is requested by the in-vehicle component 206 to act as a user interface for the in-vehicle component 206, the mobile device 152 or wearable device 202 accordingly receives not only what functionalities are available from the module, but also what other modules offering similar functionalities are available in the vehicle, as well as their locations (e.g., from triangulation as discussed above).
- the vehicle component interface application 174 may be configured to aggregate the data and offer to the user combinations for controlling interior lighting or other vehicle functions by controlling the in-vehicle components 206 sharing that attribute. As a specific example, the user may utilize their mobile device 152 to turn on all of the interior lights, but at a low intensity level.
- aggregation of the user interface definition 402 may be performed by the in-vehicle components 206 , such that the aggregated user interface definition 402 may be communicated to the personal device by the specific in-vehicle component 206 requesting for the user's device to display a user interface.
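The aggregation step can be sketched as grouping component definitions by shared function, so one user command fans out to every component offering it. Component IDs, function names, and data shapes below are assumptions for illustration:

```python
# Sketch: group per-component definitions by shared function so a
# single command (e.g., "all interior lights to low intensity") is
# translated into one command per matching component.

def aggregate_by_function(definitions):
    """definitions: {component_id: set of function names}"""
    grouped = {}
    for comp, functions in definitions.items():
        for fn in functions:
            grouped.setdefault(fn, []).append(comp)
    return grouped

def fan_out(definitions, function, value):
    """Return per-component commands implementing a group-wide setting."""
    groups = aggregate_by_function(definitions)
    return {comp: {function: value} for comp in groups.get(function, [])}

defs = {
    "light-A": {"intensity", "power"},
    "light-B": {"intensity", "power"},
    "speaker-F": {"volume", "power"},
}
commands = fan_out(defs, "intensity", 0.2)
assert commands == {"light-A": {"intensity": 0.2},
                    "light-B": {"intensity": 0.2}}
```

Whether this grouping happens on the personal device or on the in-vehicle components themselves is left open by the description; the logic is the same either way.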
- FIG. 5 illustrates an example process 500 for identifying a mobile device 152 associated with a user in the vehicle 102 requesting an action.
- the process 500 may be performed, for example, by one or more in-vehicle components 206 of the vehicle 102 .
- the in-vehicle component 206 determines whether a personal device of a user (e.g., a mobile device 152 , a wearable device 202 , etc.) is approaching the in-vehicle component 206 .
- the in-vehicle component 206 may be equipped with a proximity detection sensor 208 configured to facilitate detection of a wearable device 202 , such that as the wireless signal intensity 210 of the approaching wearable device 202 to the proximity detection sensor 208 crosses a minimum threshold intensity, the intensity shift of the wireless connection 204 strength may be detected by the proximity detection sensor 208 , and a handshake may be established between the proximity detection sensor 208 and the approaching wearable device 202 . If a personal device is detected as approaching the in-vehicle component 206 , control passes to operation 504 . Otherwise, control passes to operation 510 .
- the in-vehicle component 206 identifies the mobile device 152 of the user to use to display a user interface for the in-vehicle component 206 .
- the approaching wearable device 202 may be paired with or otherwise associated with a mobile device 152 configured to execute the vehicle component interface application 174 , and the wearable device 202 may be configured to provide to the in-vehicle component 206 (or the in-vehicle component 206 may request) the identity of the associated mobile device 152 .
- If the approaching device is a mobile device 152 or other device configured to execute the vehicle component interface application 174, the in-vehicle component 206 may identify the approaching mobile device 152 as the device to display the user interface.
- the in-vehicle component 206 sends an interaction request to the identified device.
- the in-vehicle component 206 may be configured to request the identified device to launch a vehicle component interface application 174 , or to provide a link for the vehicle component interface application 174 to be downloaded if the vehicle component interface application 174 is not yet installed. Once invoked or installed, control passes to operation 508 .
- the in-vehicle component 206 processes the interaction request using the identified device. An example interaction is described below with respect to the process 600 . After operation 508 , control passes to operation 502 .
- the in-vehicle component 206 may determine whether it detects an approach without detecting a personal device.
- electrical field distortions of the in-vehicle component 206 may be measured by the proximity detection sensor 208 of the in-vehicle component 206 based on the field generated by the wireless components inside the vehicle 102 .
- the in-vehicle component 206 may detect a user touch via a selection of a control of the built-in user interface of the in-vehicle component 206. If an approach is detected, control passes to operation 512. Otherwise, control passes to operation 502.
- the in-vehicle component 206 requests the other in-vehicle components 206 of the vehicle 102 to send wireless signal intensity 210 data identified by their respective proximity sensors 208 for whatever devices they detect. This may be done to allow the in-vehicle component 206 to perform triangulation to detect which mobile device 152 is that of the user requesting interaction with the in-vehicle component 206 .
- the in-vehicle component 206 determines whether the wireless signal intensity 210 data has been received or whether a timeout has occurred. For example, if at least a predetermined amount of time has passed since sending the request in operation 512, control passes to operation 516. Or, if the in-vehicle component 206 receives the requested wireless signal intensity 210 data, control passes to operation 516. Otherwise, control remains at operation 514.
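The wait-then-proceed behavior of this operation can be sketched as a poll loop with a deadline — proceed once all expected intensity reports arrive, or with whatever arrived once the timeout elapses. Timing values and report shapes are illustrative assumptions:

```python
# Sketch of the operation-514 wait: poll for intensity reports until
# all expected responses arrive or a deadline passes.
import time

def wait_for_reports(poll_reports, expected, timeout_s=0.5, poll_s=0.05):
    """poll_reports() -> dict of component -> intensity received so far."""
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        reports = poll_reports()
        if len(reports) >= expected:
            return reports, False  # complete set, no timeout
        time.sleep(poll_s)
    return poll_reports(), True    # timed out; use whatever arrived

# Simulated source delivering reports incrementally across polls:
arrivals = iter([{}, {"206-A": -70}, {"206-C": -75}])
latest = {}
def poll():
    try:
        latest.update(next(arrivals))
    except StopIteration:
        pass
    return latest

reports, timed_out = wait_for_reports(poll, expected=2)
assert timed_out is False
assert reports == {"206-A": -70, "206-C": -75}
```

Proceeding on timeout with partial data matches the process description: triangulation can still be attempted with however many sensors responded.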
- the in-vehicle component 206 calculates proximity to the detected devices.
- the in-vehicle component 206 may use the received wireless signal intensities 210 from its proximity sensor 208, as well as data from the other proximity sensors 208, to determine which devices have what wireless signal intensities 210 at the various in-vehicle components 206.
- the in-vehicle component 206 identifies a closest device.
- the in-vehicle component 206 may identify a mobile device 152 having a higher wireless signal intensity 210 as measured by the in-vehicle component 206 than as measured by the other in-vehicle components 206. This device may accordingly be identified as most likely being the mobile device 152 of the user approaching the in-vehicle component 206.
- control passes to operation 506 .
- FIG. 6 illustrates an example process 600 for displaying a user interface on the identified mobile device 152 .
- the process 600 may be performed, for example, by a personal device (e.g., a mobile device 152 , a wearable device 202 , etc.) in communication with one or more in-vehicle components 206 of the vehicle 102 .
- the personal device enters the vehicle 102 .
- the personal device may be carried by a user entering the vehicle 102 .
- the personal device connects to the in-vehicle components 206 .
- a personal device of the user may be configured to utilize the vehicle component interface application 174 to connect to the in-vehicle components 206 via the available wireless protocols (e.g., BLE, etc.).
- the personal device receives user interface definition 402 template information from the in-vehicle components 206.
- the personal device may receive tagged user interface definition 402 information descriptive of the functions available in each in-vehicle component 206 , variables that may be controlled, and current state of the variables.
- the personal device determines whether to act as a user interface for the in-vehicle components 206 .
- the personal device may be requested by the in-vehicle component 206 to act as a user interface for the in-vehicle component 206 .
- the personal device aggregates data from the in-vehicle components 206 offering similar functionality.
- the user may utilize their mobile device 152 to turn on all of the interior lights, but at a low intensity level.
- aggregation of the user interface definition 402 may be performed by the in-vehicle components 206 , such that the aggregated user interface definition 402 may be communicated to the personal device by the specific in-vehicle component 206 requesting for the user's device to display a user interface.
- the personal device renders a user interface.
- the personal device may accordingly display a user interface defined according to the received and aggregated tagged user interface definition 402 .
- the personal device determines whether the user requests to quit the user interface. In an example, the personal device may receive user input requesting for the user interface to be dismissed. If such input is received, control passes to operation 616 . Otherwise, control passes to operation 618 .
- the personal device closes the user interface. After operation 616 , control passes to operation 608 .
- the personal device determines whether a user interaction with the user interface is received.
- the personal device may receive user input requesting for a change to be made to the settings for one or more of the in-vehicle components 206 .
- the personal device sends an action request to the in-vehicle component(s) 206 .
- the user may utilize the personal device to turn on all of the interior lights, but at a low intensity level.
- control passes to operation 614 .
Abstract
Description
- Aspects of the disclosure generally relate to deployment of a user interface for interior vehicle component configuration by way of a personal user device.
- Smartphone and wearable device sales volumes continue to increase. Thus, more such devices are brought by users into the automotive context. Smartphones can already be used in some vehicle models to access a wide range of vehicle information, to start the vehicle, and to open windows and doors. Additionally, some wearable devices are capable of providing real-time navigation information to the driver.
- In a first illustrative embodiment, a system includes an in-vehicle component, including a first control set to configure the component, configured to identify a device associated with a user approach to the component; and send an interaction request to the device to cause the device to display a user interface for the component including a second control set to configure the component, the second control set including at least one function unavailable in the first control set.
- In a second illustrative embodiment, a personal device is configured to receive, from an in-vehicle component including a first control set to configure the component, a user interface definition descriptive of a second control set to configure the component; and receive, from the component, a request to display a user interface for the component including the second control set to configure the component, the second control set including at least one function unavailable in the first control set.
- In a third illustrative embodiment, a computer-implemented method includes receiving, by a personal device from an in-vehicle component including a first control set to configure the component, a user interface definition descriptive of a second control set to configure the component; and receiving, from the component, a request to display a user interface for the component including the second control set to configure the component, the second control set including at least one function unavailable in the first control set.
FIG. 1 illustrates an example diagram of a system that may be used to provide telematics services to a vehicle; -
FIG. 2A illustrates a diagram of a request by a user to configure an in-vehicle component via the user's mobile device; -
FIG. 2B illustrates an alternate diagram of a request by a user to configure an in-vehicle component via the user's mobile device; -
FIG. 3 illustrates an example vehicle including a plurality of in-vehicle components and a plurality of vehicle seats from which the in-vehicle components are accessible; -
FIG. 4A illustrates an example in-vehicle component receiving wireless signal intensity data from other in-vehicle components; -
FIG. 4B illustrates an example in-vehicle component providing the identified mobile device with a user interface definition; -
FIG. 5 illustrates an example process for identifying a mobile device associated with a user in the vehicle requesting an action; and -
FIG. 6 illustrates an example process for displaying a user interface on the identified mobile device. - As required, detailed embodiments of the present invention are disclosed herein; however, it is to be understood that the disclosed embodiments are merely exemplary of the invention that may be embodied in various and alternative forms. The figures are not necessarily to scale; some features may be exaggerated or minimized to show details of particular components. Therefore, specific structural and functional details disclosed herein are not to be interpreted as limiting, but merely as a representative basis for teaching one skilled in the art to variously employ the present invention.
- A system may be configured to allow vehicle occupants to seamlessly interact with their vehicle or with any other framework-enabled vehicle. The system may include a vehicle configured to detect a user approach to a proximity sensor of an in-vehicle component to be configured, and further to identify a personal device of the approaching user on which to display a user interface for the in-vehicle component. As used herein, a personal device may generally refer to a mobile device such as a smartphone, or a wearable device such as a smart watch or smart glasses. The personal device of the user may be configured to communicate with the vehicle to receive the user interface to display, provide the user interface to the user, and forward any commands entered via the user interface to the vehicle for configuration of the in-vehicle component. It should be noted that the user interaction with the in-vehicle component may be performed despite the personal device not having been paired with or being in communication with the vehicle head unit. Thus, the system may be configured to determine which occupant of the vehicle desires to interact with a specific function, i.e., which device should interact with the in-vehicle component to be configured, and further to communicate, to the identified device, which user interface information is to be displayed.
- In an example, a user may reach for a light switch within the vehicle cabin, e.g., located on the vehicle headliner near a lamp or on a seat armrest. When the light switch is touched by the user, it may provide some basic functionality to allow for the configuration of the light, such as turning the light off or on. Moreover, as the user approaches the light switch, his or her mobile device may be configured to automatically display a more in-depth interface for the light switch. The in-depth user interface may accordingly enable the user to setup additional lighting features, such as tone, mood, intensity, etc., which may be unavailable via the direct physical user interface of the light.
- In another example, a user may request a taxi, a shared car, or another type of public transportation vehicle. As the user enters the vehicle, the user may desire to perform customization to the local experience within the vehicle by adjusting lighting, climate, and sound attributes for the user's seat location. The user may also desire to be made aware of the specific features of the user's seat, such as whether the seat has cooling or massage features or some other feature available. If such features are available, the user may wish to be able to craft a customized experience without having to learn a vehicle-specific or application-specific user interface. Accordingly, when the user approaches one of the controls of the vehicle to configure, the vehicle may be configured to provide a user interface definition to the user's personal device including the specifics of the particular vehicle control.
- In yet another example, the user may perform the same customization on a first vehicle, and may desire that the user's vehicle settings would automatically be applied to a second vehicle supporting the customizations in which the user may travel. For example, the user's personal device may maintain lighting, climate, infotainment, and seat position settings from the first vehicle, and may attempt to set user defaults accordingly based on the available features of the second vehicle. Further aspects of the system are discussed in detail below.
FIG. 1 illustrates an example diagram of asystem 100 that may be used to provide telematics services to avehicle 102. Thevehicle 102 may be one of various types of passenger vehicles, such as a crossover utility vehicle (CUV), a sport utility vehicle (SUV), a truck, a recreational vehicle (RV), a boat, a plane or other mobile machine for transporting people or goods. Telematics services may include, as some non-limiting possibilities, navigation, turn-by-turn directions, vehicle health reports, local business search, accident reporting, and hands-free calling. In an example, thesystem 100 may include the SYNC system manufactured by The Ford Motor Company of Dearborn, Mich. It should be noted that the illustratedsystem 100 is merely an example, and more, fewer, and/or differently located elements may be used. - The
computing platform 104 may include one ormore processors 106 configured to perform instructions, commands and other routines in support of the processes described herein. For instance, thecomputing platform 104 may be configured to execute instructions ofvehicle applications 110 to provide features such as navigation, accident reporting, satellite radio decoding, and hands-free calling. Such instructions and other data may be maintained in a non-volatile manner using a variety of types of computer-readable storage medium 112. The computer-readable medium 112 (also referred to as a processor-readable medium or storage) includes any non-transitory medium (e.g., a tangible medium) that participates in providing instructions or other data that may be read by theprocessor 106 of thecomputing platform 104. Computer-executable instructions may be compiled or interpreted from computer programs created using a variety of programming languages and/or technologies, including, without limitation, and either alone or in combination, Java, C, C++, C#, Objective C, Fortran, Pascal, Java Script, Python, Perl, and PL/SQL. - The
computing platform 104 may be provided with various features allowing the vehicle occupants to interface with thecomputing platform 104. For example, thecomputing platform 104 may include anaudio input 114 configured to receive spoken commands from vehicle occupants through a connectedmicrophone 116, andauxiliary audio input 118 configured to receive audio signals from connected devices. Theauxiliary audio input 118 may be a physical connection, such as an electrical wire or a fiber optic cable, or a wireless input, such as a BLUETOOTH audio connection. In some examples, theaudio input 114 may be configured to provide audio processing capabilities, such as pre-amplification of low-level signals, and conversion of analog inputs into digital data for processing by theprocessor 106. - The
computing platform 104 may also provide one ormore audio outputs 120 to an input of anaudio module 122 having audio playback functionality. In other examples, thecomputing platform 104 may provide the audio output to an occupant through use of one or more dedicated speakers (not illustrated). Theaudio module 122 may include aninput selector 124 configured to provide audio content from aselected audio source 126 to anaudio amplifier 128 for playback throughvehicle speakers 130 or headphones (not illustrated). Theaudio sources 126 may include, as some examples, decoded amplitude modulated (AM) or frequency modulated (FM) radio signals, and audio signals from compact disc (CD) or digital versatile disk (DVD) audio playback. Theaudio sources 126 may also include audio received from thecomputing platform 104, such as audio content generated by thecomputing platform 104, audio content decoded from flash memory drives connected to a universal serial bus (USB)subsystem 132 of thecomputing platform 104, and audio content passed through thecomputing platform 104 from theauxiliary audio input 118. - The
computing platform 104 may utilize avoice interface 134 to provide a hands-free interface to thecomputing platform 104. Thevoice interface 134 may support speech recognition from audio received via themicrophone 116 according to grammar associated with available commands, and voice prompt generation for output via theaudio module 122. In some cases, the system may be configured to temporarily mute or otherwise override the audio source specified by theinput selector 124 when an audio prompt is ready for presentation by thecomputing platform 104 and anotheraudio source 126 is selected for playback. - The
computing platform 104 may also receive input from human-machine interface (HMI) controls 136 configured to provide for occupant interaction with the vehicle 102. For instance, the computing platform 104 may interface with one or more buttons or other HMI controls configured to invoke functions on the computing platform 104 (e.g., steering wheel audio buttons, a push-to-talk button, instrument panel controls, etc.). The computing platform 104 may also drive or otherwise communicate with one or more displays 138 configured to provide visual output to vehicle occupants by way of a video controller 140. In some cases, the display 138 may be a touch screen further configured to receive user touch input via the video controller 140, while in other cases the display 138 may be a display only, without touch input capabilities. - The
computing platform 104 may be further configured to communicate with other components of the vehicle 102 via one or more in-vehicle networks 142. The in-vehicle networks 142 may include one or more of a vehicle controller area network (CAN), an Ethernet network, and a media oriented systems transport (MOST) network, as some examples. The in-vehicle networks 142 may allow the computing platform 104 to communicate with other vehicle 102 systems, such as a vehicle modem 144 (which may not be present in some configurations), a global positioning system (GPS) module 146 configured to provide current vehicle 102 location and heading information, and various vehicle ECUs 148 configured to cooperate with the computing platform 104. As some non-limiting possibilities, the vehicle ECUs 148 may include a powertrain control module configured to provide control of engine operating components (e.g., idle control components, fuel delivery components, emissions control components, etc.) and monitoring of engine operating components (e.g., status of engine diagnostic codes); a body control module configured to manage various power control functions such as exterior lighting, interior lighting, keyless entry, remote start, and point of access status verification (e.g., closure status of the hood, doors, and/or trunk of the vehicle 102); a radio transceiver module configured to communicate with key fobs or other local vehicle 102 devices; and a climate control management module configured to provide control and monitoring of heating and cooling system components (e.g., compressor clutch and blower fan control, temperature sensor information, etc.). - As shown, the
audio module 122 and the HMI controls 136 may communicate with the computing platform 104 over a first in-vehicle network 142-A, and the vehicle modem 144, GPS module 146, and vehicle ECUs 148 may communicate with the computing platform 104 over a second in-vehicle network 142-B. In other examples, the computing platform 104 may be connected to more or fewer in-vehicle networks 142. Additionally or alternately, one or more HMI controls 136 or other components may be connected to the computing platform 104 via different in-vehicle networks 142 than shown, or directly without connection to an in-vehicle network 142. - The
computing platform 104 may also be configured to communicate with mobile devices 152 of the vehicle occupants. The mobile devices 152 may be any of various types of portable computing devices, such as cellular phones, tablet computers, smart watches, laptop computers, portable music players, or other devices capable of communication with the computing platform 104. In many examples, the computing platform 104 may include a wireless transceiver 150 (e.g., a BLUETOOTH module, a ZIGBEE transceiver, a Wi-Fi transceiver, an IrDA transceiver, an RFID transceiver, etc.) configured to communicate with a compatible wireless transceiver 154 of the mobile device 152. Additionally or alternately, the computing platform 104 may communicate with the mobile device 152 over a wired connection, such as via a USB connection between the mobile device 152 and the USB subsystem 132. - The
communications network 156 may provide communications services, such as packet-switched network services (e.g., Internet access, VoIP communication services), to devices connected to the communications network 156. An example of a communications network 156 may include a cellular telephone network. Mobile devices 152 may provide network connectivity to the communications network 156 via a device modem 158 of the mobile device 152. To facilitate the communications over the communications network 156, mobile devices 152 may be associated with unique device identifiers (e.g., mobile device numbers (MDNs), Internet protocol (IP) addresses, etc.) to identify the communications of the mobile devices 152 over the communications network 156. In some cases, occupants of the vehicle 102 or devices having permission to connect to the computing platform 104 may be identified by the computing platform 104 according to paired device data 160 maintained in the storage medium 112. The paired device data 160 may indicate, for example, the unique device identifiers of mobile devices 152 previously paired with the computing platform 104 of the vehicle 102, such that the computing platform 104 may automatically reconnect to the mobile devices 152 referenced in the paired device data 160 without user intervention. - When a
mobile device 152 that supports network connectivity is paired with the computing platform 104, the mobile device 152 may allow the computing platform 104 to use the network connectivity of the device modem 158 to communicate over the communications network 156 with the remote telematics services 162. In one example, the computing platform 104 may utilize a data-over-voice plan or data plan of the mobile device 152 to communicate information between the computing platform 104 and the communications network 156. Additionally or alternately, the computing platform 104 may utilize the vehicle modem 144 to communicate information between the computing platform 104 and the communications network 156, without use of the communications facilities of the mobile device 152. - Similar to the
computing platform 104, the mobile device 152 may include one or more processors 164 configured to execute instructions of mobile applications 170 loaded to a memory 166 of the mobile device 152 from the storage medium 168 of the mobile device 152. In some examples, the mobile applications 170 may be configured to communicate with the computing platform 104 via the wireless transceiver 154 and with the remote telematics services 162 or other network services via the device modem 158. The computing platform 104 may also include a device link interface 172 to facilitate the integration of functionality of the mobile applications 170 into the grammar of commands available via the voice interface 134 as well as into the display 138 of the computing platform 104. The device link interface 172 may also provide the mobile applications 170 with access to vehicle information available to the computing platform 104 via the in-vehicle networks 142. Some examples of device link interfaces 172 include the SYNC APPLINK component of the SYNC system provided by The Ford Motor Company of Dearborn, Mich., the CarPlay protocol provided by Apple Inc. of Cupertino, Calif., or the Android Auto protocol provided by Google, Inc. of Mountain View, Calif. The vehicle component interface application 174 may be one such application installed to the mobile device 152. - The vehicle
component interface application 174 of the mobile device 152 may be configured to facilitate access to one or more vehicle 102 features made available for device configuration by the vehicle 102. In some cases, the available vehicle 102 features may be accessible by a single vehicle component interface application 174, in which case the vehicle component interface application 174 may be configured to be customizable or to maintain configurations supportive of the specific vehicle 102 brand/model and option packages. In an example, the vehicle component interface application 174 may be configured to receive, from the vehicle 102, a definition of the features that are available to be controlled, display a user interface descriptive of the available features, and provide user input from the user interface to the vehicle 102 to allow the user to control the indicated features. As explained in detail below, an appropriate mobile device 152 to display the vehicle component interface application 174 may be identified, and a definition of the user interface to display may be provided to the identified vehicle component interface application 174 for display to the user. - Systems such as the
system 100 described above may require mobile device 152 pairing with the computing platform 104 and/or other setup operations. However, as explained in detail below, a system may be configured to allow vehicle occupants to seamlessly interact with user interface elements in their vehicle or with any other framework-enabled vehicle, without requiring the mobile device 152 or wearable device 202 to have been paired with or be in communication with the computing platform 104. -
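The pairing-based setup referenced above relies on the paired device data 160 lookup described earlier. A minimal sketch of that lookup follows; the MAC-style identifiers, names, and storage layout are illustrative assumptions rather than details from this description:

```python
# Minimal sketch of a paired device data 160 lookup (assumed layout).
# The identifiers and friendly names here are invented for illustration.

paired_device_data = {
    # unique device identifier -> name recorded when the device was paired
    "AA:BB:CC:DD:EE:01": "driver-phone",
    "AA:BB:CC:DD:EE:02": "passenger-tablet",
}

def should_auto_reconnect(device_id):
    # Reconnect without user intervention only to previously paired devices.
    return device_id in paired_device_data

print(should_auto_reconnect("AA:BB:CC:DD:EE:01"))  # True
print(should_auto_reconnect("11:22:33:44:55:66"))  # False
```

The pairing-free approach described below needs no such prior record of the device.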
FIG. 2A illustrates a diagram 200-A of a request by a user to configure an in-vehicle component 206 via the user's mobile device 152. As shown in FIG. 2A, a wearable device 202 associated with the user's mobile device 152 is being moved toward an in-vehicle component 206 having a proximity sensor 208. - The
wearable device 202 may include a smartwatch, smart glasses, fitness band, control ring, or other personal mobility or accessory device designed to be worn and to communicate with the user's mobile device 152. In an example, the wearable device 202 may communicate data with the mobile device 152 over a wireless connection 204. The wireless connection 204 may be a Bluetooth Low Energy (BLE) connection, but other types of local wireless connection, such as Wi-Fi or Zigbee, may be utilized as well. Using the connection 204, the mobile device 152 may provide access to one or more control or display functions of the mobile device 152 to the wearable device 202. For example, the mobile device 152 may enable the wearable device 202 to accept a phone call to the mobile device 152, enable a mobile application of the mobile device 152 to execute, receive and present notifications sent to the mobile device 152, and/or a combination thereof. - The in-vehicle component 206 may include various elements of the vehicle 102 having user-specific configurable settings. As shown in FIG. 3, an example vehicle 102 includes a plurality of in-vehicle components 206-A through 206-I (collectively 206) and a plurality of vehicle seats 302-A through 302-D (collectively 302) from which the in-vehicle components 206 are accessible. These in-vehicle components 206 may include, as some examples, overhead light in-vehicle components 206-A through 206-D, overhead compartment in-vehicle component 206-E, and speaker in-vehicle components 206-F through 206-I. Other examples of in-vehicle components 206 are possible as well, such as power seats or climate control vents. In many cases, the in-vehicle component 206 may expose controls such as buttons, sliders, and touchscreens that may be used by the user to configure the particular settings of the in-vehicle component 206. As some possibilities, the controls of the in-vehicle component 206 may allow the user to set a lighting level of a light control, set a temperature of a climate control, set a volume and source of audio for a speaker, and set a position of a seat control. It should be noted that the illustrated portion of the vehicle 102 in FIG. 3 is merely an example, and more, fewer, and/or differently located elements may be used. - Referring back to
FIG. 2A, each in-vehicle component 206 may be equipped with a proximity detection sensor 208 configured to facilitate detection of the wearable device 202. In an example, the proximity detection sensor 208 may include a wireless device, such as an Apple iBeacon device or a Google altBeacon device, configured to use low energy Bluetooth signal intensity as a locator to determine the proximity of the wearable device 202 or mobile device 152. Detection of proximity of the wearable device 202 or mobile device 152 by the proximity detection sensor 208 may cause the vehicle component interface application 174 of the mobile device 152 to be activated. In an example, a wearer of the wearable device 202 may reach his or her hand toward the in-vehicle component 206. As the wireless signal intensity 210 of the approaching wearable device 202 to the proximity detection sensor 208 crosses a minimum threshold intensity, the intensity shift of the wireless connection 204 strength may be detected by the proximity detection sensor 208, and a handshake may be established between the proximity detection sensor 208 and the approaching wearable device 202. This connection functionality of the mobile device 152 may accordingly be utilized as a trigger to invoke the vehicle component interface application 174 on the mobile device 152. - As another possibility, the
proximity detection sensor 208 may include a near field communication (NFC) tag that may be detected by the wearable device 202 or mobile device 152. Accordingly, as the wearable device 202 or mobile device 152 is moved into proximity to the in-vehicle component 206, the vehicle component interface application 174 on the mobile device 152 may be activated. However, the use of NFC tags may require a controlled, slow motion of the approaching device to close proximity to the proximity detection sensor 208. As a further possibility, the proximity detection sensor 208 may include a static image such as a quick response (QR) code or other information-encoded image that may be captured via a camera of the wearable device 202 or mobile device 152. In such a case, the vehicle component interface application 174 on the mobile device 152 may be activated responsive to the user pointing a camera of the wearable device 202 or mobile device 152 at the QR code or other image. The use of QR codes or other image representations may require the approaching device to keep its camera on, and further requires the user to orient the approaching device to acquire the image. - In general, each in-vehicle component 206 may include a set of controls configured to receive input from the user with respect to basic or core functions of the in-vehicle component 206 (e.g., turn light on/off, turn speaker on/off, etc.), and a proximity detection sensor 208 configured to identify proximity of a wearable device 202 or mobile device 152. It should be noted that the user interaction with the in-vehicle component 206 may be performed despite the mobile device 152 or wearable device 202 not having been paired with or being in communication with the computing platform 104. -
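The threshold-crossing trigger described above can be sketched as follows. The class and callback names and the -70 dBm threshold are assumptions for illustration; a real proximity detection sensor 208 would obtain RSSI samples from its BLE stack rather than from method calls:

```python
# Sketch of the signal-intensity threshold trigger described above.
# The -70 dBm threshold and all names are illustrative assumptions.

THRESHOLD_DBM = -70  # minimum signal intensity treated as "approaching"

class ProximityDetectionSensor:
    """Fires a handshake callback when a device's signal crosses the threshold."""

    def __init__(self, on_handshake):
        self.on_handshake = on_handshake
        self.connected = set()  # devices that have already handshaken

    def report_rssi(self, device_id, rssi_dbm):
        # A signal rising past the threshold indicates an approaching device.
        if rssi_dbm >= THRESHOLD_DBM and device_id not in self.connected:
            self.connected.add(device_id)
            self.on_handshake(device_id)
        elif rssi_dbm < THRESHOLD_DBM:
            self.connected.discard(device_id)

triggered = []
sensor = ProximityDetectionSensor(on_handshake=triggered.append)
sensor.report_rssi("watch-1", -90)  # far away: no trigger
sensor.report_rssi("watch-1", -65)  # crosses threshold: handshake established
sensor.report_rssi("watch-1", -60)  # still close: no duplicate trigger
print(triggered)  # ['watch-1']
```

The handshake callback is the point at which the vehicle component interface application 174 would be invoked on the associated mobile device 152.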
FIG. 2B illustrates an alternate diagram 200-B of a request by a user to configure an in-vehicle component 206 via the user's mobile device 152. As compared to the diagram 200-A, in the diagram 200-B the user is approaching and may touch the proximity detection sensor 208 of the in-vehicle component 206 with a "naked" hand, i.e., a hand that is not wearing a wearable device 202 or holding a mobile device 152. Thus, as no increase in wireless signal intensity 210 is available to be detected, the vehicle 102 may be unable to detect which device to utilize based on the wireless signal intensity 210. In such a situation, instructing all mobile devices 152 in the vehicle 102 to launch the vehicle component interface application 174, or sending to all of them a notification that the interface is available, would be an inelegant solution. - Instead, triangulation may be used to detect which
mobile device 152 is that of the passenger requesting interaction with the in-vehicle component 206. Referring again to FIG. 3, if a user located in seat 302-B reaches for the overhead light in-vehicle component 206-B, by triangulation the vehicle 102 may determine that a mobile device 152 located in seat 302-B is the device of the user in proximity to the in-vehicle component 206-B. As shown in FIG. 3, each of the in-vehicle components 206-A through 206-D is located closest to one of the seats 302-A through 302-D, respectively. Additionally, similar to as shown in FIGS. 2A and 2B, each of the in-vehicle components 206-A through 206-D includes a respective proximity sensor 208. - In the example in which a
proximity sensor 208 of the in-vehicle component 206 detects an approach or touch of the user's hand to the in-vehicle component 206, a preliminary action may be performed by the in-vehicle component 206, such as toggling the on-off state of a light of the in-vehicle component 206. Additionally or alternately, as shown in FIG. 4A, the in-vehicle component 206-B may broadcast or otherwise send a request for intensity information 210 to the other in-vehicle components 206 of the vehicle 102 (e.g., 206-A and 206-C as illustrated). This request may cause the other in-vehicle components 206 to return wireless signal intensity 210 data identified by their respective proximity sensors 208 for whatever devices they detect (e.g., intensity data 210-A identified by the proximity sensor 208-A, intensity data 210-C identified by the proximity sensor 208-C). - Continuing with the example of the user in seat 302-B approaching the in-vehicle component 206-B, the in-vehicle component 206-B may use the wireless signal intensity 210-B from its own proximity sensor 208-B as well as data from the other proximity sensors 208 (e.g., proximity sensors 208-A and 208-C) to determine a
mobile device 152 of the approaching user. Thus, the proximity detection sensors 208 may be configured to share device wireless signal intensity 210 data with one another to allow for triangulation and identification of which of the wearable devices 202 or mobile devices 152 are closest to a given in-vehicle component 206. - For instance, a
mobile device 152 may be detected as being the only mobile device 152 that has a highest measured wireless signal intensity 210 at the in-vehicle component 206-B as compared to that measured at the in-vehicle component 206-A and the in-vehicle component 206-C. That device may therefore be determined to be the mobile device 152 most likely located in seat 302-B. Notably, such an approach facilitates device identification despite the various devices potentially having different baseline signal intensities, since the triangulation relies on differences in relative wireless signal intensity 210 levels for each device as measured by the various proximity sensors 208 of the in-vehicle components 206, not on a determination of which device has a highest overall intensity level at one particular proximity sensor 208. - In some examples, the
proximity detection sensors 208 may additionally be utilized to enable in-cabin gesture interfaces for users wearing capable wearable devices 202 (e.g., BLE devices in the case of BLE proximity detection sensors 208), such as a smartwatch, fitness band, or control ring. Based on the aforementioned triangulation techniques, the network of proximity detection sensors 208 may be able to perform in-cabin location tracking of the wearable devices 202, in order to detect a gesture action performed by a user in the air, such as to open a window with a simple swipe of the hand, or to control the volume with an up-down hand motion. - The in-cabin tracking may also be extended to passengers not wearing
wearable devices 202. In an example, electric field distortions may be measurable with sufficiently sensitive proximity detection sensors 208 based on the field generated by the wireless components inside the vehicle 102. If different communication technologies are used, such as 60 GHz modulation, then in addition to increasing the bandwidth of data communicable between devices, in some cases the vehicle 102 may be able to perform in-cabin tracking to detect gestures and other motions at a high resolution. - The integration of
proximity detection sensors 208 with the configurable in-vehicle components 206, as well as the triangulation method or wireless signal intensity 210 threshold techniques, may accordingly allow the vehicle 102 to determine which mobile device 152 belongs to the user engaging the configurable in-vehicle component 206. - As shown in
FIG. 4B, once the mobile device 152 of the requesting user is identified, the vehicle 102 may be configured to provide the identified mobile device 152 with a user interface definition 402 regarding what functionality is available to perform on the in-vehicle component 206. In an example, to keep the in-vehicle component 206 functionality self-contained, the in-vehicle component 206 may be configured to communicate the user interface definition 402 to the mobile devices 152 or wearable devices 202 identified to display the user interface. - To provide the interface specified by the
user interface definition 402 on the located mobile device 152, in an example, the in-vehicle component 206 may be configured to request the mobile device 152 to launch a vehicle component interface application 174 previously installed to the mobile device 152. If the vehicle component interface application 174 is not already installed on the personal device, the in-vehicle component 206 may be configured to offer to side-load the vehicle component interface application 174 to the personal device, or to offer a link from which the vehicle component interface application 174 may be installed to the personal device (e.g., from the Google Application Store or the Apple AppStore, as some possibilities). - The
interface definition 402 may be encoded in a data interchange format, such as hypertext markup language (HTML), extensible markup language (XML), or JavaScript Object Notation (JSON). As one specific example, the user interface definition 402 may be encoded in a markup similar to that of the view and viewgroup user interface definitions utilized by the Google Android operating system. One advantage of using a data exchange format commonly used on the web is that user devices (e.g., mobile devices 152, wearable devices 202, etc.) may be able to render the user interface definition 402 to display the user interface using existing or downloadable functionality of the device (e.g., a web browser plugin). - As one possibility, responsive to a user entering the
vehicle 102, a personal device of the user may be configured to utilize the vehicle component interface application 174 to connect to the vehicle 102 to receive the user interface definition 402. In an example, the personal device may connect to the in-vehicle components 206 to receive the user interface definition 402 via available wireless protocols (e.g., BLE, etc.) provided by the proximity sensors 208. The received user interface definition 402 may be descriptive of the functions available in each in-vehicle component 206, the variables that may be controlled, and the current state of those variables. Thus, as the vehicle component interface application 174 may retrieve the user interface definition 402 descriptive of the user interface to present from the vehicle 102, a universal vehicle component interface application 174 may be utilized across various brands/makes/models of vehicle 102. - In an example, a
user interface definition 402 template for an in-vehicle light fixture having a single lamp may be described as an XML element with two attributes as follows: -
<Lighting intensity_max="100" color_tone_max="360"/> - For a more complex
user interface definition 402 template would accordingly be utilized, such as that used to control seat functions (e.g., forward, back, tilt, recline, lumbar, etc.). In such an example, the user interface definition 402 template may be defined to include attributes descriptive of the available functions, their names for presentation in the user interface, their allowed ranges of values (e.g., min, max, step size, default, etc.), and potentially layout information descriptive of grouping, ordering, or suggested controls (e.g., toggle control, slider control, knob control, etc.) indicating how to render the interface controls to change these attributes. - As the
mobile device 152 or wearable device 202 is requested by the in-vehicle component 206 to act as a user interface for the in-vehicle component 206, the mobile device 152 or wearable device 202 accordingly receives not only what functionalities are available from the module, but also what other modules offering similar functionalities are available in the vehicle, as well as their locations (e.g., from triangulation as discussed above). The vehicle component interface application 174 may be configured to aggregate the data and offer to the user combinations for controlling interior lighting or other vehicle functions by controlling the in-vehicle components 206 sharing that attribute. As a specific example, the user may utilize their mobile device 152 to invoke all interior lights, but at a low intensity level. It should be noted that in other examples, aggregation of the user interface definitions 402 may be performed by the in-vehicle components 206, such that the aggregated user interface definition 402 may be communicated to the personal device by the specific in-vehicle component 206 requesting for the user's device to display a user interface. -
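The aggregation described above can be sketched by parsing each component's user interface definition 402 and grouping components that share a function tag, so one control can drive all of them at once. The component identifiers and the Speaker element are invented for illustration; only the Lighting element follows the example template given earlier:

```python
import xml.etree.ElementTree as ET

# Hypothetical per-component definitions; the Lighting markup mirrors the
# <Lighting .../> example above, the rest is invented for illustration.
definitions = {
    "light-206-A": '<Lighting intensity_max="100" color_tone_max="360"/>',
    "light-206-B": '<Lighting intensity_max="100" color_tone_max="360"/>',
    "speaker-206-F": '<Speaker volume_max="30"/>',
}

def aggregate(defs):
    """Group components by function tag so similar functions can be combined."""
    groups = {}
    for component_id, xml_text in defs.items():
        element = ET.fromstring(xml_text)
        groups.setdefault(element.tag, {})[component_id] = dict(element.attrib)
    return groups

groups = aggregate(definitions)
print(sorted(groups["Lighting"]))  # ['light-206-A', 'light-206-B']
print(groups["Lighting"]["light-206-A"]["intensity_max"])  # 100
```

A "set all interior lights to low intensity" command would then iterate over the `Lighting` group and send the same intensity value to each member component.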
FIG. 5 illustrates an example process 500 for identifying a mobile device 152 associated with a user in the vehicle 102 requesting an action. The process 500 may be performed, for example, by one or more in-vehicle components 206 of the vehicle 102. - At
operation 502, the in-vehicle component 206 determines whether a personal device of a user (e.g., a mobile device 152, a wearable device 202, etc.) is approaching the in-vehicle component 206. In an example, the in-vehicle component 206 may be equipped with a proximity detection sensor 208 configured to facilitate detection of a wearable device 202, such that as the wireless signal intensity 210 of the approaching wearable device 202 to the proximity detection sensor 208 crosses a minimum threshold intensity, the intensity shift of the wireless connection 204 strength may be detected by the proximity detection sensor 208, and a handshake may be established between the proximity detection sensor 208 and the approaching wearable device 202. If a personal device is detected as approaching the in-vehicle component 206, control passes to operation 504. Otherwise, control passes to operation 510. - At
operation 504, the in-vehicle component 206 identifies the mobile device 152 of the user to use to display a user interface for the in-vehicle component 206. In an example, the approaching wearable device 202 may be paired with or otherwise associated with a mobile device 152 configured to execute the vehicle component interface application 174, and the wearable device 202 may be configured to provide to the in-vehicle component 206 (or the in-vehicle component 206 may request) the identity of the associated mobile device 152. In another example, if the approaching device is a mobile device 152 or other device configured to execute the vehicle component interface application 174, then the in-vehicle component 206 may identify the approaching mobile device 152 as the device to display the user interface. - At
operation 506, the in-vehicle component 206 sends an interaction request to the identified device. In an example, the in-vehicle component 206 may be configured to request the identified device to launch a vehicle component interface application 174, or to provide a link from which the vehicle component interface application 174 may be downloaded if the vehicle component interface application 174 is not yet installed. Once the application is invoked or installed, control passes to operation 508. - At
operation 508, the in-vehicle component 206 processes the interaction request using the identified device. An example interaction is described below with respect to the process 600. After operation 508, control passes to operation 502. - At
operation 510, the in-vehicle component 206 may determine whether the in-vehicle component 206 detects an approach but no personal device. In an example, electric field distortions of the in-vehicle component 206 may be measured by the proximity detection sensor 208 of the in-vehicle component 206 based on the field generated by the wireless components inside the vehicle 102. In another example, the in-vehicle component 206 may detect a user touch via a selection of a control of the built-in user interface of the in-vehicle component 206. If an approach is detected, control passes to operation 512. Otherwise, control passes to operation 502. - At
operation 512, the in-vehicle component 206 requests the other in-vehicle components 206 of the vehicle 102 to send wireless signal intensity 210 data identified by their respective proximity sensors 208 for whatever devices they detect. This may be done to allow the in-vehicle component 206 to perform triangulation to detect which mobile device 152 is that of the user requesting interaction with the in-vehicle component 206. - At
operation 514, the in-vehicle component 206 determines whether the wireless signal intensity 210 data has been received or whether a timeout occurred. For example, if at least a predetermined amount of time has passed since sending the request in operation 512, control passes to operation 516. Or, if the in-vehicle component 206 receives the requested wireless signal intensity 210 data, control passes to operation 516. Otherwise, control remains at operation 514. - At
operation 516, the in-vehicle component 206 calculates proximity to the detected devices. In an example, the in-vehicle component 206 may use the received wireless signal intensities 210 from its own proximity sensor 208 as well as the data from the other proximity sensors 208 to determine which devices have what wireless signal intensities 210 at the various in-vehicle components 206. - At
operation 518, the in-vehicle component 206 identifies a closest device. In an example, the in-vehicle component 206 may identify a mobile device 152 having a higher wireless signal intensity 210 as measured by the in-vehicle component 206 than as measured by the other in-vehicle components 206. This device may accordingly be identified as most likely being the mobile device 152 of the user approaching the in-vehicle component 206. After operation 518, control passes to operation 506. -
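Operations 512 through 518 can be sketched as follows. The device names and RSSI values are invented, and the margin-based comparison is one way to realize the relative-intensity reasoning described earlier (comparing each device's reading at the touched component against its own readings elsewhere), not a specific implementation from this description:

```python
# Sketch of closest-device identification by relative signal intensity.
# Sample RSSI values (in dBm) and identifiers are invented for illustration.

def identify_closest_device(touched_id, intensities):
    """intensities: {component_id: {device_id: rssi_dbm}} as gathered in
    operation 512; returns the device strongest at the touched component
    relative to its own readings at the other components."""
    best_device, best_margin = None, float("-inf")
    for device, local_rssi in intensities[touched_id].items():
        # Compare against the same device's strongest reading elsewhere, so
        # devices with different baseline intensities are compared fairly.
        others = [
            readings[device]
            for comp, readings in intensities.items()
            if comp != touched_id and device in readings
        ]
        margin = local_rssi - max(others) if others else local_rssi
        if margin > best_margin:
            best_device, best_margin = device, margin
    return best_device

intensities = {
    "206-A": {"phone-1": -80, "phone-2": -55},
    "206-B": {"phone-1": -50, "phone-2": -70},
    "206-C": {"phone-1": -75, "phone-2": -85},
}
print(identify_closest_device("206-B", intensities))  # phone-1
```

Here phone-1 is strongest at 206-B relative to its other readings, so it is taken as the device of the user reaching for that component, even though phone-2 has a stronger absolute reading elsewhere.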
FIG. 6 illustrates an example process 600 for displaying a user interface on the identified mobile device 152. The process 600 may be performed, for example, by a personal device (e.g., a mobile device 152, a wearable device 202, etc.) in communication with one or more in-vehicle components 206 of the vehicle 102. - At
operation 602, the personal device enters the vehicle 102. In an example, the personal device may be carried by a user entering the vehicle 102. - At
operation 604, the personal device connects to the in-vehicle components 206. In an example, responsive to a user entering the vehicle 102, a personal device of the user may be configured to utilize the vehicle component interface application 174 to connect to the in-vehicle components 206 via the available wireless protocols (e.g., BLE, etc.). - At
operation 606, the personal device receives user interface definition 402 template information from the in-vehicle components 206. In an example, the personal device may receive tagged user interface definition 402 information descriptive of the functions available in each in-vehicle component 206, the variables that may be controlled, and the current state of those variables. - At
operation 608, the personal device determines whether to act as a user interface for the in-vehicle components 206. In an example, such as in the process 500 discussed above, the personal device may be requested by the in-vehicle component 206 to act as a user interface for the in-vehicle component 206. - At
operation 610, the personal device aggregates data from the in-vehicle components 206 offering similar functionality. In an example, the user may utilize their mobile device 152 to invoke all interior lights, but at a low intensity level. It should be noted that in other examples, aggregation of the user interface definitions 402 may be performed by the in-vehicle components 206, such that the aggregated user interface definition 402 may be communicated to the personal device by the specific in-vehicle component 206 requesting for the user's device to display a user interface. - At
operation 612, the personal device renders a user interface. The personal device may accordingly display a user interface defined according to the received and aggregated tagged user interface definition 402. - At operation 614, the personal device determines whether the user requests to quit the user interface. In an example, the personal device may receive user input requesting for the user interface to be dismissed. If such input is received, control passes to
operation 616. Otherwise, control passes to operation 618. - At
operation 616, the personal device closes the user interface. After operation 616, control passes to operation 608. - At
operation 618, the personal device determines whether a user interaction with the user interface is received. In an example, the personal device may receive user input requesting for a change to be made to the settings for one or more of the in-vehicle components 206. - At
operation 620, the personal device sends an action request to the in-vehicle component(s) 206. In an example, the user may utilize the personal device to invoke the interior lighting of all of the interior lights, but at a low intensity level. After operation 620, control passes to operation 614. - While exemplary embodiments are described above, it is not intended that these embodiments describe all possible forms of the invention. Rather, the words used in the specification are words of description rather than limitation, and it is understood that various changes may be made without departing from the spirit and scope of the invention. Additionally, the features of various implementing embodiments may be combined to form further embodiments of the invention.
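The aggregation and action-request steps of the process above (operations 606, 610, and 620) can be sketched as follows. This is an illustrative model only: the field names, component identifiers, and message shapes are assumptions for the sketch, not structures taken from the disclosure.

```python
from collections import defaultdict

# Hypothetical tagged user interface definitions, as received at
# operation 606: each in-vehicle component describes a function it
# offers, its controllable variables, and their current state.
# All names here are illustrative, not from the patent.
LIGHT_FRONT = {
    "component": "light-front",
    "function": "interior-light",
    "variables": {"intensity": {"range": [0, 100], "current": 0}},
}
LIGHT_REAR = {
    "component": "light-rear",
    "function": "interior-light",
    "variables": {"intensity": {"range": [0, 100], "current": 0}},
}

def aggregate(definitions):
    """Operation 610: group definitions by function so a single
    control can drive every component offering similar functionality."""
    grouped = defaultdict(list)
    for definition in definitions:
        grouped[definition["function"]].append(definition)
    return grouped

def action_requests(grouped, function, variable, value):
    """Operation 620: fan one user interaction out into an action
    request per component offering the selected function."""
    return [
        {"component": d["component"], "set": {variable: value}}
        for d in grouped.get(function, [])
    ]

grouped = aggregate([LIGHT_FRONT, LIGHT_REAR])
# One gesture on the rendered interface addresses all interior
# lights at once, but at a low intensity level.
requests = action_requests(grouped, "interior-light", "intensity", 20)
```

Under this sketch, each entry in `requests` would be sent to its component over the wireless connection established at operation 604; the components' replies would update the `current` values used to redraw the interface.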
Claims (18)
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/635,321 US20160257198A1 (en) | 2015-03-02 | 2015-03-02 | In-vehicle component user interface |
DE102016103612.9A DE102016103612A1 (en) | 2015-03-02 | 2016-03-01 | User interface of an in-vehicle component |
CN201610119017.8A CN105938338A (en) | 2015-03-02 | 2016-03-02 | In-vehicle component user interface |
US16/988,384 US11472293B2 (en) | 2015-03-02 | 2020-08-07 | In-vehicle component user interface |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/635,321 US20160257198A1 (en) | 2015-03-02 | 2015-03-02 | In-vehicle component user interface |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/988,384 Division US11472293B2 (en) | 2015-03-02 | 2020-08-07 | In-vehicle component user interface |
Publications (1)
Publication Number | Publication Date |
---|---|
US20160257198A1 (en) | 2016-09-08 |
Family
ID=56738868
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/635,321 Abandoned US20160257198A1 (en) | 2015-03-02 | 2015-03-02 | In-vehicle component user interface |
US16/988,384 Active 2035-04-07 US11472293B2 (en) | 2015-03-02 | 2020-08-07 | In-vehicle component user interface |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/988,384 Active 2035-04-07 US11472293B2 (en) | 2015-03-02 | 2020-08-07 | In-vehicle component user interface |
Country Status (3)
Country | Link |
---|---|
US (2) | US20160257198A1 (en) |
CN (1) | CN105938338A (en) |
DE (1) | DE102016103612A1 (en) |
Cited By (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20170225690A1 (en) * | 2016-02-09 | 2017-08-10 | General Motors Llc | Wearable device controlled vehicle systems |
US20180002895A1 (en) * | 2015-03-20 | 2018-01-04 | Sumitomo(S.H.I.) Construction Machinery Co., Ltd. | Shovel |
US10071685B2 (en) * | 2016-12-21 | 2018-09-11 | Hyundai Motor Company | Audio video navigation (AVN) head unit, vehicle having the same, and method for controlling the vehicle having the AVN head unit |
US10137777B2 (en) | 2015-11-03 | 2018-11-27 | GM Global Technology Operations LLC | Systems and methods for vehicle system control based on physiological traits |
US10310553B2 (en) * | 2016-01-04 | 2019-06-04 | Lg Electronics Inc. | Display apparatus for vehicle and vehicle |
US10318442B2 (en) * | 2016-05-20 | 2019-06-11 | Faraday & Future Inc. | Pairing of input device and display in vehicle infotainment systems |
US20190241121A1 (en) * | 2018-02-06 | 2019-08-08 | Ford Global Technologies, Llc | Vehicle lamp assembly |
US10619392B2 (en) | 2016-04-13 | 2020-04-14 | 1925Workbench Ltd. | Rail-mounted doors |
US10650621B1 (en) | 2016-09-13 | 2020-05-12 | Iocurrents, Inc. | Interfacing with a vehicular controller area network |
US11076261B1 (en) * | 2016-09-16 | 2021-07-27 | Apple Inc. | Location systems for electronic device communications |
US20210357086A1 (en) * | 2020-05-18 | 2021-11-18 | Toyota Jidosha Kabushiki Kaisha | Agent control device, agent control method, and recording medium |
CN114523919A (en) * | 2022-02-14 | 2022-05-24 | 海信集团控股股份有限公司 | Vehicle and control method thereof |
US20220177067A1 (en) * | 2019-03-27 | 2022-06-09 | Tvs Motor Company Limited | Smart connect instrument cluster |
US11424921B2 (en) | 2015-11-09 | 2022-08-23 | Dealerware, Llc | Vehicle access systems and methods |
US12132986B2 (en) * | 2021-12-12 | 2024-10-29 | Avanti R&D, Inc. | Computer vision system used in vehicles |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2018084860A1 (en) * | 2016-11-04 | 2018-05-11 | Google Llc | Adaptive user interface with reduced payload |
CN106804028A (en) * | 2016-12-29 | 2017-06-06 | 上海蔚来汽车有限公司 | In-car positioner, method and vehicle-mounted device control system based on ibeacon |
US20210382561A1 (en) * | 2020-06-05 | 2021-12-09 | Koninklijke Fabriek Inventum B.V. | Gesture control for overhead bins |
Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20080261643A1 (en) * | 2006-10-05 | 2008-10-23 | Lee Bauer | Extensible infotainment/telematics system |
US20120006581A1 (en) * | 2009-01-07 | 2012-01-12 | Biralee Investments Pty Limited | Cable organiser |
US20120065815A1 (en) * | 2010-09-09 | 2012-03-15 | Wolfgang Hess | User interface for a vehicle system |
US8421589B2 (en) * | 2009-01-27 | 2013-04-16 | Delphi Technologies, Inc. | Dual purpose wireless device, wherein vehicle controls depend on device location |
US8447598B2 (en) * | 2007-12-05 | 2013-05-21 | Johnson Controls Technology Company | Vehicle user interface systems and methods |
US20140043152A1 (en) * | 2012-08-13 | 2014-02-13 | Ford Global Technologies, Llc | System and Method for Controlling Adaptive Cruise Control Based on Driver Status |
US20140142783A1 (en) * | 2012-11-19 | 2014-05-22 | GM Global Technology Operations LLC | Methods of controlling vehicle interfaces using device motion and near field communications |
US20140164559A1 (en) * | 2012-12-10 | 2014-06-12 | Ford Global Technologies, Llc | Offline configuration of vehicle infotainment system |
US20140375477A1 (en) * | 2013-06-20 | 2014-12-25 | Motorola Mobility Llc | Vehicle detection |
US20150352953A1 (en) * | 2014-06-04 | 2015-12-10 | Magna Electronics Inc. | Vehicle control system with mobile device interface |
Family Cites Families (240)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4721954A (en) | 1985-12-18 | 1988-01-26 | Marlee Electronics Corporation | Keypad security system |
US5697844A (en) | 1986-03-10 | 1997-12-16 | Response Reward Systems, L.C. | System and method for playing games and rewarding successful players |
US4792783A (en) | 1986-05-07 | 1988-12-20 | Electro-Mechanical Products | Vehicular function controller having alterable function designators |
JPH02127117A (en) | 1988-11-04 | 1990-05-15 | Diesel Kiki Co Ltd | Air conditioner control device for vehicle |
JPH0328034A (en) | 1989-06-26 | 1991-02-06 | Nissan Motor Co Ltd | Car-interior illumination device |
JP2935871B2 (en) | 1990-04-18 | 1999-08-16 | タカタ株式会社 | Lighting buckle for seat belt device |
US5255442A (en) | 1991-12-20 | 1993-10-26 | Donnelly Corporation | Vehicle compass with electronic sensor |
US5543591A (en) | 1992-06-08 | 1996-08-06 | Synaptics, Incorporated | Object position detector with edge motion feature and gesture recognition |
IT1272833B (en) | 1993-10-21 | 1997-06-30 | Audiovox Corp | Alarm system for vehicles |
WO1996015650A1 (en) | 1994-11-11 | 1996-05-23 | Philips Electronics N.V. | System to optimize artificial lighting levels with increasing daylight level |
US5650929A (en) | 1995-04-28 | 1997-07-22 | Prince Corporation | Modular electronic display and accessory mounting system for a vehicle |
US7973773B2 (en) | 1995-06-29 | 2011-07-05 | Pryor Timothy R | Multipoint, virtual control, and force based touch screen applications |
US5796179A (en) | 1995-09-30 | 1998-08-18 | Suzuki Motor Corporation | Vehicle anti-theft device with low voltage compensation and a rolling code |
US6028537A (en) | 1996-06-14 | 2000-02-22 | Prince Corporation | Vehicle communication and remote control system |
US5757268A (en) | 1996-09-26 | 1998-05-26 | United Technologies Automotive, Inc. | Prioritization of vehicle display features |
US5848634A (en) | 1996-12-27 | 1998-12-15 | Latron Electronics Co. Inc. | Motorized window shade system |
US7015896B2 (en) | 1998-01-23 | 2006-03-21 | Digit Wireless, Llc | Keyboards with both individual and combination key output |
US6377860B1 (en) | 1998-07-31 | 2002-04-23 | Sun Microsystems, Inc. | Networked vehicle implementing plug and play with javabeans |
US6397249B1 (en) | 1998-11-24 | 2002-05-28 | International Business Machines Corporation | Data processing system and method for determining a physical location of a client computer system |
DE10006943A1 (en) | 2000-02-17 | 2001-08-23 | Volkswagen Ag | Interior lighting system of a motor vehicle and method for controlling such |
US6536928B1 (en) | 2000-03-03 | 2003-03-25 | Lear Corporation | Multi-colored vehicle interior lighting |
DE10021068A1 (en) | 2000-04-28 | 2001-10-31 | Bosch Gmbh Robert | User-specific device tuning method for e.g. vehicle involves tuning device based on stored user-specific data consisting of full or partial device independent reference values |
US20020092019A1 (en) | 2000-09-08 | 2002-07-11 | Dwight Marcus | Method and apparatus for creation, distribution, assembly and verification of media |
US6449541B1 (en) | 2000-10-17 | 2002-09-10 | Microsoft Corporation | Application-to-component communications helper in a vehicle computer system |
US6615123B2 (en) | 2000-12-01 | 2003-09-02 | Hewlett-Packard Development Company, L.P. | Personality module for configuring a vehicle |
US20020087423A1 (en) | 2001-01-02 | 2002-07-04 | Carbrey Palango Joan L. | System builder for building electronics systems |
US6473038B2 (en) | 2001-01-05 | 2002-10-29 | Motorola, Inc. | Method and apparatus for location estimation |
US6663010B2 (en) | 2001-01-22 | 2003-12-16 | Meritor Heavy Vehicle Technology, Llc | Individualized vehicle settings |
KR20040008148A (en) | 2001-04-03 | 2004-01-28 | 에이티앤드티 와이어리스 서비시즈 인코포레이티드 | Methods and apparatus for mobile station location estimation |
US7114178B2 (en) | 2001-05-22 | 2006-09-26 | Ericsson Inc. | Security system |
US20020197976A1 (en) | 2001-06-22 | 2002-12-26 | Jonathan Liu | Vehicle customized feature activation system |
US6775603B2 (en) | 2001-10-18 | 2004-08-10 | Ford Motor Company | Method and system for maintaining personalization of user adjustable features |
US7342325B2 (en) | 2001-11-05 | 2008-03-11 | Michael Rhodes | Universal fleet electrical system |
US8611919B2 (en) | 2002-05-23 | 2013-12-17 | Wounder Gmbh., Llc | System, method, and computer program product for providing location based services and mobile e-commerce |
US9205744B2 (en) | 2002-06-21 | 2015-12-08 | Intel Corporation | PC-based automobile owner's manual, diagnostics, and auto care |
US8074201B2 (en) | 2002-07-10 | 2011-12-06 | National Instruments Corporation | Deployment and execution of a program on an embedded device |
US7034655B2 (en) | 2002-08-06 | 2006-04-25 | Tri/Mark Corporation | Keypad module and method for electronic access security and keyless entry of a vehicle |
US20040034455A1 (en) | 2002-08-15 | 2004-02-19 | Craig Simonds | Vehicle system and method of communicating between host platform and human machine interface |
US7275983B2 (en) | 2002-09-27 | 2007-10-02 | Denso Corporation | System for limiting an increase in the inside air temperature of passenger compartment of vehicle |
KR100575906B1 (en) | 2002-10-25 | 2006-05-02 | 미츠비시 후소 트럭 앤드 버스 코포레이션 | Hand pattern switching apparatus |
US20050009469A1 (en) | 2002-12-20 | 2005-01-13 | Sakari Kotola | Client software download in bluetooth device bonding |
US7337436B2 (en) | 2003-02-07 | 2008-02-26 | Sun Microsystems, Inc. | System and method for cross platform and configuration build system |
US20040215532A1 (en) | 2003-02-25 | 2004-10-28 | Hans Boman | Method and system for monitoring relative movement of maritime containers and other cargo |
US7647562B2 (en) | 2003-04-03 | 2010-01-12 | National Instruments Corporation | Deployment and execution of a graphical program on an embedded device from a PDA |
US20050017842A1 (en) | 2003-07-25 | 2005-01-27 | Bryan Dematteo | Adjustment apparatus for adjusting customizable vehicle components |
US20050044906A1 (en) | 2003-07-25 | 2005-03-03 | Spielman Timothy G. | Method and system for setting entry codes via a communications network for access to moveable enclosures |
US7015791B2 (en) | 2003-08-19 | 2006-03-21 | General Motors Corporation | Keyless entry module and method |
US7751829B2 (en) | 2003-09-22 | 2010-07-06 | Fujitsu Limited | Method and apparatus for location determination using mini-beacons |
US7230545B2 (en) | 2003-11-07 | 2007-06-12 | Nattel Group, Inc. | Automobile communication and registry system |
EP1561640A3 (en) | 2004-02-06 | 2010-06-30 | Goodrich Lighting Systems GmbH | Colored light for the passengers of a public transport means, in particular for an aeroplane cabin |
US7170400B2 (en) | 2004-05-20 | 2007-01-30 | Lear Corporation | System for customizing settings and sounds for vehicle |
US7031809B2 (en) | 2004-05-21 | 2006-04-18 | Jens Erik Sorensen | Remote control of automobile component arrangements |
US20060155429A1 (en) | 2004-06-18 | 2006-07-13 | Applied Digital, Inc. | Vehicle entertainment and accessory control system |
US7050795B2 (en) | 2004-06-24 | 2006-05-23 | Denso International America, Inc. | System for programming customizable vehicle features |
US7009504B1 (en) | 2004-08-10 | 2006-03-07 | Lear Corporation | Reconfigurable vehicle accessory control panel |
US20060075934A1 (en) | 2004-09-28 | 2006-04-13 | Pranil Ram | Passenger keyboard and display apparatus, and a system and method for delivering content to users of such apparatus |
US7319924B2 (en) | 2004-10-22 | 2008-01-15 | General Motors Corporation | Method and system for managing personalized settings in a mobile vehicle |
US7518381B2 (en) | 2004-12-17 | 2009-04-14 | Stoneridge Control Devices, Inc. | Touch sensor system and method |
US20060155547A1 (en) | 2005-01-07 | 2006-07-13 | Browne Alan L | Voice activated lighting of control interfaces |
US20070262140A1 (en) | 2005-02-03 | 2007-11-15 | Long Kenneth W Sr | Apparatus, System, and Method for Delivering Products or Services |
US7778651B2 (en) | 2005-02-16 | 2010-08-17 | Harris Corporation | Wireless network range estimation and associated methods |
US7502620B2 (en) | 2005-03-04 | 2009-03-10 | Skyhook Wireless, Inc. | Encoding and compression of a location beacon database |
US20060205456A1 (en) | 2005-03-14 | 2006-09-14 | Bentz William G | Video games and add-on features |
US7920048B2 (en) | 2005-05-09 | 2011-04-05 | Safetystream Mobile Limited | Method for using a table of data to control access and a locking mechanism using same |
US20060258377A1 (en) | 2005-05-11 | 2006-11-16 | General Motors Corporation | Method and system for customizing vehicle services |
US7987030B2 (en) | 2005-05-25 | 2011-07-26 | GM Global Technology Operations LLC | Vehicle illumination system and method |
US20070021885A1 (en) | 2005-07-25 | 2007-01-25 | Honeywell International Inc. | System and method for personalizing motor vehicle ride or handling characteristics |
DE202005015165U1 (en) | 2005-09-27 | 2005-12-29 | Huf Hülsbeck & Fürst Gmbh & Co. Kg | Remote control locking device for vehicle has portable electronic key having mode button and function buttons on touch screen |
US20070140187A1 (en) | 2005-12-15 | 2007-06-21 | Rokusek Daniel S | System and method for handling simultaneous interaction of multiple wireless devices in a vehicle |
US7706740B2 (en) | 2006-01-06 | 2010-04-27 | Qualcomm Incorporated | Apparatus and methods of selective collection and selective presentation of content |
US7581244B2 (en) | 2006-01-25 | 2009-08-25 | Seiko Epson Corporation | IMX session control and authentication |
US20070198472A1 (en) | 2006-02-17 | 2007-08-23 | Ford Motor Company | Multimedia system for a vehicle |
JP4812089B2 (en) | 2006-02-24 | 2011-11-09 | キヤノン株式会社 | Printing apparatus and connection method thereof |
US8615273B2 (en) * | 2006-10-05 | 2013-12-24 | Harman Becker Automotive Systems Gmbh | Extensible infotainment/telematics system with process control shifting |
US7810969B2 (en) | 2006-11-02 | 2010-10-12 | Ford Global Technologies, Llc | Ambient lighting for vehicle interior floor console |
WO2008058194A2 (en) | 2006-11-07 | 2008-05-15 | Collins & Aikman Products Co. | Luminous interior trim material |
US7800483B2 (en) | 2006-11-10 | 2010-09-21 | Federal-Mogul World Wide, Inc. | Transitional lighting system for vehicle interior |
US8073589B2 (en) | 2006-12-01 | 2011-12-06 | Ford Global Technologies, Llc | User interface system for a vehicle |
KR20080052997A (en) | 2006-12-08 | 2008-06-12 | 현대자동차주식회사 | Interface system between human and car |
US8006002B2 (en) | 2006-12-12 | 2011-08-23 | Apple Inc. | Methods and systems for automatic configuration of peripherals |
US7595718B2 (en) | 2007-01-30 | 2009-09-29 | Tse Hsing Chen | Antitheft system with clip-on wireless keypad |
US20080288406A1 (en) | 2007-05-16 | 2008-11-20 | The Marketing Store Worldwide Llc | System and method for telematic marketing |
DE102007052008A1 (en) | 2007-10-26 | 2009-04-30 | Andreas Steinhauser | Single- or multitouch-capable touchscreen or touchpad consisting of an array of pressure sensors and production of such sensors |
US8239087B2 (en) | 2008-02-14 | 2012-08-07 | Steering Solutions Ip Holding Corporation | Method of operating a vehicle accessory |
US8065169B1 (en) | 2008-02-15 | 2011-11-22 | Allstate Insurance Company | Real-time insurance estimate based on non-personal identifying information |
EP2105759A1 (en) | 2008-03-28 | 2009-09-30 | Identec Solutions AG | Method and systems for carrying out a two way ranging procedure |
US20090249081A1 (en) | 2008-03-31 | 2009-10-01 | Kabushiki Kaisha Toshiba-1 Shibaura 1-Chomominatoku | Storage device encryption and method |
FR2933212B1 (en) | 2008-06-27 | 2013-07-05 | Movea Sa | MOVING CAPTURE POINTER RESOLVED BY DATA FUSION |
CN101639740A (en) | 2008-08-01 | 2010-02-03 | 鸿富锦精密工业(深圳)有限公司 | Input method and password protection method based on touch screen |
WO2010028141A2 (en) * | 2008-09-03 | 2010-03-11 | Flextronics Ap, Llc | Systems and methods for connecting and operating portable gps enabled devices in automobiles |
US8465161B2 (en) | 2008-10-14 | 2013-06-18 | Magna Mirrors Of America, Inc. | Interior rearview mirror assembly with button module |
US20110187496A1 (en) | 2008-10-30 | 2011-08-04 | Denison William D | Electronic Access Control Device and Management System |
US8941466B2 (en) | 2009-01-05 | 2015-01-27 | Polytechnic Institute Of New York University | User authentication for devices with touch sensitive elements, such as touch sensitive display screens |
US20100171696A1 (en) | 2009-01-06 | 2010-07-08 | Chi Kong Wu | Motion actuation system and related motion database |
US8854180B2 (en) | 2009-01-10 | 2014-10-07 | Pro Tech Systems Of Maryland, Inc. | Access control system |
US20100197359A1 (en) | 2009-01-30 | 2010-08-05 | Harris Technology, Llc | Automatic Detection of Wireless Phone |
DE102009008041A1 (en) | 2009-02-09 | 2010-08-12 | Volkswagen Ag | Method for operating a motor vehicle with a touchscreen |
JP2010199716A (en) | 2009-02-23 | 2010-09-09 | Fujitsu Ten Ltd | Onboard device and communication control method |
US8825222B2 (en) | 2009-02-27 | 2014-09-02 | Toyota Motor Engineering & Manufacturing North America, Inc. | Remote management of vehicle settings |
US20100235045A1 (en) | 2009-03-10 | 2010-09-16 | Panasonic Automotive Systems Company Of America, Division Of Panasonic Corporation Of North America | Virtual feature management for vehicle information and entertainment systems |
US20100233957A1 (en) | 2009-03-11 | 2010-09-16 | Delphi Technologies, Inc. | Vehicle Personalization Using A Near Field Communications Transponder |
JP4767332B2 (en) | 2009-03-13 | 2011-09-07 | シャープ株式会社 | Information processing system and image forming system |
US20100280711A1 (en) | 2009-04-29 | 2010-11-04 | Gm Global Technology Operations, Inc. | System and method of using a portable device to recognize a frequent driver |
DE102009024656A1 (en) | 2009-06-12 | 2011-03-24 | Volkswagen Ag | A method of controlling a graphical user interface and graphical user interface operator |
WO2011011544A1 (en) | 2009-07-21 | 2011-01-27 | Scott Ferrill Tibbitts | Method and system for controlling a mobile communication device in a moving vehicle |
US8482430B2 (en) | 2009-10-13 | 2013-07-09 | GM Global Technology Operations LLC | Method and apparatus for communicatively changing interior illumination color in a vehicle |
US8983534B2 (en) | 2009-10-14 | 2015-03-17 | Dipam Patel | Mobile telephone for remote operation |
US8344850B2 (en) | 2009-10-30 | 2013-01-01 | Lear Corporation | System and method for authorizing a remote device |
US8315617B2 (en) | 2009-10-31 | 2012-11-20 | Btpatent Llc | Controlling mobile device functions |
US8514069B2 (en) | 2009-11-12 | 2013-08-20 | MTN Satellite Communications | Tracking passengers on cruise ships |
US8706349B2 (en) | 2009-12-07 | 2014-04-22 | At&T Mobility Ii Llc | Devices, systems and methods for controlling permitted settings on a vehicle |
US8633916B2 (en) | 2009-12-10 | 2014-01-21 | Apple, Inc. | Touch pad with force sensors and actuator feedback |
US8284020B2 (en) | 2009-12-22 | 2012-10-09 | Lear Corporation | Passive entry system and method for a vehicle |
JP5005758B2 (en) | 2009-12-25 | 2012-08-22 | 株式会社ホンダアクセス | In-vehicle device operating device in automobile |
US8655965B2 (en) | 2010-03-05 | 2014-02-18 | Qualcomm Incorporated | Automated messaging response in wireless communication systems |
US9417691B2 (en) | 2010-03-26 | 2016-08-16 | Nokia Technologies Oy | Method and apparatus for ad-hoc peer-to-peer augmented reality environment |
KR102068428B1 (en) | 2010-04-23 | 2020-02-11 | 임머숀 코퍼레이션 | Systems and methods for providing haptic effects |
US8336664B2 (en) | 2010-07-09 | 2012-12-25 | Telecommunication Systems, Inc. | Telematics basic mobile device safety interlock |
US8401589B2 (en) | 2010-08-10 | 2013-03-19 | At&T Intellectual Property I, L.P. | Controlled text-based communication on mobile devices |
US20120136802A1 (en) | 2010-11-30 | 2012-05-31 | Zonar Systems, Inc. | System and method for vehicle maintenance including remote diagnosis and reverse auction for identified repairs |
KR101219933B1 (en) | 2010-09-13 | 2013-01-08 | 현대자동차주식회사 | System for controlling device in vehicle using augmented reality and thereof method |
FR2965434B1 (en) | 2010-09-28 | 2015-12-11 | Valeo Securite Habitacle | METHOD OF PAIRING A MOBILE TELEPHONE WITH A MOTOR VEHICLE AND LOCKING / UNLOCKING ASSEMBLY |
CN102445954B (en) | 2010-09-30 | 2014-03-19 | 福建捷联电子有限公司 | Vehicle-mounted computer adopting all-in-one computer |
EP2442600B1 (en) | 2010-10-14 | 2013-03-06 | Research In Motion Limited | Near-field communication (NFC) system providing nfc tag geographic position authentication and related methods |
WO2012054031A1 (en) | 2010-10-20 | 2012-04-26 | Empire Technology Development Llc | Encoded optical lock |
CA2815883C (en) | 2010-10-28 | 2018-04-10 | Gestion Andre & Paquerette Ltee | Device and method for managing an electronic control unit of a vehicle |
US8527143B2 (en) | 2010-10-29 | 2013-09-03 | Nissan North America, Inc. | Vehicle user interface system and method having location specific feature availability |
US20120214463A1 (en) | 2010-11-05 | 2012-08-23 | Smith Michael J | Detecting use of a mobile device by a driver of a vehicle, such as an automobile |
JP5685073B2 (en) | 2010-12-17 | 2015-03-18 | 株式会社東海理化電機製作所 | Electronic key system |
US8543833B2 (en) | 2010-12-29 | 2013-09-24 | Microsoft Corporation | User identification with biokinematic input |
BR112013014287B1 (en) | 2010-12-30 | 2020-12-29 | Interdigital Ce Patent Holdings | METHOD AND APPARATUS FOR RECOGNITION OF GESTURE |
US8863256B1 (en) | 2011-01-14 | 2014-10-14 | Cisco Technology, Inc. | System and method for enabling secure transactions using flexible identity management in a vehicular environment |
JP5747089B2 (en) | 2011-01-21 | 2015-07-08 | ジョンソン コントロールズ テクノロジー カンパニーJohnson Controls Technology Company | In-vehicle electronic device blocker |
US9350809B2 (en) | 2011-01-31 | 2016-05-24 | Nokia Technologies Oy | Method and apparatus for automatically determining communities of interest, for use over an ad-hoc mesh network, based on context information |
US8947203B2 (en) | 2011-03-07 | 2015-02-03 | John Clinton Kolar | Aftermarket sound activated wireless vehicle door unlocker |
US8476832B2 (en) | 2011-03-15 | 2013-07-02 | Ford Global Technologies, Llc | Vehicle interior lighting system with welcome and farewell stages |
US8880100B2 (en) | 2011-03-23 | 2014-11-04 | Radium, Inc. | Proximity based social networking |
US20120254809A1 (en) | 2011-03-31 | 2012-10-04 | Nokia Corporation | Method and apparatus for motion gesture recognition |
US8994492B2 (en) | 2011-04-21 | 2015-03-31 | Fariborz M Farhan | Disablement of user device functionality |
US8873841B2 (en) | 2011-04-21 | 2014-10-28 | Nokia Corporation | Methods and apparatuses for facilitating gesture recognition |
US20120268242A1 (en) | 2011-04-21 | 2012-10-25 | Delphi Technologies, Inc. | Vehicle security system and method of operation based on a nomadic device location |
US9104537B1 (en) | 2011-04-22 | 2015-08-11 | Angel A. Penilla | Methods and systems for generating setting recommendation to user accounts for registered vehicles via cloud systems and remotely applying settings |
US9288270B1 (en) | 2011-04-22 | 2016-03-15 | Angel A. Penilla | Systems for learning user preferences and generating recommendations to make settings at connected vehicles and interfacing with cloud systems |
US9285944B1 (en) | 2011-04-22 | 2016-03-15 | Angel A. Penilla | Methods and systems for defining custom vehicle user interface configurations and cloud services for managing applications for the user interface and learned setting functions |
US9348492B1 (en) * | 2011-04-22 | 2016-05-24 | Angel A. Penilla | Methods and systems for providing access to specific vehicle controls, functions, environment and applications to guests/passengers via personal mobile devices |
US9536197B1 (en) * | 2011-04-22 | 2017-01-03 | Angel A. Penilla | Methods and systems for processing data streams from data producing objects of vehicle and home entities and generating recommendations and settings |
EP2710562A1 (en) | 2011-05-02 | 2014-03-26 | Apigy Inc. | Systems and methods for controlling a locking mechanism using a portable electronic device |
US20120310445A1 (en) | 2011-06-02 | 2012-12-06 | Ford Global Technologies, Llc | Methods and Apparatus for Wireless Device Application Having Vehicle Interaction |
JP5658103B2 (en) | 2011-07-12 | 2015-01-21 | 株式会社東海理化電機製作所 | Power plug lock device |
US8626144B2 (en) | 2011-07-13 | 2014-01-07 | GM Global Technology Operations LLC | Bluetooth low energy approach detections through vehicle paired capable devices |
US8873147B1 (en) | 2011-07-20 | 2014-10-28 | Google Inc. | Chord authentication via a multi-touch interface |
US20130037252A1 (en) | 2011-08-12 | 2013-02-14 | GM Global Technology Operations LLC | Smart hvac system having occupant detection capability |
JP5662906B2 (en) | 2011-08-25 | 2015-02-04 | オムロンオートモーティブエレクトロニクス株式会社 | Position detection system and position determination method |
US20130079951A1 (en) | 2011-09-22 | 2013-03-28 | Alcatel-Lucent Usa Inc. | Vehicle Device |
US8977408B1 (en) | 2011-09-23 | 2015-03-10 | Cellco Partnership | Vehicle settings profile system |
WO2013052043A1 (en) | 2011-10-05 | 2013-04-11 | Celadon Applications, Llc | Electronic communications and control module |
US8947202B2 (en) | 2011-10-20 | 2015-02-03 | Apple Inc. | Accessing a vehicle using portable devices |
JP2013102373A (en) | 2011-11-09 | 2013-05-23 | Denso Corp | Hands-free device |
WO2013074901A2 (en) | 2011-11-16 | 2013-05-23 | Flextronics Ap, Llc | Control of device features based on vehicle indications and state |
US9554286B2 (en) | 2011-12-02 | 2017-01-24 | Lear Corporation | Apparatus and method for detecting a location of a wireless device |
US20130227647A1 (en) | 2012-02-28 | 2013-08-29 | Apple Inc. | Shared network access via a peer-to-peer link |
DE102012203535A1 (en) | 2012-03-06 | 2013-09-12 | Bayerische Motoren Werke Aktiengesellschaft | Keyless car key with gesture recognition |
US9147297B2 (en) | 2012-03-14 | 2015-09-29 | Flextronics Ap, Llc | Infotainment system based on user profile |
WO2014172369A2 (en) | 2013-04-15 | 2014-10-23 | Flextronics Ap, Llc | Intelligent vehicle for assisting vehicle occupants and incorporating vehicle crate for blade processors |
US8942881B2 (en) | 2012-04-02 | 2015-01-27 | Google Inc. | Gesture-based automotive controls |
DE102013207094B4 (en) | 2012-04-27 | 2023-03-23 | GM Global Technology Operations, LLC (n.d. Ges. d. Staates Delaware) | System for implementing functions in the vehicle using short-range communication |
US9516492B2 (en) | 2012-04-30 | 2016-12-06 | Ford Global Technologies, Llc | Apparatus and method for detecting a personal communication device in a vehicle |
US9182473B2 (en) | 2012-05-10 | 2015-11-10 | Lear Corporation | System, method and product for locating vehicle key using neural networks |
US9761070B2 (en) | 2012-05-22 | 2017-09-12 | Trw Automotive U.S. Llc | Method and apparatus for hands-free opening of a door |
US9361771B2 (en) | 2012-05-23 | 2016-06-07 | Schlage Lock Company Llc | Door lock sensor and alarm |
US8768565B2 (en) | 2012-05-23 | 2014-07-01 | Enterprise Holdings, Inc. | Rental/car-share vehicle access and management system and method |
US9434300B2 (en) | 2012-05-29 | 2016-09-06 | Mohammad A. Pasdar | Multi-color in-dash lighting system for changing vehicle backlighting |
US20130329111A1 (en) | 2012-06-08 | 2013-12-12 | Samsung Electronics Co., Ltd. | Contextual help guide |
US20130342379A1 (en) | 2012-06-25 | 2013-12-26 | Lear Corporation | Vehicle Remote Function System and Method |
CN104508508A (en) | 2012-07-06 | 2015-04-08 | 丰田自动车株式会社 | Location identification system and method |
US8750832B2 (en) | 2012-07-30 | 2014-06-10 | GM Global Technology Operations LLC | Connecting a personal mobile device to a vehicle communication unit |
US20140068713A1 (en) | 2012-08-31 | 2014-03-06 | Tweddle Group, Inc. | Systems, methods and articles for providing communications and services involving automobile head units and user preferences |
JP2014069592A (en) | 2012-09-27 | 2014-04-21 | Mitsubishi Motors Corp | Remote control system for on-vehicle equipment |
US9656690B2 (en) | 2012-10-30 | 2017-05-23 | Robert Bosch Gmbh | System and method for using gestures in autonomous parking |
US9079560B2 (en) | 2012-11-02 | 2015-07-14 | GM Global Technology Operations LLC | Device location determination by a vehicle |
US10185416B2 (en) | 2012-11-20 | 2019-01-22 | Samsung Electronics Co., Ltd. | User gesture input to wearable electronic device involving movement of device |
JP5974876B2 (en) | 2012-12-07 | 2016-08-23 | 株式会社オートネットワーク技術研究所 | Vehicle lock control device |
US9224289B2 (en) | 2012-12-10 | 2015-12-29 | Ford Global Technologies, Llc | System and method of determining occupant location using connected devices |
US9008917B2 (en) | 2012-12-27 | 2015-04-14 | GM Global Technology Operations LLC | Method and system for detecting proximity of an end device to a vehicle based on signal strength information received over a bluetooth low energy (BLE) advertising channel |
CN103092481A (en) | 2013-01-17 | 2013-05-08 | 广东欧珀移动通信有限公司 | Method and device for intelligent terminal dynamic gesture unlocking |
CN103049053A (en) | 2013-01-21 | 2013-04-17 | 北汽银翔汽车有限公司 | Vehicle-mounted computer system |
US20140215120A1 (en) | 2013-01-30 | 2014-07-31 | Inmar, Inc. | System, method and computer program product for generating chronologically ordered globally unique identifiers |
EP2763077B1 (en) | 2013-01-30 | 2023-11-15 | Nokia Technologies Oy | Method and apparatus for sensor aided extraction of spatio-temporal features |
US9164588B1 (en) | 2013-02-05 | 2015-10-20 | Google Inc. | Wearable computing device with gesture recognition |
US8866604B2 (en) | 2013-02-14 | 2014-10-21 | Ford Global Technologies, Llc | System and method for a human machine interface |
KR101761190B1 (en) * | 2013-02-22 | 2017-07-25 | 삼성전자 주식회사 | Method and apparatus for providing user interface in portable terminal |
US8972730B2 (en) | 2013-03-08 | 2015-03-03 | Honeywell International Inc. | System and method of using a signed GUID |
US9241235B2 (en) | 2013-03-14 | 2016-01-19 | Voxx International Corporation | Passive entry cell phone and method and system therefor |
WO2014143032A1 (en) | 2013-03-15 | 2014-09-18 | Intel Corporation | Continuous interaction learning and detection in real-time |
US9123244B2 (en) | 2013-03-15 | 2015-09-01 | Denso International America, Inc. | Vehicle tracking of personal devices with response system |
WO2014146186A1 (en) | 2013-03-22 | 2014-09-25 | Keyfree Technologies Inc. | Managing access to a restricted area |
CN103218044B (en) | 2013-04-11 | 2016-02-03 | 张苏渝 | A kind of touching device of physically based deformation feedback and processing method of touch thereof |
FI124600B (en) | 2013-04-30 | 2014-10-31 | Bluegiga Technologies Oy | Procedure and technical device for short-range communication |
US8930045B2 (en) | 2013-05-01 | 2015-01-06 | Delphi Technologies, Inc. | Relay attack prevention for passive entry passive start (PEPS) vehicle security systems |
CA2948891C (en) | 2013-05-08 | 2023-04-04 | Obdedge, Llc | Driver identification and data collection systems for use with mobile communication devices in vehicles |
US20140365073A1 (en) | 2013-06-05 | 2014-12-11 | Ford Global Technologies, Llc | System and method of communicating with vehicle passengers |
US9053516B2 (en) | 2013-07-15 | 2015-06-09 | Jeffrey Stempora | Risk assessment using portable devices |
CN103342117A (en) | 2013-07-19 | 2013-10-09 | 上海勃科信息科技有限公司 | Wireless car body control module |
CN203368573U (en) | 2013-07-26 | 2013-12-25 | 深圳市赛格导航科技股份有限公司 | System enabling vehicle-mounted device to match mobile phone terminal automatically based on Bluetooth |
US9424047B2 (en) | 2013-08-05 | 2016-08-23 | Harman International Industries, Incorporated | System and methods for an in-vehicle computing system |
US20150048927A1 (en) | 2013-08-13 | 2015-02-19 | Directed, Llc | Smartphone based passive keyless entry system |
WO2015030710A1 (en) | 2013-08-26 | 2015-03-05 | Intel Corporation | Configuring user customizable operational features of a vehicle |
JP6241177B2 (en) | 2013-09-27 | 2017-12-06 | 富士通株式会社 | LOCATION MODEL UPDATE DEVICE, LOCATION ESTIMATION METHOD, AND PROGRAM |
US9227595B2 (en) | 2013-10-31 | 2016-01-05 | GM Global Technology Operations LLC | Methods, systems and apparatus for providing notification that a vehicle has been accessed |
US20150123762A1 (en) | 2013-11-05 | 2015-05-07 | Hyundai Motor Company | Method and system of opening and closing door of vehicle |
CN105830470A (en) | 2013-11-22 | 2016-08-03 | 高通股份有限公司 | System and method for configuring an interior of a vehicle based on preferences provided with a plurality of mobile computing devices within the vehicle |
US20150148018A1 (en) | 2013-11-26 | 2015-05-28 | Lenovo (Singapore) Pte. Ltd. | Vehicle operator specific user device management |
US10078811B2 (en) | 2013-11-29 | 2018-09-18 | Fedex Corporate Services, Inc. | Determining node location based on context data in a wireless node network |
US9398437B2 (en) | 2013-12-16 | 2016-07-19 | Nokia Technologies Oy | Method, apparatus, and computer program product for service discovery in wireless short-range communication |
CN104750056B (en) | 2013-12-31 | 2018-08-14 | 比亚迪股份有限公司 | Regulating system, method, apparatus and the mobile terminal of vehicle-state |
CN104742833B (en) | 2013-12-31 | 2017-11-17 | 比亚迪股份有限公司 | Adjusting method, device, system and the mobile terminal of vehicle part position |
CN105900463B (en) | 2014-01-06 | 2019-10-18 | 福特全球技术公司 | The device and method of interior positioning for mobile device |
US20150195669A1 (en) | 2014-01-06 | 2015-07-09 | Ford Global Technologies, Llc | Method and system for a head unit to receive an application |
US9357475B2 (en) | 2014-01-31 | 2016-05-31 | General Motors Llc | Vehicle telematics scan rate control |
CN103780702A (en) * | 2014-02-17 | 2014-05-07 | 重庆长安汽车股份有限公司 | Vehicle-mounted amusement device and mobile phone interactive system and method |
US9537989B2 (en) | 2014-03-04 | 2017-01-03 | Qualcomm Incorporated | Managing features associated with a user equipment based on a location of the user equipment within a vehicle |
US9569263B2 (en) | 2014-03-11 | 2017-02-14 | Sas Institute Inc. | Techniques for generating instructions to control database processing |
US10059175B2 (en) | 2014-03-13 | 2018-08-28 | Ford Global Technologies, Llc | Autonomous vehicle with automatic window shade |
US9721411B2 (en) | 2014-03-18 | 2017-08-01 | Google Inc. | Proximity-initiated physical mobile device gestures |
KR20150111221A (en) | 2014-03-25 | 2015-10-05 | 삼성전자주식회사 | Method for constructing page and electronic device supporting the same |
US20150283914A1 (en) | 2014-04-04 | 2015-10-08 | Ford Global Technologies, Llc | Method and system for vehicle battery environment control |
US20150294518A1 (en) | 2014-04-10 | 2015-10-15 | Ford Global Technologies, Llc | Remotely programmed keyless vehicle entry system |
CN103942963A (en) | 2014-05-12 | 2014-07-23 | 李三多 | System for verifying passenger vehicle identity through mobile phone |
CN104007929B (en) | 2014-05-26 | 2016-03-02 | 南京泰锐斯通信科技有限公司 | Based on mobile terminal unlock method and the mobile terminal of gesture identification |
US20150356797A1 (en) | 2014-06-05 | 2015-12-10 | International Business Machines Corporation | Virtual key fob with transferable user data profile |
US9467825B2 (en) | 2014-06-25 | 2016-10-11 | Verizon Patent And Licensing Inc. | Alerts based on vehicle and device telematics |
KR101588190B1 (en) | 2014-10-22 | 2016-01-25 | 현대자동차주식회사 | Vehicle, controlling method thereof and multimedia apparatus therein |
CA2967557C (en) | 2014-11-14 | 2023-09-05 | Bombardier Inc. | In-vehicle position detection and configuration of vehicle components |
CN104580784A (en) | 2014-12-09 | 2015-04-29 | 中山市佐敦音响防盗设备有限公司 | System for controlling car by mobile phone |
US10384643B2 (en) | 2015-01-14 | 2019-08-20 | GM Global Technology Operations LLC | Virtual keyfob for vehicle sharing |
US10173642B2 (en) | 2015-01-23 | 2019-01-08 | Continental Automotive Systems, Inc. | Telematics system with PIN-controlled external SIM to prevent vehicle piracy |
US9357054B1 (en) | 2015-03-11 | 2016-05-31 | Amazon Technologies, Inc. | Determining user's seating position in vehicle using barometer and motion sensors |
US10101433B2 (en) | 2015-05-01 | 2018-10-16 | GM Global Technology Operations LLC | Methods for locating a vehicle key fob |
US9616773B2 (en) | 2015-05-11 | 2017-04-11 | Uber Technologies, Inc. | Detecting objects within a vehicle in connection with a service |
US10200824B2 (en) | 2015-05-27 | 2019-02-05 | Apple Inc. | Systems and methods for proactively identifying and surfacing relevant content on a touch-sensitive device |
- 2015
- 2015-03-02 US US14/635,321 patent/US20160257198A1/en not_active Abandoned
- 2016
- 2016-03-01 DE DE102016103612.9A patent/DE102016103612A1/en active Pending
- 2016-03-02 CN CN201610119017.8A patent/CN105938338A/en active Pending
- 2020
- 2020-08-07 US US16/988,384 patent/US11472293B2/en active Active
Patent Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20080261643A1 (en) * | 2006-10-05 | 2008-10-23 | Lee Bauer | Extensible infotainment/telematics system |
US8447598B2 (en) * | 2007-12-05 | 2013-05-21 | Johnson Controls Technology Company | Vehicle user interface systems and methods |
US20120006581A1 (en) * | 2009-01-07 | 2012-01-12 | Biralee Investments Pty Limited | Cable organiser |
US8421589B2 (en) * | 2009-01-27 | 2013-04-16 | Delphi Technologies, Inc. | Dual purpose wireless device, wherein vehicle controls depend on device location |
US20120065815A1 (en) * | 2010-09-09 | 2012-03-15 | Wolfgang Hess | User interface for a vehicle system |
US20140043152A1 (en) * | 2012-08-13 | 2014-02-13 | Ford Global Technologies, Llc | System and Method for Controlling Adaptive Cruise Control Based on Driver Status |
US20140142783A1 (en) * | 2012-11-19 | 2014-05-22 | GM Global Technology Operations LLC | Methods of controlling vehicle interfaces using device motion and near field communications |
US20140164559A1 (en) * | 2012-12-10 | 2014-06-12 | Ford Global Technologies, Llc | Offline configuration of vehicle infotainment system |
US20140375477A1 (en) * | 2013-06-20 | 2014-12-25 | Motorola Mobility Llc | Vehicle detection |
US20150352953A1 (en) * | 2014-06-04 | 2015-12-10 | Magna Electronics Inc. | Vehicle control system with mobile device interface |
Non-Patent Citations (1)
Title |
---|
General Motors Corporation; Pontiac GTO Owner's Manual; 2005; pages 3-19 and 3-20; https://my.gm.com/content/dam/gmownercenter/gmna/dynamic/manuals/2006/pontiac/gto/2006_pontiac_gto_owners.pdf * |
Cited By (20)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11261581B2 (en) * | 2015-03-20 | 2022-03-01 | Sumitomo(S.H.I.) Construction Machinery Co., Ltd. | Shovel |
US20180002895A1 (en) * | 2015-03-20 | 2018-01-04 | Sumitomo(S.H.I.) Construction Machinery Co., Ltd. | Shovel |
US10137777B2 (en) | 2015-11-03 | 2018-11-27 | GM Global Technology Operations LLC | Systems and methods for vehicle system control based on physiological traits |
US11463246B2 (en) | 2015-11-09 | 2022-10-04 | Dealerware, Llc | Vehicle access systems and methods |
US11451384B2 (en) * | 2015-11-09 | 2022-09-20 | Dealerware, Llc | Vehicle access systems and methods |
US11424921B2 (en) | 2015-11-09 | 2022-08-23 | Dealerware, Llc | Vehicle access systems and methods |
US10310553B2 (en) * | 2016-01-04 | 2019-06-04 | Lg Electronics Inc. | Display apparatus for vehicle and vehicle |
US20170225690A1 (en) * | 2016-02-09 | 2017-08-10 | General Motors Llc | Wearable device controlled vehicle systems |
US10619392B2 (en) | 2016-04-13 | 2020-04-14 | 1925Workbench Ltd. | Rail-mounted doors |
US10318442B2 (en) * | 2016-05-20 | 2019-06-11 | Faraday & Future Inc. | Pairing of input device and display in vehicle infotainment systems |
US11232655B2 (en) | 2016-09-13 | 2022-01-25 | Iocurrents, Inc. | System and method for interfacing with a vehicular controller area network |
US10650621B1 (en) | 2016-09-13 | 2020-05-12 | Iocurrents, Inc. | Interfacing with a vehicular controller area network |
US11076261B1 (en) * | 2016-09-16 | 2021-07-27 | Apple Inc. | Location systems for electronic device communications |
US11805392B2 (en) | 2016-09-16 | 2023-10-31 | Apple Inc. | Location systems for electronic device communications |
US10071685B2 (en) * | 2016-12-21 | 2018-09-11 | Hyundai Motor Company | Audio video navigation (AVN) head unit, vehicle having the same, and method for controlling the vehicle having the AVN head unit |
US20190241121A1 (en) * | 2018-02-06 | 2019-08-08 | Ford Global Technologies, Llc | Vehicle lamp assembly |
US20220177067A1 (en) * | 2019-03-27 | 2022-06-09 | Tvs Motor Company Limited | Smart connect instrument cluster |
US20210357086A1 (en) * | 2020-05-18 | 2021-11-18 | Toyota Jidosha Kabushiki Kaisha | Agent control device, agent control method, and recording medium |
US12132986B2 (en) * | 2021-12-12 | 2024-10-29 | Avanti R&D, Inc. | Computer vision system used in vehicles |
CN114523919A (en) * | 2022-02-14 | 2022-05-24 | 海信集团控股股份有限公司 | Vehicle and control method thereof |
Also Published As
Publication number | Publication date |
---|---|
US20200369153A1 (en) | 2020-11-26 |
DE102016103612A1 (en) | 2016-09-08 |
CN105938338A (en) | 2016-09-14 |
US11472293B2 (en) | 2022-10-18 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11472293B2 (en) | In-vehicle component user interface | |
US9248794B2 (en) | Configuring user customizable operational features of a vehicle | |
US9630496B2 (en) | Rear occupant warning system | |
US9773417B2 (en) | Enhanced park assist system | |
US10713937B2 (en) | Trainable transceiver and mobile communications device diagnostic systems and methods | |
US9532160B2 (en) | Method of determining user intent to use services based on proximity | |
US10123155B2 (en) | Secondary-connected device companion application control of a primary-connected device | |
US20170001649A1 (en) | Secure low energy vehicle information monitor | |
US10045147B2 (en) | Application control of primary-connected devices from secondary-connected devices | |
US20190121628A1 (en) | Previewing applications based on user context | |
US10467905B2 (en) | User configurable vehicle parking alert system | |
GB2548453A (en) | Parallel parking system | |
US10462193B2 (en) | Vehicle add-on multimedia playback and capture devices | |
US20170280302A1 (en) | Vehicle seating zone assignment conflict resolution | |
CN108632346A (en) | The connection of ridesharing vehicle and passenger devices | |
CN107054243B (en) | In-vehicle control positioning | |
CN107172118B (en) | Control of primary connection device by vehicle computing platform and secondary connection device | |
CN115119145A (en) | Dynamic geofence hysteresis | |
US9966951B2 (en) | System for allowing a user to wirelessly manage software applications of a computing device and plurality of vehicles sensors | |
CN107831825B (en) | Flexible modular screen apparatus for mounting to a participating vehicle and transferring user profiles therebetween | |
KR20140128806A (en) | A method for configuring a vehicle and an apparatus using it |
US20140277934A1 (en) | Method and Apparatus for Ambient Lighting Incoming Message Alert |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: FORD GLOBAL TECHNOLOGIES, LLC, MICHIGAN Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:BUTTOLO, PIETRO;RANKIN, JAMES STEWART, II;GHOSH, DIPANJAN;AND OTHERS;SIGNING DATES FROM 20150224 TO 20150302;REEL/FRAME:035066/0699 |
|
STCV | Information on status: appeal procedure |
Free format text: NOTICE OF APPEAL FILED |
|
STCV | Information on status: appeal procedure |
Free format text: APPEAL BRIEF (OR SUPPLEMENTAL BRIEF) ENTERED AND FORWARDED TO EXAMINER |
|
STCV | Information on status: appeal procedure |
Free format text: EXAMINER'S ANSWER TO APPEAL BRIEF MAILED |
|
STCV | Information on status: appeal procedure |
Free format text: REPLY BRIEF FILED AND FORWARDED TO BPAI |
|
STCV | Information on status: appeal procedure |
Free format text: ON APPEAL -- AWAITING DECISION BY THE BOARD OF APPEALS |
|
STCV | Information on status: appeal procedure |
Free format text: BOARD OF APPEALS DECISION RENDERED |
|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- AFTER EXAMINER'S ANSWER OR BOARD OF APPEALS DECISION |