WO2020071072A1 - Information provision system, mobile terminal, information provision method, and computer program - Google Patents
Information provision system, mobile terminal, information provision method, and computer program
- Publication number
- WO2020071072A1 (PCT/JP2019/035744)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- information
- vehicle
- collision prediction
- pedestrian
- target
- Prior art date
Classifications
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/09—Arrangements for giving variable traffic instructions
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/16—Anti-collision systems
Definitions
- the present disclosure relates to an information providing system, a mobile terminal, an information providing method, and a computer program.
- This application claims the priority based on Japanese Patent Application No. 2018-187530 filed on Oct. 2, 2018, and incorporates all the contents described in the Japanese application.
- Patent Document 1 discloses a traffic system that reports information about another vehicle to the own vehicle.
- The information providing system according to the present disclosure includes: an acquisition unit that acquires, based on dynamic map information in which dynamic information of a plurality of moving bodies is superimposed on map information, movement information indicating the movement state of the plurality of moving bodies and appearance information of the plurality of moving bodies; a calculation unit that performs, based on the movement information, a collision prediction of whether a target moving body having a mobile terminal among the plurality of moving bodies collides with another moving body other than the target moving body, and calculates an evaluation value indicating the possibility of collision between the target moving body and the other moving body; and a determination unit that determines whether to notify the mobile terminal of a collision prediction result indicating the result of the collision prediction by the calculation unit.
- The calculation unit obtains the evaluation value based on a collision prediction time, included in the collision prediction result, until the target moving body and the other moving body collide, and the determination unit determines whether to notify the mobile terminal of the collision prediction result based on the evaluation value.
- An information providing system according to another aspect includes: an acquisition unit that acquires, based on dynamic map information in which dynamic information of a plurality of moving bodies is superimposed on map information, movement information indicating the movement state of the plurality of moving bodies; a calculation unit that performs, based on the movement information, a collision prediction of whether a target moving body having a mobile terminal among the plurality of moving bodies collides with another moving body other than the target moving body, and calculates an evaluation value indicating the possibility of collision between the target moving body and the other moving body; and a determination unit that determines whether to notify the mobile terminal of a collision prediction result indicating the result of the collision prediction by the calculation unit. The calculation unit obtains, based on position information of the plurality of moving bodies included in the movement information, the relative distance between the target moving body and each of the other moving bodies and the displacement of that relative distance, obtains the evaluation value based on the relative distance and its displacement, and the determination unit determines whether to notify the mobile terminal of the collision prediction result based on the evaluation value.
- a mobile terminal is a mobile terminal that receives the collision prediction result from the information providing system and outputs the collision prediction result to a user.
- An information providing method according to the present disclosure is an information providing method for providing information to a mobile terminal based on dynamic map information in which dynamic information of a plurality of moving bodies is superimposed on map information.
- In the calculating step, the evaluation value is obtained based on, among other factors, an action attribute value corresponding to action information indicating the behavior of each of the target moving body and the other moving body, and in the determining step, whether to notify the mobile terminal of the collision prediction result is determined based on the evaluation value.
- A computer program according to the present disclosure is a computer program for causing a computer to execute an information providing process for providing information to a mobile terminal, the process being based on dynamic map information in which dynamic information of a plurality of moving bodies is superimposed on map information.
- The process performs, based on the movement information, a collision prediction of whether a target moving body having the mobile terminal collides with another moving body other than the target moving body, and calculates an evaluation value indicating the possibility of collision between the target moving body and the other moving body.
- In the calculating step, the evaluation value is obtained based on a collision prediction time, included in the collision prediction result, until the target moving body and the other moving body collide, an appearance attribute value corresponding to the appearance information of each of the target moving body and the other moving body, and an action attribute value, obtained from the movement information, corresponding to action information indicating the behavior of each of the target moving body and the other moving body.
- In the determining step, whether to notify the mobile terminal of the collision prediction result is determined based on the evaluation value.
- FIG. 1 is a schematic diagram illustrating an overall configuration of a wireless communication system according to an embodiment.
- FIG. 2 is a block diagram illustrating an example of an internal configuration of the edge server and the core server.
- FIG. 3 is a block diagram illustrating an example of an internal configuration of an in-vehicle device of a vehicle equipped with a communication terminal.
- FIG. 4 is a block diagram illustrating an example of an internal configuration of the pedestrian terminal.
- FIG. 5 is a block diagram illustrating an example of an internal configuration of a roadside sensor on which a wireless communication device as a communication terminal is mounted.
- FIG. 6 is an overall configuration diagram of the information providing system according to the present embodiment.
- FIG. 7 is a sequence diagram illustrating an example of dynamic information update processing and distribution processing.
- FIG. 8 is a functional block diagram of the edge server showing a function for providing a collision prediction result.
- FIG. 9 is a diagram illustrating an example of the moving object database.
- FIG. 10 is a diagram showing part of the moving object database in which attribute information is registered, and shows the columns for registering attribute information when the moving body is a vehicle.
- FIG. 11 is a diagram showing part of the moving object database in which attribute information is registered, and shows the columns for registering attribute information when the moving body is a pedestrian.
- FIG. 12 is a flowchart illustrating an example of calculation processing of the evaluation value of the collision prediction result by the calculation unit.
- FIG. 13 is a flowchart illustrating an example of the individual calculation process in step S55 of FIG. 12.
- FIG. 14 is a flowchart illustrating an example of the process of adding a collision prediction target in FIG.
- FIG. 15 is a flowchart showing an example of the vehicle addition process in FIG.
- FIG. 16 is a flowchart showing an example of the pedestrian addition process in FIG.
- FIG. 17 is a diagram illustrating an example of the evaluation value database.
- FIG. 18 is a flowchart illustrating an example of a determination process performed by the determination unit.
- FIG. 19 is a diagram showing a situation around an intersection according to scenario 1.
- FIG. 20 is a diagram showing a situation around an intersection according to scenario 2.
- FIG. 21 is a diagram showing a situation around an intersection according to scenario 3.
- FIG. 22 is a diagram showing a situation around an intersection according to scenario 4.
- FIG. 23 is a diagram showing a situation around an intersection according to scenario 5.
- FIG. 24 is a diagram showing a situation around an intersection according to scenario 6.
- FIG. 25 is a diagram illustrating a storage unit of the edge server according to the second embodiment.
- FIG. 26 is a diagram illustrating an example of the relative distance database.
- FIG. 27 is a flowchart illustrating an example of a calculation process performed by the calculation unit 31a according to the present embodiment.
- FIG. 28 is a flowchart showing an example of the adding process based on the relative distance in step S59 in FIG.
- FIG. 29 is a flowchart illustrating an example of an addition process for an evaluation value.
- FIG. 30 is a flowchart illustrating an example of a process corresponding to vehicle-to-pedestrian or pedestrian-to-vehicle.
- FIG. 31 is a flowchart illustrating an example of a process corresponding to a vehicle-to-vehicle process.
- FIG. 32 is a diagram illustrating an example of the evaluation value database after the addition processing based on the relative distance.
- FIG. 33 is a diagram for explaining a specific example of the addition process based on the relative distance.
- FIG. 34 is a diagram for describing another specific example of the addition process based on the relative distance.
- FIG. 35 is a diagram for describing another specific example of the addition process based on the relative distance.
- FIG. 36A is a diagram for describing another specific example of the addition processing based on the relative distance.
- FIG. 36B is a diagram for describing another specific example of the addition processing based on the relative distance.
- the central device of the system is configured to determine the presence or absence of an abnormal event of each vehicle based on vehicle information obtained from each vehicle, and to notify each vehicle of the determination result.
- The system is configured to notify the result of an abnormal event occurring. For example, it is conceivable to use such a system to notify the result of predicting a collision between vehicles (moving bodies).
- In a collision prediction, a collision prediction time until the moving bodies that are predicted to collide would actually collide can be obtained. The shorter the collision prediction time, the higher the possibility of collision and the more urgent the notification. Therefore, if whether to notify the moving body of the collision prediction result is determined according to the collision prediction time, the collision prediction result can be notified according to necessity.
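The collision prediction time described above can be illustrated with a minimal sketch (not taken from the patent): assuming constant-velocity straight-line motion, it is the time until two moving bodies first come within a chosen collision radius. The function name, the radius, and the horizon are illustrative assumptions.

```python
import math

def collision_prediction_time(p1, v1, p2, v2, radius=2.0, horizon=30.0):
    """Estimate seconds until two bodies come within `radius` meters of each
    other, assuming constant velocity; None if no collision within `horizon`."""
    # Relative position and velocity (body 2 as seen from body 1).
    rx, ry = p2[0] - p1[0], p2[1] - p1[1]
    vx, vy = v2[0] - v1[0], v2[1] - v1[1]
    # Solve |r + v*t|^2 = radius^2, a quadratic a*t^2 + b*t + c = 0.
    a = vx * vx + vy * vy
    b = 2 * (rx * vx + ry * vy)
    c = rx * rx + ry * ry - radius * radius
    if c <= 0:
        return 0.0          # already within the collision radius
    if a == 0:
        return None         # no relative motion, distance never changes
    disc = b * b - 4 * a * c
    if disc < 0:
        return None         # paths never come within the radius
    t = (-b - math.sqrt(disc)) / (2 * a)  # smaller root = first contact
    return t if 0 <= t <= horizon else None
```

For example, a vehicle at the origin moving at 10 m/s toward a stationary pedestrian 100 m ahead yields a collision prediction time of about 9.8 s with a 2 m radius.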
- However, moving bodies include not only vehicles but also vulnerable road users such as pedestrians, and the possibility of collision may differ depending on the attributes of each moving body. For this reason, if the possibility of a collision is evaluated based only on the collision prediction time when determining whether to notify the moving body of the collision prediction result, the possibility of collision evaluated from the collision prediction time may diverge from the actual possibility of collision, and notification of the collision prediction result may not be performed properly. It is therefore desirable to appropriately provide the mobile terminal with collision prediction results for which notification is highly necessary.
- the present disclosure has been made in view of such circumstances, and has as its object to provide a technology capable of appropriately providing a collision prediction result to a mobile terminal.
- An information providing system according to the present disclosure includes: an acquisition unit that acquires, based on dynamic map information in which dynamic information of a plurality of moving bodies is superimposed on map information, movement information indicating the movement state of the plurality of moving bodies and appearance information of the plurality of moving bodies;
- a calculation unit that performs, based on the movement information, a collision prediction of whether a target moving body having a mobile terminal among the plurality of moving bodies collides with another moving body other than the target moving body, and calculates an evaluation value indicating the possibility of collision between the target moving body and the other moving body; and
- a determination unit that determines whether to notify the mobile terminal of a collision prediction result indicating the result of the collision prediction by the calculation unit. The calculation unit obtains the evaluation value based on a collision prediction time, included in the collision prediction result, until the target moving body and the other moving body collide, an appearance attribute value corresponding to the appearance information of each of the target moving body and the other moving body, and an action attribute value, obtained from the movement information, corresponding to action information indicating the behavior of each of the target moving body and the other moving body. The determination unit determines whether to notify the mobile terminal of the collision prediction result based on the evaluation value.
- the evaluation value for determining whether or not to notify the collision prediction result is determined based on the collision prediction time, the appearance attribute value, and the action attribute value.
- With the above configuration, the appearance and behavior of the target moving body and the other moving bodies can be reflected in the determination of whether to notify the result. For example, for a collision prediction concerning a moving body whose appearance or behavior suggests a high possibility of collision, the evaluation value can be weighted in the direction of notifying the collision prediction result, so that the appearance and behavior of the moving bodies are taken into account in determining the necessity of notification. As a result, the necessity of notification can be determined appropriately, and collision prediction results for which notification is highly necessary can be appropriately provided to the mobile terminal.
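As a hedged sketch of how such an evaluation value might combine the three factors, the scaling, weights, and threshold below are invented for illustration; the patent does not specify a formula:

```python
def evaluation_value(ttc_seconds, appearance_attr, action_attr, max_ttc=10.0):
    """Combine the three factors into one score: a shorter collision
    prediction time contributes more, and the appearance/action attribute
    values weight the score toward notification for riskier moving bodies."""
    # Time term: 0 when TTC >= max_ttc, growing toward 50 as TTC -> 0.
    time_term = max(0.0, (max_ttc - min(ttc_seconds, max_ttc)) / max_ttc) * 50
    return time_term + appearance_attr + action_attr

def should_notify(score, threshold=40.0):
    """Notify the mobile terminal only when the score reaches the threshold."""
    return score >= threshold
```

Under these illustrative numbers, a 2 s collision prediction time with moderate attribute scores clears the threshold, while a 9 s prediction with neutral attributes does not.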
- the information providing system may further include a notification unit that notifies the collision prediction result to the mobile terminal that the determination unit determines to notify.
- a notification unit that notifies the collision prediction result to the mobile terminal that the determination unit determines to notify.
- information can be provided only to the mobile terminals for which it is determined that the collision prediction result is necessary based on the evaluation value.
- In the information providing system, the calculation unit may specify, based on the collision prediction result, a moving body among the other moving bodies that is predicted to collide with the target moving body,
- and may obtain the evaluation value using the collision prediction time, the appearance attribute value, and the action attribute value of the moving body predicted to collide. In this case, an evaluation value relating to the collision prediction result of the moving body predicted to collide can be obtained.
- The information providing system may further include a control unit that controls the mobile terminal and causes the mobile terminal to execute a process of outputting the collision prediction result to a user of the mobile terminal.
- The control unit may control the mobile terminal so that the output mode of the collision prediction result differs depending on the appearance information and the behavior information of the moving body predicted to collide. In this case, the result can be output to the user in an output mode corresponding to the characteristics of the moving body predicted to collide.
- When the other moving body is a vehicle, the appearance attribute value may include at least one of an attribute value corresponding to the size of the vehicle, an attribute value corresponding to the shape of the vehicle, an attribute value corresponding to the color of the vehicle, and an attribute value corresponding to information described on the license plate attached to the vehicle.
- When the other moving body is a pedestrian, the appearance attribute value may include at least one of an attribute value corresponding to the height of the pedestrian, an attribute value corresponding to the clothes of the pedestrian, and an attribute value corresponding to the pedestrian's equipment.
- When the other moving body is a vehicle, the action attribute value may include at least one of an attribute value corresponding to the current driving state and an attribute value corresponding to the past driving history.
- When the other moving body is a pedestrian, the action attribute value may include at least one of an attribute value corresponding to the walking location of the pedestrian, an attribute value corresponding to the walking trajectory of the pedestrian, and an attribute value corresponding to the pedestrian's behavior with respect to the lamp color of a traffic light.
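The attribute categories in the bullets above can be pictured as lookup tables. Only the categories (vehicle size, clothing, driving history, walking location, signal behavior, and so on) come from the patent; the feature names and numeric scores below are purely hypothetical:

```python
# Hypothetical attribute scoring tables; the numeric values are illustrative.
VEHICLE_APPEARANCE = {"large_size": 8, "truck_shape": 6, "dark_color": 3}
PEDESTRIAN_APPEARANCE = {"child_height": 10, "dark_clothes": 5, "umbrella": 3}
VEHICLE_ACTION = {"speeding_now": 12, "past_violations": 6}
PEDESTRIAN_ACTION = {"walking_on_roadway": 12, "zigzag_trajectory": 8,
                     "crossing_on_red": 15}

def attribute_score(features, table):
    """Sum the attribute values of the features observed for a moving body;
    unknown features contribute nothing."""
    return sum(table.get(f, 0) for f in features)
```

For example, a child crossing against a red light would, under these invented scores, accumulate a higher attribute total than an adult on a sidewalk, pushing the evaluation toward notification.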
- The calculation unit may obtain, based on position information of the plurality of moving bodies included in the movement information, the relative distance between the target moving body and each of the other moving bodies and the displacement of that relative distance, and may add to the evaluation value an additional value based on the relative distance and its displacement.
- In this case, even when a collision is not predicted within the collision prediction time, the possibility of collision arising from the target moving body and another moving body approaching each other can be reflected in the evaluation value, so that the collision prediction result can be provided to the mobile terminal of the target moving body.
- An information providing system according to another aspect includes: an acquisition unit that acquires, based on dynamic map information in which dynamic information of a plurality of moving bodies is superimposed on map information, movement information indicating the movement state of the plurality of moving bodies;
- a calculation unit that performs, based on the movement information, a collision prediction of whether a target moving body having a mobile terminal among the plurality of moving bodies collides with another moving body other than the target moving body, and calculates an evaluation value indicating the possibility of collision between the target moving body and the other moving body; and a determination unit that determines whether to notify the mobile terminal of a collision prediction result indicating the result of the collision prediction by the calculation unit.
- The calculation unit obtains, based on position information of the plurality of moving bodies included in the movement information, the relative distance between the target moving body and each of the other moving bodies and the displacement of that relative distance, and obtains the evaluation value based on the relative distance and its displacement. The determination unit determines whether to notify the mobile terminal of the collision prediction result based on the evaluation value.
- With the above configuration, the evaluation value for determining whether to notify the collision prediction result is obtained based on the relative distance between the target moving body and each of the other moving bodies and the displacement of that relative distance. If the target moving body and another moving body are within a predetermined relative distance and the relative distance is decreasing so that they approach each other, it can be determined that a possibility of collision has arisen. As a result, regardless of the traveling directions of the target moving body and the other moving bodies, other moving bodies that may collide with the target moving body can be evaluated broadly, and the necessity of notification can be determined appropriately.
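The relative-distance addition can be sketched minimally as follows, assuming two position samples per moving body; the range and additional value are invented constants, not values from the patent:

```python
import math

def relative_distance_addition(prev_pos_a, pos_a, prev_pos_b, pos_b,
                               range_m=30.0, add_value=20.0):
    """Return an addition to the evaluation value when two bodies are within
    `range_m` of each other and their relative distance is shrinking."""
    def dist(p, q):
        return math.hypot(p[0] - q[0], p[1] - q[1])

    d_prev = dist(prev_pos_a, prev_pos_b)
    d_now = dist(pos_a, pos_b)
    displacement = d_now - d_prev   # negative means the bodies are approaching
    if d_now <= range_m and displacement < 0:
        return add_value
    return 0.0
```

Because only distance and its change are used, the check works regardless of the bodies' traveling directions, matching the effect described above.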
- A mobile terminal according to the present disclosure is a mobile terminal that receives the collision prediction result from the information providing system according to any one of (1) to (10) above and outputs the collision prediction result to a user.
- An information providing method according to the present disclosure is an information providing method for providing information to a mobile terminal based on dynamic map information in which dynamic information of a plurality of moving bodies is superimposed on map information.
- The method includes: an acquiring step of acquiring movement information indicating the movement state of the plurality of moving bodies and appearance information of the plurality of moving bodies; a calculating step of performing, based on the movement information, a collision prediction of whether a target moving body having the mobile terminal among the plurality of moving bodies collides with another moving body other than the target moving body, and calculating an evaluation value indicating the possibility of collision between the target moving body and the other moving body;
- and a determining step of determining whether to notify the mobile terminal of a collision prediction result indicating the result of the collision prediction in the calculating step.
- A computer program according to the present disclosure is a computer program for causing a computer to execute an information providing process for providing information to a mobile terminal, the process being based on dynamic map information in which dynamic information of a plurality of moving bodies is superimposed on map information.
- The program causes the computer to perform, based on the movement information, a collision prediction of whether a target moving body having the mobile terminal collides with another moving body other than the target moving body, and to calculate an evaluation value indicating the possibility of collision between the target moving body and the other moving body.
- In the calculating step, the evaluation value is obtained based on a collision prediction time, included in the collision prediction result, until the target moving body and the other moving body collide, an appearance attribute value corresponding to the appearance information of each of the target moving body and the other moving body, and an action attribute value, obtained from the movement information, corresponding to action information indicating the behavior of each of the target moving body and the other moving body. In the determining step, whether to notify the mobile terminal of the collision prediction result is determined based on the evaluation value.
- FIG. 1 is a schematic diagram illustrating an overall configuration of a wireless communication system according to an embodiment.
- The wireless communication system includes a plurality of communication terminals 1A to 1D capable of wireless communication, one or more base stations 2 that wirelessly communicate with the communication terminals 1A to 1D, one or more edge servers 3 that communicate with the base stations 2 by wire or wirelessly, and one or more core servers 4 that communicate with the edge servers 3 by wire or wirelessly.
- the communication terminals 1A to 1D are also referred to as communication terminals 1.
- the core server 4 is installed in a core data center (DC) of the core network.
- the edge server 3 is installed in a distributed data center (DC) of a metro network.
- the metro network is, for example, a communication network constructed for each city. Each metro network is connected to the core network.
- the base station 2 is communicably connected to any one of the edge servers 3 of the distributed data center included in the metro network.
- the core server 4 is communicably connected to the core network.
- the edge server 3 is communicably connected to a metro network. Therefore, the core server 4 can communicate with the edge server 3 and the base station 2 belonging to each metro network via the core network and the metro network.
- the base station 2 includes at least one of a macro cell base station, a micro cell base station, and a pico cell base station.
- the edge server 3 and the core server 4 are general-purpose servers capable of performing SDN (Software-Defined Networking).
- The relay devices, such as the base station 2 and repeaters (not shown), are transport devices capable of SDN. Therefore, by network virtualization technology, a plurality of virtual networks (network slices) S1 to S4 that satisfy conflicting service requirements, such as low-delay communication and large-capacity communication, can be defined on the physical devices of the wireless communication system.
- the wireless communication system according to the present embodiment is compliant with, for example, 5G.
- the wireless communication system according to the present embodiment is a mobile communication system capable of defining a plurality of network slices (hereinafter, also referred to as “slices”) S1 to S4 according to predetermined service requirements such as delay time. It is not limited to 5G.
- the number of slices to be defined is not limited to four, but may be five or more.
- each of the network slices S1 to S4 is defined as follows.
- the slice S1 is a network slice defined so that the communication terminals 1A to 1D communicate directly.
- the communication terminals 1A to 1D that directly communicate in the slice S1 are also referred to as “node N1”.
- Slice S2 is a network slice defined so that communication terminals 1A to 1D communicate with base station 2.
- the highest communication node in the slice S2 (the base station 2 in the illustrated example) is also referred to as “node N2”.
- the slice S3 is a network slice defined so that the communication terminals 1A to 1D communicate with the edge server 3 via the base station 2.
- the highest communication node in the slice S3 (the edge server 3 in the illustrated example) is also referred to as “node N3”.
- the node N2 becomes a relay node. That is, data communication is performed by an uplink route from the node N1 to the node N2 to the node N3 and a downlink route from the node N3 to the node N2 to the node N1.
- Slice S4 is a network slice defined so that communication terminals 1A to 1D communicate with core server 4 via base station 2 and edge server 3.
- the highest-order communication node in the slice S4 (the core server 4 in the illustrated example) is also referred to as “node N4”.
- In this case, the nodes N2 and N3 become relay nodes. That is, data communication is performed by an uplink route of node N1 → node N2 → node N3 → node N4 and a downlink route of node N4 → node N3 → node N2 → node N1.
- the routing is performed without using the edge server 3 as a relay node.
- data communication is performed by an uplink route from the node N1 to the node N2 to the node N4 and a downlink route from the node N4 to the node N2 to the node N1.
- the communication terminal 1A is a wireless communication device mounted on the vehicle 5.
- the vehicles 5 include not only ordinary passenger cars but also public vehicles such as route buses and emergency vehicles.
- the vehicle 5 may be not only a four-wheeled vehicle but also a two-wheeled vehicle (motorcycle).
- the drive system of the vehicle 5 may be any of an engine drive, an electric motor drive, and a hybrid system.
- The driving method of the vehicle 5 may be either normal driving, in which a passenger performs operations such as acceleration/deceleration and steering, or automatic driving, in which such operations are executed by software.
- the communication terminal 1A of the vehicle 5 may be an existing wireless communication device of the vehicle 5, or may be a portable terminal carried by the passenger into the vehicle 5.
- the passenger's mobile terminal is temporarily connected to the in-vehicle LAN (Local Area Network) of the vehicle 5 to temporarily become a wireless communication device mounted on the vehicle.
- the communication terminal 1B is a portable terminal (pedestrian terminal) carried by the pedestrian 7.
- the pedestrian 7 is a person who walks outdoors on roads and parking lots, and indoors in buildings and underground malls.
- the pedestrian 7 includes not only a person walking but also a person riding a bicycle having no power source.
- the communication terminal 1C is a wireless communication device mounted on the roadside sensor 8.
- the roadside sensor 8 is an image-type vehicle sensor installed on a road, a security camera installed outdoors or indoors, and the like.
- the communication terminal 1D is a wireless communication device mounted on the traffic signal controller 9 at the intersection.
- In the present embodiment, direct wireless communication in the slice S1 (for example, "inter-vehicle communication," in which the communication terminals 1A of vehicles 5 communicate directly) and wireless communication in the slice S2 via the base station 2 are also possible.
- In the present embodiment, it is assumed that an information providing service for users in a relatively wide service area (for example, an area covering municipalities or prefectures) is provided using the slices S3 and S4 of the wireless communication system in FIG. 1.
- FIG. 2 is a block diagram illustrating an example of an internal configuration of the edge server 3 and the core server 4.
- The edge server 3 includes a control unit 31 including a CPU (Central Processing Unit), a ROM (Read-Only Memory) 32, a RAM (Random Access Memory) 33, a storage unit 34, and a communication unit 35.
- The control unit 31 controls the operation of each piece of hardware by reading one or more programs stored in advance in the ROM 32 into the RAM 33 and executing them, thereby causing the computer to function as the edge server 3 capable of communicating with the core server 4, the base station 2, and the like.
- the RAM 33 is composed of a volatile memory element such as an SRAM (Static RAM) or a DRAM (Dynamic RAM), and temporarily stores a program executed by the control unit 31 and data necessary for the execution.
- The storage unit 34 is configured by a nonvolatile memory element such as a flash memory or an EEPROM (Electrically Erasable Programmable Read-Only Memory), or by a magnetic storage device such as a hard disk.
- the communication unit 35 has a function of communicating with the core server 4, the base station 2, and the like via the metro network.
- the communication unit 35 transmits the information provided from the control unit 31 to an external device via the metro network, and provides the control unit 31 with the information received via the metro network.
- the storage unit 34 stores a dynamic information map (hereinafter, also simply referred to as a “map”) M1 as dynamic map information.
- the map M1 is an aggregate (virtual database) of data in which dynamic information that changes every moment is superimposed on a high-definition digital map that is static information.
- the information constituting the map M1 includes the following “dynamic information” and “static information”.
- Dynamic information refers to dynamic data for which a delay within one second is required.
- dynamic information includes position information and signal information of a moving object (vehicle and pedestrian, etc.) used as ITS (Intelligent Transport Systems) look-ahead information.
- the position information of moving objects included in the dynamic information covers not only the position information of vehicles 5 and pedestrians 7 that are capable of wireless communication by having the communication terminals 1A and 1B, but also the position information of vehicles 5 and pedestrians 7 that have no wireless communication function.
- Static information refers to static data for which a delay within one month is permissible.
- road surface information, lane information, three-dimensional structure data, and the like correspond to the static information.
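To make the layered structure concrete, the following Python sketch models a map in which sub-second dynamic records are overlaid on a static layer. The class name, field names, and the one-second freshness check are illustrative assumptions, not part of the disclosure.

```python
import time

# Hypothetical sketch of the dynamic information map M1: static road map
# data overlaid with per-object dynamic records (names are illustrative).
class DynamicMap:
    def __init__(self, static_layer):
        self.static_layer = static_layer  # lane/structure data, rarely updated
        self.dynamic_layer = {}           # object_id -> latest dynamic record

    def update_object(self, object_id, position, timestamp=None):
        # Dynamic records are expected to be fresh to within about one second.
        self.dynamic_layer[object_id] = {
            "position": position,
            "timestamp": timestamp if timestamp is not None else time.time(),
        }

    def fresh_objects(self, now, max_age=1.0):
        # Return only records satisfying the sub-second freshness requirement.
        return {oid: rec for oid, rec in self.dynamic_layer.items()
                if now - rec["timestamp"] <= max_age}

m = DynamicMap(static_layer={"lanes": ["L1", "L2"]})
m.update_object("vehicle-5", (35.0, 139.0), timestamp=100.0)
m.update_object("pedestrian-7", (35.1, 139.1), timestamp=99.0)
print(sorted(m.fresh_objects(now=100.5).keys()))  # → ['vehicle-5']
```

The stale pedestrian record (1.5 s old) is dropped from the fresh view, mirroring the requirement that dynamic information be served with sub-second delay.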
- the control unit 31 of the edge server 3 updates the dynamic information of the map M1 stored in the storage unit 34 at a predetermined update cycle (update processing). Specifically, for each predetermined update cycle, the control unit 31 collects from the communication terminals 1A to 1D the various types of sensor information acquired by the vehicle 5, the roadside sensor 8, and the like in the service area of its own device, and updates the dynamic information of the map M1 based on the collected sensor information.
- when receiving a dynamic information request message from the communication terminals 1A and 1B of a predetermined user, the control unit 31 distributes the latest dynamic information to the communication terminals 1A and 1B that transmitted the request message at every predetermined distribution cycle (distribution processing). The control unit 31 may also collect traffic information and weather information of various places in the service area from the traffic control center, the private weather service support center, and the like, and update the dynamic information or the static information of the map M1 based on the collected information.
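The update processing and distribution processing above amount to a per-cycle merge of collected change point information into the dynamic layer, followed by distribution of the same snapshot to every requester. The sketch below is a minimal illustration under that reading; all names and record shapes are assumptions.

```python
# Minimal sketch of one edge-server cycle (illustrative names only):
# merge change point information into the dynamic layer, then distribute
# the resulting snapshot to every terminal that sent a request message.
def update_cycle(dynamic_layer, change_points):
    for cp in change_points:  # cp: {"id": ..., "position": ...}
        dynamic_layer[cp["id"]] = {"position": cp["position"]}
    return dynamic_layer

def distribute(dynamic_layer, subscribers):
    # Every requester receives the same latest snapshot.
    return {sub: dict(dynamic_layer) for sub in subscribers}

layer = {"vehicle-5": {"position": (0.0, 0.0)}}
layer = update_cycle(layer, [{"id": "pedestrian-7", "position": (3.0, 4.0)}])
snapshots = distribute(layer, ["terminal-1A", "terminal-1B"])
print(sorted(layer.keys()), len(snapshots))  # ['pedestrian-7', 'vehicle-5'] 2
```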
- core server 4 includes a control unit 41 including a CPU, a ROM 42, a RAM 43, a storage unit 44, and a communication unit 45.
- the control unit 41 controls the operation of each piece of hardware by reading one or more programs stored in advance in the ROM 42 into the RAM 43 and executing them, thereby causing the computer device to function as the core server 4 capable of communicating with the edge server 3.
- the RAM 43 is composed of a volatile memory element such as an SRAM or a DRAM, and temporarily stores a program executed by the control unit 41 and data necessary for the execution.
- the storage unit 44 is configured by a nonvolatile memory element such as a flash memory or an EEPROM, or a magnetic storage device such as a hard disk.
- the communication unit 45 has a function of communicating with the edge server 3, the base station 2, and the like via the core network.
- the communication unit 45 transmits the information provided from the control unit 41 to an external device via the core network, and provides the control unit 41 with the information received via the core network.
- the storage unit 44 of the core server 4 stores an information map M2.
- the data structure of the map M2 (data structure including dynamic information and static information) is the same as the data structure of the map M1.
- the map M2 may be a map of the same service area as the map M1 of the specific edge server 3, or may be a wider area map obtained by integrating the maps M1 held by the plurality of edge servers 3.
- the control unit 41 of the core server 4 may update the dynamic information of the map M2 stored in the storage unit 44 and distribute the dynamic information in response to request messages. That is, the control unit 41 can execute the update process and the distribution process of the dynamic information based on the map M2 of its own device, independently of the edge server 3.
- the core server 4 belonging to the slice S4 has a longer communication delay time with the communication terminals 1A to 1D than the edge server 3 belonging to the slice S3. For this reason, even if the core server 4 independently updates the dynamic information of the map M2, its real-time property is inferior to that of the map M1 managed by the edge server 3. Therefore, for example, the control unit 31 of the edge server 3 and the control unit 41 of the core server 4 may share the update processing and the distribution processing of the dynamic information according to a priority defined for each predetermined area.
- FIG. 3 is a block diagram illustrating an example of an internal configuration of the vehicle-mounted device 50 of the vehicle 5 equipped with the communication terminal 1A.
- the on-vehicle device 50 mounted on the vehicle 5 includes a control unit (ECU: Electronic Control Unit) 51, a GPS receiver 52, a vehicle speed sensor 53, a gyro sensor 54, a storage unit 55, a display 56, a speaker 57, an input device 58, a vehicle-mounted camera 59, a radar sensor 60, and a communication unit 61.
- the communication unit 61 is the above-described communication terminal 1A (a wireless communication device capable of performing communication conforming to 5G). Therefore, the in-vehicle device 50 of the vehicle 5 can communicate with the edge server 3 as a kind of mobile terminal belonging to the slice S3. Further, the in-vehicle device 50 of the vehicle 5 can communicate with the core server 4 as a kind of mobile terminal belonging to the slice S4.
- the control unit 51 is a computer device that searches for a route of the vehicle 5 and controls other electronic devices 52 to 61.
- the control unit 51 obtains the vehicle position of the own vehicle from the GPS signals periodically acquired by the GPS receiver 52. Further, the control unit 51 complements the vehicle position and direction based on the input signals of the vehicle speed sensor 53 and the gyro sensor 54, and grasps the accurate current position and direction of the vehicle 5.
- the GPS receiver 52, the vehicle speed sensor 53, and the gyro sensor 54 are sensors for measuring the current position, speed, and direction of the vehicle 5.
- Storage unit 55 includes a map database.
- the map database provides the control unit 51 with road map data.
- the road map data includes link data and node data, and is stored in a recording medium such as a DVD, CD-ROM, memory card, or HDD.
- the storage unit 55 reads necessary road map data from the recording medium and provides the read road map data to the control unit 51.
- the display 56 and the speaker 57 are output devices for notifying the user who is the occupant of the vehicle 5 of various information generated by the control unit 51. Specifically, the display 56 displays an input screen at the time of a route search, a map image around the own vehicle, route information to a destination, and the like. The speaker 57 outputs an announcement or the like for guiding the vehicle 5 to the destination by voice. These output devices can also notify the passenger of the provided information received by the communication unit 61.
- the input device 58 is a device for the passenger of the vehicle 5 to perform various input operations.
- the input device 58 is, for example, an operation switch provided on the steering wheel, a joystick, a touch panel provided on the display 56, or a combination of these.
- the input device 58 may be a voice recognition device that receives an input by voice recognition of a passenger.
- the input signal generated by the input device 58 is transmitted to the control unit 51.
- the on-vehicle camera 59 is an image sensor that captures an image in front of the vehicle 5.
- the in-vehicle camera 59 may be either a monocular or a compound eye.
- the radar sensor 60 is a sensor that detects an object existing in front of or around the vehicle 5 using a millimeter-wave radar, a LiDAR method, or the like.
- based on the measurement data from the on-vehicle camera 59 and the radar sensor 60, the control unit 51 can execute driving support control such as outputting an alert to the occupant on the display 56 during driving or performing forced brake intervention.
- the control unit 51 is configured by an arithmetic processing device such as a microcomputer that executes various control programs stored in the storage unit 55.
- as functions realized by executing the control program, the control unit 51 has a function of displaying a map image on the display 56, a function of calculating a route from the departure place to the destination (including the position of a stopover point, if any), and a function of guiding the vehicle 5 to the destination according to the calculated route.
- the control unit 51 has a function of performing an object recognition process of recognizing an object in front of or around the own vehicle based on the measurement data of at least one of the in-vehicle camera 59 and the radar sensor 60, and a distance measurement process of calculating the distance to the recognized object.
- the control unit 51 can calculate the position information of the object recognized by the object recognition process from the distance calculated by the distance measurement process and the sensor position of the own vehicle.
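As a concrete example of deriving an object's position from the measured distance and the sensor position of the own vehicle, the sketch below assumes the bearing to the object is also known from the camera or radar, and uses a flat-plane approximation; it is an illustration, not the claimed method.

```python
import math

def object_position(sensor_xy, bearing_deg, distance_m):
    """Project a detected object's position from the sensor position,
    the bearing to the object (degrees, 0 = +y axis), and the measured
    range. A flat-plane approximation for illustration only."""
    x, y = sensor_xy
    rad = math.radians(bearing_deg)
    return (x + distance_m * math.sin(rad), y + distance_m * math.cos(rad))

# An object detected 10 m straight ahead of a sensor at the origin:
px, py = object_position((0.0, 0.0), bearing_deg=0.0, distance_m=10.0)
print(round(px, 6), round(py, 6))  # 0.0 10.0
```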
- the control unit 51 can execute the following processes in communication with the edge server 3 (or the core server 4):
  1) Request message transmission processing
  2) Dynamic information reception processing
  3) Change point information generation processing
  4) Change point information transmission processing
- the request message transmission process is a process of transmitting, to the edge server 3, a control packet requesting distribution of the dynamic information of the map M1 sequentially updated by the edge server 3.
- the control packet includes the vehicle ID of the own vehicle.
- the edge server 3 distributes the dynamic information at a predetermined distribution cycle to the communication terminal 1A of the vehicle 5 having the transmission source vehicle ID.
- the dynamic information receiving process is a process of receiving dynamic information distributed by the edge server 3 to the own device.
- the change point information generation process by the control unit 51 compares the received dynamic information with the sensor information of the own vehicle at the time of reception, calculates the change between the two based on the comparison result, and generates change point information, which is information on the difference between the two.
- the change point information generated by the control unit 51 is, for example, the following information examples a1 and a2.
- (Information example a1) Change point information related to a recognized object
- when the control unit 51 detects, by its own object recognition processing, an object X (a moving object such as a vehicle or a pedestrian, or an obstacle) not included in the received dynamic information, it uses the image data and the position information of the detected object X as change point information. If the position information of the object X included in the received dynamic information and the position information of the object X obtained by its own object recognition processing deviate by a predetermined threshold or more, the control unit 51 uses the image data of the object X and the difference value between the two pieces of position information as change point information.
- (Information example a2) Change point information related to the own vehicle: when the position information of the own vehicle included in the received dynamic information deviates by a predetermined threshold or more from the vehicle position calculated from the GPS signals, the control unit 51 uses the difference value between the two as change point information. Likewise, when the direction of the own vehicle included in the received dynamic information deviates by a predetermined threshold or more from the direction grasped by the own vehicle, the control unit 51 uses the difference value between the two as change point information.
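The threshold rule used in the information examples above (report a difference only when it reaches a predetermined threshold) can be sketched as follows; the 2-D Euclidean distance metric and the field name are illustrative assumptions.

```python
import math

def change_point(received_pos, measured_pos, threshold_m=1.0):
    """Return change point information when the received position and the
    locally measured position deviate by at least the threshold; otherwise
    return None (nothing to report). Illustrative sketch only."""
    dx = measured_pos[0] - received_pos[0]
    dy = measured_pos[1] - received_pos[1]
    if math.hypot(dx, dy) >= threshold_m:
        return {"diff": (dx, dy)}  # the difference value is reported
    return None                    # below threshold: no change point

print(change_point((0.0, 0.0), (0.3, 0.4)))  # None (0.5 m < 1 m)
print(change_point((0.0, 0.0), (3.0, 4.0)))  # {'diff': (3.0, 4.0)}
```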
- when generating the change point information as described above, the control unit 51 generates a communication packet addressed to the edge server 3 that includes the generated change point information and the vehicle ID of the own vehicle.
- the transmission process of the change point information is a process of transmitting the communication packet including the change point information to the edge server 3.
- the transmission process of the change point information is performed within the dynamic information distribution cycle by the edge server 3.
- based on the dynamic information received from the edge server 3 or the like, the control unit 51 can also execute driving support control such as outputting a warning to the occupant on the display 56 during driving or performing forced brake intervention.
- FIG. 4 is a block diagram illustrating an example of an internal configuration of the pedestrian terminal 70 (the communication terminal 1B).
- the pedestrian terminal 70 in FIG. 4 is a wireless communication device capable of performing communication processing conforming to, for example, 5G. Therefore, the pedestrian terminal 70 can communicate with the edge server 3 as a kind of mobile terminal belonging to the slice S3. Further, the pedestrian terminal 70 can also communicate with the core server 4 as a kind of mobile terminal belonging to the slice S4.
- pedestrian terminal 70 includes a control unit 71, a storage unit 72, a display unit 73, an operation unit 74, and a communication unit 75.
- the communication unit 75 is a communication interface that performs wireless communication with the base station 2 of the carrier that provides the 5G service.
- the communication unit 75 converts the RF signal from the base station 2 into a digital signal and outputs the digital signal to the control unit 71.
- the communication unit 75 converts the digital signal input from the control unit 71 into an RF signal, and transmits the RF signal to the base station 2.
- the control unit 71 includes a CPU, a ROM, and a RAM.
- the control unit 71 reads out and executes the program stored in the storage unit 72, and controls the entire operation of the pedestrian terminal 70.
- the storage unit 72 includes a hard disk, a nonvolatile memory, and the like, and stores various computer programs and data.
- the storage unit 72 stores a mobile ID, which is identification information of the pedestrian terminal 70.
- the mobile ID is, for example, a unique user ID or MAC address of a carrier contractor.
- the storage unit 72 stores various application software arbitrarily installed by the user.
- the application software stored in the storage unit 72 includes, for example, application software for using an information providing service that receives the dynamic information of the map M1 through communication with the edge server 3 (or the core server 4).
- the operation unit 74 includes various operation buttons and a touch panel function of the display unit 73.
- the operation unit 74 outputs an operation signal according to a user operation to the control unit 71.
- the display unit 73 is, for example, a liquid crystal display.
- the display unit 73 presents various information to the user. For example, the display unit 73 displays the image data of the information maps M1 and M2 transmitted from the servers 3 and 4 on a screen.
- the control unit 71 has a time synchronization function of acquiring the current time from GPS signals, a position detection function of measuring the current position (latitude, longitude, and altitude) of the own terminal from the GPS signals, and an azimuth detection function of measuring the direction of the pedestrian 7 using a direction sensor.
- the control unit 71 can execute the following processes in communication with the edge server 3 (or the core server 4):
  1) Request message transmission processing
  2) Dynamic information reception processing
  3) Change point information generation processing
  4) Change point information transmission processing
- the request message transmission process is a process of transmitting, to the edge server 3, a control packet requesting distribution of the dynamic information of the map M1 sequentially updated by the edge server 3.
- the control packet includes the mobile ID of the pedestrian terminal 70.
- the edge server 3 distributes the dynamic information to the communication terminal 1B of the pedestrian 7 having the transmission source mobile ID at a predetermined distribution cycle.
- the dynamic information receiving process is a process of receiving dynamic information distributed by the edge server 3 to the own device.
- the change point information generation process by the control unit 71 compares the received dynamic information with the sensor information of the own terminal at the time of reception, calculates the change between the two based on the comparison result, and generates change point information, which is information on the difference between the two.
- the change point information generated by the control unit 71 is, for example, the following information example.
- (Change point information regarding the own pedestrian) When the position information of the own pedestrian 7 included in the received dynamic information deviates by a predetermined threshold or more from the position of the own pedestrian 7 calculated from the GPS signals, the control unit 71 uses the difference between the two as change point information. When the direction of the own pedestrian 7 included in the received dynamic information deviates by a predetermined threshold or more from the direction of the own pedestrian 7 measured by the direction sensor, the control unit 71 uses the difference value between the two as change point information.
- when generating the change point information as described above, the control unit 71 generates a communication packet addressed to the edge server 3 that includes the generated change point information and the mobile ID of the own terminal 70.
- the transmission process of the change point information is a process of transmitting the communication packet including the change point information to the edge server 3.
- the transmission process of the change point information is performed within the dynamic information distribution cycle by the edge server 3.
- control unit 71 transmits the state information including the position and orientation information of the own terminal 70 to the edge server 3 by performing the change point information generation processing and the change point information transmission processing.
- FIG. 5 is a block diagram illustrating an example of an internal configuration of the roadside sensor 8 including the wireless communication device as the communication terminal 1C.
- roadside sensor 8 includes a control unit 81, a storage unit 82, a roadside camera 83, a radar sensor 84, and a communication unit 85.
- the communication unit 85 is the above-described communication terminal 1C, that is, a wireless communication device capable of performing communication processing conforming to, for example, 5G. Therefore, the roadside sensor 8 can communicate with the edge server 3 as a kind of fixed terminal belonging to the slice S3. The roadside sensor 8 can also communicate with the core server 4 as a kind of fixed terminal belonging to the slice S4.
- the control unit 81 includes a CPU, a ROM, and a RAM.
- the control unit 81 reads and executes the program stored in the storage unit 82, and controls the entire operation of the roadside sensor 8.
- the storage unit 82 includes a hard disk, a nonvolatile memory, and the like, and stores various computer programs and data. Further, the storage unit 82 stores a sensor ID which is identification information of the roadside sensor 8. The sensor ID is, for example, a user ID or a MAC address unique to the owner of the roadside sensor 8.
- the roadside camera 83 is an image sensor that captures an image of a predetermined shooting area.
- the roadside camera 83 may be a single eye or a compound eye.
- the radar sensor 84 is a sensor that detects an object existing in the shooting area using a millimeter-wave radar, a LiDAR method, or the like.
- the control unit 81 transmits the captured video data and the like to the computer device of the security administrator.
- the control unit 81 transmits the captured image data and the like to the traffic control center.
- the control unit 81 has a function of performing an object recognition process of recognizing an object in the shooting area based on the measurement data of at least one of the roadside camera 83 and the radar sensor 84, and a distance measurement process of calculating the distance to the recognized object.
- the control unit 81 can calculate the position information of the object recognized by the object recognition process from the distance calculated by the distance measurement process and the sensor position of its own device.
- the control unit 81 can execute the following processes in communication with the edge server 3 (or the core server 4):
  1) Change point information generation processing
  2) Change point information transmission processing
- the change point information generation process in the roadside sensor 8 compares, for each predetermined measurement cycle (for example, the distribution cycle of the dynamic information by the edge server 3), the previous sensor information with the current sensor information, calculates the change between the two based on the comparison result, and generates change point information, which is information on the difference between the two.
- the change point information generated by the roadside sensor 8 is, for example, the following information example b1.
- when the control unit 81 detects, by the current object recognition process, an object Y (a moving object such as a vehicle or a pedestrian, or an obstacle) that was not detected in the previous object recognition process, it uses the image data and the position information of the detected object Y as change point information.
- if the position information of the object Y obtained in the previous object recognition process and the position information of the object Y obtained in the current object recognition process deviate by a predetermined threshold or more, the control unit 81 uses the image data of the object Y and the difference value between the two as change point information.
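The frame-to-frame comparison of information example b1 can be sketched as follows: objects absent from the previous cycle are reported as new, and objects whose position shifted by at least the threshold are reported with their difference value. Names and the distance metric are illustrative assumptions.

```python
# Illustrative sketch of the roadside sensor's change point generation:
# recognized objects from the previous measurement cycle are compared
# with those from the current cycle (all names are assumptions).
def frame_changes(prev, curr, threshold_m=1.0):
    changes = []
    for oid, pos in curr.items():
        if oid not in prev:
            changes.append({"id": oid, "kind": "new", "position": pos})
        else:
            dx = pos[0] - prev[oid][0]
            dy = pos[1] - prev[oid][1]
            if (dx * dx + dy * dy) ** 0.5 >= threshold_m:
                changes.append({"id": oid, "kind": "moved", "diff": (dx, dy)})
    return changes

prev = {"Y1": (0.0, 0.0)}
curr = {"Y1": (0.2, 0.1), "Y2": (5.0, 5.0)}  # Y1 barely moved, Y2 is new
print(frame_changes(prev, curr))             # only Y2 is reported
```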
- when generating the change point information as described above, the control unit 81 generates a communication packet addressed to the edge server 3 that includes the generated change point information and the sensor ID of its own device.
- the transmission process of the change point information is a process of transmitting the communication packet including the change point information to the edge server 3.
- the transmission process of the change point information is performed within the dynamic information distribution cycle by the edge server 3.
- FIG. 6 is an overall configuration diagram of the information providing system according to the present embodiment.
- the information providing system according to the present embodiment includes a large number of vehicles 5, pedestrian terminals 70, and roadside sensors 8 scattered in the relatively wide service area (real world) of the edge server 3, and the edge server 3, which can perform low-delay wireless communication with these communication nodes via the base station 2 by 5G-compliant communication and the like and which functions as an information providing device. That is, the information providing system is configured to include part or all of the above-described wireless communication system.
- the moving objects existing in the service area of the edge server 3 include vehicles 5 capable of wireless communication by being equipped with the communication terminal 1A and the in-vehicle device 50, pedestrians 7 carrying the pedestrian terminal 70, and pedestrians 7 who have no wireless communication function because they do not carry the pedestrian terminal 70.
- the edge server 3 collects the above-mentioned change point information at a predetermined cycle from the in-vehicle device 50 of the vehicle 5 in the service area, the pedestrian terminal 70, the roadside sensor 8, and the like (step S31).
- the edge server 3 integrates the collected change point information by map matching (integration processing), and updates the dynamic information of the information map M1 being managed (step S32).
- the edge server 3 transmits the latest dynamic information to the requesting communication node (step S33).
- the vehicle 5 that has received the dynamic information can utilize the dynamic information for driving assistance of a passenger and the like.
- the edge server 3 may transmit the map M1 updated in step S32 to the requesting communication node as dynamic information.
- the vehicle 5 that has received the dynamic information detects change point information from the sensor information of the own vehicle based on the dynamic information, and transmits the detected change point information to the edge server 3 (step S34).
- thus, the cycle of collection of change point information (step S31) → update of dynamic information (step S32) → distribution of dynamic information (step S33) → detection and transmission of change point information by the vehicle 5 (step S34) is repeated.
- FIG. 6 illustrates an information providing system including only one edge server 3, but the information providing system may include a plurality of edge servers 3, and may include one or more core servers 4 instead of, or in addition to, the edge server 3.
- the information map M1 managed by the edge server 3 may be any map as long as at least dynamic information of an object is superimposed on map information such as a digital map. This is the same for the information map M2 of the core server.
- FIG. 7 is a sequence diagram illustrating an example of a dynamic information update process and a distribution process performed by cooperation of the pedestrian terminal 70, the vehicle-mounted device 50 of the vehicle 5, the roadside sensor 8, and the edge server 3.
- in the following description, the execution subjects are the pedestrian terminal 70, the in-vehicle device 50 of the vehicle 5, the roadside sensor 8, and the edge server 3, but the actual execution subjects are their control units 71, 51, 81, and 31. U1, U2, ... in FIG. 7 are the distribution cycles of the dynamic information.
- when receiving a request message for dynamic information from the pedestrian terminal 70 and the on-vehicle device 50 of the vehicle 5 (step S1), the edge server 3 distributes the latest dynamic information at the time of reception to the pedestrian terminal 70 and the on-vehicle device 50 of the vehicle 5 that transmitted the request (step S2).
- in step S1, the edge server 3 may analyze the received request message and transmit the dynamic information only when the information indicating the request source included in the message indicates a communication terminal 1 registered in advance.
- if, in step S1, there is a request message from only one of the pedestrian terminal 70 and the in-vehicle device 50, then in step S2 the dynamic information is distributed only to the communication terminal that transmitted the request message.
- when the pedestrian terminal 70 that has received the dynamic information distributed in step S2 generates change point information within the distribution cycle U1 (step S3), it transmits the generated change point information to the edge server 3 (step S6).
- when the in-vehicle device 50 that has received the dynamic information distributed in step S2 generates change point information from the comparison result between the dynamic information and its own sensor information within the distribution cycle U1 (step S4), it transmits the generated change point information to the edge server 3 (step S6).
- when the roadside sensor 8 generates change point information from its own sensor information within the distribution cycle U1, it transmits the generated change point information to the edge server 3 (step S6).
- when the edge server 3 receives the change point information from the pedestrian terminal 70, the in-vehicle device 50, and the roadside sensor 8 within the distribution cycle U1, it updates the dynamic information so as to reflect the change point information (step S7), and distributes the updated dynamic information to the pedestrian terminal 70 and the in-vehicle device 50 (step S8).
- when only the in-vehicle device 50 generates change point information within the distribution cycle U1, only the change point information generated by the in-vehicle device 50 in step S4 is transmitted to the edge server 3 (step S6), and the dynamic information is updated to reflect only that change point information (step S7). If none of the pedestrian terminal 70, the in-vehicle device 50, and the roadside sensor 8 generates change point information within the distribution cycle U1, the processes of steps S3 to S7 are not performed, and the same dynamic information as in the previous transmission (step S2) is distributed to the pedestrian terminal 70 and the in-vehicle device 50 (step S8). Thus, the edge server 3 updates the dynamic information in step S7 based on the change point information transmitted within the distribution cycle U1.
- when the pedestrian terminal 70 that has received the dynamic information distributed in step S8 generates change point information within the distribution cycle U2 (step S9), it transmits the generated change point information to the edge server 3 (step S12).
- when the in-vehicle device 50 that has received the dynamic information distributed in step S8 generates change point information from the comparison result between the dynamic information and its own sensor information within the distribution cycle U2 (step S10), it transmits the generated change point information to the edge server 3 (step S12).
- when the roadside sensor 8 generates change point information from its own sensor information within the distribution cycle U2, it transmits the generated change point information to the edge server 3 (step S12).
- when receiving the change point information from the vehicle-mounted device 50 and the roadside sensor 8 within the distribution cycle U2, the edge server 3 updates the dynamic information so as to reflect the change point information (step S13), and distributes the updated dynamic information to the pedestrian terminal 70 and the in-vehicle device 50 (step S14). Thus, the edge server 3 updates the dynamic information in step S13 based on the change point information transmitted within the distribution cycle U2.
- the processing after step S14 is repeated in the same sequence as above until a request message to stop the distribution of dynamic information is received from both the pedestrian terminal 70 and the vehicle 5, or until communication with the pedestrian terminal 70 and the vehicle 5 is interrupted.
- the information providing system has a function of providing a collision prediction result to the pedestrian terminals 70 and the in-vehicle devices 50 of the vehicles 5 mounted on one or more moving objects located in the service area.
- The collision prediction result is information indicating the result of predicting whether or not a moving object equipped with the in-vehicle device 50 or the pedestrian terminal 70 will collide with another moving object.
- The collision prediction result includes information such as the attribute of the moving object predicted to collide, the direction from which that moving object approaches, and the collision prediction time.
- FIG. 8 is a functional block diagram of the edge server 3 showing a function for providing a collision prediction result according to the first embodiment.
- the control unit 31 of the edge server 3 functionally includes a calculation unit 31a, a determination unit 31b, a notification unit 31c, and a detection unit 31d. Each of these functions is realized by the control unit 31 executing a program stored in the storage unit 34.
- Based on the dynamic information map M1, the calculation unit 31a has a function of performing a collision prediction for each mobile terminal (pedestrian terminal 70 or in-vehicle device 50) mounted on a moving object (pedestrian 7 or vehicle 5) located in the service area represented by the dynamic information map M1, and of obtaining an evaluation value of the collision prediction result.
- the calculation unit 31a refers to the mobile object database 34a (described later) in which information obtained from the dynamic information map M1 is registered, and obtains an evaluation value for a collision prediction result.
- The determination unit 31b has a function of determining, based on the evaluation value, whether or not to notify the mobile terminal (the pedestrian terminal 70 or the in-vehicle device 50) of the collision prediction result for the moving object equipped with that terminal.
- the notifying unit 31c has a function of notifying the mobile terminal of a collision prediction result based on the determination result of the determining unit 31b.
- The detection unit 31d has a function of detecting the plurality of moving objects (pedestrians 7, vehicles 5) whose position information is included in the dynamic information of the dynamic information map M1, and of generating moving object information indicating the status of each moving object.
- the storage unit 34 stores a mobile object database 34a and an evaluation value database 34b in addition to the above-described dynamic information map M1.
- FIG. 9 is a diagram illustrating an example of the mobile object database 34a.
- mobile object information generated by the detection unit 31d is registered in the mobile object database 34a.
- The mobile object database 34a is managed and updated by the detection unit 31d.
- The detection unit 31d refers to the dynamic information map M1, and when position information of a new moving object (pedestrian 7, vehicle 5) is registered in the dynamic information, assigns a moving object ID to that moving object, generates moving object information corresponding to the moving object ID, and registers it in the moving object database 34a.
- the detection unit 31d detects a moving object whose position information is registered in the dynamic information map M1, assigns a moving object ID to each moving object, and generates moving object information.
- the moving object information includes information such as information on the presence or absence of a communication function, a vehicle ID (mobile ID), attribute information of the moving object, position information, azimuth information indicating a moving direction, and speed information indicating a moving speed.
- the attribute information is information indicating an attribute of each moving object.
- the attributes of the moving object include a plurality of attributes such as an attribute related to the appearance of the moving object and an attribute related to the behavior of the moving object. Note that the attribute information is distinguished between a vehicle and a pedestrian.
- the attribute information includes information indicating the attribute content of each of the plurality of attributes in the moving object.
- The mobile object database 34a has columns for registering the moving object ID, information on the presence or absence of a communication function, the vehicle ID (mobile ID), attribute information of the moving object, position information, azimuth information, and speed information. Each piece of information in the moving object database 34a is registered in association with the moving object ID.
- the detection unit 31d (acquisition unit) refers to the dynamic information map M1 and generates moving body information of each moving body.
- Of the moving object information, the detection unit 31d acquires the information on the presence or absence of a communication function, the vehicle ID (mobile ID), and the position information as they are from the dynamic information map M1 and registers them in the moving object database 34a.
- the detecting unit 31d calculates the azimuth information and the speed information based on the change over time of the position information of each moving object included in the dynamic information map M1.
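The derivation of azimuth and speed from successive positions can be sketched as follows. This is an illustrative assumption of how the detection unit 31d might compute these values from two timestamped latitude/longitude samples; the function name and the equirectangular approximation are not from the embodiment:

```python
import math

def heading_and_speed(lat1, lon1, t1, lat2, lon2, t2):
    """Estimate azimuth (degrees clockwise from north) and speed (m/s)
    from two timestamped positions, using an equirectangular
    approximation that is adequate over the short distances a moving
    object covers within one distribution cycle."""
    R = 6_371_000.0                               # mean Earth radius, metres
    lat_mid = math.radians((lat1 + lat2) / 2)
    dx = math.radians(lon2 - lon1) * R * math.cos(lat_mid)  # eastward metres
    dy = math.radians(lat2 - lat1) * R                      # northward metres
    azimuth = math.degrees(math.atan2(dx, dy)) % 360        # 0 deg = north
    speed = math.hypot(dx, dy) / (t2 - t1)
    return azimuth, speed
```

For example, a moving object that advances about 100 m due north in 10 s yields an azimuth near 0 degrees and a speed near 10 m/s.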
- FIGS. 10 and 11 are diagrams showing the part of the mobile object database 34a in which attribute information is registered.
- FIG. 10 shows a column for registering attribute information when the moving object is a vehicle.
- FIG. 11 shows a column for registering attribute information when the moving object is a pedestrian.
- the attribute information of the vehicle includes an appearance attribute and an action attribute.
- the appearance attributes include four types of attributes of “size”, “shape”, “color”, and “display license plate”. The contents of these attributes are set in advance, and the attribute contents (appearance information) corresponding to the target vehicle are registered in association with the moving object ID from among a plurality of preset attribute contents.
- the attribute content in “size” includes large, medium, and small. Buses and trucks are determined to be large, light vehicles are determined to be small, and other general passenger vehicles are determined to be medium.
- the attribute contents in the “shape” include a sedan, a coupe, a wagon, a minivan, an SUV, a convertible car, and a truck.
- the attribute contents of “color” include high brightness and low brightness.
- The attribute content in the “display of license plate” includes the place name displayed on the license plate attached to the target vehicle 5 (the place name of the transport branch office having jurisdiction over the vehicle's place of use).
- the detection unit 31d refers to the image data of the moving object captured by the camera or the like included in the dynamic information map M1, and acquires the information on the appearance attribute of the vehicle 5 described above.
- The behavior attribute includes a past traveling history. More specifically, the behavior attributes include four types of attributes: “number of past lane changes”, “number of past sudden steering or sudden braking operations”, “number of lane changes after warning”, and “number of sudden steering or sudden braking operations after warning”.
- the “number of past lane changes” indicates the number of times the target vehicle 5 has changed lanes during a fixed period in the past.
- The “number of past sudden steering or sudden braking operations” indicates the number of times that the target vehicle 5 has performed a sudden steering operation or a sudden braking operation in a fixed past period.
- the “number of lane changes after warning” indicates the number of times the vehicle 5 has changed lanes after a warning is issued to the in-vehicle device 50 of the target vehicle 5 by the edge server 3 described below.
- The “number of sudden steering or sudden braking operations after warning” indicates the number of times the vehicle 5 has performed a sudden steering or sudden braking operation after the warning.
- The detection unit 31d has a function of storing past dynamic information (the position information included therein) of each moving object in the storage unit 34 and of generating, based on that past dynamic information, trajectory information of each moving object that has moved in the past.
- The trajectory information also includes speed information of the moving object. Therefore, in the case of the vehicle 5, it is possible to detect from the trajectory information that a sudden steering or sudden braking operation has been performed. It is also possible to detect from the trajectory information that a lane change has been performed.
- The detection unit 31d has a function of detecting lane changes and sudden steering or sudden braking operations by each vehicle 5 based on the trajectory information of each vehicle 5, and of counting the number of times each vehicle 5 has changed lanes and the number of times it has performed a sudden steering or sudden braking operation.
- The edge server 3 has a function of issuing a warning to the in-vehicle device 50 based on the number of lane changes and the number of sudden steering or sudden braking operations counted by the detection unit 31d.
- The edge server 3 issues the warning to the in-vehicle device 50 of a vehicle that has performed lane changes or sudden steering or sudden braking operations a certain number of times or more within a predetermined period.
- For the in-vehicle device 50 that has been notified of the warning, the detection unit 31d also counts the number of times the vehicle 5 has changed lanes and the number of times it has performed a sudden steering or sudden braking operation after the warning.
- The detection unit 31d acquires, as behavior information, the “number of past lane changes”, the “number of past sudden steering or sudden braking operations”, the “number of lane changes after warning”, and the “number of sudden steering or sudden braking operations after warning”.
- the attribute information of the pedestrian also includes an appearance attribute and an action attribute, like the attribute information of the vehicle.
- The appearance attributes include four types of attributes: “height”, “clothes”, “earphones or looking aside”, and “crutch or wheelchair”.
- The contents of the attributes “clothes”, “earphones or looking aside”, and “crutch or wheelchair” are set in advance, and the attribute contents (appearance information) corresponding to the target pedestrian are registered from among the plurality of preset attribute contents.
- the height of the pedestrian is registered in the attribute content of “height”.
- The attribute content in “clothes” is either student or non-student. When the attribute content is student, it indicates that the pedestrian is wearing a school uniform or school hat; when it is non-student, it indicates that the pedestrian's clothes are not a school uniform.
- The attribute content in “earphones or looking aside” is either applicable or not applicable. When applicable, it indicates that the pedestrian is wearing earphones or looking at a terminal such as a smartphone while walking; when not applicable, it indicates that the pedestrian is walking normally.
- The attribute content in “crutch or wheelchair” is either applicable or not applicable. When applicable, it indicates that the pedestrian is using a crutch or a wheelchair; when not applicable, it indicates that the pedestrian is walking normally.
- The behavior attributes include three types of attributes: “walking place”, “ignore signal”, and “walking locus”.
- the attribute content of “walking place” includes a road, a sidewalk, and others. When the attribute content of “walking place” is a road, it indicates that the target pedestrian 7 is walking on the road. When the attribute content in “walking place” is a sidewalk, it indicates that the target pedestrian 7 is walking on a sidewalk or a crosswalk. If the attribute content of “walking place” is “other”, it indicates that the target pedestrian 7 is walking in a building or the like.
- The attribute content in “ignore signal” is either applicable or not applicable. When applicable, it indicates that the pedestrian is currently ignoring a traffic signal; when not applicable, it indicates that the pedestrian is not.
- The attribute content in the “walking locus” is either straight or meandering. When meandering, it indicates that the most recent walking locus of the target pedestrian 7 meanders; when straight, it indicates that the most recent walking locus is straight.
- The detection unit 31d obtains the information (behavior information) on each attribute of each pedestrian by referring to the image data of the moving object captured by a camera or the like included in the dynamic information map M1 and to the above-described trajectory information. Note that when the dynamic information map M1 does not include image data of a moving object (vehicle or pedestrian), the detection unit 31d can use the image data and the like included in the change point information for updating the dynamic information map M1. The detection unit 31d generates attribute information of each moving object based on the acquired information on the attributes.
- The detection unit 31d repeatedly generates the moving object information of each moving object and registers it in the moving object database 34a, updating the database as needed. As a result, the moving object information registered in the moving object database 34a is kept up to date.
- the evaluation value database 34b in FIG. 8 is a database for registering the evaluation value of the collision prediction result obtained by the calculation unit 31a.
- the evaluation value database 34b will be described later.
- FIG. 12 is a flowchart illustrating an example of the calculation process of the evaluation value of the collision prediction result by the calculation unit 31a.
- the calculation unit 31a reads the mobile object database 34a (step S51), and specifies a mobile object having a communication function as an evaluation target (step S52).
- the collision prediction result can be provided to a moving object having the pedestrian terminal 70 or the in-vehicle device 50. Therefore, the calculation unit 31a specifies a moving object having the pedestrian terminal 70 and the in-vehicle device 50 as an evaluation target for obtaining an evaluation value. That is, the evaluation target indicates a mobile terminal for which an evaluation value is obtained by the calculation unit 31a.
- The evaluation target is a pedestrian 7 or a vehicle 5, that is, a moving object carrying the pedestrian terminal 70 or equipped with the in-vehicle device 50, or the pedestrian terminal 70 or in-vehicle device 50 itself carried by or mounted on such a moving object.
- A vehicle 5 in which a person carrying the pedestrian terminal 70 rides is also included in the evaluation target.
- the calculation unit 31a performs an evaluation value calculation process (Step S53).
- the calculation unit 31a sequentially executes the processing on each of the specified evaluation targets, and repeats the processing until the processing is performed on all of the evaluation targets.
- the calculation unit 31a proceeds to step S55, and performs an individual calculation process.
- The calculation unit 31a obtains the evaluation value of the evaluation target in the individual calculation process (step S55).
- FIG. 13 is a flowchart illustrating an example of the individual calculation process in step S55 in FIG.
- The calculation unit 31a specifies the moving objects other than the evaluation target (the other moving objects) (step S61), and calculates, for each of the other moving objects, a collision prediction time between the evaluation target (the target moving object) and that moving object (step S62).
- other moving objects include moving objects equipped with the pedestrian terminal 70 or the vehicle-mounted device 50.
- The calculation unit 31a refers to the mobile object database 34a and obtains, based on the position information, azimuth information, and speed information of the evaluation target and the other moving objects, the predicted time at which the evaluation target would collide with each of the other moving objects. When the position information and the azimuth information show no possibility of collision, the calculation unit 31a sets the collision prediction time to an extremely large predetermined value (for example, 5 minutes). The calculation unit 31a also sets the collision prediction time to the predetermined value when the calculated collision prediction time is equal to or longer than the predetermined value.
- the calculation unit 31a predicts a collision between the evaluation target and another moving body based on the dynamic information map M1, and obtains a collision prediction time.
- the calculation unit 31a specifies, as the collision prediction target, the moving object having the shortest collision prediction time among the collision prediction times calculated for the other moving objects (step S63).
- the calculation unit 31a specifies a moving object having the shortest collision prediction time as a collision prediction target.
- the calculation unit 31a performs a collision prediction on each of the other moving objects, and obtains a collision prediction result.
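The collision-time calculation and target selection of steps S61 to S63 can be sketched under a constant-velocity assumption. The function names, the planar coordinate model, and the 2 m proximity threshold below are illustrative assumptions; only the 5-minute cap on the predicted collision time comes from the text:

```python
import math

T_MAX = 300.0       # "extremely large" cap on the predicted collision time (5 min)
COLLIDE_DIST = 2.0  # assumed proximity (m) treated as a collision

def collision_time(p1, v1, p2, v2):
    """Predicted time until two constant-velocity moving objects come
    within COLLIDE_DIST of each other; T_MAX when they never do
    (no possibility of collision) or when the time reaches the cap."""
    rx, ry = p2[0] - p1[0], p2[1] - p1[1]   # relative position
    vx, vy = v2[0] - v1[0], v2[1] - v1[1]   # relative velocity
    vv = vx * vx + vy * vy
    if vv == 0.0:
        return T_MAX                         # no relative motion
    t_star = -(rx * vx + ry * vy) / vv       # time of closest approach
    if t_star <= 0.0:
        return T_MAX                         # already moving apart
    dmin = math.hypot(rx + vx * t_star, ry + vy * t_star)
    return min(t_star, T_MAX) if dmin <= COLLIDE_DIST else T_MAX

def collision_target(ego, others):
    """Step S63: among the other moving objects, pick the one with
    the shortest predicted collision time."""
    times = {oid: collision_time(ego["pos"], ego["vel"], o["pos"], o["vel"])
             for oid, o in others.items()}
    target = min(times, key=times.get)
    return target, times[target]
```

A moving object on a direct converging course therefore yields a short predicted time, while one on a diverging course is assigned the predetermined maximum.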
- the calculation unit 31a obtains a base value for obtaining an evaluation value of the collision prediction result based on the calculated collision prediction time (Step S64).
- the base value is a value serving as a basis for obtaining an evaluation value of the collision prediction result, and an evaluation value is obtained by adding an additional value to the base value in a later process.
- the calculation unit 31a obtains a base value for the predicted collision time according to the following rule.
- the evaluation value of the collision prediction result is set such that the larger the value, the higher the possibility of collision and the higher the need for notification. Accordingly, the base value is also set such that the larger the value, the higher the possibility of collision and the higher the need for notification.
- the evaluation value of the collision prediction result is a value indicating a possibility that the collision prediction target collides with a moving object other than the collision prediction target.
- the calculation unit 31a sets “0” as a base value for a moving object other than the collision prediction target.
- the added value is not added to the base value of the moving object other than the collision prediction target by the subsequent processing. Therefore, the evaluation value of the collision prediction result of the moving object other than the collision prediction target (evaluation value of the collision prediction result between the target moving object and the moving object other than the collision prediction target) is “0”.
- the calculation unit 31a obtains the base value for each moving object based on the predicted collision time.
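As a minimal sketch of step S64, one could use a staircase rule like the following. The concrete thresholds and values are invented for illustration; the text specifies only that shorter predicted collision times give larger base values, that the time is capped at a predetermined value, and that moving objects other than the collision prediction target get a base value of 0:

```python
T_MAX = 300.0  # predetermined cap on the predicted collision time (5 min)

def base_value(t_collision, is_target):
    """Hypothetical monotone rule mapping the predicted collision time
    (seconds) to a base value: the shorter the time, the larger the
    value. Non-target moving objects always get 0."""
    if not is_target or t_collision >= T_MAX:
        return 0
    if t_collision <= 5.0:
        return 100   # imminent
    if t_collision <= 15.0:
        return 50
    if t_collision <= 60.0:
        return 20
    return 10        # distant but possible
```

The evaluation value is then obtained in later steps by adding attribute-dependent values to this base.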
- Next, the calculation unit 31a determines whether there is a blind spot factor that causes a blind spot between the evaluation target and the collision prediction target (step S65).
- the presence or absence of a blind spot factor between the evaluation target and the collision prediction target is determined by the calculation unit 31a by referring to the dynamic information map M1.
- the calculation unit 31a refers to the dynamic information map M1, and determines whether there is a building or another moving object that blocks the line of sight between the evaluation target and the collision prediction target. When such a building or another moving object exists between the evaluation target and the collision prediction target, the calculation unit 31a determines that there is a blind spot factor between the evaluation target and the collision prediction target. On the other hand, if there is no obstacle between the evaluation target and the collision prediction target, the calculation unit 31a determines that there is no blind spot factor.
- If it is determined in step S65 that there is a blind spot factor between the evaluation target and the collision prediction target, the calculation unit 31a adds an addition value to the base value obtained in step S64 (step S66) and proceeds to step S67. On the other hand, when it is determined that there is no blind spot factor between the evaluation target and the collision prediction target (step S65), the calculation unit 31a proceeds directly to step S67.
- Since the evaluation value is set such that a larger value indicates a greater factor impairing the safety of the evaluation target, the calculation unit 31a adds to the base value when a blind spot factor exists.
- the added value added to the base value in step S66 is, for example, "100". This addition value is an example and is not limited to this. The same applies to the following addition values.
- the calculation unit 31a determines the presence or absence of a blind spot factor that causes a blind spot between the evaluation target and the collision prediction target, and adds the determination result of the presence or absence of the blind spot factor to the evaluation value.
- the presence or absence of a blind spot factor can be reflected in the evaluation value, and can be reflected in the execution determination of providing information to the evaluation target described later.
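One way the blind spot determination of step S65 could be realized is by a line-of-sight test against obstacles taken from the dynamic information map M1. Modeling obstacles as circular footprints in a plane is an illustrative simplification; the embodiment says only that buildings and other moving objects between the two points are detected:

```python
import math

def blocked_by(p_a, p_b, obstacle_center, obstacle_radius):
    """True if the line of sight (segment p_a to p_b) passes through a
    circular obstacle such as a building or parked-vehicle footprint."""
    ax, ay = p_a
    bx, by = p_b
    cx, cy = obstacle_center
    abx, aby = bx - ax, by - ay
    denom = abx * abx + aby * aby
    # Parameter of the point on the segment closest to the obstacle center.
    t = 0.0 if denom == 0 else max(0.0, min(1.0, ((cx - ax) * abx + (cy - ay) * aby) / denom))
    dist = math.hypot(ax + t * abx - cx, ay + t * aby - cy)
    return dist <= obstacle_radius

def has_blind_spot(p_eval, p_target, obstacles):
    """Step S65: a blind spot factor exists when any obstacle blocks the
    line of sight between the evaluation target and the collision
    prediction target (triggering the addition of step S66)."""
    return any(blocked_by(p_eval, p_target, c, r) for c, r in obstacles)
```

When `has_blind_spot` returns true, the addition value (for example, “100”) would be added to the base value as in step S66.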
- Next, the calculation unit 31a performs an addition process of adding a value to the base value according to the attribute contents of the collision prediction target (step S67).
- FIG. 14 is a flowchart illustrating an example of the addition process for the collision prediction target in step S67.
- the calculation unit 31a determines whether (the moving object of) the collision prediction target is a vehicle (step S70). If it is determined that the collision prediction target is the vehicle 5, the calculation unit 31a proceeds to step S71, executes the vehicle addition processing, and ends the processing.
- FIG. 15 is a flowchart showing an example of the vehicle addition process in FIG. 14.
- the calculation unit 31a first refers to the attribute information of the collision prediction target (vehicle 5) in the mobile object database 34a.
- the calculation unit 31a performs addition regarding “size” of the attribute information (step S81).
- An addition value is set in advance for each of the attribute contents of “size”, that is, large, medium, and small.
- The calculation unit 31a adds “50” to the base value when the attribute content of “size” is large, “30” when it is medium, and “20” when it is small. Generally, it is considered that the larger the vehicle 5, the higher the possibility of collision; therefore, the addition value is set to increase as the size of the vehicle 5 increases. In this way, the calculation unit 31a adds the addition value corresponding to the “size” of the collision prediction target to the base value (step S81).
- the calculation unit 31a performs addition regarding “shape” in the attribute information (step S82).
- An addition value is set in advance for each of the attributes of the “shape”, that is, the sedan, coupe, wagon, minivan, SUV, open car, and truck. For example, when there is a difference in the accident rate or the like according to the shape of the vehicle 5, a larger added value can be set for the attribute content having a high accident rate. For example, when the attribute content in the “shape” is coupe, open car, and truck, the addition value “30” is added, and when the attribute content is sedan, wagon, minivan, and SUV, the addition value “10” is added. As described above, the calculation unit 31a adds the added value corresponding to the “shape” of the collision prediction target to the base value (Step S82).
- the calculation unit 31a performs addition regarding “color” in the attribute information (step S83).
- An addition value is set in advance for each of the high brightness and the low brightness, which are the attribute contents of “color”. For example, when there is a difference in the accident rate or the like between the high brightness and the low brightness, it is possible to set the attribute content having the high accident rate to have a larger addition value. For example, when the attribute content of “color” is high lightness, the added value is not added, and when the attribute content is “low lightness”, the added value “10” is added. As described above, the calculation unit 31a adds the addition value corresponding to the “color” of the collision prediction target to the base value (Step S83).
- The calculation unit 31a determines whether the place name display in the “display of license plate” of the attribute information is non-local (step S84). If the place name displayed on the license plate differs from the place name of the transport branch office having jurisdiction over the current position of the vehicle 5, the calculation unit 31a determines that the place name display is non-local. If it is determined in step S84 that the place name display on the license plate is non-local, the calculation unit 31a adds the addition value to the base value (step S85) and proceeds to step S86. On the other hand, if it is determined in step S84 that the place name display on the license plate is local, the calculation unit 31a proceeds directly to step S86.
- Steps S81 to S85 are processing relating to appearance attributes, and step S86 is processing relating to action attributes.
- the calculation unit 31a performs addition on the past travel history that is the action attribute of the attribute information (step S86).
- A corresponding addition value is set in advance for each of the “number of past lane changes”, “number of past sudden steering or sudden braking operations”, “number of lane changes after warning”, and “number of sudden steering or sudden braking operations after warning” included in the behavior attribute.
- a larger added value is set as the number of times, which is the attribute content, increases.
- the calculation unit 31a adds the addition value according to the past traveling history to the base value (step S86), and ends the addition processing.
- Here, the case where an addition value according to the past traveling history is added has been illustrated; however, an addition value according to the current traveling state, such as the current traveling speed, may be further added to the base value.
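The vehicle addition process of steps S81 to S86 can be sketched as a sum over attribute-dependent values. The size, shape, and color values are taken from the embodiment where stated; the license-plate weight and the per-event behavior weights are assumptions, since the text gives no concrete numbers for them:

```python
# Addition values from the embodiment where stated; see comments for assumptions.
SIZE_ADD = {"large": 50, "medium": 30, "small": 20}
SHAPE_ADD = {"coupe": 30, "open_car": 30, "truck": 30,
             "sedan": 10, "wagon": 10, "minivan": 10, "suv": 10}

def vehicle_addition(attrs):
    """Steps S81 to S86: sum of addition values for a vehicle's
    appearance and behavior attributes."""
    add = SIZE_ADD.get(attrs["size"], 0)                       # step S81
    add += SHAPE_ADD.get(attrs["shape"], 0)                    # step S82
    add += 10 if attrs["color"] == "low_brightness" else 0     # step S83
    add += 20 if not attrs["local_plate"] else 0               # steps S84-S85, assumed weight
    # Step S86 behavior attributes: larger counts yield larger additions;
    # the per-event weights below are assumptions.
    add += 5 * attrs["past_lane_changes"]
    add += 10 * attrs["past_hard_maneuvers"]
    add += 10 * attrs["lane_changes_after_warning"]
    add += 20 * attrs["hard_maneuvers_after_warning"]
    return add
```

For example, a large, low-brightness truck with a non-local plate and no recorded behavior events would contribute 50 + 30 + 10 + 20 = 110 under these assumed weights.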
- When it is determined in step S70 that the collision prediction target is not the vehicle 5, the calculation unit 31a proceeds to step S72, executes a pedestrian addition process, and ends the process.
- FIG. 16 is a flowchart showing an example of the pedestrian addition process in FIG. 14.
- the calculation unit 31a first refers to the attribute information of the collision prediction target (pedestrian 7) in the mobile object database 34a.
- The calculation unit 31a determines whether the “height” in the attribute information is equal to or less than a predetermined value (step S91). If it is determined in step S91 that the “height” is equal to or less than the predetermined value, the calculation unit 31a adds the addition value to the base value (step S92) and proceeds to step S93. On the other hand, if the “height” exceeds the predetermined value, the calculation unit 31a proceeds directly to step S93.
- the predetermined value is set to, for example, 120 cm.
- In step S91, it can be determined whether the pedestrian is a child of about lower elementary school age or younger. When the pedestrian is such a child, an addition value can be added. For example, the addition value in step S92 is “20”.
- Next, the calculation unit 31a determines whether the attribute content of “clothes” is student (step S93). If so, the calculation unit 31a adds the addition value to the base value (step S94) and proceeds to step S95; otherwise, it proceeds directly to step S95. In steps S93 and S94, an addition value can be added when the pedestrian is a student. For example, the addition value in step S94 is “10”.
- Next, the calculation unit 31a determines whether the attribute content in “earphones or looking aside” is applicable (step S95). If so, the calculation unit 31a adds the addition value to the base value (step S96) and proceeds to step S97; if not applicable, it proceeds directly to step S97.
- In steps S95 and S96, an addition value can be added when the pedestrian is wearing earphones or looking at a terminal such as a smartphone while walking. For example, the addition value in step S96 is “10”.
- Next, the calculation unit 31a determines whether the attribute content of “crutch or wheelchair” is applicable (step S97). If so, the calculation unit 31a adds the addition value to the base value (step S98) and proceeds to step S99; if not applicable, it proceeds directly to step S99.
- In steps S97 and S98, an addition value can be added when the pedestrian is using a crutch or a wheelchair. For example, the addition value in step S98 is “30”. Steps S91 to S98 relate to appearance attributes, and the processing from step S99 relates to behavior attributes.
- The calculation unit 31a determines whether the attribute content of the “walking place” is road (step S99). If so, the calculation unit 31a adds the addition value to the base value (step S100) and proceeds to step S101; otherwise, it proceeds directly to step S101. In steps S99 and S100, an addition value can be added when the pedestrian is walking on a road. For example, the addition value in step S100 is “30”.
- the calculation unit 31a determines whether the walking speed is lower than a predetermined value (Step S101).
- the determination as to whether the walking speed of the pedestrian is lower than the predetermined value can be made by referring to the moving object information of the moving object database 34a.
- The predetermined value to be compared with the walking speed is set to, for example, 3.6 km/h as a general pedestrian speed.
- If it is determined in step S101 that the walking speed is lower than the predetermined value, the calculation unit 31a adds the addition value to the base value (step S102) and proceeds to step S103. On the other hand, if the walking speed is not lower than the predetermined value, the calculation unit 31a proceeds directly to step S103.
- through steps S101 and S102, the added value can be added when the walking speed of the pedestrian is low. For example, the added value in step S102 is “10”.
- next, the calculation unit 31a determines whether or not the attribute content of “ignore signal” is applicable (step S103). If it is determined in step S103 that the attribute content of “ignore signal” is applicable, the calculation unit 31a adds the added value to the base value (step S104) and proceeds to step S105. On the other hand, if it is determined in step S103 that the attribute content of “ignore signal” is not applicable, the calculation unit 31a proceeds directly to step S105. Through steps S103 and S104, the added value can be added when the pedestrian ignores the signal. For example, the added value in step S104 is “50”.
- next, the calculation unit 31a determines whether the attribute content of the “walking locus” is meandering (step S105). If it is determined in step S105 that the attribute content of the “walking locus” is meandering, the calculation unit 31a adds the added value to the base value (step S106) and ends the addition processing. On the other hand, if it is determined in step S105 that the attribute content of the “walking locus” is not meandering (is a straight line), the calculation unit 31a ends the addition processing. Through steps S105 and S106, the added value can be added when the pedestrian is meandering, for example because the pedestrian is intoxicated. For example, the added value in step S106 is “50”.
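- the pedestrian addition processing of steps S97 to S106 described above can be sketched as follows. This is a minimal illustration, not the patented implementation: the function name, dictionary keys, and default values are assumptions, while the added values “30”, “30”, “10”, “50”, and “50” follow the examples given above.

```python
# Hypothetical sketch of the pedestrian addition processing (steps S97-S106).
# Dictionary keys and the function name are illustrative assumptions.

GENERAL_WALKING_SPEED_KMH = 3.6  # predetermined value compared in step S101

def add_pedestrian_attribute_values(base_value, attributes):
    """Return base_value plus the added values for a pedestrian's attributes."""
    value = base_value
    # Appearance attribute (steps S97-S98): crutch or wheelchair.
    if attributes.get("crutch_or_wheelchair"):
        value += 30
    # Action attributes (step S99 onward).
    if attributes.get("walking_place") == "roadway":     # steps S99-S100
        value += 30
    if attributes.get("walking_speed_kmh",
                      GENERAL_WALKING_SPEED_KMH) < GENERAL_WALKING_SPEED_KMH:
        value += 10                                       # steps S101-S102
    if attributes.get("ignore_signal"):                   # steps S103-S104
        value += 50
    if attributes.get("walking_locus") == "meandering":   # steps S105-S106
        value += 50
    return value
```

For example, a pedestrian walking slowly on the roadway while ignoring the signal would have 30 + 10 + 50 added to the base value under this sketch.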
- as described above, the calculation unit 31a performs the addition processing after distinguishing between a vehicle and a pedestrian (FIG. 14).
- next, the calculation unit 31a proceeds to step S68 and performs addition processing that adds a value to the base value according to the attribute contents of the evaluation target (step S68).
- the addition processing in step S68 is the same as the addition processing in step S67 except that it is performed on the evaluation target, so its description is omitted here.
- the calculation unit 31a ends the individual calculation processing.
- in this way, the calculation unit 31a adds to the base value an added value (appearance attribute value) corresponding to the attribute contents of the appearance attributes of the evaluation target and the collision prediction target, and an added value (action attribute value) corresponding to the attribute contents of their action attributes. As a result, the appearance and the past or current behavior of the evaluation target and the collision prediction target can be reflected in the evaluation value. In other words, factors that affect the collision prediction and that appear in the appearance and behavior of the evaluation target and the collision prediction target can be reflected in the evaluation value.
- next, the calculation unit 31a proceeds to step S57 and registers, in the evaluation value database 34b, the sum of the base value and the added values added in the individual calculation processing as the evaluation value of the collision prediction result (step S57).
- the calculation unit 31a sequentially performs steps S55 and S57 in FIG. 12 for each specified evaluation target, and repeats the processing until all the evaluation targets have been processed. After obtaining the evaluation values for all the specified evaluation targets and registering them in the evaluation value database 34b, the calculation unit 31a returns to step S51 and repeats the same processing.
- in this way, the calculation unit 31a repeats the calculation processing, repeatedly calculating and registering the evaluation value of each specified evaluation target, and updates the registered contents of the evaluation value database 34b as needed.
- FIG. 17 is a diagram illustrating an example of the evaluation value database 34b. As shown in FIG. 17, in the evaluation value database 34b, a moving object ID to be evaluated and evaluation values of moving objects other than the evaluation object (other moving objects) are registered in association with each other.
- the evaluation value database 34b is provided with a column of a mobile unit ID to be evaluated, a column of a vehicle ID (mobile ID), and a column of evaluation values of mobile units other than the evaluation target.
- in the column of the moving object ID of the evaluation target, the moving object ID of the moving object specified as the evaluation target is registered.
- in the column of the vehicle ID (mobile ID), the vehicle ID (mobile ID) of the mobile terminal (pedestrian terminal 70 or in-vehicle device 50) included in the moving object to be evaluated is registered.
- the column of the evaluation value of the mobile unit other than the evaluation target includes a column of the mobile unit ID of each mobile unit other than the evaluation target.
- the evaluation values of the plurality of moving objects are registered in the column of the evaluation values of the moving objects other than the evaluation target.
- the evaluation value for the moving object other than the collision prediction target is set to “0”. Therefore, a moving object other than the evaluation target whose evaluation value is registered with a value other than “0” is a collision prediction target to be evaluated.
- in the example of FIG. 17, for the evaluation target with the moving object ID “1006”, “500” is registered as the evaluation value in the column corresponding to the moving object ID “1004”, and “0” is registered as the other evaluation values. From this, it can be seen that the moving object with the moving object ID “1004” is specified as the collision prediction target of the evaluation target with the moving object ID “1006”. In this manner, by referring to the evaluation value database 34b, the collision prediction target of each evaluation target can be specified.
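- the lookup described above can be sketched as follows, assuming, purely as an illustration and not as the actual schema, that the evaluation value database 34b is represented as a nested dictionary keyed by moving object IDs; a moving object whose registered evaluation value is other than “0” is a collision prediction target of the evaluation target.

```python
# Illustrative sketch of the evaluation value database 34b (FIG. 17).
# The nested-dictionary layout and IDs follow the "1006"/"1004" example.

evaluation_value_db = {
    # evaluation-target moving object ID -> {other moving object ID: evaluation value}
    "1006": {"1001": 0, "1004": 500, "1005": 0},
}

def collision_prediction_targets(db, target_id):
    """Moving object IDs whose registered evaluation value is other than 0."""
    return [other_id for other_id, value in db.get(target_id, {}).items()
            if value != 0]
```

For example, querying the sketch above for the evaluation target “1006” yields only “1004”, matching the example of FIG. 17.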
- FIG. 18 is a flowchart illustrating an example of the determination processing by the determination unit 31b.
- the determination unit 31b refers to the evaluation value database 34b and determines whether there is an evaluation target whose evaluation value is equal to or greater than a preset threshold (notification determination threshold) (step S110). If it is determined in step S110 that there is no evaluation target whose evaluation value is equal to or greater than the notification determination threshold, the determination unit 31b repeats step S110. Therefore, the determination unit 31b repeats step S110 until it determines that there is an evaluation target whose evaluation value is equal to or greater than the notification determination threshold.
- if it is determined in step S110 that there is an evaluation target whose evaluation value is equal to or greater than the notification determination threshold, the determination unit 31b proceeds to step S111, determines to notify that evaluation target of the collision prediction result of its collision prediction target, and then returns to step S110.
- the determination unit 31b refers to the evaluation value of each evaluation target and, if there are a plurality of evaluation targets whose evaluation values are equal to or greater than the notification determination threshold, determines to notify all of those evaluation targets of the collision prediction results of their collision prediction targets. As described above, the determination unit 31b determines, for each evaluation target, whether to notify that evaluation target of its collision prediction result based on the evaluation value.
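- the determination processing of steps S110 and S111 can be sketched as follows; the nested-dictionary representation of the evaluation value database 34b and the threshold value are assumptions made purely for illustration.

```python
# Minimal sketch of the determination processing (FIG. 18, steps S110-S111).
# The database layout and the threshold value are illustrative assumptions.

NOTIFY_THRESHOLD = 300  # notification determination threshold (assumed value)

def targets_to_notify(evaluation_value_db, threshold=NOTIFY_THRESHOLD):
    """Evaluation targets for which some collision prediction target has an
    evaluation value equal to or greater than the threshold (step S110)."""
    notify = []
    for target_id, others in evaluation_value_db.items():
        if any(value >= threshold for value in others.values()):
            notify.append(target_id)  # notification is determined (step S111)
    return notify
```

Under this sketch, every evaluation target whose registered evaluation values include one at or above the threshold is selected for notification, matching the description that all such evaluation targets are notified.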
- whether or not to notify the collision prediction result is determined for each evaluation target based on the evaluation value of each of the evaluation targets (the pedestrian terminal 70, the in-vehicle device 50, or the moving object including these). Therefore, necessary information can be appropriately provided to each evaluation target.
- since the dynamic information map M1, on which the dynamic information on a plurality of moving objects is superimposed, is used to obtain the evaluation value of the collision prediction result of each evaluation target, collision prediction can be performed even for moving objects that include no mobile terminal, and a collision prediction result appropriately predicted for each evaluation target can be represented as an evaluation value.
- the notification unit 31c notifies each evaluation target for which the determination unit 31b has determined that notification should be made of the collision prediction result between that evaluation target and its collision prediction target. Thereby, information can be provided only to the mobile terminals for which it is determined, based on the evaluation value, that the collision prediction result is necessary.
- the collision prediction result notified to the evaluation target by the notifying unit 31c includes the attribute of the collision prediction target, the direction in which the collision prediction target approaches, the collision prediction time, and the like.
- the mobile terminal to be evaluated (the pedestrian terminal 70 and the in-vehicle device 50) that has received the notification by the notification unit 31c outputs the notified collision prediction result to the user of the mobile terminal.
- the evaluation value for determining whether or not to notify the collision prediction result is determined based on the collision prediction time, the appearance attribute value, and the action attribute value.
- as a result, the appearance and behavior of the evaluation target and the collision prediction target can be reflected in the determination of whether or not to notify the result. For example, for a collision prediction concerning a moving object whose appearance or behavior suggests a high possibility of collision, the evaluation value can be weighted in the direction of notifying the collision prediction result, so that the appearance and behavior of the moving object can be taken into account in determining the necessity of notification. As a result, the necessity of notification can be appropriately determined, and a collision prediction result with a high need for notification can be appropriately provided to the mobile terminal.
- FIG. 19 is a diagram showing a situation around an intersection according to scenario 1.
- the pedestrians 7A and 7B get off the vehicle 5C parked at the position immediately after passing the pedestrian crossing P and cross the pedestrian crossing P.
- the pedestrian 7A is an adult
- the pedestrian 7B is a child (having a height of 120 cm or less).
- it is assumed that the light color of the signal of the pedestrian crossing P is blinking blue.
- the traffic light on the route on which the vehicle 5A runs is red, but the traffic light on the pedestrian crossing P is blinking blue.
- the vehicle 5A approaching the pedestrian crossing P recognizes that its own signal will soon turn blue. It is therefore assumed that the vehicle 5A is trying to pass through the intersection without stopping. It is also assumed that the pedestrians 7A and 7B cannot see the vehicle 5A due to the presence of the vehicle 5C. Further, the vehicle 5B is running behind the vehicle 5A.
- the vehicles 5A, 5B, 5C and the pedestrians 7A, 7B are registered as moving objects in the dynamic information map M1 and the moving object database 34a by the system.
- the vehicles 5A and 5B have the in-vehicle device 50 mounted thereon, and the pedestrians 7A and 7B carry the pedestrian terminal 70.
- the edge server 3 specifies the pedestrians 7A and 7B as the collision prediction targets of the vehicle 5A, and specifies the vehicle 5A as the collision prediction targets of the pedestrians 7A and 7B.
- the attribute contents of the pedestrians 7A and 7B are “child” (pedestrian 7B only) and “ignore signal”.
- a vehicle 5C that is a blind spot factor exists between the vehicle 5A and the pedestrians 7A and 7B.
- when calculating the evaluation value when the vehicle 5A is the evaluation target, the edge server 3 obtains the base value from the collision prediction time, and adds to the base value the added value obtained by the addition processing for the vehicle 5, the added value based on the blind spot factor, and the added value obtained by the addition processing for the pedestrians 7A and 7B, to obtain the evaluation value. At this time, in the addition processing for the pedestrians 7A and 7B, the added values based on the height of the pedestrian and ignoring the signal are added.
- in the addition processing for the vehicle 5, added values are applied as appropriate according to the attribute contents of the appearance attributes and action attributes of each vehicle 5. The same applies to the vehicles 5 appearing in the following description.
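- as a hedged worked example of how the evaluation value is composed in this scenario: only the ignore-signal added value (“50”, from step S104) is taken from the text above, while the base value and the other added values are assumed figures chosen purely for illustration.

```python
# Hypothetical composition of the scenario 1 evaluation value for the
# vehicle 5A as the evaluation target. All figures except the ignore-signal
# added value are assumptions for illustration only.

base_value = 100          # obtained from the collision prediction time (assumed figure)
blind_spot_value = 50     # the vehicle 5C hides the pedestrians 7A and 7B (assumed figure)
child_height_value = 30   # pedestrian 7B has a height of 120 cm or less (assumed figure)
ignore_signal_value = 50  # the pedestrians ignore the signal (added value from step S104)

evaluation_value = (base_value + blind_spot_value
                    + child_height_value + ignore_signal_value)
```

With these figures the evaluation value is 230, so a notification determination threshold set below that value would trigger notification of the collision prediction result.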
- the notification determination threshold used for determining whether or not to notify a collision prediction result is set to be smaller than an evaluation value in a case where factors that impair safety overlap as in this scenario. Therefore, in the case of the present scenario, the evaluation values of the pedestrians 7A and 7B in the vehicle 5A are larger than the notification determination threshold. For this reason, the edge server 3 notifies the vehicle 5A of the collision prediction result of the pedestrians 7A and 7B in the vehicle 5A.
- the edge server 3 performs the addition processing on both the evaluation target and the collision prediction target when obtaining the evaluation value. Therefore, the evaluation value when the pedestrians 7A and 7B are the evaluation targets is the same as the evaluation value when the vehicle 5A is the evaluation target. Accordingly, the evaluation value of the vehicle 5A for the pedestrians 7A and 7B becomes larger than the notification determination threshold, and the edge server 3 notifies the pedestrians 7A and 7B of the collision prediction result of the vehicle 5A for the pedestrians 7A and 7B.
- the edge server 3 specifies the vehicle 5A as a collision prediction target of the vehicle 5B.
- the evaluation value is set to a large value.
- the edge server 3 notifies the vehicle 5B of the collision prediction result of the vehicle 5A in the vehicle 5B.
- the edge server 3 can appropriately provide necessary information to each mobile terminal.
- the in-vehicle devices 50 of the vehicles 5A and 5B and the pedestrian terminals 70 of the pedestrians 7A and 7B output a collision prediction result to the user of the own device based on the notification from the edge server 3.
- on the output screen V1 of the in-vehicle device 50 of the vehicle 5A, a display D1 indicating that a pedestrian appears from the right side of the pedestrian crossing P in front of the own vehicle, an arrow D2 indicating the traveling direction of the pedestrian, and the like are displayed. Thereby, the edge server 3 can make the user of the vehicle 5A recognize in advance that a pedestrian will appear from the right side of the pedestrian crossing P ahead.
- although the pedestrians 7A and 7B are displayed without being individually identified, the control unit 51 of the vehicle 5A may, for example, vary the display method depending on whether the attribute of the pedestrian 7 is a child or an adult. This is because more attention is required when the pedestrian is a child.
- the output mode of the collision prediction result may be controlled to be different depending on the attribute contents of the appearance attribute and the action attribute in the collision prediction target.
- information can be output to the user in an output mode according to the characteristics of the attribute of the collision prediction target.
- on the output screen V2 of the pedestrian terminal 70 of the pedestrians 7A and 7B, a display D3 indicating that the vehicle 5A appears from the left side of the pedestrian crossing P ahead, an arrow D4 indicating the traveling direction of the vehicle 5A, and the like are displayed. Thereby, the edge server 3 can make the pedestrians 7A and 7B recognize in advance that the vehicle will appear from the left side of the pedestrian crossing P in front.
- the edge server 3 can make the user of the vehicle 5B recognize in advance that the vehicle 5A traveling ahead stops and approaches.
- in this way, the edge server 3 can avoid collisions between the moving objects by causing the in-vehicle devices 50 of the vehicles 5A and 5B and the pedestrian terminals 70 of the pedestrians 7A and 7B to output the collision prediction results to their users.
- the output of the collision prediction result by the pedestrian terminal 70 and the in-vehicle device 50 is controlled by the output control units of the control unit 71 of the pedestrian terminal 70 and the control unit 51 of the in-vehicle device 50.
- if the control unit 31 of the edge server 3 has an output control unit that can control the output of the pedestrian terminal 70 and the in-vehicle device 50 to the user, the output control unit of the edge server 3 may control the output to the user by the pedestrian terminal 70 and the in-vehicle device 50.
- FIG. 19 shows the case where the position and the traveling direction of the collision prediction target are displayed on the output screens V1, V2, and V3, but the collision prediction time may also be displayed together with them.
- the edge server 3 does not notify the collision prediction result to the vehicle 5A and the pedestrians 7A and 7B.
- however, since the added value is added according to the attribute contents of the vehicle 5A, there are also cases where the evaluation value exceeds the notification determination threshold.
- for example, the added value is added based on the size, shape, color, license plate display, and past traveling history of the vehicle 5A. Therefore, even in a case where the pedestrians 7A and 7B are specified as the collision prediction targets of the vehicle 5A but the collision prediction result would otherwise not be notified, the edge server 3 may notify the pedestrians 7A and 7B and the vehicle 5A of the collision prediction result depending on the attributes of the vehicle 5A.
- the attribute of the vehicle 5A can be taken into account in determining the necessity of the notification of the collision prediction result.
- the necessity of notification of the collision prediction result can be appropriately determined, and the notification of the collision prediction result can be appropriately performed.
- FIG. 20 is a diagram showing a situation around an intersection according to scenario 2.
- the settings of the vehicles 5A and 5B and the pedestrians 7A and 7B are the same as in the scenario 1. Further, the scenario 2 does not include the vehicle 5C of the scenario 1.
- the pedestrians 7A and 7B are crossing the pedestrian crossing P at a speed lower than a general walking speed (3.6 km per hour).
- the light color of the signal of the pedestrian crossing P is flashing blue. In the middle of the pedestrian crossing P, it is assumed that the light color of the signal of the pedestrian crossing P changes to red, and the pedestrians 7A and 7B slowly cross the pedestrian crossing P at the same walking speed.
- the edge server 3 specifies the pedestrians 7A and 7B as the collision prediction targets of the vehicle 5A, and specifies the vehicle 5A as the collision prediction targets of the pedestrians 7A and 7B.
- when obtaining the evaluation value when the vehicle 5A is the evaluation target, the edge server 3 obtains the base value from the collision prediction time, and adds to the base value the added value obtained by the addition processing for the vehicle 5 and the added value obtained by the addition processing for the pedestrians 7A and 7B, to obtain the evaluation value. At this time, in the addition processing for the pedestrians 7A and 7B, the added values based on the height of the pedestrian (120 cm or less), the walking speed of the pedestrian, and ignoring the signal are added. As a result, the evaluation values of the pedestrians 7A and 7B for the vehicle 5A become larger than the notification determination threshold, and the edge server 3 notifies the vehicle 5A of the collision prediction results of the pedestrians 7A and 7B for the vehicle 5A. Further, the edge server 3 notifies the pedestrians 7A and 7B of the result of the collision prediction of the vehicle 5A for the pedestrians 7A and 7B.
- the edge server 3 specifies the vehicle 5A as a collision prediction target of the vehicle 5B.
- the evaluation value is set to a large value.
- the edge server 3 notifies the vehicle 5B of the collision prediction result of the vehicle 5A in the vehicle 5B.
- the in-vehicle devices 50 of the vehicles 5A and 5B and the pedestrian terminals 70 of the pedestrians 7A and 7B output the collision prediction result to the user of the own device based on the notification from the edge server 3.
- the edge server 3 can make the user of the vehicle 5A recognize in advance that a pedestrian appears from the right side of the pedestrian crossing P in front.
- the edge server 3 can cause the pedestrians 7A and 7B to recognize in advance that a vehicle will appear from the left side of the pedestrian crossing P in front.
- in this way, the edge server 3 can avoid collisions between the moving objects by causing the in-vehicle devices 50 of the vehicles 5A and 5B and the pedestrian terminals 70 of the pedestrians 7A and 7B to output the collision prediction results to their users.
- FIG. 21 is a diagram showing a situation around an intersection according to scenario 3.
- the pedestrians 7A and 7B are walking on the sidewalk H, there is an obstacle G1 on the sidewalk H, and the pedestrians 7A and 7B step out onto the roadway while avoiding the obstacle G1.
- the pedestrian 7A is an adult, and the pedestrian 7B is a child.
- the edge server 3 specifies the pedestrians 7A and 7B as the collision prediction targets of the vehicle 5A, and specifies the vehicle 5A as the collision prediction targets of the pedestrians 7A and 7B.
- the attribute contents of the pedestrians 7A and 7B are “child” (pedestrian 7B only) and “not walking on the sidewalk”.
- when obtaining the evaluation values of the pedestrians 7A and 7B when the vehicle 5A is the evaluation target, the edge server 3 obtains the base value from the collision prediction time and adds the added value obtained by the addition processing for the pedestrians 7A and 7B to the base value to obtain the evaluation value.
- an added value based on the height (120 cm or less) of the pedestrian and the walking place (roadway) of the pedestrian is added.
- the edge server 3 notifies the vehicle 5A of the collision prediction result of the pedestrians 7A and 7B in the vehicle 5A. Further, the edge server 3 notifies the pedestrians 7A and 7B of the result of the collision prediction of the vehicle 5A in the pedestrians 7A and 7B.
- the edge server 3 specifies the vehicle 5A as a collision prediction target of the vehicle 5B. Further, a vehicle 5D which is a blind spot factor exists between the vehicle 5B and the vehicle 5A.
- when calculating the evaluation value when the vehicle 5B is the evaluation target, the edge server 3 adds an added value based on the presence or absence of a blind spot factor to the base value obtained from the collision prediction time. When the evaluation value of the vehicle 5A for the vehicle 5B is larger than the notification determination threshold, the edge server 3 notifies the vehicle 5B of the collision prediction result of the vehicle 5A for the vehicle 5B.
- the edge server 3 specifies the vehicle 5B as a collision prediction target of the vehicle 5C.
- the evaluation value is set to a large value.
- the edge server 3 notifies the collision prediction result of the vehicle 5B in the vehicle 5C to the vehicle 5C.
- the edge server 3 causes the in-vehicle devices 50 of the vehicles 5A, 5B, 5C and the pedestrian terminals 70 of the pedestrians 7A, 7B to output to the user, thereby avoiding collision between the moving objects. Can be.
- FIG. 22 is a diagram showing a situation around an intersection according to scenario 4.
- the pedestrians 7A and 7B are walking on the sidewalk H, pass between the stopped vehicle 5C and the vehicle 5D, and cross the road at a place other than the pedestrian crossing.
- the pedestrian 7A is an adult, and the pedestrian 7B is a child.
- the edge server 3 specifies the pedestrians 7A and 7B as the collision prediction targets of the vehicle 5A, and specifies the vehicle 5A as the collision prediction targets of the pedestrians 7A and 7B.
- the attribute contents of the pedestrians 7A and 7B are “child” (pedestrian 7B only), “not walking on the sidewalk”, and “not walking on the pedestrian crossing”.
- a vehicle 5D which is a blind spot factor exists between the vehicle 5A and the pedestrians 7A and 7B.
- when calculating the evaluation values of the pedestrians 7A and 7B when the vehicle 5A is the evaluation target, the edge server 3 obtains the base value from the collision prediction time, and adds to the base value the added value based on the blind spot factor and the added value obtained by the addition processing for the pedestrians 7A and 7B, to obtain the evaluation value. At this time, in the addition processing for the pedestrians 7A and 7B, an added value based on the height (120 cm or less) of the pedestrian and the walking place (roadway) of the pedestrian is added. As a result, the evaluation values of the pedestrians 7A and 7B for the vehicle 5A become larger than the notification determination threshold, and the edge server 3 notifies the vehicle 5A of the collision prediction results of the pedestrians 7A and 7B for the vehicle 5A. Further, the edge server 3 notifies the pedestrians 7A and 7B of the result of the collision prediction of the vehicle 5A for the pedestrians 7A and 7B.
- the edge server 3 specifies the vehicle 5A as a collision prediction target of the vehicle 5B.
- the evaluation value is set to a large value.
- the edge server 3 notifies the vehicle 5B of the collision prediction result of the vehicle 5A in the vehicle 5B.
- if the in-vehicle device 50 having the in-vehicle camera 59 is mounted on the vehicle 5 that the pedestrians pass between, the pedestrians can also be detected by the in-vehicle camera 59.
- FIG. 23 is a diagram showing a situation around an intersection according to scenario 5.
- the pedestrians 7A and 7B are walking in a meandering manner on the sidewalk H.
- the pedestrian 7A is an adult, and the pedestrian 7B is a child.
- the edge server 3 specifies the pedestrians 7A and 7B as the collision prediction targets of the vehicle 5A, and specifies the vehicle 5A as the collision prediction targets of the pedestrians 7A and 7B.
- the attribute contents of the pedestrians 7A and 7B are “child” (pedestrian 7B only), and when the pedestrians stray off onto the roadway, “not walking on the sidewalk” also applies.
- when obtaining the evaluation value when the vehicle 5A is the evaluation target, the edge server 3 obtains the base value from the collision prediction time and adds the added value obtained by the addition processing for the pedestrians 7A and 7B to the base value to obtain the evaluation value.
- an added value based on the height (120 cm or less) of the pedestrian, the walking locus (meandering) of the pedestrian, and the walking place (roadway) is added.
- the edge server 3 notifies the vehicle 5A of the collision prediction result of the pedestrians 7A and 7B in the vehicle 5A. Further, the edge server 3 notifies the pedestrians 7A and 7B of the result of the collision prediction of the vehicle 5A in the pedestrians 7A and 7B.
- the edge server 3 specifies the vehicle 5A as a collision prediction target of the vehicle 5B.
- the edge server 3 obtains the evaluation value when the vehicle 5B is the evaluation target from the collision prediction time with the vehicle 5A. When the evaluation value of the vehicle 5A for the vehicle 5B is larger than the notification determination threshold, the edge server 3 notifies the vehicle 5B of the collision prediction result of the vehicle 5A for the vehicle 5B.
- FIG. 24 is a diagram showing a situation around an intersection according to scenario 6.
- pedestrians 7A and 7B are crossing a pedestrian crossing P that crosses a route R1.
- it is assumed that the light color of the signal of the pedestrian crossing P is blinking blue, that it then turns red, and that the pedestrians 7A and 7B hurry across the pedestrian crossing P.
- the pedestrian 7A is an adult, and the pedestrian 7B is a child.
- the vehicle 5A enters the intersection from the route R2, turns left, and travels in the intersection toward the route R1.
- the vehicle 5B is running following the vehicle 5A. Since the light color of the signal on the pedestrian crossing P changes from blinking blue to red, the light color of the traffic signal ahead of the vehicles 5A and 5B also changes from blue through yellow to red. Therefore, the vehicles 5A and 5B are also in a hurry to pass through the intersection. Further, there is a building G2 between the route R2 and the route R1, at a corner of the intersection, which blocks the line of sight between the route R1 and the route R2.
- the edge server 3 specifies the pedestrians 7A and 7B as the collision prediction targets of the vehicle 5A, and specifies the vehicle 5A as the collision prediction target of the pedestrians 7A and 7B.
- the attribute contents of the pedestrians 7A and 7B are “child” (pedestrian 7B only) and “ignore signal”.
- a building G2 that is a blind spot factor exists between the vehicle 5A and the pedestrians 7A and 7B.
- when obtaining the evaluation value when the vehicle 5A is the evaluation target, the edge server 3 obtains the base value from the collision prediction time, and adds to the base value the added value obtained by the addition processing for the vehicle 5, the added value based on the blind spot factor, and the added value obtained by the addition processing for the pedestrians 7A and 7B, to obtain the evaluation value.
- at this time, in the addition processing for the pedestrians 7A and 7B, the added values based on the height of the pedestrian (120 cm or less) and ignoring the signal are added.
- the edge server 3 notifies the vehicle 5A of the collision prediction result of the pedestrians 7A and 7B in the vehicle 5A. Further, the edge server 3 notifies the pedestrians 7A and 7B of the prediction result of the collision prediction of the vehicle 5A in the pedestrians 7A and 7B.
- the edge server 3 specifies the vehicle 5A as a collision prediction target of the vehicle 5B.
- the evaluation value is set to a large value.
- the edge server 3 notifies the vehicle 5B of the collision prediction result of the vehicle 5A in the vehicle 5B.
- the in-vehicle devices 50 of the vehicles 5A and 5B and the pedestrian terminals 70 of the pedestrians 7A and 7B output the collision prediction result to the user of the own device based on the notification from the edge server 3.
- the edge server 3 can make the user of the vehicle 5A recognize in advance that the pedestrian is crossing the pedestrian crossing P.
- the edge server 3 can make the pedestrians 7A and 7B recognize in advance that the vehicle 5A will appear from the right side of the pedestrian crossing P.
- in this way, the edge server 3 can avoid collisions between the moving objects by causing the in-vehicle devices 50 of the vehicles 5A and 5B and the pedestrian terminals 70 of the pedestrians 7A and 7B to output the collision prediction results to their users.
- FIG. 25 is a diagram illustrating the storage unit 34 of the edge server 3 according to the second embodiment.
- the storage unit 34 of the present embodiment stores a dynamic information map M1, a moving object database 34a, an evaluation value database 34b, and a relative distance database 34c.
- in the relative distance database 34c, the relative distance between the moving objects registered in the moving object database 34a is registered.
- the calculation unit 31a of the present embodiment differs from that of the first embodiment in that, in addition to the individual calculation processing that obtains the evaluation value using the movement information (position information, azimuth information, and speed information) of the evaluation target (target moving object) and the moving objects other than the evaluation target (other moving objects), it further performs addition processing based on the relative distance between the moving objects registered in the relative distance database 34c.
- the configuration other than the configuration described below is the same as that of the first embodiment.
- FIG. 26 is a diagram illustrating an example of the relative distance database 34c.
- the relative distance between the moving objects registered in the moving object database 34a is registered in the relative distance database 34c.
- a relative distance from another moving object ID is registered for each moving object ID.
- a past relative distance is also registered in the relative distance database 34c.
- In the relative distance database 34c, for example, as the relative distance between the moving body ID “1001” and the moving body ID “1002”, “132 m” is registered as the relative distance obtained from the most recently updated position information, “152 m” as the relative distance obtained from the position information at the previous update, and “172 m” as the relative distance obtained from the position information at the update two times before. That is, in the relative distance database 34c, a plurality of relative distances arranged in chronological order at each update interval are registered.
- The detection unit 31d calculates the relative distances between the moving bodies and registers them in the relative distance database 34c. The detection unit 31d updates the relative distance database 34c as needed, at the same time as it updates the moving object database 34a.
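- The chronological registration described above can be illustrated with a minimal Python sketch. The patent specifies no implementation; the class name, the coordinate-pair representation of positions, and the fixed history depth of three entries are assumptions made here for illustration only:

```python
from collections import defaultdict, deque
from math import hypot

class RelativeDistanceDB:
    """Sketch of the relative distance database 34c: per-pair history of
    relative distances, newest first (latest, previous, two updates before)."""

    def __init__(self, history_len=3):
        # One bounded history per unordered pair of moving-body IDs.
        self.history = defaultdict(lambda: deque(maxlen=history_len))

    @staticmethod
    def _key(id_a, id_b):
        # The pair (A, B) and (B, A) share one row.
        return tuple(sorted((id_a, id_b)))

    def update(self, id_a, pos_a, id_b, pos_b):
        # Called together with each moving object database update.
        d = hypot(pos_a[0] - pos_b[0], pos_a[1] - pos_b[1])
        self.history[self._key(id_a, id_b)].appendleft(d)

    def latest(self, id_a, id_b):
        h = self.history[self._key(id_a, id_b)]
        return h[0] if h else None

    def displacement(self, id_a, id_b):
        # Current relative distance minus the past one; negative = approaching.
        h = self.history[self._key(id_a, id_b)]
        if len(h) < 2:
            return None
        return h[0] - h[1]
```

For the example values in the text (172 m, then 152 m, then 132 m), `latest` returns 132 and `displacement` returns a negative value, indicating approach.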
- FIG. 27 is a flowchart illustrating an example of a calculation process performed by the calculation unit 31a according to the present embodiment.
- In the calculation process of the present embodiment, an addition process based on the relative distance (step S59) is performed after the individual calculation process in step S55. In step S59, a process of further adding an added value to each evaluation value registered in step S57 is performed. Therefore, step S59 will be mainly described here.
- FIG. 28 is a flowchart showing an example of the adding process based on the relative distance in step S59 in FIG.
- In the addition process based on the relative distance, the calculation unit 31a first specifies the moving bodies other than the evaluation target (step S120), and then, among those moving bodies, specifies as approaching targets the moving bodies whose relative distance to the target moving body is within 100 m and whose displacement of the relative distance is negative (step S121).
- the displacement of the relative distance is a value obtained by subtracting the past relative distance from the current relative distance.
- When the displacement of the relative distance is positive, the relative distance is increasing, which means that the evaluation target and the moving body other than the evaluation target are moving relatively away from each other.
- When the displacement of the relative distance is negative, the relative distance is decreasing, which means that the evaluation target and the moving body other than the evaluation target are moving relatively closer to each other.
- The calculation unit 31a obtains the displacement of the relative distance by subtracting the relative distance based on position information updated earlier than the latest from the relative distance based on the most recently updated position information. Alternatively, after this subtraction, the calculation unit 31a may further divide the result by the update interval of the relative distance database 34c to obtain the displacement of the relative distance per unit time, and use this as the displacement of the relative distance.
- In this way, the calculation unit 31a specifies as an approaching target a moving body that is located within 100 m of the target moving body and whose relative distance is decreasing so that it approaches the target moving body. That is, a moving body specified as an approaching target is a moving body that may collide with or contact the target moving body.
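- Step S121 can be sketched as follows. This is an illustrative sketch only: the function name and the callable accessors for the latest distance and its displacement are assumptions, not part of the embodiment; the 100 m threshold is taken from the text.

```python
def find_approaching_targets(target_id, other_ids, latest_distance, displacement,
                             threshold_m=100.0):
    """Step S121 (sketch): among the other moving bodies, pick those whose
    relative distance to the target is within the threshold and shrinking."""
    approaching = []
    for other_id in other_ids:
        d = latest_distance(target_id, other_id)    # latest relative distance
        delta = displacement(target_id, other_id)   # current minus past distance
        if d is not None and delta is not None and d <= threshold_m and delta < 0:
            approaching.append(other_id)            # within range and approaching
    return approaching
```

A moving body farther than 100 m, or one whose relative distance is constant or growing, is not selected.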
- After step S121, the calculation unit 31a performs an addition process on the evaluation value of each moving body specified as an approaching target (hereinafter also referred to as an approaching moving body) (step S122).
- FIG. 29 is a flowchart illustrating an example of an addition process for an evaluation value.
- the addition processing for the approaching moving body is processing for further adding an addition value to the evaluation value of the approaching moving body in the target moving body. Note that the evaluation value of the approaching moving body in the target moving body is obtained by an individual calculation process.
- the calculation unit 31a repeats the processing from step S131 to step S137 until all the approaching moving objects identified in step S121 are processed.
- In steps S132 to S136, the processing for the target moving body and one of the specified approaching moving bodies will be described.
- In step S132, the calculation unit 31a determines whether the relationship between the target moving body and the approaching moving body is pedestrian-to-pedestrian (step S132). If so, the calculation unit 31a proceeds to step S137 without performing the addition process on the approaching moving body. That is, when the relationship between the target moving body and the approaching moving body is pedestrian-to-pedestrian, the addition process is not performed.
- If it is determined in step S132 that the relationship between the target moving body and the approaching moving body is not pedestrian-to-pedestrian, the calculation unit 31a proceeds to step S134 and determines whether the relationship is vehicle-to-vehicle (step S134). If it is determined in step S134 that the relationship is not vehicle-to-vehicle, the calculation unit 31a proceeds to step S136 and performs a process corresponding to vehicle-to-pedestrian (or pedestrian-to-vehicle).
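- The branching of steps S132 to S136 can be sketched as a small dispatcher. The function name and the callable arguments are illustrative assumptions; only the pair-type branching itself is taken from the text:

```python
def added_value_for_pair(target_kind, approaching_kind, v2v_process, v2p_process):
    """Steps S132-S136 (sketch): select the addition process from the pair type.
    Kinds are 'vehicle' or 'pedestrian'; each process returns an added value."""
    if target_kind == "pedestrian" and approaching_kind == "pedestrian":
        return 0                  # S132: pedestrian-to-pedestrian -> no addition
    if target_kind == "vehicle" and approaching_kind == "vehicle":
        return v2v_process()      # S134 -> S135: vehicle-to-vehicle process
    return v2p_process()          # S134 -> S136: vehicle-to-pedestrian process
```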
- FIG. 30 is a flowchart illustrating an example of a process corresponding to vehicle-to-pedestrian or pedestrian-to-vehicle.
- In this process, one of the target moving body and the approaching moving body is a vehicle and the other is a pedestrian. Here, the case where the target moving body is a vehicle and the approaching moving body is a pedestrian will be described; the same processing is performed when the target moving body is a pedestrian and the approaching moving body is a vehicle.
- The calculation unit 31a refers to the attribute “walking place” of the pedestrian who is the approaching moving body, and determines whether its content is sidewalk or in a building (step S141).
- The detection unit 31d of the present embodiment acquires whether the pedestrian's walking place is a roadway, a sidewalk, other (in a building), or a pedestrian crossing, and registers one of roadway, sidewalk, other (in a building), and crosswalk as the content of the pedestrian's behavior attribute “walking place”.
- If it is determined in step S141 that the pedestrian is on the sidewalk or in a building, the calculation unit 31a ends the process without performing the addition process on the approaching moving body (pedestrian), and proceeds to step S137 (FIG. 29). When the pedestrian is on the sidewalk or in a building, the possibility that the pedestrian will collide with the vehicle as the target moving body is considered low, so the calculation unit 31a does not perform the addition process.
- If it is determined in step S141 that the content of “walking place” is neither sidewalk nor in a building, the calculation unit 31a refers to the attribute “speed” of the target moving body's vehicle and determines whether the speed of the vehicle is 5 km/h or less (step S143). When determining that the speed of the vehicle is 5 km/h or less, the calculation unit 31a ends the process without performing the addition process on the pedestrian who is the approaching moving body, and proceeds to step S137 (FIG. 29). The addition process is not performed in this case because, if the speed of the vehicle is 5 km/h or less, the possibility that the pedestrian will collide with the vehicle as the target moving body is considered low.
- If it is determined in step S143 that the speed of the vehicle is higher than 5 km/h, the calculation unit 31a refers to the attribute “walking place” of the pedestrian and determines whether its content is a pedestrian crossing (step S144).
- If it is determined in step S144 that the content of “walking place” is a pedestrian crossing, the calculation unit 31a proceeds to step S145 and determines whether the pedestrian who is the approaching moving body has passed the center of the road crossed by the pedestrian crossing (step S145).
- The calculation unit 31a refers to the map M1 and the attribute information of the pedestrian and, based on the position and moving direction of the pedestrian, determines whether the pedestrian walking on the pedestrian crossing is at a position past the center of the road, which is the midpoint of the pedestrian crossing.
- If it is determined that the pedestrian has passed the center of the road, the calculation unit 31a determines whether the position of the pedestrian is within 2 m of reaching the sidewalk (step S146). The calculation unit 31a refers to the map M1 and the attribute information of the pedestrian, and determines from the pedestrian's position and moving direction whether the pedestrian walking on the pedestrian crossing is within 2 m of reaching the sidewalk.
- If it is determined that the position of the pedestrian is within 2 m of reaching the sidewalk, the calculation unit 31a ends the process without performing the addition process on the pedestrian who is the approaching moving body, and proceeds to step S137 (FIG. 29). If the pedestrian is within 2 m of reaching the sidewalk, the pedestrian is expected to finish crossing the pedestrian crossing soon, and the possibility of colliding with the vehicle as the target moving body is considered low. Therefore, if it is determined in step S146 that the pedestrian is within 2 m of reaching the sidewalk, the calculation unit 31a does not perform the addition process.
- If it is determined in step S144 that the content of “walking place” is not a pedestrian crossing, the calculation unit 31a proceeds to step S147 and adds the added value to the evaluation value of the approaching moving body in the target moving body.
- In step S147, the calculation unit 31a refers to the evaluation value database 34b and specifies the column in which the evaluation value between the moving body ID corresponding to the target moving body and the moving body ID corresponding to the approaching moving body is registered.
- the calculation unit 31a adds the addition value to the evaluation value registered in the specified column.
- the added value added to the evaluation value in step S147 is, for example, “200”.
- In this case, the pedestrian is likely to be walking on the roadway rather than on the crosswalk, and the pedestrian's behavior attribute “walking place” is also roadway. Since the pedestrian is located on the roadway and the relative distance to the relatively approaching vehicle is within 100 m, the possibility that the pedestrian and the vehicle collide can be considered relatively high. Therefore, the added value added here, for which the necessity of notification is high, is set to the same value as the base value used in the individual calculation process when the predicted collision time is less than 3 seconds.
- If it is determined in step S145 that the pedestrian has not passed the center of the road, the calculation unit 31a also proceeds to step S147, adds the added value to the evaluation value of the approaching moving body in the target moving body (step S147), and ends the process.
- In this case, the pedestrian is walking on the pedestrian crossing but still needs a certain amount of time to finish crossing it, during which the approaching vehicle may collide with the pedestrian. Therefore, also in this case, the added value is added to the evaluation value of the approaching moving body in the target moving body.
- If it is determined in step S146 that the position of the pedestrian is not within 2 m of reaching the sidewalk, the calculation unit 31a proceeds to step S148, refers to the evaluation value database 34b, and specifies the column in which the evaluation value between the moving body ID corresponding to the target moving body and the moving body ID corresponding to the approaching moving body is registered.
- the calculation unit 31a adds the addition value to the evaluation value registered in the specified column.
- the added value added to the evaluation value in step S148 is, for example, "100".
- In this case, the pedestrian has crossed more than half of the pedestrian crossing but still needs a certain amount of time to finish crossing it, during which the approaching vehicle may collide with the pedestrian. Therefore, also in this case, the added value is added to the evaluation value of the approaching moving body in the target moving body. However, the time required for the pedestrian to finish crossing is shorter than when the pedestrian has not yet passed the center of the road, so the added value in step S148 is set smaller than the added value in step S147.
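- The vehicle-to-pedestrian process of FIG. 30 (steps S141 to S148) can be sketched as a single function. The function and parameter names are illustrative assumptions; the branch conditions and the example added values 200 and 100 are taken from the text:

```python
def vehicle_pedestrian_added_value(walking_place, vehicle_speed_kmh,
                                   passed_road_center=False,
                                   within_2m_of_sidewalk=False):
    """FIG. 30 (sketch): added value for a vehicle/pedestrian pair.
    walking_place is one of 'roadway', 'sidewalk', 'building', 'crosswalk'."""
    if walking_place in ("sidewalk", "building"):
        return 0          # S141: collision possibility considered low
    if vehicle_speed_kmh <= 5:
        return 0          # S143: vehicle at 5 km/h or less
    if walking_place != "crosswalk":
        return 200        # S144 "No" -> S147: pedestrian on the roadway
    if not passed_road_center:
        return 200        # S145 "No" -> S147: still before the road center
    if within_2m_of_sidewalk:
        return 0          # S146: about to finish crossing, no addition
    return 100            # S148: past the center but not yet near the sidewalk
```

Note that the value 100 for step S148 is smaller than the value 200 for step S147, matching the relation stated above.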
- On the other hand, if it is determined in step S134 that the relationship between the target moving body and the approaching moving body is vehicle-to-vehicle, the calculation unit 31a proceeds to step S135 and performs a process corresponding to vehicle-to-vehicle.
- FIG. 31 is a flowchart illustrating an example of the process corresponding to vehicle-to-vehicle.
- In this process, the calculation unit 31a determines whether the vehicle of the target moving body and the vehicle of the approaching moving body are running in parallel on the same road in the same direction, or are traveling in the same lane (step S151). The calculation unit 31a refers to the map M1 and the attribute information of the two vehicles, and makes this determination based on the positions and moving directions of the two vehicles.
- If it is determined in step S151 that the two vehicles are neither running in parallel on the same road in the same direction nor traveling in the same lane, the calculation unit 31a determines whether the road on which both vehicles travel has a median strip (step S152). The calculation unit 31a refers to the map M1 and the attribute information of both vehicles to determine whether there is a median strip on the road on which they travel.
- If it is determined in step S152 that there is a median strip on the road on which both vehicles travel, the calculation unit 31a ends the process without performing the addition process and proceeds to step S137 (FIG. 29). If the two vehicles are approaching each other but are neither running in parallel in the same direction nor traveling in the same lane, they are considered to be traveling in opposite lanes. In that case, if there is a median strip, the possibility of a collision between them is considered extremely low, so the calculation unit 31a does not perform the addition process.
- If it is determined in step S152 that there is no median strip on the road on which both vehicles travel, the calculation unit 31a proceeds to step S153, adds the added value to the evaluation value of the approaching moving body in the target moving body, and ends the process.
- In this case, the two vehicles are traveling in opposite lanes and, in principle, do not collide. However, since there is no median strip, either vehicle could stray into the opposite lane, so the calculation unit 31a performs the addition process.
- In step S153, the calculation unit 31a refers to the evaluation value database 34b and specifies the column in which the evaluation value between the moving body ID corresponding to the target moving body and the moving body ID corresponding to the approaching moving body is registered.
- the calculation unit 31a adds the addition value to the evaluation value registered in the specified column.
- the added value added to the evaluation value in step S153 is, for example, “50”.
- On the other hand, if it is determined in step S151 that the vehicle of the target moving body and the vehicle of the approaching moving body are running in parallel on the same road in the same direction or are traveling in the same lane, the calculation unit 31a determines whether the road curves within 100 m ahead of the target moving body's vehicle (step S154). The calculation unit 31a refers to the map M1 and the attribute information of the target moving body's vehicle, and determines whether the road curves within a range of 100 m ahead of the target moving body.
- Note that the curve determined in step S154 refers to a curve that can be taken at the legal speed without greatly reducing the speed.
- If it is determined in step S154 that the road does not curve within 100 m ahead, the calculation unit 31a ends the process without adding the added value to the evaluation value of the approaching moving body in the target moving body, and proceeds to step S137 (FIG. 29). When both vehicles are running in parallel or traveling in the same lane, such a situation can occur even while both vehicles are traveling normally on a straight road and gradually approaching each other, so there is no need to actively notify the vehicles of the collision prediction result. Therefore, if it is determined in step S154 that the road does not curve within 100 m ahead, the calculation unit 31a does not perform the addition process.
- If it is determined in step S154 that the road curves within 100 m ahead, the calculation unit 31a proceeds to step S155, adds the added value to the evaluation value of the approaching moving body in the target moving body, and ends the process.
- When both vehicles enter the curve, each needs to turn the steering wheel according to the curve. If both vehicles are running in parallel or traveling in the same lane and approaching each other, the possibility that they contact each other increases when both turn the steering wheel. Therefore, the calculation unit 31a performs the addition process.
- In step S155, the calculation unit 31a refers to the evaluation value database 34b and specifies the column in which the evaluation value between the moving body ID corresponding to the target moving body and the moving body ID corresponding to the approaching moving body is registered.
- the calculation unit 31a adds the addition value to the evaluation value registered in the specified column.
- the added value added to the evaluation value in step S155 is, for example, “200”.
- the added value in step S155 is set to a value larger than the added value in step S153.
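- The vehicle-to-vehicle process of FIG. 31 (steps S151 to S155) can be sketched similarly. The function and parameter names are illustrative assumptions; the branch conditions and the example added values 50 and 200 are taken from the text:

```python
def vehicle_vehicle_added_value(same_direction_or_lane, has_median_strip=False,
                                curve_within_100m=False):
    """FIG. 31 (sketch): added value for a vehicle/vehicle pair."""
    if not same_direction_or_lane:
        # S151 "No": the approaching vehicles are in opposite lanes.
        if has_median_strip:
            return 0      # S152: collision considered extremely unlikely
        return 50         # S153: no median strip between opposing lanes
    # S151 "Yes": vehicles side by side or in the same lane.
    if curve_within_100m:
        return 200        # S155: both must steer into the upcoming curve
    return 0              # S154: straight road, no active notification needed
```

The value 200 in step S155 being larger than the value 50 in step S153 matches the relation stated above.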
- The calculation unit 31a repeats the processing from step S131 to step S137 until all the approaching moving bodies identified in step S121 have been processed (step S137).
- When all the approaching moving bodies have been processed, the calculation unit 31a ends the addition process for the evaluation values (step S122 in FIG. 28), thereby ending the addition process based on the relative distance.
- In this way, the calculation unit 31a treats an approaching moving body as a moving body that may collide with or contact the target moving body, and performs the addition process on the evaluation value for the approaching moving body.
- Note that, in the addition process based on the relative distance, the calculation unit 31a may also specify a moving body that satisfies the above-described conditions as a moving body approaching the target moving body and add the added value to its evaluation value.
- After completing the addition process for the evaluation values, the calculation unit 31a returns to the calculation process of FIG. 27.
- the calculation unit 31a sequentially processes steps S55, S57, and S59 in FIG. 27 for the specified target moving objects, and repeats the processing until all of the target moving objects are processed.
- the calculation unit 31a obtains evaluation values for all of the specified target moving objects, and after completing the addition processing on the evaluation values, returns to step S51 again and repeats the same processing.
- The calculation unit 31a repeatedly calculates and registers the evaluation value of each specified target moving body by repeating the calculation process, and updates the registered contents of the evaluation value database 34b as needed.
- FIG. 32 is a diagram illustrating an example of the evaluation value database 34b after the addition processing based on the relative distance.
- In FIG. 32, in the row of the target moving body with moving body ID “1005”, “200” is registered as the evaluation value in the column corresponding to moving body ID “1001”, “200” is registered as the evaluation value in the column corresponding to moving body ID “1003”, and “0” is registered in the other columns.
- The moving body with moving body ID “1003” is specified by the individual calculation process as a collision prediction target for the target moving body with moving body ID “1005”, whereby its evaluation value is “200”. The moving body with moving body ID “1001” is specified by the addition process based on the relative distance as a moving body approaching the target moving body with moving body ID “1005”, and its evaluation value is “200”.
- As described above, the notification determination threshold for the evaluation value, used by the determination unit 31b to determine whether to notify the collision prediction result, is set to “200”. That is, when the evaluation value is “200” or more, the determination unit 31b determines that the collision prediction result should be notified to the evaluation target.
- “0” is set as an evaluation value for a moving object other than the collision prediction target.
- In the addition process based on the relative distance, a plurality of approaching moving bodies may be specified, and an added value is added to the target moving body's evaluation value for each of them. For this reason, as shown in FIG. 32, evaluation values are registered for a plurality of moving bodies (the moving body with moving body ID “1001” and the moving body with moving body ID “1003”) with respect to the moving body with moving body ID “1005”.
- As a result, the target moving body (the moving body with moving body ID “1005”) is notified of both the collision prediction result for the moving body with moving body ID “1001” and the collision prediction result for the moving body with moving body ID “1003”.
- In this way, not only the collision prediction result for the collision prediction target specified by the individual calculation process but also the collision prediction result for the approaching moving body specified by the addition process based on the relative distance is transmitted to the target moving body.
- the content of the collision prediction result for the approaching moving body is the same as the collision prediction result for the collision prediction target, and includes information such as the attribute of the approaching moving body and the direction in which the approaching moving body approaches.
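- The notification determination of the determination unit 31b over one row of the evaluation value database (such as the FIG. 32 row for ID “1005”) can be sketched as follows. The function name and the dictionary representation of a row are assumptions; the threshold of 200 is taken from the text:

```python
NOTIFY_THRESHOLD = 200  # notification determination threshold from the text

def collision_report_targets(evaluations, threshold=NOTIFY_THRESHOLD):
    """Determination unit 31b (sketch): given one target moving body's row of
    evaluation values (other-body ID -> value), return the IDs whose collision
    prediction results should be notified (value at or above the threshold)."""
    return [other_id for other_id, value in evaluations.items()
            if value >= threshold]
```

For the FIG. 32 example row, both ID “1001” and ID “1003” reach the threshold, so both collision prediction results are notified.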
- As described above, the edge server 3 identifies, among the other moving bodies, a moving body whose relative distance is 100 m or less and whose displacement of the relative distance is negative as an approaching moving body, and adds the added value to the evaluation value of that approaching moving body in the target moving body.
- the edge server 3 adds the relative distance between the target moving object and another moving object and an added value based on the displacement of the relative distance to the evaluation value of the approaching moving object in the target moving object.
- The collision prediction time obtained by the individual calculation process is calculated from the movement information (position information, azimuth information, and speed information) of the target moving body and the other moving bodies, on the assumption that the target moving body and another moving body collide. That is, the edge server 3 determines whether a collision occurs based on whether the traveling directions (azimuth information) of the target moving body and the other moving body intersect or overlap, and on the relationship between the speed of the target moving body and the speed of the other moving body. Therefore, even when the target moving body and another moving body are approaching each other, the edge server 3 may determine that the possibility of a collision between them is low.
- On the other hand, the edge server 3 adds, to the evaluation value of the approaching moving body in the target moving body, an added value based on the relative distance between the target moving body and the other moving body and on the displacement of that relative distance. For this reason, even when a possibility of collision arises from the approach of the target moving body and another moving body without appearing in the collision prediction time, it can be reflected in the evaluation value, and the collision prediction result can be provided to the mobile terminal of the target moving body.
- In step S121 in FIG. 28 of the present embodiment, among the moving bodies other than the evaluation target, a moving body whose relative distance to the target moving body is within 100 m and whose displacement of the relative distance is negative is specified as an approaching moving body.
- However, the threshold of the relative distance for determining the approaching moving body is not limited to 100 m. In the present embodiment, the threshold of the relative distance is set to 100 m, but depending on the moving bodies involved, the threshold of the relative distance can be made smaller. Therefore, the threshold of the relative distance may be changed according to the attributes (vehicle or pedestrian) of the target moving body and the other moving bodies. Furthermore, the threshold of the relative distance may be configured to change according to the displacement of the relative distance.
- In step S154 in FIG. 31 of the present embodiment, a case has been illustrated in which it is determined whether the road curves within 100 m ahead of the vehicle of the target moving body. However, the threshold of the distance for determining whether there is a curve ahead is not limited to 100 m. In the present embodiment, this threshold is set to 100 m because a vehicle takes several seconds to travel that distance, during which the notification can be made in advance. Therefore, the threshold may be set to a longer distance as long as the notification can be made before the vehicle reaches the curve. Further, the threshold of the distance for determining the presence or absence of a curve ahead may be changed according to the speed of the vehicle.
- Further, the calculation unit 31a may obtain the evaluation value between the target moving body and another moving body only by the addition process based on the relative distance between the moving bodies, without performing the individual calculation process.
- In this case, the evaluation value for determining whether to notify the collision prediction result is determined based on the relative distance between the target moving body and each of the other moving bodies and on the displacement of that relative distance. If the relative distance to another moving body is within a predetermined range and is decreasing so that the two approach each other, it can be determined that a possibility of collision has arisen. As a result, regardless of the traveling directions of the target moving body and the other moving bodies, other moving bodies that may collide with the target moving body can be evaluated broadly, and the necessity of notification can be determined appropriately.
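- This distance-only variant can be sketched as follows. The function name is an assumption, and so is the concrete added value of 200; the embodiment only states that the evaluation value is derived from the relative distance and its displacement without the individual calculation:

```python
def distance_only_evaluation(distance_m, displacement_m,
                             threshold_m=100.0, added_value=200):
    """Variant sketch: evaluation value obtained only from the relative
    distance and its displacement, skipping the individual calculation.
    The added value here is an assumed example, not from the embodiment."""
    if distance_m <= threshold_m and displacement_m < 0:
        return added_value    # within range and approaching -> possible collision
    return 0
```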
- FIG. 33 is a diagram for describing a specific example of the addition process based on the relative distance, and is a diagram illustrating a pedestrian 7 crossing a crosswalk 90 at an intersection and a vehicle 5 turning right at the intersection.
- It is assumed that the pedestrian 7 and the vehicle 5 are both moving in accordance with the traffic signals, and that they are registered as moving bodies in the dynamic information map M1 and the moving object database 34a.
- the vehicle 5 is equipped with the on-vehicle device 50, and the pedestrian 7 carries the pedestrian terminal 70.
- The notification determination threshold is set to “200”, and it is assumed that the edge server 3 notifies the target moving body of the collision prediction result when the evaluation value between the target moving body and another moving body is equal to or greater than the notification determination threshold (“200”).
- (a), (b), (c), and (d) in FIG. 33 show, in chronological order, how the vehicle 5 enters the intersection and turns right after the pedestrian 7 starts to cross the pedestrian crossing 90. The distance d between the pedestrian 7 and the vehicle 5 gradually decreases in the order of (a) to (d) in FIG. 33. It is assumed that the relative distance between the pedestrian 7 and the vehicle 5 is within 100 m in all of (a) to (d), and that the speed of the vehicle 5 is higher than 5 km/h.
- In this case, the pedestrian 7 is specified as the approaching moving body when the vehicle 5 is the target moving body, and the vehicle 5 is specified as the approaching moving body when the pedestrian 7 is the target moving body.
- In (a) of FIG. 33, the pedestrian 7 and the vehicle 5 do not collide in the collision prediction based on the position information, azimuth information, and speed information, because their traveling directions (arrows in the figure) are parallel to each other.
- the edge server 3 (the calculation unit 31a thereof) does not specify the pedestrian 7 as the collision prediction target of the vehicle 5 in the individual calculation processing.
- the edge server 3 does not specify the vehicle 5 as a collision prediction target for the pedestrian 7. Therefore, the edge server 3 sets the evaluation value for the vehicle 5 when the pedestrian 7 is the target moving object to “0” in the individual calculation processing.
- the edge server 3 sets the evaluation value for the pedestrian 7 when the vehicle 5 is the target moving object to “0” in the individual calculation processing.
- the edge server 3 does not add the addition value to the evaluation value for the pedestrian 7 of the vehicle 5 as the target moving body. (Step S141 in FIG. 30). Therefore, at the stage (a) in FIG. 33, the edge server 3 does not notify the vehicle 5 of the collision prediction result regarding the pedestrian 7, and notifies the pedestrian 7 of the collision prediction result regarding the vehicle 5. No notification.
- In (b) of FIG. 33, the pedestrian 7 is walking on the pedestrian crossing from the sidewalk 91 toward the sidewalk 92, and the vehicle 5 has entered the intersection and is about to start a right turn. At this point the pedestrian 7 has not yet passed the center of the road (the middle point of the pedestrian crossing 90), and is walking at a point where the distance to the sidewalk 92 ahead in the traveling direction is 2 m or more.
- Whether the edge server 3 mutually specifies the vehicle 5 and the pedestrian 7 as collision prediction targets in the individual calculation processing depends on the situation.
- Here, it is assumed that the edge server 3 does not mutually specify the vehicle 5 and the pedestrian 7 as collision prediction targets.
- In that case, the edge server 3 sets the evaluation value for the vehicle 5 when the pedestrian 7 is the target moving body to “0” in the individual calculation processing. Similarly, the edge server 3 sets the evaluation value for the pedestrian 7 when the vehicle 5 is the target moving body to “0” in the individual calculation processing.
- However, in the addition processing based on the relative distance, the edge server 3 adds the added value “200” to the evaluation value for the pedestrian 7 when the vehicle 5 is the target moving body (step S147 in FIG. 30). The evaluation value for the pedestrian 7 when the vehicle 5 is the target moving body therefore becomes “200”, which is equal to or greater than the notification determination threshold, and the edge server 3 notifies the vehicle 5 of the collision prediction result regarding the pedestrian 7 at the stage of (b) in FIG. 33.
- The edge server 3 performs the same processing when the pedestrian 7 is the target moving body. Therefore, the edge server 3 notifies the pedestrian 7 of the collision prediction result regarding the vehicle 5 at the stage of (b) in FIG. 33.
- In this way, by performing the addition processing based on the relative distance at the stage of (b) in FIG. 33, the evaluation value for the vehicle 5 when the pedestrian 7 is the target moving body and the evaluation value for the pedestrian 7 when the vehicle 5 is the target moving body are both raised to the notification determination threshold or more, and the edge server 3 notifies both the pedestrian 7 and the vehicle 5 of the collision prediction result.
- As a result, the vehicle 5 and the pedestrian 7 can be made aware of each other before the situations shown in (c) and (d) of FIG. 33 arise, and a collision can be avoided.
- FIG. 34 is a diagram for describing another specific example (specific example 2) of the addition processing based on the relative distance, illustrating a pedestrian 7 crossing the pedestrian crossing 90 at an intersection and a vehicle 5 turning right at the intersection.
- (a), (b), (c), and (d) in FIG. 34 show, in chronological order, a case where the vehicle 5 enters the intersection and turns right while the pedestrian 7 is crossing the pedestrian crossing 90.
- Whereas in specific example 1 the pedestrian 7 is located on the sidewalk 91 immediately before the vehicle 5 enters the intersection ((a) in FIG. 33), in (a) of FIG. 34 the pedestrian 7 is located on the pedestrian crossing 90 at a position past the center of the road. That is, specific example 2 shows a case where the pedestrian 7 has walked ahead of the vehicle 5 by about half the road width compared with specific example 1.
- In (a) of FIG. 34, it is assumed that the pedestrian 7 is on the pedestrian crossing 90 and has passed the center of the road, but is walking at a point where the distance to the sidewalk 92 ahead in the traveling direction is longer than 2 m. The other conditions are the same as in specific example 1.
- Since the traveling directions of the pedestrian 7 and the vehicle 5 (arrows in the figure) are parallel to each other, the two are not predicted to collide in the collision prediction based on the position information, the azimuth information, and the speed information.
- Accordingly, the edge server 3 (its calculation unit 31a) does not mutually specify the vehicle 5 and the pedestrian 7 as collision prediction targets. Therefore, in the individual calculation processing, the edge server 3 sets the evaluation value for the vehicle 5 when the pedestrian 7 is the target moving body to “0”, and similarly sets the evaluation value for the pedestrian 7 when the vehicle 5 is the target moving body to “0”.
- In the addition processing based on the relative distance, the edge server 3 adds the added value “100” to the evaluation value for the pedestrian 7 when the vehicle 5 is the target moving body (step S148 in FIG. 30). The evaluation value for the pedestrian 7 when the vehicle 5 is the target moving body therefore becomes “100”.
- The edge server 3 notifies the collision prediction result when the evaluation value is “200” or more. Therefore, when the evaluation value for the pedestrian 7 when the vehicle 5 is the target moving body is “100”, the edge server 3 does not notify the vehicle 5 of the collision prediction result. If, however, the evaluation value obtained by the individual calculation processing is “100” or more, the total evaluation value reaches the notification determination threshold (“200”) and the edge server 3 notifies the collision prediction result.
- In (b) of FIG. 34, the pedestrian 7 has crossed the pedestrian crossing 90 to a point immediately before the sidewalk 92, and the vehicle 5 is approaching the intersection and is about to start a right turn.
- Whether the edge server 3 specifies the vehicle 5 and the pedestrian 7 as collision prediction targets in the individual calculation processing depends on the situation. Here, it is assumed that the edge server 3 does not mutually specify the vehicle 5 and the pedestrian 7 as collision prediction targets. Therefore, in the individual calculation processing, the edge server 3 sets the evaluation value for the vehicle 5 when the pedestrian 7 is the target moving body to “0”, and similarly sets the evaluation value for the pedestrian 7 when the vehicle 5 is the target moving body to “0”.
- Further, in the addition processing based on the relative distance, the edge server 3 does not add anything to the evaluation value for the pedestrian 7 when the vehicle 5 is the target moving body (step S146 in FIG. 30). The evaluation value for the pedestrian 7 when the vehicle 5 is the target moving body therefore remains “0”, and the edge server 3 does not notify the vehicle 5 of the collision prediction result. The edge server 3 performs the same processing when the pedestrian 7 is the target moving body, and therefore does not notify the pedestrian 7 of the collision prediction result either.
- In this way, even when the edge server 3 specifies an approaching moving body approaching the target moving body, the edge server 3 does not notify the collision prediction result if the necessity of notification is low. This makes it possible to suppress unnecessary notification of collision prediction results.
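The addition rule running through specific examples 1 and 2 can be summarized as a small decision function. The sketch below is a hypothetical reading of steps S141, S146, S147, and S148 in FIG. 30: the 2 m boundary, the added values “200” and “100”, and the notification determination threshold “200” come from the text above, but the function names and parameters are illustrative assumptions, not the patented implementation.

```python
# Hypothetical sketch of the "addition processing based on the relative distance"
# for a pedestrian on a crosswalk, as inferred from specific examples 1 and 2.

NOTIFY_THRESHOLD = 200  # notification determination threshold ("200")

def crosswalk_addition(on_crosswalk: bool,
                       passed_road_center: bool,
                       dist_to_far_sidewalk_m: float) -> int:
    """Added value for the evaluation value of a pedestrian seen from a turning vehicle."""
    if not on_crosswalk:
        return 0        # step S141: pedestrian still on the sidewalk -> no addition
    if dist_to_far_sidewalk_m <= 2.0:
        return 0        # step S146: crossing almost complete -> no addition
    if not passed_road_center:
        return 200      # step S147: before the road center -> add "200"
    return 100          # step S148: past the center, > 2 m left -> add "100"

def should_notify(individual_eval: int, added: int) -> bool:
    """Notify when the total evaluation value reaches the threshold."""
    return individual_eval + added >= NOTIFY_THRESHOLD
```

Under this reading, the stage of (b) in FIG. 33 yields an added value of 200 (notification even with an individual evaluation value of 0), while (a) of FIG. 34 yields 100, requiring the individual calculation processing to contribute at least “100” before notification occurs.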
- FIG. 35 is a diagram for describing another specific example (specific example 3) of the addition processing based on the relative distance, illustrating a pedestrian 7 crossing the pedestrian crossing 90 at an intersection and a vehicle 5 turning left at the intersection.
- (a), (b), and (c) in FIG. 35 show, in chronological order, a case where the vehicle 5 enters the intersection and turns left when the pedestrian 7 starts crossing the pedestrian crossing 90.
- In specific example 3, the conditions regarding the vehicle 5, the pedestrian 7, and so on are assumed to be the same as in specific example 1, except that the vehicle 5 turns left at the intersection and passes over the pedestrian crossing 90 that the pedestrian 7 is crossing. Therefore, in (a), (b), and (c) of FIG. 35, the pedestrian 7 is specified as the approaching moving body when the vehicle 5 is the target moving body, and the vehicle 5 is specified as the approaching moving body when the pedestrian 7 is the target moving body.
- At the stage of (a) in FIG. 35, as in the case of (a) in FIG. 33, the edge server 3 does not notify the vehicle 5 of the collision prediction result regarding the pedestrian 7, and does not notify the pedestrian 7 of the collision prediction result regarding the vehicle 5.
- In (b) of FIG. 35, the pedestrian 7 is walking on the pedestrian crossing from the sidewalk 91 toward the sidewalk 92, and the vehicle 5 has entered the intersection and is starting to turn left.
- At this stage, the pedestrian 7 is on the pedestrian crossing 90 and has not yet passed the center of the road. Therefore, even if the pedestrian 7 is not specified as a collision prediction target by the individual calculation processing, the edge server 3 adds the added value “200” to the evaluation value for the pedestrian 7 when the vehicle 5 is the target moving body in the addition processing based on the relative distance (step S147 in FIG. 30). The evaluation value for the pedestrian 7 thus becomes equal to or greater than the notification determination threshold, and, as in the case of (b) in FIG. 33, the edge server 3 notifies the vehicle 5 of the collision prediction result regarding the pedestrian 7 and notifies the pedestrian 7 of the collision prediction result regarding the vehicle 5.
- FIG. 36A is a diagram for describing another specific example of the addition processing based on the relative distance, in which two vehicles 5A and 5B are traveling on a two-lane road R10 in the same direction (the direction of the arrows in the figure): the vehicle 5A is traveling in the lane R10a, and the vehicle 5B is traveling in the lane R10b adjacent to the lane R10a.
- The road R10 curves within a range of 100 m ahead of the vehicle 5A.
- It is assumed that the vehicle 5A and the vehicle 5B are registered as moving bodies in the dynamic information map M1 and the moving body database 34a, and that the vehicle 5A is equipped with the in-vehicle device 50 while the vehicle 5B is not. In the following, therefore, the case where the vehicle 5A is the target moving body and the vehicle 5B is the approaching moving body is described, focusing mainly on the addition processing based on the relative distance.
- The edge server 3 specifies the vehicle 5B as the approaching moving body when the vehicle 5A is the target moving body.
- When two vehicles are traveling in the same direction in adjacent lanes and the road ahead is not curved, the edge server 3 does not perform the addition in the addition processing based on the relative distance (step S154 in FIG. 31). In that case, the edge server 3 obtains the evaluation value for the vehicle 5B when the vehicle 5A is the target moving body by the individual calculation processing alone, and determines whether to notify the vehicle 5A of the collision prediction result regarding the vehicle 5B according to that evaluation value.
- In FIG. 36A, however, since the road R10 curves within 100 m ahead, the edge server 3 adds the added value “200” to the evaluation value for the vehicle 5B when the vehicle 5A is the target moving body in the addition processing based on the relative distance (step S155 in FIG. 31).
- At the curve, the vehicle 5A and the vehicle 5B both need to turn the steering wheel in accordance with the curve. At this time, for example, even if the following vehicle 5A turns the steering wheel in accordance with the curve, the possibility of contact increases if the vehicle 5B traveling ahead does not. The edge server 3 therefore performs the addition processing.
- When the edge server 3 adds the added value “200” to the evaluation value for the vehicle 5B when the vehicle 5A is the target moving body, the evaluation value becomes equal to or greater than the notification determination threshold. Therefore, even if the vehicle 5B is not specified as the collision prediction target of the vehicle 5A in the individual calculation processing, the edge server 3 notifies the vehicle 5A of the collision prediction result through the addition processing based on the relative distance. As a result, the vehicle 5A can be informed of the possibility of contact with the vehicle 5B before entering the curve, and the driver of the vehicle 5A can be alerted.
- FIG. 36B is a diagram for describing another specific example of the addition processing based on the relative distance, showing a state in which the vehicle 5A is traveling in the lane R11 of the road R10 in the direction of the arrow in the figure, and the vehicle 5B is traveling in the lane R12 facing the lane R11 in the direction of the arrow in the figure.
- As in FIG. 36A, it is assumed that the vehicle 5A and the vehicle 5B are registered as moving bodies in the dynamic information map M1 and the moving body database 34a, and that the vehicle 5A is equipped with the in-vehicle device 50 while the vehicle 5B is not. In the following, therefore, the case where the vehicle 5A is the target moving body and the vehicle 5B is the approaching moving body is described, focusing mainly on the addition processing based on the relative distance.
- The edge server 3 specifies the vehicle 5B as the approaching moving body when the vehicle 5A is the target moving body.
- When the median strip C exists between the lane R11 and the lane R12, the edge server 3 does not perform the addition in the addition processing based on the relative distance (step S152 in FIG. 31).
- When two vehicles approaching each other are not traveling on the same road in the same direction and are not traveling in the same lane, the two vehicles are traveling in opposite lanes. In this case, if there is a median strip, the possibility of a collision between the two is considered extremely low. For this reason, as shown in FIG. 36B, when the vehicle 5A and the vehicle 5B are traveling in opposite lanes (the lane R11 and the lane R12) and the median strip C exists, the edge server 3 does not add the added value to the evaluation value in the addition processing based on the relative distance. The individual calculation processing between the vehicle 5A and the vehicle 5B is still performed.
- In this case, the edge server 3 obtains the evaluation value for the vehicle 5B when the vehicle 5A is the target moving body by the individual calculation processing, and determines whether or not to notify the collision prediction result according to that evaluation value.
- On the other hand, when the median strip C does not exist, the edge server 3 adds the added value “50” to the evaluation value for the vehicle 5B when the vehicle 5A is the target moving body in the addition processing based on the relative distance (step S153 in FIG. 31). With this addition alone, the evaluation value does not reach the notification determination threshold, so the edge server 3 does not notify the vehicle 5A of the collision prediction result regarding the vehicle 5B.
- Since the edge server 3 adds the added value “50” to the evaluation value for the vehicle 5B when the vehicle 5A is the target moving body, the edge server 3 notifies the vehicle 5A of the collision prediction result when the evaluation value obtained by the individual calculation processing is “150” or more, that is, when the total evaluation value reaches the notification determination threshold (“200”).
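The vehicle-to-vehicle cases of FIG. 36A and FIG. 36B can likewise be condensed into one rule. This is a minimal sketch assuming the two vehicles travel on the same road, with branch values taken from steps S152 to S155 in FIG. 31; the function name and parameters are hypothetical, not terms from the patent.

```python
# Hypothetical sketch of the addition rule for two vehicles on the same road,
# as inferred from FIG. 36A (same direction, adjacent lanes) and
# FIG. 36B (opposite lanes, with or without a median strip).

def vehicle_pair_addition(same_direction: bool,
                          curve_within_100m: bool,
                          median_strip: bool) -> int:
    """Added value for the evaluation value of an approaching vehicle."""
    if same_direction:
        # Adjacent lanes, same direction (FIG. 36A):
        # add "200" only when the road curves within 100 m ahead.
        return 200 if curve_within_100m else 0  # step S155 / step S154
    # Opposite lanes (FIG. 36B):
    # a median strip makes a collision extremely unlikely -> no addition;
    # without one, add "50".
    return 0 if median_strip else 50            # step S152 / step S153
```

With the notification determination threshold of “200”, the curve case alone triggers notification, while the no-median case only lowers the bar for the individual calculation processing to “150”.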
- In the embodiment described above, among the moving bodies other than the evaluation target, the moving body having the shortest collision prediction time is specified as the collision prediction target, the addition processing is performed between the evaluation target and the collision prediction target, and the evaluation value is obtained.
- Alternatively, the addition processing may be performed for the moving bodies other than the evaluation target to obtain evaluation values, and the obtained evaluation values may be used by the determination unit 31b.
- In the embodiment described above, the determination unit 31b determines whether or not to notify the collision prediction result based on the notification determination threshold. Alternatively, the evaluation values of the respective evaluation targets may be compared with each other, and whether to notify the collision prediction result may be determined based on the comparison result.
- For example, the evaluation values of the respective evaluation targets may be compared with each other, and it may be determined to notify the collision prediction result for a predetermined number of evaluation targets in descending order of evaluation value. In this case, since the number of evaluation targets notified of the collision prediction result does not exceed the predetermined number, an unnecessary load on the wireless communication system can be suppressed. This makes it possible to appropriately provide the mobile terminals with collision prediction results having a high need for notification while suppressing the load on the wireless communication system.
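The top-N alternative just described might look like the following sketch, where `max_count` stands in for the "predetermined number"; the function and variable names are illustrative only.

```python
# Hypothetical sketch: notify only a predetermined number of evaluation
# targets, chosen in descending order of evaluation value.

def select_notification_targets(evaluations: dict, max_count: int) -> list:
    """Return the identifiers of the top `max_count` targets by evaluation value."""
    ranked = sorted(evaluations.items(), key=lambda kv: kv[1], reverse=True)
    return [target for target, _ in ranked[:max_count]]
```

Capping the selection at `max_count` bounds the number of notifications per cycle regardless of how many targets score highly, which is what limits the radio load.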
- The determination unit 31b may be configured to adaptively change the notification determination threshold.
- For example, the determination unit 31b may count the number of notifications of the collision prediction result to each evaluation target during a past predetermined period, and change the notification determination threshold according to the number of notifications.
- Specifically, if the number of notifications is so large that the load on the wireless communication system can be judged excessive, the determination unit 31b adjusts the notification determination threshold to a larger value so as to reduce the number of notifications; if the number of notifications is a value from which it can be judged that the wireless communication system has sufficient spare capacity, the determination unit 31b adjusts the notification determination threshold to a smaller value so that the number of notifications increases. This makes it possible to appropriately provide the mobile terminals with collision prediction results having a high need for notification while suppressing the load on the wireless communication system.
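The adaptive threshold could be sketched as follows, assuming hypothetical load bounds `high_load` and `low_load` on the recent notification count and an illustrative adjustment `step`; none of these names or values appear in the patent.

```python
# Hypothetical sketch of adaptive adjustment of the notification
# determination threshold based on the recent notification count.

def adjust_threshold(threshold: int, notification_count: int,
                     high_load: int, low_load: int, step: int = 50) -> int:
    """Raise the threshold when recent notifications overload the radio link;
    lower it when there is spare capacity; otherwise leave it unchanged."""
    if notification_count > high_load:
        return threshold + step                 # too many notifications -> stricter
    if notification_count < low_load:
        return max(step, threshold - step)      # spare capacity -> more permissive
    return threshold
```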
Abstract
This invention is provided with: an acquisition unit that acquires, on the basis of dynamic map information, movement information indicating the movement state of a plurality of moving bodies and appearance information of the plurality of moving bodies; a computation unit that performs, on the basis of the movement information, a collision prediction of whether or not a target moving body having a mobile terminal, among the plurality of moving bodies, will collide with another moving body other than the target moving body, and that obtains an evaluation value indicating the possibility of a collision between the target moving body and the other moving body; and an assessment unit that assesses whether or not to communicate a collision prediction result indicating the result of the collision prediction performed by the computation unit to the mobile terminal. The computation unit obtains the evaluation value on the basis of: the predicted time until the target moving body and the other moving body collide, which is included in the collision prediction result; appearance attribute values corresponding to the respective appearance information of the target moving body and each of the other moving bodies; and behavior attribute values corresponding to behavior information, obtained from the movement information, indicating the behavior of the target moving body and each of the other moving bodies. The assessment unit assesses, on the basis of the evaluation value, whether or not to communicate the collision prediction result to the mobile terminal.
Description
The present disclosure relates to an information providing system, a mobile terminal, an information providing method, and a computer program.
This application claims priority based on Japanese Patent Application No. 2018-187530 filed on October 2, 2018, and incorporates all the contents described in that application.
Patent Document 1 discloses a traffic system that reports information about another vehicle to the own vehicle.
An information providing system according to one embodiment includes: an acquisition unit that acquires, based on dynamic map information in which dynamic information of a plurality of moving bodies is superimposed on map information, movement information indicating the movement states of the plurality of moving bodies and appearance information of the plurality of moving bodies; a calculation unit that performs, based on the movement information, a collision prediction of whether or not a target moving body having a mobile terminal, among the plurality of moving bodies, will collide with another moving body other than the target moving body, and that obtains an evaluation value indicating the possibility of a collision between the target moving body and the other moving body; and a determination unit that determines whether or not to notify the mobile terminal of a collision prediction result indicating the result of the collision prediction by the calculation unit. The calculation unit obtains the evaluation value based on: the collision prediction time, included in the collision prediction result, until the target moving body and the other moving body collide; appearance attribute values corresponding to the respective appearance information of the target moving body and the other moving body; and behavior attribute values corresponding to behavior information, obtained from the movement information, indicating the behavior of each of the target moving body and the other moving body. The determination unit determines whether or not to notify the mobile terminal of the collision prediction result based on the evaluation value.
An information providing system according to another embodiment includes: an acquisition unit that acquires, based on dynamic map information in which dynamic information of a plurality of moving bodies is superimposed on map information, movement information indicating the movement states of the plurality of moving bodies; a calculation unit that performs, based on the movement information, a collision prediction of whether or not a target moving body having a mobile terminal, among the plurality of moving bodies, will collide with another moving body other than the target moving body, and that obtains an evaluation value indicating the possibility of a collision between the target moving body and the other moving body; and a determination unit that determines whether or not to notify the mobile terminal of a collision prediction result indicating the result of the collision prediction by the calculation unit. The calculation unit obtains, based on position information of the plurality of moving bodies included in the movement information, the relative distance between the target moving body and each of the other moving bodies and the displacement of the relative distance, and obtains the evaluation value based on the relative distance and the displacement of the relative distance. The determination unit determines whether or not to notify the mobile terminal of the collision prediction result based on the evaluation value.
A mobile terminal according to another embodiment is a mobile terminal that receives the collision prediction result from the above information providing system and outputs the collision prediction result to a user.
An information providing method according to another embodiment is an information providing method for providing information to a mobile terminal, including: an acquisition step of acquiring, based on dynamic map information in which dynamic information of a plurality of moving bodies is superimposed on map information, movement information indicating the movement states of the plurality of moving bodies and appearance information of the plurality of moving bodies; a calculation step of performing, based on the movement information, a collision prediction of whether or not a target moving body having the mobile terminal, among the plurality of moving bodies, will collide with another moving body other than the target moving body, and obtaining an evaluation value indicating the possibility of a collision between the target moving body and the other moving body; and a determination step of determining whether or not to notify the mobile terminal of a collision prediction result indicating the result of the collision prediction by the calculation step. In the calculation step, the evaluation value is obtained based on: the collision prediction time, included in the collision prediction result, until the target moving body and the other moving body collide; appearance attribute values corresponding to the respective appearance information of the target moving body and the other moving body; and behavior attribute values corresponding to behavior information, obtained from the movement information, indicating the behavior of each of the target moving body and the other moving body. In the determination step, whether or not to notify the mobile terminal of the collision prediction result is determined based on the evaluation value.
A computer program according to another embodiment is a computer program for causing a computer to execute an information providing process for providing information to a mobile terminal, the program causing the computer to execute: an acquisition step of acquiring, based on dynamic map information in which dynamic information of a plurality of moving bodies is superimposed on map information, movement information indicating the movement states of the plurality of moving bodies and appearance information of the plurality of moving bodies; a calculation step of performing, based on the movement information, a collision prediction of whether or not a target moving body having the mobile terminal, among the plurality of moving bodies, will collide with another moving body other than the target moving body, and obtaining an evaluation value indicating the possibility of a collision between the target moving body and the other moving body; and a determination step of determining whether or not to notify the mobile terminal of a collision prediction result indicating the result of the collision prediction by the calculation step. In the calculation step, the evaluation value is obtained based on: the collision prediction time, included in the collision prediction result, until the target moving body and the other moving body collide; appearance attribute values corresponding to the respective appearance information of the target moving body and the other moving body; and behavior attribute values corresponding to behavior information, obtained from the movement information, indicating the behavior of each of the target moving body and the other moving body. In the determination step, whether or not to notify the mobile terminal of the collision prediction result is determined based on the evaluation value.
[Problems to be solved by the present disclosure]
In the above conventional example, the central device of the system determines the presence or absence of an abnormal event for each vehicle based on vehicle information obtained from each vehicle, and notifies each vehicle of the determination result.
上記従来例では、異常事象が生じた結果を報知するように構成されているが、例えば、このようなシステムを用いて車両(移動体)同士の間で生じる衝突を予測した結果を通知することが考えられる。
ここで、移動体同士の衝突予測を行うために移動体の現在の速度や方位等を用いれば、衝突すると予想される移動体同士が衝突するまでの衝突予測時間を求めることができる。
この衝突予測時間は短ければ短いほど衝突の可能性が高く緊急性を要していると言える。よって、衝突予測時間に応じて、移動体へ衝突予測結果を通知するか否かを判定すれば、必要性に応じた衝突予測結果の通知を行うことができる。 In the above conventional example, the system is configured to notify a result of occurrence of an abnormal event. For example, using such a system, a result of predicting a collision occurring between vehicles (moving bodies) is notified. Can be considered.
Here, if the current speed, azimuth, and the like of the moving bodies are used to predict the collision between the moving bodies, a collision prediction time until the moving bodies that are predicted to collide can collide can be obtained.
It can be said that the shorter the collision prediction time is, the higher the possibility of collision is and the more urgent it is required. Therefore, if it is determined whether or not to notify the mobile body of the collision prediction result according to the collision prediction time, it is possible to notify the collision prediction result according to necessity.
Moving objects include not only vehicles but also vulnerable road users such as pedestrians, and the possibility of a collision may differ depending on the attributes of each moving object.
For this reason, if the possibility of a collision were evaluated only from the predicted collision time when deciding whether to notify a moving object of the collision prediction result, a gap could arise between the possibility of collision evaluated from the predicted collision time and the actual possibility of collision, and notification of the collision prediction result might not be performed appropriately.
It is therefore desirable to appropriately provide the mobile terminal with collision prediction results for which notification is highly necessary.
The present disclosure has been made in view of such circumstances, and an object thereof is to provide a technique capable of appropriately providing a collision prediction result to a mobile terminal.
[Effects of the present disclosure]
According to the present disclosure, a collision prediction result can be appropriately provided to a mobile terminal.
First, the contents of the embodiments are listed and described.
[Overview of Embodiments]
(1) An information providing system according to one embodiment includes: an acquisition unit that acquires, based on dynamic map information in which dynamic information of a plurality of moving objects is superimposed on map information, movement information indicating a movement state of the plurality of moving objects and appearance information of the plurality of moving objects; a calculation unit that performs, based on the movement information, a collision prediction of whether a target moving object having a mobile terminal among the plurality of moving objects collides with another moving object other than the target moving object, and obtains an evaluation value indicating the possibility of collision between the target moving object and the other moving object; and a determination unit that determines whether to notify the mobile terminal of a collision prediction result indicating the result of the collision prediction by the calculation unit. The calculation unit obtains the evaluation value based on a predicted collision time, included in the collision prediction result, until the target moving object and the other moving object collide, an appearance attribute value corresponding to the appearance information of each of the target moving object and the other moving object, and an action attribute value corresponding to action information, obtained from the movement information, indicating the action content of each of the target moving object and the other moving object. The determination unit determines whether to notify the mobile terminal of the collision prediction result based on the evaluation value.
According to the above configuration, the evaluation value for determining whether to notify the collision prediction result is obtained based on the predicted collision time, the appearance attribute value, and the action attribute value, so the appearance and behavior of the target moving object and the other moving objects can be reflected in the determination of whether to notify the mobile terminal of the collision prediction result.
Thus, for example, for a collision prediction involving a moving object whose appearance or behavior indicates a high possibility of collision, the evaluation value can be set so as to be weighted toward notifying the collision prediction result, and the appearance and behavior of the moving objects can be taken into account when judging the necessity of notification.
As a result, the necessity of notification can be determined appropriately, and collision prediction results for which notification is highly necessary can be appropriately provided to the mobile terminal.
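One way to realize the evaluation value described in (1) is sketched below. The attribute tables, weights, and notification threshold are hypothetical, chosen only to illustrate how the predicted collision time can be combined with appearance and action attribute values; the disclosure does not prescribe particular values:

```python
# Hypothetical attribute tables; the disclosure only requires that appearance
# and action information map to numeric attribute values.
APPEARANCE_ATTR = {"truck": 3.0, "car": 2.0, "bicycle": 2.5, "child_pedestrian": 4.0}
ACTION_ATTR = {"normal": 0.0, "speeding": 3.0, "jaywalking": 4.0, "stopped": -1.0}

def evaluation_value(predicted_collision_time_s, appearance, action):
    """Higher value = higher collision possibility = greater need to notify."""
    # A shorter predicted collision time contributes a larger base score.
    base = 10.0 / max(predicted_collision_time_s, 0.1)
    return base + APPEARANCE_ATTR.get(appearance, 0.0) + ACTION_ATTR.get(action, 0.0)

NOTIFY_THRESHOLD = 6.0  # hypothetical cutoff used by the determination unit

def should_notify(eval_value):
    return eval_value >= NOTIFY_THRESHOLD

# With the same 5 s predicted collision time, a jaywalking child is weighted
# toward notification while a stopped car is not:
v1 = evaluation_value(5.0, "child_pedestrian", "jaywalking")  # 2.0 + 4.0 + 4.0
v2 = evaluation_value(5.0, "car", "stopped")                  # 2.0 + 2.0 - 1.0
```

The example shows the weighting effect described above: the same predicted collision time produces different notification decisions depending on the attributes of the moving objects involved.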
(2) The information providing system may further include a notification unit that notifies the collision prediction result to a mobile terminal that the determination unit has determined should be notified.
In this case, information can be provided only to mobile terminals for which the collision prediction result has been judged necessary based on the evaluation value.
(3) In the information providing system, the calculation unit may identify, among the other moving objects, a moving object predicted to collide with the target moving object based on the collision prediction result, and obtain the evaluation value using the predicted collision time, the appearance attribute value, and the action attribute value of the moving object predicted to collide.
In this case, an evaluation value relating to the collision prediction result for the moving object predicted to collide can be obtained.
(4) The information providing system may further include a control unit that controls the mobile terminal and causes the mobile terminal to execute a process of outputting the collision prediction result to the user of the mobile terminal, and the control unit may control the mobile terminal so that the output mode of the collision prediction result differs according to the appearance information and the action information of the moving object predicted to collide.
In this case, the result can be output to the user in an output mode suited to the characteristics of the moving object predicted to collide.
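The idea in (4) of varying the output mode according to the predicted collider's appearance and behavior could be sketched as follows; the mode names, messages, and selection rules are illustrative assumptions, not part of the disclosure:

```python
def select_output_mode(appearance, action):
    """Pick how the mobile terminal presents the collision prediction result.
    Returns (message, use_vibration, repeat_count) -- all hypothetical modes."""
    if appearance == "truck" or action == "speeding":
        # Large or fast-moving collider: strongest alert mode.
        return ("Large/fast vehicle approaching!", True, 3)
    if appearance == "child_pedestrian":
        return ("Watch for a child near the road", True, 2)
    return ("Possible collision ahead", False, 1)

msg, vibrate, repeats = select_output_mode("truck", "normal")
```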
(5) In the information providing system, when the other moving object is a vehicle, the appearance attribute value may include at least one of an attribute value corresponding to the size of the vehicle, an attribute value corresponding to the shape of the vehicle, an attribute value corresponding to the color of the vehicle, and an attribute value corresponding to information on the license plate attached to the vehicle.
(6) In the information providing system, when the other moving object is a pedestrian, the appearance attribute value may include at least one of an attribute value corresponding to the pedestrian's height, an attribute value corresponding to the pedestrian's clothing, and an attribute value corresponding to the pedestrian's equipment.
(7) In the information providing system, when the other moving object is a vehicle, the action attribute value may include at least one of an attribute value corresponding to the current driving state and an attribute value corresponding to the past driving history.
(8) In the information providing system, when the other moving object is a pedestrian, the action attribute value may include at least one of an attribute value corresponding to where the pedestrian is walking, an attribute value corresponding to the pedestrian's walking trajectory, and an attribute value corresponding to the pedestrian's behavior with respect to traffic light colors.
(9) In the information providing system, the calculation unit may obtain, based on position information of the plurality of moving objects included in the movement information, the relative distance between the target moving object and each of the other moving objects and the change in that relative distance, and add to the evaluation value an additional value based on the relative distance and its change.
In this case, even when the possibility of a collision does not appear in the predicted collision time but arises because the target moving object and another moving object are approaching each other, this can be reflected in the evaluation value so that the collision prediction result is provided to the mobile terminal of the target moving object.
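The additional value in (9), based on the relative distance and its change, could look like the following sketch. The 30 m range and the linear scaling are assumptions made for illustration:

```python
import math

def proximity_addition(pos_target, pos_other, prev_distance, max_range=30.0):
    """Return (addition, new_distance). Contributes to the evaluation value
    only when the other moving object is within `max_range` meters of the
    target AND the relative distance is decreasing (they are approaching)."""
    d = math.hypot(pos_other[0] - pos_target[0], pos_other[1] - pos_target[1])
    closing = prev_distance is not None and d < prev_distance
    if d <= max_range and closing:
        # Closer objects contribute more (linear ramp: 0 at max_range).
        return (max_range - d) / max_range * 5.0, d
    return 0.0, d

add1, d1 = proximity_addition((0, 0), (20, 0), prev_distance=25.0)  # approaching
add2, d2 = proximity_addition((0, 0), (20, 0), prev_distance=15.0)  # receding
```

Only the approaching case produces a positive addition, which matches the intent of reflecting closing motion in the evaluation value even when it does not appear in the predicted collision time.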
(10) An information providing system according to another embodiment includes: an acquisition unit that acquires, based on dynamic map information in which dynamic information of a plurality of moving objects is superimposed on map information, movement information indicating a movement state of the plurality of moving objects; a calculation unit that performs, based on the movement information, a collision prediction of whether a target moving object having a mobile terminal among the plurality of moving objects collides with another moving object other than the target moving object, and obtains an evaluation value indicating the possibility of collision between the target moving object and the other moving object; and a determination unit that determines whether to notify the mobile terminal of a collision prediction result indicating the result of the collision prediction by the calculation unit. The calculation unit obtains, based on position information of the plurality of moving objects included in the movement information, the relative distance between the target moving object and each of the other moving objects and the change in that relative distance, and obtains the evaluation value based on the relative distance and its change. The determination unit determines whether to notify the mobile terminal of the collision prediction result based on the evaluation value.
According to the above configuration, the evaluation value for determining whether to notify the collision prediction result is obtained based on the relative distance between the target moving object and each of the other moving objects and the change in that relative distance. Therefore, if the target moving object and another moving object are within a predetermined relative distance and the relative distance is decreasing as they approach each other, it can be determined that a possibility of collision has arisen. As a result, regardless of the traveling directions of the target moving object and the other moving objects, other moving objects that may collide with the target moving object can be broadly evaluated, and the necessity of notification can be determined appropriately.
(11) A mobile terminal according to another embodiment is a mobile terminal that receives the collision prediction result from the information providing system of any one of (1) to (10) above and outputs the collision prediction result to the user.
(12) An information providing method according to another embodiment is an information providing method for providing information to a mobile terminal, including: an acquisition step of acquiring, based on dynamic map information in which dynamic information of a plurality of moving objects is superimposed on map information, movement information indicating a movement state of the plurality of moving objects and appearance information of the plurality of moving objects; a calculation step of performing, based on the movement information, a collision prediction of whether a target moving object having the mobile terminal among the plurality of moving objects collides with another moving object other than the target moving object, and obtaining an evaluation value indicating the possibility of collision between the target moving object and the other moving object; and a determination step of determining whether to notify the mobile terminal of a collision prediction result indicating the result of the collision prediction in the calculation step. In the calculation step, the evaluation value is obtained based on a predicted collision time, included in the collision prediction result, until the target moving object and the other moving object collide, an appearance attribute value corresponding to the appearance information of each of the target moving object and the other moving object, and an action attribute value corresponding to action information, obtained from the movement information, indicating the action content of each of the target moving object and the other moving object. In the determination step, whether to notify the mobile terminal of the collision prediction result is determined based on the evaluation value.
(13) A computer program according to another embodiment is a computer program for causing a computer to execute an information providing process for providing information to a mobile terminal, the program causing the computer to execute: an acquisition step of acquiring, based on dynamic map information in which dynamic information of a plurality of moving objects is superimposed on map information, movement information indicating a movement state of the plurality of moving objects and appearance information of the plurality of moving objects; a calculation step of performing, based on the movement information, a collision prediction of whether a target moving object having the mobile terminal among the plurality of moving objects collides with another moving object other than the target moving object, and obtaining an evaluation value indicating the possibility of collision between the target moving object and the other moving object; and a determination step of determining whether to notify the mobile terminal of a collision prediction result indicating the result of the collision prediction in the calculation step. In the calculation step, the evaluation value is obtained based on a predicted collision time, included in the collision prediction result, until the target moving object and the other moving object collide, an appearance attribute value corresponding to the appearance information of each of the target moving object and the other moving object, and an action attribute value corresponding to action information, obtained from the movement information, indicating the action content of each of the target moving object and the other moving object. In the determination step, whether to notify the mobile terminal of the collision prediction result is determined based on the evaluation value.
[Details of Embodiments]
Hereinafter, preferred embodiments will be described with reference to the drawings.
Note that at least some of the embodiments described below may be combined arbitrarily.
[Overall configuration of wireless communication system]
FIG. 1 is a schematic diagram illustrating the overall configuration of a wireless communication system according to an embodiment. Referring to FIG. 1, the wireless communication system includes a plurality of communication terminals 1A to 1D capable of wireless communication, one or more base stations 2 that communicate wirelessly with the communication terminals 1A to 1D, one or more edge servers 3 that communicate with the base stations 2 by wire or wirelessly, and one or more core servers 4 that communicate with the edge servers 3 by wire or wirelessly. The communication terminals 1A to 1D are also collectively referred to as the communication terminal 1.
The core server 4 is installed in a core data center (DC) of the core network. The edge server 3 is installed in a distributed data center (DC) of a metro network. A metro network is, for example, a communication network built for each city. The metro networks in each region are each connected to the core network. The base station 2 is communicably connected to the edge server 3 of one of the distributed data centers included in the metro network.
The core server 4 is communicably connected to the core network. The edge server 3 is communicably connected to a metro network. The core server 4 can therefore communicate with the edge servers 3 and base stations 2 belonging to the metro networks in each region via the core network and the metro networks. The base station 2 is at least one of a macrocell base station, a microcell base station, and a picocell base station.
In the wireless communication system of this embodiment, the edge server 3 and the core server 4 are general-purpose servers capable of SDN (Software-Defined Networking). The base stations 2 and relay devices such as repeaters (not shown) are transport devices capable of SDN. Therefore, network virtualization technology allows a plurality of virtual networks (network slices) S1 to S4 that satisfy conflicting service requirements, such as low-latency communication and high-capacity communication, to be defined on the physical equipment of the wireless communication system.
The network virtualization technology described above is a basic concept of the fifth-generation mobile communication system (hereinafter abbreviated as "5G"), whose standardization is currently in progress. Accordingly, the wireless communication system according to this embodiment is, for example, 5G-compliant.
However, the wireless communication system according to this embodiment may be any mobile communication system capable of defining a plurality of network slices (hereinafter also referred to as "slices") S1 to S4 according to predetermined service requirements such as delay time, and is not limited to 5G. Furthermore, the number of slice tiers defined is not limited to four and may be five or more.
In the example of FIG. 1, the network slices S1 to S4 are defined as follows. The slice S1 is a network slice defined so that the communication terminals 1A to 1D communicate directly with one another. The communication terminals 1A to 1D that communicate directly in the slice S1 are also referred to as "nodes N1".
The slice S2 is a network slice defined so that the communication terminals 1A to 1D communicate with the base station 2. The highest-order communication node in the slice S2 (the base station 2 in the illustrated example) is also referred to as "node N2".
The slice S3 is a network slice defined so that the communication terminals 1A to 1D communicate with the edge server 3 via the base station 2. The highest-order communication node in the slice S3 (the edge server 3 in the illustrated example) is also referred to as "node N3". In the slice S3, the node N2 serves as a relay node. That is, data communication is performed over an uplink route of node N1 → node N2 → node N3 and a downlink route of node N3 → node N2 → node N1.
The slice S4 is a network slice defined so that the communication terminals 1A to 1D communicate with the core server 4 via the base station 2 and the edge server 3. The highest-order communication node in the slice S4 (the core server 4 in the illustrated example) is also referred to as "node N4". In the slice S4, the nodes N2 and N3 serve as relay nodes. That is, data communication is performed over an uplink route of node N1 → node N2 → node N3 → node N4 and a downlink route of node N4 → node N3 → node N2 → node N1.
In the slice S4, routing may also be performed without using the edge server 3 as a relay node. In this case, data communication is performed over an uplink route of node N1 → node N2 → node N4 and a downlink route of node N4 → node N2 → node N1.
When a plurality of base stations 2 (nodes N2) are included in the slice S2, routing that traverses communication between the base stations 2 is also possible. Similarly, when a plurality of edge servers 3 (nodes N3) are included in the slice S3, routing that traverses communication between the edge servers 3 is also possible. When a plurality of core servers 4 (nodes N4) are included in the slice S4, routing that traverses communication between the core servers 4 is also possible.
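As an illustration of the routing described above, the following sketch derives the uplink and downlink routes through nodes N1 to N4 for each slice, including the variant in which slice S4 bypasses the edge server (node N3). The function and constant names are illustrative assumptions, not part of the patent.

```python
# Illustrative sketch (not from the patent): per-slice relay routes.
SLICE_TOP_NODE = {"S1": "N1", "S2": "N2", "S3": "N3", "S4": "N4"}
NODE_ORDER = ["N1", "N2", "N3", "N4"]  # terminal -> base station -> edge -> core

def uplink_path(slice_id, skip_edge=False):
    """Uplink route from the terminal (N1) up to the slice's top node."""
    top = SLICE_TOP_NODE[slice_id]
    path = NODE_ORDER[: NODE_ORDER.index(top) + 1]
    if skip_edge and slice_id == "S4":
        # Routing in slice S4 may bypass the edge server (node N3).
        path = [n for n in path if n != "N3"]
    return path

def downlink_path(slice_id, skip_edge=False):
    """Downlink route: the uplink route in reverse."""
    return list(reversed(uplink_path(slice_id, skip_edge)))
```

For example, `uplink_path("S3")` yields the N1 → N2 → N3 route described for slice S3.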
The communication terminal 1A is a wireless communication device mounted on the vehicle 5. The vehicles 5 include not only ordinary passenger cars but also public vehicles such as route buses and emergency vehicles. The vehicle 5 may be not only a four-wheeled vehicle but also a two-wheeled vehicle (motorcycle). The drive system of the vehicle 5 may be any of engine drive, electric motor drive, and a hybrid system. The driving method of the vehicle 5 may be either normal driving, in which a passenger performs operations such as acceleration, deceleration, and steering, or automated driving, in which software performs those operations.
The communication terminal 1A of the vehicle 5 may be a wireless communication device already installed in the vehicle 5, or may be a portable terminal that a passenger brings into the vehicle 5. When connected to the in-vehicle LAN (Local Area Network) of the vehicle 5, the passenger's portable terminal temporarily functions as an on-board wireless communication device.
The communication terminal 1B is a portable terminal (pedestrian terminal) carried by the pedestrian 7. The pedestrian 7 is a person who moves on foot outdoors, such as on roads and in parking lots, and indoors, such as in buildings and underground malls. The pedestrians 7 include not only persons on foot but also persons riding bicycles or the like having no power source.
The communication terminal 1C is a wireless communication device mounted on the roadside sensor 8. The roadside sensor 8 is, for example, an image-type vehicle detector installed on a road or a security camera installed outdoors or indoors. The communication terminal 1D is a wireless communication device mounted on the traffic signal controller 9 at an intersection.
The service requirements of the slices S1 to S4 are as follows. The delay times D1 to D4 allowed for the slices S1 to S4 are defined such that D1 < D2 < D3 < D4; for example, D1 = 1 ms, D2 = 10 ms, D3 = 100 ms, and D4 = 1 s. The data traffic volumes C1 to C4 allowed for the slices S1 to S4 per predetermined period (for example, one day) are defined such that C1 < C2 < C3 < C4; for example, C1 = 20 GB, C2 = 100 GB, C3 = 2 TB, and C4 = 10 TB.
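Given the delay budgets above (D1 < D2 < D3 < D4), an application can be mapped to the slice with the largest delay budget that still meets its latency requirement. The following is a minimal sketch of such a selection rule, using the example values D1 = 1 ms to D4 = 1 s; the function name and selection policy are illustrative assumptions.

```python
# Illustrative sketch (not from the patent): map a latency requirement
# to a slice using the example delay budgets D1..D4.
SLICE_DELAY_MS = {"S1": 1, "S2": 10, "S3": 100, "S4": 1000}  # D1 < D2 < D3 < D4

def select_slice(required_delay_ms):
    """Pick the slice with the largest delay budget not exceeding the requirement."""
    candidates = [s for s, d in SLICE_DELAY_MS.items() if d <= required_delay_ms]
    if not candidates:
        raise ValueError("no slice satisfies the delay requirement")
    return max(candidates, key=lambda s: SLICE_DELAY_MS[s])
```

Under this policy, traffic that tolerates a 100 ms delay (such as the dynamic information described later) would map to slice S3.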
As described above, the wireless communication system of FIG. 1 allows direct wireless communication in the slice S1 (for example, "vehicle-to-vehicle communication" in which the communication terminals 1A of vehicles 5 communicate directly) and wireless communication in the slice S2 via the base station 2. However, the present embodiment assumes an information providing service that uses the slices S3 and S4 of the wireless communication system of FIG. 1 for users in a relatively wide service area (for example, an area covering municipalities or prefectures).
[Internal configuration of edge server and core server]
FIG. 2 is a block diagram illustrating an example of the internal configuration of the edge server 3 and the core server 4. Referring to FIG. 2, the edge server 3 includes a control unit 31 including a CPU (Central Processing Unit) and the like, a ROM (Read Only Memory) 32, a RAM (Random Access Memory) 33, a storage unit 34, and a communication unit 35.
The control unit 31 reads one or more programs stored in advance in the ROM 32 into the RAM 33 and executes them, thereby controlling the operation of each hardware component and causing the computer device to function as the edge server 3 capable of communicating with the core server 4, the base station 2, and the like.
The RAM 33 is composed of volatile memory elements such as SRAM (Static RAM) or DRAM (Dynamic RAM), and temporarily stores the programs executed by the control unit 31 and the data necessary for their execution.
The storage unit 34 is composed of nonvolatile memory elements such as flash memory or EEPROM (Electrically Erasable Programmable Read Only Memory), or a magnetic storage device such as a hard disk.
The communication unit 35 has a function of communicating with the core server 4, the base station 2, and the like via the metro network. The communication unit 35 transmits information given by the control unit 31 to external devices via the metro network, and gives the control unit 31 information received via the metro network.
The storage unit 34 stores a dynamic information map (hereinafter also simply referred to as a "map") M1 as dynamic map information. The map M1 is an aggregate of data (a virtual database) in which dynamic information that changes from moment to moment is superimposed on a high-definition digital map, which is static information.
The information constituting the map M1 includes the following "dynamic information" and "static information".
"Dynamic information" refers to dynamic data for which a delay time within one second is required. For example, the position information of moving objects (vehicles, pedestrians, and the like) and signal information, which are used as ITS (Intelligent Transport Systems) look-ahead information, correspond to dynamic information.
Note that the position information of moving objects included in the dynamic information covers not only the positions of vehicles 5 and pedestrians 7 that are capable of wireless communication by having the communication terminals 1A and 1B, but also the positions of vehicles 5 and pedestrians 7 that have no wireless communication function.
"Static information" refers to static data for which a delay time within one month is acceptable. For example, road surface information, lane information, and three-dimensional structure data correspond to static information.
The control unit 31 of the edge server 3 updates the dynamic information of the map M1 stored in the storage unit 34 at every predetermined update cycle (update processing). Specifically, at every predetermined update cycle, the control unit 31 collects from the communication terminals 1A to 1D the various kinds of sensor information acquired by the vehicles 5, the roadside sensors 8, and the like in the service area of its own device, and updates the dynamic information of the map M1 based on the collected sensor information.
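A single iteration of the update processing described above can be sketched as follows: the sensor reports collected in one update cycle are superimposed onto the dynamic layer of the map M1, leaving the static layer untouched. This is a minimal sketch under assumed data shapes (dictionaries of object IDs to positions); the actual map structure is not specified at this level of detail in the patent.

```python
# Illustrative sketch (not from the patent): one update cycle of map M1.
# map_m1 is assumed to hold a "dynamic" layer keyed by object ID.
def update_cycle(map_m1, sensor_reports):
    """Superimpose collected sensor information onto the dynamic layer of M1."""
    for report in sensor_reports:          # one report per terminal 1A-1D
        for obj_id, position in report.items():
            map_m1["dynamic"][obj_id] = position  # overwrite with latest position
    return map_m1
```

In a running system this function would be invoked once per update cycle with the reports gathered from the service area.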
When receiving a dynamic information request message from the communication terminal 1A or 1B of a given user, the control unit 31 distributes the latest dynamic information to the requesting communication terminal 1A or 1B at every predetermined distribution cycle (distribution processing).
The control unit 31 may collect traffic information and weather information for various places in the service area from a traffic control center, a private weather service support center, and the like, and update the dynamic information or the static information of the map M1 based on the collected information.
Referring further to FIG. 2, the core server 4 includes a control unit 41 including a CPU and the like, a ROM 42, a RAM 43, a storage unit 44, and a communication unit 45.
The control unit 41 reads one or more programs stored in advance in the ROM 42 into the RAM 43 and executes them, thereby controlling the operation of each hardware component and causing the computer device to function as the core server 4 capable of communicating with the edge server 3.
The RAM 43 is composed of volatile memory elements such as SRAM or DRAM, and temporarily stores the programs executed by the control unit 41 and the data necessary for their execution.
The storage unit 44 is composed of nonvolatile memory elements such as flash memory or EEPROM, or a magnetic storage device such as a hard disk.
The communication unit 45 has a function of communicating with the edge server 3, the base station 2, and the like via the core network. The communication unit 45 transmits information given by the control unit 41 to external devices via the core network, and gives the control unit 41 information received via the core network.
As shown in FIG. 2, the storage unit 44 of the core server 4 stores an information map M2. The data structure of the map M2 (a data structure including dynamic information and static information) is the same as that of the map M1. The map M2 may be a map of the same service area as the map M1 of a specific edge server 3, or may be a wider-area map obtained by integrating the maps M1 held by a plurality of edge servers 3.
Like the edge server 3, the control unit 41 of the core server 4 may perform update processing for updating the dynamic information of the map M2 stored in the storage unit 44, and distribution processing for distributing the dynamic information in response to request messages. That is, the control unit 41 can execute the update processing and the distribution processing based on the map M2 of its own device independently of the edge server 3.
However, the core server 4, which belongs to the slice S4, has a larger communication delay time with the communication terminals 1A to 1D than the edge server 3, which belongs to the slice S3.
For this reason, even if the core server 4 updates the dynamic information of the map M2 on its own, the result is inferior in real-time performance to the dynamic information of the map M1 managed by the edge server 3.
Therefore, the control unit 31 of the edge server 3 and the control unit 41 of the core server 4 may share the update processing and the distribution processing of the dynamic information in a distributed manner, for example according to a priority defined for each predetermined area.
[Internal configuration of in-vehicle device]
FIG. 3 is a block diagram illustrating an example of the internal configuration of the in-vehicle device 50 of the vehicle 5 equipped with the communication terminal 1A.
Referring to FIG. 3, the in-vehicle device 50 mounted on the vehicle 5 includes a control unit (ECU: Electronic Control Unit) 51, a GPS receiver 52, a vehicle speed sensor 53, a gyro sensor 54, a storage unit 55, a display 56, a speaker 57, an input device 58, a vehicle-mounted camera 59, a radar sensor 60, and a communication unit 61.
The communication unit 61 is the above-described communication terminal 1A (a wireless communication device capable of 5G-compliant communication). Therefore, the in-vehicle device 50 of the vehicle 5 can communicate with the edge server 3 as a kind of mobile terminal belonging to the slice S3. The in-vehicle device 50 of the vehicle 5 can also communicate with the core server 4 as a kind of mobile terminal belonging to the slice S4.
The control unit 51 is a computer device that performs route searching for the vehicle 5, control of the other electronic devices 52 to 61, and the like. The control unit 51 obtains the vehicle position of the own vehicle from GPS signals periodically acquired by the GPS receiver 52. The control unit 51 also supplements the vehicle position and heading based on the input signals of the vehicle speed sensor 53 and the gyro sensor 54, thereby grasping the accurate current position and heading of the vehicle 5.
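Supplementing the position and heading between GPS fixes with speed-sensor and gyro-sensor signals is commonly done by dead reckoning. The patent does not specify the algorithm, so the following is only a minimal planar dead-reckoning sketch under assumed units (meters, radians, seconds); the function name and model are illustrative.

```python
# Illustrative sketch (not from the patent): planar dead reckoning step
# combining a speed sensor reading and a gyro (yaw rate) reading.
import math

def dead_reckon(x, y, heading_rad, speed_mps, yaw_rate_rps, dt_s):
    """Advance (x, y, heading) by one time step of speed/gyro measurements."""
    heading = heading_rad + yaw_rate_rps * dt_s     # integrate yaw rate
    x += speed_mps * dt_s * math.cos(heading)       # advance along new heading
    y += speed_mps * dt_s * math.sin(heading)
    return x, y, heading
```

Between two GPS fixes, the control unit could iterate this step with each sensor sample and then correct the accumulated estimate when the next fix arrives.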
The GPS receiver 52, the vehicle speed sensor 53, and the gyro sensor 54 are sensors for measuring the current position, speed, and heading of the vehicle 5.
The storage unit 55 includes a map database. The map database provides road map data to the control unit 51. The road map data includes link data and node data, and is stored on a recording medium such as a DVD, a CD-ROM, a memory card, or an HDD. The storage unit 55 reads the necessary road map data from the recording medium and provides it to the control unit 51.
The display 56 and the speaker 57 are output devices for notifying the user, a passenger of the vehicle 5, of various kinds of information generated by the control unit 51. Specifically, the display 56 displays an input screen for route searching, a map image of the area around the own vehicle, route information to the destination, and the like. The speaker 57 outputs, by voice, announcements and the like for guiding the vehicle 5 to the destination. These output devices can also notify the passenger of the provided information received by the communication unit 61.
The input device 58 is a device with which a passenger of the vehicle 5 performs various input operations. The input device 58 is, for example, an operation switch provided on the steering wheel, a joystick, a touch panel provided on the display 56, or a combination of these. The input device 58 may be a voice recognition device that accepts input through voice recognition of the passenger's speech. The input signal generated by the input device 58 is transmitted to the control unit 51.
The vehicle-mounted camera 59 is an image sensor that captures video of the area ahead of the vehicle 5. The vehicle-mounted camera 59 may be either monocular or compound-eye. The radar sensor 60 is a sensor that detects objects present ahead of or around the vehicle 5 by millimeter-wave radar, a LiDAR method, or the like.
Based on measurement data from the vehicle-mounted camera 59 and the radar sensor 60, the control unit 51 can execute driving support control, such as outputting an alert for the driving passenger on the display 56 or performing forced brake intervention.
The control unit 51 is configured by an arithmetic processing device such as a microcomputer that executes various control programs stored in the storage unit 55.
As functions realized by executing the control programs, the control unit 51 has various navigation functions, such as a function of displaying a map image on the display 56, a function of calculating a route from a departure point to a destination (including the positions of any stopover points), and a function of guiding the vehicle 5 to the destination along the calculated route.
The control unit 51 has a function of executing object recognition processing for recognizing objects ahead of or around the own vehicle based on measurement data from at least one of the vehicle-mounted camera 59 and the radar sensor 60, and ranging processing for calculating the distance to a recognized object.
The control unit 51 can calculate the position information of an object recognized by the object recognition processing from the distance calculated by the ranging processing and the sensor position of the own vehicle.
The control unit 51 can execute each of the following processes in communication with the edge server 3 (or the core server 4).
1) Request message transmission processing
2) Dynamic information reception processing
3) Change point information generation processing
4) Change point information transmission processing
The request message transmission processing is processing of transmitting to the edge server 3 a control packet requesting distribution of the dynamic information of the map M1, which the edge server 3 sequentially updates. The control packet includes the vehicle ID of the own vehicle.
When receiving a request message including a given vehicle ID, the edge server 3 distributes the dynamic information at a predetermined distribution cycle to the communication terminal 1A of the vehicle 5 having that vehicle ID.
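The request/distribution exchange described above can be sketched as a simple subscription: the terminal sends a request message carrying its vehicle ID, and the edge server registers that ID as a distribution target for subsequent distribution cycles. All field and class names below are illustrative assumptions; the patent does not define a packet format.

```python
# Illustrative sketch (not from the patent): request message and the edge
# server registering the sender as a distribution target.
def make_request_message(vehicle_id):
    """Control packet requesting distribution of the dynamic information."""
    return {"type": "dynamic_info_request", "vehicle_id": vehicle_id}

class EdgeServerStub:
    def __init__(self):
        self.subscribers = set()

    def on_request(self, message):
        if message["type"] == "dynamic_info_request":
            self.subscribers.add(message["vehicle_id"])

    def distribution_targets(self):
        """Vehicle IDs that receive the latest dynamic information each cycle."""
        return sorted(self.subscribers)
```

Each distribution cycle, the server would push the latest dynamic information to every ID returned by `distribution_targets()`.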
The dynamic information reception processing is processing of receiving the dynamic information that the edge server 3 has distributed to the own device.
The change point information generation processing by the control unit 51 is processing of calculating, from a comparison of the received dynamic information with the sensor information of the own vehicle at the time of reception, the change between the two, and generating change point information, which is information on the difference between them.
The change point information generated by the control unit 51 includes, for example, the following information examples a1 and a2.
Information example a1: Change point information on a recognized object
When the control unit 51 detects, by its own object recognition processing, an object X (a moving object such as a vehicle or pedestrian, an obstacle, or the like) that is not included in the received dynamic information, it uses the image data and the position information of the detected object X as change point information. When the position information of the object X included in the received dynamic information and the position information of the object X obtained by its own object recognition processing deviate from each other by a predetermined threshold or more, the control unit 51 uses the image data of the detected object X and the difference between the two pieces of position information as change point information.
Information example a2: Change point information on the own vehicle
When the position information of the own vehicle included in the received dynamic information and the vehicle position that the control unit 51 itself has calculated from GPS signals deviate by a predetermined threshold or more, the control unit 51 uses the difference between the two as change point information. When the heading of the own vehicle included in the received dynamic information and the heading that the control unit 51 itself has calculated from the measurement data of the gyro sensor 54 deviate by a predetermined threshold or more, the control unit 51 uses the difference between the two as change point information.
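The threshold comparison underlying information examples a1 and a2 can be sketched as follows: compare the distributed dynamic information with the vehicle's own sensor view and emit an entry only for newly detected objects or for position deviations at or above the threshold. Data shapes (dictionaries of IDs to 2D positions) and the output format are illustrative assumptions.

```python
# Illustrative sketch (not from the patent): generate change point entries by
# comparing received dynamic information against the own sensor view.
def change_points(dynamic_info, own_view, threshold=1.0):
    changes = {}
    for obj_id, own_pos in own_view.items():
        if obj_id not in dynamic_info:
            # Example a1: object detected locally but absent from dynamic info.
            changes[obj_id] = {"type": "new_object", "position": own_pos}
            continue
        dx = own_pos[0] - dynamic_info[obj_id][0]
        dy = own_pos[1] - dynamic_info[obj_id][1]
        if (dx * dx + dy * dy) ** 0.5 >= threshold:
            # Examples a1/a2: deviation at or above the threshold.
            changes[obj_id] = {"type": "deviation", "difference": (dx, dy)}
    return changes
```

Only the entries returned here would be packed into the change point packet described next, keeping the uplink traffic limited to differences.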
Having generated the change point information as described above, the control unit 51 generates a communication packet addressed to the edge server 3 that includes the generated change point information and the vehicle ID of the own vehicle.
The change point information transmission processing is processing of transmitting this communication packet to the edge server 3. The transmission of the change point information is performed within the dynamic information distribution cycle of the edge server 3.
Based on the dynamic information received from the edge server 3 or the like, the control unit 51 can also execute driving support control, such as outputting an alert for the driving passenger on the display 56 or performing forced brake intervention.
[Internal configuration of pedestrian terminal]
FIG. 4 is a block diagram illustrating an example of the internal configuration of the pedestrian terminal 70 (communication terminal 1B).
The pedestrian terminal 70 of FIG. 4 is a wireless communication device capable of, for example, 5G-compliant communication processing. Therefore, the pedestrian terminal 70 can communicate with the edge server 3 as a kind of mobile terminal belonging to the slice S3. The pedestrian terminal 70 can also communicate with the core server 4 as a kind of mobile terminal belonging to the slice S4.
Referring to FIG. 4, the pedestrian terminal 70 includes a control unit 71, a storage unit 72, a display unit 73, an operation unit 74, and a communication unit 75.
The communication unit 75 is a communication interface that performs wireless communication with the base station 2 of a carrier providing the 5G service. The communication unit 75 converts RF signals from the base station 2 into digital signals and outputs them to the control unit 71, and converts digital signals input from the control unit 71 into RF signals for transmission to the base station 2.
The control unit 71 includes a CPU, a ROM, and a RAM. The control unit 71 reads out and executes the programs stored in the storage unit 72 to control the overall operation of the pedestrian terminal 70.
The storage unit 72 includes a hard disk, a nonvolatile memory, and the like, and stores various computer programs and data. The storage unit 72 also stores a mobile ID, which is identification information of the pedestrian terminal 70. The mobile ID is, for example, the carrier subscriber's unique user ID or a MAC address.
The storage unit 72 further stores various application software arbitrarily installed by the user. This application software includes, for example, application software for receiving an information providing service, such as the dynamic information of the map M1, through communication with the edge server 3 (or the core server 4).
The operation unit 74 includes various operation buttons and the touch panel function of the display unit 73. The operation unit 74 outputs operation signals corresponding to the user's operations to the control unit 71.
The display unit 73 is, for example, a liquid crystal display and presents various kinds of information to the user. For example, the display unit 73 displays on its screen the image data of the information maps M1 and M2 transmitted from the servers 3 and 4.
The control unit 71 further has a time synchronization function for acquiring the current time from GPS signals, a position detection function for measuring its current position (latitude, longitude, and altitude) from GPS signals, and an azimuth detection function for measuring the orientation of the pedestrian 7 with an azimuth sensor.
In communication with the edge server 3 (or the core server 4), the control unit 71 can execute the following processes:
1) Request message transmission processing
2) Dynamic information reception processing
3) Change point information generation processing
4) Change point information transmission processing
The request message transmission process transmits to the edge server 3 a control packet requesting distribution of the dynamic information of the map M1, which the edge server 3 updates sequentially. The control packet includes the mobile ID of the pedestrian terminal 70.
When the edge server 3 receives a request message containing a given mobile ID, it distributes dynamic information at a predetermined distribution cycle to the communication terminal 1B of the pedestrian 7 having that mobile ID.
The dynamic information reception process receives the dynamic information that the edge server 3 has distributed to the terminal itself.
The change point information generation process by the control unit 71 compares the received dynamic information with the terminal's own sensor information at the time of reception, calculates the change between them, and generates change point information, that is, information on the difference between the two.
The change point information generated by the control unit 71 is, for example, the following information example.
Information Example: Change Point Information Regarding the Own Pedestrian
When the position information of the pedestrian 7 included in the received dynamic information deviates by a predetermined threshold or more from the position of the pedestrian 7 that the control unit 71 itself has calculated from GPS signals, the control unit 71 uses the difference value between the two as change point information. Likewise, when the azimuth of the pedestrian 7 included in the received dynamic information deviates by a predetermined threshold or more from the azimuth calculated by the azimuth sensor, the control unit 71 uses the difference value between the two as change point information.
When the control unit 71 has generated change point information as described above, it generates a communication packet addressed to the edge server 3 that contains the generated change point information and the mobile ID of the terminal 70.
The change point information transmission process transmits this communication packet to the edge server 3. The transmission is performed within the distribution cycle of the dynamic information by the edge server 3.
As described above, by performing the change point information generation process and the change point information transmission process, the control unit 71 transmits state information, including the position and azimuth information of the terminal 70, to the edge server 3.
[Internal Configuration of the Roadside Sensor]
FIG. 5 is a block diagram illustrating an example of the internal configuration of the roadside sensor 8 equipped with a wireless communication device as the communication terminal 1C. Referring to FIG. 5, the roadside sensor 8 includes a control unit 81, a storage unit 82, a roadside camera 83, a radar sensor 84, and a communication unit 85.
The communication unit 85 is the above-described communication terminal 1C, that is, a wireless communication device capable of communication processing conforming to, for example, 5G. Accordingly, the roadside sensor 8 can communicate with the edge server 3 as a kind of fixed terminal belonging to slice S3, and can also communicate with the core server 4 as a kind of fixed terminal belonging to slice S4.
The control unit 81 includes a CPU, a ROM, and a RAM. The control unit 81 reads out and executes the programs stored in the storage unit 82 to control the overall operation of the roadside sensor 8.
The storage unit 82 includes a hard disk, a nonvolatile memory, and the like, and stores various computer programs and data. The storage unit 82 also stores a sensor ID, which is identification information of the roadside sensor 8. The sensor ID is, for example, a user ID unique to the owner of the roadside sensor 8 or a MAC address.
The roadside camera 83 is an image sensor that captures video of a predetermined shooting area and may be either monocular or compound-eye. The radar sensor 84 is a sensor that detects objects present in and around the shooting area using a millimeter-wave radar, a LiDAR method, or the like.
When the roadside sensor 8 is a security camera, the control unit 81 transmits the captured video data and the like to the computer device of the security administrator. When the roadside sensor 8 is an image-type vehicle detector, the control unit 81 transmits the captured video data and the like to the traffic control center.
The control unit 81 has a function of executing object recognition processing that recognizes objects within the shooting area based on the measurement data of at least one of the roadside camera 83 and the radar sensor 84, and ranging processing that calculates the distance to each recognized object. From the distance calculated by the ranging processing and the installation position of the sensor, the control unit 81 can calculate the position information of an object recognized by the object recognition processing.
In communication with the edge server 3 (or the core server 4), the control unit 81 can execute the following processes:
1) Change point information generation processing
2) Change point information transmission processing
The change point information generation process in the roadside sensor 8 compares the previous sensor information with the current sensor information at each predetermined measurement cycle (for example, the distribution cycle of the dynamic information by the edge server 3), calculates the change between them, and generates change point information, that is, information on the difference between the two.
The change point information generated by the roadside sensor 8 is, for example, the following information example b1.
Information Example b1: Change Point Information Regarding Recognized Objects
When the control unit 81 detects, in the current object recognition processing, an object Y (a moving body such as a vehicle or a pedestrian, an obstacle, or the like) that was not detected in the previous object recognition processing, it uses the image data and position information of the detected object Y as change point information.
When the position information of the object Y obtained by the previous object recognition processing deviates by a predetermined threshold or more from the position information obtained by the current object recognition processing, the control unit 81 uses the position information of the detected object Y and the difference value between the two as change point information.
When the control unit 81 has generated change point information as described above, it generates a communication packet addressed to the edge server 3 that contains the generated change point information and the sensor ID of the device.
The change point information transmission process transmits this communication packet, carrying the change point information as data, to the edge server 3. The transmission is performed within the distribution cycle of the dynamic information by the edge server 3.
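Information example b1 amounts to a frame-to-frame comparison of recognition results: a newly detected object, or a known object whose position moved by at least the threshold, becomes change point information. A hedged sketch follows; the data layout, object IDs, and the threshold value are invented for illustration and are not part of this disclosure.

```python
# Illustrative sketch of information example b1 (assumed names and threshold).
POSITION_THRESHOLD_M = 0.5  # assumed threshold for position deviation (meters)

def roadside_change_points(previous: dict, current: dict) -> list:
    """previous/current map object IDs to (x, y) positions from two
    consecutive object recognition runs; returns the change point entries."""
    changes = []
    for obj_id, pos in current.items():
        if obj_id not in previous:
            # Object not detected last time: report it as new
            # (image data is omitted in this sketch).
            changes.append({"object": obj_id, "position": pos, "new": True})
        else:
            px, py = previous[obj_id]
            dx, dy = pos[0] - px, pos[1] - py
            if (dx * dx + dy * dy) ** 0.5 >= POSITION_THRESHOLD_M:
                # Known object that moved by at least the threshold:
                # report its position and the difference value.
                changes.append({"object": obj_id, "position": pos,
                                "diff": (dx, dy), "new": False})
    return changes

result = roadside_change_points(
    {"Y1": (10.0, 5.0)},                      # previous recognition run
    {"Y1": (11.0, 5.0), "Y2": (3.0, 4.0)})    # current recognition run
```

Here Y1 has moved by 1.0 m (at or above the threshold) and Y2 is newly detected, so both produce change point entries.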
[Overall Configuration of the Information Providing System]
FIG. 6 is an overall configuration diagram of the information providing system according to the present embodiment.
Referring to FIG. 6, the information providing system according to the present embodiment includes a large number of vehicles 5, pedestrian terminals 70, and roadside sensors 8 scattered over the relatively wide service area (real world) of the edge server 3, and the edge server 3, which functions as an information providing device and can perform low-latency wireless communication with these communication nodes, for example by 5G-compliant communication via the base station 2. In other words, the information providing system includes part or all of the wireless communication system described above.
The moving bodies present in the service area of the edge server 3 include vehicles 5 capable of wireless communication by virtue of a mounted communication terminal 1A or in-vehicle device 50 and pedestrians 7 carrying pedestrian terminals 70, as well as vehicles 5 without a wireless communication function and pedestrians 7 not carrying a pedestrian terminal 70.
The edge server 3 collects the above-described change point information at a predetermined cycle from the in-vehicle devices 50 of the vehicles 5, the pedestrian terminals 70, the roadside sensors 8, and the like within the service area (step S31).
The edge server 3 integrates the collected change point information by map matching (integration processing) and updates the dynamic information of the information map M1 under its management (step S32).
When requested by the in-vehicle device 50 of a vehicle 5 or by a pedestrian terminal 70, the edge server 3 transmits the latest dynamic information to the requesting communication node (step S33). Thus, for example, a vehicle 5 that has received the dynamic information can utilize it for driving assistance of its occupants and the like.
The edge server 3 may instead transmit the map M1 updated in step S32 to the requesting communication node as the dynamic information.
When a vehicle 5 that has received the dynamic information detects change point information against its own sensor information based on the dynamic information, it transmits the detected change point information to the edge server 3 (step S34).
In this way, in the information providing system of the present embodiment, the information processing at each communication node circulates in the order: collection of change point information (step S31) → update of the dynamic information (step S32) → distribution of the dynamic information (step S33) → detection of change point information by vehicles (step S34) → collection of change point information (step S31).
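The circulation of steps S31 to S34 can be summarized as one server-side loop body per distribution cycle. The sketch below is an assumption-laden stand-in: the dictionary-based map and the trivial "integration" merely substitute for the map matching the disclosure actually describes, and all names are invented.

```python
# Illustrative stand-in for one distribution cycle (steps S31-S33).
def run_cycle(dynamic_map: dict, collected_change_points: list) -> dict:
    """Integrate the change point information collected within the cycle
    (S31/S32) and return the dynamic information to distribute (S33)."""
    for cp in collected_change_points:
        # S32: stand-in for map matching / integration processing.
        dynamic_map[cp["id"]] = cp["state"]
    # S33: snapshot distributed to the requesting communication nodes.
    return dict(dynamic_map)

dynamic_map = {"vehicle-A": {"pos": (0.0, 0.0)}}
# S34: a node that received the previous distribution reports a change point,
# which is collected at the start of the next cycle (S31).
distributed = run_cycle(
    dynamic_map,
    [{"id": "vehicle-A", "state": {"pos": (2.0, 0.0)}}])
```

Each pass through `run_cycle` corresponds to one distribution cycle; the change points reported in reaction to one distribution (S34) feed the collection of the next (S31).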
Although FIG. 6 illustrates an information providing system including only one edge server 3, the information providing system may include a plurality of edge servers 3, and may include one or more core servers 4 instead of, or in addition to, the edge server 3.
The information map M1 managed by the edge server 3 may be any map in which at least the dynamic information of objects is superimposed on map information such as a digital map. The same applies to the information map M2 of the core server.
[Dynamic Information Update Processing and Distribution Processing]
FIG. 7 is a sequence diagram illustrating an example of the dynamic information update processing and distribution processing executed by the cooperation of the pedestrian terminal 70, the in-vehicle device 50 of the vehicle 5, the roadside sensor 8, and the edge server 3.
In the following description, the executing entities are written as the pedestrian terminal 70, the in-vehicle device 50 of the vehicle 5, the roadside sensor 8, and the edge server 3, but the actual executing entities are their respective control units 71, 51, 81, and 31.
U1, U2, ... in FIG. 7 denote the distribution cycles of the dynamic information.
Referring to FIG. 7, when the edge server 3 receives request messages for dynamic information from the pedestrian terminal 70 and the in-vehicle device 50 of the vehicle 5 (step S1), it distributes the dynamic information that is latest at the time of reception to the requesting pedestrian terminal 70 and in-vehicle device 50 (step S2).
In step S1, the edge server 3 may analyze the received request message and transmit the dynamic information to the sender of the request message only when the information indicating the request source contained in the message indicates a communication terminal 1 that has been registered in advance.
If, in step S1, a request message is received from only one of the pedestrian terminal 70 and the in-vehicle device 50, the dynamic information is distributed in step S2 only to the communication terminal that sent the request message.
When the pedestrian terminal 70 that has received the dynamic information distributed in step S2 generates change point information within the distribution cycle U1 (step S3), it transmits the generated change point information to the edge server 3 (step S6).
When the in-vehicle device 50 that has received the dynamic information distributed in step S2 generates change point information within the distribution cycle U1 from the comparison between the dynamic information and its own sensor information (step S4), it transmits the generated change point information to the edge server 3 (step S6).
Likewise, when the roadside sensor 8 generates change point information of its own sensor information within the distribution cycle U1, it transmits the generated change point information to the edge server 3 (step S6).
When the edge server 3 receives the change point information from the pedestrian terminal 70, the in-vehicle device 50, and the roadside sensor 8 within the distribution cycle U1, it updates the dynamic information to reflect that change point information (step S7) and distributes the updated dynamic information to the pedestrian terminal 70 and the in-vehicle device 50 (step S8).
For example, if only the in-vehicle device 50 generated change point information within the distribution cycle U1, only the change point information generated by the in-vehicle device 50 in step S4 is transmitted to the edge server 3 (step S6), and the dynamic information is updated to reflect only that change point information (step S7).
If none of the pedestrian terminal 70, the in-vehicle device 50, and the roadside sensor 8 generated change point information within the distribution cycle U1, the processing of steps S3 to S7 is not executed, and the same dynamic information as that transmitted previously (step S2) is distributed to the pedestrian terminal 70 and the in-vehicle device 50 (step S8).
In this way, the edge server 3 updates the dynamic information in step S7 based on the change point information transmitted within the distribution cycle U1.
When the pedestrian terminal 70 that has received the dynamic information distributed in step S8 generates change point information within the distribution cycle U2 (step S9), it transmits the generated change point information to the edge server 3 (step S12).
When the in-vehicle device 50 that has received the dynamic information distributed in step S8 generates change point information within the distribution cycle U2 from the comparison between the dynamic information and its own sensor information (step S10), it transmits the generated change point information to the edge server 3 (step S12).
Likewise, when the roadside sensor 8 generates change point information of its own sensor information within the distribution cycle U2, it transmits the generated change point information to the edge server 3 (step S12).
When the edge server 3 receives the change point information from the in-vehicle device 50 and the roadside sensor 8 within the distribution cycle U2, it updates the dynamic information to reflect that change point information (step S13) and distributes the updated dynamic information to the pedestrian terminal 70 and the in-vehicle device 50 (step S14).
In this way, the edge server 3 updates the dynamic information in step S13 based on the change point information transmitted within the distribution cycle U2.
The processing from step S14 onward is repeated by the same sequence as above until a request message to stop the distribution of the dynamic information is received from both the pedestrian terminal 70 and the vehicle 5, or until communication with the pedestrian terminal 70 and the vehicle 5 is interrupted.
[About the First Embodiment]
[Provision of Information to Mobile Terminals]
The information providing system of the present embodiment has a function of providing collision prediction results to the pedestrian terminals 70 carried by one or more moving bodies located within the service area and to the in-vehicle devices 50 of the vehicles 5. A collision prediction result is information indicating the result of predicting whether a moving body equipped with an in-vehicle device 50 or a pedestrian terminal 70 will collide with another moving body. The collision prediction result includes information such as the attributes of the moving body with which a collision is predicted, the direction from which that moving body approaches, and the collision prediction time.
FIG. 8 is a functional block diagram of the edge server 3, showing the functions for providing collision prediction results according to the first embodiment.
The control unit 31 of the edge server 3 functionally includes a calculation unit 31a, a determination unit 31b, a notification unit 31c, and a detection unit 31d. Each of these functions is realized by the control unit 31 executing a program stored in the storage unit 34.
Based on the dynamic information map M1, the calculation unit 31a performs collision prediction for each mobile terminal (pedestrian terminal 70 and in-vehicle device 50) mounted on a moving object (pedestrian 7, vehicle 5) located within the service area represented by the dynamic information map M1, and obtains an evaluation value for each collision prediction result. The calculation unit 31a obtains the evaluation value by referring to the mobile object database 34a (described later), in which information obtained from the dynamic information map M1 is registered.
The determination unit 31b has a function of determining, based on the evaluation value, whether to notify a mobile terminal (pedestrian terminal 70 or in-vehicle device 50) of the collision prediction result for the moving object on which that terminal is mounted.
The notification unit 31c has a function of notifying the mobile terminal of the collision prediction result based on the determination result of the determination unit 31b.
The detection unit 31d has a function of detecting the plurality of moving objects (pedestrians 7, vehicles 5) whose position information is included in the dynamic information of the dynamic information map M1, and generating moving object information indicating the status of each moving object.
In addition to the dynamic information map M1 described above, the storage unit 34 stores a mobile object database 34a and an evaluation value database 34b.
FIG. 9 is a diagram illustrating an example of the mobile object database 34a.
As shown in FIG. 9, the moving object information generated by the detection unit 31d is registered in the mobile object database 34a.
The mobile object database 34a is managed and updated by the detection unit 31d.
The detection unit 31d refers to the dynamic information map M1, and when position information of a new moving object (pedestrian 7, vehicle 5) is registered in the dynamic information, it assigns a moving object ID to that moving object, generates moving object information corresponding to the moving object ID, and registers it in the mobile object database 34a. In this way, the detection unit 31d detects moving objects whose position information is registered in the dynamic information map M1, assigns a moving object ID to each, and generates moving object information.
The moving object information includes information such as the presence or absence of a communication function, a vehicle ID (or mobile ID), the attribute information of the moving object, position information, azimuth information indicating the moving direction, and speed information indicating the moving speed. The attribute information indicates the attributes of each moving object. These attributes include a plurality of attributes, such as attributes related to the appearance of the moving object and attributes related to its behavior. The attribute information is distinguished between vehicles and pedestrians, and includes information indicating the attribute content of each of the plurality of attributes of the moving object.
The mobile object database 34a has columns for registering the moving object ID, the information on the presence or absence of a communication function, the vehicle ID (mobile ID), the attribute information of the moving object, the position information, the azimuth information, and the speed information.
Each piece of information in the mobile object database 34a is registered in association with the moving object ID.
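As an illustrative sketch only (the patent specifies no concrete data layout, so all field names below are assumptions), the moving object information held in the mobile object database 34a could be modeled as a record keyed by moving object ID:

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class MovingObjectRecord:
    """One row of the mobile object database 34a, keyed by moving object ID."""
    object_id: int                      # moving object ID assigned by the detection unit 31d
    has_comm: bool                      # presence or absence of a communication function
    terminal_id: Optional[str] = None   # vehicle ID or mobile (pedestrian terminal) ID
    attributes: dict = field(default_factory=dict)  # appearance and behavior attributes
    position: tuple = (0.0, 0.0)        # position information (x, y in meters)
    heading_deg: float = 0.0            # azimuth information (moving direction)
    speed_mps: float = 0.0              # speed information (m/s)

# a simple in-memory database keyed by moving object ID
database = {}
rec = MovingObjectRecord(object_id=1, has_comm=True, terminal_id="V-001")
database[rec.object_id] = rec
```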
The detection unit 31d (acquisition unit) refers to the dynamic information map M1 and generates the moving object information for each moving object. Of the moving object information, the detection unit 31d takes the information on the presence or absence of a communication function, the vehicle ID (mobile ID), and the position information, which are included in the dynamic information map M1, and registers them in the mobile object database 34a as-is.
The detection unit 31d calculates the azimuth information and the speed information based on the change over time of the position information of each moving object included in the dynamic information map M1.
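The derivation of azimuth and speed from the change in position over time can be sketched as follows (an illustrative assumption: positions are given as planar x/y meters with y pointing north; real map coordinates would require projection):

```python
import math

def heading_and_speed(p0, p1, dt):
    """Estimate heading (degrees clockwise from north) and speed (m/s)
    from two planar positions (x east, y north) observed dt seconds apart."""
    dx, dy = p1[0] - p0[0], p1[1] - p0[1]
    speed = math.hypot(dx, dy) / dt
    heading = math.degrees(math.atan2(dx, dy)) % 360.0  # 0 = north, 90 = east
    return heading, speed

# a vehicle that moved 10 m east in 1 s
h, s = heading_and_speed((0.0, 0.0), (10.0, 0.0), 1.0)
# h is 90.0 (east), s is 10.0 m/s
```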
FIGS. 10 and 11 are diagrams each showing a part of the mobile object database 34a in which attribute information is registered. FIG. 10 shows the columns for registering attribute information when the moving object is a vehicle. FIG. 11 shows the columns for registering attribute information when the moving object is a pedestrian.
As shown in FIG. 10, the attribute information of a vehicle includes appearance attributes and behavior attributes.
The appearance attributes include four types of attributes: "size", "shape", "color", and "license plate display". The content of each attribute is set in advance, and from among the preset attribute contents, the attribute content (appearance information) applicable to the target vehicle is registered in association with the moving object ID.
The attribute contents for "size" include large, medium, and small. Buses, trucks, and the like are determined to be large; light vehicles are determined to be small; and other ordinary passenger vehicles are determined to be medium.
The attribute contents for "shape" include sedan, coupe, wagon, minivan, SUV, convertible, and truck.
The attribute contents for "color" include high brightness and low brightness.
The attribute content for "license plate display" includes the place name shown on the license plate attached to the target vehicle 5 (the place name of the transport branch office having jurisdiction over the vehicle's base of use).
The detection unit 31d refers to the image data of moving objects captured by cameras or the like, which is included in the dynamic information map M1, and acquires the information on the appearance attributes of the vehicle 5 described above.
In FIG. 10, the behavior attributes include past driving history. More specifically, the behavior attributes include four types of attributes: "number of past lane changes", "number of past sudden steering/sudden braking operations", "number of lane changes after warning", and "number of sudden steering/sudden braking operations after warning". The "number of past lane changes" indicates the number of times the target vehicle 5 has changed lanes during a fixed past period. The "number of past sudden steering/sudden braking operations" indicates the number of times the target vehicle 5 has performed sudden steering or sudden braking during a fixed past period. The "number of lane changes after warning" indicates the number of times the vehicle 5 has changed lanes after a warning was issued to its in-vehicle device 50 by the edge server 3, as described later. The "number of sudden steering/sudden braking operations after warning" indicates the number of times the vehicle 5 has performed sudden steering or sudden braking after the warning.
The detection unit 31d has a function of storing the past dynamic information (the position information included in it) of each moving object in the storage unit 34, and generating, based on the past dynamic information, trajectory information describing the past movement of each moving object. This trajectory information also includes the speed information of the moving object. Therefore, in the case of a vehicle 5, a sudden steering or sudden braking operation can be detected based on the trajectory information. A lane change made while traveling straight can also be detected based on the trajectory information.
The detection unit 31d has a function of detecting lane changes and sudden steering/sudden braking operations by each vehicle 5 based on the trajectory information of that vehicle, and counting the number of times each vehicle 5 has changed lanes and the number of times it has performed a sudden steering/sudden braking operation.
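Counting sudden braking events from trajectory information might, for illustration, look like the sketch below; the sampling interval and the deceleration threshold (roughly 0.4 g) are assumptions, as the text does not specify concrete values:

```python
def count_sudden_brakes(speeds_mps, dt=1.0, decel_threshold=3.9):
    """Count sudden-braking events in a speed trace sampled every dt seconds.
    The 3.9 m/s^2 (~0.4 g) threshold is an illustrative assumption."""
    count = 0
    for v0, v1 in zip(speeds_mps, speeds_mps[1:]):
        if (v0 - v1) / dt > decel_threshold:  # deceleration between samples
            count += 1
    return count

# speed trace: cruising at 15 m/s, then a hard stop
trace = [15.0, 15.0, 14.5, 8.0, 2.0, 0.0]
print(count_sudden_brakes(trace))  # 2 (14.5->8.0 and 8.0->2.0 both exceed 3.9 m/s^2)
```

Lane-change detection would analogously threshold the lateral offset of the trajectory relative to the lane geometry.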
The edge server 3 has a function of notifying an in-vehicle device 50 of a warning based on the number of lane changes and the number of sudden steering/sudden braking operations counted by the detection unit 31d. The edge server 3 notifies a warning to the in-vehicle device 50 of a vehicle that has performed lane changes and sudden steering/sudden braking operations a certain number of times or more within a predetermined period.
For an in-vehicle device 50 that has been notified of this warning, the detection unit 31d also counts the number of times the vehicle 5 has changed lanes and the number of times it has performed sudden steering/sudden braking operations after the warning.
Through these functions, the detection unit 31d acquires the information (behavior information) on the "number of past lane changes", "number of past sudden steering/sudden braking operations", "number of lane changes after warning", and "number of sudden steering/sudden braking operations after warning".
As shown in FIG. 11, the attribute information of a pedestrian, like that of a vehicle, includes appearance attributes and behavior attributes.
The appearance attributes include four types of attributes: "height", "clothing", "earphones or inattention", and "crutches or wheelchair".
The contents of the "clothing", "earphones or inattention", and "crutches or wheelchair" attributes are set in advance, and from among the preset attribute contents (appearance information), the attribute content applicable to the target pedestrian is registered.
The pedestrian's height is registered as the attribute content for "height". The attribute contents for "clothing" include student and non-student. If the "clothing" attribute content is student, it indicates that the pedestrian is wearing a school uniform or cap. If the "clothing" attribute content is non-student, it indicates that the pedestrian's clothing is not a school uniform.
The attribute contents for "earphones or inattention" include applicable and not applicable. If the "earphones or inattention" attribute content is applicable, it indicates that the pedestrian is wearing earphones or looking at a terminal such as a smartphone while walking. If it is not applicable, it indicates that the pedestrian is walking normally.
The attribute contents for "crutches or wheelchair" include applicable and not applicable. If the "crutches or wheelchair" attribute content is applicable, it indicates that the pedestrian is using crutches or a wheelchair. If it is not applicable, it indicates that the pedestrian is walking normally.
In FIG. 11, the behavior attributes include three types of attributes: "walking place", "signal ignoring", and "walking trajectory". The attribute contents for "walking place" include roadway, sidewalk, and other. If the "walking place" attribute content is roadway, it indicates that the target pedestrian 7 is walking on a roadway. If it is sidewalk, it indicates that the target pedestrian 7 is walking on a sidewalk or a crosswalk. If it is other, it indicates that the target pedestrian 7 is walking inside a building or the like.
The attribute contents for "signal ignoring" include applicable and not applicable. If the "signal ignoring" attribute content is applicable, it indicates that the pedestrian is currently ignoring a traffic signal. If it is not applicable, it indicates that the pedestrian is not currently ignoring a signal.
The attribute contents for "walking trajectory" include straight and meandering. If the "walking trajectory" attribute content is meandering, it indicates that the most recent walking trajectory of the target pedestrian 7 is meandering. If it is straight, it indicates that the most recent walking trajectory of the target pedestrian 7 is straight.
The detection unit 31d refers to the image data of moving objects captured by cameras or the like included in the dynamic information map M1, as well as to the trajectory information described above, and acquires the information (behavior information) on each attribute of each pedestrian.
If the dynamic information map M1 does not include image data or the like of the moving objects (vehicles and pedestrians), the detection unit 31d can use the image data or the like included in the change point information used to update the dynamic information map M1.
The detection unit 31d generates the attribute information of each moving object based on the acquired attribute-related information.
The detection unit 31d repeatedly generates the moving object information of each moving object and registers it in the mobile object database 34a, updating the mobile object database 34a as needed. As a result, the moving object information registered in the mobile object database 34a is kept up to date.
The evaluation value database 34b in FIG. 8 is a database for registering the evaluation values of collision prediction results obtained by the calculation unit 31a. The evaluation value database 34b will be described later.
[Evaluation Value Calculation Processing]
FIG. 12 is a flowchart illustrating an example of the processing by which the calculation unit 31a calculates the evaluation value of a collision prediction result.
As shown in FIG. 12, the calculation unit 31a first reads the mobile object database 34a (step S51) and specifies the moving objects having a communication function as evaluation targets (step S52). A collision prediction result can be provided to a moving object having a pedestrian terminal 70 or an in-vehicle device 50. Therefore, the calculation unit 31a specifies moving objects having a pedestrian terminal 70 or an in-vehicle device 50 as the evaluation targets for which evaluation values are obtained.
In other words, an evaluation target refers to a mobile terminal for which the calculation unit 31a obtains an evaluation value.
In the following description, an evaluation target (target moving object) may refer not only to a pedestrian 7 or a vehicle 5, that is, a moving object having a pedestrian terminal 70 or an in-vehicle device 50, but also to the pedestrian terminal 70 or in-vehicle device 50 carried by or mounted on the moving object. A vehicle 5 in which a person carrying a pedestrian terminal 70 rides is also included among the evaluation targets.
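Step S52 amounts to filtering the moving objects on their communication-function flag (a sketch with illustrative data):

```python
# mobile object database rows: (moving object ID, has communication function)
rows = [(1, True), (2, False), (3, True)]

# step S52: moving objects having a communication function become evaluation targets
evaluation_targets = [oid for oid, has_comm in rows if has_comm]
print(evaluation_targets)  # [1, 3]
```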
Next, the calculation unit 31a performs the evaluation value calculation processing (step S53).
In the evaluation value calculation processing, the calculation unit 31a processes each of the specified evaluation targets in turn, repeating the processing until all evaluation targets have been processed.
First, the calculation unit 31a proceeds to step S55 and performs the individual calculation processing. In the individual calculation processing, the calculation unit 31a obtains the evaluation value of the evaluation target (step S55).
FIG. 13 is a flowchart illustrating an example of the individual calculation processing in step S55 of FIG. 12.
As shown in FIG. 13, the calculation unit 31a specifies the moving objects other than the evaluation target (the other moving objects) (step S61), and calculates, for each of the other moving objects, the predicted collision time between the evaluation target (target moving object) and that moving object (step S62).
The other moving objects also include moving objects equipped with a pedestrian terminal 70 or an in-vehicle device 50.
The calculation unit 31a refers to the mobile object database 34a and, from the position information, azimuth information, and speed information of the evaluation target and each of the other moving objects, obtains the predicted collision time assuming that the evaluation target and the other moving object collide.
If the position information and azimuth information indicate no possibility of collision, the calculation unit 31a sets the predicted collision time to an extremely large predetermined value (for example, 5 minutes). The calculation unit 31a also sets the predicted collision time to this predetermined value when the calculated predicted collision time is equal to or greater than the predetermined value.
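One way to compute such a predicted collision time is a constant-velocity closest-approach calculation, sketched below; the constant-velocity model and the collision radius are illustrative assumptions, not taken from the text:

```python
import math

MAX_TTC = 300.0  # the "extremely large" cap from the text: 5 minutes, in seconds

def predicted_collision_time(p_a, v_a, p_b, v_b, collision_radius=2.0):
    """Time until two constant-velocity objects come within collision_radius meters.
    p_* are (x, y) positions in meters, v_* are (vx, vy) velocities in m/s.
    Returns MAX_TTC when no collision is possible or the time exceeds the cap."""
    rx, ry = p_b[0] - p_a[0], p_b[1] - p_a[1]   # relative position
    vx, vy = v_b[0] - v_a[0], v_b[1] - v_a[1]   # relative velocity
    a = vx * vx + vy * vy
    b = 2.0 * (rx * vx + ry * vy)
    c = rx * rx + ry * ry - collision_radius ** 2
    if a == 0.0:
        return MAX_TTC  # no relative motion, so no collision
    disc = b * b - 4.0 * a * c
    if disc < 0.0:
        return MAX_TTC  # paths never come within collision_radius
    t = (-b - math.sqrt(disc)) / (2.0 * a)  # earliest crossing of the radius
    return t if 0.0 <= t < MAX_TTC else MAX_TTC

# head-on: objects 100 m apart closing at a combined 20 m/s -> collision in 4.9 s
t = predicted_collision_time((0, 0), (10, 0), (100, 0), (-10, 0))
```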
In this way, the calculation unit 31a predicts collisions between the evaluation target and the other moving objects based on the dynamic information map M1, and obtains the predicted collision times.
The calculation unit 31a specifies, as the collision prediction target, the moving object with the shortest collision prediction time among the collision prediction times calculated for each of the other moving objects (step S63).
Here, when the possibility of a collision is determined based on the collision prediction time, the moving object with the shortest collision prediction time can be determined to be the most likely to collide with the evaluation target. Therefore, the calculation unit 31a specifies the moving object with the shortest collision prediction time as the collision prediction target.
In this way, the calculation unit 31a performs collision prediction for each of the other moving objects and obtains a collision prediction result.
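The selection in step S63 amounts to taking the minimum over the calculated collision prediction times. A minimal sketch follows; the object identifiers are hypothetical and only illustrate the selection rule.

```python
def select_collision_prediction_target(collision_prediction_times: dict) -> str:
    """Return the ID of the moving object with the shortest collision prediction time."""
    # min() over the dict's keys, ordered by each key's predicted time
    return min(collision_prediction_times, key=collision_prediction_times.get)

# Hypothetical predicted times (seconds) for three other moving objects
times = {"vehicle_A": 8.2, "pedestrian_B": 3.5, "vehicle_C": 12.0}
target = select_collision_prediction_target(times)  # "pedestrian_B"
```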
Next, the calculation unit 31a obtains a base value for calculating the evaluation value of the collision prediction result, based on the calculated collision prediction time (step S64).
The base value serves as the basis for the evaluation value of the collision prediction result; in later processing, added values are added to the base value to obtain the evaluation value.
The calculation unit 31a obtains the base value from the collision prediction time according to the rule shown below. For example, the settings are as follows, but are not limited thereto.
Collision prediction time of 10 seconds or more: base value = 0
Collision prediction time less than 10 seconds: base value = 20
Collision prediction time less than 5 seconds: base value = 100
Collision prediction time less than 3 seconds: base value = 200
Collision prediction time less than 1 second: base value = 500
The evaluation value of the collision prediction result is set such that the larger the value, the higher the possibility of a collision and the greater the need for notification. The base value is likewise set such that a larger value indicates a higher possibility of a collision and a greater need for notification.
That is, the evaluation value of the collision prediction result is a value indicating the possibility that the collision prediction target collides with a moving object other than the collision prediction target.
The calculation unit 31a sets "0" as the base value for moving objects other than the collision prediction target. No added value is added to the base value of a moving object other than the collision prediction target in the subsequent processing. Therefore, the evaluation value of the collision prediction result for a moving object other than the collision prediction target (the evaluation value of the collision prediction result between the target moving object and a moving object other than the collision prediction target) is "0".
As described above, the calculation unit 31a obtains the base value for each moving object based on the collision prediction time.
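The base-value rule above, together with the cap on the collision prediction time described earlier, can be sketched as a simple lookup. This is a non-limiting illustration using the example thresholds and values from the text.

```python
# "Extremely large predetermined value" (5 minutes) used when no
# collision is predicted, or when the calculated time exceeds it.
CAP_SECONDS = 5 * 60

def base_value(collision_prediction_seconds: float) -> int:
    """Map a collision prediction time in seconds to a base value (step S64)."""
    t = min(collision_prediction_seconds, CAP_SECONDS)
    if t < 1:
        return 500
    if t < 3:
        return 200
    if t < 5:
        return 100
    if t < 10:
        return 20
    return 0  # 10 seconds or more
```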
Next, the calculation unit 31a proceeds to step S65 and determines whether there is a blind spot factor that creates a blind spot between the evaluation target and the collision prediction target (step S65).
The presence or absence of a blind spot factor between the evaluation target and the collision prediction target is determined by the calculation unit 31a by referring to the dynamic information map M1. The calculation unit 31a refers to the dynamic information map M1 and determines whether a building, another moving object, or the like blocks the line of sight between the evaluation target and the collision prediction target. When such a building or other moving object exists between the evaluation target and the collision prediction target, the calculation unit 31a determines that there is a blind spot factor between them. On the other hand, if nothing blocks the line of sight between the evaluation target and the collision prediction target, the calculation unit 31a determines that there is no blind spot factor.
If it determines in step S65 that there is a blind spot factor between the evaluation target and the collision prediction target, the calculation unit 31a adds an added value to the base value obtained in step S64 (step S66) and proceeds to step S67.
On the other hand, if it determines that there is no blind spot factor between the evaluation target and the collision prediction target (step S65), the calculation unit 31a proceeds directly to step S67.
As described above, the evaluation value is set such that the larger the value, the greater the factors impairing the safety of the evaluation target.
When there is a blind spot factor between the evaluation target and the collision prediction target, the safety of the evaluation target is further impaired. Therefore, when it determines that there is a blind spot factor, the calculation unit 31a adds an added value to the base value.
The added value added to the base value in step S66 is, for example, "100". This added value is merely an example and is not limited to this; the same applies to the added values described below.
In this way, the calculation unit 31a determines whether there is a blind spot factor that creates a blind spot between the evaluation target and the collision prediction target, and factors the result of this determination into the evaluation value.
In this case, the presence or absence of a blind spot factor can be reflected in the evaluation value, and thus in the determination, described later, of whether to provide information to the evaluation target.
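Steps S65 and S66 can be sketched as follows. The line-of-sight check against the dynamic information map M1 is stubbed out as an assumed helper, since its implementation is not specified here; "100" is the example added value from the text.

```python
def has_blind_spot_factor(dynamic_map, evaluation_target, prediction_target) -> bool:
    # Assumed helper: true when the map shows a building or another
    # moving object blocking the line of sight between the two objects.
    return dynamic_map.line_of_sight_blocked(evaluation_target, prediction_target)

def apply_blind_spot_addition(base_value: int, blind_spot: bool) -> int:
    """Add the blind-spot added value to the base value (step S66)."""
    ADDED_VALUE = 100  # example added value when a blind spot factor exists
    return base_value + ADDED_VALUE if blind_spot else base_value
```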
Next, in step S67, the calculation unit 31a performs an addition process of adding values to the base value according to the attribute contents of the collision prediction target (step S67).
FIG. 14 is a flowchart illustrating an example of the addition process for the collision prediction target in FIG. 13.
First, the calculation unit 31a determines whether the moving object that is the collision prediction target is a vehicle (step S70).
If it determines that the collision prediction target is the vehicle 5, the calculation unit 31a proceeds to step S71, executes the vehicle addition process, and ends the process.
FIG. 15 is a flowchart showing an example of the vehicle addition process in FIG. 14.
In the vehicle addition process, the calculation unit 31a first refers to each piece of attribute information of the collision prediction target (vehicle 5) in the mobile object database 34a.
The calculation unit 31a performs the addition for the "size" attribute (step S81).
An added value is set in advance for each of the attribute contents of "size": large, medium, and small.
When the "size" attribute content is large, the calculation unit 31a adds "50" to the base value; when it is medium, it adds "30"; and when it is small, it adds "20".
In general, the larger the vehicle 5, the higher the possibility of a collision is considered to be. Therefore, the added value is set to increase as the size of the vehicle 5 increases.
In this way, the calculation unit 31a adds the added value corresponding to the "size" of the collision prediction target to the base value (step S81).
Next, the calculation unit 31a performs the addition for the "shape" attribute (step S82).
An added value is set in advance for each of the attribute contents of "shape": sedan, coupe, wagon, minivan, SUV, convertible, and truck. For example, when the accident rate or the like differs according to the shape of the vehicle 5, a larger added value can be set for the attribute contents with higher accident rates.
For example, when the "shape" attribute content is coupe, convertible, or truck, the added value "30" is added; when it is sedan, wagon, minivan, or SUV, the added value "10" is added.
In this way, the calculation unit 31a adds the added value corresponding to the "shape" of the collision prediction target to the base value (step S82).
Next, the calculation unit 31a performs the addition for the "color" attribute (step S83).
An added value is also set in advance for each of the attribute contents of "color": high lightness and low lightness. For example, when there is a difference in the accident rate or the like between high lightness and low lightness, a larger added value can be set for the attribute content with the higher accident rate.
For example, when the "color" attribute content is high lightness, no added value is added; when it is low lightness, the added value "10" is added.
In this way, the calculation unit 31a adds the added value corresponding to the "color" of the collision prediction target to the base value (step S83).
Next, the calculation unit 31a determines whether the place name shown in the "license plate display" attribute information is non-local (step S84).
If the place name on the license plate differs from the place name of the transport branch office with jurisdiction over the current position of the vehicle 5, the calculation unit 31a determines that the place name display is not local.
If it determines in step S84 that the place name display on the license plate is not local, the calculation unit 31a adds the added value to the base value (step S85) and proceeds to step S86.
On the other hand, if it determines in step S84 that the place name display on the license plate is local, the calculation unit 31a proceeds directly to step S86.
For example, when the attribute content of "license plate display" is local, no added value is added; when it is not local, the added value "10" is added.
Steps S81 to S85 are processing relating to appearance attributes, and step S86 is processing relating to the behavior attribute.
Next, the calculation unit 31a performs the addition for the past driving history, which is the behavior attribute in the attribute information (step S86).
For each of the behavior attributes "number of past lane changes", "number of past sudden-steering/sudden-braking events", "number of lane changes after a warning", and "number of sudden-steering/sudden-braking events after a warning", an added value corresponding to the count is set in advance. The added value set for each of these behavior attributes increases as the count, which is the attribute content, increases.
When lane changes or sudden steering and braking operations are performed frequently, the probability of behavior with a high possibility of causing a collision is considered high. For this reason, a larger added value is set as the number of lane changes and the number of sudden-steering/sudden-braking events increase.
Furthermore, between "number of past lane changes" and "number of lane changes after a warning", a relatively larger added value is set for "number of lane changes after a warning".
Similarly, between "number of past sudden-steering/sudden-braking events" and "number of sudden-steering/sudden-braking events after a warning", a relatively larger added value is set for "number of sudden-steering/sudden-braking events after a warning".
This is because a vehicle that repeats lane changes or sudden steering and braking despite a warning notified from the edge server 3 is considered even more likely to behave in a way that has a high possibility of causing a collision.
In this way, the calculation unit 31a adds the added values corresponding to the past driving history to the base value (step S86) and ends the addition process.
Although the case of adding values according to the past driving history is illustrated here, an added value according to the current driving state, such as the current driving speed, may further be added to the base value.
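The vehicle addition process of steps S81 to S86 can be sketched as below. The attribute keys and the per-event weights for the driving history are illustrative assumptions; the text specifies only the example values for size, shape, color, and license plate, and states that the history added values grow with the counts and are larger for post-warning events.

```python
SIZE_ADDED = {"large": 50, "medium": 30, "small": 20}           # step S81
SHAPE_ADDED = {"coupe": 30, "convertible": 30, "truck": 30,     # step S82
               "sedan": 10, "wagon": 10, "minivan": 10, "suv": 10}

def vehicle_addition(base_value: int, attrs: dict) -> int:
    """Add vehicle-attribute values to the base value (steps S81 to S86)."""
    value = base_value
    value += SIZE_ADDED.get(attrs.get("size"), 0)
    value += SHAPE_ADDED.get(attrs.get("shape"), 0)
    if attrs.get("color") == "low_lightness":                   # step S83
        value += 10
    if not attrs.get("local_plate", True):                      # steps S84-S85
        value += 10
    # Step S86: the per-event weights below are assumptions; post-warning
    # events weigh more, as described in the text.
    value += 5 * attrs.get("lane_changes", 0)
    value += 10 * attrs.get("lane_changes_after_warning", 0)
    value += 5 * attrs.get("sudden_maneuvers", 0)
    value += 10 * attrs.get("sudden_maneuvers_after_warning", 0)
    return value
```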
Returning to FIG. 14, if it determines in step S70 that the collision prediction target is not the vehicle 5, the calculation unit 31a proceeds to step S72, executes the pedestrian addition process, and ends the process.
FIG. 16 is a flowchart showing an example of the pedestrian addition process in FIG. 14.
In the pedestrian addition process, the calculation unit 31a first refers to each piece of attribute information of the collision prediction target (pedestrian 7) in the mobile object database 34a.
The calculation unit 31a determines whether the "height" in the attribute information is equal to or less than a predetermined value (step S91).
If it determines in step S91 that the "height" is equal to or less than the predetermined value, the calculation unit 31a adds the added value to the base value (step S92) and proceeds to step S93.
On the other hand, if it determines in step S91 that the "height" is not equal to or less than the predetermined value, the calculation unit 31a proceeds directly to step S93.
The predetermined value is set to, for example, 120 cm. In this case, step S91 can determine whether the pedestrian is a child younger than roughly the lower grades of elementary school or some other pedestrian. Thus, the added value can be added when the pedestrian is a child younger than roughly the lower grades of elementary school. For example, the added value in step S92 is "20".
Next, the calculation unit 31a determines whether the attribute content of "clothing" is student (step S93).
If it determines in step S93 that the attribute content of "clothing" is student, the calculation unit 31a adds the added value to the base value (step S94) and proceeds to step S95.
On the other hand, if it determines in step S93 that the attribute content of "clothing" is not student, the calculation unit 31a proceeds directly to step S95.
In steps S93 and S94, the added value can be added when the pedestrian is a student. For example, the added value in step S94 is "10".
Next, the calculation unit 31a determines whether the attribute content of "earphones or inattention" applies (step S95).
If it determines in step S95 that the attribute content of "earphones or inattention" applies, the calculation unit 31a adds the added value to the base value (step S96) and proceeds to step S97.
On the other hand, if it determines in step S95 that the attribute content of "earphones or inattention" does not apply, the calculation unit 31a proceeds directly to step S97.
In steps S95 and S96, the added value can be added when the pedestrian is wearing earphones or looking at a terminal such as a smartphone while walking. For example, the added value in step S96 is "10".
Next, the calculation unit 31a determines whether the attribute content of "crutches or wheelchair" applies (step S97).
If it determines in step S97 that the attribute content of "crutches or wheelchair" applies, the calculation unit 31a adds the added value to the base value (step S98) and proceeds to step S99.
On the other hand, if it determines in step S97 that the attribute content of "crutches or wheelchair" does not apply, the calculation unit 31a proceeds directly to step S99.
In steps S97 and S98, the added value can be added when the pedestrian is using crutches or a wheelchair. For example, the added value in step S98 is "30".
Steps S91 to S98 are processing relating to appearance attributes, and the processing from step S99 onward relates to behavior attributes.
The calculation unit 31a determines whether the attribute content of "walking location" is roadway (step S99).
If it determines in step S99 that the attribute content of "walking location" is roadway, the calculation unit 31a adds the added value to the base value (step S100) and proceeds to step S101.
On the other hand, if it determines in step S99 that the attribute content of "walking location" is not roadway, the calculation unit 31a proceeds directly to step S101.
In steps S99 and S100, the added value can be added when the pedestrian is walking on a roadway. For example, the added value in step S100 is "30".
Next, the calculation unit 31a determines whether or not the walking speed is lower than a predetermined value (step S101).
Whether the pedestrian's walking speed is lower than the predetermined value can be determined by referring to the moving object information in the moving object database 34a.
The predetermined value compared with the walking speed is set to, for example, 3.6 km per hour, a typical pedestrian walking speed.
If it is determined in step S101 that the walking speed is lower than the predetermined value, the calculation unit 31a adds the added value to the base value (step S102) and proceeds to step S103.
On the other hand, if it is determined in step S101 that the walking speed is not lower than the predetermined value, the calculation unit 31a proceeds to step S103.
In steps S101 and S102, when the pedestrian's walking speed is low, the added value can be added. For example, the added value in step S102 is "10".
Next, the calculation unit 31a determines whether or not the attribute content of "signal ignoring" is applicable (step S103).
If it is determined in step S103 that the attribute content of "signal ignoring" is applicable, the calculation unit 31a adds the added value to the base value (step S104) and proceeds to step S105.
On the other hand, if it is determined in step S103 that the attribute content of "signal ignoring" is not applicable, the calculation unit 31a proceeds to step S105.
In steps S103 and S104, when the pedestrian is ignoring a traffic signal, the added value can be added. For example, the added value in step S104 is "50".
Next, the calculation unit 31a determines whether or not the attribute content of "walking locus" is meandering (step S105).
If it is determined in step S105 that the attribute content of "walking locus" is meandering, the calculation unit 31a adds the added value to the base value (step S106) and ends the addition processing.
On the other hand, if it is determined in step S105 that the attribute content of "walking locus" is not meandering (is straight), the calculation unit 31a ends the addition processing.
In steps S105 and S106, the added value can be added when the pedestrian is meandering, for example because the pedestrian is intoxicated from drinking or the like. For example, the added value in step S106 is "50".
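The pedestrian-side addition processing of steps S97 through S106 can be sketched as follows. This is a minimal illustration, not the patented implementation: the attribute names and the dictionary structure are assumptions, while the added values ("30", "30", "10", "50", "50") and the 3.6 km/h threshold come from the examples given above.

```python
# Sketch of the pedestrian addition processing (steps S97-S106).
# Attribute keys are illustrative assumptions; the added values are
# the example values stated in the description.

WALKING_SPEED_THRESHOLD_KMH = 3.6  # typical pedestrian walking speed

def pedestrian_added_value(attrs: dict) -> int:
    """Return the total value to add to the base value for one pedestrian."""
    added = 0
    # Appearance attribute (steps S97-S98)
    if attrs.get("crutch_or_wheelchair"):
        added += 30
    # Action attributes (step S99 onward)
    if attrs.get("walking_place") == "road":          # steps S99-S100
        added += 30
    if attrs.get("walking_speed_kmh",
                 WALKING_SPEED_THRESHOLD_KMH) < WALKING_SPEED_THRESHOLD_KMH:
        added += 10                                   # steps S101-S102
    if attrs.get("ignores_signal"):                   # steps S103-S104
        added += 50
    if attrs.get("walking_locus") == "meandering":    # steps S105-S106
        added += 50
    return added
```

For instance, a pedestrian ignoring the signal while walking on the road would contribute 50 + 30 = 80 to the base value under these example values.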
As described above, the calculation unit 31a performs the addition processing after distinguishing between vehicles and pedestrians (FIG. 14).
Returning to FIG. 13, after completing the addition processing in step S67, the calculation unit 31a proceeds to step S68 and performs addition processing that adds values to the base value according to the attribute contents of the evaluation target (step S68).
The addition processing in step S68 has the same content as that in step S67 except that it is performed on the evaluation target, so its description is omitted here.
After completing the addition processing in step S68, the calculation unit 31a ends the individual calculation processing.
In the addition processing in steps S67 and S68, the calculation unit 31a adds to the base value an added value corresponding to the attribute contents of the appearance attributes of the evaluation target and the collision prediction target (the appearance attribute value) and an added value corresponding to the attribute contents of their action attributes (the action attribute value).
As a result, the appearance of the evaluation target and the collision prediction target, and their past or current behavior, can be reflected in the evaluation value.
In other words, factors that affect the collision prediction, which appear in the appearance and behavior of the evaluation target and the collision prediction target, can be reflected in the evaluation value.
Returning to FIG. 12, when the individual calculation processing in step S55 is completed, the calculation unit 31a proceeds to step S57 and registers the sum of the values added in the individual calculation processing and the base value in the evaluation value database 34b as the evaluation value of the collision prediction result (step S57).
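The computation in step S57 can be sketched as below: the evaluation value is the base value derived from the collision prediction time plus the added values obtained for both the collision prediction target (step S67) and the evaluation target (step S68). The function and parameter names, and the numbers in the usage note, are illustrative assumptions.

```python
# Sketch of step S57: evaluation value = base value + added values for
# both the evaluation target and the collision prediction target.
# Names are illustrative; this is not the patented implementation.

def evaluation_value(base_value: int,
                     target_added: int,
                     collision_target_added: int) -> int:
    """Combine the base value with both targets' added values."""
    return base_value + target_added + collision_target_added

def register(evaluation_db: dict, target_id: str,
             other_id: str, value: int) -> None:
    """Store the value in the evaluation value database (cf. database 34b)."""
    evaluation_db.setdefault(target_id, {})[other_id] = value
```

With assumed numbers, a base value of 420 plus added values of 50 and 30 would yield an evaluation value of 500, which is then registered under the evaluation target's ID.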
The calculation unit 31a sequentially performs steps S55 and S57 in FIG. 12 for the specified evaluation targets, repeating the processing until all the evaluation targets have been processed.
After obtaining the evaluation values for all the specified evaluation targets and registering them in the evaluation value database 34b, the calculation unit 31a returns to step S51 and repeats the same processing.
Thus, by repeating the calculation processing, the calculation unit 31a repeatedly calculates and registers the evaluation value of each specified evaluation target, updating the registered contents of the evaluation value database 34b as needed.
FIG. 17 is a diagram illustrating an example of the evaluation value database 34b.
As shown in FIG. 17, the evaluation value database 34b registers the moving object ID of each evaluation target in association with the evaluation values of the moving objects other than that evaluation target (the other moving objects).
The evaluation value database 34b has a column for the moving object ID of the evaluation target, a column for the vehicle ID (mobile ID), and columns for the evaluation values of the moving objects other than the evaluation target.
The moving object ID of the moving object specified as an evaluation target is registered in the evaluation target's moving object ID column.
The vehicle ID (mobile ID) of the mobile terminal (the pedestrian terminal 70 or the in-vehicle device 50) carried by the evaluation target is registered in the vehicle ID (mobile ID) column.
The evaluation value columns for the moving objects other than the evaluation target include one column per moving object ID of those other moving objects, so that the evaluation value of each of the other moving objects is registered there.
As described above, the evaluation value for a moving object that is not a collision prediction target is set to "0".
Therefore, a moving object other than the evaluation target for which a value other than "0" is registered as the evaluation value is a collision prediction target of that evaluation target.
For example, in FIG. 17, among the evaluation values registered for the evaluation target whose moving object ID is "1006", "500" is registered in the column corresponding to the moving object ID "1004", and "0" is registered in all the other columns. This shows that the moving object with the moving object ID "1004" is specified as the collision prediction target of the evaluation target with the moving object ID "1006".
In this manner, by referring to the evaluation value database 34b, the collision prediction target of each evaluation target can be specified.
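The structure of the evaluation value database 34b in FIG. 17 can be sketched as a mapping from each evaluation target's ID to the evaluation values of the other moving objects. The surrounding IDs "1001" and "1005" are assumed example entries; only the "1006"/"1004"/"500" values come from the description above.

```python
# Sketch of the evaluation value database 34b (FIG. 17). A value of "0"
# means the other moving object is not a collision prediction target.
# IDs other than "1006" and "1004" are assumed for illustration.

evaluation_db = {
    "1006": {"1001": 0, "1004": 500, "1005": 0},
}

def collision_prediction_targets(db: dict, target_id: str) -> list:
    """IDs of moving objects whose registered evaluation value is non-zero."""
    return [other for other, value in db.get(target_id, {}).items()
            if value != 0]
```

Querying the sketch for evaluation target "1006" returns only "1004", matching the example in FIG. 17.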
[Evaluation value judgment processing]
FIG. 18 is a flowchart illustrating an example of the determination processing performed by the determination unit 31b.
As illustrated in FIG. 18, the determination unit 31b refers to the evaluation value database 34b and determines whether there is an evaluation target whose evaluation value is equal to or greater than a preset threshold (the notification determination threshold) (step S110).
If it is determined in step S110 that there is no evaluation target whose evaluation value is equal to or greater than the notification determination threshold, the determination unit 31b repeats step S110. The determination unit 31b thus repeats step S110 until it determines that there is an evaluation target whose evaluation value is equal to or greater than the notification determination threshold.
If it is determined in step S110 that there is an evaluation target whose evaluation value is equal to or greater than the notification determination threshold, the determination unit 31b proceeds to step S111, decides to notify that evaluation target of the collision prediction result for its collision prediction target, and returns to step S110.
The determination unit 31b refers to the evaluation value of each evaluation target, and if there are multiple evaluation targets whose evaluation values are equal to or greater than the notification determination threshold, it decides to notify all of those evaluation targets of the collision prediction results of their collision prediction targets.
In this way, the determination unit 31b determines, for each evaluation target based on its evaluation value, whether to notify that evaluation target of its collision prediction result.
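The determination loop of steps S110 and S111 can be sketched as follows. The threshold value used here is an assumption purely for illustration; the description does not fix a concrete notification determination threshold at this point.

```python
# Sketch of the determination processing in FIG. 18 (steps S110-S111):
# select every evaluation target whose evaluation value for some other
# moving object reaches the notification determination threshold.
# The threshold value below is an assumed example.

NOTIFICATION_THRESHOLD = 300  # illustrative; not specified in the description

def targets_to_notify(evaluation_db: dict,
                      threshold: int = NOTIFICATION_THRESHOLD) -> list:
    """Return the IDs of evaluation targets that should be notified."""
    notify = []
    for target_id, others in evaluation_db.items():
        if any(value >= threshold for value in others.values()):
            notify.append(target_id)
    return notify
```

In an actual system this check would run repeatedly, as the flowchart returns to step S110 after each decision.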
In the present embodiment, whether to notify a collision prediction result is determined for each evaluation target (the pedestrian terminal 70, the in-vehicle device 50, or the moving object carrying one of them) based on that target's evaluation value, so necessary information can be appropriately provided to each evaluation target.
In addition, because the dynamic information map M1, on which dynamic information about the moving objects is superimposed, is used to obtain the evaluation value of each evaluation target's collision prediction result, collision prediction can also cover moving objects that do not carry a mobile terminal, and an appropriately predicted collision prediction result can be represented as an evaluation value for each evaluation target.
[Notification processing]
When the determination unit 31b decides to notify an evaluation target of a collision prediction result, the notification unit 31c notifies that evaluation target of the collision prediction result between the evaluation target and its collision prediction target.
This makes it possible to provide information only to the mobile terminals for which, based on the evaluation value, the collision prediction result has been determined to be necessary.
As described above, the collision prediction result that the notification unit 31c sends to the evaluation target includes the attributes of the collision prediction target, the direction from which the collision prediction target is approaching, the collision prediction time, and the like.
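The content of such a notification can be sketched as a small record type. The field names are illustrative assumptions; only the three kinds of content (attributes of the collision prediction target, approach direction, collision prediction time) come from the description above.

```python
# Sketch of the notified collision prediction result (cf. notification
# unit 31c). Field names are assumed for illustration only.

from dataclasses import dataclass

@dataclass
class CollisionPredictionResult:
    target_attributes: dict   # e.g. {"type": "pedestrian", "child": True}
    approach_direction: str   # direction the collision prediction target approaches from
    predicted_time_s: float   # collision prediction time until the predicted collision
```

A receiving mobile terminal would render these fields for its user, for example as the display D1 and arrow D2 described in scenario 1 below.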
The mobile terminal of the evaluation target (the pedestrian terminal 70 or the in-vehicle device 50) that receives the notification from the notification unit 31c outputs the notified collision prediction result to its user.
According to the above configuration, the evaluation value used to determine whether to notify a collision prediction result is obtained from the collision prediction time, the appearance attribute value, and the action attribute value, so the appearance and behavior of the evaluation target and the collision prediction target can be reflected in the decision of whether to notify the mobile terminal of the collision prediction result.
Thus, for a collision prediction involving a moving object whose appearance or behavior indicates a high possibility of collision, the evaluation value can be set so as to be weighted toward notifying the collision prediction result, allowing the appearance and behavior of moving objects to be taken into account when judging the necessity of notification.
As a result, the necessity of notification can be determined appropriately, and collision prediction results for which notification is highly necessary can be appropriately provided to the mobile terminal.
[About Scenario 1]
Next, the operation of the information providing system of the present embodiment will be described according to scenarios assumed on a road.
FIG. 19 is a diagram showing the situation around an intersection in scenario 1.
In FIG. 19, the pedestrians 7A and 7B have gotten out of the vehicle 5C, which is parked just past the pedestrian crossing P, and are crossing the pedestrian crossing P. The pedestrian 7A is an adult, and the pedestrian 7B is a child (with a height of 120 cm or less).
At the moment the pedestrians 7A and 7B started to cross the pedestrian crossing P, the signal for the pedestrian crossing P was flashing green.
On the other hand, at the moment the pedestrians 7A and 7B started to cross the pedestrian crossing P, the signal for the route on which the vehicle 5A is traveling is red; however, because the signal for the pedestrian crossing P is flashing green, the vehicle 5A approaching the pedestrian crossing P recognizes that its own signal will soon turn green. Assume, therefore, that the vehicle 5A is attempting to pass through the intersection without stopping.
It is assumed that the vehicle 5A cannot see the pedestrians 7A and 7B because of the presence of the vehicle 5C.
Furthermore, the vehicle 5B is traveling behind the vehicle 5A.
It is assumed that the vehicles 5A, 5B, and 5C and the pedestrians 7A and 7B are registered by the system as moving objects in the dynamic information map M1 and the moving object database 34a.
It is also assumed that the vehicles 5A and 5B are equipped with the in-vehicle device 50 and that the pedestrians 7A and 7B carry the pedestrian terminal 70.
Now, suppose that partway across the pedestrian crossing P, the signal for the pedestrian crossing P turns red, and the pedestrians 7A and 7B hurry across the pedestrian crossing P anyway.
At this point, the vehicle 5A and the pedestrians 7A and 7B may collide on the pedestrian crossing P. The edge server 3 (its calculation unit 31a) therefore specifies the pedestrians 7A and 7B as the collision prediction targets of the vehicle 5A, and specifies the vehicle 5A as the collision prediction target of the pedestrians 7A and 7B. In addition, the pedestrians 7A and 7B have the attribute of being a child (the pedestrian 7B only) and are ignoring the signal. Furthermore, the vehicle 5C, a blind spot factor, exists between the vehicle 5A and the pedestrians 7A and 7B.
In this case, when obtaining the evaluation value with the vehicle 5A as the evaluation target, the edge server 3 obtains the base value from the collision prediction time and then adds to it, in addition to the values from the addition processing for the vehicle 5A, the value for the blind spot factor and the values from the addition processing for the pedestrians 7A and 7B, thereby obtaining the evaluation value. In the addition processing for the pedestrians 7A and 7B, added values for the pedestrians' heights and for ignoring the signal are added.
In the addition processing for a vehicle 5, values are added as appropriate according to the attribute contents of that vehicle's appearance attributes and action attributes. The same applies to the vehicles 5 appearing in the following description.
The notification determination threshold used to determine whether to notify a collision prediction result is set smaller than the evaluation value obtained in a case like this scenario, where multiple factors impairing safety coincide. In this scenario, therefore, the evaluation values of the pedestrians 7A and 7B for the vehicle 5A exceed the notification determination threshold, and the edge server 3 notifies the vehicle 5A of the collision prediction results concerning the pedestrians 7A and 7B.
Furthermore, because the edge server 3 performs the addition processing on both the evaluation target and the collision prediction target when obtaining an evaluation value, the evaluation value obtained with the pedestrians 7A and 7B as the evaluation targets is the same as the evaluation value obtained with the vehicle 5A as the evaluation target. The evaluation value of the vehicle 5A for the pedestrians 7A and 7B therefore exceeds the notification determination threshold, and the edge server 3 notifies the pedestrians 7A and 7B of the collision prediction result concerning the vehicle 5A.
Further, regarding the vehicle 5B: the vehicle 5A slows down as it approaches the pedestrian crossing P, and when the two vehicles come close to each other, the edge server 3 specifies the vehicle 5A as a collision prediction target of the vehicle 5B. If, as a result of the collision prediction, the collision prediction time is short, the evaluation value is set to a large value. As a result, the edge server 3 notifies the vehicle 5B of the collision prediction result for the vehicle 5A with respect to the vehicle 5B.
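One way the collision prediction time between two moving objects could be estimated from the position and speed information in the dynamic map is a straight-line closest-approach computation. This is a minimal sketch under assumed geometry (constant velocities, a fixed proximity radius); it is not the method of the disclosure.

```python
# Assumed straight-line extrapolation: solve |r + v*t| = radius for t,
# where r and v are the relative position (m) and velocity (m/s).

def collision_prediction_time(p1, v1, p2, v2, radius=2.0):
    """Time (s) until the objects come within `radius` meters, or None."""
    rx, ry = p2[0] - p1[0], p2[1] - p1[1]
    vx, vy = v2[0] - v1[0], v2[1] - v1[1]
    a = vx * vx + vy * vy
    b = 2 * (rx * vx + ry * vy)
    c = rx * rx + ry * ry - radius * radius
    if a == 0:
        return 0.0 if c <= 0 else None  # no relative motion
    disc = b * b - 4 * a * c
    if disc < 0:
        return None  # paths never come that close
    t = (-b - disc ** 0.5) / (2 * a)  # earlier root = first approach
    return t if t >= 0 else None


# Vehicle approaching a stationary pedestrian 30 m ahead at 10 m/s:
t = collision_prediction_time((0, 0), (10, 0), (30, 0), (0, 0))
print(round(t, 1))  # 2.8 -> a short time yields a large evaluation value
```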
In this way, the edge server 3 can appropriately provide the necessary information to each mobile terminal.
Based on the notification from the edge server 3, the in-vehicle devices 50 of the vehicles 5A and 5B and the pedestrian terminals 70 of the pedestrians 7A and 7B output the collision prediction result to the users of their own devices.
In FIG. 19, the output screen V1 of the in-vehicle device 50 of the vehicle 5A displays, for example, an indication D1 showing that a pedestrian will appear from the right side of the pedestrian crossing P ahead of the own vehicle, and an arrow D2 showing the traveling direction of that pedestrian.
Thereby, the edge server 3 can make the user of the vehicle 5A recognize in advance that a pedestrian will appear from the right side of the pedestrian crossing P ahead.
Here, the pedestrians 7A and 7B are displayed without being individually identified; however, for example, the control unit 51 of the vehicle 5A may use different display methods depending on whether the attribute of the pedestrian 7 is a child or an adult, because more caution is required when the pedestrian is a child.
That is, in the present system, the output mode of the collision prediction result may be controlled to differ according to the attribute contents of the appearance attribute and the behavior attribute of the collision prediction target. This makes it possible to output information to the user in an output mode that matches the characteristics of the attributes of the collision prediction target.
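Attribute-dependent output control of this kind could be sketched as a simple style-selection step in the terminal's output control unit. The attribute names and style fields below are hypothetical, chosen only to illustrate the idea of varying the output mode.

```python
# Hypothetical output-mode selection based on the collision prediction
# target's attribute (field names are assumptions for illustration).

def display_style(target):
    """Pick how a pedestrian warning is rendered on the output screen."""
    style = {"icon": "pedestrian", "color": "yellow", "sound": False}
    if target.get("attribute") == "child":
        # a child calls for more caution: highlight and add an alert sound
        style.update(color="red", sound=True)
    return style


print(display_style({"attribute": "child"}))
print(display_style({"attribute": "adult"}))
```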
The output screen V2 of the pedestrian terminals 70 of the pedestrians 7A and 7B displays, for example, an indication D3 showing that the vehicle 5A will appear from the left side of the pedestrian crossing P ahead, and an arrow D4 showing the traveling direction of the vehicle 5A.
Thereby, the edge server 3 can make the pedestrians 7A and 7B recognize in advance that a vehicle will appear from the left side of the pedestrian crossing P ahead.
The output screen V3 of the in-vehicle device 50 of the vehicle 5B displays, for example, an indication D5 calling attention to the vehicle 5A traveling ahead of the own vehicle, and an arrow D6 showing the traveling direction of the vehicle 5A.
Thereby, the edge server 3 can make the user of the vehicle 5B recognize in advance that the vehicle 5A traveling ahead will stop and come closer.
In this way, the edge server 3 can cause the in-vehicle devices 50 of the vehicles 5A and 5B and the pedestrian terminals 70 of the pedestrians 7A and 7B to output to their users, thereby helping the moving objects avoid colliding with one another.
The output of the collision prediction result by the pedestrian terminal 70 and the in-vehicle device 50 is controlled by output control units included in the control unit 71 of the pedestrian terminal 70 and the control unit 51 of the in-vehicle device 50, respectively.
When the control unit 31 of the edge server 3 has an output control unit capable of controlling the output to the user by the pedestrian terminal 70 and the in-vehicle device 50, the output control unit of the edge server 3 may control the output to the user by the pedestrian terminal 70 and the in-vehicle device 50.
Although FIG. 19 shows the case where the position and the traveling direction of the collision prediction target are displayed on the output screens V1, V2, and V3, the collision prediction time may be displayed together with them.
Further, this scenario shows a case where factors that impair safety with respect to the pedestrians overlap. If, for example, the pedestrians 7A and 7B have no safety-impairing factor such as ignoring the traffic signal and the evaluation value does not exceed the notification determination threshold, the edge server 3 does not notify the vehicle 5A or the pedestrians 7A and 7B of the collision prediction result.
On the other hand, even if the pedestrians 7A and 7B have no safety-impairing factor such as ignoring the traffic signal, the evaluation value may exceed the notification determination threshold because addition values are added according to the attribute contents of the vehicle 5A.
In the addition process for vehicles (FIG. 15), addition values are added according to the size, shape, color, and license plate display of the vehicle 5A and its past traveling history.
Therefore, even in a case where the pedestrians 7A and 7B are specified as collision prediction targets of the vehicle 5A but the collision prediction result would not otherwise be notified, the edge server 3 may, depending on the attributes of the vehicle 5A, notify the pedestrians 7A and 7B and the vehicle 5A of the collision prediction result.
In this way, the attributes of the vehicle 5A can be taken into account in determining whether notification of the collision prediction result is necessary. As a result, the necessity of notifying the collision prediction result can be determined appropriately, and the collision prediction result can be notified appropriately.
[About Scenario 2]
FIG. 20 is a diagram showing the situation around an intersection according to Scenario 2.
In FIG. 20, the settings of the vehicles 5A and 5B and the pedestrians 7A and 7B are the same as in Scenario 1, except that Scenario 2 does not include the vehicle 5C of Scenario 1.
In Scenario 2, the pedestrians 7A and 7B are crossing the pedestrian crossing P at a speed lower than a typical walking speed (3.6 km/h).
Because the pedestrians 7A and 7B are crossing the pedestrian crossing P slowly, the signal light of the pedestrian crossing P was blinking green at the moment they started to cross, but changed to red while they were partway across; the pedestrians 7A and 7B continue to cross the pedestrian crossing P slowly at the same walking speed.
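Classifying a pedestrian as slower than the typical 3.6 km/h walking speed could be done from two timestamped positions in the dynamic map information. The sampling scheme below is an assumption for illustration; only the 3.6 km/h figure comes from the scenario.

```python
# Sketch: average walking speed from two timestamped (x, y) positions
# in meters; sampling interval and positions are assumed values.
import math

TYPICAL_WALKING_SPEED_KMH = 3.6  # stated in the scenario


def speed_kmh(p1, t1, p2, t2):
    """Average speed between two positions (m) over times (s), in km/h."""
    dist_m = math.hypot(p2[0] - p1[0], p2[1] - p1[1])
    return dist_m / (t2 - t1) * 3.6  # m/s -> km/h


v = speed_kmh((0.0, 0.0), 0.0, (1.5, 0.0), 3.0)  # 1.5 m covered in 3 s
print(v, v < TYPICAL_WALKING_SPEED_KMH)  # slower than typical -> addition applies
```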
In this case, unlike Scenario 1, there is no blind-spot factor between the vehicle 5A and the pedestrians 7A and 7B.
As described above, because the pedestrians 7A and 7B are crossing slowly, the signal light of the pedestrian crossing P, which was blinking green, changed to red while they were partway across, and the pedestrians 7A and 7B continue to cross the pedestrian crossing P slowly at the same walking speed.
Here, the vehicle 5A and the pedestrians 7A and 7B may collide on the pedestrian crossing. Therefore, the edge server 3 specifies the pedestrians 7A and 7B as collision prediction targets of the vehicle 5A, and specifies the vehicle 5A as a collision prediction target of the pedestrians 7A and 7B.
At this time, when obtaining the evaluation value with the vehicle 5A as the evaluation target, the edge server 3 obtains a base value from the collision prediction time and adds to it, in addition to the addition value from the addition process for the vehicle 5, the addition value from the addition process for the pedestrians 7A and 7B, to obtain the evaluation value. In this addition process for the pedestrians 7A and 7B, addition values for the pedestrians' height (120 cm or less), the pedestrians' speed, and ignoring the traffic signal are added.
As a result, the evaluation values of the pedestrians 7A and 7B with respect to the vehicle 5A become larger than the notification determination threshold, and the edge server 3 notifies the vehicle 5A of the collision prediction result for the pedestrians 7A and 7B.
The edge server 3 also notifies the pedestrians 7A and 7B of the result of the collision prediction of the vehicle 5A with respect to the pedestrians 7A and 7B.
Further, regarding the vehicle 5B: the vehicle 5A slows down as it approaches the pedestrian crossing P, and when the two vehicles come close to each other, the edge server 3 specifies the vehicle 5A as a collision prediction target of the vehicle 5B. If, as a result of the collision prediction, the collision prediction time is short, the evaluation value is set to a large value. As a result, the edge server 3 notifies the vehicle 5B of the collision prediction result for the vehicle 5A with respect to the vehicle 5B.
As described above, based on the notification from the edge server 3, the in-vehicle devices 50 of the vehicles 5A and 5B and the pedestrian terminals 70 of the pedestrians 7A and 7B output the collision prediction result to the users of their own devices.
Thereby, the edge server 3 can make the user of the vehicle 5A recognize in advance that a pedestrian will appear from the right side of the pedestrian crossing P ahead.
The edge server 3 can also make the pedestrians 7A and 7B recognize in advance that a vehicle will appear from the left side of the pedestrian crossing P ahead.
In this way, the edge server 3 can cause the in-vehicle devices 50 of the vehicles 5A and 5B and the pedestrian terminals 70 of the pedestrians 7A and 7B to output to their users, thereby helping the moving objects avoid colliding with one another.
[About Scenario 3]
FIG. 21 is a diagram showing the situation around an intersection according to Scenario 3.
In FIG. 21, the pedestrians 7A and 7B, while walking on the sidewalk H, encounter an obstacle G1 on the sidewalk H and step out onto the roadway to walk around the obstacle G1. The pedestrian 7A is an adult, and the pedestrian 7B is a child.
Here, the vehicle 5A and the pedestrians 7A and 7B may collide on the roadway. Therefore, the edge server 3 specifies the pedestrians 7A and 7B as collision prediction targets of the vehicle 5A, and specifies the vehicle 5A as a collision prediction target of the pedestrians 7A and 7B. Further, the pedestrians 7A and 7B have the attribute of being a child (the pedestrian 7B only) and are not walking on the sidewalk.
At this time, when obtaining the evaluation values of the pedestrians 7A and 7B with the vehicle 5A as the evaluation target, the edge server 3 obtains a base value from the collision prediction time and adds to it, in addition to the addition value from the addition process for the vehicle 5, the addition value from the addition process for the pedestrians 7A and 7B, to obtain the evaluation value. In this addition process for the pedestrians 7A and 7B, addition values for the pedestrians' height (120 cm or less) and for the pedestrians' walking location (roadway) are added.
As a result, the evaluation values of the pedestrians 7A and 7B with respect to the vehicle 5A become larger than the notification determination threshold, and the edge server 3 notifies the vehicle 5A of the collision prediction result for the pedestrians 7A and 7B.
The edge server 3 also notifies the pedestrians 7A and 7B of the collision prediction result for the vehicle 5A with respect to the pedestrians 7A and 7B.
Further, if the vehicle 5A tries to move into the opposite lane to avoid the pedestrians 7A and 7B, the vehicle 5B traveling in the opposite lane may collide with the vehicle 5A. In this case, therefore, the edge server 3 specifies the vehicle 5A as a collision prediction target of the vehicle 5B. In addition, the vehicle 5D, which is a blind-spot factor, exists between the vehicle 5B and the vehicle 5A.
When obtaining the evaluation value with the vehicle 5B as the evaluation target, the edge server 3 adds an addition value for the presence of a blind-spot factor to the evaluation value obtained from the collision prediction time.
If the evaluation value of the vehicle 5A with respect to the vehicle 5B becomes larger than the notification determination threshold, the edge server 3 notifies the vehicle 5B of the collision prediction result for the vehicle 5A.
Further, regarding the vehicle 5C: when the vehicle 5B slows down because the vehicle 5A moves into its lane and the two vehicles come close to each other, the edge server 3 specifies the vehicle 5B as a collision prediction target of the vehicle 5C. If, as a result of the collision prediction, the collision prediction time is short, the evaluation value is set to a large value. As a result, the edge server 3 notifies the vehicle 5C of the collision prediction result for the vehicle 5B with respect to the vehicle 5C.
In this way, the edge server 3 can cause the in-vehicle devices 50 of the vehicles 5A, 5B, and 5C and the pedestrian terminals 70 of the pedestrians 7A and 7B to output to their users, thereby helping the moving objects avoid colliding with one another.
[About Scenario 4]
FIG. 22 is a diagram showing the situation around an intersection according to Scenario 4.
In FIG. 22, the pedestrians 7A and 7B, who were walking on the sidewalk H, are slipping through between the stopped vehicle 5C and the vehicle 5D and are attempting to cross the roadway at a place other than a pedestrian crossing. The pedestrian 7A is an adult, and the pedestrian 7B is a child.
Here, the vehicle 5A traveling in the opposite lane and the pedestrians 7A and 7B may collide on the roadway. Therefore, the edge server 3 specifies the pedestrians 7A and 7B as collision prediction targets of the vehicle 5A, and specifies the vehicle 5A as a collision prediction target of the pedestrians 7A and 7B. Further, the pedestrians 7A and 7B have the attribute of being a child (the pedestrian 7B only) and are walking on neither the sidewalk nor a pedestrian crossing. In addition, the vehicle 5D, which is a blind-spot factor, exists between the vehicle 5A and the pedestrians 7A and 7B.
At this time, when obtaining the evaluation values of the pedestrians 7A and 7B with the vehicle 5A as the evaluation target, the edge server 3 obtains a base value from the collision prediction time and adds to it, in addition to the addition value from the addition process for the vehicle 5, the addition values for the blind-spot factor and from the addition process for the pedestrians 7A and 7B, to obtain the evaluation value. In this addition process for the pedestrians 7A and 7B, addition values for the pedestrians' height (120 cm or less) and for the pedestrians' walking location (roadway) are added.
As a result, the evaluation values of the pedestrians 7A and 7B with respect to the vehicle 5A become larger than the notification determination threshold, and the edge server 3 notifies the vehicle 5A of the collision prediction result for the pedestrians 7A and 7B.
The edge server 3 also notifies the pedestrians 7A and 7B of the collision prediction result for the vehicle 5A with respect to the pedestrians 7A and 7B.
Further, regarding the vehicle 5B: when the vehicle 5A slows down because the pedestrians 7A and 7B are crossing the roadway and the two vehicles come close to each other, the edge server 3 specifies the vehicle 5A as a collision prediction target of the vehicle 5B. If, as a result of the collision prediction, the collision prediction time is short, the evaluation value is set to a large value. As a result, the edge server 3 notifies the vehicle 5B of the collision prediction result for the vehicle 5A with respect to the vehicle 5B.
As a method of detecting a moving object (pedestrian) slipping through between vehicles 5 as in this scenario, besides detection by the roadside sensor 8, the pedestrian can also be detected by an in-vehicle camera 59 if the vehicle 5 that the pedestrian slips past is equipped with an in-vehicle device 50 having the in-vehicle camera 59.
[About Scenario 5]
FIG. 23 is a diagram showing the situation around an intersection according to Scenario 5.
In FIG. 23, the pedestrians 7A and 7B are meandering as they walk along the sidewalk H. The pedestrian 7A is an adult, and the pedestrian 7B is a child.
If the pedestrians 7A and 7B, meandering along the sidewalk H, stray onto the roadway, the vehicle 5A and the pedestrians 7A and 7B may collide on the roadway. Therefore, the edge server 3 specifies the pedestrians 7A and 7B as collision prediction targets of the vehicle 5A, and specifies the vehicle 5A as a collision prediction target of the pedestrians 7A and 7B. Further, the pedestrians 7A and 7B have the attribute of being a child (the pedestrian 7B only), and if they stray onto the roadway, they are no longer walking on the sidewalk.
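A meandering walking trajectory of the kind this scenario uses as a behavior attribute could be flagged from the lateral deviation of recent positions relative to the sidewalk centerline. The thresholds and sampling below are assumptions for illustration, not a method stated in the disclosure.

```python
# Illustrative sketch: count large side-to-side swings in the lateral
# offset (m) from the sidewalk centerline; thresholds are assumed.

def is_meandering(lateral_offsets_m, threshold_m=0.5, min_swings=2):
    """True if the offsets change sign `min_swings`+ times beyond the threshold."""
    swings, last_sign = 0, 0
    for x in lateral_offsets_m:
        if abs(x) < threshold_m:
            continue  # small wobble, ignore
        sign = 1 if x > 0 else -1
        if last_sign and sign != last_sign:
            swings += 1
        last_sign = sign
    return swings >= min_swings


# offsets sampled along the walk: repeated large left/right excursions
print(is_meandering([0.1, 0.8, -0.7, 0.9, -0.6]))  # True -> meandering
print(is_meandering([0.1, 0.2, 0.1, 0.0]))         # False -> steady walk
```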
At this time, when obtaining the evaluation values of the pedestrians 7A and 7B with the vehicle 5A as the evaluation target, the edge server 3 obtains a base value from the collision prediction time and adds to it, in addition to the addition value from the addition process for the vehicle 5, the addition value from the addition process for the pedestrians 7A and 7B, to obtain the evaluation value. In this addition process for the pedestrians 7A and 7B, addition values for the pedestrians' height (120 cm or less), the pedestrians' walking trajectory (meandering), and the walking location (roadway) are added.
As a result, the evaluation values of the pedestrians 7A and 7B with respect to the vehicle 5A become larger than the notification determination threshold, and the edge server 3 notifies the vehicle 5A of the collision prediction result for the pedestrians 7A and 7B.
The edge server 3 also notifies the pedestrians 7A and 7B of the collision prediction result for the vehicle 5A with respect to the pedestrians 7A and 7B.
Further, if the vehicle 5A tries to move into the opposite lane to avoid the pedestrians 7A and 7B, the vehicle 5B traveling in the opposite lane may collide with the vehicle 5A. In this case, therefore, the edge server 3 specifies the vehicle 5A as a collision prediction target of the vehicle 5B.
The edge server 3 obtains the evaluation value with the vehicle 5B as the evaluation target based on the collision prediction time with respect to the pedestrians 7A and 7B.
If the evaluation value of the vehicle 5A with respect to the vehicle 5B becomes larger than the notification determination threshold, the edge server 3 notifies the vehicle 5B of the collision prediction result for the vehicle 5A.
[About Scenario 6]
FIG. 24 is a diagram showing the situation around an intersection according to Scenario 6.
In FIG. 24, the pedestrians 7A and 7B are crossing a pedestrian crossing P that crosses the route R1. At the moment the pedestrians 7A and 7B started to cross the pedestrian crossing P, the signal light of the pedestrian crossing P was blinking green, but it changed to red while they were partway across, and the pedestrians 7A and 7B are hurrying across the pedestrian crossing P as they are. The pedestrian 7A is an adult, and the pedestrian 7B is a child.
The vehicle 5A enters the intersection from the route R2, turns left, and is traveling through the intersection toward the route R1. The vehicle 5B is traveling behind the vehicle 5A. Because this is the timing at which the signal light of the pedestrian crossing P changes from blinking green to red, it is also the timing at which the traffic light ahead of the vehicles 5A and 5B switches from green to yellow to red. The vehicles 5A and 5B are therefore also hurrying to pass through the intersection.
Further, at the corner of the intersection between the route R2 and the route R1, there is a building G2 that blocks the line of sight between the route R1 and the route R2.
Here, the vehicle 5A turning left at the intersection toward the route R1 and the pedestrians 7A and 7B may collide on the roadway. In particular, because the building G2 blocking the line of sight exists between the route R1 and the route R2, the vehicle 5A and the pedestrians 7A and 7B are both unlikely to notice each other's presence until they are close to each other.
When the vehicle 5A turns left and travels toward the pedestrians 7A and 7B, the edge server 3 specifies the pedestrians 7A and 7B as collision prediction targets of the vehicle 5A, and specifies the vehicle 5A as a collision prediction target of the pedestrians 7A and 7B. Further, the pedestrians 7A and 7B have the attribute of being a child (the pedestrian 7B only) and are ignoring the traffic signal. In addition, the building G2, which is a blind-spot factor, exists between the vehicle 5A and the pedestrians 7A and 7B.
At this time, when obtaining the evaluation value with the vehicle 5A as the evaluation target, the edge server 3 obtains a base value from the collision prediction time and adds to the base value not only the added value from the addition processing concerning the vehicle 5A but also the added values concerning the blind spot factor and the pedestrians 7A and 7B, thereby obtaining the evaluation value. In the addition processing concerning the pedestrians 7A and 7B, added values for the pedestrian's height (120 cm or less) and for ignoring the traffic signal are added.
As a result, the evaluation values of the pedestrians 7A and 7B with respect to the vehicle 5A become larger than the notification determination threshold, and the edge server 3 notifies the vehicle 5A of the collision prediction results of the pedestrians 7A and 7B with respect to the vehicle 5A.
Further, the edge server 3 notifies the pedestrians 7A and 7B of the prediction result of the collision prediction of the vehicle 5A with respect to the pedestrians 7A and 7B.
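As a concrete illustration, the evaluation-value computation described here (a base value derived from the collision prediction time, plus added values for the blind spot factor and the pedestrian attributes, compared against the notification determination threshold) can be sketched as follows. All numeric values, thresholds, and names in this sketch are illustrative assumptions and are not fixed by this disclosure.

```python
# Illustrative sketch of the evaluation-value computation for vehicle 5A.
# Every concrete number below is an assumption for illustration only.

def base_value(collision_prediction_time_s: float) -> int:
    """Base value grows as the predicted collision time shrinks."""
    if collision_prediction_time_s < 3.0:
        return 200
    if collision_prediction_time_s < 6.0:
        return 100
    return 50

def evaluation_value(collision_prediction_time_s, vehicle_add, blind_spot, pedestrian):
    value = base_value(collision_prediction_time_s)
    value += vehicle_add                            # addition processing for the vehicle
    if blind_spot:                                  # e.g. building G2 between the two
        value += 100
    if pedestrian.get("height_cm", 999) <= 120:     # small child, harder to see
        value += 50
    if pedestrian.get("ignoring_signal"):
        value += 50
    return value

NOTIFY_THRESHOLD = 250  # assumed notification determination threshold

ev = evaluation_value(2.5, vehicle_add=30,
                      blind_spot=True,
                      pedestrian={"height_cm": 110, "ignoring_signal": True})
print(ev, ev > NOTIFY_THRESHOLD)
```

In this assumed parameterization, the blind spot factor and the child/signal attributes together more than double the base value, which is what pushes the result past the threshold in the scenario above.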
Further, regarding the vehicle 5B, when the vehicle 5A notices the pedestrians 7A and 7B on the pedestrian crossing P and slows down, and the two vehicles come close to each other, the edge server 3 specifies the vehicle 5A as a collision prediction target of the vehicle 5B. If, as a result of the collision prediction, the collision prediction time is short, the evaluation value is set to a large value. As a result, the edge server 3 notifies the vehicle 5B of the collision prediction result of the vehicle 5A with respect to the vehicle 5B.
As described above, the in-vehicle devices 50 of the vehicles 5A and 5B and the pedestrian terminals 70 of the pedestrians 7A and 7B output the collision prediction results to the users of their own devices based on the notifications from the edge server 3.
Thereby, the edge server 3 can make the user of the vehicle 5A recognize in advance that the pedestrians are crossing the pedestrian crossing P.
In addition, the edge server 3 can make the pedestrians 7A and 7B recognize in advance that the vehicle 5A will appear from the right side of the pedestrian crossing P.
Thereby, the edge server 3 can cause the in-vehicle devices 50 of the vehicles 5A and 5B and the pedestrian terminals 70 of the pedestrians 7A and 7B to output to their users, thereby enabling the moving objects to avoid colliding with one another.
[About the Second Embodiment]
FIG. 25 is a diagram illustrating the storage unit 34 of the edge server 3 according to the second embodiment.
The storage unit 34 of the present embodiment stores a relative distance database 34c in addition to the dynamic information map M1, the moving object database 34a, and the evaluation value database 34b.
In the relative distance database 34c, the relative distances between the moving objects registered in the moving object database 34a are registered.
The calculation unit 31a of the present embodiment differs from that of the first embodiment in that, in addition to the individual calculation processing that obtains an evaluation value using the movement information (position information, azimuth information, and speed information) of the evaluation target (target moving object) and of the moving objects other than the evaluation target (other moving objects), it further performs addition processing based on the relative distances between the moving objects registered in the relative distance database 34c.
The configuration other than the configuration described below is the same as that of the first embodiment.
FIG. 26 is a diagram illustrating an example of the relative distance database 34c.
As shown in FIG. 26, the relative distances between the moving objects registered in the moving object database 34a are registered in the relative distance database 34c.
In the relative distance database 34c, a relative distance to each other moving object ID is registered for each moving object ID. Past relative distances are also registered in the relative distance database 34c.
For example, in the relative distance database 34c, for the relative distance between the moving object ID "1001" and the moving object ID "1002", "132 m" is registered as the relative distance obtained from the most recently updated position information, "152 m" is registered as the relative distance obtained from the position information at the previous update, and "172 m" is registered as the relative distance obtained from the position information at the update before that. That is, a plurality of relative distances arranged in chronological order at each update interval are registered in the relative distance database 34c.
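As a minimal sketch, the relative distance database 34c can be modeled as a mapping from a pair of moving object IDs to a short, newest-first history of distances, matching the "132 m / 152 m / 172 m" example above; the data structure and the three-entry history length are illustrative assumptions.

```python
from collections import defaultdict, deque

# Relative distance history per ID pair, newest distance first. A bounded
# history of three entries matches the example of the latest, previous,
# and before-previous updates.
relative_distance_db = defaultdict(lambda: deque(maxlen=3))

def register_distance(id_a: int, id_b: int, distance_m: float) -> None:
    key = (min(id_a, id_b), max(id_a, id_b))   # order-independent pair key
    relative_distance_db[key].appendleft(distance_m)

# Three successive updates for moving objects 1001 and 1002:
for d in (172.0, 152.0, 132.0):
    register_distance(1001, 1002, d)

print(list(relative_distance_db[(1001, 1002)]))  # newest first
```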
Each time the detection unit 31d updates the moving object database 34a, in which the position information of each moving object is registered, the detection unit 31d obtains the relative distances between the moving objects and registers them in the relative distance database 34c.
Therefore, the detection unit 31d updates the relative distance database 34c at the same time as it updates the moving object database 34a as needed.
FIG. 27 is a flowchart illustrating an example of the calculation processing performed by the calculation unit 31a of the present embodiment.
In the calculation processing of the present embodiment, addition processing based on the relative distance (step S59) is performed after the individual calculation processing in step S55.
In the addition processing based on the relative distance, an added value is further added to each evaluation value registered in step S57.
Therefore, step S59 will be mainly described here.
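The ordering described here, individual calculation (step S55, with the results registered in step S57) followed by the relative-distance addition (step S59), can be sketched as follows; the two inner functions are placeholders standing in for the processing described in the text, and all names and numbers are assumptions.

```python
def individual_calculation(target, db):
    # Placeholder for the first-embodiment individual calculation (step S55).
    return db["base"][target]

def relative_distance_addition(target, db):
    # Placeholder for the addition processing based on relative distance (step S59).
    return db["relative_add"].get(target, 0)

def calculation_processing(targets, db):
    # Step S55: individual calculation; results registered (step S57).
    evaluations = {t: individual_calculation(t, db) for t in targets}
    # Step S59: relative-distance addition applied to the registered values.
    for t in targets:
        evaluations[t] += relative_distance_addition(t, db)
    return evaluations

db = {"base": {1001: 150, 1002: 90}, "relative_add": {1001: 200}}
evals = calculation_processing([1001, 1002], db)
print(evals)
```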
FIG. 28 is a flowchart showing an example of the addition processing based on the relative distance in step S59 in FIG. 27.
As illustrated in FIG. 28, the calculation unit 31a first specifies the moving objects other than the evaluation target (step S120), and, among the moving objects other than the evaluation target, specifies as approaching targets those moving objects whose relative distance to the target moving object is within 100 m and whose displacement of the relative distance is negative (step S121).
The displacement of the relative distance is a value obtained by subtracting a past relative distance from the current relative distance. When the displacement of the relative distance is positive, the relative distance is increasing, which means that the evaluation target and the moving object other than the evaluation target are moving relatively away from each other. When the displacement of the relative distance is negative, the relative distance is decreasing, which means that the evaluation target and the moving object other than the evaluation target are moving relatively toward each other.
The calculation unit 31a obtains the displacement of the relative distance by subtracting the relative distance based on the position information updated before the latest update from the relative distance based on the most recently updated position information.
Note that the calculation unit 31a may subtract the relative distance based on the position information updated before the latest update from the relative distance based on the most recently updated position information, further obtain the displacement of the relative distance per unit time based on the update interval of the relative distance database 34c, and use this as the displacement of the relative distance.
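The displacement computation just described, current relative distance minus past relative distance, optionally normalized per unit time by the update interval, might look like this sketch (function names are illustrative):

```python
def displacement(current_m: float, past_m: float) -> float:
    """Current relative distance minus past relative distance.
    A negative result means the two moving objects are approaching."""
    return current_m - past_m

def displacement_per_second(current_m: float, past_m: float,
                            update_interval_s: float) -> float:
    """Displacement normalized by the update interval of the database."""
    return (current_m - past_m) / update_interval_s

# Using the example distances above: latest 132 m, previous 152 m.
d = displacement(132.0, 152.0)
print(d)  # negative, so the objects are approaching
```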
A moving object that is located within a predetermined distance of the target moving object and is approaching the target moving object has a possibility of colliding with or contacting the target moving object.
Therefore, the calculation unit 31a specifies, as an approaching target, any moving object other than the evaluation target that is located within 100 m of the target moving object and whose relative distance is decreasing so as to approach the target moving object.
That is, a moving object specified as an approaching target is a moving object that has a possibility of colliding with or contacting the target moving object.
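Steps S120 and S121, specifying the moving objects other than the evaluation target and then keeping those within 100 m whose relative-distance displacement is negative, can be sketched as follows; the record layout and IDs are assumptions for illustration.

```python
# Sketch of steps S120-S121. Each record holds the most recent and the
# previous relative distance to the target moving object.
others = {
    1002: {"latest_m": 90.0,  "previous_m": 110.0},  # within 100 m and approaching
    1003: {"latest_m": 80.0,  "previous_m": 60.0},   # within 100 m but receding
    1004: {"latest_m": 250.0, "previous_m": 260.0},  # approaching but too far away
}

def approaching_targets(others: dict, radius_m: float = 100.0) -> list:
    result = []
    for obj_id, rec in others.items():
        within = rec["latest_m"] <= radius_m
        # Negative displacement: current minus past distance is below zero.
        approaching = (rec["latest_m"] - rec["previous_m"]) < 0
        if within and approaching:
            result.append(obj_id)
    return result

print(approaching_targets(others))
```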
After step S121, the calculation unit 31a performs addition processing on the evaluation value of each moving object specified as an approaching target (hereinafter also referred to as an approaching moving object) (step S122).
FIG. 29 is a flowchart illustrating an example of the addition processing for the evaluation value.
The addition processing for an approaching moving object is processing of further adding an added value to the evaluation value of the approaching moving object with respect to the target moving object.
Note that the evaluation value of the approaching moving object with respect to the target moving object is obtained by the individual calculation processing.
The calculation unit 31a repeats the processing from step S131 to step S137 until all the approaching moving objects specified in step S121 have been processed.
In the following description of steps S132 to S136, the processing between the target moving object and one of the specified approaching moving objects will be described.
First, in step S132, the calculation unit 31a determines whether the relationship between the target moving object and the approaching moving object is pedestrian-to-pedestrian (step S132). If it is determined in step S132 that the relationship between the target moving object and the approaching moving object is pedestrian-to-pedestrian, the calculation unit 31a proceeds to step S137 without performing the addition processing on this approaching moving object. Thus, when the relationship between the target moving object and the approaching moving object is pedestrian-to-pedestrian, the addition processing is not performed.
On the other hand, if it is determined in step S132 that the relationship between the target moving object and the approaching moving object is not pedestrian-to-pedestrian, the calculation unit 31a proceeds to step S134 and determines whether the relationship between the target moving object and the approaching moving object is vehicle-to-vehicle (step S134).
If it is determined in step S134 that the relationship between the target moving object and the approaching moving object is not vehicle-to-vehicle, the calculation unit 31a proceeds to step S136 and performs processing corresponding to vehicle-to-pedestrian or pedestrian-to-vehicle.
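The branching of steps S132 to S136, no addition for pedestrian-to-pedestrian, step S135 for vehicle-to-vehicle, and step S136 for the mixed pair, amounts to a dispatch on the pair of moving-object types, sketched here with assumed type labels:

```python
def addition_branch(target_type: str, approach_type: str) -> str:
    """Return which addition processing applies to a target/approaching pair."""
    if target_type == "pedestrian" and approach_type == "pedestrian":
        return "none"                 # S132: no addition processing
    if target_type == "vehicle" and approach_type == "vehicle":
        return "vehicle_vs_vehicle"   # S135
    return "vehicle_vs_pedestrian"    # S136 (either order of the mixed pair)

print(addition_branch("vehicle", "pedestrian"))
print(addition_branch("pedestrian", "pedestrian"))
```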
FIG. 30 is a flowchart illustrating an example of the processing corresponding to vehicle-to-pedestrian or pedestrian-to-vehicle.
In this case, one of the target moving object and the approaching moving object is a vehicle, and the other is a pedestrian. Here, a case where the target moving object is a vehicle and the approaching moving object is a pedestrian will be described; however, the same processing is performed when the target moving object is a pedestrian and the approaching moving object is a vehicle.
As shown in FIG. 30, the calculation unit 31a first refers to the attribute "walking place" of the pedestrian, which is the approaching moving object, and determines whether its content is a sidewalk or the inside of a building (step S141).
The detection unit 31d of the present embodiment acquires whether the walking place of the pedestrian is a roadway, a sidewalk, other (inside a building), or a pedestrian crossing, and registers one of roadway, sidewalk, other (inside a building), and pedestrian crossing as the content of the pedestrian's behavior attribute "walking place".
If it is determined that the content of "walking place" is a sidewalk or the inside of a building, the calculation unit 31a ends the processing without performing the addition processing on this approaching moving object (pedestrian), and proceeds to step S137 (FIG. 29). When the pedestrian is on a sidewalk or inside a building, the possibility that the pedestrian will collide with the vehicle, which is the target moving object, is considered to be low. Therefore, when the pedestrian is on a sidewalk or inside a building, the calculation unit 31a does not perform the addition processing.
If it is determined in step S141 that the content of "walking place" is neither a sidewalk nor the inside of a building, the calculation unit 31a refers to the attribute "speed" of the vehicle, which is the target moving object, and determines whether the speed of the vehicle is 5 km/h or less (step S143).
If it is determined that the speed of the vehicle is 5 km/h or less, the calculation unit 31a ends the processing without performing the addition processing on the pedestrian, which is the approaching moving object, and proceeds to step S137 (FIG. 29). The reason the addition processing is not performed in this case is that, if the speed of the vehicle is 5 km/h or less, the possibility that the pedestrian will collide with the vehicle, which is the target moving object, is considered to be low.
If it is determined in step S143 that the speed of the vehicle is more than 5 km/h, the calculation unit 31a refers to the attribute "walking place" of the pedestrian and determines whether its content is a pedestrian crossing (step S144).
If it is determined that the content of "walking place" is a pedestrian crossing, the calculation unit 31a proceeds to step S145 and determines whether the pedestrian, which is the approaching moving object, has passed the center of the road crossed by the pedestrian crossing (step S145).
The determination of whether the pedestrian has passed the center of the road is performed by the calculation unit 31a. The calculation unit 31a refers to the map M1 and the attribute information of the pedestrian and determines, from the position and the moving direction of the pedestrian, whether the pedestrian walking on the pedestrian crossing is at a position past the center of the road, which is the midpoint of the pedestrian crossing.
If it is determined that the pedestrian has passed the center of the road, the calculation unit 31a determines whether the position of the pedestrian is within 2 m of reaching the sidewalk (step S146).
The determination of whether the position of the pedestrian is within 2 m of reaching the sidewalk is also performed by the calculation unit 31a. The calculation unit 31a refers to the map M1 and the attribute information of the pedestrian and determines, from the position and the moving direction of the pedestrian, whether the pedestrian is walking on the pedestrian crossing and the position of the pedestrian is within 2 m of reaching the sidewalk.
If it is determined that the position of the pedestrian is within 2 m of reaching the sidewalk, the calculation unit 31a ends the processing without performing the addition processing on the pedestrian, which is the approaching moving object, and proceeds to step S137 (FIG. 29). If the position of the pedestrian is within 2 m of reaching the sidewalk, the pedestrian will quickly finish crossing the pedestrian crossing, and the possibility that the pedestrian will collide with the vehicle, which is the target moving object, is considered to be low. Therefore, if it is determined in step S146 that the pedestrian is within 2 m of reaching the sidewalk, the calculation unit 31a does not perform the addition processing.
On the other hand, if it is determined in step S144 that the content of the pedestrian's behavior attribute "walking place" is not a pedestrian crossing, the calculation unit 31a proceeds to step S147, adds an added value to the evaluation value of the approaching moving object with respect to the target moving object (step S147), and ends the processing.
In step S147, the calculation unit 31a refers to the evaluation value database 34b and specifies the field in which the evaluation value between the moving object ID corresponding to the target moving object and the moving object ID corresponding to the approaching moving object is registered. The calculation unit 31a adds the added value to the evaluation value registered in the specified field.
The added value added to the evaluation value in step S147 is, for example, "200".
In this case, the pedestrian is highly likely to be walking on a roadway other than a pedestrian crossing, and the pedestrian's behavior attribute "walking place" is also roadway. Since the pedestrian is located on the roadway and the relative distance between the pedestrian and the vehicle, which are relatively approaching each other, is within 100 m, the possibility that the pedestrian and the vehicle will collide can be considered relatively high. Therefore, the added value added here corresponds to a high necessity of notification, and is set to the same value as the base value used in the individual calculation processing when the collision prediction time is less than 3 seconds.
Also, if it is determined in step S145 that the pedestrian has not passed the center of the road, the calculation unit 31a proceeds in this case as well to step S147, adds the added value to the evaluation value of the approaching moving object with respect to the target moving object (step S147), and ends the processing.
In this case, the pedestrian is walking on the pedestrian crossing but still needs a certain amount of time to finish crossing it, during which the approaching vehicle and the pedestrian may collide. Therefore, in this case as well, the added value is added to the evaluation value of the approaching moving object with respect to the target moving object.
If it is determined in step S146 that the pedestrian is not within 2 m of reaching the sidewalk, the calculation unit 31a proceeds to step S148, adds an addition value to the evaluation value of the approaching moving body for the target moving body (step S148), and ends the process.
In step S148, the calculation unit 31a refers to the evaluation value database 34b and identifies the column in which the evaluation value between the moving body ID corresponding to the target moving body and the moving body ID corresponding to the approaching moving body is registered. The calculation unit 31a then adds the addition value to the evaluation value registered in the identified column.
The addition value added to the evaluation value in step S148 is, for example, "100".
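The column lookup and addition in step S148 can be sketched as follows. This is a minimal illustration, not the patent's implementation; the dictionary layout of `eval_db` and the function name are assumptions.

```python
# Sketch of the evaluation value database 34b: for each target moving body
# ID, a row of evaluation values keyed by the other moving body's ID.
eval_db = {"1005": {"1001": 0, "1003": 0}}

def add_to_evaluation(eval_db, target_id, approaching_id, addition):
    # Identify the entry between the target moving body and the approaching
    # moving body, then add the addition value to the registered value.
    row = eval_db.setdefault(target_id, {})
    row[approaching_id] = row.get(approaching_id, 0) + addition
    return row[approaching_id]

# Step S148 adds, for example, "100".
add_to_evaluation(eval_db, "1005", "1001", 100)
```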
In this case, the pedestrian has finished crossing half of the pedestrian crossing but still needs a certain amount of time to finish crossing it, and during that time the approaching vehicle may collide with the pedestrian. Therefore, in this case as well, the addition value is added to the evaluation value of the approaching moving body for the target moving body.
Note that the time required for the pedestrian to finish crossing is shorter than in the case where it is determined that the pedestrian has not passed through the center of the road. Therefore, the addition value in step S148 is set to a value smaller than the addition value in step S147.
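Steps S145 to S148 above amount to a small decision table, sketched below. This is a hedged summary: the patent quotes "100" for step S148 and only requires the step S147 value to be larger, so the "200" used for S147 here is an illustrative assumption, and the function name is invented.

```python
def pedestrian_addition(passed_center: bool, within_2m_of_sidewalk: bool) -> int:
    """Addition value for a pedestrian on a crossing approached by a vehicle.

    Mirrors steps S145-S148: a pedestrian who has not passed the center of
    the road gets a larger addition (step S147) than one past the center
    but not yet within 2 m of the sidewalk (step S148, e.g. "100").
    """
    if not passed_center:
        return 200          # step S147: still on the near half (assumed value)
    if not within_2m_of_sidewalk:
        return 100          # step S148: past the center, but not yet safe
    return 0                # nearly across: no addition
```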
Returning to FIG. 29, when it is determined in step S134 that the relationship between the target moving body and the approaching moving body is vehicle-to-vehicle, the calculation unit 31a proceeds to step S135 and performs the process corresponding to vehicle-to-vehicle.
FIG. 31 is a flowchart illustrating an example of the process corresponding to vehicle-to-vehicle.
As shown in FIG. 31, the calculation unit 31a first determines whether the vehicle of the target moving body and the vehicle of the approaching moving body are traveling side by side on the same road in the same direction, or are traveling in the same lane (step S151).
This determination is made by the calculation unit 31a, which refers to the map M1 and the attribute information of the two vehicles and determines, from their positions and moving directions, whether the two vehicles are traveling side by side on the same road in the same direction or are traveling in the same lane.
If it determines that the two vehicles are neither traveling side by side on the same road in the same direction nor traveling in the same lane, the calculation unit 31a determines whether there is a median strip on the road on which the two vehicles are traveling (step S152).
This determination is also made by the calculation unit 31a, which refers to the map M1 and the attribute information of the two vehicles to determine whether there is a median strip on the road on which they are traveling.
If it determines that there is a median strip on the road on which the two vehicles are traveling, the calculation unit 31a ends the process without adding the addition value to the evaluation value of the approaching moving body for the target moving body, and proceeds to step S137 (FIG. 29).
When the two vehicles are approaching each other but are neither traveling side by side on the same road in the same direction nor traveling in the same lane, they are considered to be traveling in opposing lanes. In that situation, if there is a median strip, the possibility of a collision between them is considered extremely low. For this reason, if it determines in step S152 that there is a median strip on the road on which the two vehicles are traveling, the calculation unit 31a does not perform the addition process.
On the other hand, if it determines in step S152 that there is no median strip on the road on which the two vehicles are traveling, the calculation unit 31a proceeds to step S153, adds the addition value to the evaluation value of the approaching moving body for the target moving body, and ends the process.
In this case, the two vehicles are traveling in opposing lanes and, in principle, would not collide; however, since the two vehicles are approaching each other, the calculation unit 31a performs the addition process.
In step S153, the calculation unit 31a refers to the evaluation value database 34b and identifies the column in which the evaluation value between the moving body ID corresponding to the target moving body and the moving body ID corresponding to the approaching moving body is registered. The calculation unit 31a then adds the addition value to the evaluation value registered in the identified column.
The addition value added to the evaluation value in step S153 is, for example, "50".
If it determines in step S151 that the vehicle of the target moving body and the vehicle of the approaching moving body are traveling side by side on the same road in the same direction, or are traveling in the same lane, the calculation unit 31a determines whether the road curves within 100 m ahead of the vehicle of the target moving body (step S154).
This determination is made by the calculation unit 31a, which refers to the map M1 and the attribute information of the vehicle of the target moving body and determines whether the road curves within 100 m ahead of the target moving body. The curve determined in step S154 refers to a curve that can be negotiated at the legal speed without greatly reducing speed.
If it determines that the road does not curve within 100 m ahead, the calculation unit 31a ends the process without adding the addition value to the evaluation value of the approaching moving body for the target moving body, and proceeds to step S137 (FIG. 29).
When the two vehicles are traveling side by side or in the same lane, even if they are gradually approaching each other, there is little need to actively notify vehicles in such a state of the collision prediction result as long as both are traveling normally on a straight road. Therefore, if it determines in step S154 that the road does not curve within 100 m ahead, the calculation unit 31a does not perform the addition process.
If it determines in step S154 that the road curves within 100 m ahead, the calculation unit 31a proceeds to step S155, adds the addition value to the evaluation value of the approaching moving body for the target moving body, and ends the process.
When the two vehicles enter the curve, both must turn the steering wheel to follow it. If the two vehicles are traveling side by side or in the same lane and are approaching each other, turning the steering wheel increases the possibility that they will contact each other. For this reason, the calculation unit 31a performs the addition process.
In step S155, the calculation unit 31a refers to the evaluation value database 34b and identifies the column in which the evaluation value between the moving body ID corresponding to the target moving body and the moving body ID corresponding to the approaching moving body is registered. The calculation unit 31a then adds the addition value to the evaluation value registered in the identified column.
The addition value added to the evaluation value in step S155 is, for example, "200". In the case of step S155, both vehicles need to turn the steering wheel, so the possibility that the two vehicles will contact each other is considered higher than in the case of step S153. Therefore, the addition value in step S155 is set to a value larger than the addition value in step S153.
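The vehicle-to-vehicle branch of FIG. 31 (steps S151 to S155) can be condensed into one function. The "50" and "200" values are the examples quoted above; the boolean inputs and the function name are assumptions made for illustration.

```python
def vehicle_vehicle_addition(parallel_or_same_lane: bool,
                             has_median_strip: bool,
                             curve_within_100m: bool) -> int:
    """Addition value for the vehicle-to-vehicle case (FIG. 31).

    - Opposing lanes separated by a median strip: no addition (step S152).
    - Opposing lanes without a median strip: small addition (step S153, e.g. 50).
    - Side by side / same lane on a straight road: no addition (step S154).
    - Side by side / same lane with a curve within 100 m ahead: larger
      addition (step S155, e.g. 200), since both vehicles must steer.
    """
    if parallel_or_same_lane:
        return 200 if curve_within_100m else 0
    return 0 if has_median_strip else 50
```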
Returning to FIG. 29, after completing the process of step S135 or step S136, the calculation unit 31a proceeds to step S137.
The calculation unit 31a repeats the processing from step S131 to step S137 until all the approaching moving bodies identified in step S121 have been processed.
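The loop over approaching moving bodies (steps S131 to S137) can be sketched as follows; `addition_for_pair` is an assumed stand-in for the per-case processing of steps S134 to S136, and the data layout is illustrative.

```python
def relative_distance_addition(target_id, approaching, eval_db, addition_for_pair):
    # For each approaching moving body identified in step S121, compute the
    # pair-specific addition value (pedestrian case or vehicle case) and add
    # it to the evaluation value registered for that pair.
    for other_id, pair_info in approaching.items():
        row = eval_db.setdefault(target_id, {})
        row[other_id] = row.get(other_id, 0) + addition_for_pair(pair_info)

eval_db = {}
relative_distance_addition("1005", {"1001": "vehicle", "1003": "vehicle"},
                           eval_db, lambda info: 200)
```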
When it has finished processing all the approaching moving bodies, the calculation unit 31a completes the addition process for the evaluation values (step S122 in FIG. 28) and thereby ends the addition process based on relative distance.
In this way, in the addition process based on relative distance, the calculation unit 31a treats each approaching moving body as a moving body that may collide with or contact the target moving body and adds to the evaluation value for that approaching moving body.
Here, in the individual calculation process, "0" is set as the evaluation value for moving bodies other than collision prediction targets. In the addition process based on relative distance, however, a plurality of approaching moving bodies are identified, and the addition value is added to the evaluation value of each of them.
Therefore, even for a moving body whose evaluation value was set to "0" in the individual calculation process because it was not identified as a collision prediction target for a given target moving body, the calculation unit 31a may, in the addition process based on relative distance, identify that moving body as an approaching moving body for the same target moving body and add to its evaluation value.
After completing the addition process for the evaluation values, the calculation unit 31a returns to the calculation process of FIG. 27. The calculation unit 31a sequentially performs steps S55, S57, and S59 in FIG. 27 for each identified target moving body, repeating the processing until all the target moving bodies have been processed.
When it has obtained evaluation values for all the identified target moving bodies and completed the addition process for those evaluation values, the calculation unit 31a returns to step S51 and repeats the same processing.
Thus, by repeating the calculation process, the calculation unit 31a repeatedly calculates and registers the evaluation value of each identified target moving body, updating the registered contents of the evaluation value database 34b as needed.
FIG. 32 is a diagram illustrating an example of the evaluation value database 34b after the addition process based on relative distance.
In FIG. 32, among the evaluation values for the target moving body with moving body ID "1005", "200" is registered as the evaluation value in the column corresponding to moving body ID "1001", "200" is registered in the column corresponding to moving body ID "1003", and "0" is registered in all other columns.
In the evaluation value database 34b of FIG. 32, assume, for example, that the moving body with ID "1003" has been identified as a collision prediction target for the target moving body with ID "1005", giving it an evaluation value of "200", and that the moving body with ID "1001" has been identified as an approaching moving body for that target moving body, likewise giving it an evaluation value of "200". Further assume that the notification determination threshold for the evaluation value, which the determination unit 31b uses to determine whether to notify the collision prediction result, is set to "200". That is, when an evaluation value is "200" or more, the determination unit 31b determines that the collision prediction result for that evaluation target should be notified.
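Under the example of FIG. 32 and a notification determination threshold of "200", the determination unit's decision can be sketched as follows. The IDs "1002" and "1004" stand in for the remaining columns registered as "0", and the function name is an assumption.

```python
# State of the evaluation value database for target moving body "1005"
# after the relative-distance-based addition process (cf. FIG. 32).
eval_db = {"1005": {"1001": 200, "1002": 0, "1003": 200, "1004": 0}}

NOTIFY_THRESHOLD = 200  # notification determination threshold of unit 31b

def bodies_to_notify(eval_db, target_id):
    """IDs of the moving bodies whose collision prediction results should
    be notified to the target moving body (evaluation value >= threshold)."""
    return sorted(other_id
                  for other_id, value in eval_db[target_id].items()
                  if value >= NOTIFY_THRESHOLD)
```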
As described above, in the individual calculation process, "0" is set as the evaluation value for moving bodies other than collision prediction targets. In the addition process based on relative distance, however, a plurality of approaching moving bodies are identified, and the addition value is added to the target moving body's evaluation value for each of them.
For this reason, as shown in FIG. 32, evaluation values for a plurality of moving bodies (the moving body with ID "1001" and the moving body with ID "1003") are registered for the moving body with ID "1005".
In this case, the target moving body (the moving body with ID "1005") is notified of both the collision prediction result concerning the moving body with ID "1001" and the collision prediction result concerning the moving body with ID "1003".
In this way, in the present embodiment, not only the collision prediction results concerning collision prediction targets identified by the individual calculation process, but also the collision prediction results concerning approaching moving bodies identified by the addition process based on relative distance, are notified to the target moving body.
The content of a collision prediction result concerning an approaching moving body is the same as that of a collision prediction result concerning a collision prediction target, and includes information such as the attributes of the approaching moving body and the direction from which it is approaching.
The edge server 3 of the present embodiment identifies, among the other moving bodies, a moving body whose relative distance to the target moving body is within 100 m and whose relative-distance displacement is negative as an approaching moving body, and adds the addition value to the target moving body's evaluation value for that approaching moving body.
In other words, the edge server 3 adds, to the target moving body's evaluation value for the approaching moving body, an addition value based on the relative distance between the target moving body and the other moving body and on the displacement of that relative distance.
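The identification rule, relative distance within 100 m and a negative displacement of the relative distance, can be written directly. This is a sketch; sampling the relative distance at two successive times is an assumed way of obtaining the displacement.

```python
def is_approaching(prev_distance_m: float, curr_distance_m: float,
                   threshold_m: float = 100.0) -> bool:
    """True if the other moving body counts as an approaching moving body:
    relative distance within the threshold (100 m in the embodiment) and
    a negative displacement of the relative distance (i.e. closing in)."""
    displacement = curr_distance_m - prev_distance_m
    return curr_distance_m <= threshold_m and displacement < 0
```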
The collision prediction time calculated in the individual calculation process is obtained using the movement information (position information, azimuth information, and speed information) of the target moving body and the other moving body, as the predicted time until the two collide. The edge server 3 therefore determines whether a collision will occur from whether the traveling directions (azimuth information) of the target moving body and the other moving body intersect or overlap, and from the relationship between the speed of the target moving body and the speed of the other moving body.
Consequently, even when the target moving body and another moving body are approaching each other, the edge server 3 may determine that the possibility of a collision between them is low.
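A minimal constant-velocity illustration of such a prediction is given below. It is not the patent's algorithm: bodies are treated as points with velocity vectors derived from azimuth and speed, and `contact_radius` is an assumed parameter.

```python
import math

def predicted_collision_time(p1, v1, p2, v2, contact_radius=2.0):
    """Earliest time t >= 0 at which two constant-velocity point bodies
    come within contact_radius of each other, or None if they never do.
    Positions in m, velocities in m/s. Illustrative only."""
    rx, ry = p2[0] - p1[0], p2[1] - p1[1]   # relative position
    vx, vy = v2[0] - v1[0], v2[1] - v1[1]   # relative velocity
    a = vx * vx + vy * vy
    b = 2 * (rx * vx + ry * vy)
    c = rx * rx + ry * ry - contact_radius ** 2
    if c <= 0:
        return 0.0                  # already in contact
    if a == 0:
        return None                 # no relative motion
    disc = b * b - 4 * a * c
    if disc < 0:
        return None                 # paths never come close enough
    t = (-b - math.sqrt(disc)) / (2 * a)
    return t if t >= 0 else None
```

For two vehicles 100 m apart closing head-on at 10 m/s each, this yields a predicted collision time of 4.9 s; for two bodies moving in parallel at the same velocity it yields no collision, matching the behavior described above.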
In contrast, in the present embodiment, the edge server 3 adds, to the target moving body's evaluation value for the approaching moving body, an addition value based on the relative distance between the target moving body and the other moving body and on the displacement of that relative distance. As a result, even in a case that does not appear in the collision prediction time but in which a possibility of collision arises because the target moving body and another moving body are approaching each other, the situation can be reflected in the evaluation value so that the collision prediction result is provided to the mobile terminal of the target moving body.
In step S121 of FIG. 28 of the present embodiment, the case has been illustrated in which, among the moving bodies other than the evaluation target, a moving body whose relative distance to the target moving body is within 100 m and whose relative-distance displacement is negative is identified as an approaching moving body. However, the relative-distance threshold for determining an approaching moving body is not limited to 100 m.
In the present embodiment, with a relative-distance threshold of 100 m, even if the target moving body and the other moving body are both vehicles traveling toward each other at 50 km/h, the two vehicles can stop short of each other if the notification is issued immediately after the relative distance falls within 100 m and both vehicles then begin a stopping operation. For this reason, the relative-distance threshold is set to 100 m in the present embodiment.
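The 100 m choice can be sanity-checked with a rough stopping-distance estimate. The reaction time and deceleration below are illustrative assumptions, not values given in the embodiment.

```python
def stopping_distance_m(speed_kmh: float, reaction_s: float = 1.0,
                        decel_ms2: float = 4.0) -> float:
    """Distance travelled during an assumed reaction time plus the braking
    distance v**2 / (2a) at an assumed moderate deceleration."""
    v = speed_kmh / 3.6                     # km/h -> m/s
    return v * reaction_s + v ** 2 / (2 * decel_ms2)

# Two vehicles closing head-on at 50 km/h each: combined stopping distance,
# roughly 76 m under these assumptions, comfortably within 100 m.
combined = 2 * stopping_distance_m(50.0)
```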
On the other hand, when one of the target moving body and the other moving body is a vehicle and the other is a pedestrian, the relative-distance threshold can be made smaller.
Accordingly, the relative-distance threshold may be configured to change according to the attributes (vehicle or pedestrian) of the target moving body and the other moving body.
Furthermore, the relative-distance threshold may be configured to change according to the displacement of the relative distance.
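An attribute-dependent threshold could look like the sketch below. The 50 m pedestrian value is an assumed example of such a "smaller" threshold, since the embodiment quotes only the 100 m vehicle-to-vehicle figure; the function name is likewise invented.

```python
def relative_distance_threshold_m(attr_a: str, attr_b: str) -> float:
    """Relative-distance threshold chosen from the attributes of the two
    moving bodies: 100 m for two vehicles (as in the embodiment), and an
    assumed smaller value when one of the bodies is a pedestrian."""
    if attr_a == "vehicle" and attr_b == "vehicle":
        return 100.0
    return 50.0  # assumed example of a smaller pedestrian threshold
```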
In step S154 of FIG. 31 of the present embodiment, the case of determining whether the road curves within 100 m ahead of the vehicle of the target moving body has been illustrated. However, the distance threshold for determining whether there is a curve ahead is not limited to 100 m.
本実施形態において、前方のカーブの有無を判定するための距離の閾値を100mとすれば、車両が50km/hで走行しているとすると、当該車両がカーブに到達するまでに、数秒間かかる。この数秒間の間に事前に通知することができるため、本実施形態では、前方のカーブの有無を判定するための距離の閾値を100mとしている。
よって、カーブに到達するまでに事前に通知できれば、前記閾値をより長い距離に設定してもよい。
また、前方のカーブの有無を判定するための距離の閾値は、車両の速度に応じて変更するように構成してもよい。 In the present embodiment, assuming that the vehicle is traveling at 50 km / h, assuming that the vehicle is traveling at 50 km / h, it takes several seconds until the vehicle reaches the curve, assuming that the threshold for determining the presence or absence of the curve ahead is 100 m. . In the present embodiment, the threshold value of the distance for determining the presence or absence of a forward curve is set to 100 m because the notification can be made in advance during these several seconds.
Therefore, the threshold may be set to a longer distance if the notification can be made in advance before the vehicle reaches the curve.
Further, the threshold value of the distance for determining the presence or absence of a forward curve may be changed according to the speed of the vehicle.
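As a rough check of the rationale above, and as one way the threshold could be varied with the speed of the vehicle, the following sketch computes the time needed to reach the curve and a speed-dependent check distance; the 7-second lead time and the function names are illustrative assumptions, not values from the embodiment.

```python
def time_to_curve_s(distance_m: float, speed_kmh: float) -> float:
    """Seconds needed to cover distance_m at speed_kmh.
    At 50 km/h, 100 m takes 7.2 s, i.e. 'several seconds'."""
    return distance_m / (speed_kmh / 3.6)

def curve_check_distance_m(speed_kmh: float, lead_time_s: float = 7.0) -> float:
    """Speed-dependent distance threshold for the 'curve ahead' check,
    chosen so a notification is issued at least lead_time_s before the
    vehicle reaches the curve. lead_time_s = 7 s is an assumption."""
    return (speed_kmh / 3.6) * lead_time_s
```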
In the present embodiment, the case was described in which the calculation unit 31a performs, in addition to the individual calculation processing that obtains the evaluation value using the movement information (position information, azimuth information, and speed information) of the target moving object and the other moving objects, the addition processing based on the relative distance between the moving objects registered in the relative distance database 34c. However, the calculation unit 31a may be configured to obtain the evaluation value between the target moving object and another moving object only by the addition processing based on the relative distance between the moving objects, without performing the individual calculation processing.
In this case, the evaluation value for determining whether or not to notify the collision prediction result is obtained based on the relative distance between the target moving object and each of the other moving objects and on the change in that relative distance. Therefore, if the target moving object and another moving object are within a predetermined relative distance and the relative distance is decreasing so that they approach each other, it can be determined that a possibility of collision has arisen. As a result, regardless of the traveling directions of the target moving object and the other moving objects, other moving objects that may collide with the target moving object can be widely evaluated, and the necessity of notification can be appropriately determined.
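A minimal sketch of this distance-only evaluation could look as follows; the function name is an assumption, and the concrete numbers (a 100 m range and an addition value of "200", matching the notification determination threshold used in this embodiment) are illustrative.

```python
def evaluate_by_relative_distance(dist_now_m: float, dist_prev_m: float,
                                  range_m: float = 100.0,
                                  add_value: int = 200) -> int:
    """Evaluation value between the target moving object and another
    moving object computed only from the relative distance and its
    change: if the two are within range_m and the distance is
    decreasing (they are approaching), a possibility of collision
    is deemed to have arisen."""
    if dist_now_m <= range_m and dist_now_m < dist_prev_m:
        return add_value
    return 0
```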
[Specific examples of the addition processing based on the relative distance]
[Specific example 1]
FIG. 33 is a diagram for describing a specific example of the addition processing based on the relative distance, and illustrates a pedestrian 7 crossing a crosswalk 90 at an intersection and a vehicle 5 turning right at the intersection.
In FIG. 33, the pedestrian 7 and the vehicle 5 are both proceeding in accordance with the traffic lights.
It is assumed that the pedestrian 7 and the vehicle 5 are registered as moving objects in the dynamic information map M1 and the moving object database 34a.
The vehicle 5 is equipped with the on-vehicle device 50, and the pedestrian 7 carries the pedestrian terminal 70.
In the following description, the notification determination threshold is set to “200”, and the edge server 3 is configured to notify the target moving object of the collision prediction result if the evaluation value between the target moving object and another moving object is equal to or greater than the notification determination threshold (“200”).
(a), (b), (c), and (d) in FIG. 33 show, in chronological order, a case where the vehicle 5 enters the intersection and turns right when the pedestrian 7 starts to cross the crosswalk 90. The distance d between the pedestrian 7 and the vehicle 5 gradually decreases in the order of (a) to (d) in FIG. 33. It is further assumed that the relative distance between the pedestrian 7 and the vehicle 5 is within 100 m in each of (a) to (d), and that the speed of the vehicle 5 is higher than 5 km/h.
In (a), (b), (c), and (d) of FIG. 33, when the above conditions are satisfied, the pedestrian 7 is identified as an approaching moving object when the vehicle 5 is the target moving object, and the vehicle 5 is identified as an approaching moving object when the pedestrian 7 is the target moving object.
In the following description, the case where the vehicle 5 is the target moving object and the pedestrian 7 is the approaching moving object will mainly be described; however, the same processing is performed when the pedestrian 7 is the target moving object and the vehicle 5 is the approaching moving object.
In (a) of FIG. 33, since the traveling directions (arrows in the figure) of the pedestrian 7 and the vehicle 5 are parallel to each other, the collision prediction based on the position information, azimuth information, and speed information determines that they will not collide. Accordingly, the edge server 3 (its calculation unit 31a) does not identify the pedestrian 7 as a collision prediction target for the vehicle 5 in the individual calculation processing, nor does it identify the vehicle 5 as a collision prediction target for the pedestrian 7. Therefore, in the individual calculation processing, the edge server 3 sets the evaluation value of the vehicle 5 when the pedestrian 7 is the target moving object to “0”, and likewise sets the evaluation value of the pedestrian 7 when the vehicle 5 is the target moving object to “0”.
Also, in the addition processing based on the relative distance, since the pedestrian 7 is on the sidewalk 91, the edge server 3 does not add any value to the evaluation value of the pedestrian 7 for the vehicle 5 as the target moving object (step S141 in FIG. 30).
Therefore, at the stage of (a) in FIG. 33, the edge server 3 notifies neither the vehicle 5 of a collision prediction result regarding the pedestrian 7 nor the pedestrian 7 of a collision prediction result regarding the vehicle 5.
Next, in (b) of FIG. 33, the pedestrian 7 is walking on the crosswalk from the sidewalk 91 toward the sidewalk 92, and the vehicle 5 has entered the intersection and is about to start turning right.
In (b) of FIG. 33, it is assumed that the pedestrian 7 has not yet passed the center of the road (the midpoint of the crosswalk 90) and is walking at a point more than 2 m from the sidewalk 92 ahead in the traveling direction.
At this time, since the traveling directions (arrows in the figure) of the pedestrian 7 and the vehicle 5 intersect, there is a possibility that they are identified as collision prediction targets by the collision prediction based on the position information, azimuth information, and speed information. However, the intersection position T where the traveling directions cross is relatively far away, and depending on their speeds, it may be predicted that no collision will occur. For this reason, whether or not the edge server 3 mutually identifies the vehicle 5 and the pedestrian 7 as collision prediction targets in the individual calculation processing depends on the case.
Here, it is assumed that the edge server 3 did not mutually identify the vehicle 5 and the pedestrian 7 as collision prediction targets. Therefore, in the individual calculation processing, the edge server 3 sets the evaluation value of the vehicle 5 for the pedestrian 7 as the target moving object to “0”, and likewise sets the evaluation value of the pedestrian 7 for the vehicle 5 as the target moving object to “0”.
In the case of (b) in FIG. 33, if the pedestrian 7 subsequently continues crossing the crosswalk 90 and the vehicle 5 continues its right turn, as shown in (c) and (d) of FIG. 33, there is a high possibility that the pedestrian 7 and the vehicle 5 will collide on the crosswalk 90.
Here, the pedestrian 7 is on the crosswalk 90 and has not yet passed the center of the road. Therefore, in the addition processing based on the relative distance, the edge server 3 adds the addition value “200” to the evaluation value of the pedestrian 7 for the vehicle 5 as the target moving object (step S147 in FIG. 30). Accordingly, the evaluation value of the pedestrian 7 for the vehicle 5 as the target moving object becomes “200”.
The evaluation value of the pedestrian 7 for the vehicle 5 is thus equal to or greater than the notification determination threshold, so the edge server 3 notifies the vehicle 5 of the collision prediction result regarding the pedestrian 7 at the stage of (b) in FIG. 33. The edge server 3 performs the same processing for the pedestrian 7 as the target moving object, and therefore also notifies the pedestrian 7 of the collision prediction result regarding the vehicle 5 at the stage of (b) in FIG. 33.
In this way, even if the vehicle 5 and the pedestrian 7 were not mutually identified as collision prediction targets in the individual calculation processing, the edge server 3 increases, at the stage of (b) in FIG. 33, the evaluation value of the vehicle 5 for the pedestrian 7 as the target moving object and the evaluation value of the pedestrian 7 for the vehicle 5 as the target moving object to the notification determination threshold or more by the addition processing based on the relative distance. As a result, the edge server 3 notifies both the pedestrian 7 and the vehicle 5 of the collision prediction result.
Consequently, the vehicle 5 and the pedestrian 7 can be made aware of each other's presence before the situation shown in (c) and (d) of FIG. 33 arises, and a collision can be avoided.
[Specific example 2]
FIG. 34 is a diagram for describing another specific example of the addition processing based on the relative distance, and illustrates a pedestrian 7 crossing a crosswalk 90 at an intersection and a vehicle 5 turning right at the intersection.
(a), (b), (c), and (d) in FIG. 34 show, in chronological order, a case where the vehicle 5 enters the intersection and turns right while the pedestrian 7 is crossing the crosswalk 90.
In specific example 1, the pedestrian 7 was located on the sidewalk 91 immediately before the vehicle 5 entered the intersection ((a) in FIG. 33). In specific example 2, on the other hand, as shown in (a) of FIG. 34, immediately before the vehicle 5 enters the intersection, the pedestrian 7 is on the crosswalk 90 at a position past the center of the road. That is, specific example 2 shows a case where the pedestrian 7 is walking about half the road width ahead of the vehicle 5 compared with specific example 1.
In (a) of FIG. 34, the pedestrian 7 is on the crosswalk 90 and has passed the center of the road, but is assumed to be walking at a point more than 2 m from the sidewalk 92 ahead in the traveling direction.
The other conditions are the same as in specific example 1.
In (a) of FIG. 34, since the traveling directions (arrows in the figure) of the pedestrian 7 and the vehicle 5 are parallel to each other, the collision prediction based on the position information, azimuth information, and speed information determines that they will not collide, and the edge server 3 (its calculation unit 31a) does not mutually identify the vehicle 5 and the pedestrian 7 as collision prediction targets. Therefore, in the individual calculation processing, the edge server 3 sets the evaluation value of the vehicle 5 for the pedestrian 7 as the target moving object to “0”, and likewise sets the evaluation value of the pedestrian 7 for the vehicle 5 as the target moving object to “0”.
Further, the pedestrian 7 is on the crosswalk 90 and has passed the center of the road, but is walking at a point more than 2 m from the sidewalk 92. Therefore, in the addition processing based on the relative distance, the edge server 3 adds the addition value “100” to the evaluation value of the pedestrian 7 for the vehicle 5 as the target moving object (step S148 in FIG. 30).
Accordingly, the evaluation value of the pedestrian 7 for the vehicle 5 as the target moving object becomes “100”.
In the present embodiment, the edge server 3 notifies the collision prediction result when the evaluation value is “200” or more. Therefore, when the evaluation value of the pedestrian 7 for the vehicle 5 as the target moving object is “100”, the edge server 3 does not notify the vehicle 5 of the collision prediction result.
However, if for some reason the evaluation value obtained by the individual calculation processing becomes “100” or more, the total evaluation value reaches the notification determination threshold (“200”) and the edge server 3 notifies the collision prediction result.
In other words, by adding an addition value smaller than the notification determination threshold in the addition processing based on the relative distance, the collision prediction result can be notified even when the evaluation value from the individual calculation processing alone would not reach the notification determination threshold. This allows the situations between the vehicle 5 and the pedestrian 7 in which the edge server 3 determines to notify the collision prediction result to be further diversified.
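The interplay between the individual calculation result and the addition value can be sketched as follows; the helper name is an assumption, while the threshold “200” is the value used in this embodiment.

```python
NOTIFY_THRESHOLD = 200  # notification determination threshold of the embodiment

def should_notify(individual_value: int, addition_value: int) -> bool:
    """Notify the collision prediction result when the evaluation value
    (individual calculation result plus the relative-distance-based
    addition value) reaches the notification determination threshold."""
    return individual_value + addition_value >= NOTIFY_THRESHOLD
```

For instance, an addition value of 100 alone does not trigger a notification, but it tips the total over the threshold once the individual calculation contributes 100 or more.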
Next, in (b) of FIG. 34, the pedestrian 7 has crossed the crosswalk 90 to a point just short of the sidewalk 92, and the vehicle 5 has entered the intersection and is about to start turning right.
This case is the same as (b) in FIG. 33: whether or not the edge server 3 mutually identifies the vehicle 5 and the pedestrian 7 as collision prediction targets in the individual calculation processing depends on the case.
Here, it is assumed that the edge server 3 did not mutually identify the vehicle 5 and the pedestrian 7 as collision prediction targets. Therefore, in the individual calculation processing, the edge server 3 sets the evaluation value of the vehicle 5 for the pedestrian 7 as the target moving object to “0”, and likewise sets the evaluation value of the pedestrian 7 for the vehicle 5 as the target moving object to “0”.
Meanwhile, the pedestrian 7 has crossed the crosswalk 90 to a point just short of the sidewalk 92 and is assumed, for example, to be within 2 m of the sidewalk 92.
In this case, in the addition processing based on the relative distance, the edge server 3 does not add any value to the evaluation value of the pedestrian 7 for the vehicle 5 as the target moving object (step S146 in FIG. 30).
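Putting together the branches of FIG. 30 referred to in specific examples 1 and 2 (steps S141, S146, S147, and S148), the addition value for a crossing pedestrian might be sketched as follows. The function signature and the exact order of the branches are assumptions based on the text; the values 200, 100, and 0 and the 2 m margin are the ones stated in the examples.

```python
def pedestrian_addition_value(on_crosswalk: bool,
                              passed_road_center: bool,
                              dist_to_far_sidewalk_m: float) -> int:
    """Addition value for a pedestrian on or near a crosswalk, following
    the branching described for steps S141/S146/S147/S148 of FIG. 30."""
    if not on_crosswalk:
        return 0     # on the sidewalk: no addition (S141)
    if dist_to_far_sidewalk_m <= 2.0:
        return 0     # about to finish crossing: no addition (S146)
    if not passed_road_center:
        return 200   # on the crosswalk, before the road center (S147)
    return 100       # past the center, more than 2 m still to go (S148)
```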
In the case of (b) in FIG. 34, as shown in (c) and (d) of FIG. 34, the pedestrian 7 then quickly finishes crossing the crosswalk 90 and reaches the sidewalk 92. Therefore, the possibility that the pedestrian 7 and the vehicle 5 collide is low.
For the vehicle 5 and the pedestrian 7 in such a relationship, there is little need to notify a collision prediction result.
For this reason, the edge server 3 does not add any value to the evaluation value of the pedestrian 7 for the vehicle 5 as the target moving object in the addition processing based on the relative distance. Accordingly, the evaluation value of the pedestrian 7 for the vehicle 5 as the target moving object remains “0”, and the edge server 3 does not notify the vehicle 5 of the collision prediction result.
The edge server 3 performs the same processing when the pedestrian 7 is the target moving object, and therefore does not notify the pedestrian 7 of the collision prediction result either.
As described above, in the present embodiment, even if the edge server 3 identifies an approaching moving object that is approaching the target moving object, it does not notify the target moving object of the collision prediction result when the necessity of notification is low. This suppresses unnecessary notification of collision prediction results.
[Specific example 3]
FIG. 35 is a diagram for describing another specific example of the addition processing based on the relative distance, and illustrates a pedestrian 7 crossing a crosswalk 90 at an intersection and a vehicle 5 turning left at the intersection.
(a), (b), and (c) in FIG. 35 show, in chronological order, a case where the vehicle 5 enters the intersection and turns left when the pedestrian 7 starts crossing the crosswalk 90.
In specific example 3, the conditions regarding the vehicle 5, the pedestrian 7, and so on are the same as in specific example 1, except that the vehicle 5 turns left at the intersection and passes through the crosswalk 90 that the pedestrian 7 is crossing. Therefore, in (a), (b), and (c) of FIG. 35, the pedestrian 7 is identified as an approaching moving object when the vehicle 5 is the target moving object, and the vehicle 5 is identified as an approaching moving object when the pedestrian 7 is the target moving object.
In (a) of FIG. 35, the traveling directions (arrows in the figure) of the pedestrian 7 and the vehicle 5 are parallel to each other, and the pedestrian 7 is on the sidewalk 91.
Therefore, as in the case of (a) in FIG. 33, the edge server 3 notifies neither the vehicle 5 of a collision prediction result regarding the pedestrian 7 nor the pedestrian 7 of a collision prediction result regarding the vehicle 5.
In (b) of FIG. 35, the pedestrian 7 is walking on the crosswalk from the sidewalk 91 toward the sidewalk 92, and the vehicle 5 has entered the intersection and is about to start turning left.
Here, the pedestrian 7 is on the crosswalk 90 and has not yet passed the center of the road.
Therefore, even if the individual calculation processing did not identify the pedestrian 7 as a collision prediction target, the edge server 3 adds the addition value “200” to the evaluation value of the pedestrian 7 for the vehicle 5 by the addition processing based on the relative distance (step S147 in FIG. 30).
Accordingly, the evaluation value of the pedestrian 7 for the vehicle 5 is equal to or greater than the notification determination threshold, and, as in the case of (b) in FIG. 33, the edge server 3 notifies the vehicle 5 of the collision prediction result regarding the pedestrian 7 and notifies the pedestrian 7 of the collision prediction result regarding the vehicle 5.
In the case of (b) in FIG. 35, if the pedestrian 7 subsequently continues crossing the crosswalk 90 and the vehicle 5 continues its left turn, as shown in (c) of FIG. 35, there is a high possibility that the pedestrian 7 and the vehicle 5 will collide on the crosswalk 90.
In response, the edge server 3 notifies both the pedestrian 7 and the vehicle 5 of the collision prediction result.
Consequently, the vehicle 5 and the pedestrian 7 can be made aware of each other's presence before the situation shown in (c) of FIG. 35 arises, and a collision can be avoided.
[Specific example 4]
FIG. 36A is a diagram for describing another specific example of the addition processing based on the relative distance, and shows two vehicles 5A and 5B traveling side by side in the same direction (the direction of the arrows in the figure) on a road R10 with two lanes in each direction.
The vehicle 5A is traveling in a lane R10a, and the vehicle 5B is traveling in a lane R10b adjacent to the lane R10a.
In FIG. 36A, it is assumed that the road R10 curves within a range of 100 m ahead of the vehicle 5A.
It is assumed that the vehicle 5A and the vehicle 5B are registered as moving objects in the dynamic information map M1 and the moving object database 34a.
The vehicle 5A is equipped with the on-vehicle device 50, but the vehicle 5B is not. Therefore, the following description deals with the case where the vehicle 5A is the target moving object and the vehicle 5B is the approaching moving object.
The following description also focuses mainly on the addition processing based on the relative distance.
For example, consider a case in which the relative distance between the vehicle 5A and the vehicle 5B is within 100 m, both vehicles are running side by side at about 50 km/h, and the vehicle 5A is gradually approaching the vehicle 5B.
In this case, the edge server 3 identifies the vehicle 5B as the approaching moving body when the vehicle 5A is the target moving body.
Here, if the road R10 does not curve within 100 m ahead of the vehicle 5A, the edge server 3 performs no addition in the addition process based on the relative distance (step S154 in FIG. 31).
As described above, when both vehicles are running side by side or traveling in the same lane, there is little need to actively notify them of a collision prediction result as long as they are traveling normally on a straight road, even if they are gradually approaching each other.
Therefore, when the vehicle 5A and the vehicle 5B are running in parallel and the edge server 3 determines that the road does not curve within 100 m ahead, it performs no addition in the addition process based on the relative distance.
Note that the individual calculation process between the vehicle 5A and the vehicle 5B is still performed. The edge server 3 obtains the evaluation value of the vehicle 5A with respect to the vehicle 5B through the individual calculation process. When this evaluation value exceeds the notification determination threshold, the edge server 3 notifies the vehicle 5A of the collision prediction result regarding the vehicle 5B.
On the other hand, as shown in FIG. 36A, when the road R10 curves within 100 m ahead of the vehicle 5A, the edge server 3 adds the addition value of 200 to the evaluation value of the vehicle 5B for the target moving body, the vehicle 5A, in the addition process based on the relative distance (step S155 in FIG. 31).
When the vehicles 5A and 5B enter the curve, both need to steer along it. If, for example, the following vehicle 5A steers into the curve but the leading vehicle 5B does not, the likelihood that the two vehicles come into contact increases. For this reason, the edge server 3 performs the addition process.
When the edge server 3 adds the addition value of 200 to the evaluation value of the vehicle 5A with respect to the vehicle 5B, that evaluation value becomes equal to or greater than the notification determination threshold.
Therefore, even if the vehicle 5B was not identified as a collision prediction target of the vehicle 5A in the individual calculation process, the edge server 3 notifies the vehicle 5A of the collision prediction result through the addition process based on the relative distance.
As a result, the vehicle 5A can be notified of the possibility of contact with the vehicle 5B before entering the curve, alerting the driver of the vehicle 5A.
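The curve-dependent branch of this example can be sketched as follows. This is a minimal illustration under assumptions, not the edge server 3's disclosed implementation: the function name `curve_addition` is invented, while the 100 m look-ahead, the addition value of 200, and the threshold of 200 are the values used in Specific Example 4.

```python
# Sketch of steps S154/S155 (FIG. 31) for two vehicles running in parallel
# (Specific Example 4). All numeric values follow the example in the text.
NOTIFY_THRESHOLD = 200   # notification determination threshold
CURVE_ADDITION = 200     # value added when the road curves within 100 m ahead
LOOKAHEAD_M = 100        # look-ahead range checked for a curve

def curve_addition(road_curves_within_lookahead: bool) -> int:
    """Value added to the evaluation value of the approaching vehicle (5B)
    as seen from the target vehicle (5A)."""
    if road_curves_within_lookahead:
        return CURVE_ADDITION  # step S155: curve ahead -> add 200
    return 0                   # step S154: straight road -> no addition
```

On a straight road, notification depends on the individual calculation alone; on a curved road, the addition alone reaches the threshold, so the vehicle 5A is notified even if the individual calculation did not flag the vehicle 5B.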
[Specific Example 5]
FIG. 36B is a diagram for explaining another specific example of the addition process based on the relative distance, showing the vehicle 5A traveling in lane R11 of the road R10 in the direction of the arrow in the figure, and the vehicle 5B traveling, in the direction of the arrow in the figure, in lane R12, which faces lane R11.
It is assumed that the vehicle 5A and the vehicle 5B are registered as moving bodies in the dynamic information map M1 and the moving body database 34a.
It is also assumed that the vehicle 5A is equipped with the in-vehicle device 50 but the vehicle 5B is not. Accordingly, the following description treats the vehicle 5A as the target moving body and the vehicle 5B as the approaching moving body.
The following description also focuses mainly on the addition process based on the relative distance.
For example, consider a case in which the relative distance between the vehicle 5A and the vehicle 5B is within 100 m and both vehicles are traveling at about 50 km/h.
In FIG. 36B, there is a curve ahead of the vehicle 5A, and the vehicle 5B is traveling along that curve. The traveling directions of the vehicle 5A and the vehicle 5B therefore cross each other, and the two vehicles approach each other. Accordingly, the edge server 3 identifies the vehicle 5B as the approaching moving body when the vehicle 5A is the target moving body.
Here, as shown in FIG. 36B, when a median strip C exists between lane R11 and lane R12, the edge server 3 performs no addition in the addition process based on the relative distance (step S152 in FIG. 31).
As described above, when two vehicles are approaching each other but are neither running side by side on the same road in the same direction nor traveling in the same lane, they are traveling in opposing lanes. In that case, if there is a median strip, the possibility that the two vehicles collide is considered extremely low.
For this reason, as shown in FIG. 36B, when the vehicle 5A and the vehicle 5B are traveling in opposing lanes (lanes R11 and R12) and the median strip C is present, the edge server 3 does not add any value to the evaluation value in the addition process based on the relative distance.
Note that the individual calculation process between the vehicle 5A and the vehicle 5B is still performed. The edge server 3 obtains the evaluation value of the vehicle 5A with respect to the vehicle 5B through the individual calculation process, and determines whether to notify the collision prediction result according to that evaluation value.
On the other hand, when there is no median strip between lane R11 and lane R12, the edge server 3 adds the addition value of 50 to the evaluation value of the vehicle 5B for the target moving body, the vehicle 5A, in the addition process based on the relative distance (step S153 in FIG. 31).
In this case, the addition alone does not cause the edge server 3 to notify the vehicle 5A of the collision prediction result regarding the vehicle 5B.
The two vehicles are traveling in opposing lanes and, in principle, are not expected to collide.
However, when there is no median strip between lane R11 and lane R12, other factors may combine to create a possibility of collision between the two vehicles. The edge server 3 therefore adds the addition value of 50 to the evaluation value of the vehicle 5A with respect to the vehicle 5B. As a result, the edge server 3 notifies the vehicle 5A of the collision prediction result when the evaluation value obtained by the individual calculation process is 150 or more.
Thus, when a vehicle 5 traveling in the opposing lane is present and, for whatever reason, the evaluation value obtained by the individual calculation process reaches 150 or more, the evaluation value including the addition from the relative-distance-based addition process reaches 200, and the edge server 3 notifies the collision prediction result.
In this way, by adding an addition value smaller than the notification determination threshold in the addition process based on the relative distance, the collision prediction result can still be notified even when the evaluation value from the individual calculation process alone would not reach the notification determination threshold.
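The interaction between this smaller addition value and the threshold can be sketched as follows, using the values from Specific Example 5 (threshold 200, opposing-lane addition 50, median-strip addition 0). The function and argument names are illustrative assumptions, not the disclosed implementation.

```python
NOTIFY_THRESHOLD = 200  # notification determination threshold

def opposing_lane_addition(has_median_strip: bool) -> int:
    # Step S152: median strip present -> collision extremely unlikely, add 0.
    # Step S153: no median strip -> add 50, which is below the threshold itself.
    return 0 if has_median_strip else 50

def should_notify(individual_evaluation: int, has_median_strip: bool) -> bool:
    """Notify when the individual evaluation value plus the relative-distance
    addition reaches the notification determination threshold."""
    total = individual_evaluation + opposing_lane_addition(has_median_strip)
    return total >= NOTIFY_THRESHOLD
```

With no median strip, an individual evaluation value of 150 or more is enough to trigger notification; with a median strip, only the individual evaluation value itself can reach the threshold.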
[Others]
The embodiments disclosed herein should be considered illustrative in all respects and not restrictive.
For example, the attribute types in the attribute information of the vehicle 5 and the pedestrian 7 are examples, and the present invention is not limited to them.
Likewise, the addition values used in the addition process are examples, and the present invention is not limited to them.
In the above embodiment, moving bodies other than the evaluation target are identified, the one with the shortest collision prediction time among them is further identified as the collision prediction target, and the addition process is performed between the evaluation target and the collision prediction target to obtain the evaluation value. However, the addition process may, for example, also be performed for the other moving bodies beyond the collision prediction target to obtain their evaluation values, which may then likewise be subjected to the determination process by the determination unit 31b.
In the above embodiment, the determination unit 31b determines whether to notify the collision prediction result based on the notification determination threshold. However, it may instead be configured, for example, to compare the evaluation values of the evaluation targets with one another and determine whether to notify the collision prediction result based on the comparison result.
For example, it may be configured to compare the evaluation values of the evaluation targets with one another and determine that the collision prediction result is to be notified to a predetermined number of evaluation targets ranked highest by evaluation value.
In this case, since the number of evaluation targets notified of the collision prediction result never exceeds the predetermined number, an excessive load on the wireless communication system can be suppressed.
This makes it possible to appropriately provide collision prediction results for which notification is highly necessary to the mobile terminals while suppressing the load on the wireless communication system.
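The top-N variant can be sketched as follows. The value of N, the dictionary layout, and the function name are assumptions for illustration; the disclosure specifies only that a predetermined number of evaluation targets, in descending order of evaluation value, are notified.

```python
def select_notification_targets(evaluation_values: dict[str, float],
                                top_n: int) -> list[str]:
    """Return the IDs of the top_n evaluation targets with the highest
    evaluation values; only these are notified of the collision prediction
    result, which caps the load on the wireless communication system."""
    ranked = sorted(evaluation_values, key=evaluation_values.get, reverse=True)
    return ranked[:top_n]
```

For instance, with hypothetical evaluation targets {"5A": 180, "5C": 40, "7A": 220, "7B": 90} and N = 2, only "7A" and "5A" would be notified.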
In the above embodiment, the notification determination threshold is set in advance, but the determination unit 31b may be configured to change the notification determination threshold adaptively.
For example, the determination unit 31b may count the number of times the collision prediction result was notified to each evaluation target during a predetermined past period and change the notification determination threshold according to that notification count.
In this case, when the notification count is high enough to place an excessive load on the wireless communication system, the determination unit 31b adjusts the notification determination threshold to a larger value so that the notification count decreases; when the notification count indicates a load for which the wireless communication system has ample spare capacity, the determination unit 31b adjusts the notification determination threshold to a smaller value so that the notification count increases.
This makes it possible to appropriately provide collision prediction results for which notification is highly necessary to the mobile terminals while suppressing the load on the wireless communication system.
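This adaptive adjustment can be sketched as a simple feedback rule. The step size and the load bounds here are invented for illustration; the text specifies only the direction of adjustment (raise the threshold to reduce notifications when the wireless system is overloaded, lower it to allow more when there is ample spare capacity).

```python
def adjust_threshold(threshold: float, notify_count: int,
                     high_load: int, low_load: int,
                     step: float = 10.0) -> float:
    """Adapt the notification determination threshold to the notification
    count observed over the past predetermined period."""
    if notify_count > high_load:
        return threshold + step  # too many notifications -> notify less
    if notify_count < low_load:
        return threshold - step  # ample spare capacity -> notify more
    return threshold             # load acceptable -> leave unchanged
```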
The scope of the present disclosure is defined not by the above description but by the claims, and is intended to include all modifications within the meaning and scope equivalent to the claims.
1A-1D Communication terminal
2 Base station
3 Edge server
4 Core server
5, 5A, 5B, 5C, 5D Vehicle
7, 7A, 7B Pedestrian
8 Roadside sensor
9 Traffic signal controller
31 Control unit
31a Calculation unit
31b Determination unit
31c Notification unit
31d Detection unit
32 ROM
33 RAM
34 Storage unit
34a Moving body database
34b Evaluation value database
34c Relative distance database
35 Communication unit
41 Control unit
42 ROM
43 RAM
44 Storage unit
45 Communication unit
50 In-vehicle device
51 Control unit
52 GPS receiver
53 Vehicle speed sensor
54 Gyro sensor
55 Storage unit
56 Display
57 Speaker
58 Input device
59 In-vehicle camera
60 Radar sensor
61 Communication unit
70 Pedestrian terminal
71 Control unit
72 Storage unit
73 Display unit
74 Operation unit
75 Communication unit
81 Control unit
82 Storage unit
83 Roadside camera
84 Radar sensor
85 Communication unit
90 Crosswalk
91 Sidewalk
92 Sidewalk
C Median strip
d Distance
D1 Display
D2 Arrow
D3 Display
D4 Arrow
D5 Display
D6 Arrow
G1 Obstacle
G2 Building
M1 Dynamic information map
M2 Dynamic information map
N1-N4 Node
R1, R2 Route
R10 Road
R10a Lane
R10b Lane
R11 Lane
R12 Lane
T Intersection position
S1-S4 Network slice
V1, V2, V3 Output screen
Claims (13)
- An information providing system comprising: an acquisition unit that acquires, based on dynamic map information in which dynamic information of a plurality of moving bodies is superimposed on map information, movement information indicating movement states of the plurality of moving bodies and appearance information of the plurality of moving bodies; a calculation unit that performs, based on the movement information, a collision prediction of whether a target moving body having a mobile terminal among the plurality of moving bodies and another moving body other than the target moving body will collide, and obtains an evaluation value indicating the possibility that the target moving body and the other moving body collide; and a determination unit that determines whether to notify the mobile terminal of a collision prediction result indicating the result of the collision prediction by the calculation unit, wherein the calculation unit obtains the evaluation value based on: a collision prediction time, included in the collision prediction result, until the target moving body and the other moving body collide; an appearance attribute value according to the appearance information of each of the target moving body and the other moving body; and a behavior attribute value according to behavior information, obtained from the movement information, indicating the behavior of each of the target moving body and the other moving body, and the determination unit determines, based on the evaluation value, whether to notify the mobile terminal of the collision prediction result.
- The information providing system according to claim 1, further comprising a notification unit that notifies the collision prediction result to the mobile terminal that the determination unit has determined to notify.
- The information providing system according to claim 1 or claim 2, wherein the calculation unit identifies, based on the collision prediction result, a moving body predicted to collide with the target moving body among the other moving bodies, and obtains the evaluation value using the collision prediction time, the appearance attribute value, and the behavior attribute value of the moving body predicted to collide.
- The information providing system according to any one of claims 1 to 3, further comprising a control unit that controls the mobile terminal and causes the mobile terminal to execute a process of outputting the collision prediction result to a user of the mobile terminal, wherein the control unit controls the mobile terminal such that the output mode of the collision prediction result differs according to the appearance information and the behavior information of the moving body predicted to collide.
- The information providing system according to any one of claims 1 to 4, wherein, when the other moving body is a vehicle, the appearance attribute value includes at least one of an attribute value according to the size of the vehicle, an attribute value according to the shape of the vehicle, an attribute value according to the color of the vehicle, and an attribute value according to information on a license plate attached to the vehicle.
- The information providing system according to any one of claims 1 to 4, wherein, when the other moving body is a pedestrian, the appearance attribute value includes at least one of an attribute value according to the height of the pedestrian, an attribute value according to the clothing of the pedestrian, and an attribute value according to the equipment of the pedestrian.
- The information providing system according to any one of claims 1 to 4, wherein, when the other moving body is a vehicle, the behavior attribute value includes at least one of an attribute value according to the current traveling state and an attribute value according to the past traveling history.
- The information providing system according to any one of claims 1 to 4, wherein, when the other moving body is a pedestrian, the behavior attribute value includes at least one of an attribute value according to the walking location of the pedestrian, an attribute value according to the walking trajectory of the pedestrian, and an attribute value according to the pedestrian's behavior with respect to the light color of a traffic signal.
- The information providing system according to any one of claims 1 to 8, wherein the calculation unit obtains, based on position information of the plurality of moving bodies included in the movement information, the relative distance between the target moving body and each of the other moving bodies and the displacement of the relative distance, and adds an addition value based on the relative distance and the displacement of the relative distance to the evaluation value.
- An information providing system comprising: an acquisition unit that acquires, based on dynamic map information in which dynamic information of a plurality of moving bodies is superimposed on map information, movement information indicating movement states of the plurality of moving bodies; a calculation unit that performs, based on the movement information, a collision prediction of whether a target moving body having a mobile terminal among the plurality of moving bodies and another moving body other than the target moving body will collide, and obtains an evaluation value indicating the possibility that the target moving body and the other moving body collide; and a determination unit that determines whether to notify the mobile terminal of a collision prediction result indicating the result of the collision prediction by the calculation unit, wherein the calculation unit obtains, based on position information of the plurality of moving bodies included in the movement information, the relative distance between the target moving body and each of the other moving bodies and the displacement of the relative distance, and obtains the evaluation value based on the relative distance and the displacement of the relative distance, and the determination unit determines, based on the evaluation value, whether to notify the mobile terminal of the collision prediction result.
- A mobile terminal that receives the collision prediction result from the information providing system according to any one of claims 1 to 10 and outputs the collision prediction result to a user.
- An information providing method for providing information to a mobile terminal, the method comprising: an acquisition step of acquiring, based on dynamic map information in which dynamic information of a plurality of moving bodies is superimposed on map information, movement information indicating movement states of the plurality of moving bodies and appearance information of the plurality of moving bodies; a calculation step of performing, based on the movement information, a collision prediction of whether a target moving body having the mobile terminal among the plurality of moving bodies and another moving body other than the target moving body will collide, and obtaining an evaluation value indicating the possibility that the target moving body and the other moving body collide; and a determination step of determining whether to notify the mobile terminal of a collision prediction result indicating the result of the collision prediction by the calculation step, wherein, in the calculation step, the evaluation value is obtained based on: a collision prediction time, included in the collision prediction result, until the target moving body and the other moving body collide; an appearance attribute value according to the appearance information of each of the target moving body and the other moving body; and a behavior attribute value according to behavior information, obtained from the movement information, indicating the behavior of each of the target moving body and the other moving body, and, in the determination step, whether to notify the mobile terminal of the collision prediction result is determined based on the evaluation value.
In the determining step, an information providing method for determining whether to notify the collision prediction result to the mobile terminal based on the evaluation value. - 移動端末へ情報提供を行う情報提供処理をコンピュータに実行させるためのコンピュータプログラムであって、
前記コンピュータに
複数の移動体の動的情報を地図情報に重畳した動的マップ情報に基づいて、前記複数の移動体の移動状態を示す移動情報と、前記複数の移動体の外観情報と、を取得する取得ステップと、
前記複数の移動体のうち前記移動端末を有する対象移動体と、当該対象移動体以外の他の移動体とが衝突するか否かの衝突予測を前記移動情報に基づいて行うとともに、前記対象移動体と前記他の移動体とが衝突する可能性を示す評価値を求める演算ステップと、
前記演算ステップによる衝突予測の結果を示す衝突予測結果を前記移動端末へ通知するか否かを判定する判定ステップと、を含み、
前記演算ステップでは、
前記衝突予測結果に含まれる、前記対象移動体と前記他の移動体とが衝突するまでの衝突予測時間と、
前記対象移動体、及び前記他の移動体それぞれの前記外観情報に応じた外観属性値と、
前記移動情報から得られる、前記対象移動体、及び前記他の移動体それぞれの行動内容を示す行動情報に応じた行動属性値と、
に基づいて前記評価値を求め、
前記判定ステップでは、前記衝突予測結果を前記移動端末へ通知するか否かを、前記評価値に基づいて判定する
コンピュータプログラム。 A computer program for causing a computer to execute an information providing process of providing information to a mobile terminal,
Based on the dynamic map information obtained by superimposing the dynamic information of the plurality of moving objects on the map information, the computer may include moving information indicating a moving state of the plurality of moving objects and appearance information of the plurality of moving objects. An acquiring step for acquiring;
The target mobile unit having the mobile terminal among the plurality of mobile units and a collision prediction of whether or not a mobile unit other than the target mobile unit collides is performed based on the movement information, and the target mobile unit A calculating step of obtaining an evaluation value indicating a possibility that the body and the other moving body collide with each other;
A determination step of determining whether to notify the mobile terminal of a collision prediction result indicating a result of the collision prediction by the calculation step,
In the calculation step,
Included in the collision prediction result, a collision prediction time until the target mobile body and the other mobile body collide,
An appearance attribute value according to the appearance information of each of the target moving object and the other moving object;
Obtained from the movement information, the target mobile object, and an action attribute value according to action information indicating the action content of each of the other mobile objects,
The evaluation value is obtained based on
In the determining step, a computer program for determining whether to notify the collision prediction result to the mobile terminal based on the evaluation value.
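The claims above describe two ways the evaluation value can be built: from the relative distance between the target moving object and another moving object together with the displacement of that distance, and from the predicted collision time combined with appearance and action attribute values. The publication does not disclose concrete formulas or weights, so the following Python sketch is purely illustrative: the attribute tables, the weighting scheme, and the notification threshold are assumptions for the sake of example, not the patented method.

```python
import math

# Hypothetical attribute tables. The patent only says attribute values
# depend on appearance and action; these numbers are invented for illustration.
APPEARANCE_WEIGHT = {"child": 3.0, "elderly": 2.0, "adult": 1.0, "vehicle": 1.0}
ACTION_WEIGHT = {"running": 2.0, "crossing": 2.5, "walking": 1.0, "stopped": 0.5}


def relative_distance(p1, p2):
    """Euclidean distance between two (x, y) positions from the movement information."""
    return math.hypot(p1[0] - p2[0], p1[1] - p2[1])


def evaluation_value(prev_positions, curr_positions, ttc_seconds,
                     target_appearance, other_appearance,
                     target_action, other_action):
    """Combine (a) the relative distance and its displacement and
    (b) the predicted time-to-collision plus appearance/action attribute
    values into a single risk score. Higher means more urgent."""
    d_prev = relative_distance(*prev_positions)
    d_curr = relative_distance(*curr_positions)
    displacement = d_prev - d_curr          # positive when the two are closing in

    score = 0.0
    if displacement > 0:                    # approaching: add a closing-rate term
        score += displacement / max(d_curr, 1.0)
    score += 1.0 / max(ttc_seconds, 0.1)    # shorter predicted collision time -> larger score
    score *= APPEARANCE_WEIGHT[target_appearance] * APPEARANCE_WEIGHT[other_appearance]
    score *= ACTION_WEIGHT[target_action] * ACTION_WEIGHT[other_action]
    return score


def should_notify(score, threshold=1.0):
    """Determination step: push the collision prediction result to the
    mobile terminal only when the evaluation value clears a threshold."""
    return score >= threshold
```

On these assumed numbers, a child running toward a crossing vehicle two seconds away scores far above the threshold, while a slowly receding adult pedestrian does not, so only the former prediction result would be forwarded to the mobile terminal.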
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2020550244A JPWO2020071072A1 (en) | 2018-10-02 | 2019-09-11 | Information provision system, mobile terminal, information provision method, and computer program |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2018187530 | 2018-10-02 | ||
JP2018-187530 | 2018-10-02 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2020071072A1 true WO2020071072A1 (en) | 2020-04-09 |
Family
ID=70054518
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2019/035744 WO2020071072A1 (en) | 2018-10-02 | 2019-09-11 | Information provision system, mobile terminal, information provision method, and computer program |
Country Status (2)
Country | Link |
---|---|
JP (1) | JPWO2020071072A1 (en) |
WO (1) | WO2020071072A1 (en) |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2011198295A (en) * | 2010-03-23 | 2011-10-06 | Denso Corp | Vehicle approach warning system |
WO2015182221A1 (en) * | 2014-05-27 | 2015-12-03 | 本田技研工業株式会社 | Collision possibility determination device |
JP2016035738A (en) * | 2014-08-04 | 2016-03-17 | 富士重工業株式会社 | Running environment risk determination device and running environment risk notification device |
- 2019-09-11: WO, PCT/JP2019/035744, published as WO2020071072A1, active (Application Filing)
- 2019-09-11: JP, JP2020550244A, published as JPWO2020071072A1, active (Pending)
Cited By (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2021174361A (en) * | 2020-04-28 | 2021-11-01 | 矢崎エナジーシステム株式会社 | Approach notification system and approach notification program |
JP7421409B2 (en) | 2020-04-28 | 2024-01-24 | 矢崎エナジーシステム株式会社 | Approach notification system and approach notification program |
CN114078317A (en) * | 2020-08-11 | 2022-02-22 | 丰田自动车株式会社 | Warning notification system, warning notification method, and warning notification program |
JP2022032144A (en) * | 2020-08-11 | 2022-02-25 | トヨタ自動車株式会社 | Warning notification system, warning notification method, and warning notification program |
JP7327320B2 (en) | 2020-08-11 | 2023-08-16 | トヨタ自動車株式会社 | WARNING NOTIFICATION SYSTEM, WARNING NOTIFICATION METHOD AND WARNING NOTIFICATION PROGRAM |
CN114078317B (en) * | 2020-08-11 | 2024-01-09 | 丰田自动车株式会社 | Warning notification system, warning notification method, and warning notification program |
JP2022172567A (en) * | 2021-05-06 | 2022-11-17 | ソフトバンク株式会社 | Server, system, method and program |
JP7418373B2 (en) | 2021-05-06 | 2024-01-19 | ソフトバンク株式会社 | Servers, systems, methods and programs |
Also Published As
Publication number | Publication date |
---|---|
JPWO2020071072A1 (en) | 2021-09-02 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP7031612B2 (en) | Information provision systems, servers, mobile terminals, and computer programs | |
US11967230B2 (en) | System and method for using V2X and sensor data | |
US20230311810A1 (en) | Methods for passenger authentication and door operation for autonomous vehicles | |
USRE48958E1 (en) | Vehicle to pedestrian communication system and method | |
CN107921968B (en) | System and method for driving assistance along a path | |
US20210014643A1 (en) | Communication control device, communication control method, and computer program | |
US20180033300A1 (en) | Navigation system with dynamic mapping mechanism and method of operation thereof | |
US11945472B2 (en) | Trajectory planning of vehicles using route information | |
US10220776B1 (en) | Scenario based audible warnings for autonomous vehicles | |
JP2020027645A (en) | Server, wireless communication method, computer program, and on-vehicle device | |
JP7548225B2 (en) | Automatic driving control device, automatic driving control system, and automatic driving control method | |
WO2020071072A1 (en) | Information provision system, mobile terminal, information provision method, and computer program | |
JP2020091614A (en) | Information providing system, server, mobile terminal, and computer program | |
WO2019198449A1 (en) | Information provision system, mobile terminal, information provision device, information provision method, and computer program | |
WO2023250290A1 (en) | Post drop-off passenger assistance | |
US20220345861A1 (en) | Passenger support system | |
JP2020091652A (en) | Information providing system, server, and computer program | |
KR20230108672A (en) | Graph exploration for rulebook trajectory generation | |
CN110956850A (en) | Moving body monitoring device, vehicle control system using same, and traffic system | |
WO2021117370A1 (en) | Dynamic information update device, update method, information providing system, and computer program | |
JP2020091612A (en) | Information providing system, server, and computer program | |
JP2019160059A (en) | Information provision device, information provision method, and computer program | |
WO2019239757A1 (en) | Communication control apparatus, communication control method, computer program, and in-vehicle communication device | |
JP2020091613A (en) | Information providing system, server, and computer program | |
GB2613037A (en) | Predicting motion of hypothetical agents |
Legal Events
Date | Code | Title | Description
---|---|---|---
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 19869834; Country of ref document: EP; Kind code of ref document: A1
| ENP | Entry into the national phase | Ref document number: 2020550244; Country of ref document: JP; Kind code of ref document: A
| NENP | Non-entry into the national phase | Ref country code: DE
| 122 | Ep: pct application non-entry in european phase | Ref document number: 19869834; Country of ref document: EP; Kind code of ref document: A1