WO2019221782A2 - Semi-autonomous motorized weapon systems - Google Patents
- Publication number: WO2019221782A2 (international application PCT/US2018/059261)
- Authority: WIPO (PCT)
Classifications
- F41G3/165—Sighting devices adapted for indirect laying of fire using a TV-monitor
- F41A17/08—Safety arrangements for inhibiting firing in a specified direction, e.g. at a friendly person or at a protected area
- F41G3/04—Aiming or laying means for dispersing fire from a battery; for controlling spread of shots; for coordinating fire from spaced weapons
- F41G5/06—Elevating or traversing control systems for guns using electric means for remote control
- F41G5/14—Elevating or traversing control systems for guns for vehicle-borne guns
- F41G5/16—Elevating or traversing control systems for vehicle-borne guns, gyroscopically influenced
Definitions
- This disclosure generally relates to autonomous and semi-autonomous motorized weapon systems. More specifically, the present disclosure relates to hardware- and software-based techniques for efficient operation of motorized weapon systems, via improvements in target identification and selection, autonomous actuation of motor and targeting systems, dynamic tracking, and trajectory measurement and assessment.
- A "kill chain" refers to the sequence of actions performed between the first detection of potential targets and the elimination of those targets.
- The sequence of actions within a kill chain generally may include the following: (1) Find - identifying and locating a target; (2) Fix or Track - determining the accurate location of the target; (3) Target - time-critical targeting, including predicting where the target may pop up; (4) Engage - firing on the target; and (5) Assess - determining whether or not the target has been hit and/or eliminated.
- Conventional weapon systems may include various components for achieving the above steps of a kill chain, including cameras and sensors to identify targets, display screens and controls (e.g., joysticks) to allow an operator to identify targets and aim the weapon, and a variety of weapons that may be fired at the target.
- Such systems may include "fully autonomous" weapon systems, which are capable of targeting and firing without any intervention by a human operator; "semi-autonomous" weapon systems, which may use automated software target tracking tools but still rely on a human operator for target selection and firing commands; "supervised autonomous" weapon systems, which may be granted permission to react to threats autonomously; and/or manual weapon systems that are operated entirely by the human operator.
- Techniques described herein relate to hardware- and software-based solutions for operating motorized weapons systems, including target identification and selection techniques, autonomous actuation of motor and targeting systems, dynamic tracking, and trajectory measurement and assessment techniques.
- Certain embodiments described herein correspond to semi-autonomous motorized weapon systems, which may include various combinations of hardware such as weapons capable of firing munitions, two-axis and/or three-axis mounts configured to support and position the weapons, motors coupled to the mounts and configured to move the mounts to specified positions to control the direction in which the weapon is aimed, and/or operator interface components such as operator controls and a target display device.
- Such a semi-autonomous motorized weapon system may be implemented with various hardware-based and software-based components configured to determine target points associated with targets at remote locations, and to determine one or more areas having boundaries surrounding the target points, such boundary areas being determined based on the likelihood of the weapon hitting the target when aimed at the boundary, in comparison to predetermined likelihood thresholds.
- Such embodiments may be further configured to engage the motor of the motorized weapon system with instructions to move the mount from an initial position to a target position at which the weapon is aimed at the target point, and, during engagement of the motor, to periodically determine whether the weapon is aimed at a position within the boundary area surrounding the target point.
- When it is determined, during the movement of the mount toward the target position, that the weapon is not aimed at a position within the boundary area surrounding the target point, the semi-autonomous motorized weapon system may disable a manual firing mechanism of the weapon system to prevent firing of the weapon by an operator; conversely, when it is determined that the weapon is aimed at a position within the area surrounding the target point, the system may enable (or re-enable) the manual firing mechanism to allow firing of the weapon.
- The semi-autonomous motorized weapon system may be configured to receive and execute firing commands from operators via the manual firing mechanism, thereby firing the weapon at times when the manual firing mechanism is enabled.
- Additional techniques described herein include weapon-agnostic motorized weapon systems, including weapon-agnostic targeting/firing systems that may support various different types or models of weapons, as well as implementation of operation-specific rules of engagement that may be received and enforced by the weapon-agnostic targeting and firing systems.
- Further techniques described herein include minimum confidence thresholds for target selection and/or prioritization via semi-autonomous weapons systems, which may be separate determinations from target identification confidence and/or target verification confidence.
- Still further techniques described herein may include sensor-based real-time projectile firing assessment and automatic correction of targeting algorithms based on accuracy evaluations.
- The various techniques described herein further include combinations of autonomous target selection, prioritization, and re-selection by targeting/firing systems within semi-autonomous motorized weapon systems; dynamic target tracking of both primary and secondary targets, including target movement predictions and weapon/projectile characteristics; autonomous motor actuation to automatically orient the weapon toward the primary target before receiving any operator input; simplified user interfaces and operator controls; and enabling/disabling of the firing mechanism depending on the projected point of impact of the weapon, thereby providing increased system efficiency, increased rate of firing, improved weapon system accuracy, and reduced operator error, along with the other technical advantages described herein.
- FIG. 1 is a depiction of a motorized weapon system, in accordance with one or more embodiments of the present invention.
- FIG. 2 is a block diagram illustrating example component architecture diagram of a motorized weapon system, in accordance with one or more embodiments of the present invention.
- FIGS. 3A-3C are illustrative drawings depicting the mounting and application of a motorized weapon system in accordance with one or more embodiments of the present invention, within different engagement environments.
- FIG. 4 is a flowchart illustrating an example process of using a motorized weapon system to engage one or more targets, in accordance with certain embodiments of the present invention.
- FIG. 5 is an example screen of a user interface displayed to an operator of a motorized weapon system during engagement of one or more targets, in accordance with certain embodiments of the present invention.
- FIG. 6 is another example screen of a user interface displayed to an operator of a motorized weapon system during engagement of one or more targets, in accordance with certain embodiments of the present invention.
- FIG. 7 is a flowchart illustrating an example process of disabling or enabling a firing mechanism of a motorized weapon system during engagement of the motor to move the weapon, in accordance with certain embodiments of the present invention.
- FIGS. 8A and 8B are example screens of a user interface displayed to an operator of a motorized weapon system during engagement of the motor to move the weapon toward a target point, in accordance with certain embodiments of the present invention.
- FIG. 9 is a schematic illustration of a computer system configured to perform techniques in accordance with certain embodiments of the present invention.
- Circuits, systems, networks, processes, and other components may be shown as components in block diagram form in order not to obscure the embodiments in unnecessary detail.
- Well-known circuits, processes, algorithms, structures, and techniques may be shown without unnecessary detail in order to avoid obscuring the embodiments.
- Individual embodiments may be described as a process which is depicted as a flowchart, a flow diagram, a data flow diagram, a structure diagram, or a block diagram. Although a flowchart may describe the operations as a sequential process, many of the operations can be performed in parallel or concurrently. In addition, the order of the operations may be re-arranged. A process is terminated when its operations are completed, but could have additional steps not included in a figure. A process may correspond to a method, a function, a procedure, a subroutine, a subprogram, etc. When a process corresponds to a function, its termination can correspond to a return of the function to the calling function or the main function.
- The term "computer-readable medium" includes, but is not limited to, non-transitory media such as portable or fixed storage devices, optical storage devices, and various other media capable of storing, containing, or carrying instruction(s) and/or data.
- A code segment or computer-executable instructions may represent a procedure, a function, a subprogram, a program, a routine, a subroutine, a module, a software package, a class, or any combination of instructions, data structures, or program statements.
- A code segment may be coupled to another code segment or a hardware circuit by passing and/or receiving information, data, arguments, parameters, or memory contents. Information, arguments, parameters, data, etc. may be passed, forwarded, or transmitted via any suitable means, including memory sharing, message passing, token passing, network transmission, etc.
- Embodiments may be implemented by hardware, software, firmware, middleware, microcode, hardware description languages, or any combination thereof.
- The program code or code segments to perform the necessary tasks may be stored in a computer-readable medium.
- A processor or processors may perform the necessary tasks.
- Weapon system 100 may include a weapon 110 with an ammunition feed 115, a gimbal mount 120, and a camera/sensor unit 125. Additionally, in this example, the weapon system 100 includes a base/housing 130, which contains and obscures additional components of the system 100, including the motor, servos, targeting system, processing and memory components, communications system, firing controls, and various other components described herein.
- Weapon system 100 may be a remotely operated weapon station (ROWS), including stabilization and auto-targeting technology.
- The targeting system of weapon system 100 may be configured to provide rapid target selection and acquisition, as well as increased hit probabilities.
- Weapon system 100 may be compatible with many different types of weapon 110 and different corresponding types of ammunition, and as discussed below, the operation of the targeting system and other components of the weapon system 100 may depend on knowledge of which type of weapon 110 and ammunition is currently in use.
- Weapon system 100 may be fully integrated, with auto-targeting capabilities and/or remote operation.
- Weapon system 100 also may be capable of being mounted to various different types of platforms, including tripods, buildings, ground vehicles (e.g., trucks, tanks, cars, jeeps), all-terrain vehicles (ATVs), utility task vehicles (UTVs), boats, fixed-wing aircraft, helicopters, and drones.
- Various embodiments of weapon systems 100 may include capabilities for automatic target detection, selection, and re-selection, active stabilization, automatic ballistic solutions, target tagging, and/or continuous target tracking.
- Weapon 110 may be any type of gun, armament, or ordnance, including without limitation off-the-shelf firearms, large-caliber rifles, machine guns, autocannons, grenade launchers, rockets, and/or directed-energy weapons such as lasers, high-power microwave emitters, and other such devices.
- The weapon 110 may be attached to the weapon system 100 using a 2-axis or 3-axis mechanical gimbal mount 120, capable of controlling azimuth and yaw, elevation and pitch, and possibly cant and roll.
- A closed-loop servomotor within the weapon system 100 may be configured to drive the gimbal to an identified target.
- A firing mechanism within the weapon system may be configured to fire the weapon 110, either electronically or by manually pulling the trigger, in response to a firing command from a human operator and/or additional firing instructions received from a targeting/firing component of the weapon system 100.
- Camera/sensor unit 125 may include an array of various different sensors configured to collect data at the weapon system 100, and transmit the sensor/image data back to the internal software systems of the weapon system 100 (e.g., targeting system/component, firing control, ballistics engine) and/or to a display device for outputting to an operator.
- Cameras/sensors within the sensor unit 125 may include, for example, cameras sensitive in various spectrums such as visible and infrared (IR), for day and night visibility, as well as rangefinders (e.g., LIDAR, RADAR, ultrasonic, etc.) to determine distance to target.
- Additional sensors within the sensor unit 125 may include rate gyros (e.g., MEMS or fiber optic gyros), which may be used to stabilize the weapon 110 within the mount 120.
- Magnetometers and accelerometers also may be included within the weapon system 100, and may be used for canceling gyro drift.
- Accelerometers also may be used to detect and respond to vehicle accelerations (i.e., when the weapon system 100 is mounted on a vehicle), and vibrations caused by vehicle movement and/or terrain and weather.
- Sensors 125 also may include wind speed sensors, including hot-wire, laser/LIDAR, sonic and other types of anemometers.
- A global positioning system (GPS) receiver or other positioning devices may be included within the sensor unit 125, in order to determine the weapon's location, heading, and velocity to compute firing solutions, and for use in situations where external target coordinates are provided.
- The cameras/sensors may be housed within the sensor unit 125, positioned elsewhere in the weapon system 100, installed on a structure or vehicle on which the weapon system 100 is mounted, or installed at a separate remote location and configured to transmit wireless sensor data back to the weapon system 100.
- Weapon system 200 may correspond to the same weapon system 100 discussed above, and/or to other variations of the weapon systems described herein.
- Weapon system 200 includes a weapon 225, mount 230, motor 235, and a camera/sensor unit 240.
- Weapon system 200 also includes a targeting/firing system 210, described below in more detail, which may be implemented in hardware, software, or a combination of hardware and software.
- Weapon system 200 also may include operator-facing components, including controls 245 and a display screen 250.
- The targeting/firing system 210 may be configured to drive the motor 235 to a particular target point, and to initiate firing of the weapon 225.
- The camera/sensor unit 240 may collect image and sensor data, and transmit that data back to the targeting/firing system 210 for use in target detection, selection, and tracking functionality.
- Image and sensor data also may be transmitted directly from the sensor unit 240 to the display 250 for rendering/use in an operator user interface.
- The targeting/firing system 210 also may transmit various targeting data to the display device 250 for presentation to the operator, and may receive firing commands and/or other control commands from the operator via the operator controls 245.
- Weapon systems 200 may include turrets or platform-mounted guns which include the weapon/motor 225-235, camera/sensor unit 240, targeting/firing system 210, as well as the operator controls 245 and display 250.
- Some or all of the components of a weapon system 200 may be non-integrated and located remotely from one another.
- For example, the weapon/motor 225-235 and a subset of the sensors/cameras 240 may be located near the potential targets, while the targeting/firing system 210 and operator interface components 245-250 may be at a distant remote location.
- Certain sensors 240 may be located at or near the weapon 225 (e.g., to measure distance to target, current location, weapon movement and vibration, wind and weather conditions, etc.), while other sensors 240 may be positioned at or near the target and/or at other angles to the target, while still other sensors or cameras 240 may be remotely located (e.g., drone-based cameras, satellite imagery, etc.).
- Each of the components may include network transceivers and interfaces configured for secure network communication, including components for data encryption and transmission over public or private computer networks, satellite transmission systems, and/or secure short-range wireless communications, etc.
- The targeting/firing system 210 may receive input data from various data sources, and analyze the data to identify, select, and prioritize targets, actuate the motor 235, dynamically track targets, generate firing solutions, and control firing of the weapon 225. In order to perform these functions, the targeting/firing system 210 may receive data from one or more of the data sources described below.
- The sensor data may include images of targets and potential targets, distance/range data, heat or infrared data, audio data, vehicle or weapon location data, vehicle or weapon movement and vibration data, wind and weather condition data, and any other sensor data described herein.
- One or more data stores may store system configuration and operation data, including a rules data store 213 and a profiles data store 214.
- The rules data store 213 may include, for example, target identification rules, target selection/priority rules, firing rules, and other rules of engagement, each of which may depend on the particular operation, the current location of the weapon system 200, the individual operator, etc.
- The profiles data store 214 may include, for example, individual user profiles with user preferences and parameters, weapon profiles, and/or ballistic profiles that may include specifications for individual weapon types and ammunition types, which may be used to calculate maximum ranges and targeting solutions.
- One or more communication modules 212 within the targeting/firing system 210 may be used to receive commands and other data from the current operator and/or from a separate command center. As discussed below, commands received from a command center or other higher-level authority may be used to control the target selection and rules of engagement for particular operations. Communication modules 212 also may be used to receive or retrieve sensor data from remote sensor systems, including satellite data, image data from remote cameras, target GPS data, weather data, etc.
- The targeting/firing system 210 may include various components (e.g., targeting component 220) configured to receive and analyze the various data to perform targeting functions, including subcomponents for target detection 221, target selection 222, target tracking 223, and firing control 215, among others.
- The operator controls 245 and display screen 250 may correspond to the input/output interface between the human operator and the weapon system 200. As noted above, certain weapon systems 200 may be fully autonomous, or may operate in a supervised autonomous mode, in which case the operator controls 245 and display screen 250 need not be present.
- The operator controls 245 and display screen 250 may be remotely located in some embodiments, allowing the operators to control the weapon system 200 from a separate location that may be a few feet away or across the globe.
- The display device 250 may receive and output various user interface views to the operator, including views described below for identifying and highlighting targets, obscuring non-targets, rendering target points, weapon trajectories, and confidence ranges, and providing various additional sensor readings to the operator.
- The operator controls 245 may allow the operator to identify, select, and mark targets, and to fire the weapon 225.
- The operator controls 245 may include a fire button 246 (to fire the weapon 225), and a "next target" button 247 to instruct the targeting component 220 to re-select the next priority target.
- In some embodiments, the operator controls might include only these two buttons, and need not include a joystick for aiming, tracking, etc.
- FIGS. 3A-3C illustrate the operation of motorized weapon systems on three different vehicle-based mounting platforms.
- In FIG. 3A, a motorized weapon system is mounted on a stationary or moving vehicle 306.
- The remote weapon system 304 holds the firearm 305, and various sensors may be installed in the frame of reference of the firearm 305, in the frame of reference of the gimballed remote control, and/or in the frame of reference of the vehicle 306.
- The field of view 307 is represented by dotted lines.
- A crosshair 301 shows the current projected point of impact.
- In this example, the crosshair 301 is not yet on target, and it may be assumed that the motor is engaged, driving the firearm to the target position, or that the operator has not yet confirmed the target.
- The targeting system in these examples shows a primary target 302 identified by a double-dashed box, and a secondary target, which has been identified but not yet targeted, shown within a single-dashed box 303.
- Figure 3B shows a similar set of components, but in this case, the scenario is a maritime use with an armed boat 306 as the vehicle.
- Figure 3C shows yet another scenario in which the vehicle 306 is a helicopter.
- Figure 3C also illustrates that the system may identify multiple secondary targets 303 within the field of view 307.
- In FIG. 4, a flow diagram is shown illustrating a process by which a motorized weapon system may identify, target, engage, and fire on one or more targets.
- the steps in this process may be performed by one or more components in the example motorized weapon system 200 discussed above, such as targeting/firing system 210 and the subsystems thereof, in conjunction with the weapon/mount/motor components 225-235, one or more sensor units 240, operator interface components 245-250, and/or various remote and external systems.
- process steps described herein need not be limited to the specific systems and hardware implementations described above in Figures 1-3, but may be performed within other motorized weapon systems and environments comprising other combinations of the hardware and software components described herein.
- the components of the motorized weapon system 200 may identify and verify one or more targets, using sensor units 240 and/or additional data sources. In some embodiments, the identification and/or verification of targets may be performed fully autonomously.
- image data from cameras and sensor data from other sensors 240 may be used to identify one or more targets within the range and proximity of the weapon system 200.
- data from additional sources may be used as well, including imagery or sensor data from remote sensor or imaging systems (e.g., other weapons systems 200, fixed cameras, drones, satellites, etc.).
- the targeting/firing system 210 may be configured to calculate approximate range data using passive ranging techniques. For example, heights of known objects (or presumed heights) may be used to calculate the distance of those objects from the weapon system 200.
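The passive-ranging arithmetic described above can be sketched as follows. This is an illustrative sketch only, not the patent's implementation; the object height and subtended angle are assumed example values.

```python
import math

def passive_range(known_height_m: float, angular_height_deg: float) -> float:
    """Estimate distance to an object of known (or presumed) height
    from the vertical angle it subtends in the camera image."""
    angular_height_rad = math.radians(angular_height_deg)
    # distance ≈ height / tan(angular height), assuming the object
    # is roughly level with the sensor.
    return known_height_m / math.tan(angular_height_rad)

# Example: a 2.5 m tall object subtending 0.5 degrees of the field of view
# lies roughly 286 m away.
print(round(passive_range(2.5, 0.5)))
```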
- Additional sources of target data also may be received via communication modules 212, which may include the GPS coordinates of targets, or bearing to targets, received from a command center. Such image data and other sensor data received from additional data sources may be used by the targeting/firing system 210 to triangulate or confirm a target’s location, or verify the identity of a target, etc.
- target identification and target verification refer to related but separate techniques.
- Target identification or target detection refers to the analysis of camera images, sensor data, etc., to detect objects and identify the detected object as potential targets for the weapon system 200 (e.g., vehicles, structures, weapons, individuals, etc.), rather than generally non-target objects such as rocks, trees, hills, shadows, and the like.
- Target verification or target confirmation refers to additional analyses of the same images/sensor data, and/or additional sources images/sensor data, to determine whether or not the identified potential target should be selected for targeting by the weapon system 200.
- Target verification techniques may be based on the configuration of the system and priorities of the particular mission, etc.
- target verification techniques for vehicles may include identifying the size of a vehicle target (e.g., based on image analysis, target range, heat signatures from engines, etc.), the vehicle type (e.g., based on image analysis, and comparisons to a database 214 of target/non-target images), the presence of weapons on a target or proximate to a target, etc.
- the size, shape, color, movement, audio and heat signatures of a vehicle may be analyzed to determine if that vehicle is a drone, helicopter, aircraft, boat, tank, truck, jeep, or car, whether the target is a military or civilian vehicle, the number of individuals and/or weapons on the vehicle, and the like, all of which may be used by a rules database 213 to determine whether the vehicle is a target or non-target.
- Target verification also may include identifying particular insignia on targets, and for human targets, facial recognition and/or biometric recognition to confirm the identity of the target.
- both target identification and target verification in step 401 may be performed fully autonomously by the weapon system 200, using the techniques described above.
- target identification and/or verification may include semi-autonomous or manual steps.
- the rules of engagement for particular operations may require that each target be visually confirmed by a human operator.
- Such visual confirmation may be performed by the operator, as described in steps 406-407 below.
- the visual confirmation may be received from a different user, such as a commanding officer at a remote command center or other authorized user.
- the weapon system 200 may be configured to transmit imagery and other sensor data to one or more remote locations, and then to receive the instructions identifying the potential target as a selected target or a non-target, from the remote authorized user/command center via a communication module 212.
- remote visual confirmation techniques may be entirely transparent with respect to the operator of the weapon system 200 in some cases, that is, if a target is not selected/confirmed by a remote authorized user then that target might not ever be rendered or selected via the operator display device and/or might not be selectable by the operator during steps 406-407.
- both target identification and target selection in step 401 may be based on sets of rules received via a rules database 213 or other sources.
- Target selection rules may be based on target type (e.g., types of vehicles, individuals (if any), and structures, etc.), target size, target distance, the presence and types of weapons on a target, the uniform/insignia on a target, and the like.
- Additional rules may relate to the probability that the target has been accurately identified (e.g., level of confidence of facial recognition, vehicle type identification, insignia recognition, etc.), the probability that the weapon system 200 will be able to hit the selected target (e.g., based on target distance, target movement, weapon and ammunition type, wind and weather conditions, etc.), and/or the presence of potential collateral damage that may occur if the target is fired upon (e.g., based on detection of friendly and non-targets in the proximity of the identified target).
- Different sets of rules may be applied for different operators, different weapons 225 and ammunition types, different times, and/or different physical locations for the engagement.
- target identification, selection, and prioritization rules may be selected and applied by the targeting/firing system 210 for an engagement with a particular operator, at a particular date and time, using a particular weapon/ammunition type, in a particular country/region of the engagement, having particular lighting or weather conditions, and so on.
- an entirely different set of target identification, selection, and prioritization rules may be selected and applied by the targeting/firing system 210 if one or more of these variables (e.g., operator, time, weapon or ammunition type, engagement location or environmental conditions, etc.) changes.
- the targeting/firing system 210 of the motorized weapon system 200 may be configured to prioritize the multiple targets, thereby determining a firing order.
- target prioritization techniques similarly may be based on imagery and sensor data, as well as sets of operational rules that may apply to operators, weapons, locations, etc.
- Examples of target prioritization rules may include, without limitation, rules that prioritize vehicles over human targets, certain types of vehicles over other types of vehicles, armored vehicles over non-armored vehicles, armed targets over non-armed targets, uniformed or insignia-bearing targets over non-uniformed targets, close targets over far targets, advancing targets over stationary or retreating targets, higher confidence targets (i.e., higher probability of the weapon being able to hit the target) over lower confidence targets, targets firing weapons over targets not firing weapons, and/or any combination of these criteria.
- the targeting/firing system 210 may evaluate the current target distance and trajectory of all advancing and armed targets (e.g., missiles, drones, ground vehicles, and individuals, etc.), in order to prioritize the targets in the order in which they would first reach the current position (or future position) of the weapon system 200.
- These target prioritization rules also may include rules determining how particular types of targets may be targeted. For example, such rules may include the desired point of impact for a particular target type (e.g., the engine of a boat, the center of mass of an individual, etc.).
- rules or algorithms may be applied for prioritizing targets, depending on the current operator, current location, current date/time, and/or based on predefined operation-specific rules of engagement. Further, rules or algorithms for prioritization may be based on or adjusted in view of current conditions, such as the current amount of ammunition of the weapon system 200 (e.g., lower ammunition circumstances may cause prioritization of most valuable/important targets first), the current wind or weather conditions (e.g., in which closer and/or higher confidence targets may be prioritized), or based on nearby friendly or non-hostile targets (e.g., in which closer and/or higher confidence targets may be prioritized).
- certain prioritizing algorithms may adjust the priorities of a set of targets to reduce and/or minimize the lag time between successive firings of the weapon, for instance, by prioritizing a set of nearby targets successively in the priority rank order, in order to reduce the firing latency time required to drive the weapon 225 through the sequence of targets.
- operators may be permitted to switch on-the-fly between different rules or algorithms for target selection and prioritization. Such switching capabilities may be based on the rank and/or authorization level of the operator, and in some cases may require that a request for approval be transmitted from the weapons system 200 to a higher-level user at a remote command center.
- Referring to FIG. 5, a display screen is shown displaying an example user interface 500 that may be generated by a motorized weapon system 200 during engagement of a set of targets.
- a plurality of targets have been identified and selected within the range and proximity of the weapon system 200.
- the targets have been prioritized to select a primary target 501, several secondary targets 502, and several non-targets 503 (e.g., friendly or non-hostile vehicles or individuals).
- the primary target 501 is indicated with a double dotted line, the secondary targets 502 are indicated with a single dotted line, and the non-targets 503 have no lines.
- example user interface 500 includes two operator controls: a fire button 510 to allow the user to fire the weapon 225, and a next button 515 to allow the user to select the next target in the priority list.
- fire button 510 is shaded indicating that the weapon 225 cannot currently be fired. As described below in more detail, this may represent a feature in which the operator’s firing control mechanism 246 is disabled whenever the weapon 225 is not currently aimed at a selected target.
- the next button 515 is enabled in this example, indicating that the next mechanism 247 that allows the operator to change the primary target 501 to the next highest priority target 502 in the priority list may be enabled even when the crosshairs 505 are not yet positioned on the primary target 501.
- the kill chain sequence may continue by performing the functionality of steps 403-410 in a continuous loop for each of the targets selected in step 401, and in the priority order of the target prioritization performed in step 402. Therefore, the first iteration of steps 403-410 may be performed for the highest priority target, the second iteration of steps 403-410 may be performed for the second highest priority target, and so on.
- the targeting/firing system 210 may perform a dynamic tracking technique to determine a firing solution for that target.
- a firing solution refers to a precise firing position for the weapon (e.g., an azimuth/horizontal angle and altitude/elevation angle) and a precise firing time calculated by the targeting/firing system 210 to hit the primary target.
- in some cases (e.g., for stationary targets), target tracking need not be performed, and the firing solution may be computed based on a number of factors, including the target distance and target bearing from the weapon 225, the muzzle velocity of the weapon 225, the aerodynamic drag of the projectile/ammunition to be fired, the wind and weather conditions, and gravity (any one of which may vary based on the current conditions).
- dynamic target tracking may be required to generate a firing solution, introducing additional variables which may increase the complexity and uncertainty of the firing solution calculation.
- dynamic target tracking may involve calculating the anticipated direction and velocity of the target.
- the targeting/firing system 210 may assume that the primary target will continue along its current course with the same velocity and direction. If the target is currently moving along a curved path, and/or is currently accelerating or decelerating, then the targeting/firing system 210 may assume the same curved path and/or the same acceleration or deceleration in the near future.
- the targeting/firing system 210 may anticipate future changes in course or speed, based on factors such as upcoming obstructions in the target’s path, curves in roads, previous flight patterns, etc.
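The simplest form of the motion assumption above, that a tracked object holds its current course and speed, is a constant-velocity extrapolation. A minimal sketch, in which the 2-D state and time step are assumed example values, not anything specified in the disclosure:

```python
def predict_position(pos, vel, dt):
    """Extrapolate a tracked object's position assuming it holds its
    current course and speed for the next dt seconds."""
    return (pos[0] + vel[0] * dt, pos[1] + vel[1] * dt)

# An object at (100 m, 50 m) moving at (4 m/s, -2 m/s), predicted 1.5 s ahead.
print(predict_position((100.0, 50.0), (4.0, -2.0), 1.5))  # → (106.0, 47.0)
```

More elaborate models (curved paths, acceleration, anticipated course changes) would replace this linear step with higher-order terms.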
- the determination of a firing solution for a moving target also may take into account the anticipated time to drive the motor 235 so that the weapon is positioned at the correct firing point, and the anticipated amount of time between the firing command and when the projectile/ammunition will reach the target.
- the time to drive the motor 235 may be calculated based on the distance the gun is to be driven, the speed of the motor, and/or the weight of the weapon 225. The amount of time between receiving a firing command and when the projectile/ammunition will reach the target may be based on the muzzle velocity of the weapon 225, the aerodynamic drag of the projectile/ammunition to be fired, the wind and weather conditions, etc. Additionally, in some cases, an anticipated delay for operator reaction time (e.g., 0.5 seconds, 1 second) also may be included in the firing solution calculation.
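The delay terms enumerated above can be combined into a simple time-budget sketch. All numeric values, the constant-rate drive model, and the drag-free time-of-flight estimate are illustrative assumptions, not values from the disclosure:

```python
def total_lead_time(angular_travel_deg, slew_rate_deg_s,
                    target_range_m, muzzle_velocity_m_s,
                    operator_reaction_s=0.5):
    """Sum the delays a firing solution may account for: motor drive
    time, projectile time of flight, and an anticipated operator
    reaction delay."""
    drive_time = angular_travel_deg / slew_rate_deg_s      # time to slew
    time_of_flight = target_range_m / muzzle_velocity_m_s  # ignores drag
    return drive_time + time_of_flight + operator_reaction_s

# 30 degrees of travel at 60 deg/s, an 800 m range at 900 m/s muzzle
# velocity, plus a 0.5 s reaction allowance.
print(round(total_lead_time(30, 60, 800, 900), 2))  # → 1.89
```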
- FIG. 6 another example user interface 600 is shown that may be generated by a motorized weapon system 200 during engagement of one or more targets.
- only a single primary target 601 is shown, and the targeting/firing system 210 has assessed that the target 601 is moving toward the lower-right direction of the interface 600.
- the targeting/firing system 210 may calculate the firing solution.
- the crosshairs 605 represent the point at which the weapon 225 is currently aimed
- the point 606 represents the desired point of impact on the target 601
- point 607 represents the firing solution determined by the targeting/firing system 210.
- the motor 235 is currently re-positioning the weapon toward the firing solution point 607
- the firing solution computation has taken into account the time to reposition the weapon 225 and the projectile time-to-target.
- the firing solution computation also may take into account a short time delay to fire the weapon, and/or an anticipated operator decision time delay.
- example interface 600 also includes three operator controls: a fire button 610, a next button 615, and a safe button 620.
- the fire button 610 allows the operator to fire the weapon 225, but in some cases might be enabled only after the weapon 225 has reached the firing solution point 607.
- the next button 615 allows the operator not to fire the weapon 225 at the primary target 601, but instead to re-select the next highest priority target in the priority list.
- the primary target 601 may be moved to the back of the priority list or elsewhere in the priority list, based on the operator’s selection of the next control 615.
- the safe button 620 allows the operator to mark the currently selected primary target 601 as a friendly or non-target object, thereby removing it from the set of selected targets determined in step 401 and priority list of step 402.
- the configuration settings of the targeting/firing system 210 may determine whether a target marked as safe by an operator during one engagement might thereafter be excluded from target selection in future engagements.
- weapon system 200 may transmit data identifying any targets marked as safe to other weapons systems 200 in the same general location, so that those other weapons systems 200 may automatically remove the target marked as safe from their target selection/prioritization lists as well.
- although step 404 was described above as performed for only a single target (i.e., the current highest priority target), in some embodiments, the targeting/firing system 210 may continuously perform dynamic tracking for all targets selected/prioritized in steps 401-402.
- the targeting/firing system 210 may more quickly and efficiently determine the firing solution for the next primary target as soon as the firing sequence 403-410 is completed for the first primary target. Additionally, while dynamically tracking a plurality of secondary target(s), the targeting/firing system 210 may potentially re-order the prioritization sequence determined in step 402, for example, based on movement of the secondary targets and/or based on newly received data about one or more of the secondary targets (e.g., improved verification information, additional threat information, etc.).
- the targeting/firing system 210 may engage the motor 235 to drive the orientation of the weapon 225 toward the firing solution determined for the primary target in step 404
- the motor 235 may be engaged to aim the weapon 225 from its currently aimed position 605, to the determined firing solution point 607. It may be noted from this example that (a) the weapon 225 may be driven not toward the current position point of the target 606, but instead to the future position point 607, and (b) that the motor 235 may be engaged and the weapon 225 may be driven to this point by the targeting/firing system 210 in a fully autonomous manner, before any action has been taken by the operator to view, select, mark, or engage this target.
- the targeting/firing system 210 may generate and transmit a user interface to be rendered for the operator via one or more display devices 250.
- the human operator may be located at the weapon system 200 or remote to the weapon system 200, in which case the user interface may be transmitted via the communication module 212 over one or more secure computer networks, wireless networks, satellite networks, etc.
- the user interface provided in step 406 may correspond to user interfaces 500 and/or 600 discussed above, although several variations may be implemented in different embodiments.
- the primary target 501 may be marked by a particular scheme that is different from the secondary targets and from non-targets.
- the user interface may automatically zoom in on the primary target (as in screen 600) to allow the operator the best possible visual of the target. Additionally or alternatively, secondary targets and/or non-targets may be blocked out, hidden, or otherwise obscured to prevent confusion or distraction by the operator. Further, in different embodiments, each of the various different target points discussed above (e.g., crosshairs 605 representing current weapon aiming point, the current target position point 606, and/or firing solution target point 607) may or may not be rendered within the user interface, and/or may be shown in different colors, using different graphics and icons, etc. Finally, the user interface generated and rendered in step 406 may include additional components such as side menus, overlays, and the like, to convey any relevant sensor information about the target or the firing environment.
- the targeting/firing system 210 may receive engagement instructions from the operator, via operator controls 245.
- the operator controls might only include two buttons: a fire button and next button.
- the operator controls might include only three buttons: a fire button, a next button, and safe button.
- while any number of different/additional operator controls may be included in other embodiments (e.g., mouse/joystick for aiming, manual override, target selection controls, etc.), there are certain technical advantages associated with a limited interface such as the two-button or three-button interfaces shown in 500-600, including simplification of the operator interface, reduction of real-time operator errors, increased speed to weapon firing, etc.
- due to the dynamic target tracking, there may be a time delay between steps 406 and 407, for target analysis, evaluation, and decision making by the operator. During this time delay, the dynamic tracking may continue for the primary target as well as the secondary targets selected by the targeting/firing system 210. Thus, while the operator deliberates on whether or not to fire on a target between steps 406 and 407, for moving targets and/or other changing circumstances (e.g., a detected change in the wind), the firing solution may be updated and the motor 235 may be continuously engaged so that the weapon 225 is continuously aimed at the most recent firing solution target point.
- the target identification, selection, and prioritization techniques discussed above in steps 401 and 402 may be updated, automatically and entirely transparently to the operator, to re-select and re-prioritize the targets based on new imagery, sensor data, and other relevant data received during the time delay between steps 406-407.
- the targeting/firing system 210 may perform the received instructions in steps 408-410.
- the fire command (408) is an operator instruction to fire the weapon 225, and in some cases might be enabled only after the weapon 225 has reached the firing solution target point.
- the targeting/firing system 210 may initiate firing of the weapon 225, and then return to perform steps 403-410 for the next highest priority target.
- the targeting/firing system 210 may be configured to evaluate the accuracy of the projectile fired in step 410, and may perform a real-time automatic correction in the targeting algorithm based on the accuracy evaluation. For example, upon firing a shot in step 410, the targeting/firing system 210 may be configured to activate one or more cameras or sensors from sensor units 240 (which may be local or remote), to detect the landing time and location of the projectile. Additional sensors such as audio sensors, heat sensors, etc., also may be used to determine where the projectile hit/landed. The projectile landing/hit data may be compared to the firing solution/target point data that was determined by the targeting/firing system 210 prior to firing the projectile.
- the targeting/firing system 210 may be configured to adjust its targeting algorithm in real-time, so that the updated algorithm may be used in the next iteration of steps 403-410. Additionally, if the shot was off target by a sufficient amount that the target was missed, then the targeting/firing system 210 may be further configured to re-insert the previously fired upon target back into the priority list of selected targets.
- the next command is an operator instruction not to fire the weapon 225 at the target, but to retain the target within the set of selected targets/target priority list, and then to re-select the next highest priority target in the priority list.
- a next command in step 409 may cause the target to be placed at the back of the priority list of selected targets, or may cause the target to be placed immediately after the next highest priority target in the priority list.
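The two placement behaviors described above amount to simple list rotations. A minimal sketch, in which the list identifiers and function name are placeholders for illustration:

```python
def next_command(priority_list, to_back=True):
    """Handle a 'next' instruction: the current primary entry is
    retained in the list but demoted, and the next highest priority
    entry becomes primary."""
    demoted = priority_list.pop(0)
    if to_back:
        priority_list.append(demoted)     # move to the back of the list
    else:
        priority_list.insert(1, demoted)  # place just behind the new primary
    return priority_list

print(next_command(["T1", "T2", "T3"], to_back=True))   # → ['T2', 'T3', 'T1']
print(next_command(["T1", "T2", "T3"], to_back=False))  # → ['T2', 'T1', 'T3']
```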
- a safe command is an operator instruction to mark the target as a friendly or non-target object, thereby removing it from the set of selected targets and target priority list.
- the target may not be selected again by the targeting/firing system 210, during at least the current engagement by the current weapon system 200.
- a target marked as safe during step 410 during an engagement at one weapon system 200 also might be excluded from target selection in future engagements of the weapon system 200, and/or during current and future engagements at different weapons systems 200.
- the various techniques discussed above with reference to Figure 4 including without limitation: (a) autonomous target selection, prioritization, and re-selection by the targeting/firing system 210, (b) dynamic target tracking of both the primary target and secondary targets that takes into account target movement, weapon/projectile characteristics, etc., (c) autonomous actuation of the motor to automatically orient the weapon toward the primary target before receiving any operator input, (d) a simplified user interface and operator controls, and (e) enabling/disabling of the firing mechanism depending on the projected point of impact of the weapon, alone and in combination, provide increased system efficiency, increased rate of firing, improved weapon system accuracy, and reduced operator error, along with the other technical advantages described herein.
- certain aspects of the present disclosure relate to techniques for disabling and re-enabling an operator firing control (e.g., 246), during the period of time when the motor 235 of a motorized weapon system 200 is engaged and the weapon 225 is being positioned and oriented toward a determined target point for firing.
- the process of engaging the motor 235 of the weapon system 200 to position the weapon 225 to fire on a particular target point may take anywhere from a fraction of a second to several seconds, depending on factors such as the motor size and speed, gun size and weight, angular distance to be traveled, etc.
- the projected point of impact of a projectile fired from the weapon 225 may become closer and closer to the target point, and similarly, the likelihood of hitting the target may increase continuously until a maximum likelihood is reached when the projected point of impact of the weapon 225 (e.g., marked by crosshairs 505, 605, etc.) is directly on the determined firing solution target point.
- the probability of hitting the target might never be 100%.
- the targeting/firing system 210 may be configured to enable firing of the weapon 225 (and/or automatically fire the weapon 225).
- the targeting/firing system 210 may be configured to determine if/when the predetermined likelihood threshold for hitting the target is reached during the time period when the motor 235 is engaged in positioning the weapon 225, but before the crosshairs 505 are directly on the target (i.e., before the projected point of impact of the weapon 225 is directly on the determined firing solution target point).
- the targeting/firing system 210 may be configured to disable the operator firing mechanism 246 when the current likelihood of hitting the target is below the predetermined likelihood threshold, based on the position/orientation of the weapon 225 and other factors. The operator firing mechanism 246 then may be re-enabled in response to the targeting/firing system 210 determining that the predetermined likelihood threshold has been reached.
- In FIG. 7, a flow diagram is shown illustrating a process of disabling and/or re-enabling the firing mechanism of a motorized weapon system while the motor is engaged to move the weapon to a target point.
- the steps in this process may be performed by one or more components in the example motorized weapon system 200 discussed above, such as targeting/firing system 210 and the subsystems thereof, in conjunction with the weapon/mount/motor components 225-235, one or more sensor units 240, operator interface components 245-250, and/or various remote and external systems.
- process steps described herein such as determination of likelihood thresholds for hitting targets, and corresponding boundary areas for motorized weapons systems, need not be limited to the specific systems and hardware implementations described above in Figures 1-3, but may be performed within other motorized weapon systems and environments comprising other combinations of the hardware and software components described herein.
- In step 701, a motorized weapon system 200 has identified and selected a particular target, and determines a firing solution and/or target point for the selected target.
- step 701 may be similar or identical to step 404 discussed above.
- one or both of the target and the weapon system 200 may potentially be moving during this process.
- the firing solution target point may be computed based on factors including the target distance, target bearing from the weapon 225, muzzle velocity of the weapon 225, aerodynamic drag of the projectile/ammunition to be fired, the wind and weather conditions, and gravity (any one of which may vary based on the current conditions).
- dynamic target tracking may be required to generate a firing solution, and additional variables may increase the complexity and uncertainty of the firing solution calculation.
- dynamic target tracking may be used to determine the current velocity and direction of travel of both the weapon system 200 and the target, and that data may be used to calculate the anticipated velocity and direction of travel of both in the near future.
- the targeting/firing system 210 may assume that both the weapon system 200 and the target may continue along their current course with the same velocity and direction, and if either is currently moving along a curved path and/or is currently accelerating/decelerating, then the targeting/firing system 210 may assume the same curved path and/or the same acceleration/deceleration in the near future.
- the determination of a firing solution (e.g., predicted future coordinates at a future firing time) may be based on these anticipated movements of the weapon system 200 and the target.
- the targeting/firing system 210 may build in an anticipated delay for operator reaction time (e.g., 0.5 seconds, 1 second), which may be included in the firing solution calculations for moving targets.
- the targeting/firing system 210 of the motorized weapon system 200 may determine a boundary area surrounding the target point determined in step 701.
- the boundary area may be referred to as a “confidence lock” boundary, because, as discussed below, the firing mechanism may be disabled when the projected point of impact of the weapon is outside of this area.
- the boundary area may be a circle or other two-dimensional closed shape surrounding the target point. A simple example of a circular boundary area 807 is shown in Figures 8A-8B, discussed in more detail below.
- the boundaries of the area may correspond to a predetermined likelihood threshold of hitting the target and need not be any particular shape.
- the likelihood of the weapon 225 hitting the target may be calculated as a probability P, which may be equal for every point on the boundary of the area and equal to a predetermined likelihood threshold set by the targeting/firing system 210.
- for any shot taken when the weapon crosshairs are outside of the boundary area, the likelihood of hitting the target is less than P, and for any shot taken when the weapon crosshairs are inside of the boundary area, the likelihood of hitting the target is greater than P.
- the boundary area may be circular, as shown in Figures 8A-8B. Circular boundaries may generally apply when the determined probability P is the probability of hitting the target point. However, if the determined probability P is the probability of hitting any point on the target, then the boundary area may be target-shaped (e.g., a larger vehicle-shaped boundary surrounding the target vehicle, a larger person-shaped boundary surrounding the target person, etc.). When either the target or the weapon system 200 is currently moving, the boundary area may assume a more elongated shape in the direction of the movement, to account for the additional targeting uncertainties caused by the movement of the weapon system 200 or target.
- the boundary area may be shaped like a horizontally-elongated circle (or horizontally-elongated vehicle shape).
- the boundary area may be defined in terms of angular coordinates (e.g., azimuth and altitude) from the perspective of the weapon 225.
- the size of the boundary area determined in step 702 may be based on any combination of factors that may introduce uncertainty in the point of impact calculation of the weapon 225 with respect to the target.
- the size of the boundary area (e.g., in terms of angular degrees or coordinates) may be based on one or more of the target size, distance between the weapon 225 and the target, the general accuracy and precision data for the weapon type 225 and ammunition type, and other factors such as wind, vibration level of the weapon 225 during movement by the motor, and current movement of the weapon system 200 and/or the target.
- the boundary area may be relatively small.
- the boundary area may be relatively large.
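One way the boundary sizing described in this passage could be sketched: the angular size subtended by the target shrinks with distance, while weapon dispersion and uncertainty multipliers (wind, vibration, movement) tighten the usable boundary. The weighting scheme, function name, and milliradian units here are assumptions for illustration only.

```python
import math

def boundary_radius_mrad(target_size_m, distance_m, dispersion_mrad,
                         wind_factor=1.0, vibration_factor=1.0,
                         motion_factor=1.0):
    """Estimate the angular radius (milliradians) of the confidence-lock
    boundary.  Large or close targets subtend a bigger angle (larger
    boundary); dispersion and uncertainty multipliers shrink it."""
    # Angular radius subtended by half the target, in milliradians.
    target_angle = math.atan2(target_size_m / 2.0, distance_m) * 1000.0
    # Remove the weapon's own dispersion from the usable margin, then
    # tighten further for each additional source of uncertainty.
    usable = max(target_angle - dispersion_mrad, 0.0)
    return usable / (wind_factor * vibration_factor * motion_factor)
```

A 2 m target at 100 m with 1 mrad dispersion yields roughly a 9 mrad boundary; the same target at 200 m yields a much smaller one, matching the relative sizes described above.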
- in step 703, the targeting/firing system 210 engages the motor 235 to position and orient the weapon 225 toward the target point identified in step 701.
- step 703 may be similar or identical to step 405, discussed above.
- the engagement of the motor 635 may drive the position and orientation of the weapon 225 to a predicted point of impact of the stationary target point 606.
- the engagement of the motor 635 may drive the position and orientation of the weapon 225 to a separate predicted future target point (e.g., 607) determined by a firing solution calculation based on predicted target movement and anticipated time delays until firing and impact.
- in step 704, at a particular point in time when the motor 235 is engaged and the weapon 225 is moving, the targeting/firing system 210 may compute the projected point of impact if a projectile were fired from the weapon 225 at that time.
- the projected point of impact corresponds to the calculation of the crosshairs (e.g., 505 and 605) discussed above and shown in Figures 5 and 6.
- the calculation of the projected point of impact may be based on the specifications of the weapon system 200 and/or collected sensor data, such as the current position and orientation of the gun, the distance to target and bearing of the target from the weapon 225, the muzzle velocity of the weapon 225, the aerodynamic drag of the projectile to be fired, the current wind and weather conditions, and gravity (which may vary based on the current elevation).
- in step 705, the targeting/firing system 210 may compare the projected point of impact computed in step 704 to the "confidence lock" boundary area defined in step 702. This may be a straightforward comparison of angular coordinates from the perspective of the weapon 225. If the current point of impact of the weapon 225 is projected to fall outside of the defined boundary area (705:No), then in step 706 the targeting/firing system 210 may disable the operator firing mechanism 246, thereby preventing the weapon 225 from being fired. However, if the current point of impact of the weapon 225 is projected to fall within the defined boundary area (705:Yes),
- then in step 707 the targeting/firing system 210 may enable (or re-enable) the operator firing mechanism 246, thereby allowing the operator to fire the weapon 225.
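The comparison and enable/disable logic of steps 705-707 amounts to a point-in-region test on angular coordinates. A minimal sketch, with invented function names and a circular boundary assumed:

```python
import math

def within_confidence_lock(impact_az, impact_alt, target_az, target_alt,
                           radius_deg):
    """Return True when the projected point of impact lies inside a
    circular confidence-lock boundary centred on the target point.
    All values are angular coordinates in degrees from the weapon's
    perspective."""
    return math.hypot(impact_az - target_az,
                      impact_alt - target_alt) <= radius_deg

def update_firing_mechanism(impact, target, radius_deg):
    """Report the firing-mechanism state implied by steps 705-707:
    enabled only while the projected impact is inside the boundary."""
    inside = within_confidence_lock(impact[0], impact[1],
                                    target[0], target[1], radius_deg)
    return "enabled" if inside else "disabled"
```

In a real system this decision would feed the operator firing mechanism 246 directly; here it simply returns a label so the gating logic is visible.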
- the targeting/firing system 210 may be configured to perform a rapid post-firing command movement of the weapon 225 in order to further improve shot confidence. For instance, after the operator pushes the enabled firing mechanism 246, rather than immediately firing the weapon 225, the targeting/firing system 210 in some cases may engage the motor 235 for a short amount of time (e.g., 50 ms, 100 ms, 200 ms, etc.), in response to a determination that the corresponding small weapon movement may significantly increase shot confidence.
- These short post-firing command movements may be performed in the case of moving targets and/or moving weapon systems 200, in the event of a sudden change in the trajectory of the target, to correct for a lag in operator reaction time, and/or as part of a firing burst to increase hit probability.
- referring to FIGS. 8A and 8B, two example user interface screens 800 are shown, during a process of engaging the motor 235 of a motorized weapon system 200 to position and orient the weapon 225 at a selected target point 806.
- a circular “confidence lock” boundary area 807 has been defined by the targeting/firing system 210, outside of which firing of the weapon 225 is to be disabled.
- the operator may be unable to fire the weapon 225 (as indicated by the shaded fire button 810).
- steps 704-707 may be performed multiple times while the motor 235 is engaged and the weapon 225 is moving toward the target point.
- the targeting/firing system 210 may perform steps 704-707 in a continuous loop at all times while the motor 235 is engaged, or in some cases even when the motor 235 is not engaged.
- the targeting/firing system 210 may be configured to initiate an instance of steps 704-707 in accordance with a schedule (e.g., every 100 ms, 200 ms, ..., 500 ms, etc.).
- these steps may be performed periodically or continuously even when the motor 235 is not moving and the crosshairs 805 are fixed on the target point 806.
- a new action such as a change in movement of the target 801 or the weapon system 200, an object obscuring the target 801, and/or new sensor readings (e.g., a change in wind conditions) may temporarily cause the probability level of the weapon 225 hitting the target to drop below the predetermined likelihood threshold and out of the confidence lock boundary area 807, requiring a minor adjustment via the motor 235 or other corrective action by the weapon system 200.
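The scheduled re-evaluation described above (e.g., every 100 ms) can be sketched as a fixed-interval tick generator driving an evaluation callback. The helper names and the simulated, non-real-time clock are assumptions; a deployed system would run against a hardware timer.

```python
def confidence_lock_ticks(duration_s, interval_s):
    """Timestamps (seconds) at which steps 704-707 would re-run under a
    fixed re-evaluation schedule, e.g. every 100 ms."""
    ticks, t = [], 0.0
    while t < duration_s:
        ticks.append(round(t, 6))   # round only to tidy float drift
        t += interval_s
    return ticks

def run_schedule(duration_s, interval_s, evaluate):
    """Invoke the supplied evaluation callback once per scheduled tick
    and collect each enable/disable decision."""
    return [evaluate(t) for t in confidence_lock_ticks(duration_s, interval_s)]
```

For instance, a 100 ms schedule over half a second produces five evaluations, whether or not the motor is currently moving, which matches the always-on monitoring described in the text.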
- a motorized weapon system 200 may implement a minimum confidence threshold for target selection and/or prioritization.
- this minimum confidence threshold may be a separate determination from the level of confidence computed by the system 200 for identifying or verifying a target. Rather, this minimum confidence threshold may refer to the level of confidence that the weapon system 200 is able to hit the identified target.
- the targeting/firing system 210 may determine that the confidence level that the weapon system 200 will hit the target is not sufficiently high to fire the weapon 225.
- Environmental conditions such as wind or weather conditions, lighting conditions, and/or other objects potentially obscuring the target object also may lower the confidence level computed by the targeting/firing system 210 for hitting the target.
- that target may be automatically deprioritized so that it is not selectable by the operator (or selectable only via manual override).
- the targeting/firing system 210 may continue to monitor and dynamically track the low-confidence target, and may re-enable target selection and firing capabilities on that target as soon as the confidence level of hitting the target returns to above the minimum confidence threshold.
- the minimum confidence threshold is another operation-specific variable that may be altered based on the operation, the particular operator, the location, and other factors.
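The deprioritization behavior described here, where low-confidence targets remain tracked but become non-selectable absent a manual override, might be sketched as a simple partition. The function name and the (name, confidence) tuple shape are illustrative assumptions:

```python
def selectable_targets(targets, min_confidence, allow_override=False):
    """Partition tracked targets by hit confidence.

    Targets whose hit confidence falls below the minimum threshold are
    deprioritised (not operator-selectable), but remain in the tracked
    set so selection can be re-enabled as soon as confidence recovers."""
    selectable, deprioritised = [], []
    for name, hit_confidence in targets:
        if hit_confidence >= min_confidence or allow_override:
            selectable.append(name)
        else:
            deprioritised.append(name)
    return selectable, deprioritised
```

The `allow_override` flag stands in for the manual-override path mentioned above, and the threshold itself would come from the operation-specific parameters rather than being hard-coded.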
- the firing/targeting system 210 may continuously assess and evaluate its target accuracy, which may result in the system 210 increasing or decreasing the confidence levels it had previously computed for one or more selected targets. As an example, if a first target is initially determined to be too small and too far away to have a sufficiently high confidence level for firing on the target, the firing/targeting system 210 may instead select a number of closer targets and may fire on those targets.
- the firing/targeting system 210 may be better able to evaluate the range, lighting, wind conditions, and the like, so that the confidence level for hitting the first target may now be increased based on the accuracy feedback from the closer targets.
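The patent does not specify how accuracy feedback updates previously computed confidence levels; one plausible sketch is a blend of the prior estimate with the hit rate observed on closer targets. The equal weighting and the function name are assumptions:

```python
def recalibrated_confidence(prior_confidence, shots_fired, hits):
    """Nudge a previously computed hit confidence toward the hit rate
    observed on nearby targets (an illustrative update rule; the patent
    leaves the rule unspecified)."""
    if shots_fired == 0:
        return prior_confidence          # no feedback yet; keep estimate
    observed = hits / shots_fired
    # Equal-weight blend of prior estimate and observed accuracy,
    # clamped to the valid probability range [0, 1].
    return min(max(0.5 * prior_confidence + 0.5 * observed, 0.0), 1.0)
```

Under this rule, a first target initially scored at 0.4 would rise to 0.6 after the system hit 8 of 10 shots on closer targets, consistent with the upward revision described above.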
- a motorized weapon system 200 may be weapon-agnostic, in that a weapon system 200 may support many different types or models of weapons 225, including various firearms, large caliber rifles, machine guns, autocannons, grenade launchers, rockets, and/or directed energy weapons such as lasers, high-power microwave emitters, and other undisclosed devices. Further, the targeting/firing system 210 may store weapon profiles in data store 214 and/or weapon-specific rules in data store 213, which allow the weapon system 200 to perform the techniques discussed herein in a similar or identical manner regardless of the current weapon type.
- the targeting/firing system 210, sensor units 240, and the operator interface 245-250 may function identically regardless of the type of motor 235, mount 230, and weapon 225 integrated into the system 200. Because systems 200 having different types of weapons 225, mounts 230, and/or motors 235 may perform differently in some respects (e.g., time required to re-position and re-orient the weapon 225, maximum range of weapon, type, size, and speed of projectiles fired, etc.), the targeting/firing system 210 may be configured to initially determine these weapon-specific data factors, and adjust the techniques described herein to provide a uniform operator experience.
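The weapon-specific data factors mentioned above could be held in a profile store analogous to data store 214, so that higher layers remain weapon-agnostic. The profile keys, weapon names, and values below are hypothetical:

```python
# Hypothetical weapon-profile store; in the described system this data
# would live in data store 214 rather than a module-level constant.
WEAPON_PROFILES = {
    "rifle_a":    {"max_range_m": 800.0,  "slew_rate_deg_s": 60.0},
    "autocannon": {"max_range_m": 2000.0, "slew_rate_deg_s": 30.0},
}

def weapon_factor(weapon_type, factor):
    """Fetch a weapon-specific data factor from the profile store.

    Unknown weapons or factors raise an error so the system never
    silently assumes default capabilities for an unprofiled weapon."""
    try:
        return WEAPON_PROFILES[weapon_type][factor]
    except KeyError as exc:
        raise KeyError(f"no profile data for {weapon_type}/{factor}") from exc
```

Keeping the lookup behind one function means the targeting logic can ask for `max_range_m` or `slew_rate_deg_s` without knowing which weapon is currently mounted.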
- the targeting/firing system 210 of a first weapon system 200 may automatically select targets based on the firing range of the weapon 225 installed on that system 200, whereas a different system 200 might select more or fewer targets based on its having a weapon 225 with a different range.
- a first weapon system 200 may prioritize a set of selected targets taking into account the speed of the motor 235 on that system 200, whereas a different system 200 might prioritize the same set of targets differently as a result of having a different motor speed.
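A sketch of how motor slew rate could influence prioritization: scoring each target by threat level minus a slew-time penalty, so the same target set can rank differently on systems with different motor speeds. The scoring formula and the (name, threat, angular-offset) field layout are assumptions:

```python
def prioritize_targets(targets, slew_rate_deg_s, urgency_weight=1.0):
    """Rank targets, trading off threat level against the time needed
    to slew the weapon onto each one.

    targets: iterable of (name, threat_level, angular_offset_deg),
    where angular_offset_deg is the offset from the weapon's current
    orientation.  A slower motor penalises far-off-axis targets more."""
    def score(target):
        _, threat, offset_deg = target
        slew_time = offset_deg / slew_rate_deg_s
        return threat - urgency_weight * slew_time
    return [name for name, _, _ in sorted(targets, key=score, reverse=True)]
```

With a fast motor (90 deg/s) a high-threat target 90 degrees off-axis can outrank a nearer, lower-threat one; with a slow motor (10 deg/s) the ordering reverses, which is exactly the per-system difference described above.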
- different sensor units 240 having different numbers, types, and/or qualities of cameras and other sensors may result in different sets of input provided to the targeting/firing systems 210.
- a first weapon system 200 may have sufficient data to select and verify a target with high confidence, while a second weapon system 200 with different cameras/sensors 240 might not select the same target because it could not verify the target with a sufficient confidence level.
- the different behaviors of the weapon systems 200, resulting from different weapons 225, mounts 230, motors 235, and/or sensor units 240 may be entirely transparent to the operator.
- operators of weapons systems 200 need not ever know what weapon 225 they are firing, and the entire operator interface may function identically regardless of the particular weapon, motor, mount, or sensor unit.
- Additional techniques applicable to the above examples include the implementation of operation-specific rules of engagement that may be retrieved/received and enforced by the targeting/firing system 210.
- specific rules of engagement and/or operational parameters for the motorized weapon system may include different requirements or parameters for target identification and selection, different minimum confidence thresholds for firing the weapon 225, different target prioritization algorithms, and so on.
- the motorized weapon system 200 may be configured to receive a set of operation-specific rules of engagement from a remote command center via a secure communication channel, and to store and apply those operation-specific rules during the appropriate operation.
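Receiving and applying operation-specific rules of engagement might look like merging a received message onto defaults while rejecting unknown keys. The rule names and default values below are invented for illustration, and the secure-channel transport is omitted:

```python
import json

# Hypothetical default rules of engagement; real rule names and values
# would be operation- and deployment-specific.
DEFAULT_ROE = {
    "min_hit_confidence": 0.85,
    "warning_shots_required": 0,
    "prioritization": "threat_level",
}

def apply_operation_roe(message_json):
    """Merge operation-specific rules (received, e.g., from a remote
    command center over a secure channel) onto the defaults.

    Unknown keys are rejected so a malformed or unexpected message
    cannot silently introduce unvetted parameters."""
    rules = dict(DEFAULT_ROE)
    for key, value in json.loads(message_json).items():
        if key not in DEFAULT_ROE:
            raise ValueError(f"unrecognised rule: {key}")
        rules[key] = value
    return rules
```

Any field not mentioned in the message keeps its default, so a partial update tightens only the parameters the command center intended to change.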
- specific rules of engagement and/or sets of operational parameters may be associated with specific operators, operator rank, and/or engagement location (e.g., country, region, etc.).
- in some cases, operators having sufficient rank and/or authorization levels may be permitted to manually override certain rules of engagement and/or operational parameters of the weapon system 200, and to apply the operator's own preferred rules/parameters in their place. Additionally or alternatively, such overrides may require outside approval, and thus upon receiving a manual override command, the weapon system may be configured to transmit a secure request for override approval to a remote command center.
- the target points for selected targets are computed based on a desired point of impact location on the target (e.g., an engine of a boat or vehicle, the center of mass of an individual, etc.).
- the targeting/firing system 210 may be configured with warning shot capabilities in which the desired point of impact location is not on the target. For instance, the rules of engagement enforced by the targeting/firing system 210 for a particular operation may dictate that only warning shots are to be fired at a particular selected target.
- such rules may dictate that at least one initial warning shot is to be fired at a selected target before an attempt is made to hit the target.
- the operator controls 245 also may include a warning shot mode that can be activated by the operator, independent of the rules of engagement of the operation, to allow the operator to independently fire one or more warning shots on any selected target.
- the firing solution may be adjusted to assure that the projectiles fired by the weapon 225 will miss the target.
- the targeting/firing system 210 may determine the preferred location of a desired warning shot based on the type and size of the target (e.g., the number and position of warning shots for human targets may be different than for vehicle targets), the orientation and/or the direction of movement of the target (e.g., it may be desirable to fire a warning shot directly in front of the target), and so on.
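The warning-shot placement described here could be sketched as offsetting the aim point along the target's heading by a type-dependent standoff distance. The standoff distances, heading convention (0° pointing along +y), and function name are illustrative assumptions, not values from the patent:

```python
import math

def warning_shot_point(target_point, target_heading_deg, target_type,
                       lead_m=None):
    """Place a warning-shot aim point deliberately off the target:
    directly ahead of it along its heading, at a standoff distance that
    depends on the target type (all distances hypothetical)."""
    standoff = {"person": 3.0, "vehicle": 10.0, "boat": 15.0}
    d = lead_m if lead_m is not None else standoff.get(target_type, 10.0)
    heading = math.radians(target_heading_deg)   # 0 deg points along +y
    x, y = target_point
    return (x + d * math.sin(heading), y + d * math.cos(heading))
```

Passing an explicit `lead_m` stands in for rules of engagement that mandate a specific standoff regardless of target type.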
- Implementation of the techniques, blocks, steps and means described above may be done in various ways. For example, these techniques, blocks, steps and means may be implemented in hardware, software, or a combination thereof.
- the processing units may be implemented within one or more application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), processors, controllers, micro-controllers, microprocessors, other electronic units designed to perform the functions described above, and/or a combination thereof.
- embodiments may be implemented by hardware, software, scripting languages, firmware, middleware, microcode, hardware description languages, and/or any combination thereof.
- the program code or code segments to perform the necessary tasks may be stored in a machine readable medium such as a storage medium.
- a code segment or machine- executable instruction may represent a procedure, a function, a subprogram, a program, a routine, a subroutine, a module, a software package, a script, a class, or any combination of instructions, data structures, and/or program statements.
- a code segment may be coupled to another code segment or a hardware circuit by passing and/or receiving information, data, arguments, parameters, and/or memory contents. Information, arguments, parameters, data, etc. may be passed, forwarded, or transmitted via any suitable means including memory sharing, message passing, token passing, network transmission, etc.
- any machine-readable medium tangibly embodying instructions may be used in implementing the methodologies described herein.
- software codes may be stored in a memory.
- Memory may be implemented within the processor or external to the processor.
- the term "memory" refers to any type of long term, short term, volatile, nonvolatile, or other storage medium and is not to be limited to any particular type of memory or number of memories, or type of media upon which memory is stored.
- the term "storage medium" may represent one or more memories for storing data, including read only memory (ROM), random access memory (RAM), magnetic RAM, core memory, magnetic disk storage mediums, optical storage mediums, flash memory devices and/or other machine readable mediums for storing information.
- machine-readable medium includes, but is not limited to, portable or fixed storage devices, optical storage devices, and/or various other storage mediums capable of storing, containing, or carrying instruction(s) and/or data.
- FIG. 9 provides a schematic illustration of one embodiment of a computer system 900 that can perform various steps of the methods provided by various embodiments. It should be noted that FIG. 9 is meant only to provide a generalized illustration of various components, any or all of which may be utilized as appropriate. FIG. 9, therefore, broadly illustrates how individual system elements may be implemented in a relatively separated or relatively more integrated manner.
- the computer system 900 is shown comprising hardware elements that can be electrically coupled via a bus 905 (or may otherwise be in communication, as appropriate).
- the hardware elements may include one or more processors 910, including without limitation one or more general-purpose processors and/or one or more special-purpose processors (such as digital signal processing chips, graphics acceleration processors, video decoders, and/or the like); one or more input devices 915, which can include without limitation a mouse, a keyboard, remote control, and/or the like; and one or more output devices 920, which can include without limitation a display device, a printer, and/or the like.
- the computer system 900 may further include (and/or be in communication with) one or more non-transitory storage devices 925, which can comprise, without limitation, local and/or network accessible storage, and/or can include, without limitation, a disk drive, a drive array, an optical storage device, a solid-state storage device, such as a random access memory (“RAM”), and/or a read-only memory (“ROM”), which can be programmable, flash-updateable and/or the like.
- Such storage devices may be configured to implement any appropriate data stores, including without limitation, various file systems, database structures, and/or the like.
- the computer system 900 might also include a communications subsystem 930, which can include without limitation a modem, a network card (wireless or wired), an infrared communication device, a wireless communication device, and/or a chipset (such as a Bluetooth™ device, an 802.11 device, a WiFi device, a WiMax device, a cellular communication device, etc.), and/or the like.
- the communications subsystem 930 may permit data to be exchanged with a network (such as the network described below, to name one example), other computer systems, and/or any other devices described herein.
- the computer system 900 will further comprise a working memory 935, which can include a RAM or ROM device, as described above.
- the computer system 900 also can comprise software elements, shown as being currently located within the working memory 935, including an operating system 940, device drivers, executable libraries, and/or other code, such as one or more application programs 945, which may comprise computer programs provided by various embodiments, and/or may be designed to implement methods, and/or configure systems, provided by other embodiments, as described herein.
- code and/or instructions can be used to configure and/or adapt a general purpose computer (or other device) to perform one or more operations in accordance with the described methods.
- a set of these instructions and/or code might be stored on a non-transitory computer- readable storage medium, such as the non-transitory storage device(s) 925 described above.
- the storage medium might be incorporated within a computer system, such as computer system 900.
- the storage medium might be separate from a computer system (e.g., a removable medium, such as a compact disc), and/or provided in an installation package, such that the storage medium can be used to program, configure, and/or adapt a general purpose computer with the instructions/code stored thereon.
- These instructions might take the form of executable code, which is executable by the computer system 900, and/or might take the form of source and/or installable code, which, upon compilation and/or installation on the computer system 900 (e.g., using any of a variety of generally available compilers, installation programs, compression/decompression utilities, etc.), then takes the form of executable code.
- some embodiments may employ a computer system (such as the computer system 900) to perform methods in accordance with various embodiments of the invention.
- some or all of the procedures of such methods are performed by the computer system 900 in response to processor 910 executing one or more sequences of one or more instructions (which might be incorporated into the operating system 940 and/or other code, such as an application program 945) contained in the working memory 935.
- Such instructions may be read into the working memory 935 from another computer-readable medium, such as one or more of the non-transitory storage device(s) 925.
- execution of the sequences of instructions contained in the working memory 935 might cause the processor(s) 910 to perform one or more procedures of the methods described herein.
- machine-readable medium refers to any medium that participates in providing data that causes a machine to operate in a specific fashion. These mediums may be non- transitory.
- various computer- readable media might be involved in providing instructions/code to processor(s) 910 for execution and/or might be used to store and/or carry such instructions/code.
- a computer-readable medium is a physical and/or tangible storage medium. Such a medium may take the form of a non-volatile media or volatile media.
- Non-volatile media include, for example, optical and/or magnetic disks, such as the non-transitory storage device(s) 925.
- Volatile media include, without limitation, dynamic memory, such as the working memory 935.
- Common forms of physical and/or tangible computer-readable media include, for example, a floppy disk, a flexible disk, hard disk, magnetic tape, or any other magnetic medium, a CD-ROM, any other optical medium, any other physical medium with patterns of marks, a RAM, a PROM, EPROM, a FLASH-EPROM, any other memory chip or cartridge, or any other medium from which a computer can read instructions and/or code.
- Various forms of computer-readable media may be involved in carrying one or more sequences of one or more instructions to the processor(s) 910 for execution.
- the instructions may initially be carried on a magnetic disk and/or optical disc of a remote computer.
- a remote computer might load the instructions into its dynamic memory and send the instructions as signals over a transmission medium to be received and/or executed by the computer system 900.
- the communications subsystem 930 (and/or components thereof) generally will receive signals, and the bus 905 then might carry the signals (and/or the data, instructions, etc. carried by the signals) to the working memory 935, from which the processor(s) 910 retrieves and executes the instructions.
- the instructions received by the working memory 935 may optionally be stored on a non-transitory storage device 925 either before or after execution by the processor(s) 910.
- computer system 900 can be distributed across a network. For example, some processing may be performed in one location using a first processor while other processing may be performed by another processor remote from the first processor. Other components of computer system 900 may be similarly distributed. As such, computer system 900 may be interpreted as a distributed computing system that performs processing in multiple locations. In some instances, computer system 900 may be interpreted as a single computing device, such as a distinct laptop, desktop computer, or the like, depending on the context.
- the methods may be performed in an order different from that described, and/or various stages may be added, omitted, and/or combined.
- features described with respect to certain configurations may be combined in various other configurations. Different aspects and elements of the configurations may be combined in a similar manner.
- technology evolves and, thus, many of the elements are examples and do not limit the scope of the disclosure or claims.
- specific details are given in the description to provide a thorough understanding of example configurations (including implementations). However, configurations may be practiced without these specific details. For example, well-known circuits, processes, algorithms, structures, and techniques have been shown without unnecessary detail in order to avoid obscuring the configurations.
- configurations may be described as a process which is depicted as a flow diagram or block diagram. Although each may describe the operations as a sequential process, many of the operations can be performed in parallel or concurrently. In addition, the order of the operations may be rearranged. A process may have additional steps not included in the figure.
- examples of the methods may be implemented by hardware, software, firmware, middleware, microcode, hardware description languages, or any combination thereof.
- the program code or code segments to perform the necessary tasks may be stored in a non-transitory computer-readable medium such as a storage medium.
- Processors may perform the described tasks.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP18919047.3A EP3704437A4 (en) | 2017-11-03 | 2018-11-05 | Semi-autonomous motorized weapon systems |
AU2018423158A AU2018423158A1 (en) | 2017-11-03 | 2018-11-05 | Semi-autonomous motorized weapon systems |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201762581280P | 2017-11-03 | 2017-11-03 | |
US62/581,280 | 2017-11-03 |
Publications (3)
Publication Number | Publication Date |
---|---|
WO2019221782A2 true WO2019221782A2 (en) | 2019-11-21 |
WO2019221782A3 WO2019221782A3 (en) | 2019-12-26 |
WO2019221782A9 WO2019221782A9 (en) | 2020-02-20 |
Family
ID=66328447
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/US2018/059261 WO2019221782A2 (en) | 2017-11-03 | 2018-11-05 | Semi-autonomous motorized weapon systems |
Country Status (4)
Country | Link |
---|---|
US (1) | US20190137219A1 (en) |
EP (1) | EP3704437A4 (en) |
AU (1) | AU2018423158A1 (en) |
WO (1) | WO2019221782A2 (en) |
Families Citing this family (29)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2019221782A2 (en) * | 2017-11-03 | 2019-11-21 | Aimlock Inc. | Semi-autonomous motorized weapon systems |
KR102290860B1 (en) * | 2017-11-10 | 2021-08-17 | 한화디펜스 주식회사 | Remote weapon control device and method for targeting multiple objects |
IL263603B2 (en) * | 2018-12-09 | 2023-02-01 | Israel Weapon Ind I W I Ltd | A weapon controlled by a user |
US20220349677A1 (en) * | 2019-03-12 | 2022-11-03 | P2K Technologies LLC | Device for locating, sharing, and engaging targets with firearms |
US11307575B2 (en) * | 2019-04-16 | 2022-04-19 | The Boeing Company | Autonomous ground attack system |
JP2021056841A (en) * | 2019-09-30 | 2021-04-08 | スズキ株式会社 | Teacher data forming device and image classification device |
WO2021080684A1 (en) | 2019-10-25 | 2021-04-29 | Aimlock Inc. | Remotely operable weapon mount |
WO2021080683A1 (en) | 2019-10-25 | 2021-04-29 | Aimlock Inc. | Trigger and safety actuating device and method therefor |
IL270559A (en) * | 2019-11-11 | 2021-05-31 | Israel Weapon Ind I W I Ltd | Firearm with automatic target acquiring and shooting |
GB2590067B8 (en) * | 2019-11-14 | 2023-10-11 | Bae Systems Plc | A weapon system |
EP4100690A4 (en) * | 2020-02-03 | 2024-05-29 | Bae Systems Hägglunds Aktiebolag | Embedded target tracking training |
CN111609753B (en) * | 2020-06-01 | 2022-07-08 | 中光智控(北京)科技有限公司 | Trigger control method and system |
US11231252B2 (en) * | 2020-06-10 | 2022-01-25 | Brett C. Bilbrey | Method for automated weapon system with target selection of selected types of best shots |
US20210389071A1 (en) * | 2020-06-10 | 2021-12-16 | David H. Sitrick | Automatic Weapon Subsystem Selecting Target, ID Target, Fire |
US11994366B2 (en) * | 2020-06-10 | 2024-05-28 | David H. Sitrick | Automatic weapon subsystem movably mounted barrel to strike target at firing time |
US20210389088A1 (en) * | 2020-06-10 | 2021-12-16 | Jacob W. Bilbrey | Autonomous + Automated Weapon System for Drones with Additional Linked Weapons |
US12066263B2 (en) * | 2020-06-10 | 2024-08-20 | Brett C. Bilbrey | Human transported automatic weapon subsystem with human-non-human target recognition |
US12117258B1 (en) * | 2020-07-15 | 2024-10-15 | Flex Force Enterprises, Llc | Devices, systems, and methods for transitioning between local or remote operating modes and a safety mode |
US11525649B1 (en) * | 2020-07-15 | 2022-12-13 | Flex Force Enterprises Inc. | Weapon platform operable in remote control and crew-served operating modes |
US11606194B2 (en) * | 2020-07-31 | 2023-03-14 | United States Government As Represented By The Secretary Of The Army | Secure cryptographic system for datalinks |
DE102020127430A1 (en) * | 2020-10-19 | 2022-04-21 | Krauss-Maffei Wegmann Gmbh & Co. Kg | Determination of a fire control solution of an artillery weapon |
US11555670B2 (en) * | 2020-12-31 | 2023-01-17 | Smart Shooter Ltd. | Foldable man-portable remote-controlled light-weapon station |
US20230010309A1 (en) * | 2021-07-12 | 2023-01-12 | Caleb Crye | Mobile munition assembly and apparatus, systems, and methods of executing a mission for the mobile munition assembly |
US11933559B2 (en) * | 2021-08-09 | 2024-03-19 | Allan Mann | Firearm safety control system |
US20230056472A1 (en) * | 2021-08-19 | 2023-02-23 | Raytheon Company | Firing cutout rapid generation aided by machine learning |
US20230243622A1 (en) * | 2022-01-31 | 2023-08-03 | Robo Duels Inc. | Method of operation of a mounted weapon and system for weapon stabilization and target tracking |
SE2200135A1 (en) * | 2022-11-23 | 2024-05-24 | Bae Systems Bofors Ab | ADAPTIVE SHOT PATTERNS |
FR3147360A1 (en) * | 2023-03-30 | 2024-10-04 | Seaowl Technology Solutions | Teleoperation system |
US20250030576A1 (en) * | 2023-07-17 | 2025-01-23 | Tomahawk Robotics, Inc. | Layered fail-safe redundancy architecture and process for use by single data bus mobile device |
Family Cites Families (23)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE1728222B2 (en) * | 1968-09-12 | 1973-03-01 | | Device for fire area limitation |
FR2737001B1 (en) * | 1995-07-20 | 1997-08-29 | Giat Ind Sa | STABILIZATION DEVICE FOR INDIVIDUAL FIREARMS |
US5974940A (en) * | 1997-08-20 | 1999-11-02 | Bei Sensors & Systems Company, Inc. | Rifle stabilization system for erratic hand and mobile platform motion |
US6269730B1 (en) * | 1999-10-22 | 2001-08-07 | Precision Remotes, Inc. | Rapid aiming telepresent system |
AUPR080400A0 (en) * | 2000-10-17 | 2001-01-11 | Electro Optic Systems Pty Limited | Autonomous weapon system |
KR100819801B1 (en) * | 2006-03-03 | 2008-04-07 | 삼성테크윈 주식회사 | Automatic trigger mechanism and boundary robot equipped with the same |
SE533248C2 (en) * | 2008-11-04 | 2010-07-27 | Tommy Andersson | Method of gyro-stabilizing the aiming of rifles and one-handed weapons |
IL206142A0 (en) * | 2010-06-02 | 2011-02-28 | Rafael Advanced Defense Sys | Firing mechanism security apparatus for remotely controlled automatic machine gun |
US8453368B2 (en) * | 2010-08-20 | 2013-06-04 | Rocky Mountain Scientific Laboratory, Llc | Active stabilization targeting correction for handheld firearms |
IL211966A (en) * | 2011-03-28 | 2016-12-29 | Smart Shooter Ltd | Firearm, aiming system therefor, method of operating the firearm and method of reducing the probability of missing a target |
US8833231B1 (en) * | 2012-01-22 | 2014-09-16 | Raytheon Company | Unmanned range-programmable airburst weapon system for automated tracking and prosecution of close-in targets |
US10782097B2 (en) * | 2012-04-11 | 2020-09-22 | Christopher J. Hall | Automated fire control device |
US9127907B2 (en) * | 2013-06-07 | 2015-09-08 | Trackingpoint, Inc. | Precision guided firearm including an optical scope configured to determine timing of discharge |
US20150211828A1 (en) * | 2014-01-28 | 2015-07-30 | Trackingpoint, Inc. | Automatic Target Acquisition for a Firearm |
SA115360300B1 (en) * | 2014-02-14 | 2017-08-29 | ميريل افياشين، انك. | Modular weapon station system |
KR101932544B1 (en) * | 2014-04-16 | 2018-12-27 | 한화지상방산 주식회사 | Remote-weapon apparatus and control method thereof |
US9612088B2 (en) * | 2014-05-06 | 2017-04-04 | Raytheon Company | Shooting system with aim assist |
IL232828A (en) * | 2014-05-27 | 2015-06-30 | Israel Weapon Ind I W I Ltd | Apparatus and method for improving hit probability of a firearm |
FR3026174B1 (en) * | 2014-09-24 | 2018-03-02 | Philippe Levilly | TELEOPERATED SYSTEM FOR SELECTIVE TARGET PROCESSING |
US9784529B1 (en) * | 2015-04-07 | 2017-10-10 | Matthew G. Angle | Small arms stabilization system |
WO2019221782A2 (en) * | 2017-11-03 | 2019-11-21 | Aimlock Inc. | Semi-autonomous motorized weapon systems |
KR102290860B1 (en) * | 2017-11-10 | 2021-08-17 | 한화디펜스 주식회사 | Remote weapon control device and method for targeting multiple objects |
US20190310042A1 (en) * | 2018-04-04 | 2019-10-10 | Sed C. HIMMICH | Weapon lock control system |
2018
- 2018-11-05 WO PCT/US2018/059261 patent/WO2019221782A2/en unknown
- 2018-11-05 EP EP18919047.3A patent/EP3704437A4/en not_active Withdrawn
- 2018-11-05 AU AU2018423158A patent/AU2018423158A1/en not_active Abandoned
- 2018-11-05 US US16/181,153 patent/US20190137219A1/en not_active Abandoned
Also Published As
Publication number | Publication date |
---|---|
WO2019221782A9 (en) | 2020-02-20 |
EP3704437A2 (en) | 2020-09-09 |
WO2019221782A3 (en) | 2019-12-26 |
AU2018423158A1 (en) | 2020-05-21 |
EP3704437A4 (en) | 2021-07-28 |
US20190137219A1 (en) | 2019-05-09 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20190137219A1 (en) | | Semi-autonomous motorized weapon systems |
US11867479B2 (en) | | Interactive weapon targeting system displaying remote sensed image of target area |
US8833231B1 (en) | | Unmanned range-programmable airburst weapon system for automated tracking and prosecution of close-in targets |
CA2457669C (en) | | Autonomous weapon system |
US9074847B1 (en) | | Stabilized weapon platform with active sense and adaptive motion control |
AU2002210260A1 (en) | | Autonomous weapon system |
US11486677B2 (en) | | Grenade launcher aiming control system |
JP2020502465A (en) | | Guided ammunition system for detecting off-axis targets |
EP3546879A1 (en) | | Imaging seeker for a spin-stabilized projectile |
KR102489644B1 (en) | | Apparatus and method for calculating real-time fire control command for 30 mm Gatling gun |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 18919047; Country of ref document: EP; Kind code of ref document: A2 |
| NENP | Non-entry into the national phase | Ref country code: DE |
| ENP | Entry into the national phase | Ref document number: 2018423158; Country of ref document: AU; Date of ref document: 20181105; Kind code of ref document: A |
| ENP | Entry into the national phase | Ref document number: 2018919047; Country of ref document: EP; Effective date: 20200603 |