
WO2020141504 - SYSTEM, METHOD AND COMPUTER PROGRAM PRODUCT FOR SPEEDING DETECTION


FIELD OF THIS DISCLOSURE

The present invention relates generally to traffic accident prevention and vehicle safety, and more particularly to detecting speeding vehicles.

BACKGROUND FOR THIS DISCLOSURE

The World Health Organization has warned that road traffic crashes are predicted to become the seventh leading cause of death by 2030, and are the leading cause of death among young people aged between 15 and 29 years. 90% of the world's road fatalities occur in low- and middle-income countries. In addition to fatalities, between 20 and 50 million people worldwide are estimated to suffer non-fatal injuries from road traffic crashes, many of whom incur a lasting disability. Road traffic injuries cause economic losses; road traffic crashes are said to cost some countries 3% of their gross domestic product, including the cost of treatment (and repairs) and lost productivity for those killed, injured or disabled, and for their family members.

Speeding increases both the likelihood of a crash occurring and the severity of its consequences. Even an increase of just 1 km/h in mean vehicle speed is believed to translate into a 3% increase in the incidence of crashes resulting in injury, and a 4-5% increase in fatalities. Speed traps are commonly deployed along roads to deter speeding. However, drivers commonly know the exact locations of speeding-detection cameras, via Waze or simply due to their personal familiarity with an oft-repeated route. Therefore, many drivers allow themselves to exceed the speed limit and drive dangerously for much of their trip, slowing down to the legal speed limit just before they reach a known camera location and reverting to their former illegal speed after safely passing such cameras.

US Patent US6400304 describes a combination of a global positioning satellite system (GPS) and a radar detection unit, in wireless communication with the GPS, for tracking and determining the speed of a vehicle.

US Patent US5381155 describes a system operative for measuring the speeds of vehicles and determining whether any of the vehicles are speeding because their speed exceeds a predetermined speed limit.

Published US Patent Application US20160232785 describes a system for mapping and storing traffic violation citations and alerting a user of traffic violations. According to that application, sign and road marking information is necessary to indicate the existence of bus lanes, U-turn signs, and school zone speed signs.

Algorithms for radar-based (e.g. GMTI) detection and tracking of moving vehicles are known, such as: https://www.baesystems.com/en/download-en/... Z1434575834939.pdf.

Other known publications include: "Comparison of Nonlinear Filtering Algorithms ...", available at https://pdfs.semanticscholar.org/c52c/147c38689781e0398c3b8e60e2fa0684249c.pdf; M. Mallick, "What is the recent trend in SAR GMTI processing?", ResearchGate, 14 May 2016, available at https://www.researchgate.net/post/what_is_the_recent_trends_in_SAR_GMTI_processing; Wendy Garber et al., "Performance evaluation of SAR/GMTI algorithms", Proceedings Volume 9843, Algorithms for Synthetic Aperture Radar Imagery XXIII, 984305 (2016), https://doi.org/10.1117/12.2230385; and many more.

The disclosures of all publications and patent documents mentioned in the specification, and of the publications and patent documents cited therein directly or indirectly, are hereby incorporated by reference. Materiality of such publications and patent documents to patentability is not conceded.

SUMMARY OF CERTAIN EMBODIMENTS

Certain embodiments seek to provide airborne radar for vehicle speed measuring, tracking, and shooting (imaging).

Certain embodiments seek to provide a speeding vehicle incrimination system.

Certain embodiments of the present invention seek to provide circuitry typically comprising at least one processor in communication with at least one memory, with instructions stored in such memory executed by the processor to provide functionalities which are described herein in detail. Any functionality described herein may be firmware-implemented or processor-implemented, as appropriate.

The present invention typically includes at least the following embodiments:

Embodiment 1. A speeding detection system comprising: at least one radar unit having a field of view of at least a portion of a network of roads, and configured to track at least some vehicles travelling along at least the portion of the network of roads and to determine which tracked vehicles are committing traffic offenses and to continue tracking at least some vehicles committing traffic offenses at least until each of the at least some vehicles committing traffic offenses has entered a field of view of a camera; and at least one camera synchronized to the radar unit and deployed at a location along the network of roads and operative to generate at least one image of at least one vehicle from among the vehicles detected by the radar unit committing a traffic offense,

thereby to facilitate: image processing of the at least one image including

identification of a license plate therewithin and reading a license plate number from the license plate identified within the image, and

issuance of a traffic violation ticket to a vehicle owner whose contact particulars are associated in memory with the license plate number.

Embodiment 2. A system according to any of the preceding embodiments wherein the radar unit is configured to compute velocities of tracked vehicles and wherein the traffic offense comprises speeding.

Embodiment 3. A system according to any of the preceding embodiments wherein the traffic offense comprises failing to maintain distance from a vehicle ahead.

Embodiment 4. A system according to any of the preceding embodiments wherein at least some radar units are airborne thereby to facilitate dynamic deployment thereof responsive to changing traffic conditions.

Embodiment 5. A system according to any of the preceding embodiments and also comprising logic configured for image processing of the at least one image including identification of a license plate therewithin.

Embodiment 6. A system according to any of the preceding embodiments and also comprising logic configured for reading a license plate number from the license plate identified within the image.

Embodiment 7. A system according to any of the preceding embodiments and also comprising logic configured to issue a ticket to a vehicle owner whose contact particulars are associated in memory with the license plate number.

Embodiment 8. A system according to any of the preceding embodiments and also comprising logic configured to output an indication of vehicle owner particulars associated in memory with the license plate number.

Embodiment 9. A system according to any of the preceding embodiments wherein the radar unit comprises a Ground Moving Target Indicator [GMTI] radar configured to detect moving ground vehicles.

Embodiment 10. A system according to any of the preceding embodiments wherein at least some cameras are airborne thereby to facilitate dynamic deployment thereof.

Embodiment 11. A speeding detection method comprising:

using at least one radar unit having a field of view of at least a portion of a network of roads, to track at least some vehicles travelling along at least the portion of the network of roads and to determine which tracked vehicles are committing traffic offenses and to continue tracking at least some vehicles committing traffic offenses at least until each of the at least some vehicles committing traffic offenses has entered a field of view of a camera; and

using at least one camera synchronized to the radar unit and deployed at a location along the network of roads to generate at least one image of at least one vehicle from among the vehicles detected by the radar unit committing a traffic offense,

thereby to facilitate:

image processing of the at least one image including

identification of a license plate therewithin and

reading a license plate number from the license plate identified within the image, and

issuance of a traffic violation ticket to a vehicle owner whose contact particulars are associated in memory with the license plate number.

Embodiment 12. A method according to any of the preceding embodiments and also comprising image processing of the at least one image including identification of a license plate therewithin.

Embodiment 13. A method according to any of the preceding embodiments and wherein the image processing also comprises reading a license plate number from the license plate identified within the image.

Embodiment 14. A method according to any of the preceding embodiments and also comprising issuance of a traffic violation ticket to a vehicle owner whose contact particulars are associated in memory with the license plate number.

Embodiment 15. A computer program product, comprising a non-transitory tangible computer readable medium having computer readable program code embodied therein, the computer readable program code adapted to be executed to implement a speeding detection method, the method comprising:

using at least one radar unit having a field of view of at least a portion of a network of roads, to track at least some vehicles travelling along at least the portion of the network of roads and to determine which tracked vehicles are committing traffic offenses and to continue tracking at least some vehicles committing traffic offenses at least until each of the at least some vehicles committing traffic offenses has entered a field of view of a camera; and

using at least one camera synchronized to the radar unit and deployed at a location along the network of roads to generate at least one image of at least one vehicle from among the vehicles detected by the radar unit committing a traffic offense,

thereby to facilitate:

image processing of the at least one image including

identification of a license plate therewithin and

reading a license plate number from the license plate identified within the image,

and

issuance of a traffic violation ticket to a vehicle owner whose contact particulars are associated in memory with the license plate number.

Also provided, excluding signals, is a computer program comprising computer program code means for performing any of the methods shown and described herein when the program is run on at least one computer; and a computer program product, comprising a typically non-transitory computer-usable or -readable medium, e.g. a non-transitory computer-usable or -readable storage medium, typically tangible, having a computer readable program code embodied therein, the computer readable program code adapted to be executed to implement any or all of the methods shown and described herein. The operations in accordance with the teachings herein may be performed by at least one computer specially constructed for the desired purposes, or by a general purpose computer specially configured for the desired purpose by at least one computer program stored in a typically non-transitory computer readable storage medium. The term "non-transitory" is used herein to exclude transitory, propagating signals or waves, but to otherwise include any volatile or non-volatile computer memory technology suitable to the application.

Any suitable processor/s, display and input means may be used to process, display (e.g. on a computer screen or other computer output device), store, and accept information, such as information used by or generated by any of the methods and apparatus shown and described herein; the above processor/s, display and input means including computer programs, in accordance with some or all of the embodiments of the present invention. Any or all functionalities of the invention shown and described herein, such as but not limited to operations within flowcharts, may be performed by any one or more of: at least one conventional personal computer processor, workstation or other programmable device or computer or electronic computing device or processor, either general-purpose or specifically constructed, used for processing; a computer display screen and/or printer and/or speaker for displaying; machine-readable memory such as optical disks, CDROMs, DVDs, BluRays, magnetic-optical discs or other discs, RAMs, ROMs, EPROMs, EEPROMs, magnetic or optical or other cards, for storing; and keyboard or mouse for accepting. Modules shown and described herein may include any one or combination or plurality of: a server, a data processor, a memory/computer storage, a communication interface, and a computer program stored in memory/computer storage.

The term "process" as used above is intended to include any type of computation or manipulation or transformation of data represented as physical, e.g. electronic, phenomena which may occur or reside e.g. within registers and/or memories of at least one computer or processor. Use of nouns in singular form is not intended to be limiting; thus the term processor is intended to include a plurality of processing units which may be distributed or remote, the term server is intended to include plural typically interconnected modules running on plural respective servers, and so forth.

The above devices may communicate via any conventional wired or wireless digital communication means, e.g. via a wired or cellular telephone network or a computer network such as the Internet.

The apparatus of the present invention may include, according to certain embodiments of the invention, machine readable memory containing or otherwise storing a program of instructions which, when executed by the machine, implements some or all of the apparatus, methods, features and functionalities of the invention shown and described herein. Alternatively or in addition, the apparatus of the present invention may include, according to certain embodiments of the invention, a program as above which may be written in any conventional programming language, and optionally a machine for executing the program such as but not limited to a general purpose computer which may optionally be configured or activated in accordance with the teachings of the present invention. Any of the teachings incorporated herein may, wherever suitable, operate on signals representative of physical objects or substances.

The embodiments referred to above, and other embodiments, are described in detail in the next section.

Any trademark occurring in the text or drawings is the property of its owner and occurs herein merely to explain or illustrate one example of how an embodiment of the invention may be implemented.

Unless stated otherwise, terms such as "processing", "computing", "estimating", "selecting", "ranking", "grading", "calculating", "determining", "generating", "reassessing", "classifying", "producing", "stereo-matching", "registering", "detecting", "associating", "superimposing", "obtaining", "providing", "accessing", "setting" or the like, refer to the action and/or processes of at least one computer/s or computing system/s, or processor/s or similar electronic computing device/s or circuitry, that manipulate and/or transform data which may be represented as physical, such as electronic, quantities e.g. within the computing system's registers and/or memories, and/or may be provided on-the-fly, into other data which may be similarly represented as physical quantities within the computing system's memories, registers or other such information storage, transmission or display devices, or may be provided to external factors e.g. via a suitable data network. The term "computer" should be broadly construed to cover any kind of electronic device with data processing capabilities, including, by way of non-limiting example, personal computers, servers, embedded cores, computing systems, communication devices, and processors (e.g. digital signal processors (DSP), microcontrollers, field programmable gate arrays (FPGA), application specific integrated circuits (ASIC), etc.) and other electronic computing devices. Any reference to a computer, controller or processor is intended to include one or more hardware devices, e.g. chips, which may be co-located or remote from one another. Any controller or processor may for example comprise at least one CPU, DSP, FPGA or ASIC, suitably configured in accordance with the logic and functionalities described herein.

The present invention may be described, merely for clarity, in terms of terminology specific to, or references to, particular programming languages, operating systems, browsers, system versions, individual products, protocols and the like. It will be appreciated that this terminology or such reference/s is intended to convey general principles of operation clearly and briefly, by way of example, and is not intended to limit the scope of the invention solely to a particular programming language, operating system, browser, system version, or individual product or protocol. Nonetheless, the disclosure of the standard or other professional literature defining the programming language, operating system, browser, system version, or individual product or protocol in question, is incorporated by reference herein in its entirety.

Elements separately listed herein need not be distinct components and alternatively may be the same structure. A statement that an element or feature may exist is intended to include (a) embodiments in which the element or feature exists; (b) embodiments in which the element or feature does not exist; and (c) embodiments in which the element or feature exists selectably, e.g. a user may configure or select whether the element or feature does or does not exist.

Any suitable input device, such as but not limited to a sensor, may be used to generate or otherwise provide information received by the apparatus and methods shown and described herein. Any suitable output device or display may be used to display or output information generated by the apparatus and methods shown and described herein. Any suitable processor/s may be employed to compute or generate information as described herein and/or to perform functionalities described herein and/or to implement any engine, interface or other system described herein. Any suitable computerized data storage e.g. computer memory may be used to store information received by or generated by the systems shown and described herein. Functionalities shown and described herein may be divided between a server computer and a plurality of client computers. These or any other computerized components shown and described herein may communicate between themselves via a suitable computer network.

BRIEF DESCRIPTION OF THE DRAWINGS

Fig. 1 is a simplified flowchart illustration of a speeding detection method constructed and operative in accordance with certain embodiments; any computational operations may be performed by processor/s, e.g. in accordance with certain embodiments described herein.

Fig. 2 is a simplified block diagram of a speeding detection system constructed and operative in accordance with certain embodiments described herein, which may operate in accordance with the method of Fig. 1.

Methods and systems included in the scope of the present invention may include some (e.g. any suitable subset) or all of the functional blocks shown in the specifically illustrated implementations by way of example, in any suitable order e.g. as shown.

Computational, functional or logical components described and illustrated herein can be implemented in various forms, for example, as hardware circuits, such as, but not limited to, custom VLSI circuits or gate arrays or programmable hardware devices such as, but not limited to, FPGAs, or as software program code stored on at least one tangible or intangible computer readable medium and executable by at least one processor, or any suitable combination thereof. A specific functional component may be formed by one particular sequence of software code, or by a plurality of such, which collectively behave or act as described herein with reference to the functional component in question. For example, the component may be distributed over several code sequences such as, but not limited to, objects, procedures, functions, routines and programs, and may originate from several computer files which typically operate synergistically.

Each functionality or method herein may be implemented in software (e.g. for execution on suitable processing hardware such as a microprocessor or digital signal processor), firmware, hardware (using any conventional hardware technology such as Integrated Circuit Technology) or any combination thereof.

Functionality or operations stipulated as being software-implemented may alternatively be wholly or partly implemented by an equivalent hardware or firmware module, and vice-versa. Firmware implementing functionality described herein, if provided, may be held in any suitable memory device, and a suitable processing unit (aka processor) may be configured for executing firmware code. Alternatively, certain embodiments described herein may be implemented partly or exclusively in hardware, in which case some or all of the variables, parameters, and computations described herein may be in hardware.

Any module or functionality described herein may comprise a suitably configured hardware component or circuitry. Alternatively or in addition, modules or functionality described herein may be performed by a general purpose computer or more generally by a suitable microprocessor, configured in accordance with methods shown and described herein, or any suitable subset, in any suitable order, of the operations included in such methods, or in accordance with methods known in the art.

Any logical functionality described herein may be implemented as a real time application, if and as appropriate, and which may employ any suitable architectural option such as but not limited to FPGA, ASIC or DSP or any suitable combination thereof.

Any hardware component mentioned herein may in fact include either one or more hardware devices e.g. chips, which may be co-located or remote from one another.

Any method described herein is intended to include within the scope of the embodiments of the present invention also any software or computer program performing some or all of the method’s operations, including a mobile application, platform or operating system e.g. as stored in a medium, as well as combining the computer program with a hardware device to perform some or all of the operations of the method.

Data can be stored on one or more tangible or intangible computer readable media stored at one or more different locations, different network nodes or different storage devices at a single node or location.

It is appreciated that any computer data storage technology, including any type of storage or memory and any type of computer components and recording media that retain digital data used for computing for an interval of time, and any type of information retention technology, may be used to store the various data provided and employed herein. Suitable computer data storage or information retention apparatus may include apparatus which is primary, secondary, tertiary or off-line, which is of any type or level or amount or category of volatility, differentiation, mutability, accessibility, addressability, capacity, performance and energy use, and which is based on any suitable technologies such as semiconductor, magnetic, optical, paper and others.

DETAILED DESCRIPTION OF CERTAIN EMBODIMENTS

The method of Fig. 1 includes set-up operations, e.g. all or any subset of those illustrated, in any suitable order e.g. as shown; and also real-time or near-real-time operations, e.g. all or any subset of those illustrated, in any suitable order e.g. as shown. Set-up operations may include:

Operation 10: Deploy radar units, e.g. units 1050 in Fig. 2, typically airborne, which together establish a field of view of at least a portion of a network of roads.

Operation 20: Deploy at least one camera, e.g. cameras 1-N of Fig. 2, within the radar units' field of view, typically on the ground. The radar subsystem (e.g. 1050) may pre-store each camera's location in memory.

Operation 30: provide radar and cameras with data communication channels between them, and between cameras and police.

Operation 35: configure radar

Real time operations may include:

Operation 40: radar units track all speeding vehicles

Operation 50: each camera images all vehicles or only speeding vehicles upon command

Operation 60: image processing unit finds license plate in camera-generated image of speeding vehicle, and reads license plate number

Operation 70: ticketing functionality sends a speeding ticket to each speeding vehicle's owner.
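The overall flow of operations 40-70 may be illustrated by the following sketch, given here for illustration only; the names radar, cameras, ocr and ticketing, and all of their methods, are hypothetical stand-ins for the subsystems described herein, not actual interfaces of any particular product.

    def process_radar_sweep(radar, cameras, ocr, ticketing, speed_limit_kph):
        # Operation 40: iterate over vehicles currently tracked by the radar.
        for track in radar.tracks():
            if track.speed_kph <= speed_limit_kph:
                continue  # not speeding: ignore
            if not track.continuously_tracked:
                continue  # doubt exists: neither image nor incriminate
            # Operation 50: command the first camera along the route to image
            # the vehicle at its predicted arrival time.
            camera = track.next_camera(cameras)
            image = camera.capture(at_time=track.eta(camera))
            # Operation 60: find and read the license plate within the image.
            plate = ocr.read_license_plate(image)
            if plate is not None:
                # Operation 70: issue a ticket to the registered owner.
                ticketing.issue(plate, track.speed_kph, image)

Example embodiments of the above operations are now described in detail.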

Operation 10: Deploy radar units, e.g. airborne, which together establish a field of view of at least a portion of a network of roads.

According to some embodiments, radars are deployed to cover only "red roads" known to be dangerous or prone to speeding. Alternatively or in addition, drone-mounted or other airborne radar may roam around the country's or district's network of roads as per any suitable schedule, e.g. a random or round-robin schedule.

The system can use an off-the-shelf radar system, e.g. a GMTI (Ground Moving Target Indicator) radar, typically with range resolutions of 2-4 meters or better, and range-rate (or radial velocity) resolutions of 2-5 km/h or better.

A suitable communication protocol may be applied to the off-the-shelf radar to deliver the radar track data for speeding vehicles to the ground control station, and to cue the cameras accordingly, e.g. as described herein.

Each radar unit typically has a range, or field of view, of several dozen km in each azimuthal direction; thus a single radar may be deployed to cover an area whose radius is dozens of kilometers, containing thousands of vehicles (e.g. if the radar is sufficiently powerful, has a sufficiently large antenna, and has other suitable parameters).

The radar is typically stand-alone; all hardware and signal processing is typically carried aloft, on board the airborne platform. The radar is typically operative to do all or any subset of the following: transmit high-frequency beams to a specific area; receive returns from the ground and from moving targets; filter out clutter (e.g. returns from zero-velocity objects); and detect moving targets, and/or measure the distance to the targets and/or their velocity.

Airborne radar may be installed in an aerostat (which has the advantage of a long period, e.g. several weeks, of continuous operation), in which case the radar may be deployed in locations whose field of view covers areas where speeding is prevalent and/or expected, e.g. on known red roads with high traffic accident statistics.

Operation 20: Deploy at least one camera with a suitable field of view, e.g. 100-120 degrees, within the radar units' field of view, typically on the ground.

Typically, the radar subsystem is configured to pre-store, in memory, the exact locations of the cameras, e.g. x, y or x, y, z coordinates input manually via a suitably configured user interface, or using GPS (e.g. GPS module 1030 of Fig. 2) or any other suitable localization technology, automatically or manually operated.
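For concreteness, the pre-stored camera data might resemble the following minimal sketch; all field names are illustrative assumptions, not taken from the specification.

    from dataclasses import dataclass

    @dataclass
    class CameraRecord:
        camera_id: str
        x: float             # e.g. easting in a local map projection, meters
        y: float             # e.g. northing, meters
        heading_deg: float   # direction of the monitored lane, degrees from North
        fov_deg: float       # angular field of view, e.g. 100-120 degrees

    # The radar subsystem could hold such records keyed by camera ID.
    CAMERA_REGISTRY = {
        "CAM-001": CameraRecord("CAM-001", 731204.0, 3546112.0, 90.0, 110.0),
    }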

According to certain embodiments, radar units are deployed as per operation 10, and are then used in conjunction with a legacy deployment of cameras and/or legacy software for license plate reading.

Each camera typically has a controller in data communication therewith. The controller, which may be local or remote, typically gets imaging commands and, responsively, commands the camera to image, and transfers the image to the system. Typically, the image, and more generally the data of the speeding vehicle with a time tag, is sent to a ground station that includes an operator workstation.

The OWS (operator work-station), e.g. module 1060 in Fig. 2, may perform all or any subset of the following tasks:

a. The workstation 1060's system status monitor processor 1070 may monitor system and subsystem operability (e.g. including conventional periodic calibration of the radar system, which may, alternatively or in addition, be performed, controlled and monitored by the radar itself, as is the case in conventional radar systems). Status monitor processor 1070 may alert an operator in the event of a faulty serviceability status. For example, the radar may be tested with calibrated tuning forks periodically, e.g. daily, and poor results may be recorded. More generally, any equipment testing is typically recorded, typically including the date, the test conducted, which equipment was tested, by whom, and the results.

To ascertain that the radar hardware is set up and functioning properly, the radar may be tested by an external certified authority periodically, typically once or twice a year. Data may be stored to indicate the last calibration test, the next (due) test, and who tested. Radar self-test and/or on-site testing (e.g. of the radar vis-a-vis a vehicle of known speed) and/or interference testing (e.g. setting the radar to Receive Only mode and scanning for interference) may be run periodically or on occasion; data may be stored to document this.

b. Present to the operator the evidentiary data (aka incrimination data) the system has collected regarding speeding cars, e.g. as described herein, and/or create a report including this evidentiary data, where the report then serves as a basis for serving speeding tickets, e.g. later on.

c. Store and backup raw data and/or reports.

It is appreciated that any type of data may be logged, typically in association with each speeding ticket issued, or in association with dates and times such that the association with individual tickets is clear (assuming each ticket includes a unique identification of the radar subsystem which generated the ticket, and its date/time of generation), to allow later verification of the fact that the vehicle in question was speeding, and that the vehicle imaged was the vehicle which was speeding, because the speeding vehicle was properly tracked. Logged data may include, for example, all or any subset of the following: confidence interval estimation for any statistical computation made by the radar or its tracker; data collected by the operator workstation; range control settings or radar receiver sensitivity settings; camera location and direction (e.g. in degrees from North); data generated by the radar and its tracker or local tracking system, e.g. radar target reports, range-Doppler maps or plots (e.g. to verify that the imaged vehicle is the one which was speeding and could not have been confused with any other vehicle); tracks, including the tracked vehicle's state estimates (e.g. 2D or 3D position, heading, speed, acceleration, unique track number, track reliability or uncertainty information, etc.); data regarding association of plots to tracks, including data regarding plots unassociated with any tracked target, hence used for acquisition of new tracks; data regarding tracks which may have remained without updates; data regarding the assumed target motion model (e.g. constant velocity, constant acceleration, etc.); data regarding plot-track association methods, e.g. defining an acceptance gate around the current track location and then selecting the plot in the gate which is strongest, or closest to the predicted position, or selecting a most probable location of the plot through a statistical combination of all the likely plots; track smoothing data; track initiation data; track maintenance data; and a recording of each radar subsystem's manufacturer and model, e.g. since the extent to which a radar self-tests and automatically adjusts its circuits varies with model. Radar tracker data that is logged may include all or any subset of: how consecutive radar observations of different targets were associated into respective tracks; how false alarms were rejected; how heading and speed estimates were generated; and documentation of filtering of errors in radar measurements, e.g. by the radar tracker fitting a smooth curve to reported plots.
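By way of illustration only, a single evidentiary log record might be sketched as follows; the field names are assumptions made for the sketch, and the many other loggable items enumerated above (range-Doppler plots, plot-track association data, calibration history, etc.) could be attached via the extra mapping.

    from dataclasses import dataclass, field
    from datetime import datetime

    @dataclass
    class TicketEvidenceRecord:
        ticket_id: str
        radar_subsystem_id: str    # unique ID of the radar which generated the ticket
        generated_at: datetime     # date/time of generation
        track_id: int              # unique track number
        speed_estimate_kph: float
        speed_confidence: tuple    # e.g. (low, high) bounds of a confidence interval
        camera_id: str
        camera_heading_deg: float  # camera direction, degrees from North
        radar_model: str           # manufacturer and model; self-test behavior varies by model
        extra: dict = field(default_factory=dict)  # range-Doppler plots, association data, etc.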

Practically speaking, obstructions may impede the radar's tracking abilities which may affect decisions on where to deploy radar and/or where to deploy cameras.

The radar and cameras' locations are typically selected to catch the maximal number of speeding cars, according to line of sight and to places on roads that the operator identifies as problematic. The radar may be deployed in one place which affords large coverage, or in plural locations, each of which, or at least some of which, cover many kilometers suited to operation of the system (e.g. not normally or not always congested, and not obscured by objects), such that one radar can cover tens of kilometers with hundreds of vehicles.

When deploying along city roads, deployment typically takes into account that built-up areas may be a more challenging environment, since efficiency of radar operation may depend on location, height of the radar, and buildings’ size and height.

Typically, location and/or height of the radar is optimized to focus on pre-selected dangerous roads.

One option to achieve system location optimization is to obtain 360° images of the area of interest, taken from plural (e.g. airborne) locations selected to cover the terrain. Deployment of radar (and cameras) may then be determined so as to verify free line of sight to pre-identified interesting (e.g. dangerous) portions of road/s, typically while verifying that line of sight is kept continually to a possible camera location. The above process may be repeated until good-quality coverage of the roads in an area of interest is deemed to have been achieved.

The cameras may be located in accordance with radar line-of-sight analysis and/or a database of road properties, which may store all or any subset of: each road segment's speed limit, road width, and predicted times of traffic jams on each segment of the road. For example, cameras may be deployed at entrances to tunnels, before traffic lights, before each exit from the highway, and just before each area that is susceptible to traffic jams, since the system tends to "lose" the offender if s/he enters a tunnel or traffic jam.

It is also appreciated that congested roads may constitute more challenging operating conditions for this system, due to the difficulty of differentiating (separating) moving objects on such roads; however, speeding is mainly a problem when the roads are not congested, since drivers are normally compelled to drive more slowly when surrounded by vehicles. Therefore, typically, operation of the radar (and cameras) can be limited to times of day or year in which roads are not congested. Also, the system need not be deployed at all on roads which are constantly or almost constantly congested, such as downtown roads in the heart of a metropolitan area. Data identifying congested portions of roads or areas, in which speeding does not occur because it cannot occur, hence the system's presence may be superfluous, may be measured by the system itself during a relatively short time (say, a few days) following initial deployment.

It is appreciated that cameras may, if desired, be mounted on drone/s to provide flexible deployment of cameras (e.g. if radar is deployed as per a variable, e.g. random or round-robin, rather than fixed, schedule), and/or to allow cameras to be moved to a position upstream of a traffic jam that has just developed. In this embodiment, the radar is typically supplied, e.g. by GPS module 1030 of Fig. 2, with the precise location of the drone (e.g. to GPS accuracy).

Operation 30: provide radar and cameras with data communication channels using any suitable communication subsystem (1040 in Fig. 2), e.g. standard civilian communication (say, 4G cellular communications equipment) with relatively low bandwidth, e.g. LTE 4G bandwidth. Physical communication between radar and cameras, and between cameras and police headquarters/police stations, may be based on commercial solutions such as conventional cellular or Motorola equipment. According to certain embodiments, communication between cameras, radar and police (or workstation 1060) is always via the controller, not directly between radar/camera/police (or workstation 1060).

Typically, communications are provided between radar and cameras, between cameras and police headquarters/police stations, and between cameras and conventional base stations and conventional ground stations for monitoring the airborne system components, typically during 100% of the airborne time, from the ground.

Optionally, a human operator, e.g. via workstation 1060, views system results before a speeding ticket is actually issued based on these results.

Operation 35: configure radar, e.g. use the configuration file in the radar to determine which car to incriminate, by configuring the high-speed and unambiguity thresholds in the tracker algorithm resident within the radar.

Changes to the configuration file can be made through an uplink from a suitable ground station, which typically comprises a computer on the ground (e.g. workstation 1060) linked via a communication link to the radar.

Example embodiments of real time operations are now described in detail.

Operation 40: Radar units detect vehicles, measure their speed, and continuously track all speeding vehicles moving in the radar field of view or all vehicles moving in the radar field of view.

Typically, continuous tracking ensures that a given vehicle is tracked all along the path between a location at which speeding was detected, and the camera which images that vehicle. If any doubt develops about the exact vehicle, the system typically will not order the camera to image the specific cars e.g. as described in detail herein.

The radar may have two operational modes, e.g. a rotating 360° mode and a sector mode. When 360° mode is used, coverage is high, but the revisit time depends on the radar's rotation speed. When sector mode is used, the revisit rate is higher, but coverage is limited by the sector angle.

Typically, if it is desired to cover a specific road, and the radar can be deployed in a position that allows the radar to cover this road in sector mode, sector mode is used. If several roads, or a single road which is longer than the radar's field of view, are being covered by a single radar subsystem but cannot be covered in sector mode, 360° mode is employed.
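As a back-of-envelope sketch only, and under the simplifying assumption that revisit time scales with the scanned angle at a fixed rotation speed (actual radar scan scheduling is more complex), the trade-off above can be quantified as follows:

    def revisit_time_seconds(rotation_period_s, sector_deg=360.0):
        # Time between successive looks at the same point on the ground,
        # assuming revisit time is proportional to the scanned angle.
        return rotation_period_s * (sector_deg / 360.0)

    print(revisit_time_seconds(4.0))        # 360-degree mode: 4.0 s
    print(revisit_time_seconds(4.0, 90.0))  # 90-degree sector: 1.0 s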

Any suitable control functionality, including manual control by a human operator, may be employed, to determine when, where and in what mode, to operate the system e.g. according to varying weather and visibility conditions, and possibly according to dynamic requirements regarding which areas are to be covered.

GMTI (Ground Moving Target Indicator) algorithms may be used to provide continuous surveillance of moving vehicles, such that many vehicles can be detected and tracked with each sweep of the radar, by conventionally generating and maintaining tracks that respectively represent ground moving vehicles. A track database may be maintained to store history and heading information of moving vehicles. GMTI tracks may be exchanged and shared among nodes, allowing for hand-off and/or non-ambiguous sharing of GMTI track data among distributed users. Data fusion may be employed to ensure that only one track is generated for a given detected moving vehicle, and that this same track is maintained, including by provision of new data pertaining to the given vehicle, over the entire life of the track.

If real-time traffic monitoring is desired, a suitable algorithm may be employed for estimation of the whole position and velocity vectors of detected moving vehicles, such as but not limited to "Fast GMTI Algorithm For Traffic Monitoring Based On A Priori Knowledge", available from: https://www.researchgate.net/publication/225006964_Fast_GMTI_Algorithm_For_Traffic_Monitoring_Based_On_A_Priori_Knowledge.

Options and thresholds for GMTI Algorithm (say) operation may be preset by the system developer (and/or machine-learned) and/or may be controlled by the user.

According to certain embodiments, the cameras may be detected by the radar, e.g. via repeaters installed on the cameras. The system may then generate a radar image that includes both the car and the camera, to create stronger evidence proving that the car in the image is the same one the system elected to incriminate.

According to certain embodiments, some or all cameras have a unique ID and send a beacon to the radar with their ID. This may be used for example to enable the radar to know the exact locations of cameras, and/or to ensure imaging of a given vehicle was done by the right camera/s.

Typically, although not necessarily, the radar is the master, having unidirectional control over one or more other system modules (aka slaves), and the radar subsystem's internal signal processing and tracker algorithm decides conventionally about opening new tracks or terminating existing tracks.

According to certain embodiments, the logic ensures that the radar tracks all vehicles whenever possible; thus the tracker handles all targets at all times, and conventionally associates plots to tracks, if and whenever possible.

Alternatively, the logic may command the radar to stop tracking a certain target (e.g. moving vehicle) as soon as the moving vehicle is imaged by a camera.

Typically, the system is configured to prevent multiple speeding tickets being issued to a single given offender within a given time-window, e.g. during one hour or over a few hours. For example, since each potential ticket is associated with a registration plate or other unique ID, the system may simply filter out subsequent speeding tickets to the same offender within a given time-window, or along a given segment of road.
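A minimal sketch of such per-offender filtering, assuming a hypothetical in-memory store (a deployed system would presumably persist this state):

    from datetime import datetime, timedelta

    class TicketDeduplicator:
        def __init__(self, window=timedelta(hours=3)):
            self.window = window
            self._last_ticket = {}  # license plate -> time of last ticket issued

        def should_issue(self, plate, now):
            last = self._last_ticket.get(plate)
            if last is not None and now - last < self.window:
                return False  # same offender within the time-window: filter out
            self._last_ticket[plate] = now
            return True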

Operation 50: each camera images all vehicles, or only speeding vehicles, or only the subset of speeding vehicles for which there is sufficient confidence that the vehicle has been properly tracked (e.g. a camera may sleep, and may be triggered each time a vehicle which the system's logic wants to image enters the camera's field of view).

The imaging process typically occurs if and only if (a) the radar sub-system is successful in continuously tracking a vehicle with a high (over-threshold) degree of confidence and also (b) the vehicle in question is determined (e.g. by the radar sub-system) to have been speeding. If there is any doubt about the vehicle (e.g. if the level of confidence, typically computed in the radar controller, is below threshold) the camera is typically not commanded (by the radar controller e.g.) to image that particular vehicle.

Doubt may be deemed by the system to exist if there is uncertainty with regard to whether the vehicle was successfully tracked (or if there is no doubt, and the vehicle was simply not tracked), or due to a "separation question", e.g. a question of whether certain putative tracks are a single track or plural tracks; this may occur, for example, if plural vehicles are travelling in the same vicinity and at the same speed.

Typically, the system does not incriminate (e.g. issue a ticket to), and may not even image, speeding vehicles that are detected but not successfully tracked, e.g. because these speeding vehicles cannot be tracked continuously during the tracking period, e.g. between their offense and the camera shooting (imaging) time, due e.g. to object/s obscuring the radar subsystem's line of sight to the vehicle, and/or due to ambiguity with other vehicles during the tracking period.

The radar may use two parameters of separation (one or both): range separation and Doppler (or radial velocity) separation. Typically, then, the radar can only distinguish between two or more cars if the cars respectively have different radial velocities and/or are at different ranges, where the extent of difference required depends on the radar's velocity and range separation resolutions, respectively.

Typically, if there are plural cars with almost the same velocity, and/or plural cars very close to one another, especially if only one or only some of the plural cars are speeding, then, to avoid ambiguity, the system will not order the camera to image those specific cars. For example, a radial velocity difference delta-v may be selected such that, if plural cars whose velocity difference is delta-v or less are not incriminated, the confidence level that an incriminated car was actually speeding is 99.99%. Similarly, a distance delta-x between cars may be selected such that, if plural cars whose range difference is delta-x or less are not incriminated, the confidence level that an incriminated car was actually speeding is 99.99%.
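A sketch of this separation test, assuming hypothetical track attributes and thresholds delta_v and delta_x chosen as described above:

    def is_unambiguous(target, others, delta_v_kph, delta_x_m):
        # True if no other tracked car falls within both the velocity gate
        # and the range gate around the target; otherwise the cars cannot
        # be separated and the target is not incriminated.
        for other in others:
            close_in_velocity = abs(target.radial_velocity_kph - other.radial_velocity_kph) <= delta_v_kph
            close_in_range = abs(target.range_m - other.range_m) <= delta_x_m
            if close_in_velocity and close_in_range:
                return False
        return True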

Regarding PD (probability of detection): if the single-scan detection probability is 0.8 at a given range for a given radar system, then the probability of missed detection is 0.2 for every car in the line of sight of the radar. Therefore, the probability of detecting each car during two scans is 1 - 0.2*0.2 = 0.96 (96%), and during 4 scans, 1 - 0.2^4 = 0.9984 (99.84%).
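The arithmetic above generalizes to 1 - (1 - p)^n for n independent scans with single-scan detection probability p, e.g.:

    def cumulative_detection_probability(p_single, n_scans):
        # Probability of at least one detection in n_scans independent scans.
        return 1.0 - (1.0 - p_single) ** n_scans

    print(cumulative_detection_probability(0.8, 2))  # 0.96
    print(cumulative_detection_probability(0.8, 4))  # 0.9984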

According to certain embodiments, the radar's tracking functionality tracks periodically, e.g. every 1-2 sec, and if the tracking at any time yields two vehicles, rather than one, with identical movement parameters along all axes, the track is disqualified (in which case the vehicle which was being tracked will not get a ticket, and may not even be imaged).

The communication protocol via which the radar subsystem communicates with the camera typically stipulates an exact time, which the radar may send to the camera, at which the camera is to shoot the image in order to capture the speeding vehicle, according to a time tag which the resulting ticket will carry.

The exact time at which to take the picture may be computed by the radar tracker according to the offending vehicle’s continually measured and updated speed and acceleration, typically assuming that imaging is to occur when the vehicle is in the center of the camera’s field of view.
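Under the simplifying assumption of constant acceleration over the remaining distance, this computation reduces to solving d = v*t + (1/2)*a*t^2 for the positive root, e.g.:

    import math

    def time_to_camera_center(distance_m, speed_mps, accel_mps2):
        # Solve d = v*t + 0.5*a*t**2 for the (positive) time at which the
        # vehicle reaches the center of the camera's field of view.
        if abs(accel_mps2) < 1e-9:
            return distance_m / speed_mps  # constant-velocity case
        disc = speed_mps ** 2 + 2.0 * accel_mps2 * distance_m
        return (-speed_mps + math.sqrt(disc)) / accel_mps2

    # Example: 300 m ahead at 40 m/s (144 km/h), no acceleration: shoot in 7.5 s.
    print(time_to_camera_center(300.0, 40.0, 0.0))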

The imaging time and/or time at which the actual speeding event was detected (including date) is stored in association with the image, so as to allow each speeding ticket to include an indication of the date and time at which the imaged speeding event occurred and/or the date and time at which the speeding vehicle was imaged.

Due to the importance of a time baseline (e.g. GPS, which may be provided by GPS module 1030 of Fig. 2) in real-time systems, the radar and cameras are typically synchronized, using conventional technology, at accuracies of typically less than 1 microsecond (10^-6 sec), or any other level of accuracy found to ensure that the camera images the right car. This synchronization between radar and camera ensures that the image being taken is of the right car (the actual speeding vehicle, not any of the vehicles near it). Imaging typically takes place while the camera is in online connection with the radar.

It is appreciated that many technologies are known for clock synchronization to coordinate plural independent clocks, e.g. to prevent the lack of temporal coordination which may occur if real clocks, even if initially set accurately, drift over time, e.g. due to clock rate differences. Conventional solutions include the Berkeley algorithm, Clock Sampling Mutual Network Synchronization, Cristian's algorithm, Global Positioning System, Inter-range instrumentation group time codes, Network Time Protocol, Precision Time Protocol, Reference broadcast synchronization, Reference Broadcast Infrastructure Synchronization, Synchronous Ethernet, and Synchronization in Ad-hoc Wireless Networks.
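To illustrate just one of these options, Cristian's algorithm estimates a remote clock by halving the measured round trip; the sketch below assumes a hypothetical query_time_server callable returning the reference clock's reading in seconds.

    import time

    def cristian_offset(query_time_server):
        # Estimate the offset between the local clock and a reference clock,
        # assuming the reply was generated halfway through the round trip.
        t0 = time.time()
        server_time = query_time_server()  # reference clock reading, seconds
        t1 = time.time()
        estimated_server_now = server_time + (t1 - t0) / 2.0
        return estimated_server_now - t1   # offset to add to the local clock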

The image thus taken is sent, e.g. to law enforcement authorities (e.g. police) and/or to a system controller, e.g. to the ticket report maker processor 1080 of Fig. 2, and typically meta-data is added, typically by the controller (e.g. ticket report maker processor 1080 of Fig. 2). The meta-data may include all or any subset of the following (all or any subset of which may also be included in the actual speeding ticket): time-stamp (including date), velocity, camera direction, ID of the radar track corresponding to the vehicle (if the raw track data is stored separately), or the raw radar data itself.

As described above (operation 20), the radar memory typically contains the locations of the cameras. Therefore, the radar can continually track speeding vehicles, and may maintain their tracks in the memory of the radar subsystem. The radar subsystem may then command all cameras, or a subset, or the single most suitable camera, to image the speeding vehicle, typically stipulating the exact time at which the speeding vehicle will be within the line of sight of a given camera; since the radar has in memory the exact locations of the cameras, and the paths and real-time locations of each speeding car, this stored data allows the exact time to be predicted.

The radar subsystem’s command to camera/s also typically includes a unique identifier of the track, which the camera can then use to label an image (captured as per that command, e.g. at the time stipulated in the command) when the image is sent. This allows the image to be matched to the track, hence to the details of the speeding offense. Alternatively or in addition, each image sent by the camera may be time-stamped, thus, if desired, serving as a basis or an additional basis allowing the image to be matched to the track hence to the details of the speeding offense (e.g. if the command is sent to a given camera and/or stipulates the time at which to image).

Typically, each radar-camera command is stored by the radar subsystem, either as a log, typically in conjunction with a track ID, or in association with (e.g. linked to, or in the same record in memory as) the track itself, as stored in the radar's memory.

According to certain embodiments, probability of tracking is increased and ambiguity reduced, by determining the first camera that the speeding vehicle will meet, using the stored track data, and commanding that first camera to image the speeding vehicle.

Typically, the camera to image a given vehicle is identified by storing the locations (e.g. x, y coordinates) of the cameras and/or of their respective fields of view. Then, once a vehicle V has been identified as a speeding vehicle, the vehicle (e.g. the vehicle's x, y coordinates, in the same coordinate system as the cameras) continues to be tracked, and the vehicle's location relative to at least some of the cameras (or to the centers of the system cameras' fields of view) is periodically or continually evaluated. As soon as vehicle V is found to be within a distance d of one of the cameras c, camera c is commanded to image V; once that has occurred, tracking may be discontinued, since it is no longer necessary to track vehicle V. Distance d is typically selected to be long enough to allow processing and to allow a command signal to travel to the camera, and is also typically selected to be short enough to prevent cameras deployed along road1 from being commanded to image a vehicle coming along nearby road2. For example, if d is a few meters (e.g. 5 or 10 or 20 meters), this distance is typically both short enough and long enough (e.g. given that adjacent roads are always at least 50 or 100 meters apart). Alternatively or in addition, the command may be issued only to a camera whose direction corresponds to the direction of motion of vehicle V.

It is appreciated that vehicle V's location may be periodically or continually evaluated only vis-a-vis cameras which are in the same area as V, and not vis-a-vis cameras which are geographically distant enough from V to make it impossible for these cameras ever to be the first camera V will encounter.

It is appreciated that the direction of vehicle V's trajectory may be computed, and may be compared to the direction of travel being monitored by potential cameras. For example, if vehicle V is travelling south to north (or is travelling in a direction which, say, increases V's x-coordinate but decreases V's y-coordinate), but camera c's field of view (e.g. as pre-stored when the camera is deployed) captures the north-to-south lane (or captures the lane which, when traversed by a car, decreases that car's x-coordinate but increases that car's y-coordinate), camera c is not a relevant candidate to image vehicle V.
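Combining the distance gate and the direction check of the last two paragraphs, a trigger might be sketched as follows; attribute names are illustrative (the CameraRecord fields sketched under operation 20 would fit here), and the thresholds are assumed defaults:

    import math

    def camera_to_command(vehicle, cameras, d_m=10.0, heading_tol_deg=30.0):
        # Return the first camera within distance d whose monitored direction
        # matches the vehicle's heading, or None if no camera qualifies yet.
        for cam in cameras:
            dist = math.hypot(vehicle.x - cam.x, vehicle.y - cam.y)
            heading_diff = abs((vehicle.heading_deg - cam.heading_deg + 180.0) % 360.0 - 180.0)
            if dist <= d_m and heading_diff <= heading_tol_deg:
                return cam  # command this camera; tracking of V may then stop
        return None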

Alternatively or in addition, a topological structure of the network of roads may be pre-stored, e.g. as a graph in memory with directional edges interconnecting nodes (junctions), hence representing routes or lanes; suitable conventional topological methods may then be used to determine the edge that the speeding car is on, and to retrieve from memory the next camera that the car will encounter. Shortest-path algorithms operative for finding the shortest paths between nodes in a graph, e.g. Dijkstra's algorithm, can find the shortest path through the network of roads from the speeding driver to each of the cameras; the controller may then send an imaging command to the camera, or only to the camera, whose distance along the roads, in the direction of travel, from the speeding driver is shortest.

It is appreciated that known variants of Dijkstra's algorithm find shortest paths from a given source node (e.g. a speeding car's location) to all other nodes in the graph, producing a shortest-path tree.
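A minimal sketch of this graph-based selection, assuming the road network is encoded as an adjacency mapping {node: [(neighbor, length_m), ...]} with junctions as nodes (an assumed encoding, not specified herein):

    import heapq

    def shortest_distances(graph, source):
        # Dijkstra's algorithm: along-road distance from source to every node.
        dist = {source: 0.0}
        heap = [(0.0, source)]
        while heap:
            d, node = heapq.heappop(heap)
            if d > dist.get(node, float("inf")):
                continue  # stale heap entry
            for neighbor, length_m in graph.get(node, []):
                nd = d + length_m
                if nd < dist.get(neighbor, float("inf")):
                    dist[neighbor] = nd
                    heapq.heappush(heap, (nd, neighbor))
        return dist

    def nearest_camera_node(graph, car_node, camera_nodes):
        # Choose the camera with the shortest along-road distance, if reachable.
        dist = shortest_distances(graph, car_node)
        reachable = [(dist[c], c) for c in camera_nodes if c in dist]
        return min(reachable)[1] if reachable else None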

Alternatively or in addition, cameras may take many time-stamped images per unit of time, e.g. several or dozens of images per second; all images are image-processed to detect license plates, and matching of images captured by camera C to tracks of speeding vehicles is performed off-line, by computing, off-line, the arrival time at which a tracked speeding vehicle is due to arrive at camera C, and identifying, off-line, the image in camera C's image stream whose time-stamp most closely matches that arrival time. Optionally, if the time interval between that image and the predicted arrival time exceeds a threshold, or if the confidence level in the predicted arrival time is lower than a threshold, no traffic ticket is sent.
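Such off-line matching might be sketched as follows, with illustrative names and an assumed representation of the image stream as (timestamp, image) pairs:

    def match_image(image_stream, predicted_arrival, max_gap_seconds=0.5):
        # image_stream: iterable of (timestamp_seconds, image) pairs from camera C.
        best = min(image_stream, key=lambda ti: abs(ti[0] - predicted_arrival), default=None)
        if best is None or abs(best[0] - predicted_arrival) > max_gap_seconds:
            return None  # gap exceeds threshold: no traffic ticket is sent
        return best[1]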

Operation 60: In real-time, near real-time or off-line, image processing unit finds license plate in camera-generated image of vehicle, and reads license plate number, e.g. for all vehicles or only for speeding vehicles or only for speeding vehicles which were successfully tracked.

Operation 70: ticketing functionality sends a speeding ticket to each speeding vehicle's owner, e.g. by SMS or email or regular mail (snailmail), using contact particulars stored in a data repository in conjunction with each license plate number.

A particular advantage of the system and methods shown herein, including the method of Fig. 1, may be that a road network whose total area has a radius of, say, even 50 km may be monitored effectively for speeding by only a few cameras.

A further particular advantage may be that the system impels drivers to drive within the legal speed limit throughout their journey, because the radar measures vehicles' speeds along the entire length of all roads (if the radar units' combined field of view covers the entire length of all roads), in contrast to conventional camera-based speeding detection systems, which detect speeding only within a limited vicinity of each deployed camera. The distance between the location at which the traffic violation (e.g. speeding) occurred and the location at which the offending vehicle, having been tracked continuously, is imaged, may be several kilometers, or even longer. The radar can continuously track hundreds of vehicles simultaneously, typically imaging only the speeding vehicles.

A further particular advantage may be that the system is effective, practically speaking, even if, e.g. for simplicity or reasons of cost effectiveness, it has no capacity to hand off target tracks from one radar unit to another, and instead, say, simply disregards speeding vehicles which leave the field of view of a given radar unit without encountering a camera. This is because a goal of the system (in some use-cases, the main goal) is deterrence, for which it is not necessary to provide a 100% success rate in tracking every single speeding vehicle. For the same reason, in most use cases, there is no need to provide technology for handing off target tracks from one radar unit to another; tracking targets through more than one radar, with handover between the plural radars, may be helpful in identifying vehicles speeding right at the boundary between plural radar units' respective fields of view, but since each radar itself covers an area large enough to identify many speeding vehicles, it is not necessary to make provisions to detect the small number of vehicles speeding right at that boundary.

It is appreciated that while the system may not have a 100% success rate in tracking every single speeding vehicle unambiguously all the way to the next camera along the vehicle's route, this is of no significance in most use-cases. In conventional systems, the success rate is far from 100%, since a great deal of undetected speeding occurs between the locations (known to the drivers) of the speed traps along the route. Therefore, the system may be configured, as described herein, to disregard, not incriminate, or not even image a subset of vehicles regarding which there is uncertainty as to whether they are speeding, and/or as to whether they have been confounded with (have not been separated or differentiated from) another vehicle; and/or the system may simply disregard vehicles, even if speeding, that were or may have been unsuccessfully tracked (e.g. because the vehicle was obscured by a tunnel, a larger vehicle, a building in the line of sight between radar unit and vehicle, or a traffic jam between the speeding incident and the next camera available along the offending driver's route, etc.), using predetermined or learned or adjustable or programmable thresholds to distinguish certainty from uncertainty.

It is also appreciated that speeding is a significant contributory factor in fatalities, particularly along highways. These roads, fortunately, are characterized by few obstructions compared to city roads, and hence are particularly well suited to coverage by the system shown and described herein.

It is appreciated that when the system herein is deployed, drivers must drive within the speed limit at all times, rather than slowing down each time they approach a camera or other speed trap and resuming speeding in between cameras. This is because violation detection is by a radar device whose field of view is sufficiently large as not to be readily apparent and/or as to cover entire roads or regions, in contrast to conventional camera-based or human-operated speed-traps.

It is appreciated that the flow of Fig. 1 is not intended to be limiting. For example, the following link:

https://www.rand.org/content/dam/rand/pubs/monograph_reports/MR1398/MR1398.appa.pdf

describes advanced technologies for finding and recognizing mobile targets; it is appreciated that any of these technologies may be used to replace or augment any operation shown or described herein.

Alternatively or in addition, according to certain embodiments, the system herein is configured for detecting traffic violations in addition to, or other than, speeding, e.g. failure to keep a safe distance between vehicles.

Fig. 2 is a simplified block diagram of an example system; all or any subset of the illustrated units may be provided. As shown, typically an airborne platform 1010 powers an airborne system 1020, which typically comprises an Inertial Navigation System or GPS/INS 1030 which provides time and/or navigation data to radar system 1050, which in turn provides tracks and/or incrimination data and/or system status and/or camera commands to a suitable communication subsystem 1040, which also typically communicates camera status and/or position, and/or operator commands from operator workstation/s 1060, to the radar subsystem. The communication subsystem 1040 typically communicates camera commands and/or the status of airborne system 1020 to/from the cameras 1, ..., N and/or to/from operator workstation 1060. The operator workstation 1060 may receive images and/or camera position and/or camera status data directly from the cameras 1, ..., N. The operator workstation 1060 typically includes a system status monitor 1070 and/or a processor/controller configured for ticket report generation e.g. as described herein.

It is appreciated that the term "camera" as used herein may include any suitable sensor, e.g. a camera or other imaging system or sensor suitable for uniquely identifying a passing vehicle. For example, the camera may include a first license plate recognition camera with its associated license plate recognition software operative for image processing, which may have shutter rate control giving a shutter value short enough to darken the video yet capture the license plate clearly enough to enable number recognition, and also, optionally, another camera with a wider angle of view, typically wide enough to image identifying features of the vehicle, e.g. to identify make and model and/or to further corroborate identification. If a road has several lanes, each lane may have its own camera/s. Typically, the camera angle is less than 45 degrees from the license plates and preferably faces the license plate (almost) straight on. The camera is typically deployed at less than a 30 degree angle above (or below) the license plate to prevent e.g. the bumper, in some models, from blocking the field of view.
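To make the mounting guideline concrete (the numbers below are illustrative, not from this disclosure), the vertical viewing angle is simply the arctangent of the camera's height above plate level over the horizontal distance to the capture point:

    import math

    def vertical_view_angle_deg(mount_height_m, capture_distance_m):
        # Angle between the camera-to-plate line of sight and the horizontal.
        return math.degrees(math.atan2(mount_height_m, capture_distance_m))

    # E.g. a camera 5 m above plate level imaging plates 20 m away:
    print(vertical_view_angle_deg(5.0, 20.0))  # ~14 degrees, under the 30-degree guideline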

Headlight Compensation, aka HLC, may be provided to reduce glare and overexposure that result from vehicle headlights directly pointing at the camera, e.g. when front-facing license plates are captured. Cameras may have a Wide Dynamic Range feature which divides each frame/image into sections and determines a correct exposure per section, e.g. due to varied lighting levels caused, say, by headlights or streetlamps.

Since the vehicle may be fast moving (say, up to 200 kph) and needs to be captured, even at night, at relatively high resolution, cameras may have adjustable shutter speeds, e.g. between 1/30 and 1/100,000 of a second.
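As a worked sanity check (illustrative numbers, not from this disclosure): a vehicle at 200 kph covers about 55.6 m per second, so the distance it travels during one exposure, a rough proxy for motion blur at the plate, is:

    def motion_blur_m(speed_kph, exposure_s):
        # Distance travelled during one exposure; a rough proxy for motion blur.
        return speed_kph / 3.6 * exposure_s

    print(motion_blur_m(200, 1 / 1000))  # ~0.056 m: plate likely still legible
    print(motion_blur_m(200, 1 / 30))    # ~1.85 m: plate hopelessly smeared

Hence the need for the short end of the shutter-speed range at highway speeds.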

Illumination of the camera's field of view may be provided to reduce IR reflection issues that may adversely affect confidence levels.

Cameras may, when possible, be positioned at locations in which vehicles are expected to stop or slow down.

Cameras are typically installed to face an area with adequate lighting. A source of full-spectrum ambient light may be provided, since infrared light impinging upon the reflective paint used in license plates can reflect too much light. A sensor, e.g. an internal PIR sensor, may be used to switch from one illumination mode to another, e.g. from IR to full-spectrum white, e.g. natural light.

Cameras may for example be mounted on street poles, streetlights and highway overpasses. Conventional camera systems automatically associate captured license plate numbers that come into their view, with the location, date, time and, sometimes, images of the vehicle and even its driver, and, perhaps, its passengers. All this data may then be uploaded to a remote server or controller e.g. as described herein.

According to certain embodiments, the radar, or a processor associated therewith, makes a computation to determine whether or not it can safely be assumed, at a predetermined high level of confidence, that each given vehicle being tracked is the speeding vehicle throughout the time window starting at speeding time and culminating at imaging time, and that the tracking could not have confused the speeding vehicle with another (e.g. with a second vehicle which passes the speeding vehicle, or is passed thereby, during the window). This may be done by verifying that all other radar detections around the tracked vehicle are separated in speed and/or range by threshold/s whose value takes into account the vehicle's physical capabilities (i.e. possible velocities and accelerations) and possible radar errors (e.g. accuracy of range and/or velocity). Any speeding tracked vehicle that does not meet those thresholds is not ticketed. Typically, the radar stores and documents these computations in case the ticket is challenged ex post facto.

An example computation is this: in a certain system, the radar range accuracy is better than 1 m, and the radar rate accuracy is better than 1 kph. The radar sampling time is 2 sec. A scenario of one car passing another is given, and the question is whether the system can track each one of them without mixing or confounding or confusing them. Assume car #1 is keeping a constant speed of 90 kph, and car #2 is maintaining a constant speed of 80 kph. The distance between the detections with the 90 kph speed (at T=0 and at T=2 sec) would be 50 m, while the distance between the detections with the 80 kph speed (at T=0 and at T=2 sec) would be only 44 m. Consider whether it could be that (hypothesis:) car #1 decreased its speed to 80 kph during those 2 sec and car #2 increased its speed to 90 kph during the same 2 sec. If so, then the range difference would have to be 47 m for both cars. In order for car #2 to be measured at 90 kph at a range distance of 50 m, car #2 must have increased its speed from 80 kph to over 100 kph and then decelerated back to 90 kph, all in 2 sec. Also, car #1 would have had to decrease its speed to below 70 kph and then increase back to 80 kph. The hypothesis is thus not physically possible, so the logical conclusion is that both cars maintained their original speeds, hence there could not have been any confusion between them, hence if car #1 (or #2) was previously speeding, car #1 (or #2) can be ticketed.
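The arithmetic of this example can be reproduced compactly. Under the (illustrative) assumption of a linear speed change between samples, displacement is the average of the endpoint speeds times the sampling time, which immediately exposes the contradiction in the swap hypothesis:

    def displacement_m(v_start_kph, v_end_kph, dt_s):
        # Displacement under a linear speed change from v_start to v_end.
        return (v_start_kph + v_end_kph) / 2 / 3.6 * dt_s

    DT = 2.0  # radar sampling time, seconds

    # Observed spacing between successive detections of each track:
    print(displacement_m(90, 90, DT))  # 50.0 m for the 90 kph track
    print(displacement_m(80, 80, DT))  # ~44.4 m for the 80 kph track

    # Swap hypothesis: each car linearly changed speed to the other's.
    print(displacement_m(90, 80, DT))  # ~47.2 m, not 44.4 m: contradiction
    print(displacement_m(80, 90, DT))  # ~47.2 m, not 50.0 m: contradiction

Matching the observed spacings would require each car to overshoot well past its measured end speed and snap back within the 2-second sample, which is why the swap hypothesis is rejected as physically implausible.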

More generally, given the final radar performance, together with road orientation data, the measurement data, and the physical limitations of the vehicles (of cars and/or motorcycles, for example), the radar, or a processor associated therewith, may compute whether a car overtaking another car does or does not allow ticketing to proceed (if one of the two cars was previously speeding), depending on whether there are plural plausible (physically possible) hypotheses, in which case it is not valid to ticket either of the cars, or whether there is only one plausible hypothesis, in which case the cars cannot have been confounded, in which case it is valid to ticket.

The radar may ensure, at a level of confidence (which may be pre-determined, or may be a system parameter which may be selected per region, to conform with region-specific legal requirements), that each tracked speeding vehicle which is ticketed has not been confounded or confused with any other vehicle during the entire time interval starting from the time at which the vehicle committed the offense (e.g. speeding) and ending at the time the vehicle is imaged. This can be done, for example, by checking, for each vehicle found to be speeding, whether there was any point in time (say: any given second or minute) during that entire time interval in which even one radar detection, identified by the radar system which identified the speeding incident, differed in speed by less than a threshold amount (aka "speed threshold") from the speeding vehicle, and also whether there was any point in time (say: any given second or minute) during that entire time interval in which even one radar detection identified by the radar differed in range by less than a threshold amount (aka "range threshold") from the speeding vehicle.

Then, any speeding tracked vehicle that has in its vicinity even one such radar detection, e.g. has even one radar detection which does not exceed, at every single second or minute (or other time unit) during the above time interval, at least one of those thresholds (or, to be even more strict about preventing ticketing of innocent vehicles, has even one radar detection which does not exceed, at every single second or minute (or other time unit) during the above time interval, both of those thresholds), is disqualified for ticketing, i.e. is not ticketed. Since deterrence does not require 100% enforcement, this systematic disqualification (for ticketing) of speeding vehicles that might have been confounded with an innocent vehicle (leading to possible imaging and consequent ticketing of an innocent vehicle) ensures that the system is simultaneously effective in ticketing most offenders, and non-susceptible to legal challenges, since only indisputably guilty vehicles are ticketed.
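The disqualification rule just described amounts to scanning the stored detection log, time unit by time unit, and rejecting a track whenever any other detection falls within both thresholds simultaneously. A minimal sketch of the stricter variant, with illustrative names and threshold values:

    def qualifies_for_ticket(track, other_detections_by_time,
                             speed_threshold_kph=5.0, range_threshold_m=10.0):
        # `track` maps time unit -> (range_m, speed_kph) of the speeding
        # vehicle; `other_detections_by_time` maps time unit -> list of
        # (range_m, speed_kph) for every other detection logged by the radar.
        # Return False (disqualify) if at any time unit some other detection
        # is within BOTH thresholds, i.e. the track may have been confounded.
        for t, (trk_range, trk_speed) in track.items():
            for det_range, det_speed in other_detections_by_time.get(t, []):
                if (abs(det_range - trk_range) < range_threshold_m and
                        abs(det_speed - trk_speed) < speed_threshold_kph):
                    return False  # possible confusion: do not ticket
        return True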

Disqualification need not be done on the fly and can instead be done off-line. There is no imperative to issue the tickets in real time or near-real time, and disqualification may be performed off-line if, as is typically the case, the radar stores all radar detections in a log, to serve as documentation of the ticketing process and to enable ex post facto legal defence of each and every ticket. Typically, each radar detection of each radar system is stored with a time-stamp and includes the range and speed as recorded at that time.

The speed and range thresholds are each selected to take into account the possible physical vehicle behavior and the radar errors, such that the probability that the guilty vehicle could be confounded with an innocent vehicle is as small as the level of confidence demands.

It is appreciated that terminology such as "mandatory", "required", "need" and "must" refer to implementation choices made within the context of a particular implementation or application described herewithin for clarity and are not intended to be limiting since in an alternative implementation, the same elements might be defined as not mandatory and not required, or might even be eliminated altogether.

Components described herein as software may, alternatively, be implemented wholly or partly in hardware and/or firmware, if desired, using conventional techniques, and vice-versa. Each module or component or processor may be centralized in a single physical location or physical device or distributed over several physical locations or physical devices.

Included in the scope of the present disclosure, inter alia, are electromagnetic signals in accordance with the description herein. These may carry computer-readable instructions for performing any or all of the operations of any of the methods shown and described herein, in any suitable order including simultaneous performance of suitable groups of operations as appropriate; machine-readable instructions for performing any or all of the operations of any of the methods shown and described herein, in any suitable order; program storage devices readable by machine, tangibly embodying a program of instructions executable by the machine to perform any or all of the operations of any of the methods shown and described herein, in any suitable order i.e. not necessarily as shown, including performing various operations in parallel or concurrently rather than sequentially as shown; a computer program product comprising a computer useable medium having computer readable program code, such as executable code, embodied therein, and/or including computer readable program code for performing, any or all of the operations of any of the methods shown and described herein, in any suitable order; any technical effects brought about by any or all of the operations of any of the methods shown and described herein, when performed in any suitable order; any suitable apparatus or device or combination of such, programmed to perform, alone or in combination, any or all of the operations of any of the methods shown and described herein, in any suitable order; electronic devices each including at least one processor and/or cooperating input device and/or output device and operative to perform e.g. in software any operations shown and described herein; information storage devices or physical records, such as disks or hard drives, causing at least one computer or other device to be configured so as to carry out any or all of the operations of any of the methods shown and described herein, in any suitable order; at least one program pre-stored e.g. in memory or on an information network such as the Internet, before or after being downloaded, which embodies any or all of the operations of any of the methods shown and described herein, in any suitable order, and the method of uploading or downloading such, and a system including server/s and/or client/s for using such; at least one processor configured to perform any combination of the described operations or to execute any combination of the described modules; and hardware which performs any or all of the operations of any of the methods shown and described herein, in any suitable order, either alone or in conjunction with software. Any computer-readable or machine-readable media described herein is intended to include non-transitory computer- or machine-readable media.

Any computations or other forms of analysis described herein may be performed by a suitable computerized method. Any operation or functionality described herein may be wholly or partially computer-implemented, e.g. by one or more processors. The invention shown and described herein may include (a) using a computerized method to identify a solution to any of the problems, or for any of the objectives, described herein, the solution optionally including at least one of a decision, an action, a product, a service or any other information described herein that impacts, in a positive manner, a problem or objectives described herein; and (b) outputting the solution.

The system may, if desired, be implemented as a web-based system employing software, computers, routers and telecommunications equipment as appropriate.

Any suitable deployment may be employed to provide functionalities e.g. software functionalities shown and described herein. For example, a server may store certain applications, for download to clients, which are executed at the client side, the server side serving only as a storehouse. Some or all functionalities e.g. software functionalities shown and described herein may be deployed in a cloud environment. Clients e.g. mobile communication devices, such as smartphones, may be operatively associated with, but external to the cloud.

The scope of the present invention is not limited to structures and functions specifically described herein and is also intended to include devices which have the capacity to yield a structure, or perform a function, described herein, such that even though users of the device may not use the capacity, they are, if they so desire, able to modify the device to obtain the structure or function.

Any "if-then" logic described herein is intended to include embodiments in which a processor is programmed to repeatedly determine whether condition x, which is sometimes true and sometimes false, is currently true or false, and to perform y each time x is determined to be true, thereby to yield a processor which performs y at least once, typically on an "if and only if" basis, e.g. triggered only by determinations that x is true and never by determinations that x is false.

Features of the present invention, including operations which are described in the context of separate embodiments, may also be provided in combination in a single embodiment. For example, a system embodiment is intended to include a corresponding process embodiment, and vice versa. Also, each system embodiment is intended to include a server-centered "view" or client-centered "view", or "view" from any other node of the system, of the entire functionality of the system, computer-readable medium, apparatus, including only those functionalities performed at that server or client or node. Features may also be combined with features known in the art and particularly, although not limited to, those described in the Background section or in publications mentioned therein.

Conversely, features of the invention, including operations, which are described for brevity in the context of a single embodiment or in a certain order, may be provided separately or in any suitable subcombination, including with features known in the art (particularly although not limited to those described in the Background section or in publications mentioned therein), or in a different order. "e.g." is used herein in the sense of a specific example which is not intended to be limiting. Each method may comprise some or all of the operations illustrated or described, suitably ordered e.g. as illustrated or described herein.

Devices, apparatus or systems shown coupled in any of the drawings may in fact be integrated into a single platform in certain embodiments or may be coupled via any appropriate wired or wireless coupling such as but not limited to optical fiber, Ethernet, Wireless LAN, HomePNA, power line communication, cell phone, Smart Phone (e.g. iPhone), Tablet, Laptop, PDA, Blackberry GPRS, Satellite including GPS, or other mobile delivery. It is appreciated that in the description and drawings shown and described herein, functionalities described or illustrated as systems and sub-units thereof can also be provided as methods and operations therewithin, and functionalities described or illustrated as methods and operations therewithin can also be provided as systems and sub-units thereof. The scale used to illustrate various elements in the drawings is merely exemplary and/or appropriate for clarity of presentation and is not intended to be limiting.