The invention relates to a measuring device that can be controlled without contact, and to a method for controlling such a measuring device.

For measuring a target point, numerous geodetic measuring appliances have been known since ancient times. In this case, distance and direction or angle from a measuring appliance to the target point to be measured are recorded and, in particular, the absolute position of the measuring appliance together with reference points possibly present are acquired as spatial standard data.

Generally known examples of such geodetic measuring appliances include the theodolite, the tachymeter and the total station, the latter also being designated as an electronic tachymeter or computer tachymeter. One geodetic measuring device from the prior art is described in the publication document EP 1 686 350, for example. Such appliances have electrical-sensor-based angle and distance measuring functions that permit direction and distance to be determined with respect to a selected target. In this case, the angle and distance variables are determined in the internal reference system of the appliance and, if appropriate, also have to be combined with an external reference system for absolute position determination.

Modern total stations have microprocessors for digital further processing and storage of acquired measurement data. The appliances generally have a compact and integrated design, wherein coaxial distance measuring elements and also computing, control and storage units are usually present in an appliance. Depending on the extension stage of the total station, motorization of the targeting or sighting device and means for automatic target seeking and tracking can additionally be integrated. As a human-machine interface, the total station can have an electronic display control unit - generally a microprocessor computing unit with electronic data storage means - with display and input means, e.g. a keyboard.
The measurement data acquired in an electrical-sensor-based manner are fed to the display control unit, such that the position of the target point can be determined, optically displayed and stored by the display control unit. Total stations known from the prior art can furthermore have a radio data interface for setting up a radio link to external peripheral components such as e.g. a handheld data acquisition device, which can be designed, in particular, as a data logger or field computer.

For sighting or targeting the target point to be measured, geodetic measuring appliances of the generic type have a telescopic sight, such as e.g. an optical telescope, as sighting device. The telescopic sight is generally rotatable about a vertical axis and about a horizontal tilting axis relative to a base of the measuring appliance, such that the telescopic sight can be aligned with the point to be measured by pivoting and tilting. Modern appliances can have, in addition to the optical viewing channel, a camera for acquiring an image, said camera being integrated into the telescopic sight and being aligned for example coaxially or in a parallel fashion, wherein the acquired image can be represented, in particular, as a live image on the display of the display control unit and/or on a display of the peripheral device - such as e.g. the data logger - used for remote control. In this case, the optical system of the sighting device can have a manual focus - for example an adjusting screw for altering the position of a focusing optical system - or an autofocus, wherein the focus position is altered by means of servo motors. Automatic focusing devices for telescopic sights of geodetic devices are known e.g. from DE 19710722, DE 19926706 and DE 19949580.

The optical system or the optical viewing channel of the sighting device usually contains an objective lens group, an image reversal system, a focusing optical system, a reticle for producing a crosshair, and an eyepiece, which are arranged e.g. in this order from the object side.
The position of the focusing lens group is set depending on the object distance in such a way that a sharp object image arises on the reticle arranged in the focusing plane. Said image can then be viewed through the eyepiece or e.g. acquired with the aid of a camera arranged coaxially.

By way of example, the construction of generic telescopic sights of geodetic appliances is disclosed in the publication documents EP 1 081 459 or EP 1 662 278.

In known measuring devices it is customary to align them coarsely with a target. Afterward, the target is acquired by the user via an optical system such as a telescope, for example, and the measuring device is aligned precisely with the target by means of fine setting. This is followed by customary measuring tasks such as determining the distance, the direction, the position, etc. Articles introduced into the terrain such as, for example, prisms, rods or mirrors, and also stationary objects such as, for example, mountain summits, church towers, pylons and so on, can be used as targets.

In the known measuring methods it is repeatedly necessary for the user to look away from the target being viewed through the optical system in order to make diverse settings on the measuring appliance. This is normally done by means of setting means, such as switches, buttons or levers, which are fitted on the appliance itself or on a remote control and are to be operated manually. In order to operate these means, the user's eye must repeatedly refocus on different distances and articles, for which reason it is susceptible to fatigue. Moreover, regularly touching the measuring device in order to actuate the latter (for example in the course of fine setting to a measurement target) can repeatedly result in vibrations and impacts, as a result of which the accuracy of the measurement can be impaired or the measurement work is at least protracted.
As a result, particularly in the case of high-precision measurements, delays can repeatedly occur during the measurement.

Therefore, there is a need for a measuring device which can be actuated in a contact-free manner, and for a method by which such a measuring device can be controlled.

According to the invention, a measuring device comprises a targeting device and optionally a representation device for representing an image of the target sighted by the targeting device. Furthermore, an eye image acquisition device oriented toward a user end of the targeting device is provided, which is designed to continuously acquire images of a user's eye (eye images), said eye being situated in particular at the user end. The eye image acquisition device can acquire the eye images as part of a recording of the whole face or of parts of the face of the user, in particular comprising the visible part of the eye with the rim of the eye, or else acquire only part of the user's eye, in particular the front part of the eyeball containing the cornea with iris and pupil. Eye image acquisition devices within the meaning of this invention can in this case be cameras or else light-sensitive sensors, for example two-dimensional CCD sensors or, as described in US 7,697,032, CMOS image sensors.

On account of the horizontally elliptically shaped and spherically curved cornea of the human eye, an eye image within the meaning of this invention can alternatively also be acquired by three-dimensionally scanning the surface of the user's eye, in particular the visible part of the eyeball, for example by means of a scanner. A three-dimensional image of the eye can also be generated by means of at least two cameras.

The eye image acquisition device can be constituted in such a way that it is able, as described for example in WO 99/05988, to actively illuminate the user's eye.
As a result, eye images can still be acquired even in darkness or when the eye is shaded, as can be brought about for example by great proximity of the eye to the eyepiece. LEDs, in particular, are appropriate as illumination means. In order to prevent the user's eye from being dazzled, which would make target identification by the user considerably more difficult when there is low brightness, this active irradiation of the user's eye preferably takes place in a non-visible wavelength range of electromagnetic radiation, for example by means of infrared light, as described in WO 2011/064534. For this purpose, the eye image acquisition device must be suitable for receiving the electromagnetic radiation respectively emitted.

On the part of the measuring appliance, evaluation means for data storage and control of the alignment of the targeting unit are provided, which are designed for implementing an automatic viewing-direction-dependent targeting functionality. The evaluation means contain, in particular, machine-readable data carriers with computer program code for carrying out the method according to the invention. For determining the viewing direction of the user's eye, the evaluation means function as a viewing direction determining device.

The viewing direction determining device serves to determine, in each of the continuously acquired eye images, the pupil midpoint of the user's eye, or to acquire information from which the pupil midpoint can be derived (eye information); this can be, for example, the position of the pupil or iris, or generally also the distribution of bright and dark areas in the acquired eye image. The determination of the pupil midpoint or of other eye features on the basis of the eye images can preferably take place by means of image recognition systems - well known to the person skilled in the art - with a pattern recognition using one mask for the form of the eye and another mask for the pupil or other eye features.
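By way of illustration only, the simplest conceivable form of such a pupil determination replaces the mask-based pattern recognition described above with a brightness threshold and a centroid over the darkest pixels. The function name, the threshold value and the list-of-lists image format are assumptions for this sketch, not features of the patent:

```python
def find_pupil_midpoint(eye_image, threshold=50, min_pixels=10):
    """Estimate the pupil midpoint as the centroid of the darkest pixels.

    eye_image: 2-D list of grayscale intensities (0 = black, 255 = white).
    Returns (row, col) of the centroid, or None if too few dark pixels
    are found (e.g. because the eye is closed).
    """
    dark = [(r, c)
            for r, row in enumerate(eye_image)
            for c, value in enumerate(row)
            if value < threshold]
    if len(dark) < min_pixels:
        return None
    rows = [r for r, _ in dark]
    cols = [c for _, c in dark]
    return (sum(rows) / len(rows), sum(cols) / len(cols))

# Synthetic 100x100 eye image: bright background with a dark "pupil" block
image = [[200] * 100 for _ in range(100)]
for r in range(40, 50):
    for c in range(60, 70):
        image[r][c] = 10
print(find_pupil_midpoint(image))   # centroid of the dark block: (44.5, 64.5)
```

A real implementation would instead fit eye and pupil masks, as the text describes, to remain robust against eyelashes, reflections and shading.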
On the basis of the determined or derived position of the pupil midpoint, the viewing direction determining device then determines the viewing direction of the user's eye.

Furthermore, the viewing direction determining device can be able to determine the distance between the - determined or derived - pupil midpoint and the optical axis of the targeting device. On the basis of the determined distance, the viewing direction determining device can then determine the viewing direction of the user's eye.

Known measuring devices, such as total stations, for example, can have targeting devices comprising telescopes, wide-angle cameras, panoramic cameras, axial cameras and so on. A representation device serves for representing the image of a target sighted by the targeting device, said representation device being embodied for example as a display in the targeting device. Since the targeting device is able to show a real image of the sighted target in its image plane, e.g. an eyepiece of the targeting device can serve as the representation device, said eyepiece enabling the image to be observed directly by the human eye. Alternatively or additionally, a ground-glass screen in the image plane can serve as the representation device, which can be embodied e.g. as a photographic sensor. The image signal can then be transmitted to a display, such as e.g. an LCD monitor, provided within or outside the targeting device.

By way of example, a video camera can serve as the eye image acquisition device for acquiring continuous eye images of a user's eye situated at the user end, said video camera preferably being aligned along the optical axis of the targeting device. However, the eye image acquisition device can also be provided outside the targeting device. It can be fitted for example in a handheld operating unit or else in special spectacles. In the acquired eye image, the exact position of the pupil or of the pupil midpoint of the user's eye can be determined by means of image extraction methods.
Since the position of the optical axis of the targeting device in relation to the acquired eye images of the pupil midpoint is known, it is possible to determine a distance - preferably a pixel distance - between the pupil midpoint and the optical axis. On the basis of the determined distance, it is then possible to determine the viewing direction of the user's eye. Incidentally, this does not necessitate representing the eye image in a manner visible to the user. Depending on the targeting device used, according to the beam path in the targeting device it may be sufficient to determine the distance between the pupil midpoint and the optical axis, since the viewing direction of the user's eye is defined as a straight line through the pupil midpoint.

In order to increase the accuracy, it is possible to take account of the anatomy of the human eye. Since the human eyeball is substantially spherical, and its diameter is 2.5 cm on average, an angle between the viewing direction of the user's eye and the optical axis can be determined in a simple manner as an angular offset. This is possible by means of the sine law or the rule of three, for example. In this case, the midpoint of the eyeball essentially serves as the pivot point about which the change in the viewing direction takes place.

As an alternative thereto, for checking the viewing direction determined, a calibration device can be provided, which successively marks individual pixels in the displayed image of the target, e.g. by illuminating said pixels, and determines the distance between the pupil midpoint and the optical axis while the respectively marked pixel is being viewed. Since the viewing direction corresponding to each marked point is known, it is possible in this way to ascertain a deviation in the result of the determination of the viewing direction. According to the deviations ascertained, the viewing direction determined can then be corrected during operation.
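The sine-law determination of the angular offset described above can be sketched as follows. The 12.5 mm eyeball radius follows from the average diameter given in the text; the pixel-to-millimetre scale factor and all names are illustrative assumptions and would in practice come from calibration:

```python
import math

EYEBALL_RADIUS_MM = 12.5   # half of the ~2.5 cm average eyeball diameter

def angular_offset_deg(pupil_offset_mm):
    """Angle between the viewing direction and the optical axis, from the
    lateral displacement of the pupil midpoint (sine law on the eyeball)."""
    return math.degrees(math.asin(pupil_offset_mm / EYEBALL_RADIUS_MM))

def pixels_to_mm(pixel_offset, mm_per_pixel=0.05):
    """Convert a pixel distance in the eye image to millimetres; the scale
    factor depends on the camera optics and must come from calibration."""
    return pixel_offset * mm_per_pixel

# A pupil midpoint found 50 pixels off the optical axis:
offset_mm = pixels_to_mm(50)                    # 2.5 mm with the assumed scale
print(round(angular_offset_deg(offset_mm), 2))  # ~11.54 degrees
```

For small offsets the arcsine is nearly linear, which is why the text can also speak of a simple rule of three.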
The viewing direction can then be adapted, for example, by interpolation of the deviations measured previously.

It is possible, after determining the viewing direction of the user's eye, to vary the position of the targeting device on the basis of the determined distance and/or angular offset in such a way that the optical axis of the targeting device and the viewing direction of the user's eye coincide. This corresponds, for example, to the fine setting to a measurement target sighted by the user through a telescope. In this case, the change in the position of the targeting device can be effected around the midpoint of the user's eye, in order that the user is not distracted by a variation of the representation of the sighted target that takes place as a result of the movement of the targeting device.

A reticle can advantageously be provided in the targeting device. The reticle can be fixedly assigned to the optical axis of the targeting device, or it can be embodied as a movable reticle. However, a fixed and one or more movable reticles can also be provided. According to the determined viewing direction of the user's eye, the position of the respective reticle can then be varied in such a way that the viewing direction of the user's eye is directed through the reticle.

It is advantageously possible to make the variation speed in the case of the alignment change or the movement of the reticle dependent on the determined distance and/or angular offset. In this regard, it is possible, in the case of relatively large distances/angular offsets, to carry out the change in the position of the targeting device and/or of the reticle(s) with a higher speed than in the case of a small distance/angular offset.

It is advantageously possible to output a control signal, for example as a result of blinking of the user's eye.
For this purpose, the viewing direction determining device is designed, for example, in such a way that it interprets as a control signal a specific number of eye images in which a pupil midpoint cannot be determined. A closed eye, or the process of closing and opening the eye, can also be recognized as blinking and interpreted as a control signal, in particular by means of image recognition systems with a pattern recognition of a mask.

Movements of the eyeball of the user's eye (eye movements) or combinations of eye movements can also be interpreted as control signals. By means of a calibration device, for example before the beginning of the measurement work, various time durations and/or repetitions of blinking with the user's eye, or specific eye movement combinations, can be allocated to different control commands. By way of example, the beginning and/or the end of a dynamic change in the alignment of the targeting device can be initiated without contact by means of such a control command.

Alternatively or additionally, the representation of the target can be superimposed with or replaced by optically represented control commands. By way of example, the control commands can be visible to the user in the form of pictograms or else in the form of text. In this case, the viewing direction determination can be designed such that the determined distance between the pupil midpoint and the optical axis (for example corresponding to the midpoint axis of the representation) is taken as a basis for identifying which of the optically represented control commands is sighted by the user, i.e. which control command lies in the viewing direction. This relevant control command can then be activated by means of blinking, for example. In this case, it can advantageously be possible to identify the currently sighted control command by a color variation and/or illumination.

It is likewise possible to provide a measuring device according to the invention with a target point identification device.
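The blink-based control signal described above - a specific number of consecutive eye images without a detectable pupil midpoint - can be sketched as a small state machine. The class name and the frame threshold are assumptions for illustration, not taken from the patent:

```python
class BlinkDetector:
    """Interpret a run of eye images without a detectable pupil as a blink.

    frames_per_blink: how many consecutive pupil-less frames count as a
    deliberate blink, distinguishing it from single-frame detection noise.
    """
    def __init__(self, frames_per_blink=5):
        self.frames_per_blink = frames_per_blink
        self.closed_frames = 0

    def update(self, pupil_midpoint):
        """Feed one frame's detection result (None = no pupil found).

        Returns True exactly once per blink, on the first frame after the
        eye reopens."""
        if pupil_midpoint is None:
            self.closed_frames += 1
            return False
        blinked = self.closed_frames >= self.frames_per_blink
        self.closed_frames = 0
        return blinked

detector = BlinkDetector(frames_per_blink=3)
frames = [(1, 2), None, None, None, (1, 2)]   # eye closed for three frames
print([detector.update(f) for f in frames])   # [False, False, False, False, True]
```

Different blink durations or repetition counts, as allocated by the calibration device in the text, would simply map different `closed_frames` runs to different control commands.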
The target point identification device is able to identify possible target points in the representation of the sighted target which lie in the viewing direction of the user's eye, and to mark them, for example by illumination or by a color change. In response to the outputting of a control signal, for example as a result of blinking once, the target points can be targeted, measured and stored together with their data such as distance, direction, height, etc. in a storage device. With the use of a plurality of reticles, a plurality of target points can correspondingly be marked in succession.

In a method according to the invention for controlling a measuring device which represents a target sighted by a sighting device, an image of a user's eye (eye image) is acquired, in particular continuously. A viewing direction of the user's eye can be derived on the basis of specific features of the eye image. This can be, for example, a distance between a pupil midpoint of the user's eye and the optical axis of the targeting device, or a position of the pupil or of the iris in relation to the extent of the eye. The determined distance is taken as a basis for varying the position of the optical axis of the targeting device or the position of a movable reticle in the representation of the sighted target.
As a result, the optical axis of the targeting device coincides with the viewing direction of the user's eye, or the viewing direction of the user's eye is directed through the reticle.

On the basis of the distance between the pupil midpoint and the optical axis, on the one hand, and a diameter of the eyeball, on the other hand, it is possible to determine an angle between the viewing direction of the user's eye and the optical axis, i.e. the angular offset. Alternatively or additionally, it is possible, by means of successively marking individual pixels and the distance between the pupil midpoint and the optical axis that is respectively determined in this case, to determine a deviation of the determined viewing direction of the user's eye from the actual viewing direction of the user's eye.

The alignment of the targeting unit can advantageously be changed with a speed that is variable depending on the distance between the pupil midpoint of the user's eye and the optical axis of the targeting device.

The representation of the target can be superimposed with or replaced by optically represented control commands. With the aid of the viewing direction determining device, a sighted control command from among the optically represented control commands can be selected on the basis of the determined distance between the pupil midpoint and the optical axis of the targeting device.

Advantageously, a target point identification device can identify possible target points lying near the viewing direction in the representation of the sighted target and can mark them. In response to the outputting of a control signal, the corresponding target point can be targeted by the targeting device, measured and stored together with its associated data such as direction, distance, height, etc.

The European application EP 11150580.6 describes a concept for a dynamic targeting function which comprises control by touching a touch-sensitive surface, wherein said surface is divided into sectors formed by a virtual line grid.
This function, too, can be controlled by the eye without contact in accordance with the present invention.

The measuring device according to the invention and the method according to the invention are described in greater detail purely by way of example below on the basis of concrete exemplary embodiments illustrated schematically in the drawings, and further advantages of the invention are also discussed. Schematically, in the figures:

Figure 1 shows a geodetic measuring appliance according to the invention designed as a total station;

Figure 2a shows a first embodiment of an optical construction of a targeting device of a geodetic measuring appliance according to the invention;

Figure 2b shows a second embodiment of an optical construction of a targeting device of a geodetic measuring appliance according to the invention;

Figure 3 shows a construction of a measuring device according to the invention;

Figure 4a shows an eye image and a pattern recognition of the user's eye and of the pupil midpoint by means of masks;

Figure 4b shows an eye image with a user's eye recorded off-center;

Figure 4c shows an eye image with a closed user's eye;

Figure 5 shows an exemplary illustration for determining an angular offset or a viewing direction on the basis of a pupil distance from an optical axis;

Figure 6a shows a first example of an alignment of a targeting device on the basis of eye images of a user's eye;

Figure 6b shows a second example of an alignment of a targeting device on the basis of eye images of a user's eye;

Figure 7 shows a flow chart for the movement of a telescope;

Figure 8 shows an example of a further embodiment, in which a reticle is provided in a displaceable manner in an image of a target sighted by the targeting device;

Figure 9 shows the image from figure 8, into which control commands are inserted; and

Figure 10 shows the functioning of a dynamic targeting functionality on the basis of an example.

Figure 1 shows a geodetic measuring appliance 1 according to the invention that is designed as
a total station and serves for measuring horizontal angles, vertical angles and distances with respect to a target object situated at a distance.

The total station is arranged on a stand 12, wherein a base 11 of the total station is directly and fixedly connected to the stand. The main body of the total station, which is also designated as the upper part 10, is rotatable about a vertical axis relative to the base 11. In this case, the upper part 10 has a support 14, formed e.g. by two columns, a sighting device 5, for example a telescope, which is mounted in a manner rotatable about the horizontal tilting axis between the columns, and an electronic display control unit 15. The display control unit 15 can be designed in a known manner for controlling the measuring appliance 1 and for processing, displaying and storing measurement data.

The sighting device 5 is arranged on the support 14 in a manner rotatable about a horizontal tilting axis and can thus be pivoted or tilted horizontally and vertically relative to the base 11 for alignment with a target object. Motors (not illustrated here) are present for carrying out the pivoting and tilting movements necessary for the alignment of the sighting device. The sighting device 5 can be embodied as a common sighting device structural unit, wherein an objective, a focusing optical system, a coaxial camera sensor, the eyepiece 13 and a graphics processor can be integrated in a common sighting device housing. By means of the sighting device 5, the target object can be targeted and the distance between the total station and the target object can be acquired in an electrical-sensor-based manner. Furthermore, provision is made of means for the electrical-sensor-based acquisition of the angular alignment of the upper part 10 relative to the base 11 and of the sighting device 5 relative to the support 14.
These measurement data acquired in an electrical-sensor-based manner are fed to the display control unit 15 and processed by the latter, such that the position of the target point relative to the total station can be determined, optically displayed and stored by the display control unit 15.

Up to this point, the measuring appliance is known from the prior art. In addition, according to the invention, a camera 4 (not illustrated here) oriented toward the user end of the targeting device 5 is provided as eye image acquisition device according to the invention, which camera can record images of the user's eye 3.

Figure 2a shows an optical construction of a targeting device 5 of a geodetic measuring appliance 1 according to the invention. An optical target axis 6 is defined by means of an objective unit 21 and the associated beam path from a target or object to be sighted through the objective unit 21, which target axis is to be aligned with the target or object to be observed. The objective unit 21 can be constructed with a plurality of lenses. A camera sensor 22 having pixel-defined resolution serves for acquiring a camera image of an object or target to be sighted or a target mark.

A beam path 23 extends from the objective unit 21 to the camera sensor 22, which beam path can be folded with an optical deflection element 24, as illustrated in figure 2a, or alternatively can be embodied in continuously rectilinear fashion. The optical deflection element 24 can be embodied, for example, as a beam splitter or partly transmissive mirror, such that one part, for example 50%, of the light guided in the beam path 23 as far as the deflection element 24 is directed onto the camera sensor 22 and another part can propagate further in the direction of the target axis to an eyepiece unit 13 for an observer. In the direction of propagation of the light acquired by the objective unit 21, an adjustment or alignment aid 26, for example a reticle, can be arranged fixedly upstream of the eyepiece.
In addition, a focusing element 27 that is variable in terms of its positioning along the axis 6 and serves for varying the focusing position for the light acquired by the objective unit 21 can be arranged in the beam path between the objective unit 21 and the optical deflection element 24. The focusing element 27 can be embodied with a plurality of lenses. Advantageously, for the focusing element 27, provision is made of a stable, precisely reproducible positioning for image acquisition of objects arranged at a large distance, with a de facto parallel beam path to the objective unit 21.

Optionally, the arrangement can additionally be equipped with means for an electro-optical distance measurement. For this purpose, as illustrated in figure 2a, a measurement radiation source 31 (e.g. emitting in the near-infrared spectral range not visible to the human eye) can be used, the measurement radiation of which is directed via an optical deflection element 32, for example a mirror, onto a further optical deflection element 33, for example a dichroic beam splitter which is reflective in the spectral range of the light source 31 and transmissive in the rest of the spectral range, and from there further through the objective unit 21 to a target mark to be sighted. In this optional embodiment of an optical construction of a targeting device of the geodetic measuring appliance according to the invention, part of the light having the wavelength of the light source 31 that is reflected diffusely or directionally at the target and is acquired by the objective unit 21 passes through the deflection element 33 and propagates further as far as a dichroic beam output coupler 34, which is designed to be reflective to light having the emission wavelength of the light source 31 and transmissive to light in the rest of the spectral range. The measurement light reflected back from the dichroic beam output coupler 34 is directed via the deflection element 33 to a detector 35 for an electro-optical distance measurement.
By way of example, the light source 31 can be pulsed and the distance measurement can be effected in a known manner by determining pulse propagation times or phase differences between emitted light and reflected light.

Alternatively, the camera sensor 22 can also be arranged on the optical target axis 6 (not illustrated here). The beam path from the objective unit along the optical target axis 6 ends with the camera sensor 22 in this arrangement. The camera sensor is then connected to evaluation means, which can output the currently acquired image 2 of the camera sensor 22, if appropriate with superimposed target mark patterns, to a display, if appropriate in such a way that an observer is given the impression of seeing a direct "telescope imaging" of a viewed object, target or target pattern through the eyepiece 13. Such a system is already described in the publication document WO 2010/092087 A1.

Up to this point, the targeting device is known from the prior art. According to the invention, a camera 4 is additionally provided, which camera can record images of the user's eye 3 via a deflection element 40 and through the eyepiece 13.

Figure 2b shows a second embodiment of an optical construction of a targeting device 5 of a geodetic measuring appliance 1 according to the invention. In contrast to the embodiment illustrated in figure 2a, the camera 4 is not fitted in the interior but rather on the outer side of the targeting device 5 and directly records images of the user's eye 3. For better acquisition of the whole eye 3 it is also possible to use a further camera 4' or a multiplicity of cameras which can jointly record images of the user's eye 3. By using a plurality of cameras 4, 4', it is also possible to acquire three-dimensional images of the user's eye 3 and to deduce a viewing direction, for example on the basis of the curvature of the cornea of the user's eye 3.
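The two known distance measurement principles mentioned above - pulse propagation time and phase difference - reduce to simple formulas. The following is a sketch under textbook assumptions (vacuum speed of light, no atmospheric correction), not the appliance's actual implementation:

```python
import math

C = 299_792_458.0   # speed of light in m/s

def distance_from_pulse(round_trip_time_s):
    """Pulse method: the light travels to the target and back, so the
    one-way distance is half the round-trip path."""
    return C * round_trip_time_s / 2.0

def distance_from_phase(phase_rad, modulation_freq_hz):
    """Phase method: the phase shift of an intensity-modulated beam maps
    onto distance within the unambiguous range C / (2 * f_mod)."""
    unambiguous_range = C / (2.0 * modulation_freq_hz)
    return unambiguous_range * phase_rad / (2.0 * math.pi)

# A pulse returning after 66.7 ns corresponds to roughly 10 m:
print(round(distance_from_pulse(66.7e-9), 2))
# A half-cycle phase shift at 15 MHz modulation corresponds to roughly 5 m:
print(round(distance_from_phase(math.pi, 15e6), 2))
```

Real phase-based rangefinders combine several modulation frequencies to resolve the ambiguity beyond the unambiguous range.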
The camera 4 or the cameras can also be provided outside the targeting device 5, for example in a peripheral device of the measuring appliance 1, such as spectacles or a handheld operating unit.

Figure 3 shows an exemplary schematic construction of a measuring device according to the invention. In figure 3, the measuring device is designed in the form of a total station 1 comprising a telescope 5. Reference sign 2 in figure 3 schematically represents an image of a measurement environment, which image is created by the telescope 5 and is an example of an image according to the invention of a target sighted by the targeting device. The telescope 5 of the total station 1 forms a targeting device according to the invention.

Reference sign 3 in figure 3 represents a human eye (user's eye) that views the image 2 through the eyepiece 13 of the telescope 5. Depending on the construction of the telescope 5, the created image 2 of the measurement environment can be a measurement environment viewed directly through the optical system of the telescope 5, or can be projected onto a display, such as an LCD screen, for example. Reference sign 4 in figure 3 denotes a camera that records, in particular continuously, eye images of the user's eye 3. In this case, the camera 4 serves as the eye image acquisition device according to the invention. The camera 4 is aligned with the optical axis of the telescope 5 of the total station 1 in such a way that the center of each acquired eye image corresponds to the optical axis of the telescope 5 of the total station 1.

If the user looks exactly in the direction of the optical axis 6 of the telescope 5, which is represented by a small reticle in the image 2, the center of the pupil of the eye 3 lies exactly at the midpoint of the eye image currently acquired by the camera 4.
This position corresponds to the position of the optical axis 6 in the image 2.

Figure 4a illustrates one example of an eye image which is acquired by a camera and which is centered on the optical axis 6. The user's eye 3 can be recognized as such by image recognition and pattern recognition methods by means of a mask 38. Within the user's eye 3 recognized as such, the pupil midpoint and its position in relation to the extent of the eye are determined by means of a further mask 39 (a mask for recognizing the iris is illustrated here). For the image or pattern recognition, the user's eye does not have to be located centrally on the optical axis 6, but rather, as illustrated in figure 4b, can also lie beyond the optical axis. If the eye is closed, as indicated in figure 4c, no user's eye 3 is recognized, or the fact that the eye is closed can be recognized by means of a further mask (not illustrated).

If the user moves his/her eye 3 in order to view more accurately an article or an object in the image 2 beyond the optical axis, or to sight it using the eye 3, the eyeball of the user's eye 3 performs a rotational movement around the midpoint M of the eyeball in order to change the viewing direction of the eye 3 toward the sighted point. In this case, the viewing direction substantially corresponds to the axis of vision of the human eye 3, which is defined essentially by the pupil P, on the one hand, and the fovea centralis situated opposite the pupil P on the inner side of the eyeball, on the other hand. The fovea centralis is that part of the eye 3 which is responsible in particular for sharp color vision. The axis of vision therefore runs approximately from the fovea centralis via the midpoint M of the eyeball through the midpoint of the pupil P of the eye 3. The axis of vision in this case corresponds to the viewing direction.

The described movement of the eyeball results in a variation of the pupil position.
This variation is acquired by an eye image acquisition device which is provided in the camera 4 and which is able to determine the distance between the pupil midpoint and the optical axis, or the position of the pupil in relation to the extent of the eye, in eye image pixels. The rotational angle of the eye 3 can be derived from the position of the pupil midpoint in relation to the extent of the eye. Since the distance between the pupil and the eye image plane of the eye image acquired by the camera 4 is known, and the diameter of the human eyeball, with a value of approximately 2.5 cm, is likewise known, it is also possible to determine a value, for example an angle α, by which the current viewing direction deviates from the optical axis. This possibility is illustrated in figure 5.

Figure 5 illustrates an eye 3, the current viewing direction of which is represented by a solid drawn line. The position of the pupil P and the viewing direction exactly straight ahead forward are represented respectively in a hatched manner and by means of a dashed line in figure 5. Since the distance between the pupil P and the midpoint M of the eyeball (approximately 12.5 mm), the distance between the pupil P and the image plane BE of the eye image, and the distance between the pupil P and the exact center corresponding to the viewing direction directly straight ahead forward are known, the point of intersection S between the current viewing direction and the image plane BE of the eye image can be determined, e.g.
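The geometric relationship described above can be sketched in a few lines of Python. This is an illustrative calculation only, not part of the disclosed embodiment: the pixel scale mm_per_px is an assumed camera parameter, and the pupil is modeled as traveling on a sphere of radius 12.5 mm (the pupil-to-midpoint-M distance given in the description), so the lateral pupil offset d yields the deviation angle α = arcsin(d / r):

```python
import math

EYEBALL_RADIUS_MM = 12.5  # distance from pupil P to eyeball midpoint M (per the description)

def viewing_angle_deg(pupil_offset_px: float, mm_per_px: float) -> float:
    """Deviation angle alpha between the current viewing direction and the optical axis.

    pupil_offset_px: distance of the pupil midpoint from the eye-image center.
    mm_per_px: known scale of the eye image (an assumed, camera-dependent value).
    """
    d_mm = pupil_offset_px * mm_per_px
    # Clamp to the valid arcsin domain to guard against measurement noise.
    ratio = max(-1.0, min(1.0, d_mm / EYEBALL_RADIUS_MM))
    return math.degrees(math.asin(ratio))
```

With a centered pupil the angle is zero; the angle grows with the pupil offset until the geometric limit of the model is reached.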
by means of the rule of three. Consequently, on the basis of the distance between the pupil midpoint and the center of the eye image acquired by the camera 4, it is possible for the deviation of the current viewing direction from the optical axis 6, for example expressed by the angle α, and thus the viewing direction of the user's eye 3, to be determined sufficiently accurately.

Since the viewing direction of the user's eye 3 in relation to the optical axis 6 of the telescope 5 of the total station 1 can be determined sufficiently accurately, it is likewise possible, by means of a drive device (not illustrated), to adjust the telescope 5 in such a way that its optical axis 6 is aligned with the target that the user views through the eyepiece 13, in the special case the optical axis 6 coinciding with the viewing direction of the user's eye 3. Since, in accordance with figure 3, the reticle is fixedly assigned to the position of the optical axis 6, the viewing direction of the user's eye 3 passes through the reticle after this adjustment.

In this case, it is possible for the telescope 5 to be adjusted around the midpoint M of the eyeball of the user's eye 3 in order to preclude a variation of the image 2 for the user as a result of a displacement of the telescope 5 or the like.

Figure 6a schematically shows a user's eye 3 in an initial position with its viewing direction aligned exactly straight ahead forward, and also an eye image of this user's eye 3 and a telescope 5 in an initial position in which the optical axis 6 is aligned with the target viewed by the user through the eyepiece 13. Figure 6b illustrates the same user's eye 3 with an altered viewing direction.
An altered position of the pupil P is registered in the acquired eye image. Depending on the registered position of the pupil, the viewing direction determining device determines the viewing direction of the user's eye 3 or, as illustrated in figure 4a, derives it by means of the masks 38, 39 for eye and pupil or iris recognition from the position of the pupil P within the eye 3. A drive device (not illustrated) aligns the telescope 5 depending on the viewing direction determined or ascertained in this way, with the result that the optical axis 6 of the telescope 5 is directed at the target viewed by the user through the eyepiece 13. In this case, the telescope 5 can be aligned depending on a transmission ratio that is determined, with the inclusion of a current magnification factor, in the context of a calibration.

Figure 7 shows a flowchart describing essential method steps for carrying out the exemplary embodiment described above.

After the position of the pupil has been measured and the distance between the midpoint of the pupil and the position of the optical axis has been determined, the angular offset is calculated by means of the rule of three, as has been described above. If the value determined here, for example the angle α, is less than a predefined threshold value, it is determined that the actual viewing direction corresponds, to within the angular offset, to the direction of the optical axis. No alteration of the position of the telescope is performed in this case.

If the calculated angular offset is greater than the predefined threshold value, it is determined that the telescope should be moved, in particular until the optical axis 6 of the telescope 5 is aligned with the target viewed by the user through the eyepiece 13.

Figure 8 illustrates a second exemplary embodiment of the present invention. The measuring device of the second embodiment is provided with a reticle 7, the position of which is not fixedly linked to the optical axis; rather, said reticle is movable.
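The decision logic of the figure 7 flowchart can be sketched as a single control step. This is a minimal sketch, not the disclosed implementation: the threshold value of 0.5° is an assumption (the description only states that a predefined threshold exists), and the angle is computed from the same spherical-eyeball model as above:

```python
import math

ANGLE_THRESHOLD_DEG = 0.5   # assumed tolerance; the description only says "predefined"
EYEBALL_RADIUS_MM = 12.5    # pupil-to-midpoint-M distance from the description

def control_step(pupil_offset_px: float, mm_per_px: float) -> float:
    """One pass of the flowchart: pupil offset -> angle -> threshold decision.

    Returns the correction angle (degrees) the drive device should apply,
    or 0.0 when the viewing direction already coincides with the optical axis.
    """
    d_mm = pupil_offset_px * mm_per_px
    alpha = math.degrees(math.asin(max(-1.0, min(1.0, d_mm / EYEBALL_RADIUS_MM))))
    if abs(alpha) < ANGLE_THRESHOLD_DEG:
        return 0.0      # below threshold: no alteration of the telescope position
    return alpha        # above threshold: realign the telescope by alpha
```

In operation this step would be repeated on each newly acquired eye image until the returned correction is zero.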
This can be achieved, for example, by the measurement environment not being viewed directly through the optical system of a telescope, but rather being projected as an image onto a display, such as an LCD screen, for example. If the user views this image, which, according to the invention, corresponds to the image of a target sighted by the targeting device, on the LCD screen, the position of said user's pupil P is likewise acquired continuously by the camera 4. The position of the reticle 7 is adapted to the determined viewing direction by means of a control device, without the need to change the alignment of the targeting device itself. This can be done by the reticle 7 being inserted into the image represented on the LCD screen, or by other known means.

In figure 8, the user's eye is directed at the apex of the church tower 8, and the position of the reticle 7, the original position of which in the center of the image is represented by dotted lines, is correspondingly shifted to the apex of the church tower 8 after the current viewing direction has been determined.

As a variant of the second embodiment, it is possible to analyze the representation of the target sighted by the targeting device by means of image extraction means and to determine objects of interest, such as the church tower 8, for example, as the measurement target. In this case, it suffices if the viewing direction of the user's eye 3 is directed in proximity to the church tower 8.
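Shifting the movable reticle 7 to the determined viewing direction amounts to a simple mapping from gaze angle to display pixels. The following is an illustrative sketch under assumed parameters (px_per_deg as the display scale; a conventional screen coordinate system with y growing downward), not the control device of the embodiment:

```python
def reticle_position(alpha_h_deg: float, alpha_v_deg: float,
                     px_per_deg: float, width: int, height: int) -> tuple:
    """Translate the reticle from the display center toward the gaze point.

    alpha_h_deg / alpha_v_deg: horizontal/vertical deviation of the viewing
    direction from the display center, in degrees.
    px_per_deg: assumed display scale. Result is clamped to the display bounds.
    """
    x = width / 2 + alpha_h_deg * px_per_deg
    y = height / 2 - alpha_v_deg * px_per_deg   # screen y axis points downward
    return (min(max(0, round(x)), width - 1),
            min(max(0, round(y)), height - 1))
```

With zero deviation the reticle stays at the image center, corresponding to the dotted original position in figure 8.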
Since the church tower 8 is identified as a possible measurement target, in accordance with the variant of the second embodiment the reticle 7 is positioned at the apex of the church tower 8. What is advantageous about this variant is that only a low accuracy is required when determining the viewing direction of the user's eye 3, as a result of which computing time, memory resources and so on can be saved.

In accordance with the variant of the second embodiment, it is necessary to confirm that the target selected by the measuring device - the church tower 8 in the present case - actually serves as the measurement target. In order to determine the possible target definitively as the measurement target, it is possible to output control commands, in particular by means of a movement of the eyelids (blinking), since the camera 4 is preferably also able to identify a sequence of eye images in which the pupil P or other eye information cannot be acquired as blinking, and to interpret this as a control command. In this case, it is possible entirely discretionarily to allocate different sequences of blinking to different control commands, as a result of which completely contactless operation is possible with the measuring device according to the invention.

Control commands can also be assigned to specific movements of the eyeball (eye movements). These can include, for example, combinations of movements of the pupil toward the left, toward the right, upward or downward. Preferably, diverse control commands are predefined by the user at the beginning of operation and are learned by the viewing direction determining device.
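The blink-based command scheme above can be sketched as follows. Frames in which no pupil is detected read as "closed"; consecutive closed runs become blink events, and the blink count selects a command. The command mapping and the minimum closed-frame count are illustrative assumptions, since the description leaves the allocation of blink sequences to commands discretionary:

```python
# Hypothetical example mapping of blink counts to control commands.
BLINK_COMMANDS = {1: "open menu", 2: "confirm target", 3: "cancel"}

def count_blinks(pupil_visible, min_closed_frames: int = 2) -> int:
    """Count blinks in a series of per-frame pupil-detection flags."""
    blinks, closed_run = 0, 0
    for visible in pupil_visible:
        if not visible:
            closed_run += 1
        else:
            if closed_run >= min_closed_frames:
                blinks += 1          # a completed run of closed frames = one blink
            closed_run = 0
    if closed_run >= min_closed_frames:
        blinks += 1                  # sequence ended while the eye was closed
    return blinks

def command_for(pupil_visible):
    """Interpret a frame sequence as a control command, or None."""
    return BLINK_COMMANDS.get(count_blinks(pupil_visible))
```

Requiring several consecutive closed frames distinguishes deliberate blinks from single dropped detections.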
By way of example, the beginning and/or the end of a dynamic change in the alignment of the targeting device can be initiated by means of such a contactless control command.

In accordance with a further preferred variant of the second embodiment, it is possible to insert pictograms or text fields corresponding to different control commands into the representation of the target sighted by the targeting device. In figure 9, these are designated by way of example by A1, B1, C1 and by A2 to A4. If the control commands are inserted, the camera 4 is able to identify the viewed control command depending on the viewing direction of the user's eye 3 and then to carry out said control command on the basis of an actuation such as blinking, for example. The control command C1, for example, is currently marked in the illustration in figure 9.

However, the variants described on the basis of the example of the second embodiment are not restricted to this second embodiment, but rather can, for example, also be applied in a telescope 5 in accordance with the first embodiment. For this purpose, an image of the reticle 7, of the pictograms of the control commands, etc. is projected into the image plane of the telescope 5.

Figure 10 illustrates a further variant of the second embodiment of the invention with a dynamic targeting function. An image 2 generated by the telescope, or a display, is subdivided into a virtual line grid 9 corresponding to digitized distances and directions from the target image point or the midpoint of the reticle 7 to groups of display points. In the embodiment in accordance with figure 10, the virtual line grid 9 is formed from concentric circular lines 19 around the midpoint of the reticle and radial lines 17 which proceed from the midpoint of the reticle and intersect said circular lines, such that the display is thereby divided into sectors 18 - each containing a group of a plurality of display points.
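Identifying the viewed control command reduces to hit-testing the gaze point against the inserted pictogram fields. The field layout below is entirely hypothetical (the description does not specify positions for A1, B1, C1, etc.); only the hit-testing scheme is illustrated:

```python
# Hypothetical screen regions for the inserted command fields,
# as (x, y, width, height) in display pixels.
COMMAND_FIELDS = {
    "A1": (0, 0, 100, 40),
    "B1": (110, 0, 100, 40),
    "C1": (220, 0, 100, 40),
}

def viewed_command(gaze_xy):
    """Return the name of the command field the gaze point falls in, if any."""
    gx, gy = gaze_xy
    for name, (x, y, w, h) in COMMAND_FIELDS.items():
        if x <= gx < x + w and y <= gy < y + h:
            return name
    return None
```

The returned field would be highlighted as "marked" (as C1 is in figure 9) and executed only upon an actuation such as blinking.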
In this case, the sectors 18 each correspond to concrete values for an alignment change direction and alignment change speed when changing the alignment of the telescope 5. That is to say that the display points lying within a sector are in each case assigned the same concrete value for the alignment change direction and alignment change speed.

The alignment of the telescope 5 is changed in a vertical and, in particular simultaneously, horizontal direction, in the direction of the different marked image point 16 - located in a sector - corresponding to a different spatial point to be sighted, either for as long as said different image point 16 is continuously sighted by the user's eye 3 or, after the sighted image point 16 has been marked, for example by blinking, until the attainment of the desired alignment or the cancellation of the marking. Upon cancellation of the marking, for example as a result of renewed blinking, the movement of the telescope 5 is terminated. However, a user can at any time mark or sight a different display point in a different sector 18 using said user's eye 3 in order to instigate a change in the alignment of the telescope 5 in accordance with the direction and speed assigned to this sector for changing the alignment.

Sectors situated further outward, which, as a result of their position, have a greater distance from the anchor display point (i.e. the midpoint of the reticle), in this case correspond to higher alignment change speeds, and sectors situated further inward, which have a smaller distance from the anchor display point, in this case correspond to lower alignment change speeds.
As the distance between the respective sectors and the anchor display point increases, therefore, the alignment change speed respectively assigned to the sectors also increases. In this case, the sectors defined by the outermost circular line can also be assigned the highest movement speed (100%), and a marking of the anchor display point (that is to say of the midpoint of the reticle) can mean a movement speed of 0%.

Each sector 18 furthermore corresponds to a specific - that is to say assigned thereto - alignment change direction of the telescope 5 (azimuthal and elevational). When an image point, e.g. to the right of the midpoint of the reticle, is marked or sighted by the eye 3, the telescope 5 is moved toward the right, changing the alignment in a horizontal direction, until the image point mentioned above is no longer marked or sighted (for example because another display point is now marked - and the targeting unit is then moved further with the direction and speed assigned to said display point - or because no point is marked anymore - and the movement of the targeting unit is then stopped). The situation illustrated in figure 10 (with the point illustrated here as the currently marked display point 16) corresponds, for instance, to a change in the alignment of the telescope 5 with an alignment change direction upward obliquely toward the right (that is to say a direction change component pointing upward and a direction change component pointing rightward, wherein the component pointing upward is chosen to be somewhat greater than the component pointing rightward), and also with an average movement speed.
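The sector-to-command mapping described above can be sketched as follows. The grid dimensions (five rings, eight radial wedges) are assumed for illustration; the description only requires concentric circles and radial lines, with the ring index setting the speed (anchor point 0%, outermost ring 100%) and the wedge setting the direction:

```python
import math

N_RINGS, N_WEDGES = 5, 8   # assumed grid: 5 concentric circles, 8 radial lines

def sector_command(dx: float, dy: float, max_radius: float):
    """Map a marked display point (offset from the reticle midpoint) to the
    alignment change assigned to its sector.

    Returns (speed_fraction, direction_deg). The ring containing the point
    sets the speed; the radial wedge sets the alignment change direction
    (taken here as the wedge center angle).
    """
    r = math.hypot(dx, dy)
    if r == 0:
        return 0.0, None                       # anchor display point: no movement
    ring = min(N_RINGS, 1 + int(r / max_radius * N_RINGS))
    speed = ring / N_RINGS                     # outer rings -> higher speed
    azimuth = math.degrees(math.atan2(dy, dx)) % 360
    wedge = int(azimuth // (360 / N_WEDGES))
    direction = wedge * (360 / N_WEDGES) + 180 / N_WEDGES
    return speed, direction
```

Because every display point in a sector maps to the same (speed, direction) pair, the drive command stays constant until the user marks a point in a different sector.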
In particular, for this purpose the rotary drives can be driven in such a way that the targeting unit pivots upward relative to the base at 60% of the maximum pivoting speed that can be provided, and rotates rightward about the vertical axis at 40% of the maximum rotational speed that can be provided.

In particular, in this case the line grid 9 is established in such a way that a multiplicity of sectors 18 are defined, in particular at least approximately thirty sectors, specifically at least approximately fifty sectors.

As has been described above, the invention makes it possible to control a measuring device, such as a total station, for example, by one or both eyes or the pupils of the user. Accordingly, it is no longer necessary to touch the measuring instrument, as a result of which vibrations and resultant disturbances of the measurement work can be avoided. Apart from the direct sighting of target points and the setting of the measuring instruments, it is also possible to perform control commands that are represented in the targeting device of the measuring device. Constantly changing between looking through the eyepiece in order to sight a target, looking at input means to be operated manually in order to give control commands, and once again looking through the eyepiece is accordingly obviated as well.

Said control commands can be inserted into an image of the measurement environment, said image being projected or represented on a display such as e.g. an LCD screen, but can also be inserted into a telescope or the like by means of a dedicated display. The selection of different menu points can be carried out, for example, by opening and closing (blinking) of the eye.
It is likewise possible to achieve said control commands by means of a remote control, such as a radio remote control, for example.

Preferably, it can likewise be possible to temporarily switch off specific functions or the entire contactless control, in order to avoid excessive fatigue of the user and resultant erroneous operation as a result of inadvertent eye movements. By way of example, it is possible, alternatively or additionally, to activate or bring about a control by means of the user's voice. Preferably, the user's voice can serve to activate and deactivate the control by the user's eye. Alternatively or additionally, it is also possible to use a radio remote control and/or switches or buttons on the measuring device.

In order to increase the accuracy, the measuring device can have a calibration mode in which the user is encouraged to focus on a series of successively identified image points. The measurement of the pupil position that takes place simultaneously in this case makes it possible to ascertain deviations of the determined viewing direction from the actual viewing direction and thus to compensate for anatomical characteristics of the user's eye.

Control signals, such as the blinking of the user's eye, for example, can likewise be adapted by means of a corresponding calibration.

By means of the movement of the reticle as described on the basis of the second embodiment, the user can be prevented from being distracted on account of the fact that the user's eye is very close to the telescope.

Different control commands can be given by blinking of the user's eye.
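One way to realize the calibration mode described above is a per-axis least-squares fit: the user fixates known image points, the measured pupil positions are compared with the expected ones, and a gain and offset per axis compensate anatomical characteristics. The linear correction model is an assumption for illustration; the description does not prescribe a particular model:

```python
def fit_axis(measured, expected):
    """Least-squares fit of expected ~= gain * measured + offset for one axis."""
    n = len(measured)
    mx = sum(measured) / n
    my = sum(expected) / n
    var = sum((m - mx) ** 2 for m in measured)
    cov = sum((m - mx) * (e - my) for m, e in zip(measured, expected))
    gain = cov / var
    return gain, my - gain * mx

def calibrate(samples):
    """Build a correction function from calibration samples.

    samples: list of ((meas_x, meas_y), (true_x, true_y)) pairs collected
    while the user fixates the successively identified image points.
    """
    gx, ox = fit_axis([m[0] for m, _ in samples], [t[0] for _, t in samples])
    gy, oy = fit_axis([m[1] for m, _ in samples], [t[1] for _, t in samples])
    return lambda x, y: (gx * x + ox, gy * y + oy)
```

The returned function is then applied to every subsequently measured pupil position before the viewing direction is derived.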
By way of example, closing and opening the eye once can result in the menu being inserted, opening and closing the eye twice can result in selection of the menu point currently sighted or the initiation of the measurement, and closing and opening the eye three times can bring about a termination function.

A pointer (mouse pointer) can also be inserted into the representation of the sighted target, said pointer following the pupil movement.

However, a target point can also be selected by being viewed for a relatively long period of time, such as two seconds, for example. The control is then able to display the selected target point by illumination or a color change. The displayed selection can then be confirmed by the user, for example by blinking.

Preferably, it can also be possible to control the measuring device dynamically. This means that in the case of a relatively large deviation of the pupil from the optical axis, the final speed during the change in position of the targeting device or the movement speed of the reticle is higher than that in the case of a small distance.

It goes without saying that these illustrated figures merely illustrate possible exemplary embodiments schematically. The various approaches can likewise be combined with one another and also with methods and appliances from the prior art.
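The dwell-time selection described above (a target counts as selected once the gaze has stayed near it for the holding period, two seconds in the example) can be sketched as follows. The spatial tolerance of 15 pixels is an assumed value; the gaze samples are assumed to arrive as timestamped points:

```python
import math

DWELL_S = 2.0        # holding period from the example above
TOLERANCE_PX = 15.0  # assumed spatial tolerance for "still looking at the same point"

def dwell_selected(samples):
    """Return the selected gaze point, or None if no dwell was long enough.

    samples: list of (timestamp_s, (x, y)) gaze points in time order.
    The dwell timer restarts whenever the gaze leaves the tolerance radius.
    """
    anchor_t, anchor_p = None, None
    for t, (x, y) in samples:
        if anchor_p is None or math.hypot(x - anchor_p[0], y - anchor_p[1]) > TOLERANCE_PX:
            anchor_t, anchor_p = t, (x, y)   # gaze moved: restart the dwell timer
        elif t - anchor_t >= DWELL_S:
            return anchor_p                   # held long enough: select this point
    return None
```

Once a point is returned, the control would highlight it by illumination or a color change and await confirmation, for example by blinking.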