
EP3565276B1 - Method for operating a hearing aid, and hearing aid - Google Patents


Info

Publication number
EP3565276B1
EP3565276B1 (application EP19171367.6A)
Authority
EP
European Patent Office
Prior art keywords
acceleration
movement
head
yaw
hearing aid
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Revoked
Application number
EP19171367.6A
Other languages
German (de)
English (en)
French (fr)
Other versions
EP3565276A1 (de)
Inventor
Tobias Wurzbacher
Thomas Kübert
Dirk Mauler
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sivantos Pte Ltd
Original Assignee
Sivantos Pte Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Family has litigation
First worldwide family litigation filed: https://patents.darts-ip.com/?family=66290306&utm_source=google_patent&utm_medium=platform_link&utm_campaign=public_patent_search&patent=EP3565276(B1) ("Global patent litigation dataset" by Darts-ip is licensed under a Creative Commons Attribution 4.0 International License.)
Application filed by Sivantos Pte Ltd
Publication of EP3565276A1
Application granted
Publication of EP3565276B1
Revoked (current legal status)
Anticipated expiration

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04R LOUDSPEAKERS, MICROPHONES, GRAMOPHONE PICK-UPS OR LIKE ACOUSTIC ELECTROMECHANICAL TRANSDUCERS; DEAF-AID SETS; PUBLIC ADDRESS SYSTEMS
    • H04R25/00 Deaf-aid sets, i.e. electro-acoustic or electro-mechanical hearing aids; Electric tinnitus maskers providing an auditory perception
    • H04R25/50 Customised settings for obtaining desired overall acoustical characteristics
    • H04R25/505 Customised settings for obtaining desired overall acoustical characteristics using digital signal processing
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04R LOUDSPEAKERS, MICROPHONES, GRAMOPHONE PICK-UPS OR LIKE ACOUSTIC ELECTROMECHANICAL TRANSDUCERS; DEAF-AID SETS; PUBLIC ADDRESS SYSTEMS
    • H04R25/00 Deaf-aid sets, i.e. electro-acoustic or electro-mechanical hearing aids; Electric tinnitus maskers providing an auditory perception
    • H04R25/40 Arrangements for obtaining a desired directivity characteristic
    • H04R25/402 Arrangements for obtaining a desired directivity characteristic using constructional means
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04R LOUDSPEAKERS, MICROPHONES, GRAMOPHONE PICK-UPS OR LIKE ACOUSTIC ELECTROMECHANICAL TRANSDUCERS; DEAF-AID SETS; PUBLIC ADDRESS SYSTEMS
    • H04R25/00 Deaf-aid sets, i.e. electro-acoustic or electro-mechanical hearing aids; Electric tinnitus maskers providing an auditory perception
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04R LOUDSPEAKERS, MICROPHONES, GRAMOPHONE PICK-UPS OR LIKE ACOUSTIC ELECTROMECHANICAL TRANSDUCERS; DEAF-AID SETS; PUBLIC ADDRESS SYSTEMS
    • H04R2225/00 Details of deaf aids covered by H04R25/00, not provided for in any of its subgroups
    • H04R2225/41 Detection or adaptation of hearing aid parameters or programs to listening situation, e.g. pub, forest
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04R LOUDSPEAKERS, MICROPHONES, GRAMOPHONE PICK-UPS OR LIKE ACOUSTIC ELECTROMECHANICAL TRANSDUCERS; DEAF-AID SETS; PUBLIC ADDRESS SYSTEMS
    • H04R2225/00 Details of deaf aids covered by H04R25/00, not provided for in any of its subgroups
    • H04R2225/43 Signal processing in hearing aids to enhance the speech intelligibility
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04R LOUDSPEAKERS, MICROPHONES, GRAMOPHONE PICK-UPS OR LIKE ACOUSTIC ELECTROMECHANICAL TRANSDUCERS; DEAF-AID SETS; PUBLIC ADDRESS SYSTEMS
    • H04R2225/00 Details of deaf aids covered by H04R25/00, not provided for in any of its subgroups
    • H04R2225/59 Arrangements for selective connection between one or more amplifiers and one or more receivers within one hearing aid

Definitions

  • The invention relates to a method for operating a hearing aid, and to a hearing aid which is set up in particular to carry out the method.
  • Hearing devices, in particular in the form of hearing aids, are used by people with a hearing loss to at least partially compensate for that hearing loss.
  • Conventional hearing aids regularly comprise at least one microphone for capturing sounds from the environment and a signal-processing processor, which serves to process the captured sounds and, in particular, to amplify and/or attenuate them as a function of the individual hearing impairment (in particular frequency-specifically).
  • The processed microphone signals are forwarded by the signal-processing processor to an output transducer - usually in the form of a loudspeaker - for output to the hearing of the respective hearing aid wearer.
  • So-called bone-conduction headphones or cochlear implants are also used as output transducers for mechanical or electrical stimulation of the hearing.
  • The term "hearing device" also covers other devices such as headphones, so-called tinnitus maskers or headsets.
  • Hearing aids in particular often have a so-called classifier, which serves to infer certain predefined "hearing situations", in particular on the basis of the captured sounds.
  • The signal processing is then changed as a function of the recognized hearing situation. Since the hearing aid wearer's speech understanding is frequently impaired by the hearing impairment present, the (signal-processing) algorithms stored in the signal-processing processor are usually adapted to work out the utterances of third parties in the captured microphone signals and to reproduce them for the respective hearing aid wearer in a form that is as intelligible as possible. To recognize a conversation situation, a speech recognition algorithm is often executed in the classifier.
  • From EP 3 154 277 A1, a method for operating a hearing device is known in which the hearing device has two acceleration sensors, each of which provides a sensor signal and by means of which an orientation of the user's head is determined. The sensor signals from both acceleration sensors are used in combination to determine the orientation. An operating parameter of the hearing device is then set as a function of the orientation.
  • Also known is a binaural hearing assistance system which comprises left and right hearing devices and a user interface, as well as its use and an operating method therefor.
  • The left and right hearing devices comprise at least two input units for providing a time-frequency representation of an input signal in a number of frequency bands and a number of time instances, and a multi-input noise suppression system comprising a multi-channel beamformer filter unit which is operatively coupled to the at least two input units and configured to provide a beamformed signal.
  • The binaural hearing assistance system is configured such that a user can indicate, via the user interface, a direction towards or the location of a target signal source relative to the user.
  • Furthermore, a method for physically fitting a hearing aid to a hearing aid wearer is known.
  • In this method, a characteristic measure of a current actual wearing position of the hearing aid is determined by means of a position sensor of the hearing aid.
  • Using the characteristic measure of the actual wearing position, a deviation of the actual wearing position from a predetermined target wearing position is then determined.
  • An instruction to adapt the receiver connection means as a function of the determined deviation is then issued to the hearing aid wearer.
  • KESHNER E.A. et al., "Characterizing head motion in three planes during combined visual and base of support disturbances in healthy and visually sensitive subjects", Gait & Posture, Elsevier, Amsterdam, NL, Vol. 28, No. 1, 1 July 2008, pages 127-134, XP022686191, ISSN: 0966-6362, DOI: 10.1016/j.gaitpost.2007.11.003, describes an investigation into whether multiplanar movements in the environment lead to head instability, especially when the visual environment moves in planes orthogonal to a physical disturbance. For this purpose, displacement of the support surface in the sagittal plane was combined with visual-field disturbances in 12 healthy (29-31 years) and 3 visually sensitive (27-57 years) adults.
  • Peak head angles, center-of-pressure (COP) values and RMS values of head movement were calculated, and a three-dimensional model of joint movement was developed to study gross head movement in three planes. It was found that subjects standing quietly in front of a visual scene translating in the sagittal plane produced significantly greater (p < 0.003) head movement in yaw than during platform translation. However, if the platform was moved in the dark or with a visual scene rotating in roll, head movement orthogonal to the plane of platform movement was significantly increased (p < 0.02). Visually sensitive subjects with no history of vestibular disorder produced large, delayed compensatory head movements.
  • The invention is based on the object of enabling improved operation of a hearing aid.
  • The method according to the invention serves to operate a hearing aid which has (preferably only) one acceleration sensor.
  • This acceleration sensor is positioned on the head of a hearing aid wearer when the hearing aid is worn as intended. Furthermore, the acceleration sensor is set up for measurement along at least two mutually perpendicular measurement axes (also referred to as "measurement directions").
  • According to the method, at least one main feature is derived from an acceleration signal of the acceleration sensor, which feature is related to an acceleration directed tangentially (and preferably approximately horizontally) to the head of the hearing aid wearer.
  • The presence of a yaw movement of the head is then determined on the basis of the respective main feature(s), taking into account at least one predetermined criterion which can be derived from the acceleration signal itself and which goes beyond the mere presence of a value of the tangentially directed acceleration indicative of a movement.
  • "Related to the acceleration directed tangentially to the head of the hearing aid wearer" is understood here and below to mean that the main feature either directly reproduces this tangentially directed acceleration or at least contains information about it.
  • "Yaw movement" is understood to mean, in particular, a rotational movement of the head about a vertical axis (which preferably coincides at least approximately with the vertical).
  • Analogously, "nodding" or "nodding movement" refers to an up-and-down movement of the head about a "nodding axis" that is preferably horizontal and in particular connects the ears of the hearing aid wearer, while "rolling" or "rolling movement" refers to a sideways inclination or tilting of the head about a "roll axis" that is preferably horizontal and in particular oriented along the neutral viewing direction (also referred to as the "zero-degree viewing direction").
  • "Acceleration sensor" is understood here and in the following to mean, in particular, a sensor in which sensor elements for measurement along the at least two measurement axes (i.e. for two-dimensional measurement), preferably along three mutually perpendicular measurement axes (three-dimensional measurement), are integrated. Such an acceleration sensor therefore preferably represents a self-contained component that is set up for connection to an evaluation unit.
  • A yaw movement is thus preferably not inferred merely because an acceleration directed tangentially to the head can be read from the acceleration signal, but only when the presence of the yaw movement is concluded taking into account the at least one additional criterion.
  • In this way, the probability that a yaw movement is actually present is increased. Misinterpretations of the acceleration signal can thus be avoided or at least reduced.
  • The recognized yaw movement is used to support the analysis of hearing situations.
  • A time profile of the tangentially directed acceleration (also referred to below as "tangential acceleration" for short) is used as the main feature.
  • In this case, the predefined criterion considered is whether the time profile of the tangential acceleration exhibits two consecutive, oppositely directed local extrema (for example a local maximum followed by a local minimum) within a predefined movement time window. In particular, it is considered whether the tangential acceleration assumes values with opposite signs at these two extrema.
  • In the case of a yaw movement, the tangential acceleration first indicates a "true" acceleration and then a "negative" acceleration (namely when the head is braked), each with an associated deflection (the respective extremum) in the time profile.
  • The tangential acceleration thus initially assumes, for example, positive values and "changes over" to negative values when the head is braked.
  • For a yaw movement in the opposite direction, the values of the tangential acceleration correspondingly change from negative to positive.
  • The movement time window is preferably adapted to the duration of a typical head-turning movement, especially in a group conversation, and preferably has a length of between 0.25 and 2 seconds (or 1.5 seconds), in particular from 0.5 to 1 second.
  • The movement time window is preferably "opened" (i.e. its monitoring is started) when a sufficiently significant change in the values of the tangential acceleration is detected.
  • The movement time window advantageously limits the consideration of the main feature, in particular of the time profile of the tangential acceleration, so that "acceleration events" which, owing to their comparatively long duration, are very probably not attributable to a head rotation (i.e. no yawing) are not taken into account.
  • Expediently, an extremum of the time profile is only inferred if the underlying change in the time profile can be distinguished from a usual measured-value fluctuation (e.g. noise) or from minor movements (which regularly do not cause sufficiently significant changes in the time profile).
  • For example, a threshold-value comparison is carried out for this purpose.
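The accelerate-then-brake criterion above lends itself to a compact sketch. The following Python fragment is an illustration only, not the patented implementation; the sample rate, window length and significance threshold are assumed values. It checks whether the tangential-acceleration trace contains two consecutive, oppositely signed, sufficiently large local extrema within one movement time window:

```python
import numpy as np

def has_opposite_extrema(a_tan, fs, window_s=1.0, threshold=0.5):
    """Check whether the tangential-acceleration trace contains two
    consecutive, oppositely signed local extrema within one movement
    time window (accelerate, then brake), as a hint for a head yaw."""
    n = min(len(a_tan), int(window_s * fs))
    seg = np.asarray(a_tan[:n], dtype=float)
    # Local extrema: sign changes of the first difference.
    d = np.diff(seg)
    extrema = [i + 1 for i in range(len(d) - 1) if d[i] * d[i + 1] < 0]
    # Keep only extrema that clear the significance threshold (noise gate).
    extrema = [i for i in extrema if abs(seg[i]) >= threshold]
    for i, j in zip(extrema, extrema[1:]):
        if seg[i] * seg[j] < 0:  # opposite signs -> acceleration + braking
            return True
    return False
```

A sinusoidal accelerate-and-brake trace satisfies the criterion, while a flat trace does not.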
  • Expediently, an additional feature derived from the acceleration signal is a time profile of an acceleration directed radially (and in particular likewise horizontally) with respect to a yaw axis of the head of the hearing aid wearer (which regularly coincides at least approximately with the vertical).
  • In this case, the predefined criterion used is, in particular, whether the time profile of the radially directed acceleration (referred to below as "radial acceleration") assumes a local extremum within the predefined movement time window described above.
  • In this case, two criteria are therefore considered, namely whether the tangential acceleration indicates the above-described acceleration and braking, and whether the radial acceleration likewise indicates an acceleration.
  • In a further expedient variant of the method, a movement intensity is determined on the basis of the time profile of the tangential and, if applicable, also the radial acceleration.
  • In this case, the strength of the movement intensity - preferably the magnitude of the determined value of the movement intensity - is used as a predetermined criterion.
  • In particular, a threshold-value comparison is carried out for this purpose, in which the determined value of the movement intensity is compared with a predetermined threshold value.
  • The presence of the yaw movement is inferred in particular if the movement intensity has a specific, in particular predetermined, strength.
  • Conversely, the presence of the yaw movement is not inferred if the movement intensity clearly exceeds and/or falls below the specified (expected) strength.
  • Optionally, a probability of the presence of the yaw movement is determined, the probability value of which decreases the further the movement intensity deviates from the expected strength (in particular, the further it exceeds or falls below it).
  • A duration of the movement and/or a total or mean energy contained in the tangential and radial acceleration - in particular in the respective measured-value profile - is preferably determined as a measure of the movement intensity.
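As one possible reading of the movement-intensity criterion, the mean energy of the tangential and radial traces can be compared against a plausibility band: movements that are clearly too weak or too strong are rejected. The band limits below are assumed placeholder values, not values from the patent:

```python
import numpy as np

def movement_intensity(a_tan, a_rad):
    """Mean energy contained in the tangential and radial traces,
    one possible measure of the movement intensity."""
    a_tan = np.asarray(a_tan, float)
    a_rad = np.asarray(a_rad, float)
    return float(np.mean(a_tan ** 2 + a_rad ** 2))

def intensity_plausible(a_tan, a_rad, lo=0.05, hi=5.0):
    """A yaw is only inferred if the intensity lies within an expected
    band; lo and hi are assumed, application-specific limits."""
    e = movement_intensity(a_tan, a_rad)
    return lo <= e <= hi
```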
  • In a further expedient variant of the method, a correlation coefficient between a time derivative of the tangential acceleration and the radial acceleration is determined as the main feature.
  • The tangential acceleration (preferably its time profile) is thus first differentiated with respect to time and then correlated with the radial acceleration (preferably with its time profile).
  • The strength - i.e. in particular the magnitude of the value - of the correlation coefficient is used as the predetermined criterion.
  • Expediently, a threshold-value comparison of the correlation coefficient with an in particular predetermined threshold value also takes place here. This approach is based on the recognition that, in a yaw movement of the head, the change in the tangential acceleration (i.e. its time derivative) is correlated with the radial acceleration.
  • This correlation coefficient thus advantageously represents an indicator, comparatively easy to check, of the presence of a yaw movement.
  • A high magnitude of the correlation coefficient can thus advantageously be used to infer a high probability of the presence of the yaw movement.
  • A comparatively low strength of the correlation coefficient (for example less than 0.5 or less than 0.3), on the other hand, indicates comparatively uncoordinated or aimless head movements, or an immobile head.
  • Preferably, a yaw direction is additionally determined on the basis of the correlation coefficient, specifically on the basis of its sign.
  • The sign of the determined correlation coefficient serves as an indicator of the direction in which the hearing aid wearer turns his head. This is due in particular to the fact that the acceleration sensor used has a positive and a negative measuring direction for each measurement axis.
  • If, for example, the acceleration sensor is arranged on the left ear of the hearing aid wearer when the hearing aid is worn as intended, and the measurement axis assigned to the tangential acceleration points with its positive direction in the viewing direction of the hearing aid wearer, the time profile of the tangential acceleration will initially show negative values in the case of a yaw movement to the left, and will initially assume positive values in the case of a yaw movement to the right.
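The correlation criterion and the sign-based direction decision can be sketched as follows. This is a minimal illustration; the mapping of the coefficient's sign to "left"/"right" depends on the sensor mounting and axis polarity, so the convention used here is an assumption:

```python
import numpy as np

def yaw_correlation(a_tan, a_rad, dt):
    """Correlation coefficient between the time derivative of the
    tangential acceleration and the radial acceleration. A high
    magnitude hints at a coordinated yaw; the sign hints at the
    yaw direction (convention depends on sensor mounting)."""
    d_tan = np.gradient(np.asarray(a_tan, float), dt)
    a_rad = np.asarray(a_rad, float)
    if np.std(d_tan) == 0 or np.std(a_rad) == 0:
        return 0.0
    return float(np.corrcoef(d_tan, a_rad)[0, 1])

def classify_yaw(a_tan, a_rad, dt, min_corr=0.5):
    """Return 'left', 'right' or None, under the assumed convention
    that a negative coefficient corresponds to a left turn."""
    c = yaw_correlation(a_tan, a_rad, dt)
    if abs(c) < min_corr:
        return None
    return "left" if c < 0 else "right"
```

For a synthetic smooth turn (bell-shaped yaw rate), the coefficient has a large magnitude, and its sign flips when the turn direction flips.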
  • In a further expedient variant of the method, the main feature used is a curve of a diagram in which the tangential acceleration is plotted against the radial acceleration; that is, this curve is first determined.
  • The geometric shape of this curve, in particular, is used as the predefined criterion.
  • Such a curve advantageously already contains the information from the two measurement axes relevant to a yaw movement, so that, optionally, no additional features need to be determined.
  • In the case of a yaw movement, this curve regularly takes an approximately ellipsoidal shape; for other movements, it will have other shapes, for example a zigzag-like course (that is to say, one with alternating directions of curvature).
  • "Ellipsoidal" is understood here to mean, in particular, that the curve has a shape that is curved and approximately closed in one direction of rotation (i.e. in particular with only a slight offset compared with the curve length), or is at least composed of several curve sections that are curved in this way and optionally connected by straight sections.
  • Expediently, the direction of rotation of the curve described above - which can be read in particular from the time sequence of the individual measured values - is used to determine the yaw direction.
  • In this case, only one main feature is sufficient to determine the presence of the yaw movement and its yaw direction in a particularly robust manner, i.e. with a comparatively low susceptibility to errors.
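One simple way to read the direction of rotation from such a tangential-versus-radial curve is its signed (shoelace) area: a clearly non-zero area indicates an approximately closed, rotating loop, and its sign gives the traversal direction. A minimal sketch, where the sign-to-direction mapping again depends on the axis conventions:

```python
import numpy as np

def signed_loop_area(a_rad, a_tan):
    """Signed (shoelace) area of the curve obtained by plotting the
    tangential against the radial acceleration. A clearly non-zero
    area hints at an approximately elliptical, rotating loop (yaw);
    the sign gives the traversal direction. A degenerate, line-like
    (zigzag) curve yields an area near zero."""
    x = np.asarray(a_rad, float)
    y = np.asarray(a_tan, float)
    # np.roll closes the loop so the shoelace formula applies.
    return 0.5 * float(np.sum(x * np.roll(y, -1) - y * np.roll(x, -1)))
```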
  • Preferably, the respective main feature or features, and if applicable the additionally determined additional feature, are ascertained in a sliding manner over a time window that overlaps with a subsequent, in particular similar, time window.
  • The length of the respective time window is approximately 0.25 to 2 seconds, in particular approximately 0.5 to 1.5 seconds.
  • An overlap of the subsequent time window with the preceding one of approximately 0.25 to 1 second, in particular up to 0.75 seconds, is preferably used.
  • The length of the (respective) time window results from the recognition that a usual, conscious yaw movement of the head lasts about 0.5 seconds to one second.
  • Expediently, two or three measured values assigned to the two or three measurement axes are output by the acceleration sensor at a frequency of approximately 10 to 60 Hertz, preferably approximately 15 to 20 Hertz.
  • These groups of measured values (i.e. the respective two or three measured values) are expediently stored in a buffer memory for evaluation.
  • A so-called update rate of the buffer memory is preferably about two Hertz.
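A buffer producing the sliding, overlapping windows described above might be organized as follows; the defaults (20 Hz sampling, 1 s windows, 2 Hz update rate, hence 0.5 s overlap) are taken from the ranges mentioned in the text, and the class itself is illustrative rather than the patented implementation:

```python
import numpy as np
from collections import deque

class SlidingWindowBuffer:
    """Buffer the per-sample (2- or 3-axis) acceleration readings and
    emit overlapping analysis windows. With fs = 20 Hz, a 1 s window
    and a 2 Hz update rate, consecutive windows overlap by 0.5 s."""

    def __init__(self, fs=20, window_s=1.0, update_hz=2):
        self.win = int(fs * window_s)   # samples per analysis window
        self.hop = int(fs / update_hz)  # samples between two updates
        self.buf = deque(maxlen=self.win)
        self._since_emit = 0

    def push(self, sample):
        """Feed one measurement group; return a full window (as an
        array) whenever the update rate says a new analysis is due."""
        self.buf.append(sample)
        self._since_emit += 1
        if len(self.buf) == self.win and self._since_emit >= self.hop:
            self._since_emit = 0
            return np.array(self.buf)
        return None
```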
  • In a further expedient variant of the method, a value of a yaw angle is determined from the acceleration signal only when the presence of the yaw movement has been detected, in particular according to one or more of the method variants described above.
  • This is useful, for example, to save computational effort.
  • Stationary influences or slowly changing disturbance variables are, for example, the earth's gravitational field, an inclined head posture or the like.
  • To determine the yaw angle, the tangential acceleration, in particular its time profile, is preferably integrated (twice), it being recognized that the above-described influences or disturbance variables have a particularly strong effect as a result of the integration, especially for comparatively long integration periods.
  • Because the yaw angle is only determined once the presence of the yaw movement has been detected, the time length of the section of the time profile of the tangential acceleration to be integrated can be kept particularly short, so that the influences described above have only a slight effect and a drifting of the result can be avoided particularly effectively.
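Under the simplifying assumption of a pure rotation at a fixed ear-to-yaw-axis radius r (so that the tangential acceleration equals r times the angular acceleration), the yaw angle over a short segment follows from twofold numerical integration. The radius value below is an assumed placeholder, and the sketch omits the drift compensation discussed in the surrounding text:

```python
import numpy as np

def yaw_angle(a_tan, dt, head_radius=0.0875):
    """Estimate the yaw angle (rad) covered in a short segment by
    integrating the tangential acceleration twice (a_tan = r * theta'').
    head_radius is an assumed ear-to-yaw-axis distance in metres; the
    segment is kept short so that sensor drift stays small."""
    alpha = np.asarray(a_tan, float) / head_radius  # angular acceleration
    omega = np.cumsum(alpha) * dt                   # angular velocity
    return float(np.sum(omega) * dt)                # accumulated angle
```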
  • Preferably, constant and/or linear measured-value components are filtered, i.e. removed, from the acceleration signal, in particular from the tangential and radial acceleration, and optionally also from the integrated tangential acceleration (in particular its time profile).
  • In a simple but expedient variant, a high-pass filter is used for this purpose.
  • Alternatively or additionally, a temporal (in particular sliding) mean value of the measured values assigned to the respective measurement axis is subtracted from the individual measured values. In this way, stationary influences (for example gravitation), or influences which change only comparatively slowly and which are detected by the acceleration sensor, can be removed or at least reduced.
  • Optionally, linear trends are removed from the measured values, in particular from the respective time profiles or, optionally, from the integrated tangential acceleration, in particular by employing what is known as "detrending".
  • Compensation, in particular for gravity, is preferably carried out, preferably on the "raw" acceleration signal, in particular in that the acceleration signal is fed to the high-pass filter.
  • In this way, the influence of gravity can be reduced, at least to a significant extent.
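Two of the removal steps mentioned above, sliding-mean subtraction and linear detrending, can be sketched as follows (window length is an assumed value):

```python
import numpy as np

def remove_sliding_mean(x, win=20):
    """Subtract a sliding mean so that stationary influences (e.g.
    gravity) detected by the acceleration sensor are suppressed."""
    x = np.asarray(x, float)
    kernel = np.ones(win) / win
    return x - np.convolve(x, kernel, mode="same")

def detrend_linear(x):
    """Remove a best-fit linear trend (simple 'detrending')."""
    x = np.asarray(x, float)
    t = np.arange(len(x))
    slope, offset = np.polyfit(t, x, 1)
    return x - (slope * t + offset)
```

Note that `mode="same"` leaves boundary effects at the edges of the trace; in the interior, a constant offset such as gravity is removed exactly.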
  • Expediently, the pitch and roll angles of the head are determined relative to the gravitational field. Using these angles, a so-called direction cosine matrix is then determined, by means of which the measurement data contained in the acceleration signal (i.e. the measured values assigned to the respective measurement axes) are transformed, in particular rotated, from a hearing-aid-wearer-specific coordinate system into the "global", earth-related coordinate system.
  • In the earth-related coordinate system, the measurement data are cleared of the influence of the gravitational field - or at least of the remnants of it remaining after the high-pass filtering - and the measurement data are then transformed back into the original coordinate system (i.e. the coordinate system related to the hearing aid wearer).
  • Optionally, the integrated tangential acceleration is also cleared of such (stationary or slowly changing) influences, for example by means of "detrending".
  • This variant is based on the consideration that, owing to the comparatively short duration of a yaw movement, the remaining drift is comparatively low, or is at least contained as an approximately constant or linear influence within the time window under consideration (which is mapped in particular in the buffer described above). In this way, the integrated tangential acceleration can be cleared of these (optionally remaining) constant and/or linear measured-value components in a simple manner.
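A gravity compensation along these lines can be sketched as follows: pitch and roll are estimated from the gravity direction, a direction cosine matrix rotates the sample into an earth-related frame, gravity is subtracted there, and the result is rotated back. The axis convention (x forward, y left, z up) is an assumption; in practice, the attitude estimate must come from an averaged or low-pass-filtered signal so that motion accelerations do not corrupt it:

```python
import numpy as np

def gravity_compensate(acc, g=9.81):
    """Remove gravity from one accelerometer sample via a direction
    cosine matrix. Assumed axis convention: x forward, y left, z up;
    a sensor at rest reads (0, 0, +g)."""
    ax, ay, az = acc
    pitch = np.arctan2(-ax, np.hypot(ay, az))
    roll = np.arctan2(ay, az)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cr, sr = np.cos(roll), np.sin(roll)
    # Rotation sensor frame -> earth frame (roll about x, pitch about y).
    R = np.array([
        [cp, sp * sr, sp * cr],
        [0.0, cr, -sr],
        [-sp, cp * sr, cp * cr],
    ])
    earth = R @ np.array(acc, float)
    earth[2] -= g        # subtract gravity in the earth-related frame
    return R.T @ earth   # transform back to the sensor frame
```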
  • In an expedient variant of the method, a classification algorithm is applied to the respective main feature or features and, if applicable, to the additional feature in order to determine the presence of, or at least a probability of the presence of, the yaw movement.
  • In other words, the respective main feature or features and, if applicable, the additional feature are fed to a classification algorithm which serves to carry out the above-described consideration of whether the criteria assigned to the respective main feature (and, if applicable, the additional feature) are fulfilled.
  • Optionally, the classification algorithm is also set up to determine not only the presence of the yaw movement but also the yaw direction (i.e. the direction of rotation when yawing the head), the duration and/or the strength of the yaw movement, or at least of the head movement.
  • For example, a Gaussian mixture model, a neural network, a support vector machine or the like is used as the classification algorithm.
  • Expediently, a classifier that is often already present in a hearing aid (in which, in addition to the usual classification algorithms, the corresponding classification algorithm described above is preferably implemented) is used for this purpose.
  • The classifier, and thus also the classification algorithm, is preferably trained on the respective expression of the respective main or additional feature (i.e. the respective criterion) that is indicative of the presence of the yaw movement.
  • Optionally, the classifier is also designed to be self-learning.
  • In a further expedient variant of the method, a spatial area of interest of the hearing aid wearer is determined on the basis of the yaw movement itself, but preferably on the basis of the determined values of the yaw angle covered during the yaw movement.
  • That is, it is observed over a predetermined period of time - which again is preferably a sliding period with a duration of, for example, 20 seconds to two minutes, in particular about 30 seconds to one minute - into which viewing directions, in particular starting from a zero-degree viewing direction, the hearing aid wearer turns his head.
  • This area of interest is preferably determined by statistically evaluating the yaw angles determined within the predetermined period of time (i.e. specifically their individual values), in particular by creating a histogram.
  • In this way, an area in which the hearing aid wearer has, or at least had, a comparatively great interest can be read from the statistical evaluation, for example the histogram, of past yaw movements.
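The histogram-based evaluation can be sketched as follows; the bin width and angular span are assumed values, not taken from the patent:

```python
import numpy as np

def area_of_interest(yaw_angles_deg, bin_width=15, span=90):
    """Histogram the yaw angles (relative to the zero-degree viewing
    direction) gathered over a sliding observation period, and return
    the centre of the most frequented bin as the current spatial area
    of interest (degrees)."""
    edges = np.arange(-span, span + bin_width, bin_width)
    hist, _ = np.histogram(yaw_angles_deg, bins=edges)
    k = int(np.argmax(hist))
    return 0.5 * (edges[k] + edges[k + 1])
```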
  • Expediently, the information about the yaw movement of the head of the hearing aid wearer is used to adapt a signal-processing algorithm to a conversation situation. For example, it is possible to derive from the yaw movement, in particular from the histogram created from it, in which spatial viewing area the current main interest of the hearing aid wearer lies, and thus also where potential conversation partners are.
  • This information is particularly expediently used together with the information of an acoustic classifier; i.e. the information from the movement analysis described above (i.e. the determination of the presence of the yaw movement) is combined with that of an acoustic analysis (i.e. of the acoustic classifier), which is also referred to as "fusion".
  • In this case, the acoustic classifier serves to determine in principle the existence of a conversation situation and, if applicable, also to determine from which spatial directions relevant acoustic signals (usually speech signals emanating from third parties) impinge on the hearing aid and thus on the hearing aid wearer.
  • The information about the yaw movement of the head is preferably used in this case to further delimit the spatial area in which the conversation partners of the hearing aid wearer are most probably located. This is particularly useful, for example, in the event that the hearing aid wearer is in an acoustically ambiguous conversation situation in which at least two conversations take place in parallel but the hearing aid wearer takes part in only one of them.
  • In a further expedient variant of the method, the above-described zero-degree viewing direction of the hearing aid wearer is referenced, in particular using a nodding movement of the head, a vertical movement of the hearing aid wearer and/or a forward movement of the hearing aid wearer (optionally detected by means of a separate "movement classifier").
  • Such movements that can be derived from the acceleration signal are used in particular to detect movements such as nodding or drinking, standing up, activities such as tying shoes, walking, jogging, driving a car, cycling and the like.
  • This variant of the method, which also represents an independent invention, is based on the recognition that movements such as nodding and drinking also occur during a group conversation, or during a lecture situation in which the hearing aid wearer looks at a blackboard or a projection screen for comparatively long periods of time, and that such movements are very probably performed regularly with the head aligned in the zero-degree viewing direction.
  • The referencing serves to avoid, or at least compensate for, a drift which may be caused, for example, by erroneous non-detection of a yaw movement, in particular when creating the above-described histogram.
  • a "movement classifier” is used to detect the movements described here, in particular activities such as walking, jogging, driving, cycling, tying shoes, in which the entire body of the hearing aid wearer is in particular in motion.
  • This is preferably formed by a corresponding classification algorithm, which in turn is expediently directed to movements of the entire body of the hearing aid wearer.
  • an output of the above-described movement classifier in particular aimed at the movement of the entire body of the hearing aid wearer, is used as an additional criterion for determining the yaw movement (in particular whether it is present). For example, it is assumed that in the case of activities recognized by means of the movement classifier, such as cycling, driving and jogging, the probability that the hearing aid wearer will take part in a group conversation is comparatively low. It is recognized that these activities each take place in comparatively “fast” movement situations in which the hearing aid wearer is likely to direct his (in particular visual) attention largely forwards with a comparatively high degree of probability.
  • the evaluation of the main features and, if applicable, the additional feature can be blocked or at least verified. If the hearing aid wearer is at rest, there will be multiple yaw movements of the head - especially in the case of a acoustically classified conversation situation - indicate with a high probability that the hearing aid wearer will participate in the conversation with several people.
  • The information from the movement classifier can thus also be included, for example, in the classification algorithm described above (directed at the yaw movement) and/or in the fusion of the movement information and the acoustic information.
  • Expediently, the acceleration sensor is arranged in or on the hearing aid in such a way that at least one of its measuring axes is at least approximately tangential to the head, preferably parallel to the natural zero-degree viewing direction of the hearing aid wearer.
  • This measuring axis is preferably also aligned horizontally.
  • The two other measuring axes are preferably arranged (assuming an upright posture) vertically and horizontally along the above-described pitch axis, respectively.
  • In a binaural hearing system, the above-described method for detecting the yaw movement and, if applicable, for determining the yaw angle is preferably carried out separately in each of the two hearing aids, i.e. monaurally, and the two monaural decisions are then synchronized "binaurally".
  • Alternatively, the two monaural acceleration signals are combined to form a binaural signal (for example, the difference of the two acceleration signals is formed) and the method described above is applied to this binaural sensor signal.
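The binaural combination mentioned above can be illustrated with a minimal sketch (illustrative Python, not part of the patent text; the function name and the assumption of time-aligned, equally long sample streams are mine):

```python
# Sketch (assumption, not from the patent text): combining two monaural
# tangential-acceleration traces into one binaural signal by forming their
# sample-wise difference. Time alignment of the two streams is assumed.

def binaural_difference(at_left, at_right):
    """Return the sample-wise difference of two equally long traces."""
    if len(at_left) != len(at_right):
        raise ValueError("monaural traces must be time-aligned and equally long")
    return [l - r for l, r in zip(at_left, at_right)]
```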
  • The hearing aid according to the invention comprises the (in particular single) acceleration sensor which, when the hearing aid is worn as intended, is arranged on the head of the hearing aid wearer and is set up for measurement along the at least two, optionally three, measuring axes.
  • The hearing aid also includes a (signal processing) processor which is set up, in terms of program and/or circuitry, to carry out the above-described method according to the invention, in particular automatically.
  • The processor is therefore set up to derive the at least one main feature linked to the tangential acceleration from the acceleration signal of the acceleration sensor and to determine the presence of the yaw movement of the head on the basis of the respective main feature, taking into account the at least one predetermined criterion.
  • The hearing aid thus shares, in equal measure, all the advantages and features that result from the method features described above.
  • Preferably, the processor is at least essentially formed by a microcontroller with a microprocessor and a data memory in which the functionality for performing the method according to the invention is implemented in the form of operating software (firmware), so that the method is carried out automatically when the operating software is executed, possibly in interaction with the hearing aid wearer.
  • Alternatively, the processor is a non-programmable electronic component, e.g. an ASIC, in which the functionality for performing the method according to the invention is implemented by circuitry means.
  • A hearing aid 1, specifically a so-called behind-the-ear hearing aid, is shown.
  • The hearing aid 1 comprises a (hearing aid) housing 2 in which several electronic components are arranged.
  • Among these electronic components, the hearing aid 1 comprises two microphones 3, which are set up to detect noises from the surroundings of the hearing aid 1.
  • The hearing aid 1 further comprises a signal processor 4 as an electronic component, which is set up to process the noises detected by the microphones 3 and to feed them to a loudspeaker 5 for output to the hearing of a hearing aid wearer.
  • An acceleration sensor 6, which is connected to the signal processor 4, is also arranged in the housing 2.
  • A battery 7 is also arranged in the housing 2, which in the present exemplary embodiment is specifically formed by a rechargeable battery (accumulator).
  • A sound tube 8 is connected to the housing 2; when the hearing aid is worn as intended on the head 9, specifically on the ear of the hearing aid wearer (cf. Figure 2), the sound tube 8 is inserted with an ear mold 10 into the ear canal of the hearing aid wearer.
  • The acceleration sensor 6 is set up for three-dimensional measurement and has three mutually perpendicular measuring axes x, y and z (cf. Figure 2).
  • The acceleration sensor 6 is arranged in the housing 2 of the hearing aid 1 in such a way that the measuring axis z coincides with the vertical direction when the hearing aid is worn as intended on the head 9 and the hearing aid wearer is in an upright posture.
  • The measuring axis x is tangential to the head 9 and oriented towards the front, that is, along a zero-degree viewing direction 12.
  • The measuring axis y is directed radially away from the head 9.
  • The two measuring axes x and y also lie in a horizontal plane when the hearing aid wearer is in an upright posture.
  • The measured values assigned to the measuring axis x reproduce an acceleration directed tangentially to the head 9 (hereinafter referred to as "tangential acceleration at").
  • The measured values assigned to the measuring axis y correspondingly reproduce an acceleration directed radially with respect to the head 9 (hereinafter referred to as "radial acceleration ar").
  • The signal processor 4 is set up to use an acoustic classifier, which is implemented as an algorithm in the signal processor 4, to infer a conversation situation (i.e. a conversation between at least two people) from the noises recorded by the microphones 3 and then to adapt the signal processing accordingly.
  • For example, an opening angle of a directional microphone formed by means of the two microphones 3 is set in such a way that all speech components from the environment that reach the microphones 3, specifically the source locations of these speech components, lie within the opening range of the directional microphone.
  • To this end, the signal processor 4 carries out a method explained in more detail below.
  • In a first method step 20, the measured values determined by the acceleration sensor 6, which are output in groups of three measured values each assigned to one of the measuring axes x, y and z, are stored in a buffer memory (which is integrated in the signal processor 4).
  • The buffer memory is designed for the rolling intermediate storage of eight such groups of measured values.
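The rolling buffer of eight three-axis measurement groups can be sketched, for illustration only, roughly as follows (the class name and interface are assumptions, not taken from the patent):

```python
from collections import deque

# Illustrative sketch of the rolling buffer described above: it holds the
# eight most recent groups of three measured values (one per measuring axis
# x, y, z); the oldest group is discarded automatically when a new one arrives.
class SampleBuffer:
    def __init__(self, depth=8):
        self._buf = deque(maxlen=depth)

    def push(self, ax, ay, az):
        """Store one group of three axis measurements."""
        self._buf.append((ax, ay, az))

    def samples(self):
        """Return the buffered groups, oldest first."""
        return list(self._buf)
```

The `deque` with `maxlen` gives exactly the "rolling" behavior: once eight groups are stored, each new group displaces the oldest one.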
  • In a subsequent method step 30, several features are derived (also: "extracted") from the measured values assigned to the respective measuring axes x, y and z.
  • These features are fed to a classifier in which a classification algorithm, in the present exemplary embodiment in the form of a Gaussian mixture model, is implemented.
  • On the basis of these features, this classifier determines whether the hearing aid wearer rotates his head 9, i.e. rotates it at least approximately about the measuring axis z. Such a "sideways rotation" of the head 9 is referred to here and in the following as a "yaw movement".
  • In the arrangement and alignment of the acceleration sensor 6 shown in the present exemplary embodiment, the measuring axis z thus represents a so-called yaw axis, in contrast to the axis about which the hearing aid wearer tilts his head 9 downwards or upwards ("nodding"; analogous to the English-language terms "yaw", "roll" and "pitch").
  • In a method step 50, the measured values of the acceleration sensor 6 stored in the buffer memory are cleared of stationary influences, i.e. influences that change only slowly compared to the duration of a head movement.
  • For example, the influence of gravity, which can be assumed to be stationary, is removed by means of a high-pass filter.
  • Further influences that lead to an offset of the measured values, for example an anatomically related deviation of the actual yaw axis from the vertical and/or from the actual alignment of the measuring axis z, are removed in one embodiment by subtracting the time average of the buffered measured values from the respective individual measured value. Influences that have a linear effect (i.e. linear trends) are removed by means of so-called "detrending".
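The offset and trend removal described above can be illustrated as follows (a sketch only: mean subtraction for the offset and a least-squares linear detrend; the additional high-pass filtering of gravity is omitted here, and all function names are mine):

```python
# Illustrative cleaning of one buffered axis, following the scheme above:
# subtract the time average (offset removal), then remove a least-squares
# linear trend ("detrending"). Not the concrete filter of the hearing aid.

def remove_offset(values):
    """Subtract the time average of the buffered values."""
    mean = sum(values) / len(values)
    return [v - mean for v in values]

def detrend(values):
    """Remove a least-squares linear trend over the sample index."""
    n = len(values)
    t_mean = (n - 1) / 2
    v_mean = sum(values) / n
    num = sum((ti - t_mean) * (vi - v_mean) for ti, vi in enumerate(values))
    den = sum((ti - t_mean) ** 2 for ti in range(n))
    slope = num / den if den else 0.0
    return [vi - (v_mean + slope * (ti - t_mean)) for ti, vi in enumerate(values)]
```

Applied to a purely linear ramp, `detrend` returns (numerically) all zeros, which is exactly the intended behavior: only the fast, movement-related components survive.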
  • In a method step 60, a value of a yaw angle W is determined from the measured values cleaned in this way, specifically from the tangential acceleration at; i.e. it is determined how far the hearing aid wearer has turned his head 9 (cf. Figure 8).
  • In addition, a statistical analysis is carried out in a method step 70: it is determined how often the hearing aid wearer turns his head 9 within a predetermined time window.
  • The values of the yaw angle W assigned to the individual yaw movements are used to create a histogram from which it can be read off in which directions, relative to the zero-degree viewing direction 12, the hearing aid wearer has turned his head 9 within the specified time window (cf. Figure 9).
  • A spatial distribution of the hearing aid wearer's area of interest can thus also be read from this histogram.
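Building such a histogram from a sequence of yaw-angle values can be sketched as follows (the 15-degree sector width is an illustrative assumption, not taken from the text):

```python
from collections import Counter

# Illustrative histogram over yaw angles W (in degrees, relative to the
# zero-degree viewing direction 12), binned into fixed angular sectors.
def yaw_histogram(yaw_angles_deg, bin_width=15):
    """Map each yaw angle to the lower edge of its angular sector and count."""
    hist = Counter()
    for w in yaw_angles_deg:
        hist[int(w // bin_width) * bin_width] += 1
    return dict(hist)
```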
  • In a further method step 80, the information generated in method steps 60 and 70 is used by the signal processor 4 in order to additionally adapt the signal processing.
  • Specifically, the information from the acoustic classifier described above and from the "movement analysis" performed by means of the acceleration sensor 6 is merged in order to enable the signal processing to be adapted more precisely to a conversation situation.
  • For example, the opening angle of the directional microphone, the alignment of the directional cone of the directional microphone and the position of a so-called "notch" are further adapted, and possibly further restricted with respect to a setting proposed solely by the acoustic classifier, as a function of the information determined by means of the acceleration sensor 6, namely the yaw angle W and the histogram.
  • As a main feature, a time profile at(t) of the tangential acceleration at is determined in method step 30.
  • In addition, a time profile ar(t) of the radial acceleration ar is determined.
  • As a criterion for the presence of the yaw movement, it is considered whether the time profile at(t) of the tangential acceleration at assumes, within a predetermined time segment (hereinafter referred to as "movement time window Zb") of a duration of one second, two local extrema Mt with opposite signs, which indicate two opposite accelerations, namely an actual acceleration and a deceleration.
  • A further criterion is whether the time profile ar(t) of the radial acceleration ar assumes a local extremum Mr within the movement time window Zb, which indicates a head movement with an acceleration component directed radially with respect to the head 9.
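The two criteria can be sketched as follows (illustrative only: strict local extrema are used as a stand-in for the extrema Mt and Mr, and all thresholds, sampling rates and noise handling are omitted):

```python
# Sketch of the two criteria above, applied to one movement time window Zb:
# the tangential trace must contain two local extrema of opposite type
# (a peak and a trough, standing in for the opposite-signed extrema Mt),
# the radial trace at least one local extremum (standing in for Mr).

def local_extrema(trace):
    """Indices and types (+1 max, -1 min) of strict local extrema."""
    out = []
    for i in range(1, len(trace) - 1):
        if trace[i] > trace[i - 1] and trace[i] > trace[i + 1]:
            out.append((i, +1))
        elif trace[i] < trace[i - 1] and trace[i] < trace[i + 1]:
            out.append((i, -1))
    return out

def yaw_detected(at_window, ar_window):
    """True if both criteria hold within the given window."""
    kinds = {k for _, k in local_extrema(at_window)}
    return kinds == {+1, -1} and len(local_extrema(ar_window)) >= 1
```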
  • In Figure 4, the time profiles at(t) and ar(t) are shown by way of example for a yaw movement of the head 9 to the right (cf. seconds 0.5-1.5) and to the left (cf. seconds 2-3).
  • For the yaw movement to the right, the time profile at(t) therefore first passes through the "positive" extremum Mt, which indicates the beginning of the yaw movement, and then the "negative" extremum Mt, which indicates the deceleration of the head 9 and thus the end of the yaw movement.
  • Owing to the centrifugal force and the outward alignment of the measuring axis y, the time profile ar(t) also shows a positive extremum Mr within the movement time window Zb.
  • The opposite applies to the yaw movement to the left, as can be seen from the right half of Figure 4.
  • If these criteria are met, the classifier outputs in method step 55 that a yaw movement is present.
  • In a further exemplary embodiment, a correlation coefficient K between a time derivative of the tangential acceleration at, specifically of its time profile at(t), and the radial acceleration ar, specifically its time profile ar(t), is determined as the main feature in method step 30.
  • This is shown in more detail in Figure 5.
  • The change in the tangential acceleration at, which can be taken from its time derivative, specifically a temporal extremum Md of this change, falls, as can be seen from Figure 5, at least approximately temporally together with the extremum Mr of the radial acceleration ar in the case of a yaw movement of the head 9. It can thus be inferred from the value of the correlation coefficient K, specifically from its magnitude, whether a yaw movement is present at all.
  • In addition, the direction of the yaw movement can be read from the sign of the correlation coefficient K.
  • For one direction of the yaw movement, the value of the correlation coefficient K is approximately -0.75; for the opposite direction, the correlation coefficient K is approximately 0.8.
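One way to compute such a correlation coefficient K is a Pearson correlation between the discrete time derivative of at(t) and ar(t); the concrete estimator used in the hearing aid is not specified in the text, so the following is a sketch under that assumption:

```python
import math

# Illustrative computation of a correlation coefficient K between the time
# derivative of the tangential acceleration at(t) and the radial acceleration
# ar(t), using a Pearson correlation on discrete samples.

def derivative(trace):
    """Discrete first difference of a sampled trace (one sample shorter)."""
    return [b - a for a, b in zip(trace, trace[1:])]

def correlation(u, v):
    """Pearson correlation of two equally long sample sequences."""
    n = min(len(u), len(v))
    u, v = u[:n], v[:n]
    mu, mv = sum(u) / n, sum(v) / n
    cov = sum((a - mu) * (b - mv) for a, b in zip(u, v))
    su = math.sqrt(sum((a - mu) ** 2 for a in u))
    sv = math.sqrt(sum((b - mv) ** 2 for b in v))
    return cov / (su * sv) if su and sv else 0.0

def yaw_correlation(at_trace, ar_trace):
    # align the (one sample shorter) derivative with the radial trace
    return correlation(derivative(at_trace), ar_trace[1:])
```

A magnitude of K close to 1 then indicates a yaw movement, and its sign indicates the direction, matching the behavior described above.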
  • In a further exemplary embodiment, a curve D of a diagram in which the radial acceleration ar is plotted against the tangential acceleration at is created as the main feature.
  • The shape of this curve D is used as the criterion; specifically, it is considered whether the curve D can be approximated by the shape of an ellipse.
  • In Figure 6, the measured values underlying the previous Figures 4 and 5 are plotted for the yaw movement to the right, and in Figure 7 for the yaw movement to the left.
  • The offset shown between the respective starting point and end point (the latter marked by a triangle standing on its apex) is caused by an inclined posture of the head.
  • In both cases, the curve D deviates from the ideal circular shape and rather corresponds to an oval or an ellipse. If the curve D has such a shape, the classifier concludes in method step 40 that the yaw movement is present and outputs a corresponding result in method step 55.
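A full ellipse fit is beyond a short sketch; as an illustrative stand-in, the following heuristic checks whether the (at, ar) trajectory encloses a substantial area (i.e. forms a loop, as an ellipse-like curve D does) rather than collapsing to a thin line, using the shoelace formula. The threshold value is an assumption, not taken from the patent:

```python
# Heuristic sketch of the shape criterion above: a yaw movement traces a
# closed, roughly elliptical loop in the (at, ar) plane, whereas noise or a
# degenerate back-and-forth movement collapses to a thin line. As a stand-in
# for a full ellipse fit, compare the enclosed (shoelace) area of the curve
# with the area of its bounding box.

def loop_likeness(at_vals, ar_vals):
    pts = list(zip(at_vals, ar_vals))
    # shoelace formula for the signed area of the (implicitly closed) polygon
    area = 0.0
    for (x0, y0), (x1, y1) in zip(pts, pts[1:] + pts[:1]):
        area += x0 * y1 - x1 * y0
    area = abs(area) / 2.0
    extent = (max(at_vals) - min(at_vals)) * (max(ar_vals) - min(ar_vals))
    return area / extent if extent else 0.0

def is_ellipse_like(at_vals, ar_vals, threshold=0.3):
    """True if the trajectory encloses a substantial area (assumed threshold)."""
    return loop_likeness(at_vals, ar_vals) >= threshold
```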
  • In a further exemplary embodiment, a movement intensity I is determined as the main feature in method step 30. This maps the energy contained in the tangential and radial acceleration.
  • For this purpose, the movement intensity I is estimated on the basis of the averaged vector norms of the respective vectors of the tangential and radial acceleration at and ar. For example, the energy is estimated by a temporally discrete sum of the vector lengths of the resulting vector of the tangential and radial acceleration at and ar.
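A minimal version of this intensity estimate (averaged vector norms of the combined tangential/radial samples) could look like this; the averaging instead of a plain sum is an illustrative choice:

```python
import math

# Illustrative movement-intensity estimate along the lines described above:
# the average of the vector lengths of the combined tangential/radial
# acceleration samples (a discretized energy proxy).

def movement_intensity(at_vals, ar_vals):
    norms = [math.hypot(a, r) for a, r in zip(at_vals, ar_vals)]
    return sum(norms) / len(norms) if norms else 0.0
```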
  • In Figure 9, the histogram determined in method step 70 is shown by way of example in the form of a polar diagram. From the radial extent of the hatched areas it can be read off how often, or for how long, the hearing aid wearer has turned his head 9 into a specific angular range. From this, in turn, a spatial area of interest can be derived, which is used in method step 80 to set the opening angle of the directional microphone accordingly. In this specific example, the hearing aid wearer is talking to two people, one directly opposite him and one offset to the left by about 20-25 degrees.
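Deriving a spatial area of interest from the histogram, as used in method step 80, can be sketched as the smallest contiguous angular span that covers most of the recorded head turns; the coverage fraction, the bin width and the function interface are illustrative assumptions:

```python
# Hedged sketch: derive an angular region of interest from the yaw-angle
# histogram by taking the smallest contiguous span of sectors that covers a
# given share of all head turns; the 0.8 coverage is an illustrative choice.

def interest_region(hist, bin_width=15, coverage=0.8):
    """hist: {sector lower edge in degrees: count}. Returns a (lo, hi) span."""
    sectors = sorted(hist)
    total = sum(hist.values())
    best = None
    for i in range(len(sectors)):
        acc = 0
        for j in range(i, len(sectors)):
            acc += hist[sectors[j]]
            if acc >= coverage * total:
                span = sectors[j] + bin_width - sectors[i]
                if best is None or span < best[0]:
                    best = (span, sectors[i], sectors[j] + bin_width)
                break
    return (best[1], best[2]) if best else None
```

The resulting span could then serve as a proposal for the opening angle of the directional microphone.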
  • Optionally, a so-called movement classifier is additionally used in order to infer a movement situation of the hearing aid wearer, i.e. a movement state of the entire body or an activity comprising it, on the basis of the features determined in method step 30.
  • Depending on the recognized movement situation, the determination of the yaw movement in method step 40 and the subsequent method steps 60-80 are optionally omitted.
  • Optionally, the classifier also outputs, in method step 55, the (temporal) duration of the yaw movement and optionally also the strength of the yaw movement, specifically the movement intensity I.
  • Optionally, a "reset", i.e. a referencing of the zero-degree viewing direction 12, takes place in a further method step whenever an almost pure nodding movement is detected, which is indicative of drinking, for example.
  • In this way, the histogram can be created particularly precisely and robustly: even if individual yaw movements are not recognized, the zero-degree viewing direction 12 can be "found" again and again, thus preventing the individual values of the yaw angle W from adding up and the zero-degree viewing direction 12 from erroneously being assumed to have changed.
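The referencing ("reset") of the zero-degree viewing direction can be sketched as a simple state holder; the detection of the almost pure nodding movement itself is assumed to be provided by the classifier described above, and all names here are illustrative:

```python
# Illustrative zero-degree referencing: a running head-orientation estimate
# is reset whenever an (externally detected) almost pure nodding movement
# occurs, limiting the drift that missed yaw movements would otherwise cause.

class YawReference:
    def __init__(self):
        self.orientation_deg = 0.0

    def apply_yaw(self, delta_deg):
        """Accumulate a detected yaw movement into the orientation estimate."""
        self.orientation_deg += delta_deg

    def on_pure_nod(self):
        # nodding (e.g. drinking) very likely happens in the zero-degree
        # viewing direction, so re-reference the estimate
        self.orientation_deg = 0.0
```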

Landscapes

  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Neurosurgery (AREA)
  • Otolaryngology (AREA)
  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Acoustics & Sound (AREA)
  • Signal Processing (AREA)
  • User Interface Of Digital Computer (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)
EP19171367.6A 2018-05-04 2019-04-26 Verfahren zum betrieb eines hörgeräts und hörgerät Revoked EP3565276B1 (de)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
DE102018206979.4A DE102018206979A1 (de) 2018-05-04 2018-05-04 Verfahren zum Betrieb eines Hörgeräts und Hörgerät

Publications (2)

Publication Number Publication Date
EP3565276A1 EP3565276A1 (de) 2019-11-06
EP3565276B1 true EP3565276B1 (de) 2021-08-25

Family

ID=66290306

Family Applications (1)

Application Number Title Priority Date Filing Date
EP19171367.6A Revoked EP3565276B1 (de) 2018-05-04 2019-04-26 Verfahren zum betrieb eines hörgeräts und hörgerät

Country Status (5)

Country Link
US (1) US10959028B2 (da)
EP (1) EP3565276B1 (da)
CN (1) CN110446149B (da)
DE (1) DE102018206979A1 (da)
DK (1) DK3565276T3 (da)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3806496A1 (en) * 2019-10-08 2021-04-14 Oticon A/s A hearing device comprising a detector and a trained neural network
EP3886461B1 (en) * 2020-03-24 2023-11-08 Sonova AG Hearing device for identifying a sequence of movement features, and method of its operation
DE102020209939A1 (de) 2020-08-06 2022-02-10 Robert Bosch Gesellschaft mit beschränkter Haftung Vorrichtung und Verfahren zum Erkennen von Kopfgesten
CN111741419B (zh) * 2020-08-21 2020-12-04 瑶芯微电子科技(上海)有限公司 骨传导声音处理系统、骨传导麦克风及其信号处理方法
AU2022217895A1 (en) * 2021-02-05 2023-09-07 Jonathan Siegel A system and method for an electronic signature device

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2010078372A1 (en) 2008-12-30 2010-07-08 Sennheiser Electronic Gmbh & Co. Kg Control system, earphone and control method
US20120176865A1 (en) 2009-04-29 2012-07-12 Jan-Philip Schwarz Apparatus and Method for the Binaural Reproduction of Audio Sonar Signals
US20130329923A1 (en) 2012-06-06 2013-12-12 Siemens Medical Instruments Pte. Ltd. Method of focusing a hearing instrument beamformer
US20150109200A1 (en) 2013-10-21 2015-04-23 Samsung Electronics Co., Ltd. Identifying gestures corresponding to functions
US9516429B2 (en) 2013-07-02 2016-12-06 Samsung Electronics Co., Ltd. Hearing aid and method for controlling hearing aid
EP3154277A1 (de) 2015-10-09 2017-04-12 Sivantos Pte. Ltd. Verfahren zum betrieb einer hörvorrichtung und hörvorrichtung
EP3264798A1 (en) 2016-06-27 2018-01-03 Oticon A/s Control of a hearing device

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110317858A1 (en) 2008-05-28 2011-12-29 Yat Yiu Cheung Hearing aid apparatus
DE102011075006B3 (de) 2011-04-29 2012-10-31 Siemens Medical Instruments Pte. Ltd. Verfahren zum Betrieb eines Hörgerätes mit verringerter Kammfilterwahrnehmung und Hörgerät mit verringerter Kammfilterwahrnehmung
KR102192361B1 (ko) * 2013-07-01 2020-12-17 삼성전자주식회사 머리 움직임을 이용한 사용자 인터페이스 방법 및 장치
EP2908549A1 (en) * 2014-02-13 2015-08-19 Oticon A/s A hearing aid device comprising a sensor member
EP2928210A1 (en) 2014-04-03 2015-10-07 Oticon A/s A binaural hearing assistance system comprising binaural noise reduction
DE102016205728B3 (de) * 2016-04-06 2017-07-27 Sivantos Pte. Ltd. Verfahren zur physischen Anpassung eines Hörgeräts, Hörgerät und Hörgerätesystem


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
WEI HAOLIN; SCANLON PATRICIA; LI YINGBO; MONAGHAN DAVID S.; O'CONNOR NOEL E.: "Real-time head nod and shake detection for continuous human affect recognition", 2013 14TH INTERNATIONAL WORKSHOP ON IMAGE ANALYSIS FOR MULTIMEDIA INTERACTIVE SERVICES (WIAMIS), IEEE, 3 July 2013 (2013-07-03), pages 1 - 4, XP032492798, ISSN: 2158-5873, DOI: 10.1109/WIAMIS.2013.6616148

Also Published As

Publication number Publication date
EP3565276A1 (de) 2019-11-06
DE102018206979A1 (de) 2019-11-07
US20190342676A1 (en) 2019-11-07
DK3565276T3 (da) 2021-11-22
CN110446149B (zh) 2021-11-02
CN110446149A (zh) 2019-11-12
US10959028B2 (en) 2021-03-23

Similar Documents

Publication Publication Date Title
EP3565276B1 (de) Verfahren zum betrieb eines hörgeräts und hörgerät
EP2956920B1 (de) Orientierungshilfe für blinde und sehbehinderte mit einer vorrichtung zur detektierung einer umgebung
JP6448596B2 (ja) 補聴システム及び補聴システムの作動方法
EP2603018B1 (de) Hörvorrichtung mit Sprecheraktivitätserkennung und Verfahren zum Betreiben einer Hörvorrichtung
EP1530402B1 (de) Verfahren zur Adaption eines Hörgeräts unter Berücksichtigung der Kopfposition und entsprechendes Hörgerät
EP2519033B1 (de) Verfahren zum Betrieb eines Hörgerätes mit verringerter Kammfilterwahrnehmung und Hörgerät mit verringerter Kammfilterwahrnehmung
EP3788802B1 (de) Verfahren zum betrieb eines hörgeräts und hörgerät
US20220070567A1 (en) Hearing device adapted for orientation
EP4035422A1 (de) Verfahren zum betreiben eines hörsystems und hörsystem
EP3926981B1 (de) Hörsystem mit mindestens einem am kopf des nutzers getragenen hörinstrument sowie verfahren zum betrieb eines solchen hörsystems
DE102020208720B4 (de) Verfahren zum umgebungsabhängigen Betrieb eines Hörsystems
DE112016003273T5 (de) Informationsverarbeitungsvorrichtung, informationsverarbeitungsverfahren und programm
DE102014215487A1 (de) Verfahren und Vorrichtung zum Unterstützen einer Bewegung einer Person mit einer Bewegungsstörung und Signalsystem für eine Person mit einer Bewegungsstörung
WO2022128083A1 (de) Verfahren zur bestimmung der höranstrengung eines hörgeräteträgers und entsprechender einstellung von hörgeräteparametern
WO2011036288A1 (de) Vorrichtung und verfahren zur assistenz für sehbehinderte personen mit dreidimensional ortsaufgelöster objekterfassung
EP4456559A1 (en) Providing optimal audiology based on user's listening intent
EP3833053B1 (de) Verfahren zum umgebungsabhängigen betrieb eines hörsystems
EP4327566B1 (de) Verfahren zur bestimmung der kopfbezogenen übertragungsfunktion
EP4273877A1 (de) Tragbares system sowie computerimplementiertes verfahren zur unterstützung einer person mit seheinschränkungen
DE102018204260B4 (de) Auswerteeinrichtung, Vorrichtung, Verfahren und Computerprogrammprodukt für eine hörgeschädigte Person zur umgebungsabhängigen Wahrnehmung eines Schallereignisses
Andersen Ordinal models of audiovisual speech perception
DE102015212001A1 (de) Verfahren für die Ansteuerung einer Vorrichtung und Vorrichtung zur Durchführung des Verfahrens
DE112018008012T5 (de) Informationsverarbeitungsvorrichtung und informationsverarbeitungsverfahren
DE102012100999A1 (de) Verfahren, Datenbank und Signalverarbeitungseinrichtung zur Rekonstruktion von Repräsentationen neuronaler Signale

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION HAS BEEN PUBLISHED

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX Request for extension of the european patent

Extension state: BA ME

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20200505

RBV Designated contracting states (corrected)

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: EXAMINATION IS IN PROGRESS

17Q First examination report despatched

Effective date: 20200728

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: EXAMINATION IS IN PROGRESS

GRAP Despatch of communication of intention to grant a patent

Free format text: ORIGINAL CODE: EPIDOSNIGR1

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: GRANT OF PATENT IS INTENDED

INTG Intention to grant announced

Effective date: 20210322

RIN1 Information on inventor provided before grant (corrected)

Inventor name: MAULER, DIRK

Inventor name: KUEBERT, THOMAS

Inventor name: WURZBACHER, TOBIAS

GRAS Grant fee paid

Free format text: ORIGINAL CODE: EPIDOSNIGR3

GRAA (expected) grant

Free format text: ORIGINAL CODE: 0009210

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE PATENT HAS BEEN GRANTED

AK Designated contracting states

Kind code of ref document: B1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

REG Reference to a national code

Ref country code: CH

Ref legal event code: EP

REG Reference to a national code

Ref country code: DE

Ref legal event code: R096

Ref document number: 502019002106

Country of ref document: DE

REG Reference to a national code

Ref country code: IE

Ref legal event code: FG4D

Free format text: LANGUAGE OF EP DOCUMENT: GERMAN

Ref country code: AT

Ref legal event code: REF

Ref document number: 1425068

Country of ref document: AT

Kind code of ref document: T

Effective date: 20210915

REG Reference to a national code

Ref country code: DK

Ref legal event code: T3

Effective date: 20211117

REG Reference to a national code

Ref country code: LT

Ref legal event code: MG9D

REG Reference to a national code

Ref country code: NL

Ref legal event code: MP

Effective date: 20210825

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: BG

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20211125

Ref country code: LT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210825

Ref country code: HR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210825

Ref country code: NO

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20211125

Ref country code: PT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20211227

Ref country code: SE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210825

Ref country code: RS

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210825

Ref country code: ES

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210825

Ref country code: FI

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210825

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: PL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210825

Ref country code: LV

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210825

Ref country code: GR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20211126

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: NL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210825

REG Reference to a national code

Ref country code: DE

Ref legal event code: R026

Ref document number: 502019002106

Country of ref document: DE

PLBI Opposition filed

Free format text: ORIGINAL CODE: 0009260

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: SM

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210825

Ref country code: SK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210825

Ref country code: RO

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210825

Ref country code: EE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210825

Ref country code: CZ

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210825

Ref country code: AL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210825

PLAB Opposition data, opponent's data or that of the opponent's representative modified

Free format text: ORIGINAL CODE: 0009299OPPO

PLAX Notice of opposition and request to file observation + time limit sent

Free format text: ORIGINAL CODE: EPIDOSNOBS2

26 Opposition filed

Opponent name: OTICON A/S

Effective date: 20220525

R26 Opposition filed (corrected)

Opponent name: OTICON A/S

Effective date: 20220525

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: IT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210825

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: SI

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210825

PLBB Reply of patent proprietor to notice(s) of opposition received

Free format text: ORIGINAL CODE: EPIDOSNOBS3

REG Reference to a national code

Ref country code: BE

Ref legal event code: MM

Effective date: 20220430

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: MC

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210825

Ref country code: LU

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20220426

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: BE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20220430

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: IE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20220426

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: FR

Payment date: 20230417

Year of fee payment: 5

Ref country code: DK

Payment date: 20230419

Year of fee payment: 5

Ref country code: DE

Payment date: 20230418

Year of fee payment: 5

Ref country code: CH

Payment date: 20230502

Year of fee payment: 5

REG Reference to a national code

Ref country code: DE

Ref legal event code: R103

Ref document number: 502019002106

Country of ref document: DE

Ref country code: DE

Ref legal event code: R064

Ref document number: 502019002106

Country of ref document: DE

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: GB

Payment date: 20230420

Year of fee payment: 5

RDAF Communication despatched that patent is revoked

Free format text: ORIGINAL CODE: EPIDOSNREV1

RDAG Patent revoked

Free format text: ORIGINAL CODE: 0009271

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: PATENT REVOKED

REG Reference to a national code

Ref country code: CH

Ref legal event code: PL

27W Patent revoked

Effective date: 20231010

GBPR Gb: patent revoked under art. 102 of the ep convention designating the uk as contracting state

Effective date: 20231010

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: MK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210825

Ref country code: CY

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210825

REG Reference to a national code

Ref country code: AT

Ref legal event code: MA03

Ref document number: 1425068

Country of ref document: AT

Kind code of ref document: T

Effective date: 20231010

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: HU

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT; INVALID AB INITIO

Effective date: 20190426

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: MT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210825