WO2022244849A1 - Sensory control method, sensory control system, conversion model generation method, conversion model generation system, relational expression conversion method, and program (Google Patents)
- Publication number
- WO2022244849A1 (PCT/JP2022/020863)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- sensory
- parameter
- physical
- presentation
- parameters
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/016—Input arrangements with force or tactile feedback as computer generated output to the user
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/0362—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 1D translations or rotations of an operating part of the device, e.g. scroll wheels, sliders, knobs, rollers or belts
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/01—Indexing scheme relating to G06F3/01
- G06F2203/014—Force feedback applied to GUI
Definitions
- the present disclosure relates to a sensory control method, a sensory control system, a transformation model generation method, a transformation model generation system, a relational expression transformation method, and a program for controlling physical properties related to sensory presentation.
- Sensory presentation includes tactile presentation, auditory presentation by sound, visual presentation by image display, and the like.
- Tactile presentation includes, for example, operation reaction force acting on body parts such as the fingers of a user operating a device (including cases where media such as touch pens or gloves are used), vibration presentation by driving actuators, electrical stimulation, and the like. Adjusting the signals that drive such devices has conventionally been used to adjust the sensory presentation.
- Patent Literature 1 discloses an example of a system for designing haptics. In this system, when an audio signal is received by an audio capture device, a haptic effect is determined based on the audio signal, and the haptic effect is output by a haptic output device.
- The system described in Patent Literature 1 receives, from an audio capture device, an audio signal that includes a user-desired haptic effect feature (e.g., a description of a concept such as impact, explosion, or rain), and a haptic effect with features that mimic that concept can be determined and output.
- JP 2019-220168 A; Japanese Patent No. 5662425; Japanese Patent Publication No. 2020-523068; Japanese Patent Publication No. 2013-519961
- The system of Patent Literature 1 can output a haptic effect having features that simulate the concept desired by the user. However, there is still room for improvement in presenting a tactile sensation that reflects human sensibility.
- The present disclosure is intended to solve the above-described conventional problems, and its purpose is to provide a sensory control method, a sensory control system, a conversion model generation method, a conversion model generation system, a relational expression conversion method, and a program that realize sensory presentation reflecting human sensibility.
- a sensory control method includes a reception step, a conversion step, and an output step.
- In the reception step, a kansei parameter (sensibility parameter) is received.
- In the conversion step, the received kansei parameter is converted into a physical parameter correlated with the kansei parameter, from among a plurality of types of physical parameters included in the physical characteristics relating to the sensory presentation.
- In the output step, a sensory presentation signal based on the converted physical parameter is output.
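The three steps above (reception, conversion, output) can be sketched as follows. This is an illustrative reading only: the linear conversion model, the particular physical parameters ("peak load" and "stroke"), and the triangular drive-signal envelope are assumptions for the sketch, not the patent's actual model.

```python
# Hypothetical sketch of the reception, conversion, and output steps.

def receive_kansei_parameter(user_input: float) -> float:
    """Reception step: accept a kansei parameter, e.g. a rating on a
    'light (1) - heavy (7)' scale."""
    if not 1.0 <= user_input <= 7.0:
        raise ValueError("expected a rating on the 1-7 scale")
    return user_input

def convert_to_physical(kansei: float) -> dict:
    """Conversion step: map the kansei parameter to physical parameters
    assumed to correlate with it (assumed linear here)."""
    return {
        "peak_load_N": 1.0 + 0.5 * (kansei - 1.0),   # operation reaction force
        "stroke_mm": 0.2 + 0.05 * (kansei - 1.0),    # travel of the operation part
    }

def output_presentation_signal(physical: dict) -> list:
    """Output step: emit a drive-signal envelope for the presentation unit
    based on the converted physical parameters (triangular profile)."""
    peak = physical["peak_load_N"]
    return [peak * (1 - abs(i - 5) / 5) for i in range(11)]

signal = output_presentation_signal(convert_to_physical(receive_kansei_parameter(4.0)))
```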
- a transformation model generation method includes a storage step, an extraction step, and a generation step.
- In the storage step, correspondence information between a physical characteristic related to a predetermined sensory presentation and a kansei parameter indicating the degree of sensory expression for that sensory presentation is stored for each of one or more types of sensory presentation.
- In the extraction step, physical parameters correlated with the kansei parameters are extracted from among the plurality of types of physical parameters included in the physical characteristics relating to the sensory presentation.
- In the generation step, a conversion model capable of converting a newly received kansei parameter into a physical parameter correlated with that kansei parameter is generated.
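A minimal sketch of the storage, extraction, and generation steps under assumed data: store (kansei parameter, physical parameter) correspondences, extract the physical parameter most strongly correlated with the kansei parameter, and fit a simple linear conversion model. Pearson correlation and least-squares fitting are assumptions here; the patent does not prescribe a specific statistic or model form, and all numeric values are illustrative.

```python
import statistics

# Storage step: correspondence info for several sample presentations
# (illustrative values).
kansei = [1.0, 2.0, 3.0, 4.0, 5.0]             # e.g. "heaviness" rating
physical = {
    "peak_load_N": [1.1, 1.6, 2.0, 2.6, 3.0],  # strongly correlated
    "click_rate":  [0.9, 0.3, 0.7, 0.2, 0.8],  # weakly correlated
}

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    mx, my = statistics.fmean(x), statistics.fmean(y)
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = (sum((a - mx) ** 2 for a in x) * sum((b - my) ** 2 for b in y)) ** 0.5
    return num / den

# Extraction step: keep the physical parameter with the strongest correlation.
best = max(physical, key=lambda name: abs(pearson(kansei, physical[name])))

# Generation step: fit kansei -> physical by least squares (slope/intercept).
y = physical[best]
mx, my = statistics.fmean(kansei), statistics.fmean(y)
slope = sum((a - mx) * (b - my) for a, b in zip(kansei, y)) / sum((a - mx) ** 2 for a in kansei)
intercept = my - slope * mx
convert = lambda new_kansei: slope * new_kansei + intercept  # the "conversion model"
```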
- A relational expression conversion method according to an embodiment of the present disclosure includes a step of converting a first relational expression, described in terms of a plurality of types of physical parameters included in physical characteristics related to sensory presentation, into a second relational expression described in terms of a plurality of types of kansei parameters corresponding to those physical parameters.
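One way to picture this conversion, under the assumption of linear per-parameter conversion models, is direct substitution: given a first relational expression over physical parameters, sum of a_i * p_i = c, and conversions p_i = s_i * k_i + b_i from each kansei parameter k_i, substituting yields the second relational expression over kansei parameters. The linear form and all coefficients are illustrative assumptions, not the patent's method.

```python
def convert_relation(coeffs: dict, const: float, linear_maps: dict):
    """Rewrite sum_i coeffs[i]*p_i = const, with p_i = s_i*k_i + b_i,
    into sum_i new_coeffs[i]*k_i = new_const."""
    new_coeffs, new_const = {}, const
    for name, a in coeffs.items():
        s, b = linear_maps[name]
        new_coeffs[name] = a * s      # a*(s*k + b) contributes a*s*k ...
        new_const -= a * b            # ... and a*b moves to the constant side
    return new_coeffs, new_const

# Example: 2*p1 + 3*p2 = 10, with p1 = 0.5*k1 + 1 and p2 = 2*k2 - 1.
coeffs, const = convert_relation({"p1": 2.0, "p2": 3.0}, 10.0,
                                 {"p1": (0.5, 1.0), "p2": (2.0, -1.0)})
# Result: 1.0*k1 + 6.0*k2 = 11.0
```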
- a program according to an embodiment of the present disclosure causes a computer to execute any of the methods according to the above embodiments.
- a sensory control system includes an input unit and a processor.
- the input unit receives sensitivity parameters.
- The processor converts the received kansei parameter into a physical parameter correlated with the kansei parameter, from among a plurality of types of physical parameters included in the physical characteristics related to the sensory presentation, and outputs a sensory presentation signal based on the converted physical parameter.
- a transformation model generation system includes a storage unit and a processor.
- the storage unit stores, for each of one or more types of sensation presentation, correspondence information between a physical characteristic relating to a given sensation presentation and a sensitivity parameter indicating the degree of sensation expression for that sensation presentation.
- The processor extracts a physical parameter correlated with the kansei parameter from among a plurality of types of physical parameters included in the physical characteristics related to the sensory presentation, based on the correspondence information for each of the one or more types of sensory presentation, and generates, based on the kansei parameter and the extracted physical parameter, a conversion model capable of converting a newly received kansei parameter into a physical parameter correlated with that kansei parameter.
- FIG. 1 is a block diagram showing the basic configuration of a sensory control system according to an embodiment of the present disclosure
- FIG. 1 is a block diagram illustrating a haptic control system as a first embodiment of a sensory control system according to the present disclosure
- FIG. 3 is an explanatory diagram illustrating an example of a configuration of a tactile sense presentation unit included in the tactile sense control system of FIG. 2 with an equivalent circuit using a Laplace transform operator
- FIG. 3 is an explanatory diagram showing an equivalent model of an example of a tactile sense presentation unit included in the tactile sense control system of FIG. 2
- FIG. 5 is an explanatory diagram illustrating an equivalent circuit and an internal structure of an example of the actuator shown in FIG. 4;
- FIG. 4 is an explanatory diagram showing an equivalent circuit and an internal structure of an example of the actuator shown in FIG. 4;
- FIG. 4 is a flow chart describing a transformation model generation method using the transformation model generation system of the present disclosure
- 4 is a flow chart illustrating a specific example of a transformation model generation method and a tactile sense presentation method of the present disclosure
- 4 is a flowchart illustrating a haptic control method using the haptic control system of the present disclosure
- FIG. 4 is an explanatory diagram of physical characteristics of a pressing-type operation tool
- FIG. 4 is an explanatory diagram showing an example of physical parameters of a pressing type operation tool
- FIG. 4 is an explanatory diagram of the operation of the pressing type operating tool
- FIG. 4 is an explanatory diagram showing the relationship between the sensibility parameter and the physical parameter in the pressing type operation tool
- FIG. 4 is an explanatory diagram showing the relationship between the sensibility parameter and the physical parameter in the pressing type operation tool;
- FIG. 4 is an explanatory diagram showing the relationship between the sensibility parameter and the physical parameter in the pressing type operation tool;
- FIG. 4 is an explanatory diagram showing the relationship between the sensibility parameter and the physical parameter in the pressing type operation tool;
- FIG. 4 is an explanatory diagram of physical characteristics of a pressing-type operation tool;
- FIG. 4 is an explanatory diagram showing the relationship between the sensibility parameter and the physical parameter in the rotary operation tool;
- FIG. 4 is an explanatory diagram of physical characteristics of a rotary operation tool;
- FIG. 3 is a block diagram showing the configuration of a rotary operation mold tool
- 2 is a block diagram illustrating a haptic control system as a second embodiment of a sensory control system according to an embodiment of the present disclosure
- FIG. 21 is a sequence diagram showing the operation of the haptic control system shown in FIG. 20;
- a diagram showing the first relational expression according to an example of the conversion model generation method of the present disclosure;
- a diagram showing the second relational expression according to an example of the conversion model generation method of the present disclosure;
- FIG. 10 is a diagram showing an example of temporal change in intensity of a drive signal supplied to a weight based on a tactile sense presentation signal;
- an example of a perspective view of a haptic control device;
- FIG. 10 is an example of a diagram showing an outline of work for a user to adjust an operational feel using a haptic control device
- FIG. 10 is an example of a diagram showing an outline of work for a user to adjust an operational feel using a haptic control device
- 3 is an example of a functional block diagram illustrating functions of a haptic control device
- FIG. 10 is an example of a flowchart diagram showing a flow of learning in generation of a classifier.
- FIG. 4 is a flow chart showing the flow of learning the correspondence between the expression frequency of the sensitivity parameter and the physical parameter.
- FIG. 10 is an example of a diagram showing an outline of work for a user to adjust an operational feel using a haptic control device
- 3 is an example of a functional block diagram illustrating functions of a haptic control device
- FIG. 10 is an example of a flowchart diagram showing a flow of learning in generation of a classifier.
- FIG. 4 is a flow chart showing the flow of learning the correspondence between the expression frequency of the sensitivity parameter and the physical parameter;
- FIG. 10 is an example of a flow chart showing a flow in which the tactile sense control device uses the classification section and the first to third transformation models to present the user's preferred operational feel.
- FIG. 10 is a diagram showing an outline of work for a user to adjust an operational feel using a haptic control device;
- an example of a functional block diagram illustrating functions of a haptic control device;
- FIG. 10 is an example of a flow chart showing the flow of learning a physical parameter (load-displacement curve) corresponding to the expression frequency;
- FIG. 10 is an example of a flowchart showing the flow of curve fitting of the load-displacement curve of the reference operation tool.
- FIG. 10 is an example of a flow chart showing a flow in which the tactile sense control device uses the physical parameter conversion section and the comparison section to present an operation feel preferred by the user.
- FIG. 10 is a diagram showing an example of a neural network in which a classifier is implemented by a neural network;
- FIG. 10 is a diagram showing an example of a decision tree when the classification unit is implemented by a decision tree;
- a diagram explaining the first input screen of an expression frequency;
- 1 is an example of a functional block diagram of a haptic control system in which the haptic control device of the first form is applied to a client server system;
- FIG. 10 is an example of a sequence diagram explaining the operation of the haptic control system;
- FIG. 10 is an example of a functional block diagram of a haptic control system in which the haptic control device of the second embodiment is applied to a client server system;
- FIG. 10 is an example of a sequence diagram explaining the operation of the second form of the haptic control system;
- FIG. 12 is a diagram showing the configuration of a tactile control system of the sensory control system (Example 2);
- FIG. 5 is a diagram showing an example of operation unit parameters;
- FIG. 5 is a diagram for explaining differences in physical characteristics of operation units;
- FIG. 10 is a diagram illustrating several methods of detecting the size and mass of an operation part by an operation part sensor;
- a diagram explaining the method of estimating the mass of an operation part by calibration;
- a diagram explaining correction;
- FIG. 10 is an example of a functional block diagram of a haptic control system in which the haptic control device of the second embodiment is applied to a client server system;
- FIG. 10 is a flow chart diagram showing a process of adjusting a tactile sense presentation signal according to physical parameters of an operation unit equipped with a tactile sense control system
- FIG. 10 is a flow chart diagram showing a process of adjusting a tactile sense presentation signal in accordance with physical parameters of an operation unit equipped with a haptic control system (modification).
- FIG. 46 is a diagram showing the configuration of a haptic control system as a second embodiment of the sensory control system shown in FIG. 45 together with the flow of signals;
- FIG. 4 is a sequence diagram in which a communication device and a terminal device communicate with each other to estimate sensitivity parameters of an operation unit attached thereto;
- FIG. 4 is a diagram for explaining static characteristics obtained by a rigid pressing tool and dynamic characteristics obtained by a finger model pressing tool in which a rigid body and an elastic body are integrated;
- FIG. 10 is a diagram for explaining the relative positions of the finger and the manipulator when the finger is deformed;
- a diagram explaining a finger model pressing tool;
- FIG. 10 is a diagram illustrating generation of a sensation presentation signal with a click feeling;
- FIGS. 1A and 1B are an example of a functional configuration diagram and a block diagram of a pressing type operation tool;
- FIG. 5 is a diagram for explaining dynamic characteristics when an operation tool is pressed by a finger model pressing tool;
- FIG. 5 is a diagram for explaining temporal transition of relative positions of a finger model pressing tool and an operating tool;
- FIG. 4 is a diagram illustrating dynamic characteristics in more detail together with periods A to C;
- FIG. 10 is an example of a diagram showing dynamic characteristics when a plurality of manipulating tools having different dynamic characteristics are pressed by a finger model pressing tool;
- FIG. 10 is an example of a flowchart illustrating the flow of determination of physical parameters correlated with kansei parameters;
- FIG. 11 is a scatter diagram of a set of dynamic characteristics and expression frequency of each operation implement in the sensitivity parameter of “with (no) sense of return” acquired by the processor in step ST153;
- FIG. 10 is a scatter diagram of a set of dynamic characteristics and expression frequency of each operation tool in the sensitivity parameter of “there is (no) a feeling of being sucked” acquired by the processor in step ST153;
- FIG. 11 is a scatter diagram of a set of dynamic characteristics and expression frequency of each operation implement in the sensitivity parameter of “with (no) sense of return” acquired by the processor in step ST153;
- FIG. 4 is a diagram showing a list of correlation coefficients between each sensitivity parameter and each dynamic characteristic;
- FIG. 10 is an example of a sequence diagram in which a communication device and a terminal device communicate with each other to estimate a sensitivity parameter of a mounted operation tool;
- FIG. 1 shows a basic configuration of a sensory control system 100 according to aspect 1 of the present disclosure.
- The sensory control system 100 shown in FIG. 1 includes a storage unit 11. The storage unit 11 stores a sensory parameter-physical parameter conversion model (hereinafter simply referred to as the "conversion model 15").
- The sensation presentation unit 102 is a component that presents sensations to a person. It can consist of a tactile presenter, an auditory presenter, a visual presenter such as a display device that presents images, or any combination thereof.
- the conversion model 15 is a conversion model that can convert a sensory parameter into a physical parameter that correlates with the sensory parameter.
- the kansei parameter is a parameter that indicates the degree of sensory expression with respect to the sensory presentation.
- The kansei parameter may be one that indicates, by a several-stage evaluation, which of a pair of two sensory expressions (adjectives, onomatopoeia, phonetic symbolic words, etc.) the sensation is closer to.
- Examples of combinations of two sensory expressions include "comfortable-uncomfortable", "light-heavy", and the like.
- For example, the expression frequency of the kansei parameter is set to "1" for "most comfortable", moves toward "uncomfortable" as it increases through "2", "3", and "4", and "7" can be expressed as "least comfortable".
- the sensitivity parameter is not limited to the combination of two sensory expressions, and may be the intensity of one sensory expression.
- A kansei parameter may also be expressed multidimensionally by taking a plurality of sensory expression axes and combining those axes.
- Physical parameters are included in the physical characteristics related to sensory presentation, and there are multiple types.
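The kansei parameter descriptions above can be made concrete with a small data model: a seven-stage evaluation between two paired sensory expressions, plus a multidimensional variant combining several expression axes. The class and field names below are hypothetical, introduced only for this sketch.

```python
# Illustrative encoding of a kansei parameter as a 7-stage semantic
# differential between two paired sensory expressions.
from dataclasses import dataclass

@dataclass
class KanseiAxis:
    positive: str           # e.g. "comfortable"
    negative: str           # e.g. "uncomfortable"
    stages: int = 7         # 1 = closest to positive, 7 = closest to negative

    def describe(self, rating: int) -> str:
        """Report which sensory expression the rating is closer to."""
        if not 1 <= rating <= self.stages:
            raise ValueError("rating out of range")
        mid = (self.stages + 1) // 2
        if rating < mid:
            return f"closer to '{self.positive}'"
        if rating > mid:
            return f"closer to '{self.negative}'"
        return "neutral"

# One axis: "comfortable (1) - uncomfortable (7)".
comfort = KanseiAxis("comfortable", "uncomfortable")

# Multidimensional kansei parameter: combine several axes into a vector.
kansei_vector = {
    "comfortable-uncomfortable": 2,
    "light-heavy": 6,
}
```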
- the physical properties related to sensation presentation are physical properties that can affect the entire sensory transmission system including the sensation presentation means such as the sensation presentation unit 102 and the human body parts when presenting sensations to a person. That is, the physical properties related to sensation presentation are not limited to the physical properties of the sensation presentation means, but may also include the physical properties of the human body part to which the sensation is presented.
- Processor 101 controls the overall operation of the sensory control system 100.
- Processor 101 is a general term for one or more processors.
- each component of sensory control system 100 may be controlled by a plurality of processors, or one processor may control all components.
- each component of the sensory control system 100 may be able to transmit information to each other so as to execute the transformation model generation method and the sensory control method, which will be described later, and the connection method is not particularly limited.
- the connection method of each component of the sensory control system 100 may be wired connection or wireless connection including network connection.
- the sensory control system 100 may consist of multiple devices or may be a single device.
- the transformation model 15 included in the sensory control system 100 is obtained by the following transformation model generation method.
- In the conversion model generation method, first, the kansei database 16 stores, for each of one or more types of sensory presentation, correspondence information in which a physical characteristic related to a given sensory presentation is associated with a kansei parameter indicating the degree of sensory expression for that sensory presentation (storage step).
- Next, the processor 101 extracts the physical parameters correlated with the kansei parameters from among the plurality of types of physical parameters included in the physical characteristics related to the sensory presentation, based on the correspondence information for each of the one or more types of sensory presentation in the kansei database 16 (extraction step). After that, the processor 101 generates the conversion model 15 based on the kansei parameters and the extracted physical parameters (generation step).
- the conversion model 15 generated in this manner is a conversion model that can convert a newly received sensory parameter into a physical parameter that correlates with the sensory parameter.
- the sensory control system 100 functions as a transformation model generation system when executing the transformation model generation method described above.
- In the extraction step, in order to derive the plurality of types of physical parameters included in the physical characteristics related to the sensory presentation, the physical characteristics of the sensory presentation means may be extracted, or the physical characteristics of the system including the human body part may be extracted.
- The conversion model generation method may be executed by a conversion model generation system separate from the sensory control system 100.
- In that case, the conversion model generation system comprises at least the kansei database 16 and the processor 101.
- Alternatively, the sensory control system 100 may acquire the conversion model 15 obtained by another conversion model generation system executing the conversion model generation method and store it in the storage unit 11. In this case, the sensory control system 100 need not include the kansei database 16.
- the correspondence information stored in the sensitivity database 16 may be updatable, and the conversion model 15 may also be updatable based on the updated correspondence information.
- the Kansei database 16 adds or updates the correspondence information described above for one or more types of sensory presentations.
- In that case, the processor 101 re-extracts the physical parameters correlated with the kansei parameters based on the correspondence information for each of the one or more types of sensory presentation in the kansei database 16.
- The processor 101 then updates the conversion model 15 based on the kansei parameters and the newly extracted physical parameters.
- the sensory control system 100 executes the following sensory control method.
- the sensory control system 100 receives an input of sensitivity parameters from a user or the like via the input unit 4 (accepting step).
- the processor 101 converts the received kansei parameter into a physical parameter correlated with the kansei parameter among a plurality of types of physical parameters included in the physical characteristics related to the sensation presentation, based on the conversion model 15 (conversion step).
- Processor 101 then generates a sensation presentation signal based on the converted physical parameter and outputs it to sensation presentation section 102 (output step).
- the sensation presentation unit 102 presents a sensation to the user or the like based on the sensation presentation signal (a sensation presentation step).
- In this way, the sensory control system 100 can present, to the user or the like, a sensation based on a sensory presentation signal derived from the physical parameter correlated with the received kansei parameter; that is, a sensation reflecting human sensibility can be presented.
- FIG. 2 shows the configuration of the haptic control system 1 as the first embodiment of the sensory control system 100 shown in FIG. 1, together with the flow of signals.
- The haptic control system 1 shown in FIG. 2 has a main control device 10.
- The main control device 10 is a personal computer, a server, or the like, and has a processor (CPU) 14 and a storage unit 11 comprising RAM and ROM.
- The main control device 10 is provided with arithmetic function units 12 and 13 executed by the processor 14.
- the haptic control system 1 shown in FIG. 2 has an input/output device 3 .
- The input/output device 3 includes an input unit 4, a display unit 5, and a processor that operates the input unit 4 and the display unit 5.
- the input/output device 3 and the main controller 10 are connected via various interfaces.
- the haptic control system 1 includes a haptic presentation device 20 .
- the haptic presentation device 20 includes a terminal processor 18 that controls its operation.
- The arithmetic function unit 13, functioning as an output unit of the main control device 10, and the tactile presentation device 20 are connected via cables and connectors and an interface such as USB (Universal Serial Bus), HDMI (High-Definition Multimedia Interface, registered trademark), Ethernet (registered trademark), or Wi-Fi.
- a conversion model 15 is stored in the storage unit 11 of the main controller 10 shown in FIG.
- the transformation model 15 is a transformation model capable of transforming received kansei parameters into physical parameters correlated with the kansei parameters, as described in the description of the sensory control system 100 in FIG.
- the sensibility parameter in this example is a parameter indicating the degree of sensory expression with respect to tactile presentation.
- The kansei parameter in this example may be the user's evaluation, expressed in terms of kansei, of the operational feeling when operating a predetermined operation tool.
- That is, the kansei parameter in this example is input so as to reflect the operation of a predetermined operation tool.
- the physical parameters in this example are included in the physical properties related to tactile sensation presentation, and there are multiple types.
- the physical parameter in this example may be a physical parameter included in a physical characteristic that realizes tactile presentation when a predetermined operating tool is operated.
- The physical parameters in this example can be used to operate the tactile presentation device 20 so as to reproduce the sensory presentation of a predetermined operation tool.
- the tactile sense presentation device 20 includes at least a tactile sense presentation section 30 .
- the tactile sense presentation device 20 presents a tactile sense to the user by controlling the tactile sense presentation section 30 based on the tactile sense presentation signal.
- the tactile sense presentation unit 30 is an example of the sensation presentation unit 102 in FIG. 1 .
- the tactile sense presentation unit 30 may present a tactile sense by generating drag or vibration.
- Examples of the tactile sense presentation unit 30 that generates drag or vibration include voice coil motors (VCM), linear actuators (both resonant and non-resonant types), piezo elements, eccentric motors, shape memory alloys, magneto-rheological fluids, electroactive polymers, and the like.
- the tactile sense presentation unit 30 may present a tactile sense by presenting a thermal sensation.
- a Peltier device is used as the tactile sense presentation unit 30 that presents a thermal sensation.
- a Peltier element utilizes heat transfer due to the Peltier effect when direct current is applied to two metal plates facing each other, and the amount of heat on the surfaces of the metal plates changes according to the direction of the current. By controlling the current direction and current amount, it is possible to make the user's finger or other body part that touches the Peltier element feel warm or cold.
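The current-direction/current-amount control described above can be sketched as a tiny driver function. This is a hedged illustration only: the function, its limits, and the sign convention (positive current for warming the contact surface) are assumptions for the sketch; a real device would use a PWM/H-bridge driver sized to the specific Peltier element.

```python
# Hypothetical driver sketch for thermal presentation with a Peltier
# element: current direction selects warm vs. cool at the contact
# surface; current magnitude sets the intensity.

def peltier_drive(target: str, intensity: float) -> tuple:
    """Return (current_direction, current_amperes) for the element.

    target: "warm" or "cool"; intensity: 0.0-1.0 fraction of max current.
    """
    MAX_CURRENT_A = 2.0   # assumed device limit
    if target not in ("warm", "cool"):
        raise ValueError("target must be 'warm' or 'cool'")
    if not 0.0 <= intensity <= 1.0:
        raise ValueError("intensity must be within [0, 1]")
    direction = +1 if target == "warm" else -1   # assumed sign convention
    return direction, intensity * MAX_CURRENT_A
```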
- the tactile sense presentation unit 30 may present a tactile sense by applying electrical stimulation.
- As the tactile sense presentation unit 30 that provides electrical stimulation, for example, there is a configuration that provides electrical stimulation by capacitive coupling with a body part such as the user's fingertip.
- the tactile sense presentation unit 30 may present an aerial haptic sense.
- As the tactile sense presentation unit 30 that presents an aerial tactile sense, for example, a structure can be used that presents a tactile sense by generating air vibrations using ultrasonic waves or the like and causing a body part such as the user's fingertip to resonate with those air vibrations.
- the tactile control system 1 may include an operating device 33, and the tactile sense presentation unit 30 may present a tactile sense to the user who operates the operating device 33.
- the tactile sense presentation unit 30 may present a tactile sense to the user who operates the operation device 33 to present a predetermined operational feeling.
- the tactile sense presentation unit 30 may present an operation feel that mimics the operation feel of a predetermined operation tool.
- Examples of operation tools whose operational feeling is simulated include a push switch that receives a pressing operation, a rotary switch that receives a rotation operation, a joystick that receives a tilting operation, a slide switch that receives a slide operation on its slide operation section, and a touch panel that accepts contact, pressing, and tracing operations on an operation panel.
- As the operation device 33, a device of any form that can be operated in the same manner as the above-described predetermined operation tool can be used.
- The operation device 33 may be in a form that imitates a predetermined operation tool, or in a form unrelated to the predetermined operation tool; it may also be an operation device, such as an operation glove, that accepts the user's operation.
- the tactile sense presentation unit 30 may present a tactile sense to the user regardless of the operation of the operation device 33 . In that case, the haptic control system 1 does not have to include the operating device 33 .
- the tactile presentation device 20 may include various sensors such as a position sensor 27 and an acceleration sensor 28.
- The tactile sense presentation device 20 is provided with various sensors that detect physical quantities of at least one of the tactile sense presentation device 20 itself, the operation device, and the user's body part, and the driving of the tactile sense presentation unit 30 can be controlled based on the detected physical quantities.
- As the sensors, for example, torque sensors, angular velocity sensors, temperature sensors, pressure sensors (including atmospheric pressure sensors), humidity sensors, magnetic sensors, optical sensors, ultrasonic sensors, myoelectric sensors, and the like can be used.
- FIGS. 3 to 5 relate to a configuration that reproduces the tactile sensation of operating a press-type operation tool, such as a tact switch (registered trademark), that generates an operation reaction force.
- the tactile sense presentation unit 30 reproduces a tactile sense corresponding to desired sensibility parameters based on a tactile sense presentation signal given from the main controller 10 .
- The tactile sense presentation unit 30 realizes a tactile sense (here, an operation feeling) corresponding to a desired sensibility parameter, and can therefore be used in place of a pressing-type operation tool.
- It is also possible to use the tactile sense presentation device 20 to reproduce the operation reaction force, evaluate the relationship between the sensibility parameter representing the operation feeling and the physical parameters included in the physical characteristics of operating the tactile sense presentation device 20, and use that evaluation as a guideline when designing a pressing-type operation tool.
- FIG. 4 shows an equivalent model showing an example of the components of the tactile sense presentation unit 30.
- FIG. 5 shows an equivalent circuit and the internal structure of the actuator 39 included in the tactile sense presentation unit 30. The arrow F shown in FIG. 5 indicates the operation reaction force (a vector quantity).
- the principle of operation of the tactile sense presentation unit 30 is explained using an equivalent circuit using a Laplace transform operator.
- the tactile sense presentation section 30 may have a movable section 21 .
- the operating device 33 shown in FIG. 2 is integrated with the movable portion 21 shown in FIG.
- the operation device 33 may be provided outside the system of the tactile presentation device 20 , and the structure may be such that the movable portion 21 is moved by operating the operation device 33 .
- the tactile sense presentation unit 30 has an actuator 39 .
- the actuator 39 is provided with a bobbin 24 and a coil 25 wound around the bobbin 24 .
- the bobbin 24 and coil 25 are also part of the movable portion 21 .
- the tactile sense presentation section 30 may include a spring member 26 .
- the spring member 26 has a predetermined spring constant and is composed of, for example, a coil spring.
- The spring member 26 is held in a compressed state within the tactile sense presentation unit 30 and, in normal use, gives an operation reaction force in the direction opposite to the direction in which the movable unit 21 is pressed (the upward direction in FIG. 4).
- The spring constant of the spring member 26 is indicated by "Ks".
- the movable portion 21 is acted upon by an operation reaction force based on the viscosity coefficient "C" caused by the lubricating oil, sliding friction on the mechanism, and the like.
- the stroke amount in the direction in which the movable portion 21 is pressed is indicated by "x".
- the actuator 39 has a tubular yoke 31 made of an iron-based magnetic material.
- the yoke 31 has an outer yoke 31a and a center yoke 31b.
- a cylindrical magnet 32 is fixed inside the outer yoke 31a.
- a cylindrical magnetic gap is formed between the center yoke 31b and the magnet 32, and a cylindrical bobbin 24 and a coil 25 are inserted in the magnetic gap.
- the position sensor 27 included in the tactile presentation device 20 shown in FIG. 2 detects the amount of movement (hereinafter referred to as "stroke amount") "x" of the movable portion 21 in the pressing operation direction.
- the acceleration sensor 28 included in the tactile sense presentation device 20 shown in FIG. 2 detects acceleration of the movable portion 21 .
- the operation range variable unit 29 included in the tactile sense presentation device 20 shown in FIG. 2 can change the total length of the stroke amount of the movable unit 21 in the pressing operation direction.
- the tactile sense presentation device 20 can present a tactile sense to the movable portion 21 via the operation device 33 by controlling the current “I” applied to the coil 25 of the tactile sense presentation section 30 .
- the tactile sensation presented here is a change in the operation reaction force "F” with respect to a body part such as a user's finger pressing the movable part 21 in the pressing operation direction.
- This operation reaction force "F” is a resistance force that reproduces the operation reaction force of a pressing type operation tool that generates an operation reaction force with a disc-shaped leaf spring or a dome-shaped leaf spring.
- A model of the tactile sense presentation unit 30 is shown in FIG. 4. Equation 1 below expresses the operation of the tactile presentation device 20 as an equation of force.
- The left side of Equation 1 indicates the force obtained by multiplying the mass "M" of the movable portion 21 by its acceleration.
- the first term on the right side is the operation reaction force generated by the actuator 39
- the second term is the operation reaction force generated by the spring member 26
- the third term is the operation reaction force caused by the viscosity coefficient "C”.
- The spring constant "Ks" and the viscosity coefficient "C" are substantially constant. If the tactile sense presentation unit 30 includes an element that varies the spring constant or the viscosity coefficient, the spring constant Ks and the viscosity coefficient C can be made variable by the tactile sense presentation signal.
- For example, the viscosity coefficient C can be varied by changing the viscosity of a functional fluid, and the spring constant Ks can likewise be made variable.
- Kv is a physical parameter extracted from the physical properties that realize tactile presentation. This physical parameter correlates with the Kansei parameter.
- the sensibility parameter changes according to the frequency of expression of adjectives expressing the feeling of operation when a predetermined operation tool is pressed.
- The tactile sensation presented by the actuator 39, that is, the operation reaction force "F", is expressed as F = (N × B × L) × I, where:
- N is the number of coil turns
- B is the magnetic flux density
- L is the effective length of the coil conductor in the magnetic field
- I is the coil current.
- The back electromotive force "e" derived from the model of the actuator 39 is represented by the differential equation e = −N(dΦ/dt) (Equation 4), where Φ is the magnetic flux.
- a second circuit portion (b) shows a model of the force acting on the actuator 39 .
- “F” is the reaction force
- " ⁇ ” is the acceleration of the movable part 21
- "v” is the velocity of the movable part 21
- "x” is the stroke amount of the movable part 21.
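- The force balance described above can be sketched numerically. The example below assumes Equation 1 has the form M·d²x/dt² = Kv·I − Ks·x − C·dx/dt (the actuator term minus the spring and viscosity terms, with symbols as defined in the text); all numerical values are illustrative and not taken from this document.

```python
# Minimal numerical sketch of the movable part's motion, assuming Equation 1
# has the form M*x'' = Kv*I - Ks*x - C*x'. Values are illustrative only.
def simulate(M=0.01, Kv=2.0, Ks=500.0, C=1.5, I=0.1, dt=1e-4, steps=5000):
    x, v = 0.0, 0.0            # stroke and velocity of the movable part 21
    trajectory = []
    for _ in range(steps):
        a = (Kv * I - Ks * x - C * v) / M   # acceleration from the force balance
        v += a * dt                          # semi-implicit Euler step
        x += v * dt
        trajectory.append(x)
    return trajectory

traj = simulate()
```

- With a constant coil current I, the simulated stroke settles toward the static equilibrium x = Kv·I/Ks, which is what the spring term implies.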
- "Kv" is not limited to a parameter varied within the formula, and may be a variable extracted from a data map stored in advance in which the relevant data are associated with each other.
- FIG. 6 shows an example of the process of generating the transformation model 15 stored in the haptic control system 1 of FIG. 2 (transformation model generation method).
- a transformation model generation method is executed by a transformation model generation system that includes at least an input unit, a storage unit, and a processor. "ST" in FIG. 6 indicates a processing step.
- the conversion model generation system accepts inputs of sensitivity parameters by multiple users for each of one or more types of tactile presentation.
- “one or more types of tactile sensation presentation” includes not only the tactile sensation when the user operates the operating tool, but also the tactile sensation given to the user when the user does not operate anything.
- one or more types of tactile sensations may be presented as tactile sensation presentations corresponding to content such as games and videos through a suit or gloves, and input of sensitivity parameters based on how the user feels about each may be accepted.
- This step is an example of the storage step in the transformation model generation method described for the sensory control system 100 of FIG.
- the conversion model generation system extracts physical parameters that are correlated with the sensibility parameters from the physical characteristics related to various tactile presentations.
- This step is an example of an extraction step in the conversion model generation method described for the sensory control system 100 of FIG.
- the conversion model generation system generates a conversion model 15 .
- This step is an example of the generation step in the transformation model generation method described for the sensory control system 100 of FIG.
- Generation of the conversion model 15 can be performed manually, by multiple regression analysis, by machine learning, or by various other analytical techniques.
- The conversion model 15 includes a model capable of converting one type of sensibility parameter into one type of physical parameter, a model capable of converting one type of sensibility parameter into multiple types of physical parameters, a model capable of converting multiple types of sensibility parameters into one type of physical parameter, a model capable of converting multiple types of sensibility parameters into multiple types of physical parameters, and the like.
- By deriving complex correlation information from information on the correlation between one type of sensibility parameter and one type of physical parameter using machine learning or the like, a model capable of converting multiple types of sensibility parameters into multiple types of physical parameters may be generated.
- The data structure of the conversion model 15 may be a correspondence table between sensibility parameters and physical parameters, or the parameters may be stored so as to be calculable by a function.
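- The two data structures named above can be sketched as follows. The mapping from expression frequency to a physical parameter (here, an area S4 value) is purely illustrative, as are the names `table_model` and `function_model`.

```python
# Two interchangeable representations of the conversion model 15 suggested by
# the text: a correspondence table and a function. Values are illustrative.
table_model = {1: 0.010, 2: 0.008, 3: 0.006, 4: 0.004}   # expression freq -> area S4

def function_model(freq, slope=-0.002, intercept=0.012):
    # assumed linear relation, for illustration only
    return slope * freq + intercept
```

- A table is simple and auditable; a function additionally covers inputs between the tabulated frequencies.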
- the transformation model generation system extracts information about the degree of correlation between each of the plurality of types of physical parameters and the Kansei parameters for the plurality of types of Kansei parameters. More specifically, the conversion model generation system extracts information on the plurality of degrees of correlation by multiple regression analysis using each of the plurality of types of sensory parameters as objective variables and the plurality of types of physical parameters as explanatory variables.
- the information about the degree of correlation includes, for example, a coefficient of determination, a constant term, or a value derived therefrom in multiple regression analysis.
- Next, the transformation model generation system uses the plurality of types of physical parameters and the plurality of pieces of correlation-degree information to generate a first relational expression that explains each of the plurality of types of sensibility parameters (first generation step). Specifically, let A1, A2, ..., An denote the n types of sensibility parameters, and let Bm1, Bm2, ..., Bmn denote the coefficients (information on the degrees of correlation) relating the m-th sensibility parameter Am to the n types of physical parameters; the first relational expression can then be generated by arranging these coefficients in a coefficient matrix.
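- The extraction of the coefficient matrix by regression can be sketched with a least-squares fit: each sensibility parameter is the objective variable and the physical parameters are the explanatory variables. The data below are synthetic, and the names `B`, `A`, and `true_K` are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
B = rng.uniform(0.0, 1.0, size=(25, 3))      # 25 trials x 3 physical parameters
true_K = np.array([[2.0, -1.0, 0.5],
                   [0.3,  1.2, -0.7],
                   [-0.4, 0.8,  1.5]])
A = B @ true_K.T                              # 3 sensibility parameters per trial

# One least-squares fit per sensibility parameter recovers the coefficients
# (the "information about the degree of correlation") relating it to the
# physical parameters.
K_hat, *_ = np.linalg.lstsq(B, A, rcond=None)
K_hat = K_hat.T                               # rows now index sensibility params
```

- With noiseless synthetic data the fit recovers the coefficient matrix exactly; with real sensory-test data the residuals would quantify the degree of correlation.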
- In Equation 5, one side (here, the left side) is a column vector representing the multiple types of sensibility parameters, and the other side (here, the right side) is the product of the coefficient matrix and a column vector representing the multiple types of physical parameters.
- The first relational expression is expressed as shown in FIG. 22.
- The coefficient matrix is a square matrix with n rows and n columns.
- Next, the transformation model generation system uses the multiple types of sensibility parameters and the plurality of degrees of correlation, based on the first relational expression, to generate a second relational expression that explains the multiple types of physical parameters (second generation step). Specifically, the transformation model generation system generates the second relational expression by multiplying both sides of the first relational expression shown in FIG. 22 by the inverse of the coefficient matrix from the left.
- The second relational expression has a column vector representing the multiple types of physical parameters as one side (here, the left side), and the product of the inverse of the coefficient matrix and the column vector representing the multiple types of sensibility parameters as the other side (here, the right side).
- Next, based on the second relational expression, the conversion model generation system generates the conversion model 15, which converts the multiple types of sensibility parameters into the multiple types of physical parameters correlated with them (third generation step). In this way, the conversion model generation system can generate a conversion model 15 capable of converting multiple types of sensibility parameters into multiple types of physical parameters.
- the coefficient matrix is a square matrix, but the coefficient matrix does not necessarily have to be a square matrix.
- the transformation model 15 capable of transforming multiple types of sensory parameters into multiple types of physical parameters can be generated.
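- The second and third generation steps can be sketched as follows, assuming the first relational expression has the form A = K·B (A: sensibility parameters, K: coefficient matrix, B: physical parameters). The pseudo-inverse reduces to the ordinary inverse for an invertible square K and also covers the non-square case mentioned above; all numbers are illustrative.

```python
import numpy as np

# Assumed first relational expression: A = K @ B. Values are illustrative.
K = np.array([[2.0, -1.0, 0.5],
              [0.3,  1.2, -0.7],
              [-0.4, 0.8,  1.5]])

def sensory_to_physical(A, K):
    # Second relational expression: B = pinv(K) @ A. np.linalg.pinv equals the
    # true inverse when K is square and invertible, and gives a least-squares
    # solution when K is not square.
    return np.linalg.pinv(K) @ np.asarray(A)

A = np.array([1.0, 2.0, 0.5])   # input sensibility parameters
B = sensory_to_physical(A, K)   # converted physical parameters
```

- The round trip K @ B reproduces A for an invertible square K, which is the consistency the conversion model relies on.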
- the sensory control method by the sensory control system 100 shown in FIG. 1 can be executed as follows when using the conversion model 15 obtained in this example.
- the sensory control system 100 receives input of multiple types of sensory parameters from a user or the like via the input unit 4 .
- the processor 101 converts the acquired multiple types of sensory parameters into multiple types of physical parameters correlated with the multiple types of sensory parameters based on the conversion model 15 .
- the output step and the sense presentation step are the same as those described above, so descriptions thereof will be omitted.
- the sensibility parameter of the conversion model 15 in this example is the expression frequency of adjectives expressing the operational feeling of operating a pressing type operating tool as a predetermined operating tool.
- the physical parameters of the conversion model 15 in this example are included in the physical properties that realize sensation presentation when a pressing type operation tool as a predetermined operation tool is operated.
- the sensibility parameter for a pressing operation tool is the degree of sensory expression by adjectives, onomatopoeia, or the like that express the feeling of operation when a person presses the pressing operation tool.
- The physical properties realized by the physical parameters include, for example, the displacement accompanying the operation (for example, the stroke amount), the operation reaction force (load), the velocity, acceleration, and jerk of the movable part 21, the elastic properties of body parts such as the operator's fingers, or quantities derived from these physical properties.
- a physical parameter herein is defined as including one or more variables of a physical property.
- FIG. 7 is a flow chart explaining a specific example of the conversion model generation method and the tactile sense presentation method.
- Processing steps are indicated by "ST"; ST1, ST2, and the like include manual processing, while ST3, ST4, and the like include processing performed by the processor 14.
- a plurality of operating tools having the same function but different operational feel are prepared.
- a sensory test is performed by a plurality of users, and the operational feel of a plurality of prepared operating tools is classified by the frequency of expression of adjectives as sensitivity parameters.
- The processor 14 of the tactile sense control system 1 associates the expression frequency of the adjective as the sensibility parameter with the physical parameter included in the physical characteristics that realize the sense presentation when the operating tool is operated, based on a correlation coefficient or the like. At least one variable is included in each of the sensibility parameter and the physical parameter.
- Associated kansei parameters and physical parameters are stored as the kansei database 16 shown in FIG.
- The processor 14 uses the conversion model 15 to convert the expression frequencies of adjectives, as sensibility parameters for which input is newly accepted, into correlated physical parameters.
- A tactile sense presentation signal based on the physical parameters is generated in the arithmetic function unit 12, and the tactile sense presentation signal is output from the arithmetic function unit 13.
- This tactile sense presentation signal causes the tactile sense presentation device 20 to operate, and a tactile sense is presented.
- By controlling at least one of the coefficients "Kv", "Ks", and "C" described above, a tactile sense corresponding to the desired sensibility parameter is presented.
- FIG. 11 schematically shows a change in reaction force when a pressing operation tool is pressed.
- FIG. 11 shows a coordinate plane in which a horizontal axis represents a displacement caused by an operation and a vertical axis represents an operation reaction force (load) acting on a body part such as a finger of a user who operates the pressing type operation tool as the operation tool.
- The "displacement associated with manipulation of the manipulator" includes the operation amount of the manipulator, the operation time of the manipulator, or a combination of the operation amount and the operation time. That is, the physical characteristics that realize the sense presentation when the manipulator is operated can be expressed by any of the relationship between the operation amount of the manipulator and the operation reaction force, the relationship between the operation time of the manipulator and the operation reaction force, or the relationship between a combination of the operation amount and the operation time and the operation reaction force.
- The "displacement associated with the operation of the manipulator" may also include displacement caused by the elastic properties of the operator's finger or other body part that operates the manipulator.
- In FIG. 11, the "displacement due to the operation of the manipulator" is the operation amount of the press-type manipulator and is hereinafter referred to as the "stroke amount 'x'" as appropriate.
- the operation amount of the operation tool is an amount in a one-dimensional space, a two-dimensional space, or a three-dimensional space.
- the operation amount of the pressing type operation tool as the operation tool is the amount in the one-dimensional space along the pressing operation direction.
- the operating tool may have a movable portion that moves as the operating tool is operated.
- a pressing-type operating tool as an operating tool has a knob portion that is pressed by a user or the like as a movable portion. Therefore, the amount of operation of the pressing operation tool may be the amount of movement of the movable portion of the pressing operation tool.
- A curve indicating these characteristics is called an FS curve (force-stroke curve), a feeling curve, an operating force curve, a load-displacement curve, or the like. Hereinafter, it is referred to as a "load-displacement curve" as appropriate.
- As the user presses the press-type operation tool and the stroke amount "x" in the pressing operation direction increases, the disc-shaped leaf spring or the dome-shaped leaf spring is compressed and deformed, and the operation reaction force gradually increases.
- When the pressing operation is released, the elastic restoring force of the disc-shaped leaf spring or the dome-shaped leaf spring returns the knob portion, as the movable portion of the pressing operation tool, to its initial position.
- The load-displacement curve when the operating device 33 returns has hysteresis with respect to the load-displacement curve when the displacement associated with the pressing operation increases. In the following, for convenience of explanation, the operation is explained using only the load-displacement curve when the displacement associated with the pressing operation is increased.
- a plurality of (23 in total) push-type manipulators were classified into groups (A), (B), and (C) according to the total stroke amount when pushed to the final stroke.
- Category (A) has a total stroke amount of 0.25 mm or more and 0.35 mm or less.
- Category (B) has a total stroke amount of 0.15 mm or more and less than 0.25 mm.
- Category (C) has a total stroke amount of less than 0.15 mm.
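- The classification above can be expressed directly in code; the function name is hypothetical, and the thresholds are the ones given in the text.

```python
def classify_by_total_stroke(total_stroke_mm):
    """Classify a push-type manipulator by total stroke amount (mm),
    using the category boundaries given in the text."""
    if 0.25 <= total_stroke_mm <= 0.35:
        return "A"   # 0.25 mm or more and 0.35 mm or less
    if 0.15 <= total_stroke_mm < 0.25:
        return "B"   # 0.15 mm or more and less than 0.25 mm
    if 0.0 <= total_stroke_mm < 0.15:
        return "C"   # less than 0.15 mm
    return None      # outside the classified range
```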
- a sensory test was conducted by 25 users on the multiple pressing-type operating tools described above.
- the operational feeling (tactile sensation) felt by the user was classified by the degree of expression according to the SD method.
- As the sensory parameter, a predetermined sensibility parameter A is used, and the operational feel is evaluated on seven levels: "1", "2", "3", "4", "5", "6", and "7".
- the degree of expression of the sensitivity parameter A varied widely from around “1" to around "6" for the push-type manipulators of category (A).
- For the push-type manipulators of category (B), the degree of expression of the sensibility parameter A varied in an intermediate range from around "2.5" to around "3.5".
- For the pressing-type operation tools of category (C), the degree of expression of the sensibility parameter A varied from around "3.5" to around "6".
- The sensibility parameter A is, for example, a parameter related to "sense of determination", "comfort", "tactile sensation", etc. Specifically, in the case of a parameter related to "sense of determination", the parameter may be such that the lower the expression frequency, the higher the sense of determination, and the higher the expression frequency, the lower the sense of determination.
- FIG. 9 shows load-displacement curves (i), (ii), and (iii) of three push-type manipulators with different total stroke amounts.
- The area S4-1 of the recessed portion of the load-displacement curve (i) and the area S4-2 of the recessed portion of the load-displacement curve (ii) are extracted as physical-quantity variables; in panel (B), the areas S4-1 and S4-2 are translated so that their respective minimum values Tmin coincide and are then compared.
- On a coordinate plane in which the operation amount of the operation tool is the horizontal axis and the operation reaction force is the vertical axis, the area S4 is the area of the recessed portion from the maximum value, through the minimum value, up to the position where an operation reaction force equal to the maximum value is restored.
- the area S4 is the area of the region defined by the load-displacement curve and a straight line passing through the maximum value Tmax of the load-displacement curve and parallel to the horizontal axis on the coordinate plane.
- The dimension of the area S4 is "distance (stroke amount) × load (operation reaction force)", which is equivalent to energy (work).
- The area S4 corresponds to the energy (lost energy) by which the energy actually consumed falls below the energy consumption expected by the user, owing to the reduction in the operation reaction force when the user operates the pressing-type operation tool. Because of the area S4, the user feels a sense of being drawn in the direction of the pressing operation.
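- The area S4 described above can be computed from sampled load-displacement data with the trapezoidal rule. The sketch below assumes the curve's maximum is its first occurrence and integrates to the end of stroke if the force never recovers to Tmax; the sample curve is illustrative, not measured data.

```python
import numpy as np

def area_s4(stroke, load):
    # Area of the recessed region bounded above by the horizontal line through
    # the maximum value Tmax, from the maximum until the force recovers to Tmax.
    i_max = int(np.argmax(load))          # first occurrence of the maximum
    tmax = load[i_max]
    after = np.arange(i_max + 1, len(load))
    recovered = after[load[after] >= tmax]
    i_end = int(recovered[0]) if recovered.size else len(load) - 1
    x = stroke[i_max:i_end + 1]
    g = tmax - load[i_max:i_end + 1]      # depth of the recess below Tmax
    # trapezoidal rule; dimension is stroke x load, i.e. work (lost energy)
    return float(np.sum((g[1:] + g[:-1]) * 0.5 * np.diff(x)))

stroke = np.array([0.0, 0.1, 0.15, 0.2, 0.25, 0.3])   # mm (illustrative)
load   = np.array([0.0, 1.0, 0.7,  0.5, 0.8,  1.0])   # N  (illustrative)
s4 = area_s4(stroke, load)
```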
- the operation reaction force indicated by the load displacement curve (iii) in FIG. 9 has a preload when the stroke is zero. This preload creates a so-called “play” in operation. This "play” can also be adopted as one of the physical parameters.
- FIG. 12 is a graph showing the relationship between the sensibility parameter A, which is the degree of expression according to the SD method, and the area S4, which is the physical parameter extracted from the physical characteristics that realize the sensation presentation when the operating tool is operated.
- the horizontal axis of FIG. 12 indicates the sensitivity parameter A
- the vertical axis indicates the area S4, which is the physical parameter.
- The absolute value of the correlation coefficient between the sensibility parameter and the physical parameter is preferably 0.5 or more, and more preferably 0.7 or more.
- the total stroke amount of the press-type operation tool is preferably 0.05 mm or more and less than 0.5 mm, and more preferably 0.05 mm or more and less than 0.35 mm.
- the change in the operation reaction force with respect to the displacement accompanying the operation of the operating tool has at least a maximum portion and a minimum portion.
- The physical parameter includes a variable based on an area on a coordinate plane whose axes are the displacement accompanying the operation and the operation reaction force.
- the maximum portion is a portion including the maximum value Tmax in the load-displacement curve shown in FIG. 11
- the minimum portion is a portion including the minimum value Tmin in the load-displacement curve shown in FIG. 11.
- The haptic control system 1 shown in FIG. 2 uses the conversion model 15 to convert the expression frequency of the received sensibility parameter A into the area S4, which is a physical parameter correlated with the sensibility parameter A; the arithmetic function unit 12 then calculates a load-displacement curve including the area S4 and sets one tactile sense presentation signal including that load-displacement curve.
- the calculation function unit 12 calculates a plurality of load-displacement curves having the same area S4 but different strokes, loads, etc., and sets a plurality of tactile sensation presentation signals including these load-displacement curves.
- A plurality of types of load-displacement curves, preliminarily associated with the size of the area S4, may be stored in the storage unit 11 in association with the expression frequency of the sensibility parameter A, and information on the load-displacement curve corresponding to the received expression frequency of the sensibility parameter A may be read out from the storage unit 11 to generate the tactile sense presentation signal.
- The input unit 4 can accept not only integer expression frequencies such as "2" and "3" and expression frequencies including decimals, but also inputs of numeric ranges such as "2-2.5", "2.5-3", "3-3.5", and "3.5-4".
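- Accepting both single values and range inputs of the form shown above can be sketched as a small parser; the function name and the (low, high) return convention are illustrative assumptions.

```python
def parse_expression_frequency(text):
    """Accept "3", "2.5", or a range like "2-2.5"; return (low, high).
    A single value is returned as a degenerate range."""
    if "-" in text:
        lo, hi = (float(part) for part in text.split("-", 1))
        return (lo, hi)
    value = float(text)
    return (value, value)
```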
- The haptic control system 1 uses the conversion model 15 to convert the expression frequency of the sensibility parameter received via the input unit 4 into one or more load-displacement curves having the corresponding area S4 as a physical parameter.
- Information on the converted one or more load-displacement curves is output to the input/output device 3, and the input/output device 3 causes the display unit 5 to display the one or more load-displacement curves.
- the user confirms one load-displacement curve displayed on the display unit 5, or selects one of the plurality of displayed load-displacement curves.
- When this confirmation command or selection command is given from the input unit 4 to the processor 14, a tactile sense presentation signal based on the selected load-displacement curve is set by the arithmetic function unit 12, and the tactile sense presentation signal is output from the arithmetic function unit 13 to the tactile sense presentation device 20.
- When the haptic control system 1 accepts an input of "stroke amount 0.25-0.35 mm" as a physical parameter along with the expression frequency of the sensibility parameter A via the input unit 4, a load-displacement curve having an area S4 matching the expression frequency of the adjective is selected from among the plurality of load-displacement curves included in category (A), and a tactile sense presentation signal is generated based on this load-displacement curve.
- When the tactile sense control system 1 receives an input of the numerical item "magnitude of operation reaction force" as a physical parameter together with the expression degree of the sensibility parameter A via the input unit 4, the tactile sense presentation signal is generated based on both the expression degree of the sensibility parameter A and the "magnitude of operation reaction force" as a physical parameter.
- The total stroke amount is limited, for example, to the range of 0.15 to 0.35 mm, and with that range as a reference, the physical parameter, which is the size of the area S4, is associated with the sensibility parameter, which is the expression frequency of adjectives.
- The magnitude of the area S4 may be associated with the expression frequency of the sensibility parameter based on a numerical range other than the range of the total stroke amount. For example, the maximum value Tmax, the minimum value Tmin, the maximum value minus the minimum value (Tmax − Tmin), the click stroke (Pend − Pmax), the click stroke ratio (Pmax/Pend), the pressing stroke ratio (Pmax/(Pend − Pmax)), and the like may be used as a reference.
- A predetermined numerical range may be set for the areas S1, S2, and S3 other than S4, or for a ratio thereof, and that numerical range may be used as a reference. Based on these numerical ranges, it is possible to associate the size of the area S4, which is a physical parameter, with the expression frequency of the sensibility parameter.
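- Several of the reference quantities named above can be computed from a sampled load-displacement curve. The sketch below assumes the curve's maximum is its first occurrence and that Pmax and Pend denote the stroke at maximum load and at the end of stroke (an assumption about the notation); the sample data are illustrative.

```python
import numpy as np

def stroke_metrics(stroke, load):
    # Tmax/Tmin are loads; Pmax/Pend are stroke positions (assumed notation).
    i_max = int(np.argmax(load))
    t_max = float(load[i_max])
    t_min = float(np.min(load[i_max:]))
    p_max = float(stroke[i_max])
    p_end = float(stroke[-1])
    return {
        "Tmax_minus_Tmin": t_max - t_min,
        "click_stroke": p_end - p_max,                      # Pend - Pmax
        "pressing_stroke_ratio": p_max / (p_end - p_max),   # Pmax/(Pend - Pmax)
    }

stroke = np.array([0.0, 0.1, 0.2, 0.3])   # mm (illustrative curve knots)
load   = np.array([0.0, 1.0, 0.5, 0.9])   # N
m = stroke_metrics(stroke, load)
```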
- FIGS. 13-15 show the relationship between the expression frequencies of the sensitivity parameters other than the sensitivity parameter A and the physical parameters other than the area S4 that change according to the expression frequencies.
- the horizontal axis indicates the expression frequency of the sensitivity parameter B.
- the vertical axis represents a variable relating to the stroke amount of the pressing-type manipulator as a physical parameter, such as "click stroke (Pend-Pmax)" shown in FIG.
- FIG. 13 shows a negative correlation in which the expression frequency of the sensitivity parameter B decreases as the "click stroke (Pend-Pmax)" as the physical parameter increases.
- the sensibility parameter B is, for example, a parameter relating to "sense of determination", “comfort”, “tactile sensation", etc. Specifically, in the case of a parameter relating to "comfort”, the lower the expression frequency, the more “comfortable”. , and the higher the expression frequency, the more “unpleasant".
- The physical parameters include variables related to the amount of displacement associated with the operation. More specifically, they include the "click stroke (Pend − Pmax)", which is the amount of displacement from the maximum value to the coordinate at which the operation reaction force, after passing through the minimum value, returns to the same magnitude as the maximum value.
- the horizontal axis indicates the expression frequency of the sensitivity parameter C.
- the vertical axis represents a variable related to the load of the pressure-type operation tool as a physical parameter, such as Pmax shown in FIG. 11 .
- FIG. 14 shows a positive correlation in which the expression frequency of the sensitivity parameter C decreases as the physical parameter Pmax decreases.
- the sensibility parameter C is, for example, a parameter related to “sense of determination,” “comfort,” “tactile sensation,” etc. Specifically, in the case of a parameter related to “tactile sensation,” the lower the degree of expression, the softer the operational feel. , the higher the degree of expression, the harder the feeling of operation may be.
- the horizontal axis indicates the expression frequency of the sensitivity parameter D.
- the vertical axis represents a variable relating to the stroke amount of the pressing-type operation tool as a physical parameter, for example, the "pressing stroke ratio (Pmax)/(Pend-Pmax)" shown in FIG.
- FIG. 15 shows a positive correlation in which the expression frequency of the sensitivity parameter D increases as the physical parameter "pressing stroke ratio (Pmax)/(Pend-Pmax)" increases.
- the sensitivity parameter D is, for example, a parameter related to "sense of determination", "comfort", "tactile sensation", or the like; the parameter may be such that the lower the degree of expression, the duller the tactile sensation is perceived.
- the physical parameters include variables related to the amount of displacement associated with the operation. More specifically, they include the "click stroke (Pend-Pmax)", which is the amount of displacement from the maximum to the coordinate at which the operation reaction force, having passed through the minimum, returns to the same magnitude as the maximum, and a variable for the "pressing stroke ratio (Pmax)/(Pend-Pmax)", which is the ratio of "Pmax", the amount of displacement from the start of the operation to the maximum, to the click stroke.
- the conversion model 15 may store, as correlations between the sensitivity parameters and the physical parameters, a plurality of relationships including (1) the relationship between the expression frequency of the sensitivity parameter A shown in FIG. and its physical parameter, (2) the relationship between the expression frequency of the sensitivity parameter B shown in FIG. 13 and the click stroke, which is a physical parameter, (3) the relationship between the expression frequency of the sensitivity parameter C shown in FIG. 14 and the maximum value minus the minimum value, which is a physical parameter, and (4) the relationship between the expression frequency of the sensitivity parameter D shown in FIG. 15 and the pressing stroke ratio, which is a physical parameter. Any one or more of these relationships (1) to (4) may be combined to calculate the physical parameters included in a physical quantity such as a load-displacement curve and to generate a tactile sense presentation signal.
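The combination of stored correlations described above can be sketched in code. The following is an illustrative Python sketch only, not the patented implementation: the parameter names, slopes, and intercepts are invented stand-ins for relationships such as (2) to (4), which a real conversion model would fit from the sensitivity database.

```python
# Hypothetical linear correlations between the expression frequency of each
# sensitivity parameter (B, C, D) and one physical parameter of the
# load-displacement curve. All numeric values are assumptions.
LINEAR_RELATIONS = {
    # name: (physical parameter, slope, intercept)
    "B": ("click_stroke", -0.02, 0.50),       # negative correlation (cf. FIG. 13)
    "C": ("Pmax", 0.05, 1.00),                # positive correlation (cf. FIG. 14)
    "D": ("press_stroke_ratio", 0.03, 1.20),  # positive correlation (cf. FIG. 15)
}

def to_physical(expression_frequencies):
    """Convert input expression frequencies into physical parameters by
    combining any subset of the stored relations."""
    physical = {}
    for name, freq in expression_frequencies.items():
        target, slope, intercept = LINEAR_RELATIONS[name]
        physical[target] = slope * freq + intercept
    return physical

# Combining relations for B and C yields the physical parameters used to
# assemble a load-displacement curve for the presentation signal.
params = to_physical({"B": 10.0, "C": 20.0})
assert abs(params["click_stroke"] - 0.3) < 1e-9
assert abs(params["Pmax"] - 2.0) < 1e-9
```

Any subset of the relations can be supplied, mirroring the statement that one or more of the relationships (1) to (4) may be combined.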
- the acceleration of the movable part 21 of the tactile sense presentation part 30 shown in FIG. 4 can be detected by the acceleration sensor 28 .
- when a disc-shaped leaf spring or a dome-shaped leaf spring is pushed, it buckles, deforms, and inverts, generating vibration; the vibration is transmitted to the body part, such as a finger, performing the pressing operation, thereby presenting the feeling of operation.
- FIGS. 16A, 16B, and 16C are simulation data showing the acceleration of the movable parts of three pressing-type operation tools when each is pressed. Sensory tests were conducted in which users operated the three pressing-type operation tools to investigate the relationship between the degree of expression of the sensitivity parameter E, which relates to the operational feel of the pressing operation, and the acceleration of the movable part of the operation tool as a physical parameter.
- the peak-to-peak value of the acceleration when the disc-shaped leaf spring or dome-shaped leaf spring of the pressing-type manipulating tool undergoes buckling deformation is the largest in the pressing-type manipulating tool shown in FIG.
- the sensitivity parameter E is, for example, a parameter related to "sense of determination", "comfort", "tactile sensation", or the like. Specifically, in the case of a parameter related to "comfort", the lower the expression frequency, the more "comfortable", and the higher the expression frequency, the more "unpleasant".
- the conversion model 15 may store the correlation between the expression frequency of the sensitivity parameter E and the acceleration of the movable part of the operating tool, which is a physical parameter.
- the haptic control system 1 uses the conversion model 15 to convert the expression frequency of the sensitivity parameter E input via the input unit 4 into the acceleration of the movable part of the operation tool, which is a physical parameter, and generates a haptic presentation signal based on this acceleration. By outputting the tactile sense presentation signal, the tactile sense presentation device 20 can reproduce a desired operation feeling.
- a tactile sense presentation signal for controlling the corresponding physical parameters of the movable portion 21 of the tactile sense presentation device 20 may be generated based on the physical parameters (movement amount, speed, acceleration, jerk, etc.) of the movable part of the operation tool.
- FIG. 8 shows a flowchart of a control operation example of the tactile presentation device 20 .
- the processing shown in the flowchart is executed by the control operation of the processor 18 included in the tactile presentation device 20.
- in ST13, when the operation device 33 is operated, detection signals regarding the movable portion 21 are obtained from the position sensor 27 and the acceleration sensor 28.
- the processor 18 calculates the difference between the motion profile of the load-displacement curve set corresponding to the expression frequency, which is the sensibility parameter, and the detected position of the movable part 21 .
- the current I applied to the coil 25 of the tactile sense presenting unit 30 is optimized, and the tactile sense is presented so as to reproduce the user's desired sensitivity parameter expression frequency.
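The feedback loop described above can be illustrated with a toy sketch. This is not the patented controller: the proportional gain, the first-order plant model, and the step counts are assumptions chosen only to show how the current I applied to the coil can be adjusted so that the detected position of the movable part tracks the motion profile of the load-displacement curve.

```python
# Illustrative proportional feedback loop (assumed gains and plant model):
# the coil current I is repeatedly adjusted from the difference between the
# motion-profile target and the detected position of the movable part.
def track_profile(profile, gain=0.5, steps_per_target=50):
    """Drive a toy first-order plant toward each point of the motion profile."""
    position = 0.0
    history = []
    for target in profile:
        for _ in range(steps_per_target):
            error = target - position   # profile vs. detected position
            current = gain * error      # adjust current I applied to the coil
            position += 0.2 * current   # toy plant: position responds to current
        history.append(position)
    return history

# The reached positions settle close to each profile point.
trace = track_profile([1.0, 0.4, 0.8])
assert all(abs(p - t) < 1e-2 for p, t in zip(trace, [1.0, 0.4, 0.8]))
```

A real implementation would replace the toy plant with the sensed position from the position sensor 27 and could add integral or derivative terms; the sketch only conveys the optimize-by-feedback idea.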
- the tactile sense presentation device 40 illustrated in FIG. 19 reproduces the tactile sense of a rotary operation tool.
- a rotary operation tool is, for example, a rotary switch.
- the tactile sense presentation device 40 shown in FIG. 19 has a processor 41, a tactile sense presentation section 43, and a sensor 45.
- the tactile sense presentation device 40 presents a tactile sense to the user who rotates the operation device 42 .
- the operation device 42 may be mechanically incorporated in the tactile presentation device 40 or may be provided outside the tactile presentation device 40 .
- the tactile sense presentation unit 43 includes a resistance torque generator 43a and a rotational torque generator 43b.
- the resistance torque generator 43a variably applies resistance torque in a direction opposite to the rotation direction to the rotation operation of the rotation operation portion of the operation device 42 .
- the resistance torque generator 43a has, for example, a yoke made of a magnetic material and a coil that applies a magnetic field to the yoke.
- a rotating plate that rotates in conjunction with the rotating operation of the rotating operation portion of the operating device 42 is positioned in the magnetic gap of the yoke, and in the magnetic gap, a magneto-rheological fluid fills the space between the yoke and the rotating plate. It is also possible to use magnetic powder instead of the magneto-rheological fluid.
- the resistance torque generator 43a includes, for example, a rotary motor, and the resistance torque can be varied by the rotary motor.
- the rotational torque generating device 43b variably applies rotational torque in the rotational direction to the rotational operation of the rotational operation portion of the operating device 42 .
- the rotary torque generator 43b includes, for example, a rotary motor.
- a sensor 45 detects the rotation angle of the rotary operation portion of the operation device 42 .
- FIG. 18 shows a load-displacement curve relating to the operation reaction force of a rotary switch, which is a rotary operation tool. 360 degrees (one rotation) of the rotary switch is divided into a plurality of divided angles, the operation reaction force changes within each divided angle, and the same change of the operation reaction force is repeated within each divided angle.
- FIG. 18 shows changes in reaction force within one division angle.
- the horizontal axis of FIG. 18 indicates the rotation angle of the rotary operation portion as the operation amount of the rotary switch, and the positive side of the vertical axis indicates the resistance torque acting on the rotary operation portion of the rotary switch in the direction opposite to the operation direction.
- the negative side of the vertical axis indicates the magnitude of the rotational torque acting on the rotary operation portion in the same direction as the operation direction.
- the rotary switch is provided with spring contacts within each divided angle. When the rotation operation is started within the divided angle, the spring contact contracts and the resistance torque acting on the rotary operation portion increases. When the resistance torque exceeds the maximum value Rmax, the restoring force of the spring contact pushes the rotary operation portion in the rotation operation direction, the resistance torque becomes smaller, and a rotational torque directed in the operation direction acts on the rotary operation portion. Therefore, when rotating the rotary operation portion, the finger can feel an operation feeling for each divided angle.
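The repeating per-detent torque pattern described above can be sketched as a simple function. The piecewise-linear shape, the division angle, and the peak position are illustrative assumptions, not values from the disclosure.

```python
# Hedged sketch of a rotary-switch torque profile: within each divided angle
# the resistance torque rises toward Rmax, then a torque in the operation
# direction (negative sign here) pulls the knob toward the next detent.
DIV_ANGLE = 30.0   # degrees per divided angle (assumed: 360 / 12 detents)
RMAX = 1.0         # peak resistance torque, arbitrary units (assumed)
PEAK_AT = 18.0     # angle within the division where Rmax occurs (assumed)

def resistance_torque(angle_deg):
    """Resistance torque (positive) or assisting torque (negative) at an angle."""
    a = angle_deg % DIV_ANGLE  # the same change repeats within every division
    if a <= PEAK_AT:
        return RMAX * a / PEAK_AT  # torque builds while the spring contact contracts
    # past Rmax the restoring force acts in the operation direction
    return -RMAX * (a - PEAK_AT) / (DIV_ANGLE - PEAK_AT)

assert resistance_torque(18.0) == RMAX   # peak within the first division
assert resistance_torque(48.0) == RMAX   # identical pattern in the next division
assert resistance_torque(24.0) < 0       # assisting (pull-in) torque after the peak
```

A presentation device like the tactile sense presentation device 40 could evaluate such a profile at the sensed rotation angle to set the resistance and rotational torque generators.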
- the conversion model 15 stores the correlation between the expression frequency of the sensitivity parameter and the physical parameter regarding the rotation operation.
- the haptic control system 1 receives an input of the expression frequency of the sensitivity parameter via the input unit 4 .
- the processor 14 of the haptic control system 1 converts the received sensory parameters into physical parameters using the conversion model 15, and generates a haptic presentation signal based on the physical parameters.
- Processor 14 then outputs the generated tactile sense presentation signal to processor 41 included in tactile sense presentation device 40 shown in FIG.
- the tactile sense presentation device 40 detects the rotation angle of the rotary operation portion with the sensor 45 and feeds back the detection output to the processor 41 when the rotary operation portion of the operation device 42 is rotated by a user's body part such as a finger.
- by controlling the resistance torque and the rotational torque when the rotary operation portion of the operation device 42 is rotated, a tactile sensation simulating a rotary switch that reproduces the expression frequency of the sensitivity parameter can be provided.
- FIG. 17 is an explanatory diagram for explaining changes in resistance torque as an example of physical characteristics related to the expression frequency of sensibility parameters.
- FIG. 17(A) shows, as load-displacement curves, the operation reaction forces when four rotary switches are rotationally operated.
- FIG. 17(B) shows, for each of the rotary switches, the change in curvature on the load-displacement curve.
- the conversion model 15 stores the correlation between the expression frequency of the sensibility parameter F and the physical parameter whose variable is the curvature of the change in the resistance torque.
- the rotation operation feeling to be realized can be presented as a tactile sensation.
- the sensitivity parameter F is, for example, a parameter related to "sense of determination", "comfort", "tactile sensation", or the like. The parameter may be such that the lower the degree of expression, the duller the tactile sensation is perceived.
- the change in the operation reaction force with respect to the displacement accompanying the operation of the operation tool has at least a local maximum.
- the physical parameters also include a variable related to the curvature of the maximum portion, which includes the maximum value Rmax.
- the maximum portion is a portion including the maximum value Rmax in the load-displacement curve shown in FIG. 18.
- it is possible to associate a physical parameter including a variable related to the rise of the increase in rotational load, such as the angle of the rising vector Tb of the resistance torque from the starting point of the divided angle in the rotation angle as the operation amount of the rotary switch, or the ratio involving the area Sa indicated by the load-displacement curve at the rising portion of the resistance torque.
- the physical parameters include variables relating to the rise of the manipulation reaction force from the start of manipulation to the maximum.
- the area Sa shown in FIG. 18 is an area defined by the load-displacement curve, the horizontal axis, and a straight line passing through the intersection of the load-displacement curve and the maximum value Rmax and parallel to the vertical axis.
- the area Sa is a value obtained by integrating the load-displacement curve in the rotation angle range from the starting point of the divided angle of the rotation angle as the operation amount to the maximum value Rmax of the operation reaction force.
- the area Sb is defined by the load-displacement curve, the vertical axis, and a straight line passing through the maximum value Rmax and parallel to the horizontal axis.
- the area Sb is obtained by subtracting the area Sa from the area of a rectangle whose one side is the rotation angle at the intersection of the load-displacement curve and the maximum value Rmax and whose other side is the maximum value Rmax. That is, if the load-displacement curve changes linearly on the coordinate plane from the start of the operation until the operation reaction force reaches the maximum value Rmax, as indicated by the dashed line in FIG. 18, the ratio Sa:Sb is 1:1; the smaller the area Sb is relative to the area Sa, the more the load-displacement curve bulges toward the positive side of the vertical axis on the coordinate plane. In other words, the ratio between the area Sa and the area Sb indicates the extent of bulging of the load-displacement curve.
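The area ratio Sa:Sb described above can be computed numerically. The following sketch uses a simple Riemann sum and assumed example curves; the function name and curves are illustrative, not from the disclosure.

```python
# Numerical sketch of the Sa : Sb bulge measure: Sa is the integral of the
# load curve up to the angle where Rmax occurs, Sb is the enclosing
# rectangle minus Sa.
def bulge_areas(curve, x_rmax, rmax, n=10000):
    """Return (Sa, Sb) for a load curve f(x) on [0, x_rmax] with f(x_rmax)=rmax."""
    dx = x_rmax / n
    sa = sum(curve(i * dx) * dx for i in range(n))  # integral under the curve
    sb = x_rmax * rmax - sa                         # rectangle minus Sa
    return sa, sb

# A linear rise (the dashed line in FIG. 18) gives Sa : Sb of about 1 : 1.
sa, sb = bulge_areas(lambda x: x, x_rmax=1.0, rmax=1.0)
assert abs(sa - sb) < 1e-3

# A curve bulging toward the positive vertical axis yields Sb smaller than Sa.
sa, sb = bulge_areas(lambda x: x ** 0.5, x_rmax=1.0, rmax=1.0)
assert sb < sa
```

This matches the statement that the smaller Sb is relative to Sa, the more the load-displacement curve bulges upward.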
- the rising vector Tb of the resistance torque as a physical parameter shown in FIG. 18 includes a variable relating to the differential of the operation reaction force with respect to the manipulated variable.
- the physical parameters may include variables relating to the differentiation of the operation reaction force with respect to the operation time, and may include variables relating to the second order differentiation of the change in the operation reaction force.
- the physical parameters include a variable related to the magnitude of the pull-in at the minimum portion, which has a negative sign.
- the minimum portion is a portion including the maximum value Dmax in the load-displacement curve shown in FIG. 18.
- FIG. 18 shows the load displacement curve related to the operation reaction force of the rotary switch, which is a rotary operation tool.
- FIG. 18 can also be used as a diagram showing a load displacement curve relating to the operation reaction force of the slide switch that receives the slide operation to the slide operation portion. That is, the horizontal axis of FIG. 18 indicates the slide operation amount of the slide operation portion, and the positive side of the vertical axis indicates the operation reaction force to the slide operation of the slide operation portion.
- the operation reaction force gradually increases with an increase in the slide operation amount of the slide operation portion, reaches the maximum value Rmax, and when the maximum value Rmax is exceeded, it begins to decrease, and a force acting in the same direction as the operation direction is generated.
- a first modification of the sensory control method executed by the sensory control system 100 of the present disclosure further includes an acquisition step of acquiring sensory stimulation signals and a specification step of specifying sensitivity parameters based on the acquired sensory stimulation signals.
- the receiving step for receiving the input of the sensitivity parameter described above is not limited to the input from the user or the like, and is a step of receiving the sensitivity parameter specified in the specifying step.
- the sensory control system 100 according to the first modification can specify a sensitivity parameter based on the acquired sensory stimulation signal, and can output a sensory presentation signal based on a physical parameter correlated with the specified sensitivity parameter.
- the sensory stimulation signals are auditory stimulation signals based on auditory stimulation elements such as sound, visual stimulation signals based on visual stimulation elements such as images and videos, tactile stimulation signals based on tactile stimulation elements such as operation reaction force and vibration, or a signal based on any combination thereof.
- the sensory control system 100 according to the first modification may generate and acquire the sensory stimulation signals in the acquisition step by sensing auditory stimulation elements, visual stimulation elements, tactile stimulation elements, or combinations thereof.
- the sensory control system 100 may convert at least one of the auditory stimulation element, the visual stimulation element, and the tactile stimulation element (hereinafter collectively referred to as sensory stimulation elements) on which the sensory stimulation signal is based, and designate the result as a sensitivity parameter with which a physical parameter is correlated.
- for this conversion, the conversion model 15 described above may be used, or a conversion model different from the conversion model 15 may be used.
- a conversion model different from the conversion model 15 can be generated by AI analysis including machine learning based on the correspondence information stored in the sensitivity database 16, similarly to the conversion model 15.
- physical parameters included in the physical properties of sensory stimulation elements such as sounds, images, and videos can be extracted by AI analysis including machine learning.
- the sensory control system 100 acquires sensory stimulation signals based on sensory stimulation elements such as sounds, images, and videos, and can thereby specify a sensitivity parameter correlated with the physical parameters included in the physical properties of the sensory stimulation elements and output a sensory presentation signal based on a physical parameter correlated with the specified sensitivity parameter. Therefore, for example, it is possible to output a tactile sense presentation signal based on a sensitivity parameter adjusted based on sound, image, video, or the like.
- the operation device 33 of the present disclosure may have an operation surface that accepts slide operations.
- the slide operation is an operation of moving a contact position while keeping a body part such as a user's finger in contact with the operation surface of the operation device 33 .
- the tactile sense presentation unit 30 of the present disclosure generates an operation reaction force by vibrating the operation surface of the operation device 33 .
- as a method of vibrating the operation surface of the operation device 33, for example, there is a method of vibrating a weight by an actuator or the like.
- by using the operation device 33 and the tactile sense presentation unit 30, a step of presenting a tactile sensation in response to the slide operation of the operation device 33 by means of the tactile sense presentation unit 30 can be achieved. Specifically, the operation device 33 detects the slide operation, and in response to the detected slide operation, the tactile sense presentation unit 30 generates an operation reaction force.
- the physical parameters that can be converted based on the conversion model 15 stored in the storage unit 11 of the sensory control system 100 according to the second modified example include parameters related to changes in operation reaction force with respect to displacement accompanying the slide operation of the operation device 33.
- the change in the operation reaction force includes at least a maximum portion or a minimum portion.
- in the sense presentation step, by controlling the tactile sense presentation unit 30 with a tactile sense presentation signal based on such a physical parameter, the change in the operation reaction force with respect to the displacement accompanying the slide operation of the operation device 33 can be pseudo-synthesized so as to include the above-mentioned maximum or minimum portion.
- the tactile sense presentation unit 30 supplies a drive signal that causes the operation surface of the operation device 33 to vibrate based on the received tactile sense presentation signal; the operation surface is driven in a first direction at the rising edge of the drive signal and in a second direction opposite to the first direction at the falling edge. Therefore, by making the rise time and fall time of the drive signal different so that, on a predetermined time average, the force in the first direction corresponding to the rise or the force in the second direction corresponding to the fall becomes larger than the other, it is possible to pseudo-synthesize the above-mentioned maximum or minimum portion.
- the drive signal that causes the operation surface of the operation device 33 to vibrate may be, for example, a signal that drives a weight by an actuator or the like, and may indirectly cause the operation surface to vibrate through the vibration of the weight.
- FIG. 24 is a diagram showing an example of temporal changes in the intensity of the drive signal supplied to the weight based on the tactile sense presentation signal.
- the weight is driven in the first direction when the time change of the strength of the drive signal is positive, and the weight is driven in the second direction when the time change of the strength of the drive signal is negative.
- as shown in FIG. 24(a), when the time change at the rise of the drive signal for the weight is, on average over a predetermined period, larger than the time change at the fall of the drive signal, the force in the first direction corresponding to the rise of the drive signal is greater than the force in the second direction corresponding to the fall of the drive signal.
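The asymmetric drive idea above can be sketched with an asymmetric sawtooth. The waveform, its timing values, and the proportionality of force to the signal's time change are illustrative assumptions standing in for the drive signal of FIG. 24.

```python
# Hedged sketch: a sawtooth whose rise is steeper than its fall. If the force
# on the weight follows the time change of the drive signal, the steep rise
# produces a large short-lived force in the first direction and the slow fall
# a small long-lived force in the second direction.
def sawtooth_rate(t, rise=0.2, fall=0.8):
    """Time derivative of an asymmetric sawtooth with period rise + fall."""
    period = rise + fall
    phase = t % period
    return 1.0 / rise if phase < rise else -1.0 / fall

def peak_forces(rise=0.2, fall=0.8):
    """Instantaneous force magnitudes during the rise and fall segments."""
    return 1.0 / rise, 1.0 / fall  # (first direction, second direction)

f_first, f_second = peak_forces()
assert f_first > f_second            # steeper rise -> larger force in first direction
assert sawtooth_rate(0.1) > 0        # within the rise: driven in the first direction
assert sawtooth_rate(0.5) < 0        # within the fall: driven in the second direction
```

Swapping the rise and fall times reverses which direction receives the larger force, which is how a maximum or minimum portion of the reaction force can be pseudo-synthesized.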
- the first direction and the second direction may be directions intersecting the operation surface of the operation device 33, or may be directions (parallel directions) along the operation surface.
- when the first direction and the second direction are directions that intersect the operation surface of the operation device 33, the pressing of the user's finger or other body part sliding on the operation surface against the operation surface changes, so the frictional force between the body part and the operation surface accompanying the slide operation, that is, the operation reaction force, can be changed.
- when the first direction and the second direction are directions along the operation surface, the reaction force in the slide operation direction acting on the body part, such as the finger of the user performing the slide operation on the operation surface, changes, so it is possible to change the frictional force between the body part and the operation surface accompanying the slide operation, that is, the operation reaction force.
- the transformation model 15 stored in the storage unit 11 of the sensory control system 100 according to the second modification may be obtained by a transformation model generation method including the following storage step. That is, in the storage step of the transformation model generation method according to the second modification, the sensitivity database 16 stores, for each of one or more types of operation tools, correspondence information in which the physical characteristics that realize the sensation presentation when a predetermined operation tool is operated are associated with the sensitivity parameters input in response to the operation of the operation tool.
- the operation tool has an operation surface that receives a slide operation.
- the change in the operation reaction force with respect to the displacement caused by the slide operation of the operating tool includes at least a maximum portion and a minimum portion.
- the operation reaction force is generated by vibration of the operation surface of the operation tool.
- the vibration of the operation surface of the operation tool may be an indirect vibration caused by vibration of a weight caused by an actuator or the like, similar to the vibration of the operation surface of the operation device 33 described above.
- the maximum portion or the minimum portion included in the change in the operation reaction force with respect to the displacement caused by the slide operation of the operation tool is pseudo-synthesized by making the time change at the rise and the time change at the fall of the drive signal that vibrates the operation surface of the operation tool different, so that the force in the direction corresponding to the rise or the direction corresponding to the fall becomes larger than the other on a predetermined time average.
- the sensitivity database 16 of the present disclosure stores, for each of one or more types of sense presentation, correspondence information in which a physical characteristic related to a given sense presentation is associated with a sensitivity parameter indicating the degree of sense expression with respect to that sense presentation.
- in the above description, tactile presentation has mainly been described as an example of sensory presentation.
- the "tactile sense" mainly referred to in this specification is tactile sense in a broad sense.
- tactile sense in a broad sense is a concept including tactile sense, pressure sense, force sense, etc. in a narrow sense.
- when the term "tactile sense" is used without qualification, it means tactile sense in the broad sense.
- the tactile sense in a narrow sense is, for example, a sensation related to the texture of the surface of an object that is in contact with a body part, and is highly correlated with sensory parameters related to the sensation expression such as unevenness and roughness.
- the sense of pressure is, for example, a sensation related to drag force between a body part and an object, and is highly correlated with a sensory parameter related to sensation expression such as hardness.
- the sense of force is, for example, the sensation of external force applied to a body part, such as the sensation of being pulled or pushed.
- it is known that the receptors mainly responsible for the narrow senses of touch, pressure, and force differ from one another, and that the response characteristics of the respective receptors also differ.
- physical properties related to tactile presentation include static properties and dynamic properties.
- the static characteristics are, for example, physical characteristics obtained when an instrument or the like with negligible elasticity (hereinafter simply referred to as “rigid body”) is operated at a constant operation speed.
- dynamic characteristics are physical characteristics obtained when an operation tool is operated with a flexible material that simulates a body part, such as a human finger, while changing the operation speed, and include, for example, operation speed, operation acceleration, operation jerk, and frictional force.
- the correspondence information stored in the sensitivity database 16 may be information related to at least one of the narrow-sense tactile, pressure, and force sensations included in the broad-sense tactile sensation, and to at least one of the static characteristics and dynamic characteristics included in the physical characteristics.
- the correspondence information stored in the sensitivity database 16 may be information in which the weighting of the static characteristics and dynamic characteristics of each of the narrow-sense tactile sensation, pressure sensation, and force sensation changes according to the stage of operation of the operation tool.
- for example, in the operation stage immediately after the operation starts, the weighting of the static characteristics may be set larger than the weighting of the dynamic characteristics, and in the operation stage in which the change in the reaction force with respect to the displacement accompanying the operation is large (for example, the maximum portions of the load-displacement curves shown in FIGS. 11 and 18 and the minimum portion of the load-displacement curve shown in FIG. 11), the dynamic characteristics may be weighted more heavily than the static characteristics. This is because the influence of the operation speed and the like may be small in the operation stage immediately after the operation starts, in which case the physical characteristics can be reproduced with high accuracy even if they are approximated by the static characteristics.
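The stage-dependent weighting described above can be sketched as a blend. The blend rule, the slope threshold, and the linear mixing are illustrative assumptions only; the disclosure does not specify a particular weighting function.

```python
# Hedged sketch: weight the dynamic characteristic more heavily where the
# reaction force changes steeply, and the static characteristic more heavily
# immediately after the operation starts. All numbers are assumptions.
def blended_characteristic(static_val, dynamic_val, reaction_slope, steep=1.0):
    """Blend static and dynamic characteristic values by reaction-force slope."""
    w_dynamic = min(abs(reaction_slope) / steep, 1.0)  # 0 near start, 1 when steep
    w_static = 1.0 - w_dynamic
    return w_static * static_val + w_dynamic * dynamic_val

assert blended_characteristic(2.0, 8.0, reaction_slope=0.0) == 2.0  # start: static only
assert blended_characteristic(2.0, 8.0, reaction_slope=5.0) == 8.0  # steep: dynamic only
assert 2.0 < blended_characteristic(2.0, 8.0, reaction_slope=0.5) < 8.0
```

Near the maximum and minimum portions of the load-displacement curve, where the slope is large, the dynamic characteristic dominates, matching the weighting described in the text.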
- the correspondence information stored in the sensitivity database 16 may be information including physical characteristics reflecting differences in the response characteristics of receptors mainly applied to each of the narrow sense of touch, pressure, and force. By generating the conversion model 15 based on such correspondence information, it is possible to present a tactile sensation that more reflects human sensibility.
- FIG. 20 shows the configuration of a haptic control system 2 as a second embodiment of the sensory control system 100 shown in FIG. 1, together with the flow of signals.
- the haptic control system 2 shown in FIG. 20 includes a terminal device 80 and a communication device 70, which are connected to each other via a network 9 so as to be able to communicate with each other.
- the terminal device 80 includes the main control device 6 , the input/output device 3 and the tactile presentation device 20 .
- the main control device 6 includes a processor 7 and a storage unit 8 and controls operations of the input/output device 3 and the tactile presentation device 20 .
- the tactile sense presentation device 20 includes a tactile sense presentation section 30, an operation range variable section 29, and sensors such as a position sensor 27 and an acceleration sensor 28.
- the communication device 70 is, for example, a server device, and includes a processor 14 , a storage section 11 , an arithmetic function section 12 , and an arithmetic function section 13 .
- a conversion model 15 is stored in the storage unit 11 .
- the input/output device 3, the tactile sense presentation device 20, the processor 14, the storage unit 11, the arithmetic function unit 12, and the arithmetic function unit 13 are the same as the components of the haptic control system shown in FIG. that are denoted by the same reference numerals, and therefore description thereof is omitted.
- the terminal device 80 may include the operation device 33, and the tactile sense presentation unit 30 may present a tactile sense to the user operating the operation device 33, similarly to the haptic control system 1.
- the haptic control system 2 may include the tactile sense presentation unit 43 shown in FIG. 19.
- FIG. 21 is a sequence diagram showing the operation of the haptic control system 2.
- the processing executed by the terminal device 80 and the communication device 70 included in the haptic control system 2 will be described for each step (ST).
- the terminal device 80 receives an input of sensitivity parameters. Specifically, the terminal device 80 receives sensitivity parameters input by the user or the like via the input unit 4 of the input/output device 3 .
- the terminal device 80 encodes the sensory parameter information and transmits the encoded sensory parameter information to the communication device 70 via the network 9 .
- the terminal device 80 may be provided with an encoder for encoding the information of the sensitivity parameter. Also, the terminal device 80 may encode the entire information of the sensitivity parameter, or may encode only a part of the information.
- the communication device 70 decodes the information received from the terminal device 80 and acquires the information of the sensitivity parameter.
- the communication device 70 may comprise a decoder for decoding the sensory parameter information.
- the communication device 70 uses the conversion model 15 to convert the sensory parameters into physical parameters correlated with the sensory parameters.
- the communication device 70 encodes the converted physical parameters, and transmits information on the encoded physical parameters to the terminal device 80 via the network 9 .
- the communication device 70 may comprise an encoder for encoding the physical parameter information. Also, the communication device 70 may encode the entire physical parameter information, or may encode only a portion of the information.
- the terminal device 80 decodes the received information and acquires physical parameter information.
- the terminal device 80 may comprise a decoder for decoding the physical parameter information.
- the terminal device 80 generates a tactile sense presentation signal based on the physical parameter and causes the tactile sense presentation device 20 to operate. Note that the encoding and decoding processes in ST32, ST21, ST23, and ST33 are not essential.
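The exchange described above (sensory parameters encoded by the terminal, decoded and converted by the communication device, and physical parameters returned) can be sketched as follows. This is a minimal illustration with hypothetical function names: the actual codec is not specified in the source (JSON over UTF-8 is assumed here), and the conversion model 15 is replaced by a placeholder scaling.

```python
import json

def terminal_encode_sensory(params: dict) -> bytes:
    # ST32: terminal device 80 encodes the sensory parameter information
    return json.dumps(params).encode("utf-8")

def device_decode(payload: bytes) -> dict:
    # ST21: communication device 70 decodes and acquires the sensory parameters
    return json.loads(payload.decode("utf-8"))

def device_convert(sensory: dict) -> dict:
    # ST22 stand-in for conversion model 15: placeholder scaling into
    # "physical" values (illustrative only, not the actual model)
    return {f"P_{name}": 2.0 * value for name, value in sensory.items()}

def device_encode_physical(params: dict) -> bytes:
    # ST23: communication device 70 encodes the converted physical parameters
    return json.dumps(params).encode("utf-8")

def terminal_decode(payload: bytes) -> dict:
    # ST33: terminal device 80 decodes and acquires the physical parameters
    return json.loads(payload.decode("utf-8"))

# Round trip: terminal -> network 9 -> communication device -> network 9 -> terminal
sensory = {"sharpness": 0.8, "softness": 0.3}
physical = terminal_decode(
    device_encode_physical(
        device_convert(device_decode(terminal_encode_sensory(sensory)))))
```

As the source notes, the encoding and decoding steps are optional; the essential step is the conversion from sensory to physical parameters.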
- When a sensory parameter is input to the terminal device 80, the haptic control system 2 according to the present embodiment receives information on a physical parameter correlated with the sensory parameter from the communication device 70 via the network 9, and can present a tactile sensation by a tactile sense presentation signal based on that physical parameter. Therefore, according to the haptic control system 2, tactile information communication via the network 9 enables tactile presentation that reflects human sensibility.
- the haptic control system 2 is particularly useful in the field of the Tactile Internet.
- The haptic control system 2 may include a plurality of terminal devices 80. That is, the communication device 70 may be connected to each of the plurality of terminal devices 80 via the network 9. In that case, the communication device 70 may store identification information, such as an address or an ID, designating each of the plurality of terminal devices 80, and a conversion model 15 associated with each piece of identification information. Thereby, the conversion model 15 can be optimized and configured for each user who uses each terminal device 80.
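The per-terminal association of identification information with a conversion model 15 might be kept as a simple lookup, sketched below; the class, the gain-based placeholder model, and the terminal IDs are all hypothetical.

```python
class ConversionModel:
    """Placeholder for a conversion model 15 optimized per user."""
    def __init__(self, gain: float):
        self.gain = gain

    def convert(self, sensory_value: float) -> float:
        # Illustrative conversion: sensory value scaled into a physical value
        return self.gain * sensory_value

# identification information (address or ID) -> conversion model 15
# held by the communication device 70 for each terminal device 80
models_by_terminal = {
    "terminal-001": ConversionModel(gain=1.0),
    "terminal-002": ConversionModel(gain=1.5),  # tuned for this user
}

def convert_for(terminal_id: str, sensory_value: float) -> float:
    # Select the conversion model associated with the requesting terminal
    return models_by_terminal[terminal_id].convert(sensory_value)
```

With such a registry, the same sensory input can yield different physical parameters per user, which is the per-user optimization the text describes.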
- The communication device 70 of the haptic control system 2 may store a plurality of conversion models 15 according to, for example, the application (for games, for vehicles, etc.), and a different conversion model may be used for each application.
- In this way, the conversion model 15 can be optimized and configured according to the application, so that different physical parameters can be obtained from the same sensory parameter depending on the application.
- FIG. 20 shows an example in which the conversion model 15 is stored in the storage unit 11 of the communication device 70, but the conversion model 15 may instead be stored in the storage unit 8 of the main controller 6 of the terminal device 80.
- In that case, the communication device 70 distributes information (including encoded information) related to the sensory parameter, the main controller 6 of the terminal device 80 converts the sensory parameter into a physical parameter, and a tactile sense presentation signal based on the physical parameter correlated with the sensory parameter may be generated.
- the haptic control system 1 can be used, for example, for entertainment purposes such as games, images, and music.
- the tactile sense from the tactile sense presentation device 20 is presented to the user through an operation unit such as a button, joystick, or trigger switch included in the operation device 33 such as a game controller.
- The tactile sense presentation device 20 may present the tactile sense to a part other than the operation portion of the operating device 33, for example, to the whole or a part of a body part such as the user's hand holding the operating device 33.
- the game controller may be, for example, a steering controller imitating a steering wheel of an automobile.
- Examples of the timing at which the tactile sensation is presented to the user through the operation device 33 include the timing at which an operation on the operation unit included in the operation device 33 is detected, the timing at which an operation by movement, rotation, acceleration/deceleration, or the like of the whole or part of the operation device 33 is detected, and the timing of presenting the tactile sensation according to the content.
- The timing of presenting the tactile sensation according to the content is, for example, a timing preset in each content such as a game, video, or music in order to enhance the sense of presence, and may be a timing when no operation from the user is detected.
- the tactile sense presentation from the tactile sense presentation device 20 is not limited to being performed through the operation device 33 described above.
- The tactile sense presentation from the tactile sense presentation device 20 may be performed through, for example, a seat on which the user sits, a dedicated suit worn by the user, a headset used for virtual reality (VR) or augmented reality (AR) applications, or a wearable device such as a glove attached to a body part such as the user's hand. For example, the feel of operating a virtual switch or the like in VR or AR space may be presented through a wearable device.
- the haptic control system 1 can be used for in-vehicle applications, for example.
- The tactile sensation from the tactile sense presentation device 20 may be presented to the occupant through devices used for driving operations such as a steering wheel, pedals, and shifter, through an operating device 33 such as an infotainment system, an air-conditioning unit, or a decorative panel, or through a seat.
- The decorative panel is a device that is installed in an arbitrary place inside the vehicle, such as a door trim, pillar, glove box, center console, dashboard, or overhead console, and that can display information.
- The main purposes of presenting the tactile sensation are to notify the occupant that an input operation has been performed on the operating device 33 or the like, and to give a warning against a lane departure, a collision with another vehicle, the approach of another vehicle, or the like.
- These purposes differ from the above-mentioned entertainment applications, whose main purpose is to present a sense of presence. Therefore, the haptic control system 1 may store a conversion model 15 that converts the same sensory parameter into different physical parameters depending on the application.
- Examples of the timing for presenting the tactile sensation to the occupant include the timing at which an input operation on the operating device 33 or the like is detected, and the timing at which a danger such as a lane departure or the approach of another vehicle is detected.
- the haptic control system 2 according to the second embodiment can be used for the same purposes as the haptic control system 1 according to the first embodiment.
- the haptic control system 2 can be used for entertainment applications such as games, images, and music, and for in-vehicle applications, for example.
- a tactile sense presentation signal may be transmitted, received, distributed, or the like in association with data update, interaction between users, competition, or the like.
- Each terminal device 80 may set common sensory parameters, or each terminal device 80 may set individual sensory parameters. In each terminal device 80, some sensory parameters may be set in common while other sensory parameters are set individually.
- For example, while the users of a plurality of terminal devices 80 work in a common VR or AR environment, the sensory parameter indicating the magnitude of the feeling may be common to all terminal devices 80, while the user of each terminal device 80 individually adjusts the sensory parameter indicating the sharpness of the touch according to preference.
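The mixing of common and individual sensory parameters across terminal devices 80 might be realized with a simple merge, sketched below; the parameter names, values, and the `effective_params` helper are hypothetical.

```python
# Sensory parameters shared by all terminal devices 80 (illustrative names)
common = {"magnitude": 0.7, "sharpness": 0.5}

def effective_params(common: dict, individual: dict) -> dict:
    """Combine common settings with a terminal's individual settings."""
    merged = dict(common)
    merged.update(individual)  # individual settings take precedence
    return merged

# Terminal A overrides "sharpness" to its user's preference; terminal B
# keeps everything common.
terminal_a = effective_params(common, {"sharpness": 0.9})
terminal_b = effective_params(common, {})
```

This keeps the shared environment consistent (the common values) while still letting each user tune selected parameters to taste.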
- When the haptic control system 2 is used for in-vehicle applications, in addition to the same uses as the haptic control system 1, a tactile presentation signal, such as for alerts, may be received based on communication between vehicles via the network 9, communication with road installations such as traffic signs, distribution of traffic information from a server, and the like. Communication between vehicles, communication with objects installed on the road, and the like can also be realized by the haptic control system 1 according to the first embodiment as long as direct communication is possible without going through the network 9.
- the haptic control system 2 can be used for medical applications and industrial applications, for example.
- Medical applications include, for example, transmission of tactile information associated with telemedicine.
- Industrial applications include, for example, haptic transmission associated with remote control of industrial robots. If the tactile sensation transmitted in these applications can be customized based on the sensibility value, a more realistic tactile sensation can be presented to the user and the user can operate comfortably.
- the haptic control system 2 according to the second embodiment can be used for Internet shopping, for example.
- The haptic control system 2 can be used for interaction between users in remote locations. It is possible to present the feeling of a handshake or contact between users in remote locations, or the feeling of touching an animal such as a pet. In these applications, the tactile sense presenting unit 30 is particularly useful because it can convey warmth by presenting a sense of warmth alone or in combination with other tactile sensations.
- Sensory presentation includes tactile presentation, auditory presentation by sound, visual presentation by image display, and the like.
- Sensory presentation is adjusted by adjusting the signals that drive the various operation tools.
- Patent Literature 2 discloses a technique in which a user selects a reference model, and in subsequent steps, adds or changes colors, sizes, materials, positions, etc., based on the user's selection.
- an object of the present invention is to provide a tactile control device that can adjust the operational feeling by sensible input.
- a haptic control device capable of reproducing the user's preferred operational feeling in real time and a haptic control method performed by the haptic control device will be described.
- FIG. 25 is a perspective view of the haptic control device 50.
- FIG. 25 shows a single-unit (standalone) type haptic control device 50.
- The haptic control device 50 has three reference operation tools 51a to 51c (a plurality of reference operation tools), a reproduction operation tool 52, a touch panel 53, and a display 260.
- An arbitrary reference operation tool among the reference operation tools 51a to 51c is hereinafter referred to as a "reference operation tool 51". It suffices that there are two or more reference operation tools 51 .
- the display 260 displays how to use the tactile control device 50, operation menus, and the like.
- The touch panel 53 displays sensory parameters (for example, adjectives) for which a degree of expression can be input, so that the user can input the degree of expression for each sensory parameter. Since the haptic control device 50 accepts input of the degree of expression for each sensory parameter a plurality of times when reproducing the operation feel preferred by the user, the sensory parameters for which the degree of expression can be input are displayed on the touch panel 53 each time.
- The three reference operation tools 51a to 51c are operation tools with different operation feels prepared as references. That is, the three reference operation tools 51a to 51c have different load-displacement curves.
- the operation feel of the reference operation tool 51 selected by the tactile control device 50 from among the three reference operation tools 51a to 51c is reproduced in the reproduction operation tool 52. That is, the haptic control device 50 copies the physical parameters of any one of the three reference manipulators 51 a to 51 c to the reproduction manipulator 52 .
- the user can adjust the operation feel to his/her preference by trying to operate the reproduction operation tool 52 and inputting the degree of expression.
- The user can adjust the operational feel in real time by repeatedly adjusting the operational feel of the reproduction operation tool 52 through input of the degree of expression while operating the reproduction operation tool 52 and confirming the feel. Further, the user can compare the operation feel of the adjusted reproduction operation tool 52 with the operation feel of the reference operation tools 51a to 51c, so the user can easily adjust the degree of expression to his/her preference.
- The configuration in FIG. 25 is only an example; for example, a general-purpose system configuration in which the reference operation tool 51 and the reproduction operation tool 52 are connected to a PC or tablet terminal via a USB cable or the like may be used.
- FIG. 26 shows a client-server type haptic control system 2.
- the terminal device 80 and the server 200 can communicate via a network.
- the terminal device 80 may execute a web browser, for example, or may execute a dedicated application.
- the terminal device 80 displays a screen necessary for inputting the degree of expression for each sensitivity parameter, and accepts the input of the degree of expression from the user.
- The terminal device 80 transmits the degree of expression to the server 200, and the server 200 transmits to the terminal device 80 the selection result of the reference operation tools 51a to 51c, the physical parameters corresponding to the reference operation tools 51a to 51c, and the physical parameters after adjustment.
- the user can adjust the operational feel in real time, similar to the haptic control device 50 .
- FIGS. 27 and 28 show an outline of the user's operation of adjusting the operational feel using the haptic control device 50.
- The user first inputs a degree of expression (an example of a first degree of expression) representing his/her preference for a plurality of sensory parameters (e.g., adjectives) (FIG. 27(a)).
- A first input screen 281 shown in FIG. 27(a) is displayed on the touch panel 53; the first input screen 281 has a sensory parameter presentation field 282 and a reference operation tool column 112.
- The user can input the degree of expression with a slide bar (an example of input means) for each sensory parameter (an example of a first sensory parameter).
- The reference operation tool column 112 displays the probabilities with which the reference operation tools 51a to 51c are selected for the input degrees of expression.
- Based on the previously learned correspondence between the degree of expression of each sensory parameter and the reference operation tools 51a to 51c, the haptic control device 50 selects the reference operation tool 51a to 51c closest to the user's preference (the input degrees of expression of each sensory parameter) (FIG. 27(b)). This process is called STEP1.
- the tactile control device 50 reproduces the operation feel of the reference operation tools 51a to 51c with the reproduction operation tool 52 (FIG. 27(c)).
- the number of reference manipulators 51a to 51c is three, but this is only an example.
- the user tries to operate the reproduction operation tool 52 and confirms whether or not the operation feeling is to his liking.
- The user again inputs a degree of expression (an example of a second degree of expression) representing his/her preference for the plurality of sensory parameters (FIG. 28(a)).
- A second input screen 120 shown in FIG. 28(a) is displayed on the touch panel 53; the second input screen 120 has a sensory parameter presentation field 121.
- The user can input the degree of expression with a slide bar for each sensory parameter (an example of a second sensory parameter).
- The number of sensory parameters in the sensory parameter presentation field 121 may be smaller than the number in the sensory parameter presentation field 282, because the reference operation tool 51 preferred by the user has already been selected based on the sensory parameter presentation field 282.
- Since the number of sensory parameters in the sensory parameter presentation field 121 is small, the user's workload is reduced.
- In the initial state, the slide bars indicate the median degree of expression. Even if the user set the degree of expression of the same sensory parameter to the minimum or maximum value in the sensory parameter presentation field 282, the slide bars in the initial state of the sensory parameter presentation field 121 indicate the median value. This allows the user to easily adjust the degree of expression in the sensory parameter presentation field 121 within a range around the degree of expression input in the sensory parameter presentation field 282. The degree of expression in the initial state of the sensory parameter presentation field 121 also corresponds to the physical parameters set in the reference operation tool 51; by adjusting from this initial state, the user can adjust the degree of expression in both directions.
- The haptic control device 50 converts the degree of expression of each sensory parameter input by the user into physical parameters, based on the previously learned correspondence between the degree of expression of each sensory parameter and the physical parameters (for example, a regression model), and reflects them in the reproduction operation tool 52 (FIG. 28(b)). This process is called STEP2. The user then tries operating the reproduction operation tool 52 and confirms whether the operation feel is to his/her liking (FIG. 28(c)).
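The STEP2 conversion from degrees of expression to physical parameters via a learned linear regression model can be sketched as follows. The coefficient matrix `B`, the bias values, and the parameter counts are illustrative assumptions, not values from the source.

```python
def to_physical(degrees, B, bias):
    """Apply a learned linear model: P_i = bias_i + sum_j B[i][j] * A_j,
    where A_j are the input degrees of expression."""
    return [b0 + sum(bij * a for bij, a in zip(row, degrees))
            for row, b0 in zip(B, bias)]

# Hypothetical learned coefficients: two physical parameters derived from
# three degrees of expression.
B = [[0.5, -0.2, 0.1],   # e.g. toward an operation reaction force
     [0.0,  0.3, 0.4]]   # e.g. toward a stroke amount
bias = [1.0, 0.5]

# User's STEP2 input (degrees of expression on a 0..1 scale, illustrative)
physical = to_physical([0.8, 0.2, 0.6], B, bias)
```

The resulting physical parameters would then be set in the reproduction operation tool 52, so the user can immediately feel the adjusted result.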
- the haptic control device 50 can determine the physical parameters of the user's preferred operational feel.
- FIG. 29 is a functional block diagram illustrating functions of the haptic control device 50.
- The haptic control device 50 has a display control unit 61, a first input reception unit 62, a second input reception unit 63, a classification unit 64, a first conversion model 65a, a second conversion model 65b, a third conversion model 65c, and a physical parameter setting unit 66.
- Each of these functions of the haptic control device 50 is implemented by a CPU or processor, as an information processing device, executing a program developed in a RAM. Alternatively, each function may be realized by a hardware circuit.
- The display control unit 61 displays on the touch panel 53, in selectable form, preset sensory parameters and 5 or 7 levels of the degree of expression set for the sensory parameters (i.e., it displays the first input screen 281 and the second input screen 120).
- the degree of expression can be adjusted in arbitrary steps or continuously.
- a method of selecting the degree of expression by the user may be tapping using the touch panel 53, sliding a slide bar, or the like.
- The method of selecting the degree of expression by the user may also be voice input or button input.
- The display control unit 61 displays different sensory parameters in STEP1 and STEP2. The number of sensory parameters in STEP1 may be larger than the number of sensory parameters in STEP2.
- The first input reception unit 62 receives input of the degree of expression of each sensory parameter according to the user's operation.
- The second input reception unit 63 receives input of the degree of expression of each sensory parameter according to the user's operation.
- The classification unit 64 is a discriminative model that has learned the correspondence between the degrees of expression of the sensory parameters received by the first input reception unit 62 and the three conversion models. There are many classification learning methods, such as deep learning, decision trees, and support vector machines, and any learning method may be used in this embodiment.
- The classification unit 64 outputs, for the degrees of expression of the sensory parameters received by the first input reception unit 62, the identification information of one of the first conversion model 65a to the third conversion model 65c (i.e., it specifies, from among the plurality of conversion models 15, the conversion model 15 close to the user's preference).
- The first conversion model 65a to the third conversion model 65c are conversion models capable of converting a sensory parameter into a physical parameter correlated with the sensory parameter, as described in aspect 1.
- The first conversion model 65a to the third conversion model 65c correspond to the three reference operation tools 51a to 51c, and can convert the degree of expression of a sensory parameter into a physical parameter for each of the reference operation tools 51a to 51c.
- The physical parameters include, for example, the stroke amount of the operation tool, the operation reaction force (load), the velocity, acceleration, and jerk of the movable part, the elastic characteristics of body parts such as the operator's fingers, and the like.
- The first conversion model 65a to the third conversion model 65c are generated by multiple regression or the like, based on the degrees of expression obtained in sensory tests for physical parameters having different load-displacement curves, in order to reproduce different operational feels.
- The first conversion model 65a to the third conversion model 65c convert the degrees of expression of the sensory parameters received by the second input reception unit 63 into different physical parameters.
- The conversion model selected in STEP1 can convert the degree of expression, close to the user's preference, input in STEP2 into physical parameters.
- the physical parameter setting unit 66 sets the physical parameters output by any one of the first conversion model 65a to the third conversion model 65c to the reproduction operation tool 52. Therefore, the tactile control device 50 can reproduce in real time the operation feeling desired by the user.
- FIG. 30 is a flow chart showing the flow of learning in the generation of the classification unit 64. The various types of learning here are performed by the haptic control device 50, but the learning can be performed by any information processing device.
- the haptic control device 50 receives an input of the degree of expression.
- The sensory parameters used for generation of the classification unit 64 are those shown in FIG. 27(a).
- The sensory parameters are, for example, 24 items; 24 is an example, and there may be fewer or more.
- The haptic control device 50 learns the correspondence between the degrees of expression of the sensory parameters and the reference operation tools 51a to 51c by machine learning.
- The classification unit 64 holds this correspondence.
- Machine learning is a technology that allows computers to acquire human-like learning abilities: a computer autonomously generates, from previously given learning data, the algorithms necessary for judgments such as data identification, and applies them to new data to make predictions.
- The learning method for machine learning may be supervised learning, unsupervised learning, semi-supervised learning, reinforcement learning, or deep learning, or a combination of these; any learning method may be used. Machine learning algorithms include the perceptron, deep learning, support vector machines, logistic regression, naive Bayes, decision trees, and random forests, and the algorithm is not limited. Deep learning and decision trees will be explained later as examples of the learning method.
- the classification unit 64 generated by machine learning is incorporated into the haptic control device 50.
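As a toy stand-in for the learned classification unit 64, the sketch below computes one centroid of degree-of-expression vectors per reference operation tool and classifies a new input to the nearest centroid. The real device may use deep learning, decision trees, or support vector machines instead; tool labels and all values are hypothetical.

```python
def fit_centroids(samples):
    """samples: {tool_id: [vector, ...]} -> {tool_id: centroid}"""
    return {tool: [sum(col) / len(col) for col in zip(*vecs)]
            for tool, vecs in samples.items()}

def classify(centroids, vector):
    """Return the tool id whose centroid is nearest to the input vector."""
    def dist2(tool):
        return sum((c - v) ** 2 for c, v in zip(centroids[tool], vector))
    return min(centroids, key=dist2)

# Degrees of expression collected in sensory tests (hypothetical values)
training = {
    "51a": [[0.9, 0.1], [0.8, 0.2]],  # e.g. felt "sharp, light"
    "51b": [[0.1, 0.9], [0.2, 0.8]],  # e.g. felt "mild, heavy"
    "51c": [[0.5, 0.5], [0.4, 0.6]],
}
centroids = fit_centroids(training)
selected = classify(centroids, [0.85, 0.15])  # user's STEP1 input
```

The selected tool id plays the role of the identification information that designates one of the first conversion model 65a to the third conversion model 65c.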
- FIG. 31 is a flow chart showing the flow of learning the correspondence between the degree of expression of a sensory parameter and a physical parameter.
- the tactile sense control device 50 receives an input of the degree of expression.
- FIG. 28(a) shows the sensory parameters used for learning the correspondence between the degrees of expression of the sensory parameters and the physical parameters.
- The sensory parameters are, for example, the following five items (five is an example, and there may be fewer or more): "mild (sharp)", "coarse (fine)", "bright (dark)", "soft (hard)", "light (heavy)".
- The haptic control device 50 determines the correspondence between the degree of expression of a sensory parameter and a physical parameter by multiple regression analysis. In this aspect, since three reference operation tools 51a to 51c are prepared, a load-displacement curve is obtained for each of the three reference operation tools 51a to 51c.
- The physical parameters that realize each load-displacement curve are also known.
- The user operates the reference operation tools 51a to 51c and inputs the operational feel of the reference operation tools 51a to 51c as degrees of expression.
- The haptic control device 50 performs multiple regression analysis using Equation 5.
- The multiple regression analysis was described with Equation 5 of Embodiment 1 and the accompanying figures. The coefficients B11 to Bmn can thereby be determined for each of the three reference operation tools 51a to 51c, and a conversion model 15 as shown in FIG. 23 is obtained for each of the reference operation tools 51a to 51c.
- The conversion models for the three reference operation tools 51a to 51c are the first conversion model 65a to the third conversion model 65c.
- The first conversion model 65a to the third conversion model 65c generated by the multiple regression analysis are incorporated into the haptic control device 50.
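The multiple-regression step can be sketched with ordinary least squares: given degrees of expression A (one row per sensory-test trial) and the known physical parameter values P of the tools used, least squares recovers the coefficient matrix B such that P is approximately A_aug @ B. All numerical values below are illustrative, not taken from the source.

```python
import numpy as np

# Degrees of expression per sensory-test trial (hypothetical, 2 parameters)
A = np.array([[0.2, 0.7],
              [0.9, 0.1],
              [0.5, 0.5],
              [0.3, 0.8]])
A_aug = np.hstack([np.ones((4, 1)), A])  # add an intercept column

# Ground-truth coefficients used only to synthesize consistent data here
true_B = np.array([[1.0, 0.5],    # intercepts for P1, P2
                   [2.0, -1.0],   # effect of A1 on P1, P2
                   [0.5, 3.0]])   # effect of A2 on P1, P2
P = A_aug @ true_B                # "measured" physical parameters per trial

# Least-squares estimate of the conversion-model coefficients (B11..Bmn)
B_est, *_ = np.linalg.lstsq(A_aug, P, rcond=None)
```

In practice the degrees of expression come from users operating tools with known load-displacement curves, and the fitted B defines the conversion model 15.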
- FIG. 32 is a flow chart showing the flow in which the haptic control device 50 uses the classification unit 64 and the first conversion model 65a to the third conversion model 65c to present the user's preferred operational feel.
- The first input screen 281 receives the input of the degree of expression of each sensory parameter (STEP1).
- The classification unit 64 selects one of the reference operation tools 51a to 51c based on the degree of expression of each sensory parameter input on the first input screen 281. When the reference operation tool is determined, the corresponding one of the first conversion model 65a to the third conversion model 65c is also determined.
- The physical parameter setting unit 66 sets the physical parameters of the selected reference operation tool in the reproduction operation tool 52.
- the user can operate the reproduction operation tool 52 to confirm whether or not the operation feeling is to his liking.
- The user determines whether or not to adjust the operation feel to differ from the reference operation tools 51a to 51c, depending on whether the operation feel is to his/her liking.
- The haptic control device 50 receives an instruction to start readjustment from the user.
- The second input reception unit 63 receives input of the degree of expression of each sensory parameter on the second input screen 120 (STEP2).
- The degrees of expression input by the user (corresponding to A1 to An in FIG. 23) are converted into physical parameters P1 to Pn by the one of the first conversion model 65a to the third conversion model 65c selected in ST63.
- The physical parameter setting unit 66 sets the physical parameters P1 to Pn in the reproduction operation tool 52. The user can operate the reproduction operation tool 52 again to confirm whether the operation feel is to his/her liking.
- The user can repeat the adjustment using the second input screen 120 until the desired operation feel is obtained.
- the tactile control device 50 of this aspect can reproduce the user's preferred operational feel in real time.
- FIG. 33 shows an outline of the operation of adjusting the operational feel by the user using the haptic control device 50.
- The user first inputs the degree of expression representing his/her preference for a plurality of sensory parameters on the first input screen 281 (FIG. 33(a)).
- the first input screen 281 may be the same as in FIG. 27(a).
- Based on the correspondence between the degree of expression of each sensory parameter and the physical parameter (load-displacement curve) learned by regression in advance, the haptic control device 50 determines the physical parameter (an example of a second physical parameter) corresponding to the input degree of expression (FIG. 33(b)).
- The haptic control device 50 performs curve fitting, using an appropriate fitting model, on the load-displacement curves of the reference operation tools 51a to 51c prepared in advance (FIG. 33(c)).
- This fitting model is, for example, a polynomial with physical parameters as coefficients. A physical parameter (an example of a first physical parameter) representing the load-displacement curve is thereby obtained for each of the reference operation tools 51a to 51c.
- The haptic control device 50 compares the physical parameter obtained in (2) with the physical parameters obtained in (3).
- If they are similar, the haptic control device 50 presents the similar reference operation tool 51; if they are not similar, it proposes adjustment of a new operation feel using the reproduction operation tool 52 (FIG. 33(d)).
- FIG. 34 is a functional block diagram illustrating functions of the haptic control device 50. Note that the explanation of FIG. 34 mainly covers differences from the earlier functional block diagram.
- The haptic control device 50 has a display control unit 61, a first input reception unit 62, a second input reception unit 63, a physical parameter conversion unit 67, a curve fitting unit 68, a comparison unit 69, a first conversion model 65a, a second conversion model 65b, a third conversion model 65c, and a physical parameter setting unit 66.
- Each of these functions of the haptic control device 50 is implemented by the CPU, as an information processing device, executing a program developed in the RAM. Alternatively, each function may be realized by a hardware circuit.
- The physical parameter conversion unit 67 determines the physical parameter corresponding to the degree of expression received by the first input reception unit 62, using the correspondence between the degree of expression and the physical parameter obtained by multiple regression analysis. Since the load-displacement curve is determined once the physical parameter is determined, it can be said that the physical parameter conversion unit 67 determines the load-displacement curve.
- The curve fitting unit 68 fits the load-displacement curves of the reference operation tools 51a to 51c (the first conversion model 65a to the third conversion model 65c) with an appropriate fitting model (e.g., a polynomial).
- Curve fitting is a form of multiple regression analysis. By setting the physical parameters to the coefficients of the polynomial, the curve fitting section 68 can estimate the physical parameters for each of the reference manipulators 51a to 51c. Therefore, the fitting model should be chosen to fit the load-displacement curve with the physical parameters.
- The comparison unit 69 compares the physical parameters determined by the physical parameter conversion unit 67 with the physical parameters determined by the curve fitting unit 68, and determines whether or not they are similar. For example, the comparison unit 69 calculates the sum of squares of the differences of the physical parameters P1 to Pn and determines whether it is less than a threshold. If there are similar physical parameters, the comparison unit 69 notifies the physical parameter setting unit 66 of the physical parameters of the corresponding reference operation tool 51a to 51c.
- The physical parameter setting unit 66 sets the physical parameters of the instructed reference operation tool 51 in the reproduction operation tool 52.
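The sum-of-squares similarity check in the comparison unit 69 can be sketched as follows; the threshold value, tool labels, and parameter values are all illustrative assumptions.

```python
THRESHOLD = 0.05  # hypothetical similarity threshold on the sum of squares

def sum_sq_diff(p, q):
    """Sum of squared differences over physical parameters P1..Pn."""
    return sum((a - b) ** 2 for a, b in zip(p, q))

def most_similar_tool(user_params, tool_params):
    """Return (tool_id, score) if some tool is similar enough,
    else (None, score) to propose a new adjustment instead."""
    tool, score = min(((t, sum_sq_diff(user_params, p))
                       for t, p in tool_params.items()),
                      key=lambda pair: pair[1])
    return (tool, score) if score < THRESHOLD else (None, score)

# Physical parameters estimated by curve fitting per reference tool
tool_params = {"51a": [1.0, 0.2], "51b": [0.4, 0.9], "51c": [0.7, 0.5]}

# Physical parameters from the user's degrees of expression (close to 51a)
result, _ = most_similar_tool([0.95, 0.25], tool_params)
```

A `None` result corresponds to the "not similar" branch, where adjustment with the reproduction operation tool 52 is proposed instead of presenting a reference tool.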
- FIG. 35 is a flow chart showing the flow of learning the physical parameter (load-displacement curve) corresponding to the expression frequency.
- the haptic control device 50 accepts input of the degree of expression.
- the sensibility parameters used for generation of the classifier 64 are shown in FIG. 27(a).
- the haptic control device 50 determines the correspondence between the expression frequency of the sensory parameter and the physical parameter by multiple regression analysis.
- the user inputs, as expression frequencies, the operational feel of an operation tool whose physical parameters are known.
- the manipulation tool whose physical parameters are known may be the reference manipulation tool 51 or an arbitrary manipulation tool.
- the haptic control device 50 uses Equation 5 to perform multiple regression analysis.
- the multiple regression analysis was described with Equation 5 in Embodiment 1 and the accompanying figures. The haptic control device 50 can thus determine the coefficients B 11 to B mn of Equation 5, and the conversion model 15 as shown in FIG. 23 is obtained.
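- The regression step above can be sketched as an ordinary least-squares fit; the data, dimensions, and variable names below are illustrative assumptions (the patent's Equation 5 is not reproduced verbatim):

```python
import numpy as np

# Hedged sketch: learn coefficients B that map the expression frequencies of
# m sensitivity parameters to n physical parameters, via least squares.
rng = np.random.default_rng(0)
m, n, samples = 4, 2, 30
X = rng.uniform(1, 5, size=(samples, m))   # expression frequencies per user
B_true = rng.normal(size=(m, n))           # unknown mapping (ground truth here)
P = X @ B_true                             # physical parameters (noise-free)

# Multiple regression: estimate B from observed (X, P)
B_est, *_ = np.linalg.lstsq(X, P, rcond=None)
print(np.allclose(B_est, B_true))          # exact recovery in the noise-free case
```

With real sensory-test data the fit would not be exact, but the estimated coefficients play the role of B 11 to B mn in Equation 5.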
- the physical parameter conversion unit 67 generated by the multiple regression analysis is incorporated into the haptic control device 50.
- FIG. 36 is a flow chart showing the flow of curve fitting of the load-displacement curves of the reference manipulators 51a to 51c.
- the curve fitting section 68 performs curve fitting on the load-displacement curves of the reference manipulators 51a to 51c. As shown in FIG. 9, the correspondence between the stroke amount x and the operation reaction force is obtained for each of the reference manipulators 51a to 51c.
- the curve fitting unit 68 performs curve fitting by applying the set of the stroke amount x and the operation reaction force y to the fitting model.
- the fitting model is a formula that obtains the operation reaction force from the stroke amount x using physical parameters as coefficients.
- the following fitting model is just an example, and an appropriate model (formula) may be adopted that obtains the operation reaction force y from the stroke amount x using the physical parameters as coefficients.
- Fitting model: y = P1·x^0 + P2·x^1 + P3·x^2 + … + Pn·x^(n-1)
- the curve fitting unit 68 can obtain P1 to Pn by multiple regression analysis.
- the obtained P 1 to P n correspond to physical parameters.
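- The polynomial fit described above can be sketched with a least-squares solve; the stroke data and "true" coefficient values below are made up for illustration:

```python
import numpy as np

# Hedged sketch of the curve fitting in section 68: fit the operation
# reaction force y against the stroke amount x with the polynomial model
# y = P1*x^0 + P2*x^1 + ... ; the fitted coefficients play the role of the
# physical parameters P1..Pn.
x = np.linspace(0.0, 4.0, 50)                        # stroke amount
P_true = [0.2, 1.5, -0.3]                            # illustrative parameters
y = P_true[0] + P_true[1] * x + P_true[2] * x ** 2   # "measured" reaction force

n = 3
V = np.vander(x, n, increasing=True)                 # columns: x^0, x^1, x^2
P_est, *_ = np.linalg.lstsq(V, y, rcond=None)
print(np.round(P_est, 6))                            # recovers [0.2, 1.5, -0.3]
```

The same estimate applied to each reference tool's measured load-displacement curve yields the per-tool physical parameters used by the comparison unit.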
- FIG. 37 is a flow chart showing a flow in which the tactile sense control device 50 uses the physical parameter conversion section 67 and the comparison section 69 to present the user's preferred operational feel.
- the first input reception unit 62 receives input of the expression frequency of the sensitivity parameter for selecting the reference manipulation tools 51a to 51c.
- the physical parameter conversion unit 67 converts each sensitivity parameter into a physical parameter (load-displacement curve) based on the expression frequency.
- the comparison section 69 compares the physical parameter determined by the physical parameter conversion section 67 with the physical parameter determined by the curve fitting section 68 for each of the reference manipulation tools 51a to 51c.
- the comparison unit 69 determines whether or not there are reference manipulation tools 51a to 51c having physical parameters similar to the physical parameters converted by the physical parameter conversion unit 67.
- As described above, one method for this determination is to check whether the sum of squared differences between the physical parameters P 1 to P n determined by the physical parameter conversion unit 67 and the physical parameters P 1 to P n obtained by curve fitting for the reference manipulators 51a to 51c is less than a threshold.
- the physical parameter setting unit 66 sets the physical parameters of the reference manipulators 51a to 51c similar to the physical parameters determined by the physical parameter conversion unit 67 to the reproduction manipulator 52.
- the physical parameter setting unit 66 sets the physical parameters of the reference manipulators 51a to 51c with the highest degree of similarity to the reproduction manipulator 52.
- the classification unit 64 of the first form may be provided, and the classification unit 64 may determine the reference manipulation implements 51a to 51c (the first conversion model 65a to the third conversion model 65c).
- the user can use the second input screen 120 to repeat the adjustment of the desired operation feel until the desired operation feel is obtained.
- the tactile control device 50 of this aspect can reproduce the user's preferred operational feel in real time.
- FIG. 38 shows an example in which the classification unit 64 is implemented by a neural network.
- the three nodes of the output layer 133 each output an output value yi for the data input to the input layer 131.
- The output values yi are probabilities, and y1 + y2 + y3 equals 1.0.
- The three nodes of the output layer 133 correspond to the three reference manipulation tools 51a to 51c, and each outputs the probability that its reference manipulation tool is the likely one for the input expression frequencies.
- FIG. 38 shows a neural network in which L layers (for example, 3 layers) from the input layer 131 to the output layer 133 are fully connected.
- a neural network with many layers is called a DNN (Deep Neural Network).
- a layer between the input layer 131 and the output layer 133 is called an intermediate layer 132 . Since the number of intermediate layers and the number of nodes can be set arbitrarily, the number of layers, the number of nodes 130 in each layer, and the like are only examples. In this embodiment, the number of nodes 130 in the input layer is the number of sensitivity parameters (24 in FIG. 27(a)). Note that the degree of expression may be set in arbitrary steps such as 5 steps, 3 steps, etc. for each sensitivity parameter, or may be adjusted continuously.
- Equation (1) shows how the input to node 130 is calculated: u_j^(l) = Σ_{i=1..I} w_ji^(l,l-1) · z_i^(l-1) + b_j … (1)
- Here, w_ji^(l,l-1) is the weight between the j-th node of the l-th layer and the i-th node of the (l-1)-th layer, b_j is the bias component, u_j^(l) is the input to the j-th node of the l-th layer, z_i^(l-1) is the output of the i-th node of the (l-1)-th layer, and I is the number of nodes in the (l-1)-th layer.
- The input u_j^(l) to the node is activated by the activation function f, as shown in equation (2): z_j^(l) = f(u_j^(l)) … (2)
- f denotes the activation function of the node.
- ReLU, tanh, sigmoid, etc. are known as activation functions.
- the nodes of the input layer 131 are not activated; they simply pass the input data on to the second layer.
- the l-th layer node 130 non-linearizes the input with an activation function and outputs it to the l+1-th layer node 130 . This process is repeated from the input layer 131 to the output layer 133 in the neural network.
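- Equations (1) and (2) amount to a weighted sum followed by a nonlinearity; a minimal sketch with illustrative weight and input values, using ReLU as the activation f:

```python
# One node's forward computation per equations (1) and (2):
# u_j = sum_i w_ji * z_i + b_j, then z_j = f(u_j) with f = ReLU.

def relu(u):
    return max(0.0, u)

def node_forward(weights, inputs, bias):
    u = sum(w * z for w, z in zip(weights, inputs)) + bias  # equation (1)
    return relu(u)                                          # equation (2)

z_prev = [0.5, -1.0, 2.0]          # outputs of layer l-1
w = [0.2, 0.4, 0.1]                # weights w_ji (illustrative)
b = 0.05                           # bias b_j
print(node_forward(w, z_prev, b))  # 0.1 - 0.4 + 0.2 + 0.05 = -0.05 -> ReLU -> 0.0
```

Repeating this computation for every node of every layer, from the input layer to the output layer, is exactly the propagation the text describes.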
- Each node of the output layer 133 receives the outputs z i from the nodes of the intermediate layer 132 and sums them. An activation function for the output layer is then applied at the nodes of the output layer 133.
- the activation function of the output layer 133 is generally a softmax function.
- Each node of the output layer 133 outputs the output value yi of the softmax function.
- a teacher signal (1 or 0) is set after each node of the output layer 133 is made to correspond to the reference manipulation tool.
- each node of the output layer 133 can output the probabilities of the reference manipulators 51a to 51c to which the 24 sensitivity parameters correspond.
- the nodes correspond, from the top, to the reference operation tools 51a to 51c. However, if the output value is less than a threshold, the input may be determined to be unclassified.
- a plurality of users operate three reference manipulators 51a to 51c, and input the degree of expression for each of the reference manipulators 51a to 51c.
- the number of training data sets, each pairing 24 sensitivity parameters with one teacher signal (indicating which reference operation tool), is the number of users × the number of reference operation tools.
- the teacher signal is one of (1,0,0) (0,1,0) (0,0,1).
- the neural network processes the expression frequency input to the input layer 131 and outputs the output value yi from the output layer 133.
- in learning, the expression frequencies of a training-data pair are input, and the corresponding teacher signal is given to the nodes of the output layer 133.
- the error between the output value yi of the node of the output layer 133 and the teacher signal is calculated using a loss function. If the activation function of the output layer 133 is the softmax function, the loss function is the cross-entropy.
- the error between the teacher signal and the output value calculated by the loss function is propagated to the nodes of the input layer 131 by a calculation method called error backpropagation. Weights w between nodes are learned in the process of propagation. Details of the backpropagation method are omitted.
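- The cross-entropy loss mentioned above can be sketched as follows; the teacher signal and probability values are illustrative:

```python
import math

# Cross-entropy between a one-hot teacher signal and a softmax output: the
# loss whose error is backpropagated to learn the weights w between nodes.

def cross_entropy(teacher, output):
    eps = 1e-12                                 # avoid log(0)
    return -sum(t * math.log(o + eps) for t, o in zip(teacher, output))

teacher = (1, 0, 0)                             # e.g. reference tool 51a
good = [0.9, 0.05, 0.05]                        # confident, correct output
bad = [0.1, 0.45, 0.45]                         # incorrect output
print(cross_entropy(teacher, good) < cross_entropy(teacher, bad))  # True
```

A correct, confident output yields a small loss, so gradient descent via backpropagation pushes the network toward outputs matching the teacher signal.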
- after learning, for expression frequencies input for the reference manipulation tool 51a, the neural network outputs a value close to 1.0 at the node 130 of the output layer 133 corresponding to the reference manipulation tool 51a.
- The nodes corresponding to the other reference manipulation tools are expected to output values close to 0.0.
- FIG. 39 shows an example of a decision tree when the classification unit 64 is realized by a decision tree.
- a decision tree is a machine learning technique that finds clusters of data in which specific features appear frequently and generates classification rules for them. In this embodiment, learning corresponds to determining the sensitivity parameters that frequently appear in each of the three reference manipulation tools 51a to 51c and their expression frequencies.
- a method using entropy is known as one of methods for learning the structure of a decision tree.
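- The entropy criterion can be illustrated with a small worked example; the labels and candidate split below are made up, and this is only the split-scoring step of decision-tree learning, not a full learner:

```python
import math

# Entropy-based split scoring: a split is chosen to maximize information
# gain, i.e. the drop in entropy of the class labels (here, which of three
# reference tools 'a'/'b'/'c' a sample belongs to).

def entropy(labels):
    total = len(labels)
    probs = [labels.count(c) / total for c in set(labels)]
    return -sum(p * math.log2(p) for p in probs)

parent = ['a', 'a', 'b', 'b', 'c', 'c']
left, right = ['a', 'a'], ['b', 'b', 'c', 'c']      # candidate split
gain = (entropy(parent)
        - (len(left) / len(parent)) * entropy(left)
        - (len(right) / len(parent)) * entropy(right))
print(round(gain, 4))   # positive: the split reduces label entropy
```

Repeating this scoring over candidate thresholds on each sensitivity parameter, and splitting greedily, builds the tree structure.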
- In addition to neural networks and decision trees, support vector machines, random forests, logistic regression, and the like may be used as machine learning methods suited to classification.
- FIG. 40 is a diagram explaining the first input screen 281 for the degree of expression in STEP1.
- the user operates the slide bar for each sensitivity parameter to input the degree of expression.
- the classification unit 64 described in the first mode uses the learning result to calculate the probability that each of the reference manipulation tools 51a to 51c is selected in the case of the current expression frequency.
- the display control unit 61 displays the probability of each of the reference manipulation tools 51a to 51c in the reference manipulation tool column 112. Therefore, the user can grasp which of the reference manipulators 51a to 51c the current expression frequencies are close to.
- the display of the probability may be displayed in real time or when the user inputs a determination operation.
- the display control unit 61 initializes the slide bars in the sensitivity parameter presentation field 282 to the expression frequencies set for the selected reference manipulation tool 51a to 51c. Therefore, the user can easily confirm the expression frequencies of each of the reference manipulation tools 51a to 51c.
- the expression frequency at the time of initialization may be, for example, the median value or the average value of the expression frequency input to the reference manipulation tool 51 in the sensory test.
- FIG. 41 is a functional block diagram of a haptic control system 2 in which the haptic control device 50 of the first form is applied to a client server system.
- the terminal device 80 and the server 200 divide between them the same functions as those of the tactile control device 50.
- FIG. 42 is a sequence diagram explaining the operation of the haptic control system 2.
- In the explanation of FIG. 42, mainly the differences from FIG. 32 will be described.
- the first input receiving unit 62 receives the input of the expression frequency of the sensitivity parameter for selecting the reference manipulation tools 51a to 51c input on the first input screen 281 (STEP 1).
- the first communication section 71 of the terminal device 80 transmits the expression frequency of each sensitivity parameter to the server 200.
- the classification unit 64 of the server 200 selects the reference manipulation implements 51a to 51c based on the expression frequency of each sensitivity parameter.
- the second communication section 72 of the server 200 transmits the physical parameters of the reference manipulation tools 51a to 51c to the terminal device 80.
- the first communication unit 71 of the terminal device 80 receives the physical parameters of the reference manipulators 51a to 51c, and the physical parameter setting unit 66 sets them to the reproduction manipulator 52.
- the user determines whether or not to adjust the operation feel to be different from the reference operation tools 51a to 51c depending on whether or not the user prefers the operation feel.
- the second input reception unit 63 receives input of the expression frequency of the sensitivity parameter input on the second input screen 120 (STEP 2).
- the first communication section 71 of the terminal device 80 transmits the expression frequency of the sensitivity parameter to the server 200.
- one of first conversion model 65a to third conversion model 65c of server 200 (already selected in ST103) converts the expression frequency into physical parameters P 1 to P n .
- the physical parameter setting section 66 of the server 200 transmits the converted physical parameters to the terminal device 80 via the second communication section 72 .
- the physical parameters of the reference manipulators 51a to 51c received by the first communication unit 71 of the terminal device 80 are set in the reproduction manipulator 52.
- the haptic control system 2 of this aspect can reproduce the user's preferred operational feel in real time even in a client-server system.
- FIG. 43 is a functional block diagram of a haptic control system 2 in which the haptic control device 50 of the second form is applied to a client server system.
- the terminal device 80 and the server 200 divide between them the same functions as those of the haptic control device 50.
- FIG. 44 is a sequence diagram explaining the operation of the tactile control system 2 of the second form. In the explanation of FIG. 44, mainly the difference from FIG. 37 will be explained.
- the first input reception unit 62 receives the input of the expression frequency of the sensitivity parameter on the first input screen 281.
- the first communication section 71 of the terminal device 80 transmits the expression frequency of each sensitivity parameter to the server 200.
- the physical parameter conversion unit 67 of the server 200 converts each sensitivity parameter into a physical parameter (load-displacement curve) based on the expression frequency.
- the comparison unit 69 of the server 200 compares the physical parameters converted by the physical parameter conversion unit 67 with the physical parameters for each of the reference manipulation tools 51a to 51c predetermined by the curve fitting unit 68.
- if there is a similar reference manipulation tool, the second communication unit 72 transmits the physical parameters of that one of the reference manipulation tools 51a to 51c to the terminal device 80.
- the first communication unit 71 of the terminal device 80 receives the physical parameters of the reference manipulators 51a to 51c, and the physical parameter setting unit 66 sets them to the reproduction manipulator 52.
- the second communication unit 72 selects the one of the reference manipulation tools 51a to 51c with the highest degree of similarity and transmits its physical parameters to the terminal device 80.
- the first communication unit 71 of the terminal device 80 receives the physical parameters of the reference manipulators 51a to 51c, and the physical parameter setting unit 66 sets them to the reproduction manipulator 52.
- the classification unit 64 of the first form may be provided, and the classification unit 64 may determine the reference manipulation implements 51a to 51c (the first conversion model 65a to the third conversion model 65c).
- the haptic control system 2 of this aspect can reproduce the user's preferred operational feel in real time even in a client-server system.
- [Claim 1] A tactile control device for controlling an operational feel of an operation tool, comprising: a display control unit that displays input means for a first expression frequency associated with a first sensitivity parameter; a first input reception unit that receives input of the first expression frequency according to a user operation; and a physical parameter setting unit that sets physical parameters prepared in advance to a reproduction operation tool based on the first expression frequency, wherein the display control unit further displays input means for a second expression frequency associated with a second sensitivity parameter, the device further comprising: a second input reception unit that receives input of the second expression frequency according to a user operation; and a conversion unit that converts the second expression frequency into a physical parameter using a regression model, and wherein the physical parameter setting unit sets the physical parameter converted by the conversion unit to the reproduction operation tool.
- [Claim 2] The haptic control device according to claim 1, further comprising a classification unit that classifies the first expression frequency into one of a plurality of reference operation tools, wherein the physical parameter setting unit sets the physical parameter set for the reference operation tool classified by the classification unit to the reproduction operation tool.
- [Claim 3] The haptic control device according to claim 1, further comprising: a curve fitting unit that performs curve fitting on load-displacement curves realized by first physical parameters possessed by a plurality of reference manipulators and estimates the first physical parameters for each of the plurality of reference manipulators; and a physical parameter conversion unit that converts the first expression frequency into second physical parameters using a regression model, wherein the physical parameter setting unit sets, to the reproduction manipulator, the first physical parameters of the reference manipulator having the first physical parameters most similar to the second physical parameters.
- [Claim 4] The haptic control device according to claim 2, wherein the input means for inputting the second expression frequency can take the expression frequency corresponding to the physical parameters set for the reference manipulation tool classified by the classification unit, and values before and after it.
- [Claim 5] The haptic control device according to claim 1, wherein there are a plurality of each of the first sensitivity parameters and the second sensitivity parameters, and the number of the first sensitivity parameters is larger than the number of the second sensitivity parameters.
- [Claim 6] The haptic control device according to claim 2, wherein the classification unit is generated by learning the correspondence between the operational feel of the plurality of reference manipulation tools and the expression frequencies input for each of the first sensitivity parameters by users operating each of the plurality of reference manipulation tools.
- [Claim 7] The haptic control device wherein the regression model is generated by performing regression analysis on the correspondence between the physical parameters possessed by the plurality of reference manipulation tools and the expression frequencies input for each of the second sensitivity parameters by users operating the plurality of reference manipulation tools.
- [Claim 8] The haptic control device wherein the regression model is generated by performing regression analysis on the correspondence between the physical parameters of an arbitrary reference manipulation tool and the expression frequencies input for each of the first sensitivity parameters by a user operating the arbitrary reference manipulation tool.
- [Claim 9] The haptic control device wherein the curve fitting unit estimates the first physical parameters by performing curve fitting on the load-displacement curve using a fitting model that obtains the operation reaction force from the stroke amount using the first physical parameters as coefficients.
- [Claim 10]
- The haptic control device according to claim 1, wherein the first sensitivity parameter and the second sensitivity parameter are adjectives, and the first expression frequency and the second expression frequency are values indicating degrees of the adjectives.
- [Claim 11] The haptic control device according to any one of claims 1 to 10, wherein the first expression frequency and the second expression frequency are each tactile information obtained when a user operates an operation tool.
- [Claim 12]
- A program for causing a tactile control device that controls the operational feel of an operation tool to function as: a display control unit that displays input means for a first expression frequency associated with a first sensitivity parameter; a first input reception unit that receives input of the first expression frequency according to a user operation; and a physical parameter setting unit that sets physical parameters prepared in advance to a reproduction operation tool based on the first expression frequency, wherein the display control unit further displays input means for a second expression frequency associated with a second sensitivity parameter, the program further causing the device to function as: a second input reception unit that receives input of the second expression frequency according to a user operation; and a conversion unit that converts the second expression frequency into a physical parameter using a regression model, and wherein the physical parameter setting unit sets the physical parameter converted by the conversion unit to the reproduction operation tool.
- A haptic control method performed by a haptic control device for controlling an operational feel of an operation tool, comprising: a step of displaying input means for a first expression frequency associated with a first sensitivity parameter; a step of receiving input of the first expression frequency according to a user operation; a step of setting physical parameters prepared in advance to a reproduction operation tool based on the first expression frequency; a step of displaying input means for a second expression frequency associated with a second sensitivity parameter; a step of receiving input of the second expression frequency according to a user operation; a step of converting the second expression frequency into a physical parameter using a regression model; and a step of setting the converted physical parameter to the reproduction operation tool.
- [Claim 15] A haptic control system in which a terminal device and a server communicate via a network, wherein the terminal device comprises: a display control unit that displays input means for a first expression frequency associated with a first sensitivity parameter; a first input reception unit that receives input of the first expression frequency according to a user operation; and a first communication unit that transmits
- [Claim 16] A terminal device comprising: a display control unit that displays input means for a first expression frequency associated with a first sensitivity parameter; a first input reception unit that receives input of the first expression frequency according to a user operation; a first communication unit that transmits the first expression frequency to a server; a physical parameter setting unit that sets the physical parameter transmitted from the server to a reproduction operation tool; and a second input reception unit that receives input of a second expression frequency according to a user operation, wherein the first communication unit transmits the second expression frequency to the server.
- A server that communicates with a terminal device via a network, comprising: a second communication unit that determines a physical parameter prepared in advance based on the first expression frequency received from the terminal device and transmits the determined physical parameter to the terminal device; and a conversion unit that converts the second expression frequency received from the terminal device into a physical parameter using a regression model, wherein the second communication unit transmits the physical parameter converted by the conversion unit to the terminal device.
- Sensory presentation includes tactile presentation, auditory presentation by sound, visual presentation by image display, and the like.
- Sensory presentation is adjusted by adjusting the signals that drive the various devices.
- Patent Literature 3 discloses a technique for exchanging vibration devices themselves in order to realize different vibration intensities.
- the conventional technique has a problem that the sensation is not sufficiently presented according to the physical characteristics of the operation unit. For example, in the case of a rotary operation unit, even if the actuators are driven in the same manner, the user who operates the operation unit will feel differently depending on the size and mass of the operation unit.
- the purpose of this aspect is to provide a technique for presenting sensations according to the physical characteristics of the operation unit.
- the physical parameters correlated with the sensibility parameters are composed of physical parameters that are a composite of the physical parameters of the operation unit and the physical parameters of the actuator. Therefore, the tactile sense presentation device 20 of this aspect is adjusted so that the tactile sense presentation signal is suitable for the physical parameters such as the size and mass of the operation unit.
- the haptic control system 110 includes an adjustment unit that adjusts at least one of the operation signal, the sensation presentation signal, and the sensation presentation based on the physical properties of the operation unit.
- the difference in physical properties of the operating units is detected as follows.
- the user inputs the difference in the physical characteristics of the operation unit to the input/output device 3 as information.
- the size and mass of the operation part are specified.
- the tactile sense presentation device 20 detects the ID, size, mass, etc. representing the difference in physical characteristics of the operation unit by a sensor.
- the sensor that detects the difference in the physical characteristics of the operation unit is a camera, and the camera reads the one-dimensional code and the two-dimensional code.
- the camera identifies the operation unit by recognizing the image of the operation unit.
- the sensor is an IC tag reader, and the IC tag reader reads the ID.
- FIG. 45 is a diagram showing the configuration of the haptic control system 110 of the sensory control system 100 in this aspect.
- Components denoted by the same reference numerals as in FIG. 2 perform similar functions, so the description below focuses mainly on the main components of this aspect.
- the tactile sense presentation device 20 in FIG. 45 newly has an operation unit sensor 254, a torque sensor 251, and a communication unit 256.
- the operation unit sensor 254 detects that the operation unit is attached and information that can identify the operation unit.
- Information that can identify the operation unit includes an IC tag built into the operation unit, a one-dimensional code and a two-dimensional code attached to the operation unit, and an appearance of the operation unit.
- the operation unit sensor 254 is an IC tag reader and acquires the ID (identification information) of the operation unit from the IC tag.
- the operation unit sensor 254 is a camera and acquires the ID of the operation unit from the one-dimensional code or the two-dimensional code.
- Alternatively, the operation unit sensor 254 is a camera and a classifier, and the operation unit is identified by the classifier, which has learned the correspondence between image data of the operation unit's appearance and its ID (thus making the ID of the operation unit known).
- the operating device 33 is an example of the operating section, and the operating section may be a mounting section that can be detached from at least a part of the operating device 33 (whole or part thereof). Also, the main control device 10 and the tactile sense presentation device 20 are an example of a sensory control device.
- the torque sensor converts the current that drives the actuator into torque during calibration for estimating the mass of the operation part. Details will be described later.
- the communication unit 256 receives the size of the operation unit from the mobile terminal 60 by communicating with the mobile terminal 60 . Details will be described later.
- the main controller 10 in FIG. 45 newly has an operation section parameter 54, a calibration section 55, and a mass correction section 261.
- the operation unit parameter 54 will be explained with reference to FIG.
- a calibration unit 55 estimates the mass of the operation unit by calibration.
- a mass correction unit 261 corrects the mass of the operation unit.
- the calibration section 55 and the mass correction section 261 will be described later.
- FIG. 46 shows an example of the operation unit parameters 54.
- the ID of the operation unit is associated with mass, size, and other physical parameters.
- the mass and size are physical properties of the operation unit 201, and are included in the physical parameters in this embodiment.
- the size may be the radius, diameter, or overall length (maximum length) in the case of a rotary operation unit that accepts rotation operations. Further, when the operating portion is a pressing type operating portion, the size may be the length in the pressing direction. If the operation unit is a slide operation unit that accepts a slide operation, the size may be any one of the slide amount, height, width, and thickness. If the operation portion is a pivot operation portion that receives a tilting operation, the size may be the length of the operation portion.
- FIGS. 47A and 47B are diagrams for explaining the difference in the physical characteristics of the rotary operation units.
- FIG. 47(a) shows the small operating portion 201a
- FIG. 47(b) shows the large operating portion 201b.
- An arbitrary operation unit among the operation units 201a and 201b is hereinafter referred to as the "operation unit 201".
- Although the operation units 201a and 201b in FIG. 47 are both of a rotary type, depending on their size (diameter), mass, and so on, the tactile sensation transmitted to the user who operates them differs even if the processor 14 drives the actuators in the same way. For example, the larger the diameter, the smaller the torque required to rotate the operation unit 201. Therefore, if the same reaction force is applied to the rotating operation of the operation units 201a and 201b, the operation unit 201a may feel difficult to rotate, while the operation unit 201b may feel lacking in operational response.
- FIG. 48 is a diagram explaining several methods for detecting the size and mass of the operation unit 201 by the operation unit sensor 254.
- the operation unit 201 has an IC tag 202 built in or attached.
- the operation unit sensor 254 is the IC tag reader 204 , which generates an electric current in the IC tag 202 with electromagnetic waves, communicates with the IC tag 202 , and receives the ID of the operation unit from the IC tag 202 .
- the IC tag reader 204 is preferably installed in the tactile presentation device 20, but may be an external device such as the mobile terminal 60.
- a barcode 203 is attached to the operation unit 201.
- the operation unit sensor 254 captures the barcode 203 with the camera 205 and decodes the barcode 203 to acquire the ID of the operation unit.
- the camera 205 is preferably installed in the tactile sense presentation device 20 , but may be an external device such as the mobile terminal 60 .
- the operation unit sensor 254 captures the operation unit 201 itself with the camera.
- the operation unit sensor 254 estimates the size of the operation unit 201 from the image data based on the preset distance between the camera 205 and the operation unit 201 and the focal length of the camera 205.
- a classifier that has learned the correspondence between the distance, the focal length, image data of the appearance of the operation unit, and the ID can output the ID of the operation unit from the image data. As for the mass, a conversion formula that calculates mass from size is used.
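- The size estimation from image data described above follows the pinhole-camera relation between apparent size, distance, and focal length. The function name and the numbers below are illustrative.

```python
def estimate_size_mm(object_px: float, distance_mm: float,
                     focal_length_px: float) -> float:
    """Pinhole-camera relation: real size = apparent size [px] * distance /
    focal length [px], using the preset camera-to-unit distance and the
    focal length of the camera 205."""
    return object_px * distance_mm / focal_length_px

# A knob that appears 300 px wide at 100 mm with a 1000 px focal length
# is estimated as 30 mm wide.
print(estimate_size_mm(300.0, 100.0, 1000.0))  # 30.0
```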
- the operation unit sensor 254 in FIG. 48 may be built in the tactile presentation device 20 or may exist separately from the tactile presentation device 20 .
- the operation unit sensor 254 may be an information processing device such as the mobile terminal 60 carried by the user.
- the operation unit detected by the operation unit sensor 254 may not be included in the operation unit parameters.
- For example, some controllers used by a user, such as game controllers, allow the user to replace an operation portion such as a knob, and in such cases the user may want an operation feel suited to the attached portion.
- the mobile terminal 60 estimates the physical parameter.
- a user activates a predetermined application on the mobile terminal 60 .
- the user takes an image of the operation unit 201 attached to the tactile presentation device 20 with a camera controlled by the application.
- the application detects the size of the operation unit 201 from the image data of the operation unit 201 .
- the camera of mobile terminal 60 is preferably a stereo camera or a LiDAR scanner.
- the application transmits the size of the operation unit 201 to the tactile presentation device 20 .
- a communication unit 256 receives the size of the operation unit 201 .
- When the communication unit 256 receives only the size, the mass is unknown. Therefore, the mass of the operation unit 201 is calculated from the size using a conversion formula. Alternatively, the application obtains the mass from the size using the conversion formula and transmits it to the tactile presentation device 20.
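- A size-to-mass conversion formula of the kind described above could, for example, model the operation unit as a solid cylinder of a known material. The cylinder model and the material density are assumptions of this sketch, not specified in the disclosure.

```python
import math

KNOB_DENSITY = 2.70e-3  # g/mm^3, density of aluminum; the material is assumed

def mass_from_size(diameter_mm: float, height_mm: float,
                   density_g_mm3: float = KNOB_DENSITY) -> float:
    """Conversion formula from size to mass, modelling the operation unit
    as a solid cylinder: mass = density * pi * r^2 * h."""
    radius = diameter_mm / 2.0
    return density_g_mm3 * math.pi * radius ** 2 * height_mm

# A 30 mm diameter, 15 mm tall aluminum knob weighs roughly 28.6 g.
print(round(mass_from_size(30.0, 15.0), 1))  # 28.6
```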
- the calibration unit 55 estimates the mass by calibration.
- the calibration unit 55 operates the operation unit with a current pattern (rotates in the case of a rotary type), and estimates the mass of the operation unit from the correspondence between the current and the position.
- FIG. 49 is a diagram explaining a method of estimating the mass of the operation section by calibration.
- FIG. 49(a) is a diagram for explaining the position of the rotary type operation unit 201. In the case of the rotary operation unit 201, the position may be the rotation angle about the rotation center. The center of rotation is the center of the circle when the upper surface of the operation unit 201 is circular. When the calibration unit 55 rotates the rotary operation unit 201, the heavier the mass, the larger the current required.
- FIG. 49(b) is a diagram for explaining the relationship between the current required to change the position of the operation unit 201 and the position.
- the relationship between current and position shown in FIG. 49(b) is an example for explanation.
- The heavier the operation unit, the larger the current required to change its position.
- The current has a certain relationship with the torque that rotates the operating section, so the torque required to rotate the operating section is obtained from the current. It is known that the larger the mass of the operating portion, the larger the current needed to change the position. The torque sensor 251 converts this current to torque.
- In this way, the calibration unit 55 estimates the mass M of the attached operation section 201.
- As for the size, the conversion formula between size and mass can be used in reverse to obtain the size from the estimated mass.
- In this way, the application of the mobile terminal 60 can be used, or the size and mass of the attached operation unit can be estimated by calibration.
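- The calibration described above (drive with a known current pattern, observe the current-position correspondence) can be sketched as follows. The torque constant, the solid-disc inertia model, and the neglect of friction are all assumptions of this sketch, not taken from the disclosure.

```python
TORQUE_CONSTANT = 0.002  # N*m per A, an assumed motor constant

def estimate_mass_kg(current_a: float, radius_m: float,
                     angular_accel: float) -> float:
    """Estimate the mass of a rotary operation unit from calibration data.

    Torque is obtained from the drive current (tau = Kt * I); modelling the
    knob as a solid disc (J = M * r^2 / 2) and neglecting friction gives
    M = 2 * tau / (r^2 * alpha), where alpha is the angular acceleration
    observed from the position (rotation angle) samples.
    """
    torque = TORQUE_CONSTANT * current_a
    return 2.0 * torque / (radius_m ** 2 * angular_accel)

# 0.5 A producing 100 rad/s^2 on a 15 mm radius knob -> about 0.089 kg
print(estimate_mass_kg(0.5, 0.015, 100.0))
```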
- the degree of inclination of the operation unit 201 differs depending on the installation location.
- the inclination of the operation unit 201 differs depending on whether the operation unit 201 is attached to the steering wheel or the center console. If the inclination is different, the operation feel of the pressing type operation portion will be different due to the action of gravity. Therefore, the tactile sense presentation device 20 measures the inclination of the installation location of the operation unit 201 using the acceleration sensor 28 and corrects the mass of the operation unit 201 .
- FIG. 50A and 50B are diagrams for explaining correction of the mass of the operation unit 201.
- FIG. 50(a) shows the operation reaction force F1 when the operation unit 201 arranged at an installation location with zero inclination is pressed.
- the operation reaction force F1 is, for example, the maximum value Tmax in FIG.
- FIG. 50(b) shows the operation reaction force F2 when the operation unit 201 arranged at the installation location with the inclination ⁇ is pressed.
- the operation reaction force F2 differs from F1 by the component of gravity acting on the operation unit 201 that corresponds to the inclination θ.
- the mass correction unit 261 corrects the mass of the operation unit 201 by regarding the difference in operation reaction force as the difference in mass.
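- Regarding the reaction-force difference as a mass difference, as the mass correction unit 261 does, can be sketched as follows. The correction formula (delta_M = (F2 - F1) / g) and the numbers are an illustrative assumption consistent with the text, not a formula given in the disclosure.

```python
G = 9.8  # gravitational acceleration [m/s^2]

def corrected_mass(mass_kg: float, f1_n: float, f2_n: float) -> float:
    """Correct the mass of the operation unit 201 by regarding the
    difference between the reaction forces F1 (zero inclination) and
    F2 (inclination theta) as a difference in mass."""
    return mass_kg + (f2_n - f1_n) / G

# A reaction force rising from 2.00 N to 2.49 N adds 0.05 kg of effective mass.
print(corrected_mass(0.10, 2.00, 2.49))
```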
- FIG. 51 is a flow chart showing the process of adjusting the tactile sense presentation signal according to the physical parameters of the operation unit to which the haptic control system 110 is attached.
- the haptic control system 110 obtains the correspondence between the physical parameters including the mass and size of the operation unit and the sensitivity parameters by the SD method or the like (ST121).
- the operation unit sensor 254 detects the operation unit worn by the user (ST122).
- the tactile sense presentation device 20 determines whether or not the detected operation unit is registered in the operation unit parameters 54 (ST123). The case where the operation unit sensor 254 cannot detect the ID is treated as the case where the detected operation unit is not registered in the operation unit parameters 54.
- If the determination in step ST123 is Yes, the conversion model 15 converts the physical parameters registered in the operation unit parameters 54 into sensory parameters (ST124). It should be noted that the conversion model 15 of this embodiment calculates sensory parameters from physical parameters as shown in FIG.
- If the determination in step ST123 is No, the user takes an image of the operation unit using the application on the mobile terminal 60 and transmits the size and mass to the tactile presentation device (ST125).
- the communication unit 256 receives the size and mass from the application of the mobile terminal 60 (ST126). Note that, as described above, the size and mass obtained by calibration by the calibration unit 55 may be employed.
- the conversion model 15 converts the estimated physical parameters (size, mass) into sensibility parameters (ST127).
- the arithmetic function unit 12 generates a tactile sense presentation signal using physical parameters such as size and mass (registered in the operation unit parameters 54 or estimated) (ST128).
- the arithmetic function unit 12 transmits the tactile sense presentation signal to the tactile sense presentation device 20.
- When the user operates the operation unit 201, the processor 18 generates an operation signal. If the operating section is a rotary operating section, the operating signal is, for example, a rotation angle. In the case of other operation units, the operation signal is the amount of operation of the operation unit.
- the tactile sense presentation section 30 controls the actuators by the tactile sense presentation signal corresponding to the operation signal (ST129).
- Note that the arithmetic function unit 12 may convert the sensory parameters obtained from the physical parameters in step ST127 back into physical parameters and use these to generate the tactile sense presentation signal.
- a dedicated conversion model 15 may be prepared for the second conversion.
- the haptic control system 110 can estimate physical parameters even when an unregistered operation unit is worn. Since the arithmetic function unit 12 as an adjustment unit generates a tactile sense presentation signal based on a physical parameter, it is possible to adjust the tactile sense presentation signal according to the attached operation unit.
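- Steps ST122 to ST128 of FIG. 51 amount to a simple branch: use the registered physical parameters when the detected operation unit is found in the operation unit parameters 54, otherwise fall back to estimation. The sketch below is illustrative; all function and parameter names are assumptions.

```python
def adjust_presentation(unit_id, registered_params, estimate_params,
                        to_sensory, to_signal):
    """Sketch of the flow of FIG. 51.

    unit_id:           ID detected by the operation unit sensor (None if undetected)
    registered_params: operation unit parameters 54 (ID -> physical parameters)
    estimate_params:   fallback estimator (mobile-terminal app or calibration)
    to_sensory:        conversion model 15 (physical -> sensory parameters)
    to_signal:         arithmetic function unit 12 (physical -> presentation signal)
    """
    if unit_id is not None and unit_id in registered_params:  # ST123: Yes
        physical = registered_params[unit_id]                 # ST124
    else:                                                     # ST123: No
        physical = estimate_params()                          # ST125-ST126
    sensory = to_sensory(physical)                            # ST124 / ST127
    signal = to_signal(physical)                              # ST128
    return sensory, signal

params54 = {"knob_a": {"size": 30.0, "mass": 0.1}}
sensory, signal = adjust_presentation(
    "knob_a", params54,
    estimate_params=lambda: {"size": 40.0, "mass": 0.2},
    to_sensory=lambda p: {"heaviness": p["mass"] * 10},
    to_signal=lambda p: {"gain": 1.0 / p["size"]},
)
print(sensory, signal)
```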
- The adjustment unit is not limited to adjusting the 'tactile presentation signal'; it may adjust the 'operation signal', the 'sensory presentation signal', the 'sensory presentation' itself, or any combination thereof. Specifically, there are the following cases: a case where the processor 18 (an example of the operation detection unit) functions as the adjustment unit and reflects the adjustment in the 'operation signal'; a case where the arithmetic function unit 12 (an example of the signal generation unit) functions as the adjustment unit and reflects the adjustment in the 'sensory presentation signal'; and a case where the tactile sense presentation unit 30 functions as the adjustment unit and reflects the adjustment in the 'sensory presentation'.
- Since the conversion model estimates the sensory parameter from the physical parameter, it is possible to build a correlation between the sensory parameter and the physical parameter that also reflects the physical parameters of the operation unit.
- This content can also be applied in line with 'Adjusting a sensory presentation signal'. That is, when the physical parameters of the operation unit are changed due to replacement of the operation unit or the like, if the actuator is driven in the same manner as before the operation unit 201 was replaced, the reproduced feeling, that is, the sensory parameters, will be different.
- If the sensory parameters to be realized are kept constant, the physical parameters of the actuator can be adjusted by adjusting the sensory presentation signal, so that the sensory presentation matches the set sensory parameters.
- FIG. 52 is a flowchart showing, as a modification of FIG. 51, processing for adjusting the tactile sense presentation signal according to the physical parameters of the operation unit to which the haptic control system 110 is attached.
- In the description of FIG. 52, differences from FIG. 51 are mainly described.
- If the determination in step ST123 is No (an example of the physical characteristics satisfying a predetermined condition), the arithmetic function unit 12 stops generating the sensation presentation signal (ST130).
- the arithmetic function unit 12 may generate a predetermined sensation presentation signal such as an initial value instead of stopping the generation of the sensation presentation signal.
- FIG. 53 shows the configuration of a haptic control system 111 as the second embodiment of the sensory control system 100 shown in FIG. 45, together with signal flow. Note that in the description of FIG. 53, differences from FIG. 45 are mainly described.
- the tactile presentation device 20 of the terminal device 80 has a torque sensor 251, an operation section sensor 254, and a communication section 256.
- the communication device 70 has an operation section parameter 54 , a calibration section 55 and a mass correction section 261 .
- the torque sensor 251, operation unit sensor 254, communication unit 256, operation unit parameter 54, calibration unit 55, and mass correction unit 261 may be the same as those described with reference to FIG.
- FIG. 54 is a sequence diagram in which the communication device 70 (server) and the terminal device 80 communicate with each other to estimate the sensitivity parameters of the attached operation unit.
- In step ST131, the haptic control system 111 obtains the correspondence between the physical parameters, including the mass and size of the operation unit, and the sensory parameters by the SD method or the like.
- In step ST132, when the user attaches the operation section, the operation section sensor 254 detects the operation section attached by the user.
- In step ST133, the terminal device 80 transmits the ID of the operation unit detected by the operation unit sensor 254 to the communication device 70.
- If the ID cannot be detected, the terminal device 80 notifies the communication device 70 that no ID was detected.
- In step ST134, the communication device 70 determines, based on the received ID of the operation unit, whether or not the attached operation unit is registered in the operation unit parameters 54.
- In step ST135, the conversion model 15 converts the physical parameters registered in the operation unit parameters 54 into sensory parameters.
- In step ST136, the communication device 70 transmits a message to the terminal device 80 indicating that the operation unit is not registered.
- In step ST137, the user takes an image of the operation unit using the application of the mobile terminal 60 and transmits the size and mass to the tactile presentation device 20.
- The communication unit 256 receives the size and mass from the application of the mobile terminal 60.
- In step ST139, the mobile terminal 60 transmits the size and mass to the communication device 70.
- In step ST140, the conversion model 15 converts the estimated physical parameters (size, mass) into sensory parameters.
- In step ST141, the arithmetic function unit 12 generates a tactile sense presentation signal using the physical parameters such as size and mass (registered in the operation unit parameters 54 or estimated).
- In step ST142, the communication device 70 transmits the tactile sense presentation signal to the terminal device 80.
- the tactile sense presentation unit 30 controls the actuator with a tactile sense presentation signal corresponding to the operation signal by the user's operation. At least one of the operation signal, the sensation presentation signal, and the sensation presentation may be adjusted by either the communication device 70 or the terminal device 80 .
- The presentation is adjusted according to the physical parameters of the operation unit, such as its size and mass.
- the feel can be controlled to a desired feel for the user.
- the operation unit of aspect 3 is not limited to being detachable.
- the difference can be recognized and an appropriate feel can be generated.
- Instead of directly obtaining the size and mass via the application of the mobile terminal 60 or by calibration, the operation unit sensor 254 may estimate the size and mass of the attached operation unit by comparison with a reference operation unit. For example, when an operation unit whose ID is registered in the operation unit parameters 54 and an operation unit whose ID is not registered are arranged close to each other, the two operation units appear in the same image data.
- The processor 18 obtains the ratio between the size of the operating portion whose ID is registered and the size of the operating portion whose ID is not registered, and multiplies the registered operating portion's size and mass by this ratio to estimate the size and mass of the unregistered operating portion.
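- The reference-based estimation above can be sketched as follows; the pixel measurements are hypothetical. Note that the text multiplies both size and mass by the same apparent-size ratio, so this sketch does the same (a cubed ratio for mass would additionally assume geometric similarity, which the text does not state).

```python
def estimate_from_reference(ref_size_px: float, target_size_px: float,
                            ref_size_mm: float, ref_mass_g: float):
    """Estimate an unregistered operation unit's size and mass from a
    registered reference unit appearing in the same image data."""
    ratio = target_size_px / ref_size_px
    return ref_size_mm * ratio, ref_mass_g * ratio

# The unregistered knob appears 1.5x as large as the registered 30 mm, 100 g one.
print(estimate_from_reference(200.0, 300.0, 30.0, 100.0))  # (45.0, 150.0)
```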
- the processor 18 is an example of an operation detection section
- the arithmetic function section 12 is an example of a signal generation section
- the tactile presentation section 30 is an example of a sensation presentation section.
- [Additional remarks of aspect 3] [Claim 1] A sensory control device comprising: an operation unit; an operation detection unit that detects an operation of the operation unit and generates an operation signal; a signal generation unit that generates a sensation presentation signal based on the operation signal; a sensation presentation unit that presents a sensation to an operator based on the sensation presentation signal; and an adjustment unit that adjusts at least one of the operation signal, the sensation presentation signal, and the sensation presentation based on physical characteristics of the operation unit.
- [Claim 2] The sensory control device according to claim 1, wherein the physical characteristics of the operation portion include at least one physical parameter of mass, diameter, radius, or length of at least a portion of the operation portion.
- [Claim 3] The sensory control device according to claim 1 or 2, comprising an operation unit sensor that detects the attached operation unit, wherein the operation unit sensor identifies the physical characteristics of the operation unit by acquiring identification information possessed by the operation unit, or specifies the physical characteristics of the operation unit from image data obtained by photographing the operation unit.
- The sensory control device according to claim 1, wherein the signal generation unit stops generating the sensation presentation signal when the physical characteristics of the operation unit satisfy a predetermined condition.
- the operation unit is a pressing type operation unit that receives a pressing operation.
- the operation section is a slide operation section that receives a slide operation.
- [Claim 13] The sensory control device according to claim 1, comprising: an acceleration sensor that detects the tilt of the operation unit; and a mass correction unit that corrects the mass of the operation unit according to the tilt detected by the acceleration sensor.
- [Claim 14] A sensory control method performed by a device having an operation unit, the method comprising: detecting an operation of the operation unit and generating an operation signal; generating a sensation presentation signal based on the operation signal; presenting a sensation to an operator based on the sensation presentation signal; and adjusting at least one of the operation signal, the sensation presentation signal, and the sensation presentation based on physical characteristics of the operation unit.
- [Claim 15] A sensory control system comprising a communication device and a terminal
- The sensory presentation includes tactile presentation, auditory presentation by sound, visual presentation by image display, and the like.
- Sensory presentation is adjusted by adjusting the signals that drive the various operating tools.
- Patent Document 4 discloses a technique for parameterizing the response to shear vibrations generated by a fingertip during key presses by applying it to a mass-spring-damper system approximation of the fingertip.
- the conventional technique does not assume the deformation of an elastic body such as a finger in the operation direction, such as a buckling phenomenon in response to a pressing operation, so there is a problem that the expressive range of sensation presentation is narrowed.
- the finger includes elastic bodies such as skin and flesh, but the buckling phenomenon caused by the elastic bodies is not reflected in the sensation presentation.
- the purpose of this aspect is to provide a technique that further expands the range of expressiveness of sense presentation.
- a dynamic property is a physical property that includes a time factor, for example, a physical property that changes with time.
- In the conventional technique, the load-displacement curve of an operating tool such as a switch is measured by pressing it with a rigid body, and is therefore based on static characteristics that do not include time factors. Consequently, correspondence information between the sensory parameters and the physical parameters cannot be obtained in a state in which the buckling phenomenon that occurs when the user actually presses with a finger is reproduced.
- An elastic body (corresponding to the flesh of the finger) integrated with a rigid body (equivalent to the bone of the finger) is placed between the rigid body and the operation tool.
- the operating tool is pressed by a finger model pressing tool provided with a finger model corresponding to the skin.
- Human fingers are taken into account by analyzing the position change [mm] of the operating tool and the two force sensor values [N] between the elastic body and the operating tool when the finger model pressing tool presses the operating tool; measurement evaluation was performed by the SD method with this configuration. Since the new physical parameters thus obtained include dynamic characteristics, correspondence information between sensory parameters and physical parameters is generated in a state in which the buckling phenomenon that occurs when the user actually presses with a finger is reproduced.
- FIG. 55 is a diagram for explaining static characteristics obtained by a rigid pressing tool and dynamic characteristics obtained by a finger model pressing tool 252 in which a rigid body and an elastic body are integrated.
- The load-displacement curve of the operating tool 250 obtained with the rigid-body pressing tool 253 can express only static characteristics that do not include the time factor. Since the load-displacement curve 75 does not include the effect of the elastic body corresponding to the flesh portion 257 of the finger, it does not fully express the physical properties that contribute to the tactile sensation perceived by the operator.
- the flesh portion 257 of the finger is an elastic body that is deformed by stress.
- a bone 255 that can be regarded as a rigid body is also present inside the finger.
- The finger model pressing tool 252 is designed to have the characteristics of the flesh portion 257 and the bone 255.
- FIG. 55 shows a position change and two force sensor values A and B as dynamic characteristics 270 .
- the two force sensor values A and B detect the operation reaction force generated by the finger model pressing tool 252 with respect to the operating tool 250, respectively.
- The two force sensor values A and B are measured by different force sensors, which are arranged, respectively, at the place where the flesh portion 257 of the finger contacts the operation tool 250 and at the rigid portion (corresponding to the bone 255) inside the finger. Details will be described with reference to FIG.
- Pressing the operation tool 250 with the finger model pressing tool 252 makes it possible to capture the movement of the finger in consideration of time, that is, the occurrence and change of sensation, and to obtain a correlation close to the situation in which a user actually operates the tool.
- FIG. 56 is a diagram explaining the relative positions of the finger and the operation tool 250 when the finger is deformed.
- the upper part of FIG. 56 shows periods A to C that can be read from the load-displacement curve 75 .
- the lower part of FIG. 56 schematically shows the deformation of the flesh of the finger corresponding to periods A to C.
- In period B, deformation (buckling) of the metal contact 57 of the operation tool 250 occurs, and the repulsive force disappears.
- the button portion 56 drops downward while maintaining the downward force.
- The operation reaction force decreases compared to period A; therefore, the operation reaction force at the contact portion between the finger and the button is reduced.
- FIG. 57 is a diagram explaining the finger model presser 252.
- The finger is an elastic body in which the flesh portion 257 deforms.
- A bone 255 that can be regarded as a rigid body is also present inside the finger. Therefore, a finger model pressing tool 252 having an elastic body 59 that contacts the button portion 56 and a rigid body 58 that presses the button portion 56 via the elastic body 59 is an appropriate model of a finger pressing the operation tool 250.
- FIG. 58 is a diagram illustrating generation of a sensation presentation signal with a click feeling.
- a click feeling refers to a response at the time of input in an input device such as a button, or a feeling of pressing a switch. In the case of a mechanical switch, the click feeling is obtained by resistance or deformation of the metal contact 57 or the like. However, how the click feeling is generated varies depending on the button structure.
- the click feeling is controlled by the current supplied to the actuator.
- FIG. 58(a) shows the current value of the actuator with respect to time
- FIG. 58(b) shows the operation reaction force with respect to time. Since the current value abruptly decreases in the frame 283, the operation reaction force also abruptly decreases.
- a convex portion 284 in FIG. 58(b) corresponds to the time when the current value suddenly decreased. Therefore, when the user presses the operation tool 250 with his or her finger, a response (click feeling) similar to that of pressing a mechanical switch can be obtained.
- the timing at which the current value suddenly decreases and the amount of current decrease shown in FIG. 58(a) are merely examples, and are adjusted as appropriate.
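- The drive-current pattern of FIG. 58(a) can be sketched as a piecewise function of time: the current is held, then abruptly reduced, which makes the operation reaction force fall sharply and produces the click feeling. The timings and current values below are illustrative, as the text itself notes that they are adjusted as appropriate.

```python
def current_profile(t_ms: int, drop_start_ms: int = 40, drop_ms: int = 5,
                    base_ma: float = 300.0, dropped_ma: float = 80.0) -> float:
    """Actuator current as a function of time: base level, then a sharp
    drop inside a short window (frame 283), then the reduced level."""
    if t_ms < drop_start_ms:
        return base_ma
    if t_ms < drop_start_ms + drop_ms:
        # linear ramp down inside the short drop window
        frac = (t_ms - drop_start_ms) / drop_ms
        return base_ma + (dropped_ma - base_ma) * frac
    return dropped_ma

profile = [current_profile(t) for t in range(0, 60, 10)]
print(profile)  # [300.0, 300.0, 300.0, 300.0, 300.0, 80.0]
```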
- FIG. 59(a) shows a functional configuration diagram of a pressing type operating tool
- FIG. 59(b) shows a block diagram of the pressing type operating tool
- a button portion 271 in FIG. 59 is an example of the operation device 33 in FIG. 2
- a VCM 263 is an example of the tactile sense presentation section 30 in FIG.
- two force sensors A and B are arranged in the finger model presser 252 .
- the force sensor A is arranged at a position where the elastic body 59 of the finger model pressing tool 252 and the button portion 271 contact each other, and the force sensor B is arranged inside the rigid body 58 of the finger model pressing tool 252 . By doing so, the force sensor value A detected by the force sensor A can be used to monitor the buckling phenomenon.
- Since FIG. 59(b) is only an example of a pressing type operating tool, it will be explained only briefly.
- the MCU circuit 262 is an example of the processor 18 in FIG. 2, and the position sensor 264 is an example of the position sensor 27 in FIG.
- the MCU circuit 262 outputs a current to a VCM (Voice Coil Motor) 263 according to the amount of operation (change in position) when the button portion 271 of the operation tool 250 is pressed.
- VCM 263 applies an artificial reaction force proportional to the current to button portion 271 . Since the finger model presser 252 presses the button portion 271 from the side opposite to the VCM 263 , an artificial reaction force is transmitted to the finger model presser 252 .
- the artificial reaction force is measured by force sensors A and B.
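- The control loop of FIG. 59(b) can be sketched as follows: the MCU circuit 262 chooses a drive current from the button position, and the VCM 263 turns that current into a proportional reaction force. The force constant, the position threshold, and the current levels are assumptions of this sketch.

```python
FORCE_PER_AMP = 5.0  # N/A, an assumed VCM force constant

def mcu_step(position_mm: float, threshold_mm: float = 1.0,
             base_current_a: float = 0.4, click_current_a: float = 0.1) -> float:
    """One control step of the MCU circuit 262: output a current according
    to how far the button portion 271 is pressed; past the threshold the
    current is cut to reproduce the sudden force drop of a click."""
    return base_current_a if position_mm < threshold_mm else click_current_a

def reaction_force(current_a: float) -> float:
    """The VCM 263 applies an artificial reaction force proportional to current."""
    return FORCE_PER_AMP * current_a

forces = [reaction_force(mcu_step(p)) for p in (0.2, 0.6, 1.2)]
print(forces)  # the force drops once the press passes the threshold
```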
- FIG. 60 is a diagram for explaining dynamic characteristics when the finger model pressing tool 252 presses the operating tool 250 .
- FIG. 60(a) is a load displacement curve 75 shown for reference
- FIG. 60(b) is an example of a dynamic characteristic 270 when the manipulator 250 is pressed by the finger model pressing tool 252.
- FIG. 60(b) the horizontal axis is time
- the vertical axis represents the two force sensor values A and B and the position change 211. The unit of time is [msec], and the unit of the force sensor values A and B is [N].
- the dynamic characteristics 270 greatly differ depending on the operation tool 250, and FIG. 60(b) is only an example.
- FIG. 61 is a diagram for explaining the temporal transition of the relative positions of the finger model pressing tool 252 and the operating tool 250.
- the buckling period T1 is the period from the peak of the force sensor value B to the peak of the position change 211 .
- the force sensor value B is not constant and has a peak at the starting point of the buckling period T1. This peak will be explained in FIG.
- the peak of the force sensor value B corresponds to the maximum value of the operation reaction force on the load displacement curve 75 . Therefore, the buckling period T1 is the period from when the operation reaction force reaches its maximum value to when the positional change 211 reaches its maximum value.
- FIG. 61(a) shows the relative positions of the finger model presser 252 and the button portion 56 at the beginning of the buckling period T1.
- the fingertip drop period T2 is the period from the peak of the force sensor value B to the downward peak of the force sensor value A.
- the operation reaction force sharply decreases in order to produce a click feeling.
- the operation reaction force applied to the finger model pressing tool 252 is reduced, and the elastic body 59 of the finger model pressing tool 252 begins to restore after the starting point of the fingertip drop period T2.
- the force sensor value A decreases during the fingertip drop period T2. Therefore, the fingertip drop period T2 is the period from when the maximum value of the operation reaction force is obtained until the elastic body of the finger model pressing tool 252 is restored to its maximum.
- FIG. 61(b) shows the relative positions of the finger model presser 252 and the button portion 56 at the end point of the fingertip drop period T2. A comparison with FIG. 61(a) reveals that the elastic body 59 of the finger model pressing tool 252 is restored.
- the fingertip collision period T3 is the period from the downward peak of the force sensor value A to the upward peak of the force sensor value A.
- the finger model pressing tool 252 continues to be pressed after the elastic body 59 of the finger model pressing tool 252 is restored to the maximum in FIG. 61(b), so the force sensor value A sharply increases. Therefore, the fingertip collision period T3 is the period from when the elastic body 59 of the finger model pressing tool 252 is restored to the maximum to when the elastic body 59 is pushed in the most.
- FIG. 61(c) shows the relative positions of the finger model presser 252 and the button portion 56 at the end point of the fingertip collision period T3. A comparison with FIG. 61(b) reveals that the elastic body 59 of the finger model pressing tool 252 is pushed.
- The fingertip vibration period T4 is the period from the upward peak of the force sensor value A until the fluctuation of the force sensor value A falls within a certain range. Since the operation reaction force has already been reduced to produce a click feeling, the force sensor value A rapidly decreases even if the positional change 211 continues to increase due to pressing. After that, since the positional change 211 no longer increases (the finger model pressing tool 252 also stops moving), the force sensor value B also becomes less likely to change, and the force sensor value A oscillates like chattering. Therefore, the fingertip vibration period T4 is the period until the most pushed-in elastic body is restored and stabilized.
- FIG. 61(d) shows the relative positions of the finger and the button portion 56 at the end point of the fingertip vibration period T4. A comparison with FIG. 61(c) reveals that the elastic body 59 of the finger model presser 252 is restored.
- the buckling period T1, the fingertip drop period T2, the fingertip collision period T3, and the fingertip vibration period T4 described above are examples of dynamic characteristics. Further, changes in the force sensor values A and B and the position change 211 can be extracted in each period of the buckling period T1, the fingertip drop period T2, the fingertip collision period T3, and the fingertip vibration period T4. These can also be used as dynamic characteristics in this embodiment.
- the dynamic characteristics may be physical characteristics including temporal changes in at least one of the operation reaction force and the amount of operation associated with the operation of the predetermined operation tool 250 .
- This physical property is a physical property that realizes sensation presentation when the elastic body 59 of the finger model pressing tool 252 is brought into contact with the operating tool 250 and operated.
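- The buckling period T1 defined above (from the peak of force sensor value B to the peak of the position change 211) can be extracted from sampled traces by simple peak detection. The traces below are synthetic and purely illustrative.

```python
def find_peak(samples):
    """Index of the maximum value in a list of sensor samples."""
    return max(range(len(samples)), key=samples.__getitem__)

def buckling_period_ms(force_b, position, dt_ms: float = 1.0) -> float:
    """Buckling period T1: time from the peak of force sensor value B to
    the peak of the position change 211, for samples taken every dt_ms."""
    return (find_peak(position) - find_peak(force_b)) * dt_ms

# Synthetic traces: force B peaks at sample 3, position peaks at sample 8.
force_b  = [0.1, 0.5, 1.2, 2.0, 1.1, 0.6, 0.5, 0.5, 0.5, 0.5]
position = [0.0, 0.2, 0.5, 1.0, 1.8, 2.6, 3.2, 3.6, 3.8, 3.7]
print(buckling_period_ms(force_b, position))  # 5.0
```

The other periods (T2 to T4) could be extracted the same way from the local extrema of force sensor value A.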
- FIG. 62 is a diagram explaining the dynamic characteristics in more detail together with the periods A to C described above.
- The upper left part of FIG. 62 is an overall diagram from the start to the end of the dynamic characteristics, and the lower right part of FIG. 62 shows the correspondence between the dynamic characteristics and periods A to C.
- the force sensor value A detected by the force sensor A greatly changes due to the pushing and restoring of the elastic body 59 .
- the force sensor value B detected by the force sensor B is less affected by the deformation of the elastic body 59, so the change is small.
- the lower right part of FIG. 62 also shows the buckling period T1, the fingertip drop period T2, the fingertip collision period T3, and the fingertip vibration period T4, which are as described above.
- the peak of the force sensor value B (the starting point of the buckling period T1 and the fingertip drop period T2), which was not clear in FIG. 60, is clear here.
- the haptic control system 1 performs an evaluation by the SD method in order to identify appropriate dynamic characteristics correlated with the sensitivity parameters. For this purpose, a plurality of operating tools 250 with different dynamic characteristics are prepared.
- FIG. 63 shows the dynamic characteristics when a plurality of operating tools 250 having different dynamic characteristics are pressed by the finger model pressing tool 252.
- For this explanation, 25 operating tools 250 were prepared, and the dynamic characteristics of each of the 25 operating tools 250 were measured.
- FIG. 63 shows the dynamic characteristics of four of these operating tools 250.
- the upper diagram shows the dynamic characteristics 270 over the entire pressing period (about 1 second), and the lower diagram shows the buckling period T1, the fingertip drop period T2, and the fingertip collision period T3.
- FIG. 64 is a flow chart for explaining the flow of determination of physical parameters correlated with sensitivity parameters.
- In step ST151, the haptic control system 1 measures the dynamic characteristics when the finger model pressing tool 252 presses each of the 25 operating tools 250.
- In step ST152, the input unit 4 receives, by the SD method, the expression score for each sensitivity parameter for the 25 operating tools 250.
- In step ST153, the processor 101 acquires, for each sensitivity parameter, the pairs of dynamic characteristics and expression scores of each operating tool 250.
- In step ST154, the processor 101 obtains, for each sensitivity parameter, the correlation coefficient between the dynamic characteristics and the expression scores.
- In step ST155, the processor 101 determines the dynamic characteristics whose correlation coefficients have large absolute values.
- A large absolute value of the correlation coefficient may be, for example, 0.5 or more.
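Steps ST153 to ST155 amount to computing a correlation coefficient per (dynamic characteristic, sensitivity parameter) pair across the operating tools and keeping those with an absolute value of 0.5 or more. A minimal sketch; the data values and names below are illustrative assumptions, not measurements from the patent.

```python
# Sketch of ST153-ST155: Pearson correlation between one dynamic
# characteristic and the SD-method expression scores, then thresholding.

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# one entry per operating tool (hypothetical values):
buckling_T1 = [10, 14, 18, 22, 30]        # dynamic characteristic
recovery = [1.0, 2.2, 2.8, 4.1, 4.9]      # "sense of recovery" scores

r = pearson(buckling_T1, recovery)
selected = abs(r) >= 0.5  # adopt as a physical parameter if True
```

Characteristics passing the threshold are the ones carried into the regression of step ST156.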
- In step ST156, the processor 101 creates the conversion model 15 by applying the multiple regression analysis described in Equation 5 to the physical parameters that are highly correlated with the sensitivity parameters, together with the sensitivity parameters.
- FIG. 65 is a scatter diagram of the pairs of a dynamic characteristic and the expression scores of each operating tool 250 for a certain sensitivity parameter, acquired by the processor 101 in step ST153. The horizontal axis indicates the expression score of "has (no) sense of recovery" as the sensitivity parameter, and the vertical axis indicates the buckling period T1. The buckling period T1 and the expression score of "has (no) sense of recovery" generally trend upward to the right, and the correlation coefficient is 0.82.
- FIG. 66 is a scatter diagram of the pairs of a dynamic characteristic and the expression scores of each operating tool 250 for a given sensitivity parameter, acquired by the processor 101 in step ST153. The horizontal axis indicates the expression score of "has (no) feeling of being sucked in" as the sensitivity parameter, and the vertical axis indicates the position change during the fingertip collision period T3. The position change during the fingertip collision period T3 and the expression score of "has (no) feeling of being sucked in" generally trend downward to the right, and the correlation coefficient is 0.65.
- FIG. 67 is a scatter diagram of the pairs of a dynamic characteristic and the expression scores of each operating tool 250 for a certain sensitivity parameter, acquired by the processor 101 in step ST153. The horizontal axis indicates the expression score of "has (no) sense of recovery" as the sensitivity parameter, and the vertical axis indicates the change in the operation reaction force (force sensor value A) during the fingertip vibration period T4. The change in the operation reaction force during the fingertip vibration period T4 and the expression score of "has (no) sense of recovery" generally trend upward to the right, and the correlation coefficient is 0.78.
- The processor 101 associates the sensitivity parameters and dynamic characteristics shown in FIGS. 65, 66, and 67 by the method of least squares (an example of regression analysis) or the like. With the method of least squares, the strength of the correlation between a sensitivity parameter and a dynamic characteristic is estimated from the correlation coefficient.
- FIG. 68 shows a list of the correlation coefficients between each sensitivity parameter and each dynamic characteristic. The row headings are the sensitivity parameters, and the column headings are the dynamic characteristics of the operating tool 250. Correlation coefficients of 0.5 or higher are highlighted with hatching; it can be seen that the dynamic characteristics with large correlation coefficients are suitable as physical parameters.
- The processor 101 can create the conversion model 15 by applying the multiple regression analysis described in Equation 5 to the physical parameters that are highly correlated with the sensitivity parameters, together with the sensitivity parameters. As the physical parameters P1 to Pn used in Equation 5, the physical parameters with large correlation coefficients determined in step ST154 are adopted. The multiple regression analysis was described with Equation 5 of Embodiment 1 and FIGS. 22 and 23. Accordingly, the coefficients of determination B11 to Bmn can be determined for each operating tool 250, and a conversion model 15 as shown in FIG. 23 is obtained for each operating tool 250.
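The coefficient-determination step can be sketched as an ordinary least-squares fit: given one row of selected physical parameters per operating tool and the corresponding expression scores, solve for the regression coefficients. This is a generic illustration of multiple regression, not the patent's Equation 5 itself; the design-matrix values and the single-score target are assumptions.

```python
import numpy as np

# Sketch: determine regression coefficients B relating selected
# physical parameters to one sensitivity-parameter score by least
# squares. Rows = operating tools; first column = intercept term.
P = np.array([[1.0, 10.0, 2.0],
              [1.0, 14.0, 3.0],
              [1.0, 18.0, 5.0],
              [1.0, 22.0, 4.0],
              [1.0, 30.0, 6.0]])
score = np.array([1.0, 2.2, 2.8, 4.1, 4.9])  # SD-method scores

B, *_ = np.linalg.lstsq(P, score, rcond=None)
predicted = P @ B  # the fitted model's output for each tool
```

In the full method this fit is repeated per sensitivity parameter, yielding the coefficient matrix B11 to Bmn of the conversion model.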
- A haptic control system 2 having a communication device 70 (server) and a terminal device 80 will be described with reference to FIG. 69. Note that the block diagram of the haptic control system 2 may be the same as FIG. 20.
- FIG. 69 is a sequence diagram in which the communication device 70 (server) and the terminal device 80 communicate to estimate the sensitivity parameters of the attached operating tool 250.
- In step ST161, the communication device 70 and the terminal device 80 communicate with each other, and the finger model pressing tool 252 presses the 25 operating tools 250 to measure the dynamic characteristics of each operating tool 250.
- In step ST162, the input unit 4 receives, by the SD method, the expression score for each sensitivity parameter for the 25 operating tools 250.
- In step ST163, the terminal device 80 transmits the expression scores to the communication device 70.
- In step ST164, the processor 14 acquires, for each sensitivity parameter, the pairs of dynamic characteristics and expression scores of each operating tool 250.
- In step ST165, the processor 14 obtains, for each sensitivity parameter, the correlation coefficient between the dynamic characteristics and the expression scores.
- In step ST166, the processor 14 determines the dynamic characteristics whose correlation coefficients have large absolute values.
- A large absolute value of the correlation coefficient may be, for example, 0.5 or more.
- In step ST167, the processor 14 creates the conversion model 15 by applying the multiple regression analysis described in Equation 5 to the physical parameters that are highly correlated with the sensitivity parameters, together with the sensitivity parameters.
- The haptic control system 1 of this aspect can extract the dynamic characteristics correlated with the sensitivity parameters by pressing the operating tool 250 with the finger model pressing tool 252. Accordingly, since a conversion model that converts a sensitivity parameter into these dynamic characteristics can be created, a sensory presentation signal with preferable dynamic characteristics can be generated.
- Although a press-type operating tool has been described, the same applies to a rotary operating tool that accepts a rotating operation. For a rotary operating tool, the rotation angle is the position change, and the resistance to rotation is the operation reaction force.
- Although the finger model pressing tool 252 has only one type of elastic body 59, the finger model pressing tool 252 may have a plurality of types of elastic bodies with different elastic forces on the side that contacts the button portion 56.
- The plurality of types of elastic bodies with different elastic forces are, for example, an elastic body corresponding to skin, an elastic body corresponding to flesh, and the like.
- The plurality of types of elastic bodies with different elastic forces may be arranged in layers so that the elastic force increases closer to the rigid body 58. In this way, a finger model pressing tool 252 can be constructed that exhibits dynamic characteristics closer to a human touch.
- The shape of the finger model pressing tool 252 may be a simple cube, or may imitate the shape of a finger.
- As the finger shape, the fingers of men, women, adults, children, and various races are assumed, and the sizes and shapes of the fingers may differ.
- [Appendix 4]
[Claim 1] A sensory control method comprising: a reception step of receiving an input of a sensitivity parameter indicating the degree of sensory expression when an operating tool is operated; a conversion step of converting the received sensitivity parameter into a physical parameter correlated with the sensitivity parameter among a plurality of types of physical parameters included in physical characteristics related to sensory stimulation; and an output step of outputting a sensory stimulus signal based on the converted physical parameter, wherein the physical characteristics include dynamic characteristics.
[Claim 2] The sensory control method according to claim 1, wherein the dynamic characteristic is a physical characteristic including a temporal change in at least one of an operation reaction force and an operation amount accompanying an operation of a predetermined operating tool.
[Claim 4] The sensory control method according to claim 1, wherein the physical parameter is the buckling period.
[Claim 5] The sensory control method according to claim 1, wherein the physical parameter is the fingertip drop period.
[Claim 6] The sensory control method according to claim 1, wherein the physical parameter is the fingertip collision period.
[Claim 7] The sensory control method according to claim 1, wherein the physical parameter is the fingertip vibration period.
[Claim 8] The sensory control method according to claim 1, wherein the physical parameter has a correlation with the sensitivity parameter.
- [Claim 11] A device comprising: an input unit that receives an input of a sensitivity parameter indicating the degree of sensory expression when an operating tool is operated; a conversion model that converts the sensitivity parameter received by the input unit into a physical parameter correlated with the sensitivity parameter among a plurality of types of physical parameters included in physical characteristics related to sensory stimulation; and a sensory presentation unit that outputs a sensory stimulus signal based on the physical parameter converted by the conversion model, wherein the physical characteristics include dynamic characteristics.
- [Claim 12] A sensory control system comprising a communication device and a terminal device capable of communicating with each other, wherein the terminal device has an input unit that receives an input of a sensitivity parameter indicating the degree of sensory expression when an operating tool is operated, the communication device has a conversion model that converts the sensitivity parameter transmitted from the terminal device into a physical parameter correlated with the sensitivity parameter among a plurality of types of physical parameters included in physical characteristics related to sensory stimulation, the terminal device has a sensory presentation unit that outputs a sensory stimulus signal based on the physical parameter converted by the conversion model, and the physical characteristics include dynamic characteristics.
- [Claim 13] A program that causes a device to function as: an input unit that receives an input of a sensitivity parameter indicating the degree of sensory expression when an operating tool is operated; a conversion model that converts the sensitivity parameter received by the input unit into a physical parameter correlated with the sensitivity parameter among a plurality of types of physical parameters included in physical characteristics related to sensory stimulation; and a sensory presentation unit that outputs a sensory stimulus signal based on the physical parameter converted by the conversion model, wherein the physical characteristics include dynamic characteristics.
Abstract
Description
FIG. 1 shows the basic configuration of the sensory control system 100 according to aspect 1 of the present disclosure. The sensory control system 100 shown in FIG. 1 has a kansei database 16, a storage unit 11, an input unit 4, a processor 101, and a sensory presentation unit 102. The storage unit 11 stores a sensitivity parameter-physical parameter conversion model (hereinafter simply referred to as the "conversion model 15"). The sensory presentation unit 102 is a component that presents sensations to a person, and can be configured as, for example, a tactile presentation unit that presents a tactile sensation (e.g., the tactile presentation unit 30 described later), an auditory presentation unit such as a speaker that presents sound, a visual presentation unit such as a display device that presents images, or any combination of these.
FIG. 2 shows the configuration of the haptic control system 1 as a first embodiment of the sensory control system 100 shown in FIG. 1, together with the flow of signals.
An example of the tactile presentation unit 30 included in the haptic control system 1 according to this aspect will be described with reference to FIGS. 3 to 5. The tactile presentation unit 30 illustrated in FIGS. 3 to 5 reproduces the tactile sensation of operating a press-type operating tool; the modeled press-type operating tool is one, such as a TACT Switch (registered trademark), in which a dish-shaped or dome-shaped leaf spring generates the operation reaction force. Based on a tactile presentation signal given from the main control device 10, the tactile presentation unit 30 reproduces the tactile sensation corresponding to the desired sensitivity parameter. By incorporating the tactile presentation unit 30 into the electronic circuits of various devices, it can be used in place of an actual press-type operating tool as a press-type operating tool that realizes the tactile sensation (here, the operation feel) corresponding to the desired sensitivity parameter. It is also possible to reproduce the operation reaction force with the tactile presentation device 20, evaluate the relationship between the sensitivity parameter expressing the operation feel and the physical parameters included in the physical characteristics that drive the tactile presentation device 20, and use that evaluation as a guideline when designing a press-type operating tool.
FIG. 6 shows an example of the generation processing (conversion model generation method) of the conversion model 15 stored by the haptic control system 1 of FIG. 2. The conversion model generation method is executed by a conversion model generation system that includes at least an input unit, a storage unit, and a processor. "ST" in FIG. 6 indicates a processing step.
In the following, an example in which the tactile presentation device 20 shown in FIG. 2 performs tactile presentation imitating the operation feel of a predetermined operating tool will be described. The sensitivity parameter of the conversion model 15 in this example is the expression score of an adjective expressing the feel of operating a press-type operating tool as the predetermined operating tool. The physical parameter of the conversion model 15 in this example is included in the physical characteristics that realize the sensory presentation when the press-type operating tool as the predetermined operating tool is operated. When the haptic control system 1 receives an input of a specific sensitivity parameter via the input unit 4, it converts the received sensitivity parameter into a physical parameter using the conversion model 15. The sensitivity parameter assumed for a press-type operating tool is the degree of a sensory expression, such as an adjective or onomatopoeia, expressing the operation feel when a person presses the press-type operating tool. The physical characteristics realized by the physical parameters are, for example, the displacement accompanying the operation (e.g., the stroke amount), the operation reaction force (load), the velocity, acceleration, and jerk of the movable part 21, the elastic characteristics of a body part such as the operator's finger, or quantities derived from these physical characteristics. A physical parameter in this specification is defined as including one or more variables of a physical characteristic.
FIG. 8 shows a flowchart of an example of the control operation of the tactile presentation device 20. The processing shown in the flowchart is executed under the control of the processor 18 included in the tactile presentation device 20. In ST11 of FIG. 8, a tactile presentation signal is given from the arithmetic function unit 13 to the processor 18 of the tactile presentation device 20, and in ST12, control based on the load-displacement curve selected according to the physical parameters is started. In ST13, when the operation device 33 is operated, detection signals concerning the movable part 21 are obtained from the position sensor 27 and the acceleration sensor 28. The processor 18 calculates the difference between the operation profile of the load-displacement curve, set in accordance with the expression score that is the sensitivity parameter, and the detected position of the movable part 21. In ST14, the current I supplied to the coil 25 of the tactile presentation unit 30 is optimized, and a tactile sensation is presented so as to reproduce the expression score of the sensitivity parameter desired by the user.
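The ST13-ST14 loop, comparing the detected state with the load-displacement profile and correcting the coil current, can be sketched as a simple proportional update. The lookup function, the gain, and the profile values below are illustrative assumptions, not the patent's actual control law.

```python
# Sketch of one ST13-ST14 control step: look up the target reaction
# force for the detected stroke on the load-displacement profile and
# correct the coil current in proportion to the error.

def target_force(profile, stroke):
    """Piecewise-linear lookup of force [N] at a given stroke [mm]."""
    for (s0, f0), (s1, f1) in zip(profile, profile[1:]):
        if s0 <= stroke <= s1:
            return f0 + (f1 - f0) * (stroke - s0) / (s1 - s0)
    return profile[-1][1]  # clamp beyond the last point

def update_current(i_now, measured_force, stroke, profile, gain=0.1):
    error = target_force(profile, stroke) - measured_force
    return i_now + gain * error

profile = [(0.0, 0.0), (1.0, 2.0), (2.0, 1.0)]  # (stroke, force) pairs
i_next = update_current(0.5, 1.0, 1.5, profile)
```

In the device, this update would run each control cycle using the position sensor 27 reading as `stroke`.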
A modification of the tactile presentation device 20 included in the haptic control system 1 will be described with reference to FIGS. 17 to 19. The tactile presentation device 40 illustrated in FIG. 19 reproduces the tactile sensation of a rotary operating tool. The rotary operating tool is, for example, a rotary switch.
A first modification of the sensory control method executed by the sensory control system 100 of the present disclosure further includes an acquisition step of acquiring a sensory stimulus signal and a designation step of designating a sensitivity parameter based on the acquired sensory stimulus signal. The above-described reception step of receiving an input of a sensitivity parameter is not limited to input from a user or the like, but is a step of receiving the sensitivity parameter designated in the designation step. Thereby, the sensory control system 100 according to the first modification can designate a sensitivity parameter based on the acquired sensory stimulus signal and output a sensory presentation signal based on a physical parameter correlated with the designated sensitivity parameter.
The operation device 33 of the present disclosure may have an operation surface that receives a slide operation. A slide operation is an operation in which the contact position is moved while a body part such as the user's finger remains in contact with the operation surface of the operation device 33. In this case, the tactile presentation unit 30 of the present disclosure generates an operation reaction force by vibrating the operation surface of the operation device 33. As a method of vibrating the operation surface of the operation device 33, for example, vibrating a weight with an actuator or the like can be used. The sensory presentation step in a second modification of the sensory control method of the present disclosure can be a step of presenting a tactile sensation using such an operation device 33 and tactile presentation unit 30, by generating an operation reaction force from the tactile presentation unit 30 in response to a slide operation on the operation device 33. In detail, in the sensory presentation step, when a slide operation is performed on the operation surface of the operation device 33, that slide operation is detected by the operation device 33, and an operation reaction force is generated from the tactile presentation unit 30 in response to the detected slide operation.
As described above, the kansei database 16 of the present disclosure stores, for each of one or more types of sensory presentation, correspondence information in which physical characteristics related to a given sensory presentation are associated with sensitivity parameters indicating the degree of sensory expression for that sensory presentation. Although tactile presentation has mainly been described as the sensory presentation, the "tactile sensation" mainly referred to in this specification is the tactile sense in a broad sense, a concept that includes the tactile sense in a narrow sense, the pressure sense, the force sense, and the like. In this specification, simply writing "tactile sensation" means the tactile sense in the broad sense. Here, the tactile sense in the narrow sense is, for example, a sensation related to the texture of an object surface that a body part touches, and correlates strongly with sensitivity parameters for sensory expressions such as unevenness and roughness. The pressure sense is, for example, a sensation related to the drag between a body part and an object, and correlates strongly with sensitivity parameters for sensory expressions such as hardness. The force sense is, for example, a sensation related to an external force applied to a body part, such as the sensation of being pulled or pushed. It is known that the receptors mainly responsible for the narrow-sense tactile sense, the pressure sense, and the force sense differ, and that the response characteristics of these receptors also differ.
FIG. 20 shows the configuration of the haptic control system 2 as a second embodiment of the sensory control system 100 shown in FIG. 1, together with the flow of signals.
The haptic control system 1 according to the first embodiment can be used, for example, for entertainment applications such as games, video, and music. When the haptic control system 1 is used for entertainment, the tactile sensation from the tactile presentation device 20 may be presented to the user through, for example, operation parts such as buttons, joysticks, and trigger switches included in an operation device 33 such as a game controller. Tactile presentation from the tactile presentation device 20 may also be performed at places other than the operation parts of the operation device 33, for example to all or part of a body part such as the hand of the user holding the operation device 33. The game controller may be, for example, a steering controller imitating the steering wheel of an automobile.
Conventionally, operating tools that perform sensory presentation by giving some kind of stimulus to a person are known. Here, sensory presentation includes tactile presentation, auditory presentation by sound, and visual presentation by image display or the like. Sensory presentation has been adjusted by adjusting the signals that drive various operating tools.
[Problems to be Solved by the Invention]
However, the conventional technology has the problem that sensory presentation cannot be adjusted by kansei (affective) input. That is, although the sensation each user prefers differs from user to user, users may express their preferences in affective terms. Conventionally, however, such affective expressions have not been utilized for changing the sensory presentation.
In aspect 1, a sensory control method that converts a sensitivity parameter into a physical parameter using the conversion model 15 was described. However, even if a manufacturer prototypes an operating tool to which a physical parameter converted from a sensitivity parameter is applied for tactile presentation, several rounds of trial and error are often needed to obtain the operation feel the user prefers. Since prototyping an operating tool requires many steps, completing an operating tool with the operation feel the user prefers can take time.
FIG. 25 is a perspective view of the haptic control device 50. FIG. 25 shows a standalone haptic control device 50. As shown in FIG. 25, the haptic control device 50 has three reference operating tools 51a to 51c (a plurality of reference operating tools), a reproduction operating tool 52, a touch panel 53, and a display 260. In the following, an arbitrary one of the reference operating tools 51a to 51c is referred to as the "reference operating tool 51". Two or more reference operating tools 51 suffice.
First, an outline of the operation of the haptic control device 50 will be described with reference to FIGS. 27 and 28. FIGS. 27 and 28 outline the procedure by which a user adjusts the operation feel using the haptic control device 50.
(6) The user tries operating the reproduction operating tool 52 and checks whether it gives the operation feel of his or her preference (FIG. 28(c)).
FIG. 29 is a functional block diagram explaining the functions of the haptic control device 50. As shown in FIG. 29, the haptic control device 50 has a display control unit 61, a first input reception unit 62, a second input reception unit 63, a classification unit 64, a first conversion model 65a, a second conversion model 65b, a third conversion model 65c, and a physical parameter setting unit 66. Each of these functions of the haptic control device 50 is realized by a CPU or processor of the information processing apparatus executing a program loaded into RAM. Alternatively, each function may be realized by a hardware circuit.
Next, generation of the classification unit 64 will be described with reference to FIG. 30 and other figures. FIG. 30 is a flowchart showing the flow of learning in generating the classification unit 64. Although the various kinds of learning are assumed to be performed by the haptic control device 50, any information processing apparatus can perform the learning.
"Light (heavy) actuation force"
"No (has) sense of decision"
"Inaccurate (accurate)"
"Clear (ambiguous)"
"Soft (hard)"
"Blurred (distinct)"
"Catching (smooth)"
"Tiring (not tiring)"
"Harsh (gentle)"
"Rough (fine)"
"No (has) feeling of being sucked in"
"Novel (traditional)"
"Cheap (luxurious)"
"Durable (not durable)"
"Do not want (want) to operate again"
"Fun (boring)"
"Not comfortable (comfortable)"
"Dislike (like)"
"No (has) bouncing feel"
"Mild (sharp)"
"Dry (moist)"
"Bright (dark)"
"Cold (warm)"
"Has (no) play"
These sensitivity parameters may be created automatically by web analysis, tweet analysis, SNS analysis, papers, clustering analysis per market, and feature or adjective extraction. That is, the sensitivity parameters need not be fixed and may be dynamically changeable.
"Mild (sharp)"
"Rough (fine)"
"Bright (dark)"
"Soft (hard)"
"Light (heavy)"
In ST52, the haptic control device 50 determines the correspondence between the expression scores of the sensitivity parameters and the physical parameters by multiple regression analysis. In this aspect, three reference operating tools 51a to 51c are prepared, so a load-displacement curve has been obtained for each of the three reference operating tools 51a to 51c. The physical parameters that realize these load-displacement curves are also known. The user operates the reference operating tools 51a to 51c and inputs, as expression scores, what kind of operation feel each of them has. When expression scores have been input from a sufficient number of people, the haptic control device 50 performs multiple regression analysis using Equation 5. The multiple regression analysis was described with Equation 5 of aspect 1 and FIGS. 22 and 23. Accordingly, the coefficients of determination B11 to Bmn can be determined for each of the three reference operating tools 51a to 51c, and a conversion model 15 as shown in FIG. 23 is obtained for each of the reference operating tools 51a to 51c. The conversion models for the three reference operating tools 51a to 51c are the first conversion model 65a to the third conversion model 65c.
FIG. 32 is a flowchart showing the flow in which the haptic control device 50 presents the operation feel the user prefers, using the classification unit 64 and the first to third conversion models 65a to 65c.
Next, a second form of the haptic control device 50 will be described.
FIG. 34 is a functional block diagram explaining the functions of the haptic control device 50. The description of FIG. 34 mainly covers the differences from FIG. 29. The haptic control device 50 has a display control unit 61, a first input reception unit 62, a second input reception unit 63, a physical parameter conversion unit 67, a curve fitting unit 68, a comparison unit 69, a first conversion model 65a, a second conversion model 65b, a third conversion model 65c, and a physical parameter setting unit 66. Each of these functions of the haptic control device 50 is realized by the CPU of the information processing apparatus executing a program loaded into RAM. Alternatively, each function may be realized by a hardware circuit.
Next, learning of the physical parameters (load-displacement curve) corresponding to the expression scores will be described with reference to FIG. 35 and other figures. FIG. 35 is a flowchart showing the flow of learning the physical parameters (load-displacement curve) corresponding to the expression scores.
Fitting model: y = P1·x^0 + P2·x^1 + P3·x^2 + … + Pn·x^(n-1)
The curve fitting unit 68 can obtain P1 to Pn by multiple regression analysis. The obtained P1 to Pn correspond to the physical parameters.
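The curve-fitting step can be sketched as an ordinary least-squares fit of the polynomial fitting model above to a measured load-displacement curve. The sampled curve below is an illustrative assumption, not data from the patent.

```python
import numpy as np

# Sketch: fit y = P1*x^0 + P2*x^1 + P3*x^2 to a load-displacement
# curve by least squares; the recovered coefficients P1..P3 play the
# role of the physical parameters.

stroke = np.linspace(0.0, 2.0, 21)             # x: stroke [mm]
force = 0.5 + 1.2 * stroke - 0.3 * stroke**2   # y: reaction force [N]

# Design matrix with columns x^0, x^1, x^2.
X = np.vander(stroke, 3, increasing=True)
P, *_ = np.linalg.lstsq(X, force, rcond=None)
```

Because the sample curve is itself a quadratic, the fit recovers the generating coefficients; a real measured curve would need a higher-order model and would fit only approximately.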
FIG. 37 is a flowchart showing the flow in which the haptic control device 50 presents the operation feel the user prefers, using the physical parameter conversion unit 67 and the comparison unit 69.
A learning method for classification will be described with reference to FIG. 38 and other figures. FIG. 38 shows an example of a neural network in the case where the classification unit 64 is realized by a neural network. In the neural network of FIG. 38, for the data input to the input layer 131, each of the three nodes of the output layer 133 outputs an output value yi. These output values yi are probabilities, and y1 + y2 + y3 equals 1.0. In this aspect, the three nodes of the output layer 133 correspond to the three reference operating tools 51a to 51c, and output the probability of which of the reference operating tools 51a to 51c is most likely according to the expression scores.
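The constraint that the three output values are probabilities summing to 1.0 is what a softmax output layer provides. A minimal sketch follows; the logit values are illustrative assumptions, and the patent does not specify the network's internals.

```python
import math

# Sketch of the output layer of Fig. 38: a softmax over three nodes
# so that y1 + y2 + y3 = 1.0, interpreted as the probability of each
# reference operating tool 51a-51c.

def softmax(zs):
    m = max(zs)  # subtract the max for numerical stability
    exps = [math.exp(z - m) for z in zs]
    s = sum(exps)
    return [e / s for e in exps]

y = softmax([2.0, 1.0, 0.1])  # hypothetical logits for 51a, 51b, 51c
best = y.index(max(y))        # index of the most probable tool
```

The classifier's prediction is then the reference operating tool with the largest probability, which the display control unit 61 can show in the reference operating tool field.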
FIG. 40 is a diagram explaining the first input screen 281 for expression scores in STEP1. The user operates a slide bar for each sensitivity parameter to input the expression score. The classification unit 64 described in the first form uses the learning result to calculate the probability with which each reference operating tool 51a to 51c would be selected for the current expression scores. The display control unit 61 displays the probability of each reference operating tool 51a to 51c in the reference operating tool field 112. The user can therefore grasp which of the reference operating tools 51a to 51c the current expression scores are close to by trying out the reference operating tools 51a to 51c. The probabilities may be displayed either in real time or when the user inputs a confirmation operation.
Next, the operation of a client-server system will be described with reference to FIG. 41 and other figures. FIG. 41 is a functional block diagram of the haptic control system 2 in which the haptic control device 50 of the first form is applied to a client-server system. The description of FIG. 41 mainly covers the differences from FIG. 29. As shown in FIG. 41, the terminal device 80 and the server 200 have the same functions as the haptic control device 50 of FIG. 29, except that the terminal device 80 and the server 200 have a first communication unit 71 and a second communication unit 72, respectively.
[Claim 1]
A haptic control device that controls the operation feel of an operating tool, comprising:
a display control unit that displays an input means for a first expression score associated with a first sensitivity parameter;
a first input reception unit that receives an input of the first expression score in response to a user operation; and
a physical parameter setting unit that, based on the first expression score, sets a physical parameter prepared in advance to a reproduction operating tool,
wherein the display control unit displays an input means for a second expression score associated with a second sensitivity parameter,
the haptic control device further comprising a second input reception unit that receives an input of the second expression score in response to a user operation, and
a conversion unit that converts the second expression score into a physical parameter by a regression model,
wherein the physical parameter setting unit sets the physical parameter converted by the conversion unit to the reproduction operating tool.
[Claim 2]
The haptic control device according to claim 1, further comprising a classification unit that classifies the first expression score into one of a plurality of reference operating tools,
wherein the physical parameter setting unit sets, to the reproduction operating tool, the physical parameter set for the reference operating tool into which the classification unit classified the score.
[Claim 3]
The haptic control device according to claim 1, further comprising:
a curve fitting unit that performs curve fitting on the load-displacement curve realized by a first physical parameter of each of a plurality of reference operating tools and estimates the first physical parameter for each of the plurality of reference operating tools; and
a physical parameter conversion unit that converts the first expression score into a second physical parameter by a regression model,
wherein the physical parameter setting unit sets, to the reproduction operating tool, the first physical parameter of the reference operating tool whose first physical parameter is most similar to the second physical parameter.
[Claim 4]
The haptic control device according to claim 2, wherein the input means for the second expression score can take the expression score corresponding to the physical parameter set for the reference operating tool classified by the classification unit, and values before and after it.
[Claim 5]
The haptic control device according to claim 1, wherein there are a plurality of first sensitivity parameters and a plurality of second sensitivity parameters, and the number of first sensitivity parameters is larger than the number of second sensitivity parameters.
[Claim 6]
The haptic control device according to claim 2, wherein the classification unit is generated by learning the correspondence between the operation feel of the plurality of reference operating tools and the expression scores input by users for each first sensitivity parameter after operating each of the plurality of reference operating tools.
[Claim 7]
The haptic control device according to claim 2, wherein the regression model is generated by regression analysis of the correspondence between the physical parameters of the plurality of reference operating tools and the expression scores input by users for each second sensitivity parameter after operating the plurality of reference operating tools.
[Claim 8]
The haptic control device according to claim 3, wherein the regression model is generated by regression analysis of the correspondence between the physical parameter of an arbitrary reference operating tool and the expression scores input by a user for each first sensitivity parameter after operating that reference operating tool.
[Claim 9]
The haptic control device according to claim 3, wherein the curve fitting unit performs curve fitting on the load-displacement curve using a fitting model that obtains the operation stress from the stroke amount with the first physical parameters as coefficients, and estimates the first physical parameters.
[Claim 10]
The haptic control device according to claim 1, wherein the first sensitivity parameter and the second sensitivity parameter are adjectives, and the first expression score and the second expression score are values indicating the degree of the adjectives.
[Claim 11]
The haptic control device according to any one of claims 1 to 10, wherein the first expression score and the second expression score are information on the tactile sensation obtained when a user operates the respective operating tool.
[Claim 12]
The haptic control device according to claim 11, wherein, in the regression model, the first expression score and the second expression score correlate with the actuation force as a tactile sensation obtained when the respective operating tool is operated.
[Claim 13]
A program that causes a haptic control device that controls the operation feel of an operating tool to function as:
a display control unit that displays an input means for a first expression score associated with a first sensitivity parameter;
a first input reception unit that receives an input of the first expression score in response to a user operation; and
a physical parameter setting unit that, based on the first expression score, sets a physical parameter prepared in advance to a reproduction operating tool,
wherein the display control unit displays an input means for a second expression score associated with a second sensitivity parameter,
the program further causing the device to function as a second input reception unit that receives an input of the second expression score in response to a user operation, and
a conversion unit that converts the second expression score into a physical parameter by a regression model,
wherein the physical parameter setting unit sets the physical parameter converted by the conversion unit to the reproduction operating tool.
[Claim 14]
A haptic control method in which a haptic control device that controls the operation feel of an operating tool controls a tactile sensation, comprising:
a step of displaying an input means for a first expression score associated with a first sensitivity parameter;
a step of receiving an input of the first expression score in response to a user operation;
a step of setting, based on the first expression score, a physical parameter prepared in advance to a reproduction operating tool;
a step of displaying an input means for a second expression score associated with a second sensitivity parameter;
a step of receiving an input of the second expression score in response to a user operation;
a step of converting the second expression score into a physical parameter by a regression model; and
a step of setting the converted physical parameter to the reproduction operating tool.
[Claim 15]
A haptic control system in which a terminal device and a server communicate via a network,
wherein the terminal device has:
a display control unit that displays an input means for a first expression score associated with a first sensitivity parameter;
a first input reception unit that receives an input of the first expression score in response to a user operation;
a first communication unit that transmits the first expression score to the server; and
a physical parameter setting unit that sets a physical parameter transmitted from the server to a reproduction operating tool,
wherein the display control unit displays an input means for a second expression score associated with a second sensitivity parameter,
the terminal device has a second input reception unit that receives an input of the second expression score in response to a user operation, and
the first communication unit transmits the second expression score to the server;
and wherein the server has:
a second communication unit that determines the physical parameter prepared in advance based on the first expression score received from the terminal device and transmits the determined physical parameter to the terminal device; and
a conversion unit that converts the second expression score received from the terminal device into a physical parameter by a regression model,
wherein the second communication unit transmits the physical parameter converted by the conversion unit to the terminal device.
[Claim 16]
A server that communicates via a network with a terminal device having:
a display control unit that displays an input means for a first expression score associated with a first sensitivity parameter;
a first input reception unit that receives an input of the first expression score in response to a user operation;
a first communication unit that transmits the first expression score to the server; and
a physical parameter setting unit that sets a physical parameter transmitted from the server to a reproduction operating tool,
wherein the display control unit displays an input means for a second expression score associated with a second sensitivity parameter, the terminal device has a second input reception unit that receives an input of the second expression score in response to a user operation, and the first communication unit transmits the second expression score to the server,
the server comprising:
a second communication unit that determines a physical parameter prepared in advance based on the first expression score received from the terminal device and transmits the determined physical parameter to the terminal device; and
a conversion unit that converts the second expression score received from the terminal device into the physical parameter by a regression model,
wherein the second communication unit transmits the physical parameter converted by the conversion unit to the terminal device.
Conventionally, operation parts that perform sensory presentation by giving some kind of stimulus to a person are known. Here, sensory presentation includes tactile presentation, auditory presentation by sound, and visual presentation by image display or the like. Sensory presentation has been adjusted by adjusting the signals that drive various operation parts.
[Problems to be Solved by the Invention]
However, the conventional technology has the problem that the sensory presentation does not sufficiently take the physical characteristics of the operation part into account. For example, in the case of a rotary operation part, the sensation transmitted to the user operating it differs depending on the size, mass, and so on of the operation part even if the actuator is driven in the same way.
A technique can be provided that performs sensory presentation according to the physical characteristics of the operation part.
In this aspect, a sensory control method that performs adjustment based on the physical characteristics of an operation part (for example, the operation device 33 of FIG. 45 described later) will be described. When the tactile presentation device 20 generates a tactile sensation through an actuator-driven operation part, depending on the physical characteristics of the operation part (size, mass, etc.), the feel transmitted to the user operating the operation part (an example of an operator; the operation sensation the user perceives) differs even if the actuator is driven in the same way.
- The user inputs the difference in the physical characteristics of the operation part into the input/output device 3 as information. The size and mass of the operation part are thereby specified.
- The tactile presentation device 20 detects, with a sensor, an ID representing the difference in the physical characteristics of the operation part, its size, its mass, and so on.
- The sensor that detects the difference in the physical characteristics of the operation part is a camera, which reads a one-dimensional code or a two-dimensional code. The camera can also identify the operation part by recognizing an image of it. Alternatively, the sensor is an IC tag reader, which reads the ID.
FIG. 45 is a diagram showing the configuration of the haptic control system 110 of the sensory control system 100 in this aspect. In this aspect, components given the same reference numerals as in FIG. 2 perform the same functions, so the description may mainly cover only the main components of this aspect.
[Detection of the operation part by the operation part sensor]
A method of detecting the operation part with the operation part sensor 254 will be described with reference to FIGS. 47 and 48. First, FIG. 47 is a diagram explaining the difference in physical characteristics of rotary operation parts. FIG. 47(a) shows a small operation part 201a, and FIG. 47(b) shows a large operation part 201b. In the following, an arbitrary one of the operation parts 201a and 201b is referred to as the "operation part 201".
A case can occur in which the operation part detected by the operation part sensor 254 is not among the operation part parameters. For example:
- In a controller used by a user, such as a game controller, the user can change an operation part such as a knob, and may want an operation feel suited to the attached part.
- In an operation handle of a vehicle or the like, which is user-replaceable, an operation feel suited to the attached handle may be desired.
Next, a method by which the calibration unit 55 estimates the mass by calibration will be described. When the operation part 201 is attached, the calibration unit 55 drives the operation part with a current pattern (rotating it in the case of a rotary type) and estimates the mass of the operation part from the correspondence between the current and the position.
The degree of tilt of the operation part 201 differs depending on where it is installed. For example, the tilt of the operation part 201 differs between the case where it is attached to the steering wheel and the case where it is attached to the center console. When the tilt differs, the operation feel, particularly of a press-type operation part, differs due to the action of gravity. Therefore, the tactile presentation device 20 measures the tilt of the installation location of the operation part 201 with the acceleration sensor 28 and corrects the mass of the operation part 201.
F2=F1/cosθ
Thus, an installation location with a tilt requires a larger operation reaction force, and the operation reaction force correlates with the mass. Therefore, regarding the difference in operation reaction force as a difference in mass, the mass correction unit 261 corrects the mass of the operation part 201. The mass correction unit 261 corrects the mass of the operation part 201 using, for example, a relation such as "corrected mass = original mass / cos θ". In this way, even if the operation part 201 is installed in a tilted location, a preferable operation feel can be controlled.
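The correction above is a one-line computation; a minimal sketch with illustrative values (the mass and angle are assumptions, not values from the patent):

```python
import math

# Sketch of the tilt correction: with a tilt angle theta, the required
# reaction force grows as F2 = F1 / cos(theta), and the same relation
# is applied to the mass ("corrected mass = original mass / cos(theta)").

def corrected_mass(mass, theta_deg):
    return mass / math.cos(math.radians(theta_deg))

m = corrected_mass(100.0, 60.0)  # 100 g knob installed at a 60-degree tilt
```

At 60 degrees, cos θ = 0.5, so the corrected mass is double the original, reflecting the larger reaction force needed against gravity.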
FIG. 51 is a flowchart showing the processing in which the haptic control system 110 adjusts the tactile presentation signal according to the physical parameters of the attached operation part.
- The case where the processor 18 (an example of an operation detection unit) functions as the adjustment unit and reflects the adjustment in the "operation signal".
- The case where the arithmetic function unit 12 (an example of a signal generation unit) functions as the adjustment unit and reflects the adjustment in the "sensory presentation signal".
- The case where the tactile presentation unit 30 functions as the adjustment unit and reflects the adjustment in the "sensory presentation".
Next, a haptic control system 111 having a communication device 70 (server) and a terminal device 80 will be described with reference to FIG. 53. FIG. 53 shows the configuration of the haptic control system 111 as a second embodiment of the sensory control system 100 shown in FIG. 45, together with the flow of signals. The description of FIG. 53 mainly covers the differences from FIG. 45.
According to the haptic control systems 110 and 111 of this aspect, the physical parameters of the operation part are adjusted according to the size, mass, and so on of the operation part, so even if the size or mass of the operation part changes, the feel transmitted to the operating user can be controlled to a feel preferable to the user.
For example, the operation part of aspect 3 is not limited to being detachable. For example, in a system with a plurality of operation parts in which operation parts differing in knob size and design are arranged, the differences can be recognized and an appropriate feel can be generated.
[Claim 1]
A sensory control device comprising:
an operation part;
an operation detection unit that detects an operation of the operation part and generates an operation signal;
a signal generation unit that generates a sensory presentation signal based on the operation signal;
a sensory presentation unit that performs sensory presentation to an operator based on the sensory presentation signal; and
an adjustment unit that adjusts at least one of the operation signal, the sensory presentation signal, and the sensory presentation based on physical characteristics of the operation part.
[Claim 2]
The sensory control device according to claim 1, wherein the physical characteristics of the operation part include at least one physical parameter among the mass, diameter, radius, and overall length of at least a part of the operation part.
[Claim 3]
The sensory control device according to claim 1, further comprising an operation part sensor that detects an attached operation part,
wherein the operation part sensor specifies the physical characteristics of the operation part by acquiring identification information of the operation part, or specifies the physical characteristics of the operation part from image data in which the operation part is photographed.
[Claim 4]
The sensory control device according to claim 1, wherein the sensory presentation unit stops the generation of the sensory presentation signal when the physical characteristics of the operation part satisfy a predetermined condition.
[Claim 5]
The sensory control device according to claim 1, wherein the operation part is a press-type operation part that accepts a pressing operation.
[Claim 6]
The sensory control device according to claim 1, wherein the operation part is a slide operation part that accepts a slide operation.
[Claim 7]
The sensory control device according to claim 1, wherein the operation part is a pivot operation part that accepts a tilting operation.
[Claim 8]
The sensory control device according to claim 1, wherein the operation part is a rotary operation part that accepts a rotating operation.
[Claim 9]
The sensory control device according to claim 1, wherein the sensory presentation signal correlates with a sensitivity parameter.
[Claim 10]
The sensory control device according to claim 1, wherein the sensory presentation unit is a tactile presentation unit that performs tactile presentation to the operator.
[Claim 11]
The sensory control device according to claim 1, wherein at least a part of the operation part is detachable.
[Claim 12]
The sensory control device according to claim 1, further comprising:
a torque sensor that detects the torque required when the operation part is driven by an actuator; and
a calibration unit that estimates the mass of the operation part from the torque detected by the torque sensor, based on a relation between torque and mass prepared in advance.
[Claim 13]
The sensory control device according to claim 1, further comprising:
an acceleration sensor that detects the tilt of the operation part; and
a mass correction unit that corrects the mass of the operation part according to the tilt detected by the acceleration sensor.
[Claim 14]
A sensory control method performed by a device having an operation part, comprising:
a step of detecting an operation of the operation part and generating an operation signal;
a step of generating a sensory presentation signal based on the operation signal;
a step of performing sensory presentation to an operator based on the sensory presentation signal; and
a step of adjusting at least one of the operation signal, the sensory presentation signal, and the sensory presentation based on physical characteristics of the operation part.
[Claim 15]
A sensory control system comprising a communication device and a terminal device capable of communicating with each other,
wherein the terminal device has:
an operation part;
an operation detection unit that detects an operation of the operation part and generates an operation signal; and
a sensory presentation unit that performs sensory presentation to an operator based on a sensory presentation signal transmitted from the communication device,
wherein the communication device has a signal generation unit that generates the sensory presentation signal based on the operation signal, and
wherein the terminal device or the communication device has an adjustment unit that adjusts at least one of the operation signal, the sensory presentation signal, and the sensory presentation based on physical characteristics of the operation part.
Conventionally, operating tools that perform sensory presentation by giving some kind of stimulus to a person are known. Here, sensory presentation includes tactile presentation, auditory presentation by sound, and visual presentation by image display or the like. Sensory presentation has been adjusted by adjusting the signals that drive various operating tools.
[Problems to be Solved by the Invention]
However, the conventional technology does not assume deformation of an elastic body such as a finger in the operation direction, such as the buckling phenomenon in response to a pressing operation, so the range of expressiveness of the sensory presentation is narrow. That is, a finger includes elastic bodies such as skin and flesh, but phenomena caused by the elastic bodies, such as buckling, are not reflected in the sensory presentation.
A technique can be provided that further expands the range of expressiveness of sensory presentation.
In this aspect, a haptic control system 1 that outputs a sensory stimulus signal based on physical parameters including dynamic characteristics, and its sensory control method, will be described. A dynamic characteristic is a physical characteristic that includes a time factor; for example, the physical characteristic changes with time.
- The correlation between physical parameters (the travel distance of the operating tool in the fingertip collision period T3, the amount of change in the force sensor value in the fingertip collision period T3, and the fingertip collision period T3 itself) and a sensitivity parameter (sense of recovery)
- The correlation between a physical parameter (the position change in the buckling period T1) and a sensitivity parameter (feeling of being sucked in)
- The correlation between a physical parameter (the fingertip vibration period T4) and a sensitivity parameter (feeling of fatigue)
[Configuration example of the finger model pressing tool and the operating tool]
FIG. 55 is a diagram explaining the static characteristics obtained with a rigid pressing tool and the dynamic characteristics obtained with the finger model pressing tool 252, in which a rigid body and an elastic body are integrated. First, the load-displacement curve of the operating tool 250 obtained with the pressing tool of the rigid body 253 can express only static characteristics, which include no time factor. Since the load-displacement curve 75 does not include the influence of the elastic body corresponding to the flesh part 257 of the finger, it cannot fully express the physical characteristics that contribute to the tactile sensation the operator perceives.
FIG. 58 is a diagram explaining the generation of a sensory presentation signal with a click feeling. A click feeling refers to the response at the time of input on an input device such as a button, such as the resistance felt when pressing a switch. In the case of a mechanical switch, the click feeling is obtained by the resistance and deformation of the metal contact 57 or the like. However, how the click feeling arises varies with the button structure.
FIG. 60 is a diagram explaining the dynamic characteristics when the operating tool 250 is pressed by the finger model pressing tool 252. FIG. 60(a) is the load-displacement curve 75 shown for reference, and FIG. 60(b) is an example of the dynamic characteristics 270 when the operating tool 250 is pressed by the finger model pressing tool 252. In FIG. 60(b), the horizontal axis is time, and the vertical axis shows the two force sensor values A and B and the position change 211. The unit of time is [msec], and the unit of the force sensor values A and B is [N]. Note that the dynamic characteristics 270 differ greatly depending on the operating tool 250, and FIG. 60(b) is merely an example.
Some of the dynamic characteristics described in FIGS. 60 and 62 correlate with sensitivity parameters. The appropriate dynamic characteristics correlated with sensitivity parameters are the physical parameters of this aspect.
FIG. 64 is a flowchart explaining the flow of determining the physical parameters correlated with the sensitivity parameters.
Next, a haptic control system 2 having a communication device 70 (server) and a terminal device 80 will be described with reference to FIG. 69. Note that the block diagram of the haptic control system 2 may be the same as FIG. 20.
As described above, the haptic control system 1 of this aspect can extract the dynamic characteristics correlated with the sensitivity parameters by pressing the operating tool 250 with the finger model pressing tool 252. Accordingly, since a conversion model that converts a sensitivity parameter into these dynamic characteristics can be created, a sensory presentation signal with preferable dynamic characteristics can be generated.
For example, although aspect 2 described a press-type operating tool, the same applies to a rotary operating tool that accepts a rotating operation. For a rotary operating tool, the rotation angle is the position change, and the resistance to rotation is the operation reaction force.
[Claim 1]
A sensory control method comprising:
a reception step of receiving an input of a sensitivity parameter indicating the degree of sensory expression when an operating tool is operated;
a conversion step of converting the received sensitivity parameter into a physical parameter correlated with the sensitivity parameter among a plurality of types of physical parameters included in physical characteristics related to sensory stimulation; and
an output step of outputting a sensory stimulus signal based on the converted physical parameter,
wherein the physical characteristics include dynamic characteristics.
[Claim 2]
The sensory control method according to claim 1, wherein the dynamic characteristic is a physical characteristic including a temporal change in at least one of an operation reaction force and an operation amount accompanying an operation of a predetermined operating tool.
[Claim 3]
The sensory control method according to claim 2, wherein the physical characteristic is a physical characteristic that realizes sensory presentation when, of a finger model pressing tool including a rigid body and an elastic body, the elastic body is brought into contact with the predetermined operating tool and operated.
[Claim 4]
The sensory control method according to claim 1, wherein the physical parameter is the buckling period.
[Claim 5]
The sensory control method according to claim 1, wherein the physical parameter is the fingertip drop period.
[Claim 6]
The sensory control method according to claim 1, wherein the physical parameter is the fingertip collision period.
[Claim 7]
The sensory control method according to claim 1, wherein the physical parameter is the fingertip vibration period.
[Claim 8]
The sensory control method according to claim 1, wherein the physical parameter has a correlation with the sensitivity parameter.
[Claim 9]
The sensory control method according to claim 1, wherein the operating tool is a press-type operating tool that accepts a pressing operation.
[Claim 10]
The sensory control method according to claim 1, wherein the operating tool is a rotary operating tool that accepts a rotating operation.
[Claim 11]
A device comprising:
an input unit that receives an input of a sensitivity parameter indicating the degree of sensory expression when an operating tool is operated;
a conversion model that converts the sensitivity parameter received by the input unit into a physical parameter correlated with the sensitivity parameter among a plurality of types of physical parameters included in physical characteristics related to sensory stimulation; and
a sensory presentation unit that outputs a sensory stimulus signal based on the physical parameter converted by the conversion model,
wherein the physical characteristics include dynamic characteristics.
[Claim 12]
A sensory control system comprising a communication device and a terminal device capable of communicating with each other,
wherein the terminal device has an input unit that receives an input of a sensitivity parameter indicating the degree of sensory expression when an operating tool is operated,
the communication device has a conversion model that converts the sensitivity parameter transmitted from the terminal device into a physical parameter correlated with the sensitivity parameter among a plurality of types of physical parameters included in physical characteristics related to sensory stimulation,
the terminal device has a sensory presentation unit that outputs a sensory stimulus signal based on the physical parameter converted by the conversion model, and
the physical characteristics include dynamic characteristics.
[Claim 13]
A program that causes a device to function as:
an input unit that receives an input of a sensitivity parameter indicating the degree of sensory expression when an operating tool is operated;
a conversion model that converts the sensitivity parameter received by the input unit into a physical parameter correlated with the sensitivity parameter among a plurality of types of physical parameters included in physical characteristics related to sensory stimulation; and
a sensory presentation unit that outputs a sensory stimulus signal based on the physical parameter converted by the conversion model,
wherein the physical characteristics include dynamic characteristics.
The best modes for carrying out the present invention have been described above using the respective aspects, but the present invention is in no way limited to these aspects, and various modifications and substitutions can be made without departing from the gist of the present invention. For example, the functions included in each component, each step, and so on can be rearranged so as not to be logically inconsistent, and a plurality of components, steps, and so on can be combined into one or divided.
3 input/output device
4 input unit
5 display unit
6, 10 main control device
7, 14, 18, 41, 101 processor
8, 11 storage unit
9 network
12, 13 arithmetic function unit
15 sensitivity parameter-physical parameter conversion model
16 kansei database
20, 40 tactile presentation device
21 movable part
24 bobbin
25 coil
26 spring member
27 position sensor
28 acceleration sensor
29 operation range variable unit
30, 43 tactile presentation unit
31 yoke
31a outer peripheral yoke
31b center yoke
32 magnet
33, 42 operation device
39 actuator
43a resistance torque generator
43b rotational torque generator
45 sensor
70 communication device
80 terminal device
100 sensory control system
102 sensory presentation unit
Claims (34)
1. A sensory control method comprising:
a receiving step of receiving a kansei parameter;
a conversion step of converting the received kansei parameter into a physical parameter that correlates with the kansei parameter, from among a plurality of types of physical parameters included in physical characteristics relating to sensory presentation; and
an output step of outputting a sensory presentation signal based on the converted physical parameter.
2. The sensory control method according to claim 1, wherein the conversion step is executed based on a conversion model capable of converting a received kansei parameter into a physical parameter that correlates with the kansei parameter.
3. The sensory control method according to claim 2, wherein the conversion model used in the conversion step is obtained by:
a storage step of storing, for each of one or more types of sensory presentation, correspondence information between physical characteristics relating to a given sensory presentation and a kansei parameter indicating a degree of sensory expression for that sensory presentation;
an extraction step of extracting, based on the correspondence information for each of the one or more types of sensory presentation, a physical parameter that correlates with the kansei parameter from among a plurality of types of physical parameters included in the physical characteristics relating to sensory presentation; and
a generation step of generating the conversion model based on the kansei parameter and the extracted physical parameter.
4. The sensory control method according to claim 3, wherein the storage step is a step of storing, for each of one or more types of operation tools, correspondence information between physical characteristics that realize a sensory presentation when a given operation tool is operated and a kansei parameter input to reflect the operation of that operation tool.
5. The sensory control method according to claim 4, wherein the change in operation reaction force with respect to displacement accompanying operation of the operation tool includes at least a local maximum and a local minimum, and
the physical parameter includes a variable based on the area of the depressed portion, in a coordinate plane whose axes are the displacement accompanying the operation and the operation reaction force, extending from the local maximum through the local minimum to the coordinate at which the operation reaction force returns to the same magnitude as the local maximum.
6. The sensory control method according to claim 4, wherein the change in operation reaction force with respect to displacement accompanying operation of the operation tool includes at least a local maximum and a local minimum, and
the physical parameter includes a variable relating to the amount of displacement accompanying the operation.
7. The sensory control method according to claim 6, wherein the physical parameter includes a variable relating to the amount of displacement accompanying the operation from the start of the operation until the local maximum appears.
8. The sensory control method according to claim 6, wherein the physical parameter includes a variable relating to the ratio between the amount of displacement, in a coordinate plane whose axes are the displacement accompanying the operation and the operation reaction force, from the local maximum through the local minimum to the coordinate at which the operation reaction force returns to the same magnitude as the local maximum, and the amount of displacement from the start of the operation to the local maximum.
9. The sensory control method according to claim 4, wherein the change in operation reaction force with respect to displacement accompanying operation of the operation tool includes at least a local maximum, and
the physical parameter includes a variable relating to the curvature of the local maximum in a coordinate plane whose axes are the displacement accompanying the operation and the operation reaction force.
10. The sensory control method according to claim 4, wherein the change in operation reaction force with respect to displacement accompanying operation of the operation tool includes at least a local maximum, and
the physical parameter includes a variable relating to the rise of the operation reaction force from the start of the operation to the local maximum.
11. The sensory control method according to claim 4, wherein the change in operation reaction force with respect to displacement accompanying operation of the operation tool includes at least a local maximum and a local minimum, and
the physical parameter includes a variable relating to the magnitude of the pull-in amount by which the local minimum becomes negative in a coordinate plane whose axes are the displacement accompanying the operation and the operation reaction force.
12. The sensory control method according to claim 4, wherein at least the operation reaction force changes with respect to displacement accompanying operation of the operation tool, and
the physical parameter includes a variable relating to the derivative of the operation reaction force with respect to the displacement.
13. The sensory control method according to claim 4, wherein at least the operation reaction force changes with respect to displacement accompanying operation of the operation tool, and
the physical parameter includes a variable relating to the second derivative of the operation reaction force with respect to the displacement.
14. The sensory control method according to any one of claims 5 to 13, wherein the displacement accompanying operation of the operation tool is the operation amount of the operation tool, the operation time of the operation tool, or a combination of the operation amount and the operation time.
15. The sensory control method according to claim 14, wherein the operation amount of the operation tool is an amount in a one-dimensional, two-dimensional, or three-dimensional space.
16. The sensory control method according to claim 4, wherein the operation tool includes a movable part,
the change in operation reaction force with respect to the amount of movement of the movable part accompanying operation of the operation tool includes at least a local maximum and a local minimum, and
the physical parameter includes a variable relating to the acceleration of the movable part.
17. The sensory control method according to claim 4, wherein the operation tool has an operation surface that receives a slide operation,
the change in operation reaction force with respect to displacement accompanying the slide operation of the operation tool includes at least a local maximum and a local minimum,
the operation reaction force is generated by vibration of the operation surface, and
the local maximum or the local minimum is synthesized in a pseudo manner by making the temporal profiles of the rise and the fall of the drive signal that vibrates the operation surface differ from each other so that, averaged over a predetermined time, the power in the direction corresponding to the rise or the direction corresponding to the fall is larger than in the other direction.
18. The sensory control method according to any one of claims 1 to 3, further comprising:
an acquisition step of acquiring a sensory stimulus signal; and
a designation step of designating a kansei parameter based on the acquired sensory stimulus signal,
wherein the receiving step is a step of receiving the kansei parameter designated in the designation step.
19. The sensory control method according to claim 18, wherein the sensory stimulus signal is an auditory stimulus signal based on an auditory stimulus element, a visual stimulus signal based on a visual stimulus element, a tactile stimulus signal based on a tactile stimulus element, or a signal based on any combination of these.
20. The sensory control method according to claim 19, wherein the designation step is a step of converting a physical parameter included in the physical characteristics of at least one of the auditory stimulus element, visual stimulus element, and tactile stimulus element underlying the acquired sensory stimulus signal into the kansei parameter with which that physical parameter correlates, and designating that kansei parameter.
21. The sensory control method according to any one of claims 3 to 17, wherein the extraction step includes a step of extracting, for a plurality of types of kansei parameters, information on the degree of correlation of each of the plurality of types of physical parameters with the kansei parameter, and
the generation step includes:
a first generation step of generating, using the plurality of types of physical parameters and the pieces of correlation-degree information, a first relational expression explaining each of the plurality of types of kansei parameters;
a second generation step of generating, based on the first relational expression, using the plurality of types of kansei parameters and the pieces of correlation-degree information, a second relational expression explaining each of the plurality of types of physical parameters; and
a third generation step of generating, based on the second relational expression, a conversion model capable of converting a plurality of types of kansei parameters into a plurality of types of physical parameters that correlate with those kansei parameters.
22. The sensory control method according to claim 21, wherein the extraction step includes a step of extracting the correlation-degree information by multiple regression analysis in which each of the plurality of types of kansei parameters is a dependent variable and the plurality of types of physical parameters are explanatory variables.
23. The sensory control method according to claim 21 or 22, wherein the first generation step generates the first relational expression as a matrix equation whose one side is a column vector representing the plurality of types of kansei parameters and whose other side is the product of a coefficient matrix representing the correlation-degree information and a column vector representing the plurality of types of physical parameters, and
the second generation step generates the second relational expression by multiplying both sides of the first relational expression by the inverse of the coefficient matrix.
24. The sensory control method according to claim 23, wherein the coefficient matrix is a square matrix.
25. The sensory control method according to claim 23, wherein the inverse matrix is a pseudo-inverse matrix.
26. The sensory control method according to any one of claims 21 to 25, wherein the receiving step is a step of receiving a plurality of types of kansei parameters, and
the conversion step is a step of converting the received plurality of types of kansei parameters, based on the conversion model, into a plurality of types of physical parameters that correlate with those kansei parameters.
27. The sensory control method according to any one of claims 1 to 26, further comprising a sensory presentation step of presenting a sensation based on the sensory presentation signal.
28. The sensory control method according to any one of claims 1 to 27, wherein the sensory presentation includes at least one of tactile presentation, auditory presentation, and visual presentation.
29. The sensory control method according to any one of claims 1 to 3, further comprising a sensory presentation step of presenting a sensation based on the sensory presentation signal,
wherein the sensory presentation step is a step of presenting a tactile sensation by causing a tactile presentation unit to generate an operation reaction force in response to operation of an operation device,
the operation device has an operation surface that receives a slide operation,
the tactile presentation unit generates the operation reaction force by vibrating the operation surface, and
in the sensory presentation step, the temporal profiles of the rise and the fall of the drive signal that vibrates the operation surface are made to differ from each other so that, averaged over a predetermined time, the power in the direction corresponding to the rise or the direction corresponding to the fall is larger than in the other direction, thereby controlling the change in the operation reaction force with respect to displacement accompanying the slide operation of the operation device so as to include at least a local maximum or a local minimum.
30. A conversion model generation method comprising:
a storage step of storing, for each of one or more types of sensory presentation, correspondence information between physical characteristics relating to a given sensory presentation and a kansei parameter indicating a degree of sensory expression for that sensory presentation;
an extraction step of extracting, based on the correspondence information for each of the one or more types of sensory presentation, a physical parameter that correlates with the kansei parameter from among a plurality of types of physical parameters included in the physical characteristics relating to sensory presentation; and
a generation step of generating, based on the kansei parameter and the extracted physical parameter, a conversion model capable of converting a newly received kansei parameter into a physical parameter that correlates with that kansei parameter.
31. A relational expression conversion method comprising a step of converting a first relational expression, which explains each of a plurality of types of kansei parameters by a plurality of types of physical parameters included in physical characteristics relating to sensory presentation, into a second relational expression, which explains each of the plurality of types of physical parameters by the plurality of types of kansei parameters.
32. A program causing a computer to execute the method according to any one of claims 1 to 31.
33. A sensory control system comprising:
an input unit that receives a kansei parameter; and
a processor that converts the received kansei parameter into a physical parameter that correlates with the kansei parameter, from among a plurality of types of physical parameters included in physical characteristics relating to sensory presentation, and outputs a sensory presentation signal based on the converted physical parameter.
34. A conversion model generation system comprising:
a storage unit that stores, for each of one or more types of sensory presentation, correspondence information between physical characteristics relating to a given sensory presentation and a kansei parameter indicating a degree of sensory expression for that sensory presentation; and
a processor that extracts, based on the correspondence information for each of the one or more types of sensory presentation, a physical parameter that correlates with the kansei parameter from among a plurality of types of physical parameters included in the physical characteristics relating to sensory presentation, and generates, based on the kansei parameter and the extracted physical parameter, a conversion model capable of converting a newly received kansei parameter into a physical parameter that correlates with that kansei parameter.
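Claims 21 to 25 above describe the core of the conversion model: a first relational expression in which a column vector of kansei parameters equals a coefficient matrix (estimated by multiple regression) times a column vector of physical parameters, and a second relational expression obtained by multiplying both sides by the inverse, or pseudo-inverse, of that matrix. The following is a minimal numerical sketch of that procedure with NumPy; the data, dimensions, and variable names are illustrative assumptions, not values from the patent.

```python
import numpy as np

# Hypothetical training data: rows are operation-tool samples, columns are
# physical parameters (e.g. peak force, travel, curvature) and kansei
# ratings (e.g. "crisp", "heavy"). All names and values are illustrative.
rng = np.random.default_rng(0)
X_phys = rng.normal(size=(50, 3))            # 3 physical parameters
A_true = np.array([[1.0, -0.5, 0.2],
                   [0.3,  0.8, -0.4]])       # 2 kansei dimensions
E_kansei = X_phys @ A_true.T + 0.01 * rng.normal(size=(50, 2))

# First relational expression (claim 23): e = A p, with A estimated by
# multiple regression of each kansei parameter on the physical
# parameters (claim 22). lstsq solves all regressions at once.
A_est, *_ = np.linalg.lstsq(X_phys, E_kansei, rcond=None)
A_est = A_est.T                              # shape: (2 kansei, 3 physical)

# Second relational expression (claims 23 and 25): p = A+ e, using the
# Moore-Penrose pseudo-inverse because A is not square here.
A_pinv = np.linalg.pinv(A_est)

# Conversion model: designer-specified kansei parameters -> physical
# parameters to hand to the actuator drive stage.
e_target = np.array([0.8, -0.2])             # desired degree of each expression
p_target = A_pinv @ e_target
print(p_target)
```

Because the estimated coefficient matrix here has full row rank, applying the first relational expression to the converted physical parameters reproduces the requested kansei parameters exactly, which is the sense in which the pseudo-inverse stands in for the inverse of claim 23 when the matrix is not square (claim 24 covers the square case).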
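Claims 17 and 29 describe synthesizing a pseudo local maximum or minimum of the operation reaction force by vibrating the operation surface with a drive signal whose rise and fall have different temporal profiles, so that the time-averaged power in one direction exceeds the other. A toy sketch of one period of such an asymmetric waveform follows; the sample rate, drive frequency, and rise fraction are illustrative assumptions, not parameters from the patent.

```python
import numpy as np

# One vibration period whose rise is much steeper than its fall. Over the
# period the surface position returns to its start, but the velocity (and
# hence force) in the rise direction far exceeds the fall direction, which
# is the directional asymmetry the claims use to synthesize a pseudo detent.
fs = 10_000                  # samples per second (assumed)
period = 0.01                # 100 Hz drive (assumed)
rise_frac = 0.1              # 10% of the period rising, 90% falling (assumed)

n = int(fs * period)
n_rise = int(n * rise_frac)
rise = np.linspace(0.0, 1.0, n_rise, endpoint=False)
fall = np.linspace(1.0, 0.0, n - n_rise, endpoint=False)
x = np.concatenate([rise, fall])         # one period of surface position

v = np.gradient(x, 1.0 / fs)             # surface velocity
peak_pos = v.max()                       # velocity in the rise direction
peak_neg = -v.min()                      # velocity in the fall direction
print(peak_pos / peak_neg)               # rise direction dominates
```

With a 10% rise fraction the rise-direction velocity is roughly nine times the fall-direction velocity, even though the waveform is periodic and the net displacement per period is zero.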
Priority Applications (9)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2022573686A JP7386363B2 (ja) | 2021-05-19 | 2022-05-19 | Sensory control method, sensory control system, conversion model generation method, conversion model generation system, relational expression conversion method, and program |
EP22804758.5A EP4343492A1 (en) | 2021-05-19 | 2022-05-19 | Sensation control method, sensation control system, conversion model generation method, conversion model generation system, relational expression conversion method, and program |
CN202280035896.XA CN117296027A (zh) | 2021-05-19 | 2022-05-19 | Sensory control method, sensory control system, conversion model generation method, conversion model generation system, relational expression conversion method, and program |
KR1020237038834A KR20230169290A (ko) | 2021-05-19 | 2022-05-19 | Sensory control method, sensory control system, conversion model generation method, conversion model generation system, relational expression conversion method, and program |
JP2022196779A JP7585575B2 (ja) | 2021-05-19 | 2022-12-09 | Sensory control device, sensory control method, and sensory control system |
JP2022196780A JP7559304B2 (ja) | 2021-05-19 | 2022-12-09 | Sensory control method, device, sensory control system, and program |
JP2022196778A JP7481419B2 (ja) | 2021-05-19 | 2022-12-09 | Tactile control device, program, tactile control method, tactile control system, and server |
US18/479,880 US20240036652A1 (en) | 2021-05-19 | 2023-10-03 | Sensory Control Method, Sensory Control System, Method For Generating Conversion Model, Conversion Model Generation System, Method For Converting Relational Expression, And Program |
JP2024101190A JP2024111283A (ja) | 2021-05-19 | 2024-06-24 | Sensory control device, sensory control method, and sensory control system |
Applications Claiming Priority (8)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2021-084696 | 2021-05-19 | ||
JP2021084696 | 2021-05-19 | ||
JP2022-079095 | 2022-05-12 | ||
JP2022-079099 | 2022-05-12 | ||
JP2022079099 | 2022-05-12 | ||
JP2022079095 | 2022-05-12 | ||
JP2022079128 | 2022-05-13 | ||
JP2022-079128 | 2022-05-13 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US18/479,880 Continuation US20240036652A1 (en) | 2021-05-19 | 2023-10-03 | Sensory Control Method, Sensory Control System, Method For Generating Conversion Model, Conversion Model Generation System, Method For Converting Relational Expression, And Program |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022244849A1 true WO2022244849A1 (ja) | 2022-11-24 |
Family
ID=84141638
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2022/020863 WO2022244849A1 (ja) | 2021-05-19 | 2022-05-19 | Sensory control method, sensory control system, conversion model generation method, conversion model generation system, relational expression conversion method, and program |
Country Status (5)
Country | Link |
---|---|
US (1) | US20240036652A1 (ja) |
EP (1) | EP4343492A1 (ja) |
JP (5) | JP7386363B2 (ja) |
KR (1) | KR20230169290A (ja) |
WO (1) | WO2022244849A1 (ja) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2024161820A1 (ja) * | 2023-01-30 | 2024-08-08 | Panasonic Intellectual Property Management Co., Ltd. | Control method for spring mechanism, control device for spring mechanism, and program |
JP7544902B1 (ja) | 2023-04-21 | 2024-09-03 | Kracie, Ltd. | Tactile feedback system, tactile feedback device, tactile feedback method, and program |
WO2024204195A1 (ja) * | 2023-03-31 | 2024-10-03 | Kurimoto, Ltd. | Tactile transmission system, tactile transmission program, and tactile transmission method |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10943578B2 (en) * | 2016-12-13 | 2021-03-09 | Ultrahaptics Ip Ltd | Driving techniques for phased-array systems |
WO2024204546A1 (ja) * | 2023-03-31 | 2024-10-03 | Kurimoto, Ltd. | Tactile presentation system |
Citations (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2012137888A (ja) * | 2010-12-24 | 2012-07-19 | Kyocera Corp | Tactile sensation presentation device, program used for the device, and tactile sensation presentation method |
WO2013186847A1 (ja) * | 2012-06-11 | 2013-12-19 | Fujitsu Ltd | Drive device, electronic apparatus, and drive control program |
JP5662425B2 (ja) | 2009-05-30 | 2015-01-28 | Nike Innovate C.V. | Method of online design of consumer products |
JP2015135678A (ja) * | 2014-01-16 | 2015-07-27 | Immersion Corporation | Systems and methods for authoring user-generated content |
JP2019220168A (ja) | 2018-06-11 | 2019-12-26 | Immersion Corporation | Systems and methods for designing haptics using speech commands |
JP2021084696A (ja) | 2019-11-29 | 2021-06-03 | Yoshino Kogyosho Co., Ltd. | Synthetic resin container, intermediate container body, and method for manufacturing a synthetic resin container |
JP2022079128A (ja) | 2020-11-16 | 2022-05-26 | Seiko Epson Corp | Image reading device, image reading control method, and program |
JP2022079095A (ja) | 2020-11-16 | 2022-05-26 | Yokogawa Electric Corp | Interface device, interface method, and interface program |
JP2022079099A (ja) | 2020-11-16 | 2022-05-26 | Canon Inc | Optical scanning device |
Family Cites Families (17)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10416767B2 (en) * | 2003-11-20 | 2019-09-17 | National Institute Of Advanced Industrial Science And Technology | Haptic information presentation system and method |
JP4778591B2 (ja) | 2009-05-21 | 2011-09-21 | Panasonic Corporation | Tactile sensation processing device |
JP2011188921A (ja) | 2010-03-12 | 2011-09-29 | Panasonic Electric Works Co., Ltd. | Vibration imparting device |
US8540571B2 (en) * | 2010-03-31 | 2013-09-24 | Immersion Corporation | System and method for providing haptic stimulus based on position |
US8493354B1 (en) * | 2012-08-23 | 2013-07-23 | Immersion Corporation | Interactivity model for shared feedback on mobile devices |
KR101357183B1 (ko) * | 2012-05-09 | 2014-02-03 | Korea Advanced Institute of Science and Technology | Multi-mode kinesthetic/tactile generator using fluid, design method therefor, damper device and portable apparatus using the generator, resistance force generation method, and recording medium |
US9063570B2 (en) | 2012-06-27 | 2015-06-23 | Immersion Corporation | Haptic feedback control system |
JP6112021B2 (ja) * | 2014-01-09 | 2017-04-12 | Denso Corporation | Input device |
WO2018193917A1 (ja) * | 2017-04-21 | 2018-10-25 | Alps Electric Co., Ltd. | Rotary operation device, control method therefor, and program |
JP7027717B2 (ja) | 2017-08-01 | 2022-03-02 | Casio Computer Co., Ltd. | Reaction force generator and electronic keyboard instrument |
JP6959349B2 (ja) * | 2017-09-29 | 2021-11-02 | Sony Interactive Entertainment Inc. | Operation device and control device therefor |
JP7087367B2 (ja) | 2017-12-08 | 2022-06-21 | FUJIFILM Business Innovation Corp. | Information processing device, program, and control method |
JP2020071674A (ja) | 2018-10-31 | 2020-05-07 | Tokai Rika Co., Ltd. | Tactile presentation device |
WO2020141330A2 (en) | 2019-01-04 | 2020-07-09 | Ultrahaptics Ip Ltd | Mid-air haptic textures |
CN113302579B (zh) * | 2019-01-21 | 2025-01-14 | Sony Group Corporation | Information processing device, information processing method, and program |
JP7239982B2 (ja) | 2019-06-04 | 2023-03-15 | Hiroshima University | Tactile evaluation system, tactile evaluation method, and program |
JP7264779B2 (ja) | 2019-09-10 | 2023-04-25 | Tokai Rika Co., Ltd. | Control device, control method, and program |
-
2022
- 2022-05-19 WO PCT/JP2022/020863 patent/WO2022244849A1/ja active Application Filing
- 2022-05-19 KR KR1020237038834A patent/KR20230169290A/ko active Pending
- 2022-05-19 JP JP2022573686A patent/JP7386363B2/ja active Active
- 2022-05-19 EP EP22804758.5A patent/EP4343492A1/en active Pending
- 2022-12-09 JP JP2022196778A patent/JP7481419B2/ja active Active
- 2022-12-09 JP JP2022196780A patent/JP7559304B2/ja active Active
- 2022-12-09 JP JP2022196779A patent/JP7585575B2/ja active Active
-
2023
- 2023-10-03 US US18/479,880 patent/US20240036652A1/en active Pending
-
2024
- 2024-06-24 JP JP2024101190A patent/JP2024111283A/ja active Pending
Also Published As
Publication number | Publication date |
---|---|
US20240036652A1 (en) | 2024-02-01 |
JP7481419B2 (ja) | 2024-05-10 |
JP2023025707A (ja) | 2023-02-22 |
EP4343492A1 (en) | 2024-03-27 |
JP7386363B2 (ja) | 2023-11-24 |
JP2023017030A (ja) | 2023-02-02 |
JPWO2022244849A1 (ja) | 2022-11-24 |
KR20230169290A (ko) | 2023-12-15 |
JP7559304B2 (ja) | 2024-10-02 |
JP2023017031A (ja) | 2023-02-02 |
JP7585575B2 (ja) | 2024-11-19 |
JP2024111283A (ja) | 2024-08-16 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP7386363B2 (ja) | Sensory control method, sensory control system, conversion model generation method, conversion model generation system, relational expression conversion method, and program | |
US10936072B2 (en) | Haptic information presentation system and method | |
US11385723B2 (en) | Haptic information presentation system and method | |
Pentland | Perceptual user interfaces: perceptual intelligence | |
Caldwell et al. | An integrated tactile/shear feedback array for stimulation of finger mechanoreceptor | |
Srinivasan et al. | Haptics in virtual environments: Taxonomy, research status, and challenges | |
Choi et al. | Vibrotactile display: Perception, technology, and applications | |
US8570291B2 (en) | Tactile processing device | |
Hermann et al. | Sound and meaning in auditory data display | |
US20110148607A1 (en) | System,device and method for providing haptic technology | |
US10481693B2 (en) | Input/output device and method for the computer-based display and exploration of real or virtual object surfaces | |
Popescu et al. | Multimodal interaction modeling | |
Falcao et al. | Applications of haptic devices & virtual reality in consumer products usability evaluation | |
CN117296027A (zh) | Sensory control method, sensory control system, conversion model generation method, conversion model generation system, relational expression conversion method, and program | |
Jafar et al. | Towards the development of kansei haptic sensing technology for robot application–Exploring human haptic emotion | |
CN113176822A (zh) | 虚拟用户检测 | |
Tome | Exoskin: pneumatically augmenting inelastic materials for texture changing interfaces | |
Kahol et al. | Haptic User Interfaces: Design, testing and evaluation of haptic cueing systems to convey shape, material and texture information | |
Martínez et al. | The sense of touch as the last frontier in virtual reality technology | |
Rocchesso et al. | Contact sounds for continuous feedback | |
Oakley | Haptic augmentation of the cursor: Transforming virtual actions into physical actions | |
US12251903B2 (en) | Curved origami-based metamaterial, manufacturing method of the same, curved origami-based haptic module and method for producing active mechanical haptics | |
Balzarotti et al. | Effects of CHAI3D texture rendering parameters on texture perception | |
Bryden et al. | Building artificial personalities: expressive communication channels based on an interlingua for a human-robot dance | |
Aprile et al. | Enaction and Enactive Interfaces: A Handbook of Terms |
Legal Events
Code | Title | Description
---|---|---
ENP | Entry into the national phase | Ref document number: 2022573686; Country of ref document: JP; Kind code of ref document: A
121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 22804758; Country of ref document: EP; Kind code of ref document: A1
ENP | Entry into the national phase | Ref document number: 20237038834; Country of ref document: KR; Kind code of ref document: A
WWE | Wipo information: entry into national phase | Ref document number: 1020237038834; Country of ref document: KR
WWE | Wipo information: entry into national phase | Ref document number: 202280035896.X; Country of ref document: CN
WWE | Wipo information: entry into national phase | Ref document number: 2022804758; Country of ref document: EP
NENP | Non-entry into the national phase | Ref country code: DE
ENP | Entry into the national phase | Ref document number: 2022804758; Country of ref document: EP; Effective date: 20231219