CN113066455A - Music control method, device and readable storage medium - Google Patents
- Publication number
- CN113066455A (application number CN202110262627.4A)
- Authority
- CN
- China
- Prior art keywords
- music
- rhythm
- type
- rhythm type
- preset
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10H—ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
- G10H1/00—Details of electrophonic musical instruments
- G10H1/0008—Associated control or indicating means
- G10H1/0016—Means for indicating which keys, frets or strings are to be actuated, e.g. using lights or leds
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10H—ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
- G10H2210/00—Aspects or methods of musical processing having intrinsic musical character, i.e. involving musical theory or musical parameters or relying on musical knowledge, as applied in electrophonic musical tools or instruments
- G10H2210/031—Musical analysis, i.e. isolation, extraction or identification of musical elements or musical parameters from a raw acoustic signal or from an encoded audio signal
- G10H2210/071—Musical analysis, i.e. isolation, extraction or identification of musical elements or musical parameters from a raw acoustic signal or from an encoded audio signal for rhythm pattern analysis or rhythm style recognition
Landscapes
- Physics & Mathematics (AREA)
- Engineering & Computer Science (AREA)
- Acoustics & Sound (AREA)
- Multimedia (AREA)
- Electrophonic Musical Instruments (AREA)
Abstract
The invention discloses a music control method, a music control device, and a readable storage medium. The method comprises the following steps: acquiring first music type information; detecting a rhythm type playing triggering instruction, and acquiring the rhythm type to be played corresponding to the rhythm type playing triggering instruction; and matching a preset music set based on the rhythm type to be played and the first music type information to obtain a target music segment. Because the target music segment can be obtained through the rhythm type playing triggering instruction alone, the difficulty of playing the target music segment is reduced for the user of the musical instrument, and the learning difficulty of learning how to use the instrument is reduced accordingly.
Description
Technical Field
The present invention relates to the field of music technologies, and in particular to a music control method, a music control device, and a readable storage medium.
Background
With the improvement of living standards, more and more people pursue a higher quality of life, including spiritual enjoyment such as learning to play music. However, existing musical instruments, including electronic musical instruments, require users to have a certain foundation in music theory, such as knowledge of chords, rhythms, and notes, which makes it difficult for users to master these instruments.
For example, a guitar beginner not only needs long hours of diligent practice, but also faces finger pain and the slow process of building calluses when pressing the strings. Moreover, to play and sing at the same time, a guitar beginner must memorize numerous left-hand fingerings, coordinate them with different right-hand plucking and strumming motions, and simultaneously come in with the vocals at the right moments, so reaching that level takes extended training.
In addition, to reduce the difficulty of using an instrument, some existing instruments provide light prompts that tell a beginner which operation to perform next. However, playing by following light prompts rather than by muscle memory limits the fluency of a performance, and the prompts do little to help the user actually learn the instrument.
Existing musical instruments are therefore difficult for their users to use.
Disclosure of Invention
The invention mainly aims to provide a music control method, a music control device, and a readable storage medium, so as to solve the technical problem that users of existing musical instruments face a high learning difficulty when learning how to use them.
In order to achieve the above object, the present invention provides a music control method, including the steps of:
acquiring first music type information;
detecting a rhythm type playing triggering instruction, and acquiring the rhythm type to be played corresponding to the rhythm type playing triggering instruction;
and matching a preset music set based on the rhythm type to be played and the first music type information to obtain a target music segment.
Optionally, the acquiring the rhythm type to be played corresponding to the rhythm type playing triggering instruction includes:
acquiring a rhythm type set based on a preset rhythm type acquisition mode;
binding the rhythm type in the rhythm type set with the key corresponding to the rhythm type playing triggering instruction; the number of the keys is one or more, and the rhythm type playing triggering instructions corresponding to the keys are different.
Optionally, the rhythm type set includes a selected rhythm type set, and the acquiring of the rhythm type set based on a preset rhythm type acquisition mode includes:
outputting an alternative rhythm type set;
and receiving and responding to a selection instruction aiming at the alternative rhythm type set to obtain a selected rhythm type set.
Optionally, the rhythm type set includes an identified rhythm type set, and the acquiring of the rhythm type set based on a preset rhythm type acquisition mode includes:
and identifying a preset music score to be played to obtain an identified rhythm type set.
Optionally, before the outputting of the alternative rhythm type set, the method includes:
acquiring rhythm type use frequency information;
and determining the rhythm types corresponding to the rhythm type use frequency information to obtain the alternative rhythm type set.
Optionally, before the outputting of the alternative rhythm type set, the method includes:
acquiring rhythm type use time point information;
and determining the rhythm types corresponding to the rhythm type use time point information to obtain the alternative rhythm type set.
Optionally, the preset music set is a preset music segment set, and the matching of the preset music set based on the rhythm type to be played and the first music type information to obtain the target music segment includes:
matching the preset music segment set based on the rhythm type to be played, the first music type information and a preset mapping relation to obtain an initial music segment;
and adjusting the initial music segment to obtain the target music segment.
Optionally, the preset music set is a preset note set, and the matching of the preset music set based on the rhythm type to be played and the first music type information to obtain the target music segment includes:
matching the preset note set based on the rhythm type to be played and the first music type information to obtain at least two notes, and combining the notes to obtain an initial music segment, wherein the number of the notes corresponds to the rhythm type to be played;
and adjusting the initial music segment to obtain the target music segment.
Optionally, the adjusting the initial music segment to obtain the target music segment includes:
and acquiring performance parameters, and adjusting the initial music segment based on the performance parameters to obtain the target music segment.
Optionally, the detecting a rhythm type playing triggering instruction and acquiring the rhythm type to be played corresponding to the rhythm type playing triggering instruction includes:
acquiring a user playing mode;
and if the user playing mode is a rhythm type playing mode, executing the step of detecting a rhythm type playing triggering instruction and acquiring the rhythm type to be played corresponding to the rhythm type playing triggering instruction.
Optionally, after the matching of the preset music set based on the rhythm type to be played and the first music type information to obtain the target music segment, the method includes:
detecting a user playing mode switching instruction, and switching the rhythm type playing mode to a playing mode;
detecting a playing triggering instruction, and acquiring second music type information;
and matching the preset note set based on the second music type information to obtain a target note.
Optionally, the performance parameters include a playing speed parameter, a volume parameter and a pitch parameter.
Optionally, the obtaining the first music type information includes:
detecting a key triggering instruction, and acquiring first music type information corresponding to the key triggering instruction; the key triggering instruction is a response instruction corresponding to a single key.
Optionally, after the matching of the preset music set based on the rhythm type to be played and the first music type information to obtain the target music segment, the method includes:
and outputting or playing the target music segment.
Further, to achieve the above object, the present invention also provides a music control apparatus comprising:
the first acquisition module is used for acquiring first music type information;
the second acquisition module is used for detecting a rhythm type playing triggering instruction and acquiring the rhythm type to be played corresponding to the rhythm type playing triggering instruction;
and the first matching module is used for matching a preset music set based on the rhythm type to be played and the first music type information to obtain a target music segment.
Further, to achieve the above object, the present invention also provides a music apparatus including a memory, a processor, and a music control program stored on the memory and executable on the processor, the music control program implementing the steps of the music control method as described above when executed by the processor.
Further, to achieve the above object, the present invention also provides a computer-readable storage medium having stored thereon a music control program that, when executed by a processor, implements the steps of the music control method as described above.
Furthermore, to achieve the above object, the present invention also provides a computer program product comprising a computer program which, when executed by a processor, implements the steps of the music control method as described above.
The invention acquires first music type information; detects a rhythm type playing triggering instruction and acquires the rhythm type to be played corresponding to that instruction; and matches a preset music set based on the rhythm type to be played and the first music type information to obtain a target music segment. Because the target music segment is obtained by matching the rhythm type to be played and the corresponding first music type information against the preset music set, using the instrument reduces to issuing rhythm type playing triggering instructions: the user neither needs to spend a large amount of time practicing, nor to master complicated operations, nor to play by following light prompts. The process of using the instrument is thereby simplified, the difficulty of playing the target music segment is reduced, and the learning difficulty of learning how to use the instrument is reduced accordingly.
Drawings
FIG. 1 is a flow chart of a music control method according to a first embodiment of the present invention;
FIG. 2 is a flow chart of a music control method according to a second embodiment of the present invention;
FIG. 3 is a functional block diagram of a music control device according to a preferred embodiment of the present invention;
fig. 4 is a schematic structural diagram of a hardware operating environment according to an embodiment of the present invention.
The implementation, functional features and advantages of the objects of the present invention will be further explained with reference to the accompanying drawings.
Detailed Description
It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
Referring to fig. 1, fig. 1 is a flowchart illustrating a music control method according to a first embodiment of the present invention.
Although a logical order is shown in the flowchart, in some cases the steps shown or described may be performed in a different order. The music control method can be applied to musical instruments, including guitars. For convenience of description, the subject that executes the steps of the music control method is omitted below. The music control method includes the following steps:
in step S110, first music type information is acquired.
Specifically, first music type information is acquired. It should be noted that the first music type information is used to determine the type of the chord, and there is a corresponding relationship between the first music type information and the chord, or the first music type information is the chord.
The acquiring of the first music type information includes:
step a, detecting a key triggering instruction, and acquiring first music type information corresponding to the key triggering instruction; the key triggering instruction is a response instruction corresponding to a single key.
Specifically, a key triggering instruction is detected, and the first music type information corresponding to the key triggering instruction is acquired. The key triggering instruction is a response instruction corresponding to a single key; that is, there is a one-to-one correspondence between keys and first music type information, with one key corresponding to one chord. For example, a guitar may have 7 keys corresponding to the seven major triads: C, D, E, F, G, A and B.
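As an illustrative sketch of the one-to-one key-to-chord correspondence described above (the key identifiers and the dictionary representation are assumptions; the embodiment does not prescribe a data structure):

```python
# Hypothetical one-to-one binding between the 7 guitar keys and the seven
# major triads; key names and the dict-based lookup are illustrative
# assumptions, not the patent's actual implementation.
KEY_TO_CHORD = {
    "key_1": "C", "key_2": "D", "key_3": "E", "key_4": "F",
    "key_5": "G", "key_6": "A", "key_7": "B",
}

def first_music_type_info(key_id: str) -> str:
    """Return the chord (first music type information) bound to a single key."""
    return KEY_TO_CHORD[key_id]
```

A key triggering instruction then reduces to a single dictionary lookup, e.g. `first_music_type_info("key_5")` yields the chord G.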
Step S120, detecting a rhythm type playing triggering instruction, and acquiring the rhythm type to be played corresponding to the rhythm type playing triggering instruction.
Specifically, when playing, the user of the guitar inputs not only key triggering instructions but also rhythm type playing triggering instructions. When a rhythm type playing triggering instruction is detected, the rhythm type to be played corresponding to that instruction is acquired. The rhythm type to be played determines the corresponding music segment, and a music segment is composed of one or more notes of different durations. It should be noted that a characteristic rhythm that appears repeatedly within a piece of music is called a rhythm type, and the rhythm type to be played is the rhythm type the user employs during the performance.
Before the detecting of a rhythm type playing triggering instruction and the acquiring of the rhythm type to be played corresponding to the rhythm type playing triggering instruction, the method includes:
step b, acquiring a user playing mode;
And c, if the user playing mode is a rhythm type playing mode, executing the step of detecting a rhythm type playing triggering instruction and acquiring the rhythm type to be played corresponding to the rhythm type playing triggering instruction.
Specifically, a user playing mode is acquired. If the user playing mode is a rhythm type playing mode, the rhythm type to be played corresponding to the rhythm type playing triggering instruction is acquired; if it is not, this operation is not executed and other operations are performed instead. It will be appreciated that the guitar may offer several user playing modes, i.e., there may be other user playing modes in addition to the rhythm type playing mode.
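The mode check above can be sketched as a simple gate (the mode names, instruction identifiers, and binding table are assumed for illustration):

```python
# Only the rhythm type playing mode handles rhythm type playing triggering
# instructions; any other user playing mode falls through to other handlers.
def handle_trigger(user_mode, instruction, bindings):
    if user_mode != "rhythm_type_playing":
        return None  # not this branch; other operations are executed instead
    return bindings.get(instruction)  # the rhythm type to be played

rt = handle_trigger("rhythm_type_playing", "key_1", {"key_1": "rhythm_A"})
```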
The obtaining of the rhythm type to be played corresponding to the rhythm type playing triggering instruction includes:
and d, acquiring a rhythm type set based on a preset rhythm type acquisition mode.
Specifically, a rhythm type set is obtained based on a preset rhythm type obtaining mode, wherein the rhythm type set is composed of one or more rhythm types. It should be noted that the rhythm type set corresponds to a music score to be played, one or more rhythm types generally exist in the music score to be played, and the rhythm type in the rhythm type set is one or more rhythm types in the music score to be played.
In addition to acquiring the rhythm type set through the preset rhythm type acquisition mode, the set can also be acquired through the guitar user's voice input; that is, the user can directly issue a voice instruction specifying the rhythm type set to the guitar. For example, after receiving the user's voice command "I need to use rhythm type A and rhythm type B", the guitar directly takes rhythm type A and rhythm type B as the rhythm type set.
Step e, binding the rhythm type in the rhythm type set with the key corresponding to the rhythm type playing triggering instruction; the number of the keys is one or more, and the rhythm type playing triggering instructions corresponding to the keys are different.
Specifically, the rhythm types in the rhythm type set are bound to the keys corresponding to the rhythm type playing triggering instructions; there are one or more keys, and the rhythm type playing triggering instructions corresponding to different keys are different. It should be noted that a rhythm type playing triggering instruction is triggered when the user presses a button arranged on the guitar; alternatively, a touch screen can be arranged on the guitar, and the instruction can be triggered by the user touching the corresponding area of the touch screen.
It should be noted that the user of the guitar needs to designate the rhythm types corresponding to the above keys before playing. It can be understood that a song generally has more than one rhythm type; for example, the verse and the chorus of a song often use different rhythm types. Since the music score to be played may thus contain several rhythm types, the rhythm type used may change during a performance, and the user needs a way to switch between rhythm types.
For example, after the user of the guitar has played the verse with rhythm type A, the user inputs a rhythm type switching instruction to the guitar; upon receiving the instruction, the guitar switches from rhythm type A to one of the other rhythm types in the rhythm type set.
It should be noted that pressing different keys generates different rhythm type playing triggering instructions, and different instructions correspond to different rhythm types. For example, suppose the keys include key 1, key 2 and key 3, corresponding to rhythm type A, rhythm type B and rhythm type C respectively. When the user presses key 2, the rhythm type used in the current performance is rhythm type B; when the user presses key 1, it is rhythm type A.
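The key-to-rhythm-type binding of steps d and e might be sketched as follows (the key and rhythm type names are assumptions):

```python
# Bind each key to one rhythm type from the acquired rhythm type set, so that
# each key issues a distinct rhythm type playing triggering instruction.
def bind_rhythm_types(keys, rhythm_type_set):
    if len(keys) > len(rhythm_type_set):
        raise ValueError("not enough rhythm types for the given keys")
    return dict(zip(keys, rhythm_type_set))

bindings = bind_rhythm_types(["key_1", "key_2", "key_3"],
                             ["rhythm_A", "rhythm_B", "rhythm_C"])
# Pressing key_2 then selects rhythm_B for the current performance.
```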
It should be noted that even when the user plays with the same rhythm type, the final melody may differ, because the same rhythm type may correspond to different melodies. This is realized by switching chords, where a chord is a group of tones having a certain interval relationship.
The acquiring of the rhythm type set based on the preset rhythm type acquiring mode includes:
and d1, outputting the alternative rhythm type set.
Specifically, an alternative rhythm type set is output. The alternative rhythm type set should cover the currently existing rhythm types as completely as possible, so as to meet the user's playing requirements and to avoid the situation in which a rhythm type the user needs does not exist in the alternative rhythm type set.
It should be noted that the alternative rhythm type set corresponds to a rhythm type library, which may be provided by the guitar itself or by the cloud. For a library provided by the guitar, the rhythm types can be updated regularly or irregularly to keep the library comprehensive; however, because the guitar's storage capacity is limited, full comprehensiveness is hard to guarantee, and the rhythm types inevitably need to be stored selectively. For example, rhythm types with a high probability of use are stored extensively, while rhythm types with a low probability of use are stored sparsely or not at all. For a library provided by the cloud, the storage capacity far exceeds that of the guitar, so comprehensiveness can be better ensured, and the library can be updated in real time to maintain it.
As for the output mode, the alternative rhythm type set can be output as a list. When the list is displayed on a screen, there is a display order among the rhythm types: for example, if rhythm type A is ordered before rhythm type B, rhythm type A is displayed first and rhythm type B later. Since there is an upper limit to the number of rhythm types the on-screen list can show, the display order can be set so as to make it easier for the user to select rhythm types from the alternative rhythm type set.
And d2, receiving and responding to a selection instruction for the alternative rhythm type set to obtain a selected rhythm type set.
Specifically, a selection instruction for the alternative rhythm type set is received and responded to, and a selected rhythm type set is obtained. The alternative rhythm type set may be output through a display screen (touch or non-touch) carried by the guitar, and the user inputs a selection instruction through that screen before playing. For example, if the rhythm types corresponding to the selection instruction are rhythm type A and rhythm type B, then in response to the instruction, rhythm type A and rhythm type B are taken as the selected rhythm type set. Furthermore, the selection instruction can also be received through a terminal connected to the guitar; for example, the user enters the selection instruction on a mobile phone connected to the guitar, and the phone sends it to the guitar.
The rhythm type set includes an identified rhythm type set, and the acquiring of the rhythm type set based on a preset rhythm type acquisition mode includes:
and d3, identifying the preset music score to be played to obtain an identified rhythm type set.
Specifically, a preset music score to be played is identified to obtain an identified rhythm type set. The preset music score to be played can be transmitted by other equipment or selected by the user from the guitar's music score library. The other equipment can be a storage device such as a hard disk, a floppy disk, a card reader or a USB flash drive, or a terminal such as a smartphone, a tablet computer or a portable computer. Taking a smartphone as an example, after the guitar is connected to the smartphone, the user may open the score library on the phone (specifically, through a corresponding application) and then select the preset music score to be played from the library.
It should be noted that, similar to the alternative rhythm type set above, the music score library may also be output as a list. When the list is displayed on a screen, there is a display order among the scores: for example, if score A is ordered before score B, score A is displayed first and score B later. Since there is an upper limit to the number of scores the on-screen list can show, the display order can be set so as to make it easier for the user to select scores from the library.
In addition, the music score is often on paper; for example, the preset music score to be played may be printed in a book. In that case it is difficult to obtain the score itself directly, but obtaining just the rhythm types it contains is comparatively easy. For example, a camera arranged on the guitar photographs the part of the score containing the rhythm types to obtain a rhythm type image, and the rhythm type set is identified from that image: if rhythm type A and rhythm type B appear in the image, identifying the image yields an identified rhythm type set composed of rhythm type A and rhythm type B.
Besides the camera on the guitar, the photograph can also be taken by an external camera or by other camera-equipped equipment connected to the guitar (including smartphones, tablet computers, portable computers, and the like).
Before the outputting of the alternative rhythm type set, the method includes:
and d4, acquiring the rhythm type use frequency information.
Specifically, rhythm type use frequency information is acquired. The rhythm type use frequency information is determined by how many times the user used each rhythm type within a certain period. For example, if the user used rhythm type A 7 times in a week, the use frequency of rhythm type A is 7 times per week; if the user used rhythm type B 5 times in a week, the use frequency of rhythm type B is 5 times per week.
And d5, determining the rhythm types corresponding to the rhythm type use frequency information to obtain the alternative rhythm type set.
Specifically, the rhythm types corresponding to the rhythm type use frequency information are determined to obtain the alternative rhythm type set. It can be understood that the more frequently a rhythm type is used, the higher the probability that the user will use it while playing; for example, a user practicing a tune is likely to use the same rhythm type repeatedly. Therefore, to make it easier for the user to obtain the selected rhythm type set through a selection instruction, the use frequency of each rhythm type in the rhythm type library can be determined first to set the display order of the alternative rhythm type set: rhythm types with high use frequency are displayed first, and those with low use frequency later.
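The frequency-based display ordering described above reduces to a sort by use count (the per-week counts here are invented for illustration):

```python
# Order the alternative rhythm type set so that more frequently used rhythm
# types are displayed first; the counts are hypothetical.
usage_per_week = {"rhythm_A": 7, "rhythm_B": 5, "rhythm_C": 2}

def order_by_frequency(usage):
    # Higher use frequency -> earlier display position.
    return sorted(usage, key=usage.get, reverse=True)
```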
Before the outputting of the alternative rhythm type set, the method includes:
and d6, acquiring the rhythm use time point information.
Specifically, rhythm type use time point information is acquired. The rhythm type use time point information is determined by the most recent time the user used each rhythm type within a certain period. For example, if the user used rhythm type A 10 minutes ago, the use time point of rhythm type A is 10 minutes ago; if the user used rhythm type B 20 minutes ago, the use time point of rhythm type B is 20 minutes ago.
It should be noted that a rhythm type may have been used several times; for example, the user may have used rhythm type A 10 minutes ago, 3 days ago and one month ago. Only the most recent use is considered when determining the use time point information, i.e., the use time point of rhythm type A is 10 minutes ago, not 3 days or one month ago.
And d7, determining the rhythm types corresponding to the rhythm type use time point information to obtain the alternative rhythm type set.
Specifically, the rhythm types corresponding to the rhythm type use time point information are determined to obtain the alternative rhythm type set. It can be understood that the closer a rhythm type's use time point is to the current time, the higher the probability that the user will use it while playing; for example, a user practicing a tune is likely to reuse the rhythm type used most recently. Therefore, to make it easier for the user to obtain the selected rhythm type set through a selection instruction, the use time point of each rhythm type in the rhythm type library can be determined first to set the display order of the alternative rhythm type set: the closer a rhythm type's use time point is to the current time, the earlier it is displayed.
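The recency-based ordering can be sketched the same way (the minutes-ago values are invented; per the note above, only the most recent use of each rhythm type is recorded):

```python
# Order the alternative rhythm type set so that rhythm types used closer to
# the current time are displayed earlier; smaller "minutes ago" sorts first.
minutes_since_last_use = {"rhythm_A": 10, "rhythm_B": 20, "rhythm_C": 60 * 24}

def order_by_recency(last_use):
    return sorted(last_use, key=last_use.get)
```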
Step S130, matching a preset music set based on the rhythm type to be played and the first music type information to obtain a target music fragment.
Specifically, a preset music set is matched based on the rhythm type to be played and the first music type information to obtain a target music piece. The preset music set is composed either of music pieces or of elements (such as notes) from which music pieces can be composed.
When the preset music set is a preset music piece set, matching the preset music set based on the rhythm type to be played and the first music type information to obtain the target music piece includes the following steps:
f, matching a preset music fragment set based on the rhythm type to be played, the first music type information and a preset mapping relation to obtain an initial music fragment;
and g, adjusting the initial music segments to obtain the target music segments.
Specifically, a preset music piece set is matched based on the rhythm type to be played, the first music type information and a preset mapping relation to obtain an initial music piece; the initial music piece is then adjusted to obtain the target music piece. The preset mapping relationship is a correspondence between, on the one hand, the rhythm type together with the music type information and, on the other hand, a music piece in the preset music piece set; in one embodiment, the preset mapping relationship can be expressed as a lookup table.
For example, if the first music type is C and the rhythm type to be played is rhythm type 1, the initial music piece is piece 1.
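The lookup can be sketched as a small table keyed by (music type, rhythm type) pairs; only the (C, rhythm type 1) → piece 1 pair comes from the text above, and the other entries are invented placeholders:

```python
# Hypothetical preset mapping relationship:
# (music type, rhythm type) -> music piece identifier.
PRESET_MAPPING = {
    ("C", "rhythm type 1"): "piece 1",  # the example given in the text
    ("C", "rhythm type 2"): "piece 2",  # placeholder entry
    ("D", "rhythm type 1"): "piece 3",  # placeholder entry
}

def match_initial_piece(music_type, rhythm_type):
    """Return the initial music piece for the pair, or None if the
    preset mapping relationship has no entry for it."""
    return PRESET_MAPPING.get((music_type, rhythm_type))
```
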
When the preset music set is a preset note set, matching the preset music set based on the rhythm type to be played and the first music type information to obtain the target music piece includes the following steps:
h, matching the preset note set based on the rhythm type to be played and the first music type information to obtain at least two notes, and combining the notes to obtain an initial music segment; the number of the notes corresponds to the rhythm type to be played;
and i, adjusting the initial music segment to obtain the target music segment.
Specifically, the preset note set is matched based on the rhythm type to be played and the first music type information to obtain at least two notes, and the notes are combined to obtain an initial music segment; the initial music segment is then adjusted to obtain the target music segment. The number of notes corresponds to the rhythm type to be played: for example, if the rhythm type to be played is rhythm type A, the corresponding number of notes is 5; if it is rhythm type B, the corresponding number is 6. The notes obtained may be the same as or different from one another.
It should be noted that the matching process selects, from the preset note set, the notes corresponding to the rhythm type to be played and the first music type information. This may be done by selecting at least two notes, according to the rhythm type to be played, from the notes corresponding to the first music type information (for example, 6 notes for a guitar), and then combining those notes according to the rhythm type to be played to obtain the initial music segment.
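A sketch of this selection-and-combination step, with hypothetical note pools and per-rhythm note counts (the counts 5 and 6 echo the rhythm type A/B example above; the specific note names are assumptions):

```python
# Hypothetical data: notes available per music type (e.g. six guitar
# notes for a given key) and the note count each rhythm type requires.
NOTES_BY_MUSIC_TYPE = {"C": ["C3", "E3", "G3", "C4", "E4", "G4"]}
NOTE_COUNT_BY_RHYTHM = {"rhythm type A": 5, "rhythm type B": 6}

def build_initial_segment(rhythm_type, music_type):
    """Select the required number of notes for the rhythm type from the
    pool for the music type and combine them into an initial segment."""
    notes = NOTES_BY_MUSIC_TYPE[music_type]
    count = NOTE_COUNT_BY_RHYTHM[rhythm_type]
    # Cycle through the pool, so notes may repeat if the rhythm type
    # needs more notes than the music type provides distinct ones.
    return [notes[i % len(notes)] for i in range(count)]
```
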
The adjusting the initial music piece to obtain the target music piece includes:
and j, acquiring performance parameters, and adjusting the initial music segment based on the performance parameters to obtain the target music segment.
Specifically, performance parameters are obtained, and the initial music segment is adjusted based on the performance parameters to obtain the target music segment. It can be understood that the combined notes are not necessarily suitable for playback as-is, so the initial music segment needs to be adaptively adjusted after it is obtained.
The performance parameters comprise a playback speed parameter, a volume parameter, and a key transposition parameter.
Specifically, the performance parameters include a playback speed parameter, a volume parameter, and a key transposition parameter. The playback speed parameter affects the total duration of the music segment and can be expressed as a multiplier of that duration: for example, if the playing duration of the target music segment before adjustment is 5 seconds and the playback speed parameter is 0.8, the adjusted playing duration is the product of 5 and 0.8, i.e., 4 seconds. The volume parameter determines how loudly the target music segment is played through the guitar's own player or through a playback device externally connected to the guitar. The key transposition parameter can transpose the target music segment, raising or lowering its key.
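The three adjustments can be sketched as follows; the function signature and the MIDI-number note representation are assumptions, while the 5 s × 0.8 = 4 s speed example comes from the text:

```python
def adjust_segment(duration_s, midi_notes, speed=1.0, volume=1.0, transpose=0):
    """Adjust an initial segment with the three performance parameters.

    speed scales the total playing duration (a speed parameter of 0.8
    turns a 5-second segment into a 4-second one, as in the example);
    volume scales playback loudness; transpose shifts every note up or
    down by the given number of semitones, raising or lowering the key.
    """
    adjusted_duration = duration_s * speed
    adjusted_notes = [n + transpose for n in midi_notes]
    return adjusted_duration, volume, adjusted_notes
```
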
After the preset music set is matched based on the rhythm type to be played and the first music type information to obtain the target music piece, the method includes:
and k, outputting or playing the target music fragment.
Specifically, when playing the guitar, the guitar user needs to convey the playing result to the listener as sound. This can be done in two ways: outputting the target music segment to a playback device (such as an audio device) externally connected to the guitar, so that the target music segment is played to the listener through that external device; or playing the target music segment to the listener directly through the guitar's own playback device.
In this embodiment, first music type information is obtained; a rhythmic playing triggering instruction is detected, and the rhythm type to be played corresponding to that instruction is acquired; and a preset music set is matched based on the rhythm type to be played and the first music type information to obtain a target music piece. By obtaining the rhythm type to be played that corresponds to the rhythmic playing triggering instruction and matching it, together with the first music type information, against the preset music set, using the instrument is reduced to issuing rhythmic playing triggering instructions. The user does not need to spend a large amount of time practising, master complicated operations, or rely on light prompts in order to use the instrument: a target music piece can be obtained simply through a rhythmic playing triggering instruction. This reduces the difficulty for the instrument's user of playing the target music piece and, in turn, the difficulty of learning how to use the instrument.
Further, referring to fig. 2, a second embodiment is proposed on the basis of the first embodiment of the music control method of the present invention, in which, after the preset music set is matched based on the to-be-played rhythm type and the music type information to obtain the target music piece, the method includes:
and step S240, detecting a user playing mode switching instruction, and switching the rhythm playing mode into a playing mode.
Specifically, the user performance mode switching instruction is detected, and the rhythm type performance mode is switched to the playing performance mode. It can be understood that the playing performance mode simulates the six-string performance mode of a real guitar: in this mode, the guitar user can play via six real strings or via keys simulating six strings.
Step S250, detecting a playing performance triggering instruction, and acquiring second music type information.
Specifically, the playing performance triggering instruction is detected, and the second music type information is acquired; the second music type information may be the same as or different from the first music type information. It can be understood that the playing performance triggering instruction is triggered by the guitar user through the six real strings or the keys simulating the six strings.
And step S260, matching the preset note set based on the second music type information to obtain a target note.
Specifically, the preset note set is matched based on the second music type information to obtain the target note. It can be understood that each time one of the six real strings, or one of the keys simulating the six strings, is triggered, a playing performance triggering instruction is issued and the corresponding note is obtained.
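A sketch of the per-trigger note lookup in the playing performance mode; the note set shown assumes standard guitar tuning for an illustrative music type "C", which the text does not specify:

```python
# Hypothetical preset note set: one note per string for a music type.
# Strings 1-6 here follow standard guitar tuning (high E to low E).
PRESET_NOTE_SET = {
    "C": {1: "E4", 2: "B3", 3: "G3", 4: "D3", 5: "A2", 6: "E2"},
}

def target_note(music_type, string_number):
    """Each trigger of a real string (or a key simulating a string)
    yields the corresponding note for the current music type."""
    return PRESET_NOTE_SET[music_type][string_number]
```
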
In this embodiment, the rhythm type performance mode is switched to the playing performance mode by detecting a user performance mode switching instruction; a playing performance triggering instruction is detected and second music type information is acquired; and the preset note set is matched based on the second music type information to obtain the target note. This realizes switching between user performance modes and provides the guitar user with a playing performance mode that approximates the experience of a real guitar, improving the user experience.
In addition, the present invention also provides a music control apparatus, as shown in fig. 3, comprising:
a first obtaining module 10, configured to obtain first music type information;
the second obtaining module 20 is configured to detect a rhythmic playing trigger instruction, and obtain a to-be-played rhythmic type corresponding to the rhythmic playing trigger instruction;
and the first matching module 30 is configured to match a preset music set based on the to-be-played rhythm type and the first music type information to obtain a target music piece.
Further, the music control apparatus further includes:
the third acquisition module is used for acquiring a rhythm type set based on a preset rhythm type acquisition mode;
and the binding module is used for binding the rhythm type in the rhythm type set with the key corresponding to the rhythm type playing triggering instruction.
Optionally, the rhythm type set includes a selected rhythm type set, and the third obtaining module includes:
an output unit, configured to output an alternative rhythm type set;
and the receiving and responding unit is used for receiving and responding to the selection instruction aiming at the alternative rhythm type set to obtain a selected rhythm type set.
Optionally, the rhythm type set includes an identified rhythm type set, and the third obtaining module includes:
and the identification unit is used for identifying the preset music score to be played to obtain an identified rhythm type set.
Optionally, the third obtaining module includes:
a first acquisition unit configured to acquire rhythm-type use frequency information;
and the first determining unit is used for determining the rhythm type corresponding to the rhythm type use frequency information to obtain a candidate rhythm type set.
Optionally, the third obtaining module includes:
a second acquisition unit configured to acquire rhythm-type use time point information;
and the second determining unit is used for determining the rhythm type corresponding to the rhythm type use time point information to obtain an alternative rhythm type set.
Optionally, the preset music set is a preset music piece set, and the first matching module 30 includes:
the first matching unit is used for matching a preset music fragment set based on the rhythm type to be played, the first music type information and a preset mapping relation to obtain an initial music fragment;
and the first adjusting unit is used for adjusting the initial music piece to obtain the target music piece.
Optionally, the preset music set is a preset note set, and the first matching module 30 includes:
the second matching unit is used for matching the preset note set based on the rhythm type to be played and the first music type information to obtain at least two notes;
the combination unit is used for combining the notes to obtain an initial music segment;
and the second adjusting unit is used for adjusting the initial music piece to obtain the target music piece.
Optionally, the first adjusting unit or the second adjusting unit includes:
an acquisition subunit configured to acquire performance parameters;
and the adjusting subunit is used for adjusting the initial music segment based on the performance parameters to obtain the target music segment.
Optionally, the music control apparatus further comprises:
a fourth acquiring module, configured to acquire a user playing mode; and, if the user playing mode is the rhythmic playing mode, to execute the step of detecting the rhythmic playing triggering instruction and acquiring the to-be-played rhythm type corresponding to the rhythmic playing triggering instruction.
Optionally, the music control apparatus further comprises:
the switching module is used for detecting a playing mode switching instruction of a user and switching the rhythm playing mode into a playing mode;
the fifth acquisition module is used for detecting a playing triggering instruction and acquiring second music type information;
and the second matching module is used for matching the preset note set based on the second music type information to obtain a target note.
Optionally, the first obtaining module 10 includes:
and the third acquisition unit is used for detecting a key triggering instruction and acquiring the first music type information corresponding to the key triggering instruction.
Optionally, the music control apparatus further comprises:
an output module for outputting the target music piece; or,
and the playing module is used for playing the target music fragment.
The specific implementation of the music control apparatus of the present invention is substantially the same as that of the above-mentioned embodiments of the music control method, and will not be described herein again.
In addition, the present invention also provides a music device. As shown in fig. 4, fig. 4 is a schematic structural diagram of the hardware operating environment of the music device according to an embodiment of the present invention.
As shown in fig. 4, the music device may include: a processor 1001 (such as a CPU), a memory 1005, a user interface 1003, a network interface 1004, and a communication bus 1002. The communication bus 1002 is used to realize connection and communication between these components. The user interface 1003 may include a display screen (Display) and an input unit such as a keyboard (Keyboard), and may optionally also include a standard wired interface and a wireless interface. The network interface 1004 may optionally include a standard wired interface and a wireless interface (e.g., a WI-FI interface). The memory 1005 may be a high-speed RAM memory or a non-volatile memory (e.g., a magnetic disk memory), and may optionally be a storage device separate from the processor 1001.
Optionally, the music device may further include RF (Radio Frequency) circuits, sensors, audio circuits, a WiFi module, and the like.
Those skilled in the art will appreciate that the music device structure shown in fig. 4 does not constitute a limitation of the music device, and that the device may include more or fewer components than those shown, combine certain components, or arrange the components differently.
As shown in fig. 4, a memory 1005, which is a kind of computer storage medium, may include therein an operating system, a network communication module, a user interface module, and a music control program. Among them, the operating system is a program that manages and controls hardware and software resources of the music apparatus, and supports the operation of the music control program and other software or programs.
In the music device shown in fig. 4, the user interface 1003 is mainly used for connecting to a terminal and exchanging data with the terminal, for example receiving a rhythm type set transmitted from the terminal; the network interface 1004 is mainly used for connecting to a background server and exchanging data with the background server; and the processor 1001 may be configured to call the music control program stored in the memory 1005 and execute the steps of the music control method described above.
The specific implementation of the music device of the present invention is substantially the same as the embodiments of the music control method described above, and will not be described herein again.
Furthermore, an embodiment of the present invention further provides a computer-readable storage medium, where a music control program is stored, and the music control program, when executed by a processor, implements the steps of the music control method as described above.
The specific implementation of the computer-readable storage medium of the present invention is substantially the same as the embodiments of the music control method described above, and is not described herein again.
Furthermore, an embodiment of the present invention further provides a computer program product, which includes a computer program, and when the computer program is executed by a processor, the steps of the music control method described above are implemented.
The specific implementation of the computer program product of the present invention is substantially the same as the embodiments of the music control method, and will not be described herein again.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element introduced by the phrase "comprising a ..." does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.
The above-mentioned serial numbers of the embodiments of the present invention are merely for description and do not represent the merits of the embodiments.
Through the above description of the embodiments, those skilled in the art will clearly understand that the methods of the above embodiments can be implemented by software plus a necessary general-purpose hardware platform, and certainly also by hardware, though in many cases the former is the better implementation. Based on this understanding, the technical solution of the present invention may be embodied in the form of a software product stored in a storage medium (such as ROM/RAM, a magnetic disk, or an optical disc), which includes instructions for causing a terminal device (such as a mobile phone, computer, server, or network device) to execute the method according to the embodiments of the present invention.
The above description is only a preferred embodiment of the present invention, and not intended to limit the scope of the present invention, and all modifications of equivalent structures and equivalent processes, which are made by using the contents of the present specification and the accompanying drawings, or directly or indirectly applied to other related technical fields, are included in the scope of the present invention.
Claims (16)
1. A music control method, characterized by comprising the steps of:
acquiring first music type information;
detecting a rhythmic playing triggering instruction, and acquiring a to-be-played rhythmic type corresponding to the rhythmic playing triggering instruction;
and matching a preset music set based on the rhythm type to be played and the first music type information to obtain a target music fragment.
2. The music control method according to claim 1, wherein the detecting of the rhythmic playing triggering instruction and the acquiring of the to-be-played rhythm type corresponding to the rhythmic playing triggering instruction comprise:
acquiring a rhythm type set based on a preset rhythm type acquisition mode;
binding the rhythm type in the rhythm type set with the key corresponding to the rhythm type playing triggering instruction; the number of the keys is one or more, and the rhythm type playing triggering instructions corresponding to the keys are different.
3. The music control method according to claim 2, wherein the rhythm type set includes a selected rhythm type set, and the obtaining of the rhythm type set based on a preset rhythm type obtaining manner includes:
outputting an alternative rhythm type set;
and receiving and responding to a selection instruction aiming at the alternative rhythm type set to obtain a selected rhythm type set.
4. The music control method according to claim 2, wherein the rhythm type set includes a set of recognized rhythm types, and the obtaining of the rhythm type set based on a preset rhythm type obtaining manner includes:
and identifying a preset music score to be played to obtain an identified rhythm type set.
5. The music control method of claim 3, wherein outputting the set of alternative tempo types is preceded by:
acquiring rhythm type use frequency information;
and determining the rhythm type corresponding to the rhythm type use frequency information to obtain a candidate rhythm type set.
6. The music control method of claim 3, wherein outputting the set of alternative tempo types is preceded by:
acquiring rhythm type use time point information;
and determining the rhythm type corresponding to the rhythm type use time point information to obtain a candidate rhythm type set.
7. The music control method of claim 1, wherein the preset music collection is a preset music piece collection, and the matching of the preset music collection based on the rhythm type to be performed and the first music type information to obtain the target music piece comprises:
matching a preset music fragment set based on the rhythm type to be played, the first music type information and a preset mapping relation to obtain an initial music fragment;
and adjusting the initial music segment to obtain the target music segment.
8. The music control method of claim 1, wherein the preset music set is a preset note set, and the matching of the preset music set based on the rhythm type to be played and the first music type information to obtain the target music piece comprises:
matching the preset note set based on the rhythm type to be played and the first music type information to obtain at least two notes, and combining the notes to obtain an initial music segment; the number of the notes corresponds to the rhythm type to be played;
and adjusting the initial music segment to obtain the target music segment.
9. The music control method according to claim 7 or 8, wherein said adjusting the initial music piece to obtain the target music piece comprises:
and acquiring performance parameters, and adjusting the initial music segment based on the performance parameters to obtain the target music segment.
10. The music control method according to claim 8, wherein the detecting of the rhythmic playing triggering instruction and the acquiring of the to-be-played rhythm type corresponding to the rhythmic playing triggering instruction include:
acquiring a user playing mode;
and if the user playing mode is a rhythmic playing mode, executing the step of detecting the rhythmic playing triggering instruction, and acquiring the to-be-played rhythm type corresponding to the rhythmic playing triggering instruction.
11. The music control method according to claim 10, wherein, after the preset music set is matched based on the to-be-played rhythm type and the music type information to obtain the target music piece, the method comprises:
detecting a user playing mode switching instruction, and switching the rhythm playing mode into a playing mode;
detecting a playing triggering instruction, and acquiring second music type information;
and matching the preset note set based on the second music type information to obtain a target note.
12. The music control method of claim 9, wherein the performance parameters include a play speed parameter, a volume level parameter, and a key change parameter.
13. The music control method of claim 1, wherein said obtaining first music type information comprises:
detecting a key triggering instruction, and acquiring first music type information corresponding to the key triggering instruction; the key triggering instruction is a response instruction corresponding to a single key.
14. The music control method according to claim 1, wherein, after matching a preset music collection based on the to-be-played tempo type and the first music type information to obtain a target music piece, the method comprises:
and outputting or playing the target music segment.
15. A musical apparatus comprising a memory, a processor, and a music control program stored on the memory and executable on the processor, the music control program when executed by the processor implementing the steps of the music control method as claimed in any one of claims 1 to 14.
16. A computer-readable storage medium, characterized in that the computer-readable storage medium has stored thereon a music control program that, when executed by a processor, implements the steps of the music control method according to any one of claims 1 to 14.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110262627.4A CN113066455A (en) | 2021-03-09 | 2021-03-09 | Music control method, device and readable storage medium |
Publications (1)
Publication Number | Publication Date |
---|---|
CN113066455A true CN113066455A (en) | 2021-07-02 |
Family
ID=76560470
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110262627.4A Pending CN113066455A (en) | 2021-03-09 | 2021-03-09 | Music control method, device and readable storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN113066455A (en) |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4072078A (en) * | 1976-04-19 | 1978-02-07 | C.G. Conn, Ltd. | System for automatically producing tone patterns |
CN101551996A (en) * | 2009-05-11 | 2009-10-07 | 曾平蔚 | Electronic plucked instrument and pronunciation control method thereof |
CN101996624A (en) * | 2010-11-24 | 2011-03-30 | 曾科 | Method for performing chord figure and rhythm figure by monochord of electric guitar |
US20160343362A1 (en) * | 2015-05-19 | 2016-11-24 | Harmonix Music Systems, Inc. | Improvised guitar simulation |
CN111680185A (en) * | 2020-05-29 | 2020-09-18 | 平安科技(深圳)有限公司 | Music generation method, music generation device, electronic device and storage medium |
CN214752917U (en) * | 2021-03-09 | 2021-11-16 | 未知星球科技(东莞)有限公司 | Electronic musical instrument |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||