
HK1013460B - Object image display device and method - Google Patents


Info

Publication number
HK1013460B
HK1013460B (application HK98114697.0A)
Authority
HK
Hong Kong
Prior art keywords
image
display control
images
data
age
Prior art date
Application number
HK98114697.0A
Other languages
German (de)
French (fr)
Chinese (zh)
Other versions
HK1013460A1 (en)
Inventor
Murata Yoshiyuki
Original Assignee
Casio Computer Co., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from JP35831492A external-priority patent/JP3163812B2/en
Priority claimed from JP4359848A external-priority patent/JP2796658B2/en
Priority claimed from JP36016792A external-priority patent/JP3092368B2/en
Application filed by Casio Computer Co., Ltd. filed Critical Casio Computer Co., Ltd.
Publication of HK1013460A1 publication Critical patent/HK1013460A1/en
Publication of HK1013460B publication Critical patent/HK1013460B/en


Description

OBJECT IMAGE DISPLAY DEVICES
The present invention relates to an image display control device and method for displaying images of objects such as an animal or a building.
Conventionally, a so-called object image creating device is known which creates an image of an object such as a face in a manner similar to the creation of a montage photograph. This device is provided with a part pattern memory which stores a plurality of data items on part patterns of respective parts of a face, such as a "contour", "hair style", "eyes", "nose", "mouth", and "brows". It is also provided with a switch which is operated to designate the respective parts of an image, and a switch which is operated to designate desired ones of the kinds of part patterns related to the parts designated by the part designating switch when the object image is created.
In this case, in order to create any object image, first, the part designating switch is operated to designate parts of the object image. Desired part patterns are selected from among kinds of part patterns stored in the part pattern memory for the respective parts designated by the operation of the part designating switch, and an image of a desired object is created by combining the selected part patterns.
As described above, the conventional object image creating device only creates an object image composed of a plurality of part patterns and displays the created object image. It, however, cannot rapidly and easily create a different image related to the created object image: for example, a child's face image related to a created husband's face image and a created wife's face image, a person's possible face or whole-body image imagined when his current weight is increased/decreased, or another side face image related to a created front face image.
It is therefore an object of the present invention to provide an object image display device which rapidly and easily creates and displays an object image which is related to a created object image.
This object is solved by the invention as claimed in the independent claims. Preferred embodiments of the invention are defined by the dependent claims.
FIG. 1 is a plan view of an object image display device as a first embodiment of the present invention.
FIG. 2 is an enlarged view of the essential portion of the object image display device.
FIG. 3 is a block diagram of a whole illustrative circuit configuration of the object image display device.
FIG. 4 shows a stored state of part patterns in a basic part pattern ROM.
FIG. 5 shows a stored state of part patterns in an infant part pattern ROM.
FIG. 6 shows a stored state of data in an individual/montage data RAM.
FIG. 7 is a general flow indicative of the operation of the first embodiment.
FIG. 8 is a flowchart indicative of the contents of an individual data input/record process.
FIG. 9 is a flowchart indicative of the contents of a combined object montage creation/record process.
FIG. 10 is a flowchart indicative of the contents of a data combining process.
FIG. 11 is a flowchart indicative of the contents of an infant montage creation/record process.
FIG. 12 is a flowchart indicative of the contents of a montage synthesis process.
FIG. 13 is a flowchart indicative of the contents of a montage synthesis process subsequent to FIG. 12.
FIG. 14 is a flowchart indicative of the contents of the part selection/record process.
FIG. 15 is a flowchart indicative of the contents of a display process.
FIGS. 16A-16C each illustrate a montage image.
FIG. 16D illustrates combined objects and an infant's montage image synthesized from the combined objects.
Embodiments will be described below with reference to the accompanying drawings.
(FIRST EMBODIMENT):
FIG. 1 shows the appearance of an object image display device as a first embodiment of the present invention. The object image display device 1 of FIG. 1 is composed of a front half body 3 and a rear half body 4 connected through a hinge 2. The front half body 3 is provided with a first display 5 which includes a liquid crystal display. An item switch unit 6, an ON/OFF switch 8 and a part switch unit 7 are provided below the first display 5. As shown on an enlarged scale in FIG. 2, the item switch unit 6 is composed of an OK switch 6n, an individual switch 6a, a synthesis switch 6b, a retrieval switch 6c, a synthesis percentage switch 6d, a blood type switch 6e, a combined object switch 6f, a random number switch 6g, a part select switch 6h, an infant switch 6i, a heredity switch 6j, and a record switch 6k. The part switch unit 7 is composed of a "hair style" switch 7a, a "brows" switch 7b, an "eyes" switch 7c, an "ears" switch 7d, a "nose" switch 7e, a "contour" switch 7f, a "mouth" switch 7g, a "both arms and hands" switch 7h, a "dress" switch 7i and a "both legs" switch 7j. The rear half body 4 is provided with a second display 9 smaller in area than the first display 5. A data input switch unit 10, which includes a numerical switch 10a and an alphabetical switch 10b, is provided below the second display 9.
Two combined person-montage images MA2 and MA3 are displayed on the first display 5 while a montage image MA6 synthesized from the two montage images MA2, MA3 is displayed on the second display 9.
FIG. 3 shows the whole circuit configuration of the object image display device 1 of this embodiment.
Switch information input to the input unit 18, composed of switches 6-10, is fed to CPU 11, which provides the whole control required for the object image display device 1 on the basis of the program data stored in a program ROM 12A, which composes ROM 12, and random number data generated by a random number generator 14, and drives the display driver 15 which in turn drives the first and second displays 5 and 9.
ROM 12 is composed of the program ROM 12A, a basic part pattern ROM 12B which stores basic part patterns, and an infant part pattern ROM 12C which stores infant part patterns. As shown in FIG. 4, the basic part pattern ROM 12B stores part patterns indicated as numbers "01"-"50" for each of the 10 kinds of "contour", "hair style", "eyes", ..., "both legs" part patterns corresponding to the switches of the part switch unit 7. As shown in FIG. 5, the infant part pattern ROM 12C stores part patterns at areas corresponding to "01"-"50" for each of the 5 kinds of "contour", "hair style", "dress", "both arms and hands", and "both legs" part patterns, which correspond to the "contour", "hair style", "dress", "both arms and hands", and "both legs" switches 7f, 7a, 7i, 7h and 7j of the part switch unit 7. The infant part pattern ROM 12C is different from the basic part pattern ROM 12B in that the former stores only the 5 kinds of "contour", "hair style", "dress", "both arms and hands", and "both legs" part patterns where the features of an infant appear most remarkably. Thus, these 5 kinds of part patterns are designated by the operation of the respective "contour", "hair style", "dress", "both arms and hands", and "both legs" switches 7f, 7a, 7i, 7h and 7j of the part switch unit 7.
As shown in FIG. 6, the individual/montage data RAM 13 is composed of a display register 130, an item data area 152, a data synthesis work area 131, a montage synthesis area 132, and a reduction synthesis area 133. The item data area 152 is composed of an individual data area 134 which stores 50 individual data items at the locations "1"-"50", and a montage data area 135 which stores pattern numbers indicative of parts of the montage data which corresponds to the individual data items. The individual data area 134 stores the name, blood type, telephone number and age of a person input by the data input switch unit 10. The 10 kinds of "contour"-"both legs" part areas of the montage data area 135 store the part pattern numbers "01"-"50" of FIG. 4 for the respective part patterns selected by the part switch unit 7. The infant data area 136 is also composed of an individual data area 134 and a montage data area 135. The select part area 137 stores data on two combined persons and is composed of a first combined person area 138 and a second combined person area 139, which store data "0" or "1" in 10 bits indicative of each of the 10 kinds of "brows", "eyes", "contour", etc. When a montage is synthesized, a part pattern is used which is stored in the combined person part area corresponding to a part area where "1" data is stored. No part pattern is used which is stored in the part area of the combined person corresponding to a part area where "0" data is stored. In the case of the embodiment of FIG. 6, for example, "1" and "0" are stored in the first and second combined person part areas, respectively, in the "contour" area. Thus, in a montage synthesis the "contour" part pattern of the first combined person is employed. For the "both legs" area, "0" and "1" are stored in the first and second combined person part areas, respectively.
Thus, the "both legs" part pattern of the second combined person is employed.
Therefore, when a new montage image is synthesized on the basis of the first and second combined person montage images, the part patterns stored in the combined person part areas corresponding to the part areas where "1" data is stored are used, so that the synthesized montage image resembles the combined person montage image whose data contains more "1's" than the other.
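The "1"/"0" flag mechanism described above can be expressed as a short Python sketch (illustrative only, not taken from the patent; the function and variable names are hypothetical):

```python
# Sketch of the montage synthesis rule: for each part, the part pattern of
# the combined person whose part area holds "1" is used; the two areas are
# mutually exclusive, so exactly one person supplies each part.

PARTS = ["contour", "hair style", "eyes", "ears", "nose",
         "mouth", "brows", "both arms and hands", "dress", "both legs"]

def synthesize(first_patterns, second_patterns, first_flags):
    """Pick each part pattern from the first combined person where its
    flag is 1, otherwise from the second combined person."""
    result = {}
    for part in PARTS:
        source = first_patterns if first_flags[part] == 1 else second_patterns
        result[part] = source[part]
    return result
```

Under this rule, the more "1's" a person's area contains, the more of the synthesized montage comes from that person, matching the resemblance behavior described above.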
The data synthesis work area 131 and montage synthesis area 132 combine the respective selected part patterns at their positions and store the synthesized montage image. The reduction synthesis area 133 stores a reduced version of the created montage image for an infant image.
The operation of this embodiment will be described with reference to the flowchart of FIG. 7 and subsequent FIGURES concerned.
FIG. 7 is a general flow chart indicative of the operation of the present embodiment. An individual data input/record process of FIG. 8 (step SA1), a combined person montage creation/record process of FIG. 9 (step SA2), an infant montage creation/record process of FIG. 11 (step SA3), a montage synthesis process of FIGS. 12 and 13 (step SA4) and a display process of FIG. 15 (step SA5) are performed in this order.
INDIVIDUAL DATA INPUT/RECORD PROCESS
A specific flow will be described next. First, an individual data input/record process (step SA1) is executed in accordance with the flow of FIG. 8 on condition that the individual switch 6a is operated (step SB1). If this individual switch 6a is determined to have been operated, it is determined whether individual data is input by the operation of the data input switch unit 10 (step SB2). If so, this individual data is written in the individual data area 134 of the individual/montage data RAM 13 of FIG. 6 (step SB3).
Thereafter, it is determined whether the record switch 6k is operated (step SB4). If not, a looping operation involving steps SB2-SB4 is iterated until the operation of the record switch 6k is detected. Thus, during the iteration, the name, blood type, telephone number and age of the person are input by the operation of the data input switch unit 10, and the respective input data items are written into proper ones of the items "1"-"50" of FIG. 6. Thereafter, by the operation of the record switch 6k, control returns from the flow of FIG. 8 to the general flow of FIG. 7.
COMBINED PERSON MONTAGE CREATION/RECORD PROCESS
A combined person montage creation/record process is executed at step SA2 subsequent to step SA1 of the general flow of FIG. 7. This process is executed in accordance with the flow of FIG. 9 on the condition that the combined person switch 6f is operated (step SC1). If it is determined that the switch 6f is operated, a part pattern No. "01" indicative of basic part patterns in the basic part pattern ROM 12B is set initially (step SC2). The initially set basic part pattern No. "01" points to the respective part numbers ranging from the "contour" indicated by the part "1" of FIG. 4 to the "both legs" indicated by the part "10". Thus, part pattern numbers indicative of the 10 part patterns (of a whole body) ranging from the "contour" indicated by "1" to the "both legs" indicated by "10" and stored in the column "01" are set initially at step SC2 and stored in the montage data area 135 of the individual/montage data RAM 13.
Subsequently, a data combining process (step SC3) is executed in accordance with a flow chart of FIG. 10.
A "contour" part pattern No. "01" is read from among the respective part pattern numbers set initially in the montage data area 135 (step SD1). A contour part pattern corresponding to the read "contour" part pattern No. "01" is transferred to a montage synthesis area 132 of the individual/montage data RAM 13 (step SD2). The "hair style" part pattern No. "01" is read (step SD3). A hair style part pattern corresponding to the read "hair style" part pattern No. "01" is read (step SD3). A part pattern corresponding to the read "hair style" part pattern No. is transferred to the montage synthesis area 132 (step SD4). Similar processes are performed for the "eyes", "nose", "mouth", etc., are performed (step SD5). A montage image MA1 is synthesized from those transferred part patterns in the montage synthesis area 132 and displayed on the first display 5, as shown in FIG. 16A (step SA6). In FIGS. 16A-16D and 1, the respective displayed montage images MA1-MA6 are not for a whole body, but for an upper half of the whole body.
Thus, by the process at step SD6, a basic type whole-body montage image MA1 composed of a combination of the respective part patterns indicated by the part pattern numbers stored in the column covering the "contour" part "01" to the "both legs" part "10" of FIG. 4 is displayed as a montage image on the first display 5. If the basic type montage image MA1 is required to be corrected, the user performs the following switch operations to correct the image on the basis of the image MA1 and thereby create a desired montage image. That is, at step SC4 subsequent to step SC3 of FIG. 9, it is determined whether the "contour" switch 7f of the part switch unit 7 is operated (step SC4). If so, a part pattern number corresponding to the "contour" part pattern designated by the switch 7f is changed from the initially set "01" to "02" (step SC5). This number "02" is stored in the "contour" area of the individual/montage data RAM 13 and the data combining process (step SC3) is executed.
In this way, if the "contour" switch 7f is operated once, "contour" part pattern of the respective part patterns which constitute the basic type montage image MA1 displayed on the first display 5 is changed from the contour part indicated by the part pattern number "01" set initially to a contour part pattern indicated by a part pattern number "02", which then is displayed. In this case, since 50 kinds of "contour" part patterns are stored in ROM 12B, only the "contour" part pattern of the respective part patterns which constitute the montage image MA1 is sequentially changed by the 50 successive operations of the "contour" switch 7f, to the 50 respective "contour" part patterns, which are then displayed.
After a "contour" part pattern which is the same as, or similar to, the part pattern of the "contour" which is a part of a combined person montage image to be created is selected from the 50 kinds of "contour" part patterns and displayed on the first display 5, the operation of the "hair style" switch 7a causes the determination at step SC6 to be YES to thereby change a hair style part pattern number corresponding to the "hair style" part pattern from "01" to "02" (step SD7). Thus, the pattern of the "hair style" part of the whole body montage image is changed to a "hair style" part pattern corresponding to the part pattern No. "02", which is then displayed. Also, in this case, 50 kinds of "hair style" part patterns are stored in ROM 12B, only the "hair style" part pattern of the respective part patterns which constitute the whole body montage image can be changed by the successive operations of the "hair style" switch 7a to 50 "hair style" part patterns sequentially and the resulting images can be displayed.
Similarly, when the "eyes" switch 7c-"both legs" switch 7j of FIG. 2 are operated to change the current part patten numbers to the part pattern numbers indicative of part patterns which are the same as or similar to, the part patterns which constitute the montage image of a combined person to be created to thereby display a whole-body montage MA2 which is the same, as or similar to, the whole body of the combined person to be created, as shown in the left-hand portion of FIG. 16D on the basis of the basic type whole-body montage image MA1 displayed initially to thereby display a whole-body montage image MA2 which 7s the same as, or similar to, the whole body of the combined person to be created.
When the montage image MA2 is displayed, the OK switch 6n is operated ("OK" may be input by the operation of the alphabetical switches 10b) to confirm on the display 5 the displayed state of the montage image directly before recording, and the record switch 6k is then operated. This causes control to pass from step SC16 to SC17, where, if the name of the combined person input as individual data in the flow of FIG. 8 is, for example, "A", the respective part pattern numbers (montage data) designated by the part switch unit 7 are recorded finally in the montage data area 135 corresponding to the item "1" where the "A" is stored. In FIG. 6, "none" described in the "both arms and hands" and "both legs" areas indicates that the "OK" switch 6n and the record switch 6k have been operated without the "both arms and hands" switch 7h and "both legs" switch 7j being operated. In this case, part pattern numbers are finally recorded which correspond to the patterns of all the parts except for the "both arms and hands" and "both legs" part patterns.
By executing the individual data input/record process (step SA1) and the combined person montage creation/record process (steps SC1-SC17) at least twice, at least two combined person montage images MA1, MA2 are created and recorded. In the present embodiment, up to 50 combined person montage data items are recorded in the montage data area 135 of the individual/montage data RAM 13, together with the corresponding individual data stored in the individual data area 134.
INFANT MONTAGE CREATION/RECORD PROCESS
In the general flow of FIG. 7, an infant montage creation/record process is executed at step SA3 subsequent to step SA2.
This process is executed, in accordance with the flow of FIG. 11, on condition that the infant switch 6i has been operated (step SE1). If it is determined that the infant switch 6i is operated, the process at steps SE2-SE15 is executed. The process at steps SE2-SE15 is the same in contents as the flow of FIG. 9. In the present embodiment, the part patterns of the parts where the features of an infant are most remarkable are limited to five kinds, operated by five switches: the "contour" switch 7f, which is a part switch dedicated to the infant only, the "hair style" switch 7a, the "dress" switch 7i, the "both arms and hands" switch 7h, and the "both legs" switch 7j, so that the infant montage creation/record process is performed by the operation of those switches.
When the same process as the combined person montage creation/record process of FIG. 9 is executed in accordance with the flow of FIG. 11, a basic montage image MA4 for a whole body except for the "eyes", "nose", "mouth", "brows", and "ears" is displayed initially as shown in FIG. 16B (step SE2). Thus, a corrected montage image MA5 composed of a combination of the "contour", "hair style", "dress", "both arms and hands", and "both legs" part patterns designated by the operation of the "contour" switch 7f, "hair style" switch 7a, "dress" switch 7i, "both arms and hands" switch 7h, and "both legs" switch 7j on the basis of the montage image MA4 is displayed as shown in FIG. 16C. When the "OK" and "record" keys 6n and 6k are operated, the part pattern numbers indicative of the respective part patterns which constitute the corrected montage image MA5 are recorded finally in the infant data area 136 of FIG. 6.
MONTAGE SYNTHESIS PROCESS
In the general flow of FIG. 7, a montage synthesis process is executed at step SA4 subsequent to step SA3. This process involves partly combining a plurality of combined person montage images to create a new synthesized montage image.
This process is executed on condition that the synthesis switch 6b is operated in accordance with a series of flows of FIGS. 12 and 13 (step SF1). When the operation of the synthesis switch 6b is determined, "after individual retrieval ?" is displayed on the first display 5 (step SF2).
The user operates the data input switch 10 to input as retrieval data the same name as any one of the names "A", "B", "C",... on individual data stored beforehand in the individual montage data RAM 13 in accordance with that display (step SF3). It is then determined whether the retrieval switch 6c is operated (step SF4). If so, it is determined whether the name which is the retrieval data input at step SF3 coincides with any one of the names as the individual data (step SF5).
When both data items coincide and the retrieved name input by the operation of the data input switch unit 10 is stored in the individual data area 134 of the individual/montage data RAM 13, the individual data involving that coincidence and the respective part pattern numbers corresponding to the individual data are read (step SF6). The read individual data, the respective part pattern numbers, and the characters "nth person is being retrieved" are displayed together (step SF7). Thus, when the data input switch unit 10 is operated to input, for example, "A"'s name, "A"'s input name, blood type, telephone number and age, as well as the part pattern numbers "01", "02", "02", ... corresponding to the respective part patterns which constitute "A"'s montage image MA2, are displayed together on the displays, and "first person is being retrieved" is also displayed on the first display 5.
PART SELECTION/RECORD PROCESS
An internal counter n (not shown) of CPU 11 is counted up (step SF8) and a part selection/record process is executed (step SF9). This process includes a process for determining a selected one of the part patterns of the two combined person montage images as each of the part patterns of a synthesized montage image to be created, and a process for recording a part pattern number corresponding to the selected part pattern.
This process is executed on condition that any one of the random number switch 6g, the part select switch 6h, the synthesis percentage switch 6d, the heredity switch 6j and the blood type switch 6e has been operated at steps SG1, SG4, SG7, and SG10 in accordance with the flow of FIG. 14.
First, if it is determined that the random number switch 6g has been operated (YES at step SG1), the random number generator 14 is operated to generate random number data "1" or "0" sequentially and randomly (step SG2). The "1" or "0" generated sequentially and randomly from the random number generator 14 is rewritten at the 10 "contour", "hair style", ..., part locations in the first and second combined person areas 138 and 139 (step SG2A). The random number data stored beforehand at the part locations is rewritten with random data generated sequentially and randomly later. When the user operates the OK switch 6n any time after the random data is generated randomly and sequentially (step SG2B), the random number data "1" or "0" rewritten in the part areas is finally recorded (step SG3). In this case, when "1" is stored at one of the respective part locations of the first and second combined person areas 138 and 139, "0" is automatically recorded in the other corresponding part area in order to prevent double use of the part patterns of the combined persons corresponding to each other. Thus, "1" is automatically recorded in any one of the respective part locations of the first and second combined person areas 138 and 139 irrespective of the user's intention. In the present embodiment, since "1" is recorded at any one of the part locations of the first and second combined person areas 138 and 139 depending on the random number data generated randomly and sequentially by the random number generator 14, "1's" can be recorded at all the part locations of one area in an extreme case. Alternatively, "1" or "0" generated randomly and sequentially from the random number generator 14 may be stored sequentially at the 10 respective part locations of the first and second combined person areas 138, 139 until random number data is stored at all the part locations, at which time the "1" or "0" of the random number data stored at the respective part locations may be recorded finally.
Alternatively, "1" or "0" generated randomly and sequentially from the random number generator 14 may be rewritten repeatedly at the 10 respective part locations of the first and second combined person areas 138, 139 until a predetermined interval of time has elapsed, at which time random number data "1's" or "0's" stored at the part locations may be recorded finally.
When it is determined that the part select switch 6h is operated without the random number switch 6g being operated (YES at step SG4), it is determined at the next step whether any one of the switches 7a-7j of the part switch unit 7 has been operated (step SG5). If so, "1" is stored at that part location of the first combined person area 138 corresponding to the part designated by the operated one of the part switches 7a-7j (step SG6). Therefore, for example, when the "contour" switch 7f is operated, "1" is stored at the "contour" area of the first combined person area 138. When the "hair style" switch 7a is operated, "1" is stored at the "hair style" area. Once "1" is stored at an area in the first combined person area 138, "0" is automatically stored at the corresponding part location ("contour" area, etc.) of the second combined person area 139.
When it is determined that the synthesis percentage switch 6d has been operated without the part select switch 6h being operated (YES at step SG7), it is determined at the next step whether the numerical switch 10a has been operated (step SG8). If so, "1" is stored at only one of each pair of corresponding part locations of the first and second combined person areas 138 and 139 (step SG9). For example, when a numerical value "50" is input by the operation of the numerical switch 10a to set the synthesis percentage of the first and second combined persons at 50 percent, "1's" which are the same in number (in the case of the present embodiment, 5) are stored at the respective part locations of the first and second combined person areas 138 and 139. When numerical values "80" and "20" are input by the operation of the numerical switch 10a in order that the synthesis percentages are "80 %" and "20 %", respectively, "1's" the number of which corresponds to "80" and "1's" the number of which corresponds to "20" are stored at the respective part locations of the first and second combined person areas 138 and 139. In the case of this embodiment, since the number of part locations is 10, 8 "1's" and 2 "1's" are stored at the part locations in the first and second combined person areas 138 and 139, respectively. Also in this case, if "1" is stored at one of the part locations of the first and second combined person areas 138 and 139, "0" is automatically stored at the corresponding part location of the other combined person area.
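The synthesis percentage mechanism can be sketched as follows (a hypothetical illustration of the 80/20 example above; the function name and the rule for choosing which parts get "1" are invented, since the text does not specify the assignment order):

```python
def percentage_selection(parts, first_percent):
    """Sketch of the synthesis percentage switch: assign "1" to the first
    combined person for first_percent of the part locations (e.g. 8 of 10
    for 80%), and "1" to the second person for the remainder."""
    n_first = round(len(parts) * first_percent / 100)
    first = {p: 1 if i < n_first else 0 for i, p in enumerate(parts)}
    second = {p: 1 - first[p] for p in parts}  # complement, never both 1
    return first, second
```

With 10 part locations and an input of "80", this yields 8 "1's" in the first area and 2 in the second, as in the embodiment.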
When it is determined that the heredity switch 6j or the blood type switch 6e has been operated (YES at step SG10), it is determined at the next step whether the part switch unit 7 is operated (step SG11). If so, "1" is stored at only one of each pair of corresponding part locations of the first and second combined person areas 138 and 139 (step SG12). That is, when the heredity switch 6j or the blood type switch 6e is operated and the part switch unit 7 is then operated, "1" or "0" is stored at a part location corresponding to the part designated by the operation of the part switch unit 7 in accordance with probability data depending on the kind of "blood type" which constitutes the individual data area 134 of the combined person, or probability data obeying a law of heredity and conforming to the kind of the "blood type" and the "age" which constitute the individual data area 134. Since those probability data items are stored beforehand in ROM 12, "1" or "0" is stored depending on the probability data items.
At step SF10 subsequent to step SF9 of FIG. 12, it is determined whether the count of the counter n is 2. If not, the process and determination starting at step SF2 are performed again. Thus, the part selection/record process of FIG. 14 is performed again for a newly selected second combined person.
When it is determined that n = 2 at step SF10, the part selection/record process is completed for the two combined persons. Thus, "Selection of the combined persons has been completed" is displayed on the first display 5 (step SF11).
Control then passes to step SF12 of FIG. 13, where it is determined whether the infant switch 6i is operated, to determine whether an infant montage image is to be created as the synthesized montage image. If so, particular infant part pattern numbers are read which are indicative of only the "contour", "hair style", "dress", "both arms and hands", and "both legs" stored in the infant data area 136. After this process has been completed, part pattern numbers are read at the part locations (for the "eyes", "nose", "mouth", "brows", and "ears") other than the above infant part locations, where "1's" are stored by the part selection of step SF9 (FIG. 12) (step SF14). In the case of the FIG. 6 embodiment, "50", "01", "01", "40" and "30" are stored only at the "contour", "hair style", "dress", "both arms and hands", and "both legs" part locations of the infant data area 136, and no part pattern numbers are stored at the other part locations (indicated by "none").
Since at this time the first and second combined persons are already determined and "1's" or "0's" are stored at the respective part areas of the first and second combined person areas 138 and 139, combined person part pattern numbers are then read which correspond to the remaining "eyes", "nose", "mouth", "brows", and "ears" part locations, except for the "contour", "hair style", "dress", "both arms and hands", and "both legs" part locations, and where "1's" are stored in the first and second combined person areas 138 and 139. In the case of the FIG. 6 embodiment, since "1's" are stored in the "contour", "hair style", "mouth", "dress" and "both arms and hands" part locations which constitute a portion of the first combined person area 138, and in the "eyes", "nose", "brows", "ears", and "both legs" part locations which constitute a portion of the second combined person area 139, the part pattern numbers stored in the part locations of the combined person areas 138 and 139 where "1's" are stored can be read. However, the part pattern numbers stored in the "contour", "hair style", "dress", "both arms and hands" and "both legs" part locations of the combined person areas are not read; the part pattern numbers stored in the "contour", "hair style", "dress", "both arms and hands" and "both legs" part locations of the infant data area are read preferentially instead.
As a result, in the case of the present embodiment, the part pattern numbers encircled in FIG. 6 are read as follows:
  • The first combined person "A": "30" (mouth);
  • The second combined person "B": "50" (eyes), "20" (nose), "03" (brows), "04" (ears); and
  • The infant "X": "50" (contour), "01" (hair style), "01" (dress), "40" (both arms and hands), and "30" (both legs).
Thereafter, part patterns corresponding to the read part pattern numbers are read from the basic part pattern ROM 12B and infant part pattern ROM 12C and transferred to the montage synthesis area 132, where a montage is synthesized from those part patterns (step SF15).
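The preferential reading of part pattern numbers described above can be sketched as follows. This is an illustrative sketch only, not the disclosed implementation; all function and variable names are hypothetical, and the data layout merely mimics the "1"/"0" selection flags of the combined person areas 138 and 139.

```python
# Illustrative sketch (hypothetical names): merging part pattern numbers for
# an infant montage. Infant-specific part locations take priority; the
# remaining facial parts come from whichever combined person's area holds a
# "1" selection flag for that part location.

INFANT_PARTS = {"contour", "hair style", "dress",
                "both arms and hands", "both legs"}

def merge_part_numbers(infant_area, person_a_area, person_b_area):
    """Return a dict mapping each part location to the pattern number used."""
    result = {}
    # The infant data area is read preferentially for its own part locations.
    for part, number in infant_area.items():
        result[part] = number
    # Other parts come from the combined person whose selection flag is 1.
    for area in (person_a_area, person_b_area):
        for part, (flag, number) in area.items():
            if part not in INFANT_PARTS and flag == 1:
                result[part] = number
    return result
```

With selection flags and numbers corresponding to the FIG. 6 example, the sketch reproduces the encircled part pattern numbers listed above.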
Subsequently, the infant's "age" data is read from the "age" location in the individual data area 134 of RAM 13 (step SF16). The montage image is reduced to a size corresponding to the read infant's age, and data on the obtained image is stored in the reduced size synthesis area 133 of RAM 13 (step SF17). In this case, data on the rate of reduction of the montage image corresponding to the infant's age is stored as a conversion table in ROM 12.
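The age-dependent reduction can be sketched as follows. The reduction rates shown are hypothetical placeholders; the embodiment stores its actual conversion table in ROM 12.

```python
# Illustrative sketch: scaling a montage to a size matching the infant's age.
# The age-to-rate table below is a hypothetical stand-in for the conversion
# table the embodiment holds in ROM 12.

REDUCTION_RATE = {0: 0.40, 1: 0.45, 2: 0.50, 3: 0.55}  # age -> scale factor

def reduce_montage(width, height, age):
    """Return the (width, height) of the montage reduced for the given age."""
    rate = REDUCTION_RATE.get(age, 1.0)  # ages outside the table: full size
    return round(width * rate), round(height * rate)
```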
As a result, the reduced infant's montage image is composed of the "contour", "hair style", "dress", "both arms and hands", and "both legs" of the infant "X" combined with the "mouth" part pattern of the first combined person "A" and the "eyes", "nose", "brows", and "ears" part patterns of the second combined person "B". The resulting montage in this case therefore more resembles the montage of the second combined person "B" than that of the first combined person "A".
When at step SF12 it is determined that the infant switch 6i has not been operated, control passes from step SF12 to step SF18, where it is further determined whether the synthesis switch 6b has been operated. If so, the part pattern numbers at the part locations where "1's" are stored by the part selection of step SF9 (FIG. 12) among the part locations of the first and second combined person areas 138 and 139 are read, since only the part patterns of the combined persons "A" and "B" are to be used for the resulting montage image, without using the part patterns of the infant "X" (step SF19).
As a result, in the case of this embodiment, the following part pattern numbers hatched in FIG. 6 are read:
  • The first combined person "A": "01" (contour), "02" (hair style), "30" (mouth), and "03" (dress);
  • The second combined person "B": "50" (eyes), "20" (nose), "03" (brows), "04" (ears), "03" (dress).
In this case, since no part pattern numbers are stored at the "both arms and hands" and "both legs" part locations, no data are read for those locations.
Part patterns corresponding to the read part pattern numbers are read from the basic part pattern ROM 12B and transferred to the montage synthesis area 132, where a montage is synthesized from those part patterns and stored (step SF20).
DISPLAY PROCESS
In the general flow of FIG. 7, a display process is executed at step SA5 subsequent to step SA4. This process is performed in accordance with the flow of FIG. 15. That is, the individual data and montage data of the combined persons "A" and "B" stored in the individual data area 134 and the montage data area 135 in RAM 13, and the individual data stored in the individual data area 134 for an infant "X" to be synthesized, are read (step SH1). Thereafter, as illustrated in FIG. 1, the respective individual data items of the read combined persons "A" and "B" are displayed on the first display 5, and the "A" and "B" montage images MA2 and MA3 based on the read combined persons' montage data are displayed below those individual data items. The read infant "X's" individual data is displayed on the second display 9, and a montage image MA6 corresponding to the infant "X's" montage image stored in the montage synthesis area 132 or the reduced size synthesis area 133 in the montage synthesis process (FIGS. 12 and 13) at step SA4 is displayed below the displayed "X's" individual data (step SH2).
As described above, as illustrated in FIGS. 1 and 16D, an infant's montage image MA6 having the features of one combined person "A"'s "mouth" and the features of the other combined person "B"'s "eyes", "nose", "brows", and "ears" is rapidly and easily created and displayed by a simple switching operation. More generally, a synthesized montage image having some of the features of one combined person's montage image and some of the features of the other combined person's montage image is created and displayed rapidly and easily by a simple operation. Thus, for example, from a pair of lovers' or a husband's and wife's montage images, an infant or synthesized montage image additionally having some or all of the features of those montage images is created rapidly and easily.
When random number data is generated by the operation of the random number switch and recorded at part locations in one of the first and second combined person areas 138 and 139 (steps SG1-SG3), the respective part patterns which constitute the first and second combined person montage images are combined randomly to create a synthesized montage image composed of the randomly combined part patterns. The resulting montage image can thus easily be an unexpected one, irrespective of the user's intention.
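The random combination can be sketched as follows. This is illustrative only; the embodiment records random number data at the part locations of areas 138 and 139, whereas the sketch simply draws a random choice per part location.

```python
# Illustrative sketch: a random flag decides, per part location, whether the
# part pattern is taken from combined person "A" or combined person "B".
import random

PART_LOCATIONS = ["contour", "hair style", "eyes", "nose",
                  "mouth", "brows", "ears"]

def random_synthesis(parts_a, parts_b, rng=random):
    """Pick each part pattern randomly from person A or person B."""
    return {part: (parts_a if rng.random() < 0.5 else parts_b)[part]
            for part in PART_LOCATIONS}
```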
If the combination percentages of the first and second combined person montage images are designated by the operation of the synthesis percentage switch 6d and the numerical switch 10a, a synthesized montage image is created depending on the combination percentages. Thus, for example, a synthesized montage image which more resembles one of the combined person montage images is created.
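A percentage-controlled combination of the kind designated with the synthesis percentage switch 6d can be sketched as follows. The allocation rule shown, taking the first portion of an ordered part list from person "A" and the remainder from "B", is an assumption for illustration only; the embodiment's actual allocation scheme is not specified here.

```python
# Illustrative sketch: combine two montages so that roughly percent_a % of
# the part locations are taken from person "A" and the rest from "B".
def percentage_synthesis(parts_a, parts_b, percent_a, order):
    """Take the first round(percent_a%) of `order` from A, the rest from B."""
    n_from_a = round(len(order) * percent_a / 100)
    return {part: (parts_a[part] if i < n_from_a else parts_b[part])
            for i, part in enumerate(order)}
```

Designating a higher percentage for one combined person yields a synthesized image that more resembles that person, as described above.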
By the operation of the heredity switch 6j and the blood type switch 6e, a synthesized montage image is created depending on the heredity or blood type element.
In the case of the embodiment of FIG. 16D, the infant switch 6i is operated. Thus, the respective montage images MA2, MA3 of the combined persons "A", "B" as well as the montage image MA6 of the infant "X" are displayed. If the infant switch 6i is not operated but the synthesis switch 6b is operated, the respective montage images MA2, MA3 of the combined persons "A" and "B" are displayed on the first display 5, while a synthesized montage image MA6 created newly on the basis of those montage images MA2, MA3 is displayed on the second display 9.
While in the present embodiment, when it is determined at step SF11 of FIG. 12 that n = 2, the selection of combined persons is terminated and the number of combined persons is thus limited to two, three or more combined persons may be selected.
While the combined objects are human beings and the resulting synthesis is also a human being, the present invention is not limited to that particular example. For example, the combined objects may be a human being's image and an animal's image, and the synthesis may be a combination of a human image and an animal image.
While in the present embodiment the synthesis of an infant's face montage image or a whole-body montage image, obtained by combining corresponding ones of the respective parts of two combined adult images and an infant's image, has been illustratively displayed, a synthesized montage image may be created and displayed as a combination of corresponding ones of the respective part patterns of two adult images, two children's images, or an adult image and a child image.
While in the present embodiment a single infant montage image synthesized from two combined object images has been illustrated as being displayed on the display, the synthesized montage image may be printed out instead of, or in addition to, being so displayed.

Claims (22)

  1. An image display control device for displaying a first, second and third animal and/or human being image, the third image (MA6) being based on the first and second images and first, second and third age data respectively corresponding to the first, second and third images, the image display control device comprising:
    image storage means (13; 132) for storing the first, second and third images;
age storage means (13; 134) for storing the first, second and third age data respectively corresponding to the first, second and third images stored in said image storage means;
    display means (5, 9) for displaying an image and age corresponding thereto;
    first display control means (11) for reading the first and second images stored in said image storage means and for displaying the first and second images on said display means;
    second display control means (11) for reading from said age storage means the first and second age data respectively corresponding to the first and second images, and for displaying the first and second age data on said display means;
    operation means (6i) operated by a user; and
    third display control means (11) responsive to the operation of said operation means in a state where the first and second images are displayed by said first display control means, for reading the third image from said image storage means and for displaying the third image read from said image storage means on said display means.
  2. The image display control device according to claim 1, wherein the third image is displayed on the basis of displayed first and second images (MA2, 3).
  3. The image display control device according to claim 1 or 2, wherein the third display control means includes means for reading from said age storage means the third age data corresponding to the third image, and for displaying the third age data read from said age storage means on said display means.
  4. The image display control device according to one of claims 1 to 3, wherein the second display control means reads from said age storage means the first and second age data when the first and second images are displayed by said first display control means.
  5. The image display control device according to claim 3, wherein the third display control means reads from said age storage means the third age data when the third image is displayed by said third display control means.
  6. The image display control device according to one of claims 1 to 5, wherein the third image is displayed together with the first and second images on said display means.
  7. The image display control device according to one of claims 1 to 6, further comprising printing means for printing the third image (MA6).
  8. The image display control device according to one of claims 1 to 7, further comprising:
    name input means (10,18) for inputting name data of the first and second images as first and second names, respectively, to the image display control device;
    name storage means (13, 134) for storing the first and second name data input by said name input means; and
    fourth display control means (11) for displaying on said display means, the first and second name data stored in said name storage means.
  9. The image display control device according to one of claims 1 to 8, wherein the first image comprises a male adult image (MA2), the second image comprises a female adult image (MA3) and the third image comprises an infant image (MA6).
  10. The image display control device according to one of claims 1 to 9, wherein at least a part of the third image (MA6) corresponds to at least a part of each of the first and second images (MA2, 3).
  11. The image display control device according to one of claims 1 to 10, wherein the first, second and third images each comprise a plurality of combined partial images.
  12. The image display control device according to one of claims 1 to 11, wherein at least some of said image storage means, said age storage means, said display means, said first display control means, said second display control means, said operation means, and said third display control means are provided in a portable electronic device.
13. An image display control method for displaying, on the basis of displayed first and second animal and/or human being images (MA2, 3), a third animal and/or human being image (MA6) which is based on said first and second images (MA2, 3) and first, second and third age data stored respectively for said first, second and third images, the method comprising the steps of:
    controlling (SH1, 2) image storage means (13, 132) in which the first, second and third images are stored so as to display the first and second images stored in said image storage means;
    controlling (SH1, 2) age storage means (13, 134) in which first and second age data are stored respectively corresponding to the first and second images so as to display the first and second age data in a state where said first and second image data are displayed by said step of controlling said image storage means; and
    reading the third image from the image storage means in response to an operation from operation means (6i) in a state where the first and second images are displayed by said step of controlling said image storage means, and displaying the third image read from said image storage means.
  14. The image display control method according to claim 13, wherein the third image is displayed together with the first and second images.
  15. The image display control method according to claim 13 or 14 wherein the age storage means also stores third age data corresponding to the third image.
  16. The image display control method according to claim 15, wherein the step of reading the third image includes the step of reading (SH1) from the age storage means the third age data, and displaying (SH2) the third age data read from the age storage means.
  17. The image display control method according to one of claims 13 to 16, wherein at least one of said age data is displayed in a state where the corresponding image is displayed.
  18. The image display control method according to one of claims 13 to 17 further comprising the step of printing the third image (MA6).
  19. The image display control method according to one of claims 13 to 18, further comprising the steps of:
    inputting (SA1, SB2) name data for the first and second images as first and second names, respectively;
    storing (SA1, SB3) the first and second name data input by said step of inputting name data; and
    displaying (SH2) the first and second name data stored by said step of storing the first and second name data.
  20. The image display control method according to one of claims 13 to 19, wherein the first image comprises a male adult image (MA2), the second image comprises a female adult image (MA3) and the third image comprises an infant image (MA6).
  21. The image display control method according to one of claims 13 to 20, wherein at least a part of the third image (MA6) comprises at least a part of each of the first and second images (MA2, 3).
  22. The image display control method according to one of claims 13 to 21, wherein the first, second and third images each comprise a plurality of combined partial images.
HK98114697.0A 1992-12-25 1998-12-22 Object image display device and method HK1013460B (en)

Applications Claiming Priority (6)

Application Number Priority Date Filing Date Title
JP35831492A JP3163812B2 (en) 1992-12-25 1992-12-25 Montage creating apparatus and montage creating method
JP358314/92 1992-12-25
JP359848/92 1992-12-30
JP360167/92 1992-12-30
JP4359848A JP2796658B2 (en) 1992-12-30 1992-12-30 Image display control device and image display control method
JP36016792A JP3092368B2 (en) 1992-12-30 1992-12-30 Image display control device and image display control method

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
HK00101220.0A Division HK1022362B (en) 1992-12-25 1998-12-22 Object image display devices

Related Child Applications (1)

Application Number Title Priority Date Filing Date
HK00101220.0A Addition HK1022362B (en) 1992-12-25 1998-12-22 Object image display devices

Publications (2)

Publication Number Publication Date
HK1013460A1 HK1013460A1 (en) 1999-08-27
HK1013460B true HK1013460B (en) 2003-07-18

Similar Documents

Publication Publication Date Title
EP0609559B1 (en) Object image display device and method
EP0603892B1 (en) Object-image display apparatus
KR0142164B1 (en) Image display control device and method
US5644690A (en) Electronic montage creation device
US5612716A (en) Image display device
JPH10260666A (en) Display control device and recording medium recording display control program
HK1013460B (en) Object image display device and method
US5608852A (en) Evaluation data display devices
JP2796658B2 (en) Image display control device and image display control method
JP3027835B2 (en) Image display control device and image display control method
JP2845240B2 (en) Image display control device and image display control method
JP2995323B2 (en) Image display control device and image display control method
JP2939881B2 (en) Image display control device and image display control method
EP0584758B1 (en) Image display device
HK1022362B (en) Object image display devices
JP3455756B2 (en) Object image display control device and object image display control method
JP3489165B2 (en) Image output control device and image output control method
JP2006202188A (en) Image composition device and pattern checking method thereof
JP3543154B2 (en) Object image output control apparatus and object image output control method
JP3341049B2 (en) Image display control device and image display control method
HK1013459A (en) Object-image display apparatus
JPS6229979Y2 (en)
JPH06195431A (en) Montage preparation device
JPH07306950A (en) Montage image processing equipment
JPH07323606A (en) Printer