GB2359229A - Computer graphics rendering of partially transparent object - Google Patents
- Publication number
- GB2359229A (Application GB9926760A)
- Authority
- GB
- United Kingdom
- Prior art keywords
- data
- fragment
- accordance
- contribution
- scene
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Links
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T15/00—3D [Three Dimensional] image rendering
- G06T15/50—Lighting effects
- G06T15/503—Blending, e.g. for anti-aliasing
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2210/00—Indexing scheme for image generation or computer graphics
- G06T2210/62—Semi-transparency
Landscapes
- Engineering & Computer Science (AREA)
- Computer Graphics (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Image Generation (AREA)
Abstract
A computer graphics apparatus is operable to generate a representation of a three dimensional scene including an object 112 which is at least partially transmissive of light. The object 112 is represented by blending the colour of the object with the colour of the part of the scene obscured by the object. The blending function is preferably dependent upon the thickness of the object, which will govern the extent to which the colour of the light transmissive object 110 contributes to the final colour of the appearance of the object in the scene.
Description
COMPUTER GRAPHICS APPARATUS

This invention relates to
computer graphics apparatus, and is most particularly concerned with apparatus for generating graphical images of at least partially transparent objects.
In computer graphics, transparency can also be interpreted as a lack of opacity. In numerical terms, opacity can be represented as a number between 0 and 1, an opacity of 0 being applicable to a purely transparent material which allows the transmission of all light incident thereon and an opacity of 1 being applicable to a purely opaque material which allows no transmission of light therethrough.
When calculating final pixel values, the opacity value of a partially opaque material can be used as a "blending" value, representing the contribution of the colour of that material to the final pixel colour, relative to the colour of objects lying behind the material under consideration. In the simplest case, the blending value can be used as a multiplication factor, such as in the formula below:
C = aF + (1 - a)B    (1)

where C is the final pixel colour, a is the opacity value of a partially opaque object, F is the colour of the partially opaque object, and B is the background colour in front of which the partially opaque object is placed. Increasing the opacity value assigned to an object causes the contribution of the object colour to the final pixel colour to increase, and the contribution of the background colour consequently to decrease, and vice versa.
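By way of illustration only, a minimal C sketch of the blend of equation (1) follows; the Colour type, the function name and the use of floating point opacity in the range 0 to 1 are assumptions and are not taken from the patent.

```c
/* Sketch of equation (1): C = aF + (1 - a)B.
 * The Colour type, the function name and the floats in [0, 1]
 * are illustrative assumptions, not taken from the patent. */
typedef struct { float r, g, b; } Colour;

static Colour blend(float a, Colour front, Colour back)
{
    Colour c;
    c.r = a * front.r + (1.0f - a) * back.r;
    c.g = a * front.g + (1.0f - a) * back.g;
    c.b = a * front.b + (1.0f - a) * back.b;
    return c;   /* a = 1 gives the object colour, a = 0 gives the background */
}
```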
Opacity is described further in "Learn 3D Graphics Programming on the PC" by Richard F. Ferraro (ISBN 0-201-48332-7, Addison-Wesley Developers Press, 1996), in connection with the RenderWare Real Time 3D Graphics Library by Criterion Software Limited. In that book, partial opacity is handled by applying an opacity value to an object, by associating the opacity value either with a polygon or with a material to which the polygon points. The opacity value ranges between 0 and 1. Other ranges are possible, such as 0 to 255, which is a convenient range since it allows opacity to be expressed as an unsigned integer which can be stored in eight bits, i.e. one byte, of memory.
US Patent 5724561 describes a system for generating blend values (or opacity values) for the depiction of fog in a scene. In this document, the opacity value at a pixel is dependent on the contents of the Z-buffer. Various arrangements are made to provide full resolution of the range of opacity values between 0.0 and 1.0.
A first aspect of the present invention provides apparatus for generating a graphic representation of a partially opaque object in a scene, including means for receiving information relating to primitives of a partially opaque object, means for calculating opacity values for said partially opaque primitives depending on stored depth values, and means for blending background colour with primitive colour in accordance with calculated opacity values to generate pixel values representative of the partially opaque objects.
By treating the object as a collection of primitives, similar to the treatment of any other object in the scene, and by providing a facility for calculation of opacity values as required, the present invention provides advantages over the above described disclosure, in that a more realistic representation of partial opacity and transparency can be provided. Non-uniformities in opacity, such as might be encountered in fogs or smoke, are capable of being represented particularly realistically with the present invention.
A second aspect of the invention provides computer graphics apparatus for graphically representing objects in three dimensional space, including means for storing image data, means for receiving image data relating to a three dimensional object to be displayed graphically, means for determining the contribution of the appearance of the three dimensional object to be displayed to the overall appearance of the corresponding part of the final image, and means for determining image data as a function of stored image data and received image data in accordance with said calculated contribution.
The contribution may be determined by said determining means as a function of thickness of the three dimensional object to be displayed and a material opacity density attribute contained in said received image data relating to said three dimensional object to be displayed.
Preferably, said contribution is calculated on a pixel by pixel basis.
Preferably, the computer graphics apparatus includes means for determining, from received image data, the thickness of a three dimensional object to be displayed graphically, from which said contribution can thereby be calculated.
A third aspect of the invention provides a software development tool for the development of computer software for use in a computer apparatus, the development tool including a facility for inclusion in a computer program product such that said computer program product can be operable to configure a computer apparatus in accordance with the second aspect of the invention.
A fourth aspect of the invention provides a method of displaying a representation of a three dimensional object graphically, including the steps of storing image data for defining a graphical image, receiving image data relating to a three dimensional object to be displayed graphically, and combining said received image data with said stored data to determine the overall appearance of an image including a representation of said three dimensional object.
The step of combining data may include determining a contribution of said received image data to said overall image, and said contribution may be calculated by determining a thickness of said three dimensional object and an inherent opacity attribute of said object, and determining said contribution therefrom.
Further aspects and advantages of the present invention will become apparent from the following description of specific embodiments of the invention with reference to the accompanying drawings, in which:
Figure 1 illustrates computer graphics apparatus in accordance with a specific embodiment of the invention;
Figure 2 illustrates a graphics controller of the apparatus illustrated in Figure 1;
Figure 3 illustrates a fog processor of the graphics controller illustrated in Figure 2;
Figure 4 illustrates an index creation unit of the fog processor illustrated in Figure 3;
Figure 5 illustrates a graph showing look-up table values for a look-up table of the index creation unit illustrated in Figure 4;
Figure 6 illustrates a blend value generator of the fog processor illustrated in Figure 3;
Figure 7 illustrates a graph of opacity against depth for a selection of different opacity density values;
Figure 8 illustrates a graph showing look-up table values for a log function look-up table in the blend value generator illustrated in Figure 6;
Figure 9 illustrates a graph showing look-up table values for an exponential function look-up table in the blend value generator illustrated in Figure 6;
Figure 10 illustrates a blending unit of the graphics controller illustrated in Figure 2;
Figure 11 is a plan view of a view volume of a scene to be represented graphically by computer graphics apparatus as illustrated in Figure 1;
Figure 12 is a view, in the intended viewing direction, of the view volume illustrated in Figure 11;
Figure 13 is a schematic representation of a semi-opaque object defined in an octree structure; and
Figure 14 is a cross section of the semi-opaque object as shown in Figure 13 across the plane indicated by arrows A.
As illustrated in Figure 1, a computer apparatus 10 comprises a console 12 in communication with a television set 14 for display of graphical images thereon, and has connected thereto one or more input devices 16 such as a hand held controller, joystick or the like. Data storage media 18 such as an optical disc or a memory card can be inserted into the console 12 for storage and retrieval of information for operation of the console 12.
Alternatively or additionally, a modem can be connected to the console 12, via which the console 12 can receive a signal (such as from the Internet) bearing instructions to program the console 12 in a desired manner.
The console 12 comprises a game controller 20, capable of receiving input instructions from the input devices 16, and for storing and retrieving data from the storage media 18. The game controller 20 implements a game defined in computer implementable instructions from the data storage media 18, and generates data relating to the state of the game to be passed to a graphics data generator 22. The graphics data generator 22 acts on information supplied by the game controller 20 to define a scene to be represented graphically and a viewing direction in which the scene is to be viewed in a graphical representation. A displayed image is generated in a graphics controller 24, which receives configuration commands relating to the scene and the viewing direction from the graphics data generator 22, and therefrom generates a rasterised image to be supplied to the television set 14 for display.
The graphics data generator 22 makes use of an application programmers' interface (API) provided in the graphics controller 24, to supply data thereto in a required form. A scene to be represented graphically can be defined as a data structure including a list of objects, each object consisting of a pointer to a data structure defining that object to be included in the scene.
Each object data structure may include a list of pointers to data structures defining polygons, which define the shape of that object. Each polygon data structure may include a pointer to a data structure defining a material from which the object is to be constructed. Each material data structure includes attributes defining colour, opacity and the like, and may point to a texture map which maps over the surface of the object pointing (indirectly) to the material in use.
Data ordered in this way can be delivered by the graphics data generator 22 to the graphics controller 24 for the production of a rasterised image.
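As a rough illustration of that layout, the following C sketch mirrors the pointer structure just described; all type and field names are hypothetical, since the patent does not give a concrete definition.

```c
/* Hypothetical sketch of the scene data layout described above: a scene
 * lists objects, an object lists polygons, a polygon points to a material,
 * and a material may point to a texture map. All names are assumptions. */
typedef struct Texture Texture;          /* texture map, details omitted */

typedef struct {
    float         colour[3];
    unsigned char opacity;               /* 0..255, stored in one byte */
    Texture      *texture;               /* optional texture map, may be NULL */
} Material;

typedef struct {
    int       vertex_count;
    float   (*vertices)[3];
    Material *material;                  /* material the polygon is made from */
} Polygon;

typedef struct {
    int       polygon_count;
    Polygon **polygons;                  /* polygons defining the object shape */
} Object;

typedef struct {
    int      object_count;
    Object **objects;                    /* list of pointers to object data */
} Scene;
```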
Figure 2 illustrates the graphics controller 24 in more detail. As shown, the graphics controller 24 has a pipeline structure, being capable of receiving graphics data in the form previously described. As graphics data is delivered, it is stored in a polygon queue 30, which releases polygons to later parts of the pipeline one at a time.
A polygon is released by the polygon queue 30 and is passed to a scan conversion unit 32. The scan conversion unit 32 separates out the polygon into fragments, taking account of the level of resolution allowed by the memory capacity of a pixel buffer 42. A fragment corresponds to a pixel of the final image to be generated. Each fragment is delivered by the scan conversion unit 32 to an interpolation unit 34 which interpolates attributes to the fragment in question, from available data. This is because data might only be defined formally for vertices of a polygon. This is particularly important if the fragment is to be mapped to a texture map.
Following interpolation, the fragment is passed to a depth test unit 35 which compares depth data for the fragment with depth data stored in a Z-buffer 43 associated with the pixel buffer 42.
Generally, depth information for both the fragment and the existing data in the Z-buffer 43 is held in reciprocal form, i.e. in terms of 1/Z, where Z is the depth of the point in question from the viewing position. This provides a better range and resolution of available values of Z. Also, interpolation between reciprocal Z values can be performed more easily than between true Z values.
Without performing an inverting operation on the depth information, the depth test unit 35 establishes whether the fog fragment is in front of the existing object in the particular location in the pixel buffer 42. This is achieved by a simple comparison test. If the depth data (the reciprocal of the actual depth) of the fragment is greater than the contents of the Z-buffer 43, then the fragment is closer to the viewing position than is the background. This takes into account the fact that the values being compared are the reciprocals of actual depths, and so the reciprocal of the actual depth of a foreground object will be greater than the reciprocal of the actual depth of a background object.
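A minimal sketch of that comparison follows, assuming the reciprocal depths are held as plain floating point values (the patent does not fix a representation):

```c
/* Depth test on reciprocal depths: both values are 1/Z, so the larger
 * reciprocal belongs to the closer point and no inversion is needed.
 * Holding the values as plain floats is an assumption for illustration. */
static int fragment_is_in_front(float fragment_recip_z, float zbuffer_recip_z)
{
    return fragment_recip_z > zbuffer_recip_z;   /* nearer the viewing position */
}
```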
If the fog fragment is behind the existing contents of the pixel buffer 42, then the fog fragment is rejected, as not being in a position in which it can affect the scene from the current viewing position. Then, the next fog fragment is received from the interpolation unit 34.
However, if the fog fragment is in front of the existing object, then the fog fragment is considered further. The pipeline further comprises a fog mode selector 36 which receives a fragment from the depth test unit 35, and which is operable under the control of a pipeline configuration unit 44. The pipeline configuration unit 44 receives configuration flags from the graphics data generator 22 to control aspects of the graphics controller 24 which could have several possible modes of operation dependent on the needs of the game defined in the game controller 20 and the graphics data generator 22, or the availability of hardware functionality.
For instance, a graphics data generator 22 not taking advantage of the specific functionality of this embodiment might assign specific opacity values to specific polygons delivered to a graphics pipeline. The pipeline configuration unit 44 is responsive to this to configure the fog mode selector 36 so as to pass the fragment and the associated opacity values directly to a blending unit 38.
Alternatively, taking advantage of the functionality of the embodiment, an alternative blending mode could be set by the graphics data generator, this mode causing the pipeline configuration unit 44 to configure the fog mode selector 36 to pass fragments to a fog processor 40 with an associated opacity density, so that appropriate depth-dependent opacity values can be calculated therein and passed to the blending unit 38. Additionally, the fog mode selector 36 is arranged to check the opacity density of a fragment. If the opacity density is 0 (representing complete transparency), or 255 (representing complete opacity), then processing is trivial and the fragment can be written directly into the frame buffer.
The blending unit 38 is also configured by the pipeline configuration unit 44 to establish which of a variety of available blending functions should be applied. For instance, it might not be appropriate in all cases to use the blending function previously described in equation (1).
Furthermore, in Figure 2 an arrow indicates that the pipeline configuration unit 44 is operative to configure the interpolation unit 34. Several different interpolation algorithms can be provided in the interpolation unit, one of which is selected by the application program.
The blending unit 38 is operative to blend the characteristics of the fragment, be it a fog fragment or otherwise, into the contents of the pixel buffer 42, having regard to existing pixel information held therein. If a fog fragment is identified, and directed to the fog processor 40 by the fog mode selector 36, the fog processor 40 calculates an opacity value (also known as a blend value) for use in the blending unit.
Operation of the fog processor 40 will now be described in further detail with reference to Figure 3. The fog processor 40 comprises a background depth register 50 for receiving a depth value from the Z-buffer 43. Further, a fragment input register 52 comprises a fragment depth location 52a for loading depth information relating to a received fog fragment and a fragment opacity density location 52b for receiving the opacity density for that fog fragment.
The depth information held in the background depth register 50 is loaded from the Z-buffer 43, from the location corresponding to the pixel or group of pixels corresponding to the position of the fog fragment.
The depth information received in both registers 50, 52a is passed to an index creation unit 56, described in further detail with reference to Figure 4. The index creation unit generates a thickness value, representative of the distance between the fog fragment in question and the existing object in the frame buffer. That thickness is passed to a blend value generator 58, which also receives the opacity density value held in the opacity density location 52b in the fragment input register 52, and which calculates a blend value for use in the blending unit 38. The blend value generator 58 is described in further detail in Figure 6.
As shown in Figure 4, the index creation unit 56 receives depth data 1/Z_BG, retrieved from the Z-buffer 43, from the background depth register 50, and depth data 1/Z_FRAG, received from the fog mode selector 36, from the fragment input register 52. These data are passed to respective inverters 60, 62, which both refer to the same look-up table 64 to perform an inversion operation. As a result, values Z_BG and Z_FRAG are generated.
The look-up table may not offer a range of possible inputs, and corresponding outputs, over the full range of possible values of 1/Z. However, by expressing 1/Z as a fixed point number, the inverse thereof can be calculated using the mantissa thereof as the input value to the look-up table, and re-expressing the exponential part of the number as a "shift left" rather than a "shift right", or vice versa as the case may be, in accordance with the inversion operation. The graph of the inversion function, on which the contents of the look-up table 64 are based, is illustrated in Figure 5. The inputs of the look-up table 64 are read from the horizontal axis of that graph, and outputs of the look-up table 64 are taken from the vertical axis of the graph.
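The following C sketch illustrates that style of inversion; the 256-entry table, the normalisation of the mantissa to the range [1, 2) and the use of the standard frexpf/ldexpf helpers are assumptions made for illustration only.

```c
#include <math.h>

/* Sketch of a reciprocal computed via a mantissa look-up table, in the style
 * described above. Table size and mantissa normalisation are assumptions. */
#define LUT_BITS 8
#define LUT_SIZE (1 << LUT_BITS)

static float recip_lut[LUT_SIZE];

static void init_recip_lut(void)
{
    for (int i = 0; i < LUT_SIZE; i++)
        recip_lut[i] = 1.0f / (1.0f + (float)i / LUT_SIZE);  /* 1/m, m in [1, 2) */
}

/* Invert x = m * 2^e by looking up 1/m and negating the exponent, i.e.
 * turning a "shift right" into a "shift left" or vice versa.
 * x is assumed to be a positive depth-related value. */
static float lut_reciprocal(float x)
{
    int   e;
    float m = frexpf(x, &e);             /* x = m * 2^e with m in [0.5, 1) */
    m *= 2.0f;                           /* renormalise so m lies in [1, 2) */
    e -= 1;
    int index = (int)((m - 1.0f) * LUT_SIZE);
    if (index < 0) index = 0;
    if (index >= LUT_SIZE) index = LUT_SIZE - 1;
    return ldexpf(recip_lut[index], -e); /* (1/m) * 2^(-e) */
}
```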
Z_BG and Z_FRAG are then passed to a subtraction unit 66 which calculates Z_DIFF, the difference between the depth of the existing frame contents and the depth of the fog fragment under consideration. Z_DIFF, which is a thickness value, is then delivered to the blend value generator 58.
The blend value generator 58, as illustrated in Figure 6, receives the thickness value Z_DIFF from the index creation unit 56, and the opacity density value A from the fog fragment, via the opacity density location 52b of the fragment input register 52. The blend value generator 58 is designed to operate to simulate as far as possible the physical impact of a partially opaque substance, such as fog, on the propagation of light therethrough. Generally, attenuation of light through fog can be represented mathematically in many different ways. However, the preferred mathematical representation for the present specific embodiment is the following formula:
a = 1 - (1 - A)^T    (2)

In that formula, A is the opacity density inherent to the fog under consideration, and T is the thickness of the fog; a is the opacity resulting from a fog of opacity density A and thickness T. The opacity density A is constrained to a range between 0 and 1. Figure 7 shows a graph of opacity a against thickness T (expressed in nominal units U) for five different values of opacity density A. It can be seen that for lower opacity densities A the graph is approximately linear, while as the opacity density approaches 1, the function by which opacity is governed with respect to thickness approaches a step function. This function has been observed to produce a reasonable representation of the effect which fog has on light propagation.
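For illustration, equation (2) can be evaluated directly as follows, assuming floating point values for A and T; the embodiment described below instead uses look-up tables to avoid the power function.

```c
#include <math.h>

/* Direct evaluation of equation (2): a = 1 - (1 - A)^T, with opacity density A
 * in [0, 1] and thickness T in nominal units. A sketch for illustration only. */
static float fog_opacity(float A, float T)
{
    return 1.0f - powf(1.0f - A, T);
}
```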
The arrangement illustrated in Figure 6 is intended to provide a unit which can generate an opacity value based on the general form of equation (2). Certain allowances are made for the efficient running of a computer program, in that the opacity density A is now defined over a range between 0 and 255. As noted above, this allows the opacity density A to be defined as an unsigned integer, which again can be stored in a byte of memory. Furthermore, for the same reasons, a is scaled so as to range between 0 and 255. In that way, sufficient resolution is obtained without the need to represent decimal fractions in binary form. Finally, the thickness Z_DIFF is scaled by the unit of thickness U. Therefore, the formula given in equation (2), taking account of convenient scaling for implementation on computer apparatus, can be re-expressed as the following:
a = 255 [1 - (1 - A/255)^(Z_DIFF / U)]    (3)

The scaling by unit thickness U means that if Z_DIFF = U, then a = A.
The expression of equation (3) could be implemented on a computer by means of a look-up table with two arguments, namely A and Z_DIFF. However, a look-up table dependent on two arguments can be expensive in memory. Therefore, the present specific embodiment provides an arrangement which allows use of considerably less memory to implement equation (3), with no loss in the resolution of the result.
Initially, the thickness value Z_DIFF is received, and is scaled by division by a scaling factor, namely (-U log 2). In practice, the division is effected by precalculation of -1/(U log 2), which is then multiplied with Z_DIFF. Also, the opacity density value A from the fog fragment data is received by a look-up unit 72 which refers to a look-up table 74, which delivers a value representative of the expression log(1 - A/255). That number, and -Z_DIFF/(U log 2), are passed to a multiplication unit 76, outputting their product X. X is then passed to a look-up unit 78, making reference to a look-up table 80, which delivers a value Y, which equals 0.5^X. Y is then passed to a final calculation unit 82 which calculates a in accordance with the following equation:
a = 255 [1 - Y]    (4)

Since Y = 0.5^X and X = (-Z_DIFF / (U log 2)) × log(1 - A/255), by substitution into equation (4), the following expression for a can be obtained:
a = 255 [1 - 0.5^((-Z_DIFF / (U log 2)) × log(1 - A/255))]    (5)

Since

0.5^(-log(1 - A/255) / log 2) = 1 - A/255    (6)

is true for all A in the range of 0 to 254, equation (5) is equivalent to equation (3).
The foregoing demonstrates that the blend value generator 58 is operative to produce a blend value a, from an opacity density value A and a thickness value Z_DIFF, in accordance with equation (3). Graphs showing how the contents of the look-up tables 74, 80 are derived, in accordance with the functions illustrated in blocks 72 and 78 in Figure 6, are illustrated for further clarification in Figures 8 and 9 respectively.
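A sketch of the same calculation in C is given below for clarification; the table size, the use of double precision and the function names are assumptions, and the pow() call stands in for look-up table 80.

```c
#include <math.h>

/* Sketch of the blend value generator of Figure 6 (equations (3) to (6)).
 * log_table stands in for look-up table 74 and pow() for look-up table 80;
 * the table size and double precision are illustrative assumptions. */
static double log_table[255];            /* log(1 - A/255) for A = 0..254 */

static void init_log_table(void)
{
    for (int A = 0; A < 255; A++)
        log_table[A] = log(1.0 - (double)A / 255.0);
}

/* Returns a = 255 * (1 - (1 - A/255)^(z_diff/U)), computed as
 * X = (-z_diff / (U log 2)) * log(1 - A/255), Y = 0.5^X, a = 255 * (1 - Y). */
static int blend_value(double z_diff, int A, double U)
{
    if (A <= 0)   return 0;              /* completely transparent */
    if (A >= 255) return 255;            /* completely opaque */

    double X = (-z_diff / (U * log(2.0))) * log_table[A];
    double Y = pow(0.5, X);              /* look-up table 80 in the embodiment */
    return (int)(255.0 * (1.0 - Y) + 0.5);
}
```

Note that with z_diff equal to U the function returns A, matching the scaling property of equation (3) stated above.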
Once a blend value a has been obtained, the blending unit 38 operates in accordance with a fog blending mode to generate new colour information to be written into the frame buffer 42. The operation of the blending unit will be described in further detail with reference to Figure 10. The blending unit 38 includes a blend mode register 90 which receives information relating to the control of the blending function unit 96. In a fog fragment blending mode, the blending function unit 96 is made responsive to receive opacity values a from the fog processor 40, to blend a fog colour received into a fog colour register 94 and the frame colour received into a frame colour register 92 from the frame buffer.
The blend mode received in the blend mode register 90 is expressed as a pair of values. If a polygon to be processed is a fog polygon which bounds the far side of a fog object, it must be rendered so that it does not itself contribute opacity to the scene, but alters the contents of the depth values held in the Z-buffer 43. This is achieved by setting the blend mode to (1, 0). This signifies that the entirety of the background is to be held in the pixel buffer 42 (the background being the existing contents of the pixel buffer 42) and the colour of the fog does not contribute to the final colour stored in the pixel buffer 42. Further, a completely opaque polygon to be processed will have opacity density A = 255. In that case, the blend mode sent to the blend mode register 90 will be (0, 1).
Generally, where the pair of values for the blend mode are expressed as (p, q) the following expression holds:
C_OUT = p x C_FRAME + q x C_FOG    (7)

For a fog fragment, the blend mode is set such that q = a and p = 1 - a. It will be appreciated that more complicated ways of blending fog colour with existing frame colour could be provided. However, it is an advantage of the blending expression of equation (7) that new colour values can be obtained with little further calculation.
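As a simple illustration of equation (7), assuming the blend factors and colours are held as floats (the Colour type and the function name are hypothetical):

```c
/* Sketch of equation (7): C_OUT = p * C_FRAME + q * C_FOG.
 * For a fog fragment (p, q) = (1 - a, a); the mode (1, 0) keeps the frame
 * colour and only updates depth, and (0, 1) replaces it with the fog colour.
 * The Colour type and names are illustrative assumptions. */
typedef struct { float r, g, b; } Colour;

static Colour apply_blend_mode(float p, float q, Colour frame, Colour fog)
{
    Colour out;
    out.r = p * frame.r + q * fog.r;
    out.g = p * frame.g + q * fog.g;
    out.b = p * frame.b + q * fog.b;
    return out;
}
```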
Figures 11 and 12 illustrate an example of a fog object included in a scene, and the effect on the appearance of a fog object which can be caused by differences in opacity density and thickness. As shown in Figure 11, two objects 102, 104 which point to a fully opaque material are inserted into a scene, of which a viewable volume 100 is indicated. Between those two objects 102, 104, a fog object 110 is inserted. The fog object 110 is positioned so as to overlap the front face of the far object 102, at an angle, so that a portion 112 of the fog object tapers to zero thickness.
It can be seen in Figure 12 that the portion 112 of the fog object 110 which tapers to zero thickness has a colour which gradually becomes closer and closer to the colour of the background object as the thickness decreases. This is because the opacity of that portion 112 becomes gradually less than the opacity of the remaining portion of the fog object 110.
Moreover, the lower half of the object 110 is illustrated with a higher opacity density than the upper half, resulting in less of the background colour contributing to the appearance of the fog object 110 in the lower half thereof.
In processing a fog object, certain savings can be made on processing time by making use of space partitioning techniques. For instance, a fog object could be designed such that it is defined by means of an octree. A very simple fog object is illustrated in Figures 13 and 14; it will be appreciated that Figures 13 and 14 are intended to represent a simple example only and the low level of detail and coarse resolution illustrated therein would not often be acceptable in computer graphics. An octree is useful in that it allows back-to-front ordering of fog polygons very easily. It is important to insert fog polygons into a scene from back to front so that thickness can properly be taken into account. Otherwise, it would be necessary to refer back to previous polygon insertion when calculating new blended colour values.
Moreover, the octree representation, as shown in Figures 13 and 14, comprises a number of cubic cells. Some of these cells 122 are shown at a lower level of division than others 124. However, adjacent cells share fog polygon sides; of adjacent polygons, only one needs to be rendered. In accordance with the specific embodiment, it is preferable that the front face of each cell is inserted into a scene, by blending as a fog fragment. A back face of the cell would be treated as merely for updating the depth information held in the Z-buffer, i.e. in the blend mode represented by values (1, 0) as explained above. Figure 14, by means of shading against faces 126 to be inserted, indicates that certain faces, such as those marked 128, need not be included in the polygon queue 30 for insertion into the scene. By not including those faces, the efficiency of rendering the scene can be enhanced.
The present invention as described can be used for the representation of all kinds of partially opaque objects in a scene. These can include static objects, such as glass, water or fog, which may be uniformly or non-uniformly partially opaque, or dynamic objects such as clouds, smoke or the like. In the case of a fog, the fog can be applied over the whole or just a part of a visible scene. Objects can also be uniformly or non-uniformly coloured, such colour being capable of modulation by textures applied to a fog object.
Whereas the embodiment processes fragments, where each fragment corresponds to a pixel of the final image to be generated, other resolutions of processing could be considered. This might be used to increase the processing speed of the image generating apparatus, or to decrease the potential for aliasing effects to affect the quality of the final image. Further, the interpolation unit 34 is described as having several different interpolation algorithms. Alternatively, the interpolation unit 34 might only be configured to interpolate in a specific manner, such that the application program has no control over the interpolation method used.
The specific embodiment demonstrates representation of a partially transparent object defined in terms of its surface geometry. Other methods of defining the shape of an object can be provided, such as wire frame models or mathematical models. In each instance, it is possible to identify the thickness of the object in the viewing direction, and to identify the contribution of a transparent material of that thickness in the viewing direction, in order to cause blending of the fog colour with the background colour.
Claims (17)
1. A computer graphics apparatus for graphically representing a three dimensional scene, the apparatus comprising: means for receiving data defining an object to be graphically represented; means for identifying received object defining data defining an object to be graphically represented as at least partially light transmissive; and means for generating image data representing a graphical representation of said scene from a viewing position, said image data generating means including means for determining a contribution of an object, defined by data identified by said identifying means, to the graphical representation thereof taking into account the graphical representation of any objects obscured by said object in said scene from said viewing position.
2. Apparatus in accordance with claim 1 including means for processing received object data to generate fragment data representing fragments of said object defined by said received object data, wherein said contribution determining means is operable to generate a contribution for each fragment of an object identified by said identifying means.
3. Apparatus in accordance with claim 2 and comprising display means for displaying a rasterised image, and wherein the data processing means is operable to process said object data into fragment data, each fragment corresponding with one pixel of a final rasterised image.
4. Apparatus in accordance with claim 2 or claim 3 comprising depth testing means operable to compare depth data for a fragment of an at least partially light transmissive object with further objects of said scene in said viewing direction of said fragment, to determine visibility of said fragment from said viewing direction.
5. Apparatus in accordance with any one of claims 2 to 4 wherein said object data receiving means is operable to receive an opacity density value for an object, and said contribution determining means is operable to determine a contribution on the basis of said received opacity density value and a thickness of said object in said viewing direction.
6. Apparatus in accordance with claim 5 wherein said contribution determining means is operable to determine said contribution in accordance with a limited exponential function of said thickness.
7. Apparatus in accordance with claim 6 wherein the rate of change of the contribution with respect to the thickness is dependent upon the opacity density value.
8. Apparatus in accordance with claim 7 wherein said contribution is determined in accordance with the following formula:
a = 1 - (1 - A)^T

wherein:
a is the contribution;
A is the opacity density value (in the range 0 to 1); and
T is the thickness (in nominal units).
9. A method of graphically representing a three dimensional scene, comprising: receiving data defining an object to be graphically represented; checking received object data to identify data defining an object to be graphically represented as at least partially light transmissive; generating image data from said received object data, said image data defining a graphical representation of said scene from a viewing position, including determining a contribution of an object defined by data identified in said checking step to the graphical representation thereof, taking into consideration the graphical representation of any objects obscured by said object in said scene from said viewing position.
10. A method in accordance with claim 9 including the step of processing received object data to generate fragment data representing fragments of said object defined by said received object data, and wherein said step of determining a contribution includes determining a contribution for each fragment generated by said fragment generating step.
11. A method in accordance with claim 10 including the step of displaying a rasterised image on the basis of image data generated in said generating step, and wherein each fragment generated in said fragment generating step corresponds with a pixel of a rasterised image.
12. A method in accordance with claim 10 or claim 11 comprising comparing the distance of a fragment in said scene relative to the viewing position with the distance of any other objects in the scene from said viewing position in the same viewing direction as the view direction of said fragment, to determine a visibility of said fragment in said viewing direction.
13. A method in accordance with any one of claims 10 to 12 wherein said data receiving step includes receiving an opacity density value for an object, and said contribution determining step includes determining a contribution on the basis of said received opacity density value and a thickness of said object in said viewing direction.
14. A storage medium storing computer executable instructions for configuring a computer apparatus to operate as apparatus in accordance with any one of claims 1 to 8.
15. A signal carrying computer executable instructions for configuring a computer apparatus to operate as apparatus in accordance with any one of claims 1 to 8.
16. A storage medium storing computer processor executable instructions for configuring a computer to operate in accordance with the method of one of claims 9 to 13.
17. A signal carrying processor executable instructions for configuring a computer to operate in accordance with a method of one of claims 9 to 13.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
GB0321448A GB2389503B (en) | 1999-11-11 | 1999-11-11 | Computer graphics apparatus |
GB9926760A GB2359229B (en) | 1999-11-11 | 1999-11-11 | Three-dimensional computer graphics apparatus |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
GB9926760A GB2359229B (en) | 1999-11-11 | 1999-11-11 | Three-dimensional computer graphics apparatus |
Publications (3)
Publication Number | Publication Date |
---|---|
GB9926760D0 GB9926760D0 (en) | 2000-01-12 |
GB2359229A true GB2359229A (en) | 2001-08-15 |
GB2359229B GB2359229B (en) | 2003-12-24 |
Family
ID=10864384
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
GB9926760A Expired - Fee Related GB2359229B (en) | 1999-11-11 | 1999-11-11 | Three-dimensional computer graphics apparatus |
GB0321448A Expired - Fee Related GB2389503B (en) | 1999-11-11 | 1999-11-11 | Computer graphics apparatus |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
GB0321448A Expired - Fee Related GB2389503B (en) | 1999-11-11 | 1999-11-11 | Computer graphics apparatus |
Country Status (1)
Country | Link |
---|---|
GB (2) | GB2359229B (en) |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
GB2226937A (en) * | 1988-12-05 | 1990-07-11 | Rediffusion Simulation Ltd | Image display |
US5831627A (en) * | 1996-06-27 | 1998-11-03 | R/Greenberg Associates | System and method for providing improved graphics generation performance using memory lookup |
GB2331217A (en) * | 1997-11-07 | 1999-05-12 | Sega Enterprises Kk | Image processor |
GB2343600A (en) * | 1998-11-06 | 2000-05-10 | Videologic Ltd | Depth sorting for use in 3-dimensional computer shading and texturing systems |
GB2344039A (en) * | 1998-09-10 | 2000-05-24 | Sega Enterprises Kk | Blending processing for overlapping translucent polygons |
EP1014308A2 (en) * | 1998-12-22 | 2000-06-28 | Mitsubishi Denki Kabushiki Kaisha | Method and apparatus for volume rendering with multiple depth buffers |
GB2354416A (en) * | 1999-09-17 | 2001-03-21 | Imagination Tech Ltd | Depth based blending for 3D graphics systems |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5339386A (en) * | 1991-08-08 | 1994-08-16 | Bolt Beranek And Newman Inc. | Volumetric effects pixel processing |
-
1999
- 1999-11-11 GB GB9926760A patent/GB2359229B/en not_active Expired - Fee Related
- 1999-11-11 GB GB0321448A patent/GB2389503B/en not_active Expired - Fee Related
Patent Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
GB2226937A (en) * | 1988-12-05 | 1990-07-11 | Rediffusion Simulation Ltd | Image display |
US5831627A (en) * | 1996-06-27 | 1998-11-03 | R/Greenberg Associates | System and method for providing improved graphics generation performance using memory lookup |
GB2331217A (en) * | 1997-11-07 | 1999-05-12 | Sega Enterprises Kk | Image processor |
GB2344039A (en) * | 1998-09-10 | 2000-05-24 | Sega Enterprises Kk | Blending processing for overlapping translucent polygons |
GB2343600A (en) * | 1998-11-06 | 2000-05-10 | Videologic Ltd | Depth sorting for use in 3-dimensional computer shading and texturing systems |
EP1014308A2 (en) * | 1998-12-22 | 2000-06-28 | Mitsubishi Denki Kabushiki Kaisha | Method and apparatus for volume rendering with multiple depth buffers |
GB2354416A (en) * | 1999-09-17 | 2001-03-21 | Imagination Tech Ltd | Depth based blending for 3D graphics systems |
Also Published As
Publication number | Publication date |
---|---|
GB2389503A (en) | 2003-12-10 |
GB0321448D0 (en) | 2003-10-15 |
GB9926760D0 (en) | 2000-01-12 |
GB2389503B (en) | 2004-04-21 |
GB2359229B (en) | 2003-12-24 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
Diepstraten et al. | Transparency in interactive technical illustrations | |
Westermann et al. | Efficiently using graphics hardware in volume rendering applications | |
EP0610004B1 (en) | Image generating apparatus and method of generating an image | |
Elinas et al. | Real-time rendering of 3D clouds | |
US5959631A (en) | Hardware and software for the visualization of three-dimensional data sets | |
GB2223384A (en) | Shadow algorithm | |
JP2004038926A (en) | Texture map editing | |
KR20010113730A (en) | Method and apparatus for processing images | |
US6791544B1 (en) | Shadow rendering system and method | |
US6396502B1 (en) | System and method for implementing accumulation buffer operations in texture mapping hardware | |
US6219062B1 (en) | Three-dimensional graphic display device | |
EP1221141B1 (en) | Depth based blending for 3d graphics systems | |
Pighin et al. | Progressive previewing of ray-traced images using image-plane discontinuity meshing | |
KR0147439B1 (en) | Hardware-based graphical workstation solution for refraction | |
JP2006517705A (en) | Computer graphics system and computer graphic image rendering method | |
Nielsen et al. | Fast texture-based form factor calculations for radiosity using graphics hardware | |
GB2359229A (en) | Computer graphics rendering of partially transparent object | |
JP3642593B2 (en) | Image composition apparatus and image composition method | |
US5900882A (en) | Determining texture coordinates in computer graphics | |
US6693634B1 (en) | Reduction rate processing circuit and method with logarithmic operation and image processor employing same | |
GB2359230A (en) | Computer graphics rendering of partially transparent object | |
US6460063B1 (en) | Division circuit and graphic display processing apparatus | |
US6377279B1 (en) | Image generation apparatus and image generation method | |
Lowe et al. | A fragment culling technique for rendering arbitrary portals | |
WO1997002546A1 (en) | Computer graphics circuit |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PCNP | Patent ceased through non-payment of renewal fee |
Effective date: 20181111 |