US20100177162A1 - Method and system for enabling 3d video and image processing using one full resolution video stream and one lower resolution video stream - Google Patents
Info
- Publication number
- Publication number: US20100177162A1 (U.S. application Ser. No. 12/629,247)
- Authority
- US
- United States
- Prior art keywords
- video
- resolution
- data stream
- full resolution
- video data
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/10—Processing, recording or transmission of stereoscopic or multi-view image signals
- H04N13/106—Processing image signals
- H04N13/139—Format conversion, e.g. of frame-rate or size
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/10—Processing, recording or transmission of stereoscopic or multi-view image signals
- H04N13/106—Processing image signals
- H04N13/161—Encoding, multiplexing or demultiplexing different image signal components
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/10—Processing, recording or transmission of stereoscopic or multi-view image signals
- H04N13/194—Transmission of image signals
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/204—Image signal generators using stereoscopic image cameras
- H04N13/25—Image signal generators using stereoscopic image cameras using two or more image sensors with different characteristics other than in their location or field of view, e.g. having different resolutions or colour pickup characteristics; using image signals from one sensor to control the characteristics of another sensor
Definitions
- Certain embodiments of the invention relate to wireless communication. More specifically, certain embodiments of the invention relate to a method and system for enabling 3D video and image processing using one full resolution video stream and one lower resolution video stream.
- An image may be presented on a display device, for example a television, a monitor and/or a gaming console.
- Most video broadcasts nowadays utilize video processing applications that enable broadcasting video images in the form of bit streams that comprise information regarding characteristics of the image to be displayed.
- These video applications may utilize various interpolation and/or rate conversion functions to present content comprising still and/or moving images on a display.
- De-interlacing functions may be utilized to convert moving and/or still images to a format that is suitable for certain types of display devices that are unable to handle interlaced content.
- Interlaced 3D and/or 2D video comprises fields, each of which may be captured at a distinct time interval.
- a frame may comprise a pair of fields, for example, a top field and a bottom field.
- the pictures forming the video may comprise a plurality of ordered lines.
- During one field interval, video content for the even-numbered lines may be captured.
- During the next field interval, video content for the odd-numbered lines may be captured.
- the even-numbered lines may be collectively referred to as the top field, while the odd-numbered lines may be collectively referred to as the bottom field.
- the odd-numbered lines may be collectively referred to as the top field, while the even-numbered lines may be collectively referred to as the bottom field.
- Interlaced video may comprise fields that were converted from progressive frames.
- a progressive frame may be converted into two interlaced fields by organizing the even numbered lines into one field and the odd numbered lines into another field.
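The progressive-to-interlaced conversion described above can be sketched as follows. This is a minimal illustration, not part of the patent: the NumPy representation and the 8-line frame are assumptions.

```python
import numpy as np

# A hypothetical 8-line by 4-pixel frame; the pixel values are illustrative.
frame = np.arange(32).reshape(8, 4)

def split_progressive_frame(frame):
    """Split a progressive frame into two interlaced fields by organizing
    the even-numbered lines into one field and the odd-numbered lines
    into another, per the convention described above."""
    top_field = frame[0::2]      # even-numbered lines (0, 2, 4, ...)
    bottom_field = frame[1::2]   # odd-numbered lines (1, 3, 5, ...)
    return top_field, bottom_field

top, bottom = split_progressive_frame(frame)
```

Under the opposite convention also mentioned above, the two fields would simply swap labels.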
- a system and/or method for enabling 3D video and image processing using one full resolution video stream and one lower resolution video stream substantially as shown in and/or described in connection with at least one of the figures, as set forth more completely in the claims.
- FIG. 1A is a block diagram of an exemplary wireless system that is operable to provide 3D video and image processing using one full resolution video stream and one lower resolution video stream, in accordance with an embodiment of the invention.
- FIG. 1B is a block diagram illustrating exemplary 3D video capture, in accordance with an embodiment of the invention.
- FIG. 2 is a block diagram illustrating an exemplary 3D video capture and processing sequence, in accordance with an embodiment of the invention.
- FIG. 3 is a diagram illustrating exemplary 3D video implementation, in accordance with an embodiment of the invention.
- FIG. 4 is a diagram illustrating exemplary networked 3D video implementation, in accordance with an embodiment of the invention.
- FIG. 5 is a block diagram illustrating exemplary steps for enabling 3D video and image processing using one full resolution video stream and one lower resolution video stream, in accordance with an embodiment of the invention.
- an output full resolution 3D video may be generated utilizing a first video data stream generated from a high resolution video source and a second video data stream generated from a low resolution video source, wherein a resolution of the output full resolution 3D video is greater than a resolution of the first video data stream and the second video data stream.
- 3D video or image processing on the data streams may be performed within the wireless communication device.
- the 3D video or image processing may be performed external to the wireless communication device.
- the data streams may be compressed prior to communicating them for the external 3D video or image processing.
- the 3D video or images may be displayed locally on the wireless communication device.
- the 3D video or images may be formatted so that they may be locally presented on a display of the wireless communication device.
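The asymmetric two-stream idea above can be sketched as follows. The patent does not disclose the actual combination algorithm, so this sketch stands in a simple nearest-neighbour upscale of the low-resolution view before pairing it with the full-resolution view; the function name, dimensions, and NumPy representation are all assumptions.

```python
import numpy as np

def generate_full_resolution_3d(full_res_view, low_res_view):
    """Upscale the low-resolution view to the full-resolution geometry
    (nearest-neighbour, as a placeholder for the unspecified processing)
    and stack it with the full-resolution view as a stereo pair."""
    h, w = full_res_view.shape
    lh, lw = low_res_view.shape
    row_idx = np.arange(h) * lh // h   # map each output row to a source row
    col_idx = np.arange(w) * lw // w   # map each output column to a source column
    upscaled = low_res_view[row_idx][:, col_idx]
    # Both views of the output pair are at full resolution.
    return np.stack([full_res_view, upscaled])

left = np.zeros((1080, 1920), dtype=np.uint8)   # full-resolution sensor
right = np.full((480, 640), 7, dtype=np.uint8)  # lower-resolution (VGA) sensor
stereo = generate_full_resolution_3d(left, right)
```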
- FIG. 1A is a block diagram of an exemplary wireless system that is operable to provide 3D video and image processing using one full resolution video stream and one lower resolution video stream, in accordance with an embodiment of the invention.
- the wireless device 150 may comprise an antenna 151 , a chip 162 , a transceiver 152 , a baseband processor 154 , a processor 155 , a system memory 158 , a logic block 160 , a high resolution camera 164 A, a low resolution camera 164 B, an audio CODEC 172 A, a video CODEC 172 B, and an external headset port 166 .
- the wireless device 150 may also comprise an analog microphone 168 , integrated hands-free (IHF) stereo speakers 170 , a hearing aid compatible (HAC) coil 174 , a dual digital microphone 176 , a vibration transducer 178 , and a touchscreen/display 180 .
- 3D video may be more desirable because humans may perceive 3D images as more realistic than 2D images.
- the transceiver 152 may comprise suitable logic, circuitry, interfaces, and/or code that may be enabled to modulate and upconvert baseband signals to RF signals for transmission by one or more antennas, which may be represented generically by the antenna 151 .
- the transceiver 152 may also be enabled to downconvert and demodulate received RF signals to baseband signals.
- the RF signals may be received by one or more antennas, which may be represented generically by the antenna 151 . Different wireless systems may use different antennas for transmission and reception.
- the transceiver 152 may be enabled to execute other functions, for example, filtering the baseband and/or RF signals, and/or amplifying the baseband and/or RF signals.
- the transceiver 152 may be implemented as a separate transmitter and a separate receiver.
- the plurality of transceivers, transmitters and/or receivers may enable the wireless device 150 to handle a plurality of wireless protocols and/or standards including cellular, WLAN and PAN.
- Wireless technologies handled by the wireless device 150 may comprise GPS, GALILEO, GLONASS, GSM, CDMA, CDMA2000, WCDMA, GNSS, GPRS, EDGE, WIMAX, WLAN, LTE, 3GPP, UMTS, BLUETOOTH, and ZIGBEE, for example.
- the baseband processor 154 may comprise suitable logic, circuitry, interfaces, and/or code that may be enabled to process baseband signals for transmission via the transceiver 152 and/or the baseband signals received from the transceiver 152 .
- the processor 155 may be any suitable processor or controller such as a CPU, DSP, ARM, or any type of integrated circuit processor.
- the processor 155 may comprise suitable logic, circuitry, and/or code that may be enabled to control the operations of the transceiver 152 and/or the baseband processor 154 .
- the processor 155 may be utilized to update and/or modify programmable parameters and/or values in a plurality of components, devices, and/or processing elements in the transceiver 152 and/or the baseband processor 154 . At least a portion of the programmable parameters may be stored in the system memory 158 .
- Control and/or data information which may comprise the programmable parameters, may be transferred from other portions of the wireless device 150 , not shown in FIG. 1 , to the processor 155 .
- the processor 155 may be enabled to transfer control and/or data information, which may include the programmable parameters, to other portions of the wireless device 150 , not shown in FIG. 1 , which may be part of the wireless device 150 .
- the processor 155 may utilize the received control and/or data information, which may comprise the programmable parameters or video source data, to determine an operating mode of the transceiver 152 .
- the processor 155 may be utilized to select a specific frequency for a local oscillator, a specific gain for a variable gain amplifier, configure the local oscillator and/or configure the variable gain amplifier for operation in accordance with various embodiments of the invention.
- the received video source data and/or processed full-resolution 3D video data may be stored in the system memory 158 via the processor 155 , for example.
- the information stored in system memory 158 may be transferred to the transceiver 152 from the system memory 158 via the processor 155 .
- the processor 155 may be operable to process received video data streams from a high resolution video source and a low resolution video source. The processor 155 may thereby generate a full resolution 3D video from the received data streams.
- the system memory 158 may comprise suitable logic, circuitry, interfaces, and/or code that may be enabled to store a plurality of control and/or video data information, including video or image processing parameters, or full resolution 3D video data.
- the system memory 158 may store at least a portion of the programmable parameters that may be manipulated by the processor 155 .
- the logic block 160 may comprise suitable logic, circuitry, interfaces, and/or code that may enable controlling of various functionalities of the wireless device 150 .
- the logic block 160 may comprise one or more state machines that may generate signals to control the transceiver 152 and/or the baseband processor 154 .
- the logic block 160 may also comprise registers that may hold data for controlling, for example, the transceiver 152 and/or the baseband processor 154 .
- the logic block 160 may also generate and/or store status information that may be read by, for example, the processor 155 . Amplifier gains and/or filtering characteristics, for example, may be controlled by the logic block 160 .
- the BT radio/processor 163 may comprise suitable circuitry, logic, interfaces, and/or code that may enable transmission and reception of Bluetooth signals.
- the BT radio/processor 163 may enable processing and/or handling of BT baseband signals.
- the BT radio/processor 163 may process or handle BT signals received and/or BT signals transmitted via a wireless communication medium.
- the BT radio/processor 163 may also provide control and/or feedback information to/from the baseband processor 154 and/or the processor 155 , based on information from the processed BT signals.
- the BT radio/processor 163 may communicate information and/or data from the processed BT signals to the processor 155 and/or to the system memory 158 .
- the BT radio/processor 163 may receive information from the processor 155 and/or the system memory 158, which may be processed and transmitted via the wireless communication medium to a Bluetooth headset, for example.
- the high-resolution camera 164 A may comprise suitable logic, circuitry, interfaces, and/or code that may be operable to capture video and images.
- the high-resolution camera 164 A may be capable of capturing high-definition images and video and may be controlled via the processor 155 , for example.
- the high-resolution camera 164 A may comprise a multi-megapixel sensor, for example.
- the low-resolution camera 164 B may comprise suitable logic, circuitry, interfaces, and/or code that may be operable to capture video and images.
- the low-resolution camera 164 B may comprise a smaller, lower-cost camera than the high-resolution camera 164 A, and may comprise a VGA image/video camera, for example.
- the wireless device 150 may comprise two cameras for full resolution 3D images and video, without the need for two high-resolution cameras.
- the 3D image and/or video may be displayed on the touchscreen/display 180 , for example, may be stored in the system memory 158 , and/or may be communicated externally via the transceiver 152 and the antenna 151 .
- the audio CODEC 172 A may comprise suitable circuitry, logic, interfaces, and/or code that may process audio signals received from and/or communicated to input/output devices.
- the input/output devices may be within or communicatively coupled to the wireless device 150, and may comprise the analog microphone 168, the stereo speakers 170, the hearing aid compatible (HAC) coil 174, the dual digital microphone 176, and the vibration transducer 178, for example.
- the audio CODEC 172 A may be operable to up-convert and/or down-convert signal frequencies to desired frequencies for processing and/or transmission via an output device.
- the video CODEC 172 B may comprise suitable circuitry, logic, interfaces, and/or code that may be operable to process video signals received and/or communicated from and/or to input output devices, such as the high-resolution camera 164 A and the low-resolution camera 164 B.
- the video CODEC 172 B may communicate processed video signals to the processor 155 for further processing, or for communication to devices external to the wireless device 150 via the transceiver 152 .
- the chip 162 may comprise an integrated circuit with multiple functional blocks integrated within, such as the transceiver 152 , the baseband processor 154 , the BT radio/processor 163 , the audio CODEC 172 A, and the video CODEC 172 B.
- the number of functional blocks integrated in the chip 162 is not limited to the number shown in FIG. 1 . Accordingly, any number of blocks may be integrated on the chip 162 , including cameras such as the high-resolution camera 164 A and the low-resolution camera 164 B, depending on chip space and wireless device 150 requirements, for example.
- the external headset port 166 may comprise a physical connection for an external headset to be communicatively coupled to the wireless device 150 .
- the analog microphone 168 may comprise suitable circuitry, logic, and/or code that may detect sound waves and convert them to electrical signals via a piezoelectric effect, for example.
- the electrical signals generated by the analog microphone 168 may comprise analog signals that may require analog to digital conversion before processing.
- the stereo speakers 170 may comprise a pair of speakers that may be operable to generate audio signals from electrical signals received from the audio CODEC 172 A.
- the HAC coil 174 may comprise suitable circuitry, logic, and/or code that may enable communication between the wireless device 150 and a T-coil in a hearing aid, for example. In this manner, electrical audio signals may be communicated to a user that utilizes a hearing aid, without the need for generating sound signals via a speaker, such as the stereo speakers 170 , and converting the generated sound signals back to electrical signals in a hearing aid, and subsequently back into amplified sound signals in the user's ear, for example.
- the dual digital microphone 176 may comprise suitable circuitry, logic, and/or code that may be operable to detect sound waves and convert them to electrical signals.
- the electrical signals generated by the dual digital microphone 176 may comprise digital signals, and thus may not require analog to digital conversion prior to digital processing in the audio CODEC 172 A.
- the dual digital microphone 176 may enable beamforming capabilities, for example.
- the vibration transducer 178 may comprise suitable circuitry, logic, and/or code that may enable notification of an incoming call, alerts and/or messages to the wireless device 150 without the use of sound.
- the vibration transducer may generate vibrations that may be in sync with, for example, audio signals such as speech or music.
- video stream data may be communicated from image and/or video sources, such as the high-resolution camera 164 A and the low-resolution camera 164 B to the video CODEC 172 B.
- the video CODEC 172 B may process the received video data before communicating the data to the processor for further processing or communication to a device external to the wireless device 150 .
- the high-resolution camera 164 A and the low-resolution camera 164 B may be operable to capture images and/or video. By utilizing two spatially separated cameras, a 3D image/video may be generated from the two image/video signals.
- the high-resolution camera 164 A and the low-resolution camera 164 B may communicate video data streams to the video CODEC 172 B and process the received signals before communicating the processed signals to the processor for further processing.
- the processor 155 may be operable to process 3D video and/or images obtained utilizing the high-resolution camera 164 A and the low-resolution camera 164 B.
- the cost and size of the wireless device 150 may be reduced by utilizing a smaller, lower-cost, lower-resolution camera with a high-resolution camera, while still supporting full-resolution 3D images and video.
- the processing performed by the processor 155 may comprise right and left-view generation to enable 3D video, which may comprise still and/or moving images.
- Various methodologies may be utilized to capture, generate (at capture time or at playback time), and/or render 3D video images.
- One of the more common methods for implementing 3D video is stereoscopic 3D video.
- the 3D video impression is generated by rendering multiple views, most commonly two views: a left view and a right view, corresponding to the viewer's left eye and right eye to give depth to displayed images.
- left view and right view video sequences may be captured and/or processed to enable creating 3D images.
- the left view and right view data may then be communicated either as separate streams, or may be combined into a single transport stream and only separated into different view sequences by the end-user receiving/displaying device.
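One possible realisation of the single-transport-stream option described above is frame-level tagging and interleaving. The following sketch is illustrative only (the patent does not prescribe a multiplexing scheme, and real systems would use a container format such as an MPEG transport stream):

```python
def multiplex_views(left_frames, right_frames):
    """Combine left- and right-view frame sequences into a single
    transport stream by tagging and interleaving the frames."""
    stream = []
    for left, right in zip(left_frames, right_frames):
        stream.append(("L", left))
        stream.append(("R", right))
    return stream

def demultiplex_views(stream):
    """Separate the combined stream back into per-view sequences, as the
    end-user receiving/displaying device would."""
    left = [frame for tag, frame in stream if tag == "L"]
    right = [frame for tag, frame in stream if tag == "R"]
    return left, right
```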
- the separate left and right view video sequences may be compressed based on MPEG-2 MVP, H.264 and/or MPEG-4 advanced video coding (AVC) or MPEG-4 multi-view video coding (MVC).
- 3D video and image processing may be achieved utilizing one full resolution video stream and one lower resolution video stream.
- a wireless device 150 comprising one or more processors and/or circuits may be enabled to generate an output full resolution 3D video utilizing a first video data stream generated from a high resolution video source and a second video data stream generated from a low resolution video source.
- a resolution of the output full resolution 3D video is greater than a resolution of the first video data stream and the second video data stream.
- 3D video or image processing on the data streams may be performed within the wireless communication device 150 and/or external to the wireless communication device 150 .
- FIG. 1B is a block diagram illustrating exemplary 3D video capture, in accordance with an embodiment of the invention. Referring to FIG. 1B , there is shown the wireless device 150 and the touchscreen/display 180 , which may be as described with respect to FIG. 1A .
- the high-resolution camera 164 A and the low-resolution camera 164 B in the wireless device 150 may be operable to capture images and/or video.
- a 3D image/video may be generated from the two image/video signals, and by utilizing a lower-resolution camera in concert with a high-resolution camera, full high definition 3D images and/or video may be generated, while reducing cost and space requirements of the wireless device 150 .
- the captured images and/or video may be processed in the wireless device 150 and may subsequently be displayed on the touchscreen/display 180 .
- the processed images and/or video may be communicated external to the wireless device 150 .
- the captured images and/or video may be communicated from the wireless device 150 without processing, before being processed by an external device. In this manner, processor requirements in the wireless device 150 may be reduced.
- a wireless device 150 comprising one or more processors and/or circuits may be operable to generate an output full resolution 3D video utilizing a first video data stream generated from a high resolution video source and a second video data stream generated from a low resolution video source.
- a resolution of the output full resolution 3D video is greater than a resolution of the first video data stream and the second video data stream.
- 3D video or image processing on the data streams may be performed within the wireless communication device 150 and/or external to the wireless communication device 150 .
- FIG. 2 is a block diagram illustrating an exemplary 3D video capture and processing sequence, in accordance with an embodiment of the invention.
- Referring to FIG. 2, there is shown a 3D video processing module 201, a reformat module 203, and a 3D video output 205.
- There is also shown a full resolution video stream 207, a low resolution stream 209, and a plurality of video streams 211.
- the 3D video processing module 201 may comprise suitable logic, circuitry, interfaces, and/or code that may be operable to process received video streams, such as the full resolution video stream 207 and the low resolution stream 209 .
- the 3D video processing module 201 may be integrated in the wireless device 150, such as in the processor 155, for example, or may be in an external device, such as a computer or audio/visual system that is operable to receive and process video streams. In instances where the 3D video processing module 201 is external to the wireless device 150, the full resolution video stream 207 and the low resolution stream 209 may be communicated to the external device in parallel.
- the reformat module 203 may comprise suitable logic, circuitry, interfaces, and/or code that may be operable to format processed 3D image or video data into a plurality of video streams 211 .
- the reformat module 203 may format image and/or video data to the appropriate format for a target video output device.
- the 3D video output 205 may comprise suitable logic, circuitry, interfaces, and/or code that may be operable to display 3D images and/or video.
- the 3D video output 205 may comprise a high-definition television, for example.
- two cameras may generate two input streams, the full resolution video stream 207 and the low resolution stream 209 .
- the full resolution video stream 207 may comprise a stream at the full resolution required for target compression or for the LCD resolution.
- the low resolution stream 209 may comprise a reduced resolution stream, such as a VGA video stream, for example.
- the streams may be communicated to the 3D video processing module 201 which may combine the two streams to generate a full-resolution 3D video stream. In an embodiment of the invention, this process may be applied to video or still images.
- the 3D video stream generated by the 3D video processing module 201 may be communicated to the reformat module 203 , which may be operable to format the processed 3D video stream into a plurality of video streams 211 .
- the plurality of video streams 211 may be communicated to the 3D video output 205 for display.
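The FIG. 2 sequence described above can be sketched as a three-stage pipeline. The callables below are placeholders for the 3D video processing module 201, the reformat module 203, and the 3D video output 205; their names and the stub implementations are assumptions for illustration.

```python
def fig2_pipeline(full_res_stream, low_res_stream,
                  process_3d, reformat, display):
    """Combine the two input streams into a full-resolution 3D stream,
    reformat it into a plurality of output streams, and hand each
    output stream to the 3D video output."""
    video_3d = process_3d(full_res_stream, low_res_stream)
    for output_stream in reformat(video_3d):
        display(output_stream)

# Illustrative run with trivial stand-ins for the three stages.
shown = []
fig2_pipeline("full-res", "low-res",
              process_3d=lambda hi, lo: (hi, lo),
              reformat=lambda v: [("formatted", v)],
              display=shown.append)
```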
- FIG. 3 is a diagram illustrating exemplary 3D video implementation, in accordance with an embodiment of the invention.
- Referring to FIG. 3, there is shown a 3D video implementation 300 comprising a high resolution camera 301 A, a low resolution video camera 301 B, a 3D processing module 303, a format module 305, a television 307, the wireless device 150 and the touchscreen/display 180.
- the high resolution camera 301 A and the low resolution video camera 301 B may be substantially similar to the high-resolution camera 164 A and the low-resolution camera 164 B described with respect to FIG. 1A, and the 3D processing module 303 may be substantially similar to the 3D video processing module 201 described with respect to FIG. 2.
- the wireless device 150 , and the touchscreen/display 180 may be as described with respect to FIG. 1A .
- the format module 305 may comprise suitable logic, circuitry, interfaces, and/or code that may be operable to format video and/or image data for a desired display type.
- the format module 305 may define an appropriate aspect ratio or scan rate as required by the wireless device 150 or the television 307 .
- the high-resolution camera 301 A and the low-resolution camera 301 B may generate two input streams to be communicated to the 3D video processing module 303 which may combine the two streams to generate a full-resolution 3D video stream.
- this process may be applied to video or still images.
- the 3D video stream generated by the 3D video processing module 303 may be communicated to the format module 305, which may be operable to format the processed 3D video stream into the appropriate format for display on the wireless device 150, the television 307, or similar display device.
- FIG. 4 is a diagram illustrating exemplary networked 3D video implementation, in accordance with an embodiment of the invention.
- a networked 3D video implementation 400 comprising the high resolution camera 301 A, the low resolution video camera 301 B, compression modules 401 A and 401 B, decompression modules 403 A and 403 B, a 3D video processing module 405 , a format module 407 , the wireless device 150 , and the television 307 .
- the high resolution camera 301 A, the low resolution video camera 301 B, the wireless device 150 , and the television 307 may be as described previously.
- the 3D video processing module 405 , the format module 407 , and the wireless device 409 may be substantially similar to the 3D video processing module 303 , the format module 305 , and the wireless device 150 described previously.
- the wireless device 409 may comprise a touchscreen/display 411 .
- the compression modules 401 A and 401 B may comprise suitable logic, circuitry, interfaces, and/or code that may be operable to compress received image and/or video data for subsequent communication to remote devices.
- the compression modules 401 A and 401 B may be integrated in a wireless device, such as the wireless device 150 , and may enable more efficient communication of data over a network by reducing data size.
- the decompression modules 403 A and 403 B may comprise suitable logic, circuitry, interfaces, and/or code that may be operable to decompress received compressed data.
- the decompression modules 403 A and 403 B may be remote from the image/video data source.
- the high-resolution camera 301 A and the low-resolution camera 301 B may generate two input streams to be communicated to the compression modules 401 A and 401 B, where the data streams may be compressed for more efficient communication.
- the compressed streams may be communicated to a remote device comprising the decompression modules 403 A and 403 B, which may be enabled to decompress the received data for further processing.
- the decompressed data may be communicated to the 3D video processing module 405 which may combine the two streams to generate a full-resolution 3D video stream. In an embodiment of the invention, this process may be applied to video or still images.
- the 3D video stream generated by the 3D video processing module 405 may be communicated to the format module 407 , which may be operable to format the processed 3D video stream into the appropriate format for display on the wireless device 409 , the television 307 , or a similar display device.
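The compress-transmit-decompress round trip described for FIG. 4 can be sketched as follows. This is illustrative only: zlib plus pickle stand in for a real video codec and transport, and the function names are assumptions, not the patent's modules.

```python
import pickle
import zlib

def compress_stream(frames):
    """Compress a captured data stream before communication, analogous
    to compression modules 401A/401B (zlib is a stand-in for a video
    codec purely for illustration)."""
    return zlib.compress(pickle.dumps(frames))

def decompress_stream(payload):
    """Recover the data stream at the remote device, analogous to
    decompression modules 403A/403B."""
    return pickle.loads(zlib.decompress(payload))

frames = [[0] * 640 for _ in range(480)]   # a trivially compressible frame
payload = compress_stream(frames)
restored = decompress_stream(payload)
```

The compressed payload is smaller than the raw serialized data, which is the point of compressing before communicating over the network.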
- FIG. 5 is a block diagram illustrating exemplary steps for enabling 3D video and image processing using one full resolution video stream and one lower resolution video stream, in accordance with an embodiment of the invention.
- video and/or image data may be captured utilizing a high-resolution camera and a low-resolution camera.
- if, in step 505, the data is to be processed locally, the exemplary steps may proceed to step 511, where the data streams may be combined and processed to generate full-resolution 3D video/images.
- if, in step 505, the data is not to be processed locally, the data streams may be compressed in step 507, followed by step 509 where the compressed data may be communicated to a remote device, before the exemplary steps proceed to step 511, where the data streams may be combined and processed to generate full-resolution 3D video/images.
- the process may then proceed to step 513 where the 3D video/images may be formatted for a desired display device and subsequently displayed on that device, followed by end step 515 .
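The FIG. 5 decision flow above can be sketched as a single function. The callables are placeholders for the individual steps and the names are assumptions; only the branching structure (local processing versus compress-and-communicate) comes from the description above.

```python
def fig5_flow(high_res_data, low_res_data, process_locally,
              combine_3d, compress, communicate, format_and_display):
    """If processing is local (step 505 yes), combine the streams
    immediately (step 511); otherwise compress them (step 507) and
    communicate them to a remote device (step 509) before combining.
    Finally, format the 3D video/images for display (step 513)."""
    if process_locally:
        video_3d = combine_3d(high_res_data, low_res_data)
    else:
        sent = communicate(compress(high_res_data), compress(low_res_data))
        video_3d = combine_3d(*sent)
    return format_and_display(video_3d)

# Illustrative run of the remote-processing branch with trivial stand-ins.
result = fig5_flow(
    "hi", "lo", process_locally=False,
    combine_3d=lambda a, b: (a, b),
    compress=lambda d: "z:" + d,
    communicate=lambda a, b: (a, b),
    format_and_display=lambda v: ("display", v),
)
```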
- a method and system are disclosed for enabling 3D video and image processing using one full resolution video stream and one lower resolution video stream.
- a wireless device comprising one or more processors and/or circuits may be enabled to generate an output full resolution 3D video utilizing a first video data stream 207 generated from a high resolution video source 164 A and a second video data stream 209 generated from a low resolution video source 164 B, wherein a resolution of the output full resolution 3D video is greater than a resolution of the first video data stream and the second video data stream.
- 3D video or image processing on the data streams 207 / 209 may be performed within the wireless communication device 150 .
- the 3D video or image processing 201 / 303 may be performed external to the wireless communication device 150 .
- the data streams 207 / 209 may be compressed 401 A/ 401 B prior to communicating them for the external 3D video or image processing 405 .
- the 3D video or image may be displayed locally on the wireless communication device 150 .
- the 3D video or images may be formatted for local displaying 180 on the wireless communication device 150 .
- Another embodiment of the invention may provide a machine and/or computer readable storage and/or medium, having stored thereon, a machine code and/or a computer program having at least one code section executable by a machine and/or a computer, thereby causing the machine and/or computer to perform the steps as described herein for enabling 3D video and image processing using one full resolution video stream and one lower resolution video stream.
- Aspects of the invention may be realized in hardware, software, firmware or a combination thereof.
- The invention may be realized in a centralized fashion in at least one computer system or in a distributed fashion where different elements are spread across several interconnected computer systems. Any kind of computer system or other apparatus adapted for carrying out the methods described herein is suited.
- A typical combination of hardware, software and firmware may be a general-purpose computer system with a computer program that, when loaded and executed, controls the computer system such that it carries out the methods described herein.
- One embodiment of the present invention may be implemented as a board level product, as a single chip, as an application specific integrated circuit (ASIC), or with varying levels of integration on a single chip, with other portions of the system implemented as separate components.
- The degree of integration of the system will primarily be determined by speed and cost considerations. Because of the sophisticated nature of modern processors, it is possible to utilize a commercially available processor, which may be implemented external to an ASIC implementation of the present system. Alternatively, if the processor is available as an ASIC core or logic block, then the commercially available processor may be implemented as part of an ASIC device with various functions implemented as firmware.
- The present invention may also be embedded in a computer program product, which comprises all the features enabling the implementation of the methods described herein, and which, when loaded in a computer system, is able to carry out these methods.
- Computer program, in the present context, may mean, for example, any expression, in any language, code or notation, of a set of instructions intended to cause a system having an information processing capability to perform a particular function either directly or after either or both of the following: a) conversion to another language, code or notation; b) reproduction in a different material form.
- Other meanings of computer program within the understanding of those skilled in the art are also contemplated by the present invention.
Description
- This application makes reference to and claims priority to U.S. Provisional Application Serial No. 61/144,959 filed on Jan. 15, 2009.
- The above stated application is hereby incorporated herein by reference in its entirety.
- Certain embodiments of the invention relate to wireless communication. More specifically, certain embodiments of the invention relate to a method and system for enabling 3D video and image processing using one full resolution video stream and one lower resolution video stream.
- In 3D or 2D video systems, an image is presented on a display device, for example a television, a monitor and/or a gaming console. Most video broadcasts nowadays utilize video processing applications that enable broadcasting video images in the form of bit streams that comprise information regarding characteristics of the image to be displayed. These video applications may utilize various interpolation and/or rate conversion functions to present content comprising still and/or moving images on a display. For example, de-interlacing functions may be utilized to convert moving and/or still images to a format that is suitable for certain types of display devices that are unable to handle interlaced content.
- Interlaced 3D and/or 2D video comprises fields, each of which may be captured at a distinct time interval. A frame may comprise a pair of fields, for example, a top field and a bottom field. The pictures forming the video may comprise a plurality of ordered lines. During one of the time intervals, video content for the even-numbered lines may be captured. During a subsequent time interval, video content for the odd-numbered lines may be captured. The even-numbered lines may be collectively referred to as the top field, while the odd-numbered lines may be collectively referred to as the bottom field. Alternatively, the odd-numbered lines may be collectively referred to as the top field, while the even-numbered lines may be collectively referred to as the bottom field.
- In the case of progressive 2D and/or 3D video frames, all the lines of the frame may be captured or played in sequence during one time interval. Interlaced video may comprise fields that were converted from progressive frames. For example, a progressive frame may be converted into two interlaced fields by organizing the even numbered lines into one field and the odd numbered lines into another field.
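The field structure described above can be sketched in a few lines. This is an illustrative example only (it is not part of the patent text); a frame is modeled simply as a list of scan lines:

```python
def split_into_fields(frame):
    """Split a progressive frame into (top, bottom) interlaced fields.

    Here the even-numbered lines (0, 2, 4, ...) form the top field and the
    odd-numbered lines form the bottom field; as noted above, the opposite
    naming convention is equally valid.
    """
    return frame[0::2], frame[1::2]

def weave_fields(top, bottom):
    """Recombine a top and bottom field into a progressive frame."""
    frame = []
    for t, b in zip(top, bottom):
        frame.extend([t, b])
    return frame

frame = ["line0", "line1", "line2", "line3"]
top, bottom = split_into_fields(frame)
# top == ["line0", "line2"], bottom == ["line1", "line3"]
assert weave_fields(top, bottom) == frame
```

The simple "weave" shown here only reproduces the original frame when both fields come from the same progressive source; de-interlacing true interlaced capture, where the fields are sampled at different times, requires more elaborate filtering.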
- Further limitations and disadvantages of conventional and traditional approaches will become apparent to one of skill in the art, through comparison of such systems with the present invention as set forth in the remainder of the present application with reference to the drawings.
- A system and/or method for enabling 3D video and image processing using one full resolution video stream and one lower resolution video stream, substantially as shown in and/or described in connection with at least one of the figures, as set forth more completely in the claims.
- Various advantages, aspects and novel features of the present invention, as well as details of an illustrated embodiment thereof, will be more fully understood from the following description and drawings.
- FIG. 1A is a block diagram of an exemplary wireless system that is operable to provide 3D video and image processing using one full resolution video stream and one lower resolution video stream, in accordance with an embodiment of the invention.
- FIG. 1B is a block diagram illustrating exemplary 3D video capture, in accordance with an embodiment of the invention.
- FIG. 2 is a block diagram illustrating an exemplary 3D video capture and processing sequence, in accordance with an embodiment of the invention.
- FIG. 3 is a diagram illustrating exemplary 3D video implementation, in accordance with an embodiment of the invention.
- FIG. 4 is a diagram illustrating exemplary networked 3D video implementation, in accordance with an embodiment of the invention.
- FIG. 5 is a block diagram illustrating exemplary steps for enabling 3D video and image processing using one full resolution video stream and one lower resolution video stream, in accordance with an embodiment of the invention.
- Certain aspects of the invention may be found in a method and system for enabling 3D video and image processing using one full resolution video stream and one lower resolution video stream. In various exemplary aspects of the invention, an output
full resolution 3D video may be generated utilizing a first video data stream generated from a high resolution video source and a second video data stream generated from a low resolution video source, wherein a resolution of the output full resolution 3D video is greater than a resolution of the first video data stream and the second video data stream. In one embodiment of the invention, 3D video or image processing on the data streams may be performed within the wireless communication device. In another embodiment of the invention, the 3D video or image processing may be performed external to the wireless communication device. The data streams may be compressed prior to communicating them for the external 3D video or image processing. The 3D video or images may be displayed locally on the wireless communication device. The 3D video or images may be formatted so that they may be locally presented on a display of the wireless communication device. -
FIG. 1A is a block diagram of an exemplary wireless system that is operable to provide 3D video and image processing using one full resolution video stream and one lower resolution video stream, in accordance with an embodiment of the invention. Referring to FIG. 1A, the wireless device 150 may comprise an antenna 151, a chip 162, a transceiver 152, a baseband processor 154, a processor 155, a system memory 158, a logic block 160, a high resolution camera 164A, a low resolution camera 164B, an audio CODEC 172A, a video CODEC 172B, and an external headset port 166. The wireless device 150 may also comprise an analog microphone 168, integrated hands-free (IHF) stereo speakers 170, a hearing aid compatible (HAC) coil 174, a dual digital microphone 176, a vibration transducer 178, and a touchscreen/display 180. - Most video content is currently generated and played in two-dimensional (2D) format. In various video related applications such as, for example, DVD/Blu-ray movies and/or digital TV, 3D video may be more desirable because it may be more realistic to humans to perceive 3D rather than 2D images.
- The
transceiver 152 may comprise suitable logic, circuitry, interfaces, and/or code that may be enabled to modulate and upconvert baseband signals to RF signals for transmission by one or more antennas, which may be represented generically by the antenna 151. The transceiver 152 may also be enabled to downconvert and demodulate received RF signals to baseband signals. The RF signals may be received by one or more antennas, which may be represented generically by the antenna 151. Different wireless systems may use different antennas for transmission and reception. The transceiver 152 may be enabled to execute other functions, for example, filtering the baseband and/or RF signals, and/or amplifying the baseband and/or RF signals. Although a single transceiver on each chip is shown, the invention is not so limited. Accordingly, the transceiver 152 may be implemented as a separate transmitter and a separate receiver. In addition, there may be a plurality of transceivers, transmitters and/or receivers. In this regard, the plurality of transceivers, transmitters and/or receivers may enable the wireless device 150 to handle a plurality of wireless protocols and/or standards including cellular, WLAN and PAN. Wireless technologies handled by the wireless device 150 may comprise GPS, GALILEO, GLONASS, GSM, CDMA, CDMA2000, WCDMA, GNSS, GMS, GPRS, EDGE, WIMAX, WLAN, LTE, 3GPP, UMTS, BLUETOOTH, and ZIGBEE, for example. - The
baseband processor 154 may comprise suitable logic, circuitry, interfaces, and/or code that may be enabled to process baseband signals for transmission via the transceiver 152 and/or the baseband signals received from the transceiver 152. The processor 155 may be any suitable processor or controller such as a CPU, DSP, ARM, or any type of integrated circuit processor. The processor 155 may comprise suitable logic, circuitry, and/or code that may be enabled to control the operations of the transceiver 152 and/or the baseband processor 154. For example, the processor 155 may be utilized to update and/or modify programmable parameters and/or values in a plurality of components, devices, and/or processing elements in the transceiver 152 and/or the baseband processor 154. At least a portion of the programmable parameters may be stored in the system memory 158. - Control and/or data information, which may comprise the programmable parameters, may be transferred from other portions of the
wireless device 150, not shown in FIG. 1A, to the processor 155. Similarly, the processor 155 may be enabled to transfer control and/or data information, which may include the programmable parameters, to other portions of the wireless device 150, not shown in FIG. 1A, which may be part of the wireless device 150. - The
processor 155 may utilize the received control and/or data information, which may comprise the programmable parameters or video source data, to determine an operating mode of the transceiver 152. For example, the processor 155 may be utilized to select a specific frequency for a local oscillator, a specific gain for a variable gain amplifier, configure the local oscillator and/or configure the variable gain amplifier for operation in accordance with various embodiments of the invention. Moreover, the received video source data and/or processed full-resolution 3D video data may be stored in the system memory 158 via the processor 155, for example. The information stored in the system memory 158 may be transferred to the transceiver 152 from the system memory 158 via the processor 155. - The
processor 155 may be operable to process received video data streams from a high resolution video source and a low resolution video source. The processor 155 may thereby generate a full resolution 3D video from the received data streams. - The
system memory 158 may comprise suitable logic, circuitry, interfaces, and/or code that may be enabled to store a plurality of control and/or video data information, including video or image processing parameters, or full resolution 3D video data. The system memory 158 may store at least a portion of the programmable parameters that may be manipulated by the processor 155. - The
logic block 160 may comprise suitable logic, circuitry, interfaces, and/or code that may enable controlling of various functionalities of the wireless device 150. For example, the logic block 160 may comprise one or more state machines that may generate signals to control the transceiver 152 and/or the baseband processor 154. The logic block 160 may also comprise registers that may hold data for controlling, for example, the transceiver 152 and/or the baseband processor 154. The logic block 160 may also generate and/or store status information that may be read by, for example, the processor 155. Amplifier gains and/or filtering characteristics, for example, may be controlled by the logic block 160. - The BT radio/
processor 163 may comprise suitable circuitry, logic, interfaces, and/or code that may enable transmission and reception of Bluetooth signals. The BT radio/processor 163 may enable processing and/or handling of BT baseband signals. In this regard, the BT radio/processor 163 may process or handle BT signals received and/or BT signals transmitted via a wireless communication medium. The BT radio/processor 163 may also provide control and/or feedback information to/from the baseband processor 154 and/or the processor 155, based on information from the processed BT signals. The BT radio/processor 163 may communicate information and/or data from the processed BT signals to the processor 155 and/or to the system memory 158. Moreover, the BT radio/processor 163 may receive information from the processor 155 and/or the system memory 158, which may be processed and transmitted via the wireless communication medium to a Bluetooth headset, for example. - The high-
resolution camera 164A may comprise suitable logic, circuitry, interfaces, and/or code that may be operable to capture video and images. The high-resolution camera 164A may be capable of capturing high-definition images and video and may be controlled via the processor 155, for example. The high-resolution camera 164A may comprise a multi-megapixel sensor, for example. - The low-
resolution camera 164B may comprise suitable logic, circuitry, interfaces, and/or code that may be operable to capture video and images. The low-resolution camera 164B may comprise a smaller, lower-cost camera than the high-resolution camera 164A, and may comprise a VGA image/video camera, for example. In this manner, the wireless device 150 may comprise two cameras for full resolution 3D images and video, without the need for two high-resolution cameras. The 3D image and/or video may be displayed on the touchscreen/display 180, for example, may be stored in the system memory 158, and/or may be communicated externally via the transceiver 152 and the antenna 151. - The
audio CODEC 172A may comprise suitable circuitry, logic, interfaces, and/or code that may process audio signals received from and/or communicated to input/output devices. The input devices may be within or communicatively coupled to the wireless device 150, and may comprise the analog microphone 168, the stereo speakers 170, the hearing aid compatible (HAC) coil 174, the dual digital microphone 176, and the vibration transducer 178, for example. The audio CODEC 172A may be operable to up-convert and/or down-convert signal frequencies to desired frequencies for processing and/or transmission via an output device. - The
video CODEC 172B may comprise suitable circuitry, logic, interfaces, and/or code that may be operable to process video signals received and/or communicated from and/or to input/output devices, such as the high-resolution camera 164A and the low-resolution camera 164B. The video CODEC 172B may communicate processed video signals to the processor 155 for further processing, or for communication to devices external to the wireless device 150 via the transceiver 152. - The
chip 162 may comprise an integrated circuit with multiple functional blocks integrated within, such as the transceiver 152, the baseband processor 154, the BT radio/processor 163, the audio CODEC 172A, and the video CODEC 172B. The number of functional blocks integrated in the chip 162 is not limited to the number shown in FIG. 1A. Accordingly, any number of blocks may be integrated on the chip 162, including cameras such as the high-resolution camera 164A and the low-resolution camera 164B, depending on chip space and wireless device 150 requirements, for example. - The
external headset port 166 may comprise a physical connection for an external headset to be communicatively coupled to the wireless device 150. The analog microphone 168 may comprise suitable circuitry, logic, and/or code that may detect sound waves and convert them to electrical signals via a piezoelectric effect, for example. The electrical signals generated by the analog microphone 168 may comprise analog signals that may require analog to digital conversion before processing. - The
stereo speakers 170 may comprise a pair of speakers that may be operable to generate audio signals from electrical signals received from the audio CODEC 172A. The HAC coil 174 may comprise suitable circuitry, logic, and/or code that may enable communication between the wireless device 150 and a T-coil in a hearing aid, for example. In this manner, electrical audio signals may be communicated to a user that utilizes a hearing aid, without the need for generating sound signals via a speaker, such as the stereo speakers 170, and converting the generated sound signals back to electrical signals in a hearing aid, and subsequently back into amplified sound signals in the user's ear, for example. - The dual
digital microphone 176 may comprise suitable circuitry, logic, and/or code that may be operable to detect sound waves and convert them to electrical signals. The electrical signals generated by the dual digital microphone 176 may comprise digital signals, and thus may not require analog to digital conversion prior to digital processing in the audio CODEC 172A. The dual digital microphone 176 may enable beamforming capabilities, for example. - The
vibration transducer 178 may comprise suitable circuitry, logic, and/or code that may enable notification of an incoming call, alerts and/or message to the wireless device 150 without the use of sound. The vibration transducer may generate vibrations that may be in synch with, for example, audio signals such as speech or music. - In operation, video stream data may be communicated from image and/or video sources, such as the high-
resolution camera 164A and the low-resolution camera 164B to the video CODEC 172B. The video CODEC 172B may process the received video data before communicating the data to the processor for further processing or communication to a device external to the wireless device 150. - In an embodiment of the invention, the high-
resolution camera 164A and the low-resolution camera 164B may be operable to capture images and/or video. By utilizing two spatially separated cameras, a 3D image/video may be generated from the two image/video signals. The high-resolution camera 164A and the low-resolution camera 164B may communicate video data streams to the video CODEC 172B, which may process the received signals before communicating the processed signals to the processor for further processing. The processor 155 may be operable to process 3D video and/or images obtained utilizing the high-resolution camera 164A and the low-resolution camera 164B. In this manner, space and cost of the wireless device 150 may be reduced by utilizing a smaller, lower-cost, lower-resolution camera with a high-resolution camera, while still supporting full-resolution 3D images and video. The processing performed by the processor 155 may comprise right- and left-view generation to enable 3D video, which may comprise still and/or moving images.
- Various compression/encoding standards may be utilized to enable compressing and/or encoding of the view sequences into transport streams during communication of stereoscopic 3D video. For example, the separate left and right view video sequences may be compressed based on MPEG-2 MVP, H.264 and/or MPEG-4 advanced video coding (AVC) or MPEG-4 multi-view video coding (MVC).
- In an embodiment of the invention, 3D video and image processing may be achieved utilizing one full resolution video stream and one lower resolution video stream. In this regard, a
wireless device 150 comprising one or more processors and/or circuits may be enabled to generate an output full resolution 3D video utilizing a first video data stream generated from a high resolution video source and a second video data stream generated from a low resolution video source. A resolution of the output full resolution 3D video is greater than a resolution of the first video data stream and the second video data stream. 3D video or image processing on the data streams may be performed within the wireless communication device 150 and/or external to the wireless communication device 150. -
FIG. 1B is a block diagram illustrating exemplary 3D video capture, in accordance with an embodiment of the invention. Referring to FIG. 1B, there is shown the wireless device 150 and the touchscreen/display 180, which may be as described with respect to FIG. 1A. - In operation, the high-
resolution camera 164A and the low-resolution camera 164B in the wireless device 150 may be operable to capture images and/or video. By utilizing two spatially separated cameras, a 3D image/video may be generated from the two image/video signals, and by utilizing a lower-resolution camera in concert with a high-resolution camera, full high definition 3D images and/or video may be generated, while reducing cost and space requirements of the wireless device 150. - The captured images and/or video may be processed in the
wireless device 150 and may subsequently be displayed on the touchscreen/display 180. In another embodiment of the invention, the processed images and/or video may be communicated external to the wireless device 150. Alternatively, the captured images and/or video may be communicated from the wireless device 150 without processing, before being processed by an external device. In this manner, processor requirements in the wireless device 150 may be reduced. - In various embodiments of the invention, a
wireless device 150 comprising one or more processors and/or circuits may be operable to generate an output full resolution 3D video utilizing a first video data stream generated from a high resolution video source and a second video data stream generated from a low resolution video source. A resolution of the output full resolution 3D video is greater than a resolution of the first video data stream and the second video data stream. 3D video or image processing on the data streams may be performed within the wireless communication device 150 and/or external to the wireless communication device 150. -
FIG. 2 is a block diagram illustrating an exemplary 3D video capture and processing sequence, in accordance with an embodiment of the invention. Referring to FIG. 2, there is shown a 3D video processing module 201, a reformat module 203, and a 3D video output 205. There is also shown a full resolution video stream 207, a low resolution stream 209, and a plurality of video streams 211. - The 3D
video processing module 201 may comprise suitable logic, circuitry, interfaces, and/or code that may be operable to process received video streams, such as the full resolution video stream 207 and the low resolution stream 209. The 3D video processing module 201 may be integrated in the wireless device 150, such as in the processor 155, for example, or may be in an external device, such as a computer or audio/visual system that is operable to receive and process video streams. In instances where the 3D video processing module 201 is external to the wireless device 150, the full resolution video stream 207 and the low resolution stream 209 may be communicated to the external device in parallel. - The
reformat module 203 may comprise suitable logic, circuitry, interfaces, and/or code that may be operable to format processed 3D image or video data into a plurality of video streams 211. The reformat module 203 may format image and/or video data to the appropriate format for a target video output device. - The
3D video output 205 may comprise suitable logic, circuitry, interfaces, and/or code that may be operable to display 3D images and/or video. The 3D video output 205 may comprise a high-definition television, for example. - In operation, two cameras, such as the
high resolution camera 164A and the low-resolution camera 164B, may generate two input streams, the full resolution video stream 207 and the low resolution stream 209. The full resolution stream 207 may comprise a stream of the full resolution required for target compression or for LCD resolution. The low resolution stream 209 may comprise a reduced resolution stream, such as a VGA video stream, for example. The streams may be communicated to the 3D video processing module 201, which may combine the two streams to generate a full-resolution 3D video stream. In an embodiment of the invention, this process may be applied to video or still images. - The 3D video stream generated by the 3D
video processing module 201 may be communicated to the reformat module 203, which may be operable to format the processed 3D video stream into a plurality of video streams 211. The plurality of video streams 211 may be communicated to the 3D video output 205 for display. -
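The combining step performed by the 3D video processing module 201 can be sketched as follows. This is an illustrative interpretation only: the patent does not specify the reconstruction algorithm, and the nearest-neighbour upscaler and function names below are assumptions. A frame is modeled as a list of pixel rows:

```python
def upscale_nearest(frame, factor):
    """Upscale a frame (a list of pixel rows) by an integer factor,
    replicating each pixel horizontally and each row vertically."""
    out = []
    for row in frame:
        wide = [p for p in row for _ in range(factor)]
        out.extend([list(wide) for _ in range(factor)])
    return out

def combine_streams(full_res_frame, low_res_frame, factor):
    """Pair the full-resolution view with the upscaled low-resolution
    view to form one (left, right) stereo frame at full resolution."""
    return full_res_frame, upscale_nearest(low_res_frame, factor)

low = [[1, 2]]
assert upscale_nearest(low, 2) == [[1, 1, 2, 2], [1, 1, 2, 2]]
```

A practical implementation would replace the nearest-neighbour step with a reconstruction that borrows high-frequency detail from the full-resolution view, but the data flow — upscale the low-resolution stream, then pair the two views — is the same.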
FIG. 3 is a diagram illustrating exemplary 3D video implementation, in accordance with an embodiment of the invention. Referring to FIG. 3, there is shown a 3D video implementation 300 comprising a high resolution camera 301A, a low resolution video camera 301B, a 3D processing module 303, a format module 305, a television 307, the wireless device 150 and the touchscreen/display 180. The high resolution camera 301A and the low resolution video camera 301B may be substantially similar to the high-resolution camera 164A and the low-resolution camera 164B described with respect to FIG. 1A, and the 3D processing module 303 may be substantially similar to the 3D video processing module 201 described with respect to FIG. 2. The wireless device 150 and the touchscreen/display 180 may be as described with respect to FIG. 1A. - The
format module 305 may comprise suitable logic, circuitry, interfaces, and/or code that may be operable to format video and/or image data for a desired display type. For example, the format module 305 may define an appropriate aspect ratio or scan rate as required by the wireless device 150 or the television 307. - In operation, the high-
resolution camera 301A and the low-resolution camera 301B may generate two input streams to be communicated to the 3D video processing module 303, which may combine the two streams to generate a full-resolution 3D video stream. In an embodiment of the invention, this process may be applied to video or still images. The 3D video stream generated by the 3D video processing module 303 may be communicated to the format module 305, which may be operable to format the processed 3D video stream into the appropriate format for display on the wireless device 150, the television 307, or a similar display device. -
FIG. 4 is a diagram illustrating exemplary networked 3D video implementation, in accordance with an embodiment of the invention. Referring to FIG. 4, there is shown a networked 3D video implementation 400 comprising the high resolution camera 301A, the low resolution video camera 301B, compression modules 401A and 401B, decompression modules 403A and 403B, a 3D video processing module 405, a format module 407, the wireless device 150, and the television 307. The high resolution camera 301A, the low resolution video camera 301B, the wireless device 150, and the television 307 may be as described previously. The 3D video processing module 405, the format module 407, and the wireless device 409 may be substantially similar to the 3D video processing module 303, the format module 305, and the wireless device 150 described previously. The wireless device 409 may comprise a touchscreen/display 411. - The
compression modules 401A and 401B may comprise suitable logic, circuitry, interfaces, and/or code that may be operable to compress received image and/or video data for subsequent communication to remote devices. For example, the compression modules 401A and 401B may be integrated in a wireless device, such as the wireless device 150, and may enable more efficient communication of data over a network by reducing data size. - The
decompression modules 403A and 403B may comprise suitable logic, circuitry, interfaces, and/or code that may be operable to decompress received compressed data. The decompression modules 403A and 403B may be remote from the image/video data source. - In operation, the high-
resolution camera 301A and the low-resolution camera 301B may generate two input streams to be communicated to the compression modules 401A and 401B, where the data streams may be compressed for more efficient communication. The compressed streams may be communicated to a remote device comprising the decompression modules 403A and 403B, which may be enabled to decompress the received data for further processing. - The decompressed data may be communicated to the 3D
video processing module 405, which may combine the two streams to generate a full-resolution 3D video stream. In an embodiment of the invention, this process may be applied to video or still images. The 3D video stream generated by the 3D video processing module 405 may be communicated to the format module 407, which may be operable to format the processed 3D video stream into the appropriate format for display on the wireless device 409, the television 307, or a similar display device. -
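The compress-then-decompress round trip through the compression modules 401A/401B and decompression modules 403A/403B can be sketched as follows. This is an illustrative example only, using the general-purpose zlib codec as a stand-in for whatever video compression the modules actually implement:

```python
import zlib

def compress_stream(raw_frame_bytes):
    """Compress a raw frame buffer before communicating it to a remote
    device (stand-in for compression modules 401A/401B)."""
    return zlib.compress(raw_frame_bytes)

def decompress_stream(payload):
    """Recover the raw frame buffer on the remote device (stand-in for
    decompression modules 403A/403B)."""
    return zlib.decompress(payload)

frame = bytes(range(256)) * 4
assert decompress_stream(compress_stream(frame)) == frame
```

A real system would use a lossy video codec (for example H.264/AVC, mentioned earlier) rather than a lossless byte compressor, but the pipeline shape — compress at the capture device, decompress at the remote processing device — is the same.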
FIG. 5 is a block diagram illustrating exemplary steps for enabling 3D video and image processing using one full resolution video stream and one lower resolution video stream, in accordance with an embodiment of the invention. Referring to FIG. 5, in step 503, after start step 501, video and/or image data may be captured utilizing a high-resolution camera and a low-resolution camera. If, in step 505, the video/image data is to be processed locally, such as within the wireless device 150, the exemplary steps may proceed to step 511, where the data streams may be combined and processed to generate full-resolution 3D video/images. If, in step 505, the data is not to be processed locally, the data streams may be compressed in step 507, followed by step 509, where the compressed data may be communicated to a remote device, before the exemplary steps proceed to step 511, where the data streams may be combined and processed to generate full-resolution 3D video/images. The process may then proceed to step 513, where the 3D video/images may be formatted for a desired display device and subsequently displayed on that device, followed by end step 515.
- In an embodiment of the invention, a method and system are disclosed for enabling 3D video and image processing using one full resolution video stream and one lower resolution video stream. In this regard, a wireless device comprising one or more processors and/or circuits may be enabled to generate an output
full resolution 3D video utilizing a first video data stream 207 generated from a high resolution video source 164A and a second video data stream 209 generated from a low resolution video source 164B, wherein a resolution of the output full resolution 3D video is greater than a resolution of the first video data stream and the second video data stream. 3D video or image processing on the data streams 207/209 may be performed within the wireless communication device 150. The 3D video or image processing 201/303 may be performed external to the wireless communication device 150. The data streams 207/209 may be compressed 401A/401B prior to communicating them for the external 3D video or image processing 405. The 3D video or image may be displayed locally on the wireless communication device 150. The 3D video or images may be formatted for local display 180 on the wireless communication device 150.
- Another embodiment of the invention may provide a machine and/or computer readable storage and/or medium, having stored thereon, a machine code and/or a computer program having at least one code section executable by a machine and/or a computer, thereby causing the machine and/or computer to perform the steps as described herein for enabling 3D video and image processing using one full resolution video stream and one lower resolution video stream.
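The local-versus-remote decision of FIG. 5 can be sketched end to end in a few lines of Python. This is a hedged illustration under stated assumptions, not the claimed method: `zlib` stands in for the actual video codec, byte strings stand in for frames, simple concatenation stands in for the 3D combining of step 511, and a length header stands in for the display formatting of step 513. All function names are hypothetical.

```python
import zlib

def combine_streams(full_frame: bytes, low_frame: bytes) -> bytes:
    """Placeholder for step 511: real logic would merge the two views
    into a full-resolution 3D frame."""
    return full_frame + low_frame

def process_capture(full_frame: bytes, low_frame: bytes,
                    process_locally: bool) -> bytes:
    """Mirror the FIG. 5 decision flow for one captured frame pair."""
    if not process_locally:
        # Steps 507/509: compress each stream and communicate it to the
        # remote device, which decompresses before further processing.
        sent = [zlib.compress(full_frame), zlib.compress(low_frame)]
        full_frame, low_frame = (zlib.decompress(p) for p in sent)
    # Step 511: combine the two streams.
    combined = combine_streams(full_frame, low_frame)
    # Step 513: format for the target display (here, a 4-byte length header).
    return len(combined).to_bytes(4, "big") + combined

# Because the stand-in codec round-trip is lossless, the local and remote
# paths yield the same displayable output in this sketch.
local = process_capture(b"F" * 16, b"L" * 4, process_locally=True)
remote = process_capture(b"F" * 16, b"L" * 4, process_locally=False)
assert local == remote
```

A production path would of course use a lossy video codec, so the two branches would not be bit-identical; the sketch only traces the control flow of steps 503 through 513.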
- Accordingly, aspects of the invention may be realized in hardware, software, firmware or a combination thereof. The invention may be realized in a centralized fashion in at least one computer system or in a distributed fashion where different elements are spread across several interconnected computer systems. Any kind of computer system or other apparatus adapted for carrying out the methods described herein is suited. A typical combination of hardware, software and firmware may be a general-purpose computer system with a computer program that, when being loaded and executed, controls the computer system such that it carries out the methods described herein.
- One embodiment of the present invention may be implemented as a board level product, as a single chip, application specific integrated circuit (ASIC), or with varying levels of integration on a single chip with other portions of the system as separate components. The degree of integration of the system will primarily be determined by speed and cost considerations. Because of the sophisticated nature of modern processors, it is possible to utilize a commercially available processor, which may be implemented external to an ASIC implementation of the present system. Alternatively, if the processor is available as an ASIC core or logic block, then the commercially available processor may be implemented as part of an ASIC device with various functions implemented as firmware.
- The present invention may also be embedded in a computer program product, which comprises all the features enabling the implementation of the methods described herein, and which when loaded in a computer system is able to carry out these methods. Computer program in the present context may mean, for example, any expression, in any language, code or notation, of a set of instructions intended to cause a system having an information processing capability to perform a particular function either directly or after either or both of the following: a) conversion to another language, code or notation; b) reproduction in a different material form. However, other meanings of computer program within the understanding of those skilled in the art are also contemplated by the present invention.
- While the invention has been described with reference to certain embodiments, it will be understood by those skilled in the art that various changes may be made and equivalents may be substituted without departing from the scope of the present invention. In addition, many modifications may be made to adapt a particular situation or material to the teachings of the present invention without departing from its scope. Therefore, it is intended that the present invention not be limited to the particular embodiments disclosed, but that the present invention will include all embodiments falling within the scope of the appended claims.
Claims (20)
Priority Applications (3)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US12/629,247 US20100177162A1 (en) | 2009-01-15 | 2009-12-02 | Method and system for enabling 3d video and image processing using one full resolution video stream and one lower resolution video stream |
| EP10000181A EP2209321A3 (en) | 2009-01-15 | 2010-01-11 | Enabling 3D video and image processing using one full resolution video stream and one lower resolution video stream |
| CN201010005536A CN101795418A (en) | 2009-01-15 | 2010-01-15 | Method and system for realizing wireless communication |
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US14495909P | 2009-01-15 | 2009-01-15 | |
| US12/629,247 US20100177162A1 (en) | 2009-01-15 | 2009-12-02 | Method and system for enabling 3d video and image processing using one full resolution video stream and one lower resolution video stream |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| US20100177162A1 true US20100177162A1 (en) | 2010-07-15 |
Family
ID=41858891
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US12/629,247 Abandoned US20100177162A1 (en) | 2009-01-15 | 2009-12-02 | Method and system for enabling 3d video and image processing using one full resolution video stream and one lower resolution video stream |
Country Status (3)
| Country | Link |
|---|---|
| US (1) | US20100177162A1 (en) |
| EP (1) | EP2209321A3 (en) |
| CN (1) | CN101795418A (en) |
Families Citing this family (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP5735181B2 (en) * | 2011-09-29 | 2015-06-17 | ドルビー ラボラトリーズ ライセンシング コーポレイション | Dual layer frame compatible full resolution stereoscopic 3D video delivery |
| TWI595770B (en) | 2011-09-29 | 2017-08-11 | 杜比實驗室特許公司 | Frame-compatible full-resolution stereoscopic 3d video delivery with symmetric picture resolution and quality |
| DE102012106860A1 (en) * | 2012-07-27 | 2014-02-13 | Jenoptik Robot Gmbh | Device and method for identifying and documenting at least one object passing through a radiation field |
| CN103841335A (en) * | 2014-03-20 | 2014-06-04 | 梁红 | Novel method for improving night vision image effect |
Family Cites Families (7)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2887272B2 (en) * | 1987-07-14 | 1999-04-26 | 株式会社 エイ・ティ・ア−ル通信システム研究所 | 3D image device |
| EP1418766A3 (en) * | 1998-08-28 | 2010-03-24 | Imax Corporation | Method and apparatus for processing images |
| US7260274B2 (en) * | 2000-12-01 | 2007-08-21 | Imax Corporation | Techniques and systems for developing high-resolution imagery |
| WO2006079963A2 (en) * | 2005-01-28 | 2006-08-03 | Koninklijke Philips Electronics N.V. | Device for registering images |
| JP4818053B2 (en) * | 2006-10-10 | 2011-11-16 | 株式会社東芝 | High resolution device and method |
| WO2008085874A2 (en) * | 2007-01-05 | 2008-07-17 | Marvell World Trade Ltd. | Methods and systems for improving low-resolution video |
| EP2259599A1 (en) * | 2009-05-29 | 2010-12-08 | Telefonaktiebolaget L M Ericsson (Publ) | Method and arrangement for processing a stereo image for three-dimensional viewing |
- 2009-12-02: US application US12/629,247 filed; published as US20100177162A1 (status: abandoned)
- 2010-01-11: EP application EP10000181A filed; published as EP2209321A3 (status: withdrawn)
- 2010-01-15: CN application CN201010005536A filed; published as CN101795418A (status: pending)
Cited By (16)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20070069923A1 (en) * | 2005-05-09 | 2007-03-29 | Ehud Mendelson | System and method for generate and update real time navigation waypoint automatically |
| US20170111593A1 (en) * | 2005-06-21 | 2017-04-20 | Cedar Crest Partners Inc. | System, method and apparatus for capture, conveying and securing information including media information such as video |
| US10356388B2 (en) * | 2009-04-27 | 2019-07-16 | Mitsubishi Electric Corporation | Stereoscopic video distribution system, stereoscopic video distribution method, stereoscopic video distribution apparatus, stereoscopic video viewing system, stereoscopic video viewing method, and stereoscopic video viewing apparatus |
| US20140143797A1 (en) * | 2009-04-27 | 2014-05-22 | Mitsubishi Electric Corporation | Stereoscopic video distribution system, stereoscopic video distribution method, stereoscopic video distrubtion apparatus, stereoscopic video viewing system, stereoscipic video viewing method, and stereoscopic video viewing apparatus |
| US20140285622A1 (en) * | 2009-04-27 | 2014-09-25 | Lg Electronics Inc. | Broadcast receiver and 3d video data processing method thereof |
| US9077966B2 (en) | 2010-02-15 | 2015-07-07 | Thomson Licensing | Apparatus and method for processing video content |
| US9148646B2 (en) | 2010-02-15 | 2015-09-29 | Thomson Licensing | Apparatus and method for processing video content |
| US9936109B2 (en) * | 2010-10-22 | 2018-04-03 | University Of New Brunswick | Method and system for fusing images |
| EP2640073A4 (en) * | 2010-11-12 | 2017-05-31 | Electronics And Telecommunications Research Institute | Method and apparatus for determining a video compression standard in a 3dtv service |
| US9628769B2 (en) | 2011-02-15 | 2017-04-18 | Thomson Licensing Dtv | Apparatus and method for generating a disparity map in a receiving device |
| US9473788B2 (en) * | 2011-09-16 | 2016-10-18 | Dolby Laboratories Licensing Corporation | Frame-compatible full resolution stereoscopic 3D compression and decompression |
| US20140341293A1 (en) * | 2011-09-16 | 2014-11-20 | Dolby Laboratories Licensing Corporation | Frame-Compatible Full Resolution Stereoscopic 3D Compression And Decompression |
| US20130069825A1 (en) * | 2011-09-21 | 2013-03-21 | Rio Systems Ltd. | Methods, circuits and systems for generating navigation beacon signals |
| US20220134227A1 (en) * | 2019-02-25 | 2022-05-05 | Google Llc | Variable end-point user interface rendering |
| US12102917B2 (en) * | 2019-02-25 | 2024-10-01 | Google Llc | Variable end-point user interface rendering |
| CN115460461A (en) * | 2022-09-07 | 2022-12-09 | 北京奇艺世纪科技有限公司 | Video processing method and device, terminal equipment and computer readable storage medium |
Also Published As
| Publication number | Publication date |
|---|---|
| EP2209321A3 (en) | 2013-03-20 |
| EP2209321A2 (en) | 2010-07-21 |
| CN101795418A (en) | 2010-08-04 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US20100177162A1 (en) | Method and system for enabling 3d video and image processing using one full resolution video stream and one lower resolution video stream | |
| CN102342112B (en) | Stereo image data transmitting apparatus, stereo image data transmitting method, stereo image data receiving apparatus, and stereo image data receiving method | |
| US20100265315A1 (en) | Three-dimensional image combining apparatus | |
| WO2010099178A2 (en) | System and method for displaying multiple images/videos on a single display | |
| KR101832407B1 (en) | Method and system for communication of stereoscopic three dimensional video information | |
| US9270975B2 (en) | Information integrating device and information integrating method which integrates stereoscopic video information using main information and complementary information | |
| AU2010231544A1 (en) | System and format for encoding data and three-dimensional rendering | |
| WO2013015116A1 (en) | Encoding device and encoding method, and decoding device and decoding method | |
| US20100194845A1 (en) | Television system and control method thereof | |
| US20120262454A1 (en) | Stereoscopic image data transmission device, stereoscopic image data transmission method, stereoscopic image data reception device, and stereoscopic image data reception method | |
| US9749608B2 (en) | Apparatus and method for generating a three-dimension image data in portable terminal | |
| MX2012004580A (en) | 3d-image-data transmission device, 3d-image-data transmission method, 3d-image-data reception device, and 3d-image-data reception method. | |
| CN113689810B (en) | Image display apparatus and method thereof | |
| EP2312859A2 (en) | Method and system for communicating 3D video via a wireless communication link | |
| US20120300026A1 (en) | Audio-Video Signal Processing | |
| JP2012199897A5 (en) | ||
| HK1147001A (en) | A method and system for implementing wireless communication | |
| JP5170278B2 (en) | Display control device, display control method, program, and display control system | |
| JPH10164448A (en) | Method and device for reproducing image for television broadcasting system | |
| KR102761772B1 (en) | Transmission device | |
| JP5075996B2 (en) | Video display method and video display device | |
| EP2015573A1 (en) | Telecomunication device and system | |
| JP5626883B2 (en) | Videophone device and control method thereof | |
| KR101674187B1 (en) | Apparatus for stereophonic acquisition for broadband interpolation and Method thereof | |
| CA3007360A1 (en) | Remote-controlled media studio |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| AS | Assignment |
Owner name: BROADCOM CORPORATION, CALIFORNIA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:MACFARLANE, CHARLES;REEL/FRAME:023782/0475 Effective date: 20091202 |
|
| AS | Assignment |
Owner name: BANK OF AMERICA, N.A., AS COLLATERAL AGENT, NORTH CAROLINA Free format text: PATENT SECURITY AGREEMENT;ASSIGNOR:BROADCOM CORPORATION;REEL/FRAME:037806/0001 Effective date: 20160201 Owner name: BANK OF AMERICA, N.A., AS COLLATERAL AGENT, NORTH Free format text: PATENT SECURITY AGREEMENT;ASSIGNOR:BROADCOM CORPORATION;REEL/FRAME:037806/0001 Effective date: 20160201 |
|
| AS | Assignment |
Owner name: AVAGO TECHNOLOGIES GENERAL IP (SINGAPORE) PTE. LTD., SINGAPORE Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:BROADCOM CORPORATION;REEL/FRAME:041706/0001 Effective date: 20170120 Owner name: AVAGO TECHNOLOGIES GENERAL IP (SINGAPORE) PTE. LTD Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:BROADCOM CORPORATION;REEL/FRAME:041706/0001 Effective date: 20170120 |
|
| AS | Assignment |
Owner name: BROADCOM CORPORATION, CALIFORNIA Free format text: TERMINATION AND RELEASE OF SECURITY INTEREST IN PATENTS;ASSIGNOR:BANK OF AMERICA, N.A., AS COLLATERAL AGENT;REEL/FRAME:041712/0001 Effective date: 20170119 |
|
| STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |