US20100177162A1 - Method and system for enabling 3d video and image processing using one full resolution video stream and one lower resolution video stream - Google Patents

Info

Publication number
US20100177162A1
Authority
US
United States
Prior art keywords
video
resolution
data stream
full resolution
video data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US12/629,247
Inventor
Charles Macfarlane
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Avago Technologies International Sales Pte Ltd
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual filed Critical Individual
Priority to US12/629,247 priority Critical patent/US20100177162A1/en
Priority to EP10000181A priority patent/EP2209321A3/en
Assigned to BROADCOM CORPORATION reassignment BROADCOM CORPORATION ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: MACFARLANE, CHARLES
Priority to CN201010005536A priority patent/CN101795418A/en
Publication of US20100177162A1 publication Critical patent/US20100177162A1/en
Assigned to BANK OF AMERICA, N.A., AS COLLATERAL AGENT reassignment BANK OF AMERICA, N.A., AS COLLATERAL AGENT PATENT SECURITY AGREEMENT Assignors: BROADCOM CORPORATION
Assigned to AVAGO TECHNOLOGIES GENERAL IP (SINGAPORE) PTE. LTD. reassignment AVAGO TECHNOLOGIES GENERAL IP (SINGAPORE) PTE. LTD. ASSIGNMENT OF ASSIGNOR'S INTEREST Assignors: BROADCOM CORPORATION
Assigned to BROADCOM CORPORATION reassignment BROADCOM CORPORATION TERMINATION AND RELEASE OF SECURITY INTEREST IN PATENTS Assignors: BANK OF AMERICA, N.A., AS COLLATERAL AGENT

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/10Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N13/106Processing image signals
    • H04N13/139Format conversion, e.g. of frame-rate or size
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/10Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N13/106Processing image signals
    • H04N13/161Encoding, multiplexing or demultiplexing different image signal components
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/10Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N13/194Transmission of image signals
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20Image signal generators
    • H04N13/204Image signal generators using stereoscopic image cameras
    • H04N13/25Image signal generators using stereoscopic image cameras using two or more image sensors with different characteristics other than in their location or field of view, e.g. having different resolutions or colour pickup characteristics; using image signals from one sensor to control the characteristics of another sensor

Definitions

  • Certain embodiments of the invention relate to wireless communication. More specifically, certain embodiments of the invention relate to a method and system for enabling 3D video and image processing using one full resolution video stream and one lower resolution video stream.
  • an image is presented in a display device, for example in a television, a monitor and/or a gaming console.
  • Most video broadcasts nowadays utilize video processing applications that enable broadcasting video images in the form of bit streams that comprise information regarding characteristics of the image to be displayed.
  • These video applications may utilize various interpolation and/or rate conversion functions to present content comprising still and/or moving images on a display.
  • de-interlacing functions may be utilized to convert moving and/or still images to a format that is suitable for certain types of display devices that are unable to handle interlaced content.
  • Interlaced 3D and/or 2D video comprises fields, each of which may be captured at a distinct time interval.
  • a frame may comprise a pair of fields, for example, a top field and a bottom field.
  • the pictures forming the video may comprise a plurality of ordered lines.
  • video content for the even-numbered lines may be captured.
  • video content for the odd-numbered lines may be captured.
  • the even-numbered lines may be collectively referred to as the top field, while the odd-numbered lines may be collectively referred to as the bottom field.
  • the odd-numbered lines may be collectively referred to as the top field, while the even-numbered lines may be collectively referred to as the bottom field.
  • Interlaced video may comprise fields that were converted from progressive frames.
  • a progressive frame may be converted into two interlaced fields by organizing the even numbered lines into one field and the odd numbered lines into another field.
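The field-splitting convention described above can be illustrated with a short Python sketch. The helper names are hypothetical (not from the patent), and a frame is modelled simply as a list of lines:

```python
def split_into_fields(frame):
    """Split a progressive frame into two interlaced fields.

    Following the convention above, the even-numbered lines
    (0, 2, 4, ...) form the top field and the odd-numbered lines
    form the bottom field.
    """
    top = frame[0::2]     # even-numbered lines
    bottom = frame[1::2]  # odd-numbered lines
    return top, bottom


def weave_fields(top, bottom):
    """Inverse operation: interleave two fields back into a frame."""
    frame = []
    for t, b in zip(top, bottom):
        frame.append(t)
        frame.append(b)
    return frame


frame = ["line0", "line1", "line2", "line3"]
top, bottom = split_into_fields(frame)
assert top == ["line0", "line2"]
assert bottom == ["line1", "line3"]
assert weave_fields(top, bottom) == frame
```

The same split, applied with the opposite naming, models the alternative convention in which the odd-numbered lines are referred to as the top field.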
  • a system and/or method for enabling 3D video and image processing using one full resolution video stream and one lower resolution video stream substantially as shown in and/or described in connection with at least one of the figures, as set forth more completely in the claims.
  • FIG. 1A is a block diagram of an exemplary wireless system that is operable to provide 3D video and image processing using one full resolution video stream and one lower resolution video stream, in accordance with an embodiment of the invention.
  • FIG. 1B is a block diagram illustrating exemplary 3D video capture, in accordance with an embodiment of the invention.
  • FIG. 2 is a block diagram illustrating an exemplary 3D video capture and processing sequence, in accordance with an embodiment of the invention.
  • FIG. 3 is a diagram illustrating exemplary 3D video implementation, in accordance with an embodiment of the invention.
  • FIG. 4 is a diagram illustrating exemplary networked 3D video implementation, in accordance with an embodiment of the invention.
  • FIG. 5 is a block diagram illustrating exemplary steps for enabling 3D video and image processing using one full resolution video stream and one lower resolution video stream, in accordance with an embodiment of the invention.
  • an output full resolution 3D video may be generated utilizing a first video data stream generated from a high resolution video source and a second video data stream generated from a low resolution video source, wherein a resolution of the output full resolution 3D video is greater than a resolution of the first video data stream and the second video data stream.
  • 3D video or image processing on the data streams may be performed within the wireless communication device.
  • the 3D video or image processing may be performed external to the wireless communication device.
  • the data streams may be compressed prior to communicating them for the external 3D video or image processing.
  • the 3D video or images may be displayed locally on the wireless communication device.
  • the 3D video or images may be formatted so that they may be locally presented on a display of the wireless communication device.
  • FIG. 1A is a block diagram of an exemplary wireless system that is operable to provide 3D video and image processing using one full resolution video stream and one lower resolution video stream, in accordance with an embodiment of the invention.
  • the wireless device 150 may comprise an antenna 151 , a chip 162 , a transceiver 152 , a baseband processor 154 , a processor 155 , a system memory 158 , a logic block 160 , a high resolution camera 164 A, a low resolution camera 164 B, an audio CODEC 172 A, a video CODEC 172 B, and an external headset port 166 .
  • the wireless device 150 may also comprise an analog microphone 168 , integrated hands-free (IHF) stereo speakers 170 , a hearing aid compatible (HAC) coil 174 , a dual digital microphone 176 , a vibration transducer 178 , and a touchscreen/display 180 .
  • 3D video may be more desirable because it may be more realistic to humans to perceive 3D rather than 2D images.
  • the transceiver 152 may comprise suitable logic, circuitry, interfaces, and/or code that may be enabled to modulate and upconvert baseband signals to RF signals for transmission by one or more antennas, which may be represented generically by the antenna 151 .
  • the transceiver 152 may also be enabled to downconvert and demodulate received RF signals to baseband signals.
  • the RF signals may be received by one or more antennas, which may be represented generically by the antenna 151 . Different wireless systems may use different antennas for transmission and reception.
  • the transceiver 152 may be enabled to execute other functions, for example, filtering the baseband and/or RF signals, and/or amplifying the baseband and/or RF signals.
  • the transceiver 152 may be implemented as a separate transmitter and a separate receiver.
  • the plurality of transceivers, transmitters and/or receivers may enable the wireless device 150 to handle a plurality of wireless protocols and/or standards including cellular, WLAN and PAN.
  • Wireless technologies handled by the wireless device 150 may comprise GPS, GALILEO, GLONASS, GSM, CDMA, CDMA2000, WCDMA, GNSS, GPRS, EDGE, WIMAX, WLAN, LTE, 3GPP, UMTS, BLUETOOTH, and ZIGBEE, for example.
  • the baseband processor 154 may comprise suitable logic, circuitry, interfaces, and/or code that may be enabled to process baseband signals for transmission via the transceiver 152 and/or the baseband signals received from the transceiver 152 .
  • the processor 155 may be any suitable processor or controller such as a CPU, DSP, ARM, or any type of integrated circuit processor.
  • the processor 155 may comprise suitable logic, circuitry, and/or code that may be enabled to control the operations of the transceiver 152 and/or the baseband processor 154 .
  • the processor 155 may be utilized to update and/or modify programmable parameters and/or values in a plurality of components, devices, and/or processing elements in the transceiver 152 and/or the baseband processor 154 . At least a portion of the programmable parameters may be stored in the system memory 158 .
  • Control and/or data information which may comprise the programmable parameters, may be transferred from other portions of the wireless device 150 , not shown in FIG. 1 , to the processor 155 .
  • the processor 155 may be enabled to transfer control and/or data information, which may include the programmable parameters, to other portions of the wireless device 150 , not shown in FIG. 1 , which may be part of the wireless device 150 .
  • the processor 155 may utilize the received control and/or data information, which may comprise the programmable parameters or video source data, to determine an operating mode of the transceiver 152 .
  • the processor 155 may be utilized to select a specific frequency for a local oscillator, a specific gain for a variable gain amplifier, configure the local oscillator and/or configure the variable gain amplifier for operation in accordance with various embodiments of the invention.
  • the received video source data and/or processed full-resolution 3D video data may be stored in the system memory 158 via the processor 155 , for example.
  • the information stored in system memory 158 may be transferred to the transceiver 152 from the system memory 158 via the processor 155 .
  • the processor 155 may be operable to process received video data streams from a high resolution video source and a low resolution video source. The processor 155 may thereby generate a full resolution 3D video from the received data streams.
  • the system memory 158 may comprise suitable logic, circuitry, interfaces, and/or code that may be enabled to store a plurality of control and/or video data information, including video or image processing parameters, or full resolution 3D video data.
  • the system memory 158 may store at least a portion of the programmable parameters that may be manipulated by the processor 155 .
  • the logic block 160 may comprise suitable logic, circuitry, interfaces, and/or code that may enable controlling of various functionalities of the wireless device 150 .
  • the logic block 160 may comprise one or more state machines that may generate signals to control the transceiver 152 and/or the baseband processor 154 .
  • the logic block 160 may also comprise registers that may hold data for controlling, for example, the transceiver 152 and/or the baseband processor 154 .
  • the logic block 160 may also generate and/or store status information that may be read by, for example, the processor 155 . Amplifier gains and/or filtering characteristics, for example, may be controlled by the logic block 160 .
  • the BT radio/processor 163 may comprise suitable circuitry, logic, interfaces, and/or code that may enable transmission and reception of Bluetooth signals.
  • the BT radio/processor 163 may enable processing and/or handling of BT baseband signals.
  • the BT radio/processor 163 may process or handle BT signals received and/or BT signals transmitted via a wireless communication medium.
  • the BT radio/processor 163 may also provide control and/or feedback information to/from the baseband processor 154 and/or the processor 155 , based on information from the processed BT signals.
  • the BT radio/processor 163 may communicate information and/or data from the processed BT signals to the processor 155 and/or to the system memory 158 .
  • the BT radio/processor 163 may receive information from the processor 155 and/or the system memory 158 , which may be processed and transmitted via the wireless communication medium to a Bluetooth headset, for example.
  • the high-resolution camera 164 A may comprise suitable logic, circuitry, interfaces, and/or code that may be operable to capture video and images.
  • the high-resolution camera 164 A may be capable of capturing high-definition images and video and may be controlled via the processor 155 , for example.
  • the high-resolution camera 164 A may comprise a multi-megapixel sensor, for example.
  • the low-resolution camera 164 B may comprise suitable logic, circuitry, interfaces, and/or code that may be operable to capture video and images.
  • the low-resolution camera 164 B may comprise a smaller, lower-cost camera than the high-resolution camera 164 A, and may comprise a VGA image/video camera, for example.
  • the wireless device 150 may comprise two cameras for full resolution 3D images and video, without the need for two high-resolution cameras.
  • the 3D image and/or video may be displayed on the touchscreen/display 180 , for example, may be stored in the system memory 158 , and/or may be communicated externally via the transceiver 152 and the antenna 151 .
  • the audio CODEC 172 A may comprise suitable circuitry, logic, interfaces, and/or code that may process audio signals received from and/or communicated to input/output devices.
  • the input devices may be within or communicatively coupled to the wireless device 150 , and may comprise the analog microphone 168 , the stereo speakers 170 , the hearing aid compatible (HAC) coil 174 , the dual digital microphone 176 , and the vibration transducer 178 , for example.
  • the audio CODEC 172 A may be operable to up-convert and/or down-convert signal frequencies to desired frequencies for processing and/or transmission via an output device.
  • the video CODEC 172 B may comprise suitable circuitry, logic, interfaces, and/or code that may be operable to process video signals received and/or communicated from and/or to input/output devices, such as the high-resolution camera 164 A and the low-resolution camera 164 B.
  • the video CODEC 172 B may communicate processed video signals to the processor 155 for further processing, or for communication to devices external to the wireless device 150 via the transceiver 152 .
  • the chip 162 may comprise an integrated circuit with multiple functional blocks integrated within, such as the transceiver 152 , the baseband processor 154 , the BT radio/processor 163 , the audio CODEC 172 A, and the video CODEC 172 B.
  • the number of functional blocks integrated in the chip 162 is not limited to the number shown in FIG. 1 . Accordingly, any number of blocks may be integrated on the chip 162 , including cameras such as the high-resolution camera 164 A and the low-resolution camera 164 B, depending on chip space and wireless device 150 requirements, for example.
  • the external headset port 166 may comprise a physical connection for an external headset to be communicatively coupled to the wireless device 150 .
  • the analog microphone 168 may comprise suitable circuitry, logic, and/or code that may detect sound waves and convert them to electrical signals via a piezoelectric effect, for example.
  • the electrical signals generated by the analog microphone 168 may comprise analog signals that may require analog to digital conversion before processing.
  • the stereo speakers 170 may comprise a pair of speakers that may be operable to generate audio signals from electrical signals received from the audio CODEC 172 A.
  • the HAC coil 174 may comprise suitable circuitry, logic, and/or code that may enable communication between the wireless device 150 and a T-coil in a hearing aid, for example. In this manner, electrical audio signals may be communicated to a user that utilizes a hearing aid, without the need for generating sound signals via a speaker, such as the stereo speakers 170 , and converting the generated sound signals back to electrical signals in a hearing aid, and subsequently back into amplified sound signals in the user's ear, for example.
  • the dual digital microphone 176 may comprise suitable circuitry, logic, and/or code that may be operable to detect sound waves and convert them to electrical signals.
  • the electrical signals generated by the dual digital microphone 176 may comprise digital signals, and thus may not require analog to digital conversion prior to digital processing in the audio CODEC 172 A.
  • the dual digital microphone 176 may enable beamforming capabilities, for example.
  • the vibration transducer 178 may comprise suitable circuitry, logic, and/or code that may enable notification of incoming calls, alerts, and/or messages to the wireless device 150 without the use of sound.
  • the vibration transducer may generate vibrations that may be in sync with, for example, audio signals such as speech or music.
  • video stream data may be communicated from image and/or video sources, such as the high-resolution camera 164 A and the low-resolution camera 164 B to the video CODEC 172 B.
  • the video CODEC 172 B may process the received video data before communicating the data to the processor for further processing or communication to a device external to the wireless device 150 .
  • the high-resolution camera 164 A and the low-resolution camera 164 B may be operable to capture images and/or video. By utilizing two spatially separated cameras, a 3D image/video may be generated from the two image/video signals.
  • the high-resolution camera 164 A and the low-resolution camera 164 B may communicate video data streams to the video CODEC 172 B and process the received signals before communicating the processed signals to the processor for further processing.
  • the processor 155 may be operable to process 3D video and/or images obtained utilizing the high-resolution camera 164 A and the low-resolution camera 164 B.
  • the cost and size of the wireless device 150 may be reduced by utilizing a smaller, lower-cost, lower-resolution camera with a high-resolution camera, while still supporting full-resolution 3D images and video.
  • the processing performed by the processor 155 may comprise right and left-view generation to enable 3D video, which may comprise still and/or moving images.
  • Various methodologies may be utilized to capture, generate (at capture or playtime), and/or render 3D video images.
  • One of the more common methods for implementing 3D video is stereoscopic 3D video.
  • the 3D video impression is generated by rendering multiple views, most commonly two views: a left view and a right view, corresponding to the viewer's left eye and right eye to give depth to displayed images.
  • left view and right view video sequences may be captured and/or processed to enable creating 3D images.
  • the left view and right view data may then be communicated either as separate streams, or may be combined into a single transport stream and only separated into different view sequences by the end-user receiving/displaying device.
  • the separate left and right view video sequences may be compressed based on MPEG-2 MVP, H.264 and/or MPEG-4 advanced video coding (AVC) or MPEG-4 multi-view video coding (MVC).
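The option of combining the left and right view sequences into a single transport stream, separated again only by the receiving/displaying device, can be sketched in Python. This is a hypothetical side-by-side packing illustration (the names are not from the patent); a frame is modelled as a list of pixel rows:

```python
def pack_side_by_side(left, right):
    """Combine left/right views into one frame: left half | right half."""
    return [l_row + r_row for l_row, r_row in zip(left, right)]


def unpack_side_by_side(packed):
    """Separate a side-by-side frame back into left and right views,
    as the end-user receiving/displaying device would."""
    half = len(packed[0]) // 2
    left = [row[:half] for row in packed]
    right = [row[half:] for row in packed]
    return left, right


left = [[1, 2], [3, 4]]
right = [[5, 6], [7, 8]]
packed = pack_side_by_side(left, right)
assert unpack_side_by_side(packed) == (left, right)
```

A real system would compress the packed stream (e.g. with H.264/AVC or MVC, as noted above) rather than transmit raw pixels.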
  • 3D video and image processing may be achieved utilizing one full resolution video stream and one lower resolution video stream.
  • a wireless device 150 comprising one or more processors and/or circuits may be enabled to generate an output full resolution 3D video utilizing a first video data stream generated from a high resolution video source and a second video data stream generated from a low resolution video source.
  • a resolution of the output full resolution 3D video is greater than a resolution of the first video data stream and the second video data stream.
  • 3D video or image processing on the data streams may be performed within the wireless communication device 150 and/or external to the wireless communication device 150 .
  • FIG. 1B is a block diagram illustrating exemplary 3D video capture, in accordance with an embodiment of the invention. Referring to FIG. 1B , there is shown the wireless device 150 and the touchscreen/display 180 , which may be as described with respect to FIG. 1A .
  • the high-resolution camera 164 A and the low-resolution camera 164 B in the wireless device 150 may be operable to capture images and/or video.
  • a 3D image/video may be generated from the two image/video signals, and by utilizing a lower-resolution camera in concert with a high-resolution camera, full high definition 3D images and/or video may be generated, while reducing cost and space requirements of the wireless device 150 .
  • the captured images and/or video may be processed in the wireless device 150 and may subsequently be displayed on the touchscreen/display 180 .
  • the processed images and/or video may be communicated external to the wireless device 150 .
  • the captured images and/or video may be communicated from the wireless device 150 without processing, before being processed by an external device. In this manner, processor requirements in the wireless device 150 may be reduced.
  • a wireless device 150 comprising one or more processors and/or circuits may be operable to generate an output full resolution 3D video utilizing a first video data stream generated from a high resolution video source and a second video data stream generated from a low resolution video source.
  • a resolution of the output full resolution 3D video is greater than a resolution of the first video data stream and the second video data stream.
  • 3D video or image processing on the data streams may be performed within the wireless communication device 150 and/or external to the wireless communication device 150 .
  • FIG. 2 is a block diagram illustrating an exemplary 3D video capture and processing sequence, in accordance with an embodiment of the invention.
  • Referring to FIG. 2 , there is shown a 3D video processing module 201 , a reformat module 203 , and a 3D video output 205 .
  • There is also shown a full resolution video stream 207 , a low resolution stream 209 , and a plurality of video streams 211 .
  • the 3D video processing module 201 may comprise suitable logic, circuitry, interfaces, and/or code that may be operable to process received video streams, such as the full resolution video stream 207 and the low resolution stream 209 .
  • the 3D video processing module 201 may be integrated in the wireless device 150 , such as in the processor 155 , for example, or may be in an external device, such as a computer or audio/visual system that is operable to receive and process video streams. In instances where the 3D video processing module 201 is external to the wireless device 150 , the full resolution video stream 207 and the lower resolution stream 209 may be communicated to the external device in parallel.
  • the reformat module 203 may comprise suitable logic, circuitry, interfaces, and/or code that may be operable to format processed 3D image or video data into a plurality of video streams 211 .
  • the reformat module 203 may format image and/or video data to the appropriate format for a target video output device.
  • the 3D video output 205 may comprise suitable logic, circuitry, interfaces, and/or code that may be operable to display 3D images and/or video.
  • the 3D video output 205 may comprise a high-definition television, for example.
  • two cameras may generate two input streams, the full resolution video stream 207 and the low resolution stream 209 .
  • the full resolution stream 207 may comprise a stream at the full resolution required for target compression or for the LCD resolution.
  • the low resolution stream 209 may comprise a reduced resolution stream, such as a VGA video stream, for example.
  • the streams may be communicated to the 3D video processing module 201 which may combine the two streams to generate a full-resolution 3D video stream. In an embodiment of the invention, this process may be applied to video or still images.
  • the 3D video stream generated by the 3D video processing module 201 may be communicated to the reformat module 203 , which may be operable to format the processed 3D video stream into a plurality of video streams 211 .
  • the plurality of video streams 211 may be communicated to the 3D video output 205 for display.
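The FIG. 2 flow can be sketched as follows. This is a hypothetical Python illustration: nearest-neighbour scaling stands in for whatever interpolation a real 3D video processing module would perform, and a frame is modelled as a list of pixel rows:

```python
def upscale(frame, factor):
    """Nearest-neighbour upscale of a frame by an integer factor.

    A stand-in for the interpolation used to bring the low
    resolution stream up to the full resolution stream's size.
    """
    out = []
    for row in frame:
        wide = [px for px in row for _ in range(factor)]
        out.extend([wide] * factor)  # repeat each widened row
    return out


def combine_streams(full_res_stream, low_res_stream, factor):
    """Pair full-res frames with upscaled low-res frames into a
    stereo (left, right) sequence, frame by frame."""
    return [
        (full, upscale(low, factor))
        for full, low in zip(full_res_stream, low_res_stream)
    ]


full = [[[1, 1], [1, 1]]]   # one 2x2 full-resolution frame
low = [[[9]]]               # one 1x1 low-resolution frame
stereo = combine_streams(full, low, factor=2)
left, right = stereo[0]
assert len(right) == len(left)  # resolutions now match
```

The resulting stereo pairs would then be handed to a reformat stage, which formats them for the target output device.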
  • FIG. 3 is a diagram illustrating exemplary 3D video implementation, in accordance with an embodiment of the invention.
  • a 3D video implementation 300 comprising a high resolution camera 301 A, a low resolution video camera 301 B, a 3D processing module 303 , a format module 305 , a television 307 , the wireless device 150 and the touchscreen/display 180 .
  • the high resolution camera 301 A and the low resolution video camera 301 B may be substantially similar to the high-resolution camera 164 A and the low-resolution camera 164 B described with respect to FIG. 1A , and the 3D processing module 303 may be substantially similar to the 3D video processing module 201 described with respect to FIG. 2 .
  • the wireless device 150 , and the touchscreen/display 180 may be as described with respect to FIG. 1A .
  • the format module 305 may comprise suitable logic, circuitry, interfaces, and/or code that may be operable to format video and/or image data for a desired display type.
  • the format module 305 may define an appropriate aspect ratio or scan rate as required by the wireless device 150 or the television 307 .
  • the high-resolution camera 301 A and the low-resolution camera 301 B may generate two input streams to be communicated to the 3D video processing module 303 which may combine the two streams to generate a full-resolution 3D video stream.
  • this process may be applied to video or still images.
  • the 3D video stream generated by the 3D video processing module 303 may be communicated to the format module 305 , which may be operable to format the processed 3D video stream into the appropriate format for display on the wireless device 150 , the television 307 , or similar display device.
  • FIG. 4 is a diagram illustrating exemplary networked 3D video implementation, in accordance with an embodiment of the invention.
  • a networked 3D video implementation 400 comprising the high resolution camera 301 A, the low resolution video camera 301 B, compression modules 401 A and 401 B, decompression modules 403 A and 403 B, a 3D video processing module 405 , a format module 407 , the wireless device 150 , and the television 307 .
  • the high resolution camera 301 A, the low resolution video camera 301 B, the wireless device 150 , and the television 307 may be as described previously.
  • the 3D video processing module 405 , the format module 407 , and the wireless device 409 may be substantially similar to the 3D video processing module 303 , the format module 305 , and the wireless device 150 described previously.
  • the wireless device 409 may comprise a touchscreen/display 411 .
  • the compression modules 401 A and 401 B may comprise suitable logic, circuitry, interfaces, and/or code that may be operable to compress received image and/or video data for subsequent communication to remote devices.
  • the compression modules 401 A and 401 B may be integrated in a wireless device, such as the wireless device 150 , and may enable more efficient communication of data over a network by reducing data size.
  • the decompression modules 403 A and 403 B may comprise suitable logic, circuitry, interfaces, and/or code that may be operable to decompress received compressed data.
  • the decompression modules 403 A and 403 B may be remote from the image/video data source.
  • the high-resolution camera 301 A and the low-resolution camera 301 B may generate two input streams to be communicated to the compression modules 401 A and 401 B, where the data streams may be compressed for more efficient communication.
  • the compressed streams may be communicated to a remote device comprising the decompression modules 403 A and 403 B, which may be enabled to decompress the received data for further processing.
  • the decompressed data may be communicated to the 3D video processing module 405 which may combine the two streams to generate a full-resolution 3D video stream. In an embodiment of the invention, this process may be applied to video or still images.
  • the 3D video stream generated by the 3D video processing module 405 may be communicated to the format module 407 , which may be operable to format the processed 3D video stream into the appropriate format for display on the wireless device 409 , the television 307 , or a similar display device.
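The compress/transmit/decompress path of FIG. 4 can be sketched as below. This is a hypothetical illustration: zlib stands in for a real video codec such as H.264, and the function names are not from the patent:

```python
import zlib


def compress_stream(raw_bytes):
    """Compression module: shrink a stream before transmission."""
    return zlib.compress(raw_bytes)


def decompress_stream(compressed):
    """Decompression module at the remote device: recover the stream."""
    return zlib.decompress(compressed)


# Two captured streams, one per camera (placeholder data).
high_res = b"full resolution frame data" * 100
low_res = b"vga frame data" * 100

sent_a = compress_stream(high_res)
sent_b = compress_stream(low_res)
assert len(sent_a) < len(high_res)            # smaller on the wire
assert decompress_stream(sent_a) == high_res  # lossless round trip
assert decompress_stream(sent_b) == low_res
```

After decompression, the two streams would feed the remote 3D video processing module exactly as the uncompressed streams do in the local case.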
  • FIG. 5 is a block diagram illustrating exemplary steps for enabling 3D video and image processing using one full resolution video stream and one lower resolution video stream, in accordance with an embodiment of the invention. Video and/or image data may be captured utilizing a high-resolution camera and a low-resolution camera. If, in step 505, the data is to be processed locally, the exemplary steps may proceed to step 511, where the data streams may be combined and processed to generate full-resolution 3D video/images. If, in step 505, the data is not to be processed locally, the data streams may be compressed in step 507, followed by step 509, where the compressed data may be communicated to a remote device, before the exemplary steps proceed to step 511, where the data streams may be combined and processed to generate full-resolution 3D video/images. The process may then proceed to step 513, where the 3D video/images may be formatted for a desired display device and subsequently displayed on that device, followed by end step 515.
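The decision flow above (steps 503 through 513) can be sketched in code. The sketch is illustrative only: the function bodies are toy stand-ins for a real codec, network link, and 3D combiner, and none of the names below come from the disclosure.

```python
def compress(stream):
    """Step 507: toy stand-in for a real video codec."""
    return ("compressed", stream)

def decompress(payload):
    """Remote side of step 509: undo the stand-in compression."""
    return payload[1]

def combine(high, low):
    """Step 511: toy stand-in for combining the two streams into 3D."""
    return {"left": high, "right": low}

def handle_capture(high, low, process_locally):
    """Sketch of the FIG. 5 flow: capture (503), local/remote
    decision (505), optional compress/communicate (507/509), and
    combine (511); the result is ready for formatting (513)."""
    if process_locally:
        return combine(high, low)
    sent = [compress(s) for s in (high, low)]    # step 507
    received = [decompress(p) for p in sent]     # step 509 (remote side)
    return combine(*received)                    # step 511
```

Either path yields the same combined output; the decision in step 505 only changes where the combination happens.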
  • A method and system are disclosed for enabling 3D video and image processing using one full resolution video stream and one lower resolution video stream. A wireless device comprising one or more processors and/or circuits may be enabled to generate an output full resolution 3D video utilizing a first video data stream 207 generated from a high resolution video source 164A and a second video data stream 209 generated from a low resolution video source 164B, wherein a resolution of the output full resolution 3D video is greater than a resolution of the first video data stream and the second video data stream. 3D video or image processing on the data streams 207/209 may be performed within the wireless communication device 150. Alternatively, the 3D video or image processing 201/303 may be performed external to the wireless communication device 150, in which case the data streams 207/209 may be compressed 401A/401B prior to communicating them for the external 3D video or image processing 405. The 3D video or images may be displayed locally on the wireless communication device 150, and may be formatted for local display 180 on the wireless communication device 150.
  • Another embodiment of the invention may provide a machine and/or computer readable storage and/or medium, having stored thereon, a machine code and/or a computer program having at least one code section executable by a machine and/or a computer, thereby causing the machine and/or computer to perform the steps as described herein for enabling 3D video and image processing using one full resolution video stream and one lower resolution video stream.
  • Aspects of the invention may be realized in hardware, software, firmware, or a combination thereof. The invention may be realized in a centralized fashion in at least one computer system, or in a distributed fashion where different elements are spread across several interconnected computer systems. Any kind of computer system or other apparatus adapted for carrying out the methods described herein is suited. A typical combination of hardware, software, and firmware may be a general-purpose computer system with a computer program that, when being loaded and executed, controls the computer system such that it carries out the methods described herein.
  • One embodiment of the present invention may be implemented as a board level product, as a single chip, as an application specific integrated circuit (ASIC), or with varying levels integrated on a single chip with other portions of the system as separate components. The degree of integration of the system will primarily be determined by speed and cost considerations. Because of the sophisticated nature of modern processors, it is possible to utilize a commercially available processor, which may be implemented external to an ASIC implementation of the present system. Alternatively, if the processor is available as an ASIC core or logic block, then the commercially available processor may be implemented as part of an ASIC device with various functions implemented as firmware.
  • The present invention may also be embedded in a computer program product, which comprises all the features enabling the implementation of the methods described herein, and which, when loaded in a computer system, is able to carry out these methods. Computer program, in the present context, may mean, for example, any expression, in any language, code or notation, of a set of instructions intended to cause a system having an information processing capability to perform a particular function either directly or after either or both of the following: a) conversion to another language, code or notation; b) reproduction in a different material form. Other meanings of computer program within the understanding of those skilled in the art are also contemplated by the present invention.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

A wireless communication device including one or more processors and/or circuits may be enabled to generate an output full resolution 3D video utilizing a first video data stream generated from a high resolution video source and a second video data stream generated from a low resolution video source. The resolution of the output full resolution 3D video may be greater than that of the first and second video data streams. 3D video or image processing on the data streams may be performed within the wireless communication device. Alternatively, the 3D video or image processing may be performed external to the wireless communication device. The data streams may be compressed prior to communicating them for the external 3D video or image processing. The 3D video or image may be displayed locally on the wireless communication device. The 3D video or images may be formatted for local display on the wireless communication device.

Description

    CROSS-REFERENCE TO RELATED APPLICATIONS/INCORPORATION BY REFERENCE
  • This application makes reference to and claims priority to U.S. Provisional Application Serial No. 61/144,959 filed on Jan. 15, 2009.
  • The above stated application is hereby incorporated herein by reference in its entirety.
  • FEDERALLY SPONSORED RESEARCH OR DEVELOPMENT
  • [Not Applicable]
  • [MICROFICHE/COPYRIGHT REFERENCE]
  • [Not Applicable]
  • FIELD OF THE INVENTION
  • Certain embodiments of the invention relate to wireless communication. More specifically, certain embodiments of the invention relate to a method and system for enabling 3D video and image processing using one full resolution video stream and one lower resolution video stream.
  • BACKGROUND OF THE INVENTION
  • In 3D or 2D video systems, an image is presented on a display device, for example a television, a monitor, and/or a gaming console. Nowadays, most video broadcasts utilize video processing applications that enable broadcasting video images in the form of bit streams that comprise information regarding characteristics of the image to be displayed. These video applications may utilize various interpolation and/or rate conversion functions to present content comprising still and/or moving images on a display. For example, de-interlacing functions may be utilized to convert moving and/or still images to a format that is suitable for certain types of display devices that are unable to handle interlaced content.
  • Interlaced 3D and/or 2D video comprises fields, each of which may be captured at a distinct time interval. A frame may comprise a pair of fields, for example, a top field and a bottom field. The pictures forming the video may comprise a plurality of ordered lines. During one of the time intervals, video content for the even-numbered lines may be captured. During a subsequent time interval, video content for the odd-numbered lines may be captured. The even-numbered lines may be collectively referred to as the top field, while the odd-numbered lines may be collectively referred to as the bottom field. Alternatively, the odd-numbered lines may be collectively referred to as the top field, while the even-numbered lines may be collectively referred to as the bottom field.
  • In the case of progressive 2D and/or 3D video frames, all the lines of the frame may be captured or played in sequence during one time interval. Interlaced video may comprise fields that were converted from progressive frames. For example, a progressive frame may be converted into two interlaced fields by organizing the even numbered lines into one field and the odd numbered lines into another field.
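The even/odd line split described above can be made concrete. In this sketch a frame is modeled simply as a list of lines, and the helper names are illustrative; the even-numbered lines are taken as the top field, following the first convention mentioned.

```python
def split_into_fields(frame):
    """Split a progressive frame (a list of lines) into two fields:
    even-numbered lines -> top field, odd-numbered lines -> bottom field."""
    return frame[0::2], frame[1::2]

def weave(top, bottom):
    """Re-interleave a top field and a bottom field into a progressive frame."""
    frame = []
    for t, b in zip(top, bottom):
        frame.extend([t, b])
    return frame
```

Splitting and re-weaving is lossless: `weave(*split_into_fields(frame))` reproduces the original frame.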
  • Further limitations and disadvantages of conventional and traditional approaches will become apparent to one of skill in the art, through comparison of such systems with the present invention as set forth in the remainder of the present application with reference to the drawings.
  • BRIEF SUMMARY OF THE INVENTION
  • A system and/or method for enabling 3D video and image processing using one full resolution video stream and one lower resolution video stream, substantially as shown in and/or described in connection with at least one of the figures, as set forth more completely in the claims.
  • Various advantages, aspects and novel features of the present invention, as well as details of an illustrated embodiment thereof, will be more fully understood from the following description and drawings.
  • BRIEF DESCRIPTION OF SEVERAL VIEWS OF THE DRAWINGS
  • FIG. 1A is a block diagram of an exemplary wireless system that is operable to provide 3D video and image processing using one full resolution video stream and one lower resolution video stream, in accordance with an embodiment of the invention.
  • FIG. 1B is a block diagram illustrating exemplary 3D video capture, in accordance with an embodiment of the invention.
  • FIG. 2 is a block diagram illustrating an exemplary 3D video capture and processing sequence, in accordance with an embodiment of the invention.
  • FIG. 3 is a diagram illustrating exemplary 3D video implementation, in accordance with an embodiment of the invention.
  • FIG. 4 is a diagram illustrating exemplary networked 3D video implementation, in accordance with an embodiment of the invention.
  • FIG. 5 is a block diagram illustrating exemplary steps for enabling 3D video and image processing using one full resolution video stream and one lower resolution video stream, in accordance with an embodiment of the invention.
  • DETAILED DESCRIPTION OF THE INVENTION
  • Certain aspects of the invention may be found in a method and system for enabling 3D video and image processing using one full resolution video stream and one lower resolution video stream. In various exemplary aspects of the invention, an output full resolution 3D video may be generated utilizing a first video data stream generated from a high resolution video source and a second video data stream generated from a low resolution video source, wherein a resolution of the output full resolution 3D video is greater than a resolution of the first video data stream and the second video data stream. In one embodiment of the invention, 3D video or image processing on the data streams may be performed within the wireless communication device. In another embodiment of the invention, the 3D video or image processing may be performed external to the wireless communication device. The data streams may be compressed prior to communicating them for the external 3D video or image processing. The 3D video or images may be displayed locally on the wireless communication device. The 3D video or images may be formatted so that they may be locally presented on a display of the wireless communication device.
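One way such a combination could work, sketched here as an assumption since the description does not fix a particular algorithm, is to upscale the low-resolution view (nearest-neighbor interpolation here, for brevity) to match the full-resolution view before forming the left/right pair:

```python
def upscale(frame, out_h, out_w):
    """Nearest-neighbor upscale of a frame given as a list of pixel rows."""
    in_h, in_w = len(frame), len(frame[0])
    return [[frame[r * in_h // out_h][c * in_w // out_w]
             for c in range(out_w)]
            for r in range(out_h)]

def make_stereo_pair(high_res, low_res):
    """Pair the full-resolution view with the low-resolution view
    upscaled to the same dimensions; together the two views carry
    more pixels than either input stream alone."""
    h, w = len(high_res), len(high_res[0])
    return high_res, upscale(low_res, h, w)
```

The function names and the choice of interpolation are illustrative; a practical system could substitute any resampling filter at this step.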
  • FIG. 1A is a block diagram of an exemplary wireless system that is operable to provide 3D video and image processing using one full resolution video stream and one lower resolution video stream, in accordance with an embodiment of the invention. Referring to FIG. 1A, the wireless device 150 may comprise an antenna 151, a chip 162, a transceiver 152, a baseband processor 154, a processor 155, a system memory 158, a logic block 160, a high resolution camera 164A, a low resolution camera 164B, an audio CODEC 172A, a video CODEC 172B, and an external headset port 166. The wireless device 150 may also comprise an analog microphone 168, integrated hands-free (IHF) stereo speakers 170, a hearing aid compatible (HAC) coil 174, a dual digital microphone 176, a vibration transducer 178, and a touchscreen/display 180.
  • Most video content is currently generated and played in two-dimensional (2D) format. In various video related applications such as, for example, DVD/Blu-ray movies and/or digital TV, 3D video may be more desirable because humans tend to perceive 3D images as more realistic than 2D images.
  • The transceiver 152 may comprise suitable logic, circuitry, interfaces, and/or code that may be enabled to modulate and upconvert baseband signals to RF signals for transmission by one or more antennas, which may be represented generically by the antenna 151. The transceiver 152 may also be enabled to downconvert and demodulate received RF signals to baseband signals. The RF signals may be received by one or more antennas, which may be represented generically by the antenna 151. Different wireless systems may use different antennas for transmission and reception. The transceiver 152 may be enabled to execute other functions, for example, filtering the baseband and/or RF signals, and/or amplifying the baseband and/or RF signals. Although a single transceiver is shown, the invention is not so limited. Accordingly, the transceiver 152 may be implemented as a separate transmitter and a separate receiver. In addition, there may be a plurality of transceivers, transmitters and/or receivers. In this regard, the plurality of transceivers, transmitters and/or receivers may enable the wireless device 150 to handle a plurality of wireless protocols and/or standards including cellular, WLAN and PAN. Wireless technologies handled by the wireless device 150 may comprise GPS, GALILEO, GLONASS, GSM, CDMA, CDMA2000, WCDMA, GNSS, GPRS, EDGE, WIMAX, WLAN, LTE, 3GPP, UMTS, BLUETOOTH, and ZIGBEE, for example.
  • The baseband processor 154 may comprise suitable logic, circuitry, interfaces, and/or code that may be enabled to process baseband signals for transmission via the transceiver 152 and/or the baseband signals received from the transceiver 152. The processor 155 may be any suitable processor or controller such as a CPU, DSP, ARM, or any type of integrated circuit processor. The processor 155 may comprise suitable logic, circuitry, and/or code that may be enabled to control the operations of the transceiver 152 and/or the baseband processor 154. For example, the processor 155 may be utilized to update and/or modify programmable parameters and/or values in a plurality of components, devices, and/or processing elements in the transceiver 152 and/or the baseband processor 154. At least a portion of the programmable parameters may be stored in the system memory 158.
  • Control and/or data information, which may comprise the programmable parameters, may be transferred from other portions of the wireless device 150, not shown in FIG. 1A, to the processor 155. Similarly, the processor 155 may be enabled to transfer control and/or data information, which may include the programmable parameters, to other portions of the wireless device 150, not shown in FIG. 1A.
  • The processor 155 may utilize the received control and/or data information, which may comprise the programmable parameters or video source data, to determine an operating mode of the transceiver 152. For example, the processor 155 may be utilized to select a specific frequency for a local oscillator and a specific gain for a variable gain amplifier, and/or to configure the local oscillator and/or the variable gain amplifier for operation in accordance with various embodiments of the invention. Moreover, the received video source data and/or processed full-resolution 3D video data may be stored in the system memory 158 via the processor 155, for example. The information stored in the system memory 158 may be transferred to the transceiver 152 via the processor 155.
  • The processor 155 may be operable to process received video data streams from a high resolution video source and a low resolution video source. The processor 155 may thereby generate a full resolution 3D video from the received data streams.
  • The system memory 158 may comprise suitable logic, circuitry, interfaces, and/or code that may be enabled to store a plurality of control and/or video data information, including video or image processing parameters, or full resolution 3D video data. The system memory 158 may store at least a portion of the programmable parameters that may be manipulated by the processor 155.
  • The logic block 160 may comprise suitable logic, circuitry, interfaces, and/or code that may enable controlling of various functionalities of the wireless device 150. For example, the logic block 160 may comprise one or more state machines that may generate signals to control the transceiver 152 and/or the baseband processor 154. The logic block 160 may also comprise registers that may hold data for controlling, for example, the transceiver 152 and/or the baseband processor 154. The logic block 160 may also generate and/or store status information that may be read by, for example, the processor 155. Amplifier gains and/or filtering characteristics, for example, may be controlled by the logic block 160.
  • The BT radio/processor 163 may comprise suitable circuitry, logic, interfaces, and/or code that may enable transmission and reception of Bluetooth signals. The BT radio/processor 163 may enable processing and/or handling of BT baseband signals. In this regard, the BT radio/processor 163 may process or handle BT signals received and/or BT signals transmitted via a wireless communication medium. The BT radio/processor 163 may also provide control and/or feedback information to/from the baseband processor 154 and/or the processor 155, based on information from the processed BT signals. The BT radio/processor 163 may communicate information and/or data from the processed BT signals to the processor 155 and/or to the system memory 158. Moreover, the BT radio/processor 163 may receive information from the processor 155 and/or the system memory 158, which may be processed and transmitted via the wireless communication medium to a Bluetooth headset, for example.
  • The high-resolution camera 164A may comprise suitable logic, circuitry, interfaces, and/or code that may be operable to capture video and images. The high-resolution camera 164A may be capable of capturing high-definition images and video and may be controlled via the processor 155, for example. The high-resolution camera 164A may comprise a multi-megapixel sensor, for example.
  • The low-resolution camera 164B may comprise suitable logic, circuitry, interfaces, and/or code that may be operable to capture video and images. The low-resolution camera 164B may comprise a smaller, lower-cost camera than the high-resolution camera 164A, and may comprise a VGA image/video camera, for example. In this manner, the wireless device 150 may comprise two cameras for full resolution 3D images and video, without the need for two high-resolution cameras. The 3D image and/or video may be displayed on the touchscreen/display 180, for example, may be stored in the system memory 158, and/or may be communicated externally via the transceiver 152 and the antenna 151.
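The cost and bandwidth saving of the asymmetric camera pair can be illustrated with rough pixel arithmetic. VGA is 640×480 by definition; the 8-megapixel figure for the high-resolution camera is purely an assumed example, since the description only says "multi-megapixel":

```python
# Assumed main sensor: ~8 megapixels (3264 x 2448) -- illustrative only.
high_res_pixels = 3264 * 2448
# Secondary sensor: standard VGA resolution.
vga_pixels = 640 * 480

two_high_res = 2 * high_res_pixels          # dual high-res design
asymmetric = high_res_pixels + vga_pixels   # high-res + VGA pair

# Fraction of raw capture (and bandwidth) saved by the asymmetric pair:
savings = 1 - asymmetric / two_high_res
```

Under these assumptions the asymmetric pair captures roughly 48% fewer raw pixels per frame pair than a dual high-resolution design would.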
  • The audio CODEC 172A may comprise suitable circuitry, logic, interfaces, and/or code that may process audio signals received from and/or communicated to input/output devices. The input devices may be within or communicatively coupled to the wireless device 150, and may comprise the analog microphone 168, the stereo speakers 170, the hearing aid compatible (HAC) coil 174, the dual digital microphone 176, and the vibration transducer 178, for example. The audio CODEC 172A may be operable to up-convert and/or down-convert signal frequencies to desired frequencies for processing and/or transmission via an output device.
  • The video CODEC 172B may comprise suitable circuitry, logic, interfaces, and/or code that may be operable to process video signals received and/or communicated from and/or to input/output devices, such as the high-resolution camera 164A and the low-resolution camera 164B. The video CODEC 172B may communicate processed video signals to the processor 155 for further processing, or for communication to devices external to the wireless device 150 via the transceiver 152.
  • The chip 162 may comprise an integrated circuit with multiple functional blocks integrated within, such as the transceiver 152, the baseband processor 154, the BT radio/processor 163, the audio CODEC 172A, and the video CODEC 172B. The number of functional blocks integrated in the chip 162 is not limited to the number shown in FIG. 1A. Accordingly, any number of blocks may be integrated on the chip 162, including cameras such as the high-resolution camera 164A and the low-resolution camera 164B, depending on chip space and wireless device 150 requirements, for example.
  • The external headset port 166 may comprise a physical connection for an external headset to be communicatively coupled to the wireless device 150. The analog microphone 168 may comprise suitable circuitry, logic, and/or code that may detect sound waves and convert them to electrical signals via a piezoelectric effect, for example. The electrical signals generated by the analog microphone 168 may comprise analog signals that may require analog to digital conversion before processing.
  • The stereo speakers 170 may comprise a pair of speakers that may be operable to generate audio signals from electrical signals received from the audio CODEC 172A. The HAC coil 174 may comprise suitable circuitry, logic, and/or code that may enable communication between the wireless device 150 and a T-coil in a hearing aid, for example. In this manner, electrical audio signals may be communicated to a user that utilizes a hearing aid, without the need for generating sound signals via a speaker, such as the stereo speakers 170, and converting the generated sound signals back to electrical signals in a hearing aid, and subsequently back into amplified sound signals in the user's ear, for example.
  • The dual digital microphone 176 may comprise suitable circuitry, logic, and/or code that may be operable to detect sound waves and convert them to electrical signals. The electrical signals generated by the dual digital microphone 176 may comprise digital signals, and thus may not require analog to digital conversion prior to digital processing in the audio CODEC 172A. The dual digital microphone 176 may enable beamforming capabilities, for example.
  • The vibration transducer 178 may comprise suitable circuitry, logic, and/or code that may enable notification of incoming calls, alerts, and/or messages to the wireless device 150 without the use of sound. The vibration transducer may generate vibrations that may be in sync with, for example, audio signals such as speech or music.
  • In operation, video stream data may be communicated from image and/or video sources, such as the high-resolution camera 164A and the low-resolution camera 164B, to the video CODEC 172B. The video CODEC 172B may process the received video data before communicating the data to the processor 155 for further processing or communication to a device external to the wireless device 150.
  • In an embodiment of the invention, the high-resolution camera 164A and the low-resolution camera 164B may be operable to capture images and/or video. By utilizing two spatially separated cameras, a 3D image/video may be generated from the two image/video signals. The high-resolution camera 164A and the low-resolution camera 164B may communicate video data streams to the video CODEC 172B, which may process the received signals before communicating the processed signals to the processor 155 for further processing. The processor 155 may be operable to process 3D video and/or images obtained utilizing the high-resolution camera 164A and the low-resolution camera 164B. In this manner, space and cost of the wireless device 150 may be reduced by utilizing a smaller, lower-cost, lower-resolution camera with a high-resolution camera, while still supporting full-resolution 3D images and video. The processing performed by the processor 155 may comprise right and left-view generation to enable 3D video, which may comprise still and/or moving images.
  • Various methodologies may be utilized to capture, generate (at capture or playtime), and/or render 3D video images. One of the more common methods for implementing 3D video is stereoscopic 3D video. In stereoscopic 3D video based applications, the 3D video impression is generated by rendering multiple views, most commonly two views: a left view and a right view, corresponding to the viewer's left eye and right eye, to give depth to displayed images. In this regard, left view and right view video sequences may be captured and/or processed to enable creating 3D images. The left view and right view data may then be communicated either as separate streams, or may be combined into a single transport stream and only separated into different view sequences by the end-user receiving/displaying device.
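One widely used frame-compatible arrangement for carrying both views in a single stream is side-by-side packing, sketched below. This is an illustrative example of combining views into one transport frame, not a packing mandated by the description:

```python
def pack_side_by_side(left, right):
    """Pack equal-sized left and right views (lists of pixel rows)
    into one frame by concatenating each row: [left | right]."""
    return [l_row + r_row for l_row, r_row in zip(left, right)]

def unpack_side_by_side(frame):
    """End-user device side: split a side-by-side frame back into views."""
    half = len(frame[0]) // 2
    return [row[:half] for row in frame], [row[half:] for row in frame]
```

The receiving/displaying device applies the unpacking step to recover the separate left-view and right-view sequences.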
  • Various compression/encoding standards may be utilized to enable compressing and/or encoding of the view sequences into transport streams during communication of stereoscopic 3D video. For example, the separate left and right view video sequences may be compressed based on MPEG-2 MVP, H.264/MPEG-4 advanced video coding (AVC), and/or MPEG-4 multi-view video coding (MVC).
  • In an embodiment of the invention, 3D video and image processing may be achieved utilizing one full resolution video stream and one lower resolution video stream. In this regard, a wireless device 150 comprising one or more processors and/or circuits may be enabled to generate an output full resolution 3D video utilizing a first video data stream generated from a high resolution video source and a second video data stream generated from a low resolution video source. A resolution of the output full resolution 3D video is greater than a resolution of the first video data stream and the second video data stream. 3D video or image processing on the data streams may be performed within the wireless communication device 150 and/or external to the wireless communication device 150.
  • FIG. 1B is a block diagram illustrating exemplary 3D video capture, in accordance with an embodiment of the invention. Referring to FIG. 1B, there is shown the wireless device 150 and the touchscreen/display 180, which may be as described with respect to FIG. 1A.
  • In operation, the high-resolution camera 164A and the low-resolution camera 164B in the wireless device 150 may be operable to capture images and/or video. By utilizing two spatially separated cameras, a 3D image/video may be generated from the two image/video signals, and by utilizing a lower-resolution camera in concert with a high-resolution camera, full high definition 3D images and/or video may be generated, while reducing cost and space requirements of the wireless device 150.
  • The captured images and/or video may be processed in the wireless device 150 and may subsequently be displayed on the touchscreen/display 180. In another embodiment of the invention, the processed images and/or video may be communicated external to the wireless device 150. Alternatively, the captured images and/or video may be communicated from the wireless device 150 without processing, before being processed by an external device. In this manner, processor requirements in the wireless device 150 may be reduced.
  • In various embodiments of the invention, a wireless device 150 comprising one or more processors and/or circuits may be operable to generate an output full resolution 3D video utilizing a first video data stream generated from a high resolution video source and a second video data stream generated from a low resolution video source. A resolution of the output full resolution 3D video is greater than a resolution of the first video data stream and the second video data stream. 3D video or image processing on the data streams may be performed within the wireless communication device 150 and/or external to the wireless communication device 150.
  • FIG. 2 is a block diagram illustrating an exemplary 3D video capture and processing sequence, in accordance with an embodiment of the invention. Referring to FIG. 2, there is shown a 3D video processing module 201, a reformat module 203, and a 3D video output 205. There is also shown a full resolution video stream 207, a low resolution stream 209, and a plurality of video streams 211.
  • The 3D video processing module 201 may comprise suitable logic, circuitry, interfaces, and/or code that may be operable to process received video streams, such as the full resolution video stream 207 and the low resolution stream 209. The 3D video processing module 201 may be integrated in the wireless device 150, such as in the processor 155, for example, or may be in an external device, such as a computer or audio/visual system that is operable to receive and process video streams. In instances where the 3D video processing module 201 is external to the wireless device 150, the full resolution video stream 207 and the low resolution stream 209 may be communicated to the external device in parallel.
  • The reformat module 203 may comprise suitable logic, circuitry, interfaces, and/or code that may be operable to format processed 3D image or video data into a plurality of video streams 211. The reformat module 203 may format image and/or video data to the appropriate format for a target video output device.
  • The 3D video output 205 may comprise suitable logic, circuitry, interfaces, and/or code that may be operable to display 3D images and/or video. The 3D video output 205 may comprise a high-definition television, for example.
  • In operation, two cameras, such as the high resolution camera 164A and the low-resolution camera 164B, may generate two input streams: the full resolution video stream 207 and the low resolution stream 209. The full resolution video stream 207 may comprise a stream at the full resolution required for the target compression format or for the LCD display resolution. The low resolution stream 209 may comprise a reduced resolution stream, such as a VGA video stream, for example. The streams may be communicated to the 3D video processing module 201, which may combine the two streams to generate a full-resolution 3D video stream. In an embodiment of the invention, this process may be applied to video or still images.
  • The 3D video stream generated by the 3D video processing module 201 may be communicated to the reformat module 203, which may be operable to format the processed 3D video stream into a plurality of video streams 211. The plurality of video streams 211 may be communicated to the 3D video output 205 for display.
  • FIG. 3 is a diagram illustrating exemplary 3D video implementation, in accordance with an embodiment of the invention. Referring to FIG. 3, there is shown a 3D video implementation 300 comprising a high resolution camera 301A, a low resolution video camera 301B, a 3D processing module 303, a format module 305, a television 307, the wireless device 150, and the touchscreen/display 180. The high resolution camera 301A and the low resolution video camera 301B may be substantially similar to the high-resolution camera 164A and the low-resolution camera 164B described with respect to FIG. 1A, and the 3D processing module 303 may be substantially similar to the 3D video processing module 201 described with respect to FIG. 2. The wireless device 150 and the touchscreen/display 180 may be as described with respect to FIG. 1A.
  • The format module 305 may comprise suitable logic, circuitry, interfaces, and/or code that may be operable to format video and/or image data for a desired display type. For example, the format module 305 may define an appropriate aspect ratio or scan rate as required by the wireless device 150 or the television 307.
  • In operation, the high-resolution camera 301A and the low-resolution camera 301B may generate two input streams to be communicated to the 3D video processing module 303, which may combine the two streams to generate a full-resolution 3D video stream. In an embodiment of the invention, this process may be applied to video or still images. The 3D video stream generated by the 3D video processing module 303 may be communicated to the format module 305, which may be operable to format the processed 3D video stream into the appropriate format for display on the wireless device 150, the television 307, or a similar display device.
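  • Fitting one source resolution to different target displays while preserving aspect ratio is one common formatting task a module such as the format module 305 might perform. The helper below is a hypothetical sketch of such a calculation, not an implementation taken from the patent:

```python
def letterbox(src_w, src_h, dst_w, dst_h):
    """Scale a source frame to fit a target display, preserving aspect ratio.

    Returns the scaled width/height and the horizontal/vertical bar sizes.
    """
    scale = min(dst_w / src_w, dst_h / src_h)
    out_w, out_h = round(src_w * scale), round(src_h * scale)
    return out_w, out_h, (dst_w - out_w) // 2, (dst_h - out_h) // 2

# A 4:3 VGA frame formatted for a 16:9 1080p television gets side bars.
print(letterbox(640, 480, 1920, 1080))  # (1440, 1080, 240, 0)
```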
  • FIG. 4 is a diagram illustrating an exemplary networked 3D video implementation, in accordance with an embodiment of the invention. Referring to FIG. 4, there is shown a networked 3D video implementation 400 comprising the high resolution camera 301A, the low resolution video camera 301B, compression modules 401A and 401B, decompression modules 403A and 403B, a 3D video processing module 405, a format module 407, a wireless device 409, and the television 307. The high resolution camera 301A, the low resolution video camera 301B, and the television 307 may be as described previously. The 3D video processing module 405, the format module 407, and the wireless device 409 may be substantially similar to the 3D video processing module 303, the format module 305, and the wireless device 150 described previously. The wireless device 409 may comprise a touchscreen/display 411.
  • The compression modules 401A and 401B may comprise suitable logic, circuitry, interfaces, and/or code that may be operable to compress received image and/or video data for subsequent communication to remote devices. For example, the compression modules 401A and 401B may be integrated in a wireless device, such as the wireless device 150, and may enable more efficient communication of data over a network by reducing data size.
  • The decompression modules 403A and 403B may comprise suitable logic, circuitry, interfaces, and/or code that may be operable to decompress received compressed data. The decompression modules 403A and 403B may be remote from the image/video data source.
  • In operation, the high-resolution camera 301A and the low-resolution camera 301B may generate two input streams to be communicated to the compression modules 401A and 401B, where the data streams may be compressed for more efficient communication. The compressed streams may be communicated to a remote device comprising the decompression modules 403A and 403B, which may be enabled to decompress the received data for further processing.
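  • The compress-then-decompress round trip described above can be sketched with a generic lossless codec. A real pipeline would use a dedicated video codec, so the use of zlib here is purely a stand-in assumption for illustration:

```python
import zlib

def compress_stream(frame_bytes: bytes) -> bytes:
    """Compress raw frame data before sending it over the network."""
    return zlib.compress(frame_bytes, level=6)

def decompress_stream(blob: bytes) -> bytes:
    """Recover the raw frame data on the remote device."""
    return zlib.decompress(blob)

raw = b"\x10\x20\x30\x40" * 10_000        # a highly redundant dummy frame
blob = compress_stream(raw)
assert decompress_stream(blob) == raw     # lossless round trip
print(len(blob) < len(raw))               # True: the payload shrank
```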
  • The decompressed data may be communicated to the 3D video processing module 405 which may combine the two streams to generate a full-resolution 3D video stream. In an embodiment of the invention, this process may be applied to video or still images. The 3D video stream generated by the 3D video processing module 405 may be communicated to the format module 407, which may be operable to format the processed 3D video stream into the appropriate format for display on the wireless device 409, the television 307, or a similar display device.
  • FIG. 5 is a block diagram illustrating exemplary steps for enabling 3D video and image processing using one full resolution video stream and one lower resolution video stream, in accordance with an embodiment of the invention. Referring to FIG. 5, in step 503 after start step 501, video and/or image data may be captured utilizing a high-resolution camera and a low-resolution camera. If in step 505, the video/image data is to be processed locally, such as within the wireless device 150, the exemplary steps may proceed to step 511, where the data streams may be combined and processed to generate full-resolution 3D video/images. If in step 505, the data is not to be processed locally, the data streams may be compressed in step 507, followed by step 509 where the compressed data may be communicated to a remote device, before the exemplary steps proceed to step 511, where the data streams may be combined and processed to generate full-resolution 3D video/images. The process may then proceed to step 513 where the 3D video/images may be formatted for a desired display device and subsequently displayed on that device, followed by end step 515.
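  • The decision flow of FIG. 5 can be summarized in a short skeleton. The stand-in functions (combine_to_3d and the zlib round trip) are assumptions for illustration; the patent does not prescribe these implementations:

```python
import zlib

def combine_to_3d(full_stream: bytes, low_stream: bytes) -> bytes:
    """Stand-in for the 3D combining step (step 511)."""
    return full_stream + low_stream

def process(full_stream: bytes, low_stream: bytes, local: bool) -> bytes:
    """Sketch of steps 503-513: capture, optional remote hop, combine."""
    if not local:
        # Steps 507-509: compress, "send", then decompress on the remote side.
        full_stream = zlib.decompress(zlib.compress(full_stream))
        low_stream = zlib.decompress(zlib.compress(low_stream))
    # Step 511: combine the streams into full-resolution 3D video/images.
    return combine_to_3d(full_stream, low_stream)

# Local and remote paths produce the same combined output.
assert process(b"FULL", b"low", local=True) == process(b"FULL", b"low", local=False)
```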
  • In an embodiment of the invention, a method and system are disclosed for enabling 3D video and image processing using one full resolution video stream and one lower resolution video stream. In this regard, a wireless device comprising one or more processors and/or circuits may be enabled to generate an output full resolution 3D video utilizing a first video data stream 207 generated from a high resolution video source 164A and a second video data stream 209 generated from a low resolution video source 164B, wherein a resolution of the output full resolution 3D video is greater than a resolution of the first video data stream and the second video data stream. 3D video or image processing on the data streams 207/209 may be performed within the wireless communication device 150. The 3D video or image processing 201/303 may be performed external to the wireless communication device 150. The data streams 207/209 may be compressed 401A/401B prior to communicating them for the external 3D video or image processing 405. The 3D video or image may be displayed locally on the wireless communication device 150. The 3D video or images may be formatted for local display 180 on the wireless communication device 150.
  • Another embodiment of the invention may provide a machine and/or computer readable storage and/or medium, having stored thereon, a machine code and/or a computer program having at least one code section executable by a machine and/or a computer, thereby causing the machine and/or computer to perform the steps as described herein for enabling 3D video and image processing using one full resolution video stream and one lower resolution video stream.
  • Accordingly, aspects of the invention may be realized in hardware, software, firmware or a combination thereof. The invention may be realized in a centralized fashion in at least one computer system or in a distributed fashion where different elements are spread across several interconnected computer systems. Any kind of computer system or other apparatus adapted for carrying out the methods described herein is suited. A typical combination of hardware, software and firmware may be a general-purpose computer system with a computer program that, when being loaded and executed, controls the computer system such that it carries out the methods described herein.
  • One embodiment of the present invention may be implemented as a board level product, as a single chip, application specific integrated circuit (ASIC), or with varying levels of integration on a single chip with other portions of the system as separate components. The degree of integration of the system will primarily be determined by speed and cost considerations. Because of the sophisticated nature of modern processors, it is possible to utilize a commercially available processor, which may be implemented external to an ASIC implementation of the present system. Alternatively, if the processor is available as an ASIC core or logic block, then the commercially available processor may be implemented as part of an ASIC device with various functions implemented as firmware.
  • The present invention may also be embedded in a computer program product, which comprises all the features enabling the implementation of the methods described herein, and which when loaded in a computer system is able to carry out these methods. Computer program in the present context may mean, for example, any expression, in any language, code or notation, of a set of instructions intended to cause a system having an information processing capability to perform a particular function either directly or after either or both of the following: a) conversion to another language, code or notation; b) reproduction in a different material form. However, other meanings of computer program within the understanding of those skilled in the art are also contemplated by the present invention.
  • While the invention has been described with reference to certain embodiments, it will be understood by those skilled in the art that various changes may be made and equivalents may be substituted without departing from the scope of the present invention. In addition, many modifications may be made to adapt a particular situation or material to the teachings of the present invention without departing from its scope. Therefore, it is intended that the present invention not be limited to the particular embodiments disclosed, but that the present invention will include all embodiments falling within the scope of the appended claims.

Claims (20)

1. A method for enabling wireless communication, the method comprising:
performing by one or more processors and/or circuits in a single video processing device:
generating an output full resolution 3D video utilizing a first video data stream generated from a high resolution video source and a second video data stream generated from a low resolution video source, wherein a resolution of said output full resolution 3D video is greater than a resolution of said first video data stream and said second video data stream.
2. The method according to claim 1, comprising generating left view video for said full resolution 3D video utilizing information from one or both of said first video data stream and said second video data stream.
3. The method according to claim 1, comprising generating right view video for said full resolution 3D video utilizing information from one or both of said first video data stream and said second video data stream.
4. The method according to claim 1, comprising compressing said generated output full resolution 3D video.
5. The method according to claim 4, comprising communicating said compressed generated output full resolution 3D video to a device external to said wireless communication device for processing.
6. The method according to claim 5, wherein said device external to said wireless communication device decompresses said compressed generated output full resolution 3D video.
7. The method according to claim 6, wherein said device external to said wireless communication device processes and displays said decompressed generated output full resolution 3D video.
8. The method according to claim 1, comprising formatting said output full resolution 3D video for local display by said wireless communication device.
9. The method according to claim 8, comprising displaying said formatted output full resolution 3D video by said wireless communication device.
10. The method according to claim 1, wherein one or more of said output full resolution 3D video, said first video data stream, and/or said second video data stream comprises still and/or moving images.
11. A system for enabling wireless communication, the system comprising:
one or more processors and/or circuits for use in a video processing device, said one or more processors and/or circuits are operable to:
generate an output full resolution 3D video utilizing a first video data stream generated from a high resolution video source and a second video data stream generated from a low resolution video source, wherein a resolution of said output full resolution 3D video is greater than a resolution of said first video data stream and said second video data stream.
12. The system according to claim 11, wherein said one or more processors and/or circuits are operable to generate left view video for said full resolution 3D video utilizing information from one or both of said first video data stream and said second video data stream.
13. The system according to claim 11, wherein said one or more processors and/or circuits are operable to generate right view video for said full resolution 3D video utilizing information from one or both of said first video data stream and said second video data stream.
14. The system according to claim 11, wherein said one or more processors and/or circuits are operable to compress said generated output full resolution 3D video.
15. The system according to claim 14, wherein said one or more processors and/or circuits are operable to communicate said compressed generated output full resolution 3D video to a device external to said wireless communication device for processing.
16. The system according to claim 15, wherein said device external to said wireless communication device decompresses said compressed generated output full resolution 3D video.
17. The system according to claim 16, wherein said device external to said wireless communication device processes and displays said decompressed generated output full resolution 3D video.
18. The system according to claim 11, wherein said one or more processors and/or circuits are operable to format said output full resolution 3D video for local display by said wireless communication device.
19. The system according to claim 18, wherein said one or more processors and/or circuits are operable to display said formatted output full resolution 3D video by said wireless communication device.
20. The system according to claim 11, wherein one or more of said output full resolution 3D video, said first video data stream, and/or said second video data stream comprises still and/or moving images.
US12/629,247 2009-01-15 2009-12-02 Method and system for enabling 3d video and image processing using one full resolution video stream and one lower resolution video stream Abandoned US20100177162A1 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
US12/629,247 US20100177162A1 (en) 2009-01-15 2009-12-02 Method and system for enabling 3d video and image processing using one full resolution video stream and one lower resolution video stream
EP10000181A EP2209321A3 (en) 2009-01-15 2010-01-11 Enabling 3D video and image processing using one full resolution video stream and one lower resolution video stream
CN201010005536A CN101795418A (en) 2009-01-15 2010-01-15 Method and system for realizing wireless communication

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US14495909P 2009-01-15 2009-01-15
US12/629,247 US20100177162A1 (en) 2009-01-15 2009-12-02 Method and system for enabling 3d video and image processing using one full resolution video stream and one lower resolution video stream

Publications (1)

Publication Number Publication Date
US20100177162A1 true US20100177162A1 (en) 2010-07-15

Family

ID=41858891

Family Applications (1)

Application Number Title Priority Date Filing Date
US12/629,247 Abandoned US20100177162A1 (en) 2009-01-15 2009-12-02 Method and system for enabling 3d video and image processing using one full resolution video stream and one lower resolution video stream

Country Status (3)

Country Link
US (1) US20100177162A1 (en)
EP (1) EP2209321A3 (en)
CN (1) CN101795418A (en)

Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070069923A1 (en) * 2005-05-09 2007-03-29 Ehud Mendelson System and method for generate and update real time navigation waypoint automatically
US20130069825A1 (en) * 2011-09-21 2013-03-21 Rio Systems Ltd. Methods, circuits and systems for generating navigation beacon signals
US20140143797A1 (en) * 2009-04-27 2014-05-22 Mitsubishi Electric Corporation Stereoscopic video distribution system, stereoscopic video distribution method, stereoscopic video distrubtion apparatus, stereoscopic video viewing system, stereoscipic video viewing method, and stereoscopic video viewing apparatus
US20140285622A1 (en) * 2009-04-27 2014-09-25 Lg Electronics Inc. Broadcast receiver and 3d video data processing method thereof
US20140341293A1 (en) * 2011-09-16 2014-11-20 Dolby Laboratories Licensing Corporation Frame-Compatible Full Resolution Stereoscopic 3D Compression And Decompression
US9077966B2 (en) 2010-02-15 2015-07-07 Thomson Licensing Apparatus and method for processing video content
US9628769B2 (en) 2011-02-15 2017-04-18 Thomson Licensing Dtv Apparatus and method for generating a disparity map in a receiving device
US20170111593A1 (en) * 2005-06-21 2017-04-20 Cedar Crest Partners Inc. System, method and apparatus for capture, conveying and securing information including media information such as video
EP2640073A4 (en) * 2010-11-12 2017-05-31 Electronics And Telecommunications Research Institute Method and apparatus for determining a video compression standard in a 3dtv service
US9936109B2 (en) * 2010-10-22 2018-04-03 University Of New Brunswick Method and system for fusing images
US20220134227A1 (en) * 2019-02-25 2022-05-05 Google Llc Variable end-point user interface rendering
CN115460461A (en) * 2022-09-07 2022-12-09 北京奇艺世纪科技有限公司 Video processing method and device, terminal equipment and computer readable storage medium

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5735181B2 (en) * 2011-09-29 2015-06-17 ドルビー ラボラトリーズ ライセンシング コーポレイション Dual layer frame compatible full resolution stereoscopic 3D video delivery
TWI595770B (en) 2011-09-29 2017-08-11 杜比實驗室特許公司 Frame-compatible full-resolution stereoscopic 3d video delivery with symmetric picture resolution and quality
DE102012106860A1 (en) * 2012-07-27 2014-02-13 Jenoptik Robot Gmbh Device and method for identifying and documenting at least one object passing through a radiation field
CN103841335A (en) * 2014-03-20 2014-06-04 梁红 Novel method for improving night vision image effect

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2887272B2 (en) * 1987-07-14 1999-04-26 株式会社 エイ・ティ・ア−ル通信システム研究所 3D image device
EP1418766A3 (en) * 1998-08-28 2010-03-24 Imax Corporation Method and apparatus for processing images
US7260274B2 (en) * 2000-12-01 2007-08-21 Imax Corporation Techniques and systems for developing high-resolution imagery
WO2006079963A2 (en) * 2005-01-28 2006-08-03 Koninklijke Philips Electronics N.V. Device for registering images
JP4818053B2 (en) * 2006-10-10 2011-11-16 株式会社東芝 High resolution device and method
WO2008085874A2 (en) * 2007-01-05 2008-07-17 Marvell World Trade Ltd. Methods and systems for improving low-resolution video
EP2259599A1 (en) * 2009-05-29 2010-12-08 Telefonaktiebolaget L M Ericsson (Publ) Method and arrangement for processing a stereo image for three-dimensional viewing

Cited By (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070069923A1 (en) * 2005-05-09 2007-03-29 Ehud Mendelson System and method for generate and update real time navigation waypoint automatically
US20170111593A1 (en) * 2005-06-21 2017-04-20 Cedar Crest Partners Inc. System, method and apparatus for capture, conveying and securing information including media information such as video
US10356388B2 (en) * 2009-04-27 2019-07-16 Mitsubishi Electric Corporation Stereoscopic video distribution system, stereoscopic video distribution method, stereoscopic video distribution apparatus, stereoscopic video viewing system, stereoscopic video viewing method, and stereoscopic video viewing apparatus
US20140143797A1 (en) * 2009-04-27 2014-05-22 Mitsubishi Electric Corporation Stereoscopic video distribution system, stereoscopic video distribution method, stereoscopic video distrubtion apparatus, stereoscopic video viewing system, stereoscipic video viewing method, and stereoscopic video viewing apparatus
US20140285622A1 (en) * 2009-04-27 2014-09-25 Lg Electronics Inc. Broadcast receiver and 3d video data processing method thereof
US9077966B2 (en) 2010-02-15 2015-07-07 Thomson Licensing Apparatus and method for processing video content
US9148646B2 (en) 2010-02-15 2015-09-29 Thomson Licensing Apparatus and method for processing video content
US9936109B2 (en) * 2010-10-22 2018-04-03 University Of New Brunswick Method and system for fusing images
EP2640073A4 (en) * 2010-11-12 2017-05-31 Electronics And Telecommunications Research Institute Method and apparatus for determining a video compression standard in a 3dtv service
US9628769B2 (en) 2011-02-15 2017-04-18 Thomson Licensing Dtv Apparatus and method for generating a disparity map in a receiving device
US9473788B2 (en) * 2011-09-16 2016-10-18 Dolby Laboratories Licensing Corporation Frame-compatible full resolution stereoscopic 3D compression and decompression
US20140341293A1 (en) * 2011-09-16 2014-11-20 Dolby Laboratories Licensing Corporation Frame-Compatible Full Resolution Stereoscopic 3D Compression And Decompression
US20130069825A1 (en) * 2011-09-21 2013-03-21 Rio Systems Ltd. Methods, circuits and systems for generating navigation beacon signals
US20220134227A1 (en) * 2019-02-25 2022-05-05 Google Llc Variable end-point user interface rendering
US12102917B2 (en) * 2019-02-25 2024-10-01 Google Llc Variable end-point user interface rendering
CN115460461A (en) * 2022-09-07 2022-12-09 北京奇艺世纪科技有限公司 Video processing method and device, terminal equipment and computer readable storage medium

Also Published As

Publication number Publication date
EP2209321A3 (en) 2013-03-20
EP2209321A2 (en) 2010-07-21
CN101795418A (en) 2010-08-04

Similar Documents

Publication Publication Date Title
US20100177162A1 (en) Method and system for enabling 3d video and image processing using one full resolution video stream and one lower resolution video stream
CN102342112B (en) Stereo image data transmitting apparatus, stereo image data transmitting method, stereo image data receiving apparatus, and stereo image data receiving method
US20100265315A1 (en) Three-dimensional image combining apparatus
WO2010099178A2 (en) System and method for displaying multiple images/videos on a single display
KR101832407B1 (en) Method and system for communication of stereoscopic three dimensional video information
US9270975B2 (en) Information integrating device and information integrating method which integrates stereoscopic video information using main information and complementary information
AU2010231544A1 (en) System and format for encoding data and three-dimensional rendering
WO2013015116A1 (en) Encoding device and encoding method, and decoding device and decoding method
US20100194845A1 (en) Television system and control method thereof
US20120262454A1 (en) Stereoscopic image data transmission device, stereoscopic image data transmission method, stereoscopic image data reception device, and stereoscopic image data reception method
US9749608B2 (en) Apparatus and method for generating a three-dimension image data in portable terminal
MX2012004580A (en) 3d-image-data transmission device, 3d-image-data transmission method, 3d-image-data reception device, and 3d-image-data reception method.
CN113689810B (en) Image display apparatus and method thereof
EP2312859A2 (en) Method and system for communicating 3D video via a wireless communication link
US20120300026A1 (en) Audio-Video Signal Processing
JP2012199897A5 (en)
HK1147001A (en) A method and system for implementing wireless communication
JP5170278B2 (en) Display control device, display control method, program, and display control system
JPH10164448A (en) Method and device for reproducing image for television broadcasting system
KR102761772B1 (en) Transmission device
JP5075996B2 (en) Video display method and video display device
EP2015573A1 (en) Telecomunication device and system
JP5626883B2 (en) Videophone device and control method thereof
KR101674187B1 (en) Apparatus for stereophonic acquisition for broadband interpolation and Method thereof
CA3007360A1 (en) Remote-controlled media studio

Legal Events

Date Code Title Description
AS Assignment

Owner name: BROADCOM CORPORATION, CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:MACFARLANE, CHARLES;REEL/FRAME:023782/0475

Effective date: 20091202

AS Assignment

Owner name: BANK OF AMERICA, N.A., AS COLLATERAL AGENT, NORTH CAROLINA

Free format text: PATENT SECURITY AGREEMENT;ASSIGNOR:BROADCOM CORPORATION;REEL/FRAME:037806/0001

Effective date: 20160201

AS Assignment

Owner name: AVAGO TECHNOLOGIES GENERAL IP (SINGAPORE) PTE. LTD., SINGAPORE

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:BROADCOM CORPORATION;REEL/FRAME:041706/0001

Effective date: 20170120

AS Assignment

Owner name: BROADCOM CORPORATION, CALIFORNIA

Free format text: TERMINATION AND RELEASE OF SECURITY INTEREST IN PATENTS;ASSIGNOR:BANK OF AMERICA, N.A., AS COLLATERAL AGENT;REEL/FRAME:041712/0001

Effective date: 20170119

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION