
KR20050008245A - An apparatus and method for inserting 3D graphic images in video - Google Patents

Info

Publication number
KR20050008245A
Authority
KR
South Korea
Prior art keywords
image
information
video
video signal
unit
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
KR1020030048117A
Other languages
Korean (ko)
Inventor
박정선
양희덕
Original Assignee
(주)워치비젼
정보통신연구진흥원
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by (주)워치비젼, 정보통신연구진흥원 filed Critical (주)워치비젼
Priority to KR1020030048117A priority Critical patent/KR20050008245A/en
Publication of KR20050008245A publication Critical patent/KR20050008245A/en
Ceased legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T17/00Three dimensional [3D] modelling, e.g. data description of 3D objects

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Computer Graphics (AREA)
  • Geometry (AREA)
  • Software Systems (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The present invention relates to an apparatus and method for synthesizing 3D graphic images with video. The apparatus comprises a user interface unit, a motion analysis unit, an image synthesis unit, an image storage unit, and an image output unit. When a video signal, which is a stream of frames, is input, the method analyzes the video signal to detect scene-change image information, i.e., information about the frames at which a scene transition occurs; receives synthesis-region information for a composite image from the outside and derives camera motion information based on the feature information of the frame corresponding to the detected scene-change image information and of every subsequent frame that contains the synthesis region indicated by that information; receives a 3D composite image from the outside and synthesizes the video signal with the 3D composite image based on the derived motion information; and stores and outputs the resulting 3D composite image. As a result, a virtual 3D image can be composited into real video more effectively, which makes it possible to maximize the additional effects added to video produced by simple shooting.

Description

An apparatus and method for inserting 3D graphic images in general video

The present invention relates to an apparatus and method for synthesizing 3D graphic images with video, and more particularly, to an apparatus and method that can naturally composite a 3D graphic image into an ordinary video by taking the camera motion of that video into account, without any hardware assistance.

Existing video processing technologies, such as video editing and image compositing, have generally been limited to videos that contain standardized fields, such as sports footage of a soccer field, tennis court, or baseball field.

Therefore, to process videos such as TV dramas or movies, a tracking method far more complex than those used for sports footage must be applied, and when video processing relies on hardware, expensive equipment is required to achieve accurate results.

The present invention has been devised to solve the conventional problems described above. An object of the present invention is to provide an apparatus and method for synthesizing 3D graphic images with video that can naturally composite a virtual 3D graphic image into an ordinary video by taking the camera motion of that video into account, without hardware assistance.

FIG. 1 is a block diagram of an apparatus for synthesizing 3D graphic images with video according to an embodiment of the present invention;

FIG. 2 is a schematic block diagram of an image synthesis unit according to an embodiment of the present invention;

FIG. 3 is a flowchart of a video image synthesis method according to an embodiment of the present invention;

FIG. 4 is a schematic diagram of the generation of a coordinate system for the synthesis region during the 3D graphic image synthesis process according to an embodiment of the present invention;

FIG. 5 is a diagram illustrating the extraction of camera motion parameters from the essential matrix using epipolar geometry.

To achieve the above object, the apparatus for synthesizing 3D graphic images with video according to the present invention comprises: a video input unit for inputting a video signal, which is a stream of consecutive frames; a user interface unit for interfacing with a user in order to composite a given composite image into a given region of the frames constituting the video signal; a motion analysis unit for receiving the synthesis-region information selected by the user from the user interface unit and deriving camera motion information based on the feature information of each frame; an image synthesis unit for receiving, from the user interface unit, the 3D composite image and 3D synthesis-region information that the user wishes to composite, and generating a composite image by synthesizing the video signal with the composite image; an image storage unit for receiving the composite image from the image synthesis unit and storing it as a video signal; and an image output unit for outputting the video signal of the composite image.
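As a reading aid only, the unit decomposition described above can be pictured as a set of cooperating interfaces. The following Python outline is a hedged illustration in which every class, method, and type name (Frame, Region, MotionAnalysisUnit, and so on) is an assumption introduced here, not something defined by the patent.

```python
from abc import ABC, abstractmethod
from typing import Iterable, List, Tuple

import numpy as np

Frame = np.ndarray                    # one video frame (H x W x 3, BGR)
Region = List[Tuple[int, int]]        # synthesis-region corner points chosen by the user


class MotionAnalysisUnit(ABC):
    @abstractmethod
    def derive_motion(self, frames: Iterable[Frame], region: Region) -> List[np.ndarray]:
        """Derive per-frame camera motion from feature information inside the region."""


class ImageSynthesisUnit(ABC):
    @abstractmethod
    def composite(self, frames: Iterable[Frame], motions: List[np.ndarray],
                  graphic_rgba: np.ndarray, region: Region) -> List[Frame]:
        """Warp the 3D graphic according to the motion and blend it into each frame."""


class ImageStorageUnit(ABC):
    @abstractmethod
    def store(self, frames: Iterable[Frame], path: str) -> None:
        """Persist the composited frames as a video signal."""
```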

Meanwhile, to achieve the above object, the video synthesis method of the present invention comprises: when a video signal that is a stream of consecutive frames is input, analyzing the video signal, receiving synthesis-region information for a composite image from the outside, and deriving camera motion information based on the feature information of every input frame that contains the synthesis region indicated by the synthesis-region information; receiving a 3D composite image from the outside and synthesizing the video signal with the 3D composite image based on the derived motion information; and storing and outputting the 3D composite image.

FIG. 2 is a schematic block diagram of the image synthesis unit 200 according to an embodiment of the present invention. Referring to FIG. 2, the image synthesis unit 200 of the present invention includes a 3D image synthesis region transformation information detector 210, an image processor 220, and an image effect unit 230.

The 3D image synthesis region transformation information detector 210 detects perspective-change-rate information within the synthesis region from the position information of the synthesis region input through the user interface unit (600 in FIG. 1). The detector 210 uses different numbers of reference points depending on the transformation to be detected: if a translation or rotation transformation occurs during feature extraction, two reference points are required; if an affine transformation occurs, three reference points are required; and if a perspective transformation occurs, four reference points are required. The method of detecting perspective-change-rate information from the position of each point is well known in the image processing field, so a detailed description is omitted.
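For illustration, the point counts quoted above correspond to the minimum correspondences required by standard planar transform estimators. The OpenCV sketch below is an assumption about one possible realization rather than the patent's own procedure; the coordinate values are made up.

```python
import numpy as np
import cv2

# Reference points located in one frame and the same points re-located in a later frame
# (values are made up for the example).
src = np.float32([[100, 120], [220, 118], [225, 240], [98, 243]])
dst = np.float32([[108, 125], [230, 120], [238, 248], [104, 250]])

# Translation/rotation (no shear, no perspective): two point pairs are the theoretical
# minimum; extra pairs are simply used for a robust fit here.
M_rigid, _ = cv2.estimateAffinePartial2D(src, dst)    # 2x3 matrix

# Affine transform: three point pairs give an exact solution.
M_affine = cv2.getAffineTransform(src[:3], dst[:3])   # 2x3 matrix

# Perspective (projective) transform: four point pairs give an exact solution.
H = cv2.getPerspectiveTransform(src, dst)             # 3x3 homography

print(M_rigid, M_affine, H, sep="\n\n")
```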

The image processor 220 receives the video, the motion analysis information, the composite image, and the perspective-change-rate information, and generates a 3D composite image by compositing the 3D composite image into the corresponding region of the video signal.

The image effect unit 230 performs blur and shadow processing on the composite image generated by the image processor 220.
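The patent does not spell out how the blur and shadow are produced, so the snippet below is only a plausible sketch: it softens the composite's alpha matte with a Gaussian blur and darkens the background under an offset copy of that matte to fake a soft shadow. The function name, parameter values, and the RGBA-graphic convention are assumptions.

```python
import numpy as np
import cv2

def apply_blur_and_shadow(frame, graphic_rgba, top_left, shadow_offset=(6, 6)):
    """Blend an RGBA graphic into `frame` with softened edges and a soft drop shadow.
    Assumes the graphic and its offset shadow lie entirely inside the frame."""
    h, w = graphic_rgba.shape[:2]
    x, y = top_left

    # Soften the matte so the inserted graphic does not have razor-sharp edges.
    alpha = graphic_rgba[:, :, 3].astype(np.float32) / 255.0
    alpha = cv2.GaussianBlur(alpha, (5, 5), 0)

    # Shadow: darken the background under an offset, heavily blurred copy of the matte.
    sx, sy = shadow_offset
    shadow_alpha = cv2.GaussianBlur(alpha, (15, 15), 0) * 0.5
    shadow_roi = frame[y + sy:y + sy + h, x + sx:x + sx + w].astype(np.float32)
    frame[y + sy:y + sy + h, x + sx:x + sx + w] = (
        shadow_roi * (1.0 - shadow_alpha[..., None])).astype(np.uint8)

    # Composite the graphic itself over the now-shadowed frame.
    roi = frame[y:y + h, x:x + w].astype(np.float32)
    fg = graphic_rgba[:, :, :3].astype(np.float32)
    frame[y:y + h, x:x + w] = (alpha[..., None] * fg +
                               (1.0 - alpha[..., None]) * roi).astype(np.uint8)
    return frame
```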

FIG. 3 is a flowchart of the video image synthesis method according to an embodiment of the present invention. Referring to FIG. 3, the method receives synthesis-region information for a composite image from the outside (310) and computes a 3D synthesis-region coordinate system for 3D image synthesis (320). Camera motion information is then derived based on the feature information of every frame that contains the synthesis region indicated by the synthesis-region information (330). Perspective-change-rate information within the synthesis region is extracted from this camera motion information (340), the composite image input from the outside is received, and the video signal and the composite image are synthesized based on the derived motion information (350).
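To make the numbered flow concrete, here is a hedged Python/OpenCV outline of steps 310 to 350. It replaces the patent's motion analysis with simple optical-flow tracking of the four region corners, so it illustrates the data flow rather than the claimed method; every name and parameter value is an assumption.

```python
import numpy as np
import cv2

def composite_graphic(video_path, region_pts, graphic_rgba, out_path="composited.avi"):
    """Steps 310-350, simplified: take the user's region, track it per frame,
    warp the graphic into it, and blend the result back into the video."""
    cap = cv2.VideoCapture(video_path)
    ok, frame = cap.read()
    if not ok:
        raise IOError("cannot read video")

    h, w = graphic_rgba.shape[:2]
    # Step 320 (simplified): the graphic's own corners define its local 2D frame.
    graphic_corners = np.float32([[0, 0], [w, 0], [w, h], [0, h]])
    region_pts = np.float32(region_pts)          # step 310: user-selected region (4 points)

    fourcc = cv2.VideoWriter_fourcc(*"XVID")
    writer = cv2.VideoWriter(out_path, fourcc, 25.0,
                             (frame.shape[1], frame.shape[0]))

    prev_gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    while True:
        # Steps 330-340 (simplified): the tracked corners stand in for motion analysis.
        H = cv2.getPerspectiveTransform(graphic_corners, region_pts)
        warped = cv2.warpPerspective(graphic_rgba, H, (frame.shape[1], frame.shape[0]))
        alpha = warped[:, :, 3:4].astype(np.float32) / 255.0
        # Step 350: blend the warped graphic into the frame and write it out.
        out = (alpha * warped[:, :, :3] + (1.0 - alpha) * frame).astype(np.uint8)
        writer.write(out)

        ok, nxt = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(nxt, cv2.COLOR_BGR2GRAY)
        new_pts, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, gray,
                                                      region_pts.reshape(-1, 1, 2), None)
        region_pts = new_pts.reshape(-1, 2)
        prev_gray, frame = gray, nxt

    cap.release()
    writer.release()
```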

FIG. 4 is a schematic diagram of the generation of a coordinate system for the synthesis region during the 3D graphic image synthesis process according to an embodiment of the present invention. The sides of the 3D graphic image synthesis region are regarded as being in one-to-one correspondence with the sides (430) of the circumscribed rectangle of the area enclosed by the points (460) input by the user, and a side (440) of the 3D graphic image synthesis region can be assumed to be the orthogonal projection of a side (450) in 3D space. Under the assumption that the norm of the circumscribed rectangle's side (430) equals the norm of the side (450) in 3D space, a 3D coordinate system on the graphic image synthesis region can be computed from the following quantities:

the side vector (430) of the circumscribed rectangle of the 3D graphic image insertion region;
the side vector (440) of the 3D graphic image insertion region, i.e., the orthogonally projected vector;
the original vector (450) in 3D space whose orthogonal projection is the vector (440);
the 3D rotation matrix;
the 3D coordinates of the vector along the x, y, and z axes;
the vertex coordinates (460) of the 3D graphic image insertion region.
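Because the equation itself is not reproduced in this record, the NumPy sketch below only demonstrates the stated assumption: the circumscribed-rectangle side and the 3D-space side have equal norms, and the region side is the orthogonal projection of the latter. The way the out-of-plane component and the rotation matrix are recovered here is this sketch's own assumption, not the patent's formula.

```python
import numpy as np

def region_coordinate_frame(rect_side, region_side_2d):
    """Recover a plausible 3D frame for the insertion region.

    rect_side      : 2D side vector of the circumscribed rectangle (its norm sets the scale).
    region_side_2d : 2D side vector of the insertion region, treated as the orthogonal
                     projection of the unknown 3D side.
    Assumption from the text: ||3D side|| == ||rect_side||; the sign of the out-of-plane
    component is ambiguous and is fixed arbitrarily here.
    """
    L = np.linalg.norm(rect_side)
    p = np.asarray(region_side_2d, dtype=float)
    z = np.sqrt(max(L**2 - p @ p, 0.0))           # out-of-plane part from the equal-norm assumption
    side_3d = np.array([p[0], p[1], z])           # the vector whose orthogonal projection is p

    # Build an orthonormal frame (a 3D rotation matrix) with side_3d as its first axis.
    x_axis = side_3d / np.linalg.norm(side_3d)
    up = np.array([0.0, 0.0, 1.0])
    if abs(x_axis @ up) > 0.99:                   # degenerate case: side nearly vertical
        up = np.array([0.0, 1.0, 0.0])
    z_axis = np.cross(x_axis, up)
    z_axis /= np.linalg.norm(z_axis)
    y_axis = np.cross(z_axis, x_axis)
    R = np.column_stack([x_axis, y_axis, z_axis])  # rotation matrix for the region's frame
    return side_3d, R

side_3d, R = region_coordinate_frame(rect_side=[120.0, 0.0], region_side_2d=[96.0, 20.0])
print(side_3d, R, sep="\n")
```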

In addition, as shown in FIG. 5, camera motion can be extracted from the sequence of unit images (510) using epipolar geometry. For reference, epipolar geometry starts from the premise that when a point on a 3D object is projected onto the left camera of a binocular pair, that point is necessarily also projected onto the right camera. In this invention, two parts of the sequence of unit images (510) capturing a 3D point (540) are treated as the image of the previous frame (520) and the image of the next frame (530), and the essential matrix, i.e., the matrix combining the parameters of the camera's rotation and translation, is computed from them.
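For reference, the standard way to obtain the rotation and translation referred to above is to estimate the essential matrix between two frames and decompose it. The OpenCV sketch below does exactly that; the ORB feature matching and the intrinsic matrix K are assumed stand-ins, since the patent does not state how correspondences or camera calibration are obtained.

```python
import numpy as np
import cv2

def camera_motion_between_frames(prev_frame, next_frame, K):
    """Estimate relative camera rotation R and translation direction t between two frames
    via the essential matrix (epipolar geometry). K is the camera intrinsic matrix."""
    g1 = cv2.cvtColor(prev_frame, cv2.COLOR_BGR2GRAY)
    g2 = cv2.cvtColor(next_frame, cv2.COLOR_BGR2GRAY)

    # Match features between the two frames (ORB + brute-force Hamming matching).
    orb = cv2.ORB_create(2000)
    k1, d1 = orb.detectAndCompute(g1, None)
    k2, d2 = orb.detectAndCompute(g2, None)
    matches = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(d1, d2)

    pts1 = np.float32([k1[m.queryIdx].pt for m in matches])
    pts2 = np.float32([k2[m.trainIdx].pt for m in matches])

    # Essential matrix from the correspondences, then decompose it into R and t.
    E, mask = cv2.findEssentialMat(pts1, pts2, K, cv2.RANSAC, 0.999, 1.0)
    _, R, t, _ = cv2.recoverPose(E, pts1, pts2, K, mask=mask)
    return R, t   # t is recovered only up to scale, as is inherent to monocular footage
```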

The above description merely explains one embodiment. The present invention is not limited to the embodiment described above and may be modified in various ways within the scope of the appended claims. For example, the shape and structure of each component specifically shown in the embodiments of the present invention may be modified.

As described above, the apparatus and method for synthesizing 3D images with video according to the present invention allow video processing technology, previously limited to sports footage, to be applied to general video such as TV dramas and movies. In other words, they have the advantage of widening the range of applications of video processing technology.

In addition, they allow a 3D graphic image and a video to be composited naturally, taking the camera motion of the video into account, without hardware assistance. Therefore, video editing and compositing can be performed accurately without expensive video processing equipment, allowing viewers of digital video to enjoy more natural and realistic composite images.

Claims (5)

1. An apparatus for synthesizing images with video, comprising: a video input unit for inputting a video signal, which is a stream of consecutive frames; a user interface unit for interfacing with a user in order to composite a given composite image into a given region of the frames constituting the video signal; a motion analysis unit for receiving the synthesis-region information selected by the user from the user interface unit, receiving scene-change image information from a scene-change detection unit, receiving the video signal from the video input unit, and deriving camera motion information based on the feature information of the frame corresponding to the scene-change image information and of each subsequent frame that contains the synthesis region indicated by the synthesis-region information; and an image synthesis unit for receiving, from the user interface unit, the composite image and synthesis-region information that the user wishes to composite, receiving the motion information derived by the motion analysis unit, receiving the video signal from the video input unit, and generating a composite image by synthesizing the video signal with the composite image based on the motion information.

2. The apparatus of claim 1, wherein the image synthesis unit comprises: a transformation information detector for detecting perspective-change-rate information within the synthesis region from the position information of the synthesis region; an image processor for generating a composite image by compositing the composite image into the corresponding region of the video signal based on the perspective-change-rate information detected by the transformation information detector; and an image effect unit for performing blur and shadow processing on the composite image generated by the image processor based on the user's selection information input through the user interface unit.

3. A method for synthesizing images with video, comprising: a first process of, when a video signal that is a stream of consecutive frames is input, analyzing the video signal to detect scene-change image information, which is information about frames at which a scene transition occurs; a second process of receiving synthesis-region information for a composite image from the outside and deriving camera motion information based on the feature information of the frame corresponding to the scene-change image information detected in the first process and of every subsequent frame that contains the synthesis region indicated by the synthesis-region information; a third process of receiving a composite image from the outside and synthesizing the video signal with the composite image based on the motion information derived in the second process; and a fourth process of storing and outputting the composite image.

4. A method of computing a coordinate system for a 3D graphic object using four points of the graphic image synthesis region tracked on a frame-by-frame basis.

5. A method of computing camera motion parameters by dividing footage shot with a monocular camera into frames and applying epipolar geometry to extract an essential matrix as would be obtained from a binocular camera.
KR1020030048117A 2003-07-14 2003-07-14 An apparatus and method for inserting 3D graphic images in video Ceased KR20050008245A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020030048117A KR20050008245A (en) 2003-07-14 2003-07-14 An apparatus and method for inserting 3D graphic images in video

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020030048117A KR20050008245A (en) 2003-07-14 2003-07-14 An apparatus and method for inserting 3D graphic images in video

Publications (1)

Publication Number Publication Date
KR20050008245A true KR20050008245A (en) 2005-01-21

Family

ID=37221518

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020030048117A Ceased KR20050008245A (en) 2003-07-14 2003-07-14 An apparatus and method for inserting 3D graphic images in video

Country Status (1)

Country Link
KR (1) KR20050008245A (en)


Cited By (53)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100801643B1 (en) * 2006-04-06 2008-02-11 박주영 3D Conti manufacturing system and its provision method
US9208542B2 (en) 2009-03-02 2015-12-08 Flir Systems, Inc. Pixel-wise noise reduction in thermal images
US9635285B2 (en) 2009-03-02 2017-04-25 Flir Systems, Inc. Infrared imaging enhancement with fusion
US10757308B2 (en) 2009-03-02 2020-08-25 Flir Systems, Inc. Techniques for device attachment with dual band imaging sensor
US10244190B2 (en) 2009-03-02 2019-03-26 Flir Systems, Inc. Compact multi-spectrum imaging with fusion
US9948872B2 (en) 2009-03-02 2018-04-17 Flir Systems, Inc. Monitor and control systems and methods for occupant safety and energy efficiency of structures
US9998697B2 (en) 2009-03-02 2018-06-12 Flir Systems, Inc. Systems and methods for monitoring vehicle occupants
US9756264B2 (en) 2009-03-02 2017-09-05 Flir Systems, Inc. Anomalous pixel detection
US9235876B2 (en) 2009-03-02 2016-01-12 Flir Systems, Inc. Row and column noise reduction in thermal images
US9517679B2 (en) 2009-03-02 2016-12-13 Flir Systems, Inc. Systems and methods for monitoring vehicle occupants
US9986175B2 (en) 2009-03-02 2018-05-29 Flir Systems, Inc. Device attachment with infrared imaging sensor
US10033944B2 (en) 2009-03-02 2018-07-24 Flir Systems, Inc. Time spaced infrared image enhancement
US9451183B2 (en) 2009-03-02 2016-09-20 Flir Systems, Inc. Time spaced infrared image enhancement
US9843742B2 (en) 2009-03-02 2017-12-12 Flir Systems, Inc. Thermal image frame capture using de-aligned sensor array
US10091439B2 (en) 2009-06-03 2018-10-02 Flir Systems, Inc. Imager with array of multiple infrared imaging modules
US9716843B2 (en) 2009-06-03 2017-07-25 Flir Systems, Inc. Measurement device for electrical installations and related methods
US9292909B2 (en) 2009-06-03 2016-03-22 Flir Systems, Inc. Selective image correction for infrared imaging devices
US9756262B2 (en) 2009-06-03 2017-09-05 Flir Systems, Inc. Systems and methods for monitoring power systems
US9807319B2 (en) 2009-06-03 2017-10-31 Flir Systems, Inc. Wearable imaging devices, systems, and methods
US9674458B2 (en) 2009-06-03 2017-06-06 Flir Systems, Inc. Smart surveillance camera systems and methods
US9819880B2 (en) 2009-06-03 2017-11-14 Flir Systems, Inc. Systems and methods of suppressing sky regions in images
US9843743B2 (en) 2009-06-03 2017-12-12 Flir Systems, Inc. Infant monitoring systems and methods using thermal imaging
KR101335391B1 (en) * 2010-04-12 2013-12-03 한국전자통신연구원 Video composing apparatus and its method
US9848134B2 (en) 2010-04-23 2017-12-19 Flir Systems, Inc. Infrared imager with integrated metal layers
US9706138B2 (en) 2010-04-23 2017-07-11 Flir Systems, Inc. Hybrid infrared sensor array having heterogeneous infrared sensors
US9207708B2 (en) 2010-04-23 2015-12-08 Flir Systems, Inc. Abnormal clock rate detection in imaging sensor arrays
US8402050B2 (en) 2010-08-13 2013-03-19 Pantech Co., Ltd. Apparatus and method for recognizing objects using filter information
US9405986B2 (en) 2010-08-13 2016-08-02 Pantech Co., Ltd. Apparatus and method for recognizing objects using filter information
US9473681B2 (en) 2011-06-10 2016-10-18 Flir Systems, Inc. Infrared camera system housing with metalized surface
US9143703B2 (en) 2011-06-10 2015-09-22 Flir Systems, Inc. Infrared camera calibration techniques
US10841508B2 (en) 2011-06-10 2020-11-17 Flir Systems, Inc. Electrical cabinet infrared monitor systems and methods
US9723228B2 (en) 2011-06-10 2017-08-01 Flir Systems, Inc. Infrared camera system architectures
US9716844B2 (en) 2011-06-10 2017-07-25 Flir Systems, Inc. Low power and small form factor infrared imaging
US9706139B2 (en) 2011-06-10 2017-07-11 Flir Systems, Inc. Low power and small form factor infrared imaging
US9706137B2 (en) 2011-06-10 2017-07-11 Flir Systems, Inc. Electrical cabinet infrared monitor
US9900526B2 (en) 2011-06-10 2018-02-20 Flir Systems, Inc. Techniques to compensate for calibration drifts in infrared imaging devices
US9538038B2 (en) 2011-06-10 2017-01-03 Flir Systems, Inc. Flexible memory systems and methods
US9961277B2 (en) 2011-06-10 2018-05-01 Flir Systems, Inc. Infrared focal plane array heat spreaders
US9058653B1 (en) 2011-06-10 2015-06-16 Flir Systems, Inc. Alignment of visible light sources based on thermal images
US9521289B2 (en) 2011-06-10 2016-12-13 Flir Systems, Inc. Line based image processing and flexible memory system
US9509924B2 (en) 2011-06-10 2016-11-29 Flir Systems, Inc. Wearable apparatus with integrated infrared imaging module
US10389953B2 (en) 2011-06-10 2019-08-20 Flir Systems, Inc. Infrared imaging device having a shutter
US10051210B2 (en) 2011-06-10 2018-08-14 Flir Systems, Inc. Infrared detector array with selectable pixel binning systems and methods
US10079982B2 (en) 2011-06-10 2018-09-18 Flir Systems, Inc. Determination of an absolute radiometric value using blocked infrared sensors
US9235023B2 (en) 2011-06-10 2016-01-12 Flir Systems, Inc. Variable lens sleeve spacer
US10169666B2 (en) 2011-06-10 2019-01-01 Flir Systems, Inc. Image-assisted remote control vehicle systems and methods
US10230910B2 (en) 2011-06-10 2019-03-12 Flir Systems, Inc. Infrared camera system architectures
US9723227B2 (en) 2011-06-10 2017-08-01 Flir Systems, Inc. Non-uniformity correction techniques for infrared imaging devices
US10250822B2 (en) 2011-06-10 2019-04-02 Flir Systems, Inc. Wearable apparatus with integrated infrared imaging module
USD765081S1 (en) 2012-05-25 2016-08-30 Flir Systems, Inc. Mobile communications device attachment with camera
US9811884B2 (en) 2012-07-16 2017-11-07 Flir Systems, Inc. Methods and systems for suppressing atmospheric turbulence in images
US9973692B2 (en) 2013-10-03 2018-05-15 Flir Systems, Inc. Situational awareness by compressed display of panoramic views
US11297264B2 (en) 2014-01-05 2022-04-05 Teledyne Fur, Llc Device attachment with dual band imaging sensor

Similar Documents

Publication Publication Date Title
KR20050008245A (en) An apparatus and method for inserting 3D graphic images in video
US10504293B2 (en) Augmenting multi-view image data with synthetic objects using IMU and image data
US10540773B2 (en) System and method for infinite smoothing of image sequences
KR102013978B1 (en) Method and apparatus for fusion of images
Kanade et al. Virtualized reality: Constructing virtual worlds from real scenes
US10650574B2 (en) Generating stereoscopic pairs of images from a single lens camera
Simon et al. Reconstructing while registering: a novel approach for markerless augmented reality
JP3745117B2 (en) Image processing apparatus and image processing method
Kanbara et al. A stereoscopic video see-through augmented reality system based on real-time vision-based registration
US8441521B2 (en) Method and apparatus for determining view of stereoscopic image for stereo synchronization
US10586378B2 (en) Stabilizing image sequences based on camera rotation and focal length parameters
US20120105602A1 (en) Methods, systems, and computer program products for creating three-dimensional video sequences
WO2003036565A2 (en) System and method for obtaining video of multiple moving fixation points within a dynamic scene
JPH08149517A (en) Method for generating three-dimensional image from two-dimensional image
KR102382247B1 (en) Image processing apparatus, image processing method, and computer program
Placitelli et al. Low-cost augmented reality systems via 3D point cloud sensors
KR20120123087A (en) System and method for combining 3d text with 3d content
WO2022091811A1 (en) Image processing device, image processing method, and image processing system
KR100945307B1 (en) Method and apparatus for compositing images from stereoscopic video
KR19990057669A (en) Apparatus and method for three-dimensional image conversion of two-dimensional continuous image using virtual stereo image
De Gaspari et al. ARSTUDIO: A virtual studio system with augmented reality features
KR100868076B1 (en) Apparatus and method for synthesizing images from interlaced video
Inamoto et al. Free viewpoint video synthesis and presentation of sporting events for mixed reality entertainment
CN116402878A (en) Light field image processing method and device
WO2018117099A1 (en) Image processing device and program

Legal Events

Date Code Title Description
A201 Request for examination
PA0109 Patent application

Patent event code: PA01091R01D

Comment text: Patent Application

Patent event date: 20030714

PA0201 Request for examination
N231 Notification of change of applicant
PN2301 Change of applicant

Patent event date: 20030822

Comment text: Notification of Change of Applicant

Patent event code: PN23011R01D

PG1501 Laying open of application
E902 Notification of reason for refusal
PE0902 Notice of grounds for rejection

Comment text: Notification of reason for refusal

Patent event date: 20050504

Patent event code: PE09021S01D

E601 Decision to refuse application
PE0601 Decision on rejection of patent

Patent event date: 20051025

Comment text: Decision to Refuse Application

Patent event code: PE06012S01D

Patent event date: 20050504

Comment text: Notification of reason for refusal

Patent event code: PE06011S01I